package
package-description
alita-login
alita-login is a login management extension for Alita. Installing: pip install alita-login. Links: code at https://github.com/dwpy/alita-login
alita-qa
Demo of virtual QA engineer agents. It has two commands. git2swagger converts an open GitHub repository into Swagger spec files and requires the org, repo, and branch, e.g.: git2swagger -t "Use repository spring-petclinic/spring-framework-petclinic with branch main It is Java Spring application, please create swagger spec. Deployment URL is https://petclinic.example.com". swagger2gherkin converts Swagger files into Gherkin test cases. Environment requirements (set these variables): AZURE_LLM_ENDPOINT, OPENAI_API_KEY, OPENAI_API_VERSION, DEPLOYMENT_NAME, MAX_TOKEN, RESULT_PATH, GHERKIN_PATH.
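A hedged driver sketch based only on the commands and variables listed above; every value assigned below is a placeholder (not a documented default), and the two CLI commands are assumed to be on PATH after installing the package:

```python
import os
import subprocess

# Placeholder values -- replace with your own endpoint, key, and paths.
os.environ.update({
    "AZURE_LLM_ENDPOINT": "https://example-endpoint.openai.azure.com",
    "OPENAI_API_KEY": "sk-...",
    "OPENAI_API_VERSION": "2023-05-15",
    "DEPLOYMENT_NAME": "gpt-4",
    "MAX_TOKEN": "4000",
    "RESULT_PATH": "./swagger_out",
    "GHERKIN_PATH": "./gherkin_out",
})

prompt = (
    "Use repository spring-petclinic/spring-framework-petclinic with branch main. "
    "It is a Java Spring application, please create swagger spec. "
    "Deployment URL is https://petclinic.example.com"
)

# Run the two commands described above: repo -> Swagger, then Swagger -> Gherkin.
subprocess.run(["git2swagger", "-t", prompt], check=True)
subprocess.run(["swagger2gherkin"], check=True)
```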
alita-session
alita-session is a session management extension for Alita. Installing: pip install alita-session. Quick start: from alita import Alita; from alita_session import Session; app = Alita('dw'); app.config['SESSION_ENGINE'] = 'alita_session.redis'; app.config['SESSION_ENGINE_CONFIG'] = {'host': {host}, 'port': {port}, 'db': {db}}; Session().init_app(app) (see the formatted sketch below). Links: code at https://github.com/dwpy/alita-session
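The quick start above, rewritten as a self-contained sketch; the Redis connection values are placeholders, not documented defaults:

```python
from alita import Alita
from alita_session import Session

app = Alita('dw')
app.config['SESSION_ENGINE'] = 'alita_session.redis'
app.config['SESSION_ENGINE_CONFIG'] = {
    'host': 'localhost',  # placeholder -- point at your Redis instance
    'port': 6379,         # placeholder
    'db': 0,              # placeholder
}
Session().init_app(app)
```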
alitop
UNKNOWN
alitra
Alitra (WIP): a library for ALIgnment and TRAnsformation between fixed coordinate frames. The transform is described by a translation and a homogeneous rotation. Developed for transforming between the fixed local coordinate frame and the asset-fixed coordinate frame. Installation from pip: pip install alitra, then import alitra and help(alitra). Installation from source: git clone https://github.com/equinor/alitra; cd alitra; pip install .[dev]. You can test whether the installation was successful with pytest: pytest . Local development: pip install -e /path/to/package installs the package in editable mode, which is convenient for local development. Contributing: we welcome all kinds of contributions, including code, bug reports, issues, feature requests, and documentation. The preferred way of submitting a contribution is to either open an issue on GitHub or fork the project on GitHub and make a pull request. How to use: the tests in this repository can be used as examples of how to use the different models and functions; test_example.py is a good place to start.
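A conceptual illustration of the kind of transform described above (a translation plus a rotation between two fixed frames), written with plain NumPy; this is not alitra's actual API, and the angle and offsets are made-up example values:

```python
import numpy as np

# A frame-to-frame transform: rotate about the z-axis, then translate.
theta = np.deg2rad(30.0)                    # example rotation angle
rotation = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])
translation = np.array([1.0, 2.0, 0.0])     # example offset between frame origins

point_local = np.array([4.0, 0.0, 1.0])     # a point expressed in the local frame
point_asset = rotation @ point_local + translation
print(point_asset)                          # the same point in the asset-fixed frame
```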
alittlebox
No description available on PyPI.
ali-tts
ali_tts: Alibaba TTS driver for units.
aliturgutb-test
No description available on PyPI.
alitvb
TVB is a Python-based command-line tool for performance testing of Android smart televisions, TV boxes, and phones. GitHub: https://github.com/alibaba/tvb; wiki: https://github.com/alibaba/tvb/wiki
aliusmangondal
Failed to fetch description. HTTP Status Code: 404
ali-usman-gondal-dummy
My first Python package with a slightly longer description
aliusmangondalfinal
My first Python package with a slightly longer description
aliusmangondalfinal2
My first Python package with a slightly longer description
aliusmangondalfinal3
My first Python package with a slightly longer description
aliusmangondalfinal4
My first Python package with a slightly longer description
aliUtil
UNKNOWN
alivebot
Alivebot: a script to find and react to !ALIVE commands in comments on the Hive blockchain. Please note that this software is in an early beta stage, and that you need to know what you are doing to use it. Installation: for Ubuntu and Debian, install these packages: sudo apt-get install python3-pip build-essential libssl-dev python3-dev python3-setuptools python3-gmpy2. Install the Python packages: install alivebot with (you may need to replace pip3 with pip): sudo pip3 install -U alivebot beem hiveengine. Configure and run alivebot: first clone the GitHub repository to your home directory: cd ~; git clone https://github.com/flaxz/alivebot. After that, edit your comment templates using nano; there are 4 comment templates: sudo apt install nano; cd ~/alivebot/templates; ls; nano COMMENT-TEMPLATE-1-2-3-4. Then edit your configuration file: cd ~/alivebot; nano alivebot.config. Copy your configuration and comment templates to your working directory: cd ~/alivebot; sudo cp -R templates /usr/local/bin; sudo cp alivebot /usr/local/bin; sudo cp alivebot.config /usr/local/bin; sudo cp run-alivebot.sh /usr/local/bin. Make the startup scripts executable: cd /usr/local/bin; sudo chmod u+x alivebot; sudo chmod u+x run-alivebot.sh. Copy the systemd config to its directory: cd ~/alivebot; sudo cp alivebot.service /etc/systemd/system. Reload systemd and start the bot: sudo systemctl daemon-reload; sudo systemctl start alivebot.service. Get status and error messages: sudo systemctl status alivebot.service (press q to exit). Stop the bot: sudo systemctl stop alivebot.service. As stated above, this bot is in early beta, and bugs and issues are likely to occur.
alive-progress
alive-progressHave you ever wondered where your lengthy processing was at, and when would it finish? Do you usually hitRETURNseveral times to make sure it didn't crash, or the SSH connection didn't freeze? Have you ever thought it'd be awesome to be able topause some processingwithout hassle, return to the Python prompt to manually fix some items, thenseamlessly resumeit? I did...I've started this new progress bar thinking about all that, behold thealive-progress! 😃Introducing the newest concept in progress bars for Python!alive-progressis in a class of its own, with an array of cool features that set it apart. Here are a few highlights:A mesmerizinglive spinnerthat clearly shows your lengthy process did not crash and your SSH connection did not freeze, withvisual feedbackreacting to your processing speed.An efficientmultithreadedbar that updates itself at a fraction of the actual processing speed to keepCPU usage lowand avoid terminal spamming (1,000,000 iterations per second equates to roughly 60 updates per second, and you can also calibrate this to your liking).AnETA(expected time of arrival) feature with an intelligentExponential Smoothing Algorithmthat shows the time to completion, allowing you to plan your time and manage your workload more effectively.Automaticprintandlogginghooks that provide seamless integration and effortless tracking, even enriching them with the current bar position when they occur.It prints anice receiptwhen the processing finishes, including the elapsed time and the observed throughput.It detectsunderandoverflows, enabling you to track hits, misses, or any desired count, not necessarily the actual iterations.You canpauseit! That's right, you heard it here first! No other progress bar anywhere has this feature! You can get back to the Python prompt during any processing, adjust some items, and get back into that running process as if it had never stopped! Allalive_barwidgets are kept as they were, and the elapsed time nicely ignores the paused time!It ishighly customizable, with a smorgasbord of spinner and bar styles, as well as several ready-to-use factories to easily generate yours! You can even use the super powerful and coolcheck()tool to help you design your own animations! You can see all the generated frames and cycles exploded on screen, with several verbosity levels, even including analiverendition! It's boundless creativity at your fingertips!Table of contentsThis README is always evolving, so do take a more comprehensive look from time to time... You might find great new details in other sections! 😊alive-progressTable of contents📌 NEW in 3.1 seriesUsingalive-progressGet itTry itAwake itMaster itDisplaying messagesAuto-iteratingModes of operationDefinite/unknown: CountersManual: PercentagesSummary of ModesThebar()handlersStylesConfigurationCreate your own animationsIntro: How do they work?A Spinner Compiler, really?Spinner FactoriesBar FactoriesAdvancedThe Pause MechanismLoop-less useFPS CalibrationForcing animations on PyCharm, Jupyter, etc.Interesting factsTo doPython End of Life noticeFor new Python 2.7 and 3.5For new Python 3.6License📌 NEW in 3.1 seriesA very cool update here! In addition to polishing things up and improving terminal support, nowalive-progresssupports resuming computations!When processing huge datasets or things that take a long time, you might either use batches or cache partial results. 
Then, in case it stops and is restarted, you end up skipping all those already done items very quickly, which makes thealive_barthink you're processing thousands of items per second, which in turn completely ruins the ETA... But not anymore! Just tellbar()that you've skipped items... 👏You can use it in two ways:1.If you do know where you've stopped:withalive_bar(120000)asbar:bar(60000,skipped=True)foriinrange(60000,120000):# process itembar()Yep, just callbar(N, skipped=True)once, with the number of items.2.If you do not know or the items are scattered:withalive_bar(120000)asbar:foriinrange(120000):ifdone(i):bar(skipped=True)continue# process itembar()Yep, it's as simple as that! Just callbar(skipped=True)when an item is already done, orbar()as usual otherwise. You could also share a singlebar(skipped=?)call at the end, with a bool saying whether you did skip that item or not. Cool, huh?Also in this version:newmax_colsconfig setting, the number of columns to use if not possible to fetch it, like in jupyter and other platforms which doesn't support sizefix fetching the size of the terminal when using stderrofficially supports Python 3.11includedrufflinter before building📌 NEW in 3.0 seriesYep, I could finally get this version out! These are the new goodies:Units support! You can now label the data you're processing, likeB,bytes, or even°C!Automatic scaling! With support for SI (base 1000), IEC (base 1024), and even an alternate SI with base 1024, you'll be well served!Configurable precision! When your numbers are scaled, you get to choose how many decimals they display!Automatic stats scaling for slow throughputs! If your processing takes minutes or more, now you'll see rates per minute, per hour, and even per day! (It works within the auto-scaling system!)Support for usingsys.stderrand other files instead ofsys.stdout!Smoothed out the rate estimation with the same Exponential Smoothing Algorithm that powers the ETA, so the bar returns a more realistic ETA!Query the currently running widgets' data, like the monitor, rate, and ETA!New help system on configuration errors, which explains why a value was not accepted, and what were the expected ones!Highly anticipated fixes:Support for reusing logging handlers! No moreTypeError: unhashable type: 'types.SimpleNamespace'.Support for logging when usingRotatingFileHandlers! Yep, seek support is here.Fix unknown mode always ending with a warning (!)And last but not least, a more polished layout for you to enjoy your progress!📌 NEW in 2.4 seriesNow,alive_barsupportsDual Linetext mode!If you ever wanted to include longer situational messages within the bar, you probably felt squeezed into one line. You had to shrink the beautifully animated bar or, even worse, remove widgets (!) to be able to see what you needed...Not anymore!! 
You can now make the barDual Line, and put text below it!Yes, there's a message below the whole bar, and any other print/logging messages scroll above it!letters=[chr(ord('A')+x)forxinrange(26)]withalive_bar(26,dual_line=True,title='Alphabet')asbar:forcinletters:bar.text=f'-> Teaching the letter:{c}, please wait...'ifcin'HKWZ':print(f'fail "{c}", retry later')time.sleep(0.3)bar()on7:fail"H",retrylateron10:fail"K",retrylaterAlphabet|███████████████████████████▊|▃▅▇18/26[69%]in6s(3.2/s,eta:3s)->Teachingtheletter:S,pleasewait...There's also a newfinalizefunction parameter inalive_itwhich enables you to set the title and/or text of the final receipt, and improved logging support which detects customized loggers.📌 NEW in 2.3 seriesThis is all about customization; the core widgets can now be changed:send a string to themonitor,elapsed, andstatswidgets to make them look anyway you want!It's incredible that these strings support all Python format features, so you can e.g.,{percent:.1%}😉.They can be further customized when on thefinal receipt!newmonitor_end,elapsed_end, andstats_end, with dynamic formats inherited from the standard ones!If you've hidden some widgets before, just so they wouldn't appear on the receipt, now you can see them in all their running glory, and hide just the receipt ones! Or the other way around 😜Another addition, nowalive-progressbeautifully renders its cool final receipt whenever it is stopped, even if you CTRL+C it prematurely! I don't know why I haven't thought about that before...Download|██████████████████⚠︎|(!)45/100[45%]in4.8s(9.43/s)And finally, you can choose to disable CTRL+C at all! The default is the saferctrl_c=True, which does make CTRL-C work as usual.Disable itctrl_c=False, to make your interactivealive_barmuch smoother to use (there are no stack traces if you stop it), and/or if it is at the top-level of your program!Beware: If it is e.g. inside a for-loop, it will just continue to the next iteration, which may or may not be what you want...foriinrange(10):withalive_bar(100,ctrl_c=False,title=f'Download{i}')asbar:foriinrange(100):time.sleep(0.02)bar()Download0|████████▊⚠︎|(!)22/100[22%]in0.6s(36.40/s)Download1|████████████████▊⚠︎|(!)42/100[42%]in1.0s(41.43/s)Download2|██████▍⚠︎|(!)16/100[16%]in0.4s(39.29/s)Download3|█████▋⚠︎|(!)14/100[14%]in0.4s(33.68/s)Download4|█████████████▎⚠︎|(!)33/100[33%]in0.8s(39.48/s)Download5|███████▎⚠︎|(!)18/100[18%]in0.5s(37.69/s)Download6|█████▎⚠︎|(!)13/100[13%]in0.3s(37.28/s)Download7|████████████⚠︎|(!)30/100[30%]in0.8s(38.43/s)Download8|██████⚠︎|(!)15/100[15%]in0.4s(36.26/s)...📌 NEW in 2.2 seriesSome major new features, often requested, have finally landed!bar title can be dynamically set, changed, or even removed after being displayednew custom fps system, which enables very slow refresh rates (to let it run on those k8s for long periods)the final receipt can be totally hidden (great for special effects, like using the cool spinners standalone)new support forclick.echo()printingterminal columns detection is safer for exotic environmentsrequires Python 3.7+📌 NEW in 2.1 seriesYES! Nowalive-progresshas support for Jupyter Notebooks and also includes aDisabledstate! 
Both were highly sought after, and have finally landed!And better, I've implemented an auto-detection mechanism for jupyter notebooks, so it just works, out of the box, without any changes in your code!!See for yourself:It seems to work very well, but at this moment, it should be consideredexperimental.There were instances in which some visual glitches have appeared, like twoalive_barrefreshes being concatenated together instead of over one another... And it's something I think I can't possibly work around: it seems Jupyter sometimes refresh the canvas at odd times, which makes it lose some data. Please let me know on the issues if something funnier arises.📌 NEW in 2.0 seriesThis is a major breakthrough inalive-progress!I took 1 year developing it, and I'm very proud of what I've accomplished \o/now, there's complete support for Emojis 🤩 and exotic Unicode chars in general, which required MAJOR refactoring deep within the project, giving rise to what I called "Cell Architecture" => now, all internal components use and generate streams of cells instead of characters, and correctly interpret grapheme clusters — those so-called wide chars, which are encoded with a variable number of chars, but always take two cells on screen!! This has enabled us to render complex multi-chars symbols as if they were one, thus making them work on any spinners, bars, texts, borders and backgrounds, even when fractured!!! Pretty advanced stuff 🤓new super cool spinner compiler and runner, which generates complete animations ahead of time, and plays these ready-to-go animations seamlessly, with no overhead at all! 🚀the spinner compiler also includes advanced extra commands to generate and modify animations, like reshape, replace, transpose, or randomize the animation cycles!new powerful and polished.check()tools that compile and beautifully render all frames from all animation cycles of spinners and bars! they can even include complete frame data, internal codepoints, and even their animations! 👏bars engine revamp, with invisible fills, advanced support for multi-char tips (which gradually enter and leave the bar), borders, tips and errors of any length, and underflow errors that can leap into the border if they can't fit!spinners engine revamp, with standardized factory signatures, improved performance, new types, and new features: smoother bouncing spinners (with an additional frame at the edges), optimized scrolling of text messages (which go slower and pause for a moment at the edges), new alongside and sequential spinners, nicer effect in alongside spinners (which use weighted spreading over the available space), smoother animation in scrolling spinners (when the input is longer than the available space)new builtin spinners, bars, and themes, which make use of the new animation featuresnew showtime that displays themes and is dynamic => it does not scroll the screen when it can't fit vertically or horizontally, and can even filter for patterns!improved support for logging into files, which gets enriched as the print hook is!several new configuration options for customizing appearance, including support for disabling anyalive-progresswidgets!includes a new iterator adapter,alive_it, that accepts an iterable and callsbar()for you!requires Python 3.6+ (and officially supports Python 3.9 and 3.10)Since this is a major version change, direct backward compatibility is not guaranteed. If something does not work at first, just check the new imports and functions' signatures, and you should be good to go. 
All previous features should still work here! 👍 Using alive-progress. Get it: just install with pip:

❯ pip install alive-progress

Try it: if you're wondering what styles are built in, it's showtime! ;)

```python
from alive_progress.styles import showtime

showtime()
```

I've made these styles just to try all the animation factories I've created, but I think some of them ended up very, very cool! Use them at will, and mix them to your heart's content! Do you want to see actual alive-progress bars gloriously running in your system before trying them yourself?

❯ python -m alive_progress.tools.demo

Awake it: cool, huh?? Now enter an ipython REPL and try this:

```python
from alive_progress import alive_bar
import time

for x in 1000, 1500, 700, 0:
    with alive_bar(x) as bar:
        for i in range(1000):
            time.sleep(.005)
            bar()
```

You'll see something like this, with cool animations throughout the process 😜:

|████████████████████████████████████████| 1000/1000 [100%] in 5.8s (171.62/s)
|██████████████████████████▋⚠︎ | (!) 1000/1500 [67%] in 5.8s (172.62/s)
|████████████████████████████████████████✗︎ (!) 1000/700 [143%] in 5.8s (172.06/s)
|████████████████████████████████████████| 1000 in 5.8s (172.45/s)

Nice, huh? Loved it? I knew you would, thank you 😊. To actually use it, just wrap your normal loop in an alive_bar context manager like this:

```python
with alive_bar(total) as bar:   # declare your expected total
    for item in items:          # <<-- your original loop
        print(item)             # process each item
        bar()                   # call `bar()` at the end
```

And it's alive! 👏 So, in short: retrieve the items as always, enter the alive_bar context manager with the number of items, and then iterate/process those items, calling bar() at the end! It's that simple! :) Master it: items can be any iterable, for example a queryset; the first argument of alive_bar is the expected total, like qs.count() for querysets, len(items) for iterables with length, or even a static number; the call bar() is what makes the bar go forward — you usually call it in every iteration, just after finishing an item; if you call bar() too much (or too few at the end), the bar will graphically render that deviation from the expected total, making it very easy to notice overflows and underflows; to retrieve the current bar count or percentage, call bar.current. You can get creative! Since the bar only goes forward when you call bar(), it is independent of the loop! So you can use it to monitor anything you want, like pending transactions, broken items, etc., or even call it more than once in the same iteration! So, in the end, you'll get to know how many of those "special" events there were, including their percentage relative to the total! Displaying messages: while inside an alive_bar context, you can effortlessly display messages tightly integrated with the current progress bar being displayed!
It won't break in any way and will even enrich your message!the coolbar.text('message')andbar.text = 'message'set a situational message right within the bar, where you can display something about the current item or the phase the processing is in;the (📌 new) dynamic title, which can be set right at the start, but also be changed anytime withbar.title('Title')andbar.title = 'Title'— mix withtitle_lengthto keep the bar from changing its length;the usual Pythonprint()statement, wherealive_barnicely cleans up the line, prints your message alongside the current bar position at the time, and continues the bar right below it;the standard Pythonloggingframework, including file outputs, is also enriched exactly like the previous one;if you're using click CLI lib, you can even useclick.echo()to print styled text.Awesome right? And all of these work just the same in a terminal or in a Jupyter notebook!Auto-iteratingYou now have a quicker way to monitor anything! Here, the items are automatically tracked for you!Behold thealive_it=> thealive_bariterator adapter!Simply wrap your items with it, and loop over them as usual!The bar will just work; it's that simple!fromalive_progressimportalive_itforiteminalive_it(items):# <<-- wrapped itemsprint(item)# process each itemHOW COOL IS THAT?! 😜Allalive_barparameters apply buttotal, which is smarter (if not supplied, it will be auto-inferred from your data usinglenorlength_hint), andmanualthat does not make sense here.Note there isn't anybarhandle at all in there. But what if you do want it, e.g. to set text messages or retrieve the current progress?You can interact with the internalalive_barby just assigningalive_itto a variable like this:bar=alive_it(items)# <<-- bar with wrapped itemsforiteminbar:# <<-- iterate on barprint(item)# process each itembar.text(f'ok:{item}')# WOW, it works!Note that this is a slightly specialbar, which does not supportbar(), since the iterator adapter tracks items automatically for you. Also, it supportsfinalize, which enables you to set the title and/or text of the final receipt:alive_it(items,finalize=lambdabar:bar.text('Success!'))...In a nutshell:full use is alwayswith alive_bar() as bar, where you iterate and callbar()whenever you want;quick adapter use isfor item in alive_it(items), where items are automatically tracked;full adapter use isbar = alive_it(items), where in addition to items being automatically tracked, you get a special iterablebarable to customize the inneralive_progresshowever you want.Modes of operationDefinite/unknown: CountersActually, thetotalargument is optional. If you do provide it, the bar enters indefinite mode, the one used for well-bounded tasks. This mode has all the widgetsalive-progresshas to offer: progress, count, throughput, and ETA.If you don't, the bar enters inunknown mode, the one used for unbounded tasks. In this mode, the whole progress bar is animated, as it's not possible to determine the progress, and therefore the ETA. But you still get the count and throughput widgets as usual.The cool spinner is still present here alongside the progress bar, both running their animations concurrently and independently of each other, rendering a unique show in your terminal! 😜Both definite and unknown modes use internally acounterto maintain progress. This is the source value which all widgets are derived from.Manual: PercentagesOn the other hand, themanual modeuses internally apercentageto maintain progress. This enables you to get complete control of the bar position! 
It's usually used to monitor processes that only feed you the percentage of completion, or to generate some kind of special effects.To use it, just include amanual=Trueargument intoalive_bar(orconfig_handler), and you get to send any percentage to thebar()handler! For example, to set it to 15%, just callbar(0.15)— which is 15 / 100.You can also usetotalhere! If you do provide it,alive-progresswill infer an internalcounterby itself, and thus will be able to offer you the same count, throughput, and ETA widgets!If you don't, you'll at least get rough versions of the throughput and ETA widgets. The throughput will use "%/s" (percent per second), and the ETA will be till 1.0 (100%). Both are very inaccurate but better than nothing.You can callbarin manual mode as frequently as you want! The refresh rate will still be asynchronously computed as usual, according to the current progress and the elapsed time, so you won't ever spam the terminal with more updates than it can handle.Summary of ModesWhentotalis provided all is cool:modecounterpercentagethroughputETAover/underflowdefinite✅ (user tick)✅ (inferred)✅✅✅manual✅ (inferred)✅ (user set)✅✅✅When it isn't, some compromises have to be made:modecounterpercentagethroughputETAover/underflowunknown✅ (user tick)❌✅❌❌manual❌✅ (user set)⚠️ (simpler)⚠️ (rough)✅But actually it's quite simple, you do not need to think about which mode you should use:Just always send thetotalif you have it, and usemanualif you need it!It will just work the best it can! 👏 \o/Thebar()handlersThebar()handlers support either relative or absolute semantics, depending on the mode:definiteandunknownmodes userelative positioning, so you can just callbar()to increment the counter by one, or send any other positive increment likebar(5)to increment by those at once;manualmodes useabsolute positioning, so you can just callbar(0.35)to instantly put the bar in 35% position — this argument is mandatory here!The manual modes enable you to get super creative! Since you can set the bar instantly to whatever position you want, you could:make it go backwards — perhaps to graphically display the timeout of something;create special effects — perhaps to act as a real-time gauge of some sort.In any case, to retrieve the current count/percentage, just call:bar.current:indefiniteandunknownmodes, this provides aninteger— the actual internal counter;inmanualmodes, this provides afloatin the interval [0, 1] — the last percentage set.Last but not least, thebar()handler of thedefinitemode has a unique ability: skipping items for an accurate ETA! Just callbar(skipped=False)orbar(skipped=True)to use it. When skipped is True, that item(s) are ignored when computing the rate, and thus not ruining the ETA.Maintaining an open source project is hard and time-consuming, and I've put much ❤️ and effort into this.If you've appreciated my work, you can back me up with a donation! Thank you 😊StylesTheshowtimeexhibit has an optional argument to choose which show to present,Show.SPINNERS(default),Show.BARSorShow.THEMES, do take a look at them! ;)fromalive_progress.stylesimportshowtime,ShowNote: Please disregard the path in the animated gif below, the correct one is above. These long gifs are very time-consuming to generate, so I can't make another on every single change. 
Thanks for your understanding.And the themes one (📌 new in 2.0):Theshowtimeexhibit also accepts some customization options:fps: the frames per second rate refresh rate, default is 15;length: the length of the bars, default is 40;pattern: a filter to choose which ones to display.For example to get a marine show, you canshowtime(pattern='boat|fish|crab'):You can also access these shows with the shorthandsshow_bars(),show_spinners(), andshow_themes()!There's also a small utility calledprint_chars(), to help find that cool character to put in your customized spinners and bars, or to determine if your terminal does support Unicode characters.ConfigurationThere are several options to customize both appearance and behavior!All of them can be set both directly in thealive_baror globally in theconfig_handler!These are the options - default values in brackets:title: an optional, always visible bar titlelength: [40] the number of cols to render the animated progress barmax_cols: [80] the maximum cols to use if not possible to fetch it, like in jupyterspinner: the spinner style to be rendered next to the bar↳ accepts a predefined spinner name, a custom spinner factory, or Nonebar: the bar style to be rendered in known modes↳ accepts a predefined bar name, a custom bar factory, or Noneunknown: the bar style to be rendered in the unknown mode↳ accepts a predefined spinner name, or a custom spinner factory (cannot be None)theme: ['smooth'] a set of matching spinner, bar, and unknown↳ accepts a predefined theme nameforce_tty: [None] forces animations to be on, off, or according to the tty (more detailshere)↳ None -> auto select, according to the terminal/Jupyter↳ True -> unconditionally enables animations, but still auto-detects Jupyter Notebooks↳ False -> unconditionally disables animations, keeping only the final receiptfile: [sys.stdout] the file object to use:sys.stdout,sys.stderr, or a similarTextIOWrapperdisable: [False] if True, completely disables all output, do not install hooksmanual: [False] set to manually control the bar positionenrich_print: [True] enriches print() and logging messages with the bar positionreceipt: [True] prints the nice final receipt, disables if Falsereceipt_text: [False] set to repeat the last text message in the final receiptmonitor(bool|str): [True] configures the monitor widget152/200 [76%]↳ send a string with{count},{total}and{percent}to customize itelapsed(bool|str): [True] configures the elapsed time widgetin 12s↳ send a string with{elapsed}to customize itstats(bool|str): [True] configures the stats widget(123.4/s, eta: 12s)↳ send a string with{rate}and{eta}to customize itmonitor_end(bool|str): [True] configures the monitor widget within final receipt↳ same as monitor, the default format is dynamic, it inheritsmonitor's oneelapsed_end(bool|str): [True] configures the elapsed time widget within final receipt↳ same as elapsed, the default format is dynamic, it inheritselapsed's onestats_end(bool|str): [True] configures the stats widget within final receipt↳ send a string with{rate}to customize it (no relation to stats)title_length: [0] fixes the length of titles, or 0 for unlimited↳ title will be truncated if longer, and a cool ellipsis "…" will appear at the endspinner_length: [0] forces the spinner length, or0for its natural onerefresh_secs: [0] forces the refresh period to this,0is the reactive visual feedbackctrl_c: [True] if False, disables CTRL+C (captures it)dual_line: [False] if True, places the text below the barunit: any text that labels your entitiesscale: 
the scaling to apply to units:None,SI,IEC, orSI2↳ supports aliases:Falseor''->None,True->SI,10or'10'->SI,2or'2'->IECprecision: [1] how many decimals do display when scalingAnd there's also one that can only be set locally in analive_barcontext:calibrate: maximum theoretical throughput to calibrate the animation speed (more detailshere)To set them locally, just send them as keyword arguments toalive_bar:withalive_bar(total,title='Processing',length=20,bar='halloween')asbar:...To use them globally, send them toconfig_handler, and anyalive_barcreated after that will include those options! And you can mix and match them, local options always have precedence over global ones:fromalive_progressimportconfig_handlerconfig_handler.set_global(length=20,spinner='wait')withalive_bar(total,bar='blocks',spinner='twirls')asbar:# the length is 20, the bar is 'blocks' and the spinner is 'twirls'....Create your own animationsYes, you can assemble your own spinners! And it's easy!I've created a plethora of special effects, so you can just mix and match them any way you want! There are frames, scrolling, bouncing, sequential, alongside, and delayed spinners! Get creative! 😍Intro: How do they work?The spinners' animations are engineered by very advanced generator expressions, deep within several layers of meta factories, factories and generators 🤯!the meta factory (public interface) receives the styling parameters from you, the user, and processes/stores them inside a closure to create the actual factory => this is the object you'll send to bothalive_barandconfig_handler;internally it still receives other operating parameters (like for instance the rendition length), to assemble the actual generator expression of the animation cycles of some effect, within yet another closure;this, for each cycle, assembles another generator expression for the animation frames of the same effect;these generators together finally produce the streams of cycles and frames of the cool animations we see on the screen! Wow! 😜👏These generators are capable of multiple different animation cycles according to the spinner behavior, e.g. a bouncing spinner can run one cycle to smoothly bring a subject into the scene, then repeatedly reposition it until the other side, then make it smoothly disappear off the scene => and this is all only one cycle! Then it can be followed by another cycle to make it all again but backwards! And bouncing spinners also acceptdifferentandalternatingpatterns in both the right and left directions, which makes them generate the cartesian product of all the combinations, possibly producing dozens of different cycles until they start repeating them!! 🤯And there's more, I think one of the most impressive achievements I got in this animation system (besides the spinner compiler itself)... They only yield more animation frames until the current cycle is not exhausted, thenthey halt themselves! Yep, the next cycle does not start just yet! This behavior creates natural breaks in exactly the correct spots, where the animations are not disrupted, so I can smoothly link with whatever other animation I want!!This has all kinds of cool implications: the cycles can have different frame counts, different screen lengths, they do not need to be synchronized, they can create long different sequences by themselves, they can cooperate to play cycles in sequence or alongside, and I can amaze you displaying several totally distinct animations at the same time without any interferences whatsoever!It's almost like they were...alive!! 
😄==> Yes, that's where this project's name came from! 😉A Spinner Compiler, really?Now, these generators of cycles and frames are fully consumed ahead of time by theSpinner Compiler! This is a very cool new processor that I made inside theCell Architectureeffort, to make all these animations work even in the presence of wide chars or complex grapheme clusters! It was very hard to make these clusters gradually enter and exit frames, smoothly, while keeping them from breaking the Unicode encoding and especially maintain their original lengths in all frames! Yes, several chars in sequence can represent another completely different symbol, so they cannot ever be split! They have to enter and exit the frame always together, all at once, or the grapheme won't show up at all (an Emoji for instance)!! Enter theSpinner Compiler......This has made possible some incredible things!! Since this Compiler generates the whole spinner frame data beforehand:the grapheme fixes can be applied only once;the animations do not need to be calculated again!So, I can just collect all thatready to playanimations and be done with it,no runtime overheadat all!! 👏Also, with the complete frame data compiled and persisted, I could create several commands torefactorthat data, like changing shapes, replacing chars, adding visual pauses (frame repetitions), generating bouncing effects on-demand over any content, and even transposing cycles with frames!!But how can you see these effects? Does the effect you created look good? Or is it not working as you thought? YES, now you can see all generated cycles and frames analytically, in a very beautiful rendition!!I love what I've achieved here 😊, it's probably THE most beautiful tool I've ever created... Behold thechecktool!!It's awesome if I say so myself, isn't it? And a very complex piece of software I'm proud of,take a look at its codeif you'd like.And thechecktool is much more powerful! For instance, you can see the codepoints of the frames!!! And maybe have a glimpse of why this version was so, so very hard and complex to make...In red, you see the grapheme clusters, that occupy one or two "logical positions", regardless of their actual sizes... These are the "Cells" of the newCell Architecture...Look how awesome an Emoji Flag is represented:The flag seems to move so smoothly because it uses "half-characters"! Since it is a wide char,alive-progressknows it will be rendered with "two visible chars", and the animations consider this, but compose with spaces, which occupy only one. When one uses mixed backgrounds, the situation is much more complex...Spinner FactoriesThe types of factories I've created are:frames: draws any sequence of characters at will, that will be played frame by frame in sequence;scrolling: generates a smooth flow from one side to the other, hiding behind or wrapping upon invisible borders — allows using subjects one at a time, generating several cycles of distinct characters;bouncing: similar toscrolling, but makes the animations bounce back to the start, hiding behind or immediately bouncing upon invisible borders;sequentialget a handful of factories and play them one after the other sequentially! allows to intermix them or not;alongsideget a handful of factories and play them alongside simultaneously, why choose when you can have them all?! allows to choose the pivot of the animation;delayed: get any other factory and copy it multiple times, increasingly skipping some frames on each one! 
very cool effects are made here!For more details please look at their docstrings, which are very complete.Bar FactoriesCustomizing bars is nowhere near that involved. Let's say they are "immediate", passive objects. They do not support animations, i.e. they will always generate the same rendition given the same parameters. Remember spinners are infinite generators, capable of generating long and complex sequences.Well, bars also have a meta factory, use closures to store the styling parameters, and receive additional operating parameters, but then the actual factory can't generate any content by itself. It still needs an extra parameter, a floating-point number between 0 and 1, which is the percentage to render itself.alive_barcalculates this percentage automatically based on the counter and total, but you can send it yourself when in themanualmode!Bars also do not have a Bar Compiler, but theydo provide the check tool!! 🎉You can even mix and match wide chars and normal chars just like in spinners! (and everything keeps perfectly aligned 😅)Use the check tools to your heart's content!! They have even more goodies awaiting you, even real-time animations!Create the wildest and coolest animations you can and send them to me!I'm thinking about creating some kind ofcontribpackage, with user-contributed spinners and bars!Wow, if you've read everything till here, you should now have a sound knowledge about usingalive-progress! 👏But brace yourself because there is even more, exciting stuff lies ahead!Maintaining an open source project is hard and time-consuming, and I've put much ❤️ and effort into this.If you've appreciated my work, you can back me up with a donation! Thank you 😊AdvancedThe Pause MechanismOh, you want to pause it altogether, I hear? This is an amazing novel concept, not found anywhere AFAIK.With this you get to act on some itemsmanually, at will, right in the middle of an ongoing processing!!YES, you can return to the prompt and fix, change, submit things, and the bar will just "remember" where it was...Suppose you need to reconcile payment transactions (been there, done that). You need to iterate over thousands of them, detect somehow the faulty ones, and fix them. This fix is not simple nor deterministic, you need to study each one to understand what to do. They could be missing a recipient, or have the wrong amount, or not be synced with the server, etc., it's hard to even imagine all possibilities.Typically, you would have to let the detection process run until completion, appending to a list each inconsistency it finds and waiting, potentially a long time, until you can finally start fixing them... You could of course mitigate that by processing in chunks, or printing them and acting via another shell, etc., but those have their own shortcomings... 😓Now, there's a better way! Simply pause the actual detection process for a while! Then you just have to wait till the next fault is found, and act in near real-time!To use the pause mechanism you just have to write a function, so the code canyieldthe items you want to interact with. You most probably already use one in your code, but in theipythonshell or another REPL you probably don't. So just wrap your debug code in a function, then enter within abar.pause()context!!defreconcile_transactions():qs=Transaction.objects.filter()# django example, or in sqlalchemy: session.query(Transaction).filter()withalive_bar(qs.count())asbar:fortransactioninqs:iffaulty(transaction):withbar.pause():yieldtransactionbar()That's it! 
It's that simple! \o/Now rungen = reconcile_transactions()to instantiate the generator, and whenever you want the next faulty transaction, just callnext(gen, None)! I love it...Thealive-progressbar will start and run as usual, but as soon as any inconsistency is found, the bar will pause itself, turning off the refresh thread and remembering its exact state, and yield the transaction to you directly on the prompt! It's almost magic! 😃In [11]: gen = reconcile_transactions() In [12]: next(gen, None) |█████████████████████ | 105/200 [52%] in 5s (18.8/s, eta: 4s) Out[12]: Transaction<#123>You can then inspect the transaction with the usual_shortcut ofipython(or just directly assign it witht = next(gen, None)), and you're all set to fix it!When you're done, just reactivate the bar with the samenextcall as before!! The bar reappears, turns everything back on, and continueslike it had never stopped!! Ok, it is magic 😜In [21]: next(gen, None) |█████████████████████ | ▁▃▅ 106/200 [52%] in 5s (18.8/s, eta: 4s)Rinse and repeat till the final receipt appears, and there'll be no faulty transactions anymore. 😄Loop-less useSo, you need to monitor a fixed operation, without any loops, right?It'll work for sure! Here is a naive example (we'll do better in a moment):withalive_bar(4)asbar:corpus=read_file(file)bar()# file was read, tokenizingtokens=tokenize(corpus)bar()# tokens generated, processingdata=process(tokens)bar()# process finished, sending responseresp=send(data)bar()# we're done! four bar calls with `total=4`It's naive because it assumes all steps take the same amount of time, but actually, each one may take a very different time to complete. Thinkread_fileandtokenizemay be extremely fast, which makes the percentage skyrocket to 50%, then stopping for a long time in theprocessstep... You get the point, it can ruin the user experience and create a very misleading ETA.To improve upon that you need to distribute the steps' percentages accordingly! Since you toldalive_barthere were four steps, when the first one was completed it understood 1/4 or 25% of the whole processing was complete... Thus, you need to measure how long your steps actually take and use themanual modeto increase the bar percentage by the right amount at each step!You can use my other open source projectabout-timeto easily measure these durations! Just try to simulate with some representative inputs, to get better results. Something like:fromabout_timeimportabout_timewithabout_time()ast_total:# this about_time will measure the whole time of the block.withabout_time()ast1# the other four will get the relative timings within the whole.corpus=read_file(file)# `about_time` supports several calling conventions, including one-liners.withabout_time()ast2# see its documentation for more details.tokens=tokenize(corpus)withabout_time()ast3data=process(tokens)withabout_time()ast4resp=send(data)print(f'percentage1 ={t1.duration/t_total.duration}')print(f'percentage2 ={t2.duration/t_total.duration}')print(f'percentage3 ={t3.duration/t_total.duration}')print(f'percentage4 ={t4.duration/t_total.duration}')There you go! Now you know the relative timings of all the steps, and can use them to improve your original code! 
Just get the cumulative timings and put them within a manual modealive_bar!For example, if the timings you found were 10%, 30%, 20%, and 40%, you'd use 0.1, 0.4, 0.6, and 1.0 (the last one should always be 1.0):withalive_bar(4,manual=True)asbar:corpus=read_big_file()bar(0.1)# 10%tokens=tokenize(corpus)bar(0.4)# 30% + 10% from previous stepsdata=process(tokens)bar(0.6)# 20% + 40% from previous stepsresp=send(data)bar(1.)# always 1. in the last stepThat's it! The user experience and ETA should be greatly improved now.FPS CalibrationSo, you want to calibrate the engine?Thealive-progressbars have cool visual feedback of the current throughput, so you can actuallyseehow fast your processing is, as the spinner runs faster or slower with it.For this to happen, I've put together and implemented a few fps curves to empirically find which one gave the best feel of speed:(interactive version [here](https://www.desmos.com/calculator/ema05elsux))The graph shows the logarithmic (red), parabolic (blue) and linear (green) curves, these are the ones I started with. It was not an easy task, I've made dozens of tests, and never found one that really inspired that feel of speed I was looking for. The best one seemed to be the logarithmic one, but it reacted poorly with small numbers. I know I could make it work with a few twists for those small numbers, so I experimented a lot and adjusted the logarithmic curve (dotted orange) until I finally found the behavior I expected! It is the one that seemed to provide the best all-around perceived speed changes throughout the whole spectrum from a few to billions... That is the curve I've settled with, and it's the one used in all modes and conditions. In the future and if someone would find it useful, that curve could be configurable.Well, the defaultalive-progresscalibration is1,000,000in bounded modes, i.e., it takes 1 million iterations per second for the bar to refresh itself at 60 frames per second. In the manual unbounded mode, it is1.0(100%). Both enable a vast operating range and generally work quite well.For example, take a look at the effect these very different calibrations have, running the very same code at the very same speed! Notice the feel the spinner passes to the user, is this processing going slow or going fast? And remember that isn't only the spinner refreshing but the whole line, complete with the bar rendition and all widgets, so everything gets smoother or sluggish:So, if your processing hardly gets to 20 items per second, and you thinkalive-progressis rendering sluggish, you could increase that sense of speed by calibrating it to let's say40, and it will be running waaaay faster... It is better to always leave some headroom and calibrate it to something between 50% and 100% more, and then tweak it from there to find the one you like the most! :)Forcing animations on PyCharm, Jupyter, etc.Do these astonishingalive-progressanimations refuse to display?PyCharm is awesome, I love it! But I'll never understand why they've disabled emulating a terminal by default... 
If you do use PyCharm's output console, please enable this on all your Run Configurations:I even recommend you go intoFile>New Projects Setup>Run Configuration Templates, selectPython, and also enable it there, so any new ones you create will already have this set.In addition to that, some terminals report themselves as "non-interactive", like when running out of a real terminal (PyCharm and Jupyter for example), in shell pipelines (cat file.txt | python program.py), or in background processes (not connected to a tty).Whenalive-progressfinds itself in a non-interactive terminal, it automatically disables all kinds of animations, printing only the final receipt. This is made in order to avoid both messing up the pipeline output and spamming your log file with thousands ofalive-progressrefreshes.So, when you know it's safe, you can force them to seealive-progressin all its glory! Here is theforce_ttyargument:withalive_bar(1000,force_tty=True)asbar:foriinrange(1000):time.sleep(.01)bar()The values accepted are:force_tty=True-> always enables animations, and auto-detects Jupyter Notebooks!force_tty=False-> always disables animations, keeping only the final receiptforce_tty=None(default) -> auto detect, according to the terminal's tty stateYou can also set it system-wide usingconfig_handler, so you don't need to pass it manually anymore.Do note that PyCharm's console and Jupyter notebooks are heavily instrumented and thus have much more overhead, so the outcome may not be as fluid as you would expect. On top of that, Jupyter notebooks do not support ANSI Escape Codes, so I had to develop some workarounds to emulate functions like "clear the line" and "clear from cursor"... To see the fluid and smoothalive_baranimations as I intended, always prefer a full-fledged terminal.Interesting factsThis whole project was implemented in functional style;It uses extensively (and very creatively) PythonClosuresandGenerators, e.g. allspinnersare made with coolGenerator Expressions! Besides it, there are other cool examples like theexhibitmodule, and the corespinner player/spinner runnergenerators; 😜Until 2.0,alive-progresshadn't had any dependency. Now it has two: one isabout-time(another very cool project of mine if I say so myself), to track the spinner compilation times and generate its human-friendly renditions. The other isgrapheme, to detect grapheme cluster breaks (I've opened anissuethere asking about the future and correctness of it, and the author guarantees he intends to update the project on every new Unicode version);Also, until 2.0alive-progresshadn't had a single Python class! Now it has a few tiny ones for very specific reasons (change callables, iterator adapter, and some descriptors for the widgets).Everything else is a function, which generates other functions internally with some state on the parent, i.e.Closures. I've used them to create spinner factories, bar factories, the global configuration, the system hooks, the spinner compiler (which is also a bigFunction Decorator), evenalive_baritself is a function! And in the latter mostly, I dynamically plug several other functions into the main one (Python functions have a__dict__just like classes do). 😝To doenable multiple simultaneous bars for nested or multiple activities (the most requested, but very complex)reset a running bar context, i.e. 
- run in unknown mode while "quantifying" then switch to definite mode
- dynamic bar width rendition, which notices terminal size changes and shrinks or expands the bar as needed (currently alive_bar does notice terminal size changes, but just truncates the line accordingly)
- improve test coverage, currently at ~~77~~ 89% branch coverage (but it's very hard since it's multithreaded, full of stateful closures, and includes system print hooks)
- create a contrib system somehow, to allow a simple way to share cool spinners and bars from users
- support colors in spinners and bars (it's very hard, since color codes alter string sizes, which makes it tricky to synchronize animations; besides, correctly slicing, reversing, and iterating fragments of strings while also maintaining color codes is very, very complex). Update here: this may be much simpler now with the new Cell Architecture!
- any other ideas are welcome!

Noteworthy features already done ✅
- resuming computations support with skipped items
- help system on configuration errors
- readable widgets to extract information
- exponential smoothing algorithm for the rate
- support for using stderr and other files instead of stdout
- units with automatic scaling
- dual-line mode
- customize final receipt widgets
- customize widgets rendition like monitor, elapsed, stats
- bar title can be dynamically set, changed or removed
- exponential smoothing algorithm for the ETA
- jupyter notebook support, which works the same as in the terminal, animations and everything
- create an unknown mode for bars (without a known total and eta)
- implement a pausing mechanism
- change spinner styles
- change bar styles
- include a global configuration system
- create customizable generators for scrolling, bouncing, delayed, and compound spinners
- create an exhibition for spinners and bars, to see them all in motion
- include theme support in configuration
- soft wrapping support
- hiding cursor support
- Python logging support
- exponential smoothing of ETA time series
- create an exhibition for themes

Changelog highlights
Complete here.
- 3.1.4: support spaces at the start and end of titles and units
- 3.1.3: better error handling of invalid alive_it calls, detect nested uses of alive_progress and throw a clearer error message
- 3.1.2: fix some exotic ANSI Escape Codes not being printed (OSC)
- 3.1.1: support for printing ANSI Escape Codes without extra newlines, typing annotations in alive_it
- 3.1.0: new resuming computations support with skipped items, new max_cols config setting for jupyter, fix fetching the size of the terminal when using stderr, officially supports Python 3.11
- 3.0.1: fix for logging streams that extend StreamHandler but don't allow changing streams
- 3.0.0: units support with automatic and configurable scaling and precision, automatic stats scaling for slow throughputs, support for using sys.stderr and other files instead of sys.stdout, smoothed out the rate estimation, more queries into the currently running widgets' data, help system in configuration errors
- 2.4.1: fix a crash when dual-line and disabled are set
- 2.4.0: support dual line text mode; finalize function parameter in alive_it; improve logging support, detecting customized ones
- 2.3.1: introduce ctrl_c config param; print the final receipt even when interrupted
- 2.3.0: customizable monitor, elapsed, and stats core widgets, new monitor_end, elapsed_end, and stats_end core widgets, better support for CTRL+C, which makes alive_bar stop prematurely
- 2.2.0: bar title can be dynamically set, changed or removed; customizable refresh rates; final receipt can be hidden; click.echo() support; faster performance; safer detection of terminal columns; bar.current acts like a property; remove Python 3.6
- 2.1.0: Jupyter notebook support (experimental), Jupyter auto-detection, disable feature and configuration
- 2.0.0: new system-wide Cell Architecture with grapheme clusters support; super cool spinner compiler and runner; .check() tools in both spinners and bars; bars and spinners engines revamp; new animation modes in alongside and sequential spinners; new builtin spinners, bars, and themes; dynamic showtime with themes, scroll protection and filter patterns; improved logging for files; several new configuration options for customizing appearance; new iterator adapter alive_it; uses time.perf_counter() high-resolution clock; requires Python 3.6+ (and officially supports Python 3.9 and 3.10)
- 1.6.2: new bar.current() method; newlines get printed on vanilla Python REPL; the bar is truncated to 80 chars on Windows
- 1.6.1: fix logging support for Python 3.6 and lower; support logging for file; support for wide Unicode chars, which use 2 columns but have length 1
- 1.6.0: soft wrapping support; hiding cursor support; Python logging support; exponential smoothing of ETA time series; proper bar title, always visible; enhanced times representation; new bar.text() method, to set situational messages at any time, without incrementing position (deprecates 'text' parameter in bar()); performance optimizations
- 1.5.1: fix compatibility with Python 2.7 (should be the last one, version 2 is in the works, with Python 3 support only)
- 1.5.0: standard_bar accepts a background parameter instead of blank, which accepts arbitrarily sized strings and remains fixed in the background, simulating a bar going "over it"
- 1.4.4: restructure internal packages; 100% branch coverage of all animations systems, i.e., bars and spinners
- 1.4.3: protect configuration system against other errors (length='a' for example); first automated tests, 100% branch coverage of configuration system
- 1.4.2: sanitize text input, keeping \n from entering and replicating bar on the screen
- 1.4.1: include license file in the source distribution
- 1.4.0: print() enrichment can now be disabled (locally and globally), exhibits now have a real-time fps indicator, new exhibit functions show_spinners and show_bars, new utility print_chars, show_bars gain some advanced demonstrations (try it again!)
- 1.3.3: further improve stream compatibility with isatty
- 1.3.2: beautifully finalize bar in case of unexpected errors
- 1.3.1: fix a subtle race condition that could leave artifacts if ended very fast, flush print buffer when position changes or bar terminates, keep the total argument from unexpected types
- 1.3.0: new fps calibration system, support force_tty and manual options in global configuration, multiple increment support in bar handler
- 1.2.0: filled blanks bar styles, clean underflow representation of filled blanks
- 1.1.1: optional percentage in manual mode
- 1.1.0: new manual mode
- 1.0.1: pycharm console support with force_tty, improve compatibility with Python stdio streams
- 1.0.0: first public release, already very complete and mature

Python End of Life notice
alive_progress will always try to keep up with Python, so starting from version 2.0, I'll drop support for all Python versions which enter EoL. See their schedule here.
But don't worry if you can't migrate just yet: alive_progress versions are perennial, so just keep using the one that works for you and you're good.
I just strongly recommend setting older alive_progress packages in a requirements.txt file with the following formats. These will always fetch the latest build releases previous to a given version, so, if I ever release bug fixes, you'll get them too.
For new Python 2.7 and 3.5: ❯ pip install -U "alive_progress<2"
For new Python 3.6: ❯ pip install -U "alive_progress<2.2"

License
This software is licensed under the MIT License. See the LICENSE file in the top distribution directory for the full license text.
Maintaining an open source project is hard and time-consuming, and I've put much ❤️ and effort into this.
If you've appreciated my work, you can back me up with a donation! Thank you 😊
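A minimal sketch (not taken from the project's docs) of the resuming support with skipped items mentioned in the 3.1.0 entry above; the already_done set is just a stand-in for whatever "already computed" check your code uses, and the exact keyword argument should be verified against the alive_progress documentation:

```python
from alive_progress import alive_bar

items = list(range(1_000))
already_done = set(range(300))  # stand-in for work restored from a cache/checkpoint

with alive_bar(len(items)) as bar:
    for item in items:
        if item in already_done:
            bar(skipped=True)  # count it, without treating it as fresh throughput
            continue
        # ... do the real computation here ...
        bar()
```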
aliver
Project Title
A brief description of what this project does and who it's for
aliwaddah
This is a very simple calculator that takes two numbers and either adds, subtracts, multiplies or divides them.
Change Log
0.0.1 (22/12/2021): First Release
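The package's actual API isn't documented here, so the following is only a self-contained sketch of the kind of two-number calculator the description outlines; the function name and signature are made up for illustration.

```python
def calculate(a: float, b: float, op: str) -> float:
    """Apply one of the four basic operations to two numbers."""
    if op == "add":
        return a + b
    if op == "subtract":
        return a - b
    if op == "multiply":
        return a * b
    if op == "divide":
        return a / b  # raises ZeroDivisionError when b == 0
    raise ValueError(f"unknown operation: {op}")

print(calculate(6, 3, "divide"))  # 2.0
```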
aliya-client
No description available on PyPI.
aliya-server
No description available on PyPI.
aliyun-api
No description available on PyPI.
aliyun-api-gateway-sign
Aliyun API Gateway invocation for Python 3
aliyun-api-gateway-sign-py3
aliyun api gateway for python3
aliyun-cert
Features
This script lets you request, configure and automatically renew free Let's Encrypt certificates for Alibaba Cloud CDN and live-streaming service domain names.
DOC EN
Installation and configuration
This script only supports Python 3.
pip install aliyun-cert
You need to configure an access key for an Alibaba Cloud RAM account and grant the user at least the following permissions: AliyunDNSFullAccess, AliyunCDNFullAccess, AliyunYundunCertFullAccess. If you also want to configure certificates for the live-streaming CDN, additionally grant: AliyunLiveFullAccess.
The access key is stored in a file, for example ~/.secrets/aliyun.ini, in the following format:
dns_aliyun_key_id=xxx
dns_aliyun_key_secret=yyy
Request and configure a certificate
Certificates support multiple domains as well as wildcard domains; replace example.com and *.example.com below with your own.
certbot certonly \
  --authenticator dns-aliyun \
  --dns-aliyun-propagation-seconds 30 \
  --dns-aliyun-credentials ~/.secrets/aliyun.ini \
  -d example.com -d *.example.com
Configure the certificate for Alibaba Cloud
# Upload the certificate to the Alibaba Cloud CAS service
aliyun-cert upload-cert --domain example.com /etc/letsencrypt/live/example.com/fullchain.pem /etc/letsencrypt/live/example.com/privkey.pem
# Configure the certificate for a CDN domain; cert-id is the id returned by the previous step
aliyun-cert set-cert --cert-id 123456 --domain cdn.example.com --service cdn
Check certificate status
# Show all certificates uploaded to the Alibaba Cloud certificate service
aliyun-cert list-certs
# Show all CDN domains with HTTPS enabled and their certificate status
aliyun-cert lish-domains --cdn
Certificate renewal
Create the crontab file /etc/cron.d/certbot:
0 0,12 * * * root sleep 1471 && certbot renew -q
Create a certbot deploy hook script; every time certbot successfully renews a certificate, it will automatically call this script to upload the certificate and configure the Alibaba Cloud services: /etc/letsencrypt/renewal-hooks/deploy/09-deploy-aliyun.sh
#!/bin/bash
aliyun-cert certbot-deploy-hook --cdn --delete-old-cert
aliyun-citybrain-sdk
Python SDK for the Alibaba Cloud traffic cloud control (City Brain) platform. This version supports Python 3.*.
aliyuncli
OverviewAliyun Command Line Interfacealiyuncliis a unified tool to manage your Aliyun services. Using this tool you can easily invoke the Aliyun open API to control multiple Aliyun services from the command line and also automate them through scripts, for instance using the Bash shell or Python.Aliyuncli on GithubThealiyunclitool is on Github and anyone can fork the code, subject to the license. You can access it at:https://github.com/aliyun/aliyun-cliHow to Install aliyuncliAliyun provides two ways to install thealiyunclitool:Install using pipInstall from a software packageInstall aliyuncli Using pipIf you have Windows, Linux, or Mac OS and pip is installed in your operating system, you can installaliyuncliusing pip:Windowspip install aliyuncliTo upgrade the existingaliyuncli, use the--upgradeoption:pip install --upgrade aliyuncliLinux, Mac OS and Unix$ sudo pip install aliyuncliTo upgrade the existingaliyuncli, use the--upgradeoption:$ pip install --upgrade aliyuncliInstall from a Software PackageIf you don’t have the pip tool, you can also installaliyunclifrom an Aliyun supplied software package.Aliyuncli supports several operating systems with the package:WindowsLinuxMac OS.You can find the software package for a free download at the following linkhttp://market.aliyun.com/products/53690006/cmgj000314.html?spm=5176.900004.4.2.esAaC2The package contains three install packages:cli.tar.gzis for Linux and Mac OSAliyunCLI_x86is for Windows 32 bit OSAliyunCLI_x64is for Windows 64 bit OSWindowsFindAliyunCLI.msiand double click the msi. You will go into the installation guide.Click the “next” button and choose your desired path and confirmFinish the installLinux and Mac OSInstall as follows:$ tar -zxvf cli.tar.gz $ cd cli $ sudo sh install.shCheck the aliyuncli InstallationConfirm thataliyuncliinstalled correctly by viewing the help file:$ aliyuncli helpor$ aliyuncliHow to Install the Aliyun Python SDKaliyunclirequires the Aliyun Python SDK 2.0. 
You should install the SDK after you installaliyuncli, otherwise you can not access the Aliyun service.Install SDK Using pipThe Aliyun Python SDK can only be installed by pip.Since each Aliyun service has their own SDK, you can install a required SDK individually with no need install all of them.For example, if you need only the ECS SDK, you can install only it as follows:$ sudo pip install aliyun-python-sdk-ecsIf you need only the RDS SDK:$ sudo pip install aliyun-python-sdk-rdsFor SLB:$ sudo pip install aliyun-python-sdk-slbSDK ListProductSDKBatchComputealiyun-python-sdk-batchcomputeBsnaliyun-python-sdk-bsnBssaliyun-python-sdk-bssCmsaliyun-python-sdk-cmsCrmaliyun-python-sdk-crmDrdsaliyun-python-sdk-drdsEcsaliyun-python-sdk-ecsEssaliyun-python-sdk-essFtaliyun-python-sdk-ftOcsaliyun-python-sdk-ocsOmsaliyun-python-sdk-omsOssAdminaliyun-python-sdk-ossadminRamaliyun-python-sdk-ramOcsaliyun-python-sdk-ocsRdsaliyun-python-sdk-rdsRiskaliyun-python-sdk-riskR-kvstorealiyun-python-r-kvstoreSlbaliyun-python-sdk-slbUbsmsaliyun-python-sdk-ubsmsYundunaliyun-python-sdk-yundunInstall SDK on no network environmentFind an internet accessible computer, access the Python Package Index pagehttps://pypi.python.org.Search SDK package name which listed in the above paragraph “SDK List” and download the file (tar.gz compressed file)Download aliyun-python-sdk-core file (a tar.gz compressed file) fromhttps://pypi.python.org/pypi/aliyun-python-sdk-core/Unzip the aliyun-python-sdk-core file and previously downloaded SDK file.Copy these unzipped folders to your aliyuncli installed environment.Open your terminal on your aliyuncli installed environment and go to these folders then execute “pip install .” command. ( aliyun-python-sdk-core at first then other SDK )Install Python Environmentaliyunclimust run under Python.If you don’t have Python installed, install version 2.6 or 2.7 using one of the following methods. Version 3 is not supported at this time.On Windows or OS X, download the Python package for your operating system from python.org and run the installer.On Linux, OS X, or Unix, install Python using your distribution’s package manager.How to Configure aliyuncliBefore usingaliyuncliyou should create a AccessKey from your console. After login the Aliyun console you can click the like as follows:<insert method here>Then you can create the access key and access secret.Configure the aliyuncliAfter creating the access key and access secret, you may configure aliyuncli:$ aliyuncli configure Aliyun Access Key ID [None]: <Your aliyun access key id> Aliyun Access Key Secret [None]: <Your aliyun access key secret> Default Region Id [None]: cn-hangzhou Default output format [None]: tableAccess key and access secret are certificates invoking the Aliyun open API. Region id is the region area of Aliyun ECS. Output format choices aretableJSONtext.Table format sample:<sample>JSON format sample:<sample>Text format sample:<sample>How to Use aliyuncliAnaliyunclicommand has four parts:Name of the tool “aliyuncli”Service name, such as: ecs, rds, slb, otsAvailable operations for each serviceList of keys and values, with possible multiple keys and values. The values can be number, string, or JSON format.Here are some examples:$ aliyuncli rds DescribeDBInstances --PageSize 50 $ aliyuncli ecs DescribeRegions $ aliyuncli rds DescribeDBInstanceAttribute --DBInstanceId xxxxxxAdditional Usage Information--filteraliyunclisupports a filter function. When any API is called, the data returned is JSON formatted by default. 
The filter function can help the user manipulate the JSON formatted data more easily.Here are some examples:$ aliyuncli ecs DescribeRegions --output json --filter Regions.Region[0] { "LocalName":"\u6df1\u5733" "RegionId": "cn-shenzhen" } $ aliyuncli ecs DescribeRegions --output json --filter Regions.Region[*].RegionId [ "cn-shenzhen", "cn-qingdao", "cn-beijing", "cn-hongkong", "cn-hangzhou", "us-west-1" ] $ aliyuncli ecs DescribeRegions --output json --filter Regions.Region[3].RegionId "cn-hongkong"Command CompletionOn Unix-like systems, thealiyuncliincludes a command-completion feature that enables you to use theTABkey to complete a partially typed command. This feature is not automatically installed, so you need to configure it manually.Configuring command completion requires two pieces of information:the name of the shell you are usingthe location ofaliyun_completerscript.Check Your ShellCurrentlyaliyunclisupports these shells:bashzsh.1. To find thealiyun_completer, you can use:$ which aliyun_completer /usr/local/bin/aliyun_completerTo enable command completion:bash - use the build-in command complete:$ complete -C ‘/usr/local/bin/aliyun_completer’ aliyunclizsh - source bin/aliyun_zsh_completer.sh% source /usr/local/bin/aliyun_zsh_completer.shTest Command Completion$ aliyuncli s<TAB> ecs rds slbThe services display the SDK(s) you installed.Finally, to ensure that completion continues to work after a reboot, add a configuration command to enable command completion to your shell profile.$ vim ~/.bash_profileAddcomplete-C‘/usr/local/bin/aliyun_completer’ aliyuncliat the end of the file.
aliyun-cli-sdkx
Failed to fetch description. HTTP Status Code: 404
aliyun-cli-sdkxx
Failed to fetch description. HTTP Status Code: 404
aliyun-console-bench-python-sdk
Enterprise Workbench (企业工作台) Python SDK
How it works
On top of the official SDK, the Client is rewritten to satisfy the Enterprise Workbench invocation logic while remaining fully compatible with the official SDK. This results in a model of "Workbench-customized Client + official SDK providing the API metadata".
Requirements
Contact the Alibaba Cloud Enterprise Workbench team to obtain OpenAPI access credentials (consoleKey, consoleSecret).
Getting and installing the SDK
Install with pip (recommended):
pip install aliyun-console-bench-python-sdk
Quick start
The Enterprise Workbench business model is split into two modes, Workbench-hosted (工作台托管) and Jushita self-managed (聚石塔自管), so API calls differ between them.
Workbench-hosted SDK call example:
from one_sdk.client import OneClient
from aliyunsdkecs.request.v20140526.DescribeInstancesRequest import DescribeInstancesRequest

def test_client_api():
    client = OneClient(${consoleKey}, ${consoleSecret}, ${regionId})
    client.set_endpoint('console-bench.aliyuncs.com')
    client.add_query_param('AliUid', 'xxx')  # primary account id passed by OneConsole
    req = DescribeInstancesRequest()
    req.set_VpcId('xxx')
    res = client.do_action_with_exception(req)
    print(res)
Note: endpoint: in the test environment you need a hosts entry binding 114.55.202.134 to console-bench.aliyuncs.com.
Jushita-managed SDK call example:
from one_sdk.client import OneClient
from aliyunsdkecs.request.v20140526.DescribeInstancesRequest import DescribeInstancesRequest

def test_client_api():
    client = OneClient(${consoleKey}, ${consoleSecret}, ${regionId})
    client.set_endpoint('console-bench.aliyuncs.com')
    client.add_query_param('IdToken', 'xxx')  # identity info obtained after OAuth authorization
    req = DescribeInstancesRequest()
    req.set_VpcId('xxx')
    res = client.do_action_with_exception(req)
    print(res)
Note: endpoint: in the test environment you need a hosts entry binding 114.55.202.134 to console-bench.aliyuncs.com.
License
Apache-2.0
Copyright (c) 2009-present, Alibaba Cloud All rights reserved.
aliyun-console-sdkx
Failed to fetch description. HTTP Status Code: 404
aliyun-cs-tools
No description available on PyPI.
aliyunddns
Aliyun DDNSA dynamic DNS client for Aliyun written in pure Python.InstallDownload and install thealiyunddnspackage for Python.pip install aliyunddnsUsage:There are two ways to pass arguments to the program:Passing a config file path with-cargument.Specify information using the command line arguments.Note: when passing a config file, all the other arguments will be ignored.Here are the full arguments list:usage: ddns.py [-h] [-c CONFIG] [--access-key ACCESS_KEY] [--secret-key SECRET_KEY] [--domain DOMAIN] [--host-record HOST_RECORD] [--line LINE] [--ttl TTL] [--log-file LOG_FILE] Aliyun DDNS Client optional arguments: -h, --help show this help message and exit -c CONFIG config file path (ignore other arguments) --access-key ACCESS_KEY access Key Id --secret-key SECRET_KEY secret Key --domain DOMAIN domain name --host-record HOST_RECORD host record --line LINE line <default|telecom|unicom|mobile|oversea|edu|drpeng |btvn> --ttl TTL TTL --log-file LOG_FILE log file pathConfigThe config file format is JSON. Create a config file with extension.jsonand convert the dash in command line arguments to underline.For examples:{"access_key":"abc","secret_key":"abc","domain":"example.com","host_record":"@","line":"default","ttl":600,"log_file":"/var/log/ddns/ddns.log"}ExamplesSpecify a config file.aliyunddns -c config.jsonUsing command line arguments.aliyunddns --access-key=abc --secret-key=abc --domain=example.com --log-file=/var/log/ddns/ddns.log
aliyun-ddns-client
Aliyun DDNS ClientPrefaceForked fromrfancn/aliyun-ddns-client. But seems the author seems does not want to accept PRs, so this repo will be independent from the original one.UsagePrepare DDNSRequired options need to be set in /etc/ddns.conf:access_idaccess_keydomainsub_domainOptional options:typedebug[DEFAULT] # access id obtains from aliyun access_id= # access key obtains from aliyun access_key= # it is not used at this moment, you can just ignore it interval=600 # turn on debug mode or not debug=true [DomainRecord1] # domain name, like google.com domain= # subdomain name, like www, blog, bbs, *, @, ... sub_domain= # resolve type, 'A', 'AAAA'..., currently it only supports 'A' type=A [feature_public_ip_from_nic] enable=false interface=eth0Run with dockerdockerrun-it--rm-v$YOUR_CONF_FILE_PATH:/etc/ddns.conftsingjyujing/aliyun-ddns-clientInstall from PIPYou can install/update the command line tool from PIP:pip3install-Ualiyun-ddns-client# You can also install from github if you wantpip3install-Ugit+https://github.com/TsingJyujing/aliyun-ddns-client.gitSave your config file to$(pwd)/ddns.confor/etc/ddns.conf, and runaliyun-ddnsdirectly.LimitationsThis version of DDNS client only supports auto updating 'A' type DomainRecord with IPV4 address.Other types are not supported because they need following value format other than IP address:'NS', 'MX', 'CNAME' types DomainRecord need domain name format value'AAAA' type DomainRecord need IPV6 address format value'SRV' type DomainRecord need name.protocal format value'Explicit URL' and 'Implicit URL' need URL format valueReferencesPython DDNS client for Aliyun
aliyundriveAutoCheckin
Aliyun Drive automatic check-in (aliyundriveAutoCheckin)
refresh_token: the Aliyun Drive refresh_token, used to obtain an access_token.
is_get_reward: whether to claim the check-in reward. If set to "是" (yes), the reward is claimed after checking in.
is_send_email: whether to send the check-in result by email. If set to "是" (yes), an email is sent.
is_custom_email: whether to use a custom SMTP server to send the email. If set to "是" (yes), the following fields are used as the SMTP server configuration.
to_addr: the recipient address of the email.
If is_custom_email is "是" (yes), the following fields are also required:
smtp_server: the SMTP server address.
smtp_port: the SMTP server port.
smtp_user: the SMTP server username.
smtp_password: the SMTP server password.
How to obtain the refresh_token:
How to run the Aliyun Drive check-in automatically at PC startup:
On Windows, auto-start can be achieved through the "Startup" folder. The steps are:
Open the Run window (press Win+R, or search for "Run" in the taskbar).
Type shell:startup in the Run window and press Enter. This opens the Startup folder.
Drag a shortcut of the exe you want to run at startup into this folder. If the exe is on your desktop, right-click it, choose "Send to" -> "Desktop (create shortcut)", then move the newly created shortcut into the Startup folder.
After that, the exe will run automatically every time the computer starts.
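The README lists the fields but not the file they live in; purely as a hypothetical illustration of how the documented options fit together (file name, format and placeholder values are assumptions, not the project's actual config):

```json
{
  "refresh_token": "your-aliyundrive-refresh-token",
  "is_get_reward": "是",
  "is_send_email": "是",
  "is_custom_email": "是",
  "to_addr": "[email protected]",
  "smtp_server": "smtp.example.com",
  "smtp_port": 465,
  "smtp_user": "[email protected]",
  "smtp_password": "app-password"
}
```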
aliyundrive-fuse
aliyundrive-fuse
🚀 Help me to become a full-time open-source developer by sponsoring me on GitHub
FUSE mount for Aliyun Drive (阿里云盘), mainly intended for use with Emby or Jellyfin to watch content stored in Aliyun Drive. Features:
Currently read-only; writing is not supported.
Supports Linux and macOS; Windows is not supported yet.
The aliyundrive-webdav project already exposes Aliyun Drive content over WebDAV, but since neither Emby nor Jellyfin can access WebDAV resources directly, software such as rclone is needed to mount the WebDAV share as a local disk. This project instead mounts Aliyun Drive as a local disk directly through FUSE, removing the extra rclone hop.
Installation
On macOS, install macfuse first.
On Linux, install fuse first:
Debian-based systems such as Ubuntu: apt-get install -y fuse3
RedHat-based systems such as CentOS: yum install -y fuse3
Prebuilt binary packages can be downloaded from the GitHub Releases page, or installed from PyPI with pip:
pip install aliyundrive-fuse
If your system supports Snapcraft, such as Ubuntu or Debian, you can also install it with snap:
sudo snap install aliyundrive-fuse
OpenWrt routers
GitHub Releases contains precompiled ipk files, currently for aarch64/arm/x86_64/i686 and other architectures; download them and install with opkg. Taking the nanopi r4s as an example:
wget https://github.com/messense/aliyundrive-fuse/releases/download/v0.1.14/aliyundrive-fuse_0.1.14-1_aarch64_generic.ipk
wget https://github.com/messense/aliyundrive-fuse/releases/download/v0.1.14/luci-app-aliyundrive-fuse_0.1.14_all.ipk
wget https://github.com/messense/aliyundrive-fuse/releases/download/v0.1.14/luci-i18n-aliyundrive-fuse-zh-cn_0.1.14-1_all.ipk
opkg install aliyundrive-fuse_0.1.14-1_aarch64_generic.ipk
opkg install luci-app-aliyundrive-fuse_0.1.14_all.ipk
opkg install luci-i18n-aliyundrive-fuse-zh-cn_0.1.14-1_all.ipk
For routers with other CPU architectures, look for the matching main-program ipk file on the GitHub Releases page and install it.
Tips: if you are unsure of your CPU architecture, run opkg print-architecture to find out.
Command-line usage
USAGE: aliyundrive-fuse [OPTIONS] --refresh-token <REFRESH_TOKEN> <PATH>
ARGS: <PATH> Mountpoint
OPTIONS:
  --allow-other                               Allow other users to access the drive
  --domain-id <DOMAIN_ID>                     Aliyun PDS domain id
  -h, --help                                  Print help information
  -r, --refresh-token <REFRESH_TOKEN>         Aliyun drive refresh token [env: REFRESH_TOKEN=]
  -S, --read-buffer-size <READ_BUFFER_SIZE>   Read/download buffer size in bytes, defaults to 10MB [default: 10485760]
  -V, --version                               Print version information
  -w, --workdir <WORKDIR>                     Working directory, refresh_token will be stored in there if specified
For example, to mount the drive at /mnt/aliyundrive:
mkdir -p /mnt/aliyundrive /var/run/aliyundrive-fuse
aliyundrive-fuse -r your-refresh-token -w /var/run/aliyundrive-fuse /mnt/aliyundrive
Emby/Jellyfin
If Emby/Jellyfin runs directly on the system, simply pick the folder under the Aliyun Drive mount path when adding a media library in its console. If Emby/Jellyfin runs in Docker, the Aliyun Drive mount path also has to be mounted into the Docker container. Assuming the drive is mounted at /mnt/aliyundrive, taking Jellyfin as an example (and assuming the Jellyfin working directory is /root/jellyfin), mount the drive to the container's /media path:
docker run -d --name jellyfin \
  -v /root/jellyfin/config:/config \
  -v /root/jellyfin/cache:/cache \
  -v /mnt/aliyundrive:/media \
  -p 8096:8096 \
  --device=/dev/dri/renderD128 \
  --device /dev/dri/card0:/dev/dri/card0 \
  --restart unless-stopped \
  jellyfin/jellyfin
License
This work is released under the MIT license. A copy of the license is provided in the LICENSE file.
aliyundrive-webdav
aliyundrive-webdav
🚀 Help me to become a full-time open-source developer by sponsoring me on GitHub
A WebDAV server for Aliyun Drive (阿里云盘). The main use case is watching videos stored in the drive directly on a TV with WebDAV-capable client apps such as Infuse and nPlayer. Client apps fetch files directly from Aliyun Drive for playback instead of relaying them through the server running this application. Uploading files is supported, but due to limitations of the WebDAV protocol, instant (hash-based) upload is not.
Please note: V2 is implemented on top of the Aliyun Drive Open Platform API and no longer supports refresh tokens obtained from the Aliyun Drive web or app versions. Since the author of this project no longer uses Merlin firmware, V2 no longer supports the Koolshare Merlin firmware for free; please consider paid support if you need it.
If the project helps you, please consider donating to keep it maintained. You can also consider joining the aliyundrive-webdav 知识星球 (knowledge community) for consulting and technical support.
Note: the author has no upload needs, so the upload feature is not thoroughly tested and there is no plan to keep iterating on it.
Installation
Prebuilt binary packages can be downloaded from the GitHub Releases page, or installed from PyPI with pip:
pip install aliyundrive-webdav
If your system supports Snapcraft, such as Ubuntu or Debian, you can also install it with snap:
sudo snap install aliyundrive-webdav
OpenWrt routers
GitHub Releases contains precompiled ipk files, currently for aarch64/arm/mipsel/x86_64/i686 and other architectures; download them and install with opkg. Taking the nanopi r4s as an example:
wget https://github.com/messense/aliyundrive-webdav/releases/download/v2.3.3/aliyundrive-webdav_2.3.3-1_aarch64_generic.ipk
wget https://github.com/messense/aliyundrive-webdav/releases/download/v2.3.3/luci-app-aliyundrive-webdav_2.3.3_all.ipk
wget https://github.com/messense/aliyundrive-webdav/releases/download/v2.3.3/luci-i18n-aliyundrive-webdav-zh-cn_2.3.3-1_all.ipk
opkg install aliyundrive-webdav_2.3.3-1_aarch64_generic.ipk
opkg install luci-app-aliyundrive-webdav_2.3.3_all.ipk
opkg install luci-i18n-aliyundrive-webdav-zh-cn_2.3.3-1_all.ipk
For routers with other CPU architectures, look for the matching main-program ipk file on the GitHub Releases page. Common OpenWrt router CPU architectures are listed below (additions welcome):
| Router | CPU architecture |
| --- | --- |
| nanopi r4s | aarch64_generic |
| Xiaomi AX3600 | aarch64_cortex-a53 |
| Phicomm N1 box | aarch64_cortex-a53 |
| Newifi D2 | mipsel_24kc |
| Pogoplug | arm_mpcore |
Tips: if you are unsure of your CPU architecture, run opkg print-architecture to find out.
Running with Docker
docker run -d --name=aliyundrive-webdav --restart=unless-stopped -p 8080:8080 \
  -v /etc/aliyundrive-webdav/:/etc/aliyundrive-webdav/ \
  -e REFRESH_TOKEN='your refresh token' \
  -e WEBDAV_AUTH_USER=admin \
  -e WEBDAV_AUTH_PASSWORD=admin \
  messense/aliyundrive-webdav
Here the REFRESH_TOKEN environment variable is your Aliyun Drive refresh_token, and WEBDAV_AUTH_USER / WEBDAV_AUTH_PASSWORD are the username and password for connecting to the WebDAV service.
QNAP NAS
QNAP plugin: [email protected] (Docker)
Log in to the NAS as administrator, install ContainerStation and start the service, then in the Management tab choose Create Application with the following configuration:
version: '3.3'
services:
  aliyundrive-webdav:
    container_name: aliyundrive-webdav
    restart: unless-stopped
    ports:
      - '8080:8080'
    environment:
      - 'REFRESH_TOKEN=mytoken...'
    image: messense/aliyundrive-webdav
REFRESH_TOKEN is explained at the end of this document; :8080 is the mapped access port for the drive and can be changed as needed.
Click Create and start it, then visit http://<nas address>:8080/ to see the automatically generated index page of your drive files.
Reference docs:
https://docs.docker.com/compose/
https://www.composerize.com/
rclone
To avoid re-uploading files, it is recommended to use rclone's Nextcloud WebDAV mode, which supports sha1 checksums.
In addition, use the --no-update-modtime flag, otherwise rclone will still force re-uploads just to update file modification times. For example:
rclone --no-update-modtime copy abc.pdf aliyundrive-nc://docs/
Getting the refresh token
Obtain a refresh token with the online tool, or run aliyundrive-webdav qr login on the command line; after scanning the QR code and authorizing, the refresh token is printed.
Command-line usage
$ aliyundrive-webdav --help
WebDAV server for AliyunDrive
Usage: aliyundrive-webdav [OPTIONS]
       aliyundrive-webdav <COMMAND>
Commands:
  qr    Scan QR Code
  help  Print this message or the help of the given subcommand(s)
Options:
  --host <HOST>                               Listen host [env: HOST=] [default: 0.0.0.0]
  -p, --port <PORT>                           Listen port [env: PORT=] [default: 8080]
  --client-id <CLIENT_ID>                     Aliyun drive client_id [env: CLIENT_ID=]
  --client-secret <CLIENT_SECRET>             Aliyun drive client_secret [env: CLIENT_SECRET=]
  --drive-type <DRIVE_TYPE>                   Aliyun drive type [env: DRIVE_TYPE=]
      Possible values:
      - resource: Resource drive
      - backup: Backup drive
      - default: Default drive
  -r, --refresh-token <REFRESH_TOKEN>         Aliyun drive refresh token [env: REFRESH_TOKEN=]
  -U, --auth-user <AUTH_USER>                 WebDAV authentication username [env: WEBDAV_AUTH_USER=]
  -W, --auth-password <AUTH_PASSWORD>         WebDAV authentication password [env: WEBDAV_AUTH_PASSWORD=]
  -I, --auto-index                            Automatically generate index.html
  -S, --read-buffer-size <READ_BUFFER_SIZE>   Read/download buffer size in bytes, defaults to 10MB [default: 10485760]
  --upload-buffer-size <UPLOAD_BUFFER_SIZE>   Upload buffer size in bytes, defaults to 16MB [default: 16777216]
  --cache-size <CACHE_SIZE>                   Directory entries cache size [default: 1000]
  --cache-ttl <CACHE_TTL>                     Directory entries cache expiration time in seconds [default: 600]
  --root <ROOT>                               Root directory path [default: /]
  -w, --workdir <WORKDIR>                     Working directory, refresh_token will be stored in there if specified
  --no-trash                                  Delete file permanently instead of trashing it
  --read-only                                 Enable read only mode
  --tls-cert <TLS_CERT>                       TLS certificate file path [env: TLS_CERT=]
  --tls-key <TLS_KEY>                         TLS private key file path [env: TLS_KEY=]
  --strip-prefix <STRIP_PREFIX>               Prefix to be stripped off when handling request [env: WEBDAV_STRIP_PREFIX=]
  --debug                                     Enable debug log
  --no-self-upgrade                           Disable self auto upgrade
  --skip-upload-same-size                     Skip uploading same size file
  --prefer-http-download                      Prefer downloading using HTTP protocol
  --redirect                                  Enable 302 redirect when possible
  -h, --help                                  Print help (see a summary with '-h')
  -V, --version                               Print version
Note: TLS/HTTPS is not yet supported on the MIPS architecture.
Note: enabling the --skip-upload-same-size option speeds up uploads, but may cause modified files of the same size not to be uploaded.
License
This work is released under the MIT license. A copy of the license is provided in the LICENSE file.
aliyun-ecs
aliyun-ecs
Copyright (c) 2017 teachmyself
This is a set of tools for Aliyun ECS.
aliyun-exporter
Prometheus Exporter for Alibaba Cloud中文FeaturesScreenshotsQuick StartInstallationUsageDocker ImageConfigurationMetrics MetaScale and HA SetupContributeThis Prometheus exporter collects metrics from theCloudMonitor APIof Alibaba Cloud. It can help you:integrate CloudMonitor to your Monitoring System.leverage the power of PromQL, Alertmanager and Grafana(seeScreenshots).analyze metrics however you want.save money. Api invocation is far cheaper than other services provided by CloudMonitor.This project also provides an out-of-box solution for full-stack monitoring of Alibaba Cloud, including dashboards, alerting and diagnosing.Screenshotsmore screenshots hereQuick StartA docker-compose stack is provided to launch the entire monitoring stack with Aliyun-Exporter, Prometheus, Grafana and Alertmanager.Pre-requisites: docker [email protected]:aylei/aliyun-exporter.gitcddocker-composeALIYUN_ACCESS_ID=YOUR_ACCESS_IDALIYUN_ACCESS_SECRET=YOUR_ACCESS_KEYdocker-composeupInvestigate dashboards inlocalhost:3000(the default credential for Grafana is admin:admin).For more details, seeDocker Compose.InstallationPython 3.5+ is required.pip3installaliyun-exporterUsageConfig your credential and interested metrics:credential:access_key_id:<YOUR_ACCESS_KEY_ID>access_key_secret:<YOUR_ACCESS_KEY_SECRET>region_id:<REGION_ID>metrics:acs_cdn:-name:QPSacs_mongodb:-name:CPUUtilizationperiod:300Run the exporter:>aliyun-exporter-p9525-caliyun-exporter.ymlThe default port is 9525, default config file location is./aliyun-exporter.yml.Visit metrics inlocalhost:9525/metricsDocker ImageInstalldockerpullaylei/aliyun-exporter:0.3.0To run the container, external configuration file is required:dockerrun-p9525:9525-v$(pwd)/aliyun-exporter.yml:$(pwd)/aliyun-exporter.ymlaylei/aliyun-exporter:0.3.0-c$(pwd)/aliyun-exporter.ymlConfigurationrate_limit:5# request rate limit per second. default: 10credential:access_key_id:<YOUR_ACCESS_KEY_ID># requiredaccess_key_secret:<YOUR_ACCESS_KEY_SECRET># requiredregion_id:<REGION_ID># default: 'cn-hangzhou'metrics:# required, metrics specificationsacs_cdn:# required, Project Name of CloudMonitor-name:QPS# required, Metric Name of CloudMonitor, belongs to a certain Projectrename:qps# rename the related prometheus metric. default: same as the 'name'period:60# query period. default: 60measure:Average# measure field in the response. default: Averageinfo_metrics:-ecs-rds-redisNotes:Find your target metrics usingMetrics MetaCloudMonitor API has an rate limit, tuning therate_limitconfiguration if the requests are rejected.CloudMonitor API also has an monthly quota for invocations (AFAIK, 5,000,000 invocations / month for free). Plan your usage in advance.Given that you have 50 metrics to scrape with 60s scrape interval, about 2,160,000 requests will be sent by the exporter for 30 days.Special ProjectSome metrics are not included in the Cloud Monitor API. For these metrics, we keep the configuration abstraction consistent by defining special projects.Special Projects:rds_performance: RDS performance metrics, available metric names:Performance parameter tableAn example configuration file of special project is provided asspecial-projects.ymlNote: special projects invokes different API with ordinary metrics, so it will not consume your Cloud Monitor API invocation quota. But the API of special projects could be slow, so it is recommended to separate special projects into a standalone exporter instance.Metrics Metaaliyun-exportershipped with a simple site hosting the metrics meta from the CloudMonitor API. 
You can visit the metric meta inlocalhost:9525after launching the exporter.host:portwill host all the available monitor projectshost:port/projects/{project}will host the metrics meta of a certain projecthost:port/yaml/{project}will host a config YAML of the project's metricsyou can easily navigate in this pages by hyperlink.Docker ComposeFrom0.3.0, we provide a docker-compose stack to help users building monitoring stack from scratch. The stack contains:aliyun-exporter (this project): Retrieving metrics (and instance information) from Alibaba Cloud.Prometheus: Metric storage and alerting calculation.Alertmanager: Alert routing and notifying.Grafana: Dashboards.prometheus-webhook-dingtalk: DingTalk (a.k.a. DingDing) notification integrating.Here's a detailed launch guide:# config prometheus external hostexportPROMETHEUS_HOST=YOUR_PUBLIC_IP_OR_HOSTNAME# config dingtalk robot tokenexportDINGTALK_TOKEN=YOUR_DINGTALK_ROBOT_TOEKN# config aliyun-exporter credentialexportALIYUN_REGION=YOUR_REGIONexportALIYUN_ACCESS_ID=YOUR_IDexportALIYUN_ACCESS_SECRET=YOUR_SECRET docker-composeup-dAfter launching, you can access:grafana:http://localhost:3000prometheus:http://localhost:9090alertmanager:http://localhost:9093You may customize the configuration of this components by editing the configuration files in./docker-compose/{component}TelemetryRequest success summary and failure summary are exposed incloudmonitor_request_latency_secondsandcloudmonitor_failed_request_latency_seconds.EachProject-Metricpair will have a corresponding metric namedaliyun_{project}_{metric}_up, which indicates whether this metric are successfully scraped.Scale and HA SetupThe CloudMonitor API could be slow if you have large amount of resources. You can separate metrics over multiple exporter instances to scale.For HA setup, simply duplicate your deployments: 2 * prometheus, and 2 * exporter for each prometheus.HA Setup will double your requests, which may run out your quota.ContributeFeel free to open issues and pull requests, any feedback will be highly appreciated!Please check thehelp wantedlabel to find issues that are good for getting started.Besides, contributing to newalert rules, newdashboardsis also welcomed!
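Since the exporter serves metrics on port 9525 at /metrics by default (see above), a minimal Prometheus scrape configuration for it might look like the following; the job name and target host are placeholders.

```yaml
scrape_configs:
  - job_name: aliyun-exporter          # arbitrary job name
    scrape_interval: 60s               # align with the configured CloudMonitor query period
    static_configs:
      - targets: ['localhost:9525']    # host:port where aliyun-exporter is listening
```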
aliyun-exporter-czb
Prometheus Exporter for Alibaba CloudNote:This repository forked fromhttps://github.com/aylei/aliyun-exporterTo meet newlink's needs中文FeaturesScreenshotsQuick StartInstallationUsageDocker ImageConfigurationMetrics MetaScale and HA SetupContributefeaturesThis Prometheus exporter collects metrics from theCloudMonitor APIof Alibaba Cloud. It can help you:integrate CloudMonitor to your Monitoring System.leverage the power of PromQL, Alertmanager and Grafana(seeScreenshots).analyze metrics however you want.save money. Api invocation is far cheaper than other services provided by CloudMonitor.add oss/polardb/dts/mq/elasticsearch supportThis project also provides an out-of-box solution for full-stack monitoring of Alibaba Cloud, including dashboards, alerting and diagnosing.Screenshotsmore screenshots hereQuick StartA docker-compose stack is provided to launch the entire monitoring stack with Aliyun-Exporter, Prometheus, Grafana and Alertmanager.Pre-requisites: docker [email protected]:aylei/aliyun-exporter.gitcddocker-composeALIYUN_ACCESS_ID=YOUR_ACCESS_IDALIYUN_ACCESS_SECRET=YOUR_ACCESS_KEYdocker-composeupInvestigate dashboards inlocalhost:3000(the default credential for Grafana is admin:admin).For more details, seeDocker Compose.InstallationPython 3.7+ is required.pip3installaliyun-exporter-czbUsageConfig your credential and interested metrics:credential:access_key_id:<YOUR_ACCESS_KEY_ID>access_key_secret:<YOUR_ACCESS_KEY_SECRET>region_id:<REGION_ID>metrics:acs_cdn:-name:QPSacs_mongodb:-name:CPUUtilizationperiod:300To display multiple regionids in one metric, you can configure the following optionsdo_info_region:-"cn-zhangjiakou"-"cn-beijing"Run the exporter:>aliyun-exporter-p9525-caliyun-exporter.ymlThe default port is 9525, default config file location is./aliyun-exporter.yml.Visit metrics inlocalhost:9525/metricsDocker ImageInstalldockerpullaylei/aliyun-exporter:0.3.1To run the container, external configuration file is required:dockerrun-p9525:9525-v$(pwd)/aliyun-exporter.yml:$(pwd)/aliyun-exporter.ymlaylei/aliyun-exporter:0.3.1-c$(pwd)/aliyun-exporter.ymlConfigurationrate_limit:5# request rate limit per second. default: 10credential:access_key_id:<YOUR_ACCESS_KEY_ID># requiredaccess_key_secret:<YOUR_ACCESS_KEY_SECRET># requiredregion_id:<REGION_ID># default: 'cn-hangzhou'metrics:# required, metrics specificationsacs_cdn:# required, Project Name of CloudMonitor-name:QPS# required, Metric Name of CloudMonitor, belongs to a certain Projectrename:qps# rename the related prometheus metric. default: same as the 'name'period:60# query period. default: 60measure:Average# measure field in the response. default: Averageinfo_metrics:-ecs-rds-redisNotes:Find your target metrics usingMetrics MetaCloudMonitor API has an rate limit, tuning therate_limitconfiguration if the requests are rejected.CloudMonitor API also has an monthly quota for invocations (AFAIK, 5,000,000 invocations / month for free). Plan your usage in advance.Given that you have 50 metrics to scrape with 60s scrape interval, about 2,160,000 requests will be sent by the exporter for 30 days.Special ProjectSome metrics are not included in the Cloud Monitor API. 
For these metrics, we keep the configuration abstraction consistent by defining special projects.Special Projects:rds_performance: RDS performance metrics, available metric names:Performance parameter tableAn example configuration file of special project is provided asspecial-projects.ymlNote: special projects invokes different API with ordinary metrics, so it will not consume your Cloud Monitor API invocation quota. But the API of special projects could be slow, so it is recommended to separate special projects into a standalone exporter instance.Metrics Metaaliyun-exportershipped with a simple site hosting the metrics meta from the CloudMonitor API. You can visit the metric meta inlocalhost:9525after launching the exporter.host:portwill host all the available monitor projectshost:port/projects/{project}will host the metrics meta of a certain projecthost:port/yaml/{project}will host a config YAML of the project's metricsyou can easily navigate in this pages by hyperlink.Docker ComposeFrom0.3.1, we provide a docker-compose stack to help users building monitoring stack from scratch. The stack contains:aliyun-exporter (this project): Retrieving metrics (and instance information) from Alibaba Cloud.Prometheus: Metric storage and alerting calculation.Alertmanager: Alert routing and notifying.Grafana: Dashboards.prometheus-webhook-dingtalk: DingTalk (a.k.a. DingDing) notification integrating.Here's a detailed launch guide:# config prometheus external hostexportPROMETHEUS_HOST=YOUR_PUBLIC_IP_OR_HOSTNAME# config dingtalk robot tokenexportDINGTALK_TOKEN=YOUR_DINGTALK_ROBOT_TOEKN# config aliyun-exporter credentialexportALIYUN_REGION=YOUR_REGIONexportALIYUN_ACCESS_ID=YOUR_IDexportALIYUN_ACCESS_SECRET=YOUR_SECRET docker-composeup-dAfter launching, you can access:grafana:http://localhost:3000prometheus:http://localhost:9090alertmanager:http://localhost:9093You may customize the configuration of this components by editing the configuration files in./docker-compose/{component}TelemetryRequest success summary and failure summary are exposed incloudmonitor_request_latency_secondsandcloudmonitor_failed_request_latency_seconds.EachProject-Metricpair will have a corresponding metric namedaliyun_{project}_{metric}_up, which indicates whether this metric are successfully scraped.Scale and HA SetupThe CloudMonitor API could be slow if you have large amount of resources. You can separate metrics over multiple exporter instances to scale.For HA setup, simply duplicate your deployments: 2 * prometheus, and 2 * exporter for each prometheus.HA Setup will double your requests, which may run out your quota.ContributeFeel free to open issues and pull requests, any feedback will be highly appreciated!Please check thehelp wantedlabel to find issues that are good for getting started.Besides, contributing to newalert rules, newdashboardsis also welcomed!
aliyun-fc
Aliyun FunctionCompute Python SDK=================================.. image:: https://badge.fury.io/py/aliyun-fc.svg:target: https://badge.fury.io/py/aliyun-fc.. image:: https://travis-ci.org/aliyun/fc-python-sdk.svg?branch=master:target: https://travis-ci.org/aliyun/fc-python-sdk.. image:: https://coveralls.io/repos/github/aliyun/fc-python-sdk/badge.svg?branch=master:target: https://coveralls.io/github/aliyun/fc-python-sdk?branch=masterOverview--------The SDK of this version is dependent on the third-party HTTP library `requests <https://github.com/kennethreitz/requests>`_.Running environment-------------------Python 2.7, Python 3.6Installation----------Install the official release version through PIP (taking Linux as an example):.. code-block:: bash$ pip install aliyun-fcYou can also install the unzipped installer package directly:.. code-block:: bash$ sudo python setup.py installGetting started---------------.. code-block:: python# -*- coding: utf-8 -*-import fc# To know the endpoint and access key id/secret info, please refer to:# https://help.aliyun.com/document_detail/52984.htmlclient = fc.Client(endpoint='<Your Endpoint>',accessKeyID='<Your AccessKeyID>',accessKeySecret='<Your AccessKeySecret>')# Create service.client.create_service('service_name')# Create function.# the current directory has a main.zip file (main.py which has a function of myhandler)client.create_function('service_name', 'function_name', 'main.my_handler', codeZipFile = 'main.zip')# Invoke function synchronously.client.invoke_function('service_name', 'function_name')# Invoke a function with a input parameter.client.invoke_function('service_name', 'function_name', payload=bytes('hello_world'))# Read a image and invoke a function with the file data as input parameter.src = open('src_image_file_path', 'rb') # Note: please open it as binary.r = client.invoke_function('service_name', 'function_name', payload=src)# save the result as the output image.dst = open('dst_image_file_path', 'wb')dst.write(r)src.close()dst.close()# Invoke function asynchronously.client.async_invoke_function('service_name', 'function_name')# List services.client.list_services()# List functions with prefix and limit.client.list_functions('service_name', prefix='the_prefix', limit=10)# Delete service.client.delete_service('service_name')# Delete function.client.delete_function('service_name', 'function_name')Testing-------To run the tests, please set the access key id/secret, endpoint as environment variables.Take the Linux system for example:.. code-block:: bash$ export ENDPOINT=<endpoint>$ export ACCESS_KEY_ID=<AccessKeyId>$ export ACCESS_KEY_SECRET=<AccessKeySecret>$ export STS_TOKEN=<roleARN>Run the test in the following method:.. code-block:: bash$ nosetests # First install noseMore resources--------------- `Aliyun FunctionCompute docs <https://help.aliyun.com/product/50980.html>`_Contacting us-------------- `Links <https://help.aliyun.com/document_detail/53087.html>`_License-------- `MIT <https://github.com/aliyun/fc-python-sdk/blob/master/LICENSE>`_
aliyun-fc2
OverviewThe SDK of this version is dependent on the third-party HTTP libraryrequests.Running environmentPython 2.7, Python 3.6Noticefc and fc2 are not compatible, now master repo is fc2, if you still use fc, 1.x branch is what you need. We suggest using fc2, The main difference between fc and fc2 is:1, all http request fuction can set headersdefinvoke_function(self,serviceName,functionName,payload=None,headers={'x-fc-invocation-type':'Sync','x-fc-log-type':'None'}):...Attention: abandon async_invoke_function, there is only one function interface invoke_function, distinguish between synchronous and asynchronous by x-fc-invocation-type parameters(Sync or Async).# sync invokeclient.invoke_function('service_name','function_name')# async invokeclient.invoke_function('service_name','function_name',headers={'x-fc-invocation-type':'Async'})2, The all http response returned by the user is the following objectclassFcHttpResponse(object):def__init__(self,headers,data):self._headers=headersself._data=data@propertydefheaders(self):returnself._headers@propertydefdata(self):returnself._dataNote: for invoke function, data is bytes, for other apis, data is dictInstallationInstall the official release version through PIP (taking Linux as an example):$pipinstallaliyun-fc2You can also install the unzipped installer package directly:$sudopythonsetup.pyinstallif you still use fc, you can install the official fc1 release version through PIP (taking Linux as an example):$pipinstallaliyun-fcGetting started# -*- coding: utf-8 -*-importfc2# To know the endpoint and access key id/secret info, please refer to:# https://help.aliyun.com/document_detail/52984.htmlclient=fc2.Client(endpoint='<Your Endpoint>',accessKeyID='<Your AccessKeyID>',accessKeySecret='<Your AccessKeySecret>')# Create service.client.create_service('service_name')# set vpc config when creating the servicevpcConfig={'vpcId':'<Your Vpc Id>','vSwitchIds':'<[Your VSwitch Ids]>','securityGroupId':'<Your Security Group Id>'}# create vpcConfig when creating the service# you have to set the role if you want to set vpc configvpc_role='acs:ram::12345678:role/aliyunvpcrole'# set nas config when creating the servicenasConfig={"userId":'<The NAS file system user id>',"groupId":'<The NAS file system group id>',"mountPoints":[{"serverAddrserverAddr":'<The NAS file system mount target>',"mountDir":'<The mount dir to the local file system>',}],}service=client.create_service(name,role=vpc_role,vpcConfig=vpcConfig,nasConfig=nasConfig)# Create function.# the current directory has a main.zip file (main.py which has a function of myhandler)# set environment variables {'testKey': 'testValue'}client.create_function('service_name','function_name','python3','main.my_handler',codeZipFile='main.zip',environmentVariables={'testKey':'testValue'})# Create function with initailizer# main.my_initializer is the entry point of initializer interfaceclient.create_function('service_name','function_name','python3','main.my_handler',"main.my_initializer",codeZipFile='main.zip',environmentVariables={'testKey':'testValue'})# Invoke function synchronously.client.invoke_function('service_name','function_name')# Create trigger# Create oss triggeross_trigger_config={'events':['oss:ObjectCreated:*'],'filter':{'key':{'prefix':'prefix','suffix':'suffix'}}}source_arn='acs:oss:cn-shanghai:12345678:bucketName'invocation_role='acs:ram::12345678:role/aliyunosseventnotificationrole'client.create_trigger('service_name','function_name','trigger_name','oss',oss_trigger_config,source_arn,invocation_role)# 
Create log triggerlog_trigger_config={'sourceConfig':{'logstore':'log_store_source'},'jobConfig':{'triggerInterval':60,'maxRetryTime':10},'functionParameter':{},'logConfig':{'project':'log_project','logstore':'log_store'},'enable':False}source_arn='acs:log:cn-shanghai:12345678:project/log_project'invocation_role='acs:ram::12345678:role/aliyunlogetlrole'client.create_trigger('service_name','function_name','trigger_name','oss',log_trigger_config,source_arn,invocation_role)# Create time triggertime_trigger_config={'payload':'awesome-fc''cronExpression':'0 5 * * * *''enable':true}client.create_trigger('service_name','function_name','trigger_name','timer',time_trigger_config,'','')# Invoke a function with a input parameter.client.invoke_function('service_name','function_name',payload=bytes('hello_world'))# Read a image and invoke a function with the file data as input parameter.src=open('src_image_file_path','rb')# Note: please open it as binary.r=client.invoke_function('service_name','function_name',payload=src)# save the result as the output image.dst=open('dst_image_file_path','wb')dst.write(r.data)src.close()dst.close()# Invoke function asynchronously.client.invoke_function('service_name','function_name',headers={'x-fc-invocation-type':'Async'})# List services.client.list_services()# List functions with prefix and limit.client.list_functions('service_name',prefix='the_prefix',limit=10)# Delete service.client.delete_service('service_name')# Delete function.client.delete_function('service_name','function_name')TestingTo run the tests, please set the access key id/secret, endpoint as environment variables. Take the Linux system for example:$exportENDPOINT=<endpoint>$exportACCESS_KEY_ID=<AccessKeyId>$exportACCESS_KEY_SECRET=<AccessKeySecret>$exportSTS_TOKEN=<roleARN>Run the test in the following method:$nosetests# First install noseMore resourcesAliyun FunctionCompute docsContacting usLinksLicenseMIT
aliyun-img-utils
Overviewaliyun-img-utilsprovides a command line utility and API for publishing images in the Aliyun Cloud. This includes helper functions for uploading image blobs, creating compute images and replicating/publishing/deprecating/ images across all available regions.See theAlibaba docsto get more info on the Aliyun cloud.Requirementsoss2ClickPyYAMLaliyun-python-sdk-corealiyun-python-sdk-ecsInstallationTo install the package on openSUSE and SLES use the following commands as root:$zypperarhttp://download.opensuse.org/repositories/Cloud:/Tools/<distribution> $zypperrefresh $zypperinpython3-aliyun-img-utilsTo install from PyPI:$pipinstallaliyun-img-utilsConfigurationaliyun-img-utilscan be configured with yaml based profiles. The configuration directory is~/.config/aliyun_img_utilsand the default profile is default.yaml (~/.config/aliyun_img_utils/default.yaml).The following configration options are available in a configuration profile:no_colorlog_levelregionaccess_keyaccess_secretbucket_nameAn example configuration profile may look like:region:cn-beijingaccess_key:FakeKEYaccess_secret:FAKESecretbucket_name:smarlow-testingWhen running any command the profile can be chosen via the--profileoption. For example,aliyun-img-utils image upload --profile productionwould pull configuration from ~/.config/aliyun_img_utils/production.yaml.CLIThe CLI is broken into multiple distinct subcommands that handle different steps of creating and publishing images in the Aliyun cloud framework.Image blob uploadThe first step is to upload a qcow2 image to a storage bucket. For thisaliyun-img-utils image uploadis available.Example:$aliyun-img-utilsimageupload--image-file~/Documents/test.qcow2In this example the qcow2 file will be uploaded to the storage bucket configured for the given profile and the blob will be named test.qcow2. If you want to override the name of the blob there is a--blob-nameoption.For more information about the image upload function see the help message:$aliyun-img-utilsimageupload--helpCompute image createThe next step is to create a compute image from the qcow2 blob. For thisaliyun-img-utils image createis available.Example:$aliyun-img-utilsimagecreate--image-nameSLES15-SP2-BYOS--image-description"Test image"--platformSUSE--blob-nameSLES15-SP2-BYOS.qcow2In this example the qcow2 blob will be used to create the compute image. for the given profile and the blob will be named test.qcow2. If you want to override the default (20GB) root disk size there is a--disk-sizeoption.For more information about the image create function see the help message:$aliyun-img-utilsimagecreate--helpReplicate (copy) imageOnce an image is created in a single region it can be replicated or copied to any other region:aliyun-img-utils image replicate.Example:$aliyun-img-utilsimagereplicate--image-nametest-image-v20210303--regionscn-shanghaiIn this example the image will be replicated to the cn-shanghai region. If no regions are provided the image will be replicated to all available regions.For more information about the image replicate function see the help message:$aliyun-img-utilsimagereplicate--helpPublish imageThe image can then be published or shared to other accounts withaliyun-img-utils image publish.Example:$aliyun-img-utilsimagepublish--image-nametest-image-v20210303--launch-permissionEXAMPLEIn this example the launch permission for the image will be set toEXAMPLE. 
If no regions are provided the image will be published in all available regions.For more information about the image publish function see the help message:$aliyun-img-utilsimagepublish--helpDeprecate imageAn image can be set to the deprecated state withaliyun-img-utils image deprecate.Example:$aliyun-img-utilsimagedeprecate--image-nametest-image-v20210303As with the other commands, if no regions are provided the image will be deprecated in all available regions.For more information about the image deprecate function see the help message:$aliyun-img-utilsimagedeprecate--helpActivate imageAn image can be set back to the active state withaliyun-img-utils image activate.Example:$aliyun-img-utilsimageactivate--image-nametest-image-v20210303As with the other commands, if no regions are provided the image will be activated in all available regions.For more information about the image activate function see the help message:$aliyun-img-utilsimageactivate--helpGet image infoInfo about a specific compute image can be retrieved withaliyun-img-utils image info.Example:$aliyun-img-utilsimageinfo--image-nametest-image-v20210303The image can be searched by the--image-nameor--image-id. By default only active images will be searched. To filter deprecated images there is a--deprecatedoption.For more information about the image info function see the help message:$aliyun-img-utilsimageinfo--helpDelete imageA compute image can be deleted withaliyun-img-utils image delete.Example:$aliyun-img-utilsimagedelete--image-nametest-image-v20210303As with the other commands, if no regions are provided the image will be deleted in all available regions.For more information about the image delete function see the help message:$aliyun-img-utilsimagedelete--helpAPIThe AliyunImage class can be instantiated and used as an API from code. This provides all the same functions as the CLI with a few additional helpers. For example there are waiter functions which will wait for a compute image to be created and/or deleted.To create an instance of AliyunImage you need anaccess_key,access_secret,regionandbucket_name. 
optionally you can pass in a Python log object and/or alog_level.aliyun_image=AliyunImage(access_key,access_secret,region,bucket_name,log_level=log_level,log_callback=logger)Code examplesWith an instance of AliyunImage you can perform any of the image functions which are available through the CLI.aliyun_image=AliyunImage('accessKEY','superSECRET','cn-beijing','images)# Upload image blobblob_name=aliyun_image.upload_image_tarball('/path/to/image.qcow2')# Create compute imageimage_id=aliyun_image.create_compute_image('test-image-v20220202','A great image to use.','test_image.qcow2','SUSE')# Delete compute image# Deletes the image from the current regiondeleted=aliyun_image.delete_compute_image('test-image-v20220202')# Delete compute image in all available regionsaliyun_image.delete_compute_image_in_regions('test-image-v20220202')# Delete storage blob from current bucketdeleted=aliyun_image.delete_storage_blob('test_image.qcow2')# Copy image to a single regionimage_id=aliyun_image.copy_compute_image('test-image-v20220202','cn-shanghai')# Replicate (copy) image to all available regions# A dictionary mapping region names to image ids is returned.images=aliyun_image.replicate_image('test-image-v20220202')# Publish image in current regionaliyun_image.publish_image('test-image-v20220202','EXAMPLE_PERMISSION')# Publish image in all available regionsaliyun_image.publish_image_to_regions('test-image-v20220202','EXAMPLE_PERMISSION')# Deprecate image in current regionaliyun_image.deprecate_image('test-image-v20220202')# Deprecate image in all available regionsaliyun_image.deprecate_image_in_regions('test-image-v20220202')# Activate image in current regionaliyun_image.activate_image('test-image-v20220202')# Activate image in all available regionsaliyun_image.activate_image_in_regions('test-image-v20220202')# Wait for image to become available based on image idaliyun_image.wait_on_compute_image('i-123456789')# Wait for image to be deleted based on image idaliyun_image.wait_on_compute_image_delete('i-123456789')# Return True if the image exists based on image nameexists=aliyun_image.image_exists('test-image-v20220202')#exists=aliyun_image.image_tarball_exists('test_image.qcow2')# Get image info as a dictionaryimage_info=aliyun_image.get_compute_image(image_name='test_image.qcow2')# Get a list of available regionsregions=aliyun_image.get_regions()The currentregionorbucket_namecan be changed at any time.When thebucket_nameis changed the currentbucket_clientsession is closed. The session will reconnect in a lazy fashion on the next storage operation.aliyun_image=AliyunImage('accessKEY','superSECRET','cn-beijing','images)exists=aliyun_image.image_tarball_exists('test_image.qcow2')# Resets the storage bucket clientaliyun_image.bucket_name='old-images'# Storage bucket client connects to the new bucket lazilyexists=aliyun_image.image_tarball_exists('test_image.qcow2')Similarly when theregionis changed both thebucket_clientand thecompute_clientsessions are closed.aliyun_image=AliyunImage('accessKEY','superSECRET','cn-beijing','images)image_info=aliyun_image.get_compute_image(image_name='test_image.qcow2')# Resets the compute clientaliyun_image.region='cn-shanghai'# Compute client connects to the new region lazilyimage_info=aliyun_image.get_compute_image(image_name='test_image.qcow2')Issues/EnhancementsPlease submit issues and requests toGithub.ContributingContributions toaliyun-img-utilsare welcome and encouraged. 
SeeCONTRIBUTINGfor info on getting started.LicenseCopyright (c) 2021 SUSE LLC.Distributed under the terms of GPL-3.0+ license, seeLICENSEfor details.
aliyun-iot-linkkit
#Python SDK for Aliyun IoT deviceThis is a python sdk for Aliyun IoT device.MQTT connection:MQTTV31,MQTTV311Aliyun IoT Thing ModelRelease Notes1.2.1[ADD] support to configure the custom endpoint for MQTT config_mqtt[ADD] support to configure the custom endpoint for HTTP2 config_http2[UPDATE] support to receive any unsubscribed message by on_topic_message1.2.2[UPDATE] fix the onConnect callback invoked at wrong time issue1.2.3[UPDATE] fix tsl parsing error for simplified tsl products1.2.4[UPDATE] add gateway feature1.2.5[UPDATE] add mqtt dynamic-register without pre-registration feature1.2.6[UPDATE] set Endpoint parameter with default value1.2.7[UPDATE] add ota feature1.2.8[UPDATE] add mqtt dynamic-register with pre-registration feature1.2.9[UPDATE] add api to enable users to re-connect to iot platform manually in case of exceptions1.2.10[UPDATE] fix handshake error in python 3.10/3.11[UPDATE] fix Timeout exception in lossy network[UPDATE] make the API force_reconnect async1.2.11[UPDATE] use new root certifications. The S1 certificate would not expire untill 2053.1.2.12[UPDATE] return error codes when pub/sub messages, do not throw exceptions any more; set keepalive range to be [30,1800]1.2.13[UPDATE] add option to turn on/off host_name check during tls handshake
aliyun-lite-log-python-sdk
Python SDK for Aliyun Lite Loghttp://aliyun-lite-log-python-sdk.readthedocs.io
aliyun-log-cli
Command Line Interface for Aliyun Log Servicehttp://aliyun-log-cli.readthedocs.io
aliyun-log-python-sdk
Python SDK for Alicloud Log Servicehttp://aliyun-log-python-sdk.readthedocs.io
aliyun-log-python-sdk-test
Python SDK for Alicloud Log Servicehttp://aliyun-log-python-sdk.readthedocs.io
aliyun-mns
Mns provides interfaces to Aliyun Message Service. mnscmd lets you do these actions: create/get/list/set/delete queue, send/receive/peek/change/delete message from/to queue, create/get/list/set/delete topic, publish message to topic, subscribe/get/list/set/unsubscribe subscription.
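As a rough sketch of the queue operations listed above, based on commonly published MNS Python SDK samples (module paths, method names and the endpoint format here are assumptions to verify against this package's own docs):

```python
from mns.account import Account      # assumed module layout of the MNS Python SDK
from mns.queue import Message

# endpoint and credentials are placeholders
account = Account("http://<account-id>.mns.cn-hangzhou.aliyuncs.com",
                  "<access-key-id>", "<access-key-secret>")
queue = account.get_queue("demo-queue")          # the queue is assumed to exist already

queue.send_message(Message("hello mns"))         # send a message to the queue
msg = queue.receive_message(wait_seconds=3)      # long-poll for the next message
print(msg.message_body)
queue.delete_message(msg.receipt_handle)         # acknowledge by deleting it
```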
aliyun-mns-py3
aliyun mns for python3
aliyun-mns-sdk
Mns provides interfaces to Aliyun Message Service.mnscmd lets you do these actions: create/get/list/set/delete queue, send/receive/peek/change/delete message from/to queue, create/get/list/set/delete topic, publish message to topic, subscribe/get/list/set/unsubscribe subscription.
aliyun-mns-sdk-changed
Mns provides interfaces to Aliyun Message Service.mnscmd lets you do these actions: create/get/list/set/delete queue, send/receive/peek/change/delete message from/to queue, create/get/list/set/delete topic, publish message to topic, subscribe/get/list/set/unsubscribe subscription.
aliyun-nls
alibaba-nls-python-sdkThis is Python SDK for NLS. It supports SPEECH-RECOGNIZER/SPEECH-SYNTHESIZER/SPEECH-TRANSLATOR/COMMON-REQUESTS-PROTO.This module works on Python versions:3.6 and greaterinstall requirements:python -m pip install -r requirements.txtinstall package:python -m pip install .
aliyun-openapi-python-sdk-managed-credentials-provider
No description available on PyPI.
aliyunoss
Python SDK for Aliyun OSS(Open Storage Service). Hope we will get an official packaging fron Aliyun Lazy Team soon.
aliyun-oss
osscmd lets you create/delete/list bucket and upload/download/copy/delete file from/toAliyun OSS (Open Storage Service).
aliyunoss2-autoupload
Source: https://github.com/tanbro/aliyunoss2-autoupload
Package: https://pypi.org/project/aliyunoss2-autoupload/

aliyunoss2-autoupload monitors files by wildcard patterns, uploads them to Aliyun OSS, then moves them to a backup directory.

Usage

Command Line

After the package is installed, run the command in a terminal to show the help messages:

$ aliyunoss2-autoupload --help
usage: aliyunoss2-autoupload [-h] [--version] {run,echo_config_sample} ...

Watch files in a directory and upload them to Aliyun OSS on file writing completed

optional arguments:
  -h, --help            show this help message and exit
  --version             show program's version number and exit

subcommands:
  {run,echo_config_sample}
                        <sub_command --help> Print the help of sub_commands
    run                 Start to run the program. It will monitor and upload files continuously.
    echo_config_sample  Echo configure file sample

$ aliyunoss2-autoupload run --help
usage: aliyunoss2-autoupload run [-h] [--only-once] [--config-file CONFIG_FILE] [--logging-config-file LOGGING_CONFIG_FILE]

optional arguments:
  -h, --help            show this help message and exit
  --only-once, -o       Upload only once, then exit. Will NOT monitor files. (default=False)
  --config-file CONFIG_FILE, -c CONFIG_FILE
                        The program configuration file. The program first tries to load the configuration file named in the environment variable ${ALIYUNOSS2_AUTOUPLOAD_CONF}. If the environment variable is not set, it falls back to "conf/aliyunoss2-autoupload.yml".
  --logging-config-file LOGGING_CONFIG_FILE, -l LOGGING_CONFIG_FILE
                        The logging configuration file. The program first tries to load the logging configuration file named in the environment variable ${ALIYUNOSS2_AUTOUPLOAD_LOG_CONF}. If the environment variable is not set, it falls back to "conf/aliyunoss2-autoupload.log.yml".

$ aliyunoss2-autoupload echo_config_sample --help
usage: aliyunoss2-autoupload echo_config_sample [-h] {prog,log}

positional arguments:
  {prog,log}  Configure file to echo

optional arguments:
  -h, --help  show this help message and exit

Configuration File

The program first tries to load the configuration file named in the environment variable ALIYUNOSS2_AUTOUPLOAD_CONF. If the environment variable is not set, it falls back to "conf/aliyunoss2-autoupload.yml". The YAML file looks like this:

---
## Aliyun OSS configs
oss:
  ## Name of your Aliyun OSS bucket
  name: "your_bucket_name"
  ## Endpoint URL of the Aliyun OSS bucket
  endpoint: "oss-xx-xxxxxx.aliyuncs.com"
  ## cname of the domain of the Aliyun OSS bucket. Empty if no cname.
  cname: ""
  ## Access Key ID of the Aliyun OSS bucket
  access_key_id: "your_access_key_id"
  ## Access Key Secret of the Aliyun OSS bucket
  access_key_secret: "your_access_key_secret"

## Directory name configs
dir:
  ## Calculate an uploaded file's relative name against this local directory
  rel_dir: ""
  ## Upload files into this directory on OSS
  oss_dir: ""
  ## Move uploaded files to this directory. It MUST be different from the directory
  ## the files are in; otherwise the files will be uploaded again and again.
  bak_dir: ""

## watcher configs
watcher:
  ## The time interval (seconds) at which the program scans the directory
  interval: 30
  ## If the interval between the current time and the file's modification time is greater
  ## than this value, the write is considered complete.
  write_complete_time: 30
  ## Pattern of the files to watch and upload
  patterns: "files/*.*"
  ## Whether to match patterns recursively
  recursive: false
  ## A pool of at most max_workers threads executes the upload/backup tasks.
  ## If max_workers is None or not given, it defaults to the number of processors
  ## on the machine, multiplied by 5.
  max_workers: ~

Likewise, the program first tries to load the logging configuration file named in the environment variable ALIYUNOSS2_AUTOUPLOAD_LOG_CONF. If the environment variable is not set, it falls back to "conf/aliyunoss2-autoupload.log.yml". The logging config file is also YAML; see https://docs.python.org/3/library/logging.config.html for more information about Python logging configuration.

Install

Install by pip:

pip install aliyunoss2-autoupload

Install from code:

git clone https://github.com/tanbro/aliyunoss2-autoupload.git
cd aliyunoss2-autoupload
path/of/your/python setup.py install

Changelog

0.1
Added: reload the configuration every time before scanning files; the --only-once command line option.
Optimized: test case for the main() function.

0.1b2 (2018-06-20)
Adds: more detailed documents.
Bug fixes: remove files that are in .gitignore but were still tracked.

0.1b1 (2018-06-19)
Changes: support old Python 2.7 and Python 3.4; config file name extension changed from ".yaml" to ".yml"; default config file environment variable name; default config file path if no environment variable.
Adds: some simple test cases; CircleCI; Codacy.

0.1a3 (2018-04-18)
The first ever-usable version.
Changes: now Python 3.5+ ONLY, because glob.iglob has no recursive argument and no "**" wildcard in lower Python versions.
Bug fixes: backup directory errors; OSS exception.

CONTRIBUTING

Contributions are welcome! It is advised to develop in a venv:

path/of/your/python -m venv .venv
source .venv/bin/activate
git clone [email protected]:tanbro/aliyunoss2-autoupload.git
cd aliyunoss2-autoupload
python setup.py develop

Authors

liu xue yan <[email protected]>
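For example, a one-shot upload pass with an explicitly chosen configuration file could look like this; the ALIYUNOSS2_AUTOUPLOAD_CONF variable and the --only-once flag are documented above, and the file path is a placeholder:

export ALIYUNOSS2_AUTOUPLOAD_CONF=/path/to/aliyunoss2-autoupload.yml
aliyunoss2-autoupload run --only-once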
aliyunossdebug
My short description for my project.
aliyun-oss-python-sdk-managed-credentials-provider
No description available on PyPI.
aliyunpan
aliyunpan - Aliyun Drive (阿里云盘) CLI

Requirements: Python 3.7 (tested). Older Python versions fail at runtime; see issue 9.

Install:
pip install aliyunpan

Upgrade:
pip install aliyunpan --upgrade

Run:
aliyunpan-cli

PyInstaller builds: the latest builds are produced by GitHub Actions (built against a fairly new glibc, see #42); third-party downloads are updated less frequently.

Clone the project (--recurse-submodules also clones the submodules that some optional features need):
git clone https://github.com/wxy1343/aliyunpan --recurse-submodules

Obtaining a refresh_token
Note that a refresh_token obtained from the web client is subject to an anti-hotlinking check. You can instead log in with a username and password, or find the refresh_token in the mobile client's log file:
/sdcard/Android/data/com.alicloud.databox/files/logs/trace/userId/yunpan/latest.log
The login API checks the User-Agent, which has to be produced by running obfuscated JavaScript; installing node.js and the jsdom module is recommended for running that code. Aliyun Drive has since changed the UA algorithm to include information such as mouse movement; if you have a solution, pull requests are welcome.
npm install jsdom

Configure the refresh_token:
echo "refresh_token: 'xxxxx'" > ~/.config/aliyunpan.yaml

Configure the account (optional):
echo "username: 'xxxxx'" > ~/.config/aliyunpan.yaml
echo "password: 'xxxxx'" >> ~/.config/aliyunpan.yaml

Configure aria2 (optional):
cat >> ~/.config/aliyunpan.yaml << EOF
aria2:
  'host': 'http://localhost'
  'port': 6800
  'secret': ''
EOF

Commands
download (d)               download files/folders
ls (dir, l, list)          list a directory
mv (move)                  move files/folders
rm (del, delete)           delete files/folders
rename (r)                 rename files/folders
tree (show, t)             show the file tree
upload (u)                 upload files/folders
share (s)                  share files
mkdir (m)                  create folders
cat (c)                    print file contents
tui                        text user interface
search                     search for files/folders
sync                       synchronise a folder
token (r, refresh_token)   show the refresh_token

Usage

Show the help: aliyunpan-cli -h

Global options:
-h, --help             show the help
--version              show the version
-c, --config-file      specify the configuration file
-t, --refresh-token    specify the REFRESH_TOKEN
-u, --username         specify the account
-p, --password         specify the password
-d, --depth            file recursion depth
-T, --timeout          request timeout (seconds)
-id, --drive-id        specify the drive_id
-a, --album            access the photo album
-s, --share-id         specify a share id
-sp, --share-pwd       specify a share password
-f, --filter-file      filter files (may be given multiple times)
-w, --whitelist        filter files with a whitelist
-m, --match            match files with a regular expression

Show the help of a command: aliyunpan-cli COMMAND -h

Per-command options:
download      -p, --file            select files (may be given multiple times)
download      -s, --share           use a shared sequence file
download      -cs, --chunk-size     chunk size (bytes)
download      -a, --aria2           send to aria2
ls, search    -l                    show details
share         -p, --file            specify files (may be given multiple times)
share         -f, --file-id         specify file_id values (may be given multiple times)
share         -t, --expire-sec      share expiry time (seconds), maximum and default 14400
share         -l, --share-link      print share links
share         -d, --download-link   print download links
share         -s, --save            save the sequence file to the drive and locally
share         -S, --share-official  official share feature (requires account support)
upload        -p, --file            select files (may be given multiple times)
upload, sync  -t, --time-out        chunk upload timeout (seconds)
upload, sync  -r, --retry           number of retries after a failed upload
upload        -f, --force           force-overwrite files
upload        -s, --share           use a shared sequence file
upload, sync  -cs, --chunk-size     chunk size (bytes)
upload        -c                    resume an interrupted upload
cat           -e, --encoding        file encoding
sync          -st, --sync-time      sync interval
sync          --no-delete, -n       never delete (cloud/local) files (default)
sync          -d, --delete          allow deleting (cloud/local) files
sync          -l, --local           sync cloud files to the local machine
token         --refresh, -r         refresh the token in the configuration file
token         --refresh-time, -t    interval (seconds) for automatic token refresh
token         --change, -c          set a new refresh_token

Resumable uploads
Files are split into chunks and uploaded in order. Upload progress is saved to tasks.yaml in the current directory, keyed by the file's sha1:
sha1:
  path: absolute path
  upload_id: upload id
  file_id: file id
  chunk_size: chunk size
  part_number: number of the last uploaded chunk
If a file has not finished uploading, Ctrl+C saves this state automatically; resuming requires the -c option.

Sharing
The official rapid-upload ("instant transfer") API changed, which broke this feature. As a workaround, a direct link is embedded in the rapid-upload link so that the proof_code can be obtained. Sharing a file therefore requires fetching 8 random bytes of the file over the direct link, which is slow, and because of the direct link's limitations a rapid-upload link is only valid for 4 hours.

1. Share-link format:
aliyunpan://filename|sha1|url_base64|size|relative path
For example (the rapid-upload links below have all expired and are for reference only):
aliyunpan://示例文件.txt|F61851825609372B3D7F802E600B35A497CFC38E|url_base64|24|root

2. Share a file:
aliyunpan-cli share 示例文件.txt
Import it:
aliyunpan-cli upload "aliyunpan://示例文件.txt|F61851825609372B3D7F802E600B35A497CFC38E|url_base64|24|root"

3. Share a folder:
aliyunpan-cli share 示例文件夹
Import it:
aliyunpan-cli upload -s "aliyunpan://示例文件夹|80E7E25109D4246653B600FDFEDD8D8B0D97E517|url_base64|970|root"

TUI keys
Show the menu (Ctrl+X), quit (Ctrl+C), switch tabs (arrow keys, kjhl, TAB).

Environment variables
ALIYUNPAN_CONF   path of the configuration file
ALIYUNPAN_ROOT   root directory (output path for logs and tasks)

Acknowledgements
Thanks to zhjc1124/aliyundrive for the login API reference.
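A short illustrative session based on the commands documented above; the file name is a placeholder, and the exact positional/option forms may vary between versions:

# store the refresh_token, then check it
echo "refresh_token: 'xxxxx'" > ~/.config/aliyunpan.yaml
aliyunpan-cli token

# list the drive root, upload a local file, then download it again
aliyunpan-cli ls
aliyunpan-cli upload -p ./example.txt
aliyunpan-cli download -p example.txt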
aliyunpy
No description available on PyPI.
aliyun-python-sdk
UNKNOWN
aliyun-python-sdk-aas
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-aas-test
aliyun-python-sdk-aas This is the aas module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-acm
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-acms-open
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-acs
aliyun-python-sdk-acs This is the acs module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-acs-test
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-actiontrail
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-adb
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-adcp
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-address-purification
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-aegis
aliyun-python-sdk-aegis This is the aegis module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-afs
aliyun-python-sdk-afs This is the afs module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-aigen
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-aimiaobi
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-airec
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-airec-test
aliyun-python-sdk-airec This is the airec module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-airticketopen
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-alb
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-alidns
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
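Like the other per-service modules in this catalog, the alidns module is used together with aliyun-python-sdk-core. A minimal sketch follows; AcsClient and do_action_with_exception come from the core package, while the request class and its API version are assumptions that should be checked against the installed module (credentials and region are placeholders):

from aliyunsdkcore.client import AcsClient
# Assumed request module path and API version; verify the exact class in the installed package.
from aliyunsdkalidns.request.v20150109.DescribeDomainsRequest import DescribeDomainsRequest

# Placeholder credentials and region.
client = AcsClient('your-access-key-id', 'your-access-key-secret', 'cn-hangzhou')

request = DescribeDomainsRequest()
request.set_accept_format('json')

# Sends the signed API call and returns the raw JSON response body (bytes).
response = client.do_action_with_exception(request)
print(response)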
aliyun-python-sdk-aligreen-console
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-alikafka
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-alimt
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-alimt-test
aliyun-python-sdk-alimt This is the alimt module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-alinlp
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-aliyuncvc
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-amptest
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-amqp-open
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-antiddos-public
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-apds
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-appmallsservice
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-appstream-center
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-arms
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-arms4finance
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-arms-test
aliyun-python-sdk-arms This is the arms module of Aliyun Python SDK. Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-asapi
aliyun-python-sdk-asapi This is the asapi module of Aliyun ApsaraStack Python SDK. Aliyun ApsaraStack Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit https://www.aliyun.com/product/apsara-stack
aliyun-python-sdk-avatar
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python
aliyun-python-sdk-baas
Aliyun Python SDK is the official software development kit. It makes things easy to integrate your Python application, library, or script with Aliyun services. This module works on Python versions: 2.6.5 and greater. Documentation: Please visit http://develop.aliyun.com/sdk/python