$ click_
========
Click is a Python package for creating beautiful command line interfaces
in a composable way with as little code as necessary. It's the "Command
Line Interface Creation Kit". It's highly configurable but comes with
sensible defaults out of the box.
It aims to make the process of writing command line tools quick and fun
while also preventing any frustration caused by the inability to
implement an intended CLI API.
Click in three points:
- Arbitrary nesting of commands (see the sketch below)
- Automatic help page generation
- Supports lazy loading of subcommands at runtime
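For illustration of the first point, here is a minimal sketch of nested commands
(the group and subcommand names are hypothetical, not taken from the upstream docs):

.. code-block:: python

    import click

    @click.group()
    def cli():
        """Top-level entry point."""

    @cli.command()
    def init():
        """Nested subcommand, invoked as ``cli init``."""
        click.echo("Initialized the project.")

    @cli.command()
    def status():
        """Nested subcommand, invoked as ``cli status``."""
        click.echo("Everything is fine.")

    if __name__ == '__main__':
        cli()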
Installing
----------
Install and update using `pip`_:
.. code-block:: text

    $ pip install -U click
.. _pip: https://pip.pypa.io/en/stable/quickstart/
A Simple Example
----------------
.. code-block:: python

    import click

    @click.command()
    @click.option("--count", default=1, help="Number of greetings.")
    @click.option("--name", prompt="Your name", help="The person to greet.")
    def hello(count, name):
        """Simple program that greets NAME for a total of COUNT times."""
        for _ in range(count):
            click.echo(f"Hello, {name}!")

    if __name__ == '__main__':
        hello()

.. code-block:: text

    $ python hello.py --count=3
    Your name: Click
    Hello, Click!
    Hello, Click!
    Hello, Click!
Donate
------
The Pallets organization develops and supports Click and other popular
packages. In order to grow the community of contributors and users, and
allow the maintainers to devote more time to the projects, `please
donate today`_.
.. _please donate today: https://palletsprojects.com/donate
Links
-----
- Website: https://palletsprojects.com/p/click/
- Documentation: https://click.palletsprojects.com/
- Releases: https://pypi.org/project/click/
- Code: https://github.com/pallets/click
- Issue tracker: https://github.com/pallets/click/issues
- Test status: https://dev.azure.com/pallets/click/_build
- Official chat: https://discord.gg/t6rrQZH
# $ click_
Click is a Python package for creating beautiful command line interfaces
in a composable way with as little code as necessary. It's the "Command
Line Interface Creation Kit". It's highly configurable but comes with
sensible defaults out of the box.
It aims to make the process of writing command line tools quick and fun
while also preventing any frustration caused by the inability to
implement an intended CLI API.
Click in three points:
- Arbitrary nesting of commands
- Automatic help page generation
- Supports lazy loading of subcommands at runtime
## A Simple Example
```python
import click

@click.command()
@click.option("--count", default=1, help="Number of greetings.")
@click.option("--name", prompt="Your name", help="The person to greet.")
def hello(count, name):
    """Simple program that greets NAME for a total of COUNT times."""
    for _ in range(count):
        click.echo(f"Hello, {name}!")

if __name__ == '__main__':
    hello()
```

```
$ python hello.py --count=3
Your name: Click
Hello, Click!
Hello, Click!
Hello, Click!
```
## Donate
The Pallets organization develops and supports Click and other popular
packages. In order to grow the community of contributors and users, and
allow the maintainers to devote more time to the projects, [please
donate today][].
[please donate today]: https://palletsprojects.com/donate
.. image:: https://img.shields.io/pypi/v/colorama.svg
:target: https://pypi.org/project/colorama/
:alt: Latest Version
.. image:: https://img.shields.io/pypi/pyversions/colorama.svg
:target: https://pypi.org/project/colorama/
:alt: Supported Python versions
.. image:: https://github.com/tartley/colorama/actions/workflows/test.yml/badge.svg
:target: https://github.com/tartley/colorama/actions/workflows/test.yml
:alt: Build Status
Colorama
========
Makes ANSI escape character sequences (for producing colored terminal text and
cursor positioning) work under MS Windows.
.. |donate| image:: https://www.paypalobjects.com/en_US/i/btn/btn_donate_SM.gif
:target: https://www.paypal.com/cgi-bin/webscr?cmd=_donations&business=2MZ9D2GMLYCUJ&item_name=Colorama&currency_code=USD
:alt: Donate with Paypal
`PyPI for releases <https://pypi.org/project/colorama/>`_ |
`Github for source <https://github.com/tartley/colorama>`_ |
`Colorama for enterprise on Tidelift <https://github.com/tartley/colorama/blob/master/ENTERPRISE.md>`_
If you find Colorama useful, please |donate| to the authors. Thank you!
Installation
------------
Tested on CPython 2.7, 3.7, 3.8, 3.9 and 3.10 and PyPy 2.7 and 3.8.
No requirements other than the standard library.
.. code-block:: bash

    pip install colorama
    # or
    conda install -c anaconda colorama
Description
-----------
ANSI escape character sequences have long been used to produce colored terminal
text and cursor positioning on Unix and Macs. Colorama makes this work on
Windows, too, by wrapping ``stdout``, stripping ANSI sequences it finds (which
would appear as gobbledygook in the output), and converting them into the
appropriate win32 calls to modify the state of the terminal. On other platforms,
Colorama does nothing.
This has the upshot of providing a simple cross-platform API for printing
colored terminal text from Python, and has the happy side-effect that existing
applications or libraries which use ANSI sequences to produce colored output on
Linux or Macs can now also work on Windows, simply by calling
``colorama.just_fix_windows_console()`` (since v0.4.6) or ``colorama.init()``
(all versions, but may have other side-effects – see below).
An alternative approach is to install ``ansi.sys`` on Windows machines, which
provides the same behaviour for all applications running in terminals. Colorama
is intended for situations where that isn't easy (e.g., maybe your app doesn't
have an installer.)
Demo scripts in the source code repository print some colored text using
ANSI sequences. Compare their output under Gnome-terminal's built-in ANSI
handling, versus on Windows Command-Prompt using Colorama:
.. image:: https://github.com/tartley/colorama/raw/master/screenshots/ubuntu-demo.png
:width: 661
:height: 357
:alt: ANSI sequences on Ubuntu under gnome-terminal.
.. image:: https://github.com/tartley/colorama/raw/master/screenshots/windows-demo.png
:width: 668
:height: 325
:alt: Same ANSI sequences on Windows, using Colorama.
These screenshots show that, on Windows, Colorama does not support ANSI 'dim
text'; it looks the same as 'normal text'.
Usage
-----
Initialisation
..............
If the only thing you want from Colorama is to get ANSI escapes to work on
Windows, then run:
.. code-block:: python

    from colorama import just_fix_windows_console
    just_fix_windows_console()
If you're on a recent version of Windows 10 or better, and your stdout/stderr
are pointing to a Windows console, then this will flip the magic configuration
switch to enable Windows' built-in ANSI support.
If you're on an older version of Windows, and your stdout/stderr are pointing to
a Windows console, then this will wrap ``sys.stdout`` and/or ``sys.stderr`` in a
magic file object that intercepts ANSI escape sequences and issues the
appropriate Win32 calls to emulate them.
In all other circumstances, it does nothing whatsoever. Basically the idea is
that this makes Windows act like Unix with respect to ANSI escape handling.
It's safe to call this function multiple times. It's safe to call this function
on non-Windows platforms, but it won't do anything. It's safe to call this
function when one or both of your stdout/stderr are redirected to a file – it
won't do anything to those streams.
Alternatively, you can use the older interface with more features (but also more
potential footguns):
.. code-block:: python

    from colorama import init
    init()
This does the same thing as ``just_fix_windows_console``, except for the
following differences:
- It's not safe to call ``init`` multiple times; you can end up with multiple
layers of wrapping and broken ANSI support.
- Colorama will apply a heuristic to guess whether stdout/stderr support ANSI,
and if it thinks they don't, then it will wrap ``sys.stdout`` and
``sys.stderr`` in a magic file object that strips out ANSI escape sequences
before printing them. This happens on all platforms, and can be convenient if
you want to write your code to emit ANSI escape sequences unconditionally, and
let Colorama decide whether they should actually be output. But note that
Colorama's heuristic is not particularly clever.
- ``init`` also accepts explicit keyword args to enable/disable various
functionality – see below.
To stop using Colorama before your program exits, simply call ``deinit()``.
This will restore ``stdout`` and ``stderr`` to their original values, so that
Colorama is disabled. To resume using Colorama again, call ``reinit()``; it is
cheaper than calling ``init()`` again (but does the same thing).
Most users should depend on ``colorama >= 0.4.6``, and use
``just_fix_windows_console``. The old ``init`` interface will be supported
indefinitely for backwards compatibility, but we don't plan to fix any issues
with it, also for backwards compatibility.
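For projects that cannot pin the dependency, a small compatibility shim covers both
interfaces; this is an illustrative sketch, not something the library itself provides:

.. code-block:: python

    try:
        # Colorama >= 0.4.6
        from colorama import just_fix_windows_console
        just_fix_windows_console()
    except ImportError:
        # Older releases only provide init()
        from colorama import init
        init()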
Colored Output
..............
Cross-platform printing of colored text can then be done using Colorama's
constant shorthand for ANSI escape sequences. These are deliberately
rudimentary, see below.
.. code-block:: python

    from colorama import Fore, Back, Style

    print(Fore.RED + 'some red text')
    print(Back.GREEN + 'and with a green background')
    print(Style.DIM + 'and in dim text')
    print(Style.RESET_ALL)
    print('back to normal now')
...or simply by manually printing ANSI sequences from your own code:
.. code-block:: python

    print('\033[31m' + 'some red text')
    print('\033[39m') # and reset to default color
...or, Colorama can be used in conjunction with existing ANSI libraries
such as the venerable `Termcolor <https://pypi.org/project/termcolor/>`_,
the fabulous `Blessings <https://pypi.org/project/blessings/>`_,
or the incredible `Rich <https://pypi.org/project/rich/>`_.
If you wish Colorama's Fore, Back and Style constants were more capable,
then consider using one of the above highly capable libraries to generate
colors, etc., and use Colorama just for its primary purpose: to convert
those ANSI sequences to also work on Windows:

.. code-block:: python

    from colorama import just_fix_windows_console
    from termcolor import colored

    # use Colorama to make Termcolor work on Windows too
    just_fix_windows_console()

    # then use Termcolor for all colored text output
    print(colored('Hello, World!', 'green', 'on_red'))

SIMILARLY, do not send PRs adding the generation of new ANSI types to Colorama.
We are only interested in converting ANSI codes to win32 API calls, not
shortcuts like the above to generate ANSI characters.
Available formatting constants are::

    Fore: BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE, RESET.
    Back: BLACK, RED, GREEN, YELLOW, BLUE, MAGENTA, CYAN, WHITE, RESET.
    Style: DIM, NORMAL, BRIGHT, RESET_ALL

``Style.RESET_ALL`` resets foreground, background, and brightness. Colorama will
perform this reset automatically on program exit.

These are fairly well supported, but not part of the standard::

    Fore: LIGHTBLACK_EX, LIGHTRED_EX, LIGHTGREEN_EX, LIGHTYELLOW_EX, LIGHTBLUE_EX, LIGHTMAGENTA_EX, LIGHTCYAN_EX, LIGHTWHITE_EX
    Back: LIGHTBLACK_EX, LIGHTRED_EX, LIGHTGREEN_EX, LIGHTYELLOW_EX, LIGHTBLUE_EX, LIGHTMAGENTA_EX, LIGHTCYAN_EX, LIGHTWHITE_EX
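For example (an illustrative sketch, not taken from the upstream docs), the extended
constants are used exactly like the standard ones:

.. code-block:: python

    from colorama import Fore, Style, init

    init()
    print(Fore.LIGHTGREEN_EX + 'bright green via LIGHTGREEN_EX' + Style.RESET_ALL)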
Cursor Positioning
..................
ANSI codes to reposition the cursor are supported. See ``demos/demo06.py`` for
an example of how to generate them.
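As an illustrative sketch (the demo script remains the authoritative reference),
Colorama's ``Cursor`` helpers generate these sequences for you:

.. code-block:: python

    from colorama import Cursor, init

    init()
    # Move to column 10 of row 5, then print there.
    print(Cursor.POS(10, 5) + 'printed at row 5, column 10')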
Init Keyword Args
.................
``init()`` accepts some ``**kwargs`` to override default behaviour.
init(autoreset=False):
If you find yourself repeatedly sending reset sequences to turn off color
changes at the end of every print, then ``init(autoreset=True)`` will
automate that:
.. code-block:: python

    from colorama import Fore, init

    init(autoreset=True)
    print(Fore.RED + 'some red text')
    print('automatically back to default color again')
init(strip=None):
Pass ``True`` or ``False`` to override whether ANSI codes should be
stripped from the output. The default behaviour is to strip if on Windows
or if output is redirected (not a tty).
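For instance (an illustrative sketch, not from the upstream docs), stripping can be
disabled so that ANSI codes survive redirection to a file, assuming whatever reads
that file understands them:

.. code-block:: python

    from colorama import init

    # Keep ANSI codes in the output even when stdout is not a tty.
    init(strip=False)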
init(convert=None):
Pass ``True`` or ``False`` to override whether to convert ANSI codes in the
output into win32 calls. The default behaviour is to convert if on Windows
and output is to a tty (terminal).
init(wrap=True):
On Windows, Colorama works by replacing ``sys.stdout`` and ``sys.stderr``
with proxy objects, which override the ``.write()`` method to do their work.
If this wrapping causes you problems, then this can be disabled by passing
``init(wrap=False)``. The default behaviour is to wrap if ``autoreset`` or
``strip`` or ``convert`` are True.
When wrapping is disabled, colored printing on non-Windows platforms will
continue to work as normal. To do cross-platform colored output, you can
use Colorama's ``AnsiToWin32`` proxy directly:
.. code-block:: python

    import sys
    from colorama import init, AnsiToWin32, Fore

    init(wrap=False)
    stream = AnsiToWin32(sys.stderr).stream

    # Python 2
    print >>stream, Fore.BLUE + 'blue text on stderr'

    # Python 3
    print(Fore.BLUE + 'blue text on stderr', file=stream)
Recognised ANSI Sequences
.........................
ANSI sequences generally take the form::

    ESC [ <param> ; <param> ... <command>
Where ``<param>`` is an integer, and ``<command>`` is a single letter. Zero or
more params are passed to a ``<command>``. If no params are passed, it is
generally synonymous with passing a single zero. No spaces exist in the
sequence; they have been inserted here simply to read more easily.
The only ANSI sequences that Colorama converts into win32 calls are::

    ESC [ 0 m       # reset all (colors and brightness)
    ESC [ 1 m       # bright
    ESC [ 2 m       # dim (looks same as normal brightness)
    ESC [ 22 m      # normal brightness

    # FOREGROUND:
    ESC [ 30 m      # black
    ESC [ 31 m      # red
    ESC [ 32 m      # green
    ESC [ 33 m      # yellow
    ESC [ 34 m      # blue
    ESC [ 35 m      # magenta
    ESC [ 36 m      # cyan
    ESC [ 37 m      # white
    ESC [ 39 m      # reset

    # BACKGROUND
    ESC [ 40 m      # black
    ESC [ 41 m      # red
    ESC [ 42 m      # green
    ESC [ 43 m      # yellow
    ESC [ 44 m      # blue
    ESC [ 45 m      # magenta
    ESC [ 46 m      # cyan
    ESC [ 47 m      # white
    ESC [ 49 m      # reset

    # cursor positioning
    ESC [ y;x H     # position cursor at x across, y down
    ESC [ y;x f     # position cursor at x across, y down
    ESC [ n A       # move cursor n lines up
    ESC [ n B       # move cursor n lines down
    ESC [ n C       # move cursor n characters forward
    ESC [ n D       # move cursor n characters backward

    # clear the screen
    ESC [ mode J    # clear the screen
    # clear the line
    ESC [ mode K    # clear the line

Multiple numeric params to the ``'m'`` command can be combined into a single
sequence::

    ESC [ 36 ; 45 ; 1 m     # bright cyan text on magenta background
All other ANSI sequences of the form ``ESC [ <param> ; <param> ... <command>``
are silently stripped from the output on Windows.
Any other form of ANSI sequence, such as single-character codes or alternative
initial characters, are not recognised or stripped. It would be cool to add
them though. Let me know if it would be useful for you, via the Issues on
GitHub.
Status & Known Problems
-----------------------
I've personally only tested it on Windows XP (CMD, Console2), Ubuntu
(gnome-terminal, xterm), and OS X.
Some valid ANSI sequences aren't recognised.
If you're hacking on the code, see `README-hacking.md`_. ESPECIALLY, see the
explanation there of why we do not want PRs that allow Colorama to generate new
types of ANSI codes.
See outstanding issues and wish-list:
https://github.com/tartley/colorama/issues
If anything doesn't work for you, or doesn't do what you expected or hoped for,
I'd love to hear about it on that issues list, would be delighted by patches,
and would be happy to grant commit access to anyone who submits a working patch
or two.
.. _README-hacking.md: README-hacking.md
License
-------
Copyright Jonathan Hartley & Arnon Yaari, 2013-2020. BSD 3-Clause license; see
LICENSE file.
Professional support
--------------------
.. |tideliftlogo| image:: https://cdn2.hubspot.net/hubfs/4008838/website/logos/logos_for_download/Tidelift_primary-shorthand-logo.png
:alt: Tidelift
:target: https://tidelift.com/subscription/pkg/pypi-colorama?utm_source=pypi-colorama&utm_medium=referral&utm_campaign=readme
.. list-table::
   :widths: 10 100

   * - |tideliftlogo|
     - Professional support for colorama is available as part of the
       `Tidelift Subscription`_.

       Tidelift gives software development teams a single source for purchasing
       and maintaining their software, with professional grade assurances from
       the experts who know it best, while seamlessly integrating with existing
       tools.
.. _Tidelift Subscription: https://tidelift.com/subscription/pkg/pypi-colorama?utm_source=pypi-colorama&utm_medium=referral&utm_campaign=readme
Thanks
------
See the CHANGELOG for more thanks!
* Marc Schlaich (schlamar) for a ``setup.py`` fix for Python2.5.
* Marc Abramowitz, reported & fixed a crash on exit with closed ``stdout``,
providing a solution to issue #7's setuptools/distutils debate,
and other fixes.
* User 'eryksun', for guidance on correctly instantiating ``ctypes.windll``.
* Matthew McCormick for politely pointing out a longstanding crash on non-Win.
* Ben Hoyt, for a magnificent fix under 64-bit Windows.
* Jesse at Empty Square for submitting a fix for examples in the README.
* User 'jamessp', an observant documentation fix for cursor positioning.
* User 'vaal1239', Dave Mckee & Lackner Kristof for a tiny but much-needed Win7
fix.
* Julien Stuyck, for wisely suggesting Python3 compatible updates to README.
* Daniel Griffith for multiple fabulous patches.
* Oscar Lesta for a valuable fix to stop ANSI chars being sent to non-tty
output.
* Roger Binns, for many suggestions, valuable feedback, & bug reports.
* Tim Golden for thought and much appreciated feedback on the initial idea.
* User 'Zearin' for updates to the README file.
* John Szakmeister for adding support for light colors
* Charles Merriam for adding documentation to demos
* Jurko for a fix on 64-bit Windows CPython2.5 w/o ctypes
* Florian Bruhin for a fix when stdout or stderr are None
* Thomas Weininger for fixing ValueError on Windows
* Remi Rampin for better Github integration and fixes to the README file
* Simeon Visser for closing a file handle using 'with' and updating classifiers
to include Python 3.3 and 3.4
* Andy Neff for fixing RESET of LIGHT_EX colors.
* Jonathan Hartley for the initial idea and implementation.
.. image:: https://jazzband.co/static/img/badge.svg
:target: https://jazzband.co/
:alt: Jazzband
.. image:: https://readthedocs.org/projects/contextlib2/badge/?version=latest
:target: https://contextlib2.readthedocs.org/
:alt: Latest Docs
.. image:: https://img.shields.io/travis/jazzband/contextlib2/master.svg
:target: http://travis-ci.org/jazzband/contextlib2
.. image:: https://coveralls.io/repos/github/jazzband/contextlib2/badge.svg?branch=master
:target: https://coveralls.io/github/jazzband/contextlib2?branch=master
.. image:: https://landscape.io/github/jazzband/contextlib2/master/landscape.svg
:target: https://landscape.io/github/jazzband/contextlib2/
contextlib2 is a backport of the `standard library's contextlib
module <https://docs.python.org/3.5/library/contextlib.html>`_ to
earlier Python versions.
It also serves as a real world proving ground for possible future
enhancements to the standard library version.
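For example (an illustrative sketch with made-up file names), the backported
``ExitStack`` behaves on Python 2 just like the standard library version:

.. code-block:: python

    from contextlib2 import ExitStack

    filenames = ['a.txt', 'b.txt']
    with ExitStack() as stack:
        # Every file entered here is closed when the block exits, even if
        # opening a later file raises an exception.
        files = [stack.enter_context(open(name)) for name in filenames]
        for f in files:
            print(f.readline())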
Development
-----------
contextlib2 has no runtime dependencies, but requires ``unittest2`` for testing
on Python 2.x, as well as ``setuptools`` and ``wheel`` to generate universal
wheel archives.
Local testing is just a matter of running ``python test_contextlib2.py``.
You can test against multiple versions of Python with
`tox <https://tox.testrun.org/>`_::

    pip install tox
    tox
Versions currently tested in both tox and Travis CI are:
* CPython 2.7
* CPython 3.4
* CPython 3.5
* CPython 3.6
* CPython 3.7
* PyPy
* PyPy3
.. image:: https://jazzband.co/static/img/badge.svg
:target: https://jazzband.co/
:alt: Jazzband
.. image:: https://github.com/jazzband/contextlib2/workflows/Test/badge.svg
:target: https://github.com/jazzband/contextlib2/actions
:alt: Tests
.. image:: https://codecov.io/gh/jazzband/contextlib2/branch/master/graph/badge.svg
:target: https://codecov.io/gh/jazzband/contextlib2
:alt: Coverage
.. image:: https://readthedocs.org/projects/contextlib2/badge/?version=latest
:target: https://contextlib2.readthedocs.org/
:alt: Latest Docs
contextlib2 is a backport of the `standard library's contextlib
module <https://docs.python.org/3/library/contextlib.html>`_ to
earlier Python versions.
It also sometimes serves as a real world proving ground for possible future
enhancements to the standard library version.
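As an illustrative sketch (not taken from the project docs), the backport is used as a
drop-in replacement for the standard module:

.. code-block:: python

    import os
    from contextlib2 import suppress

    # Equivalent to contextlib.suppress in the standard library: ignore the
    # error if the file is already gone.
    with suppress(FileNotFoundError):
        os.remove('somefile.tmp')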
Licensing
---------
As a backport of Python standard library software, the implementation, test
suite and other supporting files for this project are distributed under the
Python Software License used for the CPython reference implementation.
The one exception is the included type hints file, which comes from the
``typeshed`` project, and is hence distributed under the Apache License 2.0.
Development
-----------
contextlib2 has no runtime dependencies, but requires ``setuptools`` and
``wheel`` at build time to generate universal wheel archives.
Local testing is a matter of running::

    python3 -m unittest discover -t . -s test

You can test against multiple versions of Python with
`tox <https://tox.testrun.org/>`_::

    pip install tox
    tox
Versions currently tested in both tox and GitHub Actions are:
* CPython 3.6
* CPython 3.7
* CPython 3.8
* CPython 3.9
* CPython 3.10
* PyPy3
Updating to a new stdlib reference version
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
As of Python 3.10, 4 files needed to be copied from the CPython reference
implementation to contextlib2:
* ``Doc/contextlib.rst`` -> ``docs/contextlib2.rst``
* ``Lib/contextlib.py`` -> ``contextlib2/__init__.py``
* ``Lib/test/test_contextlib.py`` -> ``test/test_contextlib.py``
* ``Lib/test/test_contextlib_async.py`` -> ``test/test_contextlib_async.py``
The corresponding version of ``contextlib2/__init__.pyi`` also needs to be
retrieved from the ``typeshed`` project::

    wget https://raw.githubusercontent.com/python/typeshed/master/stdlib/contextlib.pyi
For the 3.10 sync, the only changes needed to the test files were to import from
``contextlib2`` rather than ``contextlib``. The test directory is laid out so
that the test suite's imports from ``test.support`` work the same way they do in
the main CPython test suite.
The following patch files are saved in the ``dev`` directory:
* changes made to ``contextlib2/__init__.py`` to get it to run on the older
versions (and to add back in the deprecated APIs that never graduated to
the standard library version)
* changes made to ``contextlib2/__init__.pyi`` to make the Python version
guards unconditional (since the ``contextlib2`` API is the same on all
supported versions)
* changes made to ``docs/contextlib2.rst`` to use ``contextlib2`` version
numbers in the version added/changed notes and to integrate the module
documentation with the rest of the project documentation
Decorators for Humans
=====================
The goal of the decorator module is to make it easy to define
signature-preserving function decorators and decorator factories.
It also includes an implementation of multiple dispatch and other niceties
(please check the docs). It is released under a two-clause
BSD license: basically, you can do whatever you want with it, but I am not
responsible.
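As a minimal sketch of what "signature-preserving" means in practice (illustrative
only, with made-up function names; Python 3 shown for the ``inspect.signature`` call):

.. code-block:: python

    import inspect
    from decorator import decorator

    @decorator
    def trace(func, *args, **kw):
        print('calling %s' % func.__name__)
        return func(*args, **kw)

    @trace
    def add(x, y):
        return x + y

    add(1, 2)                      # prints "calling add", returns 3
    print(inspect.signature(add))  # (x, y) -- not (*args, **kw)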
Installation
-------------
If you are lazy, just perform
``$ pip install decorator``
which will install just the module on your system.
If you prefer to install the full distribution from source, including
the documentation, clone the `GitHub repo`_ or download the tarball_, unpack it and run
``$ pip install .``
in the main directory, possibly as superuser.
.. _tarball: https://pypi.org/project/decorator/#files
.. _GitHub repo: https://github.com/micheles/decorator
Testing
--------
If you have the source code installation you can run the tests with
``$ python src/tests/test.py -v``
or (if you have setuptools installed)
``$ python setup.py test``
Notice that you may run into trouble if an older version of the decorator
module is installed on your system; in such a case, remove the old version.
It is even safe to copy the module ``decorator.py`` over an existing one,
since we have kept backward compatibility for a long time.
Repository
---------------
The project is hosted on GitHub. You can look at the source here:
https://github.com/micheles/decorator
Documentation
---------------
The documentation has been moved to https://github.com/micheles/decorator/blob/master/docs/documentation.md
From there you can get a PDF version by simply using the print
functionality of your browser.
Here is the documentation for previous versions of the module:
https://github.com/micheles/decorator/blob/4.3.2/docs/tests.documentation.rst
https://github.com/micheles/decorator/blob/4.2.1/docs/tests.documentation.rst
https://github.com/micheles/decorator/blob/4.1.2/docs/tests.documentation.rst
https://github.com/micheles/decorator/blob/4.0.0/documentation.rst
https://github.com/micheles/decorator/blob/3.4.2/documentation.rst
For the impatient
-----------------
Here is an example of how to define a family of decorators tracing slow
operations:
.. code-block:: python

    import logging
    import time

    from decorator import decorator

    @decorator
    def warn_slow(func, timelimit=60, *args, **kw):
        t0 = time.time()
        result = func(*args, **kw)
        dt = time.time() - t0
        if dt > timelimit:
            logging.warn('%s took %d seconds', func.__name__, dt)
        else:
            logging.info('%s took %d seconds', func.__name__, dt)
        return result

    @warn_slow  # warn if it takes more than 1 minute
    def preprocess_input_files(inputdir, tempdir):
        ...

    @warn_slow(timelimit=600)  # warn if it takes more than 10 minutes
    def run_calculation(tempdir, outdir):
        ...
Enjoy!
.. funcsigs documentation master file, created by
sphinx-quickstart on Fri Apr 20 20:27:52 2012.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Introducing funcsigs
====================
The Funcsigs Package
--------------------
``funcsigs`` is a backport of the `PEP 362`_ function signature features from
Python 3.3's `inspect`_ module. The backport is compatible with Python 2.6, 2.7
as well as 3.3 and up. 3.2 was supported by version 0.4, but with setuptools and
pip no longer supporting 3.2, we cannot make any statement about 3.2
compatibility.
Compatibility
`````````````
The ``funcsigs`` backport has been tested against:
* CPython 2.6
* CPython 2.7
* CPython 3.3
* CPython 3.4
* CPython 3.5
* CPython nightlies
* PyPy and PyPy3 (currently failing CI)
Continuous integration testing is provided by `Travis CI`_.
Under Python 2.x there is a compatibility issue when a function is assigned to
the ``__wrapped__`` property of a class after it has been constructed.
Similarly, under PyPy, directly passing the ``__call__`` method of a builtin
is also a compatibility issue. Otherwise the functionality is believed to be
uniform between Python 2 and Python 3.
Issues
``````
Source code for ``funcsigs`` is hosted on `GitHub`_. Any bug reports or feature
requests can be made using GitHub's `issues system`_. |build_status| |coverage|
Example
-------
To obtain a `Signature` object, pass the target function to the
``funcsigs.signature`` function.
.. code-block:: python

    >>> from funcsigs import signature
    >>> def foo(a, b=None, *args, **kwargs):
    ...     pass
    ...
    >>> sig = signature(foo)
    >>> sig
    <funcsigs.Signature object at 0x...>
    >>> sig.parameters
    OrderedDict([('a', <Parameter at 0x... 'a'>), ('b', <Parameter at 0x... 'b'>), ('args', <Parameter at 0x... 'args'>), ('kwargs', <Parameter at 0x... 'kwargs'>)])
    >>> sig.return_annotation
    <class 'funcsigs._empty'>
Introspecting callables with the Signature object
-------------------------------------------------
.. note::
This section of documentation is a direct reproduction of the Python
standard library documentation for the inspect module.
The Signature object represents the call signature of a callable object and its
return annotation. To retrieve a Signature object, use the :func:`signature`
function.
.. function:: signature(callable)
Return a :class:`Signature` object for the given ``callable``::

    >>> from funcsigs import signature
    >>> def foo(a, *, b:int, **kwargs):
    ...     pass
    >>> sig = signature(foo)
    >>> str(sig)
    '(a, *, b:int, **kwargs)'
    >>> str(sig.parameters['b'])
    'b:int'
    >>> sig.parameters['b'].annotation
    <class 'int'>
Accepts a wide range of python callables, from plain functions and classes to
:func:`functools.partial` objects.
.. note::
Some callables may not be introspectable in certain implementations of
Python. For example, in CPython, built-in functions defined in C provide
no metadata about their arguments.
.. class:: Signature
A Signature object represents the call signature of a function and its return
annotation. For each parameter accepted by the function it stores a
:class:`Parameter` object in its :attr:`parameters` collection.
Signature objects are *immutable*. Use :meth:`Signature.replace` to make a
modified copy.
.. attribute:: Signature.empty
A special class-level marker to specify absence of a return annotation.
.. attribute:: Signature.parameters
An ordered mapping of parameters' names to the corresponding
:class:`Parameter` objects.
.. attribute:: Signature.return_annotation
The "return" annotation for the callable. If the callable has no "return"
annotation, this attribute is set to :attr:`Signature.empty`.
.. method:: Signature.bind(*args, **kwargs)
Create a mapping from positional and keyword arguments to parameters.
Returns :class:`BoundArguments` if ``*args`` and ``**kwargs`` match the
signature, or raises a :exc:`TypeError`.
.. method:: Signature.bind_partial(*args, **kwargs)
Works the same way as :meth:`Signature.bind`, but allows the omission of
some required arguments (mimics :func:`functools.partial` behavior.)
Returns :class:`BoundArguments`, or raises a :exc:`TypeError` if the
passed arguments do not match the signature.
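For illustration (a sketch, not part of the reproduced standard library
documentation), ``bind_partial`` accepts an incomplete set of arguments where
``bind`` would raise :exc:`TypeError`::

    >>> from funcsigs import signature
    >>> def foo(a, b, c=3):
    ...     pass
    >>> ba = signature(foo).bind_partial(1)   # 'b' omitted; bind() would fail
    >>> ba.arguments
    OrderedDict([('a', 1)])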
.. method:: Signature.replace(*[, parameters][, return_annotation])
Create a new Signature instance based on the instance replace was invoked
on. It is possible to pass different ``parameters`` and/or
``return_annotation`` to override the corresponding properties of the base
signature. To remove return_annotation from the copied Signature, pass in
:attr:`Signature.empty`.
::

    >>> def test(a, b):
    ...     pass
    >>> sig = signature(test)
    >>> new_sig = sig.replace(return_annotation="new return anno")
    >>> str(new_sig)
    "(a, b) -> 'new return anno'"
.. class:: Parameter
Parameter objects are *immutable*. Instead of modifying a Parameter object,
you can use :meth:`Parameter.replace` to create a modified copy.
.. attribute:: Parameter.empty
A special class-level marker to specify absence of default values and
annotations.
.. attribute:: Parameter.name
The name of the parameter as a string. Must be a valid python identifier
name (with the exception of ``POSITIONAL_ONLY`` parameters, which can have
it set to ``None``).
.. attribute:: Parameter.default
The default value for the parameter. If the parameter has no default
value, this attribute is set to :attr:`Parameter.empty`.
.. attribute:: Parameter.annotation
The annotation for the parameter. If the parameter has no annotation,
this attribute is set to :attr:`Parameter.empty`.
.. attribute:: Parameter.kind
Describes how argument values are bound to the parameter. Possible values
(accessible via :class:`Parameter`, like ``Parameter.KEYWORD_ONLY``):
+------------------------+----------------------------------------------+
| Name | Meaning |
+========================+==============================================+
| *POSITIONAL_ONLY* | Value must be supplied as a positional |
| | argument. |
| | |
| | Python has no explicit syntax for defining |
| | positional-only parameters, but many built-in|
| | and extension module functions (especially |
| | those that accept only one or two parameters)|
| | accept them. |
+------------------------+----------------------------------------------+
| *POSITIONAL_OR_KEYWORD*| Value may be supplied as either a keyword or |
| | positional argument (this is the standard |
| | binding behaviour for functions implemented |
| | in Python.) |
+------------------------+----------------------------------------------+
| *VAR_POSITIONAL* | A tuple of positional arguments that aren't |
| | bound to any other parameter. This |
| | corresponds to a ``*args`` parameter in a |
| | Python function definition. |
+------------------------+----------------------------------------------+
| *KEYWORD_ONLY* | Value must be supplied as a keyword argument.|
| | Keyword only parameters are those which |
| | appear after a ``*`` or ``*args`` entry in a |
| | Python function definition. |
+------------------------+----------------------------------------------+
| *VAR_KEYWORD* | A dict of keyword arguments that aren't bound|
| | to any other parameter. This corresponds to a|
| | ``**kwargs`` parameter in a Python function |
| | definition. |
+------------------------+----------------------------------------------+
Example: print all keyword-only arguments without default values::

    >>> def foo(a, b, *, c, d=10):
    ...     pass
    >>> sig = signature(foo)
    >>> for param in sig.parameters.values():
    ...     if (param.kind == param.KEYWORD_ONLY and
    ...             param.default is param.empty):
    ...         print('Parameter:', param)
    Parameter: c
.. method:: Parameter.replace(*[, name][, kind][, default][, annotation])
Create a new Parameter instance based on the instance replace was invoked
on. To override a :class:`Parameter` attribute, pass the corresponding
argument. To remove a default value or/and an annotation from a
Parameter, pass :attr:`Parameter.empty`.
::

    >>> from funcsigs import Parameter
    >>> param = Parameter('foo', Parameter.KEYWORD_ONLY, default=42)
    >>> str(param)
    'foo=42'
    >>> str(param.replace()) # Will create a shallow copy of 'param'
    'foo=42'
    >>> str(param.replace(default=Parameter.empty, annotation='spam'))
    "foo:'spam'"
.. class:: BoundArguments
Result of a :meth:`Signature.bind` or :meth:`Signature.bind_partial` call.
Holds the mapping of arguments to the function's parameters.
.. attribute:: BoundArguments.arguments
An ordered, mutable mapping (:class:`collections.OrderedDict`) of
parameters' names to arguments' values. Contains only explicitly bound
arguments. Changes in :attr:`arguments` will reflect in :attr:`args` and
:attr:`kwargs`.
Should be used in conjunction with :attr:`Signature.parameters` for any
argument processing purposes.
.. note::
Arguments for which :meth:`Signature.bind` or
:meth:`Signature.bind_partial` relied on a default value are skipped.
However, if needed, it is easy to include them.
::

    >>> def foo(a, b=10):
    ...     pass
    >>> sig = signature(foo)
    >>> ba = sig.bind(5)
    >>> ba.args, ba.kwargs
    ((5,), {})
    >>> for param in sig.parameters.values():
    ...     if param.name not in ba.arguments:
    ...         ba.arguments[param.name] = param.default
    >>> ba.args, ba.kwargs
    ((5, 10), {})
.. attribute:: BoundArguments.args
A tuple of positional arguments values. Dynamically computed from the
:attr:`arguments` attribute.
.. attribute:: BoundArguments.kwargs
A dict of keyword arguments values. Dynamically computed from the
:attr:`arguments` attribute.
The :attr:`args` and :attr:`kwargs` properties can be used to invoke
functions::

    def test(a, *, b):
        ...

    sig = signature(test)
    ba = sig.bind(10, b=20)
    test(*ba.args, **ba.kwargs)
.. seealso::
:pep:`362` - Function Signature Object.
The detailed specification, implementation details and examples.
Copyright
---------
*funcsigs* is a derived work of CPython under the terms of the `PSF License
Agreement`_. The original CPython inspect module, its unit tests and
documentation are the copyright of the Python Software Foundation. The derived
work is distributed under the `Apache License Version 2.0`_.
.. _PSF License Agreement: http://docs.python.org/3/license.html#terms-and-conditions-for-accessing-or-otherwise-using-python
.. _Apache License Version 2.0: http://opensource.org/licenses/Apache-2.0
.. _GitHub: https://github.com/testing-cabal/funcsigs
.. _Travis CI: http://travis-ci.org/
.. _Read The Docs: http://funcsigs.readthedocs.org/
.. _PEP 362: http://www.python.org/dev/peps/pep-0362/
.. _inspect: http://docs.python.org/3/library/inspect.html#introspecting-callables-with-the-signature-object
.. _issues system: https://github.com/testing-cabal/funcsigs/issues
.. |build_status| image:: https://secure.travis-ci.org/aliles/funcsigs.png?branch=master
:target: http://travis-ci.org/#!/aliles/funcsigs
:alt: Current build status
.. |coverage| image:: https://coveralls.io/repos/aliles/funcsigs/badge.png?branch=master
:target: https://coveralls.io/r/aliles/funcsigs?branch=master
:alt: Coverage status
.. |pypi_version| image:: https://pypip.in/v/funcsigs/badge.png
:target: https://crate.io/packages/funcsigs/
:alt: Latest PyPI version | {
"source": "yandex/perforator",
"title": "contrib/python/funcsigs/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/funcsigs/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 13292
} |
.. funcsigs documentation master file, created by
sphinx-quickstart on Fri Apr 20 20:27:52 2012.
You can adapt this file completely to your liking, but it should at least
contain the root `toctree` directive.
Introducing funcsigs
====================
The Funcsigs Package
--------------------
``funcsigs`` is a backport of the `PEP 362`_ function signature features from
Python 3.3's `inspect`_ module. The backport is compatible with Python 2.6, 2.7
as well as 3.3 and up. 3.2 was supported by version 0.4, but with setuptools and
pip no longer supporting 3.2, we cannot make any statement about 3.2
compatibility.
Compatibility
`````````````
The ``funcsigs`` backport has been tested against:
* CPython 2.6
* CPython 2.7
* CPython 3.3
* CPython 3.4
* CPython 3.5
* CPython nightlies
* PyPy and PyPy3 (currently failing CI)
Continuous integration testing is provided by `Travis CI`_.
Under Python 2.x there is a compatibility issue when a function is assigned to
the ``__wrapped__`` property of a class after it has been constructed.
Similarly, under PyPy, directly passing the ``__call__`` method of a
builtin is also a compatibility issue. Otherwise the functionality is
believed to be uniform between Python 2 and Python 3.
Issues
``````
Source code for ``funcsigs`` is hosted on `GitHub`_. Any bug reports or feature
requests can be made using GitHub's `issues system`_. |build_status| |coverage|
Example
-------
To obtain a `Signature` object, pass the target function to the
``funcsigs.signature`` function.
.. code-block:: python
>>> from funcsigs import signature
>>> def foo(a, b=None, *args, **kwargs):
... pass
...
>>> sig = signature(foo)
>>> sig
<funcsigs.Signature object at 0x...>
>>> sig.parameters
OrderedDict([('a', <Parameter at 0x... 'a'>), ('b', <Parameter at 0x... 'b'>), ('args', <Parameter at 0x... 'args'>), ('kwargs', <Parameter at 0x... 'kwargs'>)])
>>> sig.return_annotation
<class 'funcsigs._empty'>
Introspecting callables with the Signature object
-------------------------------------------------
.. note::
This section of documentation is a direct reproduction of the Python
standard library documentation for the inspect module.
The Signature object represents the call signature of a callable object and its
return annotation. To retrieve a Signature object, use the :func:`signature`
function.
.. function:: signature(callable)
Return a :class:`Signature` object for the given ``callable``::
>>> from funcsigs import signature
>>> def foo(a, *, b:int, **kwargs):
... pass
>>> sig = signature(foo)
>>> str(sig)
'(a, *, b:int, **kwargs)'
>>> str(sig.parameters['b'])
'b:int'
>>> sig.parameters['b'].annotation
<class 'int'>
Accepts a wide range of Python callables, from plain functions and classes to
:func:`functools.partial` objects.
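For example, a brief added illustration (not part of the reproduced
standard-library text): introspecting a :func:`functools.partial` object drops
the arguments that the partial has already bound::

    >>> from functools import partial
    >>> from funcsigs import signature
    >>> def multiply(x, y, z=1):
    ...     pass
    >>> str(signature(partial(multiply, 2)))   # x is already bound to 2
    '(y, z=1)'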
.. note::
Some callables may not be introspectable in certain implementations of
Python. For example, in CPython, built-in functions defined in C provide
no metadata about their arguments.
.. class:: Signature
A Signature object represents the call signature of a function and its return
annotation. For each parameter accepted by the function it stores a
:class:`Parameter` object in its :attr:`parameters` collection.
Signature objects are *immutable*. Use :meth:`Signature.replace` to make a
modified copy.
.. attribute:: Signature.empty
A special class-level marker to specify absence of a return annotation.
.. attribute:: Signature.parameters
An ordered mapping of parameters' names to the corresponding
:class:`Parameter` objects.
.. attribute:: Signature.return_annotation
The "return" annotation for the callable. If the callable has no "return"
annotation, this attribute is set to :attr:`Signature.empty`.
.. method:: Signature.bind(*args, **kwargs)
Create a mapping from positional and keyword arguments to parameters.
Returns :class:`BoundArguments` if ``*args`` and ``**kwargs`` match the
signature, or raises a :exc:`TypeError`.
.. method:: Signature.bind_partial(*args, **kwargs)
Works the same way as :meth:`Signature.bind`, but allows the omission of
some required arguments (mimics :func:`functools.partial` behavior.)
Returns :class:`BoundArguments`, or raises a :exc:`TypeError` if the
passed arguments do not match the signature.
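As a brief added illustration (not part of the reproduced standard-library
text; exact reprs may differ between versions), :meth:`Signature.bind` requires
every parameter without a default, while :meth:`Signature.bind_partial` does
not::

    >>> from funcsigs import signature
    >>> def greet(greeting, name):
    ...     pass
    >>> sig = signature(greet)
    >>> sig.bind('hello', 'world').arguments
    OrderedDict([('greeting', 'hello'), ('name', 'world')])
    >>> sig.bind_partial('hello').arguments
    OrderedDict([('greeting', 'hello')])
    >>> try:
    ...     sig.bind('hello')          # 'name' is missing
    ... except TypeError:
    ...     print('incomplete arguments')
    incomplete arguments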
.. method:: Signature.replace(*[, parameters][, return_annotation])
Create a new Signature instance based on the instance replace was invoked
on. It is possible to pass different ``parameters`` and/or
``return_annotation`` to override the corresponding properties of the base
signature. To remove return_annotation from the copied Signature, pass in
:attr:`Signature.empty`.
::
>>> def test(a, b):
... pass
>>> sig = signature(test)
>>> new_sig = sig.replace(return_annotation="new return anno")
>>> str(new_sig)
"(a, b) -> 'new return anno'"
.. class:: Parameter
Parameter objects are *immutable*. Instead of modifying a Parameter object,
you can use :meth:`Parameter.replace` to create a modified copy.
.. attribute:: Parameter.empty
A special class-level marker to specify absence of default values and
annotations.
.. attribute:: Parameter.name
The name of the parameter as a string. Must be a valid python identifier
name (with the exception of ``POSITIONAL_ONLY`` parameters, which can have
it set to ``None``).
.. attribute:: Parameter.default
The default value for the parameter. If the parameter has no default
value, this attribute is set to :attr:`Parameter.empty`.
.. attribute:: Parameter.annotation
The annotation for the parameter. If the parameter has no annotation,
this attribute is set to :attr:`Parameter.empty`.
.. attribute:: Parameter.kind
Describes how argument values are bound to the parameter. Possible values
(accessible via :class:`Parameter`, like ``Parameter.KEYWORD_ONLY``):
+------------------------+----------------------------------------------+
| Name | Meaning |
+========================+==============================================+
| *POSITIONAL_ONLY* | Value must be supplied as a positional |
| | argument. |
| | |
| | Python has no explicit syntax for defining |
| | positional-only parameters, but many built-in|
| | and extension module functions (especially |
| | those that accept only one or two parameters)|
| | accept them. |
+------------------------+----------------------------------------------+
| *POSITIONAL_OR_KEYWORD*| Value may be supplied as either a keyword or |
| | positional argument (this is the standard |
| | binding behaviour for functions implemented |
| | in Python.) |
+------------------------+----------------------------------------------+
| *VAR_POSITIONAL* | A tuple of positional arguments that aren't |
| | bound to any other parameter. This |
| | corresponds to a ``*args`` parameter in a |
| | Python function definition. |
+------------------------+----------------------------------------------+
| *KEYWORD_ONLY* | Value must be supplied as a keyword argument.|
| | Keyword only parameters are those which |
| | appear after a ``*`` or ``*args`` entry in a |
| | Python function definition. |
+------------------------+----------------------------------------------+
| *VAR_KEYWORD* | A dict of keyword arguments that aren't bound|
| | to any other parameter. This corresponds to a|
| | ``**kwargs`` parameter in a Python function |
| | definition. |
+------------------------+----------------------------------------------+
Example: print all keyword-only arguments without default values::
>>> def foo(a, b, *, c, d=10):
... pass
>>> sig = signature(foo)
>>> for param in sig.parameters.values():
... if (param.kind == param.KEYWORD_ONLY and
... param.default is param.empty):
... print('Parameter:', param)
Parameter: c
.. method:: Parameter.replace(*[, name][, kind][, default][, annotation])
Create a new Parameter instance based on the instance :meth:`replace` was
invoked on. To override a :class:`Parameter` attribute, pass the corresponding
argument. To remove a default value and/or an annotation from a
Parameter, pass :attr:`Parameter.empty`.
::
>>> from funcsigs import Parameter
>>> param = Parameter('foo', Parameter.KEYWORD_ONLY, default=42)
>>> str(param)
'foo=42'
>>> str(param.replace()) # Will create a shallow copy of 'param'
'foo=42'
>>> str(param.replace(default=Parameter.empty, annotation='spam'))
"foo:'spam'"
.. class:: BoundArguments
Result of a :meth:`Signature.bind` or :meth:`Signature.bind_partial` call.
Holds the mapping of arguments to the function's parameters.
.. attribute:: BoundArguments.arguments
An ordered, mutable mapping (:class:`collections.OrderedDict`) of
parameters' names to arguments' values. Contains only explicitly bound
arguments. Changes in :attr:`arguments` will reflect in :attr:`args` and
:attr:`kwargs`.
Should be used in conjunction with :attr:`Signature.parameters` for any
argument processing purposes.
.. note::
Arguments for which :meth:`Signature.bind` or
:meth:`Signature.bind_partial` relied on a default value are skipped.
However, if needed, it is easy to include them.
::
>>> def foo(a, b=10):
... pass
>>> sig = signature(foo)
>>> ba = sig.bind(5)
>>> ba.args, ba.kwargs
((5,), {})
>>> for param in sig.parameters.values():
... if param.name not in ba.arguments:
... ba.arguments[param.name] = param.default
>>> ba.args, ba.kwargs
((5, 10), {})
.. attribute:: BoundArguments.args
A tuple of positional argument values. Dynamically computed from the
:attr:`arguments` attribute.
.. attribute:: BoundArguments.kwargs
A dict of keyword argument values. Dynamically computed from the
:attr:`arguments` attribute.
The :attr:`args` and :attr:`kwargs` properties can be used to invoke
functions::
def test(a, *, b):
...
sig = signature(test)
ba = sig.bind(10, b=20)
test(*ba.args, **ba.kwargs)
.. seealso::
:pep:`362` - Function Signature Object.
The detailed specification, implementation details and examples.
Copyright
---------
*funcsigs* is a derived work of CPython under the terms of the `PSF License
Agreement`_. The original CPython inspect module, its unit tests and
documentation are the copyright of the Python Software Foundation. The derived
work is distributed under the `Apache License Version 2.0`_.
.. _PSF License Agreement: http://docs.python.org/3/license.html#terms-and-conditions-for-accessing-or-otherwise-using-python
.. _Apache License Version 2.0: http://opensource.org/licenses/Apache-2.0
.. _GitHub: https://github.com/testing-cabal/funcsigs
.. _Travis CI: http://travis-ci.org/
.. _Read The Docs: http://funcsigs.readthedocs.org/
.. _PEP 362: http://www.python.org/dev/peps/pep-0362/
.. _inspect: http://docs.python.org/3/library/inspect.html#introspecting-callables-with-the-signature-object
.. _issues system: https://github.com/testing-cabal/funcsigs/issues
.. |build_status| image:: https://secure.travis-ci.org/aliles/funcsigs.png?branch=master
:target: http://travis-ci.org/#!/aliles/funcsigs
:alt: Current build status
.. |coverage| image:: https://coveralls.io/repos/aliles/funcsigs/badge.png?branch=master
:target: https://coveralls.io/r/aliles/funcsigs?branch=master
:alt: Coverage status
.. |pypi_version| image:: https://pypip.in/v/funcsigs/badge.png
:target: https://crate.io/packages/funcsigs/
:alt: Latest PyPI version | {
"source": "yandex/perforator",
"title": "contrib/python/funcsigs/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/funcsigs/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 13292
} |
# gRPC – An RPC library and framework
gRPC is a modern, open source, high-performance remote procedure call (RPC)
framework that can run anywhere. gRPC enables client and server applications to
communicate transparently, and simplifies the building of connected systems.
<table>
<tr>
<td><b>Homepage:</b></td>
<td><a href="https://grpc.io/">grpc.io</a></td>
</tr>
<tr>
<td><b>Mailing List:</b></td>
<td><a href="https://groups.google.com/forum/#!forum/grpc-io">[email protected]</a></td>
</tr>
</table>
[](https://gitter.im/grpc/grpc?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
## To start using gRPC
To maximize usability, gRPC supports the standard method for adding dependencies
to a user's chosen language (if there is one). In most languages, the gRPC
runtime comes as a package available in a user's language package manager.
For instructions on how to use the language-specific gRPC runtime for a project,
please refer to these documents
- [C++](src/cpp): follow the instructions under the `src/cpp` directory
- [C#](src/csharp): NuGet package `Grpc`
- [Dart](https://github.com/grpc/grpc-dart): pub package `grpc`
- [Go](https://github.com/grpc/grpc-go): `go get google.golang.org/grpc`
- [Java](https://github.com/grpc/grpc-java): Use JARs from Maven Central
Repository
- [Kotlin](https://github.com/grpc/grpc-kotlin): Use JARs from Maven Central
Repository
- [Node](https://github.com/grpc/grpc-node): `npm install grpc`
- [Objective-C](src/objective-c): Add `gRPC-ProtoRPC` dependency to podspec
- [PHP](src/php): `pecl install grpc`
- [Python](src/python/grpcio): `pip install grpcio`
- [Ruby](src/ruby): `gem install grpc`
- [WebJS](https://github.com/grpc/grpc-web): follow the grpc-web instructions
Per-language quickstart guides and tutorials can be found in the
[documentation section on the grpc.io website](https://grpc.io/docs/). Code
examples are available in the [examples](examples) directory.
Precompiled bleeding-edge package builds of gRPC `master` branch's `HEAD` are
uploaded daily to [packages.grpc.io](https://packages.grpc.io).
## To start developing gRPC
Contributions are welcome!
Please read [How to contribute](CONTRIBUTING.md) which will guide you through
the entire workflow of how to build the source code, how to run the tests, and
how to contribute changes to the gRPC codebase. The "How to contribute" document
also contains info on how the contribution process works and contains best
practices for creating contributions.
## Troubleshooting
Sometimes things go wrong. Please check out the
[Troubleshooting guide](TROUBLESHOOTING.md) if you are experiencing issues with
gRPC.
## Performance
See the
[Performance dashboard](https://grafana-dot-grpc-testing.appspot.com/)
for performance numbers of master branch daily builds.
## Concepts
See [gRPC Concepts](CONCEPTS.md)
## About This Repository
This repository contains source code for gRPC libraries implemented in multiple
languages written on top of a shared C core library [src/core](src/core).
Libraries in different languages may be in various states of development. We are
seeking contributions for all of these libraries:
| Language | Source |
| ----------------------- | ---------------------------------- |
| Shared C [core library] | [src/core](src/core) |
| C++ | [src/cpp](src/cpp) |
| Ruby | [src/ruby](src/ruby) |
| Python | [src/python](src/python) |
| PHP | [src/php](src/php) |
| C# (core library based) | [src/csharp](src/csharp) |
| Objective-C | [src/objective-c](src/objective-c) |
| Language | Source repo |
| -------------------- | -------------------------------------------------- |
| Java | [grpc-java](https://github.com/grpc/grpc-java) |
| Kotlin | [grpc-kotlin](https://github.com/grpc/grpc-kotlin) |
| Go | [grpc-go](https://github.com/grpc/grpc-go) |
| NodeJS | [grpc-node](https://github.com/grpc/grpc-node) |
| WebJS | [grpc-web](https://github.com/grpc/grpc-web) |
| Dart | [grpc-dart](https://github.com/grpc/grpc-dart) |
| .NET (pure C# impl.) | [grpc-dotnet](https://github.com/grpc/grpc-dotnet) |
| Swift | [grpc-swift](https://github.com/grpc/grpc-swift) | | {
"source": "yandex/perforator",
"title": "contrib/python/grpcio/py2/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/grpcio/py2/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4660
} |
gRPC Python
===========
|compat_check_pypi|
Package for gRPC Python.
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=grpcio
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=grpcio
Supported Python Versions
-------------------------
Python >= 3.7
Installation
------------
gRPC Python is available for Linux, macOS, and Windows.
Installing From PyPI
~~~~~~~~~~~~~~~~~~~~
If you are installing locally...
::
$ pip install grpcio
Else system wide (on Ubuntu)...
::
$ sudo pip install grpcio
If you're on Windows make sure that you installed the :code:`pip.exe` component
when you installed Python (if not go back and install it!) then invoke:
::
$ pip.exe install grpcio
Windows users may need to invoke :code:`pip.exe` from a command line run as
administrator.
n.b. On Windows and on Mac OS X one *must* have a recent release of :code:`pip`
to retrieve the proper wheel from PyPI. Be sure to upgrade to the latest
version!
Installing From Source
~~~~~~~~~~~~~~~~~~~~~~
Building from source requires that you have the Python headers (usually a
package named :code:`python-dev`).
::
$ export REPO_ROOT=grpc # REPO_ROOT can be any directory of your choice
$ git clone -b RELEASE_TAG_HERE https://github.com/grpc/grpc $REPO_ROOT
$ cd $REPO_ROOT
$ git submodule update --init
# For the next two commands do `sudo pip install` if you get permission-denied errors
$ pip install -rrequirements.txt
$ GRPC_PYTHON_BUILD_WITH_CYTHON=1 pip install .
You cannot currently install gRPC Python from source on Windows. Things might work
out for you in MSYS2 (follow the Linux instructions), but it isn't officially
supported at the moment.
Troubleshooting
~~~~~~~~~~~~~~~
Help, I ...
* **... see a** :code:`pkg_resources.VersionConflict` **when I try to install
grpc**
This is likely because :code:`pip` doesn't own the offending dependency,
which in turn is likely because your operating system's package manager owns
it. You'll need to force the installation of the dependency:
:code:`pip install --ignore-installed $OFFENDING_DEPENDENCY`
For example, if you get an error like the following:
::
Traceback (most recent call last):
File "<string>", line 17, in <module>
...
File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 509, in find
raise VersionConflict(dist, req)
pkg_resources.VersionConflict: (six 1.8.0 (/usr/lib/python2.7/dist-packages), Requirement.parse('six>=1.10'))
You can fix it by doing:
::
sudo pip install --ignore-installed six
* **... see the following error on some platforms**
::
/tmp/pip-build-U8pSsr/cython/Cython/Plex/Scanners.c:4:20: fatal error: Python.h: No such file or directory
#include "Python.h"
^
compilation terminated.
You can fix it by installing the `python-dev` package, e.g.
::
sudo apt-get install python-dev | {
"source": "yandex/perforator",
"title": "contrib/python/grpcio/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/grpcio/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2977
} |
# gRPC – An RPC library and framework
gRPC is a modern, open source, high-performance remote procedure call (RPC)
framework that can run anywhere. gRPC enables client and server applications to
communicate transparently, and simplifies the building of connected systems.
<table>
<tr>
<td><b>Homepage:</b></td>
<td><a href="https://grpc.io/">grpc.io</a></td>
</tr>
<tr>
<td><b>Mailing List:</b></td>
<td><a href="https://groups.google.com/forum/#!forum/grpc-io">[email protected]</a></td>
</tr>
</table>
[](https://gitter.im/grpc/grpc?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge)
## To start using gRPC
To maximize usability, gRPC supports the standard method for adding dependencies
to a user's chosen language (if there is one). In most languages, the gRPC
runtime comes as a package available in a user's language package manager.
For instructions on how to use the language-specific gRPC runtime for a project,
please refer to these documents
- [C++](src/cpp): follow the instructions under the `src/cpp` directory
- [C#/.NET](https://github.com/grpc/grpc-dotnet): NuGet packages `Grpc.Net.Client`, `Grpc.AspNetCore.Server`
- [Dart](https://github.com/grpc/grpc-dart): pub package `grpc`
- [Go](https://github.com/grpc/grpc-go): `go get google.golang.org/grpc`
- [Java](https://github.com/grpc/grpc-java): Use JARs from Maven Central
Repository
- [Kotlin](https://github.com/grpc/grpc-kotlin): Use JARs from Maven Central
Repository
- [Node](https://github.com/grpc/grpc-node): `npm install @grpc/grpc-js`
- [Objective-C](src/objective-c): Add `gRPC-ProtoRPC` dependency to podspec
- [PHP](src/php): `pecl install grpc`
- [Python](src/python/grpcio): `pip install grpcio`
- [Ruby](src/ruby): `gem install grpc`
- [WebJS](https://github.com/grpc/grpc-web): follow the grpc-web instructions
Per-language quickstart guides and tutorials can be found in the
[documentation section on the grpc.io website](https://grpc.io/docs/). Code
examples are available in the [examples](examples) directory.
Precompiled bleeding-edge package builds of gRPC `master` branch's `HEAD` are
uploaded daily to [packages.grpc.io](https://packages.grpc.io).
## To start developing gRPC
Contributions are welcome!
Please read [How to contribute](CONTRIBUTING.md) which will guide you through
the entire workflow of how to build the source code, how to run the tests, and
how to contribute changes to the gRPC codebase. The "How to contribute" document
also contains info on how the contribution process works and contains best
practices for creating contributions.
## Troubleshooting
Sometimes things go wrong. Please check out the
[Troubleshooting guide](TROUBLESHOOTING.md) if you are experiencing issues with
gRPC.
## Performance
See the
[Performance dashboard](https://grafana-dot-grpc-testing.appspot.com/)
for performance numbers of master branch daily builds.
## Concepts
See [gRPC Concepts](CONCEPTS.md)
## About This Repository
This repository contains source code for gRPC libraries implemented in multiple
languages written on top of a shared C core library [src/core](src/core).
Libraries in different languages may be in various states of development. We are
seeking contributions for all of these libraries:
| Language | Source |
| ----------------------- | ---------------------------------- |
| Shared C [core library] | [src/core](src/core) |
| C++ | [src/cpp](src/cpp) |
| Ruby | [src/ruby](src/ruby) |
| Python | [src/python](src/python) |
| PHP | [src/php](src/php) |
| C# (core library based) | [src/csharp](src/csharp) |
| Objective-C | [src/objective-c](src/objective-c) |
| Language | Source repo |
| -------------------- | -------------------------------------------------- |
| Java | [grpc-java](https://github.com/grpc/grpc-java) |
| Kotlin | [grpc-kotlin](https://github.com/grpc/grpc-kotlin) |
| Go | [grpc-go](https://github.com/grpc/grpc-go) |
| NodeJS | [grpc-node](https://github.com/grpc/grpc-node) |
| WebJS | [grpc-web](https://github.com/grpc/grpc-web) |
| Dart | [grpc-dart](https://github.com/grpc/grpc-dart) |
| .NET (pure C# impl.) | [grpc-dotnet](https://github.com/grpc/grpc-dotnet) |
| Swift | [grpc-swift](https://github.com/grpc/grpc-swift) | | {
"source": "yandex/perforator",
"title": "contrib/python/grpcio/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/grpcio/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4737
} |
gRPC Python
===========
|compat_check_pypi|
Package for gRPC Python.
.. |compat_check_pypi| image:: https://python-compatibility-tools.appspot.com/one_badge_image?package=grpcio
:target: https://python-compatibility-tools.appspot.com/one_badge_target?package=grpcio
Supported Python Versions
-------------------------
Python >= 3.7
Installation
------------
gRPC Python is available for Linux, macOS, and Windows.
Installing From PyPI
~~~~~~~~~~~~~~~~~~~~
If you are installing locally...
::
$ pip install grpcio
Else system wide (on Ubuntu)...
::
$ sudo pip install grpcio
If you're on Windows make sure that you installed the :code:`pip.exe` component
when you installed Python (if not go back and install it!) then invoke:
::
$ pip.exe install grpcio
Windows users may need to invoke :code:`pip.exe` from a command line run as
administrator.
n.b. On Windows and on Mac OS X one *must* have a recent release of :code:`pip`
to retrieve the proper wheel from PyPI. Be sure to upgrade to the latest
version!
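Once installed, a quick smoke test (not part of the upstream instructions; the
address below is only a placeholder and no server needs to be running) is to
import the package and create a channel:

.. code-block:: python

    # Verifies that the compiled extension loads and basic objects can be created.
    import grpc

    print(grpc.__version__)
    channel = grpc.insecure_channel("localhost:50051")  # placeholder target
    channel.close()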
Installing From Source
~~~~~~~~~~~~~~~~~~~~~~
Building from source requires that you have the Python headers (usually a
package named :code:`python-dev`).
::
$ export REPO_ROOT=grpc # REPO_ROOT can be any directory of your choice
$ git clone -b RELEASE_TAG_HERE https://github.com/grpc/grpc $REPO_ROOT
$ cd $REPO_ROOT
$ git submodule update --init
# For the next two commands do `sudo pip install` if you get permission-denied errors
$ pip install -rrequirements.txt
$ GRPC_PYTHON_BUILD_WITH_CYTHON=1 pip install .
You cannot currently install gRPC Python from source on Windows. Things might work
out for you in MSYS2 (follow the Linux instructions), but it isn't officially
supported at the moment.
Troubleshooting
~~~~~~~~~~~~~~~
Help, I ...
* **... see a** :code:`pkg_resources.VersionConflict` **when I try to install
grpc**
This is likely because :code:`pip` doesn't own the offending dependency,
which in turn is likely because your operating system's package manager owns
it. You'll need to force the installation of the dependency:
:code:`pip install --ignore-installed $OFFENDING_DEPENDENCY`
For example, if you get an error like the following:
::
Traceback (most recent call last):
File "<string>", line 17, in <module>
...
File "/usr/lib/python2.7/dist-packages/pkg_resources.py", line 509, in find
raise VersionConflict(dist, req)
pkg_resources.VersionConflict: (six 1.8.0 (/usr/lib/python2.7/dist-packages), Requirement.parse('six>=1.10'))
You can fix it by doing:
::
sudo pip install --ignore-installed six
* **... see the following error on some platforms**
::
/tmp/pip-build-U8pSsr/cython/Cython/Plex/Scanners.c:4:20: fatal error: Python.h: No such file or directory
#include "Python.h"
^
compilation terminated.
You can fix it by installing the `python-dev` package, e.g.
::
sudo apt-get install python-dev | {
"source": "yandex/perforator",
"title": "contrib/python/grpcio/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/grpcio/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2977
} |
=========================
``importlib_metadata``
=========================
``importlib_metadata`` is a library to access the metadata for a
Python package.
As of Python 3.8, this functionality has been added to the
`Python standard library
<https://docs.python.org/3/library/importlib.metadata.html>`_.
This package supplies backports of that functionality including
improvements added to subsequent Python versions.
Usage
=====
See the `online documentation <https://importlib_metadata.readthedocs.io/>`_
for usage details.
`Finder authors
<https://docs.python.org/3/reference/import.html#finders-and-loaders>`_ can
also add support for custom package installers. See the above documentation
for details.
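As a minimal sketch (not part of the upstream README; ``pip`` is just an assumed
example of an installed distribution), the core API looks like this:

.. code-block:: python

    from importlib_metadata import version, metadata, PackageNotFoundError

    try:
        print(version("pip"))              # the distribution's version string
        print(metadata("pip")["Summary"])  # one of its metadata fields
    except PackageNotFoundError:
        print("pip is not installed in this environment")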
Caveats
=======
This project primarily supports third-party packages installed by PyPA
tools (or other conforming packages). It does not support:
- Packages in the stdlib.
- Packages installed without metadata.
Project details
===============
* Project home: https://github.com/python/importlib_metadata
* Report bugs at: https://github.com/python/importlib_metadata/issues
* Code hosting: https://github.com/python/importlib_metadata
* Documentation: https://importlib_metadata.readthedocs.io/ | {
"source": "yandex/perforator",
"title": "contrib/python/importlib-metadata/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/importlib-metadata/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1218
} |
.. image:: https://img.shields.io/pypi/v/importlib_metadata.svg
:target: https://pypi.org/project/importlib_metadata
.. image:: https://img.shields.io/pypi/pyversions/importlib_metadata.svg
.. image:: https://github.com/python/importlib_metadata/actions/workflows/main.yml/badge.svg
:target: https://github.com/python/importlib_metadata/actions?query=workflow%3A%22tests%22
:alt: tests
.. image:: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v2.json
:target: https://github.com/astral-sh/ruff
:alt: Ruff
.. image:: https://readthedocs.org/projects/importlib-metadata/badge/?version=latest
:target: https://importlib-metadata.readthedocs.io/en/latest/?badge=latest
.. image:: https://img.shields.io/badge/skeleton-2024-informational
:target: https://blog.jaraco.com/skeleton
.. image:: https://tidelift.com/badges/package/pypi/importlib-metadata
:target: https://tidelift.com/subscription/pkg/pypi-importlib-metadata?utm_source=pypi-importlib-metadata&utm_medium=readme
Library to access the metadata for a Python package.
This package supplies third-party access to the functionality of
`importlib.metadata <https://docs.python.org/3/library/importlib.metadata.html>`_
including improvements added to subsequent Python versions.
Compatibility
=============
New features are introduced in this third-party library and later merged
into CPython. The following table indicates which versions of this library
were contributed to different versions in the standard library:
.. list-table::
:header-rows: 1
* - importlib_metadata
- stdlib
* - 7.0
- 3.13
* - 6.5
- 3.12
* - 4.13
- 3.11
* - 4.6
- 3.10
* - 1.4
- 3.8
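A common consumption pattern (an added sketch, not from the upstream README) is
to prefer the standard-library module when it is new enough and fall back to
this backport otherwise:

.. code-block:: python

    import sys

    if sys.version_info >= (3, 8):
        from importlib.metadata import version, PackageNotFoundError
    else:
        from importlib_metadata import version, PackageNotFoundError

    try:
        print(version("some-distribution"))  # hypothetical distribution name
    except PackageNotFoundError:
        print("not installed")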
Usage
=====
See the `online documentation <https://importlib-metadata.readthedocs.io/>`_
for usage details.
`Finder authors
<https://docs.python.org/3/reference/import.html#finders-and-loaders>`_ can
also add support for custom package installers. See the above documentation
for details.
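As a brief added illustration (not part of the upstream README; ``wheel`` is
just an assumed example of an installed distribution, and selecting entry
points by ``group`` assumes a reasonably recent release):

.. code-block:: python

    from importlib_metadata import version, metadata, entry_points

    print(version("wheel"))              # its version string
    print(metadata("wheel")["Summary"])  # one of its metadata fields

    # Iterate over console-script entry points declared by installed packages.
    for ep in entry_points(group="console_scripts"):
        print(ep.name, "->", ep.value)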
Caveats
=======
This project primarily supports third-party packages installed by PyPA
tools (or other conforming packages). It does not support:
- Packages in the stdlib.
- Packages installed without metadata.
Project details
===============
* Project home: https://github.com/python/importlib_metadata
* Report bugs at: https://github.com/python/importlib_metadata/issues
* Code hosting: https://github.com/python/importlib_metadata
* Documentation: https://importlib-metadata.readthedocs.io/
For Enterprise
==============
Available as part of the Tidelift Subscription.
This project and the maintainers of thousands of other packages are working with Tidelift to deliver one enterprise subscription that covers all of the open source you use.
`Learn more <https://tidelift.com/subscription/pkg/pypi-importlib-metadata?utm_source=pypi-importlib-metadata&utm_medium=referral&utm_campaign=github>`_. | {
"source": "yandex/perforator",
"title": "contrib/python/importlib-metadata/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/importlib-metadata/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2967
} |
IPython `pdb`
=============
.. image:: https://travis-ci.org/gotcha/ipdb.png?branch=master
:target: https://travis-ci.org/gotcha/ipdb
.. image:: https://codecov.io/gh/gotcha/ipdb/branch/master/graphs/badge.svg?style=flat
:target: https://codecov.io/gh/gotcha/ipdb?branch=master
Use
---
ipdb exports functions to access the IPython_ debugger, which features
tab completion, syntax highlighting, better tracebacks, better introspection
with the same interface as the `pdb` module.
Example usage:
.. code-block:: python
import ipdb
ipdb.set_trace()
ipdb.set_trace(context=5) # will show five lines of code
# instead of the default three lines
# or you can set it via IPDB_CONTEXT_SIZE env variable
# or setup.cfg file
ipdb.pm()
ipdb.run('x[0] = 3')
result = ipdb.runcall(function, arg0, arg1, kwarg='foo')
result = ipdb.runeval('f(1,2) - 3')
Arguments for `set_trace`
+++++++++++++++++++++++++
The `set_trace` function accepts `context`, which sets how many lines of code are shown,
and `cond`, which accepts a boolean expression (such as `abc == 17`) and starts ipdb's
interface only when `cond` evaluates to `True`.
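For instance, a small added sketch (the loop and threshold are made up for
illustration; it assumes an ipdb release that supports the `cond` argument
described above):

.. code-block:: python

    import ipdb

    for i in range(100):
        value = i * 7
        # Drops into the debugger only once the condition becomes True.
        ipdb.set_trace(cond=value > 600)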
Using configuration file
++++++++++++++++++++++++
It's possible to set up context using a `.ipdb` file in your home folder, or a `setup.cfg`
or `pyproject.toml` in your project folder. You can also set the config file location via the
env var `$IPDB_CONFIG`. The environment variable has priority over the home
configuration file, which in turn has priority over the setup config file.
Currently, only context setting is available.
A valid setup.cfg is as follows
::
[ipdb]
context=5
A valid .ipdb is as follows
::
context=5
A valid pyproject.toml is as follows
::
[tool.ipdb]
context=5
The post-mortem function, ``ipdb.pm()``, is equivalent to the magic function
``%debug``.
.. _IPython: http://ipython.org
If you install ``ipdb`` with a tool which supports ``setuptools`` entry points,
an ``ipdb`` script is made for you. You can use it to debug your python 2 scripts like
::
$ bin/ipdb mymodule.py
And for python 3
::
$ bin/ipdb3 mymodule.py
Alternatively with Python 2.7 only, you can also use
::
$ python -m ipdb mymodule.py
You can also enclose code with the ``with`` statement to launch ipdb if an exception is raised:
.. code-block:: python
from ipdb import launch_ipdb_on_exception
with launch_ipdb_on_exception():
[...]
.. warning::
Context managers were introduced in Python 2.5.
Adding a context manager implies dropping Python 2.4 support.
Use ``ipdb==0.6`` with 2.4.
Or you can use ``iex`` as a function decorator to launch ipdb if an exception is raised:
.. code-block:: python
from ipdb import iex
@iex
def main():
[...]
.. warning::
Using ``from future import print_function`` for Python 3 compat implies dropping Python 2.5 support.
Use ``ipdb<=0.8`` with 2.5.
Issues with ``stdout``
----------------------
Some tools, like ``nose``, fiddle with ``stdout``.
Until ``ipdb==0.9.4``, we tried to guess when we should also
fiddle with ``stdout`` to support those tools.
However, all strategies tried until 0.9.4 have proven brittle.
If you use ``nose`` or another tool that fiddles with ``stdout``, you should
explicitly ask for ``stdout`` fiddling by using ``ipdb`` like this
.. code-block:: python
import ipdb
ipdb.sset_trace()
ipdb.spm()
from ipdb import slaunch_ipdb_on_exception
with slaunch_ipdb_on_exception():
[...]
Development
-----------
``ipdb`` source code and tracker are at https://github.com/gotcha/ipdb.
Pull requests should take care of updating the changelog ``HISTORY.txt``.
Under the unreleased section, add your changes and your username.
Manual testing
++++++++++++++
To test your changes, make use of ``manual_test.py``. Create a virtual environment,
install IPython and run ``python manual_test.py`` and check if your changes are in effect.
If possible, create automated tests for better behaviour control.
Automated testing
+++++++++++++++++
To run automated tests locally, create a virtual environment, install `coverage`
and run `coverage run setup.py test`.
Third-party support
-------------------
Products.PDBDebugMode
+++++++++++++++++++++
Zope2 Products.PDBDebugMode_ uses ``ipdb``, if available, in place of ``pdb``.
.. _Products.PDBDebugMode: http://pypi.python.org/pypi/Products.PDBDebugMode
iw.debug
++++++++
iw.debug_ allows you to trigger an ``ipdb`` debugger on any published object
of a Zope2 application.
.. _iw.debug: http://pypi.python.org/pypi/iw.debug
ipdbplugin
++++++++++
ipdbplugin_ is a nose_ test runner plugin that also uses the IPython debugger
instead of ``pdb``. (It does not depend on ``ipdb`` anymore).
.. _ipdbplugin: http://pypi.python.org/pypi/ipdbplugin
.. _nose: http://readthedocs.org/docs/nose | {
"source": "yandex/perforator",
"title": "contrib/python/ipdb/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipdb/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5098
} |
IPython `pdb`
=============
.. image:: https://travis-ci.org/gotcha/ipdb.png?branch=master
:target: https://travis-ci.org/gotcha/ipdb
.. image:: https://codecov.io/gh/gotcha/ipdb/branch/master/graphs/badge.svg?style=flat
:target: https://codecov.io/gh/gotcha/ipdb?branch=master
Use
---
ipdb exports functions to access the IPython_ debugger, which features
tab completion, syntax highlighting, better tracebacks, better introspection
with the same interface as the `pdb` module.
Example usage:
.. code-block:: python
import ipdb
ipdb.set_trace()
ipdb.set_trace(context=5) # will show five lines of code
# instead of the default three lines
# or you can set it via IPDB_CONTEXT_SIZE env variable
# or setup.cfg file
ipdb.pm()
ipdb.run('x[0] = 3')
result = ipdb.runcall(function, arg0, arg1, kwarg='foo')
result = ipdb.runeval('f(1,2) - 3')
Arguments for `set_trace`
+++++++++++++++++++++++++
The `set_trace` function accepts `context`, which sets how many lines of code are shown,
and `cond`, which accepts a boolean expression (such as `abc == 17`) and starts ipdb's
interface only when `cond` evaluates to `True`.
Using configuration file
++++++++++++++++++++++++
It's possible to set up context using a `.ipdb` file in your home folder, or a `setup.cfg`
or `pyproject.toml` in your project folder. You can also set the config file location via the
env var `$IPDB_CONFIG`. The environment variable has priority over the home
configuration file, which in turn has priority over the setup config file.
Currently, only context setting is available.
A valid setup.cfg is as follows
::
[ipdb]
context=5
A valid .ipdb is as follows
::
context=5
A valid pyproject.toml is as follows
::
[tool.ipdb]
context=5
The post-mortem function, ``ipdb.pm()``, is equivalent to the magic function
``%debug``.
.. _IPython: http://ipython.org
If you install ``ipdb`` with a tool which supports ``setuptools`` entry points,
an ``ipdb`` script is made for you. You can use it to debug your python 2 scripts like
::
$ bin/ipdb mymodule.py
And for python 3
::
$ bin/ipdb3 mymodule.py
Alternatively with Python 2.7 only, you can also use
::
$ python -m ipdb mymodule.py
You can also enclose code with the ``with`` statement to launch ipdb if an exception is raised:
.. code-block:: python
from ipdb import launch_ipdb_on_exception
with launch_ipdb_on_exception():
[...]
.. warning::
Context managers were introduced in Python 2.5.
Adding a context manager implies dropping Python 2.4 support.
Use ``ipdb==0.6`` with 2.4.
Or you can use ``iex`` as a function decorator to launch ipdb if an exception is raised:
.. code-block:: python
from ipdb import iex
@iex
def main():
[...]
.. warning::
Using ``from future import print_function`` for Python 3 compat implies dropping Python 2.5 support.
Use ``ipdb<=0.8`` with 2.5.
Issues with ``stdout``
----------------------
Some tools, like ``nose``, fiddle with ``stdout``.
Until ``ipdb==0.9.4``, we tried to guess when we should also
fiddle with ``stdout`` to support those tools.
However, all strategies tried until 0.9.4 have proven brittle.
If you use ``nose`` or another tool that fiddles with ``stdout``, you should
explicitly ask for ``stdout`` fiddling by using ``ipdb`` like this
.. code-block:: python
import ipdb
ipdb.sset_trace()
ipdb.spm()
from ipdb import slaunch_ipdb_on_exception
with slaunch_ipdb_on_exception():
[...]
Development
-----------
``ipdb`` source code and tracker are at https://github.com/gotcha/ipdb.
Pull requests should take care of updating the changelog ``HISTORY.txt``.
Under the unreleased section, add your changes and your username.
Manual testing
++++++++++++++
To test your changes, make use of ``manual_test.py``. Create a virtual environment,
install IPython and run ``python manual_test.py`` and check if your changes are in effect.
If possible, create automated tests for better behaviour control.
Automated testing
+++++++++++++++++
To run automated tests locally, create a virtual environment, install `coverage`
and run `coverage run setup.py test`.
Third-party support
-------------------
Products.PDBDebugMode
+++++++++++++++++++++
Zope2 Products.PDBDebugMode_ uses ``ipdb``, if available, in place of ``pdb``.
.. _Products.PDBDebugMode: http://pypi.python.org/pypi/Products.PDBDebugMode
iw.debug
++++++++
iw.debug_ allows you to trigger an ``ipdb`` debugger on any published object
of a Zope2 application.
.. _iw.debug: http://pypi.python.org/pypi/iw.debug
ipdbplugin
++++++++++
ipdbplugin_ is a nose_ test runner plugin that also uses the IPython debugger
instead of ``pdb``. (It does not depend on ``ipdb`` anymore).
.. _ipdbplugin: http://pypi.python.org/pypi/ipdbplugin
.. _nose: http://readthedocs.org/docs/nose | {
"source": "yandex/perforator",
"title": "contrib/python/ipdb/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipdb/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5098
} |
# Licensing terms
This project is licensed under the terms of the Modified BSD License
(also known as New or Revised or 3-Clause BSD), as follows:
- Copyright (c) 2001-, IPython Development Team
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of the IPython Development Team nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
## About the IPython Development Team
The IPython Development Team is the set of all contributors to the IPython project.
This includes all of the IPython subprojects.
The core team that coordinates development on GitHub can be found here:
https://github.com/jupyter/.
## Our Copyright Policy
IPython uses a shared copyright model. Each contributor maintains copyright
over their contributions to IPython. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the IPython
source code, in its entirety is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire IPython
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the IPython repositories.
With this in mind, the following banner should be used in any source code file
to indicate the copyright and license terms:
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License. | {
"source": "yandex/perforator",
"title": "contrib/python/ipython-genutils/py2/COPYING.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython-genutils/py2/COPYING.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2835
} |
# IPython vestigial utilities
This package shouldn't exist.
It contains some common utilities shared by Jupyter and IPython projects during The Big Split™.
As soon as possible, those packages will remove their dependency on this,
and this repo will go away.
No functionality should be added to this repository,
and no packages outside IPython/Jupyter should depend on it. | {
"source": "yandex/perforator",
"title": "contrib/python/ipython-genutils/py2/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython-genutils/py2/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 373
} |
# Licensing terms
This project is licensed under the terms of the Modified BSD License
(also known as New or Revised or 3-Clause BSD), as follows:
- Copyright (c) 2001-, IPython Development Team
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of the IPython Development Team nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
## About the IPython Development Team
The IPython Development Team is the set of all contributors to the IPython project.
This includes all of the IPython subprojects.
The core team that coordinates development on GitHub can be found here:
https://github.com/jupyter/.
## Our Copyright Policy
IPython uses a shared copyright model. Each contributor maintains copyright
over their contributions to IPython. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the IPython
source code, in its entirety is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire IPython
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the IPython repositories.
With this in mind, the following banner should be used in any source code file
to indicate the copyright and license terms:
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License. | {
"source": "yandex/perforator",
"title": "contrib/python/ipython-genutils/py3/COPYING.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython-genutils/py3/COPYING.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2835
} |
# IPython vestigial utilities
This package shouldn't exist.
It contains some common utilities shared by Jupyter and IPython projects during The Big Split™.
As soon as possible, those packages will remove their dependency on this,
and this repo will go away.
No functionality should be added to this repository,
and no packages outside IPython/Jupyter should depend on it. | {
"source": "yandex/perforator",
"title": "contrib/python/ipython-genutils/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython-genutils/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 373
} |
=============================
The IPython licensing terms
=============================
IPython is licensed under the terms of the Modified BSD License (also known as
New or Revised or 3-Clause BSD), as follows:
- Copyright (c) 2008-2014, IPython Development Team
- Copyright (c) 2001-2007, Fernando Perez <[email protected]>
- Copyright (c) 2001, Janko Hauser <[email protected]>
- Copyright (c) 2001, Nathaniel Gray <[email protected]>
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of the IPython Development Team nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
About the IPython Development Team
----------------------------------
Fernando Perez began IPython in 2001 based on code from Janko Hauser
<[email protected]> and Nathaniel Gray <[email protected]>. Fernando is still
the project lead.
The IPython Development Team is the set of all contributors to the IPython
project. This includes all of the IPython subprojects. A full list with
details is kept in the documentation directory, in the file
``about/credits.txt``.
The core team that coordinates development on GitHub can be found here:
https://github.com/ipython/.
Our Copyright Policy
--------------------
IPython uses a shared copyright model. Each contributor maintains copyright
over their contributions to IPython. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the IPython
source code, in its entirety is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire IPython
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the IPython repositories.
With this in mind, the following banner should be used in any source code file
to indicate the copyright and license terms:
::
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License. | {
"source": "yandex/perforator",
"title": "contrib/python/ipython/py2/COPYING.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython/py2/COPYING.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 3414
} |
.. image:: https://codecov.io/github/ipython/ipython/coverage.svg?branch=master
:target: https://codecov.io/github/ipython/ipython?branch=master
.. image:: https://img.shields.io/pypi/dm/IPython.svg
:target: https://pypi.python.org/pypi/ipython
.. image:: https://img.shields.io/pypi/v/IPython.svg
:target: https://pypi.python.org/pypi/ipython
.. image:: https://img.shields.io/travis/ipython/ipython.svg
:target: https://travis-ci.org/ipython/ipython
===========================================
IPython: Productive Interactive Computing
===========================================
Overview
========
Welcome to IPython. Our full documentation is available on `ipython.readthedocs.io
<https://ipython.readthedocs.io/en/stable/>`_ and contains information on how to install, use, and
contribute to the project.
Officially, IPython requires Python version 2.7, or 3.3 and above.
IPython 1.x is the last IPython version to support Python 2.6 and 3.2.
The Notebook, Qt console and a number of other pieces are now parts of *Jupyter*.
See the `Jupyter installation docs <http://jupyter.readthedocs.io/en/latest/install.html>`__
if you want to use these.
Development and Instant running
================================
You can find the latest version of the development documentation on `readthedocs
<http://ipython.readthedocs.io/en/latest/>`_.
You can run IPython from this directory without even installing it system-wide
by typing at the terminal::
$ python -m IPython
Or see the `development installation docs
<http://ipython.readthedocs.io/en/latest/install/install.html#installing-the-development-version>`_
for the latest revision on read the docs.
Documentation and installation instructions for older versions of IPython can be
found on the `IPython website <http://ipython.org/documentation.html>`_ | {
"source": "yandex/perforator",
"title": "contrib/python/ipython/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1870
} |
=============================
The IPython licensing terms
=============================
IPython is licensed under the terms of the Modified BSD License (also known as
New or Revised or 3-Clause BSD). See the LICENSE file.
About the IPython Development Team
----------------------------------
Fernando Perez began IPython in 2001 based on code from Janko Hauser
<[email protected]> and Nathaniel Gray <[email protected]>. Fernando is still
the project lead.
The IPython Development Team is the set of all contributors to the IPython
project. This includes all of the IPython subprojects.
The core team that coordinates development on GitHub can be found here:
https://github.com/ipython/.
Our Copyright Policy
--------------------
IPython uses a shared copyright model. Each contributor maintains copyright
over their contributions to IPython. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the IPython
source code, in its entirety, is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire IPython
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the IPython repositories.
With this in mind, the following banner should be used in any source code file
to indicate the copyright and license terms:
::
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License. | {
"source": "yandex/perforator",
"title": "contrib/python/ipython/py3/COPYING.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython/py3/COPYING.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1638
} |
.. image:: https://codecov.io/github/ipython/ipython/coverage.svg?branch=main
:target: https://codecov.io/github/ipython/ipython?branch=main
.. image:: https://img.shields.io/pypi/v/IPython.svg
:target: https://pypi.python.org/pypi/ipython
.. image:: https://github.com/ipython/ipython/actions/workflows/test.yml/badge.svg
:target: https://github.com/ipython/ipython/actions/workflows/test.yml
.. image:: https://www.codetriage.com/ipython/ipython/badges/users.svg
:target: https://www.codetriage.com/ipython/ipython/
.. image:: https://raster.shields.io/badge/Follows-SPEC--0000-brightgreen.png
:target: https://scientific-python.org/specs/spec-0000/
.. image:: https://tidelift.com/badges/package/pypi/ipython?style=flat
:target: https://tidelift.com/subscription/pkg/pypi-ipython
===========================================
IPython: Productive Interactive Computing
===========================================
Overview
========
Welcome to IPython. Our full documentation is available on `ipython.readthedocs.io
<https://ipython.readthedocs.io/en/stable/>`_ and contains information on how to install, use, and
contribute to the project.
IPython (Interactive Python) is a command shell for interactive computing in multiple programming languages, originally developed for the Python programming language, that offers introspection, rich media, shell syntax, tab completion, and history.
**IPython versions and Python Support**
Starting after IPython 8.16, we will progressively transition to `Spec-0000 <https://scientific-python.org/specs/spec-0000/>`_.
Starting with IPython 7.10, IPython follows `NEP 29 <https://numpy.org/neps/nep-0029-deprecation_policy.html>`_
**IPython 7.17+** requires Python version 3.7 and above.
**IPython 7.10+** requires Python version 3.6 and above.
**IPython 7.0** requires Python version 3.5 and above.
**IPython 6.x** requires Python version 3.3 and above.
**IPython 5.x LTS** is the compatible release for Python 2.7.
If you require Python 2 support, you **must** use IPython 5.x LTS. Please
update your project configurations and requirements as necessary.
The Notebook, Qt console and a number of other pieces are now parts of *Jupyter*.
See the `Jupyter installation docs <https://jupyter.readthedocs.io/en/latest/install.html>`__
if you want to use these.
Main features of IPython
========================
- Comprehensive object introspection.
- Input history, persistent across sessions.
- Caching of output results during a session with automatically generated references.
- Extensible tab completion, with support by default for completion of python variables and keywords, filenames and function keywords.
- Extensible system of ‘magic’ commands for controlling the environment and performing many tasks related to IPython or the operating system.
- A rich configuration system with easy switching between different setups (simpler than changing $PYTHONSTARTUP environment variables every time).
- Session logging and reloading.
- Extensible syntax processing for special purpose situations.
- Access to the system shell with user-extensible alias system.
- Easily embeddable in other Python programs and GUIs (see the sketch below).
- Integrated access to the pdb debugger and the Python profiler.
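As a minimal sketch of the embeddability point above (the example function is illustrative, not part of IPython's documentation), ``IPython.embed()`` opens an interactive shell from inside a running program::

    import IPython

    def inspect_point(x):
        doubled = x * 2
        # Drops into an IPython prompt with access to the local variables;
        # exiting the shell resumes normal execution.
        IPython.embed()
        return doubled

    if __name__ == "__main__":
        print(inspect_point(21))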
Development and Instant running
===============================
You can find the latest version of the development documentation on `readthedocs
<https://ipython.readthedocs.io/en/latest/>`_.
You can run IPython from this directory without even installing it system-wide
by typing at the terminal::
$ python -m IPython
Or see the `development installation docs
<https://ipython.readthedocs.io/en/latest/install/install.html#installing-the-development-version>`_
for the latest revision on read the docs.
Documentation and installation instructions for older versions of IPython can be
found on the `IPython website <https://ipython.org/documentation.html>`_
Alternatives to IPython
=======================
IPython may not be to your taste; if that's the case there might be similar
projects that you might want to use:
- The classic Python REPL.
- `bpython <https://bpython-interpreter.org/>`_
- `mypython <https://www.asmeurer.com/mypython/>`_
- `ptpython and ptipython <https://pypi.org/project/ptpython/>`_
- `Xonsh <https://xon.sh/>`_
Ignoring commits with git blame.ignoreRevsFile
==============================================
As of git 2.23, it is possible to make formatting changes without breaking
``git blame``. See the `git documentation
<https://git-scm.com/docs/git-config#Documentation/git-config.txt-blameignoreRevsFile>`_
for more details.
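The helper scripts below boil down to a single git setting; a minimal sketch, assuming the ignore file is named ``.git-blame-ignore-revs``::

    # Tell git blame to skip the revisions listed in the ignore file.
    git config blame.ignoreRevsFile .git-blame-ignore-revs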
To use this feature you must:
- Install git >= 2.23
- Configure your local git repo by running:
- POSIX: ``tools/configure-git-blame-ignore-revs.sh``
- Windows: ``tools\configure-git-blame-ignore-revs.bat`` | {
"source": "yandex/perforator",
"title": "contrib/python/ipython/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ipython/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4841
} |
.. image:: https://img.shields.io/pypi/v/jaraco.functools.svg
:target: https://pypi.org/project/jaraco.functools
.. image:: https://img.shields.io/pypi/pyversions/jaraco.functools.svg
.. image:: https://img.shields.io/travis/jaraco/jaraco.functools/master.svg
:target: https://travis-ci.org/jaraco/jaraco.functools
.. .. image:: https://img.shields.io/appveyor/ci/jaraco/jaraco-functools/master.svg
.. :target: https://ci.appveyor.com/project/jaraco-functools/skeleton/branch/master
.. image:: https://readthedocs.org/projects/jaracofunctools/badge/?version=latest
:target: https://jaracofunctools.readthedocs.io/en/latest/?badge=latest
Additional functools in the spirit of stdlib's functools. | {
"source": "yandex/perforator",
"title": "contrib/python/jaraco.functools/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/jaraco.functools/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 711
} |
.. image:: https://img.shields.io/pypi/v/jaraco.functools.svg
:target: https://pypi.org/project/jaraco.functools
.. image:: https://img.shields.io/pypi/pyversions/jaraco.functools.svg
.. image:: https://github.com/jaraco/jaraco.functools/actions/workflows/main.yml/badge.svg
:target: https://github.com/jaraco/jaraco.functools/actions?query=workflow%3A%22tests%22
:alt: tests
.. image:: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v2.json
:target: https://github.com/astral-sh/ruff
:alt: Ruff
.. image:: https://readthedocs.org/projects/jaracofunctools/badge/?version=latest
:target: https://jaracofunctools.readthedocs.io/en/latest/?badge=latest
.. image:: https://img.shields.io/badge/skeleton-2024-informational
:target: https://blog.jaraco.com/skeleton
.. image:: https://tidelift.com/badges/package/pypi/jaraco.functools
:target: https://tidelift.com/subscription/pkg/pypi-jaraco.functools?utm_source=pypi-jaraco.functools&utm_medium=readme
Additional functools in the spirit of stdlib's functools.
For Enterprise
==============
Available as part of the Tidelift Subscription.
This project and the maintainers of thousands of other packages are working with Tidelift to deliver one enterprise subscription that covers all of the open source you use.
`Learn more <https://tidelift.com/subscription/pkg/pypi-jaraco.functools?utm_source=pypi-jaraco.functools&utm_medium=referral&utm_campaign=github>`_. | {
"source": "yandex/perforator",
"title": "contrib/python/jaraco.functools/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/jaraco.functools/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1504
} |
###################################################################
Jedi - an awesome autocompletion/static analysis library for Python
###################################################################
.. image:: https://img.shields.io/pypi/v/jedi.svg?style=flat
:target: https://pypi.python.org/pypi/jedi
:alt: PyPI version
.. image:: https://img.shields.io/pypi/pyversions/jedi.svg
:target: https://pypi.python.org/pypi/jedi
:alt: Supported Python versions
.. image:: https://travis-ci.org/davidhalter/jedi.svg?branch=master
:target: https://travis-ci.org/davidhalter/jedi
:alt: Linux Tests
.. image:: https://ci.appveyor.com/api/projects/status/mgva3bbawyma1new/branch/master?svg=true
:target: https://ci.appveyor.com/project/davidhalter/jedi/branch/master
:alt: Windows Tests
.. image:: https://coveralls.io/repos/davidhalter/jedi/badge.svg?branch=master
:target: https://coveralls.io/r/davidhalter/jedi
:alt: Coverage status
*If you have specific questions, please add an issue or ask on* `Stack Overflow
<https://stackoverflow.com/questions/tagged/python-jedi>`_ *with the label* ``python-jedi``.
Jedi is a static analysis tool for Python that can be used in IDEs/editors. Its
historic focus is autocompletion, but it now does static analysis as well.
Jedi is fast and is very well tested. It understands Python on a deeper level
than all other static analysis frameworks for Python.
Jedi has support for two different goto functions. It's possible to search for
related names and to list all names in a Python file and infer them. Jedi
understands docstrings and you can use Jedi autocompletion in your REPL as
well.
Jedi uses a very simple API to connect with IDEs. There's a reference
implementation as a `VIM-Plugin <https://github.com/davidhalter/jedi-vim>`_,
which uses Jedi's autocompletion. We encourage you to use Jedi in your IDEs.
It's really easy.
Jedi can currently be used with the following editors/projects:
- Vim (jedi-vim_, YouCompleteMe_, deoplete-jedi_, completor.vim_)
- Emacs (Jedi.el_, company-mode_, elpy_, anaconda-mode_, ycmd_)
- Sublime Text (SublimeJEDI_ [ST2 + ST3], anaconda_ [only ST3])
- TextMate_ (Not sure if it's actually working)
- Kate_ version 4.13+ supports it natively, you have to enable it, though. [`proof
<https://projects.kde.org/projects/kde/applications/kate/repository/show?rev=KDE%2F4.13>`_]
- Atom_ (autocomplete-python-jedi_)
- `GNOME Builder`_ (with support for GObject Introspection)
- `Visual Studio Code`_ (via `Python Extension <https://marketplace.visualstudio.com/items?itemName=ms-python.python>`_)
- Gedit (gedi_)
- wdb_ - Web Debugger
- `Eric IDE`_ (Available as a plugin)
- `IPython 6.0.0+ <https://ipython.readthedocs.io/en/stable/whatsnew/version6.html>`_
and many more!
Here are some pictures taken from jedi-vim_:
.. image:: https://github.com/davidhalter/jedi/raw/master/docs/_screenshots/screenshot_complete.png
Completion for almost anything (Ctrl+Space).
.. image:: https://github.com/davidhalter/jedi/raw/master/docs/_screenshots/screenshot_function.png
Display of function/class bodies, docstrings.
.. image:: https://github.com/davidhalter/jedi/raw/master/docs/_screenshots/screenshot_pydoc.png
Pydoc support (Shift+k).
There is also support for goto and renaming.
Get the latest version from `github <https://github.com/davidhalter/jedi>`_
(master branch should always be kind of stable/working).
Docs are available at `https://jedi.readthedocs.org/en/latest/
<https://jedi.readthedocs.org/en/latest/>`_. Pull requests with documentation
enhancements and/or fixes are awesome and most welcome. Jedi uses `semantic
versioning <https://semver.org/>`_.
If you want to stay up-to-date (News / RFCs), please subscribe to this `github
thread <https://github.com/davidhalter/jedi/issues/1063>`_.
Installation
============
pip install jedi
Note: This just installs the Jedi library, not the editor plugins. For
information about how to make it work with your editor, refer to the
corresponding documentation.
You don't want to use ``pip``? Please refer to the `manual
<https://jedi.readthedocs.org/en/latest/docs/installation.html>`_.
Feature Support and Caveats
===========================
Jedi really understands your Python code. For a comprehensive list of what Jedi
understands, see: `Features
<https://jedi.readthedocs.org/en/latest/docs/features.html>`_. A list of
caveats can be found on the same page.
You can run Jedi on CPython 2.7 or 3.4+ but it should also
understand/parse code older than those versions. Additionally you should be able
to use `Virtualenvs <https://jedi.readthedocs.org/en/latest/docs/api.html#environments>`_
very well.
Tips on how to use Jedi efficiently can be found `here
<https://jedi.readthedocs.org/en/latest/docs/features.html#recipes>`_.
API
---
You can find the documentation for the `API here <https://jedi.readthedocs.org/en/latest/docs/api.html>`_.
Autocompletion / Goto / Pydoc
-----------------------------
Please check the API for a good explanation. There are the following commands:
- ``jedi.Script.goto_assignments``
- ``jedi.Script.completions``
- ``jedi.Script.usages``
The returned objects are very powerful and really all you might need.
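A minimal sketch of that call style, assuming the classic ``jedi.Script(source, line, column)`` signature shipped with this version (the snippet and expected output are illustrative)::

    import jedi

    source = "import json\njson.lo"
    # Completions at line 2, column 7 -- the cursor sits right after "json.lo".
    script = jedi.Script(source, 2, 7)
    completions = script.completions()
    print([c.name for c in completions])  # e.g. ['load', 'loads']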
Autocompletion in your REPL (IPython, etc.)
-------------------------------------------
Starting with IPython `6.0.0` Jedi is a dependency of IPython. Autocompletion
in IPython is therefore possible without additional configuration.
It's possible to have Jedi autocompletion in REPL modes - `example video <https://vimeo.com/122332037>`_.
This means that in Python you can enable tab completion in a `REPL
<https://jedi.readthedocs.org/en/latest/docs/usage.html#tab-completion-in-the-python-shell>`_.
Static Analysis / Linter
------------------------
To do all forms of static analysis, please try to use ``jedi.names``. It will
return a list of names that you can use to infer types and so on.
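For example, a rough sketch of pulling module-level names out of a snippet (the snippet and the output shown are illustrative only)::

    import jedi

    source = '''
    import os

    def walk_tree(root):
        return sorted(os.walk(root))
    '''
    # Module-level definitions in the snippet; each entry can be inspected further.
    for definition in jedi.names(source):
        print(definition.name, definition.type)
    # e.g.  os module
    #       walk_tree function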
Linting is another thing that is going to be part of Jedi. For now you can try
an alpha version ``python -m jedi linter``. The API might change though and
it's still buggy. It's Jedi's goal to be smarter than classic linters and
understand ``AttributeError`` and other code issues.
Refactoring
-----------
Jedi's parser would support refactoring, but there's no API to use it right
now. If you're interested in helping out here, let me know. With the latest
parser changes, it should be very easy to actually make it work.
Development
===========
There's a pretty good and extensive `development documentation
<https://jedi.readthedocs.org/en/latest/docs/development.html>`_.
Testing
=======
The test suite depends on ``tox`` and ``pytest``::
pip install tox pytest
To run the tests for all supported Python versions::
tox
If you want to test only a specific Python version (e.g. Python 2.7), it's as
easy as ::
tox -e py27
Tests are also run automatically on `Travis CI
<https://travis-ci.org/davidhalter/jedi/>`_.
For more detailed information visit the `testing documentation
<https://jedi.readthedocs.org/en/latest/docs/testing.html>`_.
Acknowledgements
================
- Takafumi Arakaki (@tkf) for creating a solid test environment and a lot of
other things.
- Danilo Bargen (@dbrgn) for general housekeeping and being a good friend :).
- Guido van Rossum (@gvanrossum) for creating the parser generator pgen2
(originally used in lib2to3).
.. _jedi-vim: https://github.com/davidhalter/jedi-vim
.. _youcompleteme: https://valloric.github.io/YouCompleteMe/
.. _deoplete-jedi: https://github.com/zchee/deoplete-jedi
.. _completor.vim: https://github.com/maralla/completor.vim
.. _Jedi.el: https://github.com/tkf/emacs-jedi
.. _company-mode: https://github.com/syohex/emacs-company-jedi
.. _elpy: https://github.com/jorgenschaefer/elpy
.. _anaconda-mode: https://github.com/proofit404/anaconda-mode
.. _ycmd: https://github.com/abingham/emacs-ycmd
.. _sublimejedi: https://github.com/srusskih/SublimeJEDI
.. _anaconda: https://github.com/DamnWidget/anaconda
.. _wdb: https://github.com/Kozea/wdb
.. _TextMate: https://github.com/lawrenceakka/python-jedi.tmbundle
.. _Kate: https://kate-editor.org
.. _Atom: https://atom.io/
.. _autocomplete-python-jedi: https://atom.io/packages/autocomplete-python-jedi
.. _GNOME Builder: https://wiki.gnome.org/Apps/Builder
.. _Visual Studio Code: https://code.visualstudio.com/
.. _gedi: https://github.com/isamert/gedi
.. _Eric IDE: https://eric-ide.python-projects.org | {
"source": "yandex/perforator",
"title": "contrib/python/jedi/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/jedi/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 8506
} |
###################################################################
Jedi - an awesome autocompletion/static analysis library for Python
###################################################################
.. image:: https://img.shields.io/pypi/v/jedi.svg?style=flat
:target: https://pypi.python.org/pypi/jedi
:alt: PyPI version
.. image:: https://img.shields.io/pypi/pyversions/jedi.svg
:target: https://pypi.python.org/pypi/jedi
:alt: Supported Python versions
.. image:: https://travis-ci.org/davidhalter/jedi.svg?branch=master
:target: https://travis-ci.org/davidhalter/jedi
:alt: Linux Tests
.. image:: https://ci.appveyor.com/api/projects/status/mgva3bbawyma1new/branch/master?svg=true
:target: https://ci.appveyor.com/project/davidhalter/jedi/branch/master
:alt: Windows Tests
.. image:: https://coveralls.io/repos/davidhalter/jedi/badge.svg?branch=master
:target: https://coveralls.io/r/davidhalter/jedi
:alt: Coverage status
*If you have specific questions, please add an issue or ask on* `Stack Overflow
<https://stackoverflow.com/questions/tagged/python-jedi>`_ *with the label* ``python-jedi``.
Jedi is a static analysis tool for Python that can be used in IDEs/editors. Its
historic focus is autocompletion, but it now does static analysis as well.
Jedi is fast and is very well tested. It understands Python on a deeper level
than all other static analysis frameworks for Python.
Jedi has support for two different goto functions. It's possible to search for
related names and to list all names in a Python file and infer them. Jedi
understands docstrings and you can use Jedi autocompletion in your REPL as
well.
Jedi uses a very simple API to connect with IDEs. There's a reference
implementation as a `VIM-Plugin <https://github.com/davidhalter/jedi-vim>`_,
which uses Jedi's autocompletion. We encourage you to use Jedi in your IDEs.
It's really easy.
Jedi can currently be used with the following editors/projects:
- Vim (jedi-vim_, YouCompleteMe_, deoplete-jedi_, completor.vim_)
- Emacs (Jedi.el_, company-mode_, elpy_, anaconda-mode_, ycmd_)
- Sublime Text (SublimeJEDI_ [ST2 + ST3], anaconda_ [only ST3])
- TextMate_ (Not sure if it's actually working)
- Kate_ version 4.13+ supports it natively, you have to enable it, though. [`proof
<https://projects.kde.org/projects/kde/applications/kate/repository/show?rev=KDE%2F4.13>`_]
- Atom_ (autocomplete-python-jedi_)
- `GNOME Builder`_ (with support for GObject Introspection)
- `Visual Studio Code`_ (via `Python Extension <https://marketplace.visualstudio.com/items?itemName=ms-python.python>`_)
- Gedit (gedi_)
- wdb_ - Web Debugger
- `Eric IDE`_ (Available as a plugin)
- `IPython 6.0.0+ <https://ipython.readthedocs.io/en/stable/whatsnew/version6.html>`_
and many more!
Here are some pictures taken from jedi-vim_:
.. image:: https://github.com/davidhalter/jedi/raw/master/docs/_screenshots/screenshot_complete.png
Completion for almost anything (Ctrl+Space).
.. image:: https://github.com/davidhalter/jedi/raw/master/docs/_screenshots/screenshot_function.png
Display of function/class bodies, docstrings.
.. image:: https://github.com/davidhalter/jedi/raw/master/docs/_screenshots/screenshot_pydoc.png
Pydoc support (Shift+k).
There is also support for goto and renaming.
Get the latest version from `github <https://github.com/davidhalter/jedi>`_
(master branch should always be kind of stable/working).
Docs are available at `https://jedi.readthedocs.org/en/latest/
<https://jedi.readthedocs.org/en/latest/>`_. Pull requests with documentation
enhancements and/or fixes are awesome and most welcome. Jedi uses `semantic
versioning <https://semver.org/>`_.
If you want to stay up-to-date (News / RFCs), please subscribe to this `github
thread <https://github.com/davidhalter/jedi/issues/1063>`_.
Installation
============
pip install jedi
Note: This just installs the Jedi library, not the editor plugins. For
information about how to make it work with your editor, refer to the
corresponding documentation.
You don't want to use ``pip``? Please refer to the `manual
<https://jedi.readthedocs.org/en/latest/docs/installation.html>`_.
Feature Support and Caveats
===========================
Jedi really understands your Python code. For a comprehensive list of what Jedi
understands, see: `Features
<https://jedi.readthedocs.org/en/latest/docs/features.html>`_. A list of
caveats can be found on the same page.
You can run Jedi on CPython 2.7 or 3.4+ but it should also
understand/parse code older than those versions. Additionally you should be able
to use `Virtualenvs <https://jedi.readthedocs.org/en/latest/docs/api.html#environments>`_
very well.
Tips on how to use Jedi efficiently can be found `here
<https://jedi.readthedocs.org/en/latest/docs/features.html#recipes>`_.
API
---
You can find the documentation for the `API here <https://jedi.readthedocs.org/en/latest/docs/api.html>`_.
Autocompletion / Goto / Pydoc
-----------------------------
Please check the API for a good explanation. There are the following commands:
- ``jedi.Script.goto_assignments``
- ``jedi.Script.completions``
- ``jedi.Script.usages``
The returned objects are very powerful and really all you might need.
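For instance, a rough sketch of the goto call with the same classic ``jedi.Script(source, line, column)`` signature (the snippet and output are illustrative)::

    import jedi

    source = "answer = 42\nprint(answer)"
    # Cursor on "answer" inside print() (line 2, column 7): where was it assigned?
    script = jedi.Script(source, 2, 7)
    for definition in script.goto_assignments():
        print(definition.name, definition.line)  # e.g.  answer 1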
Autocompletion in your REPL (IPython, etc.)
-------------------------------------------
Starting with IPython `6.0.0` Jedi is a dependency of IPython. Autocompletion
in IPython is therefore possible without additional configuration.
It's possible to have Jedi autocompletion in REPL modes - `example video <https://vimeo.com/122332037>`_.
This means that in Python you can enable tab completion in a `REPL
<https://jedi.readthedocs.org/en/latest/docs/usage.html#tab-completion-in-the-python-shell>`_.
Static Analysis / Linter
------------------------
To do all forms of static analysis, please try to use ``jedi.names``. It will
return a list of names that you can use to infer types and so on.
Linting is another thing that is going to be part of Jedi. For now you can try
an alpha version ``python -m jedi linter``. The API might change though and
it's still buggy. It's Jedi's goal to be smarter than classic linters and
understand ``AttributeError`` and other code issues.
Refactoring
-----------
Jedi's parser would support refactoring, but there's no API to use it right
now. If you're interested in helping out here, let me know. With the latest
parser changes, it should be very easy to actually make it work.
Development
===========
There's a pretty good and extensive `development documentation
<https://jedi.readthedocs.org/en/latest/docs/development.html>`_.
Testing
=======
The test suite depends on ``tox`` and ``pytest``::
pip install tox pytest
To run the tests for all supported Python versions::
tox
If you want to test only a specific Python version (e.g. Python 2.7), it's as
easy as ::
tox -e py27
Tests are also run automatically on `Travis CI
<https://travis-ci.org/davidhalter/jedi/>`_.
For more detailed information visit the `testing documentation
<https://jedi.readthedocs.org/en/latest/docs/testing.html>`_.
Acknowledgements
================
- Takafumi Arakaki (@tkf) for creating a solid test environment and a lot of
other things.
- Danilo Bargen (@dbrgn) for general housekeeping and being a good friend :).
- Guido van Rossum (@gvanrossum) for creating the parser generator pgen2
(originally used in lib2to3).
.. _jedi-vim: https://github.com/davidhalter/jedi-vim
.. _youcompleteme: https://valloric.github.io/YouCompleteMe/
.. _deoplete-jedi: https://github.com/zchee/deoplete-jedi
.. _completor.vim: https://github.com/maralla/completor.vim
.. _Jedi.el: https://github.com/tkf/emacs-jedi
.. _company-mode: https://github.com/syohex/emacs-company-jedi
.. _elpy: https://github.com/jorgenschaefer/elpy
.. _anaconda-mode: https://github.com/proofit404/anaconda-mode
.. _ycmd: https://github.com/abingham/emacs-ycmd
.. _sublimejedi: https://github.com/srusskih/SublimeJEDI
.. _anaconda: https://github.com/DamnWidget/anaconda
.. _wdb: https://github.com/Kozea/wdb
.. _TextMate: https://github.com/lawrenceakka/python-jedi.tmbundle
.. _Kate: https://kate-editor.org
.. _Atom: https://atom.io/
.. _autocomplete-python-jedi: https://atom.io/packages/autocomplete-python-jedi
.. _GNOME Builder: https://wiki.gnome.org/Apps/Builder
.. _Visual Studio Code: https://code.visualstudio.com/
.. _gedi: https://github.com/isamert/gedi
.. _Eric IDE: https://eric-ide.python-projects.org | {
"source": "yandex/perforator",
"title": "contrib/python/jedi/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/jedi/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 8506
} |
==============
More Itertools
==============
.. image:: https://coveralls.io/repos/github/erikrose/more-itertools/badge.svg?branch=master
:target: https://coveralls.io/github/erikrose/more-itertools?branch=master
Python's ``itertools`` library is a gem - you can compose elegant solutions
for a variety of problems with the functions it provides. In ``more-itertools``
we collect additional building blocks, recipes, and routines for working with
Python iterables.
----
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Grouping | `chunked <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.chunked>`_, |
| | `sliced <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sliced>`_, |
| | `distribute <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.distribute>`_, |
| | `divide <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.divide>`_, |
| | `split_at <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_at>`_, |
| | `split_before <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_before>`_, |
| | `split_after <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_after>`_, |
| | `bucket <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.bucket>`_, |
| | `grouper <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.grouper>`_, |
| | `partition <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.partition>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Lookahead and lookback | `spy <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.spy>`_, |
| | `peekable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.peekable>`_, |
| | `seekable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.seekable>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Windowing | `windowed <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.windowed>`_, |
| | `stagger <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.stagger>`_, |
| | `pairwise <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.pairwise>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Augmenting | `count_cycle <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.count_cycle>`_, |
| | `intersperse <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.intersperse>`_, |
| | `padded <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.padded>`_, |
| | `adjacent <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.adjacent>`_, |
| | `groupby_transform <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.groupby_transform>`_, |
| | `padnone <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.padnone>`_, |
| | `ncycles <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.ncycles>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Combining | `collapse <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.collapse>`_, |
| | `sort_together <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sort_together>`_, |
| | `interleave <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.interleave>`_, |
| | `interleave_longest <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.interleave_longest>`_, |
| | `collate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.collate>`_, |
| | `zip_offset <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.zip_offset>`_, |
| | `dotproduct <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.dotproduct>`_, |
| | `flatten <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.flatten>`_, |
| | `roundrobin <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.roundrobin>`_, |
| | `prepend <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.prepend>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Summarizing | `ilen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.ilen>`_, |
| | `first <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.first>`_, |
| | `last <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.last>`_, |
| | `one <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.one>`_, |
| | `unique_to_each <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_to_each>`_, |
| | `locate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.locate>`_, |
| | `rlocate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.rlocate>`_, |
| | `consecutive_groups <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.consecutive_groups>`_, |
| | `exactly_n <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.exactly_n>`_, |
| | `run_length <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.run_length>`_, |
| | `map_reduce <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.map_reduce>`_, |
| | `all_equal <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.all_equal>`_, |
| | `first_true <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.first_true>`_, |
| | `nth <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth>`_, |
| | `quantify <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.quantify>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Selecting | `islice_extended <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.islice_extended>`_, |
| | `strip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.strip>`_, |
| | `lstrip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.lstrip>`_, |
| | `rstrip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.rstrip>`_, |
| | `take <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.take>`_, |
| | `tail <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.tail>`_, |
|                        | `unique_everseen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_everseen>`_,                                                                                          |
| | `unique_justseen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_justseen>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Combinatorics | `distinct_permutations <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.distinct_permutations>`_, |
| | `circular_shifts <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.circular_shifts>`_, |
| | `powerset <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.powerset>`_, |
| | `random_product <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_product>`_, |
| | `random_permutation <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_permutation>`_, |
| | `random_combination <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_combination>`_, |
| | `random_combination_with_replacement <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_combination_with_replacement>`_, |
| | `nth_combination <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_combination>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Wrapping | `always_iterable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.always_iterable>`_, |
| | `consumer <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.consumer>`_, |
| | `with_iter <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.with_iter>`_, |
| | `iter_except <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iter_except>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Others | `replace <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.replace>`_, |
| | `numeric_range <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.numeric_range>`_, |
| | `always_reversible <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.always_reversible>`_, |
| | `side_effect <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.side_effect>`_, |
| | `iterate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iterate>`_, |
| | `difference <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.difference>`_, |
| | `make_decorator <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.make_decorator>`_, |
| | `SequenceView <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.SequenceView>`_, |
| | `consume <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.consume>`_, |
| | `accumulate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.accumulate>`_, |
| | `tabulate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.tabulate>`_, |
| | `repeatfunc <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.repeatfunc>`_ |
+------------------------+----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
Getting started
===============
To get started, install the library with `pip <https://pip.pypa.io/en/stable/>`_:
.. code-block:: shell
pip install more-itertools
The recipes from the `itertools docs <https://docs.python.org/3/library/itertools.html#itertools-recipes>`_
are included in the top-level package:
.. code-block:: python
>>> from more_itertools import flatten
>>> iterable = [(0, 1), (2, 3)]
>>> list(flatten(iterable))
[0, 1, 2, 3]
Several new recipes are available as well:
.. code-block:: python
>>> from more_itertools import chunked
>>> iterable = [0, 1, 2, 3, 4, 5, 6, 7, 8]
>>> list(chunked(iterable, 3))
[[0, 1, 2], [3, 4, 5], [6, 7, 8]]
>>> from more_itertools import spy
>>> iterable = (x * x for x in range(1, 6))
>>> head, iterable = spy(iterable, n=3)
>>> list(head)
[1, 4, 9]
>>> list(iterable)
[1, 4, 9, 16, 25]
For the full listing of functions, see the `API documentation <https://more-itertools.readthedocs.io/en/latest/api.html>`_.
Development
===========
``more-itertools`` is maintained by `@erikrose <https://github.com/erikrose>`_
and `@bbayles <https://github.com/bbayles>`_, with help from `many others <https://github.com/erikrose/more-itertools/graphs/contributors>`_.
If you have a problem or suggestion, please file a bug or pull request in this
repository. Thanks for contributing! | {
"source": "yandex/perforator",
"title": "contrib/python/more-itertools/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/more-itertools/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 23902
} |
==============
More Itertools
==============
.. image:: https://readthedocs.org/projects/more-itertools/badge/?version=latest
:target: https://more-itertools.readthedocs.io/en/stable/
Python's ``itertools`` library is a gem - you can compose elegant solutions
for a variety of problems with the functions it provides. In ``more-itertools``
we collect additional building blocks, recipes, and routines for working with
Python iterables.
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Grouping | `chunked <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.chunked>`_, |
| | `ichunked <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.ichunked>`_, |
| | `chunked_even <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.chunked_even>`_, |
| | `sliced <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sliced>`_, |
| | `constrained_batches <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.constrained_batches>`_, |
| | `distribute <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.distribute>`_, |
| | `divide <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.divide>`_, |
| | `split_at <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_at>`_, |
| | `split_before <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_before>`_, |
| | `split_after <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_after>`_, |
| | `split_into <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_into>`_, |
| | `split_when <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.split_when>`_, |
| | `bucket <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.bucket>`_, |
| | `unzip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unzip>`_, |
| | `batched <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.batched>`_, |
| | `grouper <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.grouper>`_, |
| | `partition <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.partition>`_, |
| | `transpose <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.transpose>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Lookahead and lookback | `spy <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.spy>`_, |
| | `peekable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.peekable>`_, |
| | `seekable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.seekable>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Windowing | `windowed <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.windowed>`_, |
| | `substrings <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.substrings>`_, |
| | `substrings_indexes <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.substrings_indexes>`_, |
| | `stagger <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.stagger>`_, |
| | `windowed_complete <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.windowed_complete>`_, |
| | `pairwise <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.pairwise>`_, |
| | `triplewise <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.triplewise>`_, |
| | `sliding_window <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sliding_window>`_, |
| | `subslices <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.subslices>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Augmenting | `count_cycle <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.count_cycle>`_, |
| | `intersperse <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.intersperse>`_, |
| | `padded <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.padded>`_, |
| | `repeat_each <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.repeat_each>`_, |
| | `mark_ends <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.mark_ends>`_, |
| | `repeat_last <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.repeat_last>`_, |
| | `adjacent <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.adjacent>`_, |
| | `groupby_transform <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.groupby_transform>`_, |
| | `pad_none <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.pad_none>`_, |
| | `ncycles <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.ncycles>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Combining | `collapse <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.collapse>`_, |
| | `sort_together <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sort_together>`_, |
| | `interleave <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.interleave>`_, |
| | `interleave_longest <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.interleave_longest>`_, |
| | `interleave_evenly <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.interleave_evenly>`_, |
| | `zip_offset <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.zip_offset>`_, |
| | `zip_equal <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.zip_equal>`_, |
| | `zip_broadcast <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.zip_broadcast>`_, |
| | `flatten <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.flatten>`_, |
| | `roundrobin <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.roundrobin>`_, |
| | `prepend <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.prepend>`_, |
| | `value_chain <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.value_chain>`_, |
| | `partial_product <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.partial_product>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Summarizing | `ilen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.ilen>`_, |
| | `unique_to_each <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_to_each>`_, |
| | `sample <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sample>`_, |
| | `consecutive_groups <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.consecutive_groups>`_, |
| | `run_length <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.run_length>`_, |
| | `map_reduce <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.map_reduce>`_, |
| | `join_mappings <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.join_mappings>`_, |
| | `exactly_n <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.exactly_n>`_, |
| | `is_sorted <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.is_sorted>`_, |
| | `all_equal <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.all_equal>`_, |
| | `all_unique <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.all_unique>`_, |
| | `minmax <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.minmax>`_, |
| | `first_true <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.first_true>`_, |
| | `quantify <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.quantify>`_, |
| | `iequals <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iequals>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Selecting | `islice_extended <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.islice_extended>`_, |
| | `first <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.first>`_, |
| | `last <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.last>`_, |
| | `one <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.one>`_, |
| | `only <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.only>`_, |
| | `strictly_n <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.strictly_n>`_, |
| | `strip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.strip>`_, |
| | `lstrip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.lstrip>`_, |
| | `rstrip <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.rstrip>`_, |
| | `filter_except <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.filter_except>`_, |
| | `map_except <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.map_except>`_, |
| | `filter_map <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.filter_map>`_, |
| | `iter_suppress <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iter_suppress>`_, |
| | `nth_or_last <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_or_last>`_, |
| | `unique_in_window <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_in_window>`_, |
| | `before_and_after <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.before_and_after>`_, |
| | `nth <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth>`_, |
| | `take <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.take>`_, |
| | `tail <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.tail>`_, |
| | `unique_everseen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_everseen>`_, |
| | `unique_justseen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique_justseen>`_, |
| | `unique <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.unique>`_, |
| | `duplicates_everseen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.duplicates_everseen>`_, |
| | `duplicates_justseen <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.duplicates_justseen>`_, |
| | `classify_unique <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.classify_unique>`_, |
| | `longest_common_prefix <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.longest_common_prefix>`_, |
| | `takewhile_inclusive <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.takewhile_inclusive>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Math | `dft <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.dft>`_, |
| | `idft <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.idft>`_, |
| | `convolve <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.convolve>`_, |
| | `dotproduct <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.dotproduct>`_, |
| | `factor <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.factor>`_, |
| | `is_prime <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.is_prime>`_, |
| | `nth_prime <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_prime>`_, |
| | `matmul <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.matmul>`_, |
| | `polynomial_from_roots <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.polynomial_from_roots>`_, |
| | `polynomial_derivative <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.polynomial_derivative>`_, |
| | `polynomial_eval <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.polynomial_eval>`_, |
| | `sieve <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sieve>`_, |
| | `sum_of_squares <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.sum_of_squares>`_, |
| | `totient <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.totient>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Combinatorics | `distinct_permutations <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.distinct_permutations>`_, |
| | `distinct_combinations <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.distinct_combinations>`_, |
| | `circular_shifts <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.circular_shifts>`_, |
| | `partitions <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.partitions>`_, |
| | `set_partitions <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.set_partitions>`_, |
| | `product_index <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.product_index>`_, |
| | `combination_index <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.combination_index>`_, |
| | `permutation_index <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.permutation_index>`_, |
| | `combination_with_replacement_index <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.combination_with_replacement_index>`_, |
| | `gray_product <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.gray_product>`_, |
| | `outer_product <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.outer_product>`_, |
| | `powerset <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.powerset>`_, |
| | `powerset_of_sets <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.powerset_of_sets>`_, |
| | `random_product <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_product>`_, |
| | `random_permutation <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_permutation>`_, |
| | `random_combination <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_combination>`_, |
| | `random_combination_with_replacement <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.random_combination_with_replacement>`_, |
| | `nth_product <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_product>`_, |
| | `nth_permutation <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_permutation>`_, |
| | `nth_combination <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_combination>`_, |
| | `nth_combination_with_replacement <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.nth_combination_with_replacement>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Wrapping | `always_iterable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.always_iterable>`_, |
| | `always_reversible <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.always_reversible>`_, |
| | `countable <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.countable>`_, |
| | `consumer <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.consumer>`_, |
| | `with_iter <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.with_iter>`_, |
| | `iter_except <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iter_except>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Others | `locate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.locate>`_, |
| | `rlocate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.rlocate>`_, |
| | `replace <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.replace>`_, |
| | `numeric_range <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.numeric_range>`_, |
| | `side_effect <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.side_effect>`_, |
| | `iterate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iterate>`_, |
| | `loops <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.loops>`_, |
| | `difference <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.difference>`_, |
| | `make_decorator <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.make_decorator>`_, |
| | `SequenceView <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.SequenceView>`_, |
| | `time_limited <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.time_limited>`_, |
| | `map_if <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.map_if>`_, |
| | `iter_index <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.iter_index>`_, |
| | `consume <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.consume>`_, |
| | `tabulate <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.tabulate>`_, |
| | `repeatfunc <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.repeatfunc>`_, |
| | `reshape <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.reshape>`_ |
| | `doublestarmap <https://more-itertools.readthedocs.io/en/stable/api.html#more_itertools.doublestarmap>`_ |
+------------------------+-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
Getting started
===============
To get started, install the library with `pip <https://pip.pypa.io/en/stable/>`_:
.. code-block:: shell
pip install more-itertools
The recipes from the `itertools docs <https://docs.python.org/3/library/itertools.html#itertools-recipes>`_
are included in the top-level package:
.. code-block:: python
>>> from more_itertools import flatten
>>> iterable = [(0, 1), (2, 3)]
>>> list(flatten(iterable))
[0, 1, 2, 3]
Several new recipes are available as well:
.. code-block:: python
>>> from more_itertools import chunked
>>> iterable = [0, 1, 2, 3, 4, 5, 6, 7, 8]
>>> list(chunked(iterable, 3))
[[0, 1, 2], [3, 4, 5], [6, 7, 8]]
>>> from more_itertools import spy
>>> iterable = (x * x for x in range(1, 6))
>>> head, iterable = spy(iterable, n=3)
>>> list(head)
[1, 4, 9]
>>> list(iterable)
[1, 4, 9, 16, 25]
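Two more of the functions listed in the table above, shown as a brief illustrative sketch (not part of the original examples; outputs reflect the documented behaviour):
.. code-block:: python
>>> from more_itertools import powerset, sieve
>>> list(powerset([1, 2]))
[(), (1,), (2,), (1, 2)]
>>> list(sieve(10))
[2, 3, 5, 7]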
For the full listing of functions, see the `API documentation <https://more-itertools.readthedocs.io/en/stable/api.html>`_.
Links elsewhere
===============
Blog posts about ``more-itertools``:
* `Yo, I heard you like decorators <https://www.bbayles.com/index/decorator_factory>`__
* `Tour of Python Itertools <https://martinheinz.dev/blog/16>`__ (`Alternate <https://dev.to/martinheinz/tour-of-python-itertools-4122>`__)
* `Real-World Python More Itertools <https://python.plainenglish.io/real-world-more-itertools-gideons-blog-a3901c607550>`_
Development
===========
``more-itertools`` is maintained by `@erikrose <https://github.com/erikrose>`_
and `@bbayles <https://github.com/bbayles>`_, with help from `many others <https://github.com/more-itertools/more-itertools/graphs/contributors>`_.
If you have a problem or suggestion, please file a bug or pull request in this
repository. Thanks for contributing!
Version History
===============
The version history can be found in `documentation <https://more-itertools.readthedocs.io/en/stable/versions.html>`_. | {
"source": "yandex/perforator",
"title": "contrib/python/more-itertools/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/more-itertools/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 35795
} |
packaging
=========
.. start-intro
Reusable core utilities for various Python Packaging
`interoperability specifications <https://packaging.python.org/specifications/>`_.
This library provides utilities that implement the interoperability
specifications which have clearly one correct behaviour (eg: :pep:`440`)
or benefit greatly from having a single shared implementation (eg: :pep:`425`).
.. end-intro
The ``packaging`` project includes the following: version handling, specifiers,
markers, requirements, tags, utilities.
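As a brief, hedged sketch of those pieces in use (illustrative only; see the documentation below for the authoritative API)::
>>> from packaging.version import Version
>>> from packaging.specifiers import SpecifierSet
>>> from packaging.requirements import Requirement
>>> Version("1.0.1") in SpecifierSet(">=1.0,<2.0")
True
>>> Requirement("requests>=2.8.1; python_version > '2.7'").name
'requests'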
Documentation
-------------
The `documentation`_ provides information and the API for the following:
- Version Handling
- Specifiers
- Markers
- Requirements
- Tags
- Utilities
Installation
------------
Use ``pip`` to install these utilities::
pip install packaging
Discussion
----------
If you run into bugs, you can file them in our `issue tracker`_.
You can also join ``#pypa`` on Freenode to ask questions or get involved.
.. _`documentation`: https://packaging.pypa.io/
.. _`issue tracker`: https://github.com/pypa/packaging/issues
Code of Conduct
---------------
Everyone interacting in the packaging project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PSF Code of Conduct`_.
.. _PSF Code of Conduct: https://github.com/pypa/.github/blob/main/CODE_OF_CONDUCT.md
Contributing
------------
The ``CONTRIBUTING.rst`` file outlines how to contribute to this project as
well as how to report a potential security issue. The documentation for this
project also covers information about `project development`_ and `security`_.
.. _`project development`: https://packaging.pypa.io/en/latest/development/
.. _`security`: https://packaging.pypa.io/en/latest/security/
Project History
---------------
Please review the ``CHANGELOG.rst`` file or the `Changelog documentation`_ for
recent changes and project history.
.. _`Changelog documentation`: https://packaging.pypa.io/en/latest/changelog/ | {
"source": "yandex/perforator",
"title": "contrib/python/packaging/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/packaging/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1984
} |
packaging
=========
.. start-intro
Reusable core utilities for various Python Packaging
`interoperability specifications <https://packaging.python.org/specifications/>`_.
This library provides utilities that implement the interoperability
specifications which have clearly one correct behaviour (eg: :pep:`440`)
or benefit greatly from having a single shared implementation (eg: :pep:`425`).
.. end-intro
The ``packaging`` project includes the following: version handling, specifiers,
markers, requirements, tags, utilities.
Documentation
-------------
The `documentation`_ provides information and the API for the following:
- Version Handling
- Specifiers
- Markers
- Requirements
- Tags
- Utilities
Installation
------------
Use ``pip`` to install these utilities::
pip install packaging
The ``packaging`` library uses calendar-based versioning (``YY.N``).
Discussion
----------
If you run into bugs, you can file them in our `issue tracker`_.
You can also join ``#pypa`` on Freenode to ask questions or get involved.
.. _`documentation`: https://packaging.pypa.io/
.. _`issue tracker`: https://github.com/pypa/packaging/issues
Code of Conduct
---------------
Everyone interacting in the packaging project's codebases, issue trackers, chat
rooms, and mailing lists is expected to follow the `PSF Code of Conduct`_.
.. _PSF Code of Conduct: https://github.com/pypa/.github/blob/main/CODE_OF_CONDUCT.md
Contributing
------------
The ``CONTRIBUTING.rst`` file outlines how to contribute to this project as
well as how to report a potential security issue. The documentation for this
project also covers information about `project development`_ and `security`_.
.. _`project development`: https://packaging.pypa.io/en/latest/development/
.. _`security`: https://packaging.pypa.io/en/latest/security/
Project History
---------------
Please review the ``CHANGELOG.rst`` file or the `Changelog documentation`_ for
recent changes and project history.
.. _`Changelog documentation`: https://packaging.pypa.io/en/latest/changelog/ | {
"source": "yandex/perforator",
"title": "contrib/python/packaging/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/packaging/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2054
} |
Parameterized testing with any Python test framework
====================================================
.. image:: https://img.shields.io/pypi/v/parameterized.svg
:alt: PyPI
:target: https://pypi.org/project/parameterized/
.. image:: https://circleci.com/gh/wolever/parameterized.svg?style=svg
:alt: Circle CI
:target: https://circleci.com/gh/wolever/parameterized
Parameterized testing in Python sucks.
``parameterized`` fixes that. For everything. Parameterized testing for nose,
parameterized testing for py.test, parameterized testing for unittest.
.. code:: python
# test_math.py
from nose.tools import assert_equal
from parameterized import parameterized, parameterized_class
import unittest
import math
@parameterized([
(2, 2, 4),
(2, 3, 8),
(1, 9, 1),
(0, 9, 0),
])
def test_pow(base, exponent, expected):
assert_equal(math.pow(base, exponent), expected)
class TestMathUnitTest(unittest.TestCase):
@parameterized.expand([
("negative", -1.5, -2.0),
("integer", 1, 1.0),
("large fraction", 1.6, 1),
])
def test_floor(self, name, input, expected):
assert_equal(math.floor(input), expected)
@parameterized_class(('a', 'b', 'expected_sum', 'expected_product'), [
(1, 2, 3, 2),
(5, 5, 10, 25),
])
class TestMathClass(unittest.TestCase):
def test_add(self):
assert_equal(self.a + self.b, self.expected_sum)
def test_multiply(self):
assert_equal(self.a * self.b, self.expected_product)
@parameterized_class([
{ "a": 3, "expected": 2 },
{ "b": 5, "expected": -4 },
])
class TestMathClassDict(unittest.TestCase):
a = 1
b = 1
def test_subtract(self):
assert_equal(self.a - self.b, self.expected)
With nose (and nose2)::
$ nosetests -v test_math.py
test_floor_0_negative (test_math.TestMathUnitTest) ... ok
test_floor_1_integer (test_math.TestMathUnitTest) ... ok
test_floor_2_large_fraction (test_math.TestMathUnitTest) ... ok
test_math.test_pow(2, 2, 4, {}) ... ok
test_math.test_pow(2, 3, 8, {}) ... ok
test_math.test_pow(1, 9, 1, {}) ... ok
test_math.test_pow(0, 9, 0, {}) ... ok
test_add (test_math.TestMathClass_0) ... ok
test_multiply (test_math.TestMathClass_0) ... ok
test_add (test_math.TestMathClass_1) ... ok
test_multiply (test_math.TestMathClass_1) ... ok
test_subtract (test_math.TestMathClassDict_0) ... ok
----------------------------------------------------------------------
Ran 12 tests in 0.015s
OK
As the package name suggests, nose is best supported and will be used for all
further examples.
With py.test (version 2.0 and above)::
$ py.test -v test_math.py
============================= test session starts ==============================
platform darwin -- Python 3.6.1, pytest-3.1.3, py-1.4.34, pluggy-0.4.0
collecting ... collected 13 items
test_math.py::test_pow::[0] PASSED
test_math.py::test_pow::[1] PASSED
test_math.py::test_pow::[2] PASSED
test_math.py::test_pow::[3] PASSED
test_math.py::TestMathUnitTest::test_floor_0_negative PASSED
test_math.py::TestMathUnitTest::test_floor_1_integer PASSED
test_math.py::TestMathUnitTest::test_floor_2_large_fraction PASSED
test_math.py::TestMathClass_0::test_add PASSED
test_math.py::TestMathClass_0::test_multiply PASSED
test_math.py::TestMathClass_1::test_add PASSED
test_math.py::TestMathClass_1::test_multiply PASSED
test_math.py::TestMathClassDict_0::test_subtract PASSED
==================== 12 passed, 4 warnings in 0.16 seconds =====================
With unittest (and unittest2)::
$ python -m unittest -v test_math
test_floor_0_negative (test_math.TestMathUnitTest) ... ok
test_floor_1_integer (test_math.TestMathUnitTest) ... ok
test_floor_2_large_fraction (test_math.TestMathUnitTest) ... ok
test_add (test_math.TestMathClass_0) ... ok
test_multiply (test_math.TestMathClass_0) ... ok
test_add (test_math.TestMathClass_1) ... ok
test_multiply (test_math.TestMathClass_1) ... ok
test_subtract (test_math.TestMathClassDict_0) ... ok
----------------------------------------------------------------------
Ran 8 tests in 0.001s
OK
(note: because unittest does not support test decorators, only tests created
with ``@parameterized.expand`` will be executed)
With green::
$ green test_math.py -vvv
test_math
TestMathClass_1
. test_method_a
. test_method_b
TestMathClass_2
. test_method_a
. test_method_b
TestMathClass_3
. test_method_a
. test_method_b
TestMathUnitTest
. test_floor_0_negative
. test_floor_1_integer
. test_floor_2_large_fraction
TestMathClass_0
. test_add
. test_multiply
TestMathClass_1
. test_add
. test_multiply
TestMathClassDict_0
. test_subtract
Ran 12 tests in 0.121s
OK (passes=9)
Installation
------------
::
$ pip install parameterized
Compatibility
-------------
`Yes`__ (mostly).
__ https://travis-ci.org/wolever/parameterized
.. list-table::
:header-rows: 1
:stub-columns: 1
* -
- Py2.6
- Py2.7
- Py3.4
- Py3.5
- Py3.6
- Py3.7
- Py3.8
- Py3.9
- PyPy
- ``@mock.patch``
* - nose
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
* - nose2
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
* - py.test 2
- yes
- yes
- no*
- no*
- no*
- no*
- yes
- yes
- yes
- yes
* - py.test 3
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
* - py.test 4
- no**
- no**
- no**
- no**
- no**
- no**
- no**
- no**
- no**
- no**
* - py.test fixtures
- no†
- no†
- no†
- no†
- no†
- no†
- no†
- no†
- no†
- no†
* - | unittest
| (``@parameterized.expand``)
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
* - | unittest2
| (``@parameterized.expand``)
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
- yes
\*: py.test 2 `does not appear to work (#71)`__ under Python 3. Please comment on the related issues if you are affected.
\*\*: py.test 4 is not yet supported (but coming!) in `issue #34`__
†: py.test fixture support is documented in `issue #81`__
__ https://github.com/wolever/parameterized/issues/71
__ https://github.com/wolever/parameterized/issues/34
__ https://github.com/wolever/parameterized/issues/81
Dependencies
------------
(this section left intentionally blank)
Exhaustive Usage Examples
--------------------------
The ``@parameterized`` and ``@parameterized.expand`` decorators accept a list
or iterable of tuples or ``param(...)``, or a callable which returns a list or
iterable:
.. code:: python
from parameterized import parameterized, param
# A list of tuples
@parameterized([
(2, 3, 5),
(3, 5, 8),
])
def test_add(a, b, expected):
assert_equal(a + b, expected)
# A list of params
@parameterized([
param("10", 10),
param("10", 16, base=16),
])
def test_int(str_val, expected, base=10):
assert_equal(int(str_val, base=base), expected)
# An iterable of params
@parameterized(
param.explicit(*json.loads(line))
for line in open("testcases.jsons")
)
def test_from_json_file(...):
...
# A callable which returns a list of tuples
def load_test_cases():
return [
("test1", ),
("test2", ),
]
@parameterized(load_test_cases)
def test_from_function(name):
...
.. **
Note that, when using an iterator or a generator, all the items will be loaded
into memory before the start of the test run (we do this explicitly to ensure
that generators are exhausted exactly once in multi-process or multi-threaded
testing environments).
The ``@parameterized`` decorator can be used to decorate test class methods and standalone
functions:
.. code:: python
from parameterized import parameterized
class AddTest(object):
@parameterized([
(2, 3, 5),
])
def test_add(self, a, b, expected):
assert_equal(a + b, expected)
@parameterized([
(2, 3, 5),
])
def test_add(a, b, expected):
assert_equal(a + b, expected)
And ``@parameterized.expand`` can be used to generate test methods in
situations where test generators cannot be used (for example, when the test
class is a subclass of ``unittest.TestCase``):
.. code:: python
import unittest
from parameterized import parameterized
class AddTestCase(unittest.TestCase):
@parameterized.expand([
("2 and 3", 2, 3, 5),
("3 and 5", 2, 3, 5),
])
def test_add(self, _, a, b, expected):
assert_equal(a + b, expected)
Will create the test cases::
$ nosetests example.py
test_add_0_2_and_3 (example.AddTestCase) ... ok
test_add_1_3_and_5 (example.AddTestCase) ... ok
----------------------------------------------------------------------
Ran 2 tests in 0.001s
OK
Note that ``@parameterized.expand`` works by creating new methods on the test
class. If the first parameter is a string, that string will be added to the end
of the method name. For example, the test case above will generate the methods
``test_add_0_2_and_3`` and ``test_add_1_3_and_5``.
The names of the test cases generated by ``@parameterized.expand`` can be
customized using the ``name_func`` keyword argument. The value should
be a function which accepts three arguments: ``testcase_func``, ``param_num``,
and ``params``, and it should return the name of the test case.
``testcase_func`` will be the function to be tested, ``param_num`` will be the
index of the test case parameters in the list of parameters, and ``param``
(an instance of ``param``) will be the parameters which will be used.
.. code:: python
import unittest
from parameterized import parameterized
def custom_name_func(testcase_func, param_num, param):
return "%s_%s" %(
testcase_func.__name__,
parameterized.to_safe_name("_".join(str(x) for x in param.args)),
)
class AddTestCase(unittest.TestCase):
@parameterized.expand([
(1, 2, 3),
(2, 3, 5),
], name_func=custom_name_func)
def test_add(self, a, b, expected):
assert_equal(a + b, expected)
Will create the test cases::
$ nosetests example.py
test_add_1_2_3 (example.AddTestCase) ... ok
test_add_2_3_5 (example.AddTestCase) ... ok
----------------------------------------------------------------------
Ran 2 tests in 0.001s
OK
The ``param(...)`` helper class stores the parameters for one specific test
case. It can be used to pass keyword arguments to test cases:
.. code:: python
from parameterized import parameterized, param
@parameterized([
param("10", 10),
param("10", 16, base=16),
])
def test_int(str_val, expected, base=10):
assert_equal(int(str_val, base=base), expected)
If test cases have a docstring, the parameters for that test case will be
appended to the first line of the docstring. This behavior can be controlled
with the ``doc_func`` argument:
.. code:: python
from parameterized import parameterized
@parameterized([
(1, 2, 3),
(4, 5, 9),
])
def test_add(a, b, expected):
""" Test addition. """
assert_equal(a + b, expected)
def my_doc_func(func, num, param):
return "%s: %s with %s" %(num, func.__name__, param)
@parameterized([
(5, 4, 1),
(9, 6, 3),
], doc_func=my_doc_func)
def test_subtraction(a, b, expected):
assert_equal(a - b, expected)
::
$ nosetests example.py
Test addition. [with a=1, b=2, expected=3] ... ok
Test addition. [with a=4, b=5, expected=9] ... ok
0: test_subtraction with param(*(5, 4, 1)) ... ok
1: test_subtraction with param(*(9, 6, 3)) ... ok
----------------------------------------------------------------------
Ran 4 tests in 0.001s
OK
Finally ``@parameterized_class`` parameterizes an entire class, using
either a list of attributes, or a list of dicts that will be applied to the
class:
.. code:: python
from yourapp.models import User
from parameterized import parameterized_class
@parameterized_class([
{ "username": "user_1", "access_level": 1 },
{ "username": "user_2", "access_level": 2, "expected_status_code": 404 },
])
class TestUserAccessLevel(TestCase):
expected_status_code = 200
def setUp(self):
self.client.force_login(User.objects.get(username=self.username))
def test_url_a(self):
response = self.client.get('/url')
self.assertEqual(response.status_code, self.expected_status_code)
def tearDown(self):
self.client.logout()
@parameterized_class(("username", "access_level", "expected_status_code"), [
("user_1", 1, 200),
("user_2", 2, 404)
])
class TestUserAccessLevel(TestCase):
def setUp(self):
self.client.force_login(User.objects.get(username=self.username))
def test_url_a(self):
response = self.client.get("/url")
self.assertEqual(response.status_code, self.expected_status_code)
def tearDown(self):
self.client.logout()
The ``@parameterized_class`` decorator accepts a ``class_name_func`` argument,
which controls the name of the parameterized classes generated by
``@parameterized_class``:
.. code:: python
from parameterized import parameterized, parameterized_class
def get_class_name(cls, num, params_dict):
# By default the generated class named includes either the "name"
# parameter (if present), or the first string value. This example shows
# multiple parameters being included in the generated class name:
return "%s_%s_%s%s" %(
cls.__name__,
num,
parameterized.to_safe_name(params_dict['a']),
parameterized.to_safe_name(params_dict['b']),
)
@parameterized_class([
{ "a": "hello", "b": " world!", "expected": "hello world!" },
{ "a": "say ", "b": " cheese :)", "expected": "say cheese :)" },
], class_name_func=get_class_name)
class TestConcatenation(TestCase):
def test_concat(self):
self.assertEqual(self.a + self.b, self.expected)
::
$ nosetests -v test_math.py
test_concat (test_concat.TestConcatenation_0_hello_world_) ... ok
test_concat (test_concat.TestConcatenation_1_say_cheese__) ... ok
Using with Single Parameters
............................
If a test function only accepts one parameter and the value is not iterable,
then it is possible to supply a list of values without wrapping each one in a
tuple:
.. code:: python
@parameterized([1, 2, 3])
def test_greater_than_zero(value):
assert value > 0
Note, however, that if the single parameter *is* iterable (such as a list or
tuple), then it *must* be wrapped in a tuple, list, or the ``param(...)``
helper:
.. code:: python
@parameterized([
([1, 2, 3], ),
([3, 3], ),
([6], ),
])
def test_sums_to_6(numbers):
assert sum(numbers) == 6
(note, also, that Python requires single element tuples to be defined with a
trailing comma: ``(foo, )``)
Using with ``@mock.patch``
..........................
``parameterized`` can be used with ``mock.patch``, but the argument ordering
can be confusing. The ``@mock.patch(...)`` decorator must come *below* the
``@parameterized(...)``, and the mocked parameters must come *last*:
.. code:: python
@mock.patch("os.getpid")
class TestOS(object):
@parameterized(...)
@mock.patch("os.fdopen")
@mock.patch("os.umask")
def test_method(self, param1, param2, ..., mock_umask, mock_fdopen, mock_getpid):
...
Note: the same holds true when using ``@parameterized.expand``.
Migrating from ``nose-parameterized`` to ``parameterized``
----------------------------------------------------------
To migrate a codebase from ``nose-parameterized`` to ``parameterized``:
1. Update your requirements file, replacing ``nose-parameterized`` with
``parameterized``.
2. Replace all references to ``nose_parameterized`` with ``parameterized``::
$ perl -pi -e 's/nose_parameterized/parameterized/g' your-codebase/
3. You're done!
FAQ
---
What happened to ``nose-parameterized``?
Originally only nose was supported. But now everything is supported, and it
only made sense to change the name!
What do you mean when you say "nose is best supported"?
There are small caveats with ``py.test`` and ``unittest``: ``py.test``
does not show the parameter values (ex, it will show ``test_add[0]``
instead of ``test_add[1, 2, 3]``), and ``unittest``/``unittest2`` do not
support test generators so ``@parameterized.expand`` must be used.
Why not use ``@pytest.mark.parametrize``?
Because spelling is difficult. Also, ``parameterized`` doesn't require you
to repeat argument names, and (using ``param``) it supports optional
keyword arguments.
Why do I get an ``AttributeError: 'function' object has no attribute 'expand'`` with ``@parameterized.expand``?
You've likely installed the ``parametrized`` (note the missing *e*)
package. Use ``parameterized`` (with the *e*) instead and you'll be all
set. | {
"source": "yandex/perforator",
"title": "contrib/python/parameterized/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/parameterized/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 18062
} |
Parameterized testing with any Python test framework
====================================================
.. image:: https://img.shields.io/pypi/v/parameterized
:alt: PyPI
:target: https://pypi.org/project/parameterized/
.. image:: https://img.shields.io/pypi/dm/parameterized
:alt: PyPI - Downloads
:target: https://pypi.org/project/parameterized/
.. image:: https://circleci.com/gh/wolever/parameterized.svg?style=svg
:alt: Circle CI
:target: https://circleci.com/gh/wolever/parameterized
Parameterized testing in Python sucks.
``parameterized`` fixes that. For everything. Parameterized testing for nose,
parameterized testing for py.test, parameterized testing for unittest.
.. code:: python
# test_math.py
from nose.tools import assert_equal
from parameterized import parameterized, parameterized_class
import unittest
import math
@parameterized([
(2, 2, 4),
(2, 3, 8),
(1, 9, 1),
(0, 9, 0),
])
def test_pow(base, exponent, expected):
assert_equal(math.pow(base, exponent), expected)
class TestMathUnitTest(unittest.TestCase):
@parameterized.expand([
("negative", -1.5, -2.0),
("integer", 1, 1.0),
("large fraction", 1.6, 1),
])
def test_floor(self, name, input, expected):
assert_equal(math.floor(input), expected)
@parameterized_class(('a', 'b', 'expected_sum', 'expected_product'), [
(1, 2, 3, 2),
(5, 5, 10, 25),
])
class TestMathClass(unittest.TestCase):
def test_add(self):
assert_equal(self.a + self.b, self.expected_sum)
def test_multiply(self):
assert_equal(self.a * self.b, self.expected_product)
@parameterized_class([
{ "a": 3, "expected": 2 },
{ "b": 5, "expected": -4 },
])
class TestMathClassDict(unittest.TestCase):
a = 1
b = 1
def test_subtract(self):
assert_equal(self.a - self.b, self.expected)
With nose (and nose2)::
$ nosetests -v test_math.py
test_floor_0_negative (test_math.TestMathUnitTest) ... ok
test_floor_1_integer (test_math.TestMathUnitTest) ... ok
test_floor_2_large_fraction (test_math.TestMathUnitTest) ... ok
test_math.test_pow(2, 2, 4, {}) ... ok
test_math.test_pow(2, 3, 8, {}) ... ok
test_math.test_pow(1, 9, 1, {}) ... ok
test_math.test_pow(0, 9, 0, {}) ... ok
test_add (test_math.TestMathClass_0) ... ok
test_multiply (test_math.TestMathClass_0) ... ok
test_add (test_math.TestMathClass_1) ... ok
test_multiply (test_math.TestMathClass_1) ... ok
test_subtract (test_math.TestMathClassDict_0) ... ok
----------------------------------------------------------------------
Ran 12 tests in 0.015s
OK
As the package name suggests, nose is best supported and will be used for all
further examples.
With py.test (version 2.0 and above)::
$ py.test -v test_math.py
============================= test session starts ==============================
platform darwin -- Python 3.6.1, pytest-3.1.3, py-1.4.34, pluggy-0.4.0
collecting ... collected 13 items
test_math.py::test_pow::[0] PASSED
test_math.py::test_pow::[1] PASSED
test_math.py::test_pow::[2] PASSED
test_math.py::test_pow::[3] PASSED
test_math.py::TestMathUnitTest::test_floor_0_negative PASSED
test_math.py::TestMathUnitTest::test_floor_1_integer PASSED
test_math.py::TestMathUnitTest::test_floor_2_large_fraction PASSED
test_math.py::TestMathClass_0::test_add PASSED
test_math.py::TestMathClass_0::test_multiply PASSED
test_math.py::TestMathClass_1::test_add PASSED
test_math.py::TestMathClass_1::test_multiply PASSED
test_math.py::TestMathClassDict_0::test_subtract PASSED
==================== 12 passed, 4 warnings in 0.16 seconds =====================
With unittest (and unittest2)::
$ python -m unittest -v test_math
test_floor_0_negative (test_math.TestMathUnitTest) ... ok
test_floor_1_integer (test_math.TestMathUnitTest) ... ok
test_floor_2_large_fraction (test_math.TestMathUnitTest) ... ok
test_add (test_math.TestMathClass_0) ... ok
test_multiply (test_math.TestMathClass_0) ... ok
test_add (test_math.TestMathClass_1) ... ok
test_multiply (test_math.TestMathClass_1) ... ok
test_subtract (test_math.TestMathClassDict_0) ... ok
----------------------------------------------------------------------
Ran 8 tests in 0.001s
OK
(note: because unittest does not support test decorators, only tests created
with ``@parameterized.expand`` will be executed)
With green::
$ green test_math.py -vvv
test_math
TestMathClass_1
. test_method_a
. test_method_b
TestMathClass_2
. test_method_a
. test_method_b
TestMathClass_3
. test_method_a
. test_method_b
TestMathUnitTest
. test_floor_0_negative
. test_floor_1_integer
. test_floor_2_large_fraction
TestMathClass_0
. test_add
. test_multiply
TestMathClass_1
. test_add
. test_multiply
TestMathClassDict_0
. test_subtract
Ran 12 tests in 0.121s
OK (passes=9)
Installation
------------
::
$ pip install parameterized
Compatibility
-------------
`Yes`__ (mostly).
__ https://app.circleci.com/pipelines/github/wolever/parameterized?branch=master
.. list-table::
:header-rows: 1
:stub-columns: 1
* -
- Py3.7
- Py3.8
- Py3.9
- Py3.10
- Py3.11
- PyPy3
- ``@mock.patch``
* - nose
- yes
- yes
- yes
- yes
- no§
- no§
- yes
* - nose2
- yes
- yes
- yes
- yes
- yes
- yes
- yes
* - py.test 2
- no*
- no*
- no*
- no*
- no*
- no*
- no*
* - py.test 3
- yes
- yes
- yes
- yes
- no*
- no*
- yes
* - py.test 4
- no**
- no**
- no**
- no**
- no**
- no**
- no**
* - py.test fixtures
- no†
- no†
- no†
- no†
- no†
- no†
- no†
* - | unittest
| (``@parameterized.expand``)
- yes
- yes
- yes
- yes
- yes
- yes
- yes
* - | unittest2
| (``@parameterized.expand``)
- yes
- yes
- yes
- yes
- no§
- no§
- yes
§: nose and unittest2 - both of which were last updated in 2015 - sadly do not
appear to support Python 3.10 or 3.11.
\*: `py.test 2 does not appear to work under Python 3 (#71)`__, and
`py.test 3 does not appear to work under Python 3.10 or 3.11 (#154)`__.
\*\*: py.test 4 is not yet supported (but coming!) in `issue #34`__
†: py.test fixture support is documented in `issue #81`__
__ https://github.com/wolever/parameterized/issues/71
__ https://github.com/wolever/parameterized/issues/154
__ https://github.com/wolever/parameterized/issues/34
__ https://github.com/wolever/parameterized/issues/81
Dependencies
------------
(this section left intentionally blank)
Exhaustive Usage Examples
--------------------------
The ``@parameterized`` and ``@parameterized.expand`` decorators accept a list
or iterable of tuples or ``param(...)``, or a callable which returns a list or
iterable:
.. code:: python
from parameterized import parameterized, param
# A list of tuples
@parameterized([
(2, 3, 5),
(3, 5, 8),
])
def test_add(a, b, expected):
assert_equal(a + b, expected)
# A list of params
@parameterized([
param("10", 10),
param("10", 16, base=16),
])
def test_int(str_val, expected, base=10):
assert_equal(int(str_val, base=base), expected)
# An iterable of params
@parameterized(
param.explicit(*json.loads(line))
for line in open("testcases.jsons")
)
def test_from_json_file(...):
...
# A callable which returns a list of tuples
def load_test_cases():
return [
("test1", ),
("test2", ),
]
@parameterized(load_test_cases)
def test_from_function(name):
...
.. **
Note that, when using an iterator or a generator, all the items will be loaded
into memory before the start of the test run (we do this explicitly to ensure
that generators are exhausted exactly once in multi-process or multi-threaded
testing environments).
The ``@parameterized`` decorator can be used to decorate test class methods and standalone
functions:
.. code:: python
from parameterized import parameterized
class AddTest(object):
@parameterized([
(2, 3, 5),
])
def test_add(self, a, b, expected):
assert_equal(a + b, expected)
@parameterized([
(2, 3, 5),
])
def test_add(a, b, expected):
assert_equal(a + b, expected)
And ``@parameterized.expand`` can be used to generate test methods in
situations where test generators cannot be used (for example, when the test
class is a subclass of ``unittest.TestCase``):
.. code:: python
import unittest
from parameterized import parameterized
class AddTestCase(unittest.TestCase):
@parameterized.expand([
("2 and 3", 2, 3, 5),
("3 and 5", 3, 5, 8),
])
def test_add(self, _, a, b, expected):
assert_equal(a + b, expected)
Will create the test cases::
$ nosetests example.py
test_add_0_2_and_3 (example.AddTestCase) ... ok
test_add_1_3_and_5 (example.AddTestCase) ... ok
----------------------------------------------------------------------
Ran 2 tests in 0.001s
OK
Note that ``@parameterized.expand`` works by creating new methods on the test
class. If the first parameter is a string, that string will be added to the end
of the method name. For example, the test case above will generate the methods
``test_add_0_2_and_3`` and ``test_add_1_3_and_5``.
The names of the test cases generated by ``@parameterized.expand`` can be
customized using the ``name_func`` keyword argument. The value should
be a function which accepts three arguments: ``testcase_func``, ``param_num``,
and ``params``, and it should return the name of the test case.
``testcase_func`` will be the function to be tested, ``param_num`` will be the
index of the test case parameters in the list of parameters, and ``param``
(an instance of ``param``) will be the parameters which will be used.
.. code:: python
import unittest
from parameterized import parameterized
def custom_name_func(testcase_func, param_num, param):
return "%s_%s" %(
testcase_func.__name__,
parameterized.to_safe_name("_".join(str(x) for x in param.args)),
)
class AddTestCase(unittest.TestCase):
@parameterized.expand([
(1, 2, 3),
(2, 3, 5),
], name_func=custom_name_func)
def test_add(self, a, b, expected):
assert_equal(a + b, expected)
Will create the test cases::
$ nosetests example.py
test_add_1_2_3 (example.AddTestCase) ... ok
test_add_2_3_5 (example.AddTestCase) ... ok
----------------------------------------------------------------------
Ran 2 tests in 0.001s
OK
The ``param(...)`` helper class stores the parameters for one specific test
case. It can be used to pass keyword arguments to test cases:
.. code:: python
from parameterized import parameterized, param
@parameterized([
param("10", 10),
param("10", 16, base=16),
])
def test_int(str_val, expected, base=10):
assert_equal(int(str_val, base=base), expected)
If test cases have a docstring, the parameters for that test case will be
appended to the first line of the docstring. This behavior can be controlled
with the ``doc_func`` argument:
.. code:: python
from parameterized import parameterized
@parameterized([
(1, 2, 3),
(4, 5, 9),
])
def test_add(a, b, expected):
""" Test addition. """
assert_equal(a + b, expected)
def my_doc_func(func, num, param):
return "%s: %s with %s" %(num, func.__name__, param)
@parameterized([
(5, 4, 1),
(9, 6, 3),
], doc_func=my_doc_func)
def test_subtraction(a, b, expected):
assert_equal(a - b, expected)
::
$ nosetests example.py
Test addition. [with a=1, b=2, expected=3] ... ok
Test addition. [with a=4, b=5, expected=9] ... ok
0: test_subtraction with param(*(5, 4, 1)) ... ok
1: test_subtraction with param(*(9, 6, 3)) ... ok
----------------------------------------------------------------------
Ran 4 tests in 0.001s
OK
Finally ``@parameterized_class`` parameterizes an entire class, using
either a list of attributes, or a list of dicts that will be applied to the
class:
.. code:: python
from yourapp.models import User
from parameterized import parameterized_class
@parameterized_class([
{ "username": "user_1", "access_level": 1 },
{ "username": "user_2", "access_level": 2, "expected_status_code": 404 },
])
class TestUserAccessLevel(TestCase):
expected_status_code = 200
def setUp(self):
self.client.force_login(User.objects.get(username=self.username))
def test_url_a(self):
response = self.client.get('/url')
self.assertEqual(response.status_code, self.expected_status_code)
def tearDown(self):
self.client.logout()
@parameterized_class(("username", "access_level", "expected_status_code"), [
("user_1", 1, 200),
("user_2", 2, 404)
])
class TestUserAccessLevel(TestCase):
def setUp(self):
self.client.force_login(User.objects.get(username=self.username))
def test_url_a(self):
response = self.client.get("/url")
self.assertEqual(response.status_code, self.expected_status_code)
def tearDown(self):
self.client.logout()
The ``@parameterized_class`` decorator accepts a ``class_name_func`` argument,
which controls the name of the parameterized classes generated by
``@parameterized_class``:
.. code:: python
from parameterized import parameterized, parameterized_class
def get_class_name(cls, num, params_dict):
# By default the generated class named includes either the "name"
# parameter (if present), or the first string value. This example shows
# multiple parameters being included in the generated class name:
return "%s_%s_%s%s" %(
cls.__name__,
num,
parameterized.to_safe_name(params_dict['a']),
parameterized.to_safe_name(params_dict['b']),
)
@parameterized_class([
{ "a": "hello", "b": " world!", "expected": "hello world!" },
{ "a": "say ", "b": " cheese :)", "expected": "say cheese :)" },
], class_name_func=get_class_name)
class TestConcatenation(TestCase):
def test_concat(self):
self.assertEqual(self.a + self.b, self.expected)
::
$ nosetests -v test_math.py
test_concat (test_concat.TestConcatenation_0_hello_world_) ... ok
test_concat (test_concat.TestConcatenation_1_say_cheese__) ... ok
Using with Single Parameters
............................
If a test function only accepts one parameter and the value is not iterable,
then it is possible to supply a list of values without wrapping each one in a
tuple:
.. code:: python
@parameterized([1, 2, 3])
def test_greater_than_zero(value):
assert value > 0
Note, however, that if the single parameter *is* iterable (such as a list or
tuple), then it *must* be wrapped in a tuple, list, or the ``param(...)``
helper:
.. code:: python
@parameterized([
([1, 2, 3], ),
([3, 3], ),
([6], ),
])
def test_sums_to_6(numbers):
assert sum(numbers) == 6
(note, also, that Python requires single element tuples to be defined with a
trailing comma: ``(foo, )``)
Using with ``@mock.patch``
..........................
``parameterized`` can be used with ``mock.patch``, but the argument ordering
can be confusing. The ``@mock.patch(...)`` decorator must come *below* the
``@parameterized(...)``, and the mocked parameters must come *last*:
.. code:: python
@mock.patch("os.getpid")
class TestOS(object):
@parameterized(...)
@mock.patch("os.fdopen")
@mock.patch("os.umask")
def test_method(self, param1, param2, ..., mock_umask, mock_fdopen, mock_getpid):
...
Note: the same holds true when using ``@parameterized.expand``.
Migrating from ``nose-parameterized`` to ``parameterized``
----------------------------------------------------------
To migrate a codebase from ``nose-parameterized`` to ``parameterized``:
1. Update your requirements file, replacing ``nose-parameterized`` with
``parameterized``.
2. Replace all references to ``nose_parameterized`` with ``parameterized``::
$ perl -pi -e 's/nose_parameterized/parameterized/g' your-codebase/
3. You're done!
FAQ
---
What happened to Python 2.X, 3.5, and 3.6 support?
As of version 0.9.0, ``parameterized`` no longer supports Python 2.X, 3.5,
or 3.6. Previous versions of ``parameterized`` - 0.8.1 being the latest -
will continue to work, but will not receive any new features or bug fixes.
What do you mean when you say "nose is best supported"?
There are small caveats with ``py.test`` and ``unittest``: ``py.test``
does not show the parameter values (ex, it will show ``test_add[0]``
instead of ``test_add[1, 2, 3]``), and ``unittest``/``unittest2`` do not
support test generators so ``@parameterized.expand`` must be used.
Why not use ``@pytest.mark.parametrize``?
Because spelling is difficult. Also, ``parameterized`` doesn't require you
to repeat argument names, and (using ``param``) it supports optional
keyword arguments.
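A rough comparison sketch (the pytest form is shown only for contrast and is not part of this README's original examples):
.. code:: python
import pytest
from parameterized import parameterized, param
# pytest repeats the argument names inside the decorator:
@pytest.mark.parametrize("a,b,expected", [(2, 3, 5)])
def test_add_pytest(a, b, expected):
    assert a + b == expected
# parameterized does not, and param() can carry keyword arguments:
@parameterized([param(2, 3, expected=5)])
def test_add_parameterized(a, b, expected=0):
    assert a + b == expected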
Why do I get an ``AttributeError: 'function' object has no attribute 'expand'`` with ``@parameterized.expand``?
You've likely installed the ``parametrized`` (note the missing *e*)
package. Use ``parameterized`` (with the *e*) instead and you'll be all
set.
What happened to ``nose-parameterized``?
Originally only nose was supported. But now everything is supported, and it
only made sense to change the name! | {
"source": "yandex/perforator",
"title": "contrib/python/parameterized/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/parameterized/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 18403
} |
###################################################################
parso - A Python Parser
###################################################################
.. image:: https://travis-ci.org/davidhalter/parso.svg?branch=master
:target: https://travis-ci.org/davidhalter/parso
:alt: Travis CI build status
.. image:: https://coveralls.io/repos/github/davidhalter/parso/badge.svg?branch=master
:target: https://coveralls.io/github/davidhalter/parso?branch=master
:alt: Coverage Status
.. image:: https://pepy.tech/badge/parso
:target: https://pepy.tech/project/parso
:alt: PyPI Downloads
.. image:: https://raw.githubusercontent.com/davidhalter/parso/master/docs/_static/logo_characters.png
Parso is a Python parser that supports error recovery and round-trip parsing
for different Python versions (in multiple Python versions). Parso is also able
to list multiple syntax errors in your python file.
Parso has been battle-tested by jedi_. It was pulled out of jedi to be useful
for other projects as well.
Parso consists of a small API to parse Python and analyse the syntax tree.
A simple example:
.. code-block:: python
>>> import parso
>>> module = parso.parse('hello + 1', version="3.6")
>>> expr = module.children[0]
>>> expr
PythonNode(arith_expr, [<Name: hello@1,0>, <Operator: +>, <Number: 1>])
>>> print(expr.get_code())
hello + 1
>>> name = expr.children[0]
>>> name
<Name: hello@1,0>
>>> name.end_pos
(1, 5)
>>> expr.end_pos
(1, 9)
To list multiple issues:
.. code-block:: python
>>> grammar = parso.load_grammar()
>>> module = grammar.parse('foo +\nbar\ncontinue')
>>> error1, error2 = grammar.iter_errors(module)
>>> error1.message
'SyntaxError: invalid syntax'
>>> error2.message
"SyntaxError: 'continue' not properly in loop"
Resources
=========
- `Testing <https://parso.readthedocs.io/en/latest/docs/development.html#testing>`_
- `PyPI <https://pypi.python.org/pypi/parso>`_
- `Docs <https://parso.readthedocs.org/en/latest/>`_
- Uses `semantic versioning <https://semver.org/>`_
Installation
============
pip install parso
Future
======
- There will be better support for refactoring and comments. Stay tuned.
- There's a WIP PEP8 validator. It's however not in a good shape, yet.
Known Issues
============
- `async`/`await` are already used as keywords in Python 3.6.
- `from __future__ import print_function` is not ignored.
Acknowledgements
================
- Guido van Rossum (@gvanrossum) for creating the parser generator pgen2
(originally used in lib2to3).
- `Salome Schneider <https://www.crepes-schnaegg.ch/cr%C3%AApes-schn%C3%A4gg/kunst-f%C3%BCrs-cr%C3%AApes-mobil/>`_
for the extremely awesome parso logo.
.. _jedi: https://github.com/davidhalter/jedi | {
"source": "yandex/perforator",
"title": "contrib/python/parso/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/parso/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2822
} |
###################################################################
parso - A Python Parser
###################################################################
.. image:: https://github.com/davidhalter/parso/workflows/Build/badge.svg?branch=master
:target: https://github.com/davidhalter/parso/actions
:alt: GitHub Actions build status
.. image:: https://coveralls.io/repos/github/davidhalter/parso/badge.svg?branch=master
:target: https://coveralls.io/github/davidhalter/parso?branch=master
:alt: Coverage Status
.. image:: https://pepy.tech/badge/parso
:target: https://pepy.tech/project/parso
:alt: PyPI Downloads
.. image:: https://raw.githubusercontent.com/davidhalter/parso/master/docs/_static/logo_characters.png
Parso is a Python parser that supports error recovery and round-trip parsing
for different Python versions (in multiple Python versions). Parso is also able
to list multiple syntax errors in your python file.
Parso has been battle-tested by jedi_. It was pulled out of jedi to be useful
for other projects as well.
Parso consists of a small API to parse Python and analyse the syntax tree.
A simple example:
.. code-block:: python
>>> import parso
>>> module = parso.parse('hello + 1', version="3.9")
>>> expr = module.children[0]
>>> expr
PythonNode(arith_expr, [<Name: hello@1,0>, <Operator: +>, <Number: 1>])
>>> print(expr.get_code())
hello + 1
>>> name = expr.children[0]
>>> name
<Name: hello@1,0>
>>> name.end_pos
(1, 5)
>>> expr.end_pos
(1, 9)
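A small, hedged sketch of the round-trip behaviour mentioned above (error recovery keeps the original text reproducible via ``get_code()``):
.. code-block:: python
>>> import parso
>>> module = parso.parse('def broken(:\n    pass\n')
>>> module.get_code()
'def broken(:\n    pass\n'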
To list multiple issues:
.. code-block:: python
>>> grammar = parso.load_grammar()
>>> module = grammar.parse('foo +\nbar\ncontinue')
>>> error1, error2 = grammar.iter_errors(module)
>>> error1.message
'SyntaxError: invalid syntax'
>>> error2.message
"SyntaxError: 'continue' not properly in loop"
Resources
=========
- `Testing <https://parso.readthedocs.io/en/latest/docs/development.html#testing>`_
- `PyPI <https://pypi.python.org/pypi/parso>`_
- `Docs <https://parso.readthedocs.org/en/latest/>`_
- Uses `semantic versioning <https://semver.org/>`_
Installation
============
pip install parso
Future
======
- There will be better support for refactoring and comments. Stay tuned.
- There's a WIP PEP8 validator. It's however not in a good shape, yet.
Known Issues
============
- `async`/`await` are already used as keywords in Python 3.6.
- `from __future__ import print_function` is not ignored.
Acknowledgements
================
- Guido van Rossum (@gvanrossum) for creating the parser generator pgen2
(originally used in lib2to3).
- `Salome Schneider <https://www.crepes-schnaegg.ch/cr%C3%AApes-schn%C3%A4gg/kunst-f%C3%BCrs-cr%C3%AApes-mobil/>`_
for the extremely awesome parso logo.
.. _jedi: https://github.com/davidhalter/jedi | {
"source": "yandex/perforator",
"title": "contrib/python/parso/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/parso/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2851
} |
The MIT License (MIT)
Copyright (c) 2014-2017 Matthias C. M. Troffaes
Copyright (c) 2012-2014 Antoine Pitrou and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. | {
"source": "yandex/perforator",
"title": "contrib/python/pathlib2/py2/LICENSE.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pathlib2/py2/LICENSE.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1150
} |
pathlib2
========
|jazzband| |github| |codecov|
Fork of pathlib aiming to support the full stdlib Python API.
The `old pathlib <https://web.archive.org/web/20181106215056/https://bitbucket.org/pitrou/pathlib/>`_
module on bitbucket is no longer maintained.
The goal of pathlib2 is to provide a backport of
`standard pathlib <http://docs.python.org/dev/library/pathlib.html>`_
module which tracks the standard library module,
so all the newest features of the standard pathlib can be
used also on older Python versions.
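Usage mirrors the standard library module; a minimal illustrative sketch (the file names here are hypothetical):
.. code-block:: python
>>> from pathlib2 import Path
>>> Path('setup.py').suffix
'.py'
>>> (Path('docs') / 'index.rst').parts
('docs', 'index.rst')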
Download
--------
Standalone releases are available on PyPI:
http://pypi.python.org/pypi/pathlib2/
Development
-----------
The main development takes place in the Python standard library: see
the `Python developer's guide <http://docs.python.org/devguide/>`_.
In particular, new features should be submitted to the
`Python bug tracker <http://bugs.python.org/>`_.
Issues that occur in this backport, but that do not occur in the
standard Python pathlib module can be submitted on
the `pathlib2 bug tracker <https://github.com/jazzband/pathlib2/issues>`_.
Documentation
-------------
Refer to the
`standard pathlib <http://docs.python.org/dev/library/pathlib.html>`_
documentation.
Known Issues
------------
For historic reasons, pathlib2 still uses bytes to represent file paths internally.
Unfortunately, on Windows with Python 2.7, the file system encoder (``mbcs``)
has only poor support for non-ascii characters,
and can silently replace non-ascii characters without warning.
For example, ``u'тест'.encode(sys.getfilesystemencoding())`` results in ``????``
which is obviously completely useless.
Therefore, on Windows with Python 2.7, until this problem is fixed upstream,
unfortunately you cannot rely on pathlib2 to support the full unicode range for filenames.
See `issue #56 <https://github.com/jazzband/pathlib2/issues/56>`_ for more details.
.. |github| image:: https://github.com/jazzband/pathlib2/actions/workflows/python-package.yml/badge.svg
:target: https://github.com/jazzband/pathlib2/actions/workflows/python-package.yml
:alt: github
.. |codecov| image:: https://codecov.io/gh/jazzband/pathlib2/branch/develop/graph/badge.svg
:target: https://codecov.io/gh/jazzband/pathlib2
:alt: codecov
.. |jazzband| image:: https://jazzband.co/static/img/badge.svg
:alt: Jazzband
:target: https://jazzband.co/ | {
"source": "yandex/perforator",
"title": "contrib/python/pathlib2/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pathlib2/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2373
} |
The MIT License (MIT)
Copyright (c) 2014-2017 Matthias C. M. Troffaes
Copyright (c) 2012-2014 Antoine Pitrou and contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE. | {
"source": "yandex/perforator",
"title": "contrib/python/pathlib2/py3/LICENSE.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pathlib2/py3/LICENSE.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1150
} |
pathlib2
========
|jazzband| |github| |codecov|
Fork of pathlib aiming to support the full stdlib Python API.
The `old pathlib <https://web.archive.org/web/20181106215056/https://bitbucket.org/pitrou/pathlib/>`_
module on bitbucket is no longer maintained.
The goal of pathlib2 is to provide a backport of
`standard pathlib <http://docs.python.org/dev/library/pathlib.html>`_
module which tracks the standard library module,
so all the newest features of the standard pathlib can be
used also on older Python versions.
Download
--------
Standalone releases are available on PyPI:
http://pypi.python.org/pypi/pathlib2/
Development
-----------
The main development takes place in the Python standard library: see
the `Python developer's guide <http://docs.python.org/devguide/>`_.
In particular, new features should be submitted to the
`Python bug tracker <http://bugs.python.org/>`_.
Issues that occur in this backport, but that do not occur in the
standard Python pathlib module can be submitted on
the `pathlib2 bug tracker <https://github.com/jazzband/pathlib2/issues>`_.
Documentation
-------------
Refer to the
`standard pathlib <http://docs.python.org/dev/library/pathlib.html>`_
documentation.
Known Issues
------------
For historic reasons, pathlib2 still uses bytes to represent file paths internally.
Unfortunately, on Windows with Python 2.7, the file system encoder (``mbcs``)
has only poor support for non-ascii characters,
and can silently replace non-ascii characters without warning.
For example, ``u'тест'.encode(sys.getfilesystemencoding())`` results in ``????``
which is obviously completely useless.
Therefore, on Windows with Python 2.7, until this problem is fixed upstream,
unfortunately you cannot rely on pathlib2 to support the full unicode range for filenames.
See `issue #56 <https://github.com/jazzband/pathlib2/issues/56>`_ for more details.
.. |github| image:: https://github.com/jazzband/pathlib2/actions/workflows/python-package.yml/badge.svg
:target: https://github.com/jazzband/pathlib2/actions/workflows/python-package.yml
:alt: github
.. |codecov| image:: https://codecov.io/gh/jazzband/pathlib2/branch/develop/graph/badge.svg
:target: https://codecov.io/gh/jazzband/pathlib2
:alt: codecov
.. |jazzband| image:: https://jazzband.co/static/img/badge.svg
:alt: Jazzband
:target: https://jazzband.co/ | {
"source": "yandex/perforator",
"title": "contrib/python/pathlib2/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pathlib2/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2373
} |
.. image:: https://travis-ci.org/pexpect/pexpect.svg?branch=master
:target: https://travis-ci.org/pexpect/pexpect
:align: right
:alt: Build status
Pexpect is a Pure Python Expect-like module
Pexpect makes Python a better tool for controlling other applications.
Pexpect is a pure Python module for spawning child applications; controlling
them; and responding to expected patterns in their output. Pexpect works like
Don Libes' Expect. Pexpect allows your script to spawn a child application and
control it as if a human were typing commands.
Pexpect can be used for automating interactive applications such as ssh, ftp,
passwd, telnet, etc. It can be used to automate setup scripts for duplicating
software package installations on different servers. It can be used for
automated software testing. Pexpect is in the spirit of Don Libes' Expect, but
Pexpect is pure Python.
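As a quick, hedged sketch (not taken from the Pexpect documentation), the
following drives an interactive Python interpreter; the prompt pattern and the
``encoding`` argument (available in pexpect 4.x) are assumptions::

    import pexpect

    # Spawn a child Python interpreter inside a pty and interact with it.
    child = pexpect.spawn('python', encoding='utf-8')
    child.expect('>>> ')        # wait for the interactive prompt
    child.sendline('6 + 6')
    child.expect('>>> ')        # output captured before the prompt lands in child.before
    print(child.before)
    child.sendline('exit()')
    child.close()
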
The main features of Pexpect require the pty module in the Python standard
library, which is only available on Unix-like systems. Some features—waiting
for patterns from file descriptors or subprocesses—are also available on
Windows.
If you want to work with the development version of the source code then please
read the DEVELOPERS.rst document in the root of the source code tree.
Free, open source, and all that good stuff.
You can install Pexpect using pip::
pip install pexpect
`Docs on ReadTheDocs <https://pexpect.readthedocs.io/>`_
PEXPECT LICENSE::
http://opensource.org/licenses/isc-license.txt
Copyright (c) 2013-2016, Pexpect development team
Copyright (c) 2012, Noah Spurrier <[email protected]>
PERMISSION TO USE, COPY, MODIFY, AND/OR DISTRIBUTE THIS SOFTWARE FOR ANY
PURPOSE WITH OR WITHOUT FEE IS HEREBY GRANTED, PROVIDED THAT THE ABOVE
COPYRIGHT NOTICE AND THIS PERMISSION NOTICE APPEAR IN ALL COPIES.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
This license is approved by the OSI and FSF as GPL-compatible. | {
"source": "yandex/perforator",
"title": "contrib/python/pexpect/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pexpect/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2418
} |
.. image:: https://travis-ci.org/pexpect/pexpect.svg?branch=master
:target: https://travis-ci.org/pexpect/pexpect
:align: right
:alt: Build status
Pexpect is a Pure Python Expect-like module
Pexpect makes Python a better tool for controlling other applications.
Pexpect is a pure Python module for spawning child applications; controlling
them; and responding to expected patterns in their output. Pexpect works like
Don Libes' Expect. Pexpect allows your script to spawn a child application and
control it as if a human were typing commands.
Pexpect can be used for automating interactive applications such as ssh, ftp,
passwd, telnet, etc. It can be used to automate setup scripts for duplicating
software package installations on different servers. It can be used for
automated software testing. Pexpect is in the spirit of Don Libes' Expect, but
Pexpect is pure Python.
The main features of Pexpect require the pty module in the Python standard
library, which is only available on Unix-like systems. Some features—waiting
for patterns from file descriptors or subprocesses—are also available on
Windows.
If you want to work with the development version of the source code then please
read the DEVELOPERS.rst document in the root of the source code tree.
Free, open source, and all that good stuff.
You can install Pexpect using pip::
pip install pexpect
`Docs on ReadTheDocs <https://pexpect.readthedocs.io/>`_
PEXPECT LICENSE::
http://opensource.org/licenses/isc-license.txt
Copyright (c) 2013-2016, Pexpect development team
Copyright (c) 2012, Noah Spurrier <[email protected]>
PERMISSION TO USE, COPY, MODIFY, AND/OR DISTRIBUTE THIS SOFTWARE FOR ANY
PURPOSE WITH OR WITHOUT FEE IS HEREBY GRANTED, PROVIDED THAT THE ABOVE
COPYRIGHT NOTICE AND THIS PERMISSION NOTICE APPEAR IN ALL COPIES.
THE SOFTWARE IS PROVIDED "AS IS" AND THE AUTHOR DISCLAIMS ALL WARRANTIES
WITH REGARD TO THIS SOFTWARE INCLUDING ALL IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS. IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR
ANY SPECIAL, DIRECT, INDIRECT, OR CONSEQUENTIAL DAMAGES OR ANY DAMAGES
WHATSOEVER RESULTING FROM LOSS OF USE, DATA OR PROFITS, WHETHER IN AN
ACTION OF CONTRACT, NEGLIGENCE OR OTHER TORTIOUS ACTION, ARISING OUT OF
OR IN CONNECTION WITH THE USE OR PERFORMANCE OF THIS SOFTWARE.
This license is approved by the OSI and FSF as GPL-compatible. | {
"source": "yandex/perforator",
"title": "contrib/python/pexpect/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pexpect/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2418
} |
PickleShare - a small 'shelve' like datastore with concurrency support
Like shelve, a PickleShareDB object acts like a normal dictionary. Unlike shelve,
many processes can access the database simultaneously. Changing a value in the
database is immediately visible to other processes accessing the same database.
Concurrency is possible because the values are stored in separate files. Hence
the "database" is a directory where *all* files are governed by PickleShare.
Both python2 and python3 are supported.
Example usage:
```python
from pickleshare import *
db = PickleShareDB('~/testpickleshare')
db.clear()
print("Should be empty:", db.items())
db['hello'] = 15
db['aku ankka'] = [1,2,313]
db['paths/are/ok/key'] = [1,(5,46)]
print(db.keys())
```
This module is certainly not ZODB, but can be used for low-load
(non-mission-critical) situations where tiny code size trumps the
advanced features of a "real" object database.
Installation guide:
```sh
pip install pickleshare
```
Or, if installing from source
```sh
pip install .
``` | {
"source": "yandex/perforator",
"title": "contrib/python/pickleshare/py2/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pickleshare/py2/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1042
} |
PickleShare - a small 'shelve' like datastore with concurrency support
Like shelve, a PickleShareDB object acts like a normal dictionary. Unlike shelve,
many processes can access the database simultaneously. Changing a value in the
database is immediately visible to other processes accessing the same database.
Concurrency is possible because the values are stored in separate files. Hence
the "database" is a directory where *all* files are governed by PickleShare.
Both python2 and python3 are supported.
Example usage:
```python
from pickleshare import *
db = PickleShareDB('~/testpickleshare')
db.clear()
print("Should be empty:", db.items())
db['hello'] = 15
db['aku ankka'] = [1,2,313]
db['paths/are/ok/key'] = [1,(5,46)]
print(db.keys())
```
This module is certainly not ZODB, but can be used for low-load
(non-mission-critical) situations where tiny code size trumps the
advanced features of a "real" object database.
Installation guide:
```sh
pip install pickleshare
```
Or, if installing from source
```sh
pip install .
``` | {
"source": "yandex/perforator",
"title": "contrib/python/pickleshare/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pickleshare/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1042
} |
====================================================
pluggy - A minimalist production ready plugin system
====================================================
|pypi| |conda-forge| |versions| |travis| |appveyor| |gitter| |black| |codecov|
This is the core framework used by the `pytest`_, `tox`_, and `devpi`_ projects.
Please `read the docs`_ to learn more!
A definitive example
====================
.. code-block:: python
import pluggy
hookspec = pluggy.HookspecMarker("myproject")
hookimpl = pluggy.HookimplMarker("myproject")
class MySpec(object):
"""A hook specification namespace.
"""
@hookspec
def myhook(self, arg1, arg2):
"""My special little hook that you can customize.
"""
class Plugin_1(object):
"""A hook implementation namespace.
"""
@hookimpl
def myhook(self, arg1, arg2):
print("inside Plugin_1.myhook()")
return arg1 + arg2
class Plugin_2(object):
"""A 2nd hook implementation namespace.
"""
@hookimpl
def myhook(self, arg1, arg2):
print("inside Plugin_2.myhook()")
return arg1 - arg2
# create a manager and add the spec
pm = pluggy.PluginManager("myproject")
pm.add_hookspecs(MySpec)
# register plugins
pm.register(Plugin_1())
pm.register(Plugin_2())
# call our ``myhook`` hook
results = pm.hook.myhook(arg1=1, arg2=2)
print(results)
Running this directly gets us::
$ python docs/examples/toy-example.py
inside Plugin_2.myhook()
inside Plugin_1.myhook()
[-1, 3]
.. badges
.. |pypi| image:: https://img.shields.io/pypi/v/pluggy.svg
:target: https://pypi.org/pypi/pluggy
.. |versions| image:: https://img.shields.io/pypi/pyversions/pluggy.svg
:target: https://pypi.org/pypi/pluggy
.. |travis| image:: https://img.shields.io/travis/pytest-dev/pluggy/master.svg
:target: https://travis-ci.org/pytest-dev/pluggy
.. |appveyor| image:: https://img.shields.io/appveyor/ci/pytestbot/pluggy/master.svg
:target: https://ci.appveyor.com/project/pytestbot/pluggy
.. |conda-forge| image:: https://img.shields.io/conda/vn/conda-forge/pluggy.svg
:target: https://anaconda.org/conda-forge/pytest
.. |gitter| image:: https://badges.gitter.im/pytest-dev/pluggy.svg
:alt: Join the chat at https://gitter.im/pytest-dev/pluggy
:target: https://gitter.im/pytest-dev/pluggy?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. |black| image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/ambv/black
.. |codecov| image:: https://codecov.io/gh/pytest-dev/pluggy/branch/master/graph/badge.svg
:target: https://codecov.io/gh/pytest-dev/pluggy
:alt: Code coverage Status
.. links
.. _pytest:
http://pytest.org
.. _tox:
https://tox.readthedocs.org
.. _devpi:
http://doc.devpi.net
.. _read the docs:
https://pluggy.readthedocs.io/en/latest/ | {
"source": "yandex/perforator",
"title": "contrib/python/pluggy/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pluggy/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 3025
} |
====================================================
pluggy - A minimalist production ready plugin system
====================================================
|pypi| |conda-forge| |versions| |github-actions| |gitter| |black| |codecov|
This is the core framework used by the `pytest`_, `tox`_, and `devpi`_ projects.
Please `read the docs`_ to learn more!
A definitive example
====================
.. code-block:: python
import pluggy
hookspec = pluggy.HookspecMarker("myproject")
hookimpl = pluggy.HookimplMarker("myproject")
class MySpec:
"""A hook specification namespace."""
@hookspec
def myhook(self, arg1, arg2):
"""My special little hook that you can customize."""
class Plugin_1:
"""A hook implementation namespace."""
@hookimpl
def myhook(self, arg1, arg2):
print("inside Plugin_1.myhook()")
return arg1 + arg2
class Plugin_2:
"""A 2nd hook implementation namespace."""
@hookimpl
def myhook(self, arg1, arg2):
print("inside Plugin_2.myhook()")
return arg1 - arg2
# create a manager and add the spec
pm = pluggy.PluginManager("myproject")
pm.add_hookspecs(MySpec)
# register plugins
pm.register(Plugin_1())
pm.register(Plugin_2())
# call our ``myhook`` hook
results = pm.hook.myhook(arg1=1, arg2=2)
print(results)
Running this directly gets us::
$ python docs/examples/toy-example.py
inside Plugin_2.myhook()
inside Plugin_1.myhook()
[-1, 3]
.. badges
.. |pypi| image:: https://img.shields.io/pypi/v/pluggy.svg
:target: https://pypi.org/pypi/pluggy
.. |versions| image:: https://img.shields.io/pypi/pyversions/pluggy.svg
:target: https://pypi.org/pypi/pluggy
.. |github-actions| image:: https://github.com/pytest-dev/pluggy/workflows/main/badge.svg
:target: https://github.com/pytest-dev/pluggy/actions
.. |conda-forge| image:: https://img.shields.io/conda/vn/conda-forge/pluggy.svg
:target: https://anaconda.org/conda-forge/pytest
.. |gitter| image:: https://badges.gitter.im/pytest-dev/pluggy.svg
:alt: Join the chat at https://gitter.im/pytest-dev/pluggy
:target: https://gitter.im/pytest-dev/pluggy?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
.. |black| image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/ambv/black
.. |codecov| image:: https://codecov.io/gh/pytest-dev/pluggy/branch/master/graph/badge.svg
:target: https://codecov.io/gh/pytest-dev/pluggy
:alt: Code coverage Status
.. links
.. _pytest:
http://pytest.org
.. _tox:
https://tox.readthedocs.org
.. _devpi:
http://doc.devpi.net
.. _read the docs:
https://pluggy.readthedocs.io/en/latest/
Support pluggy
--------------
`Open Collective`_ is an online funding platform for open and transparent communities.
It provides tools to raise money and share your finances in full transparency.
It is the platform of choice for individuals and companies that want to make one-time or
monthly donations directly to the project.
``pluggy`` is part of the ``pytest-dev`` project, see more details in the `pytest collective`_.
.. _Open Collective: https://opencollective.com
.. _pytest collective: https://opencollective.com/pytest | {
"source": "yandex/perforator",
"title": "contrib/python/pluggy/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pluggy/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 3361
} |
Authors
=======
Creator
-------
Jonathan Slenders <jonathan AT slenders.be>
Contributors
------------
- Amjith Ramanujam <amjith.r AT gmail.com> | {
"source": "yandex/perforator",
"title": "contrib/python/prompt-toolkit/py2/AUTHORS.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/prompt-toolkit/py2/AUTHORS.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 147
} |
Python Prompt Toolkit
=====================
|Build Status| |PyPI|
``prompt_toolkit`` is a library for building powerful interactive command lines
and terminal applications in Python.
Read the `documentation on readthedocs
<http://python-prompt-toolkit.readthedocs.io/en/stable/>`_.
Ptpython
********
`ptpython <http://github.com/jonathanslenders/ptpython/>`_ is an interactive
Python Shell, built on top of prompt_toolkit.
.. image :: https://github.com/jonathanslenders/python-prompt-toolkit/raw/master/docs/images/ptpython.png
prompt_toolkit features
***********************
``prompt_toolkit`` could be a replacement for `GNU readline
<http://cnswww.cns.cwru.edu/php/chet/readline/rltop.html>`_, but it can be much
more than that.
Some features:
- Pure Python.
- Syntax highlighting of the input while typing. (For instance, with a Pygments lexer.)
- Multi-line input editing.
- Advanced code completion.
- Both Emacs and Vi key bindings. (Similar to readline.)
- Even some advanced Vi functionality, like named registers and digraphs.
- Reverse and forward incremental search.
- Runs on all Python versions from 2.6 up to 3.5.
- Works well with Unicode double width characters. (Chinese input.)
- Selecting text for copy/paste. (Both Emacs and Vi style.)
- Support for `bracketed paste <https://cirw.in/blog/bracketed-paste>`_.
- Mouse support for cursor positioning and scrolling.
- Auto suggestions. (Like `fish shell <http://fishshell.com/>`_.)
- Multiple input buffers.
- No global state.
- Lightweight, the only dependencies are Pygments, six and wcwidth.
- Runs on Linux, OS X, FreeBSD, OpenBSD and Windows systems.
- And much more...
Feel free to create tickets for bugs and feature requests, and create pull
requests if you have nice patches that you would like to share with others.
About Windows support
*********************
``prompt_toolkit`` is cross platform, and everything that you build on top
should run fine on both Unix and Windows systems. On Windows, it uses a
different event loop (``WaitForMultipleObjects`` instead of ``select``), and
another input and output system. (Win32 APIs instead of pseudo-terminals and
VT100.)
It's worth noting that the implementation is a "best effort of what is
possible". Both Unix and Windows terminals have their limitations. But in
general, the Unix experience will still be a little better.
For Windows, it's recommended to use either `cmder
<http://cmder.net/>`_ or `conemu <https://conemu.github.io/>`_.
Installation
************
::
pip install prompt_toolkit
For Conda, do:
::
conda install -c https://conda.anaconda.org/conda-forge prompt_toolkit
Getting started
***************
The most simple example of the library would look like this:
.. code:: python
from prompt_toolkit import prompt
if __name__ == '__main__':
answer = prompt('Give me some input: ')
print('You said: %s' % answer)
For more complex examples, have a look in the ``examples`` directory. All
examples are chosen to demonstrate only one thing. Also, don't be afraid to
look at the source code. The implementation of the ``prompt`` function could be
a good start.
Note for Python 2: all strings are expected to be unicode strings. So, either
put a small ``u`` in front of every string or put ``from __future__ import
unicode_literals`` at the start of the above example.
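As another small sketch (building on the example above, not taken from the
documentation), ``prompt()`` can be given an in-memory history so that the up
arrow recalls earlier input; the prompt text here is only illustrative:

.. code:: python

    from __future__ import unicode_literals
    from prompt_toolkit import prompt
    from prompt_toolkit.history import InMemoryHistory

    history = InMemoryHistory()
    while True:
        try:
            text = prompt('Say something: ', history=history)
        except EOFError:
            break  # Ctrl-D exits the loop
        print('You said: %s' % text)
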
Projects using prompt_toolkit
*****************************
Shells:
- `ptpython <http://github.com/jonathanslenders/ptpython/>`_: Python REPL
- `ptpdb <http://github.com/jonathanslenders/ptpdb/>`_: Python debugger (pdb replacement)
- `pgcli <http://pgcli.com/>`_: Postgres client.
- `mycli <http://mycli.net>`_: MySql client.
- `wharfee <http://wharfee.com/>`_: A Docker command line.
- `xonsh <http://xon.sh/>`_: A Python-ish, BASHwards-compatible shell.
- `saws <https://github.com/donnemartin/saws>`_: A Supercharged AWS Command Line Interface.
- `cycli <https://github.com/nicolewhite/cycli>`_: A Command Line Interface for Cypher.
- `crash <https://github.com/crate/crash>`_: Crate command line client.
- `vcli <https://github.com/dbcli/vcli>`_: Vertica client.
- `aws-shell <https://github.com/awslabs/aws-shell>`_: An integrated shell for working with the AWS CLI.
- `softlayer-python <https://github.com/softlayer/softlayer-python>`_: A command-line interface to manage various SoftLayer products and services.
- `ipython <http://github.com/ipython/ipython/>`_: The IPython REPL
- `click-repl <https://github.com/click-contrib/click-repl>`_: Subcommand REPL for click apps.
- `haxor-news <https://github.com/donnemartin/haxor-news>`_: A Hacker News CLI.
- `gitsome <https://github.com/donnemartin/gitsome>`_: A Git/Shell Autocompleter with GitHub Integration.
- `http-prompt <https://github.com/eliangcs/http-prompt>`_: An interactive command-line HTTP client.
- `coconut <http://coconut-lang.org/>`_: Functional programming in Python.
- `Ergonomica <https://ergonomica.github.io/>`_: A Bash alternative written in Python.
- `Kube-shell <https://github.com/cloudnativelabs/kube-shell>`_: Kubernetes shell: An integrated shell for working with the Kubernetes CLI
Full screen applications:
- `pymux <http://github.com/jonathanslenders/pymux/>`_: A terminal multiplexer (like tmux) in pure Python.
- `pyvim <http://github.com/jonathanslenders/pyvim/>`_: A Vim clone in pure Python.
(Want your own project to be listed here? Please create a GitHub issue.)
Philosophy
**********
The source code of ``prompt_toolkit`` should be readable, concise and
efficient. We prefer short functions, each focused on one task, with clearly
specified input and output types. We mostly prefer composition
over inheritance, because inheritance can result in too much functionality in
the same object. We prefer immutable objects where possible (objects don't
change after initialisation). Reusability is important. We absolutely refrain
from having a changing global state; it should be possible to have multiple
independent instances of the same code in the same process. The architecture
should be layered: the lower levels operate on primitive operations and data
structures giving -- when correctly combined -- all the possible flexibility;
while at the higher level, there should be a simpler API, ready-to-use and
sufficient for most use cases. Thinking about algorithms and efficiency is
important, but avoid premature optimization.
Special thanks to
*****************
- `Pygments <http://pygments.org/>`_: Syntax highlighter.
- `wcwidth <https://github.com/jquast/wcwidth>`_: Determine columns needed for wide characters.
.. |Build Status| image:: https://api.travis-ci.org/jonathanslenders/python-prompt-toolkit.svg?branch=master
:target: https://travis-ci.org/jonathanslenders/python-prompt-toolkit#
.. |PyPI| image:: https://img.shields.io/pypi/v/prompt_toolkit.svg
:target: https://pypi.python.org/pypi/prompt-toolkit/
:alt: Latest Version | {
"source": "yandex/perforator",
"title": "contrib/python/prompt-toolkit/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/prompt-toolkit/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 6957
} |
Authors
=======
Creator
-------
Jonathan Slenders <jonathan AT slenders.be>
Contributors
------------
- Amjith Ramanujam <amjith.r AT gmail.com> | {
"source": "yandex/perforator",
"title": "contrib/python/prompt-toolkit/py3/AUTHORS.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/prompt-toolkit/py3/AUTHORS.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 147
} |
Python Prompt Toolkit
=====================
|AppVeyor| |PyPI| |RTD| |License| |Codecov|
.. image :: https://github.com/prompt-toolkit/python-prompt-toolkit/raw/master/docs/images/logo_400px.png
``prompt_toolkit`` *is a library for building powerful interactive command line applications in Python.*
Read the `documentation on readthedocs
<http://python-prompt-toolkit.readthedocs.io/en/stable/>`_.
Gallery
*******
`ptpython <http://github.com/prompt-toolkit/ptpython/>`_ is an interactive
Python Shell, built on top of ``prompt_toolkit``.
.. image :: https://github.com/prompt-toolkit/python-prompt-toolkit/raw/master/docs/images/ptpython.png
`More examples <https://python-prompt-toolkit.readthedocs.io/en/stable/pages/gallery.html>`_
prompt_toolkit features
***********************
``prompt_toolkit`` could be a replacement for `GNU readline
<https://tiswww.case.edu/php/chet/readline/rltop.html>`_, but it can be much
more than that.
Some features:
- **Pure Python**.
- Syntax highlighting of the input while typing. (For instance, with a Pygments lexer.)
- Multi-line input editing.
- Advanced code completion.
- Both Emacs and Vi key bindings. (Similar to readline.)
- Even some advanced Vi functionality, like named registers and digraphs.
- Reverse and forward incremental search.
- Works well with Unicode double width characters. (Chinese input.)
- Selecting text for copy/paste. (Both Emacs and Vi style.)
- Support for `bracketed paste <https://cirw.in/blog/bracketed-paste>`_.
- Mouse support for cursor positioning and scrolling.
- Auto suggestions. (Like `fish shell <http://fishshell.com/>`_.)
- Multiple input buffers.
- No global state.
- Lightweight, the only dependencies are Pygments and wcwidth.
- Runs on Linux, OS X, FreeBSD, OpenBSD and Windows systems.
- And much more...
Feel free to create tickets for bugs and feature requests, and create pull
requests if you have nice patches that you would like to share with others.
Installation
************
::
pip install prompt_toolkit
For Conda, do:
::
conda install -c https://conda.anaconda.org/conda-forge prompt_toolkit
About Windows support
*********************
``prompt_toolkit`` is cross platform, and everything that you build on top
should run fine on both Unix and Windows systems. Windows support is best on
recent Windows 10 builds, for which the command line window supports vt100
escape sequences. (If not supported, we fall back to using Win32 APIs for color
and cursor movements).
It's worth noting that the implementation is a "best effort of what is
possible". Both Unix and Windows terminals have their limitations. But in
general, the Unix experience will still be a little better.
Getting started
***************
The most simple example of the library would look like this:
.. code:: python
from prompt_toolkit import prompt
if __name__ == '__main__':
answer = prompt('Give me some input: ')
print('You said: %s' % answer)
For more complex examples, have a look in the ``examples`` directory. All
examples are chosen to demonstrate only one thing. Also, don't be afraid to
look at the source code. The implementation of the ``prompt`` function could be
a good start.
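As another small sketch (not taken from the documentation), ``prompt()`` can be
combined with a completer; the word list below is only illustrative:

.. code:: python

    from prompt_toolkit import prompt
    from prompt_toolkit.completion import WordCompleter

    color_completer = WordCompleter(['red', 'green', 'blue', 'yellow'])
    answer = prompt('Favorite color: ', completer=color_completer)
    print('You picked: %s' % answer)
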
Philosophy
**********
The source code of ``prompt_toolkit`` should be **readable**, **concise** and
**efficient**. We prefer short functions, each focused on one task, with clearly
specified input and output types. We mostly prefer composition
over inheritance, because inheritance can result in too much functionality in
the same object. We prefer immutable objects where possible (objects don't
change after initialization). Reusability is important. We absolutely refrain
from having a changing global state; it should be possible to have multiple
independent instances of the same code in the same process. The architecture
should be layered: the lower levels operate on primitive operations and data
structures giving -- when correctly combined -- all the possible flexibility;
while at the higher level, there should be a simpler API, ready-to-use and
sufficient for most use cases. Thinking about algorithms and efficiency is
important, but avoid premature optimization.
`Projects using prompt_toolkit <PROJECTS.rst>`_
***********************************************
Special thanks to
*****************
- `Pygments <http://pygments.org/>`_: Syntax highlighter.
- `wcwidth <https://github.com/jquast/wcwidth>`_: Determine columns needed for wide characters.
.. |PyPI| image:: https://img.shields.io/pypi/v/prompt_toolkit.svg
:target: https://pypi.python.org/pypi/prompt-toolkit/
:alt: Latest Version
.. |AppVeyor| image:: https://ci.appveyor.com/api/projects/status/32r7s2skrgm9ubva?svg=true
:target: https://ci.appveyor.com/project/prompt-toolkit/python-prompt-toolkit/
.. |RTD| image:: https://readthedocs.org/projects/python-prompt-toolkit/badge/
:target: https://python-prompt-toolkit.readthedocs.io/en/master/
.. |License| image:: https://img.shields.io/github/license/prompt-toolkit/python-prompt-toolkit.svg
:target: https://github.com/prompt-toolkit/python-prompt-toolkit/blob/master/LICENSE
.. |Codecov| image:: https://codecov.io/gh/prompt-toolkit/python-prompt-toolkit/branch/master/graphs/badge.svg?style=flat
:target: https://codecov.io/gh/prompt-toolkit/python-prompt-toolkit/ | {
"source": "yandex/perforator",
"title": "contrib/python/prompt-toolkit/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/prompt-toolkit/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5376
} |
This project is governed by
[Protobuf's Code of Conduct](https://github.com/protocolbuffers/.github/blob/main/profile/CODE_OF_CONDUCT.md). | {
"source": "yandex/perforator",
"title": "contrib/python/protobuf/py3/CODE_OF_CONDUCT.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/protobuf/py3/CODE_OF_CONDUCT.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 138
} |
# Contributing to Protocol Buffers
We welcome some types of contributions to protocol buffers. This doc describes the
process to contribute patches to protobuf and the general guidelines we
expect contributors to follow.
## What We Accept
* Bug fixes with unit tests demonstrating the problem are very welcome.
We also appreciate bug reports, even when they don't come with a patch.
Bug fixes without tests are usually not accepted.
* New APIs and features with adequate test coverage and documentation
may be accepted if they do not compromise backwards
compatibility. However there's a fairly high bar of usefulness a new public
method must clear before it will be accepted. Features that are fine in
isolation are often rejected because they don't have enough impact to justify the
conceptual burden and ongoing maintenance cost. It's best to file an issue
and get agreement from maintainers on the value of a new feature before
working on a PR.
* Performance optimizations may be accepted if they have convincing benchmarks that demonstrate
an improvement and they do not significantly increase complexity.
* Changes to existing APIs are almost never accepted. Stability and
backwards compatibility are paramount. In the unlikely event a breaking change
is required, it must usually be implemented in google3 first.
* Changes to the wire and text formats are never accepted. Any breaking change
to these formats would have to be implemented as a completely new format.
We cannot begin generating protos that cannot be parsed by existing code.
## Before You Start
We accept patches in the form of github pull requests. If you are new to
github, please read [How to create github pull requests](https://help.github.com/articles/about-pull-requests/)
first.
### Contributor License Agreements
Contributions to this project must be accompanied by a Contributor License
Agreement. You (or your employer) retain the copyright to your contribution,
this simply gives us permission to use and redistribute your contributions
as part of the project.
* If you are an individual writing original source code and you're sure you
own the intellectual property, then you'll need to sign an [individual CLA](https://cla.developers.google.com/about/google-individual?csw=1).
* If you work for a company that wants to allow you to contribute your work,
then you'll need to sign a [corporate CLA](https://cla.developers.google.com/about/google-corporate?csw=1).
### Coding Style
This project follows [Google’s Coding Style Guides](https://github.com/google/styleguide).
Before sending out your pull request, please familiarize yourself with the
corresponding style guides and make sure the proposed code change is style
conforming.
## Contributing Process
Most pull requests should go to the main branch and the change will be
included in the next major/minor version release (e.g., 3.6.0 release). If you
need to include a bug fix in a patch release (e.g., 3.5.2), make sure it’s
already merged to main, and then create a pull request cherry-picking the
commits from main branch to the release branch (e.g., branch 3.5.x).
For each pull request, a protobuf team member will be assigned to review the
pull request. For minor cleanups, the pull request may be merged right away
after an initial review. For larger changes, you will likely receive multiple
rounds of comments and it may take some time to complete. We will try to keep
our response time within 7-days but if you don’t get any response in a few
days, feel free to comment on the threads to get our attention. We also expect
you to respond to our comments within a reasonable amount of time. If we don’t
hear from you for 2 weeks or longer, we may close the pull request. You can
still send the pull request again once you have time to work on it.
Once a pull request is merged, we will take care of the rest and get it into
the final release.
## Pull Request Guidelines
* If you are a Googler, it is preferable to first create an internal CL and
have it reviewed and submitted. The code propagation process will deliver the
change to GitHub.
* Create small PRs that are narrowly focused on addressing a single concern.
We often receive PRs that are trying to fix several things at a time, but if
only one fix is considered acceptable, nothing gets merged and both author's
& reviewer's time is wasted. Create more PRs to address different concerns and
everyone will be happy.
* For speculative changes, consider opening an issue and discussing it first.
If you are suggesting a behavioral or API change, make sure you get explicit
support from a protobuf team member before sending us the pull request.
* Provide a good PR description as a record of what change is being made and
why it was made. Link to a GitHub issue if it exists.
* Don't fix code style and formatting unless you are already changing that
line to address an issue. PRs with irrelevant changes won't be merged. If
you do want to fix formatting or style, do that in a separate PR.
* Unless your PR is trivial, you should expect there will be reviewer comments
that you'll need to address before merging. We expect you to be reasonably
responsive to those comments, otherwise the PR will be closed after 2-3 weeks
of inactivity.
* Maintain clean commit history and use meaningful commit messages. PRs with
messy commit history are difficult to review and won't be merged. Use
`rebase -i upstream/main` to curate your commit history and/or to bring in latest
changes from main (but avoid rebasing in the middle of a code review).
* Keep your PR up to date with upstream/main (if there are merge conflicts,
we can't really merge your change).
* All tests need to be passing before your change can be merged. We recommend
you run tests locally before creating your PR to catch breakages early on.
Ultimately, the green signal will be provided by our testing infrastructure.
The reviewer will help you if there are test failures that seem not related
to the change you are making.
## Reviewer Guidelines
* Make sure that all tests are passing before approval.
* Apply the "release notes: yes" label if the pull request's description should
be included in the next release (e.g., any new feature / bug fix).
Apply the "release notes: no" label if the pull request's description should
not be included in the next release (e.g., refactoring changes that do not
change behavior, integration from Google internal, updating tests, etc.).
* Apply the appropriate language label (e.g., C++, Java, Python, etc.) to the
pull request. This will make it easier to identify which languages the pull
request affects, allowing us to better identify an appropriate reviewer, create
a better release note, and make it easier to identify issues in the future. | {
"source": "yandex/perforator",
"title": "contrib/python/protobuf/py3/CONTRIBUTING.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/protobuf/py3/CONTRIBUTING.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 6868
} |
Protocol Buffers - Google's data interchange format
===================================================
Copyright 2008 Google Inc.
[Protocol Buffers documentation](https://developers.google.com/protocol-buffers/)
Overview
--------
Protocol Buffers (a.k.a., protobuf) are Google's language-neutral,
platform-neutral, extensible mechanism for serializing structured data. You
can find [protobuf's documentation on the Google Developers site](https://developers.google.com/protocol-buffers/).
This README file contains protobuf installation instructions. To install
protobuf, you need to install the protocol compiler (used to compile .proto
files) and the protobuf runtime for your chosen programming language.
Protocol Compiler Installation
------------------------------
The protocol compiler is written in C++. If you are using C++, please follow
the [C++ Installation Instructions](src/README.md) to install protoc along
with the C++ runtime.
For non-C++ users, the simplest way to install the protocol compiler is to
download a pre-built binary from our [GitHub release page](https://github.com/protocolbuffers/protobuf/releases).
In the downloads section of each release, you can find pre-built binaries in
zip packages: `protoc-$VERSION-$PLATFORM.zip`. It contains the protoc binary
as well as a set of standard `.proto` files distributed along with protobuf.
If you are looking for an old version that is not available in the release
page, check out the [Maven repository](https://repo1.maven.org/maven2/com/google/protobuf/protoc/).
These pre-built binaries are only provided for released versions. If you want
to use the github main version at HEAD, or you need to modify protobuf code,
or you are using C++, it's recommended to build your own protoc binary from
source.
If you would like to build protoc binary from source, see the [C++ Installation Instructions](src/README.md).
Protobuf Runtime Installation
-----------------------------
Protobuf supports several different programming languages. For each programming
language, you can find instructions in the corresponding source directory about
how to install protobuf runtime for that specific language:
| Language | Source |
|--------------------------------------|-------------------------------------------------------------|
| C++ (include C++ runtime and protoc) | [src](src) |
| Java | [java](java) |
| Python | [python](python) |
| Objective-C | [objectivec](objectivec) |
| C# | [csharp](csharp) |
| Ruby | [ruby](ruby) |
| Go | [protocolbuffers/protobuf-go](https://github.com/protocolbuffers/protobuf-go)|
| PHP | [php](php) |
| Dart | [dart-lang/protobuf](https://github.com/dart-lang/protobuf) |
| Javascript | [protocolbuffers/protobuf-javascript](https://github.com/protocolbuffers/protobuf-javascript)|
Quick Start
-----------
The best way to learn how to use protobuf is to follow the [tutorials in our
developer guide](https://developers.google.com/protocol-buffers/docs/tutorials).
If you want to learn from code examples, take a look at the examples in the
[examples](examples) directory.
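As a minimal, hedged sketch of what the Python runtime looks like in use, the
snippet below relies only on the well-known `Timestamp` type bundled with the
`protobuf` package (no generated code from this repository is assumed):

```python
from google.protobuf import json_format
from google.protobuf.timestamp_pb2 import Timestamp

ts = Timestamp()
ts.GetCurrentTime()                    # fill in the current wall-clock time

print(json_format.MessageToJson(ts))   # RFC 3339 string, e.g. "2024-01-01T00:00:00Z"

data = ts.SerializeToString()          # compact binary wire format
restored = Timestamp()
restored.ParseFromString(data)
assert restored == ts
```
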
Documentation
-------------
The complete documentation is available via the [Protocol Buffers documentation](https://developers.google.com/protocol-buffers/).
Developer Community
-------------------
To be alerted to upcoming changes in Protocol Buffers and connect with protobuf developers and users,
[join the Google Group](https://groups.google.com/g/protobuf). | {
"source": "yandex/perforator",
"title": "contrib/python/protobuf/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/protobuf/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4130
} |
To report security concerns or vulnerabilities within protobuf, please use
Google's official channel for reporting these.
https://www.google.com/appserve/security-bugs/m2/new | {
"source": "yandex/perforator",
"title": "contrib/python/protobuf/py3/SECURITY.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/protobuf/py3/SECURITY.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 175
} |
Launch a subprocess in a pseudo terminal (pty), and interact with both the
process and its pty.
Sometimes, piping stdin and stdout is not enough. There might be a password
prompt that doesn't read from stdin, output that changes when it's going to a
pipe rather than a terminal, or curses-style interfaces that rely on a terminal.
If you need to automate these things, running the process in a pseudo terminal
(pty) is the answer.
Interface::
    from ptyprocess import PtyProcessUnicode

    p = PtyProcessUnicode.spawn(['python'])
    p.read(20)
    p.write('6+6\n')
    p.read(20)
"source": "yandex/perforator",
"title": "contrib/python/ptyprocess/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ptyprocess/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 540
} |
Launch a subprocess in a pseudo terminal (pty), and interact with both the
process and its pty.
Sometimes, piping stdin and stdout is not enough. There might be a password
prompt that doesn't read from stdin, output that changes when it's going to a
pipe rather than a terminal, or curses-style interfaces that rely on a terminal.
If you need to automate these things, running the process in a pseudo terminal
(pty) is the answer.
Interface::
    from ptyprocess import PtyProcessUnicode

    p = PtyProcessUnicode.spawn(['python'])
    p.read(20)
    p.write('6+6\n')
    p.read(20)
"source": "yandex/perforator",
"title": "contrib/python/ptyprocess/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/ptyprocess/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 540
} |
.. image:: https://img.shields.io/pypi/v/py.svg
:target: https://pypi.org/project/py
.. image:: https://img.shields.io/conda/vn/conda-forge/py.svg
:target: https://anaconda.org/conda-forge/py
.. image:: https://img.shields.io/pypi/pyversions/py.svg
:target: https://pypi.org/project/py
.. image:: https://github.com/pytest-dev/py/workflows/build/badge.svg
:target: https://github.com/pytest-dev/py/actions
**NOTE**: this library is in **maintenance mode** and should not be used in new code.
The py lib is a Python development support library featuring
the following tools and modules:
* ``py.path``: uniform local and svn path objects -> please use pathlib/pathlib2 instead
* ``py.apipkg``: explicit API control and lazy-importing -> please use the standalone package instead
* ``py.iniconfig``: easy parsing of .ini files -> please use the standalone package instead
* ``py.code``: dynamic code generation and introspection (deprecated, moved to ``pytest`` as an implementation detail).
**NOTE**: prior to the 1.4 release this distribution used to
contain py.test which is now its own package, see https://docs.pytest.org
For questions and more information please visit https://py.readthedocs.io
Bugs and issues: https://github.com/pytest-dev/py
Authors: Holger Krekel and others, 2004-2017 | {
"source": "yandex/perforator",
"title": "contrib/python/py/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/py/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1318
} |
.. image:: https://img.shields.io/pypi/v/py.svg
:target: https://pypi.org/project/py
.. image:: https://img.shields.io/conda/vn/conda-forge/py.svg
:target: https://anaconda.org/conda-forge/py
.. image:: https://img.shields.io/pypi/pyversions/py.svg
:target: https://pypi.org/project/py
.. image:: https://github.com/pytest-dev/py/workflows/build/badge.svg
:target: https://github.com/pytest-dev/py/actions
**NOTE**: this library is in **maintenance mode** and should not be used in new code.
The py lib is a Python development support library featuring
the following tools and modules:
* ``py.path``: uniform local and svn path objects -> please use pathlib/pathlib2 instead
* ``py.apipkg``: explicit API control and lazy-importing -> please use the standalone package instead
* ``py.iniconfig``: easy parsing of .ini files -> please use the standalone package instead
* ``py.code``: dynamic code generation and introspection (deprecated, moved to ``pytest`` as an implementation detail).
**NOTE**: prior to the 1.4 release this distribution used to
contain py.test which is now its own package, see https://docs.pytest.org
For questions and more information please visit https://py.readthedocs.io
Bugs and issues: https://github.com/pytest-dev/py
Authors: Holger Krekel and others, 2004-2017 | {
"source": "yandex/perforator",
"title": "contrib/python/py/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/py/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1318
} |
PyParsing -- A Python Parsing Module
====================================
|Build Status|
Introduction
============
The pyparsing module is an alternative approach to creating and
executing simple grammars, vs. the traditional lex/yacc approach, or the
use of regular expressions. The pyparsing module provides a library of
classes that client code uses to construct the grammar directly in
Python code.
*[Since first writing this description of pyparsing in late 2003, this
technique for developing parsers has become more widespread, under the
name Parsing Expression Grammars - PEGs. See more information on PEGs at*
https://en.wikipedia.org/wiki/Parsing_expression_grammar *.]*
Here is a program to parse ``"Hello, World!"`` (or any greeting of the form
``"salutation, addressee!"``):
.. code:: python
from pyparsing import Word, alphas
greet = Word(alphas) + "," + Word(alphas) + "!"
hello = "Hello, World!"
print(hello, "->", greet.parseString(hello))
The program outputs the following::
Hello, World! -> ['Hello', ',', 'World', '!']
The Python representation of the grammar is quite readable, owing to the
self-explanatory class names, and the use of '+', '|' and '^' operator
definitions.
The parsed results returned from ``parseString()`` can be accessed as a
nested list, a dictionary, or an object with named attributes.
The pyparsing module handles some of the problems that are typically
vexing when writing text parsers:
- extra or missing whitespace (the above program will also handle ``"Hello,World!"``, ``"Hello , World !"``, etc.)
- quoted strings
- embedded comments
The examples directory includes a simple SQL parser, simple CORBA IDL
parser, a config file parser, a chemical formula parser, and a four-
function algebraic notation parser, among many others.
Documentation
=============
There are many examples in the online docstrings of the classes
and methods in pyparsing. You can find them compiled into online docs
at https://pyparsing-docs.readthedocs.io/en/latest/. Additional
documentation resources and project info are listed in the online
GitHub wiki, at https://github.com/pyparsing/pyparsing/wiki. An
entire directory of examples is at
https://github.com/pyparsing/pyparsing/tree/master/examples.
License
=======
MIT License. See header of pyparsing.py
History
=======
See CHANGES file.
.. |Build Status| image:: https://travis-ci.org/pyparsing/pyparsing.svg?branch=master
:target: https://travis-ci.org/pyparsing/pyparsing | {
"source": "yandex/perforator",
"title": "contrib/python/pyparsing/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pyparsing/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2502
} |
PyParsing -- A Python Parsing Module
====================================
|Version| |Build Status| |Coverage| |License| |Python Versions| |Snyk Score|
Introduction
============
The pyparsing module is an alternative approach to creating and
executing simple grammars, vs. the traditional lex/yacc approach, or the
use of regular expressions. The pyparsing module provides a library of
classes that client code uses to construct the grammar directly in
Python code.
*[Since first writing this description of pyparsing in late 2003, this
technique for developing parsers has become more widespread, under the
name Parsing Expression Grammars - PEGs. See more information on PEGs*
`here <https://en.wikipedia.org/wiki/Parsing_expression_grammar>`__
*.]*
Here is a program to parse ``"Hello, World!"`` (or any greeting of the form
``"salutation, addressee!"``):
.. code:: python
from pyparsing import Word, alphas
greet = Word(alphas) + "," + Word(alphas) + "!"
hello = "Hello, World!"
print(hello, "->", greet.parseString(hello))
The program outputs the following::
Hello, World! -> ['Hello', ',', 'World', '!']
The Python representation of the grammar is quite readable, owing to the
self-explanatory class names, and the use of '+', '|' and '^' operator
definitions.
The parsed results returned from ``parseString()`` are a collection of type
``ParseResults``, which can be accessed as a
nested list, a dictionary, or an object with named attributes.
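For example (a small sketch reusing the grammar above; the result names are
only illustrative):

.. code:: python

    from pyparsing import Word, alphas

    greet = Word(alphas)("salutation") + "," + Word(alphas)("addressee") + "!"
    result = greet.parseString("Hello, World!")

    print(result.asList())      # ['Hello', ',', 'World', '!']
    print(result.salutation)    # Hello
    print(result["addressee"])  # World
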
The pyparsing module handles some of the problems that are typically
vexing when writing text parsers:
- extra or missing whitespace (the above program will also handle ``"Hello,World!"``, ``"Hello , World !"``, etc.)
- quoted strings
- embedded comments
The examples directory includes a simple SQL parser, simple CORBA IDL
parser, a config file parser, a chemical formula parser, and a four-
function algebraic notation parser, among many others.
Documentation
=============
There are many examples in the online docstrings of the classes
and methods in pyparsing. You can find them compiled into `online docs <https://pyparsing-docs.readthedocs.io/en/latest/>`__. Additional
documentation resources and project info are listed in the online
`GitHub wiki <https://github.com/pyparsing/pyparsing/wiki>`__. An
entire directory of examples can be found `here <https://github.com/pyparsing/pyparsing/tree/master/examples>`__.
License
=======
MIT License. See header of the `pyparsing __init__.py <https://github.com/pyparsing/pyparsing/blob/master/pyparsing/__init__.py#L1-L23>`__ file.
History
=======
See `CHANGES <https://github.com/pyparsing/pyparsing/blob/master/CHANGES>`__ file.
.. |Build Status| image:: https://github.com/pyparsing/pyparsing/actions/workflows/ci.yml/badge.svg
:target: https://github.com/pyparsing/pyparsing/actions/workflows/ci.yml
.. |Coverage| image:: https://codecov.io/gh/pyparsing/pyparsing/branch/master/graph/badge.svg
:target: https://codecov.io/gh/pyparsing/pyparsing
.. |Version| image:: https://img.shields.io/pypi/v/pyparsing?style=flat-square
:target: https://pypi.org/project/pyparsing/
:alt: Version
.. |License| image:: https://img.shields.io/pypi/l/pyparsing.svg?style=flat-square
:target: https://pypi.org/project/pyparsing/
:alt: License
.. |Python Versions| image:: https://img.shields.io/pypi/pyversions/pyparsing.svg?style=flat-square
:target: https://pypi.org/project/python-liquid/
:alt: Python versions
.. |Snyk Score| image:: https://snyk.io//advisor/python/pyparsing/badge.svg
:target: https://snyk.io//advisor/python/pyparsing
:alt: pyparsing | {
"source": "yandex/perforator",
"title": "contrib/python/pyparsing/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pyparsing/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 3625
} |
.. image:: https://docs.pytest.org/en/latest/_static/pytest1.png
:target: https://docs.pytest.org/en/latest/
:align: center
:alt: pytest
------
.. image:: https://img.shields.io/pypi/v/pytest.svg
:target: https://pypi.org/project/pytest/
.. image:: https://img.shields.io/conda/vn/conda-forge/pytest.svg
:target: https://anaconda.org/conda-forge/pytest
.. image:: https://img.shields.io/pypi/pyversions/pytest.svg
:target: https://pypi.org/project/pytest/
.. image:: https://codecov.io/gh/pytest-dev/pytest/branch/master/graph/badge.svg
:target: https://codecov.io/gh/pytest-dev/pytest
:alt: Code coverage Status
.. image:: https://travis-ci.org/pytest-dev/pytest.svg?branch=master
:target: https://travis-ci.org/pytest-dev/pytest
.. image:: https://dev.azure.com/pytest-dev/pytest/_apis/build/status/pytest-CI?branchName=master
:target: https://dev.azure.com/pytest-dev/pytest
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/python/black
.. image:: https://www.codetriage.com/pytest-dev/pytest/badges/users.svg
:target: https://www.codetriage.com/pytest-dev/pytest
The ``pytest`` framework makes it easy to write small tests, yet
scales to support complex functional testing for applications and libraries.
An example of a simple test:
.. code-block:: python
# content of test_sample.py
def inc(x):
return x + 1
def test_answer():
assert inc(3) == 5
To execute it::
$ pytest
============================= test session starts =============================
collected 1 items
test_sample.py F
================================== FAILURES ===================================
_________________________________ test_answer _________________________________
def test_answer():
> assert inc(3) == 5
E assert 4 == 5
E + where 4 = inc(3)
test_sample.py:5: AssertionError
========================== 1 failed in 0.04 seconds ===========================
Due to ``pytest``'s detailed assertion introspection, only plain ``assert`` statements are used. See `getting-started <https://docs.pytest.org/en/latest/getting-started.html#our-first-test-run>`_ for more examples.
Features
--------
- Detailed info on failing `assert statements <https://docs.pytest.org/en/latest/assert.html>`_ (no need to remember ``self.assert*`` names);
- `Auto-discovery
<https://docs.pytest.org/en/latest/goodpractices.html#python-test-discovery>`_
of test modules and functions;
- `Modular fixtures <https://docs.pytest.org/en/latest/fixture.html>`_ for
managing small or parametrized long-lived test resources (see the sketch after this list);
- Can run `unittest <https://docs.pytest.org/en/latest/unittest.html>`_ (or trial),
`nose <https://docs.pytest.org/en/latest/nose.html>`_ test suites out of the box;
- Python 2.7, Python 3.4+, PyPy 2.3, Jython 2.5 (untested);
- Rich plugin architecture, with 315+ `external plugins <http://plugincompat.herokuapp.com>`_ and a thriving community;
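As a small illustrative sketch of the fixture mechanism mentioned in the list
above (not from the pytest docs; the names are arbitrary):

.. code-block:: python

    # content of test_fixture_sample.py
    import pytest


    @pytest.fixture
    def numbers():
        return [1, 2, 3]


    def test_sum(numbers):
        assert sum(numbers) == 6
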
Documentation
-------------
For full documentation, including installation, tutorials and PDF documents, please see https://docs.pytest.org/en/latest/.
Bugs/Requests
-------------
Please use the `GitHub issue tracker <https://github.com/pytest-dev/pytest/issues>`_ to submit bugs or request features.
Changelog
---------
Consult the `Changelog <https://docs.pytest.org/en/latest/changelog.html>`__ page for fixes and enhancements of each version.
Support pytest
--------------
You can support pytest by obtaining a `Tidelift subscription`_.
Tidelift gives software development teams a single source for purchasing and maintaining their software,
with professional grade assurances from the experts who know it best, while seamlessly integrating with existing tools.
.. _`Tidelift subscription`: https://tidelift.com/subscription/pkg/pypi-pytest?utm_source=pypi-pytest&utm_medium=referral&utm_campaign=readme
Security
^^^^^^^^
pytest has never been associated with a security vulnerability, but in any case, to report a
security vulnerability please use the `Tidelift security contact <https://tidelift.com/security>`_.
Tidelift will coordinate the fix and disclosure.
License
-------
Copyright Holger Krekel and others, 2004-2020.
Distributed under the terms of the `MIT`_ license, pytest is free and open source software.
.. _`MIT`: https://github.com/pytest-dev/pytest/blob/master/LICENSE | {
"source": "yandex/perforator",
"title": "contrib/python/pytest/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pytest/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4476
} |
.. image:: https://github.com/pytest-dev/pytest/raw/main/doc/en/img/pytest_logo_curves.svg
:target: https://docs.pytest.org/en/stable/
:align: center
:height: 200
:alt: pytest
------
.. image:: https://img.shields.io/pypi/v/pytest.svg
:target: https://pypi.org/project/pytest/
.. image:: https://img.shields.io/conda/vn/conda-forge/pytest.svg
:target: https://anaconda.org/conda-forge/pytest
.. image:: https://img.shields.io/pypi/pyversions/pytest.svg
:target: https://pypi.org/project/pytest/
.. image:: https://codecov.io/gh/pytest-dev/pytest/branch/main/graph/badge.svg
:target: https://codecov.io/gh/pytest-dev/pytest
:alt: Code coverage Status
.. image:: https://github.com/pytest-dev/pytest/workflows/test/badge.svg
:target: https://github.com/pytest-dev/pytest/actions?query=workflow%3Atest
.. image:: https://results.pre-commit.ci/badge/github/pytest-dev/pytest/main.svg
:target: https://results.pre-commit.ci/latest/github/pytest-dev/pytest/main
:alt: pre-commit.ci status
.. image:: https://img.shields.io/badge/code%20style-black-000000.svg
:target: https://github.com/psf/black
.. image:: https://www.codetriage.com/pytest-dev/pytest/badges/users.svg
:target: https://www.codetriage.com/pytest-dev/pytest
.. image:: https://readthedocs.org/projects/pytest/badge/?version=latest
:target: https://pytest.readthedocs.io/en/latest/?badge=latest
:alt: Documentation Status
.. image:: https://img.shields.io/badge/Discord-pytest--dev-blue
:target: https://discord.com/invite/pytest-dev
:alt: Discord
.. image:: https://img.shields.io/badge/Libera%20chat-%23pytest-orange
:target: https://web.libera.chat/#pytest
:alt: Libera chat
The ``pytest`` framework makes it easy to write small tests, yet
scales to support complex functional testing for applications and libraries.
An example of a simple test:
.. code-block:: python
# content of test_sample.py
def inc(x):
return x + 1
def test_answer():
assert inc(3) == 5
To execute it::
$ pytest
============================= test session starts =============================
collected 1 items
test_sample.py F
================================== FAILURES ===================================
_________________________________ test_answer _________________________________
def test_answer():
> assert inc(3) == 5
E assert 4 == 5
E + where 4 = inc(3)
test_sample.py:5: AssertionError
========================== 1 failed in 0.04 seconds ===========================
Due to ``pytest``'s detailed assertion introspection, only plain ``assert`` statements are used. See `getting-started <https://docs.pytest.org/en/stable/getting-started.html#our-first-test-run>`_ for more examples.
Features
--------
- Detailed info on failing `assert statements <https://docs.pytest.org/en/stable/how-to/assert.html>`_ (no need to remember ``self.assert*`` names)
- `Auto-discovery
<https://docs.pytest.org/en/stable/explanation/goodpractices.html#python-test-discovery>`_
of test modules and functions
- `Modular fixtures <https://docs.pytest.org/en/stable/explanation/fixtures.html>`_ for
  managing small or parametrized long-lived test resources (see the sketch after this list)
- Can run `unittest <https://docs.pytest.org/en/stable/how-to/unittest.html>`_ (or trial),
`nose <https://docs.pytest.org/en/stable/how-to/nose.html>`_ test suites out of the box
- Python 3.7+ or PyPy3
- Rich plugin architecture, with 850+ `external plugins <https://docs.pytest.org/en/latest/reference/plugin_list.html>`_ and a thriving community
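A minimal sketch of the fixture and parametrization features listed above (the file, fixture, and data names are invented for the example):

.. code-block:: python

    # content of test_users.py (hypothetical example)
    import pytest

    @pytest.fixture
    def user_db():
        # Stand-in for a small, long-lived test resource.
        return {"users": ["alice", "bob"]}

    @pytest.mark.parametrize("name", ["alice", "bob"])
    def test_user_exists(user_db, name):
        # The fixture is injected by argument name; the test runs once per parameter.
        assert name in user_db["users"]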
Documentation
-------------
For full documentation, including installation, tutorials and PDF documents, please see https://docs.pytest.org/en/stable/.
Bugs/Requests
-------------
Please use the `GitHub issue tracker <https://github.com/pytest-dev/pytest/issues>`_ to submit bugs or request features.
Changelog
---------
Consult the `Changelog <https://docs.pytest.org/en/stable/changelog.html>`__ page for fixes and enhancements of each version.
Support pytest
--------------
`Open Collective`_ is an online funding platform for open and transparent communities.
It provides tools to raise money and share your finances in full transparency.
It is the platform of choice for individuals and companies that want to make one-time or
monthly donations directly to the project.
See more details in the `pytest collective`_.
.. _Open Collective: https://opencollective.com
.. _pytest collective: https://opencollective.com/pytest
pytest for enterprise
---------------------
Available as part of the Tidelift Subscription.
The maintainers of pytest and thousands of other packages are working with Tidelift to deliver commercial support and
maintenance for the open source dependencies you use to build your applications.
Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use.
`Learn more. <https://tidelift.com/subscription/pkg/pypi-pytest?utm_source=pypi-pytest&utm_medium=referral&utm_campaign=enterprise&utm_term=repo>`_
Security
^^^^^^^^
pytest has never been associated with a security vulnerability, but in any case, to report a
security vulnerability please use the `Tidelift security contact <https://tidelift.com/security>`_.
Tidelift will coordinate the fix and disclosure.
License
-------
Copyright Holger Krekel and others, 2004.
Distributed under the terms of the `MIT`_ license, pytest is free and open source software.
.. _`MIT`: https://github.com/pytest-dev/pytest/blob/main/LICENSE | {
"source": "yandex/perforator",
"title": "contrib/python/pytest/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/pytest/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5627
} |
This is a (possibly incomplete) list of all the contributors to python-dateutil,
initially generated from the git author metadata. The details of their specific
contributions can be found in the git history.
Prior to 2017-12-01, the library was licensed solely under the BSD 3-clause
license, all contributions on or after 2017-12-01 are dual-licensed between
Apache 2.0 and BSD 3-clause. In the list below, anyone whose name is marked with
**R** has agreed to re-license their previously submitted code under Apache 2.0.
Anyone whose name is marked with a **D** has only made contributions since the
switch, and thus all their contributions are dual-licensed.
## Contributors (alphabetical order)
- Adam Chainz <adam@MASKED>
- Adrien Cossa <cossa@MASKED>
- Alec Nikolas Reiter <alecreiter@MASKED>
- Alec Reiter <areiter@MASKED>
- Aleksei Strizhak <alexei.mifrill.strizhak@MASKED> (gh: @Mifrill)
- Alex Chamberlain (gh: @alexchamberlain) **D**
- Alex Verdyan <verdyan@MASKED>
- Alex Willmer <[email protected]> (gh: @moreati) **R**
- Alexander Brugh <alexander.brugh@MASKED> (gh: @abrugh)
- Alexander Shadchin <[email protected]> (gh: @shadchin) **D**
- Alistair McMaster <alistair@MASKED> (gh: @alimcmaster1 ) **D**
- Allison Quinlan <[email protected]> (gh: @aquinlan) **D**
- Andrew Bennett (gh: @andrewcbennett) **D**
- Andrew Murray <radarhere@MASKED>
- Arclight <arclight@MASKED> (gh: @arclightslavik)
- Aritro Nandi <[email protected]> (gh: @gurgenz221) **D**
- Bernat Gabor <[email protected]> (gh: @gaborbernat) **D**
- Bradlee Speice <[email protected]> (gh: @bspeice) **D**
- Brandon W Maister <quodlibetor@MASKED>
- Brock Mendel <jbrockmendel@MASKED> (gh: @jbrockmendel) **R**
- Brook Li (gh: @absreim) **D**
- Carlos <carlosxl@MASKED>
- Cheuk Ting Ho <[email protected]> (gh: @cheukting) **D**
- Chris van den Berg (gh: bergvca) **D**
- Christopher Cordero <[email protected]> (gh: cs-cordero) **D**
- Christopher Corley <cscorley@MASKED>
- Claudio Canepa <ccanepacc@MASKED>
- Corey Girard <[email protected]> (gh: @coreygirard) **D**
- Cosimo Lupo <[email protected]> (gh: @anthrotype) **D**
- Daniel Lemm (gh: @ffe4) **D**
- Daniel Lepage <dplepage@MASKED>
- David Lehrian <david@MASKED>
- Dean Allsopp (gh: @daplantagenet) **D**
- Dominik Kozaczko <dominik@MASKED>
- Elliot Hughes <[email protected]> (gh: @ElliotJH) **D**
- Elvis Pranskevichus <el@MASKED>
- Fan Huang <[email protected]>(gh: @fhuang5) **D**
- Florian Rathgeber (gh: @kynan) **D**
- Gabriel Bianconi <gabriel@MASKED> (gh: @GabrielBianconi) **D**
- Gabriel Poesia <gabriel.poesia@MASKED>
- Gökçen Nurlu <[email protected]> (gh: @gokcennurlu) **D**
- Grant Garrett-Grossman <[email protected]> (gh: @FakeNameSE) **D**
- Gustavo Niemeyer <[email protected]> (gh: @niemeyer)
- Holger Joukl <holger.joukl@MASKED> (gh: @hjoukl)
- Hugo van Kemenade (gh: @hugovk) **D**
- Igor <mrigor83@MASKED>
- Ionuț Ciocîrlan <jdxlark@MASKED>
- Jacqueline Chen <[email protected]> (gh: @jachen20) **D**
- Jake Chorley (gh: @jakec-github) **D**
- Jakub Kulík (gh: @kulikjak) **D**
- Jan Studený <jendas1@MASKED>
- Jay Weisskopf <[email protected]> (gh: @jayschwa) **D**
- Jitesh <jitesh@MASKED>
- John Purviance <jpurviance@MASKED> (gh @jpurviance) **D**
- Jon Dufresne <jon.dufresne@MASKED> (gh: @jdufresne) **R**
- Jonas Neubert <jonas@MASKED> (gh: @jonemo) **R**
- Kevin Nguyen <kvn219@MASKED> **D**
- Kirit Thadaka <[email protected]> (gh: @kirit93) **D**
- Kubilay Kocak <koobs@MASKED>
- Laszlo Kiss Kollar <kiss.kollar.laszlo@MASKED> (gh: @lkollar) **D**
- Lauren Oldja <oldja@MASKED> (gh: @loldja) **D**
- Luca Ferocino <luca.ferox@MASKED> (gh: @lucaferocino) **D**
- Mario Corchero <mcorcherojim@MASKED> (gh: @mariocj89) **R**
- Mark Bailey <msb@MASKED> **D**
- Mateusz Dziedzic (gh: @m-dz) **D**
- Matt Cooper <vtbassmatt@MASKED> (gh: @vtbassmatt) **D**
- Matthew Schinckel <matt@MASKED>
- Max Shenfield <shenfieldmax@MASKED>
- Maxime Lorant <maxime.lorant@MASKED>
- Michael Aquilina <michaelaquilina@MASKED> (gh: @MichaelAquilina)
- Michael J. Schultz <mjschultz@MASKED>
- Michael Käufl (gh: @michael-k)
- Mike Gilbert <floppym@MASKED>
- Nicholas Herrriot <[email protected]> **D**
- Nicolas Évrard (gh: @nicoe) **D**
- Nick Smith <nick.smith@MASKED>
- Orson Adams <orson.network@MASKED> (gh: @parsethis) **D**
- Paul Brown (gh: @pawl) **D**
- Paul Dickson (gh @prdickson) **D**
- Paul Ganssle <[email protected]> (gh: @pganssle) **R**
- Pascal van Kooten <kootenpv@MASKED> (gh: @kootenpv) **R**
- Pavel Ponomarev <comrad.awsum@MASKED>
- Peter Bieringer <pb@MASKED>
- Pierre Gergondet <pierre.gergondet@MASKED> (gh: @gergondet) **D**
- Quentin Pradet <quentin@MASKED>
- Raymond Cha (gh: @weatherpattern) **D**
- Ridhi Mahajan <ridhikmahajan@MASKED> **D**
- Robin Henriksson Törnström <gh: @MrRawbin> **D**
- Roy Williams <rwilliams@MASKED>
- Rustem Saiargaliev (gh: @amureki) **D**
- Satyabrat Bhol <satyabrat35@MASKED> (gh: @Satyabrat35) **D**
- Savraj <savraj@MASKED>
- Sergey Vishnikin <armicron@MASKED>
- Sherry Zhou (gh: @cssherry) **D**
- Siping Meng (gh: @smeng10) **D**
- Stefan Bonchev **D**
- Thierry Bastian <thierryb@MASKED>
- Thomas A Caswell <tcaswell@MASKED> (gh: @tacaswell) **R**
- Thomas Achtemichuk <tom@MASKED>
- Thomas Grainger <[email protected]> (gh: @graingert) **D**
- Thomas Kluyver <takowl@MASKED> (gh: @takluyver)
- Tim Gates <[email protected]> (gh: timgates42)
- Tomasz Kluczkowski (gh: @Tomasz-Kluczkowski) **D**
- Tomi Pieviläinen <[email protected]>
- Unrud <Unrud@MASKED> (gh: @unrud)
- Xavier Lapointe <lapointe.xavier@MASKED> (gh: @lapointexavier) **D**
- X O <xo@MASKED>
- Yaron de Leeuw <[email protected]> (gh: @jarondl)
- Yoney <[email protected]> **D**
- Yuan Huang <[email protected]> (gh: @huangy22) **D**
- Zbigniew Jędrzejewski-Szmek <zbyszek@MASKED>
- bachmann <bachmann.matt@MASKED>
- bjv <brandon.vanvaerenbergh@MASKED> (@bjamesvERT)
- gl <gl@MASKED>
- gfyoung <[email protected]> **D**
- Labrys <[email protected]> (gh: @labrys) **R**
- ms-boom <ms-boom@MASKED>
- ryanss <ryanssdev@MASKED> (gh: @ryanss) **R**
Unless someone has deliberately given permission to publish their e-mail, I have masked the domain names. If you are not on this list and believe you should be, or you *are* on this list and your information is inaccurate, please e-mail the current maintainer or the mailing list ([email protected]) with your name, e-mail (if desired) and GitHub (if desired / applicable), as you would like them displayed. Additionally, please indicate if you are willing to dual license your old contributions under Apache 2.0. | {
"source": "yandex/perforator",
"title": "contrib/python/python-dateutil/py2/AUTHORS.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/python-dateutil/py2/AUTHORS.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 6642
} |
dateutil - powerful extensions to datetime
==========================================
|pypi| |support| |licence|
|gitter| |readthedocs|
|travis| |appveyor| |pipelines| |coverage|
.. |pypi| image:: https://img.shields.io/pypi/v/python-dateutil.svg?style=flat-square
:target: https://pypi.org/project/python-dateutil/
:alt: pypi version
.. |support| image:: https://img.shields.io/pypi/pyversions/python-dateutil.svg?style=flat-square
:target: https://pypi.org/project/python-dateutil/
:alt: supported Python version
.. |travis| image:: https://img.shields.io/travis/dateutil/dateutil/master.svg?style=flat-square&label=Travis%20Build
:target: https://travis-ci.org/dateutil/dateutil
:alt: travis build status
.. |appveyor| image:: https://img.shields.io/appveyor/ci/dateutil/dateutil/master.svg?style=flat-square&logo=appveyor
:target: https://ci.appveyor.com/project/dateutil/dateutil
:alt: appveyor build status
.. |pipelines| image:: https://dev.azure.com/pythondateutilazure/dateutil/_apis/build/status/dateutil.dateutil?branchName=master
:target: https://dev.azure.com/pythondateutilazure/dateutil/_build/latest?definitionId=1&branchName=master
:alt: azure pipelines build status
.. |coverage| image:: https://codecov.io/gh/dateutil/dateutil/branch/master/graphs/badge.svg?branch=master
:target: https://codecov.io/gh/dateutil/dateutil?branch=master
:alt: Code coverage
.. |gitter| image:: https://badges.gitter.im/dateutil/dateutil.svg
:alt: Join the chat at https://gitter.im/dateutil/dateutil
:target: https://gitter.im/dateutil/dateutil
.. |licence| image:: https://img.shields.io/pypi/l/python-dateutil.svg?style=flat-square
:target: https://pypi.org/project/python-dateutil/
:alt: licence
.. |readthedocs| image:: https://img.shields.io/readthedocs/dateutil/latest.svg?style=flat-square&label=Read%20the%20Docs
:alt: Read the documentation at https://dateutil.readthedocs.io/en/latest/
:target: https://dateutil.readthedocs.io/en/latest/
The `dateutil` module provides powerful extensions to
the standard `datetime` module, available in Python.
Installation
============
`dateutil` can be installed from PyPI using `pip` (note that the package name is
different from the importable name)::
pip install python-dateutil
Download
========
dateutil is available on PyPI
https://pypi.org/project/python-dateutil/
The documentation is hosted at:
https://dateutil.readthedocs.io/en/stable/
Code
====
The code and issue tracker are hosted on GitHub:
https://github.com/dateutil/dateutil/
Features
========
* Computing of relative deltas (next month, next year,
next Monday, last week of month, etc);
* Computing of relative deltas between two given
date and/or datetime objects;
* Computing of dates based on very flexible recurrence rules,
using a superset of the `iCalendar <https://www.ietf.org/rfc/rfc2445.txt>`_
specification. Parsing of RFC strings is supported as well.
* Generic parsing of dates in almost any string format;
* Timezone (tzinfo) implementations for tzfile(5) format
files (/etc/localtime, /usr/share/zoneinfo, etc), TZ
environment string (in all known formats), iCalendar
format files, given ranges (with help from relative deltas),
local machine timezone, fixed offset timezone, UTC timezone,
and Windows registry-based time zones.
* Internal up-to-date world timezone information based on
Olson's database.
* Computing of Easter Sunday dates for any given year,
using Western, Orthodox or Julian algorithms;
* A comprehensive test suite.
Quick example
=============
Here's a snapshot, just to give an idea about the power of the
package. For more examples, look at the documentation.
Suppose you want to know how much time is left, in
years/months/days/etc, before the next easter happening on a
year with a Friday 13th in August, and you want to get today's
date out of the "date" unix system command. Here is the code:
.. doctest:: readmeexample
>>> from dateutil.relativedelta import *
>>> from dateutil.easter import *
>>> from dateutil.rrule import *
>>> from dateutil.parser import *
>>> from datetime import *
>>> now = parse("Sat Oct 11 17:13:46 UTC 2003")
>>> today = now.date()
>>> year = rrule(YEARLY,dtstart=now,bymonth=8,bymonthday=13,byweekday=FR)[0].year
>>> rdelta = relativedelta(easter(year), today)
>>> print("Today is: %s" % today)
Today is: 2003-10-11
>>> print("Year with next Aug 13th on a Friday is: %s" % year)
Year with next Aug 13th on a Friday is: 2004
>>> print("How far is the Easter of that year: %s" % rdelta)
How far is the Easter of that year: relativedelta(months=+6)
>>> print("And the Easter of that year is: %s" % (today+rdelta))
And the Easter of that year is: 2004-04-11
Being exactly 6 months ahead was **really** a coincidence :)
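For more everyday use, parsing and shifting a date takes only a couple of lines (a small sketch with arbitrary values):

.. code-block:: pycon

    >>> from dateutil.parser import parse
    >>> from dateutil.relativedelta import relativedelta
    >>> meeting = parse("2003-10-11 09:30")
    >>> meeting + relativedelta(weeks=+2)
    datetime.datetime(2003, 10, 25, 9, 30)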
Contributing
============
We welcome many types of contributions - bug reports, pull requests (code, infrastructure or documentation fixes). For more information about how to contribute to the project, see the ``CONTRIBUTING.md`` file in the repository.
Author
======
The dateutil module was written by Gustavo Niemeyer <[email protected]>
in 2003.
It is maintained by:
* Gustavo Niemeyer <[email protected]> 2003-2011
* Tomi Pieviläinen <[email protected]> 2012-2014
* Yaron de Leeuw <[email protected]> 2014-2016
* Paul Ganssle <[email protected]> 2015-
Starting with version 2.4.1 and running until 2.8.2, all source and binary
distributions will be signed by a PGP key that has, at the very least, been
signed by the key which made the previous release. A table of release signing
keys can be found below:
=========== ============================
Releases Signing key fingerprint
=========== ============================
2.4.1-2.8.2 `6B49 ACBA DCF6 BD1C A206 67AB CD54 FCE3 D964 BEFB`_
=========== ============================
New releases *may* have signed tags, but binary and source distributions
uploaded to PyPI will no longer have GPG signatures attached.
Contact
=======
Our mailing list is available at `[email protected] <https://mail.python.org/mailman/listinfo/dateutil>`_. As it is hosted by the PSF, it is subject to the `PSF code of
conduct <https://www.python.org/psf/conduct/>`_.
License
=======
All contributions after December 1, 2017 are released under a dual license - either the `Apache 2.0 License <https://www.apache.org/licenses/LICENSE-2.0>`_ or the `BSD 3-Clause License <https://opensource.org/licenses/BSD-3-Clause>`_. Contributions before December 1, 2017 - except those explicitly relicensed - are released only under the BSD 3-Clause License.
.. _6B49 ACBA DCF6 BD1C A206 67AB CD54 FCE3 D964 BEFB:
https://pgp.mit.edu/pks/lookup?op=vindex&search=0xCD54FCE3D964BEFB | {
"source": "yandex/perforator",
"title": "contrib/python/python-dateutil/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/python-dateutil/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 6816
} |
This is a (possibly incomplete) list of all the contributors to python-dateutil,
initially generated from the git author metadata. The details of their specific
contributions can be found in the git history.
Prior to 2017-12-01, the library was licensed solely under the BSD 3-clause
license, all contributions on or after 2017-12-01 are dual-licensed between
Apache 2.0 and BSD 3-clause. In the list below, anyone whose name is marked with
**R** has agreed to re-license their previously submitted code under Apache 2.0.
Anyone whose name is marked with a **D** has only made contributions since the
switch, and thus all their contributions are dual-licensed.
## Contributors (alphabetical order)
- Adam Chainz <adam@MASKED>
- Adrien Cossa <cossa@MASKED>
- Alec Nikolas Reiter <alecreiter@MASKED>
- Alec Reiter <areiter@MASKED>
- Aleksei Strizhak <alexei.mifrill.strizhak@MASKED> (gh: @Mifrill)
- Alex Chamberlain (gh: @alexchamberlain) **D**
- Alex Verdyan <verdyan@MASKED>
- Alex Willmer <[email protected]> (gh: @moreati) **R**
- Alexander Brugh <alexander.brugh@MASKED> (gh: @abrugh)
- Alexander Shadchin <[email protected]> (gh: @shadchin) **D**
- Alistair McMaster <alistair@MASKED> (gh: @alimcmaster1 ) **D**
- Allison Quinlan <[email protected]> (gh: @aquinlan) **D**
- Andrew Bennett (gh: @andrewcbennett) **D**
- Andrew Murray <radarhere@MASKED>
- Arclight <arclight@MASKED> (gh: @arclightslavik)
- Aritro Nandi <[email protected]> (gh: @gurgenz221) **D**
- Bernat Gabor <[email protected]> (gh: @gaborbernat) **D**
- Bradlee Speice <[email protected]> (gh: @bspeice) **D**
- Brandon W Maister <quodlibetor@MASKED>
- Brock Mendel <jbrockmendel@MASKED> (gh: @jbrockmendel) **R**
- Brook Li (gh: @absreim) **D**
- Carlos <carlosxl@MASKED>
- Cheuk Ting Ho <[email protected]> (gh: @cheukting) **D**
- Chris van den Berg (gh: bergvca) **D**
- Christopher Cordero <[email protected]> (gh: cs-cordero) **D**
- Christopher Corley <cscorley@MASKED>
- Claudio Canepa <ccanepacc@MASKED>
- Corey Girard <[email protected]> (gh: @coreygirard) **D**
- Cosimo Lupo <[email protected]> (gh: @anthrotype) **D**
- Daniel Lemm (gh: @ffe4) **D**
- Daniel Lepage <dplepage@MASKED>
- David Lehrian <david@MASKED>
- Dean Allsopp (gh: @daplantagenet) **D**
- Dominik Kozaczko <dominik@MASKED>
- Elliot Hughes <[email protected]> (gh: @ElliotJH) **D**
- Elvis Pranskevichus <el@MASKED>
- Fan Huang <[email protected]>(gh: @fhuang5) **D**
- Florian Rathgeber (gh: @kynan) **D**
- Gabriel Bianconi <gabriel@MASKED> (gh: @GabrielBianconi) **D**
- Gabriel Poesia <gabriel.poesia@MASKED>
- Gökçen Nurlu <[email protected]> (gh: @gokcennurlu) **D**
- Grant Garrett-Grossman <[email protected]> (gh: @FakeNameSE) **D**
- Gustavo Niemeyer <[email protected]> (gh: @niemeyer)
- Holger Joukl <holger.joukl@MASKED> (gh: @hjoukl)
- Hugo van Kemenade (gh: @hugovk) **D**
- Igor <mrigor83@MASKED>
- Ionuț Ciocîrlan <jdxlark@MASKED>
- Jacqueline Chen <[email protected]> (gh: @jachen20) **D**
- Jake Chorley (gh: @jakec-github) **D**
- Jakub Kulík (gh: @kulikjak) **D**
- Jan Studený <jendas1@MASKED>
- Jay Weisskopf <[email protected]> (gh: @jayschwa) **D**
- Jitesh <jitesh@MASKED>
- John Purviance <jpurviance@MASKED> (gh @jpurviance) **D**
- Jon Dufresne <jon.dufresne@MASKED> (gh: @jdufresne) **R**
- Jonas Neubert <jonas@MASKED> (gh: @jonemo) **R**
- Kevin Nguyen <kvn219@MASKED> **D**
- Kirit Thadaka <[email protected]> (gh: @kirit93) **D**
- Kubilay Kocak <koobs@MASKED>
- Laszlo Kiss Kollar <kiss.kollar.laszlo@MASKED> (gh: @lkollar) **D**
- Lauren Oldja <oldja@MASKED> (gh: @loldja) **D**
- Luca Ferocino <luca.ferox@MASKED> (gh: @lucaferocino) **D**
- Mario Corchero <mcorcherojim@MASKED> (gh: @mariocj89) **R**
- Mark Bailey <msb@MASKED> **D**
- Mateusz Dziedzic (gh: @m-dz) **D**
- Matt Cooper <vtbassmatt@MASKED> (gh: @vtbassmatt) **D**
- Matthew Schinckel <matt@MASKED>
- Max Shenfield <shenfieldmax@MASKED>
- Maxime Lorant <maxime.lorant@MASKED>
- Michael Aquilina <michaelaquilina@MASKED> (gh: @MichaelAquilina)
- Michael J. Schultz <mjschultz@MASKED>
- Michael Käufl (gh: @michael-k)
- Mike Gilbert <floppym@MASKED>
- Nicholas Herrriot <[email protected]> **D**
- Nicolas Évrard (gh: @nicoe) **D**
- Nick Smith <nick.smith@MASKED>
- Orson Adams <orson.network@MASKED> (gh: @parsethis) **D**
- Paul Brown (gh: @pawl) **D**
- Paul Dickson (gh @prdickson) **D**
- Paul Ganssle <[email protected]> (gh: @pganssle) **R**
- Pascal van Kooten <kootenpv@MASKED> (gh: @kootenpv) **R**
- Pavel Ponomarev <comrad.awsum@MASKED>
- Peter Bieringer <pb@MASKED>
- Pierre Gergondet <pierre.gergondet@MASKED> (gh: @gergondet) **D**
- Quentin Pradet <quentin@MASKED>
- Raymond Cha (gh: @weatherpattern) **D**
- Ridhi Mahajan <ridhikmahajan@MASKED> **D**
- Robin Henriksson Törnström <gh: @MrRawbin> **D**
- Roy Williams <rwilliams@MASKED>
- Rustem Saiargaliev (gh: @amureki) **D**
- Satyabrat Bhol <satyabrat35@MASKED> (gh: @Satyabrat35) **D**
- Savraj <savraj@MASKED>
- Sergey Vishnikin <armicron@MASKED>
- Sherry Zhou (gh: @cssherry) **D**
- Siping Meng (gh: @smeng10) **D**
- Stefan Bonchev **D**
- Thierry Bastian <thierryb@MASKED>
- Thomas A Caswell <tcaswell@MASKED> (gh: @tacaswell) **R**
- Thomas Achtemichuk <tom@MASKED>
- Thomas Grainger <[email protected]> (gh: @graingert) **D**
- Thomas Kluyver <takowl@MASKED> (gh: @takluyver)
- Tim Gates <[email protected]> (gh: timgates42)
- Tomasz Kluczkowski (gh: @Tomasz-Kluczkowski) **D**
- Tomi Pieviläinen <[email protected]>
- Unrud <Unrud@MASKED> (gh: @unrud)
- Xavier Lapointe <lapointe.xavier@MASKED> (gh: @lapointexavier) **D**
- X O <xo@MASKED>
- Yaron de Leeuw <[email protected]> (gh: @jarondl)
- Yoney <[email protected]> **D**
- Yuan Huang <[email protected]> (gh: @huangy22) **D**
- Zbigniew Jędrzejewski-Szmek <zbyszek@MASKED>
- bachmann <bachmann.matt@MASKED>
- bjv <brandon.vanvaerenbergh@MASKED> (@bjamesvERT)
- gl <gl@MASKED>
- gfyoung <[email protected]> **D**
- Labrys <[email protected]> (gh: @labrys) **R**
- ms-boom <ms-boom@MASKED>
- ryanss <ryanssdev@MASKED> (gh: @ryanss) **R**
Unless someone has deliberately given permission to publish their e-mail, I have masked the domain names. If you are not on this list and believe you should be, or you *are* on this list and your information is inaccurate, please e-mail the current maintainer or the mailing list ([email protected]) with your name, e-mail (if desired) and GitHub (if desired / applicable), as you would like them displayed. Additionally, please indicate if you are willing to dual license your old contributions under Apache 2.0. | {
"source": "yandex/perforator",
"title": "contrib/python/python-dateutil/py3/AUTHORS.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/python-dateutil/py3/AUTHORS.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 6642
} |
dateutil - powerful extensions to datetime
==========================================
|pypi| |support| |licence|
|gitter| |readthedocs|
|travis| |appveyor| |pipelines| |coverage|
.. |pypi| image:: https://img.shields.io/pypi/v/python-dateutil.svg?style=flat-square
:target: https://pypi.org/project/python-dateutil/
:alt: pypi version
.. |support| image:: https://img.shields.io/pypi/pyversions/python-dateutil.svg?style=flat-square
:target: https://pypi.org/project/python-dateutil/
:alt: supported Python version
.. |travis| image:: https://img.shields.io/travis/dateutil/dateutil/master.svg?style=flat-square&label=Travis%20Build
:target: https://travis-ci.org/dateutil/dateutil
:alt: travis build status
.. |appveyor| image:: https://img.shields.io/appveyor/ci/dateutil/dateutil/master.svg?style=flat-square&logo=appveyor
:target: https://ci.appveyor.com/project/dateutil/dateutil
:alt: appveyor build status
.. |pipelines| image:: https://dev.azure.com/pythondateutilazure/dateutil/_apis/build/status/dateutil.dateutil?branchName=master
:target: https://dev.azure.com/pythondateutilazure/dateutil/_build/latest?definitionId=1&branchName=master
:alt: azure pipelines build status
.. |coverage| image:: https://codecov.io/gh/dateutil/dateutil/branch/master/graphs/badge.svg?branch=master
:target: https://codecov.io/gh/dateutil/dateutil?branch=master
:alt: Code coverage
.. |gitter| image:: https://badges.gitter.im/dateutil/dateutil.svg
:alt: Join the chat at https://gitter.im/dateutil/dateutil
:target: https://gitter.im/dateutil/dateutil
.. |licence| image:: https://img.shields.io/pypi/l/python-dateutil.svg?style=flat-square
:target: https://pypi.org/project/python-dateutil/
:alt: licence
.. |readthedocs| image:: https://img.shields.io/readthedocs/dateutil/latest.svg?style=flat-square&label=Read%20the%20Docs
:alt: Read the documentation at https://dateutil.readthedocs.io/en/latest/
:target: https://dateutil.readthedocs.io/en/latest/
The `dateutil` module provides powerful extensions to
the standard `datetime` module, available in Python.
Installation
============
`dateutil` can be installed from PyPI using `pip` (note that the package name is
different from the importable name)::
pip install python-dateutil
Download
========
dateutil is available on PyPI
https://pypi.org/project/python-dateutil/
The documentation is hosted at:
https://dateutil.readthedocs.io/en/stable/
Code
====
The code and issue tracker are hosted on GitHub:
https://github.com/dateutil/dateutil/
Features
========
* Computing of relative deltas (next month, next year,
next Monday, last week of month, etc);
* Computing of relative deltas between two given
date and/or datetime objects;
* Computing of dates based on very flexible recurrence rules,
using a superset of the `iCalendar <https://www.ietf.org/rfc/rfc2445.txt>`_
specification. Parsing of RFC strings is supported as well.
* Generic parsing of dates in almost any string format;
* Timezone (tzinfo) implementations for tzfile(5) format
files (/etc/localtime, /usr/share/zoneinfo, etc), TZ
environment string (in all known formats), iCalendar
format files, given ranges (with help from relative deltas),
local machine timezone, fixed offset timezone, UTC timezone,
and Windows registry-based time zones.
* Internal up-to-date world timezone information based on
Olson's database.
* Computing of Easter Sunday dates for any given year,
using Western, Orthodox or Julian algorithms;
* A comprehensive test suite.
Quick example
=============
Here's a snapshot, just to give an idea about the power of the
package. For more examples, look at the documentation.
Suppose you want to know how much time is left, in
years/months/days/etc, before the next easter happening on a
year with a Friday 13th in August, and you want to get today's
date out of the "date" unix system command. Here is the code:
.. doctest:: readmeexample
>>> from dateutil.relativedelta import *
>>> from dateutil.easter import *
>>> from dateutil.rrule import *
>>> from dateutil.parser import *
>>> from datetime import *
>>> now = parse("Sat Oct 11 17:13:46 UTC 2003")
>>> today = now.date()
>>> year = rrule(YEARLY,dtstart=now,bymonth=8,bymonthday=13,byweekday=FR)[0].year
>>> rdelta = relativedelta(easter(year), today)
>>> print("Today is: %s" % today)
Today is: 2003-10-11
>>> print("Year with next Aug 13th on a Friday is: %s" % year)
Year with next Aug 13th on a Friday is: 2004
>>> print("How far is the Easter of that year: %s" % rdelta)
How far is the Easter of that year: relativedelta(months=+6)
>>> print("And the Easter of that year is: %s" % (today+rdelta))
And the Easter of that year is: 2004-04-11
Being exactly 6 months ahead was **really** a coincidence :)
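Recurrence rules are just as compact. As an illustrative sketch using the same arbitrary date, here are the next three Mondays after it:

.. code-block:: pycon

    >>> from datetime import datetime
    >>> from dateutil.rrule import rrule, WEEKLY, MO
    >>> list(rrule(WEEKLY, byweekday=MO, count=3, dtstart=datetime(2003, 10, 11)))
    [datetime.datetime(2003, 10, 13, 0, 0), datetime.datetime(2003, 10, 20, 0, 0), datetime.datetime(2003, 10, 27, 0, 0)]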
Contributing
============
We welcome many types of contributions - bug reports, pull requests (code, infrastructure or documentation fixes). For more information about how to contribute to the project, see the ``CONTRIBUTING.md`` file in the repository.
Author
======
The dateutil module was written by Gustavo Niemeyer <[email protected]>
in 2003.
It is maintained by:
* Gustavo Niemeyer <[email protected]> 2003-2011
* Tomi Pieviläinen <[email protected]> 2012-2014
* Yaron de Leeuw <[email protected]> 2014-2016
* Paul Ganssle <[email protected]> 2015-
Starting with version 2.4.1 and running until 2.8.2, all source and binary
distributions will be signed by a PGP key that has, at the very least, been
signed by the key which made the previous release. A table of release signing
keys can be found below:
=========== ============================
Releases Signing key fingerprint
=========== ============================
2.4.1-2.8.2 `6B49 ACBA DCF6 BD1C A206 67AB CD54 FCE3 D964 BEFB`_
=========== ============================
New releases *may* have signed tags, but binary and source distributions
uploaded to PyPI will no longer have GPG signatures attached.
Contact
=======
Our mailing list is available at `[email protected] <https://mail.python.org/mailman/listinfo/dateutil>`_. As it is hosted by the PSF, it is subject to the `PSF code of
conduct <https://www.python.org/psf/conduct/>`_.
License
=======
All contributions after December 1, 2017 are released under a dual license - either the `Apache 2.0 License <https://www.apache.org/licenses/LICENSE-2.0>`_ or the `BSD 3-Clause License <https://opensource.org/licenses/BSD-3-Clause>`_. Contributions before December 1, 2017 - except those explicitly relicensed - are released only under the BSD 3-Clause License.
.. _6B49 ACBA DCF6 BD1C A206 67AB CD54 FCE3 D964 BEFB:
https://pgp.mit.edu/pks/lookup?op=vindex&search=0xCD54FCE3D964BEFB | {
"source": "yandex/perforator",
"title": "contrib/python/python-dateutil/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/python-dateutil/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 6816
} |


## A fast JSON parser/generator for C++ with both SAX/DOM style API
Tencent is pleased to support the open source community by making RapidJSON available.
Copyright (C) 2015 THL A29 Limited, a Tencent company, and Milo Yip.
* [RapidJSON GitHub](https://github.com/Tencent/rapidjson/)
* RapidJSON Documentation
* [English](http://rapidjson.org/)
* [简体中文](http://rapidjson.org/zh-cn/)
* [GitBook](https://www.gitbook.com/book/miloyip/rapidjson/) with downloadable PDF/EPUB/MOBI, without API reference.
## Build status
| [Linux][lin-link] | [Windows][win-link] | [Coveralls][cov-link] |
| :---------------: | :-----------------: | :-------------------: |
| ![lin-badge] | ![win-badge] | ![cov-badge] |
[lin-badge]: https://travis-ci.org/Tencent/rapidjson.svg?branch=master "Travis build status"
[lin-link]: https://travis-ci.org/Tencent/rapidjson "Travis build status"
[win-badge]: https://ci.appveyor.com/api/projects/status/l6qulgqahcayidrf/branch/master?svg=true "AppVeyor build status"
[win-link]: https://ci.appveyor.com/project/miloyip/rapidjson-0fdqj/branch/master "AppVeyor build status"
[cov-badge]: https://coveralls.io/repos/Tencent/rapidjson/badge.svg?branch=master "Coveralls coverage"
[cov-link]: https://coveralls.io/r/Tencent/rapidjson?branch=master "Coveralls coverage"
## Introduction
RapidJSON is a JSON parser and generator for C++. It was inspired by [RapidXml](http://rapidxml.sourceforge.net/).
* RapidJSON is **small** but **complete**. It supports both SAX and DOM style API. The SAX parser is only a half thousand lines of code.
* RapidJSON is **fast**. Its performance can be comparable to `strlen()`. It also optionally supports SSE2/SSE4.2 for acceleration.
* RapidJSON is **self-contained** and **header-only**. It does not depend on external libraries such as BOOST. It even does not depend on STL.
* RapidJSON is **memory-friendly**. Each JSON value occupies exactly 16 bytes for most 32/64-bit machines (excluding text string). By default it uses a fast memory allocator, and the parser allocates memory compactly during parsing.
* RapidJSON is **Unicode-friendly**. It supports UTF-8, UTF-16, UTF-32 (LE & BE), and their detection, validation and transcoding internally. For example, you can read a UTF-8 file and let RapidJSON transcode the JSON strings into UTF-16 in the DOM. It also supports surrogates and "\u0000" (null character).
More features can be read [here](doc/features.md).
JSON(JavaScript Object Notation) is a light-weight data exchange format. RapidJSON should be in full compliance with RFC7159/ECMA-404, with optional support of relaxed syntax. More information about JSON can be obtained at
* [Introducing JSON](http://json.org/)
* [RFC7159: The JavaScript Object Notation (JSON) Data Interchange Format](https://tools.ietf.org/html/rfc7159)
* [Standard ECMA-404: The JSON Data Interchange Format](https://www.ecma-international.org/publications/standards/Ecma-404.htm)
## Highlights in v1.1 (2016-8-25)
* Added [JSON Pointer](doc/pointer.md)
* Added [JSON Schema](doc/schema.md)
* Added [relaxed JSON syntax](doc/dom.md) (comment, trailing comma, NaN/Infinity)
* Iterating array/object with [C++11 Range-based for loop](doc/tutorial.md)
* Reduce memory overhead of each `Value` from 24 bytes to 16 bytes in x86-64 architecture.
For other changes please refer to [change log](CHANGELOG.md).
## Compatibility
RapidJSON is cross-platform. Some platform/compiler combinations which have been tested are shown as follows.
* Visual C++ 2008/2010/2013 on Windows (32/64-bit)
* GNU C++ 3.8.x on Cygwin
* Clang 3.4 on Mac OS X (32/64-bit) and iOS
* Clang 3.4 on Android NDK
Users can build and run the unit tests on their platform/compiler.
## Installation
RapidJSON is a header-only C++ library. Just copy the `include/rapidjson` folder to system or project's include path.
Alternatively, if you are using the [vcpkg](https://github.com/Microsoft/vcpkg/) dependency manager you can download and install rapidjson with CMake integration in a single command:
* vcpkg install rapidjson
RapidJSON uses the following software as its dependencies:
* [CMake](https://cmake.org/) as a general build tool
* (optional) [Doxygen](http://www.doxygen.org) to build documentation
* (optional) [googletest](https://github.com/google/googletest) for unit and performance testing
To generate user documentation and run tests please proceed with the steps below:
1. Execute `git submodule update --init` to get the files of thirdparty submodules (google test).
2. Create directory called `build` in rapidjson source directory.
3. Change to `build` directory and run `cmake ..` command to configure your build. Windows users can do the same with cmake-gui application.
4. On Windows, build the solution found in the build directory. On Linux, run `make` from the build directory.
On a successful build you will find the compiled test and example binaries in the `bin`
directory. The generated documentation will be available in the `doc/html`
directory of the build tree. To run tests after the build has finished, run `make test`
or `ctest` from your build tree. You can get detailed output using the `ctest -V` command.
It is possible to install library system-wide by running `make install` command
from the build tree with administrative privileges. This will install all files
according to system preferences. Once RapidJSON is installed, it is possible
to use it from other CMake projects by adding `find_package(RapidJSON)` line to
your CMakeLists.txt.
## Usage at a glance
This simple example parses a JSON string into a document (DOM), makes a simple modification of the DOM, and finally stringifies the DOM back to a JSON string.
~~~~~~~~~~cpp
// rapidjson/example/simpledom/simpledom.cpp
#include "rapidjson/document.h"
#include "rapidjson/writer.h"
#include "rapidjson/stringbuffer.h"
#include <iostream>
using namespace rapidjson;
int main() {
// 1. Parse a JSON string into DOM.
const char* json = "{\"project\":\"rapidjson\",\"stars\":10}";
Document d;
d.Parse(json);
// 2. Modify it by DOM.
Value& s = d["stars"];
s.SetInt(s.GetInt() + 1);
// 3. Stringify the DOM
StringBuffer buffer;
Writer<StringBuffer> writer(buffer);
d.Accept(writer);
// Output {"project":"rapidjson","stars":11}
std::cout << buffer.GetString() << std::endl;
return 0;
}
~~~~~~~~~~
Note that this example did not handle potential errors.
The following diagram shows the process.

More [examples](https://github.com/Tencent/rapidjson/tree/master/example) are available:
* DOM API
* [tutorial](https://github.com/Tencent/rapidjson/blob/master/example/tutorial/tutorial.cpp): Basic usage of DOM API.
* SAX API
* [simplereader](https://github.com/Tencent/rapidjson/blob/master/example/simplereader/simplereader.cpp): Dumps all SAX events while parsing a JSON by `Reader`.
* [condense](https://github.com/Tencent/rapidjson/blob/master/example/condense/condense.cpp): A command line tool to rewrite a JSON, with all whitespaces removed.
* [pretty](https://github.com/Tencent/rapidjson/blob/master/example/pretty/pretty.cpp): A command line tool to rewrite a JSON with indents and newlines by `PrettyWriter`.
* [capitalize](https://github.com/Tencent/rapidjson/blob/master/example/capitalize/capitalize.cpp): A command line tool to capitalize strings in JSON.
* [messagereader](https://github.com/Tencent/rapidjson/blob/master/example/messagereader/messagereader.cpp): Parse a JSON message with SAX API.
* [serialize](https://github.com/Tencent/rapidjson/blob/master/example/serialize/serialize.cpp): Serialize a C++ object into JSON with SAX API.
* [jsonx](https://github.com/Tencent/rapidjson/blob/master/example/jsonx/jsonx.cpp): Implements a `JsonxWriter` which stringify SAX events into [JSONx](https://www-01.ibm.com/support/knowledgecenter/SS9H2Y_7.1.0/com.ibm.dp.doc/json_jsonx.html) (a kind of XML) format. The example is a command line tool which converts input JSON into JSONx format.
* Schema
* [schemavalidator](https://github.com/Tencent/rapidjson/blob/master/example/schemavalidator/schemavalidator.cpp) : A command line tool to validate a JSON with a JSON schema.
* Advanced
* [prettyauto](https://github.com/Tencent/rapidjson/blob/master/example/prettyauto/prettyauto.cpp): A modified version of [pretty](https://github.com/Tencent/rapidjson/blob/master/example/pretty/pretty.cpp) to automatically handle JSON with any UTF encodings.
* [parsebyparts](https://github.com/Tencent/rapidjson/blob/master/example/parsebyparts/parsebyparts.cpp): Implements an `AsyncDocumentParser` which can parse JSON in parts, using C++11 thread.
* [filterkey](https://github.com/Tencent/rapidjson/blob/master/example/filterkey/filterkey.cpp): A command line tool to remove all values with user-specified key.
* [filterkeydom](https://github.com/Tencent/rapidjson/blob/master/example/filterkeydom/filterkeydom.cpp): Same tool as above, but it demonstrates how to use a generator to populate a `Document`.
## Contributing
RapidJSON welcomes contributions. When contributing, please follow the guidelines below.
### Issues
Feel free to submit issues and enhancement requests.
Please help us by providing **minimal reproducible examples**, because source code makes it easier for other people to understand what happens.
For crash problems on certain platforms, please include the stack dump together with details of the OS, compiler, etc.
Please try breakpoint debugging first and tell us what you found, so that we can start exploring from the information you have already prepared.
### Workflow
In general, we follow the "fork-and-pull" Git workflow.
1. **Fork** the repo on GitHub
2. **Clone** the project to your own machine
3. **Checkout** a new branch on your fork, start developing on the branch
4. **Test** the change before committing. Make sure the changes pass all the tests, including `unittest` and `perftest`; please add a test case for each new feature or bug fix if needed.
5. **Commit** changes to your own branch
6. **Push** your work back up to your fork
7. Submit a **Pull request** so that we can review your changes
NOTE: Be sure to merge the latest from "upstream" before making a pull request!
### Copyright and Licensing
You can copy and paste the license summary from below.
```
Tencent is pleased to support the open source community by making RapidJSON available.
Copyright (C) 2015 THL A29 Limited, a Tencent company, and Milo Yip.
Licensed under the MIT License (the "License"); you may not use this file except
in compliance with the License. You may obtain a copy of the License at
http://opensource.org/licenses/MIT
Unless required by applicable law or agreed to in writing, software distributed
under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR
CONDITIONS OF ANY KIND, either express or implied. See the License for the
specific language governing permissions and limitations under the License.
``` | {
"source": "yandex/perforator",
"title": "contrib/python/python-rapidjson/rapidjson/readme.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/python-rapidjson/rapidjson/readme.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 11137
} |
.. image:: https://img.shields.io/pypi/v/setuptools.svg
:target: https://pypi.org/project/setuptools
.. image:: https://img.shields.io/readthedocs/setuptools/latest.svg
:target: https://setuptools.readthedocs.io
.. image:: https://img.shields.io/travis/pypa/setuptools/master.svg?label=Linux%20CI&logo=travis&logoColor=white
:target: https://travis-ci.org/pypa/setuptools
.. image:: https://img.shields.io/appveyor/ci/pypa/setuptools/master.svg?label=Windows%20CI&logo=appveyor&logoColor=white
:target: https://ci.appveyor.com/project/pypa/setuptools/branch/master
.. image:: https://img.shields.io/codecov/c/github/pypa/setuptools/master.svg?logo=codecov&logoColor=white
:target: https://codecov.io/gh/pypa/setuptools
.. image:: https://tidelift.com/badges/github/pypa/setuptools?style=flat
:target: https://tidelift.com/subscription/pkg/pypi-setuptools?utm_source=pypi-setuptools&utm_medium=readme
.. image:: https://img.shields.io/pypi/pyversions/setuptools.svg
See the `Installation Instructions
<https://packaging.python.org/installing/>`_ in the Python Packaging
User's Guide for instructions on installing, upgrading, and uninstalling
Setuptools.
Questions and comments should be directed to the `distutils-sig
mailing list <http://mail.python.org/pipermail/distutils-sig/>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.
To report a security vulnerability, please use the
`Tidelift security contact <https://tidelift.com/security>`_.
Tidelift will coordinate the fix and disclosure.
For Enterprise
==============
Available as part of the Tidelift Subscription.
Setuptools and the maintainers of thousands of other packages are working with Tidelift to deliver one enterprise subscription that covers all of the open source you use.
`Learn more <https://tidelift.com/subscription/pkg/pypi-setuptools?utm_source=pypi-setuptools&utm_medium=referral&utm_campaign=github>`_.
Code of Conduct
===============
Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and mailing lists is expected to follow the
`PyPA Code of Conduct <https://www.pypa.io/en/latest/code-of-conduct/>`_. | {
"source": "yandex/perforator",
"title": "contrib/python/setuptools/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/setuptools/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2240
} |
.. |pypi-version| image:: https://img.shields.io/pypi/v/setuptools.svg
:target: https://pypi.org/project/setuptools
.. |py-version| image:: https://img.shields.io/pypi/pyversions/setuptools.svg
.. |test-badge| image:: https://github.com/pypa/setuptools/actions/workflows/main.yml/badge.svg
:target: https://github.com/pypa/setuptools/actions?query=workflow%3A%22tests%22
:alt: tests
.. |ruff-badge| image:: https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/charliermarsh/ruff/main/assets/badge/v2.json
:target: https://github.com/astral-sh/ruff
:alt: Ruff
.. |docs-badge| image:: https://img.shields.io/readthedocs/setuptools/latest.svg
:target: https://setuptools.pypa.io
.. |skeleton-badge| image:: https://img.shields.io/badge/skeleton-2024-informational
:target: https://blog.jaraco.com/skeleton
.. |codecov-badge| image:: https://img.shields.io/codecov/c/github/pypa/setuptools/master.svg?logo=codecov&logoColor=white
:target: https://codecov.io/gh/pypa/setuptools
.. |tidelift-badge| image:: https://tidelift.com/badges/github/pypa/setuptools?style=flat
:target: https://tidelift.com/subscription/pkg/pypi-setuptools?utm_source=pypi-setuptools&utm_medium=readme
.. |discord-badge| image:: https://img.shields.io/discord/803025117553754132
:target: https://discord.com/channels/803025117553754132/815945031150993468
:alt: Discord
|pypi-version| |py-version| |test-badge| |ruff-badge| |docs-badge| |skeleton-badge| |codecov-badge| |discord-badge|
See the `Quickstart <https://setuptools.pypa.io/en/latest/userguide/quickstart.html>`_
and the `User's Guide <https://setuptools.pypa.io/en/latest/userguide/>`_ for
instructions on how to use Setuptools.
Questions and comments should be directed to `GitHub Discussions
<https://github.com/pypa/setuptools/discussions>`_.
Bug reports and especially tested patches may be
submitted directly to the `bug tracker
<https://github.com/pypa/setuptools/issues>`_.
Code of Conduct
===============
Everyone interacting in the setuptools project's codebases, issue trackers,
chat rooms, and fora is expected to follow the
`PSF Code of Conduct <https://github.com/pypa/.github/blob/main/CODE_OF_CONDUCT.md>`_.
For Enterprise
==============
Available as part of the Tidelift Subscription.
Setuptools and the maintainers of thousands of other packages are working with Tidelift to deliver one enterprise subscription that covers all of the open source you use.
`Learn more <https://tidelift.com/subscription/pkg/pypi-setuptools?utm_source=pypi-setuptools&utm_medium=referral&utm_campaign=github>`_. | {
"source": "yandex/perforator",
"title": "contrib/python/setuptools/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/setuptools/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2607
} |
.. image:: https://img.shields.io/pypi/v/six.svg
:target: https://pypi.org/project/six/
:alt: six on PyPI
.. image:: https://readthedocs.org/projects/six/badge/?version=latest
:target: https://six.readthedocs.io/
:alt: six's documentation on Read the Docs
.. image:: https://img.shields.io/badge/license-MIT-green.svg
:target: https://github.com/benjaminp/six/blob/master/LICENSE
:alt: MIT License badge
Six is a Python 2 and 3 compatibility library. It provides utility functions
for smoothing over the differences between the Python versions with the goal of
writing Python code that is compatible on both Python versions. See the
documentation for more information on what is provided.
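A minimal sketch of what this looks like in practice (illustrative values only):

.. code-block:: python

    import six
    from six.moves.urllib.parse import urlparse  # stdlib rename handled by six

    # text_type is unicode on Python 2 and str on Python 3.
    label = six.text_type(u"hello")
    assert isinstance(label, six.text_type)

    # iteritems() picks the memory-efficient iterator on either version.
    for key, value in six.iteritems({"answer": 42}):
        assert key == "answer" and value == 42

    assert urlparse("https://example.com/x").netloc == "example.com"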
Six supports Python 2.7 and 3.3+. It is contained in only one Python
file, so it can be easily copied into your project. (The copyright and license
notice must be retained.)
Online documentation is at https://six.readthedocs.io/.
Bugs can be reported to https://github.com/benjaminp/six. The code can also
be found there. | {
"source": "yandex/perforator",
"title": "contrib/python/six/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/six/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1038
} |
.. image:: https://img.shields.io/pypi/v/six.svg
:target: https://pypi.org/project/six/
:alt: six on PyPI
.. image:: https://readthedocs.org/projects/six/badge/?version=latest
:target: https://six.readthedocs.io/
:alt: six's documentation on Read the Docs
.. image:: https://img.shields.io/badge/license-MIT-green.svg
:target: https://github.com/benjaminp/six/blob/master/LICENSE
:alt: MIT License badge
Six is a Python 2 and 3 compatibility library. It provides utility functions
for smoothing over the differences between the Python versions with the goal of
writing Python code that is compatible on both Python versions. See the
documentation for more information on what is provided.
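A minimal sketch of typical usage (illustrative values only):

.. code-block:: python

    import six
    from six.moves import zip_longest  # itertools.izip_longest on Python 2

    # Exactly one of these flags is true for the running interpreter.
    assert six.PY2 != six.PY3

    # string_types covers both str and unicode in isinstance checks.
    assert isinstance(u"hello", six.string_types)

    # Renamed stdlib members are importable from one place on both versions.
    pairs = list(zip_longest([1, 2, 3], ["a"], fillvalue=None))
    assert pairs == [(1, "a"), (2, None), (3, None)]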
Six supports Python 2.7 and 3.3+. It is contained in only one Python
file, so it can be easily copied into your project. (The copyright and license
notice must be retained.)
Online documentation is at https://six.readthedocs.io/.
Bugs can be reported to https://github.com/benjaminp/six. The code can also
be found there. | {
"source": "yandex/perforator",
"title": "contrib/python/six/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/six/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1038
} |
Example
=======
::
import sys
from termcolor import colored, cprint
text = colored('Hello, World!', 'red', attrs=['reverse', 'blink'])
print(text)
cprint('Hello, World!', 'green', 'on_red')
print_red_on_cyan = lambda x: cprint(x, 'red', 'on_cyan')
print_red_on_cyan('Hello, World!')
print_red_on_cyan('Hello, Universe!')
for i in range(10):
cprint(i, 'magenta', end=' ')
cprint("Attention!", 'red', attrs=['bold'], file=sys.stderr)
Text Properties
===============
Text colors:
- grey
- red
- green
- yellow
- blue
- magenta
- cyan
- white
Text highlights:
- on_grey
- on_red
- on_green
- on_yellow
- on_blue
- on_magenta
- on_cyan
- on_white
Attributes:
- bold
- dark
- underline
- blink
- reverse
- concealed
Terminal properties
===================
============ ======= ==== ========= ========== ======= =========
Terminal bold dark underline blink reverse concealed
------------ ------- ---- --------- ---------- ------- ---------
xterm yes no yes bold yes yes
linux yes yes bold yes yes no
rxvt yes no yes bold/black yes no
dtterm yes yes yes reverse yes yes
teraterm reverse no yes rev/red yes no
aixterm normal no yes no yes yes
PuTTY color no yes no yes no
Windows no no no no yes no
Cygwin SSH yes no color color color yes
Mac Terminal yes no yes yes yes yes
============ ======= ==== ========= ========== ======= ========= | {
"source": "yandex/perforator",
"title": "contrib/python/termcolor/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/termcolor/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1898
} |
# termcolor
[](https://pypi.org/project/termcolor)
[](https://pypi.org/project/termcolor)
[](https://pypistats.org/packages/termcolor)
[](https://github.com/termcolor/termcolor/actions)
[](https://codecov.io/gh/termcolor/termcolor)
[](COPYING.txt)
[](https://github.com/psf/black)
[](https://tidelift.com/subscription/pkg/pypi-termcolor?utm_source=pypi-termcolor&utm_medium=referral&utm_campaign=readme)
## Installation
### From PyPI
```bash
python3 -m pip install --upgrade termcolor
```
### From source
```bash
git clone https://github.com/termcolor/termcolor
cd termcolor
python3 -m pip install .
```
### Demo
To see demo output, run:
```bash
python3 -m termcolor
```
## Example
```python
import sys
from termcolor import colored, cprint
text = colored("Hello, World!", "red", attrs=["reverse", "blink"])
print(text)
cprint("Hello, World!", "green", "on_red")
print_red_on_cyan = lambda x: cprint(x, "red", "on_cyan")
print_red_on_cyan("Hello, World!")
print_red_on_cyan("Hello, Universe!")
for i in range(10):
cprint(i, "magenta", end=" ")
cprint("Attention!", "red", attrs=["bold"], file=sys.stderr)
```
## Text properties
| Text colors | Text highlights | Attributes |
| --------------- | ------------------ | ----------- |
| `black` | `on_black` | `bold` |
| `red` | `on_red` | `dark` |
| `green` | `on_green` | `underline` |
| `yellow` | `on_yellow` | `blink` |
| `blue` | `on_blue` | `reverse` |
| `magenta` | `on_magenta` | `concealed` |
| `cyan` | `on_cyan` | `strike` |
| `white` | `on_white` | |
| `light_grey` | `on_light_grey` | |
| `dark_grey` | `on_dark_grey` | |
| `light_red` | `on_light_red` | |
| `light_green` | `on_light_green` | |
| `light_yellow` | `on_light_yellow` | |
| `light_blue` | `on_light_blue` | |
| `light_magenta` | `on_light_magenta` | |
| `light_cyan` | `on_light_cyan` | |
## Terminal properties
| Terminal | bold | dark | underline | blink | reverse | concealed |
| ------------ | ------- | ---- | --------- | ---------- | ------- | --------- |
| xterm | yes | no | yes | bold | yes | yes |
| linux | yes | yes | bold | yes | yes | no |
| rxvt | yes | no | yes | bold/black | yes | no |
| dtterm | yes | yes | yes | reverse | yes | yes |
| teraterm | reverse | no | yes | rev/red | yes | no |
| aixterm | normal | no | yes | no | yes | yes |
| PuTTY | color | no | yes | no | yes | no |
| Windows | no | no | no | no | yes | no |
| Cygwin SSH | yes | no | color | color | color | yes |
| Mac Terminal | yes | no | yes | yes | yes | yes |
## Overrides
Terminal colour detection can be disabled or enabled in several ways.
In order of precedence:
1. Calling `colored` or `cprint` with a truthy `no_color` disables colour.
2. Calling `colored` or `cprint` with a truthy `force_color` forces colour.
3. Setting the `ANSI_COLORS_DISABLED` environment variable to any value disables colour.
4. Setting the [`NO_COLOR`](https://no-color.org/) environment variable to any value
disables colour.
5. Setting the [`FORCE_COLOR`](https://force-color.org/) environment variable to any
value forces colour.
6. Setting the `TERM` environment variable to `dumb`, or using such a
[dumb terminal](https://en.wikipedia.org/wiki/Computer_terminal#Character-oriented_terminal),
disables colour.
7. Finally, termcolor will attempt to detect whether the terminal supports colour. | {
"source": "yandex/perforator",
"title": "contrib/python/termcolor/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/termcolor/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4618
} |
****
TOML
****
.. image:: https://img.shields.io/pypi/v/toml
:target: https://pypi.org/project/toml/
.. image:: https://travis-ci.org/uiri/toml.svg?branch=master
:target: https://travis-ci.org/uiri/toml
.. image:: https://img.shields.io/pypi/pyversions/toml.svg
:target: https://pypi.org/project/toml/
A Python library for parsing and creating `TOML <https://en.wikipedia.org/wiki/TOML>`_.
The module passes `the TOML test suite <https://github.com/BurntSushi/toml-test>`_.
See also:
* `The TOML Standard <https://github.com/toml-lang/toml>`_
* `The currently supported TOML specification <https://github.com/toml-lang/toml/blob/v0.5.0/README.md>`_
Installation
============
To install the latest release on `PyPI <https://pypi.org/project/toml/>`_,
simply run:
::
pip install toml
Or to install the latest development version, run:
::
git clone https://github.com/uiri/toml.git
cd toml
python setup.py install
Quick Tutorial
==============
*toml.loads* takes in a string containing standard TOML-formatted data and
returns a dictionary containing the parsed data.
.. code:: pycon
>>> import toml
>>> toml_string = """
... # This is a TOML document.
...
... title = "TOML Example"
...
... [owner]
... name = "Tom Preston-Werner"
... dob = 1979-05-27T07:32:00-08:00 # First class dates
...
... [database]
... server = "192.168.1.1"
... ports = [ 8001, 8001, 8002 ]
... connection_max = 5000
... enabled = true
...
... [servers]
...
... # Indentation (tabs and/or spaces) is allowed but not required
... [servers.alpha]
... ip = "10.0.0.1"
... dc = "eqdc10"
...
... [servers.beta]
... ip = "10.0.0.2"
... dc = "eqdc10"
...
... [clients]
... data = [ ["gamma", "delta"], [1, 2] ]
...
... # Line breaks are OK when inside arrays
... hosts = [
... "alpha",
... "omega"
... ]
... """
>>> parsed_toml = toml.loads(toml_string)
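The result is a plain ``dict`` (or whatever class is passed as ``_dict``), so values are read with ordinary indexing, for example:
.. code:: pycon
    >>> parsed_toml['title']
    'TOML Example'
    >>> parsed_toml['database']['ports']
    [8001, 8001, 8002]
    >>> parsed_toml['servers']['alpha']['ip']
    '10.0.0.1'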
*toml.dumps* takes a dictionary and returns a string containing the
corresponding TOML-formatted data.
.. code:: pycon
>>> new_toml_string = toml.dumps(parsed_toml)
>>> print(new_toml_string)
title = "TOML Example"
[owner]
name = "Tom Preston-Werner"
dob = 1979-05-27T07:32:00Z
[database]
server = "192.168.1.1"
ports = [ 8001, 8001, 8002,]
connection_max = 5000
enabled = true
[clients]
data = [ [ "gamma", "delta",], [ 1, 2,],]
hosts = [ "alpha", "omega",]
[servers.alpha]
ip = "10.0.0.1"
dc = "eqdc10"
[servers.beta]
ip = "10.0.0.2"
dc = "eqdc10"
*toml.dump* takes a dictionary and a file descriptor and returns a string containing the
corresponding TOML-formatted data.
.. code:: pycon
>>> with open('new_toml_file.toml', 'w') as f:
... new_toml_string = toml.dump(parsed_toml, f)
>>> print(new_toml_string)
title = "TOML Example"
[owner]
name = "Tom Preston-Werner"
dob = 1979-05-27T07:32:00Z
[database]
server = "192.168.1.1"
ports = [ 8001, 8001, 8002,]
connection_max = 5000
enabled = true
[clients]
data = [ [ "gamma", "delta",], [ 1, 2,],]
hosts = [ "alpha", "omega",]
[servers.alpha]
ip = "10.0.0.1"
dc = "eqdc10"
[servers.beta]
ip = "10.0.0.2"
dc = "eqdc10"
For more functions, view the API Reference below.
Note
----
For Numpy users, by default the data types ``np.floatX`` will not be translated to floats by toml, but will instead be encoded as strings. To get around this, specify the ``TomlNumpyEncoder`` when saving your data.
.. code:: pycon
>>> import toml
>>> import numpy as np
>>> a = np.arange(0, 10, dtype=np.double)
>>> output = {'a': a}
>>> toml.dumps(output)
'a = [ "0.0", "1.0", "2.0", "3.0", "4.0", "5.0", "6.0", "7.0", "8.0", "9.0",]\n'
>>> toml.dumps(output, encoder=toml.TomlNumpyEncoder())
'a = [ 0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0,]\n'
API Reference
=============
``toml.load(f, _dict=dict)``
Parse a file or a list of files as TOML and return a dictionary.
:Args:
* ``f``: A path to a file, list of filepaths (to be read into single
object) or a file descriptor
* ``_dict``: The class of the dictionary object to be returned
:Returns:
A dictionary (or object ``_dict``) containing parsed TOML data
:Raises:
* ``TypeError``: When ``f`` is an invalid type or is a list containing
invalid types
* ``TomlDecodeError``: When an error occurs while decoding the file(s)
``toml.loads(s, _dict=dict)``
Parse a TOML-formatted string to a dictionary.
:Args:
* ``s``: The TOML-formatted string to be parsed
* ``_dict``: Specifies the class of the returned toml dictionary
:Returns:
A dictionary (or object ``_dict``) containing parsed TOML data
:Raises:
* ``TypeError``: When a non-string object is passed
* ``TomlDecodeError``: When an error occurs while decoding the
TOML-formatted string
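As a small illustrative sketch, a malformed document surfaces as ``TomlDecodeError``:
.. code:: pycon
    >>> try:
    ...     toml.loads('answer = ')
    ... except toml.TomlDecodeError as err:
    ...     print('invalid TOML:', err)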
``toml.dump(o, f, encoder=None)``
Write a dictionary to a file containing TOML-formatted data
:Args:
* ``o``: An object to be converted into TOML
* ``f``: A File descriptor where the TOML-formatted output should be stored
* ``encoder``: An instance of ``TomlEncoder`` (or subclass) for encoding the object. If ``None``, will default to ``TomlEncoder``
:Returns:
A string containing the TOML-formatted data corresponding to object ``o``
:Raises:
* ``TypeError``: When anything other than a file descriptor is passed
``toml.dumps(o, encoder=None)``
Create a TOML-formatted string from an input object
:Args:
* ``o``: An object to be converted into TOML
* ``encoder``: An instance of ``TomlEncoder`` (or subclass) for encoding the object. If ``None``, will default to ``TomlEncoder``
:Returns:
A string containing the TOML-formatted data corresponding to object ``o``
Licensing
=========
This project is released under the terms of the MIT Open Source License. View
*LICENSE.txt* for more information. | {
"source": "yandex/perforator",
"title": "contrib/python/toml/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/toml/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5930
} |
****
TOML
****
.. image:: https://img.shields.io/pypi/v/toml
:target: https://pypi.org/project/toml/
.. image:: https://travis-ci.org/uiri/toml.svg?branch=master
:target: https://travis-ci.org/uiri/toml
.. image:: https://img.shields.io/pypi/pyversions/toml.svg
:target: https://pypi.org/project/toml/
A Python library for parsing and creating `TOML <https://en.wikipedia.org/wiki/TOML>`_.
The module passes `the TOML test suite <https://github.com/BurntSushi/toml-test>`_.
See also:
* `The TOML Standard <https://github.com/toml-lang/toml>`_
* `The currently supported TOML specification <https://github.com/toml-lang/toml/blob/v0.5.0/README.md>`_
Installation
============
To install the latest release on `PyPI <https://pypi.org/project/toml/>`_,
simply run:
::
pip install toml
Or to install the latest development version, run:
::
git clone https://github.com/uiri/toml.git
cd toml
python setup.py install
Quick Tutorial
==============
*toml.loads* takes in a string containing standard TOML-formatted data and
returns a dictionary containing the parsed data.
.. code:: pycon
>>> import toml
>>> toml_string = """
... # This is a TOML document.
...
... title = "TOML Example"
...
... [owner]
... name = "Tom Preston-Werner"
... dob = 1979-05-27T07:32:00-08:00 # First class dates
...
... [database]
... server = "192.168.1.1"
... ports = [ 8001, 8001, 8002 ]
... connection_max = 5000
... enabled = true
...
... [servers]
...
... # Indentation (tabs and/or spaces) is allowed but not required
... [servers.alpha]
... ip = "10.0.0.1"
... dc = "eqdc10"
...
... [servers.beta]
... ip = "10.0.0.2"
... dc = "eqdc10"
...
... [clients]
... data = [ ["gamma", "delta"], [1, 2] ]
...
... # Line breaks are OK when inside arrays
... hosts = [
... "alpha",
... "omega"
... ]
... """
>>> parsed_toml = toml.loads(toml_string)
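Parsed values come back as ordinary Python types (strings, integers, booleans, lists and nested dicts), so they can be read directly, for example:
.. code:: pycon
    >>> parsed_toml['database']['connection_max']
    5000
    >>> parsed_toml['database']['enabled']
    True
    >>> parsed_toml['clients']['hosts']
    ['alpha', 'omega']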
*toml.dumps* takes a dictionary and returns a string containing the
corresponding TOML-formatted data.
.. code:: pycon
>>> new_toml_string = toml.dumps(parsed_toml)
>>> print(new_toml_string)
title = "TOML Example"
[owner]
name = "Tom Preston-Werner"
dob = 1979-05-27T07:32:00Z
[database]
server = "192.168.1.1"
ports = [ 8001, 8001, 8002,]
connection_max = 5000
enabled = true
[clients]
data = [ [ "gamma", "delta",], [ 1, 2,],]
hosts = [ "alpha", "omega",]
[servers.alpha]
ip = "10.0.0.1"
dc = "eqdc10"
[servers.beta]
ip = "10.0.0.2"
dc = "eqdc10"
*toml.dump* takes a dictionary and a file descriptor and returns a string containing the
corresponding TOML-formatted data.
.. code:: pycon
>>> with open('new_toml_file.toml', 'w') as f:
... new_toml_string = toml.dump(parsed_toml, f)
>>> print(new_toml_string)
title = "TOML Example"
[owner]
name = "Tom Preston-Werner"
dob = 1979-05-27T07:32:00Z
[database]
server = "192.168.1.1"
ports = [ 8001, 8001, 8002,]
connection_max = 5000
enabled = true
[clients]
data = [ [ "gamma", "delta",], [ 1, 2,],]
hosts = [ "alpha", "omega",]
[servers.alpha]
ip = "10.0.0.1"
dc = "eqdc10"
[servers.beta]
ip = "10.0.0.2"
dc = "eqdc10"
For more functions, view the API Reference below.
Note
----
For Numpy users, by default the data types ``np.floatX`` will not be translated to floats by toml, but will instead be encoded as strings. To get around this, specify the ``TomlNumpyEncoder`` when saving your data.
.. code:: pycon
>>> import toml
>>> import numpy as np
>>> a = np.arange(0, 10, dtype=np.double)
>>> output = {'a': a}
>>> toml.dumps(output)
'a = [ "0.0", "1.0", "2.0", "3.0", "4.0", "5.0", "6.0", "7.0", "8.0", "9.0",]\n'
>>> toml.dumps(output, encoder=toml.TomlNumpyEncoder())
'a = [ 0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0,]\n'
API Reference
=============
``toml.load(f, _dict=dict)``
Parse a file or a list of files as TOML and return a dictionary.
:Args:
* ``f``: A path to a file, list of filepaths (to be read into single
object) or a file descriptor
* ``_dict``: The class of the dictionary object to be returned
:Returns:
A dictionary (or object ``_dict``) containing parsed TOML data
:Raises:
* ``TypeError``: When ``f`` is an invalid type or is a list containing
invalid types
* ``TomlDecodeError``: When an error occurs while decoding the file(s)
``toml.loads(s, _dict=dict)``
Parse a TOML-formatted string to a dictionary.
:Args:
* ``s``: The TOML-formatted string to be parsed
* ``_dict``: Specifies the class of the returned toml dictionary
:Returns:
A dictionary (or object ``_dict``) containing parsed TOML data
:Raises:
* ``TypeError``: When a non-string object is passed
* ``TomlDecodeError``: When an error occurs while decoding the
TOML-formatted string
``toml.dump(o, f, encoder=None)``
Write a dictionary to a file containing TOML-formatted data
:Args:
* ``o``: An object to be converted into TOML
* ``f``: A File descriptor where the TOML-formatted output should be stored
* ``encoder``: An instance of ``TomlEncoder`` (or subclass) for encoding the object. If ``None``, will default to ``TomlEncoder``
:Returns:
A string containing the TOML-formatted data corresponding to object ``o``
:Raises:
* ``TypeError``: When anything other than a file descriptor is passed
``toml.dumps(o, encoder=None)``
Create a TOML-formatted string from an input object
:Args:
* ``o``: An object to be converted into TOML
* ``encoder``: An instance of ``TomlEncoder`` (or subclass) for encoding the object. If ``None``, will default to ``TomlEncoder``
:Returns:
A string containing the TOML-formatted data corresponding to object ``o``
Licensing
=========
This project is released under the terms of the MIT Open Source License. View
*LICENSE.txt* for more information. | {
"source": "yandex/perforator",
"title": "contrib/python/toml/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/toml/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5930
} |
# Licensing terms
Traitlets is adapted from enthought.traits, Copyright (c) Enthought, Inc.,
under the terms of the Modified BSD License.
This project is licensed under the terms of the Modified BSD License
(also known as New or Revised or 3-Clause BSD), as follows:
- Copyright (c) 2001-, IPython Development Team
All rights reserved.
Redistribution and use in source and binary forms, with or without
modification, are permitted provided that the following conditions are met:
Redistributions of source code must retain the above copyright notice, this
list of conditions and the following disclaimer.
Redistributions in binary form must reproduce the above copyright notice, this
list of conditions and the following disclaimer in the documentation and/or
other materials provided with the distribution.
Neither the name of the IPython Development Team nor the names of its
contributors may be used to endorse or promote products derived from this
software without specific prior written permission.
THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS" AND
ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED
WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT OWNER OR CONTRIBUTORS BE LIABLE
FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
## About the IPython Development Team
The IPython Development Team is the set of all contributors to the IPython project.
This includes all of the IPython subprojects.
The core team that coordinates development on GitHub can be found here:
https://github.com/jupyter/.
## Our Copyright Policy
IPython uses a shared copyright model. Each contributor maintains copyright
over their contributions to IPython. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the IPython
source code, in its entirety is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire IPython
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the IPython repositories.
With this in mind, the following banner should be used in any source code file
to indicate the copyright and license terms:
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License. | {
"source": "yandex/perforator",
"title": "contrib/python/traitlets/py2/COPYING.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/traitlets/py2/COPYING.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2956
} |
# Traitlets
[](https://travis-ci.org/ipython/traitlets)
[](http://traitlets.readthedocs.org/en/latest/?badge=latest)
Traitlets is a pure Python library enabling:
- the enforcement of strong typing for attributes of Python objects
(typed attributes are called "traits"),
- notifications on changes of trait attributes,
- automatic validation and coercion of trait attributes when attempting a
change.
Its implementation relies on the [descriptor](https://docs.python.org/howto/descriptor.html)
pattern.
Traitlets powers the configuration system of IPython and Jupyter
and the declarative API of IPython interactive widgets.
## Installation
For a local installation, make sure you have
[pip installed](https://pip.pypa.io/en/stable/installing/) and run:
```bash
pip install traitlets
```
For a **development installation**, clone this repository, change into the
`traitlets` root directory, and run pip:
```bash
git clone https://github.com/ipython/traitlets.git
cd traitlets
pip install -e .
```
## Running the tests
```bash
pip install "traitlets[test]"
py.test traitlets
```
## Usage
Any class with trait attributes must inherit from `HasTraits`.
For the list of available trait types and their properties, see the
[Trait Types](https://traitlets.readthedocs.io/en/latest/trait_types.html)
section of the documentation.
### Dynamic default values
To calculate a default value dynamically, decorate a method of your class with
`@default({traitname})`. This method will be called on the instance, and
should return the default value. In this example, the `_username_default`
method is decorated with `@default('username')`:
```Python
import getpass
from traitlets import HasTraits, Unicode, default
class Identity(HasTraits):
username = Unicode()
@default('username')
def _username_default(self):
return getpass.getuser()
```
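Used like this, the default is computed lazily on first access (an illustrative call; the actual value depends on the current login):
```Python
ident = Identity()
print(ident.username)  # e.g. 'alice' -- whatever getpass.getuser() returns
```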
### Callbacks when a trait attribute changes
When a trait changes, an application can follow this trait change with
additional actions.
To do something when a trait attribute is changed, decorate a method with
[`traitlets.observe()`](https://traitlets.readthedocs.io/en/latest/api.html?highlight=observe#traitlets.observe).
The method will be called with a single argument, a dictionary which contains
an owner, new value, old value, name of the changed trait, and the event type.
In this example, the `_num_changed` method is decorated with `@observe('num')`:
```Python
from traitlets import HasTraits, Integer, observe
class TraitletsExample(HasTraits):
num = Integer(5, help="a number").tag(config=True)
@observe('num')
def _num_changed(self, change):
print("{name} changed from {old} to {new}".format(**change))
```
and is passed the following dictionary when called:
```Python
{
'owner': object, # The HasTraits instance
'new': 6, # The new value
'old': 5, # The old value
'name': "foo", # The name of the changed trait
'type': 'change', # The event type of the notification, usually 'change'
}
```
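Assigning a new value triggers the observer (a minimal sketch using the class above):
```Python
obj = TraitletsExample()
obj.num = 6  # prints: num changed from 5 to 6
```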
### Validation and coercion
Each trait type (`Int`, `Unicode`, `Dict` etc.) may have its own validation or
coercion logic. In addition, we can register custom cross-validators
that may depend on the state of other attributes. For example:
```Python
from traitlets import HasTraits, TraitError, Int, Bool, validate
class Parity(HasTraits):
value = Int()
parity = Int()
@validate('value')
def _valid_value(self, proposal):
if proposal['value'] % 2 != self.parity:
raise TraitError('value and parity should be consistent')
return proposal['value']
@validate('parity')
def _valid_parity(self, proposal):
parity = proposal['value']
if parity not in [0, 1]:
raise TraitError('parity should be 0 or 1')
if self.value % 2 != parity:
raise TraitError('value and parity should be consistent')
return proposal['value']
parity_check = Parity(value=2)
# Changing required parity and value together while holding cross validation
with parity_check.hold_trait_notifications():
parity_check.value = 1
parity_check.parity = 1
```
```
However, we **recommend** that custom cross-validators don't modify the state
of the HasTraits instance. | {
"source": "yandex/perforator",
"title": "contrib/python/traitlets/py2/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/traitlets/py2/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4446
} |
# Traitlets
[](https://github.com/ipython/traitlets/actions/workflows/tests.yml)
[](https://traitlets.readthedocs.io/en/latest/?badge=latest)
[](https://tidelift.com/badges/package/pypi/traitlets)
| | |
| ------------- | ------------------------------------ |
| **home** | https://github.com/ipython/traitlets |
| **pypi-repo** | https://pypi.org/project/traitlets/ |
| **docs** | https://traitlets.readthedocs.io/ |
| **license** | Modified BSD License |
Traitlets is a pure Python library enabling:
- the enforcement of strong typing for attributes of Python objects
(typed attributes are called _"traits"_);
- dynamically calculated default values;
- automatic validation and coercion of trait attributes when attempting a
change;
- registering for receiving notifications when trait values change;
- reading configuration values from files or from command line
arguments - a distinct layer on top of traitlets, so you may use
traitlets without the configuration machinery.
Its implementation relies on the [descriptor](https://docs.python.org/howto/descriptor.html)
pattern, and it is a lightweight pure-Python alternative to the
[_traits_ library](https://docs.enthought.com/traits/).
Traitlets powers the configuration system of IPython and Jupyter
and the declarative API of IPython interactive widgets.
## Installation
For a local installation, make sure you have
[pip installed](https://pip.pypa.io/en/stable/installing/) and run:
```bash
pip install traitlets
```
For a **development installation**, clone this repository, change into the
`traitlets` root directory, and run pip:
```bash
git clone https://github.com/ipython/traitlets.git
cd traitlets
pip install -e .
```
## Running the tests
```bash
pip install "traitlets[test]"
py.test traitlets
```
## Code Styling
`traitlets` has adopted automatic code formatting so you shouldn't
need to worry too much about your code style.
As long as your code is valid,
the pre-commit hook should take care of how it should look.
To install `pre-commit` locally, run the following:
```
pip install pre-commit
pre-commit install
```
You can invoke the pre-commit hook by hand at any time with:
```
pre-commit run
```
which should run any autoformatting on your code
and tell you about any errors it couldn't fix automatically.
You may also install [black integration](https://github.com/psf/black#editor-integration)
into your text editor to format code automatically.
If you have already committed files before setting up the pre-commit
hook with `pre-commit install`, you can fix everything up using
`pre-commit run --all-files`. You need to make the fixing commit
yourself after that.
Some of the hooks only run on CI by default, but you can invoke them by
running with the `--hook-stage manual` argument.
## Usage
Any class with trait attributes must inherit from `HasTraits`.
For the list of available trait types and their properties, see the
[Trait Types](https://traitlets.readthedocs.io/en/latest/trait_types.html)
section of the documentation.
### Dynamic default values
To calculate a default value dynamically, decorate a method of your class with
`@default({traitname})`. This method will be called on the instance, and
should return the default value. In this example, the `_username_default`
method is decorated with `@default('username')`:
```Python
import getpass
from traitlets import HasTraits, Unicode, default
class Identity(HasTraits):
username = Unicode()
@default('username')
def _username_default(self):
return getpass.getuser()
```
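The default is only computed when the attribute is first read (a small sketch; the printed name depends on the environment):
```Python
ident = Identity()
print(ident.username)  # whatever getpass.getuser() returns on this machine
```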
### Callbacks when a trait attribute changes
When a trait changes, an application can follow this trait change with
additional actions.
To do something when a trait attribute is changed, decorate a method with
[`traitlets.observe()`](https://traitlets.readthedocs.io/en/latest/api.html?highlight=observe#traitlets.observe).
The method will be called with a single argument, a dictionary which contains
an owner, new value, old value, name of the changed trait, and the event type.
In this example, the `_num_changed` method is decorated with `@observe('num')`:
```Python
from traitlets import HasTraits, Integer, observe
class TraitletsExample(HasTraits):
num = Integer(5, help="a number").tag(config=True)
@observe('num')
def _num_changed(self, change):
print("{name} changed from {old} to {new}".format(**change))
```
and is passed the following dictionary when called:
```Python
{
'owner': object, # The HasTraits instance
'new': 6, # The new value
'old': 5, # The old value
'name': "foo", # The name of the changed trait
'type': 'change', # The event type of the notification, usually 'change'
}
```
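A quick sketch of the callback firing when the trait is assigned:
```Python
example = TraitletsExample()
example.num = 6  # prints: num changed from 5 to 6
```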
### Validation and coercion
Each trait type (`Int`, `Unicode`, `Dict` etc.) may have its own validation or
coercion logic. In addition, we can register custom cross-validators
that may depend on the state of other attributes. For example:
```Python
from traitlets import HasTraits, TraitError, Int, Bool, validate
class Parity(HasTraits):
value = Int()
parity = Int()
@validate('value')
def _valid_value(self, proposal):
if proposal['value'] % 2 != self.parity:
raise TraitError('value and parity should be consistent')
return proposal['value']
@validate('parity')
def _valid_parity(self, proposal):
parity = proposal['value']
if parity not in [0, 1]:
raise TraitError('parity should be 0 or 1')
if self.value % 2 != parity:
raise TraitError('value and parity should be consistent')
return proposal['value']
parity_check = Parity(value=2)
# Changing required parity and value together while holding cross validation
with parity_check.hold_trait_notifications():
parity_check.value = 1
parity_check.parity = 1
```
However, we **recommend** that custom cross-validators don't modify the state
of the HasTraits instance.
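As a brief sketch of the cross-validator above in action, an inconsistent assignment is rejected (this reuses the `Parity` class defined above):
```Python
p = Parity()        # defaults: value=0, parity=0 -- consistent
try:
    p.value = 3     # odd value while parity is 0
except TraitError as err:
    print(err)      # value and parity should be consistent
```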
## About the IPython Development Team
The IPython Development Team is the set of all contributors to the IPython project.
This includes all of the IPython subprojects.
The core team that coordinates development on GitHub can be found here:
https://github.com/jupyter/.
## Our Copyright Policy
IPython uses a shared copyright model. Each contributor maintains copyright
over their contributions to IPython. But, it is important to note that these
contributions are typically only changes to the repositories. Thus, the IPython
source code, in its entirety is not the copyright of any single person or
institution. Instead, it is the collective copyright of the entire IPython
Development Team. If individual contributors want to maintain a record of what
changes/contributions they have specific copyright on, they should indicate
their copyright in the commit message of the change, when they commit the
change to one of the IPython repositories.
With this in mind, the following banner should be used in any source code file
to indicate the copyright and license terms:
```
# Copyright (c) IPython Development Team.
# Distributed under the terms of the Modified BSD License.
``` | {
"source": "yandex/perforator",
"title": "contrib/python/traitlets/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/traitlets/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 7485
} |
=================
Typing Extensions
=================
.. image:: https://badges.gitter.im/python/typing.svg
:alt: Chat at https://gitter.im/python/typing
:target: https://gitter.im/python/typing?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge&utm_content=badge
Overview
========
The ``typing`` module was added to the standard library in Python 3.5 on
a provisional basis and will no longer be provisional in Python 3.7. However,
this means users of Python 3.5 - 3.6 who are unable to upgrade will not be
able to take advantage of new types added to the ``typing`` module, such as
``typing.Text`` or ``typing.Coroutine``.
The ``typing_extensions`` module contains both backports of these changes
as well as experimental types that will eventually be added to the ``typing``
module, such as ``Protocol`` (see PEP 544 for details about protocols and
static duck typing) or ``TypedDict`` (see PEP 589).
Users of other Python versions should continue to install and use
the ``typing`` module from PyPI instead of using this one unless
specifically writing code that must be compatible with multiple Python
versions or requires experimental types.
Included items
==============
This module currently contains the following:
All Python versions:
--------------------
- ``ClassVar``
- ``ContextManager``
- ``Counter``
- ``DefaultDict``
- ``Deque``
- ``final``
- ``Final``
- ``Literal``
- ``NewType``
- ``NoReturn``
- ``overload`` (note that older versions of ``typing`` only let you use ``overload`` in stubs)
- ``OrderedDict``
- ``Protocol`` (except on Python 3.5.0)
- ``runtime_checkable`` (except on Python 3.5.0)
- ``Text``
- ``Type``
- ``TypedDict``
- ``TypeAlias``
- ``TYPE_CHECKING``
Python 3.4+ only:
-----------------
- ``ChainMap``
- ``ParamSpec``
- ``Concatenate``
- ``ParamSpecArgs``
- ``ParamSpecKwargs``
- ``TypeGuard``
Python 3.5+ only:
-----------------
- ``Annotated`` (except on Python 3.5.0-3.5.2)
- ``AsyncIterable``
- ``AsyncIterator``
- ``AsyncContextManager``
- ``Awaitable``
- ``Coroutine``
Python 3.6+ only:
-----------------
- ``AsyncGenerator``
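For example, on an older interpreter these names are imported from ``typing_extensions`` exactly as they would later be imported from ``typing`` (a short sketch using type comments for 2/3 compatibility):
.. code:: python
    from typing_extensions import Final, Literal, TypedDict
    MAX_RETRIES = 3  # type: Final[int]
    Movie = TypedDict('Movie', {'name': str, 'year': int})
    def open_mode(mode):
        # type: (Literal["r", "w"]) -> None
        pass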
Other Notes and Limitations
===========================
There are a few types whose interface was modified between different
versions of typing. For example, ``typing.Sequence`` was modified to
subclass ``typing.Reversible`` as of Python 3.5.3.
These changes are *not* backported to prevent subtle compatibility
issues when mixing the differing implementations of modified classes.
Certain types have incorrect runtime behavior due to limitations of older
versions of the typing module. For example, ``ParamSpec`` and ``Concatenate``
will not work with ``get_args``, ``get_origin``. Certain PEP 612 special cases
in user-defined ``Generic``\ s are also not available.
These types are only guaranteed to work for static type checking.
Running tests
=============
To run tests, navigate into the appropriate source directory and run
``test_typing_extensions.py``. You will also need to install the latest
version of ``typing`` if you are using a version of Python that does not
include ``typing`` as a part of the standard library. | {
"source": "yandex/perforator",
"title": "contrib/python/typing-extensions/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/typing-extensions/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 3126
} |
# Typing Extensions
[](https://gitter.im/python/typing)
[Documentation](https://typing-extensions.readthedocs.io/en/latest/#) –
[PyPI](https://pypi.org/project/typing-extensions/)
## Overview
The `typing_extensions` module serves two related purposes:
- Enable use of new type system features on older Python versions. For example,
`typing.TypeGuard` is new in Python 3.10, but `typing_extensions` allows
users on previous Python versions to use it too.
- Enable experimentation with new type system PEPs before they are accepted and
added to the `typing` module.
`typing_extensions` is treated specially by static type checkers such as
mypy and pyright. Objects defined in `typing_extensions` are treated the same
way as equivalent forms in `typing`.
`typing_extensions` uses
[Semantic Versioning](https://semver.org/). The
major version will be incremented only for backwards-incompatible changes.
Therefore, it's safe to depend
on `typing_extensions` like this: `typing_extensions >=x.y, <(x+1)`,
where `x.y` is the first version that includes all features you need.
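A minimal sketch of the first point: `TypeGuard` imported from `typing_extensions` behaves like `typing.TypeGuard` on Python 3.10+, but also works on earlier interpreters:
```python
from typing import List
from typing_extensions import TypeGuard
def is_str_list(values: List[object]) -> TypeGuard[List[str]]:
    """Narrow a list of objects to a list of strings for the type checker."""
    return all(isinstance(v, str) for v in values)
```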
## Included items
See [the documentation](https://typing-extensions.readthedocs.io/en/latest/#) for a
complete listing of module contents.
## Contributing
See [CONTRIBUTING.md](https://github.com/python/typing_extensions/blob/main/CONTRIBUTING.md)
for how to contribute to `typing_extensions`. | {
"source": "yandex/perforator",
"title": "contrib/python/typing-extensions/py3/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/typing-extensions/py3/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1466
} |
|pypi_downloads| |codecov| |license|
============
Introduction
============
This library is mainly for CLI programs that carefully produce output for
Terminals, or that pretend to be a terminal emulator.
**Problem Statement**: The printable length of *most* strings is equal to the
number of cells they occupy on the screen (``1 character : 1 cell``). However,
there are categories of characters that *occupy 2 cells* (full-wide), and
others that *occupy 0* cells (zero-width).
**Solution**: POSIX.1-2001 and POSIX.1-2008 conforming systems provide
`wcwidth(3)`_ and `wcswidth(3)`_ C functions, which this Python module's
functions precisely copy. *These functions return the number of cells a
unicode string is expected to occupy.*
Installation
------------
The stable version of this package is maintained on PyPI; install it using pip::
pip install wcwidth
Example
-------
**Problem**: given the following phrase (Japanese),
>>> text = u'コンニチハ'
Python **incorrectly** uses the *string length* of 5 codepoints rather than the
*printable length* of 10 cells, so that when using the `rjust` function, the
output length is wrong::
>>> print(len('コンニチハ'))
5
>>> print('コンニチハ'.rjust(20, '_'))
_______________コンニチハ
By defining our own "rjust" function that uses wcwidth, we can correct this::
>>> def wc_rjust(text, length, padding=' '):
... from wcwidth import wcswidth
... return padding * max(0, (length - wcswidth(text))) + text
...
Our **Solution** uses wcswidth to determine the string length correctly::
>>> from wcwidth import wcswidth
>>> print(wcswidth('コンニチハ'))
10
>>> print(wc_rjust('コンニチハ', 20, '_'))
__________コンニチハ
Choosing a Version
------------------
Export an environment variable, ``UNICODE_VERSION``. This should be done by
*terminal emulators* or those developers experimenting with authoring one of
their own, from shell::
$ export UNICODE_VERSION=13.0
If unspecified, the latest version is used. If your Terminal Emulator does not
export this variable, you can use the `jquast/ucs-detect`_ utility to
automatically detect and export it to your shell.
wcwidth, wcswidth
-----------------
Use function ``wcwidth()`` to determine the length of a *single unicode
character*, and ``wcswidth()`` to determine the length of many, a *string
of unicode characters*.
Briefly, return values of function ``wcwidth()`` are:
``-1``
Indeterminate (not printable).
``0``
Does not advance the cursor, such as NULL or Combining.
``2``
Characters of category East Asian Wide (W) or East Asian
Full-width (F) which are displayed using two terminal cells.
``1``
All others.
Function ``wcswidth()`` simply returns the sum of all values for each character
along a string, or ``-1`` when it occurs anywhere along a string.
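A few concrete values, as a quick illustrative session::
    >>> from wcwidth import wcwidth, wcswidth
    >>> wcwidth(u'コ')        # East Asian Wide: two cells
    2
    >>> wcwidth(u'\u0301')    # combining acute accent: zero cells
    0
    >>> wcswidth(u'abc')
    3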
Full API Documentation at https://wcwidth.readthedocs.org
==========
Developing
==========
Install wcwidth in editable mode::
pip install -e .
Execute unit tests using tox_::
tox -e py27,py35,py36,py37,py38,py39,py310,py311,py312
Updating Unicode Version
------------------------
Regenerate python code tables from latest Unicode Specification data files::
tox -e update
The script is located at ``bin/update-tables.py`` and requires Python 3.9 or
later. It is recommended but not necessary to run this script with the newest
Python, because the newest Python has the latest ``unicodedata`` for generating
comments.
Building Documentation
----------------------
This project is using `sphinx`_ 4.5 to build documentation::
tox -e sphinx
The output will be in ``docs/_build/html/``.
Updating Requirements
---------------------
This project is using `pip-tools`_ to manage requirements.
To upgrade requirements for updating unicode version, run::
tox -e update_requirements_update
To upgrade requirements for testing, run::
tox -e update_requirements37,update_requirements39
To upgrade requirements for building documentation, run::
tox -e update_requirements_docs
Utilities
---------
Supplementary tools for browsing and testing terminals for wide unicode
characters are found in the `bin/`_ of this project's source code. Just ensure
to first ``pip install -r requirements-develop.txt`` from this projects main
folder. For example, an interactive browser for testing::
python ./bin/wcwidth-browser.py
====
Uses
====
This library is used in:
- `jquast/blessed`_: a thin, practical wrapper around terminal capabilities in
Python.
- `prompt-toolkit/python-prompt-toolkit`_: a Library for building powerful
interactive command lines in Python.
- `dbcli/pgcli`_: Postgres CLI with autocompletion and syntax highlighting.
- `thomasballinger/curtsies`_: a Curses-like terminal wrapper with a display
based on compositing 2d arrays of text.
- `selectel/pyte`_: Simple VTXXX-compatible linux terminal emulator.
- `astanin/python-tabulate`_: Pretty-print tabular data in Python, a library
and a command-line utility.
- `rspeer/python-ftfy`_: Fixes mojibake and other glitches in Unicode
text.
- `nbedos/termtosvg`_: Terminal recorder that renders sessions as SVG
animations.
- `peterbrittain/asciimatics`_: Package to help people create full-screen text
UIs.
- `python-cmd2/cmd2`_: A tool for building interactive command line apps
- `stratis-storage/stratis-cli`_: CLI for the Stratis project
- `ihabunek/toot`_: A Mastodon CLI/TUI client
- `saulpw/visidata`_: Terminal spreadsheet multitool for discovering and
arranging data
===============
Other Languages
===============
- `timoxley/wcwidth`_: JavaScript
- `janlelis/unicode-display_width`_: Ruby
- `alecrabbit/php-wcwidth`_: PHP
- `Text::CharWidth`_: Perl
- `bluebear94/Terminal-WCWidth`_: Perl 6
- `mattn/go-runewidth`_: Go
- `grepsuzette/wcwidth`_: Haxe
- `aperezdc/lua-wcwidth`_: Lua
- `joachimschmidt557/zig-wcwidth`_: Zig
- `fumiyas/wcwidth-cjk`_: `LD_PRELOAD` override
- `joshuarubin/wcwidth9`_: Unicode version 9 in C
=======
History
=======
0.2.13 *2024-01-06*
* **Bugfix** zero-width support for Hangul Jamo (Korean)
0.2.12 *2023-11-21*
* re-release to remove .pyi file misplaced in wheel files `Issue #101`_.
0.2.11 *2023-11-20*
* Include tests files in the source distribution (`PR #98`_, `PR #100`_).
0.2.10 *2023-11-13*
* **Bugfix** accounting of some kinds of emoji sequences using U+FE0F
Variation Selector 16 (`PR #97`_).
* **Updated** `Specification <Specification_from_pypi_>`_.
0.2.9 *2023-10-30*
* **Bugfix** zero-width characters used in Emoji ZWJ sequences, Balinese,
Jamo, Devanagari, Tamil, Kannada and others (`PR #91`_).
* **Updated** to include `Specification <Specification_from_pypi_>`_ of
character measurements.
0.2.8 *2023-09-30*
* Include requirements files in the source distribution (`PR #82`_).
0.2.7 *2023-09-28*
* **Updated** tables to include Unicode Specification 15.1.0.
* Include ``bin``, ``docs``, and ``tox.ini`` in the source distribution
0.2.6 *2023-01-14*
* **Updated** tables to include Unicode Specification 14.0.0 and 15.0.0.
* **Changed** developer tools to use pip-compile, and to use jinja2 templates
for code generation in `bin/update-tables.py` to prepare for possible
compiler optimization release.
0.2.1 .. 0.2.5 *2020-06-23*
* **Repository** changes to update tests and packaging issues, and
begin tagging repository with matching release versions.
0.2.0 *2020-06-01*
* **Enhancement**: Unicode version may be selected by exporting the
Environment variable ``UNICODE_VERSION``, such as ``13.0``, or ``6.3.0``.
See the `jquast/ucs-detect`_ CLI utility for automatic detection.
* **Enhancement**:
API Documentation is published to readthedocs.org.
* **Updated** tables for *all* Unicode Specifications with files
published in a programmatically consumable format, versions 4.1.0
through 13.0
0.1.9 *2020-03-22*
* **Performance** optimization by `Avram Lubkin`_, `PR #35`_.
* **Updated** tables to Unicode Specification 13.0.0.
0.1.8 *2020-01-01*
* **Updated** tables to Unicode Specification 12.0.0. (`PR #30`_).
0.1.7 *2016-07-01*
* **Updated** tables to Unicode Specification 9.0.0. (`PR #18`_).
0.1.6 *2016-01-08 Production/Stable*
* ``LICENSE`` file now included with distribution.
0.1.5 *2015-09-13 Alpha*
* **Bugfix**:
Resolution of "combining_ character width" issue, most especially
those that previously returned -1 now often (correctly) return 0.
resolved by `Philip Craig`_ via `PR #11`_.
* **Deprecated**:
The module path ``wcwidth.table_comb`` is no longer available,
it has been superseded by module path ``wcwidth.table_zero``.
0.1.4 *2014-11-20 Pre-Alpha*
* **Feature**: ``wcswidth()`` now determines printable length
for (most) combining_ characters. The developer's tool
`bin/wcwidth-browser.py`_ is improved to display combining_
characters when provided the ``--combining`` option
(`Thomas Ballinger`_ and `Leta Montopoli`_ `PR #5`_).
* **Feature**: added static analysis (prospector_) to testing
framework.
0.1.3 *2014-10-29 Pre-Alpha*
* **Bugfix**: 2nd parameter of wcswidth was not honored.
(`Thomas Ballinger`_, `PR #4`_).
0.1.2 *2014-10-28 Pre-Alpha*
* **Updated** tables to Unicode Specification 7.0.0.
(`Thomas Ballinger`_, `PR #3`_).
0.1.1 *2014-05-14 Pre-Alpha*
* Initial release to pypi, Based on Unicode Specification 6.3.0
This code was originally derived directly from C code of the same name,
whose latest version is available at
https://www.cl.cam.ac.uk/~mgk25/ucs/wcwidth.c::
* Markus Kuhn -- 2007-05-26 (Unicode 5.0)
*
* Permission to use, copy, modify, and distribute this software
* for any purpose and without fee is hereby granted. The author
* disclaims all warranties with regard to this software.
.. _`Specification_from_pypi`: https://wcwidth.readthedocs.io/en/latest/specs.html
.. _`tox`: https://tox.wiki/en/latest/
.. _`prospector`: https://github.com/landscapeio/prospector
.. _`combining`: https://en.wikipedia.org/wiki/Combining_character
.. _`bin/`: https://github.com/jquast/wcwidth/tree/master/bin
.. _`bin/wcwidth-browser.py`: https://github.com/jquast/wcwidth/blob/master/bin/wcwidth-browser.py
.. _`Thomas Ballinger`: https://github.com/thomasballinger
.. _`Leta Montopoli`: https://github.com/lmontopo
.. _`Philip Craig`: https://github.com/philipc
.. _`PR #3`: https://github.com/jquast/wcwidth/pull/3
.. _`PR #4`: https://github.com/jquast/wcwidth/pull/4
.. _`PR #5`: https://github.com/jquast/wcwidth/pull/5
.. _`PR #11`: https://github.com/jquast/wcwidth/pull/11
.. _`PR #18`: https://github.com/jquast/wcwidth/pull/18
.. _`PR #30`: https://github.com/jquast/wcwidth/pull/30
.. _`PR #35`: https://github.com/jquast/wcwidth/pull/35
.. _`PR #82`: https://github.com/jquast/wcwidth/pull/82
.. _`PR #91`: https://github.com/jquast/wcwidth/pull/91
.. _`PR #97`: https://github.com/jquast/wcwidth/pull/97
.. _`PR #98`: https://github.com/jquast/wcwidth/pull/98
.. _`PR #100`: https://github.com/jquast/wcwidth/pull/100
.. _`Issue #101`: https://github.com/jquast/wcwidth/issues/101
.. _`jquast/blessed`: https://github.com/jquast/blessed
.. _`selectel/pyte`: https://github.com/selectel/pyte
.. _`thomasballinger/curtsies`: https://github.com/thomasballinger/curtsies
.. _`dbcli/pgcli`: https://github.com/dbcli/pgcli
.. _`prompt-toolkit/python-prompt-toolkit`: https://github.com/prompt-toolkit/python-prompt-toolkit
.. _`timoxley/wcwidth`: https://github.com/timoxley/wcwidth
.. _`wcwidth(3)`: https://man7.org/linux/man-pages/man3/wcwidth.3.html
.. _`wcswidth(3)`: https://man7.org/linux/man-pages/man3/wcswidth.3.html
.. _`astanin/python-tabulate`: https://github.com/astanin/python-tabulate
.. _`janlelis/unicode-display_width`: https://github.com/janlelis/unicode-display_width
.. _`rspeer/python-ftfy`: https://github.com/rspeer/python-ftfy
.. _`alecrabbit/php-wcwidth`: https://github.com/alecrabbit/php-wcwidth
.. _`Text::CharWidth`: https://metacpan.org/pod/Text::CharWidth
.. _`bluebear94/Terminal-WCWidth`: https://github.com/bluebear94/Terminal-WCWidth
.. _`mattn/go-runewidth`: https://github.com/mattn/go-runewidth
.. _`grepsuzette/wcwidth`: https://github.com/grepsuzette/wcwidth
.. _`jquast/ucs-detect`: https://github.com/jquast/ucs-detect
.. _`Avram Lubkin`: https://github.com/avylove
.. _`nbedos/termtosvg`: https://github.com/nbedos/termtosvg
.. _`peterbrittain/asciimatics`: https://github.com/peterbrittain/asciimatics
.. _`aperezdc/lua-wcwidth`: https://github.com/aperezdc/lua-wcwidth
.. _`joachimschmidt557/zig-wcwidth`: https://github.com/joachimschmidt557/zig-wcwidth
.. _`fumiyas/wcwidth-cjk`: https://github.com/fumiyas/wcwidth-cjk
.. _`joshuarubin/wcwidth9`: https://github.com/joshuarubin/wcwidth9
.. _`python-cmd2/cmd2`: https://github.com/python-cmd2/cmd2
.. _`stratis-storage/stratis-cli`: https://github.com/stratis-storage/stratis-cli
.. _`ihabunek/toot`: https://github.com/ihabunek/toot
.. _`saulpw/visidata`: https://github.com/saulpw/visidata
.. _`pip-tools`: https://pip-tools.readthedocs.io/
.. _`sphinx`: https://www.sphinx-doc.org/
.. |pypi_downloads| image:: https://img.shields.io/pypi/dm/wcwidth.svg?logo=pypi
:alt: Downloads
:target: https://pypi.org/project/wcwidth/
.. |codecov| image:: https://codecov.io/gh/jquast/wcwidth/branch/master/graph/badge.svg
:alt: codecov.io Code Coverage
:target: https://app.codecov.io/gh/jquast/wcwidth/
.. |license| image:: https://img.shields.io/pypi/l/wcwidth.svg
:target: https://pypi.org/project/wcwidth/
:alt: MIT License | {
"source": "yandex/perforator",
"title": "contrib/python/wcwidth/py2/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/wcwidth/py2/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 13560
} |
|pypi_downloads| |codecov| |license|
============
Introduction
============
This library is mainly for CLI programs that carefully produce output for
Terminals, or that pretend to be a terminal emulator.
**Problem Statement**: The printable length of *most* strings is equal to the
number of cells they occupy on the screen (``1 character : 1 cell``). However,
there are categories of characters that *occupy 2 cells* (full-wide), and
others that *occupy 0* cells (zero-width).
**Solution**: POSIX.1-2001 and POSIX.1-2008 conforming systems provide
`wcwidth(3)`_ and `wcswidth(3)`_ C functions, which this Python module's
functions precisely copy. *These functions return the number of cells a
unicode string is expected to occupy.*
Installation
------------
The stable version of this package is maintained on PyPI; install it using pip::
pip install wcwidth
Example
-------
**Problem**: given the following phrase (Japanese),
>>> text = u'コンニチハ'
Python **incorrectly** uses the *string length* of 5 codepoints rather than the
*printable length* of 10 cells, so that when using the `rjust` function, the
output length is wrong::
>>> print(len('コンニチハ'))
5
>>> print('コンニチハ'.rjust(20, '_'))
_______________コンニチハ
By defining our own "rjust" function that uses wcwidth, we can correct this::
>>> def wc_rjust(text, length, padding=' '):
... from wcwidth import wcswidth
... return padding * max(0, (length - wcswidth(text))) + text
...
Our **Solution** uses wcswidth to determine the string length correctly::
>>> from wcwidth import wcswidth
>>> print(wcswidth('コンニチハ'))
10
>>> print(wc_rjust('コンニチハ', 20, '_'))
__________コンニチハ
Choosing a Version
------------------
Export an environment variable, ``UNICODE_VERSION``. This should be done by
*terminal emulators* or those developers experimenting with authoring one of
their own, from shell::
$ export UNICODE_VERSION=13.0
If unspecified, the latest version is used. If your Terminal Emulator does not
export this variable, you can use the `jquast/ucs-detect`_ utility to
automatically detect and export it to your shell.
wcwidth, wcswidth
-----------------
Use function ``wcwidth()`` to determine the length of a *single unicode
character*, and ``wcswidth()`` to determine the length of many, a *string
of unicode characters*.
Briefly, return values of function ``wcwidth()`` are:
``-1``
Indeterminate (not printable).
``0``
Does not advance the cursor, such as NULL or Combining.
``2``
Characters of category East Asian Wide (W) or East Asian
Full-width (F) which are displayed using two terminal cells.
``1``
All others.
Function ``wcswidth()`` simply returns the sum of all values for each character
along a string, or ``-1`` when it occurs anywhere along a string.
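For instance, the return values above look like this in practice::
    >>> from wcwidth import wcwidth, wcswidth
    >>> wcswidth('コンニチハ')     # five full-width characters
    10
    >>> wcwidth('\u0301')          # combining acute accent
    0
    >>> wcswidth('abc\u0301')      # 'abc' plus a zero-width combining mark
    3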
Full API Documentation at https://wcwidth.readthedocs.org
==========
Developing
==========
Install wcwidth in editable mode::
pip install -e .
Execute unit tests using tox_::
tox -e py27,py35,py36,py37,py38,py39,py310,py311,py312
Updating Unicode Version
------------------------
Regenerate python code tables from latest Unicode Specification data files::
tox -e update
The script is located at ``bin/update-tables.py`` and requires Python 3.9 or
later. It is recommended but not necessary to run this script with the newest
Python, because the newest Python has the latest ``unicodedata`` for generating
comments.
Building Documentation
----------------------
This project is using `sphinx`_ 4.5 to build documentation::
tox -e sphinx
The output will be in ``docs/_build/html/``.
Updating Requirements
---------------------
This project is using `pip-tools`_ to manage requirements.
To upgrade requirements for updating unicode version, run::
tox -e update_requirements_update
To upgrade requirements for testing, run::
tox -e update_requirements37,update_requirements39
To upgrade requirements for building documentation, run::
tox -e update_requirements_docs
Utilities
---------
Supplementary tools for browsing and testing terminals for wide unicode
characters are found in the `bin/`_ of this project's source code. Just ensure
to first ``pip install -r requirements-develop.txt`` from this projects main
folder. For example, an interactive browser for testing::
python ./bin/wcwidth-browser.py
====
Uses
====
This library is used in:
- `jquast/blessed`_: a thin, practical wrapper around terminal capabilities in
Python.
- `prompt-toolkit/python-prompt-toolkit`_: a Library for building powerful
interactive command lines in Python.
- `dbcli/pgcli`_: Postgres CLI with autocompletion and syntax highlighting.
- `thomasballinger/curtsies`_: a Curses-like terminal wrapper with a display
based on compositing 2d arrays of text.
- `selectel/pyte`_: Simple VTXXX-compatible linux terminal emulator.
- `astanin/python-tabulate`_: Pretty-print tabular data in Python, a library
and a command-line utility.
- `rspeer/python-ftfy`_: Fixes mojibake and other glitches in Unicode
text.
- `nbedos/termtosvg`_: Terminal recorder that renders sessions as SVG
animations.
- `peterbrittain/asciimatics`_: Package to help people create full-screen text
UIs.
- `python-cmd2/cmd2`_: A tool for building interactive command line apps
- `stratis-storage/stratis-cli`_: CLI for the Stratis project
- `ihabunek/toot`_: A Mastodon CLI/TUI client
- `saulpw/visidata`_: Terminal spreadsheet multitool for discovering and
arranging data
===============
Other Languages
===============
- `timoxley/wcwidth`_: JavaScript
- `janlelis/unicode-display_width`_: Ruby
- `alecrabbit/php-wcwidth`_: PHP
- `Text::CharWidth`_: Perl
- `bluebear94/Terminal-WCWidth`_: Perl 6
- `mattn/go-runewidth`_: Go
- `grepsuzette/wcwidth`_: Haxe
- `aperezdc/lua-wcwidth`_: Lua
- `joachimschmidt557/zig-wcwidth`_: Zig
- `fumiyas/wcwidth-cjk`_: `LD_PRELOAD` override
- `joshuarubin/wcwidth9`_: Unicode version 9 in C
=======
History
=======
0.2.13 *2024-01-06*
* **Bugfix** zero-width support for Hangul Jamo (Korean)
0.2.12 *2023-11-21*
* re-release to remove .pyi file misplaced in wheel files `Issue #101`_.
0.2.11 *2023-11-20*
* Include tests files in the source distribution (`PR #98`_, `PR #100`_).
0.2.10 *2023-11-13*
* **Bugfix** accounting of some kinds of emoji sequences using U+FE0F
Variation Selector 16 (`PR #97`_).
* **Updated** `Specification <Specification_from_pypi_>`_.
0.2.9 *2023-10-30*
* **Bugfix** zero-width characters used in Emoji ZWJ sequences, Balinese,
Jamo, Devanagari, Tamil, Kannada and others (`PR #91`_).
* **Updated** to include `Specification <Specification_from_pypi_>`_ of
character measurements.
0.2.8 *2023-09-30*
* Include requirements files in the source distribution (`PR #82`_).
0.2.7 *2023-09-28*
* **Updated** tables to include Unicode Specification 15.1.0.
* Include ``bin``, ``docs``, and ``tox.ini`` in the source distribution
0.2.6 *2023-01-14*
* **Updated** tables to include Unicode Specification 14.0.0 and 15.0.0.
* **Changed** developer tools to use pip-compile, and to use jinja2 templates
for code generation in `bin/update-tables.py` to prepare for possible
compiler optimization release.
0.2.1 .. 0.2.5 *2020-06-23*
* **Repository** changes to update tests and packaging issues, and
begin tagging repository with matching release versions.
0.2.0 *2020-06-01*
* **Enhancement**: Unicode version may be selected by exporting the
Environment variable ``UNICODE_VERSION``, such as ``13.0``, or ``6.3.0``.
See the `jquast/ucs-detect`_ CLI utility for automatic detection.
* **Enhancement**:
API Documentation is published to readthedocs.org.
* **Updated** tables for *all* Unicode Specifications with files
published in a programmatically consumable format, versions 4.1.0
through 13.0
0.1.9 *2020-03-22*
* **Performance** optimization by `Avram Lubkin`_, `PR #35`_.
* **Updated** tables to Unicode Specification 13.0.0.
0.1.8 *2020-01-01*
* **Updated** tables to Unicode Specification 12.0.0. (`PR #30`_).
0.1.7 *2016-07-01*
* **Updated** tables to Unicode Specification 9.0.0. (`PR #18`_).
0.1.6 *2016-01-08 Production/Stable*
* ``LICENSE`` file now included with distribution.
0.1.5 *2015-09-13 Alpha*
* **Bugfix**:
Resolution of "combining_ character width" issue, most especially
those that previously returned -1 now often (correctly) return 0.
resolved by `Philip Craig`_ via `PR #11`_.
* **Deprecated**:
The module path ``wcwidth.table_comb`` is no longer available,
it has been superseded by module path ``wcwidth.table_zero``.
0.1.4 *2014-11-20 Pre-Alpha*
* **Feature**: ``wcswidth()`` now determines printable length
for (most) combining_ characters. The developer's tool
`bin/wcwidth-browser.py`_ is improved to display combining_
characters when provided the ``--combining`` option
(`Thomas Ballinger`_ and `Leta Montopoli`_ `PR #5`_).
* **Feature**: added static analysis (prospector_) to testing
framework.
0.1.3 *2014-10-29 Pre-Alpha*
* **Bugfix**: 2nd parameter of wcswidth was not honored.
(`Thomas Ballinger`_, `PR #4`_).
0.1.2 *2014-10-28 Pre-Alpha*
* **Updated** tables to Unicode Specification 7.0.0.
(`Thomas Ballinger`_, `PR #3`_).
0.1.1 *2014-05-14 Pre-Alpha*
* Initial release to pypi, Based on Unicode Specification 6.3.0
This code was originally derived directly from C code of the same name,
whose latest version is available at
https://www.cl.cam.ac.uk/~mgk25/ucs/wcwidth.c::
* Markus Kuhn -- 2007-05-26 (Unicode 5.0)
*
* Permission to use, copy, modify, and distribute this software
* for any purpose and without fee is hereby granted. The author
* disclaims all warranties with regard to this software.
.. _`Specification_from_pypi`: https://wcwidth.readthedocs.io/en/latest/specs.html
.. _`tox`: https://tox.wiki/en/latest/
.. _`prospector`: https://github.com/landscapeio/prospector
.. _`combining`: https://en.wikipedia.org/wiki/Combining_character
.. _`bin/`: https://github.com/jquast/wcwidth/tree/master/bin
.. _`bin/wcwidth-browser.py`: https://github.com/jquast/wcwidth/blob/master/bin/wcwidth-browser.py
.. _`Thomas Ballinger`: https://github.com/thomasballinger
.. _`Leta Montopoli`: https://github.com/lmontopo
.. _`Philip Craig`: https://github.com/philipc
.. _`PR #3`: https://github.com/jquast/wcwidth/pull/3
.. _`PR #4`: https://github.com/jquast/wcwidth/pull/4
.. _`PR #5`: https://github.com/jquast/wcwidth/pull/5
.. _`PR #11`: https://github.com/jquast/wcwidth/pull/11
.. _`PR #18`: https://github.com/jquast/wcwidth/pull/18
.. _`PR #30`: https://github.com/jquast/wcwidth/pull/30
.. _`PR #35`: https://github.com/jquast/wcwidth/pull/35
.. _`PR #82`: https://github.com/jquast/wcwidth/pull/82
.. _`PR #91`: https://github.com/jquast/wcwidth/pull/91
.. _`PR #97`: https://github.com/jquast/wcwidth/pull/97
.. _`PR #98`: https://github.com/jquast/wcwidth/pull/98
.. _`PR #100`: https://github.com/jquast/wcwidth/pull/100
.. _`Issue #101`: https://github.com/jquast/wcwidth/issues/101
.. _`jquast/blessed`: https://github.com/jquast/blessed
.. _`selectel/pyte`: https://github.com/selectel/pyte
.. _`thomasballinger/curtsies`: https://github.com/thomasballinger/curtsies
.. _`dbcli/pgcli`: https://github.com/dbcli/pgcli
.. _`prompt-toolkit/python-prompt-toolkit`: https://github.com/prompt-toolkit/python-prompt-toolkit
.. _`timoxley/wcwidth`: https://github.com/timoxley/wcwidth
.. _`wcwidth(3)`: https://man7.org/linux/man-pages/man3/wcwidth.3.html
.. _`wcswidth(3)`: https://man7.org/linux/man-pages/man3/wcswidth.3.html
.. _`astanin/python-tabulate`: https://github.com/astanin/python-tabulate
.. _`janlelis/unicode-display_width`: https://github.com/janlelis/unicode-display_width
.. _`rspeer/python-ftfy`: https://github.com/rspeer/python-ftfy
.. _`alecrabbit/php-wcwidth`: https://github.com/alecrabbit/php-wcwidth
.. _`Text::CharWidth`: https://metacpan.org/pod/Text::CharWidth
.. _`bluebear94/Terminal-WCWidth`: https://github.com/bluebear94/Terminal-WCWidth
.. _`mattn/go-runewidth`: https://github.com/mattn/go-runewidth
.. _`grepsuzette/wcwidth`: https://github.com/grepsuzette/wcwidth
.. _`jquast/ucs-detect`: https://github.com/jquast/ucs-detect
.. _`Avram Lubkin`: https://github.com/avylove
.. _`nbedos/termtosvg`: https://github.com/nbedos/termtosvg
.. _`peterbrittain/asciimatics`: https://github.com/peterbrittain/asciimatics
.. _`aperezdc/lua-wcwidth`: https://github.com/aperezdc/lua-wcwidth
.. _`joachimschmidt557/zig-wcwidth`: https://github.com/joachimschmidt557/zig-wcwidth
.. _`fumiyas/wcwidth-cjk`: https://github.com/fumiyas/wcwidth-cjk
.. _`joshuarubin/wcwidth9`: https://github.com/joshuarubin/wcwidth9
.. _`python-cmd2/cmd2`: https://github.com/python-cmd2/cmd2
.. _`stratis-storage/stratis-cli`: https://github.com/stratis-storage/stratis-cli
.. _`ihabunek/toot`: https://github.com/ihabunek/toot
.. _`saulpw/visidata`: https://github.com/saulpw/visidata
.. _`pip-tools`: https://pip-tools.readthedocs.io/
.. _`sphinx`: https://www.sphinx-doc.org/
.. |pypi_downloads| image:: https://img.shields.io/pypi/dm/wcwidth.svg?logo=pypi
:alt: Downloads
:target: https://pypi.org/project/wcwidth/
.. |codecov| image:: https://codecov.io/gh/jquast/wcwidth/branch/master/graph/badge.svg
:alt: codecov.io Code Coverage
:target: https://app.codecov.io/gh/jquast/wcwidth/
.. |license| image:: https://img.shields.io/pypi/l/wcwidth.svg
:target: https://pypi.org/project/wcwidth/
:alt: MIT License | {
"source": "yandex/perforator",
"title": "contrib/python/wcwidth/py3/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/python/wcwidth/py3/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 13560
} |
# How to contribute #
We'd love to accept your patches and contributions to this project. There are
just a few small guidelines you need to follow.
## Contributor License Agreement ##
Contributions to any Google project must be accompanied by a Contributor
License Agreement. This is not a copyright **assignment**, it simply gives
Google permission to use and redistribute your contributions as part of the
project.
* If you are an individual writing original source code and you're sure you
own the intellectual property, then you'll need to sign an [individual
CLA][].
* If you work for a company that wants to allow you to contribute your work,
then you'll need to sign a [corporate CLA][].
You generally only need to submit a CLA once, so if you've already submitted
one (even if it was for a different project), you probably don't need to do it
again.
[individual CLA]: https://developers.google.com/open-source/cla/individual
[corporate CLA]: https://developers.google.com/open-source/cla/corporate
Once your CLA is submitted (or if you already submitted one for
another Google project), make a commit adding yourself to the
[AUTHORS][] and [CONTRIBUTORS][] files. This commit can be part
of your first [pull request][].
[AUTHORS]: AUTHORS
[CONTRIBUTORS]: CONTRIBUTORS
## Submitting a patch ##
1. It's generally best to start by opening a new issue describing the bug or
feature you're intending to fix. Even if you think it's relatively minor,
it's helpful to know what people are working on. Mention in the initial
issue that you are planning to work on that bug or feature so that it can
be assigned to you.
1. Follow the normal process of [forking][] the project, and setup a new
branch to work in. It's important that each group of changes be done in
separate branches in order to ensure that a pull request only includes the
commits related to that bug or feature.
1. Do your best to have [well-formed commit messages][] for each change.
This provides consistency throughout the project, and ensures that commit
messages are able to be formatted properly by various git tools.
1. Finally, push the commits to your fork and submit a [pull request][].
[forking]: https://help.github.com/articles/fork-a-repo
[well-formed commit messages]: http://tbaggery.com/2008/04/19/a-note-about-git-commit-messages.html
[pull request]: https://help.github.com/articles/creating-a-pull-request | {
"source": "yandex/perforator",
"title": "contrib/restricted/google/benchmark/CONTRIBUTING.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/restricted/google/benchmark/CONTRIBUTING.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 2484
} |
# Benchmark
[](https://github.com/google/benchmark/actions?query=workflow%3Abuild-and-test)
[](https://github.com/google/benchmark/actions/workflows/bazel.yml)
[](https://github.com/google/benchmark/actions?query=workflow%3Apylint)
[](https://github.com/google/benchmark/actions?query=workflow%3Atest-bindings)
[](https://coveralls.io/r/google/benchmark)
[](https://discord.gg/cz7UX7wKC2)
A library to benchmark code snippets, similar to unit tests. Example:
```c++
#include <benchmark/benchmark.h>
static void BM_SomeFunction(benchmark::State& state) {
// Perform setup here
for (auto _ : state) {
// This code gets timed
SomeFunction();
}
}
// Register the function as a benchmark
BENCHMARK(BM_SomeFunction);
// Run the benchmark
BENCHMARK_MAIN();
```
## Getting Started
To get started, see [Requirements](#requirements) and
[Installation](#installation). See [Usage](#usage) for a full example and the
[User Guide](docs/user_guide.md) for a more comprehensive feature overview.
It may also help to read the [Google Test documentation](https://github.com/google/googletest/blob/main/docs/primer.md)
as some of the structural aspects of the APIs are similar.
## Resources
[Discussion group](https://groups.google.com/d/forum/benchmark-discuss)
IRC channels:
* [libera](https://libera.chat) #benchmark
[Additional Tooling Documentation](docs/tools.md)
[Assembly Testing Documentation](docs/AssemblyTests.md)
[Building and installing Python bindings](docs/python_bindings.md)
## Requirements
The library can be used with C++03. However, it requires C++14 to build,
including compiler and standard library support.
_See [dependencies.md](docs/dependencies.md) for more details regarding supported
compilers and standards._
If you have need for a particular compiler to be supported, patches are very welcome.
See [Platform-Specific Build Instructions](docs/platform_specific_build_instructions.md).
## Installation
This describes the installation process using cmake. As pre-requisites, you'll
need git and cmake installed.
_See [dependencies.md](docs/dependencies.md) for more details regarding supported
versions of build tools._
```bash
# Check out the library.
$ git clone https://github.com/google/benchmark.git
# Go to the library root directory
$ cd benchmark
# Make a build directory to place the build output.
$ cmake -E make_directory "build"
# Generate build system files with cmake, and download any dependencies.
$ cmake -E chdir "build" cmake -DBENCHMARK_DOWNLOAD_DEPENDENCIES=on -DCMAKE_BUILD_TYPE=Release ../
# or, starting with CMake 3.13, use a simpler form:
# cmake -DCMAKE_BUILD_TYPE=Release -S . -B "build"
# Build the library.
$ cmake --build "build" --config Release
```
This builds the `benchmark` and `benchmark_main` libraries and tests.
On a unix system, the build directory should now look something like this:
```
/benchmark
/build
/src
/libbenchmark.a
/libbenchmark_main.a
/test
...
```
Next, you can run the tests to check the build.
```bash
$ cmake -E chdir "build" ctest --build-config Release
```
If you want to install the library globally, also run:
```
sudo cmake --build "build" --config Release --target install
```
Note that Google Benchmark requires Google Test to build and run the tests. This
dependency can be provided two ways:
* Checkout the Google Test sources into `benchmark/googletest`.
* Otherwise, if `-DBENCHMARK_DOWNLOAD_DEPENDENCIES=ON` is specified during
configuration as above, the library will automatically download and build
any required dependencies.
If you do not wish to build and run the tests, add `-DBENCHMARK_ENABLE_GTEST_TESTS=OFF`
to `CMAKE_ARGS`.
### Debug vs Release
By default, benchmark builds as a debug library. You will see a warning in the
output when this is the case. To build it as a release library instead, add
`-DCMAKE_BUILD_TYPE=Release` when generating the build system files, as shown
above. The use of `--config Release` in build commands is needed to properly
support multi-configuration tools (like Visual Studio for example) and can be
skipped for other build systems (like Makefile).
To enable link-time optimisation, also add `-DBENCHMARK_ENABLE_LTO=true` when
generating the build system files.
If you are using gcc, you might need to set `GCC_AR` and `GCC_RANLIB` cmake
cache variables, if autodetection fails.
If you are using clang, you may need to set `LLVMAR_EXECUTABLE`,
`LLVMNM_EXECUTABLE` and `LLVMRANLIB_EXECUTABLE` cmake cache variables.
To enable sanitizer checks (eg., `asan` and `tsan`), add:
```
-DCMAKE_C_FLAGS="-g -O2 -fno-omit-frame-pointer -fsanitize=address -fsanitize=thread -fno-sanitize-recover=all"
-DCMAKE_CXX_FLAGS="-g -O2 -fno-omit-frame-pointer -fsanitize=address -fsanitize=thread -fno-sanitize-recover=all "
```
### Stable and Experimental Library Versions
The main branch contains the latest stable version of the benchmarking library;
the API of which can be considered largely stable, with source breaking changes
being made only upon the release of a new major version.
Newer, experimental, features are implemented and tested on the
[`v2` branch](https://github.com/google/benchmark/tree/v2). Users who wish
to use, test, and provide feedback on the new features are encouraged to try
this branch. However, this branch provides no stability guarantees and reserves
the right to change and break the API at any time.
## Usage
### Basic usage
Define a function that executes the code to measure, register it as a benchmark
function using the `BENCHMARK` macro, and ensure an appropriate `main` function
is available:
```c++
#include <benchmark/benchmark.h>
static void BM_StringCreation(benchmark::State& state) {
for (auto _ : state)
std::string empty_string;
}
// Register the function as a benchmark
BENCHMARK(BM_StringCreation);
// Define another benchmark
static void BM_StringCopy(benchmark::State& state) {
std::string x = "hello";
for (auto _ : state)
std::string copy(x);
}
BENCHMARK(BM_StringCopy);
BENCHMARK_MAIN();
```
To run the benchmark, compile and link against the `benchmark` library
(libbenchmark.a/.so). If you followed the build steps above, this library will
be under the build directory you created.
```bash
# Example on linux after running the build steps above. Assumes the
# `benchmark` and `build` directories are under the current directory.
$ g++ mybenchmark.cc -std=c++11 -isystem benchmark/include \
-Lbenchmark/build/src -lbenchmark -lpthread -o mybenchmark
```
Alternatively, link against the `benchmark_main` library and remove
`BENCHMARK_MAIN();` above to get the same behavior.
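As a minimal sketch (not taken from this README), the `benchmark_main` variant of the example above simply drops the macro and gets `main()` from the library at link time:

```c++
#include <benchmark/benchmark.h>

#include <string>

static void BM_StringCreation(benchmark::State& state) {
  for (auto _ : state)
    std::string empty_string;
}
BENCHMARK(BM_StringCreation);

// No BENCHMARK_MAIN(); here -- main() is provided by the benchmark_main
// library at link time (e.g. -lbenchmark_main -lbenchmark -lpthread).
```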
The compiled executable will run all benchmarks by default. Pass the `--help`
flag for option information or see the [User Guide](docs/user_guide.md).
### Usage with CMake
If using CMake, it is recommended to link against the project-provided
`benchmark::benchmark` and `benchmark::benchmark_main` targets using
`target_link_libraries`.
It is possible to use ```find_package``` to import an installed version of the
library.
```cmake
find_package(benchmark REQUIRED)
```
Alternatively, ```add_subdirectory``` will incorporate the library directly in
to one's CMake project.
```cmake
add_subdirectory(benchmark)
```
Either way, link to the library as follows.
```cmake
target_link_libraries(MyTarget benchmark::benchmark)
``` | {
"source": "yandex/perforator",
"title": "contrib/restricted/google/benchmark/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/restricted/google/benchmark/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 7876
} |
# Googletest Mocking (gMock) Framework
### Overview
Google's framework for writing and using C++ mock classes. It can help you
derive better designs of your system and write better tests.
It is inspired by:
* [jMock](http://www.jmock.org/)
* [EasyMock](https://easymock.org/)
* [Hamcrest](https://code.google.com/p/hamcrest/)
It is designed with C++'s specifics in mind.
gMock:
- Provides a declarative syntax for defining mocks (see the sketch after this list).
- Can define partial (hybrid) mocks, which are a cross of real and mock
objects.
- Handles functions of arbitrary types and overloaded functions.
- Comes with a rich set of matchers for validating function arguments.
- Uses an intuitive syntax for controlling the behavior of a mock.
- Does automatic verification of expectations (no record-and-replay needed).
- Allows arbitrary (partial) ordering constraints on function calls to be
expressed.
- Lets a user extend it by defining new matchers and actions.
- Does not use exceptions.
- Is easy to learn and use.
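As a minimal, hedged sketch of what this looks like in practice (the `Turtle` interface below is illustrative and not part of this README):

```c++
#include <gmock/gmock.h>
#include <gtest/gtest.h>

// A small interface to be mocked (illustrative only).
class Turtle {
 public:
  virtual ~Turtle() = default;
  virtual void PenDown() = 0;
  virtual int GetX() const = 0;
};

// The mock is declared with gMock's MOCK_METHOD macro.
class MockTurtle : public Turtle {
 public:
  MOCK_METHOD(void, PenDown, (), (override));
  MOCK_METHOD(int, GetX, (), (const, override));
};

TEST(PainterTest, DrawsWithTurtle) {
  MockTurtle turtle;
  EXPECT_CALL(turtle, PenDown()).Times(1);
  EXPECT_CALL(turtle, GetX()).WillOnce(testing::Return(42));

  // Expectations are verified automatically when `turtle` goes out of scope.
  turtle.PenDown();
  EXPECT_EQ(turtle.GetX(), 42);
}
```

Link against `gmock_main` (or provide your own `main`) to run such a test.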
Details and examples can be found here:
* [gMock for Dummies](https://google.github.io/googletest/gmock_for_dummies.html)
* [Legacy gMock FAQ](https://google.github.io/googletest/gmock_faq.html)
* [gMock Cookbook](https://google.github.io/googletest/gmock_cook_book.html)
* [gMock Cheat Sheet](https://google.github.io/googletest/gmock_cheat_sheet.html)
GoogleMock is a part of
[GoogleTest C++ testing framework](https://github.com/google/googletest/) and a
subject to the same requirements. | {
"source": "yandex/perforator",
"title": "contrib/restricted/googletest/googlemock/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/restricted/googletest/googlemock/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1536
} |
### Generic Build Instructions
#### Setup
To build GoogleTest and your tests that use it, you need to tell your build
system where to find its headers and source files. The exact way to do it
depends on which build system you use, and is usually straightforward.
### Build with CMake
GoogleTest comes with a CMake build script
([CMakeLists.txt](https://github.com/google/googletest/blob/main/CMakeLists.txt))
that can be used on a wide range of platforms ("C" stands for cross-platform).
If you don't have CMake installed already, you can download it for free from
<https://cmake.org/>.
CMake works by generating native makefiles or build projects that can be used in
the compiler environment of your choice. You can either build GoogleTest as a
standalone project or it can be incorporated into an existing CMake build for
another project.
#### Standalone CMake Project
When building GoogleTest as a standalone project, the typical workflow starts
with
```
git clone https://github.com/google/googletest.git -b v1.15.0
cd googletest # Main directory of the cloned repository.
mkdir build # Create a directory to hold the build output.
cd build
cmake .. # Generate native build scripts for GoogleTest.
```
The above command also includes GoogleMock by default. And so, if you want to
build only GoogleTest, you should replace the last command with
```
cmake .. -DBUILD_GMOCK=OFF
```
If you are on a \*nix system, you should now see a Makefile in the current
directory. Just type `make` to build GoogleTest. And then you can simply install
GoogleTest if you are a system administrator.
```
make
sudo make install # Install in /usr/local/ by default
```
If you use Windows and have Visual Studio installed, a `gtest.sln` file and
several `.vcproj` files will be created. You can then build them using Visual
Studio.
On Mac OS X with Xcode installed, a `.xcodeproj` file will be generated.
#### Incorporating Into An Existing CMake Project
If you want to use GoogleTest in a project which already uses CMake, the easiest
way is to get installed libraries and headers.
* Import GoogleTest by using `find_package` (or `pkg_check_modules`). For
example, if `find_package(GTest CONFIG REQUIRED)` succeeds, you can use the
libraries as `GTest::gtest`, `GTest::gmock`.
And a more robust and flexible approach is to build GoogleTest as part of that
project directly. This is done by making the GoogleTest source code available to
the main build and adding it using CMake's `add_subdirectory()` command. This
has the significant advantage that the same compiler and linker settings are
used between GoogleTest and the rest of your project, so issues associated with
using incompatible libraries (eg debug/release), etc. are avoided. This is
particularly useful on Windows. Making GoogleTest's source code available to the
main build can be done a few different ways:
* Download the GoogleTest source code manually and place it at a known
location. This is the least flexible approach and can make it more difficult
to use with continuous integration systems, etc.
* Embed the GoogleTest source code as a direct copy in the main project's
source tree. This is often the simplest approach, but is also the hardest to
keep up to date. Some organizations may not permit this method.
* Add GoogleTest as a git submodule or equivalent. This may not always be
possible or appropriate. Git submodules, for example, have their own set of
advantages and drawbacks.
* Use CMake to download GoogleTest as part of the build's configure step. This
approach doesn't have the limitations of the other methods.
The last of the above methods is implemented with a small piece of CMake code
that downloads and pulls the GoogleTest code into the main build.
Just add to your `CMakeLists.txt`:
```cmake
include(FetchContent)
FetchContent_Declare(
googletest
# Specify the commit you depend on and update it regularly.
URL https://github.com/google/googletest/archive/5376968f6948923e2411081fd9372e71a59d8e77.zip
)
# For Windows: Prevent overriding the parent project's compiler/linker settings
set(gtest_force_shared_crt ON CACHE BOOL "" FORCE)
FetchContent_MakeAvailable(googletest)
# Now simply link against gtest or gtest_main as needed. Eg
add_executable(example example.cpp)
target_link_libraries(example gtest_main)
add_test(NAME example_test COMMAND example)
```
Note that this approach requires CMake 3.14 or later due to its use of the
`FetchContent_MakeAvailable()` command.
##### Visual Studio Dynamic vs Static Runtimes
By default, new Visual Studio projects link the C runtimes dynamically but
GoogleTest links them statically. This will generate an error that looks
something like the following: gtest.lib(gtest-all.obj) : error LNK2038: mismatch
detected for 'RuntimeLibrary': value 'MTd_StaticDebug' doesn't match value
'MDd_DynamicDebug' in main.obj
GoogleTest already has a CMake option for this: `gtest_force_shared_crt`
Enabling this option will make gtest link the runtimes dynamically too, and
match the project in which it is included.
#### C++ Standard Version
An environment that supports C++14 is required in order to successfully build
GoogleTest. One way to ensure this is to specify the standard in the top-level
project, for example by using the `set(CMAKE_CXX_STANDARD 14)` command along
with `set(CMAKE_CXX_STANDARD_REQUIRED ON)`. If this is not feasible, for example
in a C project using GoogleTest for validation, then it can be specified by
adding it to the options for cmake via the`-DCMAKE_CXX_FLAGS` option.
### Tweaking GoogleTest
GoogleTest can be used in diverse environments. The default configuration may
not work (or may not work well) out of the box in some environments. However,
you can easily tweak GoogleTest by defining control macros on the compiler
command line. Generally, these macros are named like `GTEST_XYZ` and you define
them to either 1 or 0 to enable or disable a certain feature.
We list the most frequently used macros below. For a complete list, see file
[include/gtest/internal/gtest-port.h](https://github.com/google/googletest/blob/main/googletest/include/gtest/internal/gtest-port.h).
### Multi-threaded Tests
GoogleTest is thread-safe where the pthread library is available. After
`#include <gtest/gtest.h>`, you can check the
`GTEST_IS_THREADSAFE` macro to see whether this is the case (yes if the macro is
`#defined` to 1, no if it's undefined).
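For example, a small sketch (assuming a standard GoogleTest setup) that inspects the macro from a regular test:

```c++
#include <gtest/gtest.h>

TEST(PortCheck, ReportsThreadSafety) {
#ifdef GTEST_IS_THREADSAFE
  SUCCEED() << "GoogleTest was built with thread-safety (pthread) support.";
#else
  GTEST_SKIP() << "GoogleTest was built without thread-safety support.";
#endif
}
```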
If GoogleTest doesn't correctly detect whether pthread is available in your
environment, you can force it with
```
-DGTEST_HAS_PTHREAD=1
```
or
```
-DGTEST_HAS_PTHREAD=0
```
When GoogleTest uses pthread, you may need to add flags to your compiler and/or
linker to select the pthread library, or you'll get link errors. If you use the
CMake script, this is taken care of for you. If you use your own build script,
you'll need to read your compiler and linker's manual to figure out what flags
to add.
### As a Shared Library (DLL)
GoogleTest is compact, so most users can build and link it as a static library
for the simplicity. You can choose to use GoogleTest as a shared library (known
as a DLL on Windows) if you prefer.
To compile *gtest* as a shared library, add
```
-DGTEST_CREATE_SHARED_LIBRARY=1
```
to the compiler flags. You'll also need to tell the linker to produce a shared
library instead - consult your linker's manual for how to do it.
To compile your *tests* that use the gtest shared library, add
```
-DGTEST_LINKED_AS_SHARED_LIBRARY=1
```
to the compiler flags.
Note: while the above steps aren't technically necessary today when using some
compilers (e.g. GCC), they may become necessary in the future, if we decide to
improve the speed of loading the library (see
<https://gcc.gnu.org/wiki/Visibility> for details). Therefore you are
recommended to always add the above flags when using GoogleTest as a shared
library. Otherwise a future release of GoogleTest may break your build script.
### Avoiding Macro Name Clashes
In C++, macros don't obey namespaces. Therefore two libraries that both define a
macro of the same name will clash if you `#include` both definitions. In case a
GoogleTest macro clashes with another library, you can force GoogleTest to
rename its macro to avoid the conflict.
Specifically, if both GoogleTest and some other code define macro FOO, you can
add
```
-DGTEST_DONT_DEFINE_FOO=1
```
to the compiler flags to tell GoogleTest to change the macro's name from `FOO`
to `GTEST_FOO`. Currently `FOO` can be `ASSERT_EQ`, `ASSERT_FALSE`, `ASSERT_GE`,
`ASSERT_GT`, `ASSERT_LE`, `ASSERT_LT`, `ASSERT_NE`, `ASSERT_TRUE`,
`EXPECT_FALSE`, `EXPECT_TRUE`, `FAIL`, `SUCCEED`, `TEST`, or `TEST_F`. For
example, with `-DGTEST_DONT_DEFINE_TEST=1`, you'll need to write
```
GTEST_TEST(SomeTest, DoesThis) { ... }
```
instead of
```
TEST(SomeTest, DoesThis) { ... }
```
in order to define a test. | {
"source": "yandex/perforator",
"title": "contrib/restricted/googletest/googletest/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/restricted/googletest/googletest/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 8957
} |
This directory contains data needed by Bison.
# Directory Content
## Skeletons
Bison skeletons: the general shapes of the different parser kinds, that are
specialized for specific grammars by the bison program.
Currently, the supported skeletons are:
- yacc.c
It used to be named bison.simple: it corresponds to C Yacc
compatible LALR(1) parsers.
- lalr1.cc
Produces a C++ parser class.
- lalr1.java
Produces a Java parser class.
- glr.c
A Generalized LR C parser based on Bison's LALR(1) tables.
- glr.cc
A Generalized LR C++ parser. Actually a C++ wrapper around glr.c.
These skeletons are the only ones supported by the Bison team. Because the
interface between skeletons and the bison program is not finished, *we are
not bound to it*. In particular, Bison is not mature enough for us to
consider that "foreign skeletons" are supported.
## m4sugar
This directory contains M4sugar, sort of an extended library for M4, which
is used by Bison to instantiate the skeletons.
## xslt
This directory contains XSLT programs that transform Bison's XML output into
various formats.
- bison.xsl
A library of routines used by the other XSLT programs.
- xml2dot.xsl
Conversion into GraphViz's dot format.
- xml2text.xsl
Conversion into text.
- xml2xhtml.xsl
Conversion into XHTML.
# Implementation Notes About the Skeletons
"Skeleton" in Bison parlance means "backend": a skeleton is fed by the bison
executable with LR tables, facts about the symbols, etc. and they generate
the output (say parser.cc, parser.hh, location.hh, etc.). They are only in
charge of generating the parser and its auxiliary files, they do not
generate the XML output, the parser.output reports, nor the graphical
rendering.
The bits of information passing from bison to the backend is named
"muscles". Muscles are passed to M4 via its standard input: it's a set of
m4 definitions. To see them, use `--trace=muscles`.
Except for muscles, whose names are generated by bison, the skeletons have
no constraint at all on the macro names: there is no technical/theoretical
limitation, as long as you generate the output, you can do what you want.
However, of course, that would be a bad idea if, say, the C and C++
skeletons used different approaches and had completely different
implementations. That would be a maintenance nightmare.
Below, we document some of the macros that we use in several of the
skeletons. If you are to write a new skeleton, please, implement them for
your language. Overall, be sure to follow the same patterns as the existing
skeletons.
## Symbols
### `b4_symbol(NUM, FIELD)`
In order to unify the handling of the various aspects of symbols (tag, type
name, whether terminal, etc.), bison.exe defines one macro per (token,
field), where field can `has_id`, `id`, etc.: see
`prepare_symbols_definitions()` in `src/output.c`.
The macro `b4_symbol(NUM, FIELD)` gives access to the following FIELDS:
- `has_id`: 0 or 1
Whether the symbol has an `id`.
- `id`: string (e.g., `exp`, `NUM`, or `TOK_NUM` with api.token.prefix)
If `has_id`, the name of the token kind (prefixed by api.token.prefix if
defined), otherwise empty. Guaranteed to be usable as a C identifier.
This is used to define the token kind (i.e., the enum used by the return
value of yylex). Should be named `token_kind`.
- `tag`: string
A human readable representation of the symbol. Can be `'foo'`,
`'foo.id'`, `'"foo"'` etc.
- `code`: integer
The token code associated to the token kind `id`.
The external number as used by yylex. Can be ASCII code when a character,
some number chosen by bison, or some user number in the case of `%token
FOO <NUM>`. Corresponds to `yychar` in `yacc.c`.
- `is_token`: 0 or 1
Whether this is a terminal symbol.
- `kind_base`: string (e.g., `YYSYMBOL_exp`, `YYSYMBOL_NUM`)
The base of the symbol kind, i.e., the enumerator of this symbol (token or
nonterminal) which is mapped to its `number`.
- `kind`: string
Same as `kind_base`, but possibly with a prefix in some languages. E.g.,
EOF's `kind_base` and `kind` are `YYSYMBOL_YYEOF` in C, but are
`S_YYEMPTY` and `symbol_kind::S_YYEMPTY` in C++.
- `number`: integer
The code associated to the `kind`.
The internal number (computed from the external number by yytranslate).
Corresponds to yytoken in yacc.c. This is the same number that serves as
key in b4_symbol(NUM, FIELD).
In bison, symbols are first assigned increasing numbers in order of
appearance (but tokens first, then nterms). After grammar reduction,
unused nterms are then renumbered to appear last (i.e., first tokens, then
used nterms and finally unused nterms). This final number NUM is the one
contained in this field, and it is the one used as key in `b4_symbol(NUM,
FIELD)`.
The code of the rule actions, however, is emitted before we know what
symbols are unused, so they use the original numbers. To avoid confusion,
they actually use "orig NUM" instead of just "NUM". bison also emits
definitions for `b4_symbol(orig NUM, number)` that map from original
numbers to the new ones. `b4_symbol` actually resolves `orig NUM` in the
other case, i.e., `b4_symbol(orig 42, tag)` would return the tag of the
symbols whose original number was 42.
- `has_type`: 0, 1
Whether has a semantic value.
- `type_tag`: string
When api.value.type=union, the generated name for the union member.
yytype_INT etc. for symbols that has_id, otherwise yytype_1 etc.
- `type`: string
If it has a semantic value, its type tag, or, if variant are used,
its type.
In the case of api.value.type=union, type is the real type (e.g. int).
- `slot`: string
If it has a semantic value, the name of the union member (i.e., bounces to
either `type_tag` or `type`). It would be better to fix our mess and
always use `type` for the true type of the member, and `type_tag` for the
name of the union member.
- `has_printer`: 0, 1
- `printer`: string
- `printer_file`: string
- `printer_line`: integer
- `printer_loc`: location
If the symbol has a printer, everything about it.
- `has_destructor`, `destructor`, `destructor_file`, `destructor_line`, `destructor_loc`
Likewise.
### `b4_symbol_value(VAL, [SYMBOL-NUM], [TYPE-TAG])`
Expansion of $$, $1, $<TYPE-TAG>3, etc.
The semantic value from a given VAL.
- `VAL`: some semantic value storage (typically a union). e.g., `yylval`
- `SYMBOL-NUM`: the symbol number from which we extract the type tag.
- `TYPE-TAG`, the user forced the `<TYPE-TAG>`.
The result can be used safely, it is put in parens to avoid nasty precedence
issues.
### `b4_lhs_value(SYMBOL-NUM, [TYPE])`
Expansion of `$$` or `$<TYPE>$`, for symbol `SYMBOL-NUM`.
### `b4_rhs_data(RULE-LENGTH, POS)`
The data corresponding to the symbol `#POS`, where the current rule has
`RULE-LENGTH` symbols on RHS.
### `b4_rhs_value(RULE-LENGTH, POS, SYMBOL-NUM, [TYPE])`
Expansion of `$<TYPE>POS`, where the current rule has `RULE-LENGTH` symbols
on RHS.
<!--
Local Variables:
mode: markdown
fill-column: 76
ispell-dictionary: "american"
End:
Copyright (C) 2002, 2008-2015, 2018-2021 Free Software Foundation, Inc.
This file is part of GNU Bison.
This program is free software: you can redistribute it and/or modify
it under the terms of the GNU General Public License as published by
the Free Software Foundation, either version 3 of the License, or
(at your option) any later version.
This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
GNU General Public License for more details.
You should have received a copy of the GNU General Public License
along with this program. If not, see <https://www.gnu.org/licenses/>.
--> | {
"source": "yandex/perforator",
"title": "contrib/tools/bison/data/README.md",
"url": "https://github.com/yandex/perforator/blob/main/contrib/tools/bison/data/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 7817
} |
The Python C API
================
The C API is divided into these sections:
1. ``Include/``: Limited API
2. ``Include/cpython/``: CPython implementation details
3. ``Include/cpython/``, names with the ``PyUnstable_`` prefix: API that can
change between minor releases
4. ``Include/internal/``, and any name with ``_`` prefix: The internal API
Information on changing the C API is available `in the developer guide`_
.. _in the developer guide: https://devguide.python.org/c-api/ | {
"source": "yandex/perforator",
"title": "contrib/tools/python3/Include/README.rst",
"url": "https://github.com/yandex/perforator/blob/main/contrib/tools/python3/Include/README.rst",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 485
} |
# logging (with profiling features)
See the [`./demo`](./demo/app.py) application for a usage example.
## Useful links
**Trace Event Format specification**
- https://docs.google.com/document/d/1CvAClvFfyA5R-PhYUmn5OOQtYMH4h6I0nSsKchNAySU
**UIs for viewing traces:**
- Chrome DevTools, Performance tab (Load profile button), [guide](https://developer.chrome.com/docs/devtools/performance/reference/)
- https://ui.perfetto.dev/
- <chrome://tracing/> in Chromium-based browsers, including [Yandex Browser](browser://tracing/) | {
"source": "yandex/perforator",
"title": "devtools/frontend_build_platform/libraries/logging/README.md",
"url": "https://github.com/yandex/perforator/blob/main/devtools/frontend_build_platform/libraries/logging/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 528
} |
# nots builder
An internal tool that builds frontend projects (build nodes) as part of the `ya make` build system.
## Builds the `node_modules` directory
Runs `pnpm install` using package archives that were downloaded in advance.
The expected output is a `node_modules.tar` archive.
_To be removed soon_
## Runs builders such as `tsc`, `webpack`, `next`, `vite`
The builder is chosen according to the target type (the `TS_TSC`, `TS_WEBPACK`, `TS_NEXT`, `TS_VITE` macros in `ya.make`) and performs the actual build of the project.
The packages must be installed as project dependencies, usually in `devDependencies`.
The expected output is a `<module_name>.output.tar` archive.
## Release
`builder` is compiled into binaries, which are used inside the build in prebuilt form.
The mapping of resources built for different platforms is described in [resources.json](./resources.json).
For now the release is triggered manually, and in two places: when starting the release and when merging the result into trunk.
When changing the builder within a PR you need to:
0. Share the PR with `robot-nots`. **IMPORTANT!**
1. Run the manual action `devtools/frontend_build_platform/nots/builder: Build nots-builder (from PR)`
2. The builder will be built from the branch and the new resources will be written to [resources.json](./resources.json)
3. With the new commit, all tier0 projects that use TS macros will be rebuilt, as a check that the builder changes did not break anything
4. "Releasing" the new version means merging the branch.
{% note warning "REMEMBER!" %}
For any additional commits to the PR, you need to manually re-run the action
```
devtools/frontend_build_platform/nots/builder: Build nots-builder (from PR)
```
{% endnote %}
You can skip the check within your PR and instead, after merging, run the
[release](https://a.yandex-team.ru/projects/?frontend_build_platform/ci/releases/timeline?dir=devtools%2Ffrontend_build_platform%2Fnots%2Fbuilder&id=release-nots),
which will:
1. Build the new resources
2. Update [resources.json](./resources.json)
3. Create a PR to trunk
4. Wait for it to be merged
## Running locally
To disable the prebuilt binary and run the builder with the changes from your branch, set the following variable:
```shell
ya make -DTS_USE_PREBUILT_NOTS_TOOL=no --host-platform-flag=TS_USE_PREBUILT_NOTS_TOOL=no
```
For convenience, you can [configure an alias](https://docs.yandex-team.ru/yatool/usage/options#primer-otklyuchenie-predpostroennyh-tulov) in `junk/<username>/ya.conf`:
```toml
[[include]]
path = "${YA_ARCADIA_ROOT}/devtools/frontend_build_platform/nots/builder/builder.ya.conf"
```
After that the `-n` and `--no-prebuilt-nots` aliases become available:
```shell
ya make -n
ya make --no-prebuilt-nots
```
## Running
The builder is normally launched inside a `ya make` build, as a command (`.CMD`).
Running it manually is hard, because it requires a specific set of prepared directories passed as arguments.
If you really need to, you can throw an exception from the builder (make it fail).
The build will then finish with an error, but the build directories are not removed, and the builder launch command with all its arguments is written to the log.
That way you can tweak the command and rerun it with the same parameters, for example in debug mode.
## Debugging
You can enable logging with the `-DTS_LOG=yes` option; `builder` will then print to the console the arguments it received and how it parsed them.
It will also show the command used to launch the underlying builder itself (which you can copy and run yourself).
### Debugging with breakpoints
Real debugging in an IDE is done as follows:
1. Set up the environment with the `ya ide venv` command from the `cli` directory (useful in general; in codenv it should already be configured);
2. Activate this environment in the IDE;
3. Run `builder` with the `-DTS_LOG` option and copy all the command-line arguments from the output;
4. Configure a debug run of `main.py` in the IDE and pass it all of those arguments.
## Profiling
It is possible to generate trace files for performance debugging of all build stages (inside `builder`).
To do so, run `ya make` with the `-DTS_TRACE` option (i.e. set the `TS_TRACE=yes` variable).
A `.traces` directory will be added to the output archive, containing a file in `Trace Event Format` that can be opened in Chrome DevTools or at https://ui.perfetto.dev/.
More details on how to read the tracing output: https://a.yandex-team.ru/arcadia/devtools/frontend_build_platform/libraries/logging/README.md
After a run via `ya tool nots build`, the archives are unpacked and the traces can be found at the following paths:
- For `create-node-modules`: `./node_modules/.traces/builder_create-node-modules.trace.json`
- For other commands: `./.traces/builder_<command>.trace.json` | {
"source": "yandex/perforator",
"title": "devtools/frontend_build_platform/nots/builder/README.md",
"url": "https://github.com/yandex/perforator/blob/main/devtools/frontend_build_platform/nots/builder/README.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 4535
} |
# Architecture overview
This page describes Perforator's architecture.
## Intro
Perforator consists of several components. There are four different kinds of components:
1. Perforator agent which is running on each node of the profiled cluster
2. A few microservices to collect, store, analyze, and symbolize profiles & executable files from the agents. We run our microservices inside the same cluster Perforator profiles.
3. A web-based user interface.
4. A set of databases: ClickHouse to store profile metadata, PostgreSQL to store binary metadata, and S3-compatible store to store raw profiles and binaries.
This design has proven effective enough to scale to tens of thousands of nodes and petabytes of profiles.
## High-level schema

## Components overview
### Agent
The Perforator agent is the core of our profiling infrastructure. The agent runs on each node inside the cluster, collects profiles using eBPF, aggregates, compresses and sends them to the storage via gRPC in the pprof-compatible format.
The agent connects to kubelet to identify the pods running on a node. The agent also tracks all processes on the node and analyzes the running executables. To profile, the agent uses the [perf_events Linux API](https://man7.org/linux/man-pages/man2/perf_event_open.2.html) to trigger an eBPF program on each perf event, such as "1M CPU cycles" or "1K major pagefaults". The eBPF program collects information about the thread it was invoked on: thread / process id, thread / process name, userspace and kernelspace call stacks, and so on. The program sends the collected samples to the user-space part of the agent via the eBPF perfbuf API.

Agent collects samples from the eBPF program in memory and periodically sends them to the storage over gRPC. By default, for each pod an agent will send a profile every minute. We call this profile, consisting of the samples of one workload over one minute, an *atomic profile*.
In order to generate a human-readable profile, the agent has to map addresses of executable instructions to source code lines. That process, called *symbolization*, is compute-intensive. If the same code is executed on thousands of nodes, the agent would have to run the same symbolization on each node, which has proven to be really expensive. So we took another approach.
In addition to profiles, agent uploads executable files found on the profiled nodes. This can sound scary, but with careful synchronization we guarantee that each binary file is uploaded only once. The binaries are uploaded through storage microservice to S3 and PostgreSQL storages, and can be post-processed later to generate symbolized profiles. Binary files are identified using *build-id*: unique identifier of the compiled binary, which is often computed as a hash of meaningful ELF sections. Some executables do not contain build-id, so we compute so-called *pseudo build-id*: hash based on a few random portions of executable text.
{% note info %}
While in theory we can support external artifact registries, this feature is not available now. Feel free to discuss and contribute.
{% endnote %}
### Databases
Perforator uses three different storages. All the other microservices are stateless.
- Raw atomic profiles, binaries, and some post-processed artifacts of binaries (GSYM data) are stored in the S3-compatible storage. S3 is cheap and scales well.
- For each atomic profile we write a row to the ClickHouse cluster. This allows us to quickly find interesting profiles using complex selectors, because ClickHouse is blazing fast. See [ClickHouse table structure](https://github.com/yandex/perforator/tree/main/perforator/cmd/migrate/migrations/clickhouse) for details.
- For each executable file we write a row to the PostgreSQL cluster to synchronize uploads and track binary TTL. We have found that there are not too many executable files and they can easily be stored in PostgreSQL. However, it is quite easy to use your custom SQL database instead (probably a few hundred lines of Golang). See the source code for details.
For more details about database structure see [database reference](../../reference/database.md).
### Storage proxy
Storage is a simple gRPC server that proxies upload requests from agents to ClickHouse, PostgreSQL and S3. Storage proxy is stateless, despite the name, and can (and should!) safely be replicated. We run hundreds of storage pods in our largest cluster. Storage can optionally provide agent authorization using mTLS.
### Symbolizer
Symbolizer (or *proxy* in the source code) is the main user-facing service. Symbolizer allows users to list profiles or services, build merged profiles that span multiple atomic profiles matching one selector, and so on. Symbolizer consumes a noticeable amount of resources: symbolization is heavy. We are working to optimize this. Luckily, the symbolizer itself often runs in the same cluster we profile and can easily be self-profiled.
Symbolizer provides two variants of the same API: raw gRPC interface and HTTP RPC using beautiful [grpc-gateway](https://github.com/grpc-ecosystem/grpc-gateway/). The gRPC interface can be used by CLIs and automated services like profile quality monitors while HTTP API is used by our web interface. | {
"source": "yandex/perforator",
"title": "docs/en/explanation/architecture/overview.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/explanation/architecture/overview.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 5352
} |
# Fetch Profile
Perforator CLI allows you to fetch profiles from the Perforator server.
## Collect a flamegraph for a service over 30 minutes from 10 samples and start viewer on localhost:9000
```console
perforator fetch --format=flamegraph -s "now-30m" --service "redis-master" -m 10 --serve ":9000"
```
## Collect a flamegraph from a pod for the last 30 minutes and start viewer on localhost:9000
```console
perforator fetch -s "now-30m" --pod-id "mongodb-statefulset-2" -m 10 --serve ":9000"
```
## Collect a flamegraph for an executable over 30 minutes from 10 samples
To identify the executable, use the BuildID. You can find the BuildID using the following command:
```console
readelf -n <path_to_binary>
```
```console
perforator fetch --format=flamegraph -s "now-30m" --build-id "abacaba" -m 10 --serve ":9000"
```
## Collect a pprof profile for an arbitrary selector
```console
perforator fetch --format=pprof -s "now-30m" --selector \
    '{node_id="example.org|worker-us-east1-b-1", timestamp>="now-30m"}' -m 10 -o profile.pprof
```
## Collect a flamegraph filtered by TLS string variable value
Before collecting a profile, you need to mark TLS variables in your code using one of `Y_PERFORATOR_THREAD_LOCAL` macros.
```console
perforator fetch --format=flamegraph -s "now-30m" --selector '{node_id="worker-us-east1-b-1", "tls.TString_KEY"="VALUE"}'
``` | {
"source": "yandex/perforator",
"title": "docs/en/guides/cli/fetch.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/guides/cli/fetch.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1372
} |
# Install Perforator CLI
Build `perforator/cmd/cli` from the root of the repository.
# Configure Perforator CLI
* Set `PERFORATOR_ENDPOINT` environment variable to specify the Perforator server URL once. Otherwise you need to use `--url` flag for each command.
* Set `PERFORATOR_SECURE` to enable or disable TLS. By default, TLS is enabled.
```console
export PERFORATOR_ENDPOINT="https://perforator.example.com"
export PERFORATOR_SECURE=true
``` | {
"source": "yandex/perforator",
"title": "docs/en/guides/cli/install.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/guides/cli/install.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 450
} |
# List Profiles by Selector
Perforator CLI allows you to list profiles by selector.
## View the list of profiles by host for the last 15 minutes
```console
perforator list profiles --node-id "worker-us-east1-b-1" -s "now-15m"
```
## View the list of profiles by service for the last 30 minutes
```console
perforator list profiles --service "kafka-broker" -s "now-30m"
```
## View the list of profiles by abstract selector
```console
perforator list profiles --selector '{node_id="example.org|worker-us-east1-b-1", timestamp>="now-30m"}'
``` | {
"source": "yandex/perforator",
"title": "docs/en/guides/cli/list_profiles.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/guides/cli/list_profiles.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 548
} |
# List Services
Perforator CLI allows you to list profiled services from the Perforator server.
## View the complete list of services sorted by name
```console
perforator list services
```
## View the list of services filtered by regex and sorted by number of profiles
```console
perforator list services -r "perforator" --order-by "profiles"
```
## View the list of services that had profiles in the last hour
```console
perforator list services --max-stale-age "1h"
``` | {
"source": "yandex/perforator",
"title": "docs/en/guides/cli/list_services.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/guides/cli/list_services.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 478
} |
# Force profile saving using Microscope
Microscope is a way to save all profiles from a specific selector bypassing sampling (for example `{host="worker-us-east1-b-1"}`).
## Save profiles from a host {#save-profiles-from-a-host}
Create a microscope to save profiles from the entire host for 1 hour starting from the current moment:
```console
perforator microscope create --node-id "worker-us-east1-b-1" --duration "1h" --start-time "now"
```
After this command, minute-by-minute profiles will start being saved from the host `worker-us-east1-b-1`. You can view the list of profiles from the node for the last 15 minutes using this command:
```console
perforator list profiles --node-id "worker-us-east1-b-1" -s "now-15m"
```
## Save profiles from a pod
Create a microscope to save profiles from a pod for 15 minutes, starting in 30 minutes:
```console
perforator microscope create --pod-id "perforator-storage-production-73" --duration "15m" --start-time "now+30m"
```
View the profiles from the pod for the last 15 minutes:
```console
perforator list profiles --pod-id "perforator-storage-production-73" -s "now-15m"
```
## List created microscopes
View your created microscopes:
```console
perforator microscope list
``` | {
"source": "yandex/perforator",
"title": "docs/en/guides/cli/microscope.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/guides/cli/microscope.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1237
} |
# Collecting a Local Profile
Perforator can collect ad-hoc profiles on a local machine.
## Prerequisites
To work, you need:
1. A recent Linux kernel. Minimum 5.4, preferably 5.15. You can check the kernel version using `uname -r`.
2. Full root access. Perforator requires `CAP_SYS_ADMIN` because it runs an eBPF program capable of reading any state from the kernel or userspace. (run with `sudo`)
## Collect profile of a process and save to a pprof file
```console
perforator record --format pprof -p <pid> --duration 1m --output profile.pprof
```
View `profile.pprof` file.
## Start a subprocess and collect its flamegraph
```console
perforator record --duration 1m -o ./flame.svg -- ls
```
View `flame.svg` file.
## Collect profile of a process and serve a flamegraph on localhost:9000
```console
perforator record -p <pid> --duration 1m --serve ":9000"
```
View the flamegraph at `http://localhost:9000` in your browser.
## Collect profile of a whole system and serve a flamegraph on localhost:9000
```console
perforator record -a --duration 1m --serve ":9000"
```
View the flamegraph at `http://localhost:9000` in your browser.
## Collect profile of a whole system and save flamegraph SVG to file
```console
perforator record -a --duration 1m --output flame.svg
```
View `flame.svg` file. | {
"source": "yandex/perforator",
"title": "docs/en/guides/cli/record.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/guides/cli/record.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 1312
} |
# JVM-based language support
This page documents support for Java and other JVM-based languages.
## Requirements
Following requirements must be met:
* HotSpot JVM version 17 or newer is used.
* JVM is running with `-XX:+PreserveFramePointer` flag.
* Perfmap-based symbolization is enabled for the JVM process, and [`java` option](../perfmap.md#configuration-java) is enabled as well.
## Automatic perfmap generation for JVM {#jvm}
When enabled, Perforator can automatically instruct a JVM process to generate perfmap. Internally, agent will use [JDK Attach API](https://docs.oracle.com/en/java/javase/21/docs/api/jdk.attach/module-summary.html) to periodically execute equivalent to the following command
```bash
jcmd ${PID} Compiler.perfmap
```
{% note warning %}
Attach API is an OpenJDK extension. It may be unavailable in other implementations of the JVM.
{% endnote %} | {
"source": "yandex/perforator",
"title": "docs/en/reference/language-support/java.md",
"url": "https://github.com/yandex/perforator/blob/main/docs/en/reference/language-support/java.md",
"date": "2025-01-29T14:20:43",
"stars": 2926,
"description": "Perforator is a cluster-wide continuous profiling tool designed for large data centers",
"file_size": 884
} |