Code–docstring dataset rows (language: python; partition: train). Each row records repo, path, func_name, language, sha, url, and partition, together with the function source and its docstring.
repo: symengine/symengine.py
path: symengine/compatibility.py
func_name: as_int
language: python
sha: 1366cf98ceaade339c5dd24ae3381a0e63ea9dad
url: https://github.com/symengine/symengine.py/blob/1366cf98ceaade339c5dd24ae3381a0e63ea9dad/symengine/compatibility.py#L359-L389
partition: train

```python
def as_int(n):
    """
    Convert the argument to a builtin integer.

    The return value is guaranteed to be equal to the input. ValueError is
    raised if the input has a non-integral value.

    Examples
    ========

    >>> from sympy.core.compatibility import as_int
    >>> from sympy import sqrt
    >>> 3.0
    3.0
    >>> as_int(3.0)  # convert to int and test for equality
    3
    >>> int(sqrt(10))
    3
    >>> as_int(sqrt(10))
    Traceback (most recent call last):
    ...
    ValueError: ... is not an integer
    """
    try:
        result = int(n)
        if result != n:
            raise TypeError
    except TypeError:
        raise ValueError('%s is not an integer' % n)
    return result
```
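The convert-then-compare pattern in `as_int` works on any numeric type that defines `int()` and `==`. A minimal standalone check of that logic — the function body copied verbatim, so no SymPy import is needed:

```python
def as_int(n):
    # Convert, then verify the conversion lost nothing.
    try:
        result = int(n)
        if result != n:
            raise TypeError
    except TypeError:
        raise ValueError('%s is not an integer' % n)
    return result

print(as_int(3.0))   # 3: the conversion is exact, so it is accepted
try:
    as_int(3.5)      # inexact conversion: rejected
except ValueError as e:
    print(e)         # 3.5 is not an integer
```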
repo: symengine/symengine.py
path: symengine/compatibility.py
func_name: default_sort_key
language: python
sha: 1366cf98ceaade339c5dd24ae3381a0e63ea9dad
url: https://github.com/symengine/symengine.py/blob/1366cf98ceaade339c5dd24ae3381a0e63ea9dad/symengine/compatibility.py#L392-L541
partition: train

```python
def default_sort_key(item, order=None):
    """Return a key that can be used for sorting.

    The key has the structure:

        (class_key, (len(args), args), exponent.sort_key(), coefficient)

    This key is supplied by the sort_key routine of Basic objects when
    ``item`` is a Basic object or an object (other than a string) that
    sympifies to a Basic object. Otherwise, this function produces the
    key.

    The ``order`` argument is passed along to the sort_key routine and is
    used to determine how the terms *within* an expression are ordered.
    (See examples below.) ``order`` options are: 'lex', 'grlex', 'grevlex',
    and reversed values of the same (e.g. 'rev-lex'). The default order
    value is None (which translates to 'lex').

    Examples
    ========

    >>> from sympy import S, I, default_sort_key
    >>> from sympy.core.function import UndefinedFunction
    >>> from sympy.abc import x

    The following are equivalent ways of getting the key for an object:

    >>> x.sort_key() == default_sort_key(x)
    True

    Here are some examples of the key that is produced:

    >>> default_sort_key(UndefinedFunction('f'))
    ((0, 0, 'UndefinedFunction'), (1, ('f',)), ((1, 0, 'Number'),
    (0, ()), (), 1), 1)
    >>> default_sort_key('1')
    ((0, 0, 'str'), (1, ('1',)), ((1, 0, 'Number'), (0, ()), (), 1), 1)
    >>> default_sort_key(S.One)
    ((1, 0, 'Number'), (0, ()), (), 1)
    >>> default_sort_key(2)
    ((1, 0, 'Number'), (0, ()), (), 2)

    While sort_key is a method only defined for SymPy objects,
    default_sort_key will accept anything as an argument so it is
    more robust as a sorting key. For the following, using
    key=lambda i: i.sort_key() would fail because 2 doesn't have a
    sort_key method; that's why default_sort_key is used. Note that it
    also handles sympification of non-string items like ints:

    >>> a = [2, I, -I]
    >>> sorted(a, key=default_sort_key)
    [2, -I, I]

    The returned key can be used anywhere that a key can be specified for
    a function, e.g. sort, min, max, etc...:

    >>> a.sort(key=default_sort_key); a[0]
    2
    >>> min(a, key=default_sort_key)
    2

    Note
    ----

    The key returned is useful for getting items into a canonical order
    that will be the same across platforms. It is not directly useful for
    sorting lists of expressions:

    >>> a, b = x, 1/x

    Since ``a`` has only 1 term, its value of sort_key is unaffected by
    ``order``:

    >>> a.sort_key() == a.sort_key('rev-lex')
    True

    If ``a`` and ``b`` are combined then the key will differ because there
    are terms that can be ordered:

    >>> eq = a + b
    >>> eq.sort_key() == eq.sort_key('rev-lex')
    False
    >>> eq.as_ordered_terms()
    [x, 1/x]
    >>> eq.as_ordered_terms('rev-lex')
    [1/x, x]

    But since the keys for each of these terms are independent of ``order``'s
    value, they don't sort differently when they appear separately in a list:

    >>> sorted(eq.args, key=default_sort_key)
    [1/x, x]
    >>> sorted(eq.args, key=lambda i: default_sort_key(i, order='rev-lex'))
    [1/x, x]

    The order of terms obtained when using these keys is the order that would
    be obtained if those terms were *factors* in a product.

    See Also
    ========

    sympy.core.expr.as_ordered_factors, sympy.core.expr.as_ordered_terms
    """
    from sympy.core import S, Basic
    from sympy.core.sympify import sympify, SympifyError
    from sympy.core.compatibility import iterable

    if isinstance(item, Basic):
        return item.sort_key(order=order)

    if iterable(item, exclude=string_types):
        if isinstance(item, dict):
            args = item.items()
            unordered = True
        elif isinstance(item, set):
            args = item
            unordered = True
        else:
            # e.g. tuple, list
            args = list(item)
            unordered = False

        args = [default_sort_key(arg, order=order) for arg in args]

        if unordered:
            # e.g. dict, set
            args = sorted(args)

        cls_index, args = 10, (len(args), tuple(args))
    else:
        if not isinstance(item, string_types):
            try:
                item = sympify(item)
            except SympifyError:
                # e.g. lambda x: x
                pass
            else:
                if isinstance(item, Basic):
                    # e.g. int -> Integer
                    return default_sort_key(item)
                # e.g. UndefinedFunction

        # e.g. str
        cls_index, args = 0, (1, (str(item),))

    return (cls_index, 0, item.__class__.__name__
            ), args, S.One.sort_key(), S.One
```
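The non-Basic branch of `default_sort_key` boils down to a `(cls_index, args)` pair: containers get class index 10 and scalars get 0, so all scalars sort before all containers. A stripped-down sketch of that idea, with the SymPy-specific parts (sympify, the trailing `S.One` fields) deliberately left out as a simplification:

```python
def simple_sort_key(item):
    # Containers: key on (10, (length, element keys)); unordered
    # containers are sorted first so their key is order-independent,
    # mirroring the dict/set handling in default_sort_key.
    if isinstance(item, (list, tuple, set, frozenset)):
        keys = [simple_sort_key(a) for a in item]
        if isinstance(item, (set, frozenset)):
            keys = sorted(keys)
        return (10, (len(keys), tuple(keys)))
    # Scalars: key on (0, (1, (str(item),))), as in the str fallback case.
    return (0, (1, (str(item),)))

# Mixed types sort without TypeError: scalars by string, containers last.
print(sorted([3, 1, (2, 0), "b"], key=simple_sort_key))  # [1, 3, 'b', (2, 0)]
```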
repo: symengine/symengine.py
path: symengine/utilities.py
func_name: var
language: python
sha: 1366cf98ceaade339c5dd24ae3381a0e63ea9dad
url: https://github.com/symengine/symengine.py/blob/1366cf98ceaade339c5dd24ae3381a0e63ea9dad/symengine/utilities.py#L184-L242
partition: train

```python
def var(names, **args):
    """
    Create symbols and inject them into the global namespace.

    INPUT:

    - s -- a string, either a single variable name, or
    - a space separated list of variable names, or
    - a list of variable names.

    This calls :func:`symbols` with the same arguments and puts the results
    into the *global* namespace. It's recommended not to use :func:`var` in
    library code, where :func:`symbols` has to be used.

    Examples
    ========

    >>> from symengine import var

    >>> var('x')
    x
    >>> x
    x

    >>> var('a,ab,abc')
    (a, ab, abc)
    >>> abc
    abc

    See :func:`symbols` documentation for more details on what kinds of
    arguments can be passed to :func:`var`.
    """
    def traverse(symbols, frame):
        """Recursively inject symbols to the global namespace. """
        for symbol in symbols:
            if isinstance(symbol, Basic):
                frame.f_globals[symbol.__str__()] = symbol
                # Once we have an undefined function class
                # implemented, put a check for function here
            else:
                traverse(symbol, frame)

    from inspect import currentframe
    frame = currentframe().f_back

    try:
        syms = symbols(names, **args)

        if syms is not None:
            if isinstance(syms, Basic):
                frame.f_globals[syms.__str__()] = syms
                # Once we have an undefined function class
                # implemented, put a check for function here
            else:
                traverse(syms, frame)
    finally:
        del frame  # break cyclic dependencies as stated in inspect docs

    return syms
```
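The core trick in `var` is writing into the caller's `f_globals` via `inspect.currentframe`. A minimal sketch of just that mechanism, with a hypothetical `inject` helper standing in for the symbol-creation step:

```python
from inspect import currentframe

def inject(name, value):
    # Bind `value` to `name` in the *caller's* global namespace, the
    # same way var() publishes each created symbol.
    frame = currentframe().f_back
    try:
        frame.f_globals[name] = value
    finally:
        del frame  # break the reference cycle, as the inspect docs advise

inject("answer", 42)
print(answer)  # the name now exists at module level, never assigned directly
```

Note that `f_globals` always refers to the module namespace, so like `var` this publishes names globally even when called from inside a function.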
repo: Murali-group/halp
path: halp/undirected_hypergraph.py
func_name: UndirectedHypergraph._combine_attribute_arguments
language: python
sha: 6eb27466ba84e2281e18f93b62aae5efb21ef8b3
url: https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L137-L162
partition: train

```python
def _combine_attribute_arguments(self, attr_dict, attr):
    # Note: Code & comments unchanged from DirectedHypergraph
    """Combines attr_dict and attr dictionaries, by updating attr_dict
    with attr.

    :param attr_dict: dictionary of attributes of the node.
    :param attr: keyword arguments of attributes of the node;
                 attr's values will override attr_dict's values
                 if both are provided.
    :returns: dict -- single dictionary of [combined] attributes.
    :raises: AttributeError -- attr_dict argument must be a dictionary.
    """
    # If no attribute dict was passed, treat the keyword
    # arguments as the dict
    if attr_dict is None:
        attr_dict = attr
    # Otherwise, combine the passed attribute dict with
    # the keyword arguments
    else:
        try:
            attr_dict.update(attr)
        except AttributeError:
            raise AttributeError("attr_dict argument \
                                 must be a dictionary.")
    return attr_dict
```
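The merge rule above — keyword arguments win over the dict when both are given — can be exercised standalone. A small sketch with a hypothetical `combine` wrapper; unlike the original, which mutates `attr_dict` in place, this copy avoids the side effect:

```python
def combine(attr_dict=None, **attr):
    # Same precedence as _combine_attribute_arguments: keyword
    # arguments override keys already present in attr_dict.
    if attr_dict is None:
        return dict(attr)
    if not isinstance(attr_dict, dict):
        raise AttributeError("attr_dict argument must be a dictionary.")
    merged = dict(attr_dict)  # copy instead of mutating the caller's dict
    merged.update(attr)
    return merged

print(combine({"weight": 1, "color": "red"}, color="blue"))
# {'weight': 1, 'color': 'blue'}
```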
repo: Murali-group/halp
path: halp/undirected_hypergraph.py
func_name: UndirectedHypergraph.remove_node
language: python
sha: 6eb27466ba84e2281e18f93b62aae5efb21ef8b3
url: https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L256-L288
partition: train

```python
def remove_node(self, node):
    """Removes a node and its attributes from the hypergraph. Removes
    every hyperedge that contains this node.

    :param node: reference to the node being removed.
    :raises: ValueError -- No such node exists.

    Examples:
    ::

        >>> H = UndirectedHypergraph()
        >>> H.add_node("A", label="positive")
        >>> H.remove_node("A")

    """
    if not self.has_node(node):
        raise ValueError("No such node exists.")

    # Loop over every hyperedge in the star of the node;
    # i.e., over every hyperedge that contains the node
    for hyperedge_id in self._star[node]:
        frozen_nodes = \
            self._hyperedge_attributes[hyperedge_id]["__frozen_nodes"]
        # Remove the node set composing the hyperedge
        del self._node_set_to_hyperedge[frozen_nodes]
        # Remove this hyperedge's attributes
        del self._hyperedge_attributes[hyperedge_id]

    # Remove node's star
    del self._star[node]
    # Remove node's attributes dictionary
    del self._node_attributes[node]
```
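Removing a node must also retire every hyperedge in its star, exactly as the loop above does. A toy version of the two indices involved — simplified by assumption to just a star map and an edge map, with no attribute dictionaries:

```python
# star: node -> set of IDs of hyperedges containing it
star = {"A": {"e1", "e2"}, "B": {"e1"}, "C": {"e2"}}
# edges: hyperedge ID -> its node set
edges = {"e1": frozenset(["A", "B"]), "e2": frozenset(["A", "C"])}

def remove_node(node):
    # Drop every hyperedge containing the node, then the node itself,
    # keeping the other members' star sets consistent.
    for eid in star.pop(node):
        for other in edges.pop(eid) - {node}:
            star[other].discard(eid)

remove_node("A")
print(star)   # {'B': set(), 'C': set()}
print(edges)  # {}
```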
Murali-group/halp | halp/undirected_hypergraph.py | UndirectedHypergraph.add_hyperedge | def add_hyperedge(self, nodes, attr_dict=None, **attr):
"""Adds a hyperedge to the hypergraph, along with any related
attributes of the hyperedge.
This method will automatically add any node from the node set
that was not in the hypergraph.
A hyperedge without a "weight" attribute specified will be
assigned the default value of 1.
:param nodes: iterable container of references to nodes in the
hyperedge to be added.
:param attr_dict: dictionary of attributes of the hyperedge being
added.
:param attr: keyword arguments of attributes of the hyperedge;
attr's values will override attr_dict's values
if both are provided.
:returns: str -- the ID of the hyperedge that was added.
:raises: ValueError -- nodes arguments cannot be empty.
Examples:
::
>>> H = UndirectedHypergraph()
>>> x = H.add_hyperedge(["A", "B", "C"])
>>> y = H.add_hyperedge(("A", "D"), weight=2)
>>> z = H.add_hyperedge(set(["B", "D"]), {color: "red"})
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# Don't allow empty node set (invalid hyperedge)
if not nodes:
raise ValueError("nodes argument cannot be empty.")
# Use frozensets for node sets to allow for hashable keys
frozen_nodes = frozenset(nodes)
is_new_hyperedge = not self.has_hyperedge(frozen_nodes)
if is_new_hyperedge:
# Add nodes to graph (if not already present)
self.add_nodes(frozen_nodes)
# Create new hyperedge name to use as reference for that hyperedge
hyperedge_id = self._assign_next_hyperedge_id()
# For each node in the node set, add hyperedge to the node's star
for node in frozen_nodes:
self._star[node].add(hyperedge_id)
# Add the hyperedge ID as the hyperedge that the node set composes
self._node_set_to_hyperedge[frozen_nodes] = hyperedge_id
# Assign some special attributes to this hyperedge. We assign
# a default weight of 1 to the hyperedge. We also store the
# original node set in order to return them exactly as the
# user passed them into add_hyperedge.
self._hyperedge_attributes[hyperedge_id] = \
{"nodes": nodes, "__frozen_nodes": frozen_nodes, "weight": 1}
else:
# If its not a new hyperedge, just get its ID to update attributes
hyperedge_id = self._node_set_to_hyperedge[frozen_nodes]
# Set attributes and return hyperedge ID
self._hyperedge_attributes[hyperedge_id].update(attr_dict)
return hyperedge_id | python | def add_hyperedge(self, nodes, attr_dict=None, **attr):
"""Adds a hyperedge to the hypergraph, along with any related
attributes of the hyperedge.
This method will automatically add any node from the node set
that was not in the hypergraph.
A hyperedge without a "weight" attribute specified will be
assigned the default value of 1.
:param nodes: iterable container of references to nodes in the
hyperedge to be added.
:param attr_dict: dictionary of attributes of the hyperedge being
added.
:param attr: keyword arguments of attributes of the hyperedge;
attr's values will override attr_dict's values
if both are provided.
:returns: str -- the ID of the hyperedge that was added.
:raises: ValueError -- nodes arguments cannot be empty.
Examples:
::
>>> H = UndirectedHypergraph()
>>> x = H.add_hyperedge(["A", "B", "C"])
>>> y = H.add_hyperedge(("A", "D"), weight=2)
>>> z = H.add_hyperedge(set(["B", "D"]), {color: "red"})
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# Don't allow empty node set (invalid hyperedge)
if not nodes:
raise ValueError("nodes argument cannot be empty.")
# Use frozensets for node sets to allow for hashable keys
frozen_nodes = frozenset(nodes)
is_new_hyperedge = not self.has_hyperedge(frozen_nodes)
if is_new_hyperedge:
# Add nodes to graph (if not already present)
self.add_nodes(frozen_nodes)
# Create new hyperedge name to use as reference for that hyperedge
hyperedge_id = self._assign_next_hyperedge_id()
# For each node in the node set, add hyperedge to the node's star
for node in frozen_nodes:
self._star[node].add(hyperedge_id)
# Add the hyperedge ID as the hyperedge that the node set composes
self._node_set_to_hyperedge[frozen_nodes] = hyperedge_id
# Assign some special attributes to this hyperedge. We assign
# a default weight of 1 to the hyperedge. We also store the
# original node set in order to return them exactly as the
# user passed them into add_hyperedge.
self._hyperedge_attributes[hyperedge_id] = \
{"nodes": nodes, "__frozen_nodes": frozen_nodes, "weight": 1}
else:
# If its not a new hyperedge, just get its ID to update attributes
hyperedge_id = self._node_set_to_hyperedge[frozen_nodes]
# Set attributes and return hyperedge ID
self._hyperedge_attributes[hyperedge_id].update(attr_dict)
return hyperedge_id | [
"def",
"add_hyperedge",
"(",
"self",
",",
"nodes",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"# Don't allow empty node set (invalid hyperedge)",
"if",
"not",
"nodes",
":",
"raise",
"ValueError",
"(",
"\"nodes argument cannot be empty.\"",
")",
"# Use frozensets for node sets to allow for hashable keys",
"frozen_nodes",
"=",
"frozenset",
"(",
"nodes",
")",
"is_new_hyperedge",
"=",
"not",
"self",
".",
"has_hyperedge",
"(",
"frozen_nodes",
")",
"if",
"is_new_hyperedge",
":",
"# Add nodes to graph (if not already present)",
"self",
".",
"add_nodes",
"(",
"frozen_nodes",
")",
"# Create new hyperedge name to use as reference for that hyperedge",
"hyperedge_id",
"=",
"self",
".",
"_assign_next_hyperedge_id",
"(",
")",
"# For each node in the node set, add hyperedge to the node's star",
"for",
"node",
"in",
"frozen_nodes",
":",
"self",
".",
"_star",
"[",
"node",
"]",
".",
"add",
"(",
"hyperedge_id",
")",
"# Add the hyperedge ID as the hyperedge that the node set composes",
"self",
".",
"_node_set_to_hyperedge",
"[",
"frozen_nodes",
"]",
"=",
"hyperedge_id",
"# Assign some special attributes to this hyperedge. We assign",
"# a default weight of 1 to the hyperedge. We also store the",
"# original node set in order to return them exactly as the",
"# user passed them into add_hyperedge.",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
"=",
"{",
"\"nodes\"",
":",
"nodes",
",",
"\"__frozen_nodes\"",
":",
"frozen_nodes",
",",
"\"weight\"",
":",
"1",
"}",
"else",
":",
"# If it's not a new hyperedge, just get its ID to update attributes",
"hyperedge_id",
"=",
"self",
".",
"_node_set_to_hyperedge",
"[",
"frozen_nodes",
"]",
"# Set attributes and return hyperedge ID",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
".",
"update",
"(",
"attr_dict",
")",
"return",
"hyperedge_id"
] | Adds a hyperedge to the hypergraph, along with any related
attributes of the hyperedge.
This method will automatically add any node from the node set
that was not in the hypergraph.
A hyperedge without a "weight" attribute specified will be
assigned the default value of 1.
:param nodes: iterable container of references to nodes in the
hyperedge to be added.
:param attr_dict: dictionary of attributes of the hyperedge being
added.
:param attr: keyword arguments of attributes of the hyperedge;
attr's values will override attr_dict's values
if both are provided.
:returns: str -- the ID of the hyperedge that was added.
:raises: ValueError -- nodes argument cannot be empty.
Examples:
::
>>> H = UndirectedHypergraph()
>>> x = H.add_hyperedge(["A", "B", "C"])
>>> y = H.add_hyperedge(("A", "D"), weight=2)
>>> z = H.add_hyperedge(set(["B", "D"]), {"color": "red"})
"Adds",
"a",
"hyperedge",
"to",
"the",
"hypergraph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"hyperedge",
".",
"This",
"method",
"will",
"automatically",
"add",
"any",
"node",
"from",
"the",
"node",
"set",
"that",
"was",
"not",
"in",
"the",
"hypergraph",
".",
"A",
"hyperedge",
"without",
"a",
"weight",
"attribute",
"specified",
"will",
"be",
"assigned",
"the",
"default",
"value",
"of",
"1",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L384-L447 | train |
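The `add_hyperedge` record above hinges on one idea: the node set is frozen into a hashable key, so adding the same set of nodes twice collapses to a single hyperedge ID. A minimal stand-alone sketch of that pattern — the `TinyHypergraph` class and its `"e1"`-style IDs below are illustrative, not part of halp:

```python
class TinyHypergraph:
    """Minimal sketch of frozenset-keyed hyperedge storage (illustrative)."""

    def __init__(self):
        self._next_id = 0
        self._node_set_to_id = {}   # frozenset(nodes) -> hyperedge ID
        self._attrs = {}            # hyperedge ID -> attribute dict

    def add_hyperedge(self, nodes, **attr):
        # Reject empty node sets, as halp does.
        if not nodes:
            raise ValueError("nodes argument cannot be empty.")
        frozen = frozenset(nodes)
        if frozen not in self._node_set_to_id:
            # New hyperedge: mint an ID and assign the default weight of 1.
            self._next_id += 1
            he_id = "e%d" % self._next_id
            self._node_set_to_id[frozen] = he_id
            self._attrs[he_id] = {"nodes": nodes, "weight": 1}
        else:
            # Existing hyperedge: reuse its ID and only update attributes.
            he_id = self._node_set_to_id[frozen]
        self._attrs[he_id].update(attr)
        return he_id
```

Because the key is a frozenset, `["A", "B", "C"]` and `("C", "B", "A")` resolve to the same hyperedge, and a second call merely updates its attributes.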
Murali-group/halp | halp/undirected_hypergraph.py | UndirectedHypergraph.add_hyperedges | def add_hyperedges(self, hyperedges, attr_dict=None, **attr):
"""Adds multiple hyperedges to the graph, along with any related
attributes of the hyperedges.
If any node of a hyperedge has not previously been added to the
hypergraph, it will automatically be added here.
Hyperedges without a "weight" attribute specified will be
assigned the default value of 1.
:param hyperedges: iterable container of references to the node sets
:param attr_dict: dictionary of attributes shared by all
the hyperedges being added.
:param attr: keyword arguments of attributes of the hyperedges;
attr's values will override attr_dict's values
if both are provided.
:returns: list -- the IDs of the hyperedges added in the order
specified by the hyperedges container's iterator.
See also:
add_hyperedge
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
hyperedge_ids = []
for nodes in hyperedges:
hyperedge_id = self.add_hyperedge(nodes, attr_dict.copy())
hyperedge_ids.append(hyperedge_id)
return hyperedge_ids | python | def add_hyperedges(self, hyperedges, attr_dict=None, **attr):
"""Adds multiple hyperedges to the graph, along with any related
attributes of the hyperedges.
If any node of a hyperedge has not previously been added to the
hypergraph, it will automatically be added here.
Hyperedges without a "weight" attribute specified will be
assigned the default value of 1.
:param hyperedges: iterable container of references to the node sets
:param attr_dict: dictionary of attributes shared by all
the hyperedges being added.
:param attr: keyword arguments of attributes of the hyperedges;
attr's values will override attr_dict's values
if both are provided.
:returns: list -- the IDs of the hyperedges added in the order
specified by the hyperedges container's iterator.
See also:
add_hyperedge
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
hyperedge_ids = []
for nodes in hyperedges:
hyperedge_id = self.add_hyperedge(nodes, attr_dict.copy())
hyperedge_ids.append(hyperedge_id)
return hyperedge_ids | [
"def",
"add_hyperedges",
"(",
"self",
",",
"hyperedges",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"hyperedge_ids",
"=",
"[",
"]",
"for",
"nodes",
"in",
"hyperedges",
":",
"hyperedge_id",
"=",
"self",
".",
"add_hyperedge",
"(",
"nodes",
",",
"attr_dict",
".",
"copy",
"(",
")",
")",
"hyperedge_ids",
".",
"append",
"(",
"hyperedge_id",
")",
"return",
"hyperedge_ids"
] | Adds multiple hyperedges to the graph, along with any related
attributes of the hyperedges.
If any node of a hyperedge has not previously been added to the
hypergraph, it will automatically be added here.
Hyperedges without a "weight" attribute specified will be
assigned the default value of 1.
:param hyperedges: iterable container of references to the node sets
:param attr_dict: dictionary of attributes shared by all
the hyperedges being added.
:param attr: keyword arguments of attributes of the hyperedges;
attr's values will override attr_dict's values
if both are provided.
:returns: list -- the IDs of the hyperedges added in the order
specified by the hyperedges container's iterator.
See also:
add_hyperedge
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list) | [
"Adds",
"multiple",
"hyperedges",
"to",
"the",
"graph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"hyperedges",
".",
"If",
"any",
"node",
"of",
"a",
"hyperedge",
"has",
"not",
"previously",
"been",
"added",
"to",
"the",
"hypergraph",
"it",
"will",
"automatically",
"be",
"added",
"here",
".",
"Hyperedges",
"without",
"a",
"weight",
"attribute",
"specified",
"will",
"be",
"assigned",
"the",
"default",
"value",
"of",
"1",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L449-L487 | train |
Murali-group/halp | halp/undirected_hypergraph.py | UndirectedHypergraph.get_hyperedge_id | def get_hyperedge_id(self, nodes):
"""From a set of nodes, returns the ID of the hyperedge that this
set comprises.
:param nodes: iterable container of references to nodes in
the hyperedge to be added
:returns: str -- ID of the hyperedge that the specified
node set comprises.
:raises: ValueError -- No such hyperedge exists.
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> x = H.get_hyperedge_id(["A", "B", "C"])
"""
frozen_nodes = frozenset(nodes)
if not self.has_hyperedge(frozen_nodes):
raise ValueError("No such hyperedge exists.")
return self._node_set_to_hyperedge[frozen_nodes] | python | def get_hyperedge_id(self, nodes):
"""From a set of nodes, returns the ID of the hyperedge that this
set comprises.
:param nodes: iterable container of references to nodes in
the hyperedge to be added
:returns: str -- ID of the hyperedge that the specified
node set comprises.
:raises: ValueError -- No such hyperedge exists.
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> x = H.get_hyperedge_id(["A", "B", "C"])
"""
frozen_nodes = frozenset(nodes)
if not self.has_hyperedge(frozen_nodes):
raise ValueError("No such hyperedge exists.")
return self._node_set_to_hyperedge[frozen_nodes] | [
"def",
"get_hyperedge_id",
"(",
"self",
",",
"nodes",
")",
":",
"frozen_nodes",
"=",
"frozenset",
"(",
"nodes",
")",
"if",
"not",
"self",
".",
"has_hyperedge",
"(",
"frozen_nodes",
")",
":",
"raise",
"ValueError",
"(",
"\"No such hyperedge exists.\"",
")",
"return",
"self",
".",
"_node_set_to_hyperedge",
"[",
"frozen_nodes",
"]"
] | From a set of nodes, returns the ID of the hyperedge that this
set comprises.
:param nodes: iterable container of references to nodes in
the hyperedge to be added
:returns: str -- ID of the hyperedge that the specified
node set comprises.
:raises: ValueError -- No such hyperedge exists.
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> x = H.get_hyperedge_id(["A", "B", "C"]) | [
"From",
"a",
"set",
"of",
"nodes",
"returns",
"the",
"ID",
"of",
"the",
"hyperedge",
"that",
"this",
"set",
"comprises",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L592-L618 | train |
Murali-group/halp | halp/undirected_hypergraph.py | UndirectedHypergraph.get_hyperedge_attribute | def get_hyperedge_attribute(self, hyperedge_id, attribute_name):
# Note: Code unchanged from DirectedHypergraph
"""Given a hyperedge ID and the name of an attribute, get a copy
of that hyperedge's attribute.
:param hyperedge_id: ID of the hyperedge to retrieve the attribute of.
:param attribute_name: name of the attribute to retrieve.
:returns: attribute value of the attribute_name key for the
specified hyperedge.
:raises: ValueError -- No such hyperedge exists.
:raises: ValueError -- No such attribute exists.
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> attribute = H.get_hyperedge_attribute(hyperedge_ids[0])
"""
if not self.has_hyperedge_id(hyperedge_id):
raise ValueError("No such hyperedge exists.")
elif attribute_name not in self._hyperedge_attributes[hyperedge_id]:
raise ValueError("No such attribute exists.")
else:
return copy.\
copy(self._hyperedge_attributes[hyperedge_id][attribute_name]) | python | def get_hyperedge_attribute(self, hyperedge_id, attribute_name):
# Note: Code unchanged from DirectedHypergraph
"""Given a hyperedge ID and the name of an attribute, get a copy
of that hyperedge's attribute.
:param hyperedge_id: ID of the hyperedge to retrieve the attribute of.
:param attribute_name: name of the attribute to retrieve.
:returns: attribute value of the attribute_name key for the
specified hyperedge.
:raises: ValueError -- No such hyperedge exists.
:raises: ValueError -- No such attribute exists.
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> attribute = H.get_hyperedge_attribute(hyperedge_ids[0])
"""
if not self.has_hyperedge_id(hyperedge_id):
raise ValueError("No such hyperedge exists.")
elif attribute_name not in self._hyperedge_attributes[hyperedge_id]:
raise ValueError("No such attribute exists.")
else:
return copy.\
copy(self._hyperedge_attributes[hyperedge_id][attribute_name]) | [
"def",
"get_hyperedge_attribute",
"(",
"self",
",",
"hyperedge_id",
",",
"attribute_name",
")",
":",
"# Note: Code unchanged from DirectedHypergraph",
"if",
"not",
"self",
".",
"has_hyperedge_id",
"(",
"hyperedge_id",
")",
":",
"raise",
"ValueError",
"(",
"\"No such hyperedge exists.\"",
")",
"elif",
"attribute_name",
"not",
"in",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
":",
"raise",
"ValueError",
"(",
"\"No such attribute exists.\"",
")",
"else",
":",
"return",
"copy",
".",
"copy",
"(",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
"[",
"attribute_name",
"]",
")"
] | Given a hyperedge ID and the name of an attribute, get a copy
of that hyperedge's attribute.
:param hyperedge_id: ID of the hyperedge to retrieve the attribute of.
:param attribute_name: name of the attribute to retrieve.
:returns: attribute value of the attribute_name key for the
specified hyperedge.
:raises: ValueError -- No such hyperedge exists.
:raises: ValueError -- No such attribute exists.
Examples:
::
>>> H = UndirectedHypergraph()
>>> hyperedge_list = (["A", "B", "C"],
("A", "D"),
set(["B", "D"]))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> attribute = H.get_hyperedge_attribute(hyperedge_ids[0]) | [
"Given",
"a",
"hyperedge",
"ID",
"and",
"the",
"name",
"of",
"an",
"attribute",
"get",
"a",
"copy",
"of",
"that",
"hyperedge",
"s",
"attribute",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L620-L649 | train |
Murali-group/halp | halp/undirected_hypergraph.py | UndirectedHypergraph.get_hyperedge_attributes | def get_hyperedge_attributes(self, hyperedge_id):
"""Given a hyperedge ID, get a dictionary of copies of that hyperedge's
attributes.
:param hyperedge_id: ID of the hyperedge to retrieve the attributes of.
:returns: dict -- copy of each attribute of the specified hyperedge_id
(except the private __frozen_nodes entry).
:raises: ValueError -- No such hyperedge exists.
"""
if not self.has_hyperedge_id(hyperedge_id):
raise ValueError("No such hyperedge exists.")
dict_to_copy = self._hyperedge_attributes[hyperedge_id].items()
attributes = {}
for attr_name, attr_value in dict_to_copy:
if attr_name != "__frozen_nodes":
attributes[attr_name] = copy.copy(attr_value)
return attributes | python | def get_hyperedge_attributes(self, hyperedge_id):
"""Given a hyperedge ID, get a dictionary of copies of that hyperedge's
attributes.
:param hyperedge_id: ID of the hyperedge to retrieve the attributes of.
:returns: dict -- copy of each attribute of the specified hyperedge_id
(except the private __frozen_nodes entry).
:raises: ValueError -- No such hyperedge exists.
"""
if not self.has_hyperedge_id(hyperedge_id):
raise ValueError("No such hyperedge exists.")
dict_to_copy = self._hyperedge_attributes[hyperedge_id].items()
attributes = {}
for attr_name, attr_value in dict_to_copy:
if attr_name != "__frozen_nodes":
attributes[attr_name] = copy.copy(attr_value)
return attributes | [
"def",
"get_hyperedge_attributes",
"(",
"self",
",",
"hyperedge_id",
")",
":",
"if",
"not",
"self",
".",
"has_hyperedge_id",
"(",
"hyperedge_id",
")",
":",
"raise",
"ValueError",
"(",
"\"No such hyperedge exists.\"",
")",
"dict_to_copy",
"=",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
".",
"items",
"(",
")",
"attributes",
"=",
"{",
"}",
"for",
"attr_name",
",",
"attr_value",
"in",
"dict_to_copy",
":",
"if",
"attr_name",
"!=",
"\"__frozen_nodes\"",
":",
"attributes",
"[",
"attr_name",
"]",
"=",
"copy",
".",
"copy",
"(",
"attr_value",
")",
"return",
"attributes"
] | Given a hyperedge ID, get a dictionary of copies of that hyperedge's
attributes.
:param hyperedge_id: ID of the hyperedge to retrieve the attributes of.
:returns: dict -- copy of each attribute of the specified hyperedge_id
(except the private __frozen_nodes entry).
:raises: ValueError -- No such hyperedge exists. | [
"Given",
"a",
"hyperedge",
"ID",
"get",
"a",
"dictionary",
"of",
"copies",
"of",
"that",
"hyperedge",
"s",
"attributes",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L651-L668 | train |
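The attribute-copying loop in `get_hyperedge_attributes` above returns shallow copies of every attribute while filtering out the private `__frozen_nodes` entry. A hedged dict-comprehension equivalent of that filtering (a sketch, not the library's code):

```python
import copy

def public_attributes(attrs):
    # Copy every attribute except the private __frozen_nodes entry,
    # mirroring the filtering loop in get_hyperedge_attributes.
    return {name: copy.copy(value)
            for name, value in attrs.items()
            if name != "__frozen_nodes"}
```

The shallow copy means mutable values such as the stored node list come back as independent containers, so callers cannot mutate the hypergraph's internal state through the returned dict.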
Murali-group/halp | halp/undirected_hypergraph.py | UndirectedHypergraph.get_star | def get_star(self, node):
"""Given a node, get a copy of that node's star, that is, the set of
hyperedges that the node belongs to.
:param node: node to retrieve the star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's star.
:raises: ValueError -- No such node exists.
"""
if node not in self._node_attributes:
raise ValueError("No such node exists.")
return self._star[node].copy() | python | def get_star(self, node):
"""Given a node, get a copy of that node's star, that is, the set of
hyperedges that the node belongs to.
:param node: node to retrieve the star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's star.
:raises: ValueError -- No such node exists.
"""
if node not in self._node_attributes:
raise ValueError("No such node exists.")
return self._star[node].copy() | [
"def",
"get_star",
"(",
"self",
",",
"node",
")",
":",
"if",
"node",
"not",
"in",
"self",
".",
"_node_attributes",
":",
"raise",
"ValueError",
"(",
"\"No such node exists.\"",
")",
"return",
"self",
".",
"_star",
"[",
"node",
"]",
".",
"copy",
"(",
")"
] | Given a node, get a copy of that node's star, that is, the set of
hyperedges that the node belongs to.
:param node: node to retrieve the star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's star.
:raises: ValueError -- No such node exists. | [
"Given",
"a",
"node",
"get",
"a",
"copy",
"of",
"that",
"node",
"s",
"star",
"that",
"is",
"the",
"set",
"of",
"hyperedges",
"that",
"the",
"node",
"belongs",
"to",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/undirected_hypergraph.py#L691-L703 | train |
Murali-group/halp | halp/utilities/directed_statistics.py | _F_outdegree | def _F_outdegree(H, F):
"""Returns the result of a function F applied to the set of outdegrees in
in the hypergraph.
:param H: the hypergraph whose outdegrees will be operated on.
:param F: function to execute on the list of outdegrees in the hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_forward_star(node))
for node in H.get_node_set()]) | python | def _F_outdegree(H, F):
"""Returns the result of a function F applied to the set of outdegrees in
in the hypergraph.
:param H: the hypergraph whose outdegrees will be operated on.
:param F: function to execute on the list of outdegrees in the hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_forward_star(node))
for node in H.get_node_set()]) | [
"def",
"_F_outdegree",
"(",
"H",
",",
"F",
")",
":",
"if",
"not",
"isinstance",
"(",
"H",
",",
"DirectedHypergraph",
")",
":",
"raise",
"TypeError",
"(",
"\"Algorithm only applicable to directed hypergraphs\"",
")",
"return",
"F",
"(",
"[",
"len",
"(",
"H",
".",
"get_forward_star",
"(",
"node",
")",
")",
"for",
"node",
"in",
"H",
".",
"get_node_set",
"(",
")",
"]",
")"
] | Returns the result of a function F applied to the set of outdegrees
in the hypergraph.
:param H: the hypergraph whose outdegrees will be operated on.
:param F: function to execute on the list of outdegrees in the hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs | [
"Returns",
"the",
"result",
"of",
"a",
"function",
"F",
"applied",
"to",
"the",
"set",
"of",
"outdegrees",
"in",
"in",
"the",
"hypergraph",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/directed_statistics.py#L40-L54 | train |
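`_F_outdegree` above reduces to one pattern: build the list of per-node outdegrees, then hand it to an arbitrary aggregator `F` (`max`, `min`, `statistics.mean`, …). A dependency-free sketch using a plain dict in place of halp's `DirectedHypergraph` and its `get_forward_star` method:

```python
from statistics import mean

def f_outdegree(forward_star, F):
    # Apply F to the list of outdegrees; forward_star maps each node to
    # the set of hyperedge IDs leaving it (plain-dict stand-in for
    # DirectedHypergraph.get_forward_star).
    return F([len(hyperedges) for hyperedges in forward_star.values()])

star = {"A": {"e1", "e2"}, "B": {"e1"}, "C": set()}
```

With this `star`, `f_outdegree(star, max)` gives the maximum outdegree and `f_outdegree(star, mean)` the average; the indegree variant is identical with a backward-star mapping.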
Murali-group/halp | halp/utilities/directed_statistics.py | _F_indegree | def _F_indegree(H, F):
"""Returns the result of a function F applied to the list of indegrees in
in the hypergraph.
:param H: the hypergraph whose indegrees will be operated on.
:param F: function to execute on the list of indegrees in the hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_backward_star(node))
for node in H.get_node_set()]) | python | def _F_indegree(H, F):
"""Returns the result of a function F applied to the list of indegrees in
in the hypergraph.
:param H: the hypergraph whose indegrees will be operated on.
:param F: function to execute on the list of indegrees in the hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_backward_star(node))
for node in H.get_node_set()]) | [
"def",
"_F_indegree",
"(",
"H",
",",
"F",
")",
":",
"if",
"not",
"isinstance",
"(",
"H",
",",
"DirectedHypergraph",
")",
":",
"raise",
"TypeError",
"(",
"\"Algorithm only applicable to directed hypergraphs\"",
")",
"return",
"F",
"(",
"[",
"len",
"(",
"H",
".",
"get_backward_star",
"(",
"node",
")",
")",
"for",
"node",
"in",
"H",
".",
"get_node_set",
"(",
")",
"]",
")"
] | Returns the result of a function F applied to the list of indegrees
in the hypergraph.
:param H: the hypergraph whose indegrees will be operated on.
:param F: function to execute on the list of indegrees in the hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs | [
"Returns",
"the",
"result",
"of",
"a",
"function",
"F",
"applied",
"to",
"the",
"list",
"of",
"indegrees",
"in",
"in",
"the",
"hypergraph",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/directed_statistics.py#L105-L119 | train |
Murali-group/halp | halp/utilities/directed_statistics.py | _F_hyperedge_tail_cardinality | def _F_hyperedge_tail_cardinality(H, F):
"""Returns the result of a function F applied to the set of cardinalities
of hyperedge tails in the hypergraph.
:param H: the hypergraph whose tail cardinalities will be
operated on.
:param F: function to execute on the set of cardinalities in the
hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_hyperedge_tail(hyperedge_id))
for hyperedge_id in H.get_hyperedge_id_set()]) | python | def _F_hyperedge_tail_cardinality(H, F):
"""Returns the result of a function F applied to the set of cardinalities
of hyperedge tails in the hypergraph.
:param H: the hypergraph whose tail cardinalities will be
operated on.
:param F: function to execute on the set of cardinalities in the
hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_hyperedge_tail(hyperedge_id))
for hyperedge_id in H.get_hyperedge_id_set()]) | [
"def",
"_F_hyperedge_tail_cardinality",
"(",
"H",
",",
"F",
")",
":",
"if",
"not",
"isinstance",
"(",
"H",
",",
"DirectedHypergraph",
")",
":",
"raise",
"TypeError",
"(",
"\"Algorithm only applicable to directed hypergraphs\"",
")",
"return",
"F",
"(",
"[",
"len",
"(",
"H",
".",
"get_hyperedge_tail",
"(",
"hyperedge_id",
")",
")",
"for",
"hyperedge_id",
"in",
"H",
".",
"get_hyperedge_id_set",
"(",
")",
"]",
")"
] | Returns the result of a function F applied to the set of cardinalities
of hyperedge tails in the hypergraph.
:param H: the hypergraph whose tail cardinalities will be
operated on.
:param F: function to execute on the set of cardinalities in the
hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs | [
"Returns",
"the",
"result",
"of",
"a",
"function",
"F",
"applied",
"to",
"the",
"set",
"of",
"cardinalities",
"of",
"hyperedge",
"tails",
"in",
"the",
"hypergraph",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/directed_statistics.py#L170-L186 | train |
Murali-group/halp | halp/utilities/directed_statistics.py | _F_hyperedge_head_cardinality | def _F_hyperedge_head_cardinality(H, F):
"""Returns the result of a function F applied to the set of cardinalities
of hyperedge heads in the hypergraph.
:param H: the hypergraph whose head cardinalities will be
operated on.
:param F: function to execute on the set of cardinalities in the
hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_hyperedge_head(hyperedge_id))
for hyperedge_id in H.get_hyperedge_id_set()]) | python | def _F_hyperedge_head_cardinality(H, F):
"""Returns the result of a function F applied to the set of cardinalities
of hyperedge heads in the hypergraph.
:param H: the hypergraph whose head cardinalities will be
operated on.
:param F: function to execute on the set of cardinalities in the
hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
return F([len(H.get_hyperedge_head(hyperedge_id))
for hyperedge_id in H.get_hyperedge_id_set()]) | [
"def",
"_F_hyperedge_head_cardinality",
"(",
"H",
",",
"F",
")",
":",
"if",
"not",
"isinstance",
"(",
"H",
",",
"DirectedHypergraph",
")",
":",
"raise",
"TypeError",
"(",
"\"Algorithm only applicable to directed hypergraphs\"",
")",
"return",
"F",
"(",
"[",
"len",
"(",
"H",
".",
"get_hyperedge_head",
"(",
"hyperedge_id",
")",
")",
"for",
"hyperedge_id",
"in",
"H",
".",
"get_hyperedge_id_set",
"(",
")",
"]",
")"
] | Returns the result of a function F applied to the set of cardinalities
of hyperedge heads in the hypergraph.
:param H: the hypergraph whose head cardinalities will be
operated on.
:param F: function to execute on the set of cardinalities in the
hypergraph.
:returns: result of the given function F.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs | [
"Returns",
"the",
"result",
"of",
"a",
"function",
"F",
"applied",
"to",
"the",
"set",
"of",
"cardinalities",
"of",
"hyperedge",
"heads",
"in",
"the",
"hypergraph",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/directed_statistics.py#L240-L256 | train |
Murali-group/halp | halp/utilities/undirected_matrices.py | get_hyperedge_weight_matrix | def get_hyperedge_weight_matrix(H, hyperedge_ids_to_indices):
"""Creates the diagonal matrix W of hyperedge weights as a sparse matrix.
:param H: the hypergraph to find the weights of.
:param hyperedge_ids_to_indices: the mapping from hyperedge IDs to the
corresponding indices in the matrix.
:returns: sparse.csc_matrix -- the diagonal edge weight matrix as a
sparse matrix.
"""
# Combined 2 methods into 1; this could be written better
hyperedge_weights = {}
for hyperedge_id in H.hyperedge_id_iterator():
hyperedge_weights.update({hyperedge_ids_to_indices[hyperedge_id]:
H.get_hyperedge_weight(hyperedge_id)})
hyperedge_weight_vector = []
for i in range(len(hyperedge_weights.keys())):
hyperedge_weight_vector.append(hyperedge_weights.get(i))
return sparse.diags([hyperedge_weight_vector], [0]) | python | def get_hyperedge_weight_matrix(H, hyperedge_ids_to_indices):
"""Creates the diagonal matrix W of hyperedge weights as a sparse matrix.
:param H: the hypergraph to find the weights of.
:param hyperedge_ids_to_indices: the mapping from hyperedge IDs to the
corresponding indices in the matrix.
:returns: sparse.csc_matrix -- the diagonal edge weight matrix as a
sparse matrix.
"""
# Combined 2 methods into 1; this could be written better
hyperedge_weights = {}
for hyperedge_id in H.hyperedge_id_iterator():
hyperedge_weights.update({hyperedge_ids_to_indices[hyperedge_id]:
H.get_hyperedge_weight(hyperedge_id)})
hyperedge_weight_vector = []
for i in range(len(hyperedge_weights.keys())):
hyperedge_weight_vector.append(hyperedge_weights.get(i))
return sparse.diags([hyperedge_weight_vector], [0]) | [
"def",
"get_hyperedge_weight_matrix",
"(",
"H",
",",
"hyperedge_ids_to_indices",
")",
":",
"# Combined 2 methods into 1; this could be written better",
"hyperedge_weights",
"=",
"{",
"}",
"for",
"hyperedge_id",
"in",
"H",
".",
"hyperedge_id_iterator",
"(",
")",
":",
"hyperedge_weights",
".",
"update",
"(",
"{",
"hyperedge_ids_to_indices",
"[",
"hyperedge_id",
"]",
":",
"H",
".",
"get_hyperedge_weight",
"(",
"hyperedge_id",
")",
"}",
")",
"hyperedge_weight_vector",
"=",
"[",
"]",
"for",
"i",
"in",
"range",
"(",
"len",
"(",
"hyperedge_weights",
".",
"keys",
"(",
")",
")",
")",
":",
"hyperedge_weight_vector",
".",
"append",
"(",
"hyperedge_weights",
".",
"get",
"(",
"i",
")",
")",
"return",
"sparse",
".",
"diags",
"(",
"[",
"hyperedge_weight_vector",
"]",
",",
"[",
"0",
"]",
")"
] | Creates the diagonal matrix W of hyperedge weights as a sparse matrix.
:param H: the hypergraph to find the weights.
:param hyperedge_weights: the mapping from the indices of hyperedge IDs to
the corresponding hyperedge weights.
:returns: sparse.csc_matrix -- the diagonal edge weight matrix as a
sparse matrix. | [
"Creates",
"the",
"diagonal",
"matrix",
"W",
"of",
"hyperedge",
"weights",
"as",
"a",
"sparse",
"matrix",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/undirected_matrices.py#L103-L123 | train |
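`get_hyperedge_weight_matrix` above packs the weights, keyed by matrix index, into an index-ordered vector and hands it to `scipy.sparse.diags`. A dependency-free sketch of the same packing, using dense nested lists as a stand-in for the sparse result (function name is illustrative):

```python
def diagonal_weight_matrix(weights_by_index):
    # Order the weights by their matrix index 0..n-1, then place them on
    # the main diagonal of a dense n x n matrix (dense stand-in for
    # scipy.sparse.diags([vector], [0])).
    n = len(weights_by_index)
    vector = [weights_by_index[i] for i in range(n)]
    return [[vector[i] if i == j else 0 for j in range(n)]
            for i in range(n)]
```

The point of the index-ordered vector is that entry i of the diagonal must line up with column i of the incidence matrix, regardless of the iteration order of the weight mapping.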
Murali-group/halp | halp/utilities/undirected_matrices.py | get_hyperedge_degree_matrix | def get_hyperedge_degree_matrix(M):
"""Creates the diagonal matrix of hyperedge degrees D_e as a sparse matrix,
where a hyperedge degree is the cardinality of the hyperedge.
:param M: the incidence matrix of the hypergraph to find the D_e matrix on.
:returns: sparse.csc_matrix -- the diagonal hyperedge degree matrix as a
sparse matrix.
"""
degrees = M.sum(0).transpose()
new_degree = []
for degree in degrees:
new_degree.append(int(degree[0:]))
return sparse.diags([new_degree], [0]) | python | def get_hyperedge_degree_matrix(M):
"""Creates the diagonal matrix of hyperedge degrees D_e as a sparse matrix,
where a hyperedge degree is the cardinality of the hyperedge.
:param M: the incidence matrix of the hypergraph to find the D_e matrix on.
:returns: sparse.csc_matrix -- the diagonal hyperedge degree matrix as a
sparse matrix.
"""
degrees = M.sum(0).transpose()
new_degree = []
for degree in degrees:
new_degree.append(int(degree[0:]))
return sparse.diags([new_degree], [0]) | [
"def",
"get_hyperedge_degree_matrix",
"(",
"M",
")",
":",
"degrees",
"=",
"M",
".",
"sum",
"(",
"0",
")",
".",
"transpose",
"(",
")",
"new_degree",
"=",
"[",
"]",
"for",
"degree",
"in",
"degrees",
":",
"new_degree",
".",
"append",
"(",
"int",
"(",
"degree",
"[",
"0",
":",
"]",
")",
")",
"return",
"sparse",
".",
"diags",
"(",
"[",
"new_degree",
"]",
",",
"[",
"0",
"]",
")"
] | Creates the diagonal matrix of hyperedge degrees D_e as a sparse matrix,
where a hyperedge degree is the cardinality of the hyperedge.
:param M: the incidence matrix of the hypergraph to find the D_e matrix on.
:returns: sparse.csc_matrix -- the diagonal hyperedge degree matrix as a
sparse matrix. | [
"Creates",
"the",
"diagonal",
"matrix",
"of",
"hyperedge",
"degrees",
"D_e",
"as",
"a",
"sparse",
"matrix",
"where",
"a",
"hyperedge",
"degree",
"is",
"the",
"cardinality",
"of",
"the",
"hyperedge",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/undirected_matrices.py#L126-L140 | train |
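`get_hyperedge_degree_matrix` takes column sums of the incidence matrix M, since each column of M has a 1 for every node the corresponding hyperedge contains. The same computation, sketched in pure Python on a dense 0/1 matrix (an illustrative assumption; halp operates on `scipy.sparse` matrices):

```python
def hyperedge_degrees(incidence_rows):
    # Rows index nodes, columns index hyperedges; a column sum is
    # the cardinality (degree) of that hyperedge.
    return [sum(column) for column in zip(*incidence_rows)]

M = [[1, 0],   # node 0 is in hyperedge 0
     [1, 1],   # node 1 is in both hyperedges
     [0, 1]]   # node 2 is in hyperedge 1
degrees = hyperedge_degrees(M)
```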
Murali-group/halp | halp/utilities/undirected_matrices.py | fast_inverse | def fast_inverse(M):
"""Computes the inverse of a diagonal matrix.
:param M: the diagonal matrix to find the inverse of.
:returns: sparse.csc_matrix -- the inverse of the input matrix as a
sparse matrix.
"""
diags = M.diagonal()
new_diag = []
for value in diags:
new_diag.append(1.0/value)
return sparse.diags([new_diag], [0]) | python | def fast_inverse(M):
"""Computes the inverse of a diagonal matrix.
:param M: the diagonal matrix to find the inverse of.
:returns: sparse.csc_matrix -- the inverse of the input matrix as a
sparse matrix.
"""
diags = M.diagonal()
new_diag = []
for value in diags:
new_diag.append(1.0/value)
return sparse.diags([new_diag], [0]) | [
"def",
"fast_inverse",
"(",
"M",
")",
":",
"diags",
"=",
"M",
".",
"diagonal",
"(",
")",
"new_diag",
"=",
"[",
"]",
"for",
"value",
"in",
"diags",
":",
"new_diag",
".",
"append",
"(",
"1.0",
"/",
"value",
")",
"return",
"sparse",
".",
"diags",
"(",
"[",
"new_diag",
"]",
",",
"[",
"0",
"]",
")"
] | Computes the inverse of a diagonal matrix.
:param M: the diagonal matrix to find the inverse of.
:returns: sparse.csc_matrix -- the inverse of the input matrix as a
sparse matrix. | [
"Computes",
"the",
"inverse",
"of",
"a",
"diagonal",
"matrix",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/undirected_matrices.py#L143-L156 | train |
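`fast_inverse` exploits the fact that the inverse of a diagonal matrix is the diagonal matrix of elementwise reciprocals, avoiding a general sparse solve. The core identity, sketched with plain lists (illustrative only; it assumes no zero diagonal entries):

```python
def diagonal_inverse(diagonal):
    # (diag(d))^-1 = diag(1/d_1, ..., 1/d_n); undefined for zero entries.
    return [1.0 / value for value in diagonal]
```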
Murali-group/halp | halp/signaling_hypergraph.py | node_iterator | def node_iterator(self):
"""Provides an iterator over the nodes.
"""
return iter(self._node_attributes)
def has_hypernode(self, hypernode):
"""Determines if a specific hypernode is present in the hypergraph.
:param hypernode: reference to hypernode whose presence is being checked.
:returns: bool -- true iff the node exists in the hypergraph.
"""
return hypernode in self._hypernode_attributes | python | def node_iterator(self):
"""Provides an iterator over the nodes.
"""
return iter(self._node_attributes)
def has_hypernode(self, hypernode):
"""Determines if a specific hypernode is present in the hypergraph.
:param hypernode: reference to hypernode whose presence is being checked.
:returns: bool -- true iff the node exists in the hypergraph.
"""
return hypernode in self._hypernode_attributes | [
"def",
"node_iterator",
"(",
"self",
")",
":",
"return",
"iter",
"(",
"self",
".",
"_node_attributes",
")",
"def",
"has_hypernode",
"(",
"self",
",",
"hypernode",
")",
":",
"\"\"\"Determines if a specific hypernode is present in the hypergraph.\n\n    :param hypernode: reference to hypernode whose presence is being checked.\n    :returns: bool -- true iff the node exists in the hypergraph.\n\n    \"\"\"",
"return",
"hypernode",
"in",
"self",
".",
"_hypernode_attributes"
] | Provides an iterator over the nodes. | [
"Provides",
"an",
"iterator",
"over",
"the",
"nodes",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/signaling_hypergraph.py#L243-L256 | train |
Murali-group/halp | halp/signaling_hypergraph.py | add_hypernode | def add_hypernode(self, hypernode, composing_nodes=set(), attr_dict=None, **attr):
"""Adds a hypernode to the graph, along with any related attributes
of the hypernode.
:param hypernode: reference to the hypernode being added.
:param nodes: reference to the set of nodes that compose
the hypernode.
:param in_hypernodes: set of references to the hypernodes that the
node being added is a member of.
:param attr_dict: dictionary of attributes of the node.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# If the hypernode hasn't previously been added, add it along
# with its attributes
if not self.has_hypernode(hypernode):
attr_dict["__composing_nodes"] = composing_nodes
added_nodes = composing_nodes
removed_nodes = set()
self._hypernode_attributes[hypernode] = attr_dict
# Otherwise, just update the hypernode's attributes
else:
self._hypernode_attributes[hypernode].update(attr_dict)
added_nodes = composing_nodes - self._hypernode_attributes\
[hypernode]["__composing_nodes"]
removed_nodes = self._hypernode_attributes\
[hypernode]["__composing_nodes"] - composing_nodes
# For every "composing node" added to this hypernode, update
# those nodes attributes to be members of this hypernode
for node in added_nodes:
_add_hypernode_membership(node, hypernode)
# For every "composing node" removed from this hypernode, update
# those nodes attributes to no longer be members of this hypernode
for node in removed_nodes:
_remove_hypernode_membership(node, hypernode) | python | def add_hypernode(self, hypernode, composing_nodes=set(), attr_dict=None, **attr):
"""Adds a hypernode to the graph, along with any related attributes
of the hypernode.
:param hypernode: reference to the hypernode being added.
:param nodes: reference to the set of nodes that compose
the hypernode.
:param in_hypernodes: set of references to the hypernodes that the
node being added is a member of.
:param attr_dict: dictionary of attributes of the node.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# If the hypernode hasn't previously been added, add it along
# with its attributes
if not self.has_hypernode(hypernode):
attr_dict["__composing_nodes"] = composing_nodes
added_nodes = composing_nodes
removed_nodes = set()
self._hypernode_attributes[hypernode] = attr_dict
# Otherwise, just update the hypernode's attributes
else:
self._hypernode_attributes[hypernode].update(attr_dict)
added_nodes = composing_nodes - self._hypernode_attributes\
[hypernode]["__composing_nodes"]
removed_nodes = self._hypernode_attributes\
[hypernode]["__composing_nodes"] - composing_nodes
# For every "composing node" added to this hypernode, update
# those nodes attributes to be members of this hypernode
for node in added_nodes:
_add_hypernode_membership(node, hypernode)
# For every "composing node" removed from this hypernode, update
# those nodes attributes to no longer be members of this hypernode
for node in removed_nodes:
_remove_hypernode_membership(node, hypernode) | [
"def",
"add_hypernode",
"(",
"self",
",",
"hypernode",
",",
"composing_nodes",
"=",
"set",
"(",
")",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"# If the hypernode hasn't previously been added, add it along",
"# with its attributes",
"if",
"not",
"self",
".",
"has_hypernode",
"(",
"hypernode",
")",
":",
"attr_dict",
"[",
"\"__composing_nodes\"",
"]",
"=",
"composing_nodes",
"added_nodes",
"=",
"composing_nodes",
"removed_nodes",
"=",
"set",
"(",
")",
"self",
".",
"_hypernode_attributes",
"[",
"hypernode",
"]",
"=",
"attr_dict",
"# Otherwise, just update the hypernode's attributes",
"else",
":",
"self",
".",
"_hypernode_attributes",
"[",
"hypernode",
"]",
".",
"update",
"(",
"attr_dict",
")",
"added_nodes",
"=",
"composing_nodes",
"-",
"self",
".",
"_hypernode_attributes",
"[",
"hypernode",
"]",
"[",
"\"__composing_nodes\"",
"]",
"removed_nodes",
"=",
"self",
".",
"_hypernode_attributes",
"[",
"hypernode",
"]",
"[",
"\"__composing_nodes\"",
"]",
"-",
"composing_nodes",
"# For every \"composing node\" added to this hypernode, update",
"# those nodes attributes to be members of this hypernode",
"for",
"node",
"in",
"added_nodes",
":",
"_add_hypernode_membership",
"(",
"node",
",",
"hypernode",
")",
"# For every \"composing node\" removed from this hypernode, update",
"# those nodes attributes to no longer be members of this hypernode",
"for",
"node",
"in",
"removed_nodes",
":",
"_remove_hypernode_membership",
"(",
"node",
",",
"hypernode",
")"
] | Adds a hypernode to the graph, along with any related attributes
of the hypernode.
:param hypernode: reference to the hypernode being added.
:param nodes: reference to the set of nodes that compose
the hypernode.
:param in_hypernodes: set of references to the hypernodes that the
node being added is a member of.
:param attr_dict: dictionary of attributes of the node.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided. | [
"Adds",
"a",
"hypernode",
"to",
"the",
"graph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"hypernode",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/signaling_hypergraph.py#L305-L344 | train |
Murali-group/halp | halp/algorithms/undirected_partitioning.py | _create_random_starter | def _create_random_starter(node_count):
"""Creates the random starter for the random walk.
:param node_count: number of nodes to create the random vector.
:returns: list -- list of starting probabilities for each node.
"""
pi = np.zeros(node_count, dtype=float)
for i in range(node_count):
pi[i] = random.random()
summation = np.sum(pi)
for i in range(node_count):
pi[i] = pi[i] / summation
return pi | python | def _create_random_starter(node_count):
"""Creates the random starter for the random walk.
:param node_count: number of nodes to create the random vector.
:returns: list -- list of starting probabilities for each node.
"""
pi = np.zeros(node_count, dtype=float)
for i in range(node_count):
pi[i] = random.random()
summation = np.sum(pi)
for i in range(node_count):
pi[i] = pi[i] / summation
return pi | [
"def",
"_create_random_starter",
"(",
"node_count",
")",
":",
"pi",
"=",
"np",
".",
"zeros",
"(",
"node_count",
",",
"dtype",
"=",
"float",
")",
"for",
"i",
"in",
"range",
"(",
"node_count",
")",
":",
"pi",
"[",
"i",
"]",
"=",
"random",
".",
"random",
"(",
")",
"summation",
"=",
"np",
".",
"sum",
"(",
"pi",
")",
"for",
"i",
"in",
"range",
"(",
"node_count",
")",
":",
"pi",
"[",
"i",
"]",
"=",
"pi",
"[",
"i",
"]",
"/",
"summation",
"return",
"pi"
] | Creates the random starter for the random walk.
:param node_count: number of nodes to create the random vector.
:returns: list -- list of starting probabilities for each node. | [
"Creates",
"the",
"random",
"starter",
"for",
"the",
"random",
"walk",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/algorithms/undirected_partitioning.py#L188-L202 | train |
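`_create_random_starter` draws a uniform value per node and divides by the total so the entries form a probability distribution. A stdlib sketch of the same idea, using a seeded generator for reproducibility (the seed parameter is my addition, not in the original):

```python
import random

def random_distribution(node_count, seed=0):
    rng = random.Random(seed)
    pi = [rng.random() for _ in range(node_count)]
    total = sum(pi)
    # Dividing each entry by the total normalizes the vector to sum to 1.
    return [value / total for value in pi]
```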
Murali-group/halp | halp/algorithms/undirected_partitioning.py | _has_converged | def _has_converged(pi_star, pi):
"""Checks if the random walk has converged.
:param pi_star: the new vector
:param pi: the old vector
:returns: bool-- True iff pi has converged.
"""
node_count = pi.shape[0]
EPS = 10e-6
for i in range(node_count):
if pi[i] - pi_star[i] > EPS:
return False
return True | python | def _has_converged(pi_star, pi):
"""Checks if the random walk has converged.
:param pi_star: the new vector
:param pi: the old vector
:returns: bool-- True iff pi has converged.
"""
node_count = pi.shape[0]
EPS = 10e-6
for i in range(node_count):
if pi[i] - pi_star[i] > EPS:
return False
return True | [
"def",
"_has_converged",
"(",
"pi_star",
",",
"pi",
")",
":",
"node_count",
"=",
"pi",
".",
"shape",
"[",
"0",
"]",
"EPS",
"=",
"10e-6",
"for",
"i",
"in",
"range",
"(",
"node_count",
")",
":",
"if",
"pi",
"[",
"i",
"]",
"-",
"pi_star",
"[",
"i",
"]",
">",
"EPS",
":",
"return",
"False",
"return",
"True"
] | Checks if the random walk has converged.
:param pi_star: the new vector
:param pi: the old vector
:returns: bool-- True iff pi has converged. | [
"Checks",
"if",
"the",
"random",
"walk",
"has",
"converged",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/algorithms/undirected_partitioning.py#L205-L219 | train |
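`_has_converged` declares convergence when no component of the old vector exceeds the new one by more than EPS. Note the source compares the signed difference `pi[i] - pi_star[i]`; the sketch below uses the absolute difference, which is my assumption of the usual symmetric convergence test rather than halp's exact check:

```python
def has_converged(new_vector, old_vector, eps=1e-6):
    # Converged when no component moved by more than eps in either direction.
    return all(abs(old - new) <= eps
               for old, new in zip(old_vector, new_vector))
```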
Murali-group/halp | halp/utilities/priority_queue.py | PriorityQueue.add_element | def add_element(self, priority, element, count=None):
"""Adds an element with a specific priority.
:param priority: priority of the element.
:param element: element to add.
"""
if count is None:
count = next(self.counter)
entry = [priority, count, element]
self.element_finder[element] = entry
heapq.heappush(self.pq, entry) | python | def add_element(self, priority, element, count=None):
"""Adds an element with a specific priority.
:param priority: priority of the element.
:param element: element to add.
"""
if count is None:
count = next(self.counter)
entry = [priority, count, element]
self.element_finder[element] = entry
heapq.heappush(self.pq, entry) | [
"def",
"add_element",
"(",
"self",
",",
"priority",
",",
"element",
",",
"count",
"=",
"None",
")",
":",
"if",
"count",
"is",
"None",
":",
"count",
"=",
"next",
"(",
"self",
".",
"counter",
")",
"entry",
"=",
"[",
"priority",
",",
"count",
",",
"element",
"]",
"self",
".",
"element_finder",
"[",
"element",
"]",
"=",
"entry",
"heapq",
".",
"heappush",
"(",
"self",
".",
"pq",
",",
"entry",
")"
] | Adds an element with a specific priority.
:param priority: priority of the element.
:param element: element to add. | [
"Adds",
"an",
"element",
"with",
"a",
"specific",
"priority",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/priority_queue.py#L41-L52 | train |
Murali-group/halp | halp/utilities/priority_queue.py | PriorityQueue.reprioritize | def reprioritize(self, priority, element):
"""Updates the priority of an element.
:raises: ValueError -- No such element in the priority queue.
"""
if element not in self.element_finder:
raise ValueError("No such element in the priority queue.")
entry = self.element_finder[element]
self.add_element(priority, element, entry[1])
entry[1] = self.INVALID | python | def reprioritize(self, priority, element):
"""Updates the priority of an element.
:raises: ValueError -- No such element in the priority queue.
"""
if element not in self.element_finder:
raise ValueError("No such element in the priority queue.")
entry = self.element_finder[element]
self.add_element(priority, element, entry[1])
entry[1] = self.INVALID | [
"def",
"reprioritize",
"(",
"self",
",",
"priority",
",",
"element",
")",
":",
"if",
"element",
"not",
"in",
"self",
".",
"element_finder",
":",
"raise",
"ValueError",
"(",
"\"No such element in the priority queue.\"",
")",
"entry",
"=",
"self",
".",
"element_finder",
"[",
"element",
"]",
"self",
".",
"add_element",
"(",
"priority",
",",
"element",
",",
"entry",
"[",
"1",
"]",
")",
"entry",
"[",
"1",
"]",
"=",
"self",
".",
"INVALID"
] | Updates the priority of an element.
:raises: ValueError -- No such element in the priority queue. | [
"Updates",
"the",
"priority",
"of",
"an",
"element",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/priority_queue.py#L79-L89 | train |
Murali-group/halp | halp/utilities/priority_queue.py | PriorityQueue.contains_element | def contains_element(self, element):
"""Determines if an element is contained in the priority queue.
:returns: bool -- true iff element is in the priority queue.
"""
return (element in self.element_finder) and \
(self.element_finder[element][1] != self.INVALID) | python | def contains_element(self, element):
"""Determines if an element is contained in the priority queue.
:returns: bool -- true iff element is in the priority queue.
"""
return (element in self.element_finder) and \
(self.element_finder[element][1] != self.INVALID) | [
"def",
"contains_element",
"(",
"self",
",",
"element",
")",
":",
"return",
"(",
"element",
"in",
"self",
".",
"element_finder",
")",
"and",
"(",
"self",
".",
"element_finder",
"[",
"element",
"]",
"[",
"1",
"]",
"!=",
"self",
".",
"INVALID",
")"
] | Determines if an element is contained in the priority queue.
:returns: bool -- true iff element is in the priority queue. | [
"Determines",
"if",
"an",
"element",
"is",
"contained",
"in",
"the",
"priority",
"queue",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/priority_queue.py#L102-L109 | train |
Murali-group/halp | halp/utilities/priority_queue.py | PriorityQueue.is_empty | def is_empty(self):
"""Determines if the priority queue has any elements.
Performs removal of any elements that were "marked-as-invalid".
:returns: true iff the priority queue has no elements.
"""
while self.pq:
if self.pq[0][1] != self.INVALID:
return False
else:
_, _, element = heapq.heappop(self.pq)
if element in self.element_finder:
del self.element_finder[element]
return True | python | def is_empty(self):
"""Determines if the priority queue has any elements.
Performs removal of any elements that were "marked-as-invalid".
:returns: true iff the priority queue has no elements.
"""
while self.pq:
if self.pq[0][1] != self.INVALID:
return False
else:
_, _, element = heapq.heappop(self.pq)
if element in self.element_finder:
del self.element_finder[element]
return True | [
"def",
"is_empty",
"(",
"self",
")",
":",
"while",
"self",
".",
"pq",
":",
"if",
"self",
".",
"pq",
"[",
"0",
"]",
"[",
"1",
"]",
"!=",
"self",
".",
"INVALID",
":",
"return",
"False",
"else",
":",
"_",
",",
"_",
",",
"element",
"=",
"heapq",
".",
"heappop",
"(",
"self",
".",
"pq",
")",
"if",
"element",
"in",
"self",
".",
"element_finder",
":",
"del",
"self",
".",
"element_finder",
"[",
"element",
"]",
"return",
"True"
] | Determines if the priority queue has any elements.
Performs removal of any elements that were "marked-as-invalid".
:returns: true iff the priority queue has no elements. | [
"Determines",
"if",
"the",
"priority",
"queue",
"has",
"any",
"elements",
".",
"Performs",
"removal",
"of",
"any",
"elements",
"that",
"were",
"marked",
"-",
"as",
"-",
"invalid",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/priority_queue.py#L111-L125 | train |
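The `PriorityQueue` methods above (`add_element`, `reprioritize`, `contains_element`, `is_empty`) follow the standard `heapq` pattern of lazy deletion: a reprioritized element's old entry is marked invalid in place and simply skipped when it later surfaces at the heap root. A minimal self-contained sketch of that pattern (class and method names here are illustrative, not halp's):

```python
import heapq
import itertools

class LazyPriorityQueue:
    INVALID = -1

    def __init__(self):
        self._heap = []
        self._finder = {}              # element -> its live heap entry
        self._counter = itertools.count()

    def add(self, priority, element):
        entry = [priority, next(self._counter), element]
        self._finder[element] = entry
        heapq.heappush(self._heap, entry)

    def reprioritize(self, priority, element):
        # Mark the old entry stale instead of searching the heap for it.
        self._finder[element][1] = self.INVALID
        self.add(priority, element)

    def pop(self):
        while self._heap:
            priority, count, element = heapq.heappop(self._heap)
            if count != self.INVALID:  # skip stale, marked-invalid entries
                del self._finder[element]
                return element
        raise KeyError("pop from an empty priority queue")
```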
Murali-group/halp | halp/algorithms/directed_paths.py | is_connected | def is_connected(H, source_node, target_node):
"""Checks if a target node is connected to a source node. That is,
this method determines if a target node can be visited from the source
node in the sense of the 'Visit' algorithm.
Refer to 'visit's documentation for more details.
:param H: the hypergraph to check connectedness on.
:param source_node: the node to check connectedness to.
:param target_node: the node to check connectedness of.
:returns: bool -- whether target_node can be visited from source_node.
"""
visited_nodes, Pv, Pe = visit(H, source_node)
return target_node in visited_nodes | python | def is_connected(H, source_node, target_node):
"""Checks if a target node is connected to a source node. That is,
this method determines if a target node can be visited from the source
node in the sense of the 'Visit' algorithm.
Refer to 'visit's documentation for more details.
:param H: the hypergraph to check connectedness on.
:param source_node: the node to check connectedness to.
:param target_node: the node to check connectedness of.
:returns: bool -- whether target_node can be visited from source_node.
"""
visited_nodes, Pv, Pe = visit(H, source_node)
return target_node in visited_nodes | [
"def",
"is_connected",
"(",
"H",
",",
"source_node",
",",
"target_node",
")",
":",
"visited_nodes",
",",
"Pv",
",",
"Pe",
"=",
"visit",
"(",
"H",
",",
"source_node",
")",
"return",
"target_node",
"in",
"visited_nodes"
] | Checks if a target node is connected to a source node. That is,
this method determines if a target node can be visited from the source
node in the sense of the 'Visit' algorithm.
Refer to 'visit's documentation for more details.
:param H: the hypergraph to check connectedness on.
:param source_node: the node to check connectedness to.
:param target_node: the node to check connectedness of.
:returns: bool -- whether target_node can be visited from source_node. | [
"Checks",
"if",
"a",
"target",
"node",
"is",
"connected",
"to",
"a",
"source",
"node",
".",
"That",
"is",
"this",
"method",
"determines",
"if",
"a",
"target",
"node",
"can",
"be",
"visited",
"from",
"the",
"source",
"node",
"in",
"the",
"sense",
"of",
"the",
"Visit",
"algorithm",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/algorithms/directed_paths.py#L84-L98 | train |
Murali-group/halp | halp/algorithms/directed_paths.py | is_b_connected | def is_b_connected(H, source_node, target_node):
"""Checks if a target node is B-connected to a source node.
A node t is B-connected to a node s iff:
- t is s, or
- there exists an edge in the backward star of t such that all nodes in
the tail of that edge are B-connected to s
In other words, this method determines if a target node can be B-visited
from the source node in the sense of the 'B-Visit' algorithm. Refer to
'b_visit's documentation for more details.
:param H: the hypergraph to check B-connectedness on.
:param source_node: the node to check B-connectedness to.
:param target_node: the node to check B-connectedness of.
:returns: bool -- whether target_node can be visited from source_node.
"""
b_visited_nodes, Pv, Pe, v = b_visit(H, source_node)
return target_node in b_visited_nodes | python | def is_b_connected(H, source_node, target_node):
"""Checks if a target node is B-connected to a source node.
A node t is B-connected to a node s iff:
- t is s, or
- there exists an edge in the backward star of t such that all nodes in
the tail of that edge are B-connected to s
In other words, this method determines if a target node can be B-visited
from the source node in the sense of the 'B-Visit' algorithm. Refer to
'b_visit's documentation for more details.
:param H: the hypergraph to check B-connectedness on.
:param source_node: the node to check B-connectedness to.
:param target_node: the node to check B-connectedness of.
:returns: bool -- whether target_node can be visited from source_node.
"""
b_visited_nodes, Pv, Pe, v = b_visit(H, source_node)
return target_node in b_visited_nodes | [
"def",
"is_b_connected",
"(",
"H",
",",
"source_node",
",",
"target_node",
")",
":",
"b_visited_nodes",
",",
"Pv",
",",
"Pe",
",",
"v",
"=",
"b_visit",
"(",
"H",
",",
"source_node",
")",
"return",
"target_node",
"in",
"b_visited_nodes"
] | Checks if a target node is B-connected to a source node.
A node t is B-connected to a node s iff:
- t is s, or
- there exists an edge in the backward star of t such that all nodes in
the tail of that edge are B-connected to s
In other words, this method determines if a target node can be B-visited
from the source node in the sense of the 'B-Visit' algorithm. Refer to
'b_visit's documentation for more details.
:param H: the hypergraph to check B-connectedness on.
:param source_node: the node to check B-connectedness to.
:param target_node: the node to check B-connectedness of.
:returns: bool -- whether target_node can be visited from source_node. | [
"Checks",
"if",
"a",
"target",
"node",
"is",
"B",
"-",
"connected",
"to",
"a",
"source",
"node",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/algorithms/directed_paths.py#L215-L234 | train |
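B-connectivity as described above can be checked with a worklist algorithm: a hyperedge "fires" once every node in its tail has been visited, at which point its head nodes become visited too. A self-contained sketch on a toy representation where each hyperedge is a `(tail_set, head_set)` pair (this representation is an assumption for illustration; halp's `b_visit` works on its `DirectedHypergraph` class):

```python
from collections import deque

def b_visited_nodes(hyperedges, source):
    """hyperedges: list of (tail_set, head_set) pairs."""
    visited = {source}
    # How many tail nodes of each hyperedge are still unvisited.
    remaining = [len(tail) for tail, _ in hyperedges]
    queue = deque([source])
    while queue:
        node = queue.popleft()
        for i, (tail, head) in enumerate(hyperedges):
            if node in tail:
                remaining[i] -= 1
                if remaining[i] == 0:      # whole tail visited: edge fires
                    for successor in head:
                        if successor not in visited:
                            visited.add(successor)
                            queue.append(successor)
    return visited

edges = [({"s"}, {"a"}),          # fires immediately from s
         ({"s", "a"}, {"b"}),     # fires only once both s and a are visited
         ({"c"}, {"d"})]          # unreachable from s
```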
Murali-group/halp | halp/algorithms/directed_paths.py | is_f_connected | def is_f_connected(H, source_node, target_node):
"""Checks if a target node is F-connected to a source node.
A node t is F-connected to a node s iff s is B-connected to t.
Refer to 'f_visit's or 'is_b_connected's documentation for more details.
:param H: the hypergraph to check F-connectedness on.
:param source_node: the node to check F-connectedness to.
:param target_node: the node to check F-connectedness of.
:returns: bool -- whether target_node can be visited from source_node.
"""
f_visited_nodes, Pv, Pe, v = f_visit(H, source_node)
return target_node in f_visited_nodes | python | def is_f_connected(H, source_node, target_node):
"""Checks if a target node is F-connected to a source node.
A node t is F-connected to a node s iff s is B-connected to t.
Refer to 'f_visit's or 'is_b_connected's documentation for more details.
:param H: the hypergraph to check F-connectedness on.
:param source_node: the node to check F-connectedness to.
:param target_node: the node to check F-connectedness of.
:returns: bool -- whether target_node can be visited from source_node.
"""
f_visited_nodes, Pv, Pe, v = f_visit(H, source_node)
return target_node in f_visited_nodes | [
"def",
"is_f_connected",
"(",
"H",
",",
"source_node",
",",
"target_node",
")",
":",
"f_visited_nodes",
",",
"Pv",
",",
"Pe",
",",
"v",
"=",
"f_visit",
"(",
"H",
",",
"source_node",
")",
"return",
"target_node",
"in",
"f_visited_nodes"
] | Checks if a target node is F-connected to a source node.
A node t is F-connected to a node s iff s is B-connected to t.
Refer to 'f_visit's or 'is_b_connected's documentation for more details.
:param H: the hypergraph to check F-connectedness on.
:param source_node: the node to check F-connectedness to.
:param target_node: the node to check F-connectedness of.
:returns: bool -- whether target_node can be visited from source_node. | [
"Checks",
"if",
"a",
"target",
"node",
"is",
"F",
"-",
"connected",
"to",
"a",
"source",
"node",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/algorithms/directed_paths.py#L263-L276 | train |
Murali-group/halp | halp/utilities/undirected_graph_transformations.py | from_networkx_graph | def from_networkx_graph(nx_graph):
"""Returns an UndirectedHypergraph object that is the graph equivalent of
the given NetworkX Graph object.
:param nx_graph: the NetworkX undirected graph object to transform.
:returns: UndirectedHypergraph -- hypergraph object equivalent to the
NetworkX undirected graph.
:raises: TypeError -- Transformation only applicable to undirected
NetworkX graphs
"""
import networkx as nx
if not isinstance(nx_graph, nx.Graph):
raise TypeError("Transformation only applicable to undirected \
NetworkX graphs")
G = UndirectedHypergraph()
for node in nx_graph.nodes_iter():
G.add_node(node, copy.copy(nx_graph.node[node]))
for edge in nx_graph.edges_iter():
G.add_hyperedge([edge[0], edge[1]],
copy.copy(nx_graph[edge[0]][edge[1]]))
return G | python | def from_networkx_graph(nx_graph):
"""Returns an UndirectedHypergraph object that is the graph equivalent of
the given NetworkX Graph object.
:param nx_graph: the NetworkX undirected graph object to transform.
:returns: UndirectedHypergraph -- hypergraph object equivalent to the
NetworkX undirected graph.
:raises: TypeError -- Transformation only applicable to undirected
NetworkX graphs
"""
import networkx as nx
if not isinstance(nx_graph, nx.Graph):
raise TypeError("Transformation only applicable to undirected \
NetworkX graphs")
G = UndirectedHypergraph()
for node in nx_graph.nodes_iter():
G.add_node(node, copy.copy(nx_graph.node[node]))
for edge in nx_graph.edges_iter():
G.add_hyperedge([edge[0], edge[1]],
copy.copy(nx_graph[edge[0]][edge[1]]))
return G | [
"def",
"from_networkx_graph",
"(",
"nx_graph",
")",
":",
"import",
"networkx",
"as",
"nx",
"if",
"not",
"isinstance",
"(",
"nx_graph",
",",
"nx",
".",
"Graph",
")",
":",
"raise",
"TypeError",
"(",
"\"Transformation only applicable to undirected \\\n NetworkX graphs\"",
")",
"G",
"=",
"UndirectedHypergraph",
"(",
")",
"for",
"node",
"in",
"nx_graph",
".",
"nodes_iter",
"(",
")",
":",
"G",
".",
"add_node",
"(",
"node",
",",
"copy",
".",
"copy",
"(",
"nx_graph",
".",
"node",
"[",
"node",
"]",
")",
")",
"for",
"edge",
"in",
"nx_graph",
".",
"edges_iter",
"(",
")",
":",
"G",
".",
"add_hyperedge",
"(",
"[",
"edge",
"[",
"0",
"]",
",",
"edge",
"[",
"1",
"]",
"]",
",",
"copy",
".",
"copy",
"(",
"nx_graph",
"[",
"edge",
"[",
"0",
"]",
"]",
"[",
"edge",
"[",
"1",
"]",
"]",
")",
")",
"return",
"G"
] | Returns an UndirectedHypergraph object that is the graph equivalent of
the given NetworkX Graph object.
:param nx_graph: the NetworkX undirected graph object to transform.
:returns: UndirectedHypergraph -- hypergraph object equivalent to the
NetworkX undirected graph.
:raises: TypeError -- Transformation only applicable to undirected
NetworkX graphs | [
"Returns",
"an",
"UndirectedHypergraph",
"object",
"that",
"is",
"the",
"graph",
"equivalent",
"of",
"the",
"given",
"NetworkX",
"Graph",
"object",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/undirected_graph_transformations.py#L81-L107 | train |
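`from_networkx_graph` maps each 2-node graph edge onto a 2-node hyperedge, so an ordinary graph is the special case of a hypergraph whose hyperedges all have cardinality two. That correspondence, sketched without the NetworkX dependency (frozensets stand in for undirected hyperedges; an illustrative representation only):

```python
def edges_to_hyperedges(edge_list):
    # Each undirected edge (u, v) becomes the 2-node hyperedge {u, v};
    # a set of frozensets also deduplicates (u, v) versus (v, u).
    return {frozenset(edge) for edge in edge_list}
```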
Murali-group/halp | halp/utilities/directed_graph_transformations.py | from_networkx_digraph | def from_networkx_digraph(nx_digraph):
"""Returns a DirectedHypergraph object that is the graph equivalent of the
given NetworkX DiGraph object.
:param nx_digraph: the NetworkX directed graph object to transform.
:returns: DirectedHypergraph -- hypergraph object equivalent to the
NetworkX directed graph.
:raises: TypeError -- Transformation only applicable to directed
NetworkX graphs
"""
import networkx as nx
if not isinstance(nx_digraph, nx.DiGraph):
raise TypeError("Transformation only applicable to directed \
NetworkX graphs")
G = DirectedHypergraph()
for node in nx_digraph.nodes_iter():
G.add_node(node, copy.copy(nx_digraph.node[node]))
for edge in nx_digraph.edges_iter():
tail_node = edge[0]
head_node = edge[1]
G.add_hyperedge(tail_node,
head_node,
copy.copy(nx_digraph[tail_node][head_node]))
return G | python | def from_networkx_digraph(nx_digraph):
"""Returns a DirectedHypergraph object that is the graph equivalent of the
given NetworkX DiGraph object.
:param nx_digraph: the NetworkX directed graph object to transform.
:returns: DirectedHypergraph -- hypergraph object equivalent to the
NetworkX directed graph.
:raises: TypeError -- Transformation only applicable to directed
NetworkX graphs
"""
import networkx as nx
if not isinstance(nx_digraph, nx.DiGraph):
raise TypeError("Transformation only applicable to directed \
NetworkX graphs")
G = DirectedHypergraph()
for node in nx_digraph.nodes_iter():
G.add_node(node, copy.copy(nx_digraph.node[node]))
for edge in nx_digraph.edges_iter():
tail_node = edge[0]
head_node = edge[1]
G.add_hyperedge(tail_node,
head_node,
copy.copy(nx_digraph[tail_node][head_node]))
return G | [
"def",
"from_networkx_digraph",
"(",
"nx_digraph",
")",
":",
"import",
"networkx",
"as",
"nx",
"if",
"not",
"isinstance",
"(",
"nx_digraph",
",",
"nx",
".",
"DiGraph",
")",
":",
"raise",
"TypeError",
"(",
"\"Transformation only applicable to directed \\\n NetworkX graphs\"",
")",
"G",
"=",
"DirectedHypergraph",
"(",
")",
"for",
"node",
"in",
"nx_digraph",
".",
"nodes_iter",
"(",
")",
":",
"G",
".",
"add_node",
"(",
"node",
",",
"copy",
".",
"copy",
"(",
"nx_digraph",
".",
"node",
"[",
"node",
"]",
")",
")",
"for",
"edge",
"in",
"nx_digraph",
".",
"edges_iter",
"(",
")",
":",
"tail_node",
"=",
"edge",
"[",
"0",
"]",
"head_node",
"=",
"edge",
"[",
"1",
"]",
"G",
".",
"add_hyperedge",
"(",
"tail_node",
",",
"head_node",
",",
"copy",
".",
"copy",
"(",
"nx_digraph",
"[",
"tail_node",
"]",
"[",
"head_node",
"]",
")",
")",
"return",
"G"
] | Returns a DirectedHypergraph object that is the graph equivalent of the
given NetworkX DiGraph object.
:param nx_digraph: the NetworkX directed graph object to transform.
:returns: DirectedHypergraph -- hypergraph object equivalent to the
NetworkX directed graph.
:raises: TypeError -- Transformation only applicable to directed
NetworkX graphs | [
"Returns",
"a",
"DirectedHypergraph",
"object",
"that",
"is",
"the",
"graph",
"equivalent",
"of",
"the",
"given",
"NetworkX",
"DiGraph",
"object",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/directed_graph_transformations.py#L81-L110 | train |
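The conversion above maps every digraph edge (u, v) to a hyperedge with singleton tail {u} and head {v}, shallow-copying the edge's attribute dictionary so the hypergraph does not alias the digraph's dicts. A dependency-free sketch of that mapping (the helper name and plain-tuple representation are illustrative, not halp's API):

```python
import copy

def digraph_edges_to_hyperedges(edges, edge_attrs=None):
    """Map each directed edge (u, v) to ({u}, {v}, attrs).

    `edges` is a list of (tail_node, head_node) pairs; `edge_attrs`
    optionally maps an edge to its attribute dict, which is copied
    just as from_networkx_digraph copies nx_digraph[tail][head].
    """
    edge_attrs = edge_attrs or {}
    hyperedges = []
    for tail_node, head_node in edges:
        attrs = copy.copy(edge_attrs.get((tail_node, head_node), {}))
        hyperedges.append(
            (frozenset([tail_node]), frozenset([head_node]), attrs))
    return hyperedges

hyperedges = digraph_edges_to_hyperedges(
    [("A", "B"), ("B", "C")],
    {("A", "B"): {"weight": 2}},
)
```

Note that `nodes_iter()`/`edges_iter()` in the record above exist only in NetworkX 1.x; NetworkX 2.x replaced them with `nodes()`/`edges()`.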
Murali-group/halp | halp/utilities/directed_matrices.py | get_tail_incidence_matrix | def get_tail_incidence_matrix(H, nodes_to_indices, hyperedge_ids_to_indices):
"""Creates the incidence matrix of the tail nodes of the given
hypergraph as a sparse matrix.
:param H: the hypergraph for which to create the incidence matrix.
:param nodes_to_indices: for each node, maps the node to its
corresponding integer index.
:param hyperedge_ids_to_indices: for each hyperedge ID, maps the hyperedge
ID to its corresponding integer index.
:returns: sparse.csc_matrix -- the incidence matrix as a sparse matrix.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
rows, cols = [], []
for hyperedge_id, hyperedge_index in hyperedge_ids_to_indices.items():
for node in H.get_hyperedge_tail(hyperedge_id):
# get the mapping between the node and its ID
rows.append(nodes_to_indices.get(node))
cols.append(hyperedge_index)
values = np.ones(len(rows), dtype=int)
node_count = len(H.get_node_set())
hyperedge_count = len(H.get_hyperedge_id_set())
return sparse.csc_matrix((values, (rows, cols)),
shape=(node_count, hyperedge_count)) | python | def get_tail_incidence_matrix(H, nodes_to_indices, hyperedge_ids_to_indices):
"""Creates the incidence matrix of the tail nodes of the given
hypergraph as a sparse matrix.
:param H: the hypergraph for which to create the incidence matrix.
:param nodes_to_indices: for each node, maps the node to its
corresponding integer index.
:param hyperedge_ids_to_indices: for each hyperedge ID, maps the hyperedge
ID to its corresponding integer index.
:returns: sparse.csc_matrix -- the incidence matrix as a sparse matrix.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs
"""
if not isinstance(H, DirectedHypergraph):
raise TypeError("Algorithm only applicable to directed hypergraphs")
rows, cols = [], []
for hyperedge_id, hyperedge_index in hyperedge_ids_to_indices.items():
for node in H.get_hyperedge_tail(hyperedge_id):
# get the mapping between the node and its ID
rows.append(nodes_to_indices.get(node))
cols.append(hyperedge_index)
values = np.ones(len(rows), dtype=int)
node_count = len(H.get_node_set())
hyperedge_count = len(H.get_hyperedge_id_set())
return sparse.csc_matrix((values, (rows, cols)),
shape=(node_count, hyperedge_count)) | [
"def",
"get_tail_incidence_matrix",
"(",
"H",
",",
"nodes_to_indices",
",",
"hyperedge_ids_to_indices",
")",
":",
"if",
"not",
"isinstance",
"(",
"H",
",",
"DirectedHypergraph",
")",
":",
"raise",
"TypeError",
"(",
"\"Algorithm only applicable to directed hypergraphs\"",
")",
"rows",
",",
"cols",
"=",
"[",
"]",
",",
"[",
"]",
"for",
"hyperedge_id",
",",
"hyperedge_index",
"in",
"hyperedge_ids_to_indices",
".",
"items",
"(",
")",
":",
"for",
"node",
"in",
"H",
".",
"get_hyperedge_tail",
"(",
"hyperedge_id",
")",
":",
"# get the mapping between the node and its ID",
"rows",
".",
"append",
"(",
"nodes_to_indices",
".",
"get",
"(",
"node",
")",
")",
"cols",
".",
"append",
"(",
"hyperedge_index",
")",
"values",
"=",
"np",
".",
"ones",
"(",
"len",
"(",
"rows",
")",
",",
"dtype",
"=",
"int",
")",
"node_count",
"=",
"len",
"(",
"H",
".",
"get_node_set",
"(",
")",
")",
"hyperedge_count",
"=",
"len",
"(",
"H",
".",
"get_hyperedge_id_set",
"(",
")",
")",
"return",
"sparse",
".",
"csc_matrix",
"(",
"(",
"values",
",",
"(",
"rows",
",",
"cols",
")",
")",
",",
"shape",
"=",
"(",
"node_count",
",",
"hyperedge_count",
")",
")"
] | Creates the incidence matrix of the tail nodes of the given
hypergraph as a sparse matrix.
:param H: the hypergraph for which to create the incidence matrix.
:param nodes_to_indices: for each node, maps the node to its
corresponding integer index.
:param hyperedge_ids_to_indices: for each hyperedge ID, maps the hyperedge
ID to its corresponding integer index.
:returns: sparse.csc_matrix -- the incidence matrix as a sparse matrix.
:raises: TypeError -- Algorithm only applicable to directed hypergraphs | [
"Creates",
"the",
"incidence",
"matrix",
"of",
"the",
"tail",
"nodes",
"of",
"the",
"given",
"hypergraph",
"as",
"a",
"sparse",
"matrix",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/utilities/directed_matrices.py#L59-L87 | train |
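The row/column construction above can be shown without scipy by building a dense list-of-lists matrix: entry [i][j] is 1 exactly when node i lies in the tail of hyperedge j. This is an illustrative sketch; `tails` stands in for calls to `H.get_hyperedge_tail`:

```python
def tail_incidence_matrix(tails, nodes_to_indices, hyperedge_ids_to_indices):
    """Dense analogue of the sparse tail-incidence matrix.

    `tails` maps a hyperedge ID to its set of tail nodes.
    """
    node_count = len(nodes_to_indices)
    hyperedge_count = len(hyperedge_ids_to_indices)
    M = [[0] * hyperedge_count for _ in range(node_count)]
    for hyperedge_id, j in hyperedge_ids_to_indices.items():
        for node in tails[hyperedge_id]:
            # same (row, col) pairs the sparse version feeds to csc_matrix
            M[nodes_to_indices[node]][j] = 1
    return M

M = tail_incidence_matrix(
    {"e1": {"A", "B"}, "e2": {"B"}},
    {"A": 0, "B": 1, "C": 2},
    {"e1": 0, "e2": 1},
)
```

Node "C" appears in no tail, so its row stays all zeros, mirroring how the sparse matrix simply records no entries for it.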
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.add_node | def add_node(self, node, attr_dict=None, **attr):
"""Adds a node to the graph, along with any related attributes
of the node.
:param node: reference to the node being added.
:param attr_dict: dictionary of attributes of the node.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
Examples:
::
>>> H = DirectedHypergraph()
>>> attributes = {"label": "positive"}
>>> H.add_node("A", attributes)
>>> H.add_node("B", label="negative")
>>> H.add_node("C", attributes, root=True)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# If the node hasn't previously been added, add it along
# with its attributes
if not self.has_node(node):
self._node_attributes[node] = attr_dict
self._forward_star[node] = set()
self._backward_star[node] = set()
# Otherwise, just update the node's attributes
else:
self._node_attributes[node].update(attr_dict) | python | def add_node(self, node, attr_dict=None, **attr):
"""Adds a node to the graph, along with any related attributes
of the node.
:param node: reference to the node being added.
:param attr_dict: dictionary of attributes of the node.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
Examples:
::
>>> H = DirectedHypergraph()
>>> attributes = {"label": "positive"}
>>> H.add_node("A", attributes)
>>> H.add_node("B", label="negative")
>>> H.add_node("C", attributes, root=True)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# If the node hasn't previously been added, add it along
# with its attributes
if not self.has_node(node):
self._node_attributes[node] = attr_dict
self._forward_star[node] = set()
self._backward_star[node] = set()
# Otherwise, just update the node's attributes
else:
self._node_attributes[node].update(attr_dict) | [
"def",
"add_node",
"(",
"self",
",",
"node",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"# If the node hasn't previously been added, add it along",
"# with its attributes",
"if",
"not",
"self",
".",
"has_node",
"(",
"node",
")",
":",
"self",
".",
"_node_attributes",
"[",
"node",
"]",
"=",
"attr_dict",
"self",
".",
"_forward_star",
"[",
"node",
"]",
"=",
"set",
"(",
")",
"self",
".",
"_backward_star",
"[",
"node",
"]",
"=",
"set",
"(",
")",
"# Otherwise, just update the node's attributes",
"else",
":",
"self",
".",
"_node_attributes",
"[",
"node",
"]",
".",
"update",
"(",
"attr_dict",
")"
] | Adds a node to the graph, along with any related attributes
of the node.
:param node: reference to the node being added.
:param attr_dict: dictionary of attributes of the node.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
Examples:
::
>>> H = DirectedHypergraph()
>>> attributes = {"label": "positive"}
>>> H.add_node("A", attributes)
>>> H.add_node("B", label="negative")
>>> H.add_node("C", attributes, root=True) | [
"Adds",
"a",
"node",
"to",
"the",
"graph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"node",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L204-L234 | train |
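The add-or-update behaviour of `add_node` — initialize attributes plus empty forward/backward stars for a new node, merge attributes for an existing one — can be sketched with plain dictionaries. Stdlib only; `state` and its keys are illustrative stand-ins for halp's internals:

```python
def add_node_sketch(state, node, attr_dict=None, **attr):
    """Mirror of add_node's add-or-update pattern on a plain dict."""
    combined = dict(attr_dict or {})
    combined.update(attr)  # keyword args override attr_dict, as in the source
    if node not in state["attrs"]:
        # new node: attach attributes and empty star sets
        state["attrs"][node] = combined
        state["forward_star"][node] = set()
        state["backward_star"][node] = set()
    else:
        # existing node: only its attributes are updated
        state["attrs"][node].update(combined)

state = {"attrs": {}, "forward_star": {}, "backward_star": {}}
add_node_sketch(state, "A", {"label": "positive"})
add_node_sketch(state, "A", label="negative", root=True)
```

The second call does not reset the star sets; it only merges the new attributes over the old ones.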
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.add_nodes | def add_nodes(self, nodes, attr_dict=None, **attr):
"""Adds multiple nodes to the graph, along with any related attributes
of the nodes.
:param nodes: iterable container to either references of the nodes
OR tuples of (node reference, attribute dictionary);
if an attribute dictionary is provided in the tuple,
its values will override both attr_dict's and attr's
values.
:param attr_dict: dictionary of attributes shared by all the nodes.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
See also:
add_node
Examples:
::
>>> H = DirectedHypergraph()
>>> attributes = {"label": "positive"}
>>> node_list = ["A",
                 ("B", {"label": "negative"}),
                 ("C", {"root": True})]
>>> H.add_nodes(node_list, attributes)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
for node in nodes:
# Note: This won't behave properly if the node is actually a tuple
if type(node) is tuple:
# See ("B", {"label": "negative"}) in the documentation example
new_node, node_attr_dict = node
# Create a new dictionary and load it with node_attr_dict and
# attr_dict, with the former (node_attr_dict) taking precedence
new_dict = attr_dict.copy()
new_dict.update(node_attr_dict)
self.add_node(new_node, new_dict)
else:
# See "A" in the documentation example
self.add_node(node, attr_dict.copy()) | python | def add_nodes(self, nodes, attr_dict=None, **attr):
"""Adds multiple nodes to the graph, along with any related attributes
of the nodes.
:param nodes: iterable container to either references of the nodes
OR tuples of (node reference, attribute dictionary);
if an attribute dictionary is provided in the tuple,
its values will override both attr_dict's and attr's
values.
:param attr_dict: dictionary of attributes shared by all the nodes.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
See also:
add_node
Examples:
::
>>> H = DirectedHypergraph()
>>> attributes = {"label": "positive"}
>>> node_list = ["A",
                 ("B", {"label": "negative"}),
                 ("C", {"root": True})]
>>> H.add_nodes(node_list, attributes)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
for node in nodes:
# Note: This won't behave properly if the node is actually a tuple
if type(node) is tuple:
# See ("B", {"label": "negative"}) in the documentation example
new_node, node_attr_dict = node
# Create a new dictionary and load it with node_attr_dict and
# attr_dict, with the former (node_attr_dict) taking precedence
new_dict = attr_dict.copy()
new_dict.update(node_attr_dict)
self.add_node(new_node, new_dict)
else:
# See "A" in the documentation example
self.add_node(node, attr_dict.copy()) | [
"def",
"add_nodes",
"(",
"self",
",",
"nodes",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"for",
"node",
"in",
"nodes",
":",
"# Note: This won't behave properly if the node is actually a tuple",
"if",
"type",
"(",
"node",
")",
"is",
"tuple",
":",
"# See (\"B\", {\"label\": \"negative\"}) in the documentation example",
"new_node",
",",
"node_attr_dict",
"=",
"node",
"# Create a new dictionary and load it with node_attr_dict and",
"# attr_dict, with the former (node_attr_dict) taking precedence",
"new_dict",
"=",
"attr_dict",
".",
"copy",
"(",
")",
"new_dict",
".",
"update",
"(",
"node_attr_dict",
")",
"self",
".",
"add_node",
"(",
"new_node",
",",
"new_dict",
")",
"else",
":",
"# See \"A\" in the documentation example",
"self",
".",
"add_node",
"(",
"node",
",",
"attr_dict",
".",
"copy",
"(",
")",
")"
] | Adds multiple nodes to the graph, along with any related attributes
of the nodes.
:param nodes: iterable container to either references of the nodes
OR tuples of (node reference, attribute dictionary);
if an attribute dictionary is provided in the tuple,
its values will override both attr_dict's and attr's
values.
:param attr_dict: dictionary of attributes shared by all the nodes.
:param attr: keyword arguments of attributes of the node;
attr's values will override attr_dict's values
if both are provided.
See also:
add_node
Examples:
::
>>> H = DirectedHypergraph()
>>> attributes = {"label": "positive"}
>>> node_list = ["A",
                 ("B", {"label": "negative"}),
                 ("C", {"root": True})]
>>> H.add_nodes(node_list, attributes) | [
"Adds",
"multiple",
"nodes",
"to",
"the",
"graph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"nodes",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L236-L278 | train |
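The tuple-vs-bare-node convention of `add_nodes` — a (node, attr_dict) tuple's own attributes take precedence over the shared dictionary, while a bare node just gets a copy of the shared attributes — can be sketched as follows. The helper is illustrative, not part of halp, and inherits the caveat noted in the source: it misbehaves if a node itself is a tuple:

```python
def normalize_node_entries(nodes, shared_attrs):
    """Return {node: merged attribute dict} per add_nodes' convention."""
    result = {}
    for entry in nodes:
        if type(entry) is tuple:
            node, node_attrs = entry
            merged = dict(shared_attrs)
            merged.update(node_attrs)  # per-node attrs win over shared ones
            result[node] = merged
        else:
            result[entry] = dict(shared_attrs)  # fresh copy per node
    return result

normalized = normalize_node_entries(
    ["A", ("B", {"label": "negative"})],
    {"label": "positive"},
)
```

Each node receives its own dict copy, so later per-node updates cannot leak into the shared attribute dictionary.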
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.remove_node | def remove_node(self, node):
"""Removes a node and its attributes from the hypergraph. Removes
every hyperedge that contains this node in either the head or the tail.
:param node: reference to the node being removed.
:raises: ValueError -- No such node exists.
Examples:
::
>>> H = DirectedHypergraph()
>>> H.add_node("A", label="positive")
>>> H.remove_node("A")
"""
if not self.has_node(node):
raise ValueError("No such node exists.")
# Remove every hyperedge which is in the forward star of the node
forward_star = self.get_forward_star(node)
for hyperedge_id in forward_star:
self.remove_hyperedge(hyperedge_id)
# Remove every hyperedge which is in the backward star of the node
# but that is not also in the forward start of the node (to handle
# overlapping hyperedges)
backward_star = self.get_backward_star(node)
for hyperedge_id in backward_star - forward_star:
self.remove_hyperedge(hyperedge_id)
# Remove node's forward and backward star
del self._forward_star[node]
del self._backward_star[node]
# Remove node's attributes dictionary
del self._node_attributes[node] | python | def remove_node(self, node):
"""Removes a node and its attributes from the hypergraph. Removes
every hyperedge that contains this node in either the head or the tail.
:param node: reference to the node being removed.
:raises: ValueError -- No such node exists.
Examples:
::
>>> H = DirectedHypergraph()
>>> H.add_node("A", label="positive")
>>> H.remove_node("A")
"""
if not self.has_node(node):
raise ValueError("No such node exists.")
# Remove every hyperedge which is in the forward star of the node
forward_star = self.get_forward_star(node)
for hyperedge_id in forward_star:
self.remove_hyperedge(hyperedge_id)
# Remove every hyperedge which is in the backward star of the node
# but that is not also in the forward start of the node (to handle
# overlapping hyperedges)
backward_star = self.get_backward_star(node)
for hyperedge_id in backward_star - forward_star:
self.remove_hyperedge(hyperedge_id)
# Remove node's forward and backward star
del self._forward_star[node]
del self._backward_star[node]
# Remove node's attributes dictionary
del self._node_attributes[node] | [
"def",
"remove_node",
"(",
"self",
",",
"node",
")",
":",
"if",
"not",
"self",
".",
"has_node",
"(",
"node",
")",
":",
"raise",
"ValueError",
"(",
"\"No such node exists.\"",
")",
"# Remove every hyperedge which is in the forward star of the node",
"forward_star",
"=",
"self",
".",
"get_forward_star",
"(",
"node",
")",
"for",
"hyperedge_id",
"in",
"forward_star",
":",
"self",
".",
"remove_hyperedge",
"(",
"hyperedge_id",
")",
"# Remove every hyperedge which is in the backward star of the node",
"# but that is not also in the forward star of the node (to handle",
"# overlapping hyperedges)",
"backward_star",
"=",
"self",
".",
"get_backward_star",
"(",
"node",
")",
"for",
"hyperedge_id",
"in",
"backward_star",
"-",
"forward_star",
":",
"self",
".",
"remove_hyperedge",
"(",
"hyperedge_id",
")",
"# Remove node's forward and backward star",
"del",
"self",
".",
"_forward_star",
"[",
"node",
"]",
"del",
"self",
".",
"_backward_star",
"[",
"node",
"]",
"# Remove node's attributes dictionary",
"del",
"self",
".",
"_node_attributes",
"[",
"node",
"]"
] | Removes a node and its attributes from the hypergraph. Removes
every hyperedge that contains this node in either the head or the tail.
:param node: reference to the node being removed.
:raises: ValueError -- No such node exists.
Examples:
::
>>> H = DirectedHypergraph()
>>> H.add_node("A", label="positive")
>>> H.remove_node("A") | [
"Removes",
"a",
"node",
"and",
"its",
"attributes",
"from",
"the",
"hypergraph",
".",
"Removes",
"every",
"hyperedge",
"that",
"contains",
"this",
"node",
"in",
"either",
"the",
"head",
"or",
"the",
"tail",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L280-L315 | train |
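The set subtraction in `remove_node` matters when a hyperedge contains the node in both its tail and its head: such a hyperedge appears in both stars, and removing the forward star first, then only `backward_star - forward_star`, deletes it exactly once. A tiny sketch with illustrative hyperedge IDs:

```python
# remove_node deletes every hyperedge in the node's forward star, then
# only those backward-star hyperedges not already handled; "e2" sits in
# both stars (node in its tail and its head) and is removed once.
forward_star = {"e1", "e2"}   # node appears in the tail of e1, e2
backward_star = {"e2", "e3"}  # node appears in the head of e2, e3

removed = list(forward_star) + list(backward_star - forward_star)
```

Without the subtraction, "e2" would be removed twice, and the second removal would fail because the hyperedge ID no longer exists.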
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_node_attribute | def get_node_attribute(self, node, attribute_name):
"""Given a node and the name of an attribute, get a copy
of that node's attribute.
:param node: reference to the node to retrieve the attribute of.
:param attribute_name: name of the attribute to retrieve.
:returns: attribute value of the attribute_name key for the
specified node.
:raises: ValueError -- No such node exists.
:raises: ValueError -- No such attribute exists.
"""
if not self.has_node(node):
raise ValueError("No such node exists.")
elif attribute_name not in self._node_attributes[node]:
raise ValueError("No such attribute exists.")
else:
return copy.\
copy(self._node_attributes[node][attribute_name]) | python | def get_node_attribute(self, node, attribute_name):
"""Given a node and the name of an attribute, get a copy
of that node's attribute.
:param node: reference to the node to retrieve the attribute of.
:param attribute_name: name of the attribute to retrieve.
:returns: attribute value of the attribute_name key for the
specified node.
:raises: ValueError -- No such node exists.
:raises: ValueError -- No such attribute exists.
"""
if not self.has_node(node):
raise ValueError("No such node exists.")
elif attribute_name not in self._node_attributes[node]:
raise ValueError("No such attribute exists.")
else:
return copy.\
copy(self._node_attributes[node][attribute_name]) | [
"def",
"get_node_attribute",
"(",
"self",
",",
"node",
",",
"attribute_name",
")",
":",
"if",
"not",
"self",
".",
"has_node",
"(",
"node",
")",
":",
"raise",
"ValueError",
"(",
"\"No such node exists.\"",
")",
"elif",
"attribute_name",
"not",
"in",
"self",
".",
"_node_attributes",
"[",
"node",
"]",
":",
"raise",
"ValueError",
"(",
"\"No such attribute exists.\"",
")",
"else",
":",
"return",
"copy",
".",
"copy",
"(",
"self",
".",
"_node_attributes",
"[",
"node",
"]",
"[",
"attribute_name",
"]",
")"
] | Given a node and the name of an attribute, get a copy
of that node's attribute.
:param node: reference to the node to retrieve the attribute of.
:param attribute_name: name of the attribute to retrieve.
:returns: attribute value of the attribute_name key for the
specified node.
:raises: ValueError -- No such node exists.
:raises: ValueError -- No such attribute exists. | [
"Given",
"a",
"node",
"and",
"the",
"name",
"of",
"an",
"attribute",
"get",
"a",
"copy",
"of",
"that",
"node",
"s",
"attribute",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L420-L438 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_node_attributes | def get_node_attributes(self, node):
"""Given a node, get a dictionary with copies of that node's
attributes.
:param node: reference to the node to retrieve the attributes of.
:returns: dict -- copy of each attribute of the specified node.
:raises: ValueError -- No such node exists.
"""
if not self.has_node(node):
raise ValueError("No such node exists.")
attributes = {}
for attr_name, attr_value in self._node_attributes[node].items():
attributes[attr_name] = copy.copy(attr_value)
return attributes | python | def get_node_attributes(self, node):
"""Given a node, get a dictionary with copies of that node's
attributes.
:param node: reference to the node to retrieve the attributes of.
:returns: dict -- copy of each attribute of the specified node.
:raises: ValueError -- No such node exists.
"""
if not self.has_node(node):
raise ValueError("No such node exists.")
attributes = {}
for attr_name, attr_value in self._node_attributes[node].items():
attributes[attr_name] = copy.copy(attr_value)
return attributes | [
"def",
"get_node_attributes",
"(",
"self",
",",
"node",
")",
":",
"if",
"not",
"self",
".",
"has_node",
"(",
"node",
")",
":",
"raise",
"ValueError",
"(",
"\"No such node exists.\"",
")",
"attributes",
"=",
"{",
"}",
"for",
"attr_name",
",",
"attr_value",
"in",
"self",
".",
"_node_attributes",
"[",
"node",
"]",
".",
"items",
"(",
")",
":",
"attributes",
"[",
"attr_name",
"]",
"=",
"copy",
".",
"copy",
"(",
"attr_value",
")",
"return",
"attributes"
] | Given a node, get a dictionary with copies of that node's
attributes.
:param node: reference to the node to retrieve the attributes of.
:returns: dict -- copy of each attribute of the specified node.
:raises: ValueError -- No such node exists. | [
"Given",
"a",
"node",
"get",
"a",
"dictionary",
"with",
"copies",
"of",
"that",
"node",
"s",
"attributes",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L440-L454 | train |
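Both attribute getters above return `copy.copy` of the stored values, so callers can reassign or mutate what they receive without corrupting the hypergraph's state. The copies are shallow, though: objects nested *inside* an attribute value may still be shared. A small demonstration with an illustrative `stored` dict standing in for `_node_attributes[node]`:

```python
import copy

stored = {"labels": ["positive"], "weight": 1}

# mimic get_node_attributes: shallow-copy each attribute value
returned = {name: copy.copy(value) for name, value in stored.items()}
returned["weight"] = 99            # reassignment: invisible to `stored`
returned["labels"].append("seen")  # the list itself was copied, so this
                                   # also leaves `stored` untouched
```

Deeply nested structures (say, a list of lists) would still share their inner objects, which is the usual shallow-copy caveat.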
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.add_hyperedge | def add_hyperedge(self, tail, head, attr_dict=None, **attr):
"""Adds a hyperedge to the hypergraph, along with any related
attributes of the hyperedge.
This method will automatically add any node from the tail and
head that was not in the hypergraph.
A hyperedge without a "weight" attribute specified will be
assigned the default value of 1.
:param tail: iterable container of references to nodes in the
tail of the hyperedge to be added.
:param head: iterable container of references to nodes in the
head of the hyperedge to be added.
:param attr_dict: dictionary of attributes shared by all
the hyperedges.
:param attr: keyword arguments of attributes of the hyperedge;
attr's values will override attr_dict's values
if both are provided.
:returns: str -- the ID of the hyperedge that was added.
:raises: ValueError -- tail and head arguments cannot both be empty.
Examples:
::
>>> H = DirectedHypergraph()
>>> x = H.add_hyperedge(["A", "B"], ["C", "D"])
>>> y = H.add_hyperedge(("A", "C"), ("B",), weight=2)
>>> z = H.add_hyperedge(set(["D"]),
                        set(["A", "C"]),
                        {"color": "red"})
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# Don't allow both empty tail and head containers (invalid hyperedge)
if not tail and not head:
raise ValueError("tail and head arguments \
cannot both be empty.")
# Use frozensets for tail and head sets to allow for hashable keys
frozen_tail = frozenset(tail)
frozen_head = frozenset(head)
# Initialize a successor dictionary for the tail and head, respectively
if frozen_tail not in self._successors:
self._successors[frozen_tail] = {}
if frozen_head not in self._predecessors:
self._predecessors[frozen_head] = {}
is_new_hyperedge = not self.has_hyperedge(frozen_tail, frozen_head)
if is_new_hyperedge:
# Add tail and head nodes to graph (if not already present)
self.add_nodes(frozen_head)
self.add_nodes(frozen_tail)
# Create new hyperedge name to use as reference for that hyperedge
hyperedge_id = self._assign_next_hyperedge_id()
# Add hyperedge to the forward-star and to the backward-star
# for each node in the tail and head sets, respectively
for node in frozen_tail:
self._forward_star[node].add(hyperedge_id)
for node in frozen_head:
self._backward_star[node].add(hyperedge_id)
# Add the hyperedge as the successors and predecessors
# of the tail set and head set, respectively
self._successors[frozen_tail][frozen_head] = hyperedge_id
self._predecessors[frozen_head][frozen_tail] = hyperedge_id
# Assign some special attributes to this hyperedge. We assign
# a default weight of 1 to the hyperedge. We also store the
# original tail and head sets in order to return them exactly
# as the user passed them into add_hyperedge.
self._hyperedge_attributes[hyperedge_id] = \
{"tail": tail, "__frozen_tail": frozen_tail,
"head": head, "__frozen_head": frozen_head,
"weight": 1}
else:
# If it's not a new hyperedge, just get its ID to update attributes
hyperedge_id = self._successors[frozen_tail][frozen_head]
# Set attributes and return hyperedge ID
self._hyperedge_attributes[hyperedge_id].update(attr_dict)
return hyperedge_id | python | def add_hyperedge(self, tail, head, attr_dict=None, **attr):
"""Adds a hyperedge to the hypergraph, along with any related
attributes of the hyperedge.
This method will automatically add any node from the tail and
head that was not in the hypergraph.
A hyperedge without a "weight" attribute specified will be
assigned the default value of 1.
:param tail: iterable container of references to nodes in the
tail of the hyperedge to be added.
:param head: iterable container of references to nodes in the
head of the hyperedge to be added.
:param attr_dict: dictionary of attributes shared by all
the hyperedges.
:param attr: keyword arguments of attributes of the hyperedge;
attr's values will override attr_dict's values
if both are provided.
:returns: str -- the ID of the hyperedge that was added.
:raises: ValueError -- tail and head arguments cannot both be empty.
Examples:
::
>>> H = DirectedHypergraph()
>>> x = H.add_hyperedge(["A", "B"], ["C", "D"])
>>> y = H.add_hyperedge(("A", "C"), ("B",), weight=2)
>>> z = H.add_hyperedge(set(["D"]),
                        set(["A", "C"]),
                        {"color": "red"})
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
# Don't allow both empty tail and head containers (invalid hyperedge)
if not tail and not head:
raise ValueError("tail and head arguments \
cannot both be empty.")
# Use frozensets for tail and head sets to allow for hashable keys
frozen_tail = frozenset(tail)
frozen_head = frozenset(head)
# Initialize a successor dictionary for the tail and head, respectively
if frozen_tail not in self._successors:
self._successors[frozen_tail] = {}
if frozen_head not in self._predecessors:
self._predecessors[frozen_head] = {}
is_new_hyperedge = not self.has_hyperedge(frozen_tail, frozen_head)
if is_new_hyperedge:
# Add tail and head nodes to graph (if not already present)
self.add_nodes(frozen_head)
self.add_nodes(frozen_tail)
# Create new hyperedge name to use as reference for that hyperedge
hyperedge_id = self._assign_next_hyperedge_id()
# Add hyperedge to the forward-star and to the backward-star
# for each node in the tail and head sets, respectively
for node in frozen_tail:
self._forward_star[node].add(hyperedge_id)
for node in frozen_head:
self._backward_star[node].add(hyperedge_id)
# Add the hyperedge as the successors and predecessors
# of the tail set and head set, respectively
self._successors[frozen_tail][frozen_head] = hyperedge_id
self._predecessors[frozen_head][frozen_tail] = hyperedge_id
# Assign some special attributes to this hyperedge. We assign
# a default weight of 1 to the hyperedge. We also store the
# original tail and head sets in order to return them exactly
# as the user passed them into add_hyperedge.
self._hyperedge_attributes[hyperedge_id] = \
{"tail": tail, "__frozen_tail": frozen_tail,
"head": head, "__frozen_head": frozen_head,
"weight": 1}
else:
# If it's not a new hyperedge, just get its ID to update attributes
hyperedge_id = self._successors[frozen_tail][frozen_head]
# Set attributes and return hyperedge ID
self._hyperedge_attributes[hyperedge_id].update(attr_dict)
return hyperedge_id | [
"def",
"add_hyperedge",
"(",
"self",
",",
"tail",
",",
"head",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"# Don't allow both empty tail and head containers (invalid hyperedge)",
"if",
"not",
"tail",
"and",
"not",
"head",
":",
"raise",
"ValueError",
"(",
"\"tail and head arguments \\\n cannot both be empty.\"",
")",
"# Use frozensets for tail and head sets to allow for hashable keys",
"frozen_tail",
"=",
"frozenset",
"(",
"tail",
")",
"frozen_head",
"=",
"frozenset",
"(",
"head",
")",
"# Initialize a successor dictionary for the tail and head, respectively",
"if",
"frozen_tail",
"not",
"in",
"self",
".",
"_successors",
":",
"self",
".",
"_successors",
"[",
"frozen_tail",
"]",
"=",
"{",
"}",
"if",
"frozen_head",
"not",
"in",
"self",
".",
"_predecessors",
":",
"self",
".",
"_predecessors",
"[",
"frozen_head",
"]",
"=",
"{",
"}",
"is_new_hyperedge",
"=",
"not",
"self",
".",
"has_hyperedge",
"(",
"frozen_tail",
",",
"frozen_head",
")",
"if",
"is_new_hyperedge",
":",
"# Add tail and head nodes to graph (if not already present)",
"self",
".",
"add_nodes",
"(",
"frozen_head",
")",
"self",
".",
"add_nodes",
"(",
"frozen_tail",
")",
"# Create new hyperedge name to use as reference for that hyperedge",
"hyperedge_id",
"=",
"self",
".",
"_assign_next_hyperedge_id",
"(",
")",
"# Add hyperedge to the forward-star and to the backward-star",
"# for each node in the tail and head sets, respectively",
"for",
"node",
"in",
"frozen_tail",
":",
"self",
".",
"_forward_star",
"[",
"node",
"]",
".",
"add",
"(",
"hyperedge_id",
")",
"for",
"node",
"in",
"frozen_head",
":",
"self",
".",
"_backward_star",
"[",
"node",
"]",
".",
"add",
"(",
"hyperedge_id",
")",
"# Add the hyperedge as the successors and predecessors",
"# of the tail set and head set, respectively",
"self",
".",
"_successors",
"[",
"frozen_tail",
"]",
"[",
"frozen_head",
"]",
"=",
"hyperedge_id",
"self",
".",
"_predecessors",
"[",
"frozen_head",
"]",
"[",
"frozen_tail",
"]",
"=",
"hyperedge_id",
"# Assign some special attributes to this hyperedge. We assign",
"# a default weight of 1 to the hyperedge. We also store the",
"# original tail and head sets in order to return them exactly",
"# as the user passed them into add_hyperedge.",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
"=",
"{",
"\"tail\"",
":",
"tail",
",",
"\"__frozen_tail\"",
":",
"frozen_tail",
",",
"\"head\"",
":",
"head",
",",
"\"__frozen_head\"",
":",
"frozen_head",
",",
"\"weight\"",
":",
"1",
"}",
"else",
":",
"# If it's not a new hyperedge, just get its ID to update attributes",
"hyperedge_id",
"=",
"self",
".",
"_successors",
"[",
"frozen_tail",
"]",
"[",
"frozen_head",
"]",
"# Set attributes and return hyperedge ID",
"self",
".",
"_hyperedge_attributes",
"[",
"hyperedge_id",
"]",
".",
"update",
"(",
"attr_dict",
")",
"return",
"hyperedge_id"
] | Adds a hyperedge to the hypergraph, along with any related
attributes of the hyperedge.
This method will automatically add any node from the tail and
head that was not in the hypergraph.
A hyperedge without a "weight" attribute specified will be
assigned the default value of 1.
:param tail: iterable container of references to nodes in the
tail of the hyperedge to be added.
:param head: iterable container of references to nodes in the
head of the hyperedge to be added.
:param attr_dict: dictionary of attributes shared by all
the hyperedges.
:param attr: keyword arguments of attributes of the hyperedge;
attr's values will override attr_dict's values
if both are provided.
:returns: str -- the ID of the hyperedge that was added.
:raises: ValueError -- tail and head arguments cannot both be empty.
Examples:
::
>>> H = DirectedHypergraph()
>>> x = H.add_hyperedge(["A", "B"], ["C", "D"])
>>> y = H.add_hyperedge(("A", "C"), ("B"), weight=2)
>>> z = H.add_hyperedge(set(["D"]),
set(["A", "C"]),
{'color': "red"}) | [
"Adds",
"a",
"hyperedge",
"to",
"the",
"hypergraph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"hyperedge",
".",
"This",
"method",
"will",
"automatically",
"add",
"any",
"node",
"from",
"the",
"tail",
"and",
"head",
"that",
"was",
"not",
"in",
"the",
"hypergraph",
".",
"A",
"hyperedge",
"without",
"a",
"weight",
"attribute",
"specified",
"will",
"be",
"assigned",
"the",
"default",
"value",
"of",
"1",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L465-L548 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.add_hyperedges | def add_hyperedges(self, hyperedges, attr_dict=None, **attr):
"""Adds multiple hyperedges to the graph, along with any related
attributes of the hyperedges.
If any node in the tail or head of any hyperedge has not
previously been added to the hypergraph, it will automatically
be added here. Hyperedges without a "weight" attribute specified
will be assigned the default value of 1.
:param hyperedges: iterable container of either tuples of
(tail reference, head reference) OR tuples of
(tail reference, head reference, attribute dictionary);
if an attribute dictionary is provided in the tuple,
its values will override both attr_dict's and attr's
values.
:param attr_dict: dictionary of attributes shared by all
the hyperedges.
:param attr: keyword arguments of attributes of the hyperedges;
attr's values will override attr_dict's values
if both are provided.
:returns: list -- the IDs of the hyperedges added in the order
specified by the hyperedges container's iterator.
See also:
add_hyperedge
Examples:
::
>>> H = DirectedHypergraph()
>>> xyz = hyperedge_list = ((["A", "B"], ["C", "D"]),
(("A", "C"), ("B"), {'weight': 2}),
(set(["D"]), set(["A", "C"])))
>>> H.add_hyperedges(hyperedge_list)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
hyperedge_ids = []
for hyperedge in hyperedges:
if len(hyperedge) == 3:
# See ("A", "C"), ("B"), {weight: 2}) in the
# documentation example
tail, head, hyperedge_attr_dict = hyperedge
# Create a new dictionary and load it with node_attr_dict and
# attr_dict, with the former (node_attr_dict) taking precedence
new_dict = attr_dict.copy()
new_dict.update(hyperedge_attr_dict)
hyperedge_id = self.add_hyperedge(tail, head, new_dict)
else:
# See (["A", "B"], ["C", "D"]) in the documentation example
tail, head = hyperedge
hyperedge_id = \
self.add_hyperedge(tail, head, attr_dict.copy())
hyperedge_ids.append(hyperedge_id)
return hyperedge_ids | python | def add_hyperedges(self, hyperedges, attr_dict=None, **attr):
"""Adds multiple hyperedges to the graph, along with any related
attributes of the hyperedges.
If any node in the tail or head of any hyperedge has not
previously been added to the hypergraph, it will automatically
be added here. Hyperedges without a "weight" attribute specified
will be assigned the default value of 1.
:param hyperedges: iterable container of either tuples of
(tail reference, head reference) OR tuples of
(tail reference, head reference, attribute dictionary);
if an attribute dictionary is provided in the tuple,
its values will override both attr_dict's and attr's
values.
:param attr_dict: dictionary of attributes shared by all
the hyperedges.
:param attr: keyword arguments of attributes of the hyperedges;
attr's values will override attr_dict's values
if both are provided.
:returns: list -- the IDs of the hyperedges added in the order
specified by the hyperedges container's iterator.
See also:
add_hyperedge
Examples:
::
>>> H = DirectedHypergraph()
>>> xyz = hyperedge_list = ((["A", "B"], ["C", "D"]),
(("A", "C"), ("B"), {'weight': 2}),
(set(["D"]), set(["A", "C"])))
>>> H.add_hyperedges(hyperedge_list)
"""
attr_dict = self._combine_attribute_arguments(attr_dict, attr)
hyperedge_ids = []
for hyperedge in hyperedges:
if len(hyperedge) == 3:
# See ("A", "C"), ("B"), {weight: 2}) in the
# documentation example
tail, head, hyperedge_attr_dict = hyperedge
# Create a new dictionary and load it with node_attr_dict and
# attr_dict, with the former (node_attr_dict) taking precedence
new_dict = attr_dict.copy()
new_dict.update(hyperedge_attr_dict)
hyperedge_id = self.add_hyperedge(tail, head, new_dict)
else:
# See (["A", "B"], ["C", "D"]) in the documentation example
tail, head = hyperedge
hyperedge_id = \
self.add_hyperedge(tail, head, attr_dict.copy())
hyperedge_ids.append(hyperedge_id)
return hyperedge_ids | [
"def",
"add_hyperedges",
"(",
"self",
",",
"hyperedges",
",",
"attr_dict",
"=",
"None",
",",
"*",
"*",
"attr",
")",
":",
"attr_dict",
"=",
"self",
".",
"_combine_attribute_arguments",
"(",
"attr_dict",
",",
"attr",
")",
"hyperedge_ids",
"=",
"[",
"]",
"for",
"hyperedge",
"in",
"hyperedges",
":",
"if",
"len",
"(",
"hyperedge",
")",
"==",
"3",
":",
"# See (\"A\", \"C\"), (\"B\"), {weight: 2}) in the",
"# documentation example",
"tail",
",",
"head",
",",
"hyperedge_attr_dict",
"=",
"hyperedge",
"# Create a new dictionary and load it with node_attr_dict and",
"# attr_dict, with the former (node_attr_dict) taking precedence",
"new_dict",
"=",
"attr_dict",
".",
"copy",
"(",
")",
"new_dict",
".",
"update",
"(",
"hyperedge_attr_dict",
")",
"hyperedge_id",
"=",
"self",
".",
"add_hyperedge",
"(",
"tail",
",",
"head",
",",
"new_dict",
")",
"else",
":",
"# See ([\"A\", \"B\"], [\"C\", \"D\"]) in the documentation example",
"tail",
",",
"head",
"=",
"hyperedge",
"hyperedge_id",
"=",
"self",
".",
"add_hyperedge",
"(",
"tail",
",",
"head",
",",
"attr_dict",
".",
"copy",
"(",
")",
")",
"hyperedge_ids",
".",
"append",
"(",
"hyperedge_id",
")",
"return",
"hyperedge_ids"
] | Adds multiple hyperedges to the graph, along with any related
attributes of the hyperedges.
If any node in the tail or head of any hyperedge has not
previously been added to the hypergraph, it will automatically
be added here. Hyperedges without a "weight" attribute specified
will be assigned the default value of 1.
:param hyperedges: iterable container of either tuples of
(tail reference, head reference) OR tuples of
(tail reference, head reference, attribute dictionary);
if an attribute dictionary is provided in the tuple,
its values will override both attr_dict's and attr's
values.
:param attr_dict: dictionary of attributes shared by all
the hyperedges.
:param attr: keyword arguments of attributes of the hyperedges;
attr's values will override attr_dict's values
if both are provided.
:returns: list -- the IDs of the hyperedges added in the order
specified by the hyperedges container's iterator.
See also:
add_hyperedge
Examples:
::
>>> H = DirectedHypergraph()
>>> xyz = hyperedge_list = ((["A", "B"], ["C", "D"]),
(("A", "C"), ("B"), {'weight': 2}),
(set(["D"]), set(["A", "C"])))
>>> H.add_hyperedges(hyperedge_list) | [
"Adds",
"multiple",
"hyperedges",
"to",
"the",
"graph",
"along",
"with",
"any",
"related",
"attributes",
"of",
"the",
"hyperedges",
".",
"If",
"any",
"node",
"in",
"the",
"tail",
"or",
"head",
"of",
"any",
"hyperedge",
"has",
"not",
"previously",
"been",
"added",
"to",
"the",
"hypergraph",
"it",
"will",
"automatically",
"be",
"added",
"here",
".",
"Hyperedges",
"without",
"a",
"weight",
"attribute",
"specified",
"will",
"be",
"assigned",
"the",
"default",
"value",
"of",
"1",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L550-L606 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_hyperedge_id | def get_hyperedge_id(self, tail, head):
"""From a tail and head set of nodes, returns the ID of the hyperedge
that these sets comprise.
:param tail: iterable container of references to nodes in the
tail of the hyperedge to be added
:param head: iterable container of references to nodes in the
head of the hyperedge to be added
:returns: str -- ID of the hyperedge that the specified
tail and head sets comprise.
:raises: ValueError -- No such hyperedge exists.
Examples:
::
>>> H = DirectedHypergraph()
>>> hyperedge_list = ((["A"], ["B", "C"]),
(("A", "B"), ("C"), {'weight': 2}),
(set(["B"]), set(["A", "C"])))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> x = H.get_hyperedge_id(["A"], ["B", "C"])
"""
frozen_tail = frozenset(tail)
frozen_head = frozenset(head)
if not self.has_hyperedge(frozen_tail, frozen_head):
raise ValueError("No such hyperedge exists.")
return self._successors[frozen_tail][frozen_head] | python | def get_hyperedge_id(self, tail, head):
"""From a tail and head set of nodes, returns the ID of the hyperedge
that these sets comprise.
:param tail: iterable container of references to nodes in the
tail of the hyperedge to be added
:param head: iterable container of references to nodes in the
head of the hyperedge to be added
:returns: str -- ID of the hyperedge that the specified
tail and head sets comprise.
:raises: ValueError -- No such hyperedge exists.
Examples:
::
>>> H = DirectedHypergraph()
>>> hyperedge_list = ((["A"], ["B", "C"]),
(("A", "B"), ("C"), {'weight': 2}),
(set(["B"]), set(["A", "C"])))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> x = H.get_hyperedge_id(["A"], ["B", "C"])
"""
frozen_tail = frozenset(tail)
frozen_head = frozenset(head)
if not self.has_hyperedge(frozen_tail, frozen_head):
raise ValueError("No such hyperedge exists.")
return self._successors[frozen_tail][frozen_head] | [
"def",
"get_hyperedge_id",
"(",
"self",
",",
"tail",
",",
"head",
")",
":",
"frozen_tail",
"=",
"frozenset",
"(",
"tail",
")",
"frozen_head",
"=",
"frozenset",
"(",
"head",
")",
"if",
"not",
"self",
".",
"has_hyperedge",
"(",
"frozen_tail",
",",
"frozen_head",
")",
":",
"raise",
"ValueError",
"(",
"\"No such hyperedge exists.\"",
")",
"return",
"self",
".",
"_successors",
"[",
"frozen_tail",
"]",
"[",
"frozen_head",
"]"
] | From a tail and head set of nodes, returns the ID of the hyperedge
that these sets comprise.
:param tail: iterable container of references to nodes in the
tail of the hyperedge to be added
:param head: iterable container of references to nodes in the
head of the hyperedge to be added
:returns: str -- ID of the hyperedge that the specified
tail and head sets comprise.
:raises: ValueError -- No such hyperedge exists.
Examples:
::
>>> H = DirectedHypergraph()
>>> hyperedge_list = ((["A"], ["B", "C"]),
(("A", "B"), ("C"), {'weight': 2}),
(set(["B"]), set(["A", "C"])))
>>> hyperedge_ids = H.add_hyperedges(hyperedge_list)
>>> x = H.get_hyperedge_id(["A"], ["B", "C"]) | [
"From",
"a",
"tail",
"and",
"head",
"set",
"of",
"nodes",
"returns",
"the",
"ID",
"of",
"the",
"hyperedge",
"that",
"these",
"sets",
"comprise",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L724-L753 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_forward_star | def get_forward_star(self, node):
"""Given a node, get a copy of that node's forward star.
:param node: node to retrieve the forward-star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's forward star.
:raises: ValueError -- No such node exists.
"""
if node not in self._node_attributes:
raise ValueError("No such node exists.")
return self._forward_star[node].copy() | python | def get_forward_star(self, node):
"""Given a node, get a copy of that node's forward star.
:param node: node to retrieve the forward-star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's forward star.
:raises: ValueError -- No such node exists.
"""
if node not in self._node_attributes:
raise ValueError("No such node exists.")
return self._forward_star[node].copy() | [
"def",
"get_forward_star",
"(",
"self",
",",
"node",
")",
":",
"if",
"node",
"not",
"in",
"self",
".",
"_node_attributes",
":",
"raise",
"ValueError",
"(",
"\"No such node exists.\"",
")",
"return",
"self",
".",
"_forward_star",
"[",
"node",
"]",
".",
"copy",
"(",
")"
] | Given a node, get a copy of that node's forward star.
:param node: node to retrieve the forward-star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's forward star.
:raises: ValueError -- No such node exists. | [
"Given",
"a",
"node",
"get",
"a",
"copy",
"of",
"that",
"node",
"s",
"forward",
"star",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L833-L844 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_backward_star | def get_backward_star(self, node):
"""Given a node, get a copy of that node's backward star.
:param node: node to retrieve the backward-star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's backward star.
:raises: ValueError -- No such node exists.
"""
if node not in self._node_attributes:
raise ValueError("No such node exists.")
return self._backward_star[node].copy() | python | def get_backward_star(self, node):
"""Given a node, get a copy of that node's backward star.
:param node: node to retrieve the backward-star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's backward star.
:raises: ValueError -- No such node exists.
"""
if node not in self._node_attributes:
raise ValueError("No such node exists.")
return self._backward_star[node].copy() | [
"def",
"get_backward_star",
"(",
"self",
",",
"node",
")",
":",
"if",
"node",
"not",
"in",
"self",
".",
"_node_attributes",
":",
"raise",
"ValueError",
"(",
"\"No such node exists.\"",
")",
"return",
"self",
".",
"_backward_star",
"[",
"node",
"]",
".",
"copy",
"(",
")"
] | Given a node, get a copy of that node's backward star.
:param node: node to retrieve the backward-star of.
:returns: set -- set of hyperedge_ids for the hyperedges
in the node's backward star.
:raises: ValueError -- No such node exists. | [
"Given",
"a",
"node",
"get",
"a",
"copy",
"of",
"that",
"node",
"s",
"backward",
"star",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L846-L857 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_successors | def get_successors(self, tail):
"""Given a tail set of nodes, get a list of edges of which the node
set is the tail of each edge.
:param tail: set of nodes that correspond to the tails of some
(possibly empty) set of edges.
:returns: set -- hyperedge_ids of the hyperedges that have the
given node set as their tail.
"""
frozen_tail = frozenset(tail)
# If this node set isn't any tail in the hypergraph, then it has
# no successors; thus, return an empty set
if frozen_tail not in self._successors:
return set()
return set(self._successors[frozen_tail].values()) | python | def get_successors(self, tail):
"""Given a tail set of nodes, get a list of edges of which the node
set is the tail of each edge.
:param tail: set of nodes that correspond to the tails of some
(possibly empty) set of edges.
:returns: set -- hyperedge_ids of the hyperedges that have the
given node set as their tail.
"""
frozen_tail = frozenset(tail)
# If this node set isn't any tail in the hypergraph, then it has
# no successors; thus, return an empty set
if frozen_tail not in self._successors:
return set()
return set(self._successors[frozen_tail].values()) | [
"def",
"get_successors",
"(",
"self",
",",
"tail",
")",
":",
"frozen_tail",
"=",
"frozenset",
"(",
"tail",
")",
"# If this node set isn't any tail in the hypergraph, then it has",
"# no successors; thus, return an empty set",
"if",
"frozen_tail",
"not",
"in",
"self",
".",
"_successors",
":",
"return",
"set",
"(",
")",
"return",
"set",
"(",
"self",
".",
"_successors",
"[",
"frozen_tail",
"]",
".",
"values",
"(",
")",
")"
] | Given a tail set of nodes, get a list of edges of which the node
set is the tail of each edge.
:param tail: set of nodes that correspond to the tails of some
(possibly empty) set of edges.
:returns: set -- hyperedge_ids of the hyperedges that have the
given node set as their tail.
"Given",
"a",
"tail",
"set",
"of",
"nodes",
"get",
"a",
"list",
"of",
"edges",
"of",
"which",
"the",
"node",
"set",
"is",
"the",
"tail",
"of",
"each",
"edge",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L859-L875 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_predecessors | def get_predecessors(self, head):
"""Given a head set of nodes, get a list of edges of which the node set
is the head of each edge.
:param head: set of nodes that correspond to the heads of some
(possibly empty) set of edges.
:returns: set -- hyperedge_ids of the hyperedges that have the
given node set as their head.
"""
frozen_head = frozenset(head)
# If this node set isn't any head in the hypergraph, then it has
# no predecessors; thus, return an empty list
if frozen_head not in self._predecessors:
return set()
return set(self._predecessors[frozen_head].values()) | python | def get_predecessors(self, head):
"""Given a head set of nodes, get a list of edges of which the node set
is the head of each edge.
:param head: set of nodes that correspond to the heads of some
(possibly empty) set of edges.
:returns: set -- hyperedge_ids of the hyperedges that have the
given node set as their head.
"""
frozen_head = frozenset(head)
# If this node set isn't any head in the hypergraph, then it has
# no predecessors; thus, return an empty list
if frozen_head not in self._predecessors:
return set()
return set(self._predecessors[frozen_head].values()) | [
"def",
"get_predecessors",
"(",
"self",
",",
"head",
")",
":",
"frozen_head",
"=",
"frozenset",
"(",
"head",
")",
"# If this node set isn't any head in the hypergraph, then it has",
"# no predecessors; thus, return an empty set",
"if",
"frozen_head",
"not",
"in",
"self",
".",
"_predecessors",
":",
"return",
"set",
"(",
")",
"return",
"set",
"(",
"self",
".",
"_predecessors",
"[",
"frozen_head",
"]",
".",
"values",
"(",
")",
")"
] | Given a head set of nodes, get a list of edges of which the node set
is the head of each edge.
:param head: set of nodes that correspond to the heads of some
(possibly empty) set of edges.
:returns: set -- hyperedge_ids of the hyperedges that have the
given node set as their head.
"Given",
"a",
"head",
"set",
"of",
"nodes",
"get",
"a",
"list",
"of",
"edges",
"of",
"which",
"the",
"node",
"set",
"is",
"the",
"head",
"of",
"each",
"edge",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L877-L892 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.is_BF_hypergraph | def is_BF_hypergraph(self):
"""Indicates whether the hypergraph is a BF-hypergraph.
A BF-hypergraph consists of only B-hyperedges and F-hyperedges.
See "is_B_hypergraph" or "is_F_hypergraph" for more details.
:returns: bool -- True iff the hypergraph is a BF-hypergraph.
"""
for hyperedge_id in self._hyperedge_attributes:
tail = self.get_hyperedge_tail(hyperedge_id)
head = self.get_hyperedge_head(hyperedge_id)
if len(tail) > 1 and len(head) > 1:
return False
return True | python | def is_BF_hypergraph(self):
"""Indicates whether the hypergraph is a BF-hypergraph.
A BF-hypergraph consists of only B-hyperedges and F-hyperedges.
See "is_B_hypergraph" or "is_F_hypergraph" for more details.
:returns: bool -- True iff the hypergraph is a BF-hypergraph.
"""
for hyperedge_id in self._hyperedge_attributes:
tail = self.get_hyperedge_tail(hyperedge_id)
head = self.get_hyperedge_head(hyperedge_id)
if len(tail) > 1 and len(head) > 1:
return False
return True | [
"def",
"is_BF_hypergraph",
"(",
"self",
")",
":",
"for",
"hyperedge_id",
"in",
"self",
".",
"_hyperedge_attributes",
":",
"tail",
"=",
"self",
".",
"get_hyperedge_tail",
"(",
"hyperedge_id",
")",
"head",
"=",
"self",
".",
"get_hyperedge_head",
"(",
"hyperedge_id",
")",
"if",
"len",
"(",
"tail",
")",
">",
"1",
"and",
"len",
"(",
"head",
")",
">",
"1",
":",
"return",
"False",
"return",
"True"
] | Indicates whether the hypergraph is a BF-hypergraph.
A BF-hypergraph consists of only B-hyperedges and F-hyperedges.
See "is_B_hypergraph" or "is_F_hypergraph" for more details.
:returns: bool -- True iff the hypergraph is a BF-hypergraph. | [
"Indicates",
"whether",
"the",
"hypergraph",
"is",
"a",
"BF",
"-",
"hypergraph",
".",
"A",
"BF",
"-",
"hypergraph",
"consists",
"of",
"only",
"B",
"-",
"hyperedges",
"and",
"F",
"-",
"hyperedges",
".",
"See",
"is_B_hypergraph",
"or",
"is_F_hypergraph",
"for",
"more",
"details",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L928-L941 | train |
Murali-group/halp | halp/directed_hypergraph.py | DirectedHypergraph.get_induced_subhypergraph | def get_induced_subhypergraph(self, nodes):
"""Gives a new hypergraph that is the subhypergraph of the current
hypergraph induced by the provided set of nodes. That is, the induced
subhypergraph's node set corresponds precisely to the nodes provided,
and the corresponding hyperedges in the subhypergraph are only those
from the original hypergraph whose tail and head sets are subsets
of the provided nodes.
:param nodes: the set of nodes to find the induced subhypergraph of.
:returns: DirectedHypergraph -- the subhypergraph induced on the
provided nodes.
"""
sub_H = self.copy()
sub_H.remove_nodes(sub_H.get_node_set() - set(nodes))
return sub_H | python | def get_induced_subhypergraph(self, nodes):
"""Gives a new hypergraph that is the subhypergraph of the current
hypergraph induced by the provided set of nodes. That is, the induced
subhypergraph's node set corresponds precisely to the nodes provided,
and the corresponding hyperedges in the subhypergraph are only those
from the original hypergraph whose tail and head sets are subsets
of the provided nodes.
:param nodes: the set of nodes to find the induced subhypergraph of.
:returns: DirectedHypergraph -- the subhypergraph induced on the
provided nodes.
"""
sub_H = self.copy()
sub_H.remove_nodes(sub_H.get_node_set() - set(nodes))
return sub_H | [
"def",
"get_induced_subhypergraph",
"(",
"self",
",",
"nodes",
")",
":",
"sub_H",
"=",
"self",
".",
"copy",
"(",
")",
"sub_H",
".",
"remove_nodes",
"(",
"sub_H",
".",
"get_node_set",
"(",
")",
"-",
"set",
"(",
"nodes",
")",
")",
"return",
"sub_H"
] | Gives a new hypergraph that is the subhypergraph of the current
hypergraph induced by the provided set of nodes. That is, the induced
subhypergraph's node set corresponds precisely to the nodes provided,
and the corresponding hyperedges in the subhypergraph are only those
from the original hypergraph whose tail and head sets are subsets
of the provided nodes.
:param nodes: the set of nodes to find the induced subhypergraph of.
:returns: DirectedHypergraph -- the subhypergraph induced on the
provided nodes. | [
"Gives",
"a",
"new",
"hypergraph",
"that",
"is",
"the",
"subhypergraph",
"of",
"the",
"current",
"hypergraph",
"induced",
"by",
"the",
"provided",
"set",
"of",
"nodes",
".",
"That",
"is",
"the",
"induced",
"subhypergraph",
"s",
"node",
"set",
"corresponds",
"precisely",
"to",
"the",
"nodes",
"provided",
"and",
"the",
"corresponding",
"hyperedges",
"in",
"the",
"subhypergraph",
"are",
"only",
"those",
"from",
"the",
"original",
"graph",
"consist",
"of",
"tail",
"and",
"head",
"sets",
"that",
"are",
"subsets",
"of",
"the",
"provided",
"nodes",
"."
] | 6eb27466ba84e2281e18f93b62aae5efb21ef8b3 | https://github.com/Murali-group/halp/blob/6eb27466ba84e2281e18f93b62aae5efb21ef8b3/halp/directed_hypergraph.py#L1046-L1061 | train |
aio-libs/multidict | multidict/_multidict_py.py | _Base.getall | def getall(self, key, default=_marker):
"""Return a list of all values matching the key."""
identity = self._title(key)
res = [v for i, k, v in self._impl._items if i == identity]
if res:
return res
if not res and default is not _marker:
return default
raise KeyError('Key not found: %r' % key) | python | def getall(self, key, default=_marker):
"""Return a list of all values matching the key."""
identity = self._title(key)
res = [v for i, k, v in self._impl._items if i == identity]
if res:
return res
if not res and default is not _marker:
return default
raise KeyError('Key not found: %r' % key) | [
"def",
"getall",
"(",
"self",
",",
"key",
",",
"default",
"=",
"_marker",
")",
":",
"identity",
"=",
"self",
".",
"_title",
"(",
"key",
")",
"res",
"=",
"[",
"v",
"for",
"i",
",",
"k",
",",
"v",
"in",
"self",
".",
"_impl",
".",
"_items",
"if",
"i",
"==",
"identity",
"]",
"if",
"res",
":",
"return",
"res",
"if",
"not",
"res",
"and",
"default",
"is",
"not",
"_marker",
":",
"return",
"default",
"raise",
"KeyError",
"(",
"'Key not found: %r'",
"%",
"key",
")"
] | Return a list of all values matching the key. | [
"Return",
"a",
"list",
"of",
"all",
"values",
"matching",
"the",
"key",
"."
] | 1ecfa942cf6ae79727711a109e1f46ed24fae07f | https://github.com/aio-libs/multidict/blob/1ecfa942cf6ae79727711a109e1f46ed24fae07f/multidict/_multidict_py.py#L64-L72 | train |
aio-libs/multidict | multidict/_multidict_py.py | MultiDict.extend | def extend(self, *args, **kwargs):
"""Extend current MultiDict with more values.
This method must be used instead of update.
"""
self._extend(args, kwargs, 'extend', self._extend_items) | python | def extend(self, *args, **kwargs):
"""Extend current MultiDict with more values.
This method must be used instead of update.
"""
self._extend(args, kwargs, 'extend', self._extend_items) | [
"def",
"extend",
"(",
"self",
",",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"self",
".",
"_extend",
"(",
"args",
",",
"kwargs",
",",
"'extend'",
",",
"self",
".",
"_extend_items",
")"
] | Extend current MultiDict with more values.
This method must be used instead of update. | [
"Extend",
"current",
"MultiDict",
"with",
"more",
"values",
"."
] | 1ecfa942cf6ae79727711a109e1f46ed24fae07f | https://github.com/aio-libs/multidict/blob/1ecfa942cf6ae79727711a109e1f46ed24fae07f/multidict/_multidict_py.py#L218-L223 | train |
aio-libs/multidict | multidict/_multidict_py.py | MultiDict.setdefault | def setdefault(self, key, default=None):
"""Return value for key, set value to default if key is not present."""
identity = self._title(key)
for i, k, v in self._impl._items:
if i == identity:
return v
self.add(key, default)
return default | python | def setdefault(self, key, default=None):
"""Return value for key, set value to default if key is not present."""
identity = self._title(key)
for i, k, v in self._impl._items:
if i == identity:
return v
self.add(key, default)
return default | [
"def",
"setdefault",
"(",
"self",
",",
"key",
",",
"default",
"=",
"None",
")",
":",
"identity",
"=",
"self",
".",
"_title",
"(",
"key",
")",
"for",
"i",
",",
"k",
",",
"v",
"in",
"self",
".",
"_impl",
".",
"_items",
":",
"if",
"i",
"==",
"identity",
":",
"return",
"v",
"self",
".",
"add",
"(",
"key",
",",
"default",
")",
"return",
"default"
] | Return value for key, set value to default if key is not present. | [
"Return",
"value",
"for",
"key",
"set",
"value",
"to",
"default",
"if",
"key",
"is",
"not",
"present",
"."
] | 1ecfa942cf6ae79727711a109e1f46ed24fae07f | https://github.com/aio-libs/multidict/blob/1ecfa942cf6ae79727711a109e1f46ed24fae07f/multidict/_multidict_py.py#L281-L288 | train |
aio-libs/multidict | multidict/_multidict_py.py | MultiDict.popall | def popall(self, key, default=_marker):
"""Remove all occurrences of key and return the list of corresponding
values.
If key is not found, default is returned if given, otherwise
KeyError is raised.
"""
found = False
identity = self._title(key)
ret = []
for i in range(len(self._impl._items)-1, -1, -1):
item = self._impl._items[i]
if item[0] == identity:
ret.append(item[2])
del self._impl._items[i]
self._impl.incr_version()
found = True
if not found:
if default is _marker:
raise KeyError(key)
else:
return default
else:
ret.reverse()
return ret | python | def popall(self, key, default=_marker):
"""Remove all occurrences of key and return the list of corresponding
values.
If key is not found, default is returned if given, otherwise
KeyError is raised.
"""
found = False
identity = self._title(key)
ret = []
for i in range(len(self._impl._items)-1, -1, -1):
item = self._impl._items[i]
if item[0] == identity:
ret.append(item[2])
del self._impl._items[i]
self._impl.incr_version()
found = True
if not found:
if default is _marker:
raise KeyError(key)
else:
return default
else:
ret.reverse()
return ret | [
"def",
"popall",
"(",
"self",
",",
"key",
",",
"default",
"=",
"_marker",
")",
":",
"found",
"=",
"False",
"identity",
"=",
"self",
".",
"_title",
"(",
"key",
")",
"ret",
"=",
"[",
"]",
"for",
"i",
"in",
"range",
"(",
"len",
"(",
"self",
".",
"_impl",
".",
"_items",
")",
"-",
"1",
",",
"-",
"1",
",",
"-",
"1",
")",
":",
"item",
"=",
"self",
".",
"_impl",
".",
"_items",
"[",
"i",
"]",
"if",
"item",
"[",
"0",
"]",
"==",
"identity",
":",
"ret",
".",
"append",
"(",
"item",
"[",
"2",
"]",
")",
"del",
"self",
".",
"_impl",
".",
"_items",
"[",
"i",
"]",
"self",
".",
"_impl",
".",
"incr_version",
"(",
")",
"found",
"=",
"True",
"if",
"not",
"found",
":",
"if",
"default",
"is",
"_marker",
":",
"raise",
"KeyError",
"(",
"key",
")",
"else",
":",
"return",
"default",
"else",
":",
"ret",
".",
"reverse",
"(",
")",
"return",
"ret"
] | Remove all occurrences of key and return the list of corresponding
values.
If key is not found, default is returned if given, otherwise
KeyError is raised. | [
"Remove",
"all",
"occurrences",
"of",
"key",
"and",
"return",
"the",
"list",
"of",
"corresponding",
"values",
"."
] | 1ecfa942cf6ae79727711a109e1f46ed24fae07f | https://github.com/aio-libs/multidict/blob/1ecfa942cf6ae79727711a109e1f46ed24fae07f/multidict/_multidict_py.py#L311-L336 | train |
rootpy/rootpy | rootpy/stats/histfactory/histfactory.py | Data.total | def total(self, xbin1=1, xbin2=-2):
"""
Return the total yield and its associated statistical uncertainty.
"""
return self.hist.integral(xbin1=xbin1, xbin2=xbin2, error=True) | python | def total(self, xbin1=1, xbin2=-2):
"""
Return the total yield and its associated statistical uncertainty.
"""
return self.hist.integral(xbin1=xbin1, xbin2=xbin2, error=True) | [
"def",
"total",
"(",
"self",
",",
"xbin1",
"=",
"1",
",",
"xbin2",
"=",
"-",
"2",
")",
":",
"return",
"self",
".",
"hist",
".",
"integral",
"(",
"xbin1",
"=",
"xbin1",
",",
"xbin2",
"=",
"xbin2",
",",
"error",
"=",
"True",
")"
] | Return the total yield and its associated statistical uncertainty. | [
"Return",
"the",
"total",
"yield",
"and",
"its",
"associated",
"statistical",
"uncertainty",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/histfactory.py#L133-L137 | train |
rootpy/rootpy | rootpy/stats/histfactory/histfactory.py | Sample.iter_sys | def iter_sys(self):
"""
Iterate over sys_name, overall_sys, histo_sys.
overall_sys or histo_sys may be None for any given sys_name.
"""
names = self.sys_names()
for name in names:
osys = self.GetOverallSys(name)
hsys = self.GetHistoSys(name)
yield name, osys, hsys | python | def iter_sys(self):
"""
Iterate over sys_name, overall_sys, histo_sys.
overall_sys or histo_sys may be None for any given sys_name.
"""
names = self.sys_names()
for name in names:
osys = self.GetOverallSys(name)
hsys = self.GetHistoSys(name)
yield name, osys, hsys | [
"def",
"iter_sys",
"(",
"self",
")",
":",
"names",
"=",
"self",
".",
"sys_names",
"(",
")",
"for",
"name",
"in",
"names",
":",
"osys",
"=",
"self",
".",
"GetOverallSys",
"(",
"name",
")",
"hsys",
"=",
"self",
".",
"GetHistoSys",
"(",
"name",
")",
"yield",
"name",
",",
"osys",
",",
"hsys"
] | Iterate over sys_name, overall_sys, histo_sys.
overall_sys or histo_sys may be None for any given sys_name. | [
"Iterate",
"over",
"sys_name",
"overall_sys",
"histo_sys",
".",
"overall_sys",
"or",
"histo_sys",
"may",
"be",
"None",
"for",
"any",
"given",
"sys_name",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/histfactory.py#L260-L269 | train |
rootpy/rootpy | rootpy/stats/histfactory/histfactory.py | Sample.sys_hist | def sys_hist(self, name=None):
"""
Return the effective low and high histogram for a given systematic.
If this sample does not contain the named systematic then return
the nominal histogram for both low and high variations.
"""
if name is None:
low = self.hist.Clone(shallow=True)
high = self.hist.Clone(shallow=True)
return low, high
osys = self.GetOverallSys(name)
hsys = self.GetHistoSys(name)
if osys is None:
osys_high, osys_low = 1., 1.
else:
osys_high, osys_low = osys.high, osys.low
if hsys is None:
hsys_high = self.hist.Clone(shallow=True)
hsys_low = self.hist.Clone(shallow=True)
else:
hsys_high = hsys.high.Clone(shallow=True)
hsys_low = hsys.low.Clone(shallow=True)
return hsys_low * osys_low, hsys_high * osys_high | python | def sys_hist(self, name=None):
"""
Return the effective low and high histogram for a given systematic.
If this sample does not contain the named systematic then return
the nominal histogram for both low and high variations.
"""
if name is None:
low = self.hist.Clone(shallow=True)
high = self.hist.Clone(shallow=True)
return low, high
osys = self.GetOverallSys(name)
hsys = self.GetHistoSys(name)
if osys is None:
osys_high, osys_low = 1., 1.
else:
osys_high, osys_low = osys.high, osys.low
if hsys is None:
hsys_high = self.hist.Clone(shallow=True)
hsys_low = self.hist.Clone(shallow=True)
else:
hsys_high = hsys.high.Clone(shallow=True)
hsys_low = hsys.low.Clone(shallow=True)
return hsys_low * osys_low, hsys_high * osys_high | [
"def",
"sys_hist",
"(",
"self",
",",
"name",
"=",
"None",
")",
":",
"if",
"name",
"is",
"None",
":",
"low",
"=",
"self",
".",
"hist",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"high",
"=",
"self",
".",
"hist",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"return",
"low",
",",
"high",
"osys",
"=",
"self",
".",
"GetOverallSys",
"(",
"name",
")",
"hsys",
"=",
"self",
".",
"GetHistoSys",
"(",
"name",
")",
"if",
"osys",
"is",
"None",
":",
"osys_high",
",",
"osys_low",
"=",
"1.",
",",
"1.",
"else",
":",
"osys_high",
",",
"osys_low",
"=",
"osys",
".",
"high",
",",
"osys",
".",
"low",
"if",
"hsys",
"is",
"None",
":",
"hsys_high",
"=",
"self",
".",
"hist",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"hsys_low",
"=",
"self",
".",
"hist",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"else",
":",
"hsys_high",
"=",
"hsys",
".",
"high",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"hsys_low",
"=",
"hsys",
".",
"low",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"return",
"hsys_low",
"*",
"osys_low",
",",
"hsys_high",
"*",
"osys_high"
] | Return the effective low and high histogram for a given systematic.
If this sample does not contain the named systematic then return
the nominal histogram for both low and high variations. | [
"Return",
"the",
"effective",
"low",
"and",
"high",
"histogram",
"for",
"a",
"given",
"systematic",
".",
"If",
"this",
"sample",
"does",
"not",
"contain",
"the",
"named",
"systematic",
"then",
"return",
"the",
"nominal",
"histogram",
"for",
"both",
"low",
"and",
"high",
"variations",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/histfactory.py#L271-L293 | train |
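`Sample.sys_hist` composes the two systematic flavours: the shape variation (`HistoSys`) is scaled bin-by-bin by the normalization factor (`OverallSys`), and either piece defaults to nominal (factor 1, nominal shape) when absent. A list-based sketch with illustrative names:

```python
def sys_variation(nominal, osys=None, hsys=None):
    """Return (low, high) bin lists for one systematic.
    osys: (low_factor, high_factor) or None
    hsys: (low_bins, high_bins) or None"""
    o_low, o_high = osys if osys is not None else (1.0, 1.0)
    if hsys is not None:
        h_low, h_high = hsys
    else:
        # no shape variation: fall back to the nominal bins
        h_low, h_high = list(nominal), list(nominal)
    return ([b * o_low for b in h_low],
            [b * o_high for b in h_high])

nom = [10.0, 20.0]
low, high = sys_variation(nom,
                          osys=(0.5, 2.0),
                          hsys=([8.0, 18.0], [12.0, 22.0]))
```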
rootpy/rootpy | rootpy/stats/histfactory/histfactory.py | Channel.sys_hist | def sys_hist(self, name=None, where=None):
"""
Return the effective total low and high histogram for a given
systematic over samples in this channel.
If a sample does not contain the named systematic then its nominal
histogram is used for both low and high variations.
Parameters
----------
name : string, optional (default=None)
The systematic name otherwise nominal if None
where : callable, optional (default=None)
A callable taking one argument: the sample, and returns True if
this sample should be included in the total.
Returns
-------
total_low, total_high : histograms
The total low and high histograms for this systematic
"""
total_low, total_high = None, None
for sample in self.samples:
if where is not None and not where(sample):
continue
low, high = sample.sys_hist(name)
if total_low is None:
total_low = low.Clone(shallow=True)
else:
total_low += low
if total_high is None:
total_high = high.Clone(shallow=True)
else:
total_high += high
return total_low, total_high | python | def sys_hist(self, name=None, where=None):
"""
Return the effective total low and high histogram for a given
systematic over samples in this channel.
If a sample does not contain the named systematic then its nominal
histogram is used for both low and high variations.
Parameters
----------
name : string, optional (default=None)
The systematic name otherwise nominal if None
where : callable, optional (default=None)
A callable taking one argument: the sample, and returns True if
this sample should be included in the total.
Returns
-------
total_low, total_high : histograms
The total low and high histograms for this systematic
"""
total_low, total_high = None, None
for sample in self.samples:
if where is not None and not where(sample):
continue
low, high = sample.sys_hist(name)
if total_low is None:
total_low = low.Clone(shallow=True)
else:
total_low += low
if total_high is None:
total_high = high.Clone(shallow=True)
else:
total_high += high
return total_low, total_high | [
"def",
"sys_hist",
"(",
"self",
",",
"name",
"=",
"None",
",",
"where",
"=",
"None",
")",
":",
"total_low",
",",
"total_high",
"=",
"None",
",",
"None",
"for",
"sample",
"in",
"self",
".",
"samples",
":",
"if",
"where",
"is",
"not",
"None",
"and",
"not",
"where",
"(",
"sample",
")",
":",
"continue",
"low",
",",
"high",
"=",
"sample",
".",
"sys_hist",
"(",
"name",
")",
"if",
"total_low",
"is",
"None",
":",
"total_low",
"=",
"low",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"else",
":",
"total_low",
"+=",
"low",
"if",
"total_high",
"is",
"None",
":",
"total_high",
"=",
"high",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"else",
":",
"total_high",
"+=",
"high",
"return",
"total_low",
",",
"total_high"
] | Return the effective total low and high histogram for a given
systematic over samples in this channel.
If a sample does not contain the named systematic then its nominal
histogram is used for both low and high variations.
Parameters
----------
name : string, optional (default=None)
The systematic name otherwise nominal if None
where : callable, optional (default=None)
A callable taking one argument: the sample, and returns True if
this sample should be included in the total.
Returns
-------
total_low, total_high : histograms
The total low and high histograms for this systematic | [
"Return",
"the",
"effective",
"total",
"low",
"and",
"high",
"histogram",
"for",
"a",
"given",
"systematic",
"over",
"samples",
"in",
"this",
"channel",
".",
"If",
"a",
"sample",
"does",
"not",
"contain",
"the",
"named",
"systematic",
"then",
"its",
"nominal",
"histogram",
"is",
"used",
"for",
"both",
"low",
"and",
"high",
"variations",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/histfactory.py#L894-L931 | train |
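`Channel.sys_hist` then accumulates those per-sample low/high pairs across the channel, with an optional `where` predicate to filter samples; a sample without the named systematic contributes its nominal bins to both totals. Sketch (the dict-based sample layout is an assumption for illustration):

```python
def channel_sys_hist(samples, name, where=None):
    """Sum per-sample (low, high) variations over a channel.
    Each sample is {'hist': bins, 'sys': {name: (low, high)}}."""
    total_low, total_high = None, None
    for sample in samples:
        if where is not None and not where(sample):
            continue
        low, high = sample['sys'].get(name, (sample['hist'], sample['hist']))
        if total_low is None:
            total_low, total_high = list(low), list(high)
        else:
            total_low = [a + b for a, b in zip(total_low, low)]
            total_high = [a + b for a, b in zip(total_high, high)]
    return total_low, total_high

chan = [
    {'name': 'sig', 'hist': [1.0, 2.0],
     'sys': {'jes': ([0.5, 1.5], [1.5, 2.5])}},
    {'name': 'bkg', 'hist': [3.0, 4.0], 'sys': {}},  # nominal used twice
]
lo, hi = channel_sys_hist(chan, 'jes')
bkg_only, _ = channel_sys_hist(chan, 'jes',
                               where=lambda s: s['name'] == 'bkg')
```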
rootpy/rootpy | rootpy/stats/histfactory/histfactory.py | Channel.apply_snapshot | def apply_snapshot(self, argset):
"""
Create a clone of this Channel where histograms are modified according
to the values of the nuisance parameters in the snapshot. This is
useful when creating post-fit distribution plots.
Parameters
----------
argset : RooArgSet
A RooArgSet of RooRealVar nuisance parameters
Returns
-------
channel : Channel
The modified channel
"""
clone = self.Clone()
args = [var for var in argset if not (
var.name.startswith('binWidth_obs_x_') or
var.name.startswith('gamma_stat') or
var.name.startswith('nom_'))]
# handle NormFactors first
nargs = []
for var in args:
is_norm = False
name = var.name.replace('alpha_', '')
for sample in clone.samples:
if sample.GetNormFactor(name) is not None:
log.info("applying snapshot of {0} on sample {1}".format(
name, sample.name))
is_norm = True
# scale the entire sample
sample *= var.value
# add an OverallSys for the error
osys = OverallSys(name,
low=1. - var.error / var.value,
high=1. + var.error / var.value)
sample.AddOverallSys(osys)
# remove the NormFactor
sample.RemoveNormFactor(name)
if not is_norm:
nargs.append(var)
# modify the nominal shape and systematics
for sample in clone.samples:
# check that hist is not NULL
if sample.hist is None:
raise RuntimeError(
"sample {0} does not have a "
"nominal histogram".format(sample.name))
nominal = sample.hist.Clone(shallow=True)
for var in nargs:
name = var.name.replace('alpha_', '')
if not sample.has_sys(name):
continue
log.info("applying snapshot of {0} on sample {1}".format(
name, sample.name))
low, high = sample.sys_hist(name)
# modify nominal
val = var.value
if val > 0:
sample.hist += (high - nominal) * val
elif val < 0:
sample.hist += (nominal - low) * val
# TODO:
# modify OverallSys
# modify HistoSys
return clone | python | def apply_snapshot(self, argset):
"""
Create a clone of this Channel where histograms are modified according
to the values of the nuisance parameters in the snapshot. This is
useful when creating post-fit distribution plots.
Parameters
----------
argset : RooArgSet
A RooArgSet of RooRealVar nuisance parameters
Returns
-------
channel : Channel
The modified channel
"""
clone = self.Clone()
args = [var for var in argset if not (
var.name.startswith('binWidth_obs_x_') or
var.name.startswith('gamma_stat') or
var.name.startswith('nom_'))]
# handle NormFactors first
nargs = []
for var in args:
is_norm = False
name = var.name.replace('alpha_', '')
for sample in clone.samples:
if sample.GetNormFactor(name) is not None:
log.info("applying snapshot of {0} on sample {1}".format(
name, sample.name))
is_norm = True
# scale the entire sample
sample *= var.value
# add an OverallSys for the error
osys = OverallSys(name,
low=1. - var.error / var.value,
high=1. + var.error / var.value)
sample.AddOverallSys(osys)
# remove the NormFactor
sample.RemoveNormFactor(name)
if not is_norm:
nargs.append(var)
# modify the nominal shape and systematics
for sample in clone.samples:
# check that hist is not NULL
if sample.hist is None:
raise RuntimeError(
"sample {0} does not have a "
"nominal histogram".format(sample.name))
nominal = sample.hist.Clone(shallow=True)
for var in nargs:
name = var.name.replace('alpha_', '')
if not sample.has_sys(name):
continue
log.info("applying snapshot of {0} on sample {1}".format(
name, sample.name))
low, high = sample.sys_hist(name)
# modify nominal
val = var.value
if val > 0:
sample.hist += (high - nominal) * val
elif val < 0:
sample.hist += (nominal - low) * val
# TODO:
# modify OverallSys
# modify HistoSys
return clone | [
"def",
"apply_snapshot",
"(",
"self",
",",
"argset",
")",
":",
"clone",
"=",
"self",
".",
"Clone",
"(",
")",
"args",
"=",
"[",
"var",
"for",
"var",
"in",
"argset",
"if",
"not",
"(",
"var",
".",
"name",
".",
"startswith",
"(",
"'binWidth_obs_x_'",
")",
"or",
"var",
".",
"name",
".",
"startswith",
"(",
"'gamma_stat'",
")",
"or",
"var",
".",
"name",
".",
"startswith",
"(",
"'nom_'",
")",
")",
"]",
"# handle NormFactors first",
"nargs",
"=",
"[",
"]",
"for",
"var",
"in",
"args",
":",
"is_norm",
"=",
"False",
"name",
"=",
"var",
".",
"name",
".",
"replace",
"(",
"'alpha_'",
",",
"''",
")",
"for",
"sample",
"in",
"clone",
".",
"samples",
":",
"if",
"sample",
".",
"GetNormFactor",
"(",
"name",
")",
"is",
"not",
"None",
":",
"log",
".",
"info",
"(",
"\"applying snapshot of {0} on sample {1}\"",
".",
"format",
"(",
"name",
",",
"sample",
".",
"name",
")",
")",
"is_norm",
"=",
"True",
"# scale the entire sample",
"sample",
"*=",
"var",
".",
"value",
"# add an OverallSys for the error",
"osys",
"=",
"OverallSys",
"(",
"name",
",",
"low",
"=",
"1.",
"-",
"var",
".",
"error",
"/",
"var",
".",
"value",
",",
"high",
"=",
"1.",
"+",
"var",
".",
"error",
"/",
"var",
".",
"value",
")",
"sample",
".",
"AddOverallSys",
"(",
"osys",
")",
"# remove the NormFactor",
"sample",
".",
"RemoveNormFactor",
"(",
"name",
")",
"if",
"not",
"is_norm",
":",
"nargs",
".",
"append",
"(",
"var",
")",
"# modify the nominal shape and systematics",
"for",
"sample",
"in",
"clone",
".",
"samples",
":",
"# check that hist is not NULL",
"if",
"sample",
".",
"hist",
"is",
"None",
":",
"raise",
"RuntimeError",
"(",
"\"sample {0} does not have a \"",
"\"nominal histogram\"",
".",
"format",
"(",
"sample",
".",
"name",
")",
")",
"nominal",
"=",
"sample",
".",
"hist",
".",
"Clone",
"(",
"shallow",
"=",
"True",
")",
"for",
"var",
"in",
"nargs",
":",
"name",
"=",
"var",
".",
"name",
".",
"replace",
"(",
"'alpha_'",
",",
"''",
")",
"if",
"not",
"sample",
".",
"has_sys",
"(",
"name",
")",
":",
"continue",
"log",
".",
"info",
"(",
"\"applying snapshot of {0} on sample {1}\"",
".",
"format",
"(",
"name",
",",
"sample",
".",
"name",
")",
")",
"low",
",",
"high",
"=",
"sample",
".",
"sys_hist",
"(",
"name",
")",
"# modify nominal",
"val",
"=",
"var",
".",
"value",
"if",
"val",
">",
"0",
":",
"sample",
".",
"hist",
"+=",
"(",
"high",
"-",
"nominal",
")",
"*",
"val",
"elif",
"val",
"<",
"0",
":",
"sample",
".",
"hist",
"+=",
"(",
"nominal",
"-",
"low",
")",
"*",
"val",
"# TODO:",
"# modify OverallSys",
"# modify HistoSys",
"return",
"clone"
] | Create a clone of this Channel where histograms are modified according
to the values of the nuisance parameters in the snapshot. This is
useful when creating post-fit distribution plots.
Parameters
----------
argset : RooArgSet
A RooArgSet of RooRealVar nuisance parameters
Returns
-------
channel : Channel
The modified channel | [
"Create",
"a",
"clone",
"of",
"this",
"Channel",
"where",
"histograms",
"are",
"modified",
"according",
"to",
"the",
"values",
"of",
"the",
"nuisance",
"parameters",
"in",
"the",
"snapshot",
".",
"This",
"is",
"useful",
"when",
"creating",
"post",
"-",
"fit",
"distribution",
"plots",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/histfactory.py#L1040-L1109 | train |
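The nominal-shape correction inside `apply_snapshot` is a linear interpolation toward the ±1σ template, driven by the fitted pull: for α > 0 it adds α·(high − nominal), for α < 0 it adds α·(nominal − low). A bin-list sketch of just that step:

```python
def apply_pull(nominal, low, high, alpha):
    """Shift nominal bins by a fitted nuisance-parameter pull `alpha`,
    linearly toward the +1 sigma (alpha > 0) or -1 sigma (alpha < 0)
    variation, as in the per-sample loop above."""
    if alpha > 0:
        return [n + (h - n) * alpha for n, h in zip(nominal, high)]
    if alpha < 0:
        return [n + (n - l) * alpha for n, l in zip(nominal, low)]
    return list(nominal)

up = apply_pull([10.0, 20.0], [8.0, 18.0], [12.0, 24.0], 0.5)
down = apply_pull([10.0, 20.0], [8.0, 18.0], [12.0, 24.0], -1.0)
```

Note that α = −1 reproduces the low template exactly, and α = 0 leaves the nominal untouched.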
rootpy/rootpy | rootpy/extern/byteplay2/__init__.py | printcodelist | def printcodelist(codelist, to=sys.stdout):
"""Get a code list. Print it nicely."""
labeldict = {}
pendinglabels = []
for i, (op, arg) in enumerate(codelist):
if isinstance(op, Label):
pendinglabels.append(op)
elif op is SetLineno:
pass
else:
while pendinglabels:
labeldict[pendinglabels.pop()] = i
lineno = None
islabel = False
for i, (op, arg) in enumerate(codelist):
if op is SetLineno:
lineno = arg
print >> to
continue
if isinstance(op, Label):
islabel = True
continue
if lineno is None:
linenostr = ''
else:
linenostr = str(lineno)
lineno = None
if islabel:
islabelstr = '>>'
islabel = False
else:
islabelstr = ''
if op in hasconst:
argstr = repr(arg)
elif op in hasjump:
try:
argstr = 'to ' + str(labeldict[arg])
except KeyError:
argstr = repr(arg)
elif op in hasarg:
argstr = str(arg)
else:
argstr = ''
print >> to, '%3s %2s %4d %-20s %s' % (
linenostr,
islabelstr,
i,
op,
argstr) | python | def printcodelist(codelist, to=sys.stdout):
"""Get a code list. Print it nicely."""
labeldict = {}
pendinglabels = []
for i, (op, arg) in enumerate(codelist):
if isinstance(op, Label):
pendinglabels.append(op)
elif op is SetLineno:
pass
else:
while pendinglabels:
labeldict[pendinglabels.pop()] = i
lineno = None
islabel = False
for i, (op, arg) in enumerate(codelist):
if op is SetLineno:
lineno = arg
print >> to
continue
if isinstance(op, Label):
islabel = True
continue
if lineno is None:
linenostr = ''
else:
linenostr = str(lineno)
lineno = None
if islabel:
islabelstr = '>>'
islabel = False
else:
islabelstr = ''
if op in hasconst:
argstr = repr(arg)
elif op in hasjump:
try:
argstr = 'to ' + str(labeldict[arg])
except KeyError:
argstr = repr(arg)
elif op in hasarg:
argstr = str(arg)
else:
argstr = ''
print >> to, '%3s %2s %4d %-20s %s' % (
linenostr,
islabelstr,
i,
op,
argstr) | [
"def",
"printcodelist",
"(",
"codelist",
",",
"to",
"=",
"sys",
".",
"stdout",
")",
":",
"labeldict",
"=",
"{",
"}",
"pendinglabels",
"=",
"[",
"]",
"for",
"i",
",",
"(",
"op",
",",
"arg",
")",
"in",
"enumerate",
"(",
"codelist",
")",
":",
"if",
"isinstance",
"(",
"op",
",",
"Label",
")",
":",
"pendinglabels",
".",
"append",
"(",
"op",
")",
"elif",
"op",
"is",
"SetLineno",
":",
"pass",
"else",
":",
"while",
"pendinglabels",
":",
"labeldict",
"[",
"pendinglabels",
".",
"pop",
"(",
")",
"]",
"=",
"i",
"lineno",
"=",
"None",
"islabel",
"=",
"False",
"for",
"i",
",",
"(",
"op",
",",
"arg",
")",
"in",
"enumerate",
"(",
"codelist",
")",
":",
"if",
"op",
"is",
"SetLineno",
":",
"lineno",
"=",
"arg",
"print",
">>",
"to",
"continue",
"if",
"isinstance",
"(",
"op",
",",
"Label",
")",
":",
"islabel",
"=",
"True",
"continue",
"if",
"lineno",
"is",
"None",
":",
"linenostr",
"=",
"''",
"else",
":",
"linenostr",
"=",
"str",
"(",
"lineno",
")",
"lineno",
"=",
"None",
"if",
"islabel",
":",
"islabelstr",
"=",
"'>>'",
"islabel",
"=",
"False",
"else",
":",
"islabelstr",
"=",
"''",
"if",
"op",
"in",
"hasconst",
":",
"argstr",
"=",
"repr",
"(",
"arg",
")",
"elif",
"op",
"in",
"hasjump",
":",
"try",
":",
"argstr",
"=",
"'to '",
"+",
"str",
"(",
"labeldict",
"[",
"arg",
"]",
")",
"except",
"KeyError",
":",
"argstr",
"=",
"repr",
"(",
"arg",
")",
"elif",
"op",
"in",
"hasarg",
":",
"argstr",
"=",
"str",
"(",
"arg",
")",
"else",
":",
"argstr",
"=",
"''",
"print",
">>",
"to",
",",
"'%3s %2s %4d %-20s %s'",
"%",
"(",
"linenostr",
",",
"islabelstr",
",",
"i",
",",
"op",
",",
"argstr",
")"
] | Get a code list. Print it nicely. | [
"Get",
"a",
"code",
"list",
".",
"Print",
"it",
"nicely",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/extern/byteplay2/__init__.py#L785-L840 | train |
rootpy/rootpy | rootpy/extern/byteplay2/__init__.py | recompile | def recompile(filename):
"""Create a .pyc by disassembling the file and assembling it again, printing
a message that the reassembled file was loaded."""
# Most of the code here based on the compile.py module.
import os
import imp
import marshal
import struct
f = open(filename, 'U')
try:
timestamp = long(os.fstat(f.fileno()).st_mtime)
except AttributeError:
timestamp = long(os.stat(filename).st_mtime)
codestring = f.read()
f.close()
if codestring and codestring[-1] != '\n':
codestring = codestring + '\n'
try:
codeobject = compile(codestring, filename, 'exec')
except SyntaxError:
print >> sys.stderr, "Skipping %s - syntax error." % filename
return
cod = Code.from_code(codeobject)
message = "reassembled %r imported.\n" % filename
cod.code[:0] = [ # __import__('sys').stderr.write(message)
(LOAD_GLOBAL, '__import__'),
(LOAD_CONST, 'sys'),
(CALL_FUNCTION, 1),
(LOAD_ATTR, 'stderr'),
(LOAD_ATTR, 'write'),
(LOAD_CONST, message),
(CALL_FUNCTION, 1),
(POP_TOP, None),
]
codeobject2 = cod.to_code()
fc = open(filename+'c', 'wb')
fc.write('\0\0\0\0')
fc.write(struct.pack('<l', timestamp))
marshal.dump(codeobject2, fc)
fc.flush()
fc.seek(0, 0)
fc.write(imp.get_magic())
fc.close() | python | def recompile(filename):
"""Create a .pyc by disassembling the file and assembling it again, printing
a message that the reassembled file was loaded."""
# Most of the code here based on the compile.py module.
import os
import imp
import marshal
import struct
f = open(filename, 'U')
try:
timestamp = long(os.fstat(f.fileno()).st_mtime)
except AttributeError:
timestamp = long(os.stat(filename).st_mtime)
codestring = f.read()
f.close()
if codestring and codestring[-1] != '\n':
codestring = codestring + '\n'
try:
codeobject = compile(codestring, filename, 'exec')
except SyntaxError:
print >> sys.stderr, "Skipping %s - syntax error." % filename
return
cod = Code.from_code(codeobject)
message = "reassembled %r imported.\n" % filename
cod.code[:0] = [ # __import__('sys').stderr.write(message)
(LOAD_GLOBAL, '__import__'),
(LOAD_CONST, 'sys'),
(CALL_FUNCTION, 1),
(LOAD_ATTR, 'stderr'),
(LOAD_ATTR, 'write'),
(LOAD_CONST, message),
(CALL_FUNCTION, 1),
(POP_TOP, None),
]
codeobject2 = cod.to_code()
fc = open(filename+'c', 'wb')
fc.write('\0\0\0\0')
fc.write(struct.pack('<l', timestamp))
marshal.dump(codeobject2, fc)
fc.flush()
fc.seek(0, 0)
fc.write(imp.get_magic())
fc.close() | [
"def",
"recompile",
"(",
"filename",
")",
":",
"# Most of the code here based on the compile.py module.",
"import",
"os",
"import",
"imp",
"import",
"marshal",
"import",
"struct",
"f",
"=",
"open",
"(",
"filename",
",",
"'U'",
")",
"try",
":",
"timestamp",
"=",
"long",
"(",
"os",
".",
"fstat",
"(",
"f",
".",
"fileno",
"(",
")",
")",
".",
"st_mtime",
")",
"except",
"AttributeError",
":",
"timestamp",
"=",
"long",
"(",
"os",
".",
"stat",
"(",
"filename",
")",
".",
"st_mtime",
")",
"codestring",
"=",
"f",
".",
"read",
"(",
")",
"f",
".",
"close",
"(",
")",
"if",
"codestring",
"and",
"codestring",
"[",
"-",
"1",
"]",
"!=",
"'\\n'",
":",
"codestring",
"=",
"codestring",
"+",
"'\\n'",
"try",
":",
"codeobject",
"=",
"compile",
"(",
"codestring",
",",
"filename",
",",
"'exec'",
")",
"except",
"SyntaxError",
":",
"print",
">>",
"sys",
".",
"stderr",
",",
"\"Skipping %s - syntax error.\"",
"%",
"filename",
"return",
"cod",
"=",
"Code",
".",
"from_code",
"(",
"codeobject",
")",
"message",
"=",
"\"reassembled %r imported.\\n\"",
"%",
"filename",
"cod",
".",
"code",
"[",
":",
"0",
"]",
"=",
"[",
"# __import__('sys').stderr.write(message)",
"(",
"LOAD_GLOBAL",
",",
"'__import__'",
")",
",",
"(",
"LOAD_CONST",
",",
"'sys'",
")",
",",
"(",
"CALL_FUNCTION",
",",
"1",
")",
",",
"(",
"LOAD_ATTR",
",",
"'stderr'",
")",
",",
"(",
"LOAD_ATTR",
",",
"'write'",
")",
",",
"(",
"LOAD_CONST",
",",
"message",
")",
",",
"(",
"CALL_FUNCTION",
",",
"1",
")",
",",
"(",
"POP_TOP",
",",
"None",
")",
",",
"]",
"codeobject2",
"=",
"cod",
".",
"to_code",
"(",
")",
"fc",
"=",
"open",
"(",
"filename",
"+",
"'c'",
",",
"'wb'",
")",
"fc",
".",
"write",
"(",
"'\\0\\0\\0\\0'",
")",
"fc",
".",
"write",
"(",
"struct",
".",
"pack",
"(",
"'<l'",
",",
"timestamp",
")",
")",
"marshal",
".",
"dump",
"(",
"codeobject2",
",",
"fc",
")",
"fc",
".",
"flush",
"(",
")",
"fc",
".",
"seek",
"(",
"0",
",",
"0",
")",
"fc",
".",
"write",
"(",
"imp",
".",
"get_magic",
"(",
")",
")",
"fc",
".",
"close",
"(",
")"
] | Create a .pyc by disassembling the file and assembling it again, printing
a message that the reassembled file was loaded. | [
"Create",
"a",
".",
"pyc",
"by",
"disassembling",
"the",
"file",
"and",
"assembling",
"it",
"again",
"printing",
"a",
"message",
"that",
"the",
"reassembled",
"file",
"was",
"loaded",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/extern/byteplay2/__init__.py#L842-L885 | train |
rootpy/rootpy | rootpy/extern/byteplay2/__init__.py | recompile_all | def recompile_all(path):
"""recursively recompile all .py files in the directory"""
import os
if os.path.isdir(path):
for root, dirs, files in os.walk(path):
for name in files:
if name.endswith('.py'):
filename = os.path.abspath(os.path.join(root, name))
print >> sys.stderr, filename
recompile(filename)
else:
filename = os.path.abspath(path)
recompile(filename) | python | def recompile_all(path):
"""recursively recompile all .py files in the directory"""
import os
if os.path.isdir(path):
for root, dirs, files in os.walk(path):
for name in files:
if name.endswith('.py'):
filename = os.path.abspath(os.path.join(root, name))
print >> sys.stderr, filename
recompile(filename)
else:
filename = os.path.abspath(path)
recompile(filename) | [
"def",
"recompile_all",
"(",
"path",
")",
":",
"import",
"os",
"if",
"os",
".",
"path",
".",
"isdir",
"(",
"path",
")",
":",
"for",
"root",
",",
"dirs",
",",
"files",
"in",
"os",
".",
"walk",
"(",
"path",
")",
":",
"for",
"name",
"in",
"files",
":",
"if",
"name",
".",
"endswith",
"(",
"'.py'",
")",
":",
"filename",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"os",
".",
"path",
".",
"join",
"(",
"root",
",",
"name",
")",
")",
"print",
">>",
"sys",
".",
"stderr",
",",
"filename",
"recompile",
"(",
"filename",
")",
"else",
":",
"filename",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"path",
")",
"recompile",
"(",
"filename",
")"
] | recursively recompile all .py files in the directory | [
"recursively",
"recompile",
"all",
".",
"py",
"files",
"in",
"the",
"directory"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/extern/byteplay2/__init__.py#L887-L899 | train |
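`recompile_all` is a straightforward `os.walk` traversal over a directory (or a single file). A Python 3 sketch that collects the matching paths instead of recompiling them — the helper name is illustrative:

```python
import os
import tempfile

def find_py_files(path):
    """Absolute paths of every .py file under `path` (or the single
    file itself), mirroring the directory walk above."""
    if not os.path.isdir(path):
        return [os.path.abspath(path)]
    found = []
    for root, _dirs, files in os.walk(path):
        for name in files:
            if name.endswith('.py'):
                found.append(os.path.abspath(os.path.join(root, name)))
    return sorted(found)

# demo on a throwaway tree
_top = tempfile.mkdtemp()
open(os.path.join(_top, 'a.py'), 'w').close()
os.mkdir(os.path.join(_top, 'sub'))
open(os.path.join(_top, 'sub', 'b.py'), 'w').close()
open(os.path.join(_top, 'notes.txt'), 'w').close()
found = find_py_files(_top)
```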
rootpy/rootpy | rootpy/extern/byteplay2/__init__.py | Code.from_code | def from_code(cls, co):
"""Disassemble a Python code object into a Code object."""
co_code = co.co_code
labels = dict((addr, Label()) for addr in findlabels(co_code))
linestarts = dict(cls._findlinestarts(co))
cellfree = co.co_cellvars + co.co_freevars
code = CodeList()
n = len(co_code)
i = 0
extended_arg = 0
while i < n:
op = Opcode(ord(co_code[i]))
if i in labels:
code.append((labels[i], None))
if i in linestarts:
code.append((SetLineno, linestarts[i]))
i += 1
if op in hascode:
lastop, lastarg = code[-1]
if lastop != LOAD_CONST:
raise ValueError(
"%s should be preceded by LOAD_CONST code" % op)
code[-1] = (LOAD_CONST, Code.from_code(lastarg))
if op not in hasarg:
code.append((op, None))
else:
arg = ord(co_code[i]) + ord(co_code[i+1])*256 + extended_arg
extended_arg = 0
i += 2
if op == opcode.EXTENDED_ARG:
extended_arg = arg << 16
elif op in hasconst:
code.append((op, co.co_consts[arg]))
elif op in hasname:
code.append((op, co.co_names[arg]))
elif op in hasjabs:
code.append((op, labels[arg]))
elif op in hasjrel:
code.append((op, labels[i + arg]))
elif op in haslocal:
code.append((op, co.co_varnames[arg]))
elif op in hascompare:
code.append((op, cmp_op[arg]))
elif op in hasfree:
code.append((op, cellfree[arg]))
else:
code.append((op, arg))
varargs = bool(co.co_flags & CO_VARARGS)
varkwargs = bool(co.co_flags & CO_VARKEYWORDS)
newlocals = bool(co.co_flags & CO_NEWLOCALS)
args = co.co_varnames[:co.co_argcount + varargs + varkwargs]
if co.co_consts and isinstance(co.co_consts[0], basestring):
docstring = co.co_consts[0]
else:
docstring = None
return cls(code = code,
freevars = co.co_freevars,
args = args,
varargs = varargs,
varkwargs = varkwargs,
newlocals = newlocals,
name = co.co_name,
filename = co.co_filename,
firstlineno = co.co_firstlineno,
docstring = docstring,
) | python | def from_code(cls, co):
"""Disassemble a Python code object into a Code object."""
co_code = co.co_code
labels = dict((addr, Label()) for addr in findlabels(co_code))
linestarts = dict(cls._findlinestarts(co))
cellfree = co.co_cellvars + co.co_freevars
code = CodeList()
n = len(co_code)
i = 0
extended_arg = 0
while i < n:
op = Opcode(ord(co_code[i]))
if i in labels:
code.append((labels[i], None))
if i in linestarts:
code.append((SetLineno, linestarts[i]))
i += 1
if op in hascode:
lastop, lastarg = code[-1]
if lastop != LOAD_CONST:
raise ValueError(
"%s should be preceded by LOAD_CONST code" % op)
code[-1] = (LOAD_CONST, Code.from_code(lastarg))
if op not in hasarg:
code.append((op, None))
else:
arg = ord(co_code[i]) + ord(co_code[i+1])*256 + extended_arg
extended_arg = 0
i += 2
if op == opcode.EXTENDED_ARG:
extended_arg = arg << 16
elif op in hasconst:
code.append((op, co.co_consts[arg]))
elif op in hasname:
code.append((op, co.co_names[arg]))
elif op in hasjabs:
code.append((op, labels[arg]))
elif op in hasjrel:
code.append((op, labels[i + arg]))
elif op in haslocal:
code.append((op, co.co_varnames[arg]))
elif op in hascompare:
code.append((op, cmp_op[arg]))
elif op in hasfree:
code.append((op, cellfree[arg]))
else:
code.append((op, arg))
varargs = bool(co.co_flags & CO_VARARGS)
varkwargs = bool(co.co_flags & CO_VARKEYWORDS)
newlocals = bool(co.co_flags & CO_NEWLOCALS)
args = co.co_varnames[:co.co_argcount + varargs + varkwargs]
if co.co_consts and isinstance(co.co_consts[0], basestring):
docstring = co.co_consts[0]
else:
docstring = None
return cls(code = code,
freevars = co.co_freevars,
args = args,
varargs = varargs,
varkwargs = varkwargs,
newlocals = newlocals,
name = co.co_name,
filename = co.co_filename,
firstlineno = co.co_firstlineno,
docstring = docstring,
) | [
"def",
"from_code",
"(",
"cls",
",",
"co",
")",
":",
"co_code",
"=",
"co",
".",
"co_code",
"labels",
"=",
"dict",
"(",
"(",
"addr",
",",
"Label",
"(",
")",
")",
"for",
"addr",
"in",
"findlabels",
"(",
"co_code",
")",
")",
"linestarts",
"=",
"dict",
"(",
"cls",
".",
"_findlinestarts",
"(",
"co",
")",
")",
"cellfree",
"=",
"co",
".",
"co_cellvars",
"+",
"co",
".",
"co_freevars",
"code",
"=",
"CodeList",
"(",
")",
"n",
"=",
"len",
"(",
"co_code",
")",
"i",
"=",
"0",
"extended_arg",
"=",
"0",
"while",
"i",
"<",
"n",
":",
"op",
"=",
"Opcode",
"(",
"ord",
"(",
"co_code",
"[",
"i",
"]",
")",
")",
"if",
"i",
"in",
"labels",
":",
"code",
".",
"append",
"(",
"(",
"labels",
"[",
"i",
"]",
",",
"None",
")",
")",
"if",
"i",
"in",
"linestarts",
":",
"code",
".",
"append",
"(",
"(",
"SetLineno",
",",
"linestarts",
"[",
"i",
"]",
")",
")",
"i",
"+=",
"1",
"if",
"op",
"in",
"hascode",
":",
"lastop",
",",
"lastarg",
"=",
"code",
"[",
"-",
"1",
"]",
"if",
"lastop",
"!=",
"LOAD_CONST",
":",
"raise",
"ValueError",
"(",
"\"%s should be preceded by LOAD_CONST code\"",
"%",
"op",
")",
"code",
"[",
"-",
"1",
"]",
"=",
"(",
"LOAD_CONST",
",",
"Code",
".",
"from_code",
"(",
"lastarg",
")",
")",
"if",
"op",
"not",
"in",
"hasarg",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"None",
")",
")",
"else",
":",
"arg",
"=",
"ord",
"(",
"co_code",
"[",
"i",
"]",
")",
"+",
"ord",
"(",
"co_code",
"[",
"i",
"+",
"1",
"]",
")",
"*",
"256",
"+",
"extended_arg",
"extended_arg",
"=",
"0",
"i",
"+=",
"2",
"if",
"op",
"==",
"opcode",
".",
"EXTENDED_ARG",
":",
"extended_arg",
"=",
"arg",
"<<",
"16",
"elif",
"op",
"in",
"hasconst",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"co",
".",
"co_consts",
"[",
"arg",
"]",
")",
")",
"elif",
"op",
"in",
"hasname",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"co",
".",
"co_names",
"[",
"arg",
"]",
")",
")",
"elif",
"op",
"in",
"hasjabs",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"labels",
"[",
"arg",
"]",
")",
")",
"elif",
"op",
"in",
"hasjrel",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"labels",
"[",
"i",
"+",
"arg",
"]",
")",
")",
"elif",
"op",
"in",
"haslocal",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"co",
".",
"co_varnames",
"[",
"arg",
"]",
")",
")",
"elif",
"op",
"in",
"hascompare",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"cmp_op",
"[",
"arg",
"]",
")",
")",
"elif",
"op",
"in",
"hasfree",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"cellfree",
"[",
"arg",
"]",
")",
")",
"else",
":",
"code",
".",
"append",
"(",
"(",
"op",
",",
"arg",
")",
")",
"varargs",
"=",
"bool",
"(",
"co",
".",
"co_flags",
"&",
"CO_VARARGS",
")",
"varkwargs",
"=",
"bool",
"(",
"co",
".",
"co_flags",
"&",
"CO_VARKEYWORDS",
")",
"newlocals",
"=",
"bool",
"(",
"co",
".",
"co_flags",
"&",
"CO_NEWLOCALS",
")",
"args",
"=",
"co",
".",
"co_varnames",
"[",
":",
"co",
".",
"co_argcount",
"+",
"varargs",
"+",
"varkwargs",
"]",
"if",
"co",
".",
"co_consts",
"and",
"isinstance",
"(",
"co",
".",
"co_consts",
"[",
"0",
"]",
",",
"basestring",
")",
":",
"docstring",
"=",
"co",
".",
"co_consts",
"[",
"0",
"]",
"else",
":",
"docstring",
"=",
"None",
"return",
"cls",
"(",
"code",
"=",
"code",
",",
"freevars",
"=",
"co",
".",
"co_freevars",
",",
"args",
"=",
"args",
",",
"varargs",
"=",
"varargs",
",",
"varkwargs",
"=",
"varkwargs",
",",
"newlocals",
"=",
"newlocals",
",",
"name",
"=",
"co",
".",
"co_name",
",",
"filename",
"=",
"co",
".",
"co_filename",
",",
"firstlineno",
"=",
"co",
".",
"co_firstlineno",
",",
"docstring",
"=",
"docstring",
",",
")"
] | Disassemble a Python code object into a Code object. | [
"Disassemble",
"a",
"Python",
"code",
"object",
"into",
"a",
"Code",
"object",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/extern/byteplay2/__init__.py#L320-L387 | train |
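The `from_code` tokens above implement legacy (pre-3.6 CPython) bytecode argument decoding: any opcode at or above `HAVE_ARGUMENT` is followed by a 2-byte little-endian operand, and `EXTENDED_ARG` supplies the high 16 bits of the *next* operand. A minimal standalone sketch of that loop — the opcode numbers here are made up for the demo, not CPython's real values:

```python
# Sketch of legacy (CPython < 3.6) bytecode argument decoding, as done in
# byteplay's from_code above. Opcode numbers are hypothetical for this demo:
# anything >= HAVE_ARGUMENT carries a 2-byte little-endian argument.
HAVE_ARGUMENT = 90
EXTENDED_ARG = 145  # hypothetical numeric value, stands in for opcode.EXTENDED_ARG

def decode(co_code):
    """Return a list of (op, arg) pairs; arg is None for argument-less opcodes."""
    i, n, extended_arg = 0, len(co_code), 0
    out = []
    while i < n:
        op = co_code[i]
        i += 1
        if op < HAVE_ARGUMENT:
            out.append((op, None))
            continue
        # low byte + high byte * 256, plus any pending EXTENDED_ARG bits
        arg = co_code[i] + co_code[i + 1] * 256 + extended_arg
        extended_arg = 0
        i += 2
        if op == EXTENDED_ARG:
            extended_arg = arg << 16  # high bits for the *next* argument
        else:
            out.append((op, arg))
    return out

# EXTENDED_ARG(1) then op 100 with low word 2 -> arg = (1 << 16) + 2 = 65538
stream = [EXTENDED_ARG, 1, 0, 100, 2, 0, 9]  # 9 is an argument-less opcode
print(decode(stream))  # [(100, 65538), (9, None)]
```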
rootpy/rootpy | rootpy/plotting/contrib/quantiles.py | effective_sample_size | def effective_sample_size(h):
"""
Calculate the effective sample size for a histogram
the same way as ROOT does.
"""
sum = 0
ew = 0
w = 0
for bin in h.bins(overflow=False):
sum += bin.value
ew = bin.error
w += ew * ew
esum = sum * sum / w
return esum | python | def effective_sample_size(h):
"""
Calculate the effective sample size for a histogram
the same way as ROOT does.
"""
sum = 0
ew = 0
w = 0
for bin in h.bins(overflow=False):
sum += bin.value
ew = bin.error
w += ew * ew
esum = sum * sum / w
return esum | [
"def",
"effective_sample_size",
"(",
"h",
")",
":",
"sum",
"=",
"0",
"ew",
"=",
"0",
"w",
"=",
"0",
"for",
"bin",
"in",
"h",
".",
"bins",
"(",
"overflow",
"=",
"False",
")",
":",
"sum",
"+=",
"bin",
".",
"value",
"ew",
"=",
"bin",
".",
"error",
"w",
"+=",
"ew",
"*",
"ew",
"esum",
"=",
"sum",
"*",
"sum",
"/",
"w",
"return",
"esum"
] | Calculate the effective sample size for a histogram
the same way as ROOT does. | [
"Calculate",
"the",
"effective",
"sample",
"size",
"for",
"a",
"histogram",
"the",
"same",
"way",
"as",
"ROOT",
"does",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/contrib/quantiles.py#L107-L120 | train |
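The `effective_sample_size` record above applies ROOT's `TH1::GetEffectiveEntries` convention: ESS = (sum of bin contents)² / (sum of squared bin errors). A pure-Python sketch of the same arithmetic, with hypothetical bin data in place of a histogram:

```python
def effective_sample_size(values, errors):
    """(sum of contents)^2 / (sum of squared errors), the ROOT
    GetEffectiveEntries convention used by the histogram version above."""
    total = sum(values)
    w2 = sum(e * e for e in errors)
    return total * total / w2

# For an unweighted histogram the bin error is sqrt(content), so the
# effective sample size equals the raw entry count:
values = [4.0, 9.0, 16.0]
errors = [2.0, 3.0, 4.0]
print(effective_sample_size(values, errors))  # 29.0
```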
rootpy/rootpy | rootpy/plotting/contrib/quantiles.py | critical_value | def critical_value(n, p):
"""
This function calculates the critical value given
n and p, and confidence level = 1 - p.
"""
dn = 1
delta = 0.5
res = ROOT.TMath.KolmogorovProb(dn * sqrt(n))
while res > 1.0001 * p or res < 0.9999 * p:
if (res > 1.0001 * p):
dn = dn + delta
if (res < 0.9999 * p):
dn = dn - delta
delta = delta / 2.
res = ROOT.TMath.KolmogorovProb(dn * sqrt(n))
return dn | python | def critical_value(n, p):
"""
This function calculates the critical value given
n and p, and confidence level = 1 - p.
"""
dn = 1
delta = 0.5
res = ROOT.TMath.KolmogorovProb(dn * sqrt(n))
while res > 1.0001 * p or res < 0.9999 * p:
if (res > 1.0001 * p):
dn = dn + delta
if (res < 0.9999 * p):
dn = dn - delta
delta = delta / 2.
res = ROOT.TMath.KolmogorovProb(dn * sqrt(n))
return dn | [
"def",
"critical_value",
"(",
"n",
",",
"p",
")",
":",
"dn",
"=",
"1",
"delta",
"=",
"0.5",
"res",
"=",
"ROOT",
".",
"TMath",
".",
"KolmogorovProb",
"(",
"dn",
"*",
"sqrt",
"(",
"n",
")",
")",
"while",
"res",
">",
"1.0001",
"*",
"p",
"or",
"res",
"<",
"0.9999",
"*",
"p",
":",
"if",
"(",
"res",
">",
"1.0001",
"*",
"p",
")",
":",
"dn",
"=",
"dn",
"+",
"delta",
"if",
"(",
"res",
"<",
"0.9999",
"*",
"p",
")",
":",
"dn",
"=",
"dn",
"-",
"delta",
"delta",
"=",
"delta",
"/",
"2.",
"res",
"=",
"ROOT",
".",
"TMath",
".",
"KolmogorovProb",
"(",
"dn",
"*",
"sqrt",
"(",
"n",
")",
")",
"return",
"dn"
] | This function calculates the critical value given
n and p, and confidence level = 1 - p. | [
"This",
"function",
"calculates",
"the",
"critical",
"value",
"given",
"n",
"and",
"p",
"and",
"confidence",
"level",
"=",
"1",
"-",
"p",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/contrib/quantiles.py#L123-L138 | train |
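`critical_value` above searches for `dn` such that `KolmogorovProb(dn * sqrt(n)) = p` by interval halving. A self-contained sketch that replaces ROOT's `TMath::KolmogorovProb` with a hand-rolled Kolmogorov tail series Q(x) = 2 Σ_{k≥1} (−1)^{k−1} e^{−2k²x²} (my own implementation, not ROOT's, but the same distribution):

```python
from math import exp, sqrt

def kolmogorov_prob(x, terms=100):
    """Kolmogorov tail Q(x) = 2 * sum_{k>=1} (-1)^(k-1) * exp(-2 k^2 x^2)."""
    if x <= 0:
        return 1.0
    return 2.0 * sum((-1) ** (k - 1) * exp(-2.0 * k * k * x * x)
                     for k in range(1, terms + 1))

def critical_value(n, p):
    """Find dn with Q(dn * sqrt(n)) ~= p, same halving loop as above."""
    dn, delta = 1.0, 0.5
    res = kolmogorov_prob(dn * sqrt(n))
    while res > 1.0001 * p or res < 0.9999 * p:
        if res > 1.0001 * p:
            dn += delta
        if res < 0.9999 * p:
            dn -= delta
        delta /= 2.0
        res = kolmogorov_prob(dn * sqrt(n))
    return dn

dn = critical_value(100, 0.05)
print(round(dn, 4))  # ~0.1358, i.e. the familiar 1.36 / sqrt(n) rule of thumb
```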
rootpy/rootpy | rootpy/io/pickler.py | dump | def dump(obj, root_file, proto=0, key=None):
"""Dump an object into a ROOT TFile.
`root_file` may be an open ROOT file or directory, or a string path to an
existing ROOT file.
"""
if isinstance(root_file, string_types):
root_file = root_open(root_file, 'recreate')
own_file = True
else:
own_file = False
ret = Pickler(root_file, proto).dump(obj, key)
if own_file:
root_file.Close()
return ret | python | def dump(obj, root_file, proto=0, key=None):
"""Dump an object into a ROOT TFile.
`root_file` may be an open ROOT file or directory, or a string path to an
existing ROOT file.
"""
if isinstance(root_file, string_types):
root_file = root_open(root_file, 'recreate')
own_file = True
else:
own_file = False
ret = Pickler(root_file, proto).dump(obj, key)
if own_file:
root_file.Close()
return ret | [
"def",
"dump",
"(",
"obj",
",",
"root_file",
",",
"proto",
"=",
"0",
",",
"key",
"=",
"None",
")",
":",
"if",
"isinstance",
"(",
"root_file",
",",
"string_types",
")",
":",
"root_file",
"=",
"root_open",
"(",
"root_file",
",",
"'recreate'",
")",
"own_file",
"=",
"True",
"else",
":",
"own_file",
"=",
"False",
"ret",
"=",
"Pickler",
"(",
"root_file",
",",
"proto",
")",
".",
"dump",
"(",
"obj",
",",
"key",
")",
"if",
"own_file",
":",
"root_file",
".",
"Close",
"(",
")",
"return",
"ret"
] | Dump an object into a ROOT TFile.
`root_file` may be an open ROOT file or directory, or a string path to an
existing ROOT file. | [
"Dump",
"an",
"object",
"into",
"a",
"ROOT",
"TFile",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/io/pickler.py#L344-L358 | train |
rootpy/rootpy | rootpy/io/pickler.py | load | def load(root_file, use_proxy=True, key=None):
"""Load an object from a ROOT TFile.
`root_file` may be an open ROOT file or directory, or a string path to an
existing ROOT file.
"""
if isinstance(root_file, string_types):
root_file = root_open(root_file)
own_file = True
else:
own_file = False
obj = Unpickler(root_file, use_proxy).load(key)
if own_file:
root_file.Close()
return obj | python | def load(root_file, use_proxy=True, key=None):
"""Load an object from a ROOT TFile.
`root_file` may be an open ROOT file or directory, or a string path to an
existing ROOT file.
"""
if isinstance(root_file, string_types):
root_file = root_open(root_file)
own_file = True
else:
own_file = False
obj = Unpickler(root_file, use_proxy).load(key)
if own_file:
root_file.Close()
return obj | [
"def",
"load",
"(",
"root_file",
",",
"use_proxy",
"=",
"True",
",",
"key",
"=",
"None",
")",
":",
"if",
"isinstance",
"(",
"root_file",
",",
"string_types",
")",
":",
"root_file",
"=",
"root_open",
"(",
"root_file",
")",
"own_file",
"=",
"True",
"else",
":",
"own_file",
"=",
"False",
"obj",
"=",
"Unpickler",
"(",
"root_file",
",",
"use_proxy",
")",
".",
"load",
"(",
"key",
")",
"if",
"own_file",
":",
"root_file",
".",
"Close",
"(",
")",
"return",
"obj"
] | Load an object from a ROOT TFile.
`root_file` may be an open ROOT file or directory, or a string path to an
existing ROOT file. | [
"Load",
"an",
"object",
"from",
"a",
"ROOT",
"TFile",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/io/pickler.py#L361-L375 | train |
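The `dump`/`load` records above share one pattern: accept either an open file or a string path, and close only a file you opened yourself (the `own_file` flag). The same pattern with the standard `pickle` module and ordinary files, as a stand-in for ROOT's `root_open`:

```python
import os
import pickle
import tempfile

def dump(obj, file_or_path):
    """Pickle obj to an open binary file or to a path, mirroring the
    own_file handling in rootpy's dump/load above."""
    if isinstance(file_or_path, str):
        f, own_file = open(file_or_path, "wb"), True
    else:
        f, own_file = file_or_path, False
    try:
        pickle.dump(obj, f)
    finally:
        if own_file:  # only close files we opened ourselves
            f.close()

def load(file_or_path):
    if isinstance(file_or_path, str):
        f, own_file = open(file_or_path, "rb"), True
    else:
        f, own_file = file_or_path, False
    try:
        return pickle.load(f)
    finally:
        if own_file:
            f.close()

path = os.path.join(tempfile.mkdtemp(), "obj.pkl")
dump({"hist": [1, 2, 3]}, path)  # string path: opened and closed for us
print(load(path))                # {'hist': [1, 2, 3]}
```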
rootpy/rootpy | rootpy/io/pickler.py | Pickler.dump | def dump(self, obj, key=None):
"""Write a pickled representation of obj to the open TFile."""
if key is None:
key = '_pickle'
with preserve_current_directory():
self.__file.cd()
if sys.version_info[0] < 3:
pickle.Pickler.dump(self, obj)
else:
super(Pickler, self).dump(obj)
s = ROOT.TObjString(self.__io.getvalue())
self.__io.reopen()
s.Write(key)
self.__file.GetFile().Flush()
self.__pmap.clear() | python | def dump(self, obj, key=None):
"""Write a pickled representation of obj to the open TFile."""
if key is None:
key = '_pickle'
with preserve_current_directory():
self.__file.cd()
if sys.version_info[0] < 3:
pickle.Pickler.dump(self, obj)
else:
super(Pickler, self).dump(obj)
s = ROOT.TObjString(self.__io.getvalue())
self.__io.reopen()
s.Write(key)
self.__file.GetFile().Flush()
self.__pmap.clear() | [
"def",
"dump",
"(",
"self",
",",
"obj",
",",
"key",
"=",
"None",
")",
":",
"if",
"key",
"is",
"None",
":",
"key",
"=",
"'_pickle'",
"with",
"preserve_current_directory",
"(",
")",
":",
"self",
".",
"__file",
".",
"cd",
"(",
")",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"pickle",
".",
"Pickler",
".",
"dump",
"(",
"self",
",",
"obj",
")",
"else",
":",
"super",
"(",
"Pickler",
",",
"self",
")",
".",
"dump",
"(",
"obj",
")",
"s",
"=",
"ROOT",
".",
"TObjString",
"(",
"self",
".",
"__io",
".",
"getvalue",
"(",
")",
")",
"self",
".",
"__io",
".",
"reopen",
"(",
")",
"s",
".",
"Write",
"(",
"key",
")",
"self",
".",
"__file",
".",
"GetFile",
"(",
")",
".",
"Flush",
"(",
")",
"self",
".",
"__pmap",
".",
"clear",
"(",
")"
] | Write a pickled representation of obj to the open TFile. | [
"Write",
"a",
"pickled",
"representation",
"of",
"obj",
"to",
"the",
"open",
"TFile",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/io/pickler.py#L162-L176 | train |
rootpy/rootpy | rootpy/io/pickler.py | Unpickler.load | def load(self, key=None):
"""Read a pickled object representation from the open file."""
if key is None:
key = '_pickle'
obj = None
if _compat_hooks:
save = _compat_hooks[0]()
try:
self.__n += 1
s = self.__file.Get(key + ';{0:d}'.format(self.__n))
self.__io.setvalue(s.GetName())
if sys.version_info[0] < 3:
obj = pickle.Unpickler.load(self)
else:
obj = super(Unpickler, self).load()
self.__io.reopen()
finally:
if _compat_hooks:
save = _compat_hooks[1](save)
return obj | python | def load(self, key=None):
"""Read a pickled object representation from the open file."""
if key is None:
key = '_pickle'
obj = None
if _compat_hooks:
save = _compat_hooks[0]()
try:
self.__n += 1
s = self.__file.Get(key + ';{0:d}'.format(self.__n))
self.__io.setvalue(s.GetName())
if sys.version_info[0] < 3:
obj = pickle.Unpickler.load(self)
else:
obj = super(Unpickler, self).load()
self.__io.reopen()
finally:
if _compat_hooks:
save = _compat_hooks[1](save)
return obj | [
"def",
"load",
"(",
"self",
",",
"key",
"=",
"None",
")",
":",
"if",
"key",
"is",
"None",
":",
"key",
"=",
"'_pickle'",
"obj",
"=",
"None",
"if",
"_compat_hooks",
":",
"save",
"=",
"_compat_hooks",
"[",
"0",
"]",
"(",
")",
"try",
":",
"self",
".",
"__n",
"+=",
"1",
"s",
"=",
"self",
".",
"__file",
".",
"Get",
"(",
"key",
"+",
"';{0:d}'",
".",
"format",
"(",
"self",
".",
"__n",
")",
")",
"self",
".",
"__io",
".",
"setvalue",
"(",
"s",
".",
"GetName",
"(",
")",
")",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"obj",
"=",
"pickle",
".",
"Unpickler",
".",
"load",
"(",
"self",
")",
"else",
":",
"obj",
"=",
"super",
"(",
"Unpickler",
",",
"self",
")",
".",
"load",
"(",
")",
"self",
".",
"__io",
".",
"reopen",
"(",
")",
"finally",
":",
"if",
"_compat_hooks",
":",
"save",
"=",
"_compat_hooks",
"[",
"1",
"]",
"(",
"save",
")",
"return",
"obj"
] | Read a pickled object representation from the open file. | [
"Read",
"a",
"pickled",
"object",
"representation",
"from",
"the",
"open",
"file",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/io/pickler.py#L272-L291 | train |
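`Pickler.dump` above serialises the object into a pickle byte stream and writes it as a `TObjString` under a key (default `_pickle`); `Unpickler.load` fetches that string back and unpickles it. The essential round trip, with a plain dict of named payloads standing in for the ROOT directory and its `TObjString`:

```python
import pickle

class FakeFile:
    """Stand-in for a ROOT directory: stores named byte payloads,
    like the TObjString written by Pickler.dump above."""
    def __init__(self):
        self.keys = {}
    def write(self, key, payload):
        self.keys[key] = payload
    def get(self, key):
        return self.keys[key]

def dump(obj, root_file, key="_pickle"):
    root_file.write(key, pickle.dumps(obj))  # pickle stream stored under a key

def load(root_file, key="_pickle"):
    return pickle.loads(root_file.get(key))

f = FakeFile()
dump({"mean": 0.5, "entries": 1000}, f)
print(load(f))  # {'mean': 0.5, 'entries': 1000}
```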
rootpy/rootpy | rootpy/utils/extras.py | iter_ROOT_classes | def iter_ROOT_classes():
"""
Iterator over all available ROOT classes
"""
class_index = "http://root.cern.ch/root/html/ClassIndex.html"
for s in minidom.parse(urlopen(class_index)).getElementsByTagName("span"):
if ("class", "typename") in s.attributes.items():
class_name = s.childNodes[0].nodeValue
try:
yield getattr(QROOT, class_name)
except AttributeError:
pass | python | def iter_ROOT_classes():
"""
Iterator over all available ROOT classes
"""
class_index = "http://root.cern.ch/root/html/ClassIndex.html"
for s in minidom.parse(urlopen(class_index)).getElementsByTagName("span"):
if ("class", "typename") in s.attributes.items():
class_name = s.childNodes[0].nodeValue
try:
yield getattr(QROOT, class_name)
except AttributeError:
pass | [
"def",
"iter_ROOT_classes",
"(",
")",
":",
"class_index",
"=",
"\"http://root.cern.ch/root/html/ClassIndex.html\"",
"for",
"s",
"in",
"minidom",
".",
"parse",
"(",
"urlopen",
"(",
"class_index",
")",
")",
".",
"getElementsByTagName",
"(",
"\"span\"",
")",
":",
"if",
"(",
"\"class\"",
",",
"\"typename\"",
")",
"in",
"s",
".",
"attributes",
".",
"items",
"(",
")",
":",
"class_name",
"=",
"s",
".",
"childNodes",
"[",
"0",
"]",
".",
"nodeValue",
"try",
":",
"yield",
"getattr",
"(",
"QROOT",
",",
"class_name",
")",
"except",
"AttributeError",
":",
"pass"
] | Iterator over all available ROOT classes | [
"Iterator",
"over",
"all",
"available",
"ROOT",
"classes"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/utils/extras.py#L27-L38 | train |
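`iter_ROOT_classes` above scrapes class names out of ROOT's HTML class index by looking for `<span class="typename">` elements with `minidom`. The same scan on an inline snippet (the sample HTML here is made up, not the real index page):

```python
from xml.dom import minidom

html = """<html><body>
<span class="typename">TH1F</span>
<span class="other">ignore me</span>
<span class="typename">TGraph</span>
</body></html>"""

names = []
for s in minidom.parseString(html).getElementsByTagName("span"):
    # attributes.items() yields (name, value) pairs, as in iter_ROOT_classes
    if ("class", "typename") in s.attributes.items():
        names.append(s.childNodes[0].nodeValue)

print(names)  # ['TH1F', 'TGraph']
```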
rootpy/rootpy | rootpy/plotting/style/cmstdr/labels.py | CMS_label | def CMS_label(text="Preliminary 2012", sqrts=8, pad=None):
""" Add a 'CMS Preliminary' style label to the current Pad.
The blurbs are drawn in the top margin. The label "CMS " + text is drawn
in the upper left. If sqrts is None, it will be omitted. Otherwise, it
will be drawn in the upper right.
"""
if pad is None:
pad = ROOT.gPad
with preserve_current_canvas():
pad.cd()
left_margin = pad.GetLeftMargin()
top_margin = pad.GetTopMargin()
ypos = 1 - top_margin / 2.
l = ROOT.TLatex(left_margin, ypos, "CMS " + text)
l.SetTextAlign(12) # left-middle
l.SetNDC()
# The text is 90% as tall as the margin it lives in.
l.SetTextSize(0.90 * top_margin)
l.Draw()
keepalive(pad, l)
# Draw sqrt(s) label, if desired
if sqrts:
right_margin = pad.GetRightMargin()
p = ROOT.TLatex(1 - right_margin, ypos,
"#sqrt{{s}}={0:d}TeV".format(sqrts))
p.SetTextAlign(32) # right-middle
p.SetNDC()
p.SetTextSize(0.90 * top_margin)
p.Draw()
keepalive(pad, p)
else:
p = None
pad.Modified()
pad.Update()
return l, p | python | def CMS_label(text="Preliminary 2012", sqrts=8, pad=None):
""" Add a 'CMS Preliminary' style label to the current Pad.
The blurbs are drawn in the top margin. The label "CMS " + text is drawn
in the upper left. If sqrts is None, it will be omitted. Otherwise, it
will be drawn in the upper right.
"""
if pad is None:
pad = ROOT.gPad
with preserve_current_canvas():
pad.cd()
left_margin = pad.GetLeftMargin()
top_margin = pad.GetTopMargin()
ypos = 1 - top_margin / 2.
l = ROOT.TLatex(left_margin, ypos, "CMS " + text)
l.SetTextAlign(12) # left-middle
l.SetNDC()
# The text is 90% as tall as the margin it lives in.
l.SetTextSize(0.90 * top_margin)
l.Draw()
keepalive(pad, l)
# Draw sqrt(s) label, if desired
if sqrts:
right_margin = pad.GetRightMargin()
p = ROOT.TLatex(1 - right_margin, ypos,
"#sqrt{{s}}={0:d}TeV".format(sqrts))
p.SetTextAlign(32) # right-middle
p.SetNDC()
p.SetTextSize(0.90 * top_margin)
p.Draw()
keepalive(pad, p)
else:
p = None
pad.Modified()
pad.Update()
return l, p | [
"def",
"CMS_label",
"(",
"text",
"=",
"\"Preliminary 2012\"",
",",
"sqrts",
"=",
"8",
",",
"pad",
"=",
"None",
")",
":",
"if",
"pad",
"is",
"None",
":",
"pad",
"=",
"ROOT",
".",
"gPad",
"with",
"preserve_current_canvas",
"(",
")",
":",
"pad",
".",
"cd",
"(",
")",
"left_margin",
"=",
"pad",
".",
"GetLeftMargin",
"(",
")",
"top_margin",
"=",
"pad",
".",
"GetTopMargin",
"(",
")",
"ypos",
"=",
"1",
"-",
"top_margin",
"/",
"2.",
"l",
"=",
"ROOT",
".",
"TLatex",
"(",
"left_margin",
",",
"ypos",
",",
"\"CMS \"",
"+",
"text",
")",
"l",
".",
"SetTextAlign",
"(",
"12",
")",
"# left-middle",
"l",
".",
"SetNDC",
"(",
")",
"# The text is 90% as tall as the margin it lives in.",
"l",
".",
"SetTextSize",
"(",
"0.90",
"*",
"top_margin",
")",
"l",
".",
"Draw",
"(",
")",
"keepalive",
"(",
"pad",
",",
"l",
")",
"# Draw sqrt(s) label, if desired",
"if",
"sqrts",
":",
"right_margin",
"=",
"pad",
".",
"GetRightMargin",
"(",
")",
"p",
"=",
"ROOT",
".",
"TLatex",
"(",
"1",
"-",
"right_margin",
",",
"ypos",
",",
"\"#sqrt{{s}}={0:d}TeV\"",
".",
"format",
"(",
"sqrts",
")",
")",
"p",
".",
"SetTextAlign",
"(",
"32",
")",
"# right-middle",
"p",
".",
"SetNDC",
"(",
")",
"p",
".",
"SetTextSize",
"(",
"0.90",
"*",
"top_margin",
")",
"p",
".",
"Draw",
"(",
")",
"keepalive",
"(",
"pad",
",",
"p",
")",
"else",
":",
"p",
"=",
"None",
"pad",
".",
"Modified",
"(",
")",
"pad",
".",
"Update",
"(",
")",
"return",
"l",
",",
"p"
] | Add a 'CMS Preliminary' style label to the current Pad.
The blurbs are drawn in the top margin. The label "CMS " + text is drawn
in the upper left. If sqrts is None, it will be omitted. Otherwise, it
will be drawn in the upper right. | [
"Add",
"a",
"CMS",
"Preliminary",
"style",
"label",
"to",
"the",
"current",
"Pad",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/style/cmstdr/labels.py#L15-L50 | train |
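The placement arithmetic in `CMS_label` above is plain NDC geometry: both labels sit vertically centred in the top margin (y = 1 − top_margin/2), text height is 90% of the margin, the "CMS …" label is left-aligned at the left margin (ROOT align code 12) and the sqrt(s) label is right-aligned at the right margin (align code 32). Extracted as a pure function with illustrative margin values:

```python
def label_positions(left_margin, right_margin, top_margin):
    """Placement arithmetic from CMS_label above: labels vertically centred
    in the top margin, text 90% of the margin height, one label per side."""
    ypos = 1 - top_margin / 2.0
    size = 0.90 * top_margin
    left = (left_margin, ypos, 12)        # align 12 = left, vertically centred
    right = (1 - right_margin, ypos, 32)  # align 32 = right, vertically centred
    return left, right, size

# Typical-looking pad margins (illustrative numbers, not ROOT defaults):
left, right, size = label_positions(0.16, 0.05, 0.08)
print(round(size, 3))  # 0.072
print(left, right)
```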
rootpy/rootpy | rootpy/stats/histfactory/utils.py | make_channel | def make_channel(name, samples, data=None, verbose=False):
"""
Create a Channel from a list of Samples
"""
if verbose:
llog = log['make_channel']
llog.info("creating channel {0}".format(name))
# avoid segfault if name begins with a digit by using "channel_" prefix
chan = Channel('channel_{0}'.format(name))
chan.SetStatErrorConfig(0.05, "Poisson")
if data is not None:
if verbose:
llog.info("setting data")
chan.SetData(data)
for sample in samples:
if verbose:
llog.info("adding sample {0}".format(sample.GetName()))
chan.AddSample(sample)
return chan | python | def make_channel(name, samples, data=None, verbose=False):
"""
Create a Channel from a list of Samples
"""
if verbose:
llog = log['make_channel']
llog.info("creating channel {0}".format(name))
# avoid segfault if name begins with a digit by using "channel_" prefix
chan = Channel('channel_{0}'.format(name))
chan.SetStatErrorConfig(0.05, "Poisson")
if data is not None:
if verbose:
llog.info("setting data")
chan.SetData(data)
for sample in samples:
if verbose:
llog.info("adding sample {0}".format(sample.GetName()))
chan.AddSample(sample)
return chan | [
"def",
"make_channel",
"(",
"name",
",",
"samples",
",",
"data",
"=",
"None",
",",
"verbose",
"=",
"False",
")",
":",
"if",
"verbose",
":",
"llog",
"=",
"log",
"[",
"'make_channel'",
"]",
"llog",
".",
"info",
"(",
"\"creating channel {0}\"",
".",
"format",
"(",
"name",
")",
")",
"# avoid segfault if name begins with a digit by using \"channel_\" prefix",
"chan",
"=",
"Channel",
"(",
"'channel_{0}'",
".",
"format",
"(",
"name",
")",
")",
"chan",
".",
"SetStatErrorConfig",
"(",
"0.05",
",",
"\"Poisson\"",
")",
"if",
"data",
"is",
"not",
"None",
":",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"setting data\"",
")",
"chan",
".",
"SetData",
"(",
"data",
")",
"for",
"sample",
"in",
"samples",
":",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"adding sample {0}\"",
".",
"format",
"(",
"sample",
".",
"GetName",
"(",
")",
")",
")",
"chan",
".",
"AddSample",
"(",
"sample",
")",
"return",
"chan"
] | Create a Channel from a list of Samples | [
"Create",
"a",
"Channel",
"from",
"a",
"list",
"of",
"Samples"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/utils.py#L32-L53 | train |
rootpy/rootpy | rootpy/stats/histfactory/utils.py | make_measurement | def make_measurement(name,
channels,
lumi=1.0, lumi_rel_error=0.1,
output_prefix='./histfactory',
POI=None,
const_params=None,
verbose=False):
"""
Create a Measurement from a list of Channels
"""
if verbose:
llog = log['make_measurement']
llog.info("creating measurement {0}".format(name))
if not isinstance(channels, (list, tuple)):
channels = [channels]
# Create the measurement
meas = Measurement('measurement_{0}'.format(name), '')
meas.SetOutputFilePrefix(output_prefix)
if POI is not None:
if isinstance(POI, string_types):
if verbose:
llog.info("setting POI {0}".format(POI))
meas.SetPOI(POI)
else:
if verbose:
llog.info("adding POIs {0}".format(', '.join(POI)))
for p in POI:
meas.AddPOI(p)
if verbose:
llog.info("setting lumi={0:f} +/- {1:f}".format(lumi, lumi_rel_error))
meas.lumi = lumi
meas.lumi_rel_error = lumi_rel_error
for channel in channels:
if verbose:
llog.info("adding channel {0}".format(channel.GetName()))
meas.AddChannel(channel)
if const_params is not None:
if verbose:
llog.info("adding constant parameters {0}".format(
', '.join(const_params)))
for param in const_params:
meas.AddConstantParam(param)
return meas | python | def make_measurement(name,
channels,
lumi=1.0, lumi_rel_error=0.1,
output_prefix='./histfactory',
POI=None,
const_params=None,
verbose=False):
"""
Create a Measurement from a list of Channels
"""
if verbose:
llog = log['make_measurement']
llog.info("creating measurement {0}".format(name))
if not isinstance(channels, (list, tuple)):
channels = [channels]
# Create the measurement
meas = Measurement('measurement_{0}'.format(name), '')
meas.SetOutputFilePrefix(output_prefix)
if POI is not None:
if isinstance(POI, string_types):
if verbose:
llog.info("setting POI {0}".format(POI))
meas.SetPOI(POI)
else:
if verbose:
llog.info("adding POIs {0}".format(', '.join(POI)))
for p in POI:
meas.AddPOI(p)
if verbose:
llog.info("setting lumi={0:f} +/- {1:f}".format(lumi, lumi_rel_error))
meas.lumi = lumi
meas.lumi_rel_error = lumi_rel_error
for channel in channels:
if verbose:
llog.info("adding channel {0}".format(channel.GetName()))
meas.AddChannel(channel)
if const_params is not None:
if verbose:
llog.info("adding constant parameters {0}".format(
', '.join(const_params)))
for param in const_params:
meas.AddConstantParam(param)
return meas | [
"def",
"make_measurement",
"(",
"name",
",",
"channels",
",",
"lumi",
"=",
"1.0",
",",
"lumi_rel_error",
"=",
"0.1",
",",
"output_prefix",
"=",
"'./histfactory'",
",",
"POI",
"=",
"None",
",",
"const_params",
"=",
"None",
",",
"verbose",
"=",
"False",
")",
":",
"if",
"verbose",
":",
"llog",
"=",
"log",
"[",
"'make_measurement'",
"]",
"llog",
".",
"info",
"(",
"\"creating measurement {0}\"",
".",
"format",
"(",
"name",
")",
")",
"if",
"not",
"isinstance",
"(",
"channels",
",",
"(",
"list",
",",
"tuple",
")",
")",
":",
"channels",
"=",
"[",
"channels",
"]",
"# Create the measurement",
"meas",
"=",
"Measurement",
"(",
"'measurement_{0}'",
".",
"format",
"(",
"name",
")",
",",
"''",
")",
"meas",
".",
"SetOutputFilePrefix",
"(",
"output_prefix",
")",
"if",
"POI",
"is",
"not",
"None",
":",
"if",
"isinstance",
"(",
"POI",
",",
"string_types",
")",
":",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"setting POI {0}\"",
".",
"format",
"(",
"POI",
")",
")",
"meas",
".",
"SetPOI",
"(",
"POI",
")",
"else",
":",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"adding POIs {0}\"",
".",
"format",
"(",
"', '",
".",
"join",
"(",
"POI",
")",
")",
")",
"for",
"p",
"in",
"POI",
":",
"meas",
".",
"AddPOI",
"(",
"p",
")",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"setting lumi={0:f} +/- {1:f}\"",
".",
"format",
"(",
"lumi",
",",
"lumi_rel_error",
")",
")",
"meas",
".",
"lumi",
"=",
"lumi",
"meas",
".",
"lumi_rel_error",
"=",
"lumi_rel_error",
"for",
"channel",
"in",
"channels",
":",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"adding channel {0}\"",
".",
"format",
"(",
"channel",
".",
"GetName",
"(",
")",
")",
")",
"meas",
".",
"AddChannel",
"(",
"channel",
")",
"if",
"const_params",
"is",
"not",
"None",
":",
"if",
"verbose",
":",
"llog",
".",
"info",
"(",
"\"adding constant parameters {0}\"",
".",
"format",
"(",
"', '",
".",
"join",
"(",
"const_params",
")",
")",
")",
"for",
"param",
"in",
"const_params",
":",
"meas",
".",
"AddConstantParam",
"(",
"param",
")",
"return",
"meas"
] | Create a Measurement from a list of Channels | [
"Create",
"a",
"Measurement",
"from",
"a",
"list",
"of",
"Channels"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/utils.py#L56-L104 | train |
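`make_measurement` above accepts `POI` as either a single name or a list of names, dispatching on `isinstance(POI, string_types)`. The order matters: a bare string is itself iterable, so without the type check first, iterating `"mu"` would register `'m'` and `'u'` as separate POIs. The dispatch extracted, with a stand-in class (not the HistFactory `Measurement`):

```python
class FakeMeasurement:
    """Stand-in for HistFactory's Measurement, just enough for the demo."""
    def __init__(self):
        self.pois = []
    def SetPOI(self, name):
        self.pois = [name]
    def AddPOI(self, name):
        self.pois.append(name)

def set_poi(meas, POI):
    # isinstance check must come before iteration: strings are iterable too
    if isinstance(POI, str):
        meas.SetPOI(POI)
    else:
        for p in POI:
            meas.AddPOI(p)

m = FakeMeasurement()
set_poi(m, "mu")
print(m.pois)  # ['mu']

m2 = FakeMeasurement()
set_poi(m2, ["mu", "alpha_sys"])
print(m2.pois)  # ['mu', 'alpha_sys']
```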
rootpy/rootpy | rootpy/stats/histfactory/utils.py | make_workspace | def make_workspace(measurement, channel=None, name=None, silence=False):
"""
Create a workspace containing the model for a measurement
If `channel` is None then include all channels in the model
If `silence` is True, then silence HistFactory's output on
stdout and stderr.
"""
context = silence_sout_serr if silence else do_nothing
with context():
hist2workspace = ROOT.RooStats.HistFactory.HistoToWorkspaceFactoryFast(
measurement)
if channel is not None:
workspace = hist2workspace.MakeSingleChannelModel(
measurement, channel)
else:
workspace = hist2workspace.MakeCombinedModel(measurement)
workspace = asrootpy(workspace)
keepalive(workspace, measurement)
if name is not None:
workspace.SetName('workspace_{0}'.format(name))
return workspace | python | def make_workspace(measurement, channel=None, name=None, silence=False):
"""
Create a workspace containing the model for a measurement
If `channel` is None then include all channels in the model
If `silence` is True, then silence HistFactory's output on
stdout and stderr.
"""
context = silence_sout_serr if silence else do_nothing
with context():
hist2workspace = ROOT.RooStats.HistFactory.HistoToWorkspaceFactoryFast(
measurement)
if channel is not None:
workspace = hist2workspace.MakeSingleChannelModel(
measurement, channel)
else:
workspace = hist2workspace.MakeCombinedModel(measurement)
workspace = asrootpy(workspace)
keepalive(workspace, measurement)
if name is not None:
workspace.SetName('workspace_{0}'.format(name))
return workspace | [
"def",
"make_workspace",
"(",
"measurement",
",",
"channel",
"=",
"None",
",",
"name",
"=",
"None",
",",
"silence",
"=",
"False",
")",
":",
"context",
"=",
"silence_sout_serr",
"if",
"silence",
"else",
"do_nothing",
"with",
"context",
"(",
")",
":",
"hist2workspace",
"=",
"ROOT",
".",
"RooStats",
".",
"HistFactory",
".",
"HistoToWorkspaceFactoryFast",
"(",
"measurement",
")",
"if",
"channel",
"is",
"not",
"None",
":",
"workspace",
"=",
"hist2workspace",
".",
"MakeSingleChannelModel",
"(",
"measurement",
",",
"channel",
")",
"else",
":",
"workspace",
"=",
"hist2workspace",
".",
"MakeCombinedModel",
"(",
"measurement",
")",
"workspace",
"=",
"asrootpy",
"(",
"workspace",
")",
"keepalive",
"(",
"workspace",
",",
"measurement",
")",
"if",
"name",
"is",
"not",
"None",
":",
"workspace",
".",
"SetName",
"(",
"'workspace_{0}'",
".",
"format",
"(",
"name",
")",
")",
"return",
"workspace"
] | Create a workspace containing the model for a measurement
If `channel` is None then include all channels in the model
If `silence` is True, then silence HistFactory's output on
stdout and stderr. | [
"Create",
"a",
"workspace",
"containing",
"the",
"model",
"for",
"a",
"measurement"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/utils.py#L107-L129 | train |
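`make_workspace` above picks its context manager up front — `context = silence_sout_serr if silence else do_nothing` — then runs the noisy HistFactory call under it. The same conditional-context pattern, with Python-level `redirect_stdout` standing in for `silence_sout_serr` (which in rootpy silences the C-level streams ROOT writes to):

```python
import io
from contextlib import nullcontext, redirect_stdout

def build(silence=False):
    """Conditional-context pattern from make_workspace above: choose the
    context manager first, then run the chatty code under it."""
    buf = io.StringIO()
    context = redirect_stdout(buf) if silence else nullcontext()
    with context:
        print("chatty tool output")  # stands in for hist2workspace's printout
        result = "workspace"
    return result, buf.getvalue()

print(build(silence=True))  # ('workspace', 'chatty tool output\n')
```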
rootpy/rootpy | rootpy/stats/histfactory/utils.py | measurements_from_xml | def measurements_from_xml(filename,
collect_histograms=True,
cd_parent=False,
silence=False):
"""
Read in a list of Measurements from XML
"""
if not os.path.isfile(filename):
raise OSError("the file {0} does not exist".format(filename))
silence_context = silence_sout_serr if silence else do_nothing
filename = os.path.abspath(os.path.normpath(filename))
if cd_parent:
xml_directory = os.path.dirname(filename)
parent = os.path.abspath(os.path.join(xml_directory, os.pardir))
cd_context = working_directory
else:
parent = None
cd_context = do_nothing
log.info("parsing XML in {0} ...".format(filename))
with cd_context(parent):
parser = ROOT.RooStats.HistFactory.ConfigParser()
with silence_context():
measurements_vect = parser.GetMeasurementsFromXML(filename)
# prevent measurements_vect from being garbage collected
ROOT.SetOwnership(measurements_vect, False)
measurements = []
for m in measurements_vect:
if collect_histograms:
with silence_context():
m.CollectHistograms()
measurements.append(asrootpy(m))
return measurements | python | def measurements_from_xml(filename,
collect_histograms=True,
cd_parent=False,
silence=False):
"""
Read in a list of Measurements from XML
"""
if not os.path.isfile(filename):
raise OSError("the file {0} does not exist".format(filename))
silence_context = silence_sout_serr if silence else do_nothing
filename = os.path.abspath(os.path.normpath(filename))
if cd_parent:
xml_directory = os.path.dirname(filename)
parent = os.path.abspath(os.path.join(xml_directory, os.pardir))
cd_context = working_directory
else:
parent = None
cd_context = do_nothing
log.info("parsing XML in {0} ...".format(filename))
with cd_context(parent):
parser = ROOT.RooStats.HistFactory.ConfigParser()
with silence_context():
measurements_vect = parser.GetMeasurementsFromXML(filename)
# prevent measurements_vect from being garbage collected
ROOT.SetOwnership(measurements_vect, False)
measurements = []
for m in measurements_vect:
if collect_histograms:
with silence_context():
m.CollectHistograms()
measurements.append(asrootpy(m))
return measurements | [
"def",
"measurements_from_xml",
"(",
"filename",
",",
"collect_histograms",
"=",
"True",
",",
"cd_parent",
"=",
"False",
",",
"silence",
"=",
"False",
")",
":",
"if",
"not",
"os",
".",
"path",
".",
"isfile",
"(",
"filename",
")",
":",
"raise",
"OSError",
"(",
"\"the file {0} does not exist\"",
".",
"format",
"(",
"filename",
")",
")",
"silence_context",
"=",
"silence_sout_serr",
"if",
"silence",
"else",
"do_nothing",
"filename",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"os",
".",
"path",
".",
"normpath",
"(",
"filename",
")",
")",
"if",
"cd_parent",
":",
"xml_directory",
"=",
"os",
".",
"path",
".",
"dirname",
"(",
"filename",
")",
"parent",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"os",
".",
"path",
".",
"join",
"(",
"xml_directory",
",",
"os",
".",
"pardir",
")",
")",
"cd_context",
"=",
"working_directory",
"else",
":",
"parent",
"=",
"None",
"cd_context",
"=",
"do_nothing",
"log",
".",
"info",
"(",
"\"parsing XML in {0} ...\"",
".",
"format",
"(",
"filename",
")",
")",
"with",
"cd_context",
"(",
"parent",
")",
":",
"parser",
"=",
"ROOT",
".",
"RooStats",
".",
"HistFactory",
".",
"ConfigParser",
"(",
")",
"with",
"silence_context",
"(",
")",
":",
"measurements_vect",
"=",
"parser",
".",
"GetMeasurementsFromXML",
"(",
"filename",
")",
"# prevent measurements_vect from being garbage collected",
"ROOT",
".",
"SetOwnership",
"(",
"measurements_vect",
",",
"False",
")",
"measurements",
"=",
"[",
"]",
"for",
"m",
"in",
"measurements_vect",
":",
"if",
"collect_histograms",
":",
"with",
"silence_context",
"(",
")",
":",
"m",
".",
"CollectHistograms",
"(",
")",
"measurements",
".",
"append",
"(",
"asrootpy",
"(",
"m",
")",
")",
"return",
"measurements"
] | Read in a list of Measurements from XML | [
"Read",
"in",
"a",
"list",
"of",
"Measurements",
"from",
"XML"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/utils.py#L132-L166 | train |
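`measurements_from_xml` above optionally cds into the XML file's parent directory while parsing (`cd_context`), because HistFactory XML references its histogram ROOT file by relative path. A minimal `working_directory` context manager of the kind used — rootpy ships its own helper, so this generic sketch only illustrates the chdir/restore behaviour:

```python
import os
import tempfile
from contextlib import contextmanager

@contextmanager
def working_directory(path):
    """Temporarily chdir into path (no-op if path is None), always
    restoring the original directory -- the cd_parent behaviour above."""
    previous = os.getcwd()
    if path is not None:
        os.chdir(path)
    try:
        yield
    finally:
        os.chdir(previous)

before = os.getcwd()
target = tempfile.mkdtemp()
with working_directory(target):
    inside = os.getcwd()  # parsing with relative paths would happen here
print(os.getcwd() == before, os.path.samefile(inside, target))  # True True
```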
rootpy/rootpy | rootpy/stats/histfactory/utils.py | write_measurement | def write_measurement(measurement,
root_file=None,
xml_path=None,
output_path=None,
output_suffix=None,
write_workspaces=False,
apply_xml_patches=True,
silence=False):
"""
Write a measurement and RooWorkspaces for all contained channels
into a ROOT file and write the XML files into a directory.
Parameters
----------
measurement : HistFactory::Measurement
An asrootpy'd ``HistFactory::Measurement`` object
root_file : ROOT TFile or string, optional (default=None)
A ROOT file or string file name. The measurement and workspaces
will be written to this file. If ``root_file is None`` then a
new file will be created with the same name as the measurement and
with the prefix ``ws_``.
xml_path : string, optional (default=None)
A directory path to write the XML into. If None, a new directory with
the same name as the measurement and with the prefix ``xml_`` will be
created.
output_path : string, optional (default=None)
If ``root_file is None``, create the ROOT file under this path.
If ``xml_path is None``, create the XML directory under this path.
output_suffix : string, optional (default=None)
If ``root_file is None`` then a new file is created with the same name
as the measurement and with the prefix ``ws_``. ``output_suffix`` will
append a suffix to this file name (before the .root extension).
If ``xml_path is None``, then a new directory is created with the
same name as the measurement and with the prefix ``xml_``.
``output_suffix`` will append a suffix to this directory name.
write_workspaces : bool, optional (default=False)
If True then also write a RooWorkspace for each channel and for all
channels combined.
apply_xml_patches : bool, optional (default=True)
Apply fixes on the output of ``Measurement::PrintXML()`` to avoid known
HistFactory bugs. Some of the patches assume that the ROOT file
containing the histograms will exist one directory level up from the
XML and that hist2workspace, or any tool that later reads the XML will
run from that same directory containing the ROOT file.
silence : bool, optional (default=False)
If True then capture and silence all stdout/stderr output from
HistFactory.
"""
context = silence_sout_serr if silence else do_nothing
output_name = measurement.name
if output_suffix is not None:
output_name += '_{0}'.format(output_suffix)
output_name = output_name.replace(' ', '_')
if xml_path is None:
xml_path = 'xml_{0}'.format(output_name)
if output_path is not None:
xml_path = os.path.join(output_path, xml_path)
if not os.path.exists(xml_path):
mkdir_p(xml_path)
if root_file is None:
root_file = 'ws_{0}.root'.format(output_name)
if output_path is not None:
root_file = os.path.join(output_path, root_file)
own_file = False
if isinstance(root_file, string_types):
root_file = root_open(root_file, 'recreate')
own_file = True
with preserve_current_directory():
root_file.cd()
log.info("writing histograms and measurement in {0} ...".format(
root_file.GetName()))
with context():
measurement.writeToFile(root_file)
# get modified measurement
out_m = root_file.Get(measurement.name)
log.info("writing XML in {0} ...".format(xml_path))
with context():
out_m.PrintXML(xml_path)
if write_workspaces:
log.info("writing combined model in {0} ...".format(
root_file.GetName()))
workspace = make_workspace(measurement, silence=silence)
workspace.Write()
for channel in measurement.channels:
log.info("writing model for channel `{0}` in {1} ...".format(
channel.name, root_file.GetName()))
workspace = make_workspace(
measurement, channel=channel, silence=silence)
workspace.Write()
if apply_xml_patches:
# patch the output XML to avoid HistFactory bugs
patch_xml(glob(os.path.join(xml_path, '*.xml')),
root_file=os.path.basename(root_file.GetName()))
if own_file:
root_file.Close() | python | def write_measurement(measurement,
root_file=None,
xml_path=None,
output_path=None,
output_suffix=None,
write_workspaces=False,
apply_xml_patches=True,
silence=False):
"""
Write a measurement and RooWorkspaces for all contained channels
into a ROOT file and write the XML files into a directory.
Parameters
----------
measurement : HistFactory::Measurement
An asrootpy'd ``HistFactory::Measurement`` object
root_file : ROOT TFile or string, optional (default=None)
A ROOT file or string file name. The measurement and workspaces
will be written to this file. If ``root_file is None`` then a
new file will be created with the same name as the measurement and
with the prefix ``ws_``.
xml_path : string, optional (default=None)
A directory path to write the XML into. If None, a new directory with
the same name as the measurement and with the prefix ``xml_`` will be
created.
output_path : string, optional (default=None)
If ``root_file is None``, create the ROOT file under this path.
If ``xml_path is None``, create the XML directory under this path.
output_suffix : string, optional (default=None)
If ``root_file is None`` then a new file is created with the same name
as the measurement and with the prefix ``ws_``. ``output_suffix`` will
append a suffix to this file name (before the .root extension).
If ``xml_path is None``, then a new directory is created with the
same name as the measurement and with the prefix ``xml_``.
``output_suffix`` will append a suffix to this directory name.
write_workspaces : bool, optional (default=False)
If True then also write a RooWorkspace for each channel and for all
channels combined.
apply_xml_patches : bool, optional (default=True)
Apply fixes on the output of ``Measurement::PrintXML()`` to avoid known
HistFactory bugs. Some of the patches assume that the ROOT file
containing the histograms will exist one directory level up from the
XML and that hist2workspace, or any tool that later reads the XML will
run from that same directory containing the ROOT file.
silence : bool, optional (default=False)
If True then capture and silence all stdout/stderr output from
HistFactory.
"""
context = silence_sout_serr if silence else do_nothing
output_name = measurement.name
if output_suffix is not None:
output_name += '_{0}'.format(output_suffix)
output_name = output_name.replace(' ', '_')
if xml_path is None:
xml_path = 'xml_{0}'.format(output_name)
if output_path is not None:
xml_path = os.path.join(output_path, xml_path)
if not os.path.exists(xml_path):
mkdir_p(xml_path)
if root_file is None:
root_file = 'ws_{0}.root'.format(output_name)
if output_path is not None:
root_file = os.path.join(output_path, root_file)
own_file = False
if isinstance(root_file, string_types):
root_file = root_open(root_file, 'recreate')
own_file = True
with preserve_current_directory():
root_file.cd()
log.info("writing histograms and measurement in {0} ...".format(
root_file.GetName()))
with context():
measurement.writeToFile(root_file)
# get modified measurement
out_m = root_file.Get(measurement.name)
log.info("writing XML in {0} ...".format(xml_path))
with context():
out_m.PrintXML(xml_path)
if write_workspaces:
log.info("writing combined model in {0} ...".format(
root_file.GetName()))
workspace = make_workspace(measurement, silence=silence)
workspace.Write()
for channel in measurement.channels:
log.info("writing model for channel `{0}` in {1} ...".format(
channel.name, root_file.GetName()))
workspace = make_workspace(
measurement, channel=channel, silence=silence)
workspace.Write()
if apply_xml_patches:
# patch the output XML to avoid HistFactory bugs
patch_xml(glob(os.path.join(xml_path, '*.xml')),
root_file=os.path.basename(root_file.GetName()))
if own_file:
root_file.Close() | [
"def",
"write_measurement",
"(",
"measurement",
",",
"root_file",
"=",
"None",
",",
"xml_path",
"=",
"None",
",",
"output_path",
"=",
"None",
",",
"output_suffix",
"=",
"None",
",",
"write_workspaces",
"=",
"False",
",",
"apply_xml_patches",
"=",
"True",
",",
"silence",
"=",
"False",
")",
":",
"context",
"=",
"silence_sout_serr",
"if",
"silence",
"else",
"do_nothing",
"output_name",
"=",
"measurement",
".",
"name",
"if",
"output_suffix",
"is",
"not",
"None",
":",
"output_name",
"+=",
"'_{0}'",
".",
"format",
"(",
"output_suffix",
")",
"output_name",
"=",
"output_name",
".",
"replace",
"(",
"' '",
",",
"'_'",
")",
"if",
"xml_path",
"is",
"None",
":",
"xml_path",
"=",
"'xml_{0}'",
".",
"format",
"(",
"output_name",
")",
"if",
"output_path",
"is",
"not",
"None",
":",
"xml_path",
"=",
"os",
".",
"path",
".",
"join",
"(",
"output_path",
",",
"xml_path",
")",
"if",
"not",
"os",
".",
"path",
".",
"exists",
"(",
"xml_path",
")",
":",
"mkdir_p",
"(",
"xml_path",
")",
"if",
"root_file",
"is",
"None",
":",
"root_file",
"=",
"'ws_{0}.root'",
".",
"format",
"(",
"output_name",
")",
"if",
"output_path",
"is",
"not",
"None",
":",
"root_file",
"=",
"os",
".",
"path",
".",
"join",
"(",
"output_path",
",",
"root_file",
")",
"own_file",
"=",
"False",
"if",
"isinstance",
"(",
"root_file",
",",
"string_types",
")",
":",
"root_file",
"=",
"root_open",
"(",
"root_file",
",",
"'recreate'",
")",
"own_file",
"=",
"True",
"with",
"preserve_current_directory",
"(",
")",
":",
"root_file",
".",
"cd",
"(",
")",
"log",
".",
"info",
"(",
"\"writing histograms and measurement in {0} ...\"",
".",
"format",
"(",
"root_file",
".",
"GetName",
"(",
")",
")",
")",
"with",
"context",
"(",
")",
":",
"measurement",
".",
"writeToFile",
"(",
"root_file",
")",
"# get modified measurement",
"out_m",
"=",
"root_file",
".",
"Get",
"(",
"measurement",
".",
"name",
")",
"log",
".",
"info",
"(",
"\"writing XML in {0} ...\"",
".",
"format",
"(",
"xml_path",
")",
")",
"with",
"context",
"(",
")",
":",
"out_m",
".",
"PrintXML",
"(",
"xml_path",
")",
"if",
"write_workspaces",
":",
"log",
".",
"info",
"(",
"\"writing combined model in {0} ...\"",
".",
"format",
"(",
"root_file",
".",
"GetName",
"(",
")",
")",
")",
"workspace",
"=",
"make_workspace",
"(",
"measurement",
",",
"silence",
"=",
"silence",
")",
"workspace",
".",
"Write",
"(",
")",
"for",
"channel",
"in",
"measurement",
".",
"channels",
":",
"log",
".",
"info",
"(",
"\"writing model for channel `{0}` in {1} ...\"",
".",
"format",
"(",
"channel",
".",
"name",
",",
"root_file",
".",
"GetName",
"(",
")",
")",
")",
"workspace",
"=",
"make_workspace",
"(",
"measurement",
",",
"channel",
"=",
"channel",
",",
"silence",
"=",
"silence",
")",
"workspace",
".",
"Write",
"(",
")",
"if",
"apply_xml_patches",
":",
"# patch the output XML to avoid HistFactory bugs",
"patch_xml",
"(",
"glob",
"(",
"os",
".",
"path",
".",
"join",
"(",
"xml_path",
",",
"'*.xml'",
")",
")",
",",
"root_file",
"=",
"os",
".",
"path",
".",
"basename",
"(",
"root_file",
".",
"GetName",
"(",
")",
")",
")",
"if",
"own_file",
":",
"root_file",
".",
"Close",
"(",
")"
] | Write a measurement and RooWorkspaces for all contained channels
into a ROOT file and write the XML files into a directory.
Parameters
----------
measurement : HistFactory::Measurement
An asrootpy'd ``HistFactory::Measurement`` object
root_file : ROOT TFile or string, optional (default=None)
A ROOT file or string file name. The measurement and workspaces
will be written to this file. If ``root_file is None`` then a
new file will be created with the same name as the measurement and
with the prefix ``ws_``.
xml_path : string, optional (default=None)
A directory path to write the XML into. If None, a new directory with
the same name as the measurement and with the prefix ``xml_`` will be
created.
output_path : string, optional (default=None)
If ``root_file is None``, create the ROOT file under this path.
If ``xml_path is None``, create the XML directory under this path.
output_suffix : string, optional (default=None)
If ``root_file is None`` then a new file is created with the same name
as the measurement and with the prefix ``ws_``. ``output_suffix`` will
append a suffix to this file name (before the .root extension).
If ``xml_path is None``, then a new directory is created with the
same name as the measurement and with the prefix ``xml_``.
``output_suffix`` will append a suffix to this directory name.
write_workspaces : bool, optional (default=False)
If True then also write a RooWorkspace for each channel and for all
channels combined.
apply_xml_patches : bool, optional (default=True)
Apply fixes on the output of ``Measurement::PrintXML()`` to avoid known
HistFactory bugs. Some of the patches assume that the ROOT file
containing the histograms will exist one directory level up from the
XML and that hist2workspace, or any tool that later reads the XML will
run from that same directory containing the ROOT file.
silence : bool, optional (default=False)
If True then capture and silence all stdout/stderr output from
HistFactory. | [
"Write",
"a",
"measurement",
"and",
"RooWorkspaces",
"for",
"all",
"contained",
"channels",
"into",
"a",
"ROOT",
"file",
"and",
"write",
"the",
"XML",
"files",
"into",
"a",
"directory",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/utils.py#L169-L282 | train |
rootpy/rootpy | rootpy/stats/histfactory/utils.py | patch_xml | def patch_xml(files, root_file=None, float_precision=3):
"""
Apply patches to HistFactory XML output from PrintXML
"""
if float_precision < 0:
raise ValueError("precision must be greater than 0")
def fix_path(match):
path = match.group(1)
if path:
head, tail = os.path.split(path)
new_path = os.path.join(os.path.basename(head), tail)
else:
new_path = ''
return '<Input>{0}</Input>'.format(new_path)
for xmlfilename in files:
xmlfilename = os.path.abspath(os.path.normpath(xmlfilename))
patched_xmlfilename = '{0}.tmp'.format(xmlfilename)
log.info("patching {0} ...".format(xmlfilename))
fin = open(xmlfilename, 'r')
fout = open(patched_xmlfilename, 'w')
for line in fin:
if root_file is not None:
line = re.sub(
'InputFile="[^"]*"',
'InputFile="{0}"'.format(root_file), line)
line = line.replace(
'<StatError Activate="True" InputFile="" '
'HistoName="" HistoPath="" />',
'<StatError Activate="True" />')
line = re.sub(
'<Combination OutputFilePrefix="(\S*)" >',
'<Combination OutputFilePrefix="hist2workspace" >', line)
line = re.sub('\w+=""', '', line)
line = re.sub('\s+/>', ' />', line)
line = re.sub('(\S)\s+</', r'\1</', line)
# HistFactory bug:
line = re.sub('InputFileHigh="\S+"', '', line)
line = re.sub('InputFileLow="\S+"', '', line)
# HistFactory bug:
line = line.replace(
'<ParamSetting Const="True"></ParamSetting>', '')
# chop off floats to desired precision
line = re.sub(
r'"(\d*\.\d{{{0:d},}})"'.format(float_precision + 1),
lambda x: '"{0}"'.format(
str(round(float(x.group(1)), float_precision))),
line)
line = re.sub('"\s\s+(\S)', r'" \1', line)
line = re.sub('<Input>(.*)</Input>', fix_path, line)
fout.write(line)
fin.close()
fout.close()
shutil.move(patched_xmlfilename, xmlfilename)
if not os.path.isfile(os.path.join(
os.path.dirname(xmlfilename),
'HistFactorySchema.dtd')):
rootsys = os.getenv('ROOTSYS', None)
if rootsys is not None:
dtdfile = os.path.join(rootsys, 'etc/HistFactorySchema.dtd')
target = os.path.dirname(xmlfilename)
if os.path.isfile(dtdfile):
log.info("copying {0} to {1} ...".format(dtdfile, target))
shutil.copy(dtdfile, target)
else:
log.warning("{0} does not exist".format(dtdfile))
else:
log.warning(
"$ROOTSYS is not set so cannot find HistFactorySchema.dtd") | python | def patch_xml(files, root_file=None, float_precision=3):
"""
Apply patches to HistFactory XML output from PrintXML
"""
if float_precision < 0:
raise ValueError("precision must be greater than 0")
def fix_path(match):
path = match.group(1)
if path:
head, tail = os.path.split(path)
new_path = os.path.join(os.path.basename(head), tail)
else:
new_path = ''
return '<Input>{0}</Input>'.format(new_path)
for xmlfilename in files:
xmlfilename = os.path.abspath(os.path.normpath(xmlfilename))
patched_xmlfilename = '{0}.tmp'.format(xmlfilename)
log.info("patching {0} ...".format(xmlfilename))
fin = open(xmlfilename, 'r')
fout = open(patched_xmlfilename, 'w')
for line in fin:
if root_file is not None:
line = re.sub(
'InputFile="[^"]*"',
'InputFile="{0}"'.format(root_file), line)
line = line.replace(
'<StatError Activate="True" InputFile="" '
'HistoName="" HistoPath="" />',
'<StatError Activate="True" />')
line = re.sub(
'<Combination OutputFilePrefix="(\S*)" >',
'<Combination OutputFilePrefix="hist2workspace" >', line)
line = re.sub('\w+=""', '', line)
line = re.sub('\s+/>', ' />', line)
line = re.sub('(\S)\s+</', r'\1</', line)
# HistFactory bug:
line = re.sub('InputFileHigh="\S+"', '', line)
line = re.sub('InputFileLow="\S+"', '', line)
# HistFactory bug:
line = line.replace(
'<ParamSetting Const="True"></ParamSetting>', '')
# chop off floats to desired precision
line = re.sub(
r'"(\d*\.\d{{{0:d},}})"'.format(float_precision + 1),
lambda x: '"{0}"'.format(
str(round(float(x.group(1)), float_precision))),
line)
line = re.sub('"\s\s+(\S)', r'" \1', line)
line = re.sub('<Input>(.*)</Input>', fix_path, line)
fout.write(line)
fin.close()
fout.close()
shutil.move(patched_xmlfilename, xmlfilename)
if not os.path.isfile(os.path.join(
os.path.dirname(xmlfilename),
'HistFactorySchema.dtd')):
rootsys = os.getenv('ROOTSYS', None)
if rootsys is not None:
dtdfile = os.path.join(rootsys, 'etc/HistFactorySchema.dtd')
target = os.path.dirname(xmlfilename)
if os.path.isfile(dtdfile):
log.info("copying {0} to {1} ...".format(dtdfile, target))
shutil.copy(dtdfile, target)
else:
log.warning("{0} does not exist".format(dtdfile))
else:
log.warning(
"$ROOTSYS is not set so cannot find HistFactorySchema.dtd") | [
"def",
"patch_xml",
"(",
"files",
",",
"root_file",
"=",
"None",
",",
"float_precision",
"=",
"3",
")",
":",
"if",
"float_precision",
"<",
"0",
":",
"raise",
"ValueError",
"(",
"\"precision must be greater than 0\"",
")",
"def",
"fix_path",
"(",
"match",
")",
":",
"path",
"=",
"match",
".",
"group",
"(",
"1",
")",
"if",
"path",
":",
"head",
",",
"tail",
"=",
"os",
".",
"path",
".",
"split",
"(",
"path",
")",
"new_path",
"=",
"os",
".",
"path",
".",
"join",
"(",
"os",
".",
"path",
".",
"basename",
"(",
"head",
")",
",",
"tail",
")",
"else",
":",
"new_path",
"=",
"''",
"return",
"'<Input>{0}</Input>'",
".",
"format",
"(",
"new_path",
")",
"for",
"xmlfilename",
"in",
"files",
":",
"xmlfilename",
"=",
"os",
".",
"path",
".",
"abspath",
"(",
"os",
".",
"path",
".",
"normpath",
"(",
"xmlfilename",
")",
")",
"patched_xmlfilename",
"=",
"'{0}.tmp'",
".",
"format",
"(",
"xmlfilename",
")",
"log",
".",
"info",
"(",
"\"patching {0} ...\"",
".",
"format",
"(",
"xmlfilename",
")",
")",
"fin",
"=",
"open",
"(",
"xmlfilename",
",",
"'r'",
")",
"fout",
"=",
"open",
"(",
"patched_xmlfilename",
",",
"'w'",
")",
"for",
"line",
"in",
"fin",
":",
"if",
"root_file",
"is",
"not",
"None",
":",
"line",
"=",
"re",
".",
"sub",
"(",
"'InputFile=\"[^\"]*\"'",
",",
"'InputFile=\"{0}\"'",
".",
"format",
"(",
"root_file",
")",
",",
"line",
")",
"line",
"=",
"line",
".",
"replace",
"(",
"'<StatError Activate=\"True\" InputFile=\"\" '",
"'HistoName=\"\" HistoPath=\"\" />'",
",",
"'<StatError Activate=\"True\" />'",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'<Combination OutputFilePrefix=\"(\\S*)\" >'",
",",
"'<Combination OutputFilePrefix=\"hist2workspace\" >'",
",",
"line",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'\\w+=\"\"'",
",",
"''",
",",
"line",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'\\s+/>'",
",",
"' />'",
",",
"line",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'(\\S)\\s+</'",
",",
"r'\\1</'",
",",
"line",
")",
"# HistFactory bug:",
"line",
"=",
"re",
".",
"sub",
"(",
"'InputFileHigh=\"\\S+\"'",
",",
"''",
",",
"line",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'InputFileLow=\"\\S+\"'",
",",
"''",
",",
"line",
")",
"# HistFactory bug:",
"line",
"=",
"line",
".",
"replace",
"(",
"'<ParamSetting Const=\"True\"></ParamSetting>'",
",",
"''",
")",
"# chop off floats to desired precision",
"line",
"=",
"re",
".",
"sub",
"(",
"r'\"(\\d*\\.\\d{{{0:d},}})\"'",
".",
"format",
"(",
"float_precision",
"+",
"1",
")",
",",
"lambda",
"x",
":",
"'\"{0}\"'",
".",
"format",
"(",
"str",
"(",
"round",
"(",
"float",
"(",
"x",
".",
"group",
"(",
"1",
")",
")",
",",
"float_precision",
")",
")",
")",
",",
"line",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'\"\\s\\s+(\\S)'",
",",
"r'\" \\1'",
",",
"line",
")",
"line",
"=",
"re",
".",
"sub",
"(",
"'<Input>(.*)</Input>'",
",",
"fix_path",
",",
"line",
")",
"fout",
".",
"write",
"(",
"line",
")",
"fin",
".",
"close",
"(",
")",
"fout",
".",
"close",
"(",
")",
"shutil",
".",
"move",
"(",
"patched_xmlfilename",
",",
"xmlfilename",
")",
"if",
"not",
"os",
".",
"path",
".",
"isfile",
"(",
"os",
".",
"path",
".",
"join",
"(",
"os",
".",
"path",
".",
"dirname",
"(",
"xmlfilename",
")",
",",
"'HistFactorySchema.dtd'",
")",
")",
":",
"rootsys",
"=",
"os",
".",
"getenv",
"(",
"'ROOTSYS'",
",",
"None",
")",
"if",
"rootsys",
"is",
"not",
"None",
":",
"dtdfile",
"=",
"os",
".",
"path",
".",
"join",
"(",
"rootsys",
",",
"'etc/HistFactorySchema.dtd'",
")",
"target",
"=",
"os",
".",
"path",
".",
"dirname",
"(",
"xmlfilename",
")",
"if",
"os",
".",
"path",
".",
"isfile",
"(",
"dtdfile",
")",
":",
"log",
".",
"info",
"(",
"\"copying {0} to {1} ...\"",
".",
"format",
"(",
"dtdfile",
",",
"target",
")",
")",
"shutil",
".",
"copy",
"(",
"dtdfile",
",",
"target",
")",
"else",
":",
"log",
".",
"warning",
"(",
"\"{0} does not exist\"",
".",
"format",
"(",
"dtdfile",
")",
")",
"else",
":",
"log",
".",
"warning",
"(",
"\"$ROOTSYS is not set so cannot find HistFactorySchema.dtd\"",
")"
] | Apply patches to HistFactory XML output from PrintXML | [
"Apply",
"patches",
"to",
"HistFactory",
"XML",
"output",
"from",
"PrintXML"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/stats/histfactory/utils.py#L285-L354 | train |
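The float-truncation step recorded in the `patch_xml` row above is the subtlest of its regex substitutions. A standalone re-implementation sketch of that one substitution (plain Python, no ROOT required; the sample XML attributes below are invented for illustration):

```python
import re

def truncate_floats(line, precision=3):
    # Replace quoted floats that have more than `precision` decimal places
    # with the value rounded to `precision` places, mirroring the re.sub
    # performed in patch_xml above.
    return re.sub(
        r'"(\d*\.\d{{{0:d},}})"'.format(precision + 1),
        lambda m: '"{0}"'.format(str(round(float(m.group(1)), precision))),
        line)

# Hypothetical XML fragment: only the over-precise attribute is rewritten.
print(truncate_floats('<Sample Norm="1.234567" Lumi="2.5" />'))
```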
rootpy/rootpy | rootpy/plotting/views.py | _FolderView.path | def path(self):
''' Get the path of the wrapped folder '''
if isinstance(self.dir, Directory):
return self.dir._path
elif isinstance(self.dir, ROOT.TDirectory):
return self.dir.GetPath()
elif isinstance(self.dir, _FolderView):
return self.dir.path()
else:
return str(self.dir) | python | def path(self):
''' Get the path of the wrapped folder '''
if isinstance(self.dir, Directory):
return self.dir._path
elif isinstance(self.dir, ROOT.TDirectory):
return self.dir.GetPath()
elif isinstance(self.dir, _FolderView):
return self.dir.path()
else:
return str(self.dir) | [
"def",
"path",
"(",
"self",
")",
":",
"if",
"isinstance",
"(",
"self",
".",
"dir",
",",
"Directory",
")",
":",
"return",
"self",
".",
"dir",
".",
"_path",
"elif",
"isinstance",
"(",
"self",
".",
"dir",
",",
"ROOT",
".",
"TDirectory",
")",
":",
"return",
"self",
".",
"dir",
".",
"GetPath",
"(",
")",
"elif",
"isinstance",
"(",
"self",
".",
"dir",
",",
"_FolderView",
")",
":",
"return",
"self",
".",
"dir",
".",
"path",
"(",
")",
"else",
":",
"return",
"str",
"(",
"self",
".",
"dir",
")"
] | Get the path of the wrapped folder | [
"Get",
"the",
"path",
"of",
"the",
"wrapped",
"folder"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/views.py#L293-L302 | train |
rootpy/rootpy | rootpy/plotting/views.py | _MultiFolderView.Get | def Get(self, path):
''' Merge the objects at path in all subdirectories '''
return self.merge_views(x.Get(path) for x in self.dirs) | python | def Get(self, path):
''' Merge the objects at path in all subdirectories '''
return self.merge_views(x.Get(path) for x in self.dirs) | [
"def",
"Get",
"(",
"self",
",",
"path",
")",
":",
"return",
"self",
".",
"merge_views",
"(",
"x",
".",
"Get",
"(",
"path",
")",
"for",
"x",
"in",
"self",
".",
"dirs",
")"
] | Merge the objects at path in all subdirectories | [
"Merge",
"the",
"objects",
"at",
"path",
"in",
"all",
"subdirectories"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/views.py#L341-L343 | train |
rootpy/rootpy | rootpy/logger/roothandler.py | python_logging_error_handler | def python_logging_error_handler(level, root_says_abort, location, msg):
"""
A python error handler for ROOT which maps ROOT's errors and warnings on
to python's.
"""
from ..utils import quickroot as QROOT
if not Initialized.value:
try:
QROOT.kTRUE
except AttributeError:
# Python is exiting. Do nothing.
return
QROOT.kInfo, QROOT.kWarning, QROOT.kError, QROOT.kFatal, QROOT.kSysError
QROOT.gErrorIgnoreLevel
Initialized.value = True
try:
QROOT.kTRUE
except RuntimeError:
# Note: If the above causes us problems, it's because this logging
# handler has been called multiple times already with an
# exception. In that case we need to force upstream to raise it.
_, exc, traceback = sys.exc_info()
caller = sys._getframe(2)
re_execute_with_exception(caller, exc, traceback)
if level < QROOT.gErrorIgnoreLevel:
# Needed to silence some "normal" startup warnings
# (copied from PyROOT Utility.cxx)
return
if sys.version_info[0] >= 3:
location = location.decode('utf-8')
msg = msg.decode('utf-8')
log = ROOT_log.getChild(location.replace("::", "."))
if level >= QROOT.kSysError or level >= QROOT.kFatal:
lvl = logging.CRITICAL
elif level >= QROOT.kError:
lvl = logging.ERROR
elif level >= QROOT.kWarning:
lvl = logging.WARNING
elif level >= QROOT.kInfo:
lvl = logging.INFO
else:
lvl = logging.DEBUG
if not SANE_REGEX.match(msg):
# Not ASCII characters. Escape them.
msg = repr(msg)[1:-1]
# Apply fixups to improve consistency of errors/warnings
lvl, msg = fixup_msg(lvl, msg)
log.log(lvl, msg)
# String checks are used because we need a way of (un)forcing abort without
# modifying a global variable (gErrorAbortLevel) for the multithread tests
abort = lvl >= ABORT_LEVEL or "rootpy.ALWAYSABORT" in msg or root_says_abort
if abort and not "rootpy.NEVERABORT" in msg:
caller = sys._getframe(1)
try:
# We can't raise an exception from here because ctypes/PyROOT swallows it.
# Hence the need for dark magic, we re-raise it within a trace.
from .. import ROOTError
raise ROOTError(level, location, msg)
except RuntimeError:
_, exc, traceback = sys.exc_info()
if SHOWTRACE.enabled:
from traceback import print_stack
print_stack(caller)
if DANGER.enabled:
# Avert your eyes, dark magic be within...
re_execute_with_exception(caller, exc, traceback)
if root_says_abort:
log.critical("abort().. expect a stack trace")
ctypes.CDLL(None).abort() | python | def python_logging_error_handler(level, root_says_abort, location, msg):
"""
A python error handler for ROOT which maps ROOT's errors and warnings on
to python's.
"""
from ..utils import quickroot as QROOT
if not Initialized.value:
try:
QROOT.kTRUE
except AttributeError:
# Python is exiting. Do nothing.
return
QROOT.kInfo, QROOT.kWarning, QROOT.kError, QROOT.kFatal, QROOT.kSysError
QROOT.gErrorIgnoreLevel
Initialized.value = True
try:
QROOT.kTRUE
except RuntimeError:
# Note: If the above causes us problems, it's because this logging
# handler has been called multiple times already with an
# exception. In that case we need to force upstream to raise it.
_, exc, traceback = sys.exc_info()
caller = sys._getframe(2)
re_execute_with_exception(caller, exc, traceback)
if level < QROOT.gErrorIgnoreLevel:
# Needed to silence some "normal" startup warnings
# (copied from PyROOT Utility.cxx)
return
if sys.version_info[0] >= 3:
location = location.decode('utf-8')
msg = msg.decode('utf-8')
log = ROOT_log.getChild(location.replace("::", "."))
if level >= QROOT.kSysError or level >= QROOT.kFatal:
lvl = logging.CRITICAL
elif level >= QROOT.kError:
lvl = logging.ERROR
elif level >= QROOT.kWarning:
lvl = logging.WARNING
elif level >= QROOT.kInfo:
lvl = logging.INFO
else:
lvl = logging.DEBUG
if not SANE_REGEX.match(msg):
# Not ASCII characters. Escape them.
msg = repr(msg)[1:-1]
# Apply fixups to improve consistency of errors/warnings
lvl, msg = fixup_msg(lvl, msg)
log.log(lvl, msg)
# String checks are used because we need a way of (un)forcing abort without
# modifying a global variable (gErrorAbortLevel) for the multithread tests
abort = lvl >= ABORT_LEVEL or "rootpy.ALWAYSABORT" in msg or root_says_abort
if abort and not "rootpy.NEVERABORT" in msg:
caller = sys._getframe(1)
try:
# We can't raise an exception from here because ctypes/PyROOT swallows it.
# Hence the need for dark magic, we re-raise it within a trace.
from .. import ROOTError
raise ROOTError(level, location, msg)
except RuntimeError:
_, exc, traceback = sys.exc_info()
if SHOWTRACE.enabled:
from traceback import print_stack
print_stack(caller)
if DANGER.enabled:
# Avert your eyes, dark magic be within...
re_execute_with_exception(caller, exc, traceback)
if root_says_abort:
log.critical("abort().. expect a stack trace")
ctypes.CDLL(None).abort() | [
"def",
"python_logging_error_handler",
"(",
"level",
",",
"root_says_abort",
",",
"location",
",",
"msg",
")",
":",
"from",
".",
".",
"utils",
"import",
"quickroot",
"as",
"QROOT",
"if",
"not",
"Initialized",
".",
"value",
":",
"try",
":",
"QROOT",
".",
"kTRUE",
"except",
"AttributeError",
":",
"# Python is exiting. Do nothing.",
"return",
"QROOT",
".",
"kInfo",
",",
"QROOT",
".",
"kWarning",
",",
"QROOT",
".",
"kError",
",",
"QROOT",
".",
"kFatal",
",",
"QROOT",
".",
"kSysError",
"QROOT",
".",
"gErrorIgnoreLevel",
"Initialized",
".",
"value",
"=",
"True",
"try",
":",
"QROOT",
".",
"kTRUE",
"except",
"RuntimeError",
":",
"# Note: If the above causes us problems, it's because this logging",
"# handler has been called multiple times already with an",
"# exception. In that case we need to force upstream to raise it.",
"_",
",",
"exc",
",",
"traceback",
"=",
"sys",
".",
"exc_info",
"(",
")",
"caller",
"=",
"sys",
".",
"_getframe",
"(",
"2",
")",
"re_execute_with_exception",
"(",
"caller",
",",
"exc",
",",
"traceback",
")",
"if",
"level",
"<",
"QROOT",
".",
"gErrorIgnoreLevel",
":",
"# Needed to silence some \"normal\" startup warnings",
"# (copied from PyROOT Utility.cxx)",
"return",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
">=",
"3",
":",
"location",
"=",
"location",
".",
"decode",
"(",
"'utf-8'",
")",
"msg",
"=",
"msg",
".",
"decode",
"(",
"'utf-8'",
")",
"log",
"=",
"ROOT_log",
".",
"getChild",
"(",
"location",
".",
"replace",
"(",
"\"::\"",
",",
"\".\"",
")",
")",
"if",
"level",
">=",
"QROOT",
".",
"kSysError",
"or",
"level",
">=",
"QROOT",
".",
"kFatal",
":",
"lvl",
"=",
"logging",
".",
"CRITICAL",
"elif",
"level",
">=",
"QROOT",
".",
"kError",
":",
"lvl",
"=",
"logging",
".",
"ERROR",
"elif",
"level",
">=",
"QROOT",
".",
"kWarning",
":",
"lvl",
"=",
"logging",
".",
"WARNING",
"elif",
"level",
">=",
"QROOT",
".",
"kInfo",
":",
"lvl",
"=",
"logging",
".",
"INFO",
"else",
":",
"lvl",
"=",
"logging",
".",
"DEBUG",
"if",
"not",
"SANE_REGEX",
".",
"match",
"(",
"msg",
")",
":",
"# Not ASCII characters. Escape them.",
"msg",
"=",
"repr",
"(",
"msg",
")",
"[",
"1",
":",
"-",
"1",
"]",
"# Apply fixups to improve consistency of errors/warnings",
"lvl",
",",
"msg",
"=",
"fixup_msg",
"(",
"lvl",
",",
"msg",
")",
"log",
".",
"log",
"(",
"lvl",
",",
"msg",
")",
"# String checks are used because we need a way of (un)forcing abort without",
"# modifying a global variable (gErrorAbortLevel) for the multithread tests",
"abort",
"=",
"lvl",
">=",
"ABORT_LEVEL",
"or",
"\"rootpy.ALWAYSABORT\"",
"in",
"msg",
"or",
"root_says_abort",
"if",
"abort",
"and",
"not",
"\"rootpy.NEVERABORT\"",
"in",
"msg",
":",
"caller",
"=",
"sys",
".",
"_getframe",
"(",
"1",
")",
"try",
":",
"# We can't raise an exception from here because ctypes/PyROOT swallows it.",
"# Hence the need for dark magic, we re-raise it within a trace.",
"from",
".",
".",
"import",
"ROOTError",
"raise",
"ROOTError",
"(",
"level",
",",
"location",
",",
"msg",
")",
"except",
"RuntimeError",
":",
"_",
",",
"exc",
",",
"traceback",
"=",
"sys",
".",
"exc_info",
"(",
")",
"if",
"SHOWTRACE",
".",
"enabled",
":",
"from",
"traceback",
"import",
"print_stack",
"print_stack",
"(",
"caller",
")",
"if",
"DANGER",
".",
"enabled",
":",
"# Avert your eyes, dark magic be within...",
"re_execute_with_exception",
"(",
"caller",
",",
"exc",
",",
"traceback",
")",
"if",
"root_says_abort",
":",
"log",
".",
"critical",
"(",
"\"abort().. expect a stack trace\"",
")",
"ctypes",
".",
"CDLL",
"(",
"None",
")",
".",
"abort",
"(",
")"
] | A python error handler for ROOT which maps ROOT's errors and warnings on
to python's. | [
"A",
"python",
"error",
"handler",
"for",
"ROOT",
"which",
"maps",
"ROOT",
"s",
"errors",
"and",
"warnings",
"on",
"to",
"python",
"s",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/roothandler.py#L42-L124 | train |
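The token stream above reconstructs rootpy's ROOT error handler, which maps ROOT's numeric severity levels onto Python `logging` levels. A minimal sketch of just that mapping, with the threshold constants taken to be the values from ROOT's `TError.h` (these exact numbers are an assumption; check your ROOT version):

```python
import logging

# Severity thresholds as defined in ROOT's TError.h (assumed values).
K_INFO, K_WARNING, K_ERROR, K_SYS_ERROR, K_FATAL = 1000, 2000, 3000, 5000, 6000

def root_to_logging_level(level):
    """Map a ROOT severity integer to a Python logging level.
    Mirrors the if/elif ladder in rootpy's handler, including the
    (redundant, since kFatal > kSysError) first condition."""
    if level >= K_SYS_ERROR or level >= K_FATAL:
        return logging.CRITICAL
    elif level >= K_ERROR:
        return logging.ERROR
    elif level >= K_WARNING:
        return logging.WARNING
    elif level >= K_INFO:
        return logging.INFO
    return logging.DEBUG
```

Anything below `kInfo` falls through to `DEBUG`, so ROOT's `kPrint`-level chatter ends up at the lowest logging severity.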
rootpy/rootpy | rootpy/context.py | preserve_current_canvas | def preserve_current_canvas():
"""
Context manager which ensures that the current canvas remains the current
canvas when the context is left.
"""
old = ROOT.gPad
try:
yield
finally:
if old:
old.cd()
elif ROOT.gPad:
# Put things back how they were before.
with invisible_canvas():
# This is a round-about way of resetting gPad to None.
# No other technique I tried could do it.
pass | python | def preserve_current_canvas():
"""
Context manager which ensures that the current canvas remains the current
canvas when the context is left.
"""
old = ROOT.gPad
try:
yield
finally:
if old:
old.cd()
elif ROOT.gPad:
# Put things back how they were before.
with invisible_canvas():
# This is a round-about way of resetting gPad to None.
# No other technique I tried could do it.
pass | [
"def",
"preserve_current_canvas",
"(",
")",
":",
"old",
"=",
"ROOT",
".",
"gPad",
"try",
":",
"yield",
"finally",
":",
"if",
"old",
":",
"old",
".",
"cd",
"(",
")",
"elif",
"ROOT",
".",
"gPad",
":",
"# Put things back how they were before.",
"with",
"invisible_canvas",
"(",
")",
":",
"# This is a round-about way of resetting gPad to None.",
"# No other technique I tried could do it.",
"pass"
] | Context manager which ensures that the current canvas remains the current
canvas when the context is left. | [
"Context",
"manager",
"which",
"ensures",
"that",
"the",
"current",
"canvas",
"remains",
"the",
"current",
"canvas",
"when",
"the",
"context",
"is",
"left",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/context.py#L46-L62 | train |
rootpy/rootpy | rootpy/context.py | preserve_batch_state | def preserve_batch_state():
"""
Context manager which ensures the batch state is the same on exit as it was
on entry.
"""
with LOCK:
old = ROOT.gROOT.IsBatch()
try:
yield
finally:
ROOT.gROOT.SetBatch(old) | python | def preserve_batch_state():
"""
Context manager which ensures the batch state is the same on exit as it was
on entry.
"""
with LOCK:
old = ROOT.gROOT.IsBatch()
try:
yield
finally:
ROOT.gROOT.SetBatch(old) | [
"def",
"preserve_batch_state",
"(",
")",
":",
"with",
"LOCK",
":",
"old",
"=",
"ROOT",
".",
"gROOT",
".",
"IsBatch",
"(",
")",
"try",
":",
"yield",
"finally",
":",
"ROOT",
".",
"gROOT",
".",
"SetBatch",
"(",
"old",
")"
] | Context manager which ensures the batch state is the same on exit as it was
on entry. | [
"Context",
"manager",
"which",
"ensures",
"the",
"batch",
"state",
"is",
"the",
"same",
"on",
"exit",
"as",
"it",
"was",
"on",
"entry",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/context.py#L81-L91 | train |
rootpy/rootpy | rootpy/context.py | invisible_canvas | def invisible_canvas():
"""
Context manager yielding a temporary canvas drawn in batch mode, invisible
to the user. Original state is restored on exit.
Example use; obtain X axis object without interfering with anything::
with invisible_canvas() as c:
efficiency.Draw()
g = efficiency.GetPaintedGraph()
return g.GetXaxis()
"""
with preserve_current_canvas():
with preserve_batch_state():
ROOT.gROOT.SetBatch()
c = ROOT.TCanvas()
try:
c.cd()
yield c
finally:
c.Close()
c.IsA().Destructor(c) | python | def invisible_canvas():
"""
Context manager yielding a temporary canvas drawn in batch mode, invisible
to the user. Original state is restored on exit.
Example use; obtain X axis object without interfering with anything::
with invisible_canvas() as c:
efficiency.Draw()
g = efficiency.GetPaintedGraph()
return g.GetXaxis()
"""
with preserve_current_canvas():
with preserve_batch_state():
ROOT.gROOT.SetBatch()
c = ROOT.TCanvas()
try:
c.cd()
yield c
finally:
c.Close()
c.IsA().Destructor(c) | [
"def",
"invisible_canvas",
"(",
")",
":",
"with",
"preserve_current_canvas",
"(",
")",
":",
"with",
"preserve_batch_state",
"(",
")",
":",
"ROOT",
".",
"gROOT",
".",
"SetBatch",
"(",
")",
"c",
"=",
"ROOT",
".",
"TCanvas",
"(",
")",
"try",
":",
"c",
".",
"cd",
"(",
")",
"yield",
"c",
"finally",
":",
"c",
".",
"Close",
"(",
")",
"c",
".",
"IsA",
"(",
")",
".",
"Destructor",
"(",
"c",
")"
] | Context manager yielding a temporary canvas drawn in batch mode, invisible
to the user. Original state is restored on exit.
Example use; obtain X axis object without interfering with anything::
with invisible_canvas() as c:
efficiency.Draw()
g = efficiency.GetPaintedGraph()
return g.GetXaxis() | [
"Context",
"manager",
"yielding",
"a",
"temporary",
"canvas",
"drawn",
"in",
"batch",
"mode",
"invisible",
"to",
"the",
"user",
".",
"Original",
"state",
"is",
"restored",
"on",
"exit",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/context.py#L95-L116 | train |
rootpy/rootpy | rootpy/context.py | thread_specific_tmprootdir | def thread_specific_tmprootdir():
"""
Context manager which makes a thread specific gDirectory to avoid
interfering with the current file.
Use cases:
A TTree Draw function which doesn't want to interfere with whatever
gDirectory happens to be.
Multi-threading where there are two threads creating objects with the
same name which must reside in a directory. (again, this happens with
TTree draw)
"""
with preserve_current_directory():
dname = "rootpy-tmp/thread/{0}".format(
threading.current_thread().ident)
d = ROOT.gROOT.mkdir(dname)
if not d:
d = ROOT.gROOT.GetDirectory(dname)
assert d, "Unexpected failure, can't cd to tmpdir."
d.cd()
yield d | python | def thread_specific_tmprootdir():
"""
Context manager which makes a thread specific gDirectory to avoid
interfering with the current file.
Use cases:
A TTree Draw function which doesn't want to interfere with whatever
gDirectory happens to be.
Multi-threading where there are two threads creating objects with the
same name which must reside in a directory. (again, this happens with
TTree draw)
"""
with preserve_current_directory():
dname = "rootpy-tmp/thread/{0}".format(
threading.current_thread().ident)
d = ROOT.gROOT.mkdir(dname)
if not d:
d = ROOT.gROOT.GetDirectory(dname)
assert d, "Unexpected failure, can't cd to tmpdir."
d.cd()
yield d | [
"def",
"thread_specific_tmprootdir",
"(",
")",
":",
"with",
"preserve_current_directory",
"(",
")",
":",
"dname",
"=",
"\"rootpy-tmp/thread/{0}\"",
".",
"format",
"(",
"threading",
".",
"current_thread",
"(",
")",
".",
"ident",
")",
"d",
"=",
"ROOT",
".",
"gROOT",
".",
"mkdir",
"(",
"dname",
")",
"if",
"not",
"d",
":",
"d",
"=",
"ROOT",
".",
"gROOT",
".",
"GetDirectory",
"(",
"dname",
")",
"assert",
"d",
",",
"\"Unexpected failure, can't cd to tmpdir.\"",
"d",
".",
"cd",
"(",
")",
"yield",
"d"
] | Context manager which makes a thread specific gDirectory to avoid
interfering with the current file.
Use cases:
A TTree Draw function which doesn't want to interfere with whatever
gDirectory happens to be.
Multi-threading where there are two threads creating objects with the
same name which must reside in a directory. (again, this happens with
TTree draw) | [
"Context",
"manager",
"which",
"makes",
"a",
"thread",
"specific",
"gDirectory",
"to",
"avoid",
"interfering",
"with",
"the",
"current",
"file",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/context.py#L120-L142 | train |
rootpy/rootpy | rootpy/context.py | working_directory | def working_directory(path):
"""
A context manager that changes the working directory to the given
path, and then changes it back to its previous value on exit.
"""
prev_cwd = os.getcwd()
os.chdir(path)
try:
yield
finally:
os.chdir(prev_cwd) | python | def working_directory(path):
"""
A context manager that changes the working directory to the given
path, and then changes it back to its previous value on exit.
"""
prev_cwd = os.getcwd()
os.chdir(path)
try:
yield
finally:
os.chdir(prev_cwd) | [
"def",
"working_directory",
"(",
"path",
")",
":",
"prev_cwd",
"=",
"os",
".",
"getcwd",
"(",
")",
"os",
".",
"chdir",
"(",
"path",
")",
"try",
":",
"yield",
"finally",
":",
"os",
".",
"chdir",
"(",
"prev_cwd",
")"
] | A context manager that changes the working directory to the given
path, and then changes it back to its previous value on exit. | [
"A",
"context",
"manager",
"that",
"changes",
"the",
"working",
"directory",
"to",
"the",
"given",
"path",
"and",
"then",
"changes",
"it",
"back",
"to",
"its",
"previous",
"value",
"on",
"exit",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/context.py#L181-L191 | train |
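`working_directory` above is the textbook `chdir`-and-restore pattern; the same function can be written with the `contextlib.contextmanager` decorator (a sketch equivalent to rootpy's, which applies the decorator outside this excerpt):

```python
import os
from contextlib import contextmanager

@contextmanager
def working_directory(path):
    """chdir to `path` for the duration of the block, then always
    chdir back, even if the body raises."""
    prev_cwd = os.getcwd()
    os.chdir(path)
    try:
        yield
    finally:
        os.chdir(prev_cwd)
```

The `finally` clause is what makes the restore unconditional; without it an exception inside the block would leave the process in the wrong directory.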
rootpy/rootpy | rootpy/plotting/autobinning.py | autobinning | def autobinning(data, method="freedman_diaconis"):
"""
This method determines the optimal binning for histogramming.
Parameters
----------
data: 1D array-like
Input data.
method: string, one of the following:
- sturges
- sturges-doane
- scott
- sqrt
- doane
- freedman-diaconis
- risk
- knuth
Returns
-------
(nbins, min, max): int, type(data), type(data)
nbins is the optimal number of bin estimated by the method
min is the minimum of data
max is the maximum of data
Notes
-----
If the length of data is less than 4 the method returns nbins = 1
"""
name = method.replace("-", "_")
try:
method = getattr(BinningMethods, name)
if not isinstance(method, types.FunctionType):
raise AttributeError
except AttributeError:
raise ValueError("`{0}` is not a valid binning method".format(name))
if len(data) < 4:
return 1, np.min(data), np.max(data)
return int(np.ceil(method(data))), np.min(data), np.max(data) | python | def autobinning(data, method="freedman_diaconis"):
"""
This method determines the optimal binning for histogramming.
Parameters
----------
data: 1D array-like
Input data.
method: string, one of the following:
- sturges
- sturges-doane
- scott
- sqrt
- doane
- freedman-diaconis
- risk
- knuth
Returns
-------
(nbins, min, max): int, type(data), type(data)
nbins is the optimal number of bin estimated by the method
min is the minimum of data
max is the maximum of data
Notes
-----
If the length of data is less than 4 the method returns nbins = 1
"""
name = method.replace("-", "_")
try:
method = getattr(BinningMethods, name)
if not isinstance(method, types.FunctionType):
raise AttributeError
except AttributeError:
raise ValueError("`{0}` is not a valid binning method".format(name))
if len(data) < 4:
return 1, np.min(data), np.max(data)
return int(np.ceil(method(data))), np.min(data), np.max(data) | [
"def",
"autobinning",
"(",
"data",
",",
"method",
"=",
"\"freedman_diaconis\"",
")",
":",
"name",
"=",
"method",
".",
"replace",
"(",
"\"-\"",
",",
"\"_\"",
")",
"try",
":",
"method",
"=",
"getattr",
"(",
"BinningMethods",
",",
"name",
")",
"if",
"not",
"isinstance",
"(",
"method",
",",
"types",
".",
"FunctionType",
")",
":",
"raise",
"AttributeError",
"except",
"AttributeError",
":",
"raise",
"ValueError",
"(",
"\"`{0}` is not a valid binning method\"",
".",
"format",
"(",
"name",
")",
")",
"if",
"len",
"(",
"data",
")",
"<",
"4",
":",
"return",
"1",
",",
"np",
".",
"min",
"(",
"data",
")",
",",
"np",
".",
"max",
"(",
"data",
")",
"return",
"int",
"(",
"np",
".",
"ceil",
"(",
"method",
"(",
"data",
")",
")",
")",
",",
"np",
".",
"min",
"(",
"data",
")",
",",
"np",
".",
"max",
"(",
"data",
")"
] | This method determines the optimal binning for histogramming.
Parameters
----------
data: 1D array-like
Input data.
method: string, one of the following:
- sturges
- sturges-doane
- scott
- sqrt
- doane
- freedman-diaconis
- risk
- knuth
Returns
-------
(nbins, min, max): int, type(data), type(data)
nbins is the optimal number of bin estimated by the method
min is the minimum of data
max is the maximum of data
Notes
-----
If the length of data is less than 4 the method returns nbins = 1 | [
"This",
"method",
"determines",
"the",
"optimal",
"binning",
"for",
"histogramming",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/autobinning.py#L12-L50 | train |
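The `freedman_diaconis` method that `autobinning` defaults to is not shown in this chunk; classically the rule picks bin width h = 2·IQR·n^(-1/3) and nbins = ceil((max − min) / h). A dependency-free sketch of such an estimator (an illustration of the rule, not rootpy's exact implementation):

```python
import math

def freedman_diaconis_nbins(data):
    """Estimate a bin count via the Freedman-Diaconis rule:
    h = 2 * IQR * n**(-1/3), nbins = ceil((max - min) / h)."""
    xs = sorted(data)
    n = len(xs)

    def quantile(q):
        # Simple linear-interpolation quantile on the sorted sample.
        pos = q * (n - 1)
        lo = int(math.floor(pos))
        hi = min(lo + 1, n - 1)
        frac = pos - lo
        return xs[lo] * (1 - frac) + xs[hi] * frac

    iqr = quantile(0.75) - quantile(0.25)
    h = 2.0 * iqr * n ** (-1.0 / 3.0)
    if h <= 0:  # degenerate (e.g. constant) data: fall back to one bin
        return 1
    return int(math.ceil((xs[-1] - xs[0]) / h))
```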
rootpy/rootpy | rootpy/plotting/autobinning.py | BinningMethods.all_methods | def all_methods(cls):
"""
Return the names of all available binning methods
"""
def name(fn):
return fn.__get__(cls).__name__.replace("_", "-")
return sorted(name(f) for f in cls.__dict__.values()
if isinstance(f, staticmethod)) | python | def all_methods(cls):
"""
Return the names of all available binning methods
"""
def name(fn):
return fn.__get__(cls).__name__.replace("_", "-")
return sorted(name(f) for f in cls.__dict__.values()
if isinstance(f, staticmethod)) | [
"def",
"all_methods",
"(",
"cls",
")",
":",
"def",
"name",
"(",
"fn",
")",
":",
"return",
"fn",
".",
"__get__",
"(",
"cls",
")",
".",
"__name__",
".",
"replace",
"(",
"\"_\"",
",",
"\"-\"",
")",
"return",
"sorted",
"(",
"name",
"(",
"f",
")",
"for",
"f",
"in",
"cls",
".",
"__dict__",
".",
"values",
"(",
")",
"if",
"isinstance",
"(",
"f",
",",
"staticmethod",
")",
")"
] | Return the names of all available binning methods | [
"Return",
"the",
"names",
"of",
"all",
"available",
"binning",
"methods"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/autobinning.py#L58-L65 | train |
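`all_methods` discovers the binning methods by scanning the class `__dict__` for `staticmethod` descriptors and unwrapping them with `__get__`. The same introspection works on any class; a sketch with a hypothetical `Methods` class:

```python
class Methods:
    @staticmethod
    def freedman_diaconis(data):
        return 1

    @staticmethod
    def sturges_doane(data):
        return 1

    helper = 42  # not a staticmethod, so it is not reported

def available(cls):
    """Names of all staticmethods on `cls`, with '_' shown as '-'
    the way rootpy's all_methods() does."""
    return sorted(
        f.__get__(cls).__name__.replace("_", "-")
        for f in cls.__dict__.values()
        if isinstance(f, staticmethod)
    )
```

`f.__get__(cls)` invokes the descriptor protocol to recover the plain function, whose `__name__` is then available.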
rootpy/rootpy | rootpy/plotting/autobinning.py | BinningMethods.doane | def doane(data):
"""
Modified Doane formula
"""
from scipy.stats import skew
n = len(data)
sigma = np.sqrt(6. * (n - 2.) / (n + 1.) / (n + 3.))
return 1 + np.log2(n) + \
np.log2(1 + np.abs(skew(data)) / sigma) | python | def doane(data):
"""
Modified Doane formula
"""
from scipy.stats import skew
n = len(data)
sigma = np.sqrt(6. * (n - 2.) / (n + 1.) / (n + 3.))
return 1 + np.log2(n) + \
np.log2(1 + np.abs(skew(data)) / sigma) | [
"def",
"doane",
"(",
"data",
")",
":",
"from",
"scipy",
".",
"stats",
"import",
"skew",
"n",
"=",
"len",
"(",
"data",
")",
"sigma",
"=",
"np",
".",
"sqrt",
"(",
"6.",
"*",
"(",
"n",
"-",
"2.",
")",
"/",
"(",
"n",
"+",
"1.",
")",
"/",
"(",
"n",
"+",
"3.",
")",
")",
"return",
"1",
"+",
"np",
".",
"log2",
"(",
"n",
")",
"+",
"np",
".",
"log2",
"(",
"1",
"+",
"np",
".",
"abs",
"(",
"skew",
"(",
"data",
")",
")",
"/",
"sigma",
")"
] | Modified Doane formula | [
"Modified",
"Doane",
"modified"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/autobinning.py#L84-L92 | train |
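Doane's rule, as implemented above, corrects Sturges' formula for skewness: nbins = 1 + log2(n) + log2(1 + |g1|/σ), with σ = sqrt(6(n−2)/((n+1)(n+3))). A dependency-free sketch that replaces `scipy.stats.skew` with a hand-rolled moment ratio (the biased g1 that `skew` computes by default):

```python
import math

def sample_skewness(xs):
    """Biased sample skewness g1 = m3 / m2**1.5."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m3 = sum((x - mean) ** 3 for x in xs) / n
    return 0.0 if m2 == 0 else m3 / m2 ** 1.5

def doane_nbins(xs):
    """Doane's bin-count estimate; for symmetric data the skew term
    vanishes and this reduces to Sturges' 1 + log2(n)."""
    n = len(xs)
    sigma = math.sqrt(6.0 * (n - 2.0) / ((n + 1.0) * (n + 3.0)))
    return 1 + math.log2(n) + math.log2(1 + abs(sample_skewness(xs)) / sigma)
```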
rootpy/rootpy | rootpy/utils/lock.py | lock | def lock(path, poll_interval=5, max_age=60):
"""
Acquire a file lock in a thread-safe manner that also reaps stale locks
possibly left behind by processes that crashed hard.
"""
if max_age < 30:
raise ValueError("`max_age` must be at least 30 seconds")
if poll_interval < 1:
raise ValueError("`poll_interval` must be at least 1 second")
if poll_interval >= max_age:
raise ValueError("`poll_interval` must be less than `max_age`")
proc = '{0:d}@{1}'.format(os.getpid(), platform.node())
lock = LockFile(path)
log.debug("{0} attempting to lock {1}".format(proc, path))
while not lock.i_am_locking():
if lock.is_locked():
# Protect against race condition
try:
# Check age of the lock file
age = time.time() - os.stat(lock.lock_file)[stat.ST_MTIME]
# Break the lock if too old (considered stale)
if age > max_age:
lock.break_lock()
# What if lock was released and reacquired in the meantime?
# We don't want to break a fresh lock!
# If a lock is stale then we may have many threads
# attempting to break it here at the "same time".
# Avoid the possibility of some thread trying to break the
# lock after it has already been broken and after the first
# other thread attempting to acquire the lock by sleeping
# for 0.5 seconds below.
log.warning(
"{0} broke lock on {1} "
"that is {2:d} seconds old".format(
proc, path, int(age)))
except OSError:
# Lock was released just now
# os.path.exists(lock.lock_file) is False
# OSError may be raised by os.stat() or lock.break_lock() above
pass
time.sleep(0.5)
try:
log.debug(
"{0} waiting for {1:d} seconds "
"for lock on {2} to be released".format(
proc, poll_interval, path))
# Use float() here since acquire sleeps for timeout/10
lock.acquire(timeout=float(poll_interval))
except LockTimeout:
pass
log.debug("{0} locked {1}".format(proc, path))
yield lock
lock.release()
log.debug("{0} released lock on {1}".format(proc, path)) | python | def lock(path, poll_interval=5, max_age=60):
"""
Acquire a file lock in a thread-safe manner that also reaps stale locks
possibly left behind by processes that crashed hard.
"""
if max_age < 30:
raise ValueError("`max_age` must be at least 30 seconds")
if poll_interval < 1:
raise ValueError("`poll_interval` must be at least 1 second")
if poll_interval >= max_age:
raise ValueError("`poll_interval` must be less than `max_age`")
proc = '{0:d}@{1}'.format(os.getpid(), platform.node())
lock = LockFile(path)
log.debug("{0} attempting to lock {1}".format(proc, path))
while not lock.i_am_locking():
if lock.is_locked():
# Protect against race condition
try:
# Check age of the lock file
age = time.time() - os.stat(lock.lock_file)[stat.ST_MTIME]
# Break the lock if too old (considered stale)
if age > max_age:
lock.break_lock()
# What if lock was released and reacquired in the meantime?
# We don't want to break a fresh lock!
# If a lock is stale then we may have many threads
# attempting to break it here at the "same time".
# Avoid the possibility of some thread trying to break the
# lock after it has already been broken and after the first
# other thread attempting to acquire the lock by sleeping
# for 0.5 seconds below.
log.warning(
"{0} broke lock on {1} "
"that is {2:d} seconds old".format(
proc, path, int(age)))
except OSError:
# Lock was released just now
# os.path.exists(lock.lock_file) is False
# OSError may be raised by os.stat() or lock.break_lock() above
pass
time.sleep(0.5)
try:
log.debug(
"{0} waiting for {1:d} seconds "
"for lock on {2} to be released".format(
proc, poll_interval, path))
# Use float() here since acquire sleeps for timeout/10
lock.acquire(timeout=float(poll_interval))
except LockTimeout:
pass
log.debug("{0} locked {1}".format(proc, path))
yield lock
lock.release()
log.debug("{0} released lock on {1}".format(proc, path)) | [
"def",
"lock",
"(",
"path",
",",
"poll_interval",
"=",
"5",
",",
"max_age",
"=",
"60",
")",
":",
"if",
"max_age",
"<",
"30",
":",
"raise",
"ValueError",
"(",
"\"`max_age` must be at least 30 seconds\"",
")",
"if",
"poll_interval",
"<",
"1",
":",
"raise",
"ValueError",
"(",
"\"`poll_interval` must be at least 1 second\"",
")",
"if",
"poll_interval",
">=",
"max_age",
":",
"raise",
"ValueError",
"(",
"\"`poll_interval` must be less than `max_age`\"",
")",
"proc",
"=",
"'{0:d}@{1}'",
".",
"format",
"(",
"os",
".",
"getpid",
"(",
")",
",",
"platform",
".",
"node",
"(",
")",
")",
"lock",
"=",
"LockFile",
"(",
"path",
")",
"log",
".",
"debug",
"(",
"\"{0} attempting to lock {1}\"",
".",
"format",
"(",
"proc",
",",
"path",
")",
")",
"while",
"not",
"lock",
".",
"i_am_locking",
"(",
")",
":",
"if",
"lock",
".",
"is_locked",
"(",
")",
":",
"# Protect against race condition",
"try",
":",
"# Check age of the lock file",
"age",
"=",
"time",
".",
"time",
"(",
")",
"-",
"os",
".",
"stat",
"(",
"lock",
".",
"lock_file",
")",
"[",
"stat",
".",
"ST_MTIME",
"]",
"# Break the lock if too old (considered stale)",
"if",
"age",
">",
"max_age",
":",
"lock",
".",
"break_lock",
"(",
")",
"# What if lock was released and reacquired in the meantime?",
"# We don't want to break a fresh lock!",
"# If a lock is stale then we may have many threads",
"# attempting to break it here at the \"same time\".",
"# Avoid the possibility of some thread trying to break the",
"# lock after it has already been broken and after the first",
"# other thread attempting to acquire the lock by sleeping",
"# for 0.5 seconds below.",
"log",
".",
"warning",
"(",
"\"{0} broke lock on {1} \"",
"\"that is {2:d} seconds old\"",
".",
"format",
"(",
"proc",
",",
"path",
",",
"int",
"(",
"age",
")",
")",
")",
"except",
"OSError",
":",
"# Lock was released just now",
"# os.path.exists(lock.lock_file) is False",
"# OSError may be raised by os.stat() or lock.break_lock() above",
"pass",
"time",
".",
"sleep",
"(",
"0.5",
")",
"try",
":",
"log",
".",
"debug",
"(",
"\"{0} waiting for {1:d} seconds \"",
"\"for lock on {2} to be released\"",
".",
"format",
"(",
"proc",
",",
"poll_interval",
",",
"path",
")",
")",
"# Use float() here since acquire sleeps for timeout/10",
"lock",
".",
"acquire",
"(",
"timeout",
"=",
"float",
"(",
"poll_interval",
")",
")",
"except",
"LockTimeout",
":",
"pass",
"log",
".",
"debug",
"(",
"\"{0} locked {1}\"",
".",
"format",
"(",
"proc",
",",
"path",
")",
")",
"yield",
"lock",
"lock",
".",
"release",
"(",
")",
"log",
".",
"debug",
"(",
"\"{0} released lock on {1}\"",
".",
"format",
"(",
"proc",
",",
"path",
")",
")"
] | Acquire a file lock in a thread-safe manner that also reaps stale locks
possibly left behind by processes that crashed hard. | [
"Aquire",
"a",
"file",
"lock",
"in",
"a",
"thread",
"-",
"safe",
"manner",
"that",
"also",
"reaps",
"stale",
"locks",
"possibly",
"left",
"behind",
"by",
"processes",
"that",
"crashed",
"hard",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/utils/lock.py#L18-L71 | train |
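The stale-lock reaping above hinges on one check: compare the lock file's mtime against `max_age`, and treat a vanished file (`OSError` from `stat`) as "already released". That check in isolation, as a hypothetical helper independent of the `lockfile` package:

```python
import os
import stat
import time

def lock_is_stale(lock_path, max_age=60):
    """True if the lock file exists and is older than max_age seconds.
    Mirrors the age check in rootpy's lock(); OSError means the lock
    was released between our decision to look and the stat() call."""
    try:
        age = time.time() - os.stat(lock_path)[stat.ST_MTIME]
    except OSError:
        return False  # lock vanished: nothing to break
    return age > max_age
```

The `try/except OSError` is the same race-condition guard the original comments describe: another process may release the lock at any moment.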
rootpy/rootpy | rootpy/ROOT.py | proxy_global | def proxy_global(name, no_expand_macro=False, fname='func', args=()):
"""
Used to automatically asrootpy ROOT's thread local variables
"""
if no_expand_macro: # pragma: no cover
# handle older ROOT versions without _ExpandMacroFunction wrapping
@property
def gSomething_no_func(self):
glob = self(getattr(ROOT, name))
# create a fake func() that just returns self
def func():
return glob
glob.func = func
return glob
return gSomething_no_func
@property
def gSomething(self):
obj_func = getattr(getattr(ROOT, name), fname)
try:
obj = obj_func(*args)
except ReferenceError: # null pointer
return None
# asrootpy
return self(obj)
return gSomething | python | def proxy_global(name, no_expand_macro=False, fname='func', args=()):
"""
Used to automatically asrootpy ROOT's thread local variables
"""
if no_expand_macro: # pragma: no cover
# handle older ROOT versions without _ExpandMacroFunction wrapping
@property
def gSomething_no_func(self):
glob = self(getattr(ROOT, name))
# create a fake func() that just returns self
def func():
return glob
glob.func = func
return glob
return gSomething_no_func
@property
def gSomething(self):
obj_func = getattr(getattr(ROOT, name), fname)
try:
obj = obj_func(*args)
except ReferenceError: # null pointer
return None
# asrootpy
return self(obj)
return gSomething | [
"def",
"proxy_global",
"(",
"name",
",",
"no_expand_macro",
"=",
"False",
",",
"fname",
"=",
"'func'",
",",
"args",
"=",
"(",
")",
")",
":",
"if",
"no_expand_macro",
":",
"# pragma: no cover",
"# handle older ROOT versions without _ExpandMacroFunction wrapping",
"@",
"property",
"def",
"gSomething_no_func",
"(",
"self",
")",
":",
"glob",
"=",
"self",
"(",
"getattr",
"(",
"ROOT",
",",
"name",
")",
")",
"# create a fake func() that just returns self",
"def",
"func",
"(",
")",
":",
"return",
"glob",
"glob",
".",
"func",
"=",
"func",
"return",
"glob",
"return",
"gSomething_no_func",
"@",
"property",
"def",
"gSomething",
"(",
"self",
")",
":",
"obj_func",
"=",
"getattr",
"(",
"getattr",
"(",
"ROOT",
",",
"name",
")",
",",
"fname",
")",
"try",
":",
"obj",
"=",
"obj_func",
"(",
"*",
"args",
")",
"except",
"ReferenceError",
":",
"# null pointer",
"return",
"None",
"# asrootpy",
"return",
"self",
"(",
"obj",
")",
"return",
"gSomething"
] | Used to automatically asrootpy ROOT's thread local variables | [
"Used",
"to",
"automatically",
"asrootpy",
"ROOT",
"s",
"thread",
"local",
"variables"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/ROOT.py#L61-L87 | train |
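`proxy_global` is a property factory: it manufactures one lazy `property` per ROOT global, and the module facade binds several of them under different names. The pattern stripped of ROOT specifics, with a hypothetical `registry` dict standing in for the `ROOT` module and string wrapping standing in for `asrootpy`:

```python
registry = {"gPad": "pad-object", "gDirectory": None}

def proxy_global(name):
    """Build a property that looks `name` up lazily on every access."""
    @property
    def accessor(self):
        value = registry[name]
        if value is None:  # analogous to a ROOT null pointer
            return None
        return "wrapped:" + value  # stands in for asrootpy()
    return accessor

class Module:
    gPad = proxy_global("gPad")
    gDirectory = proxy_global("gDirectory")
```

Because lookup happens at attribute access, the proxy always reflects the current value of the global rather than a snapshot taken at import time.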
rootpy/rootpy | rootpy/plotting/legend.py | Legend.AddEntry | def AddEntry(self, thing, label=None, style=None):
"""
Add an entry to the legend.
If `label` is None, `thing.GetTitle()` will be used as the label.
If `style` is None, `thing.legendstyle` is used if present,
otherwise `P`.
"""
if isinstance(thing, HistStack):
things = thing
else:
things = [thing]
for thing in things:
if getattr(thing, 'inlegend', True):
thing_label = thing.GetTitle() if label is None else label
thing_style = getattr(thing, 'legendstyle', 'P') if style is None else style
super(Legend, self).AddEntry(thing, thing_label, thing_style)
keepalive(self, thing) | python | def AddEntry(self, thing, label=None, style=None):
"""
Add an entry to the legend.
If `label` is None, `thing.GetTitle()` will be used as the label.
If `style` is None, `thing.legendstyle` is used if present,
otherwise `P`.
"""
if isinstance(thing, HistStack):
things = thing
else:
things = [thing]
for thing in things:
if getattr(thing, 'inlegend', True):
thing_label = thing.GetTitle() if label is None else label
thing_style = getattr(thing, 'legendstyle', 'P') if style is None else style
super(Legend, self).AddEntry(thing, thing_label, thing_style)
keepalive(self, thing) | [
"def",
"AddEntry",
"(",
"self",
",",
"thing",
",",
"label",
"=",
"None",
",",
"style",
"=",
"None",
")",
":",
"if",
"isinstance",
"(",
"thing",
",",
"HistStack",
")",
":",
"things",
"=",
"thing",
"else",
":",
"things",
"=",
"[",
"thing",
"]",
"for",
"thing",
"in",
"things",
":",
"if",
"getattr",
"(",
"thing",
",",
"'inlegend'",
",",
"True",
")",
":",
"thing_label",
"=",
"thing",
".",
"GetTitle",
"(",
")",
"if",
"label",
"is",
"None",
"else",
"label",
"thing_style",
"=",
"getattr",
"(",
"thing",
",",
"'legendstyle'",
",",
"'P'",
")",
"if",
"style",
"is",
"None",
"else",
"style",
"super",
"(",
"Legend",
",",
"self",
")",
".",
"AddEntry",
"(",
"thing",
",",
"thing_label",
",",
"thing_style",
")",
"keepalive",
"(",
"self",
",",
"thing",
")"
] | Add an entry to the legend.
If `label` is None, `thing.GetTitle()` will be used as the label.
If `style` is None, `thing.legendstyle` is used if present,
otherwise `P`. | [
"Add",
"an",
"entry",
"to",
"the",
"legend",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/legend.py#L87-L105 | train |
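`AddEntry` leans on `getattr` with defaults so any plottable can opt out of the legend (`inlegend = False`) or pick a draw option (`legendstyle`) without the legend knowing its type. The lookup logic in isolation, with a hypothetical `FakeHist` supplying the one method the sketch requires:

```python
def entry_params(thing, label=None, style=None):
    """Return (label, style) for a legend entry, or None if the object
    opted out via thing.inlegend = False. Sketch of AddEntry's logic."""
    if not getattr(thing, "inlegend", True):
        return None
    return (
        thing.GetTitle() if label is None else label,
        getattr(thing, "legendstyle", "P") if style is None else style,
    )

class FakeHist:
    """Hypothetical plottable; only GetTitle() is required here."""
    def __init__(self, title):
        self.title = title

    def GetTitle(self):
        return self.title
```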
rootpy/rootpy | rootpy/logger/magic.py | get_seh | def get_seh():
"""
Makes a function which can be used to set the ROOT error handler with a
python function and returns the existing error handler.
"""
if ON_RTD:
return lambda x: x
ErrorHandlerFunc_t = ctypes.CFUNCTYPE(
None, ctypes.c_int, ctypes.c_bool,
ctypes.c_char_p, ctypes.c_char_p)
# Required to avoid strange dynamic linker problem on OSX.
# See https://github.com/rootpy/rootpy/issues/256
import ROOT
dll = get_dll("libCore")
SetErrorHandler = None
try:
if dll:
SetErrorHandler = dll._Z15SetErrorHandlerPFvibPKcS0_E
except AttributeError:
pass
if not SetErrorHandler:
log.warning(
"Couldn't find SetErrorHandler. "
"Please submit a rootpy bug report.")
return lambda x: None
SetErrorHandler.restype = ErrorHandlerFunc_t
SetErrorHandler.argtypes = ErrorHandlerFunc_t,
def _SetErrorHandler(fn):
"""
Set ROOT's warning/error handler. Returns the existing one.
"""
log.debug("called SetErrorHandler()")
eh = ErrorHandlerFunc_t(fn)
# ``eh`` can get garbage collected unless kept alive, leading to a segfault.
_keep_alive.append(eh)
return SetErrorHandler(eh)
return _SetErrorHandler | python | def get_seh():
"""
Makes a function which can be used to set the ROOT error handler with a
python function and returns the existing error handler.
"""
if ON_RTD:
return lambda x: x
ErrorHandlerFunc_t = ctypes.CFUNCTYPE(
None, ctypes.c_int, ctypes.c_bool,
ctypes.c_char_p, ctypes.c_char_p)
# Required to avoid strange dynamic linker problem on OSX.
# See https://github.com/rootpy/rootpy/issues/256
import ROOT
dll = get_dll("libCore")
SetErrorHandler = None
try:
if dll:
SetErrorHandler = dll._Z15SetErrorHandlerPFvibPKcS0_E
except AttributeError:
pass
if not SetErrorHandler:
log.warning(
"Couldn't find SetErrorHandler. "
"Please submit a rootpy bug report.")
return lambda x: None
SetErrorHandler.restype = ErrorHandlerFunc_t
SetErrorHandler.argtypes = ErrorHandlerFunc_t,
def _SetErrorHandler(fn):
"""
Set ROOT's warning/error handler. Returns the existing one.
"""
log.debug("called SetErrorHandler()")
eh = ErrorHandlerFunc_t(fn)
# ``eh`` can get garbage collected unless kept alive, leading to a segfault.
_keep_alive.append(eh)
return SetErrorHandler(eh)
return _SetErrorHandler | [
"def",
"get_seh",
"(",
")",
":",
"if",
"ON_RTD",
":",
"return",
"lambda",
"x",
":",
"x",
"ErrorHandlerFunc_t",
"=",
"ctypes",
".",
"CFUNCTYPE",
"(",
"None",
",",
"ctypes",
".",
"c_int",
",",
"ctypes",
".",
"c_bool",
",",
"ctypes",
".",
"c_char_p",
",",
"ctypes",
".",
"c_char_p",
")",
"# Required to avoid strange dynamic linker problem on OSX.",
"# See https://github.com/rootpy/rootpy/issues/256",
"import",
"ROOT",
"dll",
"=",
"get_dll",
"(",
"\"libCore\"",
")",
"SetErrorHandler",
"=",
"None",
"try",
":",
"if",
"dll",
":",
"SetErrorHandler",
"=",
"dll",
".",
"_Z15SetErrorHandlerPFvibPKcS0_E",
"except",
"AttributeError",
":",
"pass",
"if",
"not",
"SetErrorHandler",
":",
"log",
".",
"warning",
"(",
"\"Couldn't find SetErrorHandler. \"",
"\"Please submit a rootpy bug report.\"",
")",
"return",
"lambda",
"x",
":",
"None",
"SetErrorHandler",
".",
"restype",
"=",
"ErrorHandlerFunc_t",
"SetErrorHandler",
".",
"argtypes",
"=",
"ErrorHandlerFunc_t",
",",
"def",
"_SetErrorHandler",
"(",
"fn",
")",
":",
"\"\"\"\n Set ROOT's warning/error handler. Returns the existing one.\n \"\"\"",
"log",
".",
"debug",
"(",
"\"called SetErrorHandler()\"",
")",
"eh",
"=",
"ErrorHandlerFunc_t",
"(",
"fn",
")",
"# ``eh`` can get garbage collected unless kept alive, leading to a segfault.",
"_keep_alive",
".",
"append",
"(",
"eh",
")",
"return",
"SetErrorHandler",
"(",
"eh",
")",
"return",
"_SetErrorHandler"
] | Makes a function which can be used to set the ROOT error handler with a
python function and returns the existing error handler. | [
"Makes",
"a",
"function",
"which",
"can",
"be",
"used",
"to",
"set",
"the",
"ROOT",
"error",
"handler",
"with",
"a",
"python",
"function",
"and",
"returns",
"the",
"existing",
"error",
"handler",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/magic.py#L85-L129 | train |
rootpy/rootpy | rootpy/logger/magic.py | get_f_code_idx | def get_f_code_idx():
"""
How many pointers into PyFrame is the ``f_code`` variable?
"""
frame = sys._getframe()
frame_ptr = id(frame)
LARGE_ENOUGH = 20
# Look through the frame object until we find the f_tstate variable, whose
# value we know from above.
ptrs = [ctypes.c_voidp.from_address(frame_ptr+i*svp)
for i in range(LARGE_ENOUGH)]
# Find its index into the structure
ptrs = [p.value for p in ptrs]
fcode_ptr = id(frame.f_code)
try:
threadstate_idx = ptrs.index(fcode_ptr)
except ValueError:
log.critical("rootpy bug! Please report this.")
raise
return threadstate_idx | python | def get_f_code_idx():
"""
How many pointers into PyFrame is the ``f_code`` variable?
"""
frame = sys._getframe()
frame_ptr = id(frame)
LARGE_ENOUGH = 20
# Look through the frame object until we find the f_code pointer, whose
# value we know from above.
ptrs = [ctypes.c_voidp.from_address(frame_ptr+i*svp)
for i in range(LARGE_ENOUGH)]
# Find its index into the structure
ptrs = [p.value for p in ptrs]
fcode_ptr = id(frame.f_code)
try:
threadstate_idx = ptrs.index(fcode_ptr)
except ValueError:
log.critical("rootpy bug! Please report this.")
raise
return threadstate_idx | [
"def",
"get_f_code_idx",
"(",
")",
":",
"frame",
"=",
"sys",
".",
"_getframe",
"(",
")",
"frame_ptr",
"=",
"id",
"(",
"frame",
")",
"LARGE_ENOUGH",
"=",
"20",
"# Look through the frame object until we find the f_tstate variable, whose",
"# value we know from above.",
"ptrs",
"=",
"[",
"ctypes",
".",
"c_voidp",
".",
"from_address",
"(",
"frame_ptr",
"+",
"i",
"*",
"svp",
")",
"for",
"i",
"in",
"range",
"(",
"LARGE_ENOUGH",
")",
"]",
"# Find its index into the structure",
"ptrs",
"=",
"[",
"p",
".",
"value",
"for",
"p",
"in",
"ptrs",
"]",
"fcode_ptr",
"=",
"id",
"(",
"frame",
".",
"f_code",
")",
"try",
":",
"threadstate_idx",
"=",
"ptrs",
".",
"index",
"(",
"fcode_ptr",
")",
"except",
"ValueError",
":",
"log",
".",
"critical",
"(",
"\"rootpy bug! Please report this.\"",
")",
"raise",
"return",
"threadstate_idx"
] | How many pointers into PyFrame is the ``f_code`` variable? | [
"How",
"many",
"pointers",
"into",
"PyFrame",
"is",
"the",
"f_code",
"variable?"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/magic.py#L138-L161 | train |
rootpy/rootpy | rootpy/logger/magic.py | get_frame_pointers | def get_frame_pointers(frame=None):
"""
Obtain writable pointers to ``frame.f_trace`` and ``frame.f_lineno``.
Very dangerous. Unlikely to be portable between python implementations.
This is hard in general because the ``PyFrameObject`` can have a variable size
depending on the build configuration. We can get it reliably because we can
determine the offset to ``f_tstate`` by searching for the value of that pointer.
"""
if frame is None:
frame = sys._getframe(2)
frame = id(frame)
# http://hg.python.org/cpython/file/3aa530c2db06/Include/frameobject.h#l28
F_TRACE_OFFSET = 6
Ppy_object = ctypes.POINTER(ctypes.py_object)
trace = Ppy_object.from_address(frame+(F_CODE_IDX+F_TRACE_OFFSET)*svp)
LASTI_OFFSET = F_TRACE_OFFSET + 4
lasti_addr = LASTI_OFFSET
lineno_addr = LASTI_OFFSET + ctypes.sizeof(ctypes.c_int)
f_lineno = ctypes.c_int.from_address(lineno_addr)
f_lasti = ctypes.c_int.from_address(lasti_addr)
return trace, f_lineno, f_lasti | python | def get_frame_pointers(frame=None):
"""
Obtain writable pointers to ``frame.f_trace`` and ``frame.f_lineno``.
Very dangerous. Unlikely to be portable between python implementations.
This is hard in general because the ``PyFrameObject`` can have a variable size
depending on the build configuration. We can get it reliably because we can
determine the offset to ``f_tstate`` by searching for the value of that pointer.
"""
if frame is None:
frame = sys._getframe(2)
frame = id(frame)
# http://hg.python.org/cpython/file/3aa530c2db06/Include/frameobject.h#l28
F_TRACE_OFFSET = 6
Ppy_object = ctypes.POINTER(ctypes.py_object)
trace = Ppy_object.from_address(frame+(F_CODE_IDX+F_TRACE_OFFSET)*svp)
LASTI_OFFSET = F_TRACE_OFFSET + 4
lasti_addr = LASTI_OFFSET
lineno_addr = LASTI_OFFSET + ctypes.sizeof(ctypes.c_int)
f_lineno = ctypes.c_int.from_address(lineno_addr)
f_lasti = ctypes.c_int.from_address(lasti_addr)
return trace, f_lineno, f_lasti | [
"def",
"get_frame_pointers",
"(",
"frame",
"=",
"None",
")",
":",
"if",
"frame",
"is",
"None",
":",
"frame",
"=",
"sys",
".",
"_getframe",
"(",
"2",
")",
"frame",
"=",
"id",
"(",
"frame",
")",
"# http://hg.python.org/cpython/file/3aa530c2db06/Include/frameobject.h#l28",
"F_TRACE_OFFSET",
"=",
"6",
"Ppy_object",
"=",
"ctypes",
".",
"POINTER",
"(",
"ctypes",
".",
"py_object",
")",
"trace",
"=",
"Ppy_object",
".",
"from_address",
"(",
"frame",
"+",
"(",
"F_CODE_IDX",
"+",
"F_TRACE_OFFSET",
")",
"*",
"svp",
")",
"LASTI_OFFSET",
"=",
"F_TRACE_OFFSET",
"+",
"4",
"lasti_addr",
"=",
"LASTI_OFFSET",
"lineno_addr",
"=",
"LASTI_OFFSET",
"+",
"ctypes",
".",
"sizeof",
"(",
"ctypes",
".",
"c_int",
")",
"f_lineno",
"=",
"ctypes",
".",
"c_int",
".",
"from_address",
"(",
"lineno_addr",
")",
"f_lasti",
"=",
"ctypes",
".",
"c_int",
".",
"from_address",
"(",
"lasti_addr",
")",
"return",
"trace",
",",
"f_lineno",
",",
"f_lasti"
] | Obtain writable pointers to ``frame.f_trace`` and ``frame.f_lineno``.
Very dangerous. Unlikely to be portable between python implementations.
This is hard in general because the ``PyFrameObject`` can have a variable size
depending on the build configuration. We can get it reliably because we can
determine the offset to ``f_tstate`` by searching for the value of that pointer. | [
"Obtain",
"writable",
"pointers",
"to",
"frame",
".",
"f_trace",
"and",
"frame",
".",
"f_lineno",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/magic.py#L166-L193 | train |
rootpy/rootpy | rootpy/logger/magic.py | set_linetrace_on_frame | def set_linetrace_on_frame(f, localtrace=None):
"""
Non-portable function to modify linetracing.
Remember to enable global tracing with :py:func:`sys.settrace`, otherwise no
effect!
"""
traceptr, _, _ = get_frame_pointers(f)
if localtrace is not None:
# Need to incref to avoid the frame causing a double-delete
ctypes.pythonapi.Py_IncRef(localtrace)
# Not sure if this is the best way to do this, but it works.
addr = id(localtrace)
else:
addr = 0
traceptr.contents = ctypes.py_object.from_address(addr) | python | def set_linetrace_on_frame(f, localtrace=None):
"""
Non-portable function to modify linetracing.
Remember to enable global tracing with :py:func:`sys.settrace`, otherwise no
effect!
"""
traceptr, _, _ = get_frame_pointers(f)
if localtrace is not None:
# Need to incref to avoid the frame causing a double-delete
ctypes.pythonapi.Py_IncRef(localtrace)
# Not sure if this is the best way to do this, but it works.
addr = id(localtrace)
else:
addr = 0
traceptr.contents = ctypes.py_object.from_address(addr) | [
"def",
"set_linetrace_on_frame",
"(",
"f",
",",
"localtrace",
"=",
"None",
")",
":",
"traceptr",
",",
"_",
",",
"_",
"=",
"get_frame_pointers",
"(",
"f",
")",
"if",
"localtrace",
"is",
"not",
"None",
":",
"# Need to incref to avoid the frame causing a double-delete",
"ctypes",
".",
"pythonapi",
".",
"Py_IncRef",
"(",
"localtrace",
")",
"# Not sure if this is the best way to do this, but it works.",
"addr",
"=",
"id",
"(",
"localtrace",
")",
"else",
":",
"addr",
"=",
"0",
"traceptr",
".",
"contents",
"=",
"ctypes",
".",
"py_object",
".",
"from_address",
"(",
"addr",
")"
] | Non-portable function to modify linetracing.
Remember to enable global tracing with :py:func:`sys.settrace`, otherwise no
effect! | [
"Non",
"-",
"portable",
"function",
"to",
"modify",
"linetracing",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/magic.py#L196-L212 | train |
rootpy/rootpy | rootpy/logger/magic.py | re_execute_with_exception | def re_execute_with_exception(frame, exception, traceback):
"""
Dark magic. Causes ``frame`` to raise an exception at the current location
with ``traceback`` appended to it.
Note that since the line tracer is raising an exception, the interpreter
disables the global trace, so it's not possible to restore the previous
tracing conditions.
"""
if sys.gettrace() == globaltrace:
# If our trace handler is already installed, that means that this
# function has been called twice before the line tracer had a chance to
# run. That can happen if more than one exception was logged.
return
call_lineno = frame.f_lineno
def intercept_next_line(f, why, *args):
if f is not frame:
return
set_linetrace_on_frame(f)
# Undo modifications to the callers code (ick ick ick)
back_like_nothing_happened()
# Raise exception in (almost) the perfect place (except for duplication)
if sys.version_info[0] < 3:
#raise exception.__class__, exception, traceback
raise exception
raise exception.with_traceback(traceback)
set_linetrace_on_frame(frame, intercept_next_line)
linestarts = list(dis.findlinestarts(frame.f_code))
linestarts = [a for a, l in linestarts if l >= call_lineno]
# Jump target
dest = linestarts[0]
oc = frame.f_code.co_code[frame.f_lasti]
if sys.version_info[0] < 3:
oc = ord(oc)
opcode_size = 2 if oc >= opcode.HAVE_ARGUMENT else 0
# Opcode to overwrite
where = frame.f_lasti + 1 + opcode_size
#dis.disco(frame.f_code)
pc = PyCodeObject.from_address(id(frame.f_code))
back_like_nothing_happened = pc.co_code.contents.inject_jump(where, dest)
#print("#"*100)
#dis.disco(frame.f_code)
sys.settrace(globaltrace) | python | def re_execute_with_exception(frame, exception, traceback):
"""
Dark magic. Causes ``frame`` to raise an exception at the current location
with ``traceback`` appended to it.
Note that since the line tracer is raising an exception, the interpreter
disables the global trace, so it's not possible to restore the previous
tracing conditions.
"""
if sys.gettrace() == globaltrace:
# If our trace handler is already installed, that means that this
# function has been called twice before the line tracer had a chance to
# run. That can happen if more than one exception was logged.
return
call_lineno = frame.f_lineno
def intercept_next_line(f, why, *args):
if f is not frame:
return
set_linetrace_on_frame(f)
# Undo modifications to the callers code (ick ick ick)
back_like_nothing_happened()
# Raise exception in (almost) the perfect place (except for duplication)
if sys.version_info[0] < 3:
#raise exception.__class__, exception, traceback
raise exception
raise exception.with_traceback(traceback)
set_linetrace_on_frame(frame, intercept_next_line)
linestarts = list(dis.findlinestarts(frame.f_code))
linestarts = [a for a, l in linestarts if l >= call_lineno]
# Jump target
dest = linestarts[0]
oc = frame.f_code.co_code[frame.f_lasti]
if sys.version_info[0] < 3:
oc = ord(oc)
opcode_size = 2 if oc >= opcode.HAVE_ARGUMENT else 0
# Opcode to overwrite
where = frame.f_lasti + 1 + opcode_size
#dis.disco(frame.f_code)
pc = PyCodeObject.from_address(id(frame.f_code))
back_like_nothing_happened = pc.co_code.contents.inject_jump(where, dest)
#print("#"*100)
#dis.disco(frame.f_code)
sys.settrace(globaltrace) | [
"def",
"re_execute_with_exception",
"(",
"frame",
",",
"exception",
",",
"traceback",
")",
":",
"if",
"sys",
".",
"gettrace",
"(",
")",
"==",
"globaltrace",
":",
"# If our trace handler is already installed, that means that this",
"# function has been called twice before the line tracer had a chance to",
"# run. That can happen if more than one exception was logged.",
"return",
"call_lineno",
"=",
"frame",
".",
"f_lineno",
"def",
"intercept_next_line",
"(",
"f",
",",
"why",
",",
"*",
"args",
")",
":",
"if",
"f",
"is",
"not",
"frame",
":",
"return",
"set_linetrace_on_frame",
"(",
"f",
")",
"# Undo modifications to the callers code (ick ick ick)",
"back_like_nothing_happened",
"(",
")",
"# Raise exception in (almost) the perfect place (except for duplication)",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"#raise exception.__class__, exception, traceback",
"raise",
"exception",
"raise",
"exception",
".",
"with_traceback",
"(",
"traceback",
")",
"set_linetrace_on_frame",
"(",
"frame",
",",
"intercept_next_line",
")",
"linestarts",
"=",
"list",
"(",
"dis",
".",
"findlinestarts",
"(",
"frame",
".",
"f_code",
")",
")",
"linestarts",
"=",
"[",
"a",
"for",
"a",
",",
"l",
"in",
"linestarts",
"if",
"l",
">=",
"call_lineno",
"]",
"# Jump target",
"dest",
"=",
"linestarts",
"[",
"0",
"]",
"oc",
"=",
"frame",
".",
"f_code",
".",
"co_code",
"[",
"frame",
".",
"f_lasti",
"]",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"oc",
"=",
"ord",
"(",
"oc",
")",
"opcode_size",
"=",
"2",
"if",
"oc",
">=",
"opcode",
".",
"HAVE_ARGUMENT",
"else",
"0",
"# Opcode to overwrite",
"where",
"=",
"frame",
".",
"f_lasti",
"+",
"1",
"+",
"opcode_size",
"#dis.disco(frame.f_code)",
"pc",
"=",
"PyCodeObject",
".",
"from_address",
"(",
"id",
"(",
"frame",
".",
"f_code",
")",
")",
"back_like_nothing_happened",
"=",
"pc",
".",
"co_code",
".",
"contents",
".",
"inject_jump",
"(",
"where",
",",
"dest",
")",
"#print(\"#\"*100)",
"#dis.disco(frame.f_code)",
"sys",
".",
"settrace",
"(",
"globaltrace",
")"
] | Dark magic. Causes ``frame`` to raise an exception at the current location
with ``traceback`` appended to it.
Note that since the line tracer is raising an exception, the interpreter
disables the global trace, so it's not possible to restore the previous
tracing conditions. | [
"Dark",
"magic",
".",
"Causes",
"frame",
"to",
"raise",
"an",
"exception",
"at",
"the",
"current",
"location",
"with",
"traceback",
"appended",
"to",
"it",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/magic.py#L219-L269 | train |
rootpy/rootpy | rootpy/logger/magic.py | _inject_jump | def _inject_jump(self, where, dest):
"""
Monkeypatch bytecode at ``where`` to force it to jump to ``dest``.
Returns function which puts things back to how they were.
"""
# We're about to do dangerous things to a function's code content.
# We can't make a lock to prevent the interpreter from using those
# bytes, so the best we can do is to set the check interval to be high
# and just pray that this keeps other threads at bay.
if sys.version_info[0] < 3:
old_check_interval = sys.getcheckinterval()
sys.setcheckinterval(2**20)
else:
old_check_interval = sys.getswitchinterval()
sys.setswitchinterval(1000)
pb = ctypes.pointer(self.ob_sval)
orig_bytes = [pb[where + i][0] for i in range(3)]
v = struct.pack("<BH", opcode.opmap["JUMP_ABSOLUTE"], dest)
# Overwrite code to cause it to jump to the target
if sys.version_info[0] < 3:
for i in range(3):
pb[where + i][0] = ord(v[i])
else:
for i in range(3):
pb[where + i][0] = v[i]
def tidy_up():
"""
Put the bytecode back to how it was. Good as new.
"""
if sys.version_info[0] < 3:
sys.setcheckinterval(old_check_interval)
else:
sys.setswitchinterval(old_check_interval)
for i in range(3):
pb[where + i][0] = orig_bytes[i]
return tidy_up | python | def _inject_jump(self, where, dest):
"""
Monkeypatch bytecode at ``where`` to force it to jump to ``dest``.
Returns function which puts things back to how they were.
"""
# We're about to do dangerous things to a function's code content.
# We can't make a lock to prevent the interpreter from using those
# bytes, so the best we can do is to set the check interval to be high
# and just pray that this keeps other threads at bay.
if sys.version_info[0] < 3:
old_check_interval = sys.getcheckinterval()
sys.setcheckinterval(2**20)
else:
old_check_interval = sys.getswitchinterval()
sys.setswitchinterval(1000)
pb = ctypes.pointer(self.ob_sval)
orig_bytes = [pb[where + i][0] for i in range(3)]
v = struct.pack("<BH", opcode.opmap["JUMP_ABSOLUTE"], dest)
# Overwrite code to cause it to jump to the target
if sys.version_info[0] < 3:
for i in range(3):
pb[where + i][0] = ord(v[i])
else:
for i in range(3):
pb[where + i][0] = v[i]
def tidy_up():
"""
Put the bytecode back to how it was. Good as new.
"""
if sys.version_info[0] < 3:
sys.setcheckinterval(old_check_interval)
else:
sys.setswitchinterval(old_check_interval)
for i in range(3):
pb[where + i][0] = orig_bytes[i]
return tidy_up | [
"def",
"_inject_jump",
"(",
"self",
",",
"where",
",",
"dest",
")",
":",
"# We're about to do dangerous things to a function's code content.",
"# We can't make a lock to prevent the interpreter from using those",
"# bytes, so the best we can do is to set the check interval to be high",
"# and just pray that this keeps other threads at bay.",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"old_check_interval",
"=",
"sys",
".",
"getcheckinterval",
"(",
")",
"sys",
".",
"setcheckinterval",
"(",
"2",
"**",
"20",
")",
"else",
":",
"old_check_interval",
"=",
"sys",
".",
"getswitchinterval",
"(",
")",
"sys",
".",
"setswitchinterval",
"(",
"1000",
")",
"pb",
"=",
"ctypes",
".",
"pointer",
"(",
"self",
".",
"ob_sval",
")",
"orig_bytes",
"=",
"[",
"pb",
"[",
"where",
"+",
"i",
"]",
"[",
"0",
"]",
"for",
"i",
"in",
"range",
"(",
"3",
")",
"]",
"v",
"=",
"struct",
".",
"pack",
"(",
"\"<BH\"",
",",
"opcode",
".",
"opmap",
"[",
"\"JUMP_ABSOLUTE\"",
"]",
",",
"dest",
")",
"# Overwrite code to cause it to jump to the target",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"for",
"i",
"in",
"range",
"(",
"3",
")",
":",
"pb",
"[",
"where",
"+",
"i",
"]",
"[",
"0",
"]",
"=",
"ord",
"(",
"v",
"[",
"i",
"]",
")",
"else",
":",
"for",
"i",
"in",
"range",
"(",
"3",
")",
":",
"pb",
"[",
"where",
"+",
"i",
"]",
"[",
"0",
"]",
"=",
"v",
"[",
"i",
"]",
"def",
"tidy_up",
"(",
")",
":",
"\"\"\"\n Put the bytecode back to how it was. Good as new.\n \"\"\"",
"if",
"sys",
".",
"version_info",
"[",
"0",
"]",
"<",
"3",
":",
"sys",
".",
"setcheckinterval",
"(",
"old_check_interval",
")",
"else",
":",
"sys",
".",
"setswitchinterval",
"(",
"old_check_interval",
")",
"for",
"i",
"in",
"range",
"(",
"3",
")",
":",
"pb",
"[",
"where",
"+",
"i",
"]",
"[",
"0",
"]",
"=",
"orig_bytes",
"[",
"i",
"]",
"return",
"tidy_up"
] | Monkeypatch bytecode at ``where`` to force it to jump to ``dest``.
Returns function which puts things back to how they were. | [
"Monkeypatch",
"bytecode",
"at",
"where",
"to",
"force",
"it",
"to",
"jump",
"to",
"dest",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/logger/magic.py#L272-L313 | train |
rootpy/rootpy | rootpy/tree/chain.py | BaseTreeChain.Draw | def Draw(self, *args, **kwargs):
"""
Loop over subfiles, draw each, and sum the output into a single
histogram.
"""
self.reset()
output = None
while self._rollover():
if output is None:
# Make our own copy of the drawn histogram
output = self._tree.Draw(*args, **kwargs)
if output is not None:
output = output.Clone()
# Make it memory resident (histograms)
if hasattr(output, 'SetDirectory'):
output.SetDirectory(0)
else:
newoutput = self._tree.Draw(*args, **kwargs)
if newoutput is not None:
if isinstance(output, _GraphBase):
output.Append(newoutput)
else: # histogram
output += newoutput
return output | python | def Draw(self, *args, **kwargs):
"""
Loop over subfiles, draw each, and sum the output into a single
histogram.
"""
self.reset()
output = None
while self._rollover():
if output is None:
# Make our own copy of the drawn histogram
output = self._tree.Draw(*args, **kwargs)
if output is not None:
output = output.Clone()
# Make it memory resident (histograms)
if hasattr(output, 'SetDirectory'):
output.SetDirectory(0)
else:
newoutput = self._tree.Draw(*args, **kwargs)
if newoutput is not None:
if isinstance(output, _GraphBase):
output.Append(newoutput)
else: # histogram
output += newoutput
return output | [
"def",
"Draw",
"(",
"self",
",",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
":",
"self",
".",
"reset",
"(",
")",
"output",
"=",
"None",
"while",
"self",
".",
"_rollover",
"(",
")",
":",
"if",
"output",
"is",
"None",
":",
"# Make our own copy of the drawn histogram",
"output",
"=",
"self",
".",
"_tree",
".",
"Draw",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
"if",
"output",
"is",
"not",
"None",
":",
"output",
"=",
"output",
".",
"Clone",
"(",
")",
"# Make it memory resident (histograms)",
"if",
"hasattr",
"(",
"output",
",",
"'SetDirectory'",
")",
":",
"output",
".",
"SetDirectory",
"(",
"0",
")",
"else",
":",
"newoutput",
"=",
"self",
".",
"_tree",
".",
"Draw",
"(",
"*",
"args",
",",
"*",
"*",
"kwargs",
")",
"if",
"newoutput",
"is",
"not",
"None",
":",
"if",
"isinstance",
"(",
"output",
",",
"_GraphBase",
")",
":",
"output",
".",
"Append",
"(",
"newoutput",
")",
"else",
":",
"# histogram",
"output",
"+=",
"newoutput",
"return",
"output"
] | Loop over subfiles, draw each, and sum the output into a single
histogram. | [
"Loop",
"over",
"subfiles",
"draw",
"each",
"and",
"sum",
"the",
"output",
"into",
"a",
"single",
"histogram",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/tree/chain.py#L105-L128 | train |
rootpy/rootpy | rootpy/interactive/console.py | interact_plain | def interact_plain(header=UP_LINE, local_ns=None,
module=None, dummy=None,
stack_depth=1, global_ns=None):
"""
Create an interactive python console
"""
frame = sys._getframe(stack_depth)
variables = {}
if local_ns is not None:
variables.update(local_ns)
else:
variables.update(frame.f_locals)
if global_ns is not None:
variables.update(local_ns)
else:
variables.update(frame.f_globals)
shell = code.InteractiveConsole(variables)
return shell.interact(banner=header) | python | def interact_plain(header=UP_LINE, local_ns=None,
module=None, dummy=None,
stack_depth=1, global_ns=None):
"""
Create an interactive python console
"""
frame = sys._getframe(stack_depth)
variables = {}
if local_ns is not None:
variables.update(local_ns)
else:
variables.update(frame.f_locals)
if global_ns is not None:
variables.update(local_ns)
else:
variables.update(frame.f_globals)
shell = code.InteractiveConsole(variables)
return shell.interact(banner=header) | [
"def",
"interact_plain",
"(",
"header",
"=",
"UP_LINE",
",",
"local_ns",
"=",
"None",
",",
"module",
"=",
"None",
",",
"dummy",
"=",
"None",
",",
"stack_depth",
"=",
"1",
",",
"global_ns",
"=",
"None",
")",
":",
"frame",
"=",
"sys",
".",
"_getframe",
"(",
"stack_depth",
")",
"variables",
"=",
"{",
"}",
"if",
"local_ns",
"is",
"not",
"None",
":",
"variables",
".",
"update",
"(",
"local_ns",
")",
"else",
":",
"variables",
".",
"update",
"(",
"frame",
".",
"f_locals",
")",
"if",
"global_ns",
"is",
"not",
"None",
":",
"variables",
".",
"update",
"(",
"local_ns",
")",
"else",
":",
"variables",
".",
"update",
"(",
"frame",
".",
"f_globals",
")",
"shell",
"=",
"code",
".",
"InteractiveConsole",
"(",
"variables",
")",
"return",
"shell",
".",
"interact",
"(",
"banner",
"=",
"header",
")"
] | Create an interactive python console | [
"Create",
"an",
"interactive",
"python",
"console"
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/interactive/console.py#L33-L54 | train |
rootpy/rootpy | rootpy/plotting/root2matplotlib.py | hist | def hist(hists,
stacked=True,
reverse=False,
xpadding=0, ypadding=.1,
yerror_in_padding=True,
logy=None,
snap=True,
axes=None,
**kwargs):
"""
Make a matplotlib hist plot from a ROOT histogram, stack or
list of histograms.
Parameters
----------
hists : Hist, list of Hist, HistStack
The histogram(s) to be plotted
stacked : bool, optional (default=True)
If True then stack the histograms with the first histogram on the
bottom, otherwise overlay them with the first histogram in the
background.
reverse : bool, optional (default=False)
If True then reverse the order of the stack or overlay.
xpadding : float or 2-tuple of floats, optional (default=0)
Padding to add on the left and right sides of the plot as a fraction of
the axes width after the padding has been added. Specify unique left
and right padding with a 2-tuple.
ypadding : float or 2-tuple of floats, optional (default=.1)
Padding to add on the top and bottom of the plot as a fraction of
the axes height after the padding has been added. Specify unique top
and bottom padding with a 2-tuple.
yerror_in_padding : bool, optional (default=True)
If True then make the padding inclusive of the y errors otherwise
only pad around the y values.
logy : bool, optional (default=None)
Apply special treatment of a log-scale y-axis to display the histogram
correctly. If None (the default) then automatically determine if the
y-axis is log-scale.
snap : bool, optional (default=True)
If True (the default) then the origin is an implicit lower bound of the
histogram unless the histogram has both positive and negative bins.
axes : matplotlib Axes instance, optional (default=None)
The axes to plot on. If None then use the global current axes.
kwargs : additional keyword arguments, optional
All additional keyword arguments are passed to matplotlib's
fill_between for the filled regions and matplotlib's step function
for the edges.
Returns
-------
The return value from matplotlib's hist function, or list of such return
values if a stack or list of histograms was plotted.
"""
if axes is None:
axes = plt.gca()
if logy is None:
logy = axes.get_yscale() == 'log'
curr_xlim = axes.get_xlim()
curr_ylim = axes.get_ylim()
was_empty = not axes.has_data()
returns = []
if isinstance(hists, _Hist):
# This is a single plottable object.
returns = _hist(hists, axes=axes, logy=logy, **kwargs)
_set_bounds(hists, axes=axes,
was_empty=was_empty,
prev_xlim=curr_xlim,
prev_ylim=curr_ylim,
xpadding=xpadding, ypadding=ypadding,
yerror_in_padding=yerror_in_padding,
snap=snap,
logy=logy)
elif stacked:
# draw the top histogram first so its edges don't cover the histograms
# beneath it in the stack
if not reverse:
hists = list(hists)[::-1]
for i, h in enumerate(hists):
kwargs_local = kwargs.copy()
if i == len(hists) - 1:
low = h.Clone()
low.Reset()
else:
low = sum(hists[i + 1:])
high = h + low
high.alpha = getattr(h, 'alpha', None)
proxy = _hist(high, bottom=low, axes=axes, logy=logy, **kwargs)
returns.append(proxy)
if not reverse:
returns = returns[::-1]
_set_bounds(sum(hists), axes=axes,
was_empty=was_empty,
prev_xlim=curr_xlim,
prev_ylim=curr_ylim,
xpadding=xpadding, ypadding=ypadding,
yerror_in_padding=yerror_in_padding,
snap=snap,
logy=logy)
else:
for h in _maybe_reversed(hists, reverse):
returns.append(_hist(h, axes=axes, logy=logy, **kwargs))
if reverse:
returns = returns[::-1]
_set_bounds(hists[max(range(len(hists)), key=lambda idx: hists[idx].max())],
axes=axes,
was_empty=was_empty,
prev_xlim=curr_xlim,
prev_ylim=curr_ylim,
xpadding=xpadding, ypadding=ypadding,
yerror_in_padding=yerror_in_padding,
snap=snap,
logy=logy)
return returns | python | def hist(hists,
stacked=True,
reverse=False,
xpadding=0, ypadding=.1,
yerror_in_padding=True,
logy=None,
snap=True,
axes=None,
**kwargs):
"""
Make a matplotlib hist plot from a ROOT histogram, stack or
list of histograms.
Parameters
----------
hists : Hist, list of Hist, HistStack
The histogram(s) to be plotted
stacked : bool, optional (default=True)
If True then stack the histograms with the first histogram on the
bottom, otherwise overlay them with the first histogram in the
background.
reverse : bool, optional (default=False)
If True then reverse the order of the stack or overlay.
xpadding : float or 2-tuple of floats, optional (default=0)
Padding to add on the left and right sides of the plot as a fraction of
the axes width after the padding has been added. Specify unique left
and right padding with a 2-tuple.
ypadding : float or 2-tuple of floats, optional (default=.1)
Padding to add on the top and bottom of the plot as a fraction of
the axes height after the padding has been added. Specify unique top
and bottom padding with a 2-tuple.
yerror_in_padding : bool, optional (default=True)
If True then make the padding inclusive of the y errors otherwise
only pad around the y values.
logy : bool, optional (default=None)
Apply special treatment of a log-scale y-axis to display the histogram
correctly. If None (the default) then automatically determine if the
y-axis is log-scale.
snap : bool, optional (default=True)
If True (the default) then the origin is an implicit lower bound of the
histogram unless the histogram has both positive and negative bins.
axes : matplotlib Axes instance, optional (default=None)
The axes to plot on. If None then use the global current axes.
kwargs : additional keyword arguments, optional
All additional keyword arguments are passed to matplotlib's
fill_between for the filled regions and matplotlib's step function
for the edges.
Returns
-------
The return value from matplotlib's hist function, or list of such return
values if a stack or list of histograms was plotted.
"""
if axes is None:
axes = plt.gca()
if logy is None:
logy = axes.get_yscale() == 'log'
curr_xlim = axes.get_xlim()
curr_ylim = axes.get_ylim()
was_empty = not axes.has_data()
returns = []
if isinstance(hists, _Hist):
# This is a single plottable object.
returns = _hist(hists, axes=axes, logy=logy, **kwargs)
_set_bounds(hists, axes=axes,
was_empty=was_empty,
prev_xlim=curr_xlim,
prev_ylim=curr_ylim,
xpadding=xpadding, ypadding=ypadding,
yerror_in_padding=yerror_in_padding,
snap=snap,
logy=logy)
elif stacked:
# draw the top histogram first so its edges don't cover the histograms
# beneath it in the stack
if not reverse:
hists = list(hists)[::-1]
for i, h in enumerate(hists):
kwargs_local = kwargs.copy()
if i == len(hists) - 1:
low = h.Clone()
low.Reset()
else:
low = sum(hists[i + 1:])
high = h + low
high.alpha = getattr(h, 'alpha', None)
proxy = _hist(high, bottom=low, axes=axes, logy=logy, **kwargs)
returns.append(proxy)
if not reverse:
returns = returns[::-1]
_set_bounds(sum(hists), axes=axes,
was_empty=was_empty,
prev_xlim=curr_xlim,
prev_ylim=curr_ylim,
xpadding=xpadding, ypadding=ypadding,
yerror_in_padding=yerror_in_padding,
snap=snap,
logy=logy)
else:
for h in _maybe_reversed(hists, reverse):
returns.append(_hist(h, axes=axes, logy=logy, **kwargs))
if reverse:
returns = returns[::-1]
_set_bounds(hists[max(range(len(hists)), key=lambda idx: hists[idx].max())],
axes=axes,
was_empty=was_empty,
prev_xlim=curr_xlim,
prev_ylim=curr_ylim,
xpadding=xpadding, ypadding=ypadding,
yerror_in_padding=yerror_in_padding,
snap=snap,
logy=logy)
return returns | [
"def",
"hist",
"(",
"hists",
",",
"stacked",
"=",
"True",
",",
"reverse",
"=",
"False",
",",
"xpadding",
"=",
"0",
",",
"ypadding",
"=",
".1",
",",
"yerror_in_padding",
"=",
"True",
",",
"logy",
"=",
"None",
",",
"snap",
"=",
"True",
",",
"axes",
"=",
"None",
",",
"*",
"*",
"kwargs",
")",
":",
"if",
"axes",
"is",
"None",
":",
"axes",
"=",
"plt",
".",
"gca",
"(",
")",
"if",
"logy",
"is",
"None",
":",
"logy",
"=",
"axes",
".",
"get_yscale",
"(",
")",
"==",
"'log'",
"curr_xlim",
"=",
"axes",
".",
"get_xlim",
"(",
")",
"curr_ylim",
"=",
"axes",
".",
"get_ylim",
"(",
")",
"was_empty",
"=",
"not",
"axes",
".",
"has_data",
"(",
")",
"returns",
"=",
"[",
"]",
"if",
"isinstance",
"(",
"hists",
",",
"_Hist",
")",
":",
"# This is a single plottable object.",
"returns",
"=",
"_hist",
"(",
"hists",
",",
"axes",
"=",
"axes",
",",
"logy",
"=",
"logy",
",",
"*",
"*",
"kwargs",
")",
"_set_bounds",
"(",
"hists",
",",
"axes",
"=",
"axes",
",",
"was_empty",
"=",
"was_empty",
",",
"prev_xlim",
"=",
"curr_xlim",
",",
"prev_ylim",
"=",
"curr_ylim",
",",
"xpadding",
"=",
"xpadding",
",",
"ypadding",
"=",
"ypadding",
",",
"yerror_in_padding",
"=",
"yerror_in_padding",
",",
"snap",
"=",
"snap",
",",
"logy",
"=",
"logy",
")",
"elif",
"stacked",
":",
"# draw the top histogram first so its edges don't cover the histograms",
"# beneath it in the stack",
"if",
"not",
"reverse",
":",
"hists",
"=",
"list",
"(",
"hists",
")",
"[",
":",
":",
"-",
"1",
"]",
"for",
"i",
",",
"h",
"in",
"enumerate",
"(",
"hists",
")",
":",
"kwargs_local",
"=",
"kwargs",
".",
"copy",
"(",
")",
"if",
"i",
"==",
"len",
"(",
"hists",
")",
"-",
"1",
":",
"low",
"=",
"h",
".",
"Clone",
"(",
")",
"low",
".",
"Reset",
"(",
")",
"else",
":",
"low",
"=",
"sum",
"(",
"hists",
"[",
"i",
"+",
"1",
":",
"]",
")",
"high",
"=",
"h",
"+",
"low",
"high",
".",
"alpha",
"=",
"getattr",
"(",
"h",
",",
"'alpha'",
",",
"None",
")",
"proxy",
"=",
"_hist",
"(",
"high",
",",
"bottom",
"=",
"low",
",",
"axes",
"=",
"axes",
",",
"logy",
"=",
"logy",
",",
"*",
"*",
"kwargs",
")",
"returns",
".",
"append",
"(",
"proxy",
")",
"if",
"not",
"reverse",
":",
"returns",
"=",
"returns",
"[",
":",
":",
"-",
"1",
"]",
"_set_bounds",
"(",
"sum",
"(",
"hists",
")",
",",
"axes",
"=",
"axes",
",",
"was_empty",
"=",
"was_empty",
",",
"prev_xlim",
"=",
"curr_xlim",
",",
"prev_ylim",
"=",
"curr_ylim",
",",
"xpadding",
"=",
"xpadding",
",",
"ypadding",
"=",
"ypadding",
",",
"yerror_in_padding",
"=",
"yerror_in_padding",
",",
"snap",
"=",
"snap",
",",
"logy",
"=",
"logy",
")",
"else",
":",
"for",
"h",
"in",
"_maybe_reversed",
"(",
"hists",
",",
"reverse",
")",
":",
"returns",
".",
"append",
"(",
"_hist",
"(",
"h",
",",
"axes",
"=",
"axes",
",",
"logy",
"=",
"logy",
",",
"*",
"*",
"kwargs",
")",
")",
"if",
"reverse",
":",
"returns",
"=",
"returns",
"[",
":",
":",
"-",
"1",
"]",
"_set_bounds",
"(",
"hists",
"[",
"max",
"(",
"range",
"(",
"len",
"(",
"hists",
")",
")",
",",
"key",
"=",
"lambda",
"idx",
":",
"hists",
"[",
"idx",
"]",
".",
"max",
"(",
")",
")",
"]",
",",
"axes",
"=",
"axes",
",",
"was_empty",
"=",
"was_empty",
",",
"prev_xlim",
"=",
"curr_xlim",
",",
"prev_ylim",
"=",
"curr_ylim",
",",
"xpadding",
"=",
"xpadding",
",",
"ypadding",
"=",
"ypadding",
",",
"yerror_in_padding",
"=",
"yerror_in_padding",
",",
"snap",
"=",
"snap",
",",
"logy",
"=",
"logy",
")",
"return",
"returns"
] | Make a matplotlib hist plot from a ROOT histogram, stack or
list of histograms.
Parameters
----------
hists : Hist, list of Hist, HistStack
The histogram(s) to be plotted
stacked : bool, optional (default=True)
If True then stack the histograms with the first histogram on the
bottom, otherwise overlay them with the first histogram in the
background.
reverse : bool, optional (default=False)
If True then reverse the order of the stack or overlay.
xpadding : float or 2-tuple of floats, optional (default=0)
Padding to add on the left and right sides of the plot as a fraction of
the axes width after the padding has been added. Specify unique left
and right padding with a 2-tuple.
ypadding : float or 2-tuple of floats, optional (default=.1)
Padding to add on the top and bottom of the plot as a fraction of
the axes height after the padding has been added. Specify unique top
and bottom padding with a 2-tuple.
yerror_in_padding : bool, optional (default=True)
If True then make the padding inclusive of the y errors otherwise
only pad around the y values.
logy : bool, optional (default=None)
Apply special treatment of a log-scale y-axis to display the histogram
correctly. If None (the default) then automatically determine if the
y-axis is log-scale.
snap : bool, optional (default=True)
If True (the default) then the origin is an implicit lower bound of the
histogram unless the histogram has both positive and negative bins.
axes : matplotlib Axes instance, optional (default=None)
The axes to plot on. If None then use the global current axes.
kwargs : additional keyword arguments, optional
All additional keyword arguments are passed to matplotlib's
fill_between for the filled regions and matplotlib's step function
for the edges.
Returns
-------
The return value from matplotlib's hist function, or list of such return
values if a stack or list of histograms was plotted. | [
"Make",
"a",
"matplotlib",
"hist",
"plot",
"from",
"a",
"ROOT",
"histogram",
"stack",
"or",
"list",
"of",
"histograms",
"."
] | 3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/root2matplotlib.py#L141-L265 | train |
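The stacked branch above computes, for each histogram, the cumulative sum of the histograms beneath it (`low`) and then draws `h + low` with `bottom=low`. That bottom/top bookkeeping can be sketched in pure Python using plain lists of bin contents instead of ROOT histograms; the `stack_bins` helper below is hypothetical, for illustration only, and is not part of rootpy:

```python
def stack_bins(hists):
    """Return (bottom, top) bin contents for each histogram in a stack.

    `hists` is a list of equal-length lists of bin values, ordered
    bottom-of-stack first, mirroring hist(..., stacked=True).
    """
    layers = []
    bottom = [0.0] * len(hists[0])
    for h in hists:
        # each histogram is drawn from the running total up to total + h
        top = [b + v for b, v in zip(bottom, h)]
        layers.append((list(bottom), top))
        bottom = top  # the next histogram sits on everything so far
    return layers


layers = stack_bins([[1, 2, 3], [4, 5, 6]])
# the first layer rests on zero; the second rests on the first
```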
rootpy/rootpy | rootpy/plotting/root2matplotlib.py | errorbar

def errorbar(hists,
             xerr=True, yerr=True,
             xpadding=0, ypadding=.1,
             xerror_in_padding=True,
             yerror_in_padding=True,
             emptybins=True,
             snap=True,
             axes=None,
             **kwargs):
    """
    Make a matplotlib errorbar plot from a ROOT histogram or graph
    or list of histograms and graphs.

    Parameters
    ----------
    hists : Hist, Graph or list of Hist and Graph
        The histogram(s) and/or Graph(s) to be plotted
    xerr : bool, optional (default=True)
        If True, x error bars will be displayed.
    yerr : bool or string, optional (default=True)
        If False, no y errors are displayed. If True, an individual y
        error will be displayed for each hist in the stack. If 'linear' or
        'quadratic', a single error bar will be displayed with either the
        linear or quadratic sum of the individual errors.
    xpadding : float or 2-tuple of floats, optional (default=0)
        Padding to add on the left and right sides of the plot as a fraction
        of the axes width after the padding has been added. Specify unique
        left and right padding with a 2-tuple.
    ypadding : float or 2-tuple of floats, optional (default=.1)
        Padding to add on the top and bottom of the plot as a fraction of
        the axes height after the padding has been added. Specify unique top
        and bottom padding with a 2-tuple.
    xerror_in_padding : bool, optional (default=True)
        If True then make the padding inclusive of the x errors otherwise
        only pad around the x values.
    yerror_in_padding : bool, optional (default=True)
        If True then make the padding inclusive of the y errors otherwise
        only pad around the y values.
    emptybins : bool, optional (default=True)
        If True (the default) then plot bins with zero content otherwise only
        show bins with nonzero content.
    snap : bool, optional (default=True)
        If True (the default) then the origin is an implicit lower bound of
        the histogram unless the histogram has both positive and negative
        bins.
    axes : matplotlib Axes instance, optional (default=None)
        The axes to plot on. If None then use the global current axes.
    kwargs : additional keyword arguments, optional
        All additional keyword arguments are passed to matplotlib's errorbar
        function.

    Returns
    -------
    The return value from matplotlib's errorbar function, or list of such
    return values if a list of histograms and/or graphs was plotted.
    """
    if axes is None:
        axes = plt.gca()
    curr_xlim = axes.get_xlim()
    curr_ylim = axes.get_ylim()
    was_empty = not axes.has_data()
    if isinstance(hists, (_Hist, _Graph1DBase)):
        # This is a single plottable object.
        returns = _errorbar(
            hists, xerr, yerr,
            axes=axes, emptybins=emptybins, **kwargs)
        _set_bounds(hists, axes=axes,
                    was_empty=was_empty,
                    prev_ylim=curr_ylim,
                    xpadding=xpadding, ypadding=ypadding,
                    xerror_in_padding=xerror_in_padding,
                    yerror_in_padding=yerror_in_padding,
                    snap=snap)
    else:
        returns = []
        for h in hists:
            returns.append(errorbar(
                h, xerr=xerr, yerr=yerr, axes=axes,
                xpadding=xpadding, ypadding=ypadding,
                xerror_in_padding=xerror_in_padding,
                yerror_in_padding=yerror_in_padding,
                snap=snap,
                emptybins=emptybins,
                **kwargs))
    return returns

3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/root2matplotlib.py#L481-L577 | train
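The docstring's `yerr='linear'`/`'quadratic'` option combines the per-histogram y errors either by plain summation or in quadrature. The two combination rules can be sketched as follows; the `combine_yerr` helper is hypothetical and not part of rootpy:

```python
import math


def combine_yerr(errors, mode='quadratic'):
    """Combine per-histogram y errors for a single bin.

    'linear' simply sums the errors; 'quadratic' sums them in
    quadrature, which is the usual choice for independent
    uncertainties.
    """
    if mode == 'linear':
        return sum(errors)
    if mode == 'quadratic':
        return math.sqrt(sum(e * e for e in errors))
    raise ValueError("mode must be 'linear' or 'quadratic'")


combine_yerr([3.0, 4.0], mode='quadratic')  # sqrt(9 + 16) = 5.0
combine_yerr([3.0, 4.0], mode='linear')     # 3 + 4 = 7.0
```

The quadratic sum is always less than or equal to the linear sum, so 'linear' gives the more conservative error bar.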
rootpy/rootpy | rootpy/plotting/root2matplotlib.py | step

def step(h, logy=None, axes=None, **kwargs):
    """
    Make a matplotlib step plot from a ROOT histogram.

    Parameters
    ----------
    h : Hist
        A rootpy Hist
    logy : bool, optional (default=None)
        If True then clip the y range between 1E-300 and 1E300.
        If None (the default) then automatically determine if the axes are
        log-scale and if this clipping should be performed.
    axes : matplotlib Axes instance, optional (default=None)
        The axes to plot on. If None then use the global current axes.
    kwargs : additional keyword arguments, optional
        Additional keyword arguments are passed directly to
        matplotlib's step function.

    Returns
    -------
    Returns the value from matplotlib's step function.
    """
    if axes is None:
        axes = plt.gca()
    if logy is None:
        logy = axes.get_yscale() == 'log'
    _set_defaults(h, kwargs, ['common', 'line'])
    if kwargs.get('color') is None:
        kwargs['color'] = h.GetLineColor('mpl')
    y = np.array(list(h.y()) + [0.])
    if logy:
        np.clip(y, 1E-300, 1E300, out=y)
    return axes.step(list(h.xedges()), y, where='post', **kwargs)

3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/root2matplotlib.py#L603-L641 | train
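`step` passes the histogram's n+1 bin edges together with the n bin values plus a trailing 0, and `where='post'` holds each value flat from its left edge to the next edge. The expansion that `where='post'` performs can be sketched in pure Python; `step_outline` is a hypothetical helper for illustration, not matplotlib's actual implementation:

```python
def step_outline(edges, values):
    """Expand (edges, values) into point pairs tracing a post-step line.

    `edges` has one more entry than `values`; each value is drawn as a
    horizontal segment from edges[i] to edges[i + 1].
    """
    points = []
    for i, v in enumerate(values):
        points.append((edges[i], v))      # start of the flat segment
        points.append((edges[i + 1], v))  # end of the flat segment
    return points


outline = step_outline([0, 1, 2, 3], [5, 2, 7])
```

Appending the trailing 0 to the bin values, as `step` does, makes the outline drop back to the axis after the last bin instead of ending mid-air.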
rootpy/rootpy | rootpy/plotting/root2matplotlib.py | fill_between

def fill_between(a, b, logy=None, axes=None, **kwargs):
    """
    Fill the region between two histograms or graphs.

    Parameters
    ----------
    a : Hist
        A rootpy Hist
    b : Hist
        A rootpy Hist
    logy : bool, optional (default=None)
        If True then clip the region between 1E-300 and 1E300.
        If None (the default) then automatically determine if the axes are
        log-scale and if this clipping should be performed.
    axes : matplotlib Axes instance, optional (default=None)
        The axes to plot on. If None then use the global current axes.
    kwargs : additional keyword arguments, optional
        Additional keyword arguments are passed directly to
        matplotlib's fill_between function.

    Returns
    -------
    Returns the value from matplotlib's fill_between function.
    """
    if axes is None:
        axes = plt.gca()
    if logy is None:
        logy = axes.get_yscale() == 'log'
    if not isinstance(a, _Hist) or not isinstance(b, _Hist):
        raise TypeError(
            "fill_between only operates on 1D histograms")
    a.check_compatibility(b, check_edges=True)
    x = []
    top = []
    bottom = []
    for abin, bbin in zip(a.bins(overflow=False), b.bins(overflow=False)):
        up = max(abin.value, bbin.value)
        dn = min(abin.value, bbin.value)
        x.extend([abin.x.low, abin.x.high])
        top.extend([up, up])
        bottom.extend([dn, dn])
    x = np.array(x)
    top = np.array(top)
    bottom = np.array(bottom)
    if logy:
        np.clip(top, 1E-300, 1E300, out=top)
        np.clip(bottom, 1E-300, 1E300, out=bottom)
    return axes.fill_between(x, top, bottom, **kwargs)

3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/root2matplotlib.py#L644-L698 | train
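The loop above takes, bin by bin, the larger of the two histograms' values as the top envelope and the smaller as the bottom, duplicating each over the bin's low and high edges so the fill follows the step outline. The per-bin envelope computation can be sketched with plain lists; the `envelope` helper below is hypothetical, for illustration only:

```python
def envelope(values_a, values_b):
    """Per-bin (top, bottom) envelope between two equal-binned histograms."""
    # the filled band spans from the smaller to the larger value in each bin
    top = [max(a, b) for a, b in zip(values_a, values_b)]
    bottom = [min(a, b) for a, b in zip(values_a, values_b)]
    return top, bottom


top, bottom = envelope([1, 4, 2], [3, 3, 3])
```

Because max/min are taken per bin, the band stays well defined even where the two histograms cross.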
rootpy/rootpy | rootpy/plotting/root2matplotlib.py | hist2d

def hist2d(h, axes=None, colorbar=False, **kwargs):
    """
    Draw a 2D matplotlib histogram plot from a 2D ROOT histogram.

    Parameters
    ----------
    h : Hist2D
        A rootpy Hist2D
    axes : matplotlib Axes instance, optional (default=None)
        The axes to plot on. If None then use the global current axes.
    colorbar : Boolean, optional (default=False)
        If True, include a colorbar in the produced plot
    kwargs : additional keyword arguments, optional
        Additional keyword arguments are passed directly to
        matplotlib's hist2d function.

    Returns
    -------
    Returns the value from matplotlib's hist2d function.
    """
    if axes is None:
        axes = plt.gca()
    X, Y = np.meshgrid(list(h.x()), list(h.y()))
    x = X.ravel()
    y = Y.ravel()
    z = np.array(h.z()).T
    # returns of hist2d: (counts, xedges, yedges, Image)
    return_values = axes.hist2d(x, y, weights=z.ravel(),
                                bins=(list(h.xedges()), list(h.yedges())),
                                **kwargs)
    if colorbar:
        mappable = return_values[-1]
        plt.colorbar(mappable, ax=axes)
    return return_values

3926935e1f2100d8ba68070c2ab44055d4800f73 | https://github.com/rootpy/rootpy/blob/3926935e1f2100d8ba68070c2ab44055d4800f73/rootpy/plotting/root2matplotlib.py#L701-L740 | train
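`hist2d` re-fills a matplotlib 2D histogram by placing one weighted entry at every (x-center, y-center) pair, with the ROOT bin content as the weight; the transpose aligns the `z[ix][iy]` content layout with the meshgrid's (row = y, column = x) ordering. The same center/weight expansion without numpy, as a hypothetical sketch:

```python
def weighted_entries(xcenters, ycenters, z):
    """Flatten 2D bin centers and contents into (x, y, weight) entries.

    `z` is indexed z[ix][iy], matching the assumed ROOT-style content
    layout; iterating y in the outer loop mirrors meshgrid + ravel.
    """
    entries = []
    for iy, y in enumerate(ycenters):
        for ix, x in enumerate(xcenters):
            entries.append((x, y, z[ix][iy]))
    return entries


entries = weighted_entries([0.5, 1.5], [10.0], [[7], [9]])
```

Feeding these entries to a 2D histogram with the original bin edges reproduces the original bin contents exactly, since each entry lands in the center of its own bin.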