Dataset schema (column, dtype, value/length range):

    body_hash               stringlengths   64 to 64
    body                    stringlengths   23 to 109k
    docstring               stringlengths   1 to 57k
    path                    stringlengths   4 to 198
    name                    stringlengths   1 to 115
    repository_name         stringlengths   7 to 111
    repository_stars        float64         0 to 191k
    lang                    stringclasses   1 value
    body_without_docstring  stringlengths   14 to 108k
    unified                 stringlengths   45 to 133k
b902d17abe789d41104c59352edd738cb5bfd758e9b422c738da54594c712c45
def size(self, name=None): 'Compute the number of elements in this queue.\n\n Args:\n name: A name for the operation (optional).\n\n Returns:\n A scalar tensor containing the number of elements in this queue.\n ' if (name is None): name = ('%s_Size' % self._name) return gen_data_flow_ops._queue_size(self._queue_ref, name=name)
Compute the number of elements in this queue. Args: name: A name for the operation (optional). Returns: A scalar tensor containing the number of elements in this queue.
tensorflow/python/ops/data_flow_ops.py
size
habangar/tensorflow
73
python
def size(self, name=None): 'Compute the number of elements in this queue.\n\n Args:\n name: A name for the operation (optional).\n\n Returns:\n A scalar tensor containing the number of elements in this queue.\n ' if (name is None): name = ('%s_Size' % self._name) return gen_data_flow_ops._queue_size(self._queue_ref, name=name)
def size(self, name=None): 'Compute the number of elements in this queue.\n\n Args:\n name: A name for the operation (optional).\n\n Returns:\n A scalar tensor containing the number of elements in this queue.\n ' if (name is None): name = ('%s_Size' % self._name) return gen_data_flow_ops._queue_size(self._queue_ref, name=name)<|docstring|>Compute the number of elements in this queue. Args: name: A name for the operation (optional). Returns: A scalar tensor containing the number of elements in this queue.<|endoftext|>
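The row above documents QueueBase.size(); a minimal usage sketch follows, assuming the public tf.FIFOQueue wrapper from the same TensorFlow 1.x graph-mode era (the internal gen_data_flow_ops call is not used directly). Note that size() only builds a graph op; the count comes from running it in a session.

```python
import tensorflow as tf  # assumes a 1.x release with the graph-mode queue API

q = tf.FIFOQueue(capacity=10, dtypes=[tf.int32])
enqueue_op = q.enqueue(1)
size_op = q.size()            # scalar int32 tensor, not a Python int

with tf.Session() as sess:
    sess.run(enqueue_op)
    print(sess.run(size_op))  # -> 1
```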
9a606c23df9d3bf8dced02f0098c09ac9d78d6716726c8d4dfe81e9e0b6aa069
def __init__(self, capacity, min_after_dequeue, dtypes, shapes=None, names=None, seed=None, shared_name=None, name='random_shuffle_queue'): 'Create a queue that dequeues elements in a random order.\n\n A `RandomShuffleQueue` has bounded capacity; supports multiple\n concurrent producers and consumers; and provides exactly-once\n delivery.\n\n A `RandomShuffleQueue` holds a list of up to `capacity`\n elements. Each element is a fixed-length tuple of tensors whose\n dtypes are described by `dtypes`, and whose shapes are optionally\n described by the `shapes` argument.\n\n If the `shapes` argument is specified, each component of a queue\n element must have the respective fixed shape. If it is\n unspecified, different queue elements may have different shapes,\n but the use of `dequeue_many` is disallowed.\n\n The `min_after_dequeue` argument allows the caller to specify a\n minimum number of elements that will remain in the queue after a\n `dequeue` or `dequeue_many` operation completes, to ensure a\n minimum level of mixing of elements. This invariant is maintained\n by blocking those operations until sufficient elements have been\n enqueued. The `min_after_dequeue` argument is ignored after the\n queue has been closed.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n min_after_dequeue: An integer (described above).\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: (Optional.) A list of fully-defined `TensorShape` objects\n with the same length as `dtypes`, or `None`.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n seed: A Python integer. Used to create a random seed. See\n [`set_random_seed`](../../api_docs/python/constant_op.md#set_random_seed)\n for behavior.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n ' dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes) names = _as_name_list(names, dtypes) (seed1, seed2) = random_seed.get_seed(seed) queue_ref = gen_data_flow_ops._random_shuffle_queue(component_types=dtypes, shapes=shapes, capacity=capacity, min_after_dequeue=min_after_dequeue, seed=seed1, seed2=seed2, shared_name=shared_name, name=name) super(RandomShuffleQueue, self).__init__(dtypes, shapes, names, queue_ref)
Create a queue that dequeues elements in a random order. A `RandomShuffleQueue` has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery. A `RandomShuffleQueue` holds a list of up to `capacity` elements. Each element is a fixed-length tuple of tensors whose dtypes are described by `dtypes`, and whose shapes are optionally described by the `shapes` argument. If the `shapes` argument is specified, each component of a queue element must have the respective fixed shape. If it is unspecified, different queue elements may have different shapes, but the use of `dequeue_many` is disallowed. The `min_after_dequeue` argument allows the caller to specify a minimum number of elements that will remain in the queue after a `dequeue` or `dequeue_many` operation completes, to ensure a minimum level of mixing of elements. This invariant is maintained by blocking those operations until sufficient elements have been enqueued. The `min_after_dequeue` argument is ignored after the queue has been closed. Args: capacity: An integer. The upper bound on the number of elements that may be stored in this queue. min_after_dequeue: An integer (described above). dtypes: A list of `DType` objects. The length of `dtypes` must equal the number of tensors in each queue element. shapes: (Optional.) A list of fully-defined `TensorShape` objects with the same length as `dtypes`, or `None`. names: (Optional.) A list of string naming the components in the queue with the same length as `dtypes`, or `None`. If specified the dequeue methods return a dictionary with the names as keys. seed: A Python integer. Used to create a random seed. See [`set_random_seed`](../../api_docs/python/constant_op.md#set_random_seed) for behavior. shared_name: (Optional.) If non-empty, this queue will be shared under the given name across multiple sessions. name: Optional name for the queue operation.
tensorflow/python/ops/data_flow_ops.py
__init__
habangar/tensorflow
73
python
def __init__(self, capacity, min_after_dequeue, dtypes, shapes=None, names=None, seed=None, shared_name=None, name='random_shuffle_queue'): 'Create a queue that dequeues elements in a random order.\n\n A `RandomShuffleQueue` has bounded capacity; supports multiple\n concurrent producers and consumers; and provides exactly-once\n delivery.\n\n A `RandomShuffleQueue` holds a list of up to `capacity`\n elements. Each element is a fixed-length tuple of tensors whose\n dtypes are described by `dtypes`, and whose shapes are optionally\n described by the `shapes` argument.\n\n If the `shapes` argument is specified, each component of a queue\n element must have the respective fixed shape. If it is\n unspecified, different queue elements may have different shapes,\n but the use of `dequeue_many` is disallowed.\n\n The `min_after_dequeue` argument allows the caller to specify a\n minimum number of elements that will remain in the queue after a\n `dequeue` or `dequeue_many` operation completes, to ensure a\n minimum level of mixing of elements. This invariant is maintained\n by blocking those operations until sufficient elements have been\n enqueued. The `min_after_dequeue` argument is ignored after the\n queue has been closed.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n min_after_dequeue: An integer (described above).\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: (Optional.) A list of fully-defined `TensorShape` objects\n with the same length as `dtypes`, or `None`.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n seed: A Python integer. Used to create a random seed. See\n [`set_random_seed`](../../api_docs/python/constant_op.md#set_random_seed)\n for behavior.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n ' dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes) names = _as_name_list(names, dtypes) (seed1, seed2) = random_seed.get_seed(seed) queue_ref = gen_data_flow_ops._random_shuffle_queue(component_types=dtypes, shapes=shapes, capacity=capacity, min_after_dequeue=min_after_dequeue, seed=seed1, seed2=seed2, shared_name=shared_name, name=name) super(RandomShuffleQueue, self).__init__(dtypes, shapes, names, queue_ref)
def __init__(self, capacity, min_after_dequeue, dtypes, shapes=None, names=None, seed=None, shared_name=None, name='random_shuffle_queue'): 'Create a queue that dequeues elements in a random order.\n\n A `RandomShuffleQueue` has bounded capacity; supports multiple\n concurrent producers and consumers; and provides exactly-once\n delivery.\n\n A `RandomShuffleQueue` holds a list of up to `capacity`\n elements. Each element is a fixed-length tuple of tensors whose\n dtypes are described by `dtypes`, and whose shapes are optionally\n described by the `shapes` argument.\n\n If the `shapes` argument is specified, each component of a queue\n element must have the respective fixed shape. If it is\n unspecified, different queue elements may have different shapes,\n but the use of `dequeue_many` is disallowed.\n\n The `min_after_dequeue` argument allows the caller to specify a\n minimum number of elements that will remain in the queue after a\n `dequeue` or `dequeue_many` operation completes, to ensure a\n minimum level of mixing of elements. This invariant is maintained\n by blocking those operations until sufficient elements have been\n enqueued. The `min_after_dequeue` argument is ignored after the\n queue has been closed.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n min_after_dequeue: An integer (described above).\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: (Optional.) A list of fully-defined `TensorShape` objects\n with the same length as `dtypes`, or `None`.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n seed: A Python integer. Used to create a random seed. See\n [`set_random_seed`](../../api_docs/python/constant_op.md#set_random_seed)\n for behavior.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n ' dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes) names = _as_name_list(names, dtypes) (seed1, seed2) = random_seed.get_seed(seed) queue_ref = gen_data_flow_ops._random_shuffle_queue(component_types=dtypes, shapes=shapes, capacity=capacity, min_after_dequeue=min_after_dequeue, seed=seed1, seed2=seed2, shared_name=shared_name, name=name) super(RandomShuffleQueue, self).__init__(dtypes, shapes, names, queue_ref)<|docstring|>Create a queue that dequeues elements in a random order. A `RandomShuffleQueue` has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery. A `RandomShuffleQueue` holds a list of up to `capacity` elements. Each element is a fixed-length tuple of tensors whose dtypes are described by `dtypes`, and whose shapes are optionally described by the `shapes` argument. If the `shapes` argument is specified, each component of a queue element must have the respective fixed shape. If it is unspecified, different queue elements may have different shapes, but the use of `dequeue_many` is disallowed. The `min_after_dequeue` argument allows the caller to specify a minimum number of elements that will remain in the queue after a `dequeue` or `dequeue_many` operation completes, to ensure a minimum level of mixing of elements. 
This invariant is maintained by blocking those operations until sufficient elements have been enqueued. The `min_after_dequeue` argument is ignored after the queue has been closed. Args: capacity: An integer. The upper bound on the number of elements that may be stored in this queue. min_after_dequeue: An integer (described above). dtypes: A list of `DType` objects. The length of `dtypes` must equal the number of tensors in each queue element. shapes: (Optional.) A list of fully-defined `TensorShape` objects with the same length as `dtypes`, or `None`. names: (Optional.) A list of string naming the components in the queue with the same length as `dtypes`, or `None`. If specified the dequeue methods return a dictionary with the names as keys. seed: A Python integer. Used to create a random seed. See [`set_random_seed`](../../api_docs/python/constant_op.md#set_random_seed) for behavior. shared_name: (Optional.) If non-empty, this queue will be shared under the given name across multiple sessions. name: Optional name for the queue operation.<|endoftext|>
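A hedged usage sketch for the RandomShuffleQueue constructor documented above (TF 1.x graph-mode API assumed); the capacity, dtypes, and seed values are illustrative only.

```python
import tensorflow as tf

# Keep at least 2 elements buffered after each dequeue so output stays shuffled.
q = tf.RandomShuffleQueue(capacity=10, min_after_dequeue=2,
                          dtypes=[tf.int32], shapes=[[]], seed=42)
enqueue_many_op = q.enqueue_many([[1, 2, 3, 4, 5]])
dequeue_op = q.dequeue()

with tf.Session() as sess:
    sess.run(enqueue_many_op)
    print(sess.run(dequeue_op))  # one of 1..5, chosen in a seeded random order
```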
9910fa826977a547d74fc7ae5cca5464c315082a411d48ec9138fe4cf221cdae
def __init__(self, capacity, dtypes, shapes=None, names=None, shared_name=None, name='fifo_queue'): 'Creates a queue that dequeues elements in a first-in first-out order.\n\n A `FIFOQueue` has bounded capacity; supports multiple concurrent\n producers and consumers; and provides exactly-once delivery.\n\n A `FIFOQueue` holds a list of up to `capacity` elements. Each\n element is a fixed-length tuple of tensors whose dtypes are\n described by `dtypes`, and whose shapes are optionally described\n by the `shapes` argument.\n\n If the `shapes` argument is specified, each component of a queue\n element must have the respective fixed shape. If it is\n unspecified, different queue elements may have different shapes,\n but the use of `dequeue_many` is disallowed.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: (Optional.) A list of fully-defined `TensorShape` objects\n with the same length as `dtypes`, or `None`.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n ' dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes) names = _as_name_list(names, dtypes) queue_ref = gen_data_flow_ops._fifo_queue(component_types=dtypes, shapes=shapes, capacity=capacity, shared_name=shared_name, name=name) super(FIFOQueue, self).__init__(dtypes, shapes, names, queue_ref)
Creates a queue that dequeues elements in a first-in first-out order. A `FIFOQueue` has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery. A `FIFOQueue` holds a list of up to `capacity` elements. Each element is a fixed-length tuple of tensors whose dtypes are described by `dtypes`, and whose shapes are optionally described by the `shapes` argument. If the `shapes` argument is specified, each component of a queue element must have the respective fixed shape. If it is unspecified, different queue elements may have different shapes, but the use of `dequeue_many` is disallowed. Args: capacity: An integer. The upper bound on the number of elements that may be stored in this queue. dtypes: A list of `DType` objects. The length of `dtypes` must equal the number of tensors in each queue element. shapes: (Optional.) A list of fully-defined `TensorShape` objects with the same length as `dtypes`, or `None`. names: (Optional.) A list of string naming the components in the queue with the same length as `dtypes`, or `None`. If specified the dequeue methods return a dictionary with the names as keys. shared_name: (Optional.) If non-empty, this queue will be shared under the given name across multiple sessions. name: Optional name for the queue operation.
tensorflow/python/ops/data_flow_ops.py
__init__
habangar/tensorflow
73
python
def __init__(self, capacity, dtypes, shapes=None, names=None, shared_name=None, name='fifo_queue'): 'Creates a queue that dequeues elements in a first-in first-out order.\n\n A `FIFOQueue` has bounded capacity; supports multiple concurrent\n producers and consumers; and provides exactly-once delivery.\n\n A `FIFOQueue` holds a list of up to `capacity` elements. Each\n element is a fixed-length tuple of tensors whose dtypes are\n described by `dtypes`, and whose shapes are optionally described\n by the `shapes` argument.\n\n If the `shapes` argument is specified, each component of a queue\n element must have the respective fixed shape. If it is\n unspecified, different queue elements may have different shapes,\n but the use of `dequeue_many` is disallowed.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: (Optional.) A list of fully-defined `TensorShape` objects\n with the same length as `dtypes`, or `None`.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n ' dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes) names = _as_name_list(names, dtypes) queue_ref = gen_data_flow_ops._fifo_queue(component_types=dtypes, shapes=shapes, capacity=capacity, shared_name=shared_name, name=name) super(FIFOQueue, self).__init__(dtypes, shapes, names, queue_ref)
def __init__(self, capacity, dtypes, shapes=None, names=None, shared_name=None, name='fifo_queue'): 'Creates a queue that dequeues elements in a first-in first-out order.\n\n A `FIFOQueue` has bounded capacity; supports multiple concurrent\n producers and consumers; and provides exactly-once delivery.\n\n A `FIFOQueue` holds a list of up to `capacity` elements. Each\n element is a fixed-length tuple of tensors whose dtypes are\n described by `dtypes`, and whose shapes are optionally described\n by the `shapes` argument.\n\n If the `shapes` argument is specified, each component of a queue\n element must have the respective fixed shape. If it is\n unspecified, different queue elements may have different shapes,\n but the use of `dequeue_many` is disallowed.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: (Optional.) A list of fully-defined `TensorShape` objects\n with the same length as `dtypes`, or `None`.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n ' dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes) names = _as_name_list(names, dtypes) queue_ref = gen_data_flow_ops._fifo_queue(component_types=dtypes, shapes=shapes, capacity=capacity, shared_name=shared_name, name=name) super(FIFOQueue, self).__init__(dtypes, shapes, names, queue_ref)<|docstring|>Creates a queue that dequeues elements in a first-in first-out order. A `FIFOQueue` has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery. A `FIFOQueue` holds a list of up to `capacity` elements. Each element is a fixed-length tuple of tensors whose dtypes are described by `dtypes`, and whose shapes are optionally described by the `shapes` argument. If the `shapes` argument is specified, each component of a queue element must have the respective fixed shape. If it is unspecified, different queue elements may have different shapes, but the use of `dequeue_many` is disallowed. Args: capacity: An integer. The upper bound on the number of elements that may be stored in this queue. dtypes: A list of `DType` objects. The length of `dtypes` must equal the number of tensors in each queue element. shapes: (Optional.) A list of fully-defined `TensorShape` objects with the same length as `dtypes`, or `None`. names: (Optional.) A list of string naming the components in the queue with the same length as `dtypes`, or `None`. If specified the dequeue methods return a dictionary with the names as keys. shared_name: (Optional.) If non-empty, this queue will be shared under the given name across multiple sessions. name: Optional name for the queue operation.<|endoftext|>
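The `names` argument is the least obvious part of the FIFOQueue constructor above; the sketch below (assumed TF 1.x API) shows that when `names` is set, enqueue accepts and dequeue returns dictionaries keyed by component name rather than tuples.

```python
import tensorflow as tf

q = tf.FIFOQueue(capacity=4, dtypes=[tf.int32, tf.string],
                 shapes=[[], []], names=['id', 'label'])
enqueue_op = q.enqueue({'id': 7, 'label': 'cat'})
dequeue_op = q.dequeue()   # a dict of tensors, not a tuple

with tf.Session() as sess:
    sess.run(enqueue_op)
    print(sess.run(dequeue_op))  # e.g. {'id': 7, 'label': b'cat'}
```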
35c11710d0d177207522f9e1996a3c9d580c3653d45ece0c346b366bf5e44a3c
def __init__(self, capacity, dtypes, shapes, names=None, shared_name=None, name='padding_fifo_queue'): "Creates a queue that dequeues elements in a first-in first-out order.\n\n A `PaddingFIFOQueue` has bounded capacity; supports multiple concurrent\n producers and consumers; and provides exactly-once delivery.\n\n A `PaddingFIFOQueue` holds a list of up to `capacity` elements. Each\n element is a fixed-length tuple of tensors whose dtypes are\n described by `dtypes`, and whose shapes are described by the `shapes`\n argument.\n\n The `shapes` argument must be specified; each component of a queue\n element must have the respective shape. Shapes of fixed\n rank but variable size are allowed by setting any shape dimension to None.\n In this case, the inputs' shape may vary along the given dimension, and\n `dequeue_many` will pad the given dimension with zeros up to the maximum\n shape of all elements in the given batch.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: A list of `TensorShape` objects, with the same length as\n `dtypes`. Any dimension in the `TensorShape` containing value\n `None` is dynamic and allows values to be enqueued with\n variable size in that dimension.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n\n Raises:\n ValueError: If shapes is not a list of shapes, or the lengths of dtypes\n and shapes do not match.\n " dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes, unknown_dim_allowed=True) names = _as_name_list(names, dtypes) if (len(dtypes) != len(shapes)): raise ValueError(('Shapes must be provided for all components, but received %d dtypes and %d shapes.' % (len(dtypes), len(shapes)))) queue_ref = gen_data_flow_ops._padding_fifo_queue(component_types=dtypes, shapes=shapes, capacity=capacity, shared_name=shared_name, name=name) super(PaddingFIFOQueue, self).__init__(dtypes, shapes, names, queue_ref)
Creates a queue that dequeues elements in a first-in first-out order. A `PaddingFIFOQueue` has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery. A `PaddingFIFOQueue` holds a list of up to `capacity` elements. Each element is a fixed-length tuple of tensors whose dtypes are described by `dtypes`, and whose shapes are described by the `shapes` argument. The `shapes` argument must be specified; each component of a queue element must have the respective shape. Shapes of fixed rank but variable size are allowed by setting any shape dimension to None. In this case, the inputs' shape may vary along the given dimension, and `dequeue_many` will pad the given dimension with zeros up to the maximum shape of all elements in the given batch. Args: capacity: An integer. The upper bound on the number of elements that may be stored in this queue. dtypes: A list of `DType` objects. The length of `dtypes` must equal the number of tensors in each queue element. shapes: A list of `TensorShape` objects, with the same length as `dtypes`. Any dimension in the `TensorShape` containing value `None` is dynamic and allows values to be enqueued with variable size in that dimension. names: (Optional.) A list of string naming the components in the queue with the same length as `dtypes`, or `None`. If specified the dequeue methods return a dictionary with the names as keys. shared_name: (Optional.) If non-empty, this queue will be shared under the given name across multiple sessions. name: Optional name for the queue operation. Raises: ValueError: If shapes is not a list of shapes, or the lengths of dtypes and shapes do not match.
tensorflow/python/ops/data_flow_ops.py
__init__
habangar/tensorflow
73
python
def __init__(self, capacity, dtypes, shapes, names=None, shared_name=None, name='padding_fifo_queue'): "Creates a queue that dequeues elements in a first-in first-out order.\n\n A `PaddingFIFOQueue` has bounded capacity; supports multiple concurrent\n producers and consumers; and provides exactly-once delivery.\n\n A `PaddingFIFOQueue` holds a list of up to `capacity` elements. Each\n element is a fixed-length tuple of tensors whose dtypes are\n described by `dtypes`, and whose shapes are described by the `shapes`\n argument.\n\n The `shapes` argument must be specified; each component of a queue\n element must have the respective shape. Shapes of fixed\n rank but variable size are allowed by setting any shape dimension to None.\n In this case, the inputs' shape may vary along the given dimension, and\n `dequeue_many` will pad the given dimension with zeros up to the maximum\n shape of all elements in the given batch.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: A list of `TensorShape` objects, with the same length as\n `dtypes`. Any dimension in the `TensorShape` containing value\n `None` is dynamic and allows values to be enqueued with\n variable size in that dimension.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n\n Raises:\n ValueError: If shapes is not a list of shapes, or the lengths of dtypes\n and shapes do not match.\n " dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes, unknown_dim_allowed=True) names = _as_name_list(names, dtypes) if (len(dtypes) != len(shapes)): raise ValueError(('Shapes must be provided for all components, but received %d dtypes and %d shapes.' % (len(dtypes), len(shapes)))) queue_ref = gen_data_flow_ops._padding_fifo_queue(component_types=dtypes, shapes=shapes, capacity=capacity, shared_name=shared_name, name=name) super(PaddingFIFOQueue, self).__init__(dtypes, shapes, names, queue_ref)
def __init__(self, capacity, dtypes, shapes, names=None, shared_name=None, name='padding_fifo_queue'): "Creates a queue that dequeues elements in a first-in first-out order.\n\n A `PaddingFIFOQueue` has bounded capacity; supports multiple concurrent\n producers and consumers; and provides exactly-once delivery.\n\n A `PaddingFIFOQueue` holds a list of up to `capacity` elements. Each\n element is a fixed-length tuple of tensors whose dtypes are\n described by `dtypes`, and whose shapes are described by the `shapes`\n argument.\n\n The `shapes` argument must be specified; each component of a queue\n element must have the respective shape. Shapes of fixed\n rank but variable size are allowed by setting any shape dimension to None.\n In this case, the inputs' shape may vary along the given dimension, and\n `dequeue_many` will pad the given dimension with zeros up to the maximum\n shape of all elements in the given batch.\n\n Args:\n capacity: An integer. The upper bound on the number of elements\n that may be stored in this queue.\n dtypes: A list of `DType` objects. The length of `dtypes` must equal\n the number of tensors in each queue element.\n shapes: A list of `TensorShape` objects, with the same length as\n `dtypes`. Any dimension in the `TensorShape` containing value\n `None` is dynamic and allows values to be enqueued with\n variable size in that dimension.\n names: (Optional.) A list of string naming the components in the queue\n with the same length as `dtypes`, or `None`. If specified the dequeue\n methods return a dictionary with the names as keys.\n shared_name: (Optional.) If non-empty, this queue will be shared under\n the given name across multiple sessions.\n name: Optional name for the queue operation.\n\n Raises:\n ValueError: If shapes is not a list of shapes, or the lengths of dtypes\n and shapes do not match.\n " dtypes = _as_type_list(dtypes) shapes = _as_shape_list(shapes, dtypes, unknown_dim_allowed=True) names = _as_name_list(names, dtypes) if (len(dtypes) != len(shapes)): raise ValueError(('Shapes must be provided for all components, but received %d dtypes and %d shapes.' % (len(dtypes), len(shapes)))) queue_ref = gen_data_flow_ops._padding_fifo_queue(component_types=dtypes, shapes=shapes, capacity=capacity, shared_name=shared_name, name=name) super(PaddingFIFOQueue, self).__init__(dtypes, shapes, names, queue_ref)<|docstring|>Creates a queue that dequeues elements in a first-in first-out order. A `PaddingFIFOQueue` has bounded capacity; supports multiple concurrent producers and consumers; and provides exactly-once delivery. A `PaddingFIFOQueue` holds a list of up to `capacity` elements. Each element is a fixed-length tuple of tensors whose dtypes are described by `dtypes`, and whose shapes are described by the `shapes` argument. The `shapes` argument must be specified; each component of a queue element must have the respective shape. Shapes of fixed rank but variable size are allowed by setting any shape dimension to None. In this case, the inputs' shape may vary along the given dimension, and `dequeue_many` will pad the given dimension with zeros up to the maximum shape of all elements in the given batch. Args: capacity: An integer. The upper bound on the number of elements that may be stored in this queue. dtypes: A list of `DType` objects. The length of `dtypes` must equal the number of tensors in each queue element. shapes: A list of `TensorShape` objects, with the same length as `dtypes`. 
Any dimension in the `TensorShape` containing value `None` is dynamic and allows values to be enqueued with variable size in that dimension. names: (Optional.) A list of string naming the components in the queue with the same length as `dtypes`, or `None`. If specified the dequeue methods return a dictionary with the names as keys. shared_name: (Optional.) If non-empty, this queue will be shared under the given name across multiple sessions. name: Optional name for the queue operation. Raises: ValueError: If shapes is not a list of shapes, or the lengths of dtypes and shapes do not match.<|endoftext|>
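To make the padding behaviour described above concrete, here is a hedged sketch (assumed TF 1.x API): a `None` dimension lets elements differ in length, and `dequeue_many` zero-pads each batch up to its longest member.

```python
import tensorflow as tf

q = tf.PaddingFIFOQueue(capacity=10, dtypes=[tf.int32], shapes=[[None]])
enq_short = q.enqueue([[1, 2]])      # element of length 2
enq_long = q.enqueue([[3, 4, 5]])    # element of length 3
batch_op = q.dequeue_many(2)         # padded up to shape [2, 3]

with tf.Session() as sess:
    sess.run(enq_short)              # enqueue separately to keep FIFO order
    sess.run(enq_long)
    print(sess.run(batch_op))        # [[1 2 0]
                                     #  [3 4 5]]
```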
8dec1916ad6557ae2799eb622ef5db859ed9818cbb4e9979653a750f51f909cd
def parse(map_data_def, input) -> dict: ' Traverses a map-data: definition and calls value.parse on each value ' logger.debug(f'Entering: Key count = {len(map_data_def)}, data count = {(len(input) if isinstance(input, list) else 1)}') output = {} for key_value in map_data_def: if isinstance(key_value['key'], str): key_def = key.key(key_value['key']) elif isinstance(key_value['key'], dict): if ((key_def := key.key(key_value['key']['name'])) is None): logger.error(f"The key {key_value['key']} must have a 'name' key with a non-empty value") if ((section := key_value['key'].get('section')) is not None): key_def.addProperty('section', section) if (((tags := key_value['key'].get('tags')) is not None) and isinstance(tags, list)): key_def.addTags(tags) logger.debug(f"Mapping data to '{key_def}'") value_def = key_value['value'] value = data.value.parse(value_def, input) logger.debug(f"Mapping '{key_def}':'{value}'") output[key_def] = value if (key_def.getProperty('section') is not None): key_def.addProperty('value', value) logger.debug(f'Leaving: Key count = {len(map_data_def)}, data count = {(len(input) if isinstance(input, list) else 1)}') return output
Traverses a map-data: definition and calls value.parse on each value
data/map.py
parse
samadhicsec/threatware
0
python
def parse(map_data_def, input) -> dict: ' ' logger.debug(f'Entering: Key count = {len(map_data_def)}, data count = {(len(input) if isinstance(input, list) else 1)}') output = {} for key_value in map_data_def: if isinstance(key_value['key'], str): key_def = key.key(key_value['key']) elif isinstance(key_value['key'], dict): if ((key_def := key.key(key_value['key']['name'])) is None): logger.error(f"The key {key_value['key']} must have a 'name' key with a non-empty value") if ((section := key_value['key'].get('section')) is not None): key_def.addProperty('section', section) if (((tags := key_value['key'].get('tags')) is not None) and isinstance(tags, list)): key_def.addTags(tags) logger.debug(f"Mapping data to '{key_def}'") value_def = key_value['value'] value = data.value.parse(value_def, input) logger.debug(f"Mapping '{key_def}':'{value}'") output[key_def] = value if (key_def.getProperty('section') is not None): key_def.addProperty('value', value) logger.debug(f'Leaving: Key count = {len(map_data_def)}, data count = {(len(input) if isinstance(input, list) else 1)}') return output
def parse(map_data_def, input) -> dict: ' ' logger.debug(f'Entering: Key count = {len(map_data_def)}, data count = {(len(input) if isinstance(input, list) else 1)}') output = {} for key_value in map_data_def: if isinstance(key_value['key'], str): key_def = key.key(key_value['key']) elif isinstance(key_value['key'], dict): if ((key_def := key.key(key_value['key']['name'])) is None): logger.error(f"The key {key_value['key']} must have a 'name' key with a non-empty value") if ((section := key_value['key'].get('section')) is not None): key_def.addProperty('section', section) if (((tags := key_value['key'].get('tags')) is not None) and isinstance(tags, list)): key_def.addTags(tags) logger.debug(f"Mapping data to '{key_def}'") value_def = key_value['value'] value = data.value.parse(value_def, input) logger.debug(f"Mapping '{key_def}':'{value}'") output[key_def] = value if (key_def.getProperty('section') is not None): key_def.addProperty('value', value) logger.debug(f'Leaving: Key count = {len(map_data_def)}, data count = {(len(input) if isinstance(input, list) else 1)}') return output<|docstring|>Traverses a map-data: definition and calls value.parse on each value<|endoftext|>
216eb4fde65cbf2d3b89547b1f835cc31610133a16ffb2bd201b68d0536d5cce
def list_reserved_resources(): 'Displays the currently reserved resources on all agents via state.json;\n Currently for INFINITY-1881 where we believe uninstall may not be\n always doing its job correctly.' state_json_slaveinfo = dcos.mesos.DCOSClient().get_state_summary()['slaves'] for slave in state_json_slaveinfo: reserved_resources = slave['reserved_resources'] if (reserved_resources == {}): continue msg = 'on slaveid=%s hostname=%s reserved resources: %s' log.info((msg % (slave['id'], slave['hostname'], reserved_resources)))
Displays the currently reserved resources on all agents via state.json; Currently for INFINITY-1881 where we believe uninstall may not be always doing its job correctly.
testing/sdk_utils.py
list_reserved_resources
greggomann/dcos-commons
7
python
def list_reserved_resources(): 'Displays the currently reserved resources on all agents via state.json;\n Currently for INFINITY-1881 where we believe uninstall may not be\n always doing its job correctly.' state_json_slaveinfo = dcos.mesos.DCOSClient().get_state_summary()['slaves'] for slave in state_json_slaveinfo: reserved_resources = slave['reserved_resources'] if (reserved_resources == {}): continue msg = 'on slaveid=%s hostname=%s reserved resources: %s' log.info((msg % (slave['id'], slave['hostname'], reserved_resources)))
def list_reserved_resources(): 'Displays the currently reserved resources on all agents via state.json;\n Currently for INFINITY-1881 where we believe uninstall may not be\n always doing its job correctly.' state_json_slaveinfo = dcos.mesos.DCOSClient().get_state_summary()['slaves'] for slave in state_json_slaveinfo: reserved_resources = slave['reserved_resources'] if (reserved_resources == {}): continue msg = 'on slaveid=%s hostname=%s reserved resources: %s' log.info((msg % (slave['id'], slave['hostname'], reserved_resources)))<|docstring|>Displays the currently reserved resources on all agents via state.json; Currently for INFINITY-1881 where we believe uninstall may not be always doing its job correctly.<|endoftext|>
0657807e8261b6b6a4861a8856eff62c2fe9091d2c9721e662603d039f0013f0
def check_dcos_min_version_mark(item: pytest.Item): "Enforces the dcos_min_version pytest annotation, which should be used like this:\n\n @pytest.mark.dcos_min_version('1.10')\n def your_test_here(): ...\n\n In order for this annotation to take effect, this function must be called by a pytest_runtest_setup() hook.\n " min_version_mark = item.get_marker('dcos_min_version') if min_version_mark: min_version = min_version_mark.args[0] message = 'Feature only supported in DC/OS {} and up'.format(min_version) if ('reason' in min_version_mark.kwargs): message += ': {}'.format(min_version_mark.kwargs['reason']) if dcos_version_less_than(min_version): pytest.skip(message)
Enforces the dcos_min_version pytest annotation, which should be used like this: @pytest.mark.dcos_min_version('1.10') def your_test_here(): ... In order for this annotation to take effect, this function must be called by a pytest_runtest_setup() hook.
testing/sdk_utils.py
check_dcos_min_version_mark
greggomann/dcos-commons
7
python
def check_dcos_min_version_mark(item: pytest.Item): "Enforces the dcos_min_version pytest annotation, which should be used like this:\n\n @pytest.mark.dcos_min_version('1.10')\n def your_test_here(): ...\n\n In order for this annotation to take effect, this function must be called by a pytest_runtest_setup() hook.\n " min_version_mark = item.get_marker('dcos_min_version') if min_version_mark: min_version = min_version_mark.args[0] message = 'Feature only supported in DC/OS {} and up'.format(min_version) if ('reason' in min_version_mark.kwargs): message += ': {}'.format(min_version_mark.kwargs['reason']) if dcos_version_less_than(min_version): pytest.skip(message)
def check_dcos_min_version_mark(item: pytest.Item): "Enforces the dcos_min_version pytest annotation, which should be used like this:\n\n @pytest.mark.dcos_min_version('1.10')\n def your_test_here(): ...\n\n In order for this annotation to take effect, this function must be called by a pytest_runtest_setup() hook.\n " min_version_mark = item.get_marker('dcos_min_version') if min_version_mark: min_version = min_version_mark.args[0] message = 'Feature only supported in DC/OS {} and up'.format(min_version) if ('reason' in min_version_mark.kwargs): message += ': {}'.format(min_version_mark.kwargs['reason']) if dcos_version_less_than(min_version): pytest.skip(message)<|docstring|>Enforces the dcos_min_version pytest annotation, which should be used like this: @pytest.mark.dcos_min_version('1.10') def your_test_here(): ... In order for this annotation to take effect, this function must be called by a pytest_runtest_setup() hook.<|endoftext|>
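The docstring above notes the marker only takes effect if the helper is wired into a pytest_runtest_setup() hook; a minimal conftest.py sketch follows. The import name `sdk_utils` and the sample test are assumptions for illustration.

```python
import pytest
import sdk_utils  # hypothetical import of the module containing the helper

def pytest_runtest_setup(item: pytest.Item):
    # Skip the test when the cluster runs a DC/OS version below the marked minimum.
    sdk_utils.check_dcos_min_version_mark(item)

@pytest.mark.dcos_min_version('1.10', reason='feature added in DC/OS 1.10')
def test_requires_recent_dcos():
    ...
```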
1ca22585da97136f9398df3bbe5cb20c691d83b0ddc7aea9071a18dff84c2bd7
def is_open_dcos(): 'Determine if the tests are being run against open DC/OS. This is presently done by\n checking the envvar DCOS_ENTERPRISE.' return (not (os.environ.get('DCOS_ENTERPRISE', 'true').lower() == 'true'))
Determine if the tests are being run against open DC/OS. This is presently done by checking the envvar DCOS_ENTERPRISE.
testing/sdk_utils.py
is_open_dcos
greggomann/dcos-commons
7
python
def is_open_dcos(): 'Determine if the tests are being run against open DC/OS. This is presently done by\n checking the envvar DCOS_ENTERPRISE.' return (not (os.environ.get('DCOS_ENTERPRISE', 'true').lower() == 'true'))
def is_open_dcos(): 'Determine if the tests are being run against open DC/OS. This is presently done by\n checking the envvar DCOS_ENTERPRISE.' return (not (os.environ.get('DCOS_ENTERPRISE', 'true').lower() == 'true'))<|docstring|>Determine if the tests are being run against open DC/OS. This is presently done by checking the envvar DCOS_ENTERPRISE.<|endoftext|>
bb74f336e2cce520c6aec1b9398d47470139de814c36970067c5a51ffce23c6b
def is_strict_mode(): 'Determine if the tests are being run on a strict mode cluster.' return (os.environ.get('SECURITY', '') == 'strict')
Determine if the tests are being run on a strict mode cluster.
testing/sdk_utils.py
is_strict_mode
greggomann/dcos-commons
7
python
def is_strict_mode(): return (os.environ.get('SECURITY', '') == 'strict')
def is_strict_mode(): return (os.environ.get('SECURITY', '') == 'strict')<|docstring|>Determine if the tests are being run on a strict mode cluster.<|endoftext|>
50a53140288649f04327eacede9d5205cef4343e80fe5068a134769f9ca1f6f4
def get_in(keys, coll, default=None): " Reaches into nested associative data structures. Returns the value for path ``keys``.\n\n If the path doesn't exist returns ``default``.\n\n >>> transaction = {'name': 'Alice',\n ... 'purchase': {'items': ['Apple', 'Orange'],\n ... 'costs': [0.50, 1.25]},\n ... 'credit card': '5555-1234-1234-1234'}\n >>> get_in(['purchase', 'items', 0], transaction)\n 'Apple'\n >>> get_in(['name'], transaction)\n 'Alice'\n >>> get_in(['purchase', 'total'], transaction)\n >>> get_in(['purchase', 'items', 'apple'], transaction)\n >>> get_in(['purchase', 'items', 10], transaction)\n >>> get_in(['purchase', 'total'], transaction, 0)\n 0\n " try: return functools.reduce(operator.getitem, keys, coll) except (KeyError, IndexError, TypeError): return default
Reaches into nested associative data structures. Returns the value for path ``keys``. If the path doesn't exist returns ``default``. >>> transaction = {'name': 'Alice', ... 'purchase': {'items': ['Apple', 'Orange'], ... 'costs': [0.50, 1.25]}, ... 'credit card': '5555-1234-1234-1234'} >>> get_in(['purchase', 'items', 0], transaction) 'Apple' >>> get_in(['name'], transaction) 'Alice' >>> get_in(['purchase', 'total'], transaction) >>> get_in(['purchase', 'items', 'apple'], transaction) >>> get_in(['purchase', 'items', 10], transaction) >>> get_in(['purchase', 'total'], transaction, 0) 0
testing/sdk_utils.py
get_in
greggomann/dcos-commons
7
python
def get_in(keys, coll, default=None): " Reaches into nested associative data structures. Returns the value for path ``keys``.\n\n If the path doesn't exist returns ``default``.\n\n >>> transaction = {'name': 'Alice',\n ... 'purchase': {'items': ['Apple', 'Orange'],\n ... 'costs': [0.50, 1.25]},\n ... 'credit card': '5555-1234-1234-1234'}\n >>> get_in(['purchase', 'items', 0], transaction)\n 'Apple'\n >>> get_in(['name'], transaction)\n 'Alice'\n >>> get_in(['purchase', 'total'], transaction)\n >>> get_in(['purchase', 'items', 'apple'], transaction)\n >>> get_in(['purchase', 'items', 10], transaction)\n >>> get_in(['purchase', 'total'], transaction, 0)\n 0\n " try: return functools.reduce(operator.getitem, keys, coll) except (KeyError, IndexError, TypeError): return default
def get_in(keys, coll, default=None): " Reaches into nested associative data structures. Returns the value for path ``keys``.\n\n If the path doesn't exist returns ``default``.\n\n >>> transaction = {'name': 'Alice',\n ... 'purchase': {'items': ['Apple', 'Orange'],\n ... 'costs': [0.50, 1.25]},\n ... 'credit card': '5555-1234-1234-1234'}\n >>> get_in(['purchase', 'items', 0], transaction)\n 'Apple'\n >>> get_in(['name'], transaction)\n 'Alice'\n >>> get_in(['purchase', 'total'], transaction)\n >>> get_in(['purchase', 'items', 'apple'], transaction)\n >>> get_in(['purchase', 'items', 10], transaction)\n >>> get_in(['purchase', 'total'], transaction, 0)\n 0\n " try: return functools.reduce(operator.getitem, keys, coll) except (KeyError, IndexError, TypeError): return default<|docstring|>Reaches into nested associative data structures. Returns the value for path ``keys``. If the path doesn't exist returns ``default``. >>> transaction = {'name': 'Alice', ... 'purchase': {'items': ['Apple', 'Orange'], ... 'costs': [0.50, 1.25]}, ... 'credit card': '5555-1234-1234-1234'} >>> get_in(['purchase', 'items', 0], transaction) 'Apple' >>> get_in(['name'], transaction) 'Alice' >>> get_in(['purchase', 'total'], transaction) >>> get_in(['purchase', 'items', 'apple'], transaction) >>> get_in(['purchase', 'items', 10], transaction) >>> get_in(['purchase', 'total'], transaction, 0) 0<|endoftext|>
f45e9c81930b2904fa35be67ff3ef3375e932a7e73d88f3262da1eb50e83e9b1
def saveJSON(fileName, data): '\n Function for/to <short description of `netpyne.sim.save.saveJSON`>\n\n Parameters\n ----------\n fileName : <type>\n <Short description of fileName>\n **Default:** *required*\n\n data : <type>\n <Short description of data>\n **Default:** *required*\n\n\n ' import json, io from .utils import NpSerializer with io.open(fileName, 'w', encoding='utf8') as fileObj: str_ = json.dumps(data, indent=4, sort_keys=True, separators=(',', ': '), ensure_ascii=False, cls=NpSerializer) fileObj.write(to_unicode(str_))
Function for/to <short description of `netpyne.sim.save.saveJSON`> Parameters ---------- fileName : <type> <Short description of fileName> **Default:** *required* data : <type> <Short description of data> **Default:** *required*
netpyne/sim/save.py
saveJSON
ghafari2019/netpyne
1
python
def saveJSON(fileName, data): '\n Function for/to <short description of `netpyne.sim.save.saveJSON`>\n\n Parameters\n ----------\n fileName : <type>\n <Short description of fileName>\n **Default:** *required*\n\n data : <type>\n <Short description of data>\n **Default:** *required*\n\n\n ' import json, io from .utils import NpSerializer with io.open(fileName, 'w', encoding='utf8') as fileObj: str_ = json.dumps(data, indent=4, sort_keys=True, separators=(',', ': '), ensure_ascii=False, cls=NpSerializer) fileObj.write(to_unicode(str_))
def saveJSON(fileName, data): '\n Function for/to <short description of `netpyne.sim.save.saveJSON`>\n\n Parameters\n ----------\n fileName : <type>\n <Short description of fileName>\n **Default:** *required*\n\n data : <type>\n <Short description of data>\n **Default:** *required*\n\n\n ' import json, io from .utils import NpSerializer with io.open(fileName, 'w', encoding='utf8') as fileObj: str_ = json.dumps(data, indent=4, sort_keys=True, separators=(',', ': '), ensure_ascii=False, cls=NpSerializer) fileObj.write(to_unicode(str_))<|docstring|>Function for/to <short description of `netpyne.sim.save.saveJSON`> Parameters ---------- fileName : <type> <Short description of fileName> **Default:** *required* data : <type> <Short description of data> **Default:** *required*<|endoftext|>
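A small usage sketch for saveJSON; it assumes the function is re-exported on the `sim` package and that NpSerializer (imported inside the function) can serialize numpy scalars and arrays. The file name and data are illustrative only.

```python
import numpy as np
from netpyne import sim  # assumes the usual netpyne package layout

data = {'rates': np.array([10.0, 12.5]), 'popCount': np.int64(3)}
sim.saveJSON('example_output.json', data)  # pretty-printed UTF-8 JSON, sorted keys
```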
fdc283c3c5bb1c2c85c12fdec429c161c5108fc60866204d8e544a56bdd11755
def saveData(include=None, filename=None): '\n Function for/to <short description of `netpyne.sim.save.saveData`>\n\n Parameters\n ----------\n include : <``None``?>\n <Short description of include>\n **Default:** ``None``\n **Options:** ``<option>`` <description of option>\n\n filename : <``None``?>\n <Short description of filename>\n **Default:** ``None``\n **Options:** ``<option>`` <description of option>\n\n\n ' from .. import sim if ((sim.rank == 0) and (not getattr(sim.net, 'allCells', None)) and (not getattr(sim, 'allSimData', None))): needGather = True else: needGather = False if needGather: gather.gatherData() if filename: sim.cfg.filename = filename if (sim.rank == 0): sim.timing('start', 'saveTime') import os if (isinstance(sim.cfg.backupCfgFile, list) and (len(sim.cfg.backupCfgFile) == 2)): simName = (sim.cfg.simLabel if sim.cfg.simLabel else os.path.basename(sim.cfg.filename)) print(('Copying cfg file %s ... ' % simName)) source = sim.cfg.backupCfgFile[0] targetFolder = sim.cfg.backupCfgFile[1] try: os.mkdir(targetFolder) except OSError: if (not os.path.exists(targetFolder)): print((' Could not create target folder: %s' % targetFolder)) targetFile = (((targetFolder + '/') + simName) + '_cfg.py') if os.path.exists(targetFile): print(' Removing prior cfg file', targetFile) os.system(('rm ' + targetFile)) os.system(((('cp ' + source) + ' ') + targetFile)) targetFolder = os.path.dirname(sim.cfg.filename) if (targetFolder and (not os.path.exists(targetFolder))): try: os.mkdir(targetFolder) except OSError: print((' Could not create target folder: %s' % targetFolder)) if (not include): include = sim.cfg.saveDataInclude dataSave = {} net = {} dataSave['netpyne_version'] = sim.version(show=False) dataSave['netpyne_changeset'] = sim.gitChangeset(show=False) if getattr(sim.net.params, 'version', None): dataSave['netParams_version'] = sim.net.params.version if ('netParams' in include): sim.net.params.__dict__.pop('_labelid', None) net['params'] = utils.replaceFuncObj(sim.net.params.__dict__) if ('net' in include): include.extend(['netPops', 'netCells']) if (('netCells' in include) and hasattr(sim.net, 'allCells')): net['cells'] = sim.net.allCells if (('netPops' in include) and hasattr(sim.net, 'allPops')): net['pops'] = sim.net.allPops if net: dataSave['net'] = net if ('simConfig' in include): dataSave['simConfig'] = sim.cfg.__dict__ if ('simData' in include): if ('LFP' in sim.allSimData): sim.allSimData['LFP'] = sim.allSimData['LFP'].tolist() dataSave['simData'] = sim.allSimData if dataSave: if sim.cfg.timestampFilename: timestamp = time() timestampStr = ('-' + datetime.fromtimestamp(timestamp).strftime('%Y%m%d_%H%M%S')) else: timestampStr = '' filePath = (sim.cfg.filename + timestampStr) if sim.cfg.savePickle: import pickle dataSave = utils.replaceDictODict(dataSave) print(('Saving output as %s ... ' % (filePath + '.pkl'))) with open((filePath + '.pkl'), 'wb') as fileObj: pickle.dump(dataSave, fileObj) print('Finished saving!') if sim.cfg.saveDpk: import gzip print(('Saving output as %s ... ' % (filePath + '.dpk'))) gzip.open(filePath, 'wb').write(pk.dumps(dataSave)) print('Finished saving!') if sim.cfg.saveJson: print(('Saving output as %s ... ' % (filePath + '.json '))) sim.saveJSON((filePath + '.json'), dataSave) print('Finished saving!') if sim.cfg.saveMat: from scipy.io import savemat print(('Saving output as %s ... 
' % (filePath + '.mat'))) savemat((filePath + '.mat'), utils.tupleToList(utils.replaceNoneObj(dataSave))) print('Finished saving!') if sim.cfg.saveHDF5: dataSaveUTF8 = utils._dict2utf8(utils.replaceNoneObj(dataSave)) import hdf5storage print(('Saving output as %s... ' % (filePath + '.hdf5'))) hdf5storage.writes(dataSaveUTF8, filename=(filePath + '.hdf5')) print('Finished saving!') if sim.cfg.saveCSV: if ('simData' in dataSave): import csv print(('Saving output as %s ... ' % (filePath + '.csv'))) writer = csv.writer(open((filePath + '.csv'), 'wb')) for dic in dataSave['simData']: for values in dic: writer.writerow(values) print('Finished saving!') if sim.cfg.saveDat: traces = sim.cfg.recordTraces for ref in list(traces.keys()): for cellid in list(sim.allSimData[ref].keys()): dat_file_name = ('%s_%s.dat' % (ref, cellid)) dat_file = open(dat_file_name, 'w') trace = sim.allSimData[ref][cellid] print(('Saving %i points of data on: %s:%s to %s' % (len(trace), ref, cellid, dat_file_name))) for i in range(len(trace)): dat_file.write(('%s\t%s\n' % (((i * sim.cfg.dt) / 1000), (trace[i] / 1000)))) print('Finished saving!') if sim.cfg.timing: sim.timing('stop', 'saveTime') print((' Done; saving time = %0.2f s.' % sim.timingData['saveTime'])) if (sim.cfg.timing and sim.cfg.saveTiming): import pickle with open('timing.pkl', 'wb') as file: pickle.dump(sim.timing, file) for key in list(dataSave.keys()): del dataSave[key] del dataSave import os return ((os.getcwd() + '/') + filePath) else: print('Nothing to save')
Function for/to <short description of `netpyne.sim.save.saveData`> Parameters ---------- include : <``None``?> <Short description of include> **Default:** ``None`` **Options:** ``<option>`` <description of option> filename : <``None``?> <Short description of filename> **Default:** ``None`` **Options:** ``<option>`` <description of option>
netpyne/sim/save.py
saveData
ghafari2019/netpyne
1
python
def saveData(include=None, filename=None): '\n Function for/to <short description of `netpyne.sim.save.saveData`>\n\n Parameters\n ----------\n include : <``None``?>\n <Short description of include>\n **Default:** ``None``\n **Options:** ``<option>`` <description of option>\n\n filename : <``None``?>\n <Short description of filename>\n **Default:** ``None``\n **Options:** ``<option>`` <description of option>\n\n\n ' from .. import sim if ((sim.rank == 0) and (not getattr(sim.net, 'allCells', None)) and (not getattr(sim, 'allSimData', None))): needGather = True else: needGather = False if needGather: gather.gatherData() if filename: sim.cfg.filename = filename if (sim.rank == 0): sim.timing('start', 'saveTime') import os if (isinstance(sim.cfg.backupCfgFile, list) and (len(sim.cfg.backupCfgFile) == 2)): simName = (sim.cfg.simLabel if sim.cfg.simLabel else os.path.basename(sim.cfg.filename)) print(('Copying cfg file %s ... ' % simName)) source = sim.cfg.backupCfgFile[0] targetFolder = sim.cfg.backupCfgFile[1] try: os.mkdir(targetFolder) except OSError: if (not os.path.exists(targetFolder)): print((' Could not create target folder: %s' % targetFolder)) targetFile = (((targetFolder + '/') + simName) + '_cfg.py') if os.path.exists(targetFile): print(' Removing prior cfg file', targetFile) os.system(('rm ' + targetFile)) os.system(((('cp ' + source) + ' ') + targetFile)) targetFolder = os.path.dirname(sim.cfg.filename) if (targetFolder and (not os.path.exists(targetFolder))): try: os.mkdir(targetFolder) except OSError: print((' Could not create target folder: %s' % targetFolder)) if (not include): include = sim.cfg.saveDataInclude dataSave = {} net = {} dataSave['netpyne_version'] = sim.version(show=False) dataSave['netpyne_changeset'] = sim.gitChangeset(show=False) if getattr(sim.net.params, 'version', None): dataSave['netParams_version'] = sim.net.params.version if ('netParams' in include): sim.net.params.__dict__.pop('_labelid', None) net['params'] = utils.replaceFuncObj(sim.net.params.__dict__) if ('net' in include): include.extend(['netPops', 'netCells']) if (('netCells' in include) and hasattr(sim.net, 'allCells')): net['cells'] = sim.net.allCells if (('netPops' in include) and hasattr(sim.net, 'allPops')): net['pops'] = sim.net.allPops if net: dataSave['net'] = net if ('simConfig' in include): dataSave['simConfig'] = sim.cfg.__dict__ if ('simData' in include): if ('LFP' in sim.allSimData): sim.allSimData['LFP'] = sim.allSimData['LFP'].tolist() dataSave['simData'] = sim.allSimData if dataSave: if sim.cfg.timestampFilename: timestamp = time() timestampStr = ('-' + datetime.fromtimestamp(timestamp).strftime('%Y%m%d_%H%M%S')) else: timestampStr = filePath = (sim.cfg.filename + timestampStr) if sim.cfg.savePickle: import pickle dataSave = utils.replaceDictODict(dataSave) print(('Saving output as %s ... ' % (filePath + '.pkl'))) with open((filePath + '.pkl'), 'wb') as fileObj: pickle.dump(dataSave, fileObj) print('Finished saving!') if sim.cfg.saveDpk: import gzip print(('Saving output as %s ... ' % (filePath + '.dpk'))) gzip.open(filePath, 'wb').write(pk.dumps(dataSave)) print('Finished saving!') if sim.cfg.saveJson: print(('Saving output as %s ... ' % (filePath + '.json '))) sim.saveJSON((filePath + '.json'), dataSave) print('Finished saving!') if sim.cfg.saveMat: from scipy.io import savemat print(('Saving output as %s ... 
' % (filePath + '.mat'))) savemat((filePath + '.mat'), utils.tupleToList(utils.replaceNoneObj(dataSave))) print('Finished saving!') if sim.cfg.saveHDF5: dataSaveUTF8 = utils._dict2utf8(utils.replaceNoneObj(dataSave)) import hdf5storage print(('Saving output as %s... ' % (filePath + '.hdf5'))) hdf5storage.writes(dataSaveUTF8, filename=(filePath + '.hdf5')) print('Finished saving!') if sim.cfg.saveCSV: if ('simData' in dataSave): import csv print(('Saving output as %s ... ' % (filePath + '.csv'))) writer = csv.writer(open((filePath + '.csv'), 'wb')) for dic in dataSave['simData']: for values in dic: writer.writerow(values) print('Finished saving!') if sim.cfg.saveDat: traces = sim.cfg.recordTraces for ref in list(traces.keys()): for cellid in list(sim.allSimData[ref].keys()): dat_file_name = ('%s_%s.dat' % (ref, cellid)) dat_file = open(dat_file_name, 'w') trace = sim.allSimData[ref][cellid] print(('Saving %i points of data on: %s:%s to %s' % (len(trace), ref, cellid, dat_file_name))) for i in range(len(trace)): dat_file.write(('%s\t%s\n' % (((i * sim.cfg.dt) / 1000), (trace[i] / 1000)))) print('Finished saving!') if sim.cfg.timing: sim.timing('stop', 'saveTime') print((' Done; saving time = %0.2f s.' % sim.timingData['saveTime'])) if (sim.cfg.timing and sim.cfg.saveTiming): import pickle with open('timing.pkl', 'wb') as file: pickle.dump(sim.timing, file) for key in list(dataSave.keys()): del dataSave[key] del dataSave import os return ((os.getcwd() + '/') + filePath) else: print('Nothing to save')
def saveData(include=None, filename=None): '\n Function for/to <short description of `netpyne.sim.save.saveData`>\n\n Parameters\n ----------\n include : <``None``?>\n <Short description of include>\n **Default:** ``None``\n **Options:** ``<option>`` <description of option>\n\n filename : <``None``?>\n <Short description of filename>\n **Default:** ``None``\n **Options:** ``<option>`` <description of option>\n\n\n ' from .. import sim if ((sim.rank == 0) and (not getattr(sim.net, 'allCells', None)) and (not getattr(sim, 'allSimData', None))): needGather = True else: needGather = False if needGather: gather.gatherData() if filename: sim.cfg.filename = filename if (sim.rank == 0): sim.timing('start', 'saveTime') import os if (isinstance(sim.cfg.backupCfgFile, list) and (len(sim.cfg.backupCfgFile) == 2)): simName = (sim.cfg.simLabel if sim.cfg.simLabel else os.path.basename(sim.cfg.filename)) print(('Copying cfg file %s ... ' % simName)) source = sim.cfg.backupCfgFile[0] targetFolder = sim.cfg.backupCfgFile[1] try: os.mkdir(targetFolder) except OSError: if (not os.path.exists(targetFolder)): print((' Could not create target folder: %s' % targetFolder)) targetFile = (((targetFolder + '/') + simName) + '_cfg.py') if os.path.exists(targetFile): print(' Removing prior cfg file', targetFile) os.system(('rm ' + targetFile)) os.system(((('cp ' + source) + ' ') + targetFile)) targetFolder = os.path.dirname(sim.cfg.filename) if (targetFolder and (not os.path.exists(targetFolder))): try: os.mkdir(targetFolder) except OSError: print((' Could not create target folder: %s' % targetFolder)) if (not include): include = sim.cfg.saveDataInclude dataSave = {} net = {} dataSave['netpyne_version'] = sim.version(show=False) dataSave['netpyne_changeset'] = sim.gitChangeset(show=False) if getattr(sim.net.params, 'version', None): dataSave['netParams_version'] = sim.net.params.version if ('netParams' in include): sim.net.params.__dict__.pop('_labelid', None) net['params'] = utils.replaceFuncObj(sim.net.params.__dict__) if ('net' in include): include.extend(['netPops', 'netCells']) if (('netCells' in include) and hasattr(sim.net, 'allCells')): net['cells'] = sim.net.allCells if (('netPops' in include) and hasattr(sim.net, 'allPops')): net['pops'] = sim.net.allPops if net: dataSave['net'] = net if ('simConfig' in include): dataSave['simConfig'] = sim.cfg.__dict__ if ('simData' in include): if ('LFP' in sim.allSimData): sim.allSimData['LFP'] = sim.allSimData['LFP'].tolist() dataSave['simData'] = sim.allSimData if dataSave: if sim.cfg.timestampFilename: timestamp = time() timestampStr = ('-' + datetime.fromtimestamp(timestamp).strftime('%Y%m%d_%H%M%S')) else: timestampStr = filePath = (sim.cfg.filename + timestampStr) if sim.cfg.savePickle: import pickle dataSave = utils.replaceDictODict(dataSave) print(('Saving output as %s ... ' % (filePath + '.pkl'))) with open((filePath + '.pkl'), 'wb') as fileObj: pickle.dump(dataSave, fileObj) print('Finished saving!') if sim.cfg.saveDpk: import gzip print(('Saving output as %s ... ' % (filePath + '.dpk'))) gzip.open(filePath, 'wb').write(pk.dumps(dataSave)) print('Finished saving!') if sim.cfg.saveJson: print(('Saving output as %s ... ' % (filePath + '.json '))) sim.saveJSON((filePath + '.json'), dataSave) print('Finished saving!') if sim.cfg.saveMat: from scipy.io import savemat print(('Saving output as %s ... 
' % (filePath + '.mat'))) savemat((filePath + '.mat'), utils.tupleToList(utils.replaceNoneObj(dataSave))) print('Finished saving!') if sim.cfg.saveHDF5: dataSaveUTF8 = utils._dict2utf8(utils.replaceNoneObj(dataSave)) import hdf5storage print(('Saving output as %s... ' % (filePath + '.hdf5'))) hdf5storage.writes(dataSaveUTF8, filename=(filePath + '.hdf5')) print('Finished saving!') if sim.cfg.saveCSV: if ('simData' in dataSave): import csv print(('Saving output as %s ... ' % (filePath + '.csv'))) writer = csv.writer(open((filePath + '.csv'), 'wb')) for dic in dataSave['simData']: for values in dic: writer.writerow(values) print('Finished saving!') if sim.cfg.saveDat: traces = sim.cfg.recordTraces for ref in list(traces.keys()): for cellid in list(sim.allSimData[ref].keys()): dat_file_name = ('%s_%s.dat' % (ref, cellid)) dat_file = open(dat_file_name, 'w') trace = sim.allSimData[ref][cellid] print(('Saving %i points of data on: %s:%s to %s' % (len(trace), ref, cellid, dat_file_name))) for i in range(len(trace)): dat_file.write(('%s\t%s\n' % (((i * sim.cfg.dt) / 1000), (trace[i] / 1000)))) print('Finished saving!') if sim.cfg.timing: sim.timing('stop', 'saveTime') print((' Done; saving time = %0.2f s.' % sim.timingData['saveTime'])) if (sim.cfg.timing and sim.cfg.saveTiming): import pickle with open('timing.pkl', 'wb') as file: pickle.dump(sim.timing, file) for key in list(dataSave.keys()): del dataSave[key] del dataSave import os return ((os.getcwd() + '/') + filePath) else: print('Nothing to save')<|docstring|>Function for/to <short description of `netpyne.sim.save.saveData`> Parameters ---------- include : <``None``?> <Short description of include> **Default:** ``None`` **Options:** ``<option>`` <description of option> filename : <``None``?> <Short description of filename> **Default:** ``None`` **Options:** ``<option>`` <description of option><|endoftext|>
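The saveData record above fans out to one output file per enabled cfg.save* flag (pickle, dpk, json, mat, hdf5, csv, dat) and returns the absolute path of the written file from rank 0. A minimal usage sketch, assuming the usual netpyne.specs/netpyne.sim entry points; the model definition and run are not shown and 'my_run' is a placeholder filename:

from netpyne import sim, specs   # assumed standard netpyne imports

cfg = specs.SimConfig()
cfg.filename = 'my_run'          # base path; saveData() appends .pkl, .json, .mat, ...
cfg.savePickle = True            # each cfg.save* flag enables one output format
cfg.saveJson = True
cfg.saveDataInclude = ['netParams', 'net', 'simData', 'simConfig']   # used when include is None

# ... define netParams and run the simulation here (omitted) ...

out_path = sim.saveData()        # gathers data if needed, writes my_run.pkl and my_run.json
print(out_path)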
93a53853b250b1c8ab0880fe1d1a96d0e1aa53ec5b50c24f349f34984efbe245
def distributedSaveHDF5(): '\n Function for/to <short description of `netpyne.sim.save.distributedSaveHDF5`>\n\n\n ' from .. import sim import h5py if (sim.rank == 0): sim.timing('start', 'saveTimeHDF5') sim.compactConnFormat() conns = [([cell.gid] + conn) for cell in sim.net.cells for conn in cell.conns] conns = sim.copyRemoveItemObj(conns, keystart='h', newval=[]) connFormat = (['postGid'] + sim.cfg.compactConnFormat) with h5py.File((sim.cfg.filename + '.h5'), 'w') as hf: hf.create_dataset('conns', data=conns) hf.create_dataset('connsFormat', data=connFormat) if (sim.rank == 0): sim.timing('stop', 'saveTimeHDF5')
Function for/to <short description of `netpyne.sim.save.distributedSaveHDF5`>
netpyne/sim/save.py
distributedSaveHDF5
ghafari2019/netpyne
1
python
def distributedSaveHDF5(): '\n \n\n\n ' from .. import sim import h5py if (sim.rank == 0): sim.timing('start', 'saveTimeHDF5') sim.compactConnFormat() conns = [([cell.gid] + conn) for cell in sim.net.cells for conn in cell.conns] conns = sim.copyRemoveItemObj(conns, keystart='h', newval=[]) connFormat = (['postGid'] + sim.cfg.compactConnFormat) with h5py.File((sim.cfg.filename + '.h5'), 'w') as hf: hf.create_dataset('conns', data=conns) hf.create_dataset('connsFormat', data=connFormat) if (sim.rank == 0): sim.timing('stop', 'saveTimeHDF5')
def distributedSaveHDF5(): '\n \n\n\n ' from .. import sim import h5py if (sim.rank == 0): sim.timing('start', 'saveTimeHDF5') sim.compactConnFormat() conns = [([cell.gid] + conn) for cell in sim.net.cells for conn in cell.conns] conns = sim.copyRemoveItemObj(conns, keystart='h', newval=[]) connFormat = (['postGid'] + sim.cfg.compactConnFormat) with h5py.File((sim.cfg.filename + '.h5'), 'w') as hf: hf.create_dataset('conns', data=conns) hf.create_dataset('connsFormat', data=connFormat) if (sim.rank == 0): sim.timing('stop', 'saveTimeHDF5')<|docstring|>Function for/to <short description of `netpyne.sim.save.distributedSaveHDF5`><|endoftext|>
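distributedSaveHDF5 writes the node-local compact connectivity into <sim.cfg.filename>.h5 as two datasets, 'conns' and 'connsFormat'. A short read-back sketch with h5py, the only external dependency the body itself imports; 'my_run.h5' is a placeholder for the real filename:

import h5py

with h5py.File('my_run.h5', 'r') as hf:
    fmt = list(hf['connsFormat'][...])    # ['postGid'] + sim.cfg.compactConnFormat (entries may come back as bytes)
    conns = hf['conns'][...]              # one row per connection, columns ordered as in fmt
    print(len(conns), 'connections with columns', fmt)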
51d6669f16669f6fb92ca10876c9bcef4cfac2d755f84d492b0a238a001daf1f
def compactConnFormat(): '\n Function for/to <short description of `netpyne.sim.save.compactConnFormat`>\n\n\n ' from .. import sim if (type(sim.cfg.compactConnFormat) is not list): if (len(sim.net.params.stimTargetParams) > 0): sim.cfg.compactConnFormat = ['preGid', 'preLabel', 'sec', 'loc', 'synMech', 'weight', 'delay'] else: sim.cfg.compactConnFormat = ['preGid', 'sec', 'loc', 'synMech', 'weight', 'delay'] connFormat = sim.cfg.compactConnFormat for cell in sim.net.cells: newConns = [[conn[param] for param in connFormat] for conn in cell.conns] del cell.conns cell.conns = newConns
Function for/to <short description of `netpyne.sim.save.compactConnFormat`>
netpyne/sim/save.py
compactConnFormat
ghafari2019/netpyne
1
python
def compactConnFormat(): '\n \n\n\n ' from .. import sim if (type(sim.cfg.compactConnFormat) is not list): if (len(sim.net.params.stimTargetParams) > 0): sim.cfg.compactConnFormat = ['preGid', 'preLabel', 'sec', 'loc', 'synMech', 'weight', 'delay'] else: sim.cfg.compactConnFormat = ['preGid', 'sec', 'loc', 'synMech', 'weight', 'delay'] connFormat = sim.cfg.compactConnFormat for cell in sim.net.cells: newConns = [[conn[param] for param in connFormat] for conn in cell.conns] del cell.conns cell.conns = newConns
def compactConnFormat(): '\n \n\n\n ' from .. import sim if (type(sim.cfg.compactConnFormat) is not list): if (len(sim.net.params.stimTargetParams) > 0): sim.cfg.compactConnFormat = ['preGid', 'preLabel', 'sec', 'loc', 'synMech', 'weight', 'delay'] else: sim.cfg.compactConnFormat = ['preGid', 'sec', 'loc', 'synMech', 'weight', 'delay'] connFormat = sim.cfg.compactConnFormat for cell in sim.net.cells: newConns = [[conn[param] for param in connFormat] for conn in cell.conns] del cell.conns cell.conns = newConns<|docstring|>Function for/to <short description of `netpyne.sim.save.compactConnFormat`><|endoftext|>
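compactConnFormat replaces every connection dict on every cell with a plain list whose column order is sim.cfg.compactConnFormat. The core transformation is a nested list comprehension; here it is illustrated standalone with a hypothetical connection dict (field values are made up):

connFormat = ['preGid', 'sec', 'loc', 'synMech', 'weight', 'delay']

conn = {'preGid': 42, 'sec': 'soma', 'loc': 0.5, 'synMech': 'AMPA',
        'weight': 0.01, 'delay': 2.0, 'hObj': None}   # keys not listed in connFormat are simply dropped

compact = [conn[param] for param in connFormat]       # same expression as in the loop above
print(compact)   # [42, 'soma', 0.5, 'AMPA', 0.01, 2.0]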
52737175f4c74b2339887f17813e08acd86903bb7835865a97ba017e1bd4442b
def intervalSave(t): '\n Function for/to <short description of `netpyne.sim.save.intervalSave`>\n\n Parameters\n ----------\n t : <type>\n <Short description of t>\n **Default:** *required*\n\n\n ' from .. import sim from ..specs import Dict import pickle, os import numpy as np sim.pc.barrier() if (sim.rank == 0): if hasattr(sim.cfg, 'intervalFolder'): targetFolder = sim.cfg.intervalFolder else: targetFolder = os.path.dirname(sim.cfg.filename) if (targetFolder and (not os.path.exists(targetFolder))): try: os.mkdir(targetFolder) except OSError: print((' Could not create target folder: %s' % targetFolder)) gatherLFP = True simDataVecs = (['spkt', 'spkid', 'stims'] + list(sim.cfg.recordTraces.keys())) singleNodeVecs = ['t'] netPopsCellGids = {popLabel: list(pop.cellGids) for (popLabel, pop) in sim.net.pops.items()} nodeData = {'simData': sim.simData} if hasattr(sim.cfg, 'saveWeights'): if sim.cfg.saveWeights: nodeData['simData']['allWeights'] = sim.allWeights simDataVecs = (simDataVecs + ['allWeights']) data = ([None] * sim.nhosts) data[0] = {} for (k, v) in nodeData.items(): data[0][k] = v gather = sim.pc.py_alltoall(data) sim.pc.barrier() if (sim.rank == 0): print(' Gathering only sim data in master...') sim.allSimData = Dict() for k in list(gather[0]['simData'].keys()): if (gatherLFP and (k == 'LFP')): sim.allSimData[k] = np.zeros(gather[0]['simData']['LFP'].shape) else: sim.allSimData[k] = {} for key in singleNodeVecs: sim.allSimData[key] = list(nodeData['simData'][key]) for node in gather: for (key, val) in node['simData'].items(): if (key in simDataVecs): if isinstance(val, dict): for (cell, val2) in val.items(): if isinstance(val2, dict): sim.allSimData[key].update(Dict({cell: Dict()})) for (stim, val3) in val2.items(): sim.allSimData[key][cell].update({stim: list(val3)}) else: sim.allSimData[key].update({cell: list(val2)}) else: sim.allSimData[key] = (list(sim.allSimData[key]) + list(val)) elif (gatherLFP and (key == 'LFP')): sim.allSimData[key] += np.array(val) elif (key not in singleNodeVecs): sim.allSimData[key].update(val) if (len(sim.allSimData['spkt']) > 0): (sim.allSimData['spkt'], sim.allSimData['spkid']) = zip(*sorted(zip(sim.allSimData['spkt'], sim.allSimData['spkid']))) (sim.allSimData['spkt'], sim.allSimData['spkid']) = (list(sim.allSimData['spkt']), list(sim.allSimData['spkid'])) name = (targetFolder + '/data_{:0.0f}.pkl'.format(t)) with open(name, 'wb') as f: pickle.dump(dict(sim.allSimData), f, protocol=2) for node in gather: if node: node.clear() del node for item in data: if item: item.clear() del item sim.pc.barrier() for (k, v) in sim.simData.items(): if (k in ['spkt', 'spkid', 'stims']): v.resize(0) if hasattr(sim.cfg, 'saveWeights'): if sim.cfg.saveWeights: sim.allWeights = []
Function for/to <short description of `netpyne.sim.save.intervalSave`> Parameters ---------- t : <type> <Short description of t> **Default:** *required*
netpyne/sim/save.py
intervalSave
ghafari2019/netpyne
1
python
def intervalSave(t): '\n Function for/to <short description of `netpyne.sim.save.intervalSave`>\n\n Parameters\n ----------\n t : <type>\n <Short description of t>\n **Default:** *required*\n\n\n ' from .. import sim from ..specs import Dict import pickle, os import numpy as np sim.pc.barrier() if (sim.rank == 0): if hasattr(sim.cfg, 'intervalFolder'): targetFolder = sim.cfg.intervalFolder else: targetFolder = os.path.dirname(sim.cfg.filename) if (targetFolder and (not os.path.exists(targetFolder))): try: os.mkdir(targetFolder) except OSError: print((' Could not create target folder: %s' % targetFolder)) gatherLFP = True simDataVecs = (['spkt', 'spkid', 'stims'] + list(sim.cfg.recordTraces.keys())) singleNodeVecs = ['t'] netPopsCellGids = {popLabel: list(pop.cellGids) for (popLabel, pop) in sim.net.pops.items()} nodeData = {'simData': sim.simData} if hasattr(sim.cfg, 'saveWeights'): if sim.cfg.saveWeights: nodeData['simData']['allWeights'] = sim.allWeights simDataVecs = (simDataVecs + ['allWeights']) data = ([None] * sim.nhosts) data[0] = {} for (k, v) in nodeData.items(): data[0][k] = v gather = sim.pc.py_alltoall(data) sim.pc.barrier() if (sim.rank == 0): print(' Gathering only sim data in master...') sim.allSimData = Dict() for k in list(gather[0]['simData'].keys()): if (gatherLFP and (k == 'LFP')): sim.allSimData[k] = np.zeros(gather[0]['simData']['LFP'].shape) else: sim.allSimData[k] = {} for key in singleNodeVecs: sim.allSimData[key] = list(nodeData['simData'][key]) for node in gather: for (key, val) in node['simData'].items(): if (key in simDataVecs): if isinstance(val, dict): for (cell, val2) in val.items(): if isinstance(val2, dict): sim.allSimData[key].update(Dict({cell: Dict()})) for (stim, val3) in val2.items(): sim.allSimData[key][cell].update({stim: list(val3)}) else: sim.allSimData[key].update({cell: list(val2)}) else: sim.allSimData[key] = (list(sim.allSimData[key]) + list(val)) elif (gatherLFP and (key == 'LFP')): sim.allSimData[key] += np.array(val) elif (key not in singleNodeVecs): sim.allSimData[key].update(val) if (len(sim.allSimData['spkt']) > 0): (sim.allSimData['spkt'], sim.allSimData['spkid']) = zip(*sorted(zip(sim.allSimData['spkt'], sim.allSimData['spkid']))) (sim.allSimData['spkt'], sim.allSimData['spkid']) = (list(sim.allSimData['spkt']), list(sim.allSimData['spkid'])) name = (targetFolder + '/data_{:0.0f}.pkl'.format(t)) with open(name, 'wb') as f: pickle.dump(dict(sim.allSimData), f, protocol=2) for node in gather: if node: node.clear() del node for item in data: if item: item.clear() del item sim.pc.barrier() for (k, v) in sim.simData.items(): if (k in ['spkt', 'spkid', 'stims']): v.resize(0) if hasattr(sim.cfg, 'saveWeights'): if sim.cfg.saveWeights: sim.allWeights = []
def intervalSave(t): '\n Function for/to <short description of `netpyne.sim.save.intervalSave`>\n\n Parameters\n ----------\n t : <type>\n <Short description of t>\n **Default:** *required*\n\n\n ' from .. import sim from ..specs import Dict import pickle, os import numpy as np sim.pc.barrier() if (sim.rank == 0): if hasattr(sim.cfg, 'intervalFolder'): targetFolder = sim.cfg.intervalFolder else: targetFolder = os.path.dirname(sim.cfg.filename) if (targetFolder and (not os.path.exists(targetFolder))): try: os.mkdir(targetFolder) except OSError: print((' Could not create target folder: %s' % targetFolder)) gatherLFP = True simDataVecs = (['spkt', 'spkid', 'stims'] + list(sim.cfg.recordTraces.keys())) singleNodeVecs = ['t'] netPopsCellGids = {popLabel: list(pop.cellGids) for (popLabel, pop) in sim.net.pops.items()} nodeData = {'simData': sim.simData} if hasattr(sim.cfg, 'saveWeights'): if sim.cfg.saveWeights: nodeData['simData']['allWeights'] = sim.allWeights simDataVecs = (simDataVecs + ['allWeights']) data = ([None] * sim.nhosts) data[0] = {} for (k, v) in nodeData.items(): data[0][k] = v gather = sim.pc.py_alltoall(data) sim.pc.barrier() if (sim.rank == 0): print(' Gathering only sim data in master...') sim.allSimData = Dict() for k in list(gather[0]['simData'].keys()): if (gatherLFP and (k == 'LFP')): sim.allSimData[k] = np.zeros(gather[0]['simData']['LFP'].shape) else: sim.allSimData[k] = {} for key in singleNodeVecs: sim.allSimData[key] = list(nodeData['simData'][key]) for node in gather: for (key, val) in node['simData'].items(): if (key in simDataVecs): if isinstance(val, dict): for (cell, val2) in val.items(): if isinstance(val2, dict): sim.allSimData[key].update(Dict({cell: Dict()})) for (stim, val3) in val2.items(): sim.allSimData[key][cell].update({stim: list(val3)}) else: sim.allSimData[key].update({cell: list(val2)}) else: sim.allSimData[key] = (list(sim.allSimData[key]) + list(val)) elif (gatherLFP and (key == 'LFP')): sim.allSimData[key] += np.array(val) elif (key not in singleNodeVecs): sim.allSimData[key].update(val) if (len(sim.allSimData['spkt']) > 0): (sim.allSimData['spkt'], sim.allSimData['spkid']) = zip(*sorted(zip(sim.allSimData['spkt'], sim.allSimData['spkid']))) (sim.allSimData['spkt'], sim.allSimData['spkid']) = (list(sim.allSimData['spkt']), list(sim.allSimData['spkid'])) name = (targetFolder + '/data_{:0.0f}.pkl'.format(t)) with open(name, 'wb') as f: pickle.dump(dict(sim.allSimData), f, protocol=2) for node in gather: if node: node.clear() del node for item in data: if item: item.clear() del item sim.pc.barrier() for (k, v) in sim.simData.items(): if (k in ['spkt', 'spkid', 'stims']): v.resize(0) if hasattr(sim.cfg, 'saveWeights'): if sim.cfg.saveWeights: sim.allWeights = []<|docstring|>Function for/to <short description of `netpyne.sim.save.intervalSave`> Parameters ---------- t : <type> <Short description of t> **Default:** *required*<|endoftext|>
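intervalSave gathers simData across ranks, dumps a snapshot to <targetFolder>/data_<t>.pkl with pickle protocol 2, and then clears the per-node spike vectors so the next interval starts empty. A hedged sketch of reading the snapshots back after a run; 'interval_data' stands in for sim.cfg.intervalFolder:

import glob
import pickle

for name in sorted(glob.glob('interval_data/data_*.pkl')):
    with open(name, 'rb') as f:
        snap = pickle.load(f)    # plain dict: 'spkt', 'spkid', recorded traces, optionally 'allWeights'
    print(name, len(snap.get('spkt', [])), 'spikes in this interval')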
7dec5c6bba34e172ebed71397f681865763f0509d79286712d5f105bb6a5ba78
def saveDataInNodes(filename=None, saveLFP=True, removeTraces=False, dataDir=None): '\n Function to save simulation data by node rather than as a whole\n\n Parameters\n ----------\n filename : str\n The name to use for the saved files.\n **Default:** ``None``\n\n saveLFP : bool\n Whether to save any LFP data.\n **Default:** ``True`` saves LFP data.\n **Options:** ``False`` does not save LFP data.\n\n removeTraces : bool\n Whether to remove traces data before saving.\n **Default:** ``False`` does not remove the trace data.\n **Options:** ``True`` removes the trace data.\n\n dataDir : str\n Name of the directory where data files will be saved.\n **Default:** ``None`` saves to the default directory.\n\n ' from .. import sim from ..specs import Dict, ODict sim.timing('start', 'saveInNodeTime') import os if (not dataDir): dataDir = (sim.cfg.filename + '_node_data') if (not os.path.exists(dataDir)): os.makedirs(dataDir, exist_ok=True) sim.pc.barrier() if (sim.rank == 0): print(('\nSaving an output file for each node in: %s' % dataDir)) dataSave = {} dataSave['netpyne_version'] = sim.version(show=False) dataSave['netpyne_changeset'] = sim.gitChangeset(show=False) simDataVecs = (['spkt', 'spkid', 'stims'] + list(sim.cfg.recordTraces.keys())) singleNodeVecs = ['t'] saveSimData = {} if saveLFP: simData = sim.simData else: simData = {k: v for (k, v) in sim.simData.items() if (k not in ['LFP'])} for k in list(simData.keys()): saveSimData[k] = {} for (key, val) in simData.items(): if (key in simDataVecs): if isinstance(val, dict): for (cell, val2) in val.items(): if isinstance(val2, dict): saveSimData[key].update({cell: {}}) for (stim, val3) in val2.items(): saveSimData[key][cell].update({stim: list(val3)}) else: saveSimData[key].update({cell: list(val2)}) else: saveSimData[key] = (list(saveSimData[key]) + list(val)) elif (key in singleNodeVecs): if (sim.rank == 0): saveSimData[key] = list(val) else: saveSimData[key] = val dataSave['simData'] = saveSimData dataSave['cells'] = sim.net.cells dataSave['pops'] = sim.net.pops if removeTraces: for k in sim.cfg.recordTraces.keys(): del sim.simData[k] if getattr(sim.net.params, 'version', None): dataSave['netParams_version'] = sim.net.params.version if dataSave: if sim.cfg.timestampFilename: timestamp = time() timestampStr = ('-' + datetime.fromtimestamp(timestamp).strftime('%Y%m%d_%H%M%S')) else: timestampStr = '' if (not filename): filename = sim.cfg.filename filePath = (filename + timestampStr) if sim.cfg.savePickle: import pickle dataSave = utils.replaceDictODict(dataSave) fileName = (((filePath + '_node_') + str(sim.rank)) + '.pkl') print((' Saving output as: %s ... ' % fileName)) with open(os.path.join(dataDir, fileName), 'wb') as fileObj: pickle.dump(dataSave, fileObj) if sim.cfg.saveJson: fileName = (((filePath + '_node_') + str(sim.rank)) + '.json') print((' Saving output as: %s ... ' % fileName)) sim.saveJSON(os.path.join(dataDir, fileName), dataSave) sim.pc.barrier() if (sim.rank == 0): if sim.cfg.timing: sim.timing('stop', 'saveInNodeTime') print((' Done; saving time = %0.2f s.' % sim.timingData['saveInNodeTime'])) if (sim.cfg.timing and sim.cfg.saveTiming): import pickle with open('timing.pkl', 'wb') as file: pickle.dump(sim.timing, file)
Function to save simulation data by node rather than as a whole Parameters ---------- filename : str The name to use for the saved files. **Default:** ``None`` saveLFP : bool Whether to save any LFP data. **Default:** ``True`` saves LFP data. **Options:** ``False`` does not save LFP data. removeTraces : bool Whether to remove traces data before saving. **Default:** ``False`` does not remove the trace data. **Options:** ``True`` removes the trace data. dataDir : str Name of the directory where data files will be saved. **Default:** ``None`` saves to the default directory.
netpyne/sim/save.py
saveDataInNodes
ghafari2019/netpyne
1
python
def saveDataInNodes(filename=None, saveLFP=True, removeTraces=False, dataDir=None): '\n Function to save simulation data by node rather than as a whole\n\n Parameters\n ----------\n filename : str\n The name to use for the saved files.\n **Default:** ``None``\n\n saveLFP : bool\n Whether to save any LFP data.\n **Default:** ``True`` saves LFP data.\n **Options:** ``False`` does not save LFP data.\n\n removeTraces : bool\n Whether to remove traces data before saving.\n **Default:** ``False`` does not remove the trace data.\n **Options:** ``True`` removes the trace data.\n\n dataDir : str\n Name of the directory where data files will be saved.\n **Default:** ``None`` saves to the default directory.\n\n ' from .. import sim from ..specs import Dict, ODict sim.timing('start', 'saveInNodeTime') import os if (not dataDir): dataDir = (sim.cfg.filename + '_node_data') if (not os.path.exists(dataDir)): os.makedirs(dataDir, exist_ok=True) sim.pc.barrier() if (sim.rank == 0): print(('\nSaving an output file for each node in: %s' % dataDir)) dataSave = {} dataSave['netpyne_version'] = sim.version(show=False) dataSave['netpyne_changeset'] = sim.gitChangeset(show=False) simDataVecs = (['spkt', 'spkid', 'stims'] + list(sim.cfg.recordTraces.keys())) singleNodeVecs = ['t'] saveSimData = {} if saveLFP: simData = sim.simData else: simData = {k: v for (k, v) in sim.simData.items() if (k not in ['LFP'])} for k in list(simData.keys()): saveSimData[k] = {} for (key, val) in simData.items(): if (key in simDataVecs): if isinstance(val, dict): for (cell, val2) in val.items(): if isinstance(val2, dict): saveSimData[key].update({cell: {}}) for (stim, val3) in val2.items(): saveSimData[key][cell].update({stim: list(val3)}) else: saveSimData[key].update({cell: list(val2)}) else: saveSimData[key] = (list(saveSimData[key]) + list(val)) elif (key in singleNodeVecs): if (sim.rank == 0): saveSimData[key] = list(val) else: saveSimData[key] = val dataSave['simData'] = saveSimData dataSave['cells'] = sim.net.cells dataSave['pops'] = sim.net.pops if removeTraces: for k in sim.cfg.recordTraces.keys(): del sim.simData[k] if getattr(sim.net.params, 'version', None): dataSave['netParams_version'] = sim.net.params.version if dataSave: if sim.cfg.timestampFilename: timestamp = time() timestampStr = ('-' + datetime.fromtimestamp(timestamp).strftime('%Y%m%d_%H%M%S')) else: timestampStr = if (not filename): filename = sim.cfg.filename filePath = (filename + timestampStr) if sim.cfg.savePickle: import pickle dataSave = utils.replaceDictODict(dataSave) fileName = (((filePath + '_node_') + str(sim.rank)) + '.pkl') print((' Saving output as: %s ... ' % fileName)) with open(os.path.join(dataDir, fileName), 'wb') as fileObj: pickle.dump(dataSave, fileObj) if sim.cfg.saveJson: fileName = (((filePath + '_node_') + str(sim.rank)) + '.json') print((' Saving output as: %s ... ' % fileName)) sim.saveJSON(os.path.join(dataDir, fileName), dataSave) sim.pc.barrier() if (sim.rank == 0): if sim.cfg.timing: sim.timing('stop', 'saveInNodeTime') print((' Done; saving time = %0.2f s.' % sim.timingData['saveInNodeTime'])) if (sim.cfg.timing and sim.cfg.saveTiming): import pickle with open('timing.pkl', 'wb') as file: pickle.dump(sim.timing, file)
def saveDataInNodes(filename=None, saveLFP=True, removeTraces=False, dataDir=None): '\n Function to save simulation data by node rather than as a whole\n\n Parameters\n ----------\n filename : str\n The name to use for the saved files.\n **Default:** ``None``\n\n saveLFP : bool\n Whether to save any LFP data.\n **Default:** ``True`` saves LFP data.\n **Options:** ``False`` does not save LFP data.\n\n removeTraces : bool\n Whether to remove traces data before saving.\n **Default:** ``False`` does not remove the trace data.\n **Options:** ``True`` removes the trace data.\n\n dataDir : str\n Name of the directory where data files will be saved.\n **Default:** ``None`` saves to the default directory.\n\n ' from .. import sim from ..specs import Dict, ODict sim.timing('start', 'saveInNodeTime') import os if (not dataDir): dataDir = (sim.cfg.filename + '_node_data') if (not os.path.exists(dataDir)): os.makedirs(dataDir, exist_ok=True) sim.pc.barrier() if (sim.rank == 0): print(('\nSaving an output file for each node in: %s' % dataDir)) dataSave = {} dataSave['netpyne_version'] = sim.version(show=False) dataSave['netpyne_changeset'] = sim.gitChangeset(show=False) simDataVecs = (['spkt', 'spkid', 'stims'] + list(sim.cfg.recordTraces.keys())) singleNodeVecs = ['t'] saveSimData = {} if saveLFP: simData = sim.simData else: simData = {k: v for (k, v) in sim.simData.items() if (k not in ['LFP'])} for k in list(simData.keys()): saveSimData[k] = {} for (key, val) in simData.items(): if (key in simDataVecs): if isinstance(val, dict): for (cell, val2) in val.items(): if isinstance(val2, dict): saveSimData[key].update({cell: {}}) for (stim, val3) in val2.items(): saveSimData[key][cell].update({stim: list(val3)}) else: saveSimData[key].update({cell: list(val2)}) else: saveSimData[key] = (list(saveSimData[key]) + list(val)) elif (key in singleNodeVecs): if (sim.rank == 0): saveSimData[key] = list(val) else: saveSimData[key] = val dataSave['simData'] = saveSimData dataSave['cells'] = sim.net.cells dataSave['pops'] = sim.net.pops if removeTraces: for k in sim.cfg.recordTraces.keys(): del sim.simData[k] if getattr(sim.net.params, 'version', None): dataSave['netParams_version'] = sim.net.params.version if dataSave: if sim.cfg.timestampFilename: timestamp = time() timestampStr = ('-' + datetime.fromtimestamp(timestamp).strftime('%Y%m%d_%H%M%S')) else: timestampStr = if (not filename): filename = sim.cfg.filename filePath = (filename + timestampStr) if sim.cfg.savePickle: import pickle dataSave = utils.replaceDictODict(dataSave) fileName = (((filePath + '_node_') + str(sim.rank)) + '.pkl') print((' Saving output as: %s ... ' % fileName)) with open(os.path.join(dataDir, fileName), 'wb') as fileObj: pickle.dump(dataSave, fileObj) if sim.cfg.saveJson: fileName = (((filePath + '_node_') + str(sim.rank)) + '.json') print((' Saving output as: %s ... ' % fileName)) sim.saveJSON(os.path.join(dataDir, fileName), dataSave) sim.pc.barrier() if (sim.rank == 0): if sim.cfg.timing: sim.timing('stop', 'saveInNodeTime') print((' Done; saving time = %0.2f s.' % sim.timingData['saveInNodeTime'])) if (sim.cfg.timing and sim.cfg.saveTiming): import pickle with open('timing.pkl', 'wb') as file: pickle.dump(sim.timing, file)<|docstring|>Function to save simulation data by node rather than as a whole Parameters ---------- filename : str The name to use for the saved files. **Default:** ``None`` saveLFP : bool Whether to save any LFP data. **Default:** ``True`` saves LFP data. **Options:** ``False`` does not save LFP data. 
removeTraces : bool Whether to remove traces data before saving. **Default:** ``False`` does not remove the trace data. **Options:** ``True`` removes the trace data. dataDir : str Name of the directory where data files will be saved. **Default:** ``None`` saves to the default directory.<|endoftext|>
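saveDataInNodes writes one pickle and/or JSON file per MPI rank into dataDir, named <filename><timestamp>_node_<rank>.<ext>. A sketch of merging the per-node spike data afterwards; the directory and base name are assumptions matching the defaults in the body above:

import glob
import pickle

spkt, spkid = [], []
for name in sorted(glob.glob('my_run_node_data/*_node_*.pkl')):
    with open(name, 'rb') as f:
        node = pickle.load(f)
    spkt.extend(node['simData']['spkt'])
    spkid.extend(node['simData']['spkid'])

# Re-sort the merged spikes by time, mirroring what the gathered path in saveData() does.
if spkt:
    spkt, spkid = map(list, zip(*sorted(zip(spkt, spkid))))
print(len(spkt), 'spikes across all nodes')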
edff29e09386ea64f35aec9745c0faabe1d6f5655e44f06c0ceac7cf16ed62d8
def __init__(self, data: list, merger): '\n 初始化线段树\n __data:\n 区间内的数据\n __tree:\n 构建的线段树\n __merger:\n 自定义的融合规则,通常是 lambda 表达式传来的 function 对象\n ' self.__merger = merger self.__data = data self.__tree = (([None] * len(data)) * 4) self.__build_tree(0, 0, (len(self.__data) - 1))
初始化线段树 __data: 区间内的数据 __tree: 构建的线段树 __merger: 自定义的融合规则,通常是 lambda 表达式传来的 function 对象
datastruct/segment_tree/SegmentTree.py
__init__
LibertyDream/algorithm_data_structure
0
python
def __init__(self, data: list, merger): '\n 初始化线段树\n __data:\n 区间内的数据\n __tree:\n 构建的线段树\n __merger:\n 自定义的融合规则,通常是 lambda 表达式传来的 function 对象\n ' self.__merger = merger self.__data = data self.__tree = (([None] * len(data)) * 4) self.__build_tree(0, 0, (len(self.__data) - 1))
def __init__(self, data: list, merger): '\n 初始化线段树\n __data:\n 区间内的数据\n __tree:\n 构建的线段树\n __merger:\n 自定义的融合规则,通常是 lambda 表达式传来的 function 对象\n ' self.__merger = merger self.__data = data self.__tree = (([None] * len(data)) * 4) self.__build_tree(0, 0, (len(self.__data) - 1))<|docstring|>初始化线段树 __data: 区间内的数据 __tree: 构建的线段树 __merger: 自定义的融合规则,通常是 lambda 表达式传来的 function 对象<|endoftext|>
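The Chinese docstrings in this and the following SegmentTree records describe building the tree from a data list plus a merger function, range queries over [queryL, queryR], and point updates. A minimal usage sketch, assuming the class in datastruct/segment_tree/SegmentTree.py is named SegmentTree, is importable as below, and that the private __build_tree called in __init__ fills the node array as usual:

from datastruct.segment_tree.SegmentTree import SegmentTree   # assumed import path

tree = SegmentTree([1, 3, 5, 7], lambda a, b: a + b)   # the merger decides how two child ranges combine

print(tree.query(0, 2))   # 1 + 3 + 5 = 9
tree.update(1, 10)        # data[1]: 3 -> 10
print(tree.query(0, 2))   # 1 + 10 + 5 = 16

max_tree = SegmentTree([1, 3, 5, 7], lambda a, b: max(a, b))   # a max tree only needs a different merger
print(max_tree.query(1, 3))   # 7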
f820ea559427a87d88fb39dda564ad56b66ce27b89fa0a0325a3f83a0bd1d3c7
def query(self, queryL: int, queryR: int): '\n 查询 [queryL...queryR] 范围内的内容\n ' if ((queryL < 0) or (queryL >= len(self.__data)) or (queryR < 0) or (queryR >= len(self.__data))): raise IndexError('Index is illegal') return self.__query(0, 0, (len(self.__data) - 1), queryL, queryR)
查询 [queryL...queryR] 范围内的内容
datastruct/segment_tree/SegmentTree.py
query
LibertyDream/algorithm_data_structure
0
python
def query(self, queryL: int, queryR: int): '\n \n ' if ((queryL < 0) or (queryL >= len(self.__data)) or (queryR < 0) or (queryR >= len(self.__data))): raise IndexError('Index is illegal') return self.__query(0, 0, (len(self.__data) - 1), queryL, queryR)
def query(self, queryL: int, queryR: int): '\n \n ' if ((queryL < 0) or (queryL >= len(self.__data)) or (queryR < 0) or (queryR >= len(self.__data))): raise IndexError('Index is illegal') return self.__query(0, 0, (len(self.__data) - 1), queryL, queryR)<|docstring|>查询 [queryL...queryR] 范围内的内容<|endoftext|>
6361f8cc61fe2501ea265ab0208c356e2b1bd4784c3b28aa565e10fb10a81d3d
def __query(self, tree_index: int, left, right, queryL: int, queryR: int): '\n 查询以 treeIndex 为根,[left,right] 为界,目标范围为 [queryL,queryR] 中的内容\n ' if ((left == queryL) and (right == queryR)): return self.__tree[tree_index] mid = (left + ((right - left) // 2)) left_index = self.left_child(tree_index) right_index = self.right_child(tree_index) if (queryL > mid): return self.__query(right_index, (mid + 1), right, queryL, queryR) elif (queryR <= mid): return self.__query(left_index, left, mid, queryL, queryR) left_res = self.__query(left_index, left, mid, queryL, mid) right_res = self.__query(right_index, (mid + 1), right, (mid + 1), queryR) return self.__merger(left_res, right_res)
查询以 treeIndex 为根,[left,right] 为界,目标范围为 [queryL,queryR] 中的内容
datastruct/segment_tree/SegmentTree.py
__query
LibertyDream/algorithm_data_structure
0
python
def __query(self, tree_index: int, left, right, queryL: int, queryR: int): '\n \n ' if ((left == queryL) and (right == queryR)): return self.__tree[tree_index] mid = (left + ((right - left) // 2)) left_index = self.left_child(tree_index) right_index = self.right_child(tree_index) if (queryL > mid): return self.__query(right_index, (mid + 1), right, queryL, queryR) elif (queryR <= mid): return self.__query(left_index, left, mid, queryL, queryR) left_res = self.__query(left_index, left, mid, queryL, mid) right_res = self.__query(right_index, (mid + 1), right, (mid + 1), queryR) return self.__merger(left_res, right_res)
def __query(self, tree_index: int, left, right, queryL: int, queryR: int): '\n \n ' if ((left == queryL) and (right == queryR)): return self.__tree[tree_index] mid = (left + ((right - left) // 2)) left_index = self.left_child(tree_index) right_index = self.right_child(tree_index) if (queryL > mid): return self.__query(right_index, (mid + 1), right, queryL, queryR) elif (queryR <= mid): return self.__query(left_index, left, mid, queryL, queryR) left_res = self.__query(left_index, left, mid, queryL, mid) right_res = self.__query(right_index, (mid + 1), right, (mid + 1), queryR) return self.__merger(left_res, right_res)<|docstring|>查询以 treeIndex 为根,[left,right] 为界,目标范围为 [queryL,queryR] 中的内容<|endoftext|>
aa8445fe14d785f14bf9bb76d358fe4fa0223e9a0b75b907b149121322779d5b
def update(self, index: int, ele): '\n 更新 index 处的值为 ele\n ' if ((index < 0) or (index >= len(self.__data))): raise IndexError('Invalid index') self.__data[index] = ele self.__update(0, 0, (len(self.__data) - 1), index, ele)
更新 index 处的值为 ele
datastruct/segment_tree/SegmentTree.py
update
LibertyDream/algorithm_data_structure
0
python
def update(self, index: int, ele): '\n \n ' if ((index < 0) or (index >= len(self.__data))): raise IndexError('Invalid index') self.__data[index] = ele self.__update(0, 0, (len(self.__data) - 1), index, ele)
def update(self, index: int, ele): '\n \n ' if ((index < 0) or (index >= len(self.__data))): raise IndexError('Invalid index') self.__data[index] = ele self.__update(0, 0, (len(self.__data) - 1), index, ele)<|docstring|>更新 index 处的值为 ele<|endoftext|>
ab3a44092065a6f5bebe46144ff9c794c1aa0936b8df57340dd598de66db051d
def __update(self, tree_index: int, left: int, right: int, index: int, ele): '\n 在以 treeIndex 为根的 [left,right] 的区间内更新 index 处的值为 e\n ' if (left == right): self.__tree[tree_index] = ele return mid = (left + ((right - left) // 2)) left_index = self.left_child(tree_index) right_index = self.right_child(tree_index) if (index > mid): self.__update(right_index, (mid + 1), right, index, ele) else: self.__update(left_index, left, mid, index, ele) self.__tree[tree_index] = self.__merger(self.__tree[left_index], self.__tree[right_index])
在以 treeIndex 为根的 [left,right] 的区间内更新 index 处的值为 e
datastruct/segment_tree/SegmentTree.py
__update
LibertyDream/algorithm_data_structure
0
python
def __update(self, tree_index: int, left: int, right: int, index: int, ele): '\n \n ' if (left == right): self.__tree[tree_index] = ele return mid = (left + ((right - left) // 2)) left_index = self.left_child(tree_index) right_index = self.right_child(tree_index) if (index > mid): self.__update(right_index, (mid + 1), right, index, ele) else: self.__update(left_index, left, mid, index, ele) self.__tree[tree_index] = self.__merger(self.__tree[left_index], self.__tree[right_index])
def __update(self, tree_index: int, left: int, right: int, index: int, ele): '\n \n ' if (left == right): self.__tree[tree_index] = ele return mid = (left + ((right - left) // 2)) left_index = self.left_child(tree_index) right_index = self.right_child(tree_index) if (index > mid): self.__update(right_index, (mid + 1), right, index, ele) else: self.__update(left_index, left, mid, index, ele) self.__tree[tree_index] = self.__merger(self.__tree[left_index], self.__tree[right_index])<|docstring|>在以 treeIndex 为根的 [left,right] 的区间内更新 index 处的值为 e<|endoftext|>
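__query and __update both call self.left_child and self.right_child, which are not included in these records. In the usual 0-indexed, array-backed layout they would be the one-liners below; this is an assumption about the rest of the class, not code taken from it:

def left_child(self, index: int) -> int:
    # Left child of node `index` in a 0-indexed array-backed binary tree.
    return 2 * index + 1

def right_child(self, index: int) -> int:
    # Right child of node `index`.
    return 2 * index + 2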
8841aa2f5f7cfe54440513b9b897ea05edf2ca51c30ed25be30f32bf26c39bb9
def rnacentral_id(context: Context, entry: HgncEntry) -> ty.Optional[str]: '\n Map HGNC ncRNAs to RNAcentral using RefSeq, Vega, gtRNAdb accessions\n and sequence matches.\n ' if entry.refseq_id: refseq_based = helpers.refseq_id_to_urs(context, entry.refseq_id) if refseq_based: return refseq_based if entry.gtrnadb_id: gtrnadb_id = entry.gtrnadb_id if gtrnadb_id: return helpers.gtrnadb_to_urs(context, gtrnadb_id) return None elif entry.ensembl_gene_id: gene = entry.ensembl_gene_id fasta = helpers.ensembl_sequence(context, gene) if (not fasta): return None md5_hash = helpers.md5(fasta) urs = helpers.md5_to_urs(context, md5_hash) if urs: return urs return helpers.ensembl_gene_to_urs(context, gene) LOGGER.info('Cannot map %s', entry) return None
Map HGNC ncRNAs to RNAcentral using RefSeq, Vega, gtRNAdb accessions and sequence matches.
rnacentral_pipeline/databases/hgnc/parser.py
rnacentral_id
RNAcentral/rnacentral-import-pipeline
1
python
def rnacentral_id(context: Context, entry: HgncEntry) -> ty.Optional[str]: '\n Map HGNC ncRNAs to RNAcentral using RefSeq, Vega, gtRNAdb accessions\n and sequence matches.\n ' if entry.refseq_id: refseq_based = helpers.refseq_id_to_urs(context, entry.refseq_id) if refseq_based: return refseq_based if entry.gtrnadb_id: gtrnadb_id = entry.gtrnadb_id if gtrnadb_id: return helpers.gtrnadb_to_urs(context, gtrnadb_id) return None elif entry.ensembl_gene_id: gene = entry.ensembl_gene_id fasta = helpers.ensembl_sequence(context, gene) if (not fasta): return None md5_hash = helpers.md5(fasta) urs = helpers.md5_to_urs(context, md5_hash) if urs: return urs return helpers.ensembl_gene_to_urs(context, gene) LOGGER.info('Cannot map %s', entry) return None
def rnacentral_id(context: Context, entry: HgncEntry) -> ty.Optional[str]: '\n Map HGNC ncRNAs to RNAcentral using RefSeq, Vega, gtRNAdb accessions\n and sequence matches.\n ' if entry.refseq_id: refseq_based = helpers.refseq_id_to_urs(context, entry.refseq_id) if refseq_based: return refseq_based if entry.gtrnadb_id: gtrnadb_id = entry.gtrnadb_id if gtrnadb_id: return helpers.gtrnadb_to_urs(context, gtrnadb_id) return None elif entry.ensembl_gene_id: gene = entry.ensembl_gene_id fasta = helpers.ensembl_sequence(context, gene) if (not fasta): return None md5_hash = helpers.md5(fasta) urs = helpers.md5_to_urs(context, md5_hash) if urs: return urs return helpers.ensembl_gene_to_urs(context, gene) LOGGER.info('Cannot map %s', entry) return None<|docstring|>Map HGNC ncRNAs to RNAcentral using RefSeq, Vega, gtRNAdb accessions and sequence matches.<|endoftext|>
f0e31a426fb07b3e9817fd4fc01bdf56e4cc18fff7518a7a087ea1ac5599fce2
def test_basic(self) -> None: 'Test to create projectConfig class.' cfg = ProjectConfig(database='mydb', type='SQLite3 (SQLITE3)') self.assertTrue(cfg) self.assertEqual(cfg.SAVE_VERSION, VERSION_1_2)
Test to create projectConfig class.
pineboolib/loader/tests/test_projectconfig.py
test_basic
Aulla/pineboo
2
python
def test_basic(self) -> None: cfg = ProjectConfig(database='mydb', type='SQLite3 (SQLITE3)') self.assertTrue(cfg) self.assertEqual(cfg.SAVE_VERSION, VERSION_1_2)
def test_basic(self) -> None: cfg = ProjectConfig(database='mydb', type='SQLite3 (SQLITE3)') self.assertTrue(cfg) self.assertEqual(cfg.SAVE_VERSION, VERSION_1_2)<|docstring|>Test to create projectConfig class.<|endoftext|>
e4610f5cc717edfcabc423cacc07c27481c551876b198e794f081b4cf531d0e2
def test_read_write(self) -> None: 'Test that we can read a file, save it back, read it again and stays the same.' project_test1 = fixture_read('project_test1.xml') with tempfile.TemporaryDirectory() as tmpdirname: cfg = ProjectConfig(database='mydb', type='SQLite3 (SQLITE3)', filename=os.path.join(tmpdirname, 'test.xml')) cfg.SAVE_VERSION = VERSION_1_1 cfg.save_projectxml(False) self.assertEqual(open(cfg.filename).read(), project_test1) cfg2 = ProjectConfig(load_xml=cfg.filename) cfg2.SAVE_VERSION = cfg2.version cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test1)
Test that we can read a file, save it back, read it again and stays the same.
pineboolib/loader/tests/test_projectconfig.py
test_read_write
Aulla/pineboo
2
python
def test_read_write(self) -> None: project_test1 = fixture_read('project_test1.xml') with tempfile.TemporaryDirectory() as tmpdirname: cfg = ProjectConfig(database='mydb', type='SQLite3 (SQLITE3)', filename=os.path.join(tmpdirname, 'test.xml')) cfg.SAVE_VERSION = VERSION_1_1 cfg.save_projectxml(False) self.assertEqual(open(cfg.filename).read(), project_test1) cfg2 = ProjectConfig(load_xml=cfg.filename) cfg2.SAVE_VERSION = cfg2.version cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test1)
def test_read_write(self) -> None: project_test1 = fixture_read('project_test1.xml') with tempfile.TemporaryDirectory() as tmpdirname: cfg = ProjectConfig(database='mydb', type='SQLite3 (SQLITE3)', filename=os.path.join(tmpdirname, 'test.xml')) cfg.SAVE_VERSION = VERSION_1_1 cfg.save_projectxml(False) self.assertEqual(open(cfg.filename).read(), project_test1) cfg2 = ProjectConfig(load_xml=cfg.filename) cfg2.SAVE_VERSION = cfg2.version cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test1)<|docstring|>Test that we can read a file, save it back, read it again and stays the same.<|endoftext|>
61881d36da1311389b4105b9bdad06d5bdb541056b960e13babc80fc3e0fb23a
@patch('time.time') @patch('os.urandom') def test_read_write2(self, mock_urandom: Mock, mock_time: Mock) -> None: 'Test we can read and write and stays equal (slightly more complicated).' mock_urandom.side_effect = (lambda n: b'1234567890123456789012345678901234567890'[:n]) mock_time.side_effect = (lambda : 10000) project_test2 = fixture_read('project_test2.xml') project_test3 = fixture_read('project_test3.xml') with tempfile.TemporaryDirectory() as tmpdirname: cfg = ProjectConfig(database='postgres_testdb', description='Postgres Test DB', type='PostgreSQL (PSYCOPG2)', host='192.168.1.101', port=5432, username='postgres', password='postgrespassword', project_password='myhardtoguesspassword', filename=os.path.join(tmpdirname, 'test.xml')) cfg.SAVE_VERSION = VERSION_1_1 cfg.save_projectxml(False) self.assertEqual(open(cfg.filename).read(), project_test2) with self.assertRaises(PasswordMismatchError): ProjectConfig(load_xml=cfg.filename, project_password='wrongpassword') cfg2 = ProjectConfig(load_xml=cfg.filename, project_password='myhardtoguesspassword') cfg2.SAVE_VERSION = cfg2.version cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test2) cfg2.SAVE_VERSION = VERSION_1_2 cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test3) with self.assertRaises(PasswordMismatchError): ProjectConfig(load_xml=cfg2.filename, project_password='wrongpassword') cfg3 = ProjectConfig(load_xml=cfg2.filename, project_password='myhardtoguesspassword') cfg3.SAVE_VERSION = VERSION_1_2 cfg3.save_projectxml(True) self.assertEqual(open(cfg3.filename).read(), project_test3)
Test we can read and write and stays equal (slightly more complicated).
pineboolib/loader/tests/test_projectconfig.py
test_read_write2
Aulla/pineboo
2
python
@patch('time.time') @patch('os.urandom') def test_read_write2(self, mock_urandom: Mock, mock_time: Mock) -> None: mock_urandom.side_effect = (lambda n: b'1234567890123456789012345678901234567890'[:n]) mock_time.side_effect = (lambda : 10000) project_test2 = fixture_read('project_test2.xml') project_test3 = fixture_read('project_test3.xml') with tempfile.TemporaryDirectory() as tmpdirname: cfg = ProjectConfig(database='postgres_testdb', description='Postgres Test DB', type='PostgreSQL (PSYCOPG2)', host='192.168.1.101', port=5432, username='postgres', password='postgrespassword', project_password='myhardtoguesspassword', filename=os.path.join(tmpdirname, 'test.xml')) cfg.SAVE_VERSION = VERSION_1_1 cfg.save_projectxml(False) self.assertEqual(open(cfg.filename).read(), project_test2) with self.assertRaises(PasswordMismatchError): ProjectConfig(load_xml=cfg.filename, project_password='wrongpassword') cfg2 = ProjectConfig(load_xml=cfg.filename, project_password='myhardtoguesspassword') cfg2.SAVE_VERSION = cfg2.version cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test2) cfg2.SAVE_VERSION = VERSION_1_2 cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test3) with self.assertRaises(PasswordMismatchError): ProjectConfig(load_xml=cfg2.filename, project_password='wrongpassword') cfg3 = ProjectConfig(load_xml=cfg2.filename, project_password='myhardtoguesspassword') cfg3.SAVE_VERSION = VERSION_1_2 cfg3.save_projectxml(True) self.assertEqual(open(cfg3.filename).read(), project_test3)
@patch('time.time') @patch('os.urandom') def test_read_write2(self, mock_urandom: Mock, mock_time: Mock) -> None: mock_urandom.side_effect = (lambda n: b'1234567890123456789012345678901234567890'[:n]) mock_time.side_effect = (lambda : 10000) project_test2 = fixture_read('project_test2.xml') project_test3 = fixture_read('project_test3.xml') with tempfile.TemporaryDirectory() as tmpdirname: cfg = ProjectConfig(database='postgres_testdb', description='Postgres Test DB', type='PostgreSQL (PSYCOPG2)', host='192.168.1.101', port=5432, username='postgres', password='postgrespassword', project_password='myhardtoguesspassword', filename=os.path.join(tmpdirname, 'test.xml')) cfg.SAVE_VERSION = VERSION_1_1 cfg.save_projectxml(False) self.assertEqual(open(cfg.filename).read(), project_test2) with self.assertRaises(PasswordMismatchError): ProjectConfig(load_xml=cfg.filename, project_password='wrongpassword') cfg2 = ProjectConfig(load_xml=cfg.filename, project_password='myhardtoguesspassword') cfg2.SAVE_VERSION = cfg2.version cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test2) cfg2.SAVE_VERSION = VERSION_1_2 cfg2.save_projectxml(True) self.assertEqual(open(cfg2.filename).read(), project_test3) with self.assertRaises(PasswordMismatchError): ProjectConfig(load_xml=cfg2.filename, project_password='wrongpassword') cfg3 = ProjectConfig(load_xml=cfg2.filename, project_password='myhardtoguesspassword') cfg3.SAVE_VERSION = VERSION_1_2 cfg3.save_projectxml(True) self.assertEqual(open(cfg3.filename).read(), project_test3)<|docstring|>Test we can read and write and stays equal (slightly more complicated).<|endoftext|>
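test_read_write2 freezes os.urandom and time.time so that the password salt and timestamp embedded in the saved XML are deterministic, which is what makes the byte-for-byte fixture comparison possible. The same unittest.mock pattern in isolation:

from unittest.mock import patch
import os, time

with patch('os.urandom', side_effect=lambda n: b'1234567890123456789012345678901234567890'[:n]), \
     patch('time.time', return_value=10000):
    print(os.urandom(4))   # b'1234'  - deterministic "random" bytes
    print(time.time())     # 10000    - deterministic clock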
b9c337bef1270bd4274fe98fd77ee41df6f2f92a27fa902edeab811b629c984e
def check_experimenter_input(js): '\n Valida el input de entrada.\n\n Args:\n js (dict): Diccionario con el json parseado.\n ' experimenter_schema = get_full_schema() try: jsonschema.validate(js, experimenter_schema) except jsonschema.ValidationError as err: return models.VerificationResult(False, err.schema['error_msg']) dataset = get_dataset(js['datasets']) task = dataset.tags['task'] if (js['task'] != task): return models.VerificationResult(False, f'''La task entregada es distinta con la del dataset: {js['task']} no es {task}''') return models.VerificationResult(True)
Valida el input de entrada. Args: js (dict): Diccionario con el json parseado.
experimenter/utils.py
check_experimenter_input
imfd/TextPerimenter
1
python
def check_experimenter_input(js): '\n Valida el input de entrada.\n\n Args:\n js (dict): Diccionario con el json parseado.\n ' experimenter_schema = get_full_schema() try: jsonschema.validate(js, experimenter_schema) except jsonschema.ValidationError as err: return models.VerificationResult(False, err.schema['error_msg']) dataset = get_dataset(js['datasets']) task = dataset.tags['task'] if (js['task'] != task): return models.VerificationResult(False, f'La task entregada es distinta con la del dataset: {js['task']} no es {task}') return models.VerificationResult(True)
def check_experimenter_input(js): '\n Valida el input de entrada.\n\n Args:\n js (dict): Diccionario con el json parseado.\n ' experimenter_schema = get_full_schema() try: jsonschema.validate(js, experimenter_schema) except jsonschema.ValidationError as err: return models.VerificationResult(False, err.schema['error_msg']) dataset = get_dataset(js['datasets']) task = dataset.tags['task'] if (js['task'] != task): return models.VerificationResult(False, f'La task entregada es distinta con la del dataset: {js['task']} no es {task}') return models.VerificationResult(True)<|docstring|>Valida el input de entrada. Args: js (dict): Diccionario con el json parseado.<|endoftext|>
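check_experimenter_input validates the parsed JSON against a schema (the Spanish docstring reads roughly "validates the incoming input") and then checks that js['task'] matches the dataset's task tag. The error path reads err.schema['error_msg'], which assumes every subschema carries a custom error_msg key; a minimal standalone illustration of that pattern with jsonschema:

import jsonschema

schema = {
    'type': 'object',
    'properties': {
        'task': {'type': 'string', 'error_msg': 'task must be a string'},
    },
    'required': ['task'],
    'error_msg': 'input must be an object with a task field',
}

try:
    jsonschema.validate({'task': 123}, schema)
except jsonschema.ValidationError as err:
    # err.schema is the subschema that failed validation, here the 'task' property schema.
    print(err.schema.get('error_msg', err.message))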
dc4b0a22ead3501ce673e543b8d925347e43c8fea96c3ef7b29a7eddc4367967
def creator(models_dic, metrics_dic): '\n Llama a las funciones creadoras de modelos y métricas.\n\n Args:\n models_dic (dict): Diccionario de modelos ingresados.\n metrics_dic (dict): Diccionario de métricas ingresadas.\n\n Returns:\n tuple: Tupla cuyo primer elemento corresponde a la lista de modelos\n instanciados, y el segundo elemento es una lista de métricas\n instanciadas.\n ' return (models_creator(models_dic), metrics_creator(metrics_dic))
Llama a las funciones creadoras de modelos y métricas. Args: models_dic (dict): Diccionario de modelos ingresados. metrics_dic (dict): Diccionario de métricas ingresadas. Returns: tuple: Tupla cuyo primer elemento corresponde a la lista de modelos instanciados, y el segundo elemento es una lista de métricas instanciadas.
experimenter/utils.py
creator
imfd/TextPerimenter
1
python
def creator(models_dic, metrics_dic): '\n Llama a las funciones creadoras de modelos y métricas.\n\n Args:\n models_dic (dict): Diccionario de modelos ingresados.\n metrics_dic (dict): Diccionario de métricas ingresadas.\n\n Returns:\n tuple: Tupla cuyo primer elemento corresponde a la lista de modelos\n instanciados, y el segundo elemento es una lista de métricas\n instanciadas.\n ' return (models_creator(models_dic), metrics_creator(metrics_dic))
def creator(models_dic, metrics_dic): '\n Llama a las funciones creadoras de modelos y métricas.\n\n Args:\n models_dic (dict): Diccionario de modelos ingresados.\n metrics_dic (dict): Diccionario de métricas ingresadas.\n\n Returns:\n tuple: Tupla cuyo primer elemento corresponde a la lista de modelos\n instanciados, y el segundo elemento es una lista de métricas\n instanciadas.\n ' return (models_creator(models_dic), metrics_creator(metrics_dic))<|docstring|>Llama a las funciones creadoras de modelos y métricas. Args: models_dic (dict): Diccionario de modelos ingresados. metrics_dic (dict): Diccionario de métricas ingresadas. Returns: tuple: Tupla cuyo primer elemento corresponde a la lista de modelos instanciados, y el segundo elemento es una lista de métricas instanciadas.<|endoftext|>
6973ca05b195ede8ba60cc34d67d14203fbcaa9e7c9c0c4f751f3fc8f029ec1b
def filter_models(ranked_models, n): '\n Retorna los n primeros reportes de modelos.\n\n Args:\n ranked_models (list): Reportes de modelos ordenados.\n n (int): Cantidad de modelos a escoger.\n\n Returns:\n list: Lista de n reportes de modelos ordenados.\n ' if (n == 0): return [[model['model'] for model in report] for report in ranked_models] return [[model['model'] for model in report[:n]] for report in ranked_models]
Retorna los n primeros reportes de modelos. Args: ranked_models (list): Reportes de modelos ordenados. n (int): Cantidad de modelos a escoger. Returns: list: Lista de n reportes de modelos ordenados.
experimenter/utils.py
filter_models
imfd/TextPerimenter
1
python
def filter_models(ranked_models, n): '\n Retorna los n primeros reportes de modelos.\n\n Args:\n ranked_models (list): Reportes de modelos ordenados.\n n (int): Cantidad de modelos a escoger.\n\n Returns:\n list: Lista de n reportes de modelos ordenados.\n ' if (n == 0): return [[model['model'] for model in report] for report in ranked_models] return [[model['model'] for model in report[:n]] for report in ranked_models]
def filter_models(ranked_models, n): '\n Retorna los n primeros reportes de modelos.\n\n Args:\n ranked_models (list): Reportes de modelos ordenados.\n n (int): Cantidad de modelos a escoger.\n\n Returns:\n list: Lista de n reportes de modelos ordenados.\n ' if (n == 0): return [[model['model'] for model in report] for report in ranked_models] return [[model['model'] for model in report[:n]] for report in ranked_models]<|docstring|>Retorna los n primeros reportes de modelos. Args: ranked_models (list): Reportes de modelos ordenados. n (int): Cantidad de modelos a escoger. Returns: list: Lista de n reportes de modelos ordenados.<|endoftext|>
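filter_models expects ranked_models to be a list of reports, each report an already-sorted list of dicts with at least a 'model' key, and treats n == 0 as "keep everything". A tiny illustration with placeholder strings standing in for fitted model objects (the function is repeated here only so the snippet runs on its own):

def filter_models(ranked_models, n):
    if n == 0:
        return [[m['model'] for m in report] for report in ranked_models]
    return [[m['model'] for m in report[:n]] for report in ranked_models]

ranked_models = [
    [{'model': 'svm_a', 'f1': 0.91}, {'model': 'svm_b', 'f1': 0.88}, {'model': 'nb', 'f1': 0.80}],
    [{'model': 'lr', 'f1': 0.85}, {'model': 'rf', 'f1': 0.83}],
]

print(filter_models(ranked_models, 0))   # [['svm_a', 'svm_b', 'nb'], ['lr', 'rf']]
print(filter_models(ranked_models, 1))   # [['svm_a'], ['lr']]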
7053215e44b30726254e35ee2a905960000c8d4e1690e73950d4edb0698e97e1
def get_optimizer_params(metrics_dic): '\n Función auxiliar, para obtener directamente los parámetros\n del optimizador.\n\n Args:\n metrics_dic (dict): Diccionario de métricas entregado en el input.\n\n Returns:\n tuple: Tupla de tres elementos. El primero corresponde a la label\n según la cual se optimizará, mientras que el segundo es la\n métrica con respecto a la que se realizará dicha optimización.\n El tercero corresponde al número de modelos que se quiere\n rescatar.\n ' opt_label = str(metrics_dic['optimizer_label']) opt_metric = metrics_dic['optimizer_metric'] best_n = metrics_dic['best_n'] return (opt_label, opt_metric, best_n)
Función auxiliar, para obtener directamente los parámetros del optimizador. Args: metrics_dic (dict): Diccionario de métricas entregado en el input. Returns: tuple: Tupla de tres elementos. El primero corresponde a la label según la cual se optimizará, mientras que el segundo es la métrica con respecto a la que se realizará dicha optimización. El tercero corresponde al número de modelos que se quiere rescatar.
experimenter/utils.py
get_optimizer_params
imfd/TextPerimenter
1
python
def get_optimizer_params(metrics_dic): '\n Función auxiliar, para obtener directamente los parámetros\n del optimizador.\n\n Args:\n metrics_dic (dict): Diccionario de métricas entregado en el input.\n\n Returns:\n tuple: Tupla de tres elementos. El primero corresponde a la label\n según la cual se optimizará, mientras que el segundo es la\n métrica con respecto a la que se realizará dicha optimización.\n El tercero corresponde al número de modelos que se quiere\n rescatar.\n ' opt_label = str(metrics_dic['optimizer_label']) opt_metric = metrics_dic['optimizer_metric'] best_n = metrics_dic['best_n'] return (opt_label, opt_metric, best_n)
def get_optimizer_params(metrics_dic): '\n Función auxiliar, para obtener directamente los parámetros\n del optimizador.\n\n Args:\n metrics_dic (dict): Diccionario de métricas entregado en el input.\n\n Returns:\n tuple: Tupla de tres elementos. El primero corresponde a la label\n según la cual se optimizará, mientras que el segundo es la\n métrica con respecto a la que se realizará dicha optimización.\n El tercero corresponde al número de modelos que se quiere\n rescatar.\n ' opt_label = str(metrics_dic['optimizer_label']) opt_metric = metrics_dic['optimizer_metric'] best_n = metrics_dic['best_n'] return (opt_label, opt_metric, best_n)<|docstring|>Función auxiliar, para obtener directamente los parámetros del optimizador. Args: metrics_dic (dict): Diccionario de métricas entregado en el input. Returns: tuple: Tupla de tres elementos. El primero corresponde a la label según la cual se optimizará, mientras que el segundo es la métrica con respecto a la que se realizará dicha optimización. El tercero corresponde al número de modelos que se quiere rescatar.<|endoftext|>
9e394327049d76181fdb69d1764d1b17cf0dcec0b2e96af036d7ac0089a0d27d
def metrics_creator(list_metrics): '\n Entrega las clases de las métricas introducidas.\n\n Args:\n list_metrics (list): Lista de strings con los nombres de las métricas.\n\n Returns:\n list: Lista de las clases de las métricas.\n ' metrics_mapping = get_metrics_mapping() return [metrics_mapping[met] for met in list_metrics]
Entrega las clases de las métricas introducidas. Args: list_metrics (list): Lista de strings con los nombres de las métricas. Returns: list: Lista de las clases de las métricas.
experimenter/utils.py
metrics_creator
imfd/TextPerimenter
1
python
def metrics_creator(list_metrics): '\n Entrega las clases de las métricas introducidas.\n\n Args:\n list_metrics (list): Lista de strings con los nombres de las métricas.\n\n Returns:\n list: Lista de las clases de las métricas.\n ' metrics_mapping = get_metrics_mapping() return [metrics_mapping[met] for met in list_metrics]
def metrics_creator(list_metrics): '\n Entrega las clases de las métricas introducidas.\n\n Args:\n list_metrics (list): Lista de strings con los nombres de las métricas.\n\n Returns:\n list: Lista de las clases de las métricas.\n ' metrics_mapping = get_metrics_mapping() return [metrics_mapping[met] for met in list_metrics]<|docstring|>Entrega las clases de las métricas introducidas. Args: list_metrics (list): Lista de strings con los nombres de las métricas. Returns: list: Lista de las clases de las métricas.<|endoftext|>
d6549024f8373c2f57ba790132afbdd4998d2f7962a7e24dafd6c10685e93f87
def models_creator(dic_models): '\n Inicializa los modelos.\n\n Args:\n dic_models (dict): Diccionario de modelos entregado en el input.\n\n Returns:\n list: Lista de modelos instanciados.\n ' models_mapping = get_models_mapping() models = [] for model_info in dic_models: model_class = [] model_params = model_info.get('params', None) if (model_params is not None): model_params_grid = list(ParameterGrid(model_info['params'])) else: model_params_grid = [{}] preprocesses = model_info.get('preprocesses', []) if (len(preprocesses) == 0): for model_parameter_combination in model_params_grid: model_dic = {'params': model_parameter_combination} model = models_mapping[model_info['name']](**model_dic) model_class.append(model) else: for preprocess_info in preprocesses: model_dic = {} preprocess_dic = {} tokenizers = [] tokenizers_info = preprocess_info.get('tokenizers', []) for tokenizer in tokenizers_info: tokenizers.append(tokenizers_mapping[tokenizer]()) preprocess_dic['tokenizers'] = tokenizers preprocess_params = preprocess_info.get('params', None) if (preprocess_params is not None): preprocess_params_grid = list(ParameterGrid(preprocess_params)) else: preprocess_params_grid = [{}] for model_parameter_combination in model_params_grid: for preprocess_parameter_combination in preprocess_params_grid: preprocess_dic['params'] = preprocess_parameter_combination preprocess = preprocesses_mapping[preprocess_info['name']](preprocess_dic) model_dic['preprocess'] = preprocess model_dic['params'] = model_parameter_combination model = models_mapping[model_info['name']](**model_dic) model_class.append(model) models.append(model_class) return models
Inicializa los modelos. Args: dic_models (dict): Diccionario de modelos entregado en el input. Returns: list: Lista de modelos instanciados.
experimenter/utils.py
models_creator
imfd/TextPerimenter
1
python
def models_creator(dic_models): '\n Inicializa los modelos.\n\n Args:\n dic_models (dict): Diccionario de modelos entregado en el input.\n\n Returns:\n list: Lista de modelos instanciados.\n ' models_mapping = get_models_mapping() models = [] for model_info in dic_models: model_class = [] model_params = model_info.get('params', None) if (model_params is not None): model_params_grid = list(ParameterGrid(model_info['params'])) else: model_params_grid = [{}] preprocesses = model_info.get('preprocesses', []) if (len(preprocesses) == 0): for model_parameter_combination in model_params_grid: model_dic = {'params': model_parameter_combination} model = models_mapping[model_info['name']](**model_dic) model_class.append(model) else: for preprocess_info in preprocesses: model_dic = {} preprocess_dic = {} tokenizers = [] tokenizers_info = preprocess_info.get('tokenizers', []) for tokenizer in tokenizers_info: tokenizers.append(tokenizers_mapping[tokenizer]()) preprocess_dic['tokenizers'] = tokenizers preprocess_params = preprocess_info.get('params', None) if (preprocess_params is not None): preprocess_params_grid = list(ParameterGrid(preprocess_params)) else: preprocess_params_grid = [{}] for model_parameter_combination in model_params_grid: for preprocess_parameter_combination in preprocess_params_grid: preprocess_dic['params'] = preprocess_parameter_combination preprocess = preprocesses_mapping[preprocess_info['name']](preprocess_dic) model_dic['preprocess'] = preprocess model_dic['params'] = model_parameter_combination model = models_mapping[model_info['name']](**model_dic) model_class.append(model) models.append(model_class) return models
def models_creator(dic_models): '\n Inicializa los modelos.\n\n Args:\n dic_models (dict): Diccionario de modelos entregado en el input.\n\n Returns:\n list: Lista de modelos instanciados.\n ' models_mapping = get_models_mapping() models = [] for model_info in dic_models: model_class = [] model_params = model_info.get('params', None) if (model_params is not None): model_params_grid = list(ParameterGrid(model_info['params'])) else: model_params_grid = [{}] preprocesses = model_info.get('preprocesses', []) if (len(preprocesses) == 0): for model_parameter_combination in model_params_grid: model_dic = {'params': model_parameter_combination} model = models_mapping[model_info['name']](**model_dic) model_class.append(model) else: for preprocess_info in preprocesses: model_dic = {} preprocess_dic = {} tokenizers = [] tokenizers_info = preprocess_info.get('tokenizers', []) for tokenizer in tokenizers_info: tokenizers.append(tokenizers_mapping[tokenizer]()) preprocess_dic['tokenizers'] = tokenizers preprocess_params = preprocess_info.get('params', None) if (preprocess_params is not None): preprocess_params_grid = list(ParameterGrid(preprocess_params)) else: preprocess_params_grid = [{}] for model_parameter_combination in model_params_grid: for preprocess_parameter_combination in preprocess_params_grid: preprocess_dic['params'] = preprocess_parameter_combination preprocess = preprocesses_mapping[preprocess_info['name']](preprocess_dic) model_dic['preprocess'] = preprocess model_dic['params'] = model_parameter_combination model = models_mapping[model_info['name']](**model_dic) model_class.append(model) models.append(model_class) return models<|docstring|>Inicializa los modelos. Args: dic_models (dict): Diccionario de modelos entregado en el input. Returns: list: Lista de modelos instanciados.<|endoftext|>
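The grid expansion at the heart of models_creator comes from scikit-learn's ParameterGrid. A minimal sketch with a hypothetical model entry (the name 'SVM' and its parameter values are not taken from the repository):

from sklearn.model_selection import ParameterGrid

model_info = {'name': 'SVM', 'params': {'C': [1, 10], 'kernel': ['linear', 'rbf']}}
for combo in ParameterGrid(model_info['params']):
    print(combo)
# Prints the four dicts of the 2x2 grid, e.g. {'C': 1, 'kernel': 'linear'}, ...
# models_creator instantiates one model per such combination (and, when
# preprocesses are given, one per preprocess parameter combination as well).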
6f01fd59625637bbce39567ccf15457a737a287b0c72403b3893782712809201
def models_preprocess(model, datasets_list): '\n Aplica el preprocesamiento de model en cada uno de los datasets contenidos\n en la lista datasets_list.\n\n Args:\n model (Model): Modelo instanciado.\n datasets_list (list): Lista de datasets a preprocesar.\n\n Returns:\n list: Lista de datasets preprocesados.\n ' return [model.preprocess(ds) for ds in datasets_list]
Aplica el preprocesamiento de model en cada uno de los datasets contenidos en la lista datasets_list. Args: model (Model): Modelo instanciado. datasets_list (list): Lista de datasets a preprocesar. Returns: list: Lista de datasets preprocesados.
experimenter/utils.py
models_preprocess
imfd/TextPerimenter
1
python
def models_preprocess(model, datasets_list): '\n Aplica el preprocesamiento de model en cada uno de los datasets contenidos\n en la lista datasets_list.\n\n Args:\n model (Model): Modelo instanciado.\n datasets_list (list): Lista de datasets a preprocesar.\n\n Returns:\n list: Lista de datasets preprocesados.\n ' return [model.preprocess(ds) for ds in datasets_list]
def models_preprocess(model, datasets_list): '\n Aplica el preprocesamiento de model en cada uno de los datasets contenidos\n en la lista datasets_list.\n\n Args:\n model (Model): Modelo instanciado.\n datasets_list (list): Lista de datasets a preprocesar.\n\n Returns:\n list: Lista de datasets preprocesados.\n ' return [model.preprocess(ds) for ds in datasets_list]<|docstring|>Aplica el preprocesamiento de model en cada uno de los datasets contenidos en la lista datasets_list. Args: model (Model): Modelo instanciado. datasets_list (list): Lista de datasets a preprocesar. Returns: list: Lista de datasets preprocesados.<|endoftext|>
0cf88e2d6e8cfd13a5307c8a2146646c07367db3d9bf95cf708ad4a713b6c34b
def get_set_result(models, metrics, X, y, exp_id): '\n Genera los resultados por metrica para una lista de modelos, dado un\n conjunto de entrenamiento (X e y), ademas guarda los resultados con un\n nombre unico.\n\n Args:\n models (list): Lista de listas que contienen los modelos.\n metrics (list): Lista que contiene las metricas que se usaran\n para generar el reporte.\n X (array): Arreglo con textos de testeo preprocesados.\n y (array): Matriz con las etiquetas reales de los textos\n de testeo.\n exp_id: id del experimento del cual se esta llamando a esta funcion.\n\n Returns:\n dict: Diccionario con los resultados del modelo por metrica.\n ' results = {} for j in range(len(models)): name = models[j][0].__class__.__name__ for i in range(len(models[j])): (X_prep,) = models_preprocess(models[j][i], [X]) y_pred = models[j][i].predict(X_prep) result = metrics_result(metrics, y, y_pred) results[f'{name}-{j}-{i}-{exp_id}'] = result return results
Genera los resultados por metrica para una lista de modelos, dado un conjunto de entrenamiento (X e y), ademas guarda los resultados con un nombre unico. Args: models (list): Lista de listas que contienen los modelos. metrics (list): Lista que contiene las metricas que se usaran para generar el reporte. X (array): Arreglo con textos de testeo preprocesados. y (array): Matriz con las etiquetas reales de los textos de testeo. exp_id: id del experimento del cual se esta llamando a esta funcion. Returns: dict: Diccionario con los resultados del modelo por metrica.
experimenter/utils.py
get_set_result
imfd/TextPerimenter
1
python
def get_set_result(models, metrics, X, y, exp_id): '\n Genera los resultados por metrica para una lista de modelos, dado un\n conjunto de entrenamiento (X e y), ademas guarda los resultados con un\n nombre unico.\n\n Args:\n models (list): Lista de listas que contienen los modelos.\n metrics (list): Lista que contiene las metricas que se usaran\n para generar el reporte.\n X (array): Arreglo con textos de testeo preprocesados.\n y (array): Matriz con las etiquetas reales de los textos\n de testeo.\n exp_id: id del experimento del cual se esta llamando a esta funcion.\n\n Returns:\n dict: Diccionario con los resultados del modelo por metrica.\n ' results = {} for j in range(len(models)): name = models[j][0].__class__.__name__ for i in range(len(models[j])): (X_prep,) = models_preprocess(models[j][i], [X]) y_pred = models[j][i].predict(X_prep) result = metrics_result(metrics, y, y_pred) results[f'{name}-{j}-{i}-{exp_id}'] = result return results
def get_set_result(models, metrics, X, y, exp_id): '\n Genera los resultados por metrica para una lista de modelos, dado un\n conjunto de entrenamiento (X e y), ademas guarda los resultados con un\n nombre unico.\n\n Args:\n models (list): Lista de listas que contienen los modelos.\n metrics (list): Lista que contiene las metricas que se usaran\n para generar el reporte.\n X (array): Arreglo con textos de testeo preprocesados.\n y (array): Matriz con las etiquetas reales de los textos\n de testeo.\n exp_id: id del experimento del cual se esta llamando a esta funcion.\n\n Returns:\n dict: Diccionario con los resultados del modelo por metrica.\n ' results = {} for j in range(len(models)): name = models[j][0].__class__.__name__ for i in range(len(models[j])): (X_prep,) = models_preprocess(models[j][i], [X]) y_pred = models[j][i].predict(X_prep) result = metrics_result(metrics, y, y_pred) results[f'{name}-{j}-{i}-{exp_id}'] = result return results<|docstring|>Genera los resultados por metrica para una lista de modelos, dado un conjunto de entrenamiento (X e y), ademas guarda los resultados con un nombre unico. Args: models (list): Lista de listas que contienen los modelos. metrics (list): Lista que contiene las metricas que se usaran para generar el reporte. X (array): Arreglo con textos de testeo preprocesados. y (array): Matriz con las etiquetas reales de los textos de testeo. exp_id: id del experimento del cual se esta llamando a esta funcion. Returns: dict: Diccionario con los resultados del modelo por metrica.<|endoftext|>
f8de4c7050c436dab54d4d39e04e15e87eeb020c5a1f3a1f10e5316d55fc67b7
def models_tester(models, metrics, X_test, y_test, exp_id): '\n Realiza predicciones con respecto al set de testeo de los mejores modelos\n obtenidos en el trainer. Se genera el output del experimenter, al cual se\n le añaden los reportes de desempeño.\n\n Args:\n models (list): Lista de listas, en donde cada lista interna corresponde\n a las n_best instancias de esa clase según el dataset de\n entrenamiento.\n metrics (list): Lista con las clases de las métricas que se\n introdujeron en show.\n X_test (array): Arreglo con textos de testeo preprocesados.\n y_test (array): Matriz con las etiquetas reales de los textos\n de testeo.\n\n Returns:\n dict: Output del experimenter.\n ' output = {} for j in range(len(models)): name = models[j][0].__class__.__name__ output[f'{name} {j}'] = [] for i in range(len(models[j])): (X_te,) = models_preprocess(models[j][i], [X_test]) y_pred = models[j][i].predict(X_te) result_by_metric = metrics_result(metrics, y_test, y_pred) report = models_report(result_by_metric) model_dict = {'name': f'{name}-{j}-{i}-{exp_id}', 'model_id': f'{datetime.now()}', 'scores': report} output[f'{name} {j}'].append(model_dict) return output
Realiza predicciones con respecto al set de testeo de los mejores modelos obtenidos en el trainer. Se genera el output del experimenter, al cual se le añaden los reportes de desempeño. Args: models (list): Lista de listas, en donde cada lista interna corresponde a las n_best instancias de esa clase según el dataset de entrenamiento. metrics (list): Lista con las clases de las métricas que se introdujeron en show. X_test (array): Arreglo con textos de testeo preprocesados. y_test (array): Matriz con las etiquetas reales de los textos de testeo. Returns: dict: Output del experimenter.
experimenter/utils.py
models_tester
imfd/TextPerimenter
1
python
def models_tester(models, metrics, X_test, y_test, exp_id): '\n Realiza predicciones con respecto al set de testeo de los mejores modelos\n obtenidos en el trainer. Se genera el output del experimenter, al cual se\n le añaden los reportes de desempeño.\n\n Args:\n models (list): Lista de listas, en donde cada lista interna corresponde\n a las n_best instancias de esa clase según el dataset de\n entrenamiento.\n metrics (list): Lista con las clases de las métricas que se\n introdujeron en show.\n X_test (array): Arreglo con textos de testeo preprocesados.\n y_test (array): Matriz con las etiquetas reales de los textos\n de testeo.\n\n Returns:\n dict: Output del experimenter.\n ' output = {} for j in range(len(models)): name = models[j][0].__class__.__name__ output[f'{name} {j}'] = [] for i in range(len(models[j])): (X_te,) = models_preprocess(models[j][i], [X_test]) y_pred = models[j][i].predict(X_te) result_by_metric = metrics_result(metrics, y_test, y_pred) report = models_report(result_by_metric) model_dict = {'name': f'{name}-{j}-{i}-{exp_id}', 'model_id': f'{datetime.now()}', 'scores': report} output[f'{name} {j}'].append(model_dict) return output
def models_tester(models, metrics, X_test, y_test, exp_id): '\n Realiza predicciones con respecto al set de testeo de los mejores modelos\n obtenidos en el trainer. Se genera el output del experimenter, al cual se\n le añaden los reportes de desempeño.\n\n Args:\n models (list): Lista de listas, en donde cada lista interna corresponde\n a las n_best instancias de esa clase según el dataset de\n entrenamiento.\n metrics (list): Lista con las clases de las métricas que se\n introdujeron en show.\n X_test (array): Arreglo con textos de testeo preprocesados.\n y_test (array): Matriz con las etiquetas reales de los textos\n de testeo.\n\n Returns:\n dict: Output del experimenter.\n ' output = {} for j in range(len(models)): name = models[j][0].__class__.__name__ output[f'{name} {j}'] = [] for i in range(len(models[j])): (X_te,) = models_preprocess(models[j][i], [X_test]) y_pred = models[j][i].predict(X_te) result_by_metric = metrics_result(metrics, y_test, y_pred) report = models_report(result_by_metric) model_dict = {'name': f'{name}-{j}-{i}-{exp_id}', 'model_id': f'{datetime.now()}', 'scores': report} output[f'{name} {j}'].append(model_dict) return output<|docstring|>Realiza predicciones con respecto al set de testeo de los mejores modelos obtenidos en el trainer. Se genera el output del experimenter, al cual se le añaden los reportes de desempeño. Args: models (list): Lista de listas, en donde cada lista interna corresponde a las n_best instancias de esa clase según el dataset de entrenamiento. metrics (list): Lista con las clases de las métricas que se introdujeron en show. X_test (array): Arreglo con textos de testeo preprocesados. y_test (array): Matriz con las etiquetas reales de los textos de testeo. Returns: dict: Output del experimenter.<|endoftext|>
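The output dictionary built above has roughly the shape sketched below; every value here is invented ('SVMModel', the scores, the experiment id), and model_id is simply str(datetime.now()) as in the code.

example_output = {
    'SVMModel 0': [                                   # one key per model family: f'{name} {j}'
        {'name': 'SVMModel-0-0-exp42',                # f'{name}-{j}-{i}-{exp_id}'
         'model_id': '2024-05-01 10:32:11.123456',    # str(datetime.now())
         'scores': {'spam': {'precision': 0.91, 'recall': 0.78}}},
    ],
}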
481b61c1f8b067604b5696164524e8e5c1608013d71b47ec8448774f2cfab682
def metrics_result(metrics, y_gold, y_pred): '\n Genera un reporte de desempeño de prediccción con respecto\n a las metricas especificadas. El reporte queda particionado\n por metrica.\n\n Args:\n metrics (list): Lista con las clases de las métricas especificadas.\n y_gold (array): Matriz con las labels reales.\n y_pred (array): Matriz con las labels predichas.\n\n Returns:\n dict: Reporte de desempeño\n ' metrics_mapping = get_metrics_mapping() report = {} if (len(metrics) == 0): metrics = metrics_mapping.values() for metric in metrics: result = metric(y_gold, y_pred) report[metric.name] = result return report
Genera un reporte de desempeño de prediccción con respecto a las metricas especificadas. El reporte queda particionado por metrica. Args: metrics (list): Lista con las clases de las métricas especificadas. y_gold (array): Matriz con las labels reales. y_pred (array): Matriz con las labels predichas. Returns: dict: Reporte de desempeño
experimenter/utils.py
metrics_result
imfd/TextPerimenter
1
python
def metrics_result(metrics, y_gold, y_pred): '\n Genera un reporte de desempeño de prediccción con respecto\n a las metricas especificadas. El reporte queda particionado\n por metrica.\n\n Args:\n metrics (list): Lista con las clases de las métricas especificadas.\n y_gold (array): Matriz con las labels reales.\n y_pred (array): Matriz con las labels predichas.\n\n Returns:\n dict: Reporte de desempeño\n ' metrics_mapping = get_metrics_mapping() report = {} if (len(metrics) == 0): metrics = metrics_mapping.values() for metric in metrics: result = metric(y_gold, y_pred) report[metric.name] = result return report
def metrics_result(metrics, y_gold, y_pred): '\n Genera un reporte de desempeño de prediccción con respecto\n a las metricas especificadas. El reporte queda particionado\n por metrica.\n\n Args:\n metrics (list): Lista con las clases de las métricas especificadas.\n y_gold (array): Matriz con las labels reales.\n y_pred (array): Matriz con las labels predichas.\n\n Returns:\n dict: Reporte de desempeño\n ' metrics_mapping = get_metrics_mapping() report = {} if (len(metrics) == 0): metrics = metrics_mapping.values() for metric in metrics: result = metric(y_gold, y_pred) report[metric.name] = result return report<|docstring|>Genera un reporte de desempeño de prediccción con respecto a las metricas especificadas. El reporte queda particionado por metrica. Args: metrics (list): Lista con las clases de las métricas especificadas. y_gold (array): Matriz con las labels reales. y_pred (array): Matriz con las labels predichas. Returns: dict: Reporte de desempeño<|endoftext|>
1e779ece48bccc0477e5e61d6d46a60db140acbb52239cc2a968210220e5f150
def models_report(results): '\n Genera reporte de resultados por label o nivel.\n\n Args:\n results (dict): Diccionartio con los resultados por metrica\n\n Returns:\n dict: Reporte de desempeño, particionado por label o nivel.\n ' def union(metric_name, metric_value, report): for key in metric_value: if (key in report): report[key][metric_name] = metric_value[key] else: report[key] = {} report[key][metric_name] = metric_value[key] report = {} metrics_results = results.copy() while (metrics_results != {}): metric_list = metrics_results.popitem() union(metric_list[0], metric_list[1], report) return report
Genera reporte de resultados por label o nivel. Args: results (dict): Diccionartio con los resultados por metrica Returns: dict: Reporte de desempeño, particionado por label o nivel.
experimenter/utils.py
models_report
imfd/TextPerimenter
1
python
def models_report(results): '\n Genera reporte de resultados por label o nivel.\n\n Args:\n results (dict): Diccionartio con los resultados por metrica\n\n Returns:\n dict: Reporte de desempeño, particionado por label o nivel.\n ' def union(metric_name, metric_value, report): for key in metric_value: if (key in report): report[key][metric_name] = metric_value[key] else: report[key] = {} report[key][metric_name] = metric_value[key] report = {} metrics_results = results.copy() while (metrics_results != {}): metric_list = metrics_results.popitem() union(metric_list[0], metric_list[1], report) return report
def models_report(results): '\n Genera reporte de resultados por label o nivel.\n\n Args:\n results (dict): Diccionartio con los resultados por metrica\n\n Returns:\n dict: Reporte de desempeño, particionado por label o nivel.\n ' def union(metric_name, metric_value, report): for key in metric_value: if (key in report): report[key][metric_name] = metric_value[key] else: report[key] = {} report[key][metric_name] = metric_value[key] report = {} metrics_results = results.copy() while (metrics_results != {}): metric_list = metrics_results.popitem() union(metric_list[0], metric_list[1], report) return report<|docstring|>Genera reporte de resultados por label o nivel. Args: results (dict): Diccionartio con los resultados por metrica Returns: dict: Reporte de desempeño, particionado por label o nivel.<|endoftext|>
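models_report pivots {metric: {label: value}} into {label: {metric: value}}. The loop below is a functionally equivalent pivot written with setdefault, on invented numbers, shown only to make the nested-dict shape explicit.

results = {
    'precision': {'spam': 0.90, 'ham': 0.80},
    'recall':    {'spam': 0.70, 'ham': 0.95},
}

report = {}
for metric_name, per_label in results.items():
    for label, value in per_label.items():
        report.setdefault(label, {})[metric_name] = value
print(report)
# {'spam': {'precision': 0.9, 'recall': 0.7}, 'ham': {'precision': 0.8, 'recall': 0.95}}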
fa2d915e836b365004ab1d1145d99d2e84a4444d02efd98f02717d87f6773fd7
def models_trainer(models, X_train, X_val, y_train, y_val): '\n Entrena los modelos y genera reportes en base a todas\n las métricas.\n\n Args:\n models (list): Lista de listas de modelos, en donde cada lista interna\n corresponde a un modelo especificado en el input.\n X_train (array): Arreglo con textos de entrenamiento preprocesados.\n X_val (array): Arreglo con textos de validación preprocesados.\n y_train (array): Matriz con las etiquetas reales de los textos\n de entrenamiento.\n y_val (array): Matriz con las etiquetas reales de los textos\n de validación.\n\n Returns:\n list: Lista con los reportes de desempeño obtenidos con respecto\n al set de validación.\n ' output = [] for model_class in models: results = [] for model in model_class: (X_tr, X_va) = models_preprocess(model, [X_train, X_val]) model.fit(X_tr, y_train) y_pred = model.predict(X_va) result_by_metric = metrics_result([], y_val, y_pred) report = models_report(result_by_metric) results.append({'model': model, 'report': report}) output.append(results) return output
Entrena los modelos y genera reportes en base a todas las métricas. Args: models (list): Lista de listas de modelos, en donde cada lista interna corresponde a un modelo especificado en el input. X_train (array): Arreglo con textos de entrenamiento preprocesados. X_val (array): Arreglo con textos de validación preprocesados. y_train (array): Matriz con las etiquetas reales de los textos de entrenamiento. y_val (array): Matriz con las etiquetas reales de los textos de validación. Returns: list: Lista con los reportes de desempeño obtenidos con respecto al set de validación.
experimenter/utils.py
models_trainer
imfd/TextPerimenter
1
python
def models_trainer(models, X_train, X_val, y_train, y_val): '\n Entrena los modelos y genera reportes en base a todas\n las métricas.\n\n Args:\n models (list): Lista de listas de modelos, en donde cada lista interna\n corresponde a un modelo especificado en el input.\n X_train (array): Arreglo con textos de entrenamiento preprocesados.\n X_val (array): Arreglo con textos de validación preprocesados.\n y_train (array): Matriz con las etiquetas reales de los textos\n de entrenamiento.\n y_val (array): Matriz con las etiquetas reales de los textos\n de validación.\n\n Returns:\n list: Lista con los reportes de desempeño obtenidos con respecto\n al set de validación.\n ' output = [] for model_class in models: results = [] for model in model_class: (X_tr, X_va) = models_preprocess(model, [X_train, X_val]) model.fit(X_tr, y_train) y_pred = model.predict(X_va) result_by_metric = metrics_result([], y_val, y_pred) report = models_report(result_by_metric) results.append({'model': model, 'report': report}) output.append(results) return output
def models_trainer(models, X_train, X_val, y_train, y_val): '\n Entrena los modelos y genera reportes en base a todas\n las métricas.\n\n Args:\n models (list): Lista de listas de modelos, en donde cada lista interna\n corresponde a un modelo especificado en el input.\n X_train (array): Arreglo con textos de entrenamiento preprocesados.\n X_val (array): Arreglo con textos de validación preprocesados.\n y_train (array): Matriz con las etiquetas reales de los textos\n de entrenamiento.\n y_val (array): Matriz con las etiquetas reales de los textos\n de validación.\n\n Returns:\n list: Lista con los reportes de desempeño obtenidos con respecto\n al set de validación.\n ' output = [] for model_class in models: results = [] for model in model_class: (X_tr, X_va) = models_preprocess(model, [X_train, X_val]) model.fit(X_tr, y_train) y_pred = model.predict(X_va) result_by_metric = metrics_result([], y_val, y_pred) report = models_report(result_by_metric) results.append({'model': model, 'report': report}) output.append(results) return output<|docstring|>Entrena los modelos y genera reportes en base a todas las métricas. Args: models (list): Lista de listas de modelos, en donde cada lista interna corresponde a un modelo especificado en el input. X_train (array): Arreglo con textos de entrenamiento preprocesados. X_val (array): Arreglo con textos de validación preprocesados. y_train (array): Matriz con las etiquetas reales de los textos de entrenamiento. y_val (array): Matriz con las etiquetas reales de los textos de validación. Returns: list: Lista con los reportes de desempeño obtenidos con respecto al set de validación.<|endoftext|>
e5e6d4db130b0f908f44945d31726cedfab100890d883922ebdf5210b94d2358
def optimizer(reports, metrics_dic): '\n Genera un ranking de los mejores reportes con respecto a la etiqueta y\n métrica especificadas en el input. Luego, filtra los reportes de manera\n tal de dejar los n mejores (especificados por el usuario).\n\n Args:\n reports (list): Lista con los reportes obtenidos en el trainer.\n metrics_dic (dict): Diccionario con las métricas especificadas\n por el usuario.\n\n Returns:\n list: Lista con los reportes filtrados.\n ' (opt_label, opt_metric, best_n) = get_optimizer_params(metrics_dic) rank_reports(reports, opt_label, opt_metric) return filter_models(reports, best_n)
Genera un ranking de los mejores reportes con respecto a la etiqueta y métrica especificadas en el input. Luego, filtra los reportes de manera tal de dejar los n mejores (especificados por el usuario). Args: reports (list): Lista con los reportes obtenidos en el trainer. metrics_dic (dict): Diccionario con las métricas especificadas por el usuario. Returns: list: Lista con los reportes filtrados.
experimenter/utils.py
optimizer
imfd/TextPerimenter
1
python
def optimizer(reports, metrics_dic): '\n Genera un ranking de los mejores reportes con respecto a la etiqueta y\n métrica especificadas en el input. Luego, filtra los reportes de manera\n tal de dejar los n mejores (especificados por el usuario).\n\n Args:\n reports (list): Lista con los reportes obtenidos en el trainer.\n metrics_dic (dict): Diccionario con las métricas especificadas\n por el usuario.\n\n Returns:\n list: Lista con los reportes filtrados.\n ' (opt_label, opt_metric, best_n) = get_optimizer_params(metrics_dic) rank_reports(reports, opt_label, opt_metric) return filter_models(reports, best_n)
def optimizer(reports, metrics_dic): '\n Genera un ranking de los mejores reportes con respecto a la etiqueta y\n métrica especificadas en el input. Luego, filtra los reportes de manera\n tal de dejar los n mejores (especificados por el usuario).\n\n Args:\n reports (list): Lista con los reportes obtenidos en el trainer.\n metrics_dic (dict): Diccionario con las métricas especificadas\n por el usuario.\n\n Returns:\n list: Lista con los reportes filtrados.\n ' (opt_label, opt_metric, best_n) = get_optimizer_params(metrics_dic) rank_reports(reports, opt_label, opt_metric) return filter_models(reports, best_n)<|docstring|>Genera un ranking de los mejores reportes con respecto a la etiqueta y métrica especificadas en el input. Luego, filtra los reportes de manera tal de dejar los n mejores (especificados por el usuario). Args: reports (list): Lista con los reportes obtenidos en el trainer. metrics_dic (dict): Diccionario con las métricas especificadas por el usuario. Returns: list: Lista con los reportes filtrados.<|endoftext|>
2c20ad9d20e6142829f9ba6b79759a9ff52e35d374bdd6a2c93ba358c3c0299f
def rank_reports(unranked, label, metric): '\n Ordena una lista de listas de reportes con respecto a la label y métrica\n especificada.\n\n Args:\n unranked (list): Lista de listas de reportes.\n label (str): Nombre de la label con la cual se quiere optimizar.\n metric (str): Nombre de la métrica con la cual se quiere optimizar.\n ' for report_list in unranked: report_list.sort(key=(lambda dic: search_value(dic, label, metric))) report_list.reverse()
Ordena una lista de listas de reportes con respecto a la label y métrica especificada. Args: unranked (list): Lista de listas de reportes. label (str): Nombre de la label con la cual se quiere optimizar. metric (str): Nombre de la métrica con la cual se quiere optimizar.
experimenter/utils.py
rank_reports
imfd/TextPerimenter
1
python
def rank_reports(unranked, label, metric): '\n Ordena una lista de listas de reportes con respecto a la label y métrica\n especificada.\n\n Args:\n unranked (list): Lista de listas de reportes.\n label (str): Nombre de la label con la cual se quiere optimizar.\n metric (str): Nombre de la métrica con la cual se quiere optimizar.\n ' for report_list in unranked: report_list.sort(key=(lambda dic: search_value(dic, label, metric))) report_list.reverse()
def rank_reports(unranked, label, metric): '\n Ordena una lista de listas de reportes con respecto a la label y métrica\n especificada.\n\n Args:\n unranked (list): Lista de listas de reportes.\n label (str): Nombre de la label con la cual se quiere optimizar.\n metric (str): Nombre de la métrica con la cual se quiere optimizar.\n ' for report_list in unranked: report_list.sort(key=(lambda dic: search_value(dic, label, metric))) report_list.reverse()<|docstring|>Ordena una lista de listas de reportes con respecto a la label y métrica especificada. Args: unranked (list): Lista de listas de reportes. label (str): Nombre de la label con la cual se quiere optimizar. metric (str): Nombre de la métrica con la cual se quiere optimizar.<|endoftext|>
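For ranking purposes, the sort-then-reverse above amounts to the descending sort shown below on invented data; the only behavioural difference is that reverse() also flips the relative order of tied entries, whereas sort(..., reverse=True) keeps it.

reports = [
    {'model': 'svm-a', 'report': {'spam': {'f1': 0.71}}},
    {'model': 'svm-b', 'report': {'spam': {'f1': 0.83}}},
]
reports.sort(key=lambda d: d['report']['spam']['f1'], reverse=True)
print([r['model'] for r in reports])  # ['svm-b', 'svm-a']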
51f58f37bdca21900fdc1fc666bd16d882d2b1fa8e2f0c19c3758a20a822dc79
def search_value(dic, first_key, second_key): '\n Retorna el valor obtenido de la métrica.\n\n Args:\n dic (dict): Diccionario que tiene un modelo y su reporte.\n first_key (str): String con la primera llave del reporte.\n second_key (str): String con la segunda llave del reporte.\n\n Returns:\n float: Valor obtenido.\n ' return dic['report'][first_key][second_key]
Retorna el valor obtenido de la métrica. Args: dic (dict): Diccionario que tiene un modelo y su reporte. first_key (str): String con la primera llave del reporte. second_key (str): String con la segunda llave del reporte. Returns: float: Valor obtenido.
experimenter/utils.py
search_value
imfd/TextPerimenter
1
python
def search_value(dic, first_key, second_key): '\n Retorna el valor obtenido de la métrica.\n\n Args:\n dic (dict): Diccionario que tiene un modelo y su reporte.\n first_key (str): String con la primera llave del reporte.\n second_key (str): String con la segunda llave del reporte.\n\n Returns:\n float: Valor obtenido.\n ' return dic['report'][first_key][second_key]
def search_value(dic, first_key, second_key): '\n Retorna el valor obtenido de la métrica.\n\n Args:\n dic (dict): Diccionario que tiene un modelo y su reporte.\n first_key (str): String con la primera llave del reporte.\n second_key (str): String con la segunda llave del reporte.\n\n Returns:\n float: Valor obtenido.\n ' return dic['report'][first_key][second_key]<|docstring|>Retorna el valor obtenido de la métrica. Args: dic (dict): Diccionario que tiene un modelo y su reporte. first_key (str): String con la primera llave del reporte. second_key (str): String con la segunda llave del reporte. Returns: float: Valor obtenido.<|endoftext|>
005b37c24a1da9aebb1fe1d605e5b0cc450da18e53f45de4e5812c4afafdd5c2
def set_params_to_list(js): '\n Inserta en listas los parámetros ingresados individualmente.\n\n Args:\n js (dict): Diccionario con los parámetros modificados.\n ' models = js['models'] if isinstance(models, dict): js['models'] = [models] for model in models: model_params = model.get('params', {}) for key in model_params: if (not isinstance(model_params[key], list)): model_params[key] = [model_params[key]] preprocesses = model.get('preprocesses', []) if (not isinstance(preprocesses, list)): model['preprocesses'] = [preprocesses] for preprocess in preprocesses: preprocess_params = preprocess.get('params', {}) for key in preprocess_params: if (not isinstance(preprocess_params[key], list)): preprocess_params[key] = [preprocess_params[key]]
Inserta en listas los parámetros ingresados individualmente. Args: js (dict): Diccionario con los parámetros modificados.
experimenter/utils.py
set_params_to_list
imfd/TextPerimenter
1
python
def set_params_to_list(js): '\n Inserta en listas los parámetros ingresados individualmente.\n\n Args:\n js (dict): Diccionario con los parámetros modificados.\n ' models = js['models'] if isinstance(models, dict): js['models'] = [models] for model in models: model_params = model.get('params', {}) for key in model_params: if (not isinstance(model_params[key], list)): model_params[key] = [model_params[key]] preprocesses = model.get('preprocesses', []) if (not isinstance(preprocesses, list)): model['preprocesses'] = [preprocesses] for preprocess in preprocesses: preprocess_params = preprocess.get('params', {}) for key in preprocess_params: if (not isinstance(preprocess_params[key], list)): preprocess_params[key] = [preprocess_params[key]]
def set_params_to_list(js): '\n Inserta en listas los parámetros ingresados individualmente.\n\n Args:\n js (dict): Diccionario con los parámetros modificados.\n ' models = js['models'] if isinstance(models, dict): js['models'] = [models] for model in models: model_params = model.get('params', {}) for key in model_params: if (not isinstance(model_params[key], list)): model_params[key] = [model_params[key]] preprocesses = model.get('preprocesses', []) if (not isinstance(preprocesses, list)): model['preprocesses'] = [preprocesses] for preprocess in preprocesses: preprocess_params = preprocess.get('params', {}) for key in preprocess_params: if (not isinstance(preprocess_params[key], list)): preprocess_params[key] = [preprocess_params[key]]<|docstring|>Inserta en listas los parámetros ingresados individualmente. Args: js (dict): Diccionario con los parámetros modificados.<|endoftext|>
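A before/after sketch of the in-place normalisation, with a hypothetical configuration; the import assumes the repository's package is available. Note that when 'models' (or 'preprocesses') is passed as a bare dict, the loop that follows still iterates the original dict, i.e. its keys, so the sketch passes lists, which is the path exercised here.

from experimenter.utils import set_params_to_list  # assumes the repository's package is installed

js = {'models': [{'name': 'SVM',
                  'params': {'C': 1},
                  'preprocesses': [{'name': 'TfIdf', 'params': {'lowercase': True}}]}]}
set_params_to_list(js)
print(js['models'][0]['params'])                     # {'C': [1]}
print(js['models'][0]['preprocesses'][0]['params'])  # {'lowercase': [True]}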
ee3e73767a128deacbdeacbc1c513afacbe847829c826332c22ba206a6b15bcb
@pytest.mark.parametrize('measured_dist,distance_measure_params,expected_mmd', [(BitstringDistribution({'000': 0.1, '111': 0.9}), {'sigma': 0.5}, 0.32000000000000006), (BitstringDistribution({'000': 0.5, '111': 0.5}), {'sigma': 1}, 0.0), (BitstringDistribution({'000': 0.5, '111': 0.5}), {'sigma': [1, 0.5, 2]}, 0.0)]) def test_gaussian_mmd_is_computed_correctly(measured_dist, distance_measure_params, expected_mmd): 'Maximum mean discrepancy (MMD) with gaussian kernel between distributions is\n computed correctly.' target_distr = BitstringDistribution({'000': 0.5, '111': 0.5}) mmd = compute_mmd(target_distr, measured_dist, distance_measure_params) assert (mmd == expected_mmd)
Maximum mean discrepancy (MMD) with gaussian kernel between distributions is computed correctly.
tests/zquantum/core/bitstring_distribution/distance_measures/distance_measures_test.py
test_gaussian_mmd_is_computed_correctly
yukiizm/z-quantum-core
24
python
@pytest.mark.parametrize('measured_dist,distance_measure_params,expected_mmd', [(BitstringDistribution({'000': 0.1, '111': 0.9}), {'sigma': 0.5}, 0.32000000000000006), (BitstringDistribution({'000': 0.5, '111': 0.5}), {'sigma': 1}, 0.0), (BitstringDistribution({'000': 0.5, '111': 0.5}), {'sigma': [1, 0.5, 2]}, 0.0)]) def test_gaussian_mmd_is_computed_correctly(measured_dist, distance_measure_params, expected_mmd): 'Maximum mean discrepancy (MMD) with gaussian kernel between distributions is\n computed correctly.' target_distr = BitstringDistribution({'000': 0.5, '111': 0.5}) mmd = compute_mmd(target_distr, measured_dist, distance_measure_params) assert (mmd == expected_mmd)
@pytest.mark.parametrize('measured_dist,distance_measure_params,expected_mmd', [(BitstringDistribution({'000': 0.1, '111': 0.9}), {'sigma': 0.5}, 0.32000000000000006), (BitstringDistribution({'000': 0.5, '111': 0.5}), {'sigma': 1}, 0.0), (BitstringDistribution({'000': 0.5, '111': 0.5}), {'sigma': [1, 0.5, 2]}, 0.0)]) def test_gaussian_mmd_is_computed_correctly(measured_dist, distance_measure_params, expected_mmd): 'Maximum mean discrepancy (MMD) with gaussian kernel between distributions is\n computed correctly.' target_distr = BitstringDistribution({'000': 0.5, '111': 0.5}) mmd = compute_mmd(target_distr, measured_dist, distance_measure_params) assert (mmd == expected_mmd)<|docstring|>Maximum mean discrepancy (MMD) with gaussian kernel between distributions is computed correctly.<|endoftext|>
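A hand-check of the first parametrised case, under the assumption (not verified against the library source) that compute_mmd treats bitstrings as integers and uses a Gaussian kernel of the form exp(-(x - y)^2 / (2*sigma)). With '000' -> 0 and '111' -> 7 the cross-kernel terms are numerically zero, so the value reduces to (0.5 - 0.1)^2 + (0.5 - 0.9)^2 = 0.32, matching expected_mmd up to float rounding.

import numpy as np

p = {'000': 0.5, '111': 0.5}   # target distribution
q = {'000': 0.1, '111': 0.9}   # measured distribution
sigma = 0.5

keys = sorted(set(p) | set(q))
d = np.array([p[k] - q[k] for k in keys])                  # signed probability difference
x = np.array([int(k, 2) for k in keys], dtype=float)       # '000' -> 0.0, '111' -> 7.0
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * sigma))  # assumed kernel form
print(d @ K @ d)                                           # ~0.32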
f4ba6bac1f5e0dc9c38f8067e14971de05c197e55676c420dcf0d29d5dad346c
def test_jensen_shannon_divergence_is_computed_correctly(): 'jensen shannon divergence between distributions is computed correctly.' target_distr = BitstringDistribution({'000': 0.5, '111': 0.5}) measured_dist = BitstringDistribution({'000': 0.1, '111': 0.9}) distance_measure_params = {'epsilon': 0.1} jensen_shannon_divergence = compute_jensen_shannon_divergence(target_distr, measured_dist, distance_measure_params) assert (jensen_shannon_divergence == 0.9485599924429406)
jensen shannon divergence between distributions is computed correctly.
tests/zquantum/core/bitstring_distribution/distance_measures/distance_measures_test.py
test_jensen_shannon_divergence_is_computed_correctly
yukiizm/z-quantum-core
24
python
def test_jensen_shannon_divergence_is_computed_correctly(): target_distr = BitstringDistribution({'000': 0.5, '111': 0.5}) measured_dist = BitstringDistribution({'000': 0.1, '111': 0.9}) distance_measure_params = {'epsilon': 0.1} jensen_shannon_divergence = compute_jensen_shannon_divergence(target_distr, measured_dist, distance_measure_params) assert (jensen_shannon_divergence == 0.9485599924429406)
def test_jensen_shannon_divergence_is_computed_correctly(): target_distr = BitstringDistribution({'000': 0.5, '111': 0.5}) measured_dist = BitstringDistribution({'000': 0.1, '111': 0.9}) distance_measure_params = {'epsilon': 0.1} jensen_shannon_divergence = compute_jensen_shannon_divergence(target_distr, measured_dist, distance_measure_params) assert (jensen_shannon_divergence == 0.9485599924429406)<|docstring|>jensen shannon divergence between distributions is computed correctly.<|endoftext|>
53206814c1af2002c93639fa917c9d8daedfc357a216206014d7b4af4f4cb42a
def run(self): 'Run (solve) the Genetic Algorithm.' for i in range(self.generations): log.info(f'Training population in generation {(i + 1)}...') if (i == 0): self.create_first_generation() else: self.create_next_generation() log.info(f'best individual: {self.best_individual()[1]}') log.info(f'best individual score: {self.best_individual()[0]}')
Run (solve) the Genetic Algorithm.
easynas/genetic_algorithm.py
run
erap129/EasyNAS
0
python
def run(self): for i in range(self.generations): log.info(f'Training population in generation {(i + 1)}...') if (i == 0): self.create_first_generation() else: self.create_next_generation() log.info(f'best individual: {self.best_individual()[1]}') log.info(f'best individual score: {self.best_individual()[0]}')
def run(self): for i in range(self.generations): log.info(f'Training population in generation {(i + 1)}...') if (i == 0): self.create_first_generation() else: self.create_next_generation() log.info(f'best individual: {self.best_individual()[1]}') log.info(f'best individual score: {self.best_individual()[0]}')<|docstring|>Run (solve) the Genetic Algorithm.<|endoftext|>
88d0b2eb98c3224b53536816a3094e7c4e7620c852cb09018684b9dc4e9b564d
def calculate_population_fitness(self): 'Calculate the fitness of every member of the given population using\n the supplied fitness_function.\n ' for individual in tqdm(self.current_generation): individual.fitness = self.fitness_function(individual.genes, self.seed_data) log.info(f'Current best validation accuracy: {max([x.fitness for x in self.current_generation])}')
Calculate the fitness of every member of the given population using the supplied fitness_function.
easynas/genetic_algorithm.py
calculate_population_fitness
erap129/EasyNAS
0
python
def calculate_population_fitness(self): 'Calculate the fitness of every member of the given population using\n the supplied fitness_function.\n ' for individual in tqdm(self.current_generation): individual.fitness = self.fitness_function(individual.genes, self.seed_data) log.info(f'Current best validation accuracy: {max([x.fitness for x in self.current_generation])}')
def calculate_population_fitness(self): 'Calculate the fitness of every member of the given population using\n the supplied fitness_function.\n ' for individual in tqdm(self.current_generation): individual.fitness = self.fitness_function(individual.genes, self.seed_data) log.info(f'Current best validation accuracy: {max([x.fitness for x in self.current_generation])}')<|docstring|>Calculate the fitness of every member of the given population using the supplied fitness_function.<|endoftext|>
e42150652486303b224141080e5071b6fd0aabcdd00208d96c2d89141322883f
def test_return_type(self): '\n Invalid requests should still return a complete response dict\n ' self.assertIsInstance(self.response_dict, dict) self.assertEqual(len(self.response_dict), 6)
Invalid requests should still return a complete response dict
tests/test_response_handling.py
test_return_type
5150brien/retsdk
1
python
def test_return_type(self): '\n \n ' self.assertIsInstance(self.response_dict, dict) self.assertEqual(len(self.response_dict), 6)
def test_return_type(self): '\n \n ' self.assertIsInstance(self.response_dict, dict) self.assertEqual(len(self.response_dict), 6)<|docstring|>Invalid requests should still return a complete response dict<|endoftext|>
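Taken together with the assertions in the test records that follow, the invalid-request response dict checked here appears to have the six keys sketched below; 'record_count' is a hypothetical name for the sixth key, which none of these tests reference, and the reply code/text values are only illustrative.

error_response = {
    'ok': False,            # False for bad requests
    'reply_code': '20203',  # non-null, non-zero (illustrative value)
    'reply_text': 'Miscellaneous error',
    'more_rows': False,
    'rows': [],             # empty data payload
    'record_count': 0,      # assumed sixth key
}
assert len(error_response) == 6 and not error_response['ok']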
7447e23bbf40586bc27e2b27570897fcad8758bf61b8e2080f89a2c54d6754f2
def test_response_data_payload(self): "\n The 'rows' value should be an empty list (no data payload returned)\n " self.assertIsInstance(self.response_dict['rows'], list) self.assertEqual(len(self.response_dict['rows']), 0)
The 'rows' value should be an empty list (no data payload returned)
tests/test_response_handling.py
test_response_data_payload
5150brien/retsdk
1
python
def test_response_data_payload(self): "\n \n " self.assertIsInstance(self.response_dict['rows'], list) self.assertEqual(len(self.response_dict['rows']), 0)
def test_response_data_payload(self): "\n \n " self.assertIsInstance(self.response_dict['rows'], list) self.assertEqual(len(self.response_dict['rows']), 0)<|docstring|>The 'rows' value should be an empty list (no data payload returned)<|endoftext|>
835eae6137599f87ecebb12bb8a4df8311410a1156907e1974c4811f66510de5
def test_error_reply_code(self): '\n Reply code for bad requests should be non-null and non-zero\n ' self.assertIsNotNone(self.response_dict['reply_code']) self.assertNotEqual(self.response_dict['reply_code'], '') self.assertNotEqual(self.response_dict['reply_code'], '0')
Reply code for bad requests should be non-null and non-zero
tests/test_response_handling.py
test_error_reply_code
5150brien/retsdk
1
python
def test_error_reply_code(self): '\n \n ' self.assertIsNotNone(self.response_dict['reply_code']) self.assertNotEqual(self.response_dict['reply_code'], ) self.assertNotEqual(self.response_dict['reply_code'], '0')
def test_error_reply_code(self): '\n \n ' self.assertIsNotNone(self.response_dict['reply_code']) self.assertNotEqual(self.response_dict['reply_code'], ) self.assertNotEqual(self.response_dict['reply_code'], '0')<|docstring|>Reply code for bad requests should be non-null and non-zero<|endoftext|>
229e1b6b04340446cbe53202e35393119e9346fcfece83130fc480727256fe78
def test_reply_text(self): '\n Reply text for bad requests should be non-null\n ' self.assertIsNotNone(self.response_dict['reply_text']) self.assertNotEqual(self.response_dict['reply_text'], '')
Reply text for bad requests should be non-null
tests/test_response_handling.py
test_reply_text
5150brien/retsdk
1
python
def test_reply_text(self): '\n \n ' self.assertIsNotNone(self.response_dict['reply_text']) self.assertNotEqual(self.response_dict['reply_text'], )
def test_reply_text(self): '\n \n ' self.assertIsNotNone(self.response_dict['reply_text']) self.assertNotEqual(self.response_dict['reply_text'], )<|docstring|>Reply text for bad requests should be non-null<|endoftext|>
e8fd10b09856bd1d27137f4c4c19a7326ed90644277aaa320db1655e1d7d40dd
def test_ok_value(self): "\n The response dict's 'ok' val should be False for bad requests\n " self.assertFalse(self.response_dict['ok'])
The response dict's 'ok' val should be False for bad requests
tests/test_response_handling.py
test_ok_value
5150brien/retsdk
1
python
def test_ok_value(self): "\n \n " self.assertFalse(self.response_dict['ok'])
def test_ok_value(self): "\n \n " self.assertFalse(self.response_dict['ok'])<|docstring|>The response dict's 'ok' val should be False for bad requests<|endoftext|>
bf056419043ff2133db3e8354916cccb5233b5961f1e851303c27a10f725aeaf
def test_more_rows_value(self): "\n The response dict's 'more_rows' val should be False for bad requests\n " self.assertFalse(self.response_dict['more_rows'])
The response dict's 'more_rows' val should be False for bad requests
tests/test_response_handling.py
test_more_rows_value
5150brien/retsdk
1
python
def test_more_rows_value(self): "\n \n " self.assertFalse(self.response_dict['more_rows'])
def test_more_rows_value(self): "\n \n " self.assertFalse(self.response_dict['more_rows'])<|docstring|>The response dict's 'more_rows' val should be False for bad requests<|endoftext|>
a73bd741b8e5b5517245e00ebc0231d4626ef8c46038ae6fb012f45bf47e04af
def test_response_rows(self): '\n The response dict should contain a list of values (can be empty)\n ' self.assertIsInstance(self.response_dict['rows'], list) self.assertGreaterEqual(len(self.response_dict['rows']), 0)
The response dict should contain a list of values (can be empty)
tests/test_response_handling.py
test_response_rows
5150brien/retsdk
1
python
def test_response_rows(self): '\n \n ' self.assertIsInstance(self.response_dict['rows'], list) self.assertGreaterEqual(len(self.response_dict['rows']), 0)
def test_response_rows(self): '\n \n ' self.assertIsInstance(self.response_dict['rows'], list) self.assertGreaterEqual(len(self.response_dict['rows']), 0)<|docstring|>The response dict should contain a list of values (can be empty)<|endoftext|>
dcae7e9cc1567d4ae81f81fe83fb68fdd978dd5cb9829466b57fc65c1863e3c6
def test_ok_value(self): "\n The response dict's 'ok' val should be True\n " self.assertTrue(self.response_dict['ok'])
The response dict's 'ok' val should be True
tests/test_response_handling.py
test_ok_value
5150brien/retsdk
1
python
def test_ok_value(self): "\n \n " self.assertTrue(self.response_dict['ok'])
def test_ok_value(self): "\n \n " self.assertTrue(self.response_dict['ok'])<|docstring|>The response dict's 'ok' val should be True<|endoftext|>
b49c1259e8e2e7dd5ea1228236cadcfd3a26805a7e3905800ae5d90e5201278b
def test_more_rows_value(self): "\n The response dict's 'more_rows' val should be False\n " self.assertFalse(self.response_dict['more_rows'])
The response dict's 'more_rows' val should be False
tests/test_response_handling.py
test_more_rows_value
5150brien/retsdk
1
python
def test_more_rows_value(self): "\n \n " self.assertFalse(self.response_dict['more_rows'])
def test_more_rows_value(self): "\n \n " self.assertFalse(self.response_dict['more_rows'])<|docstring|>The response dict's 'more_rows' val should be False<|endoftext|>
a73bd741b8e5b5517245e00ebc0231d4626ef8c46038ae6fb012f45bf47e04af
def test_response_rows(self): '\n The response dict should contain a list of values (can be empty)\n ' self.assertIsInstance(self.response_dict['rows'], list) self.assertGreaterEqual(len(self.response_dict['rows']), 0)
The response dict should contain a list of values (can be empty)
tests/test_response_handling.py
test_response_rows
5150brien/retsdk
1
python
def test_response_rows(self): '\n \n ' self.assertIsInstance(self.response_dict['rows'], list) self.assertGreaterEqual(len(self.response_dict['rows']), 0)
def test_response_rows(self): '\n \n ' self.assertIsInstance(self.response_dict['rows'], list) self.assertGreaterEqual(len(self.response_dict['rows']), 0)<|docstring|>The response dict should contain a list of values (can be empty)<|endoftext|>
dcae7e9cc1567d4ae81f81fe83fb68fdd978dd5cb9829466b57fc65c1863e3c6
def test_ok_value(self): "\n The response dict's 'ok' val should be True\n " self.assertTrue(self.response_dict['ok'])
The response dict's 'ok' val should be True
tests/test_response_handling.py
test_ok_value
5150brien/retsdk
1
python
def test_ok_value(self): "\n \n " self.assertTrue(self.response_dict['ok'])
def test_ok_value(self): "\n \n " self.assertTrue(self.response_dict['ok'])<|docstring|>The response dict's 'ok' val should be True<|endoftext|>
e7999c02f61086c39a5fc0a98e52c3fcee2371667d92c19b2fa8e0af2a6889e5
def test_more_rows_value(self): "\n The response dict's 'more_rows' val should be True\n " self.assertTrue(self.response_dict['more_rows'])
The response dict's 'more_rows' val should be True
tests/test_response_handling.py
test_more_rows_value
5150brien/retsdk
1
python
def test_more_rows_value(self): "\n \n " self.assertTrue(self.response_dict['more_rows'])
def test_more_rows_value(self): "\n \n " self.assertTrue(self.response_dict['more_rows'])<|docstring|>The response dict's 'more_rows' val should be True<|endoftext|>
e2655bac50398a24f2aefc00be442e4dc4965892ba7fad2638dba732ac69a81f
def test_get_api_url(): '\n Make sure we get a functioning API URL\n ' api_url = get_api_url() resp = requests.get(api_url) check_response(resp) content = resp.json() assert ('cases' in content.keys())
Make sure we get a functioning API URL
tests/test_utils.py
test_get_api_url
bensteinberg/cap-examples
49
python
def test_get_api_url(): '\n \n ' api_url = get_api_url() resp = requests.get(api_url) check_response(resp) content = resp.json() assert ('cases' in content.keys())
def test_get_api_url(): '\n \n ' api_url = get_api_url() resp = requests.get(api_url) check_response(resp) content = resp.json() assert ('cases' in content.keys())<|docstring|>Make sure we get a functioning API URL<|endoftext|>
2fe960a9a64446d8bf69f7931123ca58cad5491a39e7fb42d3b064a319170515
def _transform(s_data: str, replacements: list, keep_license_text: bool=False) -> str: '\n Internal function called to transform source data into templated data\n :param s_data: the source data to be transformed\n :param replacements: list of transformation pairs A->B\n :param keep_license_text: whether or not you want to keep license text\n :return: the potentially transformed data\n ' t_data = str(s_data) for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) while ('${Random_Uuid}' in t_data): t_data = t_data.replace('${Random_Uuid}', str(uuid.uuid4()), 1) if (not keep_license_text): t_data = _replace_license_text(t_data) return t_data
Internal function called to transform source data into templated data :param s_data: the source data to be transformed :param replacements: list of transformation pairs A->B :param keep_license_text: whether or not you want to keep license text :return: the potentially transformed data
scripts/o3de/o3de/engine_template.py
_transform
SparkyStudios/o3de
11
python
def _transform(s_data: str, replacements: list, keep_license_text: bool=False) -> str: '\n Internal function called to transform source data into templated data\n :param s_data: the source data to be transformed\n :param replacements: list of transformation pairs A->B\n :param keep_license_text: whether or not you want to keep license text\n :return: the potentially transformed data\n ' t_data = str(s_data) for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) while ('${Random_Uuid}' in t_data): t_data = t_data.replace('${Random_Uuid}', str(uuid.uuid4()), 1) if (not keep_license_text): t_data = _replace_license_text(t_data) return t_data
def _transform(s_data: str, replacements: list, keep_license_text: bool=False) -> str: '\n Internal function called to transform source data into templated data\n :param s_data: the source data to be transformed\n :param replacements: list of transformation pairs A->B\n :param keep_license_text: whether or not you want to keep license text\n :return: the potentially transformed data\n ' t_data = str(s_data) for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) while ('${Random_Uuid}' in t_data): t_data = t_data.replace('${Random_Uuid}', str(uuid.uuid4()), 1) if (not keep_license_text): t_data = _replace_license_text(t_data) return t_data<|docstring|>Internal function called to transform source data into templated data :param s_data: the source data to be transformed :param replacements: list of transformation pairs A->B :param keep_license_text: whether or not you want to keep license text :return: the potentially transformed data<|endoftext|>
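A tiny illustration of the replacement loop in _transform with hypothetical template text: ordinary pairs are substituted globally, while ${Random_Uuid} is replaced one occurrence at a time so each occurrence gets its own fresh UUID (the license-stripping step is left out here).

import uuid

s_data = 'gem ${Name}: ${Random_Uuid} / ${Random_Uuid}'
replacements = [('${Name}', 'TestGem')]

t_data = s_data
for old, new in replacements:
    t_data = t_data.replace(old, new)
while '${Random_Uuid}' in t_data:
    t_data = t_data.replace('${Random_Uuid}', str(uuid.uuid4()), 1)
print(t_data)  # e.g. 'gem TestGem: 0b7e... / 9c41...' -- two different UUIDs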
f602e1ea028d4229ef70c8279c7777c95c9b6264fe8859b1239e24097028fe00
def _transform_copy(source_file: pathlib.Path, destination_file: pathlib.Path, replacements: list, keep_license_text: bool=False) -> None: '\n Internal function called to transform and copy a source file into templated destination file\n :param source_file: the source file to be transformed\n :param destination_file: the destination file, this is the transformed file\n :param replacements: list of transformation pairs A->B\n :param keep_license_text: whether or not you want to keep license text\n ' (name, ext) = os.path.splitext(source_file) if (ext in binary_file_ext): shutil.copy(source_file, destination_file) else: try: with open(source_file, 'r') as s: s_data = s.read() d_data = _transform(s_data, replacements, keep_license_text) if os.path.isfile(destination_file): os.unlink(destination_file) with open(destination_file, 'w') as d: d.write(d_data) except Exception as e: shutil.copy(source_file, destination_file) pass
Internal function called to transform and copy a source file into templated destination file :param source_file: the source file to be transformed :param destination_file: the destination file, this is the transformed file :param replacements: list of transformation pairs A->B :param keep_license_text: whether or not you want to keep license text
scripts/o3de/o3de/engine_template.py
_transform_copy
SparkyStudios/o3de
11
python
def _transform_copy(source_file: pathlib.Path, destination_file: pathlib.Path, replacements: list, keep_license_text: bool=False) -> None: '\n Internal function called to transform and copy a source file into templated destination file\n :param source_file: the source file to be transformed\n :param destination_file: the destination file, this is the transformed file\n :param replacements: list of transformation pairs A->B\n :param keep_license_text: whether or not you want to keep license text\n ' (name, ext) = os.path.splitext(source_file) if (ext in binary_file_ext): shutil.copy(source_file, destination_file) else: try: with open(source_file, 'r') as s: s_data = s.read() d_data = _transform(s_data, replacements, keep_license_text) if os.path.isfile(destination_file): os.unlink(destination_file) with open(destination_file, 'w') as d: d.write(d_data) except Exception as e: shutil.copy(source_file, destination_file) pass
def _transform_copy(source_file: pathlib.Path, destination_file: pathlib.Path, replacements: list, keep_license_text: bool=False) -> None: '\n Internal function called to transform and copy a source file into templated destination file\n :param source_file: the source file to be transformed\n :param destination_file: the destination file, this is the transformed file\n :param replacements: list of transformation pairs A->B\n :param keep_license_text: whether or not you want to keep license text\n ' (name, ext) = os.path.splitext(source_file) if (ext in binary_file_ext): shutil.copy(source_file, destination_file) else: try: with open(source_file, 'r') as s: s_data = s.read() d_data = _transform(s_data, replacements, keep_license_text) if os.path.isfile(destination_file): os.unlink(destination_file) with open(destination_file, 'w') as d: d.write(d_data) except Exception as e: shutil.copy(source_file, destination_file) pass<|docstring|>Internal function called to transform and copy a source file into templated destination file :param source_file: the source file to be transformed :param destination_file: the destination file, this is the transformed file :param replacements: list of transformation pairs A->B :param keep_license_text: whether or not you want to keep license text<|endoftext|>
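A hedged sketch of the copy-or-transform decision described by the _transform_copy record above. BINARY_EXT stands in for the module's binary_file_ext set (defined elsewhere in engine_template.py), and the function name is hypothetical; the original falls back to a plain copy on any error, which is narrowed here to read failures:

import pathlib
import shutil

BINARY_EXT = {'.png', '.tga', '.fbx'}  # assumption: stand-in for binary_file_ext

def transform_copy(src: pathlib.Path, dst: pathlib.Path, replacements: list) -> None:
    # Binary files are copied verbatim; text files are read, transformed and rewritten.
    if src.suffix.lower() in BINARY_EXT:
        shutil.copy(src, dst)
        return
    try:
        data = src.read_text()
    except (OSError, UnicodeDecodeError):
        shutil.copy(src, dst)  # unreadable as text: copy as-is, like the record above
        return
    for old, new in replacements:  # same ordered substitution as in the records above
        data = data.replace(old, new)
    dst.write_text(data)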
bae588e00f74d14ad756c771453c2d0a3a92d6917da99da0b00e12d73c721497
def _instantiate_template(template_json_data: dict, destination_name: str, template_name: str, destination_path: pathlib.Path, template_path: pathlib.Path, destination_restricted_path: pathlib.Path, template_restricted_path: pathlib.Path, destination_restricted_platform_relative_path: pathlib.Path, template_restricted_platform_relative_path: pathlib.Path, replacements: list, keep_restricted_in_instance: bool=False, keep_license_text: bool=False) -> int: "\n Internal function to create a concrete instance from a template\n\n :param template_json_data: the template json data\n :param destination_name: the name of folder you want to instantiate the template in\n :param template_name: the name of the template\n :param destination_path: the path you want to instantiate the template in\n :param template_path: the path of the template\n :param destination_restricted_path: the path of the restricted destination\n :param template_restricted_path: the path of the restricted template\n :param destination_restricted_platform_relative_path: any path after the Platform of the restricted destination\n :param template_restricted_platform_relative_path: any path after the Platform of the restricted template\n :param replacements: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. ${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance\n or separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :return: 0 for success or non 0 failure code\n " _execute_template_json(template_json_data, destination_path, template_path, replacements, keep_license_text, keep_restricted_in_instance) for restricted_platform in restricted_platforms: restricted_json_data = {} if template_restricted_path: template_restricted_platform = (template_restricted_path / restricted_platform) template_restricted_platform_path_rel = (template_restricted_platform / template_restricted_platform_relative_path) platform_json = (template_restricted_platform_path_rel / 'template.json') if os.path.isfile(platform_json): if (not validation.valid_o3de_template_json(platform_json)): logger.error(f'Template json {platform_json} is invalid.') return 1 with open(platform_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {platform_json}: ' + str(e))) return 1 _execute_restricted_template_json(template_json_data, restricted_json_data, restricted_platform, destination_name, destination_path, destination_restricted_path, template_path, template_restricted_path, destination_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_instance, keep_license_text) return 0
Internal function to create a concrete instance from a template :param template_json_data: the template json data :param destination_name: the name of folder you want to instantiate the template in :param template_name: the name of the template :param destination_path: the path you want to instantiate the template in :param template_path: the path of the template :param destination_restricted_path: the path of the restricted destination :param template_restricted_path: the path of the restricted template :param destination_restricted_platform_relative_path: any path after the Platform of the restricted destination :param template_restricted_platform_relative_path: any path after the Platform of the restricted template :param replacements: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :return: 0 for success or non 0 failure code
scripts/o3de/o3de/engine_template.py
_instantiate_template
SparkyStudios/o3de
11
python
def _instantiate_template(template_json_data: dict, destination_name: str, template_name: str, destination_path: pathlib.Path, template_path: pathlib.Path, destination_restricted_path: pathlib.Path, template_restricted_path: pathlib.Path, destination_restricted_platform_relative_path: pathlib.Path, template_restricted_platform_relative_path: pathlib.Path, replacements: list, keep_restricted_in_instance: bool=False, keep_license_text: bool=False) -> int: "\n Internal function to create a concrete instance from a template\n\n :param template_json_data: the template json data\n :param destination_name: the name of folder you want to instantiate the template in\n :param template_name: the name of the template\n :param destination_path: the path you want to instantiate the template in\n :param template_path: the path of the template\n :param destination_restricted_path: the path of the restricted destination\n :param template_restricted_path: the path of the restricted template\n :param destination_restricted_platform_relative_path: any path after the Platform of the restricted destination\n :param template_restricted_platform_relative_path: any path after the Platform of the restricted template\n :param replacements: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. ${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance\n or separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :return: 0 for success or non 0 failure code\n " _execute_template_json(template_json_data, destination_path, template_path, replacements, keep_license_text, keep_restricted_in_instance) for restricted_platform in restricted_platforms: restricted_json_data = {} if template_restricted_path: template_restricted_platform = (template_restricted_path / restricted_platform) template_restricted_platform_path_rel = (template_restricted_platform / template_restricted_platform_relative_path) platform_json = (template_restricted_platform_path_rel / 'template.json') if os.path.isfile(platform_json): if (not validation.valid_o3de_template_json(platform_json)): logger.error(f'Template json {platform_json} is invalid.') return 1 with open(platform_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {platform_json}: ' + str(e))) return 1 _execute_restricted_template_json(template_json_data, restricted_json_data, restricted_platform, destination_name, destination_path, destination_restricted_path, template_path, template_restricted_path, destination_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_instance, keep_license_text) return 0
def _instantiate_template(template_json_data: dict, destination_name: str, template_name: str, destination_path: pathlib.Path, template_path: pathlib.Path, destination_restricted_path: pathlib.Path, template_restricted_path: pathlib.Path, destination_restricted_platform_relative_path: pathlib.Path, template_restricted_platform_relative_path: pathlib.Path, replacements: list, keep_restricted_in_instance: bool=False, keep_license_text: bool=False) -> int: "\n    Internal function to create a concrete instance from a template\n\n    :param template_json_data: the template json data\n    :param destination_name: the name of folder you want to instantiate the template in\n    :param template_name: the name of the template\n    :param destination_path: the path you want to instantiate the template in\n    :param template_path: the path of the template\n    :param destination_restricted_path: the path of the restricted destination\n    :param template_restricted_path: the path of the restricted template\n    :param destination_restricted_platform_relative_path: any path after the Platform of the restricted destination\n    :param template_restricted_platform_relative_path: any path after the Platform of the restricted template\n    :param replacements: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n    Ex. ${Name},TestGem,${Player},TestGemPlayer\n    This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n    :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance\n    or separate them out into the restricted folder\n    :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n    template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n    this controls if you want to keep the license text from the template in the new instance. It is false by default\n    because most customers will not want license text in their instances, but we may want to keep them.\n    :return: 0 for success or non 0 failure code\n    " _execute_template_json(template_json_data, destination_path, template_path, replacements, keep_license_text, keep_restricted_in_instance) for restricted_platform in restricted_platforms: restricted_json_data = {} if template_restricted_path: template_restricted_platform = (template_restricted_path / restricted_platform) template_restricted_platform_path_rel = (template_restricted_platform / template_restricted_platform_relative_path) platform_json = (template_restricted_platform_path_rel / 'template.json') if os.path.isfile(platform_json): if (not validation.valid_o3de_template_json(platform_json)): logger.error(f'Template json {platform_json} is invalid.') return 1 with open(platform_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {platform_json}: ' + str(e))) return 1 _execute_restricted_template_json(template_json_data, restricted_json_data, restricted_platform, destination_name, destination_path, destination_restricted_path, template_path, template_restricted_path, destination_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_instance, keep_license_text) return 0<|docstring|>Internal function to create a concrete instance from a template :param template_json_data: the template json data :param destination_name: the name of folder you want to instantiate the template in :param template_name: the name of the template :param destination_path: the path you want to instantiate the template in :param template_path: the path of the template :param destination_restricted_path: the path of the restricted destination :param template_restricted_path: the path of the restricted template :param destination_restricted_platform_relative_path: any path after the Platform of the restricted destination :param template_restricted_platform_relative_path: any path after the Platform of the restricted template :param replacements: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :return: 0 for success or non 0 failure code<|endoftext|>
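A small sketch of the per-restricted-platform template.json loading that the _instantiate_template record above performs, using only the standard library; the o3de validation.valid_o3de_template_json check is replaced by a plain file test here, so treat the helper as an approximation with a hypothetical name:

import json
import logging
import pathlib

logger = logging.getLogger(__name__)

def load_platform_template_json(restricted_root: pathlib.Path, platform: str, relative_path: pathlib.Path) -> dict:
    # Mirrors <restricted_root>/<platform>/<relative_path>/template.json from the record above.
    platform_json = restricted_root / platform / relative_path / 'template.json'
    if not platform_json.is_file():
        return {}
    try:
        with platform_json.open('r') as f:
            return json.load(f)
    except json.JSONDecodeError as err:
        logger.error(f'Failed to load {platform_json}: {err}')
        return {}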
daced38d6fc416926d312ae10c7ce90a7f1f9eabb4fbcd6efe691dd84be38cce
def create_template(source_path: pathlib.Path, template_path: pathlib.Path, source_name: str=None, source_restricted_path: pathlib.Path=None, source_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, source_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_template: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False) -> int: "\n Create a template from a source directory using replacement\n\n :param source_path: The path to the source that you want to make into a template\n :param template_path: the path of the template to create, can be absolute or relative to default templates path\n :param source_name: Name to replace within template folder with ${Name} placeholder\n If not specified, the basename of the source_path parameter is used as the source_name instead\n :param source_restricted_path: path to the source restricted folder\n :param source_restricted_name: name of the source restricted folder\n :param template_restricted_path: path to the templates restricted folder\n :param template_restricted_name: name of the templates restricted folder\n :param source_restricted_platform_relative_path: any path after the platform in the source restricted\n :param template_restricted_platform_relative_path: any path after the platform in the template restricted\n :param replace: optional list of strings uses to make templated parameters out of concrete names. X->Y pairs\n Ex. TestGem,${Name},TestGemPlayer,${Player}\n This will cause all references to 'TestGem' be replaced by ${Name}, and all 'TestGemPlayer' replaced by ${Player}\n Note these replacements are executed in order, so if you have larger matches, do them first, i.e.\n TestGemPlayer,${Player},TestGem,${Name}\n TestGemPlayer will get matched first and become ${Player} and will not become ${Name}Player\n :param keep_restricted_in_template: whether or not you want to keep the templates restricted in your template.\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n Templated files can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most people will not want license text in their instances.\n :param force Overrides existing files even if they exist\n :param no_register: whether or not after completion that the new object is registered\n :return: 0 for success or non 0 failure code\n " if (not source_path): logger.error('Src path cannot be empty.') return 1 if (not source_path.is_dir()): logger.error(f'Src path {source_path} is not a folder.') return 1 source_path = source_path.resolve() if (not source_name): source_name = os.path.basename(source_path) sanitized_source_name = utils.sanitize_identifier_for_cpp(source_name) if (not template_path): logger.info(f'Template path empty. Using source name {source_name}') template_path = source_name if (not template_path.is_absolute()): default_templates_folder = manifest.get_registered(default_folder='templates') template_path = (default_templates_folder / source_name) logger.info(f'Template path empty. 
Using default templates folder {template_path}') if ((not force) and template_path.is_dir() and len(list(template_path.iterdir()))): logger.error(f'Template path {template_path} already exists.') return 1 try: template_path.relative_to(source_path) except ValueError: pass else: logger.error(f'''Template output path {template_path} cannot be a subdirectory of the source_path {source_path} ''') return 1 template_name = os.path.basename(template_path) if (template_name in restricted_platforms): logger.error(f'Template path cannot be a restricted name. {template_name}') return 1 if (source_restricted_name and (not source_restricted_path)): source_restricted_path = manifest.get_registered(restricted_name=source_restricted_name) if source_restricted_path: if (not source_restricted_path.is_dir()): logger.error(f'Source restricted path {source_restricted_path} is not a folder.') return 1 restricted_json = (source_restricted_path / 'restricted.json') if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {restricted_json}: ' + str(e))) return 1 try: source_restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read restricted_name from {restricted_json}') return 1 if (template_restricted_name and (not template_restricted_path)): template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) if (not template_restricted_name): template_restricted_name = template_name if template_restricted_path: if template_restricted_path.is_dir(): restricted_json = (template_restricted_path / 'restricted.json') if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'{restricted_json} is not valid.') return 1 with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {restricted_json}: ' + str(e))) return 1 try: template_restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read restricted_name from {restricted_json}') return 1 else: os.makedirs(template_restricted_path, exist_ok=True) if (not source_restricted_platform_relative_path): source_restricted_platform_relative_path = '' if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = '' logger.info(f'Processing Src: {source_path}') replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) os.makedirs(template_path, exist_ok=True) replacements.append((source_name.lower(), '${NameLower}')) replacements.append((source_name.upper(), '${NameUpper}')) replacements.append((source_name, '${Name}')) replacements.append((sanitized_source_name, '${SanitizedCppName}')) sanitized_name_index = (len(replacements) - 1) def _is_cpp_file(file_path: pathlib.Path) -> bool: '\n Internal helper method to check if a file is a C++ file based\n on its extension, so we can determine if we need to prefer\n the ${SanitizedCppName}\n :param file_path: The input file path\n :return: bool: Whether or not the input file path has a C++ extension\n ' (name, ext) = os.path.splitext(file_path) return (ext.lower() in cpp_file_ext) def _transform_into_template(s_data: object, prefer_sanitized_name: bool=False) -> (bool, 
str): "\n Internal function to transform any data into templated data\n :param s_data: the input data, this could be file data or file name data\n :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name\n This can be necessary when creating the template, the source\n name and sanitized source name might be the same, but C++\n files will need to prefer the sanitized version, or else\n there might be compile errors (e.g. '-' characters in the name)\n :return: bool: whether or not the returned data MAY need to be transformed to instantiate it\n t_data: potentially transformed data 0 for success or non 0 failure code\n " def swap_sanitized_name_and_normal(): (replacements[(sanitized_name_index - 1)], replacements[sanitized_name_index]) = (replacements[sanitized_name_index], replacements[(sanitized_name_index - 1)]) t_data = str(s_data) if prefer_sanitized_name: swap_sanitized_name_and_normal() for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) if prefer_sanitized_name: swap_sanitized_name_and_normal() if (not keep_license_text): t_data = _replace_license_text(t_data) try: pattern = '.*AZ_RTTI\\(\\$\\{SanitizedCppName\\}Module, \\"(?P<ModuleClassId>\\{.*-.*-.*-.*-.*\\})\\",' module_class_id = re.search(pattern, t_data).group('ModuleClassId') replacements.append((module_class_id, '${ModuleClassId}')) t_data = t_data.replace(module_class_id, '${ModuleClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}SystemComponent, \\"(?P<SysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' sys_comp_class_id = re.search(pattern, t_data).group('SysCompClassId') replacements.append((sys_comp_class_id, '${SysCompClassId}')) t_data = t_data.replace(sys_comp_class_id, '${SysCompClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}EditorSystemComponent, \\"(?P<EditorSysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' editor_sys_comp_class_id = re.search(pattern, t_data).group('EditorSysCompClassId') replacements.append((editor_sys_comp_class_id, '${EditorSysCompClassId}')) t_data = t_data.replace(editor_sys_comp_class_id, '${EditorSysCompClassId}') except Exception as e: pass if ((s_data != t_data) or ('{BEGIN_LICENSE}' in t_data)): return (True, t_data) else: return (False, t_data) def _transform_restricted_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. 
This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs) except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel.as_posix()) destination_entry_rel = pathlib.Path(destination_entry_rel) if destination_entry_rel.as_posix().startswith('/'): destination_entry_rel = pathlib.Path(destination_entry_rel.as_posix().lstrip('/')) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if template_restricted_path: destination_entry_abs = ((((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) / 'Template') / destination_entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() else: destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if keep_restricted_in_template: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: restricted_platform_entries[restricted_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: if keep_restricted_in_template: create_dirs.append({'dir': destination_entry_rel}) else: restricted_platform_entries[restricted_platform]['createDirs'].append({'dir': destination_entry_rel}) _transform_restricted_into_copyfiles_and_createdirs(root_abs, entry_abs) def _transform_dir_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. 
This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs).as_posix() except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) found_platform = '' platform = False if ((not keep_restricted_in_template) and ('Platform' in entry_abs.parts)): platform = True try: pattern = '/Platform/(?P<Platform>[^/:*?\\"<>|\\r\\n]+/?)' found_platform = re.search(pattern, entry_abs.as_posix()).group('Platform') found_platform = found_platform.replace('/', '') except Exception as e: pass if (os.path.isfile(entry_abs) and (found_platform not in restricted_platform_entries)): found_platform = '' if ((found_platform not in restricted_platform_entries) and (found_platform in restricted_platforms)): restricted_platform_entries.update({found_platform: {'copyFiles': [], 'createDirs': []}}) if (platform and (found_platform in restricted_platforms)): if (not template_restricted_path): logger.warning('Restricted platform file found!!! {destination_entry_rel}, {found_platform}') continue for replacement in replacements: destination_entry_rel = destination_entry_rel.replace(replacement[0], replacement[1]) destination_entry_rel = destination_entry_rel.replace(f'Platform/{found_platform}', '') destination_entry_rel = destination_entry_rel.lstrip('/') if template_restricted_platform_relative_path: destination_entry_abs = (((((template_restricted_path / found_platform) / template_restricted_platform_relative_path) / template_name) / 'Template') / destination_entry_rel) else: destination_entry_abs = (((template_restricted_path / found_platform) / 'Template') / destination_entry_rel) else: destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if destination_entry_rel.startswith('/'): destination_entry_rel = destination_entry_rel.lstrip('/') os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: if 
(platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['createDirs'].append({'dir': destination_entry_rel}) else: create_dirs.append({'dir': destination_entry_rel}) _transform_dir_into_copyfiles_and_createdirs(root_abs, entry_abs) copy_files = [] create_dirs = [] restricted_platform_entries = {} _transform_dir_into_copyfiles_and_createdirs(source_path) if source_restricted_path: for restricted_platform in os.listdir(source_restricted_path): restricted_platform_src_path_abs = ((source_restricted_path / restricted_platform) / source_restricted_platform_relative_path) if os.path.isdir(restricted_platform_src_path_abs): if (restricted_platform not in restricted_platform_entries): restricted_platform_entries.update({restricted_platform: {'copyFiles': [], 'createDirs': []}}) _transform_restricted_into_copyfiles_and_createdirs(restricted_platform_src_path_abs) copy_files.sort(key=(lambda x: x['file'])) create_dirs.sort(key=(lambda x: x['dir'])) json_data = {} json_data.update({'template_name': template_name}) json_data.update({'origin': f'The primary repo for {template_name} goes here: i.e. http://www.mydomain.com'}) json_data.update({'license': f'What license {template_name} uses goes here: i.e. https://opensource.org/licenses/MIT'}) json_data.update({'display_name': template_name}) json_data.update({'summary': f'A short description of {template_name}.'}) json_data.update({'canonical_tags': []}) json_data.update({'user_tags': [f'{template_name}']}) json_data.update({'icon_path': 'preview.png'}) if ((not keep_restricted_in_template) and template_restricted_path): json_data.update({'restricted_name': template_restricted_name}) if (template_restricted_platform_relative_path != ''): json_data.update({'template_restricted_platform_relative_path': template_restricted_platform_relative_path.as_posix()}) json_data.update({'copyFiles': copy_files}) json_data.update({'createDirectories': create_dirs}) json_name = ((template_path / source_restricted_platform_relative_path) / 'template.json') with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) preview_png_src = ((this_script_parent / 'resources') / 'preview.png') preview_png_dst = (template_path / 'preview.png') if (not os.path.isfile(preview_png_dst)): shutil.copy(preview_png_src, preview_png_dst) if ((not keep_restricted_in_template) and (not template_restricted_path) and len(restricted_platform_entries)): logger.info(f'Restricted platform files found!!! and no template restricted path was found...') if ((not keep_restricted_in_template) and template_restricted_path): json_name = (template_restricted_path / 'restricted.json') if (not json_name.is_file()): json_data = {} json_data.update({'restricted_name': template_restricted_name}) os.makedirs(os.path.dirname(json_name), exist_ok=True) with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) for restricted_platform in restricted_platform_entries: restricted_template_path = ((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) restricted_platform_entries[restricted_platform]['copyFiles'].sort(key=(lambda x: x['file'])) restricted_platform_entries[restricted_platform]['createDirs'].sort(key=(lambda x: x['dir'])) json_data = {} json_data.update({'restricted_name': template_name}) json_data.update({'template_name': template_name}) json_data.update({'origin': f'The primary repo for {template_name} goes here: i.e. 
http://www.mydomain.com'}) json_data.update({'license': f'What license {template_name} uses goes here: i.e. https://opensource.org/licenses/MIT'}) json_data.update({'display_name': template_name}) json_data.update({'summary': f'A short description of {template_name}.'}) json_data.update({'canonical_tags': [f'{restricted_platform}']}) json_data.update({'user_tags': [f'{template_name}']}) json_data.update({'copyFiles': restricted_platform_entries[restricted_platform]['copyFiles']}) json_data.update({'createDirectories': restricted_platform_entries[restricted_platform]['createDirs']}) json_name = (restricted_template_path / 'template.json') os.makedirs(os.path.dirname(json_name), exist_ok=True) with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) if (not no_register): if register.register(restricted_path=template_restricted_path): logger.error(f'Failed to register the restricted {template_restricted_path}.') return 1 return (register.register(template_path=template_path) if (not no_register) else 0)
Create a template from a source directory using replacement :param source_path: The path to the source that you want to make into a template :param template_path: the path of the template to create, can be absolute or relative to default templates path :param source_name: Name to replace within template folder with ${Name} placeholder If not specified, the basename of the source_path parameter is used as the source_name instead :param source_restricted_path: path to the source restricted folder :param source_restricted_name: name of the source restricted folder :param template_restricted_path: path to the templates restricted folder :param template_restricted_name: name of the templates restricted folder :param source_restricted_platform_relative_path: any path after the platform in the source restricted :param template_restricted_platform_relative_path: any path after the platform in the template restricted :param replace: optional list of strings uses to make templated parameters out of concrete names. X->Y pairs Ex. TestGem,${Name},TestGemPlayer,${Player} This will cause all references to 'TestGem' be replaced by ${Name}, and all 'TestGemPlayer' replaced by ${Player} Note these replacements are executed in order, so if you have larger matches, do them first, i.e. TestGemPlayer,${Player},TestGem,${Name} TestGemPlayer will get matched first and become ${Player} and will not become ${Name}Player :param keep_restricted_in_template: whether or not you want to keep the templates restricted in your template. :param keep_license_text: whether or not you want to keep the templates license text in your instance. Templated files can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most people will not want license text in their instances. :param force Overrides existing files even if they exist :param no_register: whether or not after completion that the new object is registered :return: 0 for success or non 0 failure code
scripts/o3de/o3de/engine_template.py
create_template
SparkyStudios/o3de
11
python
def create_template(source_path: pathlib.Path, template_path: pathlib.Path, source_name: str=None, source_restricted_path: pathlib.Path=None, source_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, source_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_template: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False) -> int: "\n Create a template from a source directory using replacement\n\n :param source_path: The path to the source that you want to make into a template\n :param template_path: the path of the template to create, can be absolute or relative to default templates path\n :param source_name: Name to replace within template folder with ${Name} placeholder\n If not specified, the basename of the source_path parameter is used as the source_name instead\n :param source_restricted_path: path to the source restricted folder\n :param source_restricted_name: name of the source restricted folder\n :param template_restricted_path: path to the templates restricted folder\n :param template_restricted_name: name of the templates restricted folder\n :param source_restricted_platform_relative_path: any path after the platform in the source restricted\n :param template_restricted_platform_relative_path: any path after the platform in the template restricted\n :param replace: optional list of strings uses to make templated parameters out of concrete names. X->Y pairs\n Ex. TestGem,${Name},TestGemPlayer,${Player}\n This will cause all references to 'TestGem' be replaced by ${Name}, and all 'TestGemPlayer' replaced by ${Player}\n Note these replacements are executed in order, so if you have larger matches, do them first, i.e.\n TestGemPlayer,${Player},TestGem,${Name}\n TestGemPlayer will get matched first and become ${Player} and will not become ${Name}Player\n :param keep_restricted_in_template: whether or not you want to keep the templates restricted in your template.\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n Templated files can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most people will not want license text in their instances.\n :param force Overrides existing files even if they exist\n :param no_register: whether or not after completion that the new object is registered\n :return: 0 for success or non 0 failure code\n " if (not source_path): logger.error('Src path cannot be empty.') return 1 if (not source_path.is_dir()): logger.error(f'Src path {source_path} is not a folder.') return 1 source_path = source_path.resolve() if (not source_name): source_name = os.path.basename(source_path) sanitized_source_name = utils.sanitize_identifier_for_cpp(source_name) if (not template_path): logger.info(f'Template path empty. Using source name {source_name}') template_path = source_name if (not template_path.is_absolute()): default_templates_folder = manifest.get_registered(default_folder='templates') template_path = (default_templates_folder / source_name) logger.info(f'Template path empty. 
Using default templates folder {template_path}') if ((not force) and template_path.is_dir() and len(list(template_path.iterdir()))): logger.error(f'Template path {template_path} already exists.') return 1 try: template_path.relative_to(source_path) except ValueError: pass else: logger.error(f'Template output path {template_path} cannot be a subdirectory of the source_path {source_path} ') return 1 template_name = os.path.basename(template_path) if (template_name in restricted_platforms): logger.error(f'Template path cannot be a restricted name. {template_name}') return 1 if (source_restricted_name and (not source_restricted_path)): source_restricted_path = manifest.get_registered(restricted_name=source_restricted_name) if source_restricted_path: if (not source_restricted_path.is_dir()): logger.error(f'Source restricted path {source_restricted_path} is not a folder.') return 1 restricted_json = (source_restricted_path / 'restricted.json') if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {restricted_json}: ' + str(e))) return 1 try: source_restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read restricted_name from {restricted_json}') return 1 if (template_restricted_name and (not template_restricted_path)): template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) if (not template_restricted_name): template_restricted_name = template_name if template_restricted_path: if template_restricted_path.is_dir(): restricted_json = (template_restricted_path / 'restricted.json') if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'{restricted_json} is not valid.') return 1 with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {restricted_json}: ' + str(e))) return 1 try: template_restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read restricted_name from {restricted_json}') return 1 else: os.makedirs(template_restricted_path, exist_ok=True) if (not source_restricted_platform_relative_path): source_restricted_platform_relative_path = if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = logger.info(f'Processing Src: {source_path}') replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) os.makedirs(template_path, exist_ok=True) replacements.append((source_name.lower(), '${NameLower}')) replacements.append((source_name.upper(), '${NameUpper}')) replacements.append((source_name, '${Name}')) replacements.append((sanitized_source_name, '${SanitizedCppName}')) sanitized_name_index = (len(replacements) - 1) def _is_cpp_file(file_path: pathlib.Path) -> bool: '\n Internal helper method to check if a file is a C++ file based\n on its extension, so we can determine if we need to prefer\n the ${SanitizedCppName}\n :param file_path: The input file path\n :return: bool: Whether or not the input file path has a C++ extension\n ' (name, ext) = os.path.splitext(file_path) return (ext.lower() in cpp_file_ext) def _transform_into_template(s_data: object, prefer_sanitized_name: bool=False) -> (bool, str): "\n 
Internal function to transform any data into templated data\n :param s_data: the input data, this could be file data or file name data\n :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name\n This can be necessary when creating the template, the source\n name and sanitized source name might be the same, but C++\n files will need to prefer the sanitized version, or else\n there might be compile errors (e.g. '-' characters in the name)\n :return: bool: whether or not the returned data MAY need to be transformed to instantiate it\n t_data: potentially transformed data 0 for success or non 0 failure code\n " def swap_sanitized_name_and_normal(): (replacements[(sanitized_name_index - 1)], replacements[sanitized_name_index]) = (replacements[sanitized_name_index], replacements[(sanitized_name_index - 1)]) t_data = str(s_data) if prefer_sanitized_name: swap_sanitized_name_and_normal() for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) if prefer_sanitized_name: swap_sanitized_name_and_normal() if (not keep_license_text): t_data = _replace_license_text(t_data) try: pattern = '.*AZ_RTTI\\(\\$\\{SanitizedCppName\\}Module, \\"(?P<ModuleClassId>\\{.*-.*-.*-.*-.*\\})\\",' module_class_id = re.search(pattern, t_data).group('ModuleClassId') replacements.append((module_class_id, '${ModuleClassId}')) t_data = t_data.replace(module_class_id, '${ModuleClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}SystemComponent, \\"(?P<SysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' sys_comp_class_id = re.search(pattern, t_data).group('SysCompClassId') replacements.append((sys_comp_class_id, '${SysCompClassId}')) t_data = t_data.replace(sys_comp_class_id, '${SysCompClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}EditorSystemComponent, \\"(?P<EditorSysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' editor_sys_comp_class_id = re.search(pattern, t_data).group('EditorSysCompClassId') replacements.append((editor_sys_comp_class_id, '${EditorSysCompClassId}')) t_data = t_data.replace(editor_sys_comp_class_id, '${EditorSysCompClassId}') except Exception as e: pass if ((s_data != t_data) or ('{BEGIN_LICENSE}' in t_data)): return (True, t_data) else: return (False, t_data) def _transform_restricted_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. 
This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs) except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel.as_posix()) destination_entry_rel = pathlib.Path(destination_entry_rel) if destination_entry_rel.as_posix().startswith('/'): destination_entry_rel = pathlib.Path(destination_entry_rel.as_posix().lstrip('/')) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if template_restricted_path: destination_entry_abs = ((((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) / 'Template') / destination_entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() else: destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if keep_restricted_in_template: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: restricted_platform_entries[restricted_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: if keep_restricted_in_template: create_dirs.append({'dir': destination_entry_rel}) else: restricted_platform_entries[restricted_platform]['createDirs'].append({'dir': destination_entry_rel}) _transform_restricted_into_copyfiles_and_createdirs(root_abs, entry_abs) def _transform_dir_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. 
This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs).as_posix() except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) found_platform = platform = False if ((not keep_restricted_in_template) and ('Platform' in entry_abs.parts)): platform = True try: pattern = '/Platform/(?P<Platform>[^/:*?\\"<>|\\r\\n]+/?)' found_platform = re.search(pattern, entry_abs.as_posix()).group('Platform') found_platform = found_platform.replace('/', ) except Exception as e: pass if (os.path.isfile(entry_abs) and (found_platform not in restricted_platform_entries)): found_platform = if ((found_platform not in restricted_platform_entries) and (found_platform in restricted_platforms)): restricted_platform_entries.update({found_platform: {'copyFiles': [], 'createDirs': []}}) if (platform and (found_platform in restricted_platforms)): if (not template_restricted_path): logger.warning('Restricted platform file found!!! {destination_entry_rel}, {found_platform}') continue for replacement in replacements: destination_entry_rel = destination_entry_rel.replace(replacement[0], replacement[1]) destination_entry_rel = destination_entry_rel.replace(f'Platform/{found_platform}', ) destination_entry_rel = destination_entry_rel.lstrip('/') if template_restricted_platform_relative_path: destination_entry_abs = (((((template_restricted_path / found_platform) / template_restricted_platform_relative_path) / template_name) / 'Template') / destination_entry_rel) else: destination_entry_abs = (((template_restricted_path / found_platform) / 'Template') / destination_entry_rel) else: destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if destination_entry_rel.startswith('/'): destination_entry_rel = destination_entry_rel.lstrip('/') os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: if (platform 
and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['createDirs'].append({'dir': destination_entry_rel}) else: create_dirs.append({'dir': destination_entry_rel}) _transform_dir_into_copyfiles_and_createdirs(root_abs, entry_abs) copy_files = [] create_dirs = [] restricted_platform_entries = {} _transform_dir_into_copyfiles_and_createdirs(source_path) if source_restricted_path: for restricted_platform in os.listdir(source_restricted_path): restricted_platform_src_path_abs = ((source_restricted_path / restricted_platform) / source_restricted_platform_relative_path) if os.path.isdir(restricted_platform_src_path_abs): if (restricted_platform not in restricted_platform_entries): restricted_platform_entries.update({restricted_platform: {'copyFiles': [], 'createDirs': []}}) _transform_restricted_into_copyfiles_and_createdirs(restricted_platform_src_path_abs) copy_files.sort(key=(lambda x: x['file'])) create_dirs.sort(key=(lambda x: x['dir'])) json_data = {} json_data.update({'template_name': template_name}) json_data.update({'origin': f'The primary repo for {template_name} goes here: i.e. http://www.mydomain.com'}) json_data.update({'license': f'What license {template_name} uses goes here: i.e. https://opensource.org/licenses/MIT'}) json_data.update({'display_name': template_name}) json_data.update({'summary': f'A short description of {template_name}.'}) json_data.update({'canonical_tags': []}) json_data.update({'user_tags': [f'{template_name}']}) json_data.update({'icon_path': 'preview.png'}) if ((not keep_restricted_in_template) and template_restricted_path): json_data.update({'restricted_name': template_restricted_name}) if (template_restricted_platform_relative_path != ): json_data.update({'template_restricted_platform_relative_path': template_restricted_platform_relative_path.as_posix()}) json_data.update({'copyFiles': copy_files}) json_data.update({'createDirectories': create_dirs}) json_name = ((template_path / source_restricted_platform_relative_path) / 'template.json') with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) preview_png_src = ((this_script_parent / 'resources') / 'preview.png') preview_png_dst = (template_path / 'preview.png') if (not os.path.isfile(preview_png_dst)): shutil.copy(preview_png_src, preview_png_dst) if ((not keep_restricted_in_template) and (not template_restricted_path) and len(restricted_platform_entries)): logger.info(f'Restricted platform files found!!! and no template restricted path was found...') if ((not keep_restricted_in_template) and template_restricted_path): json_name = (template_restricted_path / 'restricted.json') if (not json_name.is_file()): json_data = {} json_data.update({'restricted_name': template_restricted_name}) os.makedirs(os.path.dirname(json_name), exist_ok=True) with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) for restricted_platform in restricted_platform_entries: restricted_template_path = ((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) restricted_platform_entries[restricted_platform]['copyFiles'].sort(key=(lambda x: x['file'])) restricted_platform_entries[restricted_platform]['createDirs'].sort(key=(lambda x: x['dir'])) json_data = {} json_data.update({'restricted_name': template_name}) json_data.update({'template_name': template_name}) json_data.update({'origin': f'The primary repo for {template_name} goes here: i.e. 
http://www.mydomain.com'}) json_data.update({'license': f'What license {template_name} uses goes here: i.e. https://opensource.org/licenses/MIT'}) json_data.update({'display_name': template_name}) json_data.update({'summary': f'A short description of {template_name}.'}) json_data.update({'canonical_tags': [f'{restricted_platform}']}) json_data.update({'user_tags': [f'{template_name}']}) json_data.update({'copyFiles': restricted_platform_entries[restricted_platform]['copyFiles']}) json_data.update({'createDirectories': restricted_platform_entries[restricted_platform]['createDirs']}) json_name = (restricted_template_path / 'template.json') os.makedirs(os.path.dirname(json_name), exist_ok=True) with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) if (not no_register): if register.register(restricted_path=template_restricted_path): logger.error(f'Failed to register the restricted {template_restricted_path}.') return 1 return (register.register(template_path=template_path) if (not no_register) else 0)
def create_template(source_path: pathlib.Path, template_path: pathlib.Path, source_name: str=None, source_restricted_path: pathlib.Path=None, source_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, source_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_template: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False) -> int: "\n Create a template from a source directory using replacement\n\n :param source_path: The path to the source that you want to make into a template\n :param template_path: the path of the template to create, can be absolute or relative to default templates path\n :param source_name: Name to replace within template folder with ${Name} placeholder\n If not specified, the basename of the source_path parameter is used as the source_name instead\n :param source_restricted_path: path to the source restricted folder\n :param source_restricted_name: name of the source restricted folder\n :param template_restricted_path: path to the templates restricted folder\n :param template_restricted_name: name of the templates restricted folder\n :param source_restricted_platform_relative_path: any path after the platform in the source restricted\n :param template_restricted_platform_relative_path: any path after the platform in the template restricted\n :param replace: optional list of strings uses to make templated parameters out of concrete names. X->Y pairs\n Ex. TestGem,${Name},TestGemPlayer,${Player}\n This will cause all references to 'TestGem' be replaced by ${Name}, and all 'TestGemPlayer' replaced by ${Player}\n Note these replacements are executed in order, so if you have larger matches, do them first, i.e.\n TestGemPlayer,${Player},TestGem,${Name}\n TestGemPlayer will get matched first and become ${Player} and will not become ${Name}Player\n :param keep_restricted_in_template: whether or not you want to keep the templates restricted in your template.\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n Templated files can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most people will not want license text in their instances.\n :param force Overrides existing files even if they exist\n :param no_register: whether or not after completion that the new object is registered\n :return: 0 for success or non 0 failure code\n " if (not source_path): logger.error('Src path cannot be empty.') return 1 if (not source_path.is_dir()): logger.error(f'Src path {source_path} is not a folder.') return 1 source_path = source_path.resolve() if (not source_name): source_name = os.path.basename(source_path) sanitized_source_name = utils.sanitize_identifier_for_cpp(source_name) if (not template_path): logger.info(f'Template path empty. Using source name {source_name}') template_path = source_name if (not template_path.is_absolute()): default_templates_folder = manifest.get_registered(default_folder='templates') template_path = (default_templates_folder / source_name) logger.info(f'Template path empty. 
Using default templates folder {template_path}') if ((not force) and template_path.is_dir() and len(list(template_path.iterdir()))): logger.error(f'Template path {template_path} already exists.') return 1 try: template_path.relative_to(source_path) except ValueError: pass else: logger.error(f'Template output path {template_path} cannot be a subdirectory of the source_path {source_path} ') return 1 template_name = os.path.basename(template_path) if (template_name in restricted_platforms): logger.error(f'Template path cannot be a restricted name. {template_name}') return 1 if (source_restricted_name and (not source_restricted_path)): source_restricted_path = manifest.get_registered(restricted_name=source_restricted_name) if source_restricted_path: if (not source_restricted_path.is_dir()): logger.error(f'Source restricted path {source_restricted_path} is not a folder.') return 1 restricted_json = (source_restricted_path / 'restricted.json') if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {restricted_json}: ' + str(e))) return 1 try: source_restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read restricted_name from {restricted_json}') return 1 if (template_restricted_name and (not template_restricted_path)): template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) if (not template_restricted_name): template_restricted_name = template_name if template_restricted_path: if template_restricted_path.is_dir(): restricted_json = (template_restricted_path / 'restricted.json') if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'{restricted_json} is not valid.') return 1 with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error((f'Failed to load {restricted_json}: ' + str(e))) return 1 try: template_restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read restricted_name from {restricted_json}') return 1 else: os.makedirs(template_restricted_path, exist_ok=True) if (not source_restricted_platform_relative_path): source_restricted_platform_relative_path = if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = logger.info(f'Processing Src: {source_path}') replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) os.makedirs(template_path, exist_ok=True) replacements.append((source_name.lower(), '${NameLower}')) replacements.append((source_name.upper(), '${NameUpper}')) replacements.append((source_name, '${Name}')) replacements.append((sanitized_source_name, '${SanitizedCppName}')) sanitized_name_index = (len(replacements) - 1) def _is_cpp_file(file_path: pathlib.Path) -> bool: '\n Internal helper method to check if a file is a C++ file based\n on its extension, so we can determine if we need to prefer\n the ${SanitizedCppName}\n :param file_path: The input file path\n :return: bool: Whether or not the input file path has a C++ extension\n ' (name, ext) = os.path.splitext(file_path) return (ext.lower() in cpp_file_ext) def _transform_into_template(s_data: object, prefer_sanitized_name: bool=False) -> (bool, str): "\n 
Internal function to transform any data into templated data\n :param s_data: the input data, this could be file data or file name data\n :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name\n This can be necessary when creating the template, the source\n name and sanitized source name might be the same, but C++\n files will need to prefer the sanitized version, or else\n there might be compile errors (e.g. '-' characters in the name)\n :return: bool: whether or not the returned data MAY need to be transformed to instantiate it\n t_data: potentially transformed data 0 for success or non 0 failure code\n " def swap_sanitized_name_and_normal(): (replacements[(sanitized_name_index - 1)], replacements[sanitized_name_index]) = (replacements[sanitized_name_index], replacements[(sanitized_name_index - 1)]) t_data = str(s_data) if prefer_sanitized_name: swap_sanitized_name_and_normal() for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) if prefer_sanitized_name: swap_sanitized_name_and_normal() if (not keep_license_text): t_data = _replace_license_text(t_data) try: pattern = '.*AZ_RTTI\\(\\$\\{SanitizedCppName\\}Module, \\"(?P<ModuleClassId>\\{.*-.*-.*-.*-.*\\})\\",' module_class_id = re.search(pattern, t_data).group('ModuleClassId') replacements.append((module_class_id, '${ModuleClassId}')) t_data = t_data.replace(module_class_id, '${ModuleClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}SystemComponent, \\"(?P<SysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' sys_comp_class_id = re.search(pattern, t_data).group('SysCompClassId') replacements.append((sys_comp_class_id, '${SysCompClassId}')) t_data = t_data.replace(sys_comp_class_id, '${SysCompClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}EditorSystemComponent, \\"(?P<EditorSysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' editor_sys_comp_class_id = re.search(pattern, t_data).group('EditorSysCompClassId') replacements.append((editor_sys_comp_class_id, '${EditorSysCompClassId}')) t_data = t_data.replace(editor_sys_comp_class_id, '${EditorSysCompClassId}') except Exception as e: pass if ((s_data != t_data) or ('{BEGIN_LICENSE}' in t_data)): return (True, t_data) else: return (False, t_data) def _transform_restricted_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. 
This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs) except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel.as_posix()) destination_entry_rel = pathlib.Path(destination_entry_rel) if destination_entry_rel.as_posix().startswith('/'): destination_entry_rel = pathlib.Path(destination_entry_rel.as_posix().lstrip('/')) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if template_restricted_path: destination_entry_abs = ((((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) / 'Template') / destination_entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() else: destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if keep_restricted_in_template: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: restricted_platform_entries[restricted_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: if keep_restricted_in_template: create_dirs.append({'dir': destination_entry_rel}) else: restricted_platform_entries[restricted_platform]['createDirs'].append({'dir': destination_entry_rel}) _transform_restricted_into_copyfiles_and_createdirs(root_abs, entry_abs) def _transform_dir_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. 
This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs).as_posix() except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) found_platform = platform = False if ((not keep_restricted_in_template) and ('Platform' in entry_abs.parts)): platform = True try: pattern = '/Platform/(?P<Platform>[^/:*?\\"<>|\\r\\n]+/?)' found_platform = re.search(pattern, entry_abs.as_posix()).group('Platform') found_platform = found_platform.replace('/', ) except Exception as e: pass if (os.path.isfile(entry_abs) and (found_platform not in restricted_platform_entries)): found_platform = if ((found_platform not in restricted_platform_entries) and (found_platform in restricted_platforms)): restricted_platform_entries.update({found_platform: {'copyFiles': [], 'createDirs': []}}) if (platform and (found_platform in restricted_platforms)): if (not template_restricted_path): logger.warning('Restricted platform file found!!! {destination_entry_rel}, {found_platform}') continue for replacement in replacements: destination_entry_rel = destination_entry_rel.replace(replacement[0], replacement[1]) destination_entry_rel = destination_entry_rel.replace(f'Platform/{found_platform}', ) destination_entry_rel = destination_entry_rel.lstrip('/') if template_restricted_platform_relative_path: destination_entry_abs = (((((template_restricted_path / found_platform) / template_restricted_platform_relative_path) / template_name) / 'Template') / destination_entry_rel) else: destination_entry_abs = (((template_restricted_path / found_platform) / 'Template') / destination_entry_rel) else: destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if destination_entry_rel.startswith('/'): destination_entry_rel = destination_entry_rel.lstrip('/') os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: if (platform 
and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['createDirs'].append({'dir': destination_entry_rel}) else: create_dirs.append({'dir': destination_entry_rel}) _transform_dir_into_copyfiles_and_createdirs(root_abs, entry_abs) copy_files = [] create_dirs = [] restricted_platform_entries = {} _transform_dir_into_copyfiles_and_createdirs(source_path) if source_restricted_path: for restricted_platform in os.listdir(source_restricted_path): restricted_platform_src_path_abs = ((source_restricted_path / restricted_platform) / source_restricted_platform_relative_path) if os.path.isdir(restricted_platform_src_path_abs): if (restricted_platform not in restricted_platform_entries): restricted_platform_entries.update({restricted_platform: {'copyFiles': [], 'createDirs': []}}) _transform_restricted_into_copyfiles_and_createdirs(restricted_platform_src_path_abs) copy_files.sort(key=(lambda x: x['file'])) create_dirs.sort(key=(lambda x: x['dir'])) json_data = {} json_data.update({'template_name': template_name}) json_data.update({'origin': f'The primary repo for {template_name} goes here: i.e. http://www.mydomain.com'}) json_data.update({'license': f'What license {template_name} uses goes here: i.e. https://opensource.org/licenses/MIT'}) json_data.update({'display_name': template_name}) json_data.update({'summary': f'A short description of {template_name}.'}) json_data.update({'canonical_tags': []}) json_data.update({'user_tags': [f'{template_name}']}) json_data.update({'icon_path': 'preview.png'}) if ((not keep_restricted_in_template) and template_restricted_path): json_data.update({'restricted_name': template_restricted_name}) if (template_restricted_platform_relative_path != ): json_data.update({'template_restricted_platform_relative_path': template_restricted_platform_relative_path.as_posix()}) json_data.update({'copyFiles': copy_files}) json_data.update({'createDirectories': create_dirs}) json_name = ((template_path / source_restricted_platform_relative_path) / 'template.json') with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) preview_png_src = ((this_script_parent / 'resources') / 'preview.png') preview_png_dst = (template_path / 'preview.png') if (not os.path.isfile(preview_png_dst)): shutil.copy(preview_png_src, preview_png_dst) if ((not keep_restricted_in_template) and (not template_restricted_path) and len(restricted_platform_entries)): logger.info(f'Restricted platform files found!!! and no template restricted path was found...') if ((not keep_restricted_in_template) and template_restricted_path): json_name = (template_restricted_path / 'restricted.json') if (not json_name.is_file()): json_data = {} json_data.update({'restricted_name': template_restricted_name}) os.makedirs(os.path.dirname(json_name), exist_ok=True) with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) for restricted_platform in restricted_platform_entries: restricted_template_path = ((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) restricted_platform_entries[restricted_platform]['copyFiles'].sort(key=(lambda x: x['file'])) restricted_platform_entries[restricted_platform]['createDirs'].sort(key=(lambda x: x['dir'])) json_data = {} json_data.update({'restricted_name': template_name}) json_data.update({'template_name': template_name}) json_data.update({'origin': f'The primary repo for {template_name} goes here: i.e. 
http://www.mydomain.com'}) json_data.update({'license': f'What license {template_name} uses goes here: i.e. https://opensource.org/licenses/MIT'}) json_data.update({'display_name': template_name}) json_data.update({'summary': f'A short description of {template_name}.'}) json_data.update({'canonical_tags': [f'{restricted_platform}']}) json_data.update({'user_tags': [f'{template_name}']}) json_data.update({'copyFiles': restricted_platform_entries[restricted_platform]['copyFiles']}) json_data.update({'createDirectories': restricted_platform_entries[restricted_platform]['createDirs']}) json_name = (restricted_template_path / 'template.json') os.makedirs(os.path.dirname(json_name), exist_ok=True) with json_name.open('w') as s: s.write((json.dumps(json_data, indent=4) + '\n')) if (not no_register): if register.register(restricted_path=template_restricted_path): logger.error(f'Failed to register the restricted {template_restricted_path}.') return 1 return (register.register(template_path=template_path) if (not no_register) else 0)<|docstring|>Create a template from a source directory using replacement :param source_path: The path to the source that you want to make into a template :param template_path: the path of the template to create, can be absolute or relative to default templates path :param source_name: Name to replace within template folder with ${Name} placeholder If not specified, the basename of the source_path parameter is used as the source_name instead :param source_restricted_path: path to the source restricted folder :param source_restricted_name: name of the source restricted folder :param template_restricted_path: path to the templates restricted folder :param template_restricted_name: name of the templates restricted folder :param source_restricted_platform_relative_path: any path after the platform in the source restricted :param template_restricted_platform_relative_path: any path after the platform in the template restricted :param replace: optional list of strings uses to make templated parameters out of concrete names. X->Y pairs Ex. TestGem,${Name},TestGemPlayer,${Player} This will cause all references to 'TestGem' be replaced by ${Name}, and all 'TestGemPlayer' replaced by ${Player} Note these replacements are executed in order, so if you have larger matches, do them first, i.e. TestGemPlayer,${Player},TestGem,${Name} TestGemPlayer will get matched first and become ${Player} and will not become ${Name}Player :param keep_restricted_in_template: whether or not you want to keep the templates restricted in your template. :param keep_license_text: whether or not you want to keep the templates license text in your instance. Templated files can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most people will not want license text in their instances. :param force Overrides existing files even if they exist :param no_register: whether or not after completion that the new object is registered :return: 0 for success or non 0 failure code<|endoftext|>
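The create_template record above documents a flat X->Y `replace` list and the rule that longer matches must be listed first. A minimal invocation sketch follows; it assumes the `o3de.engine_template` module from the listed repository is importable, and the paths and gem names are placeholders rather than values taken from the record.

```python
# Sketch only: paths and gem names are illustrative assumptions, not record values.
import pathlib

from o3de import engine_template  # assumes the o3de package from the listed repo is on sys.path

ret = engine_template.create_template(
    source_path=pathlib.Path('/path/to/TestGem'),            # concrete source to turn into a template
    template_path=pathlib.Path('/path/to/TestGemTemplate'),  # where the template is written
    # Flat X->Y pairs; the longer match 'TestGemPlayer' comes first so it becomes
    # ${Player} instead of being partially rewritten to ${Name}Player.
    replace=['TestGemPlayer', '${Player}', 'TestGem', '${Name}'],
    keep_license_text=False,  # documented default: strip {BEGIN_LICENSE}...{END_LICENSE} blocks
    no_register=True,         # skip o3de registration of the new template
)
print('create_template returned', ret)  # 0 on success, non-zero on failure
```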
455a51dc701b9f4241e6ae3ecdebbcc18a3a9865aa16031303a4cb9eecf01b7b
def create_from_template(destination_path: pathlib.Path, template_path: pathlib.Path=None, template_name: str=None, destination_name: str=None, destination_restricted_path: pathlib.Path=None, destination_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, destination_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_instance: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False) -> int: "\n Generic template instantiation for non o3de object templates. This function makes NO assumptions!\n Assumptions are made only for specializations like create_project or create_gem etc... So this function\n will NOT try to divine intent.\n :param destination_path: the folder you want to instantiate the template into\n :param template_path: the path to the template you want to instance\n :param template_name: the name of the template you want to instance, resolves template_path\n :param destination_name: the name that will be substituted when instantiating the template.\n The placeholders of ${Name} and ${SanitizedCppName} will be replaced with destination name and a sanitized\n version of the destination name that is suitable as a C++ identifier. If not specified, defaults to the\n last path component of the destination_path\n :param destination_restricted_path: path to the projects restricted folder\n :param destination_restricted_name: name of the projects restricted folder, resolves destination_restricted_path\n :param template_restricted_path: path of the templates restricted folder\n :param template_restricted_name: name of the templates restricted folder, resolves template_restricted_path\n :param destination_restricted_platform_relative_path: any path after the platform in the destination restricted\n :param template_restricted_platform_relative_path: any path after the platform in the template restricted\n :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance\n or separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (destination_restricted_name and destination_restricted_path): logger.error(f'Destination Restricted Name and Destination Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_path) and (not template_name)): logger.error(f'Template Name or Template Path must be specified.') return 1 if template_name: template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'''Could not find the template path using name {template_name}. Has the engine been registered yet. It can be registered via the "o3de.py register --this-engine" command''') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is invalid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except KeyError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') else: template_restricted_name = template_json_restricted_name if (template_restricted_name or template_restricted_path): if template_restricted_name: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". 
Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: template_restricted_platform_relative_path = '' if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = '' if (not destination_path): logger.error('Destination path cannot be empty.') return 1 if ((not force) and os.path.isdir(destination_path)): logger.error(f'Destination path {destination_path} already exists.') return 1 else: os.makedirs(destination_path, exist_ok=force) if (not destination_name): destination_name = destination_path.name if (destination_name in restricted_platforms): logger.error(f'Destination path cannot be a restricted name. 
{destination_name}') return 1 if destination_restricted_name: destination_restricted_path = manifest.get_registered(restricted_name=destination_restricted_name) elif destination_restricted_path: if (not os.path.isabs(destination_restricted_path)): restricted_default_path = manifest.get_registered(default_folder='restricted') new_destination_restricted_path = ((restricted_default_path / 'Templates') / destination_restricted_path) logger.info(f'{destination_restricted_path} is not a full path, making it relative to default restricted path = {new_destination_restricted_path}') destination_restricted_path = new_destination_restricted_path else: restricted_default_path = manifest.get_registered(default_folder='restricted') new_destination_restricted_path = ((restricted_default_path / 'Templates') / destination_name) logger.info(f'--destination-restricted-path is not specified, using default restricted path / Templates / destination name = {new_destination_restricted_path}') destination_restricted_path = new_destination_restricted_path if (not destination_restricted_platform_relative_path): destination_restricted_platform_relative_path = '' replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(destination_name) replacements.append(('${Name}', destination_name)) replacements.append(('${NameUpper}', destination_name.upper())) replacements.append(('${NameLower}', destination_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if _instantiate_template(template_json_data, destination_name, template_name, destination_path, template_path, destination_restricted_path, template_restricted_path, destination_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_instance, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 if (not keep_restricted_in_instance): if destination_restricted_path: os.makedirs(destination_restricted_path, exist_ok=True) restricted_json = (destination_restricted_path / 'restricted.json') if (not os.path.isfile(restricted_json)): with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': destination_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) if (not no_register): if register.register(restricted_path=destination_restricted_path): logger.error(f'Failed to register the restricted {destination_restricted_path}.') return 1 logger.warning(f'Instantiation successful. NOTE: This is a generic instantiation of the template. If this was a template of an o3de object like a project, gem, template, etc. then the create-project or create-gem command can be used to register the object type via its project.json or gem.json, etc. Create from template is meant only to instance a template of a non o3de object.') return 0
Generic template instantiation for non o3de object templates. This function makes NO assumptions! Assumptions are made only for specializations like create_project or create_gem etc... So this function will NOT try to divine intent. :param destination_path: the folder you want to instantiate the template into :param template_path: the path to the template you want to instance :param template_name: the name of the template you want to instance, resolves template_path :param destination_name: the name that will be substituted when instantiating the template. The placeholders of ${Name} and ${SanitizedCppName} will be replaced with destination name and a sanitized version of the destination name that is suitable as a C++ identifier. If not specified, defaults to the last path component of the destination_path :param destination_restricted_path: path to the projects restricted folder :param destination_restricted_name: name of the projects restricted folder, resolves destination_restricted_path :param template_restricted_path: path of the templates restricted folder :param template_restricted_name: name of the templates restricted folder, resolves template_restricted_path :param destination_restricted_platform_relative_path: any path after the platform in the destination restricted :param template_restricted_platform_relative_path: any path after the platform in the template restricted :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :param replace: optional list of strings used to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} to be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param force: Overrides existing files even if they exist :return: 0 for success or non 0 failure code
scripts/o3de/o3de/engine_template.py
create_from_template
SparkyStudios/o3de
11
python
def create_from_template(destination_path: pathlib.Path, template_path: pathlib.Path=None, template_name: str=None, destination_name: str=None, destination_restricted_path: pathlib.Path=None, destination_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, destination_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_instance: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False) -> int: "\n Generic template instantiation for non o3de object templates. This function makes NO assumptions!\n Assumptions are made only for specializations like create_project or create_gem etc... So this function\n will NOT try to divine intent.\n :param destination_path: the folder you want to instantiate the template into\n :param template_path: the path to the template you want to instance\n :param template_name: the name of the template you want to instance, resolves template_path\n :param destination_name: the name that will be substituted when instantiating the template.\n The placeholders of ${Name} and ${SanitizedCppName} will be replaced with destination name and a sanitized\n version of the destination name that is suitable as a C++ identifier. If not specified, defaults to the\n last path component of the destination_path\n :param destination_restricted_path: path to the projects restricted folder\n :param destination_restricted_name: name of the projects restricted folder, resolves destination_restricted_path\n :param template_restricted_path: path of the templates restricted folder\n :param template_restricted_name: name of the templates restricted folder, resolves template_restricted_path\n :param destination_restricted_platform_relative_path: any path after the platform in the destination restricted\n :param template_restricted_platform_relative_path: any path after the platform in the template restricted\n :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance\n or separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (destination_restricted_name and destination_restricted_path): logger.error(f'Destination Restricted Name and Destination Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_path) and (not template_name)): logger.error(f'Template Name or Template Path must be specified.') return 1 if template_name: template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'Could not find the template path using name {template_name}. Has the engine been registered yet. It can be registered via the "o3de.py register --this-engine" command') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is invalid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except KeyError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') else: template_restricted_name = template_json_restricted_name if (template_restricted_name or template_restricted_path): if template_restricted_name: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". 
Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: template_restricted_platform_relative_path = if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = if (not destination_path): logger.error('Destination path cannot be empty.') return 1 if ((not force) and os.path.isdir(destination_path)): logger.error(f'Destination path {destination_path} already exists.') return 1 else: os.makedirs(destination_path, exist_ok=force) if (not destination_name): destination_name = destination_path.name if (destination_name in restricted_platforms): logger.error(f'Destination path cannot be a restricted name. 
{destination_name}') return 1 if destination_restricted_name: destination_restricted_path = manifest.get_registered(restricted_name=destination_restricted_name) elif destination_restricted_path: if (not os.path.isabs(destination_restricted_path)): restricted_default_path = manifest.get_registered(default_folder='restricted') new_destination_restricted_path = ((restricted_default_path / 'Templates') / destination_restricted_path) logger.info(f'{destination_restricted_path} is not a full path, making it relative to default restricted path = {new_destination_restricted_path}') destination_restricted_path = new_destination_restricted_path else: restricted_default_path = manifest.get_registered(default_folder='restricted') new_destination_restricted_path = ((restricted_default_path / 'Templates') / destination_name) logger.info(f'--destination-restricted-path is not specified, using default restricted path / Templates / destination name = {new_destination_restricted_path}') destination_restricted_path = new_destination_restricted_path if (not destination_restricted_platform_relative_path): destination_restricted_platform_relative_path = replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(destination_name) replacements.append(('${Name}', destination_name)) replacements.append(('${NameUpper}', destination_name.upper())) replacements.append(('${NameLower}', destination_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if _instantiate_template(template_json_data, destination_name, template_name, destination_path, template_path, destination_restricted_path, template_restricted_path, destination_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_instance, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 if (not keep_restricted_in_instance): if destination_restricted_path: os.makedirs(destination_restricted_path, exist_ok=True) restricted_json = (destination_restricted_path / 'restricted.json') if (not os.path.isfile(restricted_json)): with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': destination_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) if (not no_register): if register.register(restricted_path=destination_restricted_path): logger.error(f'Failed to register the restricted {destination_restricted_path}.') return 1 logger.warning(f'Instantiation successful. NOTE: This is a generic instantiation of the template. If this was a template of an o3de object like a project, gem, template, etc. then the create-project or create-gem command can be used to register the object type via its project.json or gem.json, etc. Create from template is meant only to instance a template of a non o3de object.') return 0
def create_from_template(destination_path: pathlib.Path, template_path: pathlib.Path=None, template_name: str=None, destination_name: str=None, destination_restricted_path: pathlib.Path=None, destination_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, destination_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_instance: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False) -> int: "\n Generic template instantiation for non o3de object templates. This function makes NO assumptions!\n Assumptions are made only for specializations like create_project or create_gem etc... So this function\n will NOT try to divine intent.\n :param destination_path: the folder you want to instantiate the template into\n :param template_path: the path to the template you want to instance\n :param template_name: the name of the template you want to instance, resolves template_path\n :param destination_name: the name that will be substituted when instantiating the template.\n The placeholders of ${Name} and ${SanitizedCppName} will be replaced with destination name and a sanitized\n version of the destination name that is suitable as a C++ identifier. If not specified, defaults to the\n last path component of the destination_path\n :param destination_restricted_path: path to the projects restricted folder\n :param destination_restricted_name: name of the projects restricted folder, resolves destination_restricted_path\n :param template_restricted_path: path of the templates restricted folder\n :param template_restricted_name: name of the templates restricted folder, resolves template_restricted_path\n :param destination_restricted_platform_relative_path: any path after the platform in the destination restricted\n :param template_restricted_platform_relative_path: any path after the platform in the template restricted\n :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance\n or separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (destination_restricted_name and destination_restricted_path): logger.error(f'Destination Restricted Name and Destination Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_path) and (not template_name)): logger.error(f'Template Name or Template Path must be specified.') return 1 if template_name: template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'Could not find the template path using name {template_name}. Has the engine been registered yet. It can be registered via the "o3de.py register --this-engine" command') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is invalid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except KeyError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') else: template_restricted_name = template_json_restricted_name if (template_restricted_name or template_restricted_path): if template_restricted_name: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". 
Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: template_restricted_platform_relative_path = if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = if (not destination_path): logger.error('Destination path cannot be empty.') return 1 if ((not force) and os.path.isdir(destination_path)): logger.error(f'Destination path {destination_path} already exists.') return 1 else: os.makedirs(destination_path, exist_ok=force) if (not destination_name): destination_name = destination_path.name if (destination_name in restricted_platforms): logger.error(f'Destination path cannot be a restricted name. 
{destination_name}') return 1 if destination_restricted_name: destination_restricted_path = manifest.get_registered(restricted_name=destination_restricted_name) elif destination_restricted_path: if (not os.path.isabs(destination_restricted_path)): restricted_default_path = manifest.get_registered(default_folder='restricted') new_destination_restricted_path = ((restricted_default_path / 'Templates') / destination_restricted_path) logger.info(f'{destination_restricted_path} is not a full path, making it relative to default restricted path = {new_destination_restricted_path}') destination_restricted_path = new_destination_restricted_path else: restricted_default_path = manifest.get_registered(default_folder='restricted') new_destination_restricted_path = ((restricted_default_path / 'Templates') / destination_name) logger.info(f'--destination-restricted-path is not specified, using default restricted path / Templates / destination name = {new_destination_restricted_path}') destination_restricted_path = new_destination_restricted_path if (not destination_restricted_platform_relative_path): destination_restricted_platform_relative_path = replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(destination_name) replacements.append(('${Name}', destination_name)) replacements.append(('${NameUpper}', destination_name.upper())) replacements.append(('${NameLower}', destination_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if _instantiate_template(template_json_data, destination_name, template_name, destination_path, template_path, destination_restricted_path, template_restricted_path, destination_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_instance, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 if (not keep_restricted_in_instance): if destination_restricted_path: os.makedirs(destination_restricted_path, exist_ok=True) restricted_json = (destination_restricted_path / 'restricted.json') if (not os.path.isfile(restricted_json)): with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': destination_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) if (not no_register): if register.register(restricted_path=destination_restricted_path): logger.error(f'Failed to register the restricted {destination_restricted_path}.') return 1 logger.warning(f'Instantiation successful. NOTE: This is a generic instantiation of the template. If this was a template of an o3de object like a project, gem, template, etc. then the create-project or create-gem command can be used to register the object type via its project.json or gem.json, etc. Create from template is meant only to instance a template of a non o3de object.') return 0<|docstring|>Generic template instantiation for non o3de object templates. This function makes NO assumptions! Assumptions are made only for specializations like create_project or create_gem etc... So this function will NOT try to divine intent. 
:param destination_path: the folder you want to instantiate the template into :param template_path: the path to the template you want to instance :param template_name: the name of the template you want to instance, resolves template_path :param destination_name: the name that will be substituted when instantiating the template. The placeholders of ${Name} and ${SanitizedCppName} will be replaced with destination name and a sanitized version of the destination name that is suitable as a C++ identifier. If not specified, defaults to the last path component of the destination_path :param destination_restricted_path: path to the projects restricted folder :param destination_restricted_name: name of the projects restricted folder, resolves destination_restricted_path :param template_restricted_path: path of the templates restricted folder :param template_restricted_name: name of the templates restricted folder, resolves template_restricted_path :param destination_restricted_platform_relative_path: any path after the platform in the destination restricted :param template_restricted_platform_relative_path: any path after the platform in the template restricted :param keep_restricted_in_instance: whether or not you want to keep the templates restricted files in your instance or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param force Overrides existing files even if they exist :return: 0 for success or non 0 failure code<|endoftext|>
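For context, the create_from_template entry point documented in the record above resolves most of its inputs from the template's template.json and is most easily driven with keyword arguments. The following is a minimal usage sketch, assuming the o3de scripts package is importable as o3de.engine_template (inferred from the path field of the neighbouring records) and that the engine and a template named "MinimalTemplate" have already been registered; the destination folder and the ${TeamName} placeholder are hypothetical.

# Sketch: instantiate a registered, non-o3de-object template.
# Assumptions: the o3de scripts package is importable as o3de.engine_template
# and a template named "MinimalTemplate" has already been registered.
import pathlib

from o3de import engine_template  # assumed import path (scripts/o3de/o3de/engine_template.py)

result = engine_template.create_from_template(
    destination_path=pathlib.Path('instances/MyInstance'),  # hypothetical destination folder
    template_name='MinimalTemplate',                        # hypothetical registered template name
    keep_restricted_in_instance=True,    # keep restricted files inline rather than splitting them out
    keep_license_text=False,             # drop {BEGIN_LICENSE}...{END_LICENSE} blocks from the instance
    replace=['${TeamName}', 'ExampleTeam'],  # extra X->Y substitution pairs, flattened into one list
)
if result != 0:
    raise SystemExit(f'create_from_template failed with code {result}')

Per the docstring above, ${Name}, ${NameUpper}, ${NameLower} and ${SanitizedCppName} are filled in automatically from the destination name, so only template-specific placeholders such as the hypothetical ${TeamName} need to be passed through replace.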
3b614afa580d188c08130a66b10cc8ee608048d999c5a57a7e2fa1731b8f86d7
def create_project(project_path: pathlib.Path, project_name: str=None, template_path: pathlib.Path=None, template_name: str=None, project_restricted_path: pathlib.Path=None, project_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, project_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_project: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False, system_component_class_id: str=None, editor_system_component_class_id: str=None, module_id: str=None, project_id: str=None) -> int: "\n Template instantiation specialization that makes all default assumptions for a Project template instantiation,\n reducing the effort needed in instancing a project\n :param project_path: the project path, can be absolute or relative to default projects path\n :param project_name: the project name, defaults to project_path basename if not provided\n :param template_path: the path to the template you want to instance, can be absolute or relative to default templates path\n :param template_name: the name the registered template you want to instance, defaults to DefaultProject, resolves template_path\n :param project_restricted_path: path to the projects restricted folder, can be absolute or relative to the restricted='projects'\n :param project_restricted_name: name of the registered projects restricted path, resolves project_restricted_path\n :param template_restricted_path: templates restricted path can be absolute or relative to restricted='templates'\n :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path\n :param project_restricted_platform_relative_path: any path after the platform to append to the project_restricted_path\n :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path\n :param keep_restricted_in_project: whether or not you want to keep the templates restricted files in your project or\n separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :param no_register: whether or not after completion that the new object is registered\n :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid\n :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is\n random uuid\n :param module_id: optionally specify a uuid for the module class, default is random uuid\n :param project_id: optionally specify a str for the project id, default is random uuid\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (project_restricted_name and project_restricted_path): logger.error(f'Project Restricted Name and Project Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_path) and (not template_name)): template_name = 'DefaultProject' if (template_name and (not template_path)): template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'''Could not find the template path using name {template_name}. Has the engine been registered yet. It can be registered via the "o3de.py register --this-engine" command''') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is not valid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') if (template_restricted_name or template_restricted_path): if (template_restricted_name and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. 
Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: template_restricted_platform_relative_path = '' if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = '' if (not project_path): logger.error('Project path cannot be empty.') return 1 project_path = project_path.resolve() if (not os.path.isabs(project_path)): default_projects_folder = manifest.get_registered(default_folder='projects') new_project_path = (default_projects_folder / project_path) logger.info(f'Project Path {project_path} is not a full path, we must assume its relative to default projects path = {new_project_path}') project_path = new_project_path if ((not force) and project_path.is_dir() and len(list(project_path.iterdir()))): logger.error(f'Project path {project_path} already exists and is not empty.') return 1 elif (not os.path.isdir(project_path)): os.makedirs(project_path, exist_ok=force) if (not project_name): project_name = os.path.basename(project_path) if (not utils.validate_identifier(project_name)): logger.error(f'Project name must be fewer than 64 characters, contain only alphanumeric, "_" or "-" characters, and start with a letter. 
{project_name}') return 1 if (project_name in restricted_platforms): logger.error(f'Project name cannot be a restricted name. {project_name}') return 1 if (project_restricted_name and (not project_restricted_path)): gem_restricted_path = manifest.get_registered(restricted_name=project_restricted_name) if (not gem_restricted_path): logger.error(f'Project Restricted Name {project_restricted_name} cannot be found.') return 1 if project_restricted_path: if (not os.path.isabs(project_restricted_path)): logger.error(f'Project Restricted Path {project_restricted_path} is not an absolute path.') return 1 else: project_restricted_path = ((manifest.get_o3de_restricted_folder() / 'Projects') / project_name) if (not project_restricted_platform_relative_path): project_restricted_platform_relative_path = '' replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(project_name) replacements.append(('${Name}', project_name)) replacements.append(('${NameUpper}', project_name.upper())) replacements.append(('${NameLower}', project_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if project_id: replacements.append(('${ProjectId}', project_id)) else: replacements.append(('${ProjectId}', (('{' + str(uuid.uuid4())) + '}'))) if module_id: replacements.append(('${ModuleClassId}', module_id)) else: replacements.append(('${ModuleClassId}', (('{' + str(uuid.uuid4())) + '}'))) if system_component_class_id: if (('{' not in system_component_class_id) or ('-' not in system_component_class_id)): logger.error((f'System component class id {system_component_class_id} is malformed. Should look like Ex.' + '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${SysCompClassId}', system_component_class_id)) else: replacements.append(('${SysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if editor_system_component_class_id: if (('{' not in editor_system_component_class_id) or ('-' not in editor_system_component_class_id)): logger.error((f'Editor System component class id {editor_system_component_class_id} is malformed. Should look like Ex.' 
+ '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${EditorSysCompClassId}', editor_system_component_class_id)) else: replacements.append(('${EditorSysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if _instantiate_template(template_json_data, project_name, template_name, project_path, template_path, project_restricted_path, template_restricted_path, project_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_project, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 project_json = (project_path / 'project.json') if (not keep_restricted_in_project): if project_restricted_path: os.makedirs(project_restricted_path, exist_ok=True) restricted_json = (project_restricted_path / 'restricted.json') if os.path.isfile(restricted_json): if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 else: with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': project_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load restricted json {restricted_json}.') return 1 try: restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read "restricted_name" from restricted json {restricted_json}.') return 1 project_json = (project_path / 'project.json') if (not validation.valid_o3de_project_json(project_json)): logger.error(f'Project json {project_json} is not valid.') return 1 with open(project_json, 'r') as s: try: project_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load project json {project_json}.') return 1 project_json_data.update({'restricted': restricted_name}) os.unlink(project_json) with open(project_json, 'w') as s: try: s.write((json.dumps(project_json_data, indent=4) + '\n')) except OSError as e: logger.error(f'Failed to write project json {project_json}.') return 1 if (not no_register): if register.register(restricted_path=project_restricted_path): logger.error(f'Failed to register the restricted {project_restricted_path}.') return 1 return (register.register(project_path=project_path) if (not no_register) else 0)
Template instantiation specialization that makes all default assumptions for a Project template instantiation, reducing the effort needed in instancing a project :param project_path: the project path, can be absolute or relative to default projects path :param project_name: the project name, defaults to project_path basename if not provided :param template_path: the path to the template you want to instance, can be absolute or relative to default templates path :param template_name: the name the registered template you want to instance, defaults to DefaultProject, resolves template_path :param project_restricted_path: path to the projects restricted folder, can be absolute or relative to the restricted='projects' :param project_restricted_name: name of the registered projects restricted path, resolves project_restricted_path :param template_restricted_path: templates restricted path can be absolute or relative to restricted='templates' :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path :param project_restricted_platform_relative_path: any path after the platform to append to the project_restricted_path :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path :param keep_restricted_in_project: whether or not you want to keep the templates restricted files in your project or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param force Overrides existing files even if they exist :param no_register: whether or not after completion that the new object is registered :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is random uuid :param module_id: optionally specify a uuid for the module class, default is random uuid :param project_id: optionally specify a str for the project id, default is random uuid :return: 0 for success or non 0 failure code
scripts/o3de/o3de/engine_template.py
create_project
SparkyStudios/o3de
11
python
def create_project(project_path: pathlib.Path, project_name: str=None, template_path: pathlib.Path=None, template_name: str=None, project_restricted_path: pathlib.Path=None, project_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, project_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_project: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False, system_component_class_id: str=None, editor_system_component_class_id: str=None, module_id: str=None, project_id: str=None) -> int: "\n Template instantiation specialization that makes all default assumptions for a Project template instantiation,\n reducing the effort needed in instancing a project\n :param project_path: the project path, can be absolute or relative to default projects path\n :param project_name: the project name, defaults to project_path basename if not provided\n :param template_path: the path to the template you want to instance, can be absolute or relative to default templates path\n :param template_name: the name the registered template you want to instance, defaults to DefaultProject, resolves template_path\n :param project_restricted_path: path to the projects restricted folder, can be absolute or relative to the restricted='projects'\n :param project_restricted_name: name of the registered projects restricted path, resolves project_restricted_path\n :param template_restricted_path: templates restricted path can be absolute or relative to restricted='templates'\n :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path\n :param project_restricted_platform_relative_path: any path after the platform to append to the project_restricted_path\n :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path\n :param keep_restricted_in_project: whether or not you want to keep the templates restricted files in your project or\n separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :param no_register: whether or not after completion that the new object is registered\n :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid\n :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is\n random uuid\n :param module_id: optionally specify a uuid for the module class, default is random uuid\n :param project_id: optionally specify a str for the project id, default is random uuid\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (project_restricted_name and project_restricted_path): logger.error(f'Project Restricted Name and Project Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_path) and (not template_name)): template_name = 'DefaultProject' if (template_name and (not template_path)): template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'Could not find the template path using name {template_name}. Has the engine been registered yet. It can be registered via the "o3de.py register --this-engine" command') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is not valid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') if (template_restricted_name or template_restricted_path): if (template_restricted_name and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. 
Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: template_restricted_platform_relative_path = if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = if (not project_path): logger.error('Project path cannot be empty.') return 1 project_path = project_path.resolve() if (not os.path.isabs(project_path)): default_projects_folder = manifest.get_registered(default_folder='projects') new_project_path = (default_projects_folder / project_path) logger.info(f'Project Path {project_path} is not a full path, we must assume its relative to default projects path = {new_project_path}') project_path = new_project_path if ((not force) and project_path.is_dir() and len(list(project_path.iterdir()))): logger.error(f'Project path {project_path} already exists and is not empty.') return 1 elif (not os.path.isdir(project_path)): os.makedirs(project_path, exist_ok=force) if (not project_name): project_name = os.path.basename(project_path) if (not utils.validate_identifier(project_name)): logger.error(f'Project name must be fewer than 64 characters, contain only alphanumeric, "_" or "-" characters, and start with a letter. 
{project_name}') return 1 if (project_name in restricted_platforms): logger.error(f'Project name cannot be a restricted name. {project_name}') return 1 if (project_restricted_name and (not project_restricted_path)): gem_restricted_path = manifest.get_registered(restricted_name=project_restricted_name) if (not gem_restricted_path): logger.error(f'Project Restricted Name {project_restricted_name} cannot be found.') return 1 if project_restricted_path: if (not os.path.isabs(project_restricted_path)): logger.error(f'Project Restricted Path {project_restricted_path} is not an absolute path.') return 1 else: project_restricted_path = ((manifest.get_o3de_restricted_folder() / 'Projects') / project_name) if (not project_restricted_platform_relative_path): project_restricted_platform_relative_path = replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(project_name) replacements.append(('${Name}', project_name)) replacements.append(('${NameUpper}', project_name.upper())) replacements.append(('${NameLower}', project_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if project_id: replacements.append(('${ProjectId}', project_id)) else: replacements.append(('${ProjectId}', (('{' + str(uuid.uuid4())) + '}'))) if module_id: replacements.append(('${ModuleClassId}', module_id)) else: replacements.append(('${ModuleClassId}', (('{' + str(uuid.uuid4())) + '}'))) if system_component_class_id: if (('{' not in system_component_class_id) or ('-' not in system_component_class_id)): logger.error((f'System component class id {system_component_class_id} is malformed. Should look like Ex.' + '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${SysCompClassId}', system_component_class_id)) else: replacements.append(('${SysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if editor_system_component_class_id: if (('{' not in editor_system_component_class_id) or ('-' not in editor_system_component_class_id)): logger.error((f'Editor System component class id {editor_system_component_class_id} is malformed. Should look like Ex.' 
+ '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${EditorSysCompClassId}', editor_system_component_class_id)) else: replacements.append(('${EditorSysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if _instantiate_template(template_json_data, project_name, template_name, project_path, template_path, project_restricted_path, template_restricted_path, project_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_project, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 project_json = (project_path / 'project.json') if (not keep_restricted_in_project): if project_restricted_path: os.makedirs(project_restricted_path, exist_ok=True) restricted_json = (project_restricted_path / 'restricted.json') if os.path.isfile(restricted_json): if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 else: with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': project_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load restricted json {restricted_json}.') return 1 try: restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read "restricted_name" from restricted json {restricted_json}.') return 1 project_json = (project_path / 'project.json') if (not validation.valid_o3de_project_json(project_json)): logger.error(f'Project json {project_json} is not valid.') return 1 with open(project_json, 'r') as s: try: project_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load project json {project_json}.') return 1 project_json_data.update({'restricted': restricted_name}) os.unlink(project_json) with open(project_json, 'w') as s: try: s.write((json.dumps(project_json_data, indent=4) + '\n')) except OSError as e: logger.error(f'Failed to write project json {project_json}.') return 1 if (not no_register): if register.register(restricted_path=project_restricted_path): logger.error(f'Failed to register the restricted {project_restricted_path}.') return 1 return (register.register(project_path=project_path) if (not no_register) else 0)
def create_project(project_path: pathlib.Path, project_name: str=None, template_path: pathlib.Path=None, template_name: str=None, project_restricted_path: pathlib.Path=None, project_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, project_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_project: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False, system_component_class_id: str=None, editor_system_component_class_id: str=None, module_id: str=None, project_id: str=None) -> int: "\n Template instantiation specialization that makes all default assumptions for a Project template instantiation,\n reducing the effort needed in instancing a project\n :param project_path: the project path, can be absolute or relative to default projects path\n :param project_name: the project name, defaults to project_path basename if not provided\n :param template_path: the path to the template you want to instance, can be absolute or relative to default templates path\n :param template_name: the name the registered template you want to instance, defaults to DefaultProject, resolves template_path\n :param project_restricted_path: path to the projects restricted folder, can be absolute or relative to the restricted='projects'\n :param project_restricted_name: name of the registered projects restricted path, resolves project_restricted_path\n :param template_restricted_path: templates restricted path can be absolute or relative to restricted='templates'\n :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path\n :param project_restricted_platform_relative_path: any path after the platform to append to the project_restricted_path\n :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path\n :param keep_restricted_in_project: whether or not you want to keep the templates restricted files in your project or\n separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance.\n template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE},\n this controls if you want to keep the license text from the template in the new instance. It is false by default\n because most customers will not want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :param no_register: whether or not after completion that the new object is registered\n :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid\n :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is\n random uuid\n :param module_id: optionally specify a uuid for the module class, default is random uuid\n :param project_id: optionally specify a str for the project id, default is random uuid\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (project_restricted_name and project_restricted_path): logger.error(f'Project Restricted Name and Project Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_path) and (not template_name)): template_name = 'DefaultProject' if (template_name and (not template_path)): template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'Could not find the template path using name {template_name}. Has the engine been registered yet. It can be registered via the "o3de.py register --this-engine" command') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is not valid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') if (template_restricted_name or template_restricted_path): if (template_restricted_name and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. 
Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: template_restricted_platform_relative_path = if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = if (not project_path): logger.error('Project path cannot be empty.') return 1 project_path = project_path.resolve() if (not os.path.isabs(project_path)): default_projects_folder = manifest.get_registered(default_folder='projects') new_project_path = (default_projects_folder / project_path) logger.info(f'Project Path {project_path} is not a full path, we must assume its relative to default projects path = {new_project_path}') project_path = new_project_path if ((not force) and project_path.is_dir() and len(list(project_path.iterdir()))): logger.error(f'Project path {project_path} already exists and is not empty.') return 1 elif (not os.path.isdir(project_path)): os.makedirs(project_path, exist_ok=force) if (not project_name): project_name = os.path.basename(project_path) if (not utils.validate_identifier(project_name)): logger.error(f'Project name must be fewer than 64 characters, contain only alphanumeric, "_" or "-" characters, and start with a letter. 
{project_name}') return 1 if (project_name in restricted_platforms): logger.error(f'Project name cannot be a restricted name. {project_name}') return 1 if (project_restricted_name and (not project_restricted_path)): gem_restricted_path = manifest.get_registered(restricted_name=project_restricted_name) if (not gem_restricted_path): logger.error(f'Project Restricted Name {project_restricted_name} cannot be found.') return 1 if project_restricted_path: if (not os.path.isabs(project_restricted_path)): logger.error(f'Project Restricted Path {project_restricted_path} is not an absolute path.') return 1 else: project_restricted_path = ((manifest.get_o3de_restricted_folder() / 'Projects') / project_name) if (not project_restricted_platform_relative_path): project_restricted_platform_relative_path = replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(project_name) replacements.append(('${Name}', project_name)) replacements.append(('${NameUpper}', project_name.upper())) replacements.append(('${NameLower}', project_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if project_id: replacements.append(('${ProjectId}', project_id)) else: replacements.append(('${ProjectId}', (('{' + str(uuid.uuid4())) + '}'))) if module_id: replacements.append(('${ModuleClassId}', module_id)) else: replacements.append(('${ModuleClassId}', (('{' + str(uuid.uuid4())) + '}'))) if system_component_class_id: if (('{' not in system_component_class_id) or ('-' not in system_component_class_id)): logger.error((f'System component class id {system_component_class_id} is malformed. Should look like Ex.' + '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${SysCompClassId}', system_component_class_id)) else: replacements.append(('${SysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if editor_system_component_class_id: if (('{' not in editor_system_component_class_id) or ('-' not in editor_system_component_class_id)): logger.error((f'Editor System component class id {editor_system_component_class_id} is malformed. Should look like Ex.' 
+ '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${EditorSysCompClassId}', editor_system_component_class_id)) else: replacements.append(('${EditorSysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if _instantiate_template(template_json_data, project_name, template_name, project_path, template_path, project_restricted_path, template_restricted_path, project_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_project, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 project_json = (project_path / 'project.json') if (not keep_restricted_in_project): if project_restricted_path: os.makedirs(project_restricted_path, exist_ok=True) restricted_json = (project_restricted_path / 'restricted.json') if os.path.isfile(restricted_json): if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 else: with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': project_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load restricted json {restricted_json}.') return 1 try: restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read "restricted_name" from restricted json {restricted_json}.') return 1 project_json = (project_path / 'project.json') if (not validation.valid_o3de_project_json(project_json)): logger.error(f'Project json {project_json} is not valid.') return 1 with open(project_json, 'r') as s: try: project_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load project json {project_json}.') return 1 project_json_data.update({'restricted': restricted_name}) os.unlink(project_json) with open(project_json, 'w') as s: try: s.write((json.dumps(project_json_data, indent=4) + '\n')) except OSError as e: logger.error(f'Failed to write project json {project_json}.') return 1 if (not no_register): if register.register(restricted_path=project_restricted_path): logger.error(f'Failed to register the restricted {project_restricted_path}.') return 1 return (register.register(project_path=project_path) if (not no_register) else 0)<|docstring|>Template instantiation specialization that makes all default assumptions for a Project template instantiation, reducing the effort needed in instancing a project :param project_path: the project path, can be absolute or relative to default projects path :param project_name: the project name, defaults to project_path basename if not provided :param template_path: the path to the template you want to instance, can be absolute or relative to default templates path :param template_name: the name the registered template you want to instance, defaults to DefaultProject, resolves template_path :param project_restricted_path: path to the projects restricted folder, can be absolute or relative to the restricted='projects' :param project_restricted_name: name of the registered projects restricted path, resolves project_restricted_path :param template_restricted_path: templates restricted path can be absolute or relative to restricted='templates' :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path :param 
project_restricted_platform_relative_path: any path after the platform to append to the project_restricted_path :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path :param keep_restricted_in_project: whether or not you want to keep the templates restricted files in your project or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param force Overrides existing files even if they exist :param no_register: whether or not after completion that the new object is registered :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is random uuid :param module_id: optionally specify a uuid for the module class, default is random uuid :param project_id: optionally specify a str for the project id, default is random uuid :return: 0 for success or non 0 failure code<|endoftext|>
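The create_project record above is the specialization that fills in defaults on its own: the DefaultProject template when no template is named, the default projects folder for relative paths, and randomly generated UUIDs for the module and component class ids. A short keyword-argument sketch follows, assuming the same o3de.engine_template import path and a previously registered engine; the project location is hypothetical, and the UUID shown is the well-formed example quoted in the record's own error message, included only to illustrate the expected brace-and-dash format.

# Sketch: create a project from the default template.
# Assumptions: the engine has been registered (o3de.py register --this-engine),
# so the DefaultProject template can be resolved by name.
import pathlib

from o3de import engine_template  # assumed import path, per the record's path field

ret = engine_template.create_project(
    project_path=pathlib.Path('MyProject'),  # per the docstring, may be relative to the default projects path
    project_name='MyProject',                # defaults to the path basename when omitted
    system_component_class_id='{b60c92eb-3139-454b-a917-a9d3c5819594}',  # optional; must contain '{' and '-', omit for a random uuid
    keep_license_text=False,
    no_register=False,                       # register the new project and its restricted folder on success
)
if ret != 0:
    raise SystemExit(f'create_project failed with code {ret}')

With keep_restricted_in_project left False (the default), the record's body shows that the function also writes a restricted.json for the project's restricted folder, records its name under the "restricted" key of project.json, and registers both unless no_register is set.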
a77708bfbd97a7e52c6e4eb35f90184c0c4ccb9037cfb95e0410cbaa3039008c
def create_gem(gem_path: pathlib.Path, template_path: pathlib.Path=None, template_name: str=None, gem_name: str=None, gem_restricted_path: pathlib.Path=None, gem_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, gem_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_gem: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False, system_component_class_id: str=None, editor_system_component_class_id: str=None, module_id: str=None) -> int: "\n Template instantiation specialization that makes all default assumptions for a Gem template instantiation,\n reducing the effort needed in instancing a gem\n :param gem_path: the gem path, can be absolute or relative to default gems path\n :param template_path: the template path you want to instance, can be absolute or relative to default templates path\n :param template_name: the name of the registered template you want to instance, defaults to DefaultGem, resolves template_path\n :param gem_name: the name that will be substituted when instantiating the template.\n The placeholders of ${Name} and ${SanitizedCppName} will be replaced with gem name and a sanitized\n version of the gem name that is suitable as a C++ identifier. If not specified, defaults to the\n last path component of the gem_path\n :param gem_restricted_path: path to the gems restricted folder, can be absolute or relative to the restricted='gems'\n :param gem_restricted_name: str = name of the registered gems restricted path, resolves gem_restricted_path\n :param template_restricted_path: the templates restricted path, can be absolute or relative to the restricted='templates'\n :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path\n :param gem_restricted_platform_relative_path: any path after the platform to append to the gem_restricted_path\n :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path\n :param keep_restricted_in_gem: whether or not you want to keep the templates restricted files in your instance or\n separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can\n have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep\n the license text from the template in the new instance. It is false by default because most customers will not\n want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid\n :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is\n random uuid\n :param module_id: optionally specify a uuid for the module class, default is random uuid\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (gem_restricted_name and gem_restricted_path): logger.error(f'Gem Restricted Name and Gem Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_name) and (not template_path)): template_name = 'DefaultGem' if (template_name and (not template_path)): template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'''Could not find the template path using name {template_name}. Has the template been registered yet? It can be registered via the "o3de.py register --tp <template-path>" command''') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is not valid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') else: template_restricted_name = template_json_restricted_name if (template_restricted_name or template_restricted_path): if (template_restricted_name and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". 
Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = template_json_data['restricted_platform_relative_path'] except KeyError as e: template_restricted_platform_relative_path = '' if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = '' if (not gem_path): logger.error('Gem path cannot be empty.') return 1 gem_path = gem_path.resolve() if (not os.path.isabs(gem_path)): default_gems_folder = manifest.get_registered(default_folder='gems') new_gem_path = (default_gems_folder / gem_path) logger.info(f'Gem Path {gem_path} is not a full path, we must assume its relative to default gems path = {new_gem_path}') gem_path = new_gem_path if ((not force) and gem_path.is_dir() and len(list(gem_path.iterdir()))): logger.error(f'Gem path {gem_path} already exists.') return 1 else: os.makedirs(gem_path, exist_ok=force) if (not gem_name): gem_name = os.path.basename(gem_path) if (not utils.validate_identifier(gem_name)): logger.error(f'Gem name must be fewer than 64 characters, contain only alphanumeric, "_" or "-" characters, and start with a letter. {gem_name}') return 1 if (gem_name in restricted_platforms): logger.error(f'Gem path cannot be a restricted name. 
{gem_name}') return 1 if (gem_restricted_name and (not gem_restricted_path)): gem_restricted_path = manifest.get_registered(restricted_name=gem_restricted_name) if (not gem_restricted_path): logger.error(f'Gem Restricted Name {gem_restricted_name} cannot be found.') return 1 if gem_restricted_path: if (not os.path.isabs(gem_restricted_path)): logger.error(f'Gem Restricted Path {gem_restricted_path} is not an absolute path.') return 1 else: gem_restricted_path = ((manifest.get_o3de_restricted_folder() / 'Gems') / gem_name) if (not gem_restricted_platform_relative_path): gem_restricted_platform_relative_path = '' replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(gem_name) replacements.append(('${Name}', gem_name)) replacements.append(('${NameUpper}', gem_name.upper())) replacements.append(('${NameLower}', gem_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if module_id: replacements.append(('${ModuleClassId}', module_id)) else: replacements.append(('${ModuleClassId}', (('{' + str(uuid.uuid4())) + '}'))) if system_component_class_id: if (('{' not in system_component_class_id) or ('-' not in system_component_class_id)): logger.error((f'System component class id {system_component_class_id} is malformed. Should look like Ex.' + '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${SysCompClassId}', system_component_class_id)) else: replacements.append(('${SysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if editor_system_component_class_id: if (('{' not in editor_system_component_class_id) or ('-' not in editor_system_component_class_id)): logger.error((f'Editor System component class id {editor_system_component_class_id} is malformed. Should look like Ex.' 
+ '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${EditorSysCompClassId}', editor_system_component_class_id)) else: replacements.append(('${EditorSysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if _instantiate_template(template_json_data, gem_name, template_name, gem_path, template_path, gem_restricted_path, template_restricted_path, gem_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_gem, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 if (not keep_restricted_in_gem): if gem_restricted_path: os.makedirs(gem_restricted_path, exist_ok=True) restricted_json = (gem_restricted_path / 'restricted.json') if os.path.isfile(restricted_json): if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 else: with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': gem_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load restricted json {restricted_json}.') return 1 try: restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read "restricted_name" from restricted json {restricted_json}.') return 1 gem_json = (gem_path / 'gem.json') if (not validation.valid_o3de_gem_json(gem_json)): logger.error(f'Gem json {gem_json} is not valid.') return 1 with open(gem_json, 'r') as s: try: gem_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load gem json {gem_json}.') return 1 gem_json_data.update({'restricted': restricted_name}) os.unlink(gem_json) with open(gem_json, 'w') as s: try: s.write((json.dumps(gem_json_data, indent=4) + '\n')) except OSError as e: logger.error(f'Failed to write project json {gem_json}.') return 1 "\n for restricted_platform in restricted_platforms:\n restricted_gem = gem_restricted_path / restricted_platform / gem_name\n os.makedirs(restricted_gem, exist_ok=True)\n cmakelists_file_name = restricted_gem / 'CMakeLists.txt'\n if not os.path.isfile(cmakelists_file_name):\n with open(cmakelists_file_name, 'w') as d:\n if keep_license_text:\n d.write(O3DE_LICENSE_TEXT)\n " if (not no_register): if register.register(restricted_path=gem_restricted_path): logger.error(f'Failed to register the restricted {gem_restricted_path}.') return 1 return (register.register(gem_path=gem_path) if (not no_register) else 0)
Template instantiation specialization that makes all default assumptions for a Gem template instantiation, reducing the effort needed in instancing a gem :param gem_path: the gem path, can be absolute or relative to default gems path :param template_path: the template path you want to instance, can be absolute or relative to default templates path :param template_name: the name of the registered template you want to instance, defaults to DefaultGem, resolves template_path :param gem_name: the name that will be substituted when instantiating the template. The placeholders of ${Name} and ${SanitizedCppName} will be replaced with gem name and a sanitized version of the gem name that is suitable as a C++ identifier. If not specified, defaults to the last path component of the gem_path :param gem_restricted_path: path to the gems restricted folder, can be absolute or relative to the restricted='gems' :param gem_restricted_name: str = name of the registered gems restricted path, resolves gem_restricted_path :param template_restricted_path: the templates restricted path, can be absolute or relative to the restricted='templates' :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path :param gem_restricted_platform_relative_path: any path after the platform to append to the gem_restricted_path :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path :param keep_restricted_in_gem: whether or not you want to keep the templates restricted files in your instance or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param force Overrides existing files even if they exist :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is random uuid :param module_id: optionally specify a uuid for the module class, default is random uuid :return: 0 for success or non 0 failure code
scripts/o3de/o3de/engine_template.py
create_gem
SparkyStudios/o3de
11
python
def create_gem(gem_path: pathlib.Path, template_path: pathlib.Path=None, template_name: str=None, gem_name: str=None, gem_restricted_path: pathlib.Path=None, gem_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, gem_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_gem: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False, system_component_class_id: str=None, editor_system_component_class_id: str=None, module_id: str=None) -> int: "\n Template instantiation specialization that makes all default assumptions for a Gem template instantiation,\n reducing the effort needed in instancing a gem\n :param gem_path: the gem path, can be absolute or relative to default gems path\n :param template_path: the template path you want to instance, can be absolute or relative to default templates path\n :param template_name: the name of the registered template you want to instance, defaults to DefaultGem, resolves template_path\n :param gem_name: the name that will be substituted when instantiating the template.\n The placeholders of ${Name} and ${SanitizedCppName} will be replaced with gem name and a sanitized\n version of the gem name that is suitable as a C++ identifier. If not specified, defaults to the\n last path component of the gem_path\n :param gem_restricted_path: path to the gems restricted folder, can be absolute or relative to the restricted='gems'\n :param gem_restricted_name: str = name of the registered gems restricted path, resolves gem_restricted_path\n :param template_restricted_path: the templates restricted path, can be absolute or relative to the restricted='templates'\n :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path\n :param gem_restricted_platform_relative_path: any path after the platform to append to the gem_restricted_path\n :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path\n :param keep_restricted_in_gem: whether or not you want to keep the templates restricted files in your instance or\n separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can\n have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep\n the license text from the template in the new instance. It is false by default because most customers will not\n want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid\n :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is\n random uuid\n :param module_id: optionally specify a uuid for the module class, default is random uuid\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (gem_restricted_name and gem_restricted_path): logger.error(f'Gem Restricted Name and Gem Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_name) and (not template_path)): template_name = 'DefaultGem' if (template_name and (not template_path)): template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'Could not find the template path using name {template_name}. Has the template been registered yet? It can be registered via the "o3de.py register --tp <template-path>" command') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is not valid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') else: template_restricted_name = template_json_restricted_name if (template_restricted_name or template_restricted_path): if (template_restricted_name and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". 
Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = template_json_data['restricted_platform_relative_path'] except KeyError as e: template_restricted_platform_relative_path = '' if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = '' if (not gem_path): logger.error('Gem path cannot be empty.') return 1 gem_path = gem_path.resolve() if (not os.path.isabs(gem_path)): default_gems_folder = manifest.get_registered(default_folder='gems') new_gem_path = (default_gems_folder / gem_path) logger.info(f'Gem Path {gem_path} is not a full path, we must assume its relative to default gems path = {new_gem_path}') gem_path = new_gem_path if ((not force) and gem_path.is_dir() and len(list(gem_path.iterdir()))): logger.error(f'Gem path {gem_path} already exists.') return 1 else: os.makedirs(gem_path, exist_ok=force) if (not gem_name): gem_name = os.path.basename(gem_path) if (not utils.validate_identifier(gem_name)): logger.error(f'Gem name must be fewer than 64 characters, contain only alphanumeric, "_" or "-" characters, and start with a letter. {gem_name}') return 1 if (gem_name in restricted_platforms): logger.error(f'Gem path cannot be a restricted name. 
{gem_name}') return 1 if (gem_restricted_name and (not gem_restricted_path)): gem_restricted_path = manifest.get_registered(restricted_name=gem_restricted_name) if (not gem_restricted_path): logger.error(f'Gem Restricted Name {gem_restricted_name} cannot be found.') return 1 if gem_restricted_path: if (not os.path.isabs(gem_restricted_path)): logger.error(f'Gem Restricted Path {gem_restricted_path} is not an absolute path.') return 1 else: gem_restricted_path = ((manifest.get_o3de_restricted_folder() / 'Gems') / gem_name) if (not gem_restricted_platform_relative_path): gem_restricted_platform_relative_path = '' replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(gem_name) replacements.append(('${Name}', gem_name)) replacements.append(('${NameUpper}', gem_name.upper())) replacements.append(('${NameLower}', gem_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if module_id: replacements.append(('${ModuleClassId}', module_id)) else: replacements.append(('${ModuleClassId}', (('{' + str(uuid.uuid4())) + '}'))) if system_component_class_id: if (('{' not in system_component_class_id) or ('-' not in system_component_class_id)): logger.error((f'System component class id {system_component_class_id} is malformed. Should look like Ex.' + '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${SysCompClassId}', system_component_class_id)) else: replacements.append(('${SysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if editor_system_component_class_id: if (('{' not in editor_system_component_class_id) or ('-' not in editor_system_component_class_id)): logger.error((f'Editor System component class id {editor_system_component_class_id} is malformed. Should look like Ex.' 
+ '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${EditorSysCompClassId}', editor_system_component_class_id)) else: replacements.append(('${EditorSysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if _instantiate_template(template_json_data, gem_name, template_name, gem_path, template_path, gem_restricted_path, template_restricted_path, gem_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_gem, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 if (not keep_restricted_in_gem): if gem_restricted_path: os.makedirs(gem_restricted_path, exist_ok=True) restricted_json = (gem_restricted_path / 'restricted.json') if os.path.isfile(restricted_json): if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 else: with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': gem_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load restricted json {restricted_json}.') return 1 try: restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read "restricted_name" from restricted json {restricted_json}.') return 1 gem_json = (gem_path / 'gem.json') if (not validation.valid_o3de_gem_json(gem_json)): logger.error(f'Gem json {gem_json} is not valid.') return 1 with open(gem_json, 'r') as s: try: gem_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load gem json {gem_json}.') return 1 gem_json_data.update({'restricted': restricted_name}) os.unlink(gem_json) with open(gem_json, 'w') as s: try: s.write((json.dumps(gem_json_data, indent=4) + '\n')) except OSError as e: logger.error(f'Failed to write project json {gem_json}.') return 1 "\n for restricted_platform in restricted_platforms:\n restricted_gem = gem_restricted_path / restricted_platform / gem_name\n os.makedirs(restricted_gem, exist_ok=True)\n cmakelists_file_name = restricted_gem / 'CMakeLists.txt'\n if not os.path.isfile(cmakelists_file_name):\n with open(cmakelists_file_name, 'w') as d:\n if keep_license_text:\n d.write(O3DE_LICENSE_TEXT)\n " if (not no_register): if register.register(restricted_path=gem_restricted_path): logger.error(f'Failed to register the restricted {gem_restricted_path}.') return 1 return (register.register(gem_path=gem_path) if (not no_register) else 0)
def create_gem(gem_path: pathlib.Path, template_path: pathlib.Path=None, template_name: str=None, gem_name: str=None, gem_restricted_path: pathlib.Path=None, gem_restricted_name: str=None, template_restricted_path: pathlib.Path=None, template_restricted_name: str=None, gem_restricted_platform_relative_path: pathlib.Path=None, template_restricted_platform_relative_path: pathlib.Path=None, keep_restricted_in_gem: bool=False, keep_license_text: bool=False, replace: list=None, force: bool=False, no_register: bool=False, system_component_class_id: str=None, editor_system_component_class_id: str=None, module_id: str=None) -> int: "\n Template instantiation specialization that makes all default assumptions for a Gem template instantiation,\n reducing the effort needed in instancing a gem\n :param gem_path: the gem path, can be absolute or relative to default gems path\n :param template_path: the template path you want to instance, can be absolute or relative to default templates path\n :param template_name: the name of the registered template you want to instance, defaults to DefaultGem, resolves template_path\n :param gem_name: the name that will be substituted when instantiating the template.\n The placeholders of ${Name} and ${SanitizedCppName} will be replaced with gem name and a sanitized\n version of the gem name that is suitable as a C++ identifier. If not specified, defaults to the\n last path component of the gem_path\n :param gem_restricted_path: path to the gems restricted folder, can be absolute or relative to the restricted='gems'\n :param gem_restricted_name: str = name of the registered gems restricted path, resolves gem_restricted_path\n :param template_restricted_path: the templates restricted path, can be absolute or relative to the restricted='templates'\n :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path\n :param gem_restricted_platform_relative_path: any path after the platform to append to the gem_restricted_path\n :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path\n :param keep_restricted_in_gem: whether or not you want to keep the templates restricted files in your instance or\n separate them out into the restricted folder\n :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can\n have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep\n the license text from the template in the new instance. It is false by default because most customers will not\n want license text in their instances, but we may want to keep them.\n :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs\n Ex. 
${Name},TestGem,${Player},TestGemPlayer\n This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer'\n :param force Overrides existing files even if they exist\n :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid\n :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is\n random uuid\n :param module_id: optionally specify a uuid for the module class, default is random uuid\n :return: 0 for success or non 0 failure code\n " if (template_name and template_path): logger.error(f'Template Name and Template Path provided, these are mutually exclusive.') return 1 if (gem_restricted_name and gem_restricted_path): logger.error(f'Gem Restricted Name and Gem Restricted Path provided, these are mutually exclusive.') return 1 if (template_restricted_name and template_restricted_path): logger.error(f'Template Restricted Name and Template Restricted Path provided, these are mutually exclusive.') return 1 if ((not template_name) and (not template_path)): template_name = 'DefaultGem' if (template_name and (not template_path)): template_path = manifest.get_registered(template_name=template_name) if (not template_path): logger.error(f'Could not find the template path using name {template_name}. Has the template been registered yet? It can be registered via the "o3de.py register --tp <template-path>" command') return 1 if (not os.path.isdir(template_path)): logger.error(f'Could not find the template {template_name}=>{template_path}') return 1 template_json = (template_path / 'template.json') if (not validation.valid_o3de_template_json(template_json)): logger.error(f'Template json {template_path} is not valid.') return 1 with open(template_json) as s: try: template_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Could not read template json {template_json}: {str(e)}.') return 1 try: template_name = template_json_data['template_name'] except KeyError as e: logger.error(f'Could not read "template_name" from template json {template_json}: {str(e)}.') return 1 if ((not template_restricted_name) and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name".') else: template_restricted_name = template_json_restricted_name if (template_restricted_name or template_restricted_path): if (template_restricted_name and (not template_restricted_path)): try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". Using supplied {template_restricted_name}') else: if (template_json_restricted_name != template_restricted_name): logger.error(f'The supplied --template-restricted-name {template_restricted_name} does not match the templates "restricted_name". Either the the --template-restricted-name is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name}, --template-restricted-name need not be supplied.') template_restricted_path = manifest.get_registered(restricted_name=template_restricted_name) else: try: template_json_restricted_name = template_json_data['restricted_name'] except KeyError as e: logger.info(f'The template does not specify a "restricted_name". 
Using supplied {template_restricted_path}') else: template_json_restricted_path = manifest.get_registered(restricted_name=template_json_restricted_name) if (template_json_restricted_path != template_restricted_path): logger.error(f'The supplied --template-restricted-path {template_restricted_path} does not match the templates "restricted_name" {template_restricted_name} => {template_json_restricted_path}. Either the the supplied --template-restricted-path is incorrect or the templates "restricted_name" is wrong. Note that since this template specifies "restricted_name" as {template_json_restricted_name} --template-restricted-path need not be supplied and {template_json_restricted_path} will be used.') return 1 if (template_restricted_path and (not os.path.isdir(template_restricted_path))): logger.error(f'Template restricted path {template_restricted_path} does not exist.') return 1 if template_restricted_platform_relative_path: try: template_json_restricted_platform_relative_path = pathlib.Path(template_json_data['restricted_platform_relative_path']) except KeyError as e: logger.info(f'The template does not specify a "restricted_platform_relative_path". Using {template_restricted_platform_relative_path}') else: if (template_restricted_platform_relative_path != template_json_restricted_platform_relative_path): logger.error(f'The supplied --template-restricted-platform-relative-path "{template_restricted_platform_relative_path}" does not match the templates.json "restricted_platform_relative_path". Either --template-restricted-platform-relative-path is incorrect or the templates "restricted_platform_relative_path" is wrong. Note that since this template specifies "restricted_platform_relative_path" it need not be supplied and "{template_json_restricted_platform_relative_path}" will be used.') return 1 else: try: template_restricted_platform_relative_path = template_json_data['restricted_platform_relative_path'] except KeyError as e: template_restricted_platform_relative_path = '' if (not template_restricted_platform_relative_path): template_restricted_platform_relative_path = '' if (not gem_path): logger.error('Gem path cannot be empty.') return 1 gem_path = gem_path.resolve() if (not os.path.isabs(gem_path)): default_gems_folder = manifest.get_registered(default_folder='gems') new_gem_path = (default_gems_folder / gem_path) logger.info(f'Gem Path {gem_path} is not a full path, we must assume its relative to default gems path = {new_gem_path}') gem_path = new_gem_path if ((not force) and gem_path.is_dir() and len(list(gem_path.iterdir()))): logger.error(f'Gem path {gem_path} already exists.') return 1 else: os.makedirs(gem_path, exist_ok=force) if (not gem_name): gem_name = os.path.basename(gem_path) if (not utils.validate_identifier(gem_name)): logger.error(f'Gem name must be fewer than 64 characters, contain only alphanumeric, "_" or "-" characters, and start with a letter. {gem_name}') return 1 if (gem_name in restricted_platforms): logger.error(f'Gem path cannot be a restricted name. 
{gem_name}') return 1 if (gem_restricted_name and (not gem_restricted_path)): gem_restricted_path = manifest.get_registered(restricted_name=gem_restricted_name) if (not gem_restricted_path): logger.error(f'Gem Restricted Name {gem_restricted_name} cannot be found.') return 1 if gem_restricted_path: if (not os.path.isabs(gem_restricted_path)): logger.error(f'Gem Restricted Path {gem_restricted_path} is not an absolute path.') return 1 else: gem_restricted_path = ((manifest.get_o3de_restricted_folder() / 'Gems') / gem_name) if (not gem_restricted_platform_relative_path): gem_restricted_platform_relative_path = '' replacements = list() while replace: replace_this = replace.pop(0) with_this = replace.pop(0) replacements.append((replace_this, with_this)) sanitized_cpp_name = utils.sanitize_identifier_for_cpp(gem_name) replacements.append(('${Name}', gem_name)) replacements.append(('${NameUpper}', gem_name.upper())) replacements.append(('${NameLower}', gem_name.lower())) replacements.append(('${SanitizedCppName}', sanitized_cpp_name)) if module_id: replacements.append(('${ModuleClassId}', module_id)) else: replacements.append(('${ModuleClassId}', (('{' + str(uuid.uuid4())) + '}'))) if system_component_class_id: if (('{' not in system_component_class_id) or ('-' not in system_component_class_id)): logger.error((f'System component class id {system_component_class_id} is malformed. Should look like Ex.' + '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${SysCompClassId}', system_component_class_id)) else: replacements.append(('${SysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if editor_system_component_class_id: if (('{' not in editor_system_component_class_id) or ('-' not in editor_system_component_class_id)): logger.error((f'Editor System component class id {editor_system_component_class_id} is malformed. Should look like Ex.' 
+ '{b60c92eb-3139-454b-a917-a9d3c5819594}')) return 1 replacements.append(('${EditorSysCompClassId}', editor_system_component_class_id)) else: replacements.append(('${EditorSysCompClassId}', (('{' + str(uuid.uuid4())) + '}'))) if _instantiate_template(template_json_data, gem_name, template_name, gem_path, template_path, gem_restricted_path, template_restricted_path, gem_restricted_platform_relative_path, template_restricted_platform_relative_path, replacements, keep_restricted_in_gem, keep_license_text): logger.error(f'Instantiation of the template has failed.') return 1 if (not keep_restricted_in_gem): if gem_restricted_path: os.makedirs(gem_restricted_path, exist_ok=True) restricted_json = (gem_restricted_path / 'restricted.json') if os.path.isfile(restricted_json): if (not validation.valid_o3de_restricted_json(restricted_json)): logger.error(f'Restricted json {restricted_json} is not valid.') return 1 else: with open(restricted_json, 'w') as s: restricted_json_data = {} restricted_json_data.update({'restricted_name': gem_name}) s.write((json.dumps(restricted_json_data, indent=4) + '\n')) with open(restricted_json, 'r') as s: try: restricted_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load restricted json {restricted_json}.') return 1 try: restricted_name = restricted_json_data['restricted_name'] except KeyError as e: logger.error(f'Failed to read "restricted_name" from restricted json {restricted_json}.') return 1 gem_json = (gem_path / 'gem.json') if (not validation.valid_o3de_gem_json(gem_json)): logger.error(f'Gem json {gem_json} is not valid.') return 1 with open(gem_json, 'r') as s: try: gem_json_data = json.load(s) except json.JSONDecodeError as e: logger.error(f'Failed to load gem json {gem_json}.') return 1 gem_json_data.update({'restricted': restricted_name}) os.unlink(gem_json) with open(gem_json, 'w') as s: try: s.write((json.dumps(gem_json_data, indent=4) + '\n')) except OSError as e: logger.error(f'Failed to write project json {gem_json}.') return 1 "\n for restricted_platform in restricted_platforms:\n restricted_gem = gem_restricted_path / restricted_platform / gem_name\n os.makedirs(restricted_gem, exist_ok=True)\n cmakelists_file_name = restricted_gem / 'CMakeLists.txt'\n if not os.path.isfile(cmakelists_file_name):\n with open(cmakelists_file_name, 'w') as d:\n if keep_license_text:\n d.write(O3DE_LICENSE_TEXT)\n " if (not no_register): if register.register(restricted_path=gem_restricted_path): logger.error(f'Failed to register the restricted {gem_restricted_path}.') return 1 return (register.register(gem_path=gem_path) if (not no_register) else 0)<|docstring|>Template instantiation specialization that makes all default assumptions for a Gem template instantiation, reducing the effort needed in instancing a gem :param gem_path: the gem path, can be absolute or relative to default gems path :param template_path: the template path you want to instance, can be absolute or relative to default templates path :param template_name: the name of the registered template you want to instance, defaults to DefaultGem, resolves template_path :param gem_name: the name that will be substituted when instantiating the template. The placeholders of ${Name} and ${SanitizedCppName} will be replaced with gem name and a sanitized version of the gem name that is suitable as a C++ identifier. 
If not specified, defaults to the last path component of the gem_path :param gem_restricted_path: path to the gems restricted folder, can be absolute or relative to the restricted='gems' :param gem_restricted_name: str = name of the registered gems restricted path, resolves gem_restricted_path :param template_restricted_path: the templates restricted path, can be absolute or relative to the restricted='templates' :param template_restricted_name: name of the registered templates restricted path, resolves template_restricted_path :param gem_restricted_platform_relative_path: any path after the platform to append to the gem_restricted_path :param template_restricted_platform_relative_path: any path after the platform to append to the template_restricted_path :param keep_restricted_in_gem: whether or not you want to keep the templates restricted files in your instance or separate them out into the restricted folder :param keep_license_text: whether or not you want to keep the templates license text in your instance. template can have license blocks starting with {BEGIN_LICENSE} and ending with {END_LICENSE}, this controls if you want to keep the license text from the template in the new instance. It is false by default because most customers will not want license text in their instances, but we may want to keep them. :param replace: optional list of strings uses to make concrete names out of templated parameters. X->Y pairs Ex. ${Name},TestGem,${Player},TestGemPlayer This will cause all references to ${Name} be replaced by TestGem, and all ${Player} replaced by 'TestGemPlayer' :param force Overrides existing files even if they exist :param system_component_class_id: optionally specify a uuid for the system component class, default is random uuid :param editor_system_component_class_id: optionally specify a uuid for the editor system component class, default is random uuid :param module_id: optionally specify a uuid for the module class, default is random uuid :return: 0 for success or non 0 failure code<|endoftext|>
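Editor's note (not part of the record above): the create_gem record documents both a CLI entry point and a callable function. The following is a minimal sketch of calling it programmatically, based only on the signature and docstring shown above. The "from o3de import engine_template" import path is an assumption inferred from the record's path field (scripts/o3de/o3de/engine_template.py), and the argument values (TestGem, DefaultGem, the ${Player} pair) are purely illustrative.

import pathlib

from o3de import engine_template  # assumed module layout, inferred from the record's path field

# Per the docstring: template defaults to DefaultGem, replace is consumed as X->Y pairs,
# and the last component of gem_path becomes ${Name}.
result = engine_template.create_gem(
    gem_path=pathlib.Path('TestGem'),        # last path component becomes ${Name}
    template_name='DefaultGem',              # resolved through the registered-template lookup
    keep_license_text=True,                  # keep {BEGIN_LICENSE}...{END_LICENSE} blocks
    replace=['${Player}', 'TestGemPlayer'],  # one extra X->Y pair, as in the docstring example
    no_register=True,                        # skip manifest registration
)
if result != 0:
    raise SystemExit(f'create-gem failed with code {result}')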
8cbb395c53200b008fd1637ec4c76d90a31ad9dbc3775df83b1425a8253a6d1f
def add_args(subparsers) -> None: '\n add_args is called to add expected parser arguments and subparsers arguments to each command such that it can be\n invoked locally or aggregated by a central python file.\n Ex. Directly run from this file alone with: python engine_template.py create-gem --gem-path TestGem\n OR\n o3de.py can aggregate commands by importing engine_template,\n call add_args and execute: python o3de.py create-gem --gem-path TestGem\n :param subparsers: the caller instantiates subparsers and passes it in here\n ' create_template_subparser = subparsers.add_parser('create-template') utils.add_verbosity_arg(create_template_subparser) create_template_subparser.add_argument('-sp', '--source-path', type=pathlib.Path, required=True, help='The path to the source that you want to make into a template') create_template_subparser.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, help='The path to the template to create, can be absolute or relative to default templates path') group = create_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-srp', '--source-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the source restricted folder.') group.add_argument('-srn', '--source-restricted-name', type=str, required=False, default=None, help='The name of the source restricted folder. If supplied this will resolve the --source-restricted-path.') group = create_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the templates restricted folder.') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the templates restricted folder. If supplied this will resolve the --template-restricted-path.') create_template_subparser.add_argument('-sn', '--source-name', type=str, help='Substitutes any file and path entries which match the source name within the source-path directory with the ${Name} and ${SanitizedCppName}.Ex: Path substitution--source-name Foo<source-path>/Code/Include/FooBus.h -> <source-path>/Code/Include/${Name}Bus.hEx: File content substitution.class FooRequests -> class ${SanitizedCppName}Requests') create_template_subparser.add_argument('-srprp', '--source-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --source-restricted-path/<platform> to where the restricted source is. EX. --source-restricted-path C:/restricted --source-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<source_name>') create_template_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template source is. 
--template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_template_subparser.add_argument('-kr', '--keep-restricted-in-template', action='store_true', default=False, help='Should the template keep the restricted platforms in the template, or create the restricted files in the restricted folder, default is False so it will create a restricted folder by default') create_template_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_template_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies A->B replacement pairs. Ex. --replace CoolThing ${the_thing} 1723905 ${id} Note: <TemplateName> is the last component of template_path Note: <TemplateName> is automatically ${Name} Note: <templatename> is automatically ${NameLower} Note: <TEMPLATENAME> is automatically ${NameUpper}') create_template_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies to new template directory even if it exist.') create_template_subparser.add_argument('--no-register', action='store_true', default=False, help='If the template is created successfully, it will not register the template with the global or engine manifest file.') create_template_subparser.set_defaults(func=_run_create_template) create_from_template_subparser = subparsers.add_parser('create-from-template') utils.add_verbosity_arg(create_from_template_subparser) create_from_template_subparser.add_argument('-dp', '--destination-path', type=pathlib.Path, required=True, help='The path to where you want the template instantiated, can be absolute or relative to the current working directory.Ex. C:/o3de/TestTest = <destination_name>') group = create_from_template_subparser.add_mutually_exclusive_group(required=True) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, help='The path to the template you want to instantiate, can be absolute or relative to the current working directory.Ex. C:/o3de/Template/TestTemplateTestTemplate = <template_name>') group.add_argument('-tn', '--template-name', type=str, required=False, help='The name to the registered template you want to instantiate. If supplied this will resolve the --template-path.') create_from_template_subparser.add_argument('-dn', '--destination-name', type=str, help='The name to use when substituting the ${Name} placeholder in instantiated template, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of destination path. Ex. New_Gem') group = create_from_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-drp', '--destination-restricted-path', type=pathlib.Path, required=False, default=None, help='The destination restricted path is where the restricted files will be written to.') group.add_argument('-drn', '--destination-restricted-name', type=str, required=False, default=None, help='The name the registered restricted path where the restricted files will be written to. 
If supplied this will resolve the --destination-restricted-path.') group = create_from_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The template restricted path to read from if any') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered restricted path to read from if any. If supplied this will resolve the --template-restricted-path.') create_from_template_subparser.add_argument('-drprp', '--destination-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --destination-restricted-path/<platform> to where the restricted destination is. --destination-restricted-path C:/instance --destination-restricted-platform-relative-path some/folder => C:/instance/<platform>/some/folder/<destination_name>') create_from_template_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_from_template_subparser.add_argument('-kr', '--keep-restricted-in-instance', action='store_true', default=False, help='Should the instance keep the restricted platforms in the instance, or create the restricted files in the restricted folder, default is False') create_from_template_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_from_template_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies A->B replacement pairs. Ex. --replace CoolThing ${the_thing} ${id} 1723905 Note: <DestinationName> is the last component of destination_path Note: ${Name} is automatically <DestinationName> Note: ${NameLower} is automatically <destinationname> Note: ${NameUpper} is automatically <DESTINATIONNAME>') create_from_template_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_from_template_subparser.add_argument('--no-register', action='store_true', default=False, help='If the project template is instantiated successfully, it will not register the project with the global or engine manifest file.') create_from_template_subparser.set_defaults(func=_run_create_from_template) create_project_subparser = subparsers.add_parser('create-project') utils.add_verbosity_arg(create_project_subparser) create_project_subparser.add_argument('-pp', '--project-path', type=pathlib.Path, required=True, help='The location of the project you wish to create from the template, can be an absolute path or relative to the current working directory. Ex. 
C:/o3de/TestProject TestProject = <project_name> if --project-name not provided') create_project_subparser.add_argument('-pn', '--project-name', type=str, required=False, help='The name of the project you wish to use, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of project path. Ex. New_Project-123') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, default=None, help='The path to the template you want to instance, can be absolute or relative to default templates path') group.add_argument('-tn', '--template-name', type=str, required=False, default=None, help='The name the registered template you want to instance, defaults to DefaultProject. If supplied this will resolve the --template-path.') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-prp', '--project-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the projects restricted folder, can be absolute or relative to the default restricted projects directory') group.add_argument('-prn', '--project-restricted-name', type=str, required=False, default=None, help='The name of the registered projects restricted path. If supplied this will resolve the --project-restricted-path.') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The templates restricted path can be absolute or relative to the default restricted templates directory') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered templates restricted path. If supplied this will resolve the --template-restricted-path.') create_project_subparser.add_argument('-prprp', '--project-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --project-restricted-path/<platform> to where the restricted project is. --project-restricted-path C:/restricted --project-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<project_name>') create_project_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_project_subparser.add_argument('-kr', '--keep-restricted-in-project', action='store_true', default=False, help='Should the new project keep the restricted platforms in the project, orcreate the restricted files in the restricted folder, default is False') create_project_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_project_subparser.add_argument('-r', '--replace', required=False, nargs='*', help='String that specifies ADDITIONAL A->B replacement pairs. 
${Name} and all other standard project replacements will be automatically inferred from the project name. These replacements will superseded all inferred replacements. Ex. --replace ${DATE} 1/1/2020 ${id} 1723905 Note: <ProjectName> is the last component of project_path Note: ${Name} is automatically <ProjectName> Note: ${NameLower} is automatically <projectname> Note: ${NameUpper} is automatically <PROJECTNAME>') create_project_subparser.add_argument('--system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--editor-system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the editor system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--module-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the module, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--project-id', type=str, required=False, help='The str id you want to associate with the project, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_project_subparser.add_argument('--no-register', action='store_true', default=False, help='If the project template is instantiated successfully, it will not register the project with the global or engine manifest file.') create_project_subparser.set_defaults(func=_run_create_project) create_gem_subparser = subparsers.add_parser('create-gem') utils.add_verbosity_arg(create_gem_subparser) create_gem_subparser.add_argument('-gp', '--gem-path', type=pathlib.Path, required=True, help='The gem path, can be absolute or relative to the current working directory') create_gem_subparser.add_argument('-gn', '--gem-name', type=str, help='The name to use when substituting the ${Name} placeholder for the gem, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of gem path. Ex. New_Gem') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, default=None, help='The template path you want to instance, can be absolute or relative to default templates path') group.add_argument('-tn', '--template-name', type=str, required=False, default=None, help='The name of the registered template you want to instance, defaults to DefaultGem. 
If supplied this will resolve the --template-path.') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-grp', '--gem-restricted-path', type=pathlib.Path, required=False, default=None, help='The gem restricted path, can be absolute or relative to the default restricted gems directory') group.add_argument('-grn', '--gem-restricted-name', type=str, required=False, default=None, help='The name of the gem to look up the gem restricted path if any.If supplied this will resolve the --gem-restricted-path.') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The templates restricted path, can be absolute or relative to the default restricted templates directory') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered templates restricted path. If supplied this will resolve the --template-restricted-path.') create_gem_subparser.add_argument('-grprp', '--gem-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --gem-restricted-path/<platform> to where the restricted template is. --gem-restricted-path C:/restricted --gem-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<gem_name>') create_gem_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_gem_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies ADDITIONAL A->B replacement pairs. ${Name} and all other standard gem replacements will be automatically inferred from the gem name. These replacements will superseded all inferred replacement pairs. Ex. --replace ${DATE} 1/1/2020 ${id} 1723905 Note: <GemName> is the last component of gem_path Note: ${Name} is automatically <GemName> Note: ${NameLower} is automatically <gemname> Note: ${NameUpper} is automatically <GEMANME>') create_gem_subparser.add_argument('-kr', '--keep-restricted-in-gem', action='store_true', default=False, help='Should the new gem keep the restricted platforms in the project, orcreate the restricted files in the restricted folder, default is False') create_gem_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_gem_subparser.add_argument('--system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('--editor-system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the editor system class component, default is a random uuid Ex. 
{b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('--module-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the gem module, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_gem_subparser.add_argument('--no-register', action='store_true', default=False, help='If the gem template is instantiated successfully, it will not register the gem with the global, project or engine manifest file.') create_gem_subparser.set_defaults(func=_run_create_gem)
add_args is called to add the expected parser arguments and subparser arguments for each command so that it can be invoked locally or aggregated by a central python file. Ex. Run directly from this file alone with: python engine_template.py create-gem --gem-path TestGem OR o3de.py can aggregate commands by importing engine_template, calling add_args and executing: python o3de.py create-gem --gem-path TestGem :param subparsers: the caller instantiates subparsers and passes it in here
scripts/o3de/o3de/engine_template.py
add_args
SparkyStudios/o3de
11
python
def add_args(subparsers) -> None: '\n add_args is called to add expected parser arguments and subparsers arguments to each command such that it can be\n invoked locally or aggregated by a central python file.\n Ex. Directly run from this file alone with: python engine_template.py create-gem --gem-path TestGem\n OR\n o3de.py can aggregate commands by importing engine_template,\n call add_args and execute: python o3de.py create-gem --gem-path TestGem\n :param subparsers: the caller instantiates subparsers and passes it in here\n ' create_template_subparser = subparsers.add_parser('create-template') utils.add_verbosity_arg(create_template_subparser) create_template_subparser.add_argument('-sp', '--source-path', type=pathlib.Path, required=True, help='The path to the source that you want to make into a template') create_template_subparser.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, help='The path to the template to create, can be absolute or relative to default templates path') group = create_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-srp', '--source-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the source restricted folder.') group.add_argument('-srn', '--source-restricted-name', type=str, required=False, default=None, help='The name of the source restricted folder. If supplied this will resolve the --source-restricted-path.') group = create_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the templates restricted folder.') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the templates restricted folder. If supplied this will resolve the --template-restricted-path.') create_template_subparser.add_argument('-sn', '--source-name', type=str, help='Substitutes any file and path entries which match the source name within the source-path directory with the ${Name} and ${SanitizedCppName}.Ex: Path substitution--source-name Foo<source-path>/Code/Include/FooBus.h -> <source-path>/Code/Include/${Name}Bus.hEx: File content substitution.class FooRequests -> class ${SanitizedCppName}Requests') create_template_subparser.add_argument('-srprp', '--source-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --source-restricted-path/<platform> to where the restricted source is. EX. --source-restricted-path C:/restricted --source-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<source_name>') create_template_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template source is. 
--template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_template_subparser.add_argument('-kr', '--keep-restricted-in-template', action='store_true', default=False, help='Should the template keep the restricted platforms in the template, or create the restricted files in the restricted folder, default is False so it will create a restricted folder by default') create_template_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_template_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies A->B replacement pairs. Ex. --replace CoolThing ${the_thing} 1723905 ${id} Note: <TemplateName> is the last component of template_path Note: <TemplateName> is automatically ${Name} Note: <templatename> is automatically ${NameLower} Note: <TEMPLATENAME> is automatically ${NameUpper}') create_template_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies to new template directory even if it exist.') create_template_subparser.add_argument('--no-register', action='store_true', default=False, help='If the template is created successfully, it will not register the template with the global or engine manifest file.') create_template_subparser.set_defaults(func=_run_create_template) create_from_template_subparser = subparsers.add_parser('create-from-template') utils.add_verbosity_arg(create_from_template_subparser) create_from_template_subparser.add_argument('-dp', '--destination-path', type=pathlib.Path, required=True, help='The path to where you want the template instantiated, can be absolute or relative to the current working directory.Ex. C:/o3de/TestTest = <destination_name>') group = create_from_template_subparser.add_mutually_exclusive_group(required=True) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, help='The path to the template you want to instantiate, can be absolute or relative to the current working directory.Ex. C:/o3de/Template/TestTemplateTestTemplate = <template_name>') group.add_argument('-tn', '--template-name', type=str, required=False, help='The name to the registered template you want to instantiate. If supplied this will resolve the --template-path.') create_from_template_subparser.add_argument('-dn', '--destination-name', type=str, help='The name to use when substituting the ${Name} placeholder in instantiated template, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of destination path. Ex. New_Gem') group = create_from_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-drp', '--destination-restricted-path', type=pathlib.Path, required=False, default=None, help='The destination restricted path is where the restricted files will be written to.') group.add_argument('-drn', '--destination-restricted-name', type=str, required=False, default=None, help='The name the registered restricted path where the restricted files will be written to. 
If supplied this will resolve the --destination-restricted-path.') group = create_from_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The template restricted path to read from if any') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered restricted path to read from if any. If supplied this will resolve the --template-restricted-path.') create_from_template_subparser.add_argument('-drprp', '--destination-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --destination-restricted-path/<platform> to where the restricted destination is. --destination-restricted-path C:/instance --destination-restricted-platform-relative-path some/folder => C:/instance/<platform>/some/folder/<destination_name>') create_from_template_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_from_template_subparser.add_argument('-kr', '--keep-restricted-in-instance', action='store_true', default=False, help='Should the instance keep the restricted platforms in the instance, or create the restricted files in the restricted folder, default is False') create_from_template_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_from_template_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies A->B replacement pairs. Ex. --replace CoolThing ${the_thing} ${id} 1723905 Note: <DestinationName> is the last component of destination_path Note: ${Name} is automatically <DestinationName> Note: ${NameLower} is automatically <destinationname> Note: ${NameUpper} is automatically <DESTINATIONNAME>') create_from_template_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_from_template_subparser.add_argument('--no-register', action='store_true', default=False, help='If the project template is instantiated successfully, it will not register the project with the global or engine manifest file.') create_from_template_subparser.set_defaults(func=_run_create_from_template) create_project_subparser = subparsers.add_parser('create-project') utils.add_verbosity_arg(create_project_subparser) create_project_subparser.add_argument('-pp', '--project-path', type=pathlib.Path, required=True, help='The location of the project you wish to create from the template, can be an absolute path or relative to the current working directory. Ex. 
C:/o3de/TestProject TestProject = <project_name> if --project-name not provided') create_project_subparser.add_argument('-pn', '--project-name', type=str, required=False, help='The name of the project you wish to use, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of project path. Ex. New_Project-123') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, default=None, help='The path to the template you want to instance, can be absolute or relative to default templates path') group.add_argument('-tn', '--template-name', type=str, required=False, default=None, help='The name the registered template you want to instance, defaults to DefaultProject. If supplied this will resolve the --template-path.') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-prp', '--project-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the projects restricted folder, can be absolute or relative to the default restricted projects directory') group.add_argument('-prn', '--project-restricted-name', type=str, required=False, default=None, help='The name of the registered projects restricted path. If supplied this will resolve the --project-restricted-path.') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The templates restricted path can be absolute or relative to the default restricted templates directory') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered templates restricted path. If supplied this will resolve the --template-restricted-path.') create_project_subparser.add_argument('-prprp', '--project-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --project-restricted-path/<platform> to where the restricted project is. --project-restricted-path C:/restricted --project-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<project_name>') create_project_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_project_subparser.add_argument('-kr', '--keep-restricted-in-project', action='store_true', default=False, help='Should the new project keep the restricted platforms in the project, orcreate the restricted files in the restricted folder, default is False') create_project_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_project_subparser.add_argument('-r', '--replace', required=False, nargs='*', help='String that specifies ADDITIONAL A->B replacement pairs. 
${Name} and all other standard project replacements will be automatically inferred from the project name. These replacements will superseded all inferred replacements. Ex. --replace ${DATE} 1/1/2020 ${id} 1723905 Note: <ProjectName> is the last component of project_path Note: ${Name} is automatically <ProjectName> Note: ${NameLower} is automatically <projectname> Note: ${NameUpper} is automatically <PROJECTNAME>') create_project_subparser.add_argument('--system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--editor-system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the editor system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--module-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the module, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--project-id', type=str, required=False, help='The str id you want to associate with the project, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_project_subparser.add_argument('--no-register', action='store_true', default=False, help='If the project template is instantiated successfully, it will not register the project with the global or engine manifest file.') create_project_subparser.set_defaults(func=_run_create_project) create_gem_subparser = subparsers.add_parser('create-gem') utils.add_verbosity_arg(create_gem_subparser) create_gem_subparser.add_argument('-gp', '--gem-path', type=pathlib.Path, required=True, help='The gem path, can be absolute or relative to the current working directory') create_gem_subparser.add_argument('-gn', '--gem-name', type=str, help='The name to use when substituting the ${Name} placeholder for the gem, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of gem path. Ex. New_Gem') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, default=None, help='The template path you want to instance, can be absolute or relative to default templates path') group.add_argument('-tn', '--template-name', type=str, required=False, default=None, help='The name of the registered template you want to instance, defaults to DefaultGem. 
If supplied this will resolve the --template-path.') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-grp', '--gem-restricted-path', type=pathlib.Path, required=False, default=None, help='The gem restricted path, can be absolute or relative to the default restricted gems directory') group.add_argument('-grn', '--gem-restricted-name', type=str, required=False, default=None, help='The name of the gem to look up the gem restricted path if any.If supplied this will resolve the --gem-restricted-path.') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The templates restricted path, can be absolute or relative to the default restricted templates directory') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered templates restricted path. If supplied this will resolve the --template-restricted-path.') create_gem_subparser.add_argument('-grprp', '--gem-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --gem-restricted-path/<platform> to where the restricted template is. --gem-restricted-path C:/restricted --gem-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<gem_name>') create_gem_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_gem_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies ADDITIONAL A->B replacement pairs. ${Name} and all other standard gem replacements will be automatically inferred from the gem name. These replacements will superseded all inferred replacement pairs. Ex. --replace ${DATE} 1/1/2020 ${id} 1723905 Note: <GemName> is the last component of gem_path Note: ${Name} is automatically <GemName> Note: ${NameLower} is automatically <gemname> Note: ${NameUpper} is automatically <GEMANME>') create_gem_subparser.add_argument('-kr', '--keep-restricted-in-gem', action='store_true', default=False, help='Should the new gem keep the restricted platforms in the project, orcreate the restricted files in the restricted folder, default is False') create_gem_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_gem_subparser.add_argument('--system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('--editor-system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the editor system class component, default is a random uuid Ex. 
{b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('--module-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the gem module, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_gem_subparser.add_argument('--no-register', action='store_true', default=False, help='If the gem template is instantiated successfully, it will not register the gem with the global, project or engine manifest file.') create_gem_subparser.set_defaults(func=_run_create_gem)
def add_args(subparsers) -> None: '\n add_args is called to add expected parser arguments and subparsers arguments to each command such that it can be\n invoked locally or aggregated by a central python file.\n Ex. Directly run from this file alone with: python engine_template.py create-gem --gem-path TestGem\n OR\n o3de.py can aggregate commands by importing engine_template,\n call add_args and execute: python o3de.py create-gem --gem-path TestGem\n :param subparsers: the caller instantiates subparsers and passes it in here\n ' create_template_subparser = subparsers.add_parser('create-template') utils.add_verbosity_arg(create_template_subparser) create_template_subparser.add_argument('-sp', '--source-path', type=pathlib.Path, required=True, help='The path to the source that you want to make into a template') create_template_subparser.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, help='The path to the template to create, can be absolute or relative to default templates path') group = create_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-srp', '--source-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the source restricted folder.') group.add_argument('-srn', '--source-restricted-name', type=str, required=False, default=None, help='The name of the source restricted folder. If supplied this will resolve the --source-restricted-path.') group = create_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the templates restricted folder.') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the templates restricted folder. If supplied this will resolve the --template-restricted-path.') create_template_subparser.add_argument('-sn', '--source-name', type=str, help='Substitutes any file and path entries which match the source name within the source-path directory with the ${Name} and ${SanitizedCppName}.Ex: Path substitution--source-name Foo<source-path>/Code/Include/FooBus.h -> <source-path>/Code/Include/${Name}Bus.hEx: File content substitution.class FooRequests -> class ${SanitizedCppName}Requests') create_template_subparser.add_argument('-srprp', '--source-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --source-restricted-path/<platform> to where the restricted source is. EX. --source-restricted-path C:/restricted --source-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<source_name>') create_template_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template source is. 
--template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_template_subparser.add_argument('-kr', '--keep-restricted-in-template', action='store_true', default=False, help='Should the template keep the restricted platforms in the template, or create the restricted files in the restricted folder, default is False so it will create a restricted folder by default') create_template_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_template_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies A->B replacement pairs. Ex. --replace CoolThing ${the_thing} 1723905 ${id} Note: <TemplateName> is the last component of template_path Note: <TemplateName> is automatically ${Name} Note: <templatename> is automatically ${NameLower} Note: <TEMPLATENAME> is automatically ${NameUpper}') create_template_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies to new template directory even if it exist.') create_template_subparser.add_argument('--no-register', action='store_true', default=False, help='If the template is created successfully, it will not register the template with the global or engine manifest file.') create_template_subparser.set_defaults(func=_run_create_template) create_from_template_subparser = subparsers.add_parser('create-from-template') utils.add_verbosity_arg(create_from_template_subparser) create_from_template_subparser.add_argument('-dp', '--destination-path', type=pathlib.Path, required=True, help='The path to where you want the template instantiated, can be absolute or relative to the current working directory.Ex. C:/o3de/TestTest = <destination_name>') group = create_from_template_subparser.add_mutually_exclusive_group(required=True) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, help='The path to the template you want to instantiate, can be absolute or relative to the current working directory.Ex. C:/o3de/Template/TestTemplateTestTemplate = <template_name>') group.add_argument('-tn', '--template-name', type=str, required=False, help='The name to the registered template you want to instantiate. If supplied this will resolve the --template-path.') create_from_template_subparser.add_argument('-dn', '--destination-name', type=str, help='The name to use when substituting the ${Name} placeholder in instantiated template, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of destination path. Ex. New_Gem') group = create_from_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-drp', '--destination-restricted-path', type=pathlib.Path, required=False, default=None, help='The destination restricted path is where the restricted files will be written to.') group.add_argument('-drn', '--destination-restricted-name', type=str, required=False, default=None, help='The name the registered restricted path where the restricted files will be written to. 
If supplied this will resolve the --destination-restricted-path.') group = create_from_template_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The template restricted path to read from if any') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered restricted path to read from if any. If supplied this will resolve the --template-restricted-path.') create_from_template_subparser.add_argument('-drprp', '--destination-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --destination-restricted-path/<platform> to where the restricted destination is. --destination-restricted-path C:/instance --destination-restricted-platform-relative-path some/folder => C:/instance/<platform>/some/folder/<destination_name>') create_from_template_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_from_template_subparser.add_argument('-kr', '--keep-restricted-in-instance', action='store_true', default=False, help='Should the instance keep the restricted platforms in the instance, or create the restricted files in the restricted folder, default is False') create_from_template_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_from_template_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies A->B replacement pairs. Ex. --replace CoolThing ${the_thing} ${id} 1723905 Note: <DestinationName> is the last component of destination_path Note: ${Name} is automatically <DestinationName> Note: ${NameLower} is automatically <destinationname> Note: ${NameUpper} is automatically <DESTINATIONNAME>') create_from_template_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_from_template_subparser.add_argument('--no-register', action='store_true', default=False, help='If the project template is instantiated successfully, it will not register the project with the global or engine manifest file.') create_from_template_subparser.set_defaults(func=_run_create_from_template) create_project_subparser = subparsers.add_parser('create-project') utils.add_verbosity_arg(create_project_subparser) create_project_subparser.add_argument('-pp', '--project-path', type=pathlib.Path, required=True, help='The location of the project you wish to create from the template, can be an absolute path or relative to the current working directory. Ex. 
C:/o3de/TestProject TestProject = <project_name> if --project-name not provided') create_project_subparser.add_argument('-pn', '--project-name', type=str, required=False, help='The name of the project you wish to use, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of project path. Ex. New_Project-123') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, default=None, help='The path to the template you want to instance, can be absolute or relative to default templates path') group.add_argument('-tn', '--template-name', type=str, required=False, default=None, help='The name the registered template you want to instance, defaults to DefaultProject. If supplied this will resolve the --template-path.') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-prp', '--project-restricted-path', type=pathlib.Path, required=False, default=None, help='The path to the projects restricted folder, can be absolute or relative to the default restricted projects directory') group.add_argument('-prn', '--project-restricted-name', type=str, required=False, default=None, help='The name of the registered projects restricted path. If supplied this will resolve the --project-restricted-path.') group = create_project_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The templates restricted path can be absolute or relative to the default restricted templates directory') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered templates restricted path. If supplied this will resolve the --template-restricted-path.') create_project_subparser.add_argument('-prprp', '--project-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --project-restricted-path/<platform> to where the restricted project is. --project-restricted-path C:/restricted --project-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<project_name>') create_project_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_project_subparser.add_argument('-kr', '--keep-restricted-in-project', action='store_true', default=False, help='Should the new project keep the restricted platforms in the project, orcreate the restricted files in the restricted folder, default is False') create_project_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_project_subparser.add_argument('-r', '--replace', required=False, nargs='*', help='String that specifies ADDITIONAL A->B replacement pairs. 
${Name} and all other standard project replacements will be automatically inferred from the project name. These replacements will superseded all inferred replacements. Ex. --replace ${DATE} 1/1/2020 ${id} 1723905 Note: <ProjectName> is the last component of project_path Note: ${Name} is automatically <ProjectName> Note: ${NameLower} is automatically <projectname> Note: ${NameUpper} is automatically <PROJECTNAME>') create_project_subparser.add_argument('--system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--editor-system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the editor system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--module-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the module, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('--project-id', type=str, required=False, help='The str id you want to associate with the project, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_project_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_project_subparser.add_argument('--no-register', action='store_true', default=False, help='If the project template is instantiated successfully, it will not register the project with the global or engine manifest file.') create_project_subparser.set_defaults(func=_run_create_project) create_gem_subparser = subparsers.add_parser('create-gem') utils.add_verbosity_arg(create_gem_subparser) create_gem_subparser.add_argument('-gp', '--gem-path', type=pathlib.Path, required=True, help='The gem path, can be absolute or relative to the current working directory') create_gem_subparser.add_argument('-gn', '--gem-name', type=str, help='The name to use when substituting the ${Name} placeholder for the gem, must be alphanumeric, and can contain _ and - characters. If no name is provided, will use last component of gem path. Ex. New_Gem') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-tp', '--template-path', type=pathlib.Path, required=False, default=None, help='The template path you want to instance, can be absolute or relative to default templates path') group.add_argument('-tn', '--template-name', type=str, required=False, default=None, help='The name of the registered template you want to instance, defaults to DefaultGem. 
If supplied this will resolve the --template-path.') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-grp', '--gem-restricted-path', type=pathlib.Path, required=False, default=None, help='The gem restricted path, can be absolute or relative to the default restricted gems directory') group.add_argument('-grn', '--gem-restricted-name', type=str, required=False, default=None, help='The name of the gem to look up the gem restricted path if any.If supplied this will resolve the --gem-restricted-path.') group = create_gem_subparser.add_mutually_exclusive_group(required=False) group.add_argument('-trp', '--template-restricted-path', type=pathlib.Path, required=False, default=None, help='The templates restricted path, can be absolute or relative to the default restricted templates directory') group.add_argument('-trn', '--template-restricted-name', type=str, required=False, default=None, help='The name of the registered templates restricted path. If supplied this will resolve the --template-restricted-path.') create_gem_subparser.add_argument('-grprp', '--gem-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --gem-restricted-path/<platform> to where the restricted template is. --gem-restricted-path C:/restricted --gem-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<gem_name>') create_gem_subparser.add_argument('-trprp', '--template-restricted-platform-relative-path', type=pathlib.Path, required=False, default=None, help='Any path to append to the --template-restricted-path/<platform> to where the restricted template is. --template-restricted-path C:/restricted --template-restricted-platform-relative-path some/folder => C:/restricted/<platform>/some/folder/<template_name>') create_gem_subparser.add_argument('-r', '--replace', type=str, required=False, nargs='*', help='String that specifies ADDITIONAL A->B replacement pairs. ${Name} and all other standard gem replacements will be automatically inferred from the gem name. These replacements will superseded all inferred replacement pairs. Ex. --replace ${DATE} 1/1/2020 ${id} 1723905 Note: <GemName> is the last component of gem_path Note: ${Name} is automatically <GemName> Note: ${NameLower} is automatically <gemname> Note: ${NameUpper} is automatically <GEMANME>') create_gem_subparser.add_argument('-kr', '--keep-restricted-in-gem', action='store_true', default=False, help='Should the new gem keep the restricted platforms in the project, orcreate the restricted files in the restricted folder, default is False') create_gem_subparser.add_argument('-kl', '--keep-license-text', action='store_true', default=False, help='Should license in the template files text be kept in the instantiation, default is False, so will not keep license text by default. License text is defined as all lines of text starting on a line with {BEGIN_LICENSE} and ending line {END_LICENSE}.') create_gem_subparser.add_argument('--system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the system class component, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('--editor-system-component-class-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the editor system class component, default is a random uuid Ex. 
{b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('--module-id', type=uuid.UUID, required=False, help='The uuid you want to associate with the gem module, default is a random uuid Ex. {b60c92eb-3139-454b-a917-a9d3c5819594}') create_gem_subparser.add_argument('-f', '--force', action='store_true', default=False, help='Copies over instantiated template directory even if it exist.') create_gem_subparser.add_argument('--no-register', action='store_true', default=False, help='If the gem template is instantiated successfully, it will not register the gem with the global, project or engine manifest file.') create_gem_subparser.set_defaults(func=_run_create_gem)<|docstring|>add_args is called to add expected parser arguments and subparsers arguments to each command such that it can be invoked locally or aggregated by a central python file. Ex. Directly run from this file alone with: python engine_template.py create-gem --gem-path TestGem OR o3de.py can aggregate commands by importing engine_template, call add_args and execute: python o3de.py create-gem --gem-path TestGem :param subparsers: the caller instantiates subparsers and passes it in here<|endoftext|>
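The add_args docstring above describes two invocation paths: running engine_template.py directly, or letting a central o3de.py aggregate the commands. A minimal sketch of that aggregation pattern follows; the import path, parser wiring, and return handling are illustrative assumptions, not the actual o3de.py implementation.

import argparse

from o3de import engine_template  # assumed import path for the module shown above

def main() -> int:
    # The caller owns the top-level parser and creates the subparsers object;
    # each command module only registers its own subcommands on it.
    parser = argparse.ArgumentParser(description='o3de command aggregator (illustrative)')
    subparsers = parser.add_subparsers(dest='command', required=True)

    # add_args attaches create-template, create-from-template, create-project and
    # create-gem, each wired to a handler via set_defaults(func=_run_...).
    engine_template.add_args(subparsers)

    args = parser.parse_args()
    # Dispatch to whichever _run_* handler the chosen subcommand registered.
    return args.func(args)

if __name__ == '__main__':
    raise SystemExit(main())

Invoked as python o3de.py create-gem --gem-path TestGem, argparse routes to the create-gem subparser that add_args registered.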
93774899140b07086fd0bba5071ec54fa34e62310ba799a4b8f189e193b837c1
def _is_cpp_file(file_path: pathlib.Path) -> bool: '\n Internal helper method to check if a file is a C++ file based\n on its extension, so we can determine if we need to prefer\n the ${SanitizedCppName}\n :param file_path: The input file path\n :return: bool: Whether or not the input file path has a C++ extension\n ' (name, ext) = os.path.splitext(file_path) return (ext.lower() in cpp_file_ext)
Internal helper method to check if a file is a C++ file based on its extension, so we can determine if we need to prefer the ${SanitizedCppName} :param file_path: The input file path :return: bool: Whether or not the input file path has a C++ extension
scripts/o3de/o3de/engine_template.py
_is_cpp_file
SparkyStudios/o3de
11
python
def _is_cpp_file(file_path: pathlib.Path) -> bool: '\n Internal helper method to check if a file is a C++ file based\n on its extension, so we can determine if we need to prefer\n the ${SanitizedCppName}\n :param file_path: The input file path\n :return: bool: Whether or not the input file path has a C++ extension\n ' (name, ext) = os.path.splitext(file_path) return (ext.lower() in cpp_file_ext)
def _is_cpp_file(file_path: pathlib.Path) -> bool: '\n Internal helper method to check if a file is a C++ file based\n on its extension, so we can determine if we need to prefer\n the ${SanitizedCppName}\n :param file_path: The input file path\n :return: bool: Whether or not the input file path has a C++ extension\n ' (name, ext) = os.path.splitext(file_path) return (ext.lower() in cpp_file_ext)<|docstring|>Internal helper method to check if a file is a C++ file based on its extension, so we can determine if we need to prefer the ${SanitizedCppName} :param file_path: The input file path :return: bool: Whether or not the input file path has a C++ extension<|endoftext|>
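_is_cpp_file depends on a module-level cpp_file_ext collection that is not part of this record. A self-contained sketch of the same extension check, using an assumed extension set rather than the module's actual value:

import os
import pathlib

# Assumed C++ extension set; the real module defines its own cpp_file_ext.
cpp_file_ext = {'.cpp', '.cxx', '.cc', '.c', '.h', '.hpp', '.hxx', '.inl'}

def is_cpp_file(file_path: pathlib.Path) -> bool:
    # os.path.splitext yields ('stem', '.ext'); compare the extension case-insensitively.
    _, ext = os.path.splitext(file_path)
    return ext.lower() in cpp_file_ext

print(is_cpp_file(pathlib.Path('Code/Include/FooBus.h')))  # True
print(is_cpp_file(pathlib.Path('gem.json')))               # False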
f3d4e5f8bb43b54ee179c37cf2686fe7b6005598ba0d806c6e619551f8b78cb7
def _transform_into_template(s_data: object, prefer_sanitized_name: bool=False) -> (bool, str): "\n Internal function to transform any data into templated data\n :param s_data: the input data, this could be file data or file name data\n :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name\n This can be necessary when creating the template, the source\n name and sanitized source name might be the same, but C++\n files will need to prefer the sanitized version, or else\n there might be compile errors (e.g. '-' characters in the name)\n :return: bool: whether or not the returned data MAY need to be transformed to instantiate it\n t_data: potentially transformed data 0 for success or non 0 failure code\n " def swap_sanitized_name_and_normal(): (replacements[(sanitized_name_index - 1)], replacements[sanitized_name_index]) = (replacements[sanitized_name_index], replacements[(sanitized_name_index - 1)]) t_data = str(s_data) if prefer_sanitized_name: swap_sanitized_name_and_normal() for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) if prefer_sanitized_name: swap_sanitized_name_and_normal() if (not keep_license_text): t_data = _replace_license_text(t_data) try: pattern = '.*AZ_RTTI\\(\\$\\{SanitizedCppName\\}Module, \\"(?P<ModuleClassId>\\{.*-.*-.*-.*-.*\\})\\",' module_class_id = re.search(pattern, t_data).group('ModuleClassId') replacements.append((module_class_id, '${ModuleClassId}')) t_data = t_data.replace(module_class_id, '${ModuleClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}SystemComponent, \\"(?P<SysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' sys_comp_class_id = re.search(pattern, t_data).group('SysCompClassId') replacements.append((sys_comp_class_id, '${SysCompClassId}')) t_data = t_data.replace(sys_comp_class_id, '${SysCompClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}EditorSystemComponent, \\"(?P<EditorSysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' editor_sys_comp_class_id = re.search(pattern, t_data).group('EditorSysCompClassId') replacements.append((editor_sys_comp_class_id, '${EditorSysCompClassId}')) t_data = t_data.replace(editor_sys_comp_class_id, '${EditorSysCompClassId}') except Exception as e: pass if ((s_data != t_data) or ('{BEGIN_LICENSE}' in t_data)): return (True, t_data) else: return (False, t_data)
Internal function to transform any data into templated data :param s_data: the input data, this could be file data or file name data :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name. This can be necessary when creating the template: the source name and sanitized source name might be the same, but C++ files need to prefer the sanitized version, or else there may be compile errors (e.g. '-' characters in the name) :return: bool: whether or not the returned data MAY need to be transformed to instantiate it t_data: the potentially transformed data
scripts/o3de/o3de/engine_template.py
_transform_into_template
SparkyStudios/o3de
11
python
def _transform_into_template(s_data: object, prefer_sanitized_name: bool=False) -> (bool, str): "\n Internal function to transform any data into templated data\n :param s_data: the input data, this could be file data or file name data\n :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name\n This can be necessary when creating the template, the source\n name and sanitized source name might be the same, but C++\n files will need to prefer the sanitized version, or else\n there might be compile errors (e.g. '-' characters in the name)\n :return: bool: whether or not the returned data MAY need to be transformed to instantiate it\n t_data: potentially transformed data 0 for success or non 0 failure code\n " def swap_sanitized_name_and_normal(): (replacements[(sanitized_name_index - 1)], replacements[sanitized_name_index]) = (replacements[sanitized_name_index], replacements[(sanitized_name_index - 1)]) t_data = str(s_data) if prefer_sanitized_name: swap_sanitized_name_and_normal() for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) if prefer_sanitized_name: swap_sanitized_name_and_normal() if (not keep_license_text): t_data = _replace_license_text(t_data) try: pattern = '.*AZ_RTTI\\(\\$\\{SanitizedCppName\\}Module, \\"(?P<ModuleClassId>\\{.*-.*-.*-.*-.*\\})\\",' module_class_id = re.search(pattern, t_data).group('ModuleClassId') replacements.append((module_class_id, '${ModuleClassId}')) t_data = t_data.replace(module_class_id, '${ModuleClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}SystemComponent, \\"(?P<SysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' sys_comp_class_id = re.search(pattern, t_data).group('SysCompClassId') replacements.append((sys_comp_class_id, '${SysCompClassId}')) t_data = t_data.replace(sys_comp_class_id, '${SysCompClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}EditorSystemComponent, \\"(?P<EditorSysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' editor_sys_comp_class_id = re.search(pattern, t_data).group('EditorSysCompClassId') replacements.append((editor_sys_comp_class_id, '${EditorSysCompClassId}')) t_data = t_data.replace(editor_sys_comp_class_id, '${EditorSysCompClassId}') except Exception as e: pass if ((s_data != t_data) or ('{BEGIN_LICENSE}' in t_data)): return (True, t_data) else: return (False, t_data)
def _transform_into_template(s_data: object, prefer_sanitized_name: bool=False) -> (bool, str): "\n Internal function to transform any data into templated data\n :param s_data: the input data, this could be file data or file name data\n :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name\n This can be necessary when creating the template, the source\n name and sanitized source name might be the same, but C++\n files will need to prefer the sanitized version, or else\n there might be compile errors (e.g. '-' characters in the name)\n :return: bool: whether or not the returned data MAY need to be transformed to instantiate it\n t_data: potentially transformed data 0 for success or non 0 failure code\n " def swap_sanitized_name_and_normal(): (replacements[(sanitized_name_index - 1)], replacements[sanitized_name_index]) = (replacements[sanitized_name_index], replacements[(sanitized_name_index - 1)]) t_data = str(s_data) if prefer_sanitized_name: swap_sanitized_name_and_normal() for replacement in replacements: t_data = t_data.replace(replacement[0], replacement[1]) if prefer_sanitized_name: swap_sanitized_name_and_normal() if (not keep_license_text): t_data = _replace_license_text(t_data) try: pattern = '.*AZ_RTTI\\(\\$\\{SanitizedCppName\\}Module, \\"(?P<ModuleClassId>\\{.*-.*-.*-.*-.*\\})\\",' module_class_id = re.search(pattern, t_data).group('ModuleClassId') replacements.append((module_class_id, '${ModuleClassId}')) t_data = t_data.replace(module_class_id, '${ModuleClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}SystemComponent, \\"(?P<SysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' sys_comp_class_id = re.search(pattern, t_data).group('SysCompClassId') replacements.append((sys_comp_class_id, '${SysCompClassId}')) t_data = t_data.replace(sys_comp_class_id, '${SysCompClassId}') except Exception as e: pass try: pattern = '.*AZ_COMPONENT\\(\\$\\{SanitizedCppName\\}EditorSystemComponent, \\"(?P<EditorSysCompClassId>\\{.*-.*-.*-.*-.*\\})\\"' editor_sys_comp_class_id = re.search(pattern, t_data).group('EditorSysCompClassId') replacements.append((editor_sys_comp_class_id, '${EditorSysCompClassId}')) t_data = t_data.replace(editor_sys_comp_class_id, '${EditorSysCompClassId}') except Exception as e: pass if ((s_data != t_data) or ('{BEGIN_LICENSE}' in t_data)): return (True, t_data) else: return (False, t_data)<|docstring|>Internal function to transform any data into templated data :param s_data: the input data, this could be file data or file name data :param prefer_sanitized_name: Optionally swap the sanitized name with the normal name This can be necessary when creating the template, the source name and sanitized source name might be the same, but C++ files will need to prefer the sanitized version, or else there might be compile errors (e.g. '-' characters in the name) :return: bool: whether or not the returned data MAY need to be transformed to instantiate it t_data: potentially transformed data 0 for success or non 0 failure code<|endoftext|>
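_transform_into_template reads replacements, sanitized_name_index and keep_license_text from an enclosing scope, so the record above cannot run on its own. The core idea, applying ordered (old, new) replacement pairs and reporting whether anything changed, can be sketched in isolation; the pair values below are illustrative only.

def apply_replacements(s_data: str, replacements: list) -> (bool, str):
    # Apply the pairs in order; order matters because an earlier pair can consume
    # text a later pair would otherwise match, which is why the function above
    # temporarily swaps the normal/sanitized name pair for C++ files.
    t_data = str(s_data)
    for old, new in replacements:
        t_data = t_data.replace(old, new)
    # The bool mirrors the "MAY need to be transformed to instantiate it" flag.
    return (s_data != t_data), t_data

pairs = [('TestGem', '${Name}'), ('testgem', '${NameLower}'), ('TESTGEM', '${NameUpper}')]
changed, templated = apply_replacements('class TestGemModule;', pairs)
print(changed, templated)  # True class ${Name}Module;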
3e8bbe1a675aa9685bb555c586c05d258e0a9eeece728bff5bd4e29c59f31058
def _transform_restricted_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs) except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel.as_posix()) destination_entry_rel = pathlib.Path(destination_entry_rel) if destination_entry_rel.as_posix().startswith('/'): destination_entry_rel = pathlib.Path(destination_entry_rel.as_posix().lstrip('/')) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if template_restricted_path: destination_entry_abs = ((((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) / 'Template') / destination_entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() else: destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if keep_restricted_in_template: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: restricted_platform_entries[restricted_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: if keep_restricted_in_template: create_dirs.append({'dir': destination_entry_rel}) else: restricted_platform_entries[restricted_platform]['createDirs'].append({'dir': destination_entry_rel}) _transform_restricted_into_copyfiles_and_createdirs(root_abs, entry_abs)
Internal function recursively called to transform any paths files into copyfiles and create dirs relative to the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it :param root_abs: This is the path everything will end up relative to :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir of root_abs
scripts/o3de/o3de/engine_template.py
_transform_restricted_into_copyfiles_and_createdirs
SparkyStudios/o3de
11
python
def _transform_restricted_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n :param root_abs: This is the path everything will end up relative to\n :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n of root_abs\n ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs) except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel.as_posix()) destination_entry_rel = pathlib.Path(destination_entry_rel) if destination_entry_rel.as_posix().startswith('/'): destination_entry_rel = pathlib.Path(destination_entry_rel.as_posix().lstrip('/')) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if template_restricted_path: destination_entry_abs = ((((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) / 'Template') / destination_entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() else: destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if keep_restricted_in_template: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: restricted_platform_entries[restricted_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: if keep_restricted_in_template: create_dirs.append({'dir': destination_entry_rel}) else: restricted_platform_entries[restricted_platform]['createDirs'].append({'dir': destination_entry_rel}) _transform_restricted_into_copyfiles_and_createdirs(root_abs, entry_abs)
def _transform_restricted_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n    Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n    the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n    :param root_abs: This is the path everything will end up relative to\n    :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n     of root_abs\n    ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs) except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel.as_posix()) destination_entry_rel = pathlib.Path(destination_entry_rel) if destination_entry_rel.as_posix().startswith('/'): destination_entry_rel = pathlib.Path(destination_entry_rel.as_posix().lstrip('/')) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if template_restricted_path: destination_entry_abs = ((((template_restricted_path / restricted_platform) / template_restricted_platform_relative_path) / 'Template') / destination_entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() else: destination_entry_rel = pathlib.Path(destination_entry_rel) first = True for component in destination_entry_rel.parts: if first: first = False result = ((pathlib.Path(component) / 'Platform') / restricted_platform) else: result = (result / component) destination_entry_rel = result.as_posix() destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if keep_restricted_in_template: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: restricted_platform_entries[restricted_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: if keep_restricted_in_template: create_dirs.append({'dir': destination_entry_rel}) else: restricted_platform_entries[restricted_platform]['createDirs'].append({'dir': destination_entry_rel}) _transform_restricted_into_copyfiles_and_createdirs(root_abs, entry_abs)<|docstring|>Internal function recursively called to transform any paths files into copyfiles and create dirs relative to the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it :param root_abs: This is the path everything will end up relative to :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir of root_abs<|endoftext|>
a982a37af872c1332cac60743c82a22b74979b196b7936656651895121898b14
def _transform_dir_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n    Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n    the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n    :param root_abs: This is the path everything will end up relative to\n    :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n     of root_abs\n    ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs).as_posix() except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) found_platform = '' platform = False if ((not keep_restricted_in_template) and ('Platform' in entry_abs.parts)): platform = True try: pattern = '/Platform/(?P<Platform>[^/:*?\\"<>|\\r\\n]+/?)' found_platform = re.search(pattern, entry_abs.as_posix()).group('Platform') found_platform = found_platform.replace('/', '') except Exception as e: pass if (os.path.isfile(entry_abs) and (found_platform not in restricted_platform_entries)): found_platform = '' if ((found_platform not in restricted_platform_entries) and (found_platform in restricted_platforms)): restricted_platform_entries.update({found_platform: {'copyFiles': [], 'createDirs': []}}) if (platform and (found_platform in restricted_platforms)): if (not template_restricted_path): logger.warning('Restricted platform file found!!! {destination_entry_rel}, {found_platform}') continue for replacement in replacements: destination_entry_rel = destination_entry_rel.replace(replacement[0], replacement[1]) destination_entry_rel = destination_entry_rel.replace(f'Platform/{found_platform}', '') destination_entry_rel = destination_entry_rel.lstrip('/') if template_restricted_platform_relative_path: destination_entry_abs = (((((template_restricted_path / found_platform) / template_restricted_platform_relative_path) / template_name) / 'Template') / destination_entry_rel) else: destination_entry_abs = (((template_restricted_path / found_platform) / 'Template') / destination_entry_rel) else: destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if destination_entry_rel.startswith('/'): destination_entry_rel = destination_entry_rel.lstrip('/') os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['createDirs'].append({'dir': destination_entry_rel}) else: create_dirs.append({'dir': destination_entry_rel}) _transform_dir_into_copyfiles_and_createdirs(root_abs, entry_abs)
Internal function recursively called to transform any paths files into copyfiles and create dirs relative to the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it :param root_abs: This is the path everything will end up relative to :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir of root_abs
scripts/o3de/o3de/engine_template.py
_transform_dir_into_copyfiles_and_createdirs
SparkyStudios/o3de
11
python
def _transform_dir_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n    Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n    the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n    :param root_abs: This is the path everything will end up relative to\n    :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n     of root_abs\n    ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs).as_posix() except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) found_platform = platform = False if ((not keep_restricted_in_template) and ('Platform' in entry_abs.parts)): platform = True try: pattern = '/Platform/(?P<Platform>[^/:*?\\"<>|\\r\\n]+/?)' found_platform = re.search(pattern, entry_abs.as_posix()).group('Platform') found_platform = found_platform.replace('/', ) except Exception as e: pass if (os.path.isfile(entry_abs) and (found_platform not in restricted_platform_entries)): found_platform = if ((found_platform not in restricted_platform_entries) and (found_platform in restricted_platforms)): restricted_platform_entries.update({found_platform: {'copyFiles': [], 'createDirs': []}}) if (platform and (found_platform in restricted_platforms)): if (not template_restricted_path): logger.warning('Restricted platform file found!!! {destination_entry_rel}, {found_platform}') continue for replacement in replacements: destination_entry_rel = destination_entry_rel.replace(replacement[0], replacement[1]) destination_entry_rel = destination_entry_rel.replace(f'Platform/{found_platform}', ) destination_entry_rel = destination_entry_rel.lstrip('/') if template_restricted_platform_relative_path: destination_entry_abs = (((((template_restricted_path / found_platform) / template_restricted_platform_relative_path) / template_name) / 'Template') / destination_entry_rel) else: destination_entry_abs = (((template_restricted_path / found_platform) / 'Template') / destination_entry_rel) else: destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if destination_entry_rel.startswith('/'): destination_entry_rel = destination_entry_rel.lstrip('/') os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['createDirs'].append({'dir': destination_entry_rel}) else: create_dirs.append({'dir': destination_entry_rel}) _transform_dir_into_copyfiles_and_createdirs(root_abs, entry_abs)
def _transform_dir_into_copyfiles_and_createdirs(root_abs: pathlib.Path, path_abs: pathlib.Path=None) -> None: '\n    Internal function recursively called to transform any paths files into copyfiles and create dirs relative to\n    the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it\n    :param root_abs: This is the path everything will end up relative to\n    :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir\n     of root_abs\n    ' if (not path_abs): path_abs = root_abs entries = os.listdir(path_abs) for entry in entries: entry_abs = (path_abs / entry) logger.info(f'Processing file: {entry_abs}') try: entry_rel = entry_abs.relative_to(root_abs).as_posix() except ValueError as err: logger.fatal(f'Unable to create relative path: {str(err)}') (_, destination_entry_rel) = _transform_into_template(entry_rel) destination_entry_rel = pathlib.Path(destination_entry_rel) found_platform = platform = False if ((not keep_restricted_in_template) and ('Platform' in entry_abs.parts)): platform = True try: pattern = '/Platform/(?P<Platform>[^/:*?\\"<>|\\r\\n]+/?)' found_platform = re.search(pattern, entry_abs.as_posix()).group('Platform') found_platform = found_platform.replace('/', ) except Exception as e: pass if (os.path.isfile(entry_abs) and (found_platform not in restricted_platform_entries)): found_platform = if ((found_platform not in restricted_platform_entries) and (found_platform in restricted_platforms)): restricted_platform_entries.update({found_platform: {'copyFiles': [], 'createDirs': []}}) if (platform and (found_platform in restricted_platforms)): if (not template_restricted_path): logger.warning('Restricted platform file found!!! {destination_entry_rel}, {found_platform}') continue for replacement in replacements: destination_entry_rel = destination_entry_rel.replace(replacement[0], replacement[1]) destination_entry_rel = destination_entry_rel.replace(f'Platform/{found_platform}', ) destination_entry_rel = destination_entry_rel.lstrip('/') if template_restricted_platform_relative_path: destination_entry_abs = (((((template_restricted_path / found_platform) / template_restricted_platform_relative_path) / template_name) / 'Template') / destination_entry_rel) else: destination_entry_abs = (((template_restricted_path / found_platform) / 'Template') / destination_entry_rel) else: destination_entry_abs = ((template_path / 'Template') / destination_entry_rel) if isinstance(destination_entry_rel, pathlib.Path): destination_entry_rel = destination_entry_rel.as_posix() if destination_entry_rel.startswith('/'): destination_entry_rel = destination_entry_rel.lstrip('/') os.makedirs(os.path.dirname(destination_entry_abs), exist_ok=True) templated = False if os.path.isfile(entry_abs): (name, ext) = os.path.splitext(entry) if (ext in binary_file_ext): shutil.copy(entry_abs, destination_entry_abs) else: try: with open(entry_abs, 'r') as s: source_data = s.read() (templated, source_data) = _transform_into_template(source_data, _is_cpp_file(entry_abs)) if (keep_license_text and (ext in expect_license_info_ext)): if (('Copyright (c)' not in source_data) or ('{BEGIN_LICENSE}' not in source_data)): logger.warning(f'Un-templated License header in {entry_abs}') if os.path.isfile(destination_entry_abs): os.unlink(destination_entry_abs) with open(destination_entry_abs, 'w') as s: s.write(source_data) except Exception as e: shutil.copy(entry_abs, destination_entry_abs) pass if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['copyFiles'].append({'file': destination_entry_rel, 'isTemplated': templated}) else: copy_files.append({'file': destination_entry_rel, 'isTemplated': templated}) else: if (platform and (found_platform in restricted_platforms)): restricted_platform_entries[found_platform]['createDirs'].append({'dir': destination_entry_rel}) else: create_dirs.append({'dir': destination_entry_rel}) _transform_dir_into_copyfiles_and_createdirs(root_abs, entry_abs)<|docstring|>Internal function recursively called to transform any paths files into copyfiles and create dirs relative to the root. This will transform and copy the files, and save the copyfiles and createdirs data, no not save it :param root_abs: This is the path everything will end up relative to :path_abs: This is the path being processed, it is always either root_abs (where it starts) or a subdir of root_abs<|endoftext|>
b37f1b6c47345d0d7093ae3fc828d5c02e0388b0e054fb811394af94afdf125d
def open(self): ' New connection has been established ' clients.append(self) self.logger.info('New connection')
New connection has been established
lib/WebSocketServer.py
open
dorneanu/netgrafio
71
python
def open(self): ' ' clients.append(self) self.logger.info('New connection')
def open(self): ' ' clients.append(self) self.logger.info('New connection')<|docstring|>New connection has been established<|endoftext|>
578f53199ca06f81e5b4a0522189112b5d51aeeb98c414a11a455fddab985ae3
def on_message(self, message): ' Data income event callback ' self.write_message((u'%s' % message))
Data income event callback
lib/WebSocketServer.py
on_message
dorneanu/netgrafio
71
python
def on_message(self, message): ' ' self.write_message((u'%s' % message))
def on_message(self, message): ' ' self.write_message((u'%s' % message))<|docstring|>Data income event callback<|endoftext|>
56e7a6a5f49e387c8eff7760af5636efdbc5ee2e90213ce497357ab498b75a92
def on_close(self): ' Connection was closed ' clients.remove(self) self.logger.info('Connection removed')
Connection was closed
lib/WebSocketServer.py
on_close
dorneanu/netgrafio
71
python
def on_close(self): ' ' clients.remove(self) self.logger.info('Connection removed')
def on_close(self): ' ' clients.remove(self) self.logger.info('Connection removed')<|docstring|>Connection was closed<|endoftext|>
bbe3fbc457b80ed27b158dfe83de627c59f9507d8dbc7e265d457479688528bd
def __init__(self, host, port, in_queue=Queue()): ' Constructor for the WebSocketServer class\n\n Args:\n host(str): Hostname\n port(int): Port number to listen on\n in_queue(Queue): Thread-safe working queue\n\n ' self.application = Application() self.server = tornado.httpserver.HTTPServer(self.application) self.host = host self.port = port self.in_queue = in_queue self.server.listen(self.port, self.host) logging.basicConfig(level=logging.DEBUG) self.logger = logging.getLogger('WebSocketServer') self.logger.setLevel(logging.INFO)
Constructor for the WebSocketServer class Args: host(str): Hostname port(int): Port number to listen on in_queue(Queue): Thread-safe working queue
lib/WebSocketServer.py
__init__
dorneanu/netgrafio
71
python
def __init__(self, host, port, in_queue=Queue()): ' Constructor for the WebSocketServer class\n\n Args:\n host(str): Hostname\n port(int): Port number to listen on\n in_queue(Queue): Thread-safe working queue\n\n ' self.application = Application() self.server = tornado.httpserver.HTTPServer(self.application) self.host = host self.port = port self.in_queue = in_queue self.server.listen(self.port, self.host) logging.basicConfig(level=logging.DEBUG) self.logger = logging.getLogger('WebSocketServer') self.logger.setLevel(logging.INFO)
def __init__(self, host, port, in_queue=Queue()): ' Constructor for the WebSocketServer class\n\n Args:\n host(str): Hostname\n port(int): Port number to listen on\n in_queue(Queue): Thread-safe working queue\n\n ' self.application = Application() self.server = tornado.httpserver.HTTPServer(self.application) self.host = host self.port = port self.in_queue = in_queue self.server.listen(self.port, self.host) logging.basicConfig(level=logging.DEBUG) self.logger = logging.getLogger('WebSocketServer') self.logger.setLevel(logging.INFO)<|docstring|>Constructor for the WebSocketServer class Args: host(str): Hostname port(int): Port number to listen on in_queue(Queue): Thread-safe working queue<|endoftext|>
1ae035e74ab9844b68374059cbd0b43d4ff9217817df40d6298221f30cf4ca6f
def start_server(self): ' Starts the HTTP server\n ' self.logger.info(('Starting WebSocket server on port %d' % self.port)) http_server = Thread(target=tornado.ioloop.IOLoop.instance().start) http_server.start()
Starts the HTTP server
lib/WebSocketServer.py
start_server
dorneanu/netgrafio
71
python
def start_server(self): ' \n ' self.logger.info(('Starting WebSocket server on port %d' % self.port)) http_server = Thread(target=tornado.ioloop.IOLoop.instance().start) http_server.start()
def start_server(self): ' \n ' self.logger.info(('Starting WebSocket server on port %d' % self.port)) http_server = Thread(target=tornado.ioloop.IOLoop.instance().start) http_server.start()<|docstring|>Starts the HTTP server<|endoftext|>
6d22b55c005d826aeb369b71a6f01ad98d0a9f618d8da560a5308fb81ef2844c
def start_collector(self): ' Starts collecting packages\n ' self.logger.info('Start collector server') collector_server = Thread(target=self.collect_data) collector_server.start()
Starts collecting packages
lib/WebSocketServer.py
start_collector
dorneanu/netgrafio
71
python
def start_collector(self): ' \n ' self.logger.info('Start collector server') collector_server = Thread(target=self.collect_data) collector_server.start()
def start_collector(self): ' \n ' self.logger.info('Start collector server') collector_server = Thread(target=self.collect_data) collector_server.start()<|docstring|>Starts collecting packages<|endoftext|>
4a27201f6145d9b6deeb5a418af972d083ed42a73905898c5133eae83b030e10
def collector_process_data(self, data): ' Process incoming data and send it to all available clients\n\n Args:\n data: Received data\n\n ' for c in clients: c.on_message(json.dumps(data))
Process incoming data and send it to all available clients Args: data: Received data
lib/WebSocketServer.py
collector_process_data
dorneanu/netgrafio
71
python
def collector_process_data(self, data): ' Process incoming data and send it to all available clients\n\n Args:\n data: Received data\n\n ' for c in clients: c.on_message(json.dumps(data))
def collector_process_data(self, data): ' Process incoming data and send it to all available clients\n\n Args:\n data: Received data\n\n ' for c in clients: c.on_message(json.dumps(data))<|docstring|>Process incoming data and send it to all available clients Args: data: Received data<|endoftext|>
b66b6f396626a9b6bc1fa974ee56c105a05b52b6484dc4aaca06e07c60ec4b51
def collect_data(self): ' Wait for data in individual thread\n ' self.logger.info('Waiting for incoming data ...') while True: item = self.in_queue.get() self.logger.info('Received data!') self.collector_process_data(item)
Wait for data in individual thread
lib/WebSocketServer.py
collect_data
dorneanu/netgrafio
71
python
def collect_data(self): ' \n ' self.logger.info('Waiting for incoming data ...') while True: item = self.in_queue.get() self.logger.info('Received data!') self.collector_process_data(item)
def collect_data(self): ' \n ' self.logger.info('Waiting for incoming data ...') while True: item = self.in_queue.get() self.logger.info('Received data!') self.collector_process_data(item)<|docstring|>Wait for data in individual thread<|endoftext|>
1ab8658321b54e1787405bdd243f1ba5d8a40b8196501fb866be8787986d7bd9
def start(self): ' Starts the server\n\n .. note::\n The server will listen for incoming JSON packets and pass them\n to all clients connected to the WebSocket.\n ' self.start_server() self.start_collector()
Starts the server .. note:: The server will listen for incoming JSON packets and pass them to all clients connected to the WebSocket.
lib/WebSocketServer.py
start
dorneanu/netgrafio
71
python
def start(self): ' Starts the server\n\n .. note::\n The server will listen for incoming JSON packets and pass them\n to all clients connected to the WebSocket.\n ' self.start_server() self.start_collector()
def start(self): ' Starts the server\n\n .. note::\n The server will listen for incoming JSON packets and pass them\n to all clients connected to the WebSocket.\n ' self.start_server() self.start_collector()<|docstring|>Starts the server .. note:: The server will listen for incoming JSON packets and pass them to all clients connected to the WebSocket.<|endoftext|>
61bb960435bb430467b7fe65996ef6d4adab430e06e04a9982b2e9405717ce1e
async def async_change(self, change): 'Merge changes with queue and send when possible, returning True when done' self.changes = update(self.changes, change) if self.lock.locked(): return False async with self.lock: while self.changes: (await asyncio.sleep(0)) payload = self.changes self.changes = {} try: async with self.session.get(f'http://{self.ip}:{self.port}/setAircon', params={'json': json.dumps(payload)}, timeout=aiohttp.ClientTimeout(total=4)) as resp: data = (await resp.json(content_type=None)) if (data['ack'] == False): raise ApiError(data['reason']) except (aiohttp.client_exceptions.ServerDisconnectedError, ConnectionResetError) as err: self.changes = update(self.changes, payload) (await asyncio.sleep(1)) except aiohttp.ClientError as err: raise ApiError(err) except asyncio.TimeoutError: raise ApiError('Connection timed out.') except AssertionError: raise ApiError('Response status not 200.') except SyntaxError as err: raise ApiError('Invalid response') return True
Merge changes with queue and send when possible, returning True when done
advantage_air/__init__.py
async_change
Bre77/advantage_air
3
python
async def async_change(self, change): self.changes = update(self.changes, change) if self.lock.locked(): return False async with self.lock: while self.changes: (await asyncio.sleep(0)) payload = self.changes self.changes = {} try: async with self.session.get(f'http://{self.ip}:{self.port}/setAircon', params={'json': json.dumps(payload)}, timeout=aiohttp.ClientTimeout(total=4)) as resp: data = (await resp.json(content_type=None)) if (data['ack'] == False): raise ApiError(data['reason']) except (aiohttp.client_exceptions.ServerDisconnectedError, ConnectionResetError) as err: self.changes = update(self.changes, payload) (await asyncio.sleep(1)) except aiohttp.ClientError as err: raise ApiError(err) except asyncio.TimeoutError: raise ApiError('Connection timed out.') except AssertionError: raise ApiError('Response status not 200.') except SyntaxError as err: raise ApiError('Invalid response') return True
async def async_change(self, change): self.changes = update(self.changes, change) if self.lock.locked(): return False async with self.lock: while self.changes: (await asyncio.sleep(0)) payload = self.changes self.changes = {} try: async with self.session.get(f'http://{self.ip}:{self.port}/setAircon', params={'json': json.dumps(payload)}, timeout=aiohttp.ClientTimeout(total=4)) as resp: data = (await resp.json(content_type=None)) if (data['ack'] == False): raise ApiError(data['reason']) except (aiohttp.client_exceptions.ServerDisconnectedError, ConnectionResetError) as err: self.changes = update(self.changes, payload) (await asyncio.sleep(1)) except aiohttp.ClientError as err: raise ApiError(err) except asyncio.TimeoutError: raise ApiError('Connection timed out.') except AssertionError: raise ApiError('Response status not 200.') except SyntaxError as err: raise ApiError('Invalid response') return True<|docstring|>Merge changes with queue and send when possible, returning True when done<|endoftext|>
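A minimal sketch of the merge-and-flush-under-lock pattern that `async_change` implements: callers merge their changes into a shared pending dict, and whichever caller holds the lock flushes everything that accumulates. The recursive `update` helper and the `print` stand-in for the HTTP request are assumptions; the library's own helper and endpoint are not shown in the record.

```python
import asyncio

def update(base: dict, new: dict) -> dict:
    # Recursive dict merge; a stand-in for the library's own `update` helper.
    for key, value in new.items():
        if isinstance(value, dict) and isinstance(base.get(key), dict):
            update(base[key], value)
        else:
            base[key] = value
    return base

class ChangeQueue:
    def __init__(self) -> None:
        self.changes: dict = {}
        self.lock = asyncio.Lock()

    async def async_change(self, change: dict) -> bool:
        # Merge into the pending payload; if another caller already holds the
        # lock, that caller will flush our changes too, so return early.
        self.changes = update(self.changes, change)
        if self.lock.locked():
            return False
        async with self.lock:
            while self.changes:
                payload, self.changes = self.changes, {}
                print("would send:", payload)  # stand-in for the HTTP request
                await asyncio.sleep(0)
        return True

async def main() -> None:
    queue = ChangeQueue()
    await asyncio.gather(
        queue.async_change({"ac1": {"info": {"state": "on"}}}),
        queue.async_change({"ac1": {"info": {"setTemp": 24}}}),
    )

asyncio.run(main())
```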
479f85a78220653f22158fb6f07d0a5bd7c1256bdea48b3b8fe1556a8e83494f
async def validate_input(hass: HomeAssistant, data: dict[(str, Any)]) -> dict[(str, Any)]: 'Validate the user input allows us to connect.\n\n Data has the keys from STEP_USER_DATA_SCHEMA with values provided by the user.\n ' api = NYC311API(async_get_clientsession(hass), data['api_key']) try: (await api.get_calendar([CalendarType.NEXT_EXCEPTIONS])) except NYC311API.InvalidAuth as error: raise InvalidAuth from error except NYC311API.CannotConnect as error: raise CannotConnect from error log.debug('API authenticated successfully.') return {'title': INTEGRATION_NAME}
Validate the user input allows us to connect. Data has the keys from STEP_USER_DATA_SCHEMA with values provided by the user.
custom_components/nyc311/config_flow.py
validate_input
elahd/ha-nyc311
1
python
async def validate_input(hass: HomeAssistant, data: dict[(str, Any)]) -> dict[(str, Any)]: 'Validate the user input allows us to connect.\n\n Data has the keys from STEP_USER_DATA_SCHEMA with values provided by the user.\n ' api = NYC311API(async_get_clientsession(hass), data['api_key']) try: (await api.get_calendar([CalendarType.NEXT_EXCEPTIONS])) except NYC311API.InvalidAuth as error: raise InvalidAuth from error except NYC311API.CannotConnect as error: raise CannotConnect from error log.debug('API authenticated successfully.') return {'title': INTEGRATION_NAME}
async def validate_input(hass: HomeAssistant, data: dict[(str, Any)]) -> dict[(str, Any)]: 'Validate the user input allows us to connect.\n\n Data has the keys from STEP_USER_DATA_SCHEMA with values provided by the user.\n ' api = NYC311API(async_get_clientsession(hass), data['api_key']) try: (await api.get_calendar([CalendarType.NEXT_EXCEPTIONS])) except NYC311API.InvalidAuth as error: raise InvalidAuth from error except NYC311API.CannotConnect as error: raise CannotConnect from error log.debug('API authenticated successfully.') return {'title': INTEGRATION_NAME}<|docstring|>Validate the user input allows us to connect. Data has the keys from STEP_USER_DATA_SCHEMA with values provided by the user.<|endoftext|>
9a5357f08f964aa0e64a5c9732aee09ae71738c68f2500c7ef98571faec12758
async def async_step_user(self, user_input: (dict[(str, Any)] | None)=None) -> FlowResult: 'Handle the initial step.' if (user_input is None): return self.async_show_form(step_id='user', data_schema=STEP_USER_DATA_SCHEMA) errors = {} if (user_input is not None): try: info = (await validate_input(self.hass, user_input)) except CannotConnect: errors['base'] = 'cannot_connect' except InvalidAuth: errors['base'] = 'invalid_auth' except Exception: log.exception('Unexpected exception') errors['base'] = 'unknown' else: return self.async_create_entry(title=info['title'], data=user_input) return self.async_show_form(step_id='user', data_schema=STEP_USER_DATA_SCHEMA, errors=errors)
Handle the initial step.
custom_components/nyc311/config_flow.py
async_step_user
elahd/ha-nyc311
1
python
async def async_step_user(self, user_input: (dict[(str, Any)] | None)=None) -> FlowResult: if (user_input is None): return self.async_show_form(step_id='user', data_schema=STEP_USER_DATA_SCHEMA) errors = {} if (user_input is not None): try: info = (await validate_input(self.hass, user_input)) except CannotConnect: errors['base'] = 'cannot_connect' except InvalidAuth: errors['base'] = 'invalid_auth' except Exception: log.exception('Unexpected exception') errors['base'] = 'unknown' else: return self.async_create_entry(title=info['title'], data=user_input) return self.async_show_form(step_id='user', data_schema=STEP_USER_DATA_SCHEMA, errors=errors)
async def async_step_user(self, user_input: (dict[(str, Any)] | None)=None) -> FlowResult: if (user_input is None): return self.async_show_form(step_id='user', data_schema=STEP_USER_DATA_SCHEMA) errors = {} if (user_input is not None): try: info = (await validate_input(self.hass, user_input)) except CannotConnect: errors['base'] = 'cannot_connect' except InvalidAuth: errors['base'] = 'invalid_auth' except Exception: log.exception('Unexpected exception') errors['base'] = 'unknown' else: return self.async_create_entry(title=info['title'], data=user_input) return self.async_show_form(step_id='user', data_schema=STEP_USER_DATA_SCHEMA, errors=errors)<|docstring|>Handle the initial step.<|endoftext|>
5fd27ecf6981eb0e13f94749813ed55d44a532e41f4b838b931782c1d3140e87
def resample_pcd(pcd, n): 'drop or duplicate points so that input of each object has exactly n points' idx = np.random.permutation(pcd.shape[0]) if (idx.shape[0] < n): idx = np.concatenate([idx, np.random.randint(pcd.shape[0], size=(n - pcd.shape[0]))]) return pcd[idx[:n]]
drop or duplicate points so that input of each object has exactly n points
OcCo_TF/utils/data_util.py
resample_pcd
sun-pyo/OcCo
158
python
def resample_pcd(pcd, n): idx = np.random.permutation(pcd.shape[0]) if (idx.shape[0] < n): idx = np.concatenate([idx, np.random.randint(pcd.shape[0], size=(n - pcd.shape[0]))]) return pcd[idx[:n]]
def resample_pcd(pcd, n): idx = np.random.permutation(pcd.shape[0]) if (idx.shape[0] < n): idx = np.concatenate([idx, np.random.randint(pcd.shape[0], size=(n - pcd.shape[0]))]) return pcd[idx[:n]]<|docstring|>drop or duplicate points so that input of each object has exactly n points<|endoftext|>
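Because `resample_pcd` is only a few lines, a self-contained usage sketch is easy to give; the function is restated below so the snippet runs on its own (only `numpy` is required), and the cloud sizes are made up for illustration.

```python
import numpy as np

def resample_pcd(pcd, n):
    # Same logic as the record above: permute, then pad with random duplicates if short.
    idx = np.random.permutation(pcd.shape[0])
    if idx.shape[0] < n:
        idx = np.concatenate([idx, np.random.randint(pcd.shape[0], size=(n - pcd.shape[0]))])
    return pcd[idx[:n]]

partial_cloud = np.random.rand(1200, 3)  # fewer points than needed -> duplication
dense_cloud = np.random.rand(5000, 3)    # more points than needed -> random drop

print(resample_pcd(partial_cloud, 2048).shape)  # (2048, 3)
print(resample_pcd(dense_cloud, 2048).shape)    # (2048, 3)
```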
4803135b83a70a68add73088bf4a8c769be466e8753c116e57f82cc170e4a728
def lmdb_dataflow(lmdb_path, batch_size, input_size, output_size, is_training, test_speed=False): 'load LMDB files, then generate batches??' df = dataflow.LMDBSerializer.load(lmdb_path, shuffle=False) size = df.size() if is_training: df = dataflow.LocallyShuffleData(df, buffer_size=2000) df = dataflow.PrefetchData(df, nr_prefetch=500, nr_proc=1) df = BatchData(df, batch_size, input_size, output_size) if is_training: df = dataflow.PrefetchDataZMQ(df, nr_proc=8) df = dataflow.RepeatedData(df, (- 1)) if test_speed: dataflow.TestDataSpeed(df, size=1000).start() df.reset_state() return (df, size)
load LMDB files, then generate batches??
OcCo_TF/utils/data_util.py
lmdb_dataflow
sun-pyo/OcCo
158
python
def lmdb_dataflow(lmdb_path, batch_size, input_size, output_size, is_training, test_speed=False): df = dataflow.LMDBSerializer.load(lmdb_path, shuffle=False) size = df.size() if is_training: df = dataflow.LocallyShuffleData(df, buffer_size=2000) df = dataflow.PrefetchData(df, nr_prefetch=500, nr_proc=1) df = BatchData(df, batch_size, input_size, output_size) if is_training: df = dataflow.PrefetchDataZMQ(df, nr_proc=8) df = dataflow.RepeatedData(df, (- 1)) if test_speed: dataflow.TestDataSpeed(df, size=1000).start() df.reset_state() return (df, size)
def lmdb_dataflow(lmdb_path, batch_size, input_size, output_size, is_training, test_speed=False): df = dataflow.LMDBSerializer.load(lmdb_path, shuffle=False) size = df.size() if is_training: df = dataflow.LocallyShuffleData(df, buffer_size=2000) df = dataflow.PrefetchData(df, nr_prefetch=500, nr_proc=1) df = BatchData(df, batch_size, input_size, output_size) if is_training: df = dataflow.PrefetchDataZMQ(df, nr_proc=8) df = dataflow.RepeatedData(df, (- 1)) if test_speed: dataflow.TestDataSpeed(df, size=1000).start() df.reset_state() return (df, size)<|docstring|>load LMDB files, then generate batches??<|endoftext|>
09a2b7ccff93adcc16ceabcbf703beb202ea765664e3890909abed062e4503b7
def __len__(self): 'get the number of batches' ds_size = len(self.ds) div = (ds_size // self.batch_size) rem = (ds_size % self.batch_size) if (rem == 0): return div return (div + int(self.remainder))
get the number of batches
OcCo_TF/utils/data_util.py
__len__
sun-pyo/OcCo
158
python
def __len__(self): ds_size = len(self.ds) div = (ds_size // self.batch_size) rem = (ds_size % self.batch_size) if (rem == 0): return div return (div + int(self.remainder))
def __len__(self): ds_size = len(self.ds) div = (ds_size // self.batch_size) rem = (ds_size % self.batch_size) if (rem == 0): return div return (div + int(self.remainder))<|docstring|>get the number of batches<|endoftext|>
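The batch count in `__len__` above is just floor division plus one optional partial batch; a tiny standalone check with made-up sizes makes the behaviour concrete.

```python
def num_batches(ds_size: int, batch_size: int, remainder: bool) -> int:
    # Mirrors BatchData.__len__: full batches, plus one partial batch only if
    # `remainder` is True and the dataset does not divide evenly.
    div, rem = divmod(ds_size, batch_size)
    return div if rem == 0 else div + int(remainder)

print(num_batches(1050, 32, remainder=False))  # 32 -> partial batch dropped
print(num_batches(1050, 32, remainder=True))   # 33 -> partial batch kept
print(num_batches(1024, 32, remainder=True))   # 32 -> exact division
```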
f7a473a66147c832000e3f660fbc9ac67b86871a8702d83067e12904067434c3
def __iter__(self): 'generating data in batches' holder = [] for data in self.ds: holder.append(data) if (len(holder) == self.batch_size): (yield self._aggregate_batch(holder, self.use_list)) del holder[:] if (self.remainder and (len(holder) > 0)): (yield self._aggregate_batch(holder, self.use_list))
generating data in batches
OcCo_TF/utils/data_util.py
__iter__
sun-pyo/OcCo
158
python
def __iter__(self): holder = [] for data in self.ds: holder.append(data) if (len(holder) == self.batch_size): (yield self._aggregate_batch(holder, self.use_list)) del holder[:] if (self.remainder and (len(holder) > 0)): (yield self._aggregate_batch(holder, self.use_list))
def __iter__(self): holder = [] for data in self.ds: holder.append(data) if (len(holder) == self.batch_size): (yield self._aggregate_batch(holder, self.use_list)) del holder[:] if (self.remainder and (len(holder) > 0)): (yield self._aggregate_batch(holder, self.use_list))<|docstring|>generating data in batches<|endoftext|>
f55f8257dad181540c0277bf1b47a9c390a6a56f266e2ed680418102a22ae826
def _aggregate_batch(self, data_holder, use_list=False): '\n Concatenate input points along the 0-th dimension\n Stack all other data along the 0-th dimension\n ' ids = np.stack([x[0] for x in data_holder]) inputs = [(resample_pcd(x[1], self.input_size) if (x[1].shape[0] > self.input_size) else x[1]) for x in data_holder] inputs = np.expand_dims(np.concatenate([x for x in inputs]), 0).astype(np.float32) npts = np.stack([(x[1].shape[0] if (x[1].shape[0] < self.input_size) else self.input_size) for x in data_holder]).astype(np.int32) gts = np.stack([resample_pcd(x[2], self.gt_size) for x in data_holder]).astype(np.float32) return (ids, inputs, npts, gts)
Concatenate input points along the 0-th dimension Stack all other data along the 0-th dimension
OcCo_TF/utils/data_util.py
_aggregate_batch
sun-pyo/OcCo
158
python
def _aggregate_batch(self, data_holder, use_list=False): '\n Concatenate input points along the 0-th dimension\n Stack all other data along the 0-th dimension\n ' ids = np.stack([x[0] for x in data_holder]) inputs = [(resample_pcd(x[1], self.input_size) if (x[1].shape[0] > self.input_size) else x[1]) for x in data_holder] inputs = np.expand_dims(np.concatenate([x for x in inputs]), 0).astype(np.float32) npts = np.stack([(x[1].shape[0] if (x[1].shape[0] < self.input_size) else self.input_size) for x in data_holder]).astype(np.int32) gts = np.stack([resample_pcd(x[2], self.gt_size) for x in data_holder]).astype(np.float32) return (ids, inputs, npts, gts)
def _aggregate_batch(self, data_holder, use_list=False): '\n Concatenate input points along the 0-th dimension\n Stack all other data along the 0-th dimension\n ' ids = np.stack([x[0] for x in data_holder]) inputs = [(resample_pcd(x[1], self.input_size) if (x[1].shape[0] > self.input_size) else x[1]) for x in data_holder] inputs = np.expand_dims(np.concatenate([x for x in inputs]), 0).astype(np.float32) npts = np.stack([(x[1].shape[0] if (x[1].shape[0] < self.input_size) else self.input_size) for x in data_holder]).astype(np.int32) gts = np.stack([resample_pcd(x[2], self.gt_size) for x in data_holder]).astype(np.float32) return (ids, inputs, npts, gts)<|docstring|>Concatenate input points along the 0-th dimension Stack all other data along the 0-th dimension<|endoftext|>
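The key detail in `_aggregate_batch` is that variable-length partial inputs are concatenated end to end into a single `(1, total_points, 3)` array tracked by `npts`, while the fixed-size ground truths are stacked per sample. A small numpy sketch with made-up sizes shows the resulting shapes.

```python
import numpy as np

# Two partial clouds with different point counts, and fixed-size ground truths.
inputs = [np.random.rand(1500, 3), np.random.rand(900, 3)]
gts = [np.random.rand(2048, 3), np.random.rand(2048, 3)]

packed_inputs = np.expand_dims(np.concatenate(inputs), 0).astype(np.float32)
npts = np.array([x.shape[0] for x in inputs], dtype=np.int32)
packed_gts = np.stack(gts).astype(np.float32)

print(packed_inputs.shape)  # (1, 2400, 3) -> ragged inputs packed end to end
print(npts)                 # [1500  900]  -> per-sample point counts
print(packed_gts.shape)     # (2, 2048, 3) -> one fixed-size ground truth per sample
```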
2eaa97460f397399fba41fce31e92a2b2e7cd1c5254b02f0c8960aa9884c3069
def get_gdf(self): '\n Obtain OSM data and save as GeoDataFrame.\n\n Returns\n -------\n GeoDataFrame\n ' return json_to_gdf(osm_json=self.query(), data_type=self.data_type)
Obtain OSM data and save as GeoDataFrame. Returns ------- GeoDataFrame
osmsc/geogroup.py
get_gdf
ruirzma/osmsc
9
python
def get_gdf(self): '\n Obtain OSM data and save as GeoDataFrame.\n\n Returns\n -------\n GeoDataFrame\n ' return json_to_gdf(osm_json=self.query(), data_type=self.data_type)
def get_gdf(self): '\n Obtain OSM data and save as GeoDataFrame.\n\n Returns\n -------\n GeoDataFrame\n ' return json_to_gdf(osm_json=self.query(), data_type=self.data_type)<|docstring|>Obtain OSM data and save as GeoDataFrame. Returns ------- GeoDataFrame<|endoftext|>
1d2031d6964801c6a6defd890835251777c53669c9a619f85816d5447f110930
def get_gdf(self, tags=False): '\n Obtain OSM data and save as GeoDataFrame.\n\n Parameters\n ----------\n tags : bool\n if False, the GeoDataFrame won\'t add OSM "tags" column.\n if True, need to extract tag info into current GeoDataFrame\n\n\n Returns\n -------\n GeoDataFrame\n ' return json_to_gdf(osm_json=self.query(), data_type=self.data_type, tags=tags)
Obtain OSM data and save as GeoDataFrame. Parameters ---------- tags : bool if False, the GeoDataFrame won't add OSM "tags" column. if True, need to extract tag info into current GeoDataFrame Returns ------- GeoDataFrame
osmsc/geogroup.py
get_gdf
ruirzma/osmsc
9
python
def get_gdf(self, tags=False): '\n Obtain OSM data and save as GeoDataFrame.\n\n Parameters\n ----------\n tags : bool\n if False, the GeoDataFrame won\'t add OSM "tags" column.\n if True, need to extract tag info into current GeoDataFrame\n\n\n Returns\n -------\n GeoDataFrame\n ' return json_to_gdf(osm_json=self.query(), data_type=self.data_type, tags=tags)
def get_gdf(self, tags=False): '\n Obtain OSM data and save as GeoDataFrame.\n\n Parameters\n ----------\n tags : bool\n if False, the GeoDataFrame won\'t add OSM "tags" column.\n if True, need to extract tag info into current GeoDataFrame\n\n\n Returns\n -------\n GeoDataFrame\n ' return json_to_gdf(osm_json=self.query(), data_type=self.data_type, tags=tags)<|docstring|>Obtain OSM data and save as GeoDataFrame. Parameters ---------- tags : bool if False, the GeoDataFrame won't add OSM "tags" column. if True, need to extract tag info into current GeoDataFrame Returns ------- GeoDataFrame<|endoftext|>
c241a00c0b32c55f40bb1a46770c3b5184a1ef864bef9e5fb7d75ea20c28b3dc
@app.route('/registration', methods=['GET', 'POST']) def reg(): '\n Отвечает за вывод страницы регистрации и регистрацию\n :return: Страница регистрации\n ' form = RegForm() if form.validate_on_submit(): user = User(form.username_reg.data, form.email_reg.data) user.set_password(form.password_reg.data, form.pubkey_reg.data, form.privkey_reg.data) session['Username'] = user.username try: send_payable_transaction(Merkatuan.functions.transfer(form.pubkey_reg.data, 10000000000000000000).buildTransaction(dict(gas=3000000)), app.config.get('TEST_PUBLIC_KEY'), app.config.get('TEST_PRIVATE_KEY')) except BaseException as e: print('FUCK', e, form.pubkey_reg.data, type(form.pubkey_reg.data)) return redirect(url_for('index')) return render_template('auth/registration.html', form=form, user=cur_user())
Responsible for rendering the registration page and handling registration :return: The registration page
backend/app/controllers/auth.py
reg
DankanTsar/memesmerkatuan
0
python
@app.route('/registration', methods=['GET', 'POST']) def reg(): '\n Отвечает за вывод страницы регистрации и регистрацию\n :return: Страница регистрации\n ' form = RegForm() if form.validate_on_submit(): user = User(form.username_reg.data, form.email_reg.data) user.set_password(form.password_reg.data, form.pubkey_reg.data, form.privkey_reg.data) session['Username'] = user.username try: send_payable_transaction(Merkatuan.functions.transfer(form.pubkey_reg.data, 10000000000000000000).buildTransaction(dict(gas=3000000)), app.config.get('TEST_PUBLIC_KEY'), app.config.get('TEST_PRIVATE_KEY')) except BaseException as e: print('FUCK', e, form.pubkey_reg.data, type(form.pubkey_reg.data)) return redirect(url_for('index')) return render_template('auth/registration.html', form=form, user=cur_user())
@app.route('/registration', methods=['GET', 'POST']) def reg(): '\n Отвечает за вывод страницы регистрации и регистрацию\n :return: Страница регистрации\n ' form = RegForm() if form.validate_on_submit(): user = User(form.username_reg.data, form.email_reg.data) user.set_password(form.password_reg.data, form.pubkey_reg.data, form.privkey_reg.data) session['Username'] = user.username try: send_payable_transaction(Merkatuan.functions.transfer(form.pubkey_reg.data, 10000000000000000000).buildTransaction(dict(gas=3000000)), app.config.get('TEST_PUBLIC_KEY'), app.config.get('TEST_PRIVATE_KEY')) except BaseException as e: print('FUCK', e, form.pubkey_reg.data, type(form.pubkey_reg.data)) return redirect(url_for('index')) return render_template('auth/registration.html', form=form, user=cur_user())<|docstring|>Отвечает за вывод страницы регистрации и регистрацию :return: Страница регистрации<|endoftext|>
d13732d5ab2b584cc10b004ed53c1472e56ddb82afa9e8e4a32ba13d5df08847
@app.route('/login', methods=['GET', 'POST']) def log(): '\n Отвечает за вывод страницы входа и вход\n :return: Страница входа\n ' form = LogForm() if form.validate_on_submit(): session['Username'] = form.username_log.data return redirect(url_for('index')) return render_template('auth/login.html', form=form, user=cur_user())
Responsible for rendering the login page and handling login :return: The login page
backend/app/controllers/auth.py
log
DankanTsar/memesmerkatuan
0
python
@app.route('/login', methods=['GET', 'POST']) def log(): '\n Отвечает за вывод страницы входа и вход\n :return: Страница входа\n ' form = LogForm() if form.validate_on_submit(): session['Username'] = form.username_log.data return redirect(url_for('index')) return render_template('auth/login.html', form=form, user=cur_user())
@app.route('/login', methods=['GET', 'POST']) def log(): '\n Отвечает за вывод страницы входа и вход\n :return: Страница входа\n ' form = LogForm() if form.validate_on_submit(): session['Username'] = form.username_log.data return redirect(url_for('index')) return render_template('auth/login.html', form=form, user=cur_user())<|docstring|>Отвечает за вывод страницы входа и вход :return: Страница входа<|endoftext|>
832e37a55fa87cacde34fe47020d875270c6d1df6bc06de824453f2922e821dc
def __init__(self, config: dict=None): 'Initialise a Milestones object.\n\n Args:\n config (dict): Arbitrary configuration.\n ' self.config = config
Initialise a Milestones object. Args: config (dict): Arbitrary configuration.
lexos/cutter/milestones.py
__init__
scottkleinman/lexos
0
python
def __init__(self, config: dict=None): 'Initialise a Milestones object.\n\n Args:\n config (dict): Arbitrary configuration.\n ' self.config = config
def __init__(self, config: dict=None): 'Initialise a Milestones object.\n\n Args:\n config (dict): Arbitrary configuration.\n ' self.config = config<|docstring|>Initialise a Milestones object. Args: config (dict): Arbitrary configuration.<|endoftext|>
6057a08a8d59094b74d702565d6843b8b7f863593353dd6a2bdb5ff320319d3b
def set(self, docs: Union[(object, list)], milestone: Union[(dict, str)]) -> Union[(List[object], object)]: 'Set the milestones for a doc or a list of docs.\n\n Args:\n docs (object): A spaCy doc or a list of spaCy docs.\n milestone (Union[dict, str]): The milestone token(s) to match.\n Returns:\n Union[List[object], object]: A spaCy doc or list of spacy docs with\n `doc._.is_milestone` set.\n ' result = [] if (not isinstance(docs, list)): docs = [docs] for doc in docs: result.append(self._set_milestones(doc, milestone)) if (len(result) == 1): return result[0] else: return result
Set the milestones for a doc or a list of docs. Args: docs (object): A spaCy doc or a list of spaCy docs. milestone (Union[dict, str]): The milestone token(s) to match. Returns: Union[List[object], object]: A spaCy doc or list of spacy docs with `doc._.is_milestone` set.
lexos/cutter/milestones.py
set
scottkleinman/lexos
0
python
def set(self, docs: Union[(object, list)], milestone: Union[(dict, str)]) -> Union[(List[object], object)]: 'Set the milestones for a doc or a list of docs.\n\n Args:\n docs (object): A spaCy doc or a list of spaCy docs.\n milestone (Union[dict, str]): The milestone token(s) to match.\n Returns:\n Union[List[object], object]: A spaCy doc or list of spacy docs with\n `doc._.is_milestone` set.\n ' result = [] if (not isinstance(docs, list)): docs = [docs] for doc in docs: result.append(self._set_milestones(doc, milestone)) if (len(result) == 1): return result[0] else: return result
def set(self, docs: Union[(object, list)], milestone: Union[(dict, str)]) -> Union[(List[object], object)]: 'Set the milestones for a doc or a list of docs.\n\n Args:\n docs (object): A spaCy doc or a list of spaCy docs.\n milestone (Union[dict, str]): The milestone token(s) to match.\n Returns:\n Union[List[object], object]: A spaCy doc or list of spacy docs with\n `doc._.is_milestone` set.\n ' result = [] if (not isinstance(docs, list)): docs = [docs] for doc in docs: result.append(self._set_milestones(doc, milestone)) if (len(result) == 1): return result[0] else: return result<|docstring|>Set the milestones for a doc or a list of docs. Args: docs (object): A spaCy doc or a list of spaCy docs. milestone (Union[dict, str]): The milestone token(s) to match. Returns: Union[List[object], object]: A spaCy doc or list of spacy docs with `doc._.is_milestone` set.<|endoftext|>
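A short usage sketch for `Milestones.set` with a blank spaCy pipeline. The import path is inferred from the record's `path` field (`lexos/cutter/milestones.py`) and may differ in the actual package, so treat it as an assumption.

```python
import spacy
# Import path inferred from the record's `path` field; adjust if the package
# exposes the class elsewhere.
from lexos.cutter.milestones import Milestones

nlp = spacy.blank("en")
doc = nlp("CHAPTER I It was a dark night . CHAPTER II Morning came .")

# Mark every token whose text equals "CHAPTER" as a milestone.
Milestones().set(doc, "CHAPTER")

print([t.text for t in doc if t._.is_milestone])  # ['CHAPTER', 'CHAPTER']
```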
056a9d8d951cf20a284f0b2283b13242fb60465a540107c4cb31d1ecad701989
def _set_milestones(self, doc: object, milestone: str) -> object: 'Set the milestones for a doc.\n\n Args:\n doc (object): A spaCy doc.\n milestone (str): The milestone token(s) to match.\n\n Returns:\n object: A spaCy doc with `doc._.is_milestone` set.\n ' if (not doc[0].has_extension('is_milestone')): Token.set_extension('is_milestone', default=False, force=True) for token in doc: if self._matches_milestone(token, milestone): token._.is_milestone = True else: token._.is_milestone = False return doc
Set the milestones for a doc. Args: doc (object): A spaCy doc. milestone (str): The milestone token(s) to match. Returns: object: A spaCy doc with `doc._.is_milestone` set.
lexos/cutter/milestones.py
_set_milestones
scottkleinman/lexos
0
python
def _set_milestones(self, doc: object, milestone: str) -> object: 'Set the milestones for a doc.\n\n Args:\n doc (object): A spaCy doc.\n milestone (str): The milestone token(s) to match.\n\n Returns:\n object: A spaCy doc with `doc._.is_milestone` set.\n ' if (not doc[0].has_extension('is_milestone')): Token.set_extension('is_milestone', default=False, force=True) for token in doc: if self._matches_milestone(token, milestone): token._.is_milestone = True else: token._.is_milestone = False return doc
def _set_milestones(self, doc: object, milestone: str) -> object: 'Set the milestones for a doc.\n\n Args:\n doc (object): A spaCy doc.\n milestone (str): The milestone token(s) to match.\n\n Returns:\n object: A spaCy doc with `doc._.is_milestone` set.\n ' if (not doc[0].has_extension('is_milestone')): Token.set_extension('is_milestone', default=False, force=True) for token in doc: if self._matches_milestone(token, milestone): token._.is_milestone = True else: token._.is_milestone = False return doc<|docstring|>Set the milestones for a doc. Args: doc (object): A spaCy doc. milestone (str): The milestone token(s) to match. Returns: object: A spaCy doc with `doc._.is_milestone` set.<|endoftext|>
72188a2301b29f2898f5a26a5fef9393a5af7d50487c8dbbbb5c01b8f1900e17
def _matches_milestone(self, token: object, milestone: Union[(dict, list, str)]) -> bool: 'Check if a token matches a milestone.\n\n Args:\n token (object): The token to test.\n milestone (Union[dict, list, str]): The milestone token(s) to match.\n\n Returns:\n bool: Whether the token matches the milestone.\n ' if isinstance(milestone, str): if (token.text == milestone): return True else: return False elif isinstance(milestone, list): if (token.text in milestone): return True else: return False elif isinstance(milestone, dict): return self._parse_milestone_dict(token, milestone)
Check if a token matches a milestone. Args: token (object): The token to test. milestone (Union[dict, list, str]): The milestone token(s) to match. Returns: bool: Whether the token matches the milestone.
lexos/cutter/milestones.py
_matches_milestone
scottkleinman/lexos
0
python
def _matches_milestone(self, token: object, milestone: Union[(dict, list, str)]) -> bool: 'Check if a token matches a milestone.\n\n Args:\n token (object): The token to test.\n milestone (Union[dict, list, str]): The milestone token(s) to match.\n\n Returns:\n bool: Whether the token matches the milestone.\n ' if isinstance(milestone, str): if (token.text == milestone): return True else: return False elif isinstance(milestone, list): if (token.text in milestone): return True else: return False elif isinstance(milestone, dict): return self._parse_milestone_dict(token, milestone)
def _matches_milestone(self, token: object, milestone: Union[(dict, list, str)]) -> bool: 'Check if a token matches a milestone.\n\n Args:\n token (object): The token to test.\n milestone (Union[dict, list, str]): The milestone token(s) to match.\n\n Returns:\n bool: Whether the token matches the milestone.\n ' if isinstance(milestone, str): if (token.text == milestone): return True else: return False elif isinstance(milestone, list): if (token.text in milestone): return True else: return False elif isinstance(milestone, dict): return self._parse_milestone_dict(token, milestone)<|docstring|>Check if a token matches a milestone. Args: token (object): The token to test. milestone (Union[dict, list, str]): The milestone token(s) to match. Returns: bool: Whether the token matches the milestone.<|endoftext|>
76af7c8fcdab80c53a44ad60b3ad31c960dd0789b115a64e993c8e52e36afddf
def _parse_milestone_dict(self, token, milestone_dict): 'Parse a milestone dictionary and get results for each criterion.\n\n Key-value pairs in `milestone_dict` will be interpreted as token\n attributes and their values. If the value is given as a tuple, it\n must have the form `(pattern, operator)`, where the pattern is the\n string or regex pattern to match, and the operator is the matching\n method to use. Valid operators are "in", "not_in", "starts_with",\n "ends_with", "re_match", and "re_search". The prefix "re_" implies\n that the pattern is a regex, and either `re.match` or `re.search`\n will be used.\n\n Args:\n token (object): The token to test.\n milestone_dict (dict): A dict in the format given above.\n\n Returns:\n bool: Whether the token matches the query.\n ' and_ = milestone_dict.get('and', {}) or_ = milestone_dict.get('or', {}) and_valid = True or_valid = False for query_dict in and_: (attr, value) = list(query_dict.items())[0] if self._get_milestone_result(attr, token, value): and_valid = True else: and_valid = False for query_dict in or_: (attr, value) = list(query_dict.items())[0] if self._get_milestone_result(attr, token, value): or_valid = True if (and_valid and or_valid): is_match = True elif (and_valid and (not or_valid)): is_match = True elif ((not and_valid) and or_valid): is_match = True else: is_match = False for (attr, value) in milestone_dict.items(): if (attr not in ['and', 'or']): if self._get_milestone_result(attr, token, value): is_match = True else: is_match = False return is_match
Parse a milestone dictionary and get results for each criterion. Key-value pairs in `milestone_dict` will be interpreted as token attributes and their values. If the value is given as a tuple, it must have the form `(pattern, operator)`, where the pattern is the string or regex pattern to match, and the operator is the matching method to use. Valid operators are "in", "not_in", "starts_with", "ends_with", "re_match", and "re_search". The prefix "re_" implies that the pattern is a regex, and either `re.match` or `re.search` will be used. Args: token (object): The token to test. milestone_dict (dict): A dict in the format given above. Returns: bool: Whether the token matches the query.
lexos/cutter/milestones.py
_parse_milestone_dict
scottkleinman/lexos
0
python
def _parse_milestone_dict(self, token, milestone_dict): 'Parse a milestone dictionary and get results for each criterion.\n\n Key-value pairs in `milestone_dict` will be interpreted as token\n attributes and their values. If the value is given as a tuple, it\n must have the form `(pattern, operator)`, where the pattern is the\n string or regex pattern to match, and the operator is the matching\n method to use. Valid operators are "in", "not_in", "starts_with",\n "ends_with", "re_match", and "re_search". The prefix "re_" implies\n that the pattern is a regex, and either `re.match` or `re.search`\n will be used.\n\n Args:\n token (object): The token to test.\n milestone_dict (dict): A dict in the format given above.\n\n Returns:\n bool: Whether the token matches the query.\n ' and_ = milestone_dict.get('and', {}) or_ = milestone_dict.get('or', {}) and_valid = True or_valid = False for query_dict in and_: (attr, value) = list(query_dict.items())[0] if self._get_milestone_result(attr, token, value): and_valid = True else: and_valid = False for query_dict in or_: (attr, value) = list(query_dict.items())[0] if self._get_milestone_result(attr, token, value): or_valid = True if (and_valid and or_valid): is_match = True elif (and_valid and (not or_valid)): is_match = True elif ((not and_valid) and or_valid): is_match = True else: is_match = False for (attr, value) in milestone_dict.items(): if (attr not in ['and', 'or']): if self._get_milestone_result(attr, token, value): is_match = True else: is_match = False return is_match
def _parse_milestone_dict(self, token, milestone_dict): 'Parse a milestone dictionary and get results for each criterion.\n\n Key-value pairs in `milestone_dict` will be interpreted as token\n attributes and their values. If the value is given as a tuple, it\n must have the form `(pattern, operator)`, where the pattern is the\n string or regex pattern to match, and the operator is the matching\n method to use. Valid operators are "in", "not_in", "starts_with",\n "ends_with", "re_match", and "re_search". The prefix "re_" implies\n that the pattern is a regex, and either `re.match` or `re.search`\n will be used.\n\n Args:\n token (object): The token to test.\n milestone_dict (dict): A dict in the format given above.\n\n Returns:\n bool: Whether the token matches the query.\n ' and_ = milestone_dict.get('and', {}) or_ = milestone_dict.get('or', {}) and_valid = True or_valid = False for query_dict in and_: (attr, value) = list(query_dict.items())[0] if self._get_milestone_result(attr, token, value): and_valid = True else: and_valid = False for query_dict in or_: (attr, value) = list(query_dict.items())[0] if self._get_milestone_result(attr, token, value): or_valid = True if (and_valid and or_valid): is_match = True elif (and_valid and (not or_valid)): is_match = True elif ((not and_valid) and or_valid): is_match = True else: is_match = False for (attr, value) in milestone_dict.items(): if (attr not in ['and', 'or']): if self._get_milestone_result(attr, token, value): is_match = True else: is_match = False return is_match<|docstring|>Parse a milestone dictionary and get results for each criterion. Key-value pairs in `milestone_dict` will be interpreted as token attributes and their values. If the value is given as a tuple, it must have the form `(pattern, operator)`, where the pattern is the string or regex pattern to match, and the operator is the matching method to use. Valid operators are "in", "not_in", "starts_with", "ends_with", "re_match", and "re_search". The prefix "re_" implies that the pattern is a regex, and either `re.match` or `re.search` will be used. Args: token (object): The token to test. milestone_dict (dict): A dict in the format given above. Returns: bool: Whether the token matches the query.<|endoftext|>
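As written, each loop overwrites `and_valid` and `is_match`, so only the last criterion in an `and` list or attribute dict decides the outcome. A hedged sketch of an accumulating variant using `all()`/`any()`; the function and argument names are illustrative, not the Lexos API:

from types import SimpleNamespace

def parse_milestone_dict(token, milestone_dict, match_one) -> bool:
    # match_one(attr, token, value) tests a single criterion, e.g. _get_milestone_result.
    and_queries = milestone_dict.get("and", [])
    or_queries = milestone_dict.get("or", [])
    plain = {k: v for k, v in milestone_dict.items() if k not in ("and", "or")}

    # Every "and" criterion and every top-level attribute must hold...
    all_required = all(
        match_one(attr, token, value)
        for q in and_queries for attr, value in q.items()
    ) and all(match_one(attr, token, value) for attr, value in plain.items())

    # ...and at least one "or" criterion, if any were given.
    any_optional = (not or_queries) or any(
        match_one(attr, token, value)
        for q in or_queries for attr, value in q.items()
    )
    return all_required and any_optional

# Example: match tokens whose text is either "chapter" or "CHAPTER".
token = SimpleNamespace(text="chapter")
query = {"or": [{"text": "chapter"}, {"text": "CHAPTER"}]}
print(parse_milestone_dict(token, query, lambda a, t, v: getattr(t, a) == v))  # True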
610ecab364b9dbc023d4bcf496aaa80c020d282dce83ab92bd7640419725eef4
def _get_milestone_result(self, attr: str, token: object, value: Union[(str, tuple)]) -> bool: 'Test a token for a match.\n\n If value is a tuple, it must have the form `(pattern, operator)`,\n where pattern is the string or regex pattern to match, and\n operator is the method to use. Valid operators are "in", "not_in",\n "starts_with", "ends_with", "re_match", and "re_search".\n The prefix "re_" implies that the pattern is a regex, and either\n `re.match` or `re.search` will be used.\n\n Args:\n attr (str): The attribute to test.\n token (object): The token to test.\n value (Union[str, tuple]): The value to test.\n\n Returns:\n bool: Whether the token matches the query.\n ' if isinstance(value, str): if (getattr(token, attr) == value): return True else: return False elif isinstance(value, tuple): pattern = value[0] operator = value[1] if (operator == 'in'): if (getattr(token, attr) in pattern): return True else: return False elif (operator == 'not_in'): if (getattr(token, attr) not in pattern): return True else: return False elif (operator == 'starts_with'): if getattr(token, attr).startswith(pattern): return True else: return False elif (operator == 'ends_with'): if getattr(token, attr).endswith(pattern): return True else: return False elif (operator == 're_match'): if re.match(pattern, getattr(token, attr)): return True else: return False elif (operator == 're_search'): if re.search(pattern, getattr(token, attr)): return True else: return False
Test a token for a match. If value is a tuple, it must have the form `(pattern, operator)`, where pattern is the string or regex pattern to match, and operator is the method to use. Valid operators are "in", "not_in", "starts_with", "ends_with", "re_match", and "re_search". The prefix "re_" implies that the pattern is a regex, and either `re.match` or `re.search` will be used. Args: attr (str): The attribute to test. token (object): The token to test. value (Union[str, tuple]): The value to test. Returns: bool: Whether the token matches the query.
lexos/cutter/milestones.py
_get_milestone_result
scottkleinman/lexos
0
python
def _get_milestone_result(self, attr: str, token: object, value: Union[(str, tuple)]) -> bool: 'Test a token for a match.\n\n If value is a tuple, it must have the form `(pattern, operator)`,\n where pattern is the string or regex pattern to match, and\n operator is the method to use. Valid operators are "in", "not_in",\n "starts_with", "ends_with", "re_match", and "re_search".\n The prefix "re_" implies that the pattern is a regex, and either\n `re.match` or `re.search` will be used.\n\n Args:\n attr (str): The attribute to test.\n token (object): The token to test.\n value (Union[str, tuple]): The value to test.\n\n Returns:\n bool: Whether the token matches the query.\n ' if isinstance(value, str): if (getattr(token, attr) == value): return True else: return False elif isinstance(value, tuple): pattern = value[0] operator = value[1] if (operator == 'in'): if (getattr(token, attr) in pattern): return True else: return False elif (operator == 'not_in'): if (getattr(token, attr) not in pattern): return True else: return False elif (operator == 'starts_with'): if getattr(token, attr).startswith(pattern): return True else: return False elif (operator == 'ends_with'): if getattr(token, attr).endswith(pattern): return True else: return False elif (operator == 're_match'): if re.match(pattern, getattr(token, attr)): return True else: return False elif (operator == 're_search'): if re.search(pattern, getattr(token, attr)): return True else: return False
def _get_milestone_result(self, attr: str, token: object, value: Union[(str, tuple)]) -> bool: 'Test a token for a match.\n\n If value is a tuple, it must have the form `(pattern, operator)`,\n where pattern is the string or regex pattern to match, and\n operator is the method to use. Valid operators are "in", "not_in",\n "starts_with", "ends_with", "re_match", and "re_search".\n The prefix "re_" implies that the pattern is a regex, and either\n `re.match` or `re.search` will be used.\n\n Args:\n attr (str): The attribute to test.\n token (object): The token to test.\n value (Union[str, tuple]): The value to test.\n\n Returns:\n bool: Whether the token matches the query.\n ' if isinstance(value, str): if (getattr(token, attr) == value): return True else: return False elif isinstance(value, tuple): pattern = value[0] operator = value[1] if (operator == 'in'): if (getattr(token, attr) in pattern): return True else: return False elif (operator == 'not_in'): if (getattr(token, attr) not in pattern): return True else: return False elif (operator == 'starts_with'): if getattr(token, attr).startswith(pattern): return True else: return False elif (operator == 'ends_with'): if getattr(token, attr).endswith(pattern): return True else: return False elif (operator == 're_match'): if re.match(pattern, getattr(token, attr)): return True else: return False elif (operator == 're_search'): if re.search(pattern, getattr(token, attr)): return True else: return False<|docstring|>Test a token for a match. If value is a tuple, it must have the form `(pattern, operator)`, where pattern is the string or regex pattern to match, and operator is the method to use. Valid operators are "in", "not_in", "starts_with", "ends_with", "re_match", and "re_search". The prefix "re_" implies that the pattern is a regex, and either `re.match` or `re.search` will be used. Args: attr (str): The attribute to test. token (object): The token to test. value (Union[str, tuple]): The value to test. Returns: bool: Whether the token matches the query.<|endoftext|>
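The `(pattern, operator)` tuples map directly onto string methods and `re` calls. A compact, hedged re-implementation using an operator lookup table (illustrative, not the Lexos API):

import re
from types import SimpleNamespace

OPERATORS = {
    "in":          lambda text, pattern: text in pattern,
    "not_in":      lambda text, pattern: text not in pattern,
    "starts_with": lambda text, pattern: text.startswith(pattern),
    "ends_with":   lambda text, pattern: text.endswith(pattern),
    "re_match":    lambda text, pattern: re.match(pattern, text) is not None,
    "re_search":   lambda text, pattern: re.search(pattern, text) is not None,
}

def get_milestone_result(attr, token, value) -> bool:
    text = getattr(token, attr)
    if isinstance(value, str):
        return text == value       # bare string: exact equality
    pattern, operator = value      # tuple: (pattern, operator)
    return OPERATORS[operator](text, pattern)

token = SimpleNamespace(text="CHAPTER")
print(get_milestone_result("text", token, ("chap", "re_search")))    # False (case-sensitive)
print(get_milestone_result("text", token, ("CHAP", "starts_with")))  # True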
96f46542d299b6d7587cbd60554801f016a6142b9c46ae7153ba5fd06ab8f502
def register(router, model, database=db, view=AdminView): "Register an administration view for each model\n\n :param app: Flask application\n :param list models: A list of AutomapModels\n :param Admin router: An instance of Flask's Admin\n :param ModelView view:\n :return:\n " if (hasattr(model, '__admin__') and model.__admin__): admin_view = model.__admin__(database.session) else: admin_view = view(model, database.session) router.add_view(admin_view) return admin_view
Register an administration view for each model :param app: Flask application :param list models: A list of AutomapModels :param Admin router: An instance of Flask's Admin :param ModelView view: :return:
flask_sandman/admin.py
register
manaikan/sandman2
0
python
def register(router, model, database=db, view=AdminView): "Register an administration view for each model\n\n :param app: Flask application\n :param list models: A list of AutomapModels\n :param Admin router: An instance of Flask's Admin\n :param ModelView view:\n :return:\n " if (hasattr(model, '__admin__') and model.__admin__): admin_view = model.__admin__(database.session) else: admin_view = view(model, database.session) router.add_view(admin_view) return admin_view
def register(router, model, database=db, view=AdminView): "Register an administration view for each model\n\n :param app: Flask application\n :param list models: A list of AutomapModels\n :param Admin router: An instance of Flask's Admin\n :param ModelView view:\n :return:\n " if (hasattr(model, '__admin__') and model.__admin__): admin_view = model.__admin__(database.session) else: admin_view = view(model, database.session) router.add_view(admin_view) return admin_view<|docstring|>Register an administration view for each model :param app: Flask application :param list models: A list of AutomapModels :param Admin router: An instance of Flask's Admin :param ModelView view: :return:<|endoftext|>
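A minimal usage sketch for the registration helper, assuming Flask, Flask-Admin, and Flask-SQLAlchemy; the `User` model and the fallback to `ModelView` are illustrative, since the actual `AdminView` base class is defined elsewhere in the package:

from flask import Flask
from flask_admin import Admin
from flask_admin.contrib.sqla import ModelView
from flask_sqlalchemy import SQLAlchemy

app = Flask(__name__)
app.config["SQLALCHEMY_DATABASE_URI"] = "sqlite:///:memory:"
app.config["SECRET_KEY"] = "dev"
db = SQLAlchemy(app)

class User(db.Model):
    id = db.Column(db.Integer, primary_key=True)
    name = db.Column(db.String(80))

admin = Admin(app)

# Equivalent of register(admin, User, database=db): prefer a model-supplied
# __admin__ view class if present, otherwise fall back to the default view.
view_cls = getattr(User, "__admin__", None)
view = view_cls(db.session) if view_cls else ModelView(User, db.session)
admin.add_view(view)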
ff314682ea439080d7d3e36d75c45dd273f4b1eafde17be8b88ac9faf0efa30c
def hinge_loss(correct_answer, incorrect_answer, margin): '\n Loss by calculating the correct/incorrect score difference in relation to given margin\n :param margin:\n :param correct_answer:\n :param incorrect_answer:\n :return:\n ' loss_sum = torch.sum(((margin + incorrect_answer) - correct_answer)) zero_vector = torch.Tensor([[0.0]]).to(loss_sum.device) return max(zero_vector, loss_sum)
Loss by calculating the correct/incorrect score difference in relation to given margin :param margin: :param correct_answer: :param incorrect_answer: :return:
utility/training.py
hinge_loss
RobinRojowiec/intent-recognition-in-doctor-patient-interviews
0
python
def hinge_loss(correct_answer, incorrect_answer, margin): '\n Loss by calculating the correct/incorrect score difference in relation to given margin\n :param margin:\n :param correct_answer:\n :param incorrect_answer:\n :return:\n ' loss_sum = torch.sum(((margin + incorrect_answer) - correct_answer)) zero_vector = torch.Tensor([[0.0]]).to(loss_sum.device) return max(zero_vector, loss_sum)
def hinge_loss(correct_answer, incorrect_answer, margin): '\n Loss by calculating the correct/incorrect score difference in relation to given margin\n :param margin:\n :param correct_answer:\n :param incorrect_answer:\n :return:\n ' loss_sum = torch.sum(((margin + incorrect_answer) - correct_answer)) zero_vector = torch.Tensor([[0.0]]).to(loss_sum.device) return max(zero_vector, loss_sum)<|docstring|>Loss by calculating the correct/incorrect score difference in relation to given margin :param margin: :param correct_answer: :param incorrect_answer: :return:<|endoftext|>
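The quantity computed is max(0, sum(margin + incorrect - correct)). A hedged sketch of the same computation with `torch.clamp`, which avoids comparing tensors through Python's built-in `max`:

import torch

def hinge_loss(correct_answer: torch.Tensor, incorrect_answer: torch.Tensor,
               margin: float) -> torch.Tensor:
    # max(0, sum(margin + incorrect - correct)); clamp keeps the result on-device.
    return torch.clamp(torch.sum(margin + incorrect_answer - correct_answer), min=0.0)

correct = torch.tensor([0.9])
incorrect = torch.tensor([0.2])
print(hinge_loss(correct, incorrect, margin=0.5))  # tensor(0.): the correct score wins by more than the margin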
8aaeb03e3332fbc9dc2657bd2e7141fee0bf6c4580de68886b77adacee7dae8c
def get_optimizer(model: nn.Module, name, **kwargs): '\n initializes the optimizer\n :param model:\n :param name:\n :param kwargs:\n :return:\n ' if (name == 'SGD'): return torch.optim.SGD(model.parameters(), **kwargs) elif (name == 'Adam'): return torch.optim.Adam(model.parameters(), **kwargs) elif (name == 'BertAdam'): return AdamW(model.parameters(), **kwargs)
initializes the optimizer :param model: :param name: :param kwargs: :return:
utility/training.py
get_optimizer
RobinRojowiec/intent-recognition-in-doctor-patient-interviews
0
python
def get_optimizer(model: nn.Module, name, **kwargs): '\n initializes the optimizer\n :param model:\n :param name:\n :param kwargs:\n :return:\n ' if (name == 'SGD'): return torch.optim.SGD(model.parameters(), **kwargs) elif (name == 'Adam'): return torch.optim.Adam(model.parameters(), **kwargs) elif (name == 'BertAdam'): return AdamW(model.parameters(), **kwargs)
def get_optimizer(model: nn.Module, name, **kwargs): '\n initializes the optimizer\n :param model:\n :param name:\n :param kwargs:\n :return:\n ' if (name == 'SGD'): return torch.optim.SGD(model.parameters(), **kwargs) elif (name == 'Adam'): return torch.optim.Adam(model.parameters(), **kwargs) elif (name == 'BertAdam'): return AdamW(model.parameters(), **kwargs)<|docstring|>initializes the optimizer :param model: :param name: :param kwargs: :return:<|endoftext|>
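A usage sketch for the optimizer factory, assuming the 'BertAdam' branch maps to an AdamW implementation (here `torch.optim.AdamW`); the model and hyperparameters are illustrative:

import torch.nn as nn
from torch.optim import SGD, Adam, AdamW

def get_optimizer(model: nn.Module, name: str, **kwargs):
    # Dispatch table mirroring the if/elif chain above.
    optimizers = {"SGD": SGD, "Adam": Adam, "BertAdam": AdamW}
    return optimizers[name](model.parameters(), **kwargs)

model = nn.Linear(10, 2)
optimizer = get_optimizer(model, "SGD", lr=0.01, momentum=0.9)
print(type(optimizer).__name__)  # SGD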